CN108450032B - Flight control method and device - Google Patents
- Publication number
- CN108450032B (granted publication of application CN201680076224.8A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
- G05D1/106—Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
Abstract
A flight control method and apparatus (400), the method comprising: determining a designated position in the image as a first target (S201); when the angle between the horizontal plane and the line connecting the first target and the current position of the unmanned aerial vehicle is greater than a first preset angle, determining the flight mode of the unmanned aerial vehicle according to the size of that angle (S202); and controlling the unmanned aerial vehicle to fly toward a second target according to the determined flight mode, wherein the distance between the second target and the first target is not less than a preset distance (S203). Even when the angle between the horizontal plane and the line connecting the first target and the unmanned aerial vehicle's current position is greater than the first preset angle, the unmanned aerial vehicle can still be controlled to fly toward the second target, so that it flies to a second target located a preset distance from the first target. The unmanned aerial vehicle therefore does not easily strike an obstacle, its flight safety is ensured, and the range of target positions for pointing flight is expanded.
Description
Technical Field
The embodiment of the invention relates to the technical field of unmanned aerial vehicles, in particular to a flight control method and device.
Background
An existing unmanned aerial vehicle captures images through a camera mounted on it and displays them to the user in real time through a display interface. If the user is interested in a certain object in the image, the user can switch the vehicle into a pointing flight mode: the user designates a position on the image, and the aircraft flies toward that position. However, when the camera faces the ground, the aircraft cannot enter the pointing flight mode for safety reasons.
Disclosure of Invention
The embodiments of the invention provide a flight control method and a flight control device, which prevent an unmanned aerial vehicle from easily striking an obstacle, ensure the flight safety of the unmanned aerial vehicle, and expand the range of target positions toward which the unmanned aerial vehicle can fly in pointing flight.
In a first aspect, an embodiment of the present invention provides a flight control method, including:
determining a first target according to a specified position in the image;
when the included angle between the horizontal plane and the connecting line of the first target and the current position of the unmanned aerial vehicle is larger than a first preset angle, determining the flight mode of the unmanned aerial vehicle according to the size of the included angle;
and controlling the unmanned aerial vehicle to fly to a second target according to the determined flight mode, wherein the distance between the second target and the first target is not less than a preset distance.
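For illustration only, the angle test and mode selection in the steps above can be sketched as follows. The threshold values, mode names, and coordinate convention are our own assumptions, not part of the claimed method.

```python
import math

def choose_flight_mode(uav_pos, target_pos, first_angle_deg=45.0, second_angle_deg=80.0):
    """Illustrative sketch (not the patented implementation): pick a flight
    mode from the angle between the horizontal plane and the line joining
    the UAV's current position to the first target.

    Positions are (x, y, z) tuples in metres; both angle thresholds are
    hypothetical values chosen for this example.
    """
    dx = target_pos[0] - uav_pos[0]
    dy = target_pos[1] - uav_pos[1]
    dz = target_pos[2] - uav_pos[2]
    horizontal = math.hypot(dx, dy)
    # Elevation angle between the horizontal plane and the UAV-target line.
    angle = math.degrees(math.atan2(abs(dz), horizontal))
    if angle <= first_angle_deg:
        return "direct"            # shallow angle: fly straight at the first target
    elif angle <= second_angle_deg:
        return "oblique_offset"    # fly toward a second target offset from the first
    else:
        return "vertical_offset"   # steep case: stop a preset distance short
```

For example, with the UAV hovering 30 m above a point almost directly below it, the elevation angle is near 90 degrees, so the sketch falls into the steep branch and the vehicle would be directed to a second target a preset distance from the first.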
In a second aspect, an embodiment of the present invention provides a flight control apparatus, including:
a target determination module for determining a first target according to a specified position in the image;
the flight mode determining module is used for determining the flight mode of the unmanned aerial vehicle according to the included angle when the included angle between the horizontal plane and the connecting line of the first target and the current position of the unmanned aerial vehicle is larger than a first preset angle;
and the control module is used for controlling the unmanned aerial vehicle to fly to a second target according to the determined flight mode, wherein the distance between the second target and the first target is not less than a preset distance.
In a third aspect, an embodiment of the present invention provides a flight control apparatus, including: a memory and a processor;
the memory for storing code for performing a flight control method;
the processor is used for calling the codes stored in the memory and executing: determining a first target according to a specified position in the image; when an included angle between a connecting line of the first target and the current position of the unmanned aerial vehicle and a horizontal plane is larger than a first preset angle, determining a flight mode of the unmanned aerial vehicle according to the size of the included angle; and controlling the unmanned aerial vehicle to fly to a second target according to the determined flight mode, wherein the distance between the second target and the first target is not less than a preset distance.
In a fourth aspect, an embodiment of the present invention provides a flight control system for an unmanned aerial vehicle, including: an unmanned aerial vehicle; and a flight control apparatus as provided in the second or third aspect of the invention.
According to the flight control method and device and the flight control system of the unmanned aerial vehicle, when the angle between the horizontal plane and the line connecting the vehicle's current position and the first target (which is determined from the designated position in the image) is greater than the first preset angle, the unmanned aerial vehicle is controlled to fly toward a second target located at least a preset distance from the first target. The unmanned aerial vehicle therefore does not easily strike an obstacle, its flight safety is ensured, and the range of target positions for pointing flight is expanded.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
Fig. 1 is a schematic architecture diagram of an unmanned aerial vehicle system 100 according to an embodiment of the present invention;
Fig. 2 is a flow chart of a flight control method provided by an embodiment of the present invention;
Fig. 3 is a schematic diagram of a method for determining a first target using a plurality of imaging devices according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of the angle between the horizontal plane and the line connecting a first target and the current position of the unmanned aerial vehicle, according to an embodiment of the present invention;
Fig. 5 is a schematic diagram in which the angle between the horizontal plane and the line connecting a first target and the current position of the unmanned aerial vehicle is greater than a first preset angle, according to an embodiment of the present invention;
Fig. 6 is a schematic view of a flight mode of the unmanned aerial vehicle according to an embodiment of the present invention;
Fig. 7 is a schematic view of a flight mode of the unmanned aerial vehicle according to an embodiment of the present invention;
Fig. 8 is a schematic diagram in which the angle between the horizontal plane and the line connecting a first target and the current position of the unmanned aerial vehicle is greater than a first preset angle and smaller than a second preset angle, according to an embodiment of the present invention;
Fig. 9 is a schematic view of a flight mode of the unmanned aerial vehicle according to an embodiment of the present invention;
Fig. 10 is a schematic view of a flight mode of the unmanned aerial vehicle according to an embodiment of the present invention;
Fig. 11 is a schematic diagram in which the angle between the horizontal plane and the line connecting a first target and the current position of the unmanned aerial vehicle is greater than a second preset angle, according to an embodiment of the present invention;
Fig. 12 is a schematic view of a flight mode of the unmanned aerial vehicle according to an embodiment of the present invention;
Fig. 13 is a schematic diagram of a ground control device controlling the flight of an unmanned aerial vehicle according to an embodiment of the present invention;
Fig. 14 is a diagram illustrating a preset icon according to an embodiment of the present invention;
Fig. 15 is a diagram illustrating a preset icon according to an embodiment of the present invention;
Fig. 16 is a schematic structural diagram of a flight control apparatus according to an embodiment of the present invention;
Fig. 17 is a schematic structural diagram of a flight control device according to a second embodiment of the present invention;
Fig. 18 is a schematic structural diagram of a flight control system of an unmanned aerial vehicle according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The embodiments of the invention provide a flight control method and device and a flight control system for an unmanned aerial vehicle. The following description uses a drone as an example of an aerial vehicle. It will be apparent to those skilled in the art that other types of aerial vehicles may be used without limitation, and embodiments of the present invention may be applied to various types of drones. For example, the drone may be a small or large drone. In some embodiments, the drone may be a rotorcraft, for example a multi-rotor drone propelled through the air by a plurality of propulsion devices; embodiments of the invention are not so limited, however, and the drone may be of other types.
Fig. 1 is a schematic architecture diagram of an unmanned aerial vehicle system 100 according to an embodiment of the present invention. The present embodiment is described by taking a rotor unmanned aerial vehicle as an example.
The unmanned flight system 100 can include a drone 110, a pan and tilt head 120, a display device 130, and a steering device 140. Among other things, the drone 110 may include a power system 150, a flight control system 160, and a frame 170. The drone 110 may be in wireless communication with ground control devices, which may include a steering device 140 and/or a display device 130.
The frame 170 may include a fuselage and a foot rest (also referred to as a landing gear). The fuselage may include a central frame and one or more arms connected to the central frame, the one or more arms extending radially from the central frame. The foot rest is connected to the fuselage and supports the drone 110 when it lands.
The power system 150 may include an electronic governor (electronic speed controller, or ESC) 151, one or more propellers 153, and one or more motors 152 corresponding to the one or more propellers 153, wherein each motor 152 is connected between the electronic governor 151 and a propeller 153, and the motors 152 and propellers 153 are disposed on the corresponding arms. The electronic governor 151 is configured to receive a drive signal generated by the flight control system 160 and to provide a drive current to the motor 152 based on the drive signal, thereby controlling the rotational speed of the motor 152. The motor 152 drives the propeller in rotation, providing power for the flight of the drone 110; this power enables the drone 110 to achieve one or more degrees of freedom of motion. In certain embodiments, the drone 110 may rotate about one or more axes of rotation, for example a roll axis, a yaw axis, and a pitch axis. It should be understood that the motor 152 may be a DC motor or an AC motor, and may be a brushless motor or a brushed motor.
The pan/tilt head 120 may include an electronic speed controller (ESC) 121 and a motor 122. The pan/tilt head is used to carry the photographing device 123. The flight controller 161 can control the movement of the pan/tilt head 120 through the ESC 121 and the motor 122. Optionally, as another embodiment, the pan/tilt head 120 may further include a controller for controlling its movement by controlling the ESC 121 and the motor 122. It should be understood that the pan/tilt head 120 may be separate from the drone 110 or may be part of it; that the motor 122 may be a DC motor or an AC motor, and may be brushless or brushed; and that the pan/tilt head may be located at the top of the drone or at its bottom.
The photographing device 123 may be, for example, a device for capturing an image such as a camera or a video camera, and the photographing device 123 may communicate with the flight controller and perform photographing under the control of the flight controller.
The display device 130 may communicate with the drone 110 wirelessly and may be used to display pose information for the drone 110. In addition, an image photographed by the photographing device may also be displayed on the display apparatus 130. It is to be understood that the display device 130 may be a stand-alone device or may be provided in the manipulation device 140.
The display device may include a screen, which may or may not be a touch screen. The screen may be a light-emitting diode (LED) screen, an OLED screen, a liquid crystal display (LCD) screen, a plasma screen, or any other type of screen. The display device may be configured to display a graphical user interface (GUI). The GUI may display images that allow the user to control the motion of the UAV. For example, the user may select a target from the image; the target may be stationary or moving. The user may select a direction of travel from the image. The user may select a portion of the image (e.g., a point, a region, and/or an object) to define the target and/or direction. The user may select the target and/or direction by directly touching the screen (e.g., a touch screen), for example by touching a point on the screen. Alternatively, the user may select one region from a pre-existing set of regions on the screen, may draw a boundary for a region, or may specify a portion of the screen in any other way. The user may also select the target and/or direction by selecting the relevant portion of the image by means of a user interaction device (e.g., a mouse, joystick, keyboard, trackball, touch pad, button, verbal command, gesture recognition, gesture sensor, thermal sensor, touch-capacitive sensor, or any other device). The touch screen may be configured to detect the user's touch position, duration of touch, touch pressure, and/or touch motion, wherein each of these touch patterns may indicate a particular input command of the user.
The image on the display device may show the views collected by means of the payload of the movable object. For example, an image collected by the imaging device may be displayed on the display apparatus. This may be considered a First Person View (FPV). In some cases, a single imaging device may be provided and a single FPV may be provided. Alternatively, a plurality of imaging devices having different fields of view may be provided. The views may be switched between the plurality of FPVs, or the plurality of FPVs may be displayed simultaneously. The multiple FPVs may correspond to (or be generated by) different imaging devices that may have different fields of view. A user at a user terminal may select a portion of the images collected by the imaging device to specify a target and/or direction of motion of the movable object.
In another example, the image on the display device may show a map that may be generated by means of information from the payload of the movable object. This map may optionally be generated by means of multiple imaging devices (e.g., a right camera, a left camera, or more cameras), which may utilize stereo mapping techniques. In some cases, the map may be generated based on positional information about the UAV relative to the environment, the imaging device relative to the environment, and/or the UAV relative to the imaging device. The position information may include pose information, spatial position information, angular velocity, linear velocity, angular acceleration, and/or linear acceleration. The map may optionally be generated by means of one or more additional sensors, for example as described in more detail elsewhere herein. The map may be two-dimensional or three-dimensional. The view may be switched between the two-dimensional and three-dimensional map views, or the two may be displayed simultaneously. A user at the user terminal may select a portion of this map to specify the target and/or direction of motion of the movable object. The views may also be switched between one or more FPVs and one or more map views, or the FPVs and map views may be displayed simultaneously. The user may use any of these views to select a target or direction; the user-selected portion may include a target and/or a direction, selected using any of the selection techniques described.
In some embodiments, the image may be provided in a 3D virtual environment displayed on a user terminal (e.g., a virtual reality system or an augmented reality system). The 3D virtual environment may optionally correspond to a 3D map. The virtual environment may include a plurality of points or objects that can be manipulated by a user through a variety of different actions, for example: selecting one or more points or objects, drag and drop, translation, rotation, spin, push, pull, zoom in, zoom out, and the like. Any type of movement action of these points or objects in the three-dimensional virtual space is conceivable. A user at the user terminal may manipulate these points or objects in the virtual environment to control the flight path of the UAV and/or the motion characteristics of the UAV.
The steering device 140 may communicate wirelessly with the drone 110 for remote control of the drone 110. The control device may be, for example, a remote controller, or a user terminal installed with an application program (app) for controlling the drone, such as a smartphone, tablet computer, laptop, ground control station, smart watch, or smart bracelet. Because such a device is configured with a touch screen, the user can issue flight control instructions or shooting-device instructions to the drone through the touch screen. In embodiments of the invention, user input received through the control device may also come from input devices such as a wheel, button, key, or joystick on a remote controller, or from a user interface (UI) on a user terminal.
It should be understood that the above-mentioned nomenclature for the components of the unmanned flight system is for identification purposes only, and should not be construed as limiting embodiments of the present invention.
The executing entity of the flight control method of the invention may be an unmanned aerial vehicle in an unmanned flight system, or a ground control device in the unmanned flight system; this is not limited herein.
Fig. 2 is a flowchart of a flight control method according to an embodiment of the present invention, and as shown in fig. 2, the method according to the embodiment may include:
s201, determining a first target according to the designated position in the image.
In this embodiment, the image may be, for example, an image displayed in an interactive interface, and the specified position may be determined by an operation on that interface. For example, when the user wants the drone to fly toward a certain point on the ground or toward a certain position on the ceiling, the user touches the corresponding position in the image through the interactive interface; the position corresponding to that touch operation is then taken as the specified position in this embodiment.
The specified location may be obtained based on a selected point in one or more images captured by an imaging device on the drone at its current location. When the user selects one or more points in the image on the display, at least a portion of the designated location displayed in the image may be selected. In some cases, selecting the one or more points may cause the entire designated location displayed in the image to be selected.
The selected point in the one or more images may be associated with a set of image coordinates. The target may be located at a second target position associated with a set of world coordinates. A transformation from the set of image coordinates to the set of world coordinates may be generated. A direction vector from the current position to the second target position may be calculated based on the transformation, and a path for controlling the flight of the drone may be generated based on the direction vector.
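As a sketch of the image-to-world step above, a pinhole-camera back-projection can turn the selected pixel into a world-frame direction vector. The patent does not specify a camera model, so the intrinsics (fx, fy, cx, cy) and the rotation matrix below are illustrative assumptions.

```python
import math

def pixel_to_direction(u, v, fx, fy, cx, cy, r_cam_to_world):
    """Back-project a selected pixel (u, v) into a unit direction vector in
    the world frame. Assumes a pinhole camera with intrinsics fx, fy, cx, cy
    and a 3x3 row-major camera-to-world rotation matrix (hypothetical names).
    """
    # Ray through the pixel in the camera frame, at unit depth.
    ray = [(u - cx) / fx, (v - cy) / fy, 1.0]
    # Rotate the ray into the world frame.
    world = [sum(r_cam_to_world[i][j] * ray[j] for j in range(3)) for i in range(3)]
    norm = math.sqrt(sum(c * c for c in world))
    return [c / norm for c in world]
```

A point at the principal point with an identity rotation maps, as expected, to the camera's forward axis.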
In some embodiments, a selected point in an initialization image may be received from a user. The initialization image may be included within the one or more images. A plurality of target candidates may be provided for selection by the user, where each target candidate may be represented by a bounding box. When the user selects the bounding box associated with a target candidate, that candidate may be received as the target.
In some embodiments, a projective transformation of the first target in the one or more images may be obtained based on the state information of the imaging device. The status information of the imaging device may be determined based on the position and attitude information of the drone and the attitude information of the imaging device.
The first target is determined based on a specified location in the image, and specifically, the location of the first target in the real world (i.e., world coordinates) or the orientation of the first target in the real world relative to the drone may be determined.
Wherein a single imaging device, or multiple imaging devices, may be used to determine the location of the first target in the real world.
When the first target is determined using a single imaging device, triangulation may be used. First, the imaging device may be translated (by moving the movable object) laterally relative to the target, perpendicular to the direction from the imaging device to the first target. During this lateral translation, the imaging device may capture multiple images. The multiple images may be provided to the image analyzer, which then calculates the distance from the first target to the movable object based on: (1) the change in the first target's position across the multiple images, and (2) the distance traveled by the movable object during the lateral translation. The distance covered during the lateral translation may be recorded by the IMU on the imaging device and/or the movable object. Alternatively, it may be obtained from one or more Global Navigation Satellite Systems (GNSS). For example, a GNSS receiver on the imaging device and/or movable object may determine an estimated position, velocity, and time (PVT) by processing signals broadcast by the satellites, and the PVT information can be used to calculate the distance covered during the lateral translation.
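The similar-triangles relationship behind this single-camera triangulation can be sketched as follows; the pinhole model and parameter names are our assumptions, not the patent's.

```python
def distance_from_lateral_motion(pixel_shift_px, baseline_m, focal_px):
    """Estimate the target's distance after a lateral translation of
    baseline_m (measured by IMU or GNSS), given the resulting shift of the
    target in the image (pixel_shift_px) and the focal length in pixels.
    Pinhole-camera similar triangles: distance = focal * baseline / shift.
    """
    if pixel_shift_px <= 0:
        raise ValueError("target must shift in the image to triangulate")
    return focal_px * baseline_m / pixel_shift_px
```

For example, a 2 m sideways translation that shifts the target by 50 px under a 500 px focal length places the target about 20 m away.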
The IMU may be an electronic device configured to measure and report the velocity, orientation, and gravity of the UAV using a combination of multiple accelerometers and multiple gyroscopes. A magnetometer may optionally be included. The IMU may use one or more accelerometers to detect the current acceleration rate and one or more gyroscopes to detect changes in rotational properties like pitch, roll and yaw. A magnetometer may be included to assist in calibrating for orientation deviations.
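One common way such accelerometer and gyroscope readings are fused into an attitude angle is a complementary filter. The patent does not prescribe a fusion method, so this is purely an illustrative sketch with hypothetical parameter names.

```python
def complementary_pitch(prev_pitch_deg, gyro_rate_deg_s, accel_pitch_deg, dt_s, alpha=0.98):
    """Blend the gyroscope-integrated pitch (responsive, but drifting) with
    the accelerometer-derived pitch (noisy, but drift-free). alpha weights
    the gyro path; 0.98 is a typical but arbitrary choice.
    """
    return alpha * (prev_pitch_deg + gyro_rate_deg_s * dt_s) + (1.0 - alpha) * accel_pitch_deg
```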
In some embodiments, the first target may be determined using a single imaging device, which is a time of flight (TOF) camera. In these embodiments, the first target may be determined without moving the TOF camera. A time-of-flight camera (TOF camera) may be a range imaging camera system that can resolve distances based on a known speed of light by measuring the time of flight of the light signal between the camera and the object for each point of the image. In some cases, using a TOF camera may improve tracking accuracy.
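The TOF ranging principle described above reduces to halving the round-trip travel time of the light signal; a minimal sketch:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_s):
    """Distance to a point imaged by a TOF camera: the light pulse travels
    out and back, so the one-way distance is half the round-trip path.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0
```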
In some other embodiments, multiple imaging devices may be used to determine the first target. Fig. 3 shows an example in which multiple imaging devices are used to determine the first target. A first imaging device 304 and a second imaging device 306 may be provided, disposed at different locations. For example, the first imaging device may be a payload carried by the movable object 302, while the second imaging device may be located on or within the movable object. In some embodiments, the first imaging device may be a camera and the second imaging device may be a binocular vision sensor. In some embodiments, the first and second imaging devices may be part of the same binocular camera. The first IMU may be disposed on the payload, e.g., on the first imaging device itself, or on a carrier coupling the payload to the movable object. The second IMU may be located within the body of the movable object. The first and second imaging devices may have different optical axes; for example, the first imaging device may have a first optical axis 305 and the second imaging device a second optical axis 307. The two imaging devices may belong to different inertial frames of reference that move independently of each other, or to the same inertial frame of reference. The first imaging device may be configured to capture an image 310, which is displayed on an output device of the user terminal. The second imaging device may be configured to capture a binocular image 314 including a left-eye image 314-1 and a right-eye image 314-2. As shown in Fig. 3, the first and second imaging devices may capture multiple images of a target 308. However, because the two devices are at different positions, the position of the first target in the captured images may differ. For example, in Fig. 3, the location 308' of the target in the image 310 may be at the bottom right corner of the image, whereas the position 308-1' of the target in the left-eye image 314-1 and the position 308-2' in the right-eye image 314-2 may be located in the left portion of the corresponding images. The positions 308-1' and 308-2' in the left-eye and right-eye images may also differ slightly from each other due to binocular vision.
The difference in position between the first imaging device and the second imaging device may be determined based on real-time position information obtained from the first IMU and the second IMU. The real-time position information from the first IMU may indicate the actual position of the first imaging device, because the first IMU is mounted on the payload. Likewise, the real-time position information from the second IMU may indicate the actual position of the second imaging device, because the second IMU is mounted on the body of the movable object near the second imaging device. In some cases, the flight controller may adjust the pose of the movable object and/or payload based on the calculated position difference. The image analyzer may be configured to correlate the images obtained by the second imaging device with the images obtained by the first imaging device based on the calculated position difference. The first target may be determined based on the association between the images of the first and second imaging devices and the difference in position of the first and second imaging devices at different times.
In some embodiments, the actual position of the first target is not necessarily known; the tracking may be based primarily on the size and/or location of the first target in the image. For example, the movable object may be configured to move toward the target until the size of the first target within the image reaches a predetermined threshold. Alternatively, the imaging device of the movable object may zoom in on the first target, without the movable object moving, until the size of the first target within the image reaches the predetermined threshold. Alternatively, the imaging device may zoom in while the movable object simultaneously moves toward the target, until the size of the target within the image reaches the predetermined threshold. In other embodiments, the actual position of the first target may be known. The size of the first target within the image may include a characteristic length of the first target within the image, which may be based on the most significant size scale of the first target. The most significant size scale of the target may be represented by a length, width, height, thickness, arc, and/or circumference of a significant portion of the first target. The predetermined threshold may be defined based on the width of the image. In some embodiments, the movable object may be configured to move toward the first target and/or the imaging device may be actuated until the first target within the image is displayed in a target region. The target region may be a central portion of the image, or any other portion of the image. Such actuation of the imaging device in n degrees of freedom may be achieved using a carrier (e.g., a gimbal).
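For illustration only, the stopping condition described above (characteristic length reaching a threshold defined as a fraction of the image width) may be sketched as follows; the 0.3 ratio and the function name are assumptions, not values from the disclosure:

```python
def size_threshold_reached(feature_length_px: float,
                           image_width_px: float,
                           threshold_ratio: float = 0.3) -> bool:
    """True once the target's characteristic length in the image reaches a
    predetermined threshold, expressed as a fraction of the image width.
    The default ratio of 0.3 is an illustrative assumption."""
    return feature_length_px / image_width_px >= threshold_ratio
```

A control loop would keep moving the movable object (or zooming the imaging device) toward the target while this predicate is false.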
The movable object may be configured to move along a path from a first position to a second position. For many real-world applications, knowing only the locations of the first target and the movable object may not be sufficient for real-time tracking. For example, the surrounding environment may include an obstacle in the path between the movable object and the first target. These obstacles may be stationary, capable of moving, or in motion. Thus, information about the external environment is needed so that the movable object can avoid such obstacles by re-planning its path in real time. In some embodiments, information about the external environment may be provided as a 3D map based on one or more images captured by one or more imaging devices. The flight path of the movable object may be generated using the 3D map.
S202: when the included angle between the horizontal plane and the line connecting the first target and the current position of the unmanned aerial vehicle is greater than a first preset angle, determining the flight mode of the unmanned aerial vehicle according to the size of the included angle.
S203: controlling the unmanned aerial vehicle to fly to a second target according to the determined flight mode, where the distance between the second target and the first target is not less than a preset distance.
In this embodiment, after the first target is determined, the included angle between the horizontal plane and the line connecting the first target and the current position of the unmanned aerial vehicle may be obtained. Fig. 4 is a schematic view of this included angle according to an embodiment of the present invention. As shown in fig. 4, the pointing position (i.e., the first target) is located on an obstacle surface, which may be a ground surface below the unmanned aerial vehicle or a ceiling above the unmanned aerial vehicle; this embodiment does not limit which. The embodiment then determines whether the included angle is greater than the first preset angle. In the prior art, to ensure the flight safety of the unmanned aerial vehicle, an obstacle avoidance range is preset in front of the vehicle, and the unmanned aerial vehicle flies toward a pointing position in the pointing flight mode only when the pointing position is within the obstacle avoidance range; when the pointing position is outside the obstacle avoidance range, the unmanned aerial vehicle cannot fly toward it. In some embodiments, the first preset angle is determined according to the obstacle avoidance range of the drone, such that when the included angle between the horizontal plane and the line connecting the first target and the current position of the drone is greater than the first preset angle, the first target lies outside the obstacle avoidance range of the drone.
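For illustration only, the included-angle test in S202 may be sketched as follows, with positions given as (x, y, z) coordinates in meters; the function names are hypothetical:

```python
import math

def included_angle_deg(drone_pos, target_pos):
    """Angle between the horizontal plane and the line connecting the drone's
    current position to the first target, in degrees."""
    dx = target_pos[0] - drone_pos[0]
    dy = target_pos[1] - drone_pos[1]
    dz = target_pos[2] - drone_pos[2]
    horizontal = math.hypot(dx, dy)          # length of the horizontal projection
    return math.degrees(math.atan2(abs(dz), horizontal))

def outside_avoidance_range(drone_pos, target_pos, first_preset_angle_deg):
    """True when the included angle exceeds the first preset angle, i.e. the
    first target lies outside the drone's obstacle avoidance range."""
    return included_angle_deg(drone_pos, target_pos) > first_preset_angle_deg
```

For example, a drone hovering 10 m above a target that is 10 m away horizontally sees an included angle of 45 degrees, so a first preset angle of 30 degrees would place that target outside the avoidance range.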
Therefore, when the included angle is greater than the first preset angle, the flight mode of the unmanned aerial vehicle is determined according to the size of the included angle, and the unmanned aerial vehicle is controlled to fly to a second target according to the determined flight mode, where the distance between the second target and the first target is not less than the preset distance.
For example, one mode is to calculate the coordinates of the second target, generate a path from the current position to the second target according to the coordinates of the current position and the coordinates of the second target, and then control the unmanned aerial vehicle to fly to the second target along that path. For example, after the pointing position is determined in the image, the geographic coordinates of the pointing position can be calculated from the geographic environment (three-dimensional environment) represented in the image. Alternatively, a direction vector of the pointing position in the image is acquired, the intersection between the direction vector and an obstacle surface (e.g., a floor or a ceiling) in the image is determined, and the geographic coordinates of the intersection are taken as the geographic coordinates of the pointing position.
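For illustration only, the ray-surface intersection described above may be sketched for the simple case of a horizontal obstacle surface (floor or ceiling at height z = surface_z); the function name is hypothetical:

```python
def pointing_intersection(origin, direction, surface_z):
    """Intersection of the pointing direction vector (cast from the drone or
    camera position) with a horizontal obstacle surface z = surface_z.
    Returns None when the ray is parallel to, or points away from, the surface."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz == 0:
        return None            # ray parallel to the surface
    t = (surface_z - oz) / dz  # parametric distance along the ray
    if t < 0:
        return None            # surface is behind the ray origin
    return (ox + t * dx, oy + t * dy, surface_z)
```

A non-horizontal surface would require a general ray-plane intersection against the 3D environment model, but the parametric form is the same.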
For another example, another mode is as follows: the target direction in which the unmanned aerial vehicle moves is determined based on the designated position in the image, and the flight direction is changed when the unmanned aerial vehicle, flying along the target direction, reaches a preset distance from the obstacle surface (i.e., the plane where the first target is located), until the unmanned aerial vehicle arrives at a second target, which may be a target located above the first target. In this way, the unmanned aerial vehicle ultimately flies to a second target that is no less than the preset distance from the first target, so the unmanned aerial vehicle does not easily touch the obstacle, and its flight safety is ensured.
In some embodiments, the target direction of the drone may be dynamically adjusted such that the drone avoids one or more obstacles in the target direction. The attitude of the imaging device and/or UAV may be adjusted to maintain the first target within the field of view of the imaging device when the drone avoids the one or more obstacles. For example, yaw and translational movements of the drone may be controlled so as to maintain the first target within the field of view.
In some embodiments, a failure to fly toward a target may be determined to have occurred when the target (which may be the first target or the second target) is no longer present in the one or more images and/or within the field of view of the imaging device. In such cases, the position and pose of the movable object and/or the pose of the imaging device may be adjusted to recapture the target in one or more subsequent images. The one or more subsequent images may be analyzed to detect the target, and once it is detected, the movable object may resume flying toward it.
In some embodiments, the distance and/or speed of the target relative to the drone may be obtained. The drone may then fly toward the target based on that relative distance and/or speed.
In some embodiments, the flight path of the drone may be an optimized route between the current location (associated with the drone) and the target (associated with the first target or the second target). The path may be optimized based on one or more parameters including distance of flight, time of flight, energy consumption, altitude, weather effects including wind direction and wind speed, and/or tracking of the target (e.g., velocity and direction of the target). The path may also be optimized to cause the drone to avoid one or more obstacles between the current location and the target. The path may include a plurality of straight lines and/or a plurality of curved lines.
For example, the path may be configured to minimize the energy consumption of the drone as it moves from the current location to the target, or to minimize the effect of weather on the drone's movement. The path may be optimized based on wind speed and wind direction, and configured to reduce upwind movement of the drone. The path may account for changes in altitude and pressure as the drone moves toward the target, and may be configured based on the ambient landscape between the current location and the second target. For example, the path may take into account artificial structures and natural terrain present in the surrounding landscape, and may be configured to pass around, over, or under obstacles such as artificial structures and natural terrain in the path between the current location and the second target.
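For illustration only, selecting among candidate paths by a weighted combination of the optimization parameters named above may be sketched as follows; the weighting scheme, units, and function names are assumptions, not from the disclosure:

```python
def path_cost(distance_m, time_s, energy_j, weights=(1.0, 1.0, 1.0)):
    """Weighted sum over flight distance, flight time, and energy consumption.
    Weather, altitude, and tracking terms could be added analogously."""
    wd, wt, we = weights
    return wd * distance_m + wt * time_s + we * energy_j

def pick_best_path(candidates, weights=(1.0, 1.0, 1.0)):
    """Return the candidate (distance, time, energy) tuple with the lowest cost."""
    return min(candidates, key=lambda c: path_cost(*c, weights=weights))
```

Raising the energy weight, for instance, would bias the planner toward shorter or downwind routes even if they take longer.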
In some embodiments, a 3D model of the surrounding landscape may be obtained based on: (1) one or more images captured by one or more imaging devices on the drone, and (2) a topographical map obtained from Global Positioning System (GPS) data. The GPS data may be provided from the server to a user terminal used to control the drone. The path may be configured such that, as the drone moves from the current location to a target, a point of interest, which may be a target and/or other object, is maintained within a field of view of an imaging device of the drone.
When the included angle is determined to be smaller than or equal to the first preset angle, the first target lies within the obstacle avoidance range of the unmanned aerial vehicle. In that case, the flight mode of the unmanned aerial vehicle may be determined to be the pointing flight mode according to the prior-art scheme, and the unmanned aerial vehicle flies toward the first target in the pointing flight mode.
In summary, according to the above scheme of the embodiment of the present invention, even when the included angle between the horizontal plane and the line connecting the current position of the unmanned aerial vehicle and the first target is greater than the first preset angle, the unmanned aerial vehicle can be controlled to fly toward a second target located at least the preset distance from the first target. The unmanned aerial vehicle therefore does not easily touch an obstacle, which ensures its flight safety while expanding the range of target positions for pointing flight compared with the prior art.
When the unmanned aerial vehicle flies to the second target in the following way: determining a target direction for the unmanned aerial vehicle to move based on the designated position in the image, and changing the flight direction when the unmanned aerial vehicle, flying along the target direction, reaches or approaches the preset distance from the obstacle surface (i.e., the plane where the first target is located), until the unmanned aerial vehicle reaches the second target; the flight path of the drone may take various forms, as illustrated by the following examples.
Optionally, the second target is located on a first horizontal plane, which is the horizontal plane at the preset distance from the first target. Correspondingly, one implementation of determining the flight mode of the unmanned aerial vehicle according to the size of the included angle is as follows: when the included angle is greater than the first preset angle (as shown in fig. 5), determining that the flight mode of the unmanned aerial vehicle is: flying from the current position to the first horizontal plane and then flying along the first horizontal plane to the second target; see, for example, fig. 6. Accordingly, one possible implementation of S203 above is: controlling the unmanned aerial vehicle to fly from the current position toward the first horizontal plane until it reaches the first horizontal plane, where its speed in the vertical direction drops to 0, and then controlling the unmanned aerial vehicle to fly along the first horizontal plane to the second target, where its speed in the horizontal direction also drops to 0.
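For illustration only, the two-segment mode above may be sketched as a waypoint list; this sketch enters the first horizontal plane by a purely vertical climb or descent, which is just one of several possible entries (cf. the intersection-point entry described later), and the function name is hypothetical:

```python
def two_segment_waypoints(current, second_target, first_plane_z):
    """Waypoints for: fly from the current position onto the first horizontal
    plane (vertical speed drops to 0 there), then along that plane to the
    second target (horizontal speed drops to 0 there)."""
    cx, cy, _ = current
    entry = (cx, cy, first_plane_z)  # point where the drone meets the plane
    return [current, entry, second_target]
```

The controller would command zero vertical velocity at the entry waypoint and zero horizontal velocity at the final waypoint.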
Optionally, the second target is located on a first horizontal plane, which is the horizontal plane at the preset distance from the first target. Correspondingly, one implementation of determining the flight mode of the unmanned aerial vehicle according to the size of the included angle is as follows: when the included angle is greater than the first preset angle (as shown in fig. 5), determining that the flight mode of the unmanned aerial vehicle is: flying from the current position toward the first target to a first position, which is located on the side of the first horizontal plane facing away from the first target, and then flying from the first position to the second target along an arc-shaped trajectory; see, for example, fig. 7. Here, the drone flying toward the first target may fly along the line connecting the current position and the first target, or may deviate from that line; as long as the flight brings the drone closer to the first target, it falls within the scheme of the embodiment of the present invention. Accordingly, one possible implementation of S203 above is: controlling the unmanned aerial vehicle to fly from the current position toward the first target to the first position, where the vertical distance between the first position and the first target is greater than the vertical distance between the first horizontal plane and the first target, and then controlling the unmanned aerial vehicle to fly from the first position to the second target along the arc-shaped trajectory.
Optionally, the second target is located on a first horizontal plane, which is the horizontal plane at the preset distance from the first target. Correspondingly, one implementation of determining the flight mode of the unmanned aerial vehicle according to the size of the included angle is as follows: when the included angle is greater than the first preset angle and smaller than a second preset angle, the second preset angle being greater than the first preset angle (as shown in fig. 8), determining that the flight mode of the unmanned aerial vehicle is: flying from the current position to the first horizontal plane and then flying along the first horizontal plane to the second target; see, for example, fig. 6.
Optionally, the second target is located on a first horizontal plane, which is the horizontal plane at the preset distance from the first target. Correspondingly, one implementation of determining the flight mode of the unmanned aerial vehicle according to the size of the included angle is as follows: when the included angle is greater than the first preset angle and smaller than a second preset angle, the second preset angle being greater than the first preset angle (as shown in fig. 8), determining that the flight mode of the unmanned aerial vehicle is: flying from the current position toward the first target to a first position, which is located on the side of the first horizontal plane facing away from the first target, and then flying from the first position to the second target along an arc-shaped trajectory; see, for example, fig. 7. Here, the drone flying toward the first target may fly along the line connecting the current position and the first target, or may deviate from that line; as long as the flight brings the drone closer to the first target, it falls within the scheme of the embodiment of the present invention.
Optionally, when it is determined that the flight mode of the drone is to fly from the current position to the first horizontal plane and then along the first horizontal plane to the second target, flying from the current position to the first horizontal plane includes: flying from the current position to a second position on the first horizontal plane, where the second position is the intersection of the line connecting the first target and the current position with the first horizontal plane. For example, as shown in fig. 9, the unmanned aerial vehicle flies toward the first horizontal plane along the line connecting the current position and the first target; the position at which it reaches the first horizontal plane is the second position, that is, the intersection of that line with the first horizontal plane. When the unmanned aerial vehicle is controlled to fly to the second position on the first horizontal plane, its speed in the vertical direction drops to 0.
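For illustration only, the second position may be computed parametrically along the line from the current position to the first target; this sketch assumes the obstacle surface lies below the drone, so the first horizontal plane sits the preset distance above the first target, and the function name is hypothetical:

```python
def second_position(current, first_target, preset_distance):
    """Point where the line from the current position to the first target
    crosses the first horizontal plane (here taken as preset_distance above
    a first target on the ground; for a ceiling the sign would flip)."""
    cx, cy, cz = current
    tx, ty, tz = first_target
    plane_z = tz + preset_distance
    t = (plane_z - cz) / (tz - cz)  # fractional position along the line
    return (cx + t * (tx - cx), cy + t * (ty - cy), plane_z)
```

For example, a drone at (0, 0, 10) pointing at a ground target at (10, 0, 0) with a 2 m preset distance would level off at (8, 0, 2).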
Optionally, when it is determined that the flight mode of the drone is to fly from the current position to the first horizontal plane and then along the first horizontal plane to the second target, flying from the current position to the first horizontal plane includes: flying from the current position toward the first target to a third position, which is located on the side of the first horizontal plane facing away from the first target, and then flying from the third position to the first horizontal plane along an arc-shaped trajectory. For example, as shown in fig. 10, the unmanned aerial vehicle flies toward the first horizontal plane along the line connecting the current position and the first target until it reaches the third position; this embodiment is not limited to flying only along that line, and the unmanned aerial vehicle may, for example, fly toward the first horizontal plane along the line connecting the current position and the second target. The vertical distance between the third position and the first target is greater than the vertical distance between the first horizontal plane and the first target. The unmanned aerial vehicle then flies to the first horizontal plane along the arc-shaped trajectory, and when it is controlled to fly to the second position on the first horizontal plane, its speed in the vertical direction drops to 0.
Optionally, one implementation of determining the flight mode of the unmanned aerial vehicle according to the size of the included angle is as follows: when the included angle is greater than or equal to a second preset angle (as shown in fig. 11), the second preset angle being greater than the first preset angle, determining that the flight mode of the unmanned aerial vehicle is: flying from the current position to the second target along the horizontal plane of the current position, where the second target and the current position are located on the same horizontal plane and the line connecting the second target and the first target is perpendicular to that horizontal plane; see, for example, fig. 12. Accordingly, one possible implementation of S203 above is: controlling the unmanned aerial vehicle to fly from the current position, along the horizontal plane where the current position is located (so its speed in the vertical direction remains 0), to the second target, at which point its speed in the horizontal direction drops to 0. The second target is located in the vertical direction of the first target, and the distance between the second target and the first target in the vertical direction is equal to the distance between the current position and the first target in the vertical direction.
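For illustration only, the second target in this steep-angle mode is simply the point on the drone's current horizontal plane directly above (or below) the first target; the function name is hypothetical:

```python
def overhead_second_target(current, first_target):
    """Second target on the drone's current horizontal plane, directly in the
    vertical direction of the first target, so the line connecting the two
    targets is perpendicular to that plane."""
    tx, ty, _ = first_target
    return (tx, ty, current[2])
```

The drone then flies purely horizontally from `current` to this point, keeping its vertical speed at 0 throughout.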
In some embodiments, the performing subject of the method is a ground control device, and determining a specified position in the image as the first target includes: acquiring a frame-selection operation through an interactive interface; and, when the object framed by the frame-selection operation in the image does not belong to a preset type, determining that the position of the frame-selection operation is the first target. For example, as shown in fig. 13, an image captured by the unmanned aerial vehicle through its capturing device is displayed on the interactive interface. When the unmanned aerial vehicle is to be controlled to fly according to an object selected by the user, the user may frame the object through the interactive interface. Accordingly, the ground control device of this embodiment acquires the frame-selection operation through the interactive interface, acquires the framed object in the image, and then determines whether that object belongs to a preset type (e.g., a person, an automobile, etc.). When the object does not belong to the preset type, the position of the framed object in the image (i.e., the designated position) is determined to be the first target. Then, when the included angle between the horizontal plane and the line connecting the first target and the current position of the unmanned aerial vehicle is greater than the first preset angle, the schemes shown in S202 and S203 are executed; when that included angle is smaller than or equal to the first preset angle, the unmanned aerial vehicle is controlled to fly in the pointing flight mode. And when the object in the image is a preset following object, the unmanned aerial vehicle is controlled to fly in the tracking flight mode.
Optionally, when the object framed by the frame-selection operation in the image belongs to a preset type, the object is determined to be a target following object, and the unmanned aerial vehicle is controlled to fly following the object. For an implementation of how to fly following an object, reference may be made to related descriptions in the prior art, and details are not repeated here.
In some embodiments, the execution subject of the method is a ground control device, and the method further includes: displaying a preset icon at the designated position in the image; and, after the flight mode of the unmanned aerial vehicle is determined according to the size of the included angle, moving the preset icon displayed at the designated position in the image to the position in the image corresponding to the second target. In this embodiment, after the specified position in the image is determined as the first target, a preset icon is displayed at that position, as shown in fig. 14, to indicate that the user has successfully specified a position in the image. When the included angle between the horizontal plane and the line connecting the first target and the current position of the unmanned aerial vehicle is greater than the first preset angle, the flight mode of the unmanned aerial vehicle is determined according to the size of the included angle, and after the second target is determined according to that flight mode, the preset icon is moved from the designated position to the position in the image corresponding to the second target, to indicate that the unmanned aerial vehicle flies to the second target. As shown in fig. 15, this indicates that, although the designated position is the first target, the unmanned aerial vehicle is controlled to fly to the second target, thereby avoiding touching an obstacle and ensuring flight safety.
Fig. 16 is a schematic structural diagram of a flight control device according to a first embodiment of the present invention, and as shown in fig. 16, a flight control device 400 according to this embodiment may include: a target determination module 401, a flight mode determination module 402 and a control module 403.
A target determination module 401 for determining a first target according to a specified position in the image;
a flight mode determining module 402, configured to determine the flight mode of the unmanned aerial vehicle according to the size of the included angle when the included angle between the horizontal plane and the line connecting the first target and the current position of the unmanned aerial vehicle is greater than a first preset angle;
a control module 403, configured to control the unmanned aerial vehicle to fly to a second target according to the determined flight mode, where a distance between the second target and the first target is not less than a preset distance.
Optionally, the flight mode determining module 402 is specifically configured to: when the included angle is larger than the first preset angle, determining that the flight mode of the unmanned aerial vehicle is as follows: flying from the current position to the first horizontal plane and then flying along the first horizontal plane to the second target; or, determining that the flight mode of the unmanned aerial vehicle is: flying from the current position to a first position in the direction of the first target, wherein the first position is positioned on one side of the first horizontal plane, which faces away from the first target, and flying from the first position to the second target according to an arc-shaped track;
the second target is located on a first horizontal plane, and the first horizontal plane is a horizontal plane which is away from the first target by the preset distance.
Optionally, the flight mode determining module 402 is specifically configured to: when the included angle is larger than the first preset angle and smaller than a second preset angle, the second preset angle is larger than the first preset angle, and the flight mode of the unmanned aerial vehicle is determined as follows: flying from the current position to the first horizontal plane and then flying along the first horizontal plane to the second target; or, determining that the flight mode of the unmanned aerial vehicle is: flying from the current position to a first position in the direction of the first target, wherein the first position is positioned on one side of the first horizontal plane, which faces away from the first target, and flying from the first position to the second target according to an arc-shaped track;
the second target is located on a first horizontal plane, and the first horizontal plane is a horizontal plane which is away from the first target by the preset distance.
Optionally, when the flight mode determination module 402 determines that the flight mode of the drone is from the current location to the first horizontal plane and then flies along the first horizontal plane to the second target, the flying from the current location to the first horizontal plane includes: and flying from the current position to a second position on the first horizontal plane, wherein the second position is the intersection point position of a connecting line between the first target and the current position and the first horizontal plane.
Optionally, when the flight mode determination module 402 determines that the flight mode of the drone is from the current location to the first horizontal plane and then flies along the first horizontal plane to the second target, the flying from the current location to the first horizontal plane includes: flying from the current position in a direction toward the first target to a third position, the third position being located on a side of the first horizontal plane facing away from the first target; flying from the third position to the first horizontal plane according to an arc-shaped track.
Optionally, the flight mode determining module 402 is specifically configured to: when the included angle is not smaller than the second preset angle, determine that the flight mode of the unmanned aerial vehicle is: flying from the current position to the second target along the horizontal plane of the current position, where the second target and the current position are located on the same horizontal plane, and the line connecting the second target and the first target is perpendicular to that horizontal plane.
Optionally, the target determining module 401 is specifically configured to: acquire a frame-selection operation through an interactive interface; and, when the object framed by the frame-selection operation in the image does not belong to a preset type, determine that the position of the frame-selection operation is the first target.
Optionally, the target determining module 401 is further configured to determine, when the object framed by the frame-selection operation in the image belongs to a preset type, that the object is a target following object;
the control module 403 is further configured to control the unmanned aerial vehicle to fly following the object when the object is determined to be a target following object.
Optionally, the flight control apparatus 400 of the present embodiment further includes: a display module 404.
The display module 404 is configured to display a preset icon at the specified position in the image; and, after the flight mode determining module 402 determines the flight mode of the unmanned aerial vehicle according to the size of the included angle, to move the preset icon displayed at the specified position in the image to a position in the image corresponding to the second target.
The apparatus of this embodiment may be configured to implement the technical solutions of the above method embodiments of the present invention; the implementation principles and technical effects are similar and are not described here again.
Fig. 17 is a schematic structural diagram of a flight control apparatus according to a second embodiment of the present invention. As shown in Fig. 17, the flight control apparatus 500 of this embodiment may include a memory 501 and a processor 502, which are connected by a bus.
The processor 502 may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 501 is configured to store code for executing a flight control method;
the processor 502 is configured to call the code stored in the memory 501 and execute: determining a first target according to a specified position in the image; when the included angle between the horizontal plane and the line connecting the first target and the current position of the unmanned aerial vehicle is larger than a first preset angle, determining the flight mode of the unmanned aerial vehicle according to the size of the included angle; and controlling the unmanned aerial vehicle to fly to a second target according to the determined flight mode, where the distance between the second target and the first target is not less than a preset distance.
Optionally, the processor 502 is specifically configured to: when the included angle is larger than the first preset angle, determining that the flight mode of the unmanned aerial vehicle is as follows: flying from the current position to the first horizontal plane and then flying along the first horizontal plane to the second target; or, determining that the flight mode of the unmanned aerial vehicle is: flying from the current position to a first position in the direction of the first target, wherein the first position is positioned on one side of the first horizontal plane, which faces away from the first target, and flying from the first position to the second target according to an arc-shaped track;
the second target is located on a first horizontal plane, and the first horizontal plane is a horizontal plane which is away from the first target by the preset distance.
Optionally, the processor 502 is specifically configured to: when the included angle is larger than the first preset angle and smaller than a second preset angle (the second preset angle being larger than the first preset angle), determine that the flight mode of the unmanned aerial vehicle is: flying from the current position to the first horizontal plane and then flying along the first horizontal plane to the second target; or, determine that the flight mode of the unmanned aerial vehicle is: flying from the current position to a first position in the direction of the first target, wherein the first position is positioned on the side of the first horizontal plane facing away from the first target, and flying from the first position to the second target according to an arc-shaped track;
the second target is located on a first horizontal plane, and the first horizontal plane is a horizontal plane which is away from the first target by the preset distance.
Optionally, when the processor 502 determines that the flight mode of the unmanned aerial vehicle is to fly from the current position to the first horizontal plane and then fly along the first horizontal plane to the second target, the flying from the current position to the first horizontal plane includes: flying from the current position to a second position on the first horizontal plane, where the second position is the intersection of the line connecting the first target and the current position with the first horizontal plane.
Optionally, when the processor 502 determines that the flight mode of the unmanned aerial vehicle is to fly from the current position to the first horizontal plane and then fly along the first horizontal plane to the second target, the flying from the current position to the first horizontal plane includes: flying from the current position toward the first target to a third position, the third position being located on the side of the first horizontal plane facing away from the first target; and flying from the third position to the first horizontal plane according to an arc-shaped track.
Optionally, the processor 502 is specifically configured to: when the included angle is not smaller than the second preset angle, determine that the flight mode of the unmanned aerial vehicle is: flying from the current position to the second target along the horizontal plane of the current position, where the second target and the current position are located on the same horizontal plane, and the line connecting the second target and the first target is perpendicular to that horizontal plane.
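Taken together, the angle-threshold decision described above can be sketched as follows. The included angle is measured between the horizontal plane and the line joining the current position to the first target; the mode labels and threshold values are illustrative assumptions, and the arc-trajectory alternative is omitted for brevity:

```python
import math

# Illustrative sketch of selecting a flight mode from the included angle
# between the horizontal plane and the line connecting the current position
# to the first target. Positions are (x, y, z) tuples with z as altitude;
# angles are in degrees. Mode names and thresholds are assumptions.

def select_flight_mode(current, target, first_angle_deg, second_angle_deg):
    dx, dy = target[0] - current[0], target[1] - current[1]
    dz = abs(current[2] - target[2])
    # Included angle between the horizontal plane and the connecting line.
    angle = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    if angle <= first_angle_deg:
        # Angle not above the first preset angle: no special handling here.
        return "no_special_handling"
    if angle < second_angle_deg:
        # Between the two preset angles: descend (or climb) to the first
        # horizontal plane, then fly along it to the second target.
        return "to_first_plane_then_horizontal"
    # At or above the second preset angle: stay on the current horizontal
    # plane and fly to a second target whose connecting line to the first
    # target is perpendicular to that plane.
    return "horizontal_at_current_level"
```

With thresholds of 30 and 80 degrees, for instance, a target far below and ahead yields the descend-then-horizontal mode, while a target almost directly beneath the vehicle yields the horizontal mode at the current altitude.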
Optionally, the flight control apparatus 500 in the above embodiments may be an unmanned aerial vehicle, or may be a ground control device.
Optionally, when the flight control apparatus 500 of this embodiment is a ground control device, the flight control apparatus 500 further includes an interactive interface 503, which is connected to the processor 502 via a bus.
Optionally, the interactive interface 503 is configured to detect a box-selection operation.
The processor 502 is specifically configured to: acquire the box-selection operation through the interactive interface 503; and, when the object selected in the image by the box-selection operation does not belong to a preset type, determine that the position of the selection box is the first target.
Optionally, the processor 502 is further configured to: when the object selected in the image by the box-selection operation belongs to a preset type, determine that the object is a target following object; and control the unmanned aerial vehicle to fly following the object when the object is a target following object.
Optionally, the interactive interface 503 is configured to display a preset icon at the specified position in the image; and, after the processor 502 determines the flight mode of the unmanned aerial vehicle according to the size of the included angle, to move the preset icon displayed at the specified position in the image to a position in the image corresponding to the second target.
The apparatus of this embodiment may be configured to implement the technical solutions of the above method embodiments of the present invention; the implementation principles and technical effects are similar and are not described here again.
Fig. 18 is a schematic structural diagram of a flight control system of an unmanned aerial vehicle according to an embodiment of the present invention. As shown in Fig. 18, the flight control system 800 of this embodiment includes a flight control device 600 and an unmanned aerial vehicle 700. The flight control device 600 may adopt the structure of the apparatus embodiment shown in Fig. 16 or Fig. 17 and, accordingly, may execute the technical solutions of the above method embodiments of the present invention; the implementation principles and technical effects are similar and are not described here again.
Those of ordinary skill in the art will understand that all or part of the steps of the method embodiments may be implemented by program instructions directing the relevant hardware; the program may be stored in a computer-readable storage medium and, when executed, performs the steps of the method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Finally, it should be noted that the above embodiments are merely intended to illustrate, not to limit, the technical solutions of the present invention. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention.
Claims (28)
1. A flight control method, comprising:
determining a first target according to a specified position in the image;
when the included angle between the horizontal plane and the connecting line of the first target and the current position of the unmanned aerial vehicle is larger than a first preset angle, determining the flight mode of the unmanned aerial vehicle according to the size of the included angle;
and controlling the unmanned aerial vehicle to fly to a second target according to the determined flight mode, wherein the distance between the second target and the first target is not less than a preset distance.
2. The method of claim 1, wherein the determining the flight mode of the unmanned aerial vehicle according to the size of the included angle comprises:
when the included angle is larger than the first preset angle, determining that the flight mode of the unmanned aerial vehicle is as follows: flying from the current position to a first horizontal plane and then flying along the first horizontal plane to the second target;
or, determining that the flight mode of the unmanned aerial vehicle is: flying from the current position to a first position in the direction of the first target, wherein the first position is positioned on one side of the first horizontal plane, which faces away from the first target, and flying from the first position to the second target according to an arc-shaped track;
the second target is located on a first horizontal plane, and the first horizontal plane is a horizontal plane which is away from the first target by the preset distance.
3. The method of claim 1, wherein the determining the flight mode of the unmanned aerial vehicle according to the size of the included angle comprises:
when the included angle is larger than the first preset angle and smaller than a second preset angle (the second preset angle being larger than the first preset angle),
determining that the flight mode of the unmanned aerial vehicle is: flying from the current position to a first horizontal plane and then flying along the first horizontal plane to the second target;
or, determining that the flight mode of the unmanned aerial vehicle is: flying from the current position to a first position in the direction of the first target, wherein the first position is positioned on one side of the first horizontal plane, which faces away from the first target, and flying from the first position to the second target according to an arc-shaped track;
the second target is located on a first horizontal plane, and the first horizontal plane is a horizontal plane which is away from the first target by the preset distance.
4. The method of claim 2 or 3, wherein, when it is determined that the flight mode of the unmanned aerial vehicle is to fly from the current position to the first horizontal plane and then fly along the first horizontal plane to the second target, the flying from the current position to the first horizontal plane comprises:
flying from the current position to a second position on the first horizontal plane, wherein the second position is the intersection of the line connecting the first target and the current position with the first horizontal plane.
5. The method of claim 2 or 3, wherein, when it is determined that the flight mode of the unmanned aerial vehicle is to fly from the current position to the first horizontal plane and then fly along the first horizontal plane to the second target, the flying from the current position to the first horizontal plane comprises:
flying from the current position in a direction toward the first target to a third position, the third position being located on a side of the first horizontal plane facing away from the first target;
flying from the third position to the first horizontal plane according to an arc-shaped track.
6. The method of claim 3, wherein the determining the flight mode of the unmanned aerial vehicle according to the size of the included angle further comprises:
when the included angle is not smaller than the second preset angle, determining that the flight mode of the unmanned aerial vehicle is: flying from the current position to the second target along the horizontal plane of the current position, wherein the second target and the current position are located on the same horizontal plane, and the line connecting the second target and the first target is perpendicular to the horizontal plane.
7. The method according to any one of claims 1 to 3 or 6, wherein the method is performed by a ground control device, and the determining a first target according to a specified position in the image comprises:
acquiring a box-selection operation through an interactive interface;
and, when the object selected in the image by the box-selection operation does not belong to a preset type, determining that the position of the selection box is the first target.
8. The method of claim 7, further comprising:
when the object selected in the image by the box-selection operation belongs to a preset type, determining that the object is a target following object;
and controlling the unmanned aerial vehicle to fly following the object when the object is a target following object.
9. The method of any one of claims 1-3, 6, 8, wherein the method is performed by a ground control device, the method further comprising:
displaying a preset icon at the specified position in the image;
after the determining the flight mode of the unmanned aerial vehicle according to the size of the included angle, the method further comprises:
moving the preset icon displayed at the specified position in the image to a position in the image corresponding to the second target.
10. A flight control apparatus, comprising:
a target determination module for determining a first target according to a specified position in the image;
a flight mode determining module, for determining the flight mode of the unmanned aerial vehicle according to the size of the included angle when the included angle between the horizontal plane and the line connecting the first target and the current position of the unmanned aerial vehicle is larger than a first preset angle;
and a control module, for controlling the unmanned aerial vehicle to fly to a second target according to the determined flight mode, wherein the distance between the second target and the first target is not less than a preset distance.
11. The apparatus of claim 10, wherein the flight mode determination module is specifically configured to: when the included angle is larger than the first preset angle, determining that the flight mode of the unmanned aerial vehicle is as follows: flying from the current position to a first horizontal plane and then flying along the first horizontal plane to the second target; or, determining that the flight mode of the unmanned aerial vehicle is: flying from the current position to a first position in the direction of the first target, wherein the first position is positioned on one side of the first horizontal plane, which faces away from the first target, and flying from the first position to the second target according to an arc-shaped track;
the second target is located on a first horizontal plane, and the first horizontal plane is a horizontal plane which is away from the first target by the preset distance.
12. The apparatus of claim 10, wherein the flight mode determination module is specifically configured to: when the included angle is larger than the first preset angle and smaller than a second preset angle (the second preset angle being larger than the first preset angle), determine that the flight mode of the unmanned aerial vehicle is: flying from the current position to a first horizontal plane and then flying along the first horizontal plane to the second target; or, determine that the flight mode of the unmanned aerial vehicle is: flying from the current position to a first position in the direction of the first target, wherein the first position is positioned on the side of the first horizontal plane facing away from the first target, and flying from the first position to the second target according to an arc-shaped track;
the second target is located on a first horizontal plane, and the first horizontal plane is a horizontal plane which is away from the first target by the preset distance.
13. The apparatus of claim 11 or 12, wherein, when the flight mode determination module determines that the flight mode of the unmanned aerial vehicle is to fly from the current position to the first horizontal plane and then fly along the first horizontal plane to the second target, the flying from the current position to the first horizontal plane comprises: flying from the current position to a second position on the first horizontal plane, wherein the second position is the intersection of the line connecting the first target and the current position with the first horizontal plane.
14. The apparatus of claim 11 or 12, wherein, when the flight mode determination module determines that the flight mode of the unmanned aerial vehicle is to fly from the current position to the first horizontal plane and then fly along the first horizontal plane to the second target, the flying from the current position to the first horizontal plane comprises: flying from the current position toward the first target to a third position, the third position being located on the side of the first horizontal plane facing away from the first target; and flying from the third position to the first horizontal plane according to an arc-shaped track.
15. The apparatus of claim 12, wherein the flight mode determination module is specifically configured to: when the included angle is not smaller than the second preset angle, determine that the flight mode of the unmanned aerial vehicle is: flying from the current position to the second target along the horizontal plane of the current position, wherein the second target and the current position are located on the same horizontal plane, and the line connecting the second target and the first target is perpendicular to the horizontal plane.
16. The apparatus according to any one of claims 10-12, 15, wherein the target determination module is specifically configured to: acquire a box-selection operation through an interactive interface; and, when the object selected in the image by the box-selection operation does not belong to a preset type, determine that the position of the selection box is the first target.
17. The apparatus of claim 16, wherein the target determination module is further configured to determine that the object is a target following object when the object selected in the image by the box-selection operation belongs to a preset type;
and the control module is further configured to control the unmanned aerial vehicle to fly following the object when the object is a target following object.
18. The apparatus of any one of claims 10-12, 15, 17, further comprising:
a display module, for displaying a preset icon at the specified position in the image; and, after the flight mode determination module determines the flight mode of the unmanned aerial vehicle according to the size of the included angle, moving the preset icon displayed at the specified position in the image to a position in the image corresponding to the second target.
19. A flight control apparatus, comprising: a memory and a processor;
the memory for storing code for performing a flight control method;
the processor is used for calling the codes stored in the memory and executing: determining a first target according to a specified position in the image; when the included angle between the horizontal plane and the connecting line of the first target and the current position of the unmanned aerial vehicle is larger than a first preset angle, determining the flight mode of the unmanned aerial vehicle according to the size of the included angle; and controlling the unmanned aerial vehicle to fly to a second target according to the determined flight mode, wherein the distance between the second target and the first target is not less than a preset distance.
20. The apparatus of claim 19, wherein the processor is specifically configured to: when the included angle is larger than the first preset angle, determining that the flight mode of the unmanned aerial vehicle is as follows: flying from the current position to a first horizontal plane and then flying along the first horizontal plane to the second target; or, determining that the flight mode of the unmanned aerial vehicle is: flying from the current position to a first position in the direction of the first target, wherein the first position is positioned on one side of the first horizontal plane, which faces away from the first target, and flying from the first position to the second target according to an arc-shaped track;
the second target is located on a first horizontal plane, and the first horizontal plane is a horizontal plane which is away from the first target by the preset distance.
21. The apparatus of claim 19, wherein the processor is specifically configured to: when the included angle is larger than the first preset angle and smaller than a second preset angle (the second preset angle being larger than the first preset angle), determine that the flight mode of the unmanned aerial vehicle is: flying from the current position to a first horizontal plane and then flying along the first horizontal plane to the second target; or, determine that the flight mode of the unmanned aerial vehicle is: flying from the current position to a first position in the direction of the first target, wherein the first position is positioned on the side of the first horizontal plane facing away from the first target, and flying from the first position to the second target according to an arc-shaped track;
the second target is located on a first horizontal plane, and the first horizontal plane is a horizontal plane which is away from the first target by the preset distance.
22. The apparatus of claim 20 or 21, wherein, when the processor determines that the flight mode of the unmanned aerial vehicle is to fly from the current position to the first horizontal plane and then fly along the first horizontal plane to the second target, the flying from the current position to the first horizontal plane comprises: flying from the current position to a second position on the first horizontal plane, wherein the second position is the intersection of the line connecting the first target and the current position with the first horizontal plane.
23. The apparatus of claim 20 or 21, wherein, when the processor determines that the flight mode of the unmanned aerial vehicle is to fly from the current position to the first horizontal plane and then fly along the first horizontal plane to the second target, the flying from the current position to the first horizontal plane comprises: flying from the current position toward the first target to a third position, the third position being located on the side of the first horizontal plane facing away from the first target; and flying from the third position to the first horizontal plane according to an arc-shaped track.
24. The apparatus of claim 21, wherein the processor is specifically configured to: when the included angle is not smaller than the second preset angle, determine that the flight mode of the unmanned aerial vehicle is: flying from the current position to the second target along the horizontal plane of the current position, wherein the second target and the current position are located on the same horizontal plane, and the line connecting the second target and the first target is perpendicular to the horizontal plane.
25. The apparatus of any one of claims 19-21, 24, wherein the apparatus is a ground control device, the apparatus further comprising:
an interactive interface, for detecting a box-selection operation;
wherein the processor is specifically configured to: acquire the box-selection operation through the interactive interface; and, when the object selected in the image by the box-selection operation does not belong to a preset type, determine that the position of the selection box is the first target.
26. The apparatus of claim 25, wherein the processor is further configured to: determine that the object is a target following object when the object selected in the image by the box-selection operation belongs to a preset type; and control the unmanned aerial vehicle to fly following the object when the object is a target following object.
27. The apparatus of any one of claims 19-21, 24, 26, wherein the apparatus is a ground control device, the apparatus further comprising: an interactive interface, for displaying a preset icon at the specified position in the image; and, after the processor determines the flight mode of the unmanned aerial vehicle according to the size of the included angle, moving the preset icon displayed at the specified position in the image to a position in the image corresponding to the second target.
28. The apparatus according to any one of claims 19 to 21, 24,
the device is an unmanned aerial vehicle;
alternatively, the device is a ground control device.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110169187.8A CN112987782A (en) | 2016-12-22 | 2016-12-22 | Flight control method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2016/111564 WO2018112848A1 (en) | 2016-12-22 | 2016-12-22 | Flight control method and apparatus |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110169187.8A Division CN112987782A (en) | 2016-12-22 | 2016-12-22 | Flight control method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108450032A CN108450032A (en) | 2018-08-24 |
CN108450032B true CN108450032B (en) | 2021-03-02 |
Family
ID=62624251
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201680076224.8A Expired - Fee Related CN108450032B (en) | 2016-12-22 | 2016-12-22 | Flight control method and device |
CN202110169187.8A Withdrawn CN112987782A (en) | 2016-12-22 | 2016-12-22 | Flight control method and device |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110169187.8A Withdrawn CN112987782A (en) | 2016-12-22 | 2016-12-22 | Flight control method and device |
Country Status (2)
Country | Link |
---|---|
CN (2) | CN108450032B (en) |
WO (1) | WO2018112848A1 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109540834A (en) * | 2018-12-13 | 2019-03-29 | 深圳市太赫兹科技创新研究院 | A kind of cable aging monitoring method and system |
CN109947096B (en) * | 2019-02-25 | 2022-06-21 | 广州极飞科技股份有限公司 | Controlled object control method and device and unmanned system |
CN110673642B (en) * | 2019-10-28 | 2022-10-28 | 深圳市赛为智能股份有限公司 | Unmanned aerial vehicle landing control method and device, computer equipment and storage medium |
CN113759985A (en) * | 2021-08-03 | 2021-12-07 | 华南理工大学 | Unmanned aerial vehicle flight control method, system, device and storage medium |
CN114115351B (en) * | 2021-12-06 | 2024-07-05 | 歌尔科技有限公司 | Obstacle avoidance method for aircraft, and computer-readable storage medium |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105141851A (en) * | 2015-09-29 | 2015-12-09 | 杨珊珊 | Control system and control method for unmanned aerial vehicle and unmanned aerial vehicle |
CN105278543A (en) * | 2015-09-28 | 2016-01-27 | 小米科技有限责任公司 | Method and device for increasing flight security, and electronic equipment |
CN105334980A (en) * | 2007-12-31 | 2016-02-17 | 微软国际控股私有有限公司 | 3D pointing system |
CN105518555A (en) * | 2014-07-30 | 2016-04-20 | 深圳市大疆创新科技有限公司 | Systems and methods for target tracking |
CN105955304A (en) * | 2016-07-06 | 2016-09-21 | 零度智控(北京)智能科技有限公司 | Obstacle avoidance method, obstacle avoidance device and unmanned aerial vehicle |
CN105955292A (en) * | 2016-05-20 | 2016-09-21 | 腾讯科技(深圳)有限公司 | Aircraft flight control method and system, mobile terminal and aircraft |
CN106022274A (en) * | 2016-05-24 | 2016-10-12 | 零度智控(北京)智能科技有限公司 | Obstacle avoiding method, obstacle avoiding device and unmanned machine |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI408568B (en) * | 2010-06-24 | 2013-09-11 | Hon Hai Prec Ind Co Ltd | Handheld device and method for controlling a unmanned aerial vehicle using the handheld device |
CN201804119U (en) * | 2010-08-19 | 2011-04-20 | 中国测绘科学研究院 | Aerial photographic navigation control system for airborne global positioning system |
CN102707724B (en) * | 2012-06-05 | 2015-01-14 | 清华大学 | Visual localization and obstacle avoidance method and system for unmanned plane |
CN102854886B (en) * | 2012-08-29 | 2016-01-20 | 深圳一电科技有限公司 | The method and apparatus of flight line editor and control |
IL222053A (en) * | 2012-09-23 | 2016-11-30 | Israel Aerospace Ind Ltd | System, method and computer program product for maneuvering an air vehicle |
CN103019250B (en) * | 2012-12-03 | 2015-01-07 | 华北电力大学 | Bevel take-off control method of inspection flying robot |
KR101483058B1 (en) * | 2014-01-21 | 2015-01-15 | 엘아이지넥스원 주식회사 | Ground control system for UAV anticollision |
US9881021B2 (en) * | 2014-05-20 | 2018-01-30 | Verizon Patent And Licensing Inc. | Utilization of third party networks and third party unmanned aerial vehicle platforms |
CN105517666B (en) * | 2014-09-05 | 2019-08-27 | 深圳市大疆创新科技有限公司 | Offline mode selection based on scene |
CN104991563B (en) * | 2015-05-12 | 2023-10-03 | 零度智控(北京)智能科技有限公司 | Unmanned aerial vehicle hierarchical operation method and system |
CN105700550B (en) * | 2016-01-26 | 2018-06-26 | 深圳市大疆创新科技有限公司 | Unmanned plane and its flight control method and system |
CN105867400A (en) * | 2016-04-20 | 2016-08-17 | 北京博瑞爱飞科技发展有限公司 | Flying control method and device for unmanned aerial vehicle |
CN105955298B (en) * | 2016-06-03 | 2018-09-07 | 腾讯科技(深圳)有限公司 | A kind of automatic obstacle-avoiding method and device of aircraft |
2016
- 2016-12-22 WO PCT/CN2016/111564 patent/WO2018112848A1/en active Application Filing
- 2016-12-22 CN CN201680076224.8A patent/CN108450032B/en not_active Expired - Fee Related
- 2016-12-22 CN CN202110169187.8A patent/CN112987782A/en not_active Withdrawn
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105334980A (en) * | 2007-12-31 | 2016-02-17 | 微软国际控股私有有限公司 | 3D pointing system |
CN105518555A (en) * | 2014-07-30 | 2016-04-20 | 深圳市大疆创新科技有限公司 | Systems and methods for target tracking |
CN105278543A (en) * | 2015-09-28 | 2016-01-27 | 小米科技有限责任公司 | Method and device for increasing flight security, and electronic equipment |
CN105141851A (en) * | 2015-09-29 | 2015-12-09 | 杨珊珊 | Control system and control method for unmanned aerial vehicle and unmanned aerial vehicle |
CN105955292A (en) * | 2016-05-20 | 2016-09-21 | 腾讯科技(深圳)有限公司 | Aircraft flight control method and system, mobile terminal and aircraft |
CN106022274A (en) * | 2016-05-24 | 2016-10-12 | 零度智控(北京)智能科技有限公司 | Obstacle avoiding method, obstacle avoiding device and unmanned machine |
CN105955304A (en) * | 2016-07-06 | 2016-09-21 | 零度智控(北京)智能科技有限公司 | Obstacle avoidance method, obstacle avoidance device and unmanned aerial vehicle |
Also Published As
Publication number | Publication date |
---|---|
CN108450032A (en) | 2018-08-24 |
WO2018112848A1 (en) | 2018-06-28 |
CN112987782A (en) | 2021-06-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11644832B2 (en) | User interaction paradigms for a flying digital assistant | |
US11635775B2 (en) | Systems and methods for UAV interactive instructions and control | |
US11861892B2 (en) | Object tracking by an unmanned aerial vehicle using visual sensors | |
US11879737B2 (en) | Systems and methods for auto-return | |
US11749124B2 (en) | User interaction with an autonomous unmanned aerial vehicle | |
US10860040B2 (en) | Systems and methods for UAV path planning and control | |
US11604479B2 (en) | Methods and system for vision-based landing | |
CN108351654B (en) | System and method for visual target tracking | |
CN108450032B (en) | Flight control method and device | |
CN107850436B (en) | Sensor fusion using inertial and image sensors | |
CN107850901B (en) | Sensor fusion using inertial and image sensors | |
CN107615211B (en) | Method and system for estimating state information of movable object using sensor fusion | |
CN107850899B (en) | Sensor fusion using inertial and image sensors | |
CN112567201A (en) | Distance measuring method and apparatus | |
CN114879715A (en) | Unmanned aerial vehicle control method and device and unmanned aerial vehicle | |
WO2020107487A1 (en) | Image processing method and unmanned aerial vehicle | |
JP2021047738A (en) | Moving object, flight path control method and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20210302 ||