CN108450032A - Flight control method and device - Google Patents


Info

Publication number
CN108450032A
CN108450032A (Application CN201680076224.8A)
Authority
CN
China
Prior art keywords
target
UAV
horizontal plane
current location
fly
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201680076224.8A
Other languages
Chinese (zh)
Other versions
CN108450032B
Inventor
郭灼
苏冠华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd
Priority to CN202110169187.8A (published as CN112987782A)
Publication of CN108450032A
Application granted
Publication of CN108450032B
Legal status: Expired - Fee Related


Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/106Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A flight control method and device (400). The method includes: determining a specified position in an image as a first target (S201); when the angle between the horizontal plane and the line connecting the first target and the current position of the UAV exceeds a first preset angle, determining the flight mode of the UAV according to the size of the angle (S202); and controlling the UAV to fly toward a second target according to the determined flight mode, where the distance between the second target and the first target is not less than a preset distance (S203). Even when the angle between the horizontal plane and the line connecting the first target and the UAV's current position exceeds the first preset angle, the UAV can still be controlled to fly toward the second target, so that it flies to a second target located a preset distance from the first target. The UAV therefore does not easily collide with obstacles, flight safety is ensured, and the range of target positions available for pointing flight is enlarged.

Description

Flight control method and device
Technical field
Embodiments of the present invention relate to the field of aerial vehicle technology, and in particular to a flight control method and device.
Background
An existing UAV captures images through an onboard camera and presents them to the user in real time via a display interface. If the user is interested in an object in the image, the UAV can be commanded to enter a pointing (tap-to-fly) mode: the user specifies a position on the image, and the aircraft flies toward that position. However, when the camera faces the ground, the aircraft cannot enter pointing mode, for safety reasons.
Summary of the invention
Embodiments of the present invention provide a flight control method and device that keep a UAV from easily colliding with obstacles, ensure flight safety, and enlarge the range of target positions available for pointing flight.
In a first aspect, an embodiment of the present invention provides a flight control method, comprising:
determining a first target according to a specified position in an image;
when the angle between the horizontal plane and the line connecting the first target and the current position of the UAV is greater than a first preset angle, determining a flight mode of the UAV according to the size of the angle; and
controlling the UAV to fly to a second target according to the determined flight mode, wherein the distance between the second target and the first target is not less than a preset distance.
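The angle test in the method above can be sketched in code. This is a minimal illustration only, not the patent's implementation; the threshold value, the mode names, and the coordinate convention (z up) are assumptions.

```python
import math
from enum import Enum

class FlightMode(Enum):
    POINTING = 1  # fly directly toward the tapped target
    OFFSET = 2    # fly toward a second target a preset distance away

def select_flight_mode(uav_pos, target_pos, first_preset_angle_deg=45.0):
    """Pick a flight mode from the elevation angle between the horizontal
    plane and the line connecting the UAV's current position to the target.

    Positions are (x, y, z) tuples in a world frame with z pointing up.
    Returns (mode, angle_in_degrees).
    """
    dx = target_pos[0] - uav_pos[0]
    dy = target_pos[1] - uav_pos[1]
    dz = target_pos[2] - uav_pos[2]
    horizontal = math.hypot(dx, dy)
    # Angle between the connecting line and the horizontal plane.
    angle_deg = math.degrees(math.atan2(abs(dz), horizontal))
    if angle_deg > first_preset_angle_deg:
        return FlightMode.OFFSET, angle_deg
    return FlightMode.POINTING, angle_deg
```

For example, a UAV hovering at (0, 0, 100) whose user taps a ground point at (10, 0, 0) sees an elevation angle of about 84 degrees, so the offset mode is chosen.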
In a second aspect, an embodiment of the present invention provides a flight control device, comprising:
a target determination module, configured to determine a first target according to a specified position in an image;
a flight mode determination module, configured to determine a flight mode of the UAV according to the size of the angle when the angle between the horizontal plane and the line connecting the first target and the current position of the UAV is greater than a first preset angle; and
a control module, configured to control the UAV to fly to a second target according to the determined flight mode, wherein the distance between the second target and the first target is not less than a preset distance.
In a third aspect, an embodiment of the present invention provides a flight control device, comprising a memory and a processor;
the memory is configured to store code for executing the flight control method;
the processor is configured to call the code stored in the memory to: determine a first target according to a specified position in an image; when the angle between the horizontal plane and the line connecting the first target and the current position of the UAV is greater than a first preset angle, determine a flight mode of the UAV according to the size of the angle; and control the UAV to fly to a second target according to the determined flight mode, wherein the distance between the second target and the first target is not less than a preset distance.
In a fourth aspect, an embodiment of the present invention provides a flight control system for a UAV, comprising: a UAV; and the flight control device provided in the second or third aspect of the present invention.
With the flight control method, device, and UAV flight control system provided by embodiments of the present invention, when the angle between the horizontal plane and the line connecting the UAV and the first target determined from a specified position in an image is greater than a first preset angle, the UAV is controlled to fly toward a second target, so that it flies to a second target located a preset distance from the first target. The UAV therefore does not easily collide with obstacles, flight safety is ensured, and the range of target positions available for pointing flight is enlarged.
Detailed description of the invention
To explain the technical solutions in the embodiments of the present invention or in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. The drawings described below show some embodiments of the invention; those of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic architectural diagram of an unmanned flight system 100 according to an embodiment of the present invention;
Fig. 2 is a flowchart of a flight control method provided by an embodiment of the present invention;
Fig. 3 is a schematic diagram of determining a first target using multiple imaging devices, provided by an embodiment of the present invention;
Fig. 4 is a schematic diagram of the angle between the horizontal plane and the line connecting the first target and the current position of the UAV, provided by an embodiment of the present invention;
Fig. 5 is a schematic diagram of the case where the angle between the horizontal plane and the line connecting the first target and the current position of the UAV is greater than a first preset angle;
Fig. 6 is a schematic diagram of a flight mode of the UAV provided by an embodiment of the present invention;
Fig. 7 is a schematic diagram of a flight mode of the UAV provided by an embodiment of the present invention;
Fig. 8 is a schematic diagram of the case where the angle is greater than the first preset angle and less than a second preset angle;
Fig. 9 is a schematic diagram of a flight mode of the UAV provided by an embodiment of the present invention;
Fig. 10 is a schematic diagram of a flight mode of the UAV provided by an embodiment of the present invention;
Fig. 11 is a schematic diagram of the case where the angle is greater than the second preset angle;
Fig. 12 is a schematic diagram of a flight mode of the UAV provided by an embodiment of the present invention;
Fig. 13 is a schematic diagram of a ground control device controlling the flight of the UAV, provided by an embodiment of the present invention;
Fig. 14 is a schematic diagram of displaying a preset icon, provided by an embodiment of the present invention;
Fig. 15 is a schematic diagram of displaying a preset icon, provided by an embodiment of the present invention;
Fig. 16 is a structural schematic diagram of the flight control device provided by Embodiment 1 of the present invention;
Fig. 17 is a structural schematic diagram of the flight control device provided by Embodiment 2 of the present invention;
Fig. 18 is a structural schematic diagram of a UAV flight control system provided by an embodiment of the present invention.
Specific embodiment
To make the objectives, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the drawings. The described embodiments are some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on these embodiments without creative effort fall within the protection scope of the present invention.
Embodiments of the present invention provide a flight control method and device and a UAV flight control system. The following description uses a UAV as an example of an unmanned vehicle. It will be apparent to those skilled in the art that other kinds of unmanned vehicles can be used without restriction, and the embodiments of the present invention can be applied to various types of UAVs. For example, the UAV can be a small or large UAV. In certain embodiments, the UAV can be a rotorcraft, for example a multi-rotor UAV driven through the air by multiple propulsion devices, but the embodiments of the present invention are not limited to this, and the UAV can also be another type of UAV.
Fig. 1 is a schematic architectural diagram of an unmanned flight system 100 according to an embodiment of the present invention. This embodiment is illustrated by taking a rotorcraft UAV as an example.
The unmanned flight system 100 may include a UAV 110, a gimbal 120, a display device 130, and a control device 140. The UAV 110 may include a power system 150, a flight control system 160, and an airframe 170. The UAV 110 can communicate wirelessly with a ground control device, which may include the control device 140 and/or the display device 130.
The airframe 170 may include a fuselage and landing gear. The fuselage may include a center frame and one or more arms connected to and extending radially from the center frame. The landing gear is connected to the fuselage and supports the UAV 110 when it lands.
The power system 150 may include an electronic speed controller (ESC) 151, one or more propellers 153, and one or more motors 152 corresponding to the one or more propellers 153. Each motor 152 is connected between an ESC 151 and a propeller 153, with the motor 152 and propeller 153 mounted on the corresponding arm. The ESC 151 receives a drive signal generated by the flight control system 160 and supplies a drive current to the motor 152 according to the drive signal, to control the rotational speed of the motor 152. The motors 152 drive the propellers to rotate, providing power for the flight of the UAV 110 and enabling it to move in one or more degrees of freedom. In certain embodiments, the UAV 110 can rotate about one or more rotation axes, which may include, for example, a roll axis, a yaw axis, and a pitch axis. It should be understood that the motor 152 can be a DC motor or an AC motor, and can be brushless or brushed.
The flight control system 160 may include a flight controller 161 and a sensing system 162. The sensing system 162 measures the attitude information of the UAV, that is, the spatial position and state of the UAV 110, for example its three-dimensional position, attitude, velocity, acceleration, and angular velocity. The sensing system 162 may include, for example, at least one of a gyroscope, an electronic compass, an inertial measurement unit (IMU), a vision sensor, a global navigation satellite system (GNSS) receiver, and a barometer. For example, the GNSS can be the Global Positioning System (GPS). The flight controller 161 controls the flight of the UAV 110, for example according to the attitude information measured by the sensing system 162. It should be understood that the flight controller 161 can control the UAV 110 according to pre-programmed instructions, or by responding to one or more control commands from the control device 140.
The gimbal 120 may include an ESC 121 and a motor 122. The gimbal carries the imaging device 123. The flight controller 161 can control the movement of the gimbal 120 through the ESC 121 and the motor 122. Optionally, in another embodiment, the gimbal 120 may also include its own controller, which controls the movement of the gimbal 120 through the ESC 121 and the motor 122. It should be understood that the gimbal 120 can be independent of the UAV 110 or be a part of it, that the motor 122 can be a DC or AC motor and can be brushless or brushed, and that the gimbal can be located on the top or the bottom of the UAV.
The imaging device 123 may be, for example, a camera or video camera for capturing images; it can communicate with the flight controller and shoot under its control.
The display device 130 can communicate wirelessly with the UAV 110 and can display the attitude information of the UAV 110 as well as the images captured by the imaging device. It should be understood that the display device 130 can be a standalone device or can be integrated into the control device 140.
The display device may include a screen, which may or may not be a touch screen; it can be a light-emitting diode (LED) screen, an OLED screen, a liquid crystal display (LCD) screen, a plasma screen, or any other type of screen. The display device can be configured to display a graphical user interface (GUI) that shows images through which the user can control the movement of the UAV. For example, the user can select a target from the image; the target can be stationary or moving. The user can also select a direction of travel from the image. The user can select a part of the image (for example, a point, a region, and/or an object) to define the target and/or direction. The user can make this selection by directly touching the screen (for a touch screen), for example by touching a point on it; alternatively, the user can choose a region from a pre-existing set of regions on the screen, draw the boundary of a region, or designate a part of the image in any other manner. The user can also select the part of the image by means of a user interaction device, for example a mouse, joystick, keyboard, trackball, touchpad, button, verbal command, gesture recognition, attitude sensor, thermal sensor, capacitive touch sensor, or any other device. A touch screen can be configured to detect the user's touch position, touch duration, touch pressure, and/or touch movement, each of which can indicate a specific input command from the user.
The image on the display device can show a view collected by the payload of the movable object. For example, the images collected by the imaging device can be shown on the display device; this can be considered a first-person view (FPV). In some cases, a single imaging device, and hence a single FPV, can be provided. Alternatively, multiple imaging devices with different fields of view can be provided; the display can then switch between the multiple FPVs or show them simultaneously. The multiple FPVs can correspond to (or be generated by) different imaging devices with different fields of view. The user at the user terminal can select a part of the image collected by an imaging device to specify the target and/or the direction of motion of the movable object.
In another example, the image on the display device can show a map generated by means of information from the payload of the movable object. The map can optionally be generated by means of multiple imaging devices (for example, a right camera, a left camera, or more cameras), which can employ stereoscopic mapping techniques. In some cases, the map can be generated based on position information about the UAV relative to the environment, the imaging device relative to the environment, and/or the UAV relative to the imaging device. Position information can include attitude information, spatial position information, angular velocity, linear velocity, angular acceleration, and/or linear acceleration. The map can optionally be generated by means of one or more additional sensors, as described in greater detail elsewhere herein. The map can be two-dimensional or three-dimensional; the display can switch between two-dimensional and three-dimensional map views or show both simultaneously. The user at the user terminal can select a part of the map to specify the target and/or the direction of motion of the movable object. The display can also switch between one or more FPVs and one or more map views, or show them simultaneously; the user can select a target or direction in any of these views, using any of the selection techniques described above.
In some embodiments, the image can be provided in a 3D virtual environment displayed on the user terminal (for example, a virtual reality system or an augmented reality system). The 3D virtual environment can optionally correspond to a 3D map. The virtual environment can include multiple points or objects that the user can manipulate through a variety of actions, for example selecting one or more points or objects, dragging and dropping, translating, rotating, spinning, pushing, pulling, zooming in, or zooming out; any kind of movement of these points or objects in the virtual three-dimensional space is contemplated. The user at the user terminal can manipulate these points or objects in the virtual environment to control the flight path of the UAV and/or its motion characteristics.
The control device 140 can communicate wirelessly with the UAV 110 to remotely control it. The control device can be, for example, a remote controller, or a terminal device installed with an application (APP) for controlling the UAV, such as one or more of a remote controller, laptop computer, smartphone, tablet computer, ground control station, smartwatch, or smart band. Since such a terminal device has a touch screen, the user can issue flight control instructions or imaging instructions to the UAV through the touch screen. In embodiments of the present invention, receiving the user's input through the control device can mean manipulating the UAV through input devices such as a wheel, buttons, keys, or joysticks on a remote controller, or through a user interface (UI) presented on a user terminal.
It should be understood that the above names for the components of the unmanned flight system are for identification only and should not be construed as limiting the embodiments of the present invention.
The executing subject of the flight control method of the present invention can be the UAV in the unmanned flight system, or the ground control device in the unmanned flight system; there is no restriction here.
Fig. 2 is a flowchart of a flight control method provided by an embodiment of the present invention. As shown in Fig. 2, the method of this embodiment may include the following steps.
S201: determining a first target according to a specified position in an image.
In this embodiment, the image may be, for example, an image shown in an interactive interface, and the specified position can be determined through an operation on that interface. For example, when the image shows an obstacle surface such as the ground or a ceiling, and the user wants the UAV to fly in pointing mode toward a certain point on the ground or ceiling, the user performs a touch operation on the corresponding position on the ground or ceiling in the image through the interactive interface; this embodiment then takes the position of the touch operation as the specified position.
The specified position can be obtained based on selected points in one or more images captured by the imaging device on the UAV at its current position. When the user selects one or more points in the displayed image, at least a part of the specified position is selected in the images; in some cases, selecting the one or more points selects the entire specified position.
The selected point in the one or more images can be associated with a set of image coordinates, while the target can be located at a second target position associated with a set of world coordinates. A transformation from the image coordinates to the world coordinates can be generated, and a direction vector from the current position to the second target position can be computed based on the transformation. A flight path for the UAV can then be generated based on the direction vector.
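The image-to-world step described above can be illustrated with a pinhole camera model: a tapped pixel is lifted to a ray in the camera frame and rotated into the world frame, yielding the direction vector. This is a hedged sketch under assumed intrinsics (fx, fy, cx, cy) and an identity rotation, not the patent's calibration; a real system would intersect the ray with the ground plane or use depth to obtain the second target position.

```python
import math

def pixel_to_direction(u, v, fx, fy, cx, cy, R_cam_to_world):
    """Convert a tapped pixel (u, v) into a unit direction vector in the
    world frame, given pinhole intrinsics and a camera-to-world rotation
    matrix (3x3 nested list)."""
    # Ray in the camera frame (z along the optical axis).
    ray = [(u - cx) / fx, (v - cy) / fy, 1.0]
    # Rotate the ray into the world frame.
    world = [sum(R_cam_to_world[i][j] * ray[j] for j in range(3))
             for i in range(3)]
    n = math.sqrt(sum(c * c for c in world))
    return [c / n for c in world]

# Example: identity rotation (camera axes aligned with world axes);
# tapping exactly at the principal point gives a ray along the optical axis.
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
d = pixel_to_direction(320, 240, fx=600, fy=600, cx=320, cy=240,
                       R_cam_to_world=R)
```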
In some embodiments, a selected point in an initialization image can be received from the user; the initialization image can be one of the one or more images. Multiple object candidates can be provided for the user to select, each referenced by a bounding box. When the user selects the bounding box associated with a candidate, that candidate can be received as the target.
In some embodiments, a projective transformation of the first target in the one or more images can be obtained based on the state information of the imaging device, which can in turn be determined from the position and attitude information of the UAV and the attitude information of the imaging device.
Determining the first target according to the specified position in the image can specifically mean determining the position of the first target in the real world (that is, its world coordinates), or determining the orientation of the first target relative to the UAV in the real world.
When the position of the first target in the real world is determined, a single imaging device or multiple imaging devices can be used.
When a single imaging device is used, the first target can be determined by triangulation. First, the imaging device can be translated (by moving the movable object) in landscape orientation relative to the target, perpendicular to the direction from the imaging device to the first target. During this lateral translation, the imaging device can capture multiple images. These images can be supplied to an image analyzer, which computes the distance from the first target to the movable object based on (1) the change of the first target across the images and (2) the distance traveled by the movable object during the translation. The distance covered during the translation can be recorded by the IMU in the imaging device and/or the movable object. Alternatively, it can be obtained from one or more global navigation satellite systems (GNSS): a GNSS receiver in the imaging device and/or the movable object can determine estimated position, velocity, and precise time (PVT) by processing the signals broadcast by the satellites, and the PVT information can be used to compute the distance covered during the translation.
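Under a pinhole model, the lateral-translation triangulation described above reduces to the standard depth-from-baseline relation: the target's distance equals the baseline traveled, times the focal length in pixels, divided by the target's pixel shift between the images. A minimal sketch with assumed values:

```python
def depth_from_translation(baseline_m, focal_px, disparity_px):
    """Estimate the distance to the target from a lateral translation.

    baseline_m   -- distance traveled sideways (from the IMU or GNSS PVT)
    focal_px     -- camera focal length, in pixels
    disparity_px -- shift of the target between the two images, in pixels
    """
    if disparity_px <= 0:
        raise ValueError("target must shift between the two images")
    return baseline_m * focal_px / disparity_px

# A 2 m sideways move and a 60 px target shift, with a 600 px focal
# length, place the target at 20 m.
distance = depth_from_translation(2.0, 600.0, 60.0)
```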
The IMU can be an electronic device that measures and reports the velocity, orientation, and gravitational forces of the UAV using a combination of multiple accelerometers and multiple gyroscopes; a magnetometer can optionally be included. In the IMU, one or more accelerometers are used to detect the current rate of acceleration, and one or more gyroscopes are used to detect changes in rotational attributes such as pitch, roll, and yaw. A magnetometer can be included to assist in calibrating against orientation drift.
In some embodiments, a single imaging device, namely a time-of-flight (TOF) camera, can be used to determine the first target; in these embodiments, the first target can be determined without moving the TOF camera. A time-of-flight camera is a range-imaging camera system that resolves distance based on the known speed of light, by measuring the time of flight of a light signal between the camera and the subject for each point of the image. In some cases, tracking accuracy can be improved by using a TOF camera.
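The TOF ranging principle mentioned here is simply distance = speed of light × round-trip time / 2. A minimal sketch of that relation:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_time_s):
    """Distance to a point from the measured round-trip time of the
    light signal (half the path, since the light travels out and back)."""
    return C * round_trip_time_s / 2.0

# A round trip of about 66.7 ns corresponds to a point roughly 10 m away.
d = tof_distance(66.7e-9)
```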
In some other embodiments, multiple imaging devices can be used to determine the first target. Fig. 3 shows an example. A first imaging device 304 and a second imaging device 306 can be provided, arranged at different locations. For example, the first imaging device can be a payload carried by the movable object 302, while the second imaging device can be located on or within the movable object. In some embodiments, the first imaging device can be a camera and the second imaging device a binocular vision sensor; in other embodiments, the two imaging devices can be parts of the same binocular camera. A first IMU can be arranged in the payload, for example on the first imaging device itself or on the carrier connecting the payload to the movable object. A second IMU can be located in the body of the movable object. The first and second imaging devices can have different optical axes; for example, the first imaging device can have a first optical axis 305 and the second imaging device a second optical axis 307. The two imaging devices can belong to different inertial reference frames that move independently of one another, or to the same inertial reference frame. The first imaging device is configured to capture an image 310, which is displayed on the output device of the user terminal. The second imaging device is configured to capture a binocular image 314 comprising a left-eye image 314-1 and a right-eye image 314-2. As shown in Fig. 3, both imaging devices can capture multiple images of a target 308. However, because the two devices are at different positions, the position of the first target differs between the captured images. For example, in Fig. 3, the target's position 308' in image 310 is in the bottom-right corner of that image, whereas its position 308-1' in the left-eye image 314-1 and its position 308-2' in the right-eye image 314-2 are in the left halves of the corresponding left-eye and right-eye images. Positions 308-1' and 308-2' may also differ slightly from each other due to binocular vision.
The position difference between the first imaging device and the second imaging device can be determined based on real-time position information obtained from the first IMU and the second IMU. The real-time position information from the first IMU can indicate the actual position of the first imaging device, because the first IMU is mounted in the payload. Similarly, the real-time position information from the second IMU can indicate the actual position of the second imaging device, because the second IMU is mounted on the body of the movable object at the second imaging device. In some cases, the flight controller can adjust the attitude of the movable object and/or the payload based on the calculated position difference. An image analyzer is configured to use the calculated position difference to associate the images obtained by the second imaging device with the images obtained by the first imaging device. The first target can then be determined based on the association between the images of the first and second imaging devices and on the position difference of the first and second imaging devices at different moments.
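The binocular images mentioned above can yield a target depth via the standard stereo disparity relation. The following is a minimal illustrative sketch, not part of the claimed method; it assumes a rectified camera pair, and the focal length, baseline, and pixel coordinates are made-up values:

```python
def depth_from_disparity(f_px, baseline_m, x_left_px, x_right_px):
    """Depth of a point seen by a rectified stereo pair.

    f_px: focal length in pixels; baseline_m: distance between the two
    cameras in meters; x_left_px / x_right_px: horizontal pixel
    coordinate of the same target in the left and right images.
    """
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("target must appear further left in the left image")
    return f_px * baseline_m / disparity

# A target at x=420 px (left) and x=400 px (right), with a 700 px focal
# length and a 0.12 m baseline, lies 700 * 0.12 / 20 ≈ 4.2 m away.
print(depth_from_disparity(700, 0.12, 420, 400))  # ≈ 4.2
```

The slight difference between positions 308-1' and 308-2' is exactly this disparity; the smaller it is, the farther the target.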
In some embodiments, the physical location of the first target need not be known. Tracking can be based primarily on the size and/or position of the first target in the image. For example, the movable object may be configured to move toward the target until the size of the first target in the image reaches a predetermined threshold. Alternatively, the imaging device of the movable object can zoom in on the first target with its lens, without the movable object moving, until the size of the first target in the image reaches the predetermined threshold. Optionally, the imaging device can zoom in with its lens while the movable object simultaneously moves toward the target, until the size of the target in the image reaches the predetermined threshold. In some embodiments, the physical location of the first target can be known. The size of the first target comprises a characteristic length of the first target within the image. The characteristic length of the first target can be based on the most significant dimension of the first target in the image. The most significant dimension of the target can be represented by the length, width, height, thickness, arc, and/or circumference of a significant portion of the first target. The predetermined threshold can be defined based on the width of the image. In some embodiments, the movable object may be configured to move toward the first target, and/or the imaging device may be actuated, until the first target is displayed within a target region of the image. The target region can be the central portion of the image or any other portion of the image. The actuation of the imaging device with n degrees of freedom can be achieved using a carrier (for example, a gimbal).
The movable object may be configured to move from the first position to the second position along a path. For many real-world applications, knowing only the positions of the first target and the movable object may not be enough for real-time tracking. For example, the surrounding environment may include obstacles in the path between the movable object and the first target. These obstacles may be stationary, capable of moving, or in motion. Information about the external environment is therefore necessary for the movable object to evade such obstacles by re-planning the path in real time. In some embodiments, information about the external environment may be provided in a 3D map built from one or more images captured by one or more imaging devices. The flight path of the movable object can be generated using the 3D map.
S202: when the angle between the horizontal plane and the line connecting the first target and the current location of the UAV is greater than a first preset angle, determining the flight mode of the UAV according to the size of the angle.
S203: controlling the UAV to fly to a second target according to the determined flight mode, wherein the distance between the second target and the first target is not less than a preset distance.
In this embodiment, after the first target is determined, the angle between the horizontal plane and the line connecting the first target and the current location of the UAV can be obtained. Fig. 4 is a schematic diagram of this angle according to an embodiment of the present invention. As shown in Fig. 4, the pointed position (i.e., the first target) is located on an obstacle surface; the obstacle surface can be the ground below the UAV, or it can be a ceiling above the UAV, and this embodiment is not limited in this respect. This embodiment judges whether the angle is greater than the first preset angle. In the prior art, in order to guarantee the flight safety of the UAV, an obstacle avoidance range is preset in front of the UAV: only when the pointed position is located within the avoidance range does the UAV fly toward the pointed position according to the pointing flight mode, and when the pointed position is beyond this avoidance range, the UAV cannot fly toward the pointed position. In some embodiments, the first preset angle is determined according to the obstacle avoidance range of the UAV, so that when the angle between the horizontal plane and the line connecting the first target and the current location of the UAV is greater than the first preset angle, the first target does not belong to the obstacle avoidance range of the UAV.
Therefore, when it is determined that the above angle is greater than the first preset angle, the flight mode of the UAV is determined according to the size of the angle, and the UAV is controlled to fly to the second target according to the determined flight mode, where the distance between the second target and the first target is not less than the preset distance.
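The decision in S202 can be sketched as follows. This is an illustrative outline only, not the claimed implementation; the 30° and 60° thresholds and the mode names are assumptions, since the patent leaves the preset angles open:

```python
import math

def elevation_angle_deg(uav_pos, target_pos):
    """Angle (degrees) between the horizontal plane and the line
    connecting the UAV's current location to the first target.
    Positions are (x, y, z) tuples with z as altitude."""
    dx = target_pos[0] - uav_pos[0]
    dy = target_pos[1] - uav_pos[1]
    dz = target_pos[2] - uav_pos[2]
    return math.degrees(math.atan2(abs(dz), math.hypot(dx, dy)))

def select_flight_mode(angle_deg, first_preset=30.0, second_preset=60.0):
    """Map the angle to a flight mode; thresholds are illustrative."""
    if angle_deg <= first_preset:
        return "pointing"            # within the obstacle avoidance range
    if angle_deg < second_preset:
        return "descend_then_level"  # via the first horizontal plane
    return "level_to_overhead"       # fly level to the point above the target

# Target 10 m ahead of and 10 m below the UAV: the angle is 45 degrees,
# which falls between the two preset angles.
angle = elevation_angle_deg((0.0, 0.0, 10.0), (10.0, 0.0, 0.0))
print(select_flight_mode(angle))  # → descend_then_level
```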
For example, one approach is to compute the coordinates of the second target, generate a path from the current location to the second target according to the coordinates of the current location and of the second target, and then control the UAV to fly along this path to the second target. For instance, after the pointed position in the image is determined, the geographic coordinates of the pointed position can be computed according to the geographic environment (three-dimensional environment) in the image. Alternatively, a direction vector of the pointed position in the image is obtained, the intersection between the direction vector and the obstacle surface in the image (such as the ground or a ceiling) is determined, and the geographic coordinates of the intersection are taken as the geographic coordinates of the designated position.
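The direction-vector alternative above is a ray–plane intersection. A minimal sketch under simplifying assumptions (horizontal obstacle surface, local Cartesian coordinates; the numbers are invented):

```python
def ray_ground_intersection(origin, direction, surface_z=0.0):
    """Intersect a pointing ray from the UAV with a horizontal obstacle
    surface (e.g. the ground at z = surface_z).  origin is the camera
    position (x, y, z); direction is the pointing vector."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz == 0:
        return None  # ray parallel to the surface: no intersection
    t = (surface_z - oz) / dz
    if t <= 0:
        return None  # surface lies behind the camera
    return (ox + t * dx, oy + t * dy, surface_z)

# UAV at 50 m altitude pointing 45 degrees downward along +x:
print(ray_ground_intersection((0, 0, 50), (1, 0, -1)))  # → (50.0, 0.0, 0.0)
```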
In another example, one approach is as follows: a target direction for the UAV to move in is determined based on the designated position in the image; the UAV flies along the target direction and changes its flight direction when it reaches the preset distance from the obstacle surface (i.e., the plane where the first target is located), until it reaches the second target, which can be a target above the first target. In this way the UAV ultimately flies to a second target that is no closer to the first target than the preset distance, so the UAV will not easily strike an obstacle, which ensures flight safety.
In some embodiments, the target direction of the UAV can be dynamically adjusted so that the UAV evades one or more obstacles in the target direction. The attitude of the imaging device and/or the UAV can be adjusted so as to keep the first target within the field of view of the imaging device while the UAV evades the one or more obstacles. For example, the yaw angle and translational movement of the UAV can be controlled to keep the first target within the field of view.
In some embodiments, when the target (which can be the first target or the second target) is no longer present in the one or more images and/or in the field of view of the imaging device, it can be determined that a failure to fly toward the target has occurred. In such situations, the position and attitude of the movable object and/or the attitude of the imaging device can be adjusted to recapture the target in one or more subsequent images. The one or more subsequent images can be analyzed to detect the target, and once the target is detected, flight toward it can resume.
In some embodiments, the distance and/or speed of the target relative to the UAV can be obtained, and flight toward the target can be based on that relative distance and/or speed.
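One common way to base the approach on relative distance and speed is a capped proportional command. The patent does not specify a control law, so the following is only a plausible sketch with invented gains and limits:

```python
def approach_speed(distance_m, target_speed_mps, max_speed=10.0, gain=0.5):
    """Commanded approach speed toward the target: proportional to the
    remaining distance, plus the target's own speed so a moving target
    is not lost, capped at the UAV's speed limit.  All values invented."""
    return min(max_speed, gain * distance_m + target_speed_mps)

# 12 m from a target walking at 1 m/s: 0.5 * 12 + 1 = 7 m/s.
print(approach_speed(12.0, 1.0))  # → 7.0
# Far away, the cap dominates:
print(approach_speed(100.0, 0.0))  # → 10.0
```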
In some embodiments, the flight path of the UAV can be an optimized route between the current location (associated with the UAV) and the target (associated with the first target or the second target). The path can be optimized based on one or more parameters, including flight distance, flight time, energy consumption, altitude, weather effects including wind direction and wind speed, and/or tracking of the target (for example, the speed and direction of the target). The path can also be optimized so that the UAV evades one or more obstacles between the current location and the target. The path may comprise multiple straight lines and/or multiple curves.
For example, the path may be configured to minimize the energy consumption of the UAV as it moves from the current location to the target. The path can be configured to minimize the influence of weather on the movement of the UAV, and can be optimized based on wind speed and wind direction; for instance, it can be configured to reduce movement of the UAV against a headwind. The path may be configured to account for changes in altitude and pressure as the UAV moves toward the target. The path can be configured based on the surrounding landscape between the current location and the second target. For example, the path may be configured to take into account man-made structures and natural terrain present in the surrounding landscape, such as by passing around, above, or below obstacles like man-made structures and natural terrain in the path between the current location and the second target.
In some embodiments, a 3D model of the surrounding landscape can be obtained based on the following: (1) one or more images captured by one or more imaging devices on the UAV, and (2) a topographic map obtained from global positioning system (GPS) data. The GPS data can be provided from a server to the user terminal controlling the UAV. The path may be configured so that, as the UAV moves from the current location to the target, a point of interest is kept within the field of view of the imaging device of the UAV, where the point of interest can be the target and/or another object.
When it is determined that the above angle is less than or equal to the first preset angle, the first target belongs to the obstacle avoidance range of the UAV; according to the prior-art scheme, the flight mode of the UAV can be determined to be the pointing flight mode, and the UAV flies toward the first target according to the pointing flight mode.
In summary, with the above scheme of the embodiment of the present invention, even when the angle between the horizontal plane and the line connecting the first target and the current location of the UAV is greater than the first preset angle, this embodiment can still control the UAV to fly toward the second target, so that the UAV flies to a second target at the preset distance from the first target. In this way the UAV will not easily strike an obstacle, which ensures flight safety while, compared with the prior art, expanding the range of target positions to which the UAV can fly by pointing.
When the UAV flies to the second target in the following way — a target direction for the UAV to move in is determined based on the designated position in the image, and the UAV flies along the target direction, changing its flight direction when it reaches, or approaches, the preset distance from the obstacle surface (i.e., the plane where the first target is located), until it reaches the second target — there are many possible flight paths for the UAV. Examples are given below.
Optionally, the second target is located on a first horizontal plane, the first horizontal plane being the horizontal plane at the preset distance from the first target. Correspondingly, one implementation of determining the flight mode of the UAV according to the size of the angle is: when the angle is greater than the first preset angle (as shown in Fig. 5), determining the flight mode of the UAV to be: flying from the current location to the first horizontal plane, and then flying along the first horizontal plane to the second target, for example as shown in Fig. 6. Correspondingly, one feasible implementation of the above S203 is: controlling the UAV to fly from the current location toward the first horizontal plane; when the UAV reaches the first horizontal plane, the vertical speed of the UAV is reduced to 0; then controlling the UAV to fly along the first horizontal plane to the second target; when the UAV reaches the second target, the horizontal speed of the UAV is also reduced to 0.
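The two-phase flight mode above (descend to the first horizontal plane, then fly level) can be expressed as a short waypoint plan. A minimal sketch, assuming the first target is on the ground below the UAV so the first horizontal plane lies the preset distance above it; coordinates and values are illustrative:

```python
def plan_descend_then_level(current, first_target, preset_distance):
    """Waypoints for: fly from the current location to the first
    horizontal plane, then along that plane to the second target
    directly above the first target.  Positions are (x, y, z)."""
    cx, cy, cz = current
    tx, ty, tz = first_target
    plane_z = tz + preset_distance     # first horizontal plane
    entry = (cx, cy, plane_z)          # vertical speed reaches 0 here
    second_target = (tx, ty, plane_z)  # horizontal speed reaches 0 here
    return [entry, second_target]

# UAV at 100 m over the origin, target on the ground at (30, 40),
# preset distance 10 m: descend to z=10, then fly level to (30, 40, 10).
print(plan_descend_then_level((0, 0, 100), (30, 40, 0), 10))
# → [(0, 0, 10), (30, 40, 10)]
```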
Optionally, the second target is located on the first horizontal plane, the first horizontal plane being the horizontal plane at the preset distance from the first target. Correspondingly, one implementation of determining the flight mode of the UAV according to the size of the angle is: when the angle is greater than the first preset angle (as shown in Fig. 5), determining the flight mode of the UAV to be: flying from the current location in the direction of the first target to a first position, the first position being located on the side of the first horizontal plane facing away from the first target, and flying from the first position to the second target along an arc trajectory, for example as shown in Fig. 7. When flying in the direction of the first target, the UAV may fly along the line connecting the current location and the first target, or it may not fly along that line; as long as the flight brings the UAV closer to the first target, it belongs to the scheme of this embodiment of the present invention. Correspondingly, one feasible implementation of the above S203 is: controlling the UAV to fly from the current location in the direction of the first target to the first position, where the vertical distance between the first position and the first target is greater than the vertical distance between the first horizontal plane and the first target, and then controlling the UAV to fly from the first position along an arc trajectory toward the second target until it reaches the second target.
Optionally, the second target is located on the first horizontal plane, the first horizontal plane being the horizontal plane at the preset distance from the first target. Correspondingly, one implementation of determining the flight mode of the UAV according to the size of the angle is: when the angle is greater than the first preset angle and less than a second preset angle, the second preset angle being greater than the first preset angle (as shown in Fig. 8), determining the flight mode of the UAV to be: flying from the current location to the first horizontal plane, and then flying along the first horizontal plane to the second target, for example as shown in Fig. 6.
Optionally, the second target is located on the first horizontal plane, the first horizontal plane being the horizontal plane at the preset distance from the first target. Correspondingly, one implementation of determining the flight mode of the UAV according to the size of the angle is: when the angle is greater than the first preset angle and less than the second preset angle, the second preset angle being greater than the first preset angle (as shown in Fig. 8), determining the flight mode of the UAV to be: flying from the current location in the direction of the first target to a first position, the first position being located on the side of the first horizontal plane facing away from the first target, and flying from the first position to the second target along an arc trajectory, for example as shown in Fig. 7. When flying in the direction of the first target, the UAV may fly along the line connecting the current location and the first target, or it may not fly along that line; as long as the flight brings the UAV closer to the first target, it belongs to the scheme of this embodiment of the present invention.
Optionally, when the determined flight mode of the UAV is to fly from the current location to the first horizontal plane and then along the first horizontal plane to the second target, the flying from the current location to the first horizontal plane comprises: flying from the current location to a second position on the first horizontal plane, the second position being the intersection of the first horizontal plane with the line between the first target and the current location. For example, as shown in Fig. 9, the UAV flies toward the first horizontal plane along the direction of the line between the current location and the second target; the position at which it reaches the first horizontal plane is the second position, i.e., the intersection of that line with the first horizontal plane. When the UAV is controlled to fly to the second position on the first horizontal plane, the vertical speed of the UAV is reduced to 0.
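The second position described above is a line–plane intersection and can be computed by linear interpolation along the line. An illustrative sketch, not the claimed implementation; coordinates are invented:

```python
def second_position(current, first_target, plane_z):
    """Point where the line from the current location to the first
    target crosses the first horizontal plane at height plane_z."""
    cx, cy, cz = current
    tx, ty, tz = first_target
    if cz == tz:
        return None  # line is parallel to (or lies in) the plane
    s = (plane_z - cz) / (tz - cz)  # fraction of the way to the target
    if not 0.0 <= s <= 1.0:
        return None  # the plane is not between the two points
    return (cx + s * (tx - cx), cy + s * (ty - cy), plane_z)

# UAV at 100 m, target on the ground 90 m away horizontally, first
# horizontal plane at 10 m: the crossing is 90 % of the way along.
print(second_position((0, 0, 100), (90, 0, 0), 10))  # ≈ (81.0, 0.0, 10)
```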
Optionally, when the determined flight mode of the UAV is to fly from the current location to the first horizontal plane and then along the first horizontal plane to the second target, the flying from the current location to the first horizontal plane comprises: flying from the current location in the direction of the first target to a third position, the third position being located on the side of the first horizontal plane facing away from the first target; and flying from the third position to the first horizontal plane along an arc trajectory. For example, Fig. 10 shows the UAV flying toward the first horizontal plane along the direction of the line between the current location and the first target until it reaches the third position; however, this embodiment is not limited to flying along the direction of the line between the current location and the first target, and the UAV can also, for instance, fly toward the first horizontal plane along the direction of the line between the current location and the second target. The vertical distance between the third position and the first target is greater than the vertical distance between the first horizontal plane and the first target. The UAV then flies along an arc trajectory to the first horizontal plane, and when the UAV is controlled to fly to the second position on the first horizontal plane, its vertical speed is reduced to 0.
Optionally, one implementation of determining the flight mode of the UAV according to the size of the angle is: when the angle is greater than or equal to the second preset angle (as shown in Fig. 11), the second preset angle being greater than the first preset angle, determining the flight mode of the UAV to be: flying from the current location to the second target along the horizontal plane where the current location lies, the second target and the current location being located on the same horizontal plane and the line between the second target and the first target being perpendicular to the horizontal plane, for example as shown in Fig. 12. Correspondingly, one feasible implementation of the above S203 is: controlling the UAV (whose vertical speed is reduced to 0 at this time) to fly from the current location along the horizontal plane where the current location lies to the second target (where the horizontal speed of the UAV is reduced to 0); the second target is located in the vertical direction of the first target, and the vertical distance between the second target and the first target is equal to the vertical distance between the current location and the first target.
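This steep-angle mode reduces to taking the point on the UAV's current horizontal plane that lies directly above (or below) the first target. A one-line illustrative sketch with invented coordinates:

```python
def overhead_second_target(current, first_target):
    """Second target for the steep-angle mode: the UAV keeps its
    altitude and flies level to the point of its current horizontal
    plane on the vertical line through the first target."""
    cx, cy, cz = current
    tx, ty, _tz = first_target
    return (tx, ty, cz)

# UAV at 80 m altitude, target on the ground at (30, 40): the UAV flies
# level to (30, 40, 80), which is 80 m directly above the target.
print(overhead_second_target((0, 0, 80), (30, 40, 0)))  # → (30, 40, 80)
```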
In some embodiments, the method is executed by a ground control device, and determining the designated position in the image as the first target comprises: obtaining a frame-selection operation through an interactive interface; and when the object framed in the image by the frame-selection operation does not belong to a preset category, determining the position framed by the frame-selection operation to be the first target. For example, as shown in Fig. 13, the image photographed by the shooting device of the UAV is displayed through the interactive interface. When the UAV is to be controlled to fly according to an object selected by the user, the user can perform a frame-selection operation on the object through the interactive interface. Correspondingly, the ground control device of this embodiment obtains the frame-selection operation through the interactive interface, obtains the object framed in the image, and then judges whether the object in the image belongs to a preset category (such as a person, an automobile, etc.). When the object in the image does not belong to the preset category, the position of the framed object in the image (i.e., the designated position) is determined to be the first target; then, when the angle between the horizontal plane and the line connecting the first target and the current location of the UAV is greater than the first preset angle, the scheme shown in the above S202 and S203 is executed, and when that angle is less than or equal to the first preset angle, the UAV is controlled to fly according to the pointing flight mode. When the object in the image is a preset follow target, the UAV is controlled to fly according to the tracking flight mode.
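The ground-station branching above can be summarized in a few lines. This is an illustrative sketch only; the category names and return values are assumptions, and the actual classifier and flight modes are outside its scope:

```python
def interpret_frame_selection(framed_object_class, framed_position,
                              preset_categories=("person", "car")):
    """Frame-selection dispatch: objects of a preset category become
    follow targets; anything else marks a pointed position that is
    treated as the first target.  Category names are illustrative."""
    if framed_object_class in preset_categories:
        return ("follow", framed_object_class)
    return ("point", framed_position)

# Framing a tree yields a pointed position (the first target):
print(interpret_frame_selection("tree", (120, 340)))  # → ('point', (120, 340))
# Framing a car yields a follow target:
print(interpret_frame_selection("car", (50, 60)))     # → ('follow', 'car')
```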
Optionally, when the object framed in the image by the frame-selection operation belongs to the preset category, the object is determined to be a follow target; in accordance with the object being a follow target, the UAV is controlled to follow the object in flight. For the implementation of how to follow an object in flight, reference may be made to related descriptions in the prior art, and details are not described herein again.
In some embodiments, the method is executed by a ground control device, and the method further comprises: displaying a preset icon at the pointed position in the image; and, after the flight mode of the UAV is determined according to the size of the angle, moving the preset icon displayed at the pointed position in the image to the position in the image corresponding to the second target. In this embodiment, after the designated position in the image is determined as the first target, the preset icon is displayed at the designated position in the image, as shown in Fig. 14, to indicate to the user that the position in the image has been designated successfully. When the angle between the horizontal plane and the line connecting the first target and the current location of the UAV is greater than the first preset angle, the flight mode of the UAV is determined according to the size of the angle; after the second target is determined according to the flight mode of the UAV, the preset icon displayed at the designated position in the image is moved from the designated position to the position in the image corresponding to the second target, to indicate that the UAV will fly to the second target. As shown in Fig. 15, this indicates, when the designated position is at the first target, that the UAV is controlled to fly to the second target, so as to avoid striking an obstacle and guarantee flight safety.
Fig. 16 is a structural schematic diagram of the flight control apparatus provided by Embodiment 1 of the present invention. As shown in Fig. 16, the flight control apparatus 400 of this embodiment may comprise: a target determining module 401, a flight mode determining module 402, and a control module 403.
The target determining module 401 is configured to determine the first target according to the designated position in the image.
The flight mode determining module 402 is configured to determine the flight mode of the UAV according to the size of the angle when the angle between the horizontal plane and the line connecting the first target and the current location of the UAV is greater than the first preset angle.
The control module 403 is configured to control the UAV to fly to the second target according to the determined flight mode, wherein the distance between the second target and the first target is not less than the preset distance.
Optionally, the flight mode determining module 402 is specifically configured to: when the angle is greater than the first preset angle, determine the flight mode of the UAV to be: flying from the current location to the first horizontal plane, and then flying along the first horizontal plane to the second target; or determine the flight mode of the UAV to be: flying from the current location in the direction of the first target to a first position, the first position being located on the side of the first horizontal plane facing away from the first target, and flying from the first position to the second target along an arc trajectory.
The second target is located on the first horizontal plane, the first horizontal plane being the horizontal plane at the preset distance from the first target.
Optionally, the flight mode determining module 402 is specifically configured to: when the angle is greater than the first preset angle and less than the second preset angle, the second preset angle being greater than the first preset angle, determine the flight mode of the UAV to be: flying from the current location to the first horizontal plane, and then flying along the first horizontal plane to the second target; or determine the flight mode of the UAV to be: flying from the current location in the direction of the first target to a first position, the first position being located on the side of the first horizontal plane facing away from the first target, and flying from the first position to the second target along an arc trajectory.
The second target is located on the first horizontal plane, the first horizontal plane being the horizontal plane at the preset distance from the first target.
Optionally, when the flight mode determining module 402 determines that the flight mode of the UAV is to fly from the current location to the first horizontal plane and then along the first horizontal plane to the second target, the flying from the current location to the first horizontal plane comprises: flying from the current location to a second position on the first horizontal plane, the second position being the intersection of the first horizontal plane with the line between the first target and the current location.
Optionally, when the flight mode determining module 402 determines that the flight mode of the UAV is to fly from the current location to the first horizontal plane and then along the first horizontal plane to the second target, the flying from the current location to the first horizontal plane comprises: flying from the current location in the direction of the first target to a third position, the third position being located on the side of the first horizontal plane facing away from the first target; and flying from the third position to the first horizontal plane along an arc trajectory.
Optionally, the flight mode determining module 402 is specifically configured to: when the angle is not less than the second preset angle, determine the flight mode of the UAV to be: flying from the current location to the second target along the horizontal plane where the current location lies, the second target and the current location being located on the same horizontal plane and the line between the second target and the first target being perpendicular to the horizontal plane.
Optionally, the target determining module 401 is specifically configured to: obtain a frame-selection operation through an interactive interface; and when the object framed in the image by the frame-selection operation does not belong to the preset category, determine the position framed by the frame-selection operation to be the first target.
Optionally, the target determining module 401 is further configured to determine, when the object framed in the image by the frame-selection operation belongs to the preset category, that the object is a follow target.
The control module 403 is further configured to control the UAV to follow the object in flight in accordance with the object being a follow target.
Optionally, the flight control apparatus 400 of this embodiment further comprises a display module 404.
The display module 404 is configured to display a preset icon at the designated position in the image, and, after the flight mode determining module 402 determines the flight mode of the UAV according to the size of the angle, to move the preset icon displayed at the pointed position in the image to the position in the image corresponding to the second target.
The apparatus of this embodiment can be used to execute the technical solutions of the above method embodiments of the present invention; the implementation principles and technical effects are similar, and details are not described herein again.
Fig. 17 is a structural schematic diagram of the flight control apparatus provided by Embodiment 2 of the present invention. As shown in Fig. 17, the flight control apparatus 500 of this embodiment may comprise: a memory 501 and a processor 502. The memory 501 is connected to the processor 502 by a bus.
The above processor 502 can be a central processing unit (CPU); the processor can also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. The general-purpose processor can be a microprocessor, or the processor can be any conventional processor, etc.
The memory 501 is configured to store code for executing the flight control method.
The processor 502 is configured to call the code stored in the memory 501 to execute: determining the first target according to the designated position in the image; when the angle between the horizontal plane and the line connecting the first target and the current location of the UAV is greater than the first preset angle, determining the flight mode of the UAV according to the size of the angle; and controlling the UAV to fly to the second target according to the determined flight mode, wherein the distance between the second target and the first target is not less than the preset distance.
Optionally, the processor 502 is specifically configured to: when the angle is greater than the first preset angle, determine the flight mode of the UAV as: flying from the current location to a first horizontal plane, and then flying along the first horizontal plane to the second target; or determine the flight mode of the UAV as: flying from the current location toward the first target to a first position, the first position being located on the side of the first horizontal plane facing away from the first target, and flying from the first position to the second target along an arc trajectory;
Where the second target is located on the first horizontal plane, and the first horizontal plane is the horizontal plane whose distance from the first target is the preset distance.
Optionally, the processor 502 is specifically configured to: when the angle is greater than the first preset angle and less than a second preset angle, the second preset angle being greater than the first preset angle, determine the flight mode of the UAV as: flying from the current location to a first horizontal plane, and then flying along the first horizontal plane to the second target; or determine the flight mode of the UAV as: flying from the current location toward the first target to a first position, the first position being located on the side of the first horizontal plane facing away from the first target, and flying from the first position to the second target along an arc trajectory;
Where the second target is located on the first horizontal plane, and the first horizontal plane is the horizontal plane whose distance from the first target is the preset distance.
Optionally, when the processor 502 determines that the flight mode of the UAV is flying from the current location to the first horizontal plane and then flying along the first horizontal plane to the second target, flying from the current location to the first horizontal plane includes: flying from the current location to a second position on the first horizontal plane, the second position being the intersection of the first horizontal plane and the line between the first target and the current location.
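The second position described here is a plain line-plane intersection. A sketch, under our own assumptions that positions are (x, y, z) tuples with z vertical and that the first horizontal plane lies at the preset distance above the first target (the patent only fixes the distance, not the side):

```python
def second_position(first_target, current, preset_distance):
    """Intersection of the line through the current location and the first
    target with the first horizontal plane, here assumed to lie
    preset_distance above the first target. Names are illustrative."""
    plane_z = first_target[2] + preset_distance
    dz = first_target[2] - current[2]
    if dz == 0:
        raise ValueError("line is horizontal: no unique intersection")
    # Parameter t along the line from the current location to the first target
    t = (plane_z - current[2]) / dz
    return tuple(c + t * (f - c) for c, f in zip(current, first_target))
```

For a UAV hovering above and to the side of the target, t falls between 0 and 1, so the second position lies on the segment between the current location and the first target, as the embodiment describes.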
Optionally, when the processor 502 determines that the flight mode of the UAV is flying from the current location to the first horizontal plane and then flying along the first horizontal plane to the second target, flying from the current location to the first horizontal plane includes: flying from the current location toward the first target to a third position, the third position being located on the side of the first horizontal plane facing away from the first target; and flying from the third position to the first horizontal plane along an arc trajectory.
Optionally, the processor 502 is specifically configured to: when the angle is not less than the second preset angle, determine the flight mode of the UAV as: flying from the current location to the second target along the horizontal plane where the current location lies, the second target and the current location being located in the same horizontal plane, and the line connecting the second target and the first target being perpendicular to the horizontal plane.
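Taken together, the optional branches above amount to a three-way threshold test on the measured angle. A hedged sketch of that dispatch; the mode labels, and the "direct" fallback for angles at or below the first threshold, are our own names for behavior the patent leaves to other embodiments:

```python
def select_flight_mode(angle, first_preset, second_preset):
    """Map the measured angle (degrees) to one of the flight modes
    described above. Assumes first_preset < second_preset."""
    if angle <= first_preset:
        return "direct"  # below the first threshold; outside these branches
    if angle < second_preset:
        # Fly to the first horizontal plane, then along it to the second
        # target (or toward the first target, finishing on an arc trajectory).
        return "via_first_horizontal_plane"
    # At or above the second preset angle: stay in the current horizontal
    # plane; the second target sits on the vertical through the first target.
    return "along_current_plane"
```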
Optionally, the flight control apparatus 500 in the foregoing embodiment may be a UAV, or may be a ground control device.
Optionally, when the flight control apparatus 500 of this embodiment is a ground control device, the flight control apparatus 500 further includes an interactive interface 503, connected to the processor 502 by a bus.
Optionally, the interactive interface 503 is configured to detect a frame-selection operation.
The processor 502 is specifically configured to: obtain the frame-selection operation through the interactive interface 503; and, when the object selected in the image by the frame-selection operation does not belong to a preset kind, determine that the position selected by the frame-selection operation is the first target.
Optionally, the processor 502 is further configured to: when the object selected in the image by the frame-selection operation belongs to the preset kind, determine that the object is a target-following object; and, the object being a target-following object, control the UAV to follow the object in flight.
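The frame-selection logic thus splits on whether the boxed object belongs to a preset kind. A minimal sketch of that branch; the dictionary layout and the set of preset kinds are our assumptions for illustration:

```python
def handle_frame_selection(boxed_object, preset_kinds):
    """Return the action for a frame-selection: follow the object if its
    kind is preset, otherwise treat the boxed position as the first target."""
    if boxed_object["kind"] in preset_kinds:
        return ("follow", boxed_object)
    return ("first_target", boxed_object["position"])
```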
Optionally, the interactive interface 503 is configured to display a preset icon at the designated position in the image; and, after the processor 502 determines the flight mode of the UAV according to the size of the angle, the preset icon displayed at the designated position in the image is moved to the position in the image corresponding to the second target.
The apparatus of this embodiment may be used to execute the technical solutions of the foregoing method embodiments of the present invention. The implementation principles and technical effects are similar and are not described again here.
Figure 18 is a structural schematic diagram of a flight control system for a UAV provided by an embodiment of the present invention. As shown in Figure 18, the flight control system 800 of this embodiment includes a flight control apparatus 600 and a UAV 700. The flight control apparatus 600 may adopt the structure of the apparatus embodiment shown in Figure 16 or Figure 17 and, accordingly, may execute the technical solutions of the foregoing method embodiments of the present invention. The implementation principles and technical effects are similar and are not described again here.
Those of ordinary skill in the art will appreciate that all or some of the steps of the foregoing method embodiments may be implemented by hardware related to program instructions. The foregoing program may be stored in a computer-readable storage medium; when executed, the program performs the steps of the foregoing method embodiments. The foregoing storage medium includes any medium that can store program code, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Finally, it should be noted that the foregoing embodiments are merely intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that they may still modify the technical solutions described in the foregoing embodiments, or make equivalent replacements of some or all of the technical features therein; and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (28)

  1. A flight control method, comprising:
    determining a first target according to a designated position in an image;
    when the angle between a horizontal plane and the line connecting the first target and the current location of an unmanned aerial vehicle (UAV) is greater than a first preset angle, determining a flight mode of the UAV according to the size of the angle; and
    controlling the UAV to fly to a second target according to the determined flight mode, wherein the distance between the second target and the first target is not less than a preset distance.
  2. The method according to claim 1, wherein determining the flight mode of the UAV according to the size of the angle comprises:
    when the angle is greater than the first preset angle, determining the flight mode of the UAV as: flying from the current location to a first horizontal plane, and then flying along the first horizontal plane to the second target;
    or determining the flight mode of the UAV as: flying from the current location toward the first target to a first position, the first position being located on the side of the first horizontal plane facing away from the first target, and flying from the first position to the second target along an arc trajectory;
    wherein the second target is located on the first horizontal plane, and the first horizontal plane is the horizontal plane whose distance from the first target is the preset distance.
  3. The method according to claim 1, wherein determining the flight mode of the UAV according to the size of the angle comprises:
    when the angle is greater than the first preset angle and less than a second preset angle, the second preset angle being greater than the first preset angle,
    determining the flight mode of the UAV as: flying from the current location to a first horizontal plane, and then flying along the first horizontal plane to the second target;
    or determining the flight mode of the UAV as: flying from the current location toward the first target to a first position, the first position being located on the side of the first horizontal plane facing away from the first target, and flying from the first position to the second target along an arc trajectory;
    wherein the second target is located on the first horizontal plane, and the first horizontal plane is the horizontal plane whose distance from the first target is the preset distance.
  4. The method according to claim 2 or 3, wherein, when the flight mode of the UAV is determined as flying from the current location to the first horizontal plane and then flying along the first horizontal plane to the second target, flying from the current location to the first horizontal plane comprises:
    flying from the current location to a second position on the first horizontal plane, the second position being the intersection of the first horizontal plane and the line between the first target and the current location.
  5. The method according to claim 2 or 3, wherein, when the flight mode of the UAV is determined as flying from the current location to the first horizontal plane and then flying along the first horizontal plane to the second target, flying from the current location to the first horizontal plane comprises:
    flying from the current location toward the first target to a third position, the third position being located on the side of the first horizontal plane facing away from the first target; and
    flying from the third position to the first horizontal plane along an arc trajectory.
  6. The method according to claim 3, wherein determining the flight mode of the UAV according to the size of the angle further comprises:
    when the angle is not less than the second preset angle, determining the flight mode of the UAV as: flying from the current location to the second target along the horizontal plane where the current location lies, the second target and the current location being located in the same horizontal plane, and the line connecting the second target and the first target being perpendicular to the horizontal plane.
  7. The method according to any one of claims 1-6, wherein the method is executed by a ground control device, and determining the first target according to the designated position in the image comprises:
    obtaining a frame-selection operation through an interactive interface; and
    when the object selected in the image by the frame-selection operation does not belong to a preset kind, determining that the position selected by the frame-selection operation is the first target.
  8. The method according to claim 7, further comprising:
    when the object selected in the image by the frame-selection operation belongs to the preset kind, determining that the object is a target-following object; and
    the object being a target-following object, controlling the UAV to follow the object in flight.
  9. The method according to any one of claims 1-8, wherein the method is executed by a ground control device, and the method further comprises:
    displaying a preset icon at the designated position in the image; and,
    after determining the flight mode of the UAV according to the size of the angle,
    moving the preset icon displayed at the designated position in the image to the position in the image corresponding to the second target.
  10. A flight control apparatus, comprising:
    a target determining module, configured to determine a first target according to a designated position in an image;
    a flight mode determining module, configured to determine a flight mode of an unmanned aerial vehicle (UAV) according to the size of the angle between a horizontal plane and the line connecting the first target and the current location of the UAV, when that angle is greater than a first preset angle; and
    a control module, configured to control the UAV to fly to a second target according to the determined flight mode, wherein the distance between the second target and the first target is not less than a preset distance.
  11. The apparatus according to claim 10, wherein the flight mode determining module is specifically configured to: when the angle is greater than the first preset angle, determine the flight mode of the UAV as: flying from the current location to a first horizontal plane, and then flying along the first horizontal plane to the second target; or determine the flight mode of the UAV as: flying from the current location toward the first target to a first position, the first position being located on the side of the first horizontal plane facing away from the first target, and flying from the first position to the second target along an arc trajectory;
    wherein the second target is located on the first horizontal plane, and the first horizontal plane is the horizontal plane whose distance from the first target is the preset distance.
  12. The apparatus according to claim 10, wherein the flight mode determining module is specifically configured to: when the angle is greater than the first preset angle and less than a second preset angle, the second preset angle being greater than the first preset angle, determine the flight mode of the UAV as: flying from the current location to a first horizontal plane, and then flying along the first horizontal plane to the second target; or determine the flight mode of the UAV as: flying from the current location toward the first target to a first position, the first position being located on the side of the first horizontal plane facing away from the first target, and flying from the first position to the second target along an arc trajectory;
    wherein the second target is located on the first horizontal plane, and the first horizontal plane is the horizontal plane whose distance from the first target is the preset distance.
  13. The apparatus according to claim 11 or 12, wherein, when the flight mode determining module determines that the flight mode of the UAV is flying from the current location to the first horizontal plane and then flying along the first horizontal plane to the second target, flying from the current location to the first horizontal plane comprises: flying from the current location to a second position on the first horizontal plane, the second position being the intersection of the first horizontal plane and the line between the first target and the current location.
  14. The apparatus according to claim 11 or 12, wherein, when the flight mode determining module determines that the flight mode of the UAV is flying from the current location to the first horizontal plane and then flying along the first horizontal plane to the second target, flying from the current location to the first horizontal plane comprises: flying from the current location toward the first target to a third position, the third position being located on the side of the first horizontal plane facing away from the first target; and flying from the third position to the first horizontal plane along an arc trajectory.
  15. The apparatus according to claim 12, wherein the flight mode determining module is specifically configured to: when the angle is not less than the second preset angle, determine the flight mode of the UAV as: flying from the current location to the second target along the horizontal plane where the current location lies, the second target and the current location being located in the same horizontal plane, and the line connecting the second target and the first target being perpendicular to the horizontal plane.
  16. The apparatus according to any one of claims 10-15, wherein the target determining module is specifically configured to: obtain a frame-selection operation through an interactive interface; and, when the object selected in the image by the frame-selection operation does not belong to a preset kind, determine that the position selected by the frame-selection operation is the first target.
  17. The apparatus according to claim 16, wherein the target determining module is further configured to: when the object selected in the image by the frame-selection operation belongs to the preset kind, determine that the object is a target-following object; and
    the control module is further configured to, the object being a target-following object, control the UAV to follow the object in flight.
  18. The apparatus according to any one of claims 10-17, further comprising:
    a display module, configured to display a preset icon at the designated position in the image and, after the flight mode determining module determines the flight mode of the UAV according to the size of the angle, to move the preset icon displayed at the designated position in the image to the position in the image corresponding to the second target.
  19. A flight control apparatus, comprising: a memory and a processor;
    the memory being configured to store code for executing a flight control method; and
    the processor being configured to call the code stored in the memory to execute the following: determining a first target according to a designated position in an image; when the angle between a horizontal plane and the line connecting the first target and the current location of an unmanned aerial vehicle (UAV) is greater than a first preset angle, determining a flight mode of the UAV according to the size of the angle; and controlling the UAV to fly to a second target according to the determined flight mode, wherein the distance between the second target and the first target is not less than a preset distance.
  20. The apparatus according to claim 19, wherein the processor is specifically configured to: when the angle is greater than the first preset angle, determine the flight mode of the UAV as: flying from the current location to a first horizontal plane, and then flying along the first horizontal plane to the second target; or determine the flight mode of the UAV as: flying from the current location toward the first target to a first position, the first position being located on the side of the first horizontal plane facing away from the first target, and flying from the first position to the second target along an arc trajectory;
    wherein the second target is located on the first horizontal plane, and the first horizontal plane is the horizontal plane whose distance from the first target is the preset distance.
  21. The apparatus according to claim 19, wherein the processor is specifically configured to: when the angle is greater than the first preset angle and less than a second preset angle, the second preset angle being greater than the first preset angle, determine the flight mode of the UAV as: flying from the current location to a first horizontal plane, and then flying along the first horizontal plane to the second target; or determine the flight mode of the UAV as: flying from the current location toward the first target to a first position, the first position being located on the side of the first horizontal plane facing away from the first target, and flying from the first position to the second target along an arc trajectory;
    wherein the second target is located on the first horizontal plane, and the first horizontal plane is the horizontal plane whose distance from the first target is the preset distance.
  22. The apparatus according to claim 20 or 21, wherein, when the processor determines that the flight mode of the UAV is flying from the current location to the first horizontal plane and then flying along the first horizontal plane to the second target, flying from the current location to the first horizontal plane comprises: flying from the current location to a second position on the first horizontal plane, the second position being the intersection of the first horizontal plane and the line between the first target and the current location.
  23. The apparatus according to claim 20 or 21, wherein, when the processor determines that the flight mode of the UAV is flying from the current location to the first horizontal plane and then flying along the first horizontal plane to the second target, flying from the current location to the first horizontal plane comprises: flying from the current location toward the first target to a third position, the third position being located on the side of the first horizontal plane facing away from the first target; and flying from the third position to the first horizontal plane along an arc trajectory.
  24. The apparatus according to claim 21, wherein the processor is specifically configured to: when the angle is not less than the second preset angle, determine the flight mode of the UAV as: flying from the current location to the second target along the horizontal plane where the current location lies, the second target and the current location being located in the same horizontal plane, and the line connecting the second target and the first target being perpendicular to the horizontal plane.
  25. The apparatus according to any one of claims 19-24, wherein the apparatus is a ground control device and further comprises:
    an interactive interface, configured to detect a frame-selection operation;
    the processor being specifically configured to: obtain the frame-selection operation through the interactive interface; and, when the object selected in the image by the frame-selection operation does not belong to a preset kind, determine that the position selected by the frame-selection operation is the first target.
  26. The apparatus according to claim 25, wherein the processor is further configured to: when the object selected in the image by the frame-selection operation belongs to the preset kind, determine that the object is a target-following object; and, the object being a target-following object, control the UAV to follow the object in flight.
  27. The apparatus according to any one of claims 19-26, wherein the apparatus is a ground control device and further comprises: an interactive interface, configured to display a preset icon at the designated position in the image; and, after the processor determines the flight mode of the UAV according to the size of the angle, the preset icon displayed at the designated position in the image is moved to the position in the image corresponding to the second target.
  28. The apparatus according to any one of claims 19-24, wherein
    the apparatus is a UAV;
    or, the apparatus is a ground control device.
CN201680076224.8A 2016-12-22 2016-12-22 Flight control method and device Expired - Fee Related CN108450032B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110169187.8A CN112987782A (en) 2016-12-22 2016-12-22 Flight control method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/111564 WO2018112848A1 (en) 2016-12-22 2016-12-22 Flight control method and apparatus

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202110169187.8A Division CN112987782A (en) 2016-12-22 2016-12-22 Flight control method and device

Publications (2)

Publication Number Publication Date
CN108450032A true CN108450032A (en) 2018-08-24
CN108450032B CN108450032B (en) 2021-03-02

Family

ID=62624251

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202110169187.8A Withdrawn CN112987782A (en) 2016-12-22 2016-12-22 Flight control method and device
CN201680076224.8A Expired - Fee Related CN108450032B (en) 2016-12-22 2016-12-22 Flight control method and device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202110169187.8A Withdrawn CN112987782A (en) 2016-12-22 2016-12-22 Flight control method and device

Country Status (2)

Country Link
CN (2) CN112987782A (en)
WO (1) WO2018112848A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109540834A (en) * 2018-12-13 2019-03-29 深圳市太赫兹科技创新研究院 A kind of cable aging monitoring method and system
CN109947096A (en) * 2019-02-25 2019-06-28 广州极飞科技有限公司 The control method and device of controll plant, Unmanned Systems
CN110673642A (en) * 2019-10-28 2020-01-10 深圳市赛为智能股份有限公司 Unmanned aerial vehicle landing control method and device, computer equipment and storage medium

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113759985A (en) * 2021-08-03 2021-12-07 华南理工大学 Unmanned aerial vehicle flight control method, system, device and storage medium
CN114115351A (en) * 2021-12-06 2022-03-01 歌尔科技有限公司 Obstacle avoidance method for aircraft, aircraft and computer-readable storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105141851A (en) * 2015-09-29 2015-12-09 杨珊珊 Control system and control method for unmanned aerial vehicle and unmanned aerial vehicle
CN105278543A (en) * 2015-09-28 2016-01-27 小米科技有限责任公司 Method and device for increasing flight security, and electronic equipment
CN105334980A (en) * 2007-12-31 2016-02-17 Microsoft International Holdings Private Ltd. 3D pointing system
CN105518555A (en) * 2014-07-30 2016-04-20 深圳市大疆创新科技有限公司 Systems and methods for target tracking
CN105955292A (en) * 2016-05-20 2016-09-21 腾讯科技(深圳)有限公司 Aircraft flight control method and system, mobile terminal and aircraft
CN105955304A (en) * 2016-07-06 2016-09-21 零度智控(北京)智能科技有限公司 Obstacle avoidance method, obstacle avoidance device and unmanned aerial vehicle
CN106022274A (en) * 2016-05-24 2016-10-12 零度智控(北京)智能科技有限公司 Obstacle avoiding method, obstacle avoiding device and unmanned machine
US20160300492A1 (en) * 2014-05-20 2016-10-13 Verizon Patent And Licensing Inc. Utilization of third party networks and third party unmanned aerial vehicle platforms

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI408568B (en) * 2010-06-24 2013-09-11 Hon Hai Prec Ind Co Ltd Handheld device and method for controlling a unmanned aerial vehicle using the handheld device
CN201804119U (en) * 2010-08-19 2011-04-20 中国测绘科学研究院 Aerial photographic navigation control system for airborne global positioning system
CN102707724B (en) * 2012-06-05 2015-01-14 清华大学 Visual localization and obstacle avoidance method and system for unmanned plane
CN102854886B (en) * 2012-08-29 2016-01-20 深圳一电科技有限公司 The method and apparatus of flight line editor and control
IL222053A (en) * 2012-09-23 2016-11-30 Israel Aerospace Ind Ltd System, method and computer program product for maneuvering an air vehicle
CN103019250B (en) * 2012-12-03 2015-01-07 华北电力大学 Bevel take-off control method of inspection flying robot
KR101483058B1 (en) * 2014-01-21 2015-01-15 엘아이지넥스원 주식회사 Ground control system for UAV anticollision
CN105517666B (en) * 2014-09-05 2019-08-27 深圳市大疆创新科技有限公司 Offline mode selection based on scene
CN104991563B (en) * 2015-05-12 2023-10-03 零度智控(北京)智能科技有限公司 Unmanned aerial vehicle hierarchical operation method and system
CN108319288A (en) * 2016-01-26 2018-07-24 深圳市大疆创新科技有限公司 Unmanned plane and its flight control method and system
CN105867400A (en) * 2016-04-20 2016-08-17 北京博瑞爱飞科技发展有限公司 Flying control method and device for unmanned aerial vehicle
CN105955298B (en) * 2016-06-03 2018-09-07 腾讯科技(深圳)有限公司 A kind of automatic obstacle-avoiding method and device of aircraft

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105334980A (en) * 2007-12-31 2016-02-17 Microsoft International Holdings Private Ltd. 3D pointing system
US20160300492A1 (en) * 2014-05-20 2016-10-13 Verizon Patent And Licensing Inc. Utilization of third party networks and third party unmanned aerial vehicle platforms
CN105518555A (en) * 2014-07-30 2016-04-20 深圳市大疆创新科技有限公司 Systems and methods for target tracking
CN105278543A (en) * 2015-09-28 2016-01-27 小米科技有限责任公司 Method and device for increasing flight security, and electronic equipment
CN105141851A (en) * 2015-09-29 2015-12-09 杨珊珊 Control system and control method for unmanned aerial vehicle and unmanned aerial vehicle
CN105955292A (en) * 2016-05-20 2016-09-21 腾讯科技(深圳)有限公司 Aircraft flight control method and system, mobile terminal and aircraft
CN106022274A (en) * 2016-05-24 2016-10-12 零度智控(北京)智能科技有限公司 Obstacle avoiding method, obstacle avoiding device and unmanned machine
CN105955304A (en) * 2016-07-06 2016-09-21 零度智控(北京)智能科技有限公司 Obstacle avoidance method, obstacle avoidance device and unmanned aerial vehicle

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109540834A (en) * 2018-12-13 2019-03-29 深圳市太赫兹科技创新研究院 A kind of cable aging monitoring method and system
CN109947096A (en) * 2019-02-25 2019-06-28 广州极飞科技有限公司 The control method and device of controll plant, Unmanned Systems
CN110673642A (en) * 2019-10-28 2020-01-10 深圳市赛为智能股份有限公司 Unmanned aerial vehicle landing control method and device, computer equipment and storage medium
CN110673642B (en) * 2019-10-28 2022-10-28 深圳市赛为智能股份有限公司 Unmanned aerial vehicle landing control method and device, computer equipment and storage medium

Also Published As

Publication number Publication date
WO2018112848A1 (en) 2018-06-28
CN112987782A (en) 2021-06-18
CN108450032B (en) 2021-03-02

Similar Documents

Publication Publication Date Title
US11861892B2 (en) Object tracking by an unmanned aerial vehicle using visual sensors
US11644832B2 (en) User interaction paradigms for a flying digital assistant
US11635775B2 (en) Systems and methods for UAV interactive instructions and control
US11749124B2 (en) User interaction with an autonomous unmanned aerial vehicle
US20210072745A1 (en) Systems and methods for uav flight control
US11263761B2 (en) Systems and methods for visual target tracking
US10860040B2 (en) Systems and methods for UAV path planning and control
CN112567201B (en) Distance measuring method and device
CN108450032A (en) Flight control method and device
US20210208608A1 (en) Control method, control apparatus, control terminal for unmanned aerial vehicle
KR101155761B1 (en) Method and apparatus for presenting location information on augmented reality
CN111164958A (en) System and method for processing and displaying image data based on pose information

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20210302