CN111930147A - Storage medium, flight control method and device, automatic pilot and unmanned aerial vehicle - Google Patents

Storage medium, flight control method and device, automatic pilot and unmanned aerial vehicle

Info

Publication number
CN111930147A
CN111930147A (application CN202011069171.1A; granted as CN111930147B)
Authority
CN
China
Prior art keywords
aerial vehicle
unmanned aerial
current position
circle
flight
Prior art date
Legal status
Granted
Application number
CN202011069171.1A
Other languages
Chinese (zh)
Other versions
CN111930147B (en)
Inventor
饶丹
王陈
任斌
Current Assignee
Chengdu Jouav Automation Technology Co ltd
Original Assignee
Chengdu Jouav Automation Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Chengdu Jouav Automation Technology Co ltd filed Critical Chengdu Jouav Automation Technology Co ltd
Priority to CN202011069171.1A
Publication of CN111930147A
Application granted
Publication of CN111930147B
Legal status: Active

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10: Simultaneous control of position or course in three dimensions
    • G05D1/101: Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/0094: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots, involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application provides a storage medium, a flight control method and device, an autopilot and an unmanned aerial vehicle, and relates to the technical field of flight control. While the photographing device configured on the unmanned aerial vehicle is shooting a target object, the unmanned aerial vehicle acquires the current position parameter corresponding to the center of the device's field of view; when this current position parameter satisfies a set parameter-update condition, the unmanned aerial vehicle is controlled to fly according to an updated flight plan, so that it can track the target object in flight with reference to the center position of the camera's field of view, improving its tracking efficiency.

Description

Storage medium, flight control method and device, automatic pilot and unmanned aerial vehicle
Technical Field
The application relates to the technical field of flight control, in particular to a storage medium, a flight control method and device, an automatic pilot and an unmanned aerial vehicle.
Background
An unmanned aerial vehicle carrying photographic equipment such as an electro-optical pod can perform reconnaissance searches of the ground. After a target is found, the unmanned aerial vehicle can lock onto and track the target in the image, ensuring that the reconnaissance target is not lost and remains within the pod's image field of view.
For a moving target, however, the flight path of the unmanned aerial vehicle must be updated in real time according to the target's motion characteristics, so that the target can be tracked continuously over long distances and more useful reconnaissance information can be obtained.
At present, most moving-target tracking is performed along a fixed route of the unmanned aerial vehicle; once the target's range of motion exceeds the field of view of the airborne pod, the unmanned aerial vehicle loses the ability to track the target.
Disclosure of Invention
The application aims to provide a storage medium, a flight control method and device, an autopilot and an unmanned aerial vehicle that can track a target object in flight with reference to the center position of the camera's field of view, so as to improve the tracking efficiency and flexibility of the unmanned aerial vehicle.
In order to achieve the purpose, the technical scheme adopted by the application is as follows:
in a first aspect, the present application provides a flight control method, applied to an unmanned aerial vehicle, where the unmanned aerial vehicle is configured with a camera device; the method comprises the following steps:
acquiring a current position parameter corresponding to the center of a visual field of the photographic equipment when the photographic equipment shoots a target object;
judging whether the current position parameter meets a set parameter updating condition;
and when the current position parameter meets the parameter updating condition, controlling the unmanned aerial vehicle to fly according to the updated flight plan.
In a second aspect, the present application provides a flight control apparatus for an unmanned aerial vehicle, the unmanned aerial vehicle being configured with a camera device; the device comprises:
the processing module is used for acquiring current position parameters corresponding to the center of the field of view of the photographic equipment when the photographic equipment shoots a target object;
the processing module is further used for judging whether the current position parameter meets a set parameter updating condition;
and the control module is used for controlling the unmanned aerial vehicle to fly according to the updated flight plan when the current position parameter meets the parameter updating condition.
In a third aspect, the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the flight control method described above.
In a fourth aspect, the present application provides an autopilot that includes a memory for storing one or more programs; a processor; the one or more programs, when executed by the processor, implement the flight control method described above.
In a fifth aspect, the present application provides an unmanned aerial vehicle, which is equipped with the above-mentioned autopilot.
The application provides a storage medium, a flight control method and device, an autopilot and an unmanned aerial vehicle. While the photographing device configured on the unmanned aerial vehicle is shooting a target object, the unmanned aerial vehicle acquires the current position parameter corresponding to the center of the device's field of view; when this current position parameter satisfies a set parameter-update condition, the unmanned aerial vehicle is controlled to fly according to the updated flight plan, so that it can track the target object in flight with reference to the center position of the camera's field of view, improving its tracking efficiency and flexibility.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly explain the technical solutions of the present application, the drawings needed for the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and that those skilled in the art can also derive other related drawings from these drawings without inventive effort.
FIG. 1 is a block schematic diagram of an autopilot provided herein;
FIG. 2 illustrates an exemplary flow chart of a flight control method provided herein;
FIG. 3 illustrates an exemplary block diagram of a flight control apparatus provided herein.
In the figure: 100-autopilot; 101-a memory; 102-a processor; 103-a communication interface; 300-a flight control device; 301-a processing module; 302-control module.
Detailed Description
To make the purpose, technical solutions and advantages of the present application clearer, the technical solutions in the present application will be clearly and completely described below with reference to the accompanying drawings in some embodiments of the present application, and it is obvious that the described embodiments are some, but not all embodiments of the present application. The components of the present application, as generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, as presented in the figures, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments obtained by a person of ordinary skill in the art based on a part of the embodiments in the present application without any creative effort belong to the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
Consider a scenario in which a drone, for example a fixed-wing or compound-wing drone, tracks a target object. When performing a tracking task, a drone generally tracks a target whose motion state is stable, such as an automobile that can be assumed to drive straight along a highway, or a target that is close to stationary; the drone can then perform the tracking task in a stable state. For example, when the tracked target is an automobile traveling on a highway, the drone can fly along with it; when the tracked target is stationary, the drone can circle around it.
However, such tracking strategies generally work only in special scenarios like those above. In broader applications the motion state of the tracked target is usually uncertain: an automobile driving steadily may brake suddenly, and a stationary target may start moving. When the motion state of the tracked target changes under a strategy like those in the example, the target's continued motion may carry it outside the pod's tracking field of view, causing the drone to lose the target; the tracking task then fails, and tracking efficiency and flexibility are low.
To address these deficiencies of the above tracking scheme, the application provides the following possible implementation: while the photographing device configured on the unmanned aerial vehicle is shooting a target object, the unmanned aerial vehicle acquires the current position parameter corresponding to the center of the device's field of view; when this current position parameter satisfies a set parameter-update condition, the unmanned aerial vehicle is controlled in real time to fly according to an updated flight plan. The unmanned aerial vehicle can thus track the target object with reference to the center position of the camera's field of view, improving its tracking accuracy.
Some embodiments of the present application will be described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
Referring to fig. 1, fig. 1 shows a schematic block diagram of an autopilot 100 provided herein, and in some embodiments, the autopilot 100 may include a memory 101, a processor 102, and a communication interface 103, and the memory 101, the processor 102, and the communication interface 103 are electrically connected to one another, directly or indirectly, to enable data transmission or interaction. For example, the components may be electrically connected to each other via one or more communication buses or signal lines.
The memory 101 may be used to store software programs and modules, such as program instructions/modules corresponding to the flight control apparatus provided in the present application, and the processor 102 executes the software programs and modules stored in the memory 101 to execute various functional applications and data processing, thereby executing the steps of the flight control method provided in the present application. The communication interface 103 may be used for communicating signaling or data with other node devices.
The memory 101 may be, but is not limited to, random access memory (RAM), read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), and the like.
The processor 102 may be an integrated circuit chip having signal processing capabilities. The processor 102 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
It will be appreciated that the configuration shown in fig. 1 is merely illustrative and that the autopilot 100 may include more or fewer components than shown in fig. 1 or may have a different configuration than shown in fig. 1. The components shown in fig. 1 may be implemented in hardware, software, or a combination thereof.
In addition, the present application also provides a drone (not shown) that may be configured with an autopilot 100 such as shown in fig. 1 and a photographing device. For example, the photographing device may be a pod mounted on the unmanned aerial vehicle, an airborne gimbal, or some other device arranged on the unmanned aerial vehicle that integrates a camera and can provide an image-capture function; the present application does not limit the specific form of the photographing device.
In some embodiments of the present application, the photographing device may shoot the target object to obtain video image data and send it to the autopilot 100, and the autopilot 100 may control the flight state of the unmanned aerial vehicle according to the received video image data, so that the photographing device can continuously track and shoot the target object. Additionally, in some embodiments, the drone may be a fixed-wing drone or a compound-wing drone.
The following takes an unmanned aerial vehicle equipped with a camera device and an autopilot 100 as shown in fig. 1 as an exemplary execution subject, and an exemplary flight control method provided by the present application is described.
Referring to fig. 2, fig. 2 shows an exemplary flowchart of a flight control method provided by the present application, which may include the following steps:
step 201, acquiring a current position parameter corresponding to the center of the field of view of the photographing device when the photographing device photographs the target object.
Step 203, judging whether the current position parameter satisfies the set parameter-update condition; if yes, execute step 205; if not, return to step 201.
And step 205, controlling the unmanned aerial vehicle to fly according to the updated flight plan.
In the scenario in which the unmanned aerial vehicle tracks a target in flight: after take-off, for example, the drone may first fly according to a preset flight plan, use the configured photographing device to collect video image data in real time, and transmit the video image data to a receiving end, which displays to the user the video captured by the drone.
The receiving end can receive a tracking instruction input by a user on its display interface; the tracking instruction may instruct the unmanned aerial vehicle to track at least one target object in the video image data. After obtaining the tracking instruction, the receiving end can forward it to the drone. As such, after receiving the tracking instruction, the drone may, in response, develop a flight plan using a route guidance algorithm such as L1 guidance, and execute the flight plan to track the target object in flight.
In addition, in some possible scenes, when the unmanned aerial vehicle tracks the target object in real time to fly, the unmanned aerial vehicle can transmit video image data of the target object shot by the shooting equipment in real time to the receiving end, so that a user at the receiving end can watch a video picture of the target object in real time.
In some embodiments, the photographing apparatus may adjust the posture of the photographing apparatus in real time, so that the target object can be located near the center of the field of view of the photographing apparatus, and thus the target object can also be located near the center of the video frame in the video frame viewed by the user.
In some embodiments, after acquiring the video image data of the target object, the unmanned aerial vehicle may process it to obtain the current position parameter corresponding to the center of the photographing device's field of view; for example, it may combine data such as the attitude angle of the photographing device, the GPS (Global Positioning System) position of the unmanned aerial vehicle, its three-axis attitude angles, and ground elevation data to calculate the position parameters of the geographic point corresponding to the center of the field of view, such as the point's coordinates and altitude in the geodetic coordinate system.
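The geolocation step described above can be sketched in Python. This is a minimal illustration under strong simplifying assumptions (flat terrain at a known elevation, gimbal pitch measured down from the horizon, a small-offset latitude/longitude conversion); the function name and signature are hypothetical and not from the patent:

```python
import math

def view_center_ground_point(uav_lat, uav_lon, uav_alt_m, ground_alt_m,
                             heading_deg, pitch_deg):
    """Project the camera's line of sight onto flat ground.

    Hypothetical simplification: level terrain at ground_alt_m and a
    gimbal pitch measured down from the horizon; the real method also
    folds in the UAV's three-axis attitude and DEM elevation data.
    """
    height = uav_alt_m - ground_alt_m
    # Horizontal distance from the UAV to the point the camera looks at.
    horizontal = height / math.tan(math.radians(pitch_deg))
    # Offset in meters along the heading, converted to degrees of lat/lon.
    north = horizontal * math.cos(math.radians(heading_deg))
    east = horizontal * math.sin(math.radians(heading_deg))
    lat = uav_lat + north / 111_320.0
    lon = uav_lon + east / (111_320.0 * math.cos(math.radians(uav_lat)))
    return lat, lon
```

For instance, a drone at 600 m altitude over 500 m terrain, looking 45 deg down along its heading, sees a point roughly 100 m ahead on the ground.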
Next, the drone may determine whether the current position parameter satisfies a preset parameter update condition based on the preset parameter update condition; the set parameter update condition may be used to instruct the drone to update the currently executed flight plan.
It should be noted that the set parameter update condition may be a parameter update condition stored by the unmanned aerial vehicle by default, or may be a parameter update condition sent by some devices, such as the receiving end, received by the unmanned aerial vehicle.
When the unmanned aerial vehicle determines that the current position parameter does not satisfy the parameter-update condition, indicating that it can currently continue tracking the target object, it returns to step 201. Conversely, when the current position parameter satisfies the parameter-update condition, indicating that the drone cannot continue tracking the target object as-is, the drone updates the flight plan and executes step 205 to fly according to the updated plan, so that the requirement of continuously tracking the target object in flight is met.
It can thus be seen that, based on the above embodiments provided by the application, while the photographing device configured on the unmanned aerial vehicle is shooting the target object, the drone acquires the current position parameter corresponding to the center of the device's field of view, and, when that parameter satisfies the set parameter-update condition, is controlled to fly according to the updated flight plan; the drone can therefore track the target object with reference to the center of the camera's field of view, improving its tracking efficiency.
In some possible scenarios, when executing step 201 the unmanned aerial vehicle may use devices such as a laser rangefinder and GPS to obtain the distance between the drone and the center of the camera's field of view, and then compute the current position parameters of that center, such as its coordinates and altitude in the geodetic coordinate system, by combining the drone's GPS-measured geodetic coordinates with the attitude parameters of the drone and of the photographing device.
In addition, in some scenarios, to improve the accuracy of the computed position of the field-of-view center, the unmanned aerial vehicle may iteratively refine that position using geographic elevation data, such as a pre-stored digital elevation model (DEM), to obtain a more accurate current position parameter of the field-of-view center. For brevity, the specific formulas for this calculation may be found in Chinese patent application CN201810409853.9 and are not repeated here.
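The DEM-based refinement can be illustrated as a simple ray-marching iteration: step along the camera's line of sight until the ray's height drops below the terrain height. This is only a hedged sketch; the actual iterative formulas are those of CN201810409853.9, and `dem_elevation` is a hypothetical stand-in for the stored DEM lookup:

```python
def view_center_with_dem(uav_pos, unit_los, dem_elevation,
                         step_m=5.0, max_range_m=5000.0):
    """March along the camera's line-of-sight ray until it meets terrain.

    uav_pos: (x, y, z) in a local metric frame; unit_los: unit look
    vector (negative z = pointing down); dem_elevation(x, y): terrain
    height lookup, a caller-supplied stand-in for the DEM in the text.
    """
    x, y, z = uav_pos
    dx, dy, dz = unit_los
    t = 0.0
    while t < max_range_m:
        px, py, pz = x + dx * t, y + dy * t, z + dz * t
        if pz <= dem_elevation(px, py):  # ray has reached the terrain
            return px, py, pz
        t += step_m
    return None  # no intersection within max_range_m
```

A drone 100 m above flat terrain looking straight down would resolve the field-of-view center to the point directly beneath it, while a horizontal ray never intersects the ground.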
In some possible application scenarios, when the drone is flying while tracking the target object, a flight plan may be made based on a scheme such as L1 guidance, and the target object may be tracked in a surrounding manner; in this scenario, if the target object is stationary, the flight trajectory of the unmanned aerial vehicle may be circular; if the target object moves along a straight line, the flight trajectory of the drone may be helical.
On this basis, as a possible implementation, when executing step 203 the unmanned aerial vehicle may construct a desired guidance circle centered on the coordinate point in the current position parameter; the desired guidance circle indicates the guidance circle the drone will follow under the updated flight plan. When constructing the desired guidance circle, the drone may obtain its radius in either of the following ways.
the drone may store a preset roll angle that may be indicative of a maximum roll angle of the drone while maintaining normal flight (e.g., the preset roll angle may be set to 15 deg); based on the above, the unmanned aerial vehicle can calculate a first guidance circle radius of the unmanned aerial vehicle when the unmanned aerial vehicle flies according to the preset roll angle, so that the first guidance circle radius is used as the guidance circle radius of the constructed expected guidance circle.
Illustratively, the calculation formula of the first guidance circle radius may satisfy the following:
R_BANK = TAS^2 / (GRAVITY × tan(MaxBank))
where R_BANK represents the first guidance-circle radius; TAS represents the flight speed (true airspeed) of the drone; GRAVITY represents the acceleration of gravity; tan represents the tangent function; and MaxBank represents the preset roll angle.
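Assuming the formula above is the standard coordinated-turn radius relation, it can be evaluated as follows (hypothetical function name; gravity defaulted to 9.81 m/s^2):

```python
import math

def bank_limited_radius(tas_mps, max_bank_deg, gravity=9.81):
    """First guidance-circle radius: the tightest level turn the UAV
    can hold at its preset maximum roll angle (coordinated-turn
    relation R = V^2 / (g * tan(phi)))."""
    return tas_mps ** 2 / (gravity * math.tan(math.radians(max_bank_deg)))
```

At a 20 m/s airspeed (a value between the 15 m/s and 30 m/s limits mentioned later in the text) and the 15 deg preset roll angle, this gives a radius of roughly 152 m.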
It can be understood that the preset roll angle may be a roll angle stored by the drone by default, or a roll angle received by the drone from a device such as the receiving end.

Alternatively, the drone may store a first pitch angle of the camera device, characterizing the preferred attitude angle of the camera when shooting the target object (for example, the first pitch angle may be set to 40 deg). Based on this preset first pitch angle, the drone can calculate a second guidance-circle radius for shooting at that angle, and use this second guidance-circle radius as the radius of the constructed desired guidance circle.
For example, the calculation formula of the second guidance circle radius may satisfy the following:
R_GIM = VerticalDis / tan(EulerTheta)
where R_GIM represents the second guidance-circle radius; VerticalDis represents the height difference between the drone and the target object; and EulerTheta represents the preset first pitch angle.
Or, in some other implementations of the present application, the drone may compare the calculated first and second guidance-circle radii and take the larger of the two as the radius of the desired guidance circle; the determined radius of the desired guidance circle is then more reliable.
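The second radius and the larger-of-the-two selection can be sketched the same way. Treating EulerTheta as measured down from the horizon (so that R_GIM = VerticalDis / tan(EulerTheta)) is an assumption about the patent's convention, and the function names are hypothetical:

```python
import math

def gimbal_limited_radius(vertical_dis_m, pitch_deg):
    """Second guidance-circle radius: horizontal stand-off at which the
    camera sees the target at its preferred pitch angle (pitch assumed
    measured down from the horizon)."""
    return vertical_dis_m / math.tan(math.radians(pitch_deg))

def desired_circle_radius(tas_mps, max_bank_deg, vertical_dis_m, pitch_deg,
                          gravity=9.81):
    """Take the larger of the bank-limited and gimbal-limited radii,
    as in the combined implementation described in the text."""
    r_bank = tas_mps ** 2 / (gravity * math.tan(math.radians(max_bank_deg)))
    r_gim = gimbal_limited_radius(vertical_dis_m, pitch_deg)
    return max(r_bank, r_gim)
```

With a 300 m height difference and the 40 deg preferred pitch angle, the gimbal-limited radius is about 358 m, which dominates the roughly 152 m bank-limited radius from the earlier example.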
Next, after obtaining the radius of the desired guidance circle, the drone may also obtain a redundant flight range, which indicates the flight range within which the drone does not update the flight plan.
In some possible embodiments, the drone may preset another, second pitch angle, indicating the maximum pitch angle at which the photographing device can shoot the target object (for example, the second pitch angle may be 45 deg). Using the scale proportion of the photographing device, the drone can calculate the target field-of-view range when shooting at this preset second pitch angle, i.e. the maximum ground range the device can capture at that attitude. It then computes the difference between this target field-of-view range and the drone's current guidance circle as a first distance value, and takes that first distance value as the redundant flight range; the current guidance circle is the guidance circle of the flight plan the drone was executing previously.
Or, in another possible implementation, the drone may preset a second distance value, indicating a distance threshold for updating the flight plan, i.e. an upper limit on the distance between the centers of the old and new guidance circles when the plan is updated. On this basis, the drone may take this preset second distance value as the redundant flight range described above.
Still alternatively, in some other possible embodiments of the present application, the drone may compare the calculated first distance value with the preset second distance value and take the smaller of the two as the redundant flight range.
Next, based on the obtained redundant flight range, the unmanned aerial vehicle determines whether the current position parameter lies within it, for example whether the coordinates of the camera's field-of-view center fall inside the redundant flight range. When the current position parameter is within the redundant flight range, the drone judges that it does not satisfy the set parameter-update condition, and the flight plan need not be updated; otherwise, when the current position parameter is not within the redundant flight range, the drone judges that the parameter-update condition is satisfied, i.e. the flight plan must be updated.
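The redundant-flight-range logic can be sketched as follows. Interpreting "within the redundant flight range" as a distance check between the field-of-view center and the current guidance-circle center is an assumption (the patent states the condition only abstractly), and all names are hypothetical:

```python
import math

def redundant_flight_range(first_distance_m, second_distance_m):
    """Smaller of the field-of-view margin (first distance value) and
    the preset center-shift threshold (second distance value), per the
    combined embodiment in the text."""
    return min(first_distance_m, second_distance_m)

def needs_plan_update(view_center_xy, circle_center_xy, redundant_range_m):
    """Assumed update criterion: the field-of-view center has drifted
    farther from the current guidance-circle center than the redundant
    flight range allows."""
    dx = view_center_xy[0] - circle_center_xy[0]
    dy = view_center_xy[1] - circle_center_xy[1]
    return math.hypot(dx, dy) > redundant_range_m
```

Under this reading, a target drifting 150 m from a circle center with a 100 m redundant range triggers an update, while a 50 m drift does not.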
In some embodiments, when updating the flight plan, the drone may take the computed desired guidance circle as the new flight plan; that is, the drone performs orbiting tracking flight around the target object, with the current position parameter of the camera's field-of-view center as the circle center and the computed radius of the desired guidance circle as the orbit radius.
In addition, in some embodiments, after obtaining the updated flight plan, the drone may use a preset route guidance algorithm, such as the L1 guidance algorithm in the above example, to guide its flight along the desired guidance circle, so as to ensure that the camera device can track the target object at the preferred pitch angle.
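The L1 guidance algorithm mentioned here can be sketched in its generic textbook form, a lateral-acceleration command a = 2V²·sin(η)/L1, where η is the angle from the velocity vector to the line of sight toward a reference point chosen on the desired circle. This is an assumption about the algorithm's general form, not the patent's exact law, and all names are illustrative:

```python
import math

def l1_accel_cmd(vel, to_ref, l1_dist=None):
    """Lateral-acceleration command of a generic L1-style guidance law.

    vel: (vx, vy) ground velocity in m/s.
    to_ref: vector from the aircraft to a reference point on the desired
            guidance circle.
    l1_dist: L1 look-ahead distance in m (defaults to |to_ref|).
    """
    speed = math.hypot(vel[0], vel[1])
    ref_len = math.hypot(to_ref[0], to_ref[1])
    l1 = l1_dist if l1_dist is not None else ref_len
    # Signed sin(eta) via the 2-D cross product: positive commands a left turn.
    sin_eta = (vel[0] * to_ref[1] - vel[1] * to_ref[0]) / (speed * ref_len)
    return 2.0 * speed ** 2 * sin_eta / l1
```

For example, flying at 20 m/s with the reference point offset 45 degrees to the left, the law commands roughly 8 m/s² of lateral acceleration toward the circle.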
Moreover, while the unmanned aerial vehicle tracks the target object under real-time guidance, its airspeed may be held at the preset cruising airspeed, so that the unmanned aerial vehicle can track the target object stably.
For example, assuming that the minimum and maximum flight speeds of the drone are 15 m/s and 30 m/s respectively, the cruising flight speed may be set to any value within this envelope, for example 20 m/s.
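Holding the commanded airspeed inside the airframe envelope is a simple clamp; the 15/30 m/s limits below echo the example above and are illustrative defaults, not the patent's values:

```python
def cruise_speed(requested, v_min=15.0, v_max=30.0):
    """Clamp the commanded cruise airspeed (m/s) to the airframe envelope."""
    return max(v_min, min(v_max, requested))

print(cruise_speed(20.0))  # 20.0: within the envelope, kept as-is
print(cruise_speed(10.0))  # 15.0: clamped up to the minimum
print(cruise_speed(40.0))  # 30.0: clamped down to the maximum
```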
In addition, based on the same inventive concept as the flight control method provided above, please refer to fig. 3, which shows an exemplary structural block diagram of a flight control device 300 provided in the present application; the flight control device 300 may include a processing module 301 and a control module 302, wherein:
the processing module 301 is configured to acquire a current position parameter corresponding to a center of a field of view of the photographing device when the photographing device photographs a target object;
the processing module 301 is further configured to determine whether the current location parameter meets a set parameter update condition;
and the control module 302 is configured to control the unmanned aerial vehicle to fly according to the updated flight plan when the current position parameter meets the parameter update condition.
Optionally, as a possible implementation manner, when determining whether the current location parameter meets the set parameter update condition, the processing module 301 is specifically configured to:
acquiring the guidance circle radius for when the unmanned aerial vehicle flies along the desired guidance circle; the desired guidance circle is a guidance circle centered on the current position parameter, and indicates the guidance circle followed when the unmanned aerial vehicle flies according to the updated flight plan;
obtaining a redundant flight range of the unmanned aerial vehicle; wherein the redundant flight range indicates the flight range within which the drone does not update the flight plan;
judging whether the current position parameter is in a redundant flight range;
when the current position parameter is in the redundant flight range, judging that the current position parameter does not meet the set parameter updating condition;
and when the current position parameter is not in the redundant flight range, judging that the current position parameter meets the set parameter updating condition.
Optionally, as a possible implementation manner, when obtaining the guidance circle radius, the processing module 301 is specifically configured to:
calculate a first guidance circle radius for when the unmanned aerial vehicle flies at a preset roll angle, and a second guidance circle radius for when the photographing equipment photographs at a preset first pitch angle;
determine the larger of the first guidance circle radius and the second guidance circle radius as the guidance circle radius of the desired guidance circle.
Optionally, as a possible implementation manner, when obtaining the redundant flight range of the drone, the processing module 301 is specifically configured to:
calculating a target field-of-view range for when the photographing equipment photographs at a preset second pitch angle, and a first distance value between the target field-of-view range and the current guidance circle of the unmanned aerial vehicle;
and comparing the first distance value with a preset second distance value to determine the smaller of the first distance value and the second distance value as the redundant flight range.
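The comparison above reduces to taking the smaller of the two distances; a minimal sketch with illustrative names (meter units assumed):

```python
def redundant_flight_range(first_distance_m, second_distance_m):
    """Redundant flight range: the smaller of the camera-geometry distance
    (first) and the preset update-threshold distance (second), in meters."""
    return min(first_distance_m, second_distance_m)

print(redundant_flight_range(35.0, 20.0))  # 20.0: the preset threshold wins
print(redundant_flight_range(12.0, 20.0))  # 12.0: the geometry distance wins
```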
Optionally, as a possible implementation, when controlling the drone to fly according to the updated flight plan, the control module 302 is specifically configured to:
and controlling the unmanned aerial vehicle to guide and fly according to the expected pilot circle by using a preset air route guidance algorithm.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to some embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in some embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the part of the technical solution of the present application that substantially contributes over the prior art may be embodied in the form of a software product stored in a storage medium, including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to perform all or part of the steps of the method according to some embodiments of the present application. The aforementioned storage medium includes media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above description is only a few examples of the present application and is not intended to limit the present application, and those skilled in the art will appreciate that various modifications and variations can be made in the present application. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.
It will be evident to those skilled in the art that the present application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.

Claims (10)

1. A flight control method is characterized by being applied to an unmanned aerial vehicle, wherein the unmanned aerial vehicle is provided with a photographic device; the method comprises the following steps:
acquiring a current position parameter corresponding to the center of a visual field of the photographic equipment when the photographic equipment shoots a target object;
judging whether the current position parameter meets a set parameter updating condition;
and when the current position parameter meets the parameter updating condition, controlling the unmanned aerial vehicle to fly according to the updated flight plan.
2. The method of claim 1, wherein the determining whether the current location parameter satisfies a set parameter update condition comprises:
obtaining a guidance circle radius for when the unmanned aerial vehicle flies along a desired guidance circle; wherein the desired guidance circle is a guidance circle centered on the current position parameter, and indicates the guidance circle followed when the unmanned aerial vehicle flies according to the updated flight plan;
obtaining a redundant flight range of the unmanned aerial vehicle; wherein the redundant flight range indicates the flight range within which the drone does not update the flight plan;
judging whether the current position parameter is in the redundant flight range;
when the current position parameter is positioned in the redundant flight range, judging that the current position parameter does not meet a set parameter updating condition;
and when the current position parameter is not in the redundant flight range, judging that the current position parameter meets the set parameter updating condition.
3. The method of claim 2, wherein obtaining the guided circle radius comprises:
calculating a first guidance circle radius for when the unmanned aerial vehicle flies at a preset roll angle, and calculating a second guidance circle radius for when the photographing equipment photographs at a preset first pitch angle;
determining the larger of the first guidance circle radius and the second guidance circle radius as the guidance circle radius of the desired guidance circle.
4. The method of claim 2, wherein said obtaining redundant flight ranges for said drones comprises:
calculating a target field-of-view range for when the photographing equipment photographs at a preset second pitch angle, and a first distance value between the target field-of-view range and the current guidance circle of the unmanned aerial vehicle;
and comparing the first distance value with a preset second distance value to determine the smaller of the first distance value and the second distance value as the redundant flight range.
5. The method of claim 2, wherein said controlling the drone to fly at an updated flight plan comprises:
and controlling the unmanned aerial vehicle, by using a preset route guidance algorithm, to fly in a guided manner along the desired guidance circle.
6. A flight control device is characterized by being applied to an unmanned aerial vehicle, wherein the unmanned aerial vehicle is provided with a photographic device; the device comprises:
the processing module is used for acquiring current position parameters corresponding to the center of the field of view of the photographic equipment when the photographic equipment shoots a target object;
the processing module is further used for judging whether the current position parameter meets a set parameter updating condition;
and the control module is used for controlling the unmanned aerial vehicle to fly according to the updated flight plan when the current position parameter meets the parameter updating condition.
7. The apparatus according to claim 6, wherein the processing module, when determining whether the current location parameter satisfies a set parameter update condition, is specifically configured to:
obtaining a guidance circle radius for when the unmanned aerial vehicle flies along a desired guidance circle; wherein the desired guidance circle is a guidance circle centered on the current position parameter, and indicates the guidance circle followed when the unmanned aerial vehicle flies according to the updated flight plan;
obtaining a redundant flight range of the unmanned aerial vehicle; wherein the redundant flight range indicates the flight range within which the drone does not update the flight plan;
judging whether the current position parameter is in the redundant flight range;
when the current position parameter is positioned in the redundant flight range, judging that the current position parameter does not meet a set parameter updating condition;
and when the current position parameter is not in the redundant flight range, judging that the current position parameter meets the set parameter updating condition.
8. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-5.
9. An autopilot, comprising:
a memory for storing one or more programs;
a processor;
the one or more programs, when executed by the processor, implement the method of any of claims 1-5.
10. An unmanned aerial vehicle, characterized in that the unmanned aerial vehicle is equipped with an autopilot as claimed in claim 9.
CN202011069171.1A 2020-10-09 2020-10-09 Storage medium, flight control method and device, automatic pilot and unmanned aerial vehicle Active CN111930147B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011069171.1A CN111930147B (en) 2020-10-09 2020-10-09 Storage medium, flight control method and device, automatic pilot and unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN111930147A true CN111930147A (en) 2020-11-13
CN111930147B CN111930147B (en) 2021-01-26

Family

ID=73334312

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011069171.1A Active CN111930147B (en) 2020-10-09 2020-10-09 Storage medium, flight control method and device, automatic pilot and unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN111930147B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016174659A1 (en) * 2015-04-27 2016-11-03 Snapaid Ltd. Estimating and using relative head pose and camera field-of-view
CN106657779A (en) * 2016-12-13 2017-05-10 重庆零度智控智能科技有限公司 Surround shooting method and device, and unmanned aerial vehicle
CN109447398A (en) * 2018-09-17 2019-03-08 北京晶品镜像科技有限公司 A kind of intelligence shooting decision-making technique of artilleryman group
CN109911231A (en) * 2019-03-20 2019-06-21 武汉理工大学 Unmanned plane autonomous landing on the ship method and system based on GPS and image recognition hybrid navigation
CN109947123A (en) * 2019-02-27 2019-06-28 南京航空航天大学 A kind of unmanned plane path trace and automatic obstacle avoiding method based on line of sight guidance rule
WO2019135912A1 (en) * 2018-01-05 2019-07-11 Gopro, Inc. Adaptive object detection
CN110568862A (en) * 2019-09-29 2019-12-13 苏州浪潮智能科技有限公司 Unmanned aerial vehicle flight path planning method and device and related equipment
CN110908405A (en) * 2019-12-18 2020-03-24 中国人民解放军总参谋部第六十研究所 Control method for fixed-wing unmanned aerial vehicle during concentric circle flight
CN111556546A (en) * 2020-03-19 2020-08-18 西安电子科技大学 Searching method, system, storage medium and application of shortest information collection path

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113110578A (en) * 2021-04-16 2021-07-13 成都纵横自动化技术股份有限公司 Control method, system and device of unmanned aerial vehicle
CN113110578B (en) * 2021-04-16 2023-03-10 成都纵横自动化技术股份有限公司 Unmanned aerial vehicle control method, system and device
CN115665553A (en) * 2022-09-29 2023-01-31 深圳市旗扬特种装备技术工程有限公司 Automatic tracking method and device for unmanned aerial vehicle, electronic equipment and storage medium
CN115665553B (en) * 2022-09-29 2023-06-13 深圳市旗扬特种装备技术工程有限公司 Automatic tracking method and device of unmanned aerial vehicle, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN108227751B (en) Landing method and system of unmanned aerial vehicle
US10656661B2 (en) Methods and apparatus of tracking moving targets from air vehicles
CN106657779B (en) Surrounding shooting method and device and unmanned aerial vehicle
US20200378760A1 (en) Hover control
US11906983B2 (en) System and method for tracking targets
CA2569209C (en) Image-augmented inertial navigation system (iains) and method
US20200191556A1 (en) Distance mesurement method by an unmanned aerial vehicle (uav) and uav
CN111930147B (en) Storage medium, flight control method and device, automatic pilot and unmanned aerial vehicle
CN111966133A (en) Visual servo control system of holder
WO2017168423A1 (en) System and method for autonomous guidance of vehicles
CN112631265B (en) Flight control method and device, storage medium, automatic pilot and unmanned aerial vehicle
US20210120171A1 (en) Determination device, movable body, determination method, and program
US10565863B1 (en) Method and device for providing advanced pedestrian assistance system to protect pedestrian preoccupied with smartphone
CN112712558A (en) Positioning method and device of unmanned equipment
CN109143303B (en) Flight positioning method and device and fixed-wing unmanned aerial vehicle
Kwak et al. Emerging ICT UAV applications and services: Design of surveillance UAVs
CN113848541B (en) Calibration method and device, unmanned aerial vehicle and computer readable storage medium
Andert et al. Optical-aided aircraft navigation using decoupled visual SLAM with range sensor augmentation
JP2021110692A (en) Drone system and vehicle photographing method by drone
CN110997488A (en) System and method for dynamically controlling parameters for processing sensor output data
KR101821992B1 (en) Method and apparatus for computing 3d position of target using unmanned aerial vehicles
CN116430901A (en) Unmanned aerial vehicle return control method and system based on mobile parking apron
WO2018123013A1 (en) Controller, mobile entity, control method, and program
US20210218879A1 (en) Control device, imaging apparatus, mobile object, control method and program
CN117470199B (en) Swing photography control method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 7 / F, area a, building 6, No. 200, Tianfu 5th Street, high tech Zone, Chengdu, Sichuan 610000

Patentee after: CHENGDU JOUAV AUTOMATION TECHNOLOGY Co.,Ltd.

Country or region after: China

Address before: Room 801-805, 8th floor, Building A, No. 200, Tianfu Wujie, Chengdu High-tech Zone, Sichuan Province, 610000

Patentee before: CHENGDU JOUAV AUTOMATION TECHNOLOGY Co.,Ltd.

Country or region before: China