CN111352410A - Flight control method and device, storage medium, automatic pilot and unmanned aerial vehicle - Google Patents


Info

Publication number: CN111352410A
Application number: CN202010340065.6A
Authority: CN (China)
Prior art keywords: unmanned aerial vehicle, angular velocity, target, flight state
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventor: 李泽伟
Current Assignee: Chongqing Yifei Zhilian Technology Co ltd (the listed assignees may be inaccurate)
Original Assignee: Chongqing Yifei Zhilian Technology Co ltd
Application filed by Chongqing Yifei Zhilian Technology Co ltd
Priority to CN202010340065.6A
Publication of CN111352410A
Priority to CN202110031631.XA (published as CN112631265B)

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/0094: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot, involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/08: Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D 1/0808: Control of attitude, i.e. control of roll, pitch, or yaw, specially adapted for aircraft
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 13/00: Control of linear speed; Control of angular speed; Control of acceleration or deceleration, e.g. of a prime mover
    • G05D 13/62: Control of linear speed; Control of angular speed; Control of acceleration or deceleration, e.g. of a prime mover, characterised by the use of electric means, e.g. use of a tachometric dynamo, use of a transducer converting an electric value into a displacement

Abstract

The application provides a flight control method and device, a storage medium, an autopilot and an unmanned aerial vehicle, and relates to the technical field of flight control. An ideal flight state is recorded, the ideal flight state being the flight state of the unmanned aerial vehicle when an image acquisition device tracks a target object at a set tracking view angle. After a first angular velocity of the image acquisition device is obtained, a target course angular velocity of the unmanned aerial vehicle in the ideal flight state can be obtained according to the first angular velocity, and the control output quantity of the unmanned aerial vehicle is then obtained according to the current course angular velocity and the target course angular velocity, so that the unmanned aerial vehicle flies in the ideal flight state. Compared with the prior art, the control output quantity of the unmanned aerial vehicle is calculated in combination with the first angular velocity of the image acquisition device, so that while flying the unmanned aerial vehicle allows the image acquisition device to track the target object at the set tracking view angle, thereby improving the quality of the images acquired when the image acquisition device tracks the target object.

Description

Flight control method and device, storage medium, automatic pilot and unmanned aerial vehicle
Technical Field
The application relates to the technical field of flight control, in particular to a flight control method, a flight control device, a storage medium, an automatic pilot and an unmanned aerial vehicle.
Background
With the increase of unmanned aerial vehicle application scenarios, devices such as a pod or a camera can be mounted on the unmanned aerial vehicle as image acquisition equipment to track a target.
Taking a pod mounted on the unmanned aerial vehicle as an example, when tracking a target the unmanned aerial vehicle is generally responsible for executing the flight task while the pod executes the target tracking task. The unmanned aerial vehicle and the pod are controlled independently of each other, and the unmanned aerial vehicle does not consider the influence of its flight attitude on the shooting attitude of the pod when flying, so the image quality obtained when the pod tracks the target is poor.
Disclosure of Invention
An object of the application is to provide a flight control method, a flight control device, a storage medium, an automatic pilot and an unmanned aerial vehicle, so that the unmanned aerial vehicle can meet the requirement that an image acquisition device tracks a target object at a set tracking visual angle when flying, and the image quality acquired when the image acquisition device tracks the target object is improved.
In order to achieve the purpose, the technical scheme adopted by the application is as follows:
in a first aspect, the present application provides a flight control method, which is applied to an autopilot in an unmanned aerial vehicle, wherein the unmanned aerial vehicle is mounted with an image acquisition device, and the method includes:
obtaining a first angular velocity of the image acquisition device;
obtaining a target course angular speed of the unmanned aerial vehicle in an ideal flight state according to the first angular speed; the ideal flight state is the flight state of the unmanned aerial vehicle when the image acquisition equipment tracks the target object at the set tracking visual angle;
and obtaining the control output quantity of the unmanned aerial vehicle according to the current course angular velocity and the target course angular velocity of the unmanned aerial vehicle so that the unmanned aerial vehicle can fly according to the ideal flight state.
In a second aspect, the present application provides a flight control device, which is applied to an autopilot in an unmanned aerial vehicle, wherein the unmanned aerial vehicle is mounted with an image acquisition device, and the device includes:
the processing module is used for obtaining a first angular speed of the image acquisition equipment;
the processing module is further used for obtaining a target course angular speed of the unmanned aerial vehicle in an ideal flight state according to the first angular speed; the ideal flight state is the flight state of the unmanned aerial vehicle when the image acquisition equipment tracks the target object at the set tracking visual angle;
and the control module is used for obtaining the control output quantity of the unmanned aerial vehicle according to the current course angular velocity and the target course angular velocity of the unmanned aerial vehicle so that the unmanned aerial vehicle can fly according to the ideal flight state.
In a third aspect, the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the flight control method described above.
In a fourth aspect, the present application provides an autopilot that includes a memory for storing one or more programs; a processor; the one or more programs, when executed by the processor, implement the flight control method described above.
In a fifth aspect, the present application provides an unmanned aerial vehicle, which is equipped with the above-mentioned autopilot.
According to the flight control method and device, the storage medium, the autopilot and the unmanned aerial vehicle of the present application, an ideal flight state is recorded, the ideal flight state being the flight state of the unmanned aerial vehicle when the image acquisition equipment tracks a target object at a set tracking view angle. After the first angular velocity of the image acquisition equipment is obtained, the target course angular velocity of the unmanned aerial vehicle in the ideal flight state can be obtained according to the first angular velocity, and the control output quantity of the unmanned aerial vehicle is then obtained according to the current course angular velocity and the target course angular velocity, so that the unmanned aerial vehicle flies in the ideal flight state. Compared with the prior art, the control output quantity of the unmanned aerial vehicle is calculated in combination with the first angular velocity of the image acquisition equipment, so that while flying the unmanned aerial vehicle allows the image acquisition equipment to track the target object at the set tracking view angle, thereby improving the quality of the images acquired when the image acquisition equipment tracks the target object.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly explain the technical solutions of the present application, the drawings needed for the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and that those skilled in the art can also derive other related drawings from these drawings without inventive effort.
FIG. 1 illustrates a schematic application scenario of the flight control method provided in the present application;
FIG. 2 is a block schematic diagram of an autopilot provided herein;
FIG. 3 illustrates a schematic flow diagram of a flight control method provided herein;
FIG. 4 shows a schematic flow diagram of sub-steps of step 201 of FIG. 3;
FIG. 5 shows a schematic flow diagram of sub-steps of step 203 in FIG. 3;
FIG. 6 shows a schematic flow diagram of the substeps of step 203-1 in FIG. 5;
FIG. 7 shows a schematic block flow diagram of the substeps of step 203-1b of FIG. 6;
fig. 8 shows a schematic block diagram of a flight control device provided in the present application.
In the figure: 100-autopilot; 101-a memory; 102-a processor; 103-a communication interface; 300-a flight control device; 301-a processing module; 302-control module.
Detailed Description
To make the purpose, technical solutions and advantages of the present application clearer, the technical solutions in the present application will be clearly and completely described below with reference to the accompanying drawings in some embodiments of the present application, and it is obvious that the described embodiments are some, but not all embodiments of the present application. The components of the present application, as generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, as presented in the figures, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments obtained by a person of ordinary skill in the art based on a part of the embodiments in the present application without any creative effort belong to the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
It is noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Some embodiments of the present application will be described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
Referring to fig. 1, fig. 1 is a schematic application scenario diagram of the flight control method provided in the present application. In some application scenarios, a pod or a camera, for example, can be mounted on the unmanned aerial vehicle as load equipment, so that image and video information of a target object on the ground is acquired by the load equipment from an aerial perspective, for purposes such as security inspection.
In an application scenario such as that shown in fig. 1, for example, a pod is mounted on an unmanned aerial vehicle as a load device to track a target object, when tracking the target object, the unmanned aerial vehicle is generally responsible for performing a flight mission, an autopilot mounted on the unmanned aerial vehicle is responsible for controlling the unmanned aerial vehicle to fly along a flight route, and the pod is generally responsible for performing video tracking on the target object and transmitting image information or ground information of the target object to a ground device.
When the target object is tracked, the control of the unmanned aerial vehicle and the control of the pod are mutually independent; generally, an unmanned aerial vehicle executes a flight task according to a preset flight route or a flight command sent by a receiving ground terminal, and after a pod locks a target, the shooting attitude of the pod is adjusted according to the position and the size of the target object in a pod picture, so that the target object can be stably tracked.
However, just because the control of the unmanned aerial vehicle and the control of the pod are independent, the unmanned aerial vehicle does not consider the influence of the flight attitude of the unmanned aerial vehicle on the tracking target of the pod when performing the flight task, so that the image quality obtained by the pod when tracking the target object is poor.
For example, when the unmanned aerial vehicle is far from the target, the target object becomes smaller in the pod picture; limited by the pod's resolution, the picture may blur so that the target object cannot be captured. Conversely, when the unmanned aerial vehicle approaches the target, the target object becomes larger in the pod picture and may even fill the entire picture, so that the pod cannot track it.
Therefore, in view of the above drawbacks, the present application provides a possible implementation as follows: an ideal flight state is recorded, the ideal flight state being the flight state of the unmanned aerial vehicle when the image acquisition device tracks the target object at a set tracking view angle. After the first angular velocity of the image acquisition device is obtained, the target course angular velocity of the unmanned aerial vehicle in the ideal flight state can be obtained according to the first angular velocity, and the control output quantity of the unmanned aerial vehicle is then obtained according to the current course angular velocity and the target course angular velocity, so that the unmanned aerial vehicle flies in the ideal flight state. In this way, while the unmanned aerial vehicle flies, the image acquisition device can track the target object at the set tracking view angle, improving the quality of the images acquired during tracking.
Referring to fig. 2, fig. 2 shows a schematic block diagram of an autopilot 100 provided herein, and in one embodiment, the autopilot 100 may include a memory 101, a processor 102, and a communication interface 103, and the memory 101, the processor 102, and the communication interface 103 are electrically connected to each other directly or indirectly to enable data transmission or interaction. For example, the components may be electrically connected to each other via one or more communication buses or signal lines.
The memory 101 may be used to store software programs and modules, such as program instructions/modules corresponding to the flight control apparatus provided in the present application, and the processor 102 executes the software programs and modules stored in the memory 101 to execute various functional applications and data processing, thereby executing the steps of the flight control method provided in the present application. The communication interface 103 may be used for communicating signaling or data with other node devices.
The memory 101 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
The processor 102 may be an integrated circuit chip having signal processing capabilities. The processor 102 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
It will be appreciated that the configuration shown in fig. 2 is merely illustrative and that the autopilot 100 may include more or fewer components than shown in fig. 2 or may have a different configuration than shown in fig. 2. The components shown in fig. 2 may be implemented in hardware, software, or a combination thereof.
Based on the autopilot 100 of the above example, the present application also provides an unmanned aerial vehicle (not shown) equipped with the autopilot 100.
The flight control method provided by the present application is exemplarily described below with the autopilot 100 shown in fig. 2 as a schematic execution subject, where the autopilot 100 is mounted on an unmanned aerial vehicle, and the unmanned aerial vehicle is also mounted with an image capture device.
Referring to fig. 3, fig. 3 shows a schematic flow chart of a flight control method provided by the present application, which may include the following steps:
step 201, obtaining a first angular velocity of an image acquisition device;
step 203, obtaining a target course angular speed of the unmanned aerial vehicle in an ideal flight state according to the first angular speed;
and step 205, obtaining the control output quantity of the unmanned aerial vehicle according to the current course angular velocity and the target course angular velocity of the unmanned aerial vehicle, so that the unmanned aerial vehicle can fly according to an ideal flying state.
In one embodiment, the autopilot may record an ideal flight state, where the ideal flight state is the flight state of the drone when the image capture device tracks the target object at the set tracking view angle. For example, taking a pod as the image capturing device, if the ideal tracking view angle of the pod is preset to 30 degrees, the ideal flight state is the flight state of the unmanned aerial vehicle when the pod tracks the target object at a 30-degree view angle.
Therefore, after the automatic pilot determines that the image acquisition device acquires the image of the target object, the automatic pilot can adjust the flight state of the unmanned aerial vehicle, so that the image acquisition device can place the target object in the optimal tracking view range.
In some possible scenes, image acquisition equipment is mounted on the unmanned aerial vehicle to track and surround the target object, and generally the target object can be tracked and surrounded by adjusting the course of the unmanned aerial vehicle. For example, the view angle between the heading of the drone and the heading of the image acquisition device may be represented as:
diff_yaw=plane_yaw-pod_yaw
In the formula, diff_yaw represents the view angle, plane_yaw represents the heading of the unmanned aerial vehicle in the earth coordinate system, and pod_yaw represents the heading of the image acquisition device in the earth coordinate system.
When diff_yaw is equal to 90 degrees, the unmanned aerial vehicle circles around the target object; when diff_yaw is equal to 0 degrees, the unmanned aerial vehicle flies straight at the target object without circling; when diff_yaw is between 0 and 90 degrees, the unmanned aerial vehicle circles the target object while approaching it; when diff_yaw is greater than 90 degrees, the unmanned aerial vehicle circles the target object while moving away from it.
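To make the relationships above concrete, the following sketch classifies the flight behaviour from diff_yaw. The function name and mode labels are illustrative helpers, not terms from the patent:

```python
import math

def flight_mode(plane_yaw: float, pod_yaw: float) -> str:
    """Classify the orbiting behaviour from the view angle diff_yaw.

    Headings are in degrees in the earth coordinate system; the
    thresholds follow the description above.
    """
    # Wrap the heading difference into [0, 180] so only its magnitude matters.
    diff_yaw = abs((plane_yaw - pod_yaw + 180.0) % 360.0 - 180.0)
    if math.isclose(diff_yaw, 90.0):
        return "orbit"        # circles the target at a constant distance
    if math.isclose(diff_yaw, 0.0):
        return "approach"     # flies straight at the target, no circling
    if diff_yaw < 90.0:
        return "spiral-in"    # circles while approaching the target
    return "spiral-out"       # circles while moving away from the target
```

Picking a set view angle between these regimes is what lets the autopilot trade off closing distance against keeping the target in frame.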
Therefore, when the automatic pilot controls the unmanned aerial vehicle to fly, the first angular velocity of the image acquisition device can be obtained firstly, and the first angular velocity represents the current actual angular velocity of the image acquisition device; then, the autopilot can calculate, according to the first angular velocity, a target course angular velocity of the drone when the image capture device tracks the target object at the set tracking view angle, that is: the automatic pilot can be combined with the current actual angular speed of the image acquisition equipment to calculate the target navigation angular speed of the unmanned aerial vehicle in an ideal flight state.
Next, the autopilot can obtain the control output quantity of the unmanned aerial vehicle according to the current course angular velocity of the unmanned aerial vehicle and the target course angular velocity obtained through calculation, so that the current course angular velocity of the unmanned aerial vehicle is controlled to be continuously close to the target course angular velocity obtained through calculation, the unmanned aerial vehicle can fly according to an ideal flying state, and the unmanned aerial vehicle can be kept flying when the image acquisition equipment tracks the target object at the set tracking visual angle.
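The patent does not specify the control law that turns the rate error into the control output quantity; a minimal sketch, assuming a plain proportional law on the heading-rate error (the gain kp and its value are illustrative, not from the patent):

```python
def heading_control_output(current_rate: float, target_rate: float,
                           kp: float = 1.2) -> float:
    """One update of the heading-rate loop: drive the current course
    angular velocity towards the target course angular velocity.

    kp is an illustrative proportional gain, not a value from the patent.
    """
    rate_error = target_rate - current_rate
    return kp * rate_error
```

Applying this output to the yaw actuators each control cycle moves the current course angular velocity continuously closer to the target, as described above; a real autopilot would typically add integral and derivative terms.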
Therefore, based on this design, the flight control method provided by the present application records an ideal flight state, i.e., the flight state of the unmanned aerial vehicle when the image acquisition device tracks the target object at a set tracking view angle. After the first angular velocity of the image acquisition device is obtained, the target course angular velocity of the unmanned aerial vehicle in the ideal flight state is obtained according to the first angular velocity, and the control output quantity is then obtained according to the current course angular velocity and the target course angular velocity, so that the unmanned aerial vehicle flies in the ideal flight state. Compared with the prior art, the control output quantity is calculated in combination with the first angular velocity of the image acquisition device, so that while flying the unmanned aerial vehicle allows the image acquisition device to track the target object at the set tracking view angle, thereby improving the quality of the images acquired during tracking.
It should be noted that, in step 201, a device such as an angular velocity sensor may be configured on the image capturing device, so as to automatically capture the first angular velocity of the image capturing device, and then send the first angular velocity to the autopilot, so that the autopilot can directly receive the first angular velocity of the image capturing device.
In addition, the relative angle between the image acquisition equipment and the unmanned aerial vehicle can be acquired, and certain coordinate system transformation and processing are carried out, so that the first angular speed of the image acquisition equipment is obtained.
For example, referring to fig. 4 on the basis of fig. 3, fig. 4 shows a schematic flow chart of sub-steps of step 201 in fig. 3, and as a possible implementation, step 201 may include the following sub-steps:
step 201-1, obtaining a posture relative angle of an image acquisition device;
step 201-2, converting the relative attitude angle to obtain the current attitude angle of the image acquisition equipment in a terrestrial coordinate system;
in step 201-3, the current attitude angle is subjected to differential processing to obtain a first angular velocity.
In an embodiment, an Inertial Measurement Unit (IMU) and Hall sensors may be disposed on the image capturing device, so that when the image capturing device is mounted on the unmanned aerial vehicle to perform a flight mission, the inertial measurement unit can be used to capture information such as the attitude of the image capturing device.
Therefore, as a possible implementation manner, the above-mentioned inertial measurement unit, such as an IMU, disposed on the image capturing device may be adopted to obtain a relative attitude angle of the image capturing device, which is an attitude angle representing the image capturing device relative to the drone.
Wherein, also can set up an IMU on unmanned aerial vehicle for can convert image acquisition equipment's inertial coordinate system into unmanned aerial vehicle's inertial coordinate system.
For example, the autopilot may receive parameter information transmitted by the IMU disposed on the image acquisition device and the drone, calculate to obtain a rotation matrix from which the inertial coordinate system of the image acquisition device is converted to the inertial coordinate system of the drone, and calculate to obtain a rotation matrix from which the inertial coordinate system of the drone is converted to the terrestrial coordinate system; and then, after certain operation is carried out on the two obtained rotation matrixes, a rotation matrix for converting the inertial coordinate system of the image acquisition equipment into the terrestrial coordinate system can be obtained.
Illustratively, the operational formula may satisfy the following:
Cei=Cep*Cpi
where Cei denotes the rotation matrix converting the inertial coordinate system of the image capturing device to the terrestrial coordinate system, Cep denotes the rotation matrix converting the inertial coordinate system of the drone to the terrestrial coordinate system, and Cpi denotes the rotation matrix converting the inertial coordinate system of the image capturing device to the inertial coordinate system of the drone.
In this way, when the autopilot executes step 201, the obtained relative attitude angle of the image capturing device may be converted into the terrestrial coordinate system by using the rotation matrix Cei obtained by the above calculation, so as to obtain the current attitude angle of the image capturing device in the terrestrial coordinate system.
Then, the automatic pilot can perform differential processing on the obtained current attitude angle, so as to obtain a first angular velocity of the image acquisition equipment in a terrestrial coordinate system; therefore, the first angular speed of the image acquisition equipment is obtained through calculation by combining the inertia measurement equipment arranged on the image acquisition equipment, the hardware overhead can be reduced, and the cost is reduced.
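As a yaw-only sketch of steps 201-1 to 201-3 (the full method composes complete 3-D rotation matrices; the names mirror Cei = Cep * Cpi, and all numeric values are illustrative):

```python
import numpy as np

def yaw_rotation(yaw: float) -> np.ndarray:
    """Rotation matrix for a pure yaw angle (radians); roll/pitch omitted."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def pod_yaw_in_earth_frame(plane_yaw: float, relative_yaw: float) -> float:
    """Step 201-2: convert the pod's relative attitude to the earth frame."""
    Cep = yaw_rotation(plane_yaw)     # drone inertial frame -> earth frame
    Cpi = yaw_rotation(relative_yaw)  # pod inertial frame -> drone frame
    Cei = Cep @ Cpi                   # pod inertial frame -> earth frame
    return float(np.arctan2(Cei[1, 0], Cei[0, 0]))

def first_angular_velocity(yaw_prev: float, yaw_now: float, dt: float) -> float:
    """Step 201-3: finite-difference the current attitude angle."""
    dyaw = (yaw_now - yaw_prev + np.pi) % (2.0 * np.pi) - np.pi  # wrap to (-pi, pi]
    return dyaw / dt
```

Differencing the converted angle over two consecutive samples avoids needing a dedicated angular-rate sensor on the pod, which is the cost saving noted above.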
In addition, it should be noted that when the autopilot adjusts the flight state of the drone to the ideal flight state, the drone may fly around the target object; when flying around the target object, the autopilot generally needs to consider the angular velocity of the unmanned aerial vehicle when surrounding the target object and the angular velocity of the image acquisition device.
Therefore, referring to fig. 5 on the basis of fig. 3, fig. 5 shows a schematic flow chart of the sub-steps of step 203 in fig. 3, and as a possible implementation, step 203 may include the following sub-steps:
step 203-1, calculating a second angular velocity of the unmanned aerial vehicle in the ideal flight state according to the condition of the ideal flight state;
and step 203-2, obtaining the target course angular speed according to the first angular speed and the second angular speed.
In one embodiment, when performing step 203, the autopilot may calculate a second angular velocity of the drone in the ideal flight state based on the condition of the ideal flight state (for example, the flight state of the drone when the pod of the above example tracks the target object at 30 degrees). The second angular velocity is the angular velocity required for the drone to adjust its flight state to the ideal flight state; that is, when the image capturing device tracks the target object at the set tracking angle, the drone should ideally fly at the second angular velocity.
Next, after obtaining the second angular velocity, the autopilot may calculate and obtain the target heading angular velocity of the drone according to the first angular velocity of the image capture device and the second angular velocity.
For example, as a possible implementation manner, the autopilot may superimpose the first angular velocity and the second angular velocity, so that the superimposed result is used as the target heading angular velocity; for example, the calculation formula of the target heading angular velocity may satisfy the following:
ω = ω1 + ω2

where ω denotes the target heading angular velocity, ω1 denotes the first angular velocity, and ω2 denotes the second angular velocity.
Of course, it should be understood that the above is only an illustration of one way of calculating the target heading angular velocity; in some other possible implementations of the present application, other ways may be adopted. For example, partial redundancy may be considered in advance: by presetting an angular velocity adjustment amount, when step 203-2 is executed, the first angular velocity, the second angular velocity and the angular velocity adjustment amount are superimposed to obtain the target heading angular velocity of the unmanned aerial vehicle. Alternatively, the first angular velocity and the second angular velocity may each be weighted with set proportional parameters and summed to obtain the target heading angular velocity. In short, as long as the target heading angular velocity can be calculated in some manner from the first angular velocity and the second angular velocity, the present application does not limit the specific manner.
Therefore, the target heading angular velocity is calculated from the obtained first angular velocity and second angular velocity, so that when the autopilot controls the unmanned aerial vehicle to fly according to the control output quantity calculated in combination with the target heading angular velocity, the unmanned aerial vehicle and the image acquisition equipment can be brought into the ideal state.
In addition, referring to fig. 6 on the basis of fig. 5, fig. 6 shows a schematic flow chart of the sub-steps of step 203-1 in fig. 5, and as a possible implementation, step 203-1 may include the following sub-steps:
step 203-1a, calculating an expected relative distance between the unmanned aerial vehicle and a target object under a set tracking visual angle;
step 203-1b, obtaining a target acceleration required for adjusting the unmanned aerial vehicle to an expected relative distance;
and 203-1c, calculating to obtain a second angular velocity according to the target acceleration and the acquired flight state parameters.
In an embodiment, when the autopilot performs step 203-1 to obtain the second angular velocity, the expected relative distance between the drone and the target object at the set tracking angle of view may first be calculated according to the condition of the ideal flight state.
For example, the calculation formula of the expected relative distance may satisfy the following:
dist_c=(plane_height-target_height)*cot(set_pitch)
in the formula, dist_c represents the expected relative distance, plane_height represents the altitude of the unmanned aerial vehicle, target_height represents the altitude of the target object, and set_pitch represents the set tracking view angle.
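A minimal sketch of this formula, assuming angles are given in degrees and heights in consistent units (the function name is illustrative, not from the patent):

```python
import math

def expected_relative_distance(plane_height, target_height, set_pitch_deg):
    """dist_c = (plane_height - target_height) * cot(set_pitch).

    cot(x) is computed as 1 / tan(x); set_pitch_deg is the set tracking
    view angle in degrees.
    """
    return (plane_height - target_height) / math.tan(math.radians(set_pitch_deg))
```

At a 45-degree tracking angle the expected horizontal distance equals the height difference, which gives a quick sanity check.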
In some possible implementations, the altitude of the unmanned aerial vehicle may be obtained by a Global Positioning System (GPS) provided on the unmanned aerial vehicle; in addition, the altitude of the target object may be obtained by a GPS provided to the target object.
Alternatively, in some other possible implementation manners of the present application, a laser range finder may be arranged on the image acquisition device, so that the laser range finder measures the line-of-sight distance between the image acquisition device and the target object. Combined with the pitch angle of the image acquisition device when photographing the target object, the horizontal height difference between the unmanned aerial vehicle and the target object can be recovered by multiplying the line-of-sight distance by the sine of the pitch angle; this height difference is then multiplied by the cotangent of the set tracking viewing angle to obtain the expected relative distance.
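A sketch of that rangefinder route, under the same degree-unit assumption and with an illustrative function name:

```python
import math

def expected_distance_from_rangefinder(slant_range, pod_pitch_deg, set_pitch_deg):
    # height difference between drone and target, recovered from the
    # measured line-of-sight distance and the pod pitch angle
    height_diff = slant_range * math.sin(math.radians(pod_pitch_deg))
    # expected relative distance at the set tracking view angle
    return height_diff / math.tan(math.radians(set_pitch_deg))
```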
Next, the autopilot may calculate, in conjunction with the calculated expected relative distance, a target acceleration required to adjust the drone to the expected relative distance.
Illustratively, referring to fig. 7 on the basis of fig. 6, fig. 7 shows a schematic flow chart of the sub-steps of step 203-1b in fig. 6, and as a possible implementation, step 203-1b may include the following sub-steps:
step 203-1b-1, obtaining the expected flight distance of the unmanned aerial vehicle according to the actual relative distance and the expected relative distance between the unmanned aerial vehicle and the target object;
step 203-1b-2, processing the expected flight distance by using the set first time coefficient to obtain the target flight speed of the unmanned aerial vehicle;
and 203-1b-3, obtaining the target acceleration according to the set second time coefficient, the actual flying speed of the unmanned aerial vehicle and the target flying speed.
In an embodiment, when the autopilot performs step 203-1b, an expected flying distance of the drone, that is, a distance from the current position to the expected position of the drone, may be obtained according to the actual relative distance and the expected relative distance between the drone and the target object.
As a possible implementation manner, the actual relative distance may be calculated in a manner similar to the expected relative distance; for example, the calculation formula of the actual relative distance may satisfy the following:
dist=(plane_height-target_height)*cot(pod_pitch)
in the formula, dist represents the actual relative distance, plane_height represents the altitude of the unmanned aerial vehicle, target_height represents the altitude of the target object, and pod_pitch represents the pitch angle of the image acquisition device.
Of course, it is understood that the above implementation is only an example in which the actual relative distance is calculated with an algorithm similar to that for the expected relative distance; in other possible implementation manners of the present application, GPS receivers may instead be provided on the unmanned aerial vehicle and the target object to obtain their current coordinates respectively, and the actual relative distance between them is then obtained from the coordinate difference.
Therefore, the autopilot can calculate the difference between the actual relative distance and the expected relative distance to obtain the expected flight distance of the unmanned aerial vehicle.
The autopilot may then process the expected flight distance using the set first time coefficient to obtain the target flight speed of the drone, which may indicate the speed required for the drone to fly to the expected relative distance.
For example, as a possible implementation manner, the calculation manner of the target flying speed may satisfy the following:
vel_c=(dist_c-dist)*k_factor1
in the formula, vel_c represents the target flight speed, dist_c represents the expected relative distance, dist represents the actual relative distance, and k_factor1 represents the first time coefficient, which can be obtained as the reciprocal of a set first preset time.
It will be appreciated that, in some possible scenarios, the value of the first time coefficient may be related to the sensitivity with which the autopilot adjusts the flight state of the drone: the greater the first time coefficient, the higher the sensitivity; the smaller the first time coefficient, the lower the sensitivity.
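The expected-flight-distance and target-flight-speed steps above reduce to a couple of lines; a minimal sketch, with the first preset time as an assumed parameter and the function name hypothetical:

```python
def target_flight_speed(dist_c, dist, first_preset_time):
    """vel_c = (dist_c - dist) * k_factor1, where k_factor1 is the
    reciprocal of the set first preset time; the difference
    dist_c - dist is the expected flight distance of the drone."""
    k_factor1 = 1.0 / first_preset_time
    return (dist_c - dist) * k_factor1
```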
In addition, it should be noted that the target flight speed vel_c obtained through the above calculation is a speed component along the direction of the line connecting the unmanned aerial vehicle and the target object; therefore, before calculating the target acceleration, the component of the horizontal velocity of the unmanned aerial vehicle along that line direction can be obtained as the actual flying speed of the unmanned aerial vehicle.
For example, as a possible implementation manner, the calculation formula of the actual flying speed of the unmanned aerial vehicle may satisfy the following:
vel=vel_plane*cos(plane_yaw-pod_yaw)
in the formula, vel represents the actual flying speed of the unmanned aerial vehicle in the direction of the line connecting the unmanned aerial vehicle with the target object, vel_plane represents the horizontal speed of the unmanned aerial vehicle in the terrestrial coordinate system, plane_yaw represents the heading of the unmanned aerial vehicle in the terrestrial coordinate system, and pod_yaw represents the heading of the image acquisition equipment in the terrestrial coordinate system.
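A sketch of that projection, assuming headings in degrees and an illustrative function name:

```python
import math

def radial_speed(vel_plane, plane_yaw_deg, pod_yaw_deg):
    """Component of the drone's horizontal speed along the line from the
    drone to the target object: vel = vel_plane * cos(plane_yaw - pod_yaw)."""
    return vel_plane * math.cos(math.radians(plane_yaw_deg - pod_yaw_deg))
```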
And then, the automatic pilot calculates the target acceleration according to the set second time coefficient, the calculated actual flying speed of the unmanned aerial vehicle and the calculated target flying speed.
For example, as a possible implementation, the calculation formula of the target acceleration may satisfy the following:
acc_c=(vel_c-vel)*k_factor2
in the formula, acc_c represents the target acceleration, vel_c represents the target flight speed, vel represents the actual flying speed of the unmanned aerial vehicle, and k_factor2 represents the second time coefficient.
It is understood that, similar to the first time coefficient, the second time coefficient may be obtained as the reciprocal of a set second preset time, and its value may likewise be related to the sensitivity with which the autopilot adjusts the flight state of the unmanned aerial vehicle: the larger the second time coefficient, the higher the sensitivity; the smaller the second time coefficient, the lower the sensitivity.
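The target-acceleration step mirrors the speed step; a sketch with the second preset time as an assumed parameter:

```python
def target_acceleration(vel_c, vel, second_preset_time):
    """acc_c = (vel_c - vel) * k_factor2, where k_factor2 is the
    reciprocal of the set second preset time."""
    k_factor2 = 1.0 / second_preset_time
    return (vel_c - vel) * k_factor2
```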
Thus, referring to fig. 6 again, after the autopilot performs step 203-1b to obtain the target acceleration of the drone, the second angular velocity may be calculated according to the target acceleration and the obtained flight state parameter.
For example, as a possible implementation manner, the flight state parameters acquired by the autopilot may include a first heading of the unmanned aerial vehicle, a second heading of the image acquisition device, and a horizontal flying speed of the unmanned aerial vehicle; for example, the first heading of the drone may be a heading of the drone in a terrestrial coordinate system, and the second heading of the image capture device may be a heading of the image capture device in a terrestrial coordinate system.
As can be seen from the above, the actual flying speed of the drone is the component of the drone's horizontal velocity along the line connecting the drone and the target object, and its calculation formula can be expressed as:
vel=vel_plane*cos(plane_yaw-pod_yaw),
Differentiating this formula with respect to time yields:
acc=vel_plane*(-sin(plane_yaw-pod_yaw))*ω2
in the formula, acc represents the acceleration, vel_plane represents the horizontal speed of the unmanned aerial vehicle, plane_yaw represents the heading of the unmanned aerial vehicle in the terrestrial coordinate system, pod_yaw represents the heading of the image acquisition equipment in the terrestrial coordinate system, and ω2 represents the second angular velocity.
Thus, by transforming the formula, the calculation formula for the second angular velocity is obtained, expressed as:
ω2 = -acc/(vel_plane*sin(plane_yaw-pod_yaw))
Therefore, as a possible implementation manner, the above calculation formula for the second angular velocity is used as the set processing algorithm, so that when the autopilot executes step 203-1c, the target acceleration, the first heading, the second heading, and the actual flying speed can be processed with the set processing algorithm to obtain the second angular velocity; the specific calculation formula of the second angular velocity may satisfy the following:
ω2 = -acc_c/(vel_plane*sin(plane_yaw-pod_yaw))
in the formula, ω2 represents the second angular velocity, acc_c represents the target acceleration, vel_plane represents the horizontal velocity of the unmanned aerial vehicle, plane_yaw represents the heading of the unmanned aerial vehicle in the terrestrial coordinate system (that is, the first heading), and pod_yaw represents the heading of the image acquisition device in the terrestrial coordinate system (that is, the second heading).
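A sketch of this second-angular-velocity formula; the guard for the singular case where sin(plane_yaw - pod_yaw) vanishes, and the function name, are additions for illustration rather than part of the patent:

```python
import math

def second_angular_velocity(acc_c, vel_plane, plane_yaw_deg, pod_yaw_deg):
    """omega2 = -acc_c / (vel_plane * sin(plane_yaw - pod_yaw)), obtained by
    solving acc = vel_plane * (-sin(plane_yaw - pod_yaw)) * omega2 for omega2."""
    denom = vel_plane * math.sin(math.radians(plane_yaw_deg - pod_yaw_deg))
    if abs(denom) < 1e-9:
        # sin(plane_yaw - pod_yaw) = 0: the formula is undefined here
        raise ValueError("second angular velocity undefined when headings are aligned or opposite")
    return -acc_c / denom
```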
In addition, when the autopilot executes step 205 to obtain the control output of the unmanned aerial vehicle, as a possible implementation manner, the autopilot may input the current heading angular velocity and the target heading angular velocity into a set PID (Proportional-Integral-Derivative) controller, so that the PID controller outputs the control output quantity of the unmanned aerial vehicle. In this way, the current heading angular velocity of the unmanned aerial vehicle continuously approaches the target heading angular velocity, and the flight state of the unmanned aerial vehicle in turn continuously approaches the ideal flight state.
It should also be noted that the control output quantity may be configured for the unmanned aerial vehicle according to different scenes. For example, in some possible scenarios of the present application, the control output quantity of the drone may be the roll control quantity and the direction control quantity required for controlling the flight of the drone; in other possible scenarios, it may be the roll control quantity or the direction control quantity alone, or even some other control quantity. The present application does not limit the specific type of the control output quantity of the unmanned aerial vehicle, and its specific value can be configured in combination with the actual scene.
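As an illustrative sketch of the PID step (the gains, time step, and class name are placeholder assumptions, not values from the patent), the controller takes the current and target heading angular velocities and returns a control output quantity:

```python
class PID:
    """Minimal PID controller: fed the heading-rate error (target minus
    current) each cycle, it returns a control output quantity."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, current_rate, target_rate):
        error = target_rate - current_rate
        self.integral += error * self.dt
        # no derivative term on the very first sample
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```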
Based on the same inventive concept as the above-mentioned flight control method, please refer to fig. 8, fig. 8 shows a schematic structural block diagram of a flight control device 300 provided in the present application; the flight control device 300 may include a processing module 301 and a control module 302. Wherein:
a processing module 301, configured to obtain a first angular velocity of the image capturing apparatus;
the processing module 301 is further configured to obtain a target course angular velocity of the unmanned aerial vehicle in an ideal flight state according to the first angular velocity; the ideal flight state is the flight state of the unmanned aerial vehicle when the image acquisition equipment tracks the target object at the set tracking visual angle;
the control module 302 is configured to obtain a control output of the unmanned aerial vehicle according to the current heading angular velocity and the target heading angular velocity of the unmanned aerial vehicle, so that the unmanned aerial vehicle can fly according to an ideal flight state.
Optionally, as a possible implementation manner, when obtaining the target heading angular velocity of the unmanned aerial vehicle in the ideal flight state according to the first angular velocity, the processing module 301 is specifically configured to:
calculating a second angular velocity of the unmanned aerial vehicle in the ideal flight state according to the condition of the ideal flight state, wherein the second angular velocity represents the angular velocity required by the unmanned aerial vehicle for adjusting the flight state to the ideal flight state;
and obtaining the target course angular speed according to the first angular speed and the second angular speed.
Optionally, as a possible implementation manner, when calculating the second angular velocity of the unmanned aerial vehicle in the ideal flight state according to the condition of the ideal flight state, the processing module 301 is specifically configured to:
calculating an expected relative distance between the unmanned aerial vehicle and a target object under a set tracking visual angle;
obtaining a target acceleration required to adjust the drone to a desired relative distance;
and calculating to obtain a second angular velocity according to the target acceleration and the acquired flight state parameters.
Optionally, as a possible implementation, the processing module 301, when obtaining the target acceleration required to adjust the drone to the desired relative distance, is specifically configured to:
obtaining the expected flying distance of the unmanned aerial vehicle according to the actual relative distance and the expected relative distance between the unmanned aerial vehicle and the target object;
processing the expected flight distance by using the set first time coefficient to obtain the target flight speed of the unmanned aerial vehicle;
and obtaining the target acceleration according to the set second time coefficient, the actual flying speed of the unmanned aerial vehicle and the target flying speed.
Optionally, as a possible implementation manner, the flight state parameter includes a first heading of the unmanned aerial vehicle, a second heading of the image acquisition device, and a horizontal flying speed of the unmanned aerial vehicle;
when the second angular velocity is calculated by the processing module 301 according to the target acceleration and the acquired flight state parameter, the processing module is specifically configured to:
and processing the target acceleration, the first course, the second course and the horizontal flying speed by using a set processing algorithm to obtain a second angular speed.
Optionally, as a possible implementation manner, when obtaining the control output quantity of the unmanned aerial vehicle according to the current heading angular velocity and the target heading angular velocity of the unmanned aerial vehicle, the control module 302 is specifically configured to:
and inputting the current course angular speed and the target course angular speed into a set PID controller so that the PID controller outputs to obtain the control output quantity of the unmanned aerial vehicle.
Optionally, as a possible implementation manner, when obtaining the first angular velocity of the image capturing device, the processing module 301 is specifically configured to:
obtaining a relative attitude angle of the image acquisition equipment; the relative attitude angle represents the attitude angle of the image acquisition equipment relative to the unmanned aerial vehicle;
converting the relative attitude angle to obtain the current attitude angle of the image acquisition equipment under the terrestrial coordinate system;
the current attitude angle is subjected to differential processing to obtain a first angular velocity.
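For the differential-processing step above, a minimal sketch of estimating the first angular velocity from two successive yaw samples in the terrestrial frame (a finite difference with wrap-around handling; the names and degree units are assumptions):

```python
def first_angular_velocity(prev_yaw_deg, curr_yaw_deg, dt):
    """Finite-difference estimate of the yaw rate, in degrees per second.

    The raw difference is wrapped into [-180, 180) degrees so that a jump
    across the 0/360 boundary does not produce a spurious large rate."""
    diff = (curr_yaw_deg - prev_yaw_deg + 180.0) % 360.0 - 180.0
    return diff / dt
```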
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to some embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in some embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to perform all or part of the steps of the method according to some embodiments of the present application. And the aforementioned storage medium includes: u disk, removable hard disk, read only memory, random access memory, magnetic or optical disk, etc. for storing program codes.
The above description is only a few examples of the present application and is not intended to limit the present application, and those skilled in the art will appreciate that various modifications and variations can be made in the present application. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.
It will be evident to those skilled in the art that the present application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.

Claims (10)

1. A flight control method, characterized by being applied to an autopilot in an unmanned aerial vehicle, the unmanned aerial vehicle being mounted with an image acquisition device, the method comprising:
obtaining a first angular velocity of the image acquisition device;
obtaining a target course angular speed of the unmanned aerial vehicle in an ideal flight state according to the first angular speed; the ideal flight state is the flight state of the unmanned aerial vehicle when the image acquisition equipment tracks the target object at the set tracking visual angle;
and obtaining the control output quantity of the unmanned aerial vehicle according to the current course angular velocity and the target course angular velocity of the unmanned aerial vehicle so that the unmanned aerial vehicle can fly according to the ideal flight state.
2. The method of claim 1, wherein the step of obtaining a target heading angular velocity of the drone in an ideal flight based on the first angular velocity comprises:
calculating a second angular velocity of the unmanned aerial vehicle in the ideal flight state according to the condition of the ideal flight state, wherein the second angular velocity represents an angular velocity required by the unmanned aerial vehicle for adjusting the flight state to the ideal flight state;
and obtaining the target course angular speed according to the first angular speed and the second angular speed.
3. The method of claim 2, wherein said step of calculating a second angular velocity of said drone at said ideal flight condition based on said condition of said ideal flight condition comprises:
calculating an expected relative distance between the unmanned aerial vehicle and the target object under the set tracking visual angle;
obtaining a target acceleration required to adjust the drone to the desired relative distance;
and calculating to obtain the second angular velocity according to the target acceleration and the acquired flight state parameters.
4. The method of claim 3, wherein the step of obtaining a target acceleration required to adjust the drone to the desired relative distance comprises:
obtaining an expected flying distance of the unmanned aerial vehicle according to the actual relative distance between the unmanned aerial vehicle and the target object and the expected relative distance;
processing the expected flight distance by utilizing a set first time coefficient to obtain the target flight speed of the unmanned aerial vehicle;
and obtaining the target acceleration according to a set second time coefficient, the actual flying speed of the unmanned aerial vehicle and the target flying speed.
5. The method of claim 3, wherein the flight state parameters include a first heading of the drone, a second heading of the image capture device, and a horizontal flight speed of the drone;
the step of calculating the second angular velocity according to the target acceleration and the acquired flight state parameters includes:
and processing the target acceleration, the first course, the second course and the horizontal flying speed by using a set processing algorithm to obtain the second angular speed.
6. The method of claim 1, wherein the step of obtaining the first angular velocity of the image capture device comprises:
obtaining a relative attitude angle of the image acquisition equipment; wherein the relative attitude angle characterizes an attitude angle of the image capture device relative to the drone;
converting the relative attitude angle to obtain the current attitude angle of the image acquisition equipment under the terrestrial coordinate system;
and carrying out differential processing on the current attitude angle to obtain the first angular speed.
7. A flight control device, characterized by being applied to an autopilot in an unmanned aerial vehicle, the unmanned aerial vehicle being mounted with an image acquisition device, the device comprising:
the processing module is used for obtaining a first angular speed of the image acquisition equipment;
the processing module is further used for obtaining a target course angular speed of the unmanned aerial vehicle in an ideal flight state according to the first angular speed; the ideal flight state is the flight state of the unmanned aerial vehicle when the image acquisition equipment tracks the target object at the set tracking visual angle;
and the control module is used for obtaining the control output quantity of the unmanned aerial vehicle according to the current course angular velocity and the target course angular velocity of the unmanned aerial vehicle so that the unmanned aerial vehicle can fly according to the ideal flight state.
8. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-6.
9. An autopilot, comprising:
a memory for storing one or more programs;
a processor;
the one or more programs, when executed by the processor, implement the method of any of claims 1-6.
10. An unmanned aerial vehicle, characterized in that the unmanned aerial vehicle is equipped with an autopilot as claimed in claim 9.
CN202010340065.6A 2020-04-26 2020-04-26 Flight control method and device, storage medium, automatic pilot and unmanned aerial vehicle Pending CN111352410A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010340065.6A CN111352410A (en) 2020-04-26 2020-04-26 Flight control method and device, storage medium, automatic pilot and unmanned aerial vehicle
CN202110031631.XA CN112631265B (en) 2020-04-26 2021-01-11 Flight control method and device, storage medium, automatic pilot and unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010340065.6A CN111352410A (en) 2020-04-26 2020-04-26 Flight control method and device, storage medium, automatic pilot and unmanned aerial vehicle

Publications (1)

Publication Number Publication Date
CN111352410A true CN111352410A (en) 2020-06-30

Family

ID=71197789

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202010340065.6A Pending CN111352410A (en) 2020-04-26 2020-04-26 Flight control method and device, storage medium, automatic pilot and unmanned aerial vehicle
CN202110031631.XA Active CN112631265B (en) 2020-04-26 2021-01-11 Flight control method and device, storage medium, automatic pilot and unmanned aerial vehicle

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202110031631.XA Active CN112631265B (en) 2020-04-26 2021-01-11 Flight control method and device, storage medium, automatic pilot and unmanned aerial vehicle

Country Status (1)

Country Link
CN (2) CN111352410A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113221253A (en) * 2021-06-01 2021-08-06 山东贝特建筑项目管理咨询有限公司 Unmanned aerial vehicle control method and system for anchor bolt image detection
CN114489093A (en) * 2020-10-27 2022-05-13 北京远度互联科技有限公司 Attitude adjusting method and device, storage medium, image acquisition equipment and unmanned aerial vehicle
WO2023036260A1 (en) * 2021-09-10 2023-03-16 深圳市道通智能航空技术股份有限公司 Image acquisition method and apparatus, and aerial vehicle and storage medium
CN117055599A (en) * 2023-08-31 2023-11-14 北京航翊科技有限公司 Unmanned aerial vehicle flight control method and device, electronic equipment and storage medium

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3835883C2 (en) * 1988-10-21 1996-07-18 Tzn Forschung & Entwicklung Procedure for target detection for missiles with a seeker head
FR2985581B1 (en) * 2012-01-05 2014-11-28 Parrot METHOD FOR CONTROLLING A ROTARY SAILING DRONE FOR OPERATING A SHOOTING VIEW BY AN ON-BOARD CAMERA WITH MINIMIZATION OF DISTURBING MOVEMENTS
EP3862837B1 (en) * 2014-07-30 2023-05-03 SZ DJI Technology Co., Ltd. Systems and methods for target tracking
CN105116905A (en) * 2015-05-26 2015-12-02 芜湖航飞科技股份有限公司 Aircraft attitude control method
CN106547275A (en) * 2015-09-17 2017-03-29 西安翔舟航空技术有限公司 A kind of new rotor class unmanned plane is automatically positioned control method
CN105425819B (en) * 2015-11-25 2019-01-11 南京航空航天大学 A kind of unmanned plane automatically tracks the method for guidance of ground target
CN106094876A (en) * 2016-07-04 2016-11-09 苏州光之翼智能科技有限公司 A kind of unmanned plane target locking system and method thereof
CN107223219B (en) * 2016-09-26 2020-06-23 深圳市大疆创新科技有限公司 Control method, control device and carrying system
CN106375669B (en) * 2016-09-30 2019-08-06 天津远度科技有限公司 A kind of digital image stabilization method, device and unmanned plane
CN108253928B (en) * 2016-12-28 2023-01-10 北京远度互联科技有限公司 Attitude angle acquisition method and device and movable equipment
WO2018120132A1 (en) * 2016-12-30 2018-07-05 深圳市大疆创新科技有限公司 Control method, device, and apparatus, and unmanned aerial vehicle
CN106909172A (en) * 2017-03-06 2017-06-30 重庆零度智控智能科技有限公司 Around tracking, device and unmanned plane
CN113163119A (en) * 2017-05-24 2021-07-23 深圳市大疆创新科技有限公司 Shooting control method and device
CN107741229B (en) * 2017-10-10 2020-09-25 北京航空航天大学 Photoelectric/radar/inertia combined carrier-based aircraft landing guiding method
CN108375988A (en) * 2018-05-25 2018-08-07 哈尔滨工业大学 A kind of quadrotor drone posture control method with unbalanced load
CN109062235A (en) * 2018-08-24 2018-12-21 天津远度科技有限公司 Flight control method, device and unmanned plane
CN109992009B (en) * 2019-03-14 2020-06-09 清华大学 Moving target surrounding tracking method based on distance measurement
CN110794877B (en) * 2019-11-22 2020-10-13 北京理工大学 Vehicle-mounted camera holder servo system and control method
CN110758758B (en) * 2019-11-29 2021-04-02 重庆市亿飞智联科技有限公司 Lifting mechanism, control method thereof and unmanned aerial vehicle

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114489093A (en) * 2020-10-27 2022-05-13 北京远度互联科技有限公司 Attitude adjusting method and device, storage medium, image acquisition equipment and unmanned aerial vehicle
CN114489093B (en) * 2020-10-27 2022-11-29 北京远度互联科技有限公司 Attitude adjusting method and device, storage medium, image acquisition equipment and unmanned aerial vehicle
CN113221253A (en) * 2021-06-01 2021-08-06 山东贝特建筑项目管理咨询有限公司 Unmanned aerial vehicle control method and system for anchor bolt image detection
WO2023036260A1 (en) * 2021-09-10 2023-03-16 深圳市道通智能航空技术股份有限公司 Image acquisition method and apparatus, and aerial vehicle and storage medium
CN117055599A (en) * 2023-08-31 2023-11-14 北京航翊科技有限公司 Unmanned aerial vehicle flight control method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN112631265A (en) 2021-04-09
CN112631265B (en) 2023-02-03

Similar Documents

Publication Publication Date Title
CN112631265B (en) Flight control method and device, storage medium, automatic pilot and unmanned aerial vehicle
US20230236611A1 (en) Unmanned Aerial Vehicle Sensor Activation and Correlation System
CN109596118B (en) Method and equipment for acquiring spatial position information of target object
Weiss et al. Monocular‐SLAM–based navigation for autonomous micro helicopters in GPS‐denied environments
EP3128386A1 (en) Method and device for tracking a moving target from an air vehicle
KR20180064253A (en) Flight controlling method and electronic device supporting the same
CA2977597A1 (en) Method and apparatus for target relative guidance
JP2008186145A (en) Aerial image processing apparatus and aerial image processing method
US20200064133A1 (en) Information processing device, aerial photography route generation method, aerial photography route generation system, program, and storage medium
CN203845021U (en) Panoramic aerial photographic unit system for aircrafts
Li et al. Monocular Snapshot‐based Sensing and Control of Hover, Takeoff, and Landing for a Low‐cost Quadrotor
WO2018059295A1 (en) Control method, device, and system for multirotor aerial vehicle
WO2022077296A1 (en) Three-dimensional reconstruction method, gimbal load, removable platform and computer-readable storage medium
KR102269792B1 (en) Method and apparatus for determining altitude for flying unmanned air vehicle and controlling unmanned air vehicle
Andert et al. Optical-aided aircraft navigation using decoupled visual SLAM with range sensor augmentation
US20240007752A1 (en) Variable focal length multi-camera aerial imaging system and method
Miller et al. UAV navigation based on videosequences captured by the onboard video camera
CN111930147B (en) Storage medium, flight control method and device, automatic pilot and unmanned aerial vehicle
Ješke et al. Autonomous compact monitoring of large areas using micro aerial vehicles with limited sensory information and computational resources
CN113654528B (en) Method and system for estimating target coordinates through unmanned aerial vehicle position and cradle head angle
CN113301248B (en) Shooting method and device, electronic equipment and computer storage medium
JP6515423B2 (en) CONTROL DEVICE, MOBILE OBJECT, CONTROL METHOD, AND PROGRAM
US11415990B2 (en) Optical object tracking on focal plane with dynamic focal length
CN111487993A (en) Information acquisition method and device, storage medium, automatic pilot and unmanned aerial vehicle
KR20170123999A (en) A video GPS map overlaying method for positioning of a remote control device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200630