CN113867389A - Unmanned aerial vehicle shooting control method, device, equipment and storage medium - Google Patents

Unmanned aerial vehicle shooting control method, device, equipment and storage medium Download PDF

Info

Publication number
CN113867389A
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
shooting
target
flight
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111165234.8A
Other languages
Chinese (zh)
Inventor
费志杰
刘鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Xaircraft Technology Co Ltd
Original Assignee
Guangzhou Xaircraft Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Xaircraft Technology Co Ltd filed Critical Guangzhou Xaircraft Technology Co Ltd
Priority to CN202111165234.8A priority Critical patent/CN113867389A/en
Publication of CN113867389A publication Critical patent/CN113867389A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft

Abstract

The application provides a shooting control method, device, equipment and storage medium for an unmanned aerial vehicle, and relates to the technical field of unmanned aerial vehicle photography. The method comprises the following steps: determining the target flight height of the unmanned aerial vehicle at the current moment, wherein the target flight height represents the relative flight height between the unmanned aerial vehicle and ground obstacles; acquiring the stage flight distance of the unmanned aerial vehicle from the previous photographing moment to the current moment; determining whether to shoot an image at the current moment according to the target flight height, the stage flight distance and a preset shooting overlap distance; and if so, controlling a shooting device on the unmanned aerial vehicle to shoot the image. By applying the embodiments of the application, the overlap degree (namely the shooting overlap distance) between adjacent images shot by the unmanned aerial vehicle along the heading direction is kept consistent, which improves the accuracy of the three-dimensional model of the target area.

Description

Unmanned aerial vehicle shooting control method, device, equipment and storage medium
Technical Field
The application relates to the technical field of unmanned aerial vehicle photography, in particular to an unmanned aerial vehicle shooting control method, device, equipment and storage medium.
Background
With the rapid development of unmanned aerial vehicles and oblique photography, shooting images of a target area with an unmanned aerial vehicle and constructing a three-dimensional model of the target area from the shot images has become a widely used workflow, and the overlap degree between adjacent images of the target area directly affects the accuracy of the resulting three-dimensional model.
At present, an unmanned aerial vehicle aerial survey task controls the unmanned aerial vehicle to execute the shooting task at a fixed shooting interval according to a preset flight height, flight speed and field angle so as to shoot images of the target area. The overlap degree between images shot in this fixed surveying and mapping mode cannot meet the overlap requirements of surveying and mapping scenes with complex environments, which affects the accuracy of the three-dimensional model of the target area.
Disclosure of Invention
An object of the present application is to provide a method, an apparatus, a device and a storage medium for controlling shooting by an unmanned aerial vehicle, which can keep the overlapping degree between adjacent images on a shot target area consistent, and further can improve the accuracy of a three-dimensional model of the target area.
In order to achieve the above purpose, the technical solutions adopted in the embodiments of the present application are as follows:
in a first aspect, an embodiment of the present application provides an unmanned aerial vehicle shooting control method, where the method includes:
determining a target flight height of the unmanned aerial vehicle at the current moment, wherein the target flight height is used for representing the relative flight height of the unmanned aerial vehicle and a ground obstacle;
acquiring the stage flight distance from the previous photographing moment to the current moment of the unmanned aerial vehicle;
determining whether to shoot an image at the current moment according to the target flight height, the stage flight distance and a preset shooting overlapping distance;
and if so, controlling a shooting device on the unmanned aerial vehicle to shoot the image.
Optionally, the determining the target flying height of the drone at the current moment includes:
obtaining the actual flying height of the unmanned aerial vehicle relative to the obstacle on the ground at the current moment based on the measurement of the radar device on the unmanned aerial vehicle;
and determining the target flight height according to the actual flight height.
Optionally, said determining the target flying height according to the actual flying height comprises:
determining the average value of the actual flying heights of the unmanned aerial vehicle at all the moments from the previous photographing moment to the current moment;
and taking the average value as the target flying height.
Optionally, said determining the target flying height according to the actual flying height comprises:
determining the minimum value of the actual flying heights of the unmanned aerial vehicle at each moment between the previous photographing moment and the current moment;
and taking the minimum value as the target flying height.
Optionally, said determining the target flying height according to the actual flying height comprises:
and taking the actual flying height of the unmanned aerial vehicle at the current moment as the target flying height.
Optionally, the determining whether to shoot an image at the current moment according to the target flight height, the stage flight distance, and a preset shooting overlap distance includes:
determining whether the target flying height, the stage flying distance, the preset shooting overlapping distance and the field angle of the shooting device meet a preset relation or not;
and if the preset relation is met, determining to shoot the image at the current moment.
Optionally, the preset relationship includes:
C = 2h·tan(θ/2) - S
wherein S is the stage flight distance, C is the preset shooting overlap distance, θ is the field angle of the shooting device, and h is the target flight height.
In a second aspect, an embodiment of the present application further provides an unmanned aerial vehicle shooting control device, the device includes:
the system comprises a first determination module, a second determination module and a control module, wherein the first determination module is used for determining the target flight height of the unmanned aerial vehicle at the current moment, and the target flight height is used for representing the relative flight height of the unmanned aerial vehicle and a ground obstacle;
the acquisition module is used for acquiring the stage flight distance from the previous photographing moment to the current moment of the unmanned aerial vehicle;
the second determining module is used for determining whether to shoot an image at the current moment according to the target flight height, the stage flight distance and a preset shooting overlapping distance;
and the control module is configured to, if so, control the shooting device on the unmanned aerial vehicle to shoot the image.
Optionally, the first determining module is specifically configured to obtain an actual flying height of the drone relative to an obstacle on the ground at a current moment based on a measurement by a radar device on the drone; and determining the target flight height according to the actual flight height.
Optionally, the first determining module is further specifically configured to determine an average value of actual flying heights of the unmanned aerial vehicle at each time between the previous photographing time and the current time; and taking the average value as the target flying height.
Optionally, the first determining module is further specifically configured to determine a minimum value of actual flying heights of the unmanned aerial vehicle at each time between the previous photographing time and the current time; and taking the minimum value as the target flying height.
Optionally, the first determining module is further specifically configured to use an actual flying height of the drone at the current moment as the target flying height.
Optionally, the second determining module is specifically configured to determine whether the target flying height, the stage flying distance, the preset shooting overlapping distance, and the field angle of the shooting device satisfy a preset relationship; and if the preset relation is met, determining to shoot the image at the current moment.
Optionally, the preset relationship includes:
C = 2h·tan(θ/2) - S
wherein S is the stage flight distance, C is the preset shooting overlap distance, θ is the field angle of the shooting device, and h is the target flight height.
In a third aspect, an embodiment of the present application provides an electronic device, including: the unmanned aerial vehicle shooting control method comprises a processor, a storage medium and a bus, wherein the storage medium stores machine-readable instructions executable by the processor, when the electronic device runs, the processor and the storage medium communicate through the bus, and the processor executes the machine-readable instructions to execute the steps of the unmanned aerial vehicle shooting control method according to the first aspect.
In a fourth aspect, an embodiment of the present application provides a storage medium, where a computer program is stored on the storage medium, and when the computer program is executed by a processor, the steps of the unmanned aerial vehicle shooting control method according to the first aspect are executed.
The beneficial effect of this application is:
the embodiment of the application provides a shooting control method, a shooting control device, shooting control equipment and a shooting control storage medium for an unmanned aerial vehicle, wherein the shooting control method comprises the following steps: determining the target flight height of the unmanned aerial vehicle at the current moment, wherein the target flight height is used for representing the relative flight height of the unmanned aerial vehicle and the ground obstacle; acquiring the stage flight distance from the previous photographing moment to the current moment of the unmanned aerial vehicle; determining whether to shoot an image at the current moment according to the target flight height, the flight distance of the stage and a preset shooting overlapping distance; if, then control the shooting device on this unmanned aerial vehicle and shoot the image.
By adopting the unmanned aerial vehicle flight shooting control method provided by the embodiment of the application, the processor calculates the target flight height of the unmanned aerial vehicle at the current moment in real time based on the received relative flight height of the unmanned aerial vehicle and the ground obstacle, and further dynamically judges whether the current moment is the shooting moment according to the target flight height of the unmanned aerial vehicle at the current moment, the stage flight distance from the previous shooting moment to the current moment when the unmanned aerial vehicle flies, and the preset shooting overlapping distance, so that the overlapping degree (namely the shooting overlapping distance) between adjacent images shot by the unmanned aerial vehicle along the course direction is kept consistent, and the accuracy of a three-dimensional model of a target area can be improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained according to the drawings without inventive efforts.
Fig. 1 is a scene schematic diagram of an unmanned aerial vehicle shooting control system provided in an embodiment of the present application;
fig. 2 is a schematic flow chart of the shooting control of the unmanned aerial vehicle according to the embodiment of the present application;
fig. 3 is a schematic flow chart of another unmanned aerial vehicle shooting control method provided in the embodiment of the present application;
fig. 4 is a schematic flow chart of another unmanned aerial vehicle shooting control method provided in the embodiment of the present application;
fig. 5 is a scene schematic diagram of an unmanned aerial vehicle performing a shooting task in a sub-area in a non-ground-defense flight mode according to an embodiment of the present application;
fig. 6 is a scene schematic diagram of an unmanned aerial vehicle performing a shooting task in a sub-area in a ground defense flight mode according to an embodiment of the present application;
fig. 7 is a schematic flowchart of another unmanned aerial vehicle shooting control method provided in the embodiment of the present application;
fig. 8 is a scene schematic diagram of an unmanned aerial vehicle performing a shooting task in a flat area according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an unmanned aerial vehicle shooting control device provided in an embodiment of the present application;
fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments.
Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
Before explaining the embodiments of the present application in detail, some terms in the present application are explained first:
non-ground-defense flight: the unmanned aerial vehicle adopts the flight mode of level to fly at the operation in-process, keeps 1000 meters to fly like unmanned aerial vehicle and ground always.
Ground-defense flight: during operation, the unmanned aerial vehicle flies while keeping a constant relative height to the ground obstacles; for example, the relative height between the unmanned aerial vehicle and each ground obstacle is kept at 100 meters.
Overlap degree: the overlap degree can be divided into a lateral overlap degree and a heading overlap degree. The lateral overlap degree, also called the transverse overlap degree, refers to the portion of images shot along two adjacent route lines that covers the same area, and is usually expressed as a percentage; the heading overlap degree, also called the longitudinal overlap degree, refers to the portion of adjacent images on the same route line that covers the same area, and is also usually expressed as a percentage.
The application scenario is introduced here: specifically, it may be a scenario in which an unmanned aerial vehicle is used to shoot images of a target area, and a three-dimensional model corresponding to the target area is later constructed from the shot images. Fig. 1 is a scene schematic diagram of an unmanned aerial vehicle shooting control system provided in an embodiment of the present application. As shown in fig. 1, the system may include: a processor 100, a shooting device 101 and a radar device 102, where the shooting device 101 and the radar device 102 are respectively connected with the processor 100. The unmanned aerial vehicle performing the shooting task is provided with the processor 100, the shooting device 101 and the radar device 102, where the shooting device 101 may be, for example, a camera, a video camera or other image-capturing equipment, and the radar device 102 may be, for example, a distance measuring device such as a radar detector; the present application does not limit the specific forms of the shooting device 101 and the radar device 102.
It should be noted that the processor 100 may be on the unmanned aerial vehicle, and may also be a cloud processor, and this application does not limit it.
When the drone performs a shooting task on the target area along a preset route based on preset flight parameters and shooting parameters, the radar device 102 on the drone can detect the relative flying height between the drone and the obstacles in the target area according to a preset detection period, where the preset flight parameters can include the flying speed, flying height, etc. of the drone, and the preset shooting parameters can include the camera field angle, the overlap degree (e.g., heading overlap degree), etc. The radar device 102 sends the detected relative height to the processor 100; the processor 100 may determine whether the current moment is a shooting moment according to the received relative height and the flight distance information of the unmanned aerial vehicle in the manner of the embodiments of the present application, and if the current moment is a shooting moment, the processor 100 sends a shooting control instruction to the shooting device 101, and the shooting device 101 shoots an image based on the shooting control instruction. After the unmanned aerial vehicle finishes the shooting task in the target area, all images shot by the shooting device 101 can be sent to an image processing device, and the image processing device can construct the three-dimensional model corresponding to the target area.
The unmanned aerial vehicle shooting control method mentioned in the present application is exemplified as follows with reference to the accompanying drawings. Fig. 2 is a schematic flow chart of unmanned aerial vehicle shooting control provided by the embodiment of the application. As shown in fig. 2, the method may include:
s201, determining the target flight height of the unmanned aerial vehicle at the current moment, wherein the target flight height is used for representing the relative flight height of the unmanned aerial vehicle and the ground obstacle.
When the unmanned aerial vehicle performs a shooting task on the target area, the target area may be large. For a large target area, the target area is first divided into a plurality of sub-areas, and corresponding numbers can be assigned to the sub-areas. The route corresponding to each sub-area is then determined according to the target task corresponding to that sub-area, such as the flight mode (non-ground-defense flight or ground-defense flight), and the number of each sub-area is stored in association with the position information of the corresponding route.
Here, one sub-area is taken as an example. The unmanned aerial vehicle flies along the route corresponding to the sub-area according to preset flight parameters (such as flight height and flight speed) and camera parameters (such as field angle and heading overlap degree). Optionally, when the unmanned aerial vehicle flies over the sub-area in the non-ground-defense flight mode, the preset flight height is a height relative to the ground plane of the take-off point of the sub-area, and a distance measuring device (such as a radar device) installed on the unmanned aerial vehicle measures the relative flight height between the unmanned aerial vehicle and each ground obstacle in the sub-area at a preset measuring frequency, that is, the distance between the horizontal plane where the unmanned aerial vehicle is located and the horizontal plane where the top of each ground obstacle is located. The distance measuring device sends the relative flight heights measured in real time to the processor, and the processor obtains the target flight height of the unmanned aerial vehicle at the current moment from the received relative flight heights; specifically, the target flight height at the current moment can be determined from the flight height corresponding to the current moment or the flight heights corresponding to historical moments associated with the current moment.
When the unmanned aerial vehicle flies over the sub-area in the ground-defense flight mode, the preset flight height is the relative height between the unmanned aerial vehicle and each ground obstacle in the sub-area. It should be explained that, although the preset flight height in the ground-defense flight mode is a fixed value, the actual flight height lags during climbing and descending: when the unmanned aerial vehicle flies from above a low obstacle toward a high ground obstacle, its relative height above the high obstacle may not yet have reached the preset flight height during the climb, and may likewise exceed the preset flight height when the unmanned aerial vehicle leaves the high obstacle. Therefore, the processor still determines the target flight height of the unmanned aerial vehicle at the current moment from the relative flight heights between the unmanned aerial vehicle and the ground obstacles in the sub-area received from the distance measuring device; the specific determination process is the same as described for the non-ground-defense flight mode and is not repeated here.
It can be seen that, regardless of which flight mode the unmanned aerial vehicle uses to perform the shooting task over the sub-area, the target flight height of the unmanned aerial vehicle at the current moment is determined from the relative flight heights between the unmanned aerial vehicle and the ground obstacles collected by the distance measuring device.
S202, obtaining the stage flight distance from the previous photographing moment to the current moment of the unmanned aerial vehicle.
In an implementation embodiment, a Real Time Kinematic (RTK) device may be installed on the drone, and the RTK device may detect the position information of the drone in Real Time and send the position information to the processor. Specifically, the processor stores the position information of the unmanned aerial vehicle at the previous photographing moment and the position information of the unmanned aerial vehicle at the current moment, and based on the position information, the processor can calculate the stage flight distance from the previous photographing moment to the current moment in real time.
In another practical embodiment, the unmanned aerial vehicle performs the shooting task at a fixed flying speed along the route in the sub-area, a time recorder may be installed on the unmanned aerial vehicle, the time recorder may record the flying time of the unmanned aerial vehicle and send the flying time of the unmanned aerial vehicle to the processor, and the processor may identify the flying time corresponding to the shooting action performed by the unmanned aerial vehicle as the shooting time. Specifically, the processor stores the previous photographing time of the unmanned aerial vehicle, and the stage flight distance of the unmanned aerial vehicle from the previous photographing time to the current time is determined according to the time interval between the current time and the previous photographing time and the preset flight speed.
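Both ways of obtaining the stage flight distance described above reduce to simple arithmetic. The following Python sketch illustrates the RTK position-difference variant and the fixed-speed, time-interval variant; the function names and the assumption of a local planar coordinate frame in metres are illustrative choices, not details taken from the disclosure.

import math

def stage_distance_from_positions(prev_xy, curr_xy):
    # RTK variant: distance between the position stored at the previous
    # photographing moment and the position at the current moment, both
    # given as (x, y) coordinates in metres in a local plane.
    dx = curr_xy[0] - prev_xy[0]
    dy = curr_xy[1] - prev_xy[1]
    return math.hypot(dx, dy)

def stage_distance_from_speed(prev_shot_time_s, current_time_s, flight_speed_mps):
    # Fixed-speed variant: time interval since the previous photographing
    # moment multiplied by the preset flight speed.
    return (current_time_s - prev_shot_time_s) * flight_speed_mps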
S203, determining whether to shoot an image at the current moment according to the target flight height, the stage flight distance and a preset shooting overlapping distance.
And S204, if so, controlling a shooting device on the unmanned aerial vehicle to shoot the image.
The shooting overlap distance indicates the overlap, along the heading direction, between the camera fields of view at adjacent shooting moments, and may also be called the shooting heading overlap distance. Given the heading overlap degree, the shooting overlap distance can be obtained from the camera parameters; that is, the heading overlap degree and the shooting overlap distance correspond to each other, so the shooting overlap distance can be used in place of the heading overlap degree.
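As an illustration of the correspondence between the heading overlap degree and the shooting overlap distance, one plausible conversion uses the camera footprint at the planned flight height; the text does not spell the conversion out, so the formula and names below are assumptions.

import math

def overlap_distance_from_ratio(heading_overlap_ratio, planned_height_m, fov_rad):
    # Ground footprint along the heading direction at the planned flight
    # height for a shooting device with field angle fov_rad.
    footprint = 2.0 * planned_height_m * math.tan(fov_rad / 2.0)
    # A heading overlap degree of e.g. 0.8 (80 %) then corresponds to an
    # overlap distance of 80 % of that footprint.
    return heading_overlap_ratio * footprint

With illustrative values of an 80 % heading overlap, a planned height of 100 m and a 60° field angle, this gives an overlap distance of roughly 0.8 × 115.5 ≈ 92 m.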
The preset shooting overlap distance, the target flight height of the unmanned aerial vehicle and the stage flight distance of the unmanned aerial vehicle form a constraint condition, and the processor can judge whether the current moment is a shooting moment based on this constraint. Specifically, when the processor judges that the preset shooting overlap distance, the target flight height and the stage flight distance satisfy the constraint condition, it sends a shooting instruction to the shooting device installed on the unmanned aerial vehicle; the shooting device performs the shooting action based on the instruction and sends the captured image back to the processor.
To sum up, in the unmanned aerial vehicle shooting control method provided by the application, the processor calculates the target flight height of the unmanned aerial vehicle at the current moment in real time based on the received relative flight height of the unmanned aerial vehicle and the ground obstacle, and then according to the target flight height of the unmanned aerial vehicle at the current moment, the stage flight distance from the previous shooting moment to the current moment of the unmanned aerial vehicle, and the preset shooting overlapping distance, dynamically judges whether the current moment is the shooting moment, so that the overlapping degree between adjacent images shot by the unmanned aerial vehicle along the course direction is kept consistent, and the accuracy of the three-dimensional model of the target area can be improved.
Fig. 3 is a schematic flow chart of another unmanned aerial vehicle shooting control method provided in the embodiment of the present application. As shown in fig. 3, optionally, the determining the target flying height of the drone at the current time includes:
s301, the actual flying height of the unmanned aerial vehicle relative to the obstacle on the ground at the current moment is measured based on the radar device on the unmanned aerial vehicle.
S302, determining the target flight height according to the actual flight height.
The radar device may specifically be a millimeter-wave radar device, which can measure the actual flight height of the unmanned aerial vehicle relative to the obstacles on the ground at the current moment in the following way. The millimeter-wave radar device includes a transmitting end and a receiving end: the transmitting end emits millimeter waves, and when the millimeter waves hit a ground obstacle (such as a building or a tree), the reflected signal is received by the receiving end, where both the transmitted wave and the received signal can be triangular waves. It can be understood that the distance d between the millimeter-wave radar device and the ground obstacle satisfies the following relationship:
t = 2d/c
where t is the time from the moment the transmitting end sends the millimeter wave until the receiving end receives the reflected signal, and c is the speed of light.
From the time t, an intermediate-frequency (IF) difference signal is obtained; the frequency f of this difference signal is obtained through Fourier transformation, and the actual flight height d of the unmanned aerial vehicle relative to the ground obstacle at the current moment can then be solved from the relationship between the frequency f and the distance d between the millimeter-wave radar device and the ground obstacle.
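The two relations above can be made concrete with a short numeric sketch. The chirp slope used in the beat-frequency form is a radar configuration parameter that the text does not specify, so it appears here only as an assumed argument.

C_LIGHT = 299_792_458.0  # speed of light in m/s

def distance_from_round_trip_time(t_s):
    # t = 2d/c  =>  d = c*t/2
    return C_LIGHT * t_s / 2.0

def distance_from_beat_frequency(beat_hz, chirp_slope_hz_per_s):
    # For a triangular/FMCW chirp with slope k, the IF (beat) frequency
    # satisfies f = 2*k*d/c, so d = f*c/(2*k).
    return beat_hz * C_LIGHT / (2.0 * chirp_slope_hz_per_s)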
In the process that the unmanned aerial vehicle flies along the air route, the radar device measures the relative actual flying height of the unmanned aerial vehicle and the ground obstacle according to the preset measuring frequency.
Fig. 4 is a schematic flow chart of another unmanned aerial vehicle shooting control method provided in the embodiment of the present application. As shown in fig. 4, optionally, the determining the target flying height according to the actual flying height includes:
s401, determining the average value of the actual flying heights of the unmanned aerial vehicle at all the moments from the previous photographing moment to the current moment.
S402, taking the average value as the target flying height.
In an implementable embodiment, as shown in fig. 5, fig. 5 is a scene schematic diagram of an unmanned aerial vehicle performing a shooting task in a sub-area in a non-ground-defense flight mode according to an embodiment of the present application. Suppose there is a wide, tall ground obstacle (such as a building) in this sub-area. It can be seen that the unmanned aerial vehicle keeps a fixed flight height h0 from the ground; however, because of the ground obstacle in the sub-area, it is still necessary to use the actual flight height of the unmanned aerial vehicle relative to the ground obstacle to determine whether the current moment is a photographing moment.
Assume that the current moment is t_n and the previous photographing moment is t_{n-1}. The processor can read the previously stored actual flight height h(t_{n-1}) corresponding to the previous moment t_{n-1}, the actual flight heights h(t_i) corresponding to the moments between the previous photographing moment t_{n-1} and the current moment t_n, and the actual flight height h(t_n) corresponding to the current moment t_n, obtain the average of these actual flight heights, and replace the flight height h(t_n) corresponding to the current moment t_n with this average. The processor then determines, from this average actual flight height and the stage flight distance the unmanned aerial vehicle has flown from the previous moment t_{n-1} to the current moment t_n, whether the resulting shooting overlap distance is the preset shooting overlap distance; if so, the current moment t_n is a photographing moment, and the processor can control the shooting device on the unmanned aerial vehicle to shoot an image at the shooting point corresponding to the current moment t_n.
Optionally, if the width of the tall ground obstacle along the heading direction is known, then once the first photographing moment over the tall ground obstacle (such as t_a in the figure) has been determined, the photographing interval (S_2) between the first photographing moment (t_a) and the second photographing moment (t_b) can be obtained from the width information, and the number of photographing intervals over the tall ground obstacle can be determined. The processor can then directly set the shooting frequency of the shooting device from the second photographing moment (t_b) onward according to the number of photographing intervals over the tall ground obstacle, and the shooting device shoots images at this shooting frequency, which can improve the control efficiency of the processor and reduce its computational load.
In another practical embodiment, as shown in fig. 6, fig. 6 is a schematic view of a scene in which an unmanned aerial vehicle performs a shooting task in a sub-area in a ground-defense flight mode according to an embodiment of the present application. Assuming that there are ground obstacles (such as buildings with different heights) at different elevations in the sub-area, it can be seen that although the unmanned aerial vehicle flies at a fixed relative height to each ground obstacle in the sub-area, since the flying height of the unmanned aerial vehicle has hysteresis when climbing or landing, it is necessary to determine whether the current time is the photographing time by using the relative actual flying height of the unmanned aerial vehicle and the ground obstacle.
Assuming that the current moment is P7 and the previous photographing moment is P6, the processor can read the pre-stored actual flight height h(P6) corresponding to the previous moment P6, the actual flight heights h(t_i) corresponding to the moments between the previous photographing moment P6 and the current moment P7, and the actual flight height h(P7) corresponding to the current moment P7, obtain the average of these actual flight heights, and replace the actual flight height h(P7) corresponding to the current moment P7 with this average. The processor determines, from this average actual flight height and the stage flight distance (S0) the unmanned aerial vehicle has flown from the previous moment P6 to the current moment P7, whether the resulting shooting overlap distance is the preset shooting overlap distance; if so, the current moment P7 is a photographing moment, and the processor can control the shooting device on the unmanned aerial vehicle to shoot an image at the shooting point corresponding to the current moment P7.
Fig. 5 and 6 are described as examples, and are not intended to limit the scope of the present invention.
Fig. 7 is a schematic flow chart of another unmanned aerial vehicle shooting control method provided in the embodiment of the present application. As shown in fig. 7, optionally, the determining the target flying height according to the actual flying height includes:
s701, determining the minimum value of the actual flying heights of the unmanned aerial vehicle at each moment between the previous photographing moment and the current moment.
And S702, taking the minimum value as the target flight height.
For example, as shown in fig. 5, among the actual flight height h(t_{n-1}) corresponding to the previous moment t_{n-1}, the actual flight heights h(t_i) corresponding to the moments between the previous photographing moment t_{n-1} and the current moment t_n, and the actual flight height h(t_n) corresponding to the current moment t_n, suppose the actual flight height h(t_{n-1}) corresponding to the previous moment t_{n-1} is the smallest; then h(t_{n-1}) can be taken as the target flight height at the current moment t_n. The processor determines, from this actual flight height h(t_{n-1}) and the stage flight distance the unmanned aerial vehicle has flown from the previous moment t_{n-1} to the current moment t_n, whether the resulting shooting overlap distance is the preset shooting overlap distance; if so, the current moment t_n is a photographing moment, and the processor can control the shooting device on the unmanned aerial vehicle to shoot an image at the shooting point corresponding to the current moment t_n.
As shown in fig. 6, suppose that among the actual flight height h(P6) corresponding to the previous moment P6, the actual flight heights h(t_i) corresponding to the moments between the previous photographing moment P6 and the current moment P7, and the actual flight height h(P7) corresponding to the current moment P7, the actual flight height h(P7) corresponding to the current moment P7 is the smallest; then h(P7) can be taken as the target flight height at the current moment P7. The processor determines, from the target flight height h(P7) and the stage flight distance (S0) the unmanned aerial vehicle has flown from the previous moment P6 to the current moment P7, whether the resulting shooting overlap distance is the preset shooting overlap distance; if so, the current moment P7 is a photographing moment, and the processor can control the shooting device on the unmanned aerial vehicle to shoot an image at the shooting point corresponding to the current moment P7.
Optionally, the determining the target flying height according to the actual flying height includes: and taking the actual flying height of the unmanned aerial vehicle at the current moment as the target flying height.
Illustratively, as shown in fig. 5, the actual flight height h(t_n) corresponding to the current moment t_n can be taken as the target flight height. The processor determines, from the target flight height h(t_n) and the stage flight distance the unmanned aerial vehicle has flown from the previous moment t_{n-1} to the current moment t_n, whether the resulting shooting overlap distance is the preset shooting overlap distance; if so, the current moment t_n is a photographing moment, and the processor can control the shooting device on the unmanned aerial vehicle to shoot an image at the shooting point corresponding to the current moment t_n.
As shown in fig. 6, the actual flying height h (P7) corresponding to the current time P7 is used as the target flying height of the current time P7. The processor may determine whether the obtained shooting overlapping distance is a preset shooting overlapping distance according to the target flying height h (P7) and a stage flying distance (S0) of the unmanned aerial vehicle from a previous time P6 to a current time P7, and if the obtained shooting overlapping distance is the preset shooting overlapping distance, the current time P7 is a shooting time, and the processor may control the shooting device on the unmanned aerial vehicle to shoot an image at a shooting point corresponding to the current time P7.
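The three ways of deriving the target flight height illustrated above (average, minimum, or the height at the current moment) differ only in how the radar samples collected since the previous photographing moment are aggregated. The following sketch captures that choice in one helper; the function name and the mode parameter are illustrative, not part of the disclosure.

from statistics import mean

def target_flight_height(height_samples, mode="average"):
    # height_samples: actual flight heights measured by the radar device at
    # each moment from the previous photographing moment up to and including
    # the current moment, oldest first.
    if not height_samples:
        raise ValueError("at least one height sample is required")
    if mode == "average":
        return mean(height_samples)   # the fig. 4 embodiment (S401-S402)
    if mode == "minimum":
        return min(height_samples)    # the fig. 7 embodiment (S701-S702)
    if mode == "current":
        return height_samples[-1]     # use the current moment's height
    raise ValueError(f"unknown mode: {mode}")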
Optionally, the determining whether to capture an image at the current time according to the target flying height, the stage flying distance, and the preset capture overlap distance includes: determining whether the target flight height, the stage flight distance, the preset shooting overlapping distance and the field angle of the shooting device meet a preset relationship; and if the preset relation is met, determining to shoot the image at the current moment.
As shown in fig. 6, the field angle of the shooting device is the same at every photographing moment, and a preset relationship exists among the preset shooting overlap distance, the target flight height, the stage flight distance and the field angle of the shooting device. When the target flight height, the stage flight distance and the field angle corresponding to the current moment, converted according to the preset relationship, yield a value equal to the preset shooting overlap distance, the current moment is a photographing moment.
Wherein the predetermined relationship is:
C = 2h·tan(θ/2) - S
wherein S is the stage flight distance, C is the preset shooting overlapping distance, theta is the field angle of the shooting device, and h is the target flight height.
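A minimal sketch of evaluating this relation in code, assuming the reconstructed form C = 2h·tan(θ/2) - S and using an inequality (shoot once the stage flight distance has reached the required value) rather than exact equality, since the flight distance is only sampled at discrete moments:

import math

def should_shoot(stage_distance_m, target_height_m, fov_rad, overlap_distance_m):
    # Ground footprint along the heading direction at the target flight height.
    footprint = 2.0 * target_height_m * math.tan(fov_rad / 2.0)
    # Stage flight distance at which adjacent images overlap by exactly the
    # preset shooting overlap distance: S = footprint - C.
    required_distance = footprint - overlap_distance_m
    return stage_distance_m >= required_distance

# Illustrative numbers, not values from the disclosure: with h = 100 m, a
# 60-degree field angle and C = 20 m the required interval is about 95.5 m;
# at h = 50 m it drops to about 37.7 m.
print(should_shoot(95.5, 100.0, math.radians(60.0), 20.0))  # True
print(should_shoot(30.0, 50.0, math.radians(60.0), 20.0))   # False, needs ~37.7 m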
As can be seen from fig. 8, a shooting interval in the flat area is a fixed value (e.g., S), an overlapping distance between two adjacent field angles is a preset shooting overlapping distance C, and a shooting distance corresponding to a field angle θ of the shooting device is O, then it can be seen that:
O = 2h·tan(θ/2)
further, the following relationship exists between the shooting interval (step flight distance) S, the shooting distance O, and the preset shooting overlap distance C: s + C ═ O.
Further, it can be deduced that:
C = 2h·tan(θ/2) - S
it can be seen that the preset shooting overlap distance C is related to the field angle θ, the phase flight distance S, and the target flight height h of the shooting device.
The smaller the target flight height h corresponding to the current moment, the smaller the shooting interval S. As shown in fig. 6, as the unmanned aerial vehicle climbs, the relative flight height between the unmanned aerial vehicle and the ground obstacle decreases, and the shooting interval from P3 to P4 is shorter than that from P2 to P3, which is consistent with the relationship above.
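Putting the pieces sketched above together, one possible shape of the per-sample loop run by the processor is shown below. The radar, rtk and camera objects and their method names are hypothetical placeholders for the devices of fig. 1, and target_flight_height, stage_distance_from_positions and should_shoot are the helpers sketched earlier; this is an illustration of the described flow under those assumptions, not a definitive implementation.

def shooting_control_loop(radar, rtk, camera, fov_rad, overlap_distance_m,
                          height_mode="average"):
    height_samples = []
    last_shot_xy = rtk.position()   # position stored at the previous photo
    while True:
        # One radar sample per preset measuring period.
        height_samples.append(radar.relative_height())
        h = target_flight_height(height_samples, mode=height_mode)
        s = stage_distance_from_positions(last_shot_xy, rtk.position())
        if should_shoot(s, h, fov_rad, overlap_distance_m):
            camera.shoot()                 # send the shooting control instruction
            last_shot_xy = rtk.position()  # restart the stage flight distance
            height_samples = []            # restart the height aggregation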
Fig. 9 is a schematic structural diagram of an unmanned aerial vehicle shooting control device provided in the embodiment of the present application.
As shown in fig. 9, the apparatus includes:
a first determining module 901, configured to determine a target flying height of the drone at a current time, where the target flying height is used to represent a relative flying height of the drone and a ground obstacle;
an obtaining module 902, configured to obtain the stage flight distance of the unmanned aerial vehicle from the previous photographing moment to the current moment;
a second determining module 903, configured to determine whether to capture an image at the current time according to the target flying height, the stage flying distance, and a preset capture overlap distance;
and the control module 904 is configured to, if so, control the shooting device on the unmanned aerial vehicle to shoot the image.
Optionally, the first determining module 901 is specifically configured to obtain an actual flying height of the drone relative to an obstacle on the ground at a current time based on a measurement of a radar device on the drone; and determining the target flight height according to the actual flight height.
Optionally, the first determining module 901 is further specifically configured to determine an average value of actual flying heights of the unmanned aerial vehicle at each time between the previous photographing time and the current time; the average value is taken as the target flying height.
Optionally, the first determining module 901 is further specifically configured to determine a minimum value of actual flying heights of the unmanned aerial vehicle at each time between a previous photographing time and a current time; the minimum value is taken as the target flying height.
Optionally, the first determining module 901 is further specifically configured to use an actual flying height of the drone at the current moment as the target flying height.
Optionally, the second determining module 903 is specifically configured to determine whether the target flight height, the stage flight distance, the preset shooting overlap distance, and the field angle of the shooting device satisfy a preset relationship; and if the preset relation is met, determining to shoot the image at the current moment.
Optionally, the preset relationship comprises:
C = 2h·tan(θ/2) - S
wherein S is the stage flight distance, C is the preset shooting overlapping distance, theta is the field angle of the shooting device, and h is the target flight height.
The above-mentioned apparatus is used for executing the method provided by the foregoing embodiment, and the implementation principle and technical effect are similar, which are not described herein again.
These modules may be one or more integrated circuits configured to implement the above methods, such as: one or more Application Specific Integrated Circuits (ASICs), one or more digital signal processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs). For another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a Central Processing Unit (CPU) or another processor capable of calling program code. As yet another example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SoC).
Fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application, and as shown in fig. 10, the electronic device may include: a processor 1001, a storage medium 1002 and a bus 1003, wherein the storage medium 1002 stores machine-readable instructions executable by the processor 1001, when the electronic device is operated, the processor 1001 and the storage medium 1002 communicate with each other through the bus 1003, and the processor 1001 executes the machine-readable instructions to execute the steps of the above method embodiment. The specific implementation and technical effects are similar, and are not described herein again.
Optionally, the present application further provides a storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the computer program performs the steps of the above method embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. Alternatively, the indirect coupling or communication connection of devices or units may be electrical, mechanical or other.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) or a processor (processor) to perform some steps of the methods according to the embodiments of the present application. And the aforementioned storage medium includes: a U disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
It is noted that, in this document, relational terms such as "first" and "second," and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. An unmanned aerial vehicle shooting control method is characterized by comprising the following steps:
determining a target flight height of the unmanned aerial vehicle at the current moment, wherein the target flight height is used for representing the relative flight height of the unmanned aerial vehicle and a ground obstacle;
acquiring the stage flight distance from the previous photographing moment to the current moment of the unmanned aerial vehicle;
determining whether to shoot an image at the current moment according to the target flight height, the stage flight distance and a preset shooting overlapping distance;
and if so, controlling a shooting device on the unmanned aerial vehicle to shoot the image.
2. The method of claim 1, wherein determining the target altitude of the drone at the current time comprises:
obtaining the actual flying height of the unmanned aerial vehicle relative to the obstacle on the ground at the current moment based on the measurement of the radar device on the unmanned aerial vehicle;
and determining the target flight height according to the actual flight height.
3. The method of claim 2, wherein said determining the target flight level from the actual flight level comprises:
determining the average value of the actual flying heights of the unmanned aerial vehicle at all the moments from the previous photographing moment to the current moment;
and taking the average value as the target flying height.
4. The method of claim 2, wherein said determining the target flight level from the actual flight level comprises:
determining the minimum value of the actual flying heights of the unmanned aerial vehicle at each moment between the previous photographing moment and the current moment;
and taking the minimum value as the target flying height.
5. The method of claim 2, wherein said determining the target flight level from the actual flight level comprises:
and taking the actual flying height of the unmanned aerial vehicle at the current moment as the target flying height.
6. The method according to any one of claims 1-5, wherein the determining whether to shoot an image at the current moment according to the target flight height, the stage flight distance, and a preset shooting overlap distance comprises:
determining whether the target flying height, the stage flying distance, the preset shooting overlapping distance and the field angle of the shooting device meet a preset relation or not;
and if the preset relation is met, determining to shoot the image at the current moment.
7. The method of claim 6, wherein the preset relationship comprises:
C = 2h·tan(θ/2) - S
wherein S is the stage flight distance, C is the preset shooting overlap distance, θ is the field angle of the shooting device, and h is the target flight height.
8. The utility model provides an unmanned aerial vehicle shoots controlling means which characterized in that, the device includes:
the system comprises a first determination module, a second determination module and a control module, wherein the first determination module is used for determining the target flight height of the unmanned aerial vehicle at the current moment, and the target flight height is used for representing the relative flight height of the unmanned aerial vehicle and a ground obstacle;
the acquisition module is used for acquiring the stage flight distance from the previous photographing moment to the current moment of the unmanned aerial vehicle;
the second determining module is used for determining whether to shoot an image at the current moment according to the target flight height, the stage flight distance and a preset shooting overlapping distance;
and the control module is configured to, if so, control the shooting device on the unmanned aerial vehicle to shoot the image.
9. An electronic device, comprising: a processor, a storage medium and a bus, wherein the storage medium stores machine-readable instructions executable by the processor; when the electronic device runs, the processor communicates with the storage medium through the bus, and the processor executes the machine-readable instructions to perform the steps of the unmanned aerial vehicle shooting control method according to any one of claims 1-7.
10. A storage medium, characterized in that a computer program is stored on the storage medium, and when the computer program is executed by a processor, the steps of the unmanned aerial vehicle shooting control method according to any one of claims 1-7 are performed.
CN202111165234.8A 2021-09-30 2021-09-30 Unmanned aerial vehicle shooting control method, device, equipment and storage medium Pending CN113867389A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111165234.8A CN113867389A (en) 2021-09-30 2021-09-30 Unmanned aerial vehicle shooting control method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111165234.8A CN113867389A (en) 2021-09-30 2021-09-30 Unmanned aerial vehicle shooting control method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113867389A true CN113867389A (en) 2021-12-31

Family

ID=79001466

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111165234.8A Pending CN113867389A (en) 2021-09-30 2021-09-30 Unmanned aerial vehicle shooting control method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113867389A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116929306A (en) * 2023-07-20 2023-10-24 深圳赛尔智控科技有限公司 Data acquisition method, device, equipment and computer readable storage medium
CN116929306B (en) * 2023-07-20 2024-04-19 深圳赛尔智控科技有限公司 Data acquisition method, device, equipment and computer readable storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108416263A (en) * 2018-01-29 2018-08-17 华南农业大学 A kind of drone height measurement method of low cost suitable for the monitoring of agriculture feelings low-altitude remote sensing
CN110716586A (en) * 2019-11-14 2020-01-21 广州极飞科技有限公司 Photographing control method and device for unmanned aerial vehicle, unmanned aerial vehicle and storage medium
CN111966129A (en) * 2020-08-31 2020-11-20 金陵科技学院 Photovoltaic inspection unmanned aerial vehicle and ground-imitating flying method thereof

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108416263A (en) * 2018-01-29 2018-08-17 华南农业大学 A kind of drone height measurement method of low cost suitable for the monitoring of agriculture feelings low-altitude remote sensing
CN110716586A (en) * 2019-11-14 2020-01-21 广州极飞科技有限公司 Photographing control method and device for unmanned aerial vehicle, unmanned aerial vehicle and storage medium
CN111966129A (en) * 2020-08-31 2020-11-20 金陵科技学院 Photovoltaic inspection unmanned aerial vehicle and ground-imitating flying method thereof

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116929306A (en) * 2023-07-20 2023-10-24 深圳赛尔智控科技有限公司 Data acquisition method, device, equipment and computer readable storage medium
CN116929306B (en) * 2023-07-20 2024-04-19 深圳赛尔智控科技有限公司 Data acquisition method, device, equipment and computer readable storage medium

Similar Documents

Publication Publication Date Title
CN111326023B (en) Unmanned aerial vehicle route early warning method, device, equipment and storage medium
US9996746B1 (en) Systems and methods for autonomous perpendicular imaging with a target field of view
CN109144097B (en) Obstacle or ground recognition and flight control method, device, equipment and medium
EP3264364B1 (en) Method and apparatus for obtaining range image with uav, and uav
US10089530B2 (en) Systems and methods for autonomous perpendicular imaging of test squares
EP3581890A2 (en) Method and device for positioning
CN106645205A (en) Unmanned aerial vehicle bridge bottom surface crack detection method and system
CN110889808A (en) Positioning method, device, equipment and storage medium
US20210319221A1 (en) Vessel Height Detection Through Video Analysis
US20230222642A1 (en) Inundation damage determination device, inundation damage determination method, and program
CN113867389A (en) Unmanned aerial vehicle shooting control method, device, equipment and storage medium
CN116692690A (en) Crane anti-collision early warning method, device, equipment and medium
US20220089166A1 (en) Motion state estimation method and apparatus
KR101821992B1 (en) Method and apparatus for computing 3d position of target using unmanned aerial vehicles
CN113781536A (en) Image alignment method and apparatus, electronic device, and computer-readable storage medium
US10964055B2 (en) Methods and systems for silent object positioning with image sensors
KR20210053012A (en) Image-Based Remaining Fire Tracking Location Mapping Device and Method
CN116793340B (en) Unmanned aerial vehicle automatic landing navigation method and device and electronic equipment
CN111095024A (en) Height determination method, height determination device, electronic equipment and computer-readable storage medium
US20230243976A1 (en) Systems and methods for utility pole loading and/or clearance analyses
JP2018097588A (en) Three-dimensional space specifying device, method, and program
US20240053487A1 (en) Systems and methods for transforming autonomous aerial vehicle sensor data between platforms
CN114740878A (en) Unmanned aerial vehicle flight obstacle detection method based on computer image recognition
CN114746826A (en) Data processing method and movable platform
CN111583312A (en) Method and device for accurately matching remote sensing images, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination