CN113875222B - Shooting control method and device, unmanned aerial vehicle and computer readable storage medium - Google Patents



Publication number
CN113875222B
CN113875222B (application CN202080032440.9A)
Authority
CN
China
Prior art keywords
shooting
unmanned aerial vehicle
sequence
cradle head
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202080032440.9A
Other languages
Chinese (zh)
Other versions
CN113875222A
Inventor
吴利鑫
何纲
黄振昊
方朝晖
Current Assignee
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd
Priority to CN202311469626.2A (published as CN117641107A)
Publication of CN113875222A
Application granted
Publication of CN113875222B


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U20/00Constructional aspects of UAVs
    • B64U20/80Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U20/87Mounting of imaging devices, e.g. mounting of gimbals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/64Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/02Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10Simultaneous control of position or course in three dimensions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/66Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661Transmitting camera control signals through networks, e.g. control via the Internet
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • B64U2201/20Remote controls

Abstract

A shooting control method and apparatus, an unmanned aerial vehicle, and a computer-readable storage medium. The method includes: acquiring first position information of a region to be photographed and second position information of an extended shooting region, where the extended shooting region is obtained by expanding the region to be photographed and the second position information is determined according to the first position information; determining third position information of effective shooting areas in different shooting directions according to the first position information and the second position information; and determining a shooting sequence corresponding to each waypoint on a preset flight route of the unmanned aerial vehicle according to the third position information and the flight route. Each shooting sequence includes one or more consecutive shooting points, the shooting directions of the one or more shooting points of each shooting sequence differ from one another, and the shooting point in each shooting direction is located in the effective shooting area for that direction. Because every shooting point lies in the effective shooting area for its direction, the generation of invalid image data is prevented.

Description

Shooting control method and device, unmanned aerial vehicle and computer readable storage medium
Technical Field
The present application relates to the field of shooting, and in particular, to a shooting control method and apparatus, an unmanned aerial vehicle, and a computer readable storage medium.
Background
Oblique photography mounts one or more photographing devices on an unmanned aerial vehicle and collects images from one vertical (nadir) view and four oblique side views. Compared with traditional photography, the four additional oblique angles capture richer side-texture information, making the technique suitable for surveying and mapping and other fields that require diversified feature information of the photographed object. In the related art, one way to photograph in multiple directions is to mount a multi-lens camera array (for example, a five-camera array) on the unmanned aerial vehicle and capture images in multiple directions simultaneously. Such an array is expensive and heavy, is usually mounted directly on the body of the unmanned aerial vehicle through a vibration-damping system, and lacks a mechanical cradle head for stabilization, so its imaging quality is poor. To reduce the volume of the array, either a rolling shutter or an electronic global shutter is adopted; the rolling shutter produces a jello effect during fast motion, which reduces modeling precision, while the electronic global shutter has poor imaging quality, which also degrades the modeling result. The other way is to mount a single-lens photographing device on the unmanned aerial vehicle and cover multiple directions with multiple flight routes. Compared with the multi-lens array, the single-lens device is cheaper and lighter, can be mounted on the body of the unmanned aerial vehicle through a cradle head, and offers better imaging quality.
To ensure that images of the region to be photographed are captured in all directions, the region to be photographed is first expanded outward during route planning, and the route is then planned over the expanded region (that is, the extended region to be photographed). The unmanned aerial vehicle flies along the planned route and collects images in every direction at each shooting point. As a result, a large amount of invalid image data is generated on the portion of the route that lies outside the region to be photographed, which wastes storage space and complicates the modeling process.
Disclosure of Invention
The application provides a shooting control method and device, an unmanned aerial vehicle and a computer readable storage medium.
In a first aspect, an embodiment of the present application provides a photographing control method, including:
acquiring first position information of a region to be shot and second position information of an expanded shooting region, wherein the expanded shooting region is obtained by expanding the region to be shot, and the second position information is determined according to the first position information;
determining third position information of effective shooting areas in different shooting directions according to the first position information and the second position information;
determining a shooting sequence corresponding to each waypoint on the flight route according to the third position information and a preset flight route of the unmanned aerial vehicle;
Each shooting sequence comprises one or more consecutive shooting points, the shooting directions of the one or more shooting points of each shooting sequence differ from one another, and the shooting point in each shooting direction is located in the effective shooting area for that direction.
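The per-waypoint filtering that the steps above describe can be sketched as follows. This is a minimal illustration under simplifying assumptions, not the patented implementation: regions are modeled as axis-aligned rectangles, and the helper names (`inside`, `plan_sequences`) are hypothetical.

```python
from typing import Dict, List, Tuple

Rect = Tuple[float, float, float, float]   # (min_x, min_y, max_x, max_y)
Point = Tuple[float, float]

def inside(p: Point, r: Rect) -> bool:
    """True if point p lies within rectangle r (boundaries inclusive)."""
    return r[0] <= p[0] <= r[2] and r[1] <= p[1] <= r[3]

def plan_sequences(route: List[Point],
                   effective: Dict[str, Rect]) -> List[List[str]]:
    """For each waypoint on the route, keep only the shooting directions
    whose effective shooting area contains that waypoint."""
    return [[d for d, area in effective.items() if inside(wp, area)]
            for wp in route]
```

A waypoint lying outside every effective area simply yields an empty sequence, i.e. no image is taken there, which is how invalid image data is avoided.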
In a second aspect, an embodiment of the present application provides a photographing control apparatus, including:
a storage device for storing program instructions; and
one or more processors invoking program instructions stored in the storage device, which when executed, are configured, individually or collectively, to:
acquiring first position information of a region to be shot and second position information of an expanded shooting region, wherein the expanded shooting region is obtained by expanding the region to be shot, and the second position information is determined according to the first position information;
determining third position information of effective shooting areas in different shooting directions according to the first position information and the second position information;
determining a shooting sequence corresponding to each waypoint on the flight route according to the third position information and a preset flight route of the unmanned aerial vehicle;
Each shooting sequence comprises one or more consecutive shooting points, the shooting directions of the one or more shooting points of each shooting sequence differ from one another, and the shooting point in each shooting direction is located in the effective shooting area for that direction.
In a third aspect, an embodiment of the present application provides an unmanned aerial vehicle, including:
a body;
a cradle head mounted on the body, the cradle head being used for mounting a photographing device; and
the shooting control apparatus of the second aspect is supported by the body, and the shooting control apparatus is electrically connected to the cradle head.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the photographing control method of the first aspect.
In a fifth aspect, an embodiment of the present application provides a photographing control method, including:
receiving, from a control device of an unmanned aerial vehicle, a shooting sequence corresponding to each waypoint on a flight route;
controlling a shooting device carried on the unmanned aerial vehicle to shoot based on the flight route and the shooting sequence corresponding to each waypoint;
each shooting sequence comprises one or more consecutive shooting points, the shooting directions of the one or more shooting points of each shooting sequence differ from one another, and the shooting point in each shooting direction is located in the effective shooting area for that direction; the effective shooting area is determined according to first position information of a region to be photographed and second position information of an extended shooting region, the extended shooting region is obtained by expanding the region to be photographed, and the second position information is determined according to the first position information.
In a sixth aspect, an embodiment of the present application provides a photographing control apparatus, including:
a storage device for storing program instructions; and
one or more processors invoking program instructions stored in the storage device, which when executed, are configured, individually or collectively, to:
receiving, from a control device of an unmanned aerial vehicle, a shooting sequence corresponding to each waypoint on a flight route;
controlling a shooting device carried on the unmanned aerial vehicle to shoot based on the flight route and the shooting sequence corresponding to each waypoint;
each shooting sequence comprises one or more consecutive shooting points, the shooting directions of the one or more shooting points of each shooting sequence differ from one another, and the shooting point in each shooting direction is located in the effective shooting area for that direction; the effective shooting area is determined according to first position information of a region to be photographed and second position information of an extended shooting region, the extended shooting region is obtained by expanding the region to be photographed, and the second position information is determined according to the first position information.
In a seventh aspect, an embodiment of the present application provides an unmanned aerial vehicle, including:
a body;
a cradle head mounted on the body, the cradle head being used for mounting a photographing device; and
the shooting control apparatus of the sixth aspect is supported by the body, and the shooting control apparatus is electrically connected to the cradle head.
In an eighth aspect, an embodiment of the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the photographing control method of the fifth aspect.
According to the technical solutions provided by the embodiments of the application, when the shooting sequences are planned, the shooting point in each shooting direction of each shooting sequence is guaranteed to lie in the effective shooting area for that direction. This not only prevents invalid image data from being generated, but also reduces the number of shooting points, shortens the shooting time, and improves the efficiency of multi-directional shooting.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
FIG. 1 is a schematic view of an unmanned aerial vehicle according to an embodiment of the present application;
fig. 2 is a flowchart of a shooting control method according to an embodiment of the present application;
fig. 3 is a schematic diagram of a positional relationship between a region to be photographed and an extended photographing region in an embodiment of the present application;
FIG. 4A is a schematic illustration of a flight path in an embodiment of the application;
FIG. 4B is a schematic illustration of a flight path in another embodiment of the application;
FIG. 5A is a schematic diagram showing a positional relationship between an effective shooting area and a region to be shot in one shooting direction according to an embodiment of the present application;
FIG. 5B is a schematic diagram showing a position relationship between an effective shooting area and a region to be shot in another shooting direction according to an embodiment of the present application;
FIG. 5C is a schematic diagram illustrating a position relationship between an effective shooting area and a region to be shot in another shooting direction according to an embodiment of the present application;
FIG. 5D is a schematic diagram showing a position relationship between an effective shooting area and a region to be shot in another shooting direction according to an embodiment of the present application;
fig. 6A is a comparison diagram of images taken by the unmanned aerial vehicle at different shooting points in the same shooting direction in an embodiment of the present application;
FIG. 6B is a schematic illustration of a flight path in another embodiment of the application;
FIG. 7 is a schematic diagram of an implementation of controlling a photographing device mounted on an unmanned aerial vehicle to perform shooting based on a flight route and shooting sequences according to an embodiment of the present application;
fig. 8 is a schematic diagram of a positional relationship between images of shooting points in which shooting directions of adjacent two shooting sequences are the same in an embodiment of the present application;
fig. 9 is a schematic diagram of a process of a pan/tilt head performing a shooting sequence according to an embodiment of the present application;
fig. 10 is a flowchart of a photographing control method according to another embodiment of the present application;
fig. 11 is a block diagram showing the configuration of a photographing control apparatus in an embodiment of the present application;
fig. 12 is a block diagram of an unmanned aerial vehicle in an embodiment of the application.
Detailed Description
Traditional surveying and mapping is performed with total stations or handheld GNSS (Global Navigation Satellite System) devices and suffers from low efficiency, high operational difficulty, and high cost. For large-area, high-precision, high-resolution surveying and mapping, traditional methods cannot meet the requirements and are gradually being replaced by manned-aircraft surveying and unmanned-aerial-vehicle surveying. Manned-aircraft or unmanned-aerial-vehicle surveying can also be used to build a three-dimensional model of a survey area: multiple directions of the region to be photographed are captured using oblique photography, and the multi-directional images are processed and solved with a three-dimensional modeling algorithm to obtain a model containing three-dimensional spatial information.
To ensure that images of the region to be photographed are captured in all directions, the region to be photographed is first expanded outward during route planning for oblique photography, and the route is then planned over the expanded region (that is, the extended region to be photographed). The unmanned aerial vehicle flies along the planned route and collects images in every direction at each shooting point. As a result, a large amount of invalid image data is generated on the portion of the route that lies outside the region to be photographed, which wastes storage space and complicates the modeling process.
In view of the above problems, when planning the shooting sequences, the embodiments of the present application ensure that the shooting point in each shooting direction of each shooting sequence lies in the effective shooting area for that direction. This not only prevents invalid image data from being generated, but also reduces the number of shooting points, shortens the shooting time, and improves the efficiency of multi-directional shooting.
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings in the embodiments of the present application.
The features of the following examples and embodiments may be combined with each other without any conflict.
In the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of the following items" or similar expressions refers to any combination of these items, including a single item or any combination of plural items. For example, "at least one of a, b, or c" may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c may each be singular or plural.
FIG. 1 is a schematic view of an unmanned aerial vehicle according to an embodiment of the present application. Referring to FIG. 1, the unmanned aerial vehicle of the embodiment may include a body 100, a photographing device 200, and a cradle head 300, where the photographing device 200 is mounted on the body 100 through the cradle head 300. The unmanned aerial vehicle may be a fixed-wing unmanned aerial vehicle or a multi-rotor unmanned aerial vehicle, and its type may be selected according to actual requirements. For example, when the combined weight of the cradle head 300 and the photographing device 200 is large, a fixed-wing unmanned aerial vehicle with a larger volume and payload may be selected to carry them; when that weight is small, a multi-rotor unmanned aerial vehicle with a smaller volume and weight may be selected instead.
In the embodiment of the present application, the number of photographing devices is one: only a single photographing device is needed for oblique photography with the unmanned aerial vehicle. Although the photographing device has a high pixel count, its volume and weight are greatly reduced compared with a multi-lens camera array, which in turn greatly reduces the weight and size of the unmanned aerial vehicle. The photographing device 200 may be an integrated camera, or a device formed by combining an image sensor and a lens; it should be noted that the photographing device 200 of the embodiment is a photographing device with a single lens. In addition, the cradle head 300 of the embodiment may be a single-axis, two-axis, three-axis, or other multi-axis cradle head.
The unmanned aerial vehicle can be applied to the field of surveying and mapping. Taking the ground as the photographed object, the photographing device 200 carried by the unmanned aerial vehicle acquires ground images, and software reconstructs a three-dimensional or two-dimensional map from them. The resulting map can be applied in different industries: in power-line inspection, the reconstructed map can be used to locate line faults; in road planning, it can be used to select road sites; and anti-narcotics police can use a reconstructed three-dimensional map to check for poppy cultivation in remote mountains. Of course, the unmanned aerial vehicle is not limited to surveying and mapping, and can be applied in other fields that require multi-directional feature information of the photographed object. Likewise, the photographed object is not limited to the ground and may be a large building, a mountain, and the like.
Fig. 2 is a flowchart of a shooting control method according to an embodiment of the present application; referring to fig. 2, the photographing control method according to the embodiment of the present application may include steps S201 to S203.
In S201, first position information of a region to be photographed and second position information of an extended shooting region are acquired, where the extended shooting region is obtained by expanding the region to be photographed, and the second position information is determined according to the first position information.
The user may define the region to be photographed in different ways, such as by manually placing points or importing an external file. Accordingly, different strategies may be employed to obtain the first position information. In some embodiments, the first position information is set by the user, for example, entered by manually placing points; in some embodiments, the region to be photographed is determined by importing an external file in which the first position information is recorded. Optionally, before S201 is performed, a prompt message may be output to prompt the user to define the region to be photographed.
The region to be shot in the embodiment of the application can be a square region, or can be a region with other shapes, such as a circular region, a pentagonal region and the like.
The first position information may include position information of four corners of the square area, and of course, the first position information may also include position information of other positions of the square area.
In addition, in some embodiments, before the first position information of the region to be photographed and the second position information of the extended shooting region are acquired, if a trigger instruction for entering the oblique photographing mode is acquired, the oblique photographing mode is entered; that is, the planning of the shooting sequences is performed after entering the oblique photographing mode.
It should be understood that the planning of the shooting sequence may be performed by the control device of the unmanned aerial vehicle or by the unmanned aerial vehicle, and the shooting using the planned shooting sequence is performed by the unmanned aerial vehicle. Therefore, if the shooting sequence planning process is performed in the control device, the control device can be triggered to enter the oblique shooting mode before the control device performs the planning of the shooting sequence; if the shooting sequence planning process is performed in the unmanned aerial vehicle, before the unmanned aerial vehicle performs the planning of the shooting sequence, the unmanned aerial vehicle needs to be triggered to enter an oblique shooting mode. In addition, the unmanned aerial vehicle performs shooting in the oblique shooting mode by using a shooting sequence planned by the control device.
In the embodiment of the application, the control device of the unmanned aerial vehicle can be a remote controller or other terminal equipment capable of controlling the unmanned aerial vehicle, such as a mobile phone, a tablet personal computer, a portable computer, a desktop computer, intelligent wearing equipment and the like.
The second position information is also related to the strategy used to expand the region to be photographed. When the extended shooting region is obtained by expanding the region to be photographed equally in all directions, the second position information may be determined according to the first position information and the expansion factor of the region to be photographed, or according to the first position information and the distance between the edge of the extended shooting region and the edge of the region to be photographed. When the extended shooting region is obtained by expanding the region to be photographed by different amounts in at least some directions, the second position information may be determined according to the first position information and the distances between the edges of the extended shooting region in the different directions and the corresponding edges of the region to be photographed.
The extended shooting region is obtained by expanding the region to be photographed by a first preset distance in each direction. For example, referring to FIG. 3, the region to be photographed is a rectangular region 10; expanding the rectangular region 10 by a first preset distance D_ext in each direction yields the extended shooting region 20.
Optionally, the first preset distance is determined based on the flying height of the unmanned aerial vehicle and the installation angle of the photographing device carried on the unmanned aerial vehicle, thereby taking into account factors such as the image resolution of the photographing device and the route-planning requirements. Illustratively, the first preset distance D_ext is calculated as:

D_ext = H / tan α (1)

In formula (1), H is the flying height and α is the installation angle of the photographing device; illustratively, α is the angle between the optical axis of the lens of the photographing device and the ground plane.
It should be appreciated that the first predetermined distance may also be determined using other strategies.
The flying height can also be determined using different strategies. For example, in some embodiments the flying height is set by the user and entered through the control device of the unmanned aerial vehicle; this way of determining the flying height can meet different user requirements and is highly flexible. In some embodiments, the flying height is determined according to parameters of the photographing device mounted on the unmanned aerial vehicle and a preset ground resolution. Illustratively, the parameters of the photographing device include the focal length of the photographing device and the side length of a single pixel of its image sensor, and the flying height may be calculated as:

H = GSD · f / pix (2)

In formula (2), H is the flying height, f is the focal length of the photographing device, GSD (Ground Sampling Distance) is the preset ground resolution, and pix is the side length of a single pixel of the image sensor of the photographing device. It should be understood that the parameters of the photographing device are not limited to those listed above and may include other parameters, and the calculation formula of the flying height is not limited to formula (2).
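Formula (2) is the standard photogrammetric GSD relation; a minimal sketch (function name hypothetical, all quantities in meters) is:

```python
def flying_height(gsd_m: float, focal_length_m: float, pixel_size_m: float) -> float:
    """Flying height per formula (2): H = GSD * f / pix.
    gsd_m is the target ground resolution in meters per pixel."""
    return gsd_m * focal_length_m / pixel_size_m
```

For example, with a target GSD of 5 cm/pixel, a 35 mm focal length, and a 4.4 µm pixel (illustrative sensor values, not from the patent), the flying height comes out to roughly 398 m.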
In S202, third position information of the effective photographing region in different photographing directions is determined based on the first position information and the second position information.
The shooting directions of the embodiment of the present application may include at least two of the following: a forward shooting direction toward the front of the unmanned aerial vehicle, a backward shooting direction toward the rear of the unmanned aerial vehicle, a leftward shooting direction toward the left of the unmanned aerial vehicle, a rightward shooting direction toward the right of the unmanned aerial vehicle, or a downward (nadir) shooting direction pointing vertically down. When the unmanned aerial vehicle is level, the nose points forward and the tail points backward.
At least two of the above may be selected according to actual demands. For example, in mapping, the shooting directions include the front shooting direction (the shooting device captures a forward image of the shooting object), the rear shooting direction (a backward image), the left shooting direction (a leftward image) and the right shooting direction (a rightward image); or the shooting directions include the front, rear, left and right shooting directions together with the nadir shooting direction (the shooting device captures an orthographic image of the shooting object). It will be appreciated that in other use scenarios other shooting directions may be selected to meet the corresponding requirements.
The effective shooting areas in the front, rear, left and right shooting directions are, respectively, the areas obtained after the region to be shot is moved by a second preset distance in the first direction, in the second direction, in the third direction and in the fourth direction. That is, the effective shooting area in the front shooting direction is the region to be shot moved by the second preset distance in the first direction, the effective shooting area in the rear shooting direction is the region to be shot moved by the second preset distance in the second direction, the effective shooting area in the left shooting direction is the region to be shot moved by the second preset distance in the third direction, and the effective shooting area in the right shooting direction is the region to be shot moved by the second preset distance in the fourth direction. The second preset distance may be equal to or different from the first preset distance. It can be appreciated that the effective area of each shooting direction lies within the extended shooting area, so the second preset distance is smaller than or equal to the first preset distance.
When the region to be measured is expanded by different distances in different directions to obtain the extended shooting area, the movement distances used for the different shooting directions may likewise be unequal. For example, the effective area in the front shooting direction is the area obtained after the region to be shot is moved by a second preset distance in the first direction, where the second preset distance is smaller than or equal to the distance by which the region to be shot was expanded in the first direction; and the effective shooting area in the rear shooting direction is the area obtained after the region to be shot is moved by a third preset distance in the second direction, where the third preset distance is smaller than or equal to the distance by which the region to be shot was expanded in the second direction.
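The shifting of the region to be shot described above can be sketched with axis-aligned rectangles; the coordinate axes, region size and preset distance below are assumptions for illustration only:

```python
from typing import NamedTuple

class Rect(NamedTuple):
    xmin: float
    ymin: float
    xmax: float
    ymax: float

def shifted(r: Rect, dx: float, dy: float) -> Rect:
    """Translate a rectangle by (dx, dy) without changing its size."""
    return Rect(r.xmin + dx, r.ymin + dy, r.xmax + dx, r.ymax + dy)

# Assumed region to be shot and second preset distance (axes as in Fig. 4A:
# first direction = down, second = up, third = right, fourth = left).
region = Rect(0.0, 0.0, 100.0, 60.0)
d2 = 10.0
effective = {
    "front": shifted(region, 0.0, -d2),  # moved in the first direction
    "rear":  shifted(region, 0.0,  d2),  # moved in the second direction
    "left":  shifted(region,  d2, 0.0),  # moved in the third direction
    "right": shifted(region, -d2, 0.0),  # moved in the fourth direction
}
```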
The first direction is opposite to the second direction and the third direction is opposite to the fourth direction in the embodiment of the application. In particular, the first direction, the second direction, the third direction or the fourth direction is related to the shape of the flight path of the unmanned aerial vehicle.
By way of example, a flight route may include a plurality of mutually parallel sub-routes, with the ends of adjacent sub-routes connected to form one flight route. Optionally, the starting waypoint of the flight route is any corner of the extended shooting area, and the sub-routes are parallel to one side of the extended shooting area. For example, referring to fig. 4A, the starting waypoint A of the flight route 30 is the lower left corner of the extended shooting area, and the end point B of the flight route 30 is the upper right corner of the extended shooting area; referring to fig. 4B, the starting waypoint C of the flight route 40 is the upper left corner of the extended shooting area, and the end point D of the flight route 40 is the lower right corner of the extended shooting area. Of course, the starting waypoint may also be the upper right or lower right corner of the extended shooting area, and the end point the lower left or upper left corner. In addition, in the embodiments shown in fig. 4A and 4B the sub-routes are parallel to the short sides of the extended shooting area; it is understood that the sub-routes may instead be parallel to its long sides.
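A serpentine route of the kind shown in figs. 4A and 4B can be sketched as follows; the corner coordinates and lateral spacing are assumed values, and the sub-routes are taken parallel to the y-axis starting from the lower-left corner as in fig. 4A:

```python
def serpentine_route(xmin, ymin, xmax, ymax, lateral_spacing):
    """End-point waypoints of mutually parallel sub-routes joined end to end."""
    waypoints = []
    x, upward = xmin, True
    while x <= xmax + 1e-9:
        y_pair = (ymin, ymax) if upward else (ymax, ymin)
        waypoints.append((x, y_pair[0]))
        waypoints.append((x, y_pair[1]))
        x += lateral_spacing
        upward = not upward
    return waypoints

route = serpentine_route(0, 0, 20, 60, 10)
# starts at the lower-left corner (0, 0) and ends at the upper-right corner (20, 60)
```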
For example, for the flight route shown in fig. 4A, with reference to the up, down, left and right directions shown in fig. 4A, the first direction is the down direction, the second direction is the up direction, the third direction is the right direction, and the fourth direction is the left direction. The resulting effective shooting area in the front shooting direction is the area 51 shown in fig. 5A, and the area obtained by removing the area 51 from the extended shooting area 20 in fig. 5A is the ineffective shooting area in the front shooting direction; the effective shooting area in the rear shooting direction is the area 52 shown in fig. 5B, and the area obtained by removing the area 52 from the extended shooting area 20 in fig. 5B is the ineffective shooting area in the rear shooting direction; the effective shooting area in the left shooting direction is the area 53 shown in fig. 5C, and the area obtained by removing the area 53 from the extended shooting area 20 in fig. 5C is the ineffective shooting area in the left shooting direction; the effective shooting area in the right shooting direction is the area 54 shown in fig. 5D, and the area obtained by removing the area 54 from the extended shooting area 20 in fig. 5D is the ineffective shooting area in the right shooting direction.
For the flight route shown in fig. 4B, with reference to the up, down, left and right directions shown in fig. 4B, the first direction is the up direction, the second direction is the down direction, the third direction is the left direction, and the fourth direction is the right direction. The resulting effective shooting area in the front shooting direction is the area 52 shown in fig. 5B, and the area obtained by removing the area 52 from the extended shooting area 20 in fig. 5B is the ineffective shooting area in the front shooting direction; the effective shooting area in the rear shooting direction is the area 51 shown in fig. 5A, and the area obtained by removing the area 51 from the extended shooting area 20 in fig. 5A is the ineffective shooting area in the rear shooting direction; the effective shooting area in the left shooting direction is the area 54 shown in fig. 5D, and the area obtained by removing the area 54 from the extended shooting area 20 in fig. 5D is the ineffective shooting area in the left shooting direction; the effective shooting area in the right shooting direction is the area 53 shown in fig. 5C, and the area obtained by removing the area 53 from the extended shooting area 20 in fig. 5C is the ineffective shooting area in the right shooting direction.
For the flight routes shown in fig. 4A and 4B, the effective shooting area in the nadir shooting direction is the region to be shot 10, and the area obtained by removing the region to be shot 10 from the extended shooting area 20 in fig. 4A and 4B is the ineffective shooting area in the nadir shooting direction. In addition, D1 in figs. 5A to 5D is the second preset distance, which here is equal to the first preset distance. It will be appreciated that the flight route is not limited to those shown in fig. 4A and 4B and may be set otherwise.
The determination method of the flight route can be selected according to the need, and the determination process of the flight route includes, but is not limited to, the following steps:
(1) Determining the lateral distance between two adjacent sub-routes in the flight route according to the preset ground resolution, a preset side overlap ratio and the number of pixels of the shooting device mounted on the unmanned aerial vehicle in the direction perpendicular to the flight direction of the unmanned aerial vehicle (namely, the number of pixels of the image sensor of the shooting device in the direction perpendicular to the flight direction of the unmanned aerial vehicle);
By way of example, the lateral distance D_route may be calculated as:
D_route = GSD × (1 - γ_lateral) × n_H (3);
In formula (3), GSD is the ground resolution, γ_lateral is the side overlap ratio, and n_H is the number of pixels of the image sensor of the shooting device, mounted on the unmanned aerial vehicle, in the direction perpendicular to the flight direction of the unmanned aerial vehicle.
It will be appreciated that the calculation of the lateral distance D_route is not limited to formula (3); other methods may be used.
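Formula (3) for the lateral distance translates directly into code; the sample GSD, overlap and pixel-count values below are assumptions:

```python
def lateral_distance(gsd_m: float, side_overlap: float, n_pixels_across: int) -> float:
    """D_route = GSD * (1 - side overlap ratio) * n_H, as in formula (3)."""
    return gsd_m * (1.0 - side_overlap) * n_pixels_across

# Assumed example: 2 cm/px GSD, 70 % side overlap, 4000 px across-track sensor.
d_route = lateral_distance(0.02, 0.70, 4000)  # ≈ 24.0 metres
```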
Taking an orthographic image (the shooting direction of the shooting device is the nadir shooting direction) as an example, as shown in fig. 6A, since shooting point 1 and shooting point 2 lie on the same sub-route, the overlap ratio in the flight direction between the image captured by the shooting device at shooting point 1 and the image captured at shooting point 2 is called the heading overlap ratio. The overlap ratio, in the direction perpendicular to the flight direction, between the image captured at shooting point 1 and the image captured at shooting point 12 on the adjacent sub-route is called the side overlap ratio.
(2) Determining the flight route according to the second position information and the lateral distance.
That is, route planning is performed within the extended shooting area, with the lateral distance between adjacent sub-routes in the flight route being the lateral distance determined in step (1).
During shooting by the unmanned aerial vehicle, the photos taken along the route need to maintain a certain overlap ratio so that they can be applied in fields such as surveying and mapping. The side overlap ratio may be a default value or may be set by the user. For example, when the side overlap ratio is set by the user, it can be input through the control device of the unmanned aerial vehicle; this way of determining the side overlap ratio can meet different user requirements and has high flexibility. Optionally, the side overlap ratio is greater than or equal to 65% and less than or equal to 80%; illustratively, it is 65%, 70%, 75%, 80%, or another value within this range.
It will be appreciated that the flight route may be planned in other manners. As an example, referring to fig. 6B, the flight route may be a cross route comprising two routes (routes 60 and 70 in fig. 6B) whose sub-routes are perpendicular to each other. Each route completes the acquisition of oblique images in two or three shooting directions: one route acquires the leftward and rightward images (or the leftward, rightward and orthographic images), and the other route acquires the forward and backward images (or the forward, backward and orthographic images). Where only the leftward, rightward and orthographic images, or only the forward, backward and orthographic images, need to be shot, a single-route flight path may be planned accordingly. The lateral distance of the cross route is equal to the lateral distance D_route of the above embodiment.
In the embodiments of the present application, each shooting direction corresponds to a preset target attitude of the cradle head; that is, when the cradle head reaches the preset target attitude, the shooting device is in the corresponding shooting direction.
In S203, a shooting sequence corresponding to each waypoint on the flight route is determined according to the third position information and the preset flight route of the unmanned aerial vehicle, where each shooting sequence includes one or more consecutive shooting points, the shooting points within a shooting sequence have mutually different shooting directions, and the shooting points of each shooting direction all lie within the effective shooting area of that direction.
Taking the case where the shooting directions include the front, rear, left, right and nadir shooting directions as an example: if a shooting sequence is located on a part of the flight route outside the effective shooting area of the front shooting direction, the sequence contains no shooting point in the front shooting direction; if it is located outside the effective shooting area of the rear shooting direction, it contains no shooting point in the rear shooting direction; if it is located outside the effective shooting area of the left shooting direction, it contains no shooting point in the left shooting direction; if it is located outside the effective shooting area of the right shooting direction, it contains no shooting point in the right shooting direction; and if it is located outside the effective shooting area of the nadir shooting direction, it contains no shooting point in the nadir shooting direction. It can be appreciated that in the embodiments of the present application, each shooting sequence is located on a part of the flight route within the effective area of at least one of the front, rear, left, right and nadir shooting directions; that is, the position of each shooting sequence on the flight route lies in at least one effective shooting area, and each shooting sequence includes a shooting point of at least one shooting direction.
In the embodiments of the present application, the number of shooting points in each shooting sequence is positively correlated with the number of effective shooting areas containing the sequence's position on the flight route. Taking the embodiments shown in fig. 4A and fig. 4B as an example, the position of a shooting sequence on the flight route may lie in at least one of region 1, region 2, region 3 and region 4.
Region 1 is the overlapping area of the effective areas of all 5 shooting directions, that is, the overlap of the effective shooting areas of the front, rear, left, right and nadir shooting directions. Region 2 is an overlapping area of the effective areas of 4 shooting directions and includes 4 such overlaps: the overlap of the effective shooting areas of the front, left, right and nadir directions; of the rear, left, right and nadir directions; of the front, rear, left and nadir directions; and of the front, rear, right and nadir directions. Region 3 is an overlapping area of the effective areas of 3 shooting directions and likewise includes 4 overlaps: the overlap of the effective shooting areas of the front, left and nadir directions; of the front, right and nadir directions; of the rear, left and nadir directions; and of the rear, right and nadir directions. Region 4 is an effective shooting area of a single shooting direction and includes 4 independent effective shooting areas (which do not overlap the effective shooting areas of the other shooting directions), namely the effective shooting areas of the front, rear, left and right shooting directions.
If the position of the shooting sequence on the flight route is in the area 1, the number of shooting points in the shooting sequence is 5; if the position of the shooting sequence on the flight route is in the region 2, the number of shooting points in the shooting sequence is 4; if the position of the shooting sequence on the flight route is in the region 3, the number of shooting points in the shooting sequence is 3; if the position of the imaging sequence on the flight path is within the region 4, the number of imaging points in the imaging sequence is 1.
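The region-dependent sequence size described above follows directly from testing which effective areas contain a waypoint; the rectangles and sample point below are assumptions for illustration:

```python
def contains(rect, x, y):
    """Axis-aligned containment test; rect is (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = rect
    return xmin <= x <= xmax and ymin <= y <= ymax

def sequence_for_waypoint(point, effective_areas):
    """One shooting point per direction whose effective area contains the
    waypoint; the resulting count matches regions 1-4 (5, 4, 3 or 1 points)."""
    x, y = point
    return [d for d, rect in effective_areas.items() if contains(rect, x, y)]

# Assumed effective areas: region to be shot (0,0)-(100,60), shifts of 10.
areas = {
    "front": (0, -10, 100, 50),
    "rear":  (0, 10, 100, 70),
    "left":  (10, 0, 110, 60),
    "right": (-10, 0, 90, 60),
    "nadir": (0, 0, 100, 60),
}
print(sequence_for_waypoint((50, 30), areas))  # all five areas contain (50, 30)
```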
It can be appreciated that in the embodiments of the present application the shooting sequences are ordered, and the order of the shooting sequences is consistent with the order in which the unmanned aerial vehicle passes their positions on the flight route when flying along that route.
When the unmanned aerial vehicle subsequently performs oblique photography according to the planned shooting sequences, timed shooting or fixed-distance shooting may be used to trigger the cradle head and the shooting device. Optionally, the duration required for the shooting device to complete each shooting sequence is fixed, or the interval between adjacent shooting sequences is fixed, so that the shooting duration of, or the distance between, two shooting sequences is more stable. In some embodiments, the duration required for the shooting device to complete each shooting sequence is a first fixed duration, so that the unmanned aerial vehicle can trigger the cradle head and the shooting device by timed shooting. Optionally, when performing oblique photography according to the planned shooting sequences, the unmanned aerial vehicle sends a timed-shooting trigger signal to the cradle head before the cradle head controls the shooting device to complete the first shooting point of the first shooting sequence; after receiving the trigger signal, the cradle head completes the shooting sequences one after another, triggering the shooting device on a timer based on the first fixed duration. With this timed trigger mode the unmanned aerial vehicle only needs to send the trigger signal once, so control of the unmanned aerial vehicle is simpler. Of course, the duration required for the shooting device to complete each shooting sequence need not be a fixed duration.
Further optionally, the duration required for the photographing device to complete photographing of the adjacent photographing points in the same photographing sequence is a second fixed duration, so that the photographing interval duration between the adjacent photographing points in each photographing sequence is stable. Of course, the time period required for the photographing device to complete photographing of the adjacent photographing points in the same photographing sequence may not be a fixed time period.
It will be appreciated that, for the same shooting sequence, the first fixed duration is longer than the second fixed duration. Both can be set as required; for example, the first fixed duration is 10 seconds and the second fixed duration is 2 seconds, though other values may of course be used.
In some embodiments, the interval between adjacent shooting sequences is a first fixed distance, so that the unmanned aerial vehicle can trigger the cradle head and the shooting device by fixed-distance shooting. Optionally, when performing oblique photography according to the planned shooting sequences, the unmanned aerial vehicle sends a fixed-distance shooting trigger signal to the cradle head before the cradle head controls the shooting device to complete the first shooting point of each shooting sequence. Each time the cradle head receives such a trigger signal, it first switches attitude so that the shooting device on the cradle head is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point of the corresponding shooting sequence, and shooting is triggered once the shooting device is in that direction. With this fixed-distance trigger mode the distance between two shooting sequences is more stable.
Further optionally, the distance between adjacent shooting points in the same shooting sequence is a second fixed distance. The unmanned aerial vehicle may send a fixed-distance shooting trigger signal to the cradle head before the cradle head controls the shooting device to complete each shooting point; each time the cradle head receives the trigger signal, it first switches attitude so that the shooting device on the cradle head is in the shooting direction corresponding to the shooting point when the unmanned aerial vehicle reaches it, and the shooting device is then triggered once it is in that direction, so that the distance between two adjacent shooting points in each shooting sequence is more stable.
The first fixed distance and the second fixed distance can be set as required; for example, the first fixed distance is 10 meters and the second fixed distance is 2 meters, though other values may of course be used.
In some embodiments, the unmanned aerial vehicle triggers the cradle head and the shooting device by fixed-distance shooting between sequences, while within each sequence the cradle head completes the sweep on a timer. That is, when the unmanned aerial vehicle reaches each waypoint, the cradle head is triggered to enter the shooting program; after entering the shooting program, the cradle head triggers the shooting device at each shooting point on a timer. For example, the interval between adjacent shooting sequences (that is, between adjacent waypoints) is a third fixed distance, and the duration required for the shooting device to complete adjacent shooting points in the same sequence is a third fixed duration; both can be set as required, for example a third fixed distance of 10 meters and a third fixed duration of 2 seconds.
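The hybrid scheme just described (fixed distance between waypoints, fixed time between shots within a sequence) yields a simple trigger schedule; the spacing, speed and shot count below are assumed values, as is the assumption of constant ground speed:

```python
def trigger_times(n_sequences, waypoint_spacing_m, speed_mps, intra_shot_s, shots_per_seq):
    """Shot-trigger times in seconds from route start, one list per sequence.
    Assumes constant ground speed and that each sequence finishes before
    the next waypoint is reached."""
    schedule = []
    for i in range(n_sequences):
        t0 = i * waypoint_spacing_m / speed_mps  # arrival time at waypoint i
        schedule.append([t0 + k * intra_shot_s for k in range(shots_per_seq)])
    return schedule

# Assumed: 10 m between waypoints, 1 m/s, 2 s between shots, 5-shot sequences.
times = trigger_times(2, 10.0, 1.0, 2.0, 5)
# times[0] == [0.0, 2.0, 4.0, 6.0, 8.0]; times[1] begins at 10.0 s
```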
Optionally, the initial shooting point (i.e., the first shooting point of the first shooting sequence) is the initial flight position of the unmanned aerial vehicle when flying along the flight route; alternatively, the initial shooting point is the initial waypoint of the flight route. The initial flight position and the initial waypoint may be the same position or different positions, and either of the above embodiments may be selected as required. It will be appreciated that the manner of determining the initial shooting point is not limited to those listed above; other ways may be used.
Unless otherwise specified, the execution body of the shooting control method of the above embodiments may be a control device of the unmanned aerial vehicle, which may be any device capable of controlling the unmanned aerial vehicle, such as a remote controller, a mobile phone, a computer or a smart wearable device. The execution body may also be the unmanned aerial vehicle itself, for example its flight controller, another controller provided on the unmanned aerial vehicle, or a combination of the two. The execution body may also be a combination of the control device of the unmanned aerial vehicle and the unmanned aerial vehicle: for example, the acquisition of the first position information and second position information and the planning of the flight route are executed by the control device, while the determination of the effective shooting areas and of the shooting sequences is executed by the unmanned aerial vehicle; or the acquisition of the first and second position information is executed by the control device, while the planning of the flight route and the determination of the effective shooting areas and shooting sequences are executed by the unmanned aerial vehicle; or only the acquisition of the first position information is executed by the control device, while the determination of the second position information, the planning of the flight route and the determination of the effective shooting areas and shooting sequences are executed by the unmanned aerial vehicle; or all of the above may be carried out by the unmanned aerial vehicle. Of course, the execution body is not limited to the control device and/or the unmanned aerial vehicle; it may also be another control device or electronic device independent of the unmanned aerial vehicle, such as a control device of the cradle head or of the shooting device.
In some embodiments, the execution body of the method of the above embodiments is the control device of the unmanned aerial vehicle. When the control device acquires the second position information of the extended shooting area, specifically, it determines the second position information from the first position information. Optionally, the planning of the flight route is also performed in the control device; illustratively, the flight route of the unmanned aerial vehicle is planned according to the second position information, for which reference may be made to the description of the corresponding parts of the above embodiments, not repeated here. Further, the shooting control method further includes: sending the flight route to the unmanned aerial vehicle and, after determining the shooting sequence corresponding to each waypoint on the flight route according to the third position information and the preset flight route, sending the shooting sequence corresponding to each waypoint to the unmanned aerial vehicle, so that the unmanned aerial vehicle controls the shooting device mounted on it to shoot based on the flight route and the shooting sequences corresponding to the waypoints. In other words, before the unmanned aerial vehicle performs oblique photography, the flight route is sent to it by the control device, and the unmanned aerial vehicle performs oblique photography while executing the flight route. It will be appreciated that the planning of the flight route may also take place in the unmanned aerial vehicle.
In some embodiments, the execution body of the shooting control method of the above embodiments is the unmanned aerial vehicle. For example, the unmanned aerial vehicle may plan the above shooting sequences before performing oblique photography, with the shooting points of each shooting direction in the planned sequences located within the effective shooting area of that direction. Alternatively, each shooting sequence initially includes shooting points of all shooting directions, and during oblique photography the unmanned aerial vehicle removes the shooting points located in ineffective shooting areas from the currently executed shooting sequence according to its real-time position information and the shooting points of that sequence.
The following describes the shooting control method taking the unmanned aerial vehicle as the execution body.
The first position information may be sent by the control device of the unmanned aerial vehicle, and the user inputs the first position information through the control device of the unmanned aerial vehicle, and the control device of the unmanned aerial vehicle sends the first position information to the unmanned aerial vehicle. The user may input the first position information to the control device of the unmanned aerial vehicle through a manual dotting manner or through an external file, and specifically, reference may be made to descriptions of corresponding parts in the above embodiments, which are not repeated herein. It can be appreciated that if the unmanned aerial vehicle itself is provided with the input module, the first position information may also be directly input into the unmanned aerial vehicle by the user operating the input module.
Different strategies can be adopted to acquire the second position information. In some embodiments, the second position information is sent by the control device of the unmanned aerial vehicle; optionally, the control device determines the second position information of the extended shooting area according to the first position information and then sends it to the unmanned aerial vehicle. In other embodiments, the second position information is determined by the unmanned aerial vehicle itself, specifically, according to the first position information. The implementation of determining the second position information of the extended shooting area from the first position information may be referred to the description of the corresponding parts of the above embodiments, not repeated here.
The planning of the flight route may be performed by the control device of the unmanned aerial vehicle or by the unmanned aerial vehicle itself. Illustratively, in some embodiments the flight route is planned by the control device based on the second position information and then transmitted to the unmanned aerial vehicle; in other embodiments the flight route is planned by the unmanned aerial vehicle based on the second position information. Flight route planning is described in the corresponding parts of the above embodiments and is not repeated here.
The shooting control method of the embodiment of the application may further include: controlling the shooting device mounted on the unmanned aerial vehicle to shoot based on the shooting sequence corresponding to each waypoint.
Next, a process of controlling a photographing device mounted on the unmanned aerial vehicle to photograph based on a flight route and a photographing sequence of the unmanned aerial vehicle will be described in detail.
Fig. 7 is a schematic diagram of an implementation manner of controlling a shooting device mounted on an unmanned aerial vehicle to shoot based on a flight route and a shooting sequence in an embodiment of the present application. Referring to fig. 7, the process of controlling the photographing device mounted on the unmanned aerial vehicle to perform photographing based on the flight path and the photographing sequence may include steps S701 to S703.
In S701, controlling the unmanned aerial vehicle to fly according to a flight route;
in S702, according to the shooting sequence, in the process that the unmanned aerial vehicle flies from the current shooting point to the next shooting point, controlling the pan-tilt on the unmanned aerial vehicle to switch the gesture, so that the shooting device on the pan-tilt is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point;
in S703, an image captured by the capturing device at each capturing point is acquired.
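The three steps S701 to S703 can be read as a simple control loop: fly the route, switch the gimbal attitude between shooting points, and collect one image per point. The sketch below is illustrative only; the `Gimbal` and `Camera` classes and all method names are hypothetical placeholders, not part of the claimed method.

```python
# Illustrative sketch of steps S701-S703. All classes and method
# names here are hypothetical stand-ins for the real flight stack.

class Gimbal:
    def __init__(self):
        self.direction = "initial"

    def point_to(self, direction):
        # S702: switch attitude so the camera faces `direction`.
        self.direction = direction

class Camera:
    def capture(self, point, direction):
        # S703: take one image at this shooting point.
        return (point, direction)

def execute_route(route, sequences, gimbal, camera):
    """route: list of waypoints; sequences[i]: list of
    (shooting_point, shooting_direction) pairs for waypoint i."""
    images = []
    for waypoint, sequence in zip(route, sequences):
        # S701: the UAV flies toward `waypoint` without hovering
        # (flight dynamics are not modeled in this sketch).
        for point, direction in sequence:
            gimbal.point_to(direction)   # finish switching before arrival
            images.append(camera.capture(point, direction))
    return images
```

The point of the structure is that attitude switching happens while the aircraft is in transit between shooting points, so no hover is ever required.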
According to the embodiment of the application, while the unmanned aerial vehicle flies from the current shooting point to the next, the gimbal carrying the shooting device is controlled to switch attitude so that the shooting device is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point; the unmanned aerial vehicle therefore does not need to hover during shooting, which improves shooting efficiency and is particularly suitable for map surveying. In addition, a single shooting device is controlled by the gimbal to shoot images in a plurality of shooting directions in a time-shared manner; compared with a conventional multi-lens camera array, this greatly reduces the payload weight, so that a smaller and lighter unmanned aerial vehicle can be chosen to carry the shooting device, reducing the cost of use.
In the embodiment of the application, shooting does not affect the flight of the unmanned aerial vehicle: while the shooting device shoots, the unmanned aerial vehicle continues to execute the flight route and does not hover for the shooting action, which further improves shooting efficiency.
The flight route of the embodiment of the application may include a plurality of waypoints. The flight route may be preset by a user: optionally, the user inputs the position information of each waypoint through the control device of the unmanned aerial vehicle, and the unmanned aerial vehicle connects the waypoints in the input order to form the flight route. When the user wants to update the positions of some waypoints in a set flight route, the control device can be operated to modify their position information; this modification may be performed before the unmanned aerial vehicle takes off or during flight. It will be appreciated that the flight route may also be a default flight route.
The positional relation between waypoints and shooting points can be chosen as needed. In some embodiments, shooting points are arranged between adjacent waypoints; this exploits the fact that the flight time between waypoints is longer than the time the shooting device needs to capture an image, and inserting shots of different directions between waypoints yields higher shooting efficiency. In other embodiments, some of the waypoints serve as shooting points, with or without additional shooting points between adjacent waypoints; in still other embodiments, all waypoints serve as shooting points, again with or without shooting points between adjacent waypoints. It can be understood that each shooting point corresponds to one shot, one shooting direction and one preset target attitude of the gimbal, yielding one captured image.
In the related art, when a fixed-wing or rotary-wing unmanned aerial vehicle is used to shoot at multiple angles, speed or efficiency constraints usually lead to designing multiple flight routes, for example five, each corresponding to one shooting direction, so as to acquire the forward, backward, leftward, rightward and orthographic images of the area to be shot respectively; the unmanned aerial vehicle must then be controlled to fly all five routes, which is unfavorable for its endurance. In contrast, the embodiment of the application uses a single flight route, which may be the route shown in fig. 4A and 4B or another route, and during one flight the gimbal on the unmanned aerial vehicle is controlled to switch attitude so as to shoot at a plurality of shooting angles; no repeated flights are required, which helps improve shooting efficiency and reduces the energy consumption of the unmanned aerial vehicle.
The process of controlling the unmanned aerial vehicle to fly according to the flight route may include: controlling the real-time height between the lens of the shooting device and the area to be shot to stay within a preset height range. When the unmanned aerial vehicle is used for surveying, terrain undulation makes the ground sampling distance (GSD) uneven, so uniformity of the GSD is maintained by controlling the height between the lens and the ground: if the terrain rises, the unmanned aerial vehicle climbs; if the terrain falls, the unmanned aerial vehicle descends, keeping the GSD approximately equal throughout the survey. Since the climb and descent capability of a fixed-wing unmanned aerial vehicle is limited, it can only be commanded to climb or descend within that capability, keeping the GSD as consistent as possible.
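The terrain-following behaviour just described can be sketched as a clamped altitude command, using the standard pinhole approximation GSD = height × pixel size / focal length. All parameter values and function names below are illustrative assumptions, not taken from the patent.

```python
# Sketch of terrain-following to keep the GSD roughly uniform: the UAV
# tracks a constant height above ground, but each altitude step is
# clamped to the aircraft's climb/descent limit (relevant for
# fixed-wing platforms). All numbers are illustrative.

def commanded_altitude(terrain_m, target_agl_m, current_alt_m, max_step_m):
    desired = terrain_m + target_agl_m           # keep lens-to-ground height constant
    step = desired - current_alt_m
    step = max(-max_step_m, min(max_step_m, step))  # limited climb/descent
    return current_alt_m + step

def gsd_m(height_m, pixel_size_m, focal_length_m):
    # Pinhole approximation: GSD grows linearly with height above ground.
    return height_m * pixel_size_m / focal_length_m
```

For example, with a 2.4 µm pixel pitch and a 9.6 mm focal length, a 100 m height above ground gives a GSD of 2.5 cm, matching the worked example later in this section.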
According to the embodiment of the application, the unmanned aerial vehicle is not required to hover at the shooting points; to ensure that the gimbal has finished the previous shooting sequence before each new sequence is triggered, the flight speed must be kept within the maximum flight speed allowed by the unmanned aerial vehicle. This maximum flight speed can be calculated from the rotation performance of the gimbal. Optionally, the maximum flight speed allowed by the unmanned aerial vehicle is determined based on the heading distance of each shooting direction and the time the shooting device needs to complete one shooting sequence and return to the initial shooting direction, where the heading distances of all shooting directions are equal and are determined based on the preset ground resolution, the preset heading overlap rate and the number of pixels of the shooting device in the direction parallel to the flight direction of the unmanned aerial vehicle (that is, the number of pixels of its image sensor in that direction).
Illustratively, the maximum flight speed V_max is calculated as:

V_max = D_2 / T_Gim (4);

in formula (4), D_2 is the heading distance of each shooting direction, and T_Gim is the time the shooting device needs to complete one shooting sequence and return to the initial shooting direction. It should be understood that the calculation of the maximum flight speed V_max is not limited to formula (4); other methods may be used.
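The speed cap follows directly from the constraint that the aircraft must not cover more than one heading distance while the gimbal finishes a sequence; a one-line sketch, with illustrative numbers:

```python
def max_flight_speed(heading_distance_m, sequence_time_s):
    # The UAV must not travel more than one heading distance D_2 in the
    # time T_Gim the gimbal needs to complete a shooting sequence and
    # return to the initial shooting direction.
    return heading_distance_m / sequence_time_s

# e.g. D_2 = 40.95 m and T_Gim = 4.5 s (illustrative values) cap the
# flight speed at 9.1 m/s.
```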
Illustratively, the relative positions of the shooting directions of two shooting sequences are shown in fig. 8: F1 and F2 are the effective shooting areas in the front shooting direction, D1 and D2 are the effective shooting areas in the orthographic shooting direction, B1 and B2 are the effective shooting areas in the rear shooting direction, R1 and R2 are the effective shooting areas in the right shooting direction, and L1 and L2 are the effective shooting areas in the left shooting direction; D_F, D_D, D_B, D_R and D_L are the heading distances of the front, orthographic, rear, right and left shooting directions respectively. In the embodiment of the application, D_2 = D_F = D_D = D_B = D_R = D_L.
Illustratively, the heading distance D_F in the front shooting direction is calculated as:

D_F = GSD·(1 − γ_course)·n_V (5);

in formula (5), γ_course is the preset heading overlap rate and n_V is the number of pixels of the shooting device in the direction parallel to the flight direction of the unmanned aerial vehicle. It should be understood that the calculation of the heading distance D_F is not limited to formula (5); other methods may be used.
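Formula (5) transcribes directly into code; the values below reproduce the worked example that follows in this section (2.5 cm GSD, 70% overlap, 5460 pixels).

```python
def heading_distance_m(gsd_m, overlap, n_pixels):
    # Formula (5): D_F = GSD * (1 - overlap_rate) * n_V,
    # with GSD expressed in metres per pixel.
    return gsd_m * (1.0 - overlap) * n_pixels

# 0.025 m/px * (1 - 0.70) * 5460 px = 40.95 m between forward images
```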
The heading overlap rate may be a default value or set by the user, for example input through the control device of the unmanned aerial vehicle; letting the user set it accommodates different requirements and offers high flexibility. To ensure that the images in each shooting direction meet modeling requirements, the heading overlap rate is optionally between 65% and 80% inclusive; illustratively, it is 65%, 70%, 75%, 80% or another value in that range.
In the following, the effect of systematic error on the heading overlap rate is analyzed, taking the forward image as an example.
Assume a ground resolution GSD of 2.5 cm, a heading overlap rate γ_course of 70%, a pixel count n_V of 5460 in the direction parallel to the flight direction, and a flight speed of 10 m/s. According to formula (5), the theoretical distance between two successive forward images is:

2.5 cm × (1 − 70%) × 5460 = 40.95 m;

due to factors such as fluctuation of the gimbal rotation speed and fluctuation of the system delay, suppose the error between the actual and theoretical shooting times of the second forward image is 0.5 s (a delay); the actual distance between the two forward images is then:

40.95 m + 10 m/s × 0.5 s = 45.95 m;

substituting D_F = 45.95 m, GSD = 2.5 cm and n_V = 5460 into formula (5) shows that the actual heading overlap rate of the forward images is about 66%, which still satisfies the modeling requirement.
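The worked example can be checked by inverting formula (5) to recover the overlap rate from the actual image spacing; a sketch:

```python
def actual_overlap(actual_distance_m, gsd_m, n_pixels):
    # Invert formula (5): overlap = 1 - D / (GSD * n_V).
    return 1.0 - actual_distance_m / (gsd_m * n_pixels)

# 0.5 s of trigger delay at 10 m/s stretches the 40.95 m theoretical
# spacing to 45.95 m, reducing the overlap from 70% to roughly 66%.
spacing = 40.95 + 10.0 * 0.5
```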
Because the effective shooting areas parallel to the heading in the front and rear shooting directions are longer than those in the orthographic, right and left shooting directions, the heading overlap rate of the images in the front and rear shooting directions is less sensitive to errors between the actual and theoretical shooting times. By controlling parameters such as the flight speed, the flight direction and the rotation speed of the gimbal (which requires system optimization), or by increasing the heading overlap rate (which affects overall operating efficiency), the heading overlap rate of the images in all shooting directions can be guaranteed to meet modeling requirements.
The initial shooting direction may be the shooting direction of one of the shooting points in the shooting sequence; illustratively, it is the shooting direction of the first shooting point of the sequence. Optionally, the first shooting points of all shooting sequences have the same shooting direction, for example the orthographic shooting direction. Alternatively, the shooting directions of the first shooting points of the sequences differ at least in part: for example, the first shooting point of shooting sequence 1 is in the left shooting direction, that of shooting sequence 2 in the right shooting direction, that of shooting sequence 3 in the left shooting direction, and so on. T_Gim may be the time the shooting device needs to complete the current shooting sequence and return to the initial shooting direction of the next sequence, which suits both the case where all sequences start in the same direction and the case where they start in partly different directions; of course, T_Gim may also be the time needed to complete the current sequence and return to the initial shooting direction of the current sequence, which suits the case where all sequences start in the same shooting direction.
Controlling the gimbal on the unmanned aerial vehicle to switch attitude so that the shooting device is in the corresponding shooting direction at each shooting point may include: acquiring the real-time attitude of the unmanned aerial vehicle; determining the deviation between the real-time attitude and the shooting direction of the next shooting point; and controlling the gimbal to switch attitude according to that deviation, so that the shooting device is in the corresponding shooting direction at each shooting point. In the embodiment of the application the gimbal is mounted on the body of the unmanned aerial vehicle; when the attitude of the body changes substantially, shooting in the same direction at different waypoints (shooting sequences) can still be achieved by controlling the attitude of the gimbal, which keeps the gimbal's angle to the ground consistent (or only slightly deviating) and keeps the overlap rate of the photo sequences in the same direction uniform.
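One way to read the deviation-compensation step is as a simple angle correction: the gimbal command is the desired world-frame camera angle minus the body's real-time attitude, wrapped to the shortest rotation. The angle convention and function name below are assumptions for illustration only.

```python
def gimbal_yaw_command(desired_world_yaw_deg, body_yaw_deg):
    # Compensate the body attitude so the camera's ground angle stays
    # consistent across waypoints: command = desired - body, wrapped
    # into (-180, 180] so the gimbal takes the shortest rotation.
    delta = (desired_world_yaw_deg - body_yaw_deg) % 360.0
    return delta - 360.0 if delta > 180.0 else delta
```

The same correction applies per axis (yaw shown here); in practice the real-time attitude would come from the flight controller's state estimate.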
The unmanned aerial vehicle and the gimbal can cooperate in different ways to complete shooting in multiple shooting directions. The process of controlling the gimbal to switch attitude according to the shooting sequence while the unmanned aerial vehicle flies from the current shooting point to the next may include: sending a shooting trigger signal to the gimbal according to the shooting sequence, so that the gimbal switches attitude during the flight from the current shooting point to the next, placing the shooting device in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point. The shooting trigger signal also instructs the gimbal to trigger the shooting device when the shooting device is in the corresponding shooting direction. In this embodiment the unmanned aerial vehicle triggers the gimbal to enter the procedure of executing the shooting sequence, which comprises two steps, attitude switching and shooting triggering, both completed by the gimbal; this reduces the impact of the delay in the unmanned aerial vehicle's trigger-signal processing on operating efficiency. Attitude switching here means placing the shooting device in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point.
Of course, the unmanned aerial vehicle can also directly control the gimbal to switch attitude and/or directly trigger the shooting device to shoot.
In the embodiment of the present application, the process by which the gimbal and the shooting device complete shooting may include: the gimbal controls the shooting device to complete each shooting sequence. Specifically, the gimbal switches attitude according to the shooting sequence so that the shooting device is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point of the sequence, and the gimbal triggers the shooting device to shoot when it is in that direction.
The shooting trigger signal may be a timed shooting trigger signal or a fixed-distance shooting trigger signal; that is, the unmanned aerial vehicle can trigger the gimbal and the shooting device to complete the shooting process in a timed-shooting or fixed-distance-shooting trigger mode.
For example, in some embodiments the shooting trigger signal is a timed shooting trigger signal instructing the gimbal to trigger the shooting device based on a first timing policy. The first timing policy may include: the time the shooting device needs to complete each shooting sequence is a first fixed duration, so that this time is stable. Optionally, the timed shooting trigger signal is sent to the gimbal once, for example before the gimbal controls the shooting device to shoot the first shooting point of the first shooting sequence; after receiving it, the gimbal rotates in turn to each shooting direction of every shooting sequence and triggers the shooting device on a timed basis. In this mode the unmanned aerial vehicle only needs to send the signal once, so its control is relatively simple. It can be understood that the unmanned aerial vehicle may also send the timed shooting trigger signal multiple times, for example once before the gimbal controls the shooting device to shoot the first shooting point of each shooting sequence; on each reception, the gimbal rotates in turn to the shooting directions of the corresponding sequence and triggers the shooting device on a timed basis to complete the shots of that sequence. It will be appreciated that the first timing policy may also be otherwise.
Furthermore, the timed shooting trigger signal may also instruct the gimbal to trigger the shooting device based on a second timing policy, which includes: the time the shooting device needs between adjacent shooting points within the same shooting sequence is a second fixed duration, so that this time is stable. It will be appreciated that the second timing policy may also be otherwise.
In the timed trigger mode, different strategies can be used to send the shooting sequences to the gimbal, taking a plurality of shooting sequences as an example. In some embodiments, all shooting sequences are sent to the gimbal at once, before the gimbal controls the shooting device to shoot the first shooting point of the first sequence, so that the unmanned aerial vehicle need not send sequences again while the gimbal and shooting device complete the subsequent shots. In other embodiments, after the shooting device finishes the current shooting sequence, the next sequence is sent to the gimbal; that is, each time the gimbal finishes a sequence, the unmanned aerial vehicle sends the next one to instruct the gimbal to execute it.
In some embodiments the shooting trigger signal is a fixed-distance shooting trigger signal, which may instruct the gimbal to control the shooting device based on a first fixed-distance strategy: the interval between adjacent shooting sequences is a first fixed interval, so that this interval is stable. Optionally, the fixed-distance trigger signal is sent to the gimbal multiple times, for example once before the gimbal controls the shooting device to shoot the first shooting point of each shooting sequence; each time the gimbal receives it, the gimbal first switches attitude so that the shooting device is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point of the corresponding sequence, and then triggers the shooting device once it is in that direction. Because there is a delay when the unmanned aerial vehicle triggers the gimbal, triggering via the first fixed-distance strategy reduces the impact of that delay on operating efficiency.
Further, optionally, the fixed-distance shooting trigger signal may also instruct the gimbal to control the shooting device based on a second fixed-distance strategy: the distance between adjacent shooting points within the same shooting sequence is a second fixed distance, so that this distance is stable. In addition, because the shooting directions within each sequence are not necessarily identical, a fixed-distance shooting trigger signal may be sent to the gimbal before each individual shooting point, to trigger the gimbal to complete that point's shot. Optionally, the unmanned aerial vehicle uses the fixed-distance trigger mode to start the shooting process while the gimbal completes the oblique-shooting sequence on a timed basis: when the unmanned aerial vehicle reaches each waypoint it triggers the gimbal to enter the shooting procedure, after which the gimbal triggers the shooting device at each shooting point on a timer. The interval between adjacent shooting sequences (i.e. between adjacent waypoints) is then the first fixed interval, and the time the shooting device needs between adjacent shooting points within the same sequence is a third fixed duration, whose size can be set as required; illustratively, the third fixed duration is 2 seconds.
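The hybrid behaviour just described (a distance-triggered sequence start at each waypoint, then timed shots within the sequence) can be sketched as a schedule generator; the intervals and names are illustrative assumptions.

```python
def shot_schedule(waypoint_times_s, shots_per_sequence, intra_shot_s=2.0):
    # Each waypoint arrival (waypoints a fixed distance apart) starts a
    # shooting sequence; within a sequence the gimbal triggers the
    # camera on a timer (the "third fixed duration", e.g. 2 s).
    schedule = []
    for t0 in waypoint_times_s:
        schedule.append([t0 + i * intra_shot_s
                         for i in range(shots_per_sequence)])
    return schedule
```

At 10 m/s with waypoints 100 m apart, arrivals at t = 0 s and t = 10 s would each spawn a burst of timed shots 2 s apart.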
In the fixed-distance trigger mode, different strategies can likewise be used to send the shooting sequences to the gimbal. In an exemplary embodiment, each shooting sequence is sent to the gimbal before the gimbal controls the shooting device to shoot its first shooting point; that is, after the gimbal finishes each sequence and before it executes the next, the unmanned aerial vehicle sends the next sequence to the gimbal to instruct it to execute that sequence. In other embodiments, before the gimbal controls the shooting device to complete each shooting point, an indication signal is sent to the gimbal specifying the gimbal target attitude or the shooting direction corresponding to that point; that is, each shooting point individually triggers the gimbal to perform that point's shot.
In addition, in the embodiment of the application, the way the gimbal's attitude switching is controlled can be chosen according to the gimbal type; take a three-axis gimbal, configured to move about a yaw axis, a roll axis and a pitch axis, as an example. The switching may be achieved by controlling one or more of the gimbal's roll-axis, pitch-axis and yaw-axis attitudes. In general the yaw axis of a gimbal cannot rotate through a full circle, so the shooting direction at the different shooting points of a sequence cannot be controlled through the yaw-axis attitude alone; in the embodiment of the application, therefore, any two of the yaw, roll-axis and pitch-axis attitudes are controlled to switch the gimbal attitude.
Illustratively, the yaw attitude and pitch-axis attitude are controlled to switch the gimbal attitude. Optionally, with the gimbal target attitude written as (pitch, roll, yaw), the target attitudes corresponding to the front, orthographic, rear, right and left shooting directions are respectively: (-60°, 0°, 0°), (-90°, 0°, 0°), (-120°, 0°, 0°), (-60°, 0°, 90°), (-120°, 0°, 90°).
Illustratively, the yaw attitude and roll-axis attitude are controlled to switch the gimbal attitude. Optionally, with the gimbal target attitude written as (pitch, roll, yaw), the target attitudes corresponding to the front, orthographic, rear, right and left shooting directions are respectively: (-90°, 30°, 0°), (-90°, 0°, 0°), (-90°, -30°, 0°), (-90°, -30°, 90°), (-90°, 30°, 90°).
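The two attitude schemes can be tabulated as lookup maps. Note that the published text lists only four attitudes for the second scheme; the rear-direction entry (-90°, -30°, 0°) below is inferred by symmetry with the first scheme and should be read as an assumption.

```python
# (pitch, roll, yaw) gimbal target attitudes in degrees for the five
# shooting directions, per the two schemes described in the text.
SCHEME_PITCH_YAW = {
    "front": (-60, 0, 0), "ortho": (-90, 0, 0), "rear": (-120, 0, 0),
    "right": (-60, 0, 90), "left": (-120, 0, 90),
}
SCHEME_ROLL_YAW = {
    "front": (-90, 30, 0), "ortho": (-90, 0, 0),
    # Rear entry inferred by symmetry -- an assumption, not in the text:
    "rear": (-90, -30, 0),
    "right": (-90, -30, 90), "left": (-90, 30, 90),
}
```

A flight controller could select one scheme at mission start and feed the tuples to the gimbal as the sequence advances through the five directions.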
It can be understood that the above lists only two sets of gimbal target attitudes for the front, orthographic, rear, right and left shooting directions; shooting in similar directions can also be achieved by adjusting the rotation angles of the axes and their order of rotation, though with different operating efficiency.
For example, referring to fig. 9, the unmanned aerial vehicle flies along flight route 80; when it reaches the first shooting point of a shooting sequence (in fig. 9, for example, the shooting direction of that first point is the front shooting direction), the gimbal is triggered to execute the shots of that sequence. For convenience of description this sequence is called shooting sequence A; its shooting directions include the front, orthographic, rear, right and left shooting directions. The gimbal executes the shots of shooting sequence A as follows:
(1) After the gimbal detects that the shooting device has finished the last shooting point of the previous shooting sequence, the gimbal rotates to the first target attitude, corresponding to the front shooting direction of shooting sequence A, and then triggers the shooting device to shoot, obtaining a forward image;

(2) after the shooting device finishes the forward image of shooting sequence A, the gimbal continues rotating to the second target attitude, corresponding to the orthographic shooting direction of sequence A, and then triggers the shooting device to shoot, obtaining an orthographic image;

(3) after the shooting device finishes the orthographic image of shooting sequence A, the gimbal continues rotating to the third target attitude, corresponding to the rear shooting direction of sequence A, and then triggers the shooting device to shoot, obtaining a backward image;

(4) after the shooting device finishes the backward image of shooting sequence A, the gimbal continues rotating to the fourth target attitude, corresponding to the right shooting direction of sequence A, and then triggers the shooting device to shoot, obtaining a rightward image;

(5) after the shooting device finishes the rightward image of shooting sequence A, the gimbal continues rotating to the fifth target attitude, corresponding to the left shooting direction of sequence A, and then triggers the shooting device to shoot, obtaining a leftward image.
At this point, the cradle head has completed the shooting of shooting sequence A.
It can be understood that the first target attitude, the second target attitude, the third target attitude, the fourth target attitude, and the fifth target attitude are all preset target attitudes. Optionally, the first, second, third, fourth, and fifth target attitudes are respectively (-60°, 0°, 0°), (-90°, 0°, 0°), (-120°, 0°, 0°), (-60°, 0°, 90°), and (-120°, 0°, 90°); alternatively, the target attitudes may be (-90°, 30°, 0°), (-90°, 0°, 0°), (-90°, -30°, 90°), and (-90°, 30°, 90°).
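The five-step rotate-and-shoot loop above can be sketched as follows. The `gimbal.rotate_to` and `camera.shoot` calls are hypothetical placeholders for the actual cradle-head and camera interfaces, and the mapping of the first preset attitude list to the five shooting directions is inferred from steps (1)-(5), not stated explicitly in the text.

```python
# Preset target attitudes of the cradle head for shooting sequence A,
# taken from the first optional list above; the per-direction labels
# are an inferred mapping.
SEQUENCE_A = [
    (-60.0, 0.0, 0.0),    # (1) front shooting direction
    (-90.0, 0.0, 0.0),    # (2) ortho (nadir) shooting direction
    (-120.0, 0.0, 0.0),   # (3) rear shooting direction
    (-60.0, 0.0, 90.0),   # (4) right shooting direction
    (-120.0, 0.0, 90.0),  # (5) left shooting direction
]

def shoot_sequence(gimbal, camera, attitudes):
    """Rotate the cradle head to each preset target attitude in turn and
    trigger one shot per attitude, mirroring steps (1)-(5) above."""
    images = []
    for attitude in attitudes:
        gimbal.rotate_to(attitude)     # hypothetical cradle-head API
        images.append(camera.shoot())  # hypothetical camera API
    return images
```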
When the cradle head reaches a preset target attitude and is in a stable state, the shooting device is triggered to shoot. The cradle head being in a stable state may include: the fluctuation range of the angle of the cradle head relative to a preset direction (for example, the angle of the cradle head relative to the ground) is within a preset angle range, that is, the angle of the cradle head relative to the preset direction fluctuates only slightly.
In addition, before the cradle head triggers the shooting device to shoot, the shooting device needs to be in a state in which it can perform shooting. Optionally, this state includes: the control part of the shooting device is in a triggerable state; for example, if the shooting device is a camera, the control part may be the shutter of the camera. Optionally, this state includes: the free buffer capacity of the camera is greater than a preset capacity, that is, the buffer of the camera is large enough to store at least one more image.
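The two trigger preconditions just described (stable cradle-head attitude, camera able to shoot) can be combined into a simple gate. This is a minimal sketch: the 1.0° fluctuation threshold and the byte-based buffer check are illustrative assumptions, not values from the text.

```python
def attitude_is_stable(angle_samples, reference_deg, max_fluctuation_deg=1.0):
    """True when every recent sample of the cradle-head angle relative to
    the preset direction stays within the preset angle range.
    The 1.0-degree default is an assumed value."""
    return all(abs(a - reference_deg) <= max_fluctuation_deg for a in angle_samples)

def camera_can_shoot(shutter_triggerable, free_buffer_bytes, image_size_bytes):
    """True when the control part (shutter) is triggerable and the buffer
    can hold at least one more image."""
    return shutter_triggerable and free_buffer_bytes >= image_size_bytes

def may_trigger(angle_samples, reference_deg, shutter_triggerable,
                free_buffer_bytes, image_size_bytes):
    # Shoot only when the cradle head is stable AND the camera is ready.
    return (attitude_is_stable(angle_samples, reference_deg)
            and camera_can_shoot(shutter_triggerable, free_buffer_bytes,
                                 image_size_bytes))
```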
Optionally, each shooting sequence corresponds to an image queue, and the image of each shooting point in a shooting sequence is stored in the corresponding image queue. Alternatively, images in the same shooting direction are stored in the same image queue and images in different shooting directions are stored in different image queues. The storage mode of the images can be selected as needed.
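Both optional storage layouts can be modeled with keyed queues; `ImageStore` and its key names are illustrative, not part of the described method.

```python
from collections import defaultdict, deque

class ImageStore:
    """Store images either one queue per shooting sequence or one queue
    per shooting direction, matching the two options described above."""
    def __init__(self, group_by="sequence"):
        assert group_by in ("sequence", "direction")
        self.group_by = group_by
        self.queues = defaultdict(deque)

    def put(self, image, sequence_id, direction):
        # Pick the queue key according to the chosen storage mode.
        key = sequence_id if self.group_by == "sequence" else direction
        self.queues[key].append(image)
```

With `group_by="direction"`, for instance, all forward images land in one queue regardless of which shooting sequence produced them.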
In addition, it should be noted that the shooting control method of the embodiment of the present application does not require a two-axis or three-axis cradle head; the unmanned aerial vehicle can complete shooting in multiple shooting directions in cooperation with a single-axis cradle head. For example, shooting in three shooting directions can be achieved by a cradle head with a variable pitch attitude together with grid-shaped ("井"-shaped) route planning; shooting in three shooting directions can be achieved by a cradle head with a variable roll attitude together with grid-shaped route planning; and shooting in three or four shooting directions can be achieved by a cradle head with a variable yaw attitude together with the flight route shown in fig. 4A or 4B.
In addition, the embodiment of the present application further provides a shooting control method whose execution subject is an unmanned aerial vehicle; for example, the execution subject may be the flight controller of the unmanned aerial vehicle, another controller arranged on the unmanned aerial vehicle, or a combination of the flight controller and other controllers arranged on the unmanned aerial vehicle.
Referring to fig. 10, the photographing control method according to the embodiment of the present application may include S1001 to S1002:
in S1001, receiving a flight route sent by a control device of the unmanned aerial vehicle and a shooting sequence corresponding to each waypoint on the flight route, where each shooting sequence includes one or more continuous shooting points, the shooting directions of the one or more shooting points of each shooting sequence are different, the shooting points in each shooting direction are located in an effective shooting area in that shooting direction, the effective shooting area is determined according to first position information of the region to be shot and second position information of an extended shooting area, the extended shooting area is obtained by expanding the region to be shot, and the second position information is determined according to the first position information;
In S1002, an imaging device mounted on the unmanned aerial vehicle is controlled to perform imaging based on the flight route and the imaging sequence corresponding to each waypoint.
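A minimal sketch of S1001-S1002 on the unmanned aerial vehicle side follows. The data shapes (waypoint ids, a waypoint-to-directions mapping) and the three callbacks standing in for flight-controller, cradle-head, and camera commands are all assumptions for illustration.

```python
def shooting_control(route, sequences, fly_to, point_gimbal, shoot):
    """S1001: `route` and `sequences` are taken as already received from
    the control device (a list of waypoint ids, and a mapping from
    waypoint id to its list of shooting directions).
    S1002: fly the route and execute each waypoint's shooting sequence."""
    for waypoint in route:
        fly_to(waypoint)                # assumed flight-controller command
        for direction in sequences.get(waypoint, []):
            point_gimbal(direction)     # switch the cradle-head attitude
            shoot(waypoint, direction)  # trigger the shooting device
```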
The embodiment shown in fig. 10 can be understood with reference to the embodiments shown in fig. 2 to fig. 9, and is not described in detail again here.
Corresponding to the shooting control methods of the foregoing embodiments, the embodiment of the present application further provides a shooting control device. Referring to fig. 11, the shooting control device of the embodiment of the present application may include a storage device and one or more processors.
The storage device is used for storing program instructions, and stores a computer program of executable instructions of the shooting control method. The storage device may include at least one type of storage medium, including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. Also, the shooting control device may cooperate, through a network connection, with a network storage device that performs the storage function of the memory. The memory may be an internal storage unit of the shooting control device, such as a hard disk or an internal memory of the shooting control device; it may also be an external storage device of the shooting control device, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card provided on the shooting control device. Further, the memory may include both the internal storage unit and the external storage device of the shooting control device. The memory is used to store the computer program and other programs and data required by the device, and may also be used to temporarily store data that has been output or is to be output.
In some embodiments, the one or more processors, individually or collectively, invoke the program instructions stored in the storage device and, when the program instructions are executed, are configured to perform the following operations: acquiring first position information of a region to be shot and second position information of an extended shooting area, where the extended shooting area is obtained by expanding the region to be shot and the second position information is determined according to the first position information; determining third position information of effective shooting areas in different shooting directions according to the first position information and the second position information; and determining a shooting sequence corresponding to each waypoint on a preset flight route of the unmanned aerial vehicle according to the third position information and the flight route. Each shooting sequence includes one or more continuous shooting points, the shooting directions of the one or more shooting points of each shooting sequence are different, and the shooting points in each shooting direction are located in the effective shooting area in that shooting direction. The processor of this embodiment may implement the shooting control method of the embodiment shown in fig. 2 or fig. 7 of the present application, and the shooting control device of this embodiment can be understood with reference to that method.
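The relationship between the region to be shot, the extended shooting area, and the per-direction effective shooting areas (elaborated later in claims 2 and 4) can be sketched for an axis-aligned rectangle. The sign conventions of the four shift directions and the rectangle representation are illustrative assumptions.

```python
def expand(rect, first_preset_distance):
    """Extended shooting area: the (xmin, ymin, xmax, ymax) region to be
    shot expanded by a first preset distance on every side."""
    xmin, ymin, xmax, ymax = rect
    d = first_preset_distance
    return (xmin - d, ymin - d, xmax + d, ymax + d)

def _shift(rect, dx, dy):
    xmin, ymin, xmax, ymax = rect
    return (xmin + dx, ymin + dy, xmax + dx, ymax + dy)

def effective_areas(region, second_preset_distance):
    """Effective shooting area per direction: the region to be shot moved
    by a second preset distance in four pairwise-opposite directions; the
    ortho (nadir) effective area is the region itself. The direction-to-
    shift mapping below is an assumption."""
    s = second_preset_distance
    return {
        "front": _shift(region, 0, -s),
        "rear":  _shift(region, 0, s),
        "left":  _shift(region, s, 0),
        "right": _shift(region, -s, 0),
        "ortho": region,
    }
```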
In some embodiments, the one or more processors, individually or collectively, are configured to perform the following operations when the program instructions are executed: receiving, from a control device of the unmanned aerial vehicle, a shooting sequence corresponding to each waypoint on a flight route; and controlling a shooting device carried on the unmanned aerial vehicle to shoot based on the shooting sequence corresponding to each waypoint. Each shooting sequence includes one or more continuous shooting points, the shooting directions of the one or more shooting points of each shooting sequence are different, the shooting points in each shooting direction are located in the effective shooting area in that shooting direction, the effective shooting area is determined according to first position information of the region to be shot and second position information of the extended shooting area, the extended shooting area is obtained by expanding the region to be shot, and the second position information is determined according to the first position information. The processor of this embodiment may implement the shooting control method of the embodiment shown in fig. 10 of the present application, and the shooting control device of this embodiment can be understood with reference to that method.
The processor may be a central processing unit (Central Processing Unit, CPU), but may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
Communication processes such as "transmission" and "reception" involving the physical apparatus may be performed by a transceiver or a communication interface on the apparatus, and data processing other than "transmission" and "reception" may be performed by a processor on the apparatus.
Further, referring to fig. 1 and fig. 12, the unmanned aerial vehicle may include a body 100, a cradle head 300, and the shooting control device of the above embodiment. The cradle head 300 is mounted on the body 100 and is used for mounting the shooting device 200. The shooting control device is supported by the body 100 and is electrically connected to the cradle head 300.
Further, an embodiment of the present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the photographing control method of the above embodiment.
The computer-readable storage medium may be an internal storage unit of the shooting control device according to any of the foregoing embodiments, such as a hard disk or an internal memory. The computer-readable storage medium may also be an external storage device of the shooting control device, such as a plug-in hard disk, a smart media card (SMC), an SD card, or a flash card provided on the device. Further, the computer-readable storage medium may include both the internal storage unit and the external storage device of the shooting control device. The computer-readable storage medium is used to store the computer program and other programs and data required by the shooting control device, and may also be used to temporarily store data that has been output or is to be output.
Those skilled in the art will appreciate that implementing all or part of the above-described methods in accordance with the embodiments may be accomplished by way of a computer program stored on a computer readable storage medium, which when executed may comprise the steps of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), or the like.
The above disclosure is illustrative only of some embodiments of the application and is not intended to limit the scope of the application, which is defined by the claims and their equivalents.

Claims (170)

1. A photographing control method, characterized in that the method comprises:
acquiring first position information of a region to be shot and second position information of an expanded shooting region, wherein the expanded shooting region is obtained by expanding the region to be shot, and the second position information is determined according to the first position information;
determining third position information of effective shooting areas in different shooting directions according to the first position information and the second position information;
determining a shooting sequence corresponding to each waypoint on the flight route according to the third position information and a preset flight route of the unmanned aerial vehicle;
each shooting sequence comprises one or more continuous shooting points, the shooting directions of the one or more shooting points of each shooting sequence are different, and the shooting points in each shooting direction are located in an effective shooting area in the shooting direction.
2. The method according to claim 1, wherein the extended shooting area is a region obtained by expanding the region to be shot in different directions by a first preset distance respectively;
The first preset distance is determined based on the flying height of the unmanned aerial vehicle and the installation angle of the shooting device carried on the unmanned aerial vehicle.
3. The method of claim 2, wherein the shooting direction comprises at least two of:
a front shooting direction toward the front of the unmanned aerial vehicle, a rear shooting direction toward the rear of the unmanned aerial vehicle, a left shooting direction toward the left of the unmanned aerial vehicle, a right shooting direction toward the right of the unmanned aerial vehicle, or an ortho shooting direction in which the shooting direction is vertically downward.
4. A method according to claim 3, wherein the effective shooting area in the front shooting direction, the effective shooting area in the rear shooting direction, the effective shooting area in the left shooting direction, and the effective shooting area in the right shooting direction are respectively:
a region obtained by moving the region to be shot by a second preset distance in a first direction, a region obtained by moving the region to be shot by the second preset distance in a second direction, a region obtained by moving the region to be shot by the second preset distance in a third direction, and a region obtained by moving the region to be shot by the second preset distance in a fourth direction; the effective shooting area in the ortho shooting direction is the region to be shot itself, where the first direction is opposite to the second direction and the third direction is opposite to the fourth direction.
5. The method of claim 4, wherein the first direction, the second direction, the third direction, or the fourth direction is related to a shape of the flight path.
6. The method of claim 2, wherein the flying height is set by a user.
7. The method of claim 6, wherein the altitude is entered by a user via a control device of the drone.
8. The method of claim 2, wherein the flying height is determined according to parameters of a camera mounted on the unmanned aerial vehicle and a preset ground resolution.
9. The method of claim 8, wherein the parameters of the camera include a focal length of the camera and a single pixel side length of an image sensor of the camera.
10. The method of claim 1, wherein the flight route comprises a plurality of mutually parallel sub-routes, and adjacent sub-routes are connected at one end to form the flight route;
the determination process of the flight route comprises the following steps:
determining a side distance between two adjacent sub-routes in the flight route according to a preset ground resolution, a preset side overlap rate, and the number of pixels of a shooting device carried on the unmanned aerial vehicle in the direction perpendicular to the flight direction of the unmanned aerial vehicle;
and determining the flight route according to the second position information and the side distance.
11. The method of claim 10, wherein the extended shooting area is square, the starting waypoint of the flight route is any corner position of the extended shooting area, and the sub-routes are parallel to one of the sides of the extended shooting area.
12. The method of claim 10, wherein the side lap rate is set by a user.
13. The method of claim 1, wherein the first location information is set by a user; or,
the region to be shot is determined by importing an external file, and the first position information is recorded in the external file.
14. The method of claim 1, wherein a length of time required for a photographing device mounted on the unmanned aerial vehicle to complete photographing of each of the photographing sequences is a first fixed length of time.
15. The method of claim 14, wherein the time period required for the photographing device to complete photographing of the adjacent photographing points in the same photographing sequence is a second fixed time period.
16. The method of claim 1, wherein the spacing between adjacent shot sequences is a first fixed spacing.
17. The method of claim 16, wherein the spacing between adjacent shots in the same shot sequence is a second fixed spacing.
18. The method of claim 1, wherein an initial shooting point among the shooting points is: the position at which the unmanned aerial vehicle starts to fly according to the flight route; or,
the initial shooting point among the shooting points is: the starting waypoint of the flight route.
19. The method of claim 1, wherein before the acquiring the first position information of the region to be photographed and the second position information of the extended photographing region, further comprises:
acquiring a trigger instruction for indicating to enter an oblique shooting mode;
and entering the oblique shooting mode.
20. The method according to any one of claims 1 to 19, wherein the subject of execution of the method is a control device of the unmanned aerial vehicle.
21. The method of claim 20, wherein the obtaining second location information of the extended shot region comprises:
and determining second position information of the extended shooting area according to the first position information.
22. The method of claim 20, wherein after the obtaining the second position information of the extended shot region, further comprising:
And planning a flight route of the unmanned aerial vehicle according to the second position information.
23. The method as recited in claim 22, further comprising:
transmitting the flight route to the unmanned aerial vehicle;
after determining the shooting sequence corresponding to each waypoint on the flight route according to the third position information and the preset flight route of the unmanned aerial vehicle, the method further comprises:
and sending a shooting sequence corresponding to each waypoint to the unmanned aerial vehicle, so that the unmanned aerial vehicle controls a shooting device carried on the unmanned aerial vehicle to shoot based on the flight route and the shooting sequence corresponding to each waypoint.
24. The method according to any one of claims 1 to 19, wherein the subject of execution of the method is the drone.
25. The method of claim 24, wherein the first location information is transmitted by a control device of the drone.
26. The method of claim 24, wherein the second location information is transmitted by a control device of the drone.
27. The method of claim 24, wherein the obtaining second location information of the extended shot region comprises:
And determining second position information of the extended shooting area according to the first position information.
28. The method of claim 24, wherein the flight path is planned by a control device of the drone based on the second location information; or,
the flight route is planned by the drone based on the second location information.
29. The method as recited in claim 24, further comprising:
and controlling a shooting device mounted on the unmanned aerial vehicle to shoot based on the flight route and the shooting sequence corresponding to each waypoint.
30. The method of claim 29, wherein controlling the camera onboard the drone to take the shots based on the flight path and the shot sequence comprises:
controlling the unmanned aerial vehicle to fly according to the flying route;
according to the shooting sequence, in the process that the unmanned aerial vehicle flies from the current shooting point to the next shooting point, controlling a cradle head on the unmanned aerial vehicle to switch its attitude, so that a shooting device on the cradle head is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point;
and acquiring images shot by the shooting device at each shooting point.
31. The method of claim 30, wherein the photographing by the photographing device does not affect the flight of the drone.
32. The method according to claim 30, wherein the controlling, according to the shooting sequence, the cradle head on the unmanned aerial vehicle to switch its attitude in the process that the unmanned aerial vehicle flies from the current shooting point to the next shooting point, so that the shooting device on the cradle head is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point, comprises:
sending a shooting trigger signal to the cradle head according to the shooting sequence, so that the cradle head performs attitude switching in the process that the unmanned aerial vehicle flies from the current shooting point to the next shooting point, and the shooting device on the cradle head is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point;
wherein the shooting trigger signal is further used for instructing the cradle head to trigger the shooting device to shoot when the shooting device is located in the corresponding shooting direction.
33. The method of claim 32, wherein the capture trigger signal is a timed capture trigger signal, the timed capture trigger signal being configured to instruct the pan-tilt to trigger the capture device to capture based on a first timing strategy;
Wherein the first timing strategy comprises: the time required for the shooting device to complete shooting of each shooting sequence is a first fixed time.
34. The method of claim 33, wherein the timed shoot trigger signal is further configured to instruct the pan-tilt to trigger the shooting device to shoot based on a second timing strategy;
wherein the second timing strategy comprises: the time length required by the shooting device to complete shooting of the adjacent shooting points in the same shooting sequence is a second fixed time length.
35. The method of claim 33, wherein the timed shooting trigger signal is sent to the cradle head once, and the sending the shooting trigger signal to the cradle head comprises:
and before the cradle head controls the shooting device to finish shooting of the first shooting point of the shooting sequence, sending a timing shooting trigger signal to the cradle head.
36. The method of any one of claims 33 to 35, wherein the number of shot sequences is a plurality, the method further comprising:
and before the cradle head controls the shooting device to finish shooting of the first shooting point of the first shooting sequence, all shooting sequences are sent to the cradle head at one time.
37. The method of any one of claims 33 to 35, wherein the number of shot sequences is a plurality, the method further comprising:
and after the shooting device finishes shooting of the current shooting sequence, sending a next shooting sequence to the holder.
38. The method of claim 32, wherein the capture trigger signal is a fixed-distance capture trigger signal, the fixed-distance capture trigger signal being configured to instruct the pan-tilt to trigger the capture device to capture based on a first fixed-distance strategy;
wherein the first ranging strategy comprises: the interval between adjacent shooting sequences is a first fixed interval.
39. The method of claim 38, wherein the fixed-distance shooting trigger signal is further used for instructing the cradle head to trigger the shooting device to shoot based on a second fixed-distance strategy;
wherein the second distance strategy comprises: the interval between adjacent shooting points in the same shooting sequence is a second fixed interval.
40. The method of claim 38, wherein the fixed-distance shooting trigger signal is sent to the cradle head a plurality of times, and the sending the shooting trigger signal to the cradle head comprises:
Before the cradle head controls the shooting device to finish shooting of the first shooting point of each shooting sequence, a fixed-distance shooting trigger signal is respectively sent to the cradle head.
41. The method of claim 40, further comprising:
and before the cradle head controls the shooting device to finish shooting of the first shooting point of each shooting sequence, sending the shooting sequence to the cradle head.
42. The method of claim 39, wherein the fixed-distance shooting trigger signal is sent to the cradle head a plurality of times, and the sending the shooting trigger signal to the cradle head comprises:
before the cradle head controls the shooting device to finish shooting of each shooting point, a fixed-distance shooting trigger signal is sent to the cradle head respectively.
43. The method of claim 42, further comprising:
before the cradle head controls the shooting device to finish shooting of each shooting point, respectively sending an indication signal to the cradle head, wherein the indication signal is used for indicating the target attitude of the cradle head or the shooting direction of the shooting device corresponding to the shooting point.
44. The method of claim 30, wherein the maximum allowable flight speed of the unmanned aerial vehicle is determined based on the heading distance of each shooting direction and the length of time required for the shooting device to complete the shooting of one shooting sequence and return to the initial shooting direction;
the heading distances of the shooting directions are equal, and the heading distance is determined based on a preset ground resolution, a preset heading overlap rate, and the number of pixels of the shooting device in the direction parallel to the flight direction of the unmanned aerial vehicle.
45. The method of claim 44, wherein the heading overlap rate is set by a user.
46. The method of claim 44, wherein the initial shooting direction is a shooting direction corresponding to one of the shooting points in the shooting sequence.
47. The method of claim 30, wherein said controlling the drone to fly in accordance with the flight path comprises:
and controlling the real-time height between the lens of the shooting device and the region to be shot to be within a preset height range.
48. The method of claim 30, wherein the controlling the cradle head on the unmanned aerial vehicle to switch its attitude so that the shooting device on the cradle head is in the corresponding shooting direction at each shooting point specifically comprises:
acquiring the real-time attitude of the unmanned aerial vehicle;
determining the deviation between the real-time attitude of the unmanned aerial vehicle and the shooting direction of the next shooting point;
and controlling the cradle head on the unmanned aerial vehicle to switch its attitude according to the deviation, so that the shooting device on the cradle head is in the corresponding shooting direction at each shooting point.
49. The method of claim 30, wherein the cradle head is a three-axis cradle head configured to rotate about a yaw axis, a roll axis, and a pitch axis;
the controlling the cradle head on the unmanned aerial vehicle to switch its attitude comprises:
controlling any two of the yaw-axis attitude, the roll-axis attitude, and the pitch-axis attitude of the cradle head, so as to control the cradle head to switch its attitude.
50. A photographing control apparatus, characterized in that the apparatus comprises:
a storage device for storing program instructions; and
one or more processors that invoke the program instructions stored in the storage device and, when the program instructions are executed, are configured, individually or collectively, to perform the following operations:
acquiring first position information of a region to be shot and second position information of an expanded shooting region, wherein the expanded shooting region is obtained by expanding the region to be shot, and the second position information is determined according to the first position information;
Determining third position information of effective shooting areas in different shooting directions according to the first position information and the second position information;
determining a shooting sequence corresponding to each waypoint on the flight route according to the third position information and a preset flight route of the unmanned aerial vehicle;
each shooting sequence comprises one or more continuous shooting points, the shooting directions of the one or more shooting points of each shooting sequence are different, and the shooting points in each shooting direction are located in an effective shooting area in the shooting direction.
51. The apparatus of claim 50, wherein the extended shooting area is a region obtained by expanding the region to be shot in different directions by a first preset distance respectively;
the first preset distance is determined based on the flying height of the unmanned aerial vehicle and the installation angle of the shooting device carried on the unmanned aerial vehicle.
52. The apparatus of claim 51, wherein the shooting direction comprises at least two of:
a front shooting direction toward the front of the unmanned aerial vehicle, a rear shooting direction toward the rear of the unmanned aerial vehicle, a left shooting direction toward the left of the unmanned aerial vehicle, a right shooting direction toward the right of the unmanned aerial vehicle, or an ortho shooting direction in which the shooting direction is vertically downward.
53. The apparatus of claim 52, wherein the effective shooting area in the front shooting direction, the effective shooting area in the rear shooting direction, the effective shooting area in the left shooting direction, and the effective shooting area in the right shooting direction are respectively:
a region obtained by moving the region to be shot by a second preset distance in a first direction, a region obtained by moving the region to be shot by the second preset distance in a second direction, a region obtained by moving the region to be shot by the second preset distance in a third direction, and a region obtained by moving the region to be shot by the second preset distance in a fourth direction; the effective shooting area in the ortho shooting direction is the region to be shot itself, where the first direction is opposite to the second direction and the third direction is opposite to the fourth direction.
54. The apparatus of claim 53, wherein the first direction, the second direction, the third direction, or the fourth direction is related to a shape of the flight path.
55. The apparatus of claim 51, wherein the flying height is set by a user.
56. The device of claim 55, wherein the flying height is input by a user through a control device of the drone.
57. The apparatus of claim 51, wherein the flying height is determined based on parameters of a camera mounted on the drone and a preset ground resolution.
58. The device of claim 57, wherein the parameters of the camera include a focal length of the camera and a single pixel side length of an image sensor of the camera.
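Claims 57 and 58 describe the standard photogrammetric relation between ground sampling distance, focal length, and pixel pitch. A minimal sketch, with illustrative parameter names and units:

```python
def flying_height(focal_length_mm: float, pixel_size_um: float,
                  gsd_cm_per_px: float) -> float:
    """Flight altitude (m) that yields a target ground sampling distance.

    Uses the standard photogrammetric relation GSD = H * pixel_size / focal
    length, rearranged for H. Parameter names and units are illustrative,
    not taken from the patent.
    """
    gsd_m = gsd_cm_per_px / 100.0          # cm/px -> m/px
    pixel_m = pixel_size_um * 1e-6         # um -> m
    focal_m = focal_length_mm * 1e-3       # mm -> m
    return gsd_m * focal_m / pixel_m

# 35 mm lens, 4.4 um pixels, 2.74 cm/px GSD -> roughly 218 m altitude
h = flying_height(35.0, 4.4, 2.74)
```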
59. The apparatus of claim 50, wherein the flight route comprises a plurality of mutually parallel sub-routes, adjacent sub-routes being connected at one end to form a single flight route;
the one or more processors, when determining the flight route, are further configured, individually or collectively, to perform the following operations:
determining a side distance between two adjacent sub-routes in the flight route according to a preset ground resolution, a preset side overlap rate, and the number of pixels of the shooting device mounted on the unmanned aerial vehicle in the direction perpendicular to the flight direction of the unmanned aerial vehicle;
and determining the flight route according to the second position information and the side distance.
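The side-distance computation of claim 59 follows from the across-track ground swath and the side overlap rate. A minimal sketch under that reading (the parameter names and the 70 % example are assumptions, not the patent's exact formula):

```python
def side_distance(gsd_m_per_px: float, cross_track_pixels: int,
                  side_overlap: float) -> float:
    """Spacing between adjacent parallel sub-routes.

    The ground swath covered across-track is gsd * pixel count; adjacent
    sub-routes keep side_overlap (e.g. 0.7) of that swath in common, so
    the line spacing is the swath width times (1 - side_overlap).
    """
    swath_m = gsd_m_per_px * cross_track_pixels
    return swath_m * (1.0 - side_overlap)

# 3 cm/px GSD, 4000 px across track, 70 % side overlap -> 36 m line spacing
s = side_distance(0.03, 4000, 0.7)
```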
60. The apparatus of claim 59, wherein the extended shooting region is square, the starting waypoint of the flight route is any corner position of the extended shooting region, and the sub-routes are parallel to one of the sides of the extended shooting region.
61. The apparatus of claim 59, wherein the side overlap rate is set by a user.
62. The apparatus of claim 50, wherein the first location information is set by a user; or,
the region to be shot is determined by importing an external file, and the first position information is recorded in the external file.
63. The apparatus of claim 50, wherein a length of time required for a photographing device mounted on the unmanned aerial vehicle to complete photographing of each of the photographing sequences is a first fixed length of time.
64. The apparatus of claim 63, wherein the time period required for the photographing apparatus to complete photographing of adjacent photographing points in the same photographing sequence is a second fixed time period.
65. The apparatus of claim 50, wherein the spacing between adjacent shot sequences is a first fixed spacing.
66. The apparatus of claim 65, wherein the spacing between adjacent shots in the same shot sequence is a second fixed spacing.
67. The apparatus of claim 50, wherein an initial shooting point of the shooting points is: a position at which the unmanned aerial vehicle starts flying along the flight route; or,
the initial shooting point of the shooting points is: the starting waypoint of the flight route.
68. The apparatus of claim 50, wherein the one or more processors, prior to acquiring the first location information of the region to be photographed, are further configured, individually or collectively, to:
acquiring a trigger instruction for indicating to enter an oblique shooting mode;
and entering the oblique shooting mode.
69. The apparatus according to any one of claims 50 to 68, wherein the photographing control apparatus is provided in a control device of the unmanned aerial vehicle.
70. The apparatus of claim 69, wherein the one or more processors, when acquiring the second location information of the extended shooting region, are further configured, individually or collectively, to:
determine the second position information of the extended shooting region according to the first position information.
71. The apparatus of claim 69, wherein the one or more processors, after obtaining the second location information for the extended shot region, are further configured, individually or collectively, to:
and planning a flight route of the unmanned aerial vehicle according to the second position information.
72. The apparatus of claim 69, wherein the one or more processors are further configured, individually or collectively, to:
transmitting the flight route to the unmanned aerial vehicle;
the one or more processors, after determining a shooting sequence corresponding to each waypoint on the flight route according to the third location information and a preset flight route of the unmanned aerial vehicle, are further configured, individually or collectively, to perform the following operations:
and sending the shooting sequence to the unmanned aerial vehicle, so that the unmanned aerial vehicle controls a shooting device carried on the unmanned aerial vehicle to shoot based on the flight route and the shooting sequence.
73. The apparatus according to any one of claims 50 to 68, wherein the photographing control device is provided in the unmanned aerial vehicle.
74. The device of claim 73, wherein the first location information is transmitted by a control device of the drone.
75. The device of claim 73, wherein the second location information is transmitted by a control device of the drone.
76. The apparatus of claim 73, wherein the one or more processors, when acquiring the second location information of the extended shooting region, are further configured, individually or collectively, to:
determine the second position information of the extended shooting region according to the first position information.
77. The apparatus of claim 73, wherein the flight route is planned by the control device of the drone based on the second location information; or,
the flight route is planned by the drone based on the second location information.
78. The apparatus of claim 73, wherein the one or more processors are further configured, individually or collectively, to:
and controlling a shooting device mounted on the unmanned aerial vehicle to shoot based on the flight route and the shooting sequence.
79. The apparatus of claim 78, wherein the one or more processors, when controlling the shooting device mounted on the unmanned aerial vehicle to shoot based on the flight route and the shooting sequence, are further configured, individually or collectively, to perform the following operations:
controlling the unmanned aerial vehicle to fly according to the flight route;
according to the shooting sequence, in the process that the unmanned aerial vehicle flies from the current shooting point to the next shooting point, controlling the gimbal on the unmanned aerial vehicle to switch its attitude, so that the shooting device on the gimbal is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point;
and acquiring the images shot by the shooting device at each shooting point.
80. The device of claim 79, wherein the photographing device photographs without affecting the flight of the drone.
81. The apparatus of claim 79, wherein the one or more processors, when controlling, according to the shooting sequence, the gimbal on the unmanned aerial vehicle to switch its attitude in the process that the unmanned aerial vehicle flies from the current shooting point to the next shooting point, so that the shooting device on the gimbal is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point, are further configured, individually or collectively, to:
send, according to the shooting sequence, a shooting trigger signal to the gimbal, so that the gimbal switches its attitude in the process that the unmanned aerial vehicle flies from the current shooting point to the next shooting point, and the shooting device on the gimbal is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point;
wherein the shooting trigger signal is further used for instructing the gimbal to trigger the shooting device to shoot when the shooting device is in the corresponding shooting direction.
82. The apparatus of claim 81, wherein the shooting trigger signal is a timed shooting trigger signal, the timed shooting trigger signal being used to instruct the gimbal to trigger the shooting device to shoot based on a first timing strategy;
wherein the first timing strategy comprises: the time required for the shooting device to complete the shooting of each shooting sequence is a first fixed time.
83. The apparatus of claim 82, wherein the timed shooting trigger signal is further used to instruct the gimbal to trigger the shooting device to shoot based on a second timing strategy;
wherein the second timing strategy comprises: the time required for the shooting device to complete the shooting of adjacent shooting points in the same shooting sequence is a second fixed time.
84. The apparatus of claim 82, wherein the one or more processors are configured to send the timed shooting trigger signal to the gimbal once, the one or more processors being further configured, individually or collectively, to:
send the timed shooting trigger signal to the gimbal before the gimbal controls the shooting device to complete the shooting of the first shooting point of the shooting sequence.
85. The apparatus of any one of claims 82 to 84, wherein the number of shooting sequences is plural, the one or more processors being further configured, individually or collectively, to:
send all the shooting sequences to the gimbal at one time before the gimbal controls the shooting device to complete the shooting of the first shooting point of the first shooting sequence.
86. The apparatus of any one of claims 82 to 84, wherein the number of shooting sequences is plural, the one or more processors being further configured, individually or collectively, to:
send the next shooting sequence to the gimbal after the shooting device completes the shooting of the current shooting sequence.
87. The apparatus of claim 81, wherein the shooting trigger signal is a fixed-distance shooting trigger signal, the fixed-distance shooting trigger signal being used to instruct the gimbal to trigger the shooting device to shoot based on a first fixed-distance strategy;
wherein the first fixed-distance strategy comprises: the interval between adjacent shooting sequences is a first fixed interval.
88. The apparatus of claim 87, wherein the fixed-distance shooting trigger signal is further used to instruct the gimbal to trigger the shooting device to shoot based on a second fixed-distance strategy;
wherein the second fixed-distance strategy comprises: the interval between adjacent shooting points in the same shooting sequence is a second fixed interval.
89. The apparatus of claim 87, wherein the one or more processors are configured to send the fixed-distance shooting trigger signal to the gimbal a plurality of times, the one or more processors being further configured, individually or collectively, to:
send a fixed-distance shooting trigger signal to the gimbal before the gimbal controls the shooting device to complete the shooting of the first shooting point of each shooting sequence.
90. The apparatus of claim 89, wherein the one or more processors are further configured, individually or collectively, to:
send the shooting sequence to the gimbal before the gimbal controls the shooting device to complete the shooting of the first shooting point of each shooting sequence.
91. The apparatus of claim 88, wherein the one or more processors are configured to send the fixed-distance shooting trigger signal to the gimbal a plurality of times, and when sending the shooting trigger signal to the gimbal, are configured, individually or collectively, to:
send a fixed-distance shooting trigger signal to the gimbal before the gimbal controls the shooting device to complete the shooting of each shooting point.
92. The apparatus of claim 91, wherein the one or more processors are further configured, individually or collectively, to:
send an indication signal to the gimbal before the gimbal controls the shooting device to complete the shooting of each shooting point, the indication signal being used to indicate the target attitude of the gimbal or the shooting direction of the shooting device corresponding to that shooting point.
93. The apparatus of claim 79, wherein the maximum allowable flight speed of the unmanned aerial vehicle is determined based on the heading distance of each shooting direction and the time required for the shooting device to complete the shooting of one shooting sequence and return to the initial shooting direction;
the heading distances of the shooting directions are equal, and the heading distance is determined based on a preset ground resolution, a preset heading overlap rate, and the number of pixels of the shooting device in the direction parallel to the flight direction of the unmanned aerial vehicle.
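Claim 93's speed limit can be read as: the UAV must not traverse one along-track spacing faster than the camera can finish a sequence and return to its initial direction. An illustrative sketch (parameter names and example values are assumptions):

```python
def heading_distance(gsd_m_per_px: float, along_track_pixels: int,
                     heading_overlap: float) -> float:
    """Along-track spacing between shooting positions, from GSD,
    along-track pixel count, and heading overlap rate."""
    return gsd_m_per_px * along_track_pixels * (1.0 - heading_overlap)

def max_allowable_speed(heading_distance_m: float, sequence_time_s: float) -> float:
    """Upper bound on flight speed so a full shooting sequence fits between
    consecutive shooting positions: the camera needs sequence_time_s to shoot
    every direction of one sequence and return to the initial direction, so
    the UAV must not cover the heading distance in less time than that."""
    return heading_distance_m / sequence_time_s

# 3 cm/px, 3000 px along track, 80 % heading overlap, 3 s per sequence -> 6 m/s
v = max_allowable_speed(heading_distance(0.03, 3000, 0.8), 3.0)
```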
94. The apparatus of claim 93, wherein the heading overlap rate is set by a user.
95. The apparatus of claim 93, wherein the initial shooting direction is a shooting direction corresponding to one of the shooting points in the shooting sequence.
96. The apparatus of claim 79, wherein the one or more processors, when controlling the drone to fly according to the flight path, are further configured, individually or collectively, to:
and controlling the real-time height between the lens of the shooting device and the region to be shot to be within a preset height range.
97. The apparatus of claim 79, wherein the one or more processors, when controlling the gimbal on the unmanned aerial vehicle to switch its attitude so that the shooting device on the gimbal is in the corresponding shooting direction at each shooting point, are further configured, individually or collectively, to perform the following operations:
acquiring the real-time attitude of the unmanned aerial vehicle;
determining the deviation between the real-time attitude of the unmanned aerial vehicle and the shooting direction of the next shooting point; and
controlling the gimbal on the unmanned aerial vehicle to switch its attitude according to the deviation, so that the shooting device on the gimbal is in the corresponding shooting direction at each shooting point.
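The deviation compensation of claim 97 reduces, for a single axis, to commanding the gimbal with the wrapped difference between the desired shooting direction and the airframe's real-time yaw. A one-axis sketch with assumed names:

```python
def gimbal_yaw_command(uav_yaw_deg: float, target_shoot_yaw_deg: float) -> float:
    """Yaw the gimbal must add to the airframe heading so the camera points
    along the next shooting direction.

    The commanded gimbal angle is the difference between the desired shooting
    direction and the current airframe yaw, wrapped to (-180, 180] degrees so
    the gimbal always takes the shorter rotation. Single-axis illustration.
    """
    return (target_shoot_yaw_deg - uav_yaw_deg + 180.0) % 360.0 - 180.0

# UAV heading 350 deg, next shot faces 10 deg -> gimbal turns +20 deg
cmd = gimbal_yaw_command(350.0, 10.0)
```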
98. The apparatus of claim 79, wherein the gimbal is a three-axis gimbal configured to move about a yaw axis, a roll axis, and a pitch axis;
the one or more processors, when controlling the gimbal on the unmanned aerial vehicle to switch its attitude, are further configured, individually or collectively, to perform the following operations:
controlling any two of the yaw-axis attitude, the roll-axis attitude, and the pitch-axis attitude of the gimbal, so as to control the gimbal to switch its attitude.
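Claim 98's two-axis switching can be illustrated by a fixed table mapping each shooting direction to a (yaw, pitch) target; the 45-degree oblique pitch below is an assumed value, not specified by the patent:

```python
# Illustrative mapping from the five shooting directions to yaw/pitch targets
# for a three-axis gimbal, using only two axes (yaw and pitch) as in claim 98.

OBLIQUE_PITCH_DEG = -45.0   # assumed tilt for the oblique views
NADIR_PITCH_DEG = -90.0     # straight down

DIRECTION_TO_ATTITUDE = {
    #            (yaw_deg, pitch_deg) relative to the airframe
    "front": (0.0,   OBLIQUE_PITCH_DEG),
    "rear":  (180.0, OBLIQUE_PITCH_DEG),
    "left":  (-90.0, OBLIQUE_PITCH_DEG),
    "right": (90.0,  OBLIQUE_PITCH_DEG),
    "nadir": (0.0,   NADIR_PITCH_DEG),
}

def attitude_for(direction: str):
    """Return the (yaw, pitch) pair the gimbal switches to for a direction."""
    return DIRECTION_TO_ATTITUDE[direction]
```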
99. An unmanned aerial vehicle, comprising:
a body;
a gimbal mounted on the body, the gimbal being used for mounting a shooting device; and
the photographing control apparatus of any one of claims 50 to 68 and 73 to 98, supported by the body, the photographing control apparatus being electrically connected to the gimbal.
100. A computer-readable storage medium having stored thereon a computer program, characterized in that the program, when executed by a processor, implements the photographing control method of any one of claims 1 to 49.
101. A photographing control method, characterized in that the method comprises:
receiving a flight route sent by a control device of an unmanned aerial vehicle and a shooting sequence corresponding to each waypoint on the flight route; and
controlling a shooting device mounted on the unmanned aerial vehicle to shoot based on the flight route and the shooting sequence corresponding to each waypoint;
wherein each shooting sequence comprises one or more consecutive shooting points, the shooting directions of the one or more shooting points of each shooting sequence are different from one another, the shooting points in each shooting direction are located in an effective shooting area in that shooting direction, the effective shooting area is determined according to first position information of a region to be photographed and second position information of an extended shooting region, the extended shooting region is obtained by expanding the region to be photographed outward, and the second position information is determined according to the first position information.
102. The method of claim 101, wherein the extended shooting region is a region obtained by expanding the region to be photographed outward by a first preset distance in each of different directions;
the first preset distance is determined based on the flying height of the unmanned aerial vehicle and the installation angle of the shooting device carried on the unmanned aerial vehicle.
103. The method of claim 102, wherein the shooting direction comprises at least two of:
a front shooting direction toward the front of the unmanned aerial vehicle, a rear shooting direction toward the rear of the unmanned aerial vehicle, a left shooting direction toward the left of the unmanned aerial vehicle, a right shooting direction toward the right of the unmanned aerial vehicle, or a nadir shooting direction in which the shooting direction is vertically downward.
104. The method of claim 103, wherein the effective shooting area in the front shooting direction, the effective shooting area in the rear shooting direction, the effective shooting area in the left shooting direction, and the effective shooting area in the right shooting direction are, respectively:
a region obtained by moving the region to be photographed a second preset distance in a first direction, a region obtained by moving the region to be photographed the second preset distance in a second direction, a region obtained by moving the region to be photographed the second preset distance in a third direction, and a region obtained by moving the region to be photographed the second preset distance in a fourth direction; the effective shooting area in the nadir shooting direction is the region to be photographed itself, wherein the first direction is opposite to the second direction, and the third direction is opposite to the fourth direction.
105. The method of claim 104, wherein the first direction, the second direction, the third direction, or the fourth direction is related to a shape of the flight path.
106. The method of claim 102, wherein the flying height is set by a user.
107. The method of claim 106, wherein the altitude is entered by a user via a control of the drone.
108. The method of claim 102, wherein the flying height is determined based on parameters of a camera onboard the drone and a preset ground resolution.
109. The method of claim 108, wherein the parameters of the camera include a focal length of the camera and a single pixel side length of an image sensor of the camera.
110. The method of claim 101, wherein the flight route comprises a plurality of mutually parallel sub-routes, adjacent sub-routes being connected at one end to form a single flight route;
the determination process of the flight route comprises the following steps:
the control device of the unmanned aerial vehicle determines the side distance between two adjacent sub-routes in the flight route according to the preset ground resolution, the preset side overlap rate, and the number of pixels of the shooting device mounted on the unmanned aerial vehicle in the direction perpendicular to the flight direction of the unmanned aerial vehicle; and determines the flight route according to the second position information and the side distance.
111. The method of claim 110, wherein the extended shooting region is square, the starting waypoint of the flight route is any corner position of the extended shooting region, and the sub-routes are parallel to one of the sides of the extended shooting region.
112. The method of claim 110, wherein the side overlap rate is set by a user.
113. The method of claim 101, wherein the first location information is set by a user; or,
the region to be shot is determined by importing an external file, and the first position information is recorded in the external file.
114. The method of claim 101, wherein prior to receiving the flight path sent by the control device of the unmanned aerial vehicle and the shooting sequence corresponding to each waypoint on the flight path, further comprising:
acquiring a trigger instruction for indicating to enter an oblique shooting mode;
and entering the oblique shooting mode.
115. The method of claim 101, wherein the controlling the shooting device mounted on the unmanned aerial vehicle to shoot based on the flight route and the shooting sequence comprises:
controlling the unmanned aerial vehicle to fly according to the flight route;
according to the shooting sequence, in the process that the unmanned aerial vehicle flies from the current shooting point to the next shooting point, controlling the gimbal on the unmanned aerial vehicle to switch its attitude, so that the shooting device on the gimbal is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point;
and acquiring the images shot by the shooting device at each shooting point.
116. The method of claim 115, wherein the photographing by the photographing device does not affect the flight of the drone.
117. The method according to claim 115, wherein the controlling, according to the shooting sequence, the gimbal on the unmanned aerial vehicle to switch its attitude in the process that the unmanned aerial vehicle flies from the current shooting point to the next shooting point, so that the shooting device on the gimbal is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point, comprises:
sending, according to the shooting sequence, a shooting trigger signal to the gimbal, so that the gimbal switches its attitude in the process that the unmanned aerial vehicle flies from the current shooting point to the next shooting point, and the shooting device on the gimbal is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point;
wherein the shooting trigger signal is further used for instructing the gimbal to trigger the shooting device to shoot when the shooting device is in the corresponding shooting direction.
118. The method of claim 117, wherein the shooting trigger signal is a timed shooting trigger signal, the timed shooting trigger signal being used to instruct the gimbal to trigger the shooting device to shoot based on a first timing strategy;
wherein the first timing strategy comprises: the time required for the shooting device to complete the shooting of each shooting sequence is a first fixed time.
119. The method of claim 118, wherein the timed shooting trigger signal is further used to instruct the gimbal to trigger the shooting device to shoot based on a second timing strategy;
wherein the second timing strategy comprises: the time required for the shooting device to complete the shooting of adjacent shooting points in the same shooting sequence is a second fixed time.
120. The method of claim 118, wherein the sending the timed shooting trigger signal to the gimbal is performed once, and the sending the shooting trigger signal to the gimbal comprises:
sending the timed shooting trigger signal to the gimbal before the gimbal controls the shooting device to complete the shooting of the first shooting point of the shooting sequence.
121. The method of any one of claims 118 to 120, wherein the number of shooting sequences is plural, the method further comprising:
sending all the shooting sequences to the gimbal at one time before the gimbal controls the shooting device to complete the shooting of the first shooting point of the first shooting sequence.
122. The method of any one of claims 118 to 120, wherein the number of shooting sequences is plural, the method further comprising:
sending the next shooting sequence to the gimbal after the shooting device completes the shooting of the current shooting sequence.
123. The method of claim 117, wherein the shooting trigger signal is a fixed-distance shooting trigger signal, the fixed-distance shooting trigger signal being used to instruct the gimbal to trigger the shooting device to shoot based on a first fixed-distance strategy;
wherein the first fixed-distance strategy comprises: the interval between adjacent shooting sequences is a first fixed interval.
124. The method of claim 123, wherein the fixed-distance shooting trigger signal is further used to instruct the gimbal to trigger the shooting device to shoot based on a second fixed-distance strategy;
wherein the second fixed-distance strategy comprises: the interval between adjacent shooting points in the same shooting sequence is a second fixed interval.
125. The method of claim 123, wherein the sending the fixed-distance shooting trigger signal to the gimbal is performed a plurality of times, and the sending the shooting trigger signal to the gimbal comprises:
sending a fixed-distance shooting trigger signal to the gimbal before the gimbal controls the shooting device to complete the shooting of the first shooting point of each shooting sequence.
126. The method of claim 125, further comprising:
sending the shooting sequence to the gimbal before the gimbal controls the shooting device to complete the shooting of the first shooting point of each shooting sequence.
127. The method of claim 124, wherein the sending the fixed-distance shooting trigger signal to the gimbal is performed a plurality of times, and the sending the shooting trigger signal to the gimbal comprises:
sending a fixed-distance shooting trigger signal to the gimbal before the gimbal controls the shooting device to complete the shooting of each shooting point.
128. The method of claim 127, further comprising:
sending an indication signal to the gimbal before the gimbal controls the shooting device to complete the shooting of each shooting point, the indication signal being used to indicate the target attitude of the gimbal or the shooting direction of the shooting device corresponding to that shooting point.
129. The method of claim 115, wherein the maximum allowable flight speed of the unmanned aerial vehicle is determined based on the heading distance of each shooting direction and the time required for the shooting device to complete the shooting of one shooting sequence and return to the initial shooting direction;
the heading distances of the shooting directions are equal, and the heading distance is determined based on a preset ground resolution, a preset heading overlap rate, and the number of pixels of the shooting device in the direction parallel to the flight direction of the unmanned aerial vehicle.
130. The method of claim 129 wherein the heading overlap rate is set by a user.
131. The method of claim 129, wherein the initial shooting direction is a shooting direction corresponding to one of the shooting points in the shooting sequence.
132. The method of claim 115, wherein said controlling said drone to fly according to said flight path comprises:
and controlling the real-time height between the lens of the shooting device and the region to be shot to be within a preset height range.
133. The method of claim 115, wherein the controlling the gimbal on the unmanned aerial vehicle to switch its attitude so that the shooting device on the gimbal is in the corresponding shooting direction at each shooting point comprises:
acquiring the real-time attitude of the unmanned aerial vehicle;
determining the deviation between the real-time attitude of the unmanned aerial vehicle and the shooting direction of the next shooting point; and
controlling the gimbal on the unmanned aerial vehicle to switch its attitude according to the deviation, so that the shooting device on the gimbal is in the corresponding shooting direction at each shooting point.
134. The method of claim 115, wherein the gimbal is a three-axis gimbal configured to move about a yaw axis, a roll axis, and a pitch axis;
the controlling the gimbal on the unmanned aerial vehicle to switch its attitude comprises:
controlling any two of the yaw-axis attitude, the roll-axis attitude, and the pitch-axis attitude of the gimbal, so as to control the gimbal to switch its attitude.
135. A photographing control apparatus, characterized in that the apparatus comprises:
a storage device for storing program instructions; and
one or more processors invoking program instructions stored in the storage device, which when executed, are configured, individually or collectively, to:
receiving a flight route sent by a control device of an unmanned aerial vehicle and a shooting sequence corresponding to each waypoint on the flight route; and
controlling a shooting device mounted on the unmanned aerial vehicle to shoot based on the flight route and the shooting sequence corresponding to each waypoint;
wherein each shooting sequence comprises one or more consecutive shooting points, the shooting directions of the one or more shooting points of each shooting sequence are different from one another, the shooting points in each shooting direction are located in an effective shooting area in that shooting direction, the effective shooting area is determined according to first position information of a region to be photographed and second position information of an extended shooting region, the extended shooting region is obtained by expanding the region to be photographed outward, and the second position information is determined according to the first position information.
136. The apparatus according to claim 135, wherein the extended shooting region is a region obtained by expanding the region to be photographed outward by a first preset distance in each of different directions;
the first preset distance is determined based on the flying height of the unmanned aerial vehicle and the installation angle of the shooting device carried on the unmanned aerial vehicle.
137. The apparatus of claim 136, wherein the shooting direction comprises at least two of:
a front shooting direction toward the front of the unmanned aerial vehicle, a rear shooting direction toward the rear of the unmanned aerial vehicle, a left shooting direction toward the left of the unmanned aerial vehicle, a right shooting direction toward the right of the unmanned aerial vehicle, or a nadir shooting direction in which the shooting direction is vertically downward.
138. The apparatus of claim 137, wherein the effective shooting area in the front shooting direction, the effective shooting area in the rear shooting direction, the effective shooting area in the left shooting direction, and the effective shooting area in the right shooting direction are, respectively:
a region obtained by moving the region to be photographed a second preset distance in a first direction, a region obtained by moving the region to be photographed the second preset distance in a second direction, a region obtained by moving the region to be photographed the second preset distance in a third direction, and a region obtained by moving the region to be photographed the second preset distance in a fourth direction; the effective shooting area in the nadir shooting direction is the region to be photographed itself, wherein the first direction is opposite to the second direction, and the third direction is opposite to the fourth direction.
139. The apparatus of claim 138, wherein the first direction, the second direction, the third direction, or the fourth direction is related to a shape of the flight path.
140. The device of claim 136 wherein the flying height is set by a user.
141. The device of claim 140, wherein the altitude is user input via a control device of the drone.
142. The device of claim 136, wherein the flying height is determined based on parameters of a camera onboard the drone and a preset ground resolution.
143. The device of claim 142, wherein the parameters of the camera include a focal length of the camera and a single pixel side length of an image sensor of the camera.
144. The apparatus of claim 135 wherein the flight path comprises a plurality of mutually parallel sub-paths, one side of adjacent sub-paths being connected to form one flight path;
the determination process of the flight route comprises the following steps:
the control device of the unmanned aerial vehicle determines the lateral distance between two adjacent sub-routes in the flight route according to the preset ground resolution, the preset side overlap rate, and the number of pixels of the shooting device carried on the unmanned aerial vehicle in the direction perpendicular to the flight direction of the unmanned aerial vehicle; and determines the flight route according to the second position information and the lateral distance.
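A minimal sketch of the lateral-distance computation in claim 144; the formula combines the three stated inputs in the usual photogrammetric way, and the names and test values are assumptions for illustration:

```python
def lateral_distance(ground_resolution_m, pixels_cross_flight, side_overlap):
    """Spacing between adjacent parallel sub-routes: the ground
    footprint of the sensor across the flight direction
    (GSD * pixel count), reduced by the required side overlap."""
    return ground_resolution_m * pixels_cross_flight * (1.0 - side_overlap)
```

E.g. at 3 cm/px, a 4000-pixel-wide sensor with 70% side overlap would give sub-routes spaced 36 m apart (illustrative figures only).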
145. The apparatus of claim 144, wherein the extended shooting region is square, the starting waypoint of the flight route is any corner of the extended shooting region, and each sub-route is parallel to one of the sides of the extended shooting region.
146. The apparatus of claim 144, wherein the side overlap rate is set by a user.
147. The apparatus of claim 135, wherein the first location information is set by a user; or,
the region to be shot is determined by importing an external file, and the first position information is recorded in the external file.
148. The device of claim 135, wherein the one or more processors, prior to receiving the flight path sent by the control device of the drone and the sequence of shots corresponding to each waypoint on the flight path, are further configured, individually or collectively, to:
acquiring a trigger instruction for indicating to enter an oblique shooting mode;
and entering the oblique shooting mode.
149. The device of claim 135, wherein the one or more processors, when controlling the imaging device onboard the drone based on the flight route and the imaging sequence, are individually or collectively further configured to:
controlling the unmanned aerial vehicle to fly according to the flight route;
according to the shooting sequence, in the process that the unmanned aerial vehicle flies from a current shooting point to a next shooting point, controlling the cradle head on the unmanned aerial vehicle to switch its attitude, so that the shooting device on the cradle head is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point;
and acquiring images shot by the shooting device at each shooting point.
150. The device of claim 149, wherein the shooting device performs shooting without affecting the flight of the drone.
151. The apparatus of claim 149, wherein the one or more processors, when controlling, according to the shooting sequence, the cradle head on the drone to switch attitude during flight of the drone from a current shooting point to a next shooting point so that the shooting device on the cradle head is in the corresponding shooting direction when the drone arrives at each shooting point, are individually or collectively further configured to:
according to the shooting sequence, sending a shooting trigger signal to the cradle head, so that the cradle head performs attitude switching in the process that the unmanned aerial vehicle flies from a current shooting point to a next shooting point, and the shooting device on the cradle head is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point;
the shooting trigger signal is further used for instructing the cradle head to trigger the shooting device to shoot when the shooting device is in the corresponding shooting direction.
152. The device of claim 151, wherein the capture trigger signal is a timed capture trigger signal, the timed capture trigger signal configured to instruct the pan-tilt to trigger the capture device to capture based on a first timed policy;
wherein the first timing strategy comprises: the time required for the shooting device to complete shooting of each shooting sequence is a first fixed time.
153. The apparatus of claim 152, wherein the timed capture trigger signal is further configured to instruct the pan-tilt to trigger the capture device to capture based on a second timing strategy;
wherein the second timing strategy comprises: the time length required by the shooting device to complete shooting of the adjacent shooting points in the same shooting sequence is a second fixed time length.
154. The apparatus of claim 152, wherein the timed shooting trigger signal is sent to the pan-tilt once, the one or more processors being further configured, individually or collectively, to:
before the cradle head controls the shooting device to complete shooting of the first shooting point of the shooting sequence, send the timed shooting trigger signal to the cradle head.
155. The apparatus of any one of claims 152-154, wherein the number of shot sequences is a plurality, the one or more processors, individually or collectively, further configured to:
and before the cradle head controls the shooting device to complete shooting of the first shooting point of the first shooting sequence, sending all shooting sequences to the cradle head at once.
156. The apparatus of any one of claims 152-154, wherein the number of shot sequences is a plurality, the one or more processors, individually or collectively, further configured to:
and after the shooting device completes shooting of the current shooting sequence, sending the next shooting sequence to the cradle head.
157. The device of claim 151, wherein the capture trigger signal is a fixed-distance capture trigger signal, the fixed-distance capture trigger signal configured to instruct the pan-tilt to trigger the capture device to capture based on a first fixed-distance strategy;
wherein the first fixed-distance strategy comprises: the interval between adjacent shooting sequences is a first fixed interval.
158. The apparatus of claim 157, wherein the fixed-distance shooting trigger signal is further configured to instruct the pan-tilt to trigger the shooting device to shoot based on a second fixed-distance strategy;
wherein the second fixed-distance strategy comprises: the interval between adjacent shooting points in the same shooting sequence is a second fixed interval.
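The two fixed-distance strategies of claims 157–158 can be sketched together; the function, its parameters, and the five-shot sequence in the usage note are assumptions for illustration, not specifics from the patent:

```python
def shot_positions(route_length, seq_interval, shot_interval, shots_per_seq):
    """Distances along a sub-route at which the shooting device fires:
    shooting sequences start every `seq_interval` (first fixed-distance
    strategy), and within one sequence successive shots are
    `shot_interval` apart (second fixed-distance strategy)."""
    positions = []
    seq_start = 0.0
    while seq_start <= route_length:
        for i in range(shots_per_seq):
            p = seq_start + i * shot_interval
            if p <= route_length:
                positions.append(p)
        seq_start += seq_interval
    return positions
```

In an oblique-photography mission a sequence would plausibly contain five shots (nadir plus four oblique directions), but that count is our assumption here, not a limitation of the claims.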
159. The apparatus of claim 157, wherein the fixed-distance shooting trigger signal is sent to the pan-tilt a plurality of times, and the one or more processors, when sending a shooting trigger signal to the cradle head, are further configured, individually or collectively, to:
before the cradle head controls the shooting device to complete shooting of the first shooting point of each shooting sequence, respectively sending a fixed-distance shooting trigger signal to the cradle head.
160. The apparatus of claim 159, wherein the one or more processors are further configured, individually or collectively, to:
and before the cradle head controls the shooting device to finish shooting of the first shooting point of each shooting sequence, sending the shooting sequence to the cradle head.
161. The apparatus of claim 158, wherein the fixed-distance shooting trigger signal is sent to the pan-tilt a plurality of times, and the one or more processors, when sending the shooting trigger signal to the cradle head, are individually or collectively further configured to:
before the cradle head controls the shooting device to complete shooting of each shooting point, respectively sending a fixed-distance shooting trigger signal to the cradle head.
162. The apparatus of claim 161, wherein the one or more processors are further configured, individually or collectively, to:
before the cradle head controls the shooting device to finish shooting of each shooting point, respectively sending an indication signal to the cradle head, wherein the indication signal is used for indicating the target gesture of the cradle head or the shooting direction of the shooting device corresponding to the shooting point.
163. The device of claim 151, wherein the maximum allowable flight speed of the drone is determined based on the course distance of each shooting direction and the length of time required for the shooting device to complete shooting of one shooting sequence and return to an initial shooting direction;
the course distances of the shooting directions are equal, and the course distance is determined based on the preset ground resolution, the preset course overlap rate, and the number of pixels of the shooting device in the direction parallel to the flight direction of the unmanned aerial vehicle.
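A minimal sketch of the speed bound implied by claim 163: the drone may cover at most one course distance in the time the gimbal needs to shoot a full sequence and return to its initial direction. Names and the example numbers in the test are assumptions for illustration:

```python
def course_distance(ground_resolution_m, pixels_along_flight, course_overlap):
    """Spacing between successive shooting points of the same shooting
    direction along the flight path: along-track ground footprint
    (GSD * pixel count) reduced by the course overlap rate."""
    return ground_resolution_m * pixels_along_flight * (1.0 - course_overlap)

def max_allowed_speed(course_distance_m, sequence_duration_s):
    """Upper bound on flight speed so the drone does not outrun the
    gimbal: one course distance per full sequence-plus-reset cycle."""
    return course_distance_m / sequence_duration_s
```

E.g. at 3 cm/px with 3000 along-track pixels and 80% course overlap, the course distance is 18 m; if one sequence takes 6 s, the speed cap is 3 m/s (illustrative figures only).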
164. The apparatus recited in claim 163, wherein the heading overlap rate is set by a user.
165. The apparatus of claim 163, wherein the initial shooting direction is a shooting direction corresponding to one of the shooting points in the shooting sequence.
166. The apparatus of claim 151, wherein the one or more processors, when controlling the drone to fly in accordance with the flight path, are further configured, individually or collectively, to:
and controlling the real-time height between the lens of the shooting device and the region to be shot to be within a preset height range.
167. The device of claim 151, wherein the one or more processors, when controlling the cradle head on the drone to switch attitude so that the shooting device on the cradle head is in the corresponding shooting direction at each shooting point, are individually or collectively further configured to:
acquiring the real-time attitude of the unmanned aerial vehicle;
determining the deviation between the real-time attitude of the unmanned aerial vehicle and the shooting direction of the next shooting point;
and controlling the cradle head on the unmanned aerial vehicle to switch its attitude according to the deviation, so that the shooting device on the cradle head is in the corresponding shooting direction at each shooting point.
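The deviation step of claim 167 reduces, for the yaw axis, to a signed smallest-angle difference; this one-axis sketch is our illustration of the idea, not the patent's implementation:

```python
def yaw_deviation(uav_yaw_deg, target_yaw_deg):
    """Smallest signed angle (degrees, in (-180, 180]) between the
    drone's real-time yaw and the yaw required by the next shooting
    point; the cradle head is commanded to rotate by this amount so
    the shooting device ends up in the corresponding direction."""
    return (target_yaw_deg - uav_yaw_deg + 180.0) % 360.0 - 180.0
```

Wrapping through 360° matters in practice: from a heading of 350° to a target of 10° the gimbal should turn +20°, not -340°.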
168. The apparatus of claim 151, wherein the pan-tilt is a tri-axial pan-tilt, the pan-tilt configured to move about a yaw axis, a roll axis, and a pitch axis;
the one or more processors, when controlling a pan-tilt-switch attitude on the drone, are further configured, individually or collectively, to perform operations of:
and controlling any two of the yaw axis attitude, the roll axis attitude, and the pitch axis attitude of the cradle head, so as to control the cradle head to switch attitude.
169. An unmanned aerial vehicle, comprising:
a body;
a cradle head mounted on the body, the cradle head being used for mounting a photographing device; and
the camera control device of any one of claims 135-168, supported by the body, the camera control device electrically coupled to the pan-tilt.
170. A computer-readable storage medium having stored thereon a computer program, characterized in that the program, when executed by a processor, implements the photographing control method of any one of claims 101 to 134.
CN202080032440.9A 2020-07-16 2020-07-16 Shooting control method and device, unmanned aerial vehicle and computer readable storage medium Active CN113875222B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311469626.2A CN117641107A (en) 2020-07-16 2020-07-16 Shooting control method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/102249 WO2022011623A1 (en) 2020-07-16 2020-07-16 Photographing control method and device, unmanned aerial vehicle, and computer-readable storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202311469626.2A Division CN117641107A (en) 2020-07-16 2020-07-16 Shooting control method and device

Publications (2)

Publication Number Publication Date
CN113875222A CN113875222A (en) 2021-12-31
CN113875222B true CN113875222B (en) 2023-11-24

Family

ID=78982120

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202311469626.2A Pending CN117641107A (en) 2020-07-16 2020-07-16 Shooting control method and device
CN202080032440.9A Active CN113875222B (en) 2020-07-16 2020-07-16 Shooting control method and device, unmanned aerial vehicle and computer readable storage medium

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202311469626.2A Pending CN117641107A (en) 2020-07-16 2020-07-16 Shooting control method and device

Country Status (2)

Country Link
CN (2) CN117641107A (en)
WO (1) WO2022011623A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114777744B (en) * 2022-04-25 2024-03-08 中国科学院古脊椎动物与古人类研究所 Geological measurement method and device in ancient organism field and electronic equipment
CN114935942A (en) * 2022-05-20 2022-08-23 无锡海纳智能科技有限公司 Method for determining inspection route of distributed photovoltaic power station and electronic equipment
CN117470199B (en) * 2023-12-27 2024-03-15 天津云圣智能科技有限责任公司 Swing photography control method and device, storage medium and electronic equipment

Citations (6)

Publication number Priority date Publication date Assignee Title
CN106767706A (en) * 2016-12-09 2017-05-31 中山大学 A kind of unmanned plane reconnoitres the Aerial Images acquisition method and system of the scene of a traffic accident
CN107504957A (en) * 2017-07-12 2017-12-22 天津大学 The method that three-dimensional terrain model structure is quickly carried out using unmanned plane multi-visual angle filming
CN110771141A (en) * 2018-11-19 2020-02-07 深圳市大疆创新科技有限公司 Shooting method and unmanned aerial vehicle
WO2020103022A1 (en) * 2018-11-21 2020-05-28 广州极飞科技有限公司 Surveying and mapping system, surveying and mapping method and apparatus, device and medium
CN111226185A (en) * 2019-04-22 2020-06-02 深圳市大疆创新科技有限公司 Flight route generation method, control device and unmanned aerial vehicle system
CN111373339A (en) * 2019-05-17 2020-07-03 深圳市大疆创新科技有限公司 Flight task generation method, control terminal, unmanned aerial vehicle and storage medium

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
US10364026B1 (en) * 2015-09-21 2019-07-30 Amazon Technologies, Inc. Track and tether vehicle position estimation
CN109032165B (en) * 2017-07-21 2021-09-10 广州极飞科技股份有限公司 Method and device for generating unmanned aerial vehicle air route
US10364027B2 (en) * 2017-10-24 2019-07-30 Loveland Innovations, LLC Crisscross boustrophedonic flight patterns for UAV scanning and imaging
EP3746745A4 (en) * 2018-01-29 2021-08-11 AeroVironment, Inc. Methods and systems for determining flight plans for vertical take-off and landing (vtol) aerial vehicles


Non-Patent Citations (3)

Title
Muhammad Attamimi; Ronny Mardiyanto; Astria Nur Irfansyah. Inclined Image Recognition for Aerial Mapping by Unmanned Aerial Vehicles. 2018 International Seminar on Intelligent Technology and Its Applications (ISITIA). 2019, 333-337. *
Zhao Hongyan; Huang Xueqin. Application of UAV oblique photography technology in emergency surveying and mapping of geological disasters. World Nonferrous Metals. 2020, (09), 183-184. *
Wei Lai; Hu Zhuowei; Chen Tianbo; Hu Shunqiang; Chen Cheng; Zhao Wenji. UAV route design for the single-camera oblique photography method. Science of Surveying and Mapping. 2018, Vol. 43 (06), 147-155. *

Also Published As

Publication number Publication date
CN113875222A (en) 2021-12-31
CN117641107A (en) 2024-03-01
WO2022011623A1 (en) 2022-01-20

Similar Documents

Publication Publication Date Title
US11377211B2 (en) Flight path generation method, flight path generation system, flight vehicle, program, and storage medium
CN110771141B (en) Shooting method and unmanned aerial vehicle
CN113875222B (en) Shooting control method and device, unmanned aerial vehicle and computer readable storage medium
JP6878567B2 (en) 3D shape estimation methods, flying objects, mobile platforms, programs and recording media
JP6765512B2 (en) Flight path generation method, information processing device, flight path generation system, program and recording medium
WO2018120350A1 (en) Method and device for positioning unmanned aerial vehicle
US11122209B2 (en) Three-dimensional shape estimation method, three-dimensional shape estimation system, flying object, program and recording medium
WO2018120351A1 (en) Method and device for positioning unmanned aerial vehicle
US20210185235A1 (en) Information processing device, imaging control method, program and recording medium
CN111699454B (en) Flight planning method and related equipment
CN111247389B (en) Data processing method and device for shooting equipment and image processing equipment
WO2020019175A1 (en) Image processing method and apparatus, and photographing device and unmanned aerial vehicle
JP2021096865A (en) Information processing device, flight control instruction method, program, and recording medium
CN111344650B (en) Information processing device, flight path generation method, program, and recording medium
CN112313942A (en) Control device for image processing and frame body control
JP2020095519A (en) Shape estimation device, shape estimation method, program, and recording medium
WO2018188086A1 (en) Unmanned aerial vehicle and control method therefor
JP2019158515A (en) Unmanned flying body, information processor, method for acquiring image information, and image information acquisition program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant