CN113875222A - Shooting control method and device, unmanned aerial vehicle and computer readable storage medium - Google Patents


Info

Publication number
CN113875222A
Authority
CN
China
Prior art keywords
shooting, unmanned aerial vehicle, area, sequence
Prior art date
Legal status
Granted
Application number
CN202080032440.9A
Other languages
Chinese (zh)
Other versions
CN113875222B
Inventor
吴利鑫
何纲
黄振昊
方朝晖
Current Assignee
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd
Priority to CN202311469626.2A (published as CN117641107A)
Publication of CN113875222A
Application granted; publication of CN113875222B
Legal status: Active

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U20/00Constructional aspects of UAVs
    • B64U20/80Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U20/87Mounting of imaging devices, e.g. mounting of gimbals
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/02Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/64Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/66Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661Transmitting camera control signals through networks, e.g. control via the Internet
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • B64U2201/20Remote controls


Abstract

A shooting control method and device, an unmanned aerial vehicle, and a computer-readable storage medium. The method comprises: acquiring first position information of an area to be shot and second position information of an extended shooting area, where the extended shooting area is obtained by expanding the area to be shot and the second position information is determined from the first position information; determining, from the first and second position information, third position information of the effective shooting areas in different shooting directions; and determining, from the third position information and a preset flight route of the unmanned aerial vehicle, a shooting sequence corresponding to each waypoint on the flight route. Each shooting sequence comprises one or more consecutive shooting points whose shooting directions differ from one another, and the shooting point in each shooting direction lies within the effective shooting area of that direction. Because every shooting point in every shooting sequence lies within the effective shooting area of its direction, the generation of invalid image data is avoided.

Description

Shooting control method and device, unmanned aerial vehicle and computer readable storage medium
Technical Field
The present application relates to the field of photography, and in particular, to a photography control method and apparatus, an unmanned aerial vehicle, and a computer-readable storage medium.
Background
Oblique photography mounts multiple cameras on an unmanned aerial vehicle (UAV) and collects images simultaneously from one vertical and four oblique viewing angles. Compared with traditional photography it adds four oblique shooting angles, so richer information such as side-face texture can be acquired, which suits fields such as surveying and mapping that need multi-directional feature information of the photographed object. In the related art, one way to shoot in multiple directions is to mount a multi-lens camera assembly (for example, a five-lens camera) on the UAV and capture images in several directions at once. Such multi-lens assemblies are expensive and heavy; they are usually mounted directly on the UAV body through a damping system, lack a mechanical gimbal for stabilization, and image poorly. To reduce their volume they adopt a rolling shutter or an electronic global shutter: the rolling shutter suffers from the "jello effect" during fast-motion shooting, which degrades modeling accuracy, while the electronic global shutter's imaging quality is poor and likewise harms the modeling result. Another way is to mount a single-lens shooting device on the UAV and cover multiple directions with multiple flight routes. Compared with a multi-lens assembly, a single-lens shooting device is cheap and light, can be carried on the UAV body through a gimbal, and images better.
To ensure that images of the area to be shot are captured in every direction, route planning first expands the area to be shot and then plans the route over the expanded area (that is, the extended area to be shot). The UAV flies along the planned route and collects images in every direction at every shooting point, so a large amount of invalid image data is produced on the portions of the route outside the area to be shot, which wastes storage space and complicates the modeling process.
Disclosure of Invention
The application provides a shooting control method and device, an unmanned aerial vehicle and a computer readable storage medium.
In a first aspect, an embodiment of the present application provides a shooting control method, where the method includes:
acquiring first position information of a to-be-shot area and second position information of an externally-expanded shot area, wherein the externally-expanded shot area is obtained by expanding the to-be-shot area, and the second position information is determined according to the first position information;
determining third position information of effective shooting areas in different shooting directions according to the first position information and the second position information;
determining a shooting sequence corresponding to each waypoint on the flight route according to the third position information and a preset flight route of the unmanned aerial vehicle;
each shooting sequence comprises one or more consecutive shooting points, the shooting directions of the one or more shooting points of each shooting sequence differ from one another, and the shooting point in each shooting direction is located in the effective shooting area of that shooting direction.
In a second aspect, an embodiment of the present application provides a shooting control apparatus, including:
storage means for storing program instructions; and
one or more processors that invoke program instructions stored in the storage device, the one or more processors individually or collectively configured to, when the program instructions are executed, perform operations comprising:
acquiring first position information of a to-be-shot area and second position information of an externally-expanded shot area, wherein the externally-expanded shot area is obtained by expanding the to-be-shot area, and the second position information is determined according to the first position information;
determining third position information of effective shooting areas in different shooting directions according to the first position information and the second position information;
determining a shooting sequence corresponding to each waypoint on the flight route according to the third position information and a preset flight route of the unmanned aerial vehicle;
each shooting sequence comprises one or more consecutive shooting points, the shooting directions of the one or more shooting points of each shooting sequence differ from one another, and the shooting point in each shooting direction is located in the effective shooting area of that shooting direction.
In a third aspect, an embodiment of the present application provides an unmanned aerial vehicle, including:
a body;
a gimbal, mounted on the body and used for carrying a shooting device; and
the shooting control device of the second aspect, which is supported by the body and is electrically connected with the gimbal.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements the shooting control method of the first aspect.
In a fifth aspect, an embodiment of the present application provides a shooting control method, where the method includes:
receiving a flight path sent by a control device of an unmanned aerial vehicle and a shooting sequence corresponding to each waypoint on the flight path;
controlling a shooting device carried on the unmanned aerial vehicle to shoot based on the flight route and a shooting sequence corresponding to each waypoint;
the shooting method comprises the steps that each shooting sequence comprises one or more continuous shooting points, the shooting directions of the one or more shooting points of each shooting sequence are different, the shooting points in all the shooting directions are located in effective shooting areas of the shooting directions, the effective shooting areas are determined according to first position information of areas to be shot and second position information of extended shooting areas, the extended shooting areas are obtained by expanding the areas to be shot, and the second position information is determined according to the first position information.
In a sixth aspect, an embodiment of the present application provides a shooting control apparatus, including:
storage means for storing program instructions; and
one or more processors that invoke program instructions stored in the storage device, the one or more processors individually or collectively configured to, when the program instructions are executed, perform operations comprising:
receiving a flight path sent by a control device of an unmanned aerial vehicle and a shooting sequence corresponding to each waypoint on the flight path;
controlling a shooting device carried on the unmanned aerial vehicle to shoot based on the flight route and a shooting sequence corresponding to each waypoint;
the shooting method comprises the steps that each shooting sequence comprises one or more continuous shooting points, the shooting directions of the one or more shooting points of each shooting sequence are different, the shooting points in all the shooting directions are located in effective shooting areas of the shooting directions, the effective shooting areas are determined according to first position information of areas to be shot and second position information of extended shooting areas, the extended shooting areas are obtained by expanding the areas to be shot, and the second position information is determined according to the first position information.
In a seventh aspect, an embodiment of the present application provides an unmanned aerial vehicle, including:
a body;
a gimbal, mounted on the body and used for carrying a shooting device; and
the shooting control device of the sixth aspect, which is supported by the body and is electrically connected with the gimbal.
In an eighth aspect, embodiments of the present application provide a computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements the shooting control method of the fifth aspect.
According to the technical solutions provided by the embodiments of the present application, when the shooting sequences are planned, the shooting point in each shooting direction of every shooting sequence is guaranteed to lie within the effective shooting area of that direction. This not only avoids generating invalid image data but also reduces the number of shooting points, shortens the shooting time, and improves the efficiency of multi-direction shooting.
Drawings
To describe the technical solutions in the embodiments of the present application more clearly, the drawings required in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of an unmanned aerial vehicle in an embodiment of the present application;
fig. 2 is a schematic method flow diagram of a shooting control method in an embodiment of the present application;
fig. 3 is a schematic diagram of a position relationship between a region to be shot and an extended shot region in an embodiment of the present application;
FIG. 4A is a schematic illustration of a flight path in one embodiment of the present application;
FIG. 4B is a schematic illustration of a flight path in another embodiment of the present application;
fig. 5A is a schematic diagram of a position relationship between an effective shooting area and an area to be shot in one shooting direction in an embodiment of the present application;
fig. 5B is a schematic diagram of a position relationship between an effective shooting area and an area to be shot in another shooting direction in an embodiment of the present application;
fig. 5C is a schematic diagram of a position relationship between an effective shooting area and an area to be shot in another shooting direction in an embodiment of the present application;
fig. 5D is a schematic diagram of a position relationship between an effective shooting area and an area to be shot in another shooting direction in an embodiment of the present application;
fig. 6A is a comparison diagram of images shot by the unmanned aerial vehicle at different shooting points in the same shooting direction in an embodiment of the present application;
FIG. 6B is a schematic view of a flight path in another embodiment of the present application;
fig. 7 is a schematic diagram illustrating an implementation manner of controlling a shooting device mounted on an unmanned aerial vehicle to shoot based on a flight line and a shooting sequence in an embodiment of the present application;
fig. 8 is a schematic diagram of a positional relationship between images of shot points in the same shooting direction in two adjacent shooting sequences in an embodiment of the present application;
fig. 9 is a schematic diagram of a process in which a gimbal in an embodiment of the present application performs the shooting of a certain shooting sequence;
fig. 10 is a schematic method flow diagram of a photographing control method in another embodiment of the present application;
fig. 11 is a block diagram of a configuration of a photographing control apparatus in an embodiment of the present application;
fig. 12 is a block diagram of a structure of a drone in an embodiment of the present application.
Detailed Description
Traditional surveying and mapping is carried out station by station with a total station or a handheld GNSS (Global Navigation Satellite System) device. Its drawbacks are low efficiency, high operating difficulty, and high operating cost; it cannot satisfy large-area, high-accuracy, high-resolution mapping and has gradually been replaced by manned-aircraft and UAV surveying. Manned-aircraft or UAV surveying can also be used to build a three-dimensional model of the surveyed area: the area to be shot is photographed in multiple directions by oblique photography, and the multi-direction images are processed and solved with a three-dimensional modeling algorithm to obtain a model containing three-dimensional spatial information.
To ensure that images of the area to be shot are captured in every direction, route planning for oblique photography first expands the area to be shot and then plans the route over the expanded area (that is, the extended area to be shot). The UAV flies along the planned route and collects images in every direction at every shooting point, so a large amount of invalid image data is produced on the portions of the route outside the area to be shot, which wastes storage space and complicates the modeling process.
To solve the above problem, in the embodiments of the present application the shooting sequences are planned so that the shooting point in each shooting direction of every sequence lies within the effective shooting area of that direction. This not only prevents invalid image data from being generated but also reduces the number of shooting points, shortens the shooting time, and improves the efficiency of multi-direction shooting.
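As a rough illustration of this idea (a sketch, not the patent's actual implementation), the following Python snippet keeps, at each waypoint, only the shooting directions whose effective area contains that waypoint. The function names and the axis-aligned-rectangle representation of areas are assumptions made for this example:

```python
def point_in_rect(point, rect):
    """True if point (x, y) lies inside rect = (xmin, ymin, xmax, ymax)."""
    x, y = point
    xmin, ymin, xmax, ymax = rect
    return xmin <= x <= xmax and ymin <= y <= ymax

def plan_sequences(waypoints, effective_areas):
    """For each waypoint, build a shooting sequence containing only the
    directions whose effective shooting area contains the waypoint.
    An empty sequence means no image is taken there, so no invalid
    image data is stored."""
    return [
        [direction for direction, rect in effective_areas.items()
         if point_in_rect(wp, rect)]
        for wp in waypoints
    ]

# Hypothetical effective areas (metres) and a short route:
areas = {"nadir": (0, 0, 100, 100), "front": (0, -20, 100, 80)}
route = [(50, 50), (50, 90), (150, 50)]
sequences = plan_sequences(route, areas)  # [['nadir', 'front'], ['nadir'], []]
```

The last waypoint lies outside every effective area, so its sequence is empty and no invalid image would be captured there.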
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
It should be noted that, in the following examples and embodiments, features may be combined with each other without conflict.
In the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes the association relationship of associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of the following" or similar expressions refers to any combination of these items, including any combination of single or plural items. For example, at least one of a, b, or c may represent: a; b; c; a and b; a and c; b and c; or a, b and c, where a, b and c may each be single or multiple.
Fig. 1 is a schematic structural diagram of an unmanned aerial vehicle in an embodiment of the present application. Referring to fig. 1, the unmanned aerial vehicle of the embodiment of the present application may include a body 100, a shooting device 200, and a gimbal 300, where the shooting device 200 is mounted on the body 100 through the gimbal 300. The unmanned aerial vehicle may be a fixed-wing UAV or a multi-rotor UAV, and its type may be selected according to actual requirements. For example, when the gimbal 300 and the shooting device 200 are heavy, a fixed-wing UAV of large size and payload may be selected to carry them; when the gimbal 300 and the shooting device 200 are light, a multi-rotor UAV may be selected.
The embodiment of the present application uses a single shooting device. When the UAV performs oblique photography, only one shooting device is needed; although its pixel count is large, its volume and weight are much smaller than those of a multi-lens assembly, which greatly reduces the weight and size of the UAV. The shooting device 200 may be an integrated camera or a combination of an image sensor and a lens; it should be noted that the shooting device 200 of the embodiment of the present application has a single lens. In addition, the gimbal 300 of the embodiment of the present application may be a single-axis, two-axis, three-axis, or other multi-axis gimbal.
The unmanned aerial vehicle can be applied to the field of surveying and mapping. Taking the ground as the photographed object, the UAV carrying the shooting device 200 collects ground images, and software then reconstructs a three-dimensional or two-dimensional map from them. The resulting map can serve different industries: in power inspection, the reconstructed map can be used to check line faults; in road planning, it can be used for route siting; anti-drug police can use a reconstructed three-dimensional map to inspect poppy cultivation in the mountains; and so on. Of course, the UAV is not limited to surveying and mapping and can be used in any other field that needs multi-directional feature information of a photographed object. The photographed object is not limited to the ground either; it may be a large building, a mountain, and the like.
Fig. 2 is a schematic method flow diagram of a shooting control method in an embodiment of the present application; referring to fig. 2, the photographing control method according to the embodiment of the present application may include steps S201 to S203.
In S201, first location information of a to-be-photographed area and second location information of an extended photographing area are obtained, where the extended photographing area is obtained by expanding the to-be-photographed area, and the second location information is determined according to the first location information.
The user may define the area to be photographed in different ways, such as by manually marking points or by importing an external file. Accordingly, different strategies may be used to obtain the first position information. For example, in some embodiments the first position information is set by the user, e.g., entered by manually marking points; in other embodiments the area to be photographed is determined by importing an external file in which the first position information is recorded. Optionally, before S201 is executed, prompt information may be output to prompt the user to define the area to be photographed.
The region to be photographed in the embodiment of the present application may be a square region, or may be a region of other shapes, such as a circular region, a pentagonal region, and the like.
For example, the area to be photographed is a square area, and the first position information may include position information of four corners of the square area, but of course, the first position information may also include position information of other positions of the square area.
In addition, in some embodiments, before the first position information of the to-be-shot area and the second position information of the outward-extended shot area are obtained, if a trigger instruction indicating to enter the oblique shooting mode is obtained, the oblique shooting mode is entered, that is, after the oblique shooting mode is entered, a shooting sequence is planned.
It should be understood that the planning of the shooting sequence may be performed by the control means of the drone, or by the drone, while the process of shooting using the planned shooting sequence is performed by the drone. Therefore, if the shooting sequence planning process is performed in the control device, the control device may be triggered to enter the oblique shooting mode before the control device performs the planning of the shooting sequence; if the shooting sequence planning process is performed in the unmanned aerial vehicle, the unmanned aerial vehicle needs to be triggered to enter an oblique shooting mode before the unmanned aerial vehicle plans the shooting sequence. In addition, the unmanned aerial vehicle performs shooting using a shooting sequence planned by the control device in the oblique shooting mode.
In this application embodiment, unmanned aerial vehicle's controlling means can be remote controller or other terminal equipment that can control unmanned aerial vehicle, like cell-phone, panel computer, portable computer, desktop, intelligent wearing equipment etc..
The second position information also depends on the strategy used to expand the area to be shot. For example, when the extended shooting area is obtained by expanding the area to be shot equally in all directions, the second position information may be determined from the first position information together with the ratio of the extended shooting area to the area to be shot, or from the first position information together with the distance between the edge of the extended shooting area and the edge of the area to be shot. When the extended shooting area is obtained by expanding the area to be shot by different amounts in at least some directions, the second position information may be determined from the first position information together with the distance, in each direction, between the edge of the extended shooting area and the corresponding edge of the area to be shot.
Illustratively, the extended shooting area is obtained by expanding the area to be shot outward by a first preset distance in each direction. For example, referring to fig. 3, the area to be photographed is a rectangular area 10, and expanding the rectangular area 10 outward by a first preset distance D_ext in each direction yields the extended shooting area 20.
Optionally, the first preset distance is determined based on the flight height of the unmanned aerial vehicle and the installation angle of the shooting device carried on it, and its setting also takes into account factors such as the resolution of the collected images and route-planning requirements. Illustratively, the first preset distance D_ext may be calculated as:

D_ext = H / tan(α)    (1)

In formula (1), H is the flight height and α is the installation angle of the shooting device; illustratively, α is the included angle between the optical axis of the shooting device's lens and the ground plane.
It should be understood that the first predetermined distance may be determined using other strategies.
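As a concrete sketch, and assuming the extension distance takes the form D_ext = H / tan(α) (one common choice, reconstructed here; the original formula image is not available) and that regions are represented as axis-aligned rectangles, the expansion of fig. 3 can be computed as:

```python
import math

def ext_distance(height_m, mount_angle_deg):
    """First preset distance from flight height H and installation angle
    alpha (optical axis to ground plane), assuming D_ext = H / tan(alpha)."""
    return height_m / math.tan(math.radians(mount_angle_deg))

def expand_rect(rect, d):
    """Expand rect = (xmin, ymin, xmax, ymax) outward by d on every side,
    as when rectangular area 10 becomes extended shooting area 20."""
    xmin, ymin, xmax, ymax = rect
    return (xmin - d, ymin - d, xmax + d, ymax + d)

area_to_shoot = (0.0, 0.0, 500.0, 300.0)   # metres, illustrative values
d_ext = ext_distance(100.0, 45.0)          # about 100 m at H = 100 m, alpha = 45 deg
extended_area = expand_rect(area_to_shoot, d_ext)
```

With these illustrative values, every edge of the rectangle moves outward by roughly 100 m.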
The flight height can likewise be determined with different strategies. In some embodiments it is set by the user and entered through the control device of the UAV, which meets different user needs and is very flexible. In other embodiments it is determined from parameters of the shooting device mounted on the UAV and a preset ground resolution; for example, the parameters include the focal length of the shooting device and the side length of a single pixel of its image sensor, and the flight height may be calculated as:
H = f × GSD / pix    (2)

In formula (2), H is the flight height, f is the focal length of the shooting device, GSD (Ground Sampling Distance) is the preset ground resolution, and pix is the side length of a single pixel of the image sensor of the shooting device. It should be understood that the parameters of the shooting device are not limited to those listed above and may include others, and the calculation formula of the flight height is not limited to formula (2).
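For instance, formula (2) can be evaluated directly. The numeric values below (35 mm focal length, 3 cm-per-pixel ground resolution, 4.4 µm pixel side) are illustrative assumptions, not values from the patent:

```python
def flight_height(focal_m, gsd_m_per_px, pixel_side_m):
    """Formula (2): H = f * GSD / pix, with all lengths in metres."""
    return focal_m * gsd_m_per_px / pixel_side_m

h = flight_height(0.035, 0.03, 4.4e-6)   # roughly 238.6 m
```

A finer target ground resolution (smaller GSD) lowers the required flight height, as the formula makes explicit.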
In S202, third position information of the effective shooting area in different shooting directions is determined from the first position information and the second position information.
The shooting directions of the embodiment of the present application may include at least two of the following: a front shooting direction tilted from the vertical toward the front of the UAV, a rear shooting direction tilted from the vertical toward the rear of the UAV, a left shooting direction tilted from the vertical toward the left of the UAV, a right shooting direction tilted from the vertical toward the right of the UAV, or a nadir shooting direction pointing vertically downward. It should be noted that when the UAV is level, the nose points to the front and the tail points to the rear.
At least two of the above shooting directions can be selected according to actual needs. For example, during surveying and mapping, the shooting directions may include the forward shooting direction (the shooting device shoots a forward oblique image of the object), the backward shooting direction (a backward oblique image), the left shooting direction (a left oblique image) and the right shooting direction (a right oblique image); or they may include the forward, backward, left and right shooting directions together with the positive shooting direction (the shooting device shoots an orthographic image of the object). It is understood that in other usage scenarios other shooting directions may be selected to meet the corresponding requirements.
The effective shooting areas of the forward, backward, left, right and positive shooting directions are, respectively: the area obtained after the area to be shot moves a second preset distance in a first direction, the area obtained after the area to be shot moves the second preset distance in a second direction, the area obtained after the area to be shot moves the second preset distance in a third direction, the area obtained after the area to be shot moves the second preset distance in a fourth direction, and the area to be shot itself. The second preset distance may be equal to or different from the first preset distance. It can be understood that the effective shooting area of each shooting direction lies within the extended shooting area, and therefore the second preset distance is less than or equal to the first preset distance.
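The mapping above can be sketched by translating an axis-aligned rectangle. The coordinate frame, the assignment of the first to fourth directions to axes (borrowed from the fig. 4A discussion) and all distances below are illustrative assumptions:

```python
def shift(rect, dx, dy):
    """Translate an axis-aligned rectangle (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = rect
    return (xmin + dx, ymin + dy, xmax + dx, ymax + dy)

area_to_shoot = (0.0, 0.0, 100.0, 60.0)   # hypothetical planar coordinates, meters
d2 = 10.0                                  # assumed second preset distance

# Assigning first=down, second=up, third=right, fourth=left:
effective = {
    "forward":  shift(area_to_shoot, 0.0, -d2),  # moved in the first direction
    "backward": shift(area_to_shoot, 0.0, +d2),  # second direction
    "left":     shift(area_to_shoot, +d2, 0.0),  # third direction
    "right":    shift(area_to_shoot, -d2, 0.0),  # fourth direction
    "positive": area_to_shoot,                   # the area to be shot itself
}
```

Each shifted rectangle stays inside the extended shooting area as long as d2 does not exceed the first preset distance.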
It should be noted that, when the area to be shot is extended by different distances in different directions to obtain the extended shooting area, the distances moved in the respective directions need not be equal. For example, the effective shooting area in the forward shooting direction is the area obtained after the area to be shot moves a second preset distance in the first direction, where the second preset distance is less than or equal to the distance by which the extended shooting area extends from the area to be shot in the first direction; and the effective shooting area in the backward shooting direction is the area obtained after the area to be shot moves a third preset distance in the second direction, where the third preset distance is less than or equal to the distance by which the extended shooting area extends from the area to be shot in the second direction.
The first direction of the present embodiment is opposite to the second direction, and the third direction is opposite to the fourth direction. Specifically, the first direction, the second direction, the third direction, or the fourth direction is related to a shape of a flight path of the drone.
For example, the flight route may include a plurality of parallel sub-routes, adjacent sub-routes being connected at one end to form the flight route. Continuing with the area to be shot and the extended shooting area of the embodiment shown in fig. 3, optionally, the starting waypoint of the flight route is any corner of the extended shooting area, and the sub-routes are parallel to one of the edges of the extended shooting area. For example, referring to fig. 4A, the starting waypoint A of the flight route 30 is the lower left corner of the extended shooting area and the end point B of the flight route 30 is its upper right corner; referring to fig. 4B, the starting waypoint C of the flight route 40 is the upper left corner of the extended shooting area and the end point D of the flight route 40 is its lower right corner. Of course, the starting waypoint may also be the upper right or lower right corner of the extended shooting area, with the end point correspondingly the lower left or upper left corner. In addition, in the embodiments shown in fig. 4A and 4B the sub-routes are parallel to the short side of the extended shooting area; it is understood that the sub-routes may instead be parallel to its long side.
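A minimal sketch of such a route over a rectangular extended shooting area, starting at the lower left corner as in fig. 4A; the 40 m × 100 m extent and 20 m spacing are assumed values:

```python
def serpentine_route(xmin, ymin, xmax, ymax, spacing):
    """Back-and-forth route over a rectangular extended shooting area;
    sub-routes are parallel to the y-axis edges, joined at alternate ends."""
    waypoints, x, upward = [], xmin, True
    while x <= xmax + 1e-9:
        ys = (ymin, ymax) if upward else (ymax, ymin)
        waypoints += [(x, ys[0]), (x, ys[1])]
        x += spacing
        upward = not upward
    return waypoints

# Starts at the lower-left corner, like waypoint A in fig. 4A.
route = serpentine_route(0, 0, 40, 100, 20)
```

With an odd number of sub-routes the route ends at the opposite corner, matching end point B in fig. 4A.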
For example, for the flight route shown in fig. 4A, taking the up, down, left and right directions shown in fig. 4A as reference, the first direction is down, the second direction is up, the third direction is right and the fourth direction is left. The effective shooting area in the forward shooting direction is then the area 51 shown in fig. 5A, and the area obtained by removing the area 51 from the extended shooting area 20 in fig. 5A is the ineffective shooting area in the forward shooting direction; the effective shooting area in the backward shooting direction is the area 52 shown in fig. 5B, and the area obtained by removing the area 52 from the extended shooting area 20 in fig. 5B is the ineffective shooting area in the backward shooting direction; the effective shooting area in the left shooting direction is the area 53 shown in fig. 5C, and the area obtained by removing the area 53 from the extended shooting area 20 in fig. 5C is the ineffective shooting area in the left shooting direction; the effective shooting area in the right shooting direction is the area 54 shown in fig. 5D, and the area obtained by removing the area 54 from the extended shooting area 20 in fig. 5D is the ineffective shooting area in the right shooting direction.
For the flight route shown in fig. 4B, taking the up, down, left and right directions shown in fig. 4B as reference, the first direction is up, the second direction is down, the third direction is left and the fourth direction is right. The effective shooting area in the forward shooting direction is then the area 52 shown in fig. 5B, and the area obtained by removing the area 52 from the extended shooting area 20 in fig. 5B is the ineffective shooting area in the forward shooting direction; the effective shooting area in the backward shooting direction is the area 51 shown in fig. 5A, and the area obtained by removing the area 51 from the extended shooting area 20 in fig. 5A is the ineffective shooting area in the backward shooting direction; the effective shooting area in the left shooting direction is the area 54 shown in fig. 5D, and the area obtained by removing the area 54 from the extended shooting area 20 in fig. 5D is the ineffective shooting area in the left shooting direction; the effective shooting area in the right shooting direction is the area 53 shown in fig. 5C, and the area obtained by removing the area 53 from the extended shooting area 20 in fig. 5C is the ineffective shooting area in the right shooting direction.
For the flight routes shown in fig. 4A and 4B, the effective shooting area in the positive shooting direction is in both cases the area to be shot 10, and the area obtained by removing the area to be shot 10 from the extended shooting area 20 in fig. 4A and 4B is the ineffective shooting area in the positive shooting direction. In addition, d1 in FIGS. 5A to 5D is the second preset distance, which here equals the first preset distance. It will be appreciated that the flight route is not limited to those shown in fig. 4A and 4B and may be configured otherwise.
The manner of determining the flight route can be selected according to need; for example, the determination process includes, but is not limited to, the following steps:
(1) Determining the lateral spacing between two adjacent sub-routes of the flight route according to the preset ground resolution, a preset lateral overlap ratio and the number of pixels of the shooting device mounted on the unmanned aerial vehicle perpendicular to the flight direction (that is, the number of pixels of the image sensor of the shooting device in the direction perpendicular to the flight direction of the unmanned aerial vehicle).

Exemplarily, the lateral spacing D_route is calculated as follows:

D_route = GSD × (1 − γ_lateral) × n_H    (3)

In formula (3), GSD is the ground resolution, γ_lateral is the lateral overlap ratio, and n_H is the number of pixels of the shooting device perpendicular to the flight direction of the unmanned aerial vehicle.

It will be appreciated that the calculation of the lateral spacing D_route is not limited to formula (3) and may be performed in other ways.
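A sketch of formula (3); the 5472-pixel sensor width, 3 cm/pixel GSD and 70% lateral overlap are hypothetical values chosen only for illustration:

```python
def lateral_spacing(gsd_m_per_px: float, lateral_overlap: float,
                    n_pixels_across: int) -> float:
    """Formula (3): D_route = GSD * (1 - gamma_lateral) * n_H."""
    return gsd_m_per_px * (1.0 - lateral_overlap) * n_pixels_across

d_route = lateral_spacing(0.03, 0.70, 5472)   # about 49.2 m between sub-routes
```

Intuitively, GSD × n_H is the ground footprint width of one image, and multiplying by (1 − γ_lateral) leaves exactly the required sideways overlap between adjacent sub-routes.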
Taking the shooting of an orthographic image as an example (the shooting direction of the shooting device being the positive shooting direction), as shown in fig. 6A: since shooting point 1 and shooting point 2 lie on the same sub-route, the overlap ratio along the flight direction between the image shot at shooting point 1 and the image shot at shooting point 2 is called the heading overlap ratio. Shooting point 1 and shooting point 12 lie on two adjacent sub-routes, and the overlap ratio perpendicular to the flight direction between the image shot at shooting point 1 and the image shot at shooting point 12 is called the lateral overlap ratio.
(2) Determining the flight route according to the second position information and the lateral spacing.

That is, the route is planned within the extended shooting area, with the lateral spacing between adjacent sub-routes of the flight route being the lateral spacing determined in step (1).
During shooting, the images captured along the route need to maintain a certain overlap ratio so that they can be used in fields such as surveying and mapping. The lateral overlap ratio may be a default value or may be set by the user. Exemplarily, the lateral overlap ratio is set by the user, for example input through a control device of the unmanned aerial vehicle; determining it in this way can meet different user requirements and is highly flexible. Optionally, the lateral overlap ratio is greater than or equal to 65% and less than or equal to 80%, for example 65%, 70%, 75%, 80% or another value between 65% and 80%.
It will be appreciated that the flight route may be planned in other ways. For example, referring to fig. 6B, the flight route may be a #-shaped route comprising two routes (route 60 and route 70 in fig. 6B) whose sub-routes are perpendicular to each other, each route completing the acquisition of oblique images in two or three shooting directions: one route acquires the left and right images, or the left, right and orthographic images, while the other acquires the forward and backward images, or the forward, backward and orthographic images. Thus, if only the left and right images, only the left, right and orthographic images, only the forward and backward images, or only the forward, backward and orthographic images need to be shot, the flight route can be planned as a #-shaped route. The lateral spacing of the #-shaped route is the same as the lateral spacing D_route of the above embodiment.
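The two perpendicular passes of a #-shaped route can be sketched by generating the same serpentine pattern along each axis in turn; the helper below and all coordinates are hypothetical, not part of the disclosure:

```python
def grid_route(xmin, ymin, xmax, ymax, spacing, vertical=True):
    """One serpentine pass; sub-routes parallel the y-axis when vertical=True,
    otherwise the x-axis (hypothetical helper, assumed rectangular area)."""
    pts, flip = [], False
    t = xmin if vertical else ymin
    hi = xmax if vertical else ymax
    ends = (ymin, ymax) if vertical else (xmin, xmax)
    while t <= hi + 1e-9:
        a, b = ends if not flip else ends[::-1]
        pts += [(t, a), (t, b)] if vertical else [(a, t), (b, t)]
        t += spacing
        flip = not flip
    return pts

# A "#"-shaped route: two routes whose sub-routes are mutually perpendicular.
route_1 = grid_route(0, 0, 40, 100, 20, vertical=True)   # e.g. left/right (and ortho) images
route_2 = grid_route(0, 0, 40, 100, 20, vertical=False)  # e.g. forward/backward images
```

Flying route_1 and route_2 one after the other covers the area twice, once per sub-route orientation, matching the two-pass acquisition described above.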
In the embodiment of the present application, each shooting direction corresponds to a preset target attitude of the gimbal; that is, when the gimbal reaches the preset target attitude, the shooting device is in the corresponding shooting direction.
In S203, according to the third position information and a preset flight path of the unmanned aerial vehicle, determining a shooting sequence corresponding to each waypoint on the flight path, wherein each shooting sequence includes one or more continuous shooting points, the shooting directions of the one or more shooting points of each shooting sequence are different, and the shooting points in each shooting direction are located in an effective shooting area in the shooting direction.
Exemplarily, the shooting directions include the forward, backward, left, right and positive shooting directions. If a shooting sequence is located on a part of the flight route outside the effective shooting area of the forward shooting direction, the sequence contains no shooting point in the forward shooting direction; the same holds for the backward, left, right and positive shooting directions. It can be understood that, in the embodiment of the present application, each shooting sequence is located on a part of the flight route within the effective shooting area of at least one of the forward, backward, left, right and positive shooting directions; that is, the position of each shooting sequence on the flight route lies in at least one effective shooting area, and each shooting sequence includes a shooting point of at least one shooting direction.
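This membership rule can be sketched as a point-in-rectangle test per direction; the rectangles and waypoints below are illustrative assumptions only:

```python
def contains(rect, point):
    """Axis-aligned containment test for rect = (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = rect
    x, y = point
    return xmin <= x <= xmax and ymin <= y <= ymax

def shooting_sequence(waypoint, effective_areas):
    """Directions shot at this waypoint: one shooting point per shooting
    direction whose effective shooting area contains the waypoint."""
    return [d for d, rect in effective_areas.items() if contains(rect, waypoint)]

effective_areas = {            # hypothetical rectangles (xmin, ymin, xmax, ymax)
    "forward":  (0, -10, 100, 50),
    "backward": (0,  10, 100, 70),
    "left":     (10,  0, 110, 60),
    "right":    (-10, 0,  90, 60),
    "positive": (0,   0, 100, 60),
}
full_sequence = shooting_sequence((50, 30), effective_areas)   # all 5 directions
edge_sequence = shooting_sequence((5, 5), effective_areas)     # fewer directions
```

A waypoint deep inside the area yields a 5-point sequence, while one near the boundary loses the directions whose effective areas it falls outside, matching the area-1-to-area-4 counts discussed next.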
In the embodiment of the present application, the number of shooting points in each shooting sequence is positively correlated with the number of effective shooting areas covering the position of the shooting sequence on the flight route. Following the embodiment shown in fig. 4A and 4B, a shooting sequence may be located in at least one of area 1, area 2, area 3 and area 4.
Area 1 is the overlap of the effective shooting areas of all 5 shooting directions, i.e. of the forward, backward, left, right and positive shooting directions. Area 2 is an overlap of the effective shooting areas of 4 shooting directions and comprises 4 overlap regions: forward, left, right and positive; backward, left, right and positive; forward, backward, left and positive; and forward, backward, right and positive. Area 3 is an overlap of the effective shooting areas of 3 shooting directions and comprises 4 overlap regions: forward, left and positive; forward, right and positive; backward, left and positive; and backward, right and positive. Area 4 is an effective shooting area of a single shooting direction and comprises 4 independent effective shooting areas (not overlapping the effective shooting areas of other shooting directions): the effective shooting areas of the forward, backward, left and right shooting directions.
If the position of the shooting sequence on the flight route is within area 1, the number of shooting points in the shooting sequence is 5; if within area 2, the number is 4; if within area 3, the number is 3; and if within area 4, the number is 1.
It can be understood that, in the embodiment of the present application, the shooting sequences are arranged in order, the order of the shooting sequences being consistent with the order in which the unmanned aerial vehicle passes their positions on the flight route when flying along the flight route.
When the unmanned aerial vehicle subsequently performs oblique photography according to the planned shooting sequences, a timed shooting mode or a fixed-distance shooting mode can be adopted to trigger the gimbal and the shooting device to complete the shooting process. Optionally, the time required for the shooting device to complete each shooting sequence is fixed, or the distance between adjacent shooting sequences is fixed, so that the interval in time or distance between two shooting sequences is more stable. For example, in some embodiments the time required for the shooting device to complete each shooting sequence is a first fixed duration, so that the unmanned aerial vehicle can trigger the gimbal and the shooting device in a timed manner; optionally, when performing oblique photography according to the planned shooting sequences, the unmanned aerial vehicle sends a timed shooting trigger signal to the gimbal before the gimbal controls the shooting device to complete the shooting of the first shooting point of each shooting sequence. Of course, the time required for the shooting device to complete each shooting sequence need not be a fixed duration.
Further optionally, a time length required for the photographing device to complete photographing of adjacent photographing points in the same photographing sequence is a second fixed time length, so that a photographing interval time length between adjacent photographing points in each photographing sequence is stable. Of course, the time length required for the photographing apparatus to complete photographing of adjacent photographing points in the same photographing sequence may not be a fixed time length.
It will be appreciated that the first fixed duration is greater than the second fixed duration of the same shooting sequence. The first and second fixed durations can be set as needed; exemplarily, the first fixed duration is 10 seconds and the second fixed duration is 2 seconds, though other values may also be set.
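Under the example values above (a first fixed duration of 10 s between sequence starts and a second fixed duration of 2 s between shots), the timed trigger schedule could be laid out as follows; this is a sketch of one plausible timing layout, not the patented implementation:

```python
def timed_schedule(n_sequences: int, shots_per_sequence: int,
                   first_fixed_s: float = 10.0,
                   second_fixed_s: float = 2.0):
    """Absolute trigger times: one shooting sequence starts every
    first_fixed_s seconds; within a sequence, shots are second_fixed_s apart."""
    return [[i * first_fixed_s + j * second_fixed_s
             for j in range(shots_per_sequence)]
            for i in range(n_sequences)]

schedule = timed_schedule(3, 5)   # 3 sequences of 5 shots each
```

With 5 shots at 2 s spacing, the last shot of a sequence fires at 8 s, before the next sequence starts at 10 s, which is why the first fixed duration must exceed the total of the second fixed durations.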
In some embodiments, the distance between adjacent shooting sequences is a first fixed distance, so that the unmanned aerial vehicle can trigger the gimbal and the shooting device in a fixed-distance shooting mode. Optionally, when performing oblique photography according to the planned shooting sequences, the unmanned aerial vehicle sends a fixed-distance shooting trigger signal to the gimbal before the gimbal controls the shooting device to complete the shooting of the first shooting point of each shooting sequence. After receiving the trigger signal, the gimbal first switches attitude so that the shooting device on the gimbal is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point of the corresponding shooting sequence, and the shooting device is then triggered to shoot once it is in that direction. This fixed-distance triggering makes the distance between two shooting sequences more stable.
Further optionally, the distance between adjacent shooting points in the same shooting sequence is a second fixed distance. The unmanned aerial vehicle sends a fixed-distance shooting trigger signal to the gimbal before the gimbal controls the shooting device to complete the shooting of each shooting point; each time the gimbal receives such a signal it switches attitude, so that the shooting device on the gimbal is in the shooting direction corresponding to the shooting point when the unmanned aerial vehicle reaches it, and the shooting device is then triggered to shoot, making the distance between two adjacent shooting points in each shooting sequence more stable.
The first fixed distance and the second fixed distance can be set as needed; exemplarily, the first fixed distance is 10 meters and the second fixed distance is 2 meters, though other values may also be set.
In some embodiments, the unmanned aerial vehicle triggers the gimbal and the shooting device using a combination of fixed-distance and timed shooting: the gimbal is triggered at a fixed distance and completes each shooting sequence on a timed basis. That is, when the unmanned aerial vehicle reaches each waypoint the gimbal is triggered to enter the shooting program, and after entering the shooting program the shooting device is triggered at fixed time intervals at each shooting point. For example, the distance between adjacent shooting sequences (i.e. between adjacent waypoints) is a third fixed distance, and the time required for the shooting device to complete the shooting of adjacent shooting points in the same shooting sequence is a third fixed duration; both can be set as needed, for example a third fixed distance of 10 meters and a third fixed duration of 2 seconds.
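The mixed mode can be sketched as distance-based sequence triggers plus time-based shot triggers within each sequence; the route length and shot count are assumed values, and the helpers are illustrative only:

```python
def sequence_trigger_distances(route_length_m: float,
                               third_fixed_distance_m: float = 10.0):
    """Distances along the route at which the gimbal enters the
    shooting program (one trigger per waypoint/sequence)."""
    out, d = [], 0.0
    while d <= route_length_m + 1e-9:
        out.append(d)
        d += third_fixed_distance_m
    return out

def shot_times(trigger_time_s: float, n_shots: int,
               third_fixed_duration_s: float = 2.0):
    """Within one sequence, shots fire at fixed time steps after the trigger."""
    return [trigger_time_s + j * third_fixed_duration_s for j in range(n_shots)]

triggers = sequence_trigger_distances(45.0)   # waypoints every 10 m along 45 m
times = shot_times(0.0, 5)                    # 5 shots, 2 s apart, after one trigger
```

Sequence starts are thus tied to position (robust to wind-induced speed changes), while the gimbal attitude switching within a sequence runs on a simple timer.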
Optionally, the initial shooting point (i.e. the first shooting point of the first shooting sequence) is the starting flight position of the unmanned aerial vehicle when it flies along the flight route; alternatively, the initial shooting point is the starting waypoint of the flight route. The starting flight position and the starting waypoint may be the same position or different positions, and one of the above embodiments may be selected as needed to determine the initial shooting point. It is to be understood that the manner of determining the initial shooting point is not limited to those listed above; other manners may be used.
Except where otherwise stated, the execution subject of the shooting control method of the above embodiments may be a control device of the unmanned aerial vehicle, i.e. a device capable of controlling the unmanned aerial vehicle such as a remote controller, a mobile phone, a computer or a smart wearable device; it may also be the unmanned aerial vehicle itself, for example its flight controller, another controller provided in the unmanned aerial vehicle, or a combination of the two; or it may be a combination of the control device and the unmanned aerial vehicle. For example, the acquisition of the first and second position information and the planning of the flight route may be executed by the control device, while the determination of the effective shooting areas and of the shooting sequences is executed by the unmanned aerial vehicle; or the acquisition of the first and second position information may be executed by the control device, while the planning of the flight route and the determination of the effective shooting areas and shooting sequences are executed by the unmanned aerial vehicle; or the acquisition of the first position information may be executed by the control device, while the determination of the second position information, the planning of the flight route and the determination of the effective shooting areas and shooting sequences are executed by the unmanned aerial vehicle; or all of the above may be executed by the unmanned aerial vehicle. Of course, the execution subject is not limited to the control device of the unmanned aerial vehicle and/or the unmanned aerial vehicle, and may be other electronic equipment independent of them, such as a control device of the gimbal or of the shooting device.
For example, in some embodiments the execution subject of the method of the above embodiments is a control device of the unmanned aerial vehicle. When acquiring the second position information of the extended shooting area, the control device specifically determines the second position information from the first position information. Optionally, the planning of the flight route is performed in the control device; exemplarily, the flight route of the unmanned aerial vehicle is planned according to the second position information, for which reference may be made to the description of the corresponding part of the above embodiments, not repeated here. Further, the shooting control method further includes: sending the flight route to the unmanned aerial vehicle and, after determining the shooting sequence corresponding to each waypoint on the flight route according to the third position information and the preset flight route of the unmanned aerial vehicle, sending the shooting sequence corresponding to each waypoint to the unmanned aerial vehicle, so that the unmanned aerial vehicle controls the shooting device mounted on it to shoot based on the flight route and the shooting sequences. In this way, before the unmanned aerial vehicle performs oblique photography, the control device sends it the flight route, and the unmanned aerial vehicle performs oblique photography while executing that route. It will be appreciated that the flight route planning process may also be performed in the unmanned aerial vehicle.
In some embodiments, the executing subject of the shooting control method in the above embodiments is an unmanned aerial vehicle, for example, the unmanned aerial vehicle may plan the shooting sequence before performing oblique photography, and the shooting points in each shooting direction in the planned shooting sequence are all located in the effective shooting area in the shooting direction; illustratively, each shooting sequence comprises shooting points in all shooting directions, and in the process of oblique shooting by the unmanned aerial vehicle, the shooting points in the invalid shooting area in the current shooting sequence are removed according to the real-time position information of the unmanned aerial vehicle and the shooting points of the currently executed shooting sequence.
In the following, the shooting control method is further described by taking an execution subject of the shooting control method as an unmanned aerial vehicle as an example.
The first position information can be sent by the control device of the unmanned aerial vehicle; exemplarily, the user inputs the first position information through the control device, which then sends it to the unmanned aerial vehicle. The user may input the first position information into the control device by manual dotting or through an external file, as described in the corresponding part of the above embodiments and not repeated here. It can be understood that, if the unmanned aerial vehicle is provided with an input module, the first position information may also be input directly to the unmanned aerial vehicle by the user operating that module.
Different strategies can be adopted to obtain the second position information. For example, in some embodiments the second position information is sent by the control device of the unmanned aerial vehicle; optionally, the control device determines the second position information of the extended shooting area according to the first position information and then sends it to the unmanned aerial vehicle. In other embodiments the second position information is determined by the unmanned aerial vehicle itself, specifically by determining the second position information of the extended shooting area according to the first position information. For the implementation, reference may be made to the description of the corresponding parts of the above embodiments, not repeated here.
The planning of the flight route may be performed at the control device of the unmanned aerial vehicle or at the unmanned aerial vehicle itself. For example, in some embodiments the flight route is planned by the control device based on the second position information and then sent to the unmanned aerial vehicle; in other embodiments, the flight route is planned by the unmanned aerial vehicle based on the second position information. For the planning of the flight route, refer to the description of the corresponding parts in the above embodiments; details are not repeated here.
The shooting control method of the embodiment of the application may further include: controlling the shooting device carried on the unmanned aerial vehicle to shoot based on the flight route and the shooting sequence corresponding to each waypoint.
Next, the process in which the unmanned aerial vehicle controls the shooting device mounted on it to shoot based on the flight route and the shooting sequences is described in detail.
Fig. 7 is a schematic diagram of an implementation of controlling the shooting device mounted on the unmanned aerial vehicle to shoot based on the flight route and the shooting sequences in an embodiment of the present application. Referring to fig. 7, the process of controlling the shooting device mounted on the unmanned aerial vehicle to shoot based on the flight route and the shooting sequences may include steps S701 to S703.
In S701, the unmanned aerial vehicle is controlled to fly according to the flight route;
in S702, according to the shooting sequence, while the unmanned aerial vehicle flies from the current shooting point to the next shooting point, the gimbal on the unmanned aerial vehicle is controlled to switch its attitude, so that the shooting device on the gimbal is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point;
in S703, the image captured by the shooting device at each shooting point is acquired.
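The steps above can be sketched as a minimal control loop; `fly_to`, `set_gimbal_attitude`, and `capture` are hypothetical callables standing in for the flight-controller, gimbal, and camera interfaces:

```python
# Sketch of steps S701-S703: fly the route, switch the gimbal attitude while
# en route to each shooting point, and capture once the point is reached.
# The hardware interfaces are hypothetical stand-ins injected as callables;
# set_gimbal_attitude is assumed non-blocking, so the gimbal rotates while
# the vehicle is still flying and no hovering is needed.

def execute_route(shooting_points, fly_to, set_gimbal_attitude, capture):
    """shooting_points: list of (position, gimbal_attitude) pairs.

    Returns the list of images captured at the shooting points."""
    images = []
    for position, attitude in shooting_points:
        set_gimbal_attitude(attitude)  # S702: rotate gimbal en route
        fly_to(position)               # S701: keep following the route
        images.append(capture())       # S703: capture at the shooting point
    return images
```

In a real system the gimbal command and the flight command would run concurrently; the sequential calls here only illustrate the ordering of S701 to S703.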
According to the embodiment of the application, while the unmanned aerial vehicle flies from the current shooting point to the next, the gimbal carrying the shooting device is controlled to switch attitudes so that the shooting device is in the corresponding shooting direction when each shooting point is reached. The unmanned aerial vehicle therefore does not need to stop flying during shooting, which improves shooting efficiency and is particularly suitable for surveying and mapping. In addition, the embodiment of the application completes the images of multiple shooting directions asynchronously by controlling a single shooting device through the gimbal; compared with a traditional multi-lens camera rig, the carried weight is greatly reduced, so that a smaller and lighter unmanned aerial vehicle can be selected to carry the shooting device, reducing the use cost.
In the embodiment of the application, shooting does not affect the flight of the unmanned aerial vehicle; that is, while the shooting device shoots, the unmanned aerial vehicle continues to execute the flight route without hovering for the shooting action, which further improves shooting efficiency.
The flight route of the embodiment of the application may include a plurality of waypoints. The flight route may be preset by the user; optionally, the user inputs the position information of each waypoint through the control device of the unmanned aerial vehicle, and the unmanned aerial vehicle connects the waypoints in the input order to form the flight route. When the user wants to update the positions of some waypoints in the set flight route, the position information of those waypoints can be modified by operating the control device. This modification may be performed before the unmanned aerial vehicle takes off or during its flight. It can be understood that the flight route may also be a default flight route.
The positional relationship between the waypoints and the shooting points can be chosen as needed. For example, in some embodiments the shooting points are arranged between adjacent waypoints; since the flight time between waypoints is longer than the time the shooting device needs to capture an image, images of multiple shooting directions can be inserted between waypoints, which yields higher shooting efficiency. In other embodiments, some of the waypoints serve as shooting points, and shooting points may or may not additionally be arranged between adjacent waypoints; in still other embodiments, all of the waypoints serve as shooting points, again with or without additional shooting points between adjacent waypoints. It can be understood that each shooting point corresponds to one shot, one shooting direction, and one preset target attitude of the gimbal, from which one captured image is obtained.
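As one illustrative arrangement (the even spacing is an assumption; the embodiment only requires that the shooting points lie between adjacent waypoints), the shooting points of one sequence can be placed along the leg between two waypoints:

```python
def interleave_shooting_points(wp_a, wp_b, n_points):
    """Place n_points shooting points evenly along the segment between two
    adjacent waypoints (one point per shooting direction), exploiting the
    fact that the leg flight time exceeds the per-image capture time.
    Waypoints are (x, y) tuples; even spacing is an illustrative choice."""
    ax, ay = wp_a
    bx, by = wp_b
    # Fractions (i+1)/(n+1) keep the points strictly between the waypoints.
    return [
        (ax + (bx - ax) * (i + 1) / (n_points + 1),
         ay + (by - ay) * (i + 1) / (n_points + 1))
        for i in range(n_points)
    ]
```

For a five-direction sequence, `n_points=5` yields five capture positions strictly inside the leg, one per shooting direction.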
In the related art, whether for a fixed-wing or a rotor unmanned aerial vehicle, when shooting at multiple angles is realized with a single shooting device, the flight route is generally designed as multiple routes for reasons of speed or efficiency control, for example five routes with one shooting direction each, collecting the forward, backward, left, right, and ortho images of the area to be shot respectively. The unmanned aerial vehicle then has to fly along five routes, which is unfavorable to its endurance. In contrast, the embodiment of the application designs a single flight route, which may be, for example, the routes shown in fig. 4A and fig. 4B, or others; during a single flight, the gimbal on the unmanned aerial vehicle is controlled to switch attitudes to realize shooting at multiple angles, so repeated back-and-forth flights along multiple routes are unnecessary, which not only improves shooting efficiency but also reduces the energy consumption of the unmanned aerial vehicle.
The process of controlling the unmanned aerial vehicle to fly according to the flight route may include: controlling the real-time height between the lens of the shooting device and the area to be shot within a preset height range. When the unmanned aerial vehicle is used for surveying and mapping, terrain undulation makes the ground sample distance (GSD) non-uniform, so the uniformity of the GSD is maintained by controlling the height between the lens and the ground: when the terrain rises, the unmanned aerial vehicle ascends; when the terrain falls, it descends, keeping the GSD approximately equal throughout the survey. Since the climb and descent of a fixed-wing unmanned aerial vehicle are limited, it can only be controlled to ascend or descend within its climb or descent range, keeping the GSD as consistent as possible.
According to the embodiment of the application, the unmanned aerial vehicle is not required to hover at the shooting points; instead, the gimbal must have finished the previous shooting sequence whenever a new shooting sequence is triggered, which requires the flight speed to stay within the maximum flight speed allowed by the unmanned aerial vehicle. The maximum flight speed can be calculated from the rotation performance of the gimbal. Optionally, the maximum flight speed allowed by the unmanned aerial vehicle is determined based on the course spacing of each shooting direction and the time the shooting device needs to complete one shooting sequence and return to the initial shooting direction, where the course spacings of the shooting directions are equal in size and are determined by the preset ground sample distance, the preset course overlap rate, and the number of pixels of the shooting device parallel to the flight direction of the unmanned aerial vehicle (i.e., the number of pixels of the image sensor of the shooting device parallel to the flight direction).
Illustratively, the maximum flight speed V_max is calculated as follows:

V_max = D_2 / T_Gim    (4)

In formula (4), D_2 is the course spacing of each shooting direction, and T_Gim is the time the shooting device needs to complete the shooting of one shooting sequence and return to the initial shooting direction. It can be understood that the calculation of V_max is not limited to formula (4) and may be otherwise.
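As a sanity check, formula (4) can be evaluated directly; the T_Gim value below is an assumed example, not a figure from the embodiment:

```python
def max_flight_speed(course_spacing_m, t_gimbal_s):
    """Formula (4): the vehicle must not cover more than one course spacing
    D_2 in the time T_Gim the gimbal needs to finish one shooting sequence
    and return to the initial shooting direction."""
    return course_spacing_m / t_gimbal_s

# Example with an assumed T_Gim of 4.5 s and the 40.95 m spacing that
# formula (5) yields for GSD = 2.5 cm, 70% overlap, 5460 pixels.
v_max = max_flight_speed(40.95, 4.5)  # 9.1 m/s
```

Any cruise speed at or below this value guarantees the gimbal has returned to the initial shooting direction before the next sequence is triggered.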
Illustratively, the relative positional relationship between two shooting sequences in each shooting direction is shown in fig. 8: F1 and F2 are the effective shooting areas of the forward shooting direction, D1 and D2 are the effective shooting areas of the ortho shooting direction, B1 and B2 are the effective shooting areas of the backward shooting direction, R1 and R2 are the effective shooting areas of the right shooting direction, and L1 and L2 are the effective shooting areas of the left shooting direction. D_F, D_D, D_B, D_R, and D_L are the course spacings of the forward, ortho, backward, right, and left shooting directions respectively. In the example of the application, D_2 = D_F = D_D = D_B = D_R = D_L.
Illustratively, the course spacing D_F of the forward direction is calculated as follows:

D_F = GSD * (1 - γ_course) * n_V    (5)

In formula (5), γ_course is the preset course overlap rate, and n_V is the number of pixels of the shooting device parallel to the flight direction of the unmanned aerial vehicle. It should be appreciated that the calculation of D_F is not limited to formula (5) and may be otherwise.
The course overlap rate may be a default value or may be set by the user. Illustratively, the course overlap rate is input by the user through the control device of the unmanned aerial vehicle; this way of determining the course overlap rate can meet different user requirements and is highly flexible. To ensure that the images of each shooting direction meet the modeling requirement, the course overlap rate is optionally greater than or equal to 65% and less than or equal to 80%; illustratively, it is 65%, 70%, 75%, 80%, or another value in between.
In the following, the influence of the system error on the course overlap rate is analyzed by taking the forward image as an example.
Assume the ground sample distance GSD is 2.5 cm, the course overlap rate γ_course is 70%, the number of pixels n_V of the shooting device parallel to the flight direction is 5460, and the flight speed is 10 m/s. According to formula (5), the theoretical spacing between two successive forward images is:

2.5 cm * (1 - 70%) * 5460 = 40.95 m

Due to factors such as fluctuation of the gimbal rotation speed and fluctuation of the system delay, suppose the error between the actual and theoretical shooting time of the second forward image is 0.5 s (delay); the actual spacing between the two forward images is then:

40.95 m + 10 m/s * 0.5 s = 45.95 m

Substituting D_F = 45.95 m, GSD = 2.5 cm, and n_V = 5460 into formula (5) shows that the actual course overlap rate of the forward images is 66%, which still meets the modeling requirement.
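The arithmetic of this worked example can be reproduced with a short sketch (function and variable names are illustrative):

```python
def course_spacing(gsd_m, overlap, n_pixels):
    """Formula (5): spacing between successive images in one direction."""
    return gsd_m * (1 - overlap) * n_pixels

def actual_overlap(spacing_m, gsd_m, n_pixels):
    """Invert formula (5) to recover the overlap from an actual spacing."""
    return 1 - spacing_m / (gsd_m * n_pixels)

GSD = 0.025   # 2.5 cm ground sample distance, in metres
N_V = 5460    # pixels parallel to the flight direction
d_theory = course_spacing(GSD, 0.70, N_V)     # 40.95 m
d_actual = d_theory + 10 * 0.5                # 0.5 s delay at 10 m/s
overlap = actual_overlap(d_actual, GSD, N_V)  # ~0.66, still >= 0.65
```

The recovered overlap of about 66% matches the text and stays above the 65% lower bound for modeling.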
Because the side length, parallel to the course, of the effective shooting areas of the forward and backward shooting directions is longer than that of the ortho, right, and left shooting directions, the error between the actual and theoretical shooting time has a smaller influence on the course overlap rate of the forward and backward images. By controlling parameters such as the flight speed, the flight direction, and the rotation speed of the gimbal (which requires system optimization), or by increasing the course overlap rate (which affects overall operation efficiency), the course overlap rate of the images of all shooting directions can be guaranteed to meet the modeling requirement.
The initial shooting direction may be the shooting direction corresponding to one of the shooting points of the shooting sequence; illustratively, it is the shooting direction of the first shooting point of the shooting sequence. Optionally, the shooting direction of the first shooting point of each shooting sequence is the same, for example the ortho shooting direction; optionally, the shooting directions of the first shooting points of the shooting sequences are at least partially different, for example the first shooting point of shooting sequence 1 is in the left shooting direction, that of shooting sequence 2 in the right shooting direction, and that of shooting sequence 3 in the left shooting direction. T_Gim may be the time the shooting device needs to complete the current shooting sequence and return to the initial shooting direction of the next shooting sequence, which is suitable both when the first shooting points of all sequences share the same shooting direction and when they are at least partially different; of course, T_Gim may also be the time the shooting device needs to complete the current shooting sequence and return to the initial shooting direction of the current sequence, which is suitable when the first shooting point of every shooting sequence has the same shooting direction.
Controlling the gimbal on the unmanned aerial vehicle to switch attitudes so that the shooting device on the gimbal is in the corresponding shooting direction at each shooting point may optionally include: acquiring the real-time attitude of the unmanned aerial vehicle; determining the deviation between the real-time attitude of the unmanned aerial vehicle and the shooting direction of the next shooting point; and controlling the gimbal to switch attitudes according to the deviation, so that the shooting device on the gimbal is in the corresponding shooting direction at each shooting point. The shooting device of the embodiment of the application is carried on the body of the unmanned aerial vehicle through the gimbal; when the attitude of the body changes significantly, the attitude of the gimbal can be controlled so that, for shots of the same direction at different waypoints (shooting sequences), the gimbal keeps a consistent (or nearly consistent) angle to the ground, ensuring that the overlap rate of the photo sequence of the same direction remains uniform.
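A simplified single-axis sketch of this deviation compensation (hypothetical interface; a real controller would act on all axes using a proper rotation representation):

```python
def gimbal_pitch_command(target_ground_pitch_deg, body_pitch_deg):
    """Compensate the vehicle body's real-time pitch so the camera keeps a
    consistent angle to the ground at same-direction shooting points of
    different sequences (single-axis simplification; angles in degrees)."""
    return target_ground_pitch_deg - body_pitch_deg

# If the body pitches 5 deg nose-up while the forward direction requires
# -60 deg to the ground, the gimbal must command -65 deg relative to the body.
cmd = gimbal_pitch_command(-60.0, 5.0)  # -65.0
```

The same subtraction, applied per axis, keeps the photo-sequence overlap of each direction uniform despite body attitude changes.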
The unmanned aerial vehicle and the gimbal can cooperate in different ways to complete the shooting of multiple shooting directions. Illustratively, controlling the gimbal to switch attitudes while the unmanned aerial vehicle flies from the current shooting point to the next according to the shooting sequence may include: sending a shooting trigger signal to the gimbal according to the shooting sequence, so that the gimbal switches attitudes while the unmanned aerial vehicle flies from the current shooting point to the next, and the shooting device on the gimbal is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point. The shooting trigger signal is also used to instruct the gimbal to trigger the shooting device when the shooting device is in the corresponding shooting direction. In this embodiment, the unmanned aerial vehicle triggers the gimbal to enter the procedure of executing the shooting sequence, which includes the two steps of attitude switching and shooting triggering; both are completed by the gimbal, reducing the influence of the delay of the unmanned aerial vehicle's trigger-signal processing on operation efficiency. The attitude switching ensures that the shooting device on the gimbal is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point.
Of course, the unmanned aerial vehicle may also directly control the gimbal to switch attitudes and/or directly trigger the shooting device to shoot.
In the embodiment of the application, the shooting process completed by the gimbal and the shooting device may include: the gimbal controls the shooting device to complete the shooting of each shooting sequence. Specifically, the gimbal switches attitudes according to the shooting sequence so that the shooting device on the gimbal is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point of each shooting sequence, and the gimbal triggers the shooting device to shoot when the shooting device is in the corresponding shooting direction.
The shooting trigger signal may be a timed shooting trigger signal or a fixed-distance shooting trigger signal; that is, the unmanned aerial vehicle may trigger the gimbal and the shooting device to complete the shooting process in a timed-shooting or fixed-distance-shooting trigger mode.
For example, in some embodiments the shooting trigger signal is a timed shooting trigger signal used to instruct the gimbal to trigger the shooting device based on a first timing strategy. The first timing strategy may include: the time the shooting device needs to complete each shooting sequence is a first fixed duration, so that this time remains stable. Optionally, the timed shooting trigger signal is sent to the gimbal only once, for example before the gimbal controls the shooting device to shoot the first shooting point of the first shooting sequence; after receiving the signal, the gimbal rotates in turn to each shooting direction of the shooting sequences and triggers the shooting device at the timed moments. In this mode the unmanned aerial vehicle only needs to send the timed shooting trigger signal once, so its control is relatively simple. It can be understood that the timed shooting trigger signal may also be sent to the gimbal multiple times, for example once before the gimbal controls the shooting device to shoot the first shooting point of every shooting sequence; after receiving each signal, the gimbal rotates in turn to each shooting direction of the corresponding sequence and triggers the shooting device at the timed moments to complete the shooting of each direction in that sequence. It will be appreciated that the first timing strategy may also be otherwise.
Further, the timed shooting trigger signal may be used to instruct the gimbal to trigger the shooting device based on a second timing strategy. The second timing strategy includes: the time the shooting device needs to complete adjacent shooting points of the same shooting sequence is a second fixed duration, so that this time remains stable. It will be appreciated that the second timing strategy may also be otherwise.
In the timed-shooting trigger mode, different strategies can be adopted to send the shooting sequences to the gimbal. Illustratively, there are multiple shooting sequences. In some embodiments, all shooting sequences are sent to the gimbal at once before the gimbal controls the shooting device to shoot the first shooting point of the first shooting sequence; during the subsequent cooperation of the gimbal and the shooting device, the unmanned aerial vehicle does not need to send the sequences again. In some other embodiments, after the shooting device completes the current shooting sequence, the next shooting sequence is sent to the gimbal; that is, after the gimbal controls the shooting device to complete each shooting sequence, the next sequence is sent to the gimbal to instruct it to execute the shooting of that next sequence.
In some embodiments, the shooting trigger signal is a fixed-distance shooting trigger signal used to instruct the gimbal to trigger the shooting device based on a first distance strategy. The first distance strategy includes: the distance between adjacent shooting sequences is a first fixed distance, so that this distance remains stable. Optionally, the fixed-distance trigger signal is sent to the gimbal multiple times, for example once before the gimbal controls the shooting device to shoot the first shooting point of each shooting sequence; each time the gimbal receives the signal, it first switches attitudes so that the shooting device is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point of the corresponding sequence, and then triggers the shooting device when it is in that direction. Since there is a delay when the unmanned aerial vehicle triggers the gimbal, triggering the gimbal with the first distance strategy reduces the influence of this trigger delay on operation efficiency.
Further, optionally, the fixed-distance shooting trigger signal may also be used to instruct the gimbal to control the shooting device based on a second distance strategy. The second distance strategy includes: the spacing between adjacent shooting points of the same shooting sequence is a second fixed spacing, so that this spacing remains stable. In addition, since the shooting directions in each shooting sequence are not necessarily identical, a fixed-distance trigger signal may be sent to the gimbal before the shooting device completes the shot at each shooting point, to trigger the gimbal to complete that shot. Optionally, the unmanned aerial vehicle uses the fixed-distance trigger mode to start the shooting process of the gimbal and the shooting device, within which the gimbal completes each shooting sequence on a timed basis: when the unmanned aerial vehicle reaches each waypoint, the gimbal is triggered to enter the shooting program, and after entering it the gimbal triggers the shooting device at timed moments at each shooting point. The distance between adjacent shooting sequences (i.e., between adjacent waypoints) is the first fixed distance, and the time the shooting device needs between adjacent shooting points of the same sequence is a third fixed duration, whose size can be set as required; illustratively, the third fixed duration is 2 seconds.
In the fixed-distance trigger mode, different strategies can be adopted to send the shooting sequences to the gimbal. For example, in some embodiments the shooting sequence is sent to the gimbal before the gimbal controls the shooting device to shoot the first shooting point of each sequence; that is, after each shooting sequence is completed and before the next one is executed, the next sequence is sent to the gimbal to instruct it to execute that sequence. In other embodiments, before the shooting device completes the shot at each shooting point, an indication signal is sent to the gimbal indicating the target attitude of the gimbal corresponding to that shooting point, or the shooting direction of the shooting device; that is, the gimbal is triggered at each shooting point to perform the shot of that point.
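A minimal sketch of the first distance strategy, assuming the distance flown along the route is available as a scalar:

```python
class DistanceTrigger:
    """Fixed-distance triggering (first distance strategy): start a new
    shooting sequence each time the vehicle has advanced one fixed spacing
    along the route since the previous trigger."""

    def __init__(self, spacing_m):
        self.spacing_m = spacing_m
        self.last_trigger_pos = 0.0  # along-route distance at the last trigger

    def update(self, distance_flown_m):
        """Return True when a new shooting sequence should be triggered."""
        if distance_flown_m - self.last_trigger_pos >= self.spacing_m:
            # Advance by exactly one spacing so the inter-sequence distance
            # stays stable even if the position sample overshoots slightly.
            self.last_trigger_pos += self.spacing_m
            return True
        return False
```

On each trigger, the gimbal would then run through the sequence's shooting points on a timed basis, as described above.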
In addition, in the embodiment of the application, the control mode of gimbal attitude switching can be selected according to the type of the gimbal. Taking a three-axis gimbal as an example, the gimbal is configured to move around a yaw axis, a roll axis, and a pitch axis. Optionally, attitude switching can be achieved by controlling one or more of the roll-axis, pitch-axis, and yaw-axis attitudes of the gimbal. Generally, the yaw axis of the gimbal cannot rotate a full circle, so the shooting direction at the different shooting points of each shooting sequence cannot be controlled by the yaw-axis attitude alone; in the embodiment of the application, any two of the yaw, roll, and pitch attitudes of the gimbal are controlled to switch its attitude.
Illustratively, the yaw attitude and the pitch attitude are controlled to switch the gimbal attitude. Optionally, the target attitude of the gimbal is characterized as (pitch attitude, roll attitude, yaw attitude), and the gimbal target attitudes corresponding to the forward, ortho, backward, right, and left shooting directions are respectively: (-60°, 0°, 0°), (-90°, 0°, 0°), (-120°, 0°, 0°), (-60°, 0°, 90°), and (-120°, 0°, 90°).
Illustratively, the yaw attitude and the roll attitude are controlled to switch the gimbal attitude. Optionally, the target attitude of the gimbal is characterized as (pitch attitude, roll attitude, yaw attitude), and the gimbal target attitudes corresponding to the forward, ortho, backward, right, and left shooting directions are respectively: (-90°, 30°, 0°), (-90°, 0°, 0°), (-90°, -30°, 0°), (-90°, -30°, 90°), and (-90°, 30°, 90°).
It can be understood that only two sets of gimbal target attitudes for the forward, ortho, backward, right, and left shooting directions are listed above; similar shooting in these directions can also be realized by adjusting the rotation angle of each axis and the order of forward and reverse rotation of each axis, although the operation efficiency differs.
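For illustration, the two parameterizations above can be held in simple lookup tables; the direction keys are hypothetical names, and the tuples mirror the (pitch, roll, yaw) examples of this embodiment:

```python
# Gimbal target attitudes as (pitch, roll, yaw) in degrees, keyed by
# shooting direction. Each parameterization moves only two of the three
# axes, since the yaw axis typically cannot rotate a full circle.

PITCH_YAW_ATTITUDES = {
    "forward":  (-60.0, 0.0, 0.0),
    "ortho":    (-90.0, 0.0, 0.0),
    "backward": (-120.0, 0.0, 0.0),
    "right":    (-60.0, 0.0, 90.0),
    "left":     (-120.0, 0.0, 90.0),
}

ROLL_YAW_ATTITUDES = {
    "forward":  (-90.0, 30.0, 0.0),
    "ortho":    (-90.0, 0.0, 0.0),
    "backward": (-90.0, -30.0, 0.0),
    "right":    (-90.0, -30.0, 90.0),
    "left":     (-90.0, 30.0, 90.0),
}
```

A sequence executor would simply look up the next direction's tuple and command the gimbal to that target attitude.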
For example, referring to fig. 9, when the unmanned aerial vehicle flies along the flight route 80 and reaches the first shooting point of a shooting sequence (as shown in fig. 9, the shooting direction of the first shooting point of the sequence is the forward shooting direction), the gimbal is triggered to execute the shooting of the sequence. For convenience of description, this sequence is referred to as shooting sequence A, and its shooting directions include the forward, ortho, backward, right, and left shooting directions. The gimbal executes the shooting of sequence A as follows:
(1) after detecting that the shooting device has completed the shot of the last shooting point of the previous shooting sequence, the gimbal rotates to the first gimbal target attitude, corresponding to the forward shooting direction of sequence A, and then triggers the shooting device to obtain the forward image;
(2) after the shooting device completes the forward image of sequence A, the gimbal continues rotating to the second gimbal target attitude, corresponding to the ortho shooting direction of sequence A, and then triggers the shooting device to obtain the ortho image;
(3) after the shooting device completes the ortho image of sequence A, the gimbal continues rotating to the third gimbal target attitude, corresponding to the backward shooting direction of sequence A, and then triggers the shooting device to obtain the backward image;
(4) after the shooting device completes the backward image of sequence A, the gimbal continues rotating to the fourth gimbal target attitude, corresponding to the right shooting direction of sequence A, and then triggers the shooting device to obtain the right image;
(5) after the shooting device completes the right image of sequence A, the gimbal continues rotating to the fifth gimbal target attitude, corresponding to the left shooting direction of sequence A, and then triggers the shooting device to obtain the left image.
At this point, the gimbal has completed the shooting of sequence A.
It can be understood that the first target attitude, the second target attitude, the third target attitude, the fourth target attitude, and the fifth target attitude are all preset target attitudes. Optionally, the first to fifth target attitudes are respectively: (-60°, 0°, 0°), (-90°, 0°, 0°), (-120°, 0°, 0°), (-60°, 0°, 90°), and (-120°, 0°, 90°); or the first to fifth target attitudes are respectively: (-90°, 30°, 0°), (-90°, 0°, 0°), (-90°, -30°, 0°), (-90°, -30°, 90°), and (-90°, 30°, 90°).
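The five-step execution of shooting sequence A described above can be sketched as a simple loop over (direction, target attitude) pairs. The gimbal interface below is a hypothetical stand-in for illustration only, not a real flight-control API; the attitudes are the first optional set given in the text.

```python
# Hedged sketch of how a gimbal might execute shooting sequence A.
# The StubGimbal class is a hypothetical stand-in, not a real DJI API.

# Preset target attitudes as (pitch, roll, yaw) tuples in degrees, one per
# shooting direction, following the first optional attitude set in the text.
SEQUENCE_A = [
    ("forward",  (-60, 0, 0)),
    ("ortho",    (-90, 0, 0)),
    ("backward", (-120, 0, 0)),
    ("right",    (-60, 0, 90)),
    ("left",     (-120, 0, 90)),
]

class StubGimbal:
    """Minimal stand-in that records attitude commands and shot triggers."""
    def __init__(self):
        self.attitude = None
        self.shots = []          # (direction, attitude) pairs, in trigger order
    def rotate_to(self, attitude):
        self.attitude = attitude # a real gimbal would move and settle here
    def trigger_shot(self, direction):
        self.shots.append((direction, self.attitude))

def execute_sequence(gimbal, sequence):
    """Rotate to each preset target attitude in order, then trigger one shot."""
    for direction, attitude in sequence:
        gimbal.rotate_to(attitude)
        gimbal.trigger_shot(direction)
    return gimbal.shots

shots = execute_sequence(StubGimbal(), SEQUENCE_A)
print([d for d, _ in shots])  # forward, ortho, backward, right, left
```

The ordering matches steps (1) through (5): each shot is only triggered after the gimbal has been commanded to the corresponding target attitude.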
The gimbal triggers the shooting device to shoot when the gimbal has reached the preset target attitude and is in a stable state. The gimbal being in a stable state can include: the fluctuation range of the angle of the gimbal relative to a preset direction (such as the angle of the gimbal relative to the ground) is within a preset angle range, that is, the angle of the gimbal relative to the preset direction fluctuates only slightly.
In addition, before the gimbal triggers the shooting device to shoot, the shooting device needs to be in a shooting-executable state. Optionally, the shooting device being in a shooting-executable state includes: the control part of the shooting device is in a triggerable state; illustratively, the shooting device is a camera and the control part is the shutter of the camera. Optionally, the shooting device being in a shooting-executable state includes: the free buffer of the camera is larger than a preset capacity, i.e. the buffer of the camera is large enough to store at least one more image.
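The two pre-shot conditions above (gimbal stability and camera readiness) can be sketched as simple predicates. The thresholds and field names here are illustrative assumptions, not values from this application.

```python
# Hedged sketch of the two pre-shot checks described above: gimbal stability
# (angle fluctuation within a preset range) and camera readiness (shutter
# triggerable, enough free buffer for at least one image). Thresholds are
# assumptions chosen for illustration.

def gimbal_is_stable(recent_angles, max_fluctuation_deg=0.5):
    """Stable if recent angle samples (e.g. gimbal angle relative to the
    ground, in degrees) fluctuate within the preset angle range."""
    return (max(recent_angles) - min(recent_angles)) <= max_fluctuation_deg

def camera_can_shoot(shutter_triggerable, free_buffer_bytes, image_size_bytes):
    """Shooting-executable state: the shutter is triggerable and the free
    buffer can hold at least one more image."""
    return shutter_triggerable and free_buffer_bytes >= image_size_bytes

# A shot would only be triggered once both checks pass:
ready = gimbal_is_stable([-89.9, -90.1, -90.0]) and \
        camera_can_shoot(True, 8_000_000, 5_000_000)
print(ready)
```

In practice the stability window and buffer margin would come from the gimbal and camera specifications rather than being hard-coded.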
Optionally, each shooting sequence corresponds to one image queue, and the image of each shooting point in that shooting sequence can be stored in the corresponding image queue. Alternatively, images taken in the same shooting direction are stored in the same image queue and images taken in different shooting directions are stored in different image queues. The storage mode of the images can be selected as needed.
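The two optional storage layouts can be sketched with standard queues; the keys and image records below are placeholders, not a real camera payload format.

```python
# Hedged sketch of the two optional image storage layouts described above:
# one queue per shooting sequence, or one queue per shooting direction.
from collections import defaultdict, deque

by_sequence = defaultdict(deque)   # key: shooting sequence id
by_direction = defaultdict(deque)  # key: shooting direction

def store(sequence_id, direction, image):
    by_sequence[sequence_id].append(image)   # layout 1: per-sequence queue
    by_direction[direction].append(image)    # layout 2: per-direction queue

store("A", "forward", "img_0001")
store("A", "ortho", "img_0002")
store("B", "forward", "img_0003")
print(len(by_sequence["A"]), len(by_direction["forward"]))
```

A per-direction layout groups all forward (or backward, left, right, orthographic) images together, which is convenient for later oblique-photogrammetry reconstruction; a per-sequence layout preserves the capture order at each waypoint.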
In addition, it should be noted that the shooting control method in the embodiment of the present application does not need to rely on a dual-axis or tri-axis gimbal; the unmanned aerial vehicle can also complete shooting in multiple shooting directions in cooperation with a single-axis gimbal. For example, shooting in three shooting directions can be achieved by a variable-pitch gimbal combined with crossed course planning; shooting in three shooting directions can be achieved by a variable-roll gimbal combined with crossed course planning; and shooting in three or four shooting directions can be achieved by a variable-yaw gimbal combined with a flight course such as that shown in fig. 4A or 4B.
In addition, the embodiment of the present application can also provide a shooting control method whose execution body is the unmanned aerial vehicle, for example the flight controller of the unmanned aerial vehicle, another controller arranged on the unmanned aerial vehicle, or a combination of the flight controller and such other controllers.
Referring to fig. 10, the shooting control method according to the embodiment of the present application may include steps S1001 to S1002:
in S1001, receiving a flight path sent by a control device of an unmanned aerial vehicle and a shooting sequence corresponding to each waypoint on the flight path, where each shooting sequence includes one or more continuous shooting points, the shooting directions of the one or more shooting points of each shooting sequence are different, and the shooting points in each shooting direction are located in an effective shooting area in the shooting direction, the effective shooting area is determined according to first position information of the area to be shot and second position information of an externally-extended shooting area, the externally-extended shooting area is obtained by expanding the area to be shot, and the second position information is determined according to the first position information;
in S1002, the imaging device mounted on the unmanned aerial vehicle is controlled to perform imaging based on the flight line and the imaging sequence corresponding to each waypoint.
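Steps S1001 and S1002 on the drone side can be sketched as a receive-then-execute flow. Every interface below (message layout, `fly_to`, `shoot`) is a hypothetical stand-in for illustration, not part of the application.

```python
# Hedged sketch of steps S1001-S1002: receive the flight route and the
# per-waypoint shooting sequences from the control device, then fly the
# route and shoot at each waypoint. All interfaces are assumed stand-ins.

def s1001_receive(message):
    """S1001: unpack the flight route and each waypoint's shooting sequence."""
    return message["flight_route"], message["sequences_by_waypoint"]

def s1002_execute(route, sequences_by_waypoint, fly_to, shoot):
    """S1002: visit each waypoint in order and run its shooting sequence."""
    log = []
    for waypoint in route:
        fly_to(waypoint)
        for direction in sequences_by_waypoint.get(waypoint, []):
            log.append((waypoint, shoot(direction)))
    return log

msg = {
    "flight_route": ["wp1", "wp2"],
    "sequences_by_waypoint": {"wp1": ["forward", "ortho"], "wp2": ["backward"]},
}
route, seqs = s1001_receive(msg)
log = s1002_execute(route, seqs, fly_to=lambda wp: None,
                    shoot=lambda d: f"img_{d}")
print(log)
```

In the actual method the shooting happens while the drone keeps flying; the sequential loop here is only meant to show the data flow from S1001 to S1002.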
For a specific description of the embodiment shown in fig. 10, reference may be made to the contents of the embodiments shown in fig. 2 to 9, which are not repeated in detail here.
Corresponding to the shooting control method of the above embodiments, an embodiment of the present application further provides a shooting control apparatus. Referring to fig. 11, the shooting control apparatus of the embodiment of the present application may include a storage device and one or more processors.
The storage device is used to store program instructions, i.e. a computer program of executable instructions of the shooting control method. The storage device may include at least one type of storage medium, including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. The shooting control apparatus may also cooperate with a network storage apparatus that performs the storage function of the memory through a network connection. The memory may be an internal storage unit of the shooting control apparatus, such as a hard disk or a memory of the shooting control apparatus; it may also be an external storage device of the shooting control apparatus, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card provided on the shooting control apparatus. Further, the memory may include both an internal storage unit and an external storage device of the shooting control apparatus. The memory is used for storing the computer program and other programs and data required by the apparatus, and may also be used to temporarily store data that has been output or is to be output.
In some embodiments, the one or more processors invoke the program instructions stored in the storage device and, when the program instructions are executed, are individually or collectively configured to perform the following operations: acquiring first position information of an area to be shot and second position information of an extended shooting area, wherein the extended shooting area is obtained by expanding the area to be shot and the second position information is determined according to the first position information; determining third position information of effective shooting areas in different shooting directions according to the first position information and the second position information; and determining a shooting sequence corresponding to each waypoint on a preset flight route of the unmanned aerial vehicle according to the third position information and the flight route. Each shooting sequence includes one or more continuous shooting points, the shooting directions of the one or more shooting points of each shooting sequence are different, and the shooting points in each shooting direction are located in the effective shooting area in that shooting direction. The processor of this embodiment can implement the shooting control method of the embodiment shown in fig. 2 or fig. 7 of this application; the shooting control apparatus of this embodiment is described with reference to that method.
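The area definitions used in these operations can be sketched geometrically with axis-aligned rectangles: the extended shooting area grows the area to be shot by a first preset distance in every direction, and each oblique direction's effective shooting area translates the area to be shot by a second preset distance. The concrete distances and the direction-to-offset mapping below are illustrative assumptions.

```python
# Hedged geometric sketch of the area definitions, using axis-aligned
# rectangles (xmin, ymin, xmax, ymax) in metres. Distances are assumed
# example values, not figures from the application.

def expand(rect, d):
    """Extended shooting area: grow the area to be shot by d on every side."""
    xmin, ymin, xmax, ymax = rect
    return (xmin - d, ymin - d, xmax + d, ymax + d)

def shift(rect, dx, dy):
    """Effective shooting area for one oblique direction: the area to be
    shot translated by the second preset distance."""
    xmin, ymin, xmax, ymax = rect
    return (xmin + dx, ymin + dy, xmax + dx, ymax + dy)

to_shoot = (0, 0, 100, 100)
d1, d2 = 30, 20                        # first/second preset distances (assumed)
extended = expand(to_shoot, d1)
effective = {
    "forward":  shift(to_shoot, 0, -d2),  # opposite direction pairs
    "backward": shift(to_shoot, 0, d2),
    "left":     shift(to_shoot, -d2, 0),
    "right":    shift(to_shoot, d2, 0),
    "ortho":    to_shoot,                 # orthographic: the area itself
}
print(extended, effective["forward"])
```

This mirrors the claim structure: the four oblique effective areas are pairwise-opposite translations of the area to be shot, while the orthographic effective area is the area to be shot itself.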
In some embodiments, the one or more processors invoke the program instructions stored in the storage device and, when the program instructions are executed, are individually or collectively configured to perform the following operations: receiving a flight route sent by a control device of the unmanned aerial vehicle and a shooting sequence corresponding to each waypoint on the flight route; and controlling a shooting device carried on the unmanned aerial vehicle to shoot based on the flight route and the shooting sequence corresponding to each waypoint. Each shooting sequence includes one or more continuous shooting points, the shooting directions of the one or more shooting points of each shooting sequence are different, and the shooting points in each shooting direction are located in the effective shooting area in that shooting direction; the effective shooting areas are determined according to first position information of the area to be shot and second position information of an extended shooting area, the extended shooting area is obtained by expanding the area to be shot, and the second position information is determined according to the first position information. The processor of this embodiment can implement the shooting control method of the embodiment shown in fig. 10 of this application; the shooting control apparatus of this embodiment is described with reference to that method.
The Processor may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
It should be noted that the communication processes such as "transmission" and "reception" related to the above entity apparatus may be performed by using a transceiver or a communication interface on the apparatus, and other data processing processes except for "transmission" and "reception" may be performed by a processor on the apparatus.
Further, the embodiment of the present application also provides an unmanned aerial vehicle. Referring to fig. 1 and fig. 12, the unmanned aerial vehicle can include a body 100, a gimbal 300, and the shooting control apparatus of the above embodiment. The gimbal 300 is mounted on the body and, in this embodiment, is used for mounting the shooting device 200. The shooting control apparatus is supported by the body 100 and is electrically connected to the gimbal 300.
Furthermore, an embodiment of the present application also provides a computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements the steps of the photographing control method of the above-described embodiment.
The computer-readable storage medium may be an internal storage unit, such as a hard disk or a memory, of the shooting control apparatus according to any of the foregoing embodiments. The computer-readable storage medium may also be an external storage device of the photographing control apparatus, such as a plug-in hard disk, a Smart Media Card (SMC), an SD Card, a Flash memory Card (Flash Card), and the like provided on the device. Further, the computer-readable storage medium may also include both an internal storage unit of the photographing control apparatus and an external storage device. The computer-readable storage medium is used for storing the computer program and other programs and data required by the photographing control apparatus, and may also be used for temporarily storing data that has been output or is to be output.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The above disclosure is only a few examples of the present application, and certainly should not be taken as limiting the scope of the present application, which is therefore intended to cover all modifications that are within the scope of the present application and which are equivalent to the claims.

Claims (170)

1. A shooting control method, characterized by comprising:
acquiring first position information of a to-be-shot area and second position information of an externally-expanded shot area, wherein the externally-expanded shot area is obtained by expanding the to-be-shot area, and the second position information is determined according to the first position information;
determining third position information of effective shooting areas in different shooting directions according to the first position information and the second position information;
determining a shooting sequence corresponding to each waypoint on the flight route according to the third position information and a preset flight route of the unmanned aerial vehicle;
each shooting sequence comprises one or more continuous shooting points, the shooting directions of the one or more shooting points of each shooting sequence are different, and the shooting points in the shooting directions are all located in the effective shooting area in the shooting direction.
2. The method according to claim 1, wherein the extended shooting area is an area obtained by expanding the area to be shot in different directions by a first preset distance, respectively;
the first preset distance is determined based on the flying height of the unmanned aerial vehicle and the installation angle of a shooting device carried on the unmanned aerial vehicle.
3. The method of claim 2, wherein the shooting directions comprise at least two of:
a forward shooting direction inclined relative to the vertical direction and facing the front of the unmanned aerial vehicle, a backward shooting direction inclined relative to the vertical direction and facing the rear of the unmanned aerial vehicle, a left shooting direction inclined relative to the vertical direction and facing the left side of the unmanned aerial vehicle, a right shooting direction inclined relative to the vertical direction and facing the right side of the unmanned aerial vehicle, or an orthographic shooting direction facing vertically downward.
4. The method according to claim 3, wherein the effective shooting area in the forward shooting direction, the effective shooting area in the backward shooting direction, the effective shooting area in the left shooting direction, and the effective shooting area in the right shooting direction are respectively:
an area obtained by moving the area to be shot a second preset distance in a first direction, an area obtained by moving the area to be shot the second preset distance in a second direction, an area obtained by moving the area to be shot the second preset distance in a third direction, and an area obtained by moving the area to be shot the second preset distance in a fourth direction; the effective shooting area in the orthographic shooting direction is the area to be shot itself, wherein the first direction is opposite to the second direction, and the third direction is opposite to the fourth direction.
5. The method of claim 4, wherein the first direction, the second direction, the third direction, or the fourth direction is related to a shape of the flight path.
6. The method of claim 2, wherein the fly height is set by a user.
7. The method of claim 6, wherein the fly height is user input through a control device of the drone.
8. The method of claim 2, wherein the flying height is determined according to parameters of a camera onboard the drone and a preset ground resolution.
9. The method of claim 8, wherein the parameters of the camera include a focal length of the camera and a single pixel side length of an image sensor of the camera.
10. The method of claim 1, wherein the flight route comprises a plurality of parallel sub-routes, and adjacent sub-routes are connected at one end to form the flight route;
the flight route determining process comprises the following steps:
determining the lateral distance between two adjacent sub-routes in the flight route according to the preset ground resolution, the preset lateral overlapping rate and the number of pixels of a shooting device carried on the unmanned aerial vehicle, which are vertical to the flight direction of the unmanned aerial vehicle;
and determining the flight route according to the second position information and the lateral distance.
11. The method of claim 10, wherein the extended shot area is square, the starting waypoint of the flight path is any corner position of the extended shot area, and the sub-path is parallel to one of the edges of the extended shot area.
12. The method of claim 10, wherein the side lap ratio is set by a user.
13. The method of claim 1, wherein the first location information is set by a user; or,
the area to be shot is determined by importing an external file, and the first position information is recorded in the external file.
14. The method of claim 1, wherein a duration required for a camera onboard the drone to complete the capturing of each of the capture sequences is a first fixed duration.
15. The method according to claim 14, wherein a time period required for the photographing apparatus to complete photographing of the adjacent photographing point in the same photographing sequence is a second fixed time period.
16. The method of claim 1, wherein the spacing between adjacent shot sequences is a first fixed spacing.
17. The method of claim 16, wherein the spacing between adjacent shots in the same shot sequence is a second fixed spacing.
18. The method of claim 1, wherein the initial shooting point among the shooting points is: the initial flight position at which the unmanned aerial vehicle starts flying along the flight route; or,
the initial shooting point among the shooting points is: the initial waypoint of the flight route.
19. The method according to claim 1, wherein before the obtaining the first position information of the region to be shot and the second position information of the extended shot region, the method further comprises:
acquiring a trigger instruction for indicating to enter a tilt shooting mode;
entering the tilt photographing mode.
20. The method according to any one of claims 1 to 19, wherein the method is performed by a control device of the drone.
21. The method of claim 20, wherein the obtaining of the second position information of the extended shooting area comprises:
and determining second position information of the outward-extended shooting area according to the first position information.
22. The method of claim 20, wherein after the obtaining of the second position information of the extended shooting area, the method further comprises:
and planning the flight route of the unmanned aerial vehicle according to the second position information.
23. The method of claim 22, further comprising:
sending the flight path to the unmanned aerial vehicle;
after determining a shooting sequence corresponding to each waypoint on the flight route according to the third position information and a preset flight route of the unmanned aerial vehicle, the method further comprises the following steps:
and sending the shooting sequence corresponding to each waypoint to the unmanned aerial vehicle, so that the unmanned aerial vehicle controls a shooting device carried on the unmanned aerial vehicle to shoot based on the flight line and the shooting sequence corresponding to each waypoint.
24. The method according to any one of claims 1 to 19, characterized in that the method is performed by the drone.
25. The method of claim 24, wherein the first location information is sent by a control device of the drone.
26. The method of claim 24, wherein the second location information is sent by a control device of the drone.
27. The method of claim 24, wherein the obtaining of the second position information of the extended shooting area comprises:
and determining second position information of the outward-extended shooting area according to the first position information.
28. The method of claim 24, wherein the flight path is planned by a control device of the drone based on the second location information; or,
the flight path is planned by the drone based on the second location information.
29. The method of claim 24, further comprising:
and controlling a shooting device carried on the unmanned aerial vehicle to shoot based on the flight route and the shooting sequence corresponding to each waypoint.
30. The method of claim 29, wherein controlling a camera onboard the drone to take a photograph based on the flight path and the photograph sequence comprises:
controlling the unmanned aerial vehicle to fly according to the flight route;
according to the shooting sequence, in the process that the unmanned aerial vehicle flies from the current shooting point to the next shooting point, controlling a gimbal on the unmanned aerial vehicle to switch attitude, so that a shooting device on the gimbal is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point;
and acquiring the image shot by the shooting device at each shooting point.
31. The method of claim 30, wherein the camera takes a picture without affecting the flight of the drone.
32. The method of claim 30, wherein controlling the gimbal on the unmanned aerial vehicle to switch attitude during the flight of the unmanned aerial vehicle from the current shooting point to the next shooting point according to the shooting sequence, so that the shooting device on the gimbal is in the corresponding shooting direction when the unmanned aerial vehicle arrives at each shooting point, comprises:
according to the shooting sequence, sending a shooting trigger signal to the gimbal, so that the gimbal performs attitude switching while the unmanned aerial vehicle flies from the current shooting point to the next shooting point, and the shooting device on the gimbal is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point;
the shooting trigger signal is further used for instructing the gimbal to trigger the shooting device to shoot when the shooting device is in the corresponding shooting direction.
33. The method of claim 32, wherein the shooting trigger signal is a timed shooting trigger signal used to instruct the gimbal to trigger the shooting device to shoot based on a first timing strategy;
wherein the first timing strategy comprises: the time required by the shooting device to complete the shooting of each shooting sequence is a first fixed duration.
34. The method of claim 33, wherein the timed shooting trigger signal is further used to instruct the gimbal to trigger the shooting device to shoot based on a second timing strategy;
wherein the second timing strategy comprises: the time required by the shooting device to complete the shooting of adjacent shooting points in the same shooting sequence is a second fixed duration.
35. The method of claim 33, wherein the timed shooting trigger signal is sent to the gimbal once, and sending the shooting trigger signal to the gimbal comprises:
sending the timed shooting trigger signal to the gimbal before the gimbal controls the shooting device to complete the shooting of the first shooting point of the first shooting sequence.
36. The method of any one of claims 33 to 35, wherein the number of shooting sequences is plural, and the method further comprises:
sending all the shooting sequences to the gimbal at one time before the gimbal controls the shooting device to complete the shooting of the first shooting point of the first shooting sequence.
37. The method of any one of claims 33 to 35, wherein the number of shooting sequences is plural, and the method further comprises:
sending the next shooting sequence to the gimbal after the shooting device finishes the shooting of the current shooting sequence.
38. The method according to claim 32, wherein the shooting trigger signal is a fixed-distance shooting trigger signal used to instruct the gimbal to control the shooting device to shoot based on a first distance strategy;
wherein the first distance strategy comprises: the distance between adjacent shooting sequences is a first fixed distance.
39. The method according to claim 38, wherein the fixed-distance shooting trigger signal is further used to instruct the gimbal to control the shooting device to shoot based on a second distance strategy;
wherein the second distance strategy comprises: the distance between adjacent shooting points in the same shooting sequence is a second fixed distance.
40. The method according to claim 38, wherein the fixed-distance shooting trigger signal is sent to the gimbal a plurality of times, and sending the shooting trigger signal to the gimbal comprises:
sending a fixed-distance shooting trigger signal to the gimbal before the gimbal controls the shooting device to complete the shooting of the first shooting point of each shooting sequence.
41. The method of claim 40, further comprising:
sending the shooting sequence to the gimbal before the gimbal controls the shooting device to complete the shooting of the first shooting point of each shooting sequence.
42. The method according to claim 39, wherein the fixed-distance shooting trigger signal is sent to the gimbal a plurality of times, and sending the shooting trigger signal to the gimbal comprises:
sending a fixed-distance shooting trigger signal to the gimbal before the gimbal controls the shooting device to complete the shooting of each shooting point.
43. The method of claim 42, further comprising:
sending an indication signal to the gimbal before the gimbal controls the shooting device to complete the shooting of each shooting point, wherein the indication signal is used for indicating the target attitude of the gimbal corresponding to the shooting point or the shooting direction of the shooting device.
44. The method of claim 30, wherein the maximum allowable flight speed of the unmanned aerial vehicle is determined based on the course interval for each shooting direction and the time required for the shooting device to complete the shooting of a shooting sequence and return to the initial shooting direction;
the course intervals in all shooting directions are equal, and the course interval is determined based on a preset ground resolution, a preset course overlap rate, and the number of pixels of the shooting device parallel to the flight direction of the unmanned aerial vehicle.
45. The method of claim 44, wherein the course overlap rate is set by a user.
46. The method of claim 44, wherein the initial shooting direction is a shooting direction corresponding to one of the shooting points in the shooting sequence.
47. The method of claim 30, wherein said controlling said drone to fly according to said flight pattern comprises:
controlling the real-time height between the lens of the shooting device and the area to be shot to remain within a preset height range.
48. The method according to claim 30, wherein controlling the gimbal on the unmanned aerial vehicle to switch attitude so that the shooting device on the gimbal is in the corresponding shooting direction at each shooting point specifically comprises:
acquiring the real-time attitude of the unmanned aerial vehicle;
determining the deviation between the real-time attitude of the unmanned aerial vehicle and the shooting direction of the next shooting point;
and controlling the gimbal on the unmanned aerial vehicle to switch attitude according to the deviation, so that the shooting device on the gimbal is in the corresponding shooting direction at each shooting point.
49. The method of claim 30, wherein the gimbal is a tri-axis gimbal configured to move about a yaw axis, a roll axis, and a pitch axis;
controlling the gimbal on the unmanned aerial vehicle to switch attitude comprises:
controlling any two of the yaw-axis attitude, the roll-axis attitude, and the pitch-axis attitude of the gimbal so as to control the gimbal to switch attitude.
50. A shooting control apparatus, characterized in that the apparatus comprises:
storage means for storing program instructions; and
one or more processors that invoke program instructions stored in the storage device, the one or more processors individually or collectively configured to, when the program instructions are executed, perform operations comprising:
acquiring first position information of a to-be-shot area and second position information of an externally-expanded shot area, wherein the externally-expanded shot area is obtained by expanding the to-be-shot area, and the second position information is determined according to the first position information;
determining third position information of effective shooting areas in different shooting directions according to the first position information and the second position information;
determining a shooting sequence corresponding to each waypoint on the flight route according to the third position information and a preset flight route of the unmanned aerial vehicle;
each shooting sequence comprises one or more continuous shooting points, the shooting directions of the one or more shooting points of each shooting sequence are different, and the shooting points in the shooting directions are all located in the effective shooting area in the shooting direction.
51. The apparatus according to claim 50, wherein the extended shooting area is an area obtained by expanding the area to be shot in different directions by a first preset distance, respectively;
the first preset distance is determined based on the flying height of the unmanned aerial vehicle and the installation angle of a shooting device carried on the unmanned aerial vehicle.
52. The apparatus of claim 51, wherein the shooting directions comprise at least two of:
a forward shooting direction inclined relative to the vertical direction and facing the front of the unmanned aerial vehicle, a backward shooting direction inclined relative to the vertical direction and facing the rear of the unmanned aerial vehicle, a left shooting direction inclined relative to the vertical direction and facing the left side of the unmanned aerial vehicle, a right shooting direction inclined relative to the vertical direction and facing the right side of the unmanned aerial vehicle, or an orthographic shooting direction facing vertically downward.
53. The apparatus according to claim 52, wherein the effective shooting area in the forward shooting direction, the effective shooting area in the backward shooting direction, the effective shooting area in the left shooting direction, and the effective shooting area in the right shooting direction are respectively:
an area obtained by moving the area to be shot a second preset distance in a first direction, an area obtained by moving the area to be shot the second preset distance in a second direction, an area obtained by moving the area to be shot the second preset distance in a third direction, and an area obtained by moving the area to be shot the second preset distance in a fourth direction; the effective shooting area in the vertically downward shooting direction is the area to be shot itself, wherein the first direction is opposite to the second direction, and the third direction is opposite to the fourth direction.
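The shifting rule of claim 53 can be sketched as follows (illustrative only; axis-aligned rectangles and the mapping of the first to fourth directions onto ±x/±y are assumptions, not part of the claims):

```python
def shift_rect(rect, dx, dy):
    """Translate an axis-aligned rectangle given as (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = rect
    return (xmin + dx, ymin + dy, xmax + dx, ymax + dy)

def effective_areas(area_to_shoot, d):
    """Effective shooting area per direction: the area to be shot moved by the
    second preset distance d in each of two opposite direction pairs; the
    vertically downward direction uses the area to be shot unchanged."""
    return {
        "forward":  shift_rect(area_to_shoot, 0.0,  d),   # first direction
        "backward": shift_rect(area_to_shoot, 0.0, -d),   # second direction (opposite)
        "left":     shift_rect(area_to_shoot, -d, 0.0),   # third direction
        "right":    shift_rect(area_to_shoot,  d, 0.0),   # fourth direction (opposite)
        "nadir":    area_to_shoot,                        # vertically downward
    }
```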
54. The apparatus of claim 53, wherein the first direction, the second direction, the third direction, or the fourth direction is related to a shape of the flight path.
55. The apparatus of claim 51, wherein the flight height is set by a user.
56. The apparatus of claim 55, wherein the flight height is input by a user via a control device of the unmanned aerial vehicle.
57. The device of claim 51, wherein the flying height is determined according to parameters of a camera mounted on the unmanned aerial vehicle and a preset ground resolution.
58. The device of claim 57, wherein the parameters of the camera include a focal length of the camera and a single pixel side length of an image sensor of the camera.
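Claims 57 and 58 correspond to the standard photogrammetric relation between ground sampling distance (GSD), focal length, and pixel size: GSD = H · pixel_size / focal_length, hence H = GSD · focal_length / pixel_size. An illustrative sketch (function and parameter names are hypothetical):

```python
def flight_height(gsd_m_per_px: float, focal_length_m: float, pixel_size_m: float) -> float:
    """Invert GSD = H * pixel_size / focal_length to get the flight height H
    that achieves the preset ground resolution."""
    return gsd_m_per_px * focal_length_m / pixel_size_m

# A 35 mm lens with 2.4 um pixels at a 2 cm/px target resolution gives
# a flight height of roughly 292 m.
```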
59. The apparatus of claim 50, wherein the flight route comprises a plurality of parallel sub-routes, adjacent sub-routes being connected at one end to form the flight route;
the one or more processors, individually or collectively, are further configured to, when determining a flight path, perform the following:
determining the lateral distance between two adjacent sub-routes of the flight route according to a preset ground resolution, a preset side overlap rate, and the number of pixels of the shooting device carried on the unmanned aerial vehicle in the direction perpendicular to the flight direction of the unmanned aerial vehicle;
and determining the flight route according to the second position information and the lateral distance.
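The lateral-distance rule of claim 59 matches the usual survey-planning formula: the cross-track ground footprint of one image, reduced by the preset side overlap. A sketch under that assumption (names are hypothetical):

```python
def lateral_distance(gsd_m_per_px: float, cross_track_pixels: int, side_overlap: float) -> float:
    """Spacing between adjacent parallel sub-routes: the image footprint
    perpendicular to the flight direction, reduced by the side overlap rate."""
    footprint_m = gsd_m_per_px * cross_track_pixels
    return footprint_m * (1.0 - side_overlap)

# 2 cm/px and 4000 cross-track pixels give an 80 m footprint; with 70 %
# side overlap the sub-routes are spaced 24 m apart.
```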
60. The apparatus of claim 59, wherein the extended shooting area is square, the starting waypoint of the flight route is any corner position of the extended shooting area, and each sub-route is parallel to one of the sides of the extended shooting area.
61. The apparatus of claim 59, wherein the side overlap rate is set by a user.
62. The apparatus of claim 50, wherein the first position information is set by a user; or,
the area to be shot is determined by importing an external file, and the first position information is recorded in the external file.
63. The apparatus of claim 50, wherein the time required for the shooting device carried on the unmanned aerial vehicle to complete the shooting of each shooting sequence is a first fixed duration.
64. The apparatus according to claim 63, wherein the time required for the shooting device to complete the shooting of adjacent shooting points in the same shooting sequence is a second fixed duration.
65. The apparatus of claim 50, wherein the distance between adjacent shooting sequences is a first fixed distance.
66. The apparatus of claim 65, wherein the distance between adjacent shooting points in the same shooting sequence is a second fixed distance.
67. The apparatus of claim 50, wherein the initial shooting point among the shooting points is: the position at which the unmanned aerial vehicle starts flying along the flight route; or,
the initial shooting point among the shooting points is: the starting waypoint of the flight route.
68. The apparatus of claim 50, wherein the one or more processors, individually or collectively, are further configured to, prior to obtaining the first position information of the area to be shot:
acquire a trigger instruction instructing entry into a tilt shooting mode; and
entering the tilt photographing mode.
69. The apparatus of any one of claims 50 to 68, wherein the shooting control apparatus is provided on a control device of the unmanned aerial vehicle.
70. The apparatus of claim 69, wherein the one or more processors, when obtaining the second position information of the extended shooting area, are further configured, individually or collectively, to:
determine the second position information of the extended shooting area according to the first position information.
71. The apparatus of claim 69, wherein the one or more processors, after obtaining the second position information of the extended shooting area, are further configured, individually or collectively, to:
plan the flight route of the unmanned aerial vehicle according to the second position information.
72. The apparatus of claim 69, wherein the one or more processors are further configured, individually or collectively, to:
sending the flight path to the unmanned aerial vehicle;
the one or more processors, after determining a shooting sequence corresponding to each waypoint on the flight path according to the third position information and a preset flight path of the unmanned aerial vehicle, are further configured to, individually or collectively:
and sending the shooting sequence to the unmanned aerial vehicle, so that the unmanned aerial vehicle controls a shooting device carried on the unmanned aerial vehicle to shoot based on the flight route and the shooting sequence.
73. The apparatus of any one of claims 50 to 68, wherein the shooting control apparatus is provided on the unmanned aerial vehicle.
74. The apparatus of claim 73, wherein the first position information is sent by a control device of the unmanned aerial vehicle.
75. The apparatus of claim 73, wherein the second position information is sent by a control device of the unmanned aerial vehicle.
76. The apparatus of claim 73, wherein the one or more processors, when obtaining the second position information of the extended shooting area, are further configured, individually or collectively, to:
determine the second position information of the extended shooting area according to the first position information.
77. The apparatus of claim 73, wherein the flight path is planned by a control device of the drone based on the second location information; or,
the flight path is planned by the drone based on the second location information.
78. The apparatus of claim 73, wherein the one or more processors are further configured, individually or collectively, to:
and controlling a shooting device carried on the unmanned aerial vehicle to shoot based on the flight route and the shooting sequence.
79. The device of claim 78, wherein the one or more processors, when controlling a camera onboard the drone to take a shot based on the flight path and the shooting sequence, are further configured, individually or collectively, to:
controlling the unmanned aerial vehicle to fly according to the flight route;
according to the shooting sequence, in the process of the unmanned aerial vehicle flying from the current shooting point to the next shooting point, controlling a pan-tilt head on the unmanned aerial vehicle to switch its attitude, so that the shooting device on the pan-tilt head is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point;
and acquiring the image shot by the shooting device at each shooting point.
80. The apparatus of claim 79, wherein the shooting device shoots without interrupting the flight of the unmanned aerial vehicle.
81. The apparatus of claim 79, wherein the one or more processors, when controlling the pan-tilt head on the unmanned aerial vehicle to switch its attitude in the process of the unmanned aerial vehicle flying from the current shooting point to the next shooting point according to the shooting sequence, so that the shooting device on the pan-tilt head is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point, are further configured, individually or collectively, to:
send a shooting trigger signal to the pan-tilt head according to the shooting sequence, so that the pan-tilt head switches its attitude in the process of the unmanned aerial vehicle flying from the current shooting point to the next shooting point, and the shooting device on the pan-tilt head is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point;
the shooting trigger signal is further used for instructing the pan-tilt head to trigger the shooting device to shoot when the shooting device is in the corresponding shooting direction.
82. The apparatus according to claim 81, wherein the shooting trigger signal is a timed shooting trigger signal, the timed shooting trigger signal being used for instructing the pan-tilt head to trigger the shooting device to shoot based on a first timing strategy;
wherein the first timing strategy comprises: the time required for the shooting device to complete the shooting of each shooting sequence is a first fixed duration.
83. The apparatus according to claim 82, wherein the timed shooting trigger signal is further used for instructing the pan-tilt head to trigger the shooting device to shoot based on a second timing strategy;
wherein the second timing strategy comprises: the time required for the shooting device to complete the shooting of adjacent shooting points in the same shooting sequence is a second fixed duration.
84. The apparatus of claim 82, wherein the timed shooting trigger signal is sent to the pan-tilt head once, and the one or more processors, when sending the shooting trigger signal to the pan-tilt head, are further configured, individually or collectively, to:
send the timed shooting trigger signal to the pan-tilt head before the pan-tilt head controls the shooting device to complete the shooting of the first shooting point of the first shooting sequence.
85. The apparatus of any one of claims 82 to 84, wherein the number of shooting sequences is plural, and the one or more processors are further configured, individually or collectively, to:
send all the shooting sequences to the pan-tilt head at one time before the pan-tilt head controls the shooting device to complete the shooting of the first shooting point of the first shooting sequence.
86. The apparatus of any one of claims 82 to 84, wherein the number of shooting sequences is plural, and the one or more processors are further configured, individually or collectively, to:
send the next shooting sequence to the pan-tilt head after the shooting device completes the shooting of the current shooting sequence.
87. The apparatus according to claim 81, wherein the shooting trigger signal is a fixed-distance shooting trigger signal, the fixed-distance shooting trigger signal being used for instructing the pan-tilt head to control the shooting device to shoot based on a first distance strategy;
wherein the first distance strategy comprises: the distance between adjacent shooting sequences is a first fixed distance.
88. The apparatus of claim 87, wherein the fixed-distance shooting trigger signal is further used for instructing the pan-tilt head to control the shooting device to shoot based on a second distance strategy;
wherein the second distance strategy comprises: the distance between adjacent shooting points in the same shooting sequence is a second fixed distance.
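Read together, claims 87 and 88 describe firing positions spaced by two fixed distances. A sketch of the resulting along-track trigger schedule (illustrative; it assumes the first fixed distance is measured between the first shots of consecutive sequences):

```python
def trigger_positions(start_m, first_fixed_m, second_fixed_m, n_sequences, shots_per_sequence):
    """Along-track positions (metres from start_m) at which the shooting
    device is triggered: shots within a sequence are second_fixed_m apart,
    and consecutive sequences start first_fixed_m apart."""
    positions = []
    for seq in range(n_sequences):
        seq_start = start_m + seq * first_fixed_m
        for shot in range(shots_per_sequence):
            positions.append(seq_start + shot * second_fixed_m)
    return positions
```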
89. The apparatus according to claim 87, wherein the one or more processors, when sending the shooting trigger signal to the pan-tilt head, are further configured, individually or collectively, to:
send a fixed-distance shooting trigger signal to the pan-tilt head before the pan-tilt head controls the shooting device to complete the shooting of the first shooting point of each shooting sequence.
90. The apparatus of claim 89, wherein said one or more processors are further configured, individually or collectively, to:
and sending the shooting sequence to the pan-tilt head before the pan-tilt head controls the shooting device to complete the shooting of the first shooting point of each shooting sequence.
91. The apparatus according to claim 88, wherein the fixed-distance shooting trigger signal is sent to the pan-tilt head a plurality of times, and the one or more processors, when sending the shooting trigger signal to the pan-tilt head, are further configured, individually or collectively, to:
send a fixed-distance shooting trigger signal to the pan-tilt head before the pan-tilt head controls the shooting device to complete the shooting of each shooting point.
92. The apparatus of claim 91, wherein the one or more processors are further configured, individually or collectively, to perform operations comprising:
and sending an indication signal to the pan-tilt head before the pan-tilt head controls the shooting device to complete the shooting of each shooting point, the indication signal being used for indicating the target attitude of the pan-tilt head corresponding to that shooting point or the shooting direction of the shooting device.
93. The apparatus of claim 79, wherein the maximum allowable flight speed of the unmanned aerial vehicle is determined based on the course interval of each shooting direction and the time required for the shooting device to complete one shooting sequence and return to the initial shooting direction;
the course intervals of all shooting directions are equal, and the course interval is determined based on a preset ground resolution, a preset course overlap rate, and the number of pixels of the shooting device in the direction parallel to the flight direction of the unmanned aerial vehicle.
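Claim 93 ties the speed cap to the course interval and the sequence cycle time: the aircraft must not travel more than one course interval while a full shooting sequence completes. A sketch using hypothetical names:

```python
def course_interval(gsd_m_per_px: float, along_track_pixels: int, course_overlap: float) -> float:
    """Along-track interval between successive shots in one direction: the
    image footprint parallel to the flight direction, reduced by the preset
    course overlap rate."""
    return gsd_m_per_px * along_track_pixels * (1.0 - course_overlap)

def max_flight_speed(course_interval_m: float, sequence_cycle_s: float) -> float:
    """Cap the speed so one shooting sequence (including returning to the
    initial shooting direction) finishes within one course interval."""
    return course_interval_m / sequence_cycle_s

# 2 cm/px, 3000 along-track pixels and 80 % course overlap give a 12 m
# interval; a 6 s sequence cycle then caps the speed at 2 m/s.
```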
94. The apparatus of claim 93, wherein the course overlap ratio is set by a user.
95. The apparatus of claim 93, wherein the initial shooting direction is a shooting direction corresponding to one of the shooting points in the shooting sequence.
96. The apparatus of claim 79, wherein the one or more processors, when controlling the unmanned aerial vehicle to fly according to the flight route, are further configured, individually or collectively, to:
control the real-time height between the lens of the shooting device and the area to be shot so that it remains within a preset height range.
97. The apparatus of claim 79, wherein the one or more processors, when controlling the pan-tilt head on the unmanned aerial vehicle to switch its attitude so that the shooting device on the pan-tilt head is in the corresponding shooting direction at each shooting point, are further configured, individually or collectively, to:
acquiring the real-time attitude of the unmanned aerial vehicle;
determining the deviation between the real-time attitude of the unmanned aerial vehicle and the shooting direction of the next shooting point;
and controlling the pan-tilt head on the unmanned aerial vehicle to switch its attitude according to the deviation, so that the shooting device on the pan-tilt head is in the corresponding shooting direction at each shooting point.
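The deviation computation of claim 97 amounts to subtracting the aircraft's real-time heading from the target shooting direction and wrapping the result. An illustrative yaw-only sketch (a real controller would also handle pitch and roll):

```python
def yaw_deviation(uav_yaw_deg: float, target_yaw_deg: float) -> float:
    """Smallest signed yaw correction (degrees, wrapped to (-180, 180]) that
    the pan-tilt head must apply so the shooting device points along the next
    shooting direction despite the aircraft's current heading."""
    dev = (target_yaw_deg - uav_yaw_deg) % 360.0
    if dev > 180.0:
        dev -= 360.0
    return dev
```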
98. The apparatus according to claim 79, wherein the pan-tilt head is a three-axis pan-tilt head configured to move about a yaw axis, a roll axis, and a pitch axis;
the one or more processors, when controlling a pan-tilt switch attitude on the drone, are further configured, individually or collectively, to:
and controlling any two of the yaw-axis attitude, the roll-axis attitude, and the pitch-axis attitude of the pan-tilt head, so as to control the pan-tilt head to switch its attitude.
99. An unmanned aerial vehicle, comprising:
a body;
a pan-tilt head carried on the body and used for carrying a shooting device; and
the shooting control apparatus of any one of claims 50 to 68 and 73 to 98, supported by the body, the shooting control apparatus being electrically connected to the pan-tilt head.
100. A computer-readable storage medium on which a computer program is stored, characterized in that the program, when executed by a processor, implements the photographing control method of any of claims 1 to 49.
101. A shooting control method, characterized by comprising:
receiving a flight path sent by a control device of an unmanned aerial vehicle and a shooting sequence corresponding to each waypoint on the flight path;
controlling a shooting device carried on the unmanned aerial vehicle to shoot based on the flight route and a shooting sequence corresponding to each waypoint;
wherein each shooting sequence comprises one or more continuous shooting points, the one or more shooting points of each shooting sequence have different shooting directions, and the shooting points in each shooting direction are all located within the effective shooting area for that shooting direction; the effective shooting areas are determined according to first position information of an area to be shot and second position information of an extended shooting area, the extended shooting area is obtained by expanding the area to be shot, and the second position information is determined according to the first position information.
102. The method according to claim 101, wherein the extended shooting area is an area obtained by expanding the area to be shot in different directions, each by a first preset distance;
the first preset distance is determined based on the flying height of the unmanned aerial vehicle and the installation angle of a shooting device carried on the unmanned aerial vehicle.
103. The method of claim 102, wherein the shooting directions comprise at least two of:
a forward shooting direction inclined relative to the vertical direction and facing the front of the unmanned aerial vehicle, a backward shooting direction inclined relative to the vertical direction and facing the rear of the unmanned aerial vehicle, a left shooting direction inclined relative to the vertical direction and facing the left side of the unmanned aerial vehicle, a right shooting direction inclined relative to the vertical direction and facing the right side of the unmanned aerial vehicle, or a vertically downward shooting direction.
104. The method according to claim 103, wherein the effective shooting area in the forward shooting direction, the effective shooting area in the backward shooting direction, the effective shooting area in the left shooting direction, and the effective shooting area in the right shooting direction are respectively:
an area obtained by moving the area to be shot a second preset distance in a first direction, an area obtained by moving the area to be shot the second preset distance in a second direction, an area obtained by moving the area to be shot the second preset distance in a third direction, and an area obtained by moving the area to be shot the second preset distance in a fourth direction; the effective shooting area in the vertically downward shooting direction is the area to be shot itself, wherein the first direction is opposite to the second direction, and the third direction is opposite to the fourth direction.
105. The method of claim 104, wherein the first direction, the second direction, the third direction, or the fourth direction is related to a shape of the flight path.
106. The method of claim 102, wherein the flight height is set by a user.
107. The method of claim 106, wherein the flight height is input by a user via a control device of the unmanned aerial vehicle.
108. The method of claim 102, wherein the flying height is determined according to parameters of a camera onboard the drone and a preset ground resolution.
109. The method of claim 108, wherein the parameters of the camera include a focal length of the camera and a single pixel side length of an image sensor of the camera.
110. The method of claim 101, wherein the flight route comprises a plurality of parallel sub-routes, adjacent sub-routes being connected at one end to form the flight route;
the flight route determining process comprises the following steps:
the control device of the unmanned aerial vehicle determines the lateral distance between two adjacent sub-routes of the flight route according to a preset ground resolution, a preset side overlap rate, and the number of pixels of the shooting device carried on the unmanned aerial vehicle in the direction perpendicular to the flight direction of the unmanned aerial vehicle; and determines the flight route according to the second position information and the lateral distance.
111. The method of claim 110, wherein the extended shooting area is square, the starting waypoint of the flight route is any corner position of the extended shooting area, and each sub-route is parallel to one of the sides of the extended shooting area.
112. The method of claim 110, wherein the side overlap rate is set by a user.
113. The method of claim 101, wherein the first position information is set by a user; or,
the area to be shot is determined by importing an external file, and the first position information is recorded in the external file.
114. The method of claim 101, wherein before receiving the flight path sent by the control device of the drone and the shooting sequence corresponding to each waypoint on the flight path, the method further comprises:
acquiring a trigger instruction instructing entry into a tilt shooting mode; and
entering the tilt photographing mode.
115. The method of claim 101, wherein controlling a camera onboard the drone to take a photograph based on the flight path and the photograph sequence comprises:
controlling the unmanned aerial vehicle to fly according to the flight route;
according to the shooting sequence, in the process of the unmanned aerial vehicle flying from the current shooting point to the next shooting point, controlling a pan-tilt head on the unmanned aerial vehicle to switch its attitude, so that the shooting device on the pan-tilt head is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point;
and acquiring the image shot by the shooting device at each shooting point.
116. The method of claim 115, wherein the shooting device shoots without interrupting the flight of the unmanned aerial vehicle.
117. The method of claim 115, wherein the controlling, according to the shooting sequence, of the pan-tilt head on the unmanned aerial vehicle to switch its attitude in the process of flying from the current shooting point to the next shooting point, so that the shooting device on the pan-tilt head is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point, comprises:
sending a shooting trigger signal to the pan-tilt head according to the shooting sequence, so that the pan-tilt head switches its attitude in the process of the unmanned aerial vehicle flying from the current shooting point to the next shooting point, and the shooting device on the pan-tilt head is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point;
the shooting trigger signal is further used for instructing the pan-tilt head to trigger the shooting device to shoot when the shooting device is in the corresponding shooting direction.
118. The method of claim 117, wherein the shooting trigger signal is a timed shooting trigger signal, the timed shooting trigger signal being used for instructing the pan-tilt head to trigger the shooting device to shoot based on a first timing strategy;
wherein the first timing strategy comprises: the time required for the shooting device to complete the shooting of each shooting sequence is a first fixed duration.
119. The method of claim 118, wherein the timed shooting trigger signal is further used for instructing the pan-tilt head to trigger the shooting device to shoot based on a second timing strategy;
wherein the second timing strategy comprises: the time required for the shooting device to complete the shooting of adjacent shooting points in the same shooting sequence is a second fixed duration.
120. The method of claim 118, wherein the timed shooting trigger signal is sent to the pan-tilt head once, and the sending of the shooting trigger signal to the pan-tilt head comprises:
sending the timed shooting trigger signal to the pan-tilt head before the pan-tilt head controls the shooting device to complete the shooting of the first shooting point of the first shooting sequence.
121. The method of any one of claims 118 to 120, wherein the number of shooting sequences is plural, the method further comprising:
sending all the shooting sequences to the pan-tilt head at one time before the pan-tilt head controls the shooting device to complete the shooting of the first shooting point of the first shooting sequence.
122. The method of any one of claims 118 to 120, wherein the number of shooting sequences is plural, the method further comprising:
sending the next shooting sequence to the pan-tilt head after the shooting device completes the shooting of the current shooting sequence.
123. The method according to claim 117, wherein the shooting trigger signal is a fixed-distance shooting trigger signal, the fixed-distance shooting trigger signal being used for instructing the pan-tilt head to control the shooting device to shoot based on a first distance strategy;
wherein the first distance strategy comprises: the distance between adjacent shooting sequences is a first fixed distance.
124. The method according to claim 123, wherein the fixed-distance shooting trigger signal is further used for instructing the pan-tilt head to control the shooting device to shoot based on a second distance strategy;
wherein the second distance strategy comprises: the distance between adjacent shooting points in the same shooting sequence is a second fixed distance.
125. The method according to claim 123, wherein the fixed-distance shooting trigger signal is sent to the pan-tilt head a plurality of times, and the sending of the shooting trigger signal to the pan-tilt head comprises:
sending a fixed-distance shooting trigger signal to the pan-tilt head before the pan-tilt head controls the shooting device to complete the shooting of the first shooting point of each shooting sequence.
126. The method of claim 125, further comprising:
and sending the shooting sequence to the pan-tilt head before the pan-tilt head controls the shooting device to complete the shooting of the first shooting point of each shooting sequence.
127. The method according to claim 124, wherein the fixed-distance shooting trigger signal is sent to the pan-tilt head a plurality of times, and the sending of the shooting trigger signal to the pan-tilt head comprises:
sending a fixed-distance shooting trigger signal to the pan-tilt head before the pan-tilt head controls the shooting device to complete the shooting of each shooting point.
128. The method of claim 127, further comprising:
and sending an indication signal to the pan-tilt head before the pan-tilt head controls the shooting device to complete the shooting of each shooting point, the indication signal being used for indicating the target attitude of the pan-tilt head corresponding to that shooting point or the shooting direction of the shooting device.
129. The method of claim 115, wherein the maximum allowable flight speed of the unmanned aerial vehicle is determined based on the course interval of each shooting direction and the time required for the shooting device to complete one shooting sequence and return to the initial shooting direction;
the course intervals of all shooting directions are equal, and the course interval is determined based on a preset ground resolution, a preset course overlap rate, and the number of pixels of the shooting device in the direction parallel to the flight direction of the unmanned aerial vehicle.
130. The method of claim 129 wherein the course overlap rate is set by a user.
131. The method of claim 129, wherein the initial capture direction is a capture direction corresponding to one of the capture points in the capture sequence.
132. The method of claim 115, wherein the controlling of the unmanned aerial vehicle to fly according to the flight route comprises:
controlling the real-time height between the lens of the shooting device and the area to be shot so that it remains within a preset height range.
133. The method according to claim 115, wherein the controlling of the pan-tilt head on the unmanned aerial vehicle to switch its attitude so that the shooting device on the pan-tilt head is in the corresponding shooting direction at each shooting point comprises:
acquiring the real-time attitude of the unmanned aerial vehicle;
determining the deviation between the real-time attitude of the unmanned aerial vehicle and the shooting direction of the next shooting point;
and controlling the pan-tilt head on the unmanned aerial vehicle to switch its attitude according to the deviation, so that the shooting device on the pan-tilt head is in the corresponding shooting direction at each shooting point.
134. The method of claim 115, wherein the pan-tilt is a three-axis pan-tilt configured to move about a yaw axis, a roll axis, and a pitch axis;
the controlling of the pan-tilt head on the unmanned aerial vehicle to switch its attitude comprises:
controlling any two of the yaw-axis attitude, the roll-axis attitude, and the pitch-axis attitude of the pan-tilt head, so as to control the pan-tilt head to switch its attitude.
135. A shooting control apparatus, characterized in that the apparatus comprises:
storage means for storing program instructions; and
one or more processors that invoke program instructions stored in the storage device, the one or more processors individually or collectively configured to, when the program instructions are executed, perform operations comprising:
receiving a flight path sent by a control device of an unmanned aerial vehicle and a shooting sequence corresponding to each waypoint on the flight path;
controlling a shooting device carried on the unmanned aerial vehicle to shoot based on the flight route and a shooting sequence corresponding to each waypoint;
wherein each shooting sequence comprises one or more continuous shooting points, the one or more shooting points of each shooting sequence have different shooting directions, and the shooting points in each shooting direction are all located within the effective shooting area for that shooting direction; the effective shooting areas are determined according to first position information of an area to be shot and second position information of an extended shooting area, the extended shooting area is obtained by expanding the area to be shot, and the second position information is determined according to the first position information.
136. The apparatus according to claim 135, wherein the extended shooting area is an area obtained by expanding the area to be shot in different directions, each by a first preset distance;
the first preset distance is determined based on the flying height of the unmanned aerial vehicle and the installation angle of a shooting device carried on the unmanned aerial vehicle.
137. The apparatus according to claim 136, wherein the shooting directions comprise at least two of:
a forward shooting direction inclined relative to the vertical direction and facing the front of the unmanned aerial vehicle, a backward shooting direction inclined relative to the vertical direction and facing the rear of the unmanned aerial vehicle, a left shooting direction inclined relative to the vertical direction and facing the left side of the unmanned aerial vehicle, a right shooting direction inclined relative to the vertical direction and facing the right side of the unmanned aerial vehicle, or a vertically downward shooting direction.
138. The apparatus according to claim 137, wherein the effective shooting area in the forward shooting direction, the effective shooting area in the backward shooting direction, the effective shooting area in the left shooting direction, and the effective shooting area in the right shooting direction are respectively:
an area obtained by moving the area to be shot a second preset distance in a first direction, an area obtained by moving the area to be shot the second preset distance in a second direction, an area obtained by moving the area to be shot the second preset distance in a third direction, and an area obtained by moving the area to be shot the second preset distance in a fourth direction; the effective shooting area in the vertically downward shooting direction is the area to be shot itself, wherein the first direction is opposite to the second direction, and the third direction is opposite to the fourth direction.
139. The apparatus of claim 138, wherein the first direction, the second direction, the third direction, or the fourth direction is related to the shape of the flight route.
140. The apparatus of claim 136, wherein the flying height is set by a user.
141. The apparatus of claim 140, wherein the flying height is input by a user via a control device of the unmanned aerial vehicle.
142. The apparatus of claim 136, wherein the flying height is determined according to parameters of a shooting device mounted on the unmanned aerial vehicle and a preset ground resolution.
143. The apparatus of claim 142, wherein the parameters of the shooting device comprise a focal length of the shooting device and a side length of a single pixel of an image sensor of the shooting device.
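Claims 142–143 describe the standard photogrammetric relation between flying height, focal length, pixel pitch, and ground sampling distance (GSD). A minimal sketch of that relation follows; the example camera parameters are illustrative, not taken from the patent:

```python
def flying_height(gsd_m, focal_length_m, pixel_size_m):
    """Flying height at which one image pixel covers `gsd_m` metres of
    ground. From similar triangles through the lens:
    GSD / pixel_size = H / f, hence H = GSD * f / pixel_size."""
    return gsd_m * focal_length_m / pixel_size_m

# e.g. 5 cm/px target resolution, 8.8 mm focal length, 2.41 um pixel pitch
h = flying_height(0.05, 8.8e-3, 2.41e-6)  # roughly 183 m
```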
144. The apparatus of claim 135, wherein the flight route comprises a plurality of parallel sub-routes, adjacent sub-routes being connected at one end to form the flight route;
the flight route is determined by the following process:
the control device of the unmanned aerial vehicle determines the lateral distance between two adjacent sub-routes in the flight route according to a preset ground resolution, a preset side overlap rate, and the number of pixels of the shooting device mounted on the unmanned aerial vehicle in the direction perpendicular to the flight direction of the unmanned aerial vehicle; and determines the flight route according to the second position information and the lateral distance.
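The lateral-distance computation in claim 144 can be sketched as follows. This is the conventional survey-planning formula implied by the claim's inputs (GSD, side overlap rate, cross-track pixel count); the example numbers are illustrative:

```python
def lateral_spacing(gsd_m, cross_track_pixels, side_overlap):
    """Distance between two adjacent parallel sub-routes. The ground swath
    perpendicular to the flight direction is gsd * cross_track_pixels;
    requiring a fractional `side_overlap` between adjacent swaths leaves
    a route spacing of swath * (1 - side_overlap)."""
    swath_width = gsd_m * cross_track_pixels
    return swath_width * (1.0 - side_overlap)

# e.g. 5 cm/px, 4000 px across track, 70 % side overlap
d = lateral_spacing(0.05, 4000, 0.70)  # 60 m between sub-routes
```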
145. The apparatus of claim 144, wherein the outward-expanded shooting region is square, the starting waypoint of the flight route is any corner of the outward-expanded shooting region, and each sub-route is parallel to one of the sides of the outward-expanded shooting region.
146. The apparatus of claim 144, wherein the side overlap rate is set by a user.
147. The apparatus according to claim 135, wherein the first location information is set by a user; or,
the area to be shot is determined by importing an external file, and the first position information is recorded in the external file.
148. The apparatus of claim 135, wherein the one or more processors, individually or collectively, are further configured to, before receiving the flight route sent by the control device of the unmanned aerial vehicle and the shooting sequence corresponding to each waypoint on the flight route, perform:
acquiring a trigger instruction for indicating to enter a tilt shooting mode;
entering the tilt photographing mode.
149. The apparatus of claim 135, wherein the one or more processors, when controlling the shooting device mounted on the unmanned aerial vehicle to shoot based on the flight route and the shooting sequence, are further configured, individually or collectively, to:
control the unmanned aerial vehicle to fly according to the flight route;
according to the shooting sequence, control the gimbal on the unmanned aerial vehicle to switch its attitude while the unmanned aerial vehicle flies from the current shooting point to the next shooting point, so that the shooting device on the gimbal is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point; and
acquire the image shot by the shooting device at each shooting point.
150. The apparatus of claim 149, wherein image capture by the shooting device does not interrupt the flight of the unmanned aerial vehicle.
151. The apparatus of claim 149, wherein the one or more processors, when controlling the gimbal on the unmanned aerial vehicle to switch its attitude while the unmanned aerial vehicle flies from the current shooting point to the next shooting point according to the shooting sequence, so that the shooting device on the gimbal is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point, are further configured, individually or collectively, to:
send a shooting trigger signal to the gimbal according to the shooting sequence, so that the gimbal switches its attitude while the unmanned aerial vehicle flies from the current shooting point to the next shooting point, and the shooting device on the gimbal is in the corresponding shooting direction when the unmanned aerial vehicle reaches each shooting point;
wherein the shooting trigger signal is further used for instructing the gimbal to trigger the shooting device to shoot when the shooting device is in the corresponding shooting direction.
152. The apparatus of claim 151, wherein the shooting trigger signal is a timed shooting trigger signal for instructing the gimbal to trigger the shooting device to shoot based on a first timing strategy;
wherein the first timing strategy comprises: the time required for the shooting device to complete the shooting of each shooting sequence is a first fixed duration.
153. The apparatus of claim 152, wherein the timed shooting trigger signal is further used for instructing the gimbal to trigger the shooting device to shoot based on a second timing strategy;
wherein the second timing strategy comprises: the time required for the shooting device to complete the shooting of adjacent shooting points in the same shooting sequence is a second fixed duration.
154. The apparatus of claim 152, wherein the timed shooting trigger signal is sent to the gimbal once, and the one or more processors, when sending the shooting trigger signal to the gimbal, are further configured, individually or collectively, to:
send the timed shooting trigger signal to the gimbal before the gimbal controls the shooting device to complete the shooting of the first shooting point of the first shooting sequence.
155. The apparatus of any one of claims 152 to 154, wherein the number of shooting sequences is plural, and the one or more processors are further configured, individually or collectively, to:
send all the shooting sequences to the gimbal at one time before the gimbal controls the shooting device to complete the shooting of the first shooting point of the first shooting sequence.
156. The apparatus of any one of claims 152 to 154, wherein the number of shooting sequences is plural, and the one or more processors are further configured, individually or collectively, to:
send the next shooting sequence to the gimbal after the shooting device completes the shooting of the current shooting sequence.
157. The apparatus of claim 151, wherein the shooting trigger signal is a distance-based shooting trigger signal for instructing the gimbal to trigger the shooting device to shoot based on a first distance strategy;
wherein the first distance strategy comprises: the distance between adjacent shooting sequences is a first fixed distance.
158. The apparatus of claim 157, wherein the distance-based shooting trigger signal is further used for instructing the gimbal to trigger the shooting device to shoot based on a second distance strategy;
wherein the second distance strategy comprises: the distance between adjacent shooting points in the same shooting sequence is a second fixed distance.
159. The apparatus of claim 157, wherein the distance-based shooting trigger signal is sent to the gimbal a plurality of times, and the one or more processors, when sending the shooting trigger signal to the gimbal, are further configured, individually or collectively, to:
send a distance-based shooting trigger signal to the gimbal before the gimbal controls the shooting device to complete the shooting of the first shooting point of each shooting sequence.
160. The apparatus of claim 159, wherein the one or more processors are further configured, individually or collectively, to:
send the shooting sequence to the gimbal before the gimbal controls the shooting device to complete the shooting of the first shooting point of each shooting sequence.
161. The apparatus of claim 158, wherein the one or more processors, when sending the distance-based shooting trigger signal to the gimbal a plurality of times, are further configured, individually or collectively, to:
send a distance-based shooting trigger signal to the gimbal before the gimbal controls the shooting device to complete the shooting of each shooting point.
162. The apparatus of claim 161, wherein the one or more processors are further configured, individually or collectively, to:
send an indication signal to the gimbal before the gimbal controls the shooting device to complete the shooting of each shooting point, wherein the indication signal indicates the target attitude of the gimbal, or the shooting direction of the shooting device, corresponding to that shooting point.
163. The apparatus of claim 151, wherein the maximum allowable flight speed of the unmanned aerial vehicle is determined based on the heading interval for each shooting direction and the time required for the shooting device to complete one shooting sequence and return to the initial shooting direction;
wherein the heading intervals in all shooting directions are equal, and the heading interval is determined based on a preset ground resolution, a preset heading overlap rate, and the number of pixels of the shooting device in the direction parallel to the flight direction of the unmanned aerial vehicle.
164. The apparatus of claim 163, wherein the heading overlap rate is set by a user.
165. The apparatus of claim 163, wherein the initial shooting direction is the shooting direction corresponding to one of the shooting points in the shooting sequence.
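The speed bound in claim 163 can be sketched as follows: the vehicle may advance at most one heading interval in the time the camera needs to finish a full shooting sequence. The formulas are the conventional survey-planning reading of the claim's inputs, and the example numbers are illustrative:

```python
def heading_interval(gsd_m, along_track_pixels, heading_overlap):
    """Spacing between successive shooting points along the flight
    direction: the along-track ground footprint (gsd * pixel count)
    thinned by the required heading overlap."""
    footprint = gsd_m * along_track_pixels
    return footprint * (1.0 - heading_overlap)

def max_speed(interval_m, sequence_duration_s):
    """Upper bound on flight speed: one heading interval per full
    shooting sequence (including the return to the initial direction)."""
    return interval_m / sequence_duration_s

# e.g. 5 cm/px, 3000 px along track, 80 % heading overlap, 6 s per sequence
v = max_speed(heading_interval(0.05, 3000, 0.80), 6.0)  # 5 m/s
```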
166. The apparatus of claim 151, wherein the one or more processors, when controlling the unmanned aerial vehicle to fly according to the flight route, are further configured, individually or collectively, to:
control the real-time height between the lens of the shooting device and the area to be shot to remain within a preset height range.
167. The apparatus of claim 151, wherein the one or more processors, when controlling the gimbal on the unmanned aerial vehicle to switch its attitude so that the shooting device on the gimbal is in the corresponding shooting direction at each shooting point, are further configured, individually or collectively, to:
acquire the real-time attitude of the unmanned aerial vehicle;
determine the deviation between the real-time attitude of the unmanned aerial vehicle and the shooting direction of the next shooting point; and
control the gimbal on the unmanned aerial vehicle to switch its attitude according to the deviation, so that the shooting device on the gimbal is in the corresponding shooting direction at each shooting point.
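The deviation step of claim 167 can be sketched as a per-axis subtraction. This is a deliberately simplified two-axis illustration, not the patent's method: a full implementation would compose three-dimensional rotations (rotation matrices or quaternions) rather than subtract angles independently:

```python
def gimbal_target(shot_yaw_deg, shot_pitch_deg, uav_yaw_deg, uav_pitch_deg):
    """Per-axis deviation between the desired shooting direction and the
    vehicle's real-time attitude; commanding the gimbal by this deviation
    leaves the camera pointing in the desired direction (small-angle,
    axes treated independently)."""
    return (shot_yaw_deg - uav_yaw_deg, shot_pitch_deg - uav_pitch_deg)

# e.g. desired forward-oblique shot at yaw 90, pitch -45 while the
# airframe sits at yaw 80, pitch 2: command the gimbal (+10, -47)
offset = gimbal_target(90.0, -45.0, 80.0, 2.0)
```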
168. The apparatus of claim 151, wherein the gimbal is a three-axis gimbal configured to rotate about a yaw axis, a roll axis, and a pitch axis;
the one or more processors, when controlling the gimbal on the unmanned aerial vehicle to switch its attitude, are further configured, individually or collectively, to:
control any two of the yaw-axis attitude, the roll-axis attitude, and the pitch-axis attitude of the gimbal so as to switch the attitude of the gimbal.
169. An unmanned aerial vehicle, comprising:
a body;
a gimbal mounted on the body and configured to carry a shooting device; and
the shooting control apparatus of any one of claims 135 to 168, supported by the body, wherein the shooting control apparatus is electrically connected to the gimbal.
170. A computer-readable storage medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the shooting control method of any one of claims 101 to 134.
CN202080032440.9A 2020-07-16 2020-07-16 Shooting control method and device, unmanned aerial vehicle and computer readable storage medium Active CN113875222B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311469626.2A CN117641107A (en) 2020-07-16 2020-07-16 Shooting control method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/102249 WO2022011623A1 (en) 2020-07-16 2020-07-16 Photographing control method and device, unmanned aerial vehicle, and computer-readable storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202311469626.2A Division CN117641107A (en) 2020-07-16 2020-07-16 Shooting control method and device

Publications (2)

Publication Number Publication Date
CN113875222A true CN113875222A (en) 2021-12-31
CN113875222B CN113875222B (en) 2023-11-24

Family

ID=78982120

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202311469626.2A Pending CN117641107A (en) 2020-07-16 2020-07-16 Shooting control method and device
CN202080032440.9A Active CN113875222B (en) 2020-07-16 2020-07-16 Shooting control method and device, unmanned aerial vehicle and computer readable storage medium

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202311469626.2A Pending CN117641107A (en) 2020-07-16 2020-07-16 Shooting control method and device

Country Status (2)

Country Link
CN (2) CN117641107A (en)
WO (1) WO2022011623A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117470199A * 2023-12-27 2024-01-30 Tianjin Yunsheng Intelligent Technology Co., Ltd. Swing photography control method and device, storage medium and electronic equipment
WO2024087024A1 * 2022-10-25 2024-05-02 SZ DJI Technology Co., Ltd. Information processing method, information processing device, aircraft system and storage medium

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114777744B (en) * 2022-04-25 2024-03-08 Institute of Vertebrate Paleontology and Paleoanthropology, Chinese Academy of Sciences Geological measurement method and device for paleontological field work and electronic equipment
CN114935942B (en) * 2022-05-20 2024-08-23 Wuxi Haina Intelligent Technology Co., Ltd. Determination method of distributed photovoltaic power station routing inspection route and electronic equipment

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106767706A (en) * 2016-12-09 2017-05-31 Sun Yat-sen University Aerial image acquisition method and system for UAV reconnaissance of traffic accident scenes
CN107504957A (en) * 2017-07-12 2017-12-22 Tianjin University Method for rapidly constructing three-dimensional terrain models using UAV multi-view photography
US20190118945A1 (en) * 2017-10-24 2019-04-25 Loveland Innovations, LLC Crisscross boustrophedonic flight patterns for uav scanning and imaging
US20190235502A1 (en) * 2018-01-29 2019-08-01 Aerovironment, Inc. Methods and Systems for Determining Flight Plans for Vertical Take-Off and Landing (VTOL) Aerial Vehicles
CN110771141A (en) * 2018-11-19 2020-02-07 SZ DJI Technology Co., Ltd. Shooting method and unmanned aerial vehicle
WO2020103022A1 (en) * 2018-11-21 2020-05-28 Guangzhou XAG Technology Co., Ltd. Surveying and mapping system, surveying and mapping method and apparatus, device and medium
CN111226185A (en) * 2019-04-22 2020-06-02 SZ DJI Technology Co., Ltd. Flight route generation method, control device and unmanned aerial vehicle system
CN111373339A (en) * 2019-05-17 2020-07-03 SZ DJI Technology Co., Ltd. Flight task generation method, control terminal, unmanned aerial vehicle and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10364026B1 (en) * 2015-09-21 2019-07-30 Amazon Technologies, Inc. Track and tether vehicle position estimation
CN109032165B (en) * 2017-07-21 2021-09-10 Guangzhou XAG Technology Co., Ltd. Method and device for generating unmanned aerial vehicle air route


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
MUHAMMAD ATTAMIMI;RONNY MARDIYANTO;ASTRIA NUR IRFANSYAH: "Inclined Image Recognition for Aerial Mapping by Unmanned Aerial Vehicles" *
ZHAO Hongyan; HUANG Xueqin: "Application of UAV oblique photography technology in emergency surveying and mapping of geological disasters" *
WEI Lai; HU Zhuowei; CHEN Tianbo; HU Shunqiang; CHEN Cheng; ZHAO Wenji: "UAV route design for single-camera oblique photography" *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024087024A1 * 2022-10-25 2024-05-02 SZ DJI Technology Co., Ltd. Information processing method, information processing device, aircraft system and storage medium
CN117470199A * 2023-12-27 2024-01-30 Tianjin Yunsheng Intelligent Technology Co., Ltd. Swing photography control method and device, storage medium and electronic equipment
CN117470199B * 2023-12-27 2024-03-15 Tianjin Yunsheng Intelligent Technology Co., Ltd. Swing photography control method and device, storage medium and electronic equipment

Also Published As

Publication number Publication date
CN117641107A (en) 2024-03-01
CN113875222B (en) 2023-11-24
WO2022011623A1 (en) 2022-01-20

Similar Documents

Publication Publication Date Title
CN110771141B (en) Shooting method and unmanned aerial vehicle
JP6803919B2 (en) Flight path generation methods, flight path generation systems, flying objects, programs, and recording media
CN103118230B (en) A kind of panorama acquisition, device and system
CN110494360B (en) System and method for providing autonomous photography and photography
CN113875222A (en) Shooting control method and device, unmanned aerial vehicle and computer readable storage medium
WO2018120350A1 (en) Method and device for positioning unmanned aerial vehicle
JP7556383B2 (en) Information processing device, information processing method, information processing program, image processing device, and image processing system
JP6765512B2 (en) Flight path generation method, information processing device, flight path generation system, program and recording medium
WO2018120351A1 (en) Method and device for positioning unmanned aerial vehicle
CN110716579B (en) Target tracking method and unmanned aerial vehicle
US20210112194A1 (en) Method and device for taking group photo
US11122209B2 (en) Three-dimensional shape estimation method, three-dimensional shape estimation system, flying object, program and recording medium
US20230359204A1 (en) Flight control method, video editing method, device, uav and storage medium
WO2020237422A1 (en) Aerial surveying method, aircraft and storage medium
US20210185235A1 (en) Information processing device, imaging control method, program and recording medium
CN111344650B (en) Information processing device, flight path generation method, program, and recording medium
JP6265576B1 (en) Imaging control apparatus, shadow position specifying apparatus, imaging system, moving object, imaging control method, shadow position specifying method, and program
CN111699454A (en) Flight planning method and related equipment
WO2020225979A1 (en) Information processing device, information processing method, program, and information processing system
CN112313942A (en) Control device for image processing and frame body control
CN112154650A (en) Focusing control method and device for shooting device and unmanned aerial vehicle
JP2020095519A (en) Shape estimation device, shape estimation method, program, and recording medium
US20210218879A1 (en) Control device, imaging apparatus, mobile object, control method and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant