WO2022011623A1 - Shooting control method and apparatus, unmanned aerial vehicle, and computer-readable storage medium - Google Patents

Shooting control method and apparatus, unmanned aerial vehicle, and computer-readable storage medium

Info

Publication number
WO2022011623A1
Authority
WO
WIPO (PCT)
Prior art keywords
shooting
area
drone
photographing
sequence
Prior art date
Application number
PCT/CN2020/102249
Other languages
English (en)
French (fr)
Inventor
吴利鑫
何纲
黄振昊
方朝晖
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to CN202311469626.2A (CN117641107A)
Priority to PCT/CN2020/102249 (WO2022011623A1)
Priority to CN202080032440.9A (CN113875222B)
Publication of WO2022011623A1

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U - UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U20/00 - Constructional aspects of UAVs
    • B64U20/80 - Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U20/87 - Mounting of imaging devices, e.g. mounting of gimbals
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/64 - Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 - Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/02 - Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10 - Simultaneous control of position or course in three dimensions
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/66 - Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661 - Transmitting camera control signals through networks, e.g. control via the Internet
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U - UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00 - UAVs specially adapted for particular uses or applications
    • B64U2101/30 - UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U - UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00 - UAVs characterised by their flight controls
    • B64U2201/20 - Remote controls

Definitions

  • the present application relates to the field of photography, and in particular, to a photography control method and device, an unmanned aerial vehicle, and a computer-readable storage medium.
  • the oblique photography technology installs multiple shooting devices on a drone and simultaneously captures one vertical image and images from four different oblique side angles. Compared with traditional photography, the four additional oblique shooting angles capture much richer side textures, making the technology suitable for fields such as surveying and mapping that need multi-directional feature information about the photographed object.
  • a multi-lens shooting device, such as a five-lens shooting device, has high cost and weight.
  • in addition, such a multi-lens shooting device generally uses a rolling shutter or an electronic global shutter.
  • a rolling shutter, however, produces a "jelly effect" (rolling-shutter distortion) in fast-motion photography, which reduces the modeling accuracy, and the poorer image quality of an electronic global shutter also degrades the modeling result.
  • another approach is to mount a single-lens shooting device on the drone and cover multiple shooting directions by flying multiple routes.
  • in this approach the shooting device is mounted on the body of the drone through a gimbal, and the image quality is better.
  • in existing solutions, the area to be photographed is expanded first, and route planning is then performed on the expanded area (i.e., the expanded to-be-photographed area).
  • the drone flies along the planned route and collects images in every direction when it reaches each shooting point. In this way, a large amount of invalid image data is generated on the parts of the route that extend beyond the area to be photographed, which not only wastes storage space but also brings inconvenience to the modeling process.
  • the present application provides a photographing control method and device, an unmanned aerial vehicle, and a computer-readable storage medium.
  • an embodiment of the present application provides a shooting control method, the method includes:
  • according to the first position information and the second position information, third position information of an effective shooting area in each of different shooting directions is determined;
  • each shooting sequence includes one or more consecutive shooting points, the shooting directions of the one or more shooting points in each shooting sequence are different from one another, and the shooting points of each shooting direction are all located in the effective shooting area of that shooting direction.
  • an embodiment of the present application provides a photographing control device, the device comprising:
  • a storage device for storing program instructions
  • one or more processors that invoke the program instructions stored in the storage device and are, individually or collectively, configured to perform the following operations:
  • according to the first position information and the second position information, third position information of an effective shooting area in each of different shooting directions is determined;
  • each shooting sequence includes one or more consecutive shooting points, the shooting directions of the one or more shooting points in each shooting sequence are different from one another, and the shooting points of each shooting direction are all located in the effective shooting area of that shooting direction.
  • an unmanned aerial vehicle including:
  • the photographing control device is supported by the body of the unmanned aerial vehicle, and the photographing control device is electrically connected to the gimbal.
  • an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored, and the program, when executed by a processor, implements the shooting control method described in the first aspect.
  • an embodiment of the present application provides a shooting control method, the method comprising:
  • each shooting sequence includes one or more consecutive shooting points, the shooting directions of the one or more shooting points in each shooting sequence are different from one another, and the shooting points of each shooting direction are all located in the effective shooting area of that shooting direction.
  • the effective shooting area is determined according to the first position information of the to-be-photographed area and the second position information of the expanded shooting area; the expanded shooting area is obtained by expanding the to-be-photographed area, and the second position information is determined according to the first position information.
  • an embodiment of the present application provides a photographing control device, the device comprising:
  • a storage device for storing program instructions
  • one or more processors that invoke the program instructions stored in the storage device and are, individually or collectively, configured to perform the following operations:
  • each shooting sequence includes one or more consecutive shooting points, the shooting directions of the one or more shooting points in each shooting sequence are different from one another, and the shooting points of each shooting direction are all located in the effective shooting area of that shooting direction.
  • the effective shooting area is determined according to the first position information of the to-be-photographed area and the second position information of the expanded shooting area; the expanded shooting area is obtained by expanding the to-be-photographed area, and the second position information is determined according to the first position information.
  • an unmanned aerial vehicle including:
  • the shooting control device is supported by the body of the unmanned aerial vehicle, and the shooting control device is electrically connected to the gimbal.
  • an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored, and the program, when executed by a processor, implements the shooting control method described in the fifth aspect.
  • when planning the shooting sequences, the present application ensures that the shooting points of each shooting direction in each shooting sequence are located in the effective shooting area of that shooting direction. In this way, invalid image data is prevented from being generated, the number of shooting points and the shooting time are reduced, and the efficiency of multi-directional shooting is improved.
  • FIG. 1 is a schematic structural diagram of an unmanned aerial vehicle in an embodiment of the present application.
  • FIG. 2 is a schematic flowchart of a shooting control method in an embodiment of the present application;
  • FIG. 3 is a schematic diagram of the positional relationship between the to-be-photographed area and the expanded photographic area in an embodiment of the present application;
  • FIG. 4A is a schematic diagram of a flight route in an embodiment of the present application.
  • 4B is a schematic diagram of a flight route in another embodiment of the present application.
  • 5A is a schematic diagram of the positional relationship between an effective shooting area and a to-be-photographed area in one of the shooting directions in an embodiment of the present application;
  • 5B is a schematic diagram of the positional relationship between an effective shooting area and a to-be-photographed area in another shooting direction in an embodiment of the present application;
  • 5C is a schematic diagram of the positional relationship between an effective shooting area and a to-be-photographed area in another shooting direction in an embodiment of the present application;
  • 5D is a schematic diagram of the positional relationship between an effective shooting area and a to-be-photographed area in another shooting direction in an embodiment of the present application;
  • 6A is a comparison diagram of images taken by a drone in the same shooting direction at different shooting points according to an embodiment of the present application
  • 6B is a schematic diagram of a flight route in another embodiment of the present application.
  • FIG. 7 is a schematic diagram of an implementation manner of controlling a photographing device mounted on an unmanned aerial vehicle to photograph based on a flight route and a photographing sequence in an embodiment of the present application;
  • FIG. 8 is a schematic diagram of the positional relationship between images of shooting points with the same shooting direction in two adjacent shooting sequences according to an embodiment of the present application.
  • FIG. 9 is a schematic diagram of a process in which a gimbal executes the shooting of a shooting sequence in an embodiment of the present application;
  • FIG. 10 is a schematic flowchart of a shooting control method in another embodiment of the present application.
  • FIG. 11 is a structural block diagram of a photographing control device in an embodiment of the present application.
  • FIG. 12 is a structural block diagram of an unmanned aerial vehicle in an embodiment of the present application.
  • Traditional surveying and mapping use a total station or a GNSS (Global Navigation Satellite System) handheld device to measure individual points.
  • the disadvantages of this approach are low efficiency, high operating difficulty and high operating cost.
  • current requirements can no longer be satisfied by traditional surveying and mapping, which has gradually been replaced by manned and unmanned aerial surveying and mapping.
  • manned or unmanned aerial surveying and mapping can also be used to establish a 3D model of the surveyed area.
  • specifically, oblique photography is used to shoot the area to be photographed from multiple directions, and the images from the multiple directions are then processed by a 3D modeling algorithm to obtain a model containing three-dimensional spatial information.
  • in existing solutions, the area to be photographed is expanded first, and route planning is then carried out on the expanded area (i.e., the to-be-photographed area after expansion).
  • the drone flies along the planned route and collects images in every direction when it reaches each shooting point. In this way, a large amount of invalid image data is generated on the parts of the route that extend beyond the area to be photographed, which not only wastes storage space but also brings inconvenience to the modeling process.
  • "at least one" means one or more, and "a plurality" means two or more.
  • "and/or" describes the relationship between associated objects and indicates that three relationships are possible; for example, "A and/or B" can mean that A exists alone, that A and B exist at the same time, or that B exists alone, where A and B may each be singular or plural.
  • the character "/" generally indicates that the associated objects are in an "or" relationship.
  • "at least one of the following items" or similar expressions refer to any combination of these items, including any combination of a single item or of multiple items.
  • for example, "at least one of a, b or c" can represent: a; b; c; a and b; a and c; b and c; or a, b and c, where each of a, b and c may be single or multiple.
  • the unmanned aerial vehicle of an embodiment of the present application may include a body 100, a photographing device 200 and a gimbal 300, wherein the photographing device 200 is mounted on the body 100 through the gimbal 300.
  • the UAV can be a fixed-wing UAV or a multi-rotor UAV, and the type of UAV can be selected according to actual needs: when the gimbal 300 and the photographing device 200 are heavier, a fixed-wing UAV with larger volume and weight can be chosen to carry them; when the gimbal 300 and the photographing device 200 are lighter, a multi-rotor UAV with smaller volume and weight can be chosen to carry the gimbal 300 and the photographing device 200.
  • the number of photographing devices in the embodiment of the present application is one, that is, only one photographing device is required when the drone performs oblique photography. Even though this photographing device may have a high pixel count, its volume and weight are greatly reduced compared with a multi-lens photographing device, thereby greatly reducing the weight and size of the UAV.
  • the photographing apparatus 200 may be an integrated camera, or a device composed of an image sensor and a lens. It should be noted that the photographing apparatus 200 in the embodiment of the present application is a photographing apparatus having a single lens.
  • the gimbal 300 in the embodiment of the present application may be a single-axis gimbal, a two-axis gimbal, a three-axis gimbal, or other multi-axis gimbal.
  • the UAV can be applied in the field of surveying and mapping.
  • the UAV carries the photographing device 200 to collect ground images, and software is then used to reconstruct a three-dimensional or two-dimensional map from the collected images.
  • the map obtained through surveying and mapping can be applied in different industries: in the field of power-line inspection, the reconstructed map can be used to locate line faults; in the field of road planning, the reconstructed map can be used for route selection; anti-narcotics police can use a reconstructed three-dimensional map to check remote mountainous areas for poppy cultivation, and so on.
  • the UAV is not limited to the field of surveying and mapping, but can also be applied to other fields that need to obtain multi-directional feature information of the photographed object.
  • the subject to be photographed is not limited to the ground, but may also be a large building, a mountain, or the like.
  • FIG. 2 is a schematic flowchart of a shooting control method in an embodiment of the present application; referring to FIG. 2, the photographing control method in an embodiment of the present application may include steps S201 to S203.
  • the first position information of the area to be photographed and the second position information of the expanded shooting area are acquired; the expanded shooting area is obtained by expanding the to-be-photographed area, and the second position information is determined according to the first position information.
  • the user can define the area to be shot in different ways, such as by manually marking or importing external files to define the area to be shot.
  • different strategies may also be used to obtain the first position information.
  • the first position information is set by the user, for example, the user inputs the first position information by manually clicking; in some embodiments , the area to be shot is determined by importing an external file, and the external file records the first position information.
  • prompt information may be output to prompt the user to define the area to be shot.
  • the to-be-photographed area in this embodiment of the present application may be a square area, or may be an area of other shapes, such as a circular area, a pentagonal area, and the like.
  • the area to be photographed is a square area
  • the first location information may include location information of four corners of the square area.
  • the first location information may also include location information of other locations in the square area.
  • before the shooting sequence is planned, an oblique shooting mode is entered; that is, the shooting sequence is planned after the oblique shooting mode has been entered.
  • the planning of the shooting sequence may be performed by the control device of the drone or by the drone itself, while the shooting using the planned shooting sequence is performed by the drone. Therefore, if the shooting sequence is planned in the control device, the control device is triggered to enter the oblique shooting mode before it performs the planning; if the shooting sequence is planned in the drone, the drone is triggered to enter the oblique shooting mode before it performs the planning. In addition, when the planning is performed by the control device, the drone performs the shooting in the oblique shooting mode using the shooting sequence planned by the control device.
  • control device of the drone may be a remote controller or other terminal equipment capable of controlling the drone, such as a mobile phone, a tablet computer, a laptop computer, a desktop computer, a smart wearable device, and the like.
  • the second position information is also related to the strategy used when expanding the area to be photographed.
  • for example, the second position information may be determined according to the first position information and the magnification of the expanded shooting area relative to the to-be-photographed area; the second position information may also be determined according to the first position information and the distance between an edge of the expanded shooting area and the to-be-photographed area.
  • specifically, the second position information may be determined according to the first position information and the distances between the edges of the expanded shooting area in different directions and the corresponding edges of the to-be-photographed area.
  • exemplarily, the expanded shooting area is an area obtained by expanding the area to be photographed by a first preset distance in each of the different directions.
  • referring to FIG. 3, the area to be photographed is a rectangular area 10, and expanding the rectangular area 10 in the different directions by the first preset distance D_ext yields the expanded photographing area 20.
  • the first preset distance is determined based on the flying height of the drone and the installation angle of the photographing device mounted on the drone; this setting takes into account factors such as the resolution of the images captured by the photographing device and route-planning requirements.
  • exemplarily, the first preset distance D_ext can be calculated as D_ext = H / tan(θ), where:
  • H is the flying height;
  • θ is the installation angle of the photographing device, i.e. the included angle between the optical axis of the lens of the photographing device and the ground plane.
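  • a minimal sketch of this calculation (the relation D_ext = H / tan(θ) and the names and example numbers below are reconstructed here for illustration, not quoted from the original formula):

    import math

    def expansion_distance(flight_height_m, install_angle_deg):
        # First preset distance D_ext, assuming D_ext = H / tan(theta): the ground
        # offset of the oblique optical axis at flying height H when the lens axis
        # makes an angle theta with the ground plane.
        theta = math.radians(install_angle_deg)
        return flight_height_m / math.tan(theta)

    # Illustrative numbers: H = 100 m, installation angle 45 degrees -> 100.0 m
    print(round(expansion_distance(100.0, 45.0), 1))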
  • the first preset distance may also be determined by using other strategies.
  • the flight height can also be determined using different strategies.
  • the flight height is set by the user.
  • exemplarily, the flight height is input by the user through the control device of the drone. This way of determining the flight height can satisfy different user needs and is highly flexible; in some embodiments, the flight height is determined according to the parameters of the photographing device mounted on the drone and a preset ground resolution.
  • the parameters of the photographing device include the focal length and the side length of a single pixel of the image sensor of the photographing device, and the flying height can then be calculated as H = GSD · f / pix, where:
  • H is the flight height;
  • f is the focal length of the photographing device;
  • GSD is the ground sampling distance, i.e. the preset ground resolution;
  • pix is the side length of a single pixel of the image sensor of the photographing device.
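  • a minimal sketch of this height calculation (the function name and example values are illustrative; all lengths must use consistent units):

    def flight_height_m(gsd_m_per_px, focal_length_m, pixel_size_m):
        # From GSD = H * pix / f  =>  H = GSD * f / pix (consistent units).
        return gsd_m_per_px * focal_length_m / pixel_size_m

    # Example: 2 cm/px GSD, 24 mm focal length, 2.4 um pixel size -> 200 m
    print(round(flight_height_m(0.02, 0.024, 2.4e-6)))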
  • according to the first position information and the second position information, the third position information of the effective shooting area in each of the different shooting directions is determined.
  • the shooting directions in the embodiments of the present application may include at least two of the following: a forward shooting direction inclined relative to the vertical direction and facing the front of the drone, a backward shooting direction inclined relative to the vertical direction and facing the rear of the drone, a left shooting direction inclined relative to the vertical direction and facing the left of the drone, a right shooting direction inclined relative to the vertical direction and facing the right of the drone, or an ortho shooting direction in which the shooting direction is vertically downward.
  • exemplarily, the shooting directions include the forward shooting direction (the shooting device shoots the forward image of the subject), the backward shooting direction (the shooting device shoots the rearward image of the subject), the left shooting direction (the shooting device shoots the left image of the subject) and the right shooting direction (the shooting device shoots the right image of the subject); or the shooting directions include the forward shooting direction, the backward shooting direction, the left shooting direction, the right shooting direction and the ortho shooting direction (the shooting device shoots the orthophoto image of the subject). It can be understood that, in other usage scenarios, other shooting directions can be selected to meet the corresponding needs.
  • exemplarily, the effective shooting area in the forward shooting direction is the area obtained by moving the to-be-photographed area in the first direction by a second preset distance; the effective shooting area in the backward shooting direction is the area obtained by moving the to-be-photographed area in the second direction by a second preset distance; the effective shooting area in the left shooting direction is the area obtained by moving the to-be-photographed area in the third direction by a second preset distance; the effective shooting area in the right shooting direction is the area obtained by moving the to-be-photographed area in the fourth direction by a second preset distance; and the effective shooting area in the ortho shooting direction is the to-be-photographed area itself.
  • the second preset distance and the first preset distance may or may not be equal. It can be understood that the effective area of each shooting direction is located within the expanded shooting area, therefore, the second preset distance is less than or equal to the first preset distance.
  • in addition, the distances moved in the different directions mentioned above need not be equal.
  • for example, the effective shooting area in the forward shooting direction is the area obtained by moving the to-be-photographed area in the first direction by a second preset distance, where the second preset distance is less than or equal to the offset of the expanded shooting area from the to-be-photographed area in the first direction; the effective shooting area in the backward shooting direction is the area obtained by moving the to-be-photographed area in the second direction by a third preset distance, where the third preset distance is less than or equal to the offset of the expanded shooting area from the to-be-photographed area in the second direction.
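  • the shifted effective areas described above can be sketched as follows (the coordinate conventions, class and function names are chosen here for illustration; the first to fourth directions are taken as down, up, right and left, matching the example given further below):

    from dataclasses import dataclass

    @dataclass
    class Rect:
        # Axis-aligned rectangle: (x_min, y_min) lower-left, (x_max, y_max) upper-right.
        x_min: float
        y_min: float
        x_max: float
        y_max: float

        def shifted(self, dx, dy):
            return Rect(self.x_min + dx, self.y_min + dy, self.x_max + dx, self.y_max + dy)

    def effective_areas(survey_area, d2):
        # Shift the to-be-photographed area by the second preset distance d2 per direction.
        return {
            "forward":  survey_area.shifted(0.0, -d2),  # first direction: down
            "backward": survey_area.shifted(0.0, +d2),  # second direction: up
            "left":     survey_area.shifted(+d2, 0.0),  # third direction: right
            "right":    survey_area.shifted(-d2, 0.0),  # fourth direction: left
            "ortho":    survey_area,                    # nadir: the area itself
        }

    areas = effective_areas(Rect(0.0, 0.0, 400.0, 300.0), d2=50.0)
    print(areas["forward"])  # Rect(x_min=0.0, y_min=-50.0, x_max=400.0, y_max=250.0)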
  • the first direction in the embodiment of the present application is opposite to the second direction, and the third direction is opposite to the fourth direction.
  • the first direction, the second direction, the third direction or the fourth direction is related to the shape of the flight path of the drone.
  • in some embodiments, the flight route may include a plurality of mutually parallel sub-routes, with adjacent sub-routes connected at one end so that they form a single flight route.
  • continuing with the to-be-photographed area and the expanded photographing area of the embodiment shown in FIG. 3, the starting waypoint of the flight route is any corner of the expanded photographing area, and the sub-routes are parallel to one side of the expanded photographing area.
  • for example, referring to FIG. 4A, the starting waypoint A of the flight route 30 is the lower-left corner of the expanded shooting area, and the end point B of the flight route 30 is the upper-right corner of the expanded shooting area; for another example, referring to FIG. 4B, the starting waypoint C of the flight route 40 is the upper-left corner of the expanded shooting area, and the end point D of the flight route 40 is the lower-right corner of the expanded photographing area.
  • the starting waypoint can also be the upper right corner or the lower right corner of the extended shooting area
  • the end point is correspondingly the lower left corner or the upper left corner of the extended shooting area.
  • the sub-routes are parallel to the short side of the expanded shooting area. It is understandable that the sub-routes can also be parallel to the long side of the expanded shooting area.
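  • a hedged sketch of how such a back-and-forth route over the expanded shooting area could be generated (the corner choice, parameter names and lane-spacing value are illustrative assumptions, not the patented planning procedure):

    def serpentine_route(x0, y0, x1, y1, lane_spacing, along="short"):
        # Parallel sub-routes over the expanded area (x0, y0)-(x1, y1), adjacent
        # sub-routes joined at alternating ends, starting at the lower-left corner.
        # Sub-routes run parallel to the short side when along == "short".
        width, height = x1 - x0, y1 - y0
        vertical = (height <= width) if along == "short" else (height > width)
        waypoints = []
        if vertical:  # sub-routes parallel to the y-axis, stepping along x
            n = int(width // lane_spacing) + 1
            for i in range(n):
                x = x0 + i * lane_spacing
                ys = (y0, y1) if i % 2 == 0 else (y1, y0)
                waypoints += [(x, ys[0]), (x, ys[1])]
        else:         # sub-routes parallel to the x-axis, stepping along y
            n = int(height // lane_spacing) + 1
            for i in range(n):
                y = y0 + i * lane_spacing
                xs = (x0, x1) if i % 2 == 0 else (x1, x0)
                waypoints += [(xs[0], y), (xs[1], y)]
        return waypoints

    # Expanded area 120 m x 300 m, 60 m between sub-routes
    print(serpentine_route(0, 0, 120, 300, lane_spacing=60))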
  • the first direction is the down direction
  • the second direction is the up direction
  • the third direction is the right direction
  • the fourth direction is the left direction
  • in this case, the effective shooting area in the forward shooting direction is the area 51 shown in FIG. 5A, and the part of the expanded shooting area 20 in FIG. 5A outside the area 51 is the invalid shooting area in the forward shooting direction; the effective shooting area in the backward shooting direction is the area 52 shown in FIG. 5B, and the part of the expanded shooting area 20 in FIG. 5B outside the area 52 is the invalid shooting area in the backward shooting direction; the effective shooting area in the left shooting direction is the area 53 shown in FIG. 5C, and the part of the expanded shooting area 20 in FIG. 5C outside the area 53 is the invalid shooting area in the left shooting direction; the effective shooting area in the right shooting direction is the area 54 shown in FIG. 5D, and the part of the expanded shooting area 20 in FIG. 5D outside the area 54 is the invalid shooting area in the right shooting direction.
  • the first direction is the up direction
  • the second direction is the down direction
  • the third direction is the left direction
  • the fourth direction is the right direction
  • in this case, the effective shooting area in the forward shooting direction is the area 52 shown in FIG. 5B, and the part of the expanded shooting area 20 in FIG. 5B outside the area 52 is the invalid shooting area in the forward shooting direction; the effective shooting area in the backward shooting direction is the area 51 shown in FIG. 5A, and the part of the expanded shooting area 20 in FIG. 5A outside the area 51 is the invalid shooting area in the backward shooting direction; the effective shooting area in the left shooting direction is the area 54 shown in FIG. 5D, and the part of the expanded shooting area 20 in FIG. 5D outside the area 54 is the invalid shooting area in the left shooting direction; the effective shooting area in the right shooting direction is the area 53 shown in FIG. 5C, and the part of the expanded shooting area 20 in FIG. 5C outside the area 53 is the invalid shooting area in the right shooting direction.
  • in both cases, the effective shooting area in the ortho shooting direction is the to-be-photographed area 10, and the part of the expanded shooting area 20 in FIG. 4A and FIG. 4B outside the to-be-photographed area 10 is the invalid shooting area in the ortho shooting direction.
  • in FIGS. 5A to 5D, D_1 is the second preset distance, and here the second preset distance is equal to the first preset distance. It can be understood that the flight route is not limited to the routes shown in FIG. 4A and FIG. 4B and can also be set otherwise.
  • the manner of determining the above-mentioned flight route can be selected as required.
  • the process of determining the flight route includes but is not limited to the following steps:
  • (1) according to the preset ground resolution, the preset lateral (side) overlap ratio and the number of pixels of the photographing device mounted on the drone in the direction perpendicular to the flight direction of the drone (that is, the number of pixels of the image sensor of the photographing device perpendicular to the flight direction of the UAV), the lateral distance between two adjacent sub-routes in the flight route is determined, for example by formula (3): D_route = GSD · n_H · (1 - μ_lateral), where:
  • GSD is the ground resolution;
  • μ_lateral is the lateral overlap ratio;
  • n_H is the number of pixels of the photographing device mounted on the UAV perpendicular to the flight direction of the UAV.
  • the calculation method of the lateral distance D_route is not limited to formula (3) and can also be another method.
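  • a minimal sketch of formula (3) as reconstructed above (the symbol names and example values are illustrative):

    def lateral_spacing(gsd_m_per_px, n_h_px, side_overlap):
        # D_route = GSD * n_H * (1 - side overlap): GSD * n_H is the across-track
        # ground footprint of one image; the non-overlapping fraction is the lane spacing.
        return gsd_m_per_px * n_h_px * (1.0 - side_overlap)

    # Example: 2 cm/px GSD, 5472 px across-track, 70 % side overlap -> about 32.8 m
    print(round(lateral_spacing(0.02, 5472, 0.70), 1))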
  • taking the case where the photographing direction of the photographing device is the ortho shooting direction as an example, the overlap ratio, along the flight direction, between the image captured by the photographing device at shooting point 1 and the image captured at shooting point 2 is called the heading overlap ratio.
  • shooting point 1 and shooting point 12 are located on two adjacent sub-routes, and the overlap ratio, in the direction perpendicular to the flight direction, between the image captured by the photographing device at shooting point 1 and the image captured at shooting point 12 is called the lateral (side) overlap ratio.
  • the route planning is performed in the extended shooting area, and the lateral distance between adjacent sub-routes in the flight route is the lateral distance determined in step (1).
  • the side overlap ratio can be the default value or can be set by the user.
  • the side overlap rate is set by the user.
  • the side overlap rate is input by the user through the control device of the drone. This way of determining the side overlap rate can meet different user needs and is highly flexible.
  • the side overlap ratio may be greater than or equal to 65% and less than or equal to 80%.
  • the lateral overlap ratio is 65%, 70%, 75%, 80%, or other magnitudes greater than 65% and less than 80%.
  • the flight route can also be planned in other ways.
  • in some embodiments, the flight route can be a cross-hatch ("tic-tac-toe") route, which includes two routes (the route 60 and the route 70 shown in FIG. 6B), the sub-routes of the two routes being perpendicular to each other; each route then completes the acquisition of oblique images in two or three shooting directions.
  • for example, one route collects the left and right images (or the left, right and ortho images), and the other route collects the forward and backward images (or the forward, backward and ortho images). If only left and right images (or only left, right and ortho images) are needed, or only forward and backward images (or only forward, backward and ortho images) are needed, the flight path can be planned as such a cross-hatch route.
  • the lateral distance of the cross-hatch route is the same as the lateral distance D_route of the above-mentioned embodiment.
  • one shooting direction corresponds to a preset target posture of a gimbal, that is, when the gimbal reaches the preset target posture, the shooting device is in the corresponding shooting direction.
  • each shooting sequence includes one or more consecutive shooting points, the shooting directions of the one or more shooting points in each shooting sequence are different from one another, and the shooting points of each shooting direction are all located in the effective shooting area of that shooting direction.
  • exemplarily, the shooting directions include the forward shooting direction, the backward shooting direction, the left shooting direction, the right shooting direction and the ortho shooting direction. If a shooting sequence is located on a part of the flight path outside the effective shooting area of the forward shooting direction, that shooting sequence contains no shooting point in the forward shooting direction; if it is located outside the effective shooting area of the backward shooting direction, it contains no shooting point in the backward shooting direction; if it is located outside the effective shooting area of the left shooting direction, it contains no shooting point in the left shooting direction; and if it is located outside the effective shooting area of the right shooting direction, it contains no shooting point in the right shooting direction.
  • each shooting sequence is located on a part of the flight path that lies within the effective shooting area of at least one of the forward, backward, left, right and ortho shooting directions; that is, the position of each shooting sequence on the flight route is in at least one effective shooting area, and each shooting sequence therefore includes shooting points in at least one shooting direction.
  • the number of shooting points in each shooting sequence is positively related to the number of effective shooting areas where the shooting sequence is located on the flight path.
  • exemplarily, the position of a shooting sequence on the flight path may fall within at least one of area 1, area 2, area 3 and area 4.
  • area 1 is the overlapping area of the effective shooting areas of all five shooting directions, i.e. the overlap of the effective shooting areas of the forward, backward, left, right and ortho shooting directions; area 2 is an overlapping area of the effective shooting areas of four shooting directions and includes 4 such overlapping areas (for example, the overlap of the effective shooting areas of the forward, left, right and ortho directions, the overlap of the backward, left, right and ortho directions, and so on); area 3 is an overlapping area of the effective shooting areas of three shooting directions and likewise includes 4 overlapping areas; and area 4 corresponds to positions whose shooting sequence contains a shooting point in only one shooting direction.
  • if the position of the shooting sequence on the flight route is within area 1, the number of shooting points in that shooting sequence is 5; if it is within area 2, the number of shooting points is 4; if it is within area 3, the number is 3; and if it is within area 4, the number is 1.
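  • a minimal sketch of this per-waypoint sequence planning (the rectangles and coordinates are illustrative; only the containment test reflects the rule described above):

    def sequence_at(point, effective_areas):
        # Keep only the shooting directions whose effective shooting area contains
        # the point; effective_areas maps direction -> (x_min, y_min, x_max, y_max).
        x, y = point
        return [d for d, (x0, y0, x1, y1) in effective_areas.items()
                if x0 <= x <= x1 and y0 <= y <= y1]

    areas = {
        "forward":  (0, -50, 400, 250),
        "backward": (0,  50, 400, 350),
        "left":     (50,  0, 450, 300),
        "right":    (-50, 0, 350, 300),
        "ortho":    (0,   0, 400, 300),
    }
    print(sequence_at((200, 150), areas))  # inside all 5 areas -> a 5-point sequence
    print(sequence_at((10, 290), areas))   # near a corner -> only 3 directions remain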
  • the multiple shooting sequences are arranged in order, and the order of the shooting sequences is consistent with the order in which the drone passes the corresponding positions on the flight path when it flies according to the flight route.
  • when the UAV subsequently performs oblique photography according to the planned shooting sequences, it can trigger the gimbal and the shooting device to complete the shooting process by timed shooting or by fixed-distance shooting.
  • the time required for the shooting device to complete the shooting of each shooting sequence is fixed or the distance between adjacent shooting sequences is fixed, so that the shooting interval or the distance between two shooting sequences is more stable.
  • the time required for the photographing device to complete the photographing of each photographing sequence is the first fixed time length. In this way, the drone can trigger the gimbal and the photographing device to complete the photographing process in a timed photographing manner.
  • when the drone performs oblique photography according to the planned shooting sequences, it can send a timed-shooting trigger signal to the gimbal before the gimbal controls the shooting device to complete the shooting of the first shooting point of the first shooting sequence.
  • after receiving the timed-shooting trigger signal, the gimbal triggers the shooting device to complete the shooting of each shooting sequence in turn based on the first fixed duration.
  • in this way, the drone only needs to send the timed-shooting trigger signal once, and the control of the drone is relatively simple.
  • the time required for the photographing device to complete the photographing of each photographing sequence may not be a fixed time.
  • the time required for the photographing device to complete the photographing of adjacent photographing points in the same photographing sequence is a second fixed time length, so that the photographing interval duration between adjacent photographing points in each photographing sequence is stable.
  • the time required for the photographing device to complete the photographing of adjacent photographing points in the same photographing sequence may not be a fixed time.
  • the first fixed duration is greater than the second fixed duration.
  • the first fixed duration and the second fixed duration can be set as required. Exemplarily, the first fixed duration is 10 seconds and the second fixed duration is 2 seconds; of course, the first fixed duration and the second fixed duration can also be set to other values.
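  • one possible reading of this timed scheme, sketched with the example values of 10 seconds and 2 seconds (the assumption that consecutive sequences start one first fixed duration apart is made here for illustration):

    def timed_trigger_times(num_sequences, points_per_sequence,
                            first_fixed_s=10.0, second_fixed_s=2.0):
        # Consecutive sequences start first_fixed_s apart; adjacent shooting points
        # inside a sequence are triggered second_fixed_s apart.
        times = []
        for s in range(num_sequences):
            start = s * first_fixed_s
            times.append([start + p * second_fixed_s for p in range(points_per_sequence)])
        return times

    # 3 sequences of 5 shooting points each
    print(timed_trigger_times(3, 5))
    # [[0.0, 2.0, 4.0, 6.0, 8.0], [10.0, 12.0, 14.0, 16.0, 18.0], [20.0, 22.0, 24.0, 26.0, 28.0]]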
  • the distance between adjacent shooting sequences is a first fixed distance.
  • the drone can trigger the pan/tilt and the shooting device to complete the shooting process in a fixed-distance shooting manner.
  • the fixed-distance shooting trigger signal can be sent to the PTZ before the PTZ controls the shooting device to complete the shooting of the first shooting point of each shooting sequence.
  • after receiving the fixed-distance shooting trigger signal, the gimbal first switches its attitude so that the shooting device on the gimbal is in the corresponding shooting direction when the drone reaches each shooting point of the corresponding shooting sequence, and the shooting device is then triggered to shoot once it is in that shooting direction; this fixed-distance triggering method makes the distance between two shooting sequences more stable.
  • in some embodiments, the distance between adjacent shooting points in the same shooting sequence is a second fixed distance.
  • in this case the drone can send a fixed-distance shooting trigger signal to the gimbal before the gimbal controls the shooting device to complete the shooting of each shooting point; each time the gimbal receives the trigger signal, it first switches its attitude so that the shooting device on the gimbal is in the shooting direction corresponding to the shooting point when the drone arrives at that point, and the shooting device is then triggered to shoot, so that the distance between two adjacent shooting points in each shooting sequence is more stable.
  • the size of the first fixed distance and the second fixed distance can be set as required.
  • the first fixed distance is 10 meters
  • the second fixed distance is 2 meters; of course, the first fixed distance and the second fixed distance can also be set to other values.
  • in some embodiments, the drone uses fixed-distance triggering for the gimbal and timed triggering for the shooting device to complete the shooting process.
  • specifically, the gimbal completes an attitude-switching sequence each time the drone arrives at a waypoint, that is, arrival at a waypoint triggers the gimbal to enter the shooting routine, and the shooting device is then triggered to shoot at each shooting point at timed intervals.
  • in this case, the distance between adjacent shooting sequences (that is, the distance between adjacent waypoints) is a third fixed distance, and the time required for the shooting device to complete the shooting of adjacent shooting points in the same shooting sequence is a third fixed duration; the third fixed distance and the third fixed duration can be set as required, and exemplarily the third fixed distance is 10 meters and the third fixed duration is 2 seconds.
  • optionally, the initial shooting point among the shooting points (that is, the first shooting point of the first shooting sequence) is the initial flight position of the drone when it starts flying according to the flight route; optionally, the initial shooting point among the shooting points is the starting waypoint of the flight route.
  • the starting flight position and the starting waypoint may be the same position or different positions.
  • one of the above embodiments may be selected as required to determine the initial shooting point. It can be understood that the manner of determining the initial photographing point is not limited to the above-mentioned manners, and other manners may also be used to determine the initial photographing point.
  • the execution subject of the shooting control method in the above-mentioned embodiments may be a control device of the unmanned aerial vehicle, such as a remote controller, a mobile phone, a computer or a smart wearable device capable of controlling the unmanned aerial vehicle.
  • the execution subject may also be the unmanned aerial vehicle itself, for example the flight controller of the unmanned aerial vehicle, another controller provided in the unmanned aerial vehicle, or a combination of such controllers.
  • the execution subject may further be a combination of the control device and the unmanned aerial vehicle. For example, the acquisition of the first position information and the second position information and the planning of the flight route may be performed by the control device, while the determination of the effective shooting areas and of the shooting sequences is performed by the drone; or the acquisition of the first position information and the second position information may be performed by the control device, while the planning of the flight route, the determination of the effective shooting areas and the determination of the shooting sequences are performed by the drone; or the acquisition of the first position information may be performed by the control device, while the determination of the second position information, the planning of the flight route, the determination of the effective shooting areas and the determination of the shooting sequences are performed by the drone; or the acquisition of the first position information, the determination of the second position information, the planning of the flight route, the determination of the effective shooting areas and the determination of the shooting sequences may all be performed by the drone.
  • of course, the execution subject is not limited to the control device of the drone and/or the drone; it may also be another electronic device independent of them, such as a control device of the gimbal or of the photographing device.
  • the execution body of the method in the foregoing embodiment is a control device of an unmanned aerial vehicle.
  • the control device when acquiring the second position information of the extended shooting area, the control device specifically determines the second position information of the extended shooting area according to the first position information.
  • in some embodiments, the planning of the flight route is performed by the control device.
  • specifically, the flight route of the UAV is planned according to the second position information; for the planning of the flight route, reference may be made to the description of the corresponding part in the foregoing embodiments, which is not repeated here.
  • the shooting control method further includes: sending the flight route to the drone; and, after the shooting sequence corresponding to each waypoint on the flight route has been determined according to the third position information and the preset flight route of the drone, sending the shooting sequence corresponding to each waypoint to the drone, so that the drone controls the shooting device mounted on it to shoot based on the flight route and the shooting sequence corresponding to each waypoint, thereby realizing oblique photography by the drone.
  • the flight route is sent to the drone through the control device of the drone, and the drone performs oblique photography during the execution of the flight route. It will be understood that the planning process of the flight route can also be carried out in the drone.
  • the execution subject of the shooting control method in the above embodiment is a drone.
  • in some embodiments, the drone plans the shooting sequences before performing oblique shooting, and in the planned sequences the shooting points of each shooting direction are located in the effective shooting area of that shooting direction; in other embodiments, exemplarily, each shooting sequence initially includes shooting points in all shooting directions, and when a shooting sequence is executed the shooting points located in invalid shooting areas are removed from the currently executed shooting sequence.
  • the photographing control method will be further described by taking an example that the execution subject of the photographing control method is an unmanned aerial vehicle.
  • the first position information can be sent by the control device of the drone.
  • the user inputs the first position information through the control device of the drone, and the control device of the drone sends the first position information to the drone.
  • the user can input the first position information into the control device of the UAV by manual marking or by using an external file.
  • the first position information can also be directly input to the drone by the user operating the input module.
  • the second position information is sent by the control device of the drone.
  • in some embodiments, the control device of the drone determines the second position information of the expanded shooting area according to the first position information and then sends the second position information to the drone; in other embodiments, the drone itself determines the second position information of the expanded shooting area according to the first position information.
  • the implementation process of determining the second position information of the extended shooting area according to the first position information may refer to the description of the corresponding part in the foregoing embodiment, which will not be repeated here.
  • the planning of the flight route can be performed on the control device of the drone or on the drone.
  • in some embodiments, the flight route is planned by the control device of the drone based on the second position information, and the control device of the drone then sends the flight route to the drone; in other embodiments, the flight route is planned by the drone itself based on the second position information.
  • the planning of the flight route reference may be made to the description of the corresponding part in the foregoing embodiment, and details are not repeated here.
  • the photographing control method of the embodiment of the present application may further include: controlling the photographing device mounted on the drone to photograph based on the flight route and the photographing sequence corresponding to each waypoint.
  • FIG. 7 is a schematic diagram of an implementation manner of controlling a photographing device mounted on an unmanned aerial vehicle to photograph based on a flight route and a photographing sequence according to an embodiment of the present application.
  • the process of controlling the photographing device mounted on the drone to photograph based on the flight route and the photographing sequence may include steps S701 to S703 .
  • the drone is controlled to fly according to the flight route, and the gimbal on the drone is controlled to switch its attitude so that the shooting device on the gimbal is in the corresponding shooting direction when the drone reaches each shooting point.
  • because the gimbal carrying the shooting device switches its attitude while the drone flies from the current shooting point to the next shooting point, the shooting device is already in the corresponding shooting direction when the drone reaches each shooting point; the shooting process therefore does not require the drone to stop flying, which improves the shooting efficiency and is especially suitable for map surveying and mapping. Moreover, the embodiment of the present application controls a single shooting device through the gimbal to complete, asynchronously, the shooting of images in multiple shooting directions.
  • compared with a traditional multi-lens shooting device, the weight carried by the UAV is greatly reduced, so that a UAV with smaller volume and weight can be selected to carry the shooting device, thereby reducing the cost of use.
  • the shooting performed by the shooting device does not affect the flight of the drone; that is, while the shooting device is shooting, the drone continues to execute the flight route and does not hover because of the shooting action, which further improves the shooting efficiency.
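  • a toy simulation of this shoot-while-flying behaviour (ShootingPoint, GimbalSim and CameraSim are hypothetical stand-ins defined here for illustration, not any actual drone SDK):

    from dataclasses import dataclass

    @dataclass
    class ShootingPoint:
        position: tuple   # (x, y) along the route
        direction: str    # e.g. "forward", "left", "ortho"

    class GimbalSim:
        # Toy stand-in for the gimbal: only records the requested direction.
        def __init__(self):
            self.direction = None
        def switch_to(self, direction):
            self.direction = direction  # attitude switching happens during transit

    class CameraSim:
        # Toy stand-in for the single-lens shooting device.
        def trigger(self, at, direction):
            print(f"shot {direction!r} image at {at}")

    def fly_and_shoot(points, gimbal, camera):
        # While "flying" from one shooting point to the next, pre-switch the gimbal,
        # then trigger the camera on arrival; the flight never pauses.
        for point in points:
            gimbal.switch_to(point.direction)   # done during the transit leg
            # ... the drone keeps flying and reaches point.position here ...
            camera.trigger(point.position, gimbal.direction)

    fly_and_shoot(
        [ShootingPoint((0, 0), "forward"), ShootingPoint((0, 5), "left"),
         ShootingPoint((0, 10), "ortho")],
        GimbalSim(), CameraSim(),
    )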
  • the flight route in this embodiment of the present application may include multiple waypoints, wherein the flight route may be preset by the user.
  • the aircraft can connect the waypoints in sequence according to the input sequence to form the above flight route.
  • the position information of some waypoints in the set flight route can be modified by operating the control device of the drone.
  • the step of modifying the position information of some waypoints in the set flight route can be performed before the drone flies, or can be performed during the drone flight. It can be understood that the flight route can also be the default flight route.
  • the positional relationship between the waypoints and the shooting points can be chosen as needed. For example, in some embodiments, shooting points are set between adjacent waypoints; this exploits the fact that the flight time between waypoints is longer than the time the shooting device needs to capture an image, so the shooting of images in multiple shooting directions can be inserted between the waypoints and the shooting efficiency is high, and the waypoints themselves may or may not also be set as shooting points. In other embodiments, the waypoints are all used as shooting points, and shooting points may or may not additionally be set between adjacent waypoints. It can be understood that one shooting point corresponds to one shot, one shooting direction and one preset target attitude of the gimbal, and yields one captured image.
  • in the related art, the flight route is generally designed as multiple routes, for example five routes, each corresponding to one shooting direction, so that the forward image, backward image, left image, right image and orthoimage of the area to be photographed are collected separately; the drone therefore has to fly five routes, which is not conducive to its endurance.
  • in contrast, the flight route of the embodiment of the present application is designed as a single route, which can be the route shown in FIG. 4A or FIG. 4B or another route, and the gimbal on the drone performs attitude switching so that shooting from multiple shooting angles is achieved on this one route; this avoids repeatedly patrolling the area, which not only helps to improve the shooting efficiency but also helps to reduce the energy consumption of the drone.
  • the process of controlling the drone to fly according to the flight route may include: controlling the real-time height between the lens of the photographing device and the area to be photographed to be within a preset height range.
  • when the drone is used for surveying and mapping, undulating terrain makes the GSD uneven, so the height between the lens of the shooting device and the ground is controlled to maintain the uniformity of the GSD: if the terrain rises, the UAV climbs, and if the terrain falls, the UAV descends, ensuring that the GSD remains roughly equal throughout the surveying and mapping process.
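  • As a hedged sketch of this terrain-following idea, assuming the flight height follows the relation H = GSD · f / pixel_size described for formula (2) and a simple hold-band rule; the lens and pixel values in the example are illustrative assumptions:

```python
def flight_height_from_gsd(gsd_m: float, focal_length_m: float, pixel_size_m: float) -> float:
    """Sketch of formula (2): target lens-to-ground height for a desired GSD."""
    return gsd_m * focal_length_m / pixel_size_m

def commanded_altitude_asl(terrain_elevation_m: float,
                           target_agl_m: float,
                           current_agl_m: float,
                           tolerance_m: float = 5.0) -> float:
    """Return the altitude (above sea level) the UAV should fly at so that the
    lens-to-ground height stays within [target_agl - tolerance, target_agl + tolerance].
    If the terrain rises, the commanded altitude rises with it; if it falls, the UAV
    descends, which keeps the GSD roughly constant along the route."""
    if abs(current_agl_m - target_agl_m) <= tolerance_m:
        # Already inside the preset height range: hold the current height above ground.
        return terrain_elevation_m + current_agl_m
    return terrain_elevation_m + target_agl_m

# Target AGL for GSD = 2.5 cm with an assumed 35 mm lens and 2.4 µm pixels: ~364.6 m.
target_agl = flight_height_from_gsd(0.025, 0.035, 2.4e-6)
print(round(target_agl, 1), round(commanded_altitude_asl(120.0, target_agl, 350.0), 1))
```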
  • the embodiment of the present application does not require the drone to hover at the shooting points; to ensure that the gimbal has already completed the previous shooting sequence when each shooting sequence is triggered, the flight speed needs to be controlled within the maximum flight speed allowed by the drone.
  • the maximum flight speed can be calculated in combination with the rotation performance of the gimbal; optionally, the maximum flight speed allowed by the UAV is determined based on the heading spacing of each shooting direction and the time required for the shooting device to complete the shooting of one shooting sequence and return to the initial shooting direction, wherein the heading spacings of all shooting directions are equal, and the heading spacing is determined based on the preset ground resolution, the preset heading overlap rate and the number of pixels of the shooting device parallel to the flight direction of the UAV (i.e., the number of pixels of the image sensor of the shooting device parallel to the flight direction of the UAV).
  • exemplarily, in formula (4), D_2 is the heading spacing of each shooting direction and T_Gim is the time required for the shooting device to complete the shooting of one shooting sequence and return to the initial shooting direction; it should be understood that the calculation of the maximum flight speed V_max is not limited to formula (4) and may be otherwise.
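  • Since the equation image for formula (4) is not reproduced in this text, the sketch below assumes the natural form V_max = D_2 / T_Gim implied by the definitions above; it is illustrative rather than the patent's exact formula:

```python
def max_flight_speed(heading_spacing_m: float, sequence_time_s: float) -> float:
    """Assumed form of formula (4): the drone should not cover more than one heading
    spacing D_2 in the time T_Gim the gimbal needs to finish a shooting sequence
    and return to the initial shooting direction."""
    return heading_spacing_m / sequence_time_s

# With D_2 = 40.95 m (see the worked overlap example below) and T_Gim = 4 s,
# the allowed speed would be about 10.24 m/s.
print(round(max_flight_speed(40.95, 4.0), 2))
```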
  • exemplarily, the relative positions of the shooting directions between two shooting sequences are shown in FIG. 8: F1 and F2 are the effective shooting areas of the forward shooting direction, D1 and D2 are the effective shooting areas of the ortho shooting direction (shooting vertically downward), B1 and B2 are the effective shooting areas of the backward shooting direction, R1 and R2 are the effective shooting areas of the right shooting direction, and L1 and L2 are the effective shooting areas of the left shooting direction;
  • D_F, D_D, D_B, D_R and D_L are the heading spacings of the forward, ortho, backward, right and left shooting directions, respectively; in this embodiment, D_2 = D_F = D_D = D_B = D_R = D_L.
  • exemplarily, the heading spacing D_F in the forward shooting direction is calculated as: D_F = GSD × (1 − γ_course) × n_V    (5);
  • in formula (5), γ_course is the preset heading overlap rate and n_V is the number of pixels of the shooting device parallel to the flight direction of the UAV; it should be understood that the calculation of the heading spacing D_F in the forward shooting direction is not limited to formula (5) and may be otherwise.
  • the heading overlap rate can be a default value or can be set by the user.
  • exemplarily, the heading overlap rate is set by the user, for example input by the user through the control device of the UAV; this way of determining the heading overlap rate can meet different user needs and is highly flexible.
  • the heading overlap rate is greater than or equal to 65% and less than or equal to 80%.
  • the heading overlap ratio is 65%, 70%, 75%, 80%, or other values greater than 65% and less than 80%.
  • below, the forward image is taken as an example to analyze the influence of systematic error on the heading overlap rate.
  • assume the ground resolution GSD is 2.5 cm, the heading overlap rate γ_course is 70%, the number of pixels n_V of the shooting device parallel to the flight direction is 5460, and the flight speed is 10 m/s; then, according to formula (5), the theoretical distance between two consecutive forward images is 2.5 cm × (1 − 70%) × 5460 = 40.95 m.
  • due to fluctuations in gimbal rotation speed and system delay, if the error between the actual shooting time and the theoretical shooting time of the second forward image is 0.5 s (a delay), the actual distance between the two forward images is 40.95 m + 10 m/s × 0.5 s = 45.95 m; substituting D_F = 45.95 m, GSD = 2.5 cm and n_V = 5460 into formula (5) gives an actual heading overlap rate of about 66% for the forward images, which still meets the modeling requirements.
  • because the side length, parallel to the heading, of the effective shooting areas of the forward and backward shooting directions is longer than that of the ortho, right and left shooting directions, the error between actual and theoretical shooting time has a smaller influence on the heading overlap rate of the forward and backward shooting directions; by controlling parameters such as flight speed, flight direction and gimbal rotation speed (which requires system optimization), or by increasing the heading overlap rate (which affects overall operation efficiency), the heading overlap rate of the images in each shooting direction can be guaranteed to meet the modeling requirements.
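  • The worked example above can be reproduced with a short check based directly on formula (5) and the stated numbers:

```python
def heading_spacing(gsd_m: float, overlap: float, n_v: int) -> float:
    """Formula (5): D_F = GSD * (1 - heading_overlap) * n_V."""
    return gsd_m * (1.0 - overlap) * n_v

def actual_overlap(spacing_m: float, gsd_m: float, n_v: int) -> float:
    """Invert formula (5) to recover the overlap actually achieved for a given spacing."""
    return 1.0 - spacing_m / (gsd_m * n_v)

gsd, overlap, n_v, speed, delay = 0.025, 0.70, 5460, 10.0, 0.5
d_theory = heading_spacing(gsd, overlap, n_v)   # theoretical spacing: 40.95 m
d_actual = d_theory + speed * delay             # 45.95 m after a 0.5 s trigger delay
print(round(d_theory, 2), round(d_actual, 2),
      round(actual_overlap(d_actual, gsd, n_v), 3))  # 40.95 45.95 0.663 (~66%)
```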
  • the initial shooting direction may be the shooting direction corresponding to one of the shooting points in the shooting sequence; exemplarily, the initial shooting direction is the shooting direction of the first shooting point of the shooting sequence.
  • optionally, the shooting direction of the first shooting point of every shooting sequence is the same, for example the ortho shooting direction; optionally, the shooting directions of the first shooting points of multiple shooting sequences are at least partially different, for example, the shooting direction of the first shooting point of shooting sequence 1 is the left shooting direction, that of shooting sequence 2 is the right shooting direction, that of shooting sequence 3 is the left shooting direction, and so on.
  • T_Gim may be the time required for the shooting device to complete the shooting of the current shooting sequence and return to the initial shooting direction of the next shooting sequence, which applies whether the shooting directions of the first shooting points of the shooting sequences are all the same or at least partially different; of course, T_Gim may also be the time required for the shooting device to complete the shooting of the current shooting sequence and return to the initial shooting direction of the current shooting sequence, which applies when the shooting direction of the first shooting point of every shooting sequence is the same.
  • when controlling the gimbal on the drone to switch attitude so that the shooting device on the gimbal is in the corresponding shooting direction at every shooting point, optionally, the real-time attitude of the UAV is obtained, the deviation between the real-time attitude of the UAV and the shooting direction of the next shooting point is determined, and the gimbal on the drone is controlled to switch attitude according to that deviation, so that the shooting device on the gimbal is in the corresponding shooting direction at every shooting point.
  • the shooting device of the embodiment of the present application is mounted on the body of the UAV through the gimbal; when the attitude of the body changes greatly, the attitude of the gimbal can be controlled so that shooting in the same direction is achieved at different waypoints (shooting sequences), using gimbal attitude control to keep the gimbal-to-ground angle consistent (or the deviation small), so that the overlap rate of photo sequences in the same direction remains uniform.
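  • A deliberately simplified, per-axis sketch of this compensation idea (a real controller would compose rotations properly; the function and names are illustrative assumptions, not the patented method):

```python
def gimbal_correction(target_attitude_deg, body_attitude_deg):
    """Per-axis sketch: the gimbal command is the preset target attitude (defined
    relative to the ground) minus the body's real-time attitude, so the camera's
    ground-relative pointing stays roughly the same when the body pitches or rolls."""
    return tuple(t - b for t, b in zip(target_attitude_deg, body_attitude_deg))

# Body pitched up 5 deg and rolled 2 deg while the ortho target attitude is (-90, 0, 0):
print(gimbal_correction((-90.0, 0.0, 0.0), (5.0, 2.0, 0.0)))  # (-95.0, -2.0, 0.0)
```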
  • the process of controlling the gimbal on the drone to switch attitude, so that the shooting device on the gimbal is in the corresponding shooting direction when the drone arrives at each shooting point, may include: according to the shooting sequence, sending a shooting trigger signal to the gimbal, so that the gimbal switches attitude while the drone flies from the current shooting point to the next shooting point, and the shooting device on the gimbal is therefore in the corresponding shooting direction when the drone reaches each shooting point.
  • the shooting trigger signal is also used to instruct the gimbal to trigger the shooting device to shoot when the shooting device is in the corresponding shooting direction.
  • in this embodiment, the drone triggers the gimbal to enter the program that executes the shooting sequence; the program of the shooting sequence includes two steps, attitude switching and shooting triggering, both of which are completed by the gimbal, thereby reducing the influence of the delay in the drone's trigger-signal processing on operation efficiency.
  • the attitude switching is intended to make the shooting device on the gimbal be in the corresponding shooting direction when the drone arrives at each shooting point.
  • of course, the drone can also directly control the gimbal to switch attitude and/or directly trigger the shooting device to shoot.
  • the gimbal and the shooting device completing the shooting process may include: the gimbal controls the shooting device to complete the shooting of each shooting sequence; specifically, the gimbal switches attitude according to the shooting sequence so that the shooting device on the gimbal is in the corresponding shooting direction when the drone arrives at each shooting point of each shooting sequence, and the gimbal triggers the shooting device to shoot when the shooting device is in the corresponding shooting direction.
  • the shooting trigger signal can be a timing shooting trigger signal or a fixed-distance shooting trigger signal, that is, the drone can trigger the gimbal and the shooting device to complete the shooting process by using the timing shooting or fixed-distance shooting triggering method.
  • exemplarily, in some embodiments, the shooting trigger signal is a timing shooting trigger signal, which is used to instruct the gimbal to trigger the shooting device to shoot based on a first timing strategy.
  • the first timing strategy may include: the time required for the photographing device to complete the shooting of each shooting sequence is a first fixed time length, so that the time required for the shooting device to complete the shooting of each shooting sequence is stable.
  • optionally, the timing shooting trigger signal is sent to the gimbal only once; for example, the timing shooting trigger signal can be sent to the gimbal before the gimbal controls the shooting device to complete the shooting of the first shooting point of the first shooting sequence. After receiving the timing trigger signal, the gimbal rotates in turn to each shooting direction of each shooting sequence and triggers the shooting device to shoot at fixed times; in this way, the drone only needs to send the timing trigger signal once, so its control is simpler.
  • it can be understood that the drone can also send the timing shooting trigger signal to the gimbal multiple times, for example, before the gimbal controls the shooting device to complete the shooting of the first shooting point of each shooting sequence; after receiving each timing shooting trigger signal, the gimbal rotates in turn to each shooting direction of the corresponding shooting sequence and triggers the shooting device at fixed times to complete the shooting in each shooting direction of that sequence.
  • the first timing strategy can also be other.
  • the timing shooting trigger signal can also be used to instruct the PTZ to trigger the shooting device to shoot based on the second timing strategy.
  • the second timing strategy includes: the time required for the shooting device to complete the shooting of adjacent shooting points in the same shooting sequence is a second fixed time period, so that the shooting device completes the shooting of adjacent shooting points in the same shooting sequence. Duration is stable. It can be understood that the second timing strategy can also be other.
  • under the timing-triggered mode, different strategies can be used to send the shooting sequences to the gimbal; exemplarily, when there are multiple shooting sequences, in some embodiments all shooting sequences are sent to the gimbal at one time before the gimbal controls the shooting device to complete the shooting of the first shooting point of the first shooting sequence, so that the drone does not need to send shooting sequences to the gimbal again while the gimbal and the shooting device cooperate to complete the shooting process; in other embodiments, after the shooting device completes the shooting of the current shooting sequence, the next shooting sequence is sent to the gimbal, that is, after the gimbal controls the shooting device to complete the shooting of each shooting sequence, the next shooting sequence is sent to the gimbal to instruct the gimbal to perform the shooting of that next sequence.
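  • A minimal sketch of how a gimbal-side loop could realize the first and second timing strategies after a single timed trigger; the callbacks are placeholders, not a real gimbal API, and the demo values are kept small so the example runs quickly (the description elsewhere uses 10 s per sequence and 2 s between points as example values):

```python
import time

def run_timed_sequences(sequences, seq_period_s: float, point_interval_s: float,
                        rotate_to, trigger_shutter):
    """Every sequence occupies the same fixed duration (first timing strategy) and
    adjacent shooting points within a sequence are separated by a fixed interval
    (second timing strategy)."""
    for seq in sequences:
        seq_start = time.monotonic()
        for direction in seq:
            rotate_to(direction)          # attitude switching done by the gimbal
            trigger_shutter(direction)    # shoot once the target attitude is reached
            time.sleep(point_interval_s)  # fixed interval between adjacent points
        # Pad so that every sequence takes the same fixed duration overall.
        remaining = seq_period_s - (time.monotonic() - seq_start)
        if remaining > 0:
            time.sleep(remaining)

run_timed_sequences(
    [["forward", "ortho", "backward", "right", "left"], ["ortho"]],
    seq_period_s=1.5, point_interval_s=0.2,
    rotate_to=lambda d: print("rotate to", d),
    trigger_shutter=lambda d: print("shoot", d),
)
```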
  • in some embodiments, the shooting trigger signal is a fixed-distance shooting trigger signal, which can be used to instruct the gimbal to control the shooting device to shoot based on a first fixed-distance strategy.
  • the first fixed-distance strategy includes: the spacing between adjacent shooting sequences is a first fixed spacing, so that the spacing between adjacent shooting sequences remains stable.
  • optionally, the fixed-distance shooting trigger signal is sent to the gimbal multiple times; for example, a fixed-distance shooting trigger signal can be sent to the gimbal before the gimbal controls the shooting device to complete the shooting of the first shooting point of each shooting sequence.
  • after receiving each fixed-distance shooting trigger signal, the gimbal first switches attitude, so that the shooting device on the gimbal is in the corresponding shooting direction when the drone arrives at each shooting point of the corresponding shooting sequence, and then triggers the shooting device to shoot when the shooting device is in the corresponding shooting direction; since the drone's triggering of the gimbal involves a delay, using the first fixed-distance strategy to trigger the gimbal to control the shooting device reduces the influence of that triggering delay on operation efficiency.
  • the fixed-distance shooting trigger signal may also be used to instruct the PTZ to control the shooting device to shoot based on the second fixed-distance strategy.
  • the second fixed distance strategy includes: the distance between adjacent shooting points in the same shooting sequence is a second fixed distance, so that the distance between adjacent shooting points in the same shooting sequence is stable.
  • in addition, since the shooting directions in each shooting sequence are not necessarily identical, a fixed-distance shooting trigger signal can be sent to the gimbal before the gimbal controls the shooting device to complete the shooting of each shooting point, so as to trigger the gimbal to complete the shooting at every shooting point.
  • optionally, the UAV uses the fixed-distance triggering method to trigger the gimbal and the shooting device to complete the shooting process.
  • during shooting by the gimbal and the shooting device, the gimbal completes one posed shooting sequence at fixed times; that is, each time the UAV reaches a waypoint, the gimbal is triggered to enter the shooting program.
  • after the gimbal enters the shooting program, it triggers the shooting device at fixed times to shoot at each shooting point; the spacing between adjacent shooting sequences (i.e., the spacing between adjacent waypoints) is then a first fixed spacing, the time required for the shooting device to complete the shooting of adjacent shooting points in the same shooting sequence is a third fixed time length, and the size of the third fixed time length can be set as required; exemplarily, the third fixed time length is 2 seconds.
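  • A small sketch of the fixed-distance trigger decision, assuming the drone simply compares the distance flown since the last trigger against the fixed spacing (illustrative only):

```python
import math

def should_trigger(prev_trigger_xy, current_xy, fixed_spacing_m: float) -> bool:
    """The drone sends a fixed-distance shooting trigger signal to the gimbal each
    time it has travelled `fixed_spacing_m` since the previous trigger (the spacing
    between adjacent shooting sequences / waypoints)."""
    return math.dist(prev_trigger_xy, current_xy) >= fixed_spacing_m

# With a 10 m spacing between shooting sequences:
print(should_trigger((0.0, 0.0), (9.5, 0.0), 10.0))   # False: keep flying
print(should_trigger((0.0, 0.0), (10.2, 0.0), 10.0))  # True: trigger the gimbal
```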
  • under the fixed-distance triggering mode, different strategies can be used to send the shooting sequences to the gimbal; exemplarily, in some embodiments, each shooting sequence is sent to the gimbal before the gimbal controls the shooting device to complete the shooting of the first shooting point of that sequence, that is, after the gimbal controls the shooting device to complete the shooting of each shooting sequence and before the gimbal controls the shooting device to perform the shooting of the next shooting sequence, the next shooting sequence is sent to the gimbal to instruct the gimbal to perform it.
  • in other embodiments, before the gimbal controls the shooting device to complete the shooting of each shooting point, an indication signal is sent to the gimbal, the indication signal being used to indicate the target attitude of the gimbal or the shooting direction of the shooting device corresponding to that shooting point; that is, the gimbal is triggered at every shooting point to perform the shooting of that point.
  • in addition, the control method for gimbal attitude switching can be selected according to the type of gimbal; taking a three-axis gimbal as an example, the gimbal is configured to move around the yaw axis, the roll axis and the pitch axis.
  • the attitude switching of the gimbal may be realized by controlling one or more of the attitude of the roll axis, the attitude of the pitch axis and the attitude of the yaw axis of the gimbal.
  • usually, the yaw axis of the gimbal cannot rotate a full circle, so controlling the yaw-axis attitude of the gimbal alone is not used to place the shooting device in the shooting direction of each shooting point in a shooting sequence; therefore, in the embodiment of the present application, any two of the yaw attitude, roll-axis attitude and pitch-axis attitude of the gimbal are controlled to switch the gimbal attitude.
  • exemplarily, the yaw attitude and the pitch-axis attitude are controlled to switch the gimbal attitude; optionally, the gimbal target attitude is expressed as (pitch, roll, yaw), and the gimbal target attitudes corresponding to the forward, ortho, backward, right and left shooting directions are, respectively: (-60°, 0°, 0°), (-90°, 0°, 0°), (-120°, 0°, 0°), (-60°, 0°, 90°), (-120°, 0°, 90°).
  • exemplarily, the yaw attitude and the roll-axis attitude are controlled to switch the gimbal attitude; optionally, the gimbal target attitude is expressed as (pitch, roll, yaw), and the gimbal target attitudes corresponding to the forward, ortho, backward, right and left shooting directions are, respectively: (-90°, 30°, 0°), (-90°, 0°, 0°), (-90°, -30°, 0°), (-90°, -30°, 90°), (-90°, 30°, 90°).
  • it can be understood that the gimbal target attitudes listed above for the forward, ortho, backward, right and left shooting directions are only two of the possible choices; similar shooting directions can also be achieved by adjusting the rotation angle of each axis and the order in which the axes rotate, although the operation efficiency differs.
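  • The two preset attitude schemes listed above can be collected in a small lookup table; a minimal sketch, with the direction names used in this description and the (pitch, roll, yaw) tuples quoted above:

```python
# (pitch, roll, yaw) in degrees, as listed above for the yaw+pitch and yaw+roll schemes.
TARGET_ATTITUDES_YAW_PITCH = {
    "forward":  (-60.0, 0.0, 0.0),
    "ortho":    (-90.0, 0.0, 0.0),
    "backward": (-120.0, 0.0, 0.0),
    "right":    (-60.0, 0.0, 90.0),
    "left":     (-120.0, 0.0, 90.0),
}
TARGET_ATTITUDES_YAW_ROLL = {
    "forward":  (-90.0, 30.0, 0.0),
    "ortho":    (-90.0, 0.0, 0.0),
    "backward": (-90.0, -30.0, 0.0),
    "right":    (-90.0, -30.0, 90.0),
    "left":     (-90.0, 30.0, 90.0),
}

def target_attitude(direction: str, use_roll_scheme: bool = False):
    """Look up the preset gimbal target attitude for a shooting direction."""
    table = TARGET_ATTITUDES_YAW_ROLL if use_roll_scheme else TARGET_ATTITUDES_YAW_PITCH
    return table[direction]

print(target_attitude("backward"))                    # (-120.0, 0.0, 0.0)
print(target_attitude("left", use_roll_scheme=True))  # (-90.0, 30.0, 90.0)
```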
  • exemplarily, referring to FIG. 9, the drone flies along flight route 80; when the drone reaches the first shooting point of a certain shooting sequence (in FIG. 9, the shooting direction of the first shooting point of that sequence is the forward shooting direction), the gimbal is triggered to execute the shooting of that sequence.
  • for convenience of description, this shooting sequence is referred to as shooting sequence A; the shooting directions of shooting sequence A include the forward, ortho, backward, right and left shooting directions. The process by which the gimbal performs the shooting of sequence A is then as follows:
  • (1) after the gimbal detects that the shooting device has completed the shooting of the last shooting point of the previous shooting sequence, it rotates to the first target attitude of the gimbal corresponding to the forward shooting direction of sequence A and then triggers the shooting device to shoot, obtaining the forward image;
  • (2)-(5) after the shooting device completes the forward image of sequence A, the gimbal rotates in turn to the second, third, fourth and fifth target attitudes corresponding to the ortho, backward, right and left shooting directions of sequence A, triggering the shooting device at each attitude to obtain the ortho, backward, right and left images; at this point the gimbal has completed the shooting of sequence A.
  • it can be understood that the first, second, third, fourth and fifth target attitudes are all preset target attitudes; optionally, the first, second, third, fourth and fifth target attitudes are, respectively: (-60°, 0°, 0°), (-90°, 0°, 0°), (-120°, 0°, 0°), (-60°, 0°, 90°), (-120°, 0°, 90°), or, the first, second, third, fourth and fifth target attitudes are, respectively: (-90°, 30°, 0°), (-90°, 0°, 0°), (-90°, -30°, 0°), (-90°, -30°, 90°), (-90°, 30°, 90°).
  • the gimbal triggers the shooting device to shoot when the gimbal has reached the preset target attitude and is in a stable state.
  • the stable state of the gimbal may include that the angle of the gimbal relative to a preset direction (e.g., the gimbal-to-ground angle) fluctuates within a preset angle range, that is, the fluctuation of the angle of the gimbal relative to the preset direction is small.
  • in addition, before the gimbal triggers the shooting device to shoot, the shooting device needs to be in a state in which shooting can be performed.
  • optionally, the shooting device being in a state in which shooting can be performed includes: the control part of the shooting device is in a triggerable state; exemplarily, the shooting device is a camera and the control part is the shutter of the camera; optionally, the shooting device being in a state in which shooting can be performed includes: the cache of the shooting device is larger than a preset capacity, that is, the cache of the shooting device is large enough to store at least one image.
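  • The readiness conditions described above (gimbal at the preset target attitude and stable, shutter triggerable, enough cache for at least one image) can be combined into a simple pre-trigger check; a hedged sketch with hypothetical parameter names:

```python
def ready_to_trigger(gimbal_angle_error_deg: float,
                     angle_tolerance_deg: float,
                     shutter_triggerable: bool,
                     free_buffer_bytes: int,
                     min_buffer_bytes: int) -> bool:
    """True only when the gimbal is stable at the target attitude, the camera's
    control part (e.g. the shutter) can be triggered, and the camera cache can
    hold at least one more image."""
    gimbal_stable = abs(gimbal_angle_error_deg) <= angle_tolerance_deg
    buffer_ok = free_buffer_bytes >= min_buffer_bytes
    return gimbal_stable and shutter_triggerable and buffer_ok

print(ready_to_trigger(0.3, 0.5, True, 64_000_000, 25_000_000))  # True: safe to trigger
```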
  • optionally, each shooting sequence corresponds to an image queue, and the image of each shooting point in each shooting sequence can be saved in the corresponding image queue; optionally, images of the same shooting direction are saved in the same image queue and images of different shooting directions are saved in different image queues; the way images are saved can be chosen as needed.
  • in addition, it should be noted that the shooting control method of the embodiments of the present application does not depend on a two-axis or three-axis gimbal; the drone can also complete shooting in multiple shooting directions with a single-axis gimbal.
  • a gimbal with a variable pitch attitude combined with a tic-tac-toe (double-grid) route plan can achieve shooting in three shooting directions;
  • a gimbal with a variable roll attitude combined with a tic-tac-toe route plan can achieve shooting in three shooting directions;
  • a gimbal with a variable yaw attitude combined with a flight route such as the one shown in FIG. 4A or FIG. 4B can achieve shooting in three or four shooting directions.
  • the embodiments of the present application may also provide a shooting control method.
  • the execution subject of the shooting control method in the embodiments of the present application is an unmanned aerial vehicle.
  • the shooting control method according to the embodiment of the present application may include S1001 to S1002:
  • in S1001, the flight route sent by the control device of the drone and the shooting sequence corresponding to each waypoint on the flight route are received, wherein each shooting sequence includes one or more consecutive shooting points, the shooting directions of the one or more shooting points in each shooting sequence are different from one another, the shooting points of each shooting direction are located in the effective shooting area of that shooting direction, the effective shooting area is determined based on the first position information of the area to be photographed and the second position information of the expanded shooting area, the expanded shooting area is obtained by expanding the area to be photographed, and the second position information is determined according to the first position information;
  • in S1002, the shooting device mounted on the UAV is controlled to shoot based on the flight route and the shooting sequence corresponding to each waypoint.
  • corresponding to the shooting control methods of the above embodiments, an embodiment of the present application further provides a shooting control apparatus; referring to FIG. 11, the shooting control apparatus of this embodiment may include a storage device and one or more processors.
  • the storage device is used to store program instructions; it stores a computer program of executable instructions for the shooting control method.
  • the storage device may include at least one type of storage medium, including a flash memory, a hard disk, a multimedia card, a card-type memory (for example, SD or DX memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, magnetic disk, optical disk, and the like.
  • the photographing control device may cooperate with a network storage device that performs the storage function of the memory through a network connection.
  • the memory may be an internal storage unit of the photographing control device, such as a hard disk or a memory of the photographing control device.
  • the memory can also be an external storage device of the shooting control apparatus, such as a plug-in hard disk, a smart media card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card, or a flash card (Flash Card) equipped on the shooting control apparatus, and the like.
  • the memory may also include both an internal storage unit of the photographing control apparatus and an external storage device. Memory is used to store computer programs and other programs and data required by the device. The memory may also be used to temporarily store data that has been or will be output.
  • in some embodiments, the one or more processors invoke the program instructions stored in the storage device, and when the program instructions are executed, the one or more processors are individually or collectively configured to perform the following operations: obtain the first position information of the area to be photographed and the second position information of the expanded shooting area, the expanded shooting area being obtained by expanding the area to be photographed and the second position information being determined according to the first position information; determine, according to the first position information and the second position information, the third position information of the effective shooting areas of different shooting directions; determine, according to the third position information and the preset flight route of the UAV, the shooting sequence corresponding to each waypoint on the flight route; wherein each shooting sequence includes one or more consecutive shooting points, the shooting directions of the one or more shooting points in each shooting sequence are different from one another, and the shooting points of each shooting direction are located in the effective shooting area of that shooting direction.
  • the processor of this embodiment can implement the shooting control method of the embodiment shown in FIG. 2 or FIG. 7 of the present application, and the shooting control apparatus of this embodiment can be described with reference to the shooting control method of the embodiment shown in FIG. 2 or FIG. 7 above.
  • in some embodiments, the one or more processors invoke the program instructions stored in the storage device, and when the program instructions are executed, the one or more processors are individually or collectively configured to perform the following operations: receive the flight route and the shooting sequence corresponding to each waypoint on the flight route sent by the control device of the drone; control the shooting device mounted on the drone to shoot based on the flight route and the shooting sequence corresponding to each waypoint; wherein each shooting sequence includes one or more consecutive shooting points, the shooting directions of the one or more shooting points in each shooting sequence are different from one another, and the shooting points of each shooting direction are located in the effective shooting area of that shooting direction, the effective shooting area being determined according to the first position information of the area to be photographed and the second position information of the expanded shooting area, the expanded shooting area being obtained by expanding the area to be photographed, and the second position information being determined according to the first position information.
  • the processor of this embodiment can implement the shooting control method of the embodiment shown in FIG. 10 of the present application, and the shooting control apparatus of this embodiment can be described with reference to the shooting control method of the embodiment shown in FIG. 10 above.
  • the processor may be a central processing unit (Central Processing Unit, CPU), or another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
  • a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
  • it should be noted that the communication processes such as "sending" and "receiving" involved in the above physical apparatus can be performed by the transceiver or communication interface on the apparatus, and data processing processes other than "sending" and "receiving" can be performed by the processor on the apparatus.
  • further, an embodiment of the present application also provides an unmanned aerial vehicle; please refer to FIG. 1 and FIG. 12.
  • the unmanned aerial vehicle may include a body 100 , a pan/tilt 300 and the shooting control device of the above-mentioned embodiment.
  • the pan/tilt 300 is mounted on the body, and the pan/tilt 300 of this embodiment is used for mounting the photographing device 200 .
  • the imaging control device is supported by the body 100 , and the imaging control device is electrically connected to the pan/tilt head 300 .
  • an embodiment of the present application further provides a computer-readable storage medium, on which a computer program is stored, and when the program is executed by a processor, implements the steps of the shooting control method of the foregoing embodiment.
  • the computer-readable storage medium may be an internal storage unit of the photographing control apparatus described in any of the foregoing embodiments, such as a hard disk or a memory.
  • the computer-readable storage medium can also be an external storage device of the shooting control apparatus, such as a plug-in hard disk, a smart media card (Smart Media Card, SMC), an SD card, or a flash card (Flash Card) equipped on the apparatus, and the like.
  • the computer-readable storage medium may also include both an internal storage unit of the photographing control apparatus and an external storage device.
  • the computer-readable storage medium is used to store the computer program and other programs and data required by the photographing control device, and can also be used to temporarily store data that has been output or will be output.
  • the storage medium may be a magnetic disk, an optical disk, a read-only memory (Read-Only Memory, ROM), or a random access memory (Random Access Memory, RAM) or the like.

Abstract

一种拍摄控制方法和装置、无人机及计算机可读存储介质,所述方法包括:获取待拍摄区域的第一位置信息及外扩拍摄区域的第二位置信息,外扩拍摄区域为将待拍摄区域进行扩大获得,第二位置信息为根据第一位置信息确定;根据第一位置信息和第二位置信息,确定不同拍摄方向的有效拍摄区域的第三位置信息;根据第三位置信息及预设的无人机的飞行航线,确定飞行航线上的每个航点对应的拍摄序列;每个拍摄序列包括一个或多个连续的拍摄点,每个拍摄序列的一个或多个拍摄点的拍摄方向各不相同,各拍摄方向的拍摄点均位于该拍摄方向的有效拍摄区域。本申请的每一拍摄序列中各拍摄方向的拍摄点均位于该拍摄方向的有效拍摄区域,防止产生无效图像数据。

Description

拍摄控制方法和装置、无人机及计算机可读存储介质 技术领域
本申请涉及拍摄领域,尤其涉及一种拍摄控制方法和装置、无人机及计算机可读存储介质。
背景技术
倾斜摄影技术是通过在无人机上搭载多台拍摄装置,同时从一个垂直以及四个侧视不同角度采集图像,相比传统摄影多了四个倾斜拍摄角度,从而能够获取到更加丰富的侧面纹理等信息,适用于测绘等需要获取拍摄物多方位的特征信息的领域。相关技术中,为实现多个方向的拍摄,一种方式是在无人机上搭载多拼拍摄装置(如5拼拍摄装置),同时拍摄多个方向的图像,多拼拍摄装置成本较高、重量大,一般经过减振系统直接挂载在无人机的机体上,缺少机械云台进行增稳,导致成像质量差;而为了减小多拼拍摄装置的体积,会采用卷帘门或电子全局快门,卷帘快门在快速运动摄影下会存在“果冻效应”,从而降低建模精度,电子全局快门成像质量差,同样也会影响建模效果。另一种方式是在无人机上搭载具有单个镜头的拍摄装置,并配合多航线来实现多个方向的拍摄,相比多拼拍摄装置,具有单个镜头的拍摄装置成本低、重量小,可通过云台搭载在无人机的机体上,成像质量更佳。
为了确保拍摄到待拍摄区域在各个方向上的图像,在进行航线规划时,会先将待拍摄区域进行外扩,再对外扩区域(即外扩后的待拍摄区域)进行航线规划。无人机沿着规划好的航线飞行,并在飞行至每个拍摄点时,采集每个方向的图像,这样,在待拍摄区域的外扩航线上会产生大量无效图像数据,不仅浪费存储空间,也对建模处理带来不便。
发明内容
本申请提供一种拍摄控制方法和装置、无人机及计算机可读存储介质。
第一方面,本申请实施例提供一种拍摄控制方法,所述方法包括:
获取待拍摄区域的第一位置信息及外扩拍摄区域的第二位置信息,所述外扩拍摄区域为将所述待拍摄区域进行扩大获得,所述第二位置信息为根据所述第一位置信息确定;
根据所述第一位置信息和所述第二位置信息,确定不同拍摄方向的有效拍摄区域的第三位置信息;
根据所述第三位置信息及预设的无人机的飞行航线,确定所述飞行航线上的每个航点对应的拍摄序列;
其中,每个拍摄序列包括一个或多个连续的拍摄点,每个所述拍摄序列的一个或多个拍摄点的拍摄方向各不相同,且各拍摄方向的拍摄点均位于该拍摄方向的有效拍摄区域。
第二方面,本申请实施例提供一种拍摄控制装置,所述装置包括:
存储装置,用于存储程序指令;以及
一个或多个处理器,调用所述存储装置中存储的程序指令,当所述程序指令被执行时,所述一个或多个处理器单独地或共同地被配置成用于实施如下操作:
获取待拍摄区域的第一位置信息及外扩拍摄区域的第二位置信息,所述外扩拍摄区域为将所述待拍摄区域进行扩大获得,所述第二位置信息为根据所述第一位置信息确定;
根据所述第一位置信息和所述第二位置信息,确定不同拍摄方向的有效拍摄区域的第三位置信息;
根据所述第三位置信息及预设的无人机的飞行航线,确定所述飞行航线上的每个航点对应的拍摄序列;
其中,每个拍摄序列包括一个或多个连续的拍摄点,每个所述拍摄序列的一个或多个拍摄点的拍摄方向各不相同,且各拍摄方向的拍摄点均位于该拍摄方向的有效拍摄区域。
第三方面,本申请实施例提供一种无人机,包括:
机体;
云台,搭载在所述机体上,所述云台用于搭载拍摄装置;和
第二方面所述的拍摄控制装置,由所述机体支撑,所述拍摄控制装置与所述云台电连接。
第四方面,本申请实施例提供一种计算机可读存储介质,其上存储有计算机程序,该程序被处理器执行时实现第一方面所述的拍摄控制方法。
第五方面,本申请实施例提供一种拍摄控制方法,所述方法包括:
接收无人机的控制装置发送的飞行航线及所述飞行航线上的每个航点对应的拍摄序列;
基于所述飞行航线及每个航点对应的拍摄序列控制搭载在所述无人机上的拍摄装置进行拍摄;
其中,每个拍摄序列包括一个或多个连续的拍摄点,每个所述拍摄序列的一个或多个拍摄点的拍摄方向各不相同,且各拍摄方向的拍摄点均位于该拍摄方向的有效拍摄区域,所述有效拍摄区域为根据待拍摄区域的第一位置信息及外扩拍摄区域的第二位置信息确定,所述外扩拍摄区域为将所述待拍摄区域进行扩大获得,所述第二位置信息为根据所述第一位置信息确定。
第六方面,本申请实施例提供一种拍摄控制装置,所述装置包括:
存储装置,用于存储程序指令;以及
一个或多个处理器,调用所述存储装置中存储的程序指令,当所述程序指令被执行时,所述一个或多个处理器单独地或共同地被配置成用于实施如下操作:
接收无人机的控制装置发送的飞行航线及所述飞行航线上的每个航点对应的拍摄序列;
基于所述飞行航线及每个航点对应的拍摄序列控制搭载在所述无人机上的拍摄装置进行拍摄;
其中,每个拍摄序列包括一个或多个连续的拍摄点,每个所述拍摄序列的一个或多个拍摄点的拍摄方向各不相同,且各拍摄方向的拍摄点均位于该拍摄方向的有效拍摄区域,所述有效拍摄区域为根据待拍摄区域的第一位置信息及外扩拍摄区域的第二位置信息确定,所述外扩拍摄区域为将所述待拍摄区域进行扩大获得,所述第二位置信息为根据所述第一位置信息确定。
第七方面,本申请实施例提供一种无人机,包括:
机体;
云台,搭载在所述机体上,所述云台用于搭载拍摄装置;和
第六方面所述的拍摄控制装置,由所述机体支撑,所述拍摄控制装置与所述云台电连接。
第八方面,本申请实施例提供一种计算机可读存储介质,其上存储有计算机程序,该程序被处理器执行时实现第五方面所述的拍摄控制方法。
根据本申请实施例提供的技术方案,本申请在规划拍摄序列时,确保每一拍摄序列中各拍摄方向的拍摄点均位于该拍摄方向的有效拍摄区域,如此,不仅可以防止产生无效图像数据,还可以减少拍摄点的数量,减小拍摄时间,提高多向拍摄的效率。
附图说明
为了更清楚地说明本申请实施例中的技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动性的前提下,还可以根据这些附图获得其他的附图。
图1是本申请一实施例中的无人机的结构示意图;
图2是本申请一实施例中的拍摄控制方法的方法流程示意图;
图3是本申请一实施例中的待拍摄区域和外扩拍摄区域的位置关系示意图;
图4A是本申请一实施例中的飞行航线的示意图;
图4B是本申请另一实施例中的飞行航线的示意图;
图5A是本申请一实施例中的其中一个拍摄方向的效拍摄区域与待拍摄区域的位置关系示意图;
图5B是本申请一实施例中的另一个拍摄方向的效拍摄区域与待拍摄区域的位置关系示意图;
图5C是本申请一实施例中的另一个拍摄方向的效拍摄区域与待拍摄区域的位置关系示意图;
图5D是本申请一实施例中的另一个拍摄方向的效拍摄区域与待拍摄区域的位置关系示意图;
图6A是本申请一实施例中的无人机在同一拍摄方向不同拍摄点所拍摄图像的对比图;
图6B是本申请另一实施例中的飞行航线的示意图;
图7是本申请一实施例中的一种基于飞行航线及拍摄序列控制搭载在无人机上的拍摄装置进行拍摄的实现方式示意图;
图8是本申请一实施例中的相邻两个拍摄序列的拍摄方向相同的拍摄点的图像之间的位置关系示意图;
图9是本申请一实施例中的云台执行某个拍摄序列的拍摄的过程的示意图;
图10是本申请另一实施例中的拍摄控制方法的方法流程示意图;
图11是本申请一实施例中的拍摄控制装置的结构框图;
图12是本申请一实施例中的无人机的结构框图。
具体实施方式
传统的测绘通过全站仪或GNSS(Global Navigation Satellite System,全球导航卫星系统)手持设备进行测点,其缺点是效率低、操作难度高、作业成本高,对于大面积高精度高分辨测绘,传统测绘无法满足,其已逐步被有人机测绘和无人机测绘所替代。有人机或无人机测绘还可以用于建立测量区域的三维模型,通过倾斜摄影技术对待拍摄区域的多个方向进行拍摄,结合三维建模算法对多个方向的图像进行处理解算得到包含三维空间信息的模型。
为了确保拍摄到待拍摄区域在各个方向上的图像,在进行倾斜摄影的航线规划 时,会先将待拍摄区域进行外扩,再对外扩区域(即外扩后的待拍摄区域)进行航线规划。无人机沿着规划好的航线飞行,并在飞行至每个拍摄点时,采集每个方向的图像,这样,在待拍摄区域的外扩航线上会产生大量无效图像数据,不仅浪费存储空间,也对建模处理带来不便。
针对上述问题,本申请实施例在规划拍摄序列时,确保每一拍摄序列中各拍摄方向的拍摄点均位于该拍摄方向的有效拍摄区域,如此,不仅可以防止产生无效图像数据,还可以减少拍摄点的数量,减小拍摄时间,提高多向拍摄的效率。
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行描述。
需要说明的是,在不冲突的情况下,下述的实施例及实施方式中的特征可以相互组合。
本申请中,“至少一个”是指一个或者多个,“多个”是指两个或两个以上。“和/或”,描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B的情况,其中A,B可以是单数或者复数。字符“/”一般表示前后关联对象是一种“或”的关系。“以下至少一项(个)”或其类似表达,是指的这些项中的任意组合,包括单项(个)或复数项(个)的任意组合。例如,a,b,或c中的至少一项(个),可以表示:a,b,c,a和b,a和c,b和c,或a和b和c,其中a,b,c可以是单个,也可以是多个。
图1是本申请一实施例中的无人机的结构示意图;请参见图1,本申请实施例的无人机可包括机体100、拍摄装置200和云台300,其中,拍摄装置200通过云台300搭载在机体100上。该无人机可以为固定翼无人机,也可以为多旋翼无人机,具体可根据实际需求选择无人机的类型,例如,当云台300和拍摄装置200的重量较大时,可选择体积和重量较大的固定翼无人机来搭载云台300和拍摄装置200;当云台300和拍摄装置200重量较小时,可选择体积和重量较小的多旋翼无人机来搭载云台300和拍摄装置200。
本申请实施例的拍摄装置的数量为一台,在利用无人机进行倾斜摄影时,只需采用一台拍摄装置,该拍摄装置虽然像素大,但相对多拼拍摄装置体积和重量大大减小,从而大大减小了无人机的重量以及尺寸。拍摄装置200可为集成相机,也可以为图像传感器和镜头组合成的器件,需要说明的是,本申请实施例的拍摄装置200为具有单个镜头的拍摄装置。此外,本申请实施例的云台300可以为单轴云台、两轴云台、三轴云台或其他多轴云台。
该无人机可应用在测绘领域,以拍摄物为地面为例,通过无人机搭载拍摄装置200采集地面图像,再利用软件对地面图像进行三维或二维地图重建,通过测绘获得的地图可应用在不同的行业,如在电力巡检领域,可利用重建的地图检查线路故障;在道路规划领域,可利用重建的地图进行道路的选址;缉毒警察可利用重建的三维地图来检查深山中的罂粟种植情况等等。当然,该无人机并不局限于测绘领域,也可应用在其他需要获取拍摄物多方位的特征信息的领域。拍摄物也不限于地面,也可为大型建筑物、山峦等。
图2是本申请一实施例中的拍摄控制方法的方法流程示意图;请参见图2,本申请实施例的拍摄控制方法可包括步骤S201~S203。
其中,在S201中、获取待拍摄区域的第一位置信息及外扩拍摄区域的第二位置信息,外扩拍摄区域为将所述待拍摄区域进行扩大获得,第二位置信息为根据第一位置信息确定。
用户可通过不同方式定义待拍摄区域,如通过手动打点或导入外部文件来定义待拍摄区域。相应的,获取第一位置信息也可采用不同的策略,示例性的,在一些实施例中,第一位置信息由用户设定,如用户通过手动打点方式输入第一位置信息;在一些实施例中,待拍摄区域通过导入外部文件确定,外部文件记录有第一位置信息。可选的,在执行S201之前,可输出提示信息以提示用户对待拍摄区域进行定义。
本申请实施例的待拍摄区域可以为方形区域,也可以为其他形状的区域,如圆形区域、五边形区域等等。
示例性的,待拍摄区域为方形区域,第一位置信息可包括方形区域的四个角部的位置信息,当然,第一位置信息也可包括方形区域的其他位置的位置信息。
另外,在一些实施例中,在获取待拍摄区域的第一位置信息及外扩拍摄区域的第二位置信息之前,若获取到指示进入倾斜拍摄模式的触发指令,则进入倾斜拍摄模式,也即,在进入倾斜拍摄模式后,进行拍摄序列的规划。
应当理解的是,拍摄序列的规划可以由无人机的控制装置执行,也可以由无人机执行,而使用规划好的拍摄序列进行拍摄这一过程是由无人机执行的。因此,若拍摄序列规划过程是在控制装置中进行的,则在控制装置进行拍摄序列的规划之前,可触发控制装置进入倾斜拍摄模式;若拍摄序列规划过程是在无人机中进行的,则在无人机进行拍摄序列的规划之前,需先触发无人机进入倾斜拍摄模式。另外,无人机是在倾斜拍摄模式下使用控制装置规划好的拍摄序列进行拍摄的。
本申请实施例中,无人机的控制装置可为遥控器或者其他能够控制无人机的终端设备,如手机、平板电脑、手提电脑、台式机、智能穿戴设备等。
第二位置信息还与将待拍摄区域进行扩大时采用的策略相关,示例性的,当外扩拍摄区域为将待拍摄区域在各方向进行等大扩大获得时,可根据第一位置信息以及外扩拍摄区域相对待拍摄区域扩大的倍数,确定第二位置信息;也可根据第一位置信息以及外扩拍摄区域的边缘与待拍摄区域的边缘之间的距离,确定第二位置信息。当外扩拍摄区域为将待拍摄区域在至少部分方向进行不同大小的扩大获得时,可根据第一位置信息以及外扩拍摄区域在不同方向上的边缘与待拍摄区域在对应方向上的边缘之间的距离,确定第二位置信息。
示例性的,外扩拍摄区域为将待拍摄区域的不同方向分别扩大第一预设距离获得的区域。示例性的,请参见图3,待拍摄区域为长方形区域10,将长方形区域10在不同方向分别扩大第一预设距离D ext的大小,即获得外扩拍摄区域20。
可选的,第一预设距离为基于无人机的飞行高度及搭载在无人机上的拍摄装置的安装角度确定,这样设置是考虑了拍摄装置采集图像的分辨率及航线规划需求等因素。示例性的,第一预设距离D ext的计算公式如下:
Figure PCTCN2020102249-appb-000001
公式(1)中,H为飞行高度,α为拍摄装置的安装角度,示例性的,α为拍摄装置的镜头光轴对地平面的夹角。
应当理解的是,第一预设距离也可采用其他策略确定。
飞行高度也可采用不同策略确定,例如,在一些实施例中,飞行高度由用户设定,示例性的,飞行高度为用户通过无人机的控制装置输入,这种确定飞行高度的方式可以满足不同的用户需求,灵活性强;在一些实施例中,飞行高度为根据搭载在无人机上的拍摄装置的参数和预设的地面分辨率确定,示例性的,拍摄装置的参数包括 拍摄装置的焦距和拍摄装置的图像传感器的单个像素边长,飞行高度的计算公式可以为:
Figure PCTCN2020102249-appb-000002
公式(2)中,H为飞行高度,f为拍摄装置的焦距,GSD(Ground Sampling Distance)为预设的地面分辨率,pix为拍摄装置的图像传感器的单个像素边长。应当理解的是,拍摄装置的参数不限于上述列举的参数,也可以包括其他,飞行高度的计算公式也不限于上述公式(1),也可以为其他。
在S202中、根据第一位置信息和第二位置信息,确定不同拍摄方向的有效拍摄区域的第三位置信息。
本申请实施例的拍摄方向可包括以下中的至少两种:相对竖直方向倾斜且朝向无人机的前方的前拍方向、相对竖直方向倾斜且朝向无人机的后方的后拍方向、相对竖直方向倾斜且朝向无人机的左侧方向的左拍方向、相对竖直方向倾斜且朝向无人机的右侧方向的右拍方向或拍摄方向竖直朝下的正拍方向。需要说明的是,在无人机正立时,机头指向前方,机尾指向后方。
可根据实际需求将拍摄方向选择上述中的至少两种,示例性的,在测绘时,拍摄方向包括前拍方向(拍摄装置用于拍摄拍摄物的前向图像)、后拍方向(拍摄装置用于拍摄拍摄物的后向图像)、左拍方向(拍摄装置用于拍摄拍摄物的左向图像)和右拍方向(拍摄装置用于拍摄拍摄物的右向图像),或者,拍摄方向包括前拍方向、后拍方向、左拍方向、右拍方向以及正拍方向(拍摄装置用于拍摄拍摄物的正射图像)。可以理解的是,在其他使用场景,拍摄方向可选择为其他,以满足相应的需求。
其中,前拍方向的有效拍摄区域、后拍方向的有效拍摄区域、左拍方向的有效拍摄区域、右拍方向的有效拍摄区域、正拍方向的有效拍摄区域分别为:待拍摄区域向第一方向移动第二预设距离后获得的区域、待拍摄区域向第二方向移动第二预设距离后获得的区域、待拍摄区域向第三方向移动第二预设距离后获得的区域、待拍摄区域向第四方向移动第二预设距离后获得的区域、待拍摄区域,也即,前拍方向的有效拍摄区域为待拍摄区域向第一方向移动第二预设距离后获得的区域,后拍方向的有效拍摄区域为待拍摄区域向第二方向移动第二预设距离后获得的区域,左拍方向的有效拍摄区域为待拍摄区域向第三方向移动第二预设距离后获得的区域,右拍方向的有效拍摄区域为待拍摄区域向第四方向移动第二预设距离后获得的区域,正拍方向的有效拍摄区域为待拍摄区域。第二预设距离与第一预设距离可相等,也可不相等。可以理解的,各拍摄方向的有效区域位于外扩拍摄区域内,因此,第二预设距离小于或等于第一预设距离。
需要说明的是,当待测区域向各个方向分别外扩不同距离得到外扩拍摄区域时,上述各个方向分别向不同方向移动的距离可以不相等。例如,前拍方向的有效区域为待拍摄区域向第一方向移动第二预设距离后获得的区域,该第二预设距离小于或等于外扩拍摄区域由待拍摄区域向第一方向移动的距离,而后拍方向的有效拍摄区域为待拍摄区域向第二方向移动第三预设距离后获得的区域,该第三预设距离小于或等于外扩拍摄区域由待拍摄区域向第二方向移动的距离。
本申请实施例的第一方向与第二方向相反,第三方向与第四方向相反。具体而言,第一方向、第二方向、第三方向或第四方向与无人机的飞行航线的形状相关。
示例性的,飞行航线可包括多条相互平行的子航线,相邻子航线的其中一侧相 连,以形成一条飞行航线。沿用图3所示实施例的待拍摄区域和外扩拍摄区域,可选的,飞行航线的起始航点为外扩拍摄区域的任一边角位置,子航线平行于外扩拍摄区域的其中一条边。示例性的,请参见图4A,飞行航线30的起始航点A为外扩拍摄区域的左下角,飞行航线30的终点B为外扩拍摄区域的右上角;示例性的,请参见图4B,飞行航线40的起始航点C为外扩拍摄区域的左上角,飞行航线40的终点D为外扩拍摄区域的右下角。当然,起始航点也可以为外扩拍摄区域的右上角或右下角,终点相应为外扩拍摄区域左下角或左上角。另外,图4A以及图4B所示实施例中,子航线均平行于外扩拍摄区域的短边,可以理解的,子航线也可以平行于外扩拍摄区域的长边。
示例性的,对于图4A所示的飞行航线,以图4A显示的上、下、左、右方向为基准,则第一方向为下方向,第二方向为上方向,第三方向为右方向,第四方向为左方向,如此,获得的前拍方向的有效拍摄区域为图5A所示的区域51,图5A中外扩拍摄区域20内去除区域51获得的区域即为前拍方向的无效拍摄区域;后拍方向的有效拍摄区域为与5B所示的区域52,图5B中外扩拍摄区域20内去除区域52获得的区域即为后拍方向的无效拍摄区域;左拍方向的有效拍摄区域为图5C所示的区域53,图5C中外扩拍摄区域20内去除区域53获得的区域即为左拍方向的无效拍摄区域;右拍方向的有效拍摄区域为图5D所示的区域54,图5D中外扩拍摄区域20内去除区域54获得的区域即为右拍方向的无效拍摄区域。
示例性的,对于图4B所示的飞行航线,以图4B显示的上、下、左、右方向为基准,则第一方向为上方向,第二方向为下方向,第三方向为左方向,第四方向为右方向,如此,获得的前拍方向的有效拍摄区域为图5B所示的区域52,图5B中外扩拍摄区域20内去除区域52获得的区域即为前拍方向的无效拍摄区域;后拍方向的有效拍摄区域为与5A所示的区域51,图5A中外扩拍摄区域20内去除区域51获得的区域即为后拍方向的无效拍摄区域;左拍方向的有效拍摄区域为图5D所示的区域54,图5D中外扩拍摄区域20内去除区域54获得的区域即为左拍方向的无效拍摄区域;右拍方向的有效拍摄区域为图5C所示的区域53,图5C中外扩拍摄区域20内去除区域53获得的区域即为右拍方向的无效拍摄区域。
对于图4A和图4B所示的飞行航线,正拍方向的有效拍摄区域均为待拍摄区域10,图4A和图4B中外扩拍摄区域20内去除待拍摄区域10获得的区域即为正拍方向的无效拍摄区域。另外,图5A至图5D中D 1即为第二预设距离,第二预设距离与第一预设距离相等。可以理解的,飞行航线不限于图4A和图4B所示的飞行航线,也可以设置为其他。
上述飞行航线的确定方式可根据需要选择,示例性的,飞行航线的确定过程包括但不限于如下步骤:
(1)、根据预设的地面分辨率、预设的旁向重叠率及搭载在无人机上的拍摄装置垂直于无人机的飞行方向上的像元个数(即拍摄装置的图像传感器垂直于无人机的飞行方向上的像元个数),确定飞行航线中相邻两条子航线间的旁向间距;
示例性的,旁向间距D route的计算公式如下:
D route=GSD(1-γ lateral)n H   (3);
公式(3)中,GSD为地面分辨率,γ lateral为旁向重叠率,n H为搭载在无人机上的拍摄装置垂直于无人机的飞行方向上的像元个数。
可以理解的是,旁向间距D route的计算方式不限于公式(3),也可以为其他。
以其中拍摄的正射图像(拍摄装置的拍摄方向为正拍方向)为例,如图6A所示,拍摄点1与拍摄点2由于在同一子航线上,拍摄装置在拍摄点1所拍摄的图像与拍摄装置在拍摄点2所拍摄的图像在飞行方向上的重叠比例称为航向重叠率。拍摄点1与拍摄点12分别在相邻两条子航线上,拍摄装置在拍摄点1所拍摄的图像与拍摄装置在拍摄点12所拍摄的图像在飞行方向的垂直方向上的重叠比例称为旁向重叠率。
(2)、根据第二位置信息及旁向间距,确定飞行航线。
即在外扩拍摄区域进行航线规划,飞行航线中的相邻子航线之间的旁向间距为步骤(1)中确定的旁向间距。
无人机在拍摄过程中,行进的航线所拍摄照片需要保证一定的重叠率,以能够应用在测绘等领域上。旁向重叠率可以为默认数值大小,也可以由用户设定。示例性的,旁向重叠率由用户设定,例如,旁向重叠率为用户通过无人机的控制装置输入,这种确定旁向重叠率的方式可以满足不同的用户需求,灵活性强。可选的,旁向重叠率可大于或等于65%,并小于或等于80%。示例性的,旁向重叠率为65%、70%、75%、80%或其他大于65%,并小于80%的数值大小。
可以理解的是,飞行航线也可采用其他方式规划,示例性的,请参见图6B飞行航线可为井字航线,井字航线包括两条航线(图6B中的航线60和航线70),两条航线的子航线相互垂直,一条航线需要完成两个或三个拍摄方向的倾斜图像的采集,其中一条航线采集左向图像和右向图像,或者左向图像、右向图像和正射图像,另一条航线采集前向图像和后向图像,或者前向图像、后向图像和正射图像,若只需拍摄左向图像和右向图像,或者只需拍摄左向图像、右向图像和正射图像,或者只需拍摄前向图像和后向图像,或者只需拍摄前向图像、后向图像和正射图像,飞行航线则可规划为井字航线。其中,井字航线的旁向间距与上述实施例的旁向间距D route的大小一致。
本申请实施例中,一个拍摄方向对应一个云台的预设目标姿态,即云台到达预设目标姿态时,拍摄装置处于对应的拍摄方向。
在S203中、根据第三位置信息及预设的无人机的飞行航线,确定飞行航线上的每个航点对应的拍摄序列,其中,每个拍摄序列包括一个或多个连续的拍摄点,每个拍摄序列的一个或多个拍摄点的拍摄方向各不相同,且各拍摄方向的拍摄点均位于该拍摄方向的有效拍摄区域。
示例性的,拍摄方向包括前拍方向、后拍方向、左拍方向、右拍方向以及正拍方向,其中,若拍摄序列位于前拍方向的有效拍摄区域外的飞行航线上,则该拍摄序列中不存在前拍方向的拍摄点;若拍摄序列位于后拍方向的有效拍摄区域外的飞行航线上,则该拍摄序列中不存在后拍方向的拍摄点;若拍摄序列位于左拍方向的有效拍摄区域外的飞行航线上,则该拍摄序列中不存在左拍方向的拍摄点;若拍摄序列位于右拍方向的有效拍摄区域外的飞行航线上,则该拍摄序列中不存在右拍方向的拍摄点;若拍摄序列位于正拍方向的有效拍摄区域外的飞行航线上,则该拍摄序列中不存在正拍方向的拍摄点。可以理解的是,本申请实施例中,每一拍摄序列位于前拍方向、后拍方向、左拍方向、右拍方向以及正拍方向中至少一个拍摄方向的有效区域内的飞行航线上,也即,每一拍摄序列在飞行航线上的位置处于至少一个有效拍摄区域,每一拍摄序列包括至少一个拍摄方向的拍摄点。
本申请实施例中,每一拍摄序列中的拍摄点的数量与该拍摄序列在飞行航线上 的位置所处有效拍摄区域的数量正相关,沿用图4A和图4B所示实施例,拍摄序列在飞行航线上的位置可处于区域1、区域2、区域3和区域4中的至少一个。
其中,区域1为5个拍摄方向的有效区域的重叠区域,即区域1为前拍方向、后拍方向、左拍方向、右拍方向及正拍方向的有效拍摄区域的重叠区域,区域2分别为4个拍摄方向的有效区域的重叠区域,则区域2包括4个重叠区域:前拍方向、左拍方向、右拍方向及正拍方向的有效拍摄区域的重叠区域,后拍方向、左拍方向、右拍方向及正拍方向的有效拍摄区域的重叠区域,前拍方向、后拍方向、左拍方向及正拍方向的有效拍摄区域的重叠区域,前拍方向、后拍方向、右拍方向及正拍方向的有效拍摄区域的重叠区域;区域3为3个拍摄方向的有效区域的重叠区域,则区域3包括4个重叠区域,分别为:前拍方向、左拍方向及正拍方向的有效拍摄区域的重叠区域,前拍方向、右拍方向及正拍方向的有效拍摄区域的重叠区域,后拍方向、左拍方向及正拍方向的有效拍摄区域的重叠区域,后拍方向、右拍方向及正拍方向的有效拍摄区域的重叠区域;区域4为单个拍摄方向的有效拍摄区域,则区域4包括4个独立的有效拍摄区域(不与其他拍摄方向的有效拍摄区域重叠),分别为前拍方向的有效拍摄区域、后拍方向的有效拍摄区域、左拍方向的有效拍摄区域、右拍方向的有效拍摄区域。
若拍摄序列在飞行航线上的位置处于区域1内,则该拍摄序列中拍摄点的数量为5个;若拍摄序列在飞行航线上的位置处于区域2内,则该拍摄序列中的拍摄点的数量为4个;若拍摄序列在飞行航线上的位置处于区域3内,则该拍摄序列中的拍摄点的数量为3个;若拍摄序列在飞行航线上的位置处于区域4内,则该拍摄序列中的拍摄点的数量为1个。
可以理解的是,本申请实施例中,多个拍摄序列按顺序排列,其中,每个拍摄序列的先后顺序与无人机按照飞行航线飞行时,经过该拍摄序列在飞行航线上的位置的先后相一致。
后续无人机在根据规划好的拍摄序列进行倾斜摄影时,可以采用定时拍或定距拍方式触发云台和拍摄装置完成拍摄过程。可选的,拍摄装置完成每个拍摄序列的拍摄所需的时长固定或者相邻拍摄序列之间的间距固定,从而使得两个拍摄序列之间的拍摄间隔时长或间距更加稳定。示例性的,在一些实施例中,拍摄装置完成每个拍摄序列的拍摄所需的时长为第一固定时长,这样,无人机可采用定时拍方式触发云台和拍摄装置完成拍摄过程,可选的,无人机在根据规划好的拍摄序列进行倾斜摄影时,可在云台控制所述拍摄装置完成首个所述拍摄序列的首个拍摄点的拍摄之前,发送一次定时拍摄触发信号给云台,云台在收到该定时拍摄触发信号后,会基于第一固定时长定时触发拍摄装置依次完成每个拍摄序列的拍摄,采用这种定时拍的触发方式,无人机只需发送一次定时拍摄触发信号即可,无人机的控制较为简单。当然,拍摄装置完成每个拍摄序列的拍摄所需的时长也可不为固定时长。
进一步可选的,拍摄装置完成同一拍摄序列中相邻拍摄点的拍摄所需的时长为第二固定时长,使得每一拍摄序列内的相邻拍摄点之间的拍摄间隔时长稳定。当然,拍摄装置完成同一拍摄序列中相邻拍摄点的拍摄所需的时长也可不为固定时长。
可以理解的,对于同一拍摄序列,第一固定时长大于第二固定时长。其中,第一固定时长、第二固定时长的大小可根据需要设置,示例性的,第一固定时长为10秒,第二固定时长为2秒;当然,第一固定时长、第二固定时长也可设置为其他数值大小。
在一些实施例中,相邻拍摄序列之间的间距为第一固定间距,这样,无人机可采用定距拍方式触发云台和拍摄装置完成拍摄过程,可选的,无人机在根据规划好的拍摄序列进行倾斜摄影时,可在通过云台控制拍摄装置完成每个拍摄序列首个拍摄点的拍摄之前,分别发送定距拍摄触发信号至云台,云台在每次接收到定距拍摄触发信号后,先进行姿态切换,使得云台上的拍摄装置在无人机到达对应拍摄序列的每一拍摄点时均处于对应的拍摄方向,再在拍摄装置处于对应的拍摄方向时,触发拍摄装置拍摄,这种定距拍的触发方式,能够使得两个拍摄序列之间的距离更加稳定。
进一步可选的,同一拍摄序列中相邻拍摄点之间的间距为第二固定间距,无人机可在云台控制拍摄装置完成每个拍摄点的拍摄之前,分别发送定距拍摄触发信号至云台,云台在每次接收到定距拍摄触发信号后,先进行姿态切换,使得云台上的拍摄装置在无人机到达对应的拍摄点时处于该拍摄点对应的拍摄方向,再在拍摄装置处于该拍摄点对应的拍摄方向时,触发拍摄装置拍摄,从而使得每一拍摄序列中的相邻两个拍摄点之间的距离更加稳定。
其中,第一固定间距、第二固定间距的大小可根据需要设置,示例性的,第一固定间距为10米,第二固定间距为2米;当然,第一固定间距、第二固定间距也可设置为其他数值大小。
在一些实施例中,无人机采用定距拍方式触发云台和拍摄装置完成拍摄过程,在云台和拍摄装置拍摄过程中,云台定时完成一次摆拍序列,也即,无人机到达每个航点时,触发云台进入拍摄程序,云台在进入拍摄程序后,定时触发拍摄装置在每个拍摄点进行拍摄,示例性的,相邻拍摄序列之间的间距(即相邻航点之间的间距)为第三固定间距,拍摄装置完成同一拍摄序列中相邻拍摄点的拍摄所需的时长为第三固定时长,第三固定间距、第三固定时长的大小可根据需要设置,示例性的,第三固定间距为10米,第三固定时长为2秒。
可选的,拍摄点中的初始拍摄点(即首个拍摄序列中的首个拍摄点)为:无人机按照飞行航线飞行时的起始飞行位置;可选的,拍摄点中的初始拍摄点为:飞行航线的初始航点。其中,起始飞行位置与起始航点可为同一位置,也可为不同位置,具体可根据需要选择上述实施例中的一种来确定初始拍摄点。可以理解,初始拍摄点的确定方式并不限于上述列举的几种方式,还可采用其他方式来确定初始拍摄点。
除特别说明外,上述实施例的拍摄控制方法的执行主体可为无人机的控制装置,控制装置可以为诸如遥控器、手机、电脑、智能穿戴设备等能够控制无人机的装置;上述实施例的拍摄控制方法的执行主体还可为无人机,如执行主体可为无人机的飞行控制器或设于无人机的其他控制器或飞行控制器或和设于无人机的其他控制器的组合;上述实施例的拍摄控制方法的执行主体还可为无人机的控制装置和无人机的组合,例如,将第一位置信息、第二位置信息的获取以及飞行航线的规划放在无人机的控制装置执行,将有效拍摄区域的确定以及拍摄序列的确定放在无人机执行;又如,将第一位置信息、第二位置信息的获取放在无人机的控制装置执行,将飞行航线的规划、有效拍摄区域的确定以及拍摄序列的确定放在无人机执行;又如,将第一位置信息的获取放在无人机的控制装置执行,将第二位置信息的确定、飞行航线的规划、有效拍摄区域的确定以及拍摄序列的确定放在无人机执行;又如,还可以将第一位置信息的获取、第二位置信息的确定、飞行航线的规划、有效拍摄区域的确定以及拍摄序列的确定全部放在无人机执行;当然,上述实施例的拍摄控制方法的执行主体不限于无人机的控制装置和/或无人机,也可为诸如其他独立于无人机的控制装置或无人机的 电子设备,如云台或拍摄装置的控制设备。
示例性的,在一些实施例中,上述实施例的方法的执行主体为无人机的控制装置。其中,控制装置在获取外扩拍摄区域的第二位置信息时,具体的,根据第一位置信息,确定外扩拍摄区域的第二位置信息。可选的,飞行航线的规划是在控制装置中进行的,示例性的,根据第二位置信息,规划无人机的飞行航线,其中,飞行航线的规划可参见上述实施例中相应部分的描述,此处不再赘述。进一步的,拍摄控制方法还包括:发送飞行航线至无人机,并且,在根据第三位置信息及预设的无人机的飞行航线,确定飞行航线上的每个航点对应的拍摄序列之后,发送每个航点对应的拍摄序列至无人机,使得无人机基于飞行航线及每个航点对应的拍摄序列控制搭载在无人机上的拍摄装置进行拍摄,从而在无人机进行倾斜摄影之前,通过无人机的控制装置将飞行航线发送给无人机,无人机在执行该飞行航线的过程中执行倾斜摄影。可以理解的是,飞行航线的规划过程也可在无人机中进行。
在一些实施例中,上述实施例的拍摄控制方法的执行主体为无人机,示例性的,无人机可在进行倾斜摄影之前进行上述拍摄序列的规划,规划获得的拍摄序列中各拍摄方向的拍摄点均位于该拍摄方向的有效拍摄区域;示例性的,每个拍摄序列均包括所有拍摄方向的拍摄点,无人机在进行倾斜摄影的过程中,根据无人机的实时位置信息以及当前执行的拍摄序列的拍摄点,去除当前拍摄序列中位于无效拍摄区域的拍摄点。
下面,以拍摄控制方法的执行主体为无人机为例对拍摄控制方法进一步说明。
其中,第一位置信息可由无人机的控制装置发送,示例性的,用户通过无人机的控制装置输入第一位置信息,无人机的控制装置再将第一位置信息发送给无人机。其中,用户可通过手动打点方式或通过外部文件将第一位置信息输入无人机的控制装置,具体可参见上述实施例中相应部分的描述,此处不再赘述。可以理解的是,若无人机自身具备输入模块,第一位置信息也可由用户操作输入模块直接输入无人机。
可采用不同策略获取第二位置信息,示例性的,在一些实施例中,第二位置信息由无人机的控制装置发送,可选的,由无人机的控制装置根据第一位置信息,确定外扩拍摄区域的第二位置信息,再将第二位置信息发给无人机;在一些实施例中,由无人机自身确定第二位置信息,具体的,根据第一位置信息,确定外扩拍摄区域的第二位置信息。其中,根据第一位置信息,确定外扩拍摄区域的第二位置信息的实现过程可参见上述实施例中相应部分的描述,此处不再赘述。
飞行航线的规划可放在无人机的控制装置进行或放在无人机进行,示例性的,在一些实施例中,飞行航线由无人机的控制装置基于第二位置信息进行规划,无人机的控制装置再将飞行航线发送给无人机;在一些实施例中,飞行航线由无人机基于第二位置信息进行规划。其中,飞行航线的规划可参见上述实施例中相应部分的描述,此处不再赘述。
本申请实施例的拍摄控制方法还可包括:基于飞行航线及每个航点对应的拍摄序列控制搭载在无人机上的拍摄装置进行拍摄。
下面,对无人机基于飞行航线及拍摄序列控制搭载在无人机上的拍摄装置进行拍摄的过程进行详细描述。
图7为本申请一实施例中的一种基于飞行航线及拍摄序列控制搭载在无人机上的拍摄装置进行拍摄的实现方式示意图。请参见图7,基于飞行航线及拍摄序列控制搭载在无人机上的拍摄装置进行拍摄的过程可包括步骤S701~S703。
其中,在S701中,控制无人机按照飞行航线飞行;
在S702中,根据拍摄序列,在无人机从当前拍摄点飞向下一拍摄点的过程中,控制无人机上的云台切换姿态,使得云台上的拍摄装置在无人机到达每一拍摄点时均处于对应的拍摄方向;
在S703中,获取拍摄装置在每一拍摄点所拍摄的图像。
本申请实施例在无人机从当前拍摄点飞行下一拍摄点的过程中,控制搭载拍摄装置的云台切换姿态,使得拍摄装置在无人机到达每一拍摄点时均处于对应的拍摄方向,拍摄过程无需停止无人机飞行,从而提高了拍摄效率,特别适用于地图测绘上;并且,本申请实施例通过云台控制一台拍摄装置异步完成多个拍摄方向的图像的拍摄,相比传统的多拼拍摄装置,本发明的无人机重量大大减轻,从而可选择体积和重量较轻的无人机来搭载拍摄装置,降低了使用成本。
本申请实施例中,拍摄装置进行拍摄不影响无人机的飞行,也即,拍摄装置进行拍摄时,无人机继续执行飞行航线,无人机不会因为拍摄装置的拍摄动作而悬停,进一步提高拍摄效率。
本申请实施例的飞行航线可包括多个航点,其中,飞行航线可由用户预先设定,可选的,用户通过无人机的控制装置将各航点的位置信息输入无人机,无人机可将各航点按照输入的顺序依次相连而形成上述飞行航线。在用户更新已设定好的飞行航线中部分航点的位置时,可通过操作无人机的控制装置来对已设定好的飞行航线中部分航点的位置信息进行修改。对已设定好的飞行航线中部分航点的位置信息进行修改的步骤可在无人机飞行前执行,也可在无人机飞行的过程中执行。可以理解的是,飞行航线也可以为默认的飞行航线。
航点和拍摄点之间的位置设置关系可根据需要选择,例如,在一些实施例中,相邻航点之间设有拍摄点,如此,可以利用航点间的飞行时间大于拍摄装置拍摄图像所需的时间的特点,在航点间插入多个拍摄方向的图像拍摄,拍摄效率较高;在另一些实施例中,多个航点中的一部分作为拍摄点,相邻航点之间可设置拍摄点,也可不设置拍摄点;在另一些实施例中,多个航点全部作为拍摄点,相邻航点之间可设置拍摄点,也可不设置拍摄点。可以理解的是,一个拍摄点具有一次拍摄,对应一个拍摄方向,并对应一个云台的预设目标姿态,得到一张拍摄的图像。
在相关技术中,无论是固定翼无人机还是旋翼无人机,在利用一台拍摄装置实现多个角度的拍摄时,由于速度或效率控制的原因,一般将飞行航线会设计成多条飞行航线,如五航线,每条飞行航线对应一个拍摄方向,分别采集待拍摄区域的前向图像、后向图像、左向图像、右向图像和正射图像,因此,需要控制无人机分别沿着五条航线飞行,但这不利于无人机的续航。对于此,本申请实施例的飞行航线设计为一条,该条飞行航线可以为诸如图4A和图4B所示的飞行航线,也可以为其他,在无人机的一次飞行过程中,控制无人机上的云台进行姿态切换,以实现多个拍摄角度的拍摄,从而不用实现航线的反复巡飞,进而不仅有利于提高拍摄效率,还有利于降低无人机的能耗。
其中,控制无人机按照飞行航线飞行的过程可包括:控制拍摄装置的镜头与待拍摄区域之间的实时高度在预设高度范围内。在利用无人机进行测绘时,在测绘过程中,由于地势的起伏,会导致GSD不均匀,故通过控制拍摄装置的镜头与地面之间的高度,以维持GSD的均匀性,如地势变高,则无人机上升;地势降低,则无人机下降,确保测绘过程中,GSD大致相等。其中,针对固定翼无人机,其上升高度和下降高度 有限,故只能在固定翼无人机的上升高度或下降高度范围内控制固定翼无人机上升或下降,以尽可能地保持GSD一致。
本申请实施例不需要无人机在拍摄点悬停,为保证每个拍摄序列触发时,云台已经完成上一个拍摄序列,需要将飞行速度控制在无人机允许的最大飞行速度内。可结合云台转动性能计算最大飞行速度,可选的,无人机允许的最大飞行速度为基于各拍摄方向的航向间距和拍摄装置完成一个拍摄序列的拍摄并恢复至初始拍摄方向所需的时长确定,其中,各拍摄方向的航向间距大小相等,且航向间距为基于预设的地面分辨率、预设的航向重叠率和拍摄装置平行于无人机的飞行方向上的像元个数(即拍摄装置的图像传感器平行于无人机的飞行方向上的像元个数)确定。
示例性的,最大飞行速度V max的计算公式如下:
Figure PCTCN2020102249-appb-000003
公式(4)中,D 2为各拍摄方向的航向间距,T Gim为拍摄装置完成一个拍摄序列的拍摄并恢复至初始拍摄方向所需的时长。应当理解的是,最大飞行速度V max的计算方式不限于公式(4),还可以为其他。
示例性的,其中两个拍摄序列间各个拍摄方向的相对位置关系如图8所示,F1、F2为前拍方向的有效拍摄区域,D1、D2为正拍方向的有效拍摄区域,B1、B2为后拍方向的有效拍摄区域,R1、R2为右拍方向的有效拍摄区域,L1、L2为左拍方向的有效拍摄区域,D F、D D、D B、D R、D L分别为前拍方向、正拍方向、后拍方向、右拍方向、左拍方向的航向间距。本申请实施例中,D 2=D F=D D=D B=D R=D L
示例性的,前拍方向的航向间距D F的计算公式如下:
D F=GSD(1-γ course)n V    (5);
公式(5)中,γ course为预设的航向重叠率,n V为拍摄装置平行于无人机的飞行方向上的像元个数。应当理解的是,前拍方向的航向间距D F的计算方式不限于公式(5),还可以为其他。
其中,航向重叠率可以为默认数值大小,也可由用户设定。示例性的,航向重叠率由用户设定,例如,航向重叠率为用户通过无人机的控制装置输入,这种确定航向重叠率的方式可以满足不同的用户需求,灵活性强。为保证各拍摄方向的图像满足建模需求,可选的,航向重叠率大于或等于65%,并小于或等于80%。示例性的,航向重叠率为65%、70%、75%、80%或其他大于65%,并小于80%的数值大小。
下面,以前向图像为例分析系统误差对航向重叠率的影响。
假设地面分辨率GSD为2.5cm,航向重叠率γ course为70%,拍摄装置平行于无人机的飞行方向上的像元个数n V为5460,飞行速度为10m/s,则根据公式(5),确定前后两张前向图像之间的理论距离为:
2.5cm*(1-70%)*5460=40.95m;
由于云台转动速度波动、系统延时时间波动等因素,第二张前向图像的实际拍摄时间与理论拍摄时间的误差为0.5s(延时),则前后两张前向图像的实际距离为:
40.95m+10m/s*0.5=45.95m;
则将D F=45.95m、GSD=2.5cm、n V=5460代入公式(5),可以确定出前向图像的实际航向重叠率为66%,仍满足建模需求。
由于前拍方向、后拍方向的有效拍摄区域平行于航向的边长比正拍方向、右拍 方向、左拍方向长,实际拍摄时间与理论拍摄时间的误差对前拍方向、后拍方向的航向重叠率的影像更小。通过控制飞行速度、飞行方向、云台转动速度等参数(需要系统优化),或通过增加航向重叠率大小(会影响整体作业效率),可以保证各拍摄方向的图像的航向重叠率满足建模要求。
初始拍摄方向可以为拍摄序列中其中一个拍摄点对应的拍摄方向,示例性的,初始拍摄方向为拍摄序列的首个拍摄点的拍摄方向。可选的,每个拍摄序列的首个拍摄点的拍摄方向均相同,如每个拍摄序列的首个拍摄点的拍摄方向均为正拍方向;可选的,多个拍摄序列的首个拍摄点的拍摄方向至少部分不相同,如拍摄序列1的首个拍摄点的拍摄方向为左拍方向,拍摄序列2的首个拍摄点的拍摄方向为右拍方向,拍摄序列3的首个拍摄点的拍摄方向为左拍方向等等。其中,T Gim可为拍摄装置完成当前拍摄序列的拍摄并恢复至下一拍摄序列的初始拍摄方向所需的时长,这适用于每个拍摄序列的首个拍摄点的拍摄方向均相同或者多个拍摄序列的首个拍摄点的拍摄方向至少部分不相同的场景;当然,T Gim也可为拍摄装置完成当前拍摄序列的拍摄并恢复至当前拍摄序列的初始拍摄方向,这适用于每个拍摄序列的首个拍摄点的拍摄方向均相同的场景。
其中,在控制无人机上的云台切换姿态,使得云台上的拍摄装置在每一拍摄点均处于对应的拍摄方向时,可选的,获取无人机的实时姿态;确定无人机的实时姿态和下一拍摄点的拍摄方向之间的偏差;根据偏差控制无人机上的云台切换姿态,使得云台上的拍摄装置在每一拍摄点均处于对应的拍摄方向。本申请实施例的拍摄装置通过云台搭载在无人机的机体上,在机体的姿态变化较大时,可通过控制云台的姿态,使得在不同航点(拍摄序列)同一个方向的拍摄,利用云台姿态控制保证云台对地角度保持一致(或偏差很小),保证同一方向的照片序列重叠率保持均匀。
无人机与云台之间可采用不同方式配合以完成多拍摄方向的拍摄,示例性的,根据所述拍摄序列,在所述无人机从当前拍摄点飞向下一拍摄点的过程中,控制所述无人机上的云台切换姿态,使得所述云台上的拍摄装置在所述无人机到达每一拍摄点时均处于对应的拍摄方向的过程可包括:根据拍摄序列,发送拍摄触发信号至云台,以使得云台在无人机从当前拍摄点飞向下一拍摄点的过程中,进行姿态切换,使得云台上的拍摄装置在无人机到达每一拍摄点时均处于对应的拍摄方向。并且,拍摄触发信号还用于指示云台在拍摄装置处于对应的拍摄方向时,触发所述云台拍摄。本实施例中,无人机触发云台进入执行拍摄序列的程序,其中,拍摄序列的程序包括姿态切换和拍摄触发这两个步骤,姿态切换和拍摄触发均由云台完成,从而减少无人机触发信号处理过程的延时对作业效率的影响。其中,姿态切换是为了使得云台上的拍摄装置在无人机到达每一拍摄点时均处于对应的拍摄方向。当然,也可以由无人机直接控制云台进行姿态切换和/或由无人机直接触发拍摄装置拍摄。
本申请实施例中,云台和拍摄装置完成拍摄过程可包括:云台控制拍摄装置完成每一拍摄序列的拍摄,具体而言,云台根据拍摄序列进行姿态切换,使得云台上的拍摄装置在无人机到达每一拍摄序列的每一拍摄点时均处于对应的拍摄方向,并且,云台在拍摄装置处于对应的拍摄方向时,触发拍摄装置拍摄。
拍摄触发信号可以为定时拍摄触发信号,也可以为定距拍摄触发信号,也即,无人机可以采用定时拍或定距拍触发方式触发云台和拍摄装置完成拍摄过程。
示例性的,在一些实施例中,拍摄触发信号为定时拍摄触发信号,定时拍摄触发信号用于指示云台基于第一定时策略触发拍摄装置进行拍摄。其中,第一定时策略 可包括:拍摄装置完成每个拍摄序列的拍摄所需的时长为第一固定时长,如此,使得拍摄装置完成每个拍摄序列的拍摄所需的时长稳定。可选的,发送定时拍摄触发信号至云台的次数为一次,例如,可在云台控制拍摄装置完成首个拍摄序列的首个拍摄点的拍摄之前,发送定时拍摄触发信号至云台,云台在接收到定时触发信号后,依次转动至每个拍摄序列的各个拍摄方向,并定时触发拍摄装置进行拍摄,该方式只需无人机发送一次定时拍摄触发信号即可,无人机的控制较为简单。可以理解的,无人机发送定时拍摄触发信号至云台的次数也可为多次,如在云台控制拍摄装置完成每个拍摄序列的首个拍摄点的拍摄之前,分别发送定时拍摄触发信号至云台,云台在接收到定时拍摄触发信号后,依次转动至对应拍摄序列的各个拍摄方向,并定时触发拍摄装置完成对应拍摄序列中各个拍摄方向的拍摄。可以理解的,第一定时策略也可为其他。
进一步的,定时拍摄触发信号还可用于指示云台基于第二定时策略触发拍摄装置进行拍摄。其中,第二定时策略包括:拍摄装置完成同一拍摄序列中相邻拍摄点的拍摄所需的时长为第二固定时长,如此,使得拍摄装置完成同一拍摄序列中相邻拍摄点的拍摄所需的时长稳定。可以理解的,第二定时策略也可为其他。
在定时拍触发方式下,可采用不同策略发送拍摄序列至云台,示例性的,拍摄序列的数量为多个,在一些实施例中,在云台控制拍摄装置完成首个拍摄序列的首个拍摄点的拍摄之前,将所有拍摄序列一次性发送给云台,后续云台和拍摄装置配合完成拍摄的过程中,无人机无需再发送拍摄序列给云台;在另外一些实施例中,在拍摄装置完成当前拍摄序列的拍摄之后,发送下一拍摄序列至所述云台,也即,在云台控制拍摄装置完成每一拍摄序列的拍摄之后,发送该拍摄序列的下一拍摄序列至云台,以指示云台执行下一拍摄序列的拍摄。
在一些实施例中,拍摄触发信号为定距拍摄触发信号,定距拍摄触发信号可用于指示基于第一定距策略触发云台控制拍摄装置进行拍摄。其中,第一定距策略包括:相邻拍摄序列之间的间距为第一固定间距,如此,使得相邻拍摄序列之间的间距稳定。可选的,发送定距触发信号至云台的次数为多次,例如,可在云台控制拍摄装置完成每个拍摄序列的首个拍摄点的拍摄之前,分别发送定距拍摄触发信号至云台,云台在每次接收到定距拍摄触发信号后,先进行姿态切换,使得云台上的拍摄装置在无人机到达对应拍摄序列的每一拍摄点时均处于对应的拍摄方向,再在拍摄装置处于对应的拍摄方向时,触发拍摄装置拍摄。由于无人机触发云台存在延时,因此,采用第一定距策略触发云台控制拍摄装置进行拍摄的方式能够减小无人机触发延时对作业效率的影响。
进一步的,可选的,定距拍摄触发信号还可用于指示基于第二定距策略触发云台控制拍摄装置进行拍摄。其中,第二定距策略包括:同一拍摄序列中相邻拍摄点之间的间距为第二固定间距,如此,使得同一拍摄序列中相邻拍摄点之间的间距稳定。另外,由于每个拍摄序列中的拍摄方向未必完全相同,因此,可以在云台控制拍摄装置完成每个拍摄点的拍摄之前,分别发送定距拍摄触发信号至云台,以触发云台完成每一拍摄点的拍摄。可选的,无人机采用定距拍触发方式触发云台和拍摄装置完成拍摄过程,在云台和拍摄装置拍摄过程中,云台定时完成一次摆拍序列,也即,无人机到达每个航点时,触发云台进入拍摄程序,云台在进入拍摄程序后,定时触发拍摄装置在每个拍摄点进行拍摄,则相邻拍摄序列之间的间距(即相邻航点之间的间距)为第一固定间距,拍摄装置完成同一拍摄序列中相邻拍摄点的拍摄所需的时长为第三固定时长,第三固定时长的大小可根据需要设置,示例性的,第三固定时长为2秒。
在定距拍触发方式下,可采用不同策略发送拍摄序列至云台,示例性的,在一些实施例中,在云台控制拍摄装置完成每个拍摄序列的首个拍摄点的拍摄之前,发送该拍摄序列至所述云台,也即,在云台控制拍摄装置完成每一拍摄序列的拍摄之后,云台控制拍摄装置执行每一拍摄序列的下一拍摄序列的拍摄之前,发送该拍摄序列的下一拍摄序列至云台,以指示云台执行下一拍摄序列的拍摄;在另外一些实施例中,在云台控制拍摄装置完成每个拍摄点的拍摄之前,分别发送指示信号至云台,指示信号用于指示该拍摄点对应的云台的目标姿态或拍摄装置的拍摄方向,即在每个拍摄点均触发云台执行该拍摄点的拍摄。
此外,本申请实施例中,云台姿态切换的控制方式可根据云台的类型选择,以三轴云台为例,云台被配置为绕偏航轴、横滚轴和俯仰轴运动。可选的,可通过控制云台的横滚轴姿态、俯仰轴姿态和偏航轴姿态中的一个或多个来实现对云台姿态的切换。通常,云台的偏航轴不能整周转动,因此不会采用控制云台的偏航轴姿态方式来控制拍摄装置处于每个拍摄序列中不同拍摄点的拍摄方向,故本申请实施例中,控制云台的偏航姿态、横滚轴姿态和俯仰轴姿态中任两个,以控制云台切换姿态。
示例性的,控制偏航姿态和俯仰轴姿态,以控制云台切换姿态,可选的,云台的目标姿态表征为(俯仰姿态,横滚姿态,偏航姿态),前拍方向、正拍方向、后拍方向、右拍方向、左拍方向对应的云台目标姿态分别为:(-60°,0°,0°)、(-90°,0°,0°)、(-120°,0°,0°)、(-60°,0°,90°)、(-120°,0°,90°)。
示例性的,控制偏航姿态和横滚轴姿态,以控制云台切换姿态,可选的,云台目标姿态表征为(俯仰姿态,横滚姿态,偏航姿态),前拍方向、正拍方向、后拍方向、右拍方向、左拍方向对应的云台目标姿态分别为:(-90°,30°,0°)、(-90°,0°,0°)、(-90°,-30°,0°)、(-90°,-30°,90°)、(-90°,30°,90°)。
可以理解的是,上述列举的前拍方向、正拍方向、后拍方向、右拍方向、左拍方向对应的云台目标姿态只是其中两种,可以通过调整各个轴的转动角度和各个轴的前后转动顺序也可实现类似的拍摄方向的拍摄,但作业效率上有所差别。
示例性的,请参见图9,无人机沿着飞行航线80飞行,当无人机到达某个拍摄序列的首个拍摄点(如图9,该拍摄序列的首个拍摄点的拍摄方向为前拍方向)时,触发云台执行该拍摄序列的拍摄。为方便描述,将该拍摄序列称作为拍摄序列A,拍摄序列A的拍摄方向包括前拍方向、正拍方向、后拍方向、右拍方向和左拍方向。则云台执行拍摄序列A的拍摄的过程如下:
(1)、云台检测到拍摄装置完成上一拍摄序列的最后一个拍摄点的拍摄后,转动至拍摄序列A的前拍方向对应的云台的第一目标姿态,然后触发拍摄装置拍摄,获得前向图像;
(2)、在拍摄装置完成拍摄序列A的前向图像的拍摄后,继续转动至拍摄序列A的正拍方向对应的云台的第二目标姿态,然后触发拍摄装置拍摄,获得正射图像;
(3)、在拍摄装置完成拍摄序列A的正射图像的拍摄后,继续转动至拍摄序列A的后拍方向对应的云台的第三目标姿态,然后触发拍摄装置拍摄,获得后向图像;
(4)、在拍摄装置完成拍摄序列A的后向图像的拍摄后,继续转动至拍摄序列A的右拍方向对应的云台的第四目标姿态,然后触发拍摄装置拍摄,获得右向图像;
(5)、在拍摄装置完成拍摄序列A的右向图像的拍摄后,继续转动至拍摄序列A的左拍方向对应的云台的第五目标姿态,然后触发拍摄装置拍摄,获得左向图像。
至此,云台完成拍摄序列A的拍摄。
可以理解的,第一目标姿态、第二目标姿态、第三目标姿态、第四目标姿态、第五目标姿态均为预设目标姿态,可选的,第一目标姿态、第二目标姿态、第三目标姿态、第四目标姿态、第五目标姿态分别:(-60°,0°,0°)、(-90°,0°,0°)、(-120°,0°,0°)、(-60°,0°,90°)、(-120°,0°,90°),或者,第一目标姿态、第二目标姿态、第三目标姿态、第四目标姿态、第五目标姿态分别:(-90°,30°,0°)、(-90°,0°,0°)、(-90°,-30°,0°)、(-90°,-30°,90°)、(-90°,30°,90°)。
其中,云台是在其到达预设目标姿态且处于稳定状态时,触发拍摄装置拍摄。云台处于稳定状态可包括:云台相对预设方向的角度(如云台对地的角度)波动范围在预设角度范围内,即云台相对预设方向的角度的波动较小。
另外,云台触发拍摄装置拍摄之前,拍摄装置需处于可执行拍摄的状态。可选的,拍摄装置处于可执行拍摄的状态包括:拍摄装置的控制部处于可触发的状态,示例性的,拍摄装置为相机,控制部为相机的快门;可选的,拍摄装置处于可执行拍摄的状态包括:拍摄装置的缓存大于预设容量,即拍摄装置的缓存足够大以能够存储至少一张图像。
可选的,每个拍摄序列对应一个图像队列,可将每个拍摄序列中每一拍摄点的图像保存在对应的图像队列中;可选的,将同一拍摄方向的图像保存在同一图像队列,不同拍摄方向的图像保存在不同图像队列中,具体可根据需要选择图像的保存方式。
此外,需要说明的是,本申请实施例的拍摄控制方法不需要依赖于双轴或三轴云台,无人机配合单轴云台也可完成多拍摄方向的拍摄,如俯仰姿态可变的云台+井字航线规划可实现三个拍摄方向的拍摄、横滚姿态可变的云台+井字航线规划可实现三个个拍摄方向的拍摄、偏航姿态可变的云台+诸如图4A或图4B所示的飞行航线可实现三个或四个拍摄方向的拍摄。
另外,本申请实施例还可以提供一种拍摄控制方法,本申请实施例的拍摄控制方法的执行主体为无人机,如执行主体可为无人机的飞行控制器或设于无人机的其他控制器或飞行控制器或和设于无人机的其他控制器的组合。请参见图10,本申请实施例的拍摄控制方法可包括S1001~S1002:
在S1001中,接收无人机的控制装置发送的飞行航线及飞行航线上的每个航点对应的拍摄序列,其中,每个拍摄序列包括一个或多个连续的拍摄点,每个拍摄序列的一个或多个拍摄点的拍摄方向各不相同,且各拍摄方向的拍摄点均位于该拍摄方向的有效拍摄区域,有效拍摄区域为根据待拍摄区域的第一位置信息及外扩拍摄区域的第二位置信息确定,外扩拍摄区域为将待拍摄区域进行扩大获得,第二位置信息为根据第一位置信息确定;
在S1002中,基于飞行航线及每个航点对应的拍摄序列控制搭载在无人机上的拍摄装置进行拍摄。
对于图10所示实施例的具体说明可参见上述图2至图9所示实施例的内容,此处不再具体限定。
对应于上述实施例的拍摄控制方法,本申请实施例还提供一种拍摄控制装置,请参见图11,本申请实施例的拍摄控制装置可以包括存储装置和处理器,其中,处理器包括一个或多个。
存储装置,用于存储程序指令。所述存储装置存储所述拍摄控制方法的可执行指令计算机程序,所述存储装置可以包括至少一种类型的存储介质,存储介质包括闪存、硬盘、多媒体卡、卡型存储器(例如,SD或DX存储器等等)、随机访问存储器(RAM)、 静态随机访问存储器(SRAM)、只读存储器(ROM)、电可擦除可编程只读存储器(EEPROM)、可编程只读存储器(PROM)、磁性存储器、磁盘、光盘等等。而且,所述拍摄控制装置可以与通过网络连接执行存储器的存储功能的网络存储装置协作。存储器可以是拍摄控制装置的内部存储单元,例如拍摄控制装置的硬盘或内存。存储器也可以是拍摄控制装置的外部存储设备,例如拍摄控制装置上配备的插接式硬盘,智能存储卡(Smart Media Card,SMC),安全数字(Secure Digital,SD)卡,闪存卡(Flash Card)等。进一步的,存储器还可以既包括拍摄控制装置的内部存储单元也包括外部存储设备。存储器用于存储计算机程序以及设备所需的其他程序和数据。存储器还可以用于暂时地存储已经输出或者将要输出的数据。
在一些实施例中,一个或多个处理器,调用存储装置中存储的程序指令,当程序指令被执行时,一个或多个处理器单独地或共同地被配置成用于实施如下操作:获取待拍摄区域的第一位置信息及外扩拍摄区域的第二位置信息,外扩拍摄区域为将待拍摄区域进行扩大获得,第二位置信息为根据第一位置信息确定;根据第一位置信息和第二位置信息,确定不同拍摄方向的有效拍摄区域的第三位置信息;根据第三位置信息及预设的无人机的飞行航线,确定飞行航线上的每个航点对应的拍摄序列;其中,每个拍摄序列包括一个或多个连续的拍摄点形成一拍摄序列,每个拍摄序列的一个或多个拍摄点的拍摄方向各不相同,且各拍摄方向的拍摄点均位于该拍摄方向的有效拍摄区域。本实施例的处理器可以实现如本申请图2或图7所示实施例的拍摄控制方法,可参见上述图2或图7所示实施例的拍摄控制方法对本实施例的拍摄控制装置进行说明。
在一些实施例中,一个或多个处理器,调用存储装置中存储的程序指令,当程序指令被执行时,一个或多个处理器单独地或共同地被配置成用于实施如下操作:接收无人机的控制装置发送的飞行航线及飞行航线上的每个航点对应的拍摄序列;基于飞行航线及每个航点对应的拍摄序列控制搭载在无人机上的拍摄装置进行拍摄;其中,每个拍摄序列包括一个或多个连续的拍摄点,每个拍摄序列的一个或多个拍摄点的拍摄方向各不相同,且各拍摄方向的拍摄点均位于该拍摄方向的有效拍摄区域,有效拍摄区域为根据待拍摄区域的第一位置信息及外扩拍摄区域的第二位置信息确定,外扩拍摄区域为将待拍摄区域进行扩大获得,第二位置信息为根据第一位置信息确定。本实施例的处理器可以实现如本申请图10所示实施例的拍摄控制方法,可参见上述图10所示实施例的拍摄控制方法对本实施例的拍摄控制装置进行说明。
所述处理器可以是中央处理单元(Central Processing Unit,CPU),还可以是其他通用处理器、数字信号处理器(Digital Signal Processor,DSP)、专用集成电路(Application Specific Integrated Circuit,ASIC)、现场可编程门阵列(Field-Programmable Gate Array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。
需要说明的是,上述实体装置涉及的“发送”和“接收”等通信过程可以利用装置上的收发器或者通信接口执行,除“发送”和“接收”以外的其他数据处理过程可以由装置上的处理器执行。
进一步的,本申请实施例还一种无人机,请参见图1和图12,该无人机可包括机体100、云台300和上述实施例的拍摄控制装置。其中,云台300搭载在机体上,本实施例的云台300用于搭载拍摄装置200。拍摄控制装置由机体100支撑,并且,拍摄控制装置与云台300电连接。
此外,本申请实施例还提供一种计算机可读存储介质,其上存储有计算机程序,该程序被处理器执行时实现上述实施例的拍摄控制方法的步骤。
所述计算机可读存储介质可以是前述任一实施例所述的拍摄控制装置的内部存储单元,例如硬盘或内存。所述计算机可读存储介质也可以是拍摄控制装置的外部存储设备,例如所述设备上配备的插接式硬盘、智能存储卡(Smart Media Card,SMC)、SD卡、闪存卡(Flash Card)等。进一步的,所述计算机可读存储介质还可以既包括拍摄控制装置的内部存储单元也包括外部存储设备。所述计算机可读存储介质用于存储所述计算机程序以及所述拍摄控制装置所需的其他程序和数据,还可以用于暂时地存储已经输出或者将要输出的数据。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,是可以通过计算机程序来指令相关的硬件来完成,所述的程序可存储于一计算机可读取存储介质中,该程序在执行时,可包括如上述各方法的实施例的流程。其中,所述的存储介质可为磁碟、光盘、只读存储记忆体(Read-Only Memory,ROM)或随机存储记忆体(Random Access Memory,RAM)等。
以上所揭露的仅为本申请部分实施例而已,当然不能以此来限定本申请之权利范围,因此依本申请权利要求所作的等同变化,仍属本申请所涵盖的范围。

Claims (170)

  1. 一种拍摄控制方法,其特征在于,所述方法包括:
    获取待拍摄区域的第一位置信息及外扩拍摄区域的第二位置信息,所述外扩拍摄区域为将所述待拍摄区域进行扩大获得,所述第二位置信息为根据所述第一位置信息确定;
    根据所述第一位置信息和所述第二位置信息,确定不同拍摄方向的有效拍摄区域的第三位置信息;
    根据所述第三位置信息及预设的无人机的飞行航线,确定所述飞行航线上的每个航点对应的拍摄序列;
    其中,每个拍摄序列包括一个或多个连续的拍摄点,每个所述拍摄序列的一个或多个拍摄点的拍摄方向各不相同,且各拍摄方向的拍摄点均位于该拍摄方向的有效拍摄区域。
  2. 根据权利要求1所述的方法,其特征在于,所述外扩拍摄区域为将所述待拍摄区域的不同方向分别扩大第一预设距离获得的区域;
    所述第一预设距离为基于所述无人机的飞行高度及搭载在所述无人机上的拍摄装置的安装角度确定。
  3. 根据权利要求2所述的方法,其特征在于,所述拍摄方向包括以下中的至少两种:
    相对竖直方向倾斜且朝向所述无人机的前方的前拍方向、相对竖直方向倾斜且朝向所述无人机的后方的后拍方向、相对竖直方向倾斜且朝向所述无人机的左侧方向的左拍方向、相对竖直方向倾斜且朝向所述无人机的右侧方向的右拍方向或拍摄方向竖直朝下的正拍方向。
  4. 根据权利要求3所述的方法,其特征在于,所述前拍方向的有效拍摄区域、所述后拍方向的有效拍摄区域、所述左拍方向的有效拍摄区域、所述右拍方向的有效拍摄区域分别为:
    所述待拍摄区域向第一方向移动第二预设距离后获得的区域、所述待拍摄区域向第二方向移动所述第二预设距离后获得的区域、所述待拍摄区域向第三方向移动所述第二预设距离后获得的区域、所述待拍摄区域向第四方向移动所述第二预设距离后获得的区域;所述正拍方向的有效拍摄区域为所述待拍摄区域,其中,所述第一方向与所述第二方向相反,所述第三方向与所述第四方向相反。
  5. 根据权利要求4所述的方法,其特征在于,所述第一方向、所述第二方向、所述第三方向或所述第四方向与所述飞行航线的形状相关。
  6. 根据权利要求2所述的方法,其特征在于,所述飞行高度由用户设定。
  7. 根据权利要求6所述的方法,其特征在于,所述飞行高度为用户通过所述无人机的控制装置输入。
  8. 根据权利要求2所述的方法,其特征在于,所述飞行高度为根据搭载在所述无人机上的拍摄装置的参数和预设的地面分辨率确定。
  9. 根据权利要求8所述的方法,其特征在于,所述拍摄装置的参数包括所述拍摄装置的焦距和所述拍摄装置的图像传感器的单个像素边长。
  10. 根据权利要求1所述的方法,其特征在于,所述飞行航线包括多条相互平行的子航线,相邻子航线的其中一侧相连,以形成一条飞行航线;
    所述飞行航线的确定过程包括:
    根据预设的地面分辨率、预设的旁向重叠率及搭载在所述无人机上的拍摄装置垂直于所述无人机的飞行方向上的像元个数,确定飞行航线中相邻两条子航线间的旁向间距;
    根据所述第二位置信息及所述旁向间距,确定所述飞行航线。
  11. 根据权利要求10所述的方法,其特征在于,所述外扩拍摄区域为方形,所述飞行航线的起始航点为所述外扩拍摄区域的任一边角位置,所述子航线平行于所述外扩拍摄区域的其中一条边。
  12. 根据权利要求10所述的方法,其特征在于,所述旁向重叠率由用户设定。
  13. 根据权利要求1所述的方法,其特征在于,所述第一位置信息由用户设定;或者,
    所述待拍摄区域通过导入外部文件确定,所述外部文件记录有所述第一位置信息。
  14. 根据权利要求1所述的方法,其特征在于,搭载在所述无人机上的拍摄装置完成每个所述拍摄序列的拍摄所需的时长为第一固定时长。
  15. 根据权利要求14所述的方法,其特征在于,所述拍摄装置完成同一拍摄序列中相邻拍摄点的拍摄所需的时长为第二固定时长。
  16. 根据权利要求1所述的方法,其特征在于,相邻拍摄序列之间的间距为第一固定间距。
  17. 根据权利要求16所述的方法,其特征在于,同一拍摄序列中相邻拍摄点之间的间距为第二固定间距。
  18. 根据权利要求1所述的方法,其特征在于,所述拍摄点中的初始拍摄点为:所述无人机按照所述飞行航线飞行时的起始飞行位置;或者,
    所述拍摄点中的初始拍摄点为:所述飞行航线的初始航点。
  19. 根据权利要求1所述的方法,其特征在于,所述获取待拍摄区域的第一位置信息及外扩拍摄区域的第二位置信息之前,还包括:
    获取到指示进入倾斜拍摄模式的触发指令;
    进入所述倾斜拍摄模式。
  20. 根据权利要求1至19任一项所述的方法,其特征在于,所述方法的执行主体为所述无人机的控制装置。
  21. 根据权利要求20所述的方法,其特征在于,所述获取外扩拍摄区域的第二位置信息,包括:
    根据所述第一位置信息,确定外扩拍摄区域的第二位置信息。
  22. 根据权利要求20所述的方法,其特征在于,所述获取外扩拍摄区域的第二位置信息之后,还包括:
    根据所述第二位置信息,规划所述无人机的飞行航线。
  23. 根据权利要求22所述的方法,其特征在于,还包括:
    发送所述飞行航线至所述无人机;
    所述根据所述第三位置信息及预设的无人机的飞行航线,确定所述飞行航线上的每个航点对应的拍摄序列之后,还包括:
    发送每个航点对应的拍摄序列至所述无人机,使得所述无人机基于所述飞行航线及每个航点对应的拍摄序列控制搭载在所述无人机上的拍摄装置进行拍摄。
  24. 根据权利要求1至19任一项所述的方法,其特征在于,所述方法的执行主体为所述无人机。
  25. 根据权利要求24所述的方法,其特征在于,所述第一位置信息由所述无人机的控制装置发送。
  26. 根据权利要求24所述的方法,其特征在于,所述第二位置信息由所述无人机的控制装置发送。
  27. 根据权利要求24所述的方法,其特征在于,所述获取外扩拍摄区域的第二位置信息,包括:
    根据所述第一位置信息,确定外扩拍摄区域的第二位置信息。
  28. 根据权利要求24所述的方法,其特征在于,所述飞行航线由所述无人机的控制装置基于所述第二位置信息进行规划;或者,
    所述飞行航线由所述无人机基于所述第二位置信息进行规划。
  29. 根据权利要求24所述的方法,其特征在于,还包括:
    基于所述飞行航线及每个航点对应的拍摄序列控制搭载在所述无人机上的拍摄装置进行拍摄。
  30. 根据权利要求29所述的方法,其特征在于,所述基于所述飞行航线及所述拍摄序列控制搭载在所述无人机上的拍摄装置进行拍摄,包括:
    控制所述无人机按照所述飞行航线飞行;
    根据所述拍摄序列,在所述无人机从当前拍摄点飞向下一拍摄点的过程中,控制所述无人机上的云台切换姿态,使得所述云台上的拍摄装置在所述无人机到达每一拍摄点时均处于对应的拍摄方向;
    获取所述拍摄装置在每一拍摄点所拍摄的图像。
  31. 根据权利要求30所述的方法,其特征在于,所述拍摄装置进行拍摄不影响所述无人机的飞行。
  32. 根据权利要求30所述的方法,其特征在于,所述根据所述拍摄序列,在所述无人机从当前拍摄点飞向下一拍摄点的过程中,控制所述无人机上的云台切换姿态,使得所述云台上的拍摄装置在所述无人机到达每一拍摄点时均处于对应的拍摄方向,包括:
    根据所述拍摄序列,发送拍摄触发信号至所述云台,以使得所述云台在所述无人机从当前拍摄点飞向下一拍摄点的过程中,进行姿态切换,使得所述云台上的拍摄装置在所述无人机到达每一拍摄点时均处于对应的拍摄方向;
    其中,所述拍摄触发信号还用于指示所述云台在所述拍摄装置处于所述对应的拍摄方向时,触发所述云台拍摄。
  33. 根据权利要求32所述的方法,其特征在于,所述拍摄触发信号为定时拍摄触发信号,所述定时拍摄触发信号用于指示所述云台基于第一定时策略触发所述拍摄装置进行拍摄;
    其中,所述第一定时策略包括:所述拍摄装置完成每个所述拍摄序列的拍摄所需的时长为第一固定时长。
  34. 根据权利要求33所述的方法,其特征在于,所述定时拍摄触发信号还用于指示所述云台基于第二定时策略触发所述拍摄装置进行拍摄;
    其中,所述第二定时策略包括:所述拍摄装置完成同一拍摄序列中相邻拍摄点的拍摄所需的时长为第二固定时长。
  35. 根据权利要求33所述的方法,其特征在于,发送所述定时拍摄触发信号至所述云台的次数为一次,所述发送拍摄触发信号至所述云台,包括:
    在所述云台控制所述拍摄装置完成首个所述拍摄序列的首个拍摄点的拍摄之前,发送定时拍摄触发信号至所述云台。
  36. 根据权利要求33至35任一项所述的方法,其特征在于,所述拍摄序列的数量为多个,所述方法还包括:
    在所述云台控制所述拍摄装置完成首个所述拍摄序列的首个拍摄点的拍摄之前,将所有拍摄序列一次性发送给所述云台。
  37. 根据权利要求33至35任一项所述的方法,其特征在于,所述拍摄序列的数量为多个,所述方法还包括:
    在所述拍摄装置完成当前拍摄序列的拍摄之后,发送下一拍摄序列至所述云台。
  38. 根据权利要求32所述的方法,其特征在于,所述拍摄触发信号为定距拍摄触发信号,所述定距拍摄触发信号用于指示基于第一定距策略触发所述云台控制所述拍摄装置进行拍摄;
    其中,所述第一定距策略包括:相邻拍摄序列之间的间距为第一固定间距。
  39. 根据权利要求38所述的方法,其特征在于,所述定距拍摄触发信号还用于指示基于第二定距策略触发所述云台控制所述拍摄装置进行拍摄;
    其中,所述第二定距策略包括:同一拍摄序列中相邻拍摄点之间的间距为第二固定间距。
  40. 根据权利要求38所述的方法,其特征在于,发送所述定距触发信号至所述云台的次数为多次,所述发送拍摄触发信号至所述云台,包括:
    在所述云台控制所述拍摄装置完成每个所述拍摄序列的首个拍摄点的拍摄之前,分别发送定距拍摄触发信号至所述云台。
  41. 根据权利要求40所述的方法,其特征在于,所述方法还包括:
    在所述云台控制所述拍摄装置完成每个所述拍摄序列的首个拍摄点的拍摄之前,发送该拍摄序列至所述云台。
  42. 根据权利要求39所述的方法,其特征在于,所述发送所述定距触发信号至所述云台的次数为多次,所述发送拍摄触发信号至所述云台,包括:
    在所述云台控制所述拍摄装置完成每个所述拍摄点的拍摄之前,分别发送定距拍摄触发信号至所述云台。
  43. 根据权利要求42所述的方法,其特征在于,所述方法还包括:
    在所述云台控制所述拍摄装置完成每个所述拍摄点的拍摄之前,分别发送指示信号至所述云台,所述指示信号用于指示该拍摄点对应的所述云台的目标姿态或所述拍摄装置的拍摄方向。
  44. 根据权利要求30所述的方法,其特征在于,所述无人机允许的最大飞行速度为基于各拍摄方向的航向间距和所述拍摄装置完成一个所述拍摄序列的拍摄并恢复至初始拍摄方向所需的时长确定;
    其中,各拍摄方向的航向间距大小相等,且所述航向间距为基于预设的地面分辨率、预设的航向重叠率和所述拍摄装置平行于所述无人机的飞行方向上的像元个数确定。
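A hedged numeric reading of claim 44: the along-track spacing follows from the ground resolution, the course overlap ratio and the along-track pixel count, and the maximum speed is that spacing divided by the time one full shooting sequence needs (all values assumed for illustration).

```latex
% Illustrative relations only; all numeric values are assumed.
D_{\text{course}} = \mathrm{GSD}\cdot n_{\parallel}\cdot\bigl(1-\eta_{\text{course}}\bigr),
\qquad v_{\max} = \frac{D_{\text{course}}}{T_{\text{seq}}};
\quad \text{e.g. } \mathrm{GSD}=2\,\mathrm{cm/px},\ n_{\parallel}=3000,\
\eta_{\text{course}}=0.8,\ T_{\text{seq}}=4\,\mathrm{s}
\;\Rightarrow\; D_{\text{course}}=12\,\mathrm{m},\ v_{\max}=3\,\mathrm{m/s}.
```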
  45. 根据权利要求44所述的方法,其特征在于,所述航向重叠率由用户设定。
  46. 根据权利要求44所述的方法,其特征在于,所述初始拍摄方向为所述拍摄序列中其中一个拍摄点对应的拍摄方向。
  47. 根据权利要求30所述的方法,其特征在于,所述控制所述无人机按照所述飞行航线飞行,包括:
    控制所述拍摄装置的镜头与所述待拍摄区域之间的实时高度在预设高度范围内。
  48. 根据权利要求30所述的方法,其特征在于,所述控制所述无人机上的云台切换姿态,使得所述云台上的拍摄装置在每一拍摄点均处于对应的拍摄方向,具体包括:
    获取所述无人机的实时姿态;
    确定所述无人机的实时姿态和下一拍摄点的拍摄方向之间的偏差;
    根据所述偏差控制所述无人机上的云台切换姿态,使得所述云台上的拍摄装置在每一拍摄点均处于对应的拍摄方向。
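The deviation named in claim 48 might be computed as a simple angular difference between the aircraft's real-time attitude and the shooting direction of the next point; the sketch below, including the wrap-to-(−180°, 180°] convention, is an assumption rather than the claimed implementation.

```python
# Illustrative sketch only: angular deviation used to pre-rotate the gimbal.
def wrap180(angle_deg: float) -> float:
    """Wrap an angle into (-180, 180] degrees."""
    return (angle_deg + 180.0) % 360.0 - 180.0


def gimbal_correction(uav_yaw: float, uav_pitch: float,
                      target_yaw: float, target_pitch: float) -> tuple:
    """Yaw/pitch the gimbal must add on top of the aircraft attitude so the
    camera points along the next shooting direction."""
    return wrap180(target_yaw - uav_yaw), target_pitch - uav_pitch


if __name__ == "__main__":
    print(gimbal_correction(uav_yaw=170.0, uav_pitch=2.0,
                            target_yaw=-170.0, target_pitch=-45.0))  # (20.0, -47.0)
```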
  49. 根据权利要求30所述的方法,其特征在于,所述云台为三轴云台,所述云台被配置为绕偏航轴、横滚轴和俯仰轴运动;
    所述控制所述无人机上的云台切换姿态,包括:
    控制所述云台的偏航姿态、横滚轴姿态和俯仰轴姿态中任两个,以控制所述云台切换姿态。
  50. 一种拍摄控制装置,其特征在于,所述装置包括:
    存储装置,用于存储程序指令;以及
    一个或多个处理器,调用所述存储装置中存储的程序指令,当所述程序指令被执行时,所述一个或多个处理器单独地或共同地被配置成用于实施如下操作:
    获取待拍摄区域的第一位置信息及外扩拍摄区域的第二位置信息,所述外扩拍摄区域为将所述待拍摄区域进行扩大获得,所述第二位置信息为根据所述第一位置信息确定;
    根据所述第一位置信息和所述第二位置信息,确定不同拍摄方向的有效拍摄区域的第三位置信息;
    根据所述第三位置信息及预设的无人机的飞行航线,确定所述飞行航线上的每个航点对应的拍摄序列;
    其中,每个拍摄序列包括一个或多个连续的拍摄点,每个所述拍摄序列的一个或多个拍摄点的拍摄方向各不相同,且各拍摄方向的拍摄点均位于该拍摄方向的有效拍摄区域。
  51. 根据权利要求50所述的装置,其特征在于,所述外扩拍摄区域为将所述待拍摄区域的不同方向分别扩大第一预设距离获得的区域;
    所述第一预设距离为基于所述无人机的飞行高度及搭载在所述无人机上的拍摄装置的安装角度确定。
  52. 根据权利要求51所述的装置,其特征在于,所述拍摄方向包括以下中的至少两种:
    相对竖直方向倾斜且朝向所述无人机的前方的前拍方向、相对竖直方向倾斜且朝向所述无人机的后方的后拍方向、相对竖直方向倾斜且朝向所述无人机的左侧方向的左拍方向、相对竖直方向倾斜且朝向所述无人机的右侧方向的右拍方向或拍摄方向竖直朝下的正拍方向。
  53. 根据权利要求52所述的装置,其特征在于,所述前拍方向的有效拍摄区域、所述后拍方向的有效拍摄区域、所述左拍方向的有效拍摄区域、所述右拍方向的有效拍摄区域分别为:
    所述待拍摄区域向第一方向移动第二预设距离后获得的区域、所述待拍摄区域向第二方向移动所述第二预设距离后获得的区域、所述待拍摄区域向第三方向移动所述第二预设距离后获得的区域、所述待拍摄区域向第四方向移动所述第二预设距离后获得的区域;所述正拍方向的有效拍摄区域为所述待拍摄区域,其中,所述第一方向与所述第二方向相反,所述第三方向与所述第四方向相反。
  54. 根据权利要求53所述的装置,其特征在于,所述第一方向、所述第二方向、所述第三方向或所述第四方向与所述飞行航线的形状相关。
  55. 根据权利要求51所述的装置,其特征在于,所述飞行高度由用户设定。
  56. 根据权利要求55所述的装置,其特征在于,所述飞行高度为用户通过所述无人机的控制装置输入。
  57. 根据权利要求51所述的装置,其特征在于,所述飞行高度为根据搭载在所述无人机上的拍摄装置的参数和预设的地面分辨率确定。
  58. 根据权利要求57所述的装置,其特征在于,所述拍摄装置的参数包括所述拍摄装置的焦距和所述拍摄装置的图像传感器的单个像素边长。
  59. 根据权利要求50所述的装置,其特征在于,所述飞行航线包括多条相互平行的子航线,相邻子航线的其中一侧相连,以形成一条飞行航线;
    所述一个或多个处理器在确定飞行航线时,单独地或共同地被进一步配置成用于实施如下操作:
    根据预设的地面分辨率、预设的旁向重叠率及搭载在所述无人机上的拍摄装置垂直于所述无人机的飞行方向上的像元个数,确定飞行航线中相邻两条子航线间的旁向间距;
    根据所述第二位置信息及所述旁向间距,确定所述飞行航线。
  60. 根据权利要求59所述的装置,其特征在于,所述外扩拍摄区域为方形,所述飞行航线的起始航点为所述外扩拍摄区域的任一边角位置,所述子航线平行于所述外扩拍摄区域的其中一条边。
  61. 根据权利要求59所述的装置,其特征在于,所述旁向重叠率由用户设定。
  62. 根据权利要求50所述的装置,其特征在于,所述第一位置信息由用户设定;或者,
    所述待拍摄区域通过导入外部文件确定,所述外部文件记录有所述第一位置信息。
  63. 根据权利要求50所述的装置,其特征在于,搭载在所述无人机上的拍摄装置完成每个所述拍摄序列的拍摄所需的时长为第一固定时长。
  64. 根据权利要求63所述的装置,其特征在于,所述拍摄装置完成同一拍摄序列中相邻拍摄点的拍摄所需的时长为第二固定时长。
  65. 根据权利要求50所述的装置,其特征在于,相邻拍摄序列之间的间距为第一固定间距。
  66. 根据权利要求65所述的装置,其特征在于,同一拍摄序列中相邻拍摄点之间的间距为第二固定间距。
  67. 根据权利要求50所述的装置,其特征在于,所述拍摄点中的初始拍摄点为:所述无人机按照所述飞行航线飞行时的起始飞行位置;或者,
    所述拍摄点中的初始拍摄点为:所述飞行航线的初始航点。
  68. 根据权利要求50所述的装置,其特征在于,所述一个或多个处理器在获取待拍摄区域的第一位置信息之前,单独地或共同地还被配置成用于实施如下操作:
    获取到指示进入倾斜拍摄模式的触发指令;
    进入所述倾斜拍摄模式。
  69. 根据权利要求50至68任一项所述的装置,其特征在于,所述拍摄控制装置设于所述无人机的控制装置。
  70. 根据权利要求69所述的装置,其特征在于,所述一个或多个处理器在获取外扩拍摄区域的第二位置信息时,单独地或共同地被进一步配置成用于实施如下操作:
    根据所述第一位置信息,确定外扩拍摄区域的第二位置信息。
  71. 根据权利要求69所述的装置,其特征在于,所述一个或多个处理器在获取外扩拍摄区域的第二位置信息之后,单独地或共同地还被配置成用于实施如下操作:
    根据所述第二位置信息,规划所述无人机的飞行航线。
  72. 根据权利要求69所述的装置,其特征在于,所述一个或多个处理器单独地或共同地还被配置成用于实施如下操作:
    发送所述飞行航线至所述无人机;
    所述一个或多个处理器在根据所述第三位置信息及预设的无人机的飞行航线,确定所述飞行航线上的每个航点对应的拍摄序列之后,单独地或共同地还被配置成用于实施如下操作:
    发送所述拍摄序列至所述无人机,使得所述无人机基于所述飞行航线及所述拍摄序列控制搭载在所述无人机上的拍摄装置进行拍摄。
  73. 根据权利要求50至68任一项所述的装置,其特征在于,所述拍摄控制装置设于所述无人机。
  74. 根据权利要求73所述的装置,其特征在于,所述第一位置信息由所述无人机的控制装置发送。
  75. 根据权利要求73所述的装置,其特征在于,所述第二位置信息由所述无人机的控制装置发送。
  76. 根据权利要求73所述的装置,其特征在于,所述一个或多个处理器在获取外扩拍摄区域的第二位置信息时,单独地或共同地被进一步配置成用于实施如下操作:
    根据所述第一位置信息,确定外扩拍摄区域的第二位置信息。
  77. 根据权利要求73所述的装置,其特征在于,所述飞行航线由所述无人机的控制装置基于所述第二位置信息进行规划;或者,
    所述飞行航线由所述无人机基于所述第二位置信息进行规划。
  78. 根据权利要求73所述的装置,其特征在于,所述一个或多个处理器单独地或共同地还被配置成用于实施如下操作:
    基于所述飞行航线及所述拍摄序列控制搭载在所述无人机上的拍摄装置进行拍摄。
  79. 根据权利要求78所述的装置,其特征在于,所述一个或多个处理器在基于所述飞行航线及所述拍摄序列控制搭载在所述无人机上的拍摄装置进行拍摄时,单独地或共同地被进一步配置成用于实施如下操作:
    控制所述无人机按照所述飞行航线飞行;
    根据所述拍摄序列,在所述无人机从当前拍摄点飞向下一拍摄点的过程中,控制所述无人机上的云台切换姿态,使得所述云台上的拍摄装置在所述无人机到达每一拍摄点时均处于对应的拍摄方向;
    获取所述拍摄装置在每一拍摄点所拍摄的图像。
  80. 根据权利要求79所述的装置,其特征在于,所述拍摄装置进行拍摄不影响所述无人机的飞行。
  81. 根据权利要求79所述的装置,其特征在于,所述一个或多个处理器在根据所述拍摄序列,在所述无人机从当前拍摄点飞向下一拍摄点的过程中,控制所述无人机上的云台切换姿态,使得所述云台上的拍摄装置在所述无人机到达每一拍摄点时均处于对应的拍摄方向时,单独地或共同地被进一步配置成用于实施如下操作:
    根据所述拍摄序列,发送拍摄触发信号至所述云台,以使得所述云台在所述无人机从当前拍摄点飞向下一拍摄点的过程中,进行姿态切换,使得所述云台上的拍摄装置在所述无人机到达每一拍摄点时均处于对应的拍摄方向;
    其中,所述拍摄触发信号还用于指示所述云台在所述拍摄装置处于所述对应的拍摄方向时,触发所述云台拍摄。
  82. 根据权利要求81所述的装置,其特征在于,所述拍摄触发信号为定时拍摄触发信号,所述定时拍摄触发信号用于指示所述云台基于第一定时策略触发所述拍摄装置进行拍摄;
    其中,所述第一定时策略包括:所述拍摄装置完成每个所述拍摄序列的拍摄所需的时长为第一固定时长。
  83. 根据权利要求82所述的装置,其特征在于,所述定时拍摄触发信号还用于指示所述云台基于第二定时策略触发所述拍摄装置进行拍摄;
    其中,所述第二定时策略包括:所述拍摄装置完成同一拍摄序列中相邻拍摄点的拍摄所需的时长为第二固定时长。
  84. 根据权利要求82所述的装置,其特征在于,所述一个或多个处理器发送所述定时拍摄触发信号至所述云台的次数为一次,所述一个或多个处理器在发送拍摄触发信号至所述云台时,单独地或共同地被进一步配置成用于实施如下操作:
    在所述云台控制所述拍摄装置完成首个所述拍摄序列的首个拍摄点的拍摄之前,发送定时拍摄触发信号至所述云台。
  85. 根据权利要求82至84任一项所述的装置,其特征在于,所述拍摄序列的数量为多个,所述一个或多个处理器单独地或共同地还被配置成用于实施如下操作:
    在所述云台控制所述拍摄装置完成首个所述拍摄序列的首个拍摄点的拍摄之前,将所有拍摄序列一次性发送给所述云台。
  86. 根据权利要求82至84任一项所述的装置,其特征在于,所述拍摄序列的数量为多个,所述一个或多个处理器单独地或共同地还被配置成用于实施如下操作:
    在所述拍摄装置完成当前拍摄序列的拍摄之后,发送下一拍摄序列至所述云台。
  87. 根据权利要求81所述的装置,其特征在于,所述拍摄触发信号为定距拍摄触发信号,所述定距拍摄触发信号用于指示基于第一定距策略触发所述云台控制所述拍摄装置进行拍摄;
    其中,所述第一定距策略包括:相邻拍摄序列之间的间距为第一固定间距。
  88. 根据权利要求87所述的装置,其特征在于,所述定距拍摄触发信号还用于指示基于第二定距策略触发所述云台控制所述拍摄装置进行拍摄;
    其中,所述第二定距策略包括:同一拍摄序列中相邻拍摄点之间的间距为第二固定间距。
  89. 根据权利要求87所述的装置,其特征在于,所述一个或多个处理器发送所述定距触发信号至所述云台的次数为多次,所述一个或多个处理器在发送拍摄触发信号至所述云台时,单独地或共同地被进一步配置成用于实施如下操作:
    在所述云台控制所述拍摄装置完成每个所述拍摄序列的首个拍摄点的拍摄之前,分别发送定距拍摄触发信号至所述云台。
  90. 根据权利要求89所述的装置,其特征在于,所述一个或多个处理器单独地或共同地还被配置成用于实施如下操作:
    在所述云台控制所述拍摄装置完成每个所述拍摄序列的首个拍摄点的拍摄之前,发送该拍摄序列至所述云台。
  91. 根据权利要求88所述的装置,其特征在于,所述一个或多个处理器发送所述定距触发信号至所述云台的次数为多次,所述一个或多个处理器在发送拍摄触发信号至所述云台时,单独地或共同地被进一步配置成用于实施如下操作:
    在所述云台控制所述拍摄装置完成每个所述拍摄点的拍摄之前,分别发送定距拍摄触发信号至所述云台。
  92. 根据权利要求91所述的装置,其特征在于,所述一个或多个处理器单独地或共同地还被配置成用于实施如下操作:
    在所述云台控制所述拍摄装置完成每个所述拍摄点的拍摄之前,分别发送指示信号至所述云台,所述指示信号用于指示该拍摄点对应的所述云台的目标姿态或所述拍摄装置的拍摄方向。
  93. 根据权利要求79所述的装置,其特征在于,所述无人机允许的最大飞行速度为基于各拍摄方向的航向间距和所述拍摄装置完成一个所述拍摄序列的拍摄并恢复至初始拍摄方向所需的时长确定;
    其中,各拍摄方向的航向间距大小相等,且所述航向间距为基于预设的地面分辨率、预设的航向重叠率和所述拍摄装置平行于所述无人机的飞行方向上的像元个数确定。
  94. 根据权利要求93所述的装置,其特征在于,所述航向重叠率由用户设定。
  95. 根据权利要求93所述的装置,其特征在于,所述初始拍摄方向为所述拍摄序列中其中一个拍摄点对应的拍摄方向。
  96. 根据权利要求79所述的装置,其特征在于,所述一个或多个处理器在控制所述无人机按照所述飞行航线飞行时,单独地或共同地被进一步配置成用于实施如下操作:
    控制所述拍摄装置的镜头与所述待拍摄区域之间的实时高度在预设高度范围内。
  97. 根据权利要求79所述的装置,其特征在于,所述一个或多个处理器在控制所述无人机上的云台切换姿态,使得所述云台上的拍摄装置在每一拍摄点均处于对应的拍摄方向时,单独地或共同地被进一步配置成用于实施如下操作:
    获取所述无人机的实时姿态;
    确定所述无人机的实时姿态和下一拍摄点的拍摄方向之间的偏差;
    根据所述偏差控制所述无人机上的云台切换姿态,使得所述云台上的拍摄装置在每一拍摄点均处于对应的拍摄方向。
  98. 根据权利要求79所述的装置,其特征在于,所述云台为三轴云台,所述云台被配置为绕偏航轴、横滚轴和俯仰轴运动;
    所述一个或多个处理器在控制所述无人机上的云台切换姿态时,单独地或共同地被进一步配置成用于实施如下操作:
    控制所述云台的偏航姿态、横滚轴姿态和俯仰轴姿态中任两个,以控制所述云台切换姿态。
  99. 一种无人机,其特征在于,包括:
    机体;
    云台,搭载在所述机体上,所述云台用于搭载拍摄装置;和
    权利要求50至68、73至98任一项所述的拍摄控制装置,由所述机体支撑,所述拍摄控制装置与所述云台电连接。
  100. 一种计算机可读存储介质,其上存储有计算机程序,其特征在于,该程序被处理器执行时实现权利要求1至49任一项所述的拍摄控制方法。
  101. 一种拍摄控制方法,其特征在于,所述方法包括:
    接收无人机的控制装置发送的飞行航线及所述飞行航线上的每个航点对应的拍摄序列;
    基于所述飞行航线及每个航点对应的拍摄序列控制搭载在所述无人机上的拍摄装置进行拍摄;
    其中,每个拍摄序列包括一个或多个连续的拍摄点,每个所述拍摄序列的一个或多个拍摄点的拍摄方向各不相同,且各拍摄方向的拍摄点均位于该拍摄方向的有效拍摄区域,所述有效拍摄区域为根据待拍摄区域的第一位置信息及外扩拍摄区域的第二位置信息确定,所述外扩拍摄区域为将所述待拍摄区域进行扩大获得,所述第二位置信息为根据所述第一位置信息确定。
  102. 根据权利要求101所述的方法,其特征在于,所述外扩拍摄区域为将所述待拍摄区域的不同方向分别扩大第一预设距离获得的区域;
    所述第一预设距离为基于所述无人机的飞行高度及搭载在所述无人机上的拍摄装置的安装角度确定。
  103. 根据权利要求102所述的方法,其特征在于,所述拍摄方向包括以下中的至少两种:
    相对竖直方向倾斜且朝向所述无人机的前方的前拍方向、相对竖直方向倾斜且朝向所述无人机的后方的后拍方向、相对竖直方向倾斜且朝向所述无人机的左侧方向的左拍方向、相对竖直方向倾斜且朝向所述无人机的右侧方向的右拍方向或拍摄方向竖直朝下的正拍方向。
  104. 根据权利要求103所述的方法,其特征在于,所述前拍方向的有效拍摄区域、所述后拍方向的有效拍摄区域、所述左拍方向的有效拍摄区域、所述右拍方向的有效拍摄区域分别为:
    所述待拍摄区域向第一方向移动第二预设距离后获得的区域、所述待拍摄区域向第二方向移动所述第二预设距离后获得的区域、所述待拍摄区域向第三方向移动所述第二预设距离后获得的区域、所述待拍摄区域向第四方向移动所述第二预设距离后获得的区域;所述正拍方向的有效拍摄区域为所述待拍摄区域,其中,所述第一方向与所述第二方向相反,所述第三方向与所述第四方向相反。
  105. 根据权利要求104所述的方法,其特征在于,所述第一方向、所述第二方向、所述第三方向或所述第四方向与所述飞行航线的形状相关。
  106. 根据权利要求102所述的方法,其特征在于,所述飞行高度由用户设定。
  107. 根据权利要求106所述的方法,其特征在于,所述飞行高度为用户通过所述无人机的控制装置输入。
  108. 根据权利要求102所述的方法,其特征在于,所述飞行高度为根据搭载在所述无人机上的拍摄装置的参数和预设的地面分辨率确定。
  109. 根据权利要求108所述的方法,其特征在于,所述拍摄装置的参数包括所述拍摄装置的焦距和所述拍摄装置的图像传感器的单个像素边长。
  110. 根据权利要求101所述的方法,其特征在于,所述飞行航线包括多条相互平行的子航线,相邻子航线的其中一侧相连,以形成一条飞行航线;
    所述飞行航线的确定过程包括:
    所述无人机的控制装置根据预设的地面分辨率、预设的旁向重叠率及搭载在所述无人机上的拍摄装置垂直于所述无人机的飞行方向上的像元个数,确定飞行航线中相邻两条子航线间的旁向间距;根据所述第二位置信息及所述旁向间距,确定所述飞行航线。
  111. 根据权利要求110所述的方法,其特征在于,所述外扩拍摄区域为方形,所述飞行航线的起始航点为所述外扩拍摄区域的任一边角位置,所述子航线平行于所述外扩拍摄区域的其中一条边。
  112. 根据权利要求110所述的方法,其特征在于,所述旁向重叠率由用户设定。
  113. 根据权利要求101所述的方法,其特征在于,所述第一位置信息由用户设定;或者,
    所述待拍摄区域通过导入外部文件确定,所述外部文件记录有所述第一位置信息。
  114. 根据权利要求101所述的方法,其特征在于,所述接收无人机的控制装置发送的飞行航线及所述飞行航线上的每个航点对应的拍摄序列之前,还包括:
    获取到指示进入倾斜拍摄模式的触发指令;
    进入所述倾斜拍摄模式。
  115. 根据权利要求101所述的方法,其特征在于,所述基于所述飞行航线及所述拍摄序列控制搭载在所述无人机上的拍摄装置进行拍摄,包括:
    控制所述无人机按照所述飞行航线飞行;
    根据所述拍摄序列,在所述无人机从当前拍摄点飞向下一拍摄点的过程中,控制所述无人机上的云台切换姿态,使得所述云台上的拍摄装置在所述无人机到达每一拍摄点时均处于对应的拍摄方向;
    获取所述拍摄装置在每一拍摄点所拍摄的图像。
  116. 根据权利要求115所述的方法,其特征在于,所述拍摄装置进行拍摄不影响所述无人机的飞行。
  117. 根据权利要求115所述的方法,其特征在于,所述根据所述拍摄序列,在所述无人机从当前拍摄点飞向下一拍摄点的过程中,控制所述无人机上的云台切换姿态,使得所述云台上的拍摄装置在所述无人机到达每一拍摄点时均处于对应的拍摄方向,包括:
    根据所述拍摄序列,发送拍摄触发信号至所述云台,以使得所述云台在所述无人机从当前拍摄点飞向下一拍摄点的过程中,进行姿态切换,使得所述云台上的拍摄装置在所述无人机到达每一拍摄点时均处于对应的拍摄方向;
    其中,所述拍摄触发信号还用于指示所述云台在所述拍摄装置处于所述对应的拍摄方向时,触发所述云台拍摄。
  118. 根据权利要求117所述的方法,其特征在于,所述拍摄触发信号为定时拍摄触发信号,所述定时拍摄触发信号用于指示所述云台基于第一定时策略触发所述拍摄装置进行拍摄;
    其中,所述第一定时策略包括:所述拍摄装置完成每个所述拍摄序列的拍摄所需的时长为第一固定时长。
  119. 根据权利要求118所述的方法,其特征在于,所述定时拍摄触发信号还用于指示所述云台基于第二定时策略触发所述拍摄装置进行拍摄;
    其中,所述第二定时策略包括:所述拍摄装置完成同一拍摄序列中相邻拍摄点的拍摄所需的时长为第二固定时长。
  120. 根据权利要求118所述的方法,其特征在于,发送所述定时拍摄触发信号至所述云台的次数为一次,所述发送拍摄触发信号至所述云台,包括:
    在所述云台控制所述拍摄装置完成首个所述拍摄序列的首个拍摄点的拍摄之前,发送定时拍摄触发信号至所述云台。
  121. 根据权利要求118至120任一项所述的方法,其特征在于,所述拍摄序列的数量为多个,所述方法还包括:
    在所述云台控制所述拍摄装置完成首个所述拍摄序列的首个拍摄点的拍摄之前,将所有拍摄序列一次性发送给所述云台。
  122. 根据权利要求118至120任一项所述的方法,其特征在于,所述拍摄序列的数量为多个,所述方法还包括:
    在所述拍摄装置完成当前拍摄序列的拍摄之后,发送下一拍摄序列至所述云台。
  123. 根据权利要求117所述的方法,其特征在于,所述拍摄触发信号为定距拍摄触发信号,所述定距拍摄触发信号用于指示基于第一定距策略触发所述云台控制所述拍摄装置进行拍摄;
    其中,所述第一定距策略包括:相邻拍摄序列之间的间距为第一固定间距。
  124. 根据权利要求123所述的方法,其特征在于,所述定距拍摄触发信号还用于指示基于第二定距策略触发所述云台控制所述拍摄装置进行拍摄;
    其中,所述第二定距策略包括:同一拍摄序列中相邻拍摄点之间的间距为第二固定间距。
  125. 根据权利要求123所述的方法,其特征在于,发送所述定距触发信号至所述云台的次数为多次,所述发送拍摄触发信号至所述云台,包括:
    在所述云台控制所述拍摄装置完成每个所述拍摄序列的首个拍摄点的拍摄之前,分别发送定距拍摄触发信号至所述云台。
  126. 根据权利要求125所述的方法,其特征在于,所述方法还包括:
    在所述云台控制所述拍摄装置完成每个所述拍摄序列的首个拍摄点的拍摄之前,发送该拍摄序列至所述云台。
  127. 根据权利要求124所述的方法,其特征在于,所述发送所述定距触发信号至所述云台的次数为多次,所述发送拍摄触发信号至所述云台,包括:
    在所述云台控制所述拍摄装置完成每个所述拍摄点的拍摄之前,分别发送定距拍摄触发信号至所述云台。
  128. 根据权利要求127所述的方法,其特征在于,所述方法还包括:
    在所述云台控制所述拍摄装置完成每个所述拍摄点的拍摄之前,分别发送指示信号至所述云台,所述指示信号用于指示该拍摄点对应的所述云台的目标姿态或所述拍摄装置的拍摄方向。
  129. 根据权利要求115所述的方法,其特征在于,所述无人机允许的最大飞行速度为基于各拍摄方向的航向间距和所述拍摄装置完成一个所述拍摄序列的拍摄并恢复至初始拍摄方向所需的时长确定;
    其中,各拍摄方向的航向间距大小相等,且所述航向间距为基于预设的地面分辨率、预设的航向重叠率和所述拍摄装置平行于所述无人机的飞行方向上的像元个数确定。
  130. 根据权利要求129所述的方法,其特征在于,所述航向重叠率由用户设定。
  131. 根据权利要求129所述的方法,其特征在于,所述初始拍摄方向为所述拍摄序列中其中一个拍摄点对应的拍摄方向。
  132. 根据权利要求115所述的方法,其特征在于,所述控制所述无人机按照所述飞行航线飞行,包括:
    控制所述拍摄装置的镜头与所述待拍摄区域之间的实时高度在预设高度范围内。
  133. 根据权利要求115所述的方法,其特征在于,所述控制所述无人机上的云台切换姿态,使得所述云台上的拍摄装置在每一拍摄点均处于对应的拍摄方向,具体包括:
    获取所述无人机的实时姿态;
    确定所述无人机的实时姿态和下一拍摄点的拍摄方向之间的偏差;
    根据所述偏差控制所述无人机上的云台切换姿态,使得所述云台上的拍摄装置在每一拍摄点均处于对应的拍摄方向。
  134. 根据权利要求115所述的方法,其特征在于,所述云台为三轴云台,所述云台被配置为绕偏航轴、横滚轴和俯仰轴运动;
    所述控制所述无人机上的云台切换姿态,包括:
    控制所述云台的偏航姿态、横滚轴姿态和俯仰轴姿态中任两个,以控制所述云台切换姿态。
  135. 一种拍摄控制装置,其特征在于,所述装置包括:
    存储装置,用于存储程序指令;以及
    一个或多个处理器,调用所述存储装置中存储的程序指令,当所述程序指令被执行时,所述一个或多个处理器单独地或共同地被配置成用于实施如下操作:
    接收无人机的控制装置发送的飞行航线及所述飞行航线上的每个航点对应的拍摄序列;
    基于所述飞行航线及每个航点对应的拍摄序列控制搭载在所述无人机上的拍摄装置进行拍摄;
    其中,每个拍摄序列包括一个或多个连续的拍摄点,每个所述拍摄序列的一个或多个拍摄点的拍摄方向各不相同,且各拍摄方向的拍摄点均位于该拍摄方向的有效拍摄区域,所述有效拍摄区域为根据待拍摄区域的第一位置信息及外扩拍摄区域的第二位置信息确定,所述外扩拍摄区域为将所述待拍摄区域进行扩大获得,所述第二位置信息为根据所述第一位置信息确定。
  136. 根据权利要求135所述的装置,其特征在于,所述外扩拍摄区域为将所述待拍摄区域的不同方向分别扩大第一预设距离获得的区域;
    所述第一预设距离为基于所述无人机的飞行高度及搭载在所述无人机上的拍摄装置的安装角度确定。
  137. 根据权利要求136所述的装置,其特征在于,所述拍摄方向包括以下中的至少两种:
    相对竖直方向倾斜且朝向所述无人机的前方的前拍方向、相对竖直方向倾斜且朝向所述无人机的后方的后拍方向、相对竖直方向倾斜且朝向所述无人机的左侧方向的左拍方向、相对竖直方向倾斜且朝向所述无人机的右侧方向的右拍方向或拍摄方向竖直朝下的正拍方向。
  138. 根据权利要求137所述的装置,其特征在于,所述前拍方向的有效拍摄区域、所述后拍方向的有效拍摄区域、所述左拍方向的有效拍摄区域、所述右拍方向的有效拍摄区域分别为:
    所述待拍摄区域向第一方向移动第二预设距离后获得的区域、所述待拍摄区域向第二方向移动所述第二预设距离后获得的区域、所述待拍摄区域向第三方向移动所述第二预设距离后获得的区域、所述待拍摄区域向第四方向移动所述第二预设距离后获得的区域;所述正拍方向的有效拍摄区域为所述待拍摄区域,其中,所述第一方向与所述第二方向相反,所述第三方向与所述第四方向相反。
  139. 根据权利要求138所述的装置,其特征在于,所述第一方向、所述第二方向、所述第三方向或所述第四方向与所述飞行航线的形状相关。
  140. 根据权利要求136所述的装置,其特征在于,所述飞行高度由用户设定。
  141. 根据权利要求140所述的装置,其特征在于,所述飞行高度为用户通过所述无人机的控制装置输入。
  142. 根据权利要求136所述的装置,其特征在于,所述飞行高度为根据搭载在所述无人机上的拍摄装置的参数和预设的地面分辨率确定。
  143. 根据权利要求142所述的装置,其特征在于,所述拍摄装置的参数包括所述拍摄装置的焦距和所述拍摄装置的图像传感器的单个像素边长。
  144. 根据权利要求135所述的装置,其特征在于,所述飞行航线包括多条相互平行的子航线,相邻子航线的其中一侧相连,以形成一条飞行航线;
    所述飞行航线的确定过程包括:
    所述无人机的控制装置根据预设的地面分辨率、预设的旁向重叠率及搭载在所述无人机上的拍摄装置垂直于所述无人机的飞行方向上的像元个数,确定飞行航线中相邻两条子航线间的旁向间距;根据所述第二位置信息及所述旁向间距,确定所述飞行航线。
  145. 根据权利要求144所述的装置,其特征在于,所述外扩拍摄区域为方形,所述飞行航线的起始航点为所述外扩拍摄区域的任一边角位置,所述子航线平行于所述外扩拍摄区域的其中一条边。
  146. 根据权利要求144所述的装置,其特征在于,所述旁向重叠率由用户设定。
  147. 根据权利要求135所述的装置,其特征在于,所述第一位置信息由用户设定;或者,
    所述待拍摄区域通过导入外部文件确定,所述外部文件记录有所述第一位置信息。
  148. 根据权利要求135所述的装置,其特征在于,所述一个或多个处理器在接收无人机的控制装置发送的飞行航线及所述飞行航线上的每个航点对应的拍摄序列之前,单独地或共同地还被配置成用于实施如下操作:
    获取到指示进入倾斜拍摄模式的触发指令;
    进入所述倾斜拍摄模式。
  149. 根据权利要求135所述的装置,其特征在于,所述一个或多个处理器在基于所述飞行航线及所述拍摄序列控制搭载在所述无人机上的拍摄装置进行拍摄时,单独地或共同地被进一步配置成用于实施如下操作:
    控制所述无人机按照所述飞行航线飞行;
    根据所述拍摄序列,在所述无人机从当前拍摄点飞向下一拍摄点的过程中,控制所述无人机上的云台切换姿态,使得所述云台上的拍摄装置在所述无人机到达每一拍摄点时均处于对应的拍摄方向;
    获取所述拍摄装置在每一拍摄点所拍摄的图像。
  150. 根据权利要求149所述的装置,其特征在于,所述拍摄装置进行拍摄不影响所述无人机的飞行。
  151. 根据权利要求149所述的装置,其特征在于,所述一个或多个处理器在根据所述拍摄序列,在所述无人机从当前拍摄点飞向下一拍摄点的过程中,控制所述无人机上的云台切换姿态,使得所述云台上的拍摄装置在所述无人机到达每一拍摄点时均处于对应的拍摄方向时,单独地或共同地被进一步配置成用于实施如下操作:
    根据所述拍摄序列,发送拍摄触发信号至所述云台,以使得所述云台在所述无人机从当前拍摄点飞向下一拍摄点的过程中,进行姿态切换,使得所述云台上的拍摄装置在所述无人机到达每一拍摄点时均处于对应的拍摄方向;
    其中,所述拍摄触发信号还用于指示所述云台在所述拍摄装置处于所述对应的拍摄方向时,触发所述云台拍摄。
  152. 根据权利要求151所述的装置,其特征在于,所述拍摄触发信号为定时拍摄触发信号,所述定时拍摄触发信号用于指示所述云台基于第一定时策略触发所述拍摄装置进行拍摄;
    其中,所述第一定时策略包括:所述拍摄装置完成每个所述拍摄序列的拍摄所需的时长为第一固定时长。
  153. 根据权利要求152所述的装置,其特征在于,所述定时拍摄触发信号还用于指示所述云台基于第二定时策略触发所述拍摄装置进行拍摄;
    其中,所述第二定时策略包括:所述拍摄装置完成同一拍摄序列中相邻拍摄点的拍摄所需的时长为第二固定时长。
  154. 根据权利要求152所述的装置,其特征在于,发送所述定时拍摄触发信号至所述云台的次数为一次,所述一个或多个处理器在发送拍摄触发信号至所述云台时,单独地或共同地被进一步配置成用于实施如下操作:
    在所述云台控制所述拍摄装置完成首个所述拍摄序列的首个拍摄点的拍摄之前,发送定时拍摄触发信号至所述云台。
  155. 根据权利要求152至154任一项所述的装置,其特征在于,所述拍摄序列的数量为多个,所述一个或多个处理器单独地或共同地还被配置成用于实施如下操作:
    在所述云台控制所述拍摄装置完成首个所述拍摄序列的首个拍摄点的拍摄之前,将所有拍摄序列一次性发送给所述云台。
  156. 根据权利要求152至154任一项所述的装置,其特征在于,所述拍摄序列的数量为多个,所述一个或多个处理器单独地或共同地还被配置成用于实施如下操作:
    在所述拍摄装置完成当前拍摄序列的拍摄之后,发送下一拍摄序列至所述云台。
  157. 根据权利要求151所述的装置,其特征在于,所述拍摄触发信号为定距拍摄触发信号,所述定距拍摄触发信号用于指示基于第一定距策略触发所述云台控制所述拍摄装置进行拍摄;
    其中,所述第一定距策略包括:相邻拍摄序列之间的间距为第一固定间距。
  158. 根据权利要求157所述的装置,其特征在于,所述定距拍摄触发信号还用于指示基于第二定距策略触发所述云台控制所述拍摄装置进行拍摄;
    其中,所述第二定距策略包括:同一拍摄序列中相邻拍摄点之间的间距为第二固定间距。
  159. 根据权利要求157所述的装置,其特征在于,发送所述定距触发信号至所述云台的次数为多次,所述一个或多个处理器在发送拍摄触发信号至所述云台时,单独地或共同地被进一步配置成用于实施如下操作:
    在所述云台控制所述拍摄装置完成每个所述拍摄序列的首个拍摄点的拍摄之前,分别发送定距拍摄触发信号至所述云台。
  160. 根据权利要求159所述的装置,其特征在于,所述一个或多个处理器单独地或共同地还被配置成用于实施如下操作:
    在所述云台控制所述拍摄装置完成每个所述拍摄序列的首个拍摄点的拍摄之前,发送该拍摄序列至所述云台。
  161. 根据权利要求158所述的装置,其特征在于,所述一个或多个处理器发送所述定距触发信号至所述云台的次数为多次,所述一个或多个处理器在发送拍摄触发信号至所述云台时,单独地或共同地被进一步配置成用于实施如下操作:
    在所述云台控制所述拍摄装置完成每个所述拍摄点的拍摄之前,分别发送定距拍摄触发信号至所述云台。
  162. 根据权利要求161所述的装置,其特征在于,所述一个或多个处理器单独地或共同地还被配置成用于实施如下操作:
    在所述云台控制所述拍摄装置完成每个所述拍摄点的拍摄之前,分别发送指示信号至所述云台,所述指示信号用于指示该拍摄点对应的所述云台的目标姿态或所述拍摄装置的拍摄方向。
  163. 根据权利要求151所述的装置,其特征在于,所述无人机允许的最大飞行速度为基于各拍摄方向的航向间距和所述拍摄装置完成一个所述拍摄序列的拍摄并恢复至初始拍摄方向所需的时长确定;
    其中,各拍摄方向的航向间距大小相等,且所述航向间距为基于预设的地面分辨率、预设的航向重叠率和所述拍摄装置平行于所述无人机的飞行方向上的像元个数确定。
  164. 根据权利要求163所述的装置,其特征在于,所述航向重叠率由用户设定。
  165. 根据权利要求163所述的装置,其特征在于,所述初始拍摄方向为所述拍摄序列中其中一个拍摄点对应的拍摄方向。
  166. 根据权利要求151所述的装置,其特征在于,所述一个或多个处理器在控制所述无人机按照所述飞行航线飞行时,单独地或共同地被进一步配置成用于实施如下操作:
    控制所述拍摄装置的镜头与所述待拍摄区域之间的实时高度在预设高度范围内。
  167. 根据权利要求151所述的装置,其特征在于,所述一个或多个处理器在控制所述无人机上的云台切换姿态,使得所述云台上的拍摄装置在每一拍摄点均处于对应的拍摄方向时,单独地或共同地被进一步配置成用于实施如下操作:
    获取所述无人机的实时姿态;
    确定所述无人机的实时姿态和下一拍摄点的拍摄方向之间的偏差;
    根据所述偏差控制所述无人机上的云台切换姿态,使得所述云台上的拍摄装置在每一拍摄点均处于对应的拍摄方向。
  168. 根据权利要求151所述的装置,其特征在于,所述云台为三轴云台,所述云台被配置为绕偏航轴、横滚轴和俯仰轴运动;
    所述一个或多个处理器在控制所述无人机上的云台切换姿态时,单独地或共同地被进一步配置成用于实施如下操作:
    控制所述云台的偏航姿态、横滚轴姿态和俯仰轴姿态中任两个,以控制所述云台切换姿态。
  169. 一种无人机,其特征在于,包括:
    机体;
    云台,搭载在所述机体上,所述云台用于搭载拍摄装置;和
    权利要求135至168任一项所述的拍摄控制装置,由所述机体支撑,所述拍摄控制装置与所述云台电连接。
  170. 一种计算机可读存储介质,其上存储有计算机程序,其特征在于,该程序被处理器执行时实现权利要求101至134任一项所述的拍摄控制方法。
PCT/CN2020/102249 2020-07-16 2020-07-16 拍摄控制方法和装置、无人机及计算机可读存储介质 WO2022011623A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202311469626.2A CN117641107A (zh) 2020-07-16 2020-07-16 拍摄控制方法和装置
PCT/CN2020/102249 WO2022011623A1 (zh) 2020-07-16 2020-07-16 拍摄控制方法和装置、无人机及计算机可读存储介质
CN202080032440.9A CN113875222B (zh) 2020-07-16 2020-07-16 拍摄控制方法和装置、无人机及计算机可读存储介质

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/102249 WO2022011623A1 (zh) 2020-07-16 2020-07-16 拍摄控制方法和装置、无人机及计算机可读存储介质

Publications (1)

Publication Number Publication Date
WO2022011623A1 true WO2022011623A1 (zh) 2022-01-20

Family

ID=78982120

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/102249 WO2022011623A1 (zh) 2020-07-16 2020-07-16 拍摄控制方法和装置、无人机及计算机可读存储介质

Country Status (2)

Country Link
CN (2) CN117641107A (zh)
WO (1) WO2022011623A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117470199B (zh) * 2023-12-27 2024-03-15 天津云圣智能科技有限责任公司 一种摆动摄影控制的方法、装置、存储介质及电子设备

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106767706B (zh) * 2016-12-09 2019-05-14 中山大学 一种无人机勘查交通事故现场的航拍图像采集方法及系统
US10364027B2 (en) * 2017-10-24 2019-07-30 Loveland Innovations, LLC Crisscross boustrophedonic flight patterns for UAV scanning and imaging
EP3746745A4 (en) * 2018-01-29 2021-08-11 AeroVironment, Inc. PROCEDURES AND SYSTEMS FOR DETERMINING FLIGHT PLANS FOR VERTICAL TAKING OFF AND LANDING AIRCRAFT (VTOL)
EP3885871B1 (en) * 2018-11-21 2023-03-15 Guangzhou Xaircraft Technology Co., Ltd Surveying and mapping system, surveying and mapping method and apparatus, device and medium
CN111373339A (zh) * 2019-05-17 2020-07-03 深圳市大疆创新科技有限公司 飞行任务生成方法、控制终端、无人飞行器及存储介质

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10364026B1 (en) * 2015-09-21 2019-07-30 Amazon Technologies, Inc. Track and tether vehicle position estimation
CN107504957A (zh) * 2017-07-12 2017-12-22 天津大学 利用无人机多视角摄像快速进行三维地形模型构建的方法
CN109032165A (zh) * 2017-07-21 2018-12-18 广州极飞科技有限公司 无人机航线的生成方法和装置
CN110771141A (zh) * 2018-11-19 2020-02-07 深圳市大疆创新科技有限公司 拍摄方法和无人机
CN111226185A (zh) * 2019-04-22 2020-06-02 深圳市大疆创新科技有限公司 飞行航线的生成方法、控制装置及无人机系统

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114777744A (zh) * 2022-04-25 2022-07-22 中国科学院古脊椎动物与古人类研究所 一种古生物领域的地质测量方法、装置及电子设备
CN114777744B (zh) * 2022-04-25 2024-03-08 中国科学院古脊椎动物与古人类研究所 一种古生物领域的地质测量方法、装置及电子设备
CN114935942A (zh) * 2022-05-20 2022-08-23 无锡海纳智能科技有限公司 一种分布式光伏电站巡检航线的确定方法及电子设备

Also Published As

Publication number Publication date
CN113875222A (zh) 2021-12-31
CN117641107A (zh) 2024-03-01
CN113875222B (zh) 2023-11-24

Similar Documents

Publication Publication Date Title
CN110771141B (zh) 拍摄方法和无人机
WO2022011623A1 (zh) 拍摄控制方法和装置、无人机及计算机可读存储介质
US11377211B2 (en) Flight path generation method, flight path generation system, flight vehicle, program, and storage medium
JP6765512B2 (ja) 飛行経路生成方法、情報処理装置、飛行経路生成システム、プログラム及び記録媒体
US11042997B2 (en) Panoramic photographing method for unmanned aerial vehicle and unmanned aerial vehicle using the same
WO2018120350A1 (zh) 对无人机进行定位的方法及装置
WO2020237471A1 (zh) 飞行航线生成方法、终端和无人机
WO2018120351A1 (zh) 一种对无人机进行定位的方法及装置
WO2017181513A1 (zh) 无人机的飞行控制方法和装置
US11122209B2 (en) Three-dimensional shape estimation method, three-dimensional shape estimation system, flying object, program and recording medium
WO2020237422A1 (zh) 航测方法、飞行器及存储介质
CN111699454B (zh) 一种飞行规划方法及相关设备
CN110720023B (zh) 一种对摄像机的参数处理方法、装置及图像处理设备
CN111247389B (zh) 关于拍摄设备的数据处理方法、装置及图像处理设备
JP2021117047A (ja) 無人飛行体を用いた写真測量方法および無人飛行体を用いた写真測量システム
JP6265576B1 (ja) 撮像制御装置、影位置特定装置、撮像システム、移動体、撮像制御方法、影位置特定方法、及びプログラム
CN111344650B (zh) 信息处理装置、飞行路径生成方法、程序以及记录介质
WO2021056411A1 (zh) 航线调整方法、地面端设备、无人机、系统和存储介质
WO2021051220A1 (zh) 一种点云融合方法、设备、系统及存储介质
CN112313942A (zh) 一种进行图像处理和框架体控制的控制装置
CN111788457A (zh) 形状推断装置、形状推断方法、程序以及记录介质
WO2020087382A1 (zh) 一种定位方法、设备、飞行器及计算机可读存储介质
JP2019158515A (ja) 無人飛行体、情報処理装置、画像情報取得方法、画像情報取得プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20945182

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20945182

Country of ref document: EP

Kind code of ref document: A1