WO2019227289A1 - Time-lapse shooting control method and device

Time-lapse shooting control method and device

Info

Publication number
WO2019227289A1
Authority
WO
WIPO (PCT)
Prior art keywords
time-lapse shooting
drone
Prior art date
Application number
PCT/CN2018/088715
Other languages
English (en)
French (fr)
Inventor
颜江
李劲松
刘雨奇
吴洪强
张然
陈福财
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2018/088715
Priority to CN201880031253.1A (publication CN110771137A)
Publication of WO2019227289A1



Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules

Description

  • Embodiments of the present invention relate to the technical field of drones, and in particular, to a method and device for controlling time-lapse shooting.
  • Time-lapse photography, also called time-lapse video, is a time-compressed shooting technique: a group of photos or video frames is captured over a long period and then concatenated, so that a process lasting minutes, hours, or even days or years is compressed into a short video for playback.
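  • As a simple illustration of the compression described above (all numbers and the playback frame rate below are illustrative assumptions, not values from this disclosure), the relationship between capture interval, capture duration, and playback duration can be computed as follows:

```python
# Illustrative sketch: how time-lapse compresses real time into playback time.
# All values here are assumptions for illustration, not taken from the patent.

capture_interval_s = 5.0        # one photo every 5 seconds
capture_duration_s = 2 * 3600   # shoot for 2 hours
playback_fps = 25.0             # assumed playback frame rate of the final video

num_frames = int(capture_duration_s / capture_interval_s)   # 1440 photos
playback_duration_s = num_frames / playback_fps             # 57.6 seconds of video

print(f"{num_frames} photos -> {playback_duration_s:.1f} s of time-lapse video")
```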
  • Conventionally, the shooting device is fixed on a stable bearing mechanism, such as a tripod. Because the bearing mechanism is limited in how it can move and be installed, the application scenarios of time-lapse video shooting are restricted.
  • Embodiments of the present invention provide a time-lapse shooting control method and device, which are used to perform time-lapse shooting with a drone and thereby expand the application scenarios of time-lapse video shooting.
  • an embodiment of the present invention provides a time-lapse shooting control method, which is applied to a control terminal of a drone, and the method includes:
  • the time-lapse shooting parameter setting operation is detected through the interactive device
  • an embodiment of the present invention provides a time-lapse shooting control method, which is applied to a drone, and the method includes:
  • the shooting device of the drone is controlled to perform time-lapse shooting according to the time-lapse shooting parameters.
  • an embodiment of the present invention provides a control terminal, including:
  • a processor configured to determine a time-lapse shooting parameter according to the time-lapse shooting parameter setting operation detected by the interactive device; and control the drone to perform time-lapse shooting according to the time-lapse shooting parameter.
  • an embodiment of the present invention provides a drone, including:
  • a communication device for receiving a time-lapse shooting parameter sent by a control terminal, where the time-lapse shooting parameter is determined by the control terminal by detecting a time-lapse shooting parameter setting operation;
  • a processor configured to control the shooting device of the drone according to the time-lapse shooting parameter to perform time-lapse shooting.
  • an embodiment of the present invention provides a readable storage medium on which a computer program is stored; when the computer program is executed, it implements the time-lapse shooting control method according to the first aspect or the second aspect.
  • In the time-lapse shooting control method and device provided by the embodiments of the present invention, the control terminal detects the time-lapse shooting parameter setting operation through the interactive device, determines the time-lapse shooting parameters according to that operation, and then controls the drone to perform time-lapse shooting according to the time-lapse shooting parameters. Therefore, the user can control the drone to perform time-lapse shooting by operating the control terminal, and time-lapse shooting by the drone can be adapted to various shooting application scenarios, bringing a different shooting experience to the user.
  • FIG. 1 is a schematic architecture diagram of a drone according to an embodiment of the present invention
  • FIG. 2 is a flowchart of a time-lapse shooting control method according to an embodiment of the present invention
  • FIG. 3 is a flowchart of a time-lapse shooting control method according to another embodiment of the present invention.
  • FIG. 4 is a schematic diagram of a time-lapse shooting scene of a target object provided by a drone according to an embodiment of the present invention
  • FIG. 5 is a schematic structural diagram of a control terminal according to an embodiment of the present invention.
  • FIG. 6 is a schematic structural diagram of a drone provided by an embodiment of the present invention.
  • FIG. 7 is a schematic structural diagram of a time-lapse shooting system according to an embodiment of the present invention.
  • When a component is referred to as being "fixed to" another component, it can be directly on the other component or an intervening component may be present. When a component is considered to be "connected to" another component, it can be directly connected to the other component or an intervening component may be present at the same time.
  • Embodiments of the present invention provide a method and a device for controlling a drone, and a drone.
  • The drone may be a rotorcraft, for example, a multi-rotor aircraft propelled through the air by multiple propulsion devices.
  • Embodiments of the present invention are not limited thereto.
  • FIG. 1 is a schematic architecture diagram of an unmanned flight system according to an embodiment of the present invention. This embodiment is described by taking a rotary wing drone as an example.
  • The unmanned flight system 100 may include an unmanned aerial vehicle 110, a display device 130, and a control terminal 140.
  • The UAV 110 may include a power system 150, a flight control system 160, a frame, and a gimbal 120 carried on the frame.
  • the drone 110 may perform wireless communication with the control terminal 140 and the display device 130.
  • The frame may include a fuselage and a landing stand (also called landing gear).
  • the fuselage may include a center frame and one or more arms connected to the center frame. One or more arms extend radially from the center frame.
  • The landing stand is connected to the fuselage and is used to support the UAV 110 when it lands.
  • The power system 150 may include one or more electronic speed controllers (ESCs) 151, one or more propellers 153, and one or more motors 152 corresponding to the one or more propellers 153, wherein the motor 152 is connected between the electronic speed controller 151 and the propeller 153, and the motor 152 and the propeller 153 are arranged on the arm of the drone 110. The electronic speed controller 151 is used to receive the driving signal generated by the flight control system 160 and to provide a driving current to the motor 152 according to the driving signal, so as to control the rotation speed of the motor 152.
  • The motor 152 is used to drive the propeller to rotate, so as to provide power for the flight of the drone 110, and this power enables the drone 110 to move with one or more degrees of freedom.
  • the drone 110 may rotate about one or more rotation axes.
  • the rotation axis may include a roll axis (Roll), a yaw axis (Yaw), and a pitch axis (Pitch).
  • the motor 152 may be a DC motor or an AC motor.
  • the motor 152 may be a brushless motor or a brushed motor.
  • the flight control system 160 may include a flight controller 161 and a sensing system 162.
  • the sensing system 162 is used to measure the attitude information of the drone, that is, the position information and status information of the drone 110 in space, such as three-dimensional position, three-dimensional angle, three-dimensional velocity, three-dimensional acceleration, and three-dimensional angular velocity.
  • the sensing system 162 may include, for example, at least one of a gyroscope, an ultrasonic sensor, an electronic compass, an Inertial Measurement Unit (IMU), a vision sensor, a global navigation satellite system, and a barometer.
  • the global navigation satellite system may be a Global Positioning System (GPS).
  • the flight controller 161 is used to control the flight of the drone 110.
  • the flight controller 161 may control the flight of the drone 110 according to the attitude information measured by the sensing system 162. It should be understood that the flight controller 161 may control the drone 110 according to a pre-programmed program instruction, and may also control the drone 110 by responding to one or more control instructions from the control terminal 140.
  • the gimbal 120 may include a motor 122.
  • the gimbal is used to carry the photographing device 123.
  • the flight controller 161 may control the movement of the gimbal 120 through the motor 122.
  • The gimbal 120 may further include a controller for controlling the movement of the gimbal 120 by controlling the motor 122.
  • the gimbal 120 may be independent of the drone 110 or may be a part of the drone 110.
  • the motor 122 may be a DC motor or an AC motor.
  • the motor 122 may be a brushless motor or a brushed motor.
  • the gimbal can be located on top of the drone or on the bottom of the drone.
  • the photographing device 123 may be, for example, a device for capturing an image, such as a camera or a video camera, and the photographing device 123 may communicate with the flight controller and perform shooting under the control of the flight controller.
  • The photographing device 123 of this embodiment includes at least a photosensitive element, such as a complementary metal-oxide-semiconductor (CMOS) sensor or a charge-coupled device (CCD) sensor. It can be understood that the shooting device 123 can also be directly fixed on the drone 110, so that the gimbal 120 can be omitted.
  • The display device 130 is located on the ground side of the unmanned flight system 100, can communicate wirelessly with the UAV 110, and can be used to display the attitude information of the UAV 110. In addition, an image captured by the photographing device may also be displayed on the display device 130. It should be understood that the display device 130 may be an independent device or may be integrated in the control terminal 140.
  • the control terminal 140 is located on the ground side of the unmanned flight system 100 and can communicate with the unmanned aerial vehicle 110 in a wireless manner for remotely controlling the unmanned aerial vehicle 110.
  • FIG. 2 is a flowchart of a time-lapse shooting control method according to an embodiment of the present invention. As shown in FIG. 2, the method of this embodiment is applied to a control terminal of a drone. The method of this embodiment may include:
  • a time-lapse shooting parameter setting operation is detected through an interactive device.
  • the control terminal of the drone can detect the time-lapse shooting parameter setting operation through the interactive device.
  • the control terminal includes one or more of a remote controller, a smart phone, a tablet computer, a laptop computer, and a wearable device, and details are not described herein again.
  • The interactive device may be an important part of the control terminal and is the interface for interacting with the user. The user can control the drone by operating the interactive device: when the user wants to control the drone, the user operates the interactive device of the control terminal, and the control terminal detects the user's operation through the interactive device.
  • the control terminal can detect the user's time-lapse shooting parameter setting operation through the interaction device.
  • The interactive device may be, for example, one or more of a touch display screen of the control terminal, a keyboard, a joystick, and a dial wheel; the touch display screen may also display the flight parameters of the drone and the shooting screen captured by the drone.
  • After the control terminal detects the time-lapse shooting parameter setting operation through the interactive device, it determines the time-lapse shooting parameters set by the user according to the time-lapse shooting parameter setting operation.
  • the time-lapse shooting parameter may include at least one of a shooting time interval, a shooting duration, a number of captured images, and a time-lapse video duration.
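  • A minimal sketch of how these four parameters might be represented on the control terminal side (the class and field names are illustrative assumptions, not defined by this disclosure):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TimeLapseParams:
    # Any subset of these may be set by the user; the rest can be derived.
    shooting_interval_s: Optional[float] = None   # time between two captures
    shooting_duration_s: Optional[float] = None   # total capture time
    num_images: Optional[int] = None              # number of images to capture
    video_duration_s: Optional[float] = None      # duration of the resulting time-lapse video
```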
  • the control terminal controls the drone to perform time-lapse shooting according to the time-lapse shooting parameter.
  • For example, the control terminal may send the time-lapse shooting parameters to the drone, so that the drone controls the shooting device to perform time-lapse shooting according to the time-lapse shooting parameters; for the specific implementation process on the drone side, reference may be made to the related description of the embodiment shown in FIG. 3 below, which is not repeated here.
  • The time-lapse shooting parameters may be, for example, the number of captured images and the shooting duration: the user performs the time-lapse shooting parameter setting operation on the interactive device to directly input the number of captured images and the shooting duration to the control terminal, and the control terminal then controls the drone to perform time-lapse shooting according to the number of captured images and the shooting duration input by the user. For example, if the number of captured images is 500 and the shooting duration is 40 s, the control terminal can control the drone to perform time-lapse shooting for 40 s to obtain 500 time-lapse images.
  • The time-lapse shooting parameters may be, for example, a shooting time interval (for example, 5 s) and a time-lapse video duration (for example, 40 s): the user performs the time-lapse shooting parameter setting operation on the interactive device to directly input the shooting time interval and the time-lapse video duration to the control terminal; the control terminal then determines the number of captured images and the shooting duration according to the shooting time interval and the time-lapse video duration input by the user, and controls the drone to perform time-lapse shooting according to the determined number of captured images and shooting duration.
  • The control terminal displays the number of captured images and the shooting duration through an interactive interface. If the user inputs the shooting time interval and the time-lapse video duration through the time-lapse shooting parameter setting operation, the control terminal displays the shooting time interval and the time-lapse video duration through the interactive interface and also displays the number of captured images and the shooting duration determined from them, so that the user can intuitively see how many images will be captured, and for how long, when the control terminal controls the drone to perform time-lapse shooting. One way such a derivation could work is sketched below.
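  • A hedged sketch of that derivation; the playback frame rate is an assumption, since the disclosure does not specify how the number of images and the shooting duration are computed from the interval and the video duration:

```python
def derive_capture_plan(interval_s: float, video_duration_s: float,
                        playback_fps: float = 25.0) -> tuple[int, float]:
    """Derive (number of images, shooting duration) from the shooting time
    interval and the desired time-lapse video duration.

    playback_fps is an assumed playback frame rate; the patent text does not
    state which frame rate, if any, is used for this conversion."""
    num_images = round(video_duration_s * playback_fps)   # frames needed in the final video
    shooting_duration_s = num_images * interval_s         # real time needed to capture them
    return num_images, shooting_duration_s

# Example: 5 s interval and a 40 s time-lapse video (values from the text above)
print(derive_capture_plan(5.0, 40.0))   # -> (1000, 5000.0) under the 25 fps assumption
```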
  • In this embodiment, the time-lapse shooting parameter setting operation is detected through the interactive device, the time-lapse shooting parameters are determined according to the time-lapse shooting parameter setting operation, and the drone is then controlled to perform time-lapse shooting according to the time-lapse shooting parameters. Therefore, the user can control the drone to perform time-lapse shooting by operating the control terminal; performing time-lapse shooting with a drone adapts to various shooting application scenarios and brings a different shooting experience to the user.
  • FIG. 3 is a flowchart of a time-lapse shooting control method according to another embodiment of the present invention. As shown in FIG. 3, the method of this embodiment is applied to a drone, and the method of this embodiment may include:
  • S301 Receive a time-lapse shooting parameter sent by a control terminal, where the time-lapse shooting parameter is determined by the control terminal by detecting a time-lapse shooting parameter setting operation.
  • the drone receives the time-lapse shooting parameters sent by the control terminal.
  • the time-lapse shooting parameters are determined by the control terminal by detecting the time-lapse shooting parameter setting operation.
  • the drone controls the shooting device to perform time-lapse shooting according to the received time-lapse shooting parameters.
  • the time-lapse shooting parameter may include at least one of a shooting time interval, a shooting duration, a number of captured images, and a time-lapse video duration.
  • For example, the user inputs the number of captured images and the shooting duration to the control terminal through the time-lapse shooting parameter setting operation, and the control terminal then sends the number of captured images and the shooting duration input by the user to the drone as the time-lapse shooting parameters. The drone then controls the shooting device to perform time-lapse shooting according to the received number of captured images and shooting duration.
  • Alternatively, the user inputs the shooting time interval and the time-lapse video duration to the control terminal through the time-lapse shooting parameter setting operation; the control terminal then determines the number of captured images and the shooting duration according to the shooting time interval and the time-lapse video duration input by the user, and sends the determined number of captured images and shooting duration to the drone as the time-lapse shooting parameters. The drone then controls the shooting device to perform time-lapse shooting according to the received number of captured images and shooting duration.
  • Alternatively, the user inputs a shooting time interval and a time-lapse video duration to the control terminal through the time-lapse shooting parameter setting operation, and the control terminal sends the shooting time interval and the time-lapse video duration input by the user to the drone as the time-lapse shooting parameters. The drone then determines the number of captured images and the shooting duration from the received shooting time interval and time-lapse video duration, and controls the shooting device to perform time-lapse shooting according to the determined number of captured images and shooting duration.
  • In this embodiment, the drone receives the time-lapse shooting parameters sent by the control terminal and, during flight, controls the shooting device to perform time-lapse shooting according to the time-lapse shooting parameters. Therefore, the drone can be controlled through the control terminal to perform time-lapse shooting, and time-lapse shooting with a drone brings users a different time-lapse shooting experience.
  • the control terminal can also obtain time-lapse video.
  • the control terminal can obtain a time-lapse video sent by the drone.
  • The drone controls the shooting device to perform time-lapse shooting according to the time-lapse shooting parameters and thereby obtains time-lapse shooting images; the drone then generates a time-lapse video based on the time-lapse shooting images and sends the generated time-lapse video to the control terminal. Accordingly, the control terminal obtains the time-lapse video sent by the drone.
  • The drone may actively send the time-lapse video to the control terminal each time a time-lapse video is generated, so that the control terminal can display the time-lapse video to the user in real time for viewing; alternatively, the drone may send the time-lapse video to the control terminal after receiving a time-lapse video acquisition instruction sent by the control terminal.
  • the drone can send time-lapse video to the control terminal through a wireless communication link or a wired communication link.
  • the control terminal may generate a time-lapse video.
  • the drone controls the shooting device to perform time-lapse shooting according to the time-lapse shooting parameters to obtain a time-lapse shooting image, and then the drone sends the time-lapse shooting image to the control terminal.
  • The control terminal acquires the time-lapse shooting images sent by the drone, and the control terminal then generates a time-lapse video according to the time-lapse shooting images.
  • the drone may send a time-lapse shooting image to the control terminal through a wireless communication link or a wired communication link.
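  • As an illustration of how either side could assemble the received time-lapse images into a video, here is a minimal sketch using OpenCV; the file names, frame rate, and codec are assumptions, and the disclosure does not mandate any particular library or format:

```python
import glob
import cv2  # OpenCV; assumed to be available on the device generating the video

def images_to_timelapse(image_glob: str, out_path: str, fps: float = 25.0) -> None:
    """Concatenate time-lapse images (sorted by file name) into a video file."""
    paths = sorted(glob.glob(image_glob))
    if not paths:
        raise ValueError("no time-lapse images found")
    first = cv2.imread(paths[0])
    height, width = first.shape[:2]
    fourcc = cv2.VideoWriter_fourcc(*"mp4v")
    writer = cv2.VideoWriter(out_path, fourcc, fps, (width, height))
    for p in paths:
        frame = cv2.imread(p)
        writer.write(cv2.resize(frame, (width, height)))  # keep a uniform frame size
    writer.release()

# Hypothetical usage: images downloaded from the drone into ./timelapse/
images_to_timelapse("./timelapse/*.jpg", "timelapse.mp4", fps=25.0)
```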
  • the user can share the delayed video.
  • the user can perform the sharing operation on the interactive device.
  • The control terminal can detect the sharing operation through the interactive device; after detecting the user's sharing operation through the interactive device, it shares the time-lapse video. For example, the control terminal can publish the time-lapse video to the network (such as a social networking site or a social app).
  • Optionally, the control terminal controls the drone to perform time-lapse shooting according to the time-lapse shooting parameters only when it detects a start time-lapse shooting operation. Specifically, the control terminal first detects the time-lapse shooting parameter setting operation through the interactive device and determines the time-lapse shooting parameters according to that operation; after the time-lapse shooting parameters are determined, the control terminal detects the start time-lapse shooting operation through the interactive device.
  • When the user wants to control the drone to start time-lapse shooting, the user can perform the start time-lapse shooting operation on the interactive device. For example, the control terminal can display a start time-lapse shooting icon, and the user can perform a touch operation on the start time-lapse shooting icon through the interactive device.
  • The control terminal controls the drone to perform time-lapse shooting according to the time-lapse shooting parameters when it detects the start time-lapse shooting operation. For example, the control terminal may send a time-lapse shooting start instruction to the drone, where the time-lapse shooting start instruction is used to control the drone to start time-lapse shooting.
  • Correspondingly, the drone first receives the time-lapse shooting parameters sent by the control terminal and then waits for the time-lapse shooting start instruction sent by the control terminal.
  • When the drone receives the time-lapse shooting start instruction, it controls the shooting device to perform time-lapse shooting according to the time-lapse shooting parameters.
  • Optionally, the user needs to operate the control terminal to enter the time-lapse shooting mode before setting the time-lapse shooting parameters. That is, when the user wants to control the drone to perform time-lapse shooting, the user first makes the control terminal enter the time-lapse shooting mode. When the user performs a time-lapse shooting trigger operation on the interactive device, the control terminal detects the time-lapse shooting trigger operation through the interactive device and, on detecting it, enters the time-lapse shooting mode. Optionally, after entering the time-lapse shooting mode, the control terminal may further display a time-lapse shooting setting interface.
  • After the control terminal enters the time-lapse shooting mode, the user (for example, based on the displayed time-lapse shooting setting interface) may perform the time-lapse shooting parameter setting operation on the interactive device; accordingly, the control terminal detects the time-lapse shooting parameter setting operation through the interactive device.
  • Optionally, the user may also set imaging parameters of the shooting device for when the drone performs time-lapse shooting, where the imaging parameters may include at least one of a focal length, an exposure parameter, and a focus setting. The control terminal can detect an imaging parameter setting operation through the interactive device: when the user needs to set the imaging parameters, the user performs the imaging parameter setting operation on the interactive device; accordingly, the control terminal detects the imaging parameter setting operation through the interactive device and determines the imaging parameters according to it.
  • Controlling the drone to perform time-lapse shooting according to the above time-lapse shooting parameters and the determined imaging parameters may specifically be: the control terminal sends the time-lapse shooting parameters and the determined imaging parameters to the drone; accordingly, the drone receives the time-lapse shooting parameters and imaging parameters sent by the control terminal, and then controls the shooting device to perform time-lapse shooting according to the received time-lapse shooting parameters and imaging parameters.
  • the control terminal may send the above-mentioned time-lapse shooting parameters and imaging parameters to the drone at the same time, or may separately send the time-lapse shooting parameters and imaging parameters to the drone.
  • the control terminal may include the time-lapse shooting parameter and the imaging parameter in the time-lapse shooting start instruction and send it to the drone.
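  • For illustration, a start instruction that carries both parameter sets, as described above, could be serialized like this; the field names and the JSON encoding are assumptions, since the disclosure does not define a message format:

```python
import json

def build_start_instruction(timelapse_params: dict, imaging_params: dict) -> bytes:
    """Pack the time-lapse shooting parameters and imaging parameters into a
    single time-lapse shooting start instruction (hypothetical wire format)."""
    message = {
        "type": "timelapse_start",
        "timelapse": timelapse_params,   # e.g. {"interval_s": 5, "video_duration_s": 40}
        "imaging": imaging_params,       # e.g. {"exposure_ev": 0.0, "focus": "auto"}
    }
    return json.dumps(message).encode("utf-8")

# Hypothetical usage on the control terminal before sending to the drone
payload = build_start_instruction({"interval_s": 5, "video_duration_s": 40},
                                  {"exposure_ev": 0.0})
```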
  • Optionally, the user can not only control the drone to perform time-lapse shooting, but also control the flight mode of the drone while it performs time-lapse shooting. When the user needs to set the drone's flight mode, the user performs a flight mode setting operation on the interactive device; accordingly, the control terminal detects the flight mode setting operation through the interactive device, determines the flight mode of the drone according to the flight mode setting operation, and then controls the drone to fly according to that flight mode. During the flight of the drone according to the flight mode, the control terminal controls the drone to perform time-lapse shooting according to the time-lapse shooting parameters.
  • After determining the flight mode of the drone, the control terminal sends a flight mode setting instruction to the drone; after receiving the flight mode setting instruction sent by the control terminal, the drone determines its flight mode according to the flight mode setting instruction and then flies according to that flight mode.
  • the drone controls the shooting device to perform time-lapse shooting according to the time-lapse shooting parameter.
  • Specifically, the flight mode may be a free flight mode, a trajectory flight mode, a straight flight mode, and/or a surround flight mode.
  • The control terminal controlling the drone to fly according to the flight mode may be, for example: the control terminal detects a flight control operation through the interactive device (the flight control operation being an operation performed by the user on the interactive device), determines a joystick amount based on the flight control operation, and then controls the drone to fly according to the joystick amount.
  • The control terminal in this embodiment controls the drone to perform time-lapse shooting while controlling the drone to fly according to the joystick amount; accordingly, the drone controls the shooting device to perform time-lapse shooting while flying according to the joystick amount.
  • For the control terminal to control the drone to fly according to the joystick amount, the control terminal may, for example, send the joystick amount to the drone.
  • The drone flying according to the flight mode may be, for example, that the drone receives the joystick amount sent by the control terminal and then flies according to that joystick amount.
  • The joystick amount can control the flight trajectory of the drone and/or the shooting attitude of the drone.
  • For example, the joystick amount can control the position, flight direction, flight speed, flight distance, flight acceleration, and so on of the drone in the air.
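  • A minimal sketch of how a joystick amount could be mapped to a flight command in the free flight mode; the scaling constants and the command fields are illustrative assumptions, not part of this disclosure:

```python
from dataclasses import dataclass

@dataclass
class StickInput:
    roll: float      # right stick, left/right, normalized to [-1, 1]
    pitch: float     # right stick, up/down, normalized to [-1, 1]
    yaw: float       # left stick, left/right, normalized to [-1, 1]
    throttle: float  # left stick, up/down, normalized to [-1, 1]

@dataclass
class VelocityCommand:
    vx: float        # forward speed, m/s
    vy: float        # lateral speed, m/s
    vz: float        # vertical speed, m/s
    yaw_rate: float  # deg/s

def stick_to_velocity(stick: StickInput, max_h_speed: float = 10.0,
                      max_v_speed: float = 4.0, max_yaw_rate: float = 90.0) -> VelocityCommand:
    """Proportionally map normalized stick deflection to a velocity command."""
    return VelocityCommand(
        vx=stick.pitch * max_h_speed,
        vy=stick.roll * max_h_speed,
        vz=stick.throttle * max_v_speed,
        yaw_rate=stick.yaw * max_yaw_rate,
    )
```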
  • the flight trajectory of the drone is preset.
  • the control terminal controls the drone to fly according to the flight mode, for example, the control terminal controls the drone to fly according to a preset trajectory.
  • the drone flying according to the flight mode is, for example, that the drone obtains a preset trajectory, and then flies according to the preset trajectory.
  • The control terminal in this embodiment controls the drone to perform time-lapse shooting while controlling the drone to fly according to a preset trajectory; accordingly, the drone controls the shooting device to perform time-lapse shooting while flying according to the preset trajectory.
  • the preset trajectory can be saved in the drone in advance, or it can be saved in the control terminal in advance. If the preset trajectory is saved in the drone, the drone obtains the preset trajectory locally.
  • If the preset trajectory is stored in the control terminal, the drone receives the preset trajectory sent by the control terminal.
  • the preset trajectory includes at least a plurality of waypoints, and the waypoints include at least position information. Therefore, the drone flies to the corresponding position according to the position information of each waypoint.
  • the waypoint further includes a shooting attitude and / or imaging parameters. Therefore, the drone adjusts the shooting attitude and / or imaging parameters according to the shooting attitude and / or imaging parameters of each waypoint.
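  • A hedged sketch of what such a waypoint record could look like; the field names and units are assumptions made for illustration only:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ShootingAttitude:
    gimbal_pitch_deg: float
    gimbal_yaw_deg: float

@dataclass
class ImagingParams:
    focal_length_mm: Optional[float] = None
    exposure_ev: Optional[float] = None
    focus_distance_m: Optional[float] = None

@dataclass
class Waypoint:
    latitude: float                               # position information (required)
    longitude: float
    altitude_m: float
    attitude: Optional[ShootingAttitude] = None   # optional shooting attitude
    imaging: Optional[ImagingParams] = None       # optional imaging parameters

# A preset trajectory is then simply an ordered list of waypoints.
PresetTrajectory = List[Waypoint]
```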
  • The above waypoints can be obtained by the user performing point-marking operations on a map displayed on the control terminal.
  • Optionally, the user operates the control terminal to control the drone to fly to certain locations, and the drone records these locations as waypoints, recording the position information of these points and, optionally, the shooting attitude and/or imaging parameters of the drone at these locations.
  • When the user controls the drone through the control terminal to perform time-lapse shooting while flying according to a preset trajectory, the user may also control the drone to track and shoot a target object. In this case, the control terminal controlling the drone to fly according to the preset trajectory may be, for example: the control terminal detects a target object selection operation through the interactive device, determines target object indication information according to the target object selection operation, and controls the drone to fly according to the preset trajectory while controlling the shooting attitude of the drone so that the drone tracks the target object indicated by the target object indication information.
  • the user needs to control the drone to track a target object for time-lapse shooting, the user performs a target object selection operation on the interactive device.
  • The control terminal detects the target object selection operation through the interactive device and determines the target object indication information according to the target object selection operation.
  • For example, the control terminal displays the shooting screen of the shooting device of the drone.
  • The target object selection operation may be a frame-selection operation performed on the shooting screen to select the target object, and the object enclosed by the frame may be the target object.
  • The indication information of the object enclosed by the frame-selection operation is the target object indication information.
  • The target object indication information may be, for example, the position of the target object in the shooting screen.
  • The control terminal then controls the drone to fly according to the preset trajectory and controls the shooting attitude of the drone so that the drone tracks the target object indicated by the target object indication information; that is, the shooting attitude of the shooting device of the drone is controlled so that the target object always stays in the shooting screen of the shooting device.
  • the control terminal controls the shooting attitude of the drone so that the drone tracks the target object indicated by the target object indication information.
  • the control terminal sends the target object indication information to the drone.
  • The drone flying according to the preset trajectory may be, for example: after the drone receives the target object indication information sent by the control terminal, the drone flies according to the preset trajectory and controls the shooting attitude of the shooting device to track the target object indicated by the target object indication information.
  • Tracking the target object means that the shooting device of the drone is always aimed at the target object, so that the target object is in the shooting screen of the drone; for example, the target object may be at the center of the shooting screen of the drone.
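  • A minimal sketch of one way the shooting attitude could be adjusted so that the target stays near the center of the shooting screen; the proportional-gain approach, the sign convention, and all constants are assumptions, not a method specified by this disclosure:

```python
def gimbal_correction(target_px, frame_size, yaw_gain=60.0, pitch_gain=40.0):
    """Compute yaw/pitch rate commands (deg/s) that push the tracked target
    toward the center of the frame.

    target_px  -- (x, y) pixel position of the tracked target in the frame
    frame_size -- (width, height) of the shooting screen in pixels
    """
    cx, cy = frame_size[0] / 2.0, frame_size[1] / 2.0
    # Normalized offset of the target from the frame center, in [-0.5, 0.5].
    dx = (target_px[0] - cx) / frame_size[0]
    dy = (target_px[1] - cy) / frame_size[1]
    yaw_rate = yaw_gain * dx        # target right of center -> yaw right
    pitch_rate = -pitch_gain * dy   # target below center -> negative rate (tilt down, sign convention assumed)
    return yaw_rate, pitch_rate

# Example: target detected at (1200, 500) in a 1920x1080 shooting screen
print(gimbal_correction((1200, 500), (1920, 1080)))
```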
  • the drone's flight trajectory is a straight line.
  • The control terminal controlling the drone to fly according to the flight mode may be, for example, as follows.
  • The control terminal detects a flight direction setting operation through the interactive device, the flight direction setting operation being an operation performed by the user on the interactive device.
  • The control terminal determines flight direction indication information according to the flight direction setting operation and determines the flight direction according to the flight direction indication information.
  • The control terminal in this embodiment controls the drone to perform time-lapse shooting while controlling the drone to fly straight in the flight direction; accordingly, the drone controls the shooting device to perform time-lapse shooting while flying straight in the flight direction.
  • the control terminal controlling the drone to fly straight in accordance with the flight direction may be, for example, that the control terminal sends flight direction instruction information to the drone.
  • the flying of the drone according to the flight mode may be, for example, that the drone receives the flight direction instruction information sent by the control terminal, and then flies straight according to the flight direction indicated by the flight direction instruction information.
  • When the user controls the drone through the control terminal to perform time-lapse shooting while flying straight in the flight direction, the user may also control the drone to track and shoot a target object. In this case, the control terminal controlling the drone to fly straight in the flight direction may be, for example: the control terminal detects a target object selection operation through the interactive device, determines target object indication information according to the target object selection operation, and controls the drone to fly straight in the flight direction while controlling the shooting attitude of the drone so that the drone tracks the target object indicated by the target object indication information.
  • The control terminal detects the target object selection operation through the interactive device and determines the target object indication information according to the target object selection operation.
  • For example, the target object selection operation may be a frame-selection operation for selecting the target object, the object enclosed by the frame may be the target object, and the indication information of the framed object is the target object indication information. The control terminal then controls the drone to fly straight in the flight direction and controls the shooting attitude of the drone so that the drone tracks the target object indicated by the target object indication information.
  • the control terminal controls the shooting attitude of the drone so that the drone tracks the target object indicated by the target object indication information.
  • the control terminal sends the target object indication information to the drone.
  • The drone flying according to the flight mode may be, for example: after the drone receives the target object indication information sent by the control terminal, it flies straight in the flight direction and controls the shooting attitude to track the target object indicated by the target object indication information.
  • the drone's flight trajectory is to orbit a target object.
  • The control terminal controlling the drone to fly according to the flight mode may be, for example: the control terminal detects a target object selection operation through the interactive device, the target object selection operation being an operation performed by the user on the interactive device.
  • The control terminal determines the target object indication information according to the target object selection operation; for example, the target object selection operation is a frame-selection operation for selecting the target object, and the object enclosed by the frame is the target object. The control terminal then controls the drone to orbit the target object indicated by the target object indication information.
  • The control terminal in this embodiment controls the drone to perform time-lapse shooting while controlling the drone to orbit the target object; accordingly, the drone controls the shooting device to perform time-lapse shooting while orbiting the target object.
  • For the control terminal to control the drone to orbit the target object indicated by the target object indication information, the control terminal may, for example, send the target object indication information to the drone.
  • the flying of the drone according to the flight mode may be, for example, that the drone receives the target object indication information sent by the control terminal, and flies around the target object according to the target object indication information.
  • The drone flying around the target object may be the drone adjusting the shooting attitude so that the target object is in the shooting screen of the drone, for example, at the center of the shooting screen. As shown in FIG. 4, the drone 401 orbits the target object 402.
  • While orbiting, the drone 401 can control the shooting attitude of the shooting device to track the target object 402, that is, keep the target object 402 in the shooting screen of the shooting device. In addition, the drone 401 may control the shooting device to perform time-lapse shooting of the target object 402 according to the time-lapse shooting parameters.
  • The control terminal detects the surround direction setting operation through the interactive device and determines the surround direction, for example, clockwise or counterclockwise; the control terminal then controls the drone to orbit the target object in that surround direction.
  • the control terminal may send the orbiting direction indication information to the drone.
  • the drone receives the orbiting direction indication information and according to the orbiting direction indication information, orbits the target object.
  • the user may perform an orbiting distance setting operation on the interactive device.
  • the control terminal detects the orbiting distance setting operation through the interactive device and determines the orbiting distance.
  • The orbiting distance may refer to, for example, the horizontal distance between the drone and the target object; the control terminal then controls the drone to fly around the target object at the orbiting distance. For example, the control terminal may send orbiting distance indication information to the drone; accordingly, the drone receives the orbiting distance indication information and flies around the target object at the orbiting distance indicated by the orbiting distance indication information.
  • the user may perform an orbiting height setting operation on the interactive device.
  • the control terminal detects the orbiting height setting operation through the interactive device and determines the orbiting height.
  • The orbiting height may be, for example, the vertical distance between the drone and the target object; the control terminal then controls the drone to fly around the target object at the orbiting height.
  • For example, the control terminal may send orbiting height indication information to the drone; accordingly, the drone receives the orbiting height indication information and orbits the target object at the orbiting height indicated by the orbiting height indication information.
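  • The surround direction, the orbiting distance (a horizontal radius), and the orbiting height described above are enough to sketch an orbit path. The sampling into discrete points and the local x/y coordinate frame below are illustrative assumptions, not a flight-planning method defined by this disclosure:

```python
import math

def orbit_path(center_xy, radius_m, height_m, clockwise=True, num_points=36):
    """Generate (x, y, z) points of a circular orbit around a target.

    center_xy -- (x, y) of the target in a local level frame, metres
    radius_m  -- orbiting distance: horizontal distance from drone to target
    height_m  -- orbiting height: vertical distance from drone to target
    """
    direction = -1.0 if clockwise else 1.0
    points = []
    for i in range(num_points):
        angle = direction * 2.0 * math.pi * i / num_points
        x = center_xy[0] + radius_m * math.cos(angle)
        y = center_xy[1] + radius_m * math.sin(angle)
        points.append((x, y, height_m))
    return points

# Example: orbit a target at the local origin, 30 m away and 10 m above it
waypoints = orbit_path((0.0, 0.0), radius_m=30.0, height_m=10.0, clockwise=True)
```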
  • the user can control the drone to pause the time-lapse shooting at any time.
  • The control terminal detects a pause time-lapse shooting operation through the interactive device; when the user needs to pause the time-lapse shooting of the drone, the user can perform the pause time-lapse shooting operation on the interactive device.
  • the control terminal can display the pause time-lapse shooting icon when the drone is performing time-lapse shooting, and the user can perform touch operations on the pause time-lapse shooting icon through the interactive device.
  • the control terminal can detect the pause time-lapse shooting operation through the interactive device.
  • The control terminal then controls the drone to pause the time-lapse shooting. For example, when the control terminal detects the pause time-lapse shooting operation, it sends a pause time-lapse shooting instruction to the drone; accordingly, the drone receives the pause time-lapse shooting instruction sent by the control terminal and controls the shooting device to pause the time-lapse shooting according to the pause time-lapse shooting instruction.
  • Optionally, the user can also control the drone to resume the time-lapse shooting.
  • The control terminal detects a resume time-lapse shooting operation through the interactive device.
  • the user can perform the resume time-lapse shooting operation on the interactive device.
  • For example, the control terminal can display a resume time-lapse shooting icon after the drone pauses the time-lapse shooting, and the user can perform a touch operation on the resume time-lapse shooting icon through the interactive device.
  • the control terminal can detect the resume time-lapse shooting operation through the interactive device.
  • The control terminal then controls the drone to resume the time-lapse shooting. For example, when the control terminal detects the resume time-lapse shooting operation, it sends a resume time-lapse shooting instruction to the drone; accordingly, the drone receives the resume time-lapse shooting instruction sent by the control terminal and, according to the resume time-lapse shooting instruction, continues to control the shooting device to perform time-lapse shooting.
  • The above start time-lapse shooting icon, pause time-lapse shooting icon, and resume time-lapse shooting icon may be the same icon displayed by the control terminal, the icon providing different functions at different times.
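  • On the drone side, the start, pause, and resume instructions described above amount to a small state machine around the capture loop. A hedged sketch follows; the instruction names and the loop structure are illustrative assumptions, not taken from this disclosure:

```python
import time

class TimeLapseSession:
    """Minimal drone-side handling of start / pause / resume time-lapse instructions."""

    def __init__(self, interval_s: float, num_images: int, capture_fn):
        self.interval_s = interval_s
        self.num_images = num_images
        self.capture_fn = capture_fn   # callable that triggers the shooting device
        self.paused = False
        self.captured = 0

    def on_instruction(self, name: str) -> None:
        if name == "pause_timelapse":
            self.paused = True
        elif name == "resume_timelapse":
            self.paused = False

    def run(self) -> None:
        while self.captured < self.num_images:
            if self.paused:
                time.sleep(0.1)          # wait until a resume instruction arrives
                continue
            self.capture_fn()            # take one time-lapse image
            self.captured += 1
            time.sleep(self.interval_s)  # wait for the shooting time interval
```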
  • In the embodiments of the present invention, the user can set the time-lapse shooting parameters by operating the control terminal and have the drone perform time-lapse shooting according to the time-lapse shooting parameters set by the user. The drone can also control the shooting device to perform time-lapse shooting while flying in flight modes such as the free flight mode, straight flight mode, trajectory flight mode, or surround flight mode, so that the resulting time-lapse videos are more engaging.
  • Therefore, the user can control the drone to perform time-lapse shooting by operating the control terminal, and time-lapse shooting performed by the drone is suitable for various shooting application scenarios and can bring a different shooting experience to the user.
  • An embodiment of the present invention also provides a computer storage medium.
  • The computer storage medium stores program instructions, and the program, when executed, may include some or all of the steps of the time-lapse shooting control method in the foregoing embodiments.
  • FIG. 5 is a schematic structural diagram of a control terminal according to an embodiment of the present invention.
  • the control terminal 500 in this embodiment may be used to control a drone.
  • The control terminal 500 may include an interactive device 501 and a processor 502.
  • The processor 502 may be a central processing unit (CPU), and the processor 502 may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
  • a general-purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
  • the interactive device 501 is configured to detect a time-lapse shooting parameter setting operation.
  • the processor 502 is configured to determine a time-lapse shooting parameter according to the time-lapse shooting parameter setting operation detected by the interactive device; and control the drone to perform time-lapse shooting according to the time-lapse shooting parameter.
  • The time-lapse shooting parameters include at least one of a shooting time interval, a shooting duration, a number of captured images, and a time-lapse video duration.
  • The processor 502 is further configured to obtain a time-lapse video sent by the drone, where the time-lapse video is generated by the drone from the time-lapse shooting images obtained by performing time-lapse shooting according to the time-lapse shooting parameters.
  • the processor 502 is further configured to acquire a time-lapse shooting image sent by the drone, and generate a time-lapse video according to the time-lapse shooting image.
  • the time-lapse shooting image is obtained by the drone performing time-lapse shooting according to the time-lapse shooting parameters.
  • the interaction device 501 is further configured to detect a sharing operation.
  • the processor 502 is further configured to share the time-lapse video after the interactive device detects a sharing operation.
  • the interaction device 501 is further configured to detect and start a time-lapse shooting operation before the processor controls the drone to perform time-lapse shooting according to the time-lapse shooting parameters.
  • the processor 502 is specifically configured to control the drone to perform time-lapse shooting according to the time-lapse shooting parameter when the interactive device detects the start time-lapse shooting operation.
  • the interaction device 501 is further configured to detect a time-lapse shooting mode trigger operation before detecting a time-lapse shooting parameter setting operation.
  • the processor 502 is further configured to enter the time-lapse shooting mode when the interactive device detects a trigger operation of the time-lapse shooting mode.
  • the interactive device 501 detects a time-lapse shooting parameter setting operation, it is specifically configured to: after entering the time-lapse shooting mode, detect the time-lapse shooting parameter setting operation.
  • the interaction device 501 is further configured to detect an imaging parameter setting operation.
  • the processor 502 is further configured to determine an imaging parameter according to the imaging parameter setting operation detected by the interaction device 501.
  • When the processor 502 controls the drone to perform time-lapse shooting according to the time-lapse shooting parameters, the processor 502 is specifically configured to control the drone to perform time-lapse shooting according to the time-lapse shooting parameters and the imaging parameters.
  • the interaction device 501 is further configured to detect a flight mode setting operation of the drone.
  • the processor 502 is further configured to determine a flight mode of the drone according to the flight mode setting operation detected by the interaction device 501; and control the drone to fly in accordance with the flight mode.
  • When the processor 502 controls the drone to perform time-lapse shooting according to the time-lapse shooting parameters, the processor 502 is specifically configured to: during the flight of the drone according to the flight mode, control the drone to perform time-lapse shooting according to the time-lapse shooting parameters.
  • the flight mode is a free flight mode.
  • When the processor 502 controls the drone to fly according to the flight mode, the processor 502 is specifically configured to: detect a flight control operation through the interactive device 501; determine a joystick amount according to the flight control operation; and control the drone to fly according to the joystick amount.
  • the flight mode is a trajectory flight mode.
  • When the processor 502 controls the drone to fly according to the flight mode, the processor 502 is specifically configured to control the drone to fly according to a preset trajectory.
  • the interaction device 501 is further configured to detect a target object selection operation.
  • When the processor 502 controls the drone to fly according to the preset trajectory, the processor 502 is specifically configured to: determine the target object indication information according to the target object selection operation detected by the interactive device 501; and control the drone to fly according to the preset trajectory while controlling the shooting attitude of the drone so that the drone tracks the target object indicated by the target object indication information.
  • the preset trajectory includes at least a plurality of waypoints, and the waypoints include at least position information.
  • the waypoint further includes a shooting attitude and / or an imaging parameter.
  • the flight mode is a straight flight mode.
  • the interaction device 501 is further configured to detect a flying direction setting operation.
  • When the processor 502 controls the drone to fly according to the flight mode, the processor 502 is specifically configured to: determine a flight direction according to the flight direction setting operation detected by the interactive device 501; and control the drone to fly straight in the flight direction.
  • the interaction device 501 is further configured to detect a target object selection operation.
  • When the processor 502 controls the drone to fly straight in the flight direction, the processor 502 is specifically configured to: determine the target object indication information according to the target object selection operation detected by the interactive device 501; and control the drone to fly straight in the flight direction while controlling the shooting attitude of the drone according to the target object indication information so that the drone tracks the target object indicated by the target object indication information.
  • the flight mode is a surround flight mode.
  • the interaction device 501 is further configured to detect a target object selection operation.
  • When the processor 502 controls the drone to fly according to the flight mode, the processor 502 is specifically configured to: determine the target object indication information according to the target object selection operation detected by the interactive device 501; and control the drone to orbit the target object indicated by the target object indication information.
  • The interaction device 501 is further configured to detect a pause time-lapse shooting operation during the time-lapse shooting of the drone.
  • The processor 502 is further configured to control the drone to pause the time-lapse shooting when the interaction device 501 detects the pause time-lapse shooting operation.
  • The control terminal 500 in this embodiment may further include a memory (not shown in the figure).
  • The memory is used to store program code; when the program code is executed, the control terminal 500 can implement the technical solutions described above.
  • The control terminal in this embodiment may be used to execute the technical solutions of the control terminal in the foregoing method embodiments of the present invention.
  • Its implementation principles and technical effects are similar, and details are not described herein again.
  • FIG. 6 is a schematic structural diagram of a drone according to an embodiment of the present invention.
  • the drone 600 in this embodiment may include a communication device 601, a processor 602, and a photographing device 603.
  • the communication device 601 is configured to receive a time-lapse shooting parameter sent by a control terminal, where the time-lapse shooting parameter is determined by the control terminal by detecting a time-lapse shooting parameter setting operation.
  • the processor 602 is configured to control the shooting device 603 to perform time-lapse shooting according to the time-lapse shooting parameters.
  • The time-lapse shooting parameters include at least one of a shooting time interval, a shooting duration, a number of captured images, and a time-lapse video duration.
  • the processor 602 is further configured to generate a time-lapse video according to a time-lapse shooting image obtained by time-lapse shooting.
  • the communication device 601 is further configured to send the time-lapse video to the control terminal.
  • the communication device 601 is further configured to send a time-lapse shooting image to the control terminal, where the time-lapse shooting image is used to generate a time-lapse video.
  • The communication device 601 is further configured to receive a time-lapse shooting start instruction sent by the control terminal, where the time-lapse shooting start instruction is determined by the control terminal by detecting a start time-lapse shooting operation.
  • The processor 602 is specifically configured to: when the communication device 601 receives the time-lapse shooting start instruction, control the shooting device 603 to perform time-lapse shooting according to the time-lapse shooting parameters.
  • the communication device 601 is further configured to receive an imaging parameter sent by the control terminal, where the imaging parameter is determined by the control terminal by detecting an imaging parameter setting operation;
  • When the processor 602 controls the shooting device 603 to perform time-lapse shooting according to the time-lapse shooting parameters, it is specifically configured to control the shooting device 603 to perform time-lapse shooting according to the time-lapse shooting parameters and the imaging parameters.
  • the communication device 601 is further configured to receive a flight mode setting instruction sent by the control terminal, where the flight mode setting instruction is determined by the control terminal by detecting a flight mode setting operation.
  • the processor 602 is further configured to determine a flight mode of the drone according to the flight mode setting instruction; and control the drone to fly according to the flight mode.
  • when the processor 602 controls the shooting device 603 to perform time-lapse shooting according to the time-lapse shooting parameters, the processor 602 is specifically configured to: during the flight of the drone according to the flight mode, control the shooting device 603 to perform time-lapse shooting according to the time-lapse shooting parameters.
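A minimal sketch of this control flow is given below, assuming hypothetical drone and camera objects (set_flight_mode, step and capture are illustrative names, not an API defined by the embodiments):

```python
# While the drone flies according to the selected flight mode, the processor
# triggers the shooting device at the configured interval until enough
# images have been captured. All interfaces here are assumptions.
import time
from enum import Enum, auto

class FlightMode(Enum):
    FREE = auto()        # flight follows control lever amounts from the terminal
    TRAJECTORY = auto()  # flight follows a preset trajectory of waypoints
    STRAIGHT = auto()    # flight follows a straight line in a set direction
    SURROUND = auto()    # flight orbits a selected target object

def fly_and_capture(drone, camera, mode: FlightMode,
                    interval_s: float, num_images: int) -> list:
    images = []
    drone.set_flight_mode(mode)          # assumed method on a drone object
    while len(images) < num_images:
        drone.step()                     # advance flight per the active mode
        images.append(camera.capture())  # one time-lapse still image
        time.sleep(interval_s)
    return images
```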
  • the flight mode is a free flight mode.
  • the communication device 601 is further configured to receive a control lever amount sent by the control terminal.
  • the processor 602 is specifically configured to control the drone to fly according to the control lever amount received by the communication device 601.
  • the flight mode is a trajectory flight mode.
  • when the processor 602 controls the drone to fly according to the flight mode, the processor 602 is specifically configured to: obtain a preset trajectory; and control the drone to fly according to the preset trajectory.
  • the communication device 601 is further configured to receive target object indication information sent by the control terminal, where the target object indication information is determined by the control terminal by detecting a target object selection operation.
  • when the processor 602 controls the drone to fly according to the preset trajectory, it is specifically configured to: control the drone to fly according to the preset trajectory, and control the shooting attitude of the shooting device 603 to track the target object indicated by the target object indication information.
  • the preset trajectory includes at least a plurality of waypoints, and the waypoints include at least position information.
  • the waypoint further includes a shooting attitude and/or an imaging parameter.
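The waypoint structure described above might be represented as follows; all field names are illustrative assumptions, not part of the disclosure:

```python
# Sketch of a waypoint record: position information is required, while a
# shooting attitude and/or imaging parameters are optional per waypoint.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ShootingAttitude:
    yaw_deg: float
    pitch_deg: float
    roll_deg: float = 0.0

@dataclass
class ImagingParameters:
    focal_length_mm: Optional[float] = None
    exposure_ev: Optional[float] = None
    focus_distance_m: Optional[float] = None

@dataclass
class Waypoint:
    latitude: float
    longitude: float
    altitude_m: float
    attitude: Optional[ShootingAttitude] = None   # optional shooting attitude
    imaging: Optional[ImagingParameters] = None   # optional imaging parameters

preset_trajectory = [
    Waypoint(22.5431, 114.0579, 60.0),
    Waypoint(22.5440, 114.0590, 80.0,
             attitude=ShootingAttitude(yaw_deg=45.0, pitch_deg=-20.0)),
]
```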
  • the flight mode is a straight flight mode.
  • the communication device 601 is further configured to receive flight direction instruction information sent by the control terminal, where the flight direction instruction information is determined by the control terminal by detecting a flight direction setting operation.
  • when the processor 602 controls the drone to fly according to the flight mode, the processor 602 is specifically configured to control the drone to fly straight in the flight direction indicated by the flight direction instruction information.
  • the communication device 601 is further configured to receive target object indication information sent by the control terminal, where the target object indication information is determined by the control terminal by detecting a target object selection operation.
  • when the processor 602 controls the drone to fly straight in the flight direction, it is specifically configured to: control the drone to fly straight in the flight direction, and control the shooting attitude of the shooting device 603 to track the target object indicated by the target object indication information.
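As one possible way to realize such tracking, the sketch below computes the gimbal yaw and pitch that point the shooting device 603 at the target from the drone's current position; the flat local east-north-up coordinate frame in metres is an assumption for illustration only:

```python
# Compute the shooting attitude (yaw/pitch) needed to keep a target in frame
# while the drone flies straight. Coordinates are (east_m, north_m, up_m).
import math

def attitude_to_target(drone_pos, target_pos):
    """Return (yaw_deg, pitch_deg) pointing from drone_pos toward target_pos."""
    de = target_pos[0] - drone_pos[0]
    dn = target_pos[1] - drone_pos[1]
    du = target_pos[2] - drone_pos[2]
    yaw = math.degrees(math.atan2(de, dn))                     # 0 deg = north, clockwise positive
    pitch = math.degrees(math.atan2(du, math.hypot(de, dn)))   # negative = camera looks down
    return yaw, pitch

# Drone 50 m up, target 100 m east and 100 m north on the ground:
print(attitude_to_target((0.0, 0.0, 50.0), (100.0, 100.0, 0.0)))  # ~ (45.0, -19.5)
```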
  • the flight mode is a surround flight mode.
  • the communication device 601 is further configured to receive target object indication information sent by the control terminal, where the target object indication information is determined by the control terminal by detecting a target object selection operation.
  • when the processor 602 controls the drone to fly according to the flight mode, it is specifically configured to control the drone to fly around the target object according to the target object indication information.
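One simple way to realize a surround flight is to sample positions on a circle around the target indicated by the target object indication information, as sketched below; the radius, height, direction and step count used here are illustrative assumptions (the description presents them as user-settable quantities):

```python
# Generate positions for an orbit around a target at a chosen radius/height.
import math

def orbit_positions(center, radius_m, height_m, steps=36, clockwise=True):
    """center: (east_m, north_m) of the target. Yield (east, north, up) samples."""
    sign = -1.0 if clockwise else 1.0
    for i in range(steps):
        theta = sign * 2.0 * math.pi * i / steps
        yield (center[0] + radius_m * math.cos(theta),
               center[1] + radius_m * math.sin(theta),
               height_m)

for p in orbit_positions((0.0, 0.0), radius_m=30.0, height_m=20.0, steps=4):
    print(p)
```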
  • the communication device 601 is further configured to receive a pause time-lapse shooting instruction sent by the control terminal during the time-lapse shooting of the shooting device 603, where the pause time-lapse shooting instruction is determined by the control terminal by detecting a pause time-lapse shooting operation.
  • the processor 602 is further configured to control the shooting device 603 to pause time-lapse shooting according to the pause time-lapse shooting instruction.
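A minimal sketch of honouring the pause (and a corresponding resume) instruction inside the capture loop is shown below; the command names and polling interface are assumptions, not defined by the embodiments:

```python
# Capture loop that pauses/resumes time-lapse shooting on instructions from
# the control terminal. `commands.poll()` and `camera.capture()` are assumed
# interfaces used purely for illustration.
import time

def capture_loop(camera, commands, interval_s: float, num_images: int) -> list:
    """commands: object whose poll() returns 'pause', 'resume' or None."""
    images, paused = [], False
    while len(images) < num_images:
        cmd = commands.poll()
        if cmd == "pause":
            paused = True
        elif cmd == "resume":
            paused = False
        if not paused:
            images.append(camera.capture())
            time.sleep(interval_s)
        else:
            time.sleep(0.1)   # idle while paused; no images are taken
    return images
```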
  • the drone 600 in this embodiment may further include a memory (not shown in the figure), where the memory is used to store program code; when the program code is executed, the drone 600 may implement the technical solution of the drone described above.
  • the processor 602 may include a flight controller.
  • the drone of this embodiment may be used to implement the technical solutions of the drone in the foregoing method embodiments of the present invention.
  • the implementation principles and technical effects are similar, and details are not described herein again.
  • FIG. 7 is a schematic structural diagram of a time-lapse shooting system according to an embodiment of the present invention.
  • the time-lapse shooting system 700 in this embodiment may include a control terminal 701 and a drone 702.
  • the control terminal 701 may adopt the structure of the embodiment shown in FIG. 5, and correspondingly, the technical solutions of the control terminal in the foregoing method embodiments may be implemented.
  • the implementation principles and technical effects are similar, and are not described herein again.
  • the drone 702 may adopt the structure of the embodiment shown in FIG. 6, and correspondingly, the technical solutions of the drone in the foregoing method embodiments may be implemented.
  • the implementation principles and technical effects are similar, and are not described herein again.
  • a person of ordinary skill in the art may understand that all or part of the steps of the foregoing method embodiments may be implemented by program-instruction-related hardware; the foregoing program may be stored in a computer-readable storage medium, and when the program is executed, the steps of the foregoing method embodiments are performed; the foregoing storage medium includes various media that can store program code, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.

Abstract

本发明实施例提供一种延时拍摄控制方法和设备,此方法包括:通过交互装置检测到延时拍摄参数设置操作,再根据所述延时拍摄参数设置操作,确定延时拍摄参数,然后控制无人机根据所述延时拍摄参数进行延时拍摄。因此,用户通过操作控制终端即可实现控制无人机进行延时拍摄,通过无人机来进行延时拍摄,适应各种不同的拍摄应用场景,可以为用户带来不同的拍摄体验。

Description

延时拍摄控制方法和设备 技术领域
本发明实施例涉及无人机技术领域,尤其涉及一种延时拍摄控制方法和设备。
背景技术
延时拍摄又叫缩时摄影(Time-lapse photography)或缩时录影,是一种将时间压缩的拍摄技术,其拍摄的是一组照片或是视频,后期通过照片串联或是视频抽帧,把几分钟、几小时甚至是几天几年的过程压缩在一个较短的时间内以视频的方式播放。目前,在拍摄延时视频时,是将拍摄装置固定在一个稳定的承载机构上来拍摄,例如三角架。由于承载机构的移动和安装限制,限制延时视频的拍摄应用场景。
发明内容
本发明实施例提供一种延时拍摄控制方法和设备,用于通过无人机来进行延时拍摄,扩展延时视频拍摄的应用场景。
第一方面,本发明实施例提供一种延时拍摄控制方法,应用于无人机的控制终端,所述方法包括:
通过交互装置检测到延时拍摄参数设置操作;
根据所述延时拍摄参数设置操作,确定延时拍摄参数;
控制无人机根据所述延时拍摄参数进行延时拍摄。
第二方面,本发明实施例提供一种延时拍摄控制方法,应用于无人机,所述方法包括:
接收控制终端发送的延时拍摄参数,其中,所述延时拍摄参数是所述控制终端通过检测延时拍摄参数设置操作确定的;
根据所述延时拍摄参数控制无人机的拍摄装置进行延时拍摄。
第三方面,本发明实施例提供一种控制终端,包括:
交互装置,用于检测延时拍摄参数设置操作;
处理器,用于根据所述交互装置检测到的所述延时拍摄参数设置操作,确定延时拍摄参数;以及控制无人机根据所述延时拍摄参数进行延时拍摄。
第四方面,本发明实施例提供一种无人机,包括:
通信装置,用于接收控制终端发送的延时拍摄参数,其中,所述延时拍摄参数是所述控制终端通过检测延时拍摄参数设置操作确定的;
处理器,用于根据所述延时拍摄参数控制无人机的拍摄装置进行延时拍摄。
第五方面,本发明实施例提供一种可读存储介质,所述可读存储介质上存储有计算机程序;所述计算机程序在被执行时,实现如第一方面或第二方面本发明实施例所述的延时拍摄控制方法。
本发明实施例提供的延时拍摄控制方法和设备,控制终端通过交互装置检测到延时拍摄参数设置操作,再根据所述延时拍摄参数设置操作,确定延时拍摄参数,然后控制无人机根据所述延时拍摄参数进行延时拍摄。因此,用户通过操作控制终端即可实现控制无人机进行延时拍摄,通过无人机来进行延时拍摄,适应各种不同的拍摄应用场景,可以为用户带来不同的拍摄体验。
附图说明
图1为根据本发明的实施例的无人机的示意性架构图;
图2为本发明一实施例提供的延时拍摄控制方法的流程图;
图3为本发明另一实施例提供的延时拍摄控制方法的流程图;
图4为本发明实施例提供一种无人机对目标对象进行延时拍摄场景的示意图;
图5为本发明一实施例提供的控制终端的一种结构示意图;
图6为本发明一实施例提供的无人机的一种结构示意图;
图7为本发明一实施例提供的延时拍摄系统的一种结构示意图。
具体实施方式
下面将结合本发明实施例中的附图,对本发明实施例中的技术方案进行清楚地描述,显然,所描述的实施例仅仅是本发明一部分实施例,而不是全 部的实施例。基于本发明中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本发明保护的范围。
需要说明的是,当组件被称为“固定于”另一个组件,它可以直接在另一个组件上或者也可以存在居中的组件。当一个组件被认为是“连接”另一个组件,它可以是直接连接到另一个组件或者可能同时存在居中组件。
除非另有定义,本文所使用的所有的技术和科学术语与属于本发明的技术领域的技术人员通常理解的含义相同。本文中在本发明的说明书中所使用的术语只是为了描述具体的实施例的目的,不是旨在于限制本发明。本文所使用的术语“及/或”包括一个或多个相关的所列项目的任意的和所有的组合。
下面结合附图,对本发明的一些实施方式作详细说明。在不冲突的情况下,下述的实施例及实施例中的特征可以相互组合。
本发明的实施例提供了无人机的控制方法、设备和无人机。其中无人机可以是旋翼飞行器(rotorcraft),例如,由多个推动装置通过空气推动的多旋翼飞行器,本发明的实施例并不限于此。
图1是根据本发明的实施例的无人飞行系统的示意性架构图。本实施例以旋翼无人机为例进行说明。
无人飞行系统100可以包括无人机110、显示设备130和控制装置140。其中,无人机110可以包括动力系统150、飞行控制系统160、机架和承载在机架上的云台120。无人机110可以与控制终端140和显示设备130进行无线通信。
机架可以包括机身和脚架(也称为起落架)。机身可以包括中心架以及与中心架连接的一个或多个机臂,一个或多个机臂呈辐射状从中心架延伸出。脚架与机身连接,用于在无人机110着陆时起支撑作用。
动力系统150可以包括一个或多个电子调速器(简称为电调)151、一个或多个螺旋桨153以及与一个或多个螺旋桨153相对应的一个或多个电机152,其中电机152连接在电子调速器151与螺旋桨153之间,电机152和螺旋桨153设置在无人机110的机臂上;电子调速器151用于接收飞行控制系统160产生的驱动信号,并根据驱动信号提供驱动电流给电机152,以控制电机152的转速。电机152用于驱动螺旋桨旋转,从而为无人机110的飞行提供动力, 该动力使得无人机110能够实现一个或多个自由度的运动。在某些实施例中,无人机110可以围绕一个或多个旋转轴旋转。例如,上述旋转轴可以包括横滚轴(Roll)、偏航轴(Yaw)和俯仰轴(pitch)。应理解,电机152可以是直流电机,也可以交流电机。另外,电机152可以是无刷电机,也可以是有刷电机。
飞行控制系统160可以包括飞行控制器161和传感系统162。传感系统162用于测量无人机的姿态信息,即无人机110在空间的位置信息和状态信息,例如,三维位置、三维角度、三维速度、三维加速度和三维角速度等。传感系统162例如可以包括陀螺仪、超声传感器、电子罗盘、惯性测量单元(Inertial Measurement Unit,IMU)、视觉传感器、全球导航卫星系统和气压计等传感器中的至少一种。例如,全球导航卫星系统可以是全球定位系统(Global Positioning System,GPS)。飞行控制器161用于控制无人机110的飞行,例如,可以根据传感系统162测量的姿态信息控制无人机110的飞行。应理解,飞行控制器161可以按照预先编好的程序指令对无人机110进行控制,也可以通过响应来自控制终端140的一个或多个控制指令对无人机110进行控制。
云台120可以包括电机122。云台用于携带拍摄装置123。飞行控制器161可以通过电机122控制云台120的运动。可选地,作为另一实施例,云台120还可以包括控制器,用于通过控制电机122来控制云台120的运动。应理解,云台120可以独立于无人机110,也可以为无人机110的一部分。应理解,电机122可以是直流电机,也可以是交流电机。另外,电机122可以是无刷电机,也可以是有刷电机。还应理解,云台可以位于无人机的顶部,也可以位于无人机的底部。
拍摄装置123例如可以是照相机或摄像机等用于捕获图像的设备,拍摄装置123可以与飞行控制器通信,并在飞行控制器的控制下进行拍摄。本实施例的拍摄装置123至少包括感光元件,该感光元件例如为互补金属氧化物半导体(Complementary Metal Oxide Semiconductor,CMOS)传感器或电荷耦合元件(Charge-coupled Device,CCD)传感器。可以理解,拍摄装置123也可直接固定于无人机110上,从而云台120可以省略。
显示设备130位于无人飞行系统100的地面端,可以通过无线方式与无 人机110进行通信,并且可以用于显示无人机110的姿态信息。另外,还可以在显示设备130上显示成像装置拍摄的图像。应理解,显示设备130可以是独立的设备,也可以集成在控制终端140中。
控制终端140位于无人飞行系统100的地面端,可以通过无线方式与无人机110进行通信,用于对无人机110进行远程操纵。
应理解,上述对于无人飞行系统各组成部分的命名仅是出于标识的目的,并不应理解为对本发明的实施例的限制。
图2为本发明一实施例提供的延时拍摄控制方法的流程图,如图2所示,本实施例的方法应用于无人机的控制终端,本实施例的方法可以包括:
S201、通过交互装置检测到延时拍摄参数设置操作。
本实施例中,无人机的控制终端可以通过交互装置检测到延时拍摄参数设置操作。该控制终端包括遥控器、智能手机、平板电脑、膝上型电脑、穿戴式设备中的一种或多种,此处不再赘述。其中,交互装置可以是控制终端的重要组成部分,是与用户进行交互的接口,用户可以通过对交互装置的操作,实现对无人机的控制;当用户想要控制无人机时,用户对控制终端的交互装置进行操作,控制终端通过该交互装置检测到用户的操作,本实施例中,当用户想要对无人机的延时拍摄参数进行设置时,用户便对交互装置进行延时拍摄参数设置操作,交互装置会对该延时拍摄参数设置操作进行检测,因此,控制终端可以通过交互装置检测到用户的延时拍摄参数设置操作。该交互装置例如可以是控制终端触摸显示屏、键盘、摇杆、波轮中的一种或多种;同时触控屏还可以显示无人机的飞行的所有参数,可以显示无人机拍摄的画面。
S202、根据所述延时拍摄参数设置操作,确定延时拍摄参数。
本实施例中,该控制终端通过交互装置检测到延时拍摄参数设置操作后,根据该延时拍摄参数设置操作,确定用户设置的延时拍摄参数。可选地,该延时拍摄参数可以包括拍摄时间间隔、拍摄时长、拍摄图像的张数、延时视频时长中的至少一种。
S203、控制无人机根据所述延时拍摄参数进行延时拍摄。
本实施例中,控制终端在确定延时拍摄参数后,控制无人机根据该延时拍摄参数进行延时拍摄。可选地,控制终端控制无人机根据该延时拍摄参数 进行延时拍摄例如可以是:控制终端向无人机发送延时拍摄参数,以便无人机根据延时拍摄参数控制拍摄装置进行延时拍摄,其中,无人机的具体实现过程可以参见下述图3所示实施例中的相关描述,此处不再赘述。
可选地,该延时拍摄参数例如可以是拍摄图像的张数和拍摄时长,其中,用户对交互装置执行延时拍摄参数设置操作以直接向控制终端输入拍摄图像的张数和拍摄时长,然后控制终端根据用户通过延时拍摄参数设置操作输入的拍摄图像的张数和拍摄时长,控制无人机进行延时拍摄。例如:拍摄图像的张数为500,拍摄时长为40s,则控制终端可以控制无人机进行延时拍摄40s,获得500张延时拍摄图像。
可选地,该延时拍摄参数例如可以是拍摄时间间隔(例如5s)和延时视频时长(例如40s),其中,用户对交互装置执行延时拍摄参数设置操作以直接向控制终端输入拍摄时间间隔和延时视频时长,然后控制终端根据用户通过延时拍摄参数设置操作输入的拍摄时间间隔和延时视频时长,确定拍摄图像的张数和拍摄时长,再根据确定出的拍摄图像的张数和拍摄时长,控制无人机进行延时拍摄。
可选地,若用户通过延时拍摄参数设置操作输入的是拍摄图像的张数和拍摄时长,则控制终端通过交互界面显示拍摄图像的张数和拍摄时长。若用户通过延时拍摄参数设置操作输入的是拍摄时间间隔和延时视频时长,则控制终端通过交互界面显示拍摄时间间隔和延时视频时长,还通过交互界面显示根据拍摄时间间隔和延时视频时长确定的拍摄图像的张数和拍摄时长。以便用户直观地了解控制终端控制无人机进行延时拍摄的图像张数和时长。
本实施例中,通过交互装置检测到延时拍摄参数设置操作,再根据所述延时拍摄参数设置操作,确定延时拍摄参数,然后控制无人机根据所述延时拍摄参数进行延时拍摄。因此,用户通过操作控制终端即可实现控制无人机进行延时拍摄,通过无人机来进行延时拍摄,适应各种不同的拍摄应用场景,为用户带来了不同的拍摄体验。
图3为本发明另一实施例提供的延时拍摄控制方法的流程图,如图3所示,本实施例的方法应用于无人机,本实施例的方法可以包括:
S301、接收控制终端发送的延时拍摄参数,其中,所述延时拍摄参数是所述控制终端通过检测延时拍摄参数设置操作确定的。
S302、根据所述延时拍摄参数控制无人机的拍摄装置进行延时拍摄。
本实施例中,无人机接收控制终端发送的延时拍摄参数,该延时拍摄参数是控制终端通过检测延时拍摄参数设置操作确定,具体实现过程可以参见图2所示实施例中的相关描述,此处不再赘述。然后无人机根据接收的延时拍摄参数控制拍摄装置进行延时拍摄。可选地,该延时拍摄参数可以包括拍摄时间间隔、拍摄时长、拍摄图像的张数、延时视频时长中的至少一种。
可选地,用户通过延时拍摄参数设置操作向控制终端输入拍摄图像的张数和拍摄时长,然后控制终端将用户输入的拍摄图像的张数和拍摄时长作为延时拍摄参数发送给无人机。然后无人机根据接收的拍摄图像的张数和拍摄时长控制拍摄装置进行延时拍摄。
可选地,用户通过延时拍摄参数设置操作向控制终端输入拍摄时间间隔和延时视频时长,然后控制终端根据用户输入的拍摄时间间隔和延时视频时长,确定拍摄图像的张数和拍摄时长,再将确定的拍摄图像的张数和拍摄时长作为延时拍摄参数发送给无人机。然后无人机根据接收的拍摄图像的张数和拍摄时长控制拍摄装置进行延时拍摄。
可选地,用户通过延时拍摄参数设置操作向控制终端输入拍摄时间间隔和延时视频时长,然后控制终端将用户输入的拍摄时间间隔和延时视频时长作为延时拍摄参数发送给无人机。然后无人机接收的拍摄时间间隔和延时视频时长,确定拍摄图像的张数和拍摄时长,并根据确定的拍摄图像的张数和拍摄时长控制拍摄装置进行延时拍摄。
本实施例中,无人机通过接收控制终端发送的延时拍摄参数,在飞行的过程中,根据所述延时拍摄参数控制拍摄装置进行延时拍摄,因此,无人机可受控于控制终端进行延时拍摄,通过无人机来进行延时拍摄,为用户带来了不同的延时拍摄体验。
在一些实施例中,控制终端控制无人机进行延时拍摄后,控制终端还可以获得延时视频。在一种实现方式中,控制终端可以获取无人机发送的延时视频。其中,无人机根据延时拍摄参数控制拍摄装置进行延时拍摄,可以获得延时拍摄图像,然后无人机根据所述延时拍摄图像获得的延时拍摄图像生成延时视频,并将生成的延时视频发送给控制终端,相应地,控制终端获取无人机发送的延时视频。例如:无人机可以每生成一个延时视频主动向控制 终端发送延时视频,以便控制终端实时将延时视频显示给用户观看,或者,也可以是无人机接收到控制终端发送的获取延时视频指令时再发送给控制终端。其中,无人机可以通过无线通信链路或者有线通信链路向控制终端发送延时视频。
在另一实现方式中,控制终端可以生成延时视频。其中,无人机根据延时拍摄参数控制拍摄装置进行延时拍摄,可以获得延时拍摄图像,然后无人机向控制终端发送延时拍摄图像。相应地,控制终端获取无人机发送的延时拍摄图像,然后控制终端根据延时拍摄图像生成延时视频。其中,无人机可以通过无线通信链路或者有线通信链路向控制终端发送延时拍摄图像。
在一些实施例中,控制终端获得延时视频后,用户可以对该延时视频进行分享。在用户需要对延时视频进行分享时,用户可以对交互装置进行分享操作。而控制终端可以通过交互装置检测分享操作,在通过交互装置检测到用户的分享操作后,对该延时视频进行分享,例如:控制终端可以将该延时视频发布到网络(例如社交网站,或者社交APP等)。
在一些实施例中,控制终端在检测到开始延时拍摄操作时才根据延时拍摄参数控制无人机进行延时拍摄。具体可以为:控制终端先通过交互装置检测到延时拍摄参数设置操作,并根据延时拍摄参数设置操作确定延时拍摄参数,在确定延时拍摄参数后再通过交互装置检测开始延时拍摄操作,当用户想要控制无人机开始进行延时拍摄时,用户可以对交互装置进行开始延时拍摄操作,例如:控制终端可以显示开始延时拍摄图标,用户可以通过交互装置对开始延时拍摄图标进行触点操作。然后控制终端在检测到该开始延时拍摄操作时,控制无人机根据延时拍摄参数进行延时拍摄,其中,控制终端例如可以向无人机发送延时拍摄开始指令,该延时拍摄开始指令用于控制无人机开始进行延时拍摄。相应地,无人机先接收控制终端发送的延时拍摄参数,再等待接收控制终端发送的延时拍摄开始指令,当无人机接收到延时拍摄开始指令时,根据延时拍摄参数控制拍摄装置进行延时拍摄。
在一些实施例中,用户需要操作控制终端进入延时拍摄模式后,才能设置延时拍摄参数。也就是,当用户需要控制无人机进行延时拍摄时,用户需要先控制控制终端进入延时拍摄模式,因此,控制终端通过交互装置检测延时拍摄触发操作,当用户对交互装置进行延时拍摄触发操作时,控制终端可 以通过交互装置检测到延时拍摄触发操作,在控制终端检测到延时拍摄触发操作时,控制终端进入延时拍摄模式。可选地,控制终端在进入延时拍摄模式后,控制终端还可以显示延时拍摄设置界面。在控制终端进入延时拍摄模式后,用户(例如可以基于显示的延时拍摄设置界面)再对交互装置进行延时拍摄参数设置操作,相应地,控制终端通过交互装置检测到延时拍摄参数设置操作。
在一些实施例中,用户还可以对无人机进行延时拍摄时拍摄装置的成像参数进行设置,其中,所述成像参数可以包括焦距、曝光参数、对焦中的至少一个。因此,控制终端可以通过交互装置检测成像参数设置操作。当用户需要对成像参数进行设置时,用户可以对交互装置进行成像参数设置操作,相应地,控制终端通过交互装置检测到成像参数设置操作,并根据该成像参数设置操作,确定成像参数。控制无人机根据上述的延时拍摄参数和确定的成像参数,进行延时拍摄,具体可以为:控制终端向无人机发送上述的延时拍摄参数和确定的成像参数,相应地,无人机接收控制终端发送的延时拍摄参数和成像参数,然后无人机根据接收的延时拍摄参数和成像参数控制拍摄装置进行延时拍摄。其中,控制终端可以同时向无人机发送上述的延时拍摄参数和成像参数,也可以分别向无人机发送延时拍摄参数和成像参数。可选地,控制终端可以将延时拍摄参数和成像参数包括在延时拍摄开始指令中发送给无人机。
在一些实施例中,用户不仅可以控制无人机进行延时拍摄,还可以控制无人机在进行延时拍摄时的飞行模式。因此,当用户需要设置无人机的飞行模式时,用户对交互装置进行无人机的飞行模式设置操作,相应地,控制终端可以通过交互装置检测到飞行模式设置操作,然后根据该飞行模式设置操作,确定无人机的飞行模式,再控制无人机按照该飞行模式飞行,在所述无人机按照所述飞行模式飞行的过程中,控制所述无人机根据延时拍摄参数进行延时拍摄。例如:控制终端在确定无人机的飞行模式后,向无人机发送飞行模式设置指令;无人机接收到控制终端发送的飞行模式设置指令后,根据该飞行模式设置指令,确定无人机的飞行模式,然后无人机按照该飞行模式飞行,在无人机按照所述飞行模式飞行的过程中,无人机根据所述延时拍摄参数控制拍摄装置进行延时拍摄。
可选地,该飞行模式可以为自由飞行模式,和/或者,轨迹飞行模式,和/或者,直线飞行模式,和/或者,环绕飞行模式。
以飞行模式为自由飞行模式为例,无人机在自由飞行模式下的飞行轨迹受控于用户的操作。因此,上述控制终端控制无人机按照飞行模式飞行例如可以为:控制终端通过交互装置检测飞行控制操作,该飞行控制操作是用户对交互装置进行的操作,然后控制终端根据该飞行控制操作,确定控制杆量,再根据控制杆量控制无人机飞行。其中,本实施例的控制终端在根据控制杆量控制无人机飞行的过程中,控制无人机进行延时拍摄,相应地,无人机在根据控制杆量飞行的过程中,控制拍摄装置进行延时拍摄。
其中,控制终端根据控制杆量控制无人机飞行例如可以为:控制终端向无人机发送控制杆量。相应地,无人机按照飞行模式飞行例如可以为:无人机接收控制终端发送的控制杆量,然后根据该控制杆量飞行。该控制杆量可以控制无人机的飞行轨迹和/或控制无人机的拍摄姿态,该控制杆量例如可以控制无人机在空中的位置、飞行方向、飞行速度、飞行距离、飞行加速度等。
以飞行模式为轨迹飞行模式为例,无人机的飞行轨迹是预先设定好的。所述控制终端控制无人机按照飞行模式飞行例如为:控制终端控制所述无人机按照预设的轨迹飞行。相应地,无人机按照飞行模式飞行例如为:无人机获取预设的轨迹,然后按照预设的轨迹飞行。其中,本实施例的控制终端在控制无人机按照预设的轨迹飞行的过程中,控制无人机进行延时拍摄,相应地,无人机在按照预设的轨迹飞行的过程中,控制拍摄装置进行延时拍摄。
该预设的轨迹可以是预先保存在无人机中,也可以是预先保存在控制终端中,若预设的轨迹保存在无人机中,则无人机从本地获取预设的轨迹,若预设的轨迹保存在控制终端中,则无人机接收控制终端发送的预设的轨迹。可选地,所述预设的轨迹中至少包括多个航点,其中,所述航点中至少包括位置信息,因此,无人机按照各个航点的位置信息飞行到相应地位置。可选地,所述航点还包括拍摄姿态和/或成像参数,因此,无人机按照各个航点的拍摄姿态和/或成像参数,调整拍摄姿态和/或成像参数。
其中,上述的航点可以是用户在控制终端显示的地图上进行打点操作得到,某些情况中,在用户操作控制终端以控制无人机飞到一些地点,无人机再记录这些地点为航点,记录这些点的位置信息,还可以记录无人机在这些 地点的拍摄姿态和/或成像参数。
在一些实施例中,用户通过操作控制终端控制无人机按照预设的轨迹飞行过程中控制无人机进行延时拍摄时,还可以控制无人机对目标对象进行跟踪拍摄。因此,控制终端控制无人机按照预设的轨迹飞行例如为:控制终端通过交互装置检测到目标对象选择操作;根据所述目标对象选择操作,确定目标对象指示信息;控制所述无人机按照预设的轨迹飞行,控制所述无人机的拍摄姿态以使所述无人机对目标对象指示信息指示的目标对象进行跟踪。当用户需要控制无人机跟踪某一目标对象进行延时拍摄时,用户对交互装置进行目标对象选择操作,相应地,控制终端通过交互界面检测到目标对象选择操作,根据该目标对象选择操作确定目标对象指示信息,例如控制终端显示无人机的拍摄装置的拍摄画面,目标对象选择操作可以为对拍摄画面中的目标对象进行框选的画框操作,画框操作框选的对象可以为目标对象,该画框操作框选的对象的指示信息为目标对象指示信息,所述目标对象指示信息可以为目标对象在拍摄画面中的位置;然后控制终端控制无人机按照预设的轨迹飞行,控制无人机的拍摄姿态以使无人机对目标对象指示信息指示的目标对象进行跟踪,即控制无人机的拍摄装置的拍摄姿态以使目标对象始终在拍摄装置的拍摄画面中。
其中,控制终端控制无人机的拍摄姿态以使无人机对目标对象指示信息指示的目标对象进行跟踪的一种实现方式为:控制终端向无人机发送目标对象指示信息。相应地,无人机按照预设的轨迹飞行例如可以为:无人机接收控制终端发送的目标对象指示信息后,无人机根据预设的轨迹飞行,并控制拍摄装置的拍摄姿态以对目标对象指示信息指示的目标对象进行跟踪。
其中,对该目标对像进行跟踪是指无人机的拍摄装置始终对准该目标对像,使得该目标对象处于该无人机的拍摄画面中,例如可以使得该目标对象处于该无人机的拍摄画面的中心位置。
以飞行模式为直线飞行模式为例,无人机的飞行轨迹是直线。所述控制终端控制无人机按照飞行模式飞行例如为:控制终端通过所述交互装置检测到飞行方向设置操作,该飞行方向设置操作是用户对交互装置进行的操作;控制终端再根据所述飞行方向设置操作,确定飞行方向指示信息,该飞行方向是根据飞行方向指示信息确定的。例如用户在控制终端中显示拍摄装置的 拍摄画面的交互界面的点击确定的飞行方向;然后控制终端控制所述无人机按照所述飞行方向指示信息指示的飞行方向直线飞行。其中,本实施例的控制终端在控制无人机按照所述飞行方向直线飞行的过程中,控制无人机进行延时拍摄,相应地,无人机在按照所述飞行方向直线飞行的过程中,控制拍摄装置进行延时拍摄。
其中,控制终端控制所述无人机按照所述飞行方向直线飞行例如可以为:控制终端向无人机发送飞行方向指示信息。相应地,无人机按照飞行模式飞行例如可以为:无人机接收控制终端发送的飞行方向指示信息,然后根据该飞行方向指示信息指示的飞行方向直线飞行。
在一些实施例中,用户通过操作控制终端控制无人机按照所述飞行方向直线飞行过程中控制无人机进行延时拍摄时,还可以控制无人机对目标对象进行跟踪拍摄。因此,控制终端控制无人机按照所述飞行方向直线飞行例如为:控制终端通过交互装置检测到目标对象选择操作;根据所述目标对象选择操作,确定目标对象指示信息;控制所述无人机按照所述飞行方向直线飞行,控制所述无人机的拍摄姿态以使所述无人机对目标对象指示信息指示的目标对象进行跟踪。
当用户需要控制无人机跟踪某一目标对象进行延时拍摄时,用户对交互装置进行目标对象选择操作,相应地,控制终端通过交互界面检测到目标对象选择操作,根据该目标对象选择操作确定目标对象指示信息,例如目标对象选择操作可以为框选目标对象的画框操作,画框操作框选的对象可以为目标对象,该画框操作框选的对象的指示信息为目标对象指示信息;然后控制终端控制无人机按照所述飞行方向直线飞行,控制无人机的拍摄姿态以使无人机对目标对象指示信息指示的目标对象进行跟踪,具体原理请参见前述部分。
其中,控制终端控制无人机的拍摄姿态以使无人机对目标对象指示信息指示的目标对象进行跟踪的一种实现方式为:控制终端向无人机发送目标对象指示信息。相应地,无人机按照预设的轨迹飞行例如可以为:无人机接收控制终端发送的目标对象指示信息后,按照所述飞行方向直线飞行,并控制拍摄姿态以对目标对象指示信息指示的目标对象进行跟踪。
以飞行模式为环绕飞行模式为例,无人机的飞行轨迹是环绕一目标对象 飞行。所述控制终端控制无人机按照飞行模式飞行例如为:控制终端通过所述交互装置检测到目标对象选择操作,该目标对象选择操作为用户对交互装置进行的操作;控制终端再根据所述目标对象选择操作,确定目标对象指示信息,例如该目标对象选择操作为框选目标对象的画框操作,画框操作框选的对象为目标对象;然后控制终端控制所述无人机对目标对象指示信息指示的目标对象环绕飞行。其中,本实施例的控制终端在控制无人机对目标对象环绕飞行的过程中,控制无人机进行延时拍摄,相应地,无人机对目标对象环绕飞行的过程中,控制拍摄装置进行延时拍摄。
其中,控制终端控制所述无人机对目标对象指示信息指示的目标对象环绕飞行例如可以为:控制终端向无人机发送目标对象指示信息。相应地,无人机按照飞行模式飞行例如可以为:无人机接收所述控制终端发送的目标对象指示信息,根据所述目标对象指示信息,对目标对象环绕飞行。其中,无人机对目标对象环绕飞行可以是无人机调整拍摄姿态以便目标对象处于无人机的拍摄画面中,例如处于拍摄画面的中心位置。如图4所示,无人机401对目标对象402环绕飞行,在环绕飞行的过程中,无人机401可以控制拍摄装置的拍摄姿态以对目标对象402进行跟踪,即使得目标对象402在拍摄装置的拍摄画面中。另外,无人机401可以控制拍摄装置根据所述延时拍摄参数对目标对象402进行延时拍摄。
可选地,用户可以对交互装置进行环绕方向设置操作,相应地,控制终端通过交互装置检测到环绕方向设置操作,确定环绕方向,例如顺时针或逆时针,然后控制终端控制无人机按照所述环绕方向对目标对环绕飞行,例如控制终端可以向无人机发送环绕方向指示信息,相应地,无人机接收环绕方向指示信息,并按照该环绕方向指示信息,对目标对象环绕飞行。
可选地,用户可以对交互装置进行环绕距离设置操作,相应地,控制终端通过交互装置检测到环绕距离设置操作,确定环绕距离,该环绕距离例如可以是指无人机与目标对象之间的水平距离,然后控制终端控制无人机按照所述环绕距离对目标对环绕飞行,例如控制终端可以向无人机发送环绕距离指示信息,相应地,无人机接收环绕距离指示信息,并按照该环绕距离指示信息指示的环绕距离,对目标对象环绕飞行。
可选地,用户可以对交互装置进行环绕高度设置操作,相应地,控制终 端通过交互装置检测到环绕高度设置操作,确定环绕高度,该环绕高度例如可以是指无人机与目标对象之间的垂直距离,然后控制终端控制无人机按照所述环绕高度对目标对环绕飞行,例如控制终端可以向无人机发送环绕高度指示信息,相应地,无人机接收环绕高度指示信息,并按照该环绕高度指示信息指示的环绕高度,对目标对象环绕飞行。
在上述各实施例的基础上,可选地,在无人机进行延时拍摄的过程中,用户可以控制无人机随时暂停延时拍摄。具体地,在无人机进行延时拍摄的过程中,控制终端通过交互装置检测暂停延时拍摄操作;当用户需要控制无人机暂停延时拍摄时,用户可以对交互装置进行暂停延时拍摄操作,例如:控制终端可以在无人机进行延时拍摄时显示暂停延时拍摄图标,用户可以通过交互装置对该暂停延时拍摄图标进行触点操作。相应地,控制终端可以通过交互装置检测到暂停延时拍摄操作,在检测到该暂停延时拍摄操作时,控制无人机暂停延时拍摄,例如:控制终端在检测到暂停延时拍摄操作时,向无人机发送暂停延时拍摄指令,相应地,无人机接收控制终端发送的暂停延时拍摄指令,并根据该暂停延时拍摄指令,控制拍摄装置暂停延时拍摄。
可选地,在无人机暂时延时拍摄后,用户还可以控制无人机恢复延时拍摄,具体地,在无人机暂停延时拍摄后,控制终端通过交互装置检测恢复延时拍摄操作;当用户需要控制无人机恢复延时拍摄时,用户可以对交互装置进行恢复延时拍摄操作,例如:控制终端可以在无人机暂停延时拍摄后显示恢复延时拍摄图标,用户可以通过交互装置对该恢复延时拍摄图标进行触点操作。相应地,控制终端可以通过交互装置检测到恢复延时拍摄操作,在检测到该恢复延时拍摄操作时,控制无人机恢复延时拍摄,例如:控制终端在检测到恢复延时拍摄操作时,向无人机发送恢复延时拍摄指令,相应地,无人机接收控制终端发送的恢复延时拍摄指令,并根据该恢复延时拍摄指令,继续控制拍摄装置进行延时拍摄。
可选地,上述的开始延时拍摄图标、上述的暂停延时拍摄图标、上述的恢复延时拍摄图标可以为控制终端显示的同一图标,该图标在无人机处于不同的操作下具有不同的功能。
综上所述,本发明实施例中,用户可以通过操作控制终端以设置延时拍摄参数,并使得无人机根据用户设置的延时拍摄参数进行延时拍摄,另外, 本发明实施例的无人机还可以在根据自由飞行模式、直线飞行模式、轨迹飞行模式或环绕飞行模式等飞行模式进行飞行时,控制拍摄装置进行延时拍摄,这样获得的延时视频更加精彩。由于本实施例中用户可以通过操作控制终端即可实现控制无人机进行延时拍摄,通过无人机来进行延时拍摄,适应各种不同的拍摄应用场景,可以为用户带来不同的拍摄体验。
本发明实施例中还提供了一种计算机存储介质,该计算机存储介质中存储有程序指令,所述程序执行时可包括上述各实施例中的延时拍摄控制方法的部分或全部步骤。
图5为本发明一实施例提供的控制终端的一种结构示意图,如图5所示,本实施例的控制终端500可以用于控制无人机,控制终端500可以包括:交互装置501和处理器502。上述处理器502可以是中央处理单元(Central Processing Unit,CPU),该处理器502还可以是其他通用处理器、数字信号处理器(Digital Signal Processor,DSP)、专用集成电路(Application Specific Integrated Circuit,ASIC)、现成可编程门阵列(Field-Programmable Gate Array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。
其中,交互装置501,用于检测延时拍摄参数设置操作。处理器502,用于根据所述交互装置检测到的所述延时拍摄参数设置操作,确定延时拍摄参数;以及控制无人机根据所述延时拍摄参数进行延时拍摄。
可选地,所述延时拍摄参数包括拍摄时间间隔、拍摄时长、拍摄图像的张数、延时视频时长中的至少一种。
可选地,所述处理器502,还用于获取所述无人机发送的延时视频,其中,所述延时视频是所述无人机根据所述延时拍摄参数进行延时拍摄获取的延时拍摄图像生成的。
可选地,所述处理器502,还用于获取所述无人机发送的延时拍摄图像,以及根据所述延时拍摄图像,生成延时视频。其中,所述延时拍摄图像是所述无人机根据所述延时拍摄参数进行延时拍摄获取的。
可选地,所述交互装置501,还用于检测分享操作。所述处理器502,还用于所述交互装置检测到分享操作后,对所述延时拍摄视频进行分享。
可选地,所述交互装置501,还用于在所述处理器控制无人机根据所述延时拍摄参数进行延时拍摄之前,检测开始延时拍摄操作。所述处理器502,具体用于在所述交互装置检测到所述开始延时拍摄操作时,控制所述无人机根据所述延时拍摄参数进行延时拍摄。
可选地,所述交互装置501,还用于在检测到延时拍摄参数设置操作之前,检测延时拍摄模式触发操作。所述处理器502,还用于在所述交互装置检测到延时拍摄模式触发操作时,进入延时拍摄模式。所述交互装置501在检测延时拍摄参数设置操作时,具体用于:在进入延时拍摄模式之后,检测延时拍摄参数设置操作。
可选地,所述交互装置501,还用于检测成像参数设置操作。所述处理器502,还用于根据所述交互装置501检测到的所述成像参数设置操作,确定成像参数。所述处理器502在控制无人机根据所述延时拍摄参数进行延时拍摄时,具体用于:控制所述无人机根据所述延时拍摄参数和所述成像参数,进行延时拍摄。
可选地,所述交互装置501,还用于检测无人机的飞行模式设置操作。所述处理器502,还用于根据所述交互装置501检测到的所述飞行模式设置操作,确定所述无人机的飞行模式;以及控制所述无人机按照所述飞行模式飞行。所述处理器502在控制无人机根据延时拍摄参数进行延时拍摄时,具体用于:在所述无人机按照所述飞行模式飞行的过程中,控制所述无人机根据延时拍摄参数进行延时拍摄。
可选地,所述飞行模式为自由飞行模式。所述处理器502在控制所述无人机按照所述飞行模式飞行时,具体用于:通过所述交互装置501检测到飞行控制操作;根据所述飞行控制操作,确定控制杆量;根据所述控制杆量控制所述无人机飞行。
可选地,所述飞行模式为轨迹飞行模式。所述处理器502在控制所述无人机按照所述飞行模式飞行时,具体用于:控制所述无人机按照预设的轨迹飞行。
可选地,所述交互装置501,还用于检测目标对象选择操作。所述处理器502在控制所述无人机按照预设的轨迹飞行时,具体用于:根据所述交互装置501检测到的所述目标对象选择操作,确定目标对象指示信息;控制所 述无人机按照预设的轨迹飞行,控制所述无人机的拍摄姿态以使所述无人机对目标对象指示信息指示的目标对象进行跟踪。
可选地,所述预设的轨迹中至少包括多个航点,其中,所述航点中至少包括位置信息。
可选地,所述航点还包括拍摄姿态和/或成像参数。
可选地,所述飞行模式为直线飞行模式。所述交互装置501,还用于检测飞行方向设置操作。所述处理器502在控制所述无人机按照所述飞行模式飞行时,具体用于:根据所述交互装置501检测到的所述飞行方向设置操作,确定飞行方向;控制所述无人机按照所述飞行方向直线飞行。
可选地,所述交互装置501,还用于检测目标对象选择操作。所述处理器502在控制所述无人机按照所述飞行方向直线飞行时,具体用于:根据所述交互装置501检测到的所述目标对象选择操作,确定目标对象指示信息;控制所述无人机按照所述飞行方向直线飞行,并根据所述目标对象指示信息控制所述无人机的拍摄姿态以使所述无人机对所述目标对象指示信息指示的目标对象进行跟踪飞行。
可选地,所述飞行模式为环绕飞行模式。所述交互装置501,还用于检测目标对象选择操作。所述处理器502在控制所述无人机按照所述飞行模式飞行时,具体用于:根据所述交互装置501检测到的所述目标对象选择操作,确定目标对象指示信息;控制所述无人机对目标对象指示信息指示的目标对象环绕飞行。
可选地,所述交互装置501,还用于在无人机进行延时拍摄的过程中,检测暂停延时拍摄操作。所述处理器502,还用于在所述交互装置501检测到所述暂停延时拍摄操作时,控制所述无人机暂停延时拍摄。
可选地,本实施例的控制终端500还可以包括存储器(图中未示出),存储器用于存储程序代码,当程序代码被执行时,所述控制终端500可以实现上述控制终端的技术方案。
本实施例的控制终端,可以用于执行本发明上述各方法实施例中控制终端的技术方案,其实现原理和技术效果类似,此处不再赘述。
图6为本发明一实施例提供的无人机的一种结构示意图,如图6所示,本实施例的无人机600可以包括:通信装置601、处理器602和拍摄装置603。
通信装置601,用于接收控制终端发送的延时拍摄参数,其中,所述延时拍摄参数是所述控制终端通过检测延时拍摄参数设置操作确定的。处理器602,用于根据所述延时拍摄参数控制拍摄装置603进行延时拍摄。
可选地,所述延时拍摄参数包括拍摄时间间隔、拍摄时长、拍摄图像的张数、延时视频时长中的至少一种。
可选地,所述处理器602,还用于根据延时拍摄获取的延时拍摄图像,生成延时视频。所述通信装置601,还用于向所述控制终端发送所述延时视频。
可选地,所述通信装置601,还用于向所述控制终端发送延时拍摄图像,其中,所述延时拍摄图像用于生成延时视频。
可选地,所述通信装置601,还用于接收控制终端发送的延时拍摄开始指令,其中,所述延时拍摄开始指令是所述控制终端通过检测开始延时拍摄操作确定的。所述处理器602,具体用于:在所述通信装置601接收到所述延时拍摄开始指令时,根据所述延时拍摄参数控制拍摄装置603进行延时拍摄。
可选地,所述通信装置601,还用于接收所述控制终端发送的成像参数,所述成像参数是所述控制终端通过检测成像参数设置操作确定的;
所述处理器602在根据所述延时拍摄参数控制拍摄装置603进行延时拍摄时,具体用于:根据所述延时拍摄参数和所述成像参数,控制拍摄装置603进行延时拍摄。
所述通信装置601,还用于接收所述控制终端发送的飞行模式设置指令,所述飞行模式设置指令是所述控制终端通过检测飞行模式设置操作确定的。
所述处理器602,还用于根据所述飞行模式设置指令,确定所述无人机的飞行模式;以及控制无人机按照所述飞行模式飞行。所述处理器602在根据所述延时拍摄参数控制拍摄装置603进行延时拍摄时,具体用于:在所述无人机按照所述飞行模式飞行的过程中,根据所述延时拍摄参数控制拍摄装置603进行延时拍摄。
可选地,所述飞行模式为自由飞行模式。所述通信装置601,还用于接收所述控制终端发送的控制杆量。所述处理器602在控制无人机按照所述飞行模式飞行时,具体用于:根据所述通信装置601接收的所述控制杆量控制 无人机飞行。
可选地,所述飞行模式为轨迹飞行模式。所述处理器控制无人机按照所述飞行模式飞行时,具体用于:获取预设的轨迹;控制无人机按照所述预设的轨迹飞行。
可选地,所述通信装置601,还用于接收所述控制终端发送的目标对象指示信息,所述目标对象指示信息是所述控制终端通过检测目标对象选择操作确定的。所述处理器602控制无人机按照所述预设的轨迹飞行时,具体用于:控制无人机按照所述预设的轨迹飞行,并控制拍摄装置603的拍摄姿态以对所述目标对象指示信息指示的目标对象进行跟踪。
可选地,所述预设的轨迹中至少包括多个航点,其中,所述航点中至少包括位置信息。
可选地,所述航点还包括拍摄姿态和/或成像参数。
可选地,所述飞行模式为直线飞行模式。所述通信装置601,还用于接收所述控制终端发送的飞行方向指示信息,所述飞行方向指示信息是所述控制终端通过检测飞行方向设置操作确定的。所述处理器602控制无人机按照所述飞行模式飞行时,具体用于:控制无人机按照所述飞行方向指示信息指示的飞行方向直线飞行。
可选地,所述通信装置601,还用于接收所述控制终端的发送的目标对象指示信息,所述目标对象指示信息是所述控制终端通过检测目标对象选择操作确定的。所述处理器602控制无人机按照飞行方向直线飞行时,具体用于:控制无人机按照所述飞行方向直线飞行,并控制拍摄装置603的拍摄姿态以对所述目标对象指示信息指示的目标对象进行跟踪飞行。
可选地,所述飞行模式为环绕飞行模式。所述通信装置601,还用于接收所述控制终端发送的目标对象指示信息,所述目标对象指示信息是所述控制终端通过检测目标对象选择操作确定的。所述处理器602控制无人机按照所述飞行模式飞行时,具体用于:根据所述目标对象指示信息,控制无人机对目标对象环绕飞行。
可选地,所述通信装置601,还用于在所述拍摄装置603进行延时拍摄的过程中,接收所述控制终端发送的暂停延时拍摄指令,所述暂停延时拍摄指令是所述控制终端通过检测暂停延时拍摄操作确定的。所述处理器602, 还用于根据所述暂停延时拍摄指令,控制拍摄装置603暂停延时拍摄。
可选地,本实施例的无人机600还可以包括存储器(图中未示出),存储器用于存储程序代码,当程序代码被执行时,所述无人机600可以实现上述无人机的技术方案。
可选地,所述处理器602可以包括飞行控制器。
本实施例的无人机,可以用于执行本发明上述各方法实施例中无人机的技术方案,其实现原理和技术效果类似,此处不再赘述。
图7为本发明一实施例提供的延时拍摄系统的一种结构示意图,如图7所示,本实施例的延时拍摄系统700可以包括:控制终端701和无人机702。其中,控制终端701可以采用图5所示实施例的结构,其对应地,可以执行上述各方法实施例中控制终端的技术方案,其实现原理和技术效果类似,此处不再赘述。无人机702可以采用图6所示实施例的结构,其对应地,可以执行上述各方法实施例中无人机的技术方案,其实现原理和技术效果类似,此处不再赘述。
本领域普通技术人员可以理解:实现上述方法实施例的全部或部分步骤可以通过程序指令相关的硬件来完成,前述的程序可以存储于一计算机可读取存储介质中,该程序在执行时,执行包括上述方法实施例的步骤;而前述的存储介质包括:只读内存(Read-Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
最后应说明的是:以上各实施例仅用以说明本发明的技术方案,而非对其限制;尽管参照前述各实施例对本发明进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分或者全部技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本发明各实施例技术方案的范围。

Claims (68)

  1. 一种延时拍摄控制方法,应用于无人机的控制终端,其特征在于,包括:
    通过交互装置检测到延时拍摄参数设置操作;
    根据所述延时拍摄参数设置操作,确定延时拍摄参数;
    控制无人机根据所述延时拍摄参数进行延时拍摄。
  2. 根据权利要求1所述的方法,其特征在于,所述延时拍摄参数包括拍摄时间间隔、拍摄时长、拍摄图像的张数、延时视频时长中的至少一种。
  3. 根据权利要求1或2所述的方法,其特征在于,所述方法还包括,
    获取所述无人机发送的延时视频,其中,所述延时视频是所述无人机根据所述延时拍摄参数进行延时拍摄获取的延时拍摄图像生成的。
  4. 根据权利要求1或2所述的方法,其特征在于,所述方法还包括,
    获取所述无人机发送的延时拍摄图像;
    根据所述延时拍摄图像,生成延时视频;
    其中,所述延时拍摄图像是所述无人机根据所述延时拍摄参数进行延时拍摄获取的。
  5. 根据权利要求3或4所述的方法,其特征在于,所述方法还包括:
    在通过所述交互装置检测到分享操作后,对所述延时拍摄视频进行分享。
  6. 根据权利要求1-5任一项所述的方法,其特征在于,所述控制无人机根据所述延时拍摄参数进行延时拍摄之前,还包括:
    通过所述交互装置检测开始延时拍摄操作;
    所述控制无人机根据所述延时拍摄参数进行延时拍摄,包括:
    在检测到所述开始延时拍摄操作时,控制所述无人机根据所述延时拍摄参数进行延时拍摄。
  7. 根据权利要求1-6任一项所述的方法,其特征在于,所述通过交互装置检测到延时拍摄参数设置操作之前,还包括:
    通过所述交互装置检测延时拍摄模式触发操作;
    在检测到延时拍摄模式触发操作时,进入延时拍摄模式;
    所述通过交互装置检测到延时拍摄参数设置操作,包括:
    在进入延时拍摄模式之后,通过交互装置检测到延时拍摄参数设置操作。
  8. 根据权利要求1-7任一项所述的方法,其特征在于,所述方法还包括:
    通过所述交互装置检测到成像参数设置操作;
    根据所述成像参数设置操作,确定成像参数;
    所述控制无人机根据所述延时拍摄参数进行延时拍摄,包括:
    控制所述无人机根据所述延时拍摄参数和所述成像参数,进行延时拍摄。
  9. 根据权利要求1-8任一项所述的方法,其特征在于,所述方法还包括:
    通过所述交互装置检测到无人机的飞行模式设置操作;
    根据所述飞行模式设置操作,确定所述无人机的飞行模式;
    控制所述无人机按照所述飞行模式飞行;
    所述控制无人机根据延时拍摄参数进行延时拍摄,包括:
    在所述无人机按照所述飞行模式飞行的过程中,控制所述无人机根据延时拍摄参数进行延时拍摄。
  10. 根据权利要求9所述的方法,其特征在于,所述飞行模式为自由飞行模式;
    所述控制所述无人机按照所述飞行模式飞行,包括:
    通过所述交互装置检测到飞行控制操作,
    根据所述飞行控制操作,确定控制杆量;
    根据所述控制杆量控制所述无人机飞行。
  11. 根据权利要求9所述的方法,其特征在于,所述飞行模式为轨迹飞行模式;
    所述控制所述无人机按照所述飞行模式飞行,包括:
    控制所述无人机按照预设的轨迹飞行。
  12. 根据权利要求11所述的方法,其特征在于,
    所述控制所述无人机按照预设的轨迹飞行,包括:
    通过交互装置检测到目标对象选择操作;
    根据所述目标对象选择操作,确定目标对象指示信息;
    控制所述无人机按照预设的轨迹飞行,控制所述无人机的拍摄姿态以使所述无人机对目标对象指示信息指示的目标对象进行跟踪。
  13. 根据权利要求11或12所述的方法,其特征在于,所述预设的轨迹中至少包括多个航点,其中,所述航点中至少包括位置信息。
  14. 根据权利要求13所述的方法,其特征在于,所述航点还包括拍摄姿态和/或成像参数。
  15. 根据权利要求9所述的方法,其特征在于,所述飞行模式为直线飞行模式;
    所述控制所述无人机按照所述飞行模式飞行,包括:
    通过所述交互装置检测到飞行方向设置操作;
    根据所述飞行方向设置操作,确定飞行方向;
    控制所述无人机按照所述飞行方向直线飞行。
  16. 根据权利要求15所述的方法,其特征在于,
    所述控制所述无人机按照所述飞行方向直线飞行,包括:
    通过所述交互装置检测到目标对象选择操作;
    根据所述目标对象选择操作,确定目标对象指示信息;
    控制所述无人机按照所述飞行方向直线飞行,并根据所述目标对象指示信息控制所述无人机的拍摄姿态以使所述无人机对所述目标对象指示信息指示的目标对象进行跟踪飞行。
  17. 根据权利要求9所述的方法,其特征在于,所述飞行模式为环绕飞行模式;
    所述控制所述无人机按照所述飞行模式飞行,包括:
    通过所述交互装置检测到目标对象选择操作;
    根据所述目标对象选择操作,确定目标对象指示信息;
    控制所述无人机对目标对象指示信息指示的目标对象环绕飞行。
  18. 根据权利要求1-17任一项所述的方法,其特征在于,所述方法还包括:
    在无人机进行延时拍摄的过程中,通过交互装置检测暂停延时拍摄操作;
    在检测到所述暂停延时拍摄操作时,控制所述无人机暂停延时拍摄。
  19. 一种延时拍摄控制方法,应用于无人机,其特征在于,所述方法包括:
    接收控制终端发送的延时拍摄参数,其中,所述延时拍摄参数是所述控制终端通过检测延时拍摄参数设置操作确定的;
    根据所述延时拍摄参数控制无人机的拍摄装置进行延时拍摄。
  20. 根据权利要求19所述的方法,其特征在于,所述延时拍摄参数包括拍摄时间间隔、拍摄时长、拍摄图像的张数、延时视频时长中的至少一种。
  21. 根据权利要求19或20所述的方法,其特征在于,还包括:
    根据延时拍摄获取的延时拍摄图像,生成延时视频;
    向所述控制终端发送所述延时视频。
  22. 根据权利要求19或20所述的方法,其特征在于,还包括:
    向所述控制终端发送延时拍摄图像,其中,所述延时拍摄图像用于生成延时视频。
  23. 根据权利要求19-22任一项所述的方法,其特征在于,还包括:
    接收控制终端发送的延时拍摄开始指令,其中,所述延时拍摄开始指令是所述控制终端通过检测开始延时拍摄操作确定的;
    所述根据所述延时拍摄参数控制无人机的拍摄装置进行延时拍摄,包括:
    在接收到所述延时拍摄开始指令时,根据所述延时拍摄参数控制无人机的拍摄装置进行延时拍摄。
  24. 根据权利要求19-23任一项所述的方法,其特征在于,所述方法还包括:
    接收所述控制终端发送的成像参数,所述成像参数是所述控制终端通过检测成像参数设置操作确定的;
    所述根据所述延时拍摄参数控制无人机的拍摄装置进行延时拍摄,包括:
    根据所述延时拍摄参数和所述成像参数,控制无人机的拍摄装置进行延时拍摄。
  25. 根据权利要求19-24任一项所述的方法,其特征在于,所述方法还包括:
    接收所述控制终端发送的飞行模式设置指令,所述飞行模式设置指令是所述控制终端通过检测飞行模式设置操作确定的;
    根据所述飞行模式设置指令,确定所述无人机的飞行模式;
    控制无人机按照所述飞行模式飞行;
    所述根据所述延时拍摄参数控制无人机的拍摄装置进行延时拍摄,包括:
    在无人机按照所述飞行模式飞行的过程中,根据所述延时拍摄参数控制无人机的拍摄装置进行延时拍摄。
  26. 根据权利要求25所述的方法,其特征在于,所述飞行模式为自由飞行模式;
    所述控制无人机按照所述飞行模式飞行,包括:
    接收所述控制终端发送的控制杆量;
    根据所述控制杆量控制无人机飞行。
  27. 根据权利要求25所述的方法,其特征在于,所述飞行模式为轨迹飞行模式;
    所述控制无人机按照所述飞行模式飞行,包括:
    获取预设的轨迹;
    控制无人机按照所述预设的轨迹飞行。
  28. 根据权利要求27所述的方法,其特征在于,
    所述控制无人机按照所述预设的轨迹飞行,包括:
    接收所述控制终端发送的目标对象指示信息,所述目标对象指示信息是所述控制终端通过检测目标对象选择操作确定的;
    控制无人机按照所述预设的轨迹飞行,并控制拍摄装置的拍摄姿态以对所述目标对象指示信息指示的目标对象进行跟踪。
  29. 根据权利要求27或28所述的方法,其特征在于,所述预设的轨迹中至少包括多个航点,其中,所述航点中至少包括位置信息。
  30. 根据权利要求29所述的方法,其特征在于,所述航点还包括拍摄姿态和/或成像参数。
  31. 根据权利要求25所述的方法,其特征在于,所述飞行模式为直线飞行模式;
    所述控制无人机按照所述飞行模式飞行,包括:
    接收所述控制终端发送的飞行方向指示信息,所述飞行方向指示信息是所述控制终端通过检测飞行方向设置操作确定的;
    控制无人机按照所述飞行方向指示信息指示的飞行方向直线飞行。
  32. 根据权利要求31所述的方法,其特征在于,
    所述控制无人机按照飞行方向直线飞行,包括:
    接收所述控制终端的发送的目标对象指示信息,所述目标对象指示信息是所述控制终端通过检测目标对象选择操作确定的;
    控制无人机按照所述飞行方向直线飞行,并控制拍摄装置的拍摄姿态以对所述目标对象指示信息指示的目标对象进行跟踪飞行。
  33. 根据权利要求25所述的方法,其特征在于,所述飞行模式为环绕飞行模式;
    所述控制无人机按照所述飞行模式飞行,包括:
    接收所述控制终端发送的目标对象指示信息,所述目标对象指示信息是所述控制终端通过检测目标对象选择操作确定的;
    根据所述目标对象指示信息,控制无人机对目标对象环绕飞行。
  34. 根据权利要求19-33任一项所述的方法,其特征在于,所述方法还包括:
    在进行延时拍摄的过程中,接收所述控制终端发送的暂停延时拍摄指令,所述暂停延时拍摄指令是所述控制终端通过检测暂停延时拍摄操作确定的;
    根据所述暂停延时拍摄指令,控制拍摄装置暂停延时拍摄。
  35. 一种控制终端,其特征在于,包括:
    交互装置,用于检测延时拍摄参数设置操作;
    处理器,用于根据所述交互装置检测到的所述延时拍摄参数设置操作,确定延时拍摄参数;以及控制无人机根据所述延时拍摄参数进行延时拍摄。
  36. 根据权利要求35所述的控制终端,其特征在于,所述延时拍摄参数包括拍摄时间间隔、拍摄时长、拍摄图像的张数、延时视频时长中的至少一种。
  37. 根据权利要求35或36所述的控制终端,其特征在于,所述处理器,还用于获取所述无人机发送的延时视频,其中,所述延时视频是所述无人机根据所述延时拍摄参数进行延时拍摄获取的延时拍摄图像生成的。
  38. 根据权利要求35或36所述的控制终端,其特征在于,所述处理器,还用于获取所述无人机发送的延时拍摄图像,以及根据所述延时拍摄图像,生成延时视频;
    其中,所述延时拍摄图像是所述无人机根据所述延时拍摄参数进行延时拍摄获取的。
  39. 根据权利要求37或38所述的控制终端,其特征在于,所述交互装置,还用于检测分享操作;
    所述处理器,还用于所述交互装置检测到分享操作后,对所述延时拍摄视频进行分享。
  40. 根据权利要求35-39任一项所述的控制终端,其特征在于,所述交互装置,还用于在所述处理器控制无人机根据所述延时拍摄参数进行延时拍摄之前,检测开始延时拍摄操作;
    所述处理器,具体用于在所述交互装置检测到所述开始延时拍摄操作时,控制所述无人机根据所述延时拍摄参数进行延时拍摄。
  41. 根据权利要求35-40任一项所述的控制终端,其特征在于,所述交互装置,还用于在检测到延时拍摄参数设置操作之前,检测延时拍摄模式触发操作;
    所述处理器,还用于在所述交互装置检测到延时拍摄模式触发操作时,进入延时拍摄模式;
    所述交互装置在检测延时拍摄参数设置操作时,具体用于:在进入延时拍摄模式之后,检测延时拍摄参数设置操作。
  42. 根据权利要求35-41任一项所述的控制终端,其特征在于,所述交互装置,还用于检测成像参数设置操作;
    所述处理器,还用于根据所述交互装置检测到的所述成像参数设置操作,确定成像参数;
    所述处理器在控制无人机根据所述延时拍摄参数进行延时拍摄时,具体用于:控制所述无人机根据所述延时拍摄参数和所述成像参数,进行延时拍摄。
  43. 根据权利要求35-42任一项所述的控制终端,其特征在于,所述交互装置,还用于检测无人机的飞行模式设置操作;
    所述处理器,还用于根据所述交互装置检测到的所述飞行模式设置操作,确定所述无人机的飞行模式;以及控制所述无人机按照所述飞行模式飞行;
    所述处理器在控制无人机根据延时拍摄参数进行延时拍摄时,具体用于:
    在所述无人机按照所述飞行模式飞行的过程中,控制所述无人机根据延时拍摄参数进行延时拍摄。
  44. 根据权利要求43所述的控制终端,其特征在于,所述飞行模式为自由飞行模式;
    所述处理器在控制所述无人机按照所述飞行模式飞行时,具体用于:
    通过所述交互装置检测到飞行控制操作,
    根据所述飞行控制操作,确定控制杆量;
    根据所述控制杆量控制所述无人机飞行。
  45. 根据权利要求43所述的控制终端,其特征在于,所述飞行模式为轨迹飞行模式;
    所述处理器在控制所述无人机按照所述飞行模式飞行时,具体用于:
    控制所述无人机按照预设的轨迹飞行。
  46. 根据权利要求45所述的控制终端,其特征在于,所述交互装置,还用于检测目标对象选择操作;
    所述处理器在控制所述无人机按照预设的轨迹飞行时,具体用于:
    根据所述交互装置检测到的所述目标对象选择操作,确定目标对象指示信息;
    控制所述无人机按照预设的轨迹飞行,控制所述无人机的拍摄姿态以使所述无人机对目标对象指示信息指示的目标对象进行跟踪。
  47. 根据权利要求45或46所述的控制终端,其特征在于,所述预设的轨迹中至少包括多个航点,其中,所述航点中至少包括位置信息。
  48. 根据权利要求47所述的控制终端,其特征在于,所述航点还包括拍摄姿态和/或成像参数。
  49. 根据权利要求43所述的控制终端,其特征在于,所述飞行模式为直线飞行模式;
    所述交互装置,还用于检测飞行方向设置操作;
    所述处理器在控制所述无人机按照所述飞行模式飞行时,具体用于:
    根据所述交互装置检测到的所述飞行方向设置操作,确定飞行方向;
    控制所述无人机按照所述飞行方向直线飞行。
  50. 根据权利要求49所述的控制终端,其特征在于,所述交互装置,还用于检测目标对象选择操作;
    所述处理器在控制所述无人机按照所述飞行方向直线飞行时,具体用于:
    根据所述交互装置检测到的所述目标对象选择操作,确定目标对象指示信息;
    控制所述无人机按照所述飞行方向直线飞行,并根据所述目标对象指示信息控制所述无人机的拍摄姿态以使所述无人机对所述目标对象指示信息指示的目标对象进行跟踪飞行。
  51. 根据权利要求43所述的控制终端,其特征在于,所述飞行模式为环绕飞行模式;
    所述交互装置,还用于检测目标对象选择操作;
    所述处理器在控制所述无人机按照所述飞行模式飞行时,具体用于:
    根据所述交互装置检测到的所述目标对象选择操作,确定目标对象指示信息;
    控制所述无人机对目标对象指示信息指示的目标对象环绕飞行。
  52. 根据权利要求35-51任一项所述的控制终端,其特征在于,
    所述交互装置,还用于在无人机进行延时拍摄的过程中,检测暂停延时拍摄操作;
    所述处理器,还用于在所述交互装置检测到所述暂停延时拍摄操作时,控制所述无人机暂停延时拍摄。
  53. 一种无人机,其特征在于,包括:
    通信装置,用于接收控制终端发送的延时拍摄参数,其中,所述延时拍摄参数是所述控制终端通过检测延时拍摄参数设置操作确定的;
    处理器,用于根据所述延时拍摄参数控制无人机的拍摄装置进行延时拍摄。
  54. 根据权利要求53所述的无人机,其特征在于,所述延时拍摄参数包括拍摄时间间隔、拍摄时长、拍摄图像的张数、延时视频时长中的至少一种。
  55. 根据权利要求53或54所述的无人机,其特征在于,所述处理器,还用于根据延时拍摄获取的延时拍摄图像,生成延时视频;
    所述通信装置,还用于向所述控制终端发送所述延时视频。
  56. 根据权利要求53或54所述的无人机,其特征在于,所述通信装置,还用于向所述控制终端发送延时拍摄图像,其中,所述延时拍摄图像用于生成延时视频。
  57. 根据权利要求53-56任一项所述的无人机,其特征在于,所述通信装置,还用于接收控制终端发送的延时拍摄开始指令,其中,所述延时拍摄 开始指令是所述控制终端通过检测开始延时拍摄操作确定的;
    所述处理器,具体用于:在所述通信装置接收到所述延时拍摄开始指令时,根据所述延时拍摄参数控制无人机的拍摄装置进行延时拍摄。
  58. 根据权利要求53-57任一项所述的无人机,其特征在于,所述通信装置,还用于接收所述控制终端发送的成像参数,所述成像参数是所述控制终端通过检测成像参数设置操作确定的;
    所述处理器在根据所述延时拍摄参数控制无人机的拍摄装置进行延时拍摄时,具体用于:
    根据所述延时拍摄参数和所述成像参数,控制无人机的拍摄装置进行延时拍摄。
  59. 根据权利要求53-58任一项所述的无人机,其特征在于,
    所述通信装置,还用于接收所述控制终端发送的飞行模式设置指令,所述飞行模式设置指令是所述控制终端通过检测飞行模式设置操作确定的;
    所述处理器,还用于根据所述飞行模式设置指令,确定所述无人机的飞行模式;以及控制无人机按照所述飞行模式飞行;
    所述处理器在根据所述延时拍摄参数控制无人机的拍摄装置进行延时拍摄时,具体用于:
    在所述无人机按照所述飞行模式飞行的过程中,根据所述延时拍摄参数控制无人机的拍摄装置进行延时拍摄。
  60. 根据权利要求59所述的无人机,其特征在于,所述飞行模式为自由飞行模式;
    所述通信装置,还用于接收所述控制终端发送的控制杆量;
    所述处理器控制无人机按照所述飞行模式飞行时,具体用于:
    根据所述通信装置接收的所述控制杆量控制无人机飞行。
  61. 根据权利要求59所述的无人机,其特征在于,所述飞行模式为轨迹飞行模式;
    所述处理器控制无人机按照所述飞行模式飞行时,具体用于:
    获取预设的轨迹;
    控制无人机按照所述预设的轨迹飞行。
  62. 根据权利要求61所述的无人机,其特征在于,
    所述通信装置,还用于接收所述控制终端发送的目标对象指示信息,所述目标对象指示信息是所述控制终端通过检测目标对象选择操作确定的;
    所述处理器控制无人机按照所述预设的轨迹飞行时,具体用于:
    控制无人机按照所述预设的轨迹飞行,并控制拍摄装置的拍摄姿态以对所述目标对象指示信息指示的目标对象进行跟踪。
  63. 根据权利要求61或62所述的无人机,其特征在于,所述预设的轨迹中至少包括多个航点,其中,所述航点中至少包括位置信息。
  64. 根据权利要求63所述的无人机,其特征在于,所述航点还包括拍摄姿态和/或成像参数。
  65. 根据权利要求59所述的无人机,其特征在于,所述飞行模式为直线飞行模式;
    所述通信装置,还用于接收所述控制终端发送的飞行方向指示信息,所述飞行方向指示信息是所述控制终端通过检测飞行方向设置操作确定的;
    所述处理器控制无人机按照所述飞行模式飞行时,具体用于:
    控制无人机按照所述飞行方向指示信息指示的飞行方向直线飞行。
  66. 根据权利要求65所述的无人机,其特征在于,
    所述通信装置,还用于接收所述控制终端的发送的目标对象指示信息,所述目标对象指示信息是所述控制终端通过检测目标对象选择操作确定的;
    所述处理器在控制无人机按照飞行方向直线飞行时,具体用于:
    控制无人机按照所述飞行方向直线飞行,并控制拍摄装置的拍摄姿态以对所述目标对象指示信息指示的目标对象进行跟踪飞行。
  67. 根据权利要求59所述的无人机,其特征在于,所述飞行模式为环绕飞行模式;
    所述通信装置,还用于接收所述控制终端发送的目标对象指示信息,所述目标对象指示信息是所述控制终端通过检测目标对象选择操作确定的;
    所述飞行控制器在控制无人机按照所述飞行模式飞行时,具体用于:
    根据所述目标对象指示信息,控制无人机对目标对象环绕飞行。
  68. 根据权利要求53-67任一项所述的无人机,其特征在于,所述通信装置,还用于在所述拍摄装置进行延时拍摄的过程中,接收所述控制终端发送的暂停延时拍摄指令,所述暂停延时拍摄指令是所述控制终端通过检测暂 停延时拍摄操作确定的;
    所述处理器,还用于根据所述暂停延时拍摄指令,控制拍摄装置暂停延时拍摄。
PCT/CN2018/088715 2018-05-28 2018-05-28 延时拍摄控制方法和设备 WO2019227289A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2018/088715 WO2019227289A1 (zh) 2018-05-28 2018-05-28 延时拍摄控制方法和设备
CN201880031253.1A CN110771137A (zh) 2018-05-28 2018-05-28 延时拍摄控制方法和设备

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/088715 WO2019227289A1 (zh) 2018-05-28 2018-05-28 延时拍摄控制方法和设备

Publications (1)

Publication Number Publication Date
WO2019227289A1 (zh)

Family

ID=68698534

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/088715 WO2019227289A1 (zh) 2018-05-28 2018-05-28 延时拍摄控制方法和设备

Country Status (2)

Country Link
CN (1) CN110771137A (zh)
WO (1) WO2019227289A1 (zh)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111458958B (zh) * 2020-03-25 2022-04-08 东莞市至品创造数码科技有限公司 一种相机移动速度可调节的延时摄影方法及装置
CN111526281B (zh) * 2020-03-25 2021-06-25 东莞市至品创造数码科技有限公司 一种计算延时摄影影像时长的方法及装置
CN114761898A (zh) * 2020-12-29 2022-07-15 深圳市大疆创新科技有限公司 无人机的控制方法、无人机及存储介质
CN113709377A (zh) * 2021-09-07 2021-11-26 深圳市道通智能航空技术股份有限公司 控制飞行器拍摄旋转延时视频的方法、装置、设备及介质
CN113709376A (zh) * 2021-09-07 2021-11-26 深圳市道通智能航空技术股份有限公司 控制飞行器拍摄旋转镜头视频的方法、装置、设备及介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150271455A1 (en) * 2014-03-24 2015-09-24 Chicony Electronics Co., Ltd. Time-lapse photography method, its computer program product, and electronic device with image-capturing function thereof
CN105956081A (zh) * 2016-04-29 2016-09-21 深圳电航空技术有限公司 地面站地图更新方法及装置
CN106101563A (zh) * 2016-08-15 2016-11-09 杨珊珊 无人飞行器延时拍摄装置及其延时拍摄方法
CN107000839A (zh) * 2016-12-01 2017-08-01 深圳市大疆创新科技有限公司 无人机的控制方法、装置、设备和无人机的控制系统

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9443207B2 (en) * 2012-10-22 2016-09-13 The Boeing Company Water area management system
CN102955478B (zh) * 2012-10-24 2016-01-20 深圳一电科技有限公司 无人机飞行控制方法及系统
KR101541783B1 (ko) * 2014-03-19 2015-08-04 소프트상추주식회사 타임 랩스 영상 제작 장치 및 그 방법
CN104914932A (zh) * 2015-06-11 2015-09-16 邓钰朗 一种用于辅助拍摄的便携式终端配件及其拍摄方法
JP6308238B2 (ja) * 2016-04-07 2018-04-11 カシオ計算機株式会社 飛行型カメラ装置、飛行型カメラシステム、端末装置、飛行型カメラ装置の制御方法およびプログラム
CN205945971U (zh) * 2016-08-15 2017-02-08 杨珊珊 无人飞行器延时拍摄装置
CN107765709B (zh) * 2016-08-22 2021-12-31 广州亿航智能技术有限公司 基于飞行器实现自拍的方法及装置
US20180113462A1 (en) * 2016-10-22 2018-04-26 Gopro, Inc. Position-based soft stop for a 3-axis gimbal
CN107343153A (zh) * 2017-08-31 2017-11-10 王修晖 一种无人设备的拍摄方法、装置及无人机

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150271455A1 (en) * 2014-03-24 2015-09-24 Chicony Electronics Co., Ltd. Time-lapse photography method, its computer program product, and electronic device with image-capturing function thereof
CN105956081A (zh) * 2016-04-29 2016-09-21 深圳电航空技术有限公司 地面站地图更新方法及装置
CN106101563A (zh) * 2016-08-15 2016-11-09 杨珊珊 无人飞行器延时拍摄装置及其延时拍摄方法
CN107000839A (zh) * 2016-12-01 2017-08-01 深圳市大疆创新科技有限公司 无人机的控制方法、装置、设备和无人机的控制系统

Also Published As

Publication number Publication date
CN110771137A (zh) 2020-02-07

Similar Documents

Publication Publication Date Title
WO2019227289A1 (zh) 延时拍摄控制方法和设备
US20200346753A1 (en) Uav control method, device and uav
US20200394754A1 (en) Course profiling and sharing
WO2018098784A1 (zh) 无人机的控制方法、装置、设备和无人机的控制系统
WO2019227441A1 (zh) 可移动平台的拍摄控制方法和设备
WO2018098704A1 (zh) 控制方法、设备、系统、无人机和可移动平台
JPWO2018073879A1 (ja) 飛行経路生成方法、飛行経路生成システム、飛行体、プログラム、及び記録媒体
CN109154815B (zh) 最高温度点跟踪方法、装置和无人机
WO2020143677A1 (zh) 一种飞行控制方法及飞行控制系统
WO2020019106A1 (zh) 云台和无人机控制方法、云台及无人机
WO2020048365A1 (zh) 飞行器的飞行控制方法、装置、终端设备及飞行控制系统
WO2020172800A1 (zh) 可移动平台的巡检控制方法和可移动平台
WO2020133410A1 (zh) 一种拍摄方法及装置
WO2018214155A1 (zh) 用于设备姿态调整的方法、设备、系统和计算机可读存储介质
WO2021168819A1 (zh) 无人机的返航控制方法和设备
WO2020154942A1 (zh) 无人机的控制方法和无人机
WO2020019212A1 (zh) 视频播放速度控制方法及系统、控制终端和可移动平台
WO2019227287A1 (zh) 无人机的数据处理方法和设备
WO2021251441A1 (ja) 方法、システムおよびプログラム
US20210240185A1 (en) Shooting control method and unmanned aerial vehicle
WO2021223176A1 (zh) 无人机的控制方法和设备
WO2021168821A1 (zh) 可移动平台的控制方法和设备
WO2019104684A1 (zh) 无人机的控制方法、装置和系统
WO2021217371A1 (zh) 可移动平台的控制方法和装置
WO2020237429A1 (zh) 遥控设备的控制方法和遥控设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18920288

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18920288

Country of ref document: EP

Kind code of ref document: A1