CN113940087A - Video editing method, electronic equipment, unmanned aerial vehicle and storage medium - Google Patents
- Publication number: CN113940087A (application CN202080040151.3)
- Authority: CN (China)
- Prior art keywords: data; unmanned aerial vehicle; video data; preset
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- H04N21/41422 — Specialised client platforms located in transportation means, e.g. personal vehicle
- B64U20/87 — Constructional aspects of UAVs; mounting of imaging devices, e.g. mounting of gimbals
- H04N21/414 — Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
- H04N21/43 — Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream
- H04N21/4307 — Synchronising the rendering of multiple content streams or additional data on devices
- H04N21/44 — Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream
- H04N21/4402 — Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
- H04N21/45 — Management operations performed by the client for facilitating the reception of or the interaction with the content
- H04N21/4524 — Management of client data or end-user data involving the geographical location of the client
- H04N21/8547 — Content authoring involving timestamps for synchronizing content
- B64U2101/30 — UAVs specially adapted for imaging, photography or videography
- B64U2201/20 — UAVs characterised by their flight controls; remote controls
Abstract
The present application provides a video editing method applied to an electronic device that is communicatively connected to an unmanned aerial vehicle. The unmanned aerial vehicle is equipped with a shooting device for capturing video data, and the electronic device processes that video data. The method comprises: acquiring the video data captured by the shooting device, flight attitude data of the unmanned aerial vehicle, and timestamp data, where the timestamp data are used to synchronize the video data with the flight attitude data; and clipping the video data based on the flight attitude data and the timestamp data. The application thereby realizes video editing based on the flight attitude data of the unmanned aerial vehicle. The application also provides an electronic device, an unmanned aerial vehicle, and a computer storage medium.
Description
Technical Field
The present application relates to the field of video editing, and in particular, to a video editing method, an electronic device, an unmanned aerial vehicle, and a storage medium.
Background
At present, an unmanned aerial vehicle's on-board shooting device is often used during flight to capture and record aerial footage. The captured video data can then be edited into a video composed of highlight segments.
In the related art, video data is clipped automatically by extracting highlight segments with image-recognition technology. Image recognition is not always accurate, however, so some highlight segments are easily missed.
Disclosure of Invention
The present application provides a video editing method, an electronic device, an unmanned aerial vehicle, and a storage medium that can automatically edit video data captured by the unmanned aerial vehicle while retaining the highlight footage the user wishes to keep.
To achieve this technical effect, the embodiments of the present application disclose the following technical solutions:
In a first aspect, a video editing method is provided, applied to an electronic device communicatively connected to an unmanned aerial vehicle, where the unmanned aerial vehicle is equipped with a shooting device for capturing video data and the electronic device processes the video data; the method comprises the following steps:
acquiring video data shot by the shooting device, flight attitude data of the unmanned aerial vehicle and timestamp data; wherein the timestamp data is used to synchronize the video data with the flight attitude data;
clipping the video data based on the flight attitude data and the timestamp data.
In a second aspect, a video clipping method is provided, applied to an unmanned aerial vehicle equipped with a shooting device; the method includes:
acquiring video data shot by the shooting device, recording flight attitude data of the unmanned aerial vehicle, and correspondingly generating timestamp data; wherein the timestamp data is used to synchronize the video data with the flight attitude data;
clipping the video data based on the flight attitude data and the timestamp data.
In a third aspect, an electronic device is provided that is communicatively connected to an unmanned aerial vehicle and is used to control its flight, where the unmanned aerial vehicle is equipped with a shooting device; the electronic device includes:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring video data shot by the shooting device, flight attitude data of the unmanned aerial vehicle and timestamp data; wherein the timestamp data is used to synchronize the video data with the flight attitude data;
clipping the video data based on the flight attitude data and the timestamp data.
In a fourth aspect, an unmanned aerial vehicle equipped with a shooting device is provided, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring video data shot by the shooting device, recording flight attitude data of the unmanned aerial vehicle in a memory, and correspondingly generating timestamp data; wherein the timestamp data is used to synchronize the video data with the flight attitude data;
clipping the video data based on the flight attitude data and the timestamp data.
In a fifth aspect, there is provided a computer storage medium having stored thereon a computer program which, when executed, implements any of the methods described above.
The technical solutions provided by the embodiments of the present application can have the following beneficial effects:
According to the video clipping method, the timestamp data synchronize the video data with the flight attitude data of the aircraft, and the recorded video is clipped automatically based on the flight attitude data and the timestamp data. Video data shot while the unmanned aerial vehicle flies in a specific attitude can thus be clipped out automatically, retaining the highlight footage the user wishes to keep.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those of ordinary skill in the art can obtain other drawings from them without inventive effort.
Fig. 1 is an application scenario of a video clipping method provided in the present application.
FIG. 2 illustrates a video clipping method according to an embodiment of the present application.
FIG. 3 shows an electronic device according to an exemplary embodiment of the present application.
FIG. 4 is a video clipping method according to another embodiment of the present application.
FIG. 5 is a video clipping method according to another embodiment of the present application.
Fig. 6 is a schematic structural diagram of an electronic device provided in the present application.
Fig. 7 is a schematic structural diagram of an unmanned aerial vehicle provided by the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
Referring to fig. 1, an application scenario of the video editing method provided in the present application is shown. The unmanned aerial vehicle carries a shooting device, so it can capture aerial footage while flying or hovering. A remote controller is communicatively connected to the unmanned aerial vehicle, so a user can control its flight through the remote controller; specifically, the flight speed, altitude, return-to-home, and so on can be controlled through control elements such as joysticks and scroll wheels on the remote controller. The unmanned aerial vehicle may be a conventional drone, an FPV racing drone ("traversing machine"), or the like, and the shooting device may be any device with an image-acquisition function, such as a camera or video camera.
Referring to fig. 2, a video clipping method according to one embodiment of the present application is shown. The method may be performed by an electronic device communicatively connected to the unmanned aerial vehicle; the electronic device communicates with the unmanned aerial vehicle, which is equipped with a shooting device, and can be used to process the video data. As shown in fig. 3(a), the electronic device may be a remote controller equipped with a display screen and containing a processor or processing chip. The user can view the video data returned by the aircraft on the display screen, and the processor or processing chip in the remote controller can process that video data. The electronic device may also be a mobile terminal, such as a mobile phone or tablet (not shown in fig. 3), running a virtual interface with remote-control functionality. The mobile terminal can control the flight of the unmanned aerial vehicle through virtual controls (such as a virtual joystick or virtual scroll wheel) in that interface, and can view and process video data returned by the unmanned aerial vehicle. In addition, as shown in fig. 3(b), if the electronic device is a mobile terminal, the mobile terminal and the remote controller may be combined through an electrical connection (as shown) or a wireless connection (not shown), so that the user can control the flight of the unmanned aerial vehicle through the remote controller while viewing and processing the returned video data on the mobile terminal. The electronic devices of the present application include, but are not limited to, the above categories. In an alternative embodiment, the steps shown in fig. 2 may also be performed by the unmanned aerial vehicle itself; specifically, an image-processing module on a processing chip in the unmanned aerial vehicle may process the video captured by the shooting device.
As shown in fig. 2, the method comprises the steps of:
step 110: and acquiring video data shot by the shooting device, flight attitude data of the unmanned aerial vehicle and timestamp data.
Step 120: clipping the video data based on the flight attitude data and the timestamp data.
In one embodiment in which the electronic device performs the above steps, after the electronic device connects and communicates with the unmanned aerial vehicle, it can acquire the video data captured by the unmanned aerial vehicle's shooting device, the flight attitude data of the unmanned aerial vehicle, and the timestamp data, where the timestamp data are used to synchronize the flight attitude data with the video data.
The electronic device may acquire these three kinds of data from the unmanned aerial vehicle in real time during flight and clip them on the fly. Alternatively, the unmanned aerial vehicle may store the video data captured by the shooting device in real time during flight while recording and storing its flight attitude data and the generated timestamp data; after shooting is complete, the electronic device reads the three kinds of data from the unmanned aerial vehicle's memory and performs the clipping. The clipped video can then be played back or shared.
In one embodiment in which the above steps are performed by the unmanned aerial vehicle, the unmanned aerial vehicle may generate timestamp data for its flight attitude data in real time during flight, clip the video data captured by the shooting device in real time based on the flight attitude data and the timestamp data, and transmit the clipped video to the communicatively connected electronic device for playback or sharing. Alternatively, the unmanned aerial vehicle may store the video data captured by the shooting device in real time during flight, record and store the flight attitude data and the generated timestamp data, clip the video data after shooting is complete, and send the resulting video to the communicatively connected electronic device for playback or sharing.
It should be noted that, during clipping, the timestamp data synchronize the flight attitude data recorded at each moment with the video data captured at that moment. Therefore, to clip out video data captured while the unmanned aerial vehicle flew in a specific flight attitude, the electronic device can look up the timestamp data corresponding to that flight attitude data and clip out the video data according to those timestamps.
The video clipping method provided by this embodiment uses the timestamp data to synchronize the video data with the flight attitude data of the aircraft and automatically clips the recorded video based on the flight attitude data and the timestamp data. It can automatically clip out the video data shot while the unmanned aerial vehicle flies in a specific attitude, retaining the highlight footage the user wishes to keep.
The following description takes the case in which the video clipping method is performed by an electronic device communicatively connected to the unmanned aerial vehicle. It will be understood that the method works on the same principle when performed by the unmanned aerial vehicle, the only difference being that the clipped video is transmitted to the electronic device after clipping finishes; the description below therefore also applies to that case.
In an alternative embodiment, the electronic device executes step 110 to acquire, from the unmanned aerial vehicle, the video data captured by the shooting device, the flight attitude data of the unmanned aerial vehicle, and the timestamp data. The video data are the video frames recorded by the shooting device while the unmanned aerial vehicle flies and hovers. The flight attitude data describe the flight attitude of the unmanned aerial vehicle at each point in time and may include one or more of its flight gear, flight speed, rotation angle, flight altitude, distance to obstacles, attitude of the carried gimbal, and the like. Specifically, the rotation angle of the unmanned aerial vehicle may be its pitch angle, yaw angle, roll angle, and so on. Timestamp data identify the generation time of other data; in the present application, the timestamp data identify when the video data and the flight attitude data were generated and are used to synchronize the two.
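As an illustrative sketch (not part of the original disclosure), the data described above could be modeled as timestamped records, with the synchronization in step 110 reduced to a nearest-timestamp lookup. All field and function names here are assumptions for illustration only:

```python
from dataclasses import dataclass

@dataclass
class AttitudeSample:
    """One flight-attitude record; field names are illustrative assumptions."""
    timestamp: float      # seconds since recording start (shared clock)
    speed: float          # flight speed, m/s
    altitude: float       # flight altitude, m
    pitch: float          # degrees
    yaw: float            # degrees
    roll: float           # degrees
    obstacle_dist: float  # distance to nearest obstacle, m

@dataclass
class VideoFrame:
    timestamp: float  # same clock as AttitudeSample.timestamp
    index: int        # frame number within the recording

def attitude_at(samples: list[AttitudeSample], t: float) -> AttitudeSample:
    """Return the attitude sample closest in time to t.

    A simple nearest-neighbour synchronization: the shared timestamp
    clock lets each video frame be paired with the flight attitude
    recorded nearest to its capture time.
    """
    return min(samples, key=lambda s: abs(s.timestamp - t))
```

In practice the attitude log and the video stream would be sampled at different rates, which is exactly why a shared timestamp clock, rather than sample indices, is needed to pair them.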
The electronic device communicates with the unmanned aerial vehicle and can acquire the video data, flight attitude data, and timestamp data from it. The electronic device may acquire the three kinds of data in real time while the unmanned aerial vehicle is flying, or it may acquire the data stored on the aircraft after the shooting device has finished shooting and the unmanned aerial vehicle has returned.
Optionally, after acquiring the flight attitude data, the electronic device may denoise them. The denoising method can be chosen as needed by those skilled in the art; for example, low-pass filtering can remove sudden changes in the flight attitude data, such as data generated by an instantaneous jump in flight speed, so that the clipping result is smoother. The denoising may also be performed by a processor of the unmanned aerial vehicle, which then sends the processed flight attitude data to the electronic device.
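One minimal way to realize the low-pass filtering mentioned above is an exponential moving average; this is a sketch under assumptions (the patent does not specify the filter type or smoothing factor):

```python
def low_pass(values: list[float], alpha: float = 0.2) -> list[float]:
    """Exponential moving-average low-pass filter.

    Damps abrupt spikes in an attitude signal (e.g. an instantaneous
    jump in flight speed) so the downstream clipping decisions are
    smoother. `alpha` is an assumed smoothing factor in (0, 1]:
    smaller values smooth more aggressively.
    """
    if not values:
        return []
    out = [values[0]]
    for v in values[1:]:
        # Each output blends the new sample with the previous output.
        out.append(alpha * v + (1 - alpha) * out[-1])
    return out
```

Applied to, say, a speed trace containing a one-sample spike, the spike's amplitude is reduced while the overall trend is preserved.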
As an example, the electronic device performs step 120; the process of clipping the video data based on the flight attitude data and the timestamp data is shown in fig. 4 and includes the following steps:
Step 121: judging whether the flight attitude data meet a preset clipping condition;
Step 122: splicing the video data corresponding to the times at which the flight attitude data meet the preset clipping condition, to obtain the clipped video.
In step 121, the clipping condition can be set by those skilled in the art according to actual requirements. Several possible clipping conditions are listed here; the preset condition may be any one of them or a combination of more than one:
(1) The flight gear of the unmanned aerial vehicle is a preset gear. The unmanned aerial vehicle has multiple flight gears, such as a normal gear, power gear, sport gear, manual gear, and roll gear. Different flight gears may differ in the user operations allowed, the flight actions the unmanned aerial vehicle can execute (yaw, pitch, and roll), the maximum flight speed, and the maximum flight altitude, and the user can select a gear as needed. The preset gear may be one that allows the unmanned aerial vehicle to perform high-degree-of-freedom maneuvers or to fly at higher speed, such as the sport gear, manual gear, or roll gear, without limitation.
(2) The rotation angle of the unmanned aerial vehicle is greater than a preset angle threshold. As described above, the rotation angle may include the pitch angle, yaw angle, roll angle, and so on; correspondingly, the preset angle threshold may include a pitch-angle threshold, yaw-angle threshold, roll-angle threshold, and so on. The user may specify that this condition holds when one designated angle exceeds its corresponding threshold — for example, only when the roll angle exceeds 30 degrees, ignoring the other two angles — or that it holds only when more than one rotation angle exceeds its corresponding threshold; no limitation is made here.
(3) The flight speed of the unmanned aerial vehicle is greater than a preset speed threshold value.
(4) The flight height of the unmanned aerial vehicle is greater than a preset height threshold value.
(5) The distance between the unmanned aerial vehicle and the obstacle is smaller than a preset distance threshold value.
(6) The attitude change of the gimbal on the unmanned aerial vehicle is greater than a preset attitude-change threshold. Because of air resistance and similar factors, the fuselage inevitably shakes during flight. To prevent fuselage shake from affecting the stability of the footage, the gimbal moves at the same frequency and amplitude as the fuselage shake but in the opposite direction, thereby stabilizing the shooting device. To distinguish gimbal attitude changes caused by camera stabilization from those caused by a large change in the flight attitude itself, the clipping condition can require that the gimbal's shake frequency be less than a preset frequency threshold while its rotation amplitude is greater than a preset rotation-amplitude threshold. The gimbal's attitude change may also be determined from the changes in its pitch, yaw, and roll angles, and the preset attitude-change threshold may likewise be set for those three angles.
(7) The variation amplitude of a control used on the electronic device to operate the unmanned aerial vehicle is greater than a preset variation-amplitude threshold. The control may be a physical control such as a joystick or thumb wheel, or a virtual control such as a virtual joystick or virtual thumb wheel. As an example, this condition may be set as a change of more than 30% in the stick deflection on the remote controller.
The preset gear, speed threshold, altitude threshold, distance threshold, frequency threshold, rotation-amplitude threshold, and so on may be set and modified by the user, or set by default in the program; no limitation is made here.
From the selectable clipping conditions listed above, the user may choose one or more as the preset clipping condition. For example, the user may select only "the flight speed of the unmanned aerial vehicle is greater than the preset speed threshold", or may combine it with "the rotation angle of the unmanned aerial vehicle is greater than the preset angle threshold" and/or "the distance between the unmanned aerial vehicle and an obstacle is less than the preset distance threshold".
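The user-selected combination of clipping conditions can be sketched as a set of predicates over one attitude sample, combined so that any satisfied condition marks the moment as a highlight. This is an illustrative sketch only; the field names, thresholds, and OR-combination are assumptions, not something the disclosure fixes:

```python
from typing import Callable

# Each condition is a predicate over one attitude sample (a dict here
# for brevity). Threshold values are illustrative assumptions.
def speed_above(threshold: float) -> Callable[[dict], bool]:
    return lambda s: s["speed"] > threshold

def roll_above(threshold: float) -> Callable[[dict], bool]:
    return lambda s: abs(s["roll"]) > threshold

def obstacle_closer_than(threshold: float) -> Callable[[dict], bool]:
    return lambda s: s["obstacle_dist"] < threshold

def meets_clip_condition(sample: dict,
                         conditions: list[Callable[[dict], bool]]) -> bool:
    """True if the sample satisfies at least one selected condition."""
    return any(cond(sample) for cond in conditions)
```

Swapping `any` for `all` would instead require every selected condition to hold simultaneously, which is the other natural way to combine them.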
The video data can be clipped by marking it, extracting the marked segments, and splicing them. Specifically, since the flight attitude data record the flight attitude of the unmanned aerial vehicle at each point in time, for flight attitude data that meet the preset clipping condition, the corresponding timestamps can be marked in the timestamp data. For example, if the flight attitude data from 2:00 to 2:20 and from 3:00 to 3:30 of the flight meet the preset clipping condition, the timestamps for 2:00-2:20 and 3:00-3:30 are marked.
The corresponding video data are then found from the marked timestamp data, extracted, and spliced. Following the example above, the video segments from 2:00 to 2:20 and from 3:00 to 3:30 are located from the marked timestamps; these two segments are extracted and spliced together to obtain the automatically clipped video.
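The mark-extract-splice procedure above can be sketched as grouping the matching timestamps into intervals and then keeping only the frames inside those intervals. The record shapes, the `gap` merging parameter, and the frame-list representation of "splicing" are all assumptions for illustration:

```python
def marked_intervals(samples: list[dict], predicate, gap: float = 1.0):
    """Group timestamps of samples satisfying the clip condition into
    contiguous (start, end) intervals.

    Matching samples closer than `gap` seconds apart are merged into
    one interval, mirroring the marked 2:00-2:20 / 3:00-3:30 spans in
    the example above.
    """
    times = sorted(s["t"] for s in samples if predicate(s))
    intervals: list[list[float]] = []
    for t in times:
        if intervals and t - intervals[-1][1] <= gap:
            intervals[-1][1] = t          # extend the current interval
        else:
            intervals.append([t, t])      # start a new interval
    return [tuple(iv) for iv in intervals]

def splice(frames: list[dict], intervals) -> list[dict]:
    """Keep only frames whose timestamps fall inside a marked interval,
    concatenated in order — the 'extract and splice' step."""
    return [f for f in frames
            if any(a <= f["t"] <= b for a, b in intervals)]
```

With attitude samples every 5 seconds and the clip condition holding from 120-140 s and 180-210 s, this yields exactly the two intervals of the worked example, and `splice` returns the frames of those two segments back to back.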
In an alternative embodiment, the automatic clipping method based on flight attitude data may further be combined with an existing automatic clipping method based on video content, that is, the automatic clipping method refers to the flight attitude and the video content at the same time.
Specifically, automatic clipping based on video content may use technologies such as biometric and scene recognition to determine whether the video data contains scenes with people and to clip out the video data containing such scenes; alternatively, image recognition may be used to measure the similarity between adjacent frames, mark the moments at which the similarity between adjacent frames exceeds a preset similarity threshold, and extract the video data within a specified time period containing those moments for splicing. Those skilled in the art can implement conventional content-based automatic clipping, and the present application is not limited in this respect.
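As an illustration of the adjacent-frame-similarity variant just described, the sketch below models frames as flat lists of 8-bit pixel values and uses a normalized mean absolute difference as the similarity measure. A real implementation would use an image-recognition library; both the frame representation and the metric here are assumptions.

```python
def frame_similarity(a, b):
    """Similarity in [0, 1]; 1.0 means identical frames (8-bit pixel values)."""
    mean_abs_diff = sum(abs(x - y) for x, y in zip(a, b)) / len(a)
    return 1.0 - mean_abs_diff / 255.0

def mark_similar_moments(frames, threshold):
    """Indices i at which frame i and frame i+1 exceed the similarity threshold."""
    return [i for i in range(len(frames) - 1)
            if frame_similarity(frames[i], frames[i + 1]) > threshold]

frames = [[10, 10, 10], [12, 10, 11], [200, 0, 90]]   # toy 3-pixel "frames"
print(mark_similar_moments(frames, threshold=0.9))    # [0]: only the first pair is similar
```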
The two clipping methods may be combined by first clipping the corresponding video data with reference to the flight attitude and then performing secondary clipping on the result with reference to the video content. Whether the first pass refers to the flight attitude or to the video content is not limited herein. By combining the two methods, both the video data whose flight attitude meets the preset clipping condition and the video data identified from the content can be clipped, avoiding the highlight frames that a single clipping mode might miss and greatly enriching the content of the clipped video.
In one example, if the device performing the clipping operation is the unmanned aerial vehicle, the electronic device may also perform secondary clipping after the unmanned aerial vehicle transmits the clipped video data to it. For example, to save storage resources, the unmanned aerial vehicle may store only video data shot at a flight height greater than 50 meters, executing the video clipping method to retain the footage recorded by the shooting device above 50 meters. The unmanned aerial vehicle then transmits the flight attitude data, the timestamp data, and the once-clipped video data to the electronic device, where the user can run the video clipping method again as needed, for instance to clip out the video data shot when the flight speed exceeds 20 km/h or the distance to an obstacle is less than 2 m.
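The two-stage clipping in this example can be illustrated with a simple filter chain. The `(timestamp, height_m, speed_kmh, obstacle_m)` record layout is hypothetical; the 50 m, 20 km/h, and 2 m thresholds come from the example above.

```python
# Hypothetical per-second flight records: (timestamp, height_m, speed_kmh, obstacle_m).
records = [
    (0, 30, 10, 9.0),   # below 50 m: discarded on board to save storage
    (1, 60, 10, 9.0),   # stored on board, but slow and far from obstacles
    (2, 80, 25, 9.0),   # stored, and fast enough for the secondary clip
    (3, 90, 10, 1.5),   # stored, and close enough to an obstacle
]

# Primary clip on the aircraft: keep only footage shot above 50 m.
primary = [r for r in records if r[1] > 50]
# Secondary clip on the electronic device: speed > 20 km/h or obstacle < 2 m.
secondary = [r for r in primary if r[2] > 20 or r[3] < 2]
print([r[0] for r in secondary])   # [2, 3]
```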
In addition, when video is played back on the electronic device, the playback source may be the clipped video data, or video data that has been marked based on the flight attitude data and the timestamp data but not yet clipped; in the latter case, only the marked time periods are played. For example, if the video data from 2 min to 2 min 20 s of the flight is marked by the above method, playback may jump directly to 1 min 57 s, play until 2 min 23 s, and then jump to the next marked period.
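The padded playback described above (the span from 2 min to 2 min 20 s plays from 1 min 57 s to 2 min 23 s) amounts to widening each marked interval by a lead-in and lead-out. The 3-second pad matches the example; the actual padding is an implementation choice the text does not fix.

```python
def playback_intervals(marked, pad=3, clip_start=0):
    """Widen each (start, end) marked interval by `pad` seconds on both sides,
    clamping the start so playback never seeks before the clip begins."""
    return [(max(clip_start, s - pad), e + pad) for s, e in marked]

# Marked 2:00-2:20 (in seconds) -> played 1:57-2:23.
print(playback_intervals([(120, 140)]))   # [(117, 143)]
```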
From the above embodiments, it can be seen that the video clipping method provided by the present application automatically clips the recorded video based on the flight attitude data and the timestamp data, and can automatically clip out the video data shot while the aircraft performs demanding maneuvers such as high-speed flight, rolling, and rotation, retaining the highlight footage the user wants to keep. This is especially valuable for a traversing machine: unlike an aerial camera, a traversing machine performs high-speed flight and large scene changes, so the captured pictures change drastically and some momentary frames, particularly those shot during high-speed flight and turning, are difficult to identify accurately by image recognition alone, yet these are exactly the highlights users want to extract and retain.
Referring to fig. 5, an application example of the present application includes the following steps:
step 210: and acquiring video data, timestamp data and flight attitude data.
The video data is the video frame data recorded by the shooting device while the unmanned aerial vehicle flies and hovers. The flight attitude data describes the flight attitude of the unmanned aerial vehicle at each time point and may include one or more of the flight gear, flight speed, rotation angle, flight height, obstacle distance, and attitude of the carried gimbal. The timestamp data identifies the generation time of the video data and the flight attitude data and is used to synchronize the two.
The electronic device can acquire these three types of data in real time, or acquire them after the shooting device finishes shooting and the unmanned aerial vehicle returns.
Step 220: and reading the flight attitude data at a certain moment.
The flight attitude data includes data such as the flight gear, flight speed, rotation angle, flight height, obstacle distance, and attitude of the carried gimbal of the unmanned aerial vehicle. In the present embodiment, the flight gear, flight speed, and rotation angle are taken as example clipping conditions; in practice, the user may select other data or combinations thereof as the clipping conditions, which is not limited in the present application.
Step 230: low pass filtering is performed.
To make the clipping result smoother, the flight attitude data may be denoised. A common approach is low-pass filtering, which removes sudden changes in the data, such as those generated when the flight speed jumps instantaneously.
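A minimal sketch of this denoising step, using a first-order exponential moving average as the low-pass filter. The text requires only low-pass filtering; the filter structure and the smoothing coefficient here are assumptions.

```python
def low_pass(samples, alpha=0.2):
    """First-order IIR low-pass: y[n] = alpha * x[n] + (1 - alpha) * y[n-1]."""
    out, y = [], samples[0]
    for x in samples:
        y = alpha * x + (1 - alpha) * y
        out.append(y)
    return out

speeds = [5.0, 5.0, 5.0, 80.0, 5.0, 5.0]   # a one-sample speed glitch
smoothed = low_pass(speeds)
print(max(smoothed))   # the 80.0 spike is strongly attenuated
```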
Step 240: judging whether the flight data meet the preset clipping condition or not, wherein the judging step comprises the following steps:
step 240 a: judging whether the flight gear is a preset gear or not;
step 240 b: judging whether the flying speed is greater than a speed threshold value;
step 240 c: judging whether the rotation angle is larger than an angle threshold value;
the preset gear, the speed threshold and the angle threshold may be modified by user settings, or may be default settings of a program, which are not limited herein.
Step 250: judging whether at least one clipping condition is satisfied.
Step 250 may also be: judging whether two or more clipping conditions are satisfied. If so, step 260 is executed; otherwise, step 220 is executed to read the flight attitude data at the next moment.
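Steps 240 and 250 can be sketched as a condition count over a single attitude sample. The field names, the "sport" gear label, and the threshold values are illustrative assumptions, not fixed by the text.

```python
def satisfied_count(sample, preset_gear="sport", speed_thr=20.0, angle_thr=45.0):
    """Count how many of the three preset clipping conditions this sample meets."""
    checks = [
        sample["gear"] == preset_gear,   # step 240a: flight gear is the preset gear
        sample["speed"] > speed_thr,     # step 240b: speed above the threshold
        sample["angle"] > angle_thr,     # step 240c: rotation angle above the threshold
    ]
    return sum(checks)

def should_mark(sample, min_hits=1):
    """Step 250: mark the moment if at least `min_hits` conditions are satisfied."""
    return satisfied_count(sample) >= min_hits

print(should_mark({"gear": "normal", "speed": 25.0, "angle": 10.0}))  # True
print(should_mark({"gear": "normal", "speed": 5.0, "angle": 10.0}))   # False
```

Raising `min_hits` to 2 implements the stricter variant of step 250 described above.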
Step 260: the corresponding timestamp data is marked.
If the flight attitude data at a certain moment satisfies the judgment condition, the corresponding moment is marked in the timestamp data.
Step 270: a video data segment corresponding to the tagged timestamp data is extracted.
Step 280: and splicing the extracted video data segments.
All the extracted video data segments are spliced together to obtain the automatically clipped video.
According to the video clipping method provided by this embodiment of the application, the recorded video is automatically clipped based on the flight attitude data and the timestamp data, so the video data shot by the traversing machine during demanding maneuvers such as high-speed flight, rolling, and rotation can be automatically clipped out, retaining the highlight footage the user wants to keep.
Based on the video clipping method shown in fig. 2, the present application also provides a schematic structural diagram of the electronic device shown in fig. 6. As shown in fig. 6, at the hardware level, the electronic device includes a processor, an internal bus, a network interface, a memory, and a non-volatile memory, but may also include hardware required for other services. The processor reads the corresponding computer program from the non-volatile memory into the memory and then runs the computer program to implement the video clipping method described above with reference to fig. 2.
Based on the video clipping method shown in fig. 2, the present application also provides a schematic structural diagram of the unmanned aerial vehicle shown in fig. 7. As shown in fig. 7, at the hardware level, the unmanned aerial vehicle includes a processor, an internal bus, a network interface, a memory, and a nonvolatile memory, but may also include hardware required for other services. The processor reads the corresponding computer program from the non-volatile memory into the memory and then runs the computer program to implement the video clipping method described above with reference to fig. 2.
The present application also provides a computer storage medium storing a computer program which, when executed by a processor, performs the video clipping method provided in fig. 2 above.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The method and apparatus provided by the embodiments of the present application are described in detail above, and the principle and the embodiments of the present application are explained herein by applying specific examples, and the description of the embodiments above is only used to help understand the method and the core idea of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.
Claims (49)
1. A video editing method is applied to electronic equipment, the electronic equipment is in communication connection with an unmanned aerial vehicle, the unmanned aerial vehicle is provided with a shooting device and is used for shooting video data, and the electronic equipment is used for processing the video data; characterized in that the method comprises:
acquiring video data shot by the shooting device, flight attitude data of the unmanned aerial vehicle and timestamp data; wherein the timestamp data is used to synchronize the video data with the flight attitude data;
clipping the video data based on the flight attitude data and the timestamp data.
2. The method of claim 1, wherein the UAV is a traversing machine.
3. The method of claim 1, wherein said clipping the video data based on the attitude data and the timestamp data comprises:
and splicing the video data corresponding to the time of the flight attitude data meeting the preset splicing condition to obtain the spliced video.
4. The method of claim 3, wherein the flight attitude data comprises at least one of:
the unmanned aerial vehicle comprises a flight gear, a flight speed, a rotation angle, a flight height, a distance of an obstacle and a carried cradle head attitude.
5. The method of claim 4, wherein the preset clipping condition comprises at least one of:
the flight gear of the unmanned aerial vehicle is a preset gear;
the flight speed of the unmanned aerial vehicle is greater than a preset speed threshold value;
the rotation angle of the unmanned aerial vehicle is greater than a preset angle threshold value;
the flight height of the unmanned aerial vehicle is greater than a preset height threshold value;
the distance between the unmanned aerial vehicle and the obstacle is smaller than a preset distance threshold value;
the attitude change of the gimbal on the unmanned aerial vehicle is greater than a preset attitude change threshold.
6. The method of claim 1, wherein the electronic device is further configured to control movement of the unmanned aerial vehicle via a control, the method further comprising:
marking, in the video data, the moment at which the variation amplitude of the control is greater than a preset variation amplitude threshold;
and extracting the video data in the specified time period containing the time for splicing.
7. The method of claim 5, wherein the attitude change of the gimbal on the unmanned aerial vehicle being greater than a preset attitude change threshold comprises:
the shaking frequency of the gimbal is less than a preset frequency threshold, and the rotation amplitude of the gimbal is greater than a preset rotation amplitude threshold.
8. The method of claim 1, wherein said clipping said video data based on said attitude data and said timestamp data further comprises:
and denoising the flight attitude data.
9. The method of claim 8, wherein the denoising process comprises a low pass filtering process.
10. The method of claim 1, further comprising:
clipping the video data based on the content of the video data, and playing back or sharing the clipped video data.
11. The method of claim 10, wherein the clipping the video data based on the content of the video data comprises:
marking the moment when the similarity between adjacent frames in the video data is greater than a preset similarity threshold;
and extracting the video data in the specified time period containing the time for splicing.
12. The method of claim 1, further comprising:
acquiring video data shot by the shooting device, flight attitude data of the unmanned aerial vehicle and timestamp data in real time; or
after the shooting of the shooting device is finished, acquiring the video data shot by the shooting device, the flight attitude data of the unmanned aerial vehicle, and the timestamp data.
13. A video clipping method is applied to an unmanned aerial vehicle, and the unmanned aerial vehicle is provided with a shooting device, and is characterized by comprising the following steps:
acquiring video data shot by the shooting device, recording flight attitude data of the unmanned aerial vehicle, and correspondingly generating timestamp data; wherein the timestamp data is used to synchronize the video data with the flight attitude data;
clipping the video data based on the flight attitude data and the timestamp data.
14. The method of claim 13, wherein the UAV is a traversing machine.
15. The method of claim 13, wherein said clipping the video data based on the attitude data and the timestamp data comprises:
and splicing the video data corresponding to the time of the flight attitude data meeting the preset splicing condition to obtain the spliced video.
16. The method of claim 13 or 15, wherein the flight attitude data comprises at least one of:
the unmanned aerial vehicle comprises a flight gear, a flight speed, a rotation angle, a flight height, a distance of an obstacle and a carried cradle head attitude.
17. The method of claim 16, wherein the preset clipping condition comprises at least one of:
the flight gear of the unmanned aerial vehicle is a preset gear;
the flight speed of the unmanned aerial vehicle is greater than a preset speed threshold value;
the rotation angle of the unmanned aerial vehicle is greater than a preset angle threshold value;
the flight height of the unmanned aerial vehicle is greater than a preset height threshold value;
the distance between the unmanned aerial vehicle and the obstacle is smaller than a preset distance threshold value;
the attitude change of the gimbal on the unmanned aerial vehicle is greater than a preset attitude change threshold.
18. The method of claim 13, wherein the UAV is communicatively coupled to an electronic device that controls the UAV via controls, the method further comprising:
marking, in the video data, the moment at which the variation amplitude of the control is greater than a preset variation amplitude threshold;
and extracting the video data in the specified time period containing the time for splicing.
19. The method of claim 17, wherein the attitude change of the gimbal on the unmanned aerial vehicle being greater than a preset attitude change threshold comprises:
the shaking frequency of the gimbal is less than a preset frequency threshold, and the rotation amplitude of the gimbal is greater than a preset rotation amplitude threshold.
20. The method of claim 13, wherein said clipping said video data based on said attitude data and said timestamp data further comprises:
and denoising the flight attitude data.
21. The method of claim 20, wherein the denoising process comprises a low pass filtering process.
22. The method of claim 13, further comprising:
clipping the video data based on the content of the video data.
23. The method of claim 22, wherein the clipping the video data based on the content of the video data comprises:
marking the moment when the similarity between adjacent frames in the video data is greater than a preset similarity threshold;
and extracting the video data in the specified time period containing the time for splicing.
24. The method of claim 13, further comprising:
acquiring video data shot by the shooting device, flight attitude data of the unmanned aerial vehicle and timestamp data in real time; or
after the shooting of the shooting device is finished, acquiring the video data shot by the shooting device, the flight attitude data of the unmanned aerial vehicle, and the timestamp data.
25. An electronic device is connected with unmanned vehicles in a communication mode and used for controlling the unmanned vehicles to move, and the unmanned vehicles are provided with shooting devices, and the electronic device is characterized by comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring video data shot by the shooting device, flight attitude data of the unmanned aerial vehicle and timestamp data; wherein the timestamp data is used to synchronize the video data with the flight attitude data;
clipping the video data based on the flight attitude data and the timestamp data.
26. The electronic device of claim 25, wherein the UAV is a crossing machine.
27. The electronic device of claim 25, wherein the processor is configured to:
and splicing the video data corresponding to the time of the flight attitude data meeting the preset splicing condition to obtain the spliced video.
28. The electronic device of claim 27, wherein the attitude data comprises at least one of:
the unmanned aerial vehicle comprises a flight gear, a flight speed, a rotation angle, a flight height, a distance of an obstacle and a carried cradle head attitude.
29. The electronic device of claim 28, wherein the preset clipping condition comprises at least one of:
the flight gear of the unmanned aerial vehicle is a preset gear;
the flight speed of the unmanned aerial vehicle is greater than a preset speed threshold value;
the rotation angle of the unmanned aerial vehicle is greater than a preset angle threshold value;
the flight height of the unmanned aerial vehicle is greater than a preset height threshold value;
the distance between the unmanned aerial vehicle and the obstacle is smaller than a preset distance threshold value;
the attitude change of the gimbal on the unmanned aerial vehicle is greater than a preset attitude change threshold.
30. The electronic device of claim 25, wherein the electronic device is further configured to control movement of the unmanned aerial vehicle via a control, and wherein the processor is further configured to:
marking, in the video data, the moment at which the variation amplitude of the control is greater than a preset variation amplitude threshold;
and extracting the video data in the specified time period containing the time for splicing.
31. The electronic device of claim 29, wherein the attitude change of the gimbal on the unmanned aerial vehicle being greater than a preset attitude change threshold comprises:
the shaking frequency of the gimbal is less than a preset frequency threshold, and the rotation amplitude of the gimbal is greater than a preset rotation amplitude threshold.
32. The electronic device of claim 25, wherein the processor is further configured to:
and denoising the flight attitude data.
33. The electronic device of claim 32, wherein the de-noising process comprises a low pass filtering process.
34. The electronic device of claim 25, wherein the processor is further configured to:
and editing the video data based on the content of the video data, and playing back or sharing the video data obtained by editing.
35. The electronic device of claim 34, wherein the processor is configured to:
marking the moment when the similarity between adjacent frames in the video data is greater than a preset similarity threshold;
and extracting the video data in the specified time period containing the time for splicing.
36. The electronic device of claim 25, wherein the processor is further configured to:
acquiring video data shot by the shooting device, flight attitude data of the unmanned aerial vehicle and timestamp data in real time; or
after the shooting of the shooting device is finished, acquiring the video data shot by the shooting device, the flight attitude data of the unmanned aerial vehicle, and the timestamp data.
37. An unmanned aerial vehicle equipped with a camera, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring video data shot by the shooting device, recording flight attitude data of the unmanned aerial vehicle in a memory, and correspondingly generating timestamp data; wherein the timestamp data is used to synchronize the video data with the flight attitude data;
clipping the video data based on the flight attitude data and the timestamp data.
38. The UAV according to claim 37 wherein the UAV is a crossing machine.
39. The UAV of claim 37 wherein the processor is configured to:
and splicing the video data corresponding to the time of the flight attitude data meeting the preset splicing condition to obtain the spliced video.
40. The UAV of claim 39 wherein the attitude data comprises at least one of:
the unmanned aerial vehicle comprises a flight gear, a flight speed, a rotation angle, a flight height, a distance of an obstacle and a carried cradle head attitude.
41. The UAV according to claim 40 wherein the preset clipping conditions include at least one of:
the flight gear of the unmanned aerial vehicle is a preset gear;
the flight speed of the unmanned aerial vehicle is greater than a preset speed threshold value;
the rotation angle of the unmanned aerial vehicle is greater than a preset angle threshold value;
the flight height of the unmanned aerial vehicle is greater than a preset height threshold value;
the distance between the unmanned aerial vehicle and the obstacle is smaller than a preset distance threshold value;
the attitude change of the gimbal on the unmanned aerial vehicle is greater than a preset attitude change threshold.
42. The UAV of claim 37, wherein the UAV is communicatively coupled to an electronic device, wherein the electronic device controls the UAV via a control, wherein the processor is further configured to:
marking, in the video data, the moment at which the variation amplitude of the control is greater than a preset variation amplitude threshold;
and extracting the video data in the specified time period containing the time for splicing.
43. The UAV of claim 41, wherein the attitude change of the gimbal on the unmanned aerial vehicle being greater than a preset attitude change threshold comprises:
the shaking frequency of the gimbal is less than a preset frequency threshold, and the rotation amplitude of the gimbal is greater than a preset rotation amplitude threshold.
44. The UAV of claim 37 wherein the processor is further configured to:
and denoising the flight attitude data.
45. The UAV of claim 44 wherein the de-noising process comprises a low pass filtering process.
46. The UAV of claim 37 wherein the processor is further configured to:
clipping the video data based on the content of the video data.
47. The UAV of claim 46 wherein the processor is configured to:
marking the moment when the similarity between adjacent frames in the video data is greater than a preset similarity threshold;
and extracting the video data in the specified time period containing the time for splicing.
48. The UAV of claim 37 wherein the processor is further configured to:
acquiring video data shot by the shooting device, flight attitude data of the unmanned aerial vehicle and timestamp data in real time; or
after the shooting of the shooting device is finished, acquiring the video data shot by the shooting device, the flight attitude data of the unmanned aerial vehicle, and the timestamp data.
49. A computer storage medium having a computer program stored thereon, the computer program, when executed, implementing the method of any of claims 1-12 or claims 13-24.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2020/117488 WO2022061660A1 (en) | 2020-09-24 | 2020-09-24 | Video trimming method, electronic device, unmanned aerial vehicle, and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113940087A true CN113940087A (en) | 2022-01-14 |
Family
ID=79275151
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202080040151.3A Pending CN113940087A (en) | 2020-09-24 | 2020-09-24 | Video editing method, electronic equipment, unmanned aerial vehicle and storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN113940087A (en) |
WO (1) | WO2022061660A1 (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN205017419U (en) * | 2015-09-22 | 2016-02-03 | 杨珊珊 | Device of taking photo by plane |
CN108702464A (en) * | 2017-10-16 | 2018-10-23 | 深圳市大疆创新科技有限公司 | A kind of method for processing video frequency, control terminal and movable equipment |
CN109036479A (en) * | 2018-08-01 | 2018-12-18 | 曹清 | Clip point judges system and clip point judgment method |
CN109076263A (en) * | 2017-12-29 | 2018-12-21 | 深圳市大疆创新科技有限公司 | Video data handling procedure, equipment, system and storage medium |
WO2019127027A1 (en) * | 2017-12-26 | 2019-07-04 | 深圳市大疆创新科技有限公司 | Processing method for shooting video of unmanned aerial vehicle, shooting camera and remote control |
CN110832419A (en) * | 2018-07-25 | 2020-02-21 | 深圳市大疆创新科技有限公司 | Unmanned aerial vehicle control method and system and unmanned aerial vehicle |
WO2021056353A1 (en) * | 2019-09-26 | 2021-04-01 | 深圳市大疆创新科技有限公司 | Video editing method, and terminal apparatus |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107197136B (en) * | 2016-06-20 | 2019-04-09 | 普宙飞行器科技(深圳)有限公司 | Realize the control method of the beautification of unmanned aerial vehicle onboard camera image, video clipping |
US20180103197A1 (en) * | 2016-10-06 | 2018-04-12 | Gopro, Inc. | Automatic Generation of Video Using Location-Based Metadata Generated from Wireless Beacons |
CN108320304A (en) * | 2017-12-18 | 2018-07-24 | 广州亿航智能技术有限公司 | A kind of automatic edit methods and system of unmanned plane video media |
- 2020-09-24: WO application PCT/CN2020/117488 filed as WO2022061660A1 (active, Application Filing)
- 2020-09-24: CN application CN202080040151.3A filed as CN113940087A (active, Pending)
Also Published As
Publication number | Publication date |
---|---|
WO2022061660A1 (en) | 2022-03-31 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||