CN109922252B - Short video generation method and device and electronic equipment - Google Patents


Info

Publication number
CN109922252B
CN109922252B · Application CN201711322192.8A
Authority
CN
China
Prior art keywords
short video
image frames
image frame
image
photo
Prior art date
Legal status
Active
Application number
CN201711322192.8A
Other languages
Chinese (zh)
Other versions
CN109922252A (en)
Inventor
马昆
王晨曦
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201711322192.8A
Publication of CN109922252A
Application granted
Publication of CN109922252B

Abstract

The disclosure relates to a short video generation method and apparatus, and an electronic device. The method may include: receiving a photographing instruction for the electronic device; when the dynamic photo capture function of the electronic device is in an on state, acquiring a dynamic photo generated by the electronic device in response to the photographing instruction; performing an editing operation on at least one image frame constituting the dynamic photo according to a received user editing instruction; and synthesizing the edited image frame(s) and the other image frames constituting the dynamic photo into a short video.

Description

Short video generation method and device and electronic equipment
Technical Field
The present disclosure relates to the field of video processing technologies, and in particular, to a short video generation method and apparatus, and an electronic device.
Background
In the related art, a user may perform an image capture operation through an electronic device to obtain a corresponding photo or video. When the user is not satisfied with a photo, it can be edited through a quick photo-editing function, for example by beautifying the portrait or blurring the background. However, the related art does not provide a scheme that lets a user easily edit a video.
Disclosure of Invention
The present disclosure provides a method and an apparatus for generating a short video, and an electronic device, so as to solve the deficiencies in the related art.
According to a first aspect of the embodiments of the present disclosure, there is provided a method for generating a short video, including:
receiving a photographing instruction for the electronic equipment;
when the dynamic photo acquisition function of the electronic equipment is in an open state, acquiring a dynamic photo generated by the electronic equipment in response to the photographing instruction;
according to the received user editing instruction, carrying out editing operation on at least one image frame for forming the dynamic photo;
and synthesizing the edited at least one image frame and other image frames for forming the dynamic photo into a short video.
Optionally, the image frames used to constitute the dynamic photo include: at least one photo obtained by a photographing operation performed by the electronic device in response to the photographing instruction, at least one photo captured by the electronic device before the photographing operation is performed, and at least one photo captured by the electronic device after the photographing operation.
Optionally, the method further includes:
when an operation object of the editing operation is unrelated to a photographic subject in the at least one image frame, performing the editing operation on the other image frames constituting the dynamic photo, so that the resulting image frames are used to synthesize the short video.
Optionally, the method further includes:
when an operation object of the editing operation is a photographic subject in the at least one image frame, determining the image frames containing the photographic subject among the other image frames constituting the dynamic photo;
and performing the editing operation on the determined image frames containing the photographic subject, so that the resulting image frames are used to synthesize the short video.
Optionally,
further comprising: performing an audio capture operation during the capture, by the electronic device, of the image frames used to form the dynamic photo;
the synthesizing of the edited at least one image frame and the other image frames used for forming the dynamic photo into a short video includes: synthesizing the edited image frame, the other image frames used for forming the dynamic photo, and the captured audio into a short video.
Optionally, the method further includes:
setting one image frame constituting the dynamic photo as a static cover image of the short video;
when a first trigger operation for the short video is detected, displaying the static cover image;
and when a second trigger operation aiming at the short video is detected, playing the short video.
According to a second aspect of the embodiments of the present disclosure, there is provided an apparatus for generating a short video, including:
a receiving unit configured to receive a photographing instruction for an electronic device;
an obtaining unit configured to obtain a dynamic photo generated by the electronic device in response to the photographing instruction when the dynamic photo capture function of the electronic device is in an on state;
a first editing unit configured to perform an editing operation on at least one image frame for constituting the moving picture according to a received user editing instruction;
a synthesizing unit configured to synthesize the edited at least one image frame with other image frames for constituting the moving picture into a short video.
Optionally, the image frames used to constitute the dynamic photo include: at least one photo obtained by a photographing operation performed by the electronic device in response to the photographing instruction, at least one photo captured by the electronic device before the photographing operation is performed, and at least one photo captured by the electronic device after the photographing operation.
Optionally, the method further includes:
a second editing unit configured to, when an operation object of the editing operation is unrelated to a photographic subject in the at least one image frame, perform the editing operation on the other image frames constituting the dynamic photo, so that the resulting image frames are used by the synthesizing unit to synthesize the short video.
Optionally, the method further includes:
a determination unit configured to determine, when an operation object of the editing operation is a photographic subject in the at least one image frame, the image frames containing the photographic subject among the other image frames constituting the dynamic photo;
a third editing unit configured to perform the editing operation on the determined image frames containing the photographic subject, so that the resulting image frames are used by the synthesizing unit to synthesize the short video.
Optionally,
further comprising: an acquisition unit configured to perform an audio acquisition operation during acquisition of image frames constituting the moving picture by the electronic device;
the synthesizing unit is configured to synthesize the edited image frames, the other image frames constituting the dynamic photo, and the captured audio into a short video.
Optionally, further comprising:
a setting unit configured to set one image frame constituting the dynamic photo as a static cover image of the short video;
a processing unit configured to display the static cover image when a first trigger operation for the short video is detected, and to play the short video when a second trigger operation for the short video is detected.
According to a third aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to implement the method as in any of the above embodiments.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon computer instructions, wherein the instructions, when executed by a processor, implement the steps of the method as in any one of the above embodiments.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
according to the embodiment, the dynamic photo is obtained when the user takes a picture, the editing function aiming at the image frames forming the dynamic photo is provided for the user, so that the user can conveniently and rapidly implement the editing operation aiming at the image frames forming the dynamic photo, the edited image frames are combined into the short video, and the short video is rapidly generated on the electronic equipment.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flow chart illustrating a method of generating a short video according to an exemplary embodiment.
Fig. 2 is a flow chart illustrating another method of generating short video according to an example embodiment.
Fig. 3-7 are block diagrams illustrating an apparatus for generating short video according to an example embodiment.
Fig. 8 is a schematic structural diagram illustrating an apparatus for short video generation according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. The word "if" as used herein may be interpreted as "when", "upon", or "in response to determining", depending on the context.
Fig. 1 is a flowchart illustrating a method for generating a short video according to an exemplary embodiment, where the method is applied to an electronic device, as shown in fig. 1, and may include the following steps:
in step 102, a photographing instruction for an electronic device is received.
In step 104, when the dynamic photo capturing function of the electronic device is in an on state, a dynamic photo generated by the electronic device in response to the photographing instruction is acquired.
In one embodiment, when the dynamic photo capture function of the electronic device is in an off state, the electronic device captures an ordinary photo in response to a received photographing instruction; when the dynamic photo capture function is in an on state, the electronic device obtains a dynamic photo in response to the received photographing instruction, and this dynamic photo comprises a plurality of image frames and can be regarded as a video of short duration (i.e., a short video).
In one embodiment, when the electronic device generates a dynamic photo, one image frame of the dynamic photo (typically the image frame captured at the moment the photographing instruction is received) may be set as a static cover image. When the user performs a first trigger operation on the dynamic photo (e.g., opening it in the album), the static cover image is shown, much like viewing an ordinary photo; when the user performs a second trigger operation on the dynamic photo (e.g., pressing the static cover image firmly or long-pressing it), the dynamic photo is played back, which amounts to showing the user a short video. The dynamic photo therefore combines the motion-recording characteristic of video with the still-recording characteristic of a photo, and in this respect differs from an ordinary video.
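As a hedged illustration of this structure, the following minimal Python sketch models a dynamic photo as a list of image frames plus the index of its static cover frame, with two handlers standing in for the first and second trigger operations; the class and method names (DynamicPhoto, on_first_trigger, on_second_trigger) are hypothetical and not taken from the patent.

```python
from dataclasses import dataclass
from typing import List

import numpy as np


@dataclass
class DynamicPhoto:
    """A dynamic photo: several image frames plus the index of the static cover frame."""
    frames: List[np.ndarray]   # decoded image frames, in capture order
    cover_index: int = 0       # frame shown as the static cover image

    def on_first_trigger(self) -> np.ndarray:
        # First trigger (e.g. opening the album entry): return only the cover frame.
        return self.frames[self.cover_index]

    def on_second_trigger(self) -> List[np.ndarray]:
        # Second trigger (e.g. a firm press / long press): return all frames for playback.
        return self.frames
```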
In step 106, an editing operation is performed on at least one image frame used to construct the moving picture according to the received user editing instruction.
In one embodiment, each image frame of a dynamic photo may only implement intra-frame compression, enabling the electronic device to provide any image frame of the dynamic photo to a user for the user to implement an editing operation. In other embodiments, the image frames of the dynamic photo may implement inter-frame compression (only inter-frame compression, or both intra-frame compression and inter-frame compression), and the electronic device may still provide at least one image frame of the dynamic photo to the user for the user to perform an editing operation.
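As a hedged illustration of why intra-frame-only compression makes per-frame editing convenient, the Python sketch below decodes a single, arbitrary frame from a dynamic photo stored in an intra-only format such as MJPEG; the use of OpenCV and the seek-by-frame-index approach are assumptions for illustration, not part of the patent.

```python
import cv2


def extract_frame(path: str, index: int):
    """Decode one frame from a dynamic photo stored as an intra-only (e.g. MJPEG) file."""
    cap = cv2.VideoCapture(path)
    cap.set(cv2.CAP_PROP_POS_FRAMES, index)  # each frame decodes independently, so seeking is cheap
    ok, frame = cap.read()
    cap.release()
    return frame if ok else None
```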
In one embodiment, the image frames used to construct the dynamic photo comprise: at least one photo obtained by the electronic device performing the photographing operation in response to the photographing instruction, at least one photo captured by the electronic device before the photographing operation is performed, and at least one photo captured by the electronic device after the photographing operation. Compared with ordinary video shooting, the dynamic photo or short video of the present disclosure is simple to operate (the user only needs to issue a photographing instruction), easy to produce, and, because it contains little data, easy to transmit and share, which raises the user's willingness to play it and helps improve the efficiency of information sharing. Because the capture window of a dynamic photo or short video is short (typically tens of seconds or a few seconds), it is usually difficult to catch a sudden event unless the camera happens to be aimed at it; ordinary video has the same limitation, but can compensate through long continuous recording and post-editing. Since the dynamic photo also includes at least one photo captured by the electronic device before the photographing operation is performed (i.e., the electronic device automatically captures frames before receiving the photographing instruction), the user can issue the photographing instruction after noticing a sudden event and still catch that event in the dynamic photo, without long continuous recording or later clipping.
In step 108, the edited at least one image frame and other image frames used for forming the dynamic photo are synthesized into a short video.
In one embodiment, the short video may be in the same file format as the dynamic photo, so that the user uses the short video in a similar way to the dynamic photo, such as setting one image frame for constituting the dynamic photo as a static cover image of the short video, presenting the static cover image when a first trigger operation for the short video is detected, and playing the short video when a second trigger operation for the short video is detected. In other embodiments, the short video may be in a different file format than the motion picture, which is not limited by this disclosure.
In an embodiment, the operation object of the editing operation may be independent of any photographic subject in the at least one image frame; for example, the operation object may be a decorative pattern added to the image frame. The electronic device may then automatically perform the same editing operation on the other image frames constituting the dynamic photo, so that the resulting image frames are used to synthesize the short video. In this way, an editing operation performed by the user on some of the image frames can be applied to all image frames, and the corresponding editing effect is presented throughout the short video without the user editing the frames one by one, which helps simplify user operation. In other embodiments, the editing effect may be presented only in the image frames on which the user actually performed the editing operation, without automatically applying it to the other image frames.
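As a hedged illustration of propagating such a subject-independent edit, the Python sketch below stamps a small decorative patch at a fixed position onto every frame of the dynamic photo; the function names and the patch-based decoration are assumptions for illustration, not the patent's implementation.

```python
from typing import List

import numpy as np


def apply_decoration(frame: np.ndarray, patch: np.ndarray, x: int, y: int) -> np.ndarray:
    """Copy a decorative patch onto one frame at pixel position (x, y)."""
    out = frame.copy()
    h, w = patch.shape[:2]
    out[y:y + h, x:x + w] = patch
    return out


def propagate_edit(frames: List[np.ndarray], patch: np.ndarray, x: int, y: int) -> List[np.ndarray]:
    """Apply the same subject-independent edit to every frame of the dynamic photo."""
    return [apply_decoration(f, patch, x, y) for f in frames]
```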
In an embodiment, the operation object of the editing operation may be a photographic subject in the at least one image frame; for example, it may be a shot subject, the shooting background, and so on, which is not limited by the present disclosure. By determining which of the other image frames constituting the dynamic photo contain that subject, the electronic device can automatically perform the editing operation on the determined image frames containing the subject, so that the resulting image frames are used to synthesize the short video. In this way, an editing operation performed by the user on some of the image frames can be applied to all image frames that contain the subject, and the corresponding editing effect is presented throughout the short video without the user editing the frames one by one, which helps simplify user operation. For example, if the user beautifies a subject person in one image frame, the same beautification can be applied automatically to that person in any of the other image frames.
In one embodiment, an audio capture operation may be performed during the capture of the image frames used to construct the motion picture by the electronic device; then, the edited image frame, other image frames used for forming the dynamic photo and the collected audio are synthesized into a short video, so that the short video has better information expression capability.
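As a hedged illustration of combining the edited frames with the captured audio, the Python sketch below writes the frames to a temporary video file and then muxes in the audio track with ffmpeg; the file names, the mp4v codec, the AAC audio encoding, and the use of OpenCV and ffmpeg are assumptions for illustration rather than the patent's prescribed method.

```python
import subprocess

import cv2


def mux_short_video(frame_paths, audio_path, out_path, fps=30):
    """Write the (already edited) frames to a temporary video, then mux in the captured audio."""
    first = cv2.imread(frame_paths[0])
    h, w = first.shape[:2]
    tmp_video = "frames_only.mp4"
    writer = cv2.VideoWriter(tmp_video, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))
    for p in frame_paths:
        writer.write(cv2.imread(p))
    writer.release()
    # Copy the video stream and encode the audio to AAC into the final short video.
    subprocess.run(
        ["ffmpeg", "-y", "-i", tmp_video, "-i", audio_path,
         "-c:v", "copy", "-c:a", "aac", "-shortest", out_path],
        check=True,
    )
```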
According to the above embodiments, a dynamic photo is obtained when the user takes a picture, and an editing function for the image frames constituting the dynamic photo is provided to the user, so that the user can conveniently and quickly edit those image frames; the edited image frames are then combined into a short video, allowing the short video to be generated quickly on the electronic device.
Fig. 2 is a flowchart illustrating another short video generation method according to an exemplary embodiment, and as shown in fig. 2, the method applied to an electronic device may include the following steps:
in step 202, a camera application is started on the electronic device.
In an embodiment, the camera application may provide the user with a switch option for the dynamic photo shooting function. When the dynamic photo shooting function is detected to be in an off state, the electronic device performs an ordinary photographing operation whenever the photographing option is detected to be triggered, so as to take a single photo or a burst of several photos. When the dynamic photo shooting function is detected to be in an on state, the electronic device performs the dynamic photo shooting operation, thereby realizing the technical solution of the present disclosure.
In step 204A, with the dynamic photo shooting function turned on, the electronic device automatically and cyclically captures image frames.
In one embodiment, when the dynamic photo shooting function is turned on and the camera application of the electronic device is opened, the camera application may automatically and continuously capture image frames without receiving any photographing instruction from the user. To avoid occupying too many resources, the image frames buffered by the camera application may be limited to a specific duration; for example, the specific duration may be 1 s. If no photographing instruction is detected after 1 s, the camera application keeps capturing new image frames and deletes the earliest captured frames in acquisition order, so that the image frames retained on the electronic device never cover more than the specific duration (e.g., 1 s).
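A hedged Python sketch of this pre-capture buffering, using a fixed-length deque that automatically discards the oldest frames so that only about one second of frames is retained; the frame rate, the one-second window, and the shutter_pressed() helper (which stands in for detecting the photographing instruction) are hypothetical.

```python
from collections import deque

import cv2


def buffer_recent_frames(camera_index=0, fps=30, seconds=1.0):
    """Continuously capture frames, keeping only roughly the last `seconds` worth in memory."""
    ring = deque(maxlen=int(fps * seconds))  # oldest frames are dropped automatically
    cap = cv2.VideoCapture(camera_index)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            ring.append(frame)               # at most ~1 s of frames is retained
            if shutter_pressed():            # hypothetical check for the photographing instruction
                return list(ring)            # frames captured just before the instruction
    finally:
        cap.release()
```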
In step 204B, the electronic device detects that the photo option is triggered.
In an embodiment, the camera application may present a photographing option to the user, and the photographing option is the same whether the dynamic photo shooting function is turned on or off, which ensures a consistent user experience in both scenarios and reduces the user's learning cost.
In step 206, the dynamic photograph is composed.
In an embodiment, the camera application may take the image frames captured within a specific duration before the photographing option is triggered, the image frame captured at the moment the photographing option is triggered, and the image frames captured within a specific duration after the photographing option is triggered, and synthesize them together into the dynamic photo.
In one embodiment, during synthesis of a dynamic photo, intra-frame compression and/or inter-frame compression may be applied to the image frames to reduce the storage space occupied by the dynamic photo. For example, the dynamic photo may adopt the Motion JPEG (MJPEG) format, in which each image frame is JPEG-encoded so that only intra-frame compression is applied. Encoding and decoding during synthesis and editing of the dynamic photo can then be carried out with relatively few processing resources, which helps increase the synthesis speed (the user barely perceives the synthesis process, so shooting a dynamic photo feels close to shooting an ordinary photo, which helps optimize the user's application experience) and allows editing operations on the dynamic photo to run smoothly.
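A hedged sketch of writing the frames with intra-frame-only compression: OpenCV's MJPG codec JPEG-encodes each frame independently. The AVI container, frame rate, and the use of OpenCV are assumptions for illustration, not requirements of the patent.

```python
import cv2


def write_mjpeg(frames, out_path="dynamic_photo.avi", fps=30):
    """Encode each frame independently with JPEG (intra-frame compression only)."""
    h, w = frames[0].shape[:2]
    writer = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"MJPG"), fps, (w, h))
    for frame in frames:
        writer.write(frame)  # each frame is JPEG-encoded on its own
    writer.release()
```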
In step 208, it is determined whether the user needs to edit the dynamic photo; when editing is required, proceed to step 210.
In an embodiment, the processing may follow an order other than that of steps 206-210 shown in fig. 2. For example, the camera application may first acquire the image frames for composing the dynamic photo and ask the user whether the dynamic photo needs to be edited; if the user does not need to edit, the camera application synthesizes the dynamic photo from the image frames directly, and if the user does need to edit, the camera application presents the image frames to the user so that the user can perform the editing operation.
In step 210, the image frames that make up the motion picture are presented.
In one embodiment, the displayed image frames may be image frames acquired by a camera application, i.e., the image frames are not compressed. In another embodiment, the displayed image frame may be an extracted image frame in a dynamic picture, and the image frame may be subjected to a corresponding compression process, such as intra-frame compression or inter-frame compression.
In an embodiment, the image frames may be arranged and displayed according to a time sequence, so that a user may select an image frame that needs to perform an editing operation.
In step 212, the selected image frame is edited according to the detected user editing instruction.
In one embodiment, the user may select one or more image frames and perform an editing operation on them. In one case, the editing object of the editing operation may be a photographic subject in an image frame: along one dimension, a shot subject, a background, and so on; along another dimension, a person, an animal, a plant, a building, a natural landscape, and so on. The editing operation may be performed on one or more such subjects, such as beautifying a subject person or blurring the background, which is not limited by the present disclosure. In another case, the editing object of the editing operation may be independent of any subject in the image frame; for example, the user may apply a filter, adjust the color tone, or add a virtual lighting source to the whole image frame, or add a decorative pattern, decorative text, and the like to the image frame.
In an embodiment, to make the editing operation easier for the user, a preprocessing operation may be performed on the image frames; for example, the electronic device may proactively identify the photographic subjects contained in an image frame and mark them, thereby improving the efficiency with which the user edits those subjects.
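As a hedged illustration of subject-aware editing, the Python sketch below uses OpenCV's stock Haar-cascade face detector as a stand-in for the subject identification described above, and applies a mild smoothing (standing in for "beautification") only to the frames in which a face is found; the detector choice, the smoothing step, and the function names are assumptions, not the patent's algorithm.

```python
from typing import List

import cv2
import numpy as np

# Stock frontal-face detector shipped with OpenCV, used here as a stand-in subject detector.
_FACE = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")


def beautify_faces(frame: np.ndarray) -> np.ndarray:
    """Apply a mild smoothing to each detected face region (a placeholder 'beautification')."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    out = frame.copy()
    for (x, y, w, h) in _FACE.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        out[y:y + h, x:x + w] = cv2.GaussianBlur(out[y:y + h, x:x + w], (7, 7), 0)
    return out


def propagate_subject_edit(frames: List[np.ndarray]) -> List[np.ndarray]:
    """Edit only the frames in which the subject (here, a face) is actually detected."""
    result = []
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        has_subject = len(_FACE.detectMultiScale(gray, 1.1, 5)) > 0
        result.append(beautify_faces(frame) if has_subject else frame)
    return result
```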
In step 214, it is determined whether an editing operation needs to be applied to all image frames; when it is determined that the application is needed, the process proceeds to step 216, and when it is determined that the application is not needed, the process proceeds to step 218.
In step 216, the remaining image frames are automatically edited.
In an embodiment, the user may perform the editing operation on only some of the image frames. If the user wishes to apply the editing effect to all image frames constituting the dynamic photo, the user may choose to edit the remaining image frames automatically; the electronic device then automatically performs the same or corresponding editing operations on the remaining image frames according to the editing operation already performed by the user, so that the user does not have to edit every image frame one by one.
For example, the user may perform beautification processing on the subject person in one image frame, and when the user selects to automatically edit the remaining image frames, the electronic device may determine the same subject person in the remaining unprocessed image frames and perform the same beautification processing on the subject person, thereby ensuring that the subject person is subjected to the same or similar beautification processing in all image frames.
For another example, the user may add a decoration pattern at a certain position in one image frame, and when the user selects to automatically edit the remaining image frames, the electronic device may add the decoration pattern at the same position in the other unprocessed remaining image frames, thereby ensuring that the decoration pattern is displayed at the same position in all image frames.
For another example, the user may add a decoration pattern at a first position of a first image frame and a decoration pattern at a second position of a last image frame, and when the user selects to automatically edit the remaining image frames, the electronic device may add the decoration pattern at corresponding positions in the remaining unprocessed image frames, thereby ensuring that the decoration pattern is displayed in all the image frames, and the positions of the decoration patterns in the respective image frames are sequentially changed, presenting a visual change effect that the decoration pattern gradually moves from the first position to the second position.
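A hedged sketch of the gradual-movement effect just described: the decoration position is linearly interpolated between its position in the first frame and its position in the last frame; the stamping approach and function names are illustrative assumptions, not the patent's implementation.

```python
from typing import List, Tuple

import numpy as np


def interpolate_decoration(frames: List[np.ndarray], patch: np.ndarray,
                           start: Tuple[int, int], end: Tuple[int, int]) -> List[np.ndarray]:
    """Stamp the decoration on every frame, moving it linearly from `start` to `end`."""
    n = len(frames)
    h, w = patch.shape[:2]
    out = []
    for i, frame in enumerate(frames):
        t = i / (n - 1) if n > 1 else 0.0          # 0.0 at the first frame, 1.0 at the last
        x = int(round(start[0] + t * (end[0] - start[0])))
        y = int(round(start[1] + t * (end[1] - start[1])))
        f = frame.copy()
        f[y:y + h, x:x + w] = patch                # decoration drifts from start to end
        out.append(f)
    return out
```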
In step 218, the image frames are synthesized into a short video.
In one embodiment, if the editing operation is not applied to all image frames, the image frames edited by the user and the unedited remaining image frames can be synthesized to obtain a short video; if the editing operation is applied to all image frames, all edited image frames can be synthesized to obtain a short video.
In one embodiment, the file format of the short video may be the same as that of the dynamic photo, which can be understood as replacing the dynamic photo with the edited short video. The image frame captured when the photographing option was triggered may be set as the static cover image of the short video, so that the user sees the static cover image when browsing the album normally, and the corresponding short video is played when the user presses the static cover image firmly or long-presses it.
In an embodiment, the file format of the short video may differ from that of the dynamic photo, for example by adopting the MP4 format, the GIF format, or the like, so that the short video can be transmitted, shared, and played normally on various electronic devices.
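As a hedged illustration of the final synthesis step, the Python sketch below writes the edited frames into an MP4 container and saves the frame captured at shutter time as a separate cover image; the mp4v codec, file names, and frame rate are assumptions, not requirements of the patent.

```python
import cv2


def synthesize_short_video(frames, cover_index, video_path="short_video.mp4",
                           cover_path="cover.jpg", fps=30):
    """Write the edited frames as an MP4 short video and save the static cover image."""
    h, w = frames[0].shape[:2]
    writer = cv2.VideoWriter(video_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))
    for frame in frames:
        writer.write(frame)
    writer.release()
    # The frame captured when the photographing option was triggered serves as the cover.
    cv2.imwrite(cover_path, frames[cover_index])
```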
Corresponding to the foregoing embodiment of the short video generation method, the present disclosure also provides an embodiment of a short video generation apparatus.
Fig. 3 is a block diagram illustrating an apparatus for generating a short video according to an example embodiment. Referring to fig. 3, the apparatus includes:
a receiving unit 301 configured to receive a photographing instruction for an electronic apparatus;
an obtaining unit 302 configured to obtain a dynamic photo generated by the electronic device in response to the photographing instruction when a dynamic photo collecting function of the electronic device is in an on state;
a first editing unit 303 configured to perform an editing operation on at least one image frame for constituting the moving picture according to a received user editing instruction;
a synthesizing unit 304 configured to synthesize the edited at least one image frame with other image frames for constituting the moving picture into a short video.
Optionally, the image frames used to constitute the dynamic photo include: at least one photo obtained by a photographing operation performed by the electronic device in response to the photographing instruction, at least one photo captured by the electronic device before the photographing operation is performed, and at least one photo captured by the electronic device after the photographing operation.
As shown in fig. 4, fig. 4 is a block diagram of another short video generation apparatus according to an exemplary embodiment, which is based on the foregoing embodiment shown in fig. 3, and the apparatus may further include:
a second editing unit 305 configured to, when an operation object of the editing operation is unrelated to a photographic subject in the at least one image frame, perform the editing operation on the other image frames constituting the dynamic photo, so that the resulting image frames are used by the synthesizing unit 304 to synthesize the short video.
As shown in fig. 5, fig. 5 is a block diagram of another short video generation apparatus according to an exemplary embodiment, which is based on the foregoing embodiment shown in fig. 3, and the apparatus may further include:
a determining unit 306 configured to determine, when an operation object of the editing operation is a photographic subject in the at least one image frame, the image frames containing the photographic subject among the other image frames constituting the dynamic photo;
a third editing unit 307 configured to perform the editing operation on the determined image frames containing the photographic subject, so that the resulting image frames are used by the synthesizing unit 304 to synthesize the short video.
It should be noted that the structures of the determining unit 306 and the third editing unit 307 in the apparatus embodiment shown in fig. 5 may also be included in the apparatus embodiment shown in fig. 4, and the disclosure is not limited thereto.
As shown in fig. 6, fig. 6 is a block diagram of another short video generation apparatus according to an exemplary embodiment, which is based on the foregoing embodiment shown in fig. 3, and the apparatus may further include:
an acquisition unit 308 configured to perform an audio acquisition operation during acquisition of image frames constituting the moving picture by the electronic device;
the synthesizing unit 304 is configured to synthesize the edited image frame, other image frames used to form the moving picture, and the captured audio into a short video.
It should be noted that the structure of the collecting unit 308 in the device embodiment shown in fig. 6 may also be included in the device embodiment shown in fig. 4 or 5, and the disclosure is not limited thereto.
As shown in fig. 7, fig. 7 is a block diagram of another short video generation apparatus according to an exemplary embodiment, which is based on the foregoing embodiment shown in fig. 3, and the apparatus may further include:
a setting unit 309 configured to set one image frame constituting the dynamic photo as a static cover image of the short video;
the processing unit 310 is used for displaying the static cover image when a first trigger operation for the short video is detected; and when a second trigger operation aiming at the short video is detected, playing the short video.
It should be noted that the configurations of the setting unit 309 and the processing unit 310 in the apparatus embodiment shown in fig. 7 may also be included in the apparatus embodiment shown in any one of fig. 4 to 6, and the disclosure is not limited thereto.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the disclosed solution. One of ordinary skill in the art can understand and implement it without inventive effort.
Correspondingly, the present disclosure also provides a short video generating apparatus, including: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to implement the method for generating short video according to any one of the above embodiments, for example, the method may include: receiving a photographing instruction for the electronic equipment; when the dynamic photo acquisition function of the electronic equipment is in an open state, acquiring a dynamic photo generated by the electronic equipment in response to the photographing instruction; according to the received user editing instruction, carrying out editing operation on at least one image frame for forming the dynamic photo; and synthesizing the edited at least one image frame and other image frames for forming the dynamic photo into a short video.
Accordingly, the present disclosure also provides a terminal including a memory, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by one or more processors to implement the instructions of the method for generating a short video according to any of the above embodiments, such as the method may include: receiving a photographing instruction for the electronic equipment; when the dynamic photo acquisition function of the electronic equipment is in an open state, acquiring a dynamic photo generated by the electronic equipment in response to the photographing instruction; according to the received user editing instruction, carrying out editing operation on at least one image frame for forming the dynamic photo; and synthesizing the edited at least one image frame and other image frames for forming the dynamic photo into a short video.
Fig. 8 is a block diagram illustrating an apparatus 800 for generation of short video according to an example embodiment. For example, the apparatus 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 8, the apparatus 800 may include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, and communication component 816.
The processing component 802 generally controls overall operation of the device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the apparatus 800. Examples of such data include instructions for any application or method operating on device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power components 806 provide power to the various components of device 800. The power components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 800.
The multimedia component 808 includes a screen that provides an output interface between the device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 800 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing state assessments of various aspects of the device 800. For example, the sensor assembly 814 may detect the open/closed state of the device 800 and the relative positioning of components, such as the display and keypad of the device 800. The sensor assembly 814 may also detect a change in the position of the device 800 or of a component of the device 800, the presence or absence of user contact with the device 800, the orientation or acceleration/deceleration of the device 800, and a change in the temperature of the device 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate communications between the apparatus 800 and other devices in a wired or wireless manner. The device 800 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 804 comprising instructions, executable by the processor 820 of the device 800 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. A method for generating a short video, comprising:
receiving a photographing instruction for the electronic equipment;
when the dynamic photo acquisition function of the electronic equipment is in an open state, acquiring a dynamic photo generated by the electronic equipment in response to the photographing instruction;
according to the received user editing instruction, carrying out editing operation on at least one image frame for forming the dynamic photo;
applying the editing operation to other image frames constituting the dynamic photograph;
synthesizing the edited at least one image frame and other image frames for forming the dynamic photo into a short video;
when an operation object of the editing operation is unrelated to a photographic subject in the at least one image frame, performing the editing operation on other image frames constituting the dynamic photo so that the resulting image frames are used for synthesizing the short video;
when an operation object of the editing operation is a photographic subject in the at least one image frame, determining the image frames containing the photographic subject among the other image frames constituting the dynamic photo;
and performing the editing operation on the determined image frames containing the photographic subject, so that the resulting image frames are used for synthesizing the short video.
2. The method of claim 1, wherein the image frames used to constitute the dynamic photo comprise: at least one photo obtained by a photographing operation performed by the electronic device in response to the photographing instruction, at least one photo captured by the electronic device before the photographing operation is performed, and at least one photo captured by the electronic device after the photographing operation.
3. The method of claim 1,
further comprising: performing an audio capture operation during the capture, by the electronic device, of the image frames used to form the dynamic photo;
wherein the synthesizing of the edited at least one image frame and the other image frames used for forming the dynamic photo into a short video comprises: synthesizing the edited image frame, the other image frames used for forming the dynamic photo, and the captured audio into a short video.
4. The method of claim 1, further comprising:
setting one image frame constituting the dynamic photo as a static cover image of the short video;
when a first trigger operation for the short video is detected, displaying the static cover image;
and when a second trigger operation aiming at the short video is detected, playing the short video.
5. An apparatus for generating a short video, comprising:
a receiving unit configured to receive a photographing instruction for an electronic device;
an obtaining unit configured to obtain a dynamic photo generated by the electronic device in response to the photographing instruction when the dynamic photo capture function of the electronic device is in an on state;
a first editing unit configured to perform an editing operation on at least one image frame constituting the dynamic photo according to a received user editing instruction, and to apply the editing operation to other image frames constituting the dynamic photo;
a synthesizing unit configured to synthesize the edited at least one image frame and the other image frames constituting the dynamic photo into a short video;
a second editing unit configured to perform the editing operation on the other image frames constituting the dynamic photo when an operation object of the editing operation is unrelated to a photographic subject in the at least one image frame, so that the resulting image frames are used by the synthesizing unit to synthesize the short video;
a determination unit configured to determine, when an operation object of the editing operation is a photographic subject in the at least one image frame, the image frames containing the photographic subject among the other image frames constituting the dynamic photo;
a third editing unit configured to perform the editing operation on the determined image frames containing the photographic subject, so that the resulting image frames are used by the synthesizing unit to synthesize the short video.
6. The apparatus of claim 5, wherein the image frames used to constitute the dynamic photo comprise: at least one photo obtained by a photographing operation performed by the electronic device in response to the photographing instruction, at least one photo captured by the electronic device before the photographing operation is performed, and at least one photo captured by the electronic device after the photographing operation.
7. The apparatus of claim 5,
further comprising: an acquisition unit configured to perform an audio capture operation during the capture, by the electronic device, of the image frames constituting the dynamic photo;
the synthesizing unit is configured to synthesize the edited image frames, the other image frames constituting the dynamic photo, and the captured audio into a short video.
8. The apparatus of claim 5, further comprising:
a setting unit configured to set one image frame constituting the dynamic photo as a static cover image of the short video;
a processing unit configured to display the static cover image when a first trigger operation for the short video is detected, and to play the short video when a second trigger operation for the short video is detected.
9. An electronic device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to implement the method of any one of claims 1-4.
10. A computer-readable storage medium having stored thereon computer instructions, which, when executed by a processor, carry out the steps of the method according to any one of claims 1 to 4.
CN201711322192.8A 2017-12-12 2017-12-12 Short video generation method and device and electronic equipment Active CN109922252B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711322192.8A CN109922252B (en) 2017-12-12 2017-12-12 Short video generation method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711322192.8A CN109922252B (en) 2017-12-12 2017-12-12 Short video generation method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN109922252A CN109922252A (en) 2019-06-21
CN109922252B true CN109922252B (en) 2021-11-02

Family

ID=66956975

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711322192.8A Active CN109922252B (en) 2017-12-12 2017-12-12 Short video generation method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN109922252B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110381356B (en) * 2019-07-19 2022-06-07 北京字节跳动网络技术有限公司 Audio and video generation method and device, electronic equipment and readable medium
CN110995999A (en) * 2019-12-12 2020-04-10 北京小米智能科技有限公司 Dynamic photo shooting method and device
CN112862927B (en) * 2021-01-07 2023-07-25 北京字跳网络技术有限公司 Method, apparatus, device and medium for publishing video
CN113067983B (en) * 2021-03-29 2022-11-15 维沃移动通信(杭州)有限公司 Video processing method and device, electronic equipment and storage medium
CN114584704A (en) * 2022-02-08 2022-06-03 维沃移动通信有限公司 Shooting method and device and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105245777A (en) * 2015-09-28 2016-01-13 努比亚技术有限公司 Method and device for generating video image
CN106776831A (en) * 2016-11-24 2017-05-31 维沃移动通信有限公司 A kind of edit methods and mobile terminal of Multimedia Combination data
CN106780359A (en) * 2016-11-14 2017-05-31 北京奇虎科技有限公司 A kind of data editing method, device and mobile terminal
CN106879263A (en) * 2015-10-13 2017-06-20 华为技术有限公司 A kind of image pickup method and mobile device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7697040B2 (en) * 2005-10-31 2010-04-13 Lightbox Network, Inc. Method for digital photo management and distribution
CN104052935B (en) * 2014-06-18 2017-10-20 广东欧珀移动通信有限公司 A kind of video editing method and device
US20170178685A1 (en) * 2015-12-22 2017-06-22 Le Holdings (Beijing) Co., Ltd. Method for intercepting video animation and electronic device
CN107135419A (en) * 2017-06-14 2017-09-05 北京奇虎科技有限公司 A kind of method and apparatus for editing video

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105245777A (en) * 2015-09-28 2016-01-13 努比亚技术有限公司 Method and device for generating video image
CN106879263A (en) * 2015-10-13 2017-06-20 华为技术有限公司 A kind of image pickup method and mobile device
CN106780359A (en) * 2016-11-14 2017-05-31 北京奇虎科技有限公司 A kind of data editing method, device and mobile terminal
CN106776831A (en) * 2016-11-24 2017-05-31 维沃移动通信有限公司 A kind of edit methods and mobile terminal of Multimedia Combination data

Also Published As

Publication number Publication date
CN109922252A (en) 2019-06-21

Similar Documents

Publication Publication Date Title
CN109922252B (en) Short video generation method and device and electronic equipment
US20170304735A1 (en) Method and Apparatus for Performing Live Broadcast on Game
CN109151537B (en) Video processing method and device, electronic equipment and storage medium
WO2018000227A1 (en) Video broadcast method and device
US10270975B2 (en) Preview image display method, apparatus and storage medium
CN106210496B (en) Photo shooting method and device
KR20210133112A (en) Video processing method, apparatus and storage media
CN110677734B (en) Video synthesis method and device, electronic equipment and storage medium
US20170054906A1 (en) Method and device for generating a panorama
CN116156314A (en) Video shooting method and electronic equipment
CN110995993A (en) Star track video shooting method, star track video shooting device and storage medium
CN107105311B (en) Live broadcasting method and device
CN114820296A (en) Image processing method and device, electronic device and storage medium
CN110913120B (en) Image shooting method and device, electronic equipment and storage medium
CN111586296B (en) Image capturing method, image capturing apparatus, and storage medium
KR102557592B1 (en) Method and apparatus for displaying an image, electronic device and computer-readable storage medium
WO2021237744A1 (en) Photographing method and apparatus
CN110312117B (en) Data refreshing method and device
CN111835977B (en) Image sensor, image generation method and device, electronic device, and storage medium
CN114339357A (en) Image acquisition method, image acquisition device and storage medium
CN114078280A (en) Motion capture method, motion capture device, electronic device and storage medium
CN113286073A (en) Imaging method, imaging device, and storage medium
CN113315903A (en) Image acquisition method and device, electronic equipment and storage medium
CN111225158B (en) Image generation method and device, electronic equipment and computer readable storage medium
CN109447929B (en) Image synthesis method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant