CN115484396B - Video processing method and electronic equipment - Google Patents


Info

Publication number
CN115484396B
Authority
CN
China
Prior art keywords
interface
video
electronic equipment
template
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210038857.7A
Other languages
Chinese (zh)
Other versions
CN115484396A (en)
Inventor
韩笑
李启冒
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Publication of CN115484396A publication Critical patent/CN115484396A/en
Application granted granted Critical
Publication of CN115484396B publication Critical patent/CN115484396B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/91Television signal processing therefor

Abstract

Embodiments of this application provide a video processing method and an electronic device, relate to the field of terminal technologies, and can avoid data loss and improve human-computer interaction efficiency during video editing. The method includes the following steps. The electronic device displays a first interface, where the first interface is a viewfinder interface shown before the electronic device records a video after determining to use a first template, and the first template is used by the electronic device to add a first video effect to a captured video. The electronic device displays a second interface in response to a first operation performed by a user on the first interface. The electronic device displays a third interface in response to the end of video recording, where the third interface is a preview interface of a first clip. The electronic device displays a main interface in response to a second operation performed by the user on the third interface. In response to a click operation performed by the user on a first icon, if the camera application still exists in the background of the electronic device, the electronic device displays the third interface. The electronic device saves the final clip in response to a third operation performed by the user on the third interface.

Description

Video processing method and electronic equipment
This application claims priority to Chinese patent application No. 202110676709.3, entitled "A story-line-mode-based user video creation method and electronic device", and to Chinese patent application No. 202111439292.5, entitled "A video processing method and electronic device", both filed with the China National Intellectual Property Office in 2021, and both of which are incorporated herein by reference in their entireties.
Technical Field
The application relates to the technical field of terminals, in particular to a video processing method and electronic equipment.
Background
Electronic devices such as mobile phones and tablets are generally provided with video shooting and editing functions. Taking a mobile phone as an example, the phone can shoot and save a video by using a camera application. Then, in response to a click operation performed by a user on the gallery entry in the lower-left corner of the viewfinder interface, the phone can display a preview interface of the video, in which the video can be clipped, sent, and so on.
However, the inventors found, in the course of implementing the embodiments of this application, that in the prior art, if the user exits to the main interface of the electronic device while the preview interface is displayed, then upon receiving a click operation on the application icon of the camera application in the main interface, the electronic device restarts the camera application and displays the viewfinder interface of the photographing mode. As a result, the editing data generated in the preview interface is lost.
Disclosure of Invention
Embodiments of this application provide a video processing method and an electronic device. After the camera application exits foreground running, the display of the preview interface can be restored when the camera application runs in the foreground again. This avoids data loss and improves human-computer interaction efficiency during video editing.
In a first aspect, an embodiment of this application provides a video capture method, applied to an electronic device that supports video shooting, such as a mobile phone or a tablet. The electronic device displays a first interface, where the first interface is a viewfinder interface shown before the electronic device records a video after determining to use a first template, and the first template is used by the electronic device to add a first video effect to a captured video. The electronic device displays a second interface in response to a first operation performed by a user on the first interface, where the second interface is a viewfinder interface in which the electronic device is recording the video using the first template. The electronic device displays a third interface in response to the end of video recording, where the third interface is a preview interface of a first clip, and the first clip is a video obtained after the electronic device adds the first video effect to the recorded video. The electronic device displays a main interface of the electronic device in response to a second operation performed by the user on the third interface, where the main interface includes a first icon of the camera application. In response to a click operation performed by the user on the first icon, if the camera application still exists in the background of the electronic device, the electronic device displays the third interface. The electronic device saves the final clip in response to a third operation performed by the user on the third interface.
In summary, in a scenario of shooting a video using a template and previewing the result, with the video processing method provided in the embodiments of this application, if the camera application exits the foreground and the electronic device returns to its main interface while the preview interface is displayed, the preview interface is exited and video editing is interrupted. Subsequently, the electronic device may receive a click operation performed by the user on the camera icon in the main interface, where the click operation is used to trigger the electronic device to run the camera application in the foreground. In response to that click operation, if the camera application still exists in the background, that is, the camera application has not been cleaned up, the electronic device can resume displaying the preview interface, so that editing of the video can continue there. In this way, loss of editing data can be avoided, and human-computer interaction efficiency during editing can be improved.
In a possible design of the first aspect, in response to the second operation performed by the user on the third interface, the electronic device sets a state of the third interface to a first state and saves interface data of the third interface, where the first state is used by the electronic device to keep the third interface from being destroyed. In this way, even if the preview interface is subsequently exited, the electronic device retains the data of the third interface, which facilitates quick restoration. That the electronic device displays the third interface includes: the electronic device updates the state of the third interface to a second state and displays the third interface according to the interface data, where the second state is used by the electronic device to display the third interface in the foreground of the electronic device.
In this embodiment, the electronic device can save and restore the interface data by adjusting the state of the interface, which improves human-computer interaction efficiency.
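On a typical smartphone platform, the save-and-restore behavior described above maps onto the interface (activity) lifecycle. The following Python sketch is an illustrative model of that state handling, not the actual implementation; the state names, data fields, and method names are hypothetical.

```python
# Illustrative model of the interface-state handling described above.
# All names here are hypothetical; a real implementation would use the
# platform's activity lifecycle (e.g. onStop/onResume on Android).

class PreviewInterface:
    STATE_RETAINED = "retained"      # "first state": kept alive, not destroyed
    STATE_FOREGROUND = "foreground"  # "second state": shown in the foreground

    def __init__(self):
        self.state = self.STATE_FOREGROUND
        self.saved_data = None

    def on_user_leaves(self, interface_data):
        """Second operation: the user returns to the main interface."""
        self.saved_data = dict(interface_data)  # persist the edit data
        self.state = self.STATE_RETAINED        # keep the interface from being destroyed

    def on_camera_icon_clicked(self, camera_app_in_background):
        """Restore the preview if the camera app survived in the background."""
        if camera_app_in_background and self.saved_data is not None:
            self.state = self.STATE_FOREGROUND
            return self.saved_data  # redisplay the third interface from saved data
        return None  # camera app was cleaned up: a fresh launch is needed

ui = PreviewInterface()
ui.on_user_leaves({"clip": "first_clip.mp4", "filter": "light_grey"})
restored = ui.on_camera_icon_clicked(camera_app_in_background=True)
```

If the camera application has been cleaned up from the background, `on_camera_icon_clicked` returns `None` and the model falls back to a fresh launch, matching the prior-art behavior described in the background section.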
In a possible design of the first aspect, after the electronic device displays the third interface in response to the end of video recording, the method further includes: the electronic device plays the first clip in the third interface. In response to the end of playback of the first clip, if the electronic device has not detected a preset operation performed by the user on the third interface, the electronic device displays first prompt information in the third interface, where the preset operation is used to trigger the electronic device to adjust the video effect of the first clip, and the first prompt information prompts the user to adjust the video effect of the first clip.
In this embodiment, the electronic device can provide an editing prompt in the preview interface, explicitly guiding the user to continue editing the clip.
In a possible design of the first aspect, after the third interface is displayed, the method further includes: the electronic device adjusts the video effect of the first clip in response to the preset operation performed by the user on the third interface. That the electronic device saves the final clip in response to the third operation performed by the user on the third interface includes: the electronic device saves a second clip in response to the third operation, where the second clip is the video obtained after the electronic device adjusts the video effect of the first clip.
In this embodiment, the video effect can be adjusted in the preview interface. This breaks the fixed combination of video effects specified by the template and allows more diverse final videos to be formed.
In a possible design of the first aspect, before the electronic device displays the first interface, the method further includes: the electronic device displays a fourth interface, where the fourth interface includes a plurality of template options corresponding to a plurality of templates in the electronic device, and the plurality of template options includes a first option corresponding to the first template. The electronic device selects the first template in response to a selection operation performed by the user on the first option. That the electronic device displays the first interface includes: after the first template is selected, the electronic device displays the first interface in response to a fourth operation performed by the user on the fourth interface.
In this embodiment, based on the user's selection, the electronic device can select a template for shooting a video whose effect is consistent with that of the template, which improves the intelligence of video shooting.
In a possible design of the first aspect, after the electronic device displays the second interface, the method further includes: the electronic device ends recording the video in response to the recording duration being equal to a preset duration, where the preset duration is the recording duration corresponding to the first template.
In this embodiment, the electronic device can flexibly control the duration of the video according to the selected template, so that the final clip best matches the template effect.
In a possible design of the first aspect, the sample video of the first template has a first duration, the first video effect includes a first trailer, and the first trailer has a second duration; the preset duration is the difference between the first duration and the second duration.
In a second aspect, embodiments of the present application also provide an electronic device that may support a video capture function, the electronic device including a display screen, a memory, and one or more processors. The display screen, the memory, and the processor are coupled. The memory is for storing computer program code comprising computer instructions which, when executed by the processor, cause the electronic device to perform the method of the first aspect and any one of its possible designs.
In a third aspect, an embodiment of this application provides a chip system, applied to an electronic device including a display screen and a memory. The chip system includes one or more interface circuits and one or more processors, interconnected through lines. The interface circuit is configured to receive a signal from the memory of the electronic device and send the signal to the processor, where the signal includes computer instructions stored in the memory. When the processor executes the computer instructions, the electronic device performs the method according to the first aspect and any one of its possible designs.
In a fourth aspect, the present application provides a computer storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform a method as described in the first aspect and any one of its possible designs.
In a fifth aspect, the present application provides a computer program product which, when run on a computer, causes the computer to perform the method according to the first aspect and any one of its possible designs.
It can be appreciated that, for the advantages achieved by the electronic device according to the second aspect, the chip system according to the third aspect, the computer storage medium according to the fourth aspect, and the computer program product according to the fifth aspect, reference may be made to the advantages of the first aspect and any one of its possible designs; details are not repeated here.
Drawings
Fig. 1A is a first schematic diagram of a mobile phone interface according to an embodiment of this application;
Fig. 1B is a second schematic diagram of a mobile phone interface according to an embodiment of this application;
Fig. 1C is a third schematic diagram of a mobile phone interface according to an embodiment of this application;
Fig. 2 is a schematic diagram of a hardware structure of a mobile phone according to an embodiment of this application;
Fig. 3 is a schematic diagram of a software structure of a mobile phone according to an embodiment of this application;
Fig. 4 is a schematic diagram of software interaction according to an embodiment of this application;
Fig. 5 is a fourth schematic diagram of a mobile phone interface according to an embodiment of this application;
Fig. 6A is a fifth schematic diagram of a mobile phone interface according to an embodiment of this application;
Fig. 6B is a sixth schematic diagram of a mobile phone interface according to an embodiment of this application;
Fig. 7 is a seventh schematic diagram of a mobile phone interface according to an embodiment of this application;
Fig. 8 is an eighth schematic diagram of a mobile phone interface according to an embodiment of this application;
Fig. 9 is a ninth schematic diagram of a mobile phone interface according to an embodiment of this application;
Fig. 10 is a flowchart of a video processing method according to an embodiment of this application;
Fig. 11 is a schematic diagram of the structure of a chip system according to an embodiment of this application.
Detailed Description
In the descriptions of the embodiments of this application, unless otherwise specified, "at least one" means one or more, and "a plurality of" means two or more. In addition, to clearly describe the technical solutions of the embodiments of this application, words such as "first" and "second" are used to distinguish between items that are the same or similar and have substantially the same functions and effects. A person skilled in the art will understand that words such as "first" and "second" do not limit a quantity or an execution order, and do not necessarily indicate a difference. Moreover, unless otherwise stated, the positions and forms of the interface elements in the interface schematic diagrams are illustrative and, in actual implementation, can be flexibly adjusted according to actual requirements.
An embodiment of this application provides a video processing method that can be applied to a scenario in which a captured video is previewed on an electronic device. For example, the electronic device in the embodiments of this application may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), or an augmented reality (AR)/virtual reality (VR) device; the specific form of the electronic device is not limited in the embodiments of this application. In the following embodiments, the solution of this application is described mainly by using a mobile phone as an example of the electronic device.
In some embodiments, the mobile phone may capture the video to be previewed in a single-lens recording mode. The single-lens recording mode is the conventional recording mode, in which the phone uses only one camera to shoot the video. For example, the interface 101 shown in (a) of Fig. 1A is a viewfinder interface shown before recording in the single-lens recording mode; it includes only the image 102 captured by a single camera (for example, the front camera).
In other embodiments, the mobile phone may capture the video to be previewed in a multi-lens recording mode, in which the phone uses two or more cameras to shoot the video simultaneously. For example, the interface 103 shown in (b) of Fig. 1A is a viewfinder interface shown before recording in the multi-lens recording mode; it includes an image 104 and an image 105 captured by two cameras (for example, the rear main camera and the front camera). The following embodiments describe this application mainly by using the multi-lens recording mode as an example.
It should be noted that, in the multi-lens recording mode, the phone can also be triggered by a switching operation, for example, a click on the switch button 106 in the interface 103, to shoot the video with a single camera. To distinguish this from the single-lens recording mode, a scenario in which a video is shot by a single camera in the multi-lens recording mode may also be referred to as the multi-lens mode.
In one scenario, the phone may shoot the video to be previewed using a selected effect template (which may be denoted as template 1). Each effect template corresponds to a group of video effects, and each group of video effects is a combination of multiple effects such as background music, a filter, special effects (for example, twinkling stars), transitions, photo frames, stickers, and a trailer. Template 1 is selected before the video is shot, and the phone then shoots the video using template 1. For ease of description, template 1 may be referred to as the first template, and the group of video effects corresponding to template 1 may be referred to as the first video effect.
Illustratively, the viewfinder interface shown before recording in the multi-lens recording mode includes a control 1; for example, control 1 is the "micro movie" button 107 in the interface 103 shown in (b) of Fig. 1A. Control 1 is used to trigger the phone to select an effect template. The phone may receive a click operation or a long-press operation performed by the user on control 1 and, in response, display a plurality of template options. For example, in response to a click on the "micro movie" button 107 (that is, control 1), the phone may display the interface 111 shown in (a) of Fig. 1B, which includes a plurality of template options (shown in the dashed box), such as "hello summer", "good time", "good mood", and "light weekend". The template options correspond one-to-one to a plurality of effect templates and are used to trigger the phone to select an effect template. The phone may receive a selection operation performed by the user on any template option (which may be referred to as option 1, or the first option). In response to the selection of option 1, the phone selects the effect template corresponding to option 1 (which may be referred to as template 1). It should be understood that the phone may select the effect template corresponding to the first template option by default, for example, the template corresponding to the "hello summer" option shown in (a) of Fig. 1B. In addition, the interface that displays the template options (which may be referred to as interface 1, or the fourth interface) includes a control 2, for example, the button 112 in the interface 111 shown in (a) of Fig. 1B. Control 2 is used to trigger the phone to shoot a video.
In some embodiments, after template 1 is selected, the phone may receive a click operation, a long-press operation, or a slide operation (which may also be referred to as a fourth operation) performed by the user on control 2; the description below mainly uses the click operation as an example. In response to the click on control 2, the phone may display a shooting-preparation interface (which may be denoted as interface 2, or the first interface). While interface 2 is displayed, the user can complete preparation before shooting, such as adjusting the shooting angle and turning the background music of template 1 on or off. For example, interface 2 is the interface 113 shown in (b) of Fig. 1B; the interface 113 does not include a shooting timer or controls for ending or pausing shooting, which indicates that shooting has not started. Interface 2 also includes a control 3, such as the button 114 in the interface 113 shown in (b) of Fig. 1B.
Further, after interface 2 is displayed, the phone may receive a click operation, a long-press operation, or a slide operation (which may also be referred to as a first operation) performed by the user on control 3; the description below mainly uses the click operation as an example. In response to the click on control 3, the phone may start shooting the video using template 1 and display a viewfinder interface during shooting (which may be referred to as interface 3, or the second interface). For example, interface 3 is the interface 115 shown in (c) of Fig. 1B. The interface 115 includes interface elements such as the shooting duration 00:00, an end-shooting button 116, and a pause-shooting button 117, which indicate that video shooting has started.
In other embodiments, after selecting template 1, the phone may receive a click operation performed by the user on control 2 and, in response, directly display interface 3. That is, interface 2 is not displayed, and shooting starts directly.
After video shooting starts, the phone may end the shooting in response to an event 1.
In some embodiments, interface 3 includes an end-recording control, such as the end-shooting button 116 shown in (c) of Fig. 1B, which is used to trigger the phone to end video shooting. Event 1 may be a click operation or a long-press operation performed by the user on the end-recording control.
In other embodiments, template 1 has a sample video that presents the final-clip effect obtainable after shooting with template 1, for example, its filters and background music. To ensure that the final clip formed after shooting with template 1 remains highly consistent with the sample video, that is, that the user obtains a final result matching what the sample video shows, the phone needs to control the shooting duration after template 1 is selected, so that the duration of the final clip formed with template 1 equals the duration of the sample video of template 1 (which may also be referred to as the first duration). In other words, template 1 may indicate a maximum shooting duration (which may be denoted as a preset duration).
In one specific implementation, the preset duration is the same as the sample-video duration. For example, if template 1 is "hello summer" and the duration of its sample video is 15 s, then after the "hello summer" template is selected and shooting starts, the phone controls shooting so that a video of at most 15 s is obtained.
In another specific implementation, the video effect corresponding to template 1 includes a trailer (which may also be referred to as a first trailer). The trailer is fixed content, such as an animation of the phone manufacturer's logo, and is unrelated to the content shot this time. Accordingly, when a video effect is added to the shot video using template 1, the trailer is appended at the end. In this case, the sum of the preset duration and the trailer duration (which may also be referred to as the second duration) equals the sample-video duration; that is, the preset duration equals the difference between the sample-video duration and the trailer duration. For example, if template 1 is "hello summer", the duration of its sample video is 15 s, and the corresponding video effect includes a 1 s trailer, then the preset duration is (15 - 1) s = 14 s; the 00:14 included in the interface 115 shown in (c) of Fig. 1B indicates that the preset duration is 14 s.
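The duration relationship described above fits in a few lines of code. The following helper is an illustrative sketch (the function name and validation are ours, not from the application):

```python
def preset_recording_duration(sample_duration_s, trailer_duration_s=0):
    """Maximum recording time so that the final clip matches the template's
    sample duration once the fixed trailer is appended at the end."""
    if trailer_duration_s >= sample_duration_s:
        raise ValueError("trailer cannot be as long as the whole sample video")
    return sample_duration_s - trailer_duration_s

# "hello summer" template: 15 s sample video with a 1 s trailer
# -> the phone records at most 14 s
print(preset_recording_duration(15, 1))  # -> 14
```

With no trailer (`trailer_duration_s=0`), the helper reduces to the first implementation above, where the preset duration simply equals the sample-video duration.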
It should be understood that indicating the preset duration by the sample video of template 1 is only one optional manner, and actual implementations are not limited to it. For example, the preset duration may also be indicated by the duration of the background music of template 1, so that the final clip cannot exceed the duration of the background music.
In this embodiment, event 1 may be the event that the shooting duration (which may also be referred to as the recording duration) equals the preset duration. The following embodiments describe this application mainly by using this form of event 1 as an example.
For example, take the case in which the current shooting duration is the 00:14 on the left of the "/" in the dashed box in the interface 121 shown in (a) of Fig. 1C, and the preset duration is the 00:14 on the right of that "/". The shooting duration then equals the preset duration, both being 14 s, so event 1 is satisfied and the phone may end shooting.
After shooting ends, the phone can form a final clip with the video effect indicated by template 1. That is, after template 1 is selected, the phone can shoot the video using template 1 and obtain a clip with the video effect indicated by template 1. Shooting the video using template 1 includes: during shooting, the phone adds effects, such as a filter, stickers, and background music, to the captured images in real time according to the video effect indicated by template 1 and displays them in the viewfinder interface; and/or, after shooting ends, the phone post-processes the shot video according to template 1 to add the video effect. In one specific implementation, effects such as background music, filters, stickers, and photo frames are added in real time during shooting, while effects such as transitions and the trailer are added after shooting ends.
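The split between real-time effects and post-processing effects in that last implementation can be sketched as follows. This is an illustrative model only; which effect belongs to which stage is an assumption taken from the example in the text, and real templates may divide the work differently.

```python
# Illustrative model of the two-stage effect pipeline described above.
# The stage assignment below mirrors the example in the text and is an
# assumption, not the actual implementation.
REALTIME_EFFECTS = {"background_music", "filter", "sticker", "photo_frame"}
POST_EFFECTS = {"transition", "trailer"}

def apply_template(frames, template_effects):
    """Apply a template's effects: real-time ones per frame during recording,
    the rest as a post-processing pass after recording ends."""
    live = [e for e in template_effects if e in REALTIME_EFFECTS]
    post = [e for e in template_effects if e in POST_EFFECTS]
    # During recording: each captured frame is shown with the live effects.
    recorded = [{"frame": f, "effects": live} for f in frames]
    # After recording: post effects (e.g. the trailer) are added to the clip.
    return {"frames": recorded, "post_processing": post}

clip = apply_template(["f0", "f1"], ["filter", "sticker", "trailer"])
```

The design choice modeled here is that effects visible in the viewfinder must be cheap enough to apply per frame, while heavier edits such as transitions and the fixed trailer are deferred until the clip is assembled.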
After the final clip (which may also be referred to as the first clip) is formed, the phone may display a preview interface for it (which may also be referred to as the third interface), in which the effect of the clip can be previewed. For example, the preview interface is the interface 122 shown in (b) of Fig. 1C, and the clip can be played in the interface 122. The clip in the interface 122 has effects such as a light-grey filter and a "summer feelings, most beautiful summer we together" sticker.
In addition, the phone can edit the clip in the preview interface; that is, the preview interface can also be understood as an editing interface for the clip. After the preview interface is displayed, the phone may receive a preset operation performed by the user, which triggers the phone to edit and adjust the effect of the clip, for example, the background music, filters, special effects (such as twinkling stars), transitions, photo frames, stickers, and trailer of the clip. The specific implementation of editing is described in the detailed embodiments below and is not elaborated here.
In this embodiment of this application, the phone may display the preview interface in response to an event (such as event 1) that ends video shooting. Specifically, after video shooting ends, the camera application may invoke the video editing application to display the preview interface. That is, while the preview interface is displayed, the camera application is still running in the foreground.
In the video shooting and previewing scenario, with the video processing method provided in this embodiment of this application, if the phone exits running the camera application in the foreground and returns to its main interface while the preview interface is displayed, the preview interface is exited and video editing is interrupted. Subsequently, the phone may receive a click operation performed by the user on the camera icon (which may also be referred to as the first icon) in the main interface, where the click operation is used to trigger the phone to run the camera application in the foreground. In response to that click operation, if the camera application still exists in the background of the phone, that is, the camera application has not been cleaned up, the phone can resume displaying the preview interface, so that editing of the video can continue there.
In summary, by adopting the method of the embodiment of the present application, after the preview interface is exited, when a click operation by the user on the application icon of the camera application is detected again, the mobile phone can resume displaying the preview interface, so that previewing and editing of the video can continue. Therefore, loss of editing data can be avoided, and the man-machine interaction efficiency in the editing process is improved.
The following will describe the scheme of the present application in detail with reference to the accompanying drawings.
Referring to fig. 2, a hardware configuration diagram of a mobile phone is provided in an embodiment of the present application. As shown in fig. 2, taking the example that the electronic device is a mobile phone, the mobile phone may include a processor 210, an external memory interface 220, an internal memory 221, a universal serial bus (universal serial bus, USB) interface 230, a charge management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, an earphone interface 270D, a sensor module 280, keys 290, a motor 291, an indicator 292, a camera 293, a display 294, a subscriber identity module (subscriber identification module, SIM) card interface 295, and the like.
It is to be understood that the configuration illustrated in this embodiment does not constitute a specific limitation on the electronic apparatus. In other embodiments, the electronic device may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 210 may include one or more processing units such as, for example: the processor 210 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
It should be understood that the connection relationship between the modules illustrated in this embodiment is only illustrative, and does not limit the structure of the electronic device. In other embodiments, the electronic device may also use different interfacing manners in the foregoing embodiments, or a combination of multiple interfacing manners.
The charge management module 240 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 240 may receive a charging input of a wired charger through the USB interface 230. In some wireless charging embodiments, the charge management module 240 may receive wireless charging input through a wireless charging coil of the electronic device 200. The charging management module 240 may also provide power to the electronic device through the power management module 241 while charging the battery 242.
The power management module 241 is used for connecting the battery 242, and the charge management module 240 and the processor 210. The power management module 241 receives input from the battery 242 and/or the charge management module 240 and provides power to the processor 210, the internal memory 221, the external memory, the display 294, the camera 293, the wireless communication module 260, and the like. The power management module 241 may also be configured to monitor battery capacity, battery cycle times, battery health (leakage, impedance), and other parameters. In other embodiments, the power management module 241 may also be disposed in the processor 210. In other embodiments, the power management module 241 and the charge management module 240 may be disposed in the same device.
The wireless communication function of the electronic device may be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, a modem processor, a baseband processor, and the like.
The wireless communication module 260 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied on the electronic device 200. The wireless communication module 260 may be one or more devices that integrate at least one communication processing module. The wireless communication module 260 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 210. The wireless communication module 260 may also receive a signal to be transmitted from the processor 210, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
The electronic device implements display functions through the GPU, the display screen 294, and the application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display screen 294 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or change display information.
The electronic device may implement shooting functions through an ISP, a camera 293, a video codec, a GPU, a display 294, an application processor, and the like. The ISP is used to process the data fed back by the camera 293. The camera 293 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. In some embodiments, the electronic device may include 1 or N cameras 293, N being a positive integer greater than 1.
The external memory interface 220 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device. The external memory card communicates with the processor 210 through an external memory interface 220 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
Internal memory 221 may be used to store computer-executable program code, which includes instructions. The processor 210 performs various functional applications and data processing of the electronic device by executing instructions stored in the internal memory 221. For example, by executing instructions stored in the internal memory 221, the processor 210 may display different content on the display 294 in response to a user operation of expanding the display 294. The internal memory 221 may include a storage program area and a storage data area. The storage program area may store an operating system, and an application program required for at least one function (such as a sound playing function, an image playing function), etc. The storage data area may store data created during use of the electronic device (e.g., audio data, a phonebook, etc.), and so on. In addition, the internal memory 221 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
The electronic device may implement audio functions through an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, an ear-headphone interface 270D, an application processor, and the like. Such as music playing, recording, etc.
Keys 290 include a power on key, a volume key, etc. The keys 290 may be mechanical keys. Or may be a touch key. The electronic device may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device. The motor 291 may generate a vibration alert. The motor 291 may be used for incoming call vibration alerting or for touch vibration feedback. The indicator 292 may be an indicator light, which may be used to indicate a state of charge, a change in power, a message indicating a missed call, a notification, etc. The SIM card interface 295 is for interfacing with a SIM card. The SIM card may be inserted into the SIM card interface 295 or removed from the SIM card interface 295 to enable contact and separation from the electronic device. The electronic device may support 1 or N SIM card interfaces, N being a positive integer greater than 1.
Referring to fig. 3, a partial software structure block diagram of a mobile phone is provided in an embodiment of the present application. The software system of the mobile phone can adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiment of the application, a layered Android (Android) system is taken as an example to illustrate the software structure of the mobile phone. The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, from top to bottom: an application layer (abbreviated as application layer), an application framework layer (abbreviated as framework layer), a hardware abstraction layer (HAL), and a kernel layer (Kernel, also referred to as driver layer).
In the embodiments of the present application, an application layer and an application framework layer closely related to a method for implementing the embodiments of the present application are mainly described in more detail. As shown in fig. 3, an application layer 310 and an application framework layer 320 are included in the layered architecture.
Among other things, the application layer 310 includes a desktop application 311, a camera application 312, and a video editing application 313.
The desktop application 311 may be used to display icons of a plurality of applications in a mobile phone, such as application icons of applications in a camera application, gallery application, calendar application, conversation application, map application, music application, video application, short message application, and the like. Desktop application 311 may also be used to receive various user operations on the main interface. For example, the desktop application 311 may receive a user click, long press, etc. of an application icon. It should be appreciated that the main interface of the handset belongs to the desktop application 311.
The camera application 312 includes a plurality of shooting modules, such as a general photographing module, a portrait module, a general video module, a multi-mirror video module, and the like. The multi-mirror video module further comprises a micro-movie module. Illustratively, the "micro-movie" button 107 in the interface 103 shown in (b) of FIG. 1A may trigger invocation of the micro-movie module. After the micro-movie module is invoked, an effect template may be selected before shooting, and the video may then be shot using the selected effect template. Still further, the micro-movie module may include a one-click shooting module and a segmented shooting module (not shown). The one-click shooting module provides a one-click shooting function, with which the mobile phone, after an effect template is selected, can obtain a film with the corresponding video effect by shooting once. For example, the foregoing examples of fig. 1B and 1C use the one-click function. The segmented shooting module provides a segmented shooting function, with which the mobile phone, after an effect template is selected, can form a film with the corresponding video effect by shooting multiple segments of video. In the following, the embodiments of the present application will be described mainly by taking the one-click function as an example.
Video editing application 313 is a built-in application of the mobile phone that can be used for video preview and editing. Unlike the camera application, gallery application, etc., the video editing application has no physical entry in the mobile phone. For example, there is no application icon for the video editing application on the main interface. The video editing application can typically only be invoked by the camera application or the gallery application for video preview and editing. That is, the entries to the video editing application are the camera application and the gallery application.
Video editing application 313 may be understood as the editing application described previously. A film preview module is included in the video editing application 313. In some embodiments, after capturing the video using the one-click function is completed, the mobile phone may invoke the film preview module in the video editing application 313 to display the film. Illustratively, the preview interface in the foregoing, such as the interface 122 shown in (b) in fig. 1C, is displayed by the camera application 312 after invoking the film preview module of the video editing application 313.
The video editing application 313 further includes a protocol confirmation module, where the protocol confirmation module is configured to confirm whether the user agrees to the relevant user agreement of the video editing application 313 when the mobile phone invokes the video editing application 313 for the first time. That is, the protocol confirmation module is typically only used when the video editing application 313 is first invoked.
Also included in the video editing application 313 in this embodiment are a page pause (onPause) interface and a page resume (onResume) interface. The page onPause interface may be invoked so that an interface is not destroyed after it exits foreground display, thereby facilitating restoration of the interface when the corresponding application runs in the foreground again. In some embodiments, as shown in FIG. 3, when returning from the preview interface to the main interface, the page onPause interface may be invoked so that the preview interface is not destroyed after it exits foreground display, thereby facilitating quick restoration of the preview interface upon re-entering the camera application.
The page onResume interface may be invoked so that an interface that has not been destroyed in the background can be displayed in the foreground again for the user to interact with. In some embodiments, as shown in fig. 3, after the preview interface exits foreground display, when a click operation by the user on the camera icon is detected again, the page onResume interface may be invoked, so that the preview interface can be quickly restored after the camera application is re-entered.
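As a rough illustration, the pause-then-resume behavior of the preview interface described above can be sketched as a small state machine. The class, method, and state names below are illustrative assumptions modeled on the page onPause/onResume interfaces, not the actual implementation of the video editing application 313:

```java
// Hypothetical sketch: the preview page survives a return to the main
// interface by being paused (not destroyed), and is later resumed when the
// camera application re-enters the foreground.
public class PreviewPageLifecycle {
    public enum State { FOREGROUND, PAUSED, DESTROYED }

    private State state = State.FOREGROUND;

    // Called when the user returns to the main interface: the page leaves
    // the foreground, but its editing state is kept in the background.
    public void onPause() {
        if (state == State.FOREGROUND) {
            state = State.PAUSED; // not destroyed, so edits are preserved
        }
    }

    // Called when the camera icon is tapped again while the camera
    // application still exists (un-cleaned) in the background.
    public void onResume() {
        if (state == State.PAUSED) {
            state = State.FOREGROUND; // preview interface is restored
        }
    }

    // Called only when the background task is actually cleaned up.
    public void onDestroy() {
        state = State.DESTROYED;
    }

    public State state() {
        return state;
    }
}
```

If the camera application were cleaned from the background instead, `onDestroy` would run and a later tap on the camera icon would start a fresh session rather than resume the preview.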
The application framework layer 320 includes an activity manager 321 and a view system 322.
The activity manager 321 may be used to manage the lifecycle of pages in the handset. In some embodiments, as shown in FIG. 3, the page onPause interface, the page onResume interface, and the tile preview module may all interact with the activity manager 321 to enable management of the lifecycle of the preview interface.
View system 322 may be used to build a display interface for an application. Each display interface may be composed of one or more controls. In general, controls may include interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, widgets (widgets), and the like. In some embodiments, as shown in FIG. 3, the view system 322 may perform interface rendering, display, such as rendering and displaying a preview interface, according to the lifecycle of the page.
The video processing method of the embodiment of the application can be implemented in the mobile phone with the hardware structure and the software structure. Specifically, as shown in fig. 4, the method in the embodiment of the present application includes:
s401, the camera application 312 receives the start request.
Illustratively, the camera application 312 receives a start request when it receives a click operation by the user on the camera icon in the main interface. Alternatively, the camera application 312 may receive a start request when it receives a click operation by the user on a camera task in the multi-task interface.
S402, the camera application 312 runs in the foreground, and then starts the one-click function based on a user operation.
The camera application 312 may be launched and run in the foreground in response to the start request. Then, after sequentially passing through (b) in fig. 1A, and (a), (b), and (c) in fig. 1B, the one-click function may be started.
S403, the camera application 312 ends the video capturing in response to event 1.
For event 1, reference may be made to the foregoing description, and the description is omitted here.
For example, taking the recording duration corresponding to the template 1, that is, the preset duration, being 14s as an example, if the current shooting duration reaches 14s, event 1 (the current shooting duration being equal to the preset duration) is satisfied, and video shooting can be ended at this time.
S404, the camera application 312 sends a call request to the video editing application 313.
In this embodiment, the camera application 312 invokes the video editing application 313 to display the preview interface of the film. For the preview interface, see also the foregoing description, which is not repeated here.
It should be noted that, during use of the mobile phone, the user is required to agree to the relevant user agreement of the video editing application 313 when the video editing application 313 is invoked for the first time. Based on this, in some embodiments, the video editing application 313 responds to the invocation request, and if it determines that this is the first invocation, the video editing application 313 (e.g., the protocol confirmation module) displays the user agreement and provides options for the user to choose whether to agree. The video editing application 313 may continue to execute S405 in response to the user agreeing to the user agreement. Otherwise, in response to the user disagreeing with the user agreement, the video editing application 313 returns to displaying interface 1, i.e., the interface comprising a plurality of template options.
S405, the video editing application 313 displays a preview interface.
That is, in the embodiment of the present application, the video editing application 313 is invoked in the camera application 312 to display the preview interface. For example, the preview interface is interface 122 shown in (b) of fig. 1C.
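As a rough sketch, steps S401–S405 together with the first-invocation check described above can be condensed into one decision function. The class name, method names, and returned interface labels are illustrative assumptions, not part of the actual implementation:

```java
// Minimal sketch of the S401-S405 flow: record until event 1 (the preset
// template duration is reached), then the camera application invokes the
// video editing application to display the preview interface.
public class OneClickFlow {
    static final int PRESET_SECONDS = 14; // e.g., template 1's recording duration

    // Event 1: the current shooting duration reaches the preset duration.
    static boolean event1(int shotSeconds) {
        return shotSeconds >= PRESET_SECONDS;
    }

    // Returns which interface would be shown for a given shooting progress.
    static String interfaceFor(int shotSeconds, boolean firstInvocation,
                               boolean userAgreed) {
        if (!event1(shotSeconds)) {
            return "recording"; // still capturing video (S403 not yet reached)
        }
        // S404: camera app calls the video editing app. On the first
        // invocation, the user must agree to the user agreement first;
        // otherwise the flow returns to interface 1 (template options).
        if (firstInvocation && !userAgreed) {
            return "templateOptions";
        }
        return "preview"; // S405: preview interface is displayed
    }
}
```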
In some embodiments, after displaying the preview interface, if a preset condition is met, the video editing application 313 may display a prompt 1 (which may also be referred to as a first prompt) in the preview interface, such as the prompt 505 in the interface 501 shown in fig. 5, where the specific content of the prompt 505 is "tap to try different film styles". The prompt 1 is used to prompt the user that the effect of the film can be switched. Meeting the preset condition includes: the display duration of the preview interface reaches a first duration, e.g., 10s, and the video editing application 313 has not detected a preset operation by the user. Alternatively, the film may be played in the preview interface, and meeting the preset condition includes: the film playback ends, and the video editing application 313 has not detected a preset operation by the user. The preset operation is used to trigger the mobile phone to switch the effect of the film.
The preview interface includes an effect switching control, which is used to trigger the mobile phone to switch the effect of the film. For example, the effect switching control may include a style switching control, such as the "style" button 502 in the interface 501 shown in fig. 5, for triggering the mobile phone to switch the style of the film, where a style refers to the overall video effect of the template 1 except for the background music, such as the combined effect of the filter, sticker, photo frame, and the like. For another example, the effect switching control may include a music switching control, such as the music button 503 in the interface 501 shown in fig. 5, for triggering the mobile phone to switch the background music of the film. For another example, the effect switching control may include an edit control, such as the edit button 504 in the interface 501 shown in fig. 5, for triggering the mobile phone to switch various effects such as background music, filters, stickers, photo frames, volume, etc. The preset operation may be a click operation or a long-press operation on the effect switching control by the user.
Further, the preset operation not having been detected means that, since the mobile phone was first used, in scenes of shooting a video with a selected effect template, a preset operation input by the user has never been detected after shooting is completed and the preview interface is entered. For example, if, during historical use of the mobile phone, the user has clicked the style switching control in the preview interface after shooting a video with a selected effect template, then when the preview interface is displayed this time, prompt 1 is not displayed after the film finishes playing. In this way, repeated prompts to a user who is already familiar with the effect switching function can be avoided.
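The preset condition for showing prompt 1 can be summarized as a simple predicate. This is a sketch under the assumptions above (first duration of 10s, prompt suppressed once any preset operation has ever been detected); the class and method names are illustrative:

```java
// Hypothetical sketch of the prompt-display policy for the preview interface.
public class PromptPolicy {
    // Returns true if prompt 1 ("tap to try different film styles") should
    // be shown in the preview interface.
    static boolean shouldShowPrompt(long displaySeconds, long firstDurationSeconds,
                                    boolean filmPlaybackEnded,
                                    boolean presetOperationEverDetected) {
        // Never prompt a user who has already used effect switching.
        if (presetOperationEverDetected) {
            return false;
        }
        // Prompt when the preview has been shown long enough, or when the
        // film has finished playing.
        return displaySeconds >= firstDurationSeconds || filmPlaybackEnded;
    }
}
```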
In some embodiments, after displaying the preview interface, the video editing application 313 may receive a preset operation of the user for switching the effect of the film. For the preset operation, reference may be made to the descriptions in the foregoing embodiments, which are not repeated here. For example, the preset operation may be a click operation by the user on the music button 602 in the interface 601 shown in (a) of fig. 6A. In response to the preset operation, the video editing application 313 may display a plurality of effect options in the preview interface, such as the 4 music options of "soothing", "summer", "dynamic", and "pleasant" in the pop-up window 603 shown in (b1) in fig. 6A, each corresponding to one piece of background music.
The specific implementation of switching the effect of the film will be described below, mainly taking switching the background music as an example.
In one specific implementation, a plurality of background music may be saved in the video editing application 313, and each background music has versions of two durations: the duration of version 1 is the template duration of the effect template that the background music matches, and the duration of version 2 is a fixed duration, such as 30s, where the fixed duration (e.g., 30s) is longer than or equal to the longest template duration (e.g., 20s). For example, the versions of the background music stored in the video editing application 313 and their matching effect templates are shown in table 1 below:
TABLE 1
In this implementation, in response to the preset operation, the video editing application 313 may display a plurality of music options, including a first music option and a second music option. The first music option corresponds to the background music matching template 1 (noted as music 1), and specifically to version 1 of music 1. There is typically only one first music option. The second music option corresponds to background music matching other templates (templates other than template 1), noted as music 2, and specifically to version 2 of music 2. There may typically be a plurality of second music options, such as at least 3. In this way, music whose duration is sufficient for the film formed with template 1 can be quickly provided for the user to select.
For example, taking table 1 above as an example, if the template 1 selected before shooting is "hello summer", the first music option may be the soothing music of the 15s version, and the second music options may be the summer music, dynamic music, pleasant music, carefree music, winter music, and the like, each of the 30s version. For example, the video editing application 313 may display the pop-up window 603 shown in (b1) in fig. 6A, where the pop-up window 603 includes the option of the 15s soothing music (i.e., the first music option) and the options of the 30s summer music, 30s dynamic music, and 30s pleasant music (i.e., the second music options). The template duration of "hello summer" equals the music duration of the version-1 soothing music, namely 15s, and correspondingly, the duration of the film formed with "hello summer" usually does not exceed 15s. Therefore, the durations of the music corresponding to the plurality of music options in the pop-up window 603 shown in (b1) of fig. 6A can cover the entire film.
In another specific implementation, a plurality of background music, each including version 1, is also stored in the video editing application 313; see table 1 for details. In this implementation, in response to the preset operation, the video editing application 313 may display a plurality of music options, including a first music option and a second music option. The first music option corresponds to the background music matching template 1 (noted as music 1), and specifically to version 1 of music 1. There is typically only one first music option. The second music option corresponds to the background music matching template 2 (noted as music 3), where the template duration of template 2 is longer than or equal to the template duration of template 1. In the following embodiments, the description mainly takes the case where the template duration of template 2 equals the template duration of template 1 as an example. Since the music duration of the version-1 background music is the same as the template duration of the matched effect template, the template duration of template 2 being equal to the template duration of template 1 can also be understood as: the music duration of version 1 of music 3 equals the music duration of version 1 of music 1. The second music option specifically corresponds to version 1 of music 3, and there may be one or more second music options. In this way, the video editing application 313 can provide the user with music that best matches the duration of the film, so that operations such as deleting part of the music when switching and fading out the end can be reduced.
For example, still taking table 1 above as an example, if the template 1 selected before shooting is "hello summer", the first music option corresponds to the soothing music of the 15s version (i.e., version 1), and the second music options may correspond to one or more of the summer music, dynamic music, and pleasant music of the 15s version. For example, the video editing application 313 may display the pop-up window 604 shown in (b2) of fig. 6A, where the pop-up window 604 includes the option of the 15s soothing music (i.e., the first music option) and the options of the 15s summer music, 15s dynamic music, and 15s pleasant music (i.e., the second music options).
It should be understood that the above two specific implementations are only typical examples of determining the music options, and the determination is not limited to these implementations. For example, the two implementations may be combined: the options corresponding to version 1 of music 3 are first determined as the second options; if no music 3 exists, or the number of music 3 is insufficient, the video editing application 313 may further supplement the second options with music 2 of version 2.
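The first implementation above (first option: the version-1 music of the selected template; second options: fixed-length version-2 music of the other templates) can be sketched as follows. Since table 1's actual contents are not reproduced here, the template-to-music mapping in `sample()` is hypothetical, and all names are illustrative:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class MusicOptionBuilder {
    static class Option {
        final String music;    // name of the background music
        final int durationSec; // duration of the offered version
        final boolean isFirst; // true for the first music option

        Option(String music, int durationSec, boolean isFirst) {
            this.music = music;
            this.durationSec = durationSec;
            this.isFirst = isFirst;
        }
    }

    // templateToMusic: effect template -> its matched background music.
    // templateDuration: effect template -> template duration in seconds
    // (also the duration of version 1 of the matched music).
    static List<Option> build(String selectedTemplate,
                              Map<String, String> templateToMusic,
                              Map<String, Integer> templateDuration,
                              int fixedVersionSec) {
        List<Option> options = new ArrayList<>();
        // First music option: version 1 of the music matching the selected template.
        options.add(new Option(templateToMusic.get(selectedTemplate),
                templateDuration.get(selectedTemplate), true));
        // Second music options: version 2 (fixed duration) of the music
        // matching the other templates.
        for (Map.Entry<String, String> e : templateToMusic.entrySet()) {
            if (!e.getKey().equals(selectedTemplate)) {
                options.add(new Option(e.getValue(), fixedVersionSec, false));
            }
        }
        return options;
    }

    // Hypothetical sample data for illustration only (not table 1's real contents).
    static List<Option> sample() {
        Map<String, String> t2m = new LinkedHashMap<>();
        t2m.put("hello summer", "soothing");
        t2m.put("other template", "dynamic");
        Map<String, Integer> t2d = new LinkedHashMap<>();
        t2d.put("hello summer", 15);
        t2d.put("other template", 20);
        return build("hello summer", t2m, t2d, 30);
    }
}
```

The combined variant mentioned above would simply prefer equal-duration version-1 music for the second options and fall back to this version-2 list when too few exist.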
In the foregoing description of the music options, the covers and names of the music options are not associated with the corresponding effect templates in order to distinguish between the music and effect templates. In other embodiments, to reflect the association of music and effect templates, when multiple music options are displayed, the cover of the music option may be set to the template cover of the corresponding music-matched effect template, e.g., the cover of the first music option is set to the cover of template 1. Alternatively, the name of the music may be set to the template name of the corresponding music-matched effect template, for example, template 1 is "hello summer", and the name of the first music option may be set to "hello summer".
In some embodiments, after the plurality of music options are displayed, the first music option may be selected by default. For example, if the option of the 15s soothing music shown in (b1) of fig. 6A is the first music option, that option is displayed in bold to indicate that the soothing music of the 15s version is currently selected.
In some embodiments, after displaying the plurality of music options, the video editing application 313 may receive a selection operation by the user on any music option (which may be noted as music option 1). In response to the selection operation on music option 1, the mobile phone can replace the background music of the film with the background music corresponding to music option 1 and play the film from the beginning. Illustratively, taking the case where music option 1 is the option 611 shown in (a) in fig. 6B, the video editing application 313, in response to the user's selection operation on the option 611, may replace the background music of the film with the music corresponding to the option 611, and then play the film from the beginning. In this way, the effect of applying the selected music to the film can be conveniently previewed. Further, if the music duration of the music corresponding to music option 1 is longer than the film duration, a music piece with the same duration as the film may be cut from the beginning of the music corresponding to music option 1, and the end of the cut music piece may be subjected to fade-out processing.
Subsequently, the mobile phone may receive operation 1 of the user, where operation 1 is used to trigger the mobile phone to confirm switching the background music. Operation 1 may be a top-to-bottom slide-down operation on the pop-up window displaying the plurality of music options (which may be referred to as the music pop-up window). Alternatively, the music pop-up window includes a confirmation control, and operation 1 may be a click operation or a long-press operation on the confirmation control by the user. For example, the music pop-up window is the pop-up window 612 shown in (b) in fig. 6B, a "v" button is included in the pop-up window 612, the "v" button is the confirmation control, and operation 1 may be a click operation on the "v" button by the user. The form of operation 1 is not particularly limited in the embodiments of the present application. In response to operation 1, the mobile phone may close the plurality of music options and resume displaying the preview interface that does not include the music options, for example, the interface 601 shown in (a) in fig. 6A.
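The trimming step mentioned above (cutting a film-length piece from the beginning of the music and fading out its end) could look roughly like this on raw audio samples. The linear fade curve and all names are illustrative assumptions, not the actual processing of the video editing application:

```java
// Hypothetical sketch of cutting background music to the film duration
// with a fade-out at the end.
public class MusicTrimmer {
    // Cuts a piece of length clipSamples from the start of the music and
    // applies a linear fade-out over the last fadeSamples samples.
    // Samples are amplitudes, e.g., in [-1.0, 1.0].
    static double[] trimWithFadeOut(double[] music, int clipSamples, int fadeSamples) {
        int n = Math.min(clipSamples, music.length);
        double[] out = new double[n];
        System.arraycopy(music, 0, out, 0, n);
        int fadeStart = Math.max(0, n - fadeSamples);
        for (int i = fadeStart; i < n; i++) {
            // Gain runs from 1.0 at fadeStart down toward 0 at the end.
            double gain = (double) (n - i) / (n - fadeStart);
            out[i] *= gain;
        }
        return out;
    }
}
```

In practice the fade would operate on decoded PCM frames and the fade length would be chosen in milliseconds, but the shape of the computation is the same.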
In the above-described embodiment, the video editing application 313 can switch an effect of the clip, such as the background music, in response to a preset operation, thereby breaking the fixed combination of effects such as background music, filter, photo frame, and sticker in the effect template (e.g., template 1) selected before shooting. For example, after the background music is switched, the filter, photo frame, sticker, etc. of template 1 may be combined with the switched background music. In this way, the diversity of clip effects can be improved.
During the display of the preview interface, the user may enter an operation to return to the main interface, which interrupts the preview. In this embodiment of the present application, in response to the operation of returning to the main interface, not only may the main interface be displayed (as described in detail in S406 below), but the preview interface may also be controlled to enter a pause state (as described in detail in S407-S408 below).
S406, the desktop application 311 responds to the operation of returning to the main interface and displays the main interface.
The operation of returning to the main interface may be an upward slide operation by the user from the bottom of the preview interface. Alternatively, the operation of returning to the main interface may be a click operation by the user on a preset control (e.g., a home key) floating on the screen. The following embodiments are mainly described by taking the user's upward slide operation from the bottom of the preview interface as an example. For convenience of explanation, the operation of returning to the main interface may be referred to as a second operation.
Illustratively, the preview interface is an interface 701 shown in (a) of fig. 7, and the desktop application 311 may display an interface 702 shown in (b) of fig. 7 in response to a user's upward slide operation from the bottom of the interface 701. Interface 702 is the main interface of the handset.
S407, the video editing application 313 transmits a call request to the activity manager 321 in response to the operation of returning to the main interface.
The video editing application 313 (e.g., its page onPause interface) sends a call request to the activity manager 321 to request that the preview interface not be destroyed after the preview interface is exited.
It should be noted here that the video editing application 313 (e.g., its page onPause interface) merely provides an entry point for controlling the preview interface to enter the pause state. The video editing application 313 still needs to call the activity manager 321 to implement state management for the preview interface after it enters the background.
It should be noted that the foregoing S406 and S407 do not have an absolute sequential order. For example, S406 and S407 may be performed simultaneously. For another example, S407 may be performed first and S406 later. This is not particularly limited in the embodiments of the present application.
S408, the activity manager 321 controls the preview interface to enter a pause state and stores data of the preview interface.
The pause state (which may also be referred to as a first state) is the onPause state. After the preview interface enters the pause state, the preview interface is not destroyed even if it is currently covered by the main interface, i.e., even if the preview interface is not displayed in the foreground.
The data of the preview interface includes the content displayed in the interface. In some embodiments, the data of the preview interface further includes editing data generated at the preview interface, such as the switched background music, style, etc., so that all of the user's editing operations on the clip can be recorded.
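The behaviour of S407-S408 can be sketched as a toy state manager: the page's onPause entry point asks the activity manager to mark the preview page paused (rather than destroying it) and to snapshot its interface data, including any edits made by the user. All class and method names here are illustrative assumptions; this mimics the described behaviour, not Android's real Activity/ActivityManager API.

```python
class PreviewActivity:
    """Stand-in for the preview interface page (illustrative only)."""
    def __init__(self):
        self.state = "resumed"
        self.data = {"content": "clip-frame-0", "edits": {}}
        self.destroyed = False

class ActivityManager:
    """Stand-in for activity manager 321: keeps paused pages and their data."""
    def __init__(self):
        self.saved = {}

    def pause(self, activity):
        activity.state = "paused"                        # first state: onPause
        self.saved[id(activity)] = dict(activity.data)   # keep interface data

# The user switches the background music, then returns to the main interface.
preview = PreviewActivity()
preview.data["edits"]["music"] = "option 611"
am = ActivityManager()
am.pause(preview)
```

The essential point the sketch captures is that pausing preserves both the page object and its data snapshot, so nothing is lost while the main interface covers the preview.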
S409, the desktop application 311 receives a click operation of the camera icon by the user.
After returning to the main interface of the mobile phone, the user may click on the camera icon to trigger the mobile phone to run the camera application in the foreground again. For example, the user clicking on the camera icon 802 in the interface 801 shown in fig. 8 (a) may trigger the handset to run the camera application 312 again in the foreground.
S410, the desktop application 311 sends a start request to the camera application 312.
S411, the camera application 312 queries whether its background process has been destroyed.
If the background process of the camera application 312 is not destroyed, it indicates that the camera application 312 is still in the background. Accordingly, the call relationship (also referred to as a call stack) by which the camera application 312 calls the video editing application 313 to display the preview interface is also retained. For this case, S412 is executed in the embodiment of the present application to restore the preview interface.
If the background process of the camera application 312 has been destroyed, it indicates that the camera application 312 has been cleaned up from the background. Accordingly, the call relationship by which the camera application 312 calls the video editing application 313 to display the preview interface no longer exists. In this case, the call relationship cannot be obtained; the camera application 312 is simply initialized on startup, and an interface initialized by the camera application 312, such as the viewfinder interface before photographing, is displayed. For example, in response to a click operation on the camera icon 802 in the interface 801 shown in fig. 8 (a), if it is found that the camera application 312 has been destroyed, the camera application 312 may display the interface 803 shown in fig. 8 (b1), where the interface 803 is the viewfinder interface before photographing, i.e., the interface initialized by the camera application 312. It should be noted that the words "camera application destroyed" in the interface 803 shown in (b1) in fig. 8 are merely for illustration and are not actually displayed on the interface.
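The query-and-branch of S411 can be sketched as follows: check whether the camera process still exists in the background, and either reuse the retained call relationship to restore the preview interface or fall back to the freshly initialized viewfinder. Process names, interface strings, and the call-stack representation are placeholders chosen for illustration, not real system identifiers.

```python
def interface_on_launch(background_processes, call_stack):
    """Decide which interface to show when the camera icon is tapped.

    background_processes: set of process names still alive in the background.
    call_stack: mapping of caller -> interface it was displaying (illustrative).
    """
    if "camera" in background_processes:
        # Call stack camera -> video editor is still held; restore the preview.
        return call_stack.get("camera", "viewfinder")
    # Process destroyed: the call relationship is gone, start from initialization.
    return "viewfinder"

stack = {"camera": "preview"}
shown_alive = interface_on_launch({"camera", "gallery"}, stack)
shown_dead = interface_on_launch({"gallery"}, stack)
```

The two return values correspond to the two figures: `shown_alive` to the restored preview of fig. 8 (b2), `shown_dead` to the viewfinder of fig. 8 (b1).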
S412, the camera application 312 sends a call request to the video editing application 313 in response to the background process not being destroyed.
In this embodiment, since the background process of the camera application 312 is not destroyed, the camera application 312 may query the call relationship by which it calls the video editing application 313, and then call the video editing application 313 (e.g., its page onResume interface) according to the call relationship to request display of the preview interface.
S413, the video editing application 313 transmits a call request to the activity manager 321.
The video editing application 313 (e.g., page onResume interface) further sends a call request to the activity manager 321 in response to the call request of the camera application 312 to request restoration of the preview interface.
S414, the activity manager 321 controls the preview interface to enter a recovery state from a pause state, and recovers the data of the preview interface.
The recovery state (which may also be referred to as a second state) is the onResume state. After the preview interface enters the recovery state, the preview interface is in a state in which it is displayed in the foreground.
It should be appreciated that the activity manager 321, when restoring the preview interface, needs to invoke the capabilities of the view system 322 to render and display.
S415, the activity manager 321 transmits the restored preview interface to the video editing application 313.
S416, the video editing application 313 resumes displaying the preview interface.
Illustratively, in response to a user's click operation on the camera icon 802 in the interface 801 shown in fig. 8 (a), if it is found that the camera application has not been destroyed, the video editing application 313 may display the interface 804 shown in fig. 8 (b2), where the interface 804 is the preview interface of the clip. It should be noted that the words "camera application is not destroyed" in the interface 804 shown in (b2) in fig. 8 are merely for illustration and are not actually displayed on the interface.
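The restore path of S412-S416 can be modelled as the reverse transition: the activity manager moves the page from the paused state back to the resumed (onResume) state and hands the saved interface data back for redisplay. This is again a toy Python model with illustrative names only; as noted above, the real restoration also invokes the view system for rendering, which is omitted here.

```python
class ActivityManager:
    """Stand-in for activity manager 321: pause keeps data, resume returns it."""
    def __init__(self):
        self.saved_data = None
        self.state = None

    def pause(self, data):
        self.state = "paused"                 # first state: onPause
        self.saved_data = dict(data)

    def resume(self):
        self.state = "resumed"                # second state: onResume
        return dict(self.saved_data)          # restored interface data

# Pause with an edit recorded, then resume and recover the same data.
am = ActivityManager()
am.pause({"content": "clip-frame-0", "edits": {"music": "option 611"}})
restored = am.resume()
```

Because the edits survive the pause/resume round trip, the user's switched background music is still applied when the preview reappears, matching the behaviour described for S414-S416.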
After the display of the preview interface is resumed, the video editing application 313 may continue to switch effects of the clip in response to the user's preset operations. For a specific implementation of switching an effect of the clip, reference may be made to the description of the foregoing S405, which will not be repeated here.
It can be seen from the above S403-S416 that, in the scenario where the camera application 312 calls the video editing application 313 to display the preview interface after video shooting ends, if the camera application 312 exits to the background of the mobile phone and is then triggered to run in the foreground again through the camera icon in the main interface, the video editing application 313 can resume displaying the preview interface.
In the conventional technology, in the process of using a system application such as the camera, gallery, settings, or contacts in the mobile phone, after returning to the main interface of the mobile phone, triggering the application icon on the main interface to run the application in the foreground again causes the application to be initialized and its initialized interface to be displayed. Compared with this scheme, with the solution of the present application, when the camera application is run again after exiting, the display of the preview interface can be restored rather than directly displaying the initialized interface. First, after the preview interface is restored, previewing and editing of the clip can continue. Second, the preview interface is displayed after video shooting ends, and at that time neither the shot video nor the generated clip has been saved; restoring the display of the preview interface can therefore avoid data loss.
In some embodiments, after the preview interface is restored, the method further includes: S417, the video editing application 313 saves the edited clip in response to the user's operation of saving the clip (which may also be referred to as a third operation). Illustratively, the preview interface includes a control 4, such as the button 902 in the interface 901 shown in fig. 9 (a). The control 4 is used to trigger the mobile phone to save the video. The operation of saving the clip may be a click operation or a long-press operation on the control 4 by the user.
It should be noted that if the video effects of the clip are not edited while the preview interface is displayed, the edited clip is the clip with the video effect corresponding to template 1; if the video effects of the clip are edited while the preview interface is displayed, the edited clip is a clip obtained after adjusting one or more video effects on the basis of the clip with the video effect corresponding to template 1 (which may also be referred to as a second clip).
Further, during saving of the clip, the video editing application 313 may display a save prompt in the preview interface. The save prompt is used to indicate that the clip is being saved. For example, the save prompt is the prompt 903 shown in (b) in fig. 9, and the prompt 903 indicates that the save progress is 45%. The video editing application 313 may send a save-completion notification to the camera application 312 in response to the end of saving, i.e., after the save progress reaches 100%. The camera application 312 may display interface 1, i.e., an interface including a plurality of template options, such as the interface 904 shown in fig. 9 (c), in response to the save-completion notification. This facilitates continued selection of an effect template to shoot video.
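The save flow of S417 can be sketched as a progress-reporting loop: the editor reports percentage progress while writing the clip out (matching the 45% prompt in fig. 9 (b)) and notifies the camera application on completion, which then shows the template options again. The chunk count and callback names are illustrative assumptions, not a real encoder API.

```python
def save_clip(total_chunks, on_progress, on_complete):
    """Write the clip in chunks, reporting progress, then signal completion."""
    for done in range(1, total_chunks + 1):
        on_progress(done * 100 // total_chunks)    # e.g. the "45%" save prompt
    on_complete()                                  # camera app shows template options

progress_log, shown = [], []
save_clip(
    total_chunks=20,
    on_progress=progress_log.append,
    on_complete=lambda: shown.append("template options interface"),
)
```

With 20 chunks the reported values step by 5%, so the 45% state shown in fig. 9 (b) appears in the log before the 100% completion notification fires.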
Since the mobile phone may include the software structures of the desktop application 311, the camera application 312, the video editing application 313, the activity manager 321, and the like, the video processing method in the embodiments of the present application may be performed with the mobile phone as the execution body. The video processing method provided in the embodiments of the present application will be described below with the mobile phone as the execution body. Specifically, as shown in fig. 10, the method includes:
S1001, the mobile phone responds to the end of video shooting, and a preview interface is displayed.
While the preview interface is displayed, the mobile phone may edit the clip in response to a preset operation of the user, for example, switching the background music, style, etc. of the clip.
S1002, in the process of displaying the preview interface, the mobile phone responds to the operation of returning to the main interface, displays the main interface of the mobile phone, controls the preview interface to enter a pause state, and stores data of the preview interface.
In the embodiment of the application, after the preview interface enters the pause state, the preview interface is not destroyed even if the preview interface is not displayed in the foreground.
S1003, the mobile phone queries whether the camera application has been destroyed in response to the user's click operation on the camera icon in the main interface.
S1004, if the camera application has been destroyed, the mobile phone displays the viewfinder interface before photographing of the camera application.
S1005, if the camera application has not been destroyed, the mobile phone controls the preview interface to enter the recovery state and resumes displaying the preview interface according to the saved data of the preview interface.
In this embodiment of the present application, after the preview interface enters the recovery state, the display of the preview interface in the foreground can be restored.
Similarly, while the preview interface is displayed, the mobile phone may edit the clip in response to a preset operation of the user, for example, switching the background music, style, etc. of the clip.
S1006, the mobile phone saves the clip in response to the user's save operation.
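The overall S1001-S1006 flow can be condensed into one toy simulation covering both branches of the destroyed/not-destroyed query. The state strings mirror the step descriptions above and are purely illustrative; real behaviour depends on the platform's process and activity management.

```python
def run_flow(camera_survives_background):
    """Simulate S1001-S1006 for one launch of the camera icon."""
    # S1001: the preview interface is shown after video shooting ends.
    saved_clip = None
    # S1002: the user returns to the main interface; the preview pauses
    # and its data (including edits) is kept.
    saved = {"edits": {"music": "option 611"}}
    screen = "main"
    # S1003: the camera icon is tapped -> query whether the camera app was destroyed.
    if camera_survives_background:
        screen = "preview"                      # S1005: restore from saved data
        saved_clip = ("clip", saved["edits"])   # S1006: the user saves the clip
    else:
        screen = "viewfinder"                   # S1004: initialized interface
    return screen, saved_clip
```

Running both branches shows the key difference the method provides: only when the camera application survives in the background can the edits be recovered and the clip saved.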
In summary, with the method of the embodiments of the present application, after the preview is exited, the mobile phone can resume displaying the preview interface once the camera application runs in the foreground again, so that previewing and editing can continue. In addition, while the preview interface is displayed, the video shot with the template has not yet been saved; by restoring the display of the preview interface, the video can still be saved as a clip, thereby preventing the shooting result from being lost.
Other embodiments of the present application provide an electronic device, which may include: the display screen (e.g., touch screen), memory, and one or more processors. The display, memory, and processor are coupled. The memory is for storing computer program code, the computer program code comprising computer instructions. When the processor executes the computer instructions, the electronic device may perform the functions or steps performed by the mobile phone in the above-described method embodiments. The structure of the electronic device may refer to the structure of the mobile phone shown in fig. 2.
Embodiments of the present application also provide a chip system, as shown in fig. 11, the chip system 1100 includes at least one processor 1101 and at least one interface circuit 1102. The processor 1101 and interface circuit 1102 may be interconnected by wires. For example, interface circuit 1102 may be used to receive signals from other devices (e.g., a memory of an electronic apparatus). For another example, the interface circuit 1102 may be used to send signals to other devices (e.g., the processor 1101). The interface circuit 1102 may, for example, read instructions stored in a memory and send the instructions to the processor 1101. The instructions, when executed by the processor 1101, may cause the electronic device to perform the various steps of the embodiments described above. Of course, the chip system may also include other discrete devices, which are not specifically limited in this embodiment of the present application.
The embodiment of the application also provides a computer storage medium, which comprises computer instructions, when the computer instructions run on the electronic device, the electronic device is caused to execute the functions or steps executed by the mobile phone in the embodiment of the method.
The present application also provides a computer program product, which when run on a computer, causes the computer to perform the functions or steps performed by the mobile phone in the above-mentioned method embodiments.
It will be apparent to those skilled in the art from this description that, for convenience and brevity of description, only the above-described division of the functional modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application may be essentially or a part contributing to the prior art or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a device (may be a single-chip microcomputer, a chip or the like) or a processor (processor) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read Only Memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely a specific embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered in the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (9)

1. A video processing method, applied to an electronic device, comprising:
the electronic equipment displays a first interface; the first interface is a view finding interface before the electronic equipment records the video after determining to adopt the first template; the first template is used for adding a first video effect to the shot video by the electronic equipment;
the electronic equipment responds to a first operation of a user on the first interface and displays a second interface; the second interface is a view finding interface in which the electronic equipment adopts the first template to record video;
the electronic equipment responds to the ending of video recording and displays a third interface; the third interface is a preview interface of a first film, and the first film is a video obtained by adding the first video effect to the recorded video by the electronic equipment;
The electronic equipment responds to the second operation of the user on the third interface, and a main interface of the electronic equipment is displayed; the main interface comprises a first icon of a camera application in the electronic equipment;
the electronic equipment responds to clicking operation of the user on the first icon, and if the camera application exists in the background of the electronic equipment, the electronic equipment displays the third interface;
the electronic equipment responds to a third operation of the user on the third interface, and the formed video is saved;
the first interface and the second interface are displayed through the camera application, the third interface is displayed through a video editing application in the electronic device, and the video editing application and the camera application are different.
2. The method according to claim 1, wherein the method further comprises:
the electronic equipment responds to a second operation of the user on the third interface, sets the state of the third interface as a first state, and stores interface data of the third interface; the first state is used for the electronic equipment to keep the third interface from being destroyed;
the electronic device displaying the third interface, including:
The electronic equipment updates the state of the third interface into a second state, and displays the third interface according to the interface data; the second state is used for displaying the third interface on the foreground of the electronic equipment by the electronic equipment.
3. The method of claim 1 or 2, wherein after the electronic device displays a third interface in response to ending recording video, the method further comprises:
the electronic equipment plays the first film in the third interface;
the electronic equipment responds to the end of playing the first film, and if the electronic equipment does not detect the preset operation of the user on the third interface, the electronic equipment displays first prompt information in the third interface; the preset operation is used for triggering the electronic equipment to adjust the video effect of the first film, and the first prompt information is used for prompting adjustment of the video effect of the first film.
4. A method according to any one of claims 1-3, wherein after displaying the third interface, the method further comprises:
the electronic equipment responds to the preset operation of the user on the third interface to adjust the video effect of the first film;
The electronic device responds to a third operation of the third interface by a user, stores the formed video, and comprises:
the electronic equipment responds to a third operation of the user on the third interface, and a second film is saved; the second film is a video obtained after the electronic equipment adjusts the video effect of the first film.
5. The method of any of claims 1-4, wherein prior to the electronic device displaying the first interface, the method further comprises:
the electronic equipment displays a fourth interface; the fourth interface comprises a plurality of template options, and the plurality of template options correspond to a plurality of templates in the electronic equipment; the plurality of template options comprise a first option, and the first option corresponds to the first template;
the electronic equipment responds to the selection operation of the user on the first option, and the first template is selected;
the electronic device displays a first interface, including:
after the first template is selected, the electronic equipment responds to fourth operation of a user on the fourth interface, and the first interface is displayed.
6. The method of any of claims 1-5, wherein after the electronic device displays a second interface, the method further comprises:
The electronic equipment responds to the fact that the recording duration is equal to the preset duration, and the electronic equipment finishes recording the video; the preset duration is the recording duration corresponding to the first template.
7. The method of claim 6, wherein the sample film of the first template has a first duration, the first video effect comprises a first trailer, and the first trailer has a second duration; the preset duration is the difference between the first duration and the second duration.
8. An electronic device, wherein a plurality of applications are installed in the electronic device, the electronic device comprising a display screen, a memory, and one or more processors; the display screen, the memory and the processor are coupled; the memory is for storing computer program code comprising computer instructions which, when executed by the processor, cause the electronic device to perform the method of any of claims 1-7.
9. A computer readable storage medium having instructions stored therein, which when run on an electronic device, cause the electronic device to perform the method of any of claims 1-7.
CN202210038857.7A 2021-06-16 2022-01-13 Video processing method and electronic equipment Active CN115484396B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN2021106767093 2021-06-16
CN202110676709 2021-06-16
CN2021114392925 2021-11-29
CN202111439292 2021-11-29

Publications (2)

Publication Number Publication Date
CN115484396A CN115484396A (en) 2022-12-16
CN115484396B true CN115484396B (en) 2023-12-22

Family

ID=84420803

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210038857.7A Active CN115484396B (en) 2021-06-16 2022-01-13 Video processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN115484396B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016124095A1 (en) * 2015-02-04 2016-08-11 腾讯科技(深圳)有限公司 Video generation method, apparatus and terminal
CN108363499A (en) * 2018-01-31 2018-08-03 维沃移动通信有限公司 A kind of text restoration methods and mobile terminal
CN109523609A (en) * 2018-10-16 2019-03-26 华为技术有限公司 A kind of method and terminal of Edition Contains
CN110515521A (en) * 2019-08-14 2019-11-29 维沃移动通信有限公司 A kind of screenshot method and mobile terminal
CN111385508A (en) * 2018-12-28 2020-07-07 广州市百果园信息技术有限公司 Video processing method, device, equipment and storage medium
CN111770354A (en) * 2020-07-02 2020-10-13 广州酷狗计算机科技有限公司 Information interaction method, device, terminal and storage medium
WO2020216096A1 (en) * 2019-04-25 2020-10-29 华为技术有限公司 Video editing method and electronic device
CN112947923A (en) * 2021-02-25 2021-06-11 维沃移动通信有限公司 Object editing method and device and electronic equipment
CN113115099A (en) * 2021-05-14 2021-07-13 北京市商汤科技开发有限公司 Video recording method and device, electronic equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8910046B2 (en) * 2010-07-15 2014-12-09 Apple Inc. Media-editing application with anchored timeline


Also Published As

Publication number Publication date
CN115484396A (en) 2022-12-16

Similar Documents

Publication Publication Date Title
CN114679537B (en) Shooting method and terminal
US9465468B2 (en) Mobile terminal and controlling method thereof
JP6072362B2 (en) Application program processing method, apparatus, program, and storage medium
JP6321301B2 (en) Video special effect processing method, apparatus, terminal device, program, and recording medium
CN112449099B (en) Image processing method, electronic equipment and cloud server
RU2608545C1 (en) Method and device for backup video
CN104052909A (en) Shooting method and device
CN114244953B (en) Interface display method, electronic equipment and storage medium
CN110995929A (en) Terminal control method, device, terminal and storage medium
RU2663821C2 (en) Shutdown prompt method and device
CN105843710A (en) Data backup and recovery device and method
CN103999446A (en) User interfaces for electronic devices
CN108965770B (en) Image processing template generation method and device, storage medium and mobile terminal
CN114095776A (en) Screen recording method and electronic equipment
CN105827834A (en) Mobile device application method and device
CN115484396B (en) Video processing method and electronic equipment
WO2022262536A1 (en) Video processing method and electronic device
US20230412535A1 (en) Message display method and electronic device
CN106528197B (en) Shooting method and device
CN115484398B (en) Video shooting method and electronic equipment
CN115480684A (en) Method for returning edited multimedia resource and electronic equipment
CN115484387A (en) Prompting method and electronic equipment
WO2023160208A1 (en) Image deletion operation notification method, device, and storage medium
CN116634058B (en) Editing method of media resources, electronic equipment and readable storage medium
WO2022262453A1 (en) Abnormality prompting method and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant