CN115484396A - Video processing method and electronic equipment - Google Patents

Video processing method and electronic equipment

Info

Publication number
CN115484396A
Authority
CN
China
Prior art keywords
interface
electronic equipment
video
template
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210038857.7A
Other languages
Chinese (zh)
Other versions
CN115484396B (en)
Inventor
韩笑
李启冒
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Publication of CN115484396A publication Critical patent/CN115484396A/en
Application granted granted Critical
Publication of CN115484396B publication Critical patent/CN115484396B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/91Television signal processing therefor

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Telephone Function (AREA)

Abstract

Embodiments of the application provide a video processing method and an electronic device, relate to the field of terminal technologies, and can avoid data loss and improve human-computer interaction efficiency during video editing. The method includes the following steps. The electronic device displays a first interface; the first interface is a viewfinder interface shown before the electronic device records a video, after the electronic device determines that a first template is adopted; the first template is used by the electronic device to add a first video effect to the shot video. The electronic device displays a second interface in response to a first operation of the user on the first interface. The electronic device displays a third interface in response to the end of recording the video; the third interface is a preview interface of the first film. The electronic device displays the home interface in response to a second operation of the user on the third interface. In response to a click operation of the user on a first icon, if the camera application still exists in the background of the electronic device, the electronic device displays the third interface. The electronic device saves the finished film in response to a third operation of the user on the third interface.

Description

Video processing method and electronic equipment
The present application claims priority to the Chinese patent application entitled "a method for creating a video for a user based on a story-line mode, and an electronic device" filed on June 16, 2021, to the Chinese patent application entitled "a method for creating a video for a user, and an electronic device" filed on June 16, 2021, and to the Chinese patent application entitled "a video processing method and an electronic device" filed on November 29, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a video processing method and an electronic device.
Background
Electronic devices such as mobile phones and tablets are generally provided with video capturing and editing functions. Taking a mobile phone as an example, using the camera application, the mobile phone can capture and store a video. Then, in response to a click operation of the user on the gallery entry at the lower-left corner of the viewfinder interface, the mobile phone can display a video preview interface, in which the video can be clipped, sent, and so on.
However, the inventors have found in the process of implementing the embodiments of the application that, in the prior art, while the preview interface is displayed, if the electronic device exits to its home interface and then receives a click operation of the user on the application icon of the camera application in the home interface, the electronic device restarts the camera application and displays the viewfinder interface of the photographing mode of the camera application. As a result, the editing data generated in the preview interface is lost.
Disclosure of Invention
Embodiments of the application provide a video processing method and an electronic device: after the camera application exits from running in the foreground, when the camera application runs in the foreground again, display of the preview interface can be restored. Therefore, data loss is avoided, and human-computer interaction efficiency during video editing is improved.
In a first aspect, an embodiment of the present application provides a video shooting method, which is applied to an electronic device, such as a mobile phone or tablet, that supports video shooting. The electronic device displays a first interface. The first interface is a viewfinder interface shown before the electronic device records a video, after the electronic device determines that a first template is adopted. The first template is used by the electronic device to add a first video effect to the shot video. The electronic device displays a second interface in response to a first operation of the user on the first interface. The second interface is a viewfinder interface shown while the electronic device records the video using the first template. The electronic device displays a third interface in response to the end of recording the video; the third interface is a preview interface of a first film, and the first film is a video obtained by the electronic device adding the first video effect to the recorded video. The electronic device displays the home interface of the electronic device in response to a second operation of the user on the third interface. The home interface includes a first icon of the camera application. In response to a click operation of the user on the first icon, if the camera application still exists in the background of the electronic device, the electronic device displays the third interface. The electronic device saves the finished film in response to a third operation of the user on the third interface.
In summary, in a scenario where a template is used to shoot and preview a video, with the video processing method provided in the embodiments of the application, if, while the preview interface is displayed, the camera application exits the foreground and the electronic device returns to its home interface, the preview interface is exited and video editing is interrupted. Subsequently, the electronic device may receive a click operation of the user on the camera icon in the home interface; this click operation triggers the electronic device to run the camera application in the foreground. In response to that click operation, if the camera application still exists in the background, that is, the camera application has not been cleaned up, the electronic device can resume displaying the preview interface, so that the video can continue to be edited there. Therefore, loss of editing data can be avoided, and human-computer interaction efficiency during editing is improved.
In a possible design manner of the first aspect, the electronic device sets the state of the third interface to a first state in response to the second operation of the user on the third interface, and stores interface data of the third interface; the first state is used by the electronic device to keep the third interface from being destroyed. Therefore, even if the preview interface is subsequently exited, the electronic device retains the data of the third interface, which facilitates quick recovery. The displaying, by the electronic device, of the third interface includes: the electronic device updates the state of the third interface to a second state and displays the third interface according to the interface data; the second state is used by the electronic device to display the third interface in the foreground of the electronic device.
In this embodiment, the electronic device may store and restore the interface data by adjusting the state of the interface. Therefore, the human-computer interaction efficiency can be improved.
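As an illustration only, the state handling described in this design can be modeled as a small state machine: the third interface is marked with a kept-alive state instead of being destroyed, its interface data is stored, and both are restored when the interface returns to the foreground. The class, state, and field names below are hypothetical, not taken from the patent or from any real Android API.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical model of the preview ("third") interface's lifecycle states.
class PreviewInterface {
    enum State { FOREGROUND, KEPT_ALIVE, DESTROYED }

    private State state = State.FOREGROUND;
    // Interface data to preserve: playback position, pending edits, etc.
    private final Map<String, String> interfaceData = new HashMap<>();

    // Second operation: the user leaves for the home interface.
    // The interface enters the first state (kept alive, not destroyed).
    public void onLeaveToHome() {
        interfaceData.put("playbackPosition", "00:07");
        interfaceData.put("pendingEdits", "filter=lightGray");
        state = State.KEPT_ALIVE;
    }

    // The user taps the camera icon while the app still exists in the background.
    // The interface enters the second state (displayed in the foreground).
    public boolean resumeIfKeptAlive() {
        if (state == State.KEPT_ALIVE) {
            state = State.FOREGROUND;
            return true;  // redisplay using the saved interfaceData
        }
        return false;     // the app was cleaned up: a cold start happens instead
    }

    public State getState() { return state; }
    public Map<String, String> getInterfaceData() { return interfaceData; }
}
```

In this sketch, the saved `interfaceData` map stands in for whatever editing state the real implementation would persist; the key point mirrored from the design is that the kept-alive state prevents destruction, so resumption restores the interface from stored data rather than recreating it empty.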
In a possible design manner of the first aspect, after the electronic device displays the third interface in response to the end of recording the video, the method further includes: the electronic device plays the first film in the third interface. In response to the end of playing the first film, if the electronic device does not detect a preset operation of the user on the third interface, the electronic device displays first prompt information in the third interface; the preset operation is used to trigger the electronic device to adjust the video effect of the first film, and the first prompt information prompts adjustment of the video effect of the first film.
In this embodiment, the electronic device may provide an editing prompt in the preview interface, so that the user may be explicitly instructed to continue editing the film.
In a possible design manner of the first aspect, after the displaying of the third interface, the method further includes: the electronic device adjusts the video effect of the first film in response to the preset operation of the user on the third interface. The saving, by the electronic device, of the finished film in response to the third operation of the user on the third interface includes: the electronic device saves a second film in response to the third operation; the second film is a video obtained after the electronic device adjusts the video effect of the first film.
In this embodiment, in the preview interface, the video effect may be adjusted. Therefore, the specified combination of various video effects in the template can be broken, and more diversified videos can be formed.
In a possible design manner of the first aspect, before the electronic device displays the first interface, the method further includes: the electronic device displays the fourth interface. The fourth interface comprises a plurality of template options, and the plurality of template options correspond to a plurality of templates in the electronic equipment; the template options comprise a first option, and the first option corresponds to the first template. The electronic equipment responds to the selection operation of the user on the first option and selects the first template. Wherein, electronic equipment shows first interface, includes: and after the first template is selected, the electronic equipment responds to fourth operation of the user on the fourth interface and displays the first interface.
In this embodiment, the electronic device may select the template based on a selection of the user for shooting a video having an effect consistent with that of the template. Thereby, the intelligence of video shooting can be improved.
In a possible design manner of the first aspect, after the displaying of the second interface by the electronic device, the method further includes: the electronic device ends recording the video in response to the recording duration reaching a preset duration; the preset duration is the recording duration corresponding to the first template.
In this embodiment, the electronic device may flexibly control the duration of the video according to the selected template, and may finally form a video most suitable for the template effect.
In a possible design manner of the first aspect, a sample duration of the first template is a first duration, the first video effect includes a first trailer, and a duration of the first trailer is a second duration; the preset time period is a difference between the first time period and the second time period.
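As a minimal illustration of the duration arithmetic in this design, the sketch below computes the preset duration as the sample duration minus the trailer duration and checks when recording should end. The class and method names are hypothetical, chosen only for this example.

```java
// Duration arithmetic from the design: preset = sample - trailer, so that
// (recorded video + appended trailer) matches the sample length exactly.
class PresetDuration {
    // All durations in seconds.
    public static int compute(int sampleDurationS, int trailerDurationS) {
        return sampleDurationS - trailerDurationS;
    }

    // Recording ends once the elapsed time reaches the preset duration.
    public static boolean shouldEndRecording(int elapsedS, int presetS) {
        return elapsedS >= presetS;
    }
}
```

For the worked example used later in the description (a 15 s sample with a 1 s trailer), `compute(15, 1)` yields the 14 s preset duration.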
In a second aspect, embodiments of the present application further provide an electronic device that may support a video capture function, where the electronic device includes a display screen, a memory, and one or more processors. The display screen, the memory, and the processor are coupled. The memory is adapted to store computer program code comprising computer instructions which, when executed by the processor, cause the electronic device to perform the method of the first aspect and any of its possible designs.
In a third aspect, an embodiment of the present application provides a chip system, where the chip system is applied to an electronic device including a display screen and a memory; the chip system includes one or more interface circuits and one or more processors; the interface circuit and the processor are interconnected through a line; the interface circuit is configured to receive signals from a memory of the electronic device and to transmit the signals to the processor, the signals including computer instructions stored in the memory; when the processor executes the computer instructions, the electronic device performs the method as described in the first aspect and any one of its possible designs.
In a fourth aspect, the present application provides a computer storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the method according to the first aspect and any one of its possible designs.
In a fifth aspect, the present application provides a computer program product which, when run on a computer, causes the computer to perform the method according to the first aspect and any one of its possible designs.
It should be understood that beneficial effects that can be achieved by the electronic device according to the second aspect, the chip system according to the third aspect, the computer storage medium according to the fourth aspect, and the computer program product according to the fifth aspect provided above may refer to the beneficial effects of the first aspect and any possible design manner thereof, and are not repeated herein.
Drawings
Fig. 1A is one of schematic interface diagrams of a mobile phone according to an embodiment of the present disclosure;
fig. 1B is a second schematic interface diagram of a mobile phone according to an embodiment of the present disclosure;
fig. 1C is a third schematic interface diagram of a mobile phone according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of a hardware structure of a mobile phone according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram of a software structure of a mobile phone according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a software interaction provided in an embodiment of the present application;
fig. 5 is a fourth schematic interface diagram of a mobile phone according to an embodiment of the present application;
fig. 6A is a fifth schematic view of an interface of a mobile phone according to an embodiment of the present application;
fig. 6B is a sixth schematic interface diagram of a mobile phone according to an embodiment of the present application;
fig. 7 is a seventh schematic interface diagram of a mobile phone according to an embodiment of the present application;
fig. 8 is an eighth schematic interface diagram of a mobile phone according to an embodiment of the present disclosure;
fig. 9 is a ninth schematic view illustrating an interface of a mobile phone according to an embodiment of the present disclosure;
fig. 10 is a flowchart of a video processing method according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of a chip system according to an embodiment of the present disclosure.
Detailed Description
In the description of the embodiments of the application, "at least one" means one or more and "a plurality" means two or more, unless otherwise specified. In addition, to facilitate clear description of the technical solutions of the embodiments, terms such as "first" and "second" are used to distinguish between identical or similar items having substantially the same functions and effects. Those skilled in the art will appreciate that the terms "first", "second", and the like do not limit quantity or execution order, nor do they denote relative importance. Unless otherwise stated, the positions and forms of the interface elements in the interface schematic diagrams are illustrative and may be flexibly adjusted according to actual requirements in an actual implementation.
The embodiment of the application provides a video processing method which can be applied to a scene for previewing a shot video in electronic equipment. For example, the electronic device in the embodiment of the present application may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR)/virtual reality (VR) device, and the like, and the embodiment of the present application does not particularly limit the specific form of the electronic device. In the following embodiments, the electronic device is mainly taken as a mobile phone as an example to explain the scheme of the present application.
In some embodiments, the mobile phone may capture a video to be previewed in a single-lens video mode. The single-lens video recording mode is a common video recording mode in the conventional sense, and in the single-lens video recording mode, the mobile phone only adopts one camera to shoot videos. For example, the interface 101 shown in fig. 1A (a) is a viewfinder interface before recording in the single-lens recording mode. The interface 101 includes only images 102 captured by a single camera (e.g., a front-facing camera).
In other embodiments, the mobile phone may capture the video to be previewed in a multi-lens recording mode. In the multi-lens recording mode, the mobile phone can use two or more cameras to shoot video simultaneously. For example, the interface 103 shown in (b) of fig. 1A is a viewfinder interface before recording in the multi-lens recording mode. The interface 103 includes images 104 and 105 captured by two cameras (e.g., a rear main camera and a front camera). In the following embodiments, the multi-lens recording mode is mainly used as an example to illustrate the solution of the application.
It should be noted that, in the multi-lens recording mode, the mobile phone may also be triggered by a switching operation, for example, clicking the switch button 106 in the interface 103, to shoot video with a single camera. To distinguish this from the single-lens recording mode, the scenario in which a single camera is used to shoot video within the multi-lens recording mode may be denoted as a multi-lens sub-mode.
In one scenario, the mobile phone may shoot the video to be previewed with a selected effect template (which may be denoted as template 1). Each effect template corresponds to a group of video effects, and each group of video effects is a combination of effects such as background music, filters, special effects (such as twinkling stars), transitions, photo frames, stickers, and trailers. With the template 1 selected before video shooting, the mobile phone can shoot a video using the template 1. For convenience of description, the template 1 may be referred to as a first template, and the group of video effects corresponding to the template 1 may be referred to as a first video effect.
Illustratively, the viewfinder interface before recording in the multi-lens recording mode includes a control 1; for example, the control 1 is the "micro movie" button 107 in the interface 103 shown in (b) of fig. 1A. The control 1 is used to trigger the mobile phone to select an effect template. The mobile phone can receive a click operation or long-press operation of the user on the control 1, in response to which a plurality of template options may be displayed. For example, in response to a click operation of the user on the "micro movie" button 107 (i.e., the control 1) shown in (b) of fig. 1A, the mobile phone may display the interface 111 shown in (a) of fig. 1B, which includes a plurality of template options (shown in the dashed box), such as "Hello Summer", "Beautiful Times", "Good Mood", and "Easy Weekend". The plurality of template options correspond one-to-one to a plurality of effect templates, and each template option is used to trigger the mobile phone to select the corresponding effect template. The mobile phone can receive a selection operation of the user on any template option (which may be denoted as option 1, and may also be referred to as a first option). In response to the selection operation on the option 1, the mobile phone may select the effect template corresponding to the option 1 (which may be denoted as template 1). It should be understood that the mobile phone may by default select the effect template corresponding to the first template option; for example, the default selection is the template corresponding to the "Hello Summer" option shown in (a) of fig. 1B. The interface displaying the plurality of template options (which may be denoted as interface 1, and may also be referred to as a fourth interface) includes a control 2; for example, the control 2 is the button 112 in the interface 111 shown in (a) of fig. 1B. The control 2 is used to trigger the mobile phone to shoot a video.
In some embodiments, after selecting the template 1, the mobile phone may receive an operation (which may also be referred to as a fourth operation) such as a click, long-press, or slide by the user on the control 2; the click operation is mainly used as an example below. In response to the click operation of the user on the control 2, the mobile phone may display a shooting preparation interface (which may be denoted as interface 2, and may also be referred to as a first interface). While the interface 2 is displayed, the user can complete preparation before shooting, such as adjusting the shooting angle or turning the background music of the template 1 on or off. For example, the interface 2 is the interface 113 shown in (b) of fig. 1B; the interface 113 includes neither a shooting timer nor controls for ending or pausing shooting, indicating that shooting has not yet started. The interface 2 also includes a control 3, such as the button 114 in the interface 113 shown in (b) of fig. 1B.
Further, after the interface 2 is displayed, the mobile phone may receive an operation (which may also be referred to as a first operation) such as a click, long-press, or slide by the user on the control 3; the click operation is again mainly used as an example below. In response to the click operation of the user on the control 3, the mobile phone can start video shooting using the template 1 and display a viewfinder interface during shooting (which may be denoted as interface 3, and may also be referred to as a second interface). For example, the interface 3 is the interface 115 shown in (c) of fig. 1B. The interface 115 includes interface elements such as a shooting duration "00:00", an end-shooting button 116, and a pause-shooting button 117, indicating that video shooting has started.
In other embodiments, after the mobile phone selects the template 1, the mobile phone may receive a click operation of the control 2 by the user. In response to the click operation of the user on the control 2, the mobile phone can directly display the interface 3. That is, the interface 2 is not displayed, but directly enters shooting.
After starting video capture, the handset may end video capture in response to event 1.
In some embodiments, an end recording control, such as an end capture button 116 shown in (c) of fig. 1B, is included in the interface 3, and is used to trigger the mobile phone to end video capture. The event 1 may be a click operation or a long-press operation of the user on the end recording control, and the like.
In other embodiments, the template 1 has a sample that exhibits the finished-film effect, such as its filters and background music, obtainable after shooting with the template 1. Further, to ensure that the film formed after shooting with the template 1 remains highly consistent with the sample, that is, that the user can tell from the sample what the final film will look like, after selecting the template 1 the mobile phone needs to control the shooting duration so that the duration of the film finally formed with the template 1 is the same as the duration of the sample of the template 1 (which may also be referred to as a first duration). That is, the template 1 may indicate that a video of at most a certain duration (which may be denoted as a preset duration) is shot.
In a specific implementation, the preset duration is the same as the sample duration. For example, if the template 1 is "Hello Summer" and the sample duration of "Hello Summer" is 15 s, then after the "Hello Summer" template is selected and shooting is started, shooting can be controlled to obtain at most 15 s of video.
In another specific implementation, the video effect corresponding to the template 1 includes a trailer (which may also be referred to as a first trailer); the trailer is fixed content, such as an animation of a mobile-phone manufacturer's logo, and is unrelated to the content of the current shot. Accordingly, when a video effect is added to a video shot using the template 1, the trailer is appended at the end. In this case, the sum of the preset duration and the trailer duration (which may also be referred to as a second duration) equals the sample duration; that is, the preset duration equals the difference between the sample duration and the trailer duration. Illustratively, if the template 1 is "Hello Summer", the sample duration of "Hello Summer" is 15 s, and the video effect corresponding to the template 1 includes a 1 s trailer, the preset duration is (15 - 1) s = 14 s. As shown in (c) of fig. 1B, the interface 115 includes "00:00/00:14", which indicates that the preset duration is 14 s.
It should be understood that the manner of indicating the preset time period by the sample of the template 1 is only an optional manner, and the actual implementation is not limited thereto. For example, the preset time length may also be indicated by the time length of the background music of the template 1 so that the formed piece cannot exceed the time length of the background music.
In this embodiment, the event 1 may be an event whose shooting duration (which may also be referred to as recording duration) is equal to a preset duration. In the following embodiments, the scheme of the present application will be mainly described by taking an example that event 1 is a shooting time length equal to a preset time length.
For example, the current shooting duration is the duration on the left side of the "/" inside the dashed-line frame in the interface 121 shown in (a) of fig. 1C.
After shooting ends, the mobile phone can form a film with the video effect indicated by the template 1. That is, after selecting the template 1, the mobile phone can shoot a video using the template 1, resulting in a film with the video effect indicated by the template 1. Shooting a video using the template 1 may include: in the process of shooting the video, the mobile phone adds effects to the captured images in real time according to the video effect indicated by the template 1 and displays them in the viewfinder interface; for example, a filter, stickers, and background music can be added in real time. And/or, shooting a video using the template 1 may include: after the video shooting ends, the mobile phone post-processes the shot video according to the template 1 to add the video effect. In a specific implementation, effects such as background music, filters, stickers, and photo frames can be added in real time during shooting, while effects such as transitions and the trailer can be added only after shooting ends.
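The two-phase effect application described above can be sketched as follows. The split of specific effects between the real-time and post-processing phases follows the example in this paragraph, while the class and method names are illustrative assumptions, not taken from the patent or any real camera framework.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative split of a template's effects into those applied per frame
// during capture and those that can only be applied after recording ends.
class TemplateEffects {
    private static final List<String> REALTIME =
            List.of("backgroundMusic", "filter", "sticker", "photoFrame");
    private static final List<String> POST_ONLY =
            List.of("transition", "trailer");

    // Returns the effects that have been applied at a given stage:
    // during capture only the real-time effects; after recording ends,
    // the post-processing effects are added as well.
    public static List<String> effectsFor(boolean recordingFinished) {
        List<String> out = new ArrayList<>(REALTIME);
        if (recordingFinished) {
            out.addAll(POST_ONLY);
        }
        return out;
    }
}
```

The design choice this mirrors is that per-frame effects (filters, stickers) can be composited into the live preview, whereas transitions and the fixed trailer depend on the complete recording and are therefore deferred to post-processing.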
After forming a film (also referred to as a first film), the mobile phone can display a preview interface of the film (also referred to as a third interface), in which the effects of the film can be previewed. For example, the preview interface is the interface 122 shown in (b) of fig. 1C, and the formed film can be played in the interface 122. The finished film in the interface 122 has effects such as a light-gray filter and summer-themed text stickers.
In addition, in the preview interface, the mobile phone can edit the film; that is, the preview interface may also be understood as an editing interface for the film. After the preview interface is displayed, the mobile phone can receive a preset operation of the user, and the preset operation triggers the mobile phone to edit and adjust the effects of the film, for example, adjusting the background music, filters, special effects (e.g., flashing stars), transitions, photo frames, stickers, and trailer of the film. The specific implementation of editing will be described in the detailed embodiments below and is not expanded here.
In the embodiment of the application, the mobile phone may display a preview interface in response to an event (such as event 1) that the video shooting is ended. In particular, after ending video capture, the camera application may invoke the video editing application to display the preview interface. That is, while the preview interface is displayed, the camera application is still running in the foreground.
In this video shooting and previewing scenario, with the video processing method provided in the embodiments of the application, if, while the preview interface is displayed, the mobile phone exits the foreground-running camera application and returns to its home interface, the preview interface is exited and video editing is interrupted. Subsequently, the mobile phone may receive a click operation of the user on the camera icon (which may also be referred to as a first icon) in the home interface; the click operation triggers the mobile phone to run the camera application in the foreground. In response to that click operation, if the camera application still exists in the background of the mobile phone, that is, the camera application has not been cleaned up, the mobile phone can resume displaying the preview interface, so that the video can continue to be edited there.
In summary, with the method of the embodiment of the present application, after exiting the preview interface, when the click operation of the user on the application icon of the camera application is detected again, the mobile phone may resume displaying the preview interface, so as to facilitate continuing to preview and edit the video. Therefore, the editing data can be prevented from being lost, and the human-computer interaction efficiency in the editing process is improved.
The following detailed description of the embodiments will be made with reference to the accompanying drawings.
Referring to fig. 2, a hardware structure diagram of a mobile phone according to an embodiment of the present application is provided. As shown in fig. 2, taking the electronic device as a mobile phone as an example, the mobile phone may include a processor 210, an external memory interface 220, an internal memory 221, a Universal Serial Bus (USB) interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, an earphone interface 270D, a sensor module 280, keys 290, a motor 291, an indicator 292, a camera 293, a display 294, a Subscriber Identity Module (SIM) card interface 295, and the like.
It is to be understood that the illustrated structure of the present embodiment does not constitute a specific limitation to the electronic device. In other embodiments, an electronic device may include more or fewer components than illustrated, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 210 may include one or more processing units, such as: the processor 210 may include an Application Processor (AP), a modem processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), among others. Wherein, the different processing units may be independent devices or may be integrated in one or more processors.
It should be understood that the interface connection relationship between the modules illustrated in this embodiment is only an exemplary illustration, and does not constitute a limitation on the structure of the electronic device. In other embodiments, the electronic device may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charge management module 240 is configured to receive a charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 240 may receive charging input from a wired charger via the USB interface 230. In some wireless charging embodiments, the charging management module 240 may receive the wireless charging input through a wireless charging coil of the electronic device 200. The charging management module 240 may also supply power to the electronic device through the power management module 241 while charging the battery 242.
The power management module 241 is used to connect the battery 242, the charging management module 240 and the processor 210. The power management module 241 receives input from the battery 242 and/or the charging management module 240, and provides power to the processor 210, the internal memory 221, the external memory, the display 294, the camera 293, and the wireless communication module 260. The power management module 241 may also be used to monitor parameters such as battery capacity, battery cycle number, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 241 may also be disposed in the processor 210. In other embodiments, the power management module 241 and the charging management module 240 may be disposed in the same device.
The wireless communication function of the electronic device may be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, the modem processor, the baseband processor, and the like.
The wireless communication module 260 may provide solutions for wireless communication applied to the electronic device 200, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 260 may be one or more devices integrating at least one communication processing module. The wireless communication module 260 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 210. The wireless communication module 260 may also receive a signal to be transmitted from the processor 210, frequency-modulate and amplify the signal, and convert the signal into electromagnetic waves via the antenna 2 to radiate the electromagnetic waves.
The electronic device implements display functions via the GPU, the display screen 294, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 294 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or alter display information.
The electronic device may implement a shooting function through the ISP, the camera 293, the video codec, the GPU, the display screen 294, and the application processor. The ISP is used to process the data fed back by the camera 293. The camera 293 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. In some embodiments, the electronic device may include 1 or N cameras 293, N being a positive integer greater than 1.
The external memory interface 220 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the electronic device. The external memory card communicates with the processor 210 through the external memory interface 220 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 221 may be used to store computer-executable program code, which includes instructions. The processor 210 executes various functional applications of the electronic device and data processing by executing instructions stored in the internal memory 221. For example, by executing instructions stored in the internal memory 221, the processor 210 may display different content on the display screen 294 in response to a user's operation of expanding the display screen 294. The internal memory 221 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area can store data (such as audio data, phone book and the like) created during use of the electronic device. In addition, the internal memory 221 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
The electronic device may implement audio functions through the audio module 270, the speaker 270A, the receiver 270B, the microphone 270C, the headphone interface 270D, the application processor, and the like. Such as music playing, recording, etc.
The keys 290 include a power-on key, a volume key, and the like. The keys 290 may be mechanical keys. Or may be touch keys. The electronic device may receive a key input, and generate a key signal input related to user settings and function control of the electronic device. The motor 291 may generate a vibration cue. The motor 291 can be used for both incoming call vibration prompting and touch vibration feedback. Indicator 292 may be an indicator light that may be used to indicate a state of charge, a change in charge, or may be used to indicate a message, missed call, notification, etc. The SIM card interface 295 is used to connect a SIM card. The SIM card can be attached to and detached from the electronic device by being inserted into the SIM card interface 295 or being pulled out of the SIM card interface 295. The electronic equipment can support 1 or N SIM card interfaces, and N is a positive integer greater than 1.
Referring to fig. 3, a block diagram of a part of a software structure of a mobile phone provided in the embodiment of the present application is shown. The software system of the mobile phone can adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture or a cloud architecture. The embodiment of the application takes an Android (Android) system with a layered architecture as an example, and exemplifies a software structure of a mobile phone. The layered architecture can divide the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the android system is divided into four layers, which are an application layer (abbreviated as application layer), an application framework layer (abbreviated as framework layer), a Hardware Abstraction Layer (HAL) layer, and a Kernel layer (also called driver layer) from top to bottom.
In the embodiment of the present application, an application layer and an application framework layer that are closely related to implementing the method of the embodiment of the present application are mainly described in more detail. As shown in fig. 3, the layered architecture includes an application layer 310 and an application framework layer 320.
Among them, the application layer 310 includes a desktop application 311, a camera application 312, and a video editing application 313.
The desktop application 311 may be used to display icons of a plurality of application programs in the mobile phone, such as application icons of application programs, such as a camera application, a gallery application, a calendar application, a call application, a map application, a music application, a video application, and a short message application. The desktop application 311 may also be used to receive various operations of the user on the main interface. Illustratively, the desktop application 311 may receive a user click, long press, or the like on an application icon. It should be understood that the main interface of the handset belongs to the desktop application 311.
The camera application 312 includes a plurality of shooting modules, such as a general shooting module, a portrait module, a general video module, a multi-mirror video module, and the like. The multi-mirror video module further includes a micro-movie module. Illustratively, the "micro-movie" button 103 in the interface 103 shown in fig. 1A (b) may trigger the invocation of the micro-movie module. After the micro-movie module is invoked, an effect template can be selected before shooting, and the selected effect template is then used to shoot the video. Further, the micro-movie module may include a one-touch shooting module and a segmented shooting module (not shown). The one-touch shooting module provides a one-touch shooting function: with the one-touch function, after an effect template is selected, the mobile phone can shoot, in a single take, a filmed piece with the corresponding video effect. For example, the above examples of fig. 1B and 1C use the one-touch function. The segmented shooting module provides a segmented shooting function: with the segmented shooting function, after an effect template is selected, the mobile phone can shoot multiple video segments and finally form a filmed piece with the corresponding video effect. In the following embodiments, the present application will mainly be described by taking the one-touch function as an example.
The video editing application 313 is an application built into the mobile phone that can be used for video preview and editing. Unlike the camera application, the gallery application, etc., the video editing application has no physical entry in the mobile phone. For example, there is no application icon for the video editing application on the main interface. The video editing application can typically only be invoked by the camera application or the gallery application for video preview and editing. That is, the entries of the video editing application are the camera application and the gallery application.
The video editing application 313 may be understood as the editing application described earlier. The video editing application 313 includes a film preview module. In some embodiments, after a video is shot using the one-touch function, the mobile phone may call the film preview module in the video editing application 313 to display the filmed piece. Illustratively, the preview interface in the foregoing, such as the interface 122 shown in fig. 1C (b), is displayed by the camera application 312 after invoking the film preview module of the video editing application 313.
The video editing application 313 further includes a protocol confirmation module, which is configured to confirm whether the user agrees to the relevant user agreement of the video editing application 313 when the mobile phone calls the video editing application 313 for the first time. That is, the protocol confirmation module is typically only used when the video editing application 313 is invoked for the first time.
In this embodiment, the video editing application 313 further includes a page pause (onPause) interface and a page resume (onResume) interface. By calling the page onPause interface, an interface is not destroyed after it exits from being displayed in the foreground, so that the display of the interface can be resumed when the corresponding application runs in the foreground again. In some embodiments, as shown in fig. 3, when returning from the preview interface to the main interface, the page onPause interface may be called, so that the preview interface is not destroyed after it exits from being displayed in the foreground, thereby facilitating quick resumption of the preview interface upon re-entering the camera application.
By calling the page onResume interface, an interface that has not been destroyed in the background can be displayed in the foreground again, so that the user can interact with the interface again. In some embodiments, as shown in fig. 3, after the preview interface exits from being displayed in the foreground, when a click operation of the user on the camera icon is detected again, the page onResume interface may be called, so that after the camera application is entered again, the preview interface can be restored quickly.
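As an illustration only, the lifecycle behavior described above can be sketched as a small state machine. This is a conceptual simulation with hypothetical class and method names standing in for the Android Activity lifecycle callbacks; it is not the actual implementation of the video editing application 313:

```python
class PreviewPage:
    """Minimal simulation of the preview interface's page lifecycle.

    Hypothetical sketch: on a real device the onPause/onResume callbacks
    are driven by the activity manager; here a plain state field stands
    in for foreground/background display."""

    def __init__(self):
        self.state = "created"
        self.edit_data = {}  # in-progress editing effects

    def on_resume(self):
        # Page returns to the foreground; the undestroyed page and its
        # editing data are displayed again rather than rebuilt.
        self.state = "resumed"

    def on_pause(self):
        # User returns to the main interface; the page is hidden but
        # NOT destroyed, so edit_data survives in the background.
        self.state = "paused"

    def on_destroy(self):
        # Only reached if the camera application is actually cleaned up.
        self.state = "destroyed"
        self.edit_data = None


page = PreviewPage()
page.on_resume()
page.edit_data["music"] = "relaxing"  # user edits in the preview
page.on_pause()                       # user swipes back to the main interface
page.on_resume()                      # user taps the camera icon again
# state is "resumed" and edit_data is preserved across pause/resume
```

The key design point mirrored here is that returning to the main interface triggers pause, not destruction, which is why editing data can be recovered when the camera application runs in the foreground again.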
The application framework layer 320 includes an activity manager (activity manager) 321 and a view system (view system) 322.
The activity manager 321 can be used to manage the lifecycle of pages in the mobile phone. In some embodiments, as shown in FIG. 3, the page onPause interface, the page onResume interface, and the film preview module may all interact with the activity manager 321 to enable management of the lifecycle of the preview interface.
The view system 322 can be used to build display interfaces for applications. Each display interface may be composed of one or more controls. Generally, a control may include an interface element such as an icon, button, menu, tab, text box, dialog box, status bar, navigation bar, or widget. In some embodiments, as shown in FIG. 3, the view system 322 may render and display interfaces, such as the preview interface, according to the lifecycle of the page.
The video processing method of the embodiment of the application can be realized in a mobile phone with the hardware structure and the software structure. Specifically, as shown in fig. 4, the method in the embodiment of the present application includes:
s401, the camera application 312 receives the start request.
Illustratively, the camera application 312 receives a user's click operation on a camera icon in the main interface, and then receives a launch request. Alternatively, the camera application 312 receives a user's click operation on a camera task in the multi-task interface, and may also receive a start request.
S402, the camera application 312 runs in the foreground, and then starts a one-touch shooting function based on user operation.
The camera application 312 may then be launched and run in the foreground in response to the launch request. Then, the one-touch function may be activated by sequentially passing through (B) of fig. 1A, (a) of fig. 1B, (B) of fig. 1B, and (c) of fig. 1B.
S403, the camera application 312 ends the video photographing in response to the event 1.
For event 1, refer to the description above, and the description is omitted here.
Illustratively, taking the case where the recording duration corresponding to template 1, that is, the preset duration, is 14s as an example, if the current shooting duration reaches 14s and thus equals the preset duration, event 1 is satisfied, and the video shooting may be ended.
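The end-of-shooting check for event 1 amounts to a simple duration comparison. The following sketch uses hypothetical function and parameter names to illustrate it:

```python
def video_shooting_ended(current_s: float, preset_s: float) -> bool:
    """Event 1 check (illustrative): shooting ends once the current
    shooting duration reaches the preset duration, i.e. the recording
    duration of the selected effect template (template 1)."""
    return current_s >= preset_s


# With template 1's preset duration of 14s:
video_shooting_ended(13.9, 14.0)  # still shooting
video_shooting_ended(14.0, 14.0)  # event 1 satisfied, end shooting
```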
S404, the camera application 312 sends a call request to the video editing application 313.
In the embodiment of the present application, the camera application 312 calls the video editing application 313 to display a fragmented preview interface. For the preview interface, refer to the description in the foregoing, and the description is omitted here.
It should be noted that, during use of the mobile phone, the user is required to agree to the relevant user agreement of the video editing application 313 when the video editing application 313 is invoked for the first time. Based on this, in some embodiments, in response to the invocation request, if the video editing application 313 determines that it is being invoked for the first time, the video editing application 313 (e.g., the protocol confirmation module) displays the user agreement and provides agree/disagree options for the user to select. The video editing application 313 may proceed to S405 in response to the user agreeing to the user agreement. Conversely, in response to the user not agreeing to the user agreement, the video editing application 313 returns to displaying interface 1, i.e., the interface including a plurality of template options.
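The first-invocation branch of the protocol confirmation module can be sketched as follows. All names here are illustrative assumptions, and the agreement dialog is modeled as a callback that returns the user's choice:

```python
class ProtocolConfirmationModule:
    """Illustrative sketch of the first-invocation agreement check in
    the video editing application 313 (names are hypothetical)."""

    def __init__(self):
        self.accepted = False  # would be persisted on a real device

    def handle_call_request(self, ask_user_agrees) -> str:
        # Later invocations skip the dialog entirely.
        if self.accepted:
            return "display preview interface"  # proceed to S405
        # First invocation: show the user agreement dialog.
        if ask_user_agrees():
            self.accepted = True
            return "display preview interface"  # proceed to S405
        # User declined: return to interface 1 (the template options).
        return "return to interface 1"


module = ProtocolConfirmationModule()
module.handle_call_request(lambda: False)  # first call, user declines
module.handle_call_request(lambda: True)   # user agrees this time
module.handle_call_request(lambda: False)  # dialog no longer shown
```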
S405, the video editing application 313 displays a preview interface.
That is, in the present embodiment, the video editing application 313 is invoked in the camera application 312 to display the preview interface. For example, the preview interface is the interface 122 shown in (b) in fig. 1C.
In some embodiments, after displaying the preview interface, if a preset condition is met, the video editing application 313 may display prompt information 1 (which may also be referred to as first prompt information) in the preview interface, such as the prompt 505 in the interface 501 shown in fig. 5, where the specific content of the prompt 505 is "click to try a different movie genre". The prompt information 1 is used to prompt the user that the effect of the filmed piece can be switched. Meeting the preset condition includes: the display duration of the preview interface reaches a first duration, for example, 10s, and the video editing application 313 has not detected the preset operation of the user. Alternatively, the filmed piece may be played in the preview interface, and meeting the preset condition includes: the playback of the filmed piece ends, and the video editing application 313 has not detected the preset operation of the user. The preset operation is used for triggering the mobile phone to switch the effect of the filmed piece.
Illustratively, the preview interface includes an effect switching control, and the effect switching control is used for triggering the mobile phone to switch the effect of the filmed piece. For example, the effect switching control may include a style switching control, such as the "style" button 502 in the interface 501 shown in fig. 5; the style switching control is used to trigger the mobile phone to switch the style of the filmed piece, where the style refers to the overall effect of the video effects of template 1 other than background music, such as the combined effect of the filter, sticker, photo frame, and the like. As another example, the effect switching control may include a music switching control, such as the music button 503 in the interface 501 shown in fig. 5, for triggering the mobile phone to switch the background music. As another example, the effect switching control may include an editing control, such as the editing button 504 in the interface 501 shown in fig. 5, for triggering the mobile phone to switch various effects such as background music, filter, sticker, photo frame, volume, and the like. The preset operation may be a click operation or a long-press operation of the user on the effect switching control.
Further, that the preset operation of the user has not been detected means: since the mobile phone was first used, in scenes of shooting a video with a selected effect template, a preset operation input by the user has never been detected after shooting ends and the preview interface is entered. For example, if, in the historical use of the mobile phone, the user has clicked the style switching control in the preview interface after selecting an effect template to shoot a video, then when the preview interface is displayed this time, the prompt information 1 is not displayed after the playback of the filmed piece ends. In this way, repeated prompts to a user who is already familiar with the effect switching function can be avoided.
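The preset condition for showing prompt information 1 can be expressed as a boolean check. The function and parameter names below are illustrative, and the 10s first duration is the example value from the text:

```python
def should_show_prompt_1(display_duration_s: float,
                         first_duration_s: float,
                         playback_finished: bool,
                         preset_op_ever_detected: bool) -> bool:
    """Illustrative check: prompt information 1 is shown only if the
    user has never input the preset (effect-switching) operation, and
    either the preview interface has been displayed for the first
    duration (e.g. 10s) or the filmed piece has finished playing."""
    if preset_op_ever_detected:
        # User already knows the effect switching function; never prompt.
        return False
    return display_duration_s >= first_duration_s or playback_finished


# Displayed for 10s with no preset operation ever detected: prompt.
should_show_prompt_1(10.0, 10.0, False, False)
# Playback just ended, user has switched effects before: no prompt.
should_show_prompt_1(3.0, 10.0, True, True)
```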
In some embodiments, after displaying the preview interface, the video editing application 313 may receive the preset operation of the user, the preset operation being used for switching the effect of the filmed piece. For the preset operation, reference may be made to the description in the foregoing embodiments, which is not repeated here. For example, the preset operation may be a click operation of the user on the music button 602 in the interface 601 shown in (a) in fig. 6A. In response to the preset operation, the video editing application 313 may display a plurality of effect options in the preview interface; for example, the plurality of effect options are the 4 music options "relaxing", "summer", "animation", and "pleasure" in the pop-up window 603 shown in (b1) in fig. 6A, and each music option corresponds to one piece of background music.
The following mainly describes the specific implementation of switching the effect of the filmed piece by taking the switching of background music as an example.
In one specific implementation, a plurality of pieces of background music may be stored in the video editing application 313, and each piece of background music has versions with two durations: the duration of version 1 is the template duration of the effect template matched with the piece of background music, and the duration of version 2 is a fixed duration, such as 30s. The fixed duration (e.g., 30s) is longer than or equal to the longest template duration (e.g., 20s). For example, the versions of background music and their matching effect templates stored in the video editing application 313 are shown in table 1 below:
TABLE 1
(Table 1 appears as an image in the original publication; it lists each piece of background music, its matching effect template, and the durations of its version 1 and version 2.)
In this implementation, the video editing application 313 may display a plurality of music options in response to the preset operation. The plurality of music options includes a first music option and a second music option. The first music option corresponds to the background music (denoted as music 1) matching template 1, and specifically to music 1 of version 1. There is typically only one first music option. A second music option corresponds to background music (denoted as music 2) matching another template (a template other than template 1), and specifically to music 2 of version 2. There may typically be a plurality of second music options, such as at least 3. In this way, music whose length is sufficient for a piece filmed with template 1 can be quickly provided for the user to select.
Illustratively, taking table 1 above as an example, if template 1 selected before shooting is "hello summer", the first music option may correspond to the 15s version of the relaxing music, and the second music options may correspond to the 30s versions of a plurality of pieces of music, such as the summer music, the kinetic music, the happy music, the sad music, the winter music, and the like. For example, the video editing application 313 may display the popup 603 shown in (b1) in fig. 6A; the popup 603 includes a 15s relaxing music option (i.e., the first music option) and 30s summer music, 30s kinetic music, and 30s happy music options (i.e., second music options). The template duration of "hello summer" is the music duration of the version-1 relaxing music, i.e., 15s; accordingly, the duration of a piece filmed with "hello summer" usually does not exceed 15s. Therefore, the lengths of the music corresponding to the plurality of music options in the popup 603 shown in (b1) in fig. 6A can cover the entire filmed piece.
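Under the assumptions of Table 1, the first implementation's option list could be assembled as follows. The catalog data, template names other than "hello summer", and function names are illustrative placeholders:

```python
# Illustrative catalog: music name -> (matching effect template,
# version-1 duration = that template's duration). Every piece of music
# also has a fixed 30s version 2.
CATALOG = {
    "relaxing": ("hello summer", 15),
    "summer":   ("template A", 20),
    "kinetic":  ("template B", 15),
    "happy":    ("template C", 18),
}
VERSION2_DURATION = 30  # fixed, >= the longest template duration


def music_options(selected_template: str):
    """Implementation 1 (sketch): the first option is version 1 of the
    music matching the selected template; the second options are the
    fixed 30s version 2 of all other music."""
    options = []
    for music, (template, v1_duration) in CATALOG.items():
        if template == selected_template:
            options.insert(0, (music, v1_duration))      # first music option
        else:
            options.append((music, VERSION2_DURATION))   # second music options
    return options


# Template 1 = "hello summer": first option is the 15s relaxing music,
# every other option is a 30s version, long enough for any filmed piece.
opts = music_options("hello summer")
```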
In another specific implementation, the video editing application 313 also stores a plurality of pieces of background music, where each piece of background music includes version 1, as can be seen in table 1. In this implementation, the video editing application 313 may display a plurality of music options in response to the preset operation. The plurality of music options includes a first music option and a second music option. The first music option corresponds to the background music (denoted as music 1) matching template 1, and specifically to music 1 of version 1. There is usually only one first music option. A second music option corresponds to the background music (denoted as music 3) matching a template 2, where the template duration of template 2 is longer than or equal to the template duration of template 1. In the following embodiments, the case where the template duration of template 2 equals the template duration of template 1 is mainly described. Since the music duration of a version-1 background music is the same as the template duration of its matching effect template, the template duration of template 2 being equal to the template duration of template 1 can also be understood as: the music duration of music 3 of version 1 is equal to the music duration of music 1 of version 1. The second music option corresponds to music 3 of version 1. There may be one or more second music options. In this way, the video editing application 313 can provide the user with music that best matches the length of the filmed piece, so that operations such as trimming the music and fading out its ending at the time of switching can be reduced.
Illustratively, again taking table 1 above as an example, if template 1 selected before shooting is "hello summer", then the first music option corresponds to the 15s version (i.e., version 1) of the relaxing music. The second music options may correspond to one or more of the 15s version of the summer music, the 15s version of the kinetic music, and the 15s version of the happy music. For example, the video editing application 313 may display the popup 604 shown in (b2) in fig. 6A; the popup 604 includes a 15s relaxing music option (i.e., the first music option) and 15s summer music, 15s kinetic music, and 15s happy music options (i.e., second music options).
It should be understood that the above two specific implementations are only exemplary ways of determining the music options, and practical implementations are not limited thereto. For example, the two specific implementations may be combined: the options corresponding to music 3 of version 1 are determined as second options, and if no music 3 exists, or the number of pieces of music 3 is insufficient, the video editing application 313 may further complement the second options with music 2 of version 2.
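The combined strategy just described could be sketched as below: prefer version-1 music whose matching template duration equals template 1's duration, and pad with 30s version-2 music when there are not enough. The catalog contents and names are illustrative assumptions:

```python
# Illustrative catalog: music name -> matching template's duration (s),
# i.e. that music's version-1 duration. Every music also has a 30s version 2.
CATALOG = {
    "relaxing": 15, "summer": 15, "kinetic": 15,
    "happy": 18, "sad": 20, "winter": 20,
}
VERSION2_DURATION = 30


def second_options(template1_duration: int, first_music: str, needed: int = 3):
    """Combined strategy (sketch): second options are version-1 music
    whose matching template duration equals template 1's (music 3); if
    fewer than `needed` exist, complement with version-2 (30s) music of
    the remaining templates (music 2)."""
    same = [(m, d) for m, d in CATALOG.items()
            if m != first_music and d == template1_duration]
    if len(same) >= needed:
        return same[:needed]
    others = [(m, VERSION2_DURATION) for m, d in CATALOG.items()
              if m != first_music and d != template1_duration]
    return same + others[:needed - len(same)]


# "hello summer" has a 15s template duration and matches "relaxing":
# only two other 15s version-1 pieces exist, so one 30s piece pads the list.
opts = second_options(15, "relaxing", needed=3)
```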
In the foregoing description of the music options, in order to distinguish between music and effect templates, the cover and the name of a music option are not associated with the corresponding effect template at all. In other embodiments, to reflect the association between music and effect templates, when displaying the plurality of music options, the cover of a music option may be set to the template cover of the effect template matching the corresponding music; for example, the cover of the first music option may be set to the cover of template 1. Alternatively, the name of a music option may be set to the template name of the effect template matching the corresponding music; for example, if template 1 is "hello summer", the name of the first music option may be set to "hello summer".
In some embodiments, after the plurality of music options are displayed, the first music option may be selected by default. For example, if the 15s relaxing music option shown in (b1) in fig. 6A is the first music option, the first music option is displayed in bold, indicating that the 15s version of the relaxing music is currently selected.
In some embodiments, after displaying the plurality of music options, the video editing application 313 may receive a selection operation of the user on any music option (which may be denoted as music option 1). In response to the selection operation on music option 1, the mobile phone may replace the background music in the filmed piece with the background music corresponding to music option 1 and play the filmed piece from the beginning. For example, taking the case where music option 1 is the option 611 shown in (a) in fig. 6B, the video editing application 313 may, in response to a selection operation of the user on the option 611, replace the background music in the filmed piece with the music corresponding to the option 611, and then play the filmed piece from the beginning. In this way, the effect of applying the selected music to the filmed piece can be conveniently previewed. Further, if the music duration of the music corresponding to music option 1 is longer than the duration of the filmed piece, a music segment with the same duration as the filmed piece can be cut from the beginning of the music corresponding to music option 1, and the ending of the cut music segment is faded out.
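The trim-and-fade step reduces to simple arithmetic over durations. The names below are illustrative, and a real implementation would of course operate on audio samples rather than timestamps:

```python
def fit_music_to_piece(music_s: float, piece_s: float, fade_s: float = 2.0):
    """Sketch of fitting selected music to the filmed piece: if the
    music is longer than the piece, cut a segment of the piece's
    duration from the beginning of the music and fade out the segment's
    ending; otherwise use the music as-is.

    Returns (segment_start, segment_end, fade_out_start), where
    fade_out_start is None when no fade is needed. The 2s fade length
    is an assumed example value."""
    if music_s <= piece_s:
        return (0.0, music_s, None)
    end = piece_s
    return (0.0, end, max(0.0, end - fade_s))


# 30s version-2 music applied to a 15s filmed piece:
# keep 0-15s of the music and fade out over its last 2s.
fit_music_to_piece(30.0, 15.0)
```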
Subsequently, the mobile phone may receive an operation 1 of the user, where operation 1 is used to trigger the mobile phone to confirm the background music switch. Here, operation 1 may be a top-to-bottom slide-down operation on the pop-up window displaying the plurality of music options (which may be referred to as a music popup). Alternatively, the music popup includes a confirmation control, and operation 1 may be a click operation or a long-press operation of the user on the confirmation control. For example, the music popup is the popup 612 shown in (b) in fig. 6B, and a "√" button is included in the popup 612; the "√" button is the confirmation control, and operation 1 may be a click operation of the user on the "√" button. The embodiment of the present application does not specifically limit the form of operation 1. In response to operation 1, the mobile phone may close the plurality of music options and resume displaying the preview interface that does not include the music options, for example, the interface 601 shown in (a) in fig. 6A.
In the above-described embodiment, the video editing application 313 may switch the effects of the clip, such as the background music, in response to the preset operation. This breaks the fixed combination of background music, filter, photo frame, sticker, and other effects in the effect template (such as template 1) selected before shooting. For example, after the background music is switched, the filter, photo frame, sticker, and other effects of template 1 can be combined with the switched background music, thereby improving the diversity of clip effects.
While the preview interface is displayed, the user may input an operation to return to the main interface, causing the preview to be interrupted. In the embodiment of the present application, in response to the operation of returning to the main interface, not only may the main interface be displayed, which is described in detail in S406 below, but the preview interface may also be controlled to enter a pause state, which is described in S407-S408 below.
S406, the desktop application 311 responds to the operation of returning to the main interface, and displays the main interface.
The operation of returning to the main interface may be the user sliding upward from the bottom of the preview interface. Alternatively, the operation may be the user clicking a preset control (e.g., a home key) floating on the screen. The following embodiments mainly take the user sliding up from the bottom of the preview interface as an example. For convenience of description, the operation of returning to the main interface may be referred to as a second operation.
Illustratively, the preview interface is an interface 701 shown in (a) in fig. 7, and the desktop application 311 may display an interface 702 shown in (b) in fig. 7 in response to a user's slide-up operation from the bottom of the interface 701. Interface 702 is the main interface of the handset.
S407, the video editing application 313 sends a call request to the activity manager 321 in response to the operation of returning to the home interface.
The video editing application 313 (e.g., a page onPause interface) sends a call request to the activity manager 321 to request that the preview interface not be destroyed after it exits the foreground.
It should be noted that the video editing application 313 (e.g., a page onPause interface) merely provides an entry that can control the preview interface to enter a pause state. The video editing application 313 (e.g., a page onPause interface) needs to call the activity manager 321, so as to implement state management after the preview interface enters the background.
It should be noted that the foregoing S406 and S407 have no absolute sequential order. For example, S406 and S407 may be performed simultaneously. For another example, S407 may be executed first, and then S406. The embodiment of the present application is not particularly limited in this respect.
S408, the activity manager 321 controls the preview interface to enter a pause state, and saves the data of the preview interface.
The pause state (which may also be referred to as a first state) is an onPause state. After the preview interface enters the pause state, even if the preview interface is currently covered by the main interface, that is, the preview interface is not displayed in the foreground, the preview interface is not destroyed.
The data of the preview interface includes the content displayed in the interface. In some embodiments, the data of the preview interface also includes edit data generated on the preview interface, such as the switched background music, style, and the like. In this way, all of the user's edits to the clip can be recorded.
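The pause behavior of S407-S408 can be modeled as a small state machine. The sketch below is a toy Python model, not Android framework code; the class and method names are hypothetical, with `on_pause` standing in for the onPause-style entry that snapshots the interface content and the edit data generated so far.

```python
class PreviewPage:
    """Toy model of the pause behavior in S407-S408 (names hypothetical)."""

    def __init__(self):
        self.state = "resumed"   # analogous to a foreground onResume state
        self.saved_data = None
        self.destroyed = False

    def on_pause(self, shown_content, edit_data):
        # Entering the pause (onPause-like) state: snapshot what the
        # interface shows plus the edits made so far (music, style, ...).
        self.state = "paused"
        self.saved_data = {"content": shown_content, "edits": edit_data}

    def covered_by_home_screen(self):
        # A paused page that is covered by the main interface is kept
        # alive rather than destroyed.
        if self.state != "paused":
            self.destroyed = True
```

Under this model, pausing before the main interface covers the page guarantees the page and its edit data survive in the background.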
S409, the desktop application 311 receives a click operation of the camera icon by the user.
After returning to the main interface of the mobile phone, the user can click the camera icon to trigger the mobile phone to run the camera application again in the foreground. For example, a user clicking on the camera icon 802 in the interface 801 shown in fig. 8 (a) may trigger the cell phone to run the camera application 312 again in the foreground.
S410, the desktop application 311 sends a start request to the camera application 312.
S411, the camera application 312 queries whether the background process is destroyed.
If the background process of the camera application 312 is not destroyed, it indicates that the camera application 312 is still present in the background. Accordingly, a call relationship (which may also be referred to as a call stack) in which the camera application 312 calls the video editing application 313 to display the preview interface still exists. In this embodiment of the present application, for this case, S412 is executed to restore the preview interface.
If the background process of the camera application 312 has been destroyed, it indicates that the camera application 312 has been cleaned up from the background. Accordingly, the call relationship in which the camera application 312 calls the video editing application 313 to display the preview interface no longer exists. In this case, the call relationship cannot be acquired; the camera application 312 can only be initialized when started, and the interface initialized by the camera application 312 is displayed, for example, the viewfinder interface before photographing. Illustratively, in response to the user clicking the camera icon 802 in the interface 801 shown in (a) in fig. 8, if the query shows that the camera application 312 has been destroyed, the camera application 312 may display the interface 803 shown in (b 1) in fig. 8, where the interface 803 is the viewfinder interface before taking a picture, that is, the interface initialized by the camera application 312. Note that the text "camera application destroyed" in the interface 803 shown in (b 1) in fig. 8 is for illustration only and is not normally displayed on the interface in practice.
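The branch decided in S411 — restore the preview if the background process and its call stack survive, otherwise show the freshly initialized viewfinder — can be sketched as follows. This is an illustrative Python stub; the string labels and the `call_stack` list representation are assumptions for illustration, not real Android APIs.

```python
def on_camera_icon_tapped(background_alive, call_stack):
    """Decide what to show when the camera icon is tapped (S411):
    restore the preview interface only if the camera application's
    background process survives and its call stack still records the
    call into the video editing application."""
    if background_alive and "video_editing_app" in call_stack:
        return "restore_preview_interface"   # S412: resume the preview
    return "show_initial_viewfinder"         # process destroyed: re-initialize
```

The two return values correspond to interface 804 in fig. 8 (b2) and interface 803 in fig. 8 (b1), respectively.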
S412, the camera application 312 sends a call request to the video editing application 313 in response to the background process not being destroyed.
In this embodiment, if the background process of the camera application 312 is not destroyed, the camera application 312 may query the call relationship in which the camera application 312 calls the video editing application 313, and may call the video editing application 313 (for example, a page onResume interface) according to the call relationship to request display of the preview interface.
S413, the video editing application 313 sends a call request to the window manager 321.
The video editing application 313 (e.g., page onResume interface) further sends a call request to the window manager 321 to request restoration of the preview interface in response to the call request of the camera application 312.
S414, the window manager 321 controls the preview interface to enter the resume state from the pause state, and resumes the data of the preview interface.
The resume state (which may also be referred to as a second state) is an onResume state. After the preview interface enters the resume state, the preview interface is in a state of being displayed in the foreground.
It should be appreciated that the window manager 321 needs to invoke the capabilities of the view system 322 to render and display when restoring the preview interface.
S415, the window manager 321 sends the restored preview interface to the video editing application 313.
S416, the video editing application 313 resumes displaying the preview interface.
Illustratively, in response to the user clicking the camera icon 802 in the interface 801 shown in (a) in fig. 8, if the query shows that the camera application is not destroyed, the video editing application 313 may display the interface 804 shown in (b 2) in fig. 8, where the interface 804 is the preview interface of the clip. Note that the text "camera application is not destroyed" in the interface 804 shown in (b 2) in fig. 8 is for illustration only and is not normally displayed on the interface in practice.
After resuming display of the preview interface, the video editing application 313 may continue to switch the effects of the clip in response to the user's preset operation. For a specific implementation of switching the effects of the clip, reference may be made to the related description in S405, and details are not repeated here.
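The resume path of S412-S416 can likewise be sketched as a small function: a paused page transitions back to the resumed state and hands back its saved data for re-display. This is a toy Python model with hypothetical names, not the actual onResume implementation.

```python
def resume_preview(page):
    """onResume-like transition of S414: move the paused page back to
    the resumed state and return its saved data for re-display."""
    if page["state"] != "paused" or page["saved_data"] is None:
        raise RuntimeError("no paused preview to restore")
    page["state"] = "resumed"
    return page["saved_data"]
```

Because the saved data includes the edit data (switched music, style, and so on), re-displaying from it preserves everything the user had done before exiting.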
Therefore, through the foregoing S403 to S416, in a scenario where the camera application 312 calls the video editing application 313 to display the preview interface after video shooting ends, if the camera application 312 exits to the background of the mobile phone and is then triggered to run in the foreground again through the camera icon in the main interface, the video editing application 313 may resume displaying the preview interface.
In the conventional technology, when a system application such as the camera, gallery, settings, or contacts is used on the mobile phone, returning to the main interface and then tapping the application icon again causes the application to be initialized when it runs in the foreground, and its initialization interface is displayed. With the present scheme, when the camera application is restarted after exiting, the preview interface is restored instead of the initialized interface being displayed directly. First, after the preview interface is restored, previewing and editing of the clip can continue. Second, because neither the shot video nor the clip has been saved while the preview interface is displayed, restoring the preview interface avoids data loss.
In some embodiments, after the preview interface is restored, the method further comprises: S417, the video editing application 313 saves the edited clip in response to the user's operation of saving the clip (which may also be referred to as a third operation). Illustratively, a control 4 is included in the preview interface, such as the button 902 in the interface 901 shown in fig. 9 (a). The control 4 is used to trigger the mobile phone to save the video. The save operation may be a click operation or a long-press operation of the user on the control 4.
Note that if the video effects of the clip are not edited while the preview interface is displayed, the edited clip is the clip with the video effect corresponding to template 1; if the video effects are edited while the preview interface is displayed, the edited clip is a clip in which one or more video effects have been adjusted based on the video effect corresponding to template 1 (which may also be referred to as a second clip).
Further, while saving the clip, the video editing application 313 may display a save prompt in the preview interface. The save prompt is used to indicate that the clip is being saved. For example, the save prompt is the prompt 903 shown in fig. 9 (b), and the prompt 903 indicates that the save progress is 45%. The video editing application 313 may send a save-completion notification to the camera application 312 in response to the end of saving, that is, after the save progress reaches 100%. The camera application 312 may display interface 1, i.e., the interface including a plurality of template options, in response to the save-completion notification, as shown by the interface 904 in fig. 9 (c). This facilitates continuing to select an effect template to shoot a video.
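The save flow described above — report progress while saving, then notify the camera application at 100% — can be sketched as follows. This Python sketch is illustrative only; frame-by-frame saving and the callback names are assumptions, not part of the disclosed implementation.

```python
def save_clip(frames, on_progress, notify):
    """Save the clip piece by piece, reporting percentage progress
    (the '45%' prompt of fig. 9 (b)) and notifying the camera
    application once saving completes (the save-completion
    notification of S417)."""
    saved = []
    total = len(frames)
    for i, frame in enumerate(frames, start=1):
        saved.append(frame)
        on_progress(i * 100 // total)   # drives the save prompt
    notify("save_complete")             # camera app shows template options
    return saved
```

The camera application's handler for `"save_complete"` would then switch back to the template-selection interface (interface 904).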
Since the mobile phone may include software structures such as the desktop application 311, the camera application 312, the video editing application 313, and the window manager 321, the video processing method according to the embodiment of the present application may be completed by using the mobile phone as an execution subject. The video processing method provided by the embodiment of the present application will be described below with the execution subject being a mobile phone. Specifically, as shown in fig. 10, the method includes:
S1001, the mobile phone displays a preview interface in response to the end of video shooting.
While the preview interface is displayed, the mobile phone may edit the clip in response to a preset operation of the user, for example, switching the clip's background music, style, and the like.
And S1002, in the process of displaying the preview interface, the mobile phone responds to the operation of returning to the main interface, displays the main interface of the mobile phone, controls the preview interface to enter a pause state, and stores the data of the preview interface.
In the embodiment of the application, after the preview interface enters the pause state, the preview interface is not destroyed even if it is not displayed in the foreground.
S1003, the mobile phone responds to the click operation of the user on the camera icon in the main interface, and whether the camera application is destroyed or not is inquired.
S1004, the mobile phone displays the viewfinder interface of the camera application before photographing, in response to the camera application having been destroyed.
S1005, the mobile phone, in response to the camera application not being destroyed, controls the preview interface to enter the resume state, and resumes displaying the preview interface according to the saved data of the preview interface.
In the embodiment of the application, after the preview interface enters the resume state, the preview interface can be restored and displayed in the foreground.
Similarly, while the preview interface is displayed, the mobile phone may edit the clip in response to a preset operation of the user, for example, switching the clip's background music, style, and the like.
S1006, the mobile phone saves the clip in response to the user's save operation.
In summary, with the method of the embodiment of the present application, after the mobile phone exits the preview and the camera application is run in the foreground again, the preview interface can be restored, so that previewing and editing can conveniently continue. In addition, while the preview interface is displayed, the video shot with the template has not been saved; after the preview interface is restored, the clip can be saved, thereby preventing the shooting result from being lost.
Other embodiments of the present application provide an electronic device, which may include: the display screen (e.g., a touch screen), memory, and one or more processors. The display screen, memory and processor are coupled. The memory is for storing computer program code comprising computer instructions. When the processor executes the computer instructions, the electronic device may perform various functions or steps performed by the mobile phone in the above-described method embodiments. The structure of the electronic device can refer to the structure of the mobile phone shown in fig. 2.
The embodiment of the present application further provides a chip system, as shown in fig. 11, the chip system 1100 includes at least one processor 1101 and at least one interface circuit 1102. The processor 1101 and the interface circuit 1102 may be interconnected by wires. For example, the interface circuit 1102 may be used to receive signals from other devices (e.g., a memory of an electronic device). As another example, the interface circuit 1102 may be used to send signals to other devices (e.g., the processor 1101). Illustratively, the interface circuit 1102 may read instructions stored in the memory and send the instructions to the processor 1101. The instructions, when executed by the processor 1101, may cause the electronic device to perform the various steps in the embodiments described above. Of course, the chip system may further include other discrete devices, which is not specifically limited in this embodiment of the present application.
The embodiment of the present application further provides a computer storage medium, where the computer storage medium includes computer instructions, and when the computer instructions are run on the electronic device, the electronic device is enabled to execute each function or step executed by the mobile phone in the foregoing method embodiment.
The embodiment of the present application further provides a computer program product, which when running on a computer, causes the computer to execute each function or step executed by the mobile phone in the above method embodiments.
Through the description of the above embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one type of logical functional division, and other divisions may be realized in practice, for example, multiple units or components may be combined or integrated into another device, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or multiple physical units, that is, may be located in one place, or may be distributed in multiple different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may also be implemented in the form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, or the portions thereof that substantially contribute to the prior art, or all or portions of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions to enable a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (9)

1. A video processing method applied to an electronic device includes:
the electronic equipment displays a first interface; the first interface is a viewing interface before the electronic equipment records the video after determining that the first template is adopted; the first template is used for adding a first video effect to a shot video by the electronic equipment;
the electronic equipment responds to a first operation of a user on the first interface and displays a second interface; the second interface is a view finding interface of the electronic equipment which adopts the first template to record videos;
the electronic equipment responds to the end of recording the video and displays a third interface; the third interface is a preview interface of a first clip, and the first clip is a video obtained by the electronic equipment adding the first video effect to the recorded video;
the electronic equipment responds to a second operation of the user on the third interface and displays a main interface of the electronic equipment; the main interface comprises a first icon of a camera application;
the electronic equipment responds to the clicking operation of the user on the first icon, and if the camera application exists in the background of the electronic equipment, the electronic equipment displays the third interface;
and the electronic equipment responds to a third operation of the user on the third interface and saves the finished clip.
2. The method of claim 1, further comprising:
the electronic equipment responds to a second operation of a user on the third interface, sets the state of the third interface to be a first state, and stores interface data of the third interface; the first state is used for the electronic equipment to keep the third interface not to be destroyed;
the electronic device displays the third interface, including:
the electronic equipment updates the state of the third interface to be a second state, and displays the third interface according to the interface data; the second state is used for displaying the third interface on the foreground of the electronic equipment by the electronic equipment.
3. The method of claim 1 or 2, wherein after the electronic device displays the third interface in response to ending recording the video, the method further comprises:
the electronic equipment plays the first clip in the third interface;
the electronic equipment responds to the end of playing of the first clip, and if the electronic equipment does not detect the preset operation of the user on the third interface, the electronic equipment displays first prompt information in the third interface; the preset operation is used for triggering the electronic equipment to adjust the video effect of the first clip, and the first prompt information is used for prompting adjustment of the video effect of the first clip.
4. The method of any of claims 1-3, wherein after displaying the third interface, the method further comprises:
the electronic equipment responds to the preset operation of the user on the third interface, and adjusts the video effect of the first clip;
the electronic equipment responds to a third operation of the user on the third interface, and saves the finished clip, including:
the electronic equipment responds to a third operation of the user on the third interface and saves a second clip; the second clip is a video obtained after the electronic equipment adjusts the video effect of the first clip.
5. The method of any of claims 1-4, wherein prior to the electronic device displaying the first interface, the method further comprises:
the electronic equipment displays a fourth interface; the fourth interface comprises a plurality of template options, and the template options correspond to a plurality of templates in the electronic equipment; the template options comprise a first option, and the first option corresponds to the first template;
the electronic equipment responds to the selection operation of a user on the first option, and the first template is selected;
wherein the electronic device displays a first interface, comprising:
and after the first template is selected, the electronic equipment responds to fourth operation of the user on the fourth interface and displays the first interface.
6. The method of any of claims 1-5, wherein after the electronic device displays the second interface, the method further comprises:
the electronic equipment responds to the fact that the recording duration is equal to the preset duration, and the electronic equipment finishes recording the video; the preset time length is the recording time length corresponding to the first template.
7. The method of claim 6, wherein the sample duration of the first template is a first duration, the first video effect comprises a first trailer, and the duration of the first trailer is a second duration; the preset duration is the difference between the first duration and the second duration.
8. An electronic device, wherein a plurality of applications are installed in the electronic device, the electronic device comprising a display screen, a memory, and one or more processors; the display screen, the memory and the processor are coupled; the memory for storing computer program code comprising computer instructions which, when executed by the processor, cause the electronic device to perform the method of any of claims 1-7.
9. A computer-readable storage medium having instructions stored therein, which when run on an electronic device, cause the electronic device to perform the method of any of claims 1-7.
CN202210038857.7A 2021-06-16 2022-01-13 Video processing method and electronic equipment Active CN115484396B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN2021106767093 2021-06-16
CN202110676709 2021-06-16
CN202111439292 2021-11-29
CN2021114392925 2021-11-29

Publications (2)

Publication Number Publication Date
CN115484396A true CN115484396A (en) 2022-12-16
CN115484396B CN115484396B (en) 2023-12-22

Family

ID=84420803

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210038857.7A Active CN115484396B (en) 2021-06-16 2022-01-13 Video processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN115484396B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120210221A1 (en) * 2010-07-15 2012-08-16 Khan Itrat U Media-Editing Application with Live Dragging and Live Editing Capabilities
WO2016124095A1 (en) * 2015-02-04 2016-08-11 腾讯科技(深圳)有限公司 Video generation method, apparatus and terminal
CN108363499A (en) * 2018-01-31 2018-08-03 维沃移动通信有限公司 A kind of text restoration methods and mobile terminal
CN109523609A (en) * 2018-10-16 2019-03-26 华为技术有限公司 A kind of method and terminal of Edition Contains
CN110515521A (en) * 2019-08-14 2019-11-29 维沃移动通信有限公司 A kind of screenshot method and mobile terminal
CN111385508A (en) * 2018-12-28 2020-07-07 广州市百果园信息技术有限公司 Video processing method, device, equipment and storage medium
CN111770354A (en) * 2020-07-02 2020-10-13 广州酷狗计算机科技有限公司 Information interaction method, device, terminal and storage medium
WO2020216096A1 (en) * 2019-04-25 2020-10-29 华为技术有限公司 Video editing method and electronic device
CN112947923A (en) * 2021-02-25 2021-06-11 维沃移动通信有限公司 Object editing method and device and electronic equipment
CN113115099A (en) * 2021-05-14 2021-07-13 北京市商汤科技开发有限公司 Video recording method and device, electronic equipment and storage medium


Also Published As

Publication number Publication date
CN115484396B (en) 2023-12-22

Similar Documents

Publication Publication Date Title
CN110401766B (en) Shooting method and terminal
US9465468B2 (en) Mobile terminal and controlling method thereof
US20170068380A1 (en) Mobile terminal and method for controlling the same
CN112449099B (en) Image processing method, electronic equipment and cloud server
CN104052909A (en) Shooting method and device
CN114189625B (en) Shooting control method and terminal
CN109167937B (en) Video distribution method, device, terminal and storage medium
CN109587549B (en) Video recording method, device, terminal and storage medium
EP4007287A1 (en) Video processing method, device, terminal, and storage medium
CN108965770B (en) Image processing template generation method and device, storage medium and mobile terminal
CN116156314A (en) Video shooting method and electronic equipment
CN112417180B (en) Method, device, equipment and medium for generating album video
CN109104633B (en) Video screenshot method and device, storage medium and mobile terminal
WO2019061223A1 (en) Camera application control method and device
WO2022262536A1 (en) Video processing method and electronic device
CN112822544A (en) Video material file generation method, video synthesis method, device and medium
CN115484396B (en) Video processing method and electronic equipment
US20220415361A1 (en) Method for processing videos and electronic device
US20230412535A1 (en) Message display method and electronic device
CN106528197B (en) Shooting method and device
CN115480684A (en) Method for returning edited multimedia resource and electronic equipment
CN115484387A (en) Prompting method and electronic equipment
CN115729405A (en) Display method of dock in desktop and electronic equipment
CN115484398B (en) Video shooting method and electronic equipment
CN111399797A (en) Voice message playing method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant