CN116634058A - Editing method of media resources and electronic equipment - Google Patents


Info

Publication number: CN116634058A
Application number: CN202211036191.8A
Authority: CN (China)
Prior art keywords: interface, user, type, electronic equipment, editing
Legal status: Granted; Active (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Other versions: CN116634058B
Inventor: Han Xiao (韩笑)
Current Assignee: Honor Device Co., Ltd.
Original Assignee: Honor Device Co., Ltd.
Application filed by Honor Device Co., Ltd.; application granted; publication of CN116634058A (application) and CN116634058B (grant)

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M1/00 - Substation equipment, e.g. for use by subscribers
    • H04M1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 - User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M1/72439 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/222 - Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 - Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265 - Mixing

Abstract

The application provides a media resource editing method and an electronic device, and relates to the field of terminal technologies. The method addresses the cumbersome operations required to add materials to media resources in the prior art, and improves editing efficiency and the user experience. The method includes the following steps: the electronic device displays a first interface, the first interface including a plurality of editing items; in response to the user selecting a first editing item, the electronic device displays a first window on the first interface, the first window including a first icon; in response to the user's operation on the first icon, the electronic device displays a second interface, the second interface including a plurality of material type options and materials of a first type; in response to the user selecting a first material type, the electronic device displays a third interface, the third interface including a plurality of materials of a second type; and in response to the user selecting a first material, the electronic device displays a fourth interface, the fourth interface including a second area, the second area displaying the first media resource with the first material added.

Description

Media resource editing method and electronic device
The present application claims priority to the Chinese patent application filed on May 30, 2022, with application number 202210603535.2 and entitled "An image editing method and apparatus", the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a method for editing a media resource and an electronic device.
Background
To meet users' demands for recording and sharing life anytime and anywhere, electronic devices such as mobile phones and tablets are generally equipped with cameras. To further enrich the shooting and authoring experience, more and more electronic devices allow users to edit video locally.
In existing editing modes, the electronic device may offer multiple editing operations on the video editing page for the user to select, such as adding background music, filters, and special effects. When the user has selected one editing operation (for example, adding a filter) from the available operations to edit a video and then wishes to perform another editing operation (for example, adding a special effect) on the video, the user must perform a return operation back to the video editing page and then select the other editing operation. The operation is cumbersome, and the user experience is not good enough.
Disclosure of Invention
Embodiments of the present application provide a media resource editing method and an electronic device, which address the cumbersome operations required to add materials to media resources in the prior art.
To achieve the above purpose, the embodiments of the present application adopt the following technical solutions:
In a first aspect, a method for editing a media resource is provided, the method including: the electronic device displays a first interface, the first interface including a first area and a plurality of editing items, the first area displaying a first media resource; in response to the user selecting a first editing item from the plurality of editing items, the electronic device displays a first window on the first interface, the first window including a first icon; in response to the user's operation on the first icon, the electronic device displays a second interface, the second interface including a plurality of material type options and materials of a first type, the first editing item corresponding to the materials of the first type; in response to the user selecting a first material type, the electronic device displays a third interface, where the plurality of material type options include the first material type, the third interface includes a plurality of materials of a second type, and the plurality of materials of the second type include a first material; and in response to the user selecting the first material, the electronic device displays a fourth interface, the fourth interface including a second area, the second area displaying the first media resource with the first material added.
It can be seen that, while adding a material of a certain type (e.g., the first type) to a first media resource (e.g., a video or a picture), the user can enter the material center (i.e., the second interface) through the first icon. If the user then wishes to add a material of another type (e.g., the second type) to the first media resource, the user can directly select the required material type (e.g., the first material type) through the material type options in the material center, and pick a material of the second type there (the third interface), without returning to the first interface to reselect. This simplifies the user's operation flow and improves the user experience.
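The navigation behaviour described in the first aspect can be sketched as a small model. This is purely illustrative: the `MaterialCenter` class, its method names, and the sample catalog are assumptions for explanation and are not the patent's actual implementation.

```python
class MaterialCenter:
    """Models the second/third interfaces: one hub that lists every
    material type, so the user never returns to the first interface
    to change what kind of material is being browsed."""

    def __init__(self, catalog):
        # catalog maps a material type option to its list of materials
        self.catalog = catalog
        self.current_type = None

    def open(self, editing_item_type):
        # Tapping the first icon opens the center showing the materials
        # that correspond to the currently selected editing item.
        self.current_type = editing_item_type
        return self.catalog[editing_item_type]

    def switch_type(self, material_type):
        # Selecting another type option swaps the listed materials in
        # place; no back navigation to the first interface is required.
        self.current_type = material_type
        return self.catalog[material_type]


center = MaterialCenter({
    "filter": ["warm", "mono"],
    "special_effect": ["sparkle", "blur"],
})
assert center.open("filter") == ["warm", "mono"]
assert center.switch_type("special_effect") == ["sparkle", "blur"]
```

The key design point the claims emphasize is that `switch_type` is available from inside the center itself, which is what removes the return-and-reselect round trip of the prior art.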
In an optional embodiment, the fourth interface further includes a second window, the second window including a first control and a second control. If the first material type is a filter, a theme, a background, or a transition, the method further includes: in response to the user's operation on the first control, the electronic device redisplays the first interface; and in response to the user's operation on the second control, the electronic device redisplays the first interface and displays, in the first area, the first media resource with the first material added.
In an optional embodiment, the fourth interface further includes a second window, the second window including a first control and a second control; the first media resource includes a first segment, the first segment being the segment where the cursor is located. If the first material type is special effects or subtitles and a material of the second type has been added to the first segment, the method further includes: in response to the user's operation on the first control, the electronic device displays a fifth interface, the fifth interface including a third area, the third area displaying a first picture, where the first picture is a picture of the first media resource at a first playing progress, and the first playing progress is the playing progress indicated by the cursor when the electronic device detects the user's operation on the first control; and in response to the user's operation on the second control, the electronic device displays a sixth interface, the sixth interface including a fourth area, the fourth area displaying a second picture, where the second picture is a picture, at a second playing progress, of the first media resource with the first material added, and the second playing progress is the playing progress indicated by the cursor when the electronic device detects the user's operation on the second control.
In an optional embodiment, if the first material type is special effects or subtitles and no material of the second type has been added to the first segment, the method further includes: in response to the user's operation on the first control, the electronic device displays a seventh interface, the seventh interface including a fifth area, the fifth area displaying a third picture, where the third picture is a picture of the first media resource at a third playing progress, and the third playing progress is the playing progress indicated by the cursor when the electronic device adds the first material to the first media resource; and in response to the user's operation on the second control, the electronic device displays an eighth interface, the eighth interface including a sixth area, the sixth area displaying a fourth picture, where the fourth picture is a picture, at the third playing progress, of the first media resource with the first material added.
In an optional embodiment, the fifth interface, the sixth interface, the seventh interface, and the eighth interface further include a plurality of editing items, the plurality of editing items including a second editing item, the second editing item being in a selected state and corresponding to the material of the second type.
In an optional embodiment, if the first material type is transition, displaying the fourth interface by the electronic device in response to the user selecting the first material includes: in response to the user selecting the first material and determining that the first media resource includes a split point, the electronic device displays the fourth interface.
In an optional embodiment, the method further includes: in response to the user selecting the first material and determining that the first media resource does not include a split point, the electronic device displays a ninth interface, the ninth interface including a seventh area and prompt information, where the seventh area displays the first media resource and the prompt information indicates that a transition cannot be added to the first media resource.
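The two transition branches above reduce to one check: a transition needs a split point, i.e. the media resource must consist of at least two segments. The following sketch is a minimal illustration under that assumption; the function name and the segment representation are invented here, not taken from the patent.

```python
def try_add_transition(segments, transition):
    """Insert a transition at a split point, or report why it cannot
    be done. `segments` is the media resource as an ordered list of
    clips; a split point exists between any two adjacent clips."""
    if len(segments) < 2:
        # No split point: corresponds to the ninth interface's prompt.
        return None, "a transition cannot be added to this media resource"
    # For illustration, place the transition at the first split point
    # (the patent's fourth interface would show the edited result).
    return segments[:1] + [transition] + segments[1:], None


edited, hint = try_add_transition(["clip_a", "clip_b"], "fade")
assert edited == ["clip_a", "fade", "clip_b"] and hint is None

edited, hint = try_add_transition(["clip_a"], "fade")
assert edited is None and "cannot be added" in hint
```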
In an optional embodiment, the first media resource includes a first segment, the first segment being the segment where the cursor is located. If the first material type is music, displaying the fourth interface by the electronic device in response to the user selecting the first material includes: in response to the user selecting the first material and determining that no music has been added to the first segment, the electronic device displays the fourth interface.
In an optional embodiment, the method further includes: in response to the user selecting the first material and determining that music has been added to the first segment, the electronic device displays a tenth interface, the tenth interface including a prompt box, the prompt box including a third control; and in response to the user's operation on the third control, the electronic device displays an eleventh interface, the eleventh interface including a plurality of editing items, the plurality of editing items including a second editing item, the second editing item being in a selected state and corresponding to the material of the second type.
In an optional embodiment, the prompt box further includes a fourth control, and the method further includes: in response to the user's operation on the fourth control, the electronic device adds the first material to the first segment and displays the fourth interface, where the fourth interface further includes a third window, the third window including a volume bar, an audio track, a fifth control, and a sixth control; and in response to the user's operation on the fifth control or the sixth control, the electronic device displays the eleventh interface.
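The music branches above can be summarized as: add directly if the segment has no music, otherwise prompt the user to keep (third control) or replace (fourth control) the existing music. The sketch below is a hedged illustration of that decision only; the interface names in the return values refer back to the embodiments above, and everything else is assumed.

```python
def add_music(segment, new_music, replace_if_present):
    """Decide which interface the device shows when the user picks a
    piece of music for the segment under the cursor."""
    if segment.get("music") is None:
        segment["music"] = new_music      # no music yet: add and show result
        return "fourth_interface"
    if replace_if_present:                # user taps the fourth control
        segment["music"] = new_music
        return "fourth_interface"
    return "eleventh_interface"           # user taps the third control: keep old music


seg = {"music": None}
assert add_music(seg, "song_a", replace_if_present=False) == "fourth_interface"
assert add_music(seg, "song_b", replace_if_present=True) == "fourth_interface"
assert seg["music"] == "song_b"
assert add_music(seg, "song_c", replace_if_present=False) == "eleventh_interface"
assert seg["music"] == "song_b"
```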
In an optional embodiment, the third interface further includes a plurality of type labels and a plurality of controls, each type label corresponding to a plurality of materials of the second type, the plurality of type labels corresponding one-to-one to the plurality of controls; and in response to the user selecting a seventh control from the plurality of controls, the electronic device displays a twelfth interface, the twelfth interface including an eighth area, the eighth area displaying the first media resource with a second material added, where the second material is a material, among the plurality of materials of the second type, corresponding to a first type label, and the first type label is the type label corresponding to the seventh control.
In an optional embodiment, the fourth interface further includes a fourth window, the fourth window including a plurality of type labels and a plurality of materials of the second type, the first type label being in a selected state and the first material being in a selected state.
In an optional embodiment, the method further includes: in response to the user selecting the second material in the third interface, the electronic device displays a fifth window, the fifth window including a seventh control; and in response to the user's operation on the seventh control, the electronic device displays a twelfth interface, the twelfth interface including a ninth area, the ninth area displaying the first media resource with the second material added.
In an optional embodiment, the fifth window includes a tenth area and an eleventh area, the tenth area displaying the second material and the eleventh area displaying a plurality of materials of the second type, the plurality of materials of the second type including the second material, the second material being in a selected state.
In an optional embodiment, when the electronic device displays the first window for the first time, the first window further includes a guide bubble, the guide bubble prompting the user to tap the first icon.
In a second aspect, the application further provides an electronic device including a processor and a memory, the processor being coupled to the memory, the memory storing program instructions that, when executed by the processor, cause the electronic device to implement the method of any one of the first aspect.
In a third aspect, the present application provides a computer-readable storage medium including computer instructions that, when run on an electronic device, cause the electronic device to perform the method of any one of the first aspect.
In a fourth aspect, the application provides a computer program product that, when run on a computer, causes the computer to carry out the method of the first aspect and any one of its possible design manners.
In a fifth aspect, the present application provides a chip system including one or more interface circuits and one or more processors, the interface circuits and the processors being interconnected by lines. The chip system may be applied to an electronic device including a communication module and a memory. The interface circuit is configured to receive signals from the memory of the electronic device and transmit the received signals to the processor, the signals including computer instructions stored in the memory. When the processor executes the computer instructions, the electronic device may perform the method of the first aspect and any one of its possible design manners.
It will be appreciated that, for the advantages achieved by the electronic device of the second aspect, the computer-readable storage medium of the third aspect, the computer program product of the fourth aspect, and the chip system of the fifth aspect, reference may be made to the advantages of the first aspect and any of its possible design manners, which are not repeated here.
Drawings
FIG. 1 is an interface diagram in the prior art;
FIG. 2 is an interface diagram in the prior art;
FIG. 3 is an interface diagram in the prior art;
FIG. 4 is an interface diagram in the prior art;
FIG. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a software structure of an electronic device according to an embodiment of the present application;
FIG. 7 is a set of interface diagrams according to an embodiment of the present application;
FIG. 8 is a set of interface diagrams according to an embodiment of the present application;
FIG. 9 is a set of interface diagrams according to an embodiment of the present application;
FIG. 10 is a set of interface diagrams according to an embodiment of the present application;
FIG. 11 is an interface diagram according to an embodiment of the present application;
FIG. 12 is an interface diagram according to an embodiment of the present application;
FIG. 13 is an interface diagram according to an embodiment of the present application;
FIG. 14 is an interface diagram according to an embodiment of the present application;
FIG. 15 is an interface diagram according to an embodiment of the present application;
FIG. 16 is an interface diagram according to an embodiment of the present application;
FIG. 17 is a set of interface diagrams according to an embodiment of the present application;
FIG. 18 is a set of interface diagrams according to an embodiment of the present application;
FIG. 19 is an interface diagram according to an embodiment of the present application;
FIG. 20 is a set of interface diagrams according to an embodiment of the present application;
FIG. 21 is a set of interface diagrams according to an embodiment of the present application;
FIG. 22 is an interface diagram according to an embodiment of the present application;
FIG. 23 is an interface diagram according to an embodiment of the present application;
FIG. 24 is an interface diagram according to an embodiment of the present application;
FIG. 25 is a set of interface diagrams according to an embodiment of the present application;
FIG. 26 is a set of interface diagrams according to an embodiment of the present application;
FIG. 27 is a set of interface diagrams according to an embodiment of the present application;
FIG. 28 is a set of interface diagrams according to an embodiment of the present application;
FIG. 29 is a set of interface diagrams according to an embodiment of the present application;
FIG. 30 is an interface diagram according to an embodiment of the present application;
FIG. 31 is an interface diagram according to an embodiment of the present application;
FIG. 32 is a set of interface diagrams according to an embodiment of the present application;
FIG. 33 is an interface diagram according to an embodiment of the present application;
FIG. 34 is an interface diagram according to an embodiment of the present application;
FIG. 35 is a set of interface diagrams according to an embodiment of the present application;
FIG. 36 is a set of interface diagrams according to an embodiment of the present application.
Detailed Description
The terms "first" and "second" below are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined by "first" or "second" may explicitly or implicitly include one or more such features. In the description of the embodiments, unless otherwise specified, "a plurality of" means two or more.
Currently, electronic devices such as mobile phones and tablet computers generally provide a video editing function. This function enables a user to edit a video on the electronic device, for example by clipping it or adding rich materials (e.g., music, transitions, special effects, filters). Illustratively, the pictures of the video (which may also be called video frames) can be edited by clipping, retaining the more highlight-worthy pictures in the video, adjusting the positions of pictures, and so on. Background music can be configured to set the style of the video. A transition can split a video, or connect different videos or photos; it is typically placed between two less related pictures, making the change of video content more natural. Special effects and filters make the video more vivid and rich. In this way, the secondarily created video is more vivid and rich and better matches the user's creative intention. The video may be a video clip, or a video composed of multiple pictures.
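The materials named above attach to a video in two ways: some (music, a filter) apply to the video as a whole and replace any earlier choice, while others (special effects, transitions) can accumulate. The data model below is an assumption made for illustration; the patent does not prescribe any particular representation.

```python
# A hypothetical video asset with slots for the material kinds above.
video = {
    "frames": ["frame_1", "frame_2", "frame_3"],
    "music": None,            # single-valued: a new choice replaces the old
    "filter": None,           # single-valued
    "special_effects": [],    # list-valued: effects accumulate
}

def apply_material(asset, kind, material):
    """Attach a material to the asset: replace single-valued slots,
    append to list-valued ones."""
    if isinstance(asset[kind], list):
        asset[kind].append(material)
    else:
        asset[kind] = material
    return asset


apply_material(video, "music", "background_track")
apply_material(video, "special_effects", "sparkle")
assert video["music"] == "background_track"
assert video["special_effects"] == ["sparkle"]
```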
The following describes a video editing manner in the prior art, taking a mobile phone as an example of the electronic device:
As shown in fig. 1, the mobile phone may display an editing interface 101. The editing interface 101 includes a video preview area 102 and editing options 103. The video preview area 102 is used to preview the video; it may display one frame of the video as a cover or play the video in a loop. The editing options 103 provide a variety of editing modes for the user to select. The editing modes may include: theme 103a, clip 103b, filter 103c, music 103d, text 103e, and so on. In the interface shown in fig. 1, clip 103b may be in a selected state, i.e., the video can be clipped on the editing interface 101.
The user may also select another editing mode from the editing options 103 as needed. Illustratively, the user may select music 103d; the mobile phone may receive the user's click on music 103d and, in response, display interface 104. As shown in fig. 2, interface 104 may provide multiple pieces of music for the user to select. Interface 104 also includes a cancel option 104a and a use button 104b. By operating cancel option 104a, the user can forgo configuring music for the video; by operating use button 104b, the user can configure the selected music for the video.
Then, if the user needs to perform another editing operation (for example, adding a filter) on the video, the mobile phone must return to the editing interface. When the mobile phone detects the user's operation on cancel option 104a or use button 104b, or detects a system return gesture (such as a swipe from the left edge of the screen toward the right, or from the right edge toward the left), it may display the editing interface 105 shown in fig. 3. The editing interface 105 is similar to the editing interface 101 in fig. 1, except that music 103d in editing interface 105 is in a selected state. The user may then select another editing mode (e.g., filter 103c) on editing interface 105. After the mobile phone detects the user's operation on filter 103c, it may display interface 106. As shown in fig. 4, interface 106 provides a variety of filters for the user to select.
As can be seen, with the video editing method in the prior art, while the user performs one editing operation on a video (for example, adding music), only the materials corresponding to that operation (for example, only music) can be browsed and added; if the user wishes to perform another editing operation (e.g., adding a filter) on the video, more cumbersome operations are required (e.g., returning to the editing interface and then selecting filter 103c), and the user experience is not good enough.
In view of this, an embodiment of the present application provides a media resource editing method applicable to an electronic device. The electronic device may be provided with a material center entry; after receiving the user's operation on the material center entry, the electronic device may display the material center interface, which offers multiple different types of editing materials for the user to select. This reduces the number of user operations, simplifies the operation flow, and improves the user experience.
It should be noted that a media resource may include a video or an image. The type of media resource (video or image) edited by the electronic device is not limited in the embodiments of the present application. The following embodiments are described schematically taking a video as the edited media resource.
To describe the media resource editing method provided by the embodiments of the present application more clearly and in detail, the electronic device involved in implementing the method is first described below.
The electronic device may be a mobile phone, a tablet, a desktop computer, a laptop, a handheld computer, a notebook, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device; the embodiments of the application do not specifically limit the particular type of electronic device.
The following describes the structure of an electronic device to which the present application applies, taking a mobile phone as an example. Referring to fig. 5, the electronic device may include: a processor 210, an external memory interface 220, an internal memory 221, a universal serial bus (USB) interface 230, a charge management module 240, a power management module 241, a battery 242, antenna 1, antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, a headset interface 270D, a sensor module 280, keys 290, a motor 291, an indicator 292, a camera 293, a display 294, a subscriber identity module (SIM) card interface 295, and the like.
The sensor module 280 may include pressure sensors, gyroscope sensors, barometric pressure sensors, magnetic sensors, acceleration sensors, distance sensors, proximity sensors, fingerprint sensors, temperature sensors, touch sensors, ambient light sensors, bone conduction sensors, and the like.
It is to be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the electronic device. In other embodiments, the electronic device may include more or fewer components than shown, or combine certain components, or split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 210 may include one or more processing units such as, for example: the processor 210 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be the nerve center and command center of the electronic device. The controller can generate operation control signals according to instruction operation codes and timing signals to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 210 for storing instructions and data. In some embodiments, the memory in the processor 210 is a cache. The memory may hold instructions or data that the processor 210 has just used or uses cyclically. If the processor 210 needs to use the instructions or data again, it can call them directly from this memory. Repeated accesses are avoided and the waiting time of the processor 210 is reduced, thereby improving the efficiency of the system.
In some embodiments, processor 210 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
It should be understood that the connection relationships between the modules illustrated in this embodiment are only schematic and do not limit the structure of the electronic device. In other embodiments, the electronic device may also use an interfacing manner different from the above embodiment, or a combination of multiple interfacing manners.
The charge management module 240 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. The charging management module 240 may also provide power to the electronic device through the power management module 241 while charging the battery 242.
The power management module 241 is used to connect the battery 242, the charge management module 240, and the processor 210. The power management module 241 receives input from the battery 242 and/or the charge management module 240, and supplies power to the processor 210, the internal memory 221, the external memory, the display 294, the camera 293, the wireless communication module 260, and the like. In some embodiments, the power management module 241 and the charge management module 240 may also be provided in the same device.
The wireless communication function of the electronic device may be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, a modem processor, a baseband processor, and the like. In some embodiments, antenna 1 and mobile communication module 250 of the electronic device are coupled, and antenna 2 and wireless communication module 260 are coupled, such that the electronic device may communicate with a network and other devices through wireless communication techniques.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example, the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 250 may provide a solution for wireless communication including 2G/3G/4G/5G, etc. applied on an electronic device. The mobile communication module 250 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 250 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation.
The mobile communication module 250 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 250 may be disposed in the processor 210. In some embodiments, at least some of the functional modules of the mobile communication module 250 may be provided in the same device as at least some of the modules of the processor 210.
The wireless communication module 260 may provide solutions for wireless communication including WLAN (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied on electronic devices.
The wireless communication module 260 may be one or more devices that integrate at least one communication processing module. The wireless communication module 260 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 210. The wireless communication module 260 may also receive a signal to be transmitted from the processor 210, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
The electronic device implements display functions through the GPU, the display screen 294, and the application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display screen 294 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or change display information.
The display 294 is used to display images, videos, and the like. The display 294 includes a display panel.
The electronic device may implement shooting functions through an ISP, a camera 293, a video codec, a GPU, a display 294, an application processor, and the like. The ISP is used to process the data fed back by the camera 293. The camera 293 is used to capture still images or video. In some embodiments, the electronic device may include 1 or N cameras 293, N being a positive integer greater than 1.
The external memory interface 220 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device. The external memory card communicates with the processor 210 through an external memory interface 220 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
Internal memory 221 may be used to store computer-executable program code, which includes instructions. The processor 210 implements various functional applications and data processing of the electronic device by executing the instructions stored in the internal memory 221. For example, in an embodiment of the present application, the internal memory 221 may include a storage program area and a storage data area.
The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device (e.g., audio data, phonebook, etc.), and so forth. In addition, the internal memory 221 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The electronic device may implement audio functions, such as music playing and recording, through the audio module 270, the speaker 270A, the receiver 270B, the microphone 270C, the earphone interface 270D, the application processor, and the like.
Keys 290 include a power key, volume keys, etc. The keys 290 may be mechanical keys or touch keys. The motor 291 may generate a vibration alert. The motor 291 may be used for incoming call vibration alerting or for touch vibration feedback. The indicator 292 may be an indicator light, which may be used to indicate a charging state, a change in battery level, a message, a missed call, a notification, etc. The SIM card interface 295 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 295 or removed from the SIM card interface 295 to achieve contact with and separation from the electronic device. The electronic device may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 295 may support a Nano SIM card, a Micro SIM card, and the like.
Fig. 6 shows a software architecture block diagram of an electronic device according to an embodiment of the present application. The layered architecture divides the software into several layers, and each layer has a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, the Android runtime (Android runtime) and system libraries, and a kernel layer.
The application layer may include a series of application packages. By way of example, the application package may include applications for cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions. By way of example, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, an editing manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is for providing communication functions of the electronic device. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar, and can be used to convey a notification-type message that automatically disappears after a short stay without requiring user interaction. For example, the notification manager is used to notify of download completion, message alerts, etc. The notification manager may also present a notification in the form of a chart or scroll-bar text in the system top status bar, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, an indicator light blinks, etc.
The editing manager is used for providing editing functions corresponding to each editing option when the application needs to edit the data so as to realize the editing of the data (such as multimedia resources) in the application.
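The editing manager described here is specific to this embodiment rather than a standard Android framework service. As a minimal, hypothetical sketch of the behavior described above (all class, option, and handler names below are illustrative assumptions, not the embodiment's actual API), its mapping from editing options to editing functions might look like this:

```python
# Hypothetical sketch of an "editing manager" that maps editing options
# (theme, clip, filter, music, text, special effect) to handler functions.
# All names are illustrative, not the patent's actual API.

class EditingManager:
    def __init__(self):
        # Registry mapping each editing option to its editing function.
        self._handlers = {}

    def register(self, option, handler):
        """Register the editing function that corresponds to one editing option."""
        self._handlers[option] = handler

    def edit(self, option, resource, **params):
        """Apply the editing function for `option` to a media resource."""
        if option not in self._handlers:
            raise ValueError(f"no editing function for option: {option}")
        return self._handlers[option](resource, **params)


manager = EditingManager()
manager.register("filter", lambda video, name: f"{video}+filter:{name}")
manager.register("music", lambda video, track: f"{video}+music:{track}")

result = manager.edit("filter", "first_video", name="black and white 3")
```

An application would register one handler per editing option it supports, so adding a new editing item only requires one more `register` call rather than changes to the dispatch logic.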
The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media library (media library), three-dimensional graphics processing library (e.g., openGL ES), two-dimensional graphics engine (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The two-dimensional graphics engine is a drawing engine for 2D drawing.
The kernel layer is the layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
The methods in the following embodiments may be implemented in an electronic device having the above-described hardware structure and the above-described software structure.
The method for editing the media resource provided by the embodiment of the application is described below by taking the electronic equipment as an example of a mobile phone and combining the attached drawings.
The user can edit a video on the mobile phone, where the editing includes clipping, setting a theme (which may also be called a template), adding special effects, filters, music, and the like. The video edited by the user may be a video shot by the user using the camera application of the mobile phone; it may also be a video stored in the mobile phone; or it may be a video composed of a plurality of video clips and/or pictures, which is not specifically limited herein.
Illustratively, as shown in fig. 7 (a), the mobile phone may display an interface 701. The interface 701 may include a status bar, an icon of a clock application, an icon of a settings application, an icon 702 of a camera application, icons of other applications, and so on. In some embodiments, the interface 701 may also be referred to as a main interface, a home interface, or the like. After the mobile phone detects the user's operation on the icon 702 of the camera application, the interface 702 may be displayed. As shown in (b) of fig. 7, the interface 702 includes an image 703, a shooting mode option 704, a shutter control 705, a flash control, and the like. The image 703 is an image acquired by the camera in real time. It should be noted that the mobile phone may refresh the image displayed on the interface 702 (i.e., the image 703) in real time, so that the user can preview the image collected by the camera in real time. The shooting mode option 704 is used to provide a plurality of shooting modes for selection by the user. The plurality of shooting modes may include: portrait 704a, photo 704b, video 704c, aperture, night scene, real-time blurring, panorama, etc. The mobile phone may receive an operation of the user sliding the shooting mode option 704 left or right, and in response to the operation, the electronic device may turn on the shooting mode selected by the user. It should be noted that the shooting mode option 704 is not limited to what is shown in fig. 7 (b), and may include more or fewer options than those shown in the figure. The interface 702 may also be referred to as a shooting interface, a shooting preview interface, and the like.
As shown in fig. 7 (b), the photo 704b is in the selected state. That is, the mobile phone is currently in the photographing mode. If the user wishes to turn on the video recording mode, the user may slide the shooting mode option 704 to the left and select the video 704c. When the operation of the user sliding the shooting mode option 704 to the left and selecting the video 704c is detected, the mobile phone may turn on the video recording mode and display an interface 706 as shown in fig. 8 (a). The interface 706 shown in fig. 8 (a) is similar to the interface 702 shown in fig. 7 (b), except that the video 704c is selected in the interface 706, and the interface 706 includes a record button 706a. After detecting the user's operation on the record button 706a (e.g., clicking or touching the record button 706a), the mobile phone may begin recording video and display an interface 707. As shown in (b) of fig. 8, the interface 707 includes a stop button 707a, a pause button 707b, a recording duration 707c, and the like. After detecting the user's operation on the stop button 707a, the mobile phone may generate a first video and display an interface 708. As shown in (a) of fig. 9, the interface 708 may display a certain frame of the captured first video as a cover. Alternatively, the mobile phone may also loop-play the first video in the interface 708. The interface 708 includes an edit button 708a. The user can edit the captured first video by operating the edit button 708a. The interface 708 also includes a save button, a style button, a share button, a music button, and the like, which are not limited herein. After the mobile phone detects the user's operation on the edit button 708a, an interface 709 may be displayed. As shown in fig. 9 (b), the interface 709 includes a preview area 709a, editing options 710, and an export button 711. The preview area 709a may be used to preview the first video during editing, so that the user can determine the editing effect.
The mobile phone can save the edited first video through the export button 711. The editing options 710 are used to provide a plurality of editing items for selection by the user. The plurality of editing items may include: theme 710a, clip 710b, filter 710c, music 710d, text 710e, and special effect 710f. It should be noted that the editing options 710 may include more or fewer editing items than those shown in (b) of fig. 9, and the names of the included editing items may also be different; for example, a theme may also be referred to as a template, a background, etc., which is not specifically limited herein.
The theme 710a may provide editing templates for selection by the user, and each editing template may include a preset filter, transition, music, text, special effect, etc. Adding an editing template to the first video means adding a filter, transition, music, text, special effects, and the like to the first video according to the editing template. The clip 710b enables the user to clip the image frames of the first video and edit the picture of the first video: the video frames included in the first video can be adjusted, the video picture can be adjusted, the first video can be divided into a plurality of video clips, or video clips can be added on the basis of the first video. Different video clips are separated by segmentation points; to make the transition between two video clips smoother and more natural, a transition effect can be added at the segmentation point between the two video clips. The filter 710c may provide a variety of filters for selection by the user. By changing the filter, the color tone of the first video can be changed, such as a cool tone, a warm tone, a black-and-white tone, and the like. The music 710d may provide a plurality of pieces of music for selection by the user, to add background music to the first video. The text 710e may provide a variety of text modules for selection by the user, and a text module may include the font, the color of the text, the arrangement of the text, and the like. The special effect 710f may provide a variety of special effects for selection by the user.
It can be seen that the first video edited by the user may be a video shot by the user using the mobile phone. Optionally, the user may also select the first video from the gallery and edit it. Illustratively, as shown in fig. 10 (a), the mobile phone may display an interface 801 of the gallery application. The interface 801 includes previews of media files, which may include videos and pictures. For example, the interface 801 includes a preview 802 of the first video. After detecting that the user clicks the preview 802 of the first video, the mobile phone may display a video playing window 803, where the video playing window 803 is used to play the first video. As shown in (b) of fig. 10, the video playing window 803 further includes an edit button 803a, a share button, a favorites button, a delete button, and the like. After detecting the user's operation on the edit button 803a, the mobile phone can display an interface 804 as shown in fig. 10 (c). The interface 804 shown in fig. 10 (c) is similar to the interface 709 shown in fig. 9 (b), and will not be described again.
After the cell phone displays the editing interface (e.g., interface 804, interface 709), the user can edit the first video on the editing interface. In an alternative embodiment, when the handset displays an editing interface in response to a user operation of editing buttons (e.g., editing button 708a, editing button 803 a), clip 710b of the plurality of editing items is in a selected state by default. If the user wishes to perform other editing operations, a selection may be made from among a plurality of editing items.
Illustratively, as shown in FIG. 11, the mobile phone can display an interface 1101. The interface 1101 is similar to the interface 709 shown in FIG. 9 (b); both include a preview area 1101a, a progress bar 1101b, a cursor 1101c, and the editing options 710. The preview area 1101a is used to preview the first video during editing, so that the user can view the editing effect in time. The progress bar 1101b corresponds to the first video and indicates the full duration of the first video. The cursor 1101c is used to indicate the current playing progress of the first video. When the first video has not been played, the starting position of the progress bar 1101b coincides with the cursor 1101c; after the first video starts to be played, the cursor 1101c moves from left to right along the progress bar 1101b, so that the video frame corresponding to the cursor 1101c changes continuously. In addition, when the mobile phone detects that the user drags the progress bar 1101b to the left, a picture after the current picture is played in the preview area 1101a; when the mobile phone detects that the user drags the progress bar 1101b to the right, a picture before the current picture is played in the preview area 1101a. The editing options 710 are used to provide a plurality of editing items for selection by the user. The plurality of editing items may include: theme 710a, clip 710b, filter 710c, music 710d, text 710e, and special effect 710f, where the clip 710b is in the selected state.
If the user wishes to add a filter to the first video, the user can click (or touch) the filter 710c. After the mobile phone detects the user's operation on the filter 710c, on the one hand, the mobile phone may add a filter to the video clip where the cursor is located; the filter added by the mobile phone may be the filter selected by the user last time, the first-ranked filter among the plurality of filters, or a filter specified in another manner, which is not limited herein. On the other hand, the mobile phone may display an interface 1102. As shown in fig. 12, the interface 1102 includes the preview area 1101a and a window 1103. The window 1103 includes a material center icon 1104 and filter classification options 1105. Through the material center icon 1104, the mobile phone can enter a material center, which is used to provide multiple types of material for selection by the user. The filter classification options 1105 are used to provide multiple filter classifications for selection by the user. The plurality of filter classifications may include: recent, classical, clap, black and white, architectural, landscape, etc., and each filter classification may include multiple filters. In the embodiment shown in fig. 12, the filter 1 under the "recent" filter classification is selected, and the preview area 1101a automatically plays the first video to which the filter 1 is added. It should be noted that, in embodiments of the present application, more or fewer filter classifications and filters than those shown in fig. 12 may be provided, and the selected filter may be another preset filter, which is not limited herein. In addition, when the mobile phone detects the user's operation on another editing item (such as the theme 710a, the music 710d, the text 710e, or the special effect 710f), the mobile phone may also display a window corresponding to that editing item, where the window includes material of the corresponding type.
For example, when the handset detects a user's operation on the theme 710a, a window providing a variety of themes may be displayed.
Optionally, if the user uses VIP material without logging in to an account, or if the user has logged in to an account but the account has not opened a membership, the interface 1102 further includes a banner 1102a. The banner 1102a is used to prompt the user that the currently selected material is VIP material, and a VIP membership needs to be opened before it can be used normally, for example: "VIP material is in use; click to open a VIP membership and unlock the material". When the mobile phone detects a user operation on the banner 1102a, an interface 1201 may be displayed. As shown in fig. 13, the interface 1201 may include an account login area 1202, a material preview area 1203, and a membership payment area 1204. The account login area 1202 is used for the user to log in to an account (e.g., an Honor account); after the mobile phone detects the user's operation on the account login area 1202, an interface for entering an account and a password can be displayed for the user to log in to the account. The material preview area 1203 may include thumbnails of various material types and a material center icon 1203a, and the user may cause the mobile phone to display an interface of the material center (e.g., interface 1106) by operating the material center icon 1203a, so that the user can quickly view the member benefits. The membership payment area 1204 may include information about the price of opening a membership (also referred to as VIP), the corresponding term, the payment manner, and the like, through which the user can open or renew a membership for the logged-in account.
In the case where the mobile phone has logged in to an account and the account has opened a membership, the mobile phone may display an interface 1205 as shown in fig. 14 (a). The interface 1205 shown in (a) of fig. 14 is similar to the interface 1201 shown in fig. 13, except that in the interface 1205 the mobile phone has logged in to the account (e.g., "flowers") and a membership has been opened (membership due date 2022/12/31). Moreover, the material preview area 1203 displays prompt information 1203b and a material center icon 1203c. The prompt 1203b is, for example, "all materials have been unlocked", to prompt the user that the material library is available for use. The material center icon changes from the form indicating that no account is logged in or no membership is opened (i.e., material center icon 1203a) to the form indicating that a membership is opened (i.e., material center icon 1203c). Alternatively, if the mobile phone detects, after logging in to the account, that the membership of the account has expired, the mobile phone may display the interface 1206 shown in (b) of fig. 14, where the material center icon changes from the form indicating that no account is logged in or no membership is opened (i.e., material center icon 1203a) to the form indicating that the membership has expired (i.e., material center icon 1203d). The membership payment area 1204 includes information prompting the user to renew the membership.
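The three forms of the material center icon described above depend only on the login and membership state. A minimal sketch of that state mapping, with the function name and state flags assumed for illustration (only the icon reference numerals come from the description):

```python
def material_center_icon(logged_in, member_opened, member_expired):
    """Return which material center icon form to show, per the three
    membership states described above. Function and parameter names are
    illustrative; only the icon numerals (1203a/1203c/1203d) come from
    the description."""
    if not logged_in or not member_opened:
        return "1203a"   # no account logged in, or no membership opened
    if member_expired:
        return "1203d"   # membership has expired
    return "1203c"       # membership is active

icon = material_center_icon(True, True, False)
```

The checks are ordered so that the "not logged in / no membership" form takes priority, matching the description in which the expired form only appears after an account with a membership has logged in.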
Optionally, the mobile phone may display a guide bubble for prompting the user to enter the material center to preview the material when the user's operation of selecting an editing item (e.g., the filter 710c) is detected for the first time. Illustratively, as shown in fig. 15, the window 1103 further includes a guide bubble 1103a, where the guide bubble 1103a includes content such as "tap to enter the material center and preview all the material". In the case where the mobile phone displays the guide bubble 1103a, the mobile phone may hide the guide bubble 1103a after detecting an operation of the user clicking an arbitrary position on the interface 1101.
Optionally, if the mobile phone displays a window that includes the material center icon (e.g., the window 1103) after a preset time has elapsed since the guide bubble was first displayed, the mobile phone may display the guide bubble on the window again. That is, assuming that the mobile phone never receives an operation of entering the material center (for example, a click on the material center icon 1104) after the guide bubble 1103a is displayed for the first time, if the mobile phone receives an operation of displaying the window 1103, or a window corresponding to another editing item (for example, a window 1302 in (b) of fig. 24), again one month after displaying the guide bubble 1103a, the mobile phone can display the guide bubble 1103a again.
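The re-display condition described above (the user has never entered the material center, and at least a preset time has elapsed since the bubble was first shown) can be sketched as follows; the one-month interval and all names are assumptions for illustration:

```python
import datetime

# Assumed preset interval; the description uses "one month" as its example.
PRESET_INTERVAL = datetime.timedelta(days=30)

def should_show_guide_bubble(first_shown, entered_material_center, now):
    """Decide whether to show the guide bubble when a window containing
    the material center icon is displayed. Names are illustrative."""
    if first_shown is None:
        return True       # first time the editing item is selected: show it
    if entered_material_center:
        return False      # the user has already found the material center
    # Re-display only after the preset interval has elapsed.
    return now - first_shown >= PRESET_INTERVAL
```

This keeps the guide from reappearing on every window while still reminding a user who, a month later, has still never opened the material center.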
Optionally, the mobile phone may also display the guide bubble 1103a after the user has opened a membership but has not yet entered the material center.
After the mobile phone displays the window 1103, the user can directly select a filter in the window 1103. Considering that the filters selectable in the window 1103 are limited, the user may also enter the material center through the material center icon 1104 and select a filter in the material center. Illustratively, as shown in fig. 16, the mobile phone may receive a user click on the material center icon 1104, and in response to this operation, the mobile phone may display an interface 1106. As shown in fig. 17, the interface 1106 is the interface of the material center. The interface 1106 includes material type options 1107. The material type options 1107 are used to provide multiple material types for selection by the user. The plurality of material types may include: theme 1107a, caption 1107b, filter 1107c, transition 1107d, special effect 1107e, music 1107f, and background 1107g.
It should be noted that, when the mobile phone jumps to the material center (e.g., interface 1106) through a window that includes material of a first material type, the first material type is in the selected state in the material center. For example, if the window 1103 includes filters, then after the mobile phone jumps from the window 1103 to the interface 1106 (i.e., the interface of the material center), the corresponding material type (i.e., the filter 1107c) is in the selected state.
As shown in fig. 17, the interface 1106 also includes a personal center button 1106a and a login button 1106b. The personal center button 1106a is in the form of an avatar; if no account is logged in on the mobile phone, the avatar is a default avatar, and if an account is logged in on the mobile phone, the avatar may be the avatar of the account currently logged in on the mobile phone. The login button 1106b is used to prompt the user to log in, for example, "VIP benefits can be synchronized after login". After detecting the user's operation on the login button 1106b, the mobile phone may display the interface 1201 shown in fig. 13, and log in or open a membership according to the method shown in fig. 13-14.
As also shown in fig. 17, the interface 1106 also includes a plurality of filter classifications (which may also be referred to as filter tabs). The plurality of filter classifications may include: recommended, classical, clap, black and white, architectural, landscape, etc. Each filter classification includes multiple filters of the same or similar style; for example, the "black and white" filter classification includes black and white 1 to black and white 6, and the "recent" classification includes filters 1 to 6. The interface 1106 also includes a use button 1106c for each filter classification, and the user can cause the mobile phone to preview the plurality of filters in that filter classification by operating the use button 1106c. Optionally, if a filter can only be used by a particular type of user (e.g., VIP users), the mobile phone may add a VIP identifier 1106d to the corresponding filter.
As shown in fig. 18 (a), the user can select a desired filter (for example, black and white 3) from among a plurality of filters. After the handset detects the user selection of black and white 3, window 1108 may be displayed. As shown in fig. 18 (b), the window 1108 includes a filter preview area 1108a, a filter selection area 1108b, and a use filter button 1108c. The filter preview area 1108a is used to preview the filter selected by the user (i.e., black and white 3); the filter selection area 1108b is used to show a plurality of filters including a filter selected by the user (i.e., black and white 3) and two filters adjacent to the filter selected by the user (i.e., black and white 2 and black and white 4), and the filter selected by the user is in a selected state and is located in the middle of the plurality of filters. After the mobile phone detects the user's operation of using the filter button 1108c, an interface 1109 may be displayed. As shown in fig. 19, the interface 1109 is similar to the interface 1102 with the difference that: the filter black and white 3 in the window 1109a is in a selected state, and the filter class "black and white" to which the black and white 3 belongs is also in a selected state; the image displayed in the preview area 1109b is changed from the image frame 1 to which the filter 1 is added in fig. 12 to the image frame 1 to which the filter "black and white 3" is added in fig. 19.
Optionally, after the mobile phone jumps from the material center (e.g., interface 1106) to the editing interface (e.g., interface 1109), the mobile phone may automatically select the editing item corresponding to the selected material type. That is, when a certain material type in the material center is selected, the corresponding editing item is in the selected state after the mobile phone jumps to the editing interface. For example, in interface 1106, filter 1107c is in a selected state, and after the jump from interface 1106 to interface 1109, filter 710c is also in a selected state.
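The correspondence described above can be modeled as a simple lookup. The following is a minimal Python sketch; the table entries use the reference numerals from the figures, and the string keys and function name are illustrative assumptions, not identifiers from the patent.

```python
# Hypothetical mapping from a material type in the material center to the
# editing item that should be auto-selected after jumping back to the
# editing interface (entries follow the figures' reference numerals).
MATERIAL_TO_EDIT_ITEM = {
    "filter 1107c": "filter 710c",
    "special effect 1107e": "special effect 710f",
    "music 1107f": "music 710d",
}

def selected_edit_item(material_type):
    # Return the editing item to auto-select, or None for an unknown type.
    return MATERIAL_TO_EDIT_ITEM.get(material_type)
```

For example, `selected_edit_item("filter 1107c")` yields `"filter 710c"`, matching the jump from interface 1106 to interface 1109 described above.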
Optionally, after the mobile phone jumps from the material center to the editing interface, the mobile phone can also automatically locate the image frame corresponding to the cursor (which can be understood as the currently played image frame), add the filter selected by the user to that image frame, and display the image frame with the filter added in the preview area for the user to view. The mobile phone can also arrange the filters according to a centering principle. The centering principle means that the mobile phone may display a plurality of filters, including the filter selected by the user (which may also be referred to as the target filter), a preset number of filters preceding it, and a preset number of filters following it, with the filter selected by the user located in the middle of the plurality of filters. In addition, the centering principle also indicates that the filter classification to which the user-selected filter belongs should be located in the middle of the multiple filter classifications displayed by the mobile phone. For example, as shown in fig. 19, the cell phone may display a plurality of filters in window 1109a, including the user-selected filter (i.e., black and white 3), the 3 filters preceding black and white 3 (i.e., filter 6, black and white 1, black and white 2), and the 3 filters following black and white 3 (i.e., black and white 4, black and white 5, black and white 6). The filter selected by the user (i.e., black and white 3) is in a selected state and is positioned in the middle of the plurality of filters. The window 1109a includes a plurality of filter classifications, namely: recent, classical, black and white, clap, architectural, and landscape; the filter classification (i.e., black and white) to which the user-selected filter (i.e., black and white 3) belongs is in the selected state and is located in the middle of the multiple filter classifications.
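The centering principle above can be sketched as a window over the ordered filter list. The following is a minimal Python sketch under the assumption that filters are held in a flat list and the preset number of neighbors on each side is 3; the function name and clamping behavior at the list edges are illustrative assumptions, not details from the patent.

```python
# Hypothetical sketch of the "centering principle": return a window of
# 2*k + 1 filters with the user-selected filter in the middle whenever
# the list allows it, clamping the window at the list boundaries.
def center_window(filters, selected, k=3):
    i = filters.index(selected)
    # Keep the window inside the list while preserving its size when possible.
    start = max(0, min(i - k, len(filters) - (2 * k + 1)))
    end = min(len(filters), start + 2 * k + 1)
    return filters[start:end]
```

With the filter order from fig. 19, selecting "black and white 3" produces the window filter 6, black and white 1, black and white 2, black and white 3, black and white 4, black and white 5, black and white 6, with the selection at the middle position.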
As shown in fig. 19, the window 1109a further includes a cancel button 1109c, a confirm button 1109d, and an apply button 1109e; the cancel button 1109c is located at the upper left corner of the window 1109a, the confirm button 1109d at the upper right corner, and the apply button 1109e at the bottom, centered. After the mobile phone detects the user's operation on the cancel button 1109c as shown in fig. 20 (a), the first video with no filter added may be displayed in the preview area 1109b as shown in fig. 20 (b). After the mobile phone detects the user's operation on the confirm button 1109d as shown in fig. 21 (a), the operation of adding the filter black and white 3 can be saved, and as shown in fig. 21 (b), the first video with the filter black and white 3 added is displayed in the preview area 1109b. If the user determines to add the filter black and white 3 to the first video, the apply button 1109e may be clicked (or touched). After detecting the user's operation on the apply button 1109e, the mobile phone may add the filter black and white 3 to the first video.
The user may also select a filter classification (e.g., black and white). For example, as shown in fig. 22, if the user wishes to use the black-and-white filter classification, the user may click (or touch) the use button 1106c corresponding to that classification. After the handset detects the user's operation on the use button 1106c, an interface 1110 as shown in fig. 23 may be displayed. The interface 1110 shown in fig. 23 is similar to the interface 1109 shown in fig. 19, except that: interface 1110 includes a window 1110a in which the cell phone displays a plurality of filters, including the first filter in the black-and-white filter classification (i.e., black and white 1), the 3 filters preceding black and white 1 (i.e., filter 4, filter 5, filter 6), and the 3 filters following black and white 1 (i.e., black and white 2, black and white 3, black and white 4). The first filter in the black-and-white filter classification (i.e., black and white 1) is in a selected state and is located in the middle of the plurality of filters. It can be seen that, when the user selects a certain filter classification, the mobile phone selects the first filter in that classification by default and arranges the filters according to the centering principle.
In the media-resource editing method provided by this application, materials of similar styles are therefore arranged adjacently, so that while selecting a material the user can quickly find other similar materials of interest, which saves the user's time and improves the user experience.
The foregoing describes a process of adding a filter to the first video by the user, and a process of adding a theme and a background to the first video is similar to a process of adding a filter to the first video, which is not described herein. In addition, the user can also perform other editing operations on the first video, and the process of adding special effects to the first video by the mobile phone will be described below with reference to the accompanying drawings.
In an alternative embodiment, the user may add special effects to the first video after completing another editing operation (e.g., adding a filter). Illustratively, as shown in fig. 24 (a), the handset may display an interface 1101, the interface 1101 including an editing option 710 that provides a plurality of editing items for selection by the user. The plurality of editing items may include: theme 710a, clip 710b, filter 710c, music 710d, text 710e, and special effects 710f, with clip 710b in the selected state. If the user wishes to add a special effect to the first video, the special effects 710f may be clicked (or touched). After the mobile phone detects the user's operation on the special effects 710f, the interface 1301 may be displayed. As shown in fig. 24 (b), the interface 1301 includes a preview area 1301a and a window 1302. The window 1302 includes a material center icon 1302a and a special effect type option 1302b. Through the material center icon 1302a, the phone can enter the material center, which provides multiple types of material for selection by the user. The special effect type option 1302b provides multiple special effect classifications for selection by the user. The plurality of special effect classifications may include: recommendation, popular, jitter, motion, variety, borders, etc., and each classification may include a plurality of special effects. In fig. 24 (b), the popular classification and special effect 1 under it are in the selected state. It should be noted that, in the embodiments of the present application, more or fewer special effects than those in fig. 24 (b) may be provided, and the default selected special effect may be another preset special effect, which is not limited herein.
While the special effects 710f is in the selected state, the user may select a special effect directly in the special effect type option 1302b. Alternatively, the user may enter the material center through the material center icon 1302a and select a special effect there. Illustratively, the handset may receive the user's click on the material center icon 1302a, and in response, the handset may display the interface 1303. As shown in fig. 24 (c), the interface 1303 is an interface of the material center and includes a material type option 1107, which provides multiple material types for selection by the user. The plurality of material types may include: theme 1107a, subtitle 1107b, filter 1107c, transition 1107d, special effects 1107e, music 1107f, and background 1107g, with the special effects 1107e in a selected state. The interface 1303 further includes a plurality of special effects corresponding to the special effects 1107e, for example, dynamic 1 to dynamic 6 under the recommended classification, special effects 1 to 6 under the popular classification, and jitter 1 to jitter 3 under the jitter classification. As shown in fig. 24 (c), the interface 1303 further includes a use button 1303a corresponding to each special effect classification, and by operating a use button 1303a the user can cause the mobile phone to preview the plurality of special effects under the corresponding classification.
In another alternative embodiment, the user may also select special effects 1107e directly through the material type option 1107 in the material center. Illustratively, as shown in fig. 25 (a), the handset may display an interface 1401. The interface 1401 shown in fig. 25 (a) is similar to the interface 1106 shown in fig. 15, in which filter 1107c is in a selected state. If the user wishes to add a special effect to the first video, the special effects 1107e can be clicked (or touched). After detecting the user's operation on the special effects 1107e, the mobile phone can display an interface 1402 shown in fig. 25 (b). The interface 1402 shown in fig. 25 (b) is similar to the interface 1303 shown in fig. 24 (c), and will not be described again.
Therefore, when the user is selecting a material corresponding to a certain editing item (e.g., a filter) in the material center and needs to edit another kind of material, the user can directly select another material type in the material center and pick the material of the corresponding type there, without returning to the editing interface and reselecting another editing item. This simplifies user operation and improves the user experience.
The user can select a desired special effect (e.g., special effect 2) from among the plurality of special effects provided by interface 1402 (or interface 1303). As shown in fig. 26 (a), the mobile phone may receive the user's operation of selecting special effect 2, and in response, on the one hand, the mobile phone may add special effect 2 to the first video. Specifically, the mobile phone can determine whether a special effect has been added to image frame 1, the frame corresponding to the current position of the cursor. If no special effect was added to image frame 1, the mobile phone can add special effect 2 with image frame 1 as the starting position; if a special effect was already added to image frame 1, the mobile phone can delete the original special effect and add special effect 2 with image frame 1 as the starting position. On the other hand, the mobile phone may display an interface 1403 as shown in fig. 26 (b). The interface 1403 includes a preview area 1403a and a window 1404. The preview area 1403a is used to cyclically play the first video with special effect 2 added, so that the user can view the effect. The window 1404 can also display thumbnails of a plurality of special effects, including the special effects under the classification to which special effect 2 belongs (i.e., popular) and the special effects under another adjacent classification (i.e., jitter). The special effect selected by the user (i.e., special effect 2) and the classification to which it belongs (i.e., popular) are in the selected state.
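The replace-at-cursor rule described above can be sketched compactly. The following is a minimal Python sketch under the assumption that the video's effects are modeled as a mapping from starting frame to effect name; the two branches (no effect yet vs. replace the original effect) are written out explicitly to mirror the text, and the data model is illustrative, not from the patent.

```python
# Hypothetical sketch of selecting a new special effect: any effect that
# already starts at the cursor's frame is deleted, and the new effect is
# added with that frame as the starting position.
def apply_effect(effects, cursor_frame, new_effect):
    effects = dict(effects)              # effects: start frame -> effect name
    if cursor_frame in effects:
        del effects[cursor_frame]        # delete the original effect first
    effects[cursor_frame] = new_effect   # add the new effect at the cursor
    return effects
```

For example, if image frame 1 already carries "effect 1", applying "effect 2" at frame 1 leaves only "effect 2" starting there.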
As shown in fig. 26 (b), the window 1404 further includes a cancel button 1404a and a confirm button 1404b. If the user wishes to save special effect 2, the confirm button 1404b can be clicked (or touched) as shown in fig. 27 (a). After detecting the user's operation on the confirm button 1404b, the mobile phone determines whether the image frame where the cursor is located (i.e., image frame 1) already had a special effect added. If no special effect was added to the image frame where the cursor is located, the mobile phone displays an interface 1405 after detecting the operation on the confirm button 1404b. As shown in fig. 27 (b), the interface 1405 includes an editing option 710, a preview area 1405a, and a special-effects progress bar 1405b, with special effects 710f in the editing option 710 in the selected state. Picture A, which is image frame 1 with special effect 2 added, is displayed in preview area 1405a. The special-effects progress bar 1405b starts at the cursor position and extends backward by the duration of special effect 2. If the cursor is positioned in an image frame that already had a special effect, the mobile phone can display the interface 1406 shown in fig. 27 (c) after detecting the operation on the confirm button 1404b. This interface 1406 is similar to interface 1405, except that: the preview area 1406a in interface 1406 displays picture B, which is the image frame of the first video at a preset progress after special effect 2 is added; the preset progress may be the playing progress of the first video when the user clicks the confirm button 1404b.
If the user does not wish to save special effect 2, the cancel button 1404a can be clicked (or touched) as shown in fig. 28 (a). After detecting the user's operation on the cancel button 1404a, the mobile phone determines whether the image frame where the cursor is located (i.e., image frame 1) already had a special effect added. If so, the mobile phone can display the interface 1407 after detecting the operation on the cancel button 1404a. As shown in fig. 28 (b), the interface 1407 includes an editing option 710, a preview area 1407a, and a special-effects progress bar 1407b, with special effects 710f in the editing option 710 in the selected state. Picture C is displayed in the preview area 1407a; picture C is image frame 2 of the first video at a preset playing progress after switching back to the original special effect, where the preset progress may be the playing progress of the first video when the user clicks the cancel button 1404a. If no special effect was added to the image frame where the cursor is located, the mobile phone displays an interface 1408 shown in fig. 28 (c) after detecting the user's operation on the cancel button 1404a. The interface 1408 is similar to interface 1407, except that: the special-effects progress bar is not included in interface 1408, indicating that no special effect exists in the first video.
Optionally, the user may also select one special effect classification (e.g., popular) from the plurality of classifications. For example, as shown in fig. 29 (a), if the user wishes to use the popular classification, the user can click (or touch) the use button 1303a corresponding to it. After detecting the user's operation on the use button 1303a, on the one hand, the mobile phone may add the first special effect under the popular classification (i.e., special effect 1) to the first video. Specifically, the mobile phone can determine whether a special effect has been added to image frame 1, the frame corresponding to the current position of the cursor. If not, the mobile phone can add special effect 1 with image frame 1 as the starting position; if a special effect was already added, the mobile phone can delete the original special effect and add special effect 1 with image frame 1 as the starting position. On the other hand, the mobile phone may display an interface 1409 as shown in fig. 29 (b). This interface 1409 is similar to the interface 1403 in fig. 26 (b), except that: the interface 1409 includes a preview area 1409a and a window 1409b. The preview area 1409a is used to cyclically play the first video with special effect 1 added. The plurality of special effects displayed in window 1409b includes all special effects under the popular classification and the special effects under another adjacent classification (i.e., jitter). The classification selected by the user (i.e., popular) and the first special effect under it (i.e., special effect 1) are in the selected state.
It should be noted that, the process of adding text to the first video is similar to the process of adding special effects to the first video, and will not be described in detail herein.
The process of adding transition to the first video by the mobile phone will be described with reference to the accompanying drawings.
As shown in fig. 30, the handset may display an interface 1501. The interface 1501 is a material center interface; for the process of displaying it, refer to the process of displaying the interface 1106. The interface 1501 is similar to the interface 1106 shown in fig. 15 and is not described again. If the user wishes to add a transition to the first video, the transition 1107d in the material type option 1107 may be clicked. After the handset detects the user's operation on transition 1107d, an interface 1502 shown in fig. 31 may be displayed. The interface 1502 is similar to the interface 1501 shown in fig. 30, except that: in the interface 1502, the transition 1107d is in a selected state. The interface 1502 also includes a plurality of transition classifications and the transitions included in each classification, which may include, for example, transitions 1 to 6 under the recommended classification; pull-in, push, slide, overlay, blur, and black under the classical classification; and blinds 1 to 3 under the blinds classification. As shown in fig. 31, the interface 1502 further includes a use button 1502a corresponding to each transition classification, and by operating a use button 1502a the user can cause the mobile phone to preview the plurality of transitions under the corresponding classification.
The user may select a transition classification (e.g., classical) from the plurality of transition classifications provided by interface 1502. For example, as shown in fig. 32 (a), if the user wishes to use the classical transition classification, the use button 1502a corresponding to classical may be clicked (or touched). After detecting the user's operation on the use button 1502a, the mobile phone can determine whether the first video has a split point. If the mobile phone determines that the first video has no split point, the mobile phone may display an interface 1503 shown in fig. 32 (b). This interface 1503 is similar to the interface 1101 shown in fig. 11, except that: the interface 1503 further includes a prompt 1503a indicating that the mobile phone cannot add a transition to the first video, for example, "no transition can be added". In addition, no split point exists in the progress bar 1503b of this interface 1503, which indicates that only one video segment is included. If the mobile phone determines that the first video has a split point, on the one hand, the mobile phone can determine the target split point and add the target transition between the two video segments separated by the target split point. The target split point may be the next split point after the video segment where the cursor is located. For example, if the first video includes video segment 1, video segment 2, and video segment 3, with split point 1 between segments 1 and 2 and split point 2 between segments 2 and 3, and the cursor is in the part of the progress bar where video segment 2 is located, the target split point is the next split point adjacent to video segment 2 (i.e., split point 2).
If the video segment where the cursor is located is the last video segment of the first video, the target split point is the previous split point of that video segment. For example, if the first video includes video segment 1 and video segment 2, with split point 1 between them, and the cursor is in the part of the progress bar where video segment 2 is located, the target split point is the previous split point adjacent to video segment 2 (i.e., split point 1). The target transition is the transition selected by the user, or the first transition in the transition classification selected by the user. On the other hand, the handset may display the interface 1504. As shown in fig. 32 (c), the interface 1504 includes a preview area 1504a, a split point 1504c, and a window 1505; the preview area 1504a is used to play the first video after the target transition (i.e., pull-in) is added. Specifically, the interface 1504 further includes a play button 1504b, and after the mobile phone detects the user's operation on the play button 1504b, it can play the first video with the target transition (i.e., pull-in) added in the preview area 1504a. The split point 1504c indicates the split point between the two video segments. As shown in fig. 32 (c), the window 1505 includes a plurality of transitions, including all transitions under the classical classification and the transitions included in another adjacent classification (e.g., blinds). The transition classification selected by the user (i.e., classical) and the first transition under it (i.e., pull-in) are in the selected state.
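The target-split-point rule above can be sketched as a small function. The following is a minimal Python sketch under the assumption that segments are numbered 0 to n-1 and split point i sits between segments i and i+1; the `None` return corresponds to the "no transition can be added" case, and the function name is an illustrative assumption.

```python
# Hypothetical sketch of choosing the target split point: use the next split
# point after the cursor's segment, unless that segment is the last one, in
# which case use the previous split point. A single-segment video has none.
def target_split_point(num_segments, cursor_segment):
    if num_segments < 2:
        return None                       # no split point: transition impossible
    if cursor_segment == num_segments - 1:
        return cursor_segment - 1         # last segment: previous split point
    return cursor_segment                 # otherwise: next split point after it
```

In the three-segment example above, a cursor in segment 2 (index 1) selects split point 2 (index 1); in the two-segment example, a cursor in the last segment selects split point 1 (index 0).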
As shown in (c) of fig. 32, the window 1505 further includes a cancel button 1505a and a confirm button 1505b. After detecting the operation of the cancel button 1505a by the user, the mobile phone can cancel the transition of the adding target; after detecting the operation of the confirm button 1505b by the user, the mobile phone may save the operation of adding the target transition to the first video.
Alternatively, the user may also select a specific transition from the plurality of transitions. When the mobile phone detects that the user selects a specific transition, the procedure executed by the mobile phone is the same as that executed when the user selects a transition classification (as shown in fig. 32), except that the mobile phone directly uses the transition selected by the user as the target transition.
The process of adding music to the first video by the mobile phone will be described with reference to the accompanying drawings.
As shown in fig. 33, the handset may display an interface 1601. The interface 1601 is a material center interface; for the process of displaying it, refer to the process of displaying the interface 1106. The interface 1601 is similar to the interface 1106 shown in fig. 15 and is not described again. If the user wishes to add music to the first video, the user may click music 1107f in the material type option 1107. After the handset detects the user's operation on the music 1107f, the interface 1602 may be displayed. As shown in fig. 34, music 1107f in the interface 1602 is in a selected state. The interface 1602 also includes a plurality of music classifications and the pieces of music included in each classification, which may include, for example, music 1 to music 3 under the recommendation classification, light music 1 to light music 3 under the light music classification, and cheerful music 1 under the cheerful music classification. As shown in fig. 34, the interface 1602 further includes a use button 1602a corresponding to each piece of music, and by operating a use button 1602a the user can cause the mobile phone to add the corresponding piece of music to the first video.
Illustratively, the user may select music 2 as the target music. After detecting the user's operation on the use button 1602a of music 2, the mobile phone can determine whether a material of the music type has already been added to the first video. If so, the mobile phone may display the interface 1603. As shown in fig. 35 (a), the interface 1603 includes a preview area 1603a and a pop-up window 1604. The preview area 1603a is used to preview the first video. The pop-up window 1604 includes a prompt 1604a, a cancel button 1604b, and a continue button 1604c. The prompt 1604a asks the user to confirm whether to replace the added music, for example, "The added music will be replaced. Continue?". If the handset detects the user's operation on the cancel button 1604b, the handset does not replace the original music with the target music and displays the interface 1605. As shown in fig. 35 (b), the interface 1605 includes an editing option 710, a preview area 1605a, and an audio progress bar 1605b. The music 710d in the editing option 710 is in the selected state to indicate that the mobile phone is currently editing the music material. The preview area 1605a is used to preview the first video. The audio progress bar 1605b is the progress bar corresponding to the original music (e.g., music 1). As shown in fig. 36 (a), if the handset detects the user's operation on the continue button 1604c, the handset may display an interface 1606 as shown in fig. 36 (b). The interface 1606 is similar to the interface 1605 shown in fig. 35, except that a window 1607 is included in the interface 1606. The window 1607 includes an audio bar 1607a, a volume bar 1607b, and the like. The audio bar 1607a allows the user to select an audio clip from the target music, and the volume bar 1607b allows the user to adjust the volume of the target music.
The window 1607 also includes a cancel button 1607c and a confirm button 1607d. When the mobile phone detects the user's operation on the cancel button 1607c, the mobile phone may discard the user's setting parameters for the target music (e.g., audio clip, volume, etc.) and display the interface 1608. As shown in fig. 36 (c), the interface 1608 is similar to the interface 1605 shown in fig. 35 (b), except that the interface 1608 includes the progress bar corresponding to the target music (i.e., music 2). When the handset detects the user's operation on the confirm button 1607d, the handset can save the user's setting parameters for the target music and display the interface 1608. Note that the cancel button 1607c only cancels the setting parameters; it does not cancel the operation of adding the target music to the first video.
If the mobile phone determines that no material of the music type has been added to the first video, the mobile phone may directly display the interface 1606 shown in fig. 36 (b), so that the user can set the volume, audio clip, playing effect (such as fade-in and fade-out), and the like of the target music; for details, refer to the interfaces shown in fig. 36 (b) to (c) and the related descriptions, which are not repeated herein.
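The music-adding branch described above can be sketched as a small decision function. The following is a minimal Python sketch; the interface-name strings follow the figures' reference numerals, and the function name and parameters are illustrative assumptions, not from the patent.

```python
# Hypothetical sketch of the music-adding flow: if music was already added,
# pop-up 1604 asks for confirmation before the settings window opens;
# otherwise the settings window (interface 1606) opens directly.
def on_use_music(has_music, user_confirms_replace=None):
    if not has_music:
        return "interface 1606"          # no music yet: open settings directly
    if user_confirms_replace:            # pop-up 1604: user taps "continue"
        return "interface 1606"
    return "interface 1605"              # user taps "cancel": keep original music
```

This mirrors the three outcomes in the text: no prior music goes straight to interface 1606, confirming the replacement also reaches interface 1606, and cancelling keeps the original music and shows interface 1605.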
It should be noted that, the interface diagrams shown in the embodiments of the present application are examples, and the interface may further include more or less icons, buttons, controls, options, and the like than those in the diagrams, which are not described in detail herein.
The embodiment of the application also provides an editing method of the media resource, which can be applied to the electronic equipment shown in fig. 5. The method comprises the following steps:
S1. The electronic device displays a first interface, where the first interface includes a first area and a plurality of editing items, and the first area displays a first media resource.
By way of example, the first interface may be interface 709 shown in fig. 9 (b), interface 804 shown in fig. 10 (c), etc., the first area may be preview area 709a shown in fig. 9 (b), etc., and the plurality of editing items may be subject 710a, clip 710b, filter 710c, music 710d, text 710e, special effects 710f, etc. shown in fig. 9 (b).
S2. In response to the user's operation of selecting a first editing item from the plurality of editing items, the electronic device displays a first window on the first interface, where the first window includes a first icon.
The first editing item may be any one of a theme 710a, a filter 710c, music 710d, text 710e, and a special effect 710 f. The first window is, for example, window 1103 in fig. 12, and the first icon is, for example, the material center icon 1104 in fig. 12.
S3. In response to the user's operation on the first icon, the electronic device displays a second interface, where the second interface includes a plurality of material type options and materials of a first type, and the first editing item corresponds to the materials of the first type.
By way of example, the second interface may be interface 1106 in fig. 17, fig. 22, interface 1303 shown in (c) in fig. 24, interface 1502 shown in fig. 31, interface 1602 shown in fig. 34, and the like. The plurality of material type options are, for example, a theme 1107a, a subtitle 1107b, a filter 1107c, a transition 1107d, a special effect 1107e, music 1107f, a background 1107g, and the like.
S4. In response to the user's operation of selecting a first material type, the electronic device displays a third interface, where the plurality of material type options include the first material type, the third interface includes a plurality of materials of a second type, and the plurality of materials of the second type include a first material.
The first material type may be any one of the theme 1107a, the subtitle 1107b, the filter 1107c, the transition 1107d, the special effect 1107e, the music 1107f, the background 1107g, and the like. The third interface may be the interface 1106 in fig. 17 or fig. 22, the interface 1303 shown in fig. 24 (c), the interface 1502 shown in fig. 31, the interface 1602 shown in fig. 34, or the like.
S5. In response to the user's operation of selecting the first material, the electronic device displays a fourth interface, where the fourth interface includes a second area, and the second area displays the first media resource with the first material added.
Illustratively, the first material may be any one of the filters of fig. 18, such as "black and white 3"; any one of the effects shown in fig. 26 (a), for example, "effect 2", may be used; any of the transitions of fig. 31 may be used; or any piece of music in fig. 34, such as music 2.
The fourth interface may be the interface 1109 in fig. 19, the interface 1403 shown in (b) in fig. 26, the interface 1504 in fig. 32, the interface 1606 shown in (b) in fig. 36, or the like. The second region may be a region of the corresponding interface that previews the media asset.
In an optional embodiment, the fourth interface further includes a second window, where the second window includes a first control and a second control. If the first material type is a filter, a theme, a background, or a transition, the method further includes: in response to the user's operation on the first control, the electronic device redisplays the first interface; and in response to the user's operation on the second control, the electronic device redisplays the first interface and displays the first media resource with the first material added in the first area.
Illustratively, when the first material type is a filter, the second window may be window 1103 in fig. 12. When the first material type is transition, the second window may be window 1505 shown in (c) of fig. 32. When the first material type is a theme or background, the second window is similar to the window 1103 or 1505, except that the materials contained in the window are different. The first control may be a cancel control in the corresponding window, such as cancel button 1505a; the second control may be a confirmation control in the corresponding window, such as confirmation button 1505b.
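The cancel/confirm behavior described above can be sketched as a small state model. This is only an illustrative sketch under assumed names (`EditSession`, `preview_material`, the dict-based resource); the patent describes interfaces, not an implementation.

```python
# Illustrative sketch of the cancel/confirm behavior described above.
# All names are invented for illustration; the patent does not specify code.

class EditSession:
    """Tracks the media resource shown in the first area of the first interface."""

    def __init__(self, resource):
        self.resource = resource   # committed state shown on the first interface
        self.preview = None        # material previewed on the fourth interface

    def preview_material(self, material):
        # Selecting a material (step S5) opens the fourth interface, where
        # the second area previews the resource with the material applied.
        self.preview = material

    def cancel(self):
        # First control: redisplay the first interface, discarding the preview.
        self.preview = None
        return self.resource

    def confirm(self):
        # Second control: redisplay the first interface with the material
        # committed, so the first area shows the edited resource.
        if self.preview is not None:
            self.resource = {**self.resource,
                             "materials": self.resource["materials"] + [self.preview]}
            self.preview = None
        return self.resource
```

Under this model, cancel leaves the committed resource untouched while confirm folds the previewed material into it, matching the two-control window described above.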
In an optional implementation, the fourth interface further includes a second window, and the second window further includes a first control and a second control. The first media resource includes a first segment, which is the segment where the cursor is located. If the first material type is special effects or subtitles and a second-type material has already been added to the first segment, the method further includes: in response to the user's operation on the first control, the electronic device displays a fifth interface, where the fifth interface includes a third area that displays a first picture, the first picture being the picture of the first media resource at a first playing progress, namely the playing progress indicated by the cursor when the electronic device detects the operation on the first control; and in response to the user's operation on the second control, the electronic device displays a sixth interface, where the sixth interface includes a fourth area that displays a second picture, the second picture being the picture, at a second playing progress, of the first media resource with the first material added, the second playing progress being the playing progress indicated by the cursor when the electronic device detects the operation on the second control.
Illustratively, the second window may be window 1404 in fig. 27 and 28, and the first control may be a cancel control in the corresponding window, such as cancel button 1404a; the second control may be a confirmation control in the corresponding window, such as confirmation button 1404b. The fifth interface may be the interface 1407 shown in (b) of fig. 28, and the third area may be the preview area 1407a; the sixth interface may be interface 1406 shown in fig. 27 (c), and the fourth area may be preview area 1406a.
In an alternative embodiment, if the first material type is special effects or subtitles and no second-type material has been added to the first segment, the method further includes: in response to the user's operation on the first control, the electronic device displays a seventh interface, where the seventh interface includes a fifth area that displays a third picture, the third picture being the picture of the first media resource at a third playing progress, namely the playing progress indicated by the cursor when the electronic device added the first material to the first media resource; and in response to the user's operation on the second control, the electronic device displays an eighth interface, where the eighth interface includes a sixth area that displays a fourth picture, the fourth picture being the picture, at the third playing progress, of the first media resource with the first material added.
Illustratively, the seventh interface may be interface 1408 shown in (c) of fig. 28, and the fifth area may be a preview area in interface 1408; the eighth interface may be the interface 1405 shown in (b) of fig. 27, and the sixth area may be the preview area 1405a.
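The two cases for special effects and subtitles differ only in which playing progress the restored preview frame is taken at. A minimal sketch of that rule follows; the function name and dict layout are illustrative assumptions, not part of the patent.

```python
# Sketch of the preview-frame rule described above for special effects and
# subtitles. Names and the return format are illustrative assumptions.

def preview_frame(action, segment_has_prior_material,
                  progress_at_add, progress_at_control):
    """Pick the frame shown after the cancel ('cancel') or confirm
    ('confirm') control is operated.

    If the segment already had a second-type material, the frame is taken
    at the cursor progress when the control operation is detected
    (fifth/sixth interfaces); otherwise it is taken at the cursor progress
    when the material was added (seventh/eighth interfaces).
    """
    if action not in ("cancel", "confirm"):
        raise ValueError(f"unknown action: {action}")
    progress = (progress_at_control if segment_has_prior_material
                else progress_at_add)
    # Cancel restores the original resource; confirm keeps the added material.
    return {"progress": progress, "with_material": action == "confirm"}
```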
In an optional embodiment, the fifth interface, the sixth interface, the seventh interface, and the eighth interface each further include the plurality of editing items; the plurality of editing items include a second editing item, which is in a selected state and corresponds to the second-type material.
In an alternative embodiment, if the first material type is transition, the displaying, by the electronic device, of the fourth interface in response to the user selecting the first material includes: in response to the user selecting the first material and determining that the first media resource includes a split point, the electronic device displays the fourth interface.
In an alternative embodiment, the method further includes: in response to the user selecting the first material and determining that the first media resource does not include a split point, the electronic device displays a ninth interface, where the ninth interface includes a seventh area and prompt information, the seventh area displays the first media resource, and the prompt information indicates that a transition cannot be added to the first media resource.
Illustratively, the ninth interface may be the interface 1503 shown in (b) of fig. 32, the seventh area may be the preview area in the interface 1503, and the prompt information may be, for example, the prompt information 1503a.
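The transition branch above amounts to a single precondition check: a transition needs a split point to attach to. A hedged sketch of that check, using an assumed dict-based resource model:

```python
# Sketch of the transition-specific branch described above: a transition
# can only be added when the media resource contains a split point.
# The resource representation is an illustrative assumption.

def select_transition(resource, transition):
    """Return which interface the device shows after the user selects a
    transition material, and the resulting resource state."""
    if resource.get("split_points"):
        # Fourth interface: the second area previews the resource with the
        # transition added.
        new_resource = {**resource,
                        "transitions": resource.get("transitions", []) + [transition]}
        return {"interface": "fourth", "resource": new_resource}
    # Ninth interface: the unmodified resource plus prompt information that
    # a transition cannot be added.
    return {"interface": "ninth", "resource": resource,
            "prompt": "transition cannot be added: no split point"}
```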
In an optional implementation, the first media resource includes a first segment, which is the segment where the cursor is located. If the first material type is music, the displaying, by the electronic device, of the fourth interface in response to the user selecting the first material includes: in response to the user selecting the first material and determining that no music has been added to the first segment, the electronic device displays the fourth interface.
In an alternative embodiment, the method further includes: in response to the user selecting the first material and determining that music has been added to the first segment, the electronic device displays a tenth interface, where the tenth interface includes a prompt box, and the prompt box includes a third control; and in response to the user's operation on the third control, the electronic device displays an eleventh interface, where the eleventh interface includes a plurality of editing items, the plurality of editing items include a second editing item, the second editing item is in a selected state, and the second editing item corresponds to the second-type material.
Illustratively, the tenth interface may be the interface 1603 shown in fig. 35 (a) and fig. 36 (a), the prompt box may be, for example, a pop-up window 1604, the third control may be, for example, a cancel button 1604b, and the fourth control may be, for example, a continue button 1604c. The eleventh interface is, for example, an interface 1605 shown in (b) of fig. 35.
In an alternative embodiment, the prompt box further includes a fourth control, and the method further includes: in response to the user's operation on the fourth control, the electronic device adds the first material to the first segment and displays the fourth interface, where the fourth interface further includes a third window, and the third window includes a volume bar, an audio track, a fifth control, and a sixth control; and in response to the user's operation on the fifth control or the sixth control, the electronic device displays the eleventh interface.
Illustratively, the third window may be the window 1607 shown in fig. 36 (b), the volume bar may be the volume bar 1607b, the audio track may be, for example, the audio bar 1607a, and the fifth control and the sixth control may be the cancel button 1607c and the confirm button 1607d, respectively.
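The music flow above, including the conflict prompt when the segment already has music, can be sketched as follows. The `on_conflict` parameter and segment model are illustrative assumptions; in the patent the choice comes from the prompt box's third and fourth controls.

```python
# Sketch of the music-selection flow described above. All identifiers are
# illustrative assumptions; the patent defines interfaces, not code.

def select_music(segment, music, on_conflict="cancel"):
    """Model the branch taken when the user selects a piece of music.

    on_conflict is only consulted when the segment already has music:
    'continue' (fourth control) replaces it, 'cancel' (third control)
    keeps it and opens the eleventh interface with the music editing
    item selected.
    """
    if segment.get("music") is None:
        # No music on the segment: go straight to the fourth interface
        # with the music added and the third window (volume bar, track).
        return {"interface": "fourth", "segment": {**segment, "music": music}}
    if on_conflict == "continue":
        # Fourth control: replace the existing music.
        return {"interface": "fourth", "segment": {**segment, "music": music}}
    # Third control: keep the existing music, show the eleventh interface.
    return {"interface": "eleventh", "segment": segment}
```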
In an optional embodiment, the third interface further includes a plurality of type labels and a plurality of controls, each type label corresponds to a plurality of second-type materials, and the plurality of type labels correspond to the plurality of controls one to one. In response to the user selecting a seventh control from the plurality of controls, the electronic device displays a twelfth interface, where the twelfth interface includes an eighth area, the eighth area displays the first media resource with a second material added, the second material is one of the plurality of second-type materials corresponding to a first type label, and the first type label is the type label corresponding to the seventh control.
For example, the plurality of type labels may be "recent" and "black and white" in fig. 22, or "recommended", "hot", "jittered", and the like as shown in (c) of fig. 24. The plurality of controls may be the use controls in the corresponding interface, such as use button 1106c.
In an alternative embodiment, the fourth interface further includes a fourth window, the fourth window including a plurality of type tags and a plurality of materials of the second type, the first type tags being in a selected state, the first materials being in a selected state.
In an alternative embodiment, the method further includes: in response to the user selecting the second material in the third interface, the electronic device displays a fifth window, where the fifth window includes a seventh control; and in response to the user's operation on the seventh control, the electronic device displays a twelfth interface, where the twelfth interface includes a ninth area, and the ninth area displays the first media resource with the second material added.
Illustratively, the fifth window may be the window 1108 shown in (b) of fig. 18, the seventh control may be the use filter button 1108c, the twelfth interface may be, for example, the interface 1109 in fig. 19, and the ninth area may be, for example, the preview area 1109b.
In an alternative embodiment, the fifth window includes a tenth area for displaying the second material and an eleventh area for displaying a plurality of second type materials, the plurality of second type materials including the second material, the second material being in a selected state.
Illustratively, the tenth region may be a filter preview region 1108a, the eleventh region may be a filter select region 1108b, and the second material is, for example, "black and white 1".
In an alternative embodiment, when the electronic device displays the first window for the first time, the first window further includes a guide bubble for prompting the user to click on the first icon.
The guide bubble may be, for example, guide bubble 1103a in fig. 15.
In summary, the embodiment of the present application provides a method for editing media resources, which can be applied to an electronic device. The electronic device provides a material center entry; after receiving the user's operation on the material center entry, it displays the material center interface, which offers multiple different types of editing materials for the user to select. This reduces the number of user operations, simplifies the operation process, and improves the user experience.
The embodiment of the present application further provides an electronic device, including: a display screen, a memory, and one or more processors, where the display screen and the memory are coupled to the processor. The memory stores computer program code including computer instructions that, when executed by the processor, cause the electronic device to perform the method of editing a multimedia resource provided by the foregoing embodiments. For the specific structure of the electronic device, refer to the structure of the electronic device shown in fig. 5.
The embodiment of the application also provides a computer readable storage medium, which comprises computer instructions that, when executed on an electronic device, cause the electronic device to execute the editing method of the multimedia resource as provided in the previous embodiment.
Embodiments of the present application also provide a computer program product containing executable instructions that, when run on an electronic device, cause the electronic device to perform the method of editing a multimedia asset as provided by the previous embodiments.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the division of the functional modules described above is illustrated. In practical applications, the functions may be allocated to different functional modules as needed; that is, the internal structure of the apparatus may be divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus/device and method may be implemented in other manners. For example, the apparatus/device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
If the integrated units are implemented in the form of software functional units and sold or used as stand-alone products, they may be stored in a readable storage medium. Based on such an understanding, the technical solution of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The foregoing is merely illustrative of specific embodiments of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present application should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (17)

1. A method of editing a media asset, the method comprising:
the electronic device displays a first interface, wherein the first interface comprises a first area and a plurality of editing items, and the first area displays a first media resource;
in response to a user selecting a first editing item from the plurality of editing items, the electronic device displays a first window on the first interface, the first window including a first icon;
in response to a user operation on the first icon, the electronic device displays a second interface, wherein the second interface comprises a plurality of material type options and a first type of material, and the first editing item corresponds to the first type of material;
in response to a user selecting a first material type, the electronic device displays a third interface, wherein the plurality of material type options comprise the first material type, the third interface comprises a plurality of second type materials, and the plurality of second type materials comprise the first material;
and in response to the user selecting the first material, the electronic device displays a fourth interface, wherein the fourth interface comprises a second area, and the second area displays the first media resource with the first material added.
2. The method of claim 1, wherein the fourth interface further comprises a second window comprising a first control and a second control, and wherein if the first material type is a filter, a theme, a background, or a transition, the method further comprises:
in response to a user operation on the first control, redisplaying, by the electronic device, the first interface;
and in response to a user operation on the second control, redisplaying, by the electronic device, the first interface and displaying, in the first area, the first media resource with the first material added.
3. The method of claim 1, wherein the fourth interface further comprises a second window, the second window further comprises a first control and a second control, the first media resource comprises a first segment, the first segment is the segment in which a cursor is located, and if the first material type is special effects or subtitles and a second type of material has been added to the first segment, the method further comprises:
in response to a user operation on the first control, displaying, by the electronic device, a fifth interface, wherein the fifth interface comprises a third area, the third area displays a first picture, the first picture is a picture of the first media resource at a first playing progress, and the first playing progress is the playing progress indicated by the cursor when the electronic device detects the user operation on the first control;
and in response to a user operation on the second control, displaying, by the electronic device, a sixth interface, wherein the sixth interface comprises a fourth area, the fourth area displays a second picture, the second picture is a picture, at a second playing progress, of the first media resource with the first material added, and the second playing progress is the playing progress indicated by the cursor when the electronic device detects the user operation on the second control.
4. The method of claim 3, wherein if the first material type is special effects or subtitles and no second type of material has been added to the first segment, the method further comprises:
in response to a user operation on the first control, displaying, by the electronic device, a seventh interface, wherein the seventh interface comprises a fifth area, the fifth area displays a third picture, the third picture is a picture of the first media resource at a third playing progress, and the third playing progress is the playing progress indicated by the cursor when the electronic device adds the first material to the first media resource;
and in response to a user operation on the second control, displaying, by the electronic device, an eighth interface, wherein the eighth interface comprises a sixth area, the sixth area displays a fourth picture, and the fourth picture is a picture, at the third playing progress, of the first media resource with the first material added.
5. The method of claim 4, wherein the fifth interface, the sixth interface, the seventh interface, and the eighth interface further comprise the plurality of editing items, the plurality of editing items comprising a second editing item, the second editing item being in a selected state, the second editing item corresponding to the second type of material.
6. The method of claim 1, wherein if the first material type is transition, the displaying, by the electronic device, of a fourth interface in response to a user selecting the first material comprises:
in response to the user selecting the first material and determining that the first media resource comprises a split point, displaying, by the electronic device, the fourth interface.
7. The method of claim 6, wherein the method further comprises:
in response to a user selecting the first material and determining that the first media resource does not comprise a split point, displaying, by the electronic device, a ninth interface, wherein the ninth interface comprises a seventh area and prompt information, the seventh area displays the first media resource, and the prompt information indicates that a transition cannot be added to the first media resource.
8. The method of claim 1, wherein the first media resource comprises a first segment, the first segment is the segment in which a cursor is located, and if the first material type is music, the displaying, by the electronic device, of a fourth interface in response to the user selecting the first material comprises:
in response to the user selecting the first material and determining that no music has been added to the first segment, displaying, by the electronic device, the fourth interface.
9. The method of claim 8, wherein the method further comprises:
in response to a user selecting the first material and determining that music has been added to the first segment, displaying, by the electronic device, a tenth interface, wherein the tenth interface comprises a prompt box, and the prompt box comprises a third control;
and in response to a user operation on the third control, displaying, by the electronic device, an eleventh interface, wherein the eleventh interface comprises a plurality of editing items, the plurality of editing items comprise a second editing item, the second editing item is in a selected state, and the second editing item corresponds to the second type of material.
10. The method of claim 9, wherein the prompt box further comprises a fourth control, the method further comprising:
in response to a user operation on the fourth control, adding, by the electronic device, the first material to the first segment and displaying the fourth interface, wherein the fourth interface further comprises a third window, and the third window comprises a volume bar, an audio track, a fifth control, and a sixth control;
and in response to a user operation on the fifth control or the sixth control, displaying, by the electronic device, the eleventh interface.
11. The method of any of claims 1-7, wherein the third interface further comprises a plurality of type labels and a plurality of controls, each type label corresponding to a plurality of second type materials, the plurality of type labels corresponding to the plurality of controls one-to-one;
in response to a user selecting a seventh control from the plurality of controls, displaying, by the electronic device, a twelfth interface, wherein the twelfth interface comprises an eighth area, the eighth area displays the first media resource with a second material added, the second material is one of the plurality of second type materials corresponding to a first type label, and the first type label is the type label corresponding to the seventh control.
12. The method of claim 11, wherein the fourth interface further comprises a fourth window comprising the plurality of type tags and a plurality of second type material, the first type tag being in a selected state and the first material being in a selected state.
13. The method according to any one of claims 1-10, further comprising:
in response to a user selecting the second material in the third interface, displaying, by the electronic device, a fifth window, wherein the fifth window comprises a seventh control;
and in response to a user operation on the seventh control, displaying, by the electronic device, a twelfth interface, wherein the twelfth interface comprises a ninth area, and the ninth area displays the first media resource with the second material added.
14. The method of claim 13, wherein the fifth window includes a tenth area for displaying the second material and an eleventh area for displaying a plurality of second type materials including the second material, the second material being in a selected state.
15. The method of any of claims 1-10, wherein when the electronic device first displays the first window, the first window further comprises a guide bubble for prompting a user to click on the first icon.
16. An electronic device comprising a processor, a memory coupled to the processor, the memory storing program instructions that when executed by the processor cause the electronic device to implement the method of any of claims 1-15.
17. A computer-readable storage medium comprising computer instructions;
the computer instructions, when run on an electronic device, cause the electronic device to perform the method of any one of claims 1-15.
CN202211036191.8A 2022-05-30 2022-08-27 Editing method of media resources, electronic equipment and readable storage medium Active CN116634058B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210603535 2022-05-30
CN2022106035352 2022-05-30

Publications (2)

Publication Number Publication Date
CN116634058A true CN116634058A (en) 2023-08-22
CN116634058B CN116634058B (en) 2023-12-22

Family

ID=87640465

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211036191.8A Active CN116634058B (en) 2022-05-30 2022-08-27 Editing method of media resources, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN116634058B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014119963A (en) * 2012-12-17 2014-06-30 Toshiba Tec Corp Electronic menu editing device and electronic menu editing program
CN105005430A (en) * 2015-07-17 2015-10-28 深圳市金立通信设备有限公司 Window display method and terminal
EP3029676A1 (en) * 2014-12-02 2016-06-08 Bellevue Investments GmbH & Co. KGaA System and method for theme based video creation with real-time effects
CN107770626A (en) * 2017-11-06 2018-03-06 腾讯科技(深圳)有限公司 Processing method, image synthesizing method, device and the storage medium of video material
CN110381371A (en) * 2019-07-30 2019-10-25 维沃移动通信有限公司 A kind of video clipping method and electronic equipment
WO2019227283A1 (en) * 2018-05-28 2019-12-05 深圳市大疆创新科技有限公司 Multimedia editing method and intelligent terminal
CN112987999A (en) * 2021-04-13 2021-06-18 杭州网易云音乐科技有限公司 Video editing method and device, computer readable storage medium and electronic equipment
CN113473204A (en) * 2021-05-31 2021-10-01 北京达佳互联信息技术有限公司 Information display method and device, electronic equipment and storage medium
WO2022022262A1 (en) * 2020-07-28 2022-02-03 游艺星际(北京)科技有限公司 Processing method for multimedia resource, publishing method and electronic device


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MENG Haoyu (孟昊雨): "Network New Media Marketing and Technology" (《网络新媒体营销与技术》), Beijing: China Commerce Press, pages 283-290 *

Also Published As

Publication number Publication date
CN116634058B (en) 2023-12-22

Similar Documents

Publication Publication Date Title
WO2021135655A1 (en) Method and device for generating multimedia resources
WO2020172826A1 (en) Video processing method and mobile device
CN115002340B (en) Video processing method and electronic equipment
US20140050422A1 (en) Electronic Apparatus and Image Processing Method
WO2022068511A1 (en) Video generation method and electronic device
CN114827342A (en) Video processing method, electronic device and readable medium
CN113312115A (en) Information collection method, electronic device and computer readable storage medium
CN116634058B (en) Editing method of media resources, electronic equipment and readable storage medium
CN115484387B (en) Prompting method and electronic equipment
CN115480684A (en) Method for returning edited multimedia resource and electronic equipment
CN117692552A (en) Wallpaper display method, electronic equipment and storage medium
CN115484397B (en) Multimedia resource sharing method and electronic equipment
CN115484390B (en) Video shooting method and electronic equipment
CN115484393B (en) Abnormality prompting method and electronic equipment
CN115484399B (en) Video processing method and electronic equipment
CN115484392B (en) Video shooting method and electronic equipment
WO2023142731A1 (en) Method for sharing multimedia file, sending end device, and receiving end device
EP4361805A1 (en) Method for generating theme wallpaper, and electronic device
CN116033261B (en) Video processing method, electronic equipment, storage medium and chip
CN115484391B (en) Shooting method and electronic equipment
CN111381801B (en) Audio playing method based on double-screen terminal and communication terminal
WO2023160143A1 (en) Method and apparatus for viewing multimedia content
CN115484396B (en) Video processing method and electronic equipment
WO2023160208A1 (en) Image deletion operation notification method, device, and storage medium
WO2023160142A1 (en) Video processing method, and electronic device and readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant