CN117911582A - Adaptive editing experience for mixed media content - Google Patents


Info

Publication number
CN117911582A
CN117911582A
Authority
CN
China
Prior art keywords
page
media
scene
pages
elements
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310982890.XA
Other languages
Chinese (zh)
Inventor
I. Y. Q. Wang
B. G. Matthews
Z. S. Silverstein
S. M. Calabro
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Adobe Inc
Original Assignee
Adobe Systems Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from U.S. Application No. 18/348,522 (published as US 2024/0127512 A1)
Application filed by Adobe Systems Inc filed Critical Adobe Systems Inc
Publication of CN117911582A publication Critical patent/CN117911582A/en


Abstract

Embodiments of the present disclosure relate to an adaptive editing experience for mixed media content. A method, apparatus, and non-transitory computer readable medium for multimedia processing are described. Embodiments of the present disclosure obtain a project file that includes page data for one or more pages. Each of the one or more pages includes a spatial arrangement of one or more media elements. The media editing interface presents pages of the one or more pages based on the spatial arrangement. The media editing interface displays a scene line near the page. The scene line includes a temporal arrangement of one or more scenes within the page, and the one or more media elements are temporally arranged within the one or more scenes.

Description

Adaptive editing experience for mixed media content
Cross Reference to Related Applications
Under 35 U.S.C. §119, the present application claims priority to U.S. Provisional Application No. 63/379,811 and U.S. Application No. 18/348,522, filed with the U.S. Patent and Trademark Office on October 17, 2022, the disclosures of which are incorporated herein by reference in their entirety.
Background
The following relates generally to multimedia processing, and more particularly to multimedia content editing. Multimedia processing refers to processing different types of media content using a computer to generate a multimedia digital product. In some cases, the media content includes text, images, audio, video, animation effects, and the like. The creation of multimedia content involves multimedia-centric tools that are built around one of two paradigms: a page-based paradigm (e.g., presentations, reports, slide decks, etc.) or a time-based paradigm (e.g., video).
Disclosure of Invention
Embodiments of the present disclosure include a multimedia creation tool configured to combine a time-based editing paradigm and a page-based editing paradigm. In some cases, the time-based editing paradigm involves video timeline or scene line controls. The media editing interface of the multimedia creation tool presents a scene line adjacent to a page. The scene line includes a temporal arrangement of one or more scenes within the page. One or more media elements are temporally arranged within the one or more scenes. Thus, a user can easily design a digital output using a time-saving, integrated media editing interface. The user may create expressive content for a social media platform (e.g., a Story).
A method, apparatus, and non-transitory computer readable medium for multimedia processing are described. One or more embodiments of the method, apparatus, and non-transitory computer-readable medium include obtaining a project file comprising page data for one or more pages, wherein each of the one or more pages comprises a spatial arrangement of one or more media elements; presenting, via the media editing interface, a page of the one or more pages based on the spatial arrangement; and presenting, via the media editing interface, a scene line adjacent to the page, wherein the scene line includes a temporal arrangement of one or more scenes within the page, and wherein the one or more media elements are temporally arranged within the one or more scenes.
A method, apparatus, and non-transitory computer readable medium for multimedia processing are described. One or more embodiments of the method, apparatus, and non-transitory computer-readable medium include presenting content to a user via a media editing interface that includes a page control element for modifying a spatial arrangement of pages of the content and a time control element for modifying a temporal arrangement of scenes located within the pages; receiving, via the page control element, a page control input for editing the content; receiving, via the time control element, a time control input for editing the content; and generating modified content based on the page control input and the time control input.
An apparatus and method for multimedia processing are described. One or more embodiments of the apparatus and method include a processor and a memory including instructions executable by the processor to: obtain, via the media editing interface, a project file comprising page data for one or more pages, wherein each of the one or more pages comprises a spatial arrangement of one or more media elements; present, via the media editing interface, a page of the one or more pages based on the spatial arrangement; and present, via the media editing interface, a scene line adjacent to the page, wherein the scene line includes a temporal arrangement of one or more scenes within the page, and wherein the one or more media elements are temporally arranged within the one or more scenes.
Drawings
Fig. 1 illustrates an example of a multimedia processing system in accordance with aspects of the present disclosure.
Fig. 2 illustrates an example of a multimedia processing apparatus according to aspects of the present disclosure.
Fig. 3 illustrates an example of content editing and content generation in accordance with aspects of the present disclosure.
Fig. 4 illustrates an example of a media element according to aspects of the present disclosure.
Fig. 5 illustrates an example of a multimedia activity ("campaign") in accordance with aspects of the present disclosure.
Fig. 6 and 7 illustrate examples of media editing interfaces according to aspects of the present disclosure.
Fig. 8 illustrates an example of a method for multimedia processing in accordance with aspects of the present disclosure.
Fig. 9 illustrates an example of a method of modifying a scene based on time control in accordance with aspects of the present disclosure.
Fig. 10 illustrates an example of a scene line in accordance with aspects of the present disclosure.
Fig. 11 illustrates an example of modifying a scene in accordance with aspects of the present disclosure.
FIG. 12 illustrates an example of modifying a temporal aspect of a media file according to aspects of the present disclosure.
Fig. 13 illustrates an example of a computing device in accordance with aspects of the present disclosure.
Detailed Description
Embodiments of the present disclosure include a multimedia creation tool configured to combine a time-based editing paradigm and a page-based editing paradigm. In some cases, the time-based editing paradigm involves video timeline or scene line controls. The media editing interface of the multimedia creation tool presents a scene line adjacent to a page. The scene line includes a temporal arrangement of one or more scenes within the page. One or more media elements are temporally arranged within the one or more scenes. Thus, a user can easily design digital output using an efficient, time-saving, and integrated media editing interface. The user may create expressive content for a social media platform (e.g., a Story).
Creation of multimedia content (e.g., a composite multimedia campaign or project) involves multimedia-centric tools built around one of two paradigms: a page-based paradigm (e.g., presentations, reports, etc.) or a time-based paradigm (e.g., video). For example, a user who wants to edit three videos must open each of the three videos in a video editing application and edit the videos separately. Meanwhile, page-based creation tools (e.g., PowerPoint) are used to edit static media elements and do not handle video. Thus, tools confined to a particular class of outputs are ill-suited to creating mixed multimedia content.
In some cases, a social media story may be considered content broken up into multiple sequential pages. The user may choose between static content (e.g., images) and dynamic content (e.g., videos) on each page. When a user selects dynamic content for a page, the page may include multiple scenes. For example, a scene includes images, text, audio data, video sequences, and the like. Conventional media creation tools do not provide an integrated editing experience for creating multiple scenes that are temporally arranged within a single page. In some cases, when posting content to a social media platform, a user obtains video output and static output from different media creation tools and combines the outputs in a third tool (e.g., the social media platform) for posting. Thus, users have to switch frequently between media authoring tools, which is inconvenient and time consuming.
Traditional methods create multimedia output centered on either a page-based or a time-based paradigm. For example, presentation tools (such as PowerPoint) are page-based, providing minimal support for multimedia authoring, such as applying animations to an input video. For example, if a user wishes to create a multimedia output combining two videos, the user must upload the two videos onto two separate pages. However, such presentation tools cannot order assets over time, nor can they combine two or more videos into a continuous video. Thus, creating video with such presentation tools can be time consuming.
In some examples, time-based editing tools (i.e., video editing platforms) provide multimedia creation with precise ordering, timing, and transitions. However, time-based editing tools do not provide page control (e.g., spatial arrangement of media elements on one or more pages). Thus, a presentation cannot be created in a time-based tool.
Embodiments of the present disclosure include a multimedia creation tool configured to combine a time-based editing paradigm and a page-based editing paradigm. In some cases, both paradigms are supported, providing complete page control and nested temporal control within a project page. In addition, the multimedia creation tool may provide functionality to simultaneously preview, adjust, time, and export the actual digital native output as desired.
In some embodiments, a multimedia processing apparatus obtains a project file that includes page data for one or more pages. Each of the one or more pages includes a spatial arrangement of one or more media elements. The multimedia processing apparatus presents pages of the one or more pages based on the spatial arrangement via the media editing interface. In addition, the multimedia processing apparatus presents a scene line adjacent to the page via the media editing interface. The scene line includes a temporal arrangement of one or more scenes within the page, and the one or more media elements are temporally arranged within the one or more scenes.
Embodiments of the present disclosure include a user interface (e.g., a media editing interface) that includes a display area for displaying a set of pages of a project file. One or more media elements may be spatially arranged on a page. When a page is selected, the media editing interface displays a scene line for editing video content within the page. The scene line includes a temporal arrangement of one or more scenes within the page. Multiple pages may be displayed in a carousel, in a temporal arrangement, or in a spatial arrangement. The pages of a project may be linked in a structure, such as a sequential order or a spatial arrangement. Each page may have metadata including page size, page margins, layout, and a list of the elements included in the page.
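As an illustration only, the page data described above might be modeled as follows; every type and field name in this sketch (ProjectFile, PageMetadata, and so on) is an assumption for readability, not a schema disclosed by this application.

```typescript
// Illustrative sketch of the project-file page data described above.
// All names and field choices are assumptions, not the disclosed schema.

interface Bounds {
  x: number;      // horizontal position within the page
  y: number;      // vertical position within the page
  width: number;
  height: number;
}

interface MediaElement {
  id: string;
  kind: "image" | "video" | "text" | "audio" | "shape" | "animation";
  bounds: Bounds;            // spatial arrangement within the page
  startTime?: number;        // temporal aspect in seconds (absent for static content)
  duration?: number;
}

interface Scene {
  id: string;
  duration: number;          // seconds the scene plays on the scene line
  elements: MediaElement[];  // media elements temporally arranged in the scene
}

interface PageMetadata {
  pageSize: { width: number; height: number };
  margin: { top: number; right: number; bottom: number; left: number };
  layout: "sequential" | "spatial";
}

interface Page {
  id: string;
  metadata: PageMetadata;    // page size, margins, layout, element list (per the text)
  scenes: Scene[];           // temporal arrangement shown in the scene line
}

interface ProjectFile {
  pages: Page[];             // ordered, since the project file stores a page ordering
}
```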
By combining page-based functionality and time-based editing functionality into one integrated user interface, embodiments of the present disclosure enable users to generate multimedia content with more efficient and seamless content editing. The user avoids switching between different editing tools to combine static media content and dynamic media content (e.g., multiple videos). The multimedia processing apparatus is also capable of integrating various types of media elements, and a user can easily modify temporal aspects of scenes along a scene line within a page, such as adding new scenes, removing scenes, or reordering multiple scenes. At the same time, the convenience of page-based design is preserved, so that a user can navigate from one page to another for content editing (e.g., working on additional media objects and scenes within different pages).
Multimedia processing architecture
In fig. 1 to 2, an apparatus and method for multimedia processing are described. One or more aspects of the apparatus and method include a processor; and a memory comprising instructions executable by the processor to: obtaining, via the media editing interface, a project file comprising page data for one or more pages, wherein each of the one or more pages comprises a spatial arrangement of one or more media elements; presenting, via the media editing interface, pages of the one or more pages based on the spatial arrangement; and presenting, via the media editing interface, a scene line adjacent to the page, wherein the scene line includes a temporal arrangement of one or more scenes within the page, and wherein the one or more media elements are temporally arranged within the one or more scenes.
In some examples, the media editing interface includes a page navigation element, a page control element, and a time control element. In some examples, the page control element includes an add page element, a remove page element, a copy page element, a page size element, or a page orientation element. In some examples, the time control element includes a video ordering element, a video transition element, or a video pacing element.
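For readability, the three element families can be pictured as a discriminated union of input events. The event names below are hypothetical; the disclosure does not specify an event model.

```typescript
// Hypothetical event model for the control-element families listed above.
type PageControlInput =
  | { kind: "addPage"; afterPageId: string }
  | { kind: "removePage"; pageId: string }
  | { kind: "copyPage"; pageId: string }
  | { kind: "setPageSize"; pageId: string; width: number; height: number }
  | { kind: "setOrientation"; pageId: string; orientation: "portrait" | "landscape" };

type TimeControlInput =
  | { kind: "reorderScenes"; pageId: string; order: string[] }          // video ordering
  | { kind: "setTransition"; fromSceneId: string; transition: string }  // video transition
  | { kind: "setPacing"; pageId: string; factor: number };              // video pacing

type PageNavigationInput = { kind: "goToPage"; pageId: string };
```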
Fig. 1 illustrates an example of a multimedia processing system in accordance with aspects of the present disclosure. The illustrated examples include a user 100, a user device 105, a multimedia processing apparatus 110, a cloud 115, a database 120, and a social media platform 125.
As in the example shown in fig. 1, a user 100 provides media content to a multimedia processing apparatus 110 via a user device 105 and a cloud 115. For example, the media content may include images, video, audio tracks, text, and the like. In some cases, the user 100 retrieves media content from the database 120. In some cases, the user 100 uploads original media content to the multimedia processing apparatus 110. The multimedia processing apparatus 110 (or user device 105) may provide a user interface (e.g., a media editing interface) for editing media content to create multimedia content. The media content may be spatially arranged on one or more pages in the user interface. Further, the media content may be temporally arranged on the one or more pages.
According to some aspects, the multimedia processing apparatus 110 generates multimedia content based on the media content. The multimedia processing apparatus 110 displays the multimedia content to the user 100 via the user device 105. In some cases, the multimedia processing apparatus 110 publishes the modified project file to the social media platform 125. The multimedia processing apparatus 110 is an example of, or includes aspects of, the corresponding elements described with reference to fig. 2.
The user device 105 may be a personal computer, laptop computer, mainframe computer, palmtop computer, personal digital assistant, mobile device, or any other suitable processing apparatus. In some examples, the user device 105 includes software that incorporates an image processing application. In some examples, the image processing application on the user device 105 may include the functionality of the multimedia processing apparatus 110.
A user interface (e.g., a media editing interface) may enable user 100 to interact with user device 105. In some embodiments, the user interface may include an audio device such as an external speaker system, an external display device such as a display screen, or an input device (e.g., a remote control device that interfaces with the user interface directly or through an I/O controller module). In some cases, the user interface may be a Graphical User Interface (GUI). In some examples, the user interface may be represented by code that is sent to the user device and presented locally by the browser.
The multimedia processing apparatus 110 includes a media editing interface that may be implemented on the user device 105. In some embodiments, the media editing interface includes a page navigation element, a page control element, and a time control element. In addition, the multimedia processing apparatus 110 communicates with the database 120 through the cloud 115.
Further details regarding the architecture of the multimedia processing apparatus 110 are provided with reference to fig. 2 and 13. Further details regarding the application and operation of the multimedia processing apparatus 110 are provided with reference to fig. 3-12.
In some cases, the multimedia processing apparatus 110 is implemented on a server. The server provides one or more functions to users linked through one or more of various networks. In some cases, the server comprises a single microprocessor board that includes a microprocessor responsible for controlling all aspects of the server. In some cases, the server uses one or more microprocessors and protocols to exchange data with other devices/users on one or more of the networks via hypertext transfer protocol (HTTP) and Simple Mail Transfer Protocol (SMTP), although other protocols such as File Transfer Protocol (FTP) and Simple Network Management Protocol (SNMP) may also be used. In some cases, the server is configured to send and receive hypertext markup language (HTML) formatted files (e.g., for displaying web pages). In various embodiments, the server comprises a general purpose computing device, a personal computer, a laptop computer, a mainframe computer, a supercomputer, or any other suitable processing apparatus.
Cloud 115 is a computer network configured to provide on-demand availability of computer system resources, such as data storage and computing power. In some examples, cloud 115 provides resources without active management by user 100. The term "cloud" is sometimes used to describe data centers available to many users (e.g., user 100) over the internet. Some large cloud networks have functions distributed over multiple locations from central servers. A server is designated an edge server if it has a direct or close connection to a user (e.g., user 100). In some cases, the cloud 115 is limited to a single organization. In other examples, cloud 115 is available to many organizations. In one example, cloud 115 includes a multi-layer communications network comprising multiple edge routers and core routers. In another example, cloud 115 is based on a local collection of switches in a single physical location. According to some embodiments, cloud 115 enables communication between user device 105, multimedia processing apparatus 110, and database 120.
Database 120 is an organized collection of data. For example, database 120 stores data in a specified format called schema. Database 120 may be constructed as a single database, a distributed database, multiple distributed databases, or an emergency backup database. In some cases, the database controller may manage the storage and processing of data in database 120. In some cases, the user interacts with the database controller. In other cases, the database controller may operate automatically without user interaction. According to some embodiments, the database 120 is external to the multimedia processing device 110 and communicates with the multimedia processing device 110 via the cloud 115. According to some embodiments, the database 120 is included in the multimedia processing apparatus 110.
Fig. 2 illustrates an example of a multimedia processing apparatus 200 in accordance with aspects of the present disclosure. The illustrated example includes a multimedia processing apparatus 200, a processor unit 205, a memory unit 210, and a media editing interface 215. The multimedia processing apparatus 200 is an example of or includes aspects of the corresponding elements described with reference to fig. 1.
The processor unit 205 is an intelligent hardware device (e.g., a general purpose processing component, a Digital Signal Processor (DSP), a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a microcontroller, an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), a programmable logic device, discrete gate or transistor logic components, discrete hardware components, or any combination thereof). In some cases, the processor unit 205 is configured to operate a memory array using a memory controller. In other cases, the memory controller is integrated into the processor. In some cases, processor unit 205 is configured to execute computer-readable instructions stored in memory to perform various functions. In some embodiments, the processor unit 205 includes dedicated components for modem processing, baseband processing, digital signal processing, or transmission processing. The processor unit 205 is an example of, or includes aspects of, the processor described with reference to fig. 13.
According to some embodiments, memory unit 210 includes instructions executable by a processor to perform certain functions. Examples of the memory unit 210 include Random Access Memory (RAM), Read Only Memory (ROM), or a hard disk. Examples of memory unit 210 include solid state memory and hard drives. In some examples, memory unit 210 is used to store computer-readable, computer-executable software comprising instructions that, when executed, cause a processor to perform the various functions described herein. In some cases, memory unit 210 includes, among other things, a basic input/output system (BIOS) that controls basic hardware or software operations, such as interactions with peripheral components or devices. In some cases, a memory controller operates the memory cells. For example, the memory controller may include a row decoder, a column decoder, or both. In some cases, memory cells within memory unit 210 store information in the form of logic states. Memory unit 210 is an example of, or includes aspects of, the memory subsystem described with reference to fig. 13.
According to some aspects, the media editing interface 215 obtains a project file that includes page data for one or more pages, where each of the one or more pages includes a spatial arrangement of one or more media elements. In some examples, the media editing interface 215 presents pages of the one or more pages based on the spatial arrangement. The media editing interface 215 presents a scene line adjacent to the page, wherein the scene line includes a temporal arrangement of one or more scenes within the page, and wherein the one or more media elements are temporally arranged within the one or more scenes.
In some examples, the media editing interface 215 identifies an activity that includes a set of items. In some examples, the media editing interface 215 presents the set of items in the activity. In some examples, the media editing interface 215 receives a user input identifying an item from the set of items, wherein the project file is obtained based on the user input.
In some examples, the media editing interface 215 presents the page navigation element 220. The media editing interface 215 presents additional pages of the one or more pages based on the page navigation input. In some examples, the project file includes an ordering of one or more pages.
In some examples, the media editing interface 215 presents a page control element 225. In some examples, the media editing interface 215 presents the time control element 230. The media editing interface 215 modifies a scene of the one or more scenes based on the time control input. In some examples, the media editing interface 215 receives a media selection input that selects a media element of the one or more media elements. The media editing interface 215 presents a contextual track adjacent to the scene line based on the media selection input.
In some examples, the media editing interface 215 receives a media location input that modifies a location of a media element of the one or more media elements. In some examples, the media editing interface 215 presents the media element at a modified location within the page. In some examples, the media editing interface 215 receives user input. In some examples, the media editing interface 215 generates the modified project file based on user input. In some examples, the modified project file includes a multi-scene video. In some examples, the modified project file includes a multi-page presentation.
According to some embodiments, the media editing interface 215 presents content to a user, including a page control element 225 for modifying the spatial arrangement of pages of the content and a time control element 230 for modifying the temporal arrangement of scenes located within the pages; the media editing interface 215 receives page control input for editing content via the page control element 225; the media editing interface 215 receives time control input for editing content via the time control element 230; and the media editing interface 215 generates modified content based on the page control input and the time control input. In some examples, the media editing interface 215 generates the multi-scene video based on the time control input, wherein the modified content comprises the multi-scene video.
In some examples, the media editing interface 215 includes a page navigation element 220, a page control element 225, and a time control element 230. The media editing interface 215 is an example of or includes aspects of the corresponding elements described with reference to fig. 6, 7, and 11.
According to some embodiments, page navigation element 220 receives page navigation input. Page navigation element 220 is an example of or includes aspects of the corresponding elements described with reference to FIGS. 6 and 7.
According to some embodiments, page control element 225 receives page control input. In some examples, the page control elements 225 include add page elements, remove page elements, copy page elements, page size elements, or page orientation elements. The page control element 225 is an example of or includes aspects of the corresponding elements described with reference to fig. 5-7.
According to some embodiments, the time control element 230 receives a time control input. In some examples, the temporal control element 230 includes a video ordering element, a video transition element, or a video pacing element. The time control element 230 is an example of or includes aspects of the corresponding element described with reference to fig. 12.
The described methods may be implemented or performed by a device that comprises a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof. A general purpose processor may be a microprocessor, a conventional processor, a controller, a microcontroller, or a state machine. A processor may also be implemented as a combination of computing devices (e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration). Thus, the functions described herein may be implemented in hardware or software and may be performed by a processor, firmware, or any combination thereof. If implemented in software for execution by a processor, the functions may be stored on a computer-readable medium in the form of instructions or code.
Computer-readable media includes both non-transitory computer storage media and communication media including any medium that facilitates transfer of code or data. Non-transitory storage media may be any available media that can be accessed by a computer. For example, the non-transitory computer-readable medium may include Random Access Memory (RAM), read Only Memory (ROM), electrically Erasable Programmable Read Only Memory (EEPROM), compact Disk (CD) or other optical disk storage, magnetic disk storage, or any other non-transitory medium for carrying or storing data or code.
Further, any connection may properly be termed a computer-readable medium. For example, if the code or data is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, Digital Subscriber Line (DSL), or wireless technology such as infrared, radio, or microwave signals, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technology is included in the definition of medium. Combinations of the above media are also included within the scope of computer-readable media.
Media editing and content generation
In fig. 3-13, methods, apparatus, and non-transitory computer readable media for multimedia processing are described. One or more embodiments of the method, apparatus, and non-transitory computer-readable medium include obtaining a project file comprising page data for one or more pages, wherein each of the one or more pages comprises a spatial arrangement of one or more media elements; presenting, via the media editing interface, a page of the one or more pages based on the spatial arrangement; and presenting, via the media editing interface, a scene line adjacent to the page, wherein the scene line includes a temporal arrangement of one or more scenes within the page, and wherein the one or more media elements are temporally arranged within the one or more scenes.
Some examples of the method, apparatus, and non-transitory computer-readable medium further include identifying an activity comprising a plurality of items. Some examples also include presenting the plurality of items in the activity. Some examples also include receiving a user input identifying an item from the plurality of items, wherein the project file is obtained based on the user input.
Some examples of the method, apparatus, and non-transitory computer-readable medium further include presenting a page navigation element in the media editing interface. Some examples also include receiving page navigation input via a page navigation element. Some examples also include presenting additional pages of the one or more pages via the media editing interface based on the page navigation input. In some aspects, the project file includes an ordering of one or more pages.
Some examples of the method, apparatus, and non-transitory computer-readable medium further include presenting a page control element in the media editing interface. Some examples also include receiving a page control input via the page control element. In some aspects, the page control elements include add page elements, remove page elements, copy page elements, page size elements, or page orientation elements.
Some examples of the method, apparatus, and non-transitory computer-readable medium further include presenting a time control element in the media editing interface. Some examples also include receiving a time control input via a time control element. Some examples also include modifying a scene of the one or more scenes based on the time control input. In some examples, the temporal control element includes a video ordering element, a video transition element, or a video pacing element.
Some examples of the method, apparatus, and non-transitory computer-readable medium further include receiving a media selection input that selects a media element of the one or more media elements. Some examples also include presenting a contextual track adjacent to the scene line based on the media selection input.
Some examples of the method, apparatus, and non-transitory computer-readable medium further include receiving a media location input modifying a location of a media element of the one or more media elements. Some examples also include presenting the media element at a modified location within the page.
Some examples of the method, apparatus, and non-transitory computer-readable medium further include receiving user input via the media editing interface. Some examples also include generating a modified project file based on the user input.
Some examples of the method, apparatus, and non-transitory computer-readable medium further include posting the modified project file to a social media platform. In some examples, the modified project file includes a multi-scene video. In some examples, the modified project file includes a multi-page presentation.
A method, apparatus, and non-transitory computer readable medium for multimedia processing are described. One or more embodiments of the method, apparatus, and non-transitory computer-readable medium include presenting content to a user via a media editing interface that includes a page control element for modifying a spatial arrangement of pages of the content and a time control element for modifying a temporal arrangement of scenes located within the pages; receiving, via the page control element, a page control input for editing the content; receiving, via the time control element, a time control input for editing the content; and generating modified content based on the page control input and the time control input.
Some examples of the method, apparatus, and non-transitory computer-readable medium further include generating a multi-scene video based on the temporal control input, wherein the modified content includes the multi-scene video.
Fig. 3 illustrates an example of content editing and content generation in accordance with aspects of the present disclosure. In some examples, these operations are performed by a system comprising a processor executing a set of codes to control the functional elements of a device. Additionally or alternatively, dedicated hardware is used to perform some of the processing. In general, these operations are performed in accordance with the methods and processes described in aspects of the present disclosure. In some cases, the operations described herein consist of various sub-steps, or are performed with other operations.
At operation 305, the user provides original content through a multimedia editing interface. In some cases, the operations of this step involve, or may be performed by, the user described with reference to fig. 1. The multimedia editing interface is an example of, or includes aspects of, the media editing interface described with reference to fig. 1, 6, 7, and 11. For example, a user may upload original content (e.g., images, video, audio tracks, etc.) through the multimedia editing interface. First media content and second media content are placed on a page in the multimedia editing interface, wherein the first media content and the second media content are temporally arranged on the page. In some cases, the user modifies the spatial arrangement of the first media content and the second media content. In some cases, the first media content is placed on a first page of the multimedia editing interface and the second media content is placed on a second page. The multimedia editing interface is also referred to as a media editing interface (described with reference to fig. 2, 6, 7, and 11).
At operation 310, the system generates multimedia content based on the original content. In some cases, the operations of this step involve, or may be performed by, the multimedia processing apparatus described with reference to fig. 1 and 2. For example, the multimedia processing apparatus generates multimedia content based on user input. User input is obtained through a page navigation element, a page control element, and/or a time control element. The page navigation element is used to select and navigate between one or more pages of a multimedia file (e.g., a media item file). Details regarding the page navigation element are described in detail with reference to fig. 6 and 7. The page control elements are used to add pages, remove pages, enlarge pages, reduce pages, copy pages, adjust aspect ratios of pages, and/or rotate pages. Details about the page control element are described in detail with reference to fig. 2 and 5 to 7.
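A minimal sketch of what the page control operations named above could look like, assuming an ordered list of pages; the function names and the Page shape are illustrative, not the patent's API.

```typescript
// Hypothetical page-control operations; Page is a minimal stand-in type.
interface Page { id: string; width: number; height: number }

function addPage(pages: Page[], afterIndex: number, page: Page): Page[] {
  return [...pages.slice(0, afterIndex + 1), page, ...pages.slice(afterIndex + 1)];
}

function removePage(pages: Page[], pageId: string): Page[] {
  return pages.filter((p) => p.id !== pageId);
}

function copyPage(pages: Page[], pageId: string): Page[] {
  const src = pages.find((p) => p.id === pageId);
  if (!src) return pages;
  const dup = { ...src, id: `${src.id}-copy` }; // duplicate directly after the source
  const i = pages.indexOf(src);
  return [...pages.slice(0, i + 1), dup, ...pages.slice(i + 1)];
}

// Rotating a page swaps its aspect ratio (portrait <-> landscape).
function rotatePage(page: Page): Page {
  return { ...page, width: page.height, height: page.width };
}
```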
In some examples, the time control element is used to control the length of a media element (e.g., a video file), to arrange a sequence of multiple media elements (e.g., to place a first video file before a second video file in time), and/or to manage the pace of the media elements. Details about the time control element are described in detail with reference to fig. 2 and 12.
At operation 315, the system publishes the multimedia content to the social media platform. In some cases, the operations of this step involve, or may be performed by, the multimedia processing apparatus described with reference to fig. 1 and 2. For example, the multimedia content includes one or more pages. One of the one or more pages includes multiple video clips temporally and sequentially arranged within the page. In some examples, the page includes three scenes along the scene line, such that the media editing interface may play the first scene first, then the second scene, and finally the third scene without unintended interruption. For example, the transitions from the first scene to the second scene and from the second scene to the third scene are seamless. The multimedia content is posted to the social media platform in the form of a multimedia presentation or multimedia post, for example, a Story presentation, a cover photo, or a social media post.
Fig. 4 illustrates an example of a media element according to aspects of the present disclosure. The illustrated example includes activity 400, item 405, page(s) 410, and scene 415. According to some embodiments of the present disclosure, activity 400 includes one or more items 405. Item 405 includes one or more pages 410. Page 410 includes one or more scenes 415.
In one embodiment, a multimedia processing device combines a set of media files (e.g., video, images, text, subtitles, animation effects) to generate a media item file (e.g., multimedia content). With the multimedia creation tool, the multimedia content is structured as a hierarchy of activity 400, item 405, page(s) 410, and scene(s) 415. In some cases, the set of media files is uploaded and placed on page 410 for subsequent editing and placement. In some cases, the media item file is a continuous file of one or more pages that includes a mix of static content (e.g., images) and dynamic content (e.g., video clips, sound, and animation effects). Media files include different file formats such as video files, audio files, image files, text, and the like.
In some aspects, the scene 415 relates to a media element (e.g., a media file) having a temporal aspect. For example, a scene 415, such as a video file, may have a duration (e.g., time aspect) when uploaded to the multimedia creation tool. In some cases, a scene 415, such as an image, may not have a duration. However, the multimedia creation tool applies a temporal aspect to the scene 415. In some cases, the aspect ratio of the scene 415 may be modified by user input or a multimedia creation tool.
In some aspects, the page 410 serves as a design canvas of the multimedia creation tool, where one or more scenes may be uploaded to the canvas. Page 410 is an organizational unit within an item. In some cases, pages may have different aspect ratios. In some embodiments, page 410 includes one or more scenes that are temporally arranged. A scene line is presented adjacent to the page 410, wherein the scene line includes a temporal arrangement of the one or more scenes 415 on the page 410. For example, page 1 in pages 410 includes a first scene and a second scene. The first scene is temporally placed before the second scene. Page 2 in pages 410 includes a third scene, a fourth scene, and a fifth scene. The third scene is temporally placed before the fourth scene, which is temporally placed before the fifth scene. In some cases, the first page is temporally placed before the second page. For example, the first scene and the second scene are temporally placed before the third scene, the fourth scene, and the fifth scene.
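The nesting described here (scenes ordered within a page, and pages ordered within an item) implies that playback flattens into a single timeline. A sketch, under assumed names, of computing the absolute start time of each scene:

```typescript
// Sketch: flatten the page/scene nesting into absolute start times.
interface Scene { id: string; duration: number }          // seconds
interface Page { id: string; scenes: Scene[] }

interface ScheduledScene { sceneId: string; pageId: string; start: number }

function scheduleScenes(pages: Page[]): ScheduledScene[] {
  const out: ScheduledScene[] = [];
  let t = 0;
  for (const page of pages) {           // page 1 plays before page 2, etc.
    for (const scene of page.scenes) {  // scenes play in order within a page
      out.push({ sceneId: scene.id, pageId: page.id, start: t });
      t += scene.duration;
    }
  }
  return out;
}

// Example mirroring the text: page 1 holds scenes 1-2, page 2 holds scenes 3-5;
// scene 3 therefore starts only after scenes 1 and 2 have finished.
const schedule = scheduleScenes([
  { id: "page1", scenes: [{ id: "s1", duration: 5 }, { id: "s2", duration: 3 }] },
  { id: "page2", scenes: [{ id: "s3", duration: 2 }, { id: "s4", duration: 2 }, { id: "s5", duration: 2 }] },
]);
console.log(schedule); // s3 starts at t = 8
```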
In some embodiments, item 405 relates to a compilation of one or more pages. In some embodiments, item 405 includes one or more pages 410 that are temporally arranged. In some embodiments, pages 410 of item 405 have different aspect ratios.
In some embodiments, activity 400 involves a set of mixed media output use cases. The set of mixed media output use cases serves as sample templates for the item 405 to be generated. For example, an activity may include a social media story, a presentation, a cover photo, and a social media post as media outputs. In some examples, the social media story and the social media post are formatted for particular social media platforms.
Item 405 is an example of or includes aspects of the corresponding element described with reference to fig. 5. Page 410 is an example of or includes aspects of the corresponding elements described with reference to fig. 6, 7, and 11. Scene 415 is an example of or includes aspects of the corresponding elements described with reference to fig. 7.
Fig. 5 illustrates an example of multimedia activity in accordance with aspects of the present disclosure. The illustrated example includes an item 500, a first page 505, a second page 510, a third page 515, a fourth page 520, and a page control element 525. In some cases, item 500 includes multiple pages with different aspect ratios and different content. In some cases, the item 500 may be a continuous media file (e.g., the first through fourth pages may be displayed to the user without delay).
Referring to the example shown in fig. 5, item 500 includes four pages. The first page 505 includes static content. For example, the first page 505 includes an image of a woman on a kayak, the image including one or more scenes. The text "Banff" is located at the bottom of the first page 505. A mountain is placed at the top of the first page 505. In the first page 505, a star pattern is added encircling the woman. The first page 505 is a social media story.
The second page 510 includes media content having a temporal aspect. The second page 510 includes a picture of Mount Fuji. The second page 510 includes one or more scenes that are temporally arranged. In some examples, the second page 510 includes a social media post. The first media object within the scene is a static image depicting Mount Fuji. The second media object is a polygonal shape spatially arranged in the upper left corner of the second page 510. The second media object is presented with the scene for the first 3 seconds (i.e., 0-3 s). The third media object is a polygonal shape spatially arranged in the lower right corner of the second page 510. The third media object is presented within the first 3 seconds (i.e., 0-3 s). The fourth media object is a "Tokyo must-sees" caption. The fourth media object lasts 3 seconds starting from the first second (i.e., 1 s-4 s). The fourth media object appears after the first media object, the second media object, and the third media object. Thus, one or more media elements are temporally arranged within the scene on the second page 510. The one or more media elements are multimedia content (e.g., text, captions, images, audio, video, animation effects) and are associated with corresponding temporal aspects.
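The scene on the second page 510 can be read as a set of elements with time ranges. The sketch below mirrors the example (shapes visible for 0-3 s, the caption for 1-4 s) using hypothetical names, plus a helper that answers which elements are visible at a given playback time:

```typescript
// Sketch: elements of the second page's scene with their time ranges.
interface TimedElement { name: string; start: number; end: number } // seconds

const scene: TimedElement[] = [
  { name: "fuji-background",    start: 0, end: Infinity }, // static image, whole scene
  { name: "top-left-shape",     start: 0, end: 3 },
  { name: "bottom-right-shape", start: 0, end: 3 },
  { name: "tokyo-caption",      start: 1, end: 4 },        // appears one second in
];

// Which elements are visible at playback time t?
function visibleAt(elements: TimedElement[], t: number): string[] {
  return elements.filter((e) => t >= e.start && t < e.end).map((e) => e.name);
}

console.log(visibleAt(scene, 0.5)); // background + both shapes
console.log(visibleAt(scene, 3.5)); // background + caption only
```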
In some examples, the third page 515 is a graphic for a social media platform. The third page 515 includes similar types of media elements and scene(s) as described above with respect to the second page 510.
In some examples, fourth page 520 includes a video story that includes one or more scenes. The fourth page 520 includes similar types of media elements and scene(s) as described above with respect to the second page 510. The page control element 525 is used to add pages, remove pages, zoom in pages, zoom out pages, copy pages, adjust aspect ratios of pages, and/or rotate pages in the project 500.
Item 500 is an example of or includes aspects of the corresponding element described with reference to fig. 4. Page control element 525 is an example of or includes aspects of the corresponding elements described with reference to FIGS. 2, 6, and 7.
Fig. 6 illustrates an example of a media editing interface in accordance with aspects of the present disclosure. The illustrated examples include a media editing interface 600, a page content panel 605, a page 610, a scene line 615, a page control element 620, and a page navigation element 625.
According to some embodiments of the present disclosure, static content may be added to page 610 and placed on page 610. Static content relates to media elements that are not associated with a temporal aspect. Content (e.g., content without temporal aspects and content with temporal aspects) may be added to one or more pages of a multimedia creation tool (e.g., media editing interface 600). When a scene (e.g., a first scene) is added to the page 610, a time control element is displayed in the media editing interface 600. In some cases, the second scene is added to the same page (e.g., page 610). The second scene is temporally disposed relative to the first scene (e.g., the first scene and the second scene are sequentially displayed to the user). In some cases, an object (e.g., a third scene or media element) is added to the same page (e.g., page 610), and a user may assign a temporal aspect to the object.
Referring to FIG. 6, a page content panel 605 provides various tools. For example, a user may select a template (e.g., social media story, presentation, job title, etc.) from the page content panel 605. In addition, the page content panel 605 allows access to recent items, brands, library resources, media files, text, images, video, audio, shapes, and the like.
In the example shown in fig. 6, page 610 includes multiple scenes. When one or more scenes are added to page 610, a scene line 615 is displayed on the media editing interface 600. The scene line 615 includes a total of 10 scenes of page 610. Each scene has a corresponding temporal aspect. For example, a first scene of the scene line 615 may be presented for 2.6 seconds, a second scene may be presented for 0.4 seconds after the first scene, a third scene may be presented for 0.5 seconds after the second scene, and so on. Scenes include static content (e.g., images) and/or dynamic content (e.g., video clips, animation effects). In one embodiment, a scene has the same aspect ratio as that of page 610. In some cases, the scenes have different aspect ratios from one another. Further details regarding the scene line are described with reference to fig. 10. Page 610 is an example of, or includes aspects of, the corresponding elements described with reference to fig. 4, 7, and 11. The scene line 615 is an example of, or includes aspects of, the corresponding elements described with reference to fig. 10 and 12.
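Rendering a scene line like this one requires mapping a playhead position to the scene active at that moment. A sketch using the durations from the example; the function name is an assumption:

```typescript
// Sketch: locate the active scene for a playhead position on the scene line.
interface Scene { id: string; duration: number } // seconds

function sceneAt(scenes: Scene[], playhead: number): { id: string; offset: number } | null {
  let t = 0;
  for (const scene of scenes) {
    if (playhead < t + scene.duration) {
      return { id: scene.id, offset: playhead - t }; // time within this scene
    }
    t += scene.duration;
  }
  return null; // playhead is past the end of the page
}

// Durations from the example: 2.6 s, 0.4 s, 0.5 s, ...
const scenes = [
  { id: "scene1", duration: 2.6 },
  { id: "scene2", duration: 0.4 },
  { id: "scene3", duration: 0.5 },
];
console.log(sceneAt(scenes, 2.8)); // { id: "scene2", offset: ~0.2 }
```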
In some embodiments, page control element 620 is used to add pages, remove pages, zoom in pages, zoom out pages, copy pages, adjust aspect ratios of pages, and/or rotate pages. The page navigation element 625 is used to select and/or navigate between one or more pages of the project file. In the example shown in fig. 6, the user navigates to page 4 using page navigation element 625 on media editing interface 600.
The media editing interface 600 is an example of or includes aspects of the corresponding elements described with reference to fig. 2, 7, and 11. Page control element 620 is an example of or includes aspects of the corresponding elements described with reference to FIGS. 2, 5, and 7. Page navigation element 625 is an example of or includes aspects of the corresponding elements described with reference to FIGS. 2 and 7.
FIG. 7 illustrates an example of a media editing interface in accordance with aspects of the present disclosure. The illustrated examples include a media editing interface 700, an object properties panel 705, a page 710, a contextual track 715, a scene 720, a page control element 725, and a page navigation element 730.
FIG. 7 illustrates an example of the media editing interface when an object or scene on page 710 is selected. For example, the user selects the polygonal shape in the upper left corner of page 710. The media editing interface 700 displays an object properties panel 705 related to the selected object (i.e., the polygon on page 710). The media editing interface 700 also presents a contextual track 715 and a scene 720. The object properties panel 705 is used to add animations to objects, manage styles of objects, add filters to objects, and the like. In some cases, when an animation is applied to an object, the user may further select an entrance ("in") animation, a looping animation, or an exit ("out") animation. When an animation is added to a scene (e.g., scene 720), a playhead may be displayed at the bottom of the media editing interface 700 for the user to preview the animation.
According to some embodiments, a scene line is presented adjacent to page 710, where the scene line includes a temporal arrangement of one or more scenes within the page. The time control element is configured to manage temporal aspects of the scene 720 (e.g., adjusting the temporal length of the scene 720, or rearranging the ordering of a sequence of scenes on page 710). Scene 720 is an example of, or includes aspects of, the corresponding elements described with reference to fig. 4.
In the example shown in fig. 7, the scene line includes a temporal arrangement of the scene 720. A contextual track 715 is presented alongside the scene 720. In some cases, other scenes may be added by selecting the "+" button. In some cases, when an object is selected, the temporal characteristics of the object are managed or modified via the contextual track 715. For example, the user selects the polygonal shape object located in the upper left corner of the media editing interface 700. The user drags either end of the contextual track 715 to adjust the start time, duration, and end time of the polygonal shape object relative to the scene 720. For example, scene 720 is 5 seconds long. The contextual track 715 is modified to start at the same time as the scene 720 and last for 4.5 seconds. Thus, the polygonal object appears at the same time the scene 720 begins to play. As the scene 720 continues to play, the polygonal shape object stops or disappears after 4.5 seconds. The contextual track 715 is an example of, or includes aspects of, the corresponding elements described with reference to fig. 11 and 12. More details regarding modifying a scene are described with reference to fig. 12.
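Dragging the ends of the contextual track amounts to editing a (start, duration) pair that must stay within the parent scene's duration. A hypothetical sketch of that clamping, using the 5-second scene from the example:

```typescript
// Sketch: adjust a contextual track, clamped to its parent scene's duration.
interface ContextualTrack { start: number; duration: number } // seconds

function adjustTrack(track: ContextualTrack, sceneDuration: number): ContextualTrack {
  const start = Math.min(Math.max(track.start, 0), sceneDuration);
  const duration = Math.min(track.duration, sceneDuration - start);
  return { start, duration };
}

// The example: a 5-second scene whose polygon track starts with the scene
// and lasts 4.5 seconds, so the object disappears half a second early.
console.log(adjustTrack({ start: 0, duration: 4.5 }, 5)); // { start: 0, duration: 4.5 }
console.log(adjustTrack({ start: 2, duration: 10 }, 5));  // clamped to { start: 2, duration: 3 }
```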
In some embodiments, the page control element 725 is used to add pages, remove pages, zoom in pages, zoom out pages, copy pages, adjust aspect ratios of pages, and/or rotate pages. Page navigation element 730 is used to select and/or navigate between one or more pages of the project file.
In some cases, opening an item opens a display area in which all pages of the item are displayed. Page navigation element 730 can be provided for navigating between pages (e.g., forward and backward, or spatial navigation based on the layout of the pages). In some cases, media content may be moved from one page to another, for example, using drag-and-drop techniques.
The media editing interface 700 is an example of or includes aspects of the corresponding elements described with reference to fig. 2, 6, and 11. Page 710 is an example of or includes aspects of the corresponding elements described with reference to fig. 4, 6, and 11. The page control element 725 is an example of or includes aspects of the corresponding elements described with reference to fig. 2, 5, and 6. Page navigation element 730 is an example of or includes aspects of the corresponding elements described with reference to FIGS. 2 and 6.
Fig. 8 illustrates an example of a method for multimedia processing in accordance with aspects of the present disclosure. In some examples, these operations are performed by a system comprising a processor that executes a set of codes to control the functional elements of a device. Additionally or alternatively, dedicated hardware is used to perform some of the processing. In general, these operations are performed in accordance with the methods and processes described in accordance with aspects of the present disclosure. In some cases, the operations described herein consist of various sub-steps, or are performed with other operations.
At operation 805, the system opens a project file that includes page data for one or more pages, where each of the one or more pages includes a spatial arrangement of one or more media elements. In some cases, the operations of this step involve, or may be performed by, the media editing interface described with reference to fig. 2, 6, 7, and 11. For example, the user may open the project file using a preset template or a custom template. The user may add or delete pages using the add button of the page control element. In some cases, each page is temporally arranged relative to the other pages. The user may add one or more media elements and/or one or more scenes (e.g., images, text, audio data, video sequences, etc.) on the page(s). In some cases, the user may modify the aspect ratio (e.g., size) of media elements and scenes within a page. In some cases, the media elements are spatially arranged on the page. For example, a media object may be placed in the upper corner of a page, with a scene covering the entire page.
In some cases, the media elements may have different media file formats. Media elements include images, text, audio, video, animation effects, and the like. Different types of media elements may be combined and placed within the same page or within different pages.
In some cases, the term "page" refers to the canvas of the multimedia creation tool. A page is an organizational unit or editing unit within an item. In some cases, multiple pages may have different aspect ratios and/or orientations. One or more scenes may be placed in the same page.
An example of a page is described with reference to fig. 5. Details regarding the page control element are described with reference to fig. 6 and 7. Examples of spatial arrangements of media objects are described with reference to fig. 7 and 11.
At operation 810, the system displays, via the media editing interface, pages of the one or more pages based on the spatial arrangement. In some cases, the operations of this step involve or may be performed by the media editing interface described with reference to fig. 2, 6, 7, and 11. In some cases, the user navigates to the target page to modify the spatial arrangement of one or more media elements within the target page.
In some cases, the term "scene" relates to composite multimedia content associated with a temporal aspect. A scene may include one or more media elements of various types. One or more media elements have corresponding temporal aspects. For example, a scene includes a video clip having a duration (i.e., temporal aspect). The scene also includes a static background image. The scene also includes animation effects associated with the media object, such as a star. A star will pop up after the video clip is played and last for a few seconds. For example, animation effects associated with a media object appear temporally after a video clip. In some examples, an aspect ratio (e.g., size) of the scene may be adjusted.
In some cases, the term "spatial arrangement" relates to the position of one or more media elements within a page or scene. For example, the first media element (image) is spatially arranged in the upper left corner of the page. The second media element (text) is spatially arranged at the bottom of the page. The third media element (video) is spatially arranged in the middle of the page.
In some cases, the term "project" refers to a compilation of one or more pages. In some cases, the term "campaign" relates to a compilation of one or more projects. A campaign includes a set of mixed-media output use cases, such as a social media story, a presentation, a cover photo, and/or a social media post.
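Continuing the illustrative sketch above, the campaign/project hierarchy could be modeled as follows; again, the names are assumptions rather than the disclosure's own data model.

```typescript
// Hypothetical campaign/project hierarchy; Page is as sketched earlier.
interface Page { id: string; }

interface Project {
  name: string;
  pages: Page[]; // a project is a compilation of one or more pages
}

interface Campaign {
  name: string;
  projects: Project[]; // a campaign is a compilation of one or more projects
  // Mixed-media output use cases covered by the campaign.
  useCases: ("social media story" | "presentation" | "cover photo" | "social media post")[];
}
```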
At operation 815, the system displays, via the media editing interface, a scene line adjacent to the page, wherein the scene line includes a temporal arrangement of one or more scenes within the page, and wherein the one or more media elements are temporally arranged within the one or more scenes. In some cases, the operations of this step involve or may be performed by the media editing interface described with reference to fig. 2, 6, 7, and 11.
In some cases, when there are one or more scenes on a page, a scene line is displayed to the user. The scene line displays a timeline of thumbnails ("thumbs") representing the scenes associated with the page at different points in time. In some cases, the scene line includes one or more scenes on the page. In some embodiments, the media editing interface displays a context track adjacent to the scene line, where the context track represents a temporal aspect of a media object. Details of examples of scene lines are further described with reference to fig. 6, 7, 10, and 11.
In some cases, the term "temporal arrangement" relates to the arrangement of the temporal aspects corresponding to one or more scenes on a scene line. For example, the first scene is presented for a duration of 5 seconds, and the second scene is presented after the first scene for a duration of 3 seconds. The time control element of the media editing interface is used to edit the temporal arrangement of the scenes.
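For example, the 5-second and 3-second scenes above could be laid out on the scene line as in the following minimal sketch, under the assumption that scenes simply play back-to-back; the names are hypothetical.

```typescript
// Hypothetical sketch: each scene starts where the previous one ends.
interface Scene {
  id: string;
  durationSec: number;
}

function sceneStartTimes(scenes: Scene[]): Map<string, number> {
  const starts = new Map<string, number>();
  let t = 0;
  for (const scene of scenes) {
    starts.set(scene.id, t); // temporal arrangement: cumulative offset on the scene line
    t += scene.durationSec;
  }
  return starts;
}

const starts = sceneStartTimes([
  { id: "first", durationSec: 5 },
  { id: "second", durationSec: 3 },
]);
console.log(starts.get("second")); // 5: the second scene begins when the first ends
```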
Fig. 9 illustrates an example of a method of modifying a scene based on a time control in accordance with aspects of the present disclosure. In some examples, these operations are performed by a system comprising a processor executing a set of code to control the functional elements of a device. Additionally or alternatively, dedicated hardware is used to perform some of the processing. In general, these operations are performed in accordance with the methods and processes described in accordance with aspects of the present disclosure. In some cases, the operations described herein are composed of various sub-steps, or are performed in conjunction with other operations.
At operation 905, the system displays the time control element in the media editing interface. In some cases, the operations of this step involve or may be performed by a media editing interface as described with reference to fig. 2, 6, 7, and 11. In some examples, the time control element includes a context track for a media object (e.g., a star shape) and a scene line including one or more scenes. The time control element is described in more detail with reference to fig. 10 and 12.
At operation 910, the system receives a time control input via a time control element. In some cases, the operations of this step involve, or may be performed by, the time control elements described with reference to fig. 2 and 12. In some cases, the time control element is used to modify temporal aspects of media elements and scenes. Details regarding modifying the temporal aspects of a media file are described with reference to fig. 12.
At operation 915, the system modifies a scene of the one or more scenes based on the time control input. In some cases, the operations of this step involve or may be performed by a media editing interface as described with reference to fig. 2, 6, 7, and 11. In some examples, the time control input is based on user input.
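As a minimal sketch of operation 915, a time control input might be applied to the scene list as follows; the input shape and all names are assumptions for illustration only.

```typescript
// Hypothetical time control inputs: trim a scene's duration or reorder scenes.
interface Scene { id: string; durationSec: number; }

type TimeControlInput =
  | { type: "trim"; sceneId: string; newDurationSec: number }
  | { type: "reorder"; sceneId: string; newIndex: number };

function applyTimeControl(scenes: Scene[], input: TimeControlInput): Scene[] {
  if (input.type === "trim") {
    // Modify the temporal aspect of one scene, never allowing a negative duration.
    return scenes.map((s) =>
      s.id === input.sceneId ? { ...s, durationSec: Math.max(0, input.newDurationSec) } : s
    );
  }
  // Reorder: remove the scene and re-insert it at the requested position.
  const remaining = scenes.filter((s) => s.id !== input.sceneId);
  const moved = scenes.find((s) => s.id === input.sceneId);
  if (moved) remaining.splice(input.newIndex, 0, moved);
  return remaining;
}
```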
Fig. 10 illustrates an example of a scene line 1000 in accordance with aspects of the disclosure. The illustrated example includes a scene line 1000, a first scene 1005, a second scene 1010, a third scene 1015, and a media file 1020.
According to some embodiments of the present disclosure, scene line 1000 is presented within or adjacent to a page (e.g., page 710 of fig. 7). Scene line 1000 illustrates a temporal arrangement of a set of scenes, where each scene is represented by a corresponding thumbnail showing a representative frame. The user may add different scenes (e.g., video content) to the page. In the example shown in fig. 10, the scene line 1000 includes a first scene 1005, a second scene 1010, a third scene 1015, and additional scenes omitted for brevity.
When a page is selected within a project, various editing tools may be presented. The available editing tools may depend on the type of content in the page. For example, if a page includes video content, a navigation element may be displayed to open and close a scene line interface (e.g., to display scene line 1000) for viewing and editing the video associated with the page.
In some embodiments, scene line 1000 displays a thumbnail representation that includes a sequence of thumbnails representing views of the page at selected points in time. A "play" button is displayed alongside the scene line 1000 (e.g., to its left), and a "plus" button for adding additional content (e.g., one or more additional scenes) is displayed to the right of the last scene on the scene line 1000. Alternatively, media content may be dragged onto the page, and a corresponding scene is automatically added to scene line 1000.
Video may be added to the page and edited using tools integrated into the scene line 1000. For example, the video start time may be edited so that a clip of the video is displayed on the main page while other portions (temporal or spatial) of the video are hidden. In some cases, a context track is displayed near the scene line 1000. The context track may indicate the state of individual elements (or objects) of the page at different points in time, and the individual elements are aligned with the scene line 1000. For example, a static element may be displayed during portions of the video and then removed (or hidden from view). The context track is an example of, or includes aspects of, the corresponding elements described with reference to fig. 7, 11, and 12.
Thus, the scene line 1000 may be nested or embedded in a page-based interface, such that the scene line 1000 represents the time course of one or more scenes within a page. The scenes within the page are associated with corresponding media files (e.g., video files). In some examples, the collection of media files in the same project is presented within the media editing interface as one continuous media file. That is, the media files are arranged along the scene line 1000, and together they form an aggregate media output, such as a video sequence.
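One way to read this is that the per-scene media files concatenate into a single timeline; the following sketch illustrates that reading, with hypothetical names.

```typescript
// Hypothetical aggregation of per-scene clips into one continuous output.
interface Clip {
  mediaFile: string; // e.g., the video file backing one scene
  durationSec: number;
}

interface Segment { mediaFile: string; startSec: number; endSec: number; }

function aggregate(clips: Clip[]): { segments: Segment[]; totalSec: number } {
  const segments: Segment[] = [];
  let t = 0;
  for (const clip of clips) {
    segments.push({ mediaFile: clip.mediaFile, startSec: t, endSec: t + clip.durationSec });
    t += clip.durationSec;
  }
  return { segments, totalSec: t }; // together the clips form one video sequence
}
```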
In the example shown in fig. 10, the scene line 1000 includes a temporal arrangement of 10 scenes. For example, scene line 1000 displays a sequence of thumbnails representing a sequence of 10 different scenes. The scenes have corresponding temporal aspects such that the total length of the 10 scenes is 8 seconds. The first scene 1005 includes a video clip that is 2.6 seconds long. The second scene 1010 includes a video clip that is 0.4 seconds long. The third scene 1015 includes a video clip that is 0.5 seconds long.
In some cases, a scene includes dynamic content (e.g., video, animation effects, audio), static content (e.g., images, text, subtitles), or a combination thereof. In some embodiments, media file 1020 is added to scene line 1000. For example, media file 1020 is an audio track played with the scenes. The track lasts 8 seconds, corresponding to the total length of the 10 scenes.
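The durations in this example can be checked with a short sketch; the three scenes shown account for 3.5 of the 8 seconds. The function name is an illustrative assumption.

```typescript
// Hypothetical check that a backing audio track spans the full scene line.
function totalDurationSec(durations: number[]): number {
  return durations.reduce((sum, d) => sum + d, 0);
}

const shownScenes = [2.6, 0.4, 0.5]; // first, second, and third scenes from fig. 10
console.log(totalDurationSec(shownScenes)); // 3.5; the remaining 7 scenes fill out the 8 s

const audioDurationSec = 8; // media file 1020: an 8-second audio track
// In the figure, all 10 scene durations sum to audioDurationSec.
```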
Scene line 1000 is an example of, or includes aspects of, the corresponding elements described with reference to fig. 6 and 12. The first scene 1005 is an example of, or includes aspects of, the corresponding element described with reference to fig. 11. The second scene 1010 is an example of, or includes aspects of, the corresponding element described with reference to fig. 11.
Fig. 11 illustrates an example of modifying a scene in accordance with aspects of the present disclosure. The illustrated example includes a media editing interface 1100, a page 1105, a context track 1110, a first scene 1115, a second scene 1120, and a media object 1125.
Fig. 11 illustrates a media editing interface 1100 having a page 1105 and a scene line including a temporal arrangement of a first scene 1115 and a second scene 1120 within the page 1105. The time control elements are used to manage the ordering of the scenes (e.g., adding scenes, removing scenes, rearranging scenes) within the page 1105. The time control elements include a video ordering element, a video transition element, or a video pacing element. Editing elements on the left side of the media editing interface 1100 are used to add or modify media content on the page 1105.
In one embodiment, the media editing interface 1100 includes a display area for displaying a project page (e.g., page 1105 displays an infant in a room). When page 1105 is selected, page 1105 may be highlighted or displayed in full contrast, while unselected pages may be blurred, grayed out, hidden, or partially obscured.
Multiple pages may be displayed in a carousel or in a spatial arrangement. The pages of a project may be linked in a structure with a sequential order or a spatial arrangement. Each page may have metadata including page size, page margins, layout, and a list of the elements included in the page.
Referring to fig. 11, the scene line is at the bottom of the media editing interface 1100. The scene line includes a context track 1110, a first scene 1115, and a second scene 1120. Multiple elements (dynamic or static) may be added and spatially arranged (e.g., positioned) within page 1105. In some cases, a "stacked" symbol next to the page 1105 represents the set of scenes within the page 1105 (e.g., two stacked layers represent two scenes in the page 1105).
As shown in fig. 11, the context track 1110 is a temporal representation of the media object 1125. Media object 1125 may be added to page 1105 via an editing element on the left side of media editing interface 1100. When media object 1125 is selected, context track 1110 is presented near the scene line. For example, the scene line includes a temporal arrangement of the first scene 1115 and the second scene 1120. In some cases, the length of the context track 1110 may be adjusted based on the temporal aspects of the first scene 1115 and the second scene 1120. As shown in this example, media object 1125 appears, via context track 1110, in the middle of first scene 1115. The media object 1125 is presented for the remaining duration of the first scene 1115 and the entire duration of the second scene 1120. For example, the first scene 1115 lasts 8 seconds and the second scene 1120 lasts 6 seconds. The media object 1125 is displayed with the first scene 1115 from the fourth second (e.g., t=4s) until the first scene ends (e.g., t=8s), and continues to be displayed with the second scene 1120. When the duration of the second scene 1120 expires (e.g., t=14s), display of the media object 1125 ends. The temporal aspect of the media object 1125 is modified by manipulating the context track 1110 (e.g., dragging the context track 1110 to increase or decrease the length of time the media object 1125 is presented). Thus, the media editing interface 1100 generates multimedia content (e.g., video output) based on the first scene 1115, the second scene 1120, and the media object 1125 within the page 1105. The context track 1110 is an example of, or includes aspects of, the corresponding elements described with reference to fig. 7 and 12.
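The timing in this example can be expressed as a visibility window; the following is a minimal sketch, with hypothetical names, of how the context track's window might be evaluated during playback.

```typescript
// Hypothetical visibility window defined by a context track.
interface ContextTrack {
  startSec: number; // when the media object first appears on the scene line
  endSec: number;   // when display of the media object ends
}

function isObjectVisible(track: ContextTrack, tSec: number): boolean {
  return tSec >= track.startSec && tSec < track.endSec;
}

// First scene 1115 spans 0 to 8 s; second scene 1120 spans 8 to 14 s.
const track1110: ContextTrack = { startSec: 4, endSec: 14 };
console.log(isObjectVisible(track1110, 2));  // false: before the object appears
console.log(isObjectVisible(track1110, 10)); // true: shown over the second scene
```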
The media editing interface 1100 is an example of, or includes aspects of, the corresponding elements described with reference to fig. 2, 6, and 7. Page 1105 is an example of, or includes aspects of, the corresponding elements described with reference to fig. 4, 6, and 7. The first scene 1115 is an example of, or includes aspects of, the corresponding element described with reference to fig. 10. The second scene 1120 is an example of, or includes aspects of, the corresponding element described with reference to fig. 10.
Fig. 12 illustrates an example of modifying a temporal aspect of a media file according to aspects of the present disclosure. The illustrated example includes a time control element 1200, a context track 1205, and a scene line 1210. In some examples, the time control element 1200 includes the context track 1205 and the scene line 1210.
In an embodiment, a temporal aspect (e.g., a length of time) of a media object within the page may be modified. For example, scene line 1210 represents a temporal arrangement of one or more scenes within a page. The context track 1205 is used to manage a media object within the page and includes a collection of thumbnail images associated with the media object. The temporal aspect of the media object is modified by dragging the bar of the context track 1205. For example, the start time, end time, and length of time may be modified via the context track 1205.
The top example of fig. 12 shows a scene that is 5 seconds long, with a media object (e.g., a polygon) presented for 4.5 seconds. The bottom example of fig. 12 shows the same scene, where the media object is modified to be displayed for 4.3 seconds. The scene is thus displayed alone for the first 0.7 seconds; the media object then appears at 0.7 seconds and is displayed with the scene for the remaining 4.3 seconds.
Additionally or alternatively, the user may drag the right side of the context track 1205. In some cases, the context track 1205 is used to extend or shorten the length of time of the media object. The media object may be presented with the scene. In some cases, when a scene ends earlier than the media object, the media object is presented without the scene. Thus, the temporal aspect of the media object may be modified via the time control element 1200.
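A drag on the right side of the track amounts to moving its end time; the sketch below assumes the end time is clamped between the track's start and the scene line's total length, which is one plausible behavior rather than the disclosure's specified one.

```typescript
// Hypothetical resize of a context track by dragging its right side.
interface ContextTrack { startSec: number; endSec: number; }

function dragRightSide(
  track: ContextTrack,
  deltaSec: number,
  sceneLineLengthSec: number
): ContextTrack {
  const endSec = Math.min(
    sceneLineLengthSec,                               // assumed upper bound
    Math.max(track.startSec, track.endSec + deltaSec) // cannot cross the start
  );
  return { ...track, endSec };
}

// From fig. 12: a 5-second scene with the object shown from 0.7 s onward.
const adjusted = dragRightSide({ startSec: 0.7, endSec: 5 }, -0.2, 5);
console.log(adjusted.endSec); // 4.8; the object now ends 0.2 s earlier
```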
The time control element 1200 is an example of, or includes aspects of, the corresponding element described with reference to fig. 2. The context track 1205 is an example of, or includes aspects of, the corresponding elements described with reference to fig. 7 and 11. Scene line 1210 is an example of, or includes aspects of, the corresponding elements described with reference to fig. 6 and 10.
Fig. 13 illustrates an example of a computing device 1300 in accordance with aspects of the disclosure. The illustrated example includes a computing device 1300, a processor 1305, a memory subsystem 1310, a communication interface 1315, an I/O interface 1320, user interface components 1325, and a channel 1330.
In some embodiments, the computing device 1300 is an example of, or includes aspects of, the multimedia processing apparatus 110 described with reference to figs. 1-2. In some embodiments, computing device 1300 includes one or more processors 1305 that may execute instructions stored in memory subsystem 1310 to obtain a project file including page data for one or more pages, wherein each page of the one or more pages includes a spatial arrangement of one or more media elements; present, via the media editing interface, a page of the one or more pages based on the spatial arrangement; and present, via the media editing interface, a scene line adjacent to the page, wherein the scene line includes a temporal arrangement of one or more scenes within the page, and wherein the one or more media elements are temporally arranged within the one or more scenes.
According to some aspects, the computing device 1300 includes one or more processors 1305. In some cases, the processor is a smart hardware device (e.g., a general purpose processing component, a Digital Signal Processor (DSP), a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a microcontroller, an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), a programmable logic device, discrete gate or transistor logic components, discrete hardware components, or a combination thereof). In some cases, the processor is configured to operate the memory array using the memory controller. In other cases, the memory controller is integrated into the processor. In some cases, the processor is configured to execute computer-readable instructions stored in the memory to perform various functions. In some embodiments, the processor includes dedicated components for modem processing, baseband processing, digital signal processing, or transmission processing. The processor 1305 is an example of, or includes aspects of, the processor unit described with reference to fig. 2.
According to some aspects, memory subsystem 1310 includes one or more memory devices. Examples of memory devices include Random Access Memory (RAM), Read-Only Memory (ROM), solid state memory, and hard disk drives. In some examples, memory is used to store computer-readable, computer-executable software comprising instructions that, when executed, cause a processor to perform the various functions described herein. In some cases, the memory contains, among other things, a basic input/output system (BIOS) that controls basic hardware or software operations, such as interactions with peripheral components or devices. In some cases, a memory controller operates the memory cells. For example, the memory controller may include a row decoder, a column decoder, or both. In some cases, memory cells within the memory store information in the form of logical states. Memory subsystem 1310 is an example of, or includes aspects of, the memory unit described with reference to fig. 2.
According to some aspects, the communication interface 1315 operates at the boundary between communication entities (such as computing device 1300, one or more user devices, a cloud, and one or more databases) and channel 1330, and may record and process communications. In some cases, the communication interface 1315 enables a processing system coupled to a transceiver (e.g., a transmitter and/or receiver). In some examples, the transceiver is configured to transmit and receive signals for the communication device via an antenna.
According to some aspects, the I/O interface 1320 is controlled by an I/O controller to manage input and output signals of the computing device 1300. In some cases, I/O interface 1320 manages peripheral devices that are not integrated into computing device 1300. In some cases, I/O interface 1320 represents a physical connection or port to an external peripheral device. In some cases, the I/O controller uses an operating system known to those skilled in the art. In some cases, the I/O controller represents or interacts with a modem, keyboard, mouse, touch screen, or similar device. In some cases, the I/O controller is implemented as a component of a processor. In some cases, the user interacts with the device via the I/O interface 1320 or via hardware components controlled by the I/O controller.
According to some aspects, user interface component(s) 1325 enable a user to interact with computing device 1300. In some cases, the user interface component(s) 1325 include an audio device such as an external speaker system, an external display device such as a display screen, an input device (e.g., a remote control device that interfaces with the user interface directly or through an I/O controller), or a combination thereof. In some cases, the user interface component(s) 1325 include a GUI. In some embodiments, the user interface component(s) 1325 are examples of or include aspects of the media editing interface described with reference to fig. 2, 6, 7, and 11.
The description and drawings described herein represent example configurations and do not represent all implementations that are within the scope of the claims. For example, operations and steps may be rearranged, combined, or otherwise modified. Furthermore, structures and devices may be shown in block diagram form in order to represent relationships between components and to avoid obscuring the concepts described. Similar components or features may have the same name but may have different reference numbers corresponding to different figures.
Some modifications to the disclosure will be apparent to those skilled in the art, and the principles defined herein may be applied to other variations without departing from the scope of the disclosure. Thus, the disclosure is not limited to the examples and designs described herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
In this disclosure and in the following claims, the word "or" indicates an inclusive list, e.g., a list of X, Y or Z represents X or Y or Z or XY or XZ or YZ or XYZ. Furthermore, the phrase "based on" is not intended to represent a closed set of conditions. For example, a step described as "based on condition a" may be based on both condition a and condition B. In other words, the phrase "based on" should be construed as "based, at least in part, on". Furthermore, the words "a" or "an" indicate "at least one".

Claims (20)

1. A method, comprising:
obtaining a project file comprising page data for one or more pages, wherein each of the one or more pages comprises a spatial arrangement of one or more media elements;
presenting, via a media editing interface, a page of the one or more pages based on the spatial arrangement; and
presenting, via the media editing interface, a scene line adjacent to the page, wherein the scene line comprises a temporal arrangement of one or more scenes within the page, and wherein the one or more media elements are temporally arranged within the one or more scenes.
2. The method of claim 1, further comprising:
identifying a campaign comprising a plurality of projects;
presenting the plurality of projects in the campaign; and
receiving user input identifying a project from the plurality of projects, wherein the project file is obtained based on the user input.
3. The method of claim 1, further comprising:
presenting a page navigation element in the media editing interface;
receiving a page navigation input via the page navigation element; and
presenting, via the media editing interface, an additional page of the one or more pages based on the page navigation input.
4. The method according to claim 1, wherein:
the project file includes an ordering of the one or more pages.
5. The method of claim 1, further comprising:
presenting a page control element in the media editing interface; and
receiving a page control input via the page control element.
6. The method according to claim 5, wherein:
The page control element includes an add page element, a remove page element, a copy page element, a page size element, or a page orientation element.
7. The method of claim 1, further comprising:
presenting a time control element in the media editing interface;
receiving a time control input via the time control element; and
modifying a scene of the one or more scenes based on the time control input.
8. The method of claim 7, wherein:
the time control element comprises a video ordering element, a video transition element, or a video pacing element.
9. The method of claim 1, further comprising:
receiving a media selection input selecting a media element of the one or more media elements; and
presenting a context track adjacent to the scene line based on the media selection input.
10. The method of claim 1, further comprising:
receiving a media location input modifying a location of a media element of the one or more media elements; and
presenting the media element at the modified location within the page.
11. The method of claim 1, further comprising:
receiving user input via the media editing interface; and
generating a modified project file based on the user input.
12. The method of claim 11, further comprising:
publishing the modified project file to a social media platform.
13. The method according to claim 11, wherein:
The modified project file includes a multi-scene video.
14. The method according to claim 11, wherein:
The modified project file includes a multi-page presentation.
15. A non-transitory computer readable medium storing code for content editing, the code comprising instructions executable by a processor to:
presenting content to a user via a media editing interface, the media editing interface comprising a page control element for modifying a spatial arrangement of pages of the content and a time control element for modifying a temporal arrangement of scenes located within the pages;
receiving page control input for editing the content via the page control element;
receiving a time control input for editing the content via the time control element; and
generating modified content based on the page control input and the time control input.
16. The non-transitory computer-readable medium of claim 15, the code further comprising instructions executable by the processor to:
generating a multi-scene video based on the time control input, wherein the modified content includes the multi-scene video.
17. An apparatus, comprising:
A processor; and
A memory comprising instructions executable by the processor to:
obtain, via a media editing interface, a project file comprising page data for one or more pages, wherein each of the one or more pages comprises a spatial arrangement of one or more media elements;
present, via the media editing interface, a page of the one or more pages based on the spatial arrangement; and
present, via the media editing interface, a scene line adjacent to the page, wherein the scene line comprises a temporal arrangement of one or more scenes within the page, and wherein the one or more media elements are temporally arranged within the one or more scenes.
18. The apparatus of claim 17, wherein:
The media editing interface includes a page navigation element, a page control element, and a time control element.
19. The apparatus of claim 18, wherein:
The page control element includes an add page element, a remove page element, a copy page element, a page size element, or a page orientation element.
20. The apparatus of claim 18, wherein:
the time control element comprises a video ordering element, a video transition element, or a video pacing element.
CN202310982890.XA 2022-10-17 2023-08-07 Adaptive editing experience for mixed media content Pending CN117911582A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US63/379,811 2022-10-17
US18/348,522 US20240127512A1 (en) 2022-10-17 2023-07-07 Adaptive editing experience for mixed media content
US18/348,522 2023-07-07

Publications (1)

Publication Number Publication Date
CN117911582A true CN117911582A (en) 2024-04-19

Family

ID=90691131

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310982890.XA Pending CN117911582A (en) 2022-10-17 2023-08-07 Adaptive editing experience for mixed media content

Country Status (1)

Country Link
CN (1) CN117911582A (en)

Similar Documents

Publication Publication Date Title
US11682150B2 (en) Systems and methods for publishing and/or sharing media presentations over a network
US11354022B2 (en) Multi-directional and variable speed navigation of collage multi-media
US9262036B2 (en) Website image carousel generation
US9043726B2 (en) Position editing tool of collage multi-media
US20070162953A1 (en) Media package and a system and method for managing a media package
WO2022205798A1 (en) Multimedia information editing method and apparatus therefor
US9843823B2 (en) Systems and methods involving creation of information modules, including server, media searching, user interface and/or other features
US11373028B2 (en) Position editing tool of collage multi-media
CN117911582A (en) Adaptive editing experience for mixed media content
US20240127512A1 (en) Adaptive editing experience for mixed media content
CA2857519A1 (en) Systems and methods involving features of creation/viewing/utilization of information modules
AU2005233653A1 (en) A media package and a system and method for managing a media package

Legal Events

Date Code Title Description
PB01 Publication