US20120251080A1 - Multi-layer timeline content compilation systems and methods - Google Patents

Multi-layer timeline content compilation systems and methods Download PDF

Info

Publication number
US20120251080A1
Authority
US
United States
Prior art keywords
content
layer
engine
compilation
timeline
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/433,239
Inventor
Jostein SVENDSEN
Bjørn Rustberggaard
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
WEVIDEO Inc
Original Assignee
WEVIDEO Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by WEVIDEO Inc filed Critical WEVIDEO Inc
Priority to US13/433,239
Assigned to WEVIDEO, INC. reassignment WEVIDEO, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: RUSTBERGGAARD, BJORN, SVENDSEN, JOSTEIN
Publication of US20120251080A1

Classifications

    • G: PHYSICS
    • G11: INFORMATION STORAGE
    • G11B: INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00: Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/02: Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B 27/031: Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/70: Information retrieval; Database structures therefor; File system structures therefor of video data
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20: Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/21: Server components or server architectures
    • H04N 21/222: Secondary servers, e.g. proxy server, cable television Head-end
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20: Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/23: Processing of content or additional data; Elementary server operations; Server middleware
    • H04N 21/233: Processing of audio elementary streams
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20: Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/23: Processing of content or additional data; Elementary server operations; Server middleware
    • H04N 21/234: Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N 21/23424: Processing of video elementary streams involving splicing one content stream with another content stream, e.g. for inserting or substituting an advertisement
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20: Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/25: Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N 21/254: Management at additional data server, e.g. shopping server, rights management server
    • H04N 21/2541: Rights Management
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20: Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/25: Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N 21/254: Management at additional data server, e.g. shopping server, rights management server
    • H04N 21/2543: Billing, e.g. for subscription services
    • H04N 21/25435: Billing involving characteristics of content or additional data, e.g. video resolution or the amount of advertising
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47: End-user applications
    • H04N 21/472: End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N 21/47205: End-user interface for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/76: Television signal recording

Definitions

  • a film is more than the sum of its composite parts and, recognizing this fact, creative professionals have long toiled to recreate creative elements in film.
  • creative professionals use physical media to capture specific scenes and manually add soundtracks, video clips, and special effects to incorporate creative elements like story elements, plots, characters, and thematic elements.
  • the process provides a classical touch and feel that aligns with the creative energies of film producers, directors, screenwriters, and editors.
  • the process is expensive and requires access to editing equipment typically located in film studios.
  • Locally installed film editing systems such as standalone computer programs allow users to edit digital media using locally stored special effects.
  • locally installed film editing systems require creative professionals to purchase special effects packages, limiting a creative professional to the editing effects locally installed on his or her computer.
  • Locally installed film editing systems therefore fail to deliver high quality without high cost.
  • locally installed film editing systems make it hard for creative professionals to add creative elements to streaming media, including streaming media from multiple sources, such as crowdsourced streaming media.
  • the present application discloses systems and methods of creating and compiling multi-layer timeline compilations.
  • the disclosed systems and methods allow people to access high-quality editing tools without entering film studios and without installing these editing tools on their computers.
  • the disclosed systems and methods are portable and obviate the need for specialized or high-performance computers.
  • Systems include a content datastore, a layer datastore, a layer-scalable content editor launch engine, a client-remote content placement engine, a layer addition engine, a layer superimposing engine, and a variable-layer content playing engine that supports an arbitrary number of layers.
  • the layer-scalable content editor launch engine receives a request to launch an editor window for display on a client, while the client-remote content placement engine receives an instruction to place content from the content datastore on a multi-layer timeline content compilation represented in the editor window, wherein the content datastore is remote relative to the client.
  • the layer addition engine receives an instruction to superimpose a superimposable layer from the layer datastore onto existing layers of the multi-layer timeline content compilation.
  • the layer superimposing engine superimposes, in response to the instruction to superimpose, the superimposable layer onto the existing layers of the multi-layer timeline content compilation, thereby creating a superimposed layer.
  • the variable-layer content playing engine plays the multi-layer timeline content compilation, including the content, in the superimposed layer.
  • the existing layers comprise first crowdsourcing content and the superimposable layer comprises second crowdsourcing content.
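The superimposing flow described above can be sketched as a small data model: a compilation is an ordered stack of layers on a shared timeline, and superimposing appends a layer that renders above the existing ones. All names here (Clip, Layer, TimelineCompilation) are illustrative assumptions, not terms from this application.

```python
from dataclasses import dataclass, field
from typing import Iterator, List

@dataclass
class Clip:
    content_id: str   # reference into the (client-remote) content datastore
    start: float      # position on the timeline, in seconds
    duration: float

@dataclass
class Layer:
    clips: List[Clip] = field(default_factory=list)

@dataclass
class TimelineCompilation:
    layers: List[Layer] = field(default_factory=list)

    def superimpose(self, layer: Layer) -> None:
        """Place a superimposable layer on top of the existing layers."""
        self.layers.append(layer)

    def play_order(self) -> Iterator[Layer]:
        """Yield layers bottom-up; the most recently superimposed layer renders last, on top."""
        yield from self.layers

video = Layer([Clip("clip-001", start=0.0, duration=10.0)])
titles = Layer([Clip("title-001", start=0.0, duration=3.0)])

compilation = TimelineCompilation([video])
compilation.superimpose(titles)   # titles now render above the video layer
```

A playing engine would walk `play_order()` and composite each layer's clips at their timeline positions.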
  • Systems can include a content account engine that, in operation, maintains a plurality of user accounts.
  • the content account engine can also associate the multi-layer timeline content compilation with one of the plurality of user accounts.
  • the content account engine can limit another of the plurality of user accounts from accessing the multi-layer timeline content compilation.
  • Systems can include a content rights management engine.
  • the content rights management engine can, in operation, provide a depublication instruction if a user associated with the layer superimposing engine does not have publication rights to the content.
  • Systems can include a publication engine.
  • the publication engine, in operation, can provide an instruction to publish the multi-layer timeline content compilation into a streaming media format or a downloadable media format if a user associated with the layer superimposing engine has publication rights to the content.
  • Systems can also include a royalty estimation engine that, in operation, estimates a cumulative royalty associated with publishing the multi-layer timeline content compilation.
  • the royalty estimation engine can, in operation, suggest a revenue amount for the multi-layer timeline content compilation. The revenue amount can be based at least in part on the cumulative royalty.
  • a content rights acquisition engine that, in operation, provides a request to acquire publication rights if a user associated with the layer superimposing engine does not have publication rights to the content.
  • Systems can also include a content rights crediting engine that, in operation, provides an instruction to acquire publication credits associated with the content, and provides the publication credits for storage in the layer datastore if a user associated with the layer superimposing engine does not have publication rights to the content.
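The rights and royalty behaviors above can be summarized as a branch on whether the user holds publication rights, plus a royalty sum that informs a suggested revenue amount. This is a hypothetical sketch; the function names, return labels, and the 25% margin are assumptions, not taken from this application.

```python
def cumulative_royalty(per_item_royalties):
    """Royalty estimation engine: sum the royalty owed on each licensed
    content item used in the compilation."""
    return sum(per_item_royalties)

def suggested_revenue(royalty_total, margin=0.25):
    """Suggest a revenue amount that at least covers the cumulative royalty.
    The margin is an illustrative assumption."""
    return royalty_total * (1 + margin)

def publication_action(user_has_rights, acquire_credits=False):
    """Decision branch combining the publication, depublication,
    rights-acquisition, and crediting engines described above."""
    if user_has_rights:
        return "publish"                      # streaming or downloadable format
    if acquire_credits:
        return "acquire_publication_credits"  # content rights crediting engine
    return "depublish_and_request_rights"     # rights management + acquisition

royalty = cumulative_royalty([0.10, 0.25, 0.05])  # three licensed clips
price = suggested_revenue(royalty)
```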
  • Systems can include a crowdsourcing engine.
  • the crowdsourcing engine can, in operation, provide a subject-specific content call, and can receive, in response to the subject-specific content call, subject-specific content.
  • the crowdsourcing engine can also provide the subject-specific content to the content datastore.
  • Systems can include a third-party layer-package engine that, in operation, provides a package of proprietary third-party layers to the layer datastore.
  • Systems can include a content monetization engine.
  • the content monetization engine can provide an advertising package or a pay-per-view package to the layer datastore.
  • Methods can include receiving a request to launch an editor window for display on a client; receiving an instruction to place content from a content datastore on a multi-layer timeline content compilation represented in the editor window, wherein the content datastore is remote relative to the client; receiving an instruction to superimpose a superimposable layer on existing layers of the multi-layer timeline content compilation; superimposing, in response to the instruction to superimpose, the superimposable layer onto the existing layers of the multi-layer timeline content compilation, thereby creating a superimposed layer; and playing the multi-layer timeline content compilation, including the content, in the superimposed layer.
  • Methods can also include maintaining a plurality of user accounts; associating the multi-layer timeline content compilation with one of the plurality of user accounts; and limiting another of the plurality of user accounts from accessing the multi-layer timeline content compilation. Methods can further include providing a depublication instruction if a user does not have publication rights to the content.
  • Methods can include providing an instruction to publish the multi-layer timeline content compilation into a streaming media format or a downloadable media format if a user has publication rights to the content. Methods can also include estimating a cumulative royalty associated with publishing the multi-layer timeline content compilation. Further, methods can include suggesting a revenue amount for the multi-layer timeline content compilation, the revenue amount based at least in part on the cumulative royalty. Methods can also include providing a request to acquire publication rights to the content if the user does not have the publication rights to the content. Methods can include providing an instruction to acquire publication credits associated with the content, and providing the publication credits, if the user does not have publication rights to the content. Methods can include providing a notification if the user does not have publication rights to the content.
  • Methods can include providing a subject-specific content call; receiving, in response to the subject-specific content call, subject-specific content; and providing the subject-specific content to the content datastore. For example, there can be provided a package of proprietary layers to the layer datastore. Methods can also include providing an advertising package or a pay-per-view package to the layer datastore.
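The crowdsourcing steps above (issue a subject-specific content call, receive responses, store them in the content datastore) can be sketched as follows. The contributor callables and dictionary fields are hypothetical stand-ins for whatever submission channel an implementation would use.

```python
def subject_specific_content_call(subject, contributors):
    """Crowdsourcing engine sketch: broadcast a subject-specific content call
    and collect matching submissions for the content datastore."""
    call = {"subject": subject}
    content_datastore = []
    for contributor in contributors:
        submission = contributor(call)   # a contributor may decline (None)
        if submission and submission.get("subject") == subject:
            content_datastore.append(submission)
    return content_datastore

contributors = [
    lambda call: {"subject": call["subject"], "clip": "parade-01.mp4"},
    lambda call: None,                                   # declines the call
    lambda call: {"subject": "unrelated", "clip": "cat.mp4"},  # off-topic
]
collected = subject_specific_content_call("street parade", contributors)
```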
  • FIG. 1 shows a diagram of an example of a network environment.
  • FIG. 2 shows a diagram of an example of a variable-layer content server.
  • FIG. 3 shows a diagram of an example of a variable-layer content client.
  • FIG. 4 shows a flowchart of an example of a method for providing subject-specific content in response to a subject-specific content call.
  • FIG. 5 shows a flowchart of an example of a method for creating a multi-layer timeline content compilation.
  • FIG. 6 shows a flowchart of an example of a method for publishing a multi-layer timeline content compilation.
  • FIG. 7 shows an example of a system on which techniques described in this paper can be implemented.
  • FIG. 8 shows a variable-layer content client web browser screenshot.
  • FIG. 9 shows a variable-layer content client web browser screenshot.
  • FIG. 10 shows a variable-layer content client web browser screenshot.
  • FIG. 11 shows a variable-layer content client web browser screenshot.
  • FIG. 12 shows a variable-layer content client web browser screenshot.
  • a processor, as used in this paper, can be a processor configured to execute instructions stored on and/or provided by a memory coupled to the processor.
  • a component such as a processor or a memory described as being configured to perform a task may be implemented as a general component that is configured to perform the task at a given time or a specific component that is manufactured to perform the task.
  • processor refers to one or more devices, circuits, and/or processing cores configured to process data, such as computer program instructions.
  • FIG. 1 shows a diagram of an example of a network environment 100 .
  • the network environment 100 includes a variable-layer content server 102, a network 104, and a variable-layer content client 106.
  • the variable-layer content server 102 can include a computer configured to provide variable-layer content editing services to other computers.
  • the variable-layer content server 102 can include engines that allow a user to edit content, such as video, audio, or other media content, without requiring the user to locally install editing software or download the content being edited.
  • an engine includes a dedicated or shared processor and, typically, firmware or software modules that are executed by the processor. Depending upon implementation-specific or other considerations, an engine can be centralized or its functionality distributed. An engine includes special purpose hardware, firmware, or software embodied in a computer-readable medium for execution by the processor.
  • a computer-readable medium is intended to include all mediums that are statutory (e.g., in the United States, under 35 U.S.C. § 101), and to specifically exclude all mediums that are non-statutory in nature to the extent that the exclusion is necessary for a claim that includes the computer-readable medium to be valid.
  • Known statutory computer-readable mediums include hardware (e.g., registers, random access memory (RAM), non-volatile (NV) storage, to name a few), but may or may not be limited to hardware.
  • the variable-layer content server 102 can include an operating system.
  • An operating system is a set of programs that manage computer hardware resources, and provides common services for application software. The operating system enables an application to run on a computer, whereas only applications that are self-booting can generally run on a computer that does not have an operating system.
  • Operating systems are found in almost any device that includes a computer (e.g., cellular phones, video game consoles, web servers, etc.). Examples of popular modern operating systems are Linux, Android, iOS, Mac OS X, and Microsoft Windows®.
  • Embedded operating systems are designed to operate on small machines like PDAs with less autonomy (Windows CE and Minix 3 are some examples of embedded operating systems).
  • Operating systems can be distributed, which makes a group of independent computers act in some respects like a single computer.
  • Operating systems often include a kernel, which controls low-level processes that most users cannot see (e.g., how memory is read and written, the order in which processes are executed, how information is received and sent by I/O devices, and how to interpret information received from networks).
  • Operating systems often include a user interface that interacts with a user directly to enable control and use of programs. The user interface can be graphical with icons and a desktop or textual with a command line.
  • Which features are considered part of the operating system is defined differently in various operating systems, but all of the components are treated as part of the operating system in this paper for illustrative convenience.
  • variable-layer content server 102 can include datastores that hold content to be edited as well as editing layers, and other content.
  • a datastore can be implemented, for example, as software embodied in a physical computer-readable medium on a general- or specific-purpose machine, in firmware, in hardware, in a combination thereof, or in an applicable known or convenient device or system.
  • Datastores in this paper are intended to include any organization of data, including tables, comma-separated values (CSV) files, traditional databases (e.g., SQL), or other applicable known or convenient organizational formats.
  • Datastore-associated components such as database interfaces, can be considered “part of” a datastore, part of some other system component, or a combination thereof, though the physical location and other characteristics of datastore-associated components is not critical for an understanding of the techniques described in this paper.
  • Datastores can include data structures.
  • a data structure is associated with a particular way of storing and organizing data in a computer so that it can be used efficiently within a given context.
  • Data structures are generally based on the ability of a computer to fetch and store data at any place in its memory, specified by an address, a bit string that can be itself stored in memory and manipulated by the program.
  • Some data structures are based on computing the addresses of data items with arithmetic operations; while other data structures are based on storing addresses of data items within the structure itself.
  • Many data structures use both principles, sometimes combined in non-trivial ways.
  • the implementation of a data structure usually entails writing a set of procedures that create and manipulate instances of that structure.
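The two addressing principles described above can be illustrated with Python stand-ins: an array computes an element's location by arithmetic on an index, while a linked structure stores the address (reference) of the next item inside each item. The names below are purely illustrative.

```python
# Arithmetic addressing: an array locates an element from its index,
# conceptually base_address + index * element_size.
scores = [10, 20, 30]
third = scores[2]

# Stored addresses: a linked list keeps a reference to the next node
# inside each node, and items are reached by following those references.
class Node:
    def __init__(self, value, nxt=None):
        self.value = value
        self.next = nxt

head = Node(10, Node(20, Node(30)))
third_linked = head.next.next.value
```

Both reach the same value; they differ in how the address of that value is obtained, which is the distinction the passage draws.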
  • the engines in the variable-layer content server 102 can be cloud-based engines, and the datastores in the variable-layer content server 102 can be cloud-based datastores.
  • the engines and the datastores in the variable-layer content server 102 run applications in a cloud-based computing environment.
  • the variable-layer content server 102 can host a website affiliated with providing variable-layer content editing services. The website can access engines and datastores that provide a user with tools to edit the content online.
  • the engines in the variable-layer content server 102 can execute on the variable-layer content server 102 and can provide a cloud-based interface for display on a host application, such as a web browser on the variable-layer content client 106 . As the engines execute in the cloud, the engines on the variable-layer content server 102 provide a web-based application that looks, acts, and behaves like a locally installed application.
  • the web-based application provided by the engines on the variable-layer content server 102 can run in a host application such as a web browser and obviates the need for specific applications to be installed on client computers.
  • a user need not purchase a proprietary operating system or install expensive content editing software, as long as the user has access to a web browser that can access the engines and datastores on the variable-layer content server 102 .
  • a user also need not purchase expensive and high-performance computing equipment or memory.
  • a user need not purchase extensive content editing packages, such as high-quality editing-effects packages as editing-effects packages would be stored and executed on the variable-layer content server 102 .
  • Users need not worry about software becoming obsolete because a remote online application can be used to run any executable file, regardless of whether the file is currently executable on the user's device; legacy platforms can run on any device.
  • the network 104 can include a computer network.
  • the network 104 can include communication channels to connect server resources and information in the variable-layer content server 102 with client resources and information in the variable-layer content client 106 .
  • the network 104 can be implemented as a personal area network (PAN), a local area network (LAN), a home network, a storage area network (SAN), a metropolitan area network (MAN), an enterprise network such as an enterprise private network, a virtual network such as a virtual private network (VPN), or other network.
  • One network of particular interest for an online application service is the World Wide Web (“the Web”), which is one of the services running on the Internet.
  • the Web is a system of interlinked hypertext documents accessed via the Internet.
  • the network 104 can serve to connect people located around a common area, such as a school, workplace, or neighborhood.
  • the network 104 can also connect people belonging to a common organization, such as a workplace. Portions of the network 104 can be secure and other portions of the network 104 can be insecure.
  • the network 104 can use a variety of physical or other media to connect the variable-layer content server 102 with the variable-layer content client 106 .
  • the network 104 can connect the variable-layer content server 102 with the variable-layer content client 106 using some combination of wired technologies, such as twisted pair wire cabling, coaxial cabling, optical fiber cabling, or other cabling.
  • Wireless networks will typically include an internetworking unit (IWU) that interconnects wireless devices on the relevant one of the wireless networks with another network, such as a wired LAN.
  • The IWU is sometimes referred to as a wireless access point (WAP).
  • a WAP is also defined as a station.
  • a station can be a non-WAP station or a WAP station.
  • in a cellular network, the WAP is often referred to as a base station.
  • Wireless networks can be implemented using any applicable technology, which can differ by network type or in other ways.
  • the wireless networks can be of any appropriate size (e.g., metropolitan area network (MAN), personal area network (PAN), etc.). Broadband wireless MANs may or may not be compliant with IEEE 802.16, which is incorporated by reference. Wireless PANs may or may not be compliant with IEEE 802.15, which is incorporated by reference.
  • the wireless networks can be identifiable by network type (e.g., 2G, 3G, Wi-Fi), service provider, WAP/base station identifier (e.g., Wi-Fi SSID, base station and sector ID), geographic location, or other identification criteria.
  • the wireless networks may or may not be coupled together via an intermediate network.
  • the intermediate network can include practically any type of communications network, such as, by way of example but not limitation, the Internet, a public switched telephone network (PSTN), or an infrastructure network (e.g., private LAN).
  • the term “Internet” as used herein refers to a network of networks which uses certain protocols, such as the TCP/IP protocol, and possibly other protocols such as the hypertext transfer protocol (HTTP) for hypertext markup language (HTML) documents that make up the World Wide Web (the web).
  • variable-layer content client 106 can include a computer, which can, in general, have an operating system and include datastores and engines.
  • the variable-layer content client 106 can execute variable-layer content editing services inside a host application (e.g., it can execute a browser plug-in in a web browser).
  • the browser plug-in can provide an interface such as a graphical user interface (GUI) for a user to access the content editing services on the variable-layer content server 102 .
  • the browser plug-in can include a GUI to display content and layers on the datastores in the variable-layer content server 102 .
  • the browser plug-in can have display capabilities like the capabilities provided by proprietary commercially available plug-ins like Adobe® Flash Player, QuickTime®, and Microsoft Silverlight®.
  • the browser plug-in can also include an interface to execute functionalities on the engines in the variable-layer content server 102 .
  • a device on which the variable-layer content client 106 is implemented can be implemented as a station.
  • a station as used herein, may be referred to as a device with a media access control (MAC) address and a physical layer (PHY) interface to the wireless medium that comply with, e.g., the IEEE 802.11 standard.
  • a station can be described as “IEEE 802.11-compliant” when compliance with the IEEE 802.11 standard is intended to be explicit.
  • IEEE 802.11™-2007 (Revision of IEEE Std 802.11-1999) is incorporated by reference.
  • IEEE 802.11k-2008, IEEE 802.11n-2009, IEEE 802.11p-2010, IEEE 802.11r-2008, IEEE 802.11w-2009, and IEEE 802.11y-2008 are also incorporated by reference.
  • one or more wireless devices may comply with some other standard or no standard at all, and may have different interfaces to a wireless or other medium. It should be noted that not all standards refer to wireless devices as “stations,” but where the term is used in this paper, it should be understood that an analogous unit will be present on all applicable wireless networks. Thus, use of the term “station” should not be construed as limiting the scope of an embodiment that describes wireless devices as stations to a standard that explicitly uses the term, unless such a limitation is appropriate in the context of the discussion.
  • FIG. 2 shows a diagram of an example of a variable-layer content server 200 .
  • the variable-layer content server 200 includes a number of engines and datastores.
  • the variable-layer content server 200 includes a layer-scalable content editor launch engine 202, a client-remote content placement engine 204, a layer addition engine 206, a layer superimposing engine 208, a variable-layer content playing engine 210, a content account engine 212, a content publication engine 214, a content rights management engine 216, a royalty estimation engine 218, a content rights acquisition engine 220, a content rights crediting engine 222, a content crowdsourcing engine 224, a third-party layer-package engine 226, and a content monetization engine 228.
  • the variable-layer content server 200 includes a content datastore 230, a layer datastore 232, and an account datastore 234.
  • the layer-scalable content launch engine 202 receives a request to launch an editor window for display on a client.
  • a request to the layer-scalable content launch engine 202 can include a character string that identifies a client device and provides edit window parameters.
  • the request can also include a parameterized object or an instance of a class used to identify the client device and provide the edit window parameters.
  • the request is often received over a network connection to the variable-layer content server 200.
  • the request is often decoded from a data packet sent to the variable-layer content server 200 over the network.
  • the request can also be unencrypted if a secure network connection was used to transmit the request.
  • the request may be decoded from a secure or an unsecure data packet.
  • the request to the layer-scalable content launch engine 202 can identify a client device.
  • the request can contain a network address such as an Internet Protocol (IP) or other address of the client.
  • the request can also contain a device identifier such as a Media Access Control (MAC) address of the client.
  • the layer-scalable content launch engine 202 can identify a client using destination/network identifiers to launch an editor window on the client.
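The client-identification portion of the launch request described above can be sketched as a simple request parser. This is a minimal illustrative sketch; the field names (`ip`, `mac`, `os`, `host_app`) are assumptions for illustration and are not fixed by the disclosure.

```python
# Hypothetical sketch of decoding a launch request into client identifiers.
# Field names are illustrative assumptions; the disclosure does not fix a format.
from dataclasses import dataclass

@dataclass
class LaunchRequest:
    ip: str             # network address used to direct the editor window
    mac: str            # device identifier (Media Access Control address)
    os: str = ""        # optional host operating system
    host_app: str = ""  # optional host application, e.g. a web browser

def parse_launch_request(fields: dict) -> LaunchRequest:
    """Build a LaunchRequest from decoded packet fields, requiring a client id."""
    if "ip" not in fields and "mac" not in fields:
        raise ValueError("request must identify the client by IP or MAC address")
    return LaunchRequest(
        ip=fields.get("ip", ""),
        mac=fields.get("mac", ""),
        os=fields.get("os", ""),
        host_app=fields.get("host_app", ""),
    )
```

With a request parsed this way, the engine has both the destination/network identifiers it needs to launch the editor window and the optional host parameters discussed next.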
  • the request to the layer-scalable content launch engine 202 can also identify parameters of a client host application.
  • the request can identify the operating system on the client and can help the layer-scalable content launch engine 202 determine whether to support the client operating system.
  • the request can also identify the type and version of a host application, such as a web browser, on the client.
  • the request can further identify the screen resolution, processor speed, memory, and network speed of the client device.
  • the layer-scalable content launch engine 202 can determine whether to support the client's specific host application.
  • the layer-scalable content launch engine 202 can also use the request to supply an edit window with default parameters based on any of the OS or the host application parameters in the request.
  • the layer-scalable content launch engine 202 can further determine whether to recommend an upgraded operating system or host application to the client.
  • the request to the layer-scalable content launch engine 202 can help the layer-scalable content launch engine 202 perform a “smart-bandwidth” determination.
  • the layer-scalable content launch engine 202 can calculate the resolution of the content to provide for editing. For instance, if the request identifies a client connected to a Digital Signal 3 (T3) connection or other relatively fast Internet connection, the layer-scalable content launch engine 202 can determine it is desirable to provide relatively high quality media content (e.g., high definition (HD) media content) for editing.
  • the layer-scalable content launch engine 202 can determine it is desirable to provide relatively low quality media content for editing.
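The "smart-bandwidth" determination above can be sketched as a mapping from a client's measured connection speed to a delivery resolution. The thresholds below are illustrative assumptions, not values from the disclosure (a T3 line carries roughly 45 Mbit/s).

```python
# Illustrative "smart-bandwidth" sketch: pick an editing resolution from the
# client's measured downstream speed. Thresholds are assumptions.
def select_editing_resolution(mbps: float) -> str:
    if mbps >= 25:     # fast connection, e.g. a T3 line: serve HD content
        return "1080p"
    if mbps >= 5:      # mid-range connection: serve SD content
        return "480p"
    return "240p"      # slow connection: serve low-resolution proxies
```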
  • the request to the layer-scalable content launch engine 202 can include user account parameters.
  • the layer-scalable content launch engine 202 can provide the user account parameters in the request to the content account engine 212 .
  • the layer-scalable content launch engine 202 can launch an edit window for display on the identified client.
  • the layer-scalable content launch engine 202 can direct the edit window to the device identified for display.
  • the layer-scalable content launch engine 202 can characterize the edit window with a resolution and other parameters that are supported by the client device's operating system and host application. For instance, the layer-scalable content launch engine 202 can access application programming interfaces or other modules on the client to load an edit window as a browser plug-in in a web browser running on the client.
  • the layer-scalable content launch engine 202 can also use the “smart-bandwidth” determination to limit the maximum resolution of the edit window. As a result, the layer-scalable content launch engine 202 can launch a highly usable, easily portable content edit window while installing no new applications on the client.
  • the client-remote content placement engine 204 receives an instruction to place content from the content datastore on a multi-layer timeline content compilation represented in the editor window, wherein the content datastore is remote relative to the client.
  • An instruction to place content from the content datastore can include an identifier of the specific content needing to be placed as well as content datastore access parameters (such as content datastore usernames and passwords).
  • the identifier of the content can identify a file stored in the content datastore by name, by the file's address in the content datastore, or by the file's relationship to other files in the content datastore.
  • the instruction to place content can also include one or more API calls that obtain the content from the content datastore.
  • the client-remote content placement engine 204 places content from the content datastore on a multi-layer timeline content compilation represented in the editor window.
  • the client-remote content placement engine 204 can provide, as part of an API call to the editor window on the browser plug-in, a server-based address of content to be placed.
  • the content is located on the variable-layer content server 200 .
  • the client-remote content placement engine 204 places a link to the server-based address of the content in the editor window.
  • the client-remote content placement engine 204 also places character strings corresponding to parameters of the content to be placed. For instance, the client-remote content placement engine 204 can provide a target area on the edit window for the content to be placed. The client-remote content placement engine 204 can also provide a resolution and a playback speed of the content to be placed.
  • the act of placing content by the client-remote content placement engine 204 can include loading the content into a server-based playback engine, and then streaming portions of the content to the client.
  • as part of a “smart-bandwidth” feature, either the resolution or the playback speed of the content to be placed can depend on the connection speed of the client.
  • the client-remote content placement engine 204 can place higher quality content into a client identified to have a faster Internet connection.
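The placement call described above can be sketched as building an API payload for the editor window on the browser plug-in. All field names and the bandwidth threshold are illustrative assumptions.

```python
# Hypothetical payload for the client-remote content placement engine's API
# call to the editor window. Field names are illustrative assumptions; the
# content itself stays on the server and only a link is placed on the client.
def build_placement_call(server_address: str, target_area: str,
                         client_mbps: float, playback_speed: float = 1.0) -> dict:
    """Place a link to server-hosted content, scaling quality to bandwidth."""
    resolution = "1080p" if client_mbps >= 25 else "480p"  # "smart-bandwidth"
    return {
        "content_url": server_address,   # server-based address of the content
        "target_area": target_area,      # where the edit window shows it
        "resolution": resolution,
        "playback_speed": playback_speed,
    }
```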
  • the multi-layer timeline content compilation can be raw video that reflects a scene without any editing layers.
  • the raw video can be a layer in the multi-layer timeline content compilation.
  • the multi-layer timeline content compilation can also be edited video that reflects a scene with editing layers having been previously applied.
  • Edited video can include, in addition to placed content, multiple editing layers as these multiple editing layers are applied to the content at portions of the compilation timeline.
  • the multi-layer timeline content compilation can include items such as videos, audio, graphics, sound effects, and other editing content as these items are applied to the content at portions of the compilation timeline.
  • the multi-layer timeline content compilation has a layer corresponding to a compilation timeline. Portions of the compilation timeline can be associated with the run timeline of the content.
  • the multi-layer timeline content compilation has a set of destination edit layer classifications that classify the layers to be superimposed onto the multi-layer timeline content compilation.
  • the multi-layer timeline content compilation can have an “effects” destination edit layer classification that receives layers corresponding to special effects to be added to the multi-layer timeline content compilation.
  • the multi-layer timeline content compilation can also have a “graphics” destination edit layer classification that receives layers corresponding to graphics to be added to the multi-layer timeline content compilation.
  • the multi-layer timeline content compilation can have “video” and “image” destination edit layer classifications that receive video layers and image layers to be added to the multi-layer timeline content compilation.
  • the multi-layer timeline content compilation can have “music” and “audio effects” destination edit layer classifications to receive layers corresponding to music and sound effects to be added to the multi-layer timeline content compilation.
  • the multi-layer timeline content compilation can similarly have other destination edit layer classifications.
  • the destination edit layer classifications can be user-selected or may correspond to packages from a third-party.
  • FIGS. 8-12 show specific destination edit layer classifications.
  • the set of destination edit layer classifications in the multi-layer timeline content compilation can be a user-selected set of destination edit layer classifications.
  • a user wishing to have more than one independent video layer can add multiple video destination edit layer classifications for the multi-layer timeline content compilation.
  • a user can add multiple audio destination edit layer classifications.
  • a user can add multiple effects destination edit layer classifications.
  • the multi-layer timeline content compilation is therefore heavily customizable and can closely accommodate an editor's creative vision without local installation of any new programs.
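The user-selected set of destination edit layer classifications can be sketched as a server-side structure that allows duplicates of a classification, so a user can hold several independent "video" or "audio" tracks. The class and method names below are illustrative assumptions.

```python
# Minimal sketch of a multi-layer timeline content compilation that keeps a
# user-selected, duplicable set of destination edit layer classifications.
# Names are illustrative assumptions, not the patent's implementation.
class TimelineCompilation:
    def __init__(self):
        self.classifications = []   # ordered tracks; duplicates allowed

    def add_classification(self, kind: str) -> int:
        """Add a destination edit layer classification; return its track index."""
        self.classifications.append({"kind": kind, "layers": []})
        return len(self.classifications) - 1

    def superimpose(self, track: int, layer_name: str) -> None:
        """Attach a superimposable layer to the classification at `track`."""
        self.classifications[track]["layers"].append(layer_name)
```

A user wanting two independent video layers, for example, simply adds the "video" classification twice and superimposes onto either track.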
  • the set of destination edit layer classifications in the multi-layer timeline content compilation is stored on the variable-layer content server 200 , and not on a local machine.
  • because the set of destination edit layer classifications benefits from having access to very large server storage resources, the set of destination edit layer classifications can be very large.
  • an editor can weave a potentially infinite number of editing layers into the multi-layer timeline content compilation.
  • the editor can not only create full-length movies with large numbers of scenes, but can also apply a potentially infinite number of layers, including a large number of effects, to a particular interval of the multi-layer timeline content compilation.
  • editors can create professional and high quality films without installing files or programs to their computers.
  • the layer addition engine 206 facilitates superimposition of additional layers to the content, as housed in the multi-layer timeline compilation.
  • the layer addition engine 206 receives an instruction to superimpose a superimposable layer from the layer datastore onto existing layers of the multi-layer timeline content compilation.
  • An instruction to superimpose a superimposable layer can include an identifier of specific superimposable layers and layer datastore access parameters (such as layer datastore usernames and passwords).
  • the identifier of the superimposable layer can identify the superimposable layer by name, by the superimposable layer address in the layer datastore, or by the superimposable layer relationship to other layers in the layer datastore.
  • the instruction to superimpose the superimposable layer can also include one or more API calls that obtain the superimposable layer from the layer datastore.
  • the instruction to superimpose includes directing the placement of a superimposable layer over at least a portion of the multi-layer timeline content compilation.
  • the instruction to superimpose therefore includes an instruction to help edit the multi-layer timeline content compilation.
  • the instruction to superimpose the superimposable layer can also include API calls to the editor window.
  • the instruction to superimpose could include a portion of the compilation timeline of the multi-layer timeline content compilation for which the superimposable layer is to be applied. For instance, the instruction could include superimposing textual credits for ten seconds to start the multi-layer timeline content compilation.
  • the instruction to superimpose could also identify a visual portion of the multi-layer timeline content compilation for which the superimposable layer is to be applied. For example, the instruction to superimpose could include placing textual credits on the bottom left-hand quadrant of the multi-layer timeline content compilation.
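An instruction to superimpose that carries both a timeline interval and a visual region, as described above, could be sketched as a small record. The field names are illustrative assumptions.

```python
# Hypothetical superimpose instruction: which layer, over which portion of
# the compilation timeline, and in which visual region. Names are assumptions.
from dataclasses import dataclass

@dataclass
class SuperimposeInstruction:
    layer_id: str
    start_s: float        # offset into the compilation timeline, in seconds
    end_s: float
    region: str = "full"  # e.g. "bottom-left" for a quadrant

def credits_opening(layer_id: str) -> SuperimposeInstruction:
    """Example from the text: textual credits for the first ten seconds,
    placed in the bottom left-hand quadrant."""
    return SuperimposeInstruction(layer_id, 0.0, 10.0, region="bottom-left")
```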
  • the superimposable layers could include video layers.
  • Video layers are video clips that can be added to portions of the multi-layer timeline content compilation. For instance, a film editor may wish to add video to a corner of the multi-layer timeline content compilation so that the video appears integrated into the multi-layer timeline content compilation.
  • the superimposable layers could include transition layers. Transition layers are video clips or images used to transition between scenes in the multi-layer timeline content compilation. For instance, a film editor may wish to recreate fading or wiping effects commonly seen in films.
  • the superimposable layers could include sound layers such as audio effects or soundtracks for parts of the multi-layer timeline content compilation.
  • the superimposable layers could further include graphical layers.
  • Graphical layers are animated layers that film editors can use to create graphical effects for parts of the multi-layer timeline content compilation.
  • the superimposable layers could include user-specific media layers, which can correspond to video, audio, animated, and other content created or uploaded by a film editor or other users.
  • FIGS. 8-12 show the video layers, transition layers, sound layers, graphical layers, and user-specific media layers.
  • the instruction to superimpose the superimposable layer can associate the superimposable layer with a destination edit layer classification on the multi-layer timeline content compilation.
  • the layer addition engine 206 can provide an instruction to add any of the superimposable layers to any of the destination edit layer classifications associated with the multi-layer timeline content compilation.
  • the instruction to superimpose the superimposable layer can control effects relating to each superimposable layer.
  • the instruction to superimpose the superimposable layer can control, for instance, whether a specific superimposed layer is to fade in or out.
  • the instruction to superimpose the superimposable layer can also control the transparency and other attributes of a specific superimposed layer.
  • the layer superimposing engine 208 in operation, superimposes, in response to the instruction to superimpose, the superimposable layer onto the existing layers of the multi-layer timeline content compilation, thereby creating a superimposed layer.
  • the layer superimposing engine 208 modifies the multi-layer timeline content compilation to include the material from the superimposable layer. For instance, if the superimposable layer was a video layer, the multi-layer timeline content compilation would include the video material from the superimposable layer.
  • the layer superimposing engine 208 similarly adds audio, graphics, and other effects to the multi-layer timeline content compilation.
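The superimposing step can be sketched as stacking the superimposable layer's material over the compilation's existing layers. This is a minimal sketch under assumed data shapes; the engine would pull the actual material from the layer datastore.

```python
# Minimal sketch of the layer superimposing engine: given a superimposable
# layer, it stacks that layer over the existing layers, yielding a new
# superimposed layer stack. The dict shape is an illustrative assumption.
def superimpose_layer(existing_layers: list, layer: dict) -> list:
    """Return a new layer stack with `layer` composited on top.

    `layer` is a dict like {"kind": "video", "material": ...}.
    """
    if "kind" not in layer or "material" not in layer:
        raise ValueError("superimposable layer needs a kind and material")
    return existing_layers + [layer]   # the existing compilation is not mutated
```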
  • the variable-layer content playing engine 210 , in operation, plays the multi-layer timeline content compilation, including the content, in the superimposed layer.
  • the variable-layer content playing engine 210 provides instructions to the editor window on the client to display the multi-layer timeline content compilation with the superimposable layer integrated into it.
  • the editor window displays the content along with the video, audio, and other effects added as part of the superimposable layer.
  • the variable-layer content playing engine 210 streams the multi-media content compilation to the editor window.
  • the variable-layer content playing engine 210 activates a set of streaming media APIs on the browser plug-in housing the editor window on the client.
  • an editor can preview a high quality multi-layer timeline content compilation with a customizable and potentially infinite number of editing layers incorporated within.
  • the content account engine 212 limits access to the content based on an account-based access system.
  • the content account engine 212 maintains a plurality of user accounts in a datastore such as the account datastore 234 .
  • Each account can have a username, that is, a character string identifying a unique user of the system.
  • Each account can also have a password associated with the username.
  • the usernames and passwords can also be stored in the account datastore 234 .
  • the content account engine 212 can require entry of a username and password associated with any of the plurality of user accounts in order to access the system.
  • the content account engine 212 associates the content with one of the plurality of user accounts.
  • the content account engine 212 can store an access key associated with the multi-layer timeline content compilation in the account datastore 234 .
  • the access key can be a unique character string that contains information about both the multi-layer timeline content compilation and a specific user account.
  • the access key may require the password associated with the user account in order to unlock the content.
  • the content account engine 212 limits another of the plurality of user accounts from accessing the content.
  • the content account engine 212 may prevent the client-remote content placement engine 204 from placing any content into the remote editor window without entry of the username and password associated with the content.
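The access key that binds a compilation to one user account could be sketched with a keyed hash, so that the key can only be unlocked with the password associated with the account. This scheme is an illustrative assumption, not the patent's construction.

```python
# Hypothetical account gating: derive an access key from the compilation id
# and the account's password, so only that account can unlock the content.
# The HMAC construction is an illustrative assumption.
import hashlib
import hmac

def make_access_key(compilation_id: str, password: str) -> str:
    """Unique character string tied to both the compilation and the account."""
    return hmac.new(password.encode(), compilation_id.encode(),
                    hashlib.sha256).hexdigest()

def unlock(compilation_id: str, password: str, stored_key: str) -> bool:
    """True only when the supplied password reproduces the stored access key."""
    return hmac.compare_digest(make_access_key(compilation_id, password),
                               stored_key)
```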
  • the content publication engine 214 in operation, provides an instruction to publish the multi-layer timeline content compilation into a streaming media format or a downloadable media format if a user associated with the layer superimposing engine 208 has publication rights to the content.
  • the content publication engine 214 checks whether a user associated with the layer superimposing engine 208 has publication rights to the content.
  • the content publication engine 214 can check publication rights by issuing an instruction to the editor window on the client to prompt a user certification of non-infringing use. For instance, the content publication engine 214 can request a given user to certify under penalty of perjury and under all applicable laws that the user has rights to publish the content.
  • the content publication engine 214 can check publication rights of the content by determining which studio owns rights to the content and requesting a content key to the content from the studio. The content publication engine 214 can then request the user to enter the content key to unlock the content for layer superimposition, editing, and other use.
  • the content publication engine 214 can check publication rights by verifying content rights in metadata of the file storing the content.
  • the content publication engine 214 can also check publication rights in other ways. In this manner, the content publication engine 214 ensures users can have access to a broad range of content. Studios and producers of content may also benefit from the fact that the content publication engine 214 limits republication of protected content to those editors with publication rights.
  • the content publication engine 214 after verifying publication rights, can save the multi-layer timeline content compilation to a streaming media format or a downloadable media format. If necessary, the content publication engine 214 can also set the resolution and the playback speed of the multi-layer timeline content compilation.
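The publication-rights check can be sketched as trying, in turn, the three methods the text names: a user certification of non-infringing use, a studio-issued content key, and rights recorded in the content file's metadata. The ordering and parameter names are assumptions.

```python
# Illustrative publication-rights check: any one of the three methods named
# above suffices. Inputs model a user certification, a user-entered content
# key matched against the studio's key, and rights flagged in file metadata.
from typing import Optional

def has_publication_rights(certified: bool,
                           entered_key: Optional[str],
                           studio_key: Optional[str],
                           metadata_rights: bool) -> bool:
    if certified:                       # user certifies non-infringing use
        return True
    if studio_key is not None and entered_key == studio_key:
        return True                     # studio-issued content key unlocks it
    return metadata_rights              # rights recorded in file metadata
```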
  • the content rights management engine 216 in operation, provides a depublication instruction if the user associated with the layer superimposing engine 208 does not have publication rights to the content.
  • the content rights management engine 216 can check publication rights of the content.
  • the content rights management engine 216 can independently check rights by methods such as requesting user verification, requesting content keys, and evaluating content file metadata.
  • the content rights management engine 216 can import these functions from another module such as the content publication engine 214 .
  • the content rights management engine 216 can provide a depublication instruction to the other engines in the variable-layer content server 200 .
  • the content rights management engine 216 can independently generate a depublication key (which may or may not be encrypted) so that the other engines (such as the content publication engine 214 ) do not publish the multi-layer timeline content compilation.
  • the royalty estimation engine 218 estimates a cumulative royalty associated with publishing the multi-layer timeline content compilation.
  • the royalty estimation engine 218 can check publication rights of the content by determining which studio owns rights to the content and requesting a license value to the content from the studio.
  • the license value can reflect the studio's estimated cost of licensing the content.
  • the license value can include bargaining between an entity operating the royalty estimation engine 218 and the studio owning rights to the content. Aggregating the values of all licensed content in the multi-layer timeline content compilation, the royalty estimation engine 218 can then estimate the cumulative royalty associated with publishing the multi-layer timeline content compilation.
  • the royalty estimation engine 218 can also suggest a revenue amount for the multi-layer timeline content compilation. That is, the royalty estimation engine 218 can recommend a minimum amount that the creator of the multi-layer timeline content compilation can charge to license his or her creative product.
  • the minimum amount reflects the cost of obtaining the content for the multi-layer timeline content compilation.
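The royalty estimate aggregates the studios' license values across all licensed content in the compilation, and the suggested minimum revenue at least covers that cumulative royalty. A sketch, where the 10% margin is an illustrative assumption:

```python
# Sketch of the royalty estimation engine: sum studio-quoted license values
# for every licensed item in the compilation, and suggest a minimum revenue
# amount that covers the cumulative royalty. Margin is an assumption.
def estimate_royalty(license_values: dict) -> float:
    """license_values maps a content id to its studio-quoted license value."""
    return sum(license_values.values())

def suggest_minimum_revenue(license_values: dict, margin: float = 0.10) -> float:
    return estimate_royalty(license_values) * (1 + margin)
```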
  • the content rights acquisition engine 220 in operation, provides a request to obtain publication rights to the content if a user associated with the layer superimposing engine 208 does not have publication rights to the content.
  • the content rights acquisition engine 220 can check publication rights of the content.
  • the content rights acquisition engine 220 can independently check rights by methods such as requesting user verification, requesting content keys, and evaluating content file metadata.
  • the content rights acquisition engine 220 can import these functions from another module such as the content publication engine 214 .
  • the content rights acquisition engine 220 provides a request to obtain publication rights to the content.
  • the content rights acquisition engine 220 can independently generate a request from a studio for a content key for the content.
  • the content rights acquisition engine 220 allows editors to actively seek licenses to modify content owned by others, including studios.
  • the content rights crediting engine 222 in operation, provides a request to acquire publication credits associated with the content, and provides the publication credits for storage in the layer datastore 232 if a user associated with the layer superimposing engine 208 does not have publication rights to the content.
  • the content rights crediting engine 222 can check publication credits associated with the content.
  • the content rights crediting engine 222 can independently check credits by methods such as requesting user verification, requesting content credits from a studio owning rights to the content, and evaluating content file metadata.
  • the content rights crediting engine 222 can import these functions from another module such as the content publication engine 214 .
  • the content rights crediting engine 222 can provide the publication credits for storage in the layer datastore 232 . In this way, a user can actually use the publication credits as a superimposable layer to be used in the multi-layer timeline content compilation.
  • the content crowdsourcing engine 224 in operation, allows users to obtain content relating to a specific subject or event from a variety of sources.
  • the content crowdsourcing engine 224 provides a subject-specific content call.
  • a subject-specific content call is a character string corresponding to a subject or an event.
  • a subject-specific content call could be a character string requesting a list of protests that occurred in the year 2011.
  • the subject-specific content call can be limited to a specific country, such as Egypt, Iran, or another country.
  • the subject-specific content call can be limited to a specific movement, such as “Occupy Wall Street.”
  • the content crowdsourcing engine 224 can then provide the subject-specific content call to query search engines, social networks, studios, online video sites, or other content sources.
  • the content crowdsourcing engine 224 receives, in response to the subject-specific content call, subject-specific content.
  • the content crowdsourcing engine 224 can receive content relating to the specific subject or event.
  • the content crowdsourcing engine 224 provides the subject-specific content to the content datastore 230 .
  • the content crowdsourcing engine 224 can save the subject-specific content to the content datastore 230 .
  • the content crowdsourcing engine 224 provides ready access to a variety of clips relating to a specific subject or event. An editor can develop professional high-quality films, including professional high-quality documentaries of specific subjects or events as these events unfold.
  • the content crowdsourcing engine 224 allows an editor to readily access multiple views of a specific subject or event. For instance, many participants of the “Arab Spring” uprisings or the “Occupy Wall Street” movement of 2011 have access to mobile devices that allow protest footage to be captured and uploaded to video sites. An editor using the content crowdsourcing engine 224 can readily obtain numerous clips of an event (e.g., numerous views of a single protest) and develop a professional and high-quality film without installing any editing software to his or her computer. In this example, the editor can even develop a professional-quality documentary based on the crowdsourced footage from a remote location such as a library or an Internet café.
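The subject-specific content call could be sketched as fanning a query string out to several content sources and pooling the results into the content datastore. The source interface (a callable taking the call string and returning clip identifiers) is an illustrative assumption.

```python
# Hypothetical content crowdsourcing sketch: issue a subject-specific content
# call to several sources (search engines, social networks, studios, online
# video sites) and collect the returned clips into a content datastore.
def crowdsource(call: str, sources: list, content_datastore: dict) -> list:
    """Query each source with the content call; pool and store the results."""
    gathered = []
    for source in sources:
        gathered.extend(source(call))     # each source returns clip ids
    content_datastore[call] = gathered    # keyed by the content call
    return gathered
```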
  • the third-party layer-package engine 226 in operation, provides a third-party package of proprietary third-party layers to the layer datastore 232 .
  • the third-party layer-package engine 226 can obtain high-quality or even professional layers from third parties such as commercial movie studios.
  • the third-party layer-package engine 226 can allow editors access to professional layers that they can add to their creations.
  • the third-party layer-package engine 226 stores these layers in the layer datastore 232 .
  • the content monetization engine 228 in operation, provides an advertising package or a pay-per-view package to the layer datastore 232 .
  • An advertising package can involve links to commercial entities who can pay an editor for including their material in a multi-layer timeline content compilation.
  • a pay-per-view package can request viewers to pay for watching a multi-layer timeline content compilation.
  • the content monetization engine 228 can also provide instructions to deposit revenue in an editor's financial accounts or credit deficiencies in the editor's financial accounts. Using the content monetization engine 228 , an editor can readily monetize his or her creation by adding an edit layer to that creation.
  • the content datastore 230 stores content.
  • the layer datastore 232 stores layers.
  • the account datastore 234 stores account information such as user names and passwords.
  • FIG. 3 shows a diagram of an example of a variable-layer content client 300 .
  • the variable-layer content client 300 includes a number of engines and datastores.
  • the variable-layer content client 300 includes a web browsing engine 302 , a content editor display engine 304 , a client-based content placement instruction engine 306 , a client-based layer placement instruction engine 308 , a superimposable layer display engine 310 , a timeline display engine 312 , and a multi-layer timeline content compilation display engine 314 .
  • the variable-layer content client 300 includes a local datastore 316 and a local storage buffer 318 . The discussion below provides a description of the functionality of each of these engines and datastores.
  • the web browsing engine 302 in operation allows a user of the variable-layer content client 300 to access the Internet.
  • the web browsing engine 302 is incorporated into an Internet browser.
  • Existing Internet browsers include browsers manufactured by Microsoft®, Google®, Mozilla®, Apple®, and others.
  • the web browsing engine 302 can be incorporated into a personal computer, a mobile device, or other computing client.
  • the web browsing engine 302 can run a host application. That is, the web browsing engine 302 can execute a browser plug-in in the Internet browser installed on the variable-layer content client 300 .
  • the browser plug-in can provide an interface such as a graphical user interface (GUI) for a user to access the server-based content editing services.
  • the browser plug-in can include a GUI to display content and layers on server datastores.
  • the browser plug-in can have display capabilities like the capabilities provided by proprietary commercially available plug-ins such as Adobe® Flash Player, QuickTime®, and Microsoft Silverlight®.
  • the browser plug-in can also include an interface to execute server-initiated functionalities on server based engines.
  • the content editor display engine 304 in operation, can launch an editor window for display on the variable-layer content client 300 .
  • the editor window can be displayed in the host application on the variable-layer content client 300 .
  • the content editor display engine 304 can call one or more APIs of the web browser plug-in, thereby allowing display of an editor window.
  • the client-based content placement instruction engine 306 places a link to the content in the editor window.
  • the client-based content placement instruction engine 306 receives parameters, such as the server address of the content to be placed, resolution, and playback speed. Based on these parameters, the client-based content placement instruction engine 306 places a link to the content (at the provided resolution, playback speed, etc.) in the editor window.
  • the client-based layer placement instruction engine 308 places a link to a superimposable layer over the link to the content. Placing this link creates a multi-layer timeline content compilation on the server.
  • the superimposable layer display engine 310 in operation, displays links to superimposable layers as well as links to destination edit layer classifications in the edit window.
  • the timeline display engine 312 in operation, displays a link to the compilation timeline in the edit window.
  • the multi-layer timeline content compilation display engine 314 can place a link to a multi-layer timeline content compilation in the edit window.
  • the edit window can display a link to the multi-layer content compilation, links to superimposable layers, and links to destination edit layer classifications.
  • a user of the variable-layer content client 300 has access to high-quality professional film editing without needing to install any editing software on the variable-layer content client 300 .
  • the local datastore 316 can locally store any data on the variable-layer content client 300 . Also shown in FIG. 3 is the local storage buffer 318 , which can buffer content to optimize editing and playback.
  • FIG. 4 shows a flowchart 400 of an example of a method for providing subject-specific content in response to a subject-specific content call.
  • the modules of the flowchart 400 and the other flowcharts described in this paper can be reordered into a permutation of the illustrated order or reorganized for parallel execution.
  • the flowchart 400 starts at module 402 with providing a subject-specific content call.
  • the flowchart 400 continues to module 404 with receiving, in response to the subject-specific content call, subject-specific content.
  • the flowchart 400 continues to module 406 with providing the subject-specific content to the content datastore.
  • the flowchart 400 continues to module “A.”
  • FIG. 5 shows a flowchart 500 of an example of a method for creating a multi-layer timeline content compilation.
  • the flowchart 500 starts at module “A” and proceeds to module 502 with receiving a request to launch an editor window for display on a client.
  • the flowchart 500 continues to module 504 with receiving an instruction to place content from a content datastore on a multi-layer timeline content compilation in the editor window, wherein the content datastore is remote relative to the client.
  • the flowchart 500 continues to module 506 with receiving an instruction to superimpose a superimposable layer on existing layers of the multi-layer timeline content compilation.
  • the flowchart 500 continues to module 508 with superimposing, in response to the instruction to superimpose, the superimposable layer onto the existing layers of the multi-layer timeline content compilation, thereby creating a superimposed layer.
  • the flowchart 500 continues to module 510 with playing the multi-layer timeline content compilation, including the content, in the superimposed layer.
  • the flowchart 500 continues to module “B.”
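The FIG. 5 flow above amounts to placing base content on a timeline and then stacking superimposable layers over it. The sketch below is a minimal illustration under that reading; the class and method names are assumptions, not the patent's terminology for any concrete implementation.

```python
# Illustrative sketch of FIG. 5: place remote content on a multi-layer
# timeline content compilation, then superimpose a layer over it.

class MultiLayerTimelineCompilation:
    def __init__(self):
        self.layers = []  # bottom-to-top stack of layer references

    def place_content(self, content_ref):
        """Module 504: place remote content as the base layer."""
        self.layers.append(content_ref)

    def superimpose(self, layer_ref):
        """Modules 506/508: stack a superimposable layer on top of the
        existing layers, creating a superimposed layer."""
        self.layers.append(layer_ref)

    def play(self):
        """Module 510: play the compilation; layers render bottom-up."""
        return list(self.layers)

compilation = MultiLayerTimelineCompilation()
compilation.place_content("server://datastore/video016")
compilation.superimpose("server://layers/left-wipe")
frames = compilation.play()
```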
  • FIG. 6 shows a flowchart of an example of a method for publishing a multi-layer timeline content compilation.
  • the flowchart 600 starts at module “B” and proceeds to module 602 with deciding whether a user has publication rights to the content. If the user does not have publication rights to the content, the flowchart 600 continues to module 604 with deciding whether to obtain publication rights to the content. If it is not found desirable to obtain publication rights to the content, the flowchart 600 continues to module 618 with providing a depublication instruction. If it is found desirable to obtain publication rights to the content, the flowchart 600 continues to module 606 with providing a request to acquire publication rights to the content. The flowchart 600 continues to module 608 with providing a request to acquire publication credits associated with the content. The flowchart 600 continues to module 610 with providing the publication credits for storage.
  • the flowchart 600 returns to module 602 . If the user does have publication rights to the content, the flowchart 600 continues to module 612 with estimating a cumulative royalty associated with publishing the multi-layer timeline content compilation. The flowchart 600 continues to module 614 with suggesting a revenue amount for the multi-layer timeline content compilation. The flowchart 600 continues to module 616 with providing an instruction to publish the multi-layer timeline content compilation.
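The FIG. 6 branches above can be condensed into a single decision function: check rights, branch on whether to acquire missing rights, then estimate the cumulative royalty and suggest a revenue amount before publishing. The royalty amounts (in cents) and the 20% margin below are invented for the example; the module numbers in the comments refer to the flowchart.

```python
# Illustrative sketch of the FIG. 6 decision flow.

def publish_decision(user_rights, content_items, acquire_missing=True):
    missing = [c for c in content_items if c["id"] not in user_rights]
    if missing:
        # Modules 604-610: request rights and credits for the content,
        # or module 618: provide a depublication instruction.
        return ("acquire",) if acquire_missing else ("depublish",)
    # Module 612: estimate the cumulative royalty over all content items.
    royalty_cents = sum(c["royalty_cents"] for c in content_items)
    # Module 614: suggest a revenue amount covering royalty plus margin.
    suggested_cents = royalty_cents * 120 // 100
    # Module 616: instruction to publish, with the estimates attached.
    return ("publish", royalty_cents, suggested_cents)

items = [{"id": "clip1", "royalty_cents": 10},
         {"id": "clip2", "royalty_cents": 5}]
decision = publish_decision({"clip1", "clip2"}, items)
```

Using integer cents sidesteps floating-point rounding when summing per-item royalties.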
  • FIG. 7 shows an example of a system on which techniques described in this paper can be implemented.
  • the computer system 700 can be a conventional computer system that can be used as a client computer system, such as a wireless client or a workstation, or a server computer system.
  • the computer system 700 includes a computer 702 , I/O devices 704 , and a display device 706 .
  • the computer 702 includes a processor 708 , a communications interface 710 , memory 712 , display controller 714 , non-volatile storage 716 , and I/O controller 718 .
  • the computer 702 may be coupled to or include the I/O devices 704 and display device 706 .
  • the computer 702 interfaces to external systems through the communications interface 710 , which may include a modem or network interface. It will be appreciated that the communications interface 710 can be considered to be part of the computer system 700 or a part of the computer 702 .
  • the communications interface 710 can be an analog modem, ISDN modem, cable modem, token ring interface, satellite transmission interface (e.g. “direct PC”), or other interfaces for coupling a computer system to other computer systems.
  • the processor 708 may be, for example, a conventional microprocessor such as an Intel Pentium microprocessor or Motorola PowerPC microprocessor.
  • the memory 712 is coupled to the processor 708 by a bus 770 .
  • the memory 712 can be Dynamic Random Access Memory (DRAM) and can also include Static RAM (SRAM).
  • the bus 770 couples the processor 708 to the memory 712, the non-volatile storage 716, the display controller 714, and the I/O controller 718.
  • the I/O devices 704 can include a keyboard, disk drives, printers, a scanner, and other input and output devices, including a mouse or other pointing device.
  • the display controller 714 may control, in the conventional manner, a display on the display device 706, which can be, for example, a cathode ray tube (CRT) or liquid crystal display (LCD).
  • the display controller 714 and the I/O controller 718 can be implemented with conventional well known technology.
  • the non-volatile storage 716 is often a magnetic hard disk, an optical disk, or another form of storage for large amounts of data. Some of this data is often written, by a direct memory access process, into memory 712 during execution of software in the computer 702 .
  • "machine-readable medium" or "computer-readable medium" includes any type of storage device that is accessible by the processor 708 and also encompasses a carrier wave that encodes a data signal.
  • the computer system 700 is one example of many possible computer systems which have different architectures.
  • personal computers based on an Intel microprocessor often have multiple buses, one of which can be an I/O bus for the peripherals and one that directly connects the processor 708 and the memory 712 (often referred to as a memory bus).
  • the buses are connected together through bridge components that perform any necessary translation due to differing bus protocols.
  • Network computers are another type of computer system that can be used in conjunction with the teachings provided herein.
  • Network computers do not usually include a hard disk or other mass storage, and the executable programs are loaded from a network connection into the memory 712 for execution by the processor 708 .
  • a Web TV system, which is known in the art, is also considered to be a computer system, but it may lack some of the features shown in FIG. 7, such as certain input or output devices.
  • a typical computer system will usually include at least a processor, memory, and a bus coupling the memory to the processor.
  • the apparatus can be specially constructed for the required purposes, or it can comprise a general purpose computer selectively activated or reconfigured by a computer program stored in the computer.
  • a computer program may be stored in a computer-readable storage medium, such as, but not limited to, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, any type of disk including floppy disks, optical disks, CD-ROMs, and magneto-optical disks, or any other type of media suitable for storing electronic instructions, each coupled to a computer system bus.
  • FIG. 8 shows a diagram of a screenshot 800 on a web browser of a variable-layer content client.
  • the screenshot 800 shows an editor window incorporated into an internet browser, here the Internet Explorer web browser from Microsoft®.
  • the editor window displays content, namely, a video for editing in the upper right hand corner.
  • the editor window displays a series of superimposable effects.
  • superimposable effects include "Videos," "Transitions," "Sounds," "Graphics," and "My media files."
  • a user has selected the “Videos” set of superimposable layers and sees a corresponding library of Nature videos.
  • the edit window shows an editing timeline near its center.
  • the timeline in this example has a duration of 2 minutes and 20 seconds.
  • a moving marker indicates that the user is viewing the multi-layer timeline content compilation as it appears about 10 seconds in.
  • the edit window shows a list of destination edit classifications.
  • destination edit classifications include “Effects,” “Graphics,” “Video/image,” “Music,” “Audio Effects,” and “Contentum.”
  • the list of destination edit classifications is customizable. Moreover, a user can select from an extensive number of destination edit classifications, as the destination edit classifications are stored on a server and therefore benefit from large-volume storage.
  • FIG. 9 shows a diagram of a screenshot 900 on a web browser of the variable-layer content client.
  • the screenshot 900 shows the superimposable “Transitions” layers in greater detail.
  • a user can select from wipe and fade transitions. Wipe transitions include a left wipe, a right wipe, an upward wipe, and a downward wipe.
  • FIG. 10 shows a diagram of a screenshot 1000 on a web browser of the variable-layer content client.
  • the screenshot 1000 shows the superimposable “Sounds” layers in greater detail.
  • a user can select from a variety of music or sound effects.
  • FIG. 11 shows a diagram of a screenshot 1100 on a web browser of the variable-layer content client.
  • the screenshot 1100 shows the superimposable “Graphics” layers in greater detail.
  • a user can select from a 35 mm image, a heart image, comic text, or other text to superimpose on the content.
  • FIG. 12 shows a diagram of a screenshot 1200 on a web browser of the variable-layer content client.
  • the screenshot 1200 shows the superimposable “My media files” layers in greater detail.
  • a user can superimpose other media over existing content.
  • the user can superimpose any of VIDEO 002 , VIDEO 015 , and VIDEO 016 onto any of the destination edit classifications.
  • the user has superimposed VIDEO 016 onto the “Video/image” destination edit classification.
  • Embodiments allow creative professionals to harness the reality that film is more than the sum of individual video clips and special effects.
  • creative professionals can develop high-quality, professional content without needing access to film studios or expensive editing equipment.
  • creative professionals need not locally install complex video editing software to produce professional films.
  • Creative professionals can easily add creative elements to streaming media, including streaming media from multiple sources, such as crowdsourced streaming media.

Abstract

The application discloses systems for creating multi-layer timeline content compilations. The systems can include a layer-scalable editor launch engine receiving a request to launch an editor window for display on a client. A client-remote content placement engine receives an instruction to place content from the content datastore on a multi-layer timeline content compilation represented in the editor window, wherein the content datastore is remote relative to the client. A layer addition engine receives an instruction to superimpose a superimposable layer from the layer datastore onto existing layers of the multi-layer timeline content compilation. A layer superimposing engine superimposes the superimposable layer onto the existing layers of the multi-layer timeline content compilation. A variable-layer content playing engine plays the multi-layer timeline content compilation, including the content, in a superimposed layer. The present application also discloses related methods.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • The present application claims benefit of: U.S. Provisional Patent Application Ser. No. 61/468,725 filed Mar. 29, 2011 and entitled “Media Management;” U.S. Provisional Patent Application Ser. No. 61/564,256 filed Nov. 28, 2011 and entitled “Local Timeline Editing for Online Content Editing;” U.S. Provisional Patent Application Ser. No. 61/564,257 filed Nov. 28, 2011 and entitled “Multi-Layer Timeline Content Compilation Systems and Methods;” and U.S. Provisional Patent Application Ser. No. 61/564,261 filed Nov. 28, 2011 and entitled “Systems and Methods for Low Bandwidth Consumption Online Content Editing;” which are incorporated herein by reference.
  • BACKGROUND
  • A film is more than the sum of its composite parts and, recognizing this fact, creative professionals have long toiled to recreate creative elements in film. With conventional editing equipment, creative professionals use physical media to capture specific scenes and manually add soundtracks, video clips, and special effects to incorporate creative elements like story elements, plots, characters, and thematic elements. The process provides a classical touch and feel that aligns with the creative energies of film producers, directors, screenwriters, and editors. However, the process is expensive and requires access to editing equipment typically located in film studios.
  • Locally installed film editing systems such as standalone computer programs allow users to edit digital media using locally stored special effects. However, locally installed film editing systems require creative professionals to purchase special effects packages, limiting a creative professional to the editing effects locally installed on his or her computer. Locally installed film editing systems therefore fail to deliver high quality without high cost. Unfortunately, locally installed film editing systems make it hard for creative professionals to add creative elements to streaming media, including streaming media from multiple sources, such as crowdsourced streaming media.
  • The foregoing examples of film editing systems are intended to be illustrative and not exclusive. Other limitations of the art will become apparent to those of skill in the relevant art upon a reading of the specification and a study of the drawings.
  • SUMMARY
  • The present application discloses systems and methods of creating and compiling multi-layer timeline compilations. The disclosed systems and methods allow people to access high-quality editing tools without entering film studios and without installing these editing tools on their computers. The disclosed systems and methods are portable and obviate the need for specialized or high-performance computers. Systems include a content datastore, a layer datastore, a layer-scalable content editor launch engine, a client-remote content placement engine, a layer addition engine, a layer superimposing engine, and a variable-layer content playing engine that supports an arbitrary number of layers.
  • In operation, the layer-scalable content editor launch engine receives a request to launch an editor window for display on a client, while the client-remote content placement engine receives an instruction to place content from the content datastore on a multi-layer timeline content compilation represented in the editor window, wherein the content datastore is remote relative to the client.
  • In operation, the layer addition engine receives an instruction to superimpose a superimposable layer from the layer datastore onto existing layers of the multi-layer timeline content compilation. The layer superimposing engine superimposes, in response to the instruction to superimpose, the superimposable layer onto the existing layers of the multi-layer timeline content compilation, thereby creating a superimposed layer. The variable-layer content playing engine plays the multi-layer timeline content compilation, including the content, in the superimposed layer. For example, the existing layers comprise first crowdsourced content and the superimposable layer comprises second crowdsourced content.
  • Systems can include a content account engine that, in operation, maintains a plurality of user accounts. The content account engine can also associate the multi-layer timeline content compilation with one of the plurality of user accounts. The content account engine can limit another of the plurality of user accounts from accessing the multi-layer timeline content compilation. Systems can include a content rights management engine. The content rights management engine can, in operation, provide a depublication instruction if a user associated with the layer superimposing engine does not have publication rights to the content.
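The account scoping described above (associate a compilation with one account, limit other accounts from accessing it) can be sketched minimally as follows. The class and method names are illustrative assumptions, not the patent's implementation.

```python
# Minimal sketch of a content account engine: each account owns a set
# of compilation ids, and access is limited to the owning account.

class ContentAccountEngine:
    def __init__(self):
        self.accounts = {}  # account_id -> set of compilation ids

    def maintain_account(self, account_id):
        """Maintain a plurality of user accounts."""
        self.accounts.setdefault(account_id, set())

    def associate(self, account_id, compilation_id):
        """Associate a compilation with one of the accounts."""
        self.accounts[account_id].add(compilation_id)

    def can_access(self, account_id, compilation_id):
        """Limit accounts other than the owner from access."""
        return compilation_id in self.accounts.get(account_id, set())

engine = ContentAccountEngine()
engine.maintain_account("alice")
engine.maintain_account("bob")
engine.associate("alice", "compilation-1")
```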
  • Systems can include a publication engine. The publication engine, in operation, can provide an instruction to publish the multi-layer timeline content compilation into a streaming media format or a downloadable media format if a user associated with the layer superimposing engine has publication rights to the content. Systems can also include a royalty estimation engine that, in operation, estimates a cumulative royalty associated with publishing the multi-layer timeline content compilation. For example, the royalty estimation engine can, in operation, suggest a revenue amount for the multi-layer timeline content compilation. The revenue amount can be based at least in part on the cumulative royalty.
  • Systems can include a content rights acquisition engine that, in operation, provides a request to acquire publication rights if a user associated with the layer superimposing engine does not have publication rights to the content. Systems can also include a content rights crediting engine that, in operation, provides an instruction to acquire publication credits associated with the content, and provides the publication credits for storage in the layer datastore if a user associated with the layer superimposing engine does not have publication rights to the content.
  • Systems can include a crowdsourcing engine. The crowdsourcing engine can, in operation, provide a subject-specific content call, and can receive, in response to the subject-specific content call, subject-specific content. The crowdsourcing engine can also provide the subject-specific content to the content datastore. Systems can include a third-party layer-package engine that, in operation, provides a package of proprietary third-party layers to the layer datastore. Systems can include a content monetization engine. The content monetization engine can provide an advertising package or a pay-per-view package to the layer datastore.
  • Methods can include receiving a request to launch an editor window for display on a client; receiving an instruction to place content from a content datastore on a multi-layer timeline content compilation represented in the editor window, wherein the content datastore is remote relative to the client; receiving an instruction to superimpose a superimposable layer on existing layers of the multi-layer timeline content compilation; superimposing, in response to the instruction to superimpose, the superimposable layer onto the existing layers of the multi-layer timeline content compilation, thereby creating a superimposed layer; and playing the multi-layer timeline content compilation, including the content, in the superimposed layer.
  • Methods can also include maintaining a plurality of user accounts; associating the multi-layer timeline content compilation with one of the plurality of user accounts; and limiting another of the plurality of user accounts from accessing the multi-layer timeline content compilation. Methods can further include providing a depublication instruction if a user does not have publication rights to the content.
  • For example, there can be provided an instruction to publish the multi-layer timeline content compilation into a streaming media format or a downloadable media format if a user has publication rights to the content. Methods can include estimating a cumulative royalty associated with publishing the multi-layer timeline content compilation. Further, methods can provide suggesting a revenue amount for the multi-layer timeline content compilation, the revenue amount based at least in part on the cumulative royalty. Methods can also include providing a request to acquire publication rights to the content if the user does not have the publication rights to the content. Methods can include providing an instruction to acquire publication credits associated with the content, if the user does not have publication rights to the content, and providing the publication credits if the user does not have the publication rights to the content. Methods can include providing a notification if the user does not have publication rights to the content.
  • Methods can include providing a subject-specific content call; receiving, in response to the subject-specific content call, subject-specific content; and providing the subject-specific content to the content datastore. For example, there can be provided a package of proprietary layers to the layer datastore. Methods can also include providing an advertising package or a pay-per-view package to the layer datastore.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows a diagram of an example of a network environment.
  • FIG. 2 shows a diagram of an example of a variable-layer content server.
  • FIG. 3 shows a diagram of an example of a variable-layer content client.
  • FIG. 4 shows a flowchart of an example of a method for providing subject-specific content in response to a subject-specific content call.
  • FIG. 5 shows a flowchart of an example of a method for creating a multi-layer timeline content compilation.
  • FIG. 6 shows a flowchart of an example of a method for publishing a multi-layer timeline content compilation.
  • FIG. 7 shows an example of a system on which techniques described in this paper can be implemented.
  • FIG. 8 shows a variable-layer content client web browser screenshot.
  • FIG. 9 shows a variable-layer content client web browser screenshot.
  • FIG. 10 shows a variable-layer content client web browser screenshot.
  • FIG. 11 shows a variable-layer content client web browser screenshot.
  • FIG. 12 shows a variable-layer content client web browser screenshot.
  • DETAILED DESCRIPTION
  • Techniques described in this paper can be implemented in numerous ways, including as a process; an apparatus; a system; a composition of matter; a computer program product embodied on a computer readable storage medium; and/or a processor, such as a processor configured to execute instructions stored on and/or provided by a memory coupled to the processor. Unless stated otherwise, a component such as a processor or a memory described as being configured to perform a task may be implemented as a general component that is configured to perform the task at a given time or a specific component that is manufactured to perform the task. As used herein, the term ‘processor’ refers to one or more devices, circuits, and/or processing cores configured to process data, such as computer program instructions.
  • FIG. 1 shows a diagram of an example of a network environment 100. The network environment 100 includes a variable-layer content server 102, a network 104, and a variable-layer content client 106. In the example of FIG. 1, the variable-layer content server 102 can include a computer configured to provide variable-layer content editing services to other computers. The variable-layer content server 102 can include engines that allow a user to edit content, such as video, audio, or other media content, without requiring the user to locally install editing software or download the content being edited.
  • As used in this paper, an engine includes a dedicated or shared processor and, typically, firmware or software modules that are executed by the processor. Depending upon implementation-specific or other considerations, an engine can be centralized or its functionality distributed. An engine includes special purpose hardware, firmware, or software embodied in a computer-readable medium for execution by the processor. As used in this paper, a computer-readable medium is intended to include all mediums that are statutory (e.g., in the United States, under 35 U.S.C. §101), and to specifically exclude all mediums that are non-statutory in nature to the extent that the exclusion is necessary for a claim that includes the computer-readable medium to be valid. Known statutory computer-readable mediums include hardware (e.g., registers, random access memory (RAM), non-volatile (NV) storage, to name a few), but may or may not be limited to hardware.
  • In the example of FIG. 1, the variable-layer content server 102 can include an operating system. An operating system is a set of programs that manage computer hardware resources, and provides common services for application software. The operating system enables an application to run on a computer, whereas only applications that are self-booting can generally run on a computer that does not have an operating system. Operating systems are found in almost any device that includes a computer (e.g., cellular phones, video game consoles, web servers, etc.). Examples of popular modern operating systems are Linux, Android, iOS, Mac OS X, and Microsoft Windows®. Embedded operating systems are designed to operate on small machines like PDAs with less autonomy (Windows CE and Minix 3 are some examples of embedded operating systems). Operating systems can be distributed, which makes a group of independent computers act in some respects like a single computer. Operating systems often include a kernel, which controls low-level processes that most users cannot see (e.g., how memory is read and written, the order in which processes are executed, how information is received and sent by I/O devices, and how devices interpret information received from networks). Operating systems often include a user interface that interacts with a user directly to enable control and use of programs. The user interface can be graphical with icons and a desktop or textual with a command line. Application programming interfaces (APIs) provide services and code libraries. Which features are considered part of the operating system is defined differently in various operating systems, but all of the components are treated as part of the operating system in this paper for illustrative convenience.
  • In the example of FIG. 1, the variable-layer content server 102 can include datastores that hold content to be edited as well as editing layers, and other content. A datastore can be implemented, for example, as software embodied in a physical computer-readable medium on a general- or specific-purpose machine, in firmware, in hardware, in a combination thereof, or in an applicable known or convenient device or system. Datastores in this paper are intended to include any organization of data, including tables, comma-separated values (CSV) files, traditional databases (e.g., SQL), or other applicable known or convenient organizational formats. Datastore-associated components, such as database interfaces, can be considered “part of” a datastore, part of some other system component, or a combination thereof, though the physical location and other characteristics of datastore-associated components is not critical for an understanding of the techniques described in this paper.
  • Datastores can include data structures. As used in this paper, a data structure is associated with a particular way of storing and organizing data in a computer so that it can be used efficiently within a given context. Data structures are generally based on the ability of a computer to fetch and store data at any place in its memory, specified by an address, a bit string that can be itself stored in memory and manipulated by the program. Thus, some data structures are based on computing the addresses of data items with arithmetic operations, while others are based on storing addresses of data items within the structure itself. Many data structures use both principles, sometimes combined in non-trivial ways. The implementation of a data structure usually entails writing a set of procedures that create and manipulate instances of that structure.
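The two addressing principles described above can be illustrated with a short sketch: an array-like structure computes an item's location from its index by arithmetic, while a linked structure stores a reference to the next item inside the structure itself. The sample data and names are invented for the example.

```python
# Arithmetic addressing: element i of a flat store of fixed-size records
# is found by computing an offset, not by following stored references.
records = bytearray(b"AABBCCDD")  # four 2-byte records
RECORD_SIZE = 2

def record_at(i):
    offset = i * RECORD_SIZE  # address computed with arithmetic
    return bytes(records[offset:offset + RECORD_SIZE])

# Stored addressing: each node holds the "address" (a reference) of the
# next node within the structure itself, as in a linked list.
class Node:
    def __init__(self, value, nxt=None):
        self.value = value
        self.next = nxt  # reference stored in the structure

head = Node("AA", Node("BB", Node("CC", Node("DD"))))

def walk(node):
    out = []
    while node is not None:
        out.append(node.value)
        node = node.next
    return out
```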
  • In the example of FIG. 1, the engines in the variable-layer content server 102 can be cloud-based engines, and the datastores in the variable-layer content server 102 can be cloud-based datastores. The engines and the datastores in the variable-layer content server 102 run applications in a cloud-based computing environment. In a specific example, the variable-layer content server 102 can host a website affiliated with providing variable-layer content editing services. The website can access engines and datastores that provide a user with tools to edit the content online. The engines in the variable-layer content server 102 can execute on the variable-layer content server 102 and can provide a cloud-based interface for display on a host application, such as a web browser on the variable-layer content client 106. As the engines execute in the cloud, the engines on the variable-layer content server 102 provide a web-based application that looks, acts, and behaves like a locally installed application.
  • In the example of FIG. 1, the web-based application provided by the engines on the variable-layer content server 102 can run in a host application such as a web browser and obviates the need for specific applications to be installed on client computers. A user need not purchase a proprietary operating system or install expensive content editing software, as long as the user has access to a web browser that can access the engines and datastores on the variable-layer content server 102. A user also need not purchase expensive and high-performance computing equipment or memory. Beneficially, a user need not purchase extensive content editing packages, such as high-quality editing-effects packages as editing-effects packages would be stored and executed on the variable-layer content server 102. Users need not worry about software becoming obsolete because a remote online application can be used to run any executable file, regardless of whether the file is currently executable on the user's device; legacy platforms can run on any device.
  • In the example of FIG. 1, the network 104 can include a computer network. The network 104 can include communication channels to connect server resources and information in the variable-layer content server 102 with client resources and information in the variable-layer content client 106. In the example of FIG. 1, the network 104 can be implemented as a personal area network (PAN), a local area network (LAN), a home network, a storage area network (SAN), a metropolitan area network (MAN), an enterprise network such as an enterprise private network, a virtual network such as a virtual private network (VPN), or other network. One network of particular interest for an online application service is the World Wide Web (“the Web”), which is one of the services running on the Internet. The Web is a system of interlinked hypertext documents accessed via the Internet. With a web browser, one can view web pages that can contain text, images, videos, and other multimedia and navigate between the web pages via hyperlinks. The network 104 can serve to connect people located around a common area, such as a school, workplace, or neighborhood. The network 104 can also connect people belonging to a common organization, such as a workplace. Portions of the network 104 can be secure and other portions of the network 104 can be insecure.
  • In the example of FIG. 1, the network 104 can use a variety of physical or other media to connect the variable-layer content server 102 with the variable-layer content client 106. For instance, the network 104 can connect the variable-layer content server 102 with the variable-layer content client 106 using some combination of wired technologies, such as twisted pair wire cabling, coaxial cabling, optical fiber cabling, or other cabling.
  • In the example of FIG. 1, the network 104 can also use some combination of wireless technologies. Wireless networks will typically include an internetworking unit (IWU) that interconnects wireless devices on the relevant one of the wireless networks with another network, such as a wired LAN. The IWU is sometimes referred to as a wireless access point (WAP). In the IEEE 802.11 standard, a WAP is also defined as a station. Thus, a station can be a non-WAP station or a WAP station. In a cellular network, the WAP is often referred to as a base station. Wireless networks can be implemented using any applicable technology, which can differ by network type or in other ways. The wireless networks can be of any appropriate size (e.g., metropolitan area network (MAN), personal area network (PAN), etc.). Broadband wireless MANs may or may not be compliant with IEEE 802.16, which is incorporated by reference. Wireless PANs may or may not be compliant with IEEE 802.15, which is incorporated by reference. The wireless networks can be identifiable by network type (e.g., 2G, 3G, Wi-Fi), service provider, WAP/base station identifier (e.g., Wi-Fi SSID, base station and sector ID), geographic location, or other identification criteria. The wireless networks may or may not be coupled together via an intermediate network. The intermediate network can include practically any type of communications network, such as, by way of example but not limitation, the Internet, a public switched telephone network (PSTN), or an infrastructure network (e.g., private LAN). The term "Internet" as used herein refers to a network of networks which uses certain protocols, such as the TCP/IP protocol, and possibly other protocols such as the hypertext transfer protocol (HTTP) for hypertext markup language (HTML) documents that make up the World Wide Web (the web).
  • In the example of FIG. 1, the variable-layer content client 106 can include a computer, which can, in general, have an operating system and include datastores and engines. In this example, the variable-layer content client 106 can execute variable-layer content editing services inside a host application (e.g., can execute a browser plug-in in a web browser). The browser plug-in can provide an interface such as a graphical user interface (GUI) for a user to access the content editing services on the variable-layer content server 102. The browser plug-in can include a GUI to display content and layers stored in the datastores in the variable-layer content server 102. For instance, the browser plug-in can have display capabilities like the capabilities provided by proprietary commercially available plug-ins like Adobe® Flash Player, QuickTime®, and Microsoft Silverlight®. The browser plug-in can also include an interface to execute functionalities on the engines in the variable-layer content server 102.
  • In the example of FIG. 1, a device on which the variable-layer content client 106 is implemented can be implemented as a station. A station, as used herein, refers to a device with a media access control (MAC) address and a physical layer (PHY) interface to the wireless medium that comply with, e.g., the IEEE 802.11 standard. A station can be described as “IEEE 802.11-compliant” when compliance with the IEEE 802.11 standard is intended to be explicit. (I.e., a device acts as described in at least a portion of the IEEE 802.11 standard.) One of ordinary skill in the relevant art would understand what the IEEE 802.11 standard comprises today and that the IEEE 802.11 standard can change over time, and would be expected to apply techniques described in this paper in compliance with future versions of the IEEE 802.11 standard if an applicable change is made. IEEE Std 802.11™-2007 (Revision of IEEE Std 802.11-1999) is incorporated by reference. IEEE 802.11k-2008, IEEE 802.11n-2009, IEEE 802.11p-2010, IEEE 802.11r-2008, IEEE 802.11w-2009, and IEEE 802.11y-2008 are also incorporated by reference.
  • In alternative embodiments, one or more wireless devices may comply with some other standard or no standard at all, and may have different interfaces to a wireless or other medium. It should be noted that not all standards refer to wireless devices as “stations,” but where the term is used in this paper, it should be understood that an analogous unit will be present on all applicable wireless networks. Thus, use of the term “station” should not be construed as limiting the scope of an embodiment that describes wireless devices as stations to a standard that explicitly uses the term, unless such a limitation is appropriate in the context of the discussion.
  • FIG. 2 shows a diagram of an example of a variable-layer content server 200. The variable-layer content server 200 includes a number of engines and datastores. For instance, the variable-layer content server 200 includes a layer-scalable content editor launch engine 202, a client-remote content placement engine 204, a layer addition engine 206, a layer superimposing engine 208, a variable-layer content playing engine 210, a content account engine 212, a content publication engine 214, a content rights management engine 216, a royalty estimation engine 218, a content rights acquisition engine 220, a content rights crediting engine 222, a content crowdsourcing engine 224, a third-party layer-package engine 226, and a content monetization engine 228. In this example, the variable-layer content server 200 includes a content datastore 230, a layer datastore 232, and an account datastore 234.
  • In the example of FIG. 2, the layer-scalable content launch engine 202, in operation, receives a request to launch an editor window for display on a client. A request to the layer-scalable content launch engine 202 can include a character string that identifies a client device and provides edit window parameters. The request can also include a parameterized object or an instance of a class used to identify the client device and provide the edit window parameters. As the request is often received over a network connection to the variable-layer content server 200, the request is often decoded from a data packet sent to the variable-layer content server 200 over the network. The request can also be decrypted if a secure network connection was used to transmit it. The request may be decoded from a secure or an unsecure data packet.
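A minimal sketch in Python of decoding such a launch request. The field names, the semicolon-delimited format, and the function name are illustrative assumptions, not part of the specification; the request could equally be a parameterized object or class instance as described above.

```python
from dataclasses import dataclass

@dataclass
class LaunchRequest:
    client_id: str      # character string identifying the client device
    window_width: int   # edit window parameters
    window_height: int

def decode_launch_request(raw: str) -> LaunchRequest:
    """Decode an illustrative 'client_id;width;height' character string
    into a launch request for the editor window."""
    client_id, width, height = raw.split(";")
    return LaunchRequest(client_id, int(width), int(height))

req = decode_launch_request("client-42;1280;720")
```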
  • In the example of FIG. 2, the request to the layer-scalable content launch engine 202 can identify a client device. The request can contain a network address such as an Internet Protocol (IP) or other address of the client. The request can also contain a device identifier such as a media access control (MAC) address of the client. Using the request, the layer-scalable content launch engine 202 can identify a client using destination/network identifiers to launch an editor window on the client.
  • In the example of FIG. 2, the request to the layer-scalable content launch engine 202 can also identify parameters of a client host application. The request can identify the operating system on the client and can help the layer-scalable content launch engine 202 determine whether to support the client operating system. The request can also identify the type and version of a host application, such as a web browser, on the client. The request can further identify the screen resolution, processor speed, memory, and network speed of the client device. Using these and other exemplary parameters, the layer-scalable content launch engine 202 can determine whether to support the client's specific host application. The layer-scalable content launch engine 202 can also use the request to supply an edit window with default parameters based on any of the OS or the host application parameters in the request. The layer-scalable content launch engine 202 can further determine whether to recommend an upgraded operating system or host application to the client.
  • In the example of FIG. 2, the request to the layer-scalable content launch engine 202 can help the layer-scalable content launch engine 202 perform a “smart-bandwidth” determination. Using the client network speed supplied in the request to the layer-scalable content launch engine 202, the layer-scalable content launch engine 202 can calculate the resolution of the content to provide for editing. For instance, if the request identifies a client connected to a Digital Signal 3 (T3) connection or other relatively fast Internet connection, the layer-scalable content launch engine 202 can determine it is desirable to provide relatively high quality media content (e.g., high definition (HD) media content) for editing. On the other hand, if the request identifies a client being connected to a dial-up modem, the layer-scalable content launch engine 202 can determine it is desirable to provide relatively low quality media content for editing.
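The “smart-bandwidth” determination can be sketched as a mapping from reported connection speed to a content resolution. The thresholds and resolution labels below are illustrative assumptions; a deployment would tune them.

```python
def select_edit_resolution(connection_kbps: float) -> str:
    """Map a client's reported connection speed (kbps) to the resolution
    of content to provide for editing. Thresholds are illustrative."""
    if connection_kbps >= 40_000:   # T3-class connection (~45 Mbps): HD
        return "1080p"
    if connection_kbps >= 5_000:
        return "720p"
    if connection_kbps >= 1_000:
        return "480p"
    return "240p"                   # e.g., dial-up modem speeds
```

For example, a client on a T3 connection would receive high-definition content, while a dial-up client would receive low-resolution content.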
  • In the example of FIG. 2, the request to the layer-scalable content launch engine 202 can include user account parameters. As discussed herein, the layer-scalable content launch engine 202 can provide the user account parameters in the request to the content account engine 212.
  • In the example of FIG. 2, the layer-scalable content launch engine 202 can launch an edit window for display on the identified client. The layer-scalable content launch engine 202 can direct the edit window to the device identified for display. The layer-scalable content launch engine 202 can characterize the edit window with a resolution and other parameters that are supported by the client device's operating system and host application. For instance, the layer-scalable content launch engine 202 can access application programming interfaces or other modules on the client to load an edit window as a browser plug-in in a web browser running on the client. The layer-scalable content launch engine 202 can also use the “smart-bandwidth” determination to limit the maximum resolution of the edit window. As a result, the layer-scalable content launch engine 202 can launch a highly usable, easily portable content edit window while installing no new applications on the client.
  • In the example of FIG. 2, the client-remote content placement engine 204, in operation, receives an instruction to place content from the content datastore on a multi-layer timeline content compilation represented in the editor window, wherein the content datastore is remote relative to the client. An instruction to place content from the content datastore can include an identifier of the specific content needing to be placed as well as content datastore access parameters (such as content datastore usernames and passwords). In the illustrated example, the identifier of the content can identify a file stored in the content datastore by name, by the file's address in the content datastore, or by the file's relationship to other files in the content datastore. The instruction to place content can also include one or more API calls that obtain the content from the content datastore.
  • In the example of FIG. 2, the client-remote content placement engine 204 places content from the content datastore on a multi-layer timeline content compilation represented in the editor window. To place content into the editor window, the client-remote content placement engine 204 can provide, as part of an API call to the editor window on the browser plug-in, a server-based address of content to be placed. The content is located on the variable-layer content server 200. The client-remote content placement engine 204 places a link to the server-based address of the content in the editor window.
  • In the example of FIG. 2, the client-remote content placement engine 204 also places character strings corresponding to parameters of the content to be placed. For instance, the client-remote content placement engine 204 can provide a target area on the edit window for the content to be placed. The client-remote content placement engine 204 can also provide a resolution and a playback speed of the content to be placed.
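A sketch of the placement instruction the client-remote content placement engine 204 might pass to the editor window: a server-based address of the content plus the placement parameters described above. All key names and the payload shape are assumptions for illustration.

```python
def build_placement_call(content_id: str, server_addr: str,
                         target_area: tuple, resolution: str,
                         playback_speed: float) -> dict:
    """Assemble an illustrative API payload for the editor window:
    a link to the server-based address of the content, the target
    area on the edit window, and playback parameters."""
    return {
        "content_url": f"{server_addr}/content/{content_id}",
        "target_area": target_area,       # (x, y, width, height)
        "resolution": resolution,
        "playback_speed": playback_speed,
    }

call = build_placement_call("clip7", "https://server.example",
                            (0, 0, 640, 360), "720p", 1.0)
```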
  • In the example of FIG. 2, the act of placing content by the client-remote content placement engine 204 can include loading the content into a server-based playback engine, and then streaming portions of the content to the client. As part of a “smart-bandwidth” feature, either the resolution or the playback speed of the content to be placed can depend on the connection speed of the client. The client-remote content placement engine 204 can place higher quality content into a client identified to have a faster Internet connection.
  • In the example of FIG. 2, the multi-layer timeline content compilation can be raw video that reflects a scene without any editing layers. In such a case, the raw video can be a layer in the multi-layer timeline content compilation. The multi-layer timeline content compilation can also be edited video that reflects a scene with editing layers having been previously applied. Edited video can include, in addition to placed content, multiple editing layers as these multiple editing layers are applied to the content at portions of the compilation timeline. For instance, the multi-layer timeline content compilation can include items such as videos, audio, graphics, sound effects, and other editing content as these items are applied to the content at portions of the compilation timeline.
  • In the example of FIG. 2, the multi-layer timeline compilation has a layer corresponding to a compilation timeline. Portions of the compilation timeline can be associated with the run timeline of the content.
  • In the example of FIG. 2, the multi-layer timeline content compilation has a set of destination edit layer classifications that classify the layers to be superimposed onto the multi-layer timeline content compilation. For example, the multi-layer timeline content compilation can have an “effects” destination edit layer classification that receives layers corresponding to special effects to be added to the multi-layer timeline content compilation. The multi-layer timeline content compilation can also have a “graphics” destination edit layer classification that receives layers corresponding to graphics to be added to the multi-layer timeline content compilation. The multi-layer timeline content compilation can have “video” and “image” destination edit layer classifications that receive video layers and image layers to be added to the multi-layer timeline content compilation. The multi-layer timeline content compilation can have “music” and “audio effects” destination edit layer classifications to receive layers corresponding to music and sound effects to be added to the multi-layer timeline content compilation. The multi-layer timeline content compilation can similarly have “custom” destination edit layer classifications and other destination edit layer classifications. The destination edit layer classifications can be user-selected or may correspond to packages from a third-party. FIGS. 8-12 show specific destination edit layer classifications.
  • In the example of FIG. 2, the set of destination edit layer classifications in the multi-layer timeline content compilation can be a user-selected set of destination edit layer classifications. For instance, a user wishing to have more than one independent video layer can add multiple video destination edit layer classifications for the multi-layer timeline content compilation. For a multi-layer timeline content compilation involving edited music or mixed soundtracks, a user can add multiple audio destination edit layer classifications. For a graphics-intensive multi-layer timeline content compilation, a user can add multiple effects destination edit layer classifications. The multi-layer timeline content compilation is therefore heavily customizable and can closely accommodate an editor's creative vision without local installation of any new programs.
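The user-selected set of destination edit layer classifications can be modeled as a simple list of slots, where adding a second slot of a given kind yields an additional independent layer of that kind. The default set and the helper below are illustrative assumptions based on the classifications named above.

```python
# Illustrative default set of destination edit layer classifications.
DEFAULT_CLASSIFICATIONS = ["video", "image", "effects", "graphics",
                           "music", "audio effects"]

def add_classification(classifications: list, kind: str) -> list:
    """Return a new set with one more independent slot of the given
    kind, e.g. a second 'video' classification for a user wanting two
    independent video layers."""
    return classifications + [kind]

tracks = add_classification(DEFAULT_CLASSIFICATIONS, "video")
```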
  • In the example of FIG. 2, the set of destination edit layer classifications in the multi-layer timeline content compilation is stored on the variable-layer content server 200, and not on a local machine. As the set of destination edit layer classifications benefits from having access to very large server storage resources, the set of destination edit layer classifications is potentially very large. In the illustrated example, an editor can weave a potentially infinite number of editing layers into the multi-layer timeline content compilation. The editor can not only create full-length movies with large numbers of scenes, but can also apply a potentially infinite number of layers, including a large number of effects, to a particular interval of the multi-layer timeline content compilation. Thus, editors can create professional and high quality films without installing files or programs on their computers.
  • Once the client-remote content placement engine 204 places the content from the content datastore on the multi-layer timeline content compilation represented in the editor window, the layer addition engine 206 facilitates superimposition of additional layers onto the content, as housed in the multi-layer timeline content compilation.
  • In the example of FIG. 2, the layer addition engine 206, in operation, receives an instruction to superimpose a superimposable layer from the layer datastore onto existing layers of the multi-layer timeline content compilation. An instruction to superimpose a superimposable layer can include an identifier of specific superimposable layers and layer datastore access parameters (such as layer datastore usernames and passwords). In the illustrated example, the identifier of the superimposable layer can identify the superimposable layer by name, by the superimposable layer address in the layer datastore, or by the superimposable layer relationship to other layers in the layer datastore. The instruction to superimpose the superimposable layer can also include one or more API calls that obtain the superimposable layer from the layer datastore.
  • In the example of FIG. 2, the instruction to superimpose includes directing the placement of a superimposable layer over at least a portion of the multi-layer timeline content compilation. The instruction to superimpose therefore includes an instruction to help edit the multi-layer timeline content compilation.
  • In the example of FIG. 2, the instruction to superimpose the superimposable layer can also include API calls to the editor window. The instruction to superimpose could include a portion of the compilation timeline of the multi-layer timeline content compilation for which the superimposable layer is to be applied. For instance, the instruction could include superimposing textual credits for ten seconds to start the multi-layer timeline content compilation. The instruction to superimpose could also identify a visual portion of the multi-layer timeline content compilation for which the superimposable layer is to be applied. For example, the instruction to superimpose could include placing textual credits on the bottom left-hand quadrant of the multi-layer timeline content compilation.
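An instruction to superimpose, as described above, carries the layer, the portion of the compilation timeline it covers, and the visual region it occupies. The dictionary shape and key names in this sketch are assumptions for illustration.

```python
def build_superimpose_instruction(layer_id: str, start_s: float,
                                  end_s: float, region: str) -> dict:
    """Sketch of a superimpose instruction: which layer to apply,
    over which interval of the compilation timeline, and in which
    visual portion of the compilation."""
    return {
        "layer": layer_id,
        "timeline": (start_s, end_s),  # e.g. (0, 10): first ten seconds
        "region": region,              # e.g. "bottom-left" quadrant
    }

# Textual credits for the first ten seconds, bottom left-hand quadrant.
credits = build_superimpose_instruction("text-credits", 0, 10, "bottom-left")
```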
  • In the example of FIG. 2, the superimposable layers could include video layers. Video layers are video clips that can be added to portions of the multi-layer timeline content compilation. For instance, a film editor may wish to add video to a corner of the multi-layer timeline content compilation so that the video appears integrated into the multi-layer timeline content compilation. The superimposable layers could include transition layers. Transition layers are video clips or images used to transition between scenes in the multi-layer timeline content compilation. For instance, a film editor may wish to recreate fading or wiping effects commonly seen in films. The superimposable layers could include sound layers such as audio effects or soundtracks for parts of the multi-layer timeline content compilation. The superimposable layers could further include graphical layers. Graphical layers are animated layers that film editors can use to create graphical effects for parts of the multi-layer timeline content compilation. Moreover, the superimposable layers could include user-specific media layers, which can correspond to video, audio, animated, and other content created or uploaded by a film editor or other users. FIGS. 8-12 show the video layers, transition layers, sound layers, graphical layers, and user-specific media layers.
  • In the example of FIG. 2, the instruction to superimpose the superimposable layer can associate the superimposable layer with a destination edit layer classification on the multi-layer timeline content compilation. Thus, the layer addition engine 206 can provide an instruction to add any of the superimposable layers to any of the destination edit layer classifications associated with the multi-layer timeline content compilation.
  • In the example of FIG. 2, the instruction to superimpose the superimposable layer can control effects relating to each superimposable layer. The instruction to superimpose the superimposable layer can control, for instance, whether a specific superimposed layer is to fade in or out. The instruction to superimpose the superimposable layer can also control the transparency and other attributes of a specific superimposed layer.
  • In the example of FIG. 2, the layer superimposing engine 208, in operation, superimposes, in response to the instruction to superimpose, the superimposable layer onto the existing layers of the multi-layer timeline content compilation, thereby creating a superimposed layer. To superimpose the superimposable layer onto the existing layers of the multi-layer timeline content compilation, the layer superimposing engine 208 modifies the multi-layer timeline content compilation to include the material from the superimposable layer. For instance, if the superimposable layer was a video layer, the multi-layer timeline content compilation would include the video material from the superimposable layer. The layer superimposing engine 208 similarly adds audio, graphics, and other effects to the multi-layer timeline content compilation.
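The layer superimposing engine's modification of the compilation can be sketched as merging the superimposable layer into the compilation's layer list. This is a simplified illustration; the actual engine would also composite video, audio, and effects material.

```python
def superimpose(compilation: dict, layer: dict) -> dict:
    """Return a compilation whose existing layers now include the
    superimposable layer, creating a superimposed layer (a simplified
    sketch of the layer superimposing engine's behavior)."""
    merged = dict(compilation)
    merged["layers"] = compilation.get("layers", []) + [layer]
    return merged

comp = {"title": "demo", "layers": [{"kind": "video", "id": "raw"}]}
comp = superimpose(comp, {"kind": "audio", "id": "soundtrack"})
```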
  • In the example of FIG. 2, the variable-layer content playing engine 210, in operation, plays the multi-layer timeline content compilation, including the content, in the superimposed layer. The variable-layer content playing engine 210 provides instructions to the editor window on the client to display the multi-layer timeline content compilation with the superimposable layer integrated into it. In response, the editor window displays the content along with the video, audio, and other effects added as part of the superimposable layer. In this example, the variable-layer content playing engine 210 streams the multi-media content compilation to the editor window. As such, the variable-layer content playing engine 210 activates a set of streaming media APIs on the browser plug-in housing the editor window on the client. As a result of the media being played, an editor can preview a high quality multi-layer timeline content compilation with a customizable and potentially infinite number of editing layers incorporated within.
  • In the example of FIG. 2, the content account engine 212, in operation, limits access to the content based on an account-based access system. In the illustrated example, the content account engine 212 maintains a plurality of user accounts in a datastore such as the account datastore 234. Each account can have a username, that is, a character string identifying a unique user of the system. Each account can also have a password associated with the username. The usernames and passwords can also be stored in the account datastore 234. In this example, the content account engine 212 can require entry of a username and password associated with any of the plurality of user accounts in order to access the system.
  • In the example of FIG. 2, the content account engine 212 associates the content with one of the plurality of user accounts. In the illustrated example, the content account engine 212 can store an access key associated with the multi-layer timeline content compilation in the account datastore 234. The access key can be a unique character string that contains information about both the multi-layer timeline content compilation and a specific user account. In some embodiments, the access key may require the password associated with the user account in order to unlock the content.
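A sketch of deriving such an access key. The hash construction below is an illustrative assumption; the specification requires only a unique character string containing information about both the compilation and the user account.

```python
import hashlib

def make_access_key(compilation_id: str, username: str) -> str:
    """Derive a unique character string tying a multi-layer timeline
    content compilation to a specific user account, suitable for
    storage in the account datastore (illustrative construction)."""
    material = f"{compilation_id}:{username}".encode()
    return hashlib.sha256(material).hexdigest()

key = make_access_key("compilation-1", "alice")
```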
  • In the example of FIG. 2, the content account engine 212 limits another of the plurality of user accounts from accessing the content. In the illustrated example, the content account engine 212 may prevent the client-remote content placement engine 204 from placing any content into the remote editor window without entry of the username and password associated with the content.
  • In the example of FIG. 2, the content publication engine 214, in operation, provides an instruction to publish the multi-layer timeline content compilation into a streaming media format or a downloadable media format if a user associated with the layer superimposing engine 208 has publication rights to the content. In this example, the content publication engine 214 checks whether a user associated with the layer superimposing engine 208 has publication rights to the content.
  • In one example consistent with FIG. 2, the content publication engine 214 can check publication rights by issuing an instruction to the editor window on the client to prompt a user certification of non-infringing use. For instance, the content publication engine 214 can request a given user to certify under penalty of perjury and under all applicable laws that the user has rights to publish the content.
  • In another example consistent with FIG. 2, the content publication engine 214 can check publication rights of the content by determining which studio owns rights to the content and requesting a content key to the content from the studio. The content publication engine 214 can then request the user to enter the content key to unlock the content for layer superimposition, editing, and other use.
  • In yet another example consistent with FIG. 2, the content publication engine 214 can check publication rights by verifying content rights in metadata of the file storing the content. The content publication engine 214 can also check publication rights in other ways. In this manner, the content publication engine 214 ensures users can have access to a broad range of content. Studios and producers of content may also benefit from the fact that the content publication engine 214 limits republication of protected content to those editors with publication rights.
  • In the example of FIG. 2, the content publication engine 214, after verifying publication rights, can save the multi-layer timeline content compilation to a streaming media format or a downloadable media format. If necessary, the content publication engine 214 can also set the resolution and the playback speed of the multi-layer timeline content compilation.
  • In the example of FIG. 2, the content rights management engine 216, in operation, provides a depublication instruction if the user associated with the layer superimposing engine 208 does not have publication rights to the content. In the illustrated example, the content rights management engine 216 can check publication rights of the content. The content rights management engine 216 can independently check rights by methods such as requesting user verification, requesting content keys, and evaluating content file metadata. In another example, the content rights management engine 216 can import these functions from another module such as the content publication engine 214.
  • In the example of FIG. 2, the content rights management engine 216 can provide a depublication instruction to the other engines in the variable-layer content server 200. In the illustrated example, the content rights management engine 216 can independently generate a depublication key (which may or may not be encrypted) so that the other engines (such as the content publication engine 214) do not publish the multi-layer timeline content compilation.
  • In the example of FIG. 2, the royalty estimation engine 218, in operation, estimates a cumulative royalty associated with publishing the multi-layer timeline content compilation. In this example, the royalty estimation engine 218 can check publication rights of the content by determining which studio owns rights to the content and requesting a license value to the content from the studio. The license value can reflect the studio's estimated cost of licensing the content. In other examples, the license value can include bargaining between an entity operating the royalty estimation engine 218 and the studio owning rights to the content. Aggregating the values of all licensed content in the multi-layer timeline content compilation, the royalty estimation engine 218 can then estimate the cumulative royalty associated with publishing the multi-layer timeline content compilation.
  • In the example of FIG. 2, the royalty estimation engine 218 can also suggest a revenue amount for the multi-layer timeline content compilation. That is, the royalty estimation engine 218 can recommend a minimum amount that the creator of the multi-layer timeline content compilation can charge to license his or her creative product. Advantageously, the minimum amount reflects the cost of obtaining the content for the multi-layer timeline content compilation.
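The royalty estimation described above reduces to aggregating the per-item license values and suggesting a minimum revenue amount that at least recovers that cost. The margin parameter below is an assumption added for illustration; the source states only that the minimum reflects the cost of obtaining the content.

```python
def estimate_cumulative_royalty(license_values: dict) -> float:
    """Aggregate the license values of all licensed content in the
    compilation into the cumulative royalty for publishing it."""
    return sum(license_values.values())

def suggest_minimum_revenue(license_values: dict,
                            margin: float = 0.0) -> float:
    """Recommend a minimum licensing charge for the creator; a
    non-zero margin (an illustrative assumption) would add headroom
    above the licensing cost."""
    return estimate_cumulative_royalty(license_values) * (1 + margin)

fees = {"studio-clip": 100.0, "soundtrack": 50.0}
```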
  • In the example of FIG. 2, the content rights acquisition engine 220, in operation, provides a request to obtain publication rights to the content if a user associated with the layer superimposing engine 208 does not have publication rights to the content. In the illustrated example, the content rights acquisition engine 220 can check publication rights of the content. The content rights acquisition engine 220 can independently check rights by methods such as requesting user verification, requesting content keys, and evaluating content file metadata. In another example, the content rights acquisition engine 220 can import these functions from another module such as the content publication engine 214. In the example of FIG. 2, the content rights acquisition engine 220 provides a request to obtain publication rights to the content. In the illustrated example, the content rights acquisition engine 220 can independently generate a request from a studio for a content key for the content. Thus, in addition to notifying editors of unlicensed work, the content rights acquisition engine 220 allows editors to actively seek licenses to modify content owned by others, including studios.
  • In the example of FIG. 2, the content rights crediting engine 222, in operation, provides a request to acquire publication credits associated with the content, and provides the publication credits for storage in the layer datastore 232 if a user associated with the layer superimposing engine 208 does not have publication rights to the content.
  • In the example of FIG. 2, the content rights crediting engine 222 can check publication credits associated with the content. The content rights crediting engine 222 can independently check credits by methods such as requesting user verification, requesting content credits from a studio owning rights to the content, and evaluating content file metadata. In another example, the content rights crediting engine 222 can import these functions from another module such as the content publication engine 214. In the example of FIG. 2, the content rights crediting engine 222 can provide the publication credits for storage in the layer datastore 232. In this way, a user can actually use the publication credits as a superimposable layer to be used in the multi-layer timeline content compilation.
  • In the example of FIG. 2, the content crowdsourcing engine 224, in operation, allows users to obtain content relating to a specific subject or event from a variety of sources. In this example, the content crowdsourcing engine 224 provides a subject-specific content call. A subject-specific content call is a character string corresponding to a subject or an event. For example, a subject-specific content call could be a character string requesting a list of protests that occurred in the year 2011. The subject-specific content call can be limited to a specific country, such as Egypt, Syria, Libya, or other country. The subject-specific content call can be limited to a specific movement, such as “Occupy Wall Street.” The content crowdsourcing engine 224 can then provide the subject-specific content call to query search engines, social networks, studios, online video sites, or other content sources.
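A subject-specific content call, as described above, is a character string built from a subject plus optional narrowing criteria such as a year, country, or movement. The composition format below is an illustrative assumption about how the content crowdsourcing engine might form the query it sends to search engines, social networks, and other content sources.

```python
def build_content_call(subject, year=None, country=None, movement=None):
    """Compose a subject-specific content call as a character string,
    optionally limited to a year, a specific country, or a specific
    movement (format is an illustrative assumption)."""
    parts = [subject]
    if year is not None:
        parts.append(str(year))
    if country is not None:
        parts.append(country)
    if movement is not None:
        parts.append(movement)
    return " ".join(parts)

call = build_content_call("protests", year=2011, country="Egypt")
```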
  • In the example of FIG. 2, the content crowdsourcing engine 224 receives, in response to the subject-specific content call, subject-specific content. Thus, as a result of the query of the search engines, the social networks, the studios, the online video sites, or the other content sources, the content crowdsourcing engine 224 can receive content relating to the specific subject or event.
  • In the example of FIG. 2, the content crowdsourcing engine 224 provides the subject-specific content to the content datastore 230. In this example, the content crowdsourcing engine 224 can save the subject-specific content to the content datastore 230. As a result, the content crowdsourcing engine 224 provides ready access to a variety of clips relating to a specific subject or event. An editor can develop professional high-quality films, including professional high-quality documentaries of specific subjects or events as these events unfold.
  • In this example, the content crowdsourcing engine 224 allows an editor to readily access multiple views of a specific subject or event. For instance, many participants of the “Arab Spring” revolts or the “Occupy Wall Street” movement of 2011 have access to mobile devices that allow protest footage to be captured and uploaded to video sites. An editor using the content crowdsourcing engine 224 can readily obtain numerous clips of an event (e.g., numerous views of a single protest) and develop a professional and high-quality film without installing any editing software to his or her computer. In this example, the editor can even develop a professional-quality documentary based on the crowdsourced footage from a remote location such as a library or an Internet café.
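The crowdsourcing flow described above can be sketched in a few lines. This is an illustrative sketch only: the function names, the string form of the content call, and the callable-source interface are assumptions for illustration, not part of the disclosure.

```python
# Illustrative sketch of a subject-specific content call (a character string)
# and a crowdsourcing routine that queries several content sources and pools
# the clips they return. All names here are assumed, not from the patent.

def build_content_call(subject, year=None, country=None, movement=None):
    """Compose a subject-specific content call string,
    e.g. 'protests 2011 Egypt "Arab Spring"'."""
    parts = [subject]
    if year is not None:
        parts.append(str(year))
    if country:
        parts.append(country)
    if movement:
        parts.append('"%s"' % movement)
    return " ".join(parts)

def crowdsource_content(call, sources, content_datastore):
    """Provide the call to each source (search engine, social network,
    studio, online video site) and save the returned clips."""
    for source in sources:
        # Each source is assumed to be a callable: call -> list of clips.
        content_datastore.extend(source(call))
    return content_datastore
```

In use, an editor would issue one call and receive pooled clips from every queried source, ready to be placed on a compilation timeline.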
  • In the example of FIG. 2, the third-party layer-package engine 226, in operation, provides a package of proprietary third-party layers to the layer datastore 232. In this example, the third-party layer-package engine 226 can obtain high-quality or even professional layers from third parties such as commercial movie studios. The third-party layer-package engine 226 can allow editors access to professional layers that they can add to their creations. The third-party layer-package engine 226 stores these layers in the layer datastore 232.
  • In the example of FIG. 2, the content monetization engine 228, in operation, provides an advertising package or a pay-per-view package to the layer datastore 232. An advertising package can involve links to commercial entities who can pay an editor for including their material in a multi-layer timeline content compilation. A pay-per-view package can request viewers to pay for watching a multi-layer timeline content compilation. The content monetization engine 228 can also provide instructions to deposit revenue in an editor's financial accounts or credit deficiencies in the editor's financial accounts. Using the content monetization engine 228, an editor can readily monetize his or her creation by adding an edit layer to that creation.
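The account settlement described above (depositing revenue or crediting deficiencies) reduces to simple arithmetic. A hypothetical sketch; the function names and the flat per-view price are illustrative assumptions:

```python
def settle_account(balance, revenue=0.0, deficiency=0.0):
    """Deposit revenue into, or credit a deficiency against, an editor's
    financial account; returns the new balance."""
    return balance + revenue - deficiency

def pay_per_view_revenue(views, price_per_view):
    """Gross amount a pay-per-view package would request from viewers."""
    return views * price_per_view
```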
  • In the example of FIG. 2, the content datastore 230 stores content. In this example, the layer datastore 232 stores layers. The account datastore 234 stores account information such as user names and passwords.
  • FIG. 3 shows a diagram of an example of a variable-layer content client 300. The variable-layer content client 300 includes a number of engines and datastores. For instance, the variable-layer content client 300 includes a web browsing engine 302, a content editor display engine 304, a client-based content placement instruction engine 306, a client-based layer placement instruction engine 308, a superimposable layer display engine 310, a timeline display engine 312, and a multi-layer timeline content compilation display engine 314. In this example, the variable-layer content client 300 includes a local datastore 316 and a local storage buffer 318. The discussion below provides a description of the functionality of each of these engines and datastores.
  • In the example of FIG. 3, the web browsing engine 302, in operation, allows a user of the variable-layer content client 300 to access the Internet. In this example, the web browsing engine 302 is incorporated into an Internet browser. Existing Internet browsers include browsers manufactured by Microsoft®, Google®, Mozilla®, Apple®, and others. The web browsing engine 302 can be incorporated into a personal computer, a mobile device, or other computing client.
  • In the example of FIG. 3, the web browsing engine 302 can run a host application. That is, the web browsing engine 302 can execute a browser plug-in in the Internet browser installed on the variable-layer content client 300. The browser plug-in can provide an interface such as a graphical user interface (GUI) for a user to access the server-based content editing services. The browser plug-in can include a GUI to display content and layers on server datastores. For instance, the browser plug-in can have display capabilities like the capabilities provided by proprietary commercially available plug-ins like Adobe® Flash Player, QuickTime®, and Microsoft Silverlight®. The browser plug-in can also include an interface to execute server-initiated functionalities on server-based engines.
  • In the example of FIG. 3, the content editor display engine 304, in operation, can launch an editor window for display on the variable-layer content client 300. The editor window can be displayed in the host application on the variable-layer content client 300. To launch and display the editor window, the content editor display engine 304 can call one or more APIs of the web browser plug-in, thereby allowing display of an editor window.
  • In the example of FIG. 3, the client-based content placement instruction engine 306, in operation, places a link to the content in the editor window. The client-based content placement instruction engine 306 receives parameters, such as the server address of the content to be placed, resolution, and playback speed. Based on these parameters, the client-based content placement instruction engine 306 places a link to the content (at the provided resolution, playback speed, etc.) in the editor window.
  • In the example of FIG. 3, the client-based layer placement instruction engine 308, in operation, places a link to a superimposable layer over the link to the content. Placing this link creates a multi-layer timeline content compilation on the server.
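The two placement instructions described above operate on links rather than on media, so the client never holds the content itself. A minimal, hypothetical sketch; the class names and the fields (server_address, resolution, playback_speed) are illustrative assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class ContentLink:
    server_address: str        # the content itself stays remote to the client
    resolution: str = "720p"
    playback_speed: float = 1.0

@dataclass
class CompilationTimeline:
    layers: list = field(default_factory=list)

    def place_content(self, link):
        """Place a link to content (engine 306's role): only the link and
        its parameters live on the client."""
        self.layers.append(("content", link))

    def superimpose_layer(self, layer_link):
        """Place a link to a superimposable layer over the existing links
        (engine 308's role), building the multi-layer compilation."""
        self.layers.append(("layer", layer_link))
```

The design point this sketch captures is that edits are lightweight link manipulations on the client, while the heavy media and compositing stay server-side.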
  • In the example of FIG. 3, the superimposable layer display engine 310, in operation, displays links to superimposable layers as well as links to destination edit layer classifications in the edit window. Further, in this example, the timeline display engine 312, in operation, displays a link to the compilation timeline in the edit window. Additionally, the multi-layer timeline content compilation display engine 314 can place a link to a multi-layer timeline content compilation in the edit window. As a result, the edit window can display a link to the multi-layer content compilation, links to superimposable layers, and links to destination edit layer classifications. A user of the variable-layer content client 300 has access to high-quality professional film editing without needing to install any editing software on the variable-layer content client 300.
  • In the example of FIG. 3, the local datastore 316 can locally store any data on the variable-layer content client 300. Also shown in FIG. 3 is the local storage buffer 318, which can buffer content to optimize editing and playback.
  • FIG. 4 shows a flowchart 400 of an example of a method for providing subject-specific content in response to a subject-specific content call. In some implementations, the modules of the flowchart 400 and other flowcharts described in this paper are reordered to a permutation of the illustrated order of modules or reorganized for parallel execution. In the example of FIG. 4, the flowchart 400 starts at module 402 with providing a subject-specific content call. The flowchart 400 continues to module 404 with receiving, in response to the subject-specific content call, subject-specific content. The flowchart 400 continues to module 406 with providing the subject-specific content to the content datastore. The flowchart 400 continues to module “A.”
  • FIG. 5 shows a flowchart 500 of an example of a method for creating a multi-layer timeline content compilation. In the example of FIG. 5, the flowchart 500 starts at module “A” and proceeds to module 502 with receiving a request to launch an editor window for display on a client. The flowchart 500 continues to module 504 with receiving an instruction to place content from a content datastore on a multi-layer timeline content compilation in the editor window, wherein the content datastore is remote relative to the client.
  • In the example of FIG. 5, the flowchart 500 continues to module 506 with receiving an instruction to superimpose a superimposable layer on existing layers of the multi-layer timeline content compilation. The flowchart 500 continues to module 508 with superimposing, in response to the instruction to superimpose, the superimposable layer onto the existing layers of the multi-layer timeline content compilation, thereby creating a superimposed layer. The flowchart 500 continues to module 510 with playing the multi-layer timeline content compilation, including the content, in the superimposed layer. The flowchart 500 continues to module “B.”
  • FIG. 6 shows a flowchart 600 of an example of a method for publishing a multi-layer timeline content compilation. In the example of FIG. 6, the flowchart 600 starts at module “B” and proceeds to module 602 with deciding whether a user has publication rights to the content. If the user does not have publication rights to the content, the flowchart 600 continues to module 604 with deciding whether to obtain publication rights to the content. If it is not found desirable to obtain publication rights to the content, the flowchart 600 continues to module 618 with providing a depublication instruction. If it is found desirable to obtain publication rights to the content, the flowchart 600 continues to module 606 with providing a request to acquire publication rights to the content. The flowchart 600 continues to module 608 with providing a request to acquire publication credits associated with the content. The flowchart 600 continues to module 610 with providing the publication credits for storage.
  • In the example of FIG. 6, the flowchart 600 returns to module 602. If the user does have publication rights to the content, the flowchart 600 continues to module 612 with estimating a cumulative royalty associated with publishing the multi-layer timeline content compilation. The flowchart 600 continues to module 614 with suggesting a revenue amount for the multi-layer timeline content compilation. The flowchart 600 continues to module 616 with providing an instruction to publish the multi-layer timeline content compilation.
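The branching in flowchart 600 can be expressed as a short control-flow sketch. The rights-manager interface below (has_rights, worth_acquiring, acquire, estimate_royalty) and the 20% markup used for the suggested revenue amount are assumptions made for illustration, not details from the disclosure:

```python
def publish_flow(compilation, rights):
    """Sketch of FIG. 6: depublish if rights are absent and not worth
    acquiring; otherwise acquire rights and credits, estimate the
    cumulative royalty, suggest a revenue amount, and publish."""
    while not rights.has_rights(compilation):        # module 602
        if not rights.worth_acquiring(compilation):  # module 604
            return "depublish"                       # module 618
        rights.acquire(compilation)                  # modules 606-610
    royalty = rights.estimate_royalty(compilation)   # module 612
    suggested = royalty * 1.2                        # module 614 (assumed markup)
    return ("publish", royalty, suggested)           # module 616
```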
  • FIG. 7 shows an example of a system on which techniques described in this paper can be implemented. The computer system 700 can be a conventional computer system that can be used as a client computer system, such as a wireless client or a workstation, or a server computer system. The computer system 700 includes a computer 702, I/O devices 704, and a display device 706. The computer 702 includes a processor 708, a communications interface 710, memory 712, display controller 714, non-volatile storage 716, and I/O controller 718. The computer 702 may be coupled to or include the I/O devices 704 and display device 706.
  • The computer 702 interfaces to external systems through the communications interface 710, which may include a modem or network interface. It will be appreciated that the communications interface 710 can be considered to be part of the computer system 700 or a part of the computer 702. The communications interface 710 can be an analog modem, ISDN modem, cable modem, token ring interface, satellite transmission interface (e.g. “direct PC”), or other interfaces for coupling a computer system to other computer systems.
  • The processor 708 may be, for example, a conventional microprocessor such as an Intel Pentium microprocessor or Motorola PowerPC microprocessor. The memory 712 is coupled to the processor 708 by a bus 770. The memory 712 can be Dynamic Random Access Memory (DRAM) and can also include Static RAM (SRAM). The bus 770 couples the processor 708 to the memory 712, the non-volatile storage 716, the display controller 714, and the I/O controller 718.
  • The I/O devices 704 can include a keyboard, disk drives, printers, a scanner, and other input and output devices, including a mouse or other pointing device. The display controller 714 may control in the conventional manner a display on the display device 706, which can be, for example, a cathode ray tube (CRT) or liquid crystal display (LCD). The display controller 714 and the I/O controller 718 can be implemented with conventional well known technology.
  • The non-volatile storage 716 is often a magnetic hard disk, an optical disk, or another form of storage for large amounts of data. Some of this data is often written, by a direct memory access process, into memory 712 during execution of software in the computer 702. One of skill in the art will immediately recognize that the terms “machine-readable medium” or “computer-readable medium” includes any type of storage device that is accessible by the processor 708 and also encompasses a carrier wave that encodes a data signal.
  • The computer system 700 is one example of many possible computer systems which have different architectures. For example, personal computers based on an Intel microprocessor often have multiple buses, one of which can be an I/O bus for the peripherals and one that directly connects the processor 708 and the memory 712 (often referred to as a memory bus). The buses are connected together through bridge components that perform any necessary translation due to differing bus protocols.
  • Network computers are another type of computer system that can be used in conjunction with the teachings provided herein. Network computers do not usually include a hard disk or other mass storage, and the executable programs are loaded from a network connection into the memory 712 for execution by the processor 708. A Web TV system, which is known in the art, is also considered to be a computer system, but it may lack some of the features shown in FIG. 7, such as certain input or output devices. A typical computer system will usually include at least a processor, memory, and a bus coupling the memory to the processor.
  • Some portions of the detailed description are presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of operations leading to a desired result. The operations are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
  • It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as “processing” or “computing” or “calculating” or “determining” or “displaying” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
  • Techniques described in this paper relate to apparatus for performing the operations. The apparatus can be specially constructed for the required purposes, or it can comprise a general purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable storage medium, such as, but is not limited to, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, any type of disk including floppy disks, optical disks, CD-ROMs, and magnetic-optical disks, or any type of media suitable for storing electronic instructions, and each coupled to a computer system bus.
  • FIG. 8 shows a diagram of a screenshot 800 on a web browser of a variable-layer content client. In this example, the screenshot 800 shows an editor window incorporated into an Internet browser, here the Internet Explorer web browser from Microsoft®. The editor window displays content, namely, a video for editing in the upper right-hand corner. The editor window displays a series of superimposable effects. In this example, superimposable effects include “Videos,” “Transitions,” “Sounds,” “Graphics,” and “My media files.” In this example, a user has selected the “Videos” set of superimposable layers and sees a corresponding library of Nature videos.
  • In the example of FIG. 8, the edit window shows an edit timeline near the center of the edit window. The timeline in this example has a duration of 2 minutes and 20 seconds. A moving marker indicates that the user is viewing the multi-layer timeline content compilation as it looks at about 10 seconds into the compilation.
  • In the example of FIG. 8, the edit window shows a list of destination edit classifications. In this example, destination edit classifications include “Effects,” “Graphics,” “Video/image,” “Music,” “Audio Effects,” and “Contentum.” In this example, the list of destination edit classifications is customizable. Moreover, a user can select from an extensive number of destination edit classifications, as the destination edit classifications are stored on a server and therefore benefit from large-volume storage.
  • FIG. 9 shows a diagram of a screenshot 900 on a web browser of the variable-layer content client. The screenshot 900 shows the superimposable “Transitions” layers in greater detail. In this example, a user can select from wipe and fade transitions. Wipe transitions include a left wipe, a right wipe, an upward wipe, and a downward wipe.
  • FIG. 10 shows a diagram of a screenshot 1000 on a web browser of the variable-layer content client. The screenshot 1000 shows the superimposable “Sounds” layers in greater detail. In this example, a user can select from a variety of music or sound effects.
  • FIG. 11 shows a diagram of a screenshot 1100 on a web browser of the variable-layer content client. The screenshot 1100 shows the superimposable “Graphics” layers in greater detail. In this example, a user can select from a 35 mm image, a heart image, comic text, or other text to superimpose on the content.
  • FIG. 12 shows a diagram of a screenshot 1200 on a web browser of the variable-layer content client. The screenshot 1200 shows the superimposable “My media files” layers in greater detail. In this example, a user can superimpose other media over existing content. In this example, the user can superimpose any of VIDEO002, VIDEO015, and VIDEO016 onto any of the destination edit classifications. As shown, the user has superimposed VIDEO016 onto the “Video/image” destination edit classification.
  • Embodiments allow creative professionals to harness the reality that film is more than the sum of individual video clips and special effects. As discussed with reference to the examples illustrated herein, creative professionals can develop high-quality and professional content without needing access to film studios or expensive editing equipment. Moreover, creative professionals need not locally install complex video editing software to produce professional films. Creative professionals can easily add creative elements to streaming media, including streaming media from multiple sources, such as crowdsourced streaming media.
  • Although the foregoing embodiments have been described in some detail for purposes of clarity of understanding, the invention is not necessarily limited to the details provided.

Claims (24)

1. A system, comprising:
a content datastore;
a layer datastore;
a layer-scalable content editor launch engine;
a client-remote content placement engine coupled to the content datastore, the layer datastore, and the layer-scalable content editor launch engine;
a layer superimposing engine coupled to the client-remote content placement engine;
a variable-layer content playing engine having an arbitrary number of layers coupled to the layer superimposing engine;
wherein, in operation:
the layer-scalable content editor launch engine launches an editor window at a client in response to a request from the client;
the client-remote content placement engine places content from the content datastore on a multi-layer timeline content compilation represented in the editor window, wherein the content datastore is remote relative to the client;
the layer superimposing engine superimposes, in response to an instruction to superimpose, a superimposable layer onto the multi-layer timeline content compilation, thereby creating a superimposed layer;
the variable-layer content playing engine plays the multi-layer timeline content compilation, including the content, in the superimposed layer.
2. The system of claim 1, wherein the existing layers comprise first crowdsourcing content and the superimposable layer comprises second crowdsourcing content.
3. The system of claim 1, further comprising a content account engine that, in operation:
maintains a plurality of user accounts;
associates the content with one of the plurality of user accounts;
limits another of the plurality of user accounts from accessing the content.
4. The system of claim 1, further comprising a content rights management engine that, in operation, provides a depublication instruction if a user associated with the layer superimposing engine does not have publication rights to the content.
5. The system of claim 1, further comprising a publication engine that, in operation, provides an instruction to publish the multi-layer timeline content compilation into a streaming media format or a downloadable media format if a user associated with the layer superimposing engine has publication rights to the content.
6. The system of claim 1, further comprising a royalty estimation engine that, in operation, estimates a cumulative royalty associated with publishing the multi-layer timeline content compilation.
7. The system of claim 6, wherein the royalty estimation engine, in operation, suggests a revenue amount for the multi-layer timeline content compilation, the revenue amount based at least in part on the cumulative royalty.
8. The system of claim 1, further comprising a content rights acquisition engine that, in operation, provides a request to acquire publication rights if a user associated with the layer superimposing engine does not have publication rights to the content.
9. The system of claim 1, further comprising a content rights crediting engine that, in operation, provides an instruction to acquire publication credits associated with the content, and provides the publication credits for storage in the layer datastore if a user associated with the layer superimposing engine does not have publication rights to the content.
10. The system of claim 1, further comprising a content crowdsourcing engine that, in operation:
provides a subject-specific content call;
receives, in response to the subject-specific content call, subject-specific content; and
provides the subject-specific content to the content datastore.
11. The system of claim 1, further comprising a third-party layer-package engine that, in operation, provides a package of proprietary third-party layers to the layer datastore.
12. The system of claim 1, further comprising a content monetization engine that, in operation, provides an advertising package or a pay-per-view package to the layer datastore.
13. A method, comprising:
receiving a request to launch an editor window for display on a client;
receiving an instruction to place content from a content datastore on a multi-layer timeline content compilation represented in the editor window, wherein the content datastore is remote relative to the client;
superimposing, in response to an instruction to superimpose, a superimposable layer onto the multi-layer timeline content compilation, thereby creating a superimposed layer;
playing the multi-layer timeline content compilation, including the content, in the superimposed layer.
14. The method of claim 13, further comprising:
maintaining a plurality of user accounts;
associating the content with one of the plurality of user accounts;
limiting another of the plurality of user accounts from accessing the content.
15. The method of claim 13, further comprising providing a depublication instruction if a user does not have publication rights to the content.
16. The method of claim 13, further comprising providing an instruction to publish the multi-layer timeline content compilation into a streaming media format or a downloadable media format if a user has publication rights to the content.
17. The method of claim 13, further comprising estimating a cumulative royalty associated with publishing the multi-layer timeline content compilation.
18. The method of claim 17, further comprising suggesting a revenue amount for the multi-layer timeline content compilation, the revenue amount based at least in part on the cumulative royalty.
19. The method of claim 13, further comprising providing a request to acquire publication rights to the content if the user does not have the publication rights to the content.
20. The method of claim 13, further comprising:
providing an instruction to acquire publication credits associated with the content, if the user does not have publication rights to the content;
providing the publication credits if the user does not have the publication rights to the content.
21. The method of claim 13, further comprising providing a notification if the user does not have publication rights to the content.
22. The method of claim 13, further comprising:
providing a subject-specific content call;
receiving, in response to the subject-specific content call, subject-specific content; and
providing the subject-specific content to the content datastore.
23. The method of claim 13, further comprising providing a package of proprietary layers to the layer datastore.
24. The method of claim 13, further comprising providing an advertising package or a pay-per-view package to the layer datastore.
US13/433,239 2011-03-29 2012-03-28 Multi-layer timeline content compilation systems and methods Abandoned US20120251080A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/433,239 US20120251080A1 (en) 2011-03-29 2012-03-28 Multi-layer timeline content compilation systems and methods

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201161468725P 2011-03-29 2011-03-29
US201161564261P 2011-11-28 2011-11-28
US201161564257P 2011-11-28 2011-11-28
US201161564256P 2011-11-28 2011-11-28
US13/433,239 US20120251080A1 (en) 2011-03-29 2012-03-28 Multi-layer timeline content compilation systems and methods

Publications (1)

Publication Number Publication Date
US20120251080A1 true US20120251080A1 (en) 2012-10-04

Family

ID=46927379

Family Applications (9)

Application Number Title Priority Date Filing Date
US13/433,239 Abandoned US20120251080A1 (en) 2011-03-29 2012-03-28 Multi-layer timeline content compilation systems and methods
US13/433,252 Active 2035-06-04 US9711178B2 (en) 2011-03-29 2012-03-28 Local timeline editing for online content editing
US13/433,256 Abandoned US20120251083A1 (en) 2011-03-29 2012-03-28 Systems and methods for low bandwidth consumption online content editing
US13/506,156 Abandoned US20120284176A1 (en) 2011-03-29 2012-03-29 Systems and methods for collaborative online content editing
US13/434,709 Active US9460752B2 (en) 2011-03-29 2012-03-29 Multi-source journal content integration systems and methods
US14/874,311 Active US9489983B2 (en) 2011-03-29 2015-10-02 Low bandwidth consumption online content editing
US15/342,364 Active US10109318B2 (en) 2011-03-29 2016-11-03 Low bandwidth consumption online content editing
US16/168,780 Active US11127431B2 (en) 2011-03-29 2018-10-23 Low bandwidth consumption online content editing
US17/460,547 Pending US20220084557A1 (en) 2011-03-29 2021-08-30 Low bandwidth consumption online content editing

Family Applications After (8)

Application Number Title Priority Date Filing Date
US13/433,252 Active 2035-06-04 US9711178B2 (en) 2011-03-29 2012-03-28 Local timeline editing for online content editing
US13/433,256 Abandoned US20120251083A1 (en) 2011-03-29 2012-03-28 Systems and methods for low bandwidth consumption online content editing
US13/506,156 Abandoned US20120284176A1 (en) 2011-03-29 2012-03-29 Systems and methods for collaborative online content editing
US13/434,709 Active US9460752B2 (en) 2011-03-29 2012-03-29 Multi-source journal content integration systems and methods
US14/874,311 Active US9489983B2 (en) 2011-03-29 2015-10-02 Low bandwidth consumption online content editing
US15/342,364 Active US10109318B2 (en) 2011-03-29 2016-11-03 Low bandwidth consumption online content editing
US16/168,780 Active US11127431B2 (en) 2011-03-29 2018-10-23 Low bandwidth consumption online content editing
US17/460,547 Pending US20220084557A1 (en) 2011-03-29 2021-08-30 Low bandwidth consumption online content editing

Country Status (1)

Country Link
US (9) US20120251080A1 (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120284176A1 (en) * 2011-03-29 2012-11-08 Svendsen Jostein Systems and methods for collaborative online content editing
US20140229831A1 (en) * 2012-12-12 2014-08-14 Smule, Inc. Audiovisual capture and sharing framework with coordinated user-selectable audio and video effects filters
EP2775703A1 (en) * 2013-03-07 2014-09-10 Nokia Corporation Method and apparatus for managing crowd sourced content creation
WO2014189485A1 (en) 2013-05-20 2014-11-27 Intel Corporation Elastic cloud video editing and multimedia search
US20150332282A1 (en) * 2014-05-16 2015-11-19 Altair Engineering, Inc. Unit-based licensing for collage content access
US20160342698A1 (en) * 2011-08-11 2016-11-24 Curve Dental Ltd. Media acquisition engine and method
US9773524B1 (en) 2016-06-03 2017-09-26 Maverick Co., Ltd. Video editing using mobile terminal and remote computer
US9852768B1 (en) 2016-06-03 2017-12-26 Maverick Co., Ltd. Video editing using mobile terminal and remote computer
US10679151B2 (en) 2014-04-28 2020-06-09 Altair Engineering, Inc. Unit-based licensing for third party access of digital content
US10685055B2 (en) 2015-09-23 2020-06-16 Altair Engineering, Inc. Hashtag-playlist content sequence management
US10739941B2 (en) 2011-03-29 2020-08-11 Wevideo, Inc. Multi-source journal content integration systems and methods and systems and methods for collaborative online content editing
US20200274908A1 (en) * 2016-04-28 2020-08-27 Rabbit Asset Purchase Corp. Screencast orchestration
US10855731B2 (en) 2013-04-11 2020-12-01 Nec Corporation Information processing apparatus, data processing method thereof, and program
US11748833B2 (en) 2013-03-05 2023-09-05 Wevideo, Inc. Systems and methods for a theme-based effects multimedia editing platform
US11799864B2 (en) 2019-02-07 2023-10-24 Altair Engineering, Inc. Computer systems for regulating access to electronic content using usage telemetry data

Families Citing this family (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006127951A2 (en) 2005-05-23 2006-11-30 Gilley Thomas S Distributed scalable media environment
US9648281B2 (en) 2005-05-23 2017-05-09 Open Text Sa Ulc System and method for movie segment bookmarking and sharing
US8145528B2 (en) 2005-05-23 2012-03-27 Open Text S.A. Movie advertising placement optimization based on behavior and content analysis
US8141111B2 (en) 2005-05-23 2012-03-20 Open Text S.A. Movie advertising playback techniques
US9288248B2 (en) * 2011-11-08 2016-03-15 Adobe Systems Incorporated Media system with local or remote rendering
US9373358B2 (en) 2011-11-08 2016-06-21 Adobe Systems Incorporated Collaborative media editing system
JP2013125346A (en) * 2011-12-13 2013-06-24 Olympus Imaging Corp Server device and processing method
KR101909030B1 (en) 2012-06-08 2018-10-17 엘지전자 주식회사 A Method of Editing Video and a Digital Device Thereof
US10546047B1 (en) 2012-09-27 2020-01-28 Open Text Corporation Method and system for stashing of document alteration information for quicker web preview
US20140101571A1 (en) * 2012-10-04 2014-04-10 Lucid Dream Software, Inc. Shared collaborative environment
FR2998995A1 (en) * 2012-12-03 2014-06-06 France Telecom METHOD OF COMMUNICATION BETWEEN MULTIPLE USERS WITH COMMUNICATION TERMINALS THROUGH A VIRTUAL COMMUNICATION SPACE
US10394877B2 (en) * 2012-12-19 2019-08-27 Oath Inc. Method and system for storytelling on a computing device via social media
JP2016517195A (en) * 2013-03-08 2016-06-09 トムソン ライセンシングThomson Licensing Method and apparatus for improving video and media time series editing utilizing a list driven selection process
US9736448B1 (en) * 2013-03-15 2017-08-15 Google Inc. Methods, systems, and media for generating a summarized video using frame rate modification
US10372764B2 (en) 2013-04-30 2019-08-06 International Business Machines Corporation Extending document editors to assimilate documents returned by a search engine
US9542377B2 (en) 2013-05-06 2017-01-10 Dropbox, Inc. Note browser
US9946992B2 (en) 2013-05-14 2018-04-17 International Business Machines Corporation Temporal promotion of content to a project activity
US10949894B1 (en) 2013-06-07 2021-03-16 Groupon, Inc. Method, apparatus, and computer program product for facilitating dynamic pricing
US10984455B1 (en) * 2013-06-28 2021-04-20 Groupon, Inc. Method and apparatus for generating an electronic communication
US10387902B1 (en) * 2013-06-28 2019-08-20 Groupon, Inc. Method and apparatus for generating an electronic communication
US10002150B1 (en) * 2014-02-11 2018-06-19 United Technologies Corporation Conflict resolution for a multi-user CAx environment
US10534525B1 (en) * 2014-12-09 2020-01-14 Amazon Technologies, Inc. Media editing system optimized for distributed computing systems
US20160266740A1 (en) * 2015-01-19 2016-09-15 Dane Glasgow Interactive multi-media system
WO2016128984A1 (en) * 2015-02-15 2016-08-18 Moviemation Ltd. Customized, personalized, template based online video editing
US10007352B2 (en) 2015-08-21 2018-06-26 Microsoft Technology Licensing, Llc Holographic display system with undo functionality
US9583140B1 (en) * 2015-10-06 2017-02-28 Bruce Rady Real-time playback of an edited sequence of remote media and three-dimensional assets
US10007963B2 (en) * 2015-11-10 2018-06-26 International Business Machines Corporation Context-based provision of screenshot modifications
USD791159S1 (en) * 2016-04-18 2017-07-04 Apple Inc. Display screen or portion thereof with graphical user interface
CA3154210A1 (en) 2017-10-30 2019-04-30 Deltek, Inc. Dynamic content and cloud based content within collaborative electronic content creation and management tools
US11288336B2 (en) * 2018-04-18 2022-03-29 Google Llc Systems and methods for providing content items in situations involving suboptimal network conditions
JP7180111B2 (en) * 2018-04-27 2022-11-30 富士フイルムビジネスイノベーション株式会社 Display editing device and program
US11061744B2 (en) * 2018-06-01 2021-07-13 Apple Inc. Direct input from a remote device
US11899757B2 (en) * 2019-12-02 2024-02-13 Cox Automotive, Inc. Systems and methods for temporary digital content sharing
USD959466S1 (en) * 2020-04-29 2022-08-02 Toontrack Music Ab Display screen or portion thereof with graphical user interface
JP2022012403A (en) * 2020-07-01 2022-01-17 キヤノン株式会社 Program, information processing device, and control method
US11445230B2 (en) * 2020-11-24 2022-09-13 At&T Intellectual Property I, L.P. Analysis of copy protected content and user streams

Family Cites Families (124)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4977594A (en) 1986-10-14 1990-12-11 Electronic Publishing Resources, Inc. Database usage metering and protection system and method
US5247575A (en) 1988-08-16 1993-09-21 Sprague Peter J Information distribution system
US4932054A (en) 1988-09-16 1990-06-05 Chou Wayne W Method and apparatus for protecting computer software utilizing coded filter network in conjunction with an active coded hardware device
JPH08263438A (en) 1994-11-23 1996-10-11 Xerox Corp Distribution and use control system of digital work and access control method to digital work
US5841512A (en) 1996-02-27 1998-11-24 Goodhill; Dean Kenneth Methods of previewing and editing motion pictures
US20030056103A1 (en) 2000-12-18 2003-03-20 Levy Kenneth L. Audio/video commerce application architectural framework
JPH11203837A (en) 1998-01-16 1999-07-30 Sony Corp Editing system and method therefor
US6351765B1 (en) 1998-03-09 2002-02-26 Media 100, Inc. Nonlinear video editing system
US6388668B1 (en) 1998-07-20 2002-05-14 Microsoft Corporation Functional animation including sprite tree generator and interpreter
US6442283B1 (en) 1999-01-11 2002-08-27 Digimarc Corporation Multimedia data embedding
US6564380B1 (en) 1999-01-26 2003-05-13 Pixelworld Networks, Inc. System and method for sending live video on the internet
JP2002057882A (en) 2000-04-21 2002-02-22 Sony Corp Apparatus and method for embedding information, image processor and processing method, content processor and processing method, monitor and monitoring method, and storage medium
JP4660879B2 (en) 2000-04-27 2011-03-30 ソニー株式会社 Information providing apparatus and method, and program
US6760885B1 (en) 2000-06-15 2004-07-06 Microsoft Corporation System and method for using a standard composition environment as the composition space for video image editing
US20020116716A1 (en) * 2001-02-22 2002-08-22 Adi Sideman Online video editor
US7089309B2 (en) 2001-03-21 2006-08-08 Theplatform For Media, Inc. Method and system for managing and distributing digital media
US20020144130A1 (en) 2001-03-29 2002-10-03 Koninklijke Philips Electronics N.V. Apparatus and methods for detecting illicit content that has been imported into a secure domain
US20020181732A1 (en) 2001-04-10 2002-12-05 Motorola, Inc Method of collaborative watermarking of a digital content
JP3736379B2 (en) 2001-04-19 2006-01-18 ソニー株式会社 Digital watermark embedding processing device, digital watermark detection processing device, digital watermark embedding processing method, digital watermark detection processing method, program storage medium, and program
GB0111431D0 (en) 2001-05-11 2001-07-04 Koninkl Philips Electronics Nv A real-world representation system and language
US7512964B2 (en) * 2001-06-29 2009-03-31 Cisco Technology System and method for archiving multiple downloaded recordable media content
JP3852568B2 (en) 2001-09-11 2006-11-29 ソニー株式会社 Apparatus and method for creating multimedia presentation
US8510441B2 (en) 2001-09-18 2013-08-13 Sony Corporation Transmission apparatus, transmission method, content delivery system, content delivery method, and program
US20030233462A1 (en) 2002-05-30 2003-12-18 Herman Chien System and method for providing a digital rights scheme for browser downloads
US20040054694A1 (en) 2002-09-12 2004-03-18 Piccionelli Gregory A. Remote personalization method
JP3941700B2 (en) 2003-01-28 2007-07-04 ソニー株式会社 Information processing apparatus, information processing method, and computer program
US7246356B1 (en) 2003-01-29 2007-07-17 Adobe Systems Incorporated Method and system for facilitating communications between an interactive multimedia client and an interactive multimedia communication server
US7617278B1 (en) 2003-01-29 2009-11-10 Adobe Systems Incorporated Client controllable server-side playlists
US7882258B1 (en) * 2003-02-05 2011-02-01 Silver Screen Tele-Reality, Inc. System, method, and computer readable medium for creating a video clip
US7272658B1 (en) 2003-02-13 2007-09-18 Adobe Systems Incorporated Real-time priority-based media communication
US7451179B2 (en) 2003-03-14 2008-11-11 Seiko Epson Corporation Image and sound input-output control
US7287256B1 (en) 2003-03-28 2007-10-23 Adobe Systems Incorporated Shared persistent objects
US8205154B2 (en) 2004-04-16 2012-06-19 Apple Inc. User definable transition tool
US7375768B2 (en) 2004-08-24 2008-05-20 Magix Ag System and method for automatic creation of device specific high definition material
US7707249B2 (en) * 2004-09-03 2010-04-27 Open Text Corporation Systems and methods for collaboration
WO2006124193A2 (en) * 2005-04-20 2006-11-23 Videoegg, Inc. Browser enabled video manipulation
US7809802B2 (en) 2005-04-20 2010-10-05 Videoegg, Inc. Browser based video editing
US7769819B2 (en) * 2005-04-20 2010-08-03 Videoegg, Inc. Video editing with timeline representations
US8156176B2 (en) 2005-04-20 2012-04-10 Say Media, Inc. Browser based multi-clip video editing
US20060251382A1 (en) 2005-05-09 2006-11-09 Microsoft Corporation System and method for automatic video editing using object recognition
US8631226B2 (en) 2005-09-07 2014-01-14 Verizon Patent And Licensing Inc. Method and system for video monitoring
US9401080B2 (en) 2005-09-07 2016-07-26 Verizon Patent And Licensing Inc. Method and apparatus for synchronizing video frames
US7945615B1 (en) 2005-10-31 2011-05-17 Adobe Systems Incorporated Distributed shared persistent objects
US8161159B1 (en) 2005-10-31 2012-04-17 Adobe Systems Incorporated Network configuration with smart edge servers
US8191098B2 (en) 2005-12-22 2012-05-29 Verimatrix, Inc. Multi-source bridge content distribution system and method
US20090196570A1 (en) * 2006-01-05 2009-08-06 Eyesopt Corporation System and methods for online collaborative video creation
US20070162855A1 (en) 2006-01-06 2007-07-12 Kelly Hawk Movie authoring
EP1929407A4 (en) 2006-01-13 2009-09-23 Yahoo Inc Method and system for online remixing of digital multimedia
US20080215620A1 (en) 2006-01-13 2008-09-04 Yahoo! Inc. Method and system for social remixing of media content
US9032297B2 (en) 2006-03-17 2015-05-12 Disney Enterprises, Inc. Web based video editing
WO2007137240A2 (en) * 2006-05-21 2007-11-29 Motionphoto, Inc. Methods and apparatus for remote motion graphics authoring
US7945848B2 (en) 2006-06-21 2011-05-17 Microsoft Corporation Dynamically modifying a theme-based media presentation
WO2008016634A2 (en) 2006-08-02 2008-02-07 Tellytopia, Inc. System, device, and method for delivering multimedia
US8762530B2 (en) 2006-09-11 2014-06-24 Fujitsu Limited Peer-to-peer network with paid uploaders
US20080066107A1 (en) 2006-09-12 2008-03-13 Google Inc. Using Viewing Signals in Targeted Video Advertising
US20080123976A1 (en) * 2006-09-22 2008-05-29 Reuters Limited Remote Picture Editing
US8577204B2 (en) 2006-11-13 2013-11-05 Cyberlink Corp. System and methods for remote manipulation of video over a network
US20080165388A1 (en) 2007-01-04 2008-07-10 Bertrand Serlet Automatic Content Creation and Processing
US8218830B2 (en) 2007-01-29 2012-07-10 Myspace Llc Image editing system and method
US8910045B2 (en) 2007-02-05 2014-12-09 Adobe Systems Incorporated Methods and apparatus for displaying an advertisement
US20080193100A1 (en) * 2007-02-12 2008-08-14 Geoffrey King Baum Methods and apparatus for processing edits to online video
EP2111588A2 (en) * 2007-02-13 2009-10-28 Nidvid, Inc. Media editing system and method
US20080208692A1 (en) 2007-02-26 2008-08-28 Cadence Media, Inc. Sponsored content creation and distribution
KR100826959B1 (en) 2007-03-26 2008-05-02 정상국 Method and system for making a picture image
US8667532B2 (en) 2007-04-18 2014-03-04 Google Inc. Content recognition for targeting video advertisements
US7934011B2 (en) 2007-05-01 2011-04-26 Flektor, Inc. System and method for flow control in web-based video editing system
US8265457B2 (en) 2007-05-14 2012-09-11 Adobe Systems Incorporated Proxy editing and rendering for various delivery outlets
US8375086B2 (en) 2007-05-31 2013-02-12 International Business Machines Corporation Shared state manager and system and method for collaboration
US9032298B2 (en) 2007-05-31 2015-05-12 Aditall Llc. Website application system for online video producers and advertisers
US8209618B2 (en) 2007-06-26 2012-06-26 Michael Garofalo Method of sharing multi-media content among users in a global computer network
US8433611B2 (en) 2007-06-27 2013-04-30 Google Inc. Selection of advertisements for placement with content
US7849399B2 (en) 2007-06-29 2010-12-07 Walter Hoffmann Method and system for tracking authorship of content in data
EP2051173A3 (en) 2007-09-27 2009-08-12 Magix Ag System and method for dynamic content insertion from the internet into a multimedia work
US20090094147A1 (en) * 2007-10-09 2009-04-09 Fein Gene S Multi-Computer Data Transfer and Processing to Support Electronic Content Clearance and Licensing
KR101435140B1 (en) 2007-10-16 2014-09-02 삼성전자 주식회사 Display apparatus and method
US7797274B2 (en) 2007-12-12 2010-09-14 Google Inc. Online content collaboration model
US7840661B2 (en) 2007-12-28 2010-11-23 Yahoo! Inc. Creating and editing media objects using web requests
US20100004944A1 (en) 2008-07-07 2010-01-07 Murugan Palaniappan Book Creation In An Online Collaborative Environment
US8225228B2 (en) 2008-07-10 2012-07-17 Apple Inc. Collaborative media production
EP2172936A3 (en) * 2008-09-22 2010-06-09 a-Peer Holding Group, LLC Online video and audio editing
US8051287B2 (en) 2008-10-15 2011-11-01 Adobe Systems Incorporated Imparting real-time priority-based network communications in an encrypted communication session
US8302008B2 (en) 2008-10-23 2012-10-30 International Business Machines Corporation Software application for presenting flash presentations encoded in a flash presentation markup language (FLML)
US8245188B2 (en) 2008-10-23 2012-08-14 International Business Machines Corporation Flash presentation markup language (FLML) for textually expressing flash presentations
GB2464948A (en) * 2008-10-29 2010-05-05 Quolos Limited Online collaboration
US8639086B2 (en) 2009-01-06 2014-01-28 Adobe Systems Incorporated Rendering of video based on overlaying of bitmapped images
WO2010085773A1 (en) 2009-01-24 2010-07-29 Kontera Technologies, Inc. Hybrid contextual advertising and related content analysis and display techniques
US20100257457A1 (en) 2009-04-07 2010-10-07 De Goes John A Real-time content collaboration
US10244056B2 (en) 2009-04-15 2019-03-26 Wyse Technology L.L.C. Method and apparatus for transferring remote session data
US8412729B2 (en) 2009-04-22 2013-04-02 Genarts, Inc. Sharing of presets for visual effects or other computer-implemented effects
US8984406B2 (en) 2009-04-30 2015-03-17 Yahoo! Inc. Method and system for annotating video content
US20100285884A1 (en) 2009-05-08 2010-11-11 Gazillion Inc High performance network art rendering systems
GB0909844D0 (en) 2009-06-08 2009-07-22 Codigital Ltd Method and system for generating collaborative content
US8806331B2 (en) * 2009-07-20 2014-08-12 Interactive Memories, Inc. System and methods for creating and editing photo-based projects on a digital network
US20110026899A1 (en) * 2009-07-31 2011-02-03 Paul Lussier Systems and Methods for Viewing and Editing Content Over a Computer Network in Multiple Formats and Resolutions
US8166191B1 (en) 2009-08-17 2012-04-24 Adobe Systems Incorporated Hint based media content streaming
US8412841B1 (en) 2009-08-17 2013-04-02 Adobe Systems Incorporated Media content streaming using stream message fragments
US8788941B2 (en) 2010-03-30 2014-07-22 Itxc Ip Holdings S.A.R.L. Navigable content source identification for multimedia editing systems and methods therefor
US8463845B2 (en) * 2010-03-30 2013-06-11 Itxc Ip Holdings S.A.R.L. Multimedia editing systems and methods therefor
US8806346B2 (en) 2010-03-30 2014-08-12 Itxc Ip Holdings S.A.R.L. Configurable workflow editor for multimedia editing systems and methods therefor
US8422852B2 (en) 2010-04-09 2013-04-16 Microsoft Corporation Automated story generation
US20110282727A1 (en) 2010-05-14 2011-11-17 Minh Phan Content management in out-of-home advertising networks
US8332752B2 (en) 2010-06-18 2012-12-11 Microsoft Corporation Techniques to dynamically modify themes based on messaging
US8744239B2 (en) 2010-08-06 2014-06-03 Apple Inc. Teleprompter tool for voice-over tool
US20120251080A1 (en) * 2011-03-29 2012-10-04 Svendsen Jostein Multi-layer timeline content compilation systems and methods
US20130132462A1 (en) * 2011-06-03 2013-05-23 James A. Moorer Dynamically Generating and Serving Video Adapted for Client Playback in Advanced Display Modes
US8341525B1 (en) * 2011-06-03 2012-12-25 Starsvu Corporation System and methods for collaborative online multimedia production
US9026446B2 (en) 2011-06-10 2015-05-05 Morgan Fiumi System for generating captions for live video broadcasts
US8749618B2 (en) 2011-06-10 2014-06-10 Morgan Fiumi Distributed three-dimensional video conversion system
US8532469B2 (en) 2011-06-10 2013-09-10 Morgan Fiumi Distributed digital video processing system
US8966402B2 (en) * 2011-06-29 2015-02-24 National Taipei University Of Education System and method for editing interactive three-dimension multimedia, and online editing and exchanging architecture and method thereof
US10708148B2 (en) 2011-09-12 2020-07-07 Microsoft Technology Licensing, Llc Activity-and dependency-based service quality monitoring
JP6242798B2 (en) 2011-10-10 2017-12-06 ヴィヴォーム インコーポレイテッド Rendering and steering based on a network of visual effects
US11210610B2 (en) 2011-10-26 2021-12-28 Box, Inc. Enhanced multimedia content preview rendering in a cloud content management system
CN104412577A (en) 2012-02-23 2015-03-11 大专院校网站公司 Asynchronous video interview system
US9563902B2 (en) 2012-04-11 2017-02-07 Myriata, Inc. System and method for transporting a virtual avatar within multiple virtual environments
US20130311556A1 (en) 2012-05-18 2013-11-21 Yahoo! Inc. System and Method for Generating Theme Based Dynamic Groups
US20140047413A1 (en) 2012-08-09 2014-02-13 Modit, Inc. Developing, Modifying, and Using Applications
US8861005B2 (en) 2012-09-28 2014-10-14 Interactive Memories, Inc. Methods for real time discovery, selection, and engagement of most economically feasible printing service vendors among multiple known vendors
US20140121017A1 (en) 2012-10-25 2014-05-01 University Of Saskatchewan Systems and methods for controlling user interaction with biofeedback gaming applications
US20140143218A1 (en) 2012-11-20 2014-05-22 Apple Inc. Method for Crowd Sourced Multimedia Captioning for Video Content
US11748833B2 (en) 2013-03-05 2023-09-05 Wevideo, Inc. Systems and methods for a theme-based effects multimedia editing platform
US20140255009A1 (en) * 2013-03-05 2014-09-11 WeVideo, Inc. Theme-based effects multimedia editor systems and methods
US20140317506A1 (en) 2013-04-23 2014-10-23 Wevideo, Inc. Multimedia editor systems and methods based on multidimensional cues
US20150050009A1 (en) 2013-08-13 2015-02-19 Wevideo, Inc. Texture-based online multimedia editing

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010041050A1 (en) * 1999-12-07 2001-11-15 Yoshiaki Iwata Video editing apparatus, video editing method, and recording medium
US20100169127A1 (en) * 2005-04-08 2010-07-01 Malackowski James E System and method for managing intellectual property-based risks
US20070208442A1 (en) * 2006-02-27 2007-09-06 Perrone Paul J General purpose robotics operating system
US20070250901A1 (en) * 2006-03-30 2007-10-25 Mcintire John P Method and apparatus for annotating media streams
US20080183608A1 (en) * 2007-01-26 2008-07-31 Andrew Gavin Payment system and method for web-based video editing system
US20100260468A1 (en) * 2009-04-14 2010-10-14 Maher Khatib Multi-user remote video editing
US20120079606A1 (en) * 2010-09-24 2012-03-29 Amazon Technologies, Inc. Rights and capability-inclusive content selection and delivery
US20120189282A1 (en) * 2011-01-25 2012-07-26 Youtoo Technologies, Inc. Generation and Management of Video Blogs
US20120245952A1 (en) * 2011-03-23 2012-09-27 University Of Rochester Crowdsourcing medical expertise

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10109318B2 (en) 2011-03-29 2018-10-23 Wevideo, Inc. Low bandwidth consumption online content editing
US11402969B2 (en) 2011-03-29 2022-08-02 Wevideo, Inc. Multi-source journal content integration systems and methods and systems and methods for collaborative online content editing
US11127431B2 (en) 2011-03-29 2021-09-21 Wevideo, Inc Low bandwidth consumption online content editing
US20120284176A1 (en) * 2011-03-29 2012-11-08 Svendsen Jostein Systems and methods for collaborative online content editing
US10739941B2 (en) 2011-03-29 2020-08-11 Wevideo, Inc. Multi-source journal content integration systems and methods and systems and methods for collaborative online content editing
US9460752B2 (en) 2011-03-29 2016-10-04 Wevideo, Inc. Multi-source journal content integration systems and methods
US9489983B2 (en) 2011-03-29 2016-11-08 Wevideo, Inc. Low bandwidth consumption online content editing
US10915591B2 (en) 2011-08-11 2021-02-09 Cd Newco, Llc Media acquisition engine and method
US10002197B2 (en) * 2011-08-11 2018-06-19 Curve Dental Ltd. Media acquisition engine and method
US10496723B2 (en) 2011-08-11 2019-12-03 Curve Dental Ltd. Media acquisition engine and method
US20160342698A1 (en) * 2011-08-11 2016-11-24 Curve Dental Ltd. Media acquisition engine and method
US11264058B2 (en) 2012-12-12 2022-03-01 Smule, Inc. Audiovisual capture and sharing framework with coordinated, user-selectable audio and video effects filters
US9459768B2 (en) * 2012-12-12 2016-10-04 Smule, Inc. Audiovisual capture and sharing framework with coordinated user-selectable audio and video effects filters
US10607650B2 (en) 2012-12-12 2020-03-31 Smule, Inc. Coordinated audio and video capture and sharing framework
US20140229831A1 (en) * 2012-12-12 2014-08-14 Smule, Inc. Audiovisual capture and sharing framework with coordinated user-selectable audio and video effects filters
US11748833B2 (en) 2013-03-05 2023-09-05 Wevideo, Inc. Systems and methods for a theme-based effects multimedia editing platform
EP2775703A1 (en) * 2013-03-07 2014-09-10 Nokia Corporation Method and apparatus for managing crowd sourced content creation
US9251359B2 (en) 2013-03-07 2016-02-02 Nokia Technologies Oy Method and apparatus for managing crowd sourced content creation
US10855731B2 (en) 2013-04-11 2020-12-01 Nec Corporation Information processing apparatus, data processing method thereof, and program
WO2014189485A1 (en) 2013-05-20 2014-11-27 Intel Corporation Elastic cloud video editing and multimedia search
EP3000238A4 (en) * 2013-05-20 2017-02-22 Intel Corporation Elastic cloud video editing and multimedia search
US11056148B2 (en) 2013-05-20 2021-07-06 Intel Corporation Elastic cloud video editing and multimedia search
US9852769B2 (en) 2013-05-20 2017-12-26 Intel Corporation Elastic cloud video editing and multimedia search
JP2016524833A (en) * 2013-05-20 2016-08-18 インテル コーポレイション Adaptive cloud editing and multimedia search
US11837260B2 (en) 2013-05-20 2023-12-05 Intel Corporation Elastic cloud video editing and multimedia search
US10679151B2 (en) 2014-04-28 2020-06-09 Altair Engineering, Inc. Unit-based licensing for third party access of digital content
US20150332282A1 (en) * 2014-05-16 2015-11-19 Altair Engineering, Inc. Unit-based licensing for collage content access
US10685055B2 (en) 2015-09-23 2020-06-16 Altair Engineering, Inc. Hashtag-playlist content sequence management
US20200274908A1 (en) * 2016-04-28 2020-08-27 Rabbit Asset Purchase Corp. Screencast orchestration
US9773524B1 (en) 2016-06-03 2017-09-26 Maverick Co., Ltd. Video editing using mobile terminal and remote computer
US9852768B1 (en) 2016-06-03 2017-12-26 Maverick Co., Ltd. Video editing using mobile terminal and remote computer
US11799864B2 (en) 2019-02-07 2023-10-24 Altair Engineering, Inc. Computer systems for regulating access to electronic content using usage telemetry data

Also Published As

Publication number Publication date
US20160027472A1 (en) 2016-01-28
US20120284176A1 (en) 2012-11-08
US20190325913A1 (en) 2019-10-24
US20220084557A1 (en) 2022-03-17
US9460752B2 (en) 2016-10-04
US11127431B2 (en) 2021-09-21
US20120251083A1 (en) 2012-10-04
US20170236550A1 (en) 2017-08-17
US9489983B2 (en) 2016-11-08
US20120254752A1 (en) 2012-10-04
US9711178B2 (en) 2017-07-18
US10109318B2 (en) 2018-10-23
US20120254778A1 (en) 2012-10-04

Similar Documents

Publication Publication Date Title
US20120251080A1 (en) Multi-layer timeline content compilation systems and methods
US11402969B2 (en) Multi-source journal content integration systems and methods and systems and methods for collaborative online content editing
JP6189317B2 (en) Content provider using multi-device secure application integration
US8245124B1 (en) Content modification and metadata
US8701008B2 (en) Systems and methods for sharing multimedia editing projects
US7908270B2 (en) System and method for managing access to media assets
KR101486602B1 (en) Advertising funded data access services
US20080193100A1 (en) Methods and apparatus for processing edits to online video
US10268760B2 (en) Apparatus and method for reproducing multimedia content successively in a broadcasting system based on one integrated metadata
US20190098369A1 (en) System and method for secure cross-platform video transmission
US20060085830A1 (en) Method and apparatus for content provisioning in a video on demand system
US20150286489A1 (en) Automatic detection and loading of missing plug-ins in a media composition application
TW200919211A (en) Server-controlled distribution of media content
US10430868B2 (en) Content purchases and rights storage and entitlements
US20150050009A1 (en) Texture-based online multimedia editing
US20230177070A1 (en) Tokenized voice authenticated narrated video descriptions
US20140156631A1 (en) Decoration of search results by third-party content providers
US10156955B2 (en) Method and server for storing, encoding and uploading video or object captured from a webpage using a toolbar
US20090070371A1 (en) Inline rights request and communication for remote content
US20160300280A1 (en) Verified-party content
US9219945B1 (en) Embedding content of personal media in a portion of a frame of streaming media indicated by a frame identifier
US9615145B2 (en) Apparatus and methods for providing interactive extras

Legal Events

Date Code Title Description
AS Assignment

Owner name: WEVIDEO, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SVENDSEN, JOSTEIN;RUSTBERGGAARD, BJORN;REEL/FRAME:028038/0887

Effective date: 20120329

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION