WO2012028198A1 - Media server and method for streaming media - Google Patents

Media server and method for streaming media

Info

Publication number
WO2012028198A1
Authority
WO
WIPO (PCT)
Prior art keywords
server
update
client
media structure
media
Prior art date
Application number
PCT/EP2010/062934
Other languages
French (fr)
Inventor
Gabor Gesztesi
Miklos Tamas Bodi
Lorant Farkas
Original Assignee
Nokia Siemens Networks Oy
Priority date
Filing date
Publication date
Application filed by Nokia Siemens Networks Oy filed Critical Nokia Siemens Networks Oy
Priority to PCT/EP2010/062934 priority Critical patent/WO2012028198A1/en
Publication of WO2012028198A1 publication Critical patent/WO2012028198A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/262Content or additional data distribution scheduling, e.g. sending additional data at off-peak times, updating software modules, calculating the carousel transmission frequency, delaying a video stream transmission, generating play-lists
    • H04N21/26291Content or additional data distribution scheduling, e.g. sending additional data at off-peak times, updating software modules, calculating the carousel transmission frequency, delaying a video stream transmission, generating play-lists for providing content or additional data updates, e.g. updating software modules, stored at the client
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N21/23412Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs for generating or manipulating the scene composition of objects, e.g. MPEG-4 objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234318Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by decomposing into objects, e.g. MPEG-4 objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/44012Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving rendering scenes according to scene graphs, e.g. MPEG-4 scene graphs

Abstract

A method of composing a streamed scene, comprising the steps of : assembling components of the scene to create a server-side scene tree (204); assembling components of the scene to create a client-side scene tree, the client-side scene tree corresponding to the server-side scene tree; applying a modification to the server-side scene tree; generating an update corresponding to the modification; sending the update to at least one client on the client side; and applying the update to the client-side scene tree.

Description

Title MEDIA SERVER AND METHOD FOR STREAMING MEDIA
This invention relates to media servers and a method for streaming media. It is particularly, but not exclusively, related to media servers capable of streaming scenes. It may relate to streaming of rich media content to mobile terminal devices.
Rich media content is typically graphically rich content, and often contains several types of compound media, such as text, graphics, animation, audio and/or video. Generally, rich media is dynamic in the sense that it can change as time passes and in some cases can respond to user interaction.
More rich media content is becoming available via the Internet and the demand for delivering it is increasing. As a consequence, it is becoming desirable to provide the means to enable resource constrained devices, such as mobile phones or embedded devices, to be capable of receiving and handling rich media content.
A rich media system in the form of a client-server system 100 is shown in Figure 1. It comprises a rich media server 102, a rich media client 104, and a transportation mechanism 106. The rich media server 102 is configured to receive and store rich media in the form of a rich media container 108 comprising discrete media (such as still images) and continuous media (such as audio and/or video streams) 110 and scene descriptor commands 112. The scene descriptor commands 112 define the way in which the rich media content is to be built up, presented, and changed as time passes. Such changes are effected by scene update commands which are used to update the scene over time. The rich media server 102 also comprises a remote interaction service block 114 which is configured to receive remote interaction from a user and use this to modify the scene as presented on a terminal device which incorporates and runs the rich media client 104.
The rich media client 104 has a rich media rendering block 116, a local interaction block 118, and a remote interaction block 120. The rich media rendering block 116 receives scene descriptor commands from the rich media server 102 via the transportation mechanism 106 and processes them for visual and audible presentation by the terminal device. The local interaction block 118 receives user commands which determine how the scene is presented by the terminal device. The remote interaction block 120 takes remote user commands and/or feedback and provides it to the remote interaction service block 114 via the transportation mechanism 106. These commands and/or feedback could be direct requests for new content or indirect requests in which a user action causes the system to seek to deliver new content.
The transportation mechanism 106 is capable of using a forward channel 122 to stream, download, and/or progressively download rich media to individual users in a one-to-one, or unicast, delivery mode or to many users in a one-to-many, or broadcast, delivery mode. Feedback provided by the remote interaction block 120 of the rich media client 104 is provided to the rich media server 102 by a feedback channel 124.
When the rich media is being streamed by the rich media server 102 to rich media clients 104 during a streaming session, it is either streamed so that different parts, for example audio, video, and a scene descriptor, are streamed as separate tracks, for example in the case of the Real Time Streaming Protocol (RTSP) and the Real-time Transport Protocol (RTP) being used, or all of these elements are multiplexed together, for example in the case of the MPEG transport stream being used. The scene descriptor sent in a streaming unit is either a full (new) scene descriptor or an update command. An update command can be declarative (such as insert, delete, replace, etc.) or a script.
A typical rich media client runs on a user terminal device such as a personal computer, a net-book, a PDA, or a cellular wireless device such as a mobile telephone. In some cases, a rich media content stream can require a significant amount of processing which, if being handled by a resource constrained personal mobile device, can require long processing time periods to deal with the commands in the stream.
The rich media client 104 is responsible for rendering the content. Once the rich media content has been rendered and thus made available to a user, the rich media client 104 can interact with the content either locally or remotely. In the former case, user input changes the rendering of the rich media content on the terminal device. In the latter case, user input causes the rich media client 104 to request rich media content from the rich media server 102 so that content present in the rich media client 104 is changed and rendering and thus presentation on the terminal device can be changed.
Examples of such rich media systems include MPEG-BIFS, MPEG-LASeR and 3GPP-DIMS. BIFS is the default presentation engine for MPEG-4, while the latter two are scene description standards specifically designed for resource constrained applications and are both based on the Tiny profile of the Scalable Vector Graphics (SVG) language.
SVG is an XML-based format for scalable two-dimensional graphics. The SVG standard defines a uDOM tree and provides an API that allows access to initial and computed attributes and properties in the tree. The Tiny profile of SVG has been specified by W3C for use on mobile and other resource constrained devices. LASeR and DIMS define commands to modify the uDOM tree (such as Insert, Replace and Delete), and techniques to deliver these commands and to process them in the rich media clients. SMIL is another known standard in the rich media domain. It can be used to create multimedia presentations by defining spatial and temporal relationships between media objects. The spatial model used by SMIL is relatively simple, but its time model is sophisticated, supporting many practical types of synchronization. A small subset of the SMIL model and timing attributes is used by the SVG animation feature.
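The effect of such declarative commands can be illustrated with a minimal sketch. This is not the LASeR or DIMS wire format; the scene markup and the `apply_update` helper are invented for illustration, using Python's standard XML library to stand in for a client's DOM tree.

```python
import xml.etree.ElementTree as ET

def apply_update(root, command, target_id, payload=None):
    """Apply a declarative update (Insert/Replace/Delete) to a scene tree.

    Targets are addressed by their 'id' attribute, loosely mirroring how
    LASeR/DIMS commands reference uDOM elements. Simplified sketch only.
    """
    if command == "Insert":
        # Insert appends new markup under the addressed element.
        for element in root.iter():
            if element.get("id") == target_id:
                element.append(ET.fromstring(payload))
                return root
    else:
        # Replace and Delete act on the target via its parent.
        for parent in root.iter():
            for child in list(parent):
                if child.get("id") == target_id:
                    parent.remove(child)
                    if command == "Replace":
                        parent.append(ET.fromstring(payload))
                    return root
    raise KeyError(target_id)

# A toy SVG-like scene with one text node.
scene = ET.fromstring("<svg><g id='headlines'><text id='t1'>old</text></g></svg>")
apply_update(scene, "Replace", "t1", "<text id='t1'>breaking news</text>")
apply_update(scene, "Insert", "headlines", "<text id='t2'>more news</text>")
```

A full client would also handle script updates and timing attributes; the sketch covers only the declarative tree operations named in the text.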
Content is important for all rich media applications. Existing tools and techniques are used to create basic content parts, such as movies, audio clips, images and subtitles. There are also tools available to combine them together, for example tools which can create scene descriptors. This is done offline. In other words, a scene is composed by an author. The author also determines changes which are to occur to the scene during its playback, and composes corresponding updates. The updates are pre-scheduled so that they are available to be streamed according to their schedule. Although scripts can add interactivity and behaviour to created content, client side scripting has limitations. The execution process is controlled by the client, which can lead to inconsistent or unauthorised service/content access. (For example, the client can possibly bypass the presentation of advertisements or can access premium content without paying.) Building up dynamic content on the client side can be resource consuming. Remote interaction may not be directly supported. If feedback and broadcast channels are separated, then a feedback channel may be used to add dynamic content via client side scripting.
Dynamic and personalised content can be delivered with server side applications. Such applications exist for specific purposes like video subtitling or advertisement injection. However, these are specific to their intended purpose and a change to this purpose may require writing or changing programming code.
It is also not straightforward to build a continuous, live broadcast or streaming channel. Even if a content source (for example a camera) or specific server software generates stream output, switching to another content source or software is not always possible without interruption.
Rich media content authoring tools (like Ikivo Animator) support creating and editing parts of the rich media content (such as media objects). Some of them also allow combining media objects together and creating scene descriptors. However, these are based on offline working. They may support scripting for dynamic content and interactivity, but this limits processing to the client side, which may not be sufficient. Furthermore, they do not support content personalisation.
Media streaming servers usually include basic tools allowing continuous playback of a set of media files. An example streaming server is the QuickTime Streaming Server from Apple, or its open source version, the Darwin Streaming Server. They contain a tool named PlaylistBroadcaster. It allows sequential or random playback of a set of media files, and supports dynamic changes of the file list using file access or a web interface. However, these tools do not support dynamic or complex rich media content, content personalisation, or remote user interaction.
There are advanced rich media advertisement systems which support the addition of rich media advertisements to content. They can support personalised and dynamic selection of the advertisements. There are also other specific applications, such as subtitling systems like WinCAPS from SysMedia. However, these systems are highly specialised and support only pre-coded business logic to gather and present dynamic content.
Application and portal servers such as BEA WebLogic, IBM WebSphere or JBoss are flexible general purpose platforms to create dynamic and personalised content. They also support user interactivity.
According to a first aspect of the invention there is provided a method of composing streaming media, comprising the steps of:
assembling components of the media to create a server-side media structure;
assembling components of the media to create a client-side media structure, the client-side media structure corresponding to the server-side media structure;
applying a modification to the server-side media structure;
generating an update corresponding to the modification; and
sending the update to at least one client on the client side in order that it can be applied to the client-side media structure.
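The claimed sequence of steps can be sketched in miniature: a modification is applied to the server-side structure, turned into an update, and replayed against each client's corresponding structure so the two stay in step. All class and method names here are invented for illustration; a real scene tree would hold far richer node data.

```python
class MediaStructure:
    """A minimal tree of named nodes standing in for a scene tree."""
    def __init__(self):
        self.nodes = {"root": None}  # node id -> parent id

    def insert(self, node_id, parent_id):
        self.nodes[node_id] = parent_id

    def delete(self, node_id):
        del self.nodes[node_id]

    def apply(self, update):
        op, args = update
        getattr(self, op)(*args)

class Client:
    def __init__(self):
        self.tree = MediaStructure()  # client-side media structure

class Server:
    def __init__(self, clients):
        self.tree = MediaStructure()  # server-side media structure
        self.clients = clients

    def modify(self, op, *args):
        update = (op, args)
        self.tree.apply(update)        # modify the server-side structure
        for client in self.clients:    # send the update to each client,
            client.tree.apply(update)  # which applies it to its own tree

client = Client()
server = Server([client])
server.modify("insert", "video1", "root")
assert client.tree.nodes == server.tree.nodes
```

In the described system the "send" step would of course traverse a transportation mechanism rather than a direct method call.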
Preferably, the streaming media is streaming of a scene. Preferably, the server-side and the client-side media structures are scene trees. The server-side media structure may comprise a streamed scene (a data flow tree), and a description of streaming media, for example associated audio and video streams.
Preferably, applying the update to the server-side media structure results in a corresponding modification being applied to the client-side media structure. Accordingly, applying an update to change the server-side media structure may then cause a corresponding change to the client-side media structure. The components of the media structure may be a root node and/or leaf nodes. The media may have a hierarchical structure. It may have a tree-like structure having a root element and leaf elements. It may be a scene tree. The scene tree may be defined in a scene descriptor.
In an embodiment in which the server-side media structure is a scene tree, it is managed by a scene tree module. The scene tree may provide the update to an update generator. The update generator may be subscribed to the scene tree, or specifically to scene update events for a particular server-side media structure related to the streamed media. The update generator may monitor the server-side media structure to determine when changes have been made and update commands need to be generated to be sent to clients.
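The subscription between the scene tree module and the update generator is essentially an observer relationship, which can be sketched as follows. The class names and the shape of the change notification are invented for the example.

```python
class SceneTreeModule:
    """Maintains the scene tree and notifies subscribers of changes."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def apply_change(self, change):
        # ...apply the change to the managed tree, then notify...
        for callback in self.subscribers:
            callback(change)

class UpdateGenerator:
    """Subscribed to scene update events; turns them into update commands."""
    def __init__(self, tree_module):
        self.pending = []
        tree_module.subscribe(self.on_change)

    def on_change(self, change):
        # Convert the change notification into a client-facing update command.
        self.pending.append({"command": "Update", "change": change})

tree = SceneTreeModule()
generator = UpdateGenerator(tree)
tree.apply_change("insert ticker")
assert generator.pending[0]["change"] == "insert ticker"
```

The alternative described in the text, in which the generator itself polls the tree for differences, would replace the callback with a periodic comparison of tree snapshots.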
The clients may be mobile client devices. Preferably, the streamed media comprises a scene and video and/or audio. There may be a scene channel and relevant media channels.
Preferably, the client-side media structure is in a client on a client side. Preferably the server-side media structure is in a server on a server side.
The server may be capable of instantiating a plug-in when it receives a plug-in reference. It may also be capable of handling relevant parameters from the reference. The reference may then be substituted in the server-side media structure by a concrete portion as generated and provided by the plug-in. A plug-in manager may be used to instantiate a plug-in. The plug-in manager may be responsible for managing the life-cycles of plug-ins and accessing relevant parts of the server-side media structure which are to be updated according to an update command.
A plug-in may generate updated content in the form of code defining content and presentation-related aspects, and provide this updated content to the update scheduler.
The plug-in may stay alive after generating the concrete portion, and issue update commands from time to time. It may continue to listen to (periodically poll) sources of content. If content changes, then the plug-in can send an update command to update the relevant portion of the server-side media structure. Preferably, the plug-in is pre-installed. The plug-in may be an external plug-in provided as an optional add-on. This may enable external application developers to contribute new code that can be utilised to customise content. The plug-in may be used to automate the insertion of content from various content sources into the media.
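A plug-in that stays alive and polls a content source might look like the following sketch. The RSS source and the scheduler are stubs, and the `("Replace", ...)` command shape is invented; the point is only the poll-compare-emit loop described above.

```python
class RssPlugin:
    """Polls a content source and emits an update command on each change."""
    def __init__(self, source, scheduler):
        self.source = source        # callable returning the current content
        self.scheduler = scheduler  # collects emitted update commands
        self.last = None

    def poll(self):
        content = self.source()
        if content != self.last:    # only emit when the content has changed
            self.last = content
            self.scheduler.append(("Replace", "headline", content))

# Simulate three polls of a feed whose headline changes once.
feed = iter(["rain ahead", "rain ahead", "sun expected"])
scheduler = []
plugin = RssPlugin(lambda: next(feed), scheduler)
for _ in range(3):
    plugin.poll()
# Two distinct headlines were seen, so two update commands were emitted.
assert len(scheduler) == 2
```

In the described server, the emitted commands would go to the update scheduler rather than a plain list, and the polling interval would be a plug-in configuration parameter.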
Preferably, the server comprises an update scheduler. This may provide updates to the server-side media structure according to any scheduling information contained in an update command.
Preferably, the server comprises a stream controller. This may apply a suitable configuration (or configurations) for media channels and/or for scene description channels.
A client may open a number of respective connections to the server, for respective streams. Preferably, the server receives authoring commands from an authoring tool. The authoring tool may be capable of handling authoring commands and dispatching suitable instructions to the update scheduler. It may be used to add a feature to a scene. In doing this, a particular element of content may be identified, for example a video stream or an RSS feed.
Aspects of the content may be defined. These may be presentation-related aspects and, if specified, schedule-related aspects.
Preferably, the update scheduler receives an authoring command and derives from it any schedule-related aspects to generate command scheduling information. Schedule-related aspects can be in the form of concrete/absolute time or in the form of relative time. Timing information may specify execution relative to the playing or presentation of other content. The authoring tool may be used to provide plug-in authoring commands to the server. It may be used to provide non-plug-in authoring commands to the server. In respect of non-plug-in authoring commands, the update scheduler may provide the command to a module which applies the update to the server-side media structure without taking any action to instantiate a plug-in.
The authoring tool may be used to update the server-side media structure with an advertisement into its content, for example into a scene.
The update scheduler may be capable of splitting update commands into sub-commands, including a first sub-command that performs a relevant action, for example including content in a scene, and a second sub-command which causes the action of the first sub-command to become apparent so that the content may become visible. The second sub-command may be executed at a scheduled time even though a related first sub-command was executed at an earlier time.
The update scheduler may aggregate update commands from different sources, both plug-in and non-plug-in related, and initiate the execution of the commands at the proper time, that is, according to scheduling information and/or immediately.
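The aggregation and timing behaviour described above, including the two-phase insert-then-reveal split, can be sketched with a priority queue. The clock, the command strings, and the `submit`/`due` interface are all invented for illustration.

```python
import heapq

class UpdateScheduler:
    """Aggregates update commands and releases them by scheduled time."""
    def __init__(self):
        self.queue = []  # entries: (execute_at, sequence, command)
        self.seq = 0     # tie-breaker preserving submission order

    def submit(self, command, at=None, now=0):
        # No scheduling information means execute immediately.
        execute_at = now if at is None else at
        heapq.heappush(self.queue, (execute_at, self.seq, command))
        self.seq += 1

    def due(self, now):
        """Pop every command whose scheduled time has arrived."""
        out = []
        while self.queue and self.queue[0][0] <= now:
            out.append(heapq.heappop(self.queue)[2])
        return out

sched = UpdateScheduler()
# Two-phase split: include the content early, reveal it at the scheduled time.
sched.submit("insert-ad (hidden)", at=10)
sched.submit("show-ad", at=20)
sched.submit("replace-title")  # no schedule info -> immediate
assert sched.due(now=0) == ["replace-title"]
assert sched.due(now=20) == ["insert-ad (hidden)", "show-ad"]
```

Commands from plug-in and non-plug-in sources would all funnel through the same `submit` path, which is what makes the aggregation possible.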
Preferably, the server comprises a rich media server. The rich media server may receive updates from the update generator which it then streams towards clients, having first performed any relevant filtering operations or transformations indicated by the update command. The rich media server may apply stream processing depending on filtering rules defined in a command. The rich media server may aggregate different content streams from different content sources in order to deliver them to clients.
Preferably, the server is configured with an initial descriptor capable of setting up a new stream and at least one channel element. The channel element may be at least one of "scene", "video", and "audio". Clients receiving the stream may listen to and act upon updates affecting issues of presentation and content.
Preferably, the media structure may also be provided with filters. Sub-parts of the media structure, such as branches or leaves, may be provided with filter conditions determining the provision of updates related to these sub-parts. Filter conditions may relate to subscriber class, types of client, and/or capabilities of client devices.
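The filter mechanism described above, where one change yields several updates and each is delivered only to matching clients, can be sketched as follows. The filter-condition names and the dictionary-based client profile are invented for the example.

```python
def generate_updates(change, filter_conditions):
    """Produce one update per filter condition associated with the change."""
    return [{"change": change, "only_for": condition}
            for condition in filter_conditions]

def should_send(update, client_profile):
    # Send the update only if the client satisfies the filter condition.
    return client_profile.get(update["only_for"], False)

# A single change to a sub-part of the media structure carrying two filters.
updates = generate_updates("insert HD banner", ["premium", "hd_capable"])

premium_hd = {"premium": True, "hd_capable": True}  # e.g. a paying HD client
basic = {"premium": False}                           # e.g. a free SD client

sent = [u for u in updates if should_send(u, premium_hd)]
dropped = [u for u in updates if not should_send(u, basic)]
assert len(sent) == 2 and len(dropped) == 2
```

This mirrors the division of labour in the text: the update generator fans a change out into filtered variants, and the streaming server sends or discards each variant per client.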
Preferably, the server contains one common media structure capable of dealing with clients of different classes. Filtering may be carried out in different functionalities of the server. Filters may be associated with the media structure in a module which maintains the media structure. An update generator may receive change notifications relating to the media structure which it uses to generate updates according to the change notifications. It may generate a plurality of updates in respect of a single change according to filtering conditions associated with the change. A streaming server may send or discard one or more of the plurality of updates according to subscription information, and/or client device capabilities, and/or other information.
According to a second aspect of the invention there is provided a server capable of composing streaming media, the server comprising:
a media structure module capable of assembling components of the media to create a server-side media structure, the server-side media structure corresponding to a client-side media structure, the media structure module capable of receiving an instruction to modify the server-side media structure and applying a modification to the server-side media structure;
an update generator capable of generating an update corresponding to the modification; and
a streaming sub-server capable of sending the update to at least one client on the client side in order that the update can be applied to the client-side media structure.
According to the invention, the server may be capable of maintaining the server-side media structure in such a way that modifications to the media by a content author are applied to modify the server-side media structure. The server may be capable of creating update commands based on modifications to the server-side media structure and providing the update commands to clients in order for them to modify their respective client-side media structures. The server may be a media server. It may be a media server capable of streaming scenes. It may be capable of streaming rich media content to mobile terminal devices.
According to a third aspect of the invention there is provided a communication system comprising:
a server according to the second aspect of the invention; and
at least one client.
The system may comprise a transportation mechanism.
According to a fourth aspect of the invention there is provided a computer program product comprising software code that when executed on a computing system performs a method of composing streaming media, comprising the steps of:
assembling components of the media to create a server-side media structure;
assembling components of the media to create a client-side media structure, the client-side media structure corresponding to the server-side media structure;
applying a modification to the server-side media structure;
generating an update corresponding to the modification;
sending the update to at least one client on the client side; and
applying the update to the client-side media structure.
Preferably, the computer program product has executable code portions which are capable of carrying out the steps of the method.
Preferably, the computer program product is stored on a computer-readable medium. Preferably, it is stored in a non-transient way, that is, its storage persists on the medium.
The invention may provide for modification of streamed media during its streaming. That is, it may provide for online composition. Therefore, the configuration of the server and the provision of a server-side media structure which is modifiable by update commands enable media to be modified during the time in which it is being played at, and being updated on, client devices.
An embodiment of the invention will now be described, by way of example only, with reference to the accompanying drawings in which:
Figure 1 shows a rich media system;
Figure 2 shows an application server according to the invention;
Figure 3 shows steps involved in initial setup of a scene; and
Figure 4 shows steps involved in updating a scene by inserting an RSS headline by means of a plug-in.
Figure 2 shows an application server 200 according to the invention. The application server is suitable for use in a client-server system 100 according to Figure 1. The configuration of the application server 200 and its operation will now be described.
The application server 200 comprises a plug-in and update scheduling part 202, an extended scene tree module 204 which manages an extended scene tree, an update generator 206, a stream controller 208, and a rich media server 210.
The plug-in and update scheduling part 202 comprises a plug-in manager 214, an update scheduler 216, and various plug-ins 218, 220, and 222.
When the application server 200 is used in a client-server system such as system 100 in Figure 1, it is used in place of the rich media server 102, and the rich media server 210 of the application server 200 provides streaming to rich media clients. Accordingly, it can be understood that the rich media server 210 is in effect a sub-server of the application server 200, that is a server component under its control. It can be considered to be a streaming sub-server.
In addition to its function of providing rich media streams, and associated streams, to clients, the application server 200 also carries out the functions of:
(i) maintaining the extended scene tree in such a way that modifications to the scene by a content author are applied to modify the extended scene tree; and (ii) creating update commands based on the modifications to the extended scene tree and providing the update commands via the rich media server 210 to clients in order for them to modify their respective scene trees (corresponding to the extended scene tree) .
The configuration of the application server 200, and the provision of an extended scene tree which is modifiable by update commands, enable a scene to be modified during the time in which it is being played at, and being updated on, client devices. Such a capability is referred to as real-time authoring.
It will be understood that the extended scene tree module 204 is able to generate and manage an extended scene tree embodied in the form of a document containing code, for example an XML document, which expresses relationships between elements in terms of hierarchical dependencies. The XML document may therefore define a tree structure that has a root element and dependent nodes such as leaf elements.
Therefore, it will be understood that the extended scene tree is similar to conventional scene trees. However, in addition to it having a conventional structure, the extended scene tree is a scene descriptor tree extended with features which support the operation of the application server 200 as will be described in the following.
It will be understood that the extended scene tree is not a static entity. The XML code defining the extended scene tree, and thus the extended scene tree itself, can be continually changed inside the extended scene tree module 204 as a result of updates coming from the update scheduler 216.
The extended scene tree module 204 operates to take content and/or a file representing the extended scene tree in the form of code, for example XML code, and to generate the extended scene tree, and furthermore to modify the extended scene tree according to any updates received. The extended scene tree module 204 is able to generate the extended scene tree in a form which includes plug-in definitions, the nature and use of which is described in the following.
Dynamic content can be supported by scene plug-ins. These are programming code bundles implementing a specific interface. Scene descriptors and update commands may contain references to these plug-ins. When the application server 200 receives a plug-in reference, the plug-in manager 214 is able to instantiate a corresponding plug-in and also to pass relevant parameters from the reference. The reference is then substituted in the scene tree by a concrete tree portion as generated and provided by the plug-in. The plug-in may stay alive after generating the associated scene tree portion, and issue update commands to the update scheduler 216 that schedules them to be provided in a timely fashion to the extended scene tree module 204. This allows implementation of dynamic, continuously changing rich media content. Plug-in interfaces may also support additional lifecycle management, for example:
- Stop a plug-in, when the associated scene tree part is not active (Deactivate, Save).
- Restart a plug-in, when the associated scene tree part is re-activated (Activate, Restore).
- Release a plug-in, when the associated scene tree part is deleted.
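The lifecycle hooks listed above can be sketched as a small interface. The method names mirror the text's Deactivate/Activate/Release operations, but the class and its state model are invented for illustration.

```python
class ScenePlugin:
    """Sketch of a plug-in honouring the lifecycle hooks described above."""
    def __init__(self):
        self.state = "running"
        self.saved = None

    def deactivate(self):
        # The associated scene tree part is no longer active:
        # save the plug-in's working state, then stop it.
        self.saved = {"last_headline": "rain ahead"}  # placeholder snapshot
        self.state = "stopped"

    def activate(self):
        # The scene tree part has been re-activated:
        # restore from the saved snapshot and restart.
        if self.saved is None:
            raise RuntimeError("nothing to restore")
        self.state = "running"

    def release(self):
        # The scene tree part was deleted: free everything.
        self.saved = None
        self.state = "released"

plugin = ScenePlugin()
plugin.deactivate()
plugin.activate()
plugin.release()
assert plugin.state == "released"
```

A plug-in manager such as block 214 would call these hooks as the corresponding parts of the extended scene tree change state.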
The plug-in manager 214 is able to determine from update commands received through the update scheduler 216 whether they require the instantiation of pre-existing plug-ins or of a new plug-in. In the example shown, the plug-in manager 214 is capable of instantiating pre-existing plug-ins 218, 220, and 222. The update scheduler 216 is connected to the
extended scene tree module 204 and is capable of providing update commands to it, whether they relate to plug-in update commands provided by the plug-ins 218, 220, and 222, or to non-plug-in update commands. The update scheduler 216 provides the update commands to the extended scene tree module 204 according to any scheduling information contained in an update command. No scheduling information in an update command may be taken by the update scheduler 216/extended scene tree module 204 as an implicit instruction to apply the update command immediately. In an example in which an update command is to happen at an
absolute time, such as 12:34.56, the update scheduler 216 provides the update command to the extended scene tree module 204 at this time. In an alternative embodiment of the
invention, the update scheduler 216 may provide the update command to the extended scene tree module 204 at any
convenient time in advance of any scheduling information together with the scheduling information in order that the extended scene tree module 204 may apply the update command to the extended scene tree at a scheduled time.
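The scheduling behaviour described above, in which a command with no scheduling information is applied immediately and a command with an absolute time is held until that time, might be sketched as follows (an illustrative Python sketch; names are assumptions, not part of the specification):

```python
import heapq
import time

class UpdateScheduler:
    """Minimal sketch: holds update commands until their scheduled time.

    Commands without scheduling information are treated as due at once,
    matching the implicit immediate-application behaviour described above.
    """

    def __init__(self):
        self._queue = []   # entries: (due_time, sequence, command)
        self._seq = 0      # tie-breaker preserving submission order

    def submit(self, command, at_time=None):
        due = at_time if at_time is not None else 0.0  # no schedule => immediate
        heapq.heappush(self._queue, (due, self._seq, command))
        self._seq += 1

    def due_commands(self, now=None):
        """Pop every command whose scheduled time has been reached."""
        now = time.time() if now is None else now
        ready = []
        while self._queue and self._queue[0][0] <= now:
            ready.append(heapq.heappop(self._queue)[2])
        return ready
```

In the alternative embodiment described above, `due_commands` would instead forward the command with its scheduling information attached, leaving the extended scene tree module to apply it at the scheduled time.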
The extended scene tree module 204 is connected to the update generator 206 and also to the stream controller 208. When the extended scene tree module 204 modifies the extended scene tree according to updates received from the update scheduler 216, in addition to modifying the extended scene tree, it creates a change notification which it provides to the update generator 206. The update generator 206 monitors the extended scene tree to determine when changes have been made in order that, when it receives notification that such a change has taken place, it can generate update commands to be sent to clients. Alternatively, the update generator 206 is able to monitor the extended scene tree and determine itself if the extended scene tree has changed and thus generate a
corresponding update.
In the case of update commands comprising a videoUnit or an audioUnit element such an element is provided to the stream controller 208 which triggers it to apply a suitable
configuration (or configurations) in the rich media server 210 for new media channels and for scene description channels in a case in which there are streams provided by the extended scene tree. The stream controller 208 configures the rich media server according to a data flow tree which defines how the incoming media streams are processed (for example to be converted in terms of resolution, transcoding, etc.) and streamed to the clients. The update generator 206 and the stream controller 208 are each connected to the rich media server 210.
The application server has a number of inputs and an output. The inputs come from an authoring tool 212 and from content sources 230, and the output (which may convey a number of concurrent streams), which comes from the rich media server 210, goes to clients 232. The clients may be mobile client devices such as personal digital assistants (PDAs), mobile telephones (for example smart phones), and personal computers (PCs). The authoring tool 212 is capable of receiving author input, creating authoring commands, and dispatching suitable instructions to the update scheduler 216 (directly) and to the plug-in manager 214 (indirectly). The authoring tool 212 is connected to the update scheduler 216 via an I_US_ctrl interface 252. The content sources are different rich media content sources, such as a camera 234, an RSS feed 236, and a movie archive 238. Individual ones of the content sources are connected to respective plug-ins 218, 220, and 222 via interfaces I_Pl_in interface 240, I_Pl_in interface 242, and I_Pl_in interface 244. There is also an interface between the content sources 230 in general and the rich media server 210 via an I_RMS_in1 interface 246. Outputs go from the rich media server 210 via an I_RMS_out interface 248.
There are a number of internal inputs/outputs or interfaces within the application server 200 itself. The plug-in manager 214 is connected to respective ones of the plug-ins 218, 220, and 222 via an I_Pl_ctrl interface 254. The plug-in manager 214 is connected to the extended scene tree module 204 via an I_PM_ctrl interface 256. The plug-ins 218, 220, and 222 are connected to the update scheduler 216 via respective interfaces I_Pl_out interface 258, I_Pl_out interface 260, and I_Pl_out interface 262.
The update scheduler 216 is connected to the extended scene tree module 204 via an I_XST_in interface 264.
The extended scene tree module 204 is connected to the update generator 206 via an I_XST_out interface 266 which is in turn connected to the rich media server 210 via an I_RMS_in2 interface 268.
The extended scene tree module 204 is also connected to the stream controller 208 via an I_SC_ctrl interface 270 which is in turn connected to the rich media server 210 via an
I_RMS_ctrl interface 272.
The nature of the interfaces may be summarised as follows:
I_Pl_in (240, 242, 244) - plug-in input from content sources.
I_Pl_out (258, 260, 262) - update commands generated by plug-ins.
I_Pl_ctrl (254) - plug-in life-cycle management interface.
I_PM_ctrl (256) - plug-in related changes in extended scene tree.
I_XST_in (264) - scheduled extended scene tree update commands.
I_XST_out (266) - registration for extended scene tree updates/extended scene tree update events.
I_SC_ctrl (270) - data flow tree update events.
I_RMS_ctrl (272) - rich media server configuration interface.
I_RMS_in1 (246) - rich media server content input streams.
I_RMS_in2 (268) - rich media server scene descriptor input.
I_RMS_out (248) - rich media server output to client devices.
Operation of the application server 200 will now be
described. A user of a client mobile device 232, in deciding that the device should present a scene, starts a client application in that device and submits a request for a scene which results in the device acting as a client and contacting the application server 200, or specifically the rich media server 210, which is the "side" of the application server visible to the clients, to obtain from it a session
descriptor, for example a Session Description Protocol (sdp) file that describes how the extended scene tree can be accessed by means of RTP (which is used, among other things, to provide endpoints, ports, etc) . In one embodiment of the invention, a client is able to access the extended scene tree directly. The extended scene tree in fact may have a
reference to an sdp file describing how the client can access the video (decoder parameters, ports etc) . The client opens a number of connections to the rich media server 210, one for each particular stream (for example one or more audio
streams, one or more video streams, and a scene stream, as described in the extended scene tree) . As a result, the scene stream, the one or more audio streams with specified applied audio transformations, and the one or more video streams with specified applied video transformations are being streamed to the client in order to be presented on the device. In
addition to arranging for the provision of relevant streams to clients, the update generator 206 subscribes to scene update events for a particular extended scene tree related to the streamed scene. When that extended scene tree is updated in some way, a corresponding update command is sent by the update generator 206 to the rich media server 210 which streams it towards the client having first performed any relevant filtering operations or transformations indicated by the update command. It should be noted that an update command may be for the modification of an existing scene or to start streaming of a new scene.
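The subscription of the update generator 206 to scene update events for a particular extended scene tree can be pictured as a simple publish/subscribe relationship, sketched here for illustration (names are assumptions, not part of the specification):

```python
class ExtendedSceneTree:
    """Sketch of the publish/subscribe relationship described above:
    a subscriber (the update generator) registers for update events on a
    particular tree and is notified whenever the tree is updated."""

    def __init__(self, name):
        self.name = name
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def apply_update(self, update_command):
        # ... the tree itself would be modified here ...
        # then every subscriber receives a change notification
        for callback in self._subscribers:
            callback(self.name, update_command)

received = []
tree = ExtendedSceneTree("sportchannel")
tree.subscribe(lambda name, cmd: received.append((name, cmd)))
tree.apply_update("insert banner")
```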
According to the invention, a content author is able to use the authoring tool 212 to modify a scene, either at a time when it is streaming the scene and corresponding content to one or more clients, or when no streaming is occurring, in order that aspects of presentation and/or content not hitherto present in the session descriptor can be added, or existing aspects modified or removed. This modification accordingly leads to a modification of the extended scene tree in the application server 200. In a typical embodiment, the content author is able to view the scene on a graphical user interface of the authoring tool 212, for example a rendered version of the scene, and modify its appearance and content. This may involve adding scene elements such as banners or boxes containing content, or modifying or deleting already existing scene elements. It should be noted that, although this embodiment of the invention refers to
modifications being carried out via a graphical user
interface, all that is required is that an author be provided with an expression of the document defining the scene tree, which the author can interact with and modify.
The content author may use the graphical user interface of the authoring tool 212 to add a feature to the scene. In doing this, a particular element of content is identified, for example a video stream or an RSS feed, and aspects of the content are defined. These aspects are presentation-related aspects of the content, such as its location in the scene and the nature of its presentation, for example whether it is to be presented as a banner or a box, the size at which the content is to be displayed, and, if specified, schedule-related aspects indicating, for example, that the content is to appear immediately, at a defined time, or following an identified event. Some authoring input may omit aspects related to scheduling, for example if they are to be executed by the relevant element(s)/functionality(ies) of the
application server 200 immediately upon being received. The authoring tool 212 receives relevant input from the content author via its graphical user interface and converts this input into an authoring command. The authoring tool 212 then sends an authoring command, or a version of the authoring command abstracted into another form, to the update scheduler 216. In a case in which the authoring command relates to modifying a scene to include content provided by a plug-in, the
authoring tool 212 provides the authoring command to the update scheduler 216 which passes it on as an update command to the extended scene tree module 204. Scheduling information is generally kept by the update scheduler 216 and not sent to the plug-in manager 214. However, in cases in which the scheduling information may affect certain parameters of the instantiation of the plug-in, such as how often the RSS plug-in should check whether the content source has changed, this timing information may be provided to a relevant plug-in. When the extended scene tree module 204 receives an update command containing a reference to one of the plug-in definitions of the extended scene tree with the associated parameters to be used, for example its name such as
RSSHeadline, formatting parameters such as relative location of the content on the screen, colour of text, and functional parameters such as the URL from where the content should be retrieved by the plug-in, it sends the plug-in reference, and associated parameters (such as child elements defined by XML code), to the plug-in manager 214. The plug-in manager 214 identifies which of the content sources the plug-in reference relates to, and which associated parameters refer to the plug-in, and instantiates a suitable plug-in. For example, if the content relates to a specific RSS feed 236, the plug-in 220 is instantiated. The plug-in manager 214 provides the associated parameters to the instantiated plug-in which then generates updated content in the form of XML code defining the content and presentation-related aspects, and this updated content is provided to the update scheduler 216. The update scheduler 216 receives the updated content and creates an update command based on it according to any scheduling information. The update command is provided to the extended scene tree module 204 for it to be applied to the extended scene tree, that is for relevant code according to the update command in the form of a scene unit to be applied to the extended scene tree, whether that be by addition, deletion, or modification. The plug-in manager 214 is responsible for managing the life-cycle of plug-ins and has access to relevant parts of the extended scene tree which are to be updated according to an update command. An example of life-cycle management of a particular instance of a plug-in, the RSSHeadline plug-in, will now be described. The update command contains an
insertion of content where the RSSHeadline plug-in is
referenced. The plug-in manager 214 instantiates the
RSSHeadline plug-in which means, for instance, that it creates a thread in which the code of the plug-in is running as a result of this particular update command. The result of the code running may be that the headline of the RSS feed is retrieved from the specified URL, after which the thread is terminated. Alternatively, the thread, after creation, might be put into an idle mode by the plug-in manager 214 and brought back to active mode periodically in order to retrieve new headlines from the specified URL and generate subsequent update commands with the retrieved content replacing the URL of the RSS feed in the body of the update command. Finally, as a result of a delete command, the scene unit containing the RSS headline is deleted from the extended scene tree. Since there is no need to keep the thread of the RSSHeadline plug-in instance running any more, the plug-in manager 214 terminates the thread.
Therefore, it is understood that an instance of the plug-in can be created and run. There can be several independent running instances of one plug-in within one extended scene tree and other running instances for other extended scene trees. In one example, there may be a scene consisting of a list of RSS items (one RSS feed for the weather in the UK, US, Europe, etc), for which an RSSContent plug-in is defined. This enables several running plug-in instances in one
extended scene tree with each instance retrieving content from a different source (weather service in the US, UK, Europe, etc). Although the application server 200 may already contain any of the plug-ins referred to, which may even have been present within it when it was provided to an operator of a rich media system, additional plug-ins may be created in the application server 200. Therefore, there can be, as yet undefined, plug-ins that the plug-in manager 214 is able to download from a URL. For example, instead of a specific, previously defined plug-in such as <scenePlugin class="RssHeadline">, there may be a plug-in class <scenePlugin class="NewPlugin"
href="http://www.plugins.com/NewPlugin"> which is useable to set up new, previously undefined, plug-ins in the application server 200. It is advantageous to apply a degree of control in order to prevent the retrieval of malicious plug-ins from arbitrary external sources so that only plug-ins from trusted sources are downloaded. In this case, the application server 200 may have a configuration file with trusted URLs. In one embodiment of the invention, a settings functionality enables an administrator of the application server 200 to manually configure a set of URLs from where new plug-ins may be downloaded by the application server 200. The settings functionality may be useable such that an update containing a reference to a new plug-in indicates additionally the URL from where the code implementing the plug-in functionality can be downloaded by the application server 200.
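The trusted-source control described above might, for illustration, take the following form, where the set of trusted hosts stands in for the administrator-configured file of trusted URLs (the host names are assumptions):

```python
from urllib.parse import urlparse

# Illustrative trusted-source list; in the embodiment described above this
# would come from a configuration file edited by an administrator.
TRUSTED_PLUGIN_HOSTS = {"www.plugins.com", "plugins.example.org"}

def may_download_plugin(plugin_url):
    """Allow plug-in code to be fetched only from configured trusted hosts."""
    parsed = urlparse(plugin_url)
    return (parsed.scheme in ("http", "https")
            and parsed.hostname in TRUSTED_PLUGIN_HOSTS)
```

A check of this kind would run before the plug-in manager 214 fetches any code referenced by an `href` attribute in an update.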
Alternatively or additionally, the plug-in code can be executed by an application server different from, and
external with respect to, the application server 200, and parameters in a request are passed to this external
application server via a web service interface.
Plug-ins may be developed by programmers in an integrated development environment (IDE). The authoring tool 212 is useable to cause the instantiation of such pre-defined plug-ins.
In a case in which the authoring command relates to modifying a scene to include content which is not provided by a plug-in, such an update command does not require a plug-in to run. For example, for a simple insert new text command "insert a new text paragraph in the scene tree with this content", the application server 200 handles this by using a "null" plug-in that does not retrieve any additional content, but just provides the authoring command as such, without modifications. Such a non-plug-in authoring command is provided by the authoring tool 212 to the update scheduler 216. In this case, the update scheduler 216 is able to provide the command, as a "null" plug-in, to the extended scene tree module 204 which applies the update to the scene tree without taking any action to instantiate a related plug-in.
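The "null" plug-in behaviour, in which the authoring command is passed through without modification, can be sketched as follows (the names are illustrative assumptions):

```python
class NullPlugin:
    """Sketch of the "null" plug-in described above: it retrieves no
    additional content and passes the authoring command through unchanged."""

    def generate_update(self, authoring_command):
        return authoring_command

def handle_authoring_command(command, plugin=None):
    # Non-plug-in commands go through the null plug-in; plug-in related
    # commands would instead be dispatched to the plug-in manager.
    plugin = plugin or NullPlugin()
    return plugin.generate_update(command)
```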
In either case, that is authoring commands for plug-in related updates and for non-plug-in updates, the update scheduler 216 receives the authoring command and derives from it any associated concrete and/or relative execution times to generate command scheduling information. It should be understood that authoring commands thus lead to the generation of updates which are different from updates pre-composed for the scene. As discussed in the foregoing, pre-composed updates may have been composed during the composition of the scene and are pre-scheduled. According to the invention, real-time updates can be generated during the playback of a scene and be applied to modify the scene during its playback.
The extended scene tree module 204 uses the update command to modify the extended scene tree according to scheduling information. The update command is applied to the extended scene tree with an indication that the associated content, in this case the RSS feed, is to appear at the time defined by the scheduling information. In a first variant of the
invention, the update command is retained at the update scheduler 216 until a moment in time is reached which
corresponds to the scheduling information (that is, a specific time having been reached or a particular event having taken place) and then the update command is provided to the extended scene tree module 204 for it to be executed immediately. In a second variant of the invention, the update command is split into sub-commands by the update scheduler 216, including a first sub-command that performs the relevant action, for example including content in a scene, and a second sub-command which causes the action of the first sub-command to become apparent. For example, the first sub-command may include a component indicating that content to be added by the first sub-command is to remain hidden, and the second sub-command may be to switch a hidden status of the content to a shown status. In this case, the content may become visible at a scheduled time even though a related sub-command was executed at an earlier time.
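The second variant, splitting an update command into a hidden insertion and a later reveal, might be sketched as follows (the command fields are illustrative assumptions):

```python
def split_scheduled_insert(content, show_at):
    """Second variant sketched above: one sub-command inserts the content
    hidden, and a second sub-command flips it to shown at the scheduled
    time, so the insertion itself can be executed early."""
    insert_hidden = {"op": "insert", "content": content, "display": "none"}
    reveal = {"op": "set-attribute", "target": content,
              "attribute": "display", "value": "inline", "at": show_at}
    return insert_hidden, reveal
```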
The scheduling information can take a number of different forms. It can be express or implied, for example no
scheduling information leads to immediate execution of an update command. In the case of explicit scheduling
information, it can be defined as an absolute time, for example 12:34.56, or it can be a relative time.
As a result, an update command can be received by the
extended scene tree module 204 in a timely manner for the extended scene tree to be modified according to the authoring command. The extended scene tree module 204 modifies the extended scene tree according to the authoring command.
Accordingly, it will be seen that the update scheduler 216 aggregates update commands from different sources, both plug- in and non-plug-in related, and initiates the execution of the commands in the proper time, that is according to
scheduling information (which may be immediately). In the case of scheduling information being in the form of relative time, more sophisticated expressions of time can be applied. The timing may be relative to the playing or
presentation of other content. This may mean that an update can be for content to be played once playing of a (different) video has ended, or at a certain elapsed time from the end of the (different) video playing. The update generator 206, connected to the extended scene tree module 204 via the I_XST_out interface 266, monitors (is subscribed to) the extended scene tree to determine when changes have been made. In fact, in an embodiment of the invention, the extended scene tree module 204 manages a number of extended scene trees representing different scenes. Whenever there is a change in one of the managed extended scene trees to which the update generator 206 is subscribed, a corresponding update is transferred directly to the update generator 206. From the perspective of the update generator 206, updates may be pushed to it by the extended scene tree module 204.
It should be noted that although reference is made to "update commands", in some cases, an authoring command may not lead to an update command being provided to clients. This may be the case depending on the change requested by the authoring command in respect of, for example, certain features of streamed audio or video. For instance, if the encoding level of an MP4 video stream is changed, there may be no need for an update command. However, if the size of the MP4 video stream is changed from VGA to QVGA, then there is a need for an update command in order for the client to be able to display the content correctly. As discussed in the foregoing, a number of clients may have accessed a particular scene from the rich media server 210. Once the client has opened connections to the rich media server 210 for the streams described in the session
descriptor, when there is an update to the relevant extended scene tree, the update is sent through the update generator 206 to the rich media server 210 and the rich media server 210 streams the update, and also provides new streams or modifies existing streams according to the update command, to the clients. As a result, in the event of a change having been made, the update generator 206 generates update commands to be sent to relevant clients. Content defined in the streamed scene is provided to clients via the rich media server 210 operating together with the stream controller 208. The stream controller 208 receives a command with parameters describing the requirement for a new video session from the extended scene tree module 204 through the I_SC_ctrl interface 270. As a result, the stream
controller 208 sends a command through the I_RMS_ctrl
interface 272 to the rich media server 210 to set up the video channel and the scene descriptor channel.
Referring to a particular example of a command received by the stream controller 208 via the I_SC_ctrl interface 270, if there is a new scene having video content, or video content is added to an existing scene, the extended scene tree, in addition to providing relevant updates to the update
generator 206, will also send to the stream controller 208 an indication that a new video channel needs to be set up with associated parameters so the stream controller 208 can instruct the rich media server 210 to stream the video according to the parameters. The instruction can result, for instance, in transcoding video to be used in the new video channel to a lower quality and streaming at that lower quality. This could be carried out, for example, in a case where there are non-premium (normal) subscribers to which this low quality stream should be provided. If there is no such subscriber currently accessing the rich media server, it is possible to avoid the processing costs involved with carrying out this transcoding.
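The decision described above, in which the low-quality transcode is configured only when a normal (non-premium) subscriber is actually connected, might be sketched as follows (the profile and field names are illustrative assumptions):

```python
def configure_channels(subscribers):
    """Sketch of the cost-saving decision above: the QCIF transcode is
    only set up when a normal subscriber is actually connected."""
    config = {"channels": [{"profile": "VGA", "fps": 30}]}
    if any(level == "normal" for level in subscribers):
        # Transcode the incoming VGA/30fps stream down to QCIF/10fps for
        # non-premium subscribers; skip this entirely when none are present.
        config["channels"].append(
            {"profile": "QCIF", "fps": 10, "transcode_from": "VGA"})
    return config
```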
The content goes from the content source to a viewing client via the rich media server 210 which applies stream processing depending on filtering rules defined in the insert command. From the streaming content sources (for example a camera or a microphone) the rich media server 210 may simply forward the stream towards a client or apply transformations, for
instance downgrading the frame rate. The transformation depends on the commands the streaming server receives from the stream controller 208. For example, streams coming from the content sources come into the rich media server 210 in a certain format and are transcoded to another format. In this way, the rich media server is applying a change in quality of content. As a specific example, video at 640 x 480 VGA format encoded with 30 frames per second may be converted into a 352 x 288 QCIF format encoded with 10 frames per second. Alternatively, the rich media server 210 may remove something from the content, or may add something to the content (for example an advertisement is mixed into the video stream). The rich media server 210 streams (via unicast or broadcast) the various types of contents (audio, video, scene updates) to clients.
Therefore, it will be understood that the rich media server 210 aggregates different content streams from different content sources in order to deliver them to clients.
The extended scene tree module 204 may also be provided with filters. In effect, this means that sub-parts of the extended scene tree, such as branches or leaves, can be provided with filter conditions determining the provision of updates related to these sub-parts. For example, the extended scene tree may be configured to contain content which is to be provided to subscribers entitled to receive premium content but not to subscribers on a lower subscription level, for example a normal subscription. In this case, sub-parts of the extended scene tree relating to premium content will be used to enable the provision of updates to premium subscribers and sub-parts related to ordinary content will be useable to enable the provision of updates to normal subscribers.
Accordingly, when an update is to be generated by the update generator 206, the update generated is provided with
information relating to a change in a part of the extended scene tree together with any filter information associated with that change or that part of the extended scene tree. In other words, the update generator 206 is provided with information relating to one or more filter conditions which are to apply.
In generating a corresponding update triggered by a
modification to the extended scene tree, or in the handling of a normal update which is part of a streamed scene, the update generator 206 refers to the filter information and then generates one or more updates. This can be illustrated by an example. The extended scene tree is modified as a result of an authoring command, or as a result of a pre- scheduled update, that additional content, such as an
advertisement or RSS feed information, is to be inserted into a scene but the type of content to be inserted depends on the nature of the subscription held by subscribers receiving the streamed scene. Premium subscribers are to receive one type of content, such as the RSS feed information, and normal subscribers are to receive a different type of content, such as the advertisement. As a result of the corresponding change to the extended scene tree, the update generator 206 obtains a notification that a change has occurred and that a filter condition is present. The update generator 206 then generates an update for premium subscribers instructing the insertion into the scene of the one type of content, such as the RSS feed information. In addition, the update generator 206 generates an update for normal subscribers instructing the insertion into the scene of the different type of content, such as the advertisement. It will thus be seen that the update generator 206 is able to generate different classes of updates corresponding to a common modification of the
extended scene tree. The update generator 206 provides both of these updates to the rich media server 210. The rich media server 210, being aware of the clients to which the scene is being streamed, can then send to each client the relevant update, that is premium- or normal-related, to clients according to the subscriber level associated with that client. In one
embodiment, the rich media server 210 is configured to obtain relevant subscriber/client credentials at the time the client initially opens the session. This is done when accessing the sdp (session descriptor) from a URL. At this point, there is an http authentication operation in respect of the client and the server uses information from this operation to check against a database containing relevant client credentials. The rich media server 210 can be provided with particular ones of those credentials so that it is able to direct different classes of updates to appropriate subscribers/clients. In the event that a plurality of different classes of a common update are generated but there is only one class of subscribers/clients having the scene streamed to them, for example all of the subscribers/clients are in a premium class, then the update in the redundant class, in this case the normal class, can be discarded by the rich media server 210. However, it may still have been generated by the update generator 206 because this element is simply following the possibilities indicated by filter information in any change notification it receives.
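The generation of different classes of updates from one common tree change and its associated filter information might be sketched as follows (the field names are illustrative assumptions):

```python
def generate_updates(change, filter_info):
    """Sketch of the filter handling above: one common tree change yields
    one update per subscriber class named in the filter information; with
    no filter information, a single update is generated for all clients."""
    if not filter_info:
        return {"all": {"op": "insert", "content": change}}
    updates = {}
    for subscriber_class, content in filter_info.items():
        updates[subscriber_class] = {"op": "insert", "content": content}
    return updates
```

The rich media server would then send each client only the update matching its subscriber level, discarding classes with no connected client.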
One advantage of having filters associated with parts of the extended scene tree is that one common extended scene tree may be maintained in the application server 200 for every (potential) kind of subscriber/client. With this advantage in mind, it will be understood that applying filters to a server-side extended scene tree does not necessarily require the application server to have the capability to handle real-time authoring. A server-side extended scene tree with associated filters can be provided which is configured to handle pre-composed updates which have been composed during the composition of a scene prior to its streaming and are pre-scheduled; that is, for a scene which was composed offline. In another embodiment of the invention, rather than having a common extended scene tree with associated filters, the extended scene tree module may be capable of receiving code defining an extended scene tree containing filter conditions and using this code and the filter conditions to generate a number of different versions of the extended scene tree, representing all possible filter combinations. To give an example, if the extended scene tree has a first filter condition to provide advertisements to normal subscribers and not to premium subscribers, and a second filter condition defining a first video format and a second video format according to the capabilities of client devices which may receive a streamed scene, then four extended scene trees may be generated and maintained: one for normal subscribers
receiving video format 2; one for premium subscribers receiving video format 1; and one for premium subscribers receiving video format 2.
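The expansion of filter conditions into all possible combinations, yielding the four trees of the example above, might be sketched as follows (the filter names are illustrative):

```python
from itertools import product

def expand_filter_combinations(filters):
    """Sketch of the variant above: one tree version per combination of
    filter values, e.g. 2 subscriber levels x 2 video formats = 4 trees."""
    names = sorted(filters)
    versions = []
    for values in product(*(filters[name] for name in names)):
        versions.append(dict(zip(names, values)))
    return versions

trees = expand_filter_combinations({
    "subscription": ["normal", "premium"],
    "video_format": ["format1", "format2"],
})
```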
It should be noted that a common extended scene tree having filters associated with parts of it is more suitable for an implementation used for broadcasting a streamed scene. If the extended scene tree module is capable of generating and maintaining a number of different versions of the same extended scene tree, this may be more suitable for an
implementation used for unicasting a streamed scene to individual client devices. One advantage of the invention is that the output of the application server 200, that is the streams and updates, is, from the perspective of the clients, provided in a way which corresponds to known techniques. As a consequence, a client receives an update and it renders it (displays it directly, opens a video window in which it starts playing a video from a URL specified in the update command, or does some other action as specified in the update command). This is advantageous because the client may receive both pre-composed and real-time authored updates and not see any difference between them and thus apply them in a known way. The client model applied may be any of those currently available such as the LaSeR Systems Decoder module.
In the case of the client device, the XML code received from the rich media server 210 is used to embody a scene tree which is stored in its memory. This scene tree in the client device corresponds to the extended scene tree in the application server 200. It should be noted that these scene trees are not necessarily identical, since the extended scene tree may be provided with additional features and elements according to the invention, for example plug-in and filter related features. The client device is able to use its scene tree and the client application to display the content described by the XML code as a rendered scene. When an update command arrives at the client (via IP multicast, via client application polling, or any other mechanism), the XML code stored in the client memory embodying the scene tree is modified accordingly; that is, the extant code may be modified by each update.
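As an illustrative sketch only (not the patent's client implementation; all names here are hypothetical), a client holding its scene tree as an XML DOM could apply an attribute-replacing update command by locating the target element by its id and overwriting the attribute:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.Node;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

// Hypothetical sketch of a client-side scene tree held as a DOM, to which
// an update such as
//   <lsr:Replace ref="headline" attributeName="visibility" value="hidden"/>
// could be applied by modifying the stored XML in place.
public class ClientSceneTree {
    public static Document parse(String xml) throws Exception {
        return DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
    }

    // Apply an attribute-replace update; returns false if the target id is absent.
    public static boolean replaceAttribute(Document scene, String ref,
                                           String attribute, String value) {
        Element target = findById(scene.getDocumentElement(), ref);
        if (target == null) return false;
        target.setAttribute(attribute, value);
        return true;
    }

    // Depth-first search for an element carrying the given id attribute.
    public static Element findById(Element e, String id) {
        if (id.equals(e.getAttribute("id"))) return e;
        for (Node n = e.getFirstChild(); n != null; n = n.getNextSibling()) {
            if (n instanceof Element) {
                Element found = findById((Element) n, id);
                if (found != null) return found;
            }
        }
        return null;
    }
}
```

The key point the sketch shows is that each update mutates the stored XML, so the client's copy of the scene tree always reflects the accumulated updates.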
The first active instance of the XML code may be sent to the client device as an XML document when the client application is opened and a user enters the URL at which the scene may be obtained (or by some other user interaction mechanism) .
It is possible that the same XML document requested at this point has already been stored in the client device because of earlier sessions accessing the same document. If a newly requested XML document is the same as an XML document previously stored in the client device from an earlier session, the server may determine that there have been no modifications since the last access (by means of the client specifying an "If-Modified-Since" HTTP header, if the scene tree is downloaded via HTTP). However, the server may decide to completely replace any old XML code stored on the client from a previous session.

A simple example of the application server 200 operating according to the invention will now be described. The application server is configured with an initial descriptor (XML stream definition) capable of setting up a new scene comprising a video and a scene descriptor so that it is available for streaming. The video is to be streamed from media1.mydomain.com. Initially the scene defines only the video at full size:
<stream xmlns:svg="http://www.w3.org/2000/svg"
        xmlns:xlink="http://www.w3.org/1999/xlink"
        url="rtsp://mediaserver.mydomain.com/sportchannel">
  <channel type="video" number="1" handler="DataFlowManager">
    <sdp file="mobilevideo.sdp"/>
    <forward id="forwardedvideo"
             url="rtsp://media1.mydomain.com/formula1.sdp"/>
  </channel>
  <channel type="scene" number="2" handler="SceneManager">
    <svg:svg width="240" height="320" xmlns:xlink="">
      <svg:g id="root">
        <svg:video id="movie" width="240" height="320" xlink:href="#1"/>
      </svg:g>
    </svg:svg>
  </channel>
</stream>
This is an extended scene tree. It can be seen that the extended scene tree is in three sub-blocks. The first sub-block

<stream xmlns:svg="http://www.w3.org/2000/svg"
        xmlns:xlink="http://www.w3.org/1999/xlink"
        url="rtsp://mediaserver.mydomain.com/sportchannel">

is the initial descriptor which sets up a new stream. The video content is obtained by the application server 200 from url="rtsp://mediaserver.mydomain.com/sportchannel". Video content which has been transformed (for example trimmed, filtered, or transcoded) is available for the client at url="rtsp://media1.mydomain.com/formula1.sdp".
The second sub-block

<channel type="video" number="1" handler="DataFlowManager">
  <sdp file="mobilevideo.sdp"/>
  <forward id="forwardedvideo"
           url="rtsp://media1.mydomain.com/formula1.sdp"/>
</channel>

is a channel element of the type "video". It is the first channel and the handler for this channel is the DataFlowManager.
The third sub-block

<channel type="scene" number="2" handler="SceneManager">
  <svg:svg width="240" height="320" xmlns:xlink="">
    <svg:g id="root">
      <svg:video id="movie" width="240" height="320" xlink:href="#1"/>
    </svg:g>
  </svg:svg>
</channel>
is a channel element of type "scene". It contains a scene tree of the type "svg"; however, scene trees of some other type, such as SMIL, may be defined instead. The third sub-block is the second channel and the handler for this channel is the SceneManager. Since there is a scene channel, clients receiving the streamed scene listen to and act upon updates coming through that channel. The updates affect the way the scene is displayed, for example how video content is displayed. In this case, there is no audio channel, although it should be understood that the initial descriptor may include an audio channel.

The handlers are processes running in the application server 200 which each handle a channel type, for example a video channel or an audio channel. A SceneManager handler handles scene update commands. A DataFlowManager handler handles update commands concerning audio and video channels. These are commands related to the characteristics of the video/audio content, rather than the scene. For example, there could be an update command specifying that the coding rate of the video is to change (due to network congestion, for instance) from 30 frames/second to 15 frames/second and thus the rich media server 210 should re-negotiate appropriate stream parameters with clients. The handlers are generated by suitable processors present in the rich media server 210.
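The dispatch of update commands to per-channel handlers can be sketched as follows (illustrative only; the registry class, lambda handlers, and method names here are hypothetical and not part of the described embodiments):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: update commands are routed to a handler registered
// for the channel type, e.g. a SceneManager-like handler for "scene"
// channels and a DataFlowManager-like handler for "video"/"audio" channels.
public class HandlerRegistry {
    public interface ChannelHandler {
        String handle(String updateCommand);
    }

    private final Map<String, ChannelHandler> handlers = new HashMap<>();

    public void register(String channelType, ChannelHandler handler) {
        handlers.put(channelType, handler);
    }

    public String dispatch(String channelType, String updateCommand) {
        ChannelHandler h = handlers.get(channelType);
        if (h == null) {
            throw new IllegalArgumentException("no handler for channel type " + channelType);
        }
        return h.handle(updateCommand);
    }
}
```

In this sketch, a scene update and a video re-negotiation command would reach different handlers purely by the channel type they arrive on, mirroring the SceneManager/DataFlowManager split described above.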
It will be understood that this example of code represents a data flow tree describing channels: one channel of type
"video" and one channel of type "scene". The video and scene channels are more or less independent channels.
The extended scene tree is closed with a </stream> tag.
Handling of the initial descriptor by the application server 200 will now be described with reference to Figure 3. This shows the steps involved in initial setup of a scene.
In a first step 310, the authoring tool 212 sends an update command to the update scheduler 216 to set up a new stream according to the initial descriptor given in the foregoing.
In a second step 312, the update scheduler 216 executes the update command immediately and initialises the extended scene tree.

In a third step 314, the extended scene tree module 204 notices that a new stream has been created, so it notifies the stream controller 208.

In a fourth step 316, the stream controller 208 configures a new video channel in the rich media server 210 as described in the video channel definition, that is the second sub-block in the foregoing.
In a fifth step 318, the stream controller 208 configures a new scene description channel in the rich media server 210 as described in the scene channel definition, that is the third sub-block in the foregoing.
In a sixth step 320, the stream controller 208 sets up the update generator 206 to send scene tree updates from the changes of this extended scene tree to the newly created scene description channel.
In a seventh step 322, the update generator 206 registers for change notifications at the extended scene tree.
In an eighth step 324, the extended scene tree notifies the update generator 206 that the scene has been changed.
In a ninth step 326, the update generator 206 sees that it is a new scene, so it creates a new scene command and sends it to the rich media server 210.
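The registration and notification mechanism of steps 322 to 326 can be sketched as a simple listener pattern (illustrative only; the class and method names are hypothetical and the real module would carry richer change descriptions):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Hypothetical sketch of steps 322-326: the update generator registers a
// change listener on the extended scene tree; when the tree changes, every
// registered listener is notified and can emit a command toward the rich
// media server.
public class SceneTreeNotifier {
    private final List<Consumer<String>> listeners = new ArrayList<>();

    // Step 322: the update generator registers for change notifications.
    public void registerChangeListener(Consumer<String> listener) {
        listeners.add(listener);
    }

    // Step 324: a change to the tree notifies every registered listener.
    public void applyChange(String changeDescription) {
        for (Consumer<String> l : listeners) {
            l.accept(changeDescription);
        }
    }
}
```

In step 326 the notified update generator would inspect the change, see that it describes a new scene, and create the corresponding new scene command for the rich media server 210.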
As described above, the client devices can send requests to the application server 200 and the scene will be streamed to them. Therefore, it will be understood that the scene is being streamed by the application server 200 to one or more client devices. At this point, a content author decides to add a headline to the bottom of the scene. In a particular embodiment of the invention, the headline is to display supplemental information based on an RSS feed and is provided by use of an RSS plug-in which has already been defined by using the authoring tool 212 (or may have been pre-loaded in the application server 200). The authoring tool 212 takes instructions from the content author (provided via the graphical user interface of the authoring tool 212) and then transforms them into a corresponding insert command of XML code defining the update to modify the extended scene tree, which is sent to, and is understandable by, the application server 200:
<update xmlns:svg="http://www.w3.org/2000/svg"
        xmlns:lsr="urn:mpeg:mpeg4:LASeR:2005"
        url="rtsp://mediaserver.mydomain.com/sportchannel">
  <sceneUnit>
    <lsr:Insert ref="root">
      <svg:g id="headline">
        <scenePlugin class="RssHeadline">
          <location x="0" y="280" width="240" height="30"/>
          <style color="green" opacity="0.4"/>
          <source href="http://stats.formula1.com/rss"/>
        </scenePlugin>
      </svg:g>
    </lsr:Insert>
  </sceneUnit>
</update>
In general terms, this is an update defining the way in which specified content will be displayed on client devices. The content is not always directly specified in the update command, but may be the result of code execution that fetches data from one or more external sources. An update may define that content is to be trimmed, shifted, rotated, hidden, shown (coming back from being hidden), or otherwise transformed.
The inserted scene tree fragment "<sceneUnit>" contains a reference to the RssHeadline scene plug-in. It implements the following interface:

public interface ScenePlugin {
    SceneTreeFragment init(XmlFragment parameters, SceneManager manager);
    void stop();
    void start();
    void release();
}

A piece of code in the plug-in manager 214 called RSSPlugin implements the public interface ScenePlugin, providing init, stop, start and release methods that accept the parameters referred to in the foregoing. In particular, the init method accepts an XmlFragment and a SceneManager, and the other methods do not accept parameters. The piece of code returns either a SceneTreeFragment (for the init method) or nothing (for the stop, start, and release methods). In order to create a new plug-in that will be used by the plug-in manager 214, it is necessary to implement these methods. In other words, the piece of code is configured to expect that the plug-in manager 214 will call these methods with these parameters and will expect the specified return values.
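A minimal implementation of this interface might look as follows. This is a sketch only: SceneTreeFragment, XmlFragment and SceneManager are stand-in types defined here so the example is self-contained, and the fragment content is illustrative rather than taken from the actual RSSPlugin.

```java
// Hypothetical sketch of a plug-in implementing the ScenePlugin interface
// described in the text. The supporting types are minimal stand-ins; the
// real classes belong to the application server.
public class RssPluginSketch {
    public static class XmlFragment {
        public final String xml;
        public XmlFragment(String xml) { this.xml = xml; }
    }
    public static class SceneTreeFragment {
        public final String xml;
        public SceneTreeFragment(String xml) { this.xml = xml; }
    }
    public static class SceneManager { }

    public interface ScenePlugin {
        SceneTreeFragment init(XmlFragment parameters, SceneManager manager);
        void stop();
        void start();
        void release();
    }

    public static class RssPlugin implements ScenePlugin {
        private SceneManager manager;  // kept for call-backs to the higher layer
        private boolean running;

        @Override
        public SceneTreeFragment init(XmlFragment parameters, SceneManager manager) {
            this.manager = manager;
            // Return an initially empty, hidden headline sub-tree (cf. step 418).
            return new SceneTreeFragment("<g id=\"headline\" visibility=\"hidden\"/>");
        }
        @Override public void start()   { running = true;  }  // begin polling the feed
        @Override public void stop()    { running = false; }
        @Override public void release() { manager = null;  }

        public boolean isRunning() { return running; }
    }
}
```

Note that the SceneManager reference received in init is retained, which is what enables the call-back from the plug-in's lower software layer to the higher layer, as described below.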
A reference to the SceneManager itself is also passed to the plug-in to enable call-back, that is to allow the plug-in in a lower-level software layer to call the SceneManager in a higher-level software layer.
Handling of the insert command by the application server 200 will now be described with reference to Figure 4. This shows the steps involved in updating a scene by inserting an RSS headline by means of a plug-in.
In a first step 410, the authoring tool 212 is used to invoke the instantiation of a plug-in according to the insert command given in the foregoing and specifies the parameters for the newly instantiated plug-in to update a scene tree.

In a second step 412, the update scheduler 216 determines that the insert command does not contain explicit timing, and consequently executes it immediately, inserting the block with the RSS plug-in into the extended scene tree.
In a third step 414, the extended scene tree module 204 notices that a plug-in instantiation with a specified set of parameters is requested and sends a message to the plug-in manager 214.
In a fourth step 416, the plug-in manager 214 instantiates the plug-in and initialises it, giving it the responsibility to fill the corresponding sub-tree with concrete content. At this point, the plug-in is started. "Instantiation" creates the thread and "starting" launches the functionality of the plug-in, causing it to retrieve the content from the RSS feed in the internet.
In a fifth step 418, the plug-in sends an update command to the update scheduler 216 to fill the corresponding sub-tree with a - so far empty - headline. It may be preferred to give a field corresponding to the headline an attribute "hidden" so that any resulting update provided to a client has an empty headline which is not visible.
In a sixth step 420, the update scheduler 216 executes the command and inserts the headline.
In a seventh step 422, the extended scene tree notices that the actual scene tree content has been changed and sends a notification to the update generator 206.
In an eighth step 424, the update generator generates an update command from the extended scene tree change and sends it to the rich media server 210 (which in turn streams it out to the listening clients; this is omitted for simplicity).

In a ninth step 426, it is shown that the plug-in manager 214 starts the new plug-in. This is not strictly necessary; instead, the plug-in may have been started in the fourth step 416, in which case at the point of the ninth step 426, rather than being started, the plug-in is operating in the way in which it was configured, for example operating autonomously and periodically polling a content feed for updated content.
At this point, the RSS plug-in has been instantiated and it takes the control of the associated scene sub-tree. In effect, this means that extra content (a sub-tree) has been added to the extended scene tree and the RSS plug-in is able to cause this content to be updated. The plug-in instance is able to provide update commands referring to the sub-tree (XML fragment) it controls.
Once the plug-in is running, it is able to provide updated content to the extended scene tree via the update scheduler 216 as will now be described. In the following, the actual content of this sub-tree is updated when a new RSS entry appears .
In a tenth step 428, the plug-in periodically polls the given RSS feed for headline content.
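The periodic polling of step 428 can be sketched as below. This is an assumption-laden illustration: the patent does not fix a polling interval or scheduling API, and the class and method names are hypothetical. The essential behaviour shown is that an update is emitted only when the leading feed entry actually changes.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.function.Consumer;
import java.util.function.Supplier;

// Hypothetical sketch of the plug-in's polling loop (step 428): poll the
// feed periodically, and forward an update command only when the headline
// differs from the last one seen.
public class RssPoller {
    private String lastHeadline;

    // One poll cycle; returns true if an update was sent.
    public boolean pollOnce(Supplier<String> feed, Consumer<String> sendUpdate) {
        String headline = feed.get();
        if (headline == null || headline.equals(lastHeadline)) {
            return false;  // nothing new: no update command generated
        }
        lastHeadline = headline;
        sendUpdate.accept(headline);
        return true;
    }

    // Schedule pollOnce at a fixed (assumed) period.
    public ScheduledExecutorService startPolling(Supplier<String> feed,
                                                 Consumer<String> sendUpdate,
                                                 long periodSeconds) {
        ScheduledExecutorService exec = Executors.newSingleThreadScheduledExecutor();
        exec.scheduleAtFixedRate(() -> pollOnce(feed, sendUpdate),
                periodSeconds, periodSeconds, TimeUnit.SECONDS);
        return exec;
    }
}
```

Emitting an update only on change keeps the scene channel quiet between headline changes, which matches the behaviour described in the eleventh step below.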
In an eleventh step 430, the plug-in retrieves the content from the RSS feed and sends an update to the update scheduler 216 for it to incorporate the content into the extended scene tree and thus generate an update to apply an overlay headline to the video:
<update xmlns:lsr="urn:mpeg:mpeg4:LASeR:2005"
        url="rtsp://mediaserver.mydomain.com/sportchannel">
  <sceneUnit>
    <lsr:Replace ref="s58874574">
      <g id="s58874575" xmlns="http://www.w3.org/2000/svg">
        <rect x="0" y="280" width="240" height="30"
              fill="green" opacity="0.4"/>
        <text x="20" y="300">Felipe Massa 1:43:11</text>
      </g>
    </lsr:Replace>
  </sceneUnit>
</update>
This code is generated both from the location, style and source input parameters and from the RSS content retrieved from the URL http://stats.formula1.com/rss.
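The assembly of that replace command from the plug-in's parameters plus the retrieved feed text could be sketched as follows (illustrative only; the builder class, its signature, and the element ids are hypothetical):

```java
// Hypothetical sketch of step 430's code generation: the replace command is
// built from the plug-in's location/style parameters and the text retrieved
// from the RSS feed.
public class HeadlineUpdateBuilder {
    public static String buildReplace(String targetId, int x, int y,
                                      int width, int height,
                                      String fill, String opacity,
                                      String headlineText) {
        return "<lsr:Replace ref=\"" + targetId + "\">"
             + "<g xmlns=\"http://www.w3.org/2000/svg\">"
             + "<rect x=\"" + x + "\" y=\"" + y
             + "\" width=\"" + width + "\" height=\"" + height
             + "\" fill=\"" + fill + "\" opacity=\"" + opacity + "\"/>"
             // Text offset inside the rectangle is an assumed layout choice.
             + "<text x=\"" + (x + 20) + "\" y=\"" + (y + 20) + "\">"
             + headlineText
             + "</text></g></lsr:Replace>";
    }
}
```

Calling it with the location (0, 280, 240, 30), style (green, 0.4) and the retrieved headline reproduces the shape of the update shown above.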
In a twelfth step 432, the update scheduler 216 sends the update to the extended scene tree. As discussed in the foregoing, this update can be sent for immediate execution, for execution according to a fixed or relative time schedule, or to have execution carried out in a number of sub-steps so that execution and visibility of execution happen at
different times according to separate sub-commands.
In a thirteenth step 434, the extended scene tree module 204 notices that the actual scene tree content has been changed and sends a notification to the update generator 206.

In a fourteenth step 436, the update generator 206 generates an update command from the extended scene tree change and sends it to the rich media server 210, and the update command, or a transformed version of it, is streamed to clients.
In Figures 3 and 4, the entity referred to as "Extended Scene Tree (XST)" refers to the extended scene tree module 204 or the extended scene tree maintained by it, as the context requires.

In Figures 3 and 4, acknowledge messages and response messages are omitted for simplicity. A plug-in is able to stay alive and continue to listen to (periodically poll) the RSS feed. If a leading article changes, then the plug-in can send an update command to the SceneManager to update the controlled scene tree portion.
Another kind of update will now be described. In this case, an advertising system inserts advertisement clips into content by means of the authoring tool. The following commands interrupt movie playback and play an advertisement clip. After the end of the clip the movie is automatically continued. The end time of the advertisement clip is specified as a SMIL expression time="id(advertisement)(end)", so the advertisement management system does not need to know or determine how long the advertisement clip takes. The commands also have a filter condition, so that an advertisement is presented to normal subscribers and not presented to premium subscribers.
The first sub-block
<update xmlns:svg="http://www.w3.org/2000/svg"
        xmlns:xlink="http://www.w3.org/1999/xlink"
        xmlns:lsr="urn:mpeg:mpeg4:LASeR:2005"
        url="rtsp://mediaserver.mydomain.com/sportchannel">

is the descriptor which sets up an update.
The second sub-block
<sceneUnit filter="subscription.level != 'premium'">
  <lsr:Replace ref="headline"
               attributeName="visibility" value="hidden"/>
  <lsr:Replace ref="movie">
    <video id="advertisement"
           xlink:href="rtsp://adserver.mydomain.com/advertisement65.mp4"/>
  </lsr:Replace>
</sceneUnit>
<sceneUnit time="id(advertisement)(end)"
           filter="subscription.level != 'premium'">
  <lsr:Replace ref="advertisement">
    <video id="movie" xlink:href="#1"/>
  </lsr:Replace>
</sceneUnit>
</update>
is a scene tree fragment which hides the headline inserted earlier and replaces the movie with an advertisement, in this case a video, in which a filter parameter (subscription level not equal to premium) is also specified.
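Evaluation of such a filter condition against a client's attributes could be sketched as below. This is an assumption: the patent does not specify the filter expression grammar, so the sketch supports only simple equality/inequality conditions, and all names are hypothetical.

```java
import java.util.Map;

// Hypothetical sketch of evaluating a sceneUnit filter condition such as
//   subscription.level != 'premium'
// against a map of client attributes, to decide whether the update applies.
public class FilterEvaluator {
    // Supports conditions of the form "key != 'value'" or "key == 'value'".
    public static boolean matches(String condition, Map<String, String> client) {
        boolean negated = condition.contains("!=");
        String[] parts = condition.split("!=|==", 2);
        String key = parts[0].trim();
        String expected = parts[1].trim().replace("'", "");
        boolean equal = expected.equals(client.get(key));
        return negated ? !equal : equal;
    }
}
```

With this sketch, a normal subscriber matches the advertisement sceneUnit's filter and receives the update, while a premium subscriber does not.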
This update command will be inserted into the extended scene tree and the same set of procedures from Figure 4 are executed, where instead of "2: Insert RSS plug-in" we have "2: Update headline with advertisement window", and steps 3 to 6 do not occur as there is no plug-in involved in the execution of this update command.
The invention may be implemented in a streaming media system capable of broadcast or point-to-point delivery. In a broadcasting implementation, per-subscriber filters cannot be applied because a single broadcast may reach different classes of subscribers. To deal with this, subscribers may be clustered according to their premium or otherwise nature, and respective broadcasts may be provided for respective ones of the classes.
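The clustering idea can be sketched simply (illustrative only; the class names and the single-attribute grouping are assumptions): subscribers are grouped by class, and each group would then be served by its own appropriately filtered broadcast.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Hypothetical sketch: group subscribers by subscription level so that each
// class of subscribers can receive its own broadcast of the filtered scene.
public class SubscriberClustering {
    public static class Subscriber {
        public final String id;
        public final String level;
        public Subscriber(String id, String level) {
            this.id = id;
            this.level = level;
        }
    }

    public static Map<String, List<Subscriber>> clusterByLevel(List<Subscriber> subscribers) {
        return subscribers.stream()
                .collect(Collectors.groupingBy(s -> s.level));
    }
}
```

Each key of the resulting map ("normal", "premium", and so on) would correspond to one broadcast stream built from the extended scene tree with the matching filter outcome.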
Therefore, it will be understood that the invention enables modification of a scene in real time. This means that a scene tree in the application server 200 is modified while it is being used to provide updates to a corresponding scene tree present in a client, even during the time that the scene tree in the client is being rendered. This means that updates may be newly scheduled and generated after the beginning of streaming of the scene to clients. There may be update commands coming from authoring tools running on the premises of one or several content authors.

While preferred embodiments of the invention have been shown and described, it will be understood that such embodiments are described by way of example only. Numerous variations, changes and substitutions will occur to those skilled in the art without departing from the scope of the present invention. Accordingly, it is intended that the following claims cover all such variations or equivalents as fall within the spirit and the scope of the invention.

Claims

1. A method of composing streaming media, comprising the steps of:
assembling components of the media to create a server-side media structure;
assembling components of the media to create a client-side media structure, the client-side media structure corresponding to the server-side media structure;
applying a modification to the server-side media structure;
generating an update corresponding to the modification; and
sending the update to at least one client on the client side in order that it can be applied to the client-side media structure.
2. A method according to claim 1 in which the streaming media is streaming of a scene.
3. A method according to claim 1 or claim 2 in which the server-side and the client-side media structures are scene trees.
4. A method according to any preceding claim in which applying the modification to the server-side media structure results in a corresponding modification being applied to the client-side media structure.
5. A method according to any preceding claim in which an update generator monitors the server-side media structure to determine when changes have been made and update commands need to be generated to be sent to clients.
6. A method according to any preceding claim in which the server side is capable of instantiating a plug-in when it receives a plug-in reference.
7. A method according to claim 6 in which the plug-in is useable to automate the insertion of content from various content sources into the media.
8. A method according to claim 6 or claim 7 in which the plug-in is capable of handling relevant parameters from a reference, the reference then being substituted in the server-side media structure by a concrete portion as
generated and provided by the plug-in.
9. A method according to any preceding claim in which the server-side comprises an update scheduler which provides updates to the server-side media structure according to any scheduling information contained in a command.
10. A method according to any preceding claim in which the server-side receives authoring commands from an authoring tool.
11. A method according to any preceding claim in which the media structure is provided with filters.
12. A method according to claim 11 in which sub-parts of the media structure are provided with filter conditions determining the provision of updates related to these sub-parts.
13. A method according to claim 12 in which the filter conditions relate to at least one condition selected from the following: subscriber class, type of client, and capabilities of a client device.
14. A server capable of composing streaming media, the server comprising:
a media structure module capable of assembling components of the media to create a server-side media structure, the server-side media structure corresponding to a client-side media structure, the media structure module being capable of receiving an instruction to modify the server-side media structure and applying a modification to the server-side media structure;
an update generator capable of generating an update corresponding to the modification; and
a streaming sub-server capable of sending the update to at least one client on the client side in order that the update can be applied to the client-side media structure.
15. A communication system comprising:
a server capable of composing streaming media;
a streaming sub-server capable of sending the update to at least one client on the client side in order that the update can be applied to the client-side media structure; and
at least one client,
the server comprising:
a media structure module capable of assembling components of the media to create a server-side media structure, the server-side media structure corresponding to a client-side media structure, the media structure module being capable of receiving an instruction to modify the server-side media structure and applying a modification to the server-side media structure; and
an update generator capable of generating an update corresponding to the modification.
16. A computer program product comprising software code that when executed on a computing system performs a method of composing streaming media, comprising the steps of:
assembling components of the media to create a server-side media structure;
assembling components of the media to create a client-side media structure, the client-side media structure corresponding to the server-side media structure;
applying a modification to the server-side media structure;
generating an update corresponding to the modification; and
sending the update to at least one client on the client side in order that it can be applied to the client-side media structure.
PCT/EP2010/062934 2010-09-03 2010-09-03 Media server and method for streaming media WO2012028198A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2010/062934 WO2012028198A1 (en) 2010-09-03 2010-09-03 Media server and method for streaming media


Publications (1)

Publication Number Publication Date
WO2012028198A1 true WO2012028198A1 (en) 2012-03-08

Family

ID=42984086


Country Status (1)

Country Link
WO (1) WO2012028198A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2765983A1 (en) * 1997-07-11 1999-01-15 France Telecom DATA SIGNAL FOR CHANGING A GRAPHIC SCENE, CORRESPONDING METHOD AND DEVICE
EP1133189A2 (en) * 2000-03-09 2001-09-12 SANYO ELECTRIC Co., Ltd. Transmission system, reception system, and transmission and reception system capable of displaying a scene with high quality
WO2001091464A1 (en) * 2000-05-23 2001-11-29 Koninklijke Philips Electronics N.V. Communication system with mpeg-4 remote access terminal
US20010056471A1 (en) * 2000-02-29 2001-12-27 Shinji Negishi User interface system, scene description generating device and method, scene description distributing method, server device, remote terminal device, recording medium, and sending medium
US6697869B1 (en) * 1998-08-24 2004-02-24 Koninklijke Philips Electronics N.V. Emulation of streaming over the internet in a broadcast application
FR2873884A1 (en) * 2004-07-29 2006-02-03 France Telecom Multimedia data exchanging system for establishing e.g. videoconference, has transmission unit transmitting, to terminals, common updating data for graphical scene of each terminal as data exchanged for management of communication session


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
J. Royer, H. Nguyen, O. Martinot, M. Preda, F. Preteux, T. Zaharia: "Interactive TV on parliament session", SPIE, 26 August 2007, San Diego, CA, USA, pages 1-8, XP040245209, DOI: 10.1117/12.734272 *
Pellan B. et al: "Media-Driven Dynamic Scene Adaptation", Image Analysis for Multimedia Interactive Services, 2007 (WIAMIS '07), Eighth International Workshop on, IEEE, 1 June 2007, page 67, XP031119854, ISBN: 978-0-7695-2818-2 *
Riitta Vaananen: "User Interaction and Authoring of 3D Sound Scenes in the Carrouso EU project", AES, 25 March 2003, Amsterdam, The Netherlands, pages 1-9, XP040372176 *


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10749652

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10749652

Country of ref document: EP

Kind code of ref document: A1