GB2495088A - Provision of Interactive Content - Google Patents

Provision of Interactive Content

Info

Publication number
GB2495088A
GB2495088A GB1116598.2A GB201116598A GB2495088A GB 2495088 A GB2495088 A GB 2495088A GB 201116598 A GB201116598 A GB 201116598A GB 2495088 A GB2495088 A GB 2495088A
Authority
GB
United Kingdom
Prior art keywords
metadata
interactive
data
content
server
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1116598.2A
Other versions
GB2495088B (en)
GB201116598D0 (en)
Inventor
Andrew William Deeley
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to GB1116598.2A priority Critical patent/GB2495088B/en
Publication of GB201116598D0 publication Critical patent/GB201116598D0/en
Priority to PCT/GB2012/052386 priority patent/WO2013045922A1/en
Publication of GB2495088A publication Critical patent/GB2495088A/en
Application granted granted Critical
Publication of GB2495088B publication Critical patent/GB2495088B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/8126Monomedia components thereof involving additional data, e.g. news, sports, stocks, weather forecasts
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/462Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
    • H04N21/4622Retrieving content or additional data from different sources, e.g. from a broadcast channel and the Internet
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/237Communication with additional data server
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/414Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/41407Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/4722End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting additional data associated with the content
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/65Transmission of management data between client and server
    • H04N21/658Transmission by the client directed to the server
    • H04N21/6581Reference data, e.g. a movie identifier for ordering a movie or a product identifier in a home shopping application
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/84Generation or processing of descriptive data, e.g. content descriptors
    • H04N21/8405Generation or processing of descriptive data, e.g. content descriptors represented by keywords

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

Interactive content is provided at a device (111) by receiving and decoding metadata (107), receiving data derived from the metadata at a server (117) and returning interactive data. The interactive data is used to provide interactive content. A tuner module receives a data signal including content payload (105) and metadata, and a control module decodes the metadata from the data signal. Action data is issued to a server on the basis of instructions in the metadata, and interactive elements data are received for the content payload. A digital media processor uses the interactive data to provide a displaying device with interactive content. Synchronization is maintained between displayed elements and the content to be consumed. The metadata may include an ID which is communicated to the server. The server may use the ID to provide extended metadata, which may be decoded. Also disclosed is a control module using multiple layered metadata representing interactive data elements for video content, and which determines points of the content to be enhanced with the interactive content. Interactive content is therefore provided by using metadata associated with the video content to designate points or regions of the video content to be enhanced with interactive content.

Description

INTERACTIVE SYSTEM
BACKGROUND
Broadcasters are typically concerned with the engagement and retention of viewers and with building ratings. To this end, they can use tried and tested formulae for engaging an audience, such as phone voting, initially with live call handlers counting votes and then progressively taking advantage of other technologies: keypad-controlled telephone menu systems, fully automated Interactive Voice Recognition, SMS, online services and dedicated applications ("apps") for viewers' phones or tablet devices, for example.
Simultaneously, users are expecting to be able to interact to a greater degree with the content that they consume, and are typically more engaged with content where some degree of interaction is provided.
To this end, the proliferation of high speed broadband connections, internet enabled TVs, phones and tablets equipped with high resolution screens, camera tracking technologies and the huge growth in internet video production is resulting in a convergence towards interactive and immersive viewing experiences, both for viewers of traditional viewing technologies such as terrestrial, cable and satellite and for viewers of newer internet or cloud based broadcasting services, which can be leveraged by broadcasters.
Dedicated apps for shows that engage viewers with voting have already started to appear. For example, one such app works by selling viewers a number of voting credits and allowing them to cast their vote using the app on their smart phone. Another such example of an interactive app includes one which allows viewers of certain entertainment shows to play along with the contestants.
Interactive services which are provided through satellite, cable or some terrestrial systems are known. For example, many advertisements give viewers the option to press a button on their remote control in order to engage with the company behind the advertisement. Once engaged, viewers are then able to give personal details, book a service, arrange a call or order products, for example.
Such techniques are limited, however. For example, systems can be restricted to live shows, thereby reducing an available voting window. There is also no real opportunity for giving participants feedback. Dedicated apps can improve interaction between the show and the viewer, but then both programme makers and viewers have to contend with creating or downloading apps for different platforms and shows.
SUMMARY
According to an example, there is provided a control system for an internet-enabled television apparatus, comprising a tuner module to receive a data signal including content payload and metadata, a control module to decode the metadata from the data signal, a communications module to: communicate with a server over a communications network; issue action data to the server on the basis of instructions in the metadata; and receive interactive elements data for the content payload, a digital media processor to use the interactive data to provide a displaying device with interactive content. The control module can be further operable to use the decoded metadata to determine a metadata ID, the communications module to communicate the metadata ID to the server. The server can use the metadata ID to provide extended metadata for the processor. The control module can decode the extended metadata. The metadata can include multiple metadata layers representing respective different interactive actions, and can include an ID to map to extended metadata layers for the television apparatus.
According to an example, there is provided a system for providing interactive content at a device, comprising a client device to receive video data representing video content, a control module of the client device to decode metadata from the video data, a server to receive control data derived from the metadata from the client device and to send interactive data to the device on the basis of the control data, a digital media processor to use the interactive data to provide the device with interactive content. The control module can fetch extended metadata from the server for decoding. The server can receive an ID tag derived from the metadata from the client device and locate control data to send interactive data to the device on the basis of the control data. According to an example, there is provided a method comprising receiving metadata at a client device, decoding the metadata, providing data representing a set of actions for interactive elements of content to be consumed using the client device, providing a set of interactive data elements for the client device, and maintaining synchronization of interaction data elements displayed using the client device with the content to be consumed. The metadata can include multiple layers for respective ones of multiple stakeholders in the content. The set of actions for interactive elements can be derived directly from the metadata. The set of actions for interactive elements can be determined by mapping data representing an ID embedded in the metadata to multiple metadata items.
According to an example, there is provided apparatus comprising a control module to use metadata having multiple different layers associated therewith representing respective interactive data elements for video content, and to determine points of the content to be augmented or enhanced with the interactive content. The apparatus can include a communications module to communicate with a server to provide interactive content for a layer on the basis of a request from the control module. The control module can decode data representing an identifier from the metadata.
BRIEF DESCRIPTION OF THE DRAWINGS
An embodiment of the invention will now be described, by way of example only, and with reference to the accompanying drawings, in which:
Figure 1 is a schematic block diagram of a system according to an example;
Figure 2 is a schematic state diagram of a method according to an example;
Figure 3 is a schematic block diagram of a method according to an example;
Figure 4 is a schematic block diagram of a system according to an example; and
Figure 5 is a schematic block diagram of a client device 500 according to an example.
DETAILED DESCRIPTION
Interactive television is an interactive audio/video delivery medium which provides broadcast audiovisual content to a number of subscribers.
Interactive television provides broadcast video and audio to users and also provides a return path for the user to interact with the content, e.g., to make selections or order desired products, etc. According to an example, a system for providing interactive content uses metadata associated with the video content in order to designate points and regions of the video content to be augmented or enhanced with interactive content. The interactive content can be provided from a server.
Alternatively, a pointer to some other online content that a content provider wishes to serve to the end user can be provided. For example, the pointer can be provided on the server. Accordingly, event markers can be added along a timeline within the video content or an audio stream of media content. Event markers can be layered, so that more than one event can be triggered at the same point along the timeline. Respective event markers can include positioning and behaviour data, such as for any visual cues that are to be displayed over the broadcast at each frame along the timeline. In an example, an event marker or flag can indicate the position of an advert break, a special offer for a product in a scene, a call to vote, a survey about a topic discussed on the program or a decision point in an interactive program. Other alternatives are possible, as will be appreciated by those skilled in the art.
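Purely as an illustrative sketch (the field names below are assumptions rather than part of any broadcast standard), a single event marker of the kind described above could be represented as a small JavaScript object carrying its timing, layer, positioning and behaviour data:

    // Hypothetical event marker; all field names are illustrative only.
    const eventMarker = {
      time: 754.2,                       // seconds from the start of the content timeline
      layer: "advertiser",               // which stakeholder layer owns this marker
      type: "special-offer",             // e.g. advert break, call to vote, survey, decision point
      position: { x: 0.72, y: 0.18 },    // normalised overlay position on the video frame
      behaviour: { fadeInMs: 300, fadeOutMs: 300 },
      action: { kind: "open-link", target: "https://example.com/offer" }
    };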
According to an example, a system for providing interactive content can include a control system which is part of a set-top box, internet enabled television apparatus or other device capable of receiving video data for display. The same system can also provide interactive content for online video or audio broadcasts by various means, including a media player for web broadcasters which can be embedded in a web page and used to play media. Browser plugins can allow the system to provide interactive content to third party media players that online broadcasters can embed within web pages. In addition, a media player can be installed on a viewer's computer, tablet or smart phone. Within the system, certain aspects provide that event markers can be detected by a codec and acted upon, as will be explained below. Such a codec can be provided in a number of different forms suitable for the hardware being used to receive the media and the operating environment used to control the hardware.
For example, the HbbTV standard is a European-led standard for IPTV manufacturers to adhere to. The platform is JavaScript, CSS3 and HTML 5 compliant. Therefore, in an example, a codec can be written in JavaScript.
The codec can then either be incorporated into a device's onboard firmware or, if provision is made for the HbbTV compliant device to retrieve instructions from an external IP address, the codec can detect event markers from an external server without having to embed the codec onboard the device itself.
Similar approaches may be taken with other standards or proprietary platforms developed by different hardware manufacturers. These may require the codec to be developed using different programming languages if they are to be embedded into a device's firmware.
In an example, the codec can also exist embedded within a bespoke JavaScript or JSON media player library that online broadcasters can embed within their web pages, thereby allowing them to hook into interactive servers for example. It can also exist as a plugin to a browser which a user can choose to install when they visit a site, for example. This approach can be used when the broadcaster relies upon some other third party media player on their website, thereby providing an easy way for them to migrate to a new system: the browser plugin can be used to enhance the features of the current media players on their website, for example.
In an example, the codec may also exist in a form which permits it to be installed on a computer, tablet computer or smart phone for enhancing the features of any media players found on those devices, as well as a means of monitoring events of any media player software or app that can be provided for users to run on their device.
A codec according to an example can capture when a viewer is watching or listening to a piece of media and identify when an event marker or flag is reached or will be reached.
In an example, a codec can be used in multiple modes, such as in one of three modes. The first mode is where metadata for a main event marker timeline can be stored on a central server. In this example, a media file being broadcast includes a unique identifier. The codec reads the identifier and keeps the event marker timeline retrieved from the server synchronized with the user's current location within the broadcast. The codec will maintain synchronization irrespective of whether the broadcast is live, streamed on demand or recorded (either on a PVR type device or a cloud based PVR), paused (live TV, streamed or recorded playback), fast forwarded or rewound, etc. A second mode is where broadcast media contains event markers in a self-contained metadata package that is broadcast with the media whilst viewing live, streaming on demand or playing back recorded material.
A third mode is where a codec receives metadata from existing video or audio standards and uses this to enhance the media with interactive content which is provided from a server. This provides broadcasters with means to migrate without having to rework back catalogues of media over a short timeframe.
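A minimal sketch of the first mode described above, in which the codec reports the broadcast's unique identifier and current playhead position so that the server-held event marker timeline stays synchronized with playback; the endpoint URL, field names and polling interval are assumptions made for illustration only:

    // Hypothetical browser-side sync loop (first mode): the marker timeline lives on the server.
    function startTimelineSync(player, contentId, serverUrl) {
      setInterval(async () => {
        const playhead = player.currentTime;   // seconds into live, streamed or recorded playback
        const res = await fetch(`${serverUrl}/markers?content=${contentId}&t=${playhead}`);
        const markers = await res.json();      // markers around the current position
        for (const m of markers) {
          if (Math.abs(m.time - playhead) < 0.5) {
            // surface the marker to whatever handles overlays and interactions
            player.dispatchEvent(new CustomEvent("imes-marker", { detail: m }));
          }
        }
      }, 1000);                                // re-sync once a second; tolerant of pause, FF and rewind
    }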
In an example, event markers can include event codes which can be sent via an Internet connection to a server along with a viewer ID that can be used and stored on the device a user is watching or otherwise using.
Such event codes can be used to receive instructions from the server and send the instructions back to any device that the viewer has registered for the purposes of the interactive system. Accordingly, multiple devices can be maintained in sync with what a user is watching.
In an example, instructions received from the server on a device could execute a streamed advert from an online source or display a graphic overlaid on a display screen, such as for information or for a request for the viewer to interact. In one configuration, a viewer can interact with a television (or similar media consumption device) remote control.
Alternatively (or in combination), a smart phone or tablet device can be used.
According to an example, there is provided a system which includes:
1. A control system including a control module as part of an apparatus such as a set-top box or television;
2. A metadata layer controller;
3. A server, such as a cloud based control server;
4. Device independent interaction applications, such as HTML 5 compatible applications;
5. Optional device specific interaction applications;
6. A cloud based editor tool for adding interactive layers to video content;
7. Browser plugins for HTML 5 compatible browsers;
8. Third party plugins for professional video editing software for adding events onto a video's timeline, for the owner's layer and for publishing this to the IMES server;
9. Plugin control software to extend functionality of the most popular PC or Mac based video players;
10. Compatible browsers.
According to an example, an internet enabled television apparatus includes a television or a set-top box or other similar device to provide a signal to a television.
Figure 1 is a schematic block diagram of a system according to an example. A content provider 101, such as a broadcaster for example, produces content data 103. Content data 103, which can include content such as television programs for example, includes payload data 105, which can be video data, and metadata 107. The content data 103 is sent, or broadcast, over a delivery network 109. Depending on the nature of the content provider 101, the delivery network can include terrestrial analogue or digital television broadcasting or streamed or downloaded content provided over the internet. In an example, both payload data 105 and metadata 107 are sent using the same delivery network. However, it will be appreciated that one may be sent using a different delivery network or mechanism than the other. For example, payload data 105 may be sent over a terrestrial digital channel with metadata 107 being sent 'out of band', such as over the internet for example.
Whichever mechanism is used, the content data 103 is received by a client device 111. The client device 111 can be a television, such as an internet enabled television (IPTV), a set top box or other apparatus capable of receiving and decoding a signal received from at least the delivery network 109. In an example, the client device 111 is an internet enabled television apparatus or internet enabled set-top box apparatus. Accordingly, a broadcast can be provided over the internet as a streamed broadcast to the IPTV. The client device 111 includes a control module 113. In an example, control module 113 can read metadata 107 from content 103. For example, the control module 113 can include a metadata control layer to decode metadata 107 and to perform appropriate actions based upon the instructions or data held in the metadata 107. In an example, the control module 113 can determine a current timeline/viewing status of programs being viewed using the client device 111. For example, when the client device 111 is powered on, the control module 113 can extract metadata and the current timeline/viewing status from programs being watched, either through an internet connection or using a tuner or PVR recording and playback device. In an example, metadata supplied within the content 103 may be a simple ID number which the control module 113 passes back to the server 117. The ID can map to data representing one or more extended metadata layers, and the server 117 can return such extended metadata layers to the client device 111 (and any additional devices). That is, metadata layers can either be wholly included in the payload or can be referenced within the payload so that detailed metadata can be retrieved from an online source. Accordingly, a metadata ID can be mapped to a location or pointer on server 117 so that interactive content can be located from and provided by a location remote from the server 117.
The provision of remote content can be executed directly from the remote location to a device 111, or via server 117 for example.
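By way of a hedged illustration of the ID-to-extended-metadata mapping just described (the store, identifiers and field names are assumptions rather than a defined protocol), a server-side handler might resolve a metadata ID to its extended layers, or to pointers at a remote location, as follows:

    // Hypothetical server-side lookup: metadata ID -> extended metadata layers (or remote pointers).
    const extendedLayers = new Map([
      ["prog-00417", {
        layers: [
          { owner: "production-company", markersUrl: "https://cdn.example.com/layers/prog-00417/prod.json" },
          { owner: "broadcaster",        markersUrl: "https://cdn.example.com/layers/prog-00417/bcast.json" }
        ]
      }]
    ]);

    function resolveMetadataId(metadataId) {
      // Returns the extended metadata layers for the client to decode, or null if the ID is unknown.
      return extendedLayers.get(metadataId) ?? null;
    }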
In an example, control data 115 passes between the control module 113 and a server 117. For example, the control data 115 can be communicated over a network such as the internet. The control data 115 includes two components (not shown): a component sent upstream to server 117 and a downstream component received from server 117. The distinction will become apparent below.
Typically, metadata for video content 103 includes a number of pre-defined elements representing specific attributes of the content, with each element capable of having one or more values. According to an example, metadata 107 is standard metadata suitable for content 103, and which is typically extendable metadata. Accordingly, a tag can be added to the metadata in order to leverage the standard metadata functionality that is found in most widely used terrestrial/digital broadcasted video formats and downloaded or streamed online video.
In an example, this can be achieved by adding a tag 106 to the extendable metadata tags attached to content data 103. The tag 106 can be extracted from the metadata 107 and passed back to the server 117 by the control module 113 on the client device 111. In an example, tag 106 includes a representation for a set of layers for the content payload 105 in order to provide the client device 111 and server 117 with markers to enable execution of events and to serve as control points for the provision and execution of interactive elements for the content being consumed.
Figure 2 is a schematic state diagram of a method according to an example. Certain actions are shown as being performed contemporaneously or sequentially. However, it should be noted that the chain of events depicted in figure 2 provides only a high level view, and that certain actions may or may not be performed simultaneously with others, and that the relative order may be different. A network 203 is depicted.
Communications between various elements can be executed using the network 203, which can be a local network, GPRS or other mobile communications network or the internet for example.
A content provider 101 provides content 103 to a client device 111. A control module 113 of the client device extracts metadata 107 from the content 103. In an example, the metadata can be sent to server 117 in an unprocessed form. Alternatively, the metadata 107 can be decoded by the control module 113 in order to determine a set of actions 201 from a tag 106. The actions 201 are sent to server 117 and represent events for the client device 111. Such events can be interactive events in which a user of the client device 111 interacts with content 103 to provide a more immersive experience. In an example, actions 201 include data representing a time for an interactive event and an identification of the device from which the actions 201 have been sent. Actions 201 can also include data representing content 103, such as an identifier for the content, and other data including an event duration for example. As noted above, in an example metadata 107 can contain all the actions and work as described above, or it can hold an ID which can be sent to the server 117 so that full extended metadata layers can be sent to the client device 111 and any additional devices 205 for processing by control module 113.
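The contents of an action message are not fixed by the description beyond the fields it names (event time, sending device, content identifier, duration); one plausible, purely illustrative shape for action data 201, with assumed field names, is:

    // Hypothetical action data 201 sent upstream to server 117; structure is illustrative only.
    const action = {
      deviceId: "stb-9f21c",      // identifies the client device sending the action
      contentId: "prog-00417",    // identifies the content 103 being consumed
      eventTime: 1820.0,          // where on the timeline the interactive event occurs (seconds)
      eventDuration: 30,          // how long the event remains active (seconds)
      layer: "broadcaster"        // which metadata layer raised the event
    };
    // e.g. sent with: fetch(serverUrl + "/actions", { method: "POST", body: JSON.stringify(action) });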
Data representing an event received at the server 117 is processed in order to determine the nature of the event and the required data which needs to be fed to the client device in order to execute the event. There can be further two-way communication if an event triggers further requirements for data. Accordingly, event data can be passed to the client device 111 from server 117, and can include data for an interactive event including audio and/or visual content and data representing one or more interactive elements such as buttons, scroll bars etc. Event data returned from the client device 111 to server 117 can include data representing an interaction and which includes data generated in response to a user interaction with one or more of interactive elements for example.
Control data is fed between the client device 111 and the server 117 to maintain synchronization between an event and a timeline of content 103 being consumed. For example, control data 115 can be used to ensure that an event is executed at a specific time, or that an event which is being executed (inasmuch as a user may be interacting with content in some way for example) continues to make sense or be relevant vis-à-vis the related content 103.
An additional device 205 can be used according to an example. Such a device can be a smart phone, tablet device or other suitable apparatus capable of displaying information to a user. For example, device 205 can include televisions, PCs, remote controls and camera based tracking and control systems which may be built into these devices.
In an example, additional device 205 is an internet enabled device which is communicatively coupled to network 203. The device 205 can register itself with server 117 in order to index itself as a member of the environment of figure 2. For example, the device 205 can send registration data to server 117. The server can register the device and send a confirmation, although this need not be required. In an example, registration data can include an identifier which uniquely identifies the additional device 205 in the environment. The registration can include data representing the capabilities of the device 205 such as the device's internet connectivity options, display options etc, all of which can be used to determine suitable actions which can be executed by or on the device 205.
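A sketch of the registration data just described, with hypothetical field names, might look like the following; the capability fields are examples of the connectivity and display options mentioned above:

    // Hypothetical registration message from an additional device 205 to server 117.
    const registration = {
      deviceId: "tablet-4a77",                    // uniquely identifies the device in the environment
      capabilities: {
        connectivity: ["wifi", "gprs"],           // internet connectivity options
        display: { width: 1280, height: 800 },    // display options used to select suitable actions
        html5: true                               // can run the device independent interaction app
      }
    };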
According to an example, common interactions with a device 111, 205, such as via the interactive elements described above for example, can be handled by the firmware of the device in question, as is typical. Bespoke interactions, specific to the content being consumed, are passed back to the server 117 where details of the interactions are validated before passing back instructions to a control module of the device in question, thereby providing it with the data required to display and process the bespoke interactions. As well as being pushed to the server via control module 113, interactions may also be pulled down or pushed down from the server 117 to client device 111 or additional devices 205 in an example.
Events can include executing a streamed advert from an online source or displaying a graphic overlaid on a display screen, either for information or as a request for the viewer to interact, for example. A user can interact using a remote control, smart phone or tablet device for example.
According to an example, there are multiple layers of events that can be represented in one or more metadata tags 106 for content 103. For example, there can be a Production Company Layer which is reserved for the company that produced the content 103. Multiple Broadcaster Layers can be owned by whoever airs or streams the content 103. There can be multiple Advertiser Layers, as well as Social Network Layers which can be owned by any number of Social Networks. In an example, only viewer-preferred Social Networks will be active.
None of the layers are mandatory, so it is possible for Broadcasters to embed their own Broadcast layer into media content even if the Production Company has not set up their reserved layer. This ensures broadcasters can enhance older content without the need for Production Companies to register layers on old shows.
In an example, a tag 106 can therefore include data for multiple layers.
Each layer can be demarcated from the others by specific identifiers for layers, such as an identification code or name. Accordingly, the tag can include layer metadata which provides a structure for layers. Payload data for layers can include information specific to each layer. For example, for each layer there can be data representing actions which correspond to interactive elements and portions of content 103. Alternatively, separate layers can be embodied in separate tags in metadata 107.
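Purely as an illustration of the layered tag structure described above (the layer identifiers and field names are assumptions), a tag 106 carrying several demarcated layers could be expressed as:

    // Hypothetical tag 106 with multiple demarcated layers; identifiers are illustrative only.
    const tag = {
      contentId: "prog-00417",
      layers: [
        { id: "production-company", events: [ /* event markers owned by the production company */ ] },
        { id: "broadcaster",        events: [ /* calls to vote, decision points, etc. */ ] },
        { id: "advertiser-001",     events: [ /* special offers tied to scenes */ ] },
        { id: "social-network-a",   events: [ /* sharing and chat prompts */ ] }
      ]
    };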
Figure 3 is a schematic block diagram of a method according to an example. As described above, content 103 can include payload 105 and metadata 107. A control module 113 of a client device 111 can decode the metadata 107 in order to provide a specific tag 106. The tag 106 provides data for providing interactive elements for the content 103. According to an example, tag 106 can include data for multiple layers. A layer can be owned or otherwise managed by a certain stakeholder such as a broadcaster or content provider 101, an advertiser, a social network provider etc. For each layer, payload data can be provided which provides data representing interactive elements which the stakeholder of the layer in question would like to present to a user who is consuming the content which has spawned the elements.
In an example, actions 201 can be extracted from tag 106 and provided to server 117. The server 117 can process the actions in order to determine suitable interactive data elements 301 to provide for a device 111, 205.
Accordingly, multiple actions for different layers can be provided to server 117. Each layer can be distinguished from others using a layer identification 303 which can be mapped to specific sets of interactive elements. These can be selected either in connection with some characteristics of the layer owner or stakeholder, or can be provided by the owner or stakeholder. The relevant interactive data elements 301 can then be provided to devices 111, 205. Control data 305 flowing between devices 111, 205 and server 117 provides data which enables the timely and relevant provision of interactive elements as described above. For example, control data can relate to timing and duration of certain interactive elements. Further, control data can relate to certain actions performed in connection with an interactive element. For example, an interactive button can be pressed, and data representing the interaction can be part of the control data which flows between devices 111, 205 and server 117.
According to an example, server 117 is thus operable to pass interactions or instructions received from a first device of the devices 111, 205 to the other in order to update the software running on the other device to progress the interaction, or simply to remotely control the other device from the first device, instead of using a remote control for example.
As described above, metadata 107 can comprise an ID for use by server 117. That is, an ID can map to data representing one or more extended metadata layers, and the server 117 can return such extended metadata layers to a client device 111, 205 of figure 3 for example. Accordingly, a metadata ID can be mapped to a location or pointer on server 117 so that interactive content can be located from and provided by a location remote from the server 117. The provision of remote content can be executed directly from the remote location to a device 111, or via server 117 for example.
In other examples, control data can include the provision of: linking a user to friends in their social network, either using a proprietary social network of existing users or one or more social networks linked to by the user; passing interactive content from layer owners' servers or pointing the user's devices to interactive content via inbuilt device firmware, a device independent interaction application or a device specific application; passing data and interactions from a user back to a relevant service or layer owner; and recording 'Click Throughs' on adverts or other paid interactive events that appear on the content's timeline.
In an example, server 117 can be used to manage online bidding for advertisers to secure interactive ads around events or the advertising layer, or to add sponsorship or attach more traditional ads to the video material, as well as to manage payment for services rendered.
According to an example, a device independent interaction application (app) can be provided. The app can be written using the HTML 5 scripting language, combined with Javascript and JSON to provide a cross platform interactive user interface that can run on any HTML 5, W3C compliant browser. When opened on one of a user's devices, the server 117 can send to the app details of what the user is consuming. Where more than one registered device is being used to view a broadcast TV program or a streamed/downloaded video for example, the app can provide the user the option of which TV or device to sync to. Accordingly, the app can provide the viewer with extended information connected to the program or video that they are watching. This information can be viewed immediately or queued for a later time.
The information provided may be simple extended information provided by the program maker or some other layer owner, or it may be in the form of micro sites provided by the program maker or advertiser to provide a rich interactive engagement or shopping experience. In an example, the app can also act as a social portal where the user can interact with family and friends about the program or offer they are looking at.
Extended media can be additional video footage served from the internet by the layer owner or server 117. Additional footage may also include interactive events, and therefore the app also has video player capabilities according to an example, which will identify if a metadata tag is present and pass this back to the server 117 in order to retrieve additional interactive content.
In an example, a Layer Editor is provided in the form of a Web App designed to give publishers and content providers the means to add events onto a metadata event timeline. A drag and drop interface is provided to allow a user to import a video file they wish to add a layer to, select the type of event they wish to insert and then drag this onto the video's timeline.
An event timeline can have multiple dimension elements, for example the Time, Duration, X and Y co-ordinates and behaviours of the event. The Time element represents the start position on the video's timeline, the Duration allows the user to determine how long the event will remain active/visible, and the X and Y co-ordinates represent the position on the video frame. The behaviours help provide extendable actions to control the way in which the events appear on screen, how they may animate, fade in and out, etc. An event may stretch across a number of video frames along the timeline, as dictated by the duration. Each event can have a different X and Y co-ordinate for each frame within the Duration of the event. This gives the flexibility to generate moving visible event tags that follow an object on the TV, PC, Smart Phone or Tablet being used to watch a video that supports the IMES service.
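To make the multi-dimensional event description concrete, here is a hedged sketch (field names are assumptions) of how a Layer Editor might serialise a single event, with a start time, a duration, per-frame co-ordinates that let the visible tag follow an on-screen object, and behaviour settings:

    // Hypothetical serialised timeline event as produced by the Layer Editor.
    const timelineEvent = {
      time: 612.0,        // start position on the video's timeline (seconds)
      duration: 4.0,      // how long the event remains active/visible (seconds)
      behaviour: { appearance: "ghost-outline", fadeInMs: 250, fadeOutMs: 250 },
      // one co-ordinate pair per frame across the duration, so the tag can track an object
      frames: [
        { x: 0.31, y: 0.55 },
        { x: 0.33, y: 0.54 },
        { x: 0.35, y: 0.52 }
        // ... one entry per frame up to time + duration
      ],
      action: { kind: "add-to-wish-list", productId: "sku-0042" }
    };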
In an example, visible events can be provided in a number of different forms, such as:
1. Semi-transparent pop ups that contain images, actions and links to extended content;
2. Visible transparent glowing (ghostly) vector outlines over an object or person that the event is attached to, which when pressed can execute actions or open up a link for example;
3. Simple text web link tags; and
4. Graphical images, both animated and not.
Actions can be instructions given by the user that execute something. The Actions that the user may wish to carry out include:
1. Add to wish list;
2. Review later;
3. Give me more info;
4. Visit website;
5. Share with friends (using one or more social networks);
6. Email details; and
7. Pause transmission/video (fast forward, rewind).
Actions may require additional information, either from the viewer's preferences or from live input from the viewer. An example to help illustrate is 'Visit website', where the viewer may have set the default behaviour to one of:
1. "Interrupt TV viewing, pause if possible and show website on my internet enabled television";
2. "Don't interrupt TV viewing and show website on my tablet/smart phone"; or
3. "Pause TV if possible and show website on my tablet/smart phone".
The action sets are extendible, and so the above represents only a selection of the options that are available. For example, playback of media on a device can be extended to allow an Electronic Program Guide and other TV menus for settings and IPTV apps to be provided onto the secondary device, so that some control of a main viewing device can be effected using a secondary device such as a tablet or phone for example.
When an Action is triggered by the viewer, Action data is sent back to server 117, where it is processed and appropriate instructions are sent to the relevant registered devices.
In an example, browser plugins can be provided to further extend the functionality of a web browser so that content providers can allow viewers to interact with online streamed video from a standard web page, viewed on their computer, in the same way as they would if they were watching a TV program or streamed content on their IPTV. In addition, third party plugins for professional video editing software, for adding events onto a video's timeline for the owner's layer and for publishing this to server 117, are provided according to an example. That is, professional production and broadcast companies may wish to incorporate the creation of layers into their editing workflows. To achieve this, Editing Suite plugins are provided to extend the functionality of the users' applications.
Figure 4 is a schematic block diagram of a system according to an example. A client device 111 in the example of figure 4 is a set-top box connected to a television apparatus 401. The device 111 is internet-enabled, and is connected to the internet 403. Device 111 receives content 103 from content provider 101 over the delivery network 109, which in the example of figure 4 is a cable television network. Control module 113 of device 111 decodes metadata 107 from the content 103 and determines actions 201 relevant to the content being displayed on television 401. The actions are sent to server 117 over the internet 403 where relevant interactive elements data is determined or generated and sent back to device 111 via the internet 403.
In an example, device 205 is registered at the server 117 as a user device.
Device 111 is also registered at server 117. A user preference file 405 stored on server 117 provides data in the form of a profile for the user of the system of figure 4 which sets out the user's preferences in relation to their registered devices 111, 205 and the way in which each device should handle interactive content received from server 117 in connection with content 103. In the example of figure 4, preference file 405 provides that device 205 can receive interactive elements data 301 from server 117.
Accordingly, following receipt of actions 201 and determination of suitable interactive content, server 117 can send data 301 to device 205 over the internet 403. As device 205 is registered with server 117, a communicative coupling is established between it and the server 117. In this connection, data 301 can be sent to device 205 in preference to or in addition to device 111. Alternatively, different data 301 can be sent to the devices 111, 205.
For example, data 301 sent to device 111 can provide first interactive or passive elements for display on television 401, and data 301 sent to device 205 can provide second interactive or passive elements for display using a display 407 of device 205.
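A minimal sketch of this preference-driven routing, with assumed field and function names: the server consults the user preference file 405 to decide which registered device receives which interactive elements data 301:

    // Hypothetical routing of interactive elements data 301 based on user preference file 405.
    const preferenceFile = {
      userId: "user-118",
      devices: {
        "stb-9f21c":   { role: "main-screen",   receives: ["passive-overlays"] },
        "tablet-4a77": { role: "second-screen", receives: ["interactive-elements", "micro-sites"] }
      }
    };

    function routeInteractiveData(preferences, elements) {
      // Returns a map of deviceId -> the subset of elements that device should receive.
      const plan = {};
      for (const [deviceId, prefs] of Object.entries(preferences.devices)) {
        plan[deviceId] = elements.filter(e => prefs.receives.includes(e.category));
      }
      return plan;
    }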
In an example, when a user interacts with device 111 or 205, control data representing an action performed in connection with an interactive element displayed on the device can be sent to server 117 via internet 403. The interaction may prompt further interactive elements to be sent to a device, such as in response to a user selection for example. In a basic example, an interactive element can include a button to change an aspect of the television 401 such as channel, volume etc. Accordingly, the server can send corresponding control data to device 111 in order to effect the desired change.
Figure 5 is a schematic block diagram of a client device 500 according to an example. A digital media processor 501 can decode data received from a content provider. For example, processor 501 can decode a video datastream for display, such as a datastream received via a tuner 503 for example. A control module 505 is communicatively coupled to processor 501 and operable to decode metadata from a video datastream.
Alternatively, the control module can form part of processor 501. An Ethernet port 507 can be provided in communication with processor 501 so that data can be sent and received over the internet or another network for example.
Similarly, a wireless interface 509 can provide wireless radio-frequency network communication. Elements 507, 509 can be communications modules for the device 500.

Claims (7)

CLAIMS
What is claimed is:
1. A control system for an internet-enabled television apparatus, comprising: a tuner module to receive a data signal including content payload and metadata; a control module to decode the metadata from the data signal; a communications module to: communicate with a server over a communications network; issue action data to the server on the basis of instructions in the metadata; and receive interactive elements data for the content payload; a digital media processor to use the interactive data to provide a displaying device with interactive content.
2. A control system as claimed in claim 1, the control module further use the decoded metadata to determine a metadata ID, the communications module to communicate the metadata ID to the server.
3. A control system as claimed in claim 2, the server to use the metadata ID to provide extended metadata for the processor.
4. A control system as claimed in claim 3, the control module to decode the extended metadata.
5. A control system as claimed in claim 1, wherein the metadata includes multiple metadata layers representing respective different interactive actions.
6. A control system as claimed in claim 1, wherein the metadata includes an ID to map to extended metadata layers for the television apparatus.
7. A system for providing interactive content at a device, comprising: a client device to receive video data representing video content; a control module of the client device to decode metadata from the video data; a server to receive control data derived from the metadata from the client device and to send interactive data to the device on the basis of the control data; a digital media processor to use the interactive data to provide the device with interactive content.
8. A system as claimed in claim 7, the control module to fetch extended metadata from the server for decoding.
9. A system as claimed in claim 7 or 8, the server to receive an ID tag derived from the metadata from the client device and to locate control data to send interactive data to the device on the basis of the control data.
10. A method comprising: receiving metadata at a client device; decoding the metadata; providing data representing a set of actions for interactive elements of content to be consumed using the client device; providing a set of interactive data elements for the client device; and maintaining synchronization of interaction data elements displayed using the client device with the content to be consumed.
11. A method as claimed in claim 10, wherein the metadata includes multiple layers for respective ones of multiple stakeholders in the content.
12. A method as claimed in claim 10 or 11, wherein the set of actions for interactive elements are derived directly from the metadata.
13. A method as claimed in claim 10 or 11, wherein the set of actions for interactive elements are determined by mapping data representing an ID embedded in the metadata to multiple metadata items.
14. Apparatus comprising: a control module to use metadata having multiple different layers associated therewith representing respective interactive data elements for video content, and to determine points of the content to be augmented or enhanced with the interactive content.
15. Apparatus as claimed in claim 14, further comprising a communications module to communicate with a server to provide interactive content for a layer on the basis of a request from the control module.
16. Apparatus as claimed in claim 14 or 15, the control module to decode data representing an identifier from the metadata.
Amendments to the claims have been filed as follows:
CLAIMS
1. A control system for an internet-enabled television apparatus, comprising: a first client device including: a tuner module to receive a data signal including content payload and metadata; a control module to extract the metadata from the received data signal; a communications module to: receive metadata from the control module; communicate with a server over a communications network; issue action data to the server on the basis of instructions in, or associated with, the metadata; said communications module or a second client device being adapted to receive interactive data elements for the content payload from the server based on the action data; a digital media processor located on said first and/or second client device to use the interactive data elements to provide a displaying device with interactive content.
2. A control system as claimed in claim 1, the control module further use the metadata to determine a metadata ID, the communications module to communicate the metadata ID to the server.
3. A control system as claimed in claim 2, the server to use the metadata ID to provide extended metadata for the processor.
4. A control system as claimed in claim 3, the control module to decode the extended metadata.
5. A control system as claimed in claim 1, wherein the metadata includes multiple metadata layers representing respective different interactive actions.
6. A control system as claimed in claim 1, wherein the metadata includes an ID to map to extended metadata layers for the television apparatus.
7. A system for providing interactive content at a device, comprising: a client device to receive video data representing video content; a control module of the client device to extract metadata from the received video data; a server to receive control data derived from, or associated with, the metadata, extracted by the control module, from the client device, and to send interactive data to the said device and/or another client device on the basis of the control data; a digital media processor located on one or both client devices to use the interactive data to provide the device(s) with interactive content.
8. A system as claimed in claim 7, the control module to fetch extended metadata from the server for decoding.
9. A system as claimed in claim 7 or 8, the server to receive an ID tag derived from the metadata from the client device and to locate control data to send interactive data to the device on the basis of the control data.
10. A method comprising: receiving a signal including content payload and metadata at a client device; extracting the received metadata; providing, to a server, data representing a set of actions for interactive data elements of content to be consumed using a client device on the basis of instructions in, or associated with, the metadata; providing, by the server, said set of interactive data elements for the said client device or other client device; and maintaining synchronization of interactive data elements displayed using a client device(s) with the content to be consumed.
11. A method as claimed in claim 10, wherein the metadata includes multiple layers for respective ones of multiple stakeholders in the content.
12. A method as claimed in claim 10 or 11, wherein the set of actions for interactive elements are derived directly from the metadata.
13. A method as claimed in claim 10 or 11, wherein the set of actions for interactive elements are determined by mapping data representing an ID embedded in the metadata to multiple metadata items.
14. A system as claimed in claims 1 to 9 wherein the said control module is adapted to use metadata having multiple different layers associated therewith representing respective interactive data elements for video content, and to determine points of the content to be augmented or enhanced with the interactive content.
15. A system as claimed in claim 14, further comprising a communications module to communicate with a server to provide interactive content for a layer on the basis of a request from the control module.
16. A system as claimed in claim 14 or 15, the control module to decode data representing an identifier from the metadata.
GB1116598.2A 2011-09-27 2011-09-27 Interactive system Expired - Fee Related GB2495088B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1116598.2A GB2495088B (en) 2011-09-27 2011-09-27 Interactive system
PCT/GB2012/052386 WO2013045922A1 (en) 2011-09-27 2012-09-26 System for providing interactive content to an internet - enabled television apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1116598.2A GB2495088B (en) 2011-09-27 2011-09-27 Interactive system

Publications (3)

Publication Number Publication Date
GB201116598D0 GB201116598D0 (en) 2011-11-09
GB2495088A true GB2495088A (en) 2013-04-03
GB2495088B GB2495088B (en) 2013-11-13

Family

ID=44993406

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1116598.2A Expired - Fee Related GB2495088B (en) 2011-09-27 2011-09-27 Interactive system

Country Status (2)

Country Link
GB (1) GB2495088B (en)
WO (1) WO2013045922A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3076634A4 (en) * 2015-02-03 2016-11-02 Huawei Tech Co Ltd Method for playing media content, server and display apparatus

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102335007B1 (en) 2015-04-01 2021-12-06 삼성전자주식회사 Method and device for transmitting/receiving information in a broadcating system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020035601A1 (en) * 1996-03-08 2002-03-21 Craig Ullman Enhanced video programming system and method for incorporating and displaying retrieved integrated internet information segments
US20020162117A1 (en) * 2001-04-26 2002-10-31 Martin Pearson System and method for broadcast-synchronized interactive content interrelated to broadcast content
US20080201736A1 (en) * 2007-01-12 2008-08-21 Ictv, Inc. Using Triggers with Video for Interactive Content Identification
US20100158109A1 (en) * 2007-01-12 2010-06-24 Activevideo Networks, Inc. Providing Television Broadcasts over a Managed Network and Interactive Content over an Unmanaged Network to a Client Device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4735700B2 (en) * 2008-10-08 2011-07-27 ソニー株式会社 Receiving device, receiving method, server device
KR20100062158A (en) * 2008-12-01 2010-06-10 삼성전자주식회사 Display apparatus and method of displaying
KR101608763B1 (en) * 2009-06-11 2016-04-04 엘지전자 주식회사 Mobile terminal and method for participating interactive service thereof, and internet protocol television terminal and communication system
US8839306B2 (en) * 2009-11-20 2014-09-16 At&T Intellectual Property I, Lp Method and apparatus for presenting media programs
US20110154200A1 (en) * 2009-12-23 2011-06-23 Apple Inc. Enhancing Media Content with Content-Aware Resources

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020035601A1 (en) * 1996-03-08 2002-03-21 Craig Ullman Enhanced video programming system and method for incorporating and displaying retrieved integrated internet information segments
US20020162117A1 (en) * 2001-04-26 2002-10-31 Martin Pearson System and method for broadcast-synchronized interactive content interrelated to broadcast content
US20080201736A1 (en) * 2007-01-12 2008-08-21 Ictv, Inc. Using Triggers with Video for Interactive Content Identification
US20100158109A1 (en) * 2007-01-12 2010-06-24 Activevideo Networks, Inc. Providing Television Broadcasts over a Managed Network and Interactive Content over an Unmanaged Network to a Client Device

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3076634A4 (en) * 2015-02-03 2016-11-02 Huawei Tech Co Ltd Method for playing media content, server and display apparatus

Also Published As

Publication number Publication date
GB2495088B (en) 2013-11-13
GB201116598D0 (en) 2011-11-09
WO2013045922A1 (en) 2013-04-04

Similar Documents

Publication Publication Date Title
US20230143423A1 (en) Providing enhanced content
US9967708B2 (en) Methods and systems for performing actions based on location-based rules
CN103210654B (en) Digital receiver and the method for real-time audience ratings is provided
CN103348693B (en) Systems and methods for navigating through content in an interactive media guidance application
US9106873B2 (en) Methods and systems for providing enhanced content by way of a virtual channel
KR101774039B1 (en) Automatic media asset update over an online social network
CA2602109C (en) Method and system of providing user interface
US20120233646A1 (en) Synchronous multi-platform content consumption
US20120284744A1 (en) Automated playlist generation
US20140130097A1 (en) Apparatus and method for television
US20110178854A1 (en) Method and system for enhancing and/or monitoring visual content and method and/or system for adding a dynamic layer to visual content
US20120159541A1 (en) Platform shifted advertising and information fulfillment
US20120240142A1 (en) Content Provision
CN101606171A (en) The apparatus and method of access and first media data correlation combiner information
US20130312049A1 (en) Authoring, archiving, and delivering time-based interactive tv content
US20130054319A1 (en) Methods and systems for presenting a three-dimensional media guidance application
US8327404B2 (en) Methods and systems for providing enhanced content associated with a media content instance available for purchase
US20150312633A1 (en) Electronic system and method to render additional information with displayed media
US20120179968A1 (en) Digital signage system and method
CN105208415A (en) Advertisement playing method, advertisement playing service device and advertisement playing system based on interactive network television
WO2013045922A1 (en) System for providing interactive content to an internet - enabled television apparatus
KR101537547B1 (en) On-line live-broadcasting advertisement system and method using overlay streaming
KR20120057028A (en) System, method and apparatus of providing/receiving advertisement content of service providers and client
KR20120078069A (en) System, method and apparatus of providing/receiving contents of plurality of content providers and client
WO2014191081A1 (en) Providing information about internet protocol television streams

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20210927

S28 Restoration of ceased patents (sect. 28/pat. act 1977)

Free format text: APPLICATION FILED

S28 Restoration of ceased patents (sect. 28/pat. act 1977)

Free format text: RESTORATION ALLOWED

Effective date: 20221007

PCNP Patent ceased through non-payment of renewal fee

Effective date: 20230927