WO2013045922A1 - System for providing interactive content to an internet-enabled television apparatus - Google Patents
- Publication number
- WO2013045922A1 (application PCT/GB2012/052386)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- metadata
- interactive
- data
- content
- server
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/8126—Monomedia components thereof involving additional data, e.g. news, sports, stocks, weather forecasts
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/462—Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
- H04N21/4622—Retrieving content or additional data from different sources, e.g. from a broadcast channel and the Internet
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/237—Communication with additional data server
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/414—Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
- H04N21/41407—Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/472—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
- H04N21/4722—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting additional data associated with the content
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/478—Supplemental services, e.g. displaying phone caller identification, shopping application
- H04N21/4788—Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/65—Transmission of management data between client and server
- H04N21/658—Transmission by the client directed to the server
- H04N21/6581—Reference data, e.g. a movie identifier for ordering a movie or a product identifier in a home shopping application
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/83—Generation or processing of protective or descriptive data associated with content; Content structuring
- H04N21/84—Generation or processing of descriptive data, e.g. content descriptors
- H04N21/8405—Generation or processing of descriptive data, e.g. content descriptors represented by keywords
Definitions
- Broadcasters are typically concerned with the engagement and retention of viewers and in building ratings. To this end, they can use tried and tested approaches.
- users are expecting to be able to interact to a greater degree with the content that they consume, and are typically more engaged with content where some degree of interaction is provided.
- Dedicated apps for shows that engage viewers with voting have already started to appear. For example, one such app works by selling viewers a number of voting credits and allowing them to cast their vote using the app.
- Another such example of an interactive app includes one which allows viewers of certain entertainment shows to play along with the contestants.
- Interactive services which are provided through satellite, cable or some terrestrial systems are known. For example, many advertisements give viewers the option to press a button on their remote control in order to engage with the company behind the advertisement. Once engaged, viewers are then able to give personal details, book a service, arrange a call or order products, for example.
- a control system for an internet-enabled television apparatus comprising: a first client device including: a tuner module to receive a data signal including content payload and metadata; a control module to extract the metadata from the received data signal; a communications module to: receive metadata from the control module; communicate with a server over a communications network; issue action data to the server on the basis of instructions in, or associated with, the metadata; said communications module or a client device being adapted to receive interactive elements data for the content payload from the server based on the action data; and a digital media processor located on said first and/or second client device to use the interactive elements data to provide a displaying device with interactive content.
- the control module can be further operable to use the metadata to determine a metadata ID, the communications module to communicate the metadata ID to the server.
- the server can use the metadata ID to provide extended metadata for the processor.
- the control module can decode the extended metadata.
- the metadata can include multiple metadata layers representing respective different interactive actions, and can include an ID to map to extended metadata layers for the television apparatus.
- a system for providing interactive content at a device comprising: a client device to receive video data representing video content; a control module of the client device to extract metadata from the received video data; a server to receive control data derived from, or associated with, the metadata extracted by the control module, from the client device, and to send interactive data to the said device and/or another client device on the basis of the control data; a digital media processor located on one or both client devices to use the interactive data to provide the device(s) with interactive content.
- the control module can fetch extended metadata from the server for decoding.
- the server can receive an ID tag derived from the metadata from the client device and to locate control data to send interactive data to the device on the basis of the control data.
- a method comprising: receiving a signal including content payload and metadata at a client device; extracting the received metadata; providing, to a server, data representing a set of actions for interactive data elements to be consumed using a client device on the basis of instructions in, or associated with, the metadata; providing by the server said set of interactive data elements for the said client device or other client device; and maintaining synchronization of interactive data elements displayed using the client device(s) with the content to be consumed.
- the metadata can include multiple layers for respective ones of multiple stakeholders in the content.
- the set of actions for interactive elements can be derived directly from the metadata.
- the set of actions for interactive elements can be determined by mapping data representing an ID embedded in the metadata to multiple metadata items.
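The two routes above — actions carried directly in the metadata, or an embedded ID mapped to multiple metadata items — can be sketched as follows. This is an illustrative sketch only; the function names, ID format and action fields are assumptions, not taken from the claims.

```javascript
// Hypothetical server-side table: a short metadata ID embedded in the
// broadcast maps to one or more extended metadata items (values are made up).
const extendedMetadata = new Map([
  ["ep-042", [
    { layer: "broadcaster", action: "vote", at: 1260, duration: 60 },
    { layer: "advertiser", action: "offer", at: 1800, duration: 30 },
  ]],
]);

// Derive the set of actions either directly from the metadata, or by mapping
// an embedded ID to the extended metadata items held on the server.
function actionsForMetadata(metadata) {
  if (Array.isArray(metadata.actions)) {
    return metadata.actions; // actions carried in-band with the content
  }
  return extendedMetadata.get(metadata.id) || []; // ID-based lookup
}
```

Either way, the client ends up with the same shape of action list, which is what lets the in-band and ID-mapped variants coexist.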
- control module may use metadata having multiple different layers associated therewith representing respective interactive data elements for video content, and to determine points of the content to be augmented or enhanced with the interactive content.
- the control module can include a communications module to communicate with a server to provide interactive content for a layer on the basis of a request from the control module.
- the control module can decode data representing an identifier from the metadata.
- Figure 1 is a schematic block diagram of a system according to an example.
- Figure 2 is a schematic state diagram of a method according to an example.
- Figure 3 is a schematic block diagram of a method according to an example.
- Figure 4 is a schematic block diagram of a system according to an example.
- FIG. 5 is a schematic block diagram of a client device 500 according to an example.
DETAILED DESCRIPTION
- Interactive television is an interactive audio/video delivery medium which provides broadcast audiovisual content to a number of subscribers.
- Interactive television provides broadcast video and audio to users and also provides a return path for the user to interact with the content, e.g., to make selections or order desired products, etc.
- a system for providing interactive content uses metadata associated with the video content in order to designate points and regions of the video content to be augmented or enhanced with interactive content.
- the interactive content can be provided from a server.
- a pointer to some other online content that a content provider wishes to serve to the end user can be provided.
- the pointer can be provided on the server.
- event markers can be added along a timeline within the video content or an audio stream of media content. Event markers can be layered, so that more than one event can be triggered at the same point along the timeline.
- Respective event markers can include positioning and behavior data, such as for any visual clues that are to be displayed over the broadcast at each frame along the timeline.
- an event marker or flag can indicate the position of an advert break, a special offer for a product in a scene, a call to vote, a survey about a topic matter discussed on the program or a decision point in an interactive program.
- Other alternatives are possible, as will be appreciated by those skilled in the art.
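The event markers described above can be pictured as a small data model: each marker sits at a point on the timeline, carries a layer, and may carry positioning/behaviour data for a visual clue. This is a minimal sketch with illustrative field names; the patent does not prescribe a concrete format.

```javascript
// Illustrative event-marker timeline: trigger time in seconds, owning layer,
// the kind of event, and optional overlay positioning/behaviour data.
const timeline = [
  { layer: "production", at: 300, kind: "advertBreak" },
  { layer: "broadcaster", at: 300, kind: "vote", overlay: { x: 0.8, y: 0.1 } },
  { layer: "advertiser", at: 451, kind: "specialOffer", overlay: { x: 0.1, y: 0.8 } },
];

// Markers can be layered: more than one event may be triggered at the same
// point along the timeline, so a lookup returns all markers at that point.
function markersAt(timeline, seconds) {
  return timeline.filter((m) => m.at === seconds);
}
```

Note that the two markers at second 300 belong to different layers, which is exactly the layered-trigger case the text describes.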
- a system for providing interactive content can include a control system which is part of a set-top box, internet enabled television apparatus or other device capable of receiving video data for display.
- the same system can also provide interactive content for online video or audio broadcasts by various means including a media player for web broadcasters which can be embedded in a web page and used to play media. Browser plugins can allow the system to provide interactive content to third party media players that online broadcasters can embed within web pages.
- a media player can be installed on a viewer's computer, tablet or smart phone.
- Such a codec can be provided in a number of different forms suitable for the hardware being used to receive the media and the operating environment used to control the hardware.
- the HbbTV standard is a European-led standard for IPTV manufacturers to adhere to.
- the platform is JavaScript, CSS3 and HTML5 compliant. Therefore, in an example, a codec can be written in JavaScript. The codec can then either be incorporated into a device's onboard firmware or, if provision is made for the HbbTV-compliant device to retrieve instructions from an external IP address, the codec can detect event markers from an external server without having to be embedded onboard the device itself.
- the codec can also exist embedded within a bespoke JavaScript or JSON media player library that online broadcasters can embed within their web pages, thereby allowing them to hook into interactive servers, for example. It can also exist as a plugin to a browser which a user can choose to install when they visit a site. This approach can be used when the broadcaster relies upon some other third-party media player on their website, providing an easy way for them to migrate to a new system: the browser plugin can be used to enhance the features of the current media players on their website.
- the codec may also exist in a form which permits it to be installed on a computer, tablet computer or smart phone for enhancing the features of any media players found on those devices, as well as a means of monitoring events of any media player software or app that can be provided for users to run on their device.
- a codec can capture when a viewer is watching or listening to a piece of media and identify when an event marker or flag is reached or will be reached.
- a codec can be used in multiple modes, such as in one of three modes.
- the first mode is where metadata for a main event marker timeline can be stored on a central server.
- a media file being broadcast includes a unique identifier.
- the codec reads the identifier and keeps the event marker timeline being retrieved from the server synchronized with the location within the broadcast where the user is.
- the codec will maintain synchronization irrespective of whether the broadcast is live, streamed on demand or recorded - either on a PVR-type device or a cloud-based PVR - and whether it is paused (live TV, streamed or recorded playback), fast forwarded or rewound, etc.
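One way the first mode's synchronisation could work is a cursor over the server-fetched marker timeline that follows the reported playhead: seeking forward fires every marker crossed, and rewinding simply moves the cursor back. This is a hedged sketch of that idea, not the patent's actual mechanism; all names are illustrative.

```javascript
// Track the viewer's position against a marker timeline fetched from the
// server, tolerating pause, fast-forward and rewind.
function createSyncTracker(markers) {
  let last = 0; // last reported playhead position, in seconds
  return {
    // Report the current playhead; returns the markers crossed since the
    // previous report. A jump backwards (rewind) just resets the cursor.
    tick(playhead) {
      if (playhead < last) { last = playhead; return []; }
      const due = markers.filter((m) => m.at > last && m.at <= playhead);
      last = playhead;
      return due;
    },
  };
}
```

Because the tracker only compares playhead positions, it behaves the same whether the content is live, streamed on demand or played back from a recording.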
- a second mode is where broadcast media contains event markers in a self-contained metadata package that is broadcast with the media whilst viewing live, streaming on demand or playing back recorded material.
- a third mode is where a codec receives metadata from existing video or audio standards and uses this to enhance the media with interactive content which is provided from a server. This provides broadcasters with means to migrate without having to rework back catalogues of media over a short timeframe.
- event markers can include event codes which can be sent via an Internet connection to a server along with a viewer ID that can be used and stored on the device a user is watching or otherwise using.
- Such event codes can be used to receive instructions from the server and send the instructions back to any device that the viewer has registered for the purposes of the interactive system. Accordingly, multiple devices can be maintained in sync with what a user is watching.
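The event-code round trip above — an event code plus the viewer ID stored on the device goes up, and instructions come back down to every device the viewer has registered — can be sketched as below. The data shapes and function names are assumptions for illustration only.

```javascript
// In-memory stand-in for the server's device registry: viewerId -> deviceIds.
const registeredDevices = new Map();

function registerDevice(viewerId, deviceId) {
  const list = registeredDevices.get(viewerId) || [];
  list.push(deviceId);
  registeredDevices.set(viewerId, list);
}

// The client sends an event code together with the viewer ID stored on the
// device; the server answers with one instruction per registered device, so
// all of the viewer's devices stay in sync with what is being watched.
function handleEventCode(viewerId, eventCode) {
  const devices = registeredDevices.get(viewerId) || [];
  return devices.map((deviceId) => ({ deviceId, instruction: eventCode }));
}
```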
- instructions received from the server on a device could execute a streamed advert from an online source or display a graphic overlaid on a display screen, such as for information or for a request for the viewer to interact.
- a viewer can interact with a television (or similar media consumption device) remote control.
- a smart phone or tablet device can be used.
- a system which includes:
- a control system including a control module as part of an apparatus such as a set-top box or television; and
- a server such as a cloud-based control server.
- an internet enabled television apparatus includes a television or a set-top box or other similar device to provide a signal to a television.
- FIG. 1 is a schematic block diagram of a system according to an example.
- a content provider 101 such as a broadcaster for example, produces content data 103.
- Content data 103 which can include content such as television programs for example, includes payload data 105, which can be video data, and metadata 107.
- the content data 103 is sent, or broadcast, over a delivery network 109.
- the delivery network can include terrestrial analogue or digital television broadcasting or streamed or downloaded content provided over the internet.
- both payload data 105 and metadata 107 are sent using the same delivery network. However, it will be appreciated that one may be sent using a different delivery network or mechanism than the other.
- payload data 105 may be sent over a terrestrial digital channel with metadata 107 being sent 'out of band', such as over the internet for example.
- the client device 111 can be a television, such as an internet-enabled television (IPTV), a set-top box or other apparatus capable of receiving and decoding a signal received from at least the delivery network 109.
- the client device 111 is an internet-enabled television apparatus or internet-enabled set-top box apparatus. Accordingly, a broadcast can be provided over the internet as a streamed broadcast to the IPTV.
- the client device 111 includes a control module 113.
- control module 113 can read metadata 107 from content 103.
- control module 113 can include a metadata control layer to decode metadata 107 and to perform appropriate actions based upon the instructions or data held in the metadata 107.
- control module 113 can determine a current timeline/viewing status of programs being viewed using the client device 111. For example, when the client device 111 is powered on, the control module 113 can extract metadata and current timeline/viewing status from programs being watched, either through a connection to the internet or using a tuner or PVR recording and playback device.
- metadata supplied within the content 103 may be a simple ID number which the control module 113 passes back to the server 117.
- the ID can map to data representing one or more extended metadata layers, and the server 117 can return such extended metadata layers to the client device 111 (and any additional devices). That is, metadata layers can either be wholly included in the payload or can be referenced within the payload so that detailed metadata can be retrieved from an online source. Accordingly, a metadata ID can be mapped to a location or pointer on server 117 so that interactive content can be located from and provided by a location remote from the server 117. The provision of remote content can be executed directly from the remote location to a device 111, or via server 117 for example.
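The ID-to-pointer resolution described above could look something like the following on the server side. The URL, field names and relay flag are all invented for illustration; the patent only says that remote content can be served either directly from the remote location or via server 117.

```javascript
// Hypothetical pointer table on the control server: a metadata ID maps to a
// remote location for the interactive content (URL is made up).
const pointers = new Map([
  ["show-17", { url: "https://cdn.example.com/layers/show-17.json", relay: false }],
]);

function resolvePointer(metadataId) {
  const p = pointers.get(metadataId);
  if (!p) return null;
  return p.relay
    ? { fetchFrom: "control-server", origin: p.url } // server fetches and relays
    : { fetchFrom: p.url };                          // device fetches directly
}
```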
- control data 115 passes between the control module 113 and a server 117. For example, the control data 115 can be communicated over a network such as the internet.
- the control data 115 includes two components (not shown) - a component sent upstream to server 117, and a downstream component received from server 117.
- metadata for video content 103 includes a number of pre-defined elements representing specific attributes of the content, with each element capable of having one or more values.
- metadata 107 is standard metadata suitable for content 103, and which is typically extendable metadata. Accordingly, a tag can be added to the metadata in order to leverage the standard metadata functionality that is found in most widely used terrestrial/digital broadcasted video formats and downloaded or streamed online video.
- tag 106 can be added to the extendable metadata tags attached to content data 103.
- the tag 106 can be extracted from the metadata 107 and passed back to the server 117 by the control module 113 on the client device 111.
- tag 106 includes a representation of a set of layers for the content payload 105 in order to provide the client device 111 and server 117 with markers to enable execution of events and to serve as control points for the provision and execution of interactive elements for the content being consumed.
- Figure 2 is a schematic state diagram of a method according to an example. Certain actions are shown as being performed contemporaneously or sequentially. However, it should be noted that the chain of events depicted in figure 2 provides only a high level view, and that certain actions may or may not be performed simultaneously with others, and that the relative order may be different.
- a network 203 is depicted. Communications between various elements can be executed using the network 203, which can be a local network, GPRS or other mobile communications network or the internet for example.
- a content provider 101 provides content 103 to a client device 111.
- a control module 113 of the client device extracts metadata 107 from the content 103.
- the metadata can be sent to server 117 in an unprocessed form.
- the metadata 107 can be decoded by the control module 113 in order to determine a set of actions 201 from a tag 106.
- the actions 201 are sent to server 117 and represent events for the client device 111.
- Such events can be interactive events in which a user of the client device 111 interacts with content 103 to provide a more immersive experience.
- actions 201 includes data representing a time for an interactive event, and an identification of the device from which the actions 201 have been sent.
- Actions 201 can also include data representing content 103, such as an identifier for the content, and other data including an event duration for example.
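Pulling together the fields listed for actions 201 — event time, sending device, content identifier and event duration — a minimal builder might look like this. The field names are assumptions chosen to mirror the description, not a format defined by the patent.

```javascript
// Build an action payload to send upstream to the server. deviceId identifies
// the sending device; contentId identifies the content 103; `at` is the time
// of the interactive event and `duration` its length (both in seconds).
function buildAction({ deviceId, contentId, at, duration }) {
  if (!deviceId || !contentId) {
    throw new Error("deviceId and contentId are required");
  }
  return { deviceId, contentId, at, duration, sentAt: Date.now() };
}
```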
- metadata 107 can contain all the actions and work as described above, or it can hold an ID which can be sent to the server 117 so that full extended metadata layers can be sent to the client device 111 and any additional devices 205 for processing by control module 113.
- Data representing an event received at the server 1 17 is processed in order to determine the nature of the event and the required data which needs to be fed to the client device in order to execute the event. There can be further two-way communication if an event triggers further requirements for data.
- event data can be passed to the client device 111 from server 117, and can include data for an interactive event including audio and/or visual content and data representing one or more interactive elements such as buttons, scroll bars etc.
- Event data returned from the client device 111 to server 117 can include data representing an interaction, such as data generated in response to a user interaction with one or more of the interactive elements.
- Control data is fed between the client device 111 and the server 117 to maintain synchronization between an event and a timeline of content 103 being consumed.
- control data 115 can be used to ensure that an event is executed at a specific time, or that an event which is being executed (inasmuch as a user may be interacting with content in some way for example) continues to make sense or be relevant vis-a-vis the related content 103.
- An additional device 205 can be used according to an example.
- a device can be a smart phone, tablet device or other suitable apparatus capable of displaying information to a user.
- device 205 can include televisions, PCs, remote controls and camera-based tracking and control systems which may be built into these devices.
- additional device 205 is an internet enabled device which is communicatively coupled to network 203.
- the device 205 can register itself with server 117 in order to index itself as a member of the environment of figure 2.
- the device 205 can send registration data to server 117.
- the server can register the device and send a confirmation, although this need not be required.
- registration data can include an identifier which uniquely identifies the additional device 205 in the environment.
- the registration can include data representing the capabilities of the device 205 such as the device's internet connectivity options, display options etc, all of which can be used to determine suitable actions which can be executed by or on the device 205.
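A registration payload of that shape — a unique identifier plus capability data the server can use to select suitable actions — might be sketched as follows. The capability fields are illustrative assumptions; the text only names internet connectivity and display options as examples.

```javascript
// Hypothetical registration payload for an additional device 205.
function buildRegistration(deviceId, capabilities) {
  return {
    deviceId,                                // uniquely identifies the device
    connectivity: capabilities.connectivity, // e.g. ["wifi", "4g"]
    display: capabilities.display,           // e.g. { width: 1280, height: 800 }
  };
}

// The server can then filter candidate actions by what the device can show.
function suitableActions(actions, registration) {
  return actions.filter((a) => !a.needsDisplay || registration.display);
}
```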
- standard interactions on a device 111, 205 can be handled by the firmware of the device in question, as is typical.
- Bespoke interactions specific to the content being consumed are passed back to the server 117 where details of the interactions are validated before passing back instructions to a control module of the device in question, thereby providing it with the data required to display and process the bespoke interactions.
- interactions may also be pulled down or pushed down from the server 117 to client device 111 or additional devices 205 in an example.
- Events can include executing a streamed advert from an online source or displaying a graphic overlaid on a display screen, either for information or as a request for the viewer to interact, for example.
- a user can interact using a remote control, smart phone or tablet device for example.
- a Production Company Layer which is reserved for the company that produced the content 103.
- Multiple Broadcaster Layers can be owned by whoever airs or streams the content 103.
- Advertiser Layers and Social Network Layers can be owned by any number of advertisers and Social Networks respectively. In an example only viewer-preferred Social Networks will be active. None of the layers are mandatory, so it is possible for Broadcasters to embed their own Broadcast layer into media content even if the Production Company has not set up their reserved layer. This ensures broadcasters can enhance older content without the need for Production Companies to register layers on old shows.
- a tag 106 can therefore include data for multiple layers. Each layer can be demarcated from the others by specific identifiers for layers, such as an identification code or name. Accordingly, the tag can include layer metadata which provides a structure for layers. Payload data for layers can include information specific to each layer. For example, for each layer there can be data representing actions which correspond to interactive elements and portions of content 103. Alternatively, separate layers can be embodied in separate tags in metadata 107.
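A tag of the kind just described — multiple stakeholder layers, each demarcated by its own identifier and carrying its own action payload — could be represented like this. The layer IDs and action fields are illustrative, not a format the patent specifies.

```javascript
// Illustrative tag 106 carrying several stakeholder layers, each with its own
// identifier and payload of actions tied to points in the content.
const tag = {
  layers: [
    { id: "production", actions: [{ at: 600, kind: "sceneInfo" }] },
    { id: "broadcaster", actions: [{ at: 1200, kind: "vote" }] },
    { id: "social", actions: [{ at: 1500, kind: "chat" }] },
  ],
};

// Extract the actions belonging to one stakeholder's layer.
function actionsForLayer(tag, layerId) {
  const layer = tag.layers.find((l) => l.id === layerId);
  return layer ? layer.actions : [];
}
```

The same structure also accommodates the alternative mentioned in the text, where separate layers live in separate tags: each such tag would simply carry a one-element `layers` array.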
- Figure 3 is a schematic block diagram of a method according to an example. As described above, content 103 can include payload 105 and metadata 107. A control module 113 of a client device 111 can decode the metadata 107 in order to provide a specific tag 106.
- the tag 106 provides data for providing interactive elements for the content 103.
- tag 106 can include data for multiple layers.
- a layer can be owned or otherwise managed by a certain stakeholder such as a broadcaster or content provider 101, an advertiser, a social network provider etc.
- payload data can be provided which provides data representing interactive elements which the stakeholder of the layer in question would like to present to a user who is consuming the content which has spawned the elements.
- actions 201 can be extracted from tag 106 and provided to server 117.
- the server 117 can process the actions in order to determine suitable interactive data elements 301 to provide for a device 111, 205. Accordingly, multiple actions for different layers can be provided to server 117. Each layer can be distinguished from others using a layer identification 303 which can be mapped to specific sets of interactive elements. These can be selected either in connection with some characteristics of the layer owner or stakeholder, or can be provided by the owner or stakeholder.
- the relevant interactive data elements 301 can then be provided to devices 111, 205. Control data 305 flowing between devices 111, 205 and server 117 provides data which enables the timely and relevant provision of interactive elements as described above.
- control data can relate to timing and duration of certain interactive elements. Further, control data can relate to certain actions performed in connection with an interactive element. For example, an interactive button can be pressed, and data representing the interaction can be part of the control data which flows between devices 1 1 1 , 205 and server 1 17. According to an example, server 1 17 is thus operable to pass interactions or instructions received from a first device of the devices 1 1 1 , 205 to the other in order to update the software running on the other device to progress the interaction or to simply remotely control the other device from the first device, instead of using a remote control for example.
- Metadata 107 can comprise an ID for use by server 117. That is, an ID can map to data representing one or more extended metadata layers, and the server 117 can return such extended metadata layers to a client device 111, 205 of figure 3 for example. Accordingly, a metadata ID can be mapped to a location or pointer on server 117 so that interactive content can be located at, and provided by, a location remote from the server 117. The provision of remote content can be executed directly from the remote location to a device 111, or via server 117 for example.
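The ID-to-pointer mapping just described could be sketched server-side as below; the table contents, ID format and URL are invented for illustration and are not taken from the specification.

```javascript
// Hypothetical server-side table mapping a metadata ID received from a
// client device to a pointer for the extended metadata layers.
const idToPointer = new Map([
  ["meta-001", { location: "https://layers.example.com/meta-001" }],
]);

// Resolve a metadata ID to the remote location holding the extended
// metadata layers, or null when the ID is unknown.
function resolveMetadataId(id) {
  return idToPointer.get(id) || null;
}
```

Whether the content at the resolved location is then fetched by the server or directly by the device corresponds to the two delivery routes mentioned above.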
- Control data can include the provision of: linking a user to friends in their social network, either using a proprietary social network of existing users or one or more social networks linked to by the user; passing interactive content from layer owners' servers, or pointing the users' devices to interactive content via inbuilt device firmware, a device independent interaction application or a device specific application; passing data and interactions from a user back to a relevant service or layer owner; and recording 'click throughs' on adverts or other paid interactive events that appear on the content's timeline.
- Server 117 can be used to manage online bidding for advertisers to secure interactive ads around events or the advertising layer, or to add sponsorship or attach more traditional ads to the video material, as well as managing payment for services rendered.
- A device independent interaction application can be provided.
- The app can be written using HTML 5, combined with Javascript and JSON, to provide a cross platform interactive user interface that can run on any HTML 5, W3C compliant browser.
- The server 117 can send to the app details of what the user is consuming.
- The app can provide the user the option of which TV or device to sync to. Accordingly, the app can provide the viewer with extended information connected to the program or video that they are watching. This information can be viewed immediately or queued for a later time.
- The information provided may be simple extended information provided by the program maker or some other layer owner, or it may be in the form of micro sites provided by the program maker or advertiser to provide a rich interactive engagement or shopping experience.
- The app can also act as a social portal where the user can interact with family and friends about the program or offer they are looking at.
- Extended media can be additional video footage served from the internet by the layer owner or server 117. Additional footage may also include interactive events, and therefore the app also has video player capabilities according to an example, which will identify if a metadata tag is present and pass this back to the server 117 in order to retrieve additional interactive content.
- A Layer Editor is provided in the form of a Web App designed to give publishers and content providers the means to add events onto a metadata event timeline.
- A drag and drop interface is provided to allow a user to import a video file they wish to add a layer to, select the type of event they wish to insert, and then drag this onto the video's timeline.
- An event timeline can have multiple dimension elements - for example, Time, Duration, X and Y co-ordinates and behaviors of the event.
- The Time element represents the start position on the video's timeline.
- The Duration allows the user to determine how long the event will remain active/visible.
- The X and Y co-ordinates represent the position on the video frame.
- The behaviors help provide extendable actions to control the way in which the events appear on screen, how they may animate, fade in and out, etc.
- An event may stretch across a number of video frames across the timeline, as dictated by the duration. Each event can have a different X and Y co-ordinate for each frame within the Duration of the event. This gives the flexibility to generate moving visible event tags that follow an object on the TV, PC, Smart Phone or Tablet being used to watch a video that supports the IMES service.
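The per-frame co-ordinate idea behind moving visible event tags can be sketched as follows; the event structure here is an assumed encoding of the Time, Duration and X/Y dimensions listed above, not a format defined by the specification.

```javascript
// Assumed encoding: an event spanning several frames, with a start Time,
// a Duration and one X/Y co-ordinate pair per frame within the Duration,
// so a visible tag can follow an object across the picture.
const event = {
  time: 100,   // start position (frame) on the video's timeline
  duration: 3, // frames for which the event stays active/visible
  frames: [    // one co-ordinate pair per frame within the duration
    { x: 10, y: 20 },
    { x: 12, y: 22 },
    { x: 14, y: 24 },
  ],
};

// Position of the visible tag at a given timeline frame, or null when the
// event is not active at that frame.
function tagPosition(ev, frame) {
  const offset = frame - ev.time;
  if (offset < 0 || offset >= ev.duration) return null;
  return ev.frames[offset];
}
```

A renderer would call `tagPosition` once per displayed frame, drawing the tag only while a non-null position is returned.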
- Visible events can be provided in a number of different forms, such as:
- Actions can be instructions given by the user that cause something to be executed.
- The Actions that the user may wish to carry out include: 1. Add to wish list;
- Actions may require additional information, either from the viewer's preferences or from live input from the viewer. Examples to help illustrate could be:
- The action sets are extendable and so the above represents only a selection of the options that are available. For example, playback of media on a device can be extended to allow an Electronic Program Guide and other TV menus for settings and IPTV apps to be provided on the secondary device, so that some control of a main viewing device can be effected using a secondary device such as a tablet or phone for example.
- Browser plugins can be provided to further extend the functionality of a web browser so that content providers can allow viewers to interact with online streamed video from a standard web page, viewed on their computer, in the same way as they would if they were watching a TV program or streamed content on their IPTV.
- Third party plugins for professional video editing software, for adding events onto a video's timeline for the owner's layer and for publishing this to server 117, are provided according to an example. That is, professional production and broadcast companies may wish to incorporate the creation of layers into their editing workflows. To achieve this, Editing Suite plugins are provided to extend the functionality of the user's applications.
- FIG. 4 is a schematic block diagram of a system according to an example.
- A client device 111 in the example of figure 4 is a set-top box connected to a television apparatus 401.
- The device 111 is internet-enabled, and is connected to the internet 403.
- Device 111 receives content 103 from content provider 101 over the delivery network 109, which in the example of figure 4 is a cable television network.
- Control module 113 of device 111 decodes metadata 107 from the content 103 and determines actions 201 relevant to the content being displayed on television 401.
- The actions are sent to server 117 over the internet 403, where relevant interactive elements data is determined or generated and sent back to device 111 via the internet 403.
- Device 205 is registered at the server 117 as a user device.
- Device 111 is also registered at server 117.
- A user preference file 405 stored on server 117 provides data in the form of a profile for the user of the system of figure 4, which sets out the user's preferences in relation to their registered devices 111, 205 and the way in which each device should handle interactive content received from server 117 in connection with content 103.
- Preference file 405 provides that device 205 can receive interactive elements data 301 from server 117. Accordingly, following receipt of actions 201 and determination of suitable interactive content, server 117 can send data 301 to device 205 over the internet 403. As device 205 is registered with server 117, a communicative coupling is established between it and the server 117.
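The way server 117 might consult a user preference file 405 when deciding which registered device receives interactive elements data 301 can be sketched as below; the preference structure and device identifiers are assumptions for illustration only.

```javascript
// Hypothetical preference file 405: per-device flags stating whether each
// registered device should receive interactive elements data 301.
const preferenceFile = {
  userId: "user-1", // assumed user identifier
  devices: {
    "111": { receiveInteractive: false },
    "205": { receiveInteractive: true },
  },
};

// Device IDs that should be sent interactive elements data, per the file.
function targetsFor(prefs) {
  return Object.keys(prefs.devices).filter(
    (id) => prefs.devices[id].receiveInteractive
  );
}
```

Sending data 301 "in preference to or in addition to" device 111 then reduces to which flags are set in the file.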
- Data 301 can be sent to device 205 in preference to, or in addition to, device 111.
- Different data 301 can be sent to the devices 111, 205.
- Data 301 sent to device 111 can provide first interactive or passive elements for display on television 401.
- Data 301 sent to device 205 can provide second interactive or passive elements for display using a display 407 of device 205.
- Control data representing an action performed in connection with an interactive element displayed on the device can be sent to server 117 via internet 403.
- The interaction may prompt further interactive elements to be sent to a device, such as in response to a user selection for example.
- An interactive element can include a button to change an aspect of the television 401 such as channel, volume etc.
- The server can send corresponding control data to device 111 in order to effect the desired change.
- FIG. 5 is a schematic block diagram of a client device 500 according to an example.
- A digital media processor 501 can decode data received from a content provider.
- Processor 501 can decode a video datastream for display, such as a datastream received via a tuner 503 for example.
- A control module 505 is communicatively coupled to processor 501 and operable to decode metadata from a video datastream. Alternatively, the control module can form part of processor 501.
- An Ethernet port 507 can be provided in communication with processor 501 so that data can be sent and received over the internet or another network for example.
- A wireless interface 509 can provide wireless radio-frequency network communication. Elements 507, 509 can be communications modules for the device 500.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Databases & Information Systems (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
A control system for an internet-enabled television apparatus comprises: a tuner module to receive a data signal including content payload and metadata; a control module to decode the metadata from the data signal; a communications module to communicate with a server over a communications network, issue action data to the server on the basis of instructions in the metadata, and receive interactive elements data for the content payload; and a digital media processor to use the interactive elements data to provide a displaying device with interactive content.
Description
SYSTEM FOR PROVIDING INTERACTIVE CONTENT TO AN INTERNET-ENABLED TELEVISION APPARATUS
BACKGROUND
Broadcasters are typically concerned with the engagement and retention of viewers and in building ratings. To this end, they can use tried and tested formulae for engaging an audience, such as phone voting for example - initially with live person call handlers counting votes, and then progressively taking advantage of other technologies: from key pad controlled telephone menu systems and fully automated Interactive Voice Recognition to SMS, online services and dedicated applications ("apps") for viewers' phones or tablet devices for example.
Simultaneously, users are expecting to be able to interact to a greater degree with the content that they consume, and are typically more engaged with content where some degree of interaction is provided.
To this end, the proliferation of high speed broadband connections, internet enabled TVs, phones and tablets equipped with high resolution screens, camera tracking technologies and the huge growth in internet video production is resulting in a convergence into interactive and immersive viewing experiences for viewers of both traditional viewing technologies such as terrestrial, cable and satellite, as well as newer age broadcasting from internet or cloud based services, which can be leveraged by broadcasters.
Dedicated apps for shows that engage viewers with voting have already started to appear. For example, one such app works by selling viewers a number of voting credits and allowing them to cast their vote using the app on their smart phone. Another example of an interactive app is one which allows viewers of certain entertainment shows to play along with the contestants.
Interactive services which are provided through satellite, cable or some terrestrial systems are known. For example, many advertisements give viewers the option to press a button on their remote control in order to engage with the company behind the advertisement. Once engaged, viewers are then able to give personal details, book a service, arrange a call or order products for example.
Such techniques are limited however. For example, systems can be restricted to live shows, thereby reducing an available voting window. There is also no real opportunity for giving participants feedback. Dedicated apps can improve interaction between the show and the viewer, but then both programme makers and viewers have to contend with creating or downloading apps for different platforms and shows.

SUMMARY
According to an example, there is provided a control system for an internet- enabled television apparatus, comprising: a first client device including: a tuner module to receive a data signal including content payload and metadata; a control module to extract the metadata from the received data signal; a communications module to: receive metadata from the control module; communicate with a server over a communications network; issue action data to the server on the basis of instructions in, or associated with, the metadata; said communications module or a client device being adapted to receive interactive elements data for the content payload from the server based on the action data; and a digital media processor located on said first and/or second client device to use the interactive elements data to provide a displaying device with interactive content. The control module can be further operable to use the metadata to determine a
metadata ID, the communications module to communicate the metadata ID to the server. The server can use the metadata ID to provide extended metadata for the processor. The control module can decode the extended metadata. The metadata can include multiple metadata layers representing respective different interactive actions, and can include an ID to map to extended metadata layers for the television apparatus.
According to an example, there is provided a system for providing interactive content at a device, comprising: a client device to receive video data representing video content; a control module of the client device to extract metadata from the received video data; a server to receive control data derived from, or associated with, the metadata extracted by the control module, from the client device, and to send interactive data to the said device and/or another client device on the basis of the control data; a digital media processor located on one or both client devices to use the interactive data to provide the device(s) with interactive content. The control module can fetch extended metadata from the server for decoding. The server can receive an ID tag derived from the metadata from the client device and to locate control data to send interactive data to the device on the basis of the control data. According to an example, there is provided a method comprising: receiving a signal including content payload and metadata at a client device; extracting the received metadata; providing, to a server, data representing a set of actions for interactive data elements to be consumed using a client device on the basis of instructions in, or associated with, the metadata; providing by the server said set of interactive data elements for the said client device or other client device; and maintaining synchronization of interactive data elements displayed using a client device(s) with the content to be consumed. The metadata can include multiple layers for respective ones of multiple stakeholders in the content. The set of actions for
interactive elements can be derived directly from the metadata. The set of actions for interactive elements can be determined by mapping data representing an ID embedded in the metadata to multiple metadata items.
According to an example, there is provided a control module to use metadata having multiple different layers associated therewith, representing respective interactive data elements for video content, and to determine points of the content to be augmented or enhanced with the interactive content. The control module can include a communications module to communicate with a server to provide interactive content for a layer on the basis of a request from the control module. The control module can decode data representing an identifier from the metadata.
BRIEF DESCRIPTION OF THE DRAWINGS
An embodiment of the invention will now be described, by way of example only, and with reference to the accompanying drawings, in which: Figure 1 is a schematic block diagram of a system according to an example;
Figure 2 is a schematic state diagram of a method according to an example;
Figure 3 is a schematic block diagram of a method according to an example;
Figure 4 is a schematic block diagram of a system according to an example; and
Figure 5 is a schematic block diagram of a client device 500 according to an example.
DETAILED DESCRIPTION
Interactive television is an interactive audio/video delivery medium which provides broadcast audiovisual content to a number of subscribers. Interactive television provides broadcast video and audio to users and also provides a return path for the user to interact with the content, e.g., to make selections or order desired products, etc.
According to an example, a system for providing interactive content uses metadata associated with the video content in order to designate points and regions of the video content to be augmented or enhanced with interactive content. The interactive content can be provided from a server. Alternatively a pointer to some other online content that a content provider wishes to serve to the end user can be provided. For example, the pointer can be provided on the server. Accordingly, event markers can be added along a timeline within the video content or an audio stream of media content. Event markers can be layered, so that more than one event can be triggered at the same point along the timeline. Respective event markers can include positioning and behavior data, such as for any visual cues that are to be displayed over the broadcast at each frame along the timeline. In an example, an event marker or flag can indicate the position of an advert break, a special offer for a product in a scene, a call to vote, a survey about a topic discussed on the program or a decision point in an interactive program. Other alternatives are possible, as will be appreciated by those skilled in the art.
According to an example, a system for providing interactive content can include a control system which is part of a set-top box, internet enabled television apparatus or other device capable of receiving video data for display. The same system can also provide interactive content for online video or audio broadcasts by various means, including a media player for web broadcasters which can be embedded in a web page and used to play media. Browser plugins can allow the system to provide interactive content to third party media players that online broadcasters can embed within web pages. In addition, a media player can be installed on a viewer's computer, tablet or smart phone. Within the system, certain aspects provide that event markers can be detected by a codec and acted upon, as will be explained below. Such a codec can be provided in a number of different forms suitable for the hardware being used to receive the media and the operating environment used to control the hardware. For example, the HbbTV standard is a European led standard for IPTV manufacturers to adhere to. The platform is Javascript, CSS3 and HTML 5 compliant. Therefore, in an example, a codec can be written in Javascript. The codec can then either be incorporated into a device's onboard firmware or, if provision is made for the HbbTV compliant device to retrieve instructions from an external IP address, the codec can detect event markers from an external server without having to embed the codec onboard the device itself.
Similar approaches may be taken with other standards or proprietary platforms developed by different hardware manufacturers. These may require the codec to be developed using different programming languages if it is to be embedded into a device's firmware.
In an example, the codec can also exist embedded within a bespoke javascript or json media player library that online broadcasters can embed within their web pages, thereby allowing them to hook into interactive servers for example. It can also exist as a plugin to a browser, which a user can choose to install when they visit a site for example. This approach can be used when the broadcaster has reliance upon some other third party media player on their website, thereby providing an easy way for them to migrate to a new system, such that the browser plugin can be used to enhance the features of the current media players on their website for example.
In an example, the codec may also exist in a form which permits it to be installed on a computer, tablet computer or smart phone for enhancing the features of any media players found on those devices, as well as a means of monitoring events of any media player software or app that can be provided for users to run on their device.
A codec according to an example can capture when a viewer is watching or listening to a piece of media and identify when an event marker or flag is reached or will be reached.
In an example, a codec can be used in multiple modes, such as in one of three modes. The first mode is where metadata for a main event marker timeline can be stored on a central server. In this example, a media file being broadcast includes a unique identifier. The codec reads the identifier and keeps the event marker timeline being retrieved from the server synchronized with the location within the broadcast where the user is. The codec will maintain synchronization irrespective of whether the broadcast is live, streamed on demand or recorded - either on a PVR type device or cloud based PVR - paused (live TV, streamed or recorded playback), fast forwarded or rewound etc.
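The synchronization property of this first mode can be illustrated with a sketch: deriving the active events from the reported playback position, rather than from wall-clock time, means pause, rewind and fast-forward are handled with no extra logic. The timeline entry fields are assumptions.

```javascript
// Sketch: events from the server-held marker timeline that are active at
// the user's current playback position. Because the position is the only
// input, live, recorded, paused or rewound playback all behave alike.
function activeEvents(timeline, playbackPosition) {
  return timeline.filter(
    (ev) =>
      playbackPosition >= ev.time &&
      playbackPosition < ev.time + ev.duration
  );
}
```

A codec in this mode would report the current position periodically and render whatever `activeEvents` returns.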
A second mode is where broadcasted media contains event markers in a self-contained metadata package that is broadcast with the media whilst viewing live, streaming on demand or playing back recorded material. A third mode is where a codec receives metadata from existing video or audio standards and uses this to enhance the media with interactive content which is provided from a server. This provides broadcasters with means to migrate without having to rework back catalogues of media over a short timeframe.
In an example, event markers can include event codes which can be sent via an Internet connection to a server along with a viewer ID that can be used and stored on the device a user is watching or otherwise using.
Such event codes can be used to receive instructions from the server and send the instructions back to any device that the viewer has registered for the purposes of the interactive system. Accordingly, multiple devices can be maintained in sync with what a user is watching. In an example, instructions received from the server on a device could execute a streamed advert from an online source or display a graphic overlaid on a display screen, such as for information or for a request for the viewer to interact. In one configuration, a viewer can interact using a television (or similar media consumption device) remote control. Alternatively (or in combination), a smart phone or tablet device can be used.
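An upstream message carrying the event code together with the stored viewer ID, as described above, might look like the following; the JSON wire format and field names are assumptions rather than anything specified in the document.

```javascript
// Assumed wire format for the upstream message: the event code reached on
// the timeline plus the viewer ID stored on the device.
function buildUpstreamMessage(eventCode, viewerId) {
  return JSON.stringify({ eventCode, viewerId });
}
```

The server would use the viewer ID to look up every device the viewer has registered and push the resulting instructions to each of them.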
According to an example, there is provided a system which includes:
1. A control system including a control module as part of an apparatus such as a set-top box or television;
2. A metadata layer controller;
3. A server, such as a cloud based control server;
4. Device independent interaction applications, such as HTML 5 compatible applications;
5. Optional device specific interaction applications;
6. A cloud based editor tool for adding interactive layers to video content;
7. Browser plugins for HTML 5 compatible browsers;
8. Third party plugins for professional video editing software for adding events onto a video's timeline, for the owner's layer and for publishing this to the IMES server;
9. Plugin control software to extend functionality of the most popular PC or Mac based video players;
10. Compatible browsers.
According to an example, an internet enabled television apparatus includes a television or a set-top box or other similar device to provide a signal to a television.
Figure 1 is a schematic block diagram of a system according to an example. A content provider 101, such as a broadcaster for example, produces content data 103. Content data 103, which can include content such as television programs for example, includes payload data 105, which can be video data, and metadata 107. The content data 103 is sent, or broadcast, over a delivery network 109. Depending on the nature of the content provider 101, the delivery network can include terrestrial analogue or digital television broadcasting, or streamed or downloaded content provided over the internet. In an example, both payload data 105 and metadata 107 are sent using the same delivery network. However, it will be appreciated that one may be sent using a different delivery network or mechanism than the other. For example, payload data 105 may be sent over a terrestrial digital channel with metadata 107 being sent 'out of band', such as over the internet for example. Whichever mechanism is used, the content data 103 is received by a client device 111. The client device 111 can be a television, such as an internet enabled television (IPTV), a set top box or other apparatus capable of receiving and decoding a signal received from at least the delivery network 109. In an example, the client device 111 is an internet enabled television apparatus or internet enabled set-top box apparatus. Accordingly, a broadcast can be provided over the internet as a streamed broadcast to the IPTV. The client device 111 includes a control module 113. In an example, control module 113 can read metadata 107 from content 103. For example, the control module 113 can include a metadata control layer to decode metadata 107 and to perform appropriate actions based upon the instructions or data held in the metadata 107. In an example, the control module 113 can determine a current timeline/viewing status of programs being viewed using the client device 111. For example, when the client device 111 is powered on, the control module 113 can extract metadata and current timeline/viewing status from programs being watched, either through a connection to the internet or using a tuner or PVR recording and playback device. In an example, metadata supplied within the content 103 may be a simple ID number which the control module 113 passes back to the server 117. The ID can map to data representing one or more extended metadata layers, and the server 117 can return such extended metadata layers to the client device 111 (and any additional devices). That is, metadata layers can either be wholly included in payload or can be referenced within the payload so that detailed metadata can be retrieved from an online source. Accordingly, a metadata ID can be mapped to a location or pointer on server 117 so that interactive content can be located at, and provided by, a location remote from the server 117. The provision of remote content can be executed directly from the remote location to a device 111, or via server 117 for example. In an example, control data 115 passes between the control module 113 and a server 117. For example, the control data 115 can be communicated over a network such as the internet.
The control data 115 includes two components (not shown) - a component sent upstream to server 117, and a downstream component received from server 117. The distinction will become apparent below.
Typically, metadata for video content 103 includes a number of pre-defined elements representing specific attributes of the content, with each element capable of having one or more values. According to an example, metadata 107 is standard metadata suitable for content 103, and which is typically extendable metadata. Accordingly, a tag can be added to the metadata in order to leverage the standard metadata functionality that is found in most widely used terrestrial/digital broadcasted video formats and downloaded or streamed online video.
In an example, this can be achieved by adding a tag 106 to the extendable metadata tags attached to content data 103. The tag 106 can be extracted from the metadata 107 and passed back to the server 117 by the control module 113 on the client device 111. In an example, tag 106 includes a representation of a set of layers for the content payload 105, in order to provide the client device 111 and server 117 with markers to enable execution of events and to serve as control points for the provision and execution of interactive elements for the content being consumed.
Figure 2 is a schematic state diagram of a method according to an example. Certain actions are shown as being performed contemporaneously or sequentially. However, it should be noted that the chain of events depicted in figure 2 provides only a high level view, and that certain actions may or may not be performed simultaneously with others, and that the relative order may be different. A network 203 is depicted. Communications between various elements can be executed using the network 203, which can be a local network, GPRS or other mobile communications network or the internet for example.
A content provider 101 provides content 103 to a client device 111. A control module 113 of the client device extracts metadata 107 from the content 103. In an example, the metadata can be sent to server 117 in an unprocessed form. Alternatively, the metadata 107 can be decoded by the control module 113 in order to determine a set of actions 201 from a tag 106. The actions 201 are sent to server 117 and represent events for the client device 111. Such events can be interactive events in which a user of the client device 111 interacts with content 103 to provide a more immersive experience. In an example, actions 201 include data representing a time for an interactive event and an identification of the device from which the actions 201 have been sent. Actions 201 can also include data representing content 103, such as an identifier for the content, and other data including an event duration for example. As noted above, in an example metadata 107 can contain all the actions and work as described above, or it can hold an ID which can be sent to the server 117 so that full extended metadata layers can be sent to the client device 111 and any additional devices 205 for processing by control module 113. Data representing an event received at the server 117 is processed in order to determine the nature of the event and the required data which needs to be fed to the client device in order to execute the event. There can be further two-way communication if an event triggers further requirements for data. Accordingly, event data can be passed to the client device 111 from server 117, and can include data for an interactive event including audio and/or visual content and data representing one or more interactive elements such as buttons, scroll bars etc. Event data returned from the client device 111 to server 117 can include data representing an interaction, which includes data generated in response to a user interaction with one or more of the interactive elements for example.
Control data is fed between the client device 111 and the server 117 to maintain synchronization between an event and the timeline of the content 103 being consumed. For example, control data 115 can be used to ensure that an event is executed at a specific time, or that an event which is being executed (inasmuch as a user may be interacting with content in some way, for example) continues to make sense or be relevant vis-a-vis the related content 103.
An additional device 205 can be used according to an example. Such a device can be a smart phone, tablet device or other suitable apparatus capable of displaying information to a user. For example, device 205 can include televisions, PCs, remote controls and camera-based tracking and control systems which may be built into these devices.
In an example, additional device 205 is an internet-enabled device which is communicatively coupled to network 203. The device 205 can register itself with server 117 in order to index itself as a member of the environment of figure 2. For example, the device 205 can send registration data to server 117. The server can register the device and send a confirmation, although this need not be required. In an example, registration data can include an identifier which uniquely identifies the additional device 205 in the environment. The registration can include data representing the capabilities of the device 205, such as the device's internet connectivity options, display options etc., all of which can be used to determine suitable actions which can be executed by or on the device 205. According to an example, common interactions with a device 111, 205, such as via the interactive elements described above for example, can be handled by the firmware of the device in question, as is typical. Bespoke interactions, specific to the content being consumed, are passed back to the server 117 where details of the interactions are validated before instructions are passed back to a control module of the device in question, thereby providing it with the data required to display and process the bespoke interactions. As well as being pushed to the server via control module 113, interactions may also be pulled down or pushed down from the server 117 to client device 111 or additional devices 205 in an example.
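A registration message carrying the device identifier and capability data described above might look like the following. The schema and field names are assumptions for illustration, not defined by the specification.

```javascript
// Illustrative registration message an additional device 205 might send to
// server 117 on joining the environment. The shape is hypothetical.
function registerDevice(uniqueId, capabilities) {
  return {
    type: "register",
    deviceId: uniqueId, // uniquely identifies the device in the environment
    capabilities,       // e.g. internet connectivity options and display options
  };
}

const reg = registerDevice("tablet-7f3a", {
  connectivity: ["wifi"],
  display: { width: 1280, height: 800, touch: true },
});
```

The server could use the declared capabilities to determine which actions are suitable for execution by or on the device.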
Events can include executing a streamed advert from an online source, or displaying a graphic overlaid on a display screen, either for information or as a request for the viewer to interact, for example. A user can interact using a remote control, smart phone or tablet device for example.
According to an example, there are multiple layers of events that can be represented in one or more metadata tags 106 for content 103. For example, there can be a Production Company Layer which is reserved for the company that produced the content 103. Multiple Broadcaster Layers can be owned by whoever airs or streams the content 103. There can be multiple Advertiser Layers, as well as Social Network Layers which can be owned by any number of Social Networks. In an example, only viewer-preferred Social Networks will be active. None of the layers are mandatory, so it is possible for Broadcasters to embed their own Broadcast Layer into media content even if the Production Company has not set up their reserved layer. This ensures broadcasters can enhance older content without the need for Production Companies to register layers on old shows. In an example, a tag 106 can therefore include data for multiple layers. Each layer can be demarcated from the others by specific identifiers, such as an identification code or name. Accordingly, the tag can include layer metadata which provides a structure for layers. Payload data for layers can include information specific to each layer. For example, for each layer there can be data representing actions which correspond to interactive elements and portions of content 103. Alternatively, separate layers can be embodied in separate tags in metadata 107.
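A tag 106 carrying multiple demarcated layers might be structured as follows. The JSON shape, layer identifiers and action fields are illustrative assumptions only.

```javascript
// Hypothetical layered tag 106: each layer is demarcated by an identifier
// and carries its own payload of actions for interactive elements.
const tag = {
  layers: [
    { id: "production",  actions: [{ at: 30,  element: "info-popup" }] },
    { id: "broadcaster", actions: [{ at: 95,  element: "vote-button" }] },
    { id: "advertiser",  actions: [{ at: 300, element: "shop-overlay" }] },
  ],
};

// Only layers the viewer has enabled (e.g. preferred social networks)
// would be active; others are carried in the tag but ignored.
function activeLayers(tag, enabledIds) {
  return tag.layers.filter((layer) => enabledIds.includes(layer.id));
}

const active = activeLayers(tag, ["broadcaster"]);
```

Separate layers could equally be carried in separate tags, as the description notes.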
Figure 3 is a schematic block diagram of a method according to an example. As described above, content 103 can include payload 105 and metadata 107. A control module 113 of a client device 111 can decode the metadata 107 in order to provide a specific tag 106. The tag 106 provides data for interactive elements for the content 103. According to an example, tag 106 can include data for multiple layers. A layer can be owned or otherwise managed by a certain stakeholder such as a broadcaster or content provider 101, an advertiser, a social network provider etc. For each layer, payload data can be provided representing interactive elements which the stakeholder of the layer in question would like to present to a user who is consuming the content which has spawned the elements.
In an example, actions 201 can be extracted from tag 106 and provided to server 117. The server 117 can process the actions in order to determine suitable interactive data elements 301 to provide for a device 111, 205. Accordingly, multiple actions for different layers can be provided to server 117. Each layer can be distinguished from others using a layer identification 303 which can be mapped to specific sets of interactive elements. These can be selected either in connection with some characteristics of the layer owner or stakeholder, or can be provided by the owner or stakeholder. The relevant interactive data elements 301 can then be provided to devices 111, 205. Control data 305 flowing between devices 111, 205 and server 117 provides data which enables the timely and relevant provision of interactive elements as described above. For example, control data can relate to timing and duration of certain interactive elements. Further, control data can relate to certain actions performed in connection with an interactive element. For example, an interactive button can be pressed, and data representing the interaction can be part of the control data which flows between devices 111, 205 and server 117.
According to an example, server 117 is thus operable to pass interactions or instructions received from a first device of the devices 111, 205 to the other in order to update the software running on the other device to progress the interaction, or to simply remotely control the other device from the first device, instead of using a remote control for example.
As described above, metadata 107 can comprise an ID for use by server 117. That is, an ID can map to data representing one or more extended metadata layers, and the server 117 can return such extended metadata layers to a client device 111, 205 of figure 3 for example. Accordingly, a metadata ID can be mapped to a location or pointer on server 117 so that interactive content can be located from and provided by a location remote from the server 117. The provision of remote content can be executed directly from the remote location to a device 111, or via server 117 for example. In other examples, control data can provide for: linking a user to friends in their social network, either using a proprietary social network of existing users or one or more social networks linked to by the user; passing interactive content from layer owners' servers or pointing the user's devices to interactive content via inbuilt device firmware, a device-independent interaction application or a device-specific application; passing data and interactions from a user back to a relevant service or layer owner; and recording 'Click Throughs' on adverts or other paid interactive events that appear on the content's timeline.
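The server-side mapping from a metadata ID to extended metadata layers (or a pointer to a remote location) can be sketched as a simple lookup. The table, ID format and field names are assumptions for illustration.

```javascript
// Hypothetical server-side table mapping a metadata ID (carried in
// metadata 107) to extended metadata layers and a remote location.
const extendedMetadata = {
  "meta-001": {
    layers: ["production", "advertiser"],
    // Pointer to a location remote from server 117; illustrative URL.
    source: "https://layers.example.com/meta-001",
  },
};

// Resolve an ID received from a client device; returns null if unknown.
function resolveMetadataId(id) {
  return extendedMetadata[id] || null;
}
```

On a hit, the extended layers could be returned to the client directly or fetched from the remote location via the server.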
In an example, server 117 can be used to manage online bidding for advertisers to secure interactive ads around events or the advertising layer, or to add sponsorship or attach more traditional ads to the video material, as well as managing payment for services rendered.
According to an example, a device-independent interaction application (app) can be provided. The app can be written using HTML5, combined with JavaScript and JSON, to provide a cross-platform interactive user interface that can run on any HTML5, W3C-compliant browser. When opened on one of a user's devices, the server 117 can send the app details of what the user is consuming. Where more than one registered device is being used to view a broadcast TV program or a streamed/downloaded video for example, the app can provide the user with the option of which TV or device to sync to. Accordingly, the app can provide the viewer with extended information connected to the program or video that they are watching. This information can be viewed immediately or queued for a later time.
The information provided may be simple extended information provided by the program maker or some other layer owner, or it may be in the form of micro sites provided by the program maker or advertiser to provide a rich interactive engagement or shopping experience. In an example, the app can also act as a social portal where the user can interact with family and friends about the program or offer they are looking at.
Extended media can be additional video footage served from the internet by the layer owner or server 117. Additional footage may also include interactive events, and therefore the app also has video player capabilities according to an example, which will identify if a metadata tag is present and pass this back to the server 117 in order to retrieve additional interactive content. In an example, a Layer Editor is provided in the form of a Web App designed to give publishers and content providers the means to add events onto a metadata event timeline. A drag and drop interface is provided to allow a user to import a video file they wish to add a layer to, select the type of event they wish to insert, and then drag this onto the video's timeline.
An event timeline can have multiple dimension elements - for example, Time, Duration, X and Y co-ordinates, and behaviors of the event. The Time element represents the start position on the video's timeline, the Duration allows the user to determine how long the event will remain active/visible, and the X and Y co-ordinates represent the position on the video frame. The behaviors help provide extendable actions to control the way in which the events appear on screen, how they may animate, fade in and out, etc. An event may stretch across a number of video frames on the timeline, as dictated by the duration. Each event can have a different X and Y co-ordinate for each frame within the Duration of the event. This gives the flexibility to generate moving visible event tags that follow an object on the TV, PC, Smart Phone or Tablet being used to watch a video that supports the IMES service.
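An event with these dimension elements and per-frame co-ordinates might be represented as follows. The JSON structure and helper function are illustrative assumptions based on the description, not a defined format.

```javascript
// Hypothetical timeline event with Time, Duration, a behaviour, and one
// X/Y co-ordinate per frame so a visible tag can follow an object on screen.
const event = {
  time: 12.0,           // start position on the video's timeline (seconds)
  duration: 3.0,        // how long the event remains active/visible
  behaviour: "fade-in", // extendable on-screen behaviour
  positions: [          // per-frame positions within the event's duration
    { frame: 0, x: 100, y: 220 },
    { frame: 1, x: 104, y: 218 },
    { frame: 2, x: 108, y: 216 },
  ],
};

// Look up where the event's visible tag should be drawn for a given frame;
// returns null if the frame falls outside the recorded positions.
function positionAtFrame(event, frame) {
  const p = event.positions.find((pos) => pos.frame === frame);
  return p ? { x: p.x, y: p.y } : null;
}
```

A renderer could call this per frame so the tag tracks the object across the event's duration.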
In an example, visible events can be provided in a number of different forms, such as:
1. Semi-transparent pop-ups that contain images, actions and links to extended content;
2. Visible transparent glowing (ghostly) vector outlines over an object or person that the event is attached to, which when pressed can execute actions or open up a link for example;
3. Simple text web link tags; and
4. Graphical images, both animated and not.

Actions can be instructions given by the user that execute something. The Actions that the user may wish to carry out include:
1. Add to wish list;
2. Review later;
3. Give me more info;
4. Visit website;
5. Share with friends (using one or more social networks);
6. Email details; and
7. Pause transmission/video (fast forward, rewind).
Actions may require additional information, either from the viewer's preferences or from live input from the viewer. For example, for 'Visit website' the viewer may have set the default behaviour to one of:
1. "Interrupt TV viewing, pause if possible and show website on my internet-enabled television";
2. "Don't interrupt TV viewing and show website on my tablet/smart phone"; or
3. "Pause TV if possible and show website on my tablet/smart phone".
The action sets are extendible and so the above represents only a selection of the options that are available. For example, playback of media on a device can be extended to allow an Electronic Program Guide and other TV menus for settings and IPTV apps to be provided on the secondary device, so that some control of a main viewing device can be effected using a secondary device such as a tablet or phone for example. When an Action is triggered by the viewer, Action data is sent back to server 117 where it is processed and appropriate instructions are sent to the relevant registered devices.
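The routing of a triggered Action according to a stored viewer preference, as in the 'Visit website' options above, might be sketched as follows. The preference keys and return shape are hypothetical.

```javascript
// Hypothetical server-side routing of a triggered Action based on the
// viewer's stored default behaviour for that Action type.
function routeAction(action, preference) {
  switch (preference) {
    case "pause-and-show-on-tv":        // interrupt viewing, show on the TV
      return { pauseTv: true, target: "television" };
    case "show-on-companion":           // don't interrupt, show on tablet/phone
      return { pauseTv: false, target: "companion" };
    case "pause-and-show-on-companion": // pause TV, show on tablet/phone
      return { pauseTv: true, target: "companion" };
    default:                            // fall back to showing on the TV
      return { pauseTv: false, target: "television" };
  }
}
```

The result could then drive the instructions sent to the relevant registered devices.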
In an example, browser plugins can be provided to further extend the functionality of a web browser so that content providers can allow viewers to interact with online streamed video from a standard web page, viewed on their computer, in the same way as they would if they were watching a TV program or streamed content on their IPTV. In addition, third-party plugins for professional video editing software, for adding events onto a video's timeline for the owner's layer and for publishing this to server 117, are provided according to an example. That is, professional production and broadcast companies may wish to incorporate the creation of layers into their editing workflows. To achieve this, Editing Suite plugins are provided to extend the functionality of the users' applications.
Figure 4 is a schematic block diagram of a system according to an example. A client device 111 in the example of figure 4 is a set-top box connected to a television apparatus 401. The device 111 is internet-enabled, and is connected to the internet 403. Device 111 receives content 103 from content provider 101 over the delivery network 109, which in the example of figure 4 is a cable television network. Control module 113 of device 111 decodes metadata 107 from the content 103 and determines actions 201 relevant to the content being displayed on television 401. The actions are sent to server 117 over the internet 403, where relevant interactive elements data is determined or generated and sent back to device 111 via the internet 403. In an example, device 205 is registered at the server 117 as a user device. Device 111 is also registered at server 117. A user preference file 405 stored on server 117 provides data in the form of a profile for the user of the system of figure 4 which sets out the user's preferences in relation to their registered devices 111, 205 and the way in which each device should handle interactive content received from server 117 in connection with content 103. In the example of figure 4, preference file 405 provides that device 205 can receive interactive elements data 301 from server 117. Accordingly, following receipt of actions 201 and determination of suitable interactive content, server 117 can send data 301 to device 205 over the internet 403. As device 205 is registered with server 117, a communicative coupling is established between it and the server 117. In this connection, data 301 can be sent to device 205 in preference to, or in addition to, device 111. Alternatively, different data 301 can be sent to the devices 111, 205. For example, data 301 sent to device 111 can provide first interactive or passive elements for display on television 401, and data 301 sent to device 205 can provide second interactive or passive elements for display using a display 407 of device 205. In an example, when a user interacts with device 111 or 205, control data representing an action performed in connection with an interactive element displayed on the device can be sent to server 117 via internet 403. The interaction may prompt further interactive elements to be sent to a device, such as in response to a user selection for example. In a basic example, an interactive element can include a button to change an aspect of the television 401 such as channel, volume etc. Accordingly, the server can send corresponding control data to device 111 in order to effect the desired change.
Figure 5 is a schematic block diagram of a client device 500 according to an example. A digital media processor 501 can decode data received from a content provider. For example, processor 501 can decode a video datastream for display, such as a datastream received via a tuner 503 for example. A control module 505 is communicatively coupled to processor 501 and operable to decode metadata from a video datastream.
Alternatively, the control module can form part of processor 501. An Ethernet port 507 can be provided in communication with processor 501 so that data can be sent and received over the internet or another network for example. Similarly, a wireless interface 509 can provide wireless radio-frequency network communication. Elements 507, 509 can be communications modules for the device 500.
Claims
1. A control system for an internet-enabled television apparatus, comprising: a first client device including:
a tuner module to receive a data signal including content payload and metadata;
a control module to extract the metadata from the received data signal; a communications module to: receive metadata from the control module; communicate with a server over a communications network; issue action data to the server on the basis of instructions in, or associated with, the metadata;
said communications module or a second client device being adapted to receive interactive data elements for the content payload from the server based on the action data;
a digital media processor located on said first and/or second client device to use the interactive data elements to provide a displaying device with interactive content.
2. A control system as claimed in claim 1, the control module further to use the metadata to determine a metadata ID, the communications module to communicate the metadata ID to the server.
3. A control system as claimed in claim 2, the server to use the metadata ID to provide extended metadata for the processor.
4. A control system as claimed in claim 3, the control module to decode the extended metadata.
5. A control system as claimed in claim 1, wherein the metadata includes multiple metadata layers representing respective different interactive actions.
6. A control system as claimed in claim 1, wherein the metadata includes an ID to map to extended metadata layers for the television apparatus.
7. A system for providing interactive content at a device, comprising: a client device to receive video data representing video content; a control module of the client device to extract metadata from the received video data;
a server to receive control data derived from, or associated with, the metadata, extracted by the control module, from the client device, and to send interactive data to the said device and/or another client device on the basis of the control data;
a digital media processor located on one or both client devices to use the interactive data to provide the device(s) with interactive content.
8. A system as claimed in claim 7, the control module to fetch extended metadata from the server for decoding.
9. A system as claimed in claim 7 or 8, the server to receive an ID tag derived from the metadata from the client device and to locate control data to send interactive data to the device on the basis of the control data.
10. A method comprising:
receiving a signal including content payload and metadata at a client device;
extracting the received metadata; providing, to a server, data representing a set of actions for interactive data elements of content to be consumed using a client device on the basis of instructions in, or associated with, the metadata;
providing by the server, said set of interactive data elements for the said client device or other client device; and
maintaining synchronization of interactive data elements displayed using a client device(s) with the content to be consumed.
11. A method as claimed in claim 10, wherein the metadata includes multiple layers for respective ones of multiple stakeholders in the content.
12. A method as claimed in claim 10 or 11, wherein the set of actions for interactive elements are derived directly from the metadata.
13. A method as claimed in claim 10 or 11, wherein the set of actions for interactive elements are determined by mapping data representing an ID embedded in the metadata to multiple metadata items.
14. A system as claimed in claims 1 to 9 wherein the said control module is adapted to use metadata having multiple different layers associated therewith representing respective interactive data elements for video content, and to determine points of the content to be augmented or enhanced with the interactive content.
15. A system as claimed in claim 14, further comprising a communications module to communicate with a server to provide interactive content for a layer on the basis of a request from the control module.
16. A system as claimed in claim 14 or 15, the control module to decode data representing an identifier from the metadata.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1116598.2 | 2011-09-27 | ||
GB1116598.2A GB2495088B (en) | 2011-09-27 | 2011-09-27 | Interactive system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013045922A1 true WO2013045922A1 (en) | 2013-04-04 |
Family
ID=44993406
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/GB2012/052386 WO2013045922A1 (en) | 2011-09-27 | 2012-09-26 | System for providing interactive content to an internet - enabled television apparatus |
Country Status (2)
Country | Link |
---|---|
GB (1) | GB2495088B (en) |
WO (1) | WO2013045922A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104618376B (en) * | 2015-02-03 | 2019-08-20 | 华为技术有限公司 | Play method, server and the display device of media content |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100088734A1 (en) * | 2008-10-08 | 2010-04-08 | Yoshiharu Dewa | Reception apparatus, reception method, and server apparatus |
EP2192776A1 (en) * | 2008-12-01 | 2010-06-02 | Samsung Electronics Co., Ltd. | Display apparatus and display method |
EP2262261A2 (en) * | 2009-06-11 | 2010-12-15 | LG Electronics Inc. | Mobile terminal, method of participating in interactive service therein, internet protocol television terminal and communication system including the same |
US20110126252A1 (en) * | 2009-11-20 | 2011-05-26 | At&T Intellectual Property I, L.P. | Method and apparatus for presenting media programs |
US20110154200A1 (en) * | 2009-12-23 | 2011-06-23 | Apple Inc. | Enhancing Media Content with Content-Aware Resources |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020049832A1 (en) * | 1996-03-08 | 2002-04-25 | Craig Ullman | Enhanced video programming system and method for incorporating and displaying retrieved integrated internet information segments |
US20020162117A1 (en) * | 2001-04-26 | 2002-10-31 | Martin Pearson | System and method for broadcast-synchronized interactive content interrelated to broadcast content |
US9826197B2 (en) * | 2007-01-12 | 2017-11-21 | Activevideo Networks, Inc. | Providing television broadcasts over a managed network and interactive content over an unmanaged network to a client device |
US20080201736A1 (en) * | 2007-01-12 | 2008-08-21 | Ictv, Inc. | Using Triggers with Video for Interactive Content Identification |
- 2011-09-27: GB application GB1116598.2A, patent GB2495088B (not_active, Expired - Fee Related)
- 2012-09-26: WO application PCT/GB2012/052386, patent WO2013045922A1 (active, Application Filing)
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107431843A (en) * | 2015-04-01 | 2017-12-01 | 三星电子株式会社 | Method and apparatus for the communication between devices in multimedia system |
US11159844B2 (en) | 2015-04-01 | 2021-10-26 | Samsung Electronics Co., Ltd. | Method and device for communicating between devices in multimedia system |
US11606601B2 (en) | 2015-04-01 | 2023-03-14 | Samsung Electronics Co., Ltd. | Method and device for communicating between devices in multimedia system |
Also Published As
Publication number | Publication date |
---|---|
GB2495088A (en) | 2013-04-03 |
GB2495088B (en) | 2013-11-13 |
GB201116598D0 (en) | 2011-11-09 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12773103 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 12773103 Country of ref document: EP Kind code of ref document: A1 |