WO2018215732A1 - Audio and/or video receiving and transmitting apparatuses and methods

Info

Publication number
WO2018215732A1
Authority
WO
WIPO (PCT)
Prior art keywords
video
ancillary data
audio
data
update
Prior art date
Application number
PCT/GB2018/051097
Other languages
French (fr)
Inventor
Nigel Stuart Moore
Original Assignee
Sony Corporation
Sony Europe Limited
Priority date
Filing date
Publication date
Application filed by Sony Corporation, Sony Europe Limited
Publication of WO2018215732A1


Classifications

    • H04H 60/25: Arrangements for updating broadcast information or broadcast-related information
    • H04N 21/4782: Web browsing, e.g. WebTV
    • H04H 20/93: Arrangements characterised by the broadcast information itself which locates resources of other pieces of information, e.g. URL [Uniform Resource Locator]
    • H04H 60/13: Arrangements for device control affected by the broadcast information
    • H04H 60/73: Systems specially adapted for using specific information, e.g. geographical or meteorological information, using meta-information
    • H04N 21/235: Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • H04N 21/2362: Generation or processing of Service Information [SI]
    • H04N 21/4345: Extraction or processing of SI, e.g. extracting service information from an MPEG stream
    • H04N 21/435: Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
    • H04N 21/4394: Processing of audio elementary streams involving operations for analysing the audio stream, e.g. detecting features or characteristics in audio streams
    • H04N 21/44008: Processing of video elementary streams involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • H04N 21/4586: Content update operation triggered locally, e.g. by comparing the version of software modules in a DVB carousel to the version stored locally
    • H04N 21/4622: Retrieving content or additional data from different sources, e.g. from a broadcast channel and the Internet
    • H04N 21/4722: End-user interface for requesting additional data associated with the content
    • H04N 21/6112: Network physical structure; signal processing specially adapted to the downstream path of the transmission network, involving terrestrial transmission, e.g. DVB-T
    • H04N 21/6125: Network physical structure; signal processing specially adapted to the downstream path of the transmission network, involving transmission via Internet
    • H04N 21/8402: Generation or processing of descriptive data, e.g. content descriptors, involving a version number, e.g. version number of EPG data
    • H04N 21/8545: Content authoring for generating interactive applications
    • H04N 21/8586: Linking data to content, e.g. by linking a URL to a video object or by creating a hotspot, by using a URL

Abstract

A video and/or audio receiving apparatus comprising: receiver circuitry operable to receive a signal comprising video and/or audio data; network interface circuitry operable to send and receive data over a network; and processor circuitry operable: based on location information comprised within the video and/or audio data, to control the network interface circuitry to receive ancillary data associated with the video and/or audio data from a location on the network, the location information indicating the location on the network at which the ancillary data is located, to execute a predetermined process associated with the received ancillary data, and based on update information comprised within the video and/or audio data, to determine whether or not there is an update associated with the received ancillary data and, if it is determined that there is an update associated with the received ancillary data, to apply the update to the received ancillary data and to execute a predetermined process associated with the updated ancillary data.

Description

Audio and/or Video Receiving and Transmitting Apparatuses and Methods
BACKGROUND
Field of the Disclosure
The present invention relates to audio and/or video receiving and transmitting apparatuses and methods.
Description of the Related Art
The "background" description provided herein is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventors, to the extent it is described in the background section, as well as aspects of the description which may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present invention.

Systems such as hybrid broadcast broadband TV (HbbTV®) are aimed at harmonising broadcast, IPTV and broadband delivery of content to televisions (TVs) (such as smart TVs and set-top boxes (STBs)). These are examples of receiving apparatuses. In embodiments of the disclosure, they may comprise antennae or display screens, or antennae or display screens may be attachable to the apparatus. One feature of HbbTV is that it allows software applications associated with broadcast audio and/or video content to be executed in response to signalling included in the broadcast. This is known as application information table (AIT) signalling. The application information table (AIT) is, for example, in DVB (Digital Video Broadcasting) section format. However, on-line delivery of the AIT is often done as an extensible markup language (XML) file on the basis of which the software application is run (this may be referred to as an XMLAIT file).

In some cases, however, the AIT signalling is lost. This can happen when, for example, a non-HbbTV set-top box is connected to an HbbTV-enabled TV via, for example, HDMI (High-Definition Multimedia Interface). In this case, the TV cannot benefit from a signal to start and control an HbbTV software application since the TV does not know that the application exists (this is because the connection (for example, HDMI) does not carry the AIT signal). A wireless connection between TV and set-top box may also be configured, intentionally or unintentionally, not to carry the AIT signalling. In another example, the AIT signalling is lost when a TV channel is re-broadcast using a different technique (such as via a community antenna network (for example, a cable network, or a network within a block of apartments or a hotel where, for example, there is a common receiving antenna for a number of receiving ports for multiple receiving apparatuses)) which does not have the capability (intentionally or unintentionally) to transmit the AIT signalling. Other reasons may exist for removing the AIT signalling, whether intentionally or unintentionally. Without the AIT signalling, an HbbTV software application cannot be started or controlled, and thus the benefits of HbbTV are not realised in such a situation.
A solution to this problem is to include location information within the audio and/or video data (the location information indicating, for example, a location on a network such as the internet) so that the TV (once the audio and/or video signal is received) may still obtain the AIT (in the form of an XMLAIT, for example). The AIT can therefore still be obtained even though the AIT signalling is removed from the broadcast audio and/or video stream before it is received by the TV. A problem with this approach, however, is that the TV does not know when the AIT has been updated. One solution to this is for the TV to continually poll the server on which the AIT is stored so as to determine whether the AIT has been updated. This, however, may result in an unnecessarily high use of network bandwidth and also increases the processing requirements at both the TV and the server at which the AIT is stored.
There is therefore a need to overcome the above-mentioned problems.
SUMMARY
The present technique provides a video and/or audio receiving apparatus comprising: receiver circuitry operable to receive a signal comprising video and/or audio data; network interface circuitry operable to send and receive data over a network; and processor circuitry operable: based on location information comprised within the video and/or audio data, to control the network interface circuitry to receive ancillary data associated with the video and/or audio data from a location on the network, the location information indicating the location on the network at which the ancillary data is located, to execute a predetermined process associated with the received ancillary data, and based on update information comprised within the video and/or audio data, to determine whether or not there is an update associated with the received ancillary data and, if it is determined that there is an update associated with the received ancillary data, to apply the update to the received ancillary data and to execute a predetermined process associated with the updated ancillary data.
The present technique provides a video and/or audio transmitting apparatus comprising:
transmitter circuitry operable to transmit a signal comprising video and/or audio data to a video and/or audio receiving apparatus, the video and/or audio data comprising location information and update information, wherein based on the location information comprised within the video and/or audio data, the video and/or audio receiving apparatus is operable to receive ancillary data associated with the video and/or audio data from a location on a network, the location information indicating the location on the network at which the ancillary data is located, and to execute a predetermined process associated with the received ancillary data, and based on the update information comprised within the video and/or audio data, the video and/or audio receiving apparatus is operable to determine whether or not there is an update associated with the received ancillary data and, if it is determined that there is an update associated with the received ancillary data, to apply the update to the received ancillary data and to execute a predetermined process associated with the updated ancillary data.
The present technique provides a receiver comprising: first interface circuitry configured to receive a signal representing audio visual data; second interface circuitry configured to request and receive first ancillary data associated with the audio visual data and, based on a change to a property of the signal representing audio visual data, to request and receive second ancillary data; memory configured to store the first and second ancillary data; and processing circuitry configured to perform a process by executing code components stored in a non-transitory storage medium using the first ancillary data as an input instruction and to modify the process using the second ancillary data.
The present technique provides a video and/or audio receiving method comprising: receiving a signal comprising video and/or audio data; based on location information comprised within the video and/or audio data, receiving, over a network, ancillary data associated with the video and/or audio data from a location on the network, the location information indicating the location on the network at which the ancillary data is located; executing a predetermined process associated with the received ancillary data; and, based on update information comprised within the video and/or audio data, determining whether or not there is an update associated with the received ancillary data and, if it is determined that there is an update associated with the received ancillary data, applying the update to the received ancillary data and executing a predetermined process associated with the updated ancillary data.
The present technique provides a video and/or audio transmitting method comprising:
transmitting a signal comprising video and/or audio data to a video and/or audio receiving apparatus, the video and/or audio data comprising location information and update information, wherein based on the location information comprised within the video and/or audio data, the video and/or audio receiving apparatus is operable to receive ancillary data associated with the video and/or audio data from a location on a network, the location information indicating the location on the network at which the ancillary data is located, and to execute a predetermined process associated with the received ancillary data, and based on the update information comprised within the video and/or audio data, the video and/or audio receiving apparatus is operable to determine whether or not there is an update associated with the received ancillary data and, if it is determined that there is an update associated with the received ancillary data, to apply the update to the received ancillary data and to execute a predetermined process associated with the updated ancillary data.
The foregoing paragraphs have been provided by way of general introduction, and are not intended to limit the scope of the following claims. The described embodiments, together with further advantages, will be best understood by reference to the following detailed description taken in conjunction with the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
A more complete appreciation of the disclosure and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
Figure 1 schematically describes a video and/or audio receiving apparatus according to the present technique;
Figure 2 schematically describes a video and/or audio transmitting apparatus according to the present technique;
Figure 3 schematically describes a system according to the present technique;
Figures 4A-4F schematically describe a number of example data structures according to the present technique; and
Figures 5A and 5B show flow charts schematically illustrating processes according to the present technique.
DESCRIPTION OF THE EMBODIMENTS
Referring now to the drawings, wherein like reference numerals designate identical or corresponding parts throughout the several views.
Figure 1 shows a television receiver 100 (which is an example of a video and/or audio receiving apparatus) according to embodiments of the present disclosure. The television receiver 100 may be a set-top box receiving terrestrial television, cable television, satellite television or IPTV in a managed network or over-the-top in an unmanaged network, for example. Devices may be mains electricity or battery powered. The television receiver 100 may be integrated into a television display device or an optical disc reader such as a Blu-Ray or DVD player.
Alternatively, the television receiver may be integrated into a games console, Personal
Computer or any kind of suitable device. The television receiver may take the form of a software application that is executed on a hand-held device or on any of the aforementioned device types and stored locally or remotely in a data storage medium. An automotive vehicle may comprise a television receiver according to embodiments of the present disclosure. A handheld computing device or mobile telephone may comprise a television receiver according to embodiments of the present disclosure. An electronic device of any sort may comprise a video and/or audio receiving apparatus such as television receiver 100 according to the present disclosure.
The operation of the television receiver 100 is controlled by a controller 105. The controller 105 may take the form of controller circuitry (or, more generally, processor circuitry) which is typically made of semiconductor material and which runs under the control of computer software embodied as computer readable code. This code may be stored within the controller 105 or may be stored elsewhere within the television receiver 100. In one example embodiment, the computer software is stored within storage medium 125 which is connected to the controller 105. Storage medium 125 may be formed of any kind of suitable media such as solid-state storage or magnetic or optical readable media. Other data such as user profile information, application data, and content may also be stored on storage medium 125.

Also connected to controller 105 is a television decoder 120. The television decoder 120 may take the form of receiver circuitry which is configured to receive television signals encoded using the Advanced Television Systems Committee (ATSC) set of Standards. In embodiments, the encoded television signals may be broadcast or delivered in a multicast or unicast mode over a terrestrial link, a cable link, a satellite link, a broadband internet connection or over a cellular (mobile telephone) network. Indeed, although ATSC is described here, the disclosure is not limited and the television decoder 120 may be configured to receive television signals in one of the Digital Video Broadcasting (DVB) formats, according to Association of Radio Industries and Businesses (ARIB) standards, DTMB (Digital Terrestrial Multimedia Broadcast) or any other appropriate format. The television decoder 120 comprises a demodulator 121 and a tuner 122; the demodulator 121 and tuner 122 may be embodied in a single circuitry package. The television decoder 120 is connected to an antenna 130 which allows these television signals to be received. The antenna 130 may take the form of a Yagi or log-periodic type antenna, a satellite dish, a cable head-end or any kind of appropriate reception device. In some television decoder device embodiments, the antenna 130 may take the form of a cabled interface such as a High-Definition Multimedia Interface (HDMI) for receiving a video and/or audio signal from another device (for example, if the television receiver 100 is comprised within a TV, then the video and/or audio signal may be received from a STB via an HDMI interface). The antenna 130 may also take the form of a short-range wireless interface such as a wireless HDMI interface.

As well as the television receiver 100 of embodiments comprising an antenna 130 to receive broadcast video and/or audio signals, the television receiver also comprises network interface circuitry 140 which enables data to be sent to and received from the television receiver 100 over the internet or other network. The network interface circuitry 140 may allow data to be sent to and received from the television receiver 100 via one of a wired connection (such as Ethernet) or a wireless connection (such as WiFi), for example.
In embodiments, therefore, the television receiver 100 may receive data (such as, but not limited to audio and/or video data) as both broadcast signals (via the television decoder 120) and Internet Protocol packets (via the network interface circuitry). The controller 105 is also connected to a user input module 135. The user input module 135 may be a remote control or commander, touchscreen, stylus, keyboard, mouse, gesture recognition system, microphone for voice control or any kind of device suitable to allow the user to control the operation of the television receiver 100.
The controller 105 is also connected to a user output module 115. The user output module 115 may be a display (into which the television receiver 100 is integrated or connected), wearable technology such as a smart-watch or goggles (or, more generally, electronic eyewear), or any kind of device suitable to allow the user to receive the televisual output of the television receiver 100.
Figure 2 shows a television transmission device 200 (which is an example of a video and/or audio transmitting apparatus) according to embodiments of the present disclosure. The television transmission device 200 may be a delivery system transmitting terrestrial television, cable television or satellite television, for example.
The operation of the television transmission device 200 is controlled by a controller 205. The controller 205 may take the form of controller circuitry (or, more generally, processor circuitry) which is typically made of semiconductor material and which runs under the control of computer software embodied as computer readable code. This code may be stored within the controller 205 or may be stored elsewhere within the television transmission device 200. In this specific example, the computer software is stored within storage medium 225 which is connected to the controller 205. Storage medium 225 may be formed of any kind of suitable media such as solid-state storage or magnetic or optical readable media. Other data such as user profile information, application data, and content may also be stored on storage medium 225. Storage medium 225 may be cloud storage, or a dispersed storage medium comprising multiple storage devices.
Also connected to controller 205 is a television encoder 220. The television encoder 220 may take the form of transmitter circuitry which is configured to transmit television signals encoded using the Advanced Television Systems Committee (ATSC) set of Standards. In embodiments, the encoded television signals may be broadcast or delivered in a multicast or unicast mode over a terrestrial link, a cable link, a satellite link, a broadband internet connection or over a cellular (mobile telephone) network. Again, although ATSC is described here, the disclosure is not limited and the television encoder 220 may be configured to transmit television signals in the Digital Video Broadcasting (DVB) format, according to Association of Radio Industries and Businesses (ARIB) standards, DTMB (Digital Terrestrial Multimedia Broadcast) or any other appropriate format.
The television encoder 220 is connected to an antenna 230 which allows these television signals to be transmitted or broadcast. In the case that a cable link is used for transmission of a video and/or audio signal, the antenna 230 takes the form of a suitable cabled interface such as a High-Definition Multimedia Interface (HDMI) for transmitting a video and/or audio signal to another device (for example, if the television transmission device 200 is comprised within a STB, then the video and/or audio signal may be transmitted to a TV via an HDMI interface). As well as the television transmission device 200 of embodiments comprising an antenna 230 to transmit broadcast video and/or audio signals, the television transmission device also comprises network interface circuitry 235 which enables data to be sent to and received from the television transmission device over the internet or other network. The network interface circuitry 235 may allow data to be sent to and received from the television transmission device via one of a wired connection (such as Ethernet) or a wireless connection (such as WiFi), for example.
In an embodiment, a video and/or audio signal is transmitted from the television transmission device 200 to the television receiver 100. The television receiver 100 is comprised within an HbbTV enabled device (such as an HbbTV enabled TV). However, the technique via which the video and/or audio signal is transmitted from the television transmission device 200 to the television receiver 100 (or, more specifically, from the television encoder 220 of the television transmission device 200 to the television decoder 120 of the television receiver 100) means that AIT signalling (which in embodiments is necessary to start and control HbbTV software applications) cannot be transmitted. As previously mentioned, this may occur when a video and/or audio signal originally broadcast to include the AIT signalling is re-broadcast by the television transmission device 200 using a transmission technique in which the AIT signalling is lost. Such transmission techniques can include various cable transmission services or transmission via an HDMI cabled interface, for example. There is therefore a need to allow the television receiver 100 to obtain an AIT which has previously been lost so as to enable HbbTV software applications to be run. It is desirable for the television receiver 100 to be notified of updates to the AIT so as to enable HbbTV software applications to be updated and/or controlled effectively.

HbbTV is described in ETSI TS 102 796 V1.4.1 (2016-08) which is incorporated herein in its entirety by reference. Hybrid Broadcast Broadband TV Application Discovery over Broadband is described in ETSI TS 103 464 V1.1.1 (2016-09) which is incorporated herein in its entirety by reference. The AIT is described in Signalling and carriage of interactive applications and services in Hybrid broadcast/broadband environments (ETSI TS 102 809 V1.2.1 (2013-07)) which is incorporated herein in its entirety by reference.
Figure 3 shows a system according to the present technique in which a video and/or audio receiving apparatus (in this case, television receiver 100), a video and/or audio transmitting apparatus (in this case, television transmission device 200) and a server 300 are connected via a network 301 (the network may be the internet, for example, or any other suitable type of communications network).
According to an embodiment, the video and/or audio receiving apparatus 100 comprises receiver circuitry (in this case, television decoder 120) operable to receive a signal comprising video and/or audio data, network interface circuitry (in this case, network interface circuitry 140) operable to send and receive data over the network 301 and processor circuitry (in this case, controller 105). The processor circuitry is operable, based on location information comprised within the video and/or audio data, to control the network interface circuitry to receive ancillary data (such as an AIT on the basis of which an HbbTV software application is executable) associated with the video and/or audio data from a location on the network (in this case, the server 300), the location information indicating the location on the network at which the ancillary data is located. The processor circuitry is operable to execute a pre-determined process associated with the received ancillary data (for example, to control execution of an HbbTV software application). The processor circuitry is operable, based on update information comprised within the video and/or audio data, to determine whether or not there is an update associated with the received ancillary data and, if it is determined that there is an update associated with the received ancillary data, to apply the update to the received ancillary data and to execute a pre-determined process associated with the updated ancillary data.
According to an embodiment, the video and/or audio transmitting apparatus 200 comprises transmitter circuitry (in this case, television encoder 220) to transmit the signal comprising video and/or audio data to the video and/or audio receiving apparatus 100, the video and/or audio data comprising the location information and update information. It is on the basis of the location information that the video and/or audio receiving apparatus 100 is operable to receive the ancillary data associated with the video and/or audio data from the location on the network (in this case, server 300). It is on the basis of the update information that the video and/or audio receiving apparatus 100 is operable to determine whether or not there is an update associated with the received ancillary data. Thus, it will be appreciated that, with the present technique, a signal comprising video and/or audio data is transmitted from a video and/or audio transmitting apparatus 200 (which may be referred to as a "transmitter") to a video and/or audio receiving apparatus 100 (which may be referred to as a "receiver"). Furthermore, ancillary data associated with the video and/or audio data may be received by the receiver 100 over the network 301 from the server 300. The receiver 100 knows the location of the ancillary data (in particular, that the ancillary data is located at server 300, together with the location of the ancillary data within the server 300) based on the location information comprised within the video and/or audio data transmitted from the transmitter 200. Furthermore, the receiver 100 is able to determine whether or not there is an update associated with the received ancillary data based on the update information comprised within the video and/or audio data transmitted from the transmitter 200. Examples of the location information and update information are described later on.
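By way of illustration only (this sketch is not part of the original disclosure, and all names, URIs and data shapes in it are assumptions), the receiver-side behaviour described above can be summarised as: recover the location information and update information from the received video and/or audio data, fetch the ancillary data (for example, an XMLAIT file) from the indicated network location, and hand it to a predetermined process such as an HbbTV application controller.

```python
# Minimal, hypothetical sketch of the receiver-side flow; not an implementation
# of any real HbbTV receiver API.
from dataclasses import dataclass
from urllib.request import urlopen


@dataclass
class WatermarkPayload:
    """Duplet assumed to be carried within the video and/or audio data."""
    location_info: str  # e.g. a URI, or parameters from which a URI is derived
    update_info: str    # e.g. a version identifier (see Figures 4B to 4F)


def fetch_ancillary_data(uri: str) -> bytes:
    """Retrieve the ancillary data (such as an XMLAIT file) over the network."""
    with urlopen(uri) as response:
        return response.read()


def run_predetermined_process(ancillary_data: bytes) -> None:
    """Placeholder for e.g. starting or controlling an HbbTV application."""
    print(f"executing process on {len(ancillary_data)} bytes of ancillary data")


def on_first_watermark(payload: WatermarkPayload) -> None:
    # Initial acquisition: location information -> ancillary data -> process.
    ancillary = fetch_ancillary_data(payload.location_info)
    run_predetermined_process(ancillary)
```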
Figures 4A to 4F show some example ways in which the location information and update information may be structured as part of the video and/or audio data. These structures are included in the audio and/or video data (for example, as audio and/or video watermarks in audio and/or video data formatted in the ATSC format) and comprise a number of fields. Each of the fields contains a predetermined type of information.
Figure 4A shows a basic data structure according to an embodiment. The data structure has two fields (and may therefore be referred to as a duplet), these being a first field 400 in which location information is located and a second field 401 in which update information is located. The location information and update information may take any suitable format, as long as both formats are readable by the receiver 100 (in particular, by the controller 105 of the receiver 100). A number of example embodiments of the location information and update information are presented in Figures 4B to 4F.

Figure 4B shows an embodiment in which the update information is version information of a current version of the ancillary data located at the location on the network (in this case, the server 300) and wherein the processor circuitry (in this case, controller 105) is operable to determine that there is an update associated with the received ancillary data when it is determined, based on a change of the version information, that there is a change to the current version of the ancillary data located at the location on the network. In this case, the controller 105 monitors the version number in the data structure which is received as part of the audio and/or video data (the structure comprising the location information and update information being transmitted repeatedly during transmission of the audio and/or video data at, for example, a predetermined rate (that is, a predetermined number of times per unit time)). When it is detected that the version number has changed, the controller 105 knows that an update to the ancillary data is available. In this case, the updated ancillary data may be retrieved from the server 300. The version number is not limited to numerical values. It may include alphanumeric characters or other symbols such as punctuation marks or the like. It may be in hexadecimal representation or the like. Generally, it will be appreciated that the version information may take any form which enables the receiver 100 to determine the version of the ancillary data stored at the server 300.
Thus, for example, the ancillary data may be updated in response to a command from a user at the transmitter 200 (using a user interface (not shown) of the transmitter 200, for example), thus causing the transmitter 200 to transmit updated ancillary data to the server 300 over the network 301 for storage at the server 300. The controller 205 then controls the transmitter circuitry 220 to transmit the data structure shown in Figure 4B with an updated version number (that is, a version number which is different to the version number contained in the data structure the previous time it was transmitted). In response to detecting this change in the version number, the controller 105 of the receiver 100 then controls the network interface circuitry 140 to retrieve the updated ancillary data from the server 300 over the network 301. Thus, due to the change in the version number (the version number being the update information in this case), the receiver 100 is informed of the fact that updated ancillary data is available from the server 300 and the receiver 100 is able to retrieve the updated ancillary data from the server 300 over the network 301. It is therefore not necessary for the receiver 100 to continuously poll the server 300, for example, in order to determine whether or not the ancillary data has been updated. The network bandwidth and the processing requirements at the receiver 100 and server 300 are therefore reduced. Furthermore, the user of the receiver 100 is able to enjoy the benefits of the ancillary data (which, in this case, may include allowing an HbbTV application to be run and controlled when the ancillary data is an AIT associated with that HbbTV application, for example) even though this ancillary data itself was not transmitted from the transmitter 200 to the receiver 100 with the audio and/or video data. In embodiments, the change in version number or any similar identifier implicitly indicates to a receiver that the ancillary data has been updated and should be retrieved. In embodiments, the change in version number or any similar version identifier is not in itself a computer executable instruction to retrieve the updated ancillary data. It provides a hint or a pointer to the receiver of updated ancillary data.

In Figure 4B, the location information is information indicative of a uniform resource indicator (URI) which indicates the location of the ancillary data on the network (in this example, the ancillary data is located at server 300). This information may be the URI itself, or, alternatively, may be information based on which the URI can be derived by the controller 105 of the receiver 100. In one example, in which the ancillary data is an AIT on the basis of which an HbbTV application may be run and controlled, the location information may comprise one or more of a country identifier (which may be, for example, a country name or a country code), a network identifier ("onid"), a broadcast delivery system identifier ("network"), a service name ("servicename") and a service identifier ("sid"). In this case, the URI can be derived using these pieces of information (in one example, the controller 105 accesses a lookup table stored either on the storage medium 125 or on an external server (in this case, the controller 105 accesses the lookup table via the network interface circuitry 140) in order to derive the URI), as sketched below.
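The following sketch is illustrative only: the URI pattern, parameter names and stubbed fetch are assumptions rather than part of the disclosure. It shows how a receiver might derive a URI from such parameters and re-fetch the ancillary data only when the version carried in the update information changes, instead of polling the server.

```python
# Hypothetical sketch of Figure 4B handling: URI derivation from location
# parameters plus version-change-triggered retrieval. Not a real HbbTV API.
from typing import Callable, Optional


class AncillaryDataClient:
    def __init__(self, fetch: Callable[[str], bytes]):
        self.fetch = fetch
        self.current_version: Optional[str] = None
        self.ancillary_data: Optional[bytes] = None

    def derive_uri(self, location_info: dict) -> str:
        # Assumed URI pattern; a real receiver might instead consult a lookup
        # table held in storage medium 125 or on an external server.
        return ("https://xmlait.example.com/"
                f"{location_info['country']}/{location_info['onid']}/"
                f"{location_info['sid']}/ait.xml")

    def on_watermark(self, location_info: dict, version: str) -> None:
        # Fetch (or re-fetch) the ancillary data only when the version
        # identifier carried in the update information field has changed.
        if version != self.current_version:
            self.ancillary_data = self.fetch(self.derive_uri(location_info))
            self.current_version = version
            # ...execute the predetermined process with the new ancillary data


# Usage with a stubbed network fetch:
client = AncillaryDataClient(fetch=lambda uri: b"<xmlait/>")
client.on_watermark({"country": "gb", "onid": 9018, "sid": 4164}, version="1")
client.on_watermark({"country": "gb", "onid": 9018, "sid": 4164}, version="1")  # no re-fetch
client.on_watermark({"country": "gb", "onid": 9018, "sid": 4164}, version="2")  # re-fetch
```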
In particular, there may be a plurality of locations on the network (for example, a plurality of locations within the server 300 or a plurality of locations spread amongst a plurality of servers), each of which is identified by a particular URI. The transmitter 200, which knows the location of the ancillary data, will then transmit the relevant ones of the above-mentioned parameters so that the URI associated with the location at which the ancillary data is located is derivable by the controller 105 of the receiver 100. The generation of URIs on the basis of the above-mentioned parameters is already known in the art, and will therefore not be described in detail here. In embodiments, the URI does not necessarily identify the file name for the ancillary data, but may merely identify a location on a network where a file is located and can be retrieved from.

Figure 4C shows a data structure according to another embodiment. In this case, the location information in the location information field 400 is once again in the form of a URI (or, more generally, information on the basis of which a URI may be derived, as previously explained). However, in this case, rather than having a version number, the update information in the update information field 401 comprises a counter that comprises a changing value (the value changing, for example, every time the data structure is transmitted over the duration of the transmission of the audio and/or video data). The changing value changes from a start value to a predetermined end value over time in a predetermined manner. The controller 105 of the receiver 100 is operable to determine that there is an update associated with the received ancillary data when it is determined that the changing value of the counter is equal to the predetermined end value.
In one example, the update information of Figure 4C takes the form of a numerical value which changes from a start value (such as 255) to a predetermined end value (such as 0) in a linear fashion. For example, the data structure may be repeatedly transmitted with each subsequent transmission having a value in the counter field 401 which is one less than in the counter field 401 of the previously transmitted data structure.
Because the changing value changes from a start value to a predetermined end value over time in a predetermined manner, even if some of the transmitted data structures are lost (due to, for example, a poor quality signal carrying the video and/or audio data or corruption of the data structure), the controller 105 of the receiver 100 is still able to predict the value of the counter because the counter is known to change in the predetermined manner. For example, if the data structure as shown in Figure 4C is repeatedly transmitted with the value of the counter of each data structure being one less than the value of the counter of the previously transmitted data structure, and the data structures are transmitted at a constant rate, then, even if the data structure containing a counter value of 0 (0 being the predetermined end value) is not received, the controller 105 is able to recognise that the time is such that it should have been received. The controller 105 is therefore able to update the ancillary data accordingly, even though the specific data structure indicating this (that is, the data structure in which the changing value indicated by the counter in the update information field 401 is equal to 0) has not been received. Such an arrangement is more robust in ensuring that updates to the ancillary data are made available at the receiver 100 despite unpredictable conditions for transmission of the signal comprising the video and/or audio data from the transmitter 200 to the receiver 100.
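As an informal illustration of this countdown behaviour (the payload interval, start value and class structure below are assumptions, not part of the disclosure), a receiver that knows the constant transmission rate can estimate when the end value should have arrived even if the payload carrying it is lost:

```python
# Hypothetical sketch of the Figure 4C countdown counter. The duplet is assumed
# to be transmitted at a constant, known interval, with the counter decrementing
# by one per transmission until it reaches a predetermined end value (0 here).
from typing import Optional


class CountdownTracker:
    END_VALUE = 0

    def __init__(self, payload_interval_s: float):
        self.payload_interval_s = payload_interval_s  # assumed constant rate
        self.last_value: Optional[int] = None
        self.last_time_s: Optional[float] = None

    def on_payload(self, counter_value: int, now_s: float) -> None:
        """Record the most recently received counter value and its arrival time."""
        self.last_value = counter_value
        self.last_time_s = now_s

    def update_due(self, now_s: float) -> bool:
        """True once the end value has been received, or should have been."""
        if self.last_value is None:
            return False
        if self.last_value == self.END_VALUE:
            return True
        # Predict when the payload carrying the end value would be transmitted,
        # so a lost payload does not prevent the update from being applied.
        expected_end_time = self.last_time_s + self.last_value * self.payload_interval_s
        return now_s >= expected_end_time


tracker = CountdownTracker(payload_interval_s=1.0)
tracker.on_payload(counter_value=3, now_s=100.0)   # payloads 2, 1 and 0 then lost
print(tracker.update_due(now_s=102.0))             # False: end value not yet due
print(tracker.update_due(now_s=103.0))             # True: value 0 should have arrived
```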
Figure 4D shows a data structure according to another embodiment. In this case, the location information once again takes the form of information on the basis of which a URI at which the ancillary data is located on the network may be derived. In this case, however, the update information in the update information field 401 takes the form of information representing the update associated with the received ancillary data. This information may be referred to as a "delta" or "patch". This delta or patch represents new data which is to be added to the previously received ancillary data (either as an addition or insertion, or, in embodiments, as a replacement of an existing portion of the ancillary data).
One example of such a delta or patch relates to when the ancillary data is an application information table (AIT) represented by an extensible markup language (XML) file. Such a file is known in HbbTV as an XMLAIT file. In this case, the delta or patch will be indicative of an element of the XMLAIT file which is to be replaced. For example, XMLAIT files will typically comprise an instruction relating to starting and stopping of the HbbTV software application with which the XMLAIT file is associated. Such instructions may include the commands "autostart" (which instructs the controller 105 to begin execution of the HbbTV software application), "stop" or "kill" (which instructs the controller 105 to stop (or, in other words, kill) the HbbTV software application) and "present" (which instructs the controller 105 to continue current execution of the HbbTV software application - in this case, a receiver 100 which newly starts receiving the video and/or audio data from the transmitter 200 does not start execution of the HbbTV software application, however a receiver 100 which was already receiving the video and/or audio data and which was already executing the HbbTV software application may continue to run the HbbTV software application).

Thus, with the data structure shown in Figure 4D, the receiver 100 does not have to receive updates to the ancillary data from the server 300. That is, it will initially receive the ancillary data from the server 300. Subsequent to this, however, updates to the ancillary data are received by way of a delta or patch included as update information in the data structure comprising the location information and update information relating to the ancillary data, which is received as part of the video and/or audio data received from the transmitter 200. Such an arrangement means that the receiver 100 does not have to download updates to the ancillary data from the server 300, thus reducing bandwidth usage over the network 301. In particular, this arrangement helps to alleviate a spike in network traffic when ancillary data is updated (and when a potentially large number of receivers 100 therefore need to receive the update to the ancillary data).

With the arrangement shown in Figure 4D, it is noted that the delta or patch included as update information in the data structure may be subject to corruption or even malicious interference directed to changing the information in the delta or patch (thus allowing an unauthorised party to amend the ancillary data (such as an XMLAIT file) currently being used by the receiver 100). It is noted that, in embodiments, the ancillary data received by the receiver 100 may be stored in the storage medium 125 of the receiver 100, for example. To help alleviate this problem, as well as the ancillary data being updated at the receiver 100 via the data structure shown in Figure 4D, updated ancillary data is also stored on the server 300. For example, a user (via a suitable user interface (not shown), for example) of the transmitter 200 may control the transmitter 200 to overwrite the current ancillary data stored in the server 300 with the new ancillary data (that is, the ancillary data comprising the new data comprised within the transmitted delta or patch). Then, after the receiver 100 has received the data structure and updated the ancillary data on the basis of the delta or patch, the controller 105 controls the network interface circuitry 140 to receive the updated ancillary data from the server 300.
The controller 105 then compares the update of the received updated ancillary data (that is, the ancillary data newly received from the server 300) with the update comprised within the update information of the data structure (that is, the update indicated by the patch or delta included in the data structure).
In the case that the update of the received updated ancillary data matches the update comprised within the update information of the data structure, the controller 105 determines that the ancillary data has been correctly updated. On the other hand, in the case that the update of the received updated ancillary data does not match the update comprised in the update information of the data structure, the controller 105 determines that the ancillary data has not been correctly updated and executes a pre-determined process in response to this. In one example, the controller 105 may completely discard the current ancillary data (that is, the ancillary data stored in the storage medium 125) and control the network interface circuitry 140 to again download the ancillary data in its entirety from the server 300. Alternatively, the controller may take the newly downloaded ancillary data that was used for the comparison as the correct ancillary data and use it to update the ancillary data stored in the storage medium 125 (rather than using the update information of the data structure, which is now known not to be correct). It is noted that the received updated ancillary data may be the ancillary data in its entirety or, alternatively, may be only the portion of the ancillary data which has been updated. This helps to ensure that the integrity of the ancillary data is maintained even if an update to the ancillary data comprised within the update information transmitted as part of the video and/or audio data (in the form of the delta or patch) is corrupted or interfered with. In this case, the controller 105 may also, for example, alert a user of the receiver 100 that there has been a mismatch between the delta or patch received as part of the video and/or audio data and the updated ancillary data as stored on the server 300 (this alert may be output via the user output module 115, for example). The controller 105 may also control the network interface circuitry 140 to transmit a message to the transmitter 200 (or to any other appropriate entity of the network 301) indicating that such a mismatch has occurred. This may allow, for example, a broadcaster operating the transmitter 200 to investigate signalling conditions between the transmitter 200 and receiver 100 or, for example, to investigate whether there has been a malicious attack on the transmitted signal comprising the video and/or audio data.
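A toy illustration of this patch-then-verify behaviour is given below; the dict-based model of the XMLAIT content, the field names and the reconciliation policy are assumptions made purely for the sketch and are not defined by the disclosure:

```python
# Hypothetical sketch of the Figure 4D flow: apply the delta/patch carried in
# the watermark, then verify the result against the updated copy on the server.
def apply_patch(ancillary: dict, patch: dict) -> dict:
    """Return the ancillary data with the patched elements replaced or inserted."""
    patched = dict(ancillary)
    patched.update(patch)
    return patched


def reconcile_with_server(locally_patched: dict, server_copy: dict) -> dict:
    # If the locally patched data matches the server's updated copy, the patch
    # is taken to be correct; otherwise the server copy is preferred (a receiver
    # could instead re-download in full and/or report the mismatch).
    return locally_patched if locally_patched == server_copy else server_copy


stored = {"controlCode": "PRESENT", "appName": "RedButton"}       # assumed fields
patch = {"controlCode": "AUTOSTART"}                              # delta from the watermark
locally_patched = apply_patch(stored, patch)
server_copy = {"controlCode": "AUTOSTART", "appName": "RedButton"}
verified = reconcile_with_server(locally_patched, server_copy)
print(verified)  # {'controlCode': 'AUTOSTART', 'appName': 'RedButton'}
```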
Figures 4E and 4F show further example data structures according to embodiments of the present technique. In each of these data structures, the update information is comprised within two fields 401A and 401B. In Figure 4E, as well as having information indicative of a URI indicating the location of the ancillary data, the data structure comprises a first update information field 401A into which an update of the ancillary data (in the form of a delta or patch) may be inserted and a second update information field 401B in which a version number of the current ancillary data stored at the server 300 is indicated. Such an arrangement allows different portions of the update information to be sent at different times.
In the example of Figure 4E, at a first time t1, the first update information field 401A comprises a "null" value (indicating to the controller 105 of the receiver 100 that there is no further information in this field). Furthermore, the second update information field 401B indicates that the version number of the ancillary data currently stored at the server 300 is version number 1. At a later time t2, the first update information field 401A comprises the delta or patch comprising the update of the ancillary data. The second update information field 401B indicates that the version of the ancillary data stored at the server 300 is still version number 1. In this case, the controller 105 causes the delta or patch to be stored in the storage medium 125, but does not yet update the ancillary data stored in the storage medium 125 with the delta or patch. At a later time t3, the first update information field 401A once again contains the value "null". However, the ancillary data stored at the server 300 has now been updated to a different version (with version number 2). The second update information field 401B therefore indicates that the ancillary data stored at the server 300 is version number 2. This new version of the ancillary data comprises the same update contained in the delta or patch. In response to this, the controller 105 updates the ancillary data stored in the storage medium 125 with the information comprised within the delta or patch received at time t2.
Such an arrangement allows different portions of the update information (the update information in the case of Figure 4E comprising both the delta or patch and the version number of the ancillary data) to be sent at different times. In particular, it allows the receiver 100 to already know what the update to the ancillary data is going to be at a time prior to the ancillary data actually being updated. When the time (t3 in this case) arrives at which the ancillary data needs to be updated, all the controller 105 needs to do is to update the stored ancillary data with the previously received delta or patch. The stored ancillary data may therefore be updated very quickly, without the need for new ancillary data to be downloaded from the server 300. Such an arrangement also means, for example, that a data structure of the type shown for time t2 in Figure 4E may be transmitted a plurality of times prior to the time at which the ancillary data must be updated (t3 in this case). The receiver 100 can therefore make multiple attempts at receiving the update (in the form of the delta or patch) prior to the update being required, thus helping to make the system more robust (since, if one of the data structures comprising the delta or patch is corrupted, one or more other data structures containing the delta or patch may not be corrupted, thus allowing the receiver 100 to receive the delta or patch prior to the updated ancillary data being required).
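The version-triggered behaviour of Figure 4E may be summarised in the following illustrative Python sketch; the field names and the handle_data_structure function are assumptions made for the purpose of the example only and do not form part of the specification.

```python
# Illustrative sketch of the Figure 4E behaviour: cache the in-band patch when it
# arrives, and apply it only when the signalled server version number changes.
def handle_data_structure(state, update_field_a, version_field_b):
    """state: dict holding 'ancillary', 'pending_patch' and 'known_version'."""
    if update_field_a is not None:                 # t2: patch received in advance
        state["pending_patch"] = update_field_a
    if version_field_b != state["known_version"]:  # t3: server version has changed
        if state["pending_patch"] is not None:
            state["ancillary"].update(state["pending_patch"])
            state["pending_patch"] = None
        state["known_version"] = version_field_b
    return state

# Example usage mirroring times t1, t2 and t3 described above:
state = {"ancillary": {"controlCode": "PRESENT"},
         "pending_patch": None, "known_version": 1}
handle_data_structure(state, None, 1)                          # t1: nothing to do
handle_data_structure(state, {"controlCode": "AUTOSTART"}, 1)  # t2: cache the patch
handle_data_structure(state, None, 2)                          # t3: apply cached patch
```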
Figure 4F shows another arrangement according to an embodiment. Again, the update information of this data structure comprises two portions, the first portion being defined in the first update information field 401A and the second portion being defined in the second update information field 401B. The portion of the update information indicated in the first update information field 401A is either the value "null" or a delta or patch update for the ancillary data currently stored in the storage medium 125 (this is the same as previously described with reference to Figure 4E). However, this time, instead of the version number being indicated in the second update information field 401B, this field contains a counter. Like the counter described with reference to Figure 4C, the counter takes a value which changes from a start value to a predetermined end value over time in a predetermined manner, thus allowing the controller 105 of the receiver 100 to determine the counter value of a data structure which should have been received (but, due to corruption or the like, was not received) based on a previous value of the counter and the known manner in which the value of the counter changes over time. In this example, at time t1, the value of the first update information field 401A is "null" and the value of the counter in update information field 401B is 255. At a later time t2, the first update information field 401A comprises a delta or patch update for the ancillary data currently stored in the storage medium 125 and the value of the counter is 127. In this case, a number of other data structures have been transmitted between the times t1 and t2. In one example, each of these intermediate data structures has a respective counter value which decreases by one for each consecutively transmitted data structure (for example, after the data structure shown at t1, the subsequent data structures will have counter values 254, 253, 252, etc.). In one example, each intermediate data structure has the value "null" in the update information field 401A. At a later time t3, the update information field 401A once again contains the value "null" and the counter value is equal to 0. In this case, the value 0 is the predetermined end value, and therefore the controller 105 of the receiver 100 knows that the ancillary data stored in the storage medium 125 needs to be updated with the update information comprised within the delta or patch data transmitted previously in the data structure at time t2. Again, in this case, there will be a number of intermediate data structures transmitted between the times t2 and t3. In one example, each of these intermediate data structures has a respective counter value which decreases by one for each consecutively transmitted data structure (for example, after the data structure shown at t2, the subsequent data structures will have counter values 126, 125, 124, etc.). Each of these intermediate data structures comprises either the value "null" in the update information field 401A or comprises the delta or patch in the update information field 401A. It will be appreciated that, by having the delta or patch information repeated in a plurality of transmitted data structures (that is, data structures transmitted at different times), the system is made more robust since the receiver 100 is likely to be able to receive the delta or patch via at least one of these data structures even if one or more of those data structures are corrupted.
As with the arrangement described with reference to Figure 4E, the arrangement of Figure 4F also helps ensure that the update to be applied to the ancillary data stored in the storage medium 125 (in the form of the delta or patch) is known to the receiver 100, and stored in the storage medium 125, prior to the updated ancillary data being required at time t3 (when the value of the counter reaches the predetermined end value of 0). This ensures a rapid update of the ancillary data stored in the storage medium 125 at time t3 since the receiver 100 does not need to retrieve the updated ancillary data from the server 300 but, rather, is able to rely on the update provided by the delta or patch within the data structure(s) transmitted from the transmitter 200 as part of the video and/or audio data.
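The counter-based scheme of Figure 4F can similarly be sketched as follows (illustrative Python only; the function name, field names and END_VALUE constant are assumptions made for this example, not part of the specification).

```python
# Sketch of the Figure 4F counter scheme (illustrative assumptions only). The
# counter decrements by one per transmitted data structure; reaching the
# predetermined end value (0) triggers application of the previously cached
# delta/patch to the stored ancillary data.
END_VALUE = 0

def handle_counter_structure(state, patch_field, counter):
    """state holds 'ancillary' (a dict); bookkeeping keys are added here."""
    previous = state.get("last_counter")
    if previous is not None and counter != previous - 1:
        # One or more intermediate data structures were missed (corruption or
        # the like); the known decrement rule lets the receiver detect the gap.
        state["missed"] = (previous - 1) - counter
    state["last_counter"] = counter
    if patch_field is not None:
        state["pending_patch"] = patch_field   # may be repeated for robustness
    if counter == END_VALUE and state.get("pending_patch"):
        state["ancillary"].update(state["pending_patch"])
        state["pending_patch"] = None
    return state
```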
It will be appreciated that the previously described comparison of the update to the ancillary data defined by the delta or patch with the update to the ancillary data stored on the server 300 may be applied to the arrangements described with reference to Figures 4E and 4F in the same way as described with reference to Figure 4D. Again, this helps to maintain the integrity of the ancillary data in the face of data corruption or a malicious attack on the transmitted video and/or audio signal.
In an embodiment, the location information and update information comprised within the video and/or audio data are comprised within a watermark of the video and/or audio data. A watermark is part of the data which defines the video and/or audio data itself and has a predetermined format which is readable by the controller 105 of the receiver 100 so as to retrieve the location information and update information. Each of the data structures illustrated in Figures 4A to 4F, for example, may be comprised within a watermark. The concept of a watermark in video and/or audio data is well known in the art and will therefore not be described in detail here. In embodiments, the watermark may be an ATSC 3.0 video or audio watermark. It may be carried in only a portion of the video spatially or temporally. It may be carried in only certain audio frequencies, such as a high frequency band relative to the audio signal.
In one embodiment, the location information and update information may be comprised within a hierarchical watermark. Such hierarchical watermarks are described, for example, in previous patent documents US 7284129 and US 2010/026625. With hierarchical watermarks, the data represented by the watermark is split into at least two fields or hierarchies. When the first field is accurately decoded, the second field is then decoded. Such hierarchical watermarks are useful for speeding up watermark detection, since their use can save testing or correlating video and/or audio data for watermarks or steganographically encoded data which does not exist in the video and/or audio data (for example, if the first field does not exist, then testing or correlating for the second field is unnecessary).
With the present technique, the information defined in the data structures may be split between different hierarchies of a hierarchical watermark. For example, the update information may be located at a higher level of the hierarchy than the location information (a higher level of the hierarchy being decoded before a lower level of the hierarchy). Thus, as the receiver 100 receives a stream of data structures of the types shown in Figures 4A to 4F (for example), after initially decoding the entire watermark so as to obtain the location information of the ancillary data (and to therefore initially retrieve the ancillary data from the server 300), for subsequently received data structures, the controller 105 need only decode the higher level of the hierarchy comprising the update information. Since the decoding of the higher level of the hierarchy occurs first (and, if the required information is obtained from the higher level of the hierarchy, further decoding of the watermark is no longer required), processing requirements for decoding the watermark are reduced. The processing load at the receiver 100 is therefore reduced.

In the described embodiments, it should be noted that an "update" of the received ancillary data should be understood to mean an update to at least a portion of the ancillary data. For example, if the ancillary data is an XMLAIT file, then the update to the ancillary data may comprise changing a single element of the XMLAIT file so as to control execution of an HbbTV software application (for example, the element of the XMLAIT file comprising the instruction "autostart", "stop" (or "kill") or "present" may be updated to a different one of the instructions "autostart", "stop" (or "kill") or "present" in this way). It will be appreciated, however, that ancillary data previously received by the receiver 100 may be updated in any suitable manner according to the described techniques.

Although, in the above-mentioned embodiments, the ancillary data is an XMLAIT file for controlling the execution of an HbbTV software application, it will be appreciated that the ancillary data may take any other suitable form. For example, the ancillary data may take the form of text information describing a television programme being transmitted as video and/or audio data, an image, a sound file or any form of metadata associated with the video and/or audio signal which is transmitted from the transmitter 200 to the receiver 100.
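Returning to the hierarchical watermark decoding described above, a minimal illustrative sketch is given below. The calls decode_level_1 and decode_level_2 are hypothetical stand-ins for the first-level (update information) and second-level (location information) watermark decoders; they are assumptions for illustration only.

```python
# Minimal sketch of two-level (hierarchical) watermark decoding: the higher level,
# carrying the update information, is decoded first; the lower level, carrying the
# location information, is decoded only when it is actually needed (for example,
# on first acquisition of the service).
def read_watermark(frame, need_location, decode_level_1, decode_level_2):
    update_info = decode_level_1(frame)      # higher hierarchy level: decoded first
    if update_info is None:
        return None, None                    # no watermark present: stop early
    location_info = decode_level_2(frame) if need_location else None
    return update_info, location_info
```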
In the case that the location information and update information are comprised within a watermark of the video and/or audio data, it is noted that the location information and update information may be included in both a first watermark of the video data and a second watermark of the audio data. In this case, the location information and update information is received twice by the receiver 100, once as part of a video watermark and once as part of an audio watermark. Such an arrangement helps to improve the reliability of the system, in that if one of the watermarks is blocked (for example, if video data is obscured (by a popup, for example) or if the audio is muted) or if one of the video and/or audio watermarks is corrupted, then the receiver 100 is still able to decode the remaining unaffected watermark so as to obtain the location information and update information. It will be appreciated that such an arrangement helps increase the reliability of the system.
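As an illustration of this redundancy, a receiver might attempt both watermarks in turn, as in the following sketch (Python; decode_video_wm and decode_audio_wm are hypothetical decoder callables and are not part of the specification).

```python
# Sketch of the video/audio watermark redundancy described above: try the video
# watermark first and fall back to the audio watermark, so the payload is still
# recovered if one carrier is obscured, muted or corrupted.
def recover_payload(decode_video_wm, decode_audio_wm):
    for decode in (decode_video_wm, decode_audio_wm):
        try:
            payload = decode()
            if payload is not None:
                return payload               # location + update information
        except ValueError:                   # treat a corrupted watermark as absent
            continue
    return None
```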
Figures 5A and 5B show, respectively, flow charts describing a process carried out by a video and/or audio receiving apparatus (such as receiver 100) and a video and/or audio transmitting apparatus (such as transmitter 200) according to the present technique. Figure 5A shows a process implemented by a video and/or audio receiving apparatus 100 according to an embodiment. The process starts at step 500. At step 501, a signal comprising video and/or audio data is received. At step 502, ancillary data associated with the video and/or audio data is received from a location (such as server 300) on the network 301. The location of the ancillary data is known from location information comprised within the video and/or audio data. At step 503, a predetermined process associated with the received ancillary data (such as the execution of an HbbTV software application when the received ancillary data is an XMLAIT file associated with that HbbTV software application, for example) is executed. At step 504, it is determined, based on update information comprised within the video and/or audio data, whether there is an update associated with the received ancillary data. If it is determined that there is no update associated with the received ancillary data, then the process returns to step 504. On the other hand, if it is determined that there is an update associated with the received ancillary data, then the process proceeds to step 505, in which the update is applied to the received ancillary data (for example, when the received ancillary data is an XMLAIT file, and the update associated with the received ancillary data is an update to an element of the XMLAIT file, then that element of the XMLAIT file is updated). At step 506, a predetermined process associated with the updated ancillary data is executed (for example, again, if the ancillary data is an XMLAIT file for controlling an HbbTV software application, then this predetermined process may be to execute the HbbTV software application based on the updated XMLAIT file - this may include, as previously discussed, stopping execution of the software application (if the updated element of the XMLAIT file is a "stop" or "kill" instruction), starting execution of the software application (if the updated element of the XMLAIT file is an "autostart" instruction) or maintaining execution of a software application only if execution of the software application has started prior to receiving the update to the ancillary data (if the updated element of the XMLAIT file is a "present" instruction)). The process then ends at step 507.
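The receiver flow of Figure 5A can be restated compactly in code. This is an illustrative Python sketch only; the parameters receive_signal, fetch_ancillary, run_process, get_update and apply_update are hypothetical placeholders for the receiver-side operations described above and are not part of the specification.

```python
# Compact restatement of the Figure 5A receiver flow (steps 500-507), using
# placeholder callables for the operations described in the text.
def receiver_process(receive_signal, fetch_ancillary, run_process,
                     get_update, apply_update):
    signal = receive_signal()                           # step 501
    ancillary = fetch_ancillary(signal.location_info)   # step 502
    run_process(ancillary)                              # step 503 (e.g. start app)
    while True:
        update = get_update(signal.update_info)         # step 504
        if update is not None:
            ancillary = apply_update(ancillary, update) # step 505
            run_process(ancillary)                      # step 506
            break                                       # step 507: end
```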
Figure 5B shows a process implemented by a video and/or audio transmitting apparatus 200 according to an embodiment of the present technique. The process starts at step 509. At step 510, a signal comprising video and/or audio data is transmitted to a video and/or audio receiving apparatus 100. As previously described, the signal comprising the video and/or audio data comprises location information on the basis of which the video and/or audio data receiving apparatus 100 may determine the location on the network at which ancillary data associated with the audio and/or video data is located. The video and/or audio data also comprises update information on the basis of which the video and/or audio data receiving apparatus 100 is operable to determine whether or not there is an update associated with the ancillary data received by the video and/or audio receiving apparatus.

At step 511, it is determined whether or not there has been an update to the ancillary data. In one embodiment, for example, a user of the video and/or audio transmitting apparatus enters information indicative of the update to the ancillary data into the video and/or audio transmitting apparatus 200 (via a suitable user interface (not shown), for example). Processor circuitry (such as controller 205 of transmitter 200) then controls the update information (on the basis of which the video and/or audio data receiving apparatus is able to determine whether or not there is an update associated with the received ancillary data) to be included in the transmitted video and/or audio data and controls the network interface circuitry 235 to apply the update to the ancillary data stored on the network. This occurs at step 512. On the other hand, if there is no update to the ancillary data, then the process returns to step 511. After step 512, the process ends at step 513.
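The transmitter flow of Figure 5B can be sketched in the same illustrative style; again, the parameters (transmit_signal, update_entered, embed_update_info, push_update_to_server) are hypothetical placeholders and not part of the specification.

```python
# Compact restatement of the Figure 5B transmitter flow (steps 509-513), using
# placeholder callables rather than the actual transmitter 200 interfaces.
def transmitter_process(transmit_signal, update_entered, embed_update_info,
                        push_update_to_server):
    transmit_signal()                  # step 510: signal with location/update info
    while not update_entered():        # step 511: wait for an operator-entered update
        pass
    embed_update_info()                # step 512: include update info in the signal
    push_update_to_server()            # step 512: update the ancillary data on server
    # step 513: end
```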
Some example embodiments of the present technique are defined by the following numbered clauses:

1. A video and/or audio receiving apparatus comprising:
receiver circuitry operable to receive a signal comprising video and/or audio data;
network interface circuitry operable to send and receive data over a network; and processor circuitry operable:
based on location information comprised within the video and/or audio data, to control the network interface circuitry to receive ancillary data associated with the video and/or audio data from a location on the network, the location information indicating the location on the network at which the ancillary data is located,
to execute a predetermined process associated with the received ancillary data, and based on update information comprised within the video and/or audio data, to determine whether or not there is an update associated with the received ancillary data and, if it is determined that there is an update associated with the received ancillary data, to apply the update to the received ancillary data and to execute a predetermined process associated with the updated ancillary data.

2. A video and/or audio receiving apparatus according to clause 1, wherein the update information is version information indicative of a current version of the ancillary data located at the location on the network and wherein the processor circuitry is operable to determine that there is an update associated with the received ancillary data when it is determined, based on a change to the version information, that there is a change to the current version of the ancillary data located at the location on the network.
3. A video and/or audio receiving apparatus according to clause 1 , wherein the update information comprises a changing value which changes from a start value to a predetermined end value over time in a predetermined manner and wherein the processor circuitry is operable to determine that there is an update associated with the received ancillary data when it is determined that the changing value is equal to the predetermined end value.
4. A video and/or audio receiving apparatus according to any one of clauses 1 to 3, wherein the processor circuitry is operable to control the network interface circuitry to receive updated ancillary data from the location on the network.

5. A video and/or audio receiving apparatus according to any one of clauses 1 to 3, wherein the update information comprises the update associated with the received ancillary data and wherein the processor circuitry is operable to update the received ancillary data with the update comprised within the update information when it is determined that the update information has been received.
6. A video and/or audio receiving apparatus according to clause 5, wherein:
updated ancillary data is located at the location on the network;
after the update comprised within the update information has been received, the processor circuitry is configured:
to control the network interface circuitry to receive the updated ancillary data from the location on the network and to compare the update of the received updated ancillary data with the update comprised within the update information;
in the case that the update of the received updated ancillary data matches the update comprised within the update information, to determine that the ancillary data has been correctly updated; and
in the case that the update of the received updated ancillary data does not match the update comprised within the update information, to determine that the ancillary data has not been correctly updated and to execute a predetermined process in response to determining that the ancillary data has not been correctly updated.
7. A video and/or audio receiving apparatus according to any preceding clause, wherein: the received ancillary data is data associated with a software application;
the processor circuitry is operable to execute the software application based on the received ancillary data; and
if it is determined that there is an update associated with the received ancillary data, the processor circuitry is operable to execute the software application based on the updated ancillary data.

8. A video and/or audio receiving apparatus according to clause 7, wherein:
the ancillary data associated with the software application comprises one of a first instruction indicating to the processor circuitry to begin execution of the software application, a second instruction indicating to the processor circuitry to continue a current execution of the software application and a third instruction indicating to the processor circuitry to control the processor to stop execution of the software application; and the update associated with the ancillary data associated with the software application comprises a change from one of the first, second and third instructions to another one of the first, second and third instructions.

9. A video and/or audio receiving apparatus according to clause 7 or 8, wherein the ancillary data comprises an Application Information Table (AIT).
10. A video and/or audio receiving apparatus according to clause 9, wherein the AIT is represented as an Extensible Markup Language (XML) file.
11. A video and/or audio receiving apparatus according to any preceding clause, wherein the video and/or audio data comprises a watermark comprising the location information and update information.

12. A video and/or audio receiving apparatus according to clause 11, wherein the watermark is a hierarchical watermark and the update information is comprised at a higher level of the hierarchy than the location information.
13. A video and/or audio receiving apparatus according to any preceding clause, wherein, on the basis of the location information, the processor circuitry is operable to determine one of a plurality of predetermined locations on the network at which the ancillary data is located and to control the network interface circuitry to receive the ancillary data from the determined one of the plurality of predetermined locations.

14. A video and/or audio receiving apparatus according to clause 13, wherein:
each of the plurality of predetermined locations on the network is indicated by a respective predetermined Uniform Resource Indicator (URI); and
the processor circuitry is operable to derive the URI of the one of the plurality of predetermined locations on the network at which the ancillary data is located from the location information.
15. A video and/or audio receiving apparatus according to clause 14, wherein the ancillary data comprises an Application Information Table (AIT) and the location information comprises a country identifier, a network identifier (onid), a broadcast delivery system identifier (networktype), a service name (servicename) and a service identifier (sid).
16. A video and/or audio transmitting apparatus comprising:
transmitter circuitry operable to transmit a signal comprising video and/or audio data to a video and/or audio receiving apparatus, the video and/or audio data comprising location information and update information, wherein
based on the location information comprised within the video and/or audio data, the video and/or audio receiving apparatus is operable to receive ancillary data associated with the video and/or audio data from a location on a network, the location information indicating the location on the network at which the ancillary data is located, and to execute a predetermined process associated with the received ancillary data, and
based on the update information comprised within the video and/or audio data, the video and/or audio receiving apparatus is operable to determine whether or not there is an update associated with the received ancillary data and, if it is determined that there is an update associated with the received ancillary data, to apply the update to the received ancillary data and to execute a predetermined process associated with the updated ancillary data.

17. A video and/or audio transmitting apparatus according to clause 16, wherein the update information is version information indicative of a current version of the ancillary data located at the location on the network so as to allow the video and/or audio receiving apparatus to determine that there is an update associated with the ancillary data received by the video and/or audio receiving apparatus when it is determined, based on a change to the version information, that there is a change to the current version of the ancillary data located at the location on the network.
18. A video and/or audio transmitting apparatus according to clause 16, wherein the update information comprises a changing value which changes from a start value to a predetermined end value over time in a predetermined manner so as to allow the video and/or audio receiving apparatus to determine that there is an update associated with the received ancillary data when it is determined that the changing value is equal to the predetermined end value.
19. A video and/or audio transmitting apparatus according to any one of clauses 16 to 18, wherein the video and/or audio receiving apparatus is operable to receive updated ancillary data from the location on the network.
20. A video and/or audio transmitting apparatus according to any one of clauses 16 to 18, wherein the update information comprises the update associated with the ancillary data so as to allow the video and/or audio receiving apparatus to update the ancillary data received by the video and/or audio receiving apparatus with the update comprised within the update information when it is determined by the video and/or audio receiving apparatus that the update information has been received.
21 . A video and/or audio transmitting apparatus according to any one of clauses 16 to 20, wherein the ancillary data received by the video and/or audio receiving apparatus is data associated with a software application which is executable by the video and/or audio receiving apparatus based on the received ancillary data and wherein the software application is executable by the video and/or audio receiving apparatus based on the updated ancillary data if it is determined by the video and/or audio receiving apparatus that there is an update associated with the received ancillary data.
22. A video and/or audio transmitting apparatus according to clause 21 , wherein:
the ancillary data associated with the software application comprises one of a first instruction indicating to the video and/or audio receiving apparatus to begin execution of the software application, a second instruction indicating to the video and/or audio receiving apparatus to continue a current execution of the software application and a third instruction indicating to the video and/or audio receiving apparatus to stop execution of the software application; and
the update associated with the ancillary data associated with the software application comprises a change from one of the first, second and third instructions to another one of the first, second and third instructions.
23. A video and/or audio transmitting apparatus according to clause 21 or 22, wherein the ancillary data is an Application Information Table (AIT).
24. A video and/or audio transmitting apparatus according to clause 23, wherein the AIT is represented as an Extensible Markup Language (XML) file.
25. A video and/or audio transmitting apparatus according to any one of clauses 16 to 24, wherein the video and/or audio data comprises a watermark comprising the location information and update information.
26. A video and/or audio transmitting apparatus according to clause 25, wherein the watermark is a hierarchical watermark and the update information is comprised at a higher level of the hierarchy than the location information.

27. A video and/or audio transmitting apparatus according to any one of clauses 16 to 26, wherein, on the basis of the location information, the video and/or audio receiving apparatus is operable to determine one of a plurality of predetermined locations on the network at which the ancillary data is located and to receive the ancillary data from the determined one of the plurality of predetermined locations.
28. A video and/or audio transmitting apparatus according to clause 27, wherein:
each of the plurality of predetermined locations on the network is indicated by a respective predetermined Uniform Resource Indicator (URI); and
the URI of the one of the plurality of predetermined locations on the network at which the ancillary data is located is derivable by the video and/or audio receiving apparatus from the location information.
29. A video and/or audio transmitting apparatus according to clause 28, wherein the ancillary data is an Application Information Table (AIT) and the location information comprises a country identifier, a network identifier (onid), a broadcast delivery system identifier (networktype), a service name (servicename) and a service identifier (sid).
30. A receiver comprising:
first interface circuitry configured to receive a signal representing audio visual data; second interface circuitry configured to request and receive first ancillary data associated with the audio visual data and, based on a change to a property of the signal representing audio visual data, to request and receive second ancillary data;
memory configured to store the first and second ancillary data; and
processing circuitry configured to perform a process by executing code components stored in a non-transitory storage medium using the first ancillary data as an input instruction and to modify the process using the second ancillary data.
31. A system comprising an audio and/or video receiving apparatus according to clause 1 and an audio and/or video transmitting apparatus according to clause 16.
32. A video and/or audio receiving method comprising:
receiving a signal comprising video and/or audio data;
based on location information comprised within the video and/or audio data, receiving ancillary data associated with the video and/or audio data from a location on a network, the location information indicating the location on the network at which the ancillary data is located, executing a predetermined process associated with the received ancillary data, and based on update information comprised within the video and/or audio data, determining whether or not there is an update associated with the received ancillary data and, if it is determined that there is an update associated with the received ancillary data, applying the update to the received ancillary data and executing a predetermined process associated with the updated ancillary data.
33. A non-transitory data storage medium storing a computer program for controlling a computer to perform a method according to clause 32.
34. A video and/or audio transmitting method comprising:
transmitting a signal comprising video and/or audio data to a video and/or audio receiving apparatus, the video and/or audio data comprising location information and update information, wherein
based on the location information comprised within the video and/or audio data, the video and/or audio receiving apparatus is operable to receive ancillary data associated with the video and/or audio data from a location on a network, the location information indicating the location on the network at which the ancillary data is located, and to execute a predetermined process associated with the received ancillary data, and
based on the update information comprised within the video and/or audio data, the video and/or audio receiving apparatus is operable to determine whether or not there is an update associated with the received ancillary data and, if it is determined that there is an update associated with the received ancillary data, to apply the update to the received ancillary data and to execute a predetermined process associated with the updated ancillary data.
35. A non-transitory data storage medium storing a computer program for controlling a computer to perform a method according to clause 34.
Numerous modifications and variations of the present disclosure are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the disclosure may be practiced otherwise than as specifically described herein.
In so far as embodiments of the disclosure have been described as being implemented, at least in part, by software-controlled data processing apparatus, it will be appreciated that a non- transitory machine-readable medium carrying such software, such as an optical disk, a magnetic disk, semiconductor memory or the like, is also considered to represent an embodiment of the present disclosure.
It will be appreciated that the above description for clarity has described embodiments with reference to different functional units, circuitry and/or processors. However, it will be apparent that any suitable distribution of functionality between different functional units, circuitry and/or processors may be used without detracting from the embodiments.
Described embodiments may be implemented in any suitable form including hardware, software, firmware or any combination of these. Described embodiments may optionally be implemented at least partly as computer software running on one or more data processors and/or digital signal processors. The elements and components of any embodiment may be physically, functionally and logically implemented in any suitable way. Indeed the functionality may be implemented in a single unit, in a plurality of units or as part of other functional units. As such, the disclosed embodiments may be implemented in a single unit or may be physically and functionally distributed between different units, circuitry and/or processors.
Although the present disclosure has been described in connection with some embodiments, it is not intended to be limited to the specific form set forth herein. Additionally, although a feature may appear to be described in connection with particular embodiments, one skilled in the art would recognize that various features of the described embodiments may be combined in any manner suitable to implement the technique.

Claims

1. A video and/or audio receiving apparatus comprising:
receiver circuitry operable to receive a signal comprising video and/or audio data;
network interface circuitry operable to send and receive data over a network; and processor circuitry operable:
based on location information comprised within the video and/or audio data, to control the network interface circuitry to receive ancillary data associated with the video and/or audio data from a location on the network, the location information indicating the location on the network at which the ancillary data is located,
to execute a predetermined process associated with the received ancillary data, and based on update information comprised within the video and/or audio data, to determine whether or not there is an update associated with the received ancillary data and, if it is determined that there is an update associated with the received ancillary data, to apply the update to the received ancillary data and to execute a predetermined process associated with the updated ancillary data.
2. A video and/or audio receiving apparatus according to claim 1 , wherein the update information is version information indicative of a current version of the ancillary data located at the location on the network and wherein the processor circuitry is operable to determine that there is an update associated with the received ancillary data when it is determined, based on a change to the version information, that there is a change to the current version of the ancillary data located at the location on the network.
3. A video and/or audio receiving apparatus according to claim 1 , wherein the update information comprises a changing value which changes from a start value to a predetermined end value over time in a predetermined manner and wherein the processor circuitry is operable to determine that there is an update associated with the received ancillary data when it is determined that the changing value is equal to the predetermined end value.
4. A video and/or audio receiving apparatus according to claim 1 , wherein the processor circuitry is operable to control the network interface circuitry to receive updated ancillary data from the location on the network.
5. A video and/or audio receiving apparatus according to claim 1 , wherein the update information comprises the update associated with the received ancillary data and wherein the processor circuitry is operable to update the received ancillary data with the update comprised within the update information when it is determined that the update information has been received.
6. A video and/or audio receiving apparatus according to claim 5, wherein:
updated ancillary data is located at the location on the network;
after the update comprised within the update information has been received, the processor circuitry is configured:
to control the network interface circuitry to receive the updated ancillary data from the location on the network and to compare the update of the received updated ancillary data with the update comprised within the update information;
in the case that the update of the received updated ancillary data matches the update comprised within the update information, to determine that the ancillary data has been correctly updated; and
in the case that the update of the received updated ancillary data does not match the update comprised within the update information, to determine that the ancillary data has not been correctly updated and to execute a predetermined process in response to determining that the ancillary data has not been correctly updated.
7. A video and/or audio receiving apparatus according to claim 1 , wherein:
the received ancillary data is data associated with a software application;
the processor circuitry is operable to execute the software application based on the received ancillary data; and
if it is determined that there is an update associated with the received ancillary data, the processor circuitry is operable to execute the software application based on the updated ancillary data.
8. A video and/or audio receiving apparatus according to claim 7, wherein:
the ancillary data associated with the software application comprises one of a first instruction indicating to the processor circuitry to begin execution of the software application, a second instruction indicating to the processor circuitry to continue a current execution of the software application and a third instruction indicating to the processor circuitry to control the processor to stop execution of the software application; and
the update associated with the ancillary data associated with the software application comprises a change from one of the first, second and third instructions to another one of the first, second and third instructions.
9. A video and/or audio receiving apparatus according to claim 7, wherein the ancillary data comprises an Application Information Table (AIT).
10. A video and/or audio receiving apparatus according to claim 9, wherein the AIT is represented as an Extensible Markup Language (XML) file.
11. A video and/or audio receiving apparatus according to claim 1, wherein the video and/or audio data comprises a watermark comprising the location information and update information.
12. A video and/or audio receiving apparatus according to claim 11, wherein the watermark is a hierarchical watermark and the update information is comprised at a higher level of the hierarchy than the location information.
13. A video and/or audio receiving apparatus according to claim 1 , wherein, on the basis of the location information, the processor circuitry is operable to determine one of a plurality of predetermined locations on the network at which the ancillary data is located and to control the network interface circuitry to receive the ancillary data from the determined one of the plurality of predetermined locations.
14. A video and/or audio receiving apparatus according to claim 13, wherein:
each of the plurality of predetermined locations on the network is indicated by a respective predetermined Uniform Resource Indicator (URI); and
the processor circuitry is operable to derive the URI of the one of the plurality of predetermined locations on the network at which the ancillary data is located from the location information.
15. A video and/or audio receiving apparatus according to claim 14, wherein the ancillary data comprises an Application Information Table (AIT) and the location information comprises a country identifier, a network identifier (onid), a broadcast delivery system identifier (networktype), a service name (servicename) and a service identifier (sid).
16. A video and/or audio transmitting apparatus comprising:
transmitter circuitry operable to transmit a signal comprising video and/or audio data to a video and/or audio receiving apparatus, the video and/or audio data comprising location information and update information, wherein
based on the location information comprised within the video and/or audio data, the video and/or audio receiving apparatus is operable to receive ancillary data associated with the video and/or audio data from a location on a network, the location information indicating the location on the network at which the ancillary data is located, and to execute a predetermined process associated with the received ancillary data, and
based on the update information comprised within the video and/or audio data, the video and/or audio receiving apparatus is operable to determine whether or not there is an update associated with the received ancillary data and, if it is determined that there is an update associated with the received ancillary data, to apply the update to the received ancillary data and to execute a predetermined process associated with the updated ancillary data.
17. A video and/or audio transmitting apparatus according to claim 16, wherein the update information is version information indicative of a current version of the ancillary data located at the location on the network so as to allow the video and/or audio receiving apparatus to determine that there is an update associated with the ancillary data received by the video and/or audio receiving apparatus when it is determined, based on a change to the version information, that there is a change to the current version of the ancillary data located at the location on the network.
18. A video and/or audio transmitting apparatus according to claim 16, wherein the update information comprises a changing value which changes from a start value to a predetermined end value over time in a predetermined manner so as to allow the video and/or audio receiving apparatus to determine that there is an update associated with the received ancillary data when it is determined that the changing value is equal to the predetermined end value.
19. A video and/or audio transmitting apparatus according to claim 16, wherein the video and/or audio receiving apparatus is operable to receive updated ancillary data from the location on the network.
20. A video and/or audio transmitting apparatus according to claim 16, wherein the update information comprises the update associated with the ancillary data so as to allow the video and/or audio receiving apparatus to update the ancillary data received by the video and/or audio receiving apparatus with the update comprised within the update information when it is determined by the video and/or audio receiving apparatus that the update information has been received.
21 . A video and/or audio transmitting apparatus according to claim 16, wherein the ancillary data received by the video and/or audio receiving apparatus is data associated with a software application which is executable by the video and/or audio receiving apparatus based on the received ancillary data and wherein the software application is executable by the video and/or audio receiving apparatus based on the updated ancillary data if it is determined by the video and/or audio receiving apparatus that there is an update associated with the received ancillary data.
22. A video and/or audio transmitting apparatus according to claim 21 , wherein:
the ancillary data associated with the software application comprises one of a first instruction indicating to the video and/or audio receiving apparatus to begin execution of the software application, a second instruction indicating to the video and/or audio receiving apparatus to continue a current execution of the software application and a third instruction indicating to the video and/or audio receiving apparatus to stop execution of the software application; and
the update associated with the ancillary data associated with the software application comprises a change from one of the first, second and third instructions to another one of the first, second and third instructions.
23. A video and/or audio transmitting apparatus according to claim 21 , wherein the ancillary data is an Application Information Table (AIT).
24. A video and/or audio transmitting apparatus according to claim 23, wherein the AIT is represented as an Extensible Markup Language (XML) file.
25. A video and/or audio transmitting apparatus according to claim 16, wherein the video and/or audio data comprises a watermark comprising the location information and update information.
26. A video and/or audio transmitting apparatus according to claim 25, wherein the watermark is a hierarchical watermark and the update information is comprised at a higher level of the hierarchy than the location information.
27. A video and/or audio transmitting apparatus according to claim 16, wherein, on the basis of the location information, the video and/or audio receiving apparatus is operable to determine one of a plurality of predetermined locations on the network at which the ancillary data is located and to receive the ancillary data from the determined one of the plurality of predetermined locations.
28. A video and/or audio transmitting apparatus according to claim 27, wherein: each of the plurality of predetermined locations on the network is indicated by a respective predetermined Uniform Resource Indicator (URI); and
the URI of the one of the plurality of predetermined locations on the network at which the ancillary data is located is derivable by the video and/or audio receiving apparatus from the location information.
29. A video and/or audio transmitting apparatus according to claim 28, wherein the ancillary data is an Application Information Table (AIT) and the location information comprises a country identifier, a network identifier (onid), a broadcast delivery system identifier (networktype), a service name (servicename) and a service identifier (sid).
30. A receiver comprising:
first interface circuitry configured to receive a signal representing audio visual data; second interface circuitry configured to request and receive first ancillary data associated with the audio visual data and, based on a change to a property of the signal representing audio visual data, to request and receive second ancillary data;
memory configured to store the first and second ancillary data; and
processing circuitry configured to perform a process by executing code components stored in a non-transitory storage medium using the first ancillary data as an input instruction and to modify the process using the second ancillary data.
31. A system comprising an audio and/or video receiving apparatus according to claim 1 and an audio and/or video transmitting apparatus according to claim 16.
32. A video and/or audio receiving method comprising:
receiving a signal comprising video and/or audio data;
based on location information comprised within the video and/or audio data, receiving ancillary data associated with the video and/or audio data from a location on a network, the location information indicating the location on the network at which the ancillary data is located, executing a predetermined process associated with the received ancillary data, and based on update information comprised within the video and/or audio data, determining whether or not there is an update associated with the received ancillary data and, if it is determined that there is an update associated with the received ancillary data, applying the update to the received ancillary data and executing a predetermined process associated with the updated ancillary data.
33. A non-transitory data storage medium storing a computer program for controlling a computer to perform a method according to claim 32.
34. A video and/or audio transmitting method comprising:
transmitting a signal comprising video and/or audio data to a video and/or audio receiving apparatus, the video and/or audio data comprising location information and update information, wherein
based on the location information comprised within the video and/or audio data, the video and/or audio receiving apparatus is operable to receive ancillary data associated with the video and/or audio data from a location on a network, the location information indicating the location on the network at which the ancillary data is located, and to execute a predetermined process associated with the received ancillary data, and
based on the update information comprised within the video and/or audio data, the video and/or audio receiving apparatus is operable to determine whether or not there is an update associated with the received ancillary data and, if it is determined that there is an update associated with the received ancillary data, to apply the update to the received ancillary data and to execute a predetermined process associated with the updated ancillary data.
35. A non-transitory data storage medium storing a computer program for controlling a computer to perform a method according to claim 34.
PCT/GB2018/051097 2017-05-26 2018-04-26 Audio and/or video receiving and transmitting apparatuses and methods WO2018215732A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1708487.2A GB2562796A (en) 2017-05-26 2017-05-26 Audio and/or video receiving and transmitting apparatuses and methods
GB1708487.2 2017-05-26

Publications (1)

Publication Number Publication Date
WO2018215732A1 (en)

Family

ID=59270859

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2018/051097 WO2018215732A1 (en) 2017-05-26 2018-04-26 Audio and/or video receiving and transmitting apparatuses and methods

Country Status (2)

Country Link
GB (1) GB2562796A (en)
WO (1) WO2018215732A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050041955A1 (en) * 1998-03-25 2005-02-24 Canal+ Societe Anonyme Authentification of data in a digital transmission system
US20140351860A1 (en) * 2012-05-10 2014-11-27 David W. Chen Media synchronization within home network using set-top box as gateway

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100777409B1 (en) * 2006-06-05 2007-11-19 주식회사 알티캐스트 Method for provisioning network service provider application in digital interactive broadcasting
JP6018798B2 (en) * 2011-05-20 2016-11-02 日本放送協会 Receiving machine
US8839302B2 (en) * 2011-07-14 2014-09-16 Samsung Electronics Co., Ltd. Launching an application from a broadcast receiving apparatus
WO2015012309A1 (en) * 2013-07-23 2015-01-29 シャープ株式会社 Delivery device, delivery method, playback device, playback method, and program
JP6442897B2 (en) * 2014-07-18 2018-12-26 ソニー株式会社 Transmission device, transmission method, reception device, and reception method
JP2016174239A (en) * 2015-03-16 2016-09-29 ソニー株式会社 Transmitter and transmission method, and receiver and reception method

Also Published As

Publication number Publication date
GB201708487D0 (en) 2017-07-12
GB2562796A (en) 2018-11-28


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18722162

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18722162

Country of ref document: EP

Kind code of ref document: A1