WO2012028851A1 - Method and system for synchronising additional services - Google Patents
Method and system for synchronising additional services
- Publication number
- WO2012028851A1 (PCT/GB2011/001288)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- time
- broadcast
- user device
- data
- server
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/85—Assembly of content; Generation of multimedia applications
- H04N21/854—Content authoring
- H04N21/8547—Content authoring involving timestamps for synchronizing content
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04H—BROADCAST COMMUNICATION
- H04H20/00—Arrangements for broadcast or for distribution combined with broadcast
- H04H20/18—Arrangements for synchronising broadcast or distribution via plural systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/4104—Peripherals receiving signals from specially adapted client devices
- H04N21/4126—The peripheral being portable, e.g. PDAs or mobile phones
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04H—BROADCAST COMMUNICATION
- H04H60/00—Arrangements for broadcast applications with a direct linking to broadcast information or broadcast space-time; Broadcast-related systems
- H04H60/09—Arrangements for device control with a direct linkage to broadcast information or to broadcast space-time; Arrangements for control of broadcast-related services
- H04H60/13—Arrangements for device control affected by the broadcast information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04H—BROADCAST COMMUNICATION
- H04H60/00—Arrangements for broadcast applications with a direct linking to broadcast information or broadcast space-time; Broadcast-related systems
- H04H60/35—Arrangements for identifying or recognising characteristics with a direct linkage to broadcast information or to broadcast space-time, e.g. for identifying broadcast stations or for identifying users
- H04H60/38—Arrangements for identifying or recognising characteristics with a direct linkage to broadcast information or to broadcast space-time, e.g. for identifying broadcast stations or for identifying users for identifying broadcast time or space
- H04H60/40—Arrangements for identifying or recognising characteristics with a direct linkage to broadcast information or to broadcast space-time, e.g. for identifying broadcast stations or for identifying users for identifying broadcast time or space for identifying broadcast time
Definitions
- This invention relates to audience experience of broadcast audio-video content.
- In particular, it relates to the degree to which a viewer can adapt and personalise the traditional broadcast experience using content derived from a secondary data source.
- Television broadcasting has existed in commercial form since the 1930s, and whilst the technologies to deliver it have changed over time, the user experience remains largely the same - a single box with a single screen and one or more speakers, playing video and audio chosen by a broadcaster. Since television, and broadcasting in general, is fundamentally about storytelling - a specific take on a story by a given storyteller - this is unsurprising. However, how we enjoy a story depends largely on the senses and capabilities we have available. If I am blind, I need more audio cues. If I am deaf, I need more visual cues. If I speak a different language, I need the story in my language.
- Example adaptations include subtitles, teletext, interactive programming (eg MHEG), signing and audio description (also known as narrative subtitles). Whilst these adaptations have often been termed "accessibility", they are generally useful adaptations, and the term accessibility should be understood broadly to include any functionality that improves access to content. Someone can watch subtitles whilst another person is on the phone. Another may listen to the audio and audio description whilst their eyes are occupied by a second device.
- Broadcast spectrum is limited. Better use of broadcast spectrum has been achieved by switching from analogue broadcast to digital broadcast. Balances between signal robustness and bit rate have been made in an attempt to increase available bandwidth. Video and audio quality has been degraded in order to fit in ever more channels, and ever more adaptation services. Despite all of these technological changes, there are both theoretical and practical limits on how many services you can fit in the broadcast chain. There is, in essence, a broadcast bandwidth limit. Over the past 20-25 years, however, newer mechanisms for delivery of content have been growing in popularity. Examples here include the Internet and mobile telephone networks (both voice and data).
- Clients of such networks have become very small, very powerful devices, more than capable of playing back audio and video, and a plethora of user-choice-driven services for audio and video have come into existence. Furthermore, the capabilities of the audience have grown beyond simply having one video and audio display. They may have a laptop, several mobile phones, and even a network-connected audio system, including a 3D surround sound system, all within a single room. Adapting the broadcast service to such systems is possible, but again hits the broadcast bandwidth limit, limiting the number of services which can expand to such a system.
- An example here is a television or set-top box with an integrated network connection.
- These typically timestamp the Internet delivered secondary content (eg secondary audio track) with a presentation timestamp that correlates with a presentation timestamp in the broadcast content, allowing the two signals to be integrated in the receiver.
- U.S. Pat. No. 7,634,798 to Watson discloses a system for enhanced interactive television. This does indeed take advantage of the multi-device nature of the modern home, allowing a user to participate in interactive experiences in time with a broadcast event.
- However, this system does not attempt to enhance the linear nature of storytelling, and in particular offers a process-oriented approach based upon the execution of commands, requiring the use of a common clock for interactivity. As a result this approach is primarily oriented towards "events" such as quizzes, rather than general programmes.
- There is also prior art aimed at enabling more detail regarding advertising to be displayed synchronously with a programme, which may be viewed as a form of content adaptation.
- It is preferable for a local device to be able to synchronise retrieved additional content without modification to the broadcast chain and, in particular, without modification to the broadcast receiver device.
- The invention provides a system for synchronising retrieved additional service data with a broadcast service as received at a broadcast receiver, by additionally receiving the broadcast service at a local time server and supplying a time or synchronisation signal from the local time server to an additional service device.
- The use of such an extra local time server allows the additional service device, such as a laptop, mobile or the like, to determine the time of programme content as received at the local time server and hence, within the tolerance needed, as received at the broadcast receiver, and thereby to synchronise assertion of additional services with the programme at the broadcast receiver.
- The additional device is thereby provided with a synchronisation signal relative to the programme as received at the broadcast receiver, but without requiring a connection between the additional device and the broadcast receiver.
- Figure 1 shows an overview of a system embodying the invention;
- Figure 2 is a schematic view of the core functional components of a system embodying the invention;
- Figure 3 is a time line showing receipt of timed signals from a broadcast time server to an additional service device;
- Figure 4 is a time line showing transmission of a test time signal from an additional service device to a broadcast time server and back again for calculation of transmission time delay;
- Figure 5 is a schematic view of the core functional components of a user device.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
- The embodiment of the invention seeks to provide the viewer with the possibility of enhancing their traditional television broadcast experience, by using the devices they own to play back additional adaptation content synchronously with the broadcast.
- The audience's devices utilise a network connection to identify the programme being watched, the time into the programme, and the broadcaster's concept of time as received at the user's receiver. This enables these additional devices to synchronise a local playout scheduler for playback of additional content.
- Additional adaptive content can be played back timed relative to a programme, or to a schedule.
- This additional content can be obtained by the audience's devices via a number of means, including pulled from a network (Internet, mobile, etc), pushed from the network by the broadcaster (eg over XMPP, SMS, MMS), on pre-recorded media (CDROM, DVD, memory stick etc), or even from a previous broadcast by the broadcaster.
- The system does not require a continuous connection to a central server, and enables each device the user or users own to be synchronised to the broadcast, allowing unlimited adaptations and removing the broadcast bandwidth limitation, without change to a pre-existing broadcast infrastructure.
- Combining the adaptability of modern networks, such as the Internet, with broadcast enables broadcasters to send additional adaptation services over such modern networks, without the need for additional broadcast spectrum, essentially enabling them to break the broadcast bandwidth limit.
- The broadcaster broadcasts the common core of the story being told using audio and video, but can present adaptations - for example subtitles, audio description, director's narrative - via another medium based around user choice, such as the Internet or mobile.
- A broadcast transmitter 10 transmits television programmes (over air or by cable) which are received at user receivers 12, which may be set-top-boxes, Freeview receivers, television receivers, or other receivers of broadcast audio-video services. It is noted now, for ease of understanding, that there is a broadcast delay between the broadcast transmitter 10 and the receiver 12 of a user.
- The user also has an additional device, such as a mobile telephone, laptop, or indeed any user device 14 capable of receiving data related to the programme being received by the receiver 12.
- Such data may be considered as "additional data" 16 in the sense that it supplements or is additional to the corresponding programme being broadcast and received at the receivers.
- The additional data may also be referred to as supplementary data, additional service data or the like, the important point being that the data in some way relates to the programme broadcast from the transmitter and received at the receivers.
- The user device may be referred to as an additional service device for ease of description. It is desired that this additional service device should assert retrieved additional data relating to the programme being viewed on the receiver 12 in synchronisation with that programme.
- The system does not have any connection between the additional service device 14 and the broadcast receiver 12, because it is preferred not to require any modification to a standard broadcast receiver 12.
- Instead, the system provides a broadcast time server 18 which receives the broadcast signals and extracts from the signals broadcast timing information and programme time information, which the additional service device can then receive.
- The additional service device is thereby able to calculate, based on its own internal clock, when additional content should be asserted in relation to the programme as received at the local time server, and therefore as received at the user's own receiver 12.
- The additional service may comprise text, audio-video, 3D information, telemetry and a wide variety of other possible data, and can be retrieved by any suitable route, such as via the Internet or a mobile telephone network.
- The system shown in Figure 2 comprises a broadcast transmitter 10 and receiver 12 as already described, and an additional service device 14, such as a mobile telephone, laptop or the like, which can retrieve additional service data from an additional service data store 16.
- A broadcast time server 18 also receives the broadcast programme from the broadcast transmitter 10 over a broadcast channel 24.
- The additional service device may communicate with the broadcast time server over a communication link 22.
- The broadcaster makes available a television broadcast service (TBS) which includes a programme service and a time clock, along with now and next information which identifies the currently broadcast programme.
- This may be as simple as an analogue service with a clock on a teletext data service along with a now and next page.
- In a digital service, the time may be taken from a digital video broadcasting (DVB) time and date table of the programme status information, and the now and next information from the event information table of the programme status information.
- The time clock broadcast by the transmitter 10 and received at the broadcast time server 18 is indicative of the broadcast time clock as received at the receiver 12.
- The difference in time of receipt depends upon the difference between the time A of transmission via the broadcast channel 20 and the time B of transmission via the broadcast channel 24.
- The broadcast time server 18 may therefore be local to a locality, such as a particular city or geographical area of a country, such that the time difference between time A and time B is of the order of tens of milliseconds and therefore imperceptible to the user.
- The locality may therefore be considered a "transmitter region", in the sense that all receivers receiving the broadcast signal from a given transmitter are considered to be within the same locality.
- The locality may therefore be defined by a certain distance from the terrestrial broadcast antenna.
- Alternatively, the locality may be the footprint of a satellite, which could cover an entire country.
- The concept of "locality" therefore relates to the fact that all receivers in the given locality receive the broadcast transmission at substantially the same time, and any differences in time of arrival of the broadcast transmission are imperceptible to a user.
- The now and next information mentioned above is a specific example of programme clock information, and is the preferred choice in a DVB implementation of the system.
- The now and next information indicates the start of a given programme with reference to the broadcast time clock.
- The now and next information is typically broadcast once per second, giving an indication of the current programme and the next programme to be broadcast.
- An extra now and next information signal is transmitted when a programme changes. The junction between different now and next signals may be used as the indicator of a new programme relative to the broadcast time clock.
- The broadcast time server 18 thus receives a broadcast time clock within an allowable tolerance, and also the timing of a programme being broadcast relative to the broadcast time clock.
- The broadcaster makes the broadcast time server 18 (BTS) available over a communications network 22.
- This time server provides the broadcaster's concept of time as received by receivers of the broadcast, rather than as generated by a playout system prior to transmission, and thus represents actual broadcast time, irrespective of delays integral to a playout system, which may change over time.
- The broadcast time server also provides a current programme server (CPS), available over the communications network 22. This receives the now and next information, for example from the event information table of DVB's programme status information, to identify the current broadcast programme, and makes this available as a network-queryable service.
- The broadcast time server also provides a time into current programme server (TICPS), available over the communications network. This receives the now and next information and monitors it for changes. When the now and next information changes, the system takes the junction point as the start time for the new programme.
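The junction-monitoring behaviour of the TICPS described above can be sketched as follows (an illustrative Python sketch; the stream representation and function name are assumptions, not part of the patent):

```python
def programme_start_times(now_next_stream):
    """Yield (broadcast_time, programme) at each now/next junction.

    now_next_stream yields (broadcast_time, current_programme) pairs,
    e.g. one pair per second as broadcast.  A change in the current
    programme marks the junction, which is taken as the start time of
    the new programme.
    """
    previous = None
    for t, prog in now_next_stream:
        if previous is not None and prog != previous:
            yield (t, prog)
        previous = prog
```

A consumer can therefore derive programme start times purely from the broadcast stream, without any out-of-band schedule.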
- The broadcast time server (BTS) thereby provides the current programme and the time into the current programme, with which the additional service can synchronise.
- The broadcast time server may derive the time signal from the audio-video transmission in a variety of ways.
- One approach may be based on the timing of subtitles for a programme. Subtitles are timed accurately relative to a programme, so extracting the subtitles and the associated programme clock reference gives a direct indication of the time into the current programme, and this may be used as the derived broadcast time signal.
- The broadcaster also makes available an additional services server system 16 (ASSS), available over a communications network 26.
- These services are made available as declarative timed schedules.
- The simplest of these comprises a list of timestamps and textual information.
- More complex schedules comprise a list of triples, where each triple is a timestamp, a datatype tag, and an arbitrary octet sequence. These timestamps may be relative to the current programme time, or relative to the broadcaster's concept of time.
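Such a schedule of triples might be represented as follows (an illustrative Python sketch; the field and function names are assumptions, not part of the patent):

```python
from typing import NamedTuple

class ScheduleEntry(NamedTuple):
    """One triple of an additional service description (illustrative names)."""
    timestamp: float   # seconds, relative to programme time or broadcast time
    datatype_tag: str  # interpretation rules are defined by the broadcaster
    payload: bytes     # arbitrary octet sequence

# A simple subtitle-style schedule: timestamps plus textual information.
schedule = [
    ScheduleEntry(12.5, "text/plain", b"A subtitle line"),
    ScheduleEntry(15.0, "text/plain", b"Another subtitle line"),
]

def entries_due(schedule, programme_time):
    """Return the entries whose timestamps have been reached."""
    return [e for e in schedule if e.timestamp <= programme_time]
```

A local playout scheduler would poll `entries_due` against its synchronised programme clock and assert each entry as it falls due.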
- The system requires the broadcaster to ensure that the now and next information as broadcast is synchronised with the programme. This is practical thanks to automated playout systems, and is essentially a configuration of the broadcaster's existing equipment, rather than a change to it.
- The audience has a standard receiver 12 which receives the television broadcast service and displays it.
- An example receiver is a traditional television; another would be a digital video broadcasting receiver.
- The audience also has a secondary device 14 having a network client.
- The audience may have a plurality of such devices.
- Each network client is synchronised with the broadcast chain, and hence with the audience's receiver as will now be described.
- The user configures the device for a particular broadcast service.
- The configuring of a device for a particular broadcast service may be by any convenient user interface.
- A typical approach on a smart mobile telephone would be to access a mobile-compatible web page from which a given broadcast service may be selected by the user.
- The additional service device may then immediately commence retrieval of additional service data and cache this data in readiness for presentation to the user at the appropriate time relative to the broadcast being viewed on the television receiver 12.
- The additional service data may be retrieved from a remote server over a network, as shown, but equally may be retrieved from a local store or any other storage arrangement such as CDROM, memory and so on.
- The device also contacts the time server 18 and synchronises a client local application clock (CLAC).
- The synchronisation of the local clock of the additional service device 14 with the time server 18 can be achieved in a number of ways.
- A possible approach is simply to receive a number of time samples from the time server and to reproduce a clock locally, based on calculating the skew and drift relative to the system clock of the client additional service device 14.
- However, such an approach would omit any calculation of the network latency of the network 22 via which the time signals are provided.
- A time delta D may therefore exist between the broadcast time server and the additional service device.
- The preferred approach to calculating this latency is for the additional service device to transmit a signal to the broadcast time server and for the broadcast time server to return a signal, so that the additional service device can calculate the time delta D from the transmitted and received signals.
- A time server may choose to allow the client to synchronise using a known network time synchronisation algorithm, such as Marzullo's algorithm (as used by the Network Time Protocol (RFC 1305), the Simple Network Time Protocol (RFC 4330), etc).
- The broadcast time server can repeatedly transmit a clock signal in the time server domain (TS1, TS2 ... TSN). This is received after a latency time D at the additional service device.
- The time signals are received at corresponding times in a local clock time domain (LC1, LC2 ... LCN).
- A broad view of time at the local device in the broadcast time domain may then be calculated, so that the connection does not need to be permanently maintained. This is done by calculating the relative drift of the two clocks: dividing the difference between TSN and TS1 by the difference between LCN and LC1 gives the relative rate of the clocks.
- Multiplying this relative rate by the difference between the local clock at any given point in time and the local clock at the start point, and adding the result to TS1, provides a broad view of time in the broadcast time server domain, effectively removing any difference in the rates of the two clocks.
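The calculation described above can be sketched as follows (illustrative Python; the function and parameter names are assumptions):

```python
def broadcast_time_estimate(ts_samples, lc_samples, lc_now):
    """Estimate the time in the broadcast time server domain at local
    clock reading lc_now.

    ts_samples are the server timestamps TS1..TSN; lc_samples are the
    local clock readings LC1..LCN at which they arrived.  The relative
    rate of the two clocks is (TSN - TS1) / (LCN - LC1).  The latency D
    is deliberately NOT accounted for here; that is handled by the
    separate test-signal exchange of Figure 4.
    """
    rate = (ts_samples[-1] - ts_samples[0]) / (lc_samples[-1] - lc_samples[0])
    return ts_samples[0] + (lc_now - lc_samples[0]) * rate
```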
- However, this does not take account of the latency delta D, and for this purpose the client device can determine the delta D by transmission and reception of a test signal, as shown in Figure 4.
- The preferred test signal is for the local device simply to transmit a time stamp at a given time, which may be in the broad view of time domain calculated above, here shown as BVT1.
- The time stamp is received at the broadcast time server and immediately transmitted back to the local device, along with a time indicator in the time server time domain, here shown as TS1.
- This signal, containing BVT1 and TS1, is received at the local device at a second time BVT2.
- The additional service device can then calculate the time delta D by the simple subtraction BVT1 - TS1. Alternatively, if it is assumed that the uplink and downlink times are the same, then the additional service device can calculate BVT2 - BVT1, which will give twice the time delta, 2D.
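The round-trip variant, which assumes symmetric uplink and downlink times, can be sketched as (illustrative Python; the function name is an assumption):

```python
def one_way_delay(bvt1, bvt2):
    """Estimate the one-way latency D from the test exchange of Figure 4.

    bvt1 is the local send time and bvt2 the local receipt time of the
    echoed reply, both in the broad view of time domain.  Assuming
    symmetric links, the round trip BVT2 - BVT1 equals 2D.
    """
    return (bvt2 - bvt1) / 2.0
```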
- The broadcast time server thus provides the additional service device with the ability to derive the broadcast time clock as received at the receiver 12, and also the start and stop times of programmes relative to the broadcast clock.
- The synchronisation of the local clock of the user device with the time server as described above may be performed just once, when a user requests additional data to be presented at their user device; thereafter the local clock in the user device is sufficiently accurate that content can be presented relative to the local clock. If a user requests the additional content just once, though, and then continues to use their device all day, perhaps using data related to a given television channel, it is possible for the local clock to drift relative to the broadcast time server. Accordingly, the broadcast time server may periodically push the synchronisation signals described above, or alternatively the user device may periodically pull the time synchronisation signals.
- The device's network client can then query the current programme server for the current programme, and use this information to choose which additional service to use.
- This may be one stored locally - for example on a DVD, CDROM, hard drive, or similar storage device - or one from a network location, such as an HTTP download, an FTP download, or by requesting the additional service description (ASD) from a mobile system, for example by sending an SMS to a server and receiving an MMS reply with the ASD as the payload.
- The network client can then use the additional service description to schedule events to occur at particular times relative to the network client's local application clock (CLAC).
- The network client interprets the message according to rules appropriate for the specific additional service description.
- In the simplest case, the event is defined as simply the textual data.
- The interpretation of such data is simply to display the data.
- The textual data display system may choose to detect HTML-formatted text, and render such fragments according to HTML rendering rules.
- Where an additional service description triple is a timestamp, a datatype tag and an arbitrary octet sequence, the event is defined to be the contents of the octet sequence, interpreted according to rules defined relative to the datatype tag.
- The rules for the datatype tag are defined by the broadcaster.
- Figure 5 is a schematic view of the core functional components of a user device, along with its external dependencies on the broadcast time server and additional service data store.
- The broadcast time server provides three services on a given IP address, with each service on a separate port.
- A programme time summary service is provided on port 1700.
- The user client uses these, along with the data from the additional service data store, to control one or more output devices in a timely fashion.
- The user client's broadcast time synchronisation subsystem 30 initiates a TCP connection to port 1800 on the broadcast time server 18.
- The time server responds by sending an octet stream to the client.
- The octet stream forms a string, terminated by the connection close.
- The octet sequence forms a string of textual characters representing a floating point number.
- The user client may then parse this string to turn it into an internal representation of a floating point number.
- The floating point number is the number of seconds since the beginning of what is known as the Unix Epoch - or more specifically, the precise length of time, according to the broadcast time server, that has elapsed, in seconds, since 00:00:00 on the 1st of January 1970.
- This time is the remote baseline time.
- The user client's broadcast time synchronisation subsystem 30 then reads the user device's system clock, and that time is denoted the local baseline time. For example, the octet stream "1249038001.709007" represents the time Friday Jul 31 11:00:01 2009 and 0.709007 seconds.
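The parsing step might look like this (illustrative Python; the function name is an assumption):

```python
import time

def parse_time_octets(raw: bytes) -> float:
    """Parse the time server's octet stream (a textual floating point
    number of seconds since the Unix Epoch) into a float."""
    return float(raw.decode("ascii").strip())

# The remote baseline time, paired with a system-clock read taken at
# the same moment as the local baseline time.
remote_baseline_time = parse_time_octets(b"1249038001.709007")
local_baseline_time = time.time()
```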
- the broadcast time synchronisation subsystem 30 can initiate a second TCP connection to port 1800 on the broadcast time server. Again, it receives a time back. This time can be denote remote current time. The system clock is read, and this is denoted local current time.
- the remote elapsed time is calculated by subtracting the remote baseline time from the remote current time.
- the local elapsed time is calculated by subtracting the local baseline time from the local current time.
- a ratio denoting the skew between the two clocks can be calculated by dividing the local elapsed time by remote elapsed time. This enables a calculation to be performed that transforms a local time derived from the system clock into the remote time. That is this allows the user device to map from local system clock time to the broadcast view of time as received by a broadcast receiver. To do this, this requires the triplet of information (local baseline time, remote baseline time , clock ratio )
- the user application clock provides two services to other subsystems. One is the ability to provide the current time according to the broadcast view of time, the other is to sleep for a given number of seconds, including fractional seconds, according to the broadcast view of time. This given a time "now" taken from the system clock, a first order approximation of the broadcast time can be derived from the calculation: remote baseline time + (now - local baseline time) * ratio .
- the application clock simply divides the requested number of seconds to sleep by the ratio. This is to transform broadcast elapsed time into local elapsed time. The sleep service may then sleep for this time period, waking after the local elapsed time in sync with the remote elapsed time.
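The calibration and application clock described above can be sketched in Python. The class and function names are illustrative, not taken from the patent; the ratio is computed as remote elapsed time over local elapsed time, consistent with the mapping and sleep formulas just described:

```python
import socket
import time

def read_remote_time(host, port=1800):
    """Open a TCP connection to the broadcast time server and read one
    octet-stream timestamp such as "1249038001.709007"."""
    with socket.create_connection((host, port)) as sock:
        data = b""
        while chunk := sock.recv(4096):
            data += chunk
    return float(data)

class ApplicationClock:
    """Maps local system clock time onto the broadcast view of time
    using the triplet (local baseline, remote baseline, ratio)."""

    def __init__(self, local_baseline, remote_baseline, ratio):
        self.local_baseline = local_baseline
        self.remote_baseline = remote_baseline
        self.ratio = ratio

    def to_broadcast(self, local_now):
        # First order approximation:
        # remote baseline + (now - local baseline) * ratio
        return self.remote_baseline + (local_now - self.local_baseline) * self.ratio

    def now(self):
        return self.to_broadcast(time.time())

    def sleep(self, broadcast_seconds):
        # Transform broadcast elapsed time into local elapsed time.
        time.sleep(broadcast_seconds / self.ratio)

def calibrate(host, interval=5.0):
    """Take baseline and current readings and derive the skew ratio."""
    remote_baseline = read_remote_time(host)
    local_baseline = time.time()
    time.sleep(interval)  # any interval long enough to expose skew
    remote_current = read_remote_time(host)
    local_current = time.time()
    ratio = (remote_current - remote_baseline) / (local_current - local_baseline)
    return ApplicationClock(local_baseline, remote_baseline, ratio)
```

With a ratio of 1.0 the two clocks run at the same rate and the mapping reduces to a constant offset between the baselines.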
- the network delay measurement subsystem 34 retrieves the broadcast view of time from the application clock. This time is denoted the send time, and is represented by a floating point number giving the number of seconds elapsed since 1st January 1970. The subsystem then initiates a TCP connection to the broadcast time server on port 1801 and sends an octet sequence string representation of the send time. This string is terminated by the addition of a network end of line - that is "\r\n", or specifically the raw octet values 13 and 10 respectively.
- the broadcast time server treats the network end of line "\r\n" as a message terminator, and throws the network end of line away. It then appends a space to the message, followed by the current time encoded as an octet sequence which is again a string representation of a floating point value representing the number of seconds elapsed since 1st January 1970.
- the broadcast time server then terminates the TCP connection to signify that it has finished sending its response message.
- For example, if the network delay is 50ms, then the time on the time server will be 50ms ahead of the user device's application clock. Additionally, the message from the user device's network delay measurement subsystem will take a further 50ms to reach the server.
- if the user device sent the send time 1249038300.000000, then at the moment of sending, the broadcast time server's clock would be 1249038300.050000.
- by the time the message reaches the server, the broadcast time server's clock would be 1249038300.100000.
- the response message sent by the broadcast time server to the user device would therefore be "1249038300.000000 1249038300.100000"
- the user client device can then parse these two timestamps to give sent time and remote time .
- the user client also retrieves an expected time from the application clock. This time should match the remote time within a certain tolerance level.
- the tolerance level used by the preferred embodiment is 10ms. If the difference between remote time and expected time is not within this tolerance, the user device restarts the clock calibration process.
- the round trip network delay is then calculated by subtracting the sent time from the remote time.
- the one way network delay, and hence error in the local application clock can then be determined by dividing the round trip network delay by two. This delay is then used to calibrate the user device application clock.
- the user device application clock simply uses this network delay by adding it to the times it currently calculates.
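The delay measurement round trip can be sketched as follows; the parsing is separated from the socket code so the arithmetic is visible, and the function names are illustrative:

```python
import socket

def one_way_delay(response):
    """Parse a response of the form "<sent time> <remote time>" and derive
    the one-way network delay: round trip = remote - sent, one way = half."""
    sent_str, remote_str = response.strip().split(" ")
    round_trip = float(remote_str) - float(sent_str)
    return round_trip / 2.0

def measure_delay(app_clock_time, host, port=1801):
    """Send the broadcast-view send time, terminated by "\r\n", then read
    the server's echoed response until it closes the connection."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(f"{app_clock_time:.6f}\r\n".encode())
        data = b""
        while chunk := sock.recv(4096):
            data += chunk
    return one_way_delay(data.decode())
```

Using the 50ms example above, the response "1249038300.000000 1249038300.100000" yields a round trip of 0.1s and a one-way delay of 0.05s, which is then added to the application clock's calculated times.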
- the application clock is synchronised with the broadcast view of time as received by a broadcast receiver.
- the user is then required to inform the user device what channel they are watching.
- the user must do this because in the preferred embodiment the user's set top box and broadcast chain are unmodified.
- other embodiments may obtain this from another system that is able to communicate with enhanced set top boxes, which can inform external devices what channel the set top box is tuned to.
- the channel could be determined via audio watermarking or by any other method. In the preferred embodiment, this is achieved by the user typing the channel name on a keyboard, though clearly a graphical menu system or voice recognition system could be used to achieve the same goal.
- the programme time client 36 then creates a TCP connection to port 1700 on the broadcast time server to connect to the programme time summary service.
- the programme time client sends the octet string "summary\r\n", that is, the single word "summary" followed by a network end of line sequence.
- the server then sends a response message to this, which is an octet sequence terminated by the connection being closed.
- the programme time client 36 can then parse this response as follows.
- the expected format of the response string is ( "OK" | "ERROR" ), followed by a space, the command tag, another space, and the command result, terminated by a network end of line.
- the command tag is the command the programme time client sent to the server.
- the response is the required message. This can be parsed by searching the string for the first space character, and taking the response code - OK or ERROR - from the characters in the string preceding that first space.
- the command tag can be found by searching for the second space character in the response.
- the command can then be extracted from between the first and second space characters in the response.
- the actual command result - the result can be extracted from the response string by extracting the string after the second space character up to and excluding the 2 network end of line characters at the end of the string.
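The space-based parsing just described can be sketched as follows (the function name is illustrative):

```python
def parse_response(response):
    """Split a programme time server response of the form
    "OK summary <result>\r\n" into (code, command, result)."""
    first = response.index(" ")              # first space ends the response code
    code = response[:first]                  # "OK" or "ERROR"
    second = response.index(" ", first + 1)  # second space ends the command tag
    command = response[first + 1:second]
    result = response[second + 1:-2]         # drop the trailing "\r\n" pair
    return code, command, result
```

Note that positional slicing is what makes this robust: the result portion may itself contain spaces, so only the first two spaces are treated as delimiters.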
- the response is a JSON encoded message - irrespective of success or error.
- JavaScript Object Notation - JSON - is an encoding format commonly used by internet clients, and is defined in RFC 4627, as published by the Internet Engineering Task Force (IETF). The standard can be found at this URL: http://www.ietf.org/rfc/rfc4627.txt . The response is then decoded by the client using any suitable JSON parser.
- the response represents a "dictionary" object, that is an object that maps keys to values.
- the keys are channel names
- the values are arrays. These arrays are pairs of values - the first is a timestamp at which the programme started, the second is the programme name.
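Decoding the summary into a (start time, programme name) pair for the channel being watched might look like this; the channel names and values in the example are invented for illustration:

```python
import json

def programme_for_channel(summary_result, channel):
    """The summary is a JSON dictionary mapping channel names to
    [programme start timestamp, programme name] pairs."""
    summary = json.loads(summary_result)
    start_time, name = summary[channel]
    return start_time, name
```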
- the programme start time has been derived by the broadcast time server from the broadcast chain.
- the preferred embodiment, which uses DVB-T, performs this by monitoring the now and next event information table for changes, and using this as the programme start time. This could also be derived from subtitle junction changes. It could also be enhanced by the broadcaster inserting a marker indicating the start time of the programme into the broadcast transmission - for example using the related content table descriptor in DVB.
- the programme start time information is then passed on to the Timed Event Scheduler 38.
- the programme start time and programme name are passed on to the events retriever subsystem 40.
- the events retriever subsystem 40 uses the programme name to read a given scheduled events file from a file system on the user device.
- the events retriever subsystem is preconfigured to connect to an additional services server system and makes a request for the scheduled events file for the given programme name being broadcast at the given time.
- the server is an HTTP (web) server, which responds to POST requests on a preconfigured URL containing the programme name and programme start time.
- the additional services server system then responds with the scheduled events file.
- the timed event scheduler subsystem takes the scheduled events file and parses it.
- the scheduled events file is a JSON format file containing one object representing a schedule.
- the schedule object is an array of event objects. Each event object is an array consisting of 3 parts - a timestamp, an event type and event data. In the case of the events file being timed against broadcast time, this schedule object can be used "as is" to drive the timed event scheduler 38.
- in the alternative case, the timestamps will be relative to programme time.
- the timed event scheduler has to take the programme start time - as provided by the programme time server and add this to each of the timestamps in the schedule object - mapping programme time to broadcast time. Now that the timestamps are relative to broadcast time, the schedule object can then be used to drive the timed event scheduler.
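Rebasing a programme-time schedule onto broadcast time can be sketched as follows; the example events in the test are invented for illustration:

```python
import json

def to_broadcast_schedule(events_json, programme_start_time):
    """Parse a scheduled events file - a JSON array of
    [timestamp, event type, event data] entries - and add the
    programme start time to each timestamp, mapping programme
    time to broadcast time."""
    schedule = json.loads(events_json)
    return [[ts + programme_start_time, event_type, data]
            for ts, event_type, data in schedule]
```

A schedule already timed against broadcast time would skip this step and be used as is.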
- the timed event scheduler then consists of two key portions - a timed scheduler, and an event handler.
- the timed scheduler uses the application clock to drive a local scheduler. This works through the schedule object - that is the array of events in order, looking at the timestamps.
- For each timestamp, it looks at the current (broadcast) time as retrieved from the application clock, and subtracts that from the next event's timestamp. It then uses the sleep service from the application clock to sleep for the given (broadcast) time period (or 0s if the difference is less than zero). Once the scheduler has finished sleeping for the given period, the time for the scheduled event has been reached. The scheduler then looks at the event type to determine how to handle the event data. In particular, the scheduler can then send the event data to different output subsystems.
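The scheduler loop can be sketched against the two application clock services described earlier (a now service and a sleep service); the handler mapping is an assumption for illustration:

```python
def run_schedule(schedule, app_clock, handlers):
    """Walk the schedule in order, sleeping in broadcast time until
    each event is due, then dispatch the event data by event type."""
    for timestamp, event_type, event_data in schedule:
        # Broadcast-time difference between the event and now,
        # clamped to zero if the event is already past due.
        delay = timestamp - app_clock.now()
        app_clock.sleep(max(delay, 0.0))
        handler = handlers.get(event_type)  # e.g. "text", "audio", "arduino"
        if handler is not None:
            handler(event_data)
```

Any object providing `now()` and `sleep()` in broadcast time can drive the loop, which is what makes the scheduler independent of the clock calibration details.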
- there will be one or more such output subsystems 42. These subsystems can be audio systems, text display systems, video display systems or even systems that control physical devices via interface boards such as arduino. Thus the system can use the event data to produce a variety of synchronised responses in addition to the main broadcast channel.
- the event type "text" causes the data to be sent to a text display output subsystem;
- the event type "audio" causes the data to be passed to an audio output subsystem and played back;
- the event type "arduino" causes the data to be sent to an arduino output subsystem, which passes the event data unmodified over a serial port to an attached arduino device, which may spin a motor or flash a light, etc., in response to the command.
- an arduino is simply a microcontroller-based electronic breakout box. This allows the system to use the event data to control anything from robots to motion systems, lighting systems and so on, which means that the broadcast content can be synchronously augmented by anything within the imagination of the programme maker.
- the user device makes HTTP requests for each of the 3 services described above.
- the MIME type of the response is application/json, containing a dictionary object with 3 representations of time - the timestamp as before, an English textual representation, and a 9 part array of year, month, day, hours, minutes, seconds, weekday (0..6, Monday is 0), day in year, and whether daylight saving is active.
- An example response is: { "localtime": [2010, 7, 5, 17, 21, 10, 0, 186, 1], "asctime": "Mon Jul 5 17:21:10 2010", "time": 1278346870.0 }.
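The 9 part array in the example matches the convention of a Unix struct tm / Python time.struct_time (weekday with Monday as 0, day in year, daylight saving flag). A sketch of building such a response from a single timestamp, using UTC here for determinism, whereas the example above evidently used local time with daylight saving active:

```python
import json
import time

def time_service_response(timestamp):
    """Build the three representations of time described above from
    one Unix timestamp (seconds since 1st January 1970)."""
    parts = time.gmtime(timestamp)  # struct_time; tm_wday: Monday is 0
    return json.dumps({
        "localtime": list(parts)[:9],  # year..seconds, weekday, yday, isdst
        "asctime": time.asctime(parts),
        "time": timestamp,
    })
```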
- the MIME type of the response is also application/json, and contains a dictionary object with the same 3 representations of time as the time service, but additionally includes an extra field which contains the send time from the user device.
- the MIME type of the response is again application/json, and contains a dictionary object representing a summary of all channels, programme start times and programme names as before.
- the format of this response is exactly the same as the response previously described.
- the user device uses this data in precisely the same way as before.
- An example network client may connect to a service based around interpreting events in the service in the same way as a web browser interprets content.
- An example datatype tag for a broadcaster could be "text/html", whereupon the network client interprets the octet sequence as HTML to be rendered according to HTML rendering rules.
- Another example may be "base64/audio/wav", whereupon the network client interprets the octet sequence as a base 64 encoded audio file in wav form, which would then be played at that point in time.
- Another example may be "link", whereupon the network client interprets the octet sequence as an URL to be downloaded and interpreted, as a web browser would, in the usual way as soon as possible.
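A minimal dispatcher over these datatype tags might look like the following; the outputs mapping and its handler behaviour are assumptions for illustration:

```python
import base64

def handle_event(datatype, octets, outputs):
    """Interpret an event's octet sequence according to its datatype
    tag and hand it to the matching output handler."""
    if datatype == "text/html":
        outputs["html"](octets.decode())            # render as HTML
    elif datatype == "base64/audio/wav":
        outputs["audio"](base64.b64decode(octets))  # decoded WAV bytes
    elif datatype == "link":
        outputs["fetch"](octets.decode())           # URL to fetch and render
```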
- a network client could choose to precache a local copy of the link's content.
- Such a network client could render textual data - such as subtitles, links, textual footnotes and comments - as well as audio, video, flash and so on, synchronously with the broadcast.
- Another network client may connect to a service based solely around audio, and play back audio events synchronously with the broadcast.
- audio services could include audio description, director narratives, audio footnotes, or even 3D surround sound (such as ambisonics).
- Another network client may be a games console or similar device rich in computer processing power.
- Such a device may connect to an additional service description which is a moving 3D model of the programme as broadcast. (This may be captured via a system such as ORIGAMI, or i3DLive).
- This network client may also act as a receiver system, and project the broadcast video as a texture onto the 3D model, providing the audience with a 3D video experience.
- video data may also be provided as additional services, and synchronised using an appropriate additional service description. These secondary views could then be applied to the 3D model providing higher quality 3D video.
- Such a platform could also be capable of creating a local stereoscopic 3D rendering for playback on a suitable stereoscopic display.
- Other possible service descriptions may include information such as telemetry, motion vectors, olfactory or even gustatory data, enabling the control of the location of local devices, force feedback (games consoles), generation of timely smells or tastes.
- the system described is applicable to analogue or digital terrestrial, cable or satellite television.
- the receiver described may be a terrestrial, satellite or cable digital television, set-top-box, or other separate audio or video decoder.
- the network over which the additional service data is provided may be the Internet, a mobile phone data service or the like, and may use HTTP, TCP, UDP, XMPP, IRC or another known protocol. Similarly, the request for time may use any of these protocols and may be sent by SMS or MMS.
- the processing mechanism which is used to process the additional service data may be specific to a network or may be according to a standard, and will depend upon the type of the additional service data. As already discussed, this may be text, audio, video or telemetry, and may include data such as 3D model data, motion vectors, footnotes or alternative viewpoints from camera angles being viewed and, indeed, any data which may be related to the broadcast programme and which may be asserted by a user device. Within the scope of the term "asserted" is included any delivery to a user, which can include control of other devices within the user's environment.
- the broadcast time server may generate an appropriate time signal of a variety of forms.
- the time signal may be a teletext time stamp from a digital video broadcast system and the time stamp may be from a program status information or from a time and date table.
- timing may be provided from a now and next page on analog teletext.
- the preferred embodiment is a digital implementation using the event information table.
- the receiver of the audio-video broadcast may therefore be an analog or digital television, cable receiver, set-top box, or similar. It is noted, for the avoidance of doubt, that no change is required of the receiver for implementation of the invention. In particular, the receiver does not need to have any additional connection for providing timing information to the user device, because the user device retrieves timing information from the separate broadcast time server.
- the user device can be a variety of different types of device, including mobile telephones, laptops, personal digital assistants (PDAs), music players and similar user devices.
- the user device is a portable device.
- the connection from the user device to the broadcast time server is preferably over a network such as the internet or a mobile phone data service, using any known protocol such as HTTP, TCP, UDP, XMPP or IRC.
- the request for the broadcast time signal may be sent from the user device to the broadcast time server by SMS, MMS or similar protocol.
- the additional data is considered additional in the sense that it supplements the content of the broadcast audio-video and is asserted by a presentation synchronised to the audio-video as presented at a receiver near the user of the user device.
- the additional service data may be provided to the user device by download, in advance, or retrieved dynamically alongside receipt of the broadcast audio-video at the receiver.
- the data may be provided as a file containing a list of timed events, each having one or more time stamps relative to broadcast time, programme time or the beginning of a given sequence of audio-video data.
- the additional data may be text, images, sound of many different types, and may also include data that causes the user device to instruct another device. A particular example of this would be receipt of additional audio at the user device which is then provided to a separate audio decoder.
- a further example would be movement data which may be received and asserted to cause movement of a further user device such as any device for providing special effects.
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Computer Security & Cryptography (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
The system and method of the invention for delivering data such as text, graphics or sound to a portable user device in synchronisation with a broadcast service received by separate receivers comprise a broadcast time server. The broadcast time server receives a broadcast service at substantially the same time as other broadcast receivers in the locality of a transmitter, and delivers timing synchronisation signals over a separate network to user devices. A user device such as a mobile telephone can then obtain the synchronisation signal from the broadcast time server, so that the additional data can be kept in synchronisation with a television programme the user is watching on the television. The user device requires no connection to the user's television receiver, so no modification of the transmitter or broadcast receiver is required.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1014608.2 | 2010-09-02 | ||
GB1014608.2A GB2483277A (en) | 2010-09-02 | 2010-09-02 | Additional service synchronisation using time server |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012028851A1 true WO2012028851A1 (fr) | 2012-03-08 |
Family
ID=43013584
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/GB2011/001288 WO2012028851A1 (fr) | 2010-09-02 | 2011-09-01 | Procédé et système de synchronisation de services additionnels |
Country Status (2)
Country | Link |
---|---|
GB (1) | GB2483277A (fr) |
WO (1) | WO2012028851A1 (fr) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2579605A1 (fr) * | 2011-10-07 | 2013-04-10 | Accenture Global Services Limited | Synchronisation d'un contenu multimédia numérique |
WO2015116984A1 (fr) * | 2014-01-30 | 2015-08-06 | Echostar Uk Holdings Limited | Procédés et appareil de création d'un indice chronologique de référence de programmation audio/vidéo |
US9237368B2 (en) | 2009-02-12 | 2016-01-12 | Digimarc Corporation | Media processing methods and arrangements |
US9292894B2 (en) | 2012-03-14 | 2016-03-22 | Digimarc Corporation | Content recognition and synchronization using local caching |
US9615122B2 (en) | 2014-01-30 | 2017-04-04 | Echostar Technologies L.L.C. | Methods and apparatus to synchronize second screen content with audio/video programming using closed captioning data |
US9787768B1 (en) * | 2013-03-15 | 2017-10-10 | Arris Enterprises Llc | M-CMTS, Edge-QAM and upstream receiver core timing synchronization |
WO2018009287A1 (fr) * | 2016-07-02 | 2018-01-11 | Qualcomm Incorporated | Architecture de mise en œuvre distribuée pour récepteur de diffusion |
US9971319B2 (en) | 2014-04-22 | 2018-05-15 | At&T Intellectual Property I, Lp | Providing audio and alternate audio simultaneously during a shared multimedia presentation |
US10673609B2 (en) | 2015-12-07 | 2020-06-02 | Fujitsu Limited | Synchronization device, method, program and system |
CN113923522A (zh) * | 2021-10-14 | 2022-01-11 | 深圳市华曦达科技股份有限公司 | 机顶盒的时间更新方法、装置及计算机可读存储介质 |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2992516A1 (fr) * | 2012-06-22 | 2013-12-27 | France Telecom | Restitution d'un contenu complementaire a un contenu d'un flux |
EP2677764B1 (fr) * | 2012-06-22 | 2017-10-25 | Orange | Declenchement d'une action relative a un flux |
FR2993742A1 (fr) * | 2012-07-18 | 2014-01-24 | France Telecom | Declenchement d'une action relative a un flux |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2003003743A2 (fr) * | 2001-06-29 | 2003-01-09 | Lightmotive Technologies | Procede et appareil pour synchroniser des reseaux medias paralleles |
US6630963B1 (en) * | 2001-01-23 | 2003-10-07 | Digeo, Inc. | Synchronizing a video program from a television broadcast with a secondary audio program |
WO2008084947A1 (fr) * | 2007-01-08 | 2008-07-17 | Sk Telecom Co., Ltd. | Système et procédé assurant une synchronisation d'un contenu d'une émission avec une information supplémentaire |
US7634798B2 (en) | 2000-11-03 | 2009-12-15 | The Walt Disney Company | System and method for enhanced broadcasting and interactive television |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU3083901A (en) * | 1999-11-22 | 2001-06-04 | Spiderdance, Inc. | System and method for synchronizing online activities with broadcast programming |
TWI220036B (en) * | 2001-05-10 | 2004-08-01 | Ibm | System and method for enhancing broadcast or recorded radio or television programs with information on the world wide web |
US20070022437A1 (en) * | 2005-07-19 | 2007-01-25 | David Gerken | Methods and apparatus for providing content and services coordinated with television content |
US8407741B2 (en) * | 2006-11-20 | 2013-03-26 | Sk Planet Co., Ltd. | System, server and method for providing supplementary information service related to broadcast content |
- 2010
- 2010-09-02 GB GB1014608.2A patent/GB2483277A/en not_active Withdrawn
- 2011
- 2011-09-01 WO PCT/GB2011/001288 patent/WO2012028851A1/fr active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7634798B2 (en) | 2000-11-03 | 2009-12-15 | The Walt Disney Company | System and method for enhanced broadcasting and interactive television |
US6630963B1 (en) * | 2001-01-23 | 2003-10-07 | Digeo, Inc. | Synchronizing a video program from a television broadcast with a secondary audio program |
WO2003003743A2 (fr) * | 2001-06-29 | 2003-01-09 | Lightmotive Technologies | Procede et appareil pour synchroniser des reseaux medias paralleles |
WO2008084947A1 (fr) * | 2007-01-08 | 2008-07-17 | Sk Telecom Co., Ltd. | Système et procédé assurant une synchronisation d'un contenu d'une émission avec une information supplémentaire |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9237368B2 (en) | 2009-02-12 | 2016-01-12 | Digimarc Corporation | Media processing methods and arrangements |
EP2579605A1 (fr) * | 2011-10-07 | 2013-04-10 | Accenture Global Services Limited | Synchronisation d'un contenu multimédia numérique |
US9986282B2 (en) | 2012-03-14 | 2018-05-29 | Digimarc Corporation | Content recognition and synchronization using local caching |
US9292894B2 (en) | 2012-03-14 | 2016-03-22 | Digimarc Corporation | Content recognition and synchronization using local caching |
US9787768B1 (en) * | 2013-03-15 | 2017-10-10 | Arris Enterprises Llc | M-CMTS, Edge-QAM and upstream receiver core timing synchronization |
US9615122B2 (en) | 2014-01-30 | 2017-04-04 | Echostar Technologies L.L.C. | Methods and apparatus to synchronize second screen content with audio/video programming using closed captioning data |
US9942599B2 (en) | 2014-01-30 | 2018-04-10 | Echostar Technologies Llc | Methods and apparatus to synchronize second screen content with audio/video programming using closed captioning data |
WO2015116984A1 (fr) * | 2014-01-30 | 2015-08-06 | Echostar Uk Holdings Limited | Procédés et appareil de création d'un indice chronologique de référence de programmation audio/vidéo |
US9971319B2 (en) | 2014-04-22 | 2018-05-15 | At&T Intellectual Property I, Lp | Providing audio and alternate audio simultaneously during a shared multimedia presentation |
US10754313B2 (en) | 2014-04-22 | 2020-08-25 | At&T Intellectual Property I, L.P. | Providing audio and alternate audio simultaneously during a shared multimedia presentation |
US10673609B2 (en) | 2015-12-07 | 2020-06-02 | Fujitsu Limited | Synchronization device, method, program and system |
WO2018009287A1 (fr) * | 2016-07-02 | 2018-01-11 | Qualcomm Incorporated | Architecture de mise en œuvre distribuée pour récepteur de diffusion |
CN113923522A (zh) * | 2021-10-14 | 2022-01-11 | 深圳市华曦达科技股份有限公司 | 机顶盒的时间更新方法、装置及计算机可读存储介质 |
Also Published As
Publication number | Publication date |
---|---|
GB2483277A (en) | 2012-03-07 |
GB201014608D0 (en) | 2010-10-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2012028851A1 (fr) | Procédé et système de synchronisation de services additionnels | |
JP5903924B2 (ja) | 受信装置および字幕処理方法 | |
KR100449742B1 (ko) | 멀티미디어 방송 송수신 장치 및 방법 | |
CN1742492B (zh) | 媒体内容的基于音频和视频的媒体服务的自动同步 | |
JP6935396B2 (ja) | メディアコンテンツタグデータの同期化 | |
KR101727050B1 (ko) | 미디어 세그먼트 송수신 방법 및 그를 이용한 송수신 장치 | |
Howson et al. | Second screen TV synchronization | |
CN101809965B (zh) | 将接收的流与发送至另外装置的流同步的通信技术 | |
US20090106357A1 (en) | Synchronized Media Playback Using Autonomous Clients Over Standard Internet Protocols | |
US10503460B2 (en) | Method for synchronizing an alternative audio stream | |
KR20120080214A (ko) | 다이내믹 미디어 파일 스트리밍을 위한 시스템, 방법 및 장치 | |
Boronat et al. | HbbTV-compliant platform for hybrid media delivery and synchronization on single-and multi-device scenarios | |
KR101192207B1 (ko) | 온라인 생방송을 위한 실시간 다국어 자막 서비스 시스템 및 그 방법 | |
US20190373296A1 (en) | Content streaming system and method | |
US20040244057A1 (en) | System and methods for synchronizing the operation of multiple remote receivers in a broadcast environment | |
EP2891323B1 (fr) | Commande de temps de rendu | |
CA2938478A1 (fr) | Procedes et appareil de creation d'un indice chronologique de reference de programmation audio/video | |
WO2014178796A1 (fr) | Système et procédé permettant d'identifier et de synchroniser un contenu | |
CN107534792B (zh) | 接收设备、发送设备以及数据处理方法 | |
van Deventer et al. | Media synchronisation for television services through HbbTV | |
KR101025274B1 (ko) | 이동통신 방송 시스템 및 동기를 위한 부가 정보 변환 방법 | |
CN102088625A (zh) | 媒体内容的基于音频和视频的媒体服务的自动同步 | |
JP2001094945A (ja) | デジタル放送における映像音声の部分再生方法及び受信装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11752607 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 11752607 Country of ref document: EP Kind code of ref document: A1 |