GB2544796A - Video content synchronisation


Info

Publication number
GB2544796A
Authority
GB
United Kingdom
Prior art keywords
stream
time
segment
start point
synchronisation
Prior art date: 2015-11-27
Legal status
Granted
Application number
GB1521000.8A
Other versions
GB201521000D0 (en)
GB2544796B (en)
Inventor
Andrew Gower
Martin Trimby
Jonathan Rennison
Current Assignee
British Telecommunications PLC
BT Group PLC
Original Assignee
British Telecommunications PLC
BT Group PLC
Priority date: 2015-11-27
Filing date: 2015-11-27
Publication date: 2017-05-31
Application filed by British Telecommunications PLC and BT Group PLC
Priority to GB1521000.8A
Publication of GB201521000D0
Publication of GB2544796A
Application granted; publication of GB2544796B
Legal status: Active


Classifications

    • H04N 21/43079: Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen, of additional data with content streams on multiple devices
    • H04N 21/4126: Peripherals receiving signals from specially adapted client devices, the peripheral being portable, e.g. PDAs or mobile phones
    • H04H 20/18: Arrangements for synchronising broadcast or distribution via plural systems
    • H04H 60/40: Arrangements for identifying or recognising characteristics with a direct linkage to broadcast information or to broadcast space-time, for identifying broadcast time
    • H04N 21/242: Synchronization processes, e.g. processing of PCR [Program Clock References]
    • H04N 21/43615: Interfacing a Home Network, e.g. for connecting the client to a plurality of peripherals
    • H04N 21/8456: Structuring of content, e.g. decomposing content into time segments, by decomposing the content in the time domain
    • H04H 2201/40: Aspects of broadcast communication characterised in that additional data relating to the broadcast data are available via a different channel than the broadcast channel


Abstract

A process for synchronising a first 11 and second 21 content stream on first 10 and second 20 respective receiving devices, which respectively output to a first display screen 1 and a second display screen 2, the latter typically a handheld tablet or smartphone. The synchronisation process is initiated by the activation of a seek command 408 (i.e. a playback, fast-forward, or rewind function) on the first device. The process further involves identifying a start point in the first stream from which to resume display of content 401, and reporting that point to a synchronisation controller 3. Each stream comprises a sequence of segments. The controller identifies the segment in the first video stream associated with the start point, and the time from the start point to the beginning of the next segment. The controller transmits a command to the second device, identifying the corresponding segment and time at which the second stream is to resume play, causing resynchronisation of the first and second streams. The first stream may be a multicast transmission, and the second stream may be a unicast transmission of content related to the first stream, such as alternative camera angles, different audio soundtracks or languages, or play-along games.

Description

VIDEO CONTENT SYNCHRONISATION
This invention relates to the synchronisation of live media streams on two or more display screens.
In so-called second-screen applications, a viewer is typically watching a streamed video on a primary or main device and watching (or listening to) related content on a second device. In a common situation a first receiver unit associated with the primary screen receives a multicast transmission (that is, a transmission made to many end-users at substantially the same time), and a separate unicast (one-to-one) transmission is sent to a second receiver unit, typically a handheld tablet or smartphone device. Examples of such related content are alternative camera angles, different soundtracks (e.g. languages), play-along games, etc. These are usually of more specialised interest, and/or are interactive and therefore bespoke to the end-user, and thus are only transmitted on demand.
In many cases, the nature of the related content (alternative camera angles, subtitles, etc.) requires precise synchronisation between the primary device and the second device. This can be difficult to achieve. In particular, the multicast transmission to the primary device is usually over a wired or cabled broadband connection, whereas the second, unicast, stream is typically transmitted over a wireless connection with a narrower bandwidth, either independently or relayed from the primary device over a local wireless connection. Transmission and buffering delays are unlikely to be identical in the two paths.
Where such transmissions are of live streams, one solution is to include an additional delay, allowing the start times of the streams to be adjusted so that they can be synchronised. This is normally achieved by reference to an absolute time or clock, as described for example in ETSI DVB technical specifications. For example, the MPEG-DASH (Dynamic Adaptive Streaming over HTTP) protocol includes segment header data carrying an absolute time reference, enabling video synchronisation based on those absolute time references. However, some streaming protocols lack such an absolute reference. An example is the HLS (HTTP Live Streaming) protocol, a proprietary unicast streaming protocol for delivery of live or on-demand media, currently widely used for streaming video on mobile devices.
A further problem arises if the user wishes to interrupt the playing of the stream and play a section out of sequence - for example to replay a section or, if the stream is not being viewed live, to “fast forward” to a point later in the stream. A user uses a “seek” command in the viewing controls to identify the point in the stream from which playing should resume. Various “trick play” systems are available to allow the user to identify the relevant part of the video stream and, having done so, the “seek” command then instructs the viewing apparatus to play from the segment in question. In such circumstances it is then necessary to re-synchronise the stream played on the second screen. This requires a second “seek” command, to identify the correct segment in the second stream and to resume playing it at the right time. It would be inconvenient and distracting for the user to have to repeat the replay/fast-forward process for the second stream, and difficult to synchronise the streams exactly by eye.
According to a first aspect of the invention, there is provided a process for synchronising playback of output from a first content stream and a second content stream delivered to respective first and second receiving devices, each stream comprising a sequence of segments, wherein corresponding segments in the respective streams are identified by reference to a reference point common to the streams, the process being initiated by activation of a seek command on the first receiving device, identifying a start point in a first stream from which to resume display of content, and wherein the first receiving device reports the identified start point in the first stream to a synchronisation controller, the synchronisation controller identifies the segment in the first stream in which the start point is located and the duration of time between the start point and the beginning of the next segment in the sequence of segments, the synchronisation controller transmits a command to the second receiving device identifying a corresponding segment in the second stream at which to resume play and a time at which to start displaying the second stream, to cause output of the content of the second stream to be resynchronised with the output of the content of the first stream.
According to a second aspect of the invention, there is provided a synchronisation control device for co-ordinating play of first and second content streams delivered to respective first and second receiving devices, wherein corresponding segments in the respective streams are identified by reference to a reference point common to the streams, the synchronisation control device having a first interface for intercepting, from a first receiving device, indication of a start point in a first stream from which display of content is to resume following a seek command, a synchronisation controller for identifying a segment in the first stream in which the start point is located, and the duration of time between the start point and the beginning of the next segment and a second interface for transmitting a command to a second receiving device identifying a segment in a second stream, at which to resume play, and a time at which to start displaying, to cause the display controlled by the second receiving device to be resynchronised with the display controlled by the first receiving device after operation of the seek command on the first device.
In a preferred embodiment, the time at which the second receiving device is to resume play is the beginning of the next segment to begin after a predetermined minimum time has elapsed after the identified start point in the first stream. If a predetermined offset time or segment number is previously specified between display of corresponding segments in the first and second streams, that offset may be applied to the time or segment at which the second receiving device resumes play.
The device may be integrated with one of the first and second receiving devices. Embodiments of the invention will now be described, by way of example, with reference to the drawings, in which:
Figure 1 depicts a general arrangement of the various elements that co-operate in an embodiment of the invention;
Figure 2 is a schematic of a device for synchronising streamed content;
Figure 3 illustrates the process for initially synchronising the streams;
Figure 4 illustrates the process for re-synchronising the streams after a “seek” command;
Figure 5 is a sequence diagram illustrating a first process for starting playback, used when segment numbers are reported;
Figure 6 is a sequence diagram illustrating an alternative process for starting playback, used when segment numbers are not reported; and
Figure 7 is a sequence diagram for synchronisation of the two streams after a “seek” request.
The embodiment as described is arranged to facilitate synchronisation using mechanisms specific to HLS. The skilled person will appreciate that the embodiment may be adapted to other protocols. HTTP Live Streaming (HLS) is a streaming protocol for delivery of live or on-demand media, including audio and video. HLS uses media in MPEG2 TS format. MPEG2 TS media data is divided into chunks of a fixed time length, for example 10 seconds. Each chunk of media data is stored in a separate file; these files are known as segments. Segments are numbered sequentially. Audio, video and any other associated data for the same time period are stored interleaved in the same segment. Clients play the media stream by downloading each of the segments in sequence using HTTP, and playing them out in sequence, such that the presented output is seamless and continuous.
Each segment starts with a decoder refresh point (referred to as an IDR or I-frame), which allows the player to begin playback at the start of any segment without requiring any media data stored in any previous segment.
Multiple versions of the same media stream can be made available by dividing the multiple versions of the stream into segments at the same time points. This allows clients to switch between different versions of the media stream at segment boundaries; this is typically used for adaptive streaming based on available network bandwidth, where the different versions of the media are at different bandwidth/quality levels.
In the case of a live stream the server maintains a manifest file which contains the numbers, names and time lengths of a fixed number of the most recent segments. The manifest is updated whenever new segments become available. The manifest does not indicate the absolute (also known as wall clock) time of the stream start or of any individual segment. Clients consume the media stream by periodically downloading the manifest file using HTTP and downloading any segments added to the end of the manifest which have not already been downloaded.
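By way of illustration only (this sketch is not part of the patent specification), the manifest-polling behaviour just described can be written as a minimal Python loop. The manifest URL and the deliberately naive .m3u8 parsing are illustrative assumptions:

```python
# Minimal sketch of an HLS live client, assuming a hypothetical manifest URL
# and a deliberately naive .m3u8 parser (real playlists carry more metadata).
import time
import urllib.request

MANIFEST_URL = "http://example.com/live/stream.m3u8"  # hypothetical

def fetch(url):
    with urllib.request.urlopen(url) as resp:
        return resp.read()

def segment_uris(manifest_bytes):
    # Segment entries are the non-comment lines of an .m3u8 playlist.
    return [line for line in manifest_bytes.decode("utf-8").splitlines()
            if line and not line.startswith("#")]

seen = set()
while True:
    for uri in segment_uris(fetch(MANIFEST_URL)):
        if uri not in seen:
            seen.add(uri)
            media = fetch(uri)   # download the newly published segment
            # ...hand `media` to the decoder so play-out stays seamless...
    time.sleep(2)                # re-poll the manifest; period is a design choice
```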
The architecture in which this embodiment operates is depicted in Figure 1. It comprises two or more client devices 1, 2, each of which can play one or more media streams for display on associated screens or other display devices 10, 20. The term “display” in this context includes video, audio, and audio-visual output. A synchronisation controller 3 coordinates synchronisation of the outputs delivered by the client devices 1, 2. The controller is in communication with each of the client devices, for example through a networking protocol such as over a wireless LAN. Although in practice, when being used for “second screen” applications, the client devices 1, 2 and associated screens 10, 20 are likely to be used by the same person, physical proximity of the client devices to each other is not a requirement for the embodiment to operate. The client devices 1, 2 do not need to communicate directly with each other, nor do they need to be topologically co-located with each other or with the synchronisation controller. For example, the controller 3 may be on a separate local network from each of the client devices 1, 2 and each of the media servers 12, 22, where the local networks are connected via one or more wide area networks.
In some arrangements, the synchronisation controller 3 and one of the client devices 2 may be integrated into a single device 4, as depicted in Figure 1.
Each client device 1, 2 receives one or more media streams 11, 21. In many cases the streams 11, 21 may both be delivered from one and the same media server, but in general the controller 3 is capable of operating with feeds from different sources 12, 22, as depicted.
The functional elements of the controller 3 are depicted schematically in Figure 2. It will be appreciated by those skilled in the art that some or all of these elements may be implemented in software running on a general purpose computer, or coded in firmware. The controller is controlled by a processor 36.
The controller comprises first and second input ports 13, 23, for monitoring the data segments carried by the respective data streams 11, 21 being delivered to the client devices 1, 2. Respective header-reading elements 14, 24 identify the individual segments, and deliver their identities to respective stores 15, 25.
The processor 36 has interfaces 18, 28 associated with the respective client devices 1, 2. Periodically, the data stores 15, 25 can be accessed by the processor 36 to calculate a timeshift value, which is stored in a data store 37 and transmitted to at least one of the outputs 18, 28 for transmission to the respective client device 1, 2 to cause it to synchronise its output with the other client device.
This embodiment comprises a two-stage method. The first stage is depicted in Figure 3, Figure 5 and Figure 6 (steps 100-314), and uses a method disclosed in ETSI TS 103 286-2 to determine the current time position in each media stream during playback by querying each client device. In this stage, segments of the stream are mapped to an absolute timing reference, which is derived from the arithmetical product of the segment number and the segment length, referenced to a nominal stream start time, in order to obtain a current media time. Using a reference time generated by one of the devices to be synchronised reduces the processing power required to achieve synchronisation, compared with synchronisation to a centrally controlled time, as the client device is already operating in the time frame to be used when it is required to synchronise to the other device. The nominal stream start times are estimated in the controller, based on relative times reported by each client device. The actual estimation process will be specific to the type of media being synchronised and the type of client device playing the media.
The second stage, depicted in Figure 4 and Figure 7, then controls the playback of one or both media streams, by buffering or “fast-forwarding” playback, to bring the media streams into synchronisation within an acceptable threshold. As standard HLS segments are not indexed, it is not possible to directly identify which segment in a first stream is being played, or which segment of the second stream corresponds to it. The actual processes of seeking will be specific to the type of media being synchronised and the type of client device playing the media.
The time-location process of Figures 3, 5 and 6 will now be discussed in more detail.
The method of determining the current time position of a media stream is specific to the type of media being synchronised and the type of client device playing the media; in particular, it depends on whether the client devices 1, 2 report the current segment number. A mapping between HLS segment numbers in different streams requires that both can be related to a common reference timescale. This can be defined as a nominal stream start time plus the segment number times the segment length. The nominal stream start time can either be referenced to a standard value, such as the Unix epoch (seconds after midnight UTC, January 1st 1970), or could be a stream-specific value stored in the manifest or elsewhere as an additional field, defining which segment in each stream corresponds to a reference time T=0. The nominal stream start time is not required to equal the actual clock time at which the stream or the stream content was started.
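As a sketch of this common reference timescale (the function names are illustrative assumptions, but the arithmetic is exactly the mapping just described):

```python
def segment_start_time(t0, segment_number, segment_length):
    """Nominal (e.g. Unix-epoch) time of the start of the given segment."""
    return t0 + segment_number * segment_length

def segment_for_time(t0, t, segment_length):
    """Segment number in whose span the reference time t falls."""
    return int((t - t0) // segment_length)
```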
To determine the current time position of a media stream on a client device the process comprises an initial process to be applied when the controller requests that the client device begins playing a media stream (steps 101-105, 207-209), and a repeating process (steps 311-314) to be subsequently applied whenever the controller queries the state of the client device to determine the current position of a media stream being played on the client device.
The initial process has two variants, depending on whether the client device reports the initial segment number. In the case where the client device does not report the initial segment number, the initial steps 101-105 are performed to deduce the initial segment number, with the controller choosing the time at which the playback request is sent to the client device, before proceeding to the process of steps 207-209. This also requires that the clocks of the controller and media server are reasonably synchronised. If the initial segment number is reported by the clients, the initial process (steps 101-106) can be omitted and the process starts at step 207.
As shown in Figure 5, if the primary client device is of a type that reports the segment number (step 100), the controller initially sends a request to begin media playback to the client device (step 207). The client device starts playback (step 208) and responds to the controller with the state of the client device just after media playback has been started (step 209), and the controller records the media time and segment number (step 210) as indicated in the response from the client device.
If the primary client device is of a type which does not report the segment number, the controller has to derive a reference time by identifying when new segments are expected to be published at the media server, and at what time it sent the request to the primary client device to start playback. The controller can then use this derived reference time to identify the segment at which the primary client device will start. In order for this to operate reliably, the controller must avoid sending a request to the primary client device to start playback too close to when the server is expected to publish a new segment, as any timing or other variations could cause the controller to estimate a value associated with the wrong segment. The margins before and after new segments are expected to be published are defined as the pre- and post-publication margins; the values of these margins are implementation-defined.
In this process the controller first queries the media server associated with the stream for the nominal stream start time (T0) and the length (t) of each segment in the stream (step 101). It is assumed that each segment is of the same length. The controller then subtracts the nominal stream start time from the current time T, and adds a pre-publication margin Tp, to produce a media offset time Tm = T + Tp - T0 (step 102).
The controller divides the media offset time by the segment length, to produce a division result and a remainder (step 103). The division result, minus the number of segments back at which the client device starts playing, is recorded as the current segment number (step 104).
The controller then tests whether the remainder is greater than the sum of the pre- and post-publication margins (step 105).
If the remainder is less than this sum, this indicates that the current time is too close to a segment boundary for the controller to be able to correctly determine which segment the client device would start with, if the client device were requested to begin playback immediately. The controller then inserts a delay (step 106) equal to the sum of the pre- and post-publication margins minus the remainder, until the current time is no longer too close to a segment boundary.
This step 106 is omitted if the remainder is already greater than the sum of the pre- and post-publication margins, as this indicates that the current time is not near a segment boundary, so the controller can request that the client device begin playback immediately.
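The following Python sketch gathers steps 102-106 into one routine. The parameter names, and the modelling of "the number of segments back at which the client starts playing" as a single argument, are illustrative assumptions:

```python
import time

def choose_start_segment(t0, seg_len, pre_margin, post_margin,
                         segments_back, now=time.time, sleep=time.sleep):
    # Step 102: media offset time Tm = T + Tp - T0.
    tm = now() + pre_margin - t0
    # Step 103: division result and remainder within the current segment.
    division, remainder = divmod(tm, seg_len)
    # Step 104: the client starts playing some segments back from the newest.
    current_segment = int(division) - segments_back
    # Steps 105-106: if too close to a segment boundary, wait until clear.
    if remainder < pre_margin + post_margin:
        sleep(pre_margin + post_margin - remainder)
    # The playback request to the client device (step 207) would follow here.
    return current_segment
```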
The process then continues as for the method already described: the controller sends a request to begin media playback to the primary client device (step 207), the primary client device starts playback and responds to the controller with its state just after media playback has been started (steps 208, 209), and the controller records the media time as indicated in the response from the primary client device (step 210). However, in this case the segment number is derived from the controller’s own calculation (steps 103-104) instead of from the response (step 209), which only indicates the time.
The creation of this time reference can be used to synchronise multiple feeds when operating in “second screen” systems, where a user is taking two feeds of the same programme on different devices not sharing a common feed, as shown in Figure 1. If the feeds have already been downloaded to the devices, synchronisation can be done by advancing one playback by omitting segments, and/or by retarding the other feed by pausing playback, until both devices are at the beginnings of corresponding segments in each feed. Of course, if the feeds are live, it is only possible to pause display of the stream that is currently ahead, to await download to the other device of the segment corresponding to that at which the first stream has been paused, or to advance within the buffer of already downloaded media stream data which has not yet been played. A possible use case would be a viewer watching motor racing, where additional camera feeds are provided from on board the racing vehicles and from helicopter cameras, in addition to the primary editorial feed. The viewer would watch the primary programme on the TV, while watching additional video feeds on a companion-screen application running on a tablet or smartphone.
Periodically, the controller requests (step 311) that the client device report its current state, and the client device responds with a report (step 312) of its current state. This includes the current media position of the media stream. The controller calculates the absolute time of the current position of the media stream (step 313) as the nominal stream start time, plus the current media position (as reported in step 312), minus the initial media time (as determined in step 210), plus the product of the segment length and the initial media segment number.
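As a sketch, the step 313 calculation can be written as below; the function and argument names are illustrative, and the quantities are those defined above:

```python
def absolute_stream_time(t0, current_media_pos, initial_media_time,
                         seg_len, initial_segment_number):
    # Step 313: nominal stream start time, plus current media position,
    # minus initial media time, plus seg_len * initial segment number.
    return (t0 + current_media_pos - initial_media_time
            + seg_len * initial_segment_number)
```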
Having determined the absolute time of the media stream, the controller may add an additional variable offset (step 314) to the absolute time of the media stream. This offset could, for example, be the sum of one or more of:
  • A user-defined timing offset, to allow end-user customisation of the synchronisation of a particular stream. For example, the user could view a stream as a synchronised 10-second delayed replay stream, instead of a live synchronised stream. This offset could be adjusted by the user while the stream is playing.
  • A stream-specific timing offset used to compensate for variations in the content capture and encoding process, where the absolute time of the content encoding is significantly different from the absolute time at which the content was captured. For example, this could be applied to correct delays in streams from wireless cameras on board vehicles in a motor-racing event, as such streams are significantly delayed relative to wired track-side cameras.
Segment data is used in this embodiment to “seek”, or to adjust the buffering of, a media stream in a format such as HLS, which does not have an absolute timing reference.
The two devices 1, 2 can be synchronised by identifying the segments, downloaded or about to be downloaded to each device, whose start times are at or closest after the time point in each media stream to which the seek command is directed, and then playing each stream from the beginning of that segment after a delay equal to the difference (if non-zero) between the desired seek time in the media stream and the start time of the segment, such that the streams become synchronised once both have begun playing. As discussed above, a user may impose an additional offset, for example to compensate for any latency in the communication between the controller 3 and the playback devices 1, 2.
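A sketch of that segment selection, assuming segment boundaries lie at t0 + n*seg_len on the common timescale (the function and variable names are illustrative):

```python
import math

def seek_target(desired_time, t0, seg_len):
    # First segment whose start time is at or closest after the seek point.
    segment = math.ceil((desired_time - t0) / seg_len)
    # Delay before starting play from that segment boundary; zero when the
    # seek point already falls exactly on a boundary.
    delay = t0 + segment * seg_len - desired_time
    return segment, delay
```

Applying the same calculation to both streams means both devices start play at corresponding segment boundaries at the same moment, which is what brings them into synchronisation.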
When viewing a transmission in “second screen” mode, a user will wish both displays to respond when he operates a control, such as rewind or playback, on either one of the devices. For present purposes the device operated directly will be referred to as the “master” device and the indirectly controlled device as the “slave” device. For example, the “master” device 1 may be a satellite television “set-top box” associated with a widescreen television set 10 and controlled by a remote control unit, and the “slave” device 2 may be incorporated in a handheld “tablet” computer 4 with an associated display 20 connected to the Internet through a wireless LAN. If the user operates a rewind control on the remote control unit, causing the content displayed on the widescreen television set to reverse, the user will wish the content on the tablet computer to do the same. Conversely, the user may wish to control the content on both the tablet display 20 and the widescreen television set 10 from the controls on the tablet 4. The “master” unit in each case is the one directly controlled by the user, and the “slave” unit is the device which adjusts its timing to synchronise with the master.
It should be noted, however, that if the feeds are being transmitted in real time, it is not possible to skip forward other than within the playback buffer of already downloaded but not yet played media data, as the later segment will not yet have been received; any adjustment in timing therefore has to be made by pausing one stream for the other to catch up, except where one stream can be advanced within its playback buffer between its current position and the live edge. Thus, of two units 1, 2 receiving live video streams, the master unit will be the one that receives each segment of its stream later than the other unit, if it is not possible to seek forward in time in the other stream.
Following fulfilment of a playback or rewind request to the master unit 1, the streams require resynchronisation. This is achieved by changing the current media time of the slave stream in playback, adding or subtracting a time offset from the current media time so that it matches the current time of the “master” device. As neither device is now operating in real time, the calculated offset can be positive or negative.
The method of Figure 4 works by only seeking the HLS media stream to the next segment boundary. HLS segments are not indexed, so attempting to seek into the middle of a segment would require the client to parse all of the media data between the start of the segment and the seek point identified by the seek command. Of that media data, at least the media data between the seek point and the previous decoder refresh point would need to be decoded. This introduces a significant delay, potentially as much as the time that would be required to play out the media data from the seek point to the end of the segment at normal playback speed. This delay would not be acceptable for media synchronisation. It is avoided by continuing to play the media stream until such a time as an adjustment of the required offset can be made by a seek to the start of a segment. HLS segments are specified to start with a decoder refresh point (also referred to as an IDR or I-frame), such that the player can begin playback immediately at that point.
The method of Figure 4 and Figure 7 is executed by the controller 3, in co-operation with the client devices 1, 2. The method starts when the controller is about to adjust the playback of the media streams by a positive or negative time offset, for example by “rewinding” or “fast-forwarding” to a point in the stream selected by the user and reported by the master device to the controller (step 401). The point to which the user has moved in the stream running on the master device is compared with the time that it would have reached had the command not been given. Any user-applied offset figure (step 314) is retrieved (step 402) and added to the current media time to form the target media time (step 403). This is the time that would be the seek point, if the seek command were performed immediately.
The target media time is divided by the segment length, to produce a remainder (step 404). This remainder is the time elapsed between the beginning of the segment and the target media time; in other words, it indicates how close the time selected by the user is to the beginning of the segment in which it falls. A test is performed to determine whether the remainder is less than a threshold value (step 405), selected to allow sufficient processing time for playback of the slave stream to start at the beginning of the next segment. If the remainder is greater than this threshold (in other words, the time selected is too far past the beginning of the current segment), playback on the slave device is delayed for the duration of one segment minus the remainder (step 406), and the offset is added to the current media time to amend the target media time, as the current media time has now changed (step 407). The target media time, as determined in step 403 and adjusted by step 407 if applicable, now corresponds to the start of a segment. The slave device 2 is then instructed to seek to the target media time (step 408) and synchronise with the master device, following steps 311-314 as above, such that the slave device resumes playing at the beginning of a segment at the same time that the master device reaches the beginning of the corresponding segment.
This allows playback to be initiated on the slave device with less delay than would be required by parsing the entire segment. The delay between playback starting on the master device and playback resuming on the slave device would be, at a maximum, the segment length plus the threshold value set in step 405.
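Pulling steps 402-408 together, a Python sketch of the controller's resynchronisation logic might look as follows; the `seek_command` callable stands in for whatever message the controller sends to the slave device, and all names are illustrative assumptions:

```python
import time

def resync_slave(current_media_time, user_offset, seg_len, threshold,
                 seek_command, sleep=time.sleep):
    # Steps 402-403: apply any user offset to form the target media time.
    target = current_media_time + user_offset
    # Step 404: remainder = how far the target lies past a segment boundary.
    remainder = target % seg_len
    # Steps 405-407: if too far into a segment, let playback continue to the
    # next boundary, then recompute the target for the now-changed media time.
    if remainder > threshold:
        wait = seg_len - remainder
        sleep(wait)
        target = (current_media_time + wait) + user_offset
    # Step 408: instruct the slave device to seek to the target media time,
    # which now falls at (or within the threshold of) a segment start.
    seek_command(target)
```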

Claims (7)

  1. A process for synchronising playback of output from a first content stream and a second content stream delivered to respective first and second receiving devices, each stream comprising a sequence of segments, wherein corresponding segments in the respective streams are identified by reference to a reference point common to the streams, the process being initiated by activation of a seek command on the first receiving device, identifying a start point in a first stream from which to resume display of content, and wherein the first receiving device reports the identified start point in the first stream to a synchronisation controller, the synchronisation controller identifies the segment in the first stream in which the start point is located and the duration of time between the start point and the beginning of the next segment in the sequence of segments, the synchronisation controller transmits a command to the second receiving device identifying a corresponding segment in the second stream at which to resume play and a time at which to start displaying the second stream, to cause output of the content of the second stream to be resynchronised with the output of the content of the first stream.
  2. A process according to Claim 1, in which the time at which the second receiving device is to resume play is the beginning of the next segment to begin after a predetermined minimum time has elapsed after the identified start point in the first stream.
  3. A process according to Claim 1 or Claim 2, in which, if a predetermined offset time or segment number is previously specified between display of corresponding segments in the first and second streams, that offset is applied to the time or segment at which the second receiving device resumes play.
  4. A synchronisation control device for co-ordinating play of first and second content streams delivered to respective first and second receiving devices, wherein corresponding segments in the respective streams are identified by reference to a reference point common to the streams, the synchronisation control device having a first interface for intercepting, from a first receiving device, indication of a start point in a first stream from which display of content is to resume following a seek command, a synchronisation controller for identifying a segment in the first stream in which the start point is located, and the duration of time between the start point and the beginning of the next segment, and a second interface for transmitting a command to a second receiving device identifying a segment in a second stream at which to resume play, and a time at which to start displaying, to cause the display controlled by the second receiving device to be resynchronised with the display controlled by the first receiving device after operation of the seek command on the first device.
  5. A synchronisation control device according to Claim 4, arranged such that the time at which the second receiving device is to resume play is the beginning of the next segment to begin after a predetermined minimum time has elapsed after the identified start point in the first stream.
  6. A synchronisation control device according to Claim 4 or Claim 5, arranged such that, if a predetermined offset time or segment number is previously specified between display of corresponding segments in the first and second streams, that offset is applied to the time or segment at which the second receiving device resumes play.
  7. A synchronisation control device according to Claim 4, Claim 5 or Claim 6, wherein the device is integrated with one of the first and second receiving devices.

Priority Applications (1)

GB1521000.8A (granted as GB2544796B); priority date 2015-11-27; filing date 2015-11-27; title: Video content synchronisation

Publications (3)

GB201521000D0 (en), published 2016-01-13
GB2544796A (en), published 2017-05-31
GB2544796B (en), published 2019-11-13

Family

ID=55177370

Cited By (2)

* Cited by examiner, † Cited by third party

US20180367827A1 (priority 2017-06-19, published 2018-12-20) Wangsu Science & Technology Co., Ltd.: Player client terminal, system, and method for implementing live video synchronization
EP3448041A4 (priority 2017-06-19, published 2019-02-27) Wangsu Science & Technology Co., Ltd.: Video player client, system, and method for live broadcast video synchronization

Citations (3)

* Cited by examiner, † Cited by third party

US20130170818A1 * (priority 2011-12-28, published 2013-07-04) United Video Properties, Inc.: Systems and methods for synchronizing playback at multiple locations
US20140321826A1 * (priority 2013-04-26, published 2014-10-30) Microsoft Corporation: Synchronizing external data to video playback
EP2925003A1 * (priority 2014-03-25, published 2015-09-30) Cisco Technology, Inc.: A system and method for synchronized presentation of video timeline metadata

Family Cites Families (3)

* Cited by examiner, † Cited by third party

FI20001570A * (priority 2000-06-30, published 2001-12-31) Nokia Corp: Synchronized provision of services over a telecommunications network
JP6240315B2 * (priority 2013-09-20, published 2017-11-29) Koninklijke KPN N.V.: Correlating timeline information between media streams
US10575042B2 * (priority 2015-11-27, published 2020-02-25) British Telecommunications Public Limited Company: Media content synchronization

