US20180109826A1 - Method and infrastructure for synchronized streaming of content - Google Patents

Method and infrastructure for synchronized streaming of content

Info

Publication number
US20180109826A1
Authority
US
United States
Prior art keywords
content
content playback
playback device
playback
devices
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/847,060
Inventor
Charles McCoy
True Xiong
Ling Jun Wong
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment LLC
Original Assignee
Sony Interactive Entertainment LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Interactive Entertainment LLC filed Critical Sony Interactive Entertainment LLC
Priority to US15/847,060
Publication of US20180109826A1
Assigned to Sony Interactive Entertainment LLC (assignment of assignors' interest; see document for details). Assignors: Sony Corporation, Sony Network Entertainment International LLC
Legal status: Abandoned

Classifications

    • H ELECTRICITY
      • H04 ELECTRIC COMMUNICATION TECHNIQUE
        • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
            • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
              • H04N 21/23 Processing of content or additional data; Elementary server operations; Server middleware
                • H04N 21/242 Synchronization processes, e.g. processing of PCR [Program Clock References]
              • H04N 21/25 Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
                • H04N 21/258 Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, processing of multiple end-users preferences to derive collaborative data
                  • H04N 21/25808 Management of client data
                    • H04N 21/25816 Management of client data involving client authentication
                    • H04N 21/25841 Management of client data involving the geographical location of the client
                  • H04N 21/25866 Management of end-user data
            • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
              • H04N 21/41 Structure of client; Structure of client peripherals
                • H04N 21/4104 Peripherals receiving signals from specially adapted client devices
                  • H04N 21/4122 Peripherals receiving signals from specially adapted client devices: additional display device, e.g. video projector
                • H04N 21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
                  • H04N 21/42202 Input-only peripherals: environmental sensors, e.g. for detecting temperature, luminosity, pressure, earthquakes
                  • H04N 21/42203 Input-only peripherals: sound input device, e.g. microphone
              • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
                • H04N 21/4302 Content synchronisation processes, e.g. decoder synchronisation
                  • H04N 21/4307 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
                    • H04N 21/43076 Synchronisation of the same content streams on multiple devices, e.g. when family members are watching the same movie on different devices
                • H04N 21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
                  • H04N 21/44004 Processing of video elementary streams involving video buffer management, e.g. video decoder buffer or video display buffer
                • H04N 21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
                  • H04N 21/44209 Monitoring of downstream path of the transmission network originating from a server, e.g. bandwidth variations of a wireless network
              • H04N 21/47 End-user applications
                • H04N 21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
                  • H04N 21/47202 End-user interface for requesting content on demand, e.g. video on demand
            • H04N 21/60 Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
              • H04N 21/61 Network physical structure; Signal processing
                • H04N 21/6106 Signal processing specially adapted to the downstream path of the transmission network
                  • H04N 21/6125 Downstream path involving transmission via Internet
                • H04N 21/6156 Signal processing specially adapted to the upstream path of the transmission network
                  • H04N 21/6175 Upstream path involving transmission via Internet
              • H04N 21/63 Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
                • H04N 21/643 Communication protocols
                  • H04N 21/64322 IP
              • H04N 21/65 Transmission of management data between client and server
                • H04N 21/654 Transmission by server directed to the client
                • H04N 21/658 Transmission by the client directed to the server
                  • H04N 21/6582 Data stored in the client, e.g. viewing habits, hardware capabilities, credit card number
            • H04N 21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
              • H04N 21/85 Assembly of content; Generation of multimedia applications
                • H04N 21/858 Linking data to content, e.g. by linking an URL to a video object, by creating a hotspot
                  • H04N 21/8586 Linking data to content by using a URL
          • H04N 5/00 Details of television systems
            • H04N 5/76 Television signal recording
              • H04N 5/91 Television signal processing therefor
                • H04N 5/93 Regeneration of the television signal or of selected parts thereof
                  • H04N 5/932 Regeneration of analogue synchronisation signals
                  • H04N 5/935 Regeneration of digital synchronisation signals

Definitions

  • Streaming refers to the delivery of media content in a continuous fashion, from transmission at a source to reception and presentation at a receiver.
  • Internet delivery of digital content presentations to network computers is commonly streamed, as is Internet television content.
  • Systems and methods according to the principles described here involve synchronizing the playback of network media across multiple content playback devices, occasionally termed herein as “playback devices”, “clients”, or “client devices”.
  • client devices are controlled to parse and buffer media content separately. Once all clients are ready, a controller may cause the client devices to start in a synchronized fashion based on signals sent by the controller. The controller adjusts the timing of the signal so that the outputs are displayed in synchronization on each client device.
  • a device lag is measured between the generation or output of a signal and the final display or playback of that signal to the user.
  • the lag may be compensated for to allow better playback synchronization.
  • lags may also be measured and compensated for relating to network delays in obtaining content.
  • Where a first playback device is playing back content, systems and methods according to the principles described here allow for a second playback device to become synchronized with the first, such that playback of the content item on the second is synchronized to the playback on the first.
  • the second content playback device may begin to buffer content prior to display, and may estimate which content to buffer based on a determined playback point on the first playback device, as well as network bandwidth and network and device lag. Once the second playback device has buffered sufficient content such that playback can begin in a synchronized fashion, playback begins and the first and second playback devices are in sync.
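As a rough illustration of that buffering estimate, the sketch below (Python) computes a first target point from the current playback position, an estimated bandwidth, and the known network and device lags. All function and parameter names, and the numbers in the example, are hypothetical and not taken from the patent.

```python
def estimate_join_point(current_position_s: float,
                        bitrate_bps: float,
                        bandwidth_bps: float,
                        min_buffer_s: float,
                        network_lag_s: float,
                        device_lag_s: float) -> float:
    """Estimate the media timestamp (seconds) at which a late-joining device
    should begin buffering so that, by the time it has enough data, the
    buffered range overlaps the ongoing playback.

    A hypothetical sketch: a real player would refine this as its bandwidth
    estimate changes while header/index data is being loaded.
    """
    # Wall-clock seconds needed to download min_buffer_s worth of media.
    download_time_s = (min_buffer_s * bitrate_bps) / bandwidth_bps
    # By the time buffering finishes, playback on the first device will have
    # advanced by the download time plus the network and device lags.
    return current_position_s + download_time_s + network_lag_s + device_lag_s


if __name__ == "__main__":
    # Example: playback at 120 s, 4 Mbps content over a 20 Mbps link,
    # wanting 10 s of buffer, with 0.1 s network lag and 0.25 s device lag.
    target = estimate_join_point(120.0, 4e6, 20e6, 10.0, 0.1, 0.25)
    print(f"Start buffering at roughly t = {target:.2f} s")  # ~122.35 s
```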
  • systems and methods according to the principles described here include setting up a master and slave relationship between two devices, so that the output of the slave device is the same content as the output of the master device.
  • the playback between the two may be synchronized.
  • the master device need not playback the content itself, and may in fact be playing back other content.
  • the source of content may be the master device, e.g., via a tuner, or the source may be upstream of the master device, with the master device just providing throughput of the content.
  • the master device may encode or encrypt the content item for subsequent transmission as need be. In some cases the master device may authenticate a slave device.
  • a common use case is when a content item is downloaded or streamed onto the master device: a slave device wishing to sync and play back the same content may need to be authenticated so that it has 1) permission to access the content, and 2) the capabilities to support playback of that content (e.g., it can support 3D, the Dolby Plus codec, etc.).
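A minimal sketch of such an authentication gate is shown below, assuming hypothetical permission and capability records; the `SlaveProfile` fields and the `can_sync` helper are illustrative only, not the patented mechanism.

```python
from dataclasses import dataclass, field


@dataclass
class SlaveProfile:
    device_id: str
    has_permission: bool                              # e.g., granted by the service provider
    capabilities: set = field(default_factory=set)    # e.g., {"3D", "DolbyPlus"}


def can_sync(slave: SlaveProfile, required_capabilities: set) -> bool:
    """Return True only if the slave is both permitted to receive the content
    and able to play it back (an illustrative check)."""
    return slave.has_permission and required_capabilities <= slave.capabilities


if __name__ == "__main__":
    slave = SlaveProfile("living-room-tv", True, {"3D", "DolbyPlus", "H264"})
    print(can_sync(slave, {"3D", "DolbyPlus"}))  # True
    print(can_sync(slave, {"8K"}))               # False: missing capability
```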
  • a service provider can mark its content as “redistributable” or otherwise shared.
  • access to some network content is such that client playback devices play content items after authentication of the playback devices with one or both of a management server and a service provider.
  • synchronized playback may be afforded using certain above synchronization steps as well as content access steps, e.g., affiliation steps with the service provider.
  • the invention is directed towards a method of synchronizing playback of IPTV content between a first content playback device and a second content playback device, including: coupling first and second content playback devices to a controller, the controller configured to control playback of a content item from a service provider on the first and second content playback device; sending data about a device lag time associated with at least one of the first and second content playback devices to the controller; calculating a time differential between a start time associated with the first content playback device and a start time associated with the second content playback device, the time differential at least partially based on the device lag time; and sending signals to the first and second content playback devices to begin playback of the content item, such that the first and second content playback devices begin playback of the content item at substantially the same time.
  • Implementations of the invention may include one or more of the following.
  • the sending of signals to the first and second content playback device may be separated by the time differential.
  • the signals sent to the first and second content playback devices may include data indicating to the first and second content playback devices a respective delay after which playback should begin.
  • the delay may be between zero and the time differential.
  • the controller may be within the first or second content playback device.
  • the controller may be in data communication with a local network associated with the first or second content playback device.
  • the method may further include accessing a management server in data communication with the first and second content playback devices, the management server controlling access to the service provider, and where the controller is in data communication with the management server.
  • the controller may be configured to receive geographic data about a location of the first and second content playback devices, and the calculating a time differential may be further based on the geographic location of the first and second content playback devices.
  • the method may further include determining a network lag time by sending a signal from the first or second content playback device, or both, to the management server, and the time differential may be further based on the network lag time.
  • the invention is directed towards a non-transitory computer-readable medium, including instructions for causing a computing device to implement the above method.
  • the invention is directed towards a method of determining a device lag time, including: generating a test signal; sending the test signal to initiate a signal indicating that rendering of a content item should begin; detecting the rendering of the content item; and measuring a time between the sending and the detecting to calculate a device lag time.
  • Implementations of the invention may include one or more of the following.
  • the method may further include sending the device lag time to a controller.
  • the rendering of a content item may cause a change in brightness or volume.
  • the detecting may include detecting with a microphone or an optical sensor.
  • the invention is directed towards a non-transitory computer-readable medium, including instructions for causing a computing device to implement the above method.
  • the invention is directed towards a method of synchronizing playback of IPTV content between a first content playback device and a second content playback device, including: playing back a content item on a first content playback device; buffering but not playing back the content item on a second content playback device, the buffering but not playing back occurring at least until the buffer includes a portion of the content item currently being played back on the first content playback device; and sending a signal to begin playback of the content item on the second content playback device, such that the playback of the content item on the first and second content playback devices is synchronized.
  • Implementations of the invention may include one or more of the following.
  • the first and second content playback devices may be in data communication with a controller, and the method may further include: sending data about a device lag time associated with the second content playback device to the controller; and sending a signal to the second content playback device to begin playback of the partially buffered content item, the time of the sending a signal based on the device lag time.
  • the buffering may be in response to a request from the second content playback device to join the playback of the content item.
  • the invention is directed towards a non-transitory computer-readable medium, including instructions for causing a computing device to implement the above method.
  • the invention is directed towards a method of playback of at least a portion of a content item on a second content playback device based on a presence of the content item at a first content playback device, including: at least partially receiving a content item on a first content playback device; transmitting at least a portion of the received content item to a second content playback device; and encoding or encrypting the content item by the first content playback device prior to the transmitting.
  • Implementations of the invention may include one or more of the following.
  • the first content playback device may generate a portion of the content item using a tuner.
  • the first content playback device may have received a portion of the content item from another content playback device.
  • the method may further include controlling operation of the first content playback device using the second content playback device.
  • the transmitting may be performed immediately upon the receiving.
  • the method may further include receiving device or network lag information at the first content playback device, and the transmitting may be performed following a time differential based on the received device or network lag information.
  • the transmitting may be performed while the first content playback device is playing back the content item, playing back another content item, or not playing a content item.
  • the transmitting may be performed while the first content playback device is playing back the content item, and the transmitting may be performed such that the second content playback device plays back the content item in synchronization with the first content playback device.
  • Multiple second content playback devices may be in data communication with the first content playback device, and the method may further include selecting a second content playback device to receive the content item prior to the transmitting.
  • a plurality of second content playback devices may be in data communication with the first content playback device, and the method may further include transmitting the content item to the plurality of second content playback devices.
  • a plurality of second content playback devices are in data communication with the first content playback device, and the method may further include transmitting the content item using a multicasting method to the plurality of second content playback devices.
  • a plurality of second content playback devices may be in data communication with the first content playback device, and the method may further include: at least partially receiving another content item on the first content playback device; and transmitting at least a portion of the received content item to one content playback device of the plurality and transmitting at least a portion of the received another content item to another content playback device of the plurality.
  • the invention is directed towards a non-transitory computer-readable medium, including instructions for causing a computing device to implement the above method.
  • the invention is directed towards a method of synchronizing the playback of a content item among a plurality of content playback devices, the content item available through a service provider requiring an affiliation process, including: coupling a plurality of content playback devices in data communication with a controller, the controller configured to at least partially control playback of a content item on the plurality of content playback devices through a service provider, the plurality of content playback devices constituting a synchronization group; sending a signal from the controller to each of the plurality to cause each of the plurality to contact the service provider to obtain access to the content item; in the event one of the plurality is not allowed access to the content item, then notifying the controller of the event and removing the one from the synchronization group; and sending a signal to each of the content playback devices in the synchronization group to begin playback of the content item.
  • Implementations of the invention may include one or more of the following. At least a portion of the plurality may be in data communication with a proxy device, and the sending a signal to cause each of the plurality to contact the service provider may include sending a signal to cause each of the portion of the plurality to contact the service provider through the proxy device.
  • the proxy device may be a second display.
  • the controller may configure the plurality of content playback devices for synchronized playback through a second display.
  • the second display may indicate a list of content items for which access may be obtained by each of the plurality, or a list of content playback devices within the plurality that can obtain access to a given content item.
  • the method may further include sending each of the content playback devices in the synchronization group a unique URL with which to access the content item.
  • the method may further include: receiving data about device lag times associated with at least a first and a second content playback device in the plurality; calculating a time differential between a start time associated with the first content playback device and a start time associated with the second content playback device, the time differential at least partially based on the device lag times; and where the sending a signal to each of the content playback devices in the synchronization group to begin playback of the content item may include sending signals to the first and second content playback devices to begin playback of the content item, a time of each sending separated by the time differential.
  • the invention is directed towards a non-transitory computer-readable medium, including instructions for causing a computing device to implement the above method.
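The group-management flow described in the preceding bullets might be sketched as follows, where `request_access` stands in for whatever affiliation or authentication exchange the service provider requires; every identifier here is hypothetical.

```python
from typing import Callable, Dict, List


def build_sync_group(devices: List[str],
                     request_access: Callable[[str], bool]) -> List[str]:
    """Ask each candidate device to obtain access to the content item and
    keep only those that succeed (a sketch of the claimed group pruning)."""
    group = []
    for device in devices:
        if request_access(device):
            group.append(device)
        else:
            # The controller is notified and the device leaves the group.
            print(f"{device}: access denied, removed from synchronization group")
    return group


def send_start_signals(group: List[str], content_url: str) -> Dict[str, str]:
    """Hand each remaining device a URL (possibly unique per device) and a
    start command; here we only record what would be sent."""
    return {device: f"START {content_url}?device={device}" for device in group}


if __name__ == "__main__":
    # Pretend only devices whose name ends in "tv" are entitled to the content.
    fake_access = lambda device: device.endswith("tv")
    group = build_sync_group(["kitchen-tv", "patio-radio", "den-tv"], fake_access)
    print(send_start_signals(group, "https://provider.example/item/42"))
```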
  • Synchronization of playback devices may lead to a significantly enhanced user experience, particularly when playback devices are in close proximity to each other.
  • Device lags may be conveniently measured, and by having the device measure the lag, the timing used to overcome that lag will be accurate, leading to synchronization that is more precise than can be achieved by manually modifying timer settings through trial and error.
  • Device lags may be accounted for from all sources, such as signal transmission, signal decoding, delays used to synchronize video and audio, lag due to the timing of when video frames are displayed, and the like.
  • Certain implementations allow playback devices to join in the playback of a content item in a synchronized fashion, without interrupting the original playback of the content item. Certain implementations further allow synchronized playback where clients are required to go through a management server infrastructure to play content, thus allowing synchronized playback in situations more complex than just the synchronization of direct networked media playback.
  • FIG. 1 is a block diagram of an exemplary system in accordance with an aspect of the present principles, illustrating a network which may be employed to deliver content in a synchronized fashion to multiple content playback devices using one or more controllers.
  • FIG. 2 is a flowchart illustrating an exemplary method according to another aspect of the present principles, the method for delivering a content item in a synchronized fashion using one or more controllers.
  • FIG. 3 is a block diagram of an exemplary system in accordance with another aspect of the present principles, illustrating a system for measuring a device lag for a content playback device.
  • FIG. 4 is a flowchart illustrating an exemplary method according to another aspect of the present principles, the method for measuring device lags and delivering data about the same to one or more controllers.
  • FIG. 5 is a block diagram of an exemplary system in accordance with another aspect of the present principles, illustrating a system in which a second content playback device may become synchronized with a first content playback device.
  • FIG. 6 is a flowchart illustrating an exemplary method according to another aspect of the present principles, the method for synchronizing a second content playback device with a first content playback device.
  • FIG. 7 is a block diagram of an exemplary system in accordance with another aspect of the present principles, illustrating a system in which a content playback device serves as a master device and one or more content playback devices serve as slave devices.
  • FIG. 8 is a flowchart illustrating an exemplary method, which may be employed in the system of FIG. 7 , to create a master/slave relationship among two or more devices.
  • FIG. 9 is a block diagram of an exemplary system in accordance with another aspect of the present principles, illustrating a network which may be employed to deliver content in a synchronized fashion to multiple content playback devices where content is delivered within an infrastructure including a management server and a service provider.
  • FIG. 10 is a flowchart illustrating an exemplary method according to another aspect of the present principles, the method for delivering a content item in a synchronized fashion to multiple content playback devices where content is delivered within an infrastructure including a management server and a service provider.
  • FIG. 11 illustrates an exemplary computing environment, e.g., that of the disclosed IPTV or client content playback device, management server, second display, or the like.
  • FIG. 1 illustrates a system 10 including content playback devices 12 and 14 coupled to a local network 15, which may be wired, wireless, or a combination of both.
  • a second display 16 is also illustrated on the local network 15 and the same may control the operation of the content playback devices (or other devices) on the local network.
  • the second display 16 may in some cases also display content itself.
  • a remote control 22 may be employed to control the content playback device, or control may be exercised by way of the second display 16 .
  • the use of second display devices in such contexts has certain benefits because the same provides complementary functionality to the IPTV, but generally does not require additional investment by the user because the same makes use of a device, e.g., a smartphone, tablet computer, or the like, which most users already have in their possession. Additional details about such second displays and their interactions with content playback devices, e.g., through proxy servers and otherwise, may be seen from Applicants' co-pending U.S. patent application Ser. No. 13/077,181, filed Mar. 31, 2011, entitled “PERSONALIZED SECOND DISPLAY BROWSING EXPERIENCE DUE TO MULTIPLE SESSION FEATURE”, owned by the assignee of the present application and incorporated herein by reference in its entirety.
  • a number of servers may be accessed by the content playback devices 12 and 14 through the local network 15 and the Internet 25 , including a management server 24 and one or more content servers 26 , 28 , and 32 corresponding to content providers.
  • the term “content provider” is used synonymously with “service provider”.
  • the servers may communicate with a content delivery network 34 to enable content items to be delivered to the content playback devices, or such delivery may be direct.
  • a user has a user account with a source or clearinghouse of services.
  • the source or clearinghouse is represented as a management server, but it should be understood that the user account may be with a service provider directly.
  • the management server communicates with at least one content server (generally associated with the service provider) such that the content server provides content items such as streaming assets for presentation or access at the content playback device.
  • the user account has information stored thereon related to what content playback devices are associated with the user account. When a user logs on, they may see this list of content playback devices and may choose a particular content playback device. Once a content playback device has been chosen, a list of services may be displayed from which the user may choose. From a chosen service, a user may select a content item for viewing, undergoing an affiliation or authentication step if required by the service. Additional details may be found in the application incorporated by reference above.
  • synchronization controllers 36-54, also termed just “controllers”, are also illustrated. Controllers may be in one or all of the content playback devices, second displays, or servers controlling content delivery. In general, at least one controller is required, and the controller may be implemented in hardware, software, firmware, or the like. The controller can even be in an external device 18 devoted to controller functionality, or one providing other functionality in addition to controller functions.
  • a typical situation represented by FIG. 1 would be a home or sports bar, or even locations such as gas pumps and grocery store checkout aisles.
  • a number of content playback devices may be in close proximity to each other. Differences in the timing of playback become noticeable and distracting. For example, if audio is produced by more than one of the devices, then users may experience an echo. With more serious offsets, they may experience two separate portions of the content competing with each other.
  • Such dyssynchrony may be typical where content sources differ, e.g., satellite versus terrestrial, or even within separate models of playback devices.
  • manually starting playback of a content item at the same time on two different devices cannot provide playback sufficiently close in time to avoid such difficulties. And tuning all of the playback devices to the same source signal does not work for the playback of network media.
  • controllers 36 - 54 are employed to coordinate such playback. All of the client devices participating in the synchronized playback establish data communication with a controlling device or controller that coordinates the playback timing across all participating clients.
  • the controller can be one of the client devices, or it may be a separate device. Generally, some client devices will be capable of operating as controlling devices, and others will not.
  • the individual content playback devices parse and buffer media content separately (step 56 ).
  • the devices may get the media directly from the source, or they may obtain the media through a proxy device.
  • one of the client devices, or the controlling device may operate as a proxy device and provide throughput or distribution of content in a manner described below, e.g., with respect to FIGS. 5-8 .
  • the proxy may distribute the media to the clients using multicast communications to reduce the amount of bandwidth that is needed.
  • the steps below may be performed when client devices have a degree of lag associated with them, e.g., either due to the network or due to device characteristics.
  • Once the content playback devices are ready to play back the content item, they signal their readiness to the controller (step 68), e.g., to controller 42 in the second display 16.
  • the client devices are waiting for a start signal to begin playback of the content item.
  • a start signal may be sent to each client to begin playback.
  • Upon receiving the signal from the controller, each client device should be in a state where playback may begin immediately or at a specified future time, so as to account for the local network lag of communications from the controller to all clients.
  • the controller 42 may adjust the timing of start signals (step 62 ) so that the output is displayed in synchronization on each client device. For example, the controller 42 may delay the sending of start signals based on the network lag of each client device, or may send all the start signals but indicate within the signal a respective delay after which playback should begin. In this latter alternative, all the playback devices have an opportunity to cache content while they are waiting through the delay.
  • a number of steps may be employed in determining the timing of the start signals. For example, if the controller is based at the server level, e.g., within the management server, the same may be aware of and account for differences in location of the service provider or source server relative to the client devices (step 64 ). In other words, some client devices may be located much closer to the source of content than others, and thus will experience less network lag or delay in receiving content.
  • Device lags may also be accounted for, such as the device lag between when a playback signal is generated and when that signal is actually displayed to the user. Such device lags may be measured using techniques described below, and in any case data about such lags may be communicated to the controller (step 66 ). Client devices may also employ a step of attempting to measure their network lag, and communicating the same to the controller (step 76 ), by measuring how long it takes for a test signal to traverse to a network location and back, e.g., to the management server.
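One plausible way for a client to estimate its network lag (step 76) is to time a small round trip toward the management server and take half of it. The sketch below uses a plain TCP connect as the test signal and a made-up host name; both are assumptions for illustration.

```python
import socket
import time


def estimate_network_lag(host: str, port: int = 443, samples: int = 3) -> float:
    """Return an estimated one-way network lag in seconds, taken as half of
    the smallest measured TCP connect round trip (an illustrative proxy for
    the test-signal round trip described above)."""
    best_rtt = float("inf")
    for _ in range(samples):
        start = time.monotonic()
        try:
            with socket.create_connection((host, port), timeout=2.0):
                pass
        except OSError:
            continue  # ignore failed attempts; keep the best successful one
        best_rtt = min(best_rtt, time.monotonic() - start)
    return best_rtt / 2.0 if best_rtt != float("inf") else float("nan")


if __name__ == "__main__":
    lag = estimate_network_lag("management-server.example.com")  # hypothetical host
    print(f"Estimated one-way network lag: {lag:.4f} s")
```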
  • the one or more controllers may use the data to order the start times at which signals will be sent to client devices to begin playback (step 72 ). For example, the controller may compensate for the differing lag times of the clients by giving a start command to the client with the most lag first and giving a start command to the other clients with enough delay so that the final display of the content will occur in a synchronized fashion. Once the ordering is done, and timing differentials calculated between the various start times, start signals may be sent to client devices (step 74 ).
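A sketch of that ordering step, under the simplifying assumption that each client's total lag is its device lag plus its network lag: the controller signals the most-lagged client first and delays the others by the difference. All identifiers and timings are hypothetical.

```python
from typing import Dict


def schedule_start_offsets(total_lags_s: Dict[str, float]) -> Dict[str, float]:
    """Given each client's total lag (device + network, in seconds), return
    how long the controller should wait before signalling each client so that
    the rendered output appears at the same instant everywhere."""
    worst = max(total_lags_s.values())
    # The most-lagged client is signalled immediately (offset 0); every other
    # client is signalled later by exactly the amount it lags less.
    return {client: worst - lag for client, lag in total_lags_s.items()}


if __name__ == "__main__":
    lags = {"tv-a": 0.350, "tv-b": 0.120, "tablet": 0.045}
    schedule = schedule_start_offsets(lags)
    for client, offset in sorted(schedule.items(), key=lambda kv: kv[1]):
        print(f"send start to {client} after {offset * 1000:.0f} ms")
    # tv-a after 0 ms, tv-b after 230 ms, tablet after 305 ms
```

In practice the controller would refresh these lag estimates periodically, since both device configuration and network conditions can change.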
  • device lag times may be measured and used in calculations to provide for synchronized playback. It is noted in this connection that systems and methods according to the principles described here may be broadly applied to any combination of devices, and not just a particular content playback device. For example, some devices display their signals by outputting them, such as through an HDMI connector, and another device actually performs the display. The amount of lag depends thus on the combination of devices. Lag times can generally not be predetermined, as different TV models, even from the same manufacturer, may have different lag times between when they receive a digital input signal and when that signal is displayed to the user.
  • in FIG. 3, a content playback device 78 is shown having a network interface 82 that allows connection to service providers (not shown) through the Internet 25.
  • the content playback device 78 is illustrated connected to a display 86 and an audio system 88. It will be understood, however, that these may form part of an integrated content playback device 108 that incorporates all of these subsystems.
  • the content playback device 78 includes a playback signal generation circuit 84 that accepts a start signal from the network interface 82 which generally originates from a controller.
  • the start signal indicates that playback should begin.
  • An exemplary playback signal is illustrated in the graph 85 .
  • a finite amount of time Δt passes before a user actually sees a corresponding signal on the display, illustrated as Δt_v, or hears a corresponding sound on the audio system, illustrated as Δt_a, in graphs 98 and 102, respectively.
  • an optical sensor 94 such as a camera, is disposed to receive displayed signals from the display 86 .
  • An audio sensor such as a microphone 96 , is disposed to receive rendered signals from the audio system 88 .
  • a light detector may be placed in front of the display and a microphone in front of a speaker.
  • these sensors provide signals to a measurement circuit 104, which also receives an indication of the playback signal 85 from the playback signal generation circuit 84. By measuring the time between the playback signal 85 and signals 98 and 102, a measurement of the device lag may be calculated.
  • the type of sensor may vary; the only requirement is that the sensors be positioned such that they can detect the playback being output by the device.
  • a built-in microphone may not need any special positioning if the device is located in the same room as the playback.
  • a light intensity sensor or detector should be located so that the same is facing the screen where the video output is playing.
  • optical detectors should have a narrow field of vision and may employ shielding, such as a flat black tube, to reduce the amount of stray light from other angles being picked up by the sensor.
  • the sensors need not be of particularly high quality, as they only need to respond quickly to the overall intensity they are receiving.
  • inexpensive microphones, such as those commonly used in telephones, will generally be sufficient for detecting overall sound intensity in real time.
  • any camera sensor may be employed, even those lacking optics necessary to produce a clear picture.
  • the light detector may also simply detect overall light intensity and need not employ multiple pixels or be able to detect different intensities for different wavelengths of visible light.
  • the system measures the overall lag, the same being a primary parameter required to synchronize the output. No matter how complex the signal processing pathway is, the overall result is measured. In this way, complex cases where significant signal processing exists may still be afforded synchronized playback, e.g., in professional broadcast environments where signals may be routed through many pieces of equipment.
  • the measurement may be for a lag time through an arbitrary signal path, and may not necessarily include rendering of the signal at the end of the path.
  • an intermediate sensor 106 may be employed to monitor the signal at the end of the signal path being measured, to look for timing when the generated signal reaches that point.
  • the lag measurement may be automated such that device lags are automatically measured each time a change in signal path is detected, such as when a new device is attached to an HDMI output.
  • Such automation may be provided in any of the embodiments described above.
  • a method that may be employed by the system of FIG. 3 is illustrated by a flowchart 40 in FIG. 4 .
  • a first step is that a content playback device receives the start signal, or may by itself initiate a test signal (step 112 ).
  • the content playback device may begin by outputting a black video signal and then for the test signal output one or more white video frames (step 114 ).
  • the test signal is sent to the measurement circuit (step 116 ) as well as to the actual display.
  • the test signal is then rendered, e.g., visually and/or aurally (step 118), and the same is detected by the optical sensor and/or microphone (step 122), respectively.
  • Indication of receipt of the test signal is sent to the measurement circuit (step 124 ).
  • the difference between the time of arrival of the start signal (or initiation of test signal) and the time of detection yields the lag time for the signal (step 126 ).
  • This “device lag time” may then be sent to one or more controllers in data communication with the content playback device (step 128 ).
  • the device may begin by outputting a silent audio signal and then outputting a loud signal.
  • the audio signal that is used may vary, but should substantially immediately increase from silence to a steady volume.
  • a single tone such as a sine wave or square wave, can be used, or the output may include white noise.
  • Music outputs may be employed if the first note is of a sufficiently consistent loud amplitude.
  • the lag may be calculated from the difference in timing from when the sound being output went from silence to the audio signal and when the sound intensity detector picked up the sudden increase in sound intensity.
  • the device may calculate the display lag using only one of the sensors, e.g. optical or audio, or it may use both. In the case where the device uses both, both measurements may occur simultaneously as they do not generally interfere with each other. It is noted that in such cases, the measurements of rendered signals may occur at different times. For example, if the audio and video synchronization of the output device is off, there may be a variation in the device lag for the audio and video outputs. In the case of a difference in device lag, the controller may employ different timings for the audio and video to compensate for that difference.
  • the measurements may be repeated, e.g., by cycling from low to high intensity several times, to ensure that the changes picked up were from the playback of the output and not from environmental interference. Statistical methods may be employed to ensure that enough points have been collected to obtain a true measurement.
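The measurement loop might look like the following sketch, where `emit_test_signal` and `sensor_detects_change` are placeholders for driving the output (e.g., black-to-white frames or silence-to-tone) and for reading the optical or audio intensity detector; taking the median of several cycles mirrors the repeated-measurement idea above.

```python
import statistics
import time
from typing import Callable


def measure_device_lag(emit_test_signal: Callable[[], None],
                       sensor_detects_change: Callable[[], bool],
                       cycles: int = 5,
                       timeout_s: float = 2.0) -> float:
    """Measure the lag between issuing a test signal and detecting its
    rendered output, repeating several times and returning the median to
    reject environmental glitches (illustrative placeholders throughout)."""
    samples = []
    for _ in range(cycles):
        start = time.monotonic()
        emit_test_signal()                      # e.g., switch black frames to white
        while time.monotonic() - start < timeout_s:
            if sensor_detects_change():         # e.g., light/sound intensity jump
                samples.append(time.monotonic() - start)
                break
        time.sleep(0.1)                         # let the output settle before the next cycle
    if not samples:
        raise RuntimeError("test signal was never detected")
    return statistics.median(samples)


if __name__ == "__main__":
    # Simulated hardware: the "display" becomes visible ~80 ms after the signal.
    state = {"t": None}
    emit = lambda: state.update(t=time.monotonic())
    detect = lambda: state["t"] is not None and time.monotonic() - state["t"] > 0.08
    print(f"measured device lag ~ {measure_device_lag(emit, detect):.3f} s")
```

If audio and video lags are measured separately and differ, the controller can apply different offsets to each output, as noted above.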
  • FIG. 5 illustrates another implementation according to the principles described here: a system 50 in which a first content playback device 132 is currently playing back a content item 138 received or streamed from a service provider through the Internet 25.
  • a playback point 139 has been reached in the content item 138 .
  • a second content playback device 134 is illustrated, and the second content playback device has been indicated as desiring to join the playback of the content item 138 .
  • the first and second content playback devices 132 and 134 are illustrated as part of the local network 15 . It will be understood that there is no requirement the two are on the same local network.
  • a separate synchronization controller 136 is illustrated, and the same may form a portion of the second display, may form a portion of either content playback device, or may be a separate device entirely.
  • the second content playback device 134 has a buffer 135 and upon indication that the second content playback device wishes to join the playback of the first, the buffer 135 may begin to receive the content item through the Internet and/or the local network.
  • FIG. 6 illustrates a flowchart 60 for a method employing the system of FIG. 5 .
  • the first content playback device plays back or streams the content item (step 142 ).
  • a second content playback device indicates a desire to join the playback of the content item, and indicates this desire to a controller (step 144 ).
  • the second content playback device may communicate with the controller to obtain the current playback timing.
  • the first content playback device also communicates its playback point in the content item to the controller.
  • the client and controller may employ knowledge of the network and device lags, such knowledge gained using techniques described elsewhere in this application, to predict how long it will take to buffer to the point where playback can begin in a synchronized fashion.
  • the second content playback device may calculate at which point in the content item data the playback will be that far into the future, and may load any header or index data it needs for that part of the content item. Once the header and index data are loaded, the second content playback device may update its bandwidth estimate and therefore the estimate of what part of the content item data will be needed at the point in time when the second content playback device has managed to buffer enough data to start playing.
  • the controller causes the second content playback device to begin buffering content (step 146), starting with the portion of the content item data that it estimates will contain the portion that will be played at the point in time when it has buffered enough data to start playing, e.g., at a first target point.
  • the second content playback device buffers the content until it has sufficient data to join the playback (step 148). In so doing, it may employ data about known network and device lags and delays (step 162).
  • the second content playback device may then compare the portion of data it has with the current playback point, e.g., point 139 . Additional communication with the controller may be made during buffering to double check that the playback timing information received by the second content playback device is still correct and was not affected by, e.g., abnormally high network lag on the part of either or both content playback devices or other such interruptions. If buffering happened quickly and the playback point has not yet reached the start of the content being buffered, the second content playback device may wait until the playback position reaches the playback point, and then begin playing the beginning of the content it has buffered (step 158 ).
  • the client may determine at what point the current playback is, within the buffered data, and will check to see if there is adequate data buffered beyond that to start playback at that position. If there is sufficient data, then playback begins at the position within the data that corresponds with the current playback point. If there is not enough data buffered, playback will not begin at this point, and the client will continue to buffer the media (step 154 ), repeating the check each time a new segment of content item data is received. Once enough data is received, such that the buffer includes the playback point, the second content playback device may join the playback (step 158 ).
  • a sufficiently disruptive network interruption may occur.
  • the latest data in the buffer may be behind the current playback point, in which case the second content playback device may start over from the beginning with its attempt to begin synchronized playback (step 159 ).
  • FIGS. 5 and 6 may be employed in a number of scenarios, including one in which existing devices that have fallen out of synchronization attempt to resynchronize by following the same steps that newly joining devices perform.
  • the network and device lags pertaining to each may be employed by the controller in calculating when to send start signals to the multiple devices to begin playback.
  • the above techniques may be employed to synchronize the playback of live streaming content, even if there is no existing playback with which to synchronize, as a current playback location for a live media stream is constantly changing, just as when playback of network media already exists.
  • a device that loses network connectivity can rejoin synchronized playback when it regains network connectivity.
  • synchronized playback may be again obtained.
  • the device that lost connectivity may have certain relevant content item data buffered that it may take advantage of to reduce the amount of data needed to download before playback can begin again.
  • one content playback device acts as a master device and another a slave.
  • Systems and methods according to the principles described here, in particular with respect to FIGS. 7 and 8, provide functionality to transmit content over a network to other devices, e.g., from masters to slaves.
  • the devices have the ability to receive this content through their network connections from other devices, and play back that content in a synchronized manner.
  • the output of the slave device is configured to be the same content as the output of the master device, generally, though not in every implementation, with synchronization.
  • the system 70 includes a master content playback device 164 , which receives content from the Internet 25 , and three slave content playback devices 172 , 174 , and 176 .
  • the master content playback device 164 is coupled to the slave content playback device 172 through the local network 15 .
  • the slave content playback devices 174 and 176 are driven directly from the master content playback device 164 , such as through an HDMI or NTSC connection.
  • the master content playback device 164 may itself generate content items through one or more tuners 168 , or the same may be stored in a storage 166 .
  • the storage 166 may be employed to store content items that are then output to clients.
  • Such allows functionality like that of a DVR, e.g., trick play including pause, rewind, and fast-forward.
  • Such commands may need to originate with the master device, or may come from one or more client devices, depending on how the settings are configured. If the master device generates the content, then the same can display such content from the media data in memory, in which case the quality may be degraded if that data is more compressed than the source media.
  • where the master content playback device is, e.g., a Blu-ray® player playing a disc, that internally-generated content item can be the source signal that all slave devices play back.
  • the master content playback device may also receive content items from another device, such as through an HDMI input 167 .
  • the master content playback device may need to encrypt the transmitted signal to the slave content playback device in order to ensure continued protection of the signal.
  • the master may need to encode the source material for transmittal to the slave device over the network if the source is not already in a suitable format.
  • the encoding may employ stronger compression, based on the available bandwidth between the master and the slave device.
  • a first step is that a first content playback device receives a request for another to become a slave device (step 178 ). For example, a user of one device may wish to view content displayed on another device, and so the one device becomes the slave of the other.
  • a next step is that the master content playback device may poll a local or network controller for information about device and network lags (step 182). Such information, once received, allows the master playback device to provide for synchronization, if such synchronization is called for by the application, e.g., where several devices will be in close proximity.
  • the master content playback device then transmits the synchronized content to the slave content playback device (step 184). Such may be done immediately if no lags are expected, or with delays to accommodate the lags described above.
  • the transmission of synchronized content may have a number of associated variations.
  • the master content playback device may provide content using one or more internal tuners (step 186 ).
  • the master content playback device may encode content (step 188 ) to ensure that slave content playback devices can use the content.
  • the master content playback device may further encrypt the content if required by the system (step 194 ).
  • the master content playback device may send the content over a physical connection, e.g., HDMI, NTSC, etc.
  • a master device may have more than one slave device synchronized.
  • the content output by the slave device may be the same content that is output by the master device, regardless of the source. Provision may be made for the ability to control which content the master device is sending to a client device through the client device's user interface.
  • one or more slave content playback devices may be given permission to control the master content playback device.
  • the slave device may be enabled to issue control commands to the master, such as to change the channel or to switch to an external input.
  • the master device may execute these commands, which may change what is being displayed, and therefore what is being sent to all the subscribed client or slave devices.
  • the master content playback device may have privacy settings configured to allow the user to allow all client connections, disallow all client connections, allow only certain clients to connect, or allow clients to connect only if they supply proper authentication credentials. Other such settings will also be understood.
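  • As one way such privacy settings might be represented, the following sketch enumerates the four policies listed above; the policy names, the allow list, and the credential flag are assumptions for illustration, not the patent's interface:

```python
# Illustrative sketch of the master device's client-connection policies.

from enum import Enum

class ClientPolicy(Enum):
    ALLOW_ALL = 1            # allow all client connections
    DENY_ALL = 2             # disallow all client connections
    ALLOW_LISTED = 3         # allow only certain clients to connect
    REQUIRE_CREDENTIALS = 4  # allow clients only with proper authentication

def may_connect(policy, client_id, allowed_ids=frozenset(), credentials_ok=False):
    if policy is ClientPolicy.ALLOW_ALL:
        return True
    if policy is ClientPolicy.DENY_ALL:
        return False
    if policy is ClientPolicy.ALLOW_LISTED:
        return client_id in allowed_ids
    return credentials_ok    # REQUIRE_CREDENTIALS
```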
  • the master device need not display the content that it supplies to the client or slave device. This allows slave devices to access external inputs, e.g., a TV tuner, disc player, or other content source in the master device, even if there is no desire for the master device to also display that content.
  • the master device may display other content while supplying the desired content or the master device may have the display portion of its circuitry in an off state to conserve power.
  • the master device may supply more than one separate content stream to its connected slave or client devices.
  • a particular content playback device may act as a master device relative to some devices, and as a client to others, even at the same time.
  • the user may or may not be concerned about the synchronization of the playback between the master device and the slave device, or between a plurality of slave devices. For example, where devices are not in close proximity, such synchronization is not necessary.
  • the master content playback device may need to delay the playback of its own signal relative to when it transmits a signal to one or more slave devices to account for lag in the transmission of the signal and the processing of the signal by the slave devices. Each device would generally add enough delay so that it plays the content item at the same playback point as the device with the most lag, which plays with no added delay; a minimal sketch of this rule follows.
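  • A minimal sketch of that delay rule, assuming the total lag per device (network plus device) has already been measured; the device names and lag values are illustrative only:

```python
# Sketch: each device delays just enough to align with the most-lagged device.

def compute_delays(lag_by_device):
    """lag_by_device: dict of device id -> total lag in seconds."""
    max_lag = max(lag_by_device.values())
    return {dev: max_lag - lag for dev, lag in lag_by_device.items()}

# Example: the master delays its own output; the slowest slave gets no added delay.
print(compute_delays({"master": 0.02, "slave_hdmi": 0.10, "slave_net": 0.25}))
# {'master': 0.23, 'slave_hdmi': 0.15, 'slave_net': 0.0}
```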
  • display is interpreted to be inclusive of playing an audio signal through speakers in the case where the media being played contains audio information, regardless of whether the media also contains video or image information.
  • An audio device, such as a home audio receiver, may synchronize to a device with an audio and video signal, such as a TV, in which case the home audio device may only request and receive the audio portion of the information.
  • the master device may choose to use multicast communications so that the content item data only needs to be transmitted once in a single stream, thus saving significant bandwidth over having to broadcast the same data in multiple separate communications to each client device.
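  • A bare-bones illustration of single-stream distribution using UDP multicast follows; the group address, port, and 1316-byte chunking are arbitrary choices made for the sketch, not requirements of the system described:

```python
# Minimal UDP multicast sketch: one transmission reaches all subscribed clients.

import socket

MCAST_GROUP, MCAST_PORT = "239.1.2.3", 5004

def send_segment(segment: bytes, ttl: int = 1):
    """Send one content segment once to all subscribed client devices."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, ttl)
    for offset in range(0, len(segment), 1316):        # typical MPEG-TS payload size
        sock.sendto(segment[offset:offset + 1316], (MCAST_GROUP, MCAST_PORT))
    sock.close()
```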
  • a first content playback device 196 is illustrated as part of a local network 15 .
  • a second display 198 is also illustrated as part of this network, and the second display may control aspects of the content playback device 196 using a user interface 199 .
  • the second display 198 further includes a synchronization controller 201 , although, as disclosed above, such a controller may be disposed at various locations in the system.
  • a second content playback device 206 is illustrated, and both the first and second content playback devices are in data communication with the Internet 25 .
  • the first and second content playback devices are not illustrated as being on the same local network, although in an alternative implementation, they may be so coupled.
  • the first and second content playback devices 196 and 206 may communicate with a content or service provider 214 through, in some cases, a management server 212 .
  • the management server 212 may arrange for the presentation of services and assets, including an asset 202 having an asset ID 202 ′, on a user interface of the second display or content playback device. Users may browse content and identify assets through the use of the asset ID. The users of the content playback devices may select the asset 202 for playback, in which case the asset 202 from the service provider is downloaded and played back or streamed to the content playback devices. As noted in FIG. 1 , the same may take place through a content delivery network, not shown in FIG. 9 for clarity.
  • multiple devices are signaled to start playback of the same asset ID.
  • Each client content playback device accesses the management server and/or service provider to obtain the location of the media to play.
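  • One hypothetical way a client might resolve an asset ID to its own playback URL is sketched below; the endpoint path, query parameters, and response field are assumptions made for illustration, not the management server's actual interface:

```python
# Hypothetical sketch: each client resolves an asset ID to its own media URL.

import json
import urllib.parse
import urllib.request

def resolve_media_url(server, asset_id, session_token):
    query = urllib.parse.urlencode({"asset_id": asset_id, "session": session_token})
    with urllib.request.urlopen(f"{server}/media/locate?{query}") as resp:
        return json.loads(resp.read())["url"]   # unique per-client URL to the media
```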
  • the management server and/or service provider authenticates each content playback device, to ensure that the device is authorized to view the content.
  • the authorization may be withheld in such cases as when the client device does not have the capability to play the media, such as due to a hardware limitation or no software support for the codec which encoded the content item.
  • the system may also restrict the client from playing back content if the client has a rating limit set that would cause the content playback to be blocked on that client.
  • each client would make its own request to the service provider to obtain the media data to play.
  • a proxy device may be employed to reduce the number of requests made to the service provider, e.g., one of the content playback devices may act as a proxy device.
  • a first step is that a plurality of content playback devices indicate a desire to view a common asset (step 216 ).
  • Each content playback device authenticates and affiliates with the service provider (step 218 ).
  • each content playback device may establish a session with a management server by logging in, and may further login to the service provider site (in many cases done automatically), providing authentic IPTV credentials to enable a content item to be delivered to the particular content playback device.
  • the content playback device is included in the synchronization group which will view the common asset in a synchronized fashion (step 222 ).
  • the synchronization group may then be filtered based on various factors, if such filtering has not been performed at the authentication step (step 224). Examples of such factors include that certain content may employ differing formats that may require hardware support or codec software that is not available on all clients. Another factor may be that some content distribution licenses only allow the content to be displayed in certain geographical regions. Another factor that may prevent playback is if a device has a rating limit set that would prevent the playback of the content item with the given rating. If playback is not allowed on the client, the controller informs the client and the client is removed from the synchronization group (step 226). Synchronized playback may then begin, as arranged and coordinated by the controller (step 228), with each client device obtaining and using its own unique URL to access the media. A sketch of this filtering appears below.
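  • The filtering step might be sketched as follows, assuming each device advertises its supported codecs, licensed regions, and a numeric rating limit; all of the field names and the numeric rating comparison are illustrative assumptions:

```python
# Sketch: split the synchronization group into allowed and removed devices.

def filter_sync_group(devices, content):
    """Return (allowed, removed) lists of devices for the given content item."""
    allowed, removed = [], []
    for dev in devices:
        ok = (content["codec"] in dev["codecs"]
              and content["region"] in dev["licensed_regions"]
              and content["rating"] <= dev["rating_limit"])
        (allowed if ok else removed).append(dev)
    return allowed, removed
```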
  • where a source device obtains content to play from a service provider, that source device may use the service provider as it would any other content source and transmit the content item to any subscribed client devices as noted above with reference to FIGS. 7 and 8 .
  • the subscribed client devices need not be capable of operating as clients of the service provider.
  • the source device may if necessary re-encode the media to a format that can be passed to the client devices.
  • the second display may operate software that allows the same to choose from a plurality of target devices for playback.
  • the content navigation on the second device may indicate which content is playable by each device that is currently targeted for playback, or may even filter the content choices presented to the user to ensure that the user can only see and choose from content that can be played on all targeted devices.
  • the second display can designate one of the content playback devices to be the controller, in which case the content playback devices to be synchronized establish communication between themselves to synchronize with the controller content playback device.
  • the second display device may act as the controller even though it is not one of the playback devices, in which case the content playback devices to be synchronized communicate with the second display device.
  • the content playback devices may address their communications directly to the controller or may communicate to an external server that is in data communication with all.
  • One implementation includes one or more programmable processors and corresponding computing system components to store and execute computer instructions, such as to execute the code that provides the various server functionality, e.g., that of the management server or content server, second display, or content playback device.
  • a representation of an exemplary computing environment 110 for a server, second display, content playback device, or other such computing device is illustrated.
  • the computing environment includes a controller 234 , a memory 236 , storage 242 , a media device 246 , a user interface 254 , an input/output (I/O) interface 256 , and a network interface 258 .
  • the components are interconnected by a common bus 262 .
  • different connection configurations can be used, such as a star pattern with the controller at the center.
  • the controller 234 includes a programmable processor and controls the operation of the servers, second displays, content playback devices, controllers, and their components.
  • the controller 234 loads instructions from the memory 236 or an embedded controller memory (not shown) and executes these instructions to control the system.
  • Memory 236 , which may include non-transitory computer-readable memory 238 , stores data temporarily for use by the other components of the system.
  • the memory 236 is implemented as DRAM.
  • the memory 236 also includes long-term or permanent memory, such as flash memory and/or ROM.
  • Storage 242 , which may include non-transitory computer-readable memory 244 , stores data temporarily or long-term for use by other components of the system, such as for storing data used by the system.
  • the storage 242 is a hard disc drive or a solid state drive.
  • the media device 246 , which may include non-transitory computer-readable memory 248 , receives removable media and reads and/or writes data to the inserted media.
  • the media device 246 is an optical disc drive or disc burner, e.g., a writable Blu-ray® disc drive 252 .
  • the user interface 254 includes components for accepting user input, e.g., the user indications of streaming content items, and presenting service lists, asset lists and categories, and individual assets to the user.
  • the user interface 254 includes a keyboard, a mouse, audio speakers, and a display.
  • the controller 234 uses input from the user to adjust the operation of the computing environment.
  • the I/O interface 256 includes one or more I/O ports to connect to corresponding I/O devices, such as external storage or supplemental devices, e.g., a printer or a PDA.
  • the ports of the I/O interface 256 include ports such as: USB ports, PCMCIA ports, serial ports, and/or parallel ports.
  • the I/O interface 256 includes a wireless interface for wireless communication with external devices. These I/O interfaces may be employed to connect to one or more content playback devices.
  • the network interface 258 allows connections with the local network and optionally with content playback devices and second displays and includes a wired and/or wireless network connection, such as an RJ-45 or Ethernet connection or “Wi-Fi” interface (802.11). Numerous other types of network connections will be understood to be possible, including WiMax, 3G or 4G, 802.15 protocols, 802.16 protocols, satellite, Bluetooth®, or the like.
  • the servers, second displays, and content playback devices may include additional hardware and software typical of such devices, e.g., power and operating systems, though these components are not specifically shown in the figure for simplicity.
  • different configurations of the devices can be used, e.g., different bus or storage configurations or a multi-processor configuration.
  • the content playback device can take many forms, and multiple content playback devices can be coupled to and selected from within a given local network.
  • Exemplary content playback devices may include, e.g., an IPTV, a digital TV, a digital sound system, a digital entertainment system, a digital video recorder, a video disc player, a combination of these, or any number of other electronic devices addressable by a user on the local network 16 and capable of playing content delivered over the Internet.
  • the same may also include more traditional video and audio systems that have been appropriately configured for connectivity.
  • the content playback device has generally been exemplified by an IPTV, in which case the same will generally include a processor that controls a visual display and an audio renderer such as a sound processor and one or more speakers.
  • the processor may access one or more computer-readable storage media such as but not limited to RAM-based storage, e.g., a chip implementing dynamic random access memory (DRAM), flash memory, or disk-based storage.
  • Software code implementing present logic executable by the content playback device may also be stored on various memories to undertake present principles.
  • the processor can receive user input signals from various input devices including a second display, a remote control device, a point-and-click device such as a mouse, a keypad, etc.
  • a TV tuner may be provided in some implementations, particularly when the content playback device is an IPTV, to receive TV signals from a source such as a set-top box, satellite receiver, cable head end, terrestrial TV signal antenna, etc. Signals from the tuner are then sent to the processor for presentation on the display and sound system.
  • a network interface such as a wired or wireless modem communicates with the processor to provide connectivity to the Internet through the local network. It will be understood that communications between the content playback device and the Internet, or between the second display and the Internet, may also take place through means besides the local network. For example, the second display may communicate with the content playback device through a separate mobile network.
  • the second displays may include any device that can run an application that communicates with a content playback device, including, but not limited to, personal computers, laptop computers, notebook computers, netbook computers, handheld computers, personal digital assistants, mobile phones, smart phones, tablet computers, hand-held gaming devices, gaming consoles, Internet appliances, and also on devices specifically designed for these purposes, in which case the special device would include at least a processor and sufficient resources and networking capability to run the second display application.
  • the second displays may each bear a processor and components necessary to operate an application for service provider and content selection.
  • the processor in the second display may access one or more computer-readable storage media such as but not limited to RAM-based storage, e.g., a chip implementing dynamic random access memory (DRAM), flash memory, or disk-based storage.
  • the second display can receive user input signals from various input devices including a point-and-click device such as a mouse, a keypad, a touch screen, a remote control, etc.
  • a network interface such as a wired or wireless modem communicates with the processor to provide connectivity to wide area networks such as the Internet 26 as noted above.
  • the servers e.g., the management server and content server, have respective processors accessing respective computer-readable storage media which may be, without limitation, disk-based and/or solid state storage.
  • the servers communicate with a wide area network such as the Internet via respective network interfaces.
  • the servers may mutually communicate via the Internet.
  • two or more of the servers may be located on the same local network, in which case they may communicate with each other through the local network without accessing the Internet.
  • a client device, i.e., a content playback device, e.g., an IPTV, may be authorized by way of a second display presenting appropriate authentication credentials to a management server, as disclosed in assignee's co-pending US patent applications incorporated by reference above.
  • the description above may pertain to any digital content, including streamed, live streaming, video-on-demand content, and stored digital content. Any type of digital content file is contemplated, including media files in live streaming formats, e.g., .m3u8 files.
  • the terms “content item”, “content”, and “asset” have been used interchangeably, unless the context dictates otherwise.
  • the master device may provide to the slave device alternate versions of presented content, the alternate versions incorporating video of lower quality, different codecs, different subtitles, different captions, as well as alternate audio tracks such as descriptive audio for the blind, etc.
  • a master device may simultaneously transmit a plurality of content items to multiple content playback devices, instead of just a common content item.
  • the master device may receive network content or DVR content and transmit the same to one content playback device while the master device is simultaneously receiving content from a tuner and transmitting such tuner content to another content playback device.
  • a content playback device may act simultaneously as both a master and a slave, connecting to two separate devices.
  • the content that the master device is transmitting may be the content it is receiving or content from another source, such as a tuner, that it has access to.


Abstract

Systems and methods for synchronizing the playback of network media across multiple content playback devices, termed herein as “playback devices”, “clients”, or “client devices”. In one implementation, client devices are controlled to parse and buffer media content separately. Once all clients are ready, a controller may cause the client devices to start in a synchronized fashion based on signals sent by the controller. The controller adjusts the timing of the signal so that the outputs are displayed in synchronization on each client device. In other implementations, device lag times may be measured. In still other implementations, a master device may synchronize playback of media content on slave devices. In yet other implementations, devices may buffer and join playback of media content occurring on other devices. In further implementations, the systems and methods may be expanded to include steps of processing authentication for service providers prior to arranging synchronized playback.

Description

    CROSS REFERENCE TO RELATED APPLICATION
  • This application is a divisional application of U.S. application Ser. No. 14/661,092, filed Mar. 18, 2015, now U.S. Pat. No. 9,848,22, which is a divisional of U.S. application Ser. No. 13/428,855, filed Mar. 23, 2012 entitled “SYSTEM, METHOD, AND INFRASTRUCTURE FOR SYNCHRONIZED STREAMING OF CONTENT”, now U.S. Pat. No. 8,997,169 issued Mar. 31, 2015, which is owned by the assignee of the present application and is incorporated by reference herein.
  • BACKGROUND
  • Infrastructures exist to provide Internet video from various service providers or services. For example, the Sony Bravia® Internet Link (“BIVL”) technology from Sony Corporation provides a hardware device which when coupled to a broadband Internet connection allows access to Internet video services through a management server infrastructure. Such infrastructures deliver recorded audiovisual media content such as video, audio, and image files.
  • Streaming refers to a delivery of media content in a constant fashion, from a transmission at a source to a reception and presentation at a receiver. For example, Internet delivery of digital content presentations to network computers is commonly streamed, as is Internet television content.
  • With the proliferation of Internet video connected devices, it is common to have several devices playing back the same program, either pre-recorded or live. For example, in a home, the viewer may have two devices in separate rooms, such that the viewer can travel from room-to-room while watching the program. Such is even more common in a sports bar, where many TVs may be streaming the same sporting event. While useful for allowing many viewers to watch the event, such situations commonly experience synchronization problems due to network lag, the source of the signal, and even the model of playback device, e.g., type of IPTV. The problem is accentuated in such situations because viewers will hear a first audio signal from one device and then hear echoes from others. For traditional network media playback, even more serious timing issues may ensue as each playback is started by a device independent of any other playback of the content.
  • In one prior art attempt to remedy this situation, some recent devices have included in an “advanced mode” menu the capability to allow the user to manually specify the timing to compensate for lag. Such an approach has the disadvantage that most users have no way of measuring lag, which is commonly measured in milliseconds, and so will end up setting the value by trial and error, if such is attempted at all.
  • SUMMARY
  • Systems and methods according to the principles described here involve synchronizing the playback of network media across multiple content playback devices, occasionally termed herein as “playback devices”, “clients”, or “client devices”. In one implementation, client devices are controlled to parse and buffer media content separately. Once all clients are ready, a controller may cause the client devices to start in a synchronized fashion based on signals sent by the controller. The controller adjusts the timing of the signal so that the outputs are displayed in synchronization on each client device.
  • In another implementation, a device lag is measured between the generation or output of a signal and the final display or playback of that signal to the user. The lag may be compensated for to allow better playback synchronization. Besides lags due to the device characteristics, lags may also be measured and compensated for relating to network delays in obtaining content.
  • In a further implementation, if a first playback device is playing back content, systems and methods according to the principles described here allow for a second playback device to become synchronized with the first, such that playback of the content item on the second is synchronized to the playback on the first. The second content playback device may begin to buffer content prior to display, and may estimate which content to buffer based on a determined playback point on the first playback device, as well as network bandwidth and network and device lag. Once the second playback device has buffered sufficient content such that playback can begin in a synchronized fashion, playback begins and the first and second playback devices are in sync.
  • In yet another implementation, systems and methods according to the principles described here include setting up a master and slave relationship between two devices, so that the output of the slave device is the same content as the output of the master device. The playback between the two may be synchronized. The master device need not play back the content itself, and may in fact be playing back other content. The source of content may be the master device, e.g., via a tuner, or the source may be upstream of the master device, with the master device just providing throughput of the content. The master device may encode or encrypt the content item for subsequent transmission as need be. In some cases, the master device may authenticate a slave device. A common use is when a content item is downloaded or streamed onto the master device; a slave device wishing to sync and play back the same content may need to be authenticated so that it has 1) permission, and 2) capabilities to support playback of that content (e.g., can support 3D, Dolby plus codec, etc.). A service provider can mark its content as “redistributable” or otherwise shared.
  • In yet another implementation, access to some network content is such that client playback devices play content items after authentication of the playback devices with one or both of a management server and a service provider. In these implementations, synchronized playback may be afforded using certain above synchronization steps as well as content access steps, e.g., affiliation steps with the service provider.
  • In one aspect, the invention is directed towards a method of synchronizing playback of IPTV content between a first content playback device and a second content playback device, including: coupling first and second content playback devices to a controller, the controller configured to control playback of a content item from a service provider on the first and second content playback device; sending data about a device lag time associated with at least one of the first and second content playback devices to the controller; calculating a time differential between a start time associated with the first content playback device and a start time associated with the second content playback device, the time differential at least partially based on the device lag time; and sending signals to the first and second content playback devices to begin playback of the content item, such that the first and second content playback devices begin playback of the content item at substantially the same time.
  • Implementations of the invention may include one or more of the following. The sending of signals to the first and second content playback device may be separated by the time differential. The signals sent to the first and second content playback devices may include data indicating to the first and second content playback devices a respective delay after which playback should begin. The delay may be between zero and the time differential. The controller may be within the first or second content playback device. The controller may be in data communication with a local network associated with the first or second content playback device. The method may further include accessing a management server in data communication with the first and second content playback devices, the management server controlling access to the service provider, and where the controller is in data communication with the management server. The controller may be configured to receive geographic data about a location of the first and second content playback devices, and the calculating a time differential may be further based on the geographic location of the first and second content playback devices. The method may further include determining a network lag time by sending a signal from the first or second content playback device, or both, to the management server, and the time differential may be further based on the network lag time.
  • In another aspect, the invention is directed towards a non-transitory computer-readable medium, including instructions for causing a computing device to implement the above method.
  • In a further aspect, the invention is directed towards a method of determining a device lag time, including: generating a test signal; sending the test signal to initiate a signal indicating that rendering of a content item should begin; detecting the rendering of the content item; and measuring a time between the sending and the detecting to calculate a device lag time.
  • Implementations of the invention may include one or more of the following. The method may further include sending the device lag time to a controller. The rendering of a content item may cause a change in brightness or volume. The detecting may include detecting with a microphone or an optical sensor.
  • In another aspect, the invention is directed towards a non-transitory computer-readable medium, including instructions for causing a computing device to implement the above method.
  • In yet another aspect, the invention is directed towards a method of synchronizing playback of IPTV content between a first content playback device and a second content playback device, including: playing back a content item on a first content playback device; buffering but not playing back the content item on a second content playback device, the buffering but not playing back occurring at least until the buffer includes a portion of the content item currently being played back on the first content playback device; and sending a signal to begin playback of the content item on the second content playback device, such that the playback of the content item on the first and second content playback devices is synchronized.
  • Implementations of the invention may include one or more of the following. The first and second content playback devices may be in data communication with a controller, and the method may further include: sending data about a device lag time associated with the second content playback device to the controller; and sending a signal to the second content playback device to begin playback of the partially buffered content item, the time of the sending a signal based on the device lag time. The buffering may be in response to a request from the second content playback device to join the playback of the content item.
  • In another aspect, the invention is directed towards a non-transitory computer-readable medium, including instructions for causing a computing device to implement the above method.
  • In yet another aspect, the invention is directed towards a method of playback of at least a portion of a content item on a second content playback device based on a presence of the content item at a first content playback device, including: at least partially receiving a content item on a first content playback device; transmitting at least a portion of the received content item to a second content playback device; and encoding or encrypting the content item by the first content playback device prior to the transmitting.
  • Implementations of the invention may include one or more of the following. The first content playback device may generate a portion of the content item using a tuner. The first content playback device may have received a portion of the content item from another content playback device. The method may further include controlling operation of the first content playback device using the second content playback device. The transmitting may be performed immediately upon the receiving. The method may further include receiving device or network lag information at the first content playback device, and the transmitting may be performed following a time differential based on the received device or network lag information. The transmitting may be performed while the first content playback device is playing back the content item, playing back another content item, or not playing a content item. The transmitting may be performed while the first content playback device is playing back the content item, and the transmitting may be performed such that the second content playback device plays back the content item in synchronization with the first content playback device. Multiple second content playback devices may be in data communication with the first content playback device, and the method may further include selecting a second content playback device to receive the content item prior to the transmitting. A plurality of second content playback devices may be in data communication with the first content playback device, and the method may further include transmitting the content item to the plurality of second content playback devices. A plurality of second content playback devices are in data communication with the first content playback device, and the method may further include transmitting the content item using a multicasting method to the plurality of second content playback devices. A plurality of second content playback devices may be in data communication with the first content playback device, and the method may further include: at least partially receiving another content item on the first content playback device; and transmitting at least a portion of the received content item to one content playback device of the plurality and transmitting at least a portion of the received another content item to another content playback device of the plurality.
  • In another aspect, the invention is directed towards a non-transitory computer-readable medium, including instructions for causing a computing device to implement the above method.
  • In a further aspect, the invention is directed towards a method of synchronizing the playback of a content item among a plurality of content playback devices, the content item available through a service provider requiring an affiliation process, including: coupling a plurality of content playback devices in data communication with a controller, the controller configured to at least partially control playback of a content item on the plurality of content playback devices through a service provider, the plurality of content playback devices constituting a synchronization group; sending a signal from the controller to each of the plurality to cause each of the plurality to contact the service provider to obtain access to the content item; in the event one of the plurality is not allowed access to the content item, then notifying the controller of the event and removing the one from the synchronization group; and sending a signal to each of the content playback devices in the synchronization group to begin playback of the content item.
  • Implementations of the invention may include one or more of the following. At least a portion of the plurality may be in data communication with a proxy device, and the sending a signal to cause each of the plurality to contact the service provider may include sending a signal to cause each of the portion of the plurality to contact the service provider through the proxy device. The proxy device may be a second display. The controller may configure the plurality of content playback devices for synchronized playback through a second display. The second display may indicate a list of content items for which access may be obtained by each of the plurality, or a list of content playback devices within the plurality that can obtain access to a given content item. The method may further include sending each of the content playback devices in the synchronization group a unique URL with which to access the content item. The method may further include: receiving data about device lag times associated with at least a first and a second content playback device in the plurality; calculating a time differential between a start time associated with the first content playback device and a start time associated with the second content playback device, the time differential at least partially based on the device lag times; and where the sending a signal to each of the content playback devices in the synchronization group to begin playback of the content item may include sending signals to the first and second content playback devices to begin playback of the content item, a time of each sending separated by the time differential.
  • In another aspect, the invention is directed towards a non-transitory computer-readable medium, including instructions for causing a computing device to implement the above method.
  • Advantages of certain implementations of the system and method may include one or more of the following. Synchronization of playback devices may lead to a significantly enhanced user experience, particularly when playback devices are in close proximity to each other. Device lags may be conveniently measured, and by having the device measure the lag, the timing used to overcome that lag will be accurate, leading to synchronization that is more precise than can be achieved by manually modifying timer settings through trial and error. Device lags may be accounted for from all sources, such as signal transmission, signal decoding, delays used to synchronize video and audio, lag due to the timing of when video frames are displayed, and the like.
  • Certain implementations allow playback devices to join in the playback of a content item in a synchronized fashion, without interrupting the original playback of the content item. Certain implementations further allow synchronized playback where clients are required to go through a management server infrastructure to play content, thus allowing synchronized playback in situations more complex than just the synchronization of direct networked media playback.
  • Other advantages will be apparent from the description that follows, including the figures and claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Like reference numerals denote like elements throughout.
  • FIG. 1 is a block diagram of an exemplary system in accordance with an aspect of the present principles, illustrating a network which may be employed to deliver content in a synchronized fashion to multiple content playback devices using one or more controllers.
  • FIG. 2 is a flowchart illustrating an exemplary method according to another aspect of the present principles, the method for delivering a content item in a synchronized fashion using one or more controllers.
  • FIG. 3 is a block diagram of an exemplary system in accordance with another aspect of the present principles, illustrating a system for measuring a device lag for a content playback device.
  • FIG. 4 is a flowchart illustrating an exemplary method according to another aspect of the present principles, the method for measuring device lags and delivering data about the same to one or more controllers.
  • FIG. 5 is a block diagram of an exemplary system in accordance with another aspect of the present principles, illustrating a system in which a second content playback device may become synchronized with a first content playback device.
  • FIG. 6 is a flowchart illustrating an exemplary method according to another aspect of the present principles, the method for synchronizing a second content playback device with a first content playback device.
  • FIG. 7 is a block diagram of an exemplary system in accordance with another aspect of the present principles, illustrating a system in which a content playback device serves as a master device and one or more content playback devices serve as slave devices.
  • FIG. 8 is a flowchart illustrating an exemplary method, which may be employed in the system of FIG. 7, to create a master/slave relationship among two or more devices.
  • FIG. 9 is a block diagram of an exemplary system in accordance with another aspect of the present principles, illustrating a network which may be employed to deliver content in a synchronized fashion to multiple content playback devices where content is delivered within an infrastructure including a management server and a service provider.
  • FIG. 10 is a flowchart illustrating an exemplary method according to another aspect of the present principles, the method for delivering a content item in a synchronized fashion to multiple content playback devices where content is delivered within an infrastructure including a management server and a service provider.
  • FIG. 11 illustrates an exemplary computing environment, e.g., that of the disclosed IPTV or client content playback device, management server, second display, or the like.
  • DETAILED DESCRIPTION
  • Referring to FIG. 1, a system 10 is shown including content playback devices 12 and 14 coupled to a local network 15, which may be wired, wireless, or a combination of both. A second display 16 is also illustrated on the local network 15 and the same may control the operation of the content playback devices (or other devices) on the local network. The second display 16 may in some cases also display content itself.
  • A remote control 22 may be employed to control the content playback device, or control may be exercised by way of the second display 16. The use of second display devices in such contexts has certain benefits because the same provides complementary functionality to the IPTV, but generally does not require additional investment by the user because the same makes use of a device, e.g., a smartphone, tablet computer, or the like, which most users already have in their possession. Additional details about such second displays and their interactions with content playback devices, e.g., through proxy servers and otherwise, may be seen from Applicants' co-pending U.S. patent application Ser. No. 13/077,181, filed Mar. 31, 2011, entitled “PERSONALIZED SECOND DISPLAY BROWSING EXPERIENCE DUE TO MULTIPLE SESSION FEATURE”, owned by the assignee of the present application and incorporated herein by reference in its entirety.
  • As illustrated in FIG. 1, a number of servers may be accessed by the content playback devices 12 and 14 through the local network 15 and the Internet 25, including a management server 24 and one or more content servers 26, 28, and 32 corresponding to content providers. In this specification, the term “content provider” is used synonymously with “service provider”. The servers may communicate with a content delivery network 34 to enable content items to be delivered to the content playback devices, or such delivery may be direct.
  • In a general method, including use of a second display, a user has a user account with a source or clearinghouse of services. Here, the source or clearinghouse is represented as a management server, but it should be understood that the user account may be with a service provider directly. The management server communicates with at least one content server (generally associated with the service provider) such that the content server provides content items such as streaming assets for presentation or access at the content playback device. The user account has information stored thereon related to what content playback devices are associated with the user account. When a user logs on, they may see this list of content playback devices and may choose a particular content playback device. Once a content playback device has been chosen, a list of services may be displayed from which the user may choose. From a chosen service, a user may select a content item for viewing, undergoing an affiliation or authentication step if required by the service. Additional details may be found in the application incorporated by reference above.
  • A number of synchronization controllers 36-54, also termed just “controllers”, are also illustrated. Controllers may be in one or all of the content playback devices, second displays, or servers controlling content delivery. In general, at least one controller is required, and the controller may be implemented in hardware, software, firmware, or the like. The controller can even be in an external device 18, devoted to controller functionality, or providing other functionality in addition to controller functions.
  • A typical situation represented by FIG. 1 would be a home, sports bar, and even locations such as gas pumps and grocery store checkout aisles. In such situations, a number of content playback devices may be in close proximity to each other. Differences in the timing of playback become noticeable and distracting. For example, if audio is produced by more than one of the devices, then users may experience an echo. With more serious offsets, they may experience two separate portions of the content competing with each other. Such dyssynchrony may be typical where content sources differ, e.g., satellite versus terrestrial, or even within separate models of playback devices. Generally, manually starting playback of a content item at the same time on two different devices cannot provide playback sufficiently close in time to avoid such difficulties. And tuning all of the playback devices to the same source signal does not work for the playback of network media.
  • Consequently, the controllers 36-54 are employed to coordinate such playback. All of the client devices participating in the synchronized playback establish data communication with a controlling device or controller that coordinates the playback timing across all participating clients. The controller can be one of the client devices, or it may be a separate device. Generally, some client devices will be capable of operating as controlling devices, and others will not.
  • Referring to the flowchart 20 of FIG. 2, the individual content playback devices, e.g., devices 12 and 14, parse and buffer media content separately (step 56). The devices may get the media directly from the source, or they may obtain the media through a proxy device. For example, one of the client devices, or the controlling device, may operate as a proxy device and provide throughput or distribution of content in a manner described below, e.g., with respect to FIGS. 5-8. In the case where the network configuration allows multicast communications between the proxy device and the clients, the proxy may distribute the media to the clients using multicast communications to reduce the amount of bandwidth that is needed. The steps below may be performed when client devices have a degree of lag associated with them, e.g., either due to the network or due to device characteristics.
  • Once the content playback devices are ready to play back the content item, they signal their readiness to the controller (step 68), e.g., to controller 42 in the second display 16. In particular, once all of the content playback devices have decoded the index and header information they need and have buffered enough data such that they may start playback, their readiness is communicated to the controller. At this point, the client devices are waiting for a start signal to begin playback of the content item. Once the controller has received a signal from all clients indicating their readiness to begin playback, a start signal may be sent to each client to begin playback. Upon receiving the signal from the controller, each client device should be in a state where playback may begin immediately or at a specified future time, so as to account for the local network lag of communications from the controller to all clients. The controller 42 may adjust the timing of start signals (step 62) so that the output is displayed in synchronization on each client device. For example, the controller 42 may delay the sending of start signals based on the network lag of each client device, or may send all the start signals but indicate within the signal a respective delay after which playback should begin. In this latter alternative, all the playback devices have an opportunity to cache content while they are waiting through the delay.
  • A number of steps may be employed in determining the timing of the start signals. For example, if the controller is based at the server level, e.g., within the management server, the same may be aware of and account for differences in location of the service provider or source server relative to the client devices (step 64). In other words, some client devices may be located much closer to the source of content than others, and thus will experience less network lag or delay in receiving content.
  • Device lags may also be accounted for, such as the device lag between when a playback signal is generated and when that signal is actually displayed to the user. Such device lags may be measured using techniques described below, and in any case data about such lags may be communicated to the controller (step 66). Client devices may also employ a step of attempting to measure their network lag, and communicating the same to the controller (step 76), by measuring how long it takes for a test signal to traverse to a network location and back, e.g., to the management server.
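  • A simple sketch of such a round-trip measurement is shown below, assuming the management server echoes a small probe on a known port; the probe format, the port, and the use of half the median round-trip time are hypothetical choices for illustration:

```python
# Sketch: estimate one-way network lag from round-trip probes to a known echo port.

import socket
import time

def measure_network_lag(host, port, probes=5):
    """Estimate one-way network lag as half the median round-trip time."""
    rtts = []
    for _ in range(probes):
        with socket.create_connection((host, port), timeout=2) as s:
            start = time.monotonic()
            s.sendall(b"ping")
            s.recv(4)                      # server echoes the probe
            rtts.append(time.monotonic() - start)
    rtts.sort()
    return rtts[len(rtts) // 2] / 2.0
```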
  • Once data is obtained about network lags and device lags, the one or more controllers may use the data to order the start times at which signals will be sent to client devices to begin playback (step 72). For example, the controller may compensate for the differing lag times of the clients by giving a start command to the client with the most lag first and giving a start command to the other clients with enough delay so that the final display of the content will occur in a synchronized fashion. Once the ordering is done, and timing differentials calculated between the various start times, start signals may be sent to client devices (step 74).
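  • The scheduling just described can be sketched as follows, with the most-lagged client signaled first and the remaining signals offset so that the final displays coincide; the callback and the data shapes are illustrative assumptions, not the controller's actual interface:

```python
# Sketch: send start signals ordered by lag so that all outputs align.

import time

def send_start_signals(clients, send_start):
    """clients: dict of client id -> total lag in seconds (network + device).
    send_start: callable taking a client id, invoked when its signal should go out."""
    max_lag = max(clients.values())
    # Offset for each client cancels its lag deficit relative to the slowest client.
    schedule = sorted(((max_lag - lag, cid) for cid, lag in clients.items()))
    t0 = time.monotonic()
    for offset, cid in schedule:
        time.sleep(max(0.0, offset - (time.monotonic() - t0)))
        send_start(cid)
```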
  • Referring to the system 30 of FIG. 3, device lag times may be measured and used in calculations to provide for synchronized playback. It is noted in this connection that systems and methods according to the principles described here may be broadly applied to any combination of devices, and not just a particular content playback device. For example, some devices display their signals by outputting them, such as through an HDMI connector, and another device actually performs the display. The amount of lag thus depends on the combination of devices. Lag times generally cannot be predetermined, as different TV models, even from the same manufacturer, may have different lag times between when they receive a digital input signal and when that signal is displayed to the user.
  • To adjust for this, the device of FIG. 3 may be employed. In FIG. 3, a content playback device 78 is shown having a network interface 82 that allows connection to service providers (not shown) through the Internet 25. The content playback device 78 is illustrated connected to a display 86 and an audio system 88. It will be understood, however, that these may form part of an integrated content playback device 108 that incorporates all of these subsystems.
  • The content playback device 78 includes a playback signal generation circuit 84 that accepts a start signal, generally originating from a controller, from the network interface 82. The start signal indicates that playback should begin. An exemplary playback signal is illustrated in the graph 85. Once the playback signal is generated, a finite amount of time Δt passes before a user actually sees a corresponding signal on the display, illustrated as Δtv, or hears a corresponding sound on the audio system, illustrated as Δta, in graphs 98 and 102, respectively. To determine these time differentials, an optical sensor 94, such as a camera, is disposed to receive displayed signals from the display 86. An audio sensor, such as a microphone 96, is disposed to receive rendered signals from the audio system 88. For example, a light detector may be placed in front of the display and a microphone in front of a speaker. These sensors provide signals to a measurement circuit 104, which also receives an indication of the playback signal 85 from the playback signal generation circuit 84. By measuring the time between the playback signal 85 and the signals 98 and 102, the device lag may be calculated.
  • It will be understood that the type of sensor may vary, the only requirement being that the sensors be positioned such that they can detect the playback being output by the device. As audio is not highly directional, a built-in microphone may not need any special positioning if the device is located in the same room as the playback. A light intensity sensor or detector should be located so that it faces the screen where the video output is playing. Generally, such optical detectors should have a narrow field of vision and may employ shielding, such as a flat black tube, to reduce the amount of stray light from other angles being picked up by the sensor.
  • In general, the sensors need not be of particularly high quality, as they only need to respond quickly to the overall intensity they are receiving. For example, inexpensive microphones, such as those commonly used in telephones, will generally be sufficient for detecting overall sound intensity in real time. For the light detector, any camera sensor may be employed, even one lacking the optics necessary to produce a clear picture. The light detector may also simply detect overall light intensity and need not employ multiple pixels or be able to detect different intensities for different wavelengths of visible light.
  • The above system provides various advantages. For example, the system measures the overall lag, the same being a primary parameter required to synchronize the output. No matter how complex the signal processing pathway is, the overall result is measured. In this way, complex cases where significant signal processing exists may still be afforded synchronized playback, e.g., in professional broadcast environments where signals may be routed through many pieces of equipment. In this connection, it is noted that the measurement may be for a lag time through an arbitrary signal path, and may not necessarily include rendering of the signal at the end of the path. For such implementations, an intermediate sensor 106 may be employed to monitor the signal at the end of the signal path being measured, to determine when the generated signal reaches that point.
  • In variations of the above, the lag measurement may be automated such that device lags are automatically measured each time a change in signal path is detected, such as when a new device is attached to an HDMI output. Such automation may be provided in any of the embodiments described above.
  • A method that may be employed by the system of FIG. 3 is illustrated by a flowchart 40 in FIG. 4. A first step is that a content playback device receives the start signal, or itself initiates a test signal (step 112). For a light intensity sensor, the content playback device may begin by outputting a black video signal and then, for the test signal, output one or more white video frames (step 114). The test signal is sent to the measurement circuit (step 116) as well as to the actual display.
  • The test signal is then rendered, e.g., visually and/or aurally (step 118), and the same is detected by the optical sensor and/or microphone (step 122), respectively. Indication of receipt of the test signal is sent to the measurement circuit (step 124). The difference between the time of arrival of the start signal (or initiation of test signal) and the time of detection yields the lag time for the signal (step 126). This “device lag time” may then be sent to one or more controllers in data communication with the content playback device (step 128).
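A minimal sketch of the measurement loop of steps 112 through 126 might look as follows. The hooks for emitting the test signal and polling the sensor are hypothetical placeholders, not a real device API:

```python
import time

def measure_device_lag(output_test_signal, sensor_triggered, timeout=2.0):
    """Measure device lag: the interval between generating a test signal
    (e.g., switching from black frames to white frames, or from silence
    to a loud tone) and the moment a sensor detects the rendered output.

    `output_test_signal` and `sensor_triggered` are callables supplied by
    the platform; both names are illustrative assumptions."""
    start = time.monotonic()
    output_test_signal()                       # emit the test signal (cf. steps 114/116)
    while time.monotonic() - start < timeout:
        if sensor_triggered():                 # detector sees the change (cf. step 122)
            return time.monotonic() - start    # device lag time (cf. step 126)
        time.sleep(0.001)
    raise TimeoutError("test signal was never detected")
```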
  • For a sound intensity sensor, the device may begin by outputting a silent audio signal and then outputting a loud signal. The audio signal that is used may vary, but should substantially immediately increase from silence to a steady volume. A single tone, such as a sine wave or square wave, can be used, or the output may include white noise. Musical outputs may be employed if the first note is of a sufficiently consistent loud amplitude. As with the optical detector, the lag may be calculated from the difference in timing from when the sound being output went from silence to the audio signal and when the sound intensity detector picked up the sudden increase in sound intensity.
  • The device may calculate the display lag using only one of the sensors, e.g., optical or audio, or it may use both. In the case where the device uses both, both measurements may occur simultaneously, as they do not generally interfere with each other. It is noted that in such cases the measurements of the rendered signals may occur at different times. For example, if the audio and video synchronization of the output device is off, there may be a variation in the device lag for the audio and video outputs. In the case of such a difference in device lag, the controller may employ different timings for the audio and video to compensate.
  • The measurements may be repeated, e.g., by cycling from low to high intensity several times, to ensure that the changes picked up were from the playback of the output and not from environmental interference. Statistical methods may be employed to ensure that enough points have been collected to obtain a true measurement.
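For example, the repeated-measurement idea might be sketched as follows, using the median of several cycles as a simple robust statistic (one possible choice; other statistical treatments could equally be used):

```python
import statistics

def robust_device_lag(measure_once, cycles=7):
    """Repeat the low/high intensity cycle several times and keep a robust
    central value, so that a single environmental disturbance (a light
    switching on, a door slamming) does not skew the result.
    `measure_once` is any callable returning one lag sample in seconds."""
    samples = [measure_once() for _ in range(cycles)]
    return statistics.median(samples)
```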
  • FIG. 5 illustrates another implementation according to the principles described here, in this case a system 50 in which a first content playback device 132 is currently playing back a content item 138 received or streamed from a service provider through the Internet 25. In the playback by the first content playback device 132, a playback point 139 has been reached in the content item 138.
  • A second content playback device 134 is illustrated, and the second content playback device has been indicated as desiring to join the playback of the content item 138. The first and second content playback devices 132 and 134, respectively, are illustrated as part of the local network 15. It will be understood that there is no requirement that the two be on the same local network. In addition, a separate synchronization controller 136 is illustrated, and the same may form a portion of the second display, may form a portion of either content playback device, or may be a separate device entirely.
  • The second content playback device 134 has a buffer 135 and upon indication that the second content playback device wishes to join the playback of the first, the buffer 135 may begin to receive the content item through the Internet and/or the local network.
  • FIG. 6 illustrates a flowchart 60 for a method employing the system of FIG. 5. In a first step, the first content playback device plays back or streams the content item (step 142). A second content playback device indicates a desire to join the playback of the content item, and indicates this desire to a controller (step 144). In so doing, the second content playback device may communicate with the controller to obtain the current playback timing. In this process, the first content playback device also communicates its playback point in the content item to the controller. The client and controller may employ knowledge of the network and device lags, gained using techniques described elsewhere in this application, to predict how long it will take to buffer to the point where playback can begin in a synchronized fashion. The second content playback device may calculate which point in the content item data will be playing that far in the future, and may load any header or index data it needs for that part of the content item. Once the header and index data are loaded, the second content playback device may update its bandwidth estimate, and therefore its estimate of which part of the content item data will be needed at the point in time when it has managed to buffer enough data to start playing.
  • In particular, the controller causes the second content playback device to begin buffering content (step 146), starting with the portion of the content item data that it estimates will contain the portion being played at the point in time when it has buffered enough data to start playing, e.g., at a first target point. The second content playback device buffers the content until it has sufficient data to join the playback (step 148). In so doing, it may employ data about known network and device lags and delays (step 162).
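A rough sketch of how the first target point might be estimated from the quantities discussed above; all parameter names are illustrative assumptions rather than quantities defined in the disclosure:

```python
def estimate_join_point(current_playback_point_s, startup_buffer_bytes,
                        bandwidth_bytes_per_s, network_lag_s, device_lag_s):
    """Estimate the media position (seconds into the content item) at which
    a joining device should begin buffering, i.e., a first target point.
    All inputs are estimates reported by the client and controller."""
    time_to_buffer = startup_buffer_bytes / bandwidth_bytes_per_s
    lead_time = time_to_buffer + network_lag_s + device_lag_s
    return current_playback_point_s + lead_time
```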
  • Once the second content playback device has buffered enough data to start playback, it may then compare the portion of data it has with the current playback point, e.g., point 139. Additional communication with the controller may take place during buffering to double-check that the playback timing information received by the second content playback device is still correct and was not affected by, e.g., abnormally high network lag on the part of either or both content playback devices or other such interruptions. If buffering happened quickly and the playback point has not yet reached the start of the buffered content, the second content playback device may wait until the current playback point reaches the beginning of its buffer, and then begin playing the buffered content from its beginning (step 158).
  • If the current playback point has already passed the beginning of the data that was buffered, the client may determine where the current playback point falls within the buffered data and check whether there is adequate data buffered beyond that point to start playback at that position. If there is sufficient data, then playback begins at the position within the data that corresponds with the current playback point. If there is not enough data buffered, playback does not begin at this point, and the client continues to buffer the media (step 154), repeating the check each time a new segment of content item data is received. Once enough data is received, such that the buffer includes the playback point, the second content playback device may join the playback (step 158).
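The join logic of steps 148 through 159 might be summarized, purely as an illustrative sketch with assumed names and a hypothetical minimum-runway threshold, as follows:

```python
def join_decision(current_point, buffer_start, buffer_end, min_runway=2.0):
    """Decide how a joining device proceeds, given the current playback
    point of the existing playback and the range of media time it has
    buffered (all values in seconds of media time). `min_runway` is an
    assumed minimum amount of buffered media beyond the join position."""
    if buffer_end < current_point:
        return ("restart", None)        # buffer fell behind; start over (cf. step 159)
    if current_point < buffer_start:
        return ("wait", None)           # wait for playback to reach buffer start (cf. step 158)
    if buffer_end - current_point >= min_runway:
        return ("join_at", current_point - buffer_start)  # seek into the buffer and play
    return ("keep_buffering", None)     # not enough data yet; keep buffering (cf. step 154)
```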
  • In some cases, a sufficiently disruptive network interruption may occur. In this case, the latest data in the buffer may be behind the current playback point, in which case the second content playback device may start over from the beginning with its attempt to begin synchronized playback (step 159).
  • The system and method of FIGS. 5 and 6 may be employed in a number of scenarios, including where, if existing devices are not in synchronization, one or more may attempt synchronization by following the steps that newly joining devices would perform. Where multiple devices are joining a playback of the content item, the network and device lags pertaining to each may be employed by the controller in calculating when to send start signals to the multiple devices to begin playback. The above techniques may also be employed to synchronize the playback of live streaming content, even if there is no existing playback with which to synchronize, since the current playback location of a live media stream is constantly advancing, just as when playback of network media already exists. In another implementation, a device that loses network connectivity can rejoin synchronized playback when it regains network connectivity. By following the steps described above, i.e., those a new client would employ, synchronized playback may again be obtained. In some cases, the device that lost connectivity may still have relevant content item data buffered, which it may take advantage of to reduce the amount of data that must be downloaded before playback can begin again.
  • In some cases of synchronization, it may be desired to set up a direct relationship in which one content playback device acts as a master device and another as a slave. Systems and methods according to the principles described here, in particular with respect to FIGS. 7 and 8, provide functionality to transmit content over a network to other devices, e.g., from masters to slaves. The devices have the ability to receive this content through their network connections from other devices and play back that content in a synchronized manner. In this way, the output of the slave device is configured to be the same content as the output of the master device, generally, though not in every implementation, in synchronization.
  • For example, referring to FIG. 7, the system 70 includes a master content playback device 164, which receives content from the Internet 25, and three slave content playback devices 172, 174, and 176. The master content playback device 164 is coupled to the slave content playback device 172 through the local network 15. The slave content playback devices 174 and 176 are driven directly from the master content playback device 164, such as through an HDMI or NTSC connection. The master content playback device 164 may itself generate content items through one or more tuners 168, or the same may be stored in a storage 166. The storage 166 may be employed to store content items that are then output to clients. This allows functionality like that of a DVR, e.g., trick play including pause, rewind, and fast-forward. Such commands may need to originate with the master device, or may come from one or more client devices, depending on how the settings are configured. If the master device generates the content, then it can display such content from the media data in memory, in which case the quality may be degraded if that data is more compressed than the source media. Where the master content playback device is, e.g., a Blu-ray® player playing a disc, that internally-generated content item can be the source signal that all slave devices play back.
  • The master content playback device may also receive content items from another device, such as through an HDMI input 167. Where the input is a protected signal, as through an HDMI connection, the master content playback device may need to encrypt the transmitted signal to the slave content playback device in order to ensure continued protection of the signal. Moreover, the master may need to encode the source material for transmittal to the slave device over the network if the source is not already in a suitable format. In some cases, the encoding may employ stronger compression, based on the available bandwidth between the master and the slave device.
  • Referring to the flowchart 80 of FIG. 8, in which an exemplary method of use is described for the system 70 of FIG. 7, a first step is that a first content playback device receives a request for another device to become its slave (step 178). For example, a user of one device may wish to view content displayed on another device, and so the one device becomes the slave of the other. A next step is that the master content playback device may poll a local or network controller for information about device and network lags (step 182). Such information, once received, allows the master playback device to provide for synchronization, if such synchronization is called for by the application, e.g., where several devices will be in close proximity.
  • The master content playback device then transmits the synchronized content to the slave content playback device (step 184). This may be done immediately if no lags are expected, or with delays to accommodate lags such as those described above.
  • The transmission of synchronized content may have a number of associated variations. For example, the master content playback device may provide content using one or more internal tuners (step 186). The master content playback device may encode content (step 188) to ensure that slave content playback devices can use the content. The master content playback device may further encrypt the content if required by the system (step 194). In yet other implementations, the master content playback device may send the content over a physical connection, e.g., HDMI, NTSC, etc.
  • Other variations will also be seen. For example, and as indicated in FIG. 7, a master device may have more than one slave device synchronized. The content output by the slave device may be the same content that is output by the master device, regardless of the source. Provision may be made for the ability to control which content the master device is sending to a client device through the client device's user interface.
  • In some implementations, one or more slave content playback devices may be given permission to control the master content playback device. In this case, the slave device may be enabled to issue control commands to the master, such as to change the channel or to switch to an external input. The master device may execute these commands, which may change what is being displayed, and therefore what is being sent to all the subscribed client or slave devices. The master content playback device may have privacy settings that allow the user to permit all client connections, disallow all client connections, allow only certain clients to connect, or allow clients to connect only if they supply proper authentication credentials. Other such settings will also be understood.
  • It is noted that the master device need not display the content that it supplies to the client or slave device. This allows slave devices to access external inputs, e.g., a TV tuner, disc player, or other content source in the master device, even if there is no desire for the master device to also display that content. The master device may display other content while supplying the desired content or the master device may have the display portion of its circuitry in an off state to conserve power. In some implementations, the master device may supply more than one separate content stream to its connected slave or client devices. It is further noted that a particular content playback device may act as a master device relative to some devices, and as a client to others, even at the same time.
  • In some implementations, the user may or may not be concerned about the synchronization of the playback between the master device and the slave device, or between a plurality of slave devices. For example, where devices are not in close proximity, such synchronization is not necessary. Where synchronization is employed, the master content playback device may need to delay the playback of its own signal relative to when it transmits a signal to one or more slave devices, to account for lag in the transmission of the signal and the processing of the signal by the slave devices. Each device would generally add enough delay so that the content item is played at the same playback point at which the device with the most lag, adding no delay of its own, plays it.
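As an illustrative sketch of that delay calculation, assuming each device knows its own total lag and the lags of the other devices in the group (names and units are assumptions):

```python
def local_playback_delay(own_lag_s, group_lags_s):
    """Delay (seconds) a device should add to its own output so that it
    lines up with the device having the most lag, which adds no delay.
    Example: own lag 0.05 s in a group whose worst lag is 0.30 s yields
    a local delay of 0.25 s."""
    return max(group_lags_s) - own_lag_s
```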
  • It is understood that the term “display” is interpreted to be inclusive of playing an audio signal through speakers in the case where the media being played contains audio information, regardless of whether the media also contains video or image information. An audio device, such as a home audio receiver, may synchronize to a device with an audio and video signal, such as a TV, in which case the home audio device may only request and receive the audio portion of the information.
  • In another variation of the above implementations, if the slave devices subscribed to a master device are connected within the same local network, such that multicast network communications are enabled between the devices, the master device may choose to use multicast communications so that the content item data only needs to be transmitted once in a single stream, saving significant bandwidth over sending the same data in separate unicast communications to each client device.
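A minimal sketch of such a multicast transmission on the local network might look as follows. The multicast group address and port are arbitrary illustrative values, and a real system would add sequencing, pacing, and error handling:

```python
import socket

MCAST_GROUP = "239.1.2.3"   # illustrative multicast address on the local network
MCAST_PORT = 5004           # illustrative port

def send_segment_multicast(segment_bytes):
    """Send one chunk of content item data once to the multicast group,
    instead of unicasting it separately to every subscribed slave device."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Keep the traffic on the local network by limiting the multicast TTL.
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
    try:
        sock.sendto(segment_bytes, (MCAST_GROUP, MCAST_PORT))
    finally:
        sock.close()
```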
  • In yet another implementation, systems and methods according to the principles described here relate to providing synchronized playback even when content playback devices must access management server infrastructures to access content, including undergoing affiliation and authentication procedures. For example, referring to the system 90 of FIG. 9, a first content playback device 196 is illustrated as part of a local network 15. A second display 198 is also illustrated as part of this network, and the second display may control aspects of the content playback device 196 using a user interface 199. The second display 198 further includes a synchronization controller 201, although, as disclosed above, such a controller may be disposed at various locations in the system. A second content playback device 206 is illustrated, and both the first and second content playback devices are in data communication with the Internet 25. The first and second content playback devices are not illustrated as being on the same local network, although in an alternative implementation, they may be so coupled.
  • Through the Internet 25, the first and second content playback devices 196 and 206, respectively, may communicate with a content or service provider 214 through, in some cases, a management server 212. For example, the management server 212 may arrange for the presentation of services and assets, including an asset 202 having an asset ID 202′, on a user interface of the second display or content playback device. Users may browse content and identify assets through the use of the asset ID. The users of the content playback devices may select the asset 202 for playback, in which case the asset 202 from the service provider is downloaded and played back or streamed to the content playback devices. As noted in FIG. 1, the same may take place through a content delivery network, not shown in FIG. 9 for clarity.
  • Generally, to access content from a content or service provider, steps of affiliation are required to ensure access by a particular device is allowed and enabled. Steps of such affiliation processes are described in co-pending applications: U.S. patent application Ser. No. 13/077,298, filed Mar. 31, 2011, entitled “Direct Service Launch On A Second Display”; U.S. patent application Ser. No. 13/207,581, filed Aug. 11, 2011, entitled “System And Method To Easily Return To A Recently Accessed Service On A Second Display”; U.S. patent application Ser. No. 13/233,398, filed Sep. 15, 2011, entitled “System And Method To Store A Service Or Content List For Easy Access On A Second Display”; and U.S. patent application Ser. No. 13/217,931, filed Aug. 25, 2011, entitled “System And Method Providing A Frequently Accessed Service Or Asset List On a Second Display”; all of which are owned by the assignee of the present application and herein incorporated by reference in their entireties.
  • In systems and methods according to FIGS. 9 and 10, multiple devices are signaled to start playback of the same asset ID. Each client content playback device accesses the management server and/or service provider to obtain the location of the media to play. The management server and/or service provider authenticates each content playback device to ensure that it is authorized to view the content. The authorization may be withheld in such cases as when the client device does not have the capability to play the media, such as due to a hardware limitation or a lack of software support for the codec that encoded the content item. The system may also restrict the client from playing back content if the client has a rating limit set that would cause the content playback to be blocked on that client. If playback is not allowed on one or more clients, those clients can inform the controller, which may remove them from the synchronization group and continue with the remaining clients. The controller may also choose to treat such disallowance as an error condition, reporting or logging it, as configured, and aborting the playback. In this system, each client would make its own request to the service provider to obtain the media data to play. Depending on how the system is configured, a proxy device may be employed to reduce the number of requests made to the service provider, e.g., one of the content playback devices may act as a proxy device.
  • In more detail, and referring to a flowchart 100 in FIG. 10, a first step is that a plurality of content playback devices indicate a desire to view a common asset (step 216). Each content playback device authenticates and affiliates with the service provider (step 218). For example, each content playback device may establish a session with a management server by logging in, and may further login to the service provider site (in many cases done automatically), providing authentic IPTV credentials to enable a content item to be delivered to the particular content playback device. Upon authentication, the content playback device is included in the synchronization group which will view the common asset in a synchronized fashion (step 222).
  • The synchronization group may then be filtered based on various factors, if such filtering has not been performed at the authentication step (step 224). Examples of such factors include that certain content may employ differing formats that require hardware or codec software support that is not available on all clients. Another factor may be that some content distribution licenses only allow the content to be displayed in certain geographical regions. Another factor that may prevent playback is if a device has a rating limit set that would prevent the playback of the content item with the given rating. If playback is not allowed on a client, the controller informs the client and the client is removed from the synchronization group (step 226). Synchronized playback may then begin, as arranged and coordinated by the controller (step 228), with each client device obtaining and using its own unique URL to access the media.
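The filtering of the synchronization group might be sketched as follows. The field names for capability and license data, and the numeric rating comparison, are illustrative assumptions about what the controller has gathered:

```python
def filter_sync_group(clients, content):
    """Remove clients that cannot, or may not, play the content item.
    `clients` is a list of dicts describing each client's capabilities;
    `content` describes the asset. Returns (allowed, removed_with_reason)."""
    allowed, removed = [], []
    for c in clients:
        if content["codec"] not in c["supported_codecs"]:
            removed.append((c["id"], "no hardware or codec software support"))
        elif c["region"] not in content["licensed_regions"]:
            removed.append((c["id"], "region not licensed"))
        elif content["rating"] > c["rating_limit"]:   # assumes numeric ratings
            removed.append((c["id"], "rating limit exceeded"))
        else:
            allowed.append(c)
    return allowed, removed
```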
  • Variations of the above system and method will be understood given the teaching herein. For example, combinations of the above synchronization techniques may be employed. As another example, if a source device obtains content to play from a service provider, that source device may use the service provider as it would any other content source and transmit the content item to any subscribed client devices as noted above with reference to FIGS. 7 and 8. In this situation, the subscribed client devices need not be capable of operating as clients of the service provider. The source device may, if necessary, re-encode the media to a format that can be passed to the client devices.
  • Where a second display controls playback, the second display may operate software that allows it to choose from a plurality of target devices for playback. The content navigation on the second display may indicate which content is playable by each device that is currently targeted for playback, or may even filter the content choices presented to the user to ensure that the user can only see and choose from content that can be played on all targeted devices. If playback is initiated by a second display, the second display can designate one of the content playback devices to be the controller, in which case the content playback devices to be synchronized establish communication between themselves to synchronize with the controller content playback device. Alternatively, if playback is initiated by a second display device, the second display device may act as the controller even though it is not one of the playback devices, in which case the content playback devices to be synchronized communicate with the second display device. The content playback devices may address their communications directly to the controller or may communicate with an external server that is in data communication with all of them.
  • Systems and methods have been disclosed that allow improvement of the user experience of the IPTV without adding to the hardware costs of the unit. As disclosed above, users may employ the system and method to play back content in a synchronized fashion, allowing enjoyment of content items without the disadvantages suffered by prior attempts at coordinated playback.
  • One implementation includes one or more programmable processors and corresponding computing system components to store and execute computer instructions, such as to execute the code that provides the various server functionality, e.g., that of the management server or content server, second display, or content playback device. Referring to FIG. 11, a representation of an exemplary computing environment 110 for a server, second display, content playback device, or other such computing device is illustrated.
  • The computing environment includes a controller 234, a memory 236, storage 242, a media device 246, a user interface 254, an input/output (I/O) interface 256, and a network interface 258. The components are interconnected by a common bus 262. Alternatively, different connection configurations can be used, such as a star pattern with the controller at the center.
  • The controller 234 includes a programmable processor and controls the operation of the servers, second displays, content playback devices, controllers, and their components. The controller 234 loads instructions from the memory 236 or an embedded controller memory (not shown) and executes these instructions to control the system.
  • Memory 236, which may include non-transitory computer-readable memory 238, stores data temporarily for use by the other components of the system. In one implementation, the memory 236 is implemented as DRAM. In other implementations, the memory 236 also includes long-term or permanent memory, such as flash memory and/or ROM.
  • Storage 242, which may include non-transitory computer-readable memory 244, stores data temporarily or long-term for use by other components of the system, such as for storing data used by the system. In one implementation, the storage 242 is a hard disc drive or a solid state drive.
  • The media device 246, which may include non-transitory computer-readable memory 248, receives removable media and reads and/or writes data to the inserted media. In one implementation, the media device 246 is an optical disc drive or disc burner, e.g., a writable Blu-ray® disc drive 252.
  • The user interface 254 includes components for accepting user input, e.g., the user indications of streaming content items, and presenting service lists, asset lists and categories, and individual assets to the user. In one implementation, the user interface 254 includes a keyboard, a mouse, audio speakers, and a display. The controller 234 uses input from the user to adjust the operation of the computing environment.
  • The I/O interface 256 includes one or more I/O ports to connect to corresponding I/O devices, such as external storage or supplemental devices, e.g., a printer or a PDA. In one implementation, the ports of the I/O interface 256 include ports such as: USB ports, PCMCIA ports, serial ports, and/or parallel ports. In another implementation, the I/O interface 256 includes a wireless interface for wireless communication with external devices. These I/O interfaces may be employed to connect to one or more content playback devices.
  • The network interface 258 allows connections with the local network and optionally with content playback devices and second displays and includes a wired and/or wireless network connection, such as an RJ-45 or Ethernet connection or “Wi-Fi” interface (802.11). Numerous other types of network connections will be understood to be possible, including WiMax, 3G or 4G, 802.15 protocols, 802.16 protocols, satellite, Bluetooth®, or the like.
  • The servers, second displays, and content playback devices may include additional hardware and software typical of such devices, e.g., power and operating systems, though these components are not specifically shown in the figure for simplicity. In other implementations, different configurations of the devices can be used, e.g., different bus or storage configurations or a multi-processor configuration.
  • Aspects specific to certain computing environments are discussed below.
  • The content playback device can take many forms, and multiple content playback devices can be coupled to and selected from within a given local network. Exemplary content playback devices may include, e.g., an IPTV, a digital TV, a digital sound system, a digital entertainment system, a digital video recorder, a video disc player, a combination of these, or any number of other electronic devices addressable by a user on the local network 15 and capable of playing back content delivered over the Internet. The same may also include more traditional video and audio systems that have been appropriately configured for connectivity. For the sake of simplicity, in this specification the content playback device has generally been exemplified by an IPTV, in which case it will generally include a processor that controls a visual display and an audio renderer such as a sound processor and one or more speakers. The processor may access one or more computer-readable storage media such as, but not limited to, RAM-based storage, e.g., a chip implementing dynamic random access memory (DRAM), flash memory, or disk-based storage. Software code implementing present logic executable by the content playback device may also be stored on various memories to undertake present principles. The processor can receive user input signals from various input devices, including a second display, a remote control device, a point-and-click device such as a mouse, a keypad, etc. A TV tuner may be provided in some implementations, particularly when the content playback device is an IPTV, to receive TV signals from a source such as a set-top box, satellite receiver, cable head end, terrestrial TV signal antenna, etc. Signals from the tuner are then sent to the processor for presentation on the display and sound system. A network interface such as a wired or wireless modem communicates with the processor to provide connectivity to the Internet through the local network. It will be understood that communications between the content playback device and the Internet, or between the second display and the Internet, may also take place through means besides the local network. For example, the second display may communicate with the content playback device through a separate mobile network.
  • The second displays may include any device that can run an application that communicates with a content playback device, including, but not limited to, personal computers, laptop computers, notebook computers, netbook computers, handheld computers, personal digital assistants, mobile phones, smart phones, tablet computers, hand-held gaming devices, gaming consoles, and Internet appliances, as well as devices specifically designed for these purposes, in which case the special device would include at least a processor and sufficient resources and networking capability to run the second display application. The second displays may each bear a processor and components necessary to operate an application for service provider and content selection. In particular, the processor in the second display may access one or more computer-readable storage media such as, but not limited to, RAM-based storage, e.g., a chip implementing dynamic random access memory (DRAM), flash memory, or disk-based storage. Software code implementing present logic executable by the second display may also be stored on various memories to undertake present principles. The second display can receive user input signals from various input devices including a point-and-click device such as a mouse, a keypad, a touch screen, a remote control, etc. A network interface such as a wired or wireless modem communicates with the processor to provide connectivity to wide area networks such as the Internet 25, as noted above.
  • The servers, e.g., the management server and content server, have respective processors accessing respective computer-readable storage media which may be, without limitation, disk-based and/or solid state storage. The servers communicate with a wide area network such as the Internet via respective network interfaces. The servers may mutually communicate via the Internet. In some implementations, two or more of the servers may be located on the same local network, in which case they may communicate with each other through the local network without accessing the Internet.
  • Various illustrative implementations of the present invention have been described. However, one of ordinary skill in the art will recognize that additional implementations are also possible and are within the scope of the present invention. For example, service and asset choices may be made by a client device, i.e., a content playback device, e.g., an IPTV, or the same may also be made by a second display presenting appropriate authentication credentials to a management server, as disclosed in assignee's co-pending US patent applications incorporated by reference above.
  • The description above may pertain to any digital content, including streamed, live streaming, video-on-demand content, and stored digital content. Any type of digital content file is contemplated, including media files in live streaming formats, e.g., .m3u8 files. The terms “content item”, “content”, and “asset”, have been used interchangeably, unless the context dictates otherwise.
  • In the system where master devices drive slave devices, the master device may provide to the slave device alternate versions of presented content, the alternate versions incorporating video of lower quality, different codecs, different subtitles, different captions, as well as alternate audio tracks such as descriptive audio for the blind, etc. Further in such systems, a master device may simultaneously transmit a plurality of content items to multiple content playback devices, instead of just a common content item. For example, the master device may receive network content or DVR content and transmit the same to one content playback device while the master device is simultaneously receiving content from a tuner and transmitting such tuner content to another content playback device. In a further implementation, it is noted that a content playback device may act simultaneously as both a master and a slave, connecting to two separate devices. The content that the master device is transmitting may be the content it is receiving or content from another source, such as a tuner, that it has access to.
  • Not all steps described above (or in any of the flowcharts) need be undertaken in any particular implementation, and the order of steps may vary to a certain extent as well.
  • Accordingly, the present invention is not limited to only those implementations described above.

Claims (22)

1. A method of synchronizing playback of IPTV content between a first content playback device and a second content playback device, comprising:
a. playing back a content item on a first content playback device;
b. buffering but not playing back the content item on a second content playback device, the buffering but not playing back occurring at least until the buffer includes a portion of the content item currently being played back on the first content playback device; and
c. sending a signal to begin playback of the content item on the second content playback device, such that the playback of the content item on the first and second content playback devices is synchronized.
2. The method of claim 1, wherein the first and second content playback devices are in data communication with a controller, and further comprising:
a. sending data about a device lag time associated with the second content playback device to the controller; and
b. sending a signal to the second content playback device to begin playback of the partially buffered content item, the time of the sending being based on the device lag time.
3. The method of claim 1, wherein the buffering is in response to a request from the second content playback device to join the playback of the content item.
4. A non-transitory computer-readable medium, comprising instructions for causing a computing device to implement the method of claim 1.
5. A method of determining a device lag time, comprising:
a. generating a test signal;
b. sending the test signal to initiate a signal indicating that rendering of a content item should begin;
c. detecting the rendering of the content item; and
d. measuring a time between the sending and the detecting to calculate a device lag time.
6. The method of claim 5, further comprising sending the device lag time to a controller.
7. The method of claim 5, wherein the rendering of a content item causes a change in brightness or volume.
8. The method of claim 5, wherein the detecting includes detecting with a microphone or an optical sensor.
9. A non-transitory computer-readable medium, comprising instructions for causing a computing device to implement the method of claim 5.
10. A method of playback of at least a portion of a content item on a second content playback device based on a presence of the content item at a first content playback device, comprising:
a. at least partially receiving a content item on a first content playback device;
b. transmitting at least a portion of the received content item to a second content playback device; and
c. encoding or encrypting the content item by the first content playback device prior to the transmitting.
11. The method of claim 10, wherein the first content playback device generates a portion of the content item using a tuner.
12. The method of claim 10, wherein the first content playback device has received a portion of the content item from another content playback device.
13. The method of claim 10, further comprising controlling operation of the first content playback device using the second content playback device.
14. The method of claim 10, wherein the transmitting is performed immediately upon the receiving.
15. The method of claim 10, further comprising receiving device or network lag information at the first content playback device, and wherein the transmitting is performed following a time differential based on the received device or network lag information.
16. The method of claim 10, wherein the transmitting is performed while the first content playback device is playing back the content item, playing back another content item, or not playing a content item.
17. The method of claim 16, wherein the transmitting is performed while the first content playback device is playing back the content item, and wherein the transmitting is performed such that the second content playback device plays back the content item in synchronization with the first content playback device.
18. The method of claim 10, wherein multiple second content playback devices are in data communication with the first content playback device, and further comprising selecting a second content playback device to receive the content item prior to the transmitting.
19. The method of claim 10, wherein a plurality of second content playback devices are in data communication with the first content playback device, and further comprising transmitting the content item to the plurality of second content playback devices.
20. The method of claim 10, wherein a plurality of second content playback devices are in data communication with the first content playback device, and further comprising transmitting the content item using a multicasting method to the plurality of second content playback devices.
21. The method of claim 10, wherein a plurality of second content playback devices are in data communication with the first content playback device, and further comprising:
a. at least partially receiving another content item on the first content playback device; and
b. transmitting at least a portion of the received content item to one content playback device of the plurality and transmitting at least a portion of the received another content item to another content playback device of the plurality.
22. A non-transitory computer-readable medium, comprising instructions for causing a computing device to implement the method of claim 10.
US15/847,060 2012-03-23 2017-12-19 Method and infrastructure for synchronized streaming of content Abandoned US20180109826A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/847,060 US20180109826A1 (en) 2012-03-23 2017-12-19 Method and infrastructure for synchronized streaming of content

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US13/428,855 US8997169B2 (en) 2012-03-23 2012-03-23 System, method, and infrastructure for synchronized streaming of content
US14/661,092 US9848221B2 (en) 2012-03-23 2015-03-18 Method and infrastructure for synchronized streaming of content
US15/847,060 US20180109826A1 (en) 2012-03-23 2017-12-19 Method and infrastructure for synchronized streaming of content

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/661,092 Division US9848221B2 (en) 2012-03-23 2015-03-18 Method and infrastructure for synchronized streaming of content

Publications (1)

Publication Number Publication Date
US20180109826A1 true US20180109826A1 (en) 2018-04-19

Family

ID=49195866

Family Applications (3)

Application Number Title Priority Date Filing Date
US13/428,855 Active 2032-07-12 US8997169B2 (en) 2012-03-23 2012-03-23 System, method, and infrastructure for synchronized streaming of content
US14/661,092 Active US9848221B2 (en) 2012-03-23 2015-03-18 Method and infrastructure for synchronized streaming of content
US15/847,060 Abandoned US20180109826A1 (en) 2012-03-23 2017-12-19 Method and infrastructure for synchronized streaming of content


Country Status (2)

Country Link
US (3) US8997169B2 (en)
CN (1) CN103327377B (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019209269A1 (en) * 2018-04-24 2019-10-31 Google Llc Methods, systems, and media for synchronized media content playback on multiple devices
US10587908B2 (en) 2015-09-28 2020-03-10 Google Llc Time-synchronized, multizone media streaming
US10609441B1 (en) * 2018-12-10 2020-03-31 Broadsign Serv Llc Master computing device and method for synchronizing display of a digital content
US10609432B1 (en) 2018-12-10 2020-03-31 Broadsign Serv Llc Server and method for synchronizing display of a digital content on a plurality of computing devices
US10728443B1 (en) 2019-03-27 2020-07-28 On Time Staffing Inc. Automatic camera angle switching to create combined audiovisual file
US10963841B2 (en) 2019-03-27 2021-03-30 On Time Staffing Inc. Employment candidate empathy scoring system
US11023735B1 (en) 2020-04-02 2021-06-01 On Time Staffing, Inc. Automatic versioning of video presentations
US11107121B2 (en) 2018-12-10 2021-08-31 Broadsign Serv, Inc. Master computing device and method for determining an actual number of impressions provided by a synchronized group of devices
US11127232B2 (en) 2019-11-26 2021-09-21 On Time Staffing Inc. Multi-camera, multi-sensor panel data extraction system and method
US11144882B1 (en) 2020-09-18 2021-10-12 On Time Staffing Inc. Systems and methods for evaluating actions over a computer network and establishing live network connections
US11166065B1 (en) * 2017-12-27 2021-11-02 Facebook, Inc. Synchronizing presentation of content presented by multiple client devices
US11423071B1 (en) 2021-08-31 2022-08-23 On Time Staffing, Inc. Candidate data ranking method using previously selected candidate data
US11727040B2 (en) 2021-08-06 2023-08-15 On Time Staffing, Inc. Monitoring third-party forum contributions to improve searching through time-to-live data assignments
US11907652B2 (en) 2022-06-02 2024-02-20 On Time Staffing, Inc. User interface and systems for document creation

Families Citing this family (70)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130003544A (en) * 2011-06-30 2013-01-09 한국전자통신연구원 Method and system for synchronizing contents between terminals
JP2015515208A (en) * 2012-03-23 2015-05-21 トムソン ライセンシングThomson Licensing Buffer management method for synchronization of correlated media presentations
US10674191B2 (en) 2012-04-06 2020-06-02 Minerva Networks, Inc Systems and methods to remotely synchronize digital data
US10321192B2 (en) * 2012-04-06 2019-06-11 Tok.Tv Inc. System and methods of communicating between multiple geographically remote sites to enable a shared, social viewing experience
US10469886B2 (en) * 2012-04-06 2019-11-05 Minerva Networks, Inc. System and methods of synchronizing program reproduction on multiple geographically remote display systems
US9787523B2 (en) * 2012-07-05 2017-10-10 Eric Lazarus Managing data in a data queue including synchronization of media on multiple devices
US9749373B2 (en) * 2012-08-14 2017-08-29 Apple Inc. System and method for improved content streaming
CN103634619B (en) * 2012-08-29 2018-04-20 中兴通讯股份有限公司 A kind of synchronous method, system and the terminal of social television state
US8935735B2 (en) * 2013-01-07 2015-01-13 Time Warner Cable Enterprises Llc Methods and apparatus for supporting trick play functions in devices without local storage
US20140207901A1 (en) * 2013-01-18 2014-07-24 Richard Lesser Media rendering system
US9154535B1 (en) * 2013-03-08 2015-10-06 Scott C. Harris Content delivery system with customizable content
WO2014145976A1 (en) * 2013-03-15 2014-09-18 Troxler Robert E Systems and methods for identifying and separately presenting different portions of multimedia content
US9307508B2 (en) 2013-04-29 2016-04-05 Google Technology Holdings LLC Systems and methods for syncronizing multiple electronic devices
US9712266B2 (en) 2013-05-21 2017-07-18 Apple Inc. Synchronization of multi-channel audio communicated over bluetooth low energy
US9143565B2 (en) * 2013-05-30 2015-09-22 Opentv, Inc. Synchronizing an application on a companion device
KR20150037372A (en) * 2013-09-30 2015-04-08 삼성전자주식회사 Image display apparatus, Server for synchronizing contents, and method for operating the same
US9654545B2 (en) * 2013-09-30 2017-05-16 Sonos, Inc. Group coordinator device selection
US9331799B2 (en) 2013-10-07 2016-05-03 Bose Corporation Synchronous audio playback
US9628525B2 (en) * 2013-10-22 2017-04-18 Polytechnic Institute Of New York University Determining user perceived delays in voice conferencing systems and video conferencing systems
RU2015143731A (en) * 2013-11-01 2017-04-19 Алифком WIRELESS CONTROL OF MEDIA DEVICES FOR MEDIA PRESENTATIONS
JP6213181B2 (en) * 2013-11-20 2017-10-18 ヤマハ株式会社 Synchronous playback system and synchronous playback method
JP5880526B2 (en) * 2013-11-28 2016-03-09 オンキヨー&パイオニアテクノロジー株式会社 Information sharing system
GB2527734A (en) * 2014-04-30 2016-01-06 Piksel Inc Device synchronization
US20150334471A1 (en) * 2014-05-15 2015-11-19 Echostar Technologies L.L.C. Multiple simultaneous audio video data decoding
US10306021B1 (en) * 2014-08-21 2019-05-28 Amazon Technologies, Inc. Streaming content to multiple clients
US10275138B2 (en) 2014-09-02 2019-04-30 Sonos, Inc. Zone recognition
CN104269182B (en) * 2014-09-18 2017-05-31 歌尔股份有限公司 The methods, devices and systems that a kind of audio sync is played
US10284639B2 (en) * 2014-10-27 2019-05-07 Adobe Inc. Synchronized view architecture for embedded environment
US20160150011A1 (en) * 2014-11-26 2016-05-26 Qualcomm Incorporated Media output device to transmit and synchronize playback of a media content stream that is received over a point-to-point connection on multiple interconnected devices
US20160191584A1 (en) * 2014-12-30 2016-06-30 Myine Electronics, Inc. Synchronized vehicle media content sharing moderation
FR3034605A1 (en) * 2015-03-30 2016-10-07 Orange METHOD FOR RETRIEVING SHARED CONTENT, SHARING METHOD, COMPUTER PROGRAM PRODUCTS, AND CORRESPONDING DEVICES
CN104867513B (en) * 2015-04-20 2017-09-29 广东欧珀移动通信有限公司 A kind of control method for playing back and equipment
US9928024B2 (en) * 2015-05-28 2018-03-27 Bose Corporation Audio data buffering
WO2016200998A1 (en) * 2015-06-09 2016-12-15 Arris Enterprises Llc Http live streaming (hls) video client synchronization
JP6536201B2 (en) * 2015-06-16 2019-07-03 ヤマハ株式会社 Control terminal device, audio system and audio system control program
US20170006331A1 (en) * 2015-06-30 2017-01-05 Stmicroelectronics International N.V. Synchronized rendering of split multimedia content on network clients
KR102387867B1 (en) * 2015-09-07 2022-04-18 삼성전자주식회사 Method and apparatus for transmitting and receiving data in communication system
WO2017089183A1 (en) * 2015-11-27 2017-06-01 British Telecommunications Public Limited Company Media content synchronisation
CN105578248B (en) * 2015-12-30 2020-07-31 Tcl新技术(惠州)有限公司 Fancy splicing playing method, device and system
US11589269B2 (en) 2016-03-31 2023-02-21 British Telecommunications Public Limited Company Mobile communications network
CN108781347B (en) 2016-03-31 2022-01-04 英国电讯有限公司 Base station of a mobile communication network, method of operating a base station of a mobile communication network
WO2017167838A1 (en) 2016-03-31 2017-10-05 British Telecommunications Public Limited Company Mobile communications network
US10735508B2 (en) 2016-04-04 2020-08-04 Roku, Inc. Streaming synchronized media content to separate devices
WO2018001897A1 (en) 2016-06-29 2018-01-04 British Telecommunications Public Limited Company Multicast-broadcast mobile communications network
CN109565408B (en) 2016-08-04 2021-09-28 英国电讯有限公司 Method for handover, mobile terminal, base station, and computer-readable storage medium
CN107707504B (en) * 2016-08-08 2020-11-10 中国电信股份有限公司 Streaming media playing method and system, server and client
CN107819809B (en) * 2016-09-14 2024-03-05 京东方科技集团股份有限公司 Method and device for synchronizing content
WO2018078650A1 (en) * 2016-10-26 2018-05-03 Bhide Priyadarshan Method and system for showcasing of media between a plurality of electronic devices.
JP7014956B2 (en) * 2017-10-12 2022-02-02 株式会社ミクシィ Information processing systems, information processing methods, and programs
US11509726B2 (en) * 2017-10-20 2022-11-22 Apple Inc. Encapsulating and synchronizing state interactions between devices
CN108289232B (en) * 2018-01-26 2021-01-08 Oppo广东移动通信有限公司 Control method of playing device, terminal device and storage medium
US10993274B2 (en) 2018-03-30 2021-04-27 Apple Inc. Pairing devices by proxy
US11297369B2 (en) 2018-03-30 2022-04-05 Apple Inc. Remotely controlling playback devices
WO2019209271A1 (en) * 2018-04-24 2019-10-31 Google Llc Methods, systems, and media for adjusting quality level during synchronized media content playback on multiple devices
EP3769510A1 (en) 2018-05-07 2021-01-27 Apple Inc. User interfaces for viewing live video feeds and recorded video
CN110581727A (en) 2018-06-08 2019-12-17 英国电讯有限公司 Wireless telecommunications network
EP3794880A1 (en) * 2018-06-20 2021-03-24 Sony Corporation Infrastructure equipment, communications device and methods
US10614857B2 (en) * 2018-07-02 2020-04-07 Apple Inc. Calibrating media playback channels for synchronized presentation
US10631047B1 (en) 2019-03-29 2020-04-21 Pond5 Inc. Online video editor
TWI730439B (en) 2019-10-03 2021-06-11 瑞昱半導體股份有限公司 System and method for playing network data
TWI727447B (en) * 2019-10-03 2021-05-11 瑞昱半導體股份有限公司 Playing system and playing method
BE1027862B1 (en) * 2019-12-17 2021-07-15 Expo Sport Media R&D Bv Method and infrastructure for displaying content
CN111277882A (en) * 2020-01-19 2020-06-12 广州南翼信息科技有限公司 System and method for synchronized playback of terminal programs
CN111277883B (en) * 2020-02-13 2022-05-20 京东方科技集团股份有限公司 Playing method, terminal and playing system
US11178446B2 (en) * 2020-03-09 2021-11-16 Haworth, Inc. Synchronous video content collaboration across multiple clients in a distributed collaboration system
EP4189682A1 (en) * 2020-09-05 2023-06-07 Apple Inc. User interfaces for managing audio for media items
EP4024878A1 (en) * 2020-12-30 2022-07-06 Advanced Digital Broadcast S.A. A method and a system for testing audio-video synchronization of an audio-video player
KR20230158841A (en) * 2022-05-12 2023-11-21 현대자동차주식회사 System and method for controlling vehicle
US20230379529A1 (en) * 2022-05-18 2023-11-23 Microsoft Technology Licensing, Llc Distributed media stream playback suspension and synchronization
US11589104B1 (en) * 2022-06-17 2023-02-21 Userful Corporation Latency compensation for external networks

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2001236504A1 (en) * 2000-01-20 2001-07-31 Interactual Technologies, Inc. System, method and article of manufacture for remote control and navigation of local content
KR100672406B1 (en) * 2002-07-22 2007-01-23 엘지전자 주식회사 Method and apparatus for permitting a potential viewer to view a desired program
US8116465B2 (en) 2004-04-28 2012-02-14 Sony Corporation Measuring apparatus and method, and recording medium
JP2005341384A (en) 2004-05-28 2005-12-08 Sony Corp Sound field correcting apparatus and sound field correcting method
US7631119B2 (en) 2004-06-25 2009-12-08 Apple Inc. Techniques for providing audio for synchronized playback by multiple devices
US8190680B2 (en) 2004-07-01 2012-05-29 Netgear, Inc. Method and system for synchronization of digital media playback
US7792158B1 (en) 2004-08-18 2010-09-07 Atheros Communications, Inc. Media streaming synchronization
US8015306B2 (en) 2005-01-05 2011-09-06 Control4 Corporation Method and apparatus for synchronizing playback of streaming media in multiple output devices
US7953118B2 (en) 2006-12-08 2011-05-31 Microsoft Corporation Synchronizing media streams across multiple devices
JP4935345B2 (en) * 2006-12-25 2012-05-23 ソニー株式会社 Content reproduction system, reproduction apparatus, reproduction control method, and program
US7827479B2 (en) 2007-01-03 2010-11-02 Kali Damon K I System and methods for synchronized media playback between electronic devices
US8027560B2 (en) 2007-02-05 2011-09-27 Thales Avionics, Inc. System and method for synchronizing playback of audio and video
JP5151211B2 (en) 2007-03-30 2013-02-27 ソニー株式会社 Multi-screen synchronized playback system, display control terminal, multi-screen synchronized playback method, and program
JP2009177591A (en) * 2008-01-25 2009-08-06 Mitsubishi Electric Corp Synchronization display system
US8707382B2 (en) * 2008-02-13 2014-04-22 At&T Intellectual Property I, L.P. Synchronizing presentations of multimedia programs
US20090310027A1 (en) 2008-06-16 2009-12-17 James Fleming Systems and methods for separate audio and video lag calibration in a video game
US7996566B1 (en) 2008-12-23 2011-08-09 Genband Us Llc Media sharing
US8340654B2 (en) * 2009-05-26 2012-12-25 Lextech Labs Llc Apparatus and method for video display and control for portable device
WO2011010345A1 (en) 2009-07-22 2011-01-27 Thomson Licensing Synchronous control system including a master device and a slave device, and synchronous control method for controlling the same
US20110040981A1 (en) 2009-08-14 2011-02-17 Apple Inc. Synchronization of Buffered Audio Data With Live Broadcast
US20110107238A1 (en) 2009-10-29 2011-05-05 Dong Liu Network-Based Collaborated Telestration on Video, Images or Other Shared Visual Content
WO2011087727A1 (en) * 2009-12-22 2011-07-21 Delta Vidyo, Inc. System and method for interactive synchronized video watching
US8930577B2 (en) * 2011-09-13 2015-01-06 Microsoft Corporation Synchronizing streaming video between viewers over a network
US9654821B2 (en) * 2011-12-30 2017-05-16 Sonos, Inc. Systems and methods for networked music playback

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7287068B1 (en) * 2002-12-13 2007-10-23 Bmc Software, Inc. System and method for updating devices that execute an operating system or application program directly from nonvolatile storage
US8261314B2 (en) * 2009-09-17 2012-09-04 At&T Intellectual Property I, Lp Apparatus and method for managing media content presentation
US20120082424A1 (en) * 2010-09-30 2012-04-05 Verizon Patent And Licensing Inc. Method and apparatus for synchronizing content playback
US20130198298A1 (en) * 2012-01-27 2013-08-01 Avaya Inc. System and method to synchronize video playback on mobile devices

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11051066B2 (en) 2015-09-28 2021-06-29 Google Llc Time-synchronized, multizone media streaming
US10587908B2 (en) 2015-09-28 2020-03-10 Google Llc Time-synchronized, multizone media streaming
US11166065B1 (en) * 2017-12-27 2021-11-02 Facebook, Inc. Synchronizing presentation of content presented by multiple client devices
US11438644B1 (en) 2017-12-27 2022-09-06 Meta Platforms, Inc. Synchronizing presentation of content presented by multiple client devices
WO2019209269A1 (en) * 2018-04-24 2019-10-31 Google Llc Methods, systems, and media for synchronized media content playback on multiple devices
US11006169B2 (en) 2018-12-10 2021-05-11 Broadsign Serv, Inc. Master computing device and method for synchronizing display of a digital content
US10609432B1 (en) 2018-12-10 2020-03-31 Broadsign Serv Llc Server and method for synchronizing display of a digital content on a plurality of computing devices
US12058400B2 (en) 2018-12-10 2024-08-06 Broadsign Serv Inc. Master computing device and method for synchronizing display of a digital content
US10609441B1 (en) * 2018-12-10 2020-03-31 Broadsign Serv Llc Master computing device and method for synchronizing display of a digital content
US11107121B2 (en) 2018-12-10 2021-08-31 Broadsign Serv, Inc. Master computing device and method for determining an actual number of impressions provided by a synchronized group of devices
US11388468B2 (en) 2018-12-10 2022-07-12 Broadsign Serv Inc. Master computing device and method for synchronizing display of a digital content
US11961044B2 (en) 2019-03-27 2024-04-16 On Time Staffing, Inc. Behavioral data analysis and scoring system
US11863858B2 (en) 2019-03-27 2024-01-02 On Time Staffing Inc. Automatic camera angle switching in response to low noise audio to create combined audiovisual file
US10963841B2 (en) 2019-03-27 2021-03-30 On Time Staffing Inc. Employment candidate empathy scoring system
US10728443B1 (en) 2019-03-27 2020-07-28 On Time Staffing Inc. Automatic camera angle switching to create combined audiovisual file
US11457140B2 (en) 2019-03-27 2022-09-27 On Time Staffing Inc. Automatic camera angle switching in response to low noise audio to create combined audiovisual file
US11783645B2 (en) 2019-11-26 2023-10-10 On Time Staffing Inc. Multi-camera, multi-sensor panel data extraction system and method
US11127232B2 (en) 2019-11-26 2021-09-21 On Time Staffing Inc. Multi-camera, multi-sensor panel data extraction system and method
US11636678B2 (en) 2020-04-02 2023-04-25 On Time Staffing Inc. Audio and video recording and streaming in a three-computer booth
US11184578B2 (en) 2020-04-02 2021-11-23 On Time Staffing, Inc. Audio and video recording and streaming in a three-computer booth
US11861904B2 (en) 2020-04-02 2024-01-02 On Time Staffing, Inc. Automatic versioning of video presentations
US11023735B1 (en) 2020-04-02 2021-06-01 On Time Staffing, Inc. Automatic versioning of video presentations
US11720859B2 (en) 2020-09-18 2023-08-08 On Time Staffing Inc. Systems and methods for evaluating actions over a computer network and establishing live network connections
US11144882B1 (en) 2020-09-18 2021-10-12 On Time Staffing Inc. Systems and methods for evaluating actions over a computer network and establishing live network connections
US11727040B2 (en) 2021-08-06 2023-08-15 On Time Staffing, Inc. Monitoring third-party forum contributions to improve searching through time-to-live data assignments
US11966429B2 (en) 2021-08-06 2024-04-23 On Time Staffing Inc. Monitoring third-party forum contributions to improve searching through time-to-live data assignments
US11423071B1 (en) 2021-08-31 2022-08-23 On Time Staffing, Inc. Candidate data ranking method using previously selected candidate data
US11907652B2 (en) 2022-06-02 2024-02-20 On Time Staffing, Inc. User interface and systems for document creation

Also Published As

Publication number Publication date
CN103327377B (en) 2017-04-12
US9848221B2 (en) 2017-12-19
US20150195590A1 (en) 2015-07-09
US8997169B2 (en) 2015-03-31
US20130251329A1 (en) 2013-09-26
CN103327377A (en) 2013-09-25

Similar Documents

Publication Title
US9848221B2 (en) Method and infrastructure for synchronized streaming of content
US10405026B2 (en) Methods, devices and systems for audiovisual synchronization with multiple output devices
US10637894B2 (en) Real-time placeshifting of media content to paired devices
EP2599296B1 (en) Methods and apparatus for automatic synchronization of audio and video signals
USRE47825E1 (en) Methods, systems, and media for certifying a playback device
US11606596B2 (en) Methods, systems, and media for synchronizing audio and video content on multiple media devices
JP5986074B2 (en) Method and apparatus for presenting media content
US20150296247A1 (en) Interaction of user devices and video devices
KR102284721B1 (en) Method and apparatus for displaying application data in wireless communication system
US20150213576A1 (en) Apparatus for adapting a presentation of media content according to a position of a viewing apparatus
US20130050573A1 (en) Transmission of video content
US11212357B2 (en) Media player for receiving media content from a remote server
CN102625116B (en) For managing the method and apparatus of 3D video content
US8943247B1 (en) Media sink device input identification

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY INTERACTIVE ENTERTAINMENT LLC, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SONY CORPORATION;SONY NETWORK ENTERTAINMENT INTERNATIONAL LLC;REEL/FRAME:046725/0835

Effective date: 20171206

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCV Information on status: appeal procedure

Free format text: NOTICE OF APPEAL FILED

STCV Information on status: appeal procedure

Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED

STCV Information on status: appeal procedure

Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS

STCV Information on status: appeal procedure

Free format text: BOARD OF APPEALS DECISION RENDERED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION