US20160134833A1 - Apparatus, systems and methods for synchronization of multiple headsets - Google Patents
- Publication number
- US20160134833A1 (application US14/534,650)
- Authority
- US
- United States
- Prior art keywords
- audio
- time delay
- wireless
- headset
- content
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/4302—Content synchronisation processes, e.g. decoder synchronisation
- H04N21/4307—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
- H04N21/43076—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of the same content streams on multiple devices, e.g. when family members are watching the same movie on different devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
- H04N5/93—Regeneration of the television signal or of selected parts thereof
- H04N5/935—Regeneration of digital synchronisation signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/4104—Peripherals receiving signals from specially adapted client devices
- H04N21/4126—The peripheral being portable, e.g. PDAs or mobile phones
- H04N21/41265—The peripheral being portable, e.g. PDAs or mobile phones having a remote control device for bidirectional communication between the remote control device and client device
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/434—Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
- H04N21/4341—Demultiplexing of audio and video streams
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/442—Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
- H04N21/44227—Monitoring of local network, e.g. connection or bandwidth variations; Detecting new devices in the local network
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/04—Synchronising
- H04N5/06—Generation of synchronising signals
- H04N5/067—Arrangements or circuits at the transmitter end
- H04N5/0675—Arrangements or circuits at the transmitter end for mixing the synchronising signals with the picture signal or mutually
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/439—Processing of audio elementary streams
- H04N21/4396—Processing of audio elementary streams by muting the audio signal
Definitions
- Media devices such as a set top box, a stereo, a television, a computer system, a game system, or the like, are often configured to communicate audio information to a user's audio headset.
- the audio headset user can view presented video content on a display, such as their television (TV), while listening to the corresponding audio content using their audio headset.
- the audio content may be communicated to the user's audio headset using a wire-based medium when the audio headset is communicatively coupled to the media device using a wire-based connector.
- the audio content may be communicated to the user's audio headset using a wireless-based medium when the audio headset is communicatively coupled to the media device using a suitable wireless transceiver.
- Wireless communication of the audio content requires various processing steps. Each processing step introduces a delay in the final presentation of the audio content on the user's audio headset. For example, but not limited to, the original audio content must be split off from the originally received video/audio content stream. If the audio headset is a wireless type device, the audio content must be converted into a wireless medium format, and then communicated to the audio headset. Finally, the wireless audio headset must detect the wireless signal with the audio content, and then process the wireless signal to generate a signal that is reproducible as sound using the speakers of the wireless audio headset. Accordingly, the output of the video content and the output of the audio content may not be in synchronism when the delay times of processing and presenting the video content are different from the delay times of processing, communicating and presenting the audio content.
- Various systems and methods have been devised to correct for the above-described video/audio synchronization problem for a wireless audio headset. Essentially, a delay in presentation of the video content and/or the audio content is implemented so that the video content and the audio content are presented in synchronism (or at least substantially in synchronism with each other, so that the user perceives that the video content and the audio content are synchronously presented).
- multiple users may wish to simultaneously view the video content while individually listening to the audio content using their own personal audio headsets.
- for example, the multiple users may be at an apartment complex where watching a loud action movie late at night using a stereo system with external speakers may not be practical. Accordingly, the multiple users may watch the movie video content on their large screen TV while listening to the audio content using their own audio headsets.
- a problem not addressed in the prior art is synchronism of video content presentation with audio content when the audio content is presented on multiple wireless and/or wire-based audio headsets. This problem becomes particularly complex given that there is a wide variety of different types of wireless audio headsets in the marketplace, each with different inherent audio content processing time delays.
- further, a wire-based audio headset may be inherently synchronized with the presented video when the media device is sourcing both the video display and the wire-based audio headset. Synchronism corrections for a concurrently used wireless audio headset will then cause the wire-based headset to become out of synchronism with the video content. Accordingly, there is a need in the arts to provide enhanced synchronism of video content presentation with audio content when the audio content is presented on multiple wire-based and/or wireless audio headsets.
- a first time delay corresponds to a first duration of time between communication of the audio content from the media device and presentation of the audio content by a first wireless audio headset.
- a second time delay corresponds to a second duration of time between communication of the audio content from the media device and presentation of the audio content by a second wireless audio headset, wherein the first time delay is greater than the second time delay.
- Video content communicated to a display is delayed by the first time delay.
- Audio content communicated to the second wireless audio headset is delayed by a time delay difference between the first time delay and the second time delay.
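The delay arithmetic summarized in the preceding bullets can be expressed compactly. The following Python sketch is purely illustrative (the function and variable names are not from the patent) and assumes the inherent delay of every active headset is already known in milliseconds: the display is held back by the largest headset delay, and each faster audio path is held back by the difference.

```python
def compute_presentation_delays(headset_delays_ms):
    """Return (video_delay, extra_audio_delays) so that video and all
    headsets present in synchronism.

    headset_delays_ms: dict mapping a headset identifier to its inherent
    delay (communication + processing + presentation), in milliseconds.
    """
    if not headset_delays_ms:
        return 0, {}

    # Delay the video by the largest (slowest) inherent headset delay.
    video_delay = max(headset_delays_ms.values())

    # Hold back every faster headset's audio by the difference, so its
    # presentation coincides with the delayed video.
    extra_audio_delays = {
        headset_id: video_delay - inherent
        for headset_id, inherent in headset_delays_ms.items()
    }
    return video_delay, extra_audio_delays
```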
- FIG. 1 is a block diagram of an embodiment of an audio synchronism system implemented in a media device
- FIG. 2 illustrates a hypothetical time line diagram showing time delays associated with presentation of video content and audio content when a plurality of audio headsets are used to present audio content
- FIG. 3 illustrates a hypothetical time line diagram showing time delays associated with presentation of video content and audio content when a plurality of audio headsets and intermediate mobile electronic devices are used to present audio content.
- FIG. 1 is a block diagram of an embodiment of an audio synchronism system 100 implemented in a media device 102, such as, but not limited to, a set top box (STB).
- Embodiments of the audio synchronism system 100 may be implemented in other media devices 102 , such as, but not limited to, a surround-sound receiver, a television (TV), a tablet computer, a laptop computer, a personal computer (PC), a digital video disc (DVD) player, a digital video recorder (DVR), or a game playing device.
- Embodiments of the audio synchronism system 100 are configured to synchronize presentation of video content and audio content with a plurality of different types of audio headsets 104 (wireless audio headsets and/or wire-based audio headsets). Based on the unique time delays associated with communication to and presentation of the audio content by the different audio headsets 104, and based on the time delay associated with presentation of the associated video content, embodiments of the audio synchronism system 100 adjust the times that the media device 102 communicates the video content and/or the audio content such that the video content is synchronously presented with the audio content.
- the exemplary media device 102 is communicatively coupled to a media presentation system 106 that includes a visual display device 108 , such as a television (hereafter, generically a TV), and an audio presentation device 110 , such as a surround sound receiver controlling an audio reproduction device 112 (hereafter, generically, a speaker).
- Other types of output devices may also be coupled to the media device 102 , including those providing any sort of stimuli sensible by a human being, such as temperature, vibration and the like.
- the video content portion of a media content event is displayed on the display 114 and the audio portion of the media content event is reproduced as sounds by one or more speakers 112 .
- the media device 102 and one or more of the components of the media presentation system 106 may be integrated into a single electronic device.
- the non-limiting exemplary media device 102 comprises a media content stream interface 116, a processor system 118, a memory 120, a program buffer 122, an optional digital video recorder (DVR) 124, a presentation device interface 126, a remote interface 128, an optional Internet interface 130, and an audio headset interface 132.
- the memory 120 comprises portions for storing the media device logic 134 , an optional browser 136 , the electronic program guide (EPG) information 138 , time delay processing logic 140 , video and headset delays 142 , and the video and audio splitter 144 .
- the media device logic 134, the browser 136, the time delay processing logic 140, and the video and audio splitter 144 may be integrated together, and/or may be integrated with other logic.
- some or all of these memory and other data manipulation functions may be provided by using a remote server or other electronic devices suitably connected via the Internet or otherwise to a client device.
- Other media devices 102 may include some, or may omit some, of the above-described media processing components. Further, additional components not described herein may be included in alternative embodiments.
- a media content provider provides media content that is received in one or more media content streams 146 multiplexed together in one or more transport channels.
- the transport channels with the media content streams 146 are communicated to the media device 102 from a media system sourced from a remote head end facility (not shown) operated by the media content provider.
- media systems include satellite systems, cable systems, and the Internet.
- the media device 102 is configured to receive one or more broadcasted satellite signals detected by an antenna (not shown).
- the media content stream 146 can be received from one or more different sources, such as, but not limited to, a cable system, a radio frequency (RF) communication system, or the Internet.
- the one or more media content streams 146 are received by the media content stream interface 116 .
- One or more tuners 116 a in the media content stream interface 116 selectively tune to one of the media content streams 146 in accordance with instructions received from the processor system 118 .
- the processor system 118 executing the media device logic 134 and based upon a request for a media content event of interest specified by a user, parses out media content associated with one or more media content events of interest.
- the video and audio splitter 144 is configured to separate the video content and the audio content. The media content event of interest is then assembled into a stream of video content and audio content.
- the video content and the audio content may be stored by the program buffer 122 such that the video content and the audio content can be streamed out to components of the media presentation system 106 , such as the visual display device 108 and/or the audio presentation device 110 , via the presentation device interface 126 .
- the parsed out media content may be saved into the DVR 124 for later presentation.
- the DVR 124 may be directly provided in, locally connected to, or remotely connected to, the media device 102 .
- the media content streams 146 may be stored for later decompression, processing and/or decryption.
- from time to time, information populating the EPG information 138 portion of the memory 120 is communicated to the media device 102, via the media content stream 146 or via another suitable media. The EPG information 138 portion of the memory 120 stores the information pertaining to the scheduled programming of media content events received in the media content stream 146.
- the information may include, but is not limited to, a scheduled presentation start and/or end time, a program channel, and descriptive information for individual media content events.
- the media content event's descriptive information may include the title of the media content event, names of performers or actors, date of creation, and a summary describing the nature of the media content event. Any suitable information may be included in the supplemental information.
- the information in the EPG information 138 is retrieved, formatted, and then presented on the display 114 as an EPG.
- the exemplary media device 102 is configured to receive commands from a user via a remote control 148 .
- the remote control 148 includes one or more controllers 150 disposed on the surface of the remote control.
- the user by actuating one or more of the controllers 150 , causes the remote control 148 to generate and transmit commands, via a wireless signal 152 , to the media device 102 .
- the commands control the media device 102 and/or control the media presentation devices.
- the wireless signal 152 may be an infrared (IR) signal or a radio frequency (RF) signal that is detectable by the remote interface 128 .
- the processes performed by the media device 102 relating to the processing of the received media content stream 146 and communication of a presentable video content and the audio content of the media content event to the components of the media presentation system 106 are generally implemented by the processor system 118 while executing the media device logic 134 .
- the media device 102 may perform a variety of functions related to the processing and presentation of one or more media content events received in the media content stream 146 .
- the media device 102 automatically mutes the audio content output from the speakers 112 if one or more of the audio headsets 104 are coupled to the media device 102 .
- the audio content portion of the presented media content event may not be communicated out from the presentation device interface 126 to components of the media presentation system 106 . Accordingly, the speakers 112 do not produce the audio content.
- some embodiments may optionally continue to output the audio content from the speakers 112 if one or more of the audio headsets 104 are coupled to the media device 102 .
- the user of the audio headset 104 may be hearing impaired, where the audio headset 104 provides enhanced sound control for the hearing impaired user.
- the user wearing the audio headset 104 may wish to have the volume presented at a louder volume level (or a lesser volume level) than the audio volume heard by other people who are listening to the audio content output from the speakers 112 .
- FIG. 1 illustrates a plurality of different audio headsets 104 communicatively coupled to the media device 102 using a variety of communication means.
- the audio headset 104 a is a wire-based head phone set that couples to the media device 102 using the wire connector 154 .
- the wire connector 154 has a suitable plug type connector that fits into a mating receptacle of the audio headset interface 132 .
- the audio headset interface 132 outputs the audio content using a suitable wire-based format, such as, but not limited to, an analog signal.
- Some embodiments of the media device 102 may be configured to couple to a plurality of wire-based audio headsets 104 (via a plurality of receptacles and/or by using an external audio headset signal splitter).
- a plurality of wireless based audio headsets 104 are configured to receive audio content from the media device 102 via a wireless signal 156 . Further, the wireless audio headsets 104 may be different from each other, such as the example wireless audio headset 104 b and the wireless audio headset 104 c . Such wireless audio headsets 104 b , 104 c comprise a short range transceiver 158 configured to detect the wireless signal 156 with the audio content therein. Wireless audio headsets 104 b , 104 c also comprise a memory 160 and a processor system 162 . Logic for receiving and processing the audio content received in the wireless signal 156 resides in the memory 160 .
- an identifier of that particular wireless audio headset 104 b , 104 c is stored in the memory 160 .
- the identifier of the wireless audio headset 104 b , 104 c may be retrieved and communicated to the media device 102 , via the short range transceiver 158 .
- the media device 102 can determine a time delay that is associated with that particular identified wireless audio headset 104 b , 104 c .
- the time delay is a duration of time for the communication, reception and processing of the wireless signal 156, and for the attendant reproduction of the audio content on the speakers of the wireless audio headset 104 b, 104 c.
- the functions of processing the received wireless signal 156, generating the audio content for reproduction on the speakers of the wireless audio headset 104 b, 104 c, and/or retrieving and communicating the identifier of the wireless audio headset 104 b, 104 c are performed by the processor system 162.
- the time delays associated with the particular wireless audio headsets 104 b, 104 c may be different from each other.
- the time delay of the wireless audio headset 104 b may be known to be fifty milliseconds (50 ms) and the time delay of the wireless audio headset 104 c may be known to be two hundred milliseconds (200 ms).
- audio content presentation on the wireless audio headset 104 b would be delayed by 50 ms behind presentation of the video content on the display 114 .
- the audio content presented for the wireless audio headset 104 c would be delayed by 200 ms behind presentation of the video content on the display 114 .
- synchronization of the audio content from the wireless audio headset 104 c with the video content may be effected by delaying presentation of the video content by 200 ms.
- audio content presentation on the wireless audio headset 104 b would otherwise be in advance of the presented video content by 150 ms. Accordingly, embodiments delay the audio content presentation on the wireless audio headset 104 b by an additional 150 ms. Accordingly, the video content is in synchronism with the audio content presented by both the audio headsets 104 b , 104 c.
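As a quick check of the hypothetical 50 ms and 200 ms figures above, the illustrative helper sketched earlier (not patent code) yields a 200 ms video delay and a 150 ms extra audio delay for the faster headset:

```python
video_delay, extra = compute_presentation_delays({"104b": 50, "104c": 200})
# video_delay == 200  -> video presentation is held back 200 ms
# extra == {"104b": 150, "104c": 0}  -> headset 104b audio is held back an extra 150 ms
```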
- the different wireless audio headsets 104 b , 104 c may be configured to receive the same format signal. That is, the different audio headsets 104 b , 104 c detect the same emitted wireless signal 156 .
- the inherent time delay of each different one of the plurality of wireless audio headsets 104 b, 104 c may be different, particularly if they have been made by different manufacturers, have been made using different types of components, and/or if they have different features.
- one of the wireless audio headsets 104 b , 104 c may have complex logic or circuitry configured to emulate the effect of a surround sound or other multiple audio channel system.
- the wireless signal 156 is a Bluetooth communication signal.
- the Bluetooth communication signal is well known to employ a short range wireless technology standard for exchanging data over short distances using short-wavelength ultra high frequency (UHF) radio waves in the industrial, scientific and medical (ISM) radio band from 2.4 to 2.485 GHz.
- Bluetooth technology may be used by fixed and mobile devices, such as the example wireless audio headsets 104 b , 104 c .
- the audio headset interface 132 of the media device 102 (a fixed electronic device) and the short range transceiver 158 of the wireless audio headsets 104 b , 104 c (mobile electronic devices) include a Bluetooth transceiver.
- the detectable range of the wireless signal 156 by the media device 102 and the wireless audio headsets 104 b , 104 c is inherently limited to several meters by the Bluetooth technology.
- the Bluetooth protocol provides for secure exchange of information between devices.
- the master device (the wireless audio headset 104 b, 104 c) periodically broadcasts the wireless signal 156 containing the identifier of the broadcasting Bluetooth wireless audio headset 104 b, 104 c.
- the media device 102 only needs to detect the emitted wireless signal 156 from the Bluetooth compatible authorized wireless audio headset 104 b, 104 c. Accordingly, the media device 102 may identify a particular wireless audio headset 104 b, 104 c, and thus determine the particular time delay that is suitable for that particular wireless audio headset 104 b, 104 c.
- the audio synchronism system 100 may be configured to communicate portions of the wireless signals 156 using the same medium, wherein each wireless signal 156 is designated for a particular one of a plurality of audio headsets 104 .
- the wireless audio headsets 104 b , 104 c may both employ a Bluetooth communication medium, but may have different associated time delays.
- because the Bluetooth medium employs a packet-based technology wherein a portion of the audio content is communicated as data in a voice data packet, a unique identifier of the particular destination wireless audio headset 104 b, 104 c may be included in each voice data packet communicated in the wireless signal 156. Accordingly, a particular portion of the audio content in a first packet for the wireless audio headset 104 b (identified by the unique identifier of the wireless audio headset 104 b) can be communicated at a particular time. The same portion of the audio content may be communicated at a different time in a second packet for the wireless audio headset 104 c (identified by the unique identifier of the wireless audio headset 104 c).
- the wireless audio headset 104 b processes received voice data packets with its unique identifier to generate a stream of audio content based on the first time that the packet was communicated from the media device 102 to the wireless audio headset 104 b .
- the wireless audio headset 104 c processes received voice data packets with its unique identifier to generate a stream of audio content based on a second time that the packet was communicated from the media device 102 to the wireless audio headset 104 c.
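One way to realize the per-packet addressing described above is to keep one transmit schedule per headset, offsetting each headset's packets by its computed delay difference. The sketch below is a simplification under stated assumptions: it abstracts the Bluetooth stack behind a caller-supplied `send_packet` callable, and none of the names come from the patent.

```python
import time

def schedule_audio_packets(audio_chunks, headset_offsets_ms, send_packet):
    """Transmit the same audio chunks to several headsets, each at its own offset.

    audio_chunks: iterable of (presentation_time_ms, chunk_bytes) pairs.
    headset_offsets_ms: dict of headset identifier -> extra delay in ms
        (the time delay difference computed for that headset).
    send_packet: callable(headset_id, chunk_bytes) that transmits one voice
        data packet addressed with the destination headset's unique identifier.
    """
    start = time.monotonic()
    # Build a single transmit schedule covering every headset.
    schedule = sorted(
        (pts + offset, headset_id, chunk)
        for pts, chunk in audio_chunks
        for headset_id, offset in headset_offsets_ms.items()
    )
    for transmit_ms, headset_id, chunk in schedule:
        # Sleep until this packet's transmit time, then send it.
        wait_s = transmit_ms / 1000.0 - (time.monotonic() - start)
        if wait_s > 0:
            time.sleep(wait_s)
        send_packet(headset_id, chunk)
```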
- embodiments of the media device 102 and the wireless audio headsets 104 b, 104 c may be configured to receive the wireless signal 156 using a wireless local area network (LAN) protocol, such as the Institute of Electrical and Electronics Engineers (IEEE) 802.11 standard or another similar standard.
- the mobile electronic device may be a user's portable laptop computer, notebook, or the like that is configured to communicate wirelessly with a non-mobile electronic device such as a printer or to websites via the Internet.
- Embodiments of the media device 102 and the wireless audio headsets 104 b , 104 c may be configured to communicate using a wireless LAN protocol.
- Other embodiments may employ a Wi-Fi compatible protocol.
- the audio headset interface 132 may include a plurality of different transceivers therein that are configured to communicate using different mediums.
- a first transceiver may be included in the audio headset interface 132 that is compatible with Bluetooth communications and a second transceiver may be included in the audio headset interface 132 that is compatible with Wi-Fi communications.
- a plurality of different audio headset interfaces 132, each using a different communication medium, may be implemented in the media device 102.
- an intermediate mobile electronic device may be configured to detect the wireless signal 156 emitted from the media device 102 .
- a mobile tablet 164 and/or a mobile phone 166 may detect the wireless signal 156 with the audio content therein, and then present the audio content on an audio headset coupled to that intermediate mobile device.
- a wire-based audio headset 104 d is illustrated as being coupled to the exemplary intermediate mobile electronic device, a tablet 164, such that the wire-based audio headset 104 d receives the audio content from the tablet 164 via a second wire conductor.
- a wireless audio headset 104 e is illustrated as being coupled to another intermediate mobile electronic device, the mobile phone 166 , such that the wireless audio headset 104 e receives the audio content from the mobile phone 166 via a second wireless signal 168 .
- Mobile electronic devices such as cell phones, smart phones, tablets, and/or note pads may be provisioned with a short range wireless communication system, such as, but not limited to, a Bluetooth system. Their respective Bluetooth systems are then configured to emit a wireless signal 156 that identifies them to the media device 102. Accordingly, the media device 102 can determine a suitable delay that is appropriate for presentation of the audio content on the audio headsets 104 d, 104 e.
- the wire-based audio headset 104 a may be coupled to one of the components of the media presentation system 106 , such as the visual display device 108 or the audio presentation device 110 , using the wire connector 154 (conceptually illustrated using a dashed line to the visual display device 108 ). Accordingly, some amount of time delay may be associated with presentation of the audio content when communicated to the wire-based audio headset 104 a via the intervening component of the media presentation system 106 .
- one or more of the components of the media presentation system 106 may be configured to communicate audio content to one or more of the wireless audio headsets 104 b , 104 c , and/or to the intermediate mobile electronic device (mobile tablet 164 and/or a mobile phone 166 ) using a second wireless signal 170 . Accordingly, some additional amount of time delay may be associated with presentation of the audio content when communicated via the wireless signal 170 (via the intervening component of the media presentation system 106 ).
- the component of the media presentation system 106 transmitting the wireless signal 170 would include a suitable wireless signal interface (transceiver).
- embodiments of the audio synchronism system 100 are configured to synchronize audio output from all of the actively used audio headsets 104 with presentation of the video content on the display 114 . Accordingly, all users of the audio headsets, and optionally any users listening to the audio content output from the speakers 112 , hear the presented audio content and view the synchronously presented video content.
- a time delay for each particular audio headset 104 and/or for each intermediate mobile electronic device is stored as information in the video and audio headset delays 142 portion of memory 120 .
- the time delays for each particular audio headset 104 , and for each intermediate mobile electronic device, may be determined in any suitable manner.
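As a simple illustration of the video and headset delays 142 store (illustrative Python only; the identifiers and delay values are hypothetical, not taken from the patent), the media device can keep a table of inherent delays keyed by device identifier:

```python
# Hypothetical contents of the "video and headset delays" store, in milliseconds.
VIDEO_AND_HEADSET_DELAYS = {
    "display_114": 0,      # presentation delay of the display, if any
    "headset_104a": 5,     # wire-based headset, near-zero delay
    "headset_104b": 50,    # wireless headset from one manufacturer
    "headset_104c": 200,   # wireless headset from another manufacturer
}

def lookup_delay(device_id, table=VIDEO_AND_HEADSET_DELAYS, default_ms=0):
    """Return the stored inherent delay for a device, or a default when that
    device has not yet been characterized."""
    return table.get(device_id, default_ms)
```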
- a headset synchronization (sync) graphical user interface (GUI) 172 may be presented to indicate to the user which wireless audio headsets 104 and/or which intermediate mobile electronic devices are currently being used to present audio content that has been received from the media device 102.
- the user may, via the headset sync GUI 172, select and/or identify which wireless audio headsets 104 and/or which intermediate mobile electronic devices are currently being used to present audio content.
- the user may, via the headset sync GUI 172 , initiate a learning process or the like wherein a time delay for a new wireless audio headset 104 is determined by and/or is provided to the media device 102 .
- the user is able to specify time delays for a particular wireless audio headset 104 and/or for a particular intermediate mobile electronic device.
- time delay information may be available in device manuals or online at a website that the user may separately access.
- the user, by actuating controllers 150 on their remote control 148, may identify the particular wireless audio headset 104 and then enter a numerical value for the time delay associated with the specified wireless audio headset 104.
- the headset sync GUI 172 is configured to permit the media device to determine, or at least approximate, time delays based on user feedback.
- one or more audible test signals are emitted from the media device 102 , and a microphone or other audio sound detector 174 in the media device 102 detects the emitted audible test signal. Based on the time that the audible test signal was emitted from the media device 102 and the time that the audible test signal was detected at the media device, the delay time can be determined.
- the audible test signal may be communicated to the user's wireless audio headset 104 .
- the user of the tested audio headsets 104 may actuate one of the controllers 150 on their remote control 148 when they begin to hear the presentation of the audible test signal.
- the delay time can be determined.
- a visual test signal may be communicated concurrently with the audible test signal.
- the user will initially perceive the mis-synchronism between presentation of the visual test signal on the display 114 and their hearing of the audible test signal on their wireless audio headset 104 .
- the user by navigating about the headset sync GUI 172 , may then manually enter time delays.
- a specific value of a time delay change may be specified by the user.
- incremental time delay adjustments may be initiated by the user. After a plurality of iterations of viewing the visible test signal and hearing the audible test signal, a final time delay can be determined when the user finally perceives synchronization between presentation of the visual test signal and presentation of the audible test signal.
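The two calibration paths described above (a microphone, or a remote-control button press marking when the test signal is heard, followed by incremental user adjustments) amount to timestamp subtraction. The sketch below is a hedged illustration: the callables stand in for whatever emits the tone and reports its detection, and none of the names come from the patent.

```python
import time

def measure_delay_ms(emit_test_signal, wait_for_detection):
    """Estimate a headset path's delay from one audible test signal.

    emit_test_signal: callable that starts the test tone on the path under test.
    wait_for_detection: callable that blocks until the tone is detected, either
        by the media device's microphone or by the user pressing a controller
        on the remote control.
    """
    emitted_at = time.monotonic()
    emit_test_signal()
    wait_for_detection()
    return (time.monotonic() - emitted_at) * 1000.0

def refine_delay_ms(initial_ms, adjustments_ms):
    """Fold in incremental user adjustments (positive or negative milliseconds)
    entered until audio and video are perceived as synchronized."""
    delay = initial_ms
    for step in adjustments_ms:
        delay += step
    return delay
```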
- the time delay information for particular audio headsets 104 and/or for particular intermediate mobile electronic devices are provided to the media device.
- the time delay information is included in the wireless signal 156 emitted by that audio headset 104 or intermediate mobile electronic device.
- the browser 136 may be used to access a remote site to obtain time delay information.
- a time delay associated with communication of the video content and/or the audio content from the media device 102 to components of the media presentation system 106 , and the delay times associated with the attendant presentation of the video and audio content by components of the media presentation system 106 are stored as information in the video and audio headset delays 142 .
- a time delay for presentation of video on the display 114 may be stored as information in the video and audio headset delays 142 . If different displays 114 might be used for presentation of the video content, then different delay times may be stored.
- a large screen TV in a media room may be used to present video content received from the media device 102 .
- another TV may be in another room, such as the kitchen or a bedroom, and be presenting video content received from the media device 102 .
- FIG. 2 illustrates a hypothetical time line diagram 200 showing time delays associated with presentation of video content and audio content when a plurality of audio headsets 104 a , 104 b , and 104 c are used to present audio content.
- the time line diagram conceptually illustrates passage of time (from left to right). Accordingly, the time line diagram 200 illustrates communication of the video content and/or audio content at time 202 from the media device 102 .
- a presentation time 204 is conceptually illustrated to indicate presentation of the video content by the media presentation system 106 .
- the time line portion T A conceptually illustrates a time delay T A that is required for the media device 102 to communicate audio content, and then for the audio headset 104 a to present the audio content therefrom.
- the time delay T A is a relatively short duration since the wire-based audio headset 104 a is directly coupled to the media device 102 via the wire connector 154 ( FIG. 1 ).
- the time delay T B that is required for the wireless audio headset 104 b to receive the wireless signal 156 , process the audio information therein, and then present the audio content on its wireless audio headset speaker is a relatively longer duration.
- the time delay T C that is required for the wireless audio headset 104 c to receive the wireless signal 156, process the audio information therein, and then present the audio content to its wireless audio headset speaker is another relatively longer duration. Presuming that the wireless audio headsets 104 b, 104 c are different from each other, the time delay T B and the time delay T C are different from each other. In the hypothetical example illustrated in FIG. 2, the time delay T C is larger (has a greater duration) than the time delay T B.
- embodiments of the audio synchronism system 100 initially determine which particular audio headsets 104 are currently being used to present audio content to the plurality of users.
- the identifier of each of the wireless audio headsets 104 is provided to, and/or is detected by, the media device 102 .
- Time delays associated with presentation of audio content, if any, are then determined, are user specified, and/or are retrieved from the video and headset delays 142 portion of memory 120, for each wireless audio headset 104.
- the longest time delay is then selected, identified or determined. In the hypothetical example of FIG. 2 , the longest time delay is the time delay T C associated with the wireless audio headset 104 c.
- the media device 102 delays communication of the video content to the visual display device 108 of the media presentation system 106 by a time delay amount equal to a duration of T VIDEO & AUDIO, DELAY .
- the duration of T VIDEO & AUDIO, DELAY is substantially equal to the duration of the time delay T C that is associated with the wireless audio headset 104 c .
- the video content presented on the display 114 of the visual display device 108 is presented synchronously with the audio content presented by the wireless audio headset 104 c .
- when there is an inherent time delay associated with presentation of the video content on the display 114, that duration may be subtracted from the time delay T C to determine the duration of T VIDEO & AUDIO, DELAY .
- the video content presented on the display 114 of the visual display device 108 will be out of synchronism with the audio content presented by the wire-based audio headset 104 a .
- the amount of time of the out-of-synchronization between the presented video content and audio content presented by the audio headset 104 a corresponds to the duration identified as T A, DELAY .
- embodiments of the audio synchronism system 100 delay communication of the audio content to the audio headset 104 a by the time delay of T A, DELAY .
- the video content presented on the display 114 of the visual display device 108 is presented synchronously with the audio content presented by the wire-based audio headset 104 a .
- the duration of the time delay T A, DELAY is determined once the duration of the video content delay T VIDEO & AUDIO, DELAY is determined.
- the duration of the T A, DELAY is determined by subtracting out the duration of the time delay T A from the duration of T VIDEO & AUDIO, DELAY .
- the video content presented on the display 114 of the visual display device 108 will be out of synchronism with the audio content presented by the wireless audio headset 104 b .
- the amount of time of the out-of-synchronization between the presented video content and audio content presented by the wireless audio headset 104 b corresponds to the duration identified as T B, DELAY . Accordingly, embodiments of the audio synchronism system 100 delay communication of the audio content to the wireless audio headset 104 b by the time delay of T B, DELAY .
- the communication of the audio content is delayed to the second wireless audio headset 104 b by the time delay difference between the larger (first) time delay T C of the wireless audio headset 104 c and the smaller (second) time delay T B of the wireless audio headset 104 b .
- the video content presented on the display 114 of the visual display device 108 is also then presented synchronously with the audio content presented by the wireless audio headset 104 b .
- the duration of the T B, DELAY is determined by subtracting out the duration of the time delay T B from the duration of T VIDEO & AUDIO, DELAY .
- the time delay T B of the wireless audio headset 104 b may be known to be fifty milliseconds (50 ms) and the time delay T C of the wireless audio headset 104 c may be known to be two hundred milliseconds (200 ms). Accordingly, the audio content presentation on the wire-based audio headset 104 a would be delayed by 200 ms. The audio content presentation for the wireless audio headset 104 b would be delayed by 150 ms. Accordingly, the presented video content would be in synchronism with the audio headsets 104 a, 104 b, 104 c.
- the audio content is also output from the speakers 112 of the media presentation system 106 .
- the presented video content would be in synchronism with the audio content output from the speakers 112 .
- the audio content could be communicated in advance of the video content by the associated time delay amounts.
- communication of the audio content on the wireless audio headset 104 c would be advanced by 200 ms before communication of the video content.
- Communication of the audio content for the wireless audio headset 104 b would be advanced by 50 ms before communication of the video content.
- the presented video content would be in synchronism with the audio content output from the audio headsets 104 .
- FIG. 3 illustrates a hypothetical time line diagram 300 showing time delays associated with presentation of video content and audio content when a plurality of audio headsets 104 a - 104 e , and the intermediate mobile electronic devices 164 and 166 , are used to present audio content.
- the longest example time delay is for presentation of audio content on the wireless audio headset 104 e . A first time delay T E1 occurs for communication of the audio content from the media device 102 to the component of the media presentation system 106 .
- a second time delay T E2 occurs for communication of the audio content in the wireless communication signal 170 from the component of the media presentation system 106 to the mobile phone 166 , and the associated processing of the audio content performed by the mobile phone 166 .
- a third time delay T E3 occurs for communication of the audio content in the wireless communication signal 168 from the mobile phone 166 to the wireless audio headset 104 e , and the associated processing and presentation of the audio content by the wireless audio headset. Accordingly, the delay time between communication of the audio content from the media device 102 to presentation of the audio content by the wireless audio headset 104 e is the time delay (T E1 +T E2 +T E3 ).
- the media device 102 delays communication of the video content to the visual display device 108 of the media presentation system 106 by a time delay amount equal to a duration of T VIDEO & AUDIO, DELAY .
- the duration of T VIDEO & AUDIO, DELAY is substantially equal to the duration of the longest duration time delay (T E1 +T E2 +T E3 ) that is associated with the wireless audio headset 104 e . Accordingly, the video content presented on the display 114 of the visual display device 108 is presented synchronously with the audio content presented by the wireless audio headset 104 e.
- Time delays for the audio headsets 104 a , 104 b and 104 c are determined as described above with reference to the determined longest duration time delay (T E1 +T E2 +T E3 ).
- the time delay T A, DELAY in audio content for the first audio headset 104 a would be the determined longest duration time delay (T E1 +T E2 +T E3 ) minus the time delay T A .
- the time delay T B, DELAY in audio content for the second wireless audio headset 104 b would be the determined longest duration time delay (T E1 +T E2 +T E3 ) minus the time delay T B .
- the time delay T C, DELAY in audio content for the wireless audio headset 104 c would be the determined longest duration time delay (T E1 +T E2 +T E3 ) minus the time delay T C .
- there is a time delay T D1 that is associated with communication of the audio content via the wireless signal 156 to the mobile tablet 164 .
- the audio content is then communicated to the wire-based audio headset 104 d that is coupled to the mobile tablet 164 , incurring a second time delay T D2 .
- the total audio content presentation time delay for the audio headset 104 d is equal to the sum of the time delays (T D1 +T D2 ).
- Embodiments of the audio synchronism system 100 delay communication of the audio content to the audio headset 104 d by the time delay of T D, DELAY , which is equal to the determined longest duration time delay (T E1 +T E2 +T E3 ) minus the time delays (T D1 +T D2 ).
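For the chained paths of FIG. 3, the only change is that a headset's inherent delay is the sum of the hop delays along its path; the added audio delay is still the longest path delay minus that sum. The Python below is an illustrative sketch with hypothetical hop values, not figures taken from the patent.

```python
def path_delay_ms(hop_delays_ms):
    """Total inherent delay along a chained audio path (e.g. media device ->
    media presentation system -> mobile phone -> headset)."""
    return sum(hop_delays_ms)

# Hypothetical FIG. 3-style paths; every hop value here is made up.
paths = {
    "104d": [20, 15],       # T_D1 (to tablet) + T_D2 (tablet to wired headset)
    "104e": [20, 30, 60],   # T_E1 + T_E2 + T_E3 via the TV and the mobile phone
}

longest = max(path_delay_ms(hops) for hops in paths.values())
video_delay = longest  # the display is held back by the longest chained path
extra_audio_delay = {
    headset_id: longest - path_delay_ms(hops)
    for headset_id, hops in paths.items()
}
```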
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Databases & Information Systems (AREA)
- Circuit For Audible Band Transducer (AREA)
- Headphones And Earphones (AREA)
Abstract
Description
- Media devices, such as a set top box, a stereo, a television, a computer system, a game system, or the like, are often configured to communicate audio information to a user's audio headset. The audio headset user can view presented video content on a display, such as their television (TV), while listening to the corresponding audio content using their audio headset. The audio content may be communicated to the user's audio headset using a wire-based medium when the audio headset is communicatively coupled to the media device using a wire-based connector. Alternatively, the audio content may be communicated to the user's audio headset using a wireless-based medium when the audio headset is communicatively coupled to the media device using a suitable wireless transceiver.
- Wireless communication of the audio content requires various processing steps. Each processing step introduces a delay in the final presentation of the audio content on the user's audio headset. For example, but not limited to, the original audio content must be split off from the originally received video/audio content stream. If the audio headset is a wireless type device, the audio content must be converted into a wireless medium format, and then communicated to the audio headset. Finally, the wireless audio headset must detect then wireless signal with the audio content, and then process the wireless signal to generate a signal that is reproducible as sound using the speakers of the wireless audio headset. Accordingly, the output of the video content and the output of the audio content may not be in synchronism when the delay times of processing and presenting the video content is different from the delay times of processing, communicating and presenting the audio content.
- Various systems and methods have been devised to correct for the above-described video/audio synchronization problem for a wireless audio headset. Essentially, a delay in presentation of the video content and/or the audio content is implemented so that the video content and the audio content are presented in synchronism (or at least substantially in synchronism with each other so that the user substantially perceives that video content and the audio content are synchronously presented).
- However, in some situations, multiple users may wish to simultaneously view the video content while individually listening to the audio content using their own personal audio headsets. For example, the multiple users at an apartment complex where watching a loud action movie late at night using their stereo system with external speakers may not be practical. Accordingly, the multiple users may watch the movie video content on their large screen TV while listening to the audio content using their own audio headsets.
- A problem not addressed in the prior art is synchronism of video content presentation with audio content when the audio content is presented on multiple wireless and/or wire-based audio headsets. This problem becomes particularly complex in view that there are a wide variety of different types of wireless audio headsets in the market place, each with different inherent audio content processing time delays.
- Further, a wire-based audio headset may be inherently synchronized with the presented video when the media device is sourcing both the video display and the wireless audio headset. Synchronism corrections to a concurrently user wireless audio headset will then cause the wire-based headset to become out of synchronism with the video content.
- Accordingly, there is a need in the arts to provide enhanced synchronism of video content presentation with audio content when the audio content is presented on multiple wire-based and/or wireless audio headsets.
- Systems and methods of synchronizing presentation of video content with a plurality of different wireless audio headsets are disclosed. In an exemplary embodiment, a first time delay corresponds to a first duration of time between communication of the audio content from the media device and presentation of the audio content by a first wireless audio headset. A second time delay corresponds to a second duration of time between communication of the audio content from the media device and presentation of the audio content by a second wireless audio headset, wherein the first time delay is greater than the second time delay. Video content communicated to a display is delayed by the first time delay. Audio content communicated to the second wireless audio headset is delayed by a time delay difference between the first time delay and the second time delay.
- Preferred and alternative embodiments are described in detail below with reference to the following drawings:
-
FIG. 1 is a block diagram of an embodiment of an audio synchronism system implemented in a media device; and -
FIG. 2 illustrates a hypothetical time line diagram showing time delays associated with presentation of video content and audio content when a plurality of audio headsets are used to present audio content; and -
FIG. 3 illustrates a hypothetical time line diagram showing time delays associated with presentation of video content and audio content when a plurality of audio headsets and intermediate mobile electronic devices are used to present audio content. -
FIG. 1 is a block diagram of an embodiment of aaudio synchronism system 100 implemented in amedia device 102, such as, but not limited to, a set top box (STB). Embodiments of theaudio synchronism system 100 may be implemented inother media devices 102, such as, but not limited to, a surround-sound receiver, a television (TV), a tablet computer, a laptop computer, a personal computer (PC), a digital video disc (DVD) player, a digital video recorder (DVR), or a game playing device. Here, suchexemplary media devices 102 are configured to communicate the audio content to a plurality ofaudio headsets 104. - Embodiments of the
audio synchronism system 100 are configured to synchronize presentation of video content and audio content with a plurality of different types of audio headsets 104 (wireless audio headsets and/or wire-based audio headsets). Based on unique time delay associated with communication to and presentation of the audio content bydifferent audio headsets 104, and based on the time delay associated with presentation of the associated video content, embodiments of theaudio synchronism system 100 adjust the times that themedia device 102 communicates the video content and/or the audio content such that the video content is synchronously presented with the audio content. - The
exemplary media device 102 is communicatively coupled to amedia presentation system 106 that includes avisual display device 108, such as a television (hereafter, generically a TV), and anaudio presentation device 110, such as a surround sound receiver controlling an audio reproduction device 112 (hereafter, generically, a speaker). Other types of output devices may also be coupled to themedia device 102, including those providing any sort of stimuli sensible by a human being, such as temperature, vibration and the like. The video content portion of a media content event is displayed on thedisplay 114 and the audio portion of the media content event is reproduced as sounds by one ormore speakers 112. In some embodiments, themedia device 102 and one or more of the components of themedia presentation system 106 may be integrated into a single electronic device. - The non-limiting
exemplary media device 102 comprises a mediacontent stream interface 116, aprocessor system 118, amemory 120, aprogram buffer 122, an optional digital video recorder (DVR) 124, apresentation device interface 126, aremote interface 128, an optionalinterne interface 130, and anaudio headset interface 132. Thememory 120 comprises portions for storing themedia device logic 134, anoptional browser 136, the electronic program guide (EPG)information 138, timedelay processing logic 140, video andheadset delays 142, and the video andaudio splitter 144. In some embodiments, themedia device logic 134, thebrowser 136, the time delay processing logic, and the video and audio splitter, and/or may be integrated with other logic. In other embodiments, some or all of these memory and other data manipulation functions may be provided by and using remote server or other electronic devices suitably connected via the Internet or otherwise to a client device.Other media devices 102 may include some, or may omit some, of the above-described media processing components. Further, additional components not described herein may be included in alternative embodiments - The functionality of the
media device 102, here a set top box, is now broadly described. A media content provider provides media content that is received in one or more multiplemedia content streams 146 multiplexed together in one or more transport channels. The transport channels with themedia content streams 146 are communicated to themedia device 102 from a media system sourced from a remote head end facility (not shown) operated by the media content provider. Non-limiting examples of such media systems include satellite systems, cable system, and the Internet. For example, if the media content provider provides programming via a satellite-based communication system, themedia device 102 is configured to receive one or more broadcasted satellite signals detected by an antenna (not shown). Alternatively, or additionally, themedia content stream 146 can be received from one or more different sources, such as, but not limited to, a cable system, a radio frequency (RF) communication system, or the Internet. - The one or more
media content streams 146 are received by the mediacontent stream interface 116. One ormore tuners 116 a in the mediacontent stream interface 116 selectively tune to one of themedia content streams 146 in accordance with instructions received from theprocessor system 118. Theprocessor system 118, executing themedia device logic 134 and based upon a request for a media content event of interest specified by a user, parses out media content associated with one or more media content events of interest. The video andaudio splitter 144 is configured to separate the video content and the audio content. The media content event of interest is then assembled into a stream of video content and audio content. The video content and the audio content may be stored by theprogram buffer 122 such that the video content and the audio content can be streamed out to components of themedia presentation system 106, such as thevisual display device 108 and/or theaudio presentation device 110, via thepresentation device interface 126. Alternatively, or additionally, the parsed out media content may be saved into the DVR 124 for later presentation. The DVR 124 may be directly provided in, locally connected to, or remotely connected to, themedia device 102. In alternative embodiments, themedia content streams 146 may stored for later decompression, processing and/or decryption. - From time to time, information populating the
EPG information 138 portion of the memory 120 is communicated to the media device 102, via the media content stream 146 or via another suitable medium. The EPG information 138 portion of the memory 120 stores the information pertaining to the scheduled programming of media content events received in the media content stream 146. The information may include, but is not limited to, a scheduled presentation start and/or end time, a program channel, and descriptive information for individual media content events. The media content event's descriptive information may include the title of the media content event, names of performers or actors, date of creation, and a summary describing the nature of the media content event. Any suitable information may be included in the supplemental information. Upon receipt of a command from the user requesting presentation of an EPG display, the information in the EPG information 138 is retrieved, formatted, and then presented on the display 114 as an EPG. - The
exemplary media device 102 is configured to receive commands from a user via aremote control 148. Theremote control 148 includes one ormore controllers 150 disposed on the surface of the remote control. The user, by actuating one or more of thecontrollers 150, causes theremote control 148 to generate and transmit commands, via awireless signal 152, to themedia device 102. The commands control themedia device 102 and/or control the media presentation devices. Thewireless signal 152 may be an infrared (IR) signal or a radio frequency (RF) signal that is detectable by theremote interface 128. - The processes performed by the
media device 102 relating to the processing of the receivedmedia content stream 146 and communication of a presentable video content and the audio content of the media content event to the components of themedia presentation system 106 are generally implemented by theprocessor system 118 while executing themedia device logic 134. Thus, themedia device 102 may perform a variety of functions related to the processing and presentation of one or more media content events received in themedia content stream 146. - In some embodiments, the
media device 102 automatically mutes the audio content output from thespeakers 112 if one or more of theaudio headsets 104 are coupled to themedia device 102. For example, the audio content portion of the presented media content event may not be communicated out from thepresentation device interface 126 to components of themedia presentation system 106. Accordingly, thespeakers 112 do not produce the audio content. - Alternatively, some embodiments may optionally continue to output the audio content from the
speakers 112 if one or more of theaudio headsets 104 are coupled to themedia device 102. Here, the user of theaudio headset 104 may be hearing impaired, where theaudio headset 104 provides enhanced sound control of the hearing impaired user. As another example, the user wearing theaudio headset 104 may wish to have the volume presented at a louder volume level (or a lesser volume level) than the audio volume heard by other people who are listening to the audio content output from thespeakers 112. -
FIG. 1 illustrates a plurality of differentaudio headsets 104 communicatively coupled to themedia device 102 using a variety of communication means. Theaudio headset 104 a is a wire-based head phone set that couples to themedia device 102 using thewire connector 154. Thewire connector 154 has a suitable plug type connector that fits into a mating receptacle of theaudio headset interface 132. Accordingly, theaudio headset interface 132 outputs the audio content using a suitable wire-based format, such as, but not limited to, an analog signal. Some embodiments of themedia device 102 may be configured to couple to a plurality of wire-based audio headsets 104 (via a plurality of receptacles and/or by using an external audio headset signal splitter). - A plurality of wireless based
audio headsets 104 are configured to receive audio content from the media device 102 via a wireless signal 156. Further, the wireless audio headsets 104 may be different from each other, such as the example wireless audio headset 104 b and the wireless audio headset 104 c. Such wireless audio headsets 104 b, 104 c include a short range transceiver 158 configured to detect the wireless signal 156 with the audio content therein. Wireless audio headsets 104 b, 104 c also include a memory 160 and a processor system 162. Logic for receiving and processing the audio content received in the wireless signal 156 resides in the memory 160. Further, in some embodiments of the wireless audio headsets 104 b, 104 c, an identifier of the wireless audio headset 104 b, 104 c resides in the memory 160. The identifier of the wireless audio headset 104 b, 104 c may be communicated to the media device 102, via the short range transceiver 158. Accordingly, the media device 102 can determine a time delay that is associated with that particular identified wireless audio headset 104 b, 104 c. The time delay corresponds to the duration of time for receiving the wireless signal 156, generating the audio content for reproduction on the speakers of the wireless audio headset 104 b, 104 c, and the attendant reproduction of the audio content on the speakers of the wireless audio headset 104 b, 104 c, as performed by the wireless audio headset processor system 162. - In situations where the
wireless audio headset 104 b is different from the wireless audio headset 104 c, the time delays associated with those particular wireless audio headsets 104 b, 104 c are typically different from each other. For example, the time delay of the wireless audio headset 104 b may be known to be fifty milliseconds (50 ms) and the time delay of the wireless audio headset 104 c may be known to be two hundred milliseconds (200 ms). Accordingly, audio content presentation on the wireless audio headset 104 b would be delayed by 50 ms behind presentation of the video content on the display 114. The audio content presented for the wireless audio headset 104 c would be delayed by 200 ms behind presentation of the video content on the display 114. Here, synchronization of the audio content from the wireless audio headset 104 c with the video content may be effected by delaying presentation of the video content by 200 ms. However, audio content presentation on the wireless audio headset 104 b would otherwise be in advance of the presented video content by 150 ms. Accordingly, embodiments delay the audio content presentation on the wireless audio headset 104 b by an additional 150 ms. Accordingly, the video content is in synchronism with the audio content presented by both of the audio headsets 104 b, 104 c.
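The arithmetic of the preceding example can be sketched in a few lines of Python. This is an illustrative fragment only, with hypothetical headset identifiers and the values assumed above; it is not part of the disclosed embodiments:

```python
# Known per-headset delays from the example above (hypothetical values).
headset_delays_ms = {"104b": 50, "104c": 200}

# Delay the video by the longest headset delay so it matches headset 104c.
video_delay_ms = max(headset_delays_ms.values())            # 200 ms

# Each faster headset then needs an additional audio delay equal to the difference.
additional_audio_delay_ms = {
    headset_id: video_delay_ms - delay
    for headset_id, delay in headset_delays_ms.items()
}
print(additional_audio_delay_ms)                            # {'104b': 150, '104c': 0}
```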
- In some embodiments, the different wireless audio headsets 104 b, 104 c may receive the audio content from the media device 102 using the same type of wireless signal 156. However, the inherent time delay of each different one of the plurality of wireless audio headsets 104 b, 104 c may be different from the inherent time delays of the other wireless audio headsets 104 b, 104 c. - In an example embodiment, the
wireless signal 156 is a Bluetooth communication signal. The Bluetooth communication signal is well known to employ a short range wireless technology standard for exchanging data over short distances using short-wavelength ultra high frequency (UHF) radio waves in the industrial, scientific and medical (ISM) radio band from 2.4 to 2.485 GHz. Bluetooth technology may be used by fixed and mobile devices, such as the example wireless audio headsets 104 b, 104 c (mobile electronic devices). Here, the audio headset interface 132 of the media device 102 (a fixed electronic device) and the short range transceiver 158 of the wireless audio headsets 104 b, 104 c are each configured to support the exchange of the wireless signal 156 by the media device 102 and the wireless audio headsets 104 b, 104 c. - The Bluetooth protocol provides for secure exchange of information between devices. Under the Bluetooth protocol, the master device (the
wireless audio headset 104 b, 104 c in this example) broadcasts the wireless signal 156 having therein the identifier of the broadcasting Bluetooth wireless audio headset 104 b, 104 c. In the various embodiments of the audio synchronism system 100, the media device 102 only needs to detect the emitted wireless signal 156 from the Bluetooth compatible broadcasting wireless audio headset 104 b, 104 c. Accordingly, the media device 102 may identify a particular wireless audio headset 104 b, 104 c based upon the unique identifier that is broadcast from the wireless audio headset 104 b, 104 c. - In such embodiments where the
different audio headsets 104 b, 104 c receive the audio content using the same type of wireless signal 156, the audio synchronism system 100 may be configured to communicate portions of the wireless signals 156 using the same medium, wherein each wireless signal 156 is designated for a particular one of a plurality of audio headsets 104. For example, the wireless audio headsets 104 b, 104 c may each receive packets that are designated for that particular wireless audio headset 104 b, 104 c in the wireless signal 156. Accordingly, a particular portion of the audio content in a first packet for the wireless audio headset 104 b (identified by the unique identifier of the wireless audio headset 104 b) can be communicated at a particular time. The same portion of the audio content may be communicated at a different time in a second packet for the wireless audio headset 104 c (identified by the unique identifier of the wireless audio headset 104 c). Here, the wireless audio headset 104 b processes received voice data packets with its unique identifier to generate a stream of audio content based on the first time that the packet was communicated from the media device 102 to the wireless audio headset 104 b. Similarly, the wireless audio headset 104 c processes received voice data packets with its unique identifier to generate a stream of audio content based on the second time that the packet was communicated from the media device 102 to the wireless audio headset 104 c.
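One way to picture this per-headset packet timing is the short Python sketch below. It is a simplified illustration only: the packet structure, the headset identifiers, the per-headset hold-back offsets (which assume the video is delayed by the longest headset delay, as described earlier), and the scheduling function are all hypothetical and are not defined by the disclosure:

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Packet:
    send_time_ms: int
    headset_id: str = field(compare=False)
    payload: bytes = field(compare=False)

def schedule_packets(chunks, headset_extra_delay_ms):
    """Queue one copy of each audio chunk per headset, ordered by send time.

    `chunks` is a list of (presentation_time_ms, payload) tuples;
    `headset_extra_delay_ms` maps a headset identifier to the extra time (ms)
    its copy of each chunk is held back before transmission.
    """
    queue = []
    for presentation_time_ms, payload in chunks:
        for headset_id, extra_delay in headset_extra_delay_ms.items():
            heapq.heappush(queue, Packet(presentation_time_ms + extra_delay,
                                         headset_id, payload))
    return [heapq.heappop(queue) for _ in range(len(queue))]

# Example: the same chunk is transmitted at different times to 104b and 104c.
sent = schedule_packets([(0, b"chunk-0"), (20, b"chunk-1")],
                        {"104b": 150, "104c": 0})
for p in sent:
    print(p.send_time_ms, p.headset_id)   # 0 104c, 20 104c, 150 104b, 170 104b
```

Transmitting the same content portion at different times per headset is only one arrangement; the disclosure also contemplates delaying the video instead.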
- Alternatively, or additionally, embodiments of the media device 102 and the wireless audio headsets 104 b, 104 c may be configured to communicate the wireless signal 156 using a wireless local area network (LAN) protocol, such as under the Institute of Electrical and Electronics Engineers (IEEE) 802.11 standard or other similar standard. For example, the mobile electronic device may be a user's portable laptop computer, notebook, or the like that is configured to communicate wirelessly with a non-mobile electronic device such as a printer, or with websites via the Internet. Embodiments of the media device 102 and the wireless audio headsets 104 b, 104 c may similarly communicate the audio content using such a wireless LAN signal. - In embodiments where different wireless mediums are used, the
audio headset interface 132 may include a plurality of different transceivers therein that are configured to communicate using different mediums. For example, but not limited to, a first transceiver may be included in the audio headset interface 132 that is compatible with Bluetooth communications, and a second transceiver may be included in the audio headset interface 132 that is compatible with Wi-Fi communications. Alternatively, a plurality of different audio headset interfaces 132, each using a different communication medium, may be implemented in the media device 102. - In some embodiments, an intermediate mobile electronic device may be configured to detect the
wireless signal 156 emitted from the media device 102. For example, a mobile tablet 164 and/or a mobile phone 166 may detect the wireless signal 156 with the audio content therein, and then present the audio content on an audio headset coupled to that intermediate mobile device. For example, but not limited to, a wire-based audio headset 104 d is illustrated as being coupled to the exemplary intermediate mobile electronic device, a tablet 164, such that the wire-based audio headset 104 d receives the audio content from the tablet 164 via a second wire conductor. As another non-limiting example, a wireless audio headset 104 e is illustrated as being coupled to another intermediate mobile electronic device, the mobile phone 166, such that the wireless audio headset 104 e receives the audio content from the mobile phone 166 via a second wireless signal 168. Such mobile electronic devices, such as cell phones, smart phones, tablets, and/or note pads, may be provisioned with a low range wireless communication system, such as, but not limited to, a Bluetooth system. Their respective Bluetooth system is then configured to emit a wireless signal 156 that is used for identifying themselves to the media device 102. Accordingly, the media device 102 can determine a suitable delay that is appropriate for presentation of the audio content on the audio headsets 104 d, 104 e. - In some embodiments, the wire-based
audio headset 104 a may be coupled to one of the components of themedia presentation system 106, such as thevisual display device 108 or theaudio presentation device 110, using the wire connector 154 (conceptually illustrated using a dashed line to the visual display device 108). Accordingly, some amount of time delay may be associated with presentation of the audio content when communicated to the wire-basedaudio headset 104 a via the intervening component of themedia presentation system 106. - Alternatively, or additionally, one or more of the components of the
media presentation system 106 may be configured to communicate audio content to one or more of thewireless audio headsets mobile tablet 164 and/or a mobile phone 166) using asecond wireless signal 170. Accordingly, some additional amount of time delay may be associated with presentation of the audio content when communicated via the wireless signal 170 (via the intervening component of the media presentation system 106). In such embodiments, the component of themedia presentation system 106 transmitting thewireless signal 170 would include a suitable wireless signal interface (transceiver). - When a plurality of different
audio headsets 104 are concurrently used to present audio content to a user of thatparticular audio headset 104, embodiments of theaudio synchronism system 100 are configured to synchronize audio output from all of the actively usedaudio headsets 104 with presentation of the video content on thedisplay 114. Accordingly, all users of the audio headsets, and optionally any users listening to the audio content output from thespeakers 112, hear the presented audio content and view the synchronously presented video content. - In the various embodiments, a time delay for each
particular audio headset 104 and/or for each intermediate mobile electronic device is stored as information in the video andaudio headset delays 142 portion ofmemory 120. The time delays for eachparticular audio headset 104, and for each intermediate mobile electronic device, may be determined in any suitable manner. - In some embodiments, a headset synchronization (sync) graphical user interface (GUI) 172 may be presented to indicate to the user which
wireless audio headsets 104 and/or which intermediate mobile electronic devices are currently being used to present audio content that has been received from the media device 102. The user may, via the headset sync GUI 172, select and/or identify which wireless audio headsets 104 and/or which intermediate mobile electronic devices are currently being used to present audio content. In some embodiments, the user may, via the headset sync GUI 172, initiate a learning process or the like wherein a time delay for a new wireless audio headset 104 is determined by and/or is provided to the media device 102. - In some embodiments, the user is able to specify time delays for a particular
wireless audio headset 104 and/or for a particular intermediate mobile electronic device. For example, time delay information may be available in device manuals or online at a website that the user may separately access. The user, by actuating the controllers 150 on their remote control 148, identifies the particular wireless audio headset 104 and then enters a numerical value for the time delay associated with the specified wireless audio headset 104.
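As a simple mental model of storing such user-entered values, the Python fragment below keeps delays in a dictionary keyed by headset identifier. The table name, function, and identifiers are hypothetical stand-ins for the video and headset delays 142 portion of memory 120, not an implementation of it:

```python
# Hypothetical in-memory stand-in for the stored headset delay table.
headset_delay_table_ms = {}

def record_user_specified_delay(headset_id, delay_ms):
    """Store a delay the user looked up (e.g., in a device manual) for a headset."""
    headset_delay_table_ms[headset_id] = delay_ms

record_user_specified_delay("104b", 50)    # values entered via the remote control
record_user_specified_delay("104c", 200)
```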
- Alternatively, or additionally, the headset sync GUI 172 is configured to permit the media device 102 to determine, or at least approximate, time delays based on user feedback. In some embodiments, one or more audible test signals are emitted from the media device 102, and a microphone or other audio sound detector 174 in the media device 102 detects the emitted audible test signal. Based on the time that the audible test signal was emitted from the media device 102 and the time that the audible test signal was detected at the media device 102, the delay time can be determined.
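The emit-and-detect measurement amounts to timing a loopback through the audio chain. A minimal Python sketch is shown below; emit_test_tone and wait_for_tone_detection are assumed, platform-specific routines standing in for driving the headset path and for the audio sound detector 174, and are not part of the disclosure:

```python
import time

def measure_audio_chain_delay_ms(emit_test_tone, wait_for_tone_detection):
    """Estimate the delay between emitting a test tone and detecting it again.

    `emit_test_tone` starts playback of the test signal through the headset
    path; `wait_for_tone_detection` blocks until the microphone picks the
    tone up. Both are placeholders for device-specific code.
    """
    start = time.monotonic()
    emit_test_tone()
    wait_for_tone_detection()
    return (time.monotonic() - start) * 1000.0
```

A practical implementation would also account for the acoustic propagation time and the detector's own latency before storing the result.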
- Alternatively, or additionally, the audible test signal may be communicated to the user's wireless audio headset 104. The user of the tested audio headset 104 may actuate one of the controllers 150 on their remote control 148 when they begin to hear the presentation of the audible test signal. Based on a response from the user, such as by actuation of a controller 150 on the remote control 148, the delay time can be determined. - Alternatively, or additionally, a visual test signal may be communicated concurrently with the audible test signal. The user will initially perceive the mis-synchronism between presentation of the visual test signal on the
display 114 and their hearing of the audible test signal on their wireless audio headset 104. The user, by navigating about the headset sync GUI 172, may then manually enter time delays. A specific value of a time delay change may be specified by the user. Alternatively, or additionally, incremental time delay adjustments may be initiated by the user. After a plurality of iterations of viewing the visible test signal and hearing the audible test signal, a final time delay can be determined when the user finally perceives synchronization between presentation of the visual test signal and presentation of the audible test signal.
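The iterative adjustment described above is essentially a small feedback loop driven by remote-control presses. The Python sketch below illustrates the idea; the step size and the read_user_adjustment callback are hypothetical, and a real GUI would be event-driven rather than a blocking loop:

```python
STEP_MS = 10  # hypothetical increment applied per button press

def tune_delay_interactively(current_delay_ms, read_user_adjustment):
    """Nudge a stored delay until the user reports the test signals line up.

    `read_user_adjustment` is a placeholder that returns +1 or -1 for the
    increment/decrement controllers on the remote, or 0 when the user
    perceives the visual and audible test signals as synchronized.
    """
    while True:
        adjustment = read_user_adjustment()
        if adjustment == 0:
            return current_delay_ms          # final time delay
        current_delay_ms = max(0, current_delay_ms + adjustment * STEP_MS)
```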
- In some embodiments, the time delay information for particular audio headsets 104 and/or for particular intermediate mobile electronic devices is provided to the media device 102. In some embodiments, the time delay information is included in the wireless signal 156 emitted by that audio headset 104 or intermediate mobile electronic device. Alternatively, or additionally, the browser 136 may be used to access a remote site to obtain time delay information. - In some embodiments, a time delay associated with communication of the video content and/or the audio content from the
media device 102 to components of themedia presentation system 106, and the delay times associated with the attendant presentation of the video and audio content by components of themedia presentation system 106, are stored as information in the video and audio headset delays 142. For example, but not limited to, a time delay for presentation of video on thedisplay 114 may be stored as information in the video and audio headset delays 142. Ifdifferent displays 114 might be used for presentation of the video content, then different delay times may be stored. For example, a large screen TV in a media room may be used to present video content received from themedia device 102. Alternatively, or additionally, another TV may be in another room, such as the kitchen or a bedroom, and be presenting video content received from themedia device 102. -
FIG. 2 illustrates a hypothetical time line diagram 200 showing time delays associated with presentation of video content and audio content when a plurality of audio headsets 104 are concurrently in use. The audio content and the video content become available for communication at a time 202 from the media device 102. A presentation time 204 is conceptually illustrated to indicate presentation of the video content by the media presentation system 106. - For example, the time line portion TA conceptually illustrates a time delay TA that is required for the
media device 102 to communicate audio content, and then for theaudio headset 104 a to present the audio content therefrom. Here, the time delay TA is a relatively short duration since the wire-basedaudio headset 104 a is directly coupled to themedia device 102 via the wire connector 154 (FIG. 1 ). - In contrast, the time delay TB that is required for the
wireless audio headset 104 b to receive the wireless signal 156, process the audio information therein, and then present the audio content on its wireless audio headset speaker is a relatively longer duration. Similarly, the time delay TC that is required for the wireless audio headset 104 c to receive the wireless signal 156, process the audio information therein, and then present the audio content on its wireless audio headset speaker is another relatively longer duration. Presuming that the wireless audio headsets 104 b, 104 c are different from each other, and as conceptually illustrated in FIG. 2 , the time delay TC is larger (has a greater duration) than the time delay TB. - In the hypothetical example of
FIG. 2 , embodiments of the audio synchronism system 100 initially determine which particular audio headsets 104 are currently being used to present audio content to the plurality of users. In an example embodiment, the identifier of each of the wireless audio headsets 104 is provided to, and/or is detected by, the media device 102. Time delays associated with presentation of audio content, if any, are then determined, are user specified, and/or are retrieved from the video and headset delays 142 portion of memory 120, for each wireless audio headset 104. The longest time delay is then selected, identified, or determined. In the hypothetical example of FIG. 2 , the longest time delay is the time delay TC associated with the wireless audio headset 104 c. - Once the longest duration time delay is determined, the
media device 102 delays communication of the video content to the visual display device 108 of the media presentation system 106 by a time delay amount equal to a duration of TVIDEO & AUDIO, DELAY. Here, the duration of TVIDEO & AUDIO, DELAY is substantially equal to the duration of the time delay TC that is associated with the wireless audio headset 104 c. Accordingly, the video content presented on the display 114 of the visual display device 108 is presented synchronously with the audio content presented by the wireless audio headset 104 c. In some embodiments, if there is a known time delay associated with communication and presentation of the video content on the display 114 (not shown in FIG. 2 ), then that duration may be subtracted from the time delay TC to determine the duration of TVIDEO & AUDIO, DELAY. - However, after the video portion is delayed by the duration of TVIDEO & AUDIO, DELAY, the video content presented on the
display 114 of thevisual display device 108 will be out of synchronism with the audio content presented by the wire-basedaudio headset 104 a. The amount of time of the out-of-synchronization between the presented video content and audio content presented by theaudio headset 104 a corresponds to the duration identified as TA, DELAY. Accordingly, embodiments of theaudio synchronism system 100 delay communication of the audio content to theaudio headset 104 a by the time delay of TA, DELAY. Thus, the video content presented on thedisplay 114 of thevisual display device 108 is presented synchronously with the audio content presented by the wire-basedaudio headset 104 a. The duration of the time delay TA, DELAY is determined once the duration of the video content delay TVIDEO & AUDIO, DELAY is determined. The duration of the TA, DELAY is determined by subtracting out the duration of the time delay TA from the duration of TVIDEO & AUDIO, DELAY. - Similarly, after the video portion is delayed by the duration of TVIDEO & AUDIO, DELAY, the video content presented on the
display 114 of the visual display device 108 will be out of synchronism with the audio content presented by the wireless audio headset 104 b. The amount of time of the out-of-synchronization between the presented video content and the audio content presented by the wireless audio headset 104 b corresponds to the duration identified as TB, DELAY. Accordingly, embodiments of the audio synchronism system 100 delay communication of the audio content to the wireless audio headset 104 b by the time delay of TB, DELAY. That is, the communication of the audio content to the second wireless audio headset 104 b is delayed by the time delay difference between the larger (first) time delay TC of the wireless audio headset 104 c and the smaller (second) time delay TB of the wireless audio headset 104 b. Thus, the video content presented on the display 114 of the visual display device 108 is also then presented synchronously with the audio content presented by the wireless audio headset 104 b. The duration of TB, DELAY is determined by subtracting out the duration of the time delay TB from the duration of TVIDEO & AUDIO, DELAY.
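Putting the relationships from FIG. 2 together: the video hold-back equals the longest headset delay (less any delay already inherent in the display path), and each headset's extra audio delay is the longest delay minus its own. The following Python sketch, with hypothetical identifiers and values, illustrates that calculation as an interpretation of the arithmetic above rather than as the disclosed implementation:

```python
def compute_sync_delays_ms(headset_delays_ms, display_delay_ms=0):
    """Return (video hold-back, per-headset audio hold-backs) in milliseconds.

    `headset_delays_ms` maps each active headset to its known presentation
    delay (TA, TB, TC in FIG. 2). The video is held back by the longest
    headset delay less any known display-path delay; each headset's audio
    is held back by the longest delay minus that headset's own delay.
    """
    longest = max(headset_delays_ms.values())
    video_hold_back = max(0, longest - display_delay_ms)
    audio_hold_backs = {hs: longest - d for hs, d in headset_delays_ms.items()}
    return video_hold_back, audio_hold_backs

# Hypothetical values: wired headset 104a ~5 ms, 104b 50 ms, 104c 200 ms.
video_ms, audio_ms = compute_sync_delays_ms({"104a": 5, "104b": 50, "104c": 200})
print(video_ms)    # 200
print(audio_ms)    # {'104a': 195, '104b': 150, '104c': 0}
```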
- For example, the time delay TB of the wireless audio headset 104 b may be known to be fifty milliseconds (50 ms) and the time delay TC of the wireless audio headset 104 c may be known to be two hundred milliseconds (200 ms). Accordingly, the audio content presentation on the wire-based audio headset 104 a would be delayed by 200 ms. The audio content presentation for the wireless audio headset 104 b would be delayed by 150 ms. Accordingly, the presented video content would be in synchronism with the audio content presented by each of the audio headsets 104 a, 104 b, 104 c. - In some instances, the audio content is also output from the
speakers 112 of themedia presentation system 106. Here, the presented video content would be in synchronism with the audio content output from thespeakers 112. - In alternative embodiments, communication of the audio content could be communicated in advance of the video content by the associated time delay amounts. In the above-described example, communication of the audio content on the
wireless audio headset 104 c would be advanced by 200 ms before communication of the video content. Communication of the audio content for the wireless audio headset 104 b would be advanced by 50 ms before communication of the video content. Here, the presented video content would be in synchronism with the audio content output from the audio headsets 104. -
FIG. 3 illustrates a hypothetical time line diagram 300 showing time delays associated with presentation of video content and audio content when a plurality of audio headsets 104 a-104 e, and the intermediate mobile electronic devices 164, 166, are concurrently in use. - In this conceptual example, the longest example time delay is for presentation of audio content on the wireless audio headset 104 e. In this example, there is a first time delay TE1 that is associated with communication of the audio content from the
media device 102 to the media presentation system 106 and the associated processing of the audio content performed by the component of the media presentation system 106. A second time delay TE2 occurs for communication of the audio content in the wireless communication signal 170 from the component of the media presentation system 106 to the mobile phone 166, and the associated processing of the audio content performed by the mobile phone 166. A third time delay TE3 occurs for communication of the audio content in the wireless communication signal 168 from the mobile phone 166 to the wireless audio headset 104 e, and the associated processing and presentation of the audio content by the wireless audio headset 104 e. Accordingly, the delay time between communication of the audio content from the media device 102 and presentation of the audio content by the wireless audio headset 104 e is the time delay (TE1+TE2+TE3). - Once the longest duration time delay (TE1+TE2+TE3) is determined (conceptually illustrated at a time 302), the
media device 102 delays communication of the video content to thevisual display device 108 of themedia presentation system 106 by a time delay amount equal to a duration of TVIDEO & AUDIO, DELAY. Here, the duration of TVIDEO & AUDIO, DELAY is substantially equal to the duration of the longest duration time delay (TE1+TE2+TE3) that is associated with thewireless audio headset 104 e. Accordingly, the video content presented on thedisplay 114 of thevisual display device 108 is presented synchronously with the audio content presented by thewireless audio headset 104 e. - Time delays for the
audio headsets 104 a, 104 b, 104 c are determined in a similar manner. The time delay TA, DELAY in audio content for the first audio headset 104 a would be the determined longest duration time delay (TE1+TE2+TE3) minus the time delay TA. The time delay TB, DELAY in audio content for the second wireless audio headset 104 b would be the determined longest duration time delay (TE1+TE2+TE3) minus the time delay TB. And, the time delay TC, DELAY in audio content for the third wireless audio headset 104 c would be the determined longest duration time delay (TE1+TE2+TE3) minus the time delay TC. - Similarly, there is a time delay TD1 that is associated with communication of the audio content via
wireless signal 156 to the mobile tablet 164. After processing, the audio content is communicated to the wire-based audio headset 104 d that is coupled to the mobile tablet 164, which incurs a second time delay TD2. Accordingly, the total audio content presentation time delay for the audio headset 104 d is equal to the sum of the time delays (TD1+TD2). Embodiments of the audio synchronism system 100 delay communication of the audio content to the audio headset 104 d by the time delay of TD, DELAY, which is equal to the determined longest duration time delay (TE1+TE2+TE3) minus the time delays (TD1+TD2).
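The multi-hop case in FIG. 3 only changes how each headset's total delay is obtained: the per-hop delays along a path are summed before the same longest-delay comparison is applied. A short Python sketch with hypothetical per-hop values follows; the numbers are illustrative and are not taken from the disclosure:

```python
# Hypothetical per-hop delays (ms); each path's total is the sum of its hops.
paths_ms = {
    "104a": [5],               # wired headset: TA
    "104b": [50],              # wireless headset: TB
    "104c": [200],             # wireless headset: TC
    "104d": [80, 40],          # tablet 164 then wired headset 104d: TD1 + TD2
    "104e": [60, 90, 120],     # presentation system, phone 166, headset 104e: TE1 + TE2 + TE3
}

total_delay_ms = {headset: sum(hops) for headset, hops in paths_ms.items()}
longest_ms = max(total_delay_ms.values())       # 270 ms for headset 104e here

video_hold_back_ms = longest_ms
audio_hold_back_ms = {hs: longest_ms - t for hs, t in total_delay_ms.items()}
print(video_hold_back_ms, audio_hold_back_ms)
```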
- It should be emphasized that the above-described embodiments of the audio synchronism system 100 are merely possible examples of implementations of the invention. Many variations and modifications may be made to the above-described embodiments. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.
Claims (20)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/534,650 US9338391B1 (en) | 2014-11-06 | 2014-11-06 | Apparatus, systems and methods for synchronization of multiple headsets |
US15/148,888 US9998703B2 (en) | 2014-11-06 | 2016-05-06 | Apparatus, systems and methods for synchronization of multiple headsets |
US16/002,483 US10178345B2 (en) | 2014-11-06 | 2018-06-07 | Apparatus, systems and methods for synchronization of multiple headsets |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/534,650 US9338391B1 (en) | 2014-11-06 | 2014-11-06 | Apparatus, systems and methods for synchronization of multiple headsets |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/148,888 Continuation US9998703B2 (en) | 2014-11-06 | 2016-05-06 | Apparatus, systems and methods for synchronization of multiple headsets |
Publications (2)
Publication Number | Publication Date |
---|---|
US9338391B1 US9338391B1 (en) | 2016-05-10 |
US20160134833A1 true US20160134833A1 (en) | 2016-05-12 |
Family
ID=55860164
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/534,650 Active 2034-11-20 US9338391B1 (en) | 2014-11-06 | 2014-11-06 | Apparatus, systems and methods for synchronization of multiple headsets |
US15/148,888 Active US9998703B2 (en) | 2014-11-06 | 2016-05-06 | Apparatus, systems and methods for synchronization of multiple headsets |
US16/002,483 Active US10178345B2 (en) | 2014-11-06 | 2018-06-07 | Apparatus, systems and methods for synchronization of multiple headsets |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/148,888 Active US9998703B2 (en) | 2014-11-06 | 2016-05-06 | Apparatus, systems and methods for synchronization of multiple headsets |
US16/002,483 Active US10178345B2 (en) | 2014-11-06 | 2018-06-07 | Apparatus, systems and methods for synchronization of multiple headsets |
Country Status (1)
Country | Link |
---|---|
US (3) | US9338391B1 (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160191592A1 (en) * | 2014-12-24 | 2016-06-30 | Sonus Networks, Inc. | Methods and apparatus for communicating delay information and minimizing delays |
US20200084342A1 (en) * | 2018-09-12 | 2020-03-12 | Roku, Inc. | Dynamically adjusting video to improve synchronization with audio |
EP3474577B1 (en) * | 2017-10-20 | 2023-08-23 | Google LLC | Bluetooth device, method and computer program for controlling a plurality of wireless audio devices with a bluetooth device |
US12108235B2 (en) | 2021-11-18 | 2024-10-01 | Surround Sync Pty Ltd | Virtual reality headset audio synchronization system |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9668290B1 (en) * | 2016-05-24 | 2017-05-30 | Ronald Snagg | Wireless communication headset system |
US10104471B2 (en) * | 2016-11-30 | 2018-10-16 | Google Llc | Tactile bass response |
US10291964B2 (en) | 2016-12-06 | 2019-05-14 | At&T Intellectual Property I, L.P. | Multimedia broadcast system |
US10362346B2 (en) | 2017-04-20 | 2019-07-23 | Apple Inc. | Simultaneous playback for multiple audience members with different visual and audio needs |
US10897667B2 (en) | 2017-06-08 | 2021-01-19 | Dts, Inc. | Correcting for latency of an audio chain |
US10334358B2 (en) * | 2017-06-08 | 2019-06-25 | Dts, Inc. | Correcting for a latency of a speaker |
CN111357262A (en) | 2018-03-01 | 2020-06-30 | 索尼公司 | Dynamic lip synchronization compensation for a truly wireless bluetooth device |
CN109379619B (en) * | 2018-11-20 | 2021-05-18 | 海信视像科技股份有限公司 | Sound and picture synchronization method and device |
GB201902664D0 (en) * | 2019-02-27 | 2019-04-10 | Oxsight Ltd | Head mountable imaging apparatus and system for assisting a user with reduced vision |
WO2021060578A1 (en) * | 2019-09-25 | 2021-04-01 | 엘지전자 주식회사 | Image display device, lip-sync correction method thereof, and image display system |
KR20220014519A (en) | 2020-07-29 | 2022-02-07 | 삼성전자주식회사 | Electronic device for synchronizing an output time point of content output by external electronic devices and method for the same |
CN116250243A (en) * | 2020-10-16 | 2023-06-09 | 三星电子株式会社 | Method and apparatus for controlling connection of wireless audio output device |
US20230344928A1 (en) * | 2022-04-20 | 2023-10-26 | Plantronics, Inc. | Wireless In-Line Dock for Headsets |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030179317A1 (en) | 2002-03-21 | 2003-09-25 | Sigworth Dwight L. | Personal audio-synchronizing device |
US7668243B2 (en) | 2004-05-18 | 2010-02-23 | Texas Instruments Incorporated | Audio and video clock synchronization in a wireless network |
US7636126B2 (en) | 2005-06-22 | 2009-12-22 | Sony Computer Entertainment Inc. | Delay matching in audio/video systems |
US7634227B2 (en) | 2006-03-29 | 2009-12-15 | Sony Ericsson Mobile Communications Ab | Method and system for controlling audio data playback in an accessory device |
US8102836B2 (en) | 2007-05-23 | 2012-01-24 | Broadcom Corporation | Synchronization of a split audio, video, or other data stream with separate sinks |
KR101416249B1 (en) | 2007-08-01 | 2014-07-07 | 삼성전자 주식회사 | Signal processing apparatus and control method thereof |
US8743284B2 (en) | 2007-10-08 | 2014-06-03 | Motorola Mobility Llc | Synchronizing remote audio with fixed video |
KR101450100B1 (en) | 2007-11-22 | 2014-10-15 | 삼성전자주식회사 | Multimedia apparatus and synchronization method thereof |
AU2008291065A1 (en) | 2007-12-19 | 2009-07-09 | Interactivetv Pty Limited | Device and method for synchronisation of digital video and audio streams to media presentation devices |
KR20100124909A (en) | 2009-05-20 | 2010-11-30 | 삼성전자주식회사 | Apparatus and method for synchronization between video and audio in mobile communication terminal |
US8505054B1 (en) | 2009-12-18 | 2013-08-06 | Joseph F. Kirley | System, device, and method for distributing audio signals for an audio/video presentation |
US9013632B2 (en) | 2010-07-08 | 2015-04-21 | Echostar Broadcasting Corporation | Apparatus, systems and methods for user controlled synchronization of presented video and audio streams |
US8665320B2 (en) | 2010-07-26 | 2014-03-04 | Echo Star Technologies L.L.C. | Method and apparatus for automatic synchronization of audio and video signals |
WO2012054872A2 (en) | 2010-10-22 | 2012-04-26 | Phorus Llc | Media distribution architecture |
US20120200774A1 (en) | 2011-02-07 | 2012-08-09 | Ehlers Sr Gregory Allen | Audio and video distribution system with latency delay compensator |
US8441577B2 (en) * | 2011-02-08 | 2013-05-14 | Echostar Technologies L.L.C. | Apparatus, systems and methods for synchronization of a video stream and an audio stream |
TWI501673B (en) | 2011-02-16 | 2015-09-21 | Amtran Technology Co Ltd | Method of synchronized playing video and audio data and system thereof |
JP5284451B2 (en) | 2011-11-30 | 2013-09-11 | 株式会社東芝 | Electronic device and audio output method |
US11178489B2 (en) | 2012-02-02 | 2021-11-16 | Arris Enterprises Llc | Audio control module |
US20130232282A1 (en) | 2012-03-05 | 2013-09-05 | Jihwan Kim | Electronic device and method of controlling the same |
RU2015111194A (en) | 2012-08-28 | 2016-10-20 | Конинклейке Филипс Н.В. | AUDIO TRANSMISSION DEVICE AND RELATED METHOD |
US8925003B2 (en) | 2013-03-08 | 2014-12-30 | Silicon Image, Inc. | Mechanism for facilitating synchronization of audio and video between multiple media devices |
- 2014-11-06 US US14/534,650 patent/US9338391B1/en active Active
- 2016-05-06 US US15/148,888 patent/US9998703B2/en active Active
- 2018-06-07 US US16/002,483 patent/US10178345B2/en active Active
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160191592A1 (en) * | 2014-12-24 | 2016-06-30 | Sonus Networks, Inc. | Methods and apparatus for communicating delay information and minimizing delays |
US10484447B2 (en) * | 2014-12-24 | 2019-11-19 | Ribbon Communications Operating Company, Inc. | Methods and apparatus for communicating delay information and minimizing delays |
US11115453B2 (en) * | 2014-12-24 | 2021-09-07 | Ribbon Communications Operating Company, Inc. | Methods and apparatus for communicating delay information and minimizing delays |
EP3474577B1 (en) * | 2017-10-20 | 2023-08-23 | Google LLC | Bluetooth device, method and computer program for controlling a plurality of wireless audio devices with a bluetooth device |
US11800284B2 (en) | 2017-10-20 | 2023-10-24 | Google Llc | Bluetooth device and method for controlling a plurality of wireless audio devices with a Bluetooth device |
US20200084342A1 (en) * | 2018-09-12 | 2020-03-12 | Roku, Inc. | Dynamically adjusting video to improve synchronization with audio |
US10834296B2 (en) * | 2018-09-12 | 2020-11-10 | Roku, Inc. | Dynamically adjusting video to improve synchronization with audio |
US12108235B2 (en) | 2021-11-18 | 2024-10-01 | Surround Sync Pty Ltd | Virtual reality headset audio synchronization system |
Also Published As
Publication number | Publication date |
---|---|
US9338391B1 (en) | 2016-05-10 |
US20160255302A1 (en) | 2016-09-01 |
US20180288365A1 (en) | 2018-10-04 |
US10178345B2 (en) | 2019-01-08 |
US9998703B2 (en) | 2018-06-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10178345B2 (en) | Apparatus, systems and methods for synchronization of multiple headsets | |
EP2599296B1 (en) | Methods and apparatus for automatic synchronization of audio and video signals | |
US11606596B2 (en) | Methods, systems, and media for synchronizing audio and video content on multiple media devices | |
EP2750404B1 (en) | Audio level based closed-captioning control | |
US8434006B2 (en) | Systems and methods for adjusting volume of combined audio channels | |
WO2016094130A1 (en) | Methods, devices and systems for audiovisual synchronization with multiple output devices | |
US8441577B2 (en) | Apparatus, systems and methods for synchronization of a video stream and an audio stream | |
GB2458727A (en) | Delay of audiovisual (AV) signal component for synchronisation with wireless transmission | |
US11431880B2 (en) | Method and device for automatically adjusting synchronization of sound and picture of TV, and storage medium | |
CN113077799A (en) | Decoder arrangement with two audio links | |
US11528389B2 (en) | Method and system for synchronizing playback of independent audio and video streams through a network | |
US11140484B2 (en) | Terminal, audio cooperative reproduction system, and content display apparatus | |
KR102598367B1 (en) | Method and corresponding device for audio detection | |
US10209952B2 (en) | Content reproduction device, content reproduction system, and control method for a content reproduction device | |
KR102279395B1 (en) | Digital Broadcasting Output System, Set-Top Box and Digital Broadcasting Signal Output Method, Mobile and Sound Signal Output Method | |
JP5735360B2 (en) | Information terminal, information device, and system comprising these | |
JP2009088627A (en) | Viewing and listening device, parameter management means, viewing and listening system | |
KR20090028155A (en) | Device and system for near field communication based on zone and method thereof | |
KR20150055863A (en) | Display device and operating method thereof | |
JP2008042516A (en) | Video content display system and video display device | |
KR20070092338A (en) | Wireless video system and method of processing a signal in the wireless video system | |
KR20150091662A (en) | Method and system for playing digital contents | |
JP2016208166A (en) | Signal processing device | |
TW200840351A (en) | Method and system for controlling volume settings for multimedia devices |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: ECHOSTAR TECHNOLOGIES L.L.C., COLORADO Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GREENE, GREGORY;INNES, DAVID;REEL/FRAME:034121/0034 Effective date: 20141105 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: DISH TECHNOLOGIES L.L.C., COLORADO Free format text: CONVERSION;ASSIGNOR:ECHOSTAR TECHNOLOGIES L.L.C.;REEL/FRAME:046737/0610 Effective date: 20180201 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
AS | Assignment |
Owner name: U.S. BANK, NATIONAL ASSOCIATION, AS COLLATERAL AGENT, MINNESOTA Free format text: SECURITY INTEREST;ASSIGNORS:DISH BROADCASTING CORPORATION;DISH NETWORK L.L.C.;DISH TECHNOLOGIES L.L.C.;REEL/FRAME:058295/0293 Effective date: 20211126 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |