US20210400168A1 - Synchronization of wireless-audio to video - Google Patents
- Publication number
- US20210400168A1 (application US16/905,166)
- Authority
- US
- United States
- Prior art keywords
- wireless audio
- audio
- content
- playback
- wireless
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H04N21/43076—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen, of the same content streams on multiple devices, e.g. when family members are watching the same movie on different devices
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- H04L65/61—Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio
- H04L65/65—Network streaming protocols, e.g. real-time transport protocol [RTP] or real-time control protocol [RTCP]
- H04L65/764—Media network packet handling at the destination
- H04L65/80—Responding to QoS
- H04N21/42684—Client identification by a unique number or address, e.g. serial number, MAC address, socket ID
- H04N21/4307—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
- H04N21/43637—Adapting the video stream to a specific local network, e.g. a Bluetooth® network, involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]
- H04N21/8547—Content authoring involving timestamps for synchronizing content
- H04N5/04—Synchronising (details of television systems)
- H04W80/02—Data link layer protocols (wireless network protocols)
Definitions
- AV: audio-to-video
- Various embodiments of the disclosure relate to an electronic apparatus and method for synchronization of wireless audio to video.
- The AV synchronization (or lip sync) issue is a well-known problem associated with media transmission and playback.
- The lip sync error is measured as the amount of time by which the audio of AV content lags behind or leads the video of the AV content.
- The issue is especially pronounced with wireless audio, such as Bluetooth® audio, because of the additional processing and transmission delays that wireless audio introduces.
- An electronic apparatus and method for synchronization of wireless audio to video is provided substantially as shown in, and/or described in connection with, at least one of the figures, as set forth more completely in the claims.
- FIG. 1 is a block diagram that illustrates an exemplary network environment for synchronization of wireless audio to video, in accordance with an embodiment of the disclosure.
- FIG. 2 is a block diagram that illustrates an exemplary electronic apparatus for synchronization of wireless audio to video, in accordance with an embodiment of the disclosure.
- FIG. 3 is a diagram that illustrates exemplary operations for synchronization of wireless audio to video, in accordance with an embodiment of the disclosure.
- FIG. 4 is a diagram that illustrates exemplary operations for synchronization of wireless audio to video, in accordance with an embodiment of the disclosure.
- FIG. 5 is a flowchart that illustrates an exemplary method for synchronization of wireless audio to video, in accordance with an embodiment of the disclosure.
- Exemplary aspects of the disclosure provide an electronic apparatus which may determine a wireless audio processing delay associated with the electronic apparatus (i.e. audio source).
- the wireless audio processing delay may include a duration by which the playback of the video content may lead or lag behind playback of the audio content on the wireless audio device.
- the electronic apparatus may transmit the audio content to the wireless audio device for playback and may control the playback of the video content on the display device such that playback of the video content is time-synchronized with the playback of the transmitted audio content on the wireless audio device.
- the playback of the video content may be controlled based on the determined wireless audio processing delay.
- the electronic apparatus may use the determined wireless audio processing delay to delay the playback of the video content on the display device.
- the delayed playback of the video content may be timed to match the delayed playback of the audio content on the wireless audio device. This may mitigate the lip sync (or AV synchronization) issue and enhance the listening experience of the user on the wireless audio device.
- the user may be able to hear the audio content on the wireless audio device at the same time when the video content is displayed on the display device.
- FIG. 1 is a block diagram that illustrates an exemplary network environment for synchronization of wireless audio to video, in accordance with an embodiment of the disclosure.
- the network environment 100 may include an electronic apparatus 102 , an audio/video (AV) source 104 , a display device 106 , and a wireless audio device 108 .
- the electronic apparatus 102 may be coupled to the wireless audio device 108 , via a wireless network 110 .
- the network environment 100 may further include a user 112 who may be associated with the wireless audio device 108 .
- Although the electronic apparatus 102 and the display device 106 are shown as two separate devices, in some embodiments, the entire functionality of the display device 106 may be incorporated in the electronic apparatus 102 , without a deviation from the scope of the disclosure.
- the electronic apparatus 102 may include suitable logic, circuitry, interfaces, and/or code that may be configured to receive media content from the AV source 104 and control playback of the received media content via the display device 106 and one or more audio devices communicatively coupled to the electronic apparatus 102 .
- the electronic apparatus 102 may be a display-enabled media player and the display device 106 may be included in the electronic apparatus 102 .
- Examples of such an implementation of the electronic apparatus 102 may include, but are not limited to, a television (TV), an Internet-Protocol TV (IPTV), a smart TV, a smartphone, a personal computer, a laptop, a tablet, a wearable electronic device, or any other display device with a capability to receive, decode, and play content encapsulated in broadcasting signals from cable or satellite networks, over-the-air broadcast, or internet-based communication signals.
- the electronic apparatus 102 may be a media player that may communicate with the display device 106 , via a wired or a wireless connection.
- Examples of such an implementation of the electronic apparatus 102 may include, but are not limited to, a digital media player (DMP), a micro-console, a TV tuner, an Advanced Television Systems Committee (ATSC) 3.0 tuner, a set-top-box, an Over-the-Top (OTT) player, a digital media streamer, a media extender/regulator, a digital media hub, a computer workstation, a mainframe computer, a handheld computer, a smart appliance, a plug-in device and/or any other computing device with content streaming and playback functionality.
- the AV source 104 may include suitable logic, circuitry, and interfaces that may be configured to transmit the media content to the electronic apparatus 102 .
- the media content on the AV source 104 may include audio content and video content associated with the audio content.
- the audio content may include a background audio, actor voice or speech, and other audio components, such as audio description.
- the AV source 104 may be implemented as a storage device which stores the media content. Examples of such an implementation of the AV source 104 may include, but are not limited to, a Pen Drive, a Flash USB Stick, a Hard Disk Drive (HDD), a Solid-State Drive (SSD), and/or a Secure Digital (SD) card.
- the AV source 104 may be implemented as a media streaming server, which may transmit the media content to the electronic apparatus 102 , via a communication network (not shown).
- the AV source 104 may be a TV tuner, such as an ATSC tuner, which may receive digital TV (DTV) signals from an over-the-air broadcast network and may extract the media content from the received DTV signals. Thereafter, the AV source 104 may transmit the extracted media content to the electronic apparatus 102 .
- the AV source 104 and the electronic apparatus 102 are shown as two separate devices. However, the present disclosure may not be so limiting and in some embodiments, the functionality of the AV source 104 may be incorporated in its entirety or at least partially in the electronic apparatus 102 , without departing from the scope of the present disclosure.
- the display device 106 may include suitable logic, circuitry, and/or interfaces that may be configured to display the video content, which may be received from the electronic apparatus 102 .
- the display device 106 may be a touch screen which may enable the user 112 to provide a user-input via the display device 106 .
- the display device 106 may be realized through several known technologies such as, but not limited to, at least one of a Liquid Crystal Display (LCD) display, a Light Emitting Diode (LED) display, a plasma display, or an Organic LED (OLED) display technology, or other display devices.
- the display device 106 may refer to a display screen of a head mounted device (HMD), a smart-glass device, a see-through display, a projection-based display, an electro-chromic display, or a transparent display.
- the wireless audio device 108 may include suitable logic, circuitry, and/or interfaces that may be configured to receive the audio content from the electronic apparatus 102 .
- the wireless audio device 108 may be a portable wireless speaker, a wearable audio device, or a head-mounted audio device. Examples of the wireless audio device 108 may include, but are not limited to, a wireless speaker of a surround sound system, an over-head headphone, an in-ear headphone, a clip-on headphone, a bone-conduction headphone, a hearing aid, smart glasses, or a head-mounted display (for example, an Augmented Reality (AR) headset, Mixed Reality (MR) headset, or Virtual Reality (VR) goggles).
- the wireless audio device 108 may rely on a wireless communication protocol, such as Wi-Fi, Bluetooth®, or Bluetooth® Low Energy (BLE) to receive the audio content from the electronic apparatus 102 .
- the wireless network 110 may include a medium through which two or more wireless devices may communicate with each other.
- the wireless network 110 may be established between the electronic apparatus 102 and the wireless audio device 108 and may allow the electronic apparatus 102 and the wireless audio device 108 to communicate with each other.
- in case there are two or more wireless audio devices, each of them may pair up and communicate with the electronic apparatus 102 via the wireless network 110 .
- Examples of wireless network protocols may include, but are not limited to, Radio Frequency Identification (RFID), Wireless USB, Near Field Communication (NFC) (e.g., NFC Peer-to-Peer), Bluetooth™, Bluetooth Low Energy (BLE™), ZigBee, Personal Area Network (PAN), Wi-Max, a cellular network, a Long-Term Evolution (LTE) network, or Evolved High Speed Packet Access (HSPA+), as well as protocols based on IEEE 802 wireless standards such as 802.3, 802.15.1, 802.16 (wireless local loop), 802.20 (Mobile Broadband Wireless Access (MBWA)), 802.11-1997 (legacy version), 802.15.4, 802.11a, 802.11b, 802.11g, 802.11e, 802.11i, 802.11f, 802.11c, 802.11h (specific to European regulations), 802.11n, 802.11j (specific to Japanese regulations), 802.11p, 802.11ac, 802.11ad, and 802.11ah.
- the electronic apparatus 102 may determine the first wireless audio processing delay associated with the electronic apparatus 102 .
- the first wireless audio processing delay may be determined based on a first delay profile stored in a memory (as shown in FIG. 2 ) of the electronic apparatus 102 .
- the memory may store the first delay profile in the form of a look-up table, which may include a list of wireless audio processing delays associated with various models or variants (each identified by a unique identifier) of the electronic apparatus 102 .
- the look-up table may also include a Media Access Control (MAC) address associated with the electronic apparatus 102 .
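The model-keyed delay-profile lookup described above can be sketched as a small table. A minimal sketch, assuming a Python dictionary as the look-up table; the model identifiers and delay values below are hypothetical, not taken from the patent:

```python
# Hypothetical first-delay profile: maps a device model ID to the
# source-side wireless audio processing delay in milliseconds.
# All model names and delay values are illustrative assumptions.
FIRST_DELAY_PROFILE_MS = {
    "MODEL-A100": 180,
    "MODEL-B200": 205,
}

def first_wireless_audio_delay_ms(model_id: str, default_ms: int = 200) -> int:
    """Look up the source-side wireless audio processing delay for a model.

    Falls back to an assumed conservative default when the model is unknown.
    """
    return FIRST_DELAY_PROFILE_MS.get(model_id, default_ms)

print(first_wireless_audio_delay_ms("MODEL-A100"))  # known model
print(first_wireless_audio_delay_ms("MODEL-X999"))  # unknown model, default
```

In practice the table could equally be keyed by the MAC address mentioned above; the dictionary form simply makes the lookup explicit.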
- the first wireless audio processing delay may be associated with an audio processing pipeline of the electronic apparatus 102 .
- the audio processing pipeline may include various audio processing operations which may be executed on the electronic apparatus 102 before the audio content is wirelessly transmitted to the wireless audio device 108 . Examples of such audio processing operations may include, but are not limited to, audio frame buffering, sample rate conversions, file format conversions, or audio transcoding.
- the first wireless audio processing delay may depend on a size of each audio buffer and a number of times audio frames of the audio content are stored in and retrieved from each audio buffer. A larger audio buffer may introduce a greater latency between the time an audio sample is written into it and the time that sample is read out for the next operation.
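The buffering contribution to this delay can be estimated from the buffer size and the audio format. A minimal sketch, assuming 16-bit stereo PCM; the 16 KiB buffer size is an illustrative assumption, not a figure from the patent:

```python
def buffer_latency_ms(buffer_bytes: int, sample_rate_hz: int,
                      channels: int = 2, bytes_per_sample: int = 2) -> float:
    """Time to fill (and hence drain) one audio buffer, in milliseconds."""
    bytes_per_second = sample_rate_hz * channels * bytes_per_sample
    return 1000.0 * buffer_bytes / bytes_per_second

# A 16 KiB buffer of 48 kHz / 16-bit / stereo PCM holds roughly 85 ms of
# audio, so a pipeline with several such buffers accumulates latency quickly.
print(round(buffer_latency_ms(16384, 48000), 1))
```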
- the audio processing pipeline may also include an operation to convert the audio content into a suitable codec format. For example, if the audio content needs to be transferred as Bluetooth® audio, then the audio content may need to be converted into a suitable codec format to conform with the Bluetooth® standard for audio.
- the first wireless audio processing delay may include a latency which may be incurred as a result of the conversion to the suitable codec format for the wireless audio.
- the conversion to the suitable codec format may use a suitable audio codec.
- an audio codec for Bluetooth® audio may include, but are not limited to, LDACTM, AptXTM, AptXTM Adaptive, Low latency and High-Definition audio Codec (LHDCTM), Low Complexity Communication Codec (LC3), and low-complexity sub-band codec (SBC).
- the latency included in the first wireless audio processing delay may vary depending on the type of audio codec used for the conversion.
- the first wireless audio processing delay may also include a latency associated with a wireless audio transfer operation which may be executed by a wireless data transfer hardware or circuitry of the electronic apparatus 102 .
- For example, for Bluetooth® audio transfer, the audio samples of the audio content may be queued, packaged, and sent over the Bluetooth® audio transfer hardware.
- the Bluetooth audio transfer hardware may packetize and transmit the packetized audio content over the wireless network 110 .
- the latency associated with the Bluetooth® audio transfer may be included in the first wireless audio processing delay.
- the first wireless audio processing delay may include a 100-millisecond latency for the audio processing operations of the audio processing pipeline, another 2-5 milliseconds of latency for conversion to the suitable codec format, and another 100-millisecond latency for the Bluetooth® audio transfer.
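Using the example figures above, the components of the source-side delay simply add up. A sketch of the arithmetic; the 3 ms codec figure is one assumed point within the 2-5 ms range stated above:

```python
pipeline_ms = 100  # audio processing pipeline (buffering, conversions)
codec_ms = 3       # codec conversion; assumed value in the 2-5 ms range
transfer_ms = 100  # Bluetooth audio transfer

# Total source-side (first) wireless audio processing delay.
first_delay_ms = pipeline_ms + codec_ms + transfer_ms
print(first_delay_ms)  # 203
```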
- the electronic apparatus 102 may also determine a second wireless audio processing delay associated with the wireless audio device 108 .
- the electronic apparatus 102 may first determine a MAC address of the wireless audio device 108 and then determine the second wireless audio processing delay associated with the wireless audio device 108 based on the determined MAC address.
- the second wireless audio processing delay may be determined based on a second delay profile stored in the memory of the electronic apparatus 102 .
- the memory may also store the second delay profile in the form of a look-up table.
- the look-up table may include wireless audio processing delays associated with different models or variants of the wireless audio device 108 .
- An example of such a look-up table is provided in Table 2.
- the second wireless audio processing delay may be associated with a latency caused by audio processing operations on the wireless audio device 108 .
- audio processing operations may include, but are not limited to, audio packet/frame buffering, audio decoding or decryption, or other audio effects, such as audio equalization.
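The MAC-keyed lookup of the sink-side delay can be sketched in the same way as the first delay profile. The MAC addresses and delay values below are hypothetical; a real implementation might also key on an OUI prefix or a model identifier instead:

```python
# Hypothetical second-delay profile keyed by the sink device's MAC address.
# All addresses and millisecond values are illustrative assumptions.
SECOND_DELAY_PROFILE_MS = {
    "AA:BB:CC:11:22:33": 120,  # e.g. an over-ear headphone model
    "AA:BB:CC:44:55:66": 150,  # e.g. a portable wireless speaker model
}

def second_wireless_audio_delay_ms(mac: str, default_ms: int = 130) -> int:
    """Look up the sink-side processing delay for a paired audio device.

    Normalizes the MAC to upper case; unknown devices get an assumed default.
    """
    return SECOND_DELAY_PROFILE_MS.get(mac.upper(), default_ms)

print(second_wireless_audio_delay_ms("aa:bb:cc:11:22:33"))
```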
- the electronic apparatus 102 may control the display device 106 to display a test video. Also, the electronic apparatus 102 may transmit a test audio associated with the test video to the wireless audio device 108 .
- the electronic apparatus 102 may receive, via the wireless audio device 108 , a user input which may include a duration by which the playback of the test video is to be delayed to match a time of the playback of the transmitted test audio on the wireless audio device 108 .
- the user input may be received in the form of a touch input, a hand gesture, a head gesture, a voice input, and the like.
- the electronic apparatus 102 may determine the first wireless audio processing delay as the duration included in the user input.
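The test-video calibration above can be reduced to a small function that turns the user-reported offsets into a delay value. Averaging several trials is an assumption on my part to smooth reaction-time noise; the patent only states that the reported duration is used:

```python
def calibrate_first_delay(user_reported_offsets_ms: list[int]) -> int:
    """Derive the first wireless audio processing delay from user input.

    Each entry is a duration (ms) by which the user judged the test video
    should be delayed to match the test audio on the wireless device.
    Averaging repeated trials is an assumed refinement, not from the patent.
    """
    return round(sum(user_reported_offsets_ms) / len(user_reported_offsets_ms))

print(calibrate_first_delay([190, 210, 200]))  # 200
```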
- the electronic apparatus 102 may receive the media content from the AV source 104 .
- the media content may include, for example, audio content, video content associated with the audio content, and other information, such as subtitles and closed captions.
- the electronic apparatus 102 may transmit the audio content to the wireless audio device 108 via the wireless network 110 .
- the electronic apparatus 102 may control the playback of the video content on the display device 106 based on the determined first wireless audio processing delay such that the playback of the video content is time-synchronized with playback of the audio content on the wireless audio device 108 .
- the playback of the video content may be delayed by at least the determined first wireless audio processing delay to match a time of the playback of the audio content on the wireless audio device 108 .
- the electronic apparatus 102 may control the playback of the video content on the display device 106 further based on the determined second audio processing delay.
- the playback of the video content may be delayed by a time which may equal the sum of the determined first wireless audio processing delay and the determined second wireless audio processing delay.
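Putting the two delays together, the playback controller can simply offset each video frame's presentation time by their sum. A minimal sketch with illustrative frame timestamps (33 ms apart, roughly 30 fps) and the example delay figures used earlier:

```python
def delayed_video_schedule(frame_pts_ms, first_delay_ms, second_delay_ms):
    """Shift every video presentation timestamp by the total audio delay,
    so video rendering lines up with the (later) audio playback on the sink."""
    total_delay_ms = first_delay_ms + second_delay_ms
    return [pts + total_delay_ms for pts in frame_pts_ms]

# Frames originally due at 0, 33 and 66 ms are shown 203 + 130 = 333 ms later.
print(delayed_video_schedule([0, 33, 66], 203, 130))  # [333, 366, 399]
```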
- the user 112 may be able to listen to the audio content on the wireless audio device 108 and watch the video content on the display device 106 without any noticeable lip sync issue.
- FIG. 2 is a block diagram that illustrates an exemplary electronic apparatus for synchronization of wireless audio to video, in accordance with an embodiment of the disclosure.
- FIG. 2 is explained in conjunction with elements from FIG. 1 .
- the electronic apparatus 102 may include circuitry 202 , a memory 204 , an AV source 206 , a buffer memory 208 , and a network interface 210 .
- the circuitry 202 may be communicatively coupled to the wireless audio device 108 , the AV source 206 , the buffer memory 208 , and the network interface 210 .
- the AV source 206 is an exemplary implementation of the AV source 104 of FIG. 1 .
- the circuitry 202 may include suitable logic, circuitry, and/or interfaces that may be configured to execute program instructions associated with different operations to be executed by the electronic apparatus 102 .
- the circuitry 202 may include one or more specialized processing units, which may be implemented as an integrated processor or a cluster of processors that perform the functions of the one or more specialized processing units, collectively.
- the circuitry 202 may be implemented based on a number of processor technologies known in the art.
- Examples of implementations of the circuitry 202 may be an x86-based processor, a Graphics Processing Unit (GPU), a Reduced Instruction Set Computing (RISC) processor, an Application-Specific Integrated Circuit (ASIC) processor, a Complex Instruction Set Computing (CISC) processor, a microcontroller, a central processing unit (CPU), and/or other computing circuits.
- the memory 204 may include suitable logic, circuitry, and/or interfaces that may be configured to store program instructions to be executed by the circuitry 202 .
- the memory 204 may be configured to store the first delay profile for the electronic apparatus 102 .
- the first delay profile may include the first wireless audio processing delay associated with the electronic apparatus 102 .
- the first delay profile may include a model name or an identifier (ID) of the electronic apparatus 102 and/or a Media Access Control (MAC) address associated with the electronic apparatus 102 .
- the memory 204 may be also configured to store the second delay profile for the wireless audio device 108 .
- the second delay profile may include a second wireless audio processing delay associated with the wireless audio device 108 .
- the second delay profile may include a model name or an identifier of the wireless audio device 108 and/or a MAC address associated with the wireless audio device 108 .
- the memory 204 may be further configured to store the media content including the video content and the audio content associated with the video content. Examples of implementation of the memory 204 may include, but are not limited to, Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Hard Disk Drive (HDD), a Solid-State Drive (SSD), a CPU cache, and/or a Secure Digital (SD) card.
- the AV source 206 may include suitable logic, circuitry, and/or interfaces that may be configured to output compressed media which includes compressed audio data and compressed video data.
- the AV source 206 may further include a memory for storage of the compressed media.
- the AV source 206 may receive the compressed media through various content delivery systems, such as terrestrial content broadcasting networks, satellite-based broadcasting networks, Internet Protocol (IP) based content networks, or a combination thereof.
- the buffer memory 208 may include suitable logic, circuitry, and interfaces that may be configured to temporarily store data to be transmitted to or received from the electronic apparatus 102 .
- the memory 204 may be configured to instantaneously allocate the buffer memory 208 when data is required to be stored.
- the size of the allocation (in Kilobytes or Megabytes) may depend on a content bandwidth (in Mbps) and whether compressed or uncompressed data is required to be stored in the buffer memory 208 .
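The allocation size can be derived from the content bandwidth and how much buffered playback time is needed. A sketch of that arithmetic; the 4 Mbps bandwidth and 0.5 s buffering window are assumed figures, not from the patent:

```python
def buffer_allocation_kb(bandwidth_mbps: float, buffered_seconds: float) -> float:
    """Buffer size (KB) needed to hold `buffered_seconds` of a stream
    arriving at `bandwidth_mbps` (megabits per second)."""
    bytes_needed = bandwidth_mbps * 1_000_000 / 8 * buffered_seconds
    return bytes_needed / 1000

# A 4 Mbps compressed stream buffered for 0.5 s needs about 250 KB;
# uncompressed data at the same duration would need proportionally more.
print(buffer_allocation_kb(4, 0.5))  # 250.0
```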
- the buffer memory 208 may be a memory module which may be separate from the memory 204 .
- Example implementations of such a memory module may include, but are not limited to, Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Hard Disk Drive (HDD), a Solid-State Drive (SSD), and/or a Secure Digital (SD) card.
- the network interface 210 may include suitable logic, circuitry, interfaces, and/or code that may be configured to facilitate communication between the circuitry 202 and the wireless audio device 108 , via the wireless network 110 .
- the network interface 210 may be implemented by use of various known technologies to support wireless communication of the electronic apparatus 102 via the wireless network 110 .
- the network interface 210 may include, for example, an antenna, a radio frequency (RF) transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a coder-decoder (CODEC) chipset, a subscriber identity module (SIM) card, a local buffer circuitry, and the like.
- the network interface 210 may be configured to communicate via wireless communication with networks, such as the Internet, an Intranet, a wireless network, a cellular telephone network, a wireless local area network (LAN), or a metropolitan area network (MAN).
- the wireless communication may be configured to use one or more of a plurality of communication standards, protocols and technologies, such as Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), wideband code division multiple access (W-CDMA), Long Term Evolution (LTE), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (such as IEEE 802.11a, IEEE 802.11b, IEEE 802.11g or IEEE 802.11n), voice over Internet Protocol (VoIP), light fidelity (Li-Fi), or Worldwide Interoperability for Microwave Access (Wi-MAX).
- the functions or operations executed by the electronic apparatus 102 may be performed by the circuitry 202 . Operations executed by the circuitry 202 are described in detail, for example, in FIG. 3 and FIG. 4 .
- FIG. 3 is a diagram that illustrates exemplary operations for synchronization of wireless audio to video, in accordance with an embodiment of the disclosure.
- FIG. 3 is explained in conjunction with elements from FIG. 1 and FIG. 2 .
- a block diagram 300 that illustrates exemplary operations from 304 to 308 for synchronization of wireless audio to video.
- the exemplary operations may be executed by any computing system, for example, by the electronic apparatus 102 of FIG. 1 or by the circuitry 202 of FIG. 2 .
- the AV source 302 may be coupled to the buffer memory 208 of FIG. 2 .
- the AV source 302 is an exemplary implementation of the AV source 206 of FIG. 2 or the AV source 104 of FIG. 1 .
- the description of the AV source 302 is omitted from the disclosure for the sake of brevity.
- the AV source 302 may store the compressed media which includes compressed audio data 302 A and compressed video data 302 B associated with the compressed audio data 302 A.
- the compressed audio data 302 A may be decoded.
- the circuitry 202 may decode the compressed audio data 302 A to output uncompressed audio, which may be referred to as the audio content.
- the compressed audio data 302 A may be decoded and converted to a suitable codec format supported by an audio codec for wireless audio transfer. Thereafter, the circuitry 202 may transmit the audio content to the wireless audio device 108 , via the wireless network 110 .
- the decoding of the compressed audio data 302 A, the conversion to the suitable codec format, and the wireless transmission of the audio content may incur a delay in the wireless audio processing pipeline of the electronic apparatus 102 .
- the delay may equal (or may approximate) the determined first wireless audio processing delay (as described in FIG. 1 ) associated with the electronic apparatus 102 .
- decoding of the compressed video data 302 B may be delayed, as described at 306 .
- the compressed video data 302 B may be stored in the buffer memory 208 .
- the circuitry 202 may store the compressed video data 302 B in the buffer memory 208 for a holding duration while the compressed audio data 302 A is decoded, converted, and transmitted to the wireless audio device 108 (at 304 ).
- the holding duration may refer to a time duration for which the compressed video data 302 B may be stored in the buffer memory 208 to delay the playback of uncompressed video data (i.e. video content) on the display device 106 and to match a time of the playback of the transmitted audio content to the wireless audio device 108 .
- the holding duration may include the determined first wireless audio processing delay and/or any delay associated with movement of the compressed video data 302 B in and out of the buffer memory 208 .
- the holding duration may also include the second wireless audio processing delay (as described in FIG. 1 ) associated with the wireless audio device 108 .
- the compressed video data 302 B may be decoded.
- the circuitry 202 may extract the compressed video data 302 B from the buffer memory 208 after the holding duration. After the extraction, the circuitry 202 may decode the compressed video data 302 B to output uncompressed video, which may be referred to as the video content. After the compressed video data 302 B is decoded, the circuitry 202 may transfer the video content to the display device 106 and may control the playback of the video content on the display device 106 .
- the playback of the video content may be delayed to match the time of the playback of the audio content on the wireless audio device 108 . This may allow the electronic apparatus 102 to remove the lip sync error typically associated with the playback of the video content and the playback of the audio content.
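A minimal sketch of the FIG. 3 ordering, assuming hypothetical decode/transmit/display stand-ins (the string-based `decode` and the `events` list are placeholders, not the actual implementation): the audio is decoded and transmitted first, while the video stays compressed in the buffer for the holding duration and is decoded only afterwards.

```python
import time

def decode(data: str) -> str:
    # Hypothetical stand-in for a real audio/video decoder.
    return data.removeprefix("compressed:")

def play_fig3(compressed_audio: str, compressed_video: str,
              holding_duration_s: float, events: list) -> list:
    # Audio path: decode, convert, and transmit first; this path incurs
    # the first wireless audio processing delay.
    audio = decode(compressed_audio)
    events.append(("transmit_audio", audio))

    # Video path: hold the still-compressed video in the buffer memory
    # for the holding duration, then extract, decode, and display it.
    buffered = compressed_video
    time.sleep(holding_duration_s)
    events.append(("display_video", decode(buffered)))
    return events

events = play_fig3("compressed:audio", "compressed:video", 0.01, [])
print(events)  # [('transmit_audio', 'audio'), ('display_video', 'video')]
```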
- FIG. 4 is a diagram that illustrates exemplary operations for synchronization of wireless audio to video, in accordance with an embodiment of the disclosure.
- FIG. 4 is explained in conjunction with elements from FIGS. 1, 2, and 3 .
- a block diagram 400 that illustrates exemplary operations from 404 to 408 for synchronization of wireless audio to video.
- the exemplary operations from 404 to 408 may be executed by any computing system, for example, by the electronic apparatus 102 of FIG. 1 or by the circuitry 202 of FIG. 2 .
- the AV source 402 may be coupled to the buffer memory 208 of FIG. 2 .
- the AV source 402 is an exemplary implementation of the AV source 206 of FIG. 2 or the AV source 104 of FIG. 1 .
- the description of the AV source 402 is omitted from the disclosure for the sake of brevity.
- the AV source 402 may store the compressed media which includes compressed audio data 402 A and compressed video data 402 B associated with the compressed audio data 402 A.
- the compressed audio data 402 A may be decoded.
- the circuitry 202 may decode the compressed audio data 402 A to output uncompressed audio, which may be referred to as the audio content.
- the compressed audio data 402 A may be decoded and converted to a suitable codec format supported by an audio codec for wireless audio transfer. Thereafter, the circuitry 202 may transmit the audio content to the wireless audio device 108 , via the wireless network 110 .
- the decoding of the compressed audio data 402 A, the conversion to the suitable codec format, and the wireless transmission of the audio content may incur a delay in the wireless audio processing pipeline of the electronic apparatus 102 .
- the delay may equal (or may approximate) the determined first wireless audio processing delay (as described in FIG. 1 ) associated with the electronic apparatus 102 .
- the compressed video data 402 B may be first decoded and then stored in a buffer to delay the playback, as described at 406 and onwards.
- the compressed video data 402 B may be decoded.
- the circuitry 202 may decode the compressed video data 402 B to output uncompressed video, which may be referred to as the video content.
- the video content may be stored in the buffer memory 208 after the compressed video data 402 B is decoded.
- the circuitry 202 may store the video content (i.e. the decoded video data) in the buffer memory 208 for a holding duration while the compressed audio data 402 A is decoded, converted, and transmitted to the wireless audio device 108 (at 404 ).
- the holding duration may refer to a time duration for which the video content may be stored in the buffer memory 208 to delay the playback of the video content on the display device 106 and to match a time of the playback of the transmitted audio content to the wireless audio device 108 .
- the holding duration may include the determined first wireless audio processing delay and/or any delay associated with movement of the video content in and out of the buffer memory 208 .
- the holding duration may also include the second wireless audio processing delay (as described in FIG. 1 ) associated with the wireless audio device 108 .
- the circuitry 202 may extract the video content (i.e. the uncompressed video data) from the buffer memory 208 after the holding duration and may transfer the video content to the display device 106 . Thereafter, the circuitry 202 may control the playback of the video content on the display device 106 and match the time of the playback of the audio content on the wireless audio device 108 .
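A minimal sketch of the FIG. 4 variant, with the same hypothetical stand-ins as before: here the video is decoded up front and the uncompressed video content is what sits in the buffer for the holding duration.

```python
import time

def decode(data: str) -> str:
    # Hypothetical stand-in for a real audio/video decoder.
    return data.removeprefix("compressed:")

def play_fig4(compressed_audio: str, compressed_video: str,
              holding_duration_s: float, events: list) -> list:
    # Audio path: identical to the FIG. 3 variant.
    audio = decode(compressed_audio)
    events.append(("transmit_audio", audio))

    # Video path: decode FIRST, then hold the uncompressed video content
    # in the buffer memory for the holding duration before display.
    video = decode(compressed_video)
    time.sleep(holding_duration_s)
    events.append(("display_video", video))
    return events

events = play_fig4("compressed:audio", "compressed:video", 0.01, [])
print(events)  # [('transmit_audio', 'audio'), ('display_video', 'video')]
```

The trade-off between the two variants: buffering uncompressed video (FIG. 4) needs a larger buffer allocation, but no decode work remains between extraction and display, so the extract-to-display latency is smaller than in the FIG. 3 ordering.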
- FIG. 5 is a flowchart that illustrates an exemplary method for synchronization of wireless audio to video, in accordance with an embodiment of the disclosure.
- FIG. 5 is explained in conjunction with elements from FIGS. 1, 2, 3, and 4 .
- In FIG. 5 , there is shown a flowchart 500 .
- the method illustrated in the flowchart 500 may be executed by any computing system, such as by the electronic apparatus 102 or the circuitry 202 .
- the method may start at 502 and proceed to 504 .
- a first wireless audio processing delay may be determined.
- the circuitry 202 may be configured to determine the first wireless audio processing delay associated with the electronic apparatus 102 .
- the media content may be received.
- the circuitry 202 may be configured to receive the media content which includes video content and audio content associated with the video content.
- the audio content may be transmitted to the wireless audio device 108 .
- the circuitry may be configured to transmit the audio content to the wireless audio device 108 .
- a playback of the video content on the display device 106 may be controlled based on the determined first wireless audio processing delay.
- the circuitry 202 may be configured to control the playback of the video content on the display device 106 based on the determined first wireless audio processing delay such that the playback of the video content is time-synchronized with playback of the audio content on the wireless audio device 108 . Control may pass to end.
- Although the flowchart 500 is illustrated as discrete operations, such as 504 , 506 , 508 , and 510 , the disclosure is not so limited. Accordingly, in certain embodiments, such discrete operations may be further divided into additional operations, combined into fewer operations, or eliminated, depending on the particular implementation without detracting from the essence of the disclosed embodiments.
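The four operations of the flowchart (504 through 510) can be sketched as a single sequence; all names and data structures below are hypothetical illustrations, not the apparatus's actual interfaces:

```python
def synchronize_wireless_audio_to_video(delay_profile_ms: dict,
                                        apparatus_id: str,
                                        media: dict) -> dict:
    """Sketch of operations 504-510 of flowchart 500 (names hypothetical)."""
    # 504: determine the first wireless audio processing delay.
    first_delay_ms = delay_profile_ms[apparatus_id]
    # 506: receive media content (video content + associated audio content).
    video, audio = media["video"], media["audio"]
    # 508: transmit the audio content to the wireless audio device.
    transmitted = {"to_wireless_device": audio}
    # 510: control video playback, delayed so that it is time-synchronized
    # with playback of the audio content on the wireless audio device.
    playback = {"video": video, "video_delay_ms": first_delay_ms}
    return {**transmitted, **playback}

result = synchronize_wireless_audio_to_video(
    {"Device A": 250}, "Device A", {"video": "v", "audio": "a"})
print(result["video_delay_ms"])  # 250
```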
- Various embodiments of the disclosure may provide a non-transitory computer readable medium and/or storage medium having stored thereon, instructions executable by a machine and/or a computer to operate an electronic apparatus.
- the instructions may cause the machine and/or computer to perform operations that include determining a first wireless audio processing delay associated with the electronic apparatus.
- the operations may further include receiving media content comprising video content and audio content associated with the video content.
- the operations may further include transmitting the audio content to a wireless audio device communicatively coupled to the electronic apparatus.
- the operations may further include controlling playback of the video content on a display device associated with the electronic apparatus based on the determined first wireless audio processing delay such that the playback of the video content is time-synchronized with playback of the audio content on the wireless audio device.
- Exemplary aspects of the disclosure may provide an electronic apparatus (such as the electronic apparatus 102 of FIG. 1 ) that includes circuitry (such as the circuitry 202 ) that may be communicatively coupled to a display device (such as the display device 106 ) and a wireless audio device (such as the wireless audio device 108 ).
- the circuitry may be configured to determine a first wireless audio processing delay associated with the electronic apparatus.
- the circuitry may be further configured to receive media content comprising video content and audio content associated with the video content.
- the circuitry may be further configured to transmit the audio content to the wireless audio device and control playback of the video content on the display device based on the determined first wireless audio processing delay such that the playback of the video content is time-synchronized with playback of the audio content on the wireless audio device.
- the playback of the video content is delayed by at least the determined first wireless audio processing delay to match a time of the playback of the audio content on the wireless audio device.
- the circuitry may be further configured to determine a Media Access Control (MAC) address of the wireless audio device and determine a second wireless audio processing delay associated with the wireless audio device based on the determined MAC address. In accordance with an embodiment, the circuitry may be further configured to control the playback of the video content on the display device further based on the determined second wireless audio processing delay.
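A sketch of this MAC-keyed lookup, assuming a hypothetical second delay profile (the MAC addresses and millisecond values below are invented for illustration):

```python
# Hypothetical second delay profile keyed by the MAC address of the
# wireless audio device (illustrative values, not from the disclosure).
SECOND_DELAY_PROFILE_MS = {
    "00:1B:2C:3D:4E:5F": 150,  # e.g. an over-head wireless headphone
    "AA:BB:CC:DD:EE:FF": 90,   # e.g. a portable wireless speaker
}

def video_playback_delay_ms(first_delay_ms: int, device_mac: str) -> int:
    # Combine the apparatus-side and device-side delays; fall back to the
    # first delay alone when the device's MAC is not in the profile.
    return first_delay_ms + SECOND_DELAY_PROFILE_MS.get(device_mac, 0)

print(video_playback_delay_ms(250, "00:1B:2C:3D:4E:5F"))  # 400
```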
- the electronic apparatus may further include a memory (such as the memory 204 ) configured to store a first delay profile for the electronic apparatus.
- the first delay profile comprises the first wireless audio processing delay associated with the electronic apparatus.
- the circuitry may be further configured to determine the first wireless audio processing delay based on the first delay profile.
- the electronic apparatus may further include an Audio/Video (AV) source (such as the AV source 206 ) configured to output compressed media comprising compressed audio data (such as the compressed audio data 302 A) and compressed video data (such as the compressed video data 302 B).
- the electronic apparatus may further include a buffer memory (such as the buffer memory 208 ) coupled to the AV source.
- the circuitry may be configured to store the compressed video data in the buffer memory for a holding duration that includes the determined first wireless audio processing delay while the compressed audio data is decoded as the audio content and transmitted to the wireless audio device.
- the circuitry may be further configured to extract the compressed video data from the buffer memory after the holding duration. After the extraction, the circuitry may be further configured to decode the compressed video data as the video content. Thereafter, the circuitry may be configured to control the playback of the video content on the display device to match with a timing of the playback of the audio content on the wireless audio device.
- the electronic apparatus may further include a buffer memory (such as the buffer memory 208 ) coupled to the AV source.
- the circuitry may be configured to decode the compressed video data to obtain the video content. After the decode, the circuitry may be configured to store the video content in the buffer memory for a holding duration that includes the determined first wireless audio processing delay while the compressed audio data is decoded as the audio content and transmitted to the wireless audio device.
- the circuitry may be further configured to extract the stored video data from the buffer memory after the holding duration. Thereafter, the circuitry may be configured to control the playback of the video content on the display device to match with a timing of the playback of the transmitted audio content on the wireless audio device.
- the circuitry may be further configured to control the display device to display a test video.
- the circuitry may be further configured to transmit a test audio associated with the test video to the wireless audio device 108 .
- the circuitry may be further configured to receive, via the wireless audio device, a user (such as the user 112 ) input comprising a duration by which the playback of the test video is to be delayed to match a time of the playback of the transmitted test audio to the wireless audio device. Thereafter, the circuitry may be configured to set the first wireless audio processing delay as the duration included in the user input.
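This calibration flow can be sketched as a loop that steps the test-video delay until the user reports lip sync; the function name, the candidate-delay stepping, and the callback standing in for user input are all hypothetical:

```python
def calibrate_first_delay_ms(candidate_delays_ms, user_confirms_sync):
    """Hypothetical calibration: play the test video with the test audio
    on the wireless audio device, step the video delay, and store the
    user-confirmed duration as the first wireless audio processing delay."""
    for delay in candidate_delays_ms:
        # In the apparatus, the user input would arrive via the wireless
        # audio device; here it is modeled as a callback.
        if user_confirms_sync(delay):
            return delay  # set as the first wireless audio processing delay
    return None  # no delay in the tried range achieved lip sync

# Simulated user who perceives sync once the video is delayed >= 250 ms.
print(calibrate_first_delay_ms(range(0, 501, 50), lambda d: d >= 250))  # 250
```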
- the present disclosure may be realized in hardware, or a combination of hardware and software.
- the present disclosure may be realized in a centralized fashion, in at least one computer system, or in a distributed fashion, where different elements may be spread across several interconnected computer systems.
- a computer system or other apparatus adapted to carry out the methods described herein may be suited.
- a combination of hardware and software may be a general-purpose computer system with a computer program that, when loaded and executed, may control the computer system such that it carries out the methods described herein.
- the present disclosure may be realized in hardware that comprises a portion of an integrated circuit that also performs other functions.
- A computer program, in the present context, means any expression, in any language, code, or notation, of a set of instructions intended to cause a system with information processing capability to perform a particular function either directly, or after either or both of the following: a) conversion to another language, code, or notation; b) reproduction in a different material form.
Description
- Various embodiments of the disclosure relate to audio-to-video (AV) synchronization. More specifically, various embodiments of the disclosure relate to an electronic apparatus and method for synchronization of wireless audio to video.
- AV synchronization, or the lip sync issue, is a well-known problem associated with media transmission and playback. Typically, the lip sync error is measured by the amount of time by which the audio of AV content lags behind or leads the video of the AV content. In the case of wireless audio, such as Bluetooth® audio, there is a noticeable lip sync problem which can be very annoying and can actually make a program unwatchable for certain people.
- Limitations and disadvantages of conventional and traditional approaches will become apparent to one of skill in the art, through comparison of described systems with some aspects of the present disclosure, as set forth in the remainder of the present application and with reference to the drawings.
- An electronic apparatus and method for synchronization of wireless audio to video is provided substantially as shown in, and/or described in connection with, at least one of the figures, as set forth more completely in the claims.
- These and other features and advantages of the present disclosure may be appreciated from a review of the following detailed description of the present disclosure, along with the accompanying figures in which like reference numerals refer to like parts throughout.
-
FIG. 1 is a block diagram that illustrates an exemplary network environment for synchronization of wireless audio to video, in accordance with an embodiment of the disclosure. -
FIG. 2 is a block diagram that illustrates an exemplary electronic apparatus for synchronization of wireless audio to video, in accordance with an embodiment of the disclosure. -
FIG. 3 is a diagram that illustrates exemplary operations for synchronization of wireless audio to video, in accordance with an embodiment of the disclosure. -
FIG. 4 is a diagram that illustrates exemplary operations for synchronization of wireless audio to video, in accordance with an embodiment of the disclosure. -
FIG. 5 is a flowchart that illustrates an exemplary method for synchronization of wireless audio to video, in accordance with an embodiment of the disclosure. - The following described implementations may be found in the disclosed electronic apparatus and method for synchronization of wireless audio to video. Exemplary aspects of the disclosure provide an electronic apparatus which may determine a wireless audio processing delay associated with the electronic apparatus (i.e. audio source). The wireless audio processing delay may include a duration by which the playback of the video content may lead or lag behind playback of the audio content on the wireless audio device. The electronic apparatus may transmit the audio content to the wireless audio device for playback and may control the playback of the video content on the display device such that playback of the video content is time-synchronized with the playback of the transmitted audio content on the wireless audio device.
- The playback of the video content may be controlled based on the determined wireless audio processing delay. For example, the electronic apparatus may use the determined wireless audio processing delay to delay the playback of the video content on the display device. As the playback of the audio content is delayed due to wireless audio processing, the delayed playback of the video content may be timed to match the delayed playback of the audio content on the wireless audio device. This may mitigate the lip sync (or AV synchronization issue) and enhance the listening experience of the user on the wireless audio device. The user may be able to hear the audio content on the wireless audio device at the same time when the video content is displayed on the display device.
- FIG. 1 is a block diagram that illustrates an exemplary network environment for synchronization of wireless audio to video, in accordance with an embodiment of the disclosure. With reference to FIG. 1, there is shown a network environment 100. The network environment 100 may include an electronic apparatus 102, an audio/video (AV) source 104, a display device 106, and a wireless audio device 108. The electronic apparatus 102 may be coupled to the wireless audio device 108, via a wireless network 110. There is further shown a user 112 who may be associated with the wireless audio device 108. In FIG. 1, the electronic apparatus 102 and the display device 106 are shown as two separate devices; however, in some embodiments, the entire functionality of the display device 106 may be incorporated in the electronic apparatus 102, without a deviation from the scope of the disclosure.
- The electronic apparatus 102 may include suitable logic, circuitry, interfaces, and/or code that may be configured to receive media content from the AV source 104 and control playback of the received media content via the display device 106 and one or more audio devices communicatively coupled to the electronic apparatus 102.
- In an exemplary embodiment, the electronic apparatus 102 may be a display-enabled media player and the display device 106 may be included in the electronic apparatus 102. Examples of such an implementation of the electronic apparatus 102 may include, but are not limited to, a television (TV), an Internet-Protocol TV (IPTV), a smart TV, a smartphone, a personal computer, a laptop, a tablet, a wearable electronic device, or any other display device with a capability to receive, decode, and play content encapsulated in broadcasting signals from cable or satellite networks, over-the-air broadcast, or internet-based communication signals.
- In another exemplary embodiment, the electronic apparatus 102 may be a media player that may communicate with the display device 106, via a wired or a wireless connection. Examples of such an implementation of the electronic apparatus 102 may include, but are not limited to, a digital media player (DMP), a micro-console, a TV tuner, an Advanced Television Systems Committee (ATSC) 3.0 tuner, a set-top-box, an Over-the-Top (OTT) player, a digital media streamer, a media extender/regulator, a digital media hub, a computer workstation, a mainframe computer, a handheld computer, a smart appliance, a plug-in device, and/or any other computing device with content streaming and playback functionality.
- The AV source 104 may include suitable logic, circuitry, and interfaces that may be configured to transmit the media content to the electronic apparatus 102. The media content on the AV source 104 may include audio content and video content associated with the audio content. For example, if the media content is a television program, then the audio content may include a background audio, actor voice or speech, and other audio components, such as audio description.
- In an embodiment, the AV source 104 may be implemented as a storage device which stores the media content. Examples of such an implementation of the AV source 104 may include, but are not limited to, a Pen Drive, a Flash USB Stick, a Hard Disk Drive (HDD), a Solid-State Drive (SSD), and/or a Secure Digital (SD) card. In another embodiment, the AV source 104 may be implemented as a media streaming server, which may transmit the media content to the electronic apparatus 102, via a communication network (not shown). In another embodiment, the AV source 104 may be a TV tuner, such as an ATSC tuner, which may receive digital TV (DTV) signals from an over-the-air broadcast network and may extract the media content from the received DTV signals. Thereafter, the AV source 104 may transmit the extracted media content to the electronic apparatus 102.
- In FIG. 1, the AV source 104 and the electronic apparatus 102 are shown as two separate devices. However, the present disclosure may not be so limiting and, in some embodiments, the functionality of the AV source 104 may be incorporated in its entirety or at least partially in the electronic apparatus 102, without departing from the scope of the present disclosure.
- The display device 106 may include suitable logic, circuitry, and/or interfaces that may be configured to display the video content, which may be received from the electronic apparatus 102. In one embodiment, the display device 106 may be a touch screen which may enable the user 112 to provide a user-input via the display device 106. The display device 106 may be realized through several known technologies such as, but not limited to, at least one of a Liquid Crystal Display (LCD) display, a Light Emitting Diode (LED) display, a plasma display, or an Organic LED (OLED) display technology, or other display devices. In accordance with an embodiment, the display device 106 may refer to a display screen of a head mounted device (HMD), a smart-glass device, a see-through display, a projection-based display, an electro-chromic display, or a transparent display.
- The wireless audio device 108 may include suitable logic, circuitry, and/or interfaces that may be configured to receive the audio content from the electronic apparatus 102. The wireless audio device 108 may be a portable wireless speaker, a wearable audio device, or a head-mounted audio device. Examples of the wireless audio device 108 may include, but are not limited to, a wireless speaker of a surround sound system, an over-head headphone, an in-ear headphone, a clip-on headphone, a bone-conduction headphone, a hearing aid, smart glasses, or a head-mounted display (for example, an Augmented Reality (AR) headset, a Mixed Reality (MR) headset, or Virtual Reality (VR) goggles). The wireless audio device 108 may rely on a wireless communication protocol, such as Wi-Fi, Bluetooth®, or Bluetooth® Low Energy (BLE), to receive the audio content from the electronic apparatus 102.
- The wireless network 110 may include a medium through which two or more wireless devices may communicate with each other. For example, the wireless network 110 may be established between the electronic apparatus 102 and the wireless audio device 108 and may allow the electronic apparatus 102 and the wireless audio device 108 to communicate with each other. In case there are two or more wireless audio devices, each of the two or more wireless audio devices may pair-up and communicate with the electronic apparatus 102 via the wireless network 110.
- Examples of wireless network protocols may include, but are not limited to, Radio Frequency Identification (RFID), Wireless USB, Near Field Communication (NFC) (e.g., NFC Peer-to-Peer), Bluetooth™, Bluetooth Low Energy (BLE™), ZigBee, Personal Area Network (PAN), Wi-Max, a cellular network, a Long-Term Evolution (LTE) network, or an Evolved High Speed Packet Access (HSPA+), and protocols based on 802 wireless standards such as 802.3, 802.15.1, 802.16 (Wireless local loop), 802.20 (Mobile Broadband Wireless Access (MBWA)), 802.11-1997 (legacy version), 802.15.4, 802.11a, 802.11b, 802.11g, 802.11e, 802.11i, 802.11f, 802.11c, 802.11h (specific to European regulations), 802.11n, 802.11j (specific to Japanese regulations), 802.11p, 802.11ac, 802.11ad, 802.11ah, 802.11aj, 802.11ax, 802.11ay, 802.11az, 802.11hr (high data rate), 802.11af (white space spectrum), 802.11-2007, 802.11-2008, 802.11-2012, and 802.11-2016.
- In operation, the electronic apparatus 102 may determine the first wireless audio processing delay associated with the electronic apparatus 102. In accordance with an embodiment, the first wireless audio processing delay may be determined based on a first delay profile stored in a memory (as shown in FIG. 2) of the electronic apparatus 102. The memory may store the first delay profile in the form of a look-up table, which may include a list of wireless audio processing delays associated with various models or variants (identified by a unique identifier) of the electronic apparatus 102. The look-up table may also include a Media Access Control (MAC) address associated with the electronic apparatus 102. An example of the look-up table is provided in Table 1, as follows:
TABLE 1: Look-up table

  Identifier of            Wireless Audio      Buffer Memory
  Electronic apparatus 102 Processing Delay    (in bytes)      MAC Address
  Device A                 250 milliseconds    500K            00:1A:7D:10:7C:84
  Device B                 525 milliseconds    1050K           F8:DF:15:6D:E9:7B
  Device C                 300 milliseconds    600K            0D:1D:86:88:CC:F0
Data provided in Table 1 is merely experimental and should not be construed as limiting for the present disclosure. The first wireless audio processing delay may be associated with an audio processing pipeline of the electronic apparatus 102. The audio processing pipeline may include various audio processing operations which may be executed on the electronic apparatus 102 before the audio content is wirelessly transmitted to the wireless audio device 108. Examples of such audio processing operations may include, but are not limited to, audio frame buffering, sample rate conversions, file format conversions, or audio transcoding.
- In some embodiments, the first wireless audio processing delay may depend on a size of each audio buffer and a number of times audio frames of the audio content are stored in and retrieved from each audio buffer. A larger audio buffer may introduce a greater latency between the time an audio sample is written into it and read out for the next operation. For wireless audio transfer, the audio processing pipeline may also include an operation to convert the audio content into a suitable codec format. For example, if the audio content needs to be transferred as Bluetooth® audio, then the audio content may need to be converted into a suitable codec format to conform with the Bluetooth® standard for audio. Thus, the first wireless audio processing delay may include a latency which may be incurred as a result of the conversion to the suitable codec format for the wireless audio.
- The conversion to the suitable codec format may be performed by a suitable audio codec. Examples of audio codecs for Bluetooth® audio may include, but are not limited to, LDAC™, AptX™, AptX™ Adaptive, Low latency and High-Definition audio Codec (LHDC™), Low Complexity Communication Codec (LC3), and low-complexity sub-band codec (SBC). As each type of audio codec may consume computing resources and buffers in a different manner, the latency included in the first wireless audio processing delay may vary depending on the type of audio codec used for the conversion.
- In some other embodiments, the first wireless audio processing delay may also include a latency associated with a wireless audio transfer operation which may be executed by wireless data transfer hardware or circuitry of the
electronic apparatus 102. For example, for Bluetooth® audio transfer, the audio samples of the audio content may be queued, packaged, and sent over the Bluetooth® audio transfer hardware. The Bluetooth® audio transfer hardware may packetize and transmit the packetized audio content over the wireless network 110. The latency associated with the Bluetooth® audio transfer may be included in the first wireless audio processing delay. As an example, for Bluetooth® audio, the first wireless audio processing delay may include a 100-millisecond latency for the audio processing operations of the audio processing pipeline, another 2-5 milliseconds of latency for the conversion to the suitable codec format, and another 100 milliseconds of latency for the Bluetooth® audio transfer. - In some other embodiments, the
electronic apparatus 102 may also determine a second wireless audio processing delay associated with the wireless audio device 108. For example, the electronic apparatus 102 may first determine a MAC address of the wireless audio device 108 and then determine the second wireless audio processing delay associated with the wireless audio device 108 based on the determined MAC address. The second wireless audio processing delay may be determined based on a second delay profile stored in the memory of the electronic apparatus 102. Similar to the first delay profile, the memory may also store the second delay profile in the form of a look-up table. The look-up table may include wireless audio processing delays associated with different models or variants of the wireless audio device 108. An example of the look-up table is provided in Table 2, as follows: -
TABLE 2: Look-up table

Identifier of Wireless Audio Device 108 | Wireless Audio Processing Delay | MAC Address
---|---|---
Device A | 5 milliseconds | 00:1A:7D:10:7C:84
Device B | 7 milliseconds | F8:DF:15:6D:E9:7B
Device C | 10 milliseconds | 0D:1D:86:88:CC:F0
The data provided in Table 2 is merely experimental and should not be construed as limiting for the present disclosure. The second wireless audio processing delay may be associated with a latency caused by audio processing operations on the wireless audio device 108. Examples of such audio processing operations may include, but are not limited to, audio packet/frame buffering, audio decoding or decryption, or other audio effects, such as audio equalization. - In some other embodiments, the
electronic apparatus 102 may control the display device 106 to display a test video. Also, the electronic apparatus 102 may transmit a test audio associated with the test video to the wireless audio device 108. The electronic apparatus 102 may receive, via the wireless audio device 108, a user input which may include a duration by which the playback of the test video is to be delayed to match a time of the playback of the transmitted test audio on the wireless audio device 108. The user input may be received in the form of a touch input, a hand gesture, a head gesture, a voice input, and the like. The electronic apparatus 102 may determine the first wireless audio processing delay as the duration included in the user input. - At any time-instant, the
electronic apparatus 102 may receive the media content from the AV source 104. The media content may include, for example, audio content, video content associated with the audio content, and other information, such as subtitles and closed captions. The electronic apparatus 102 may transmit the audio content to the wireless audio device 108 via the wireless network 110. Thereafter, the electronic apparatus 102 may control the playback of the video content on the display device 106 based on the determined first wireless audio processing delay such that the playback of the video content is time-synchronized with the playback of the audio content on the wireless audio device 108. For example, the playback of the video content may be delayed by at least the determined first wireless audio processing delay to match a time of the playback of the audio content on the wireless audio device 108. - In some embodiments, the
electronic apparatus 102 may control the playback of the video content on the display device 106 further based on the determined second wireless audio processing delay. For example, the playback of the video content may be delayed by a time which may equal the sum of the determined first wireless audio processing delay and the determined second wireless audio processing delay. With the delay in the video playback, the user 112 may be able to listen to the audio content on the wireless audio device 108 and watch the video content on the display device 106 without any noticeable lip sync issue. -
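A minimal sketch of the look-up and summation just described: the two delay profiles are modeled as tables keyed by a device identifier and by the sink's MAC address, and the video hold-back is their sum. The dictionary layout, function name, and fallback behavior are assumptions made for this sketch; the figures reuse the illustrative values of Tables 1 and 2:

```python
# Illustrative delay profiles (values from Tables 1 and 2; not normative).
FIRST_DELAY_PROFILE_MS = {"Device A": 250, "Device B": 525, "Device C": 300}
SECOND_DELAY_PROFILE_MS = {
    "00:1A:7D:10:7C:84": 5,   # Device A wireless audio device
    "F8:DF:15:6D:E9:7B": 7,   # Device B wireless audio device
    "0D:1D:86:88:CC:F0": 10,  # Device C wireless audio device
}

def video_holdback_ms(apparatus_id, sink_mac, default_sink_ms=0):
    """Delay to apply to video playback: the apparatus-side wireless audio
    processing delay plus the sink-side delay looked up by MAC address.
    Unknown sinks fall back to a caller-supplied default."""
    first = FIRST_DELAY_PROFILE_MS[apparatus_id]
    second = SECOND_DELAY_PROFILE_MS.get(sink_mac.upper(), default_sink_ms)
    return first + second

# Device A apparatus paired with the Device A earbuds: 250 + 5 = 255 ms.
holdback = video_holdback_ms("Device A", "00:1a:7d:10:7c:84")
```

Normalizing the MAC address to upper case before the lookup is a small robustness choice, since Bluetooth stacks report addresses in mixed case.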
FIG. 2 is a block diagram that illustrates an exemplary electronic apparatus for synchronization of wireless audio to video, in accordance with an embodiment of the disclosure. FIG. 2 is explained in conjunction with elements from FIG. 1. With reference to FIG. 2, there is shown a block diagram 200 of the electronic apparatus 102. The electronic apparatus 102 may include circuitry 202, a memory 204, an AV source 206, a buffer memory 208, and a network interface 210. The circuitry 202 may be communicatively coupled to the wireless audio device 108, the AV source 206, the buffer memory 208, and the network interface 210. The AV source 206 is an exemplary implementation of the AV source 104 of FIG. 1. - The
circuitry 202 may include suitable logic, circuitry, and/or interfaces that may be configured to execute program instructions associated with different operations to be executed by the electronic apparatus 102. The circuitry 202 may include one or more specialized processing units, which may be implemented as an integrated processor or a cluster of processors that perform the functions of the one or more specialized processing units, collectively. The circuitry 202 may be implemented based on a number of processor technologies known in the art. Examples of implementations of the circuitry 202 may be an x86-based processor, a Graphics Processing Unit (GPU), a Reduced Instruction Set Computing (RISC) processor, an Application-Specific Integrated Circuit (ASIC) processor, a Complex Instruction Set Computing (CISC) processor, a microcontroller, a central processing unit (CPU), and/or other computing circuits. - The
memory 204 may include suitable logic, circuitry, and/or interfaces that may be configured to store program instructions to be executed by the circuitry 202. In at least one embodiment, the memory 204 may be configured to store the first delay profile for the electronic apparatus 102. The first delay profile may include the first wireless audio processing delay associated with the electronic apparatus 102. Additionally, the first delay profile may include a model name or an identifier (ID) of the electronic apparatus 102 and/or a Media Access Control (MAC) address associated with the electronic apparatus 102. The memory 204 may also be configured to store the second delay profile for the wireless audio device 108. The second delay profile may include a second wireless audio processing delay associated with the wireless audio device 108. Additionally, the second delay profile may include a model name or an identifier of the wireless audio device 108 and/or a MAC address associated with the wireless audio device 108. The memory 204 may be further configured to store the media content including the video content and the audio content associated with the video content. Examples of implementation of the memory 204 may include, but are not limited to, Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Hard Disk Drive (HDD), a Solid-State Drive (SSD), a CPU cache, and/or a Secure Digital (SD) card. - The
AV source 206 may include suitable logic, circuitry, and/or interfaces that may be configured to output compressed media which includes compressed audio data and compressed video data. The AV source 206 may further include a memory for storage of the compressed media. The AV source 206 may receive the compressed media through various content delivery systems, such as terrestrial content broadcasting networks, satellite-based broadcasting networks, Internet Protocol (IP) based content networks, or a combination thereof. - The
buffer memory 208 may include suitable logic, circuitry, and interfaces that may be configured to temporarily store data to be transmitted to or received from the electronic apparatus 102. In one embodiment, the memory 204 may be configured to instantaneously allocate the buffer memory 208 when the data is required to be stored. The size of the allocation (in kilobytes or megabytes) may depend on a content bandwidth (in Mbps) and on whether compressed or uncompressed data is required to be stored in the buffer memory 208. In another embodiment, the buffer memory 208 may be a memory module which may be separate from the memory 204. Example implementations of such a memory module may include, but are not limited to, Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Hard Disk Drive (HDD), a Solid-State Drive (SSD), and/or a Secure Digital (SD) card. - The
network interface 210 may include suitable logic, circuitry, interfaces, and/or code that may be configured to facilitate communication between the circuitry 202 and the wireless audio device 108, via the wireless network 110. The network interface 210 may be implemented by use of various known technologies to support wireless communication of the electronic apparatus 102 via the wireless network 110. The network interface 210 may include, for example, an antenna, a radio frequency (RF) transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a coder-decoder (CODEC) chipset, a subscriber identity module (SIM) card, a local buffer circuitry, and the like. - The
network interface 210 may be configured to communicate via wireless communication with networks, such as the Internet, an Intranet, a wireless network, a cellular telephone network, a wireless local area network (LAN), or a metropolitan area network (MAN). The wireless communication may be configured to use one or more of a plurality of communication standards, protocols and technologies, such as Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), wideband code division multiple access (W-CDMA), Long Term Evolution (LTE), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (such as IEEE 802.11a, IEEE 802.11b, IEEE 802.11g or IEEE 802.11n), voice over Internet Protocol (VoIP), light fidelity (Li-Fi), or Worldwide Interoperability for Microwave Access (Wi-MAX). - The functions or operations executed by the
electronic apparatus 102, as described in FIG. 1, may be performed by the circuitry 202. Operations executed by the circuitry 202 are described in detail, for example, in FIG. 3 and FIG. 4. -
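The sizing rule described above for the buffer memory 208 (an allocation driven by the content bandwidth and by how long the data must be held) reduces to simple arithmetic. The function below is a sketch under the assumption of decimal megabits (1 Mbps = 1,000,000 bits per second); the 16 Mbps figure in the example is illustrative:

```python
def buffer_allocation_bytes(bandwidth_mbps, holding_duration_ms):
    """Bytes needed to hold `holding_duration_ms` worth of a stream that
    arrives at `bandwidth_mbps` megabits per second."""
    bits = bandwidth_mbps * 1_000_000 * (holding_duration_ms / 1000.0)
    return int(bits / 8)

# A 16 Mbps compressed stream held for a 250 ms delay needs about 500 KB,
# consistent with the Device A row of Table 1.
size = buffer_allocation_bytes(16, 250)
```

Uncompressed data at the same resolution would need a far larger allocation for the same holding duration, which is why the disclosure distinguishes the compressed and uncompressed cases.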
FIG. 3 is a diagram that illustrates exemplary operations for synchronization of wireless audio to video, in accordance with an embodiment of the disclosure. FIG. 3 is explained in conjunction with elements from FIG. 1 and FIG. 2. With reference to FIG. 3, there is shown a block diagram 300 that illustrates exemplary operations from 304 to 308 for synchronization of wireless audio to video. The exemplary operations may be executed by any computing system, for example, by the electronic apparatus 102 of FIG. 1 or by the circuitry 202 of FIG. 2. - There is further shown an
AV source 302 which may be coupled to the buffer memory 208 of FIG. 2. The AV source 302 is an exemplary implementation of the AV source 206 of FIG. 2 or the AV source 104 of FIG. 1. The description of the AV source 302 is omitted from the disclosure for the sake of brevity. The AV source 302 may store the compressed media which includes compressed audio data 302A and compressed video data 302B associated with the compressed audio data 302A. - At 304, the
compressed audio data 302A may be decoded. In accordance with an embodiment, the circuitry 202 may decode the compressed audio data 302A to output uncompressed audio, which may be referred to as the audio content. The compressed audio data 302A may be decoded and converted to a suitable codec format supported by an audio codec for wireless audio transfer. Thereafter, the circuitry 202 may transmit the audio content to the wireless audio device 108, via the wireless network 110. - The decoding of the
compressed audio data 302A, the conversion to the suitable codec format, and the wireless transmission of the audio content may incur a delay in the wireless audio processing pipeline of the electronic apparatus 102. The delay may equal (or may approximate) the determined first wireless audio processing delay (as described in FIG. 1) associated with the electronic apparatus 102. In order to avoid any lip sync error due to such delay in the wireless audio processing pipeline of the electronic apparatus 102, decoding of the compressed video data 302B may be delayed, as described at 306. - At 306, the
compressed video data 302B may be stored in the buffer memory 208. In accordance with an embodiment, the circuitry 202 may store the compressed video data 302B in the buffer memory 208 for a holding duration while the compressed audio data 302A is decoded, converted, and transmitted to the wireless audio device 108 (at 304). The holding duration may refer to a time duration for which the compressed video data 302B may be stored in the buffer memory 208 to delay the playback of the uncompressed video data (i.e. the video content) on the display device 106 and to match a time of the playback of the transmitted audio content on the wireless audio device 108. The holding duration may include the determined first wireless audio processing delay and/or any delay associated with movement of the compressed video data 302B in and out of the buffer memory 208. In at least one embodiment, the holding duration may also include the second wireless audio processing delay (as described in FIG. 1) associated with the wireless audio device 108. - At 308, the
compressed video data 302B may be decoded. In accordance with an embodiment, the circuitry 202 may extract the compressed video data 302B from the buffer memory 208 after the holding duration. After the extraction, the circuitry 202 may decode the compressed video data 302B to output uncompressed video, which may be referred to as the video content. After the compressed video data 302B is decoded, the circuitry 202 may transfer the video content to the display device 106 and may control the playback of the video content on the display device 106. As the operations at 308 may start only after the compressed video data 302B is removed from the buffer memory 208, the playback of the video content may be delayed to match the time of the playback of the audio content on the wireless audio device 108. This may allow the electronic apparatus 102 to remove the lip sync error typically associated with the playback of the video content and the playback of the audio content. -
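The buffer-then-decode ordering of FIG. 3 can be sketched as a first-in-first-out hold buffer that releases a compressed frame only after its holding duration has elapsed. The class below is a simplified model for illustration, with time passed in explicitly (in milliseconds) so that the release logic is easy to follow; the real apparatus would be driven by its media clock:

```python
from collections import deque

class CompressedVideoHoldBuffer:
    """Hold compressed video frames for a fixed duration before releasing
    them for decoding (the buffer-then-decode ordering of FIG. 3)."""

    def __init__(self, holding_duration_ms):
        self.holding_duration_ms = holding_duration_ms
        self._queue = deque()  # FIFO of (enqueue_time_ms, frame)

    def push(self, now_ms, frame):
        """Store a compressed frame as it arrives from the AV source."""
        self._queue.append((now_ms, frame))

    def pop_ready(self, now_ms):
        """Release every frame whose holding duration has elapsed, in
        arrival order, so it can be decoded and displayed."""
        ready = []
        while self._queue and now_ms - self._queue[0][0] >= self.holding_duration_ms:
            ready.append(self._queue.popleft()[1])
        return ready
```

Because the frames are held in compressed form, the hold buffer stays small; the trade-off against holding decoded frames is discussed with the FIG. 4 ordering.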
FIG. 4 is a diagram that illustrates exemplary operations for synchronization of wireless audio to video, in accordance with an embodiment of the disclosure. FIG. 4 is explained in conjunction with elements from FIGS. 1, 2, and 3. With reference to FIG. 4, there is shown a block diagram 400 that illustrates exemplary operations from 404 to 408 for synchronization of wireless audio to video. The exemplary operations from 404 to 408 may be executed by any computing system, for example, by the electronic apparatus 102 of FIG. 1 or by the circuitry 202 of FIG. 2. - There is further shown an
AV source 402 which may be coupled to the buffer memory 208 of FIG. 2. The AV source 402 is an exemplary implementation of the AV source 206 of FIG. 2 or the AV source 104 of FIG. 1. The description of the AV source 402 is omitted from the disclosure for the sake of brevity. The AV source 402 may store the compressed media which includes compressed audio data 402A and compressed video data 402B associated with the compressed audio data 402A. - At 404, the compressed audio data 402A may be decoded. In accordance with an embodiment, the
circuitry 202 may decode the compressed audio data 402A to output uncompressed audio, which may be referred to as the audio content. The compressed audio data 402A may be decoded and converted to a suitable codec format supported by an audio codec for wireless audio transfer. Thereafter, the circuitry 202 may transmit the audio content to the wireless audio device 108, via the wireless network 110. - The decoding of the compressed audio data 402A, the conversion to the suitable codec format, and the wireless transmission of the audio content may incur a delay in the wireless audio processing pipeline of the
electronic apparatus 102. The delay may equal (or may approximate) the determined first wireless audio processing delay (as described in FIG. 1) associated with the electronic apparatus 102. In order to avoid any lip sync error due to such delay in the wireless audio processing pipeline of the electronic apparatus 102, the compressed video data 402B may be first decoded and then stored in a buffer to delay the playback, as described at 406 and onwards. - At 406, the compressed video data 402B may be decoded. In accordance with an embodiment, the
circuitry 202 may decode the compressed video data 402B to output uncompressed video, which may be referred to as the video content. - At 408, the video content may be stored in the
buffer memory 208 after the compressed video data 402B is decoded. In accordance with an embodiment, the circuitry 202 may store the video content in the buffer memory 208 for a holding duration while the compressed audio data 402A is decoded, converted, and transmitted to the wireless audio device 108 (at 404). - The holding duration may refer to a time duration for which the video content may be stored in the
buffer memory 208 to delay the playback of the uncompressed video data (i.e. the video content) on the display device 106 and to match a time of the playback of the transmitted audio content on the wireless audio device 108. The holding duration may include the determined first wireless audio processing delay and/or any delay associated with movement of the video content in and out of the buffer memory 208. In at least one embodiment, the holding duration may also include the second wireless audio processing delay (as described in FIG. 1) associated with the wireless audio device 108. - The
circuitry 202 may extract the video content (i.e. the uncompressed video data) from the buffer memory 208 after the holding duration and may transfer the video content to the display device 106. Thereafter, the circuitry 202 may control the playback of the video content on the display device 106 to match the time of the playback of the audio content on the wireless audio device 108. -
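The main cost of the decode-then-buffer ordering of FIG. 4 is memory: decoded frames are far larger than compressed ones, so the same holding duration needs a much bigger allocation than in the FIG. 3 ordering. The estimate below is an illustrative sketch; the resolution, frame rate, and 3-bytes-per-pixel assumption are examples, not values from the disclosure:

```python
def uncompressed_frame_bytes(width, height, bytes_per_pixel=3):
    """Size of one decoded video frame (e.g. 24-bit RGB)."""
    return width * height * bytes_per_pixel

def hold_buffer_bytes_uncompressed(width, height, fps, holding_duration_ms,
                                   bytes_per_pixel=3):
    """Memory needed to hold decoded video (FIG. 4 ordering) for the
    holding duration: frames held times the size of a decoded frame."""
    frames_held = fps * holding_duration_ms / 1000.0
    return int(frames_held * uncompressed_frame_bytes(width, height,
                                                      bytes_per_pixel))

# 1080p at 30 fps held for 250 ms: 7.5 decoded frames, roughly 46.7 MB,
# versus roughly 500 KB for the same duration of a 16 Mbps compressed stream.
needed = hold_buffer_bytes_uncompressed(1920, 1080, 30, 250)
```

This contrast explains why the buffer memory 208 allocation depends on whether compressed or uncompressed data is to be stored.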
FIG. 5 is a flowchart that illustrates an exemplary method for synchronization of wireless audio to video, in accordance with an embodiment of the disclosure. FIG. 5 is explained in conjunction with elements from FIGS. 1, 2, 3, and 4. With reference to FIG. 5, there is shown a flowchart 500. The method illustrated in the flowchart 500 may be executed by any computing system, such as by the electronic apparatus 102 or the circuitry 202. The method may start at 502 and proceed to 504. - At 504, a first wireless audio processing delay may be determined. In one or more embodiments, the
circuitry 202 may be configured to determine the first wireless audio processing delay associated with the electronic apparatus 102. - At 506, the media content may be received. In one or more embodiments, the
circuitry 202 may be configured to receive the media content which includes video content and audio content associated with the video content. - At 508, the audio content may be transmitted to the
wireless audio device 108. In one or more embodiments, the circuitry 202 may be configured to transmit the audio content to the wireless audio device 108. - At 510, a playback of the video content on the
display device 106 may be controlled based on the determined first wireless audio processing delay. In one or more embodiments, the circuitry 202 may be configured to control the playback of the video content on the display device 106 based on the determined first wireless audio processing delay such that the playback of the video content is time-synchronized with playback of the audio content on the wireless audio device 108. Control may pass to end. - Although the
flowchart 500 is illustrated as discrete operations, such as 504, 506, 508, and 510, the disclosure is not so limited. Accordingly, in certain embodiments, such discrete operations may be further divided into additional operations, combined into fewer operations, or eliminated, depending on the particular implementation without detracting from the essence of the disclosed embodiments. - Various embodiments of the disclosure may provide a non-transitory computer readable medium and/or storage medium having stored thereon, instructions executable by a machine and/or a computer to operate an electronic apparatus. The instructions may cause the machine and/or computer to perform operations that include determining a first wireless audio processing delay associated with the electronic apparatus. The operations may further include receiving media content comprising video content and audio content associated with the video content. The operations may further include transmitting the audio content to a wireless audio device communicatively coupled to the electronic apparatus. The operations may further include controlling playback of the video content on a display device associated with the electronic apparatus based on the determined first wireless audio processing delay such that the playback of the video content is time-synchronized with playback of the audio content on the wireless audio device.
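The four operations of the flowchart 500 (504 to 510) can be sketched end to end. In the sketch below, `transmit_audio` and `play_video` are hypothetical stand-ins for the apparatus's own wireless transmit and display paths, and scheduling the video by a millisecond offset is an assumption made to keep the example self-contained:

```python
def synchronize_playback(media, first_delay_ms, transmit_audio, play_video,
                         second_delay_ms=0):
    """Flowchart 500 in miniature: send the audio content to the wireless
    audio device right away, then schedule the video content so that it is
    held back by the combined wireless audio processing delay."""
    transmit_audio(media["audio"])                    # 508: transmit audio
    holdback_ms = first_delay_ms + second_delay_ms    # 504: determined delay(s)
    play_video(media["video"], delay_ms=holdback_ms)  # 510: delayed video playback
    return holdback_ms
```

A caller would supply the delays determined earlier (from the delay profiles or from test-video calibration) and its own transmit and playback callables.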
- Exemplary aspects of the disclosure may provide an electronic apparatus (such as the
electronic apparatus 102 of FIG. 1) that includes circuitry (such as the circuitry 202) that may be communicatively coupled to a display device (such as the display device 106) and a wireless audio device (such as the wireless audio device 108). The circuitry may be configured to determine a first wireless audio processing delay associated with the electronic apparatus. At any time, the circuitry may be further configured to receive media content comprising video content and audio content associated with the video content. The circuitry may be further configured to transmit the audio content to the wireless audio device and control playback of the video content on the display device based on the determined first wireless audio processing delay such that the playback of the video content is time-synchronized with playback of the audio content on the wireless audio device. In accordance with an embodiment, the playback of the video content is delayed by at least the determined first wireless audio processing delay to match a time of the playback of the audio content on the wireless audio device. - In accordance with an embodiment, the circuitry may be further configured to determine a Media Access Control (MAC) address of the wireless audio device and determine a second wireless audio processing delay associated with the wireless audio device based on the determined MAC address. In accordance with an embodiment, the circuitry may be further configured to control the playback of the video content on the display device further based on the determined second wireless audio processing delay.
- In accordance with an embodiment, the electronic apparatus may further include a memory (such as the memory 204) configured to store a first delay profile for the electronic apparatus. The first delay profile comprises the first wireless audio processing delay associated with the electronic apparatus. In accordance with an embodiment, the circuitry may be further configured to determine the first wireless audio processing delay based on the first delay profile.
- In accordance with an embodiment, the electronic apparatus may further include an Audio/Video (AV) source (such as the AV source 206) configured to output compressed media comprising compressed audio data (such as the
compressed audio data 302A) and compressed video data (such as the compressed video data 302B). - In accordance with an embodiment, the electronic apparatus may further include a buffer memory (such as the buffer memory 208) coupled to the AV source. The circuitry may be configured to store the compressed video data in the buffer memory for a holding duration that includes the determined first wireless audio processing delay while the compressed audio data is decoded as the audio content and transmitted to the wireless audio device. The circuitry may be further configured to extract the compressed video data from the buffer memory after the holding duration. After the extraction, the circuitry may be further configured to decode the compressed video data as the video content. Thereafter, the circuitry may be configured to control the playback of the video content on the display device to match with a timing of the playback of the audio content on the wireless audio device.
- In accordance with an embodiment, the electronic apparatus may further include a buffer memory (such as the buffer memory 208) coupled to the AV source. The circuitry may be configured to decode the compressed video data to obtain the video content. After the decode, the circuitry may be configured to store the video content in the buffer memory for a holding duration that includes the determined first wireless audio processing delay while the compressed audio data is decoded as the audio content and transmitted to the wireless audio device. The circuitry may be further configured to extract the stored video data from the buffer memory after the holding duration. Thereafter, the circuitry may be configured to control the playback of the video content on the display device to match with a timing of the playback of the transmitted audio content on the wireless audio device.
- In accordance with an embodiment, the circuitry may be further configured to control the display device to display a test video. The circuitry may be further configured to transmit a test audio associated with the test video to the
wireless audio device 108. The circuitry may be further configured to receive, via the wireless audio device, a user (such as the user 112) input comprising a duration by which the playback of the test video is to be delayed to match a time of the playback of the transmitted test audio on the wireless audio device. Thereafter, the circuitry may be configured to set the first wireless audio processing delay as the duration included in the user input. - The present disclosure may be realized in hardware, or a combination of hardware and software. The present disclosure may be realized in a centralized fashion, in at least one computer system, or in a distributed fashion, where different elements may be spread across several interconnected computer systems. A computer system or other apparatus adapted to carry out the methods described herein may be suited. A combination of hardware and software may be a general-purpose computer system with a computer program that, when loaded and executed, may control the computer system such that it carries out the methods described herein. The present disclosure may be realized in hardware that comprises a portion of an integrated circuit that also performs other functions.
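The test-video calibration path just summarized takes the first wireless audio processing delay directly from the user-reported offset. The sketch below adds one assumption of its own: averaging over repeated trials to dampen reaction-time noise, which the disclosure does not require:

```python
def first_delay_from_calibration(user_offsets_ms):
    """Set the first wireless audio processing delay from one or more
    user-reported durations (ms) by which the test video had to be delayed
    to line up with the test audio on the wireless audio device."""
    if not user_offsets_ms:
        raise ValueError("at least one calibration sample is required")
    return sum(user_offsets_ms) / len(user_offsets_ms)

# Three trials with the same test video/audio pair:
delay_ms = first_delay_from_calibration([240, 260, 250])
```

A single trial simply returns the user's duration unchanged, matching the behavior described in the embodiment.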
- The present disclosure may also be embedded in a computer program product, which comprises all the features that enable the implementation of the methods described herein, and which when loaded in a computer system is able to carry out these methods. Computer program, in the present context, means any expression, in any language, code or notation, of a set of instructions intended to cause a system with information processing capability to perform a particular function either directly, or after either or both of the following: a) conversion to another language, code or notation; b) reproduction in a different material form.
- While the present disclosure is described with reference to certain embodiments, it will be understood by those skilled in the art that various changes may be made, and equivalents may be substituted without departure from the scope of the present disclosure. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the present disclosure without departure from its scope. Therefore, it is intended that the present disclosure is not limited to the particular embodiment disclosed, but that the present disclosure will include all embodiments that fall within the scope of the appended claims.
Claims (21)
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/905,166 US11196899B1 (en) | 2020-06-18 | 2020-06-18 | Synchronization of wireless-audio to video |
CN202180006340.3A CN114667510A (en) | 2020-06-18 | 2021-06-17 | Wireless audio and video synchronization |
KR1020227021863A KR20220101726A (en) | 2020-06-18 | 2021-06-17 | Synchronization to video in wireless audio |
EP21826471.1A EP4133377A4 (en) | 2020-06-18 | 2021-06-17 | Synchronization of wireless-audio to video |
PCT/US2021/037930 WO2021257902A1 (en) | 2020-06-18 | 2021-06-17 | Synchronization of wireless-audio to video |
JP2022538804A JP2023508945A (en) | 2020-06-18 | 2021-06-17 | Synchronization of wireless audio with video |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/905,166 US11196899B1 (en) | 2020-06-18 | 2020-06-18 | Synchronization of wireless-audio to video |
Publications (2)
Publication Number | Publication Date |
---|---|
US11196899B1 US11196899B1 (en) | 2021-12-07 |
US20210400168A1 true US20210400168A1 (en) | 2021-12-23 |
Family
ID=78818789
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/905,166 Active US11196899B1 (en) | 2020-06-18 | 2020-06-18 | Synchronization of wireless-audio to video |
Country Status (6)
Country | Link |
---|---|
US (1) | US11196899B1 (en) |
EP (1) | EP4133377A4 (en) |
JP (1) | JP2023508945A (en) |
KR (1) | KR20220101726A (en) |
CN (1) | CN114667510A (en) |
WO (1) | WO2021257902A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11871069B2 (en) * | 2019-07-26 | 2024-01-09 | Lg Electronics Inc. | Multimedia service providing device and multimedia service providing method |
US20240155178A1 (en) * | 2022-11-03 | 2024-05-09 | Roku, Inc. | Private listening system for streaming audio and video |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101416249B1 (en) | 2007-08-01 | 2014-07-07 | 삼성전자 주식회사 | Signal processing apparatus and control method thereof |
US8743284B2 (en) | 2007-10-08 | 2014-06-03 | Motorola Mobility Llc | Synchronizing remote audio with fixed video |
KR101450100B1 (en) | 2007-11-22 | 2014-10-15 | 삼성전자주식회사 | Multimedia apparatus and synchronization method thereof |
WO2010077564A1 (en) * | 2008-12-08 | 2010-07-08 | Analog Devices Inc. | Multimedia switching over wired or wireless connections in a distributed environment |
CN105376628B (en) | 2014-08-27 | 2018-09-21 | 深圳Tcl新技术有限公司 | Audio and video frequency signal synchronizing method and device |
CN204305260U (en) | 2014-12-28 | 2015-04-29 | 冠捷显示科技(厦门)有限公司 | The television set of a kind of video and wireless sound box Audio Matching |
CN106331562B (en) | 2015-06-16 | 2020-04-24 | 南宁富桂精密工业有限公司 | Cloud server, control device and audio and video synchronization method |
US10158905B2 (en) | 2016-09-14 | 2018-12-18 | Dts, Inc. | Systems and methods for wirelessly transmitting audio synchronously with rendering of video |
US10892833B2 (en) * | 2016-12-09 | 2021-01-12 | Arris Enterprises Llc | Calibration device, method and program for achieving synchronization between audio and video data when using Bluetooth audio devices |
2020
- 2020-06-18 US US16/905,166 patent/US11196899B1/en active Active
2021
- 2021-06-17 CN CN202180006340.3A patent/CN114667510A/en active Pending
- 2021-06-17 KR KR1020227021863A patent/KR20220101726A/en not_active Application Discontinuation
- 2021-06-17 EP EP21826471.1A patent/EP4133377A4/en active Pending
- 2021-06-17 WO PCT/US2021/037930 patent/WO2021257902A1/en unknown
- 2021-06-17 JP JP2022538804A patent/JP2023508945A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
EP4133377A1 (en) | 2023-02-15 |
US11196899B1 (en) | 2021-12-07 |
KR20220101726A (en) | 2022-07-19 |
WO2021257902A1 (en) | 2021-12-23 |
CN114667510A (en) | 2022-06-24 |
JP2023508945A (en) | 2023-03-06 |
EP4133377A4 (en) | 2023-11-01 |
Similar Documents
Publication | Title |
---|---|
JP7120997B2 | Multi-mode synchronous rendering of audio and video |
US9826015B2 | Dynamic and automatic control of latency buffering for audio/video streaming |
EP2987296B1 | Method and apparatus for packet header compression |
US9928844B2 | Method and system of audio quality and latency adjustment for audio processing by using audio feedback |
US8665370B2 | Method for synchronized playback of wireless audio and video and playback system using the same |
US20160098244A1 | Audio synchronization method for bluetooth speakers |
US11196899B1 | Synchronization of wireless-audio to video |
KR20160139020A | Broadcast and broadband hybrid service with mmt and dash |
KR102306352B1 | Signaling and operation of an mmtp de-capsulation buffer |
JP7100052B2 | Electronic device and its control method |
US20170026439A1 | Devices and methods for facilitating video and graphics streams in remote display applications |
KR102356956B1 | Method and apparatus for signaling and operation of low delay consumption of media data in mmt |
US20140013362A1 | Method for implementing digital television technology and wireless fidelity hot spot apparatus |
US20140365685A1 | Method, System, Capturing Device and Synchronization Server for Enabling Synchronization of Rendering of Multiple Content Parts, Using a Reference Rendering Timeline |
US20150201253A1 | Methods and apparatus for universal presentation timeline alignment |
US20220103609A1 | Method and apparatus for playing multimedia streaming data |
US20150189328A1 | Systems and methods for transmitting and receiving audio and video data |
US20240056617A1 | Signaling changes in aspect ratio of media content |
KR102209782B1 | Method for providing of streamming service and apparatus for the same |
US20170374243A1 | Method of reducing latency in a screen mirroring application and a circuit of the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
AS | Assignment |
Owner name: SONY CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CANDELORE, BRANT;NEJAT, MAHYAR;SHINTANI, PETER;AND OTHERS;SIGNING DATES FROM 20200722 TO 20200928;REEL/FRAME:054139/0793 |
|
AS | Assignment |
Owner name: SONY GROUP CORPORATION, JAPAN Free format text: CHANGE OF NAME;ASSIGNOR:SONY CORPORATION;REEL/FRAME:057246/0328 Effective date: 20200401 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |