CN103905879A - Video data and audio data synchronized playing method and device and equipment


Info

Publication number: CN103905879A (application CN201410093950.3A); granted as CN103905879B
Authority: CN (China)
Prior art keywords: playing, display terminal, data, time value, video data
Legal status: Granted; Active
Original language: Chinese (zh)
Inventor: 李典
Original and current assignee: Beijing QIYI Century Science and Technology Co Ltd
Application filed by Beijing QIYI Century Science and Technology Co Ltd; priority to CN201410093950.3A

Landscapes

  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

The embodiment of the invention provides a method, a device, and equipment for synchronously playing video data and audio data. The method includes: receiving a multimedia data playing request at a display terminal side, the playing request comprising a multimedia data identifier; acquiring corresponding video data from an on-demand server in communication with the display terminal side according to the multimedia data identifier; sending the multimedia data identifier to a mobile device side, where the mobile device side is used for acquiring corresponding audio data from the on-demand server according to the multimedia data identifier and the display terminal side is connected with the mobile device side in a wireless transmission mode; when the video data is played, generating a playing target timestamp according to the currently played video timestamp; and sending the playing target timestamp to the mobile device side, where the mobile device side is used for playing the audio data corresponding to the playing target timestamp. With the method, device and equipment, synchronized playing of the audio data and the video data is achieved.

Description

Method, device and equipment for synchronously playing video data and audio data
Technical Field
The embodiment of the invention relates to the technical field of multimedia data processing, and in particular to a method, a device, and equipment for synchronously playing video data and audio data.
Background
With the popularization of players that support various kinds of media, more and more media files are played together to obtain a better viewing and artistic experience; playing videos, listening to music and browsing pictures are the most widely used ways of consuming media.
Taking smart television devices as an example, smart television devices include smart televisions and smart set-top boxes and can play multimedia data. In some situations, for example when family members are resting at night, a user may wish to keep watching multimedia data without disturbing others, and may therefore prefer to listen to the sound of the video played by the smart television device through earphones rather than through the loudspeaker of the smart television device.
There are generally two ways to use headphones: one is to connect a wired earphone to the smart television device; the other is to link a Bluetooth headset to a smart television that supports it.
In the first method, the wired earphone is plugged into the smart television device. Since the smart television device is usually far from the viewing position, the wired earphone needs a long cable, and plugging and unplugging it is inconvenient. The user has to drag a relatively long cable when performing other actions, such as pouring water, which makes operation inconvenient.
The second method requires the additional purchase of a Bluetooth headset, which is costly; moreover, with a Bluetooth headset the sound usually lags behind the picture, which results in a very poor user experience.
Disclosure of Invention
The technical problem to be solved by the embodiments of the present invention is to provide a method for synchronously playing video data and audio data to solve the problems of inconvenient operation and high cost.
Correspondingly, the embodiment of the invention also provides a device and equipment for synchronously playing the video data and the audio data, which are used for ensuring the realization and the application of the method.
In order to solve the above problem, an embodiment of the present invention discloses a method for synchronously playing video data and audio data, including:
receiving a multimedia data playing request at a display terminal side; the playing request comprises a multimedia data identifier;
acquiring corresponding video data from an on-demand server in mutual communication with the display terminal side according to the multimedia data identifier;
sending the multimedia data identifier to a mobile equipment side; the mobile equipment side is used for acquiring corresponding audio data from an on-demand server which is in communication with the display terminal side according to the multimedia data identifier, and the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
when the video data is played, generating a playing target timestamp according to the currently played video timestamp;
sending the playing target timestamp to a mobile equipment side; and the mobile equipment side is used for playing the audio data corresponding to the playing target timestamp.
Preferably, the wireless transmission means comprises one or more of:
Ethernet, Bluetooth, 2.4G wireless network, infrared, and the wireless network protocol ZigBee.
Preferably, the playing target timestamp includes a video timestamp corresponding to current video data extracted by the display terminal side when the video data is played.
Preferably, the playing target timestamp includes a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when the video data is played; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
Preferably, the delay time value is a delay time value obtained by sending preset simulation data to the mobile device and recording a current first system time value, receiving the simulation data returned by the mobile device and recording a current second system time value, and calculating a half of a difference between the second system time value and the first system time value;
or,
the delay time value is the delay time value sent by the mobile equipment.
Preferably, after the step of obtaining the corresponding video data from the on-demand server according to the multimedia data identifier, the method further includes:
and buffering the video data.
Preferably, the multimedia data identification includes a file name of the multimedia data, and/or a file address of the multimedia data.
The embodiment of the invention also discloses a method for synchronously playing the video data and the audio data, which comprises the following steps:
receiving a multimedia data identifier sent by a display terminal side at a mobile equipment side; the multimedia data identifier is extracted from a playing request received by the display terminal side, and the display terminal side is used for acquiring video data from an on-demand server which is in communication with the display terminal according to the multimedia data identifier; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
acquiring audio data from an on-demand server in communication with the mobile equipment according to the multimedia data identifier;
receiving a playing target timestamp sent by the display terminal side; the playing target timestamp is a timestamp generated by the display terminal according to the currently played video timestamp when the video data is played;
and playing the audio data corresponding to the playing target timestamp.
Preferably, the wireless transmission means comprises one or more of:
Ethernet, Bluetooth, 2.4G wireless network, infrared, and the wireless network protocol ZigBee.
Preferably, the playing target timestamp includes a video timestamp corresponding to current video data extracted by the display terminal side when the video data is played.
Preferably, the playing target timestamp includes a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when the video data is played; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
Preferably, after the step of receiving the play target timestamp sent by the display terminal side, the method further includes:
obtaining a delay time value;
adding the delay time value to the time value indicated by the play target timestamp.
Preferably, the delay time value is a delay time value obtained by sending preset simulation data to the display terminal and recording a current third system time value, receiving the simulation data returned by the display terminal and recording a current fourth system time value, and calculating a half of a difference between the fourth system time value and the third system time value;
or,
the delay time value is the delay time value sent by the display terminal.
Preferably, the audio data carries one or more audio time stamps;
the step of playing the audio data corresponding to the playing target timestamp includes:
when the audio time stamp played currently is larger than the playing target time stamp, the audio data playing is paused until the audio time stamp played currently is equal to the playing target time stamp;
and/or,
when the audio time stamp played currently is smaller than or equal to the playing target time stamp, searching for the audio time stamp which is equal to the playing target time stamp;
and playing the audio data corresponding to the audio time stamp.
Preferably, before the step of playing the audio data corresponding to the playing target timestamp, the method further includes:
and carrying out buffering processing on the audio data.
Preferably, the multimedia data identifier includes a file name of the multimedia data, and/or a file address of the multimedia data;
the embodiment of the invention also discloses a device for synchronously playing the video data and the audio data, which comprises the following steps:
the multimedia data identification receiving module is used for receiving a multimedia data playing request at a display terminal side; the playing request comprises a multimedia data identifier;
the video data acquisition module is used for acquiring corresponding video data from an on-demand server which is communicated with the display terminal side according to the multimedia data identification;
the multimedia data identifier sending module is used for sending the multimedia data identifier to a mobile equipment side; the mobile equipment side is used for acquiring corresponding audio data from an on-demand server which is in communication with the display terminal side according to the multimedia data identifier, and the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
the playing target timestamp generating module is used for generating a playing target timestamp according to the currently played video timestamp when the video data is played;
the playing target timestamp sending module is used for sending the playing target timestamp to the mobile equipment side; and the mobile equipment side is used for playing the audio data corresponding to the playing target timestamp.
Preferably, the wireless transmission means comprises one or more of:
Ethernet, Bluetooth, 2.4G wireless network, infrared, and the wireless network protocol ZigBee.
Preferably, the playing target timestamp includes a video timestamp corresponding to current video data extracted by the display terminal side when the video data is played.
Preferably, the playing target timestamp includes a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when the video data is played; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
Preferably, the delay time value is a delay time value obtained by sending preset simulation data to the mobile device and recording a current first system time value, receiving the simulation data returned by the mobile device and recording a current second system time value, and calculating a half of a difference between the second system time value and the first system time value;
or,
the delay time value is the delay time value sent by the mobile equipment.
Preferably, the apparatus further comprises:
and the first buffer module is used for buffering the video data.
Preferably, the multimedia data identification includes a file name of the multimedia data, and/or a file address of the multimedia data.
The embodiment of the invention also discloses a device for synchronously playing the video data and the audio data, which comprises:
the multimedia data identifier receiving module is used for receiving the multimedia data identifier sent by the display terminal side at the mobile equipment side; the multimedia data identifier is extracted from a playing request received by the display terminal side, and the display terminal side is used for acquiring video data from an on-demand server which is in communication with the display terminal according to the multimedia data identifier; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
the audio data module is used for acquiring audio data from an on-demand server which is communicated with the mobile equipment according to the multimedia data identifier;
a playing target timestamp receiving module, configured to receive a playing target timestamp sent by the display terminal side; the playing target timestamp is a timestamp generated by the display terminal according to the currently played video timestamp when the video data is played;
and the audio data playing module is used for playing the audio data corresponding to the playing target timestamp.
Preferably, the wireless transmission means comprises one or more of:
Ethernet, Bluetooth, 2.4G wireless network, infrared, and the wireless network protocol ZigBee.
Preferably, the playing target timestamp includes a video timestamp corresponding to current video data extracted by the display terminal side when the video data is played.
Preferably, the playing target timestamp includes a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when the video data is played; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
Preferably, the apparatus further comprises:
the delay time value acquisition module is used for acquiring a delay time value;
and the delay time value adding module is used for adding the delay time value to the time value indicated by the playing target timestamp.
Preferably, the delay time value is a delay time value obtained by sending preset simulation data to the display terminal and recording a current third system time value, receiving the simulation data returned by the display terminal and recording a current fourth system time value, and calculating a half of a difference between the fourth system time value and the third system time value;
or,
the delay time value is the delay time value sent by the display terminal.
Preferably, the audio data carries one or more audio timestamps; the audio data playing module comprises:
the playing pause submodule is used for pausing the playing of the audio data until the currently played audio timestamp is equal to the playing target timestamp when the currently played audio timestamp is greater than the playing target timestamp;
and/or,
the searching submodule is used for searching the audio time stamp which is equal to the playing target time stamp when the currently played audio time stamp is less than or equal to the playing target time stamp;
and the corresponding playing submodule is used for playing the audio data corresponding to the audio time stamp.
Preferably, the apparatus further comprises:
and the second buffer module is used for carrying out buffer processing on the audio data.
Preferably, the multimedia data identification includes a file name of the multimedia data, and/or a file address of the multimedia data.
The embodiment of the invention also discloses a system for synchronously playing the video data and the audio data, which comprises a display terminal, a mobile device and an on-demand server which are communicated with each other;
the display terminal is used for receiving a multimedia data playing request and sending a video data acquisition request to the on-demand server according to the playing request; the playing request comprises a multimedia data identifier;
the mobile equipment is used for receiving the multimedia data identifier sent by the display terminal and sending an audio data acquisition request to the on-demand server according to the multimedia data identifier;
the on-demand server is used for respectively returning corresponding video data and audio data to the display terminal and the mobile equipment according to the video data acquisition request and the audio data acquisition request;
the display terminal is further used for playing the video data, generating a playing target timestamp according to the currently played video timestamp, and sending the playing target timestamp to the mobile device; the mobile device is further configured to play the audio data corresponding to the playing target timestamp.
The embodiment of the invention also discloses a device, which comprises:
one or more processors;
a memory; and
one or more modules stored in the memory and configured to be executed by the one or more processors, wherein the one or more modules have functionality to:
receiving a multimedia data playing request at a display terminal side; the playing request comprises a multimedia data identifier;
acquiring corresponding video data from an on-demand server in mutual communication with the display terminal side according to the multimedia data identifier;
sending the multimedia data identifier to a mobile equipment side; the mobile equipment side is used for acquiring corresponding audio data from an on-demand server which is in communication with the display terminal side according to the multimedia data identifier, and the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
when the video data is played, generating a playing target timestamp according to the currently played video timestamp;
sending the playing target timestamp to a mobile equipment side; and the mobile equipment side is used for playing the audio data corresponding to the playing target timestamp.
The embodiment of the invention also discloses a device, which comprises:
one or more processors;
a memory; and
one or more modules stored in the memory and configured to be executed by the one or more processors, wherein the one or more modules have functionality to:
receiving a multimedia data identifier sent by a display terminal side at a mobile equipment side; the multimedia data identifier is extracted from a playing request received by the display terminal side, and the display terminal side is used for acquiring video data from an on-demand server which is in communication with the display terminal according to the multimedia data identifier; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
acquiring audio data from an on-demand server in communication with the mobile equipment according to the multimedia data identifier;
receiving a playing target timestamp sent by the display terminal side; the playing target timestamp is a timestamp generated by the display terminal according to the currently played video timestamp when the video data is played;
and playing the audio data corresponding to the playing target timestamp.
Compared with the background art, the embodiment of the invention has the following advantages:
In the embodiment of the invention, after the display terminal receives the multimedia data identifier, it obtains the corresponding video data according to the identifier and, when the video data is played, sends the multimedia data identifier and the playing target timestamp to the mobile device. The mobile device obtains the corresponding audio data according to the multimedia data identifier and plays the audio data according to the playing target timestamp. Because the display terminal is connected to the mobile device in a wireless transmission mode, the user is freed from the constraint of a wired earphone directly connected to the display terminal, which makes operation convenient; at the same time, the obvious asynchrony caused by the accumulation of tiny differences between the playing of the audio data and the video data is avoided, and synchronous playing of the audio data and the video data is achieved. In addition, the mobile device is a product that the public already uses frequently; the embodiment of the invention reuses the mobile device for an additional purpose, avoids the extra purchase of a Bluetooth headset, is highly practical, and greatly reduces cost.
According to the embodiment of the invention, the delay time value is added in the playing target timestamp, so that the influence of the delay of the display terminal and the mobile equipment in transmitting the playing target timestamp is eliminated, and the synchronous playing precision of the audio data and the video data is further improved.
Drawings
FIG. 1 is a diagram illustrating an exemplary process for a display terminal to synchronously play video data and audio data;
FIG. 2 is a diagram illustrating an exemplary process of a display terminal playing video data and audio data synchronously with a Bluetooth headset;
fig. 3 is a flowchart illustrating steps of embodiment 1 of a method for synchronously playing video data and audio data according to the present invention;
FIG. 4 is a flow chart illustrating the steps of embodiment 2 of the method for playing video data and audio data synchronously;
FIG. 5 is a flow chart illustrating the steps of embodiment 3 of the method for playing video data and audio data synchronously;
FIG. 6 is a flow chart illustrating the steps of embodiment 4 of the method for playing video data and audio data synchronously;
fig. 7 is a block diagram showing the structure of an embodiment 1 of the apparatus for synchronously playing video data and audio data according to the present invention;
fig. 8 is a block diagram showing the structure of an embodiment 2 of the apparatus for synchronously playing video data and audio data according to the present invention;
FIG. 9 is a block diagram of a system for synchronized playback of video data and audio data according to an embodiment of the present invention;
fig. 10 is a schematic structural diagram of a smart television according to an embodiment of the present invention;
fig. 11 shows a schematic structural diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the embodiments of the present invention more comprehensible, embodiments of the present invention are described in detail below with reference to the accompanying drawings and the detailed description.
Referring to fig. 1, a flowchart illustrating a process of synchronously playing video data and audio data by a display terminal is shown.
As shown in fig. 1, an audio/video data reading module in a display terminal reads multimedia data and then decodes the multimedia data to obtain video data and audio data, and the video data and the audio data have time stamps for synchronization.
Then the video data is sent to a video output module and the audio data is sent to an audio output module. When the multimedia data is played, the playing time point control module synchronizes the timestamps of the video data and the audio data; the video output module then plays the video data on the display according to the synchronized timestamps, and the audio output module plays the audio data through the loudspeaker according to the synchronized timestamps. In this way, synchronous playing of the video data and the audio data is realized, and playing of the multimedia data is achieved as a whole.
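As a minimal illustration of the timestamp-driven scheduling described above, the following sketch (with purely hypothetical frame lists and timing values that are not part of the patent) renders two decoded streams against one shared clock so that frames with equal timestamps are output together:

```python
import time

# Hypothetical decoded streams: (timestamp_ms, payload) pairs produced after
# the multimedia data has been decoded into video data and audio data.
video_frames = [(0, "v0"), (40, "v1"), (80, "v2"), (120, "v3")]
audio_frames = [(0, "a0"), (40, "a1"), (80, "a2"), (120, "a3")]

def play_synced(video, audio):
    """Output both streams against one wall clock (the role played by the
    playing time point control module), so that frames whose timestamps are
    equal are presented together."""
    start = time.monotonic()
    events = sorted([(ts, "video", p) for ts, p in video] +
                    [(ts, "audio", p) for ts, p in audio])
    for ts, kind, payload in events:
        # Wait until the shared clock reaches this frame's timestamp.
        remaining = ts / 1000.0 - (time.monotonic() - start)
        if remaining > 0:
            time.sleep(remaining)
        print(f"{ts:>4} ms  {kind} output: {payload}")

play_synced(video_frames, audio_frames)
```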
However, in some cases, such as at night when family members rest, to avoid disturbing others, the user may wish to listen to the audio data through the earphones when the smart television device plays the multimedia data.
To address this, a Bluetooth headset can be used to listen to the audio data remotely while the smart television device plays the multimedia data.
In Bluetooth applications, Bluetooth products are distinguished by device type and service type.
Generally, the device type includes a main device type and an auxiliary device type, which together specify what kind of device a Bluetooth device is, such as a headset, a mobile phone or a printer. Taking a mobile phone as an example, whether it is a smart phone or an ordinary mobile phone is specified by the auxiliary device type.
The service type specifies the services that a Bluetooth device can provide. Taking mobile phones as an example, some mobile phones support two file transfer services, the Object Push Profile (OPP) service and the File Transfer Protocol (FTP) service, while others only provide the OPP service. Two Bluetooth devices that need to communicate with each other may have different device types, such as a mobile phone and an earphone, but their service protocols must be consistent: because the earphone is required to provide a voice service, the mobile phone that finds the earphone must query which services the earphone can provide before connecting, and only then can communication take place.
Although a mobile device, such as a smart phone or a smart tablet, has a Bluetooth function and is a Bluetooth product, its device type is not that of a headset and it cannot provide the Bluetooth headset service capability; therefore it cannot be connected as a headset to the Bluetooth module on a display terminal to receive pushed audio data.
Therefore, if the user wants to listen to the audio data of the smart television device playing the multimedia data through the bluetooth headset, the bluetooth headset needs to be additionally purchased.
Referring to fig. 2, a flowchart illustrating a process of playing video data and audio data synchronously by a display terminal and a bluetooth headset is shown.
As shown in fig. 2, the display terminal is connected to the bluetooth headset through a bluetooth link. And an audio and video data reading module in the display terminal reads the multimedia data and then decodes the multimedia data to obtain video data and audio data, wherein the video data and the audio data are provided with time stamps for synchronization.
And then sending the video data to a video output module and sending the audio data to the Bluetooth headset.
And a Bluetooth audio receiving module in the Bluetooth earphone receives audio data sent by the display terminal, then transmits the audio data to an audio output module, and then outputs the audio data to an earphone loudspeaker for playing.
The audio data and the video data are played on two different devices, and before the audio data is played it goes through the Bluetooth transmission process and the data processing during that transmission, which inevitably introduces a delay.
In addition, the clock frequency used by the Bluetooth headset for timing differs slightly from the clock frequency of the display terminal, which makes the audio data and the video data play at slightly different speeds. This difference accumulates continuously, and as the playing time increases the audio data and the video data become out of sync more and more noticeably.
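As a purely hypothetical illustration (the figures are not from the patent): if the headset clock and the display terminal clock differ by 0.01% (100 ppm), the audio drifts relative to the video by roughly 6 ms per minute of playback, i.e. about 0.36 s after an hour, which is clearly noticeable.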
Based on the above requirements, the inventor creatively proposes one of the core concepts of the embodiments of the present invention: the display terminal decodes the multimedia data into video data and audio data and transmits the audio data to the mobile device; when the audio data is played, the audio playing information is returned to the display terminal, and the video data is played synchronously according to the timestamp.
Referring to fig. 3, a flowchart illustrating steps of embodiment 1 of a method for playing video data and audio data synchronously according to the present invention is shown, and an embodiment of the present invention may include the following steps:
step 301, receiving a multimedia data playing request at a display terminal side; the playing request comprises a multimedia data identifier;
It should be noted that the display terminal may include a smart television, a personal computer, a palm computer, a mobile device, and the like, and the smart television may include a Liquid Crystal Display (LCD) television, a Light Emitting Diode (LED) television, a 3D television, a plasma television, and the like, which is not limited in this embodiment of the present invention.
The multimedia data may be a digital television signal, may be multimedia data stored on a display terminal or a magnetic disk of an external device, may be streaming media data, and the like, which is not limited in this embodiment of the present invention.
In a specific implementation, the play request may be generated by a control device of the display terminal, for example, a remote controller of the smart television, and when the user selects the corresponding multimedia data on the remote controller, the remote controller may generate a play request carrying the corresponding multimedia data identifier according to the multimedia data selected by the user, and send the play request to the display terminal.
Step 302, obtaining corresponding video data from an on-demand server in mutual communication with the display terminal side according to the multimedia data identification;
in practical application, when a display terminal receives a play request, a multimedia data identifier is extracted from the play request, and corresponding video data is acquired from an on-demand server which is communicated with the display terminal. As an example of a specific application of the embodiment of the present invention, the multimedia data identifier may include a file name of the multimedia data, and/or a file address of the multimedia data.
Specifically, the display terminal may obtain the corresponding video data from the on-demand server according to a file name of the multimedia data, or obtain the corresponding video data from the on-demand server according to a file address of the multimedia data, for example, a Uniform Resource Locator (URL) of the multimedia data in the on-demand server.
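The following minimal sketch illustrates this step; the on-demand server host, path and query parameter are assumptions made for illustration only and are not defined in the patent:

```python
from urllib.parse import urlencode
from urllib.request import urlopen

ON_DEMAND_SERVER = "http://vod.example.com"  # hypothetical on-demand server

def fetch_video_data(multimedia_id: str) -> bytes:
    """Request the video data that corresponds to the multimedia data
    identifier (a file name, or a file address such as a URL)."""
    url = f"{ON_DEMAND_SERVER}/video?{urlencode({'id': multimedia_id})}"
    with urlopen(url, timeout=10) as resp:
        return resp.read()

# The display terminal extracts the identifier from the playing request and
# fetches the corresponding video data, e.g.:
# video_bytes = fetch_video_data("movie_12345")   # hypothetical identifier
```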
Step 303, sending the multimedia data identifier to a mobile equipment side; the mobile equipment side is used for acquiring corresponding audio data from an on-demand server which is in communication with the display terminal side according to the multimedia data identifier, and the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
by applying the embodiment of the invention, the display terminal can establish a data transmission link with the mobile equipment.
It should be noted that the mobile device may be various mobile devices such as a tablet computer, a Personal Digital Assistant (PDA), a mobile phone, and the like, and the embodiment of the present invention is not limited thereto.
In a preferred example of the embodiment of the present invention, the manner of wireless transmission may include one or more of the following:
Ethernet;
in this example, the transmission link between the display terminal and the mobile device may be an Ethernet (Ethernet) link, and the display terminal side may transmit the audio data to the mobile device side through the Ethernet transmission manner.
Ethernet is a computer local area networking technology that uses passive media to broadcast information. It specifies the physical layer and data link layer protocols, the interface between the physical layer and the data link layer, and the interface between the data link layer and higher layers. Its standard topology is a bus topology; however, current fast Ethernet (the 100BASE-T and 1000BASE-T standards) uses switches (switching hubs) to connect and organize the network in order to minimize collisions and maximize network speed and efficiency, so the physical topology of such an Ethernet is a star, while logically it still uses the bus topology and the CSMA/CD (Carrier Sense Multiple Access with Collision Detection) bus contention technique.
In this example, the Ethernet connection may be WiFi (a wireless local area network technology based on the IEEE 802.11 standards). After the display terminal and the mobile device join the same local area network, a connection can be initiated via an IP (Internet Protocol) address using the TCP/IP (Transmission Control Protocol/Internet Protocol) protocol suite.
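A minimal sketch of how such a TCP/IP link between the two sides might be established once both are on the same local area network; the port number is an assumed value, not one specified by the patent:

```python
import socket

PORT = 9000  # hypothetical port agreed on by both sides

def display_terminal_listen() -> socket.socket:
    """Display terminal side: accept a single connection from the mobile device."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", PORT))
    srv.listen(1)
    conn, _addr = srv.accept()
    return conn  # later used to push the multimedia identifier and timestamps

def mobile_device_connect(terminal_ip: str) -> socket.socket:
    """Mobile device side: connect to the display terminal by its IP address."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.connect((terminal_ip, PORT))
    return sock
```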
Bluetooth;
in this example, a transmission link between the display terminal and the mobile device may be a bluetooth link, and the display terminal side may transmit the audio data to the mobile device side by using a bluetooth transmission manner.
Bluetooth, a radio technology that supports short-range communication (typically within 10 m) of devices. The wireless information exchange can be carried out among a plurality of devices such as mobile phones, PDAs, wireless earphones, notebook computers and related peripherals.
Using Bluetooth technology can effectively simplify communication between mobile communication terminal devices as well as communication between these devices and the Internet, so that data transmission becomes faster and more efficient, broadening the road for wireless communication.
Bluetooth adopts a distributed network structure together with fast frequency hopping and short packet techniques, supports point-to-point and point-to-multipoint communication, and works in the globally available 2.4 GHz ISM (industrial, scientific and medical) band. The data rate is 1 Mbps, and full-duplex transmission is realized by means of a time-division duplex scheme.
In this example, according to the bluetooth protocol, the mobile device may search for surrounding devices, list device IDs and names, and select a display terminal to be connected to perform connection.
It should be noted that, in this example, a mobile device such as a smart phone or a smart tablet has a Bluetooth function and is a Bluetooth product; its device type may be that of a display terminal, and it can provide Bluetooth data transmission capability between display terminals, so the mobile device can be used as an audio data receiving terminal that is connected to the Bluetooth module on the display terminal and to which audio data is pushed.
2.4G wireless networks;
in this example, the transmission link between the display terminal and the mobile device may be a 2.4G wireless network link, and the display terminal side may transmit the audio data to the mobile device side through a transmission manner of the 2.4G wireless network.
The 2.4G wireless network frequency band belongs to the ISM band, an ultra-low-radiation, environmentally friendly band widely used around the world. It provides 125 communication channels, so 2.4G wireless communication is smooth and multiple communication commands do not interfere with each other. The maximum transmission rate of a 2.4G wireless network can reach 108 Mbps, so its transmission speed is high; its transmission distance is relatively long (an effective transmission distance of about 200 m in open areas), it is not affected by the direction of transmission, and it supports two-way communication.
Infrared rays;
in this example, the transmission link between the display terminal and the mobile device may be an infrared link, and the display terminal side may transmit the audio data to the mobile device side by an infrared transmission manner.
Infrared communication is a wireless communication mode that can transmit wireless data. Infrared communication has the advantages of low cost, convenient connection, simplicity of use and compact structure, and is therefore widely used in small mobile devices. Through an infrared interface, various mobile devices can exchange data freely.
The wireless network protocol ZigBee.
In this example, a transmission link between the display terminal and the mobile device may be a ZigBee link, and the display terminal side may transmit the audio data to the mobile device side in a transmission manner of a wireless network protocol ZigBee.
ZigBee is a wireless network protocol for low-rate, short-range transmission based on the IEEE 802.15.4 standard. Its protocol stack consists, from bottom to top, of the physical layer (PHY), the medium access control layer (MAC), the transport layer (TL), the network layer (NWK), the application layer (APL) and so on, where the physical layer and the medium access control layer comply with the IEEE 802.15.4 standard.
The main characteristics of a ZigBee network are low power consumption, low cost, low data rate, support for a large number of nodes and various network topologies, low complexity, speed, reliability and safety. The devices in a ZigBee network can play three roles: coordinator (Coordinator), sink node (Router) and sensor node (End Device).
Of course, the above transmission manner is only an example, and when implementing the embodiment of the present invention, other transmission manners may be set according to actual situations as long as the connection of the wireless transmission between the display terminal and the mobile device can be achieved, which is not limited in this embodiment of the present invention. In addition, besides the above transmission modes, those skilled in the art may also adopt other transmission modes according to actual needs, and the embodiment of the present invention is not limited thereto.
In practical application, when the mobile device receives the multimedia data identifier, corresponding audio data is obtained from an on-demand server which is mutually communicated with the mobile device. As an example of a specific application of the embodiment of the present invention, the multimedia data identifier may include a file name of the multimedia data, and/or a file address of the multimedia data.
Specifically, the mobile device may obtain the corresponding audio data from the on-demand server according to a file name of the multimedia data, or obtain the corresponding audio data from the on-demand server according to a file address of the multimedia data, for example, a URL of the multimedia data at the on-demand server.
Step 304, when the video data is played, generating a playing target timestamp according to the currently played video timestamp;
In practical application, while the video data is being played, the display terminal may continuously send video playing information (i.e., the playing target timestamp) to the mobile device.
There may be some delay in data transmission between the display terminal and the mobile device, and in one case, in order to improve the accuracy of synchronous playing, the embodiment of the present invention may consider the delay in transmission when synchronizing the video data and the audio data.
In this embodiment of the present invention, the playing target timestamp may include a video timestamp and a delay time value corresponding to current video data extracted by the display terminal when playing the video data; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
In the embodiment of the present invention, if the video timestamp corresponding to the current video data is ta and the delay time value is Δt, the playing target timestamp is ta' = ta + Δt.
In the embodiment of the invention, the delay time value between the display terminal and the mobile device may be measured in advance or at the time of playing; the measurement may be actively performed by the display terminal, or the value may be obtained from the mobile device, which is not limited in this embodiment of the present invention. For example, when the display terminal and the mobile device are connected for the first time, the display terminal or the mobile device actively initiates a measurement of the delay time value, and after the measurement is finished, the identifier of the display terminal, the identifier of the mobile device, the transmission mode and the delay time value are stored in the display terminal and/or the mobile device. When the display terminal and the mobile device are connected again, and the identifier of the display terminal, the identifier of the mobile device and the transmission mode are successfully matched, the previously measured delay time value can be obtained directly from the display terminal and/or the mobile device.
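A sketch of the caching behaviour described in this paragraph, assuming an in-memory store keyed by the two device identifiers and the transmission mode (the key layout and names are illustrative, not taken from the patent):

```python
# Previously measured delay values, keyed by
# (display_terminal_id, mobile_device_id, transmission_mode).
delay_cache = {}

def get_delay_ms(terminal_id: str, device_id: str, transport: str, measure) -> float:
    """Return the stored delay value if this pairing was measured before;
    otherwise measure it now (via the supplied callable) and store it."""
    key = (terminal_id, device_id, transport)
    if key not in delay_cache:
        delay_cache[key] = measure()
    return delay_cache[key]

# Example with hypothetical identifiers and a dummy measurement:
# dt = get_delay_ms("tv-01", "phone-42", "wifi", lambda: 12.5)
```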
In a preferred example of the embodiment of the present invention, the delay time value may be a delay time value obtained by sending preset simulation data to the mobile device and recording a current first system time value, receiving the simulation data returned by the mobile device and recording a current second system time value, and calculating a half of a difference between the second system time value and the first system time value;
In practical applications, the simulation data may be data in any format. Since the delay of data transmission is related to the size of the transmitted data, the simulation data should have the same size as the time point data that is actually transmitted at one time; in this example, the simulation data may therefore have the same size as the audio data between two audio timestamps.
After the display terminal sends the simulation data to the mobile device, the mobile device immediately returns the simulation data to the display terminal, and the display terminal takes half of the time difference between sending and receiving the simulation data (i.e. half of the difference between the second system time and the first system time) as the delay time value.
If the first system time is T1 and the second system time is T2, the delay time value is Δt = (T2 − T1)/2.
Of course, the embodiment of the present invention may also measure half of the time difference between sending and receiving the simulation data multiple times and average the results to obtain the delay time value, so as to reduce the error.
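The following self-contained sketch illustrates the round-trip measurement described above (send the simulation data, record T1, receive it back, record T2, take half of T2 − T1, and average over several rounds). A local echo thread stands in for the mobile device; the port and data size are assumptions:

```python
import socket
import threading
import time

SIM_BYTES = b"\x00" * 512  # hypothetical simulation data size
PORT = 9100                # hypothetical port

def echo_server():
    """Stands in for the mobile device: returns the simulation data at once."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", PORT))
    srv.listen(1)
    conn, _ = srv.accept()
    while data := conn.recv(4096):
        conn.sendall(data)

def measure_delay_ms(rounds: int = 5) -> float:
    """Display terminal side: average half of the send/receive time difference."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.connect(("127.0.0.1", PORT))
    samples = []
    for _ in range(rounds):
        t1 = time.monotonic()               # first system time value
        sock.sendall(SIM_BYTES)
        received = 0
        while received < len(SIM_BYTES):    # wait for the echoed data
            received += len(sock.recv(4096))
        t2 = time.monotonic()               # second system time value
        samples.append((t2 - t1) / 2 * 1000.0)
    sock.close()
    return sum(samples) / len(samples)

threading.Thread(target=echo_server, daemon=True).start()
time.sleep(0.1)  # give the stand-in "mobile device" time to start listening
print(f"estimated one-way delay: {measure_delay_ms():.3f} ms")
```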
According to the embodiment of the invention, the delay time value is added in the playing target timestamp, so that the influence of the delay of the display terminal and the mobile equipment in transmitting the playing target timestamp is eliminated, and the synchronous playing precision of the audio data and the video data is further improved.
In another preferred example of the embodiment of the present invention, the delay time value may be a delay time value sent by the mobile device.
In this example, the delay time value may be a delay time value obtained by the mobile device by sending preset simulation data to the display terminal and recording a current third system time value, receiving the simulation data returned by the display terminal and recording a current fourth system time value, and calculating a half of a difference between the fourth system time value and the third system time value.
The temporal resolution of the human eye is typically about 1/24 second; if the video data and the audio data differ only slightly, the human eye will not perceive the difference. Therefore, in another case, in order to reduce the resource occupation of the display terminal or the mobile device, when the delay time value is less than a preset threshold (e.g., 40 ms), the embodiment of the present invention may ignore the transmission delay when synchronizing the video data and the audio data.
In this embodiment of the present invention, the playing target timestamp may include a video timestamp corresponding to current video data extracted by the display terminal when the video data is played.
If the video timestamp corresponding to the current video data is ta, the playing target timestamp is ta' = ta.
Step 305, sending the playing target timestamp to a mobile equipment side; and the mobile equipment side is used for playing the audio data corresponding to the playing target timestamp.
Corresponding to step 303, the display terminal side may send the play target timestamp to the mobile device side through a transmission mode of an ethernet, or may send the play target timestamp to the mobile device side through a transmission mode of a bluetooth, or may send the play target timestamp to the mobile device side through a transmission mode of a 2.4G wireless network, or may send the play target timestamp to the mobile device side through a transmission mode of an infrared ray, or may send the play target timestamp to the mobile device side through a transmission mode of a wireless network protocol ZigBee.
According to the embodiment of the invention, after receiving the multimedia data, the display terminal sends the audio data to the mobile device; when playing the video data, the display terminal sends the playing target timestamp to the mobile device, and the mobile device plays the audio data according to the playing target timestamp. Because the display terminal is connected with the mobile device in a wireless transmission mode, the user is freed from the constraint of a wired earphone directly connected to the display terminal, which makes operation convenient; at the same time, the obvious asynchrony caused by the accumulation of small differences generated when the audio data and the video data are played is avoided, and synchronous playing of the audio data and the video data is realized. In addition, the mobile device is a product that the public already uses frequently; the embodiment of the invention reuses the mobile device for an additional purpose, avoids the extra purchase of a Bluetooth headset, is highly practical, and greatly reduces cost.
Referring to fig. 4, a flowchart illustrating steps of embodiment 2 of a method for playing video data and audio data synchronously according to the present invention is shown, and an embodiment of the present invention may include the following steps:
step 401, receiving a multimedia data playing request at a display terminal side; the playing request comprises a multimedia data identifier;
step 402, obtaining corresponding video data from an on-demand server in mutual communication with the display terminal side according to the multimedia data identifier;
step 403, buffering the video data;
In a specific implementation, the buffering process may mean that the video data or the audio data starts to be played only after a buffering time has elapsed. The buffering time may be preset to a fixed value, for example 5 seconds.
In the display terminal, the video data is played after enough video data has been buffered during the buffering time; in the mobile device, the audio data is likewise played after enough audio data has been buffered during the same buffering time.
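A minimal sketch of such a fixed buffering gate, assuming a hypothetical queue of received chunks and the 5-second value mentioned above (names and structure are illustrative only):

```python
import collections
import time

BUFFER_SECONDS = 5.0               # preset buffering time value (example above)
buffer_queue = collections.deque() # (chunk_duration_s, payload) entries

def on_data_received(chunk_duration_s: float, payload: bytes) -> None:
    """Called as video (or audio) chunks arrive from the on-demand server."""
    buffer_queue.append((chunk_duration_s, payload))

def buffered_duration() -> float:
    return sum(duration for duration, _ in buffer_queue)

def wait_until_buffered(poll_s: float = 0.05) -> None:
    """Block the start of playback until enough media has been buffered."""
    while buffered_duration() < BUFFER_SECONDS:
        time.sleep(poll_s)
    # playback of the buffered data can start here
```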
Step 404, sending the multimedia data identifier to a mobile device side; the mobile equipment side is used for acquiring corresponding audio data from an on-demand server which is in communication with the display terminal side according to the multimedia data identifier, and the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
step 405, when playing the video data, generating a playing target timestamp according to the currently played video timestamp;
In this embodiment of the present invention, the playing target timestamp may include a video timestamp corresponding to current video data extracted by the display terminal when the video data is played.
If the video timestamp corresponding to the current video data is ta, the playing target timestamp is ta' = ta.
Step 406, sending the playing target timestamp to a mobile device side; and the mobile equipment side is used for playing the audio data corresponding to the playing target timestamp.
Referring to fig. 5, a flowchart illustrating steps of embodiment 3 of a method for playing video data and audio data synchronously according to the present invention is shown, and an embodiment of the present invention may include the following steps:
step 501, receiving a multimedia data identifier sent by a display terminal side at a mobile equipment side; the multimedia data identifier is extracted from a playing request received by the display terminal side, and the display terminal side is used for acquiring video data from an on-demand server which is in communication with the display terminal according to the multimedia data identifier; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
by applying the embodiment of the invention, the display terminal can establish a data transmission link with the mobile equipment.
In a preferred example of the embodiment of the present invention, the manner of wireless transmission may include one or more of the following:
Ethernet, Bluetooth, 2.4G wireless network, infrared, and the wireless network protocol ZigBee.
In this example, the transmission link between the display terminal and the mobile device may be an ethernet link, a bluetooth link, a 2.4G wireless network link, an infrared link, a ZigBee link, or other transmission link.
Specifically, the mobile device side may receive the multimedia data identifier sent by the display terminal side through a transmission mode of an ethernet, or may receive the multimedia data identifier sent by the display terminal side through a transmission mode of a bluetooth, or may receive the multimedia data identifier sent by the display terminal side through a transmission mode of a 2.4G wireless network, or may receive the multimedia data identifier sent by the display terminal side through a transmission mode of an infrared ray, or may receive the multimedia data identifier sent by the display terminal side through a transmission mode of a wireless network protocol ZigBee.
In practical application, when a display terminal receives a play request, a multimedia data identifier is extracted from the play request, and corresponding video data is acquired from an on-demand server which is communicated with the display terminal. As an example of a specific application of the embodiment of the present invention, the multimedia data identifier may include a file name of the multimedia data, and/or a file address of the multimedia data.
Specifically, the display terminal may obtain the corresponding video data from the on-demand server according to a file name of the multimedia data, or obtain the corresponding video data from the on-demand server according to a file address of the multimedia data, for example, a Uniform Resource Locator (URL) of the multimedia data in the on-demand server.
Step 502, obtaining audio data from an on-demand server in communication with the mobile device according to the multimedia data identifier;
in practical application, when the mobile device receives the multimedia data identifier, corresponding audio data is obtained from an on-demand server which is mutually communicated with the mobile device. As an example of a specific application of the embodiment of the present invention, the multimedia data identifier may include a file name of the multimedia data, and/or a file address of the multimedia data.
Specifically, the mobile device may obtain the corresponding audio data from the on-demand server according to a file name of the multimedia data, or obtain the corresponding audio data from the on-demand server according to a file address of the multimedia data, for example, a URL of the multimedia data at the on-demand server.
Step 503, receiving a playing target timestamp sent by the display terminal side; the playing target timestamp is a timestamp generated by the display terminal according to the currently played video timestamp when the video data is played;
corresponding to step 501, the mobile device side may receive the play target timestamp sent by the display terminal side through a transmission method of an ethernet, or may receive the play target timestamp sent by the display terminal side through a transmission method of a bluetooth, or may receive the play target timestamp sent by the display terminal side through a transmission method of a 2.4G wireless network, or may receive the play target timestamp sent by the display terminal side through a transmission method of an infrared ray, or may receive the play target timestamp sent by the display terminal side through a transmission method of a wireless network protocol ZigBee.
In a preferred embodiment of the present invention, the playing target timestamp may include a video timestamp corresponding to current video data extracted by the display terminal when the video data is played.
In the embodiment of the present invention, if the video timestamp corresponding to the current video data is ta, the playing target timestamp is ta' = ta.
In another preferred embodiment of the present invention, the playing target timestamp may include a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when playing the video data; the delay time value may be a time of data transmission delay between the display terminal side and the mobile device side.
In the embodiment of the present invention, if the video timestamp corresponding to the current video data is ta and the delay time value is Δt, the playing target timestamp is ta' = ta + Δt.
And step 504, playing the audio data corresponding to the playing target timestamp.
When the mobile device receives the video playing information (i.e. the playing target timestamp), the mobile device can synchronously play the audio data by using a speaker of the mobile device or accessing a wired earphone and the like according to the video playing information.
In a preferred embodiment of the present invention, the audio data may carry one or more audio timestamps, and step 504 may include the following sub-steps:
a substep S11, when the currently played audio timestamp is greater than the playing target timestamp, pausing the playing of the audio data until the currently played audio timestamp is equal to the playing target timestamp;
if the currently played audio timestamp is greater than the play target timestamp, the audio playback is ahead of the video playback; for example, if the play target timestamp is 50000 ms and the currently played audio timestamp is 50040 ms, the playing of the audio data can be paused, for example by repeatedly playing the current audio data, and normal playing does not resume until the audio data and the video data are synchronized.
Of course, the embodiment of the present invention may also, without pausing the playing of the audio data, directly search for the audio timestamp equal to the play target timestamp and play the audio data corresponding to that audio timestamp, which is not limited in this embodiment of the present invention.
And/or,
a substep S12, when the currently played audio timestamp is less than or equal to the playing target timestamp, searching for an audio timestamp equal to the playing target timestamp;
and a substep S13, playing the audio data corresponding to the audio time stamp.
If the currently played audio timestamp is less than or equal to the play target timestamp, the audio playback lags behind, or is synchronized with, the video playback; for example, if the play target timestamp is 50000 ms and the currently played audio timestamp is 49960 ms, the audio timestamp of the synchronization point can be searched for and the audio data at the synchronization point can be played directly, without playing the lagging audio data in between.
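For illustration only, the sketch below shows one way the mobile device side might decide between sub-steps S11 and S12/S13 when a play target timestamp arrives; the tolerance threshold is an assumption added here to avoid correcting negligible drift and is not part of the described embodiment.

```java
public class AudioSync {
    /** What the mobile side should do with the audio playback. */
    enum Action { PAUSE_UNTIL_EQUAL, SEEK_FORWARD, PLAY_NORMALLY }

    /**
     * Decide how to react to a play target timestamp received from the display terminal.
     * toleranceMs is an assumed threshold below which no correction is made.
     */
    static Action decide(long currentAudioTsMs, long playTargetTsMs, long toleranceMs) {
        long diff = currentAudioTsMs - playTargetTsMs;
        if (diff > toleranceMs)  return Action.PAUSE_UNTIL_EQUAL; // audio is ahead: wait for the video
        if (diff < -toleranceMs) return Action.SEEK_FORWARD;      // audio lags: jump to the sync point
        return Action.PLAY_NORMALLY;                              // already close enough to in sync
    }

    public static void main(String[] args) {
        System.out.println(decide(50_040, 50_000, 10)); // PAUSE_UNTIL_EQUAL (audio ahead by 40 ms)
        System.out.println(decide(49_960, 50_000, 10)); // SEEK_FORWARD      (audio behind by 40 ms)
    }
}
```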
It should be noted that, since method embodiment 3 corresponds to method embodiment 1, the description is relatively simple, and for the relevant points, reference may be made to part of description of method embodiment 1, and the embodiment of the present invention is not described in detail herein.
Referring to fig. 6, a flowchart illustrating steps of embodiment 4 of a method for playing video data and audio data synchronously according to the present invention is shown, and an embodiment of the present invention may include the following steps:
step 601, receiving a multimedia data identifier sent by a display terminal side at a mobile equipment side; the multimedia data identifier is extracted from a playing request received by the display terminal side, and the display terminal side is used for acquiring video data from an on-demand server in communication with the display terminal according to the multimedia data identifier; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
in a preferred embodiment of the present invention, the multimedia data identifier may include a file name of the multimedia data, and/or a file address of the multimedia data.
Step 602, obtaining audio data from an on-demand server in communication with the mobile device according to the multimedia data identifier;
step 603, buffering the audio data;
in the display terminal, the video data is played only after enough video data has been buffered for the buffering time; likewise, in the mobile device, the audio data is played only after enough audio data has been buffered for the same buffering time.
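A minimal sketch of such a buffering gate is shown below (illustrative only); the 2-second threshold and 40 ms frame duration in the usage example are assumed values, and in practice both sides would use the same threshold so that playback starts at roughly the same time.

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class PlaybackBuffer {
    private final Deque<byte[]> frames = new ArrayDeque<>();   // buffered media frames awaiting playback
    private final long minBufferedMs;                          // required pre-roll before playback starts
    private final long frameDurationMs;                        // playback time represented by one frame
    private long bufferedMs = 0;

    PlaybackBuffer(long minBufferedMs, long frameDurationMs) {
        this.minBufferedMs = minBufferedMs;
        this.frameDurationMs = frameDurationMs;
    }

    void push(byte[] frame) {
        frames.addLast(frame);
        bufferedMs += frameDurationMs;
    }

    /** Playback may start only once enough data has been accumulated. */
    boolean readyToPlay() {
        return bufferedMs >= minBufferedMs;
    }

    public static void main(String[] args) {
        PlaybackBuffer buffer = new PlaybackBuffer(2_000, 40);   // assumed 2 s pre-roll, 40 ms frames
        while (!buffer.readyToPlay()) {
            buffer.push(new byte[188]);   // placeholder frame payload
        }
        System.out.println("buffered enough data, playback can start");
    }
}
```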
Step 604, receiving a play target timestamp sent by the display terminal side; the play target timestamp is a timestamp generated by the display terminal according to the currently played video timestamp when the video data is played;
in a preferred embodiment of the present invention, the playing target timestamp may include a video timestamp corresponding to current video data extracted by the display terminal when the video data is played.
Then, in the embodiment of the present invention, if the video timestamp corresponding to the current video data is ta, the play target timestamp is ta' = ta.
Step 605, obtaining a delay time value;
in the embodiment of the invention, the delay time value between the display terminal and the mobile device can be measured in advance or at the current time; it may be actively measured by the mobile device, or may be obtained from the display terminal, which is not limited in this embodiment of the present invention.
In a preferred example of the embodiment of the present invention, the delay time value is a delay time value obtained by sending preset simulation data to the display terminal and recording a current third system time value, receiving the simulation data returned by the display terminal and recording a current fourth system time value, and calculating a half of a difference between the fourth system time value and the third system time value;
after the mobile device sends the analog data to the display terminal, the display terminal needs to immediately return the analog data to the mobile device, and the mobile device calculates half of the time difference between sending and receiving the analog data (i.e., half of the difference between the fourth system time and the third system time) to obtain the delay time value.
If the third system time is T3 and the fourth system time is T4, then the delay time ΔT = (T4 - T3)/2.
Of course, the embodiment of the present invention may also perform the measurement multiple times and take the average of the halved sending-receiving time differences of the analog data as the delay time value, so as to reduce the error.
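As an illustrative sketch only, the measurement described above could be implemented as follows over a TCP socket, with the peer (here, the display terminal) echoing each probe of simulation data back immediately; the use of a TCP socket, the port, and the 8-byte probe format are assumptions introduced here, and the result is the average of (T4 - T3)/2 over several trials.

```java
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.net.Socket;

public class DelayProbe {
    /**
     * Estimate the one-way transmission delay by sending probe data to the peer,
     * which echoes it back immediately, and averaging half of the round-trip
     * time over several trials: Δt = (T4 - T3) / 2.
     */
    static long estimateDelayMs(String peerHost, int peerPort, int trials) throws Exception {
        long totalHalfRttMs = 0;
        try (Socket socket = new Socket(peerHost, peerPort);
             DataOutputStream out = new DataOutputStream(socket.getOutputStream());
             DataInputStream in = new DataInputStream(socket.getInputStream())) {
            for (int i = 0; i < trials; i++) {
                long t3 = System.currentTimeMillis();  // third system time: probe sent
                out.writeLong(t3);
                out.flush();
                in.readLong();                         // peer echoes the probe back
                long t4 = System.currentTimeMillis();  // fourth system time: echo received
                totalHalfRttMs += (t4 - t3) / 2;
            }
        }
        return totalHalfRttMs / trials;
    }
}
```

A call such as estimateDelayMs("192.168.1.20", 9000, 5) would then supply the Δt used in step 606; the address and port are, again, only placeholders.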
In another preferred example of the embodiment of the present invention, the delay time value may be a delay time value sent by the display terminal.
In this example, the delay time value may be a delay time value obtained by a display terminal sending preset simulation data to the mobile device and recording a current first system time value, receiving the simulation data returned by the mobile device and recording a current second system time value, and calculating a half of a difference between the second system time value and the first system time value.
Step 606, adding the delay time value to the time value indicated by the playing target timestamp;
in the embodiment of the invention, if the play target timestamp is ta' = ta and the delay time value is Δt, the updated play target timestamp is ta'' = ta' + Δt = ta + Δt.
Step 607, playing the audio data corresponding to the playing target timestamp.
It should be noted that, since method embodiment 4 corresponds to method embodiment 2, the description is relatively simple, and for the relevant points, reference may be made to the partial description of method embodiment 2, and the embodiment of the present invention is not described in detail herein.
For simplicity of explanation, the method embodiments are described as a series of acts or combinations, but those skilled in the art will appreciate that the embodiments are not limited by the order of acts described, as some steps may occur in other orders or concurrently with other steps in accordance with the embodiments of the invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 7, a block diagram of an apparatus embodiment 1 for synchronously playing video data and audio data according to the present invention is shown, and the embodiment of the present invention may include the following modules:
a multimedia data identifier receiving module 701, configured to receive a multimedia data playing request at a display terminal side; the playing request comprises a multimedia data identifier;
in a preferred embodiment of the present invention, the multimedia data identifier may include a file name of the multimedia data, and/or a file address of the multimedia data.
A video data obtaining module 702, configured to obtain corresponding video data from an on-demand server in communication with the display terminal side according to the multimedia data identifier;
a multimedia data identifier sending module 703, configured to send the multimedia data identifier to a mobile device side; the mobile equipment side is used for acquiring corresponding audio data from an on-demand server which is communicated with the display terminal side according to the multimedia data identification, and the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
a playing target timestamp generating module 704, configured to generate a playing target timestamp according to a currently played video timestamp when the video data is played;
a playing target timestamp sending module 705, configured to send the playing target timestamp to a mobile device side; and the mobile equipment side is used for playing the audio data corresponding to the playing target timestamp.
In a preferred embodiment of the present invention, the wireless transmission means includes one or more of the following:
ethernet, Bluetooth, 2.4G wireless network, infrared ray and wireless network protocol ZigBee.
In a preferred embodiment of the present invention, the play target timestamp includes a video timestamp corresponding to current video data extracted by the display terminal when the video data is played.
In a preferred embodiment of the present invention, the playing target timestamp includes a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when playing the video data; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
In a preferred embodiment of the present invention, the delay time value is a delay time value obtained by sending preset simulation data to the mobile device and recording a current first system time value, receiving the simulation data returned by the mobile device and recording a current second system time value, and calculating a half of a difference between the second system time value and the first system time value;
or;
the delay time value is the delay time value sent by the mobile equipment.
In a preferred embodiment of the present invention, the method may further include:
and the first buffer module is used for buffering the video data.
In a preferred embodiment of the present invention, the multimedia data identifier may include a file name of the multimedia data, and/or a file address of the multimedia data.
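To show how the modules 701 to 705 described above could cooperate at the display terminal side, here is an illustrative sketch (not the patented implementation); the Link interface, the "ID:"/"TS:" message prefixes, and the constructor parameters are assumptions introduced for the example.

```java
public class DisplayTerminalFlow {
    /** Stands in for the wireless link to the mobile device side (Ethernet, Bluetooth, ZigBee, ...). */
    interface Link { void send(String message); }

    private final Link mobileLink;
    private final long delayMs;   // measured transmission delay; 0 if the raw video timestamp is sent

    DisplayTerminalFlow(Link mobileLink, long delayMs) {
        this.mobileLink = mobileLink;
        this.delayMs = delayMs;
    }

    /** Modules 701-703: handle the play request, fetch the video, forward the identifier. */
    void onPlayRequest(String multimediaId) {
        // ... obtain the video data from the on-demand server by multimediaId (see the earlier fetch sketch) ...
        mobileLink.send("ID:" + multimediaId);   // hypothetical message framing
    }

    /** Modules 704-705: while playing, derive the play target timestamp and send it. */
    void onVideoFrame(long currentVideoTsMs) {
        mobileLink.send("TS:" + (currentVideoTsMs + delayMs));
    }

    public static void main(String[] args) {
        DisplayTerminalFlow flow =
                new DisplayTerminalFlow(msg -> System.out.println("-> mobile: " + msg), 40);
        flow.onPlayRequest("movie_001");
        flow.onVideoFrame(50_000);   // prints "-> mobile: TS:50040"
    }
}
```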
Referring to fig. 8, a block diagram of an embodiment 2 of the apparatus for synchronously playing video data and audio data according to the present invention is shown, and the embodiment of the present invention may include the following modules:
a multimedia data identifier receiving module 801, configured to receive, at a mobile device side, a multimedia data identifier sent by a display terminal side; the multimedia data identifier is extracted from a playing request received by the display terminal side, and the display terminal side is used for acquiring video data from an on-demand server in communication with the display terminal according to the multimedia data identifier; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
in a preferred embodiment of the present invention, the multimedia data identifier may include a file name of the multimedia data, and/or a file address of the multimedia data.
An audio data obtaining module 802, configured to obtain audio data from an on-demand server in communication with the mobile device according to the multimedia data identifier;
a play target timestamp receiving module 803, configured to receive a play target timestamp sent by the display terminal side; the playing time stamp is a time stamp generated by the display terminal according to the currently played video time stamp when the video data is played;
and an audio data playing module 804, configured to play the audio data corresponding to the playing target timestamp.
In a preferred example of the embodiment of the present invention, the manner of wireless transmission may include one or more of the following:
ethernet, Bluetooth, 2.4G wireless network, infrared ray and wireless network protocol ZigBee.
In a preferred embodiment of the present invention, the playing target timestamp may include a video timestamp corresponding to current video data extracted by the display terminal when the video data is played.
In a preferred embodiment of the present invention, the playing target timestamp may include a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when playing the video data; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
In a preferred embodiment of the present invention, the embodiment of the present invention may further include the following modules:
the delay time value acquisition module is used for acquiring a delay time value;
and the delay time value increasing module is used for increasing the delay time value on the time value indicated by the playing target timestamp.
In a preferred embodiment of the present invention, the delay time value may be a delay time value obtained by sending preset analog data to the display terminal and recording a current third system time value, receiving the analog data returned by the display terminal and recording a current fourth system time value, and calculating a half of a difference between the fourth system time value and the third system time value;
or,
the delay time value may be a delay time value transmitted by the display terminal.
In a preferred embodiment of the present invention, the audio data may carry one or more audio timestamps; the audio data playing module 804 may include the following sub-modules:
the playing pause submodule is used for pausing the playing of the audio data until the currently played audio timestamp is equal to the playing target timestamp when the currently played audio timestamp is greater than the playing target timestamp;
and/or,
the searching submodule is used for searching the audio time stamp which is equal to the playing target time stamp when the currently played audio time stamp is less than or equal to the playing target time stamp;
and the corresponding playing submodule is used for playing the audio data corresponding to the audio time stamp.
In a preferred embodiment of the present invention, the embodiment of the present invention may further include the following modules:
and the second buffer module is used for carrying out buffer processing on the audio data.
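Similarly, an illustrative sketch of how the mobile-device-side modules described above might fit together is given below; the class and method names are assumptions, and the audio fetching and playback details are elided.

```java
public class MobileDeviceFlow {
    private final long measuredDelayMs;   // supplied by the delay time value acquisition module

    MobileDeviceFlow(long measuredDelayMs) {
        this.measuredDelayMs = measuredDelayMs;
    }

    /** Modules 801-802 and the second buffer module: fetch and buffer the audio data. */
    void onMultimediaId(String multimediaId) {
        // ... request the audio data from the on-demand server and hand it to the buffer ...
    }

    /** Modules 803-804 plus the delay time value increasing module: adjust the received timestamp. */
    long onPlayTarget(long playTargetTsMs) {
        long adjusted = playTargetTsMs + measuredDelayMs;   // ta'' = ta' + Δt
        // ... drive audio playback toward 'adjusted' (pause if ahead, seek forward if behind) ...
        return adjusted;
    }

    public static void main(String[] args) {
        MobileDeviceFlow flow = new MobileDeviceFlow(40);
        System.out.println(flow.onPlayTarget(50_000));   // 50040
    }
}
```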
For the device embodiment, since it is basically similar to the method embodiment, the description is relatively simple, and the relevant points can be referred to the partial description of the method embodiment.
Referring to fig. 9, a block diagram of a system for synchronously playing video data and audio data according to an embodiment of the present invention is shown, which may specifically include a display terminal, a mobile device and an on-demand server that communicate with each other;
the display terminal is used for receiving a multimedia data playing request and sending a video data acquisition request to the on-demand server according to the playing request; the playing request comprises a multimedia data identifier;
the mobile equipment is used for receiving the multimedia data identifier sent by the display terminal and sending an audio data acquisition request to the on-demand server according to the multimedia data identifier;
the on-demand server is used for respectively returning corresponding video data and audio data to the display terminal and the mobile equipment according to the video data acquisition request and the audio data acquisition request;
the display terminal is further used for playing the video data, generating a play target timestamp according to the currently played video timestamp, and sending the play target timestamp to the mobile device; the mobile device is further configured to play the audio data corresponding to the play target timestamp.
For the system embodiment, since it is basically similar to the method embodiment, the description is simple, and the relevant points can be referred to the partial description of the method embodiment.
An embodiment of the present invention further provides an apparatus, where the apparatus may include:
one or more processors;
a memory; and
one or more modules stored in the memory and configured to be executed by the one or more processors, wherein the one or more modules have the functionality to:
receiving a multimedia data playing request at a display terminal side; the playing request comprises a multimedia data identifier;
acquiring corresponding video data from an on-demand server in mutual communication with the display terminal side according to the multimedia data identifier;
sending the multimedia data identifier to a mobile equipment side; the mobile equipment side is used for acquiring corresponding audio data from an on-demand server which is communicated with the display terminal side according to the multimedia data identification, and the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
when the video data is played, generating a playing target timestamp according to the currently played video timestamp;
sending the playing target timestamp to a mobile equipment side; and the mobile equipment side is used for playing the audio data corresponding to the playing target timestamp.
Optionally, the wireless transmission means includes one or more of:
ethernet, Bluetooth, 2.4G wireless network, infrared ray and wireless network protocol ZigBee.
Optionally, the playing target timestamp includes a video timestamp corresponding to current video data extracted by the display terminal when the video data is played.
Optionally, the playing target timestamp includes a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when the video data is played; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
Optionally, the delay time value is obtained by sending preset simulation data to the mobile device and recording a current first system time value, receiving the simulation data returned by the mobile device and recording a current second system time value, and calculating a half of a difference between the second system time value and the first system time value;
or;
the delay time value is the delay time value sent by the mobile equipment.
Optionally, the one or more modules may also have the following functions:
and buffering the video data.
Optionally, the multimedia data identifier may include a file name of the multimedia data, and/or a file address of the multimedia data.
An embodiment of the present invention further provides a non-volatile readable storage medium, where one or more modules (programs) are stored in the storage medium, and when the one or more modules are applied to a device with a display, the one or more modules may cause the device to execute instructions of the following steps: receiving a multimedia data playing request at a display terminal side; the playing request comprises a multimedia data identifier;
acquiring corresponding video data from an on-demand server in mutual communication with the display terminal side according to the multimedia data identifier;
sending the multimedia data identifier to a mobile equipment side; the mobile equipment side is used for acquiring corresponding audio data from an on-demand server which is communicated with the display terminal side according to the multimedia data identification, and the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
when the video data is played, generating a playing target timestamp according to the currently played video timestamp;
sending the playing target timestamp to a mobile equipment side; and the mobile equipment side is used for playing the audio data corresponding to the playing target timestamp.
Optionally, the wireless transmission means includes one or more of:
ethernet, Bluetooth, 2.4G wireless network, infrared ray and wireless network protocol ZigBee.
Optionally, the playing target timestamp includes a video timestamp corresponding to current video data extracted by the display terminal when the video data is played.
Optionally, the playing target timestamp includes a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when the video data is played; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
Optionally, the delay time value is obtained by sending preset simulation data to the mobile device and recording a current first system time value, receiving the simulation data returned by the mobile device and recording a current second system time value, and calculating a half of a difference between the second system time value and the first system time value;
or;
the delay time value is the delay time value sent by the mobile equipment.
Optionally, the one or more modules may also have the following functions:
and buffering the video data.
Optionally, the multimedia data identifier may include a file name of the multimedia data, and/or a file address of the multimedia data.
Referring to fig. 10, a schematic structural diagram of a smart television according to an embodiment of the present invention is shown. The electronic device is configured to implement the method for synchronously playing video data and audio data provided in the above embodiments. Specifically:
electronic device 800 may include RF (Radio Frequency) circuitry 810, memory 820 including one or more computer-readable storage media, input unit 830, display unit 840, sensor 850, audio circuitry 860, short-range wireless transmission module 870, processor 880 including one or more processing cores, and power supply 890. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 9 does not constitute a limitation of the electronic device and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components. Wherein:
the RF circuit 810 may be used for receiving and transmitting signals during information transmission and reception or during a call, and in particular, for receiving downlink information from a base station and then processing the received downlink information by the one or more processors 880; in addition, data relating to uplink is transmitted to the base station. In general, RF circuit 810 includes, but is not limited to, an antenna, at least one Amplifier, a tuner, one or more oscillators, a transceiver, a coupler, an LNA (Low Noise Amplifier), a duplexer, and the like. In addition, the RF circuit 810 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for mobile communications), GPRS (General Packet radio service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), email, SMS (Short messaging service), etc. The memory 820 may be used to store software programs and modules, for example, the memory 820 may be used to store a software program for collecting voice signals, a software program for realizing keyword recognition, a software program for realizing continuous voice recognition, a software program for realizing setting reminders, and the like. The processor 880 executes various functional applications and data processing such as a function of "receiving multimedia data on the display terminal side," a function of transmitting the audio data to the mobile device side, "a function of generating a play target time stamp from a currently played video time stamp when the video data is played," a function of transmitting the play target time stamp to the mobile device side, "and the like in the embodiment of the present invention by running software programs and modules stored in the memory 820. The memory 820 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the electronic device 800, and the like. Further, the memory 820 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. Accordingly, the memory 820 may also include a memory controller to provide the processor 880 and the input unit 830 access to the memory 820.
The input unit 830 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. In particular, the input unit 830 may include a touch-sensitive surface 831 as well as other input devices 832. The touch-sensitive surface 831, also referred to as a touch display screen or a touch pad, may collect touch operations by a user on or near the touch-sensitive surface 831 (e.g., operations by a user on or near the touch-sensitive surface 831 using a finger, a stylus, or any other suitable object or attachment) and drive the corresponding connection device according to a predefined program. Alternatively, the touch-sensitive surface 831 can include two portions, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts it to touch point coordinates, and sends the touch point coordinates to the processor 880, and can receive and execute commands from the processor 880. In addition, the touch-sensitive surface 831 can be implemented using various types of resistive, capacitive, infrared, and surface acoustic waves. The input unit 830 may include other input devices 832 in addition to the touch-sensitive surface 831. In particular, other input devices 832 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 840 may be used to display information input by or provided to a user and various graphical user interfaces of the electronic device 800, which may be made up of graphics, text, icons, video, and any combination thereof. The display unit 840 may include a display panel 841, and the display panel 841 may optionally be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like. Further, touch-sensitive surface 831 can overlie display panel 841 such that when touch-sensitive surface 831 detects a touch operation thereon or thereabout, it can be relayed to processor 880 to determine the type of touch event, and processor 880 can then provide a corresponding visual output on display panel 841 in accordance with the type of touch event. Although in fig. 10, touch-sensitive surface 831 and display panel 841 are implemented as two separate components to implement input and output functions, in some embodiments, touch-sensitive surface 831 may be integrated with display panel 841 to implement input and output functions.
The electronic device 800 may also include at least one sensor 850, such as light sensors, motion sensors, and other sensors. Specifically, the light sensor may include an ambient light sensor that may adjust the brightness of the display panel 841 based on the brightness of ambient light, and a proximity sensor that may turn off the display panel 841 and/or backlight when the electronic device 800 is moved to the ear. As one of the motion sensors, the gravity acceleration sensor may detect the magnitude of acceleration in each direction (generally, three axes), detect the magnitude and direction of gravity when the mobile phone is stationary, and may be used for applications of recognizing gestures of the mobile phone (such as horizontal and vertical screen switching, related games, magnetometer gesture calibration), vibration recognition related functions (such as pedometer and tapping), and other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor that may be further configured to the electronic device 800, which are not described herein again.
The audio circuitry 860, speaker 861, microphone 862 may provide an audio interface between a user and the electronic device 800. The audio circuit 860 can transmit the electrical signal converted from the received audio data to the speaker 861, and the electrical signal is converted into a sound signal by the speaker 861 and output; on the other hand, the microphone 862 converts the collected sound signal into an electric signal, converts the electric signal into audio data after being received by the audio circuit 860, and outputs the audio data to the processor 880 for processing, and then transmits the audio data to another terminal via the RF circuit 810, or outputs the audio data to the memory 820 for further processing. The audio circuitry 860 may also include an earbud jack to provide communication of a peripheral headset with the electronic device 800.
The short-range wireless transmission module 870 may be a WiFi (Wireless Fidelity) module, a Bluetooth module, or the like. Through the short-range wireless transmission module 870, the electronic device 800 can help the user send and receive e-mail, browse web pages, access streaming media, and so on, providing the user with wireless broadband Internet access. Although fig. 10 shows the short-range wireless transmission module 870, it is understood that it is not an essential part of the electronic device 800 and may be omitted entirely as needed without changing the essence of the invention.
The processor 880 is a control center of the electronic device 800, connects various parts of the entire electronic device using various interfaces and lines, performs various functions of the electronic device 800 and processes data by operating or executing software programs and/or modules stored in the memory 820 and calling data stored in the memory 820, thereby monitoring the electronic device as a whole. Optionally, processor 880 may include one or more processing cores; preferably, the processor 880 may integrate an application processor, which mainly handles operating systems, user interfaces, applications, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 880.
The electronic device 800 also includes a power supply 890 (e.g., a battery) for powering the various components, which may be logically coupled to the processor 880 via a power management system that may be used to manage charging, discharging, and power consumption. Power supply 890 may also include any component of one or more dc or ac power sources, recharging systems, power failure detection circuitry, power converters or inverters, power status indicators, and the like.
Although not shown, the electronic device 800 may further include a camera, a bluetooth module, and the like, which are not described in detail herein. Specifically, in the present embodiment, the display unit of the electronic device 800 is a touch screen display.
An embodiment of the present invention further provides an apparatus, where the apparatus may include:
one or more processors;
a memory; and
one or more modules stored in the memory and configured to be executed by the one or more processors, wherein the one or more modules have the functionality to:
receiving a multimedia data identifier sent by a display terminal side at a mobile equipment side; the multimedia data identifier is extracted from a playing request received by the display terminal side, and the display terminal side is used for acquiring video data from an on-demand server in communication with the display terminal according to the multimedia data identifier; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
acquiring audio data from an on-demand server in communication with the mobile equipment according to the multimedia data identifier;
receiving a playing target timestamp sent by the display terminal side; the playing target timestamp is a timestamp generated by the display terminal according to the currently played video timestamp when the video data is played;
and playing the audio data corresponding to the playing target timestamp.
Optionally, the wireless transmission means includes one or more of:
ethernet, Bluetooth, 2.4G wireless network, infrared ray and wireless network protocol ZigBee.
Optionally, the playing target timestamp includes a video timestamp corresponding to current video data extracted by the display terminal when the video data is played.
Optionally, the playing target timestamp includes a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when the video data is played; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
Optionally, the one or more modules may also have the following functions:
obtaining a delay time value;
adding the delay time value to the time value indicated by the play target timestamp.
Optionally, the delay time value is a delay time value obtained by sending preset analog data to the display terminal and recording a current third system time value, receiving the analog data returned by the display terminal and recording a current fourth system time value, and calculating a half of a difference between the fourth system time value and the third system time value;
or,
the delay time value is the delay time value sent by the display terminal.
Optionally, the audio data carries one or more audio time stamps, and the one or more modules may have the following functions:
when the audio time stamp played currently is larger than the playing target time stamp, the audio data playing is paused until the audio time stamp played currently is equal to the playing target time stamp;
and/or,
when the audio time stamp played currently is smaller than or equal to the playing target time stamp, searching for the audio time stamp which is equal to the playing target time stamp;
and playing the audio data corresponding to the audio time stamp.
Optionally, the one or more modules may also have the following functions:
and carrying out buffering processing on the audio data.
Optionally, the multimedia data identifier may include a file name of the multimedia data, and/or a file address of the multimedia data.
An embodiment of the present invention further provides a non-volatile readable storage medium, where one or more modules (programs) are stored in the storage medium, and when the one or more modules are applied to a device with an audio playing function, the one or more modules may cause the device to execute instructions of the following steps:
receiving a multimedia data identifier sent by a display terminal side at a mobile equipment side; the multimedia data identifier is extracted from a playing request received by the display terminal side, and the display terminal side is used for acquiring video data from an on-demand server in communication with the display terminal according to the multimedia data identifier; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
acquiring audio data from an on-demand server in communication with the mobile equipment according to the multimedia data identifier;
receiving a playing target timestamp sent by the display terminal side; the playing target timestamp is a timestamp generated by the display terminal according to the currently played video timestamp when the video data is played;
and playing the audio data corresponding to the playing target timestamp.
Optionally, the wireless transmission means includes one or more of:
ethernet, Bluetooth, 2.4G wireless network, infrared ray and wireless network protocol ZigBee.
Optionally, the playing target timestamp includes a video timestamp corresponding to current video data extracted by the display terminal when the video data is played.
Optionally, the playing target timestamp includes a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when the video data is played; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
Optionally, the one or more modules may also have the following functions:
obtaining a delay time value;
adding the delay time value to the time value indicated by the play target timestamp.
Optionally, the delay time value is a delay time value obtained by sending preset analog data to the display terminal and recording a current third system time value, receiving the analog data returned by the display terminal and recording a current fourth system time value, and calculating a half of a difference between the fourth system time value and the third system time value;
or,
the delay time value is the delay time value sent by the display terminal.
Optionally, the audio data carries one or more audio time stamps, and the one or more modules may have the following functions:
when the audio time stamp played currently is larger than the playing target time stamp, the audio data playing is paused until the audio time stamp played currently is equal to the playing target time stamp;
and/or,
when the audio time stamp played currently is smaller than or equal to the playing target time stamp, searching for the audio time stamp which is equal to the playing target time stamp;
and playing the audio data corresponding to the audio time stamp.
Optionally, the one or more modules may also have the following functions:
and carrying out buffering processing on the audio data.
Optionally, the multimedia data identifier may include a file name of the multimedia data, and/or a file address of the multimedia data.
Fig. 11 is a schematic structural diagram of a terminal device according to an embodiment of the present invention. Referring to fig. 11, the terminal device may be used to implement the method for synchronously playing video data and audio data provided in the above embodiments. The terminal device may be a mobile phone, a tablet, a wearable mobile device (such as a smart watch), or the like.
The terminal device 700 may include components such as a communication unit 110, a memory 120 including one or more computer-readable storage media, an input unit 130, a display unit 140, a sensor 150, an audio circuit 160, a WiFi (Wireless Fidelity) module 170, a processor 180 including one or more processing cores, and a power supply 190. Those skilled in the art will appreciate that the terminal device configuration shown in fig. 11 is not intended to be limiting; the terminal device may include more or fewer components than those shown, some components may be combined, or the components may be arranged differently. Wherein:
the communication unit 110 may be used for receiving and transmitting information or signals during a call, and the communication unit 110 may be an RF (Radio Frequency) circuit, a router, a modem, or other network communication devices. In particular, when the communication unit 110 is an RF circuit, downlink information of the base station is received and then processed by the one or more processors 180; in addition, data relating to uplink is transmitted to the base station. Generally, the RF circuit as a communication unit includes, but is not limited to, an antenna, at least one Amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, an LNA (Low Noise Amplifier), a duplexer, and the like. In addition, the communication unit 110 may also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA (Code division Multiple Access), WCDMA (Wideband Code division Multiple Access), LTE (Long Term Evolution), email, SMS (Short Messaging Service), etc. The memory 120 may be used to store software programs and modules, and the processor 180 executes various functional applications and data processing by operating the software programs and modules stored in the memory 120. The memory 120 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the terminal device 700, and the like. Further, the memory 120 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. Accordingly, the memory 120 may further include a memory controller to provide the processor 180 and the input unit 130 with access to the memory 120.
The input unit 130 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. Optionally, the input unit 130 may include a touch-sensitive surface 131 as well as other input devices 132. The touch-sensitive surface 131, also referred to as a touch display screen or a touch pad, may collect touch operations by a user on or near the touch-sensitive surface 131 (e.g., operations by a user on or near the touch-sensitive surface 131 using a finger, a stylus, or any other suitable object or attachment), and drive the corresponding connection device according to a predetermined program. Alternatively, the touch sensitive surface 131 may comprise two parts, a touch detection means and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 180, and can receive and execute commands sent by the processor 180. Additionally, the touch-sensitive surface 131 may be implemented using various types of resistive, capacitive, infrared, and surface acoustic waves. In addition to the touch-sensitive surface 131, the input unit 130 may also include other input devices 132. Alternatively, other input devices 132 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 140 may be used to display information input by or provided to a user and various graphic user interfaces of the terminal device 700, which may be configured by graphics, text, icons, video, and any combination thereof. The display unit 140 may include a display panel 141, and optionally, the display panel 141 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like. Further, the touch-sensitive surface 131 may cover the display panel 141, and when a touch operation is detected on or near the touch-sensitive surface 131, the touch operation is transmitted to the processor 180 to determine the type of the touch event, and then the processor 180 provides a corresponding visual output on the display panel 141 according to the type of the touch event. Although in fig. 11, touch-sensitive surface 131 and display panel 141 are shown as two separate components to implement input and output functions, in some embodiments, touch-sensitive surface 131 may be integrated with display panel 141 to implement input and output functions.
The terminal device 700 may also include at least one sensor 150, such as a light sensor, a motion sensor, and other sensors. Alternatively, the light sensor may include an ambient light sensor that adjusts the brightness of the display panel 141 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 141 and/or the backlight when the terminal device 700 is moved to the ear. As one of the motion sensors, the gravity acceleration sensor may detect the magnitude of acceleration in each direction (generally, three axes), detect the magnitude and direction of gravity when the mobile phone is stationary, and may be used for applications of recognizing gestures of the mobile phone (such as horizontal and vertical screen switching, related games, magnetometer gesture calibration), vibration recognition related functions (such as pedometer and tapping), and other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor that are further configured to the terminal device 700, and are not described herein again.
The audio circuitry 160, speaker 161, microphone 162 may provide an audio interface between the user and the terminal device 700. The audio circuit 160 may transmit the electrical signal converted from the received audio data to the speaker 161, and convert the electrical signal into a sound signal for output by the speaker 161; on the other hand, the microphone 162 converts the collected sound signal into an electric signal, converts the electric signal into audio data after being received by the audio circuit 160, and outputs the audio data to the processor 180 for processing, and then transmits the audio data to, for example, another terminal device via the RF circuit 110, or outputs the audio data to the memory 120 for further processing. The audio circuit 160 may also include an earbud jack to provide communication of peripheral headphones with the terminal device 700.
To implement wireless communication, a wireless communication unit 170 may be configured on the terminal device, and the wireless communication unit 170 may be a WiFi module. WiFi belongs to short-range wireless transmission technology, and the terminal device 700 can help a user send and receive e-mail, browse web pages, access streaming media, and the like through the wireless communication unit 170, which provides the user with wireless broadband Internet access. Although fig. 11 shows the wireless communication unit 170, it is understood that it does not belong to the essential constitution of the terminal device 700 and may be omitted entirely as needed without changing the essence of the invention.
The processor 180 is a control center of the terminal device 700, connects various parts of the entire mobile phone using various interfaces and lines, and performs various functions of the terminal device 700 and processes data by operating or executing software programs and/or modules stored in the memory 120 and calling data stored in the memory 120, thereby performing overall monitoring of the mobile phone. Optionally, processor 180 may include one or more processing cores; preferably, the processor 180 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 180.
The terminal device 700 further includes a power supply 190 (e.g., a battery) for supplying power to the various components, which may preferably be logically connected to the processor 180 via a power management system, so as to manage charging, discharging, and power consumption via the power management system. The power supply 190 may also include any component including one or more of a dc or ac power source, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
Although not shown, the terminal device 700 may further include a camera, a bluetooth module, and the like, which will not be described herein. Specifically, in this embodiment, the display unit of the terminal device is a touch screen display, the terminal device further includes a memory, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs include instructions for:
receiving audio data sent by a display terminal side at a mobile equipment side; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
receiving a playing target timestamp sent by the display terminal side; the playing target timestamp is a timestamp generated by the display terminal according to the currently played video timestamp when the video data is played;
playing the audio data corresponding to the playing target timestamp; the audio data and the video data are multimedia data received by the display terminal.
Optionally, the wireless transmission means includes one or more of:
ethernet, Bluetooth, 2.4G wireless network, infrared ray and wireless network protocol ZigBee.
Optionally, the playing target timestamp includes a video timestamp corresponding to current video data extracted by the display terminal when the video data is played.
Optionally, the playing target timestamp includes a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when the video data is played; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
Optionally, the one or more modules may also have the following functions:
obtaining a delay time value;
adding the delay time value to the time value indicated by the play target timestamp.
Optionally, the delay time value is a delay time value obtained by sending preset analog data to the display terminal and recording a current third system time value, receiving the analog data returned by the display terminal and recording a current fourth system time value, and calculating a half of a difference between the fourth system time value and the third system time value;
or,
the delay time value is the delay time value sent by the display terminal.
Optionally, the audio data carries one or more audio time stamps, and the one or more modules may have the following functions:
when the audio time stamp played currently is larger than the playing target time stamp, the audio data playing is paused until the audio time stamp played currently is equal to the playing target time stamp;
and/or,
when the audio time stamp played currently is smaller than or equal to the playing target time stamp, searching for the audio time stamp which is equal to the playing target time stamp;
and playing the audio data corresponding to the audio time stamp.
Optionally, the one or more modules may also have the following functions:
and carrying out buffering processing on the audio data.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts in the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, mobile devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing mobile device to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing mobile device, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or mobile device that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or mobile device. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in the process, method, article, or mobile device that comprises the element.
The method for synchronously playing video data and audio data, the device for synchronously playing video data and audio data, and the equipment provided by the embodiments of the present invention are described in detail above, a specific example is applied in the present document to explain the principle and the implementation manner of the embodiments of the present invention, and the description of the above embodiments is only used to help understanding the method and the core idea of the embodiments of the present invention; meanwhile, for a person skilled in the art, according to the idea of the embodiment of the present invention, there may be a change in the specific implementation and application scope, and in summary, the content of the present specification should not be construed as a limitation to the embodiment of the present invention.

Claims (35)

1. A method for synchronously playing video data and audio data, comprising:
receiving a multimedia data playing request at a display terminal side; the playing request comprises a multimedia data identifier;
acquiring corresponding video data from an on-demand server in mutual communication with the display terminal side according to the multimedia data identifier;
sending the multimedia data identifier to a mobile equipment side; the mobile equipment side is used for acquiring corresponding audio data from the on-demand server in communication with the display terminal side according to the multimedia data identifier, and the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
when the video data is played, generating a playing target timestamp according to the currently played video timestamp;
sending the playing target timestamp to a mobile equipment side; and the mobile equipment side is used for playing the audio data corresponding to the playing target timestamp.
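The display-terminal-side method of claim 1 reduces to a small control loop: fetch the video for the identifier, hand the identifier to the mobile equipment side, then keep publishing a playing target timestamp derived from the video timestamp currently being rendered. The Python sketch below is only an illustration of that loop; the player object, the link transport, the message format and the reporting period are assumptions of this example, not part of the claim.

```python
import json
import time

def display_terminal_play(play_request, on_demand_server, mobile_link, player,
                          report_interval=0.5):
    """Illustrative display-terminal-side flow; all collaborator objects are assumed."""
    media_id = play_request["media_id"]                  # multimedia data identifier
    video = on_demand_server.fetch_video(media_id)       # corresponding video data
    mobile_link.send(json.dumps({"type": "media_id", "value": media_id}))
    player.play(video)
    while player.is_playing():
        # playing target timestamp generated from the currently played video timestamp
        target_ts = player.current_video_timestamp()
        mobile_link.send(json.dumps({"type": "target_ts", "value": target_ts}))
        time.sleep(report_interval)                      # reporting period is an assumption
```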
2. The method of claim 1, wherein the manner of wireless transmission comprises one or more of:
Ethernet, Bluetooth, a 2.4 GHz wireless network, infrared, and the ZigBee wireless network protocol.
3. The method according to claim 1, wherein the playing target timestamp comprises a video timestamp corresponding to the current video data extracted by the display terminal side when playing the video data.
4. The method according to claim 1, wherein the playing target timestamp comprises a video timestamp corresponding to the current video data extracted by the display terminal side when playing the video data, and a delay time value; the delay time value is the data transmission delay between the display terminal side and the mobile equipment side.
5. The method of claim 4, wherein the delay time value is obtained by sending preset simulation data to the mobile device and recording a current first system time value, receiving the simulation data returned by the mobile device and recording a current second system time value, and calculating half of the difference between the second system time value and the first system time value;
or,
the delay time value is the delay time value sent by the mobile equipment.
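Claims 4 and 5 define the delay time value as half of a measured round trip: record the send time of preset simulation data, record the time its echo comes back, and halve the difference. Below is a minimal sketch of that measurement, assuming a link object with blocking send/recv calls and an arbitrary probe payload (the transport and payload are not specified by the claims).

```python
import time

PROBE = b"SYNC_PROBE"  # stands in for the "preset simulation data"; payload is arbitrary

def measure_one_way_delay(link):
    """Estimate the one-way transmission delay as half of the measured round-trip time."""
    t1 = time.monotonic()          # first system time value, taken when the probe is sent
    link.send(PROBE)
    echoed = link.recv()           # the peer returns the simulation data unchanged
    t2 = time.monotonic()          # second system time value, taken when the echo arrives
    if echoed != PROBE:
        raise ValueError("unexpected echo payload")
    return (t2 - t1) / 2.0         # delay time value = (t2 - t1) / 2
```

For example, if the echo arrives 80 ms after the probe was sent, the delay time value is 40 ms; per claim 4 this value accompanies the video timestamp in the playing target timestamp, or, per claims 12 and 13, the mobile equipment side adds it to the received time value instead.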
6. The method according to any one of claims 1 to 5, further comprising, after the step of obtaining corresponding video data from the on-demand server according to the multimedia data identifier:
and buffering the video data.
7. The method of claim 1, wherein the multimedia data identifier comprises a file name of the multimedia data and/or a file address of the multimedia data.
8. A method for synchronously playing video data and audio data, comprising:
receiving, at a mobile equipment side, a multimedia data identifier sent by a display terminal side; the multimedia data identifier is extracted from a playing request received by the display terminal side, and the display terminal side is used for acquiring video data from an on-demand server in communication with the display terminal side according to the multimedia data identifier; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
acquiring audio data from an on-demand server in communication with the mobile equipment according to the multimedia data identifier;
receiving a playing target timestamp sent by the display terminal side; the playing target timestamp is a timestamp generated by the display terminal side according to the currently played video timestamp when the video data is played;
and playing the audio data corresponding to the playing target timestamp.
9. The method of claim 8, wherein the manner of wireless transmission comprises one or more of:
Ethernet, Bluetooth, a 2.4 GHz wireless network, infrared, and the ZigBee wireless network protocol.
10. The method according to claim 8, wherein the playing target timestamp comprises a video timestamp corresponding to the current video data extracted by the display terminal side when playing the video data.
11. The method according to claim 8, wherein the playing target timestamp comprises a video timestamp corresponding to the current video data extracted by the display terminal side when playing the video data, and a delay time value; the delay time value is the data transmission delay between the display terminal side and the mobile equipment side.
12. The method according to claim 10, wherein after the step of receiving the playing target timestamp sent by the display terminal side, the method further comprises:
obtaining a delay time value;
adding the delay time value to the time value indicated by the playing target timestamp.
13. The method of claim 12, wherein the delay time value is obtained by sending preset simulation data to the display terminal and recording a current third system time value, receiving the simulation data returned by the display terminal and recording a current fourth system time value, and calculating half of the difference between the fourth system time value and the third system time value;
or,
the delay time value is the delay time value sent by the display terminal.
14. The method according to any one of claims 8 to 13, wherein the audio data carries one or more audio time stamps;
the step of playing the audio data corresponding to the playing target timestamp includes:
when the currently played audio timestamp is greater than the playing target timestamp, pausing the playing of the audio data until the currently played audio timestamp is equal to the playing target timestamp;
and/or,
when the currently played audio timestamp is less than or equal to the playing target timestamp, searching for the audio timestamp equal to the playing target timestamp;
and playing the audio data corresponding to that audio timestamp.
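Claim 14 compares the currently played audio timestamp with the playing target timestamp and either pauses or seeks. A compressed sketch of that rule for one received playing target timestamp follows, assuming an audio_player object with pause_until, seek_to and current_audio_timestamp operations; these names are illustrative assumptions, not taken from the specification.

```python
def align_audio(audio_player, target_ts):
    """Apply the pause-or-seek rule of claim 14 for one playing target timestamp."""
    current_ts = audio_player.current_audio_timestamp()
    if current_ts > target_ts:
        # audio runs ahead of the video: pause playback until the currently played
        # audio timestamp equals the playing target timestamp
        audio_player.pause_until(target_ts)
    else:
        # audio is behind (or exactly aligned): search for the audio timestamp equal
        # to the playing target timestamp and play the audio data from there
        audio_player.seek_to(target_ts)
```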
15. The method according to any one of claims 8 to 13, wherein before the step of playing the audio data corresponding to the playing target timestamp, the method further comprises:
and buffering the audio data.
16. The method of claim 8, wherein the multimedia data identifier comprises a file name of the multimedia data and/or a file address of the multimedia data.
17. An apparatus for synchronously playing video data and audio data, comprising:
the multimedia data identification receiving module is used for receiving a multimedia data playing request at a display terminal side; the playing request comprises a multimedia data identifier;
the video data acquisition module is used for acquiring corresponding video data from an on-demand server in communication with the display terminal side according to the multimedia data identifier;
the multimedia data identification sending module is used for sending the multimedia data identifier to a mobile equipment side; the mobile equipment side is used for acquiring corresponding audio data from the on-demand server in communication with the display terminal side according to the multimedia data identifier, and the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
the playing target timestamp generating module is used for generating a playing target timestamp according to the currently played video timestamp when the video data is played;
the playing target timestamp sending module is used for sending the playing target timestamp to the mobile equipment side; and the mobile equipment side is used for playing the audio data corresponding to the playing target timestamp.
18. The apparatus of claim 17, wherein the means for wirelessly transmitting comprises one or more of:
Ethernet, Bluetooth, a 2.4 GHz wireless network, infrared, and the ZigBee wireless network protocol.
19. The apparatus according to claim 17, wherein the playing target timestamp comprises a video timestamp corresponding to the current video data extracted by the display terminal side when playing the video data.
20. The apparatus according to claim 17, wherein the playing target timestamp comprises a video timestamp corresponding to the current video data extracted by the display terminal side when playing the video data, and a delay time value; the delay time value is the data transmission delay between the display terminal side and the mobile equipment side.
21. The apparatus of claim 20, wherein the delay time value is obtained by sending preset simulation data to the mobile device and recording a current first system time value, receiving the simulation data returned by the mobile device and recording a current second system time value, and calculating half of the difference between the second system time value and the first system time value;
or,
the delay time value is the delay time value sent by the mobile equipment.
22. The apparatus of any one of claims 17 to 21, further comprising:
and the first buffer module is used for buffering the video data.
23. The apparatus of claim 17, wherein the multimedia data identifier comprises a file name of the multimedia data and/or a file address of the multimedia data.
24. An apparatus for synchronously playing video data and audio data, comprising:
the multimedia data identification receiving module is used for receiving, at a mobile equipment side, the multimedia data identifier sent by a display terminal side; the multimedia data identifier is extracted from a playing request received by the display terminal side, and the display terminal side is used for acquiring video data from an on-demand server in communication with the display terminal side according to the multimedia data identifier; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
the audio data acquisition module is used for acquiring audio data from an on-demand server in communication with the mobile equipment side according to the multimedia data identifier;
a playing target timestamp receiving module, configured to receive a playing target timestamp sent by the display terminal side; the playing target timestamp is a timestamp generated by the display terminal side according to the currently played video timestamp when the video data is played;
and the audio data playing module is used for playing the audio data corresponding to the playing target timestamp.
25. The apparatus of claim 24, wherein the means for wirelessly transmitting comprises one or more of:
Ethernet, Bluetooth, a 2.4 GHz wireless network, infrared, and the ZigBee wireless network protocol.
26. The apparatus according to claim 24, wherein the playing target timestamp comprises a video timestamp corresponding to the current video data extracted by the display terminal side when playing the video data.
27. The apparatus according to claim 24, wherein the playing target timestamp comprises a video timestamp corresponding to the current video data extracted by the display terminal side when playing the video data, and a delay time value; the delay time value is the data transmission delay between the display terminal side and the mobile equipment side.
28. The apparatus of claim 26, further comprising:
the delay time value acquisition module is used for acquiring a delay time value;
and the delay time value increasing module is used for adding the delay time value to the time value indicated by the playing target timestamp.
29. The apparatus of claim 28, wherein the delay time value is obtained by sending preset simulation data to the display terminal and recording a current third system time value, receiving the simulation data returned by the display terminal and recording a current fourth system time value, and calculating half of the difference between the fourth system time value and the third system time value;
or,
the delay time value is the delay time value sent by the display terminal.
30. The apparatus according to any one of claims 24 to 29, wherein the audio data carries one or more audio timestamps; the audio data playing module comprises:
the playing pause submodule is used for, when the currently played audio timestamp is greater than the playing target timestamp, pausing the playing of the audio data until the currently played audio timestamp is equal to the playing target timestamp;
and/or,
the searching submodule is used for, when the currently played audio timestamp is less than or equal to the playing target timestamp, searching for the audio timestamp equal to the playing target timestamp;
and the corresponding playing submodule is used for playing the audio data corresponding to that audio timestamp.
31. The apparatus of any one of claims 24 to 29, further comprising:
and the second buffer module is used for buffering the audio data.
32. The apparatus of claim 24, wherein the multimedia data identifier comprises a file name of the multimedia data and/or a file address of the multimedia data.
33. A system for synchronously playing video data and audio data, characterized by comprising a display terminal, a mobile device and an on-demand server in communication with each other;
the display terminal is used for receiving a multimedia data playing request and sending a video data acquisition request to the on-demand server according to the playing request; the playing request comprises a multimedia data identifier;
the mobile equipment is used for receiving the multimedia data identifier sent by the display terminal and sending an audio data acquisition request to the on-demand server according to the multimedia data identifier;
the on-demand server is used for respectively returning corresponding video data and audio data to the display terminal and the mobile equipment according to the video data acquisition request and the audio data acquisition request;
the display terminal is further used for playing the video data, generating a playing target timestamp according to the currently played video timestamp, and sending the playing target timestamp to the mobile device; the mobile device is further used for playing the audio data corresponding to the playing target timestamp.
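The system of claim 33 can be pictured as three cooperating roles. The toy in-process simulation below replaces the real wireless and network hops with direct method calls purely to show the message flow; every class, method name and the sample identifier "movie-42" are assumptions of this illustration, not the patented implementation.

```python
class OnDemandServer:
    """Returns (fake) video or audio data for a multimedia data identifier."""
    def fetch_video(self, media_id):
        return f"video-stream:{media_id}"
    def fetch_audio(self, media_id):
        return f"audio-stream:{media_id}"

class MobileDevice:
    def __init__(self, server):
        self.server = server
        self.audio = None
    def on_media_identifier(self, media_id):
        # audio data acquisition request to the on-demand server
        self.audio = self.server.fetch_audio(media_id)
    def on_playing_target_timestamp(self, ts):
        print(f"mobile:  playing {self.audio} at {ts:.1f}s")

class DisplayTerminal:
    def __init__(self, server, mobile):
        self.server = server
        self.mobile = mobile
    def play(self, media_id):
        video = self.server.fetch_video(media_id)       # video data acquisition request
        self.mobile.on_media_identifier(media_id)       # forward the identifier
        for ts in (0.0, 0.5, 1.0):                      # pretend video timestamps
            print(f"display: showing {video} at {ts:.1f}s")
            self.mobile.on_playing_target_timestamp(ts) # playing target timestamp

server = OnDemandServer()
mobile = MobileDevice(server)
DisplayTerminal(server, mobile).play("movie-42")
```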
34. An apparatus, comprising:
one or more processors;
a memory; and
one or more modules stored in the memory and configured to be executed by the one or more processors, wherein the one or more modules are configured to perform operations comprising:
receiving a multimedia data playing request at a display terminal side; the playing request comprises a multimedia data identifier;
acquiring corresponding video data from an on-demand server in mutual communication with the display terminal side according to the multimedia data identifier;
sending the multimedia data identifier to a mobile equipment side; the mobile equipment side is used for acquiring corresponding audio data from the on-demand server in communication with the display terminal side according to the multimedia data identifier, and the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
when the video data is played, generating a playing target timestamp according to the currently played video timestamp;
sending the playing target timestamp to a mobile equipment side; and the mobile equipment side is used for playing the audio data corresponding to the playing target timestamp.
35. An apparatus, comprising:
one or more processors;
a memory; and
one or more modules stored in the memory and configured to be executed by the one or more processors, wherein the one or more modules are configured to perform operations comprising:
receiving, at a mobile equipment side, a multimedia data identifier sent by a display terminal side; the multimedia data identifier is extracted from a playing request received by the display terminal side, and the display terminal side is used for acquiring video data from an on-demand server in communication with the display terminal side according to the multimedia data identifier; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
acquiring audio data from an on-demand server in communication with the mobile equipment according to the multimedia data identifier;
receiving a playing target timestamp sent by the display terminal side; the playing target timestamp is a timestamp generated by the display terminal side according to the currently played video timestamp when the video data is played;
and playing the audio data corresponding to the playing target timestamp.
CN201410093950.3A 2014-03-13 2014-03-13 The method, apparatus and equipment that a kind of video data and audio data are played simultaneously Active CN103905879B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410093950.3A CN103905879B (en) 2014-03-13 2014-03-13 The method, apparatus and equipment that a kind of video data and audio data are played simultaneously

Publications (2)

Publication Number Publication Date
CN103905879A true CN103905879A (en) 2014-07-02
CN103905879B CN103905879B (en) 2018-07-06

Family

ID=50996995

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410093950.3A Active CN103905879B (en) 2014-03-13 2014-03-13 The method, apparatus and equipment that a kind of video data and audio data are played simultaneously

Country Status (1)

Country Link
CN (1) CN103905879B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1860866A1 (en) * 2006-05-26 2007-11-28 British Telecommunications Public Limited Company Audio-visual reception
CN101809965A (en) * 2007-09-28 2010-08-18 汤姆逊许可公司 Communication technique able to synchronise the received stream with that sent to another device
CN103297824A (en) * 2013-05-29 2013-09-11 华为技术有限公司 Video processing method, dongle, control terminal and system
CN103369365A (en) * 2013-06-28 2013-10-23 东南大学 Audio and video synchronous recording device
CN103458277A (en) * 2013-08-26 2013-12-18 小米科技有限责任公司 Method and device for operating live channel programs
CN103458305A (en) * 2013-08-28 2013-12-18 小米科技有限责任公司 Video playing method and device, terminal device and server
CN103491334A (en) * 2013-09-11 2014-01-01 浙江大学 Video transcode method from H264 to HEVC based on region feature analysis

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106303648A (en) * 2015-06-11 2017-01-04 阿里巴巴集团控股有限公司 A kind of method and device synchronizing to play multi-medium data
CN106385611A (en) * 2016-10-08 2017-02-08 广东欧珀移动通信有限公司 Media data playing method, device and system
CN106385611B (en) * 2016-10-08 2019-04-12 Oppo广东移动通信有限公司 A kind of playback method of media data, apparatus and system
CN107734378B (en) * 2017-10-31 2019-11-01 维沃移动通信有限公司 A kind of audio and video synchronization method, device and mobile terminal
CN107734378A (en) * 2017-10-31 2018-02-23 维沃移动通信有限公司 A kind of audio and video synchronization method, device and mobile terminal
CN109982143B (en) * 2017-12-28 2021-04-23 中国移动通信集团陕西有限公司 Method, device, medium and equipment for determining video playing time delay
CN109982143A (en) * 2017-12-28 2019-07-05 中国移动通信集团陕西有限公司 A kind of method, apparatus, medium and the equipment of determining video playing time delay
CN110798725A (en) * 2018-08-02 2020-02-14 视联动力信息技术股份有限公司 Data processing method and device
CN109286813A (en) * 2018-11-14 2019-01-29 北京奇艺世纪科技有限公司 A kind of video communication quality detection method and device
CN111641864B (en) * 2019-03-01 2022-05-20 腾讯科技(深圳)有限公司 Video information acquisition method, device and equipment
CN111641864A (en) * 2019-03-01 2020-09-08 腾讯科技(深圳)有限公司 Video information acquisition method, device and equipment
CN109658765A (en) * 2019-03-04 2019-04-19 西安交通大学医学院第附属医院 A kind of digital medical images software teaching service system
CN110545454A (en) * 2019-08-27 2019-12-06 北京奇艺世纪科技有限公司 Data synchronous playing method and device
CN111918257A (en) * 2020-07-28 2020-11-10 歌尔光学科技有限公司 Head-mounted display device, data transmission method thereof and readable storage medium
CN111918257B (en) * 2020-07-28 2024-06-07 歌尔科技有限公司 Head-mounted display device, data transmission method thereof and readable storage medium
CN112086095A (en) * 2020-09-10 2020-12-15 深圳前海微众银行股份有限公司 Data processing method, device, equipment and storage medium
CN112086095B (en) * 2020-09-10 2024-01-19 深圳前海微众银行股份有限公司 Data processing method, device, equipment and storage medium
CN114827696A (en) * 2021-01-29 2022-07-29 华为技术有限公司 Method for synchronously playing cross-device audio and video data and electronic device
CN114827696B (en) * 2021-01-29 2023-06-27 华为技术有限公司 Method for synchronously playing audio and video data of cross-equipment and electronic equipment
CN113763919B (en) * 2021-09-29 2023-09-05 北京字跳网络技术有限公司 Video display method, device, computer equipment and storage medium
CN113763919A (en) * 2021-09-29 2021-12-07 北京字跳网络技术有限公司 Video display method and device, computer equipment and storage medium
CN114885198A (en) * 2022-07-07 2022-08-09 中央广播电视总台 Mixed network-oriented accompanying sound and video collaborative presentation system
CN114885198B (en) * 2022-07-07 2022-10-21 中央广播电视总台 Mixed network-oriented accompanying sound and video collaborative presentation system
CN118018795A (en) * 2024-01-31 2024-05-10 书行科技(北京)有限公司 Video playing method, device, electronic equipment and computer readable storage medium

Also Published As

Publication number Publication date
CN103905879B (en) 2018-07-06

Similar Documents

Publication Publication Date Title
CN103905879B (en) The method, apparatus and equipment that a kind of video data and audio data are played simultaneously
CN103905881B (en) The method, apparatus and equipment that a kind of video data and audio data are played simultaneously
CN103905876A (en) Video data and audio data synchronized playing method and device and equipment
CN103905878A (en) Video data and audio data synchronized playing method and device and equipment
US20140354441A1 (en) System and constituent media device components and media device-based ecosystem
CN106254903B (en) A kind of synchronous broadcast method of multi-medium data, apparatus and system
WO2017202348A1 (en) Video playing method and device, and computer storage medium
CN103391473B (en) Method and device for providing and acquiring audio and video
CN105208056B (en) Information interaction method and terminal
JP2014514847A (en) Systems and methods for implementing multicast using personal area network (PAN) wireless technology
US20150304701A1 (en) Play control method and device
CN111245854B (en) Media transmission method, media control method and device
US10463965B2 (en) Control method of scene sound effect and related products
EP3429176B1 (en) Scenario-based sound effect control method and electronic device
CN107360318B (en) Voice noise reduction method and device, mobile terminal and computer readable storage medium
CN105606117A (en) Navigation prompting method and navigation prompting apparatus
WO2020143658A1 (en) Method and apparatus for monitoring pdcch, terminal, base station, and storage medium
CN103491421B (en) Content displaying method, device and intelligent television
CN106205657B (en) A kind of lyric display method and device
WO2019242633A1 (en) Measurement interval processing method, terminal and network node
WO2024082906A1 (en) Information acquisition method and apparatus, bluetooth device, terminal device, and storage medium
WO2019191996A1 (en) Data transmission method and device
CN103458064A (en) Method, device and terminal equipment for transmitting address information of multimedia information
CN106303616B (en) Play control method, device and terminal
US10853412B2 (en) Scenario-based sound effect control method and electronic device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant