CN110636359B - Method and device for synchronously playing audio and video - Google Patents

Info

Publication number
CN110636359B
CN110636359B (application CN201810646307.7A)
Authority
CN
China
Prior art keywords
audio
video
thread
timestamp
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810646307.7A
Other languages
Chinese (zh)
Other versions
CN110636359A (en)
Inventor
王伟杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd filed Critical Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN201810646307.7A priority Critical patent/CN110636359B/en
Publication of CN110636359A publication Critical patent/CN110636359A/en
Application granted granted Critical
Publication of CN110636359B publication Critical patent/CN110636359B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/242 Synchronization processes, e.g. processing of PCR [Program Clock References]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302 Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4305 Synchronising client clock from received content stream, e.g. locking decoder clock with encoder clock, extraction of the PCR packets
    • H04N21/4307 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85 Assembly of content; Generation of multimedia applications
    • H04N21/854 Content authoring
    • H04N21/8543 Content authoring using a description language, e.g. Multimedia and Hypermedia information coding Expert Group [MHEG], eXtensible Markup Language [XML]
    • H04N21/8547 Content authoring involving timestamps for synchronizing content

Abstract

The application provides a method and a device for synchronously playing audio and video, applied to a multimedia device based on an HTML5 platform. The method comprises the following steps: a foreground thread sends audio and video timestamps to a background thread distinct from the foreground thread; the background thread determines, according to each frame's audio and video timestamps, whether to play the corresponding audio and video data, and if so returns a notification of playing the audio and video data to the foreground thread, so that the foreground thread plays the data. The background thread and the foreground thread run independently of each other. Synchronous audio and video playback on the HTML5 platform is thus realized by combining a foreground thread and a background thread. Because the foreground thread sends only timestamps rather than media data, the performance loss of data transmission between the foreground thread and the background thread is reduced, and the real-time playback of audio and video data by the foreground thread is ensured.

Description

Method and device for synchronously playing audio and video
Technical Field
The present application relates to the field of communications technologies, and in particular, to a method and an apparatus for audio and video synchronous playing.
Background
Audio-video synchronization refers to synchronization of video and audio, that is, the currently played sound and the currently displayed picture are consistent in the time domain. Existing audio and video synchronous playing methods generally target the Windows, Linux, Android, and Apple platforms, where audio and video data are passed directly between multiple foreground threads processed in parallel. However, because an HTML5 (Hypertext Markup Language, version 5) platform cannot run multiple foreground threads in parallel, these methods are not applicable to it.
Disclosure of Invention
In view of this, the present application provides a method and an apparatus for audio and video synchronous playing, to solve the problem that audio and video cannot be played synchronously on an HTML5 platform.
According to a first aspect of the embodiments of the present application, a method for audio and video synchronous playing is provided. The method is applied to a multimedia device based on an HTML5 platform; the multimedia device runs a foreground thread, which is used to play audio and video data. The method includes:
by running the foreground thread, the foreground thread sends audio and video timestamps to a background thread different from the foreground thread;
by running the background thread, the background thread determines, according to each frame's audio and video timestamps, whether to play the corresponding audio and video data; if so, it returns a notification of playing the audio and video data to the foreground thread, so that the foreground thread plays the data;
wherein the background thread and the foreground thread run independently of each other.
According to a second aspect of the embodiments of the present application, a device for audio and video synchronous playing is provided. The device is applied to a multimedia device based on an HTML5 platform; the multimedia device runs a foreground thread, which is used to play audio and video data. The device includes:
the first running module is configured to run the foreground thread, so that the foreground thread sends audio and video timestamps to a background thread different from the foreground thread;
the second running module is configured to run the background thread, so that the background thread determines, according to each frame's audio and video timestamps, whether to play the corresponding audio and video data and, if so, returns a notification of playing the audio and video data to the foreground thread so that the foreground thread plays it;
wherein the background thread and the foreground thread run independently of each other.
According to a third aspect of embodiments herein, there is provided a multimedia device, the device comprising a readable storage medium and a processor;
wherein the readable storage medium is configured to store machine executable instructions;
the processor is configured to read the machine executable instructions on the readable storage medium and execute the instructions to implement the steps of the method of any one of claims 1 to 4.
By applying the embodiments of the application, audio and video synchronous playing on an HTML5 platform is realized by combining a foreground thread and a background thread: the foreground thread sends audio and video timestamps to a background thread distinct from itself; the background thread determines, according to each frame's audio and video timestamps, whether to play the corresponding audio and video data and, if so, returns a notification of playing the audio and video data to the foreground thread, so that the foreground thread plays the data. Because the foreground thread sends only timestamps rather than media data, the performance loss of data transmission between the foreground thread and the background thread is reduced, and the real-time playback of audio and video data by the foreground thread is ensured.
Drawings
Fig. 1 is a flowchart illustrating an embodiment of a method for audio and video synchronized playback according to an exemplary embodiment of the present application;
fig. 2 is a flowchart illustrating an embodiment of another method for audio-video synchronized playback according to an exemplary embodiment of the present application;
FIG. 3 is a diagram illustrating a hardware configuration of a multimedia device according to an exemplary embodiment of the present application;
fig. 4 is a block diagram of an embodiment of an apparatus for synchronously playing audio and video according to an exemplary embodiment of the present application;
fig. 5 is a block diagram of an embodiment of another apparatus for synchronously playing audio and video according to an exemplary embodiment of the present application.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. Depending on the context, the word "if" as used herein may be interpreted as "upon" or "when" or "in response to determining".
Fig. 1 is a flowchart of an embodiment of a method for audio and video synchronous playing according to an exemplary embodiment of this application. The embodiment may be applied to a multimedia device based on an HTML5 platform. In this embodiment, a foreground thread and a background thread run on the multimedia device: the foreground thread, which may also be called the page thread, plays the audio and video data, while the background thread determines whether the audio and video data should be played synchronously. On an HTML5 platform, the background thread can be implemented with the Web Worker technology. Because the foreground thread and the background thread run independently of each other, running the background thread does not affect the performance of the foreground thread. In addition, since threads on an HTML5 platform cannot exchange data directly, data transfer between the foreground thread and the background thread can be implemented through postMessage. As shown in fig. 1, the method for synchronously playing audio and video includes the following steps:
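A minimal sketch of this wiring follows; the worker script name, function names, and message shapes are assumptions, not something the patent specifies. The worker constructor is injected only so the wiring can run outside a browser; in a page it would simply be the global `Worker`.

```javascript
// Sketch of the foreground side: spawn the background thread as a Web Worker
// and receive its play/discard notifications through onmessage. The
// constructor is injected for testability; the script URL is hypothetical.
function createSyncWorker(WorkerCtor, scriptUrl, onNotification) {
  const worker = new WorkerCtor(scriptUrl);            // e.g. new Worker('sync-worker.js')
  worker.onmessage = (event) => onNotification(event.data);
  return worker;
}
```

In a browser this would be called as `createSyncWorker(Worker, 'sync-worker.js', handleNotification)`, after which the foreground thread posts timestamp-only payloads with `worker.postMessage(...)`.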
step 101: and by running the foreground thread, the foreground thread sends the audio and video timestamp to a background thread different from the foreground thread.
In an embodiment, the foreground thread stores the audio and video data and the audio and video timestamps in separate storage units. Because the cost of postMessage-based interaction between the foreground thread and the background thread grows with the amount of data transferred, the foreground thread sends only the timestamps to the background thread; this reduces the performance loss of data transmission between the two threads and preserves the real-time playback of audio and video data by the foreground thread.
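A sketch of this separation, with class and field names assumed for illustration: the decoded media payloads stay in page-local storage, and only the lightweight timestamp lists are ever exposed for transfer to the worker.

```javascript
// Sketch (hypothetical names): media stays in the page; only timestamps
// cross the thread boundary via postMessage.
class FrameStore {
  constructor() {
    this.media = new Map();        // timestamp -> media payload, never posted
    this.videoTimestamps = [];     // the "video timestamp linked list"
    this.audioTimestamps = [];     // the "audio timestamp linked list"
  }
  addVideoFrame(ts, payload) {
    this.media.set('v' + ts, payload);
    this.videoTimestamps.push(ts);
  }
  addAudioFrame(ts, payload) {
    this.media.set('a' + ts, payload);
    this.audioTimestamps.push(ts);
  }
  // The only thing handed to postMessage: timestamps, no media bytes.
  timestampPayload() {
    return { video: this.videoTimestamps.slice(), audio: this.audioTimestamps.slice() };
  }
}
```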
In an embodiment, the foreground thread may send the audio and video timestamps to the background thread actively. Specifically, the foreground thread monitors whether the number of unsent video timestamps in the video timestamp linked list has reached a preset value; when it has, the foreground thread copies the unsent video timestamps in the video timestamp linked list and the unsent audio timestamps in the audio timestamp linked list, and sends the copies to the background thread.
That is, the foreground thread buffers a preset number of frames of video timestamps in the video timestamp linked list before each send; the preset value can be chosen according to actual requirements, as long as real-time video playback is not affected. In addition, after sending the copied timestamps to the background thread, the foreground thread may delete the sent video timestamps from the video timestamp linked list and the sent audio timestamps from the audio timestamp linked list.
In an embodiment, the foreground thread may instead send the audio and video timestamps passively. Specifically, upon receiving a data request from the background thread, the foreground thread copies the unsent video timestamps in the video timestamp linked list and the unsent audio timestamps in the audio timestamp linked list, and sends the copies to the background thread.
A data request from the background thread indicates that the background thread has finished processing the timestamps previously sent to it and needs new ones. As in the active mode, after sending the copies the foreground thread may delete the sent video timestamps and the sent audio timestamps from their linked lists.
In an embodiment, the foreground thread may also combine active and passive sending. Specifically, after the foreground thread and the background thread are started, once the number of video timestamps in the video timestamp linked list reaches the preset value, the foreground thread copies the video timestamps and the audio timestamps from their linked lists and sends the copies to the background thread; thereafter, whenever it receives a data request from the background thread, the foreground thread sends the unsent video timestamps and unsent audio timestamps directly. This ensures that the background thread can keep making its decisions continuously without affecting real-time playback by the foreground thread.
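The combined active/passive strategy can be sketched as follows. The class, field names, threshold value, and the injected post callback are assumptions for illustration; in a page, `post` would be `(msg) => worker.postMessage(msg)`.

```javascript
// Sketch of the combined send strategy: the first batch is pushed actively
// once enough video timestamps accumulate; later batches go out only when
// the background thread asks for more.
class TimestampSender {
  constructor(post, threshold) {
    this.post = post;              // e.g. (msg) => worker.postMessage(msg)
    this.threshold = threshold;    // the "preset value" of buffered video timestamps
    this.video = [];
    this.audio = [];
    this.startedActively = false;
  }
  enqueue(videoTs, audioTs) {
    if (videoTs !== undefined) this.video.push(videoTs);
    if (audioTs !== undefined) this.audio.push(audioTs);
    // Active phase: push the first batch once the threshold is reached.
    if (!this.startedActively && this.video.length >= this.threshold) {
      this.startedActively = true;
      this.flush();
    }
  }
  // Passive phase: the background thread requests data when it has finished
  // processing the previous batch.
  onDataRequest() {
    if (this.startedActively) this.flush();
  }
  flush() {
    // Copy the unsent timestamps, send them, then delete them from the lists.
    this.post({ video: this.video.slice(), audio: this.audio.slice() });
    this.video.length = 0;
    this.audio.length = 0;
  }
}
```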
Step 102: by running the background thread, the background thread determines whether to play audio and video data corresponding to each frame of audio and video time stamp according to each frame of audio and video time stamp, if so, step 103 is executed, and if not, step 104 is executed.
The processing procedure of step 102 may be exemplarily referred to the following description of the embodiment shown in fig. 2, and will not be described in detail here.
Step 103: the background thread returns a notification of playing the audio and video data to the foreground thread, so that the foreground thread plays the audio and video data.
In an embodiment, the notification sent by the background thread may carry the video timestamp, the audio timestamp, and a play flag, so that the foreground thread can synchronously play the video data corresponding to the video timestamp and the audio data corresponding to the audio timestamp according to the play flag. In addition, the foreground thread may delete the corresponding video data and audio data after playing them.
It should be noted that because an audio and video synchronous playing service usually treats video as primary and audio as auxiliary, when the background thread determines whether to play the data for a frame and finds that the video timestamp is smaller than the audio timestamp (that is, the audio lags the video), it may carry only the video timestamp and the play flag in the notification, so that the foreground thread plays only the video data corresponding to the video timestamp.
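A sketch of how the foreground thread might interpret such notifications. The field names (`flag`, `videoTs`, `audioTs`) are assumed; the patent only states that a play or discard flag and the relevant timestamps are carried.

```javascript
// Sketch (hypothetical field names): decide which streams to render from a
// background-thread notification. A play notification without an audio
// timestamp means the audio lagged and only video should be rendered.
function handleNotification(note) {
  if (note.flag === 'discard') {
    return { playVideo: false, playAudio: false };
  }
  // note.flag === 'play'
  return {
    playVideo: note.videoTs !== undefined,
    playAudio: note.audioTs !== undefined,
  };
}
```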
Step 104: the background thread returns a notification of discarding the audio and video data to the foreground thread, so that the foreground thread discards the audio and video data.
In an embodiment, the notification sent by the background thread may carry the video timestamp, the audio timestamp, and a discard flag, so that the foreground thread discards the video data corresponding to the video timestamp and the audio data corresponding to the audio timestamp according to the discard flag.
In this embodiment of the application, audio and video synchronous playing on the HTML5 platform is realized by combining a foreground thread and a background thread: the foreground thread sends audio and video timestamps to a background thread distinct from itself; the background thread determines, according to each frame's audio and video timestamps, whether to play the corresponding audio and video data and, if so, returns a notification of playing the audio and video data to the foreground thread, so that the foreground thread plays the data. Because the foreground thread sends only timestamps, the performance loss of data transmission between the foreground thread and the background thread is reduced, and the real-time playback of audio and video data by the foreground thread is ensured.
Fig. 2 is a flowchart of another embodiment of a method for audio and video synchronous playing according to an exemplary embodiment of this application. This embodiment illustrates how the background thread determines, according to each frame's audio and video timestamps, whether to play the corresponding audio and video data. As shown in fig. 2, the method includes the following steps:
step 201: and the background thread reads the first frame audio and video time stamp and the system time, determines the frame video time stamp as a reference time point, determines the read system time as a reference time point, and directly determines to play audio and video data corresponding to the frame audio and video time stamp.
In an embodiment, before the background thread performs step 201, it may be determined whether the number of the video timestamps is greater than 1, if so, step 201 is performed, and if not, the notification of playing the audio/video data is directly sent to the foreground thread.
In an embodiment, the background thread resets the synchronization reference at the first frame's audio and video timestamps that it reads: the first frame's video timestamp is used as the video reference point for subsequently computing the actual offset duration of each frame's video timestamp, while the system time read at that moment is used as the system reference point for subsequently computing the reference offset duration of each frame.
Step 202: starting from the second frame's audio and video timestamps, each time the background thread reads one frame's audio and video timestamps and one system time, it determines whether to play the corresponding audio and video data using that frame's timestamps, the read system time, the video reference point, and the system reference point.
In an embodiment, to determine whether to play the audio and video data corresponding to a frame, the background thread determines the difference between the frame's video timestamp and the video reference point as the actual offset duration, and the difference between the read system time and the system reference point as the reference offset duration. It then computes the difference between the actual offset duration and the reference offset duration, and uses this difference together with the audio timestamp to decide whether to play the audio and video data corresponding to the frame.
Specifically: if the difference lies between the negative threshold and the positive threshold and the frame's audio timestamp is less than or equal to the frame's video timestamp, both the audio data and the video data for the frame are played; if the difference lies between the thresholds but the frame's audio timestamp is greater than the frame's video timestamp, only the video data corresponding to the frame's video timestamp is played; if the difference is smaller than the negative threshold, the audio and video data for the frame are discarded; if the difference is greater than the positive threshold, the background thread waits a preset duration, reads the system time again, and repeats the determination with the newly read system time.
An audio and video synchronous playing service usually treats video as primary and audio as auxiliary, so the difference between the actual offset duration and the reference offset duration is checked first, and the relation between the audio timestamp and the video timestamp is examined only when that difference is acceptable. A positive difference means the playing time has not yet arrived; a negative difference means it has already passed; zero means it has just arrived. The interval between the negative and positive thresholds is the tolerated deviation and can be set from practical experience. When the playing time has not yet arrived, the background thread waits for a preset duration before reading the system time again; this duration (in milliseconds) can be smaller than the width of the threshold interval, for example smaller than 10 milliseconds for a positive threshold of 5 and a negative threshold of -5.
In an exemplary scenario, a frame is read with video timestamp 14:47:00, audio timestamp 14:46:59, system time 14:47:01, video reference point 14:46:00, and system reference point 14:46:01, where the timestamp format is minutes:seconds:milliseconds and the positive and negative thresholds are 5 and -5, respectively. The actual offset duration is the frame's video timestamp minus the video reference point, i.e., 1 second; the reference offset duration is the read system time minus the system reference point, i.e., 1 second. The difference between them is 0, which lies between -5 and 5, and the audio timestamp is smaller than the video timestamp, so it is determined that the audio and video data corresponding to this frame's timestamps are played.
It will be understood by those skilled in the art that the time stamps in the above scenario are merely illustrative and do not represent time stamps in an actual application.
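The decision procedure of this embodiment can be sketched as a pure function over millisecond values; function and variable names are assumptions, and the thresholds are the ones used in the scenario above.

```javascript
// Sketch of the per-frame synchronization decision. All times are in
// milliseconds. Returns one of:
//   'play-both'   difference within thresholds and audioTs <= videoTs
//   'play-video'  difference within thresholds but audioTs >  videoTs
//   'discard'     difference below the negative threshold (too late)
//   'wait'        difference above the positive threshold (too early)
function syncDecision(videoTs, audioTs, systemTime, videoRef, systemRef, posTh, negTh) {
  const actualOffset = videoTs - videoRef;        // how far this frame is past the first frame
  const referenceOffset = systemTime - systemRef; // how much wall-clock time has elapsed
  const diff = actualOffset - referenceOffset;
  if (diff > posTh) return 'wait';
  if (diff < negTh) return 'discard';
  return audioTs <= videoTs ? 'play-both' : 'play-video';
}

// The scenario from the text: video 14:47:00, audio 14:46:59, system
// 14:47:01, video reference 14:46:00, system reference 14:46:01
// (minutes:seconds:milliseconds).
const ms = (m, s, milli) => (m * 60 + s) * 1000 + milli;
const decision = syncDecision(
  ms(14, 47, 0), ms(14, 46, 59), ms(14, 47, 1),
  ms(14, 46, 0), ms(14, 46, 1), 5, -5);
// actualOffset = 1000 ms, referenceOffset = 1000 ms, diff = 0,
// audio <= video, so decision === 'play-both'
```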
It should be noted that before making the determination for a frame, the background thread may first check whether an abnormal condition exists, so as to ensure the accuracy of playback. If no abnormal condition exists, the determination described above proceeds. If an abnormal condition exists, the synchronization reference is reset: the current frame's video timestamp is determined as the new video reference point, the currently read system time is determined as the new system reference point, and a notification to discard the audio and video data is returned to the foreground thread.
In the abnormal-condition check, the background thread first judges whether the current frame's reference offset duration is smaller than the previous frame's; if so, an abnormal condition exists and the synchronization reference is reset. Otherwise, it computes the adjacent difference between the current frame's video timestamp and the previous frame's video timestamp; if this adjacent difference exceeds a certain threshold range, or the current frame's actual offset duration is negative, an abnormal condition likewise exists and the synchronization reference is reset.
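A sketch of the abnormal-condition check under assumed names, combining the three tests described above (reference offset going backwards, an excessive adjacent timestamp jump, and a negative actual offset):

```javascript
// Sketch (hypothetical names): `prev` and `curr` each carry the frame's
// referenceOffset, actualOffset, and videoTs, computed as in the decision
// step; jumpThreshold is the tolerated adjacent-frame timestamp jump.
function isAbnormal(prev, curr, jumpThreshold) {
  // System clock moved backwards relative to the previous frame.
  if (curr.referenceOffset < prev.referenceOffset) return true;
  // Video timestamps jumped too far between adjacent frames.
  if (Math.abs(curr.videoTs - prev.videoTs) > jumpThreshold) return true;
  // The frame predates the video reference point.
  if (curr.actualOffset < 0) return true;
  return false;
}
```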
It should be further noted that before reading each frame's audio and video timestamps, the background thread may check whether it has finished processing all currently received timestamps; if so, it sends a data request to the foreground thread, and if not, it reads the next frame's audio and video timestamps.
In this embodiment, for the audio and video timestamps sent by the foreground thread to the background thread, the background thread first reads the first frame's audio and video timestamps and the system time, determines the first frame's video timestamp as the video reference point and the read system time as the system reference point, and directly determines to play the audio and video data corresponding to the first frame's timestamps.
Fig. 3 is a hardware block diagram of a multimedia device according to an exemplary embodiment of the present application. The multimedia device includes: a communication interface 301, a processor 302, a machine-readable storage medium 303, and a bus 304; the communication interface 301, the processor 302, and the machine-readable storage medium 303 communicate with each other via the bus 304. The processor 302 may carry out the above-described method for audio and video synchronous playing by reading and executing, from the machine-readable storage medium 303, the machine executable instructions corresponding to the control logic of the method; for details, refer to the embodiments above, which are not repeated here.
The machine-readable storage medium 303 referred to herein may be any electronic, magnetic, optical, or other physical storage device that can contain or store information such as executable instructions and data. For example, the machine-readable storage medium may be: RAM (Random Access Memory), a volatile memory, a non-volatile memory, a flash memory, a storage drive (e.g., a hard drive), any type of storage disk (e.g., an optical disc or a DVD), a similar storage medium, or a combination thereof.
Further, the multimedia device may be a variety of terminal or backend devices, such as a camera, server, mobile phone, Personal Digital Assistant (PDA), mobile audio or video player, game console, Global Positioning System (GPS) receiver, or portable storage device such as a Universal Serial Bus (USB) flash drive, to name a few.
Fig. 4 is a structural diagram of an embodiment of an apparatus for synchronous audio and video playing according to an exemplary embodiment of the present application. This embodiment may be applied to a multimedia device based on an HTML5 platform, where the multimedia device runs a foreground thread and the foreground thread is used to play audio and video data. As shown in Fig. 4, the apparatus for synchronous audio and video playing includes:
the first running module 40, configured to run the foreground thread, whereby the foreground thread sends audio/video timestamps to a background thread different from the foreground thread;
the second running module 41, configured to run the background thread, whereby the background thread determines, according to each frame's audio/video timestamp, whether to play the audio/video data corresponding to that timestamp, and if so, returns a play notification to the foreground thread so that the foreground thread plays the audio/video data;
wherein the background thread and the foreground thread run independently of each other.
In an optional implementation, the first running module 40 is specifically configured to monitor whether the number of unsent video timestamps in the video timestamp linked list has reached a preset value; when it is monitored that the preset value has been reached, to copy the unsent video timestamps in the video timestamp linked list and the unsent audio timestamps in the audio timestamp linked list; and to send the copied audio/video timestamps to the background thread.
In another optional implementation, the first running module 40 is specifically configured to copy, when a data request from the background thread is received, the unsent video timestamps in the video timestamp linked list and the unsent audio timestamps in the audio timestamp linked list, and to send the copied audio/video timestamps to the background thread.
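Both trigger variants above (threshold-driven and request-driven) share the same copy-and-send step. A minimal sketch of that step follows; the names are illustrative, not taken from the patent, and plain arrays stand in for the linked lists. The entries are copied rather than moved, so the foreground thread's own lists remain intact for its playback bookkeeping.

```javascript
// Hypothetical sketch of the foreground thread's copy-and-send step.
// `state` tracks how many entries of each list have already been sent.
function takeUnsentTimestamps(videoList, audioList, state) {
  const batch = {
    video: videoList.slice(state.videoSent), // copy, do not remove, entries
    audio: audioList.slice(state.audioSent),
  };
  state.videoSent = videoList.length; // mark everything up to here as sent
  state.audioSent = audioList.length;
  return batch; // on an HTML5 platform this would go to worker.postMessage()
}
```

A second call with unchanged lists returns empty batches, so no timestamp is ever delivered to the background thread twice.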
Fig. 5 is a structural diagram of another embodiment of an apparatus for synchronous audio and video playing according to an exemplary embodiment of the present application. Based on the embodiment shown in Fig. 4, and as shown in Fig. 5, the second running module 41 includes:
the first determining submodule 411, configured to read the first frame's audio/video timestamp and the system time, determine the first frame's timestamp as the reference timestamp, determine the read system time as the reference system time, and directly determine to play the audio/video data corresponding to the first frame's timestamp;
and the second determining submodule 412, configured to determine, starting from the second frame's audio/video timestamp and each time a frame's audio/video timestamp and a system time are read, whether to play the audio/video data corresponding to that frame's timestamp by using the frame's audio/video timestamp, the read system time, the reference timestamp, and the reference system time.
The implementation of the functions and roles of each unit in the above apparatus is described in detail in the implementation of the corresponding steps in the above method, and is not repeated here.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the application. One of ordinary skill in the art can understand and implement it without inventive effort.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the scope of protection of the present application.

Claims (9)

1. A method for synchronous audio and video playing, applied to a multimedia device based on an HTML5 platform, wherein the multimedia device runs a foreground thread and the foreground thread is used to play audio and video data, the method comprising:
running the foreground thread, whereby the foreground thread sends audio/video timestamps to a background thread different from the foreground thread;
running the background thread, whereby the background thread determines, according to each frame's audio/video timestamp, whether to play the audio/video data corresponding to that timestamp, and if so, returns a play notification to the foreground thread so that the foreground thread plays the audio/video data;
wherein the background thread and the foreground thread run independently of each other.
2. The method of claim 1, wherein sending, by the foreground thread, an audio-video timestamp to a background thread comprises:
monitoring whether the number of unsent video timestamps in a video timestamp linked list has reached a preset value;
when it is monitored that the preset value has been reached, copying the unsent video timestamps in the video timestamp linked list and the unsent audio timestamps in the audio timestamp linked list;
and sending the copied audio/video timestamps to the background thread.
3. The method of claim 1, wherein sending, by the foreground thread, an audio-video timestamp to a background thread comprises:
copying, when a data request from the background thread is received, the unsent video timestamps in a video timestamp linked list and the unsent audio timestamps in an audio timestamp linked list;
and sending the copied audio/video timestamps to the background thread.
4. The method of claim 1, wherein determining, by the background thread according to each frame's audio/video timestamp, whether to play the corresponding audio/video data comprises:
reading the first frame's audio/video timestamp and the system time, determining the first frame's timestamp as the reference timestamp, determining the read system time as the reference system time, and directly determining to play the audio/video data corresponding to the first frame's timestamp;
and, starting from the second frame's audio/video timestamp, each time a frame's audio/video timestamp and a system time are read, determining whether to play the audio/video data corresponding to that frame's timestamp by using the frame's audio/video timestamp, the read system time, the reference timestamp, and the reference system time.
5. An apparatus for synchronous audio and video playing, applied to a multimedia device based on an HTML5 platform, wherein the multimedia device runs a foreground thread and the foreground thread is used to play audio and video data, the apparatus comprising:
the first running module, configured to run the foreground thread, whereby the foreground thread sends audio/video timestamps to a background thread different from the foreground thread;
the second running module, configured to run the background thread, whereby the background thread determines, according to each frame's audio/video timestamp, whether to play the audio/video data corresponding to that timestamp, and if so, returns a play notification to the foreground thread so that the foreground thread plays the audio/video data;
wherein the background thread and the foreground thread run independently of each other.
6. The apparatus according to claim 5, wherein the first running module is specifically configured to monitor whether the number of unsent video timestamps in the video timestamp linked list has reached a preset value; when it is monitored that the preset value has been reached, to copy the unsent video timestamps in the video timestamp linked list and the unsent audio timestamps in the audio timestamp linked list; and to send the copied audio/video timestamps to the background thread.
7. The apparatus according to claim 5, wherein the first running module is specifically configured to copy, when a data request from the background thread is received, the unsent video timestamps in the video timestamp linked list and the unsent audio timestamps in the audio timestamp linked list, and to send the copied audio/video timestamps to the background thread.
8. The apparatus of claim 5, wherein the second operational module comprises:
the first determining submodule, configured to read the first frame's audio/video timestamp and the system time, determine the first frame's timestamp as the reference timestamp, determine the read system time as the reference system time, and directly determine to play the audio/video data corresponding to the first frame's timestamp;
and the second determining submodule, configured to determine, starting from the second frame's audio/video timestamp and each time a frame's audio/video timestamp and a system time are read, whether to play the audio/video data corresponding to that frame's timestamp by using the frame's audio/video timestamp, the read system time, the reference timestamp, and the reference system time.
9. A multimedia device, characterized in that the device comprises a readable storage medium and a processor;
wherein the readable storage medium is configured to store machine executable instructions;
the processor configured to read the machine executable instructions on the readable storage medium and execute the instructions to implement the steps of the method of any one of claims 1-4.
CN201810646307.7A 2018-06-21 2018-06-21 Method and device for synchronously playing audio and video Active CN110636359B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810646307.7A CN110636359B (en) 2018-06-21 2018-06-21 Method and device for synchronously playing audio and video


Publications (2)

Publication Number Publication Date
CN110636359A CN110636359A (en) 2019-12-31
CN110636359B true CN110636359B (en) 2021-11-23

Family

ID=68966484

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810646307.7A Active CN110636359B (en) 2018-06-21 2018-06-21 Method and device for synchronously playing audio and video

Country Status (1)

Country Link
CN (1) CN110636359B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101137066A (en) * 2007-05-11 2008-03-05 ZTE Corp. Multimedia data flow synchronous control method and device
CN103237255A (en) * 2013-04-24 2013-08-07 Nanjing Longyuan Micro-Electronic Technology Co., Ltd. Multi-thread audio and video synchronization control method and system
CN103747316A (en) * 2013-12-23 2014-04-23 Leshi Zhixin Electronic Technology (Tianjin) Co., Ltd. Audio and video synchronizing method and electronic device
CN104932892A (en) * 2015-06-19 2015-09-23 Inspur Software Group Co., Ltd. Web application middleware asynchronous nested calling method realized based on WebWorker
CN108089927A (en) * 2016-11-23 2018-05-29 Alibaba Group Holding Ltd. The method and device of message communicating are realized based on Web Worker

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130014141A1 (en) * 2011-07-06 2013-01-10 Manish Bhatia Audience Atmospherics Monitoring Platform Apparatuses and Systems
CN105100963B (en) * 2014-05-22 2017-12-22 Wistron Corp. Audio/video media file synthesis method and service system


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
HTML5 Web Workers: foreground/background thread interaction; Qingmeng; https://blog.csdn.net/xyy_933/article/details/51436578; 2016-05-17; full text *
Front-end workers: the Web Worker; Xiaoxiang Daiyu; https://www.cnblogs.com/pqjwyn/p/7912703.html; 2017-11-28; full text *

Also Published As

Publication number Publication date
CN110636359A (en) 2019-12-31

Similar Documents

Publication Publication Date Title
US9628531B2 (en) Systems and methods for controlling client behavior in adaptive streaming
US11563992B2 (en) Video playback quality detection method and apparatus
CN107277558B (en) Player client, system and method for realizing synchronization of live video
CN107018427B (en) Live broadcast shared content processing method and device
CN110166788B (en) Information synchronous playing method, device and storage medium
CN107277594A (en) A kind of video and audio and barrage synchronous method and device
EP3824638B1 (en) Advanced preparation for content revision based on expected latency in obtaining new content
US11032584B2 (en) Picture storage method, apparatus and video monitoring system
CN109714622B (en) Video data processing method and device and electronic equipment
CN102685554A (en) Method and device for processing video play
US11540028B2 (en) Information presenting method, terminal device, server and system
CN111447455A (en) Live video stream playback processing method and device and computing equipment
CN106604064A (en) Rapid broadcasting method and device
CN110581973A (en) data playback method and device, terminal equipment and storage medium
CN108259998B (en) Player, play control method and device, electronic equipment and play system
US10200433B2 (en) Client device, a method for receiving a streaming media data and a streaming media data transmission system
CN110636359B (en) Method and device for synchronously playing audio and video
US10674188B2 (en) Playback apparatus, method of controlling playback apparatus, playback method and server apparatus
CN103826158A (en) Network broadcasting device and method
CN106303754A (en) A kind of audio data play method and device
CN106331789A (en) Carousel program playing control method and carousel program playing control device
JP2013009332A5 (en)
CN106899843B (en) A kind of video service quality appraisal procedure and device
CN112449209B (en) Video storage method and device, cloud server and computer readable storage medium
CN113973215A (en) Data deduplication method and device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant