CN115883865A - Method and system for synchronizing whiteboard interactive data and audio-video data, storage medium and electronic equipment

Info

Publication number: CN115883865A
Authority: CN (China)
Prior art keywords: data, whiteboard, stream, audio, interactive
Legal status: Pending
Application number: CN202310046315.9A
Other languages: Chinese (zh)
Inventor: 王辉
Current Assignee: Beijing Vhall Time Technology Co ltd
Original Assignee: Beijing Vhall Time Technology Co ltd
Priority date / filing date: 2023-01-31
Publication date: 2023-03-31
Application filed by Beijing Vhall Time Technology Co ltd

Landscapes

  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The application discloses a method and a system for synchronizing whiteboard interaction data and audio-video data, together with a storage medium and an electronic device. The synchronization method comprises the following steps: acquiring whiteboard interaction data and audio-video data at a speaking end; forming a push stream from the whiteboard interaction data and the audio-video data; and pushing the push stream to a viewing end over the network in response to a push stream data request from the viewing end, so that the viewing end splits the push stream and thereby synchronously acquires the whiteboard interaction data and the audio-video data obtained from the split.

Description

Method and system for synchronizing whiteboard interactive data and audio-video data, storage medium and electronic equipment
Technical Field
The application relates to the technical field of live broadcast, in particular to a method and a system for synchronizing whiteboard interaction data and audio-video data, a storage medium and electronic equipment.
Background
With the development of internet technology and new media technology, interactive live video broadcasting is applied ever more widely. People in different places can join the same interactive live video session over the network, enabling real-time live video interaction among multiple participants.
In scenarios such as corporate live-broadcast training, online live teaching, and government or enterprise live video conferences, the broadcasting side needs interactive whiteboard display: the operator at the speaking end must present the required interactive content to the viewing end by writing on a whiteboard.
At present, the audio-video data and the whiteboard interaction data produced during a live broadcast are transmitted over two separate transmission channels. The inventor of the present application has found, however, that when a large number of viewing terminals access the live broadcast, or when viewers are spread across regions, the two data streams can fall out of synchronization, so that the audio-video data and the whiteboard interaction data arrive at the viewing end with differing delays, degrading the live-broadcast experience and effect at the viewing end.
Disclosure of Invention
According to one aspect of the application, a method for synchronizing whiteboard interaction data and audio-video data is disclosed. The synchronization method comprises: acquiring whiteboard interaction data and audio-video data at a speaking end; forming a push stream from the whiteboard interaction data and the audio-video data; and pushing the push stream to a viewing end over the network in response to a push stream data request from the viewing end, so that the viewing end splits the push stream and thereby synchronously acquires the whiteboard interaction data and the audio-video data obtained from the split.
According to some embodiments of the present application, forming the push stream from the whiteboard interaction data and the audio-video data comprises: encapsulating the whiteboard interaction data; and merging the encapsulated whiteboard interaction data into the audio-video data, wherein the encapsulated whiteboard interaction data comprises interaction data information.
According to some embodiments of the present application, forming the push stream from the whiteboard interaction data and the audio-video data further comprises: after a preset time, updating the whiteboard interaction data and the corresponding interaction data information; encapsulating the updated whiteboard interaction data; and merging the encapsulated updated whiteboard interaction data into the audio-video data.
According to some embodiments of the application, the synchronization method further comprises: receiving local interaction data information from the viewing end; judging whether the local interaction data information is consistent with the interaction data information; if so, displaying the whiteboard interaction data at the viewing end; if not, resetting the whiteboard interaction data at the viewing end and displaying the updated whiteboard interaction data corresponding to the interaction data information.
According to some embodiments of the present application, after the push stream is formed from the whiteboard interaction data and the audio-video data, the synchronization method further comprises storing the push stream.
According to another aspect of the present application, a system for synchronizing whiteboard interaction data and audio-video data is disclosed. The synchronization system comprises a data processing unit and a push stream processing unit. The data processing unit acquires whiteboard interaction data and audio-video data at the speaking end and forms a push stream from the whiteboard interaction data and the audio-video data; the push stream processing unit pushes the push stream to the viewing end over the network in response to a push stream data request from the viewing end, so that the viewing end splits the push stream and thereby synchronously acquires the whiteboard interaction data and the audio-video data obtained from the split.
According to some embodiments of the application, the data processing unit encapsulates the whiteboard interaction data and merges the encapsulated whiteboard interaction data into the audio-video data, the encapsulated whiteboard interaction data comprising interaction data information.
According to some embodiments of the application, after a preset time the data processing unit updates the whiteboard interaction data and the corresponding interaction data information; the data processing unit further encapsulates the updated whiteboard interaction data and merges the encapsulated updated whiteboard interaction data into the audio-video data.
According to some embodiments of the application, the push stream processing unit receives local interaction data information from the viewing end and judges whether the local interaction data information is consistent with the interaction data information. If the two are consistent, the push stream processing unit displays the whiteboard interaction data at the viewing end; if they are inconsistent, the push stream processing unit resets the whiteboard interaction data at the viewing end and displays the updated whiteboard interaction data corresponding to the interaction data information.
According to some embodiments of the application, the synchronization system further comprises a storage unit for storing the push stream.
According to another aspect of the application, a live broadcast system is also provided. The live broadcast system comprises the system for synchronizing whiteboard interaction data and audio-video data described above.
According to yet another aspect of the present application, there is also provided a non-volatile computer-readable storage medium. The storage medium has stored thereon a computer program which can implement the synchronization method as described above.
According to another aspect of the application, an electronic device is also provided. The electronic device includes one or more processors and storage for storing one or more programs. When executed by one or more processors, the one or more programs enable the one or more processors to implement the synchronization methods described above.
According to the technical solution of the application, the whiteboard interaction data and the audio-video data are acquired at the speaking end and combined into a single push stream containing both. By sending a push stream data request, the viewing end obtains the push stream directly from the network, splits it, and thereby acquires the whiteboard interaction data and the audio-video data synchronously.
Compared with the prior-art approach of transmitting whiteboard interaction data and audio-video data over two separate channels, the technical solution of the application avoids the data desynchronization caused by differing channel delays and fixed latencies, ensures the live-broadcast effect, and improves the viewing experience.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present application; other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1 shows a flow diagram of a synchronization method of an example embodiment of the present application;
FIG. 2 shows a schematic diagram of a push stream according to an example embodiment of the present application;
FIG. 3 illustrates another flow diagram of a synchronization method of an exemplary embodiment of the present application;
fig. 4 shows a schematic structural diagram of a synchronization system according to an exemplary embodiment of the present application.
Description of the reference numerals:
synchronization system 1; data processing unit 10; push stream processing unit 20; storage unit 30.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The same reference numerals denote the same or similar parts in the drawings, and thus, a repetitive description thereof will be omitted.
The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the embodiments of the disclosure can be practiced without one or more of the specific details, or with other means, components, materials, devices, etc. In such cases, well-known structures, methods, devices, implementations, materials, or operations are not shown or described in detail.
Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
The terms "first," "second," and the like in the description and claims of the present application and in the foregoing drawings are used for distinguishing between different objects and not for describing a particular sequential order.
The technical solutions of the present application will be described clearly and completely with reference to the accompanying drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, not all, of the embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
One aspect of the application provides a method for synchronizing whiteboard interaction data and audio-video data, which enables the whiteboard interaction data and audio-video data output by the speaking end during a live broadcast to be displayed synchronously at the viewing end.
The technical solution of the application is described in detail below with reference to the accompanying drawings.
Fig. 1 shows a flow diagram of a synchronization method according to an exemplary embodiment of the present application. As shown in FIG. 1, the synchronization method includes steps S100 to S300. According to an example embodiment, the synchronization method is performed by a server.
In step S100, the server acquires whiteboard interaction data and audio-video data at the speaking end.
According to an example embodiment, when the operator at the speaking end starts the live broadcast, the server acquires the whiteboard interaction data and audio-video data of the speaking end.
For example, the whiteboard interaction data may be the document information displayed at the speaking end and the editing information produced by the operator at the speaking end on the whiteboard, such as page turning, writing, drawing, modification, annotation, deletion of drawing data and clearing, all of which are used for interacting with the viewing end. The audio-video data may be the voice of the operator at the speaking end and the video shown on the screen that displays the whiteboard interaction data.
In step S200, the server forms a push stream from the whiteboard interaction data and the audio-video data.
According to an example embodiment, the server combines the whiteboard interaction data and the audio-video data received from the speaking end, thereby forming a push stream that contains both.
Optionally, in step S200, the server encapsulates the whiteboard interaction data and merges the encapsulated whiteboard interaction data into the audio-video data.
For example, the server receives the whiteboard interaction data from the speaking end and stores it in a local file. The server encapsulates the whiteboard interaction data using SEI (Supplemental Enhancement Information) and merges the encapsulated whiteboard interaction data into the audio-video data.
SEI allows additional information to be attached to the audio-video data, so that the stream can carry data other than images and audio. The server therefore embeds the whiteboard interaction data stably into the audio-video data through SEI, so that the whiteboard interaction data no longer depends on a separate data channel; by packaging the whiteboard interaction data together with the audio-video data via SEI, the two can be transmitted accurately and synchronously.
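The patent does not specify the byte layout of the SEI message, so the following Python sketch is only an illustration of one common approach: serializing the whiteboard interaction data as JSON and wrapping it in an H.264 SEI NAL unit of payload type 5 (user_data_unregistered). The UUID, the JSON field names, and the function name are assumptions, not part of the disclosure.

```python
import json
import uuid

# Assumed 16-byte identifier for whiteboard SEI payloads; user_data_unregistered
# (payload type 5) carries an implementer-chosen UUID followed by arbitrary data.
WHITEBOARD_SEI_UUID = uuid.UUID("e5a6c0de-0000-4000-8000-000000000001").bytes

def build_whiteboard_sei(whiteboard_data: dict, version: int) -> bytes:
    """Wrap whiteboard interaction data in an H.264 SEI NAL unit
    (NAL type 6, payload type 5 = user_data_unregistered)."""
    payload = WHITEBOARD_SEI_UUID + json.dumps(
        {"version": version, "data": whiteboard_data}
    ).encode("utf-8")

    nal = bytearray([0x06, 0x05])   # NAL header (SEI) + payload type 5
    size = len(payload)
    while size >= 255:              # payload size is coded as a run of 0xFF bytes + remainder
        nal.append(0xFF)
        size -= 255
    nal.append(size)
    nal += payload
    nal.append(0x80)                # rbsp_trailing_bits
    # Emulation-prevention bytes (0x03 insertion) are omitted for brevity.
    return bytes(nal)
```

The resulting NAL unit would be interleaved with the encoded audio-video frames before the combined stream is pushed, so the whiteboard data travels inside the same stream rather than over a separate channel.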
According to an example embodiment, the encapsulated whiteboard interaction data comprises interaction data information.
For example, the interaction data information includes version information of the whiteboard interaction data: the server takes a version snapshot of the whiteboard interaction data and generates a version number corresponding to the current whiteboard interaction data.
In step S300, the server pushes the push stream to the viewing end over the network in response to a push stream data request from the viewing end, so that the viewing end splits the push stream and thereby synchronously acquires the whiteboard interaction data and the audio-video data obtained from the split.
For example, when the viewing end first joins the live broadcast, it sends a push stream data request to the server to obtain the push stream containing the whiteboard interaction data and the audio-video data. The viewing end pulls the push stream from the CDN and splits it, thereby acquiring the whiteboard interaction data and the audio-video data synchronously.
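The patent leaves the splitting step to the player implementation. Below is a minimal sketch, assuming the viewing end has already demuxed the pulled stream into individual NAL units and that the SEI layout from the previous sketch is used; both assumptions go beyond the disclosure.

```python
import json

def split_push_stream(nal_units, whiteboard_uuid: bytes):
    """Separate whiteboard SEI messages from ordinary audio-video NAL units."""
    whiteboard_updates, av_units = [], []
    for nal in nal_units:
        # Is this an SEI NAL unit (type 6) carrying user_data_unregistered (payload type 5)?
        if len(nal) > 19 and (nal[0] & 0x1F) == 6 and nal[1] == 5:
            i = 2
            while nal[i] == 0xFF:   # skip the 0xFF run of the payload-size field
                i += 1
            i += 1
            if nal[i:i + 16] == whiteboard_uuid:
                body = nal[i + 16:-1]          # drop rbsp_trailing_bits
                whiteboard_updates.append(json.loads(body.decode("utf-8")))
                continue
        av_units.append(nal)                   # everything else follows the normal player path
    return whiteboard_updates, av_units
```

Because the whiteboard updates ride inside the same stream as the audio-video frames, whatever the player renders and whatever the whiteboard layer draws were pulled together, which is what keeps the two synchronized at the viewing end.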
Alternatively, steps S200 and S300 may be performed at the speaking end, so that the whiteboard interaction data and the audio-video data have an even smaller processing time difference.
Through the above example embodiment, the technical solution of the application acquires and combines the whiteboard interaction data and the audio-video data at the speaking end to obtain a push stream containing both. By sending a push stream data request, the viewing end obtains the push stream directly from the network, splits it, and thereby acquires the whiteboard interaction data and the audio-video data synchronously.
Compared with the prior-art approach of transmitting whiteboard interaction data and audio-video data over two separate channels, the technical solution of the application avoids the data desynchronization caused by differing channel delays and fixed latencies, ensures the live-broadcast effect, and improves the viewing experience.
Optionally, the server also clears the whiteboard interaction data stored in the local file in response to a clear instruction from the speaking end, or deletes the corresponding entries of the whiteboard interaction data in the local file in response to a delete instruction from the speaking end.
For example, the operator at the speaking end sends a clear instruction to the server to remove whiteboard interaction data that is no longer needed, or sends a delete instruction for a particular drawn line or brush stroke, and the server executes the corresponding operation. Meaningless data can thus be removed from the push stream formed from the whiteboard interaction data and the audio-video data, preventing an excessive data volume from harming transmission efficiency.
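As a simple illustration only (the instruction format and field names are assumed, not taken from the disclosure), the handling of such instructions against the locally stored whiteboard interaction data might look like this:

```python
def handle_whiteboard_instruction(store: dict, instruction: dict) -> None:
    """Apply a clear or delete instruction from the speaking end to the
    whiteboard interaction data kept in the local store."""
    if instruction.get("type") == "clear":
        store.clear()                                   # drop all whiteboard interaction data
    elif instruction.get("type") == "delete":
        store.pop(instruction.get("target_id"), None)   # drop one drawn line or brush stroke
```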
Optionally, the server also updates the whiteboard interaction data and the corresponding interaction data information after a preset time.
According to an example embodiment, the server updates the whiteboard interaction data at every preset interval (e.g., 3 s). For example, every 3 s the server takes a version snapshot of all currently stored whiteboard interaction data and generates a version number corresponding to it.
The server encapsulates the updated whiteboard interaction data and merges it into the audio-video data, the updated whiteboard interaction data including the corresponding interaction data information (i.e., version information).
Updating the whiteboard interaction data and the interaction data information simultaneously can be understood as follows: each time the whiteboard interaction data is updated, the version information in the interaction data information is incremented (i.e., the version number increases by one).
FIG. 2 shows a schematic diagram of a push stream according to an example embodiment of the present application. As shown in FIG. 2, the server encapsulates the updated whiteboard interaction data via SEI and merges the encapsulated updated whiteboard interaction data (e.g., version number 1, version number 2 ... shown in FIG. 2) into the audio-video data, so that the push stream is kept up to date.
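A minimal sketch of this periodic update follows; the class name, the callback, and the 3 s interval are illustrative assumptions that only mirror the behaviour described above.

```python
import threading

SNAPSHOT_INTERVAL_S = 3.0   # the preset time; 3 s is the example used above

class WhiteboardVersioner:
    """Periodically snapshots the stored whiteboard interaction data and
    increments the version number carried in the interaction data information."""

    def __init__(self, store: dict, on_snapshot):
        self.store = store              # local store of whiteboard interaction data
        self.on_snapshot = on_snapshot  # e.g. encapsulate the snapshot via SEI and merge it
        self.version = 0
        self._lock = threading.Lock()

    def _tick(self) -> None:
        with self._lock:
            self.version += 1               # the version number grows by one per update
            snapshot = dict(self.store)     # snapshot of all currently stored data
        self.on_snapshot(snapshot, self.version)
        threading.Timer(SNAPSHOT_INTERVAL_S, self._tick).start()

    def start(self) -> None:
        threading.Timer(SNAPSHOT_INTERVAL_S, self._tick).start()
```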
Optionally, FIG. 3 shows another flowchart of the synchronization method according to an exemplary embodiment of the present application. As shown in FIG. 3, the synchronization method includes steps S100 to S700; steps S100 to S300 are described in detail above and are not repeated here.
In step S400, the server receives local interaction data information from the viewing end.
For example, the local interaction data information includes version information of the locally stored whiteboard interaction data. When the viewing end first obtains whiteboard interaction data, it initializes the locally stored whiteboard interaction data, treats it as the initial version, and stores the corresponding local version number. This ensures that the whiteboard interaction data is synchronized when the viewing end enters the live broadcast for the first time.
In step S500, the server judges whether the local interaction data information is consistent with the interaction data information.
In step S600, the server displays the whiteboard interaction data at the viewing end.
In step S700, the server resets the whiteboard interaction data at the viewing end and displays the updated whiteboard interaction data corresponding to the interaction data information.
For example, when the viewing end receives the whiteboard interaction data obtained by splitting the push stream, it also obtains the corresponding interaction data information (i.e., the version number). The server judges whether the current local version number of the viewing end is consistent with the version number of the currently pushed whiteboard interaction data; if so, the push stream is running normally; if not, the current push stream is in error or the audio-video connection has been interrupted.
If the current local version number is consistent with the version number of the currently pushed whiteboard interaction data, the server displays the whiteboard interaction data at the viewing end. If they are inconsistent, the server resets the whiteboard interaction data and the display state at the viewing end and displays the updated whiteboard interaction data corresponding to the version number of the currently pushed whiteboard interaction data, so that when the push stream is in error, the whiteboard interaction data shown at the viewing end is promptly brought back into line with the currently pushed whiteboard interaction data. In this way, whenever the push stream errs or the audio-video connection is interrupted, the whiteboard interaction data at the viewing end is updated automatically, subsequent display proceeds normally, and the whiteboard interaction data remains recoverable.
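A minimal sketch of that comparison follows, with an assumed renderer interface (clear/draw) standing in for whatever the viewing end actually uses to paint the whiteboard; the exact consistency rule and the return convention are assumptions.

```python
def apply_whiteboard_update(local_version: int, update: dict, renderer) -> int:
    """Compare the viewing end's local version number with the version number
    carried by the pushed whiteboard interaction data, then render accordingly."""
    pushed_version = update["version"]
    if local_version == pushed_version:
        # Versions agree: the push stream is running normally, draw as usual.
        renderer.draw(update["data"])
    else:
        # Versions disagree: the push stream erred or the audio-video connection
        # dropped, so reset the whiteboard and redraw from the pushed snapshot.
        renderer.clear()
        renderer.draw(update["data"])
    return pushed_version   # adopted as the new local version number
```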
Optionally, after the server forms the push stream from the whiteboard interaction data and the audio-video data in step S200, the synchronization method further includes storing the push stream.
For example, the server also stores the push stream to ensure that the whiteboard interaction data and the audio-video data of the live broadcast can be replayed. Since the stored push stream contains both the whiteboard interaction data and the audio-video data, the server can edit it with a video processing tool, so that the push stream remains editable.
Another aspect of the application provides a system for synchronizing whiteboard interaction data and audio-video data, which enables the whiteboard interaction data and audio-video data output by the speaking end during a live broadcast to be displayed synchronously at the viewing end.
Fig. 4 shows a schematic structural diagram of a synchronization system according to an exemplary embodiment of the present application. As shown in FIG. 4, the synchronization system 1 includes a data processing unit 10 and a push stream processing unit 20.
According to an exemplary embodiment, the data processing unit 10 acquires whiteboard interaction data and audio-video data at the speaking end and forms a push stream from the whiteboard interaction data and the audio-video data.
According to an exemplary embodiment, when the operator at the speaking end starts the live broadcast, the data processing unit 10 acquires the whiteboard interaction data and audio-video data of the speaking end.
For example, the whiteboard interaction data may be the document information displayed at the speaking end and the editing information produced by the operator at the speaking end on the whiteboard, such as page turning, writing, drawing, modification, annotation, deletion of drawing data and clearing, all of which are used for interacting with the viewing end. The audio-video data may be the voice of the operator at the speaking end and the video shown on the screen that displays the whiteboard interaction data.
According to an exemplary embodiment, the data processing unit 10 combines the whiteboard interaction data and the audio-video data received from the speaking end, thereby forming a push stream that contains both.
Optionally, the data processing unit 10 encapsulates the whiteboard interaction data and merges the encapsulated whiteboard interaction data into the audio-video data.
For example, the data processing unit 10 receives the whiteboard interaction data from the speaking end and stores it in a local file. The data processing unit 10 encapsulates the whiteboard interaction data using SEI (Supplemental Enhancement Information) and merges the encapsulated whiteboard interaction data into the audio-video data.
SEI allows additional information to be attached to the audio-video data, so that the stream can carry data other than images and audio. The data processing unit 10 therefore embeds the whiteboard interaction data stably into the audio-video data through SEI, so that the whiteboard interaction data no longer depends on a separate data channel; by packaging the whiteboard interaction data together with the audio-video data via SEI, the two can be transmitted accurately and synchronously.
According to an example embodiment, the encapsulated whiteboard interaction data comprises interaction data information.
For example, the interaction data information includes version information of the whiteboard interaction data: the data processing unit 10 takes a version snapshot of the whiteboard interaction data and generates a version number corresponding to the current whiteboard interaction data.
According to an exemplary embodiment, the push stream processing unit 20 pushes the push stream to the viewing end over the network in response to a push stream data request from the viewing end, so that the viewing end splits the push stream and thereby synchronously acquires the whiteboard interaction data and the audio-video data obtained from the split.
For example, when the viewing end first joins the live broadcast, it sends a push stream data request to the push stream processing unit 20 to obtain the push stream containing the whiteboard interaction data and the audio-video data. The viewing end pulls the push stream from the CDN and splits it, thereby acquiring the whiteboard interaction data and the audio-video data synchronously.
Through the above exemplary embodiment, the technical solution of the application acquires and combines the whiteboard interaction data and the audio-video data at the speaking end to obtain a push stream containing both. By sending a push stream data request, the viewing end obtains the push stream directly from the network, splits it, and thereby acquires the whiteboard interaction data and the audio-video data synchronously.
Compared with the prior-art approach of transmitting whiteboard interaction data and audio-video data over two separate channels, the technical solution of the application avoids the data desynchronization caused by differing channel delays and fixed latencies, ensures the live-broadcast effect, and improves the viewing experience.
Optionally, the data processing unit 10 also clears the whiteboard interaction data stored in the local file in response to a clear instruction from the speaking end, or deletes the corresponding entries of the whiteboard interaction data in the local file in response to a delete instruction from the speaking end.
For example, the operator at the speaking end sends a clear instruction to the data processing unit 10 to remove whiteboard interaction data that is no longer needed, or sends a delete instruction for a particular drawn line or brush stroke, and the data processing unit 10 executes the corresponding operation. Meaningless data can thus be removed from the push stream formed from the whiteboard interaction data and the audio-video data, preventing an excessive data volume from harming transmission efficiency.
Optionally, the data processing unit 10 also updates the whiteboard interaction data and the corresponding interaction data information after a preset time. The data processing unit 10 further encapsulates the updated whiteboard interaction data and merges the encapsulated updated whiteboard interaction data into the audio-video data.
According to an exemplary embodiment, the data processing unit 10 updates the whiteboard interaction data at every preset interval (e.g., 3 s). For example, every 3 s the data processing unit 10 takes a version snapshot of all currently stored whiteboard interaction data and generates a version number corresponding to it.
The data processing unit 10 encapsulates the updated whiteboard interaction data and merges it into the audio-video data, the updated whiteboard interaction data including the corresponding interaction data information (i.e., version information).
Updating the whiteboard interaction data and the interaction data information simultaneously can be understood as follows: each time the whiteboard interaction data is updated, the version information in the interaction data information is incremented (i.e., the version number increases by one).
Optionally, the push stream processing unit 20 receives local interaction data information from the viewing end and judges whether the local interaction data information is consistent with the interaction data information.
For example, the local interaction data information includes version information of the locally stored whiteboard interaction data. When the viewing end first obtains whiteboard interaction data, the push stream processing unit 20 initializes the locally stored whiteboard interaction data, treats it as the initial version, and stores the corresponding local version number. This ensures that the whiteboard interaction data is synchronized when the viewing end enters the live broadcast for the first time.
When the viewing end receives the whiteboard interaction data obtained by splitting the push stream, it also obtains the corresponding interaction data information (i.e., the version number). The push stream processing unit 20 judges whether the current local version number of the viewing end is consistent with the version number of the currently pushed whiteboard interaction data; if so, the push stream is running normally; if not, the current push stream is in error or the audio-video connection has been interrupted.
If the current local version number is consistent with the version number of the currently pushed whiteboard interaction data, the push stream processing unit 20 displays the whiteboard interaction data at the viewing end. If they are inconsistent, the push stream processing unit 20 resets the whiteboard interaction data at the viewing end and displays the updated whiteboard interaction data corresponding to the interaction data information, so that when the push stream is in error, the whiteboard interaction data shown at the viewing end is promptly brought back into line with the currently pushed whiteboard interaction data. In this way, whenever the push stream errs or the audio-video connection is interrupted, the whiteboard interaction data at the viewing end is updated automatically, subsequent display proceeds normally, and the whiteboard interaction data remains recoverable.
Optionally, as shown in FIG. 4, the synchronization system further comprises a storage unit 30. The storage unit 30 stores the push stream.
For example, the storage unit 30 stores the push stream to ensure that the whiteboard interaction data and the audio-video data of the live broadcast can be replayed. Since the stored push stream contains both the whiteboard interaction data and the audio-video data, it can be edited with a video processing tool, so that the push stream remains editable.
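Pulling the pieces together, a skeletal view of synchronization system 1 might look like the following; the method names and any behaviour beyond what FIG. 4 shows are assumptions rather than part of the disclosure.

```python
import json

class DataProcessingUnit:
    """Unit 10: collects whiteboard interaction data and audio-video data from the
    speaking end and merges them into one push stream."""

    def __init__(self):
        self.whiteboard_store: dict = {}
        self.version = 0

    def build_push_stream(self, whiteboard_ops: dict, av_frames: list) -> list:
        self.whiteboard_store.update(whiteboard_ops)
        self.version += 1
        # In a real encoder this would be an SEI NAL unit (see the earlier sketch);
        # here a JSON blob stands in for the encapsulated whiteboard data.
        sei_stub = json.dumps(
            {"version": self.version, "data": self.whiteboard_store}
        ).encode("utf-8")
        return [sei_stub, *av_frames]


class PushStreamProcessingUnit:
    """Unit 20: answers push stream data requests from viewing ends and pushes
    the combined stream to them over the network (e.g. via a CDN)."""

    def __init__(self, publish):
        self.publish = publish          # callable that hands data to the CDN edge

    def handle_request(self, viewer_id: str, push_stream: list) -> None:
        self.publish(viewer_id, push_stream)


class StorageUnit:
    """Unit 30: keeps the push stream so the live broadcast can be replayed or edited."""

    def __init__(self):
        self.recorded: list = []

    def store(self, push_stream: list) -> None:
        self.recorded.extend(push_stream)
```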
According to another aspect of the application, a live broadcast system is also provided. The live broadcast system comprises the system for synchronizing whiteboard interaction data and audio-video data described above.
According to yet another aspect of the present application, there is also provided a non-volatile computer-readable storage medium. The storage medium has stored thereon a computer program which may implement the synchronization method as described above.
According to another aspect of the application, an electronic device is also provided. The electronic device includes one or more processors and storage for storing one or more programs. When executed by one or more processors, the one or more programs enable the one or more processors to implement the synchronization methods described above.
Finally, it should be noted that the above-mentioned embodiments are only preferred embodiments of the present application, and not intended to limit the present application, and although the present application is described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that modifications may be made in the technical solutions of the foregoing embodiments, or equivalents may be substituted for some of the technical features. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (13)

1. A method for synchronizing whiteboard interaction data and audio-video data, characterized by comprising the following steps:
acquiring whiteboard interaction data and audio-video data at a speaking end;
forming a push stream from the whiteboard interaction data and the audio-video data;
and pushing the push stream to a viewing end over a network according to a push stream data request from the viewing end, so that the viewing end splits the push stream to synchronously acquire the whiteboard interaction data and the audio-video data obtained by splitting the push stream.
2. The synchronization method according to claim 1, wherein forming the push stream from the whiteboard interaction data and the audio-video data comprises:
encapsulating the whiteboard interaction data;
merging the encapsulated whiteboard interaction data into the audio-video data;
wherein the encapsulated whiteboard interaction data comprises interaction data information.
3. The synchronization method of claim 2, wherein forming the push stream from the whiteboard interaction data and the audio-video data further comprises:
after a preset time, updating the whiteboard interaction data and the corresponding interaction data information;
encapsulating the updated whiteboard interaction data;
and merging the encapsulated updated whiteboard interaction data into the audio-video data.
4. The synchronization method of claim 3, further comprising:
receiving local interaction data information from the viewing end;
judging whether the local interaction data information is consistent with the interaction data information;
if so, displaying the whiteboard interaction data at the viewing end;
if not, resetting the whiteboard interaction data at the viewing end and displaying the updated whiteboard interaction data corresponding to the interaction data information.
5. The synchronization method according to claim 1, wherein after the push stream is formed from the whiteboard interaction data and the audio-video data, the synchronization method further comprises: storing the push stream.
6. A system for synchronizing whiteboard interaction data and audio-video data, comprising:
a data processing unit configured to acquire whiteboard interaction data and audio-video data at a speaking end and to form a push stream from the whiteboard interaction data and the audio-video data;
and a push stream processing unit configured to push the push stream to a viewing end over a network according to a push stream data request from the viewing end, so that the viewing end splits the push stream to synchronously acquire the whiteboard interaction data and the audio-video data obtained by splitting the push stream.
7. The synchronization system according to claim 6, wherein the data processing unit encapsulates the whiteboard interaction data and merges the encapsulated whiteboard interaction data into the audio-video data;
and the encapsulated whiteboard interaction data comprises interaction data information.
8. The synchronization system according to claim 7, wherein the data processing unit updates the whiteboard interaction data and the corresponding interaction data information after a preset time;
and the data processing unit further encapsulates the updated whiteboard interaction data and merges the encapsulated updated whiteboard interaction data into the audio-video data.
9. The synchronization system of claim 8, wherein the push stream processing unit receives local interaction data information from the viewing end and judges whether the local interaction data information is consistent with the interaction data information;
when the local interaction data information is consistent with the interaction data information, the push stream processing unit displays the whiteboard interaction data at the viewing end;
and when the local interaction data information is inconsistent with the interaction data information, the push stream processing unit resets the whiteboard interaction data at the viewing end and displays the updated whiteboard interaction data corresponding to the interaction data information.
10. The synchronization system of claim 6, further comprising:
a storage unit configured to store the push stream.
11. A live broadcast system comprising a synchronization system as claimed in any one of claims 6 to 10.
12. A non-transitory computer-readable storage medium having stored thereon a computer program implementing the synchronization method of any one of claims 1 to 5.
13. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the synchronization method of any one of claims 1 to 5.
CN202310046315.9A 2023-01-31 2023-01-31 Method and system for synchronizing whiteboard interactive data and audio-video data, storage medium and electronic equipment Pending CN115883865A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310046315.9A CN115883865A (en) 2023-01-31 2023-01-31 Method and system for synchronizing whiteboard interactive data and audio-video data, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310046315.9A CN115883865A (en) 2023-01-31 2023-01-31 Method and system for synchronizing whiteboard interactive data and audio-video data, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN115883865A true CN115883865A (en) 2023-03-31

Family

ID=85758548

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310046315.9A Pending CN115883865A (en) 2023-01-31 2023-01-31 Method and system for synchronizing whiteboard interactive data and audio-video data, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN115883865A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040002048A1 (en) * 2002-07-01 2004-01-01 Matthew Thurmaier Method and system for providing a virtual computer classroom
CN109547831A (en) * 2018-11-19 2019-03-29 网宿科技股份有限公司 A kind of method, apparatus, calculating equipment and the storage medium of blank and audio video synchronization
CN111124333A (en) * 2019-12-05 2020-05-08 视联动力信息技术股份有限公司 Method, device, equipment and storage medium for synchronizing display contents of electronic whiteboard
CN111381918A (en) * 2018-12-29 2020-07-07 中兴通讯股份有限公司 Method and related equipment for realizing remote assistance
CN114205637A (en) * 2021-12-16 2022-03-18 杭州雅顾科技有限公司 Whiteboard audio and video synchronization method, device, equipment and storage medium

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication (application publication date: 2023-03-31)