CN114257844A - Multi-video synchronous playing method, device, equipment and readable storage medium - Google Patents


Info

Publication number
CN114257844A
CN114257844A (application CN202111575542.8A)
Authority
CN
China
Prior art keywords
video
video frame
videos
timestamps
frame set
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111575542.8A
Other languages
Chinese (zh)
Other versions
CN114257844B (en)
Inventor
张富全
甄海华
黄锡平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vtron Group Co Ltd
Original Assignee
Vtron Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vtron Group Co Ltd
Priority to CN202111575542.8A
Publication of CN114257844A
Application granted
Publication of CN114257844B
Legal status: Active
Anticipated expiration

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/23602: Multiplexing isochronously with the video sync, e.g. according to bit-parallel or bit-serial interface formats, as SDI
    • H04N 21/242: Synchronization processes, e.g. processing of PCR [Program Clock References]
    • H04N 21/4307: Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N 21/43072: Synchronising the rendering of multiple content streams on the same device
    • H04N 21/44016: Processing of video elementary streams involving splicing one content stream with another content stream, e.g. for substituting a video clip
    • H04N 21/8547: Content authoring involving timestamps for synchronizing content

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Security & Cryptography (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The application discloses a multi-video synchronous playing method, device, equipment and readable storage medium, wherein the method comprises the following steps: acquiring a plurality of videos; decoding each video separately to obtain the timestamps of each video and the video frame corresponding to each timestamp; for each timestamp, splicing the video frames of the respective videos that correspond to that timestamp without overlapping, to obtain a spliced video frame corresponding to the timestamp; and playing the spliced video frames corresponding to the timestamps in the same playing window in the time order of the timestamps. Because each video is decoded independently and the video frames corresponding to the same timestamp are spliced into one video frame, the spliced video frames can be played in the same playing window, so that a plurality of videos are played synchronously in one window and a user can observe the picture of every video at the same point in time.

Description

Multi-video synchronous playing method, device, equipment and readable storage medium
Technical Field
The present application relates to the field of video playing technologies, and in particular, to a method, an apparatus, a device, and a readable storage medium for synchronously playing multiple videos.
Background
In large-screen display applications such as monitoring rooms or conference rooms, multiple video signals sometimes need to be recorded synchronously at the same points in time, and the multiple videos also need to be played back synchronously, so that the interaction among them at any given moment can be clearly understood. In the prior art, one playing window of a player can only play one video file and cannot play several video files synchronously; to observe the interaction among the videos, several playing windows have to be opened to play the different videos, but because the videos are opened at slightly different times, the playback cannot be kept synchronized.
Therefore, how to play multiple videos synchronously in a single playing window is a problem worth studying.
Disclosure of Invention
In view of this, the present application provides a method, an apparatus, a device and a readable storage medium for playing multiple videos synchronously in the same playing window.
In order to achieve the above object, the following solutions are proposed:
a multi-video synchronous playing method comprises the following steps:
acquiring a plurality of videos;
decoding each video separately to obtain each timestamp of each video and a video frame corresponding to each timestamp;
for each timestamp, splicing the video frames corresponding to the timestamps of the videos in a non-overlapping manner to obtain spliced video frames corresponding to the timestamps;
and playing the spliced video frames corresponding to the timestamps in the same playing window according to the time sequence of the timestamps.
Preferably, said decoding each said video separately comprises:
establishing a decoding thread corresponding to each video;
and decoding the corresponding video by utilizing the decoding thread.
Preferably, before the splicing the video frames corresponding to the timestamps of the respective videos without overlapping, the method further includes:
taking each video frame with the same timestamp as a video frame set, and caching the video frames in a cache queue according to the time sequence of the timestamps;
the splicing the video frames corresponding to the timestamps of the videos without overlapping includes:
and splicing the video frames of the video frame set corresponding to the timestamps acquired from the cache queue in a non-overlapping manner.
Preferably, the buffering, in the buffer queue according to the time sequence of the timestamps, each video frame with the same timestamp as a video frame set includes:
judging whether a video frame set corresponding to the video frame to be cached exists in the cache queue or not aiming at each video frame to be cached;
if the video frame to be cached exists in the cache queue, caching the video frame to be cached in a corresponding video frame set in the cache queue;
if the video frame set does not exist, establishing a video frame set corresponding to the video frame to be cached in the cache queue according to the time stamp sequence;
and buffering the video frame to be buffered in a newly established video frame set in the buffer queue.
Preferably, the splicing, without overlapping, video frames of the video frame set corresponding to each timestamp acquired from the buffer queue includes:
acquiring a video frame set corresponding to each timestamp from the cache queue according to the time sequence of the timestamps;
for each video frame set, judging whether the number of video frames of the video frame set is equal to the number of the videos;
if so, splicing all the video frames of the video frame set in a non-overlapping manner;
if not, searching a first video frame set which meets the condition that the number of the video frames is equal to the number of the videos according to the time sequence of the time stamps from the video frame set in the cache queue;
if the video frames are found, performing a step of splicing the video frames of the video frame set without overlapping on the found video frame set;
and if the video is not found, waiting for the decoding thread to finish decoding the corresponding video.
Preferably, the method further comprises the following steps:
acquiring an audio corresponding to the plurality of videos;
and synchronously playing the audio when playing each spliced video frame in the same playing window according to the time sequence of the time stamps.
A multi-video synchronized playback device, comprising:
a video acquisition unit for acquiring a plurality of videos;
the video decoding unit is used for decoding each video independently to obtain each timestamp of each video and a video frame corresponding to each timestamp;
the video frame splicing unit is used for splicing the video frames corresponding to the time stamps of the videos in a non-overlapping manner aiming at each time stamp to obtain spliced video frames corresponding to the time stamps;
and the video playing unit is used for playing the spliced video frames corresponding to the timestamps in the same playing window according to the time sequence of the timestamps.
Preferably, the video decoding unit includes:
a decoding thread establishing unit for establishing a decoding thread corresponding to each video;
and the decoding thread decoding video unit is used for decoding the corresponding video by utilizing the decoding thread.
A multi-video synchronous playing device comprises a memory and a processor;
the memory is used for storing programs;
the processor is used for executing the program and realizing the steps of the multi-video synchronous playing method.
A readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the above-mentioned multi-video synchronized playback method.
According to the above scheme, the multi-video synchronous playing method provided by the application comprises the following steps: acquiring a plurality of videos; decoding each video separately to obtain the timestamps of each video and the video frame corresponding to each timestamp; for each timestamp, splicing the video frames of the respective videos that correspond to that timestamp without overlapping, to obtain a spliced video frame corresponding to the timestamp; and playing the spliced video frames corresponding to the timestamps in the same playing window in the time order of the timestamps. Because each video is decoded independently and the video frames corresponding to the same timestamp are spliced into one video frame, the spliced video frames can be played in the same playing window, so that a plurality of videos are played synchronously in one window and a user can observe the picture of every video at the same point in time.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, it is obvious that the drawings in the following description are only embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the provided drawings without creative efforts.
Fig. 1 is a schematic flowchart of a multi-video synchronous playing method according to an embodiment of the present disclosure;
fig. 2 is a schematic view of a scene in which multiple video frames are spliced and played according to an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of a multi-video synchronous playing device according to an embodiment of the present application;
fig. 4 is a block diagram of a hardware structure of a multi-video synchronous playback device disclosed in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Referring to fig. 1, fig. 1 is a schematic flow chart of a multi-video synchronous playing method provided in an embodiment of the present application, where the method includes:
step S100: a plurality of videos is acquired.
Specifically, multiple videos of different scenes may be acquired; the acquired videos may be in different formats or in the same format. For example, multiple videos each in H.264 format may be acquired.
Scenarios in which the plurality of videos are acquired may include the following: multiple cameras monitor different areas, so that multiple video channels are recorded over the same time period, yielding multiple surveillance videos of the same period; several people hold a video conference in which each person corresponds to one video, so each person's conference video can be recorded separately, yielding multiple conference videos of the same period; multiple unrelated videos may also be obtained from different video sources.
Step S110: and decoding each video separately to obtain each timestamp of each video and a video frame corresponding to each timestamp.
Specifically, for each video, one timestamp may correspond to one video frame, and after each video is decoded separately, each timestamp of each video and a video frame corresponding to each timestamp may be obtained. The video frame obtained after decoding may be a video frame in YUV format.
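As an illustration of this decoding step, the sketch below assumes the PyAV library is used for decoding (an assumption for illustration; the patent does not prescribe a decoder). For one video it yields each presentation timestamp together with the decoded YUV frame.

```python
# A minimal sketch of step S110, assuming PyAV (an assumption not named in the
# patent) is used to decode one video into (timestamp, YUV frame) pairs.
import av

def decode_frames(path):
    """Yield (timestamp, yuv_frame) pairs for one video file."""
    container = av.open(path)
    for frame in container.decode(video=0):
        # frame.pts is the presentation timestamp of the decoded frame;
        # to_ndarray() returns the picture as a YUV420p numpy array.
        yield frame.pts, frame.to_ndarray(format="yuv420p")
    container.close()
```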
Step S120: and splicing the video frames corresponding to the time stamps of the videos without overlapping aiming at each time stamp to obtain the spliced video frames corresponding to the time stamps.
Specifically, the different video frames of the plurality of videos that share the same timestamp can be spliced without overlapping to obtain a spliced video frame, and the spliced frame can completely display the content of all of the videos.
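As a concrete illustration of the non-overlapping splice, the sketch below tiles the same-timestamp frames onto a grid. It assumes the frames have already been converted to equally sized RGB numpy arrays; the patent splices YUV frames, so the pixel format and the grid layout here are only illustrative.

```python
# A sketch of step S120: splice the frames sharing one timestamp into a single
# frame without overlap, by tiling them on a near-square grid.
import math
import numpy as np

def splice_frames(frames):
    """Tile same-timestamp frames into one non-overlapping grid frame."""
    n = len(frames)
    cols = math.ceil(math.sqrt(n))
    rows = math.ceil(n / cols)
    h, w, c = frames[0].shape
    canvas = np.zeros((rows * h, cols * w, c), dtype=frames[0].dtype)
    for i, frame in enumerate(frames):
        r, col = divmod(i, cols)
        canvas[r * h:(r + 1) * h, col * w:(col + 1) * w] = frame
    return canvas
```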
Step S130: and playing the spliced video frames corresponding to the timestamps in the same playing window according to the time sequence of the timestamps.
Specifically, each spliced video frame is played in the same playing window according to the time sequence, and then the effect of synchronously playing a plurality of videos can be achieved.
In an optional implementation of the process of playing the spliced video frames, the spliced video frames may be rendered to the same playing window for playing by using SDL (Simple DirectMedia Layer, an open-source cross-platform multimedia development library).
According to this scheme, each video is decoded independently and the video frames corresponding to the same timestamp are spliced into one video frame; the spliced video frames can then be played in the same playing window, so that the videos are played synchronously in one window and a user can observe the picture of every video at the same point in time.
In some embodiments of the present application, the above-mentioned step S110 is introduced, and the process of decoding each of the videos separately is further described below.
Specifically, the process may include the steps of:
and S1, establishing a decoding thread corresponding to each video.
Specifically, one decoding thread may be established for each video.
And S2, decoding the corresponding video by utilizing the decoding thread.
Specifically, each video may be decoded by using a decoding thread corresponding to each video, and each timestamp of each video and a video frame corresponding to each timestamp may be obtained after decoding.
According to the scheme, the time of the decoding process can be saved and the decoding efficiency can be improved by establishing one decoding thread for each video.
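A sketch of the one-thread-per-video decoding is given below. It reuses the hypothetical decode_frames() generator from the earlier sketch and tags each decoded frame with the index of its source video before pushing it onto a shared queue; the grouping into per-timestamp frame sets happens downstream.

```python
# A sketch of "one decoding thread per video".  Each worker decodes its own
# video with the decode_frames() generator sketched earlier and enqueues
# (timestamp, video_index, frame); a None timestamp marks end of that video.
import threading
import queue

def start_decode_threads(paths, out_queue):
    def worker(idx, path):
        for pts, frame in decode_frames(path):
            out_queue.put((pts, idx, frame))
        out_queue.put((None, idx, None))      # sentinel: this video is finished

    threads = [threading.Thread(target=worker, args=(i, p), daemon=True)
               for i, p in enumerate(paths)]
    for t in threads:
        t.start()
    return threads

# Usage sketch (hypothetical file names):
# frames_q = queue.Queue()
# start_decode_threads(["cam1.mp4", "cam2.mp4"], frames_q)
```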
In view of that it takes a certain time to decode all videos, in step S120, before splicing the video frames corresponding to the timestamps of the respective videos without overlapping, the embodiment of the present application may further add a process of buffering the video frames.
Specifically, the process of buffering the video frame may include the following steps:
and taking each video frame with the same timestamp as a video frame set, and caching the video frames in a cache queue according to the time sequence of the timestamp.
Specifically, the video frames that share one timestamp are grouped into a video frame set, and the video frame sets are buffered in the buffer queue in the time order of their timestamps.
Next, the process of buffering the video frames will be further described.
Specifically, the process of buffering the video frame may include the following steps:
s1, judging whether a video frame set corresponding to the video frame to be cached exists in the cache queue or not aiming at each video frame to be cached.
Specifically, when each video frame is buffered in the buffer queue, it may be determined whether a video frame set identified by a timestamp corresponding to the video frame exists in the buffer queue.
And S2, if the video frame to be buffered exists, buffering the video frame to be buffered in the corresponding video frame set in the buffer queue.
Specifically, if other video frames with the same timestamp as the video frame to be cached have been cached in the cache queue, a video frame set identified by the timestamp corresponding to the video frame to be cached exists in the cache queue, and then the video frame to be cached can be cached in the video frame set identified by the timestamp in the cache queue.
And S3, if the video frame does not exist, establishing a video frame set corresponding to the video frame to be cached in the cache queue according to the time stamp sequence.
Specifically, if no other video frame with the same timestamp as the video frame to be cached is cached in the cache queue, a video frame set identified by the timestamp corresponding to the video frame to be cached does not exist in the cache queue, and then a video frame set is newly established in the cache queue by using the timestamp of the video frame to be cached as the identifier.
And S4, buffering the video frame to be buffered in the newly established video frame set in the buffer queue.
Specifically, the video frame to be cached may be cached in the newly established video frame set identified by the timestamp.
According to this scheme, the video frames sharing a timestamp are treated as one video frame set and the sets are cached in the buffer queue in the time order of the timestamps, so the sorted, cached video frames can be conveniently retrieved from the buffer queue later, which improves the efficiency of the subsequent process.
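The cache queue described above can be sketched as a mapping from timestamp to frame set, kept in timestamp order; the source-video index is recorded with each frame so the later splice can lay the frames out in a stable order. This is an illustrative structure, not the patent's exact implementation.

```python
# A sketch of the buffer queue: one frame set per timestamp, kept in
# timestamp order; each set maps the source-video index to its frame.
from collections import OrderedDict

class FrameSetCache:
    def __init__(self):
        self.sets = OrderedDict()              # timestamp -> {video_index: frame}

    def add(self, pts, video_index, frame):
        if pts not in self.sets:
            # No frame set exists for this timestamp yet: create one and
            # re-sort so the sets stay in timestamp order.
            self.sets[pts] = {}
            self.sets = OrderedDict(sorted(self.sets.items()))
        self.sets[pts][video_index] = frame
```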
On the basis of buffering the video frames in the buffer queue, in the above-described step S120 of the present application, for each timestamp, the process of splicing the video frames corresponding to the timestamps of the respective videos without overlapping may include the following steps:
and splicing the video frames of the video frame set corresponding to the timestamps acquired from the cache queue in a non-overlapping manner.
Specifically, the video frame sets may be obtained from the buffer queue in time stamp sequence, and then the video frames of each video frame set may be spliced without overlapping.
Next, the process of obtaining video frames from the buffer queue for splicing will be further described.
Specifically, the process may include the steps of:
and S1, acquiring a video frame set corresponding to each timestamp from the buffer queue according to the time sequence of the timestamps.
Specifically, the video frame sets in the buffer queue are stored according to the time sequence of the time stamps, and then each video frame set can be obtained according to the time sequence of the time stamps when the video frame sets are obtained from the buffer queue.
S2, for each video frame set, determining whether the number of video frames in the video frame set is equal to the number of the plurality of videos. If yes, go to step S3; if not, step S4 is executed.
Specifically, it may be determined whether the number of video frames in each acquired video frame set is equal to the number of the acquired videos.
And S3, splicing the video frames of the video frame set without overlapping.
Specifically, the spliced video frames are not overlapped, and the content of each video frame can be completely displayed.
And S4, searching the first video frame set which satisfies that the number of the video frames is equal to the number of the videos according to the time sequence of the time stamps from the video frame set in the buffer queue. If the search result is found, executing the step S5; if not, go to step S6.
Specifically, starting from a video frame set whose number of video frames is not equal to the number of the acquired videos, the first video frame set whose number of video frames equals the number of the acquired videos may be searched for in the time order of the timestamps.
S5, executing the step S3 on the searched video frame set.
Specifically, the videos of the searched video frame set can be spliced without overlapping.
And S6, waiting for the decoding thread to finish decoding the corresponding video.
Specifically, if the video frame set satisfying the above condition is not found, the decoding thread may wait for the video to be decoded, so as to obtain a video frame set whose number of video frames is equal to the number of the obtained videos.
According to this scheme, if a video frame set does not yet contain a frame from every acquired video, it can be skipped in order to keep the displayed pictures synchronized, and splicing starts from the first video frame set in which frames from all of the videos have been collected; the spliced video frames are then played, achieving synchronous playback of the plurality of videos.
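The search for a complete frame set can be sketched as follows: walk the cached sets in timestamp order, return the first one that already holds a frame from every video, and drop any older incomplete sets so the pictures stay synchronized; returning None means the caller should keep waiting for the decode threads. This reuses the FrameSetCache sketched above and is only an illustration of steps S1 to S6.

```python
# A sketch of the completeness check: find the first frame set whose frame
# count equals the number of videos; skip (drop) older incomplete sets; if
# nothing is complete yet, return None and let the caller wait for decoding.
def pop_complete_set(cache, num_videos):
    for pts, frame_set in cache.sets.items():
        if len(frame_set) == num_videos:
            # Drop every older (necessarily incomplete) set, then remove and
            # return the complete one, with frames ordered by video index.
            for old in [p for p in cache.sets if p < pts]:
                cache.sets.pop(old)
            cache.sets.pop(pts)
            return pts, [frame_set[i] for i in sorted(frame_set)]
    return None
```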
In consideration of the fact that videos and audios are stored separately in some cases, the embodiment of the present application may further add a process of playing audio synchronously on the basis of playing multiple videos synchronously.
Specifically, the process may include the following processes:
and S1, acquiring an audio corresponding to the plurality of videos.
Specifically, the acquired audio may correspond to the pictures and sound of each video. For example, several people hold a video conference over a network and the conference video of each person is recorded separately, yielding a plurality of videos; the speech exchanged among the participants during the conference can be recorded as one audio track, and this audio corresponds to the plurality of separately recorded videos.
And S2, synchronously playing the audio when playing each spliced video frame in the same playing window according to the time sequence of the time stamps.
Specifically, the spliced video frame and the acquired audio can be played synchronously, so that the effect of synchronously playing a plurality of videos and one audio is realized.
Taking the video conference introduced above as an example, the video frames of the videos in the same timestamp can be obtained by decoding the recorded videos respectively, the video frames are spliced, and then the audio obtained in the above steps can be played synchronously, so that the effect of playing back the video conference can be realized.
According to this scheme, one audio track and the plurality of videos are played synchronously, so that a user can clearly follow both the visual interaction and the audio interaction of the videos and thereby grasp the interaction among them.
In order to more clearly describe the implementation process of the present application, the process of splicing and playing video frames of the present application will be shown in detail with reference to fig. 2.
Specifically, the number of the plurality of videos played synchronously may be determined according to actual situations, and four videos played synchronously are taken as an example here. First, four different videos can be acquired, and then a decoding thread can be created for each video, each decoding thread can parse and decode the respective video, and video frames with different timestamps can be obtained, and the video frames can be video frames in YUV format. Then, the four video frames in the YUV format can be spliced without overlapping to obtain a spliced YUV video frame, the spliced YUV video frame can completely display the content of each video frame, the spliced YUV video frame can be subjected to SDL rendering, and a plurality of rendered spliced YUV video frames can be played in the same playing window, so that the effect of synchronously playing the four videos is realized.
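Purely as an illustration of how the sketches above fit together, the code below drives the whole pipeline for the four-video example: the decode threads feed a queue, frames are grouped into per-timestamp sets, complete sets are spliced, and each spliced frame is handed to a caller-supplied render callback. The SDL rendering and the playback window are abstracted behind that callback, since the bindings are not specified by the patent; note also that splice_frames as sketched tiles equally sized RGB arrays, so for an end-to-end run decode_frames would need to return RGB frames (e.g. format="rgb24").

```python
# A sketch tying the earlier pieces together (start_decode_threads,
# FrameSetCache, pop_complete_set, splice_frames).  The render callback stands
# in for SDL rendering / pacing of the single playing window.
import queue

def play_synchronized(paths, render):
    frames_q = queue.Queue()
    start_decode_threads(paths, frames_q)
    cache = FrameSetCache()
    finished = 0
    while finished < len(paths):
        pts, idx, frame = frames_q.get()
        if pts is None:                        # sentinel: one decoder finished
            finished += 1
            continue
        cache.add(pts, idx, frame)
        result = pop_complete_set(cache, len(paths))
        if result is not None:
            ts, same_ts_frames = result
            render(ts, splice_frames(same_ts_frames))   # one spliced frame per timestamp
    # Drain any complete frame sets left after all decoders have finished.
    while (result := pop_complete_set(cache, len(paths))) is not None:
        ts, same_ts_frames = result
        render(ts, splice_frames(same_ts_frames))

# Usage sketch (hypothetical file names):
# play_synchronized(["cam1.mp4", "cam2.mp4", "cam3.mp4", "cam4.mp4"],
#                   lambda ts, img: print(ts, img.shape))
```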
The following describes the multi-video synchronous playing device provided in the embodiment of the present application, and the multi-video synchronous playing device described below and the multi-video synchronous playing method described above may be referred to correspondingly.
First, referring to fig. 3, the multi-video synchronous playing device will be described, as shown in fig. 3, the multi-video synchronous playing device may include:
a video acquisition unit 100 for acquiring a plurality of videos;
a video decoding unit 110, configured to decode each video separately to obtain each timestamp of each video and a video frame corresponding to each timestamp;
a video frame splicing unit 120, configured to splice, for each timestamp, video frames corresponding to the timestamps of the videos without overlapping, so as to obtain a spliced video frame corresponding to the timestamp;
and the video playing unit 130 is configured to play the spliced video frames corresponding to the timestamps in the same playing window according to the time sequence of the timestamps.
Optionally, the video decoding unit 110 may include:
a decoding thread establishing unit for establishing a decoding thread corresponding to each video;
and the decoding thread decoding video unit is used for decoding the corresponding video by utilizing the decoding thread.
Optionally, the multi-video synchronous playing apparatus may further include:
a video frame set cache unit, configured to use each video frame with the same timestamp as a video frame set before splicing the video frames corresponding to the timestamps of the videos without overlapping, and cache the video frames in a cache queue according to a time sequence of the timestamps;
the video frame splicing unit 120 may include:
and the cache video frame splicing unit is used for splicing the video frames of the video frame set corresponding to the timestamps acquired from the cache queue in a non-overlapping manner.
Optionally, the video frame set buffering unit may include:
a video frame set judgment unit, configured to judge, for each video frame to be cached, whether a video frame set corresponding to the video frame to be cached exists in the cache queue;
the first video frame caching unit is used for caching the video frames to be cached in the corresponding video frame set in the cache queue if the video frame set corresponding to the video frames to be cached exists in the cache queue;
a video frame set establishing unit, configured to establish, in the cache queue according to a time stamp sequence, a video frame set corresponding to the video frame to be cached, if the video frame set corresponding to the video frame to be cached does not exist in the cache queue;
and the second video frame buffer unit is used for buffering the video frames to be buffered in the video frame set newly established in the buffer queue.
Optionally, the cache video frame splicing unit may include:
a video frame set acquiring unit, configured to acquire, from the cache queue, a video frame set corresponding to each timestamp according to the time sequence of the timestamps;
a video frame number judgment unit configured to judge, for each video frame set, whether the number of video frames of the video frame set is equal to the number of the plurality of videos;
a cache video frame splicing subunit, configured to splice, if the number of video frames in the video frame set is equal to the number of the plurality of videos, the video frames in the video frame set without overlapping;
a video frame set searching unit, configured to search, in the cache queue, a first video frame set that satisfies that the number of video frames is equal to the number of the plurality of videos, according to a time sequence of timestamps, starting from the video frame set if the number of video frames of the video frame set is not equal to the number of the plurality of videos;
if the video frame is found, executing the cache video frame splicing subunit on the found video frame set;
and if the video is not found, waiting for the decoding thread to finish decoding the corresponding video.
Optionally, the multi-video synchronous playing apparatus may further include:
an audio acquisition unit, configured to acquire an audio corresponding to the plurality of videos;
and the audio and video synchronous playing unit is used for synchronously playing the audio when playing each spliced video frame in the same playing window according to the time sequence of the time stamps.
The multi-video synchronous playing device provided by the embodiment of the application can be applied to multi-video synchronous playing equipment. Fig. 4 is a block diagram illustrating a hardware structure of a multi-video synchronous playback device, and referring to fig. 4, the hardware structure of the multi-video synchronous playback device may include: at least one processor 1, at least one communication interface 2, at least one memory 3 and at least one communication bus 4;
in the embodiment of the application, the number of the processor 1, the communication interface 2, the memory 3 and the communication bus 4 is at least one, and the processor 1, the communication interface 2 and the memory 3 complete mutual communication through the communication bus 4;
the processor 1 may be a central processing unit CPU, or an application Specific Integrated circuit asic, or one or more Integrated circuits configured to implement embodiments of the present invention, etc.;
the memory 3 may include a high-speed RAM memory, and may further include a non-volatile memory (non-volatile memory) or the like, such as at least one disk memory;
wherein the memory stores a program and the processor can call the program stored in the memory, the program for:
acquiring a plurality of videos;
decoding each video separately to obtain each timestamp of each video and a video frame corresponding to each timestamp;
for each timestamp, splicing the video frames corresponding to the timestamps of the videos in a non-overlapping manner to obtain spliced video frames corresponding to the timestamps;
and playing the spliced video frames corresponding to the timestamps in the same playing window according to the time sequence of the timestamps.
Alternatively, the detailed function and the extended function of the program may be as described above.
Embodiments of the present application further provide a storage medium, where a program suitable for execution by a processor may be stored, where the program is configured to:
acquiring a plurality of videos;
decoding each video separately to obtain each timestamp of each video and a video frame corresponding to each timestamp;
for each timestamp, splicing the video frames corresponding to the timestamps of the videos in a non-overlapping manner to obtain spliced video frames corresponding to the timestamps;
and playing the spliced video frames corresponding to the timestamps in the same playing window according to the time sequence of the timestamps.
Alternatively, the detailed function and the extended function of the program may be as described above.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A multi-video synchronous playing method is characterized by comprising the following steps:
acquiring a plurality of videos;
decoding each video separately to obtain each timestamp of each video and a video frame corresponding to each timestamp;
for each timestamp, splicing the video frames corresponding to the timestamps of the videos in a non-overlapping manner to obtain spliced video frames corresponding to the timestamps;
and playing the spliced video frames corresponding to the timestamps in the same playing window according to the time sequence of the timestamps.
2. The method of claim 1, wherein said decoding each of said videos separately comprises:
establishing a decoding thread corresponding to each video;
and decoding the corresponding video by utilizing the decoding thread.
3. The method according to claim 1, further comprising, before said splicing the video frames corresponding to the timestamps of the respective videos without overlapping:
taking each video frame with the same timestamp as a video frame set, and caching the video frames in a cache queue according to the time sequence of the timestamps;
the splicing the video frames corresponding to the timestamps of the videos without overlapping includes:
and splicing the video frames of the video frame set corresponding to the timestamps acquired from the cache queue in a non-overlapping manner.
4. The method of claim 3, wherein buffering video frames with the same timestamp in the buffering queue in time order of the timestamp as a video frame set comprises:
judging whether a video frame set corresponding to the video frame to be cached exists in the cache queue or not aiming at each video frame to be cached;
if the video frame to be cached exists in the cache queue, caching the video frame to be cached in a corresponding video frame set in the cache queue;
if the video frame set does not exist, establishing a video frame set corresponding to the video frame to be cached in the cache queue according to the time stamp sequence;
and buffering the video frame to be buffered in a newly established video frame set in the buffer queue.
5. The method of claim 3, wherein the splicing the video frames of the set of video frames corresponding to the timestamps obtained from the buffer queue without overlapping comprises:
acquiring a video frame set corresponding to each timestamp from the cache queue according to the time sequence of the timestamps;
for each video frame set, judging whether the number of video frames of the video frame set is equal to the number of the videos;
if so, splicing all the video frames of the video frame set in a non-overlapping manner;
if not, searching a first video frame set which meets the condition that the number of the video frames is equal to the number of the videos according to the time sequence of the time stamps from the video frame set in the cache queue;
if the video frames are found, performing a step of splicing the video frames of the video frame set without overlapping on the found video frame set;
and if the video is not found, waiting for the decoding thread to finish decoding the corresponding video.
6. The method of any one of claims 1-5, further comprising:
acquiring an audio corresponding to the plurality of videos;
and synchronously playing the audio when playing each spliced video frame in the same playing window according to the time sequence of the time stamps.
7. A multi-video synchronized playback device, comprising:
a video acquisition unit for acquiring a plurality of videos;
the video decoding unit is used for decoding each video independently to obtain each timestamp of each video and a video frame corresponding to each timestamp;
the video frame splicing unit is used for splicing the video frames corresponding to the time stamps of the videos in a non-overlapping manner aiming at each time stamp to obtain spliced video frames corresponding to the time stamps;
and the video playing unit is used for playing the spliced video frames corresponding to the timestamps in the same playing window according to the time sequence of the timestamps.
8. The apparatus of claim 7, wherein the video decoding unit comprises:
a decoding thread establishing unit for establishing a decoding thread corresponding to each video;
and the decoding thread decoding video unit is used for decoding the corresponding video by utilizing the decoding thread.
9. A multi-video synchronous playing device is characterized by comprising a memory and a processor;
the memory is used for storing programs;
the processor, configured to execute the program, and implement the steps of the multi-video synchronous playing method according to any one of claims 1 to 6.
10. A readable storage medium having stored thereon a computer program, characterized in that the computer program, when being executed by a processor, carries out the steps of the multi-video synchronized playback method according to any one of claims 1-6.
CN202111575542.8A 2021-12-21 2021-12-21 Multi-video synchronous playing method, device, equipment and readable storage medium Active CN114257844B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111575542.8A CN114257844B (en) 2021-12-21 2021-12-21 Multi-video synchronous playing method, device, equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111575542.8A CN114257844B (en) 2021-12-21 2021-12-21 Multi-video synchronous playing method, device, equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN114257844A (en) 2022-03-29
CN114257844B (en) 2023-01-06

Family

ID=80796571

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111575542.8A Active CN114257844B (en) 2021-12-21 2021-12-21 Multi-video synchronous playing method, device, equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN114257844B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120069137A1 (en) * 2007-09-30 2012-03-22 Optical Fusion Inc. Synchronization and Mixing of Audio and Video Streams in Network-Based Video Conferencing Call Systems
CN104918137A (en) * 2015-06-03 2015-09-16 宁波Gqy视讯股份有限公司 Method enabling spliced screen system to play videos
CN106658030A (en) * 2016-12-30 2017-05-10 上海寰视网络科技有限公司 Method and device for playing composite video comprising single-path audio and multipath videos
CN108881927A (en) * 2017-11-30 2018-11-23 北京视联动力国际信息技术有限公司 A kind of video data synthetic method and device
CN111182235A (en) * 2019-12-05 2020-05-19 浙江大华技术股份有限公司 Method, device, computer device and storage medium for recording spliced screen pictures
CN111741247A (en) * 2020-06-23 2020-10-02 浙江大华技术股份有限公司 Video playback method and device and computer equipment

Also Published As

Publication number Publication date
CN114257844B (en) 2023-01-06


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant