CN114143486A - Video stream synchronization method and device, computer equipment and storage medium

Video stream synchronization method and device, computer equipment and storage medium

Info

Publication number
CN114143486A
CN114143486A
Authority
CN
China
Prior art keywords
processed
time
video frame
video
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111084529.2A
Other languages
Chinese (zh)
Inventor
朱逸民
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd filed Critical Zhejiang Dahua Technology Co Ltd
Priority to CN202111084529.2A
Publication of CN114143486A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302 Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4305 Synchronising client clock from received content stream, e.g. locking decoder clock with encoder clock, extraction of the PCR packets
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85 Assembly of content; Generation of multimedia applications
    • H04N21/854 Content authoring
    • H04N21/8547 Content authoring involving timestamps for synchronizing content
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/91 Television signal processing therefor

Abstract

The invention relates to a video stream synchronization method and apparatus, a computer device and a storage medium, wherein the method comprises the following steps: acquiring time information contained in a video frame to be processed based on the video frame to be processed contained in the video stream to be processed; determining time calibration information based on the time information contained in the video frame to be processed and the acquired reference time, and writing the time calibration information into the corresponding video frame to be processed; and performing time synchronization on the video frame to be processed and a reference video frame contained in the video stream to be processed based on the time calibration information in the video frame to be processed. The method and the device achieve time synchronization of the video frame to be processed and the reference video frame from the time calibration information carried in the video frame to be processed, which improves time synchronization efficiency compared with a synchronization approach based on time calibration information for every video frame in the video stream to be processed.

Description

Video stream synchronization method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of video processing, and in particular, to a video stream synchronization method, apparatus, computer device, and storage medium.
Background
When a video recording device receives a video stream from a front-end device, it records the time in the frame information field of the video stream and then stores the stream on its hard disk in the form of a video recording file. When a platform connects to the video recording device for video playback, the frame information read from the video file is sent out, and the platform determines the playback time of the video stream according to the time field of the frame information. If the system time of the shooting device has been recalibrated by another device or platform, the time of the video file becomes inconsistent with the system time of the video recording device, which causes abnormal video playback.
Therefore, how to synchronize the time of the video file with the system time of the video recording device is a technical problem that urgently needs to be solved.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a video stream synchronization method, apparatus, computer device and storage medium.
In a first aspect, an embodiment of the present invention provides a video stream synchronization method, where the method includes:
acquiring time information contained in a video frame to be processed based on the video frame to be processed contained in the video stream to be processed;
determining time calibration information based on the time information contained in the video frame to be processed and the acquired reference time, and writing the time calibration information into the corresponding video frame to be processed;
and performing time synchronization on the video frame to be processed and a reference video frame contained in the video stream to be processed based on the time calibration information in the video frame to be processed.
In an embodiment, determining the time calibration information based on the time information included in the video frame to be processed and the obtained reference time includes:
and taking the difference value between the time stamp in the time information contained in the video frame to be processed and the acquired reference time as time calibration information.
In an embodiment, writing the time calibration information into the corresponding video frame to be processed includes:
determining that the difference value between the timestamp in the time information contained in the video frame to be processed and the obtained reference time is a positive value or a negative value, and writing a determination result into a first preset field in the video frame to be processed;
and coding the absolute value of the difference value between the timestamp in the time information contained in the video frame to be processed and the acquired reference time, and writing the absolute value into a second preset field in the video frame to be processed.
In an embodiment, performing time synchronization on the video frame to be processed and the reference video frame included in the video stream to be processed based on the time calibration information in the video frame to be processed includes:
performing time synchronization on the corresponding video frame to be processed based on the time calibration information in the video frame to be processed;
and performing time synchronization on the reference video frame adjacent to the corresponding video frame to be processed based on the time calibration information in the video frame to be processed.
In a second aspect, an embodiment of the present invention provides a video stream synchronization apparatus, where the apparatus includes:
the device comprises an acquisition module, a processing module and a processing module, wherein the acquisition module is used for acquiring time information contained in a video frame to be processed based on the video frame to be processed contained in the video stream to be processed;
the determining module is used for determining time calibration information based on the time information contained in the video frame to be processed and the acquired reference time, and writing the time calibration information into the corresponding video frame to be processed;
and the synchronization module is used for performing time synchronization on the video frame to be processed and the reference video frame contained in the video stream to be processed based on the time calibration information in the video frame to be processed.
In an embodiment, a difference value between a timestamp in the time information included in the video frame to be processed and the acquired reference time is used as the time calibration information.
In one embodiment, the determining module comprises:
the first writing module is used for determining that the difference value between the timestamp in the time information contained in the video frame to be processed and the acquired reference time is a positive value or a negative value, and writing the determination result into a first preset field in the video frame to be processed;
and the second writing module is used for coding the absolute value of the difference value between the time stamp in the time information contained in the video frame to be processed and the acquired reference time and writing the absolute value into a second preset field in the video frame to be processed.
In one embodiment, the synchronization module comprises:
the first synchronization module is used for carrying out time synchronization on the corresponding video frame to be processed based on the time calibration information in the video frame to be processed;
and the second synchronization module is used for performing time synchronization on the reference video frame adjacent to the corresponding video frame to be processed based on the time calibration information in the video frame to be processed.
In a third aspect, an embodiment of the present invention provides a computer device, including a memory and a processor, where the memory stores a computer program, and the processor implements the following steps when executing the computer program:
acquiring time information contained in a video frame to be processed based on the video frame to be processed contained in the video stream to be processed;
determining time calibration information based on the time information contained in the video frame to be processed and the acquired reference time, and writing the time calibration information into the corresponding video frame to be processed;
and performing time synchronization on the video frame to be processed and a reference video frame contained in the video stream to be processed based on the time calibration information in the video frame to be processed.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the processor implements the following steps:
acquiring time information contained in a video frame to be processed based on the video frame to be processed contained in the video stream to be processed;
determining time calibration information based on the time information contained in the video frame to be processed and the acquired reference time, and writing the time calibration information into the corresponding video frame to be processed;
and performing time synchronization on the video frame to be processed and a reference video frame contained in the video stream to be processed based on the time calibration information in the video frame to be processed.
According to the above video stream synchronization method and apparatus, computer device and storage medium, time information contained in a video frame to be processed is acquired based on the video frame to be processed contained in the video stream to be processed; time calibration information is determined based on the time information contained in the video frame to be processed and the acquired reference time, and is written into the corresponding video frame to be processed; and time synchronization is performed on the video frame to be processed and a reference video frame contained in the video stream to be processed based on the time calibration information in the video frame to be processed. Time synchronization of the video frame to be processed and the reference video frame is thus achieved from the time calibration information in the video frame to be processed, which improves time synchronization efficiency compared with a synchronization approach based on time calibration information for every video frame in the video stream to be processed.
Drawings
FIG. 1 is a diagram of an application environment of a video stream synchronization method in one embodiment;
FIG. 2 is a flow diagram illustrating a method for video stream synchronization in one embodiment;
FIG. 3 is a flow chart illustrating the writing of time calibration information in one embodiment;
FIG. 4 is a diagram illustrating a structure of a video frame to be processed in one embodiment;
FIG. 5 is a schematic diagram illustrating the overall process flow of writing time calibration information in one embodiment;
FIG. 6 is a flow diagram illustrating a method for time synchronization in one embodiment;
FIG. 7 is a flow diagram illustrating video playback in one embodiment;
FIG. 8 is a diagram illustrating a video stream synchronization apparatus in one embodiment;
FIG. 9 is a diagram illustrating the internal structure of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The video stream synchronization method provided by the present application can be applied to the application environment shown in fig. 1, in which the terminal 102 communicates with the server 104 via a network. The terminal 102 first acquires time information contained in a video frame to be processed based on the video frame to be processed contained in the video stream to be processed; determines time calibration information based on the time information contained in the video frame to be processed and the acquired reference time, and writes the time calibration information into the corresponding video frame to be processed; and performs time synchronization on the video frame to be processed and a reference video frame included in the video stream to be processed based on the time calibration information in the video frame to be processed. The terminal 102 then sends the synchronized video stream to the server 104. The terminal 102 may be, but is not limited to, a personal computer, a notebook computer, a smart phone, a tablet computer or a portable wearable device, and the server 104 may be implemented by an independent server or by a server cluster formed of multiple servers.
In an embodiment, as shown in fig. 2, a video stream synchronization method is provided, which is described by taking the method as an example for being applied to the terminal in fig. 1, and includes the following steps:
s202: and acquiring time information contained in the video frame to be processed based on the video frame to be processed contained in the video stream to be processed.
Generally, the I frames of the video stream to be processed are taken as the video frames to be processed, because an I frame serves as a key frame in the video stream to be processed. The type of the video frame to be processed is not limited in this embodiment, however, and may also be one or more of a P frame and a B frame.
It can be understood that each video frame includes a field carrying time information that records the shooting time, and this time information comes from the shooting device that captures the video stream to be processed.
S204: and determining time calibration information based on the time information contained in the video frame to be processed and the acquired reference time, and writing the time calibration information into the corresponding video frame to be processed.
In this embodiment, the obtained reference time is the system time of the video recording device.
After the system time of the front-end shooting device is calibrated by other devices or platforms, the system time of the shooting device may be inconsistent with the system time of the video recording device, so that the time information in the video frame to be processed is inconsistent with the system time of the video recording device.
The file name of a video stream on the video recording device contains the start time and the end time of the stream, both of which are taken from the device's system time when it writes and packs the video stream. Therefore, if the time of the front-end shooting device is inconsistent with the system time of the video recording device, there is a discrepancy between the time in the video stream's file name and the time information of each frame contained in the file. For example, if the shooting device's time is 10:00 while the system time of the video recording device is 8:00, the shooting device's clock runs two hours ahead of the video recording device; when the video recording device packs the video stream recorded from 8 o'clock to 9 o'clock, the time information of the frames inside the file reads 10 o'clock to 11 o'clock. With time calibration information available, the time information of the video frames can be recalculated to a device time that is consistent with the file name of the video stream.
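For illustration only, the recalculation in the example above can be sketched as follows (a minimal Python sketch; the concrete timestamps and variable names are hypothetical and not part of the embodiment):

    from datetime import datetime

    # Hypothetical values matching the example above: the shooting device's clock
    # runs two hours ahead of the video recording device's system clock.
    frame_timestamp = datetime(2021, 9, 16, 10, 0, 0)  # time written by the shooting device
    reference_time = datetime(2021, 9, 16, 8, 0, 0)    # recorder system time when the stream is packed

    # Time calibration information: difference between the frame timestamp and the reference time.
    calibration = frame_timestamp - reference_time      # 2:00:00, the frame is ahead of the recorder

    # Recalculated frame time, now consistent with the 8-to-9 o'clock file name.
    corrected = frame_timestamp - calibration
    print(corrected)                                     # 2021-09-16 08:00:00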
In this embodiment, the difference between the timestamp in the time information contained in the video frame to be processed and the acquired reference time is used as the time calibration information.
The time calibration information is written into the corresponding video frame to be processed, so that before a given video frame to be processed is played, its time calibration information can be acquired directly from the frame itself.
S206: and performing time synchronization on the video frame to be processed and a reference video frame contained in the video stream to be processed based on the time calibration information in the video frame to be processed.
A reference video frame in this embodiment is a video frame for which time calibration information is not determined.
The video stream to be processed in this embodiment contains several categories of video frames, such as I frames, P frames and B frames. To reduce the demand on performance, this embodiment does not calculate time calibration information for every video frame; instead, one or more types of video frames are taken as the video frames to be processed for which time calibration information is calculated, and the reference video frames reuse the corresponding time calibration information for their time synchronization. This improves the synchronization efficiency of the video stream to be processed while still solving the playback anomalies caused by the time mismatch.
In one embodiment, as shown in fig. 3, writing the time calibration information into the corresponding video frame to be processed includes the following steps:
s302: determining that the difference value between the timestamp in the time information contained in the video frame to be processed and the obtained reference time is a positive value or a negative value, and writing a determination result into a first preset field in the video frame to be processed;
s304: and encoding the absolute value of the difference value between the time stamp in the time information contained in the video frame to be processed and the acquired reference time, and writing the encoded absolute value into a second preset field in the video frame to be processed.
When a shooting device is connected to a channel of the video recording device, the video recording device monitors, in real time, the time information in the video frames transmitted by the front-end shooting device and compares the timestamp in that time information with its own current system time. If the two are equal, no time calibration information is calculated. If they are not equal, the difference is calculated by subtracting the system time of the video recording device from the timestamp of the video frame to be processed, and the sign of the resulting difference is then judged.
In an embodiment, if the difference between the time information contained in the video frame to be processed and the obtained reference time is a positive value, the first preset field of the video frame to be processed is set to 0, and if the difference is a negative value, the first preset field is set to 1. The absolute value of the difference is binary-encoded and kept in the second preset field of the video frame to be processed. It can be understood that, in some other embodiments, the first preset field may instead be set to 1 when the difference is positive and to 0 when it is negative.
In order to write the time calibration information into the video frame to be processed, field information needs to be added to the frame, and a reserved field or an extension field of the frame header can be used as the storage location. As shown in fig. 4, 4 bytes are used to store the absolute value of the difference between the timestamp in the time information contained in the video frame to be processed and the acquired reference time. 1 byte is used as a sign flag to record how the timestamp compares with the acquired reference time: a value of 0 represents a negative correction, that is, the timestamp in the video frame to be processed is greater than the reference time and the time difference needs to be subtracted for calibration, while a value of 1 represents a positive correction, that is, the timestamp is less than the reference time and the time difference needs to be added for calibration. The remaining 3 bytes are used for other functions or reserved for alignment.
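As a minimal sketch of the frame-header layout described above, the 8-byte extension field could be packed and unpacked as follows (assuming the difference is expressed as a whole number of seconds and big-endian byte order, neither of which is specified in this embodiment; the function names are hypothetical):

    import struct

    def pack_calibration_field(frame_timestamp_s: int, reference_time_s: int) -> bytes:
        # 4 bytes: absolute value of the difference; 1 byte: sign flag, 0 when the
        # frame timestamp is greater than the reference time (the difference must be
        # subtracted) and 1 when it is smaller (the difference must be added);
        # 3 bytes: reserved for alignment.
        diff = frame_timestamp_s - reference_time_s
        sign_flag = 0 if diff > 0 else 1
        return struct.pack(">IB3x", abs(diff), sign_flag)

    def unpack_calibration_field(field: bytes) -> int:
        # Returns the signed correction, in seconds, to add to a frame timestamp.
        abs_diff, sign_flag = struct.unpack(">IB3x", field)
        return abs_diff if sign_flag == 1 else -abs_diff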
The writing process of the time calibration information is shown in fig. 5. The video stream of the front-end shooting device is transmitted to the video recording device, the time information contained in the video frame to be processed is obtained, and it is judged whether the timestamp of the video frame to be processed equals the reference time. If it does, the frame is written to the storage hard disk and the video recording device stores the video frame to be processed according to the written time. If it does not, the difference between the timestamp in the time information contained in the video frame to be processed and the reference time is calculated, and it is judged whether the timestamp is greater than the reference time: if so, the first preset field is set to 0, otherwise it is set to 1. The absolute value of the difference is then encoded and written into the second preset field, the frame is written to the storage hard disk, and the video recording device stores it according to the written time.
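The write path of fig. 5 could then be sketched roughly as below, reusing the hypothetical pack_calibration_field helper above; frame and disk are placeholder objects rather than part of the embodiment:

    def write_frame(frame, recorder_system_time_s: int, disk) -> None:
        # Recorder-side write path: compare the frame timestamp with the system time
        # of the video recording device and, if they differ, write the time
        # calibration information into the frame-header extension before storing.
        if frame.timestamp_s == recorder_system_time_s:
            disk.store(frame)  # timestamps agree, nothing to calibrate
            return
        frame.extension = pack_calibration_field(frame.timestamp_s, recorder_system_time_s)
        disk.store(frame)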
In an embodiment, as shown in fig. 6, performing time synchronization on the video frame to be processed and the reference video frame included in the video stream to be processed based on the time calibration information in the video frame to be processed includes the following steps:
S402: performing time synchronization on the corresponding video frame to be processed based on the time calibration information in the video frame to be processed;
S404: and performing time synchronization on the reference video frame adjacent to the corresponding video frame to be processed based on the time calibration information in the video frame to be processed.
The video frame to be processed is sent into a decoding channel and the image is displayed after decoding. According to the time information and the time calibration information of the video frame to be processed, the cursor is moved to the correct position on the playback time axis and the video frame to be processed is time-synchronized, so that a smooth picture can be displayed.
Time synchronization is performed on the reference video frames adjacent to the corresponding video frame to be processed based on the time calibration information in that video frame to be processed.
Typically, the reference video frame immediately before or after a video frame to be processed is time-synchronized in this way. Because the shooting device may have its time set repeatedly, the time calibration information may differ from one video frame to be processed to another; between adjacent video frames, however, the calibration is substantially the same, so an adjacent reference video frame can be time-synchronized with the time calibration information of the nearby video frame to be processed.
The process of synchronizing the video stream to be processed is shown in fig. 7. First, the video recording device is searched for the video stream to be processed and the file name of the video stream to be processed is obtained according to the searched time interval. The video stream to be processed is then played, the frame information of a video frame is sent out, and it is judged whether time calibration information exists in the frame information. If not, the frame information is returned as-is, the reference video frames following the current video frame to be processed are returned, and it is judged whether the next frame marks the end of the stream; if not, the check for time calibration information in the frame information continues until playback of the video stream to be processed is finished. If time calibration information does exist, it is judged whether the difference is a positive value: if so, the difference is added to the timestamp of the video frame to be processed, and if not, the difference is subtracted from the timestamp. The synchronized frame information of the video frame to be processed is returned, the same adjustment is applied to the timestamp of the reference video frame following the video frame to be processed, and it is judged whether the next frame marks the end; if not, the check for time calibration information in the frame information continues until playback of the video stream to be processed is finished.
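Under the same assumptions, the playback-side flow of fig. 7 could look roughly like the sketch below: frames carrying the extension field are treated as video frames to be processed, and the signed correction recovered from them (via the hypothetical unpack_calibration_field helper above) is reused for the reference frames that follow; the attribute names are placeholders:

    def synchronize_stream(frames):
        # Playback-side synchronization: apply the correction carried by each video
        # frame to be processed to that frame and to the reference frames following
        # it, until the next video frame to be processed supplies a new correction.
        correction_s = 0
        for frame in frames:
            if getattr(frame, "extension", None) is not None:  # a video frame to be processed, e.g. an I frame
                correction_s = unpack_calibration_field(frame.extension)
            frame.timestamp_s += correction_s
            yield frame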
Through steps S402-S404, all video frames in the video stream to be processed are synchronized. During synchronization, time calibration information is not determined for every individual video frame; instead, the time calibration information carried in the video frames to be processed is used to time-synchronize both the video frames to be processed and the reference video frames contained in the video stream to be processed, which improves the synchronization efficiency of the video stream to be processed and ensures its smooth playback.
It should be understood that, although the steps in the above flowcharts are shown in a sequence indicated by the arrows, they are not necessarily executed in that order. Unless explicitly stated otherwise herein, there is no strict order limitation on the execution of these steps, and they may be executed in other orders. Moreover, at least some of the steps in figs. 1-7 may include multiple sub-steps or stages; these are not necessarily completed at the same moment but may be executed at different moments, and they are not necessarily executed sequentially but may be executed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 8, the present invention provides a video stream synchronization apparatus, including:
an obtaining module 802, configured to obtain time information included in a to-be-processed video frame based on the to-be-processed video frame included in the to-be-processed video stream;
a determining module 804, configured to determine time calibration information based on the time information included in the video frame to be processed and the obtained reference time, and write the time calibration information into a corresponding video frame to be processed;
a synchronization module 806, configured to perform time synchronization on the video frame to be processed and a reference video frame included in the video stream to be processed based on the time calibration information in the video frame to be processed.
In an embodiment, a difference value between a timestamp in the time information included in the video frame to be processed and the acquired reference time is used as the time calibration information.
In one embodiment, the determining module comprises:
the first writing module is used for determining that the difference value between the timestamp in the time information contained in the video frame to be processed and the acquired reference time is a positive value or a negative value, and writing the determination result into a first preset field in the video frame to be processed;
and the second writing module is used for coding the absolute value of the difference value between the time stamp in the time information contained in the video frame to be processed and the acquired reference time and writing the absolute value into a second preset field in the video frame to be processed.
In one embodiment, the synchronization module comprises:
the first synchronization module is used for carrying out time synchronization on the corresponding video frame to be processed based on the time calibration information in the video frame to be processed;
and the second synchronization module is used for performing time synchronization on the reference video frame adjacent to the corresponding video frame to be processed based on the time calibration information in the video frame to be processed.
For specific limitations of the video stream synchronization apparatus, reference may be made to the above limitations of the video stream synchronization method, which are not repeated here. Each module in the video stream synchronization apparatus may be implemented in whole or in part by software, by hardware, or by a combination of the two. The modules may be embedded in, or independent of, a processor in the computer device in the form of hardware, or may be stored in a memory of the computer device in the form of software, so that the processor can invoke them and execute the operations corresponding to each module.
In one embodiment, a computer device is provided, and the computer device may be a server whose internal structure may be as shown in fig. 9. The computer device includes a processor, a memory and a network interface connected by a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program and a database. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The database of the computer device is used for storing motion detection data. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program, when executed by the processor, implements the steps of any of the above embodiments of the video stream synchronization method.
Those skilled in the art will appreciate that the structure shown in fig. 9 is merely a block diagram of part of the structure related to the solution of the present application and does not limit the computer device to which the solution is applied; a particular computer device may include more or fewer components than shown in the figure, combine certain components, or have a different arrangement of components.
In an embodiment, there is provided a computer device comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing the steps of any of the above video stream synchronization method embodiments when executing the computer program.
In an embodiment, a computer-readable storage medium is provided, on which a computer program is stored, which, when being executed by a processor, carries out the steps of any of the above-mentioned video stream synchronization method embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods in the above embodiments can be implemented by a computer program instructing the relevant hardware. The computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database or another medium used in the embodiments provided herein can include at least one of non-volatile and volatile memory. Non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical storage, or the like. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM).
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; nevertheless, as long as there is no contradiction in a combination of these technical features, it should be considered to fall within the scope of this specification.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent application shall be subject to the appended claims.

Claims (10)

1. A method for video stream synchronization, the method comprising:
acquiring time information contained in a video frame to be processed based on the video frame to be processed contained in the video stream to be processed;
determining time calibration information based on the time information contained in the video frame to be processed and the acquired reference time, and writing the time calibration information into the corresponding video frame to be processed;
and performing time synchronization on the video frame to be processed and a reference video frame contained in the video stream to be processed based on the time calibration information in the video frame to be processed.
2. The method according to claim 1, wherein determining the time calibration information based on the time information contained in the video frame to be processed and the obtained reference time comprises:
and taking the difference value between the time stamp in the time information contained in the video frame to be processed and the acquired reference time as time calibration information.
3. The method of claim 2, wherein writing the time calibration information into the corresponding video frame to be processed comprises:
determining that the difference value between the timestamp in the time information contained in the video frame to be processed and the obtained reference time is a positive value or a negative value, and writing a determination result into a first preset field in the video frame to be processed;
and encoding the absolute value of the difference value between the time stamp in the time information contained in the video frame to be processed and the acquired reference time, and writing the encoded absolute value into a second preset field in the video frame to be processed.
4. The method according to claim 1, wherein performing time synchronization on the video frame to be processed and the reference video frame included in the video stream to be processed based on the time calibration information in the video frame to be processed comprises:
performing time synchronization on the corresponding video frame to be processed based on the time calibration information in the video frame to be processed;
and performing time synchronization on the reference video frame adjacent to the corresponding video frame to be processed based on the time calibration information in the video frame to be processed.
5. A video stream synchronization apparatus, characterized in that the apparatus comprises:
the device comprises an acquisition module, a processing module and a processing module, wherein the acquisition module is used for acquiring time information contained in a video frame to be processed based on the video frame to be processed contained in the video stream to be processed;
the determining module is used for determining time calibration information based on the time information contained in the video frame to be processed and the acquired reference time, and writing the time calibration information into the corresponding video frame to be processed;
and the synchronization module is used for performing time synchronization on the video frame to be processed and the reference video frame contained in the video stream to be processed based on the time calibration information in the video frame to be processed.
6. The apparatus according to claim 5, wherein a difference between a timestamp in the time information included in the video frame to be processed and the obtained reference time is used as the time calibration information.
7. The apparatus of claim 6, wherein the determining module comprises:
the first writing module is used for determining that the difference value between the timestamp in the time information contained in the video frame to be processed and the acquired reference time is a positive value or a negative value, and writing the determination result into a first preset field in the video frame to be processed;
and the second writing module is used for coding the absolute value of the difference value between the time stamp in the time information contained in the video frame to be processed and the acquired reference time and writing the absolute value into a second preset field in the video frame to be processed.
8. The apparatus of claim 5, wherein the synchronization module comprises:
the first synchronization module is used for carrying out time synchronization on the corresponding video frame to be processed based on the time calibration information in the video frame to be processed;
and the second synchronization module is used for performing time synchronization on the reference video frame adjacent to the corresponding video frame to be processed based on the time calibration information in the video frame to be processed.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any of claims 1 to 4.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 4.
CN202111084529.2A 2021-09-16 2021-09-16 Video stream synchronization method and device, computer equipment and storage medium Pending CN114143486A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111084529.2A CN114143486A (en) 2021-09-16 2021-09-16 Video stream synchronization method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111084529.2A CN114143486A (en) 2021-09-16 2021-09-16 Video stream synchronization method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114143486A 2022-03-04

Family

ID=80394660

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111084529.2A Pending CN114143486A (en) 2021-09-16 2021-09-16 Video stream synchronization method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114143486A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103702013A (en) * 2013-11-28 2014-04-02 北京航空航天大学 Frame synchronization method for multiple channels of real-time videos
CN107079193A (en) * 2014-10-31 2017-08-18 瑞典爱立信有限公司 Video stream synchronization
WO2018120557A1 (en) * 2016-12-26 2018-07-05 深圳市中兴微电子技术有限公司 Method and device for synchronously processing audio and video, and storage medium
CN107135330A (en) * 2017-07-04 2017-09-05 广东工业大学 A kind of method and apparatus of video frame synchronization
CN110460891A (en) * 2018-05-08 2019-11-15 杭州海康威视数字技术股份有限公司 Video frame output method, device, electronic equipment and storage medium
CN110933449A (en) * 2019-12-20 2020-03-27 北京奇艺世纪科技有限公司 Method, system and device for synchronizing external data and video pictures

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115665483A (en) * 2022-12-27 2023-01-31 北京蓝色星际科技股份有限公司 Video playing method and device and hard disk video recorder
CN115665483B (en) * 2022-12-27 2023-03-31 北京蓝色星际科技股份有限公司 Video playing method and device and hard disk video recorder

Similar Documents

Publication Publication Date Title
CN110418186B (en) Audio and video playing method and device, computer equipment and storage medium
CN109144858B (en) Fluency detection method and device, computing equipment and storage medium
CN109714623B (en) Image display method and device, electronic equipment and computer readable storage medium
CN110147469B (en) Data processing method, device and storage medium
RU2763518C1 (en) Method, device and apparatus for adding special effects in video and data media
CN108492338B (en) Compression method and device for animation file, storage medium and electronic device
CN110572691B (en) Video reading method, device, equipment and storage medium
CN112532998B (en) Method, device and equipment for extracting video frame and readable storage medium
JP2012123770A (en) Vehicle recording apparatus and image recording method
US10033930B2 (en) Method of reducing a video file size for surveillance
CN114143486A (en) Video stream synchronization method and device, computer equipment and storage medium
CN109889922B (en) Method, device, equipment and storage medium for forwarding streaming media data
CN109821235B (en) Game video recording method, device and server
CN110582016A (en) video information display method, device, server and storage medium
CN109597566B (en) Data reading and storing method and device
US9729919B2 (en) Remultiplexing bitstreams of encoded video for video playback
CN111506747B (en) File analysis method, device, electronic equipment and storage medium
CN103761194B (en) A kind of EMS memory management process and device
WO2019056701A1 (en) Information processing method and apparatus, computer device and storage medium
CN113096218A (en) Dynamic image playing method, device, storage medium and computer equipment
CN116708892A (en) Sound and picture synchronous detection method, device, equipment and storage medium
CN112866745B (en) Streaming video data processing method, device, computer equipment and storage medium
CN110401845B (en) First screen playing method and device, computer equipment and storage medium
WO2022120828A1 (en) Video frame extraction method, device, and storage medium
CN113612962A (en) Video conference processing method, system and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination