CN116249004A - Video acquisition control method, device, equipment and storage medium - Google Patents

Video acquisition control method, device, equipment and storage medium Download PDF

Info

Publication number
CN116249004A
CN116249004A (application number CN202310239332.4A)
Authority
CN
China
Prior art keywords
acquisition
target
phase
exposure time
current
Prior art date
Legal status
Pending
Application number
CN202310239332.4A
Other languages
Chinese (zh)
Inventor
金友芝
Current Assignee
Douyin Vision Co Ltd
Original Assignee
Douyin Vision Co Ltd
Priority date
Filing date
Publication date
Application filed by Douyin Vision Co Ltd filed Critical Douyin Vision Co Ltd
Priority to CN202310239332.4A
Publication of CN116249004A

Classifications

    • H04J 3/0617 — Time-division multiplex systems; synchronising arrangements; synchronising signal characterised by frequency or phase
    • H04J 3/0661 — Time-division multiplex systems; clock or time synchronisation among packet nodes using timestamps
    • H04M 2250/52 — Details of telephonic subscriber devices including functional features of a camera

Abstract

The embodiments of the disclosure provide a video acquisition control method, device, equipment, and storage medium. The method comprises the following steps: acquiring a current acquisition time stamp corresponding to the current video frame acquired by each acquisition end; determining a current acquisition phase corresponding to the current video frame acquired by each acquisition end based on a preset frame rate and the current acquisition time stamp; determining, based on the current acquisition phases, the target acquisition ends that need to synchronize their phase and the target acquisition phase to which each target acquisition end needs to synchronize; determining a target exposure time corresponding to each target acquisition end based on its current acquisition phase, its target acquisition phase, and the change relation between exposure time and acquisition phase; and performing a single-frame exposure-time adjustment on a subsequent video frame acquired by the target acquisition end based on the target exposure time, so as to synchronize the acquisition phase of the target acquisition end to the target acquisition phase. With this technical scheme, the acquisition phases of the acquisition ends can be synchronized, thereby realizing synchronous acquisition of video frames.

Description

Video acquisition control method, device, equipment and storage medium
Technical Field
The embodiment of the disclosure relates to computer technology, in particular to a video acquisition control method, a device, equipment and a storage medium.
Background
With the rapid development of computer technology, at least two acquisition ends, such as mobile phones and other terminals, can be used to acquire video frames at different viewing angles to obtain free-viewing-angle video. Fig. 1 shows an example of video frame acquisition. As shown in fig. 1, because each acquisition end can exhibit crystal oscillator drift of varying degrees while acquiring video frames, time errors arise, so that the acquisition ends cannot acquire video frames synchronously. It can be seen that a way of controlling the acquisition ends to acquire video frames synchronously is greatly needed.
Disclosure of Invention
The disclosure provides a video acquisition control method, a device, equipment and a storage medium, so as to synchronize acquisition phases of acquisition ends, thereby realizing synchronous acquisition of video frames.
In a first aspect, an embodiment of the present disclosure provides a video acquisition control method, where the video is acquired by at least two acquisition ends, each acquisition end acquires a video frame under a corresponding viewing angle based on a preset frame rate, and the method includes:
acquiring a current acquisition time stamp corresponding to a current video frame acquired by each acquisition end;
Determining a current acquisition phase corresponding to a current video frame acquired by each acquisition end based on the preset frame rate and the current acquisition time stamp;
determining a target acquisition end needing to synchronize phase and a target acquisition phase to which the target acquisition end needs to synchronize based on the current acquisition phase;
determining a target exposure time corresponding to the target acquisition end based on the current acquisition phase and the target acquisition phase corresponding to the target acquisition end and the change relation between the exposure time and the acquisition phase;
and based on the target exposure time, carrying out single-frame exposure time adjustment on the subsequent video frames acquired by the target acquisition end so as to synchronize the acquisition phase of the target acquisition end to the target acquisition phase.
In a second aspect, an embodiment of the present disclosure further provides a video capture control device, where the video is captured by at least two capturing terminals, each capturing terminal captures a video frame at a corresponding viewing angle based on a preset frame rate, and the device includes:
the current acquisition time stamp acquisition module is used for acquiring a current acquisition time stamp corresponding to a current video frame acquired by each acquisition end;
the current acquisition phase determining module is used for determining a current acquisition phase corresponding to a current video frame acquired by each acquisition end based on the preset frame rate and the current acquisition time stamp;
The target acquisition end determining module is used for determining a target acquisition end needing to synchronize the phase and a target acquisition phase to which the target acquisition end needs to synchronize based on the current acquisition phase;
the target exposure time determining module is used for determining the target exposure time corresponding to the target acquisition end based on the current acquisition phase and the target acquisition phase corresponding to the target acquisition end and the change relation between the exposure time and the acquisition phase;
and the exposure time adjustment module is used for carrying out single-frame exposure time adjustment on the subsequent video frames acquired by the target acquisition end based on the target exposure time so as to synchronize the acquisition phase of the target acquisition end to the target acquisition phase.
In a third aspect, embodiments of the present disclosure further provide an electronic device, including:
one or more processors;
storage means for storing one or more programs,
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the video capture control method as described in any of the embodiments of the present disclosure.
In a fourth aspect, the disclosed embodiments also provide a storage medium containing computer executable instructions, which when executed by a computer processor, are for performing a video acquisition control method as described in any of the disclosed embodiments.
According to the embodiments of the present disclosure, the current acquisition phase corresponding to the current video frame acquired by each acquisition end is determined based on the preset frame rate and the current acquisition time stamp of the acquired video frame; based on the current acquisition phases, the target acquisition ends that need to synchronize their phase and the target acquisition phase to which they need to synchronize are determined; the target exposure time corresponding to each target acquisition end is determined based on its current acquisition phase, its target acquisition phase, and the change relation between exposure time and acquisition phase; and based on the target exposure time, a single-frame exposure-time adjustment is performed on a subsequent video frame acquired by the target acquisition end, so that the target acquisition end is adjusted from its current acquisition phase to the target acquisition phase. In this way, the acquisition phases of the target acquisition ends can be synchronized by adjusting the single-frame exposure time, unsynchronized acquisition phases caused by crystal oscillator drift and similar conditions are avoided, and synchronous acquisition of video frames is realized.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
FIG. 1 is an example of video frame acquisition in the prior art;
fig. 2 is a schematic flow chart of a video acquisition control method according to an embodiment of the disclosure;
FIG. 3 is an example of a synchronous acquisition phase in accordance with embodiments of the present disclosure;
fig. 4 is a flowchart of another video acquisition control method according to an embodiment of the disclosure;
fig. 5 is a schematic structural diagram of a video acquisition control device according to an embodiment of the disclosure;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the accompanying drawings, it should be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that the present disclosure will be understood more thoroughly and completely. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order and/or performed in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are open-ended, i.e., "including, but not limited to". The term "based on" means "based at least in part on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Related definitions of other terms will be given in the description below.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "a", "an", and "a plurality" in this disclosure are illustrative rather than limiting; those of ordinary skill in the art will appreciate that they should be understood as "one or more" unless the context clearly indicates otherwise.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
It will be appreciated that, before the technical solutions disclosed in the embodiments of the present disclosure are used, the user should be informed, in an appropriate manner and in accordance with the relevant laws and regulations, of the type, usage scope, usage scenarios, etc. of the personal information involved in the present disclosure, and the user's authorization should be obtained.
It will be appreciated that the data (including but not limited to the data itself, the acquisition or use of the data) involved in the present technical solution should comply with the corresponding legal regulations and the requirements of the relevant regulations.
Fig. 2 is a schematic flow chart of a video acquisition control method provided by an embodiment of the present disclosure. The embodiment is applicable to the case of controlling at least two acquisition ends to synchronously acquire video frames. The method may be performed by a video acquisition control device, which may be implemented in the form of software and/or hardware and, optionally, by an electronic device such as a mobile terminal, a PC, or a server.
As shown in fig. 2, the video acquisition control method specifically includes the following steps:
s110, acquiring a current acquisition time stamp corresponding to the current video frame acquired by each acquisition end.
The video is collected by at least two acquisition ends, and each acquisition end collects video frames at its corresponding viewing angle based on a preset frame rate. For example, the captured video may be free-viewing-angle video, i.e., video that can be viewed at different viewing angles. An acquisition end is any terminal device capable of acquiring video frames; for example, it may be a mobile terminal with a camera, such as a mobile phone, a tablet computer, or a video camera. There are at least two acquisition ends, so that video frames at least two different viewing angles can be acquired and a user can watch the video from at least two different viewing angles. All the acquisition ends capturing the same video can be of the same brand and model, further ensuring synchronous acquisition of video frames.
The preset frame rate is the frame rate at which the acquisition end acquires video frames, and may be set in advance based on service requirements and the actual scene. For example, the preset frame rate may be 30 fps, i.e., 30 frames are acquired per second. The current video frame is the video frame acquired by the acquisition end at the current moment. The current acquisition time stamp characterizes the moment at which acquisition of the current video frame started. The time stamp can be represented directly by the current time, or by the interval from a fixed time to the current time; in the latter case, each acquisition end determines its time stamp based on the same fixed time, so that the time stamps can be used to measure whether acquisition is synchronous.
Specifically, at least two acquisition ends can be used to acquire video frames of the same object at different viewing angles at the same preset frame rate, obtaining video frames at different viewing angles. While each acquisition end is acquiring video frames, the control end can obtain the current acquisition time stamp corresponding to the current video frame of each acquisition end in real time, so as to synchronize the acquisition phases of all acquisition ends in real time; alternatively, the current acquisition time stamps can be obtained at fixed intervals, so that the phases are synchronized periodically and synchronization resources are saved. The interval of such periodic acquisition can be determined based on the degree of crystal oscillator drift at the acquisition end; for example, the interval may range from 5 minutes to 15 minutes.
It should be noted that, because each acquisition end may exhibit different degrees of crystal oscillator drift while acquiring video frames, the time intervals between adjacent frames acquired by an acquisition end are not exactly equal; for example, when video frames are acquired at 30 fps, the interval between two adjacent frames is not exactly 33.3333 milliseconds. The inter-frame intervals differ between acquisition ends, and even for the same acquisition end the interval between adjacent frames exhibits nanosecond-level jitter. Because of crystal oscillator drift, an acquisition end can accumulate a millisecond-level time error within one minute. In view of this, the control end can synchronize the acquisition phases of all acquisition ends, so that all acquisition ends acquire video frames synchronously and the time errors caused by crystal oscillator drift and similar conditions are avoided.
S120, determining a current acquisition phase corresponding to a current video frame acquired by each acquisition end based on a preset frame rate and a current acquisition time stamp.
The current acquisition phase is the acquisition phase at the moment the current video frame starts to be acquired. The acquisition phase is the position of the acquisition start point within the cyclic acquisition period corresponding to the preset frame rate. For example, when the preset frame rate is 20 fps, 20 frames are acquired per second and the acquisition period of each frame is 50 milliseconds. If the actual start point of a certain video frame is at 51 milliseconds, the position of that start point within the acquisition period is taken as the acquisition phase of the video frame: the difference (1 millisecond) between the actual start point (51 milliseconds) and the standard start point (50 milliseconds) may be taken as the acquisition phase, or the ratio between this difference and the acquisition period may be taken as the acquisition phase.
Specifically, the control end may determine, within the acquisition period corresponding to the preset frame rate, the current acquisition phase corresponding to the current video frame acquired by each acquisition end, based on the current acquisition time stamp corresponding to that video frame.
Illustratively, S120 may include: determining standard acquisition time length of a single video frame based on a preset frame rate; determining the total acquisition time length from the first video frame acquired by each acquisition end to the current video frame based on the current acquisition time stamp corresponding to the current video frame acquired by each acquisition end and the first acquisition time stamp corresponding to the first video frame acquired by each acquisition end; and performing residual processing on the total acquisition time length and the standard acquisition time length, and determining the current acquisition phase corresponding to the current video frame acquired by each acquisition end.
The standard acquisition duration may refer to an acquisition duration that a single video frame should have in the absence of self crystal drift. For example, the preset frame rate is 20fps, and the standard acquisition duration corresponding to a single video frame is 50 milliseconds.
Specifically, the preset frame rate can be inverted to obtain the standard acquisition duration of the single video frame. When the current time is used for representing the current acquisition time stamp, for each acquisition end, the interval duration between the current acquisition time stamp corresponding to the current video frame acquired by the acquisition end and the first acquisition time stamp corresponding to the acquired first video frame can be used as the total acquisition duration from the first video frame acquired by the acquisition end to the current video frame. And taking the remainder of the total acquisition time length and the standard acquisition time length, wherein the obtained remainder can be directly used as the current acquisition phase corresponding to the current video frame acquired by the acquisition end, or the ratio of the obtained remainder to the standard acquisition time length can be used as the current acquisition phase corresponding to the current video frame acquired by the acquisition end.
When the interval duration from the fixed time to the current time is used for representing the current acquisition time stamp, the current acquisition time stamp and the standard acquisition time duration can be directly subjected to residual processing, the obtained remainder can be directly used as the current acquisition phase corresponding to the current video frame acquired by the acquisition end, or the ratio between the obtained remainder and the standard acquisition time duration can be used as the current acquisition phase corresponding to the current video frame acquired by the acquisition end.
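As an illustration only (the function name, millisecond units, and the ratio form of the phase are assumptions for this sketch, not details fixed by the patent), the remainder-based phase computation described above can be written as:

```python
def acquisition_phase(current_ts_ms: float, first_ts_ms: float, frame_rate: float) -> float:
    """Return the acquisition phase as a fraction of the frame period.

    The standard acquisition duration of a single frame is the reciprocal of
    the preset frame rate; the phase is the remainder of the total acquisition
    duration modulo that period, expressed here in ratio form.
    """
    period_ms = 1000.0 / frame_rate          # standard single-frame duration
    total_ms = current_ts_ms - first_ts_ms   # total duration since the first frame
    remainder = total_ms % period_ms         # offset within the current period
    return remainder / period_ms             # ratio between remainder and period


# 20 fps gives a 50 ms period; a frame starting at 51 ms is 1 ms (2%) into its period
print(acquisition_phase(51.0, 0.0, 20.0))   # -> 0.02
print(acquisition_phase(100.0, 0.0, 20.0))  # -> 0.0 (exactly on the period boundary)
```

The remainder itself (1 ms here) could equally be returned, matching the other phase representation the description mentions.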
S130, determining a target acquisition end needing to synchronize the phase and a target acquisition phase to which the target acquisition end needs to synchronize based on the current acquisition phase.
The target acquisition end is an acquisition end, among all the acquisition ends, whose acquisition phase is not synchronized. The target acquisition phase is the acquisition phase to which the target acquisition end needs to be adjusted. There may be one or more target acquisition ends.
Specifically, the current acquisition phases of the current video frames acquired by each acquisition end can be compared, and all target acquisition ends needing to be synchronized in phase and the target acquisition phases to which each target acquisition end needs to be adjusted from the current acquisition phases are determined, so that the acquisition phases of all the acquisition ends are synchronized, and all the acquisition ends can synchronously acquire the video frames.
The determining process of the target acquisition end can be implemented at least by the following two ways:
as one implementation, S130 may include: determining a corresponding acquisition phase difference of each acquisition end based on the current acquisition phase corresponding to each acquisition end and a preset standard phase; and determining the acquisition end with the acquisition phase difference larger than or equal to the preset phase difference as a target acquisition end needing to synchronize the phase, and determining the preset standard phase as a target acquisition phase to which the target acquisition end needs to synchronize.
The preset standard phase is a standard acquisition phase, set in advance, at which the acquisition ends should acquire video frames. The preset phase difference is the maximum allowed fluctuation of the acquisition phase. Specifically, synchronization can be achieved by synchronizing the current acquisition phases of all acquisition ends to the preset standard phase. In this mode, the difference between the current acquisition phase of each acquisition end and the preset standard phase is taken as the acquisition phase difference of that acquisition end, and every acquisition end whose acquisition phase difference is greater than or equal to the preset phase difference is determined to be a target acquisition end requiring phase synchronization, with the preset standard phase as the target acquisition phase to which each target acquisition end needs to synchronize. An acquisition end whose acquisition phase difference is smaller than the preset phase difference does not need its acquisition phase adjusted. By adjusting the acquisition phases of all target acquisition ends to the preset standard phase, every acquisition end acquires video frames at the preset standard phase, thereby realizing synchronization of the acquisition phases.
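This first selection mode can be sketched as follows; the dictionary layout, camera identifiers, and phase values are illustrative assumptions, not from the patent:

```python
def targets_vs_standard(phases: dict, standard_phase: float, max_diff: float) -> dict:
    """Select acquisition ends whose phase deviates from the preset standard
    phase by at least the preset phase difference; each selected (target)
    end is to be synchronized to the standard phase."""
    targets = {}
    for end_id, phase in phases.items():
        if abs(phase - standard_phase) >= max_diff:
            targets[end_id] = standard_phase  # target acquisition phase
    return targets


phases = {"cam_a": 0.00, "cam_b": 0.30, "cam_c": 0.04}
print(targets_vs_standard(phases, standard_phase=0.0, max_diff=0.05))
# cam_b deviates by 0.30 >= 0.05, so only cam_b needs adjustment -> {'cam_b': 0.0}
```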
As another implementation, S130 may include: determining an acquisition phase difference between the first acquisition end and each second acquisition end based on the current acquisition phase corresponding to the first acquisition end and the current acquisition phase corresponding to each second acquisition end; and determining a second acquisition end with the acquisition phase difference larger than or equal to the preset phase difference as a target acquisition end needing to synchronize the phase, and determining the current acquisition phase corresponding to the first acquisition end as a target acquisition phase to which the target acquisition end needs to synchronize.
The first acquisition end is the acquisition end serving as the synchronization reference; the second acquisition ends are all acquisition ends other than the first. The preset phase difference is the maximum allowed fluctuation of the acquisition phase. Specifically, synchronization can be achieved by synchronizing the current acquisition phases of all second acquisition ends to the current acquisition phase of the first acquisition end. In this mode, the difference between the current acquisition phase of each second acquisition end and that of the first acquisition end is taken as the acquisition phase difference of that second acquisition end, and every second acquisition end whose acquisition phase difference is greater than or equal to the preset phase difference is determined to be a target acquisition end requiring phase synchronization, with the current acquisition phase of the first acquisition end as the target acquisition phase. A second acquisition end whose acquisition phase difference is smaller than the preset phase difference does not need its acquisition phase adjusted. By adjusting the acquisition phases of the deviating second acquisition ends to that of the first acquisition end, every second acquisition end acquires video frames at the acquisition phase of the first acquisition end, thereby realizing synchronization of the acquisition phases.
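The second mode, with a designated first acquisition end as the reference, might look like this (identifiers and phase values again illustrative assumptions):

```python
def targets_vs_reference(phases: dict, reference_id: str, max_diff: float) -> dict:
    """Compare every second acquisition end against the first (reference)
    end; ends deviating by at least the preset phase difference become
    targets and must synchronize to the reference end's current phase."""
    ref_phase = phases[reference_id]
    return {
        end_id: ref_phase
        for end_id, phase in phases.items()
        if end_id != reference_id and abs(phase - ref_phase) >= max_diff
    }


phases = {"cam_ref": 0.10, "cam_b": 0.12, "cam_c": 0.40}
print(targets_vs_reference(phases, "cam_ref", max_diff=0.05))
# only cam_c (|0.40 - 0.10| = 0.30 >= 0.05) must sync, to 0.10 -> {'cam_c': 0.1}
```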
And S140, determining the target exposure time corresponding to the target acquisition end based on the current acquisition phase and the target acquisition phase corresponding to the target acquisition end and the change relation between the exposure time and the acquisition phase.
The exposure time may be the time the shutter is open when the video frame is acquired. The exposure time may affect the acquisition duration of the video frame and thus the acquisition phase of the video frame. The acquisition time of a video frame can be prolonged by prolonging the exposure time of the video frame, so that the acquisition phase of the video frame acquired subsequently can be changed. The variation relationship between the exposure time and the acquisition phase may be a predetermined correspondence relationship between the exposure time variation amount and the acquisition phase variation amount.
Specifically, for each target acquisition end, the difference between the current acquisition phase and the target acquisition phase corresponding to the target acquisition end may be determined as a target phase difference corresponding to the target acquisition end, and based on the change relationship between the exposure time and the acquisition phase, an exposure extension time corresponding to the target phase difference is determined, and based on the exposure extension time and a minimum exposure time affecting the change of the acquisition phase, a target exposure time for adjusting the current acquisition phase of the target acquisition end to the target acquisition phase is determined.
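The patent leaves the change relation between exposure time and acquisition phase as a predetermined correspondence; the sketch below assumes the simplest 1:1 linear relation (extending one frame's exposure by t shifts all subsequent phases by t), which is an assumption made here purely for illustration, as are the names and millisecond units:

```python
def target_exposure_ms(current_phase_ms: float, target_phase_ms: float,
                       period_ms: float, base_exposure_ms: float) -> float:
    """Compute the one-off target exposure time for a target acquisition end,
    assuming a 1:1 linear exposure-to-phase change relation."""
    # Phase shift needed, wrapped into [0, period) so the exposure is only
    # ever extended, never shortened.
    extension_ms = (target_phase_ms - current_phase_ms) % period_ms
    return base_exposure_ms + extension_ms


# 20 fps (50 ms period), current phase 1 ms, target phase 0, base exposure 10 ms:
# the adjusted frame must be exposed once for 10 + 49 = 59 ms
print(target_exposure_ms(1.0, 0.0, 50.0, 10.0))  # -> 59.0
```

A real implementation would also clamp the result against the sensor's minimum exposure time that still affects the acquisition phase, as the description notes.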
And S150, based on the target exposure time, carrying out single-frame exposure time adjustment on the subsequent video frames acquired by the target acquisition end so as to synchronize the acquisition phase of the target acquisition end to the target acquisition phase.
Each acquisition end has the ability to adjust the exposure time of a single frame. For example, if the operating system of each acquisition end is the Android system, the acquisition end supports adjustment of the single-frame exposure time.
Specifically, the exposure time of the target acquisition end can be temporarily adjusted to the target exposure time; after the target acquisition end acquires a subsequent video frame with the adjusted target exposure time, its exposure time is restored to the original exposure time and acquisition continues with that original exposure time. By extending the exposure time of a single video frame in this way, the acquisition phase of the video frames that follow it becomes the target acquisition phase, realizing synchronization of the acquisition phases. For example, fig. 3 gives an example of synchronizing the acquisition phase. As shown in fig. 3, the acquisition phase corresponding to the m-th frame is 50% of the acquisition period (the position indicated by the dashed line in the figure). Since this 50% acquisition phase is greater than the preset standard phase of 0, an exposure-time adjustment is triggered at the m-th frame; the extended exposure time is used to acquire the (m+2)-th frame, prolonging the acquisition duration of the (m+2)-th frame, so that the acquisition phases of the (m+3)-th and subsequent video frames are adjusted to the preset standard phase of 0, thereby realizing synchronization to acquisition phase 0.
In an exemplary embodiment, when it is determined that the acquisition phase of the current video frame acquired by a certain acquisition end needs to be synchronized, if the acquisition end can only adjust the single-frame exposure time once every several frames, then after the current group of video frames is acquired with the current exposure time, the first video frame of the next group can be acquired with the adjusted target exposure time, and after that first video frame is acquired, acquisition of subsequent video frames continues with the original exposure time; as shown in fig. 3, synchronization to the standard phase can then be completed after about 5 to 10 frames. If the acquisition end can adjust the single-frame exposure time frame by frame, the video frame immediately following the current video frame can be acquired directly with the adjusted target exposure time, so that the acquisition phase can be synchronized more rapidly.
It should be noted that, by adjusting the exposure time of a single frame, the acquisition phases of all the acquisition ends for acquiring video frames can be synchronized, that is, all the acquisition ends start to acquire video frames at the same position, so that all the acquisition ends can synchronously acquire video frames, and frame synchronization among all the acquisition ends is realized.
According to the technical scheme, the current acquisition phase corresponding to the current video frame acquired by each acquisition end is determined based on the preset frame rate and the current acquisition timestamp of the acquired video frame, and the target acquisition end whose phase needs to be synchronized, together with the target acquisition phase to which it needs to be synchronized, are determined based on the current acquisition phase; the target exposure time corresponding to the target acquisition end is determined based on the current acquisition phase and target acquisition phase corresponding to the target acquisition end and the change relationship between the exposure time and the acquisition phase; and based on the target exposure time, single-frame exposure time adjustment is performed on a subsequent video frame acquired by the target acquisition end, so that the target acquisition end is adjusted from the current acquisition phase to the target acquisition phase. The acquisition phase of the target acquisition end can thus be synchronized by adjusting the single-frame exposure time, avoiding acquisition phase desynchronization caused by crystal oscillator drift and the like, and realizing synchronous acquisition of video frames.
Based on the above technical solution, before S110, the method may further include: and based on a preset time interval, periodically performing time synchronization on the system time of each acquisition end.
The preset time interval may be a synchronization period interval preset for synchronizing the system time of the acquisition end. The preset time interval may be set based on the degree of crystal oscillator drift. For example, the preset time interval may be 5 minutes to 10 minutes.
Specifically, the system time of each acquisition end can be periodically synchronized based on the preset time interval using an existing time synchronization method, thereby ensuring synchronization of the acquisition timestamps of the acquisition ends and, in turn, synchronization of the acquisition phases. By periodically synchronizing both the system time and the acquisition phase, synchronous acquisition of video can be guaranteed, supporting long-duration recording or streaming of synchronized video.
Based on the technical scheme, the method further comprises the following steps: determining an abnormal video frame in the video stream acquired by each acquisition end, wherein the abnormal video frame comprises a video frame with abnormal exposure time; deleting the abnormal video frames and storing the deleted video stream.
The video stream may refer to the continuous video frames continuously acquired by an acquisition end. An abnormal video frame may refer to a video frame acquired with the adjusted target exposure time. Because the target exposure time is an extended exposure time, the acquisition duration of such a video frame is longer than the standard acquisition duration. As shown in fig. 3, the (m+2)-th frame is an abnormal video frame. The acquisition duration of an abnormal video frame is longer than the standard acquisition duration corresponding to the preset frame rate.
Specifically, when recording video, it can be detected whether the exposure time of each video frame in the video stream acquired by each acquisition end is greater than the original exposure time, or whether its acquisition duration is greater than the standard acquisition duration. If so, the video frame was acquired with the extended exposure time, and it can be determined to be an abnormal video frame; all abnormal video frames in the video stream are then deleted, so that only video frames with normal exposure times are saved. This avoids image flicker when playing back the video stream and improves the viewing experience.
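The filtering step described above can be sketched as follows; the frame representation (a list of `(timestamp_ns, data)` pairs) and the helper name are assumptions of this sketch, not part of the disclosed method:

```python
def remove_abnormal_frames(frames, frame_rate, tolerance_ns=0):
    """Drop frames whose acquisition duration exceeds the standard duration,
    i.e. frames captured with the extended exposure time.

    frames: list of (timestamp_ns, data) pairs in acquisition order.
    """
    standard_ns = 1_000_000_000 // frame_rate  # standard duration of one frame
    kept = [frames[0]] if frames else []
    for prev, cur in zip(frames, frames[1:]):
        duration_ns = cur[0] - prev[0]
        # Keep only frames whose duration is at most the standard duration
        # (plus an optional tolerance for timestamp jitter).
        if duration_ns <= standard_ns + tolerance_ns:
            kept.append(cur)
    return kept
```

At 30 fps (standard duration 33 333 333 ns), a frame whose timestamp gap to its predecessor is noticeably larger — e.g. one acquired with the extended exposure time — is removed while the surrounding frames are preserved.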
Fig. 4 is a schematic flow chart of another video acquisition control method provided by an embodiment of the present disclosure, where the step of determining the target exposure time corresponding to the target acquisition end based on the current acquisition phase and the target acquisition phase corresponding to the target acquisition end and the change relationship between the exposure time and the acquisition phase is further optimized on the basis of the above disclosed embodiment. Wherein the same or corresponding terms as those of the above-described embodiments are not explained in detail herein.
As shown in fig. 4, the video acquisition control method specifically includes the following steps:
s210, acquiring a current acquisition time stamp corresponding to the current video frame acquired by each acquisition end.
S220, determining a current acquisition phase corresponding to a current video frame acquired by each acquisition end based on a preset frame rate and a current acquisition time stamp.
S230, determining a target acquisition end needing to synchronize the phase and a target acquisition phase to which the target acquisition end needs to synchronize based on the current acquisition phase.
S240, determining a target phase difference corresponding to the target acquisition end based on the current acquisition phase and the target acquisition phase corresponding to the target acquisition end.
Specifically, for each target acquisition end, the difference between the current acquisition phase corresponding to the target acquisition end and the target acquisition phase may be determined as the target phase difference corresponding to the target acquisition end.
S250, determining the target exposure time corresponding to the target acquisition end based on the target phase difference, the preset frame rate, the preset change coefficient and the exposure time control threshold.
The preset change coefficient may refer to a change coefficient between the exposure time change amount and the phase change amount, and may be determined in advance based on historical acquisition data. For example, a preset change coefficient of 2 indicates that when the exposure time changes by 1 unit, the acquisition phase changes by 2 units. The exposure time control threshold may refer to the minimum exposure time that affects the change of the acquisition phase. An exposure time that is too small does not cause the acquisition phase to change; the acquisition phase changes only after the exposure time exceeds the exposure time control threshold, so the exposure time control threshold is needed to determine a target exposure time that can actually adjust the acquisition phase to the target acquisition phase.
Specifically, the exposure extension time required to adjust the current acquisition phase of the target acquisition end to the target acquisition phase can be determined based on the target phase difference, the standard acquisition duration corresponding to the preset frame rate, and the preset change coefficient; the target exposure time is then determined based on the exposure extension time and the exposure time control threshold. The target exposure time to which the exposure should be extended can thus be determined directly and rapidly using the preset change coefficient and the exposure time control threshold, further improving the synchronization efficiency of the acquisition phase.
It should be noted that, since an exposure time that is too small does not cause the acquisition phase to change, the acquisition phase can only be changed by extending the exposure time beyond the exposure time control threshold, so the target exposure time is larger than the original exposure time.
Illustratively, S250 may include: determining standard acquisition time length of a single video frame based on a preset frame rate; determining a target difference value between the target phase difference and the standard acquisition time length; determining exposure extension time based on the target difference value and a preset change coefficient; and determining the target exposure time corresponding to the target acquisition end based on the exposure extension time and the exposure time control threshold.
Specifically, since the acquisition phases of subsequent video frames can only be synchronized by extending the acquisition duration, when the acquisition phases are represented directly by the remainder, a target difference between the target phase difference and the standard acquisition duration needs to be determined; this target difference is the phase change amount that actually needs to be applied. If the preset change coefficient is the ratio of the exposure time change amount to the phase change amount, the target difference can be multiplied by the preset change coefficient and the resulting product determined as the exposure extension time. The exposure extension time is then added to the exposure time control threshold, and the resulting sum is determined as the target exposure time corresponding to the target acquisition end.
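One plausible reading of this computation, sketched in code (all times in nanoseconds; the interpretation of the target difference as the remaining distance around the acquisition period, and the helper name, are assumptions of this sketch):

```python
def target_exposure_time(phase_diff_ns, frame_rate, change_coeff, threshold_ns):
    """Target exposure time for the single extended frame."""
    standard_ns = 1_000_000_000 // frame_rate  # standard duration of one frame
    # Only extension is possible, so the phase must travel the rest of the
    # period: the difference between the standard acquisition duration and
    # the target phase difference.
    target_diff_ns = standard_ns - phase_diff_ns
    # Preset change coefficient = exposure time change amount / phase change amount.
    extension_ns = int(target_diff_ns * change_coeff)
    # Extensions only take effect above the exposure time control threshold,
    # so the threshold is the baseline to which the extension is added.
    return threshold_ns + extension_ns
```

For instance, at 30 fps with a phase difference of 5 ms, a change coefficient of 1, and a threshold of 30 ms, the sketch yields a target exposure time of roughly 58.3 ms for the one adjusted frame.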
And S260, based on the target exposure time, carrying out single-frame exposure time adjustment on the subsequent video frames acquired by the target acquisition end so as to synchronize the acquisition phase of the target acquisition end to the target acquisition phase.
According to the technical scheme, the target phase difference corresponding to the target acquisition end is determined based on the current acquisition phase and the target acquisition phase corresponding to the target acquisition end, and the target exposure time corresponding to the target acquisition end is determined based on the target phase difference, the preset frame rate, the preset change coefficient and the exposure time control threshold, so that the target exposure time required to be prolonged can be directly and rapidly determined by utilizing the preset change coefficient and the exposure time control threshold, and the synchronization efficiency of the acquisition phase is further improved.
Based on the above technical solution, before synchronizing the acquisition phases, the exposure time control threshold may be determined by the following steps S310-S330:
S310, acquiring a first exposure time and a second exposure time, wherein the first exposure time does not affect the change of the acquisition phase, and the second exposure time does affect the change of the acquisition phase.
Wherein the acquisition phase does not change when the first exposure time is adjusted. The acquisition phase will change when the second exposure time is adjusted. Specifically, two exposure times may be preset, that is, a first exposure time that does not affect the change of the acquisition phase and a second exposure time that does affect the change of the acquisition phase, where the exposure time control threshold is located between the first exposure time and the second exposure time. For example, when the preset frame rate is 30fps, the first exposure time is set to 20000000 nanoseconds, and the second exposure time is set to 40000000 nanoseconds. The first exposure time is relatively short and the second exposure time is relatively long.
S320, determining an average exposure time based on the first exposure time and the second exposure time, and detecting whether setting the average exposure time affects the change of the acquisition phase.
Specifically, the first exposure time and the second exposure time are averaged to obtain the average exposure time. After the original exposure time of the acquisition end is adjusted to the average exposure time, it is compared whether the acquisition phase corresponding to the original exposure time is the same as (or within an allowable error range of) the acquisition phase corresponding to the average exposure time. If so, it is determined that the average exposure time does not affect the change of the acquisition phase; otherwise, it is determined that the average exposure time does affect the change of the acquisition phase.
S330, updating the first exposure time or the second exposure time based on the detection result, detecting whether the difference between the updated first exposure time and the updated second exposure time is smaller than or equal to a preset difference, if so, executing step S340, and if not, executing step S320.
The preset difference value may be preset, and the exposure time controls the allowable error range of the threshold. The smaller the preset difference, the higher the accuracy of the exposure time control threshold.
Specifically, if the acquisition phase is detected to be changed, the average exposure time is indicated to be greater than the exposure time control threshold, and the second exposure time can be updated to be the average exposure time. If the acquisition phase is detected not to change, the average exposure time is smaller than the exposure time control threshold value, and the first exposure time can be updated to be the average exposure time. After the first exposure time or the second exposure time is updated, whether a difference between the updated first exposure time and second exposure time is smaller than a preset difference may be detected, so as to determine whether to continue updating the first exposure time or the second exposure time based on the detection result. If the difference between the first exposure time and the second exposure time is greater than the preset difference, it indicates that the distance between the first exposure time and the second exposure time needs to be continuously reduced, and step S320 may be executed again at this time, and the first exposure time or the second exposure time is updated again in a circulating manner until the first exposure time and the second exposure time meeting the requirements are determined.
S340, determining an exposure time control threshold based on the first exposure time and the second exposure time.
Specifically, when the difference between the updated first exposure time and the second exposure time is less than or equal to the preset difference, the updated first exposure time and second exposure time may be subjected to an averaging process, and the obtained average exposure time is determined as the exposure time control threshold. The exposure time control threshold value can be accurately determined by continuously and circularly reducing the distance between the first exposure time and the second exposure time, and the accuracy and the efficiency of the acquisition phase synchronization are further ensured.
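Steps S310–S340 amount to a bisection search over the exposure time; a minimal sketch, where the probe callback `affects_phase` stands in for actually adjusting the exposure time on the acquisition end and observing whether the acquisition phase changes:

```python
def find_exposure_threshold(affects_phase, first_ns, second_ns, preset_diff_ns):
    """Bisect between an exposure time that does not affect the acquisition
    phase (first_ns) and one that does (second_ns) until they are within the
    preset difference; return their average as the control threshold."""
    while second_ns - first_ns > preset_diff_ns:
        mid_ns = (first_ns + second_ns) // 2
        if affects_phase(mid_ns):
            second_ns = mid_ns  # phase changed: threshold is at or below mid
        else:
            first_ns = mid_ns   # phase unchanged: threshold is above mid
    return (first_ns + second_ns) // 2
```

The smaller the preset difference, the more probe iterations are needed but the more accurate the resulting threshold, matching the trade-off noted above.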
Fig. 5 is a schematic structural diagram of a video acquisition control device according to an embodiment of the disclosure. The video is collected by at least two collection ends, and each collection end collects video frames under corresponding visual angles based on a preset frame rate. As shown in fig. 5, the apparatus specifically includes: a current acquisition timestamp acquisition module 310, a current acquisition phase determination module 320, a target acquisition end determination module 330, a target exposure time determination module 340, and an exposure time adjustment module 350.
The current acquisition timestamp acquisition module 310 is configured to acquire a current acquisition timestamp corresponding to a current video frame acquired by each acquisition end; the current acquisition phase determining module 320 is configured to determine a current acquisition phase corresponding to a current video frame acquired by each acquisition end based on the preset frame rate and the current acquisition timestamp; a target acquisition end determining module 330, configured to determine, based on the current acquisition phase, a target acquisition end that needs to synchronize a phase and a target acquisition phase to which the target acquisition end needs to synchronize; a target exposure time determining module 340, configured to determine a target exposure time corresponding to the target acquisition end based on a current acquisition phase and a target acquisition phase corresponding to the target acquisition end and a change relationship between an exposure time and an acquisition phase; the exposure time adjustment module 350 is configured to perform single-frame exposure time adjustment on a subsequent video frame acquired by the target acquisition end based on the target exposure time, so as to synchronize an acquisition phase of the target acquisition end to the target acquisition phase.
According to the technical scheme provided by the embodiment of the disclosure, the current acquisition phase corresponding to the current video frame acquired by each acquisition end is determined based on the preset frame rate and the current acquisition timestamp of the acquired video frame, and the target acquisition end whose phase needs to be synchronized, together with the target acquisition phase to which it needs to be synchronized, are determined based on the current acquisition phase; the target exposure time corresponding to the target acquisition end is determined based on the current acquisition phase and target acquisition phase corresponding to the target acquisition end and the change relationship between the exposure time and the acquisition phase; and based on the target exposure time, single-frame exposure time adjustment is performed on a subsequent video frame acquired by the target acquisition end, so that the target acquisition end is adjusted from the current acquisition phase to the target acquisition phase. The acquisition phase of the target acquisition end can thus be synchronized by adjusting the single-frame exposure time, avoiding acquisition phase desynchronization caused by crystal oscillator drift and the like, and realizing synchronous acquisition of video frames.
Based on the above technical solution, the current acquisition phase determining module 320 is specifically configured to:
Determining the standard acquisition duration of a single video frame based on the preset frame rate; determining the total acquisition duration from the first video frame acquired by each acquisition end to the current video frame, based on the current acquisition timestamp corresponding to the current video frame acquired by each acquisition end and the first acquisition timestamp corresponding to the first video frame acquired by each acquisition end; and taking the remainder of the total acquisition duration with respect to the standard acquisition duration to determine the current acquisition phase corresponding to the current video frame acquired by each acquisition end.
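The phase computation performed by this module can be sketched as follows (hypothetical helper name; timestamps in nanoseconds):

```python
def current_acquisition_phase(current_ts_ns, first_ts_ns, frame_rate):
    """Current acquisition phase: remainder of the total acquisition
    duration over the standard acquisition duration of a single frame."""
    standard_ns = 1_000_000_000 // frame_rate   # standard duration of one frame
    total_ns = current_ts_ns - first_ts_ns      # first frame -> current frame
    return total_ns % standard_ns
```

At 30 fps, a timestamp that is an exact multiple of the standard duration after the first frame yields phase 0, and any residual offset appears directly as the phase.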
Based on the above technical solutions, the target acquisition end determining module 330 is specifically configured to:
determining a corresponding acquisition phase difference of each acquisition end based on the current acquisition phase corresponding to each acquisition end and a preset standard phase; and determining the acquisition end with the acquisition phase difference larger than or equal to a preset phase difference as a target acquisition end needing to synchronize the phase, and determining the preset standard phase as a target acquisition phase to which the target acquisition end needs to synchronize.
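A minimal sketch of this selection step (the helper name and the cyclic wrap-around handling of the phase difference are assumptions of this sketch):

```python
def select_target_ends(current_phases, standard_phase, preset_diff, period):
    """Return the indices of acquisition ends whose acquisition phase differs
    from the preset standard phase by at least the preset phase difference."""
    targets = []
    for idx, phase in enumerate(current_phases):
        diff = abs(phase - standard_phase)
        # The phase is cyclic within one acquisition period, so take the
        # shorter distance around the circle.
        diff = min(diff, period - diff)
        if diff >= preset_diff:
            targets.append(idx)
    return targets
```

Acquisition ends returned by this selection would then be synchronized toward the preset standard phase, which serves as the target acquisition phase.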
Based on the above technical solutions, the target acquisition end determining module 330 is specifically configured to:
determining an acquisition phase difference between the first acquisition end and each second acquisition end based on the current acquisition phase corresponding to the first acquisition end and the current acquisition phase corresponding to each second acquisition end; and determining the second acquisition end with the acquisition phase difference larger than or equal to the preset phase difference as a target acquisition end needing to synchronize the phase, and determining the current acquisition phase corresponding to the first acquisition end as a target acquisition phase to which the target acquisition end needs to synchronize.
Based on the above technical solutions, the target exposure time determining module 340 includes:
the target phase difference determining unit is used for determining a target phase difference corresponding to the target acquisition end based on the current acquisition phase and the target acquisition phase corresponding to the target acquisition end;
a target exposure time determining unit, configured to determine a target exposure time corresponding to the target acquisition end based on the target phase difference, the preset frame rate, a preset change coefficient and an exposure time control threshold;
wherein the preset change coefficient refers to a change coefficient between exposure time change amount and phase change amount; the exposure time control threshold value refers to the minimum exposure time affecting the acquisition phase change.
Based on the above technical solutions, the target exposure time determining unit is specifically configured to:
determining standard acquisition duration of a single video frame based on the preset frame rate; determining a target difference between the target phase difference and the standard acquisition duration; determining exposure extension time based on the target difference value and a preset change coefficient; and determining the target exposure time corresponding to the target acquisition end based on the exposure extension time and the exposure time control threshold.
On the basis of the technical schemes, the device further comprises:
the exposure time control threshold determining module is used for acquiring a first exposure time and a second exposure time, wherein the first exposure time does not affect the change of the acquisition phase and the second exposure time does affect the change of the acquisition phase; determining an average exposure time based on the first exposure time and the second exposure time, and detecting whether setting the average exposure time affects the change of the acquisition phase; and updating the first exposure time or the second exposure time based on the detection result and returning to the operation of determining the average exposure time based on the first exposure time and the second exposure time, until the difference between the first exposure time and the second exposure time is smaller than or equal to the preset difference, whereupon the exposure time control threshold is determined based on the first exposure time and the second exposure time.
On the basis of the technical schemes, the device further comprises:
the time synchronization module is used for periodically performing time synchronization on the system time of each acquisition end based on a preset time interval before acquiring the current acquisition time stamp corresponding to the current video frame acquired by each acquisition end.
On the basis of the technical schemes, the device further comprises:
the abnormal video frame determining module is used for determining abnormal video frames in the video stream acquired by each acquisition end, wherein the abnormal video frames comprise video frames with abnormal exposure time;
and the video stream storage module is used for deleting the abnormal video frames and storing the deleted video stream.
The video acquisition control device provided by the embodiment of the disclosure can execute the video acquisition control method provided by any embodiment of the disclosure, and has the corresponding functional modules and beneficial effects of the execution method.
It should be noted that each unit and module included in the above apparatus are only divided according to the functional logic, but not limited to the above division, so long as the corresponding functions can be implemented; in addition, the specific names of the functional units are also only for convenience of distinguishing from each other, and are not used to limit the protection scope of the embodiments of the present disclosure.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure. Referring now to fig. 6, a schematic diagram of an electronic device (e.g., a terminal device or server in fig. 6) 500 suitable for use in implementing embodiments of the present disclosure is shown. The terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., in-vehicle navigation terminals), and the like, and stationary terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 6 is merely an example and should not be construed to limit the functionality and scope of use of the disclosed embodiments.
As shown in fig. 6, the electronic device 500 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 501, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 502 or a program loaded from a storage means 508 into a Random Access Memory (RAM) 503. The RAM 503 also stores various programs and data required for the operation of the electronic device 500. The processing device 501, the ROM 502, and the RAM 503 are connected to each other via a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.
In general, the following devices may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 507 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 508 including, for example, magnetic tape, hard disk, etc.; and communication means 509. The communication means 509 may allow the electronic device 500 to communicate with other devices wirelessly or by wire to exchange data. While fig. 6 shows an electronic device 500 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a non-transitory computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 509, or from the storage means 508, or from the ROM 502. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing device 501.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
The electronic device provided by the embodiment of the present disclosure and the video acquisition control method provided by the foregoing embodiment belong to the same inventive concept, and technical details not described in detail in the present embodiment may be referred to the foregoing embodiment, and the present embodiment has the same beneficial effects as the foregoing embodiment.
The embodiment of the present disclosure provides a computer storage medium having stored thereon a computer program which, when executed by a processor, implements the video capture control method provided by the above embodiment.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some implementations, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquiring a current acquisition time stamp corresponding to a current video frame acquired by each acquisition end; determining a current acquisition phase corresponding to a current video frame acquired by each acquisition end based on the preset frame rate and the current acquisition time stamp; determining a target acquisition end needing to synchronize phase and a target acquisition phase to which the target acquisition end needs to synchronize based on the current acquisition phase; determining a target exposure time corresponding to the target acquisition end based on the current acquisition phase and the target acquisition phase corresponding to the target acquisition end and the change relation between the exposure time and the acquisition phase; and based on the target exposure time, carrying out single-frame exposure time adjustment on the subsequent video frames acquired by the target acquisition end so as to synchronize the acquisition phase of the target acquisition end to the target acquisition phase.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including, but not limited to, object oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware. The name of a unit does not in any way constitute a limitation of the unit itself; for example, the first acquisition unit may also be described as "a unit that acquires at least two internet protocol addresses".
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to one or more embodiments of the present disclosure, there is provided a video capture control method, in which video is captured by at least two capture terminals, each capture terminal capturing video frames at a respective viewing angle based on a preset frame rate, the method including:
acquiring a current acquisition time stamp corresponding to a current video frame acquired by each acquisition end;
determining a current acquisition phase corresponding to a current video frame acquired by each acquisition end based on the preset frame rate and the current acquisition time stamp;
determining a target acquisition end needing to synchronize phase and a target acquisition phase to which the target acquisition end needs to synchronize based on the current acquisition phase;
determining a target exposure time corresponding to the target acquisition end based on the current acquisition phase and the target acquisition phase corresponding to the target acquisition end and the change relation between the exposure time and the acquisition phase;
and based on the target exposure time, carrying out single-frame exposure time adjustment on the subsequent video frames acquired by the target acquisition end so as to synchronize the acquisition phase of the target acquisition end to the target acquisition phase.
According to one or more embodiments of the present disclosure, there is provided a video capture control method [ example two ], further comprising:
Optionally, the determining, based on the preset frame rate and the current acquisition timestamp, a current acquisition phase corresponding to a current video frame acquired by each acquisition end includes:
determining standard acquisition duration of a single video frame based on the preset frame rate;
determining the total acquisition time length from the first video frame acquired by each acquisition end to the current video frame based on the current acquisition time stamp corresponding to the current video frame acquired by each acquisition end and the first acquisition time stamp corresponding to the first video frame acquired by each acquisition end;
and performing residual processing on the total acquisition time length and the standard acquisition time length, and determining a current acquisition phase corresponding to the current video frame acquired by each acquisition end.
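The three steps of example two amount to a modulo over frame periods. The following is a minimal Python sketch; the function and parameter names, and the use of millisecond timestamps, are assumptions made for illustration and are not part of the disclosed method:

```python
def current_acquisition_phase(first_ts_ms: float, current_ts_ms: float,
                              frame_rate: float) -> float:
    """Return the current acquisition phase in milliseconds.

    first_ts_ms   -- timestamp of the first video frame acquired by a capture end
    current_ts_ms -- timestamp of the current video frame
    frame_rate    -- preset frame rate in frames per second
    """
    # Standard acquisition duration of a single video frame.
    standard_duration_ms = 1000.0 / frame_rate
    # Total acquisition duration from the first frame to the current frame.
    total_duration_ms = current_ts_ms - first_ts_ms
    # The remainder ("residual") of the total duration over the standard
    # duration is the current acquisition phase.
    return total_duration_ms % standard_duration_ms
```

For instance, at 25 fps the standard acquisition duration is 40 ms, so a frame stamped 1610 ms after the first frame has an acquisition phase of 10 ms.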
According to one or more embodiments of the present disclosure, there is provided a video capture control method [ example three ], further comprising:
optionally, the determining, based on the current acquisition phase, a target acquisition end that needs to synchronize a phase and a target acquisition phase to which the target acquisition end needs to synchronize, includes:
determining a corresponding acquisition phase difference of each acquisition end based on the current acquisition phase corresponding to each acquisition end and a preset standard phase;
and determining the acquisition end with the acquisition phase difference larger than or equal to a preset phase difference as a target acquisition end needing to synchronize the phase, and determining the preset standard phase as a target acquisition phase to which the target acquisition end needs to synchronize.
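Example three can be sketched as below, comparing each capture end's current acquisition phase against the preset standard phase; the dictionary representation and the cyclic-distance handling are illustrative assumptions, not details given by the disclosure:

```python
def select_sync_targets(phases: dict, standard_phase: float,
                        frame_duration: float, max_phase_diff: float):
    """Return the capture ends whose acquisition phase difference from the
    preset standard phase is greater than or equal to max_phase_diff, together
    with the target acquisition phase they must be synchronized to."""
    targets = []
    for end_id, phase in phases.items():
        # The acquisition phase is cyclic with period frame_duration, so take
        # the shortest distance around the cycle (an assumed refinement).
        diff = abs(phase - standard_phase)
        diff = min(diff, frame_duration - diff)
        if diff >= max_phase_diff:
            targets.append(end_id)
    # The preset standard phase serves as the target acquisition phase.
    return targets, standard_phase
```

With a 40 ms frame period, a standard phase of 0 and a preset phase difference of 10 ms, an end at phase 38 ms is only 2 ms away around the cycle and needs no synchronization.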
According to one or more embodiments of the present disclosure, there is provided a video capture control method [ example four ], further comprising:
optionally, the determining, based on the current acquisition phase, a target acquisition end that needs to synchronize a phase and a target acquisition phase to which the target acquisition end needs to synchronize, includes:
determining an acquisition phase difference between the first acquisition end and each second acquisition end based on the current acquisition phase corresponding to the first acquisition end and the current acquisition phase corresponding to each second acquisition end;
and determining the second acquisition end with the acquisition phase difference larger than or equal to the preset phase difference as a target acquisition end needing to synchronize the phase, and determining the current acquisition phase corresponding to the first acquisition end as a target acquisition phase to which the target acquisition end needs to synchronize.
According to one or more embodiments of the present disclosure, there is provided a video capture control method [ example five ], further comprising:
optionally, the determining the target exposure time corresponding to the target acquisition end based on the current acquisition phase and the target acquisition phase corresponding to the target acquisition end and the change relationship between the exposure time and the acquisition phase includes:
Determining a target phase difference corresponding to the target acquisition end based on the current acquisition phase and the target acquisition phase corresponding to the target acquisition end;
determining a target exposure time corresponding to the target acquisition end based on the target phase difference, the preset frame rate, a preset change coefficient and an exposure time control threshold;
wherein the preset change coefficient refers to a change coefficient between exposure time change amount and phase change amount; the exposure time control threshold value refers to the minimum exposure time affecting the acquisition phase change.
According to one or more embodiments of the present disclosure, there is provided a video acquisition control method [ example six ], further comprising:
optionally, the determining the target exposure time corresponding to the target acquisition end based on the target phase difference, the preset frame rate, the preset change coefficient and the exposure time control threshold includes:
determining standard acquisition duration of a single video frame based on the preset frame rate;
determining a target difference between the target phase difference and the standard acquisition duration;
determining exposure extension time based on the target difference value and a preset change coefficient;
and determining the target exposure time corresponding to the target acquisition end based on the exposure extension time and the exposure time control threshold.
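The four steps of example six compose into a single formula. The sketch below follows the stated order of operations, but the operand order of the "target difference" and the exact role of the change coefficient are assumptions, since the disclosure does not fix the arithmetic:

```python
def target_exposure_time(target_phase_diff: float, frame_rate: float,
                         change_coeff: float, exposure_threshold: float) -> float:
    """Sketch of example six. change_coeff is the coefficient between
    exposure time change amount and phase change amount; exposure_threshold
    is the minimum exposure time that affects the acquisition phase."""
    # Standard acquisition duration of a single video frame, in ms.
    standard_duration = 1000.0 / frame_rate
    # Target difference between the standard acquisition duration and the
    # target phase difference (operand order assumed).
    target_diff = standard_duration - target_phase_diff
    # Exposure extension time derived from the target difference via the
    # preset change coefficient (linear relation assumed).
    extension = target_diff * change_coeff
    # The target exposure time sits on top of the exposure time control
    # threshold, so that the single-frame adjustment actually shifts phase.
    return exposure_threshold + extension
```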
According to one or more embodiments of the present disclosure, there is provided a video capture control method [ example seven ], further comprising:
optionally, the determining of the exposure time control threshold includes:
acquiring a first exposure time and a second exposure time, wherein setting the first exposure time does not cause the acquisition phase to change, and setting the second exposure time does cause the acquisition phase to change;
determining an average exposure time based on the first exposure time and the second exposure time, and detecting whether setting the average exposure time causes the acquisition phase to change;
updating the first exposure time or the second exposure time based on the detection result, and returning to the operation of determining an average exposure time based on the first exposure time and the second exposure time, until the difference between the first exposure time and the second exposure time is smaller than or equal to a preset difference, and then determining the exposure time control threshold based on the first exposure time and the second exposure time.
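The iterative narrowing of example seven is a bisection over exposure times. A minimal sketch follows, assuming a probe callback affects_phase(t) that applies exposure time t and reports whether the acquisition phase changed; the callback and parameter names are illustrative:

```python
def find_exposure_threshold(affects_phase, t_low: float, t_high: float,
                            preset_diff: float = 0.1) -> float:
    """Bisection search for the exposure time control threshold: the minimum
    exposure time that affects the acquisition phase.

    t_low  -- a first exposure time that does NOT affect the phase
    t_high -- a second exposure time that DOES affect the phase
    """
    while t_high - t_low > preset_diff:
        mid = (t_low + t_high) / 2.0
        if affects_phase(mid):
            # The average already affects the phase: the threshold is at or
            # below mid, so update the second exposure time.
            t_high = mid
        else:
            # The average does not affect the phase: the threshold is above
            # mid, so update the first exposure time.
            t_low = mid
    # Once the bracket is within the preset difference, determine the
    # threshold from the two bounds.
    return (t_low + t_high) / 2.0
```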
According to one or more embodiments of the present disclosure, there is provided a video capture control method [ example eight ]:
optionally, before the current acquisition time stamp corresponding to the current video frame acquired by each acquisition end is acquired, the method further includes:
And based on a preset time interval, periodically performing time synchronization on the system time of each acquisition end.
According to one or more embodiments of the present disclosure, there is provided a video acquisition control method [ example nine ], further comprising:
optionally, the method further comprises:
determining abnormal video frames in the video stream acquired by each acquisition end, wherein the abnormal video frames comprise video frames with abnormal exposure time;
deleting the abnormal video frames and storing the deleted video stream.
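Example nine reduces to filtering the stored stream by exposure time. A minimal sketch follows, assuming frames are represented as (frame_data, exposure_ms) pairs and that "abnormal" means an exposure time deviating from the normal single-frame exposure by more than a tolerance; both assumptions are illustrative:

```python
def remove_abnormal_frames(stream, normal_exposure_ms: float,
                           tolerance_ms: float):
    """Return the video stream with abnormal-exposure frames deleted.

    stream -- iterable of (frame_data, exposure_ms) pairs
    """
    return [frame for frame in stream
            # Keep only frames whose exposure stays within tolerance of the
            # normal single-frame exposure time.
            if abs(frame[1] - normal_exposure_ms) <= tolerance_ms]
```

Frames whose exposure was stretched for phase synchronization would be dropped here before the stream is stored.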
According to one or more embodiments of the present disclosure, there is provided a video capture control apparatus [ example ten ], the video being captured by at least two capture terminals, each capture terminal capturing video frames at a respective viewing angle based on a preset frame rate, the apparatus comprising:
the current acquisition time stamp acquisition module is used for acquiring a current acquisition time stamp corresponding to a current video frame acquired by each acquisition end;
the current acquisition phase determining module is used for determining a current acquisition phase corresponding to a current video frame acquired by each acquisition end based on the preset frame rate and the current acquisition time stamp;
the target acquisition end determining module is used for determining a target acquisition end needing to synchronize the phase and a target acquisition phase to which the target acquisition end needs to synchronize based on the current acquisition phase;
The target exposure time determining module is used for determining the target exposure time corresponding to the target acquisition end based on the current acquisition phase and the target acquisition phase corresponding to the target acquisition end and the change relation between the exposure time and the acquisition phase;
and the exposure time adjustment module is used for carrying out single-frame exposure time adjustment on the subsequent video frames acquired by the target acquisition end based on the target exposure time so as to synchronize the acquisition phase of the target acquisition end to the target acquisition phase.
The foregoing description is only of the preferred embodiments of the present disclosure and an explanation of the technical principles employed. It will be appreciated by persons skilled in the art that the scope of the disclosure is not limited to technical solutions formed by the specific combinations of features described above, but also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the concept of the disclosure, for example, solutions in which the above features are interchanged with technical features having similar functions disclosed in this disclosure (but not limited thereto).
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.

Claims (12)

1. A video acquisition control method, wherein the video is acquired by at least two acquisition terminals, each acquisition terminal acquires video frames at a corresponding viewing angle based on a preset frame rate, the method comprising:
acquiring a current acquisition time stamp corresponding to a current video frame acquired by each acquisition end;
determining a current acquisition phase corresponding to a current video frame acquired by each acquisition end based on the preset frame rate and the current acquisition time stamp;
determining a target acquisition end needing to synchronize phase and a target acquisition phase to which the target acquisition end needs to synchronize based on the current acquisition phase;
determining a target exposure time corresponding to the target acquisition end based on the current acquisition phase and the target acquisition phase corresponding to the target acquisition end and the change relation between the exposure time and the acquisition phase;
And based on the target exposure time, carrying out single-frame exposure time adjustment on the subsequent video frames acquired by the target acquisition end so as to synchronize the acquisition phase of the target acquisition end to the target acquisition phase.
2. The video capture control method according to claim 1, wherein determining a current capture phase corresponding to a current video frame captured by each capture end based on the preset frame rate and the current capture timestamp comprises:
determining standard acquisition duration of a single video frame based on the preset frame rate;
determining the total acquisition time length from the first video frame acquired by each acquisition end to the current video frame based on the current acquisition time stamp corresponding to the current video frame acquired by each acquisition end and the first acquisition time stamp corresponding to the first video frame acquired by each acquisition end;
and performing residual processing on the total acquisition time length and the standard acquisition time length, and determining a current acquisition phase corresponding to the current video frame acquired by each acquisition end.
3. The video capture control method according to claim 1, wherein determining, based on the current capture phase, a target capture end to which a synchronization phase is required and a target capture phase to which the target capture end is required to be synchronized, comprises:
Determining a corresponding acquisition phase difference of each acquisition end based on the current acquisition phase corresponding to each acquisition end and a preset standard phase;
and determining the acquisition end with the acquisition phase difference larger than or equal to a preset phase difference as a target acquisition end needing to synchronize the phase, and determining the preset standard phase as a target acquisition phase to which the target acquisition end needs to synchronize.
4. The video capture control method according to claim 1, wherein determining, based on the current capture phase, a target capture end to which a synchronization phase is required and a target capture phase to which the target capture end is required to be synchronized, comprises:
determining an acquisition phase difference between the first acquisition end and each second acquisition end based on the current acquisition phase corresponding to the first acquisition end and the current acquisition phase corresponding to each second acquisition end;
and determining the second acquisition end with the acquisition phase difference larger than or equal to the preset phase difference as a target acquisition end needing to synchronize the phase, and determining the current acquisition phase corresponding to the first acquisition end as a target acquisition phase to which the target acquisition end needs to synchronize.
5. The video capture control method according to claim 1, wherein the determining the target exposure time corresponding to the target capture end based on the current capture phase and the target capture phase corresponding to the target capture end and a change relationship between the exposure time and the capture phase comprises:
Determining a target phase difference corresponding to the target acquisition end based on the current acquisition phase and the target acquisition phase corresponding to the target acquisition end;
determining a target exposure time corresponding to the target acquisition end based on the target phase difference, the preset frame rate, a preset change coefficient and an exposure time control threshold;
wherein the preset change coefficient refers to a change coefficient between exposure time change amount and phase change amount; the exposure time control threshold value refers to the minimum exposure time affecting the acquisition phase change.
6. The video capture control method according to claim 5, wherein determining the target exposure time corresponding to the target capture end based on the target phase difference, the preset frame rate, a preset change coefficient, and an exposure time control threshold value comprises:
determining standard acquisition duration of a single video frame based on the preset frame rate;
determining a target difference between the target phase difference and the standard acquisition duration;
determining exposure extension time based on the target difference value and a preset change coefficient;
and determining the target exposure time corresponding to the target acquisition end based on the exposure extension time and the exposure time control threshold.
7. The video capture control method of claim 5, wherein the determining of the exposure time control threshold comprises:
acquiring a first exposure time and a second exposure time, wherein setting the first exposure time does not cause the acquisition phase to change, and setting the second exposure time does cause the acquisition phase to change;
determining an average exposure time based on the first exposure time and the second exposure time, and detecting whether setting the average exposure time causes the acquisition phase to change;
updating the first exposure time or the second exposure time based on the detection result, and returning to the operation of determining an average exposure time based on the first exposure time and the second exposure time, until the difference between the first exposure time and the second exposure time is smaller than or equal to a preset difference, and then determining the exposure time control threshold based on the first exposure time and the second exposure time.
8. The video acquisition control method according to claim 1, further comprising, before acquiring a current acquisition timestamp corresponding to a current video frame acquired by each acquisition end:
and based on a preset time interval, periodically performing time synchronization on the system time of each acquisition end.
9. The video capture control method of any one of claims 1-8, further comprising:
determining abnormal video frames in the video stream acquired by each acquisition end, wherein the abnormal video frames comprise video frames with abnormal exposure time;
deleting the abnormal video frames and storing the deleted video stream.
10. A video acquisition control device, wherein the video is acquired by at least two acquisition terminals, each acquisition terminal acquiring video frames at a corresponding viewing angle based on a preset frame rate, the device comprising:
the current acquisition time stamp acquisition module is used for acquiring a current acquisition time stamp corresponding to a current video frame acquired by each acquisition end;
the current acquisition phase determining module is used for determining a current acquisition phase corresponding to a current video frame acquired by each acquisition end based on the preset frame rate and the current acquisition time stamp;
the target acquisition end determining module is used for determining a target acquisition end needing to synchronize the phase and a target acquisition phase to which the target acquisition end needs to synchronize based on the current acquisition phase;
the target exposure time determining module is used for determining the target exposure time corresponding to the target acquisition end based on the current acquisition phase and the target acquisition phase corresponding to the target acquisition end and the change relation between the exposure time and the acquisition phase;
And the exposure time adjustment module is used for carrying out single-frame exposure time adjustment on the subsequent video frames acquired by the target acquisition end based on the target exposure time so as to synchronize the acquisition phase of the target acquisition end to the target acquisition phase.
11. An electronic device, the electronic device comprising:
one or more processors;
storage means for storing one or more programs,
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the video capture control method of any of claims 1-9.
12. A storage medium containing computer executable instructions which, when executed by a computer processor, are for performing the video acquisition control method as claimed in any one of claims 1 to 9.
CN202310239332.4A 2023-03-08 2023-03-08 Video acquisition control method, device, equipment and storage medium Pending CN116249004A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310239332.4A CN116249004A (en) 2023-03-08 2023-03-08 Video acquisition control method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310239332.4A CN116249004A (en) 2023-03-08 2023-03-08 Video acquisition control method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116249004A true CN116249004A (en) 2023-06-09

Family

ID=86624007

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310239332.4A Pending CN116249004A (en) 2023-03-08 2023-03-08 Video acquisition control method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116249004A (en)

Similar Documents

Publication Publication Date Title
CN110266420B (en) Clock synchronization method, clock synchronization apparatus, and computer-readable storage medium
CN112488783B (en) Image acquisition method and device and electronic equipment
CN114699767A (en) Game data processing method, device, medium and electronic equipment
CN111163336B (en) Video resource pushing method and device, electronic equipment and computer readable medium
CN116708892A (en) Sound and picture synchronous detection method, device, equipment and storage medium
CN110809166B (en) Video data processing method and device and electronic equipment
CN112929240A (en) Method, device, terminal and non-transitory storage medium for acquiring communication delay time
CN114630170B (en) Audio and video synchronization method and device, electronic equipment and storage medium
CN116249004A (en) Video acquisition control method, device, equipment and storage medium
CN111669625A (en) Processing method, device and equipment for shot file and storage medium
CN114528433B (en) Template selection method and device, electronic equipment and storage medium
CN114584709B (en) Method, device, equipment and storage medium for generating zooming special effects
CN115114463A (en) Media content display method and device, electronic equipment and storage medium
CN111628913B (en) Online time length determining method and device, readable medium and electronic equipment
CN114499816A (en) Clock synchronization method and device, terminal equipment and readable storage medium
CN113127282A (en) Frame capability determining method and device, electronic equipment and storage medium
CN108024121B (en) Voice barrage synchronization method and system
CN116033199A (en) Multi-device audio and video synchronization method and device, electronic device and storage medium
CN117354483A (en) Synchronous verification method and device, electronic equipment and storage medium
CN113096194B (en) Method, device, terminal and non-transitory storage medium for determining time sequence
CN112188274B (en) Method and device for adjusting video playing progress and electronic equipment
CN117528157A (en) Video playing method, device, system, equipment and storage medium
CN117527648A (en) Bandwidth detection method, device, equipment and storage medium
CN117714635A (en) Information processing method, apparatus, electronic device and storage medium
CN112258408A (en) Information display method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination