WO2020125643A1 - Frame synchronization method and device thereof (帧同步方法及其装置) - Google Patents

Frame synchronization method and device thereof (帧同步方法及其装置)

Info

Publication number: WO2020125643A1
Authority: WO - WIPO (PCT)
Prior art keywords: camera device, frame, frames, camera, time stamp
Application number: PCT/CN2019/126055
Other languages: English (en), French (fr)
Inventors: 龚平, 秦书嘉, 贝尔洪丹尼尔, 卡尔巴耶拉巴勃罗, 多布拉多卡门, 卡布雷拉胡利安, 卡莫纳卡洛斯, 莫兰弗朗西斯科, 加西亚纳西索
Original Assignee: 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Application filed by 华为技术有限公司
Publication of WO2020125643A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/76: Television signal recording
    • H04N 5/04: Synchronising
    • H04N 5/06: Generation of synchronising signals
    • H04N 5/067: Arrangements or circuits at the transmitter end
    • H04N 5/073: Arrangements or circuits at the transmitter end for mutually locking plural sources of synchronising signals, e.g. studios or relay stations
    • H04N 5/0733: Arrangements or circuits at the transmitter end for mutually locking plural sources of synchronising signals, e.g. studios or relay stations, for distributing synchronisation pulses to different TV cameras
    • H04N 7/00: Television systems
    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/181: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast, for receiving images from a plurality of remote sources

Definitions

  • Embodiments of the present application relate to the field of multimedia technology, and in particular, to a frame synchronization method and device.
  • a hardware trigger mechanism is usually used to accurately synchronize each video recording device, that is, the frame synchronization of each video recording device is triggered by the hardware device.
  • each laser beam generated by a laser beam splitter is connected to a photoelectric converter through an optical fiber.
  • the photoelectric converter converts the laser signal into an electrical pulse signal.
  • the technical problem to be solved by the embodiments of the present application is to provide a frame synchronization method and a device thereof, which realize frame synchronization of multiple camera devices through software, with low cost and a wide range of applications.
  • a first aspect of an embodiment of the present application provides a frame synchronization method, including:
  • the second camera device obtains the time stamp of each frame in consecutive N frames of the first camera device, where N is a positive integer greater than 1;
  • the second camera device intercepts consecutive N frames of the second camera device and collects the time stamp of each frame in the consecutive N frames of the second camera device;
  • the second camera device determines, from the consecutive N frames of the first camera device, the aligned frame of each frame in the consecutive N frames of the second camera device according to the time stamp of each frame in the consecutive N frames of the first camera device and the time stamp of each frame in the consecutive N frames of the second camera device;
  • the second camera device determines the average delay according to the time stamp of each frame in the consecutive N frames of the second camera device and the time stamp of the aligned frame of each frame in the consecutive N frames of the second camera device;
  • when the average delay meets the synchronization condition, the second camera device determines that it is frame synchronized with the first camera device.
  • in this way, the second camera device determines the average delay by aligning each frame in its consecutive N frames and then judges, based on the average delay, whether it is frame synchronized with the first camera device: if the average delay meets the synchronization condition, the second camera device is frame synchronized with the first camera device; if the average delay does not meet the synchronization condition, the second camera device needs to be restarted so that it becomes frame synchronized with the first camera device. Frame synchronization of multiple camera devices can thus be achieved through software, with low cost and a wide application range. A sketch of this procedure is given below.
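  • As an illustration only (not part of the application's text), the following Python sketch shows the second camera device's restart loop, assuming hypothetical device APIs (get_recent_timestamps, capture_timestamps, restart); the compute_average_delay helper is sketched further below, after the average-delay formula.

```python
def synchronize_with_first_device(second_device, first_device, n, threshold):
    """Restart the second camera device until the average delay to the first
    camera device meets the synchronization condition (average delay < threshold).
    The device objects and their methods are illustrative assumptions."""
    while True:
        first_ts = first_device.get_recent_timestamps(n)   # time stamps of N consecutive frames
        second_ts = second_device.capture_timestamps(n)    # intercept own N consecutive frames
        if compute_average_delay(first_ts, second_ts) < threshold:
            return  # frame synchronized with the first camera device
        second_device.restart()  # re-initialize and try again
```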
  • when the average delay does not meet the synchronization condition, the second camera device restarts and again obtains the time stamp of each frame in consecutive N frames of the first camera device; the second camera device again intercepts consecutive N frames of its own and collects the time stamp of each frame in the again-intercepted consecutive N frames;
  • according to the again-acquired time stamps of the consecutive N frames of the first camera device and the time stamps of the again-intercepted consecutive N frames of the second camera device, the second camera device determines, from the consecutive N frames of the first camera device, the aligned frame of each frame in its again-intercepted consecutive N frames;
  • when the newly determined average delay meets the synchronization condition, the second camera device determines that it is frame synchronized with the first camera device.
  • in other words, when the average delay does not meet the synchronization condition, the second camera device restarts to promote frame synchronization with the first camera device; if the synchronization condition is still not met after the restart, the second camera device restarts again. The second camera device can be restarted multiple times until the average delay meets the synchronization condition, so that the second camera device is frame synchronized with the first camera device.
  • if the average delay is less than the threshold, the second camera device may determine that the average delay meets the synchronization condition; if the average delay is greater than or equal to the threshold, the second camera device may determine that the average delay does not meet the synchronization condition.
  • the threshold is an empirical value related to the camera device type and the frame rate and obtained through calculation and debugging; its specific value is not limited in the embodiments of the present application.
  • each camera device in the frame synchronization system can determine its own start frame, where the start frame is the frame at which image acquisition starts, that is, the first frame of the formally collected image or video.
  • the second camera device determining the start frame of the second camera device includes: the second camera device obtains a reference time stamp, where the reference time stamp is the time stamp of the start frame selected by the reference camera device, and the reference camera device is the master camera device among the camera devices managed by the master node; the second camera device determines the start frame of the second camera device according to the reference time stamp.
  • each camera device determines its start frame according to the reference time stamp, which can effectively shorten the time deviation during the formal image acquisition process, so that the images collected by the camera devices can be better processed.
  • specifically, the second camera device may select, from the frame time stamp sequence of the second camera device, the frame whose time stamp differs from the reference time stamp by a value within a preset range, and determine it as the start frame of the second camera device; the preset range is [-Fp/2, Fp/2], where Fp is the single-frame time interval.
  • keeping the difference between the time stamp of the start frame of the second camera device and the reference time stamp within the preset range effectively reduces the time deviation between the second camera device and the reference camera device during the formal image acquisition process; a small sketch of this selection follows.
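  • A minimal sketch (an assumption, not the application's code) of selecting the start frame whose time stamp lies within [-Fp/2, Fp/2] of the reference time stamp:

```python
def select_start_frame(frame_timestamps, reference_ts, frame_interval):
    """Return the index of the frame whose time stamp differs from the reference
    time stamp by a value within [-Fp/2, Fp/2], or None if no frame qualifies.
    If several frames qualify, the one closest to the reference is chosen."""
    half = frame_interval / 2.0
    candidates = [
        (abs(ts - reference_ts), i)
        for i, ts in enumerate(frame_timestamps)
        if -half <= ts - reference_ts <= half
    ]
    return min(candidates)[1] if candidates else None
```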
  • the first camera device is the reference camera device
  • the second camera device is the master camera device among the camera devices managed by the slave node.
  • in this case, the second camera device acquires the time stamp of each frame in the consecutive N frames of the first camera device from the slave node to which it belongs through memory sharing.
  • specifically, the first camera device synchronizes the time stamp of each frame in its consecutive N frames to the master node through memory sharing;
  • the master node synchronizes the time stamp of each frame in the consecutive N frames of the first camera device to the slave node to which the second camera device belongs through a distributed inter-process message;
  • the slave node then synchronizes the time stamp of each frame in the consecutive N frames of the first camera device to the second camera device through memory sharing. A conceptual sketch of this propagation path is given below.
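  • The propagation path can be pictured with the following conceptual sketch (an assumption for illustration only; ordinary Python objects stand in for the shared memory regions and for the distributed inter-process messaging interface):

```python
from queue import Queue

def propagate_timestamps(reference_timestamps, master_shared: dict,
                         node_link: Queue, slave_shared: dict):
    """First camera device -> master node (memory sharing) -> slave node
    (distributed inter-process message) -> second camera device (memory sharing)."""
    # First camera device writes its N time stamps into the master node's shared memory.
    master_shared["first_device_timestamps"] = list(reference_timestamps)
    # Master node forwards them to the slave node over the inter-node message channel.
    node_link.put(master_shared["first_device_timestamps"])
    # Slave node receives the message and exposes it to the second camera device
    # through its own shared memory.
    slave_shared["first_device_timestamps"] = node_link.get()
    return slave_shared["first_device_timestamps"]
```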
  • the first camera device is a reference camera device
  • the second camera device is a slave camera device among the camera devices managed by the master node.
  • in this case, the second camera device directly acquires, from the first camera device through memory sharing, the time stamp of each frame in the consecutive N frames of the first camera device.
  • the first camera device is the master camera device among the camera devices managed by a slave node
  • the second camera device is a slave camera device among the camera devices managed by the slave node.
  • in this case, the second camera device directly acquires, from the first camera device through memory sharing, the time stamp of each frame in the consecutive N frames of the first camera device.
  • the reference camera device is still the master camera device among the camera devices managed by the master node.
  • when the first camera device is the master camera device among the camera devices managed by the slave node and the second camera device is a slave camera device among the camera devices managed by the slave node, if the first camera device has obtained the time stamp of each frame in the consecutive N frames of the reference camera device, the first camera device can synchronize the time stamp of each frame in the consecutive N frames of the reference camera device to the second camera device through memory sharing.
  • the second camera device then determines, from the consecutive N frames of the reference camera device, the aligned frame of each frame in the consecutive N frames of the second camera device according to the time stamp of each frame in the consecutive N frames of the reference camera device and the time stamp of each frame in the consecutive N frames of the second camera device.
  • the second camera device intercepts consecutive N frames of the second camera device from the frame time stamp sequence of the second camera device.
  • that the second camera device determines, from the consecutive N frames of the first camera device, the aligned frame of each frame in the consecutive N frames of the second camera device according to the time stamp of each frame in the consecutive N frames of the first camera device and the time stamp of each frame in the consecutive N frames of the second camera device includes:
  • the second camera device calculates the absolute difference between the time stamp of the i-th frame in the consecutive N frames of the second camera device and the time stamp of each frame in the consecutive N frames of the first camera device, where 1≤i≤N; the second camera device then obtains, from the consecutive N frames of the first camera device, the frame corresponding to the smallest absolute difference and determines it as the aligned frame of the i-th frame in the consecutive N frames of the second camera device.
  • that the second camera device determines the average delay according to the time stamp of each frame in the consecutive N frames of the second camera device and the time stamp of the aligned frame of each frame in the consecutive N frames of the second camera device includes:
  • the second camera device obtains the absolute difference between the time stamp of the i-th frame in the consecutive N frames of the second camera device and the time stamp of the aligned frame of the i-th frame, where 1≤i≤N, and calculates the average delay according to the calculation formula.
  • the calculation formula is: D_avg = (1/N) × Σ_{i=1}^{N} |t_i - t_i'|
  • where D_avg is the average delay, t_i is the time stamp of the i-th frame in the consecutive N frames of the second camera device, and t_i' is the time stamp of the aligned frame of the i-th frame.
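  • The aligned-frame selection and the average-delay formula can be written out as the following illustrative Python (an assumption about how the computation might be coded, not the application's implementation):

```python
def align_frames(first_ts, second_ts):
    """For each time stamp t_i of the second camera device, return the time stamp
    t_i' of the first camera device's frame with the smallest |t_i - t|."""
    return [min(first_ts, key=lambda t: abs(t_i - t)) for t_i in second_ts]

def compute_average_delay(first_ts, second_ts):
    """D_avg = (1/N) * sum over i of |t_i - t_i'|, taken over the N consecutive
    frames of the second camera device."""
    aligned = align_frames(first_ts, second_ts)
    return sum(abs(t_i - t_ip) for t_i, t_ip in zip(second_ts, aligned)) / len(second_ts)

# Example with made-up time stamps in milliseconds:
# first_ts  = [0.0, 33.3, 66.7, 100.0, 133.3]
# second_ts = [10.0, 43.3, 76.7, 110.0, 143.3]
# compute_average_delay(first_ts, second_ts) is approximately 10.0
```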
  • a second aspect of an embodiment of the present application provides a second camera device, and the second camera device has a function of implementing the method provided in the first aspect.
  • the function can be realized by hardware, or can also be realized by hardware executing corresponding software.
  • the hardware or software includes one or more modules corresponding to the above functions.
  • the second camera device includes: a transceiver module and a processing module; the transceiver module is used to obtain a time stamp of each frame in consecutive N frames of the first camera device, where N is a positive integer greater than 1 ;
  • the processing module is used to: intercept consecutive N frames of the second camera device and collect the time stamp of each frame in the consecutive N frames of the second camera device; determine, from the consecutive N frames of the first camera device, the aligned frame of each frame in the consecutive N frames of the second camera device according to the time stamp of each frame in the consecutive N frames of the first camera device and the time stamp of each frame in the consecutive N frames of the second camera device; determine the average delay according to the time stamp of each frame in the consecutive N frames of the second camera device and the time stamp of the aligned frame of each frame in the consecutive N frames of the second camera device; and, when the average delay meets the synchronization condition, determine that the second camera device is frame synchronized with the first camera device.
  • the second camera device includes: a processor, a transceiver, and a memory, where the transceiver is used to receive and send information, the memory stores a computer program including program instructions, and the processor is connected to the memory and the transceiver through a bus; the processor executes the program instructions stored in the memory to cause the second camera device to perform the following operations: control the transceiver to obtain the time stamp of each frame in consecutive N frames of the first camera device, where N is a positive integer greater than 1; intercept consecutive N frames of the second camera device and collect the time stamp of each frame in the consecutive N frames of the second camera device; determine, from the consecutive N frames of the first camera device, the aligned frame of each frame in the consecutive N frames of the second camera device according to the time stamp of each frame in the consecutive N frames of the first camera device and the time stamp of each frame in the consecutive N frames of the second camera device; determine the average delay according to the time stamp of each frame in the consecutive N frames of the second camera device and the time stamp of the aligned frame of each frame in the consecutive N frames of the second camera device; and, when the average delay meets the synchronization condition, determine that the second camera device is frame synchronized with the first camera device.
  • a third aspect of the embodiments of the present application provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and the computer program includes program instructions, and the program instructions, when executed by a processor, cause the processor to execute the first Aspect of the method.
  • a fourth aspect of the embodiments of the present application provides a computer program product containing instructions that, when run on a computer, cause the computer to execute the method described in the first aspect above.
  • a fifth aspect of an embodiment of the present application provides a frame synchronization method, including:
  • the first camera device intercepts consecutive N frames of the first camera device and collects the time stamp of each frame in the consecutive N frames of the first camera device, where N is a positive integer greater than 1;
  • the first imaging device synchronizes the time stamp of each frame in consecutive N frames of the first imaging device to the second imaging device.
  • the first camera device synchronizes the time stamp of each frame in its consecutive N frames to the second camera device, so that the second camera device can use the time stamp of each frame in the consecutive N frames of the first camera device to determine whether the second camera device and the first camera device are frame synchronized. A small sketch of the first camera device's side follows.
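  • As a small illustrative sketch (the helper names are assumptions, and taking the last N entries is only one possible choice, since the application does not limit how the consecutive N frames are intercepted):

```python
def publish_recent_timestamps(first_device, send_to_second, n):
    """Intercept N consecutive frames from the first camera device's frame time
    stamp sequence and synchronize their time stamps to the second camera device."""
    frame_ts_sequence = first_device.frame_timestamps()   # hypothetical API
    consecutive_n = frame_ts_sequence[-n:]                 # one possible choice of N consecutive frames
    send_to_second(consecutive_n)                          # e.g. via memory sharing or a node message
    return consecutive_n
```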
  • the first camera device intercepts consecutive N frames of the first camera device from the frame time stamp sequence of the first camera device.
  • the first camera device is a reference camera device
  • the reference camera device is a master camera device among the camera devices managed by the master node
  • the second camera device is a slave camera device among the camera devices managed by the master node
  • the first imaging device directly synchronizes the time stamp of each frame in consecutive N frames of the first imaging device to the second imaging device through memory sharing.
  • the first camera device is a reference camera device
  • the reference camera device is a master camera device among the camera devices managed by the master node
  • the second camera device is a master camera device among the camera devices managed by the slave nodes
  • the first camera device synchronizes the time stamp of each frame in its consecutive N frames to the master node through memory sharing, so that the master node synchronizes the time stamp of each frame in the consecutive N frames of the first camera device to the slave node, and the slave node in turn synchronizes the time stamp of each frame in the consecutive N frames of the first camera device to the second camera device through memory sharing.
  • the first imaging device is a reference imaging device
  • the first camera device determines the start frame of the first camera device and the time stamp of the start frame, uses the time stamp of the start frame as a reference time stamp, and synchronizes the reference time stamp to the second camera device; the reference time stamp is used by the second camera device to determine the start frame of the second camera device.
  • the process of synchronizing the reference time stamp to the second camera device by the first camera device is the same as the process of synchronizing the time stamp of each frame in consecutive N frames of the first camera device to the second camera device.
  • the first camera device is the master camera device of the camera device managed by the slave node
  • the second camera device is the slave camera device of the camera devices managed by the slave node.
  • in this case, the first camera device synchronizes the time stamp of each frame in its consecutive N frames to the second camera device through memory sharing.
  • the reference imaging device is still the main imaging device among the imaging devices managed by the master node.
  • in a case where the first camera device is the master camera device among the camera devices managed by a slave node, the first camera device obtains a reference time stamp, where the reference time stamp is the time stamp of the start frame selected by the reference camera device; the first camera device determines its start frame according to the reference time stamp, which can effectively shorten the time deviation between the first camera device and the reference camera device during the formal image acquisition process.
  • a sixth aspect of the embodiments of the present application provides a first camera device, and the first camera device has a function of implementing the method provided in the fifth aspect.
  • the function can be realized by hardware, or can also be realized by hardware executing corresponding software.
  • the hardware or software includes one or more modules corresponding to the above functions.
  • the first camera device includes: a transceiver module and a processing module; the processing module is configured to intercept consecutive N frames of the first camera device and collect each frame of the consecutive N frames of the first camera device Timestamp, N is a positive integer greater than 1, and the transceiver module is used to synchronize the timestamp of each frame in consecutive N frames of the first camera device to the second camera device.
  • the first camera device includes: a processor, a transceiver, and a memory, where the transceiver is used to receive and send information, and a computer program is stored in the memory, and the computer program includes program instructions, and the processor passes the bus Connected to the memory and the transceiver, the processor executes the program instructions stored in the memory to make the first imaging device perform the following operations: intercept consecutive N frames of the first imaging device, and collect each of the consecutive N frames of the first imaging device The timestamp of the frame, N is a positive integer greater than 1, and the transceiver is controlled to synchronize the timestamp of each frame in consecutive N frames of the first camera to the second camera.
  • a seventh aspect of the embodiments of the present application provides a computer-readable storage medium.
  • the computer-readable storage medium stores a computer program, and the computer program includes program instructions.
  • when the program instructions are executed by a processor, the processor is caused to execute the method of the fifth aspect.
  • An eighth aspect of an embodiment of the present application provides a computer program product containing instructions that, when run on a computer, cause the computer to perform the method described in the fifth aspect above.
  • a ninth aspect of an embodiment of the present application provides a frame synchronization system, including:
  • the first camera device intercepts consecutive N frames of the first camera device and collects the time stamp of each frame in the consecutive N frames of the first camera device, and synchronizes the time stamp of each frame in the consecutive N frames of the first camera device to the second camera device, where N is a positive integer greater than 1;
  • the second camera device intercepts consecutive N frames of the second camera device and collects the time stamp of each frame in the consecutive N frames of the second camera device; determines, from the consecutive N frames of the first camera device, the aligned frame of each frame in the consecutive N frames of the second camera device according to the time stamp of each frame in the consecutive N frames of the first camera device and the time stamp of each frame in the consecutive N frames of the second camera device; and determines the average delay according to the time stamp of each frame in the consecutive N frames of the second camera device and the time stamp of the aligned frame of each frame in the consecutive N frames of the second camera device; when the average delay meets the synchronization condition, the second camera device determines that it is frame synchronized with the first camera device.
  • a tenth aspect of an embodiment of the present application provides a frame synchronization system, including a first camera device and a second camera device, the second camera device is used to perform the method provided in the first aspect, and the first camera device is used to perform the fifth aspect provided Methods.
  • the first camera device is a reference camera device
  • the reference camera device is the master camera device of the camera devices managed by the master node
  • the second camera device is a slave camera device among the camera devices managed by the master node.
  • the first camera device is a reference camera device
  • the reference camera device is a master camera device among the camera devices managed by the master node
  • the second camera device is the master camera device among the camera devices managed by a slave node.
  • the first camera device is a master camera device of the camera device managed by the slave node
  • the second camera device is a slave camera device of the camera devices managed by the slave node
  • the reference camera device is still the main camera device among the camera devices managed by the master node.
  • FIG. 1 is a schematic diagram of a network architecture applying an embodiment of the present application
  • FIG. 2 is a schematic flowchart of a frame synchronization method provided in Embodiment 1 of the present application;
  • FIG. 3 is a schematic diagram of determining an aligned frame from a main camera device provided by an embodiment of the present application
  • FIG. 4 is a schematic diagram of determining whether frame synchronization is performed from a master camera device according to an embodiment of the present application
  • FIG. 5 is a schematic diagram of determining a start frame provided by an embodiment of this application.
  • FIG. 6 is a schematic flowchart of a frame synchronization method according to Embodiment 2 of the present application.
  • FIG. 7 is a schematic diagram of a logical structure of a frame synchronization device provided by an embodiment of the present application.
  • FIG. 8 is a simplified schematic diagram of a physical structure of a frame synchronization device provided by an embodiment of the present application.
  • At least one of the following or a similar expression refers to any combination of these items, including any combination of a single item or a plurality of items.
  • "at least one of a, b, or c" can represent: a, b, c, ab, ac, bc, or abc, where each of a, b, and c can be singular or plural.
  • the words "first" and "second" are used to distinguish items whose functions and effects are basically the same. Those skilled in the art may understand that the words "first" and "second" do not limit the number or the execution order, and items described as "first" and "second" are not necessarily different.
  • the network architecture and business scenarios described in the embodiments of the present application are intended to more clearly explain the technical solutions of the embodiments of the present application, and do not constitute a limitation on the technical solutions provided by the embodiments of the present application.
  • the technical solutions provided by the embodiments of the present application are also applicable to similar technical problems.
  • the frame synchronization system can also be called a node system.
  • the node system includes a master node and a slave node.
  • the node is responsible for managing one or more camera devices.
  • the camera devices managed by the node can be divided into master camera devices and slave camera devices.
  • the frame synchronization system can be used to achieve frame synchronization of multiple camera devices, so that multiple camera devices can collaboratively record video and coordinate monitoring.
  • the camera equipment managed by this node must include the master camera equipment.
  • the number of slave nodes is one or more, depending on the specific situation.
  • the camera equipment managed by the slave node performs frame synchronization through the master camera equipment managed by the master node.
  • the master master camera device, that is, the master camera device among the camera devices managed by the master node, is the reference camera device in the frame synchronization system.
  • the frame synchronization between the nodes and the frame synchronization within the nodes refer to the time stamp of the main master camera device. After the main camera device is turned on, it is initialized only once.
  • among the camera devices managed by the master node, the camera devices other than the master master camera device are called master-slave camera devices, and may also be called slave camera devices of the master node.
  • the reference camera device is the master camera device among the camera devices managed by the master node, that is, the master camera device.
  • other imaging devices other than the reference imaging device need to refer to the reference imaging device for frame synchronization, and refer to the reference time stamp of the reference imaging device to determine their respective start frames. It should be noted that other names used to describe the reference imaging device should fall within the protection scope of the embodiments of the present application, for example, refer to the imaging device.
  • during frame synchronization between nodes, the slave master camera device refers to the time stamp of the master master camera device for frame synchronization; during frame synchronization within a slave node, the slave camera devices of that slave node refer to the time stamp of its master camera device (the slave master camera device) for frame synchronization.
  • among the camera devices managed by a slave node, the camera devices other than the slave master camera device are called slave-slave camera devices, and may also be called slave camera devices of the slave node.
  • the slave camera device is connected to the master node or slave node to participate in frame synchronization within the node. If a slave camera device belongs to the master node, the slave camera device performs frame synchronization with the master master camera device; if a slave camera device belongs to the slave node, the slave camera device synchronizes frame with the slave master camera device.
  • Reference timestamp: the timestamp of the start frame selected by the master master camera device. It is used by all camera devices to perform synchronization time calibration, that is, each camera device determines its own start frame according to the reference time stamp, which can effectively shorten the deviation between the time stamps of the start frames of the camera devices, so that the deviation can be controlled below the millisecond level and the images collected by multiple camera devices can be better connected.
  • the start frame is the frame at which the image or video starts to be collected, that is, the first frame of the officially collected image or video.
  • FIG. 1 is a schematic diagram of a network architecture to which an embodiment of the present application is applied.
  • the schematic diagram of the network architecture includes a master node 101, a slave node 102, a master master camera device 103, a master slave camera device 104, a slave master camera device 105, and a slave camera Device 106.
  • the form and number of devices shown in FIG. 1 do not constitute a limitation on the embodiments of the present application. In the embodiments of the present application, there is only one master node, and there may be one or more slave nodes; FIG. 1 shows only one master-slave camera device and one slave-slave camera device, but there may be one or more of each, and their numbers may or may not be equal, depending on the specific situation.
  • the camera device may be the monitor shown in FIG. 1 or other types of monitors, or various types of cameras, video recording devices, or image acquisition devices.
  • the master node 101 and the slave node 102 may be computing nodes, and each computing node manages one or more camera devices.
  • the administrator can set which computing node is the master node and which computing nodes are slave nodes, or the frame synchronization system can independently set the master node and slave nodes.
  • the master node 101 is responsible for managing the master master camera device 103 and the master slave camera device 104.
  • the master node 101 may select one camera device from the camera devices it manages as the master camera device of the master node 101, that is, the master master camera device, and the remaining camera devices serve as slave camera devices of the master node 101, that is, master-slave camera devices.
  • the administrator may also select one camera device from the camera devices managed by the master node 101 as the master camera device.
  • the master master camera device 103 is a reference camera device for the entire frame synchronization system, that is, the master-slave camera device 104 in the master node 101 needs to refer to the time stamp of the master master camera device 103 for frame synchronization, and the camera device managed by the slave node 102 also needs Frame synchronization is performed with reference to the time stamp of the main master imaging device 103.
  • the master-slave imaging device 104 performs frame synchronization with the master-master imaging device 103, and frame synchronization within the master node 101 can be completed.
  • the slave node 102 is responsible for managing the slave master camera device 105 and the slave camera device 106. Taking one slave node 102 as an example, the slave node 102 can select one camera device from the camera devices it manages as the master camera device of the slave node 102, that is, the slave master camera device, and the remaining camera devices serve as slave camera devices of the slave node 102, that is, slave-slave camera devices. The administrator may also select one camera device from the camera devices managed by the slave node 102 as the slave master camera device.
  • frame synchronization between the slave master camera device 105 and the master master camera device 103 completes the frame synchronization between the master node 101 and the slave node 102.
  • the frame synchronization in the slave node 102 can be completed by performing frame synchronization between the slave imaging device 106 and the slave master imaging device 105.
  • master node, slave node, master master camera device, master slave camera device, slave master camera device, slave camera device, etc. in FIG. 1 are only one name, and the name does not constitute a limitation on the device itself. Other names used to describe these names should fall within the protection scope of the embodiments of the present application.
  • the embodiments of the present application provide a method and device for implementing frame synchronization through software, which have low cost and wide application range.
  • a method for realizing frame synchronization through software uses target feature points to synchronize each camera.
  • the method includes: acquiring position information obtained by a first camera and a second camera during the movement of target feature points, including a first feature point position sequence obtained by the first camera and a second feature point position sequence obtained by the second camera, where the first feature point position sequence includes first position information of the target feature points and the second feature point position sequence includes second position information of the target feature points; obtaining a second feature point estimated position sequence of the target feature points according to the first feature point position sequence and a homography matrix; and obtaining a sequence difference value according to the second feature point estimated position sequence and the second feature point position sequence;
  • when the sequence difference value meets a preset synchronization condition, the frame synchronization position of the first camera and the second camera is determined.
  • this method uses feature points to complete frame synchronization, but it completes frame synchronization during the shooting process and cannot achieve frame synchronization between the start of shooting and the acquisition of the feature points.
  • the frame synchronization of each camera can be completed before the formal shooting, that is, the frame synchronization can be completed in the preparation stage of multi-camera collaborative recording, and the frame synchronization process does not affect the formal video recording process, thereby ensuring the official video recording process Integrity.
  • the embodiments of the present application can be applied to scenarios where multiple cameras work together, for example, a scenario where multiple cameras record video together, and can also be applied to video surveillance scenarios, such as deploying multiple surveillance cameras in a crowded square: based on the information about a criminal suspect in a certain frame of one surveillance camera, the frames of the other cameras at the same time point can be quickly extracted, so that complete information about the suspect can be obtained.
  • FIG. 2 is a schematic flowchart of the frame synchronization method provided in Embodiment 1 of the present application.
  • the method may include but not limited to the following steps:
  • Step S201 Determine the node role.
  • the administrator or the frame synchronization system divides the roles of each node in the frame synchronization system and of the camera devices managed by each node, specifying which node is the master node and which nodes are slave nodes, which of the camera devices managed by the master node is the master master camera device and which are master-slave camera devices, and which of the camera devices managed by each slave node is the slave master camera device and which are slave-slave camera devices. The administrator can divide the roles before each camera device is initialized, and the roles can also be set by the frame synchronization system by default.
  • each node and each camera device can clearly know the respective roles, so that the respective roles can be determined so as to perform the functions of the respective roles.
  • Step S202 node system time synchronization.
  • the node system includes a master node and slave nodes; there is only one master node and there are one or more slave nodes. Before the camera devices perform frame synchronization, the nodes perform system time synchronization. Each node can use the precision time protocol (PTP) to achieve system time synchronization; PTP can achieve nanosecond-level synchronization.
  • the system time between nodes can be shared through the distributed inter-process messaging interface, for example, the master node passes the system time of the master node to the slave node through the distributed inter-process message, so that the slave node synchronizes the system time according to the system time of the master node .
  • the time stamp information between the nodes can be shared through the distributed inter-process message transfer interface, and the time stamp information within the nodes can be directly shared through the memory.
  • the timestamp information may include one or more timestamps, which may be timestamp information of the system time, or timestamp information of the main master camera device.
  • the time stamp information synchronization between the master master camera device and the slave master camera device may include: the master master camera device synchronizes its time stamp information to the master node through memory sharing, and the master node synchronizes the master master with distributed inter-process messages The time stamp information of the camera device is synchronized to the slave node, and the slave node synchronizes the time stamp information of the master camera device to the slave camera device through memory sharing.
  • the time stamp information synchronization between the master and master camera devices and the master and slave camera devices may include: the master master camera device synchronizes its time stamp information to the master and slave camera devices through memory sharing.
  • the synchronization of time stamp information between the slave master camera device and a slave-slave camera device may include: the slave master camera device synchronizes its own time stamp information to the slave-slave camera device through memory sharing, or the slave master camera device synchronizes the time stamp information of the master master camera device to the slave-slave camera device through memory sharing.
  • Step S203 Initialization between nodes.
  • the initialization between nodes may include the initialization of the master camera device and the slave camera device. Initialization can be achieved by turning the camera device on and off.
  • the master master camera device is initialized only once, that is, the master master camera device is only turned on once; the slave master camera device can be initialized multiple times: when frame synchronization between the slave master camera device and the master master camera device fails, the slave master camera device is initialized again, that is, the slave master camera device is restarted and performs frame synchronization with the master master camera device again.
  • Step S204 the slave master imaging device and the master master imaging device perform frame synchronization.
  • the main master imaging device intercepts consecutive N frames of the main master imaging device and collects the time stamp of each frame in the continuous N frames of the main master imaging device.
  • the main master imaging device may intercept consecutive N frames of the main master imaging device from the frame time stamp sequence of the main master imaging device.
  • N is a positive integer greater than 1.
  • after collecting the time stamp of each frame in its consecutive N frames, the master master camera device synchronizes the time stamp of each frame in its consecutive N frames to the master node through memory sharing.
  • the master node uses a distributed inter-process message to synchronize the time stamp of each frame in the consecutive N frames of the master master camera device to the slave node.
  • the slave node synchronizes the time stamp of each frame in the consecutive N frames of the master master camera device to the slave master camera device through memory sharing, so that the slave master camera device can obtain the time stamp of each frame in the consecutive N frames of the master master camera device and then determine whether frame synchronization with the master master camera device is successful.
  • the slave master imaging device intercepts consecutive N frames of the slave master imaging device, and collects the time stamp of each frame in the consecutive N frames of the slave master imaging device.
  • the slave master imaging device may intercept consecutive N frames of the slave master imaging device from the frame time stamp sequence of the slave master imaging device.
  • N is a positive integer greater than 1.
  • the manner in which the master master camera device and the slave master camera device intercept their consecutive N frames, and the order in which the consecutive N frames are intercepted, are not limited in the embodiments of the present application.
  • the number of frames intercepted by the master master camera device is the same as the number of frames intercepted by the slave master camera device.
  • the specific value of N is not limited in the embodiment of the present application, and can be set by a manager or by a frame synchronization system.
  • after acquiring the time stamp of each frame in the consecutive N frames of the master master camera device and the time stamp of each frame in its own consecutive N frames, the slave master camera device determines, from the consecutive N frames of the master master camera device, the aligned frame of each frame in the consecutive N frames of the slave master camera device according to these two sets of time stamps.
  • specifically, the slave master camera device compares the time stamp of the first frame in its consecutive N frames with the time stamp of each frame in the consecutive N frames of the master master camera device in turn to obtain the absolute differences D_n, 1≤n≤N.
  • the frame of the master master camera device corresponding to the smallest D_n is determined as the aligned frame of the first frame in the consecutive N frames of the slave master camera device.
  • following the same process, the aligned frame of the second frame in the consecutive N frames of the slave master camera device is determined, and the process is repeated until the aligned frame of the N-th frame in the consecutive N frames of the slave master camera device is determined. Each slave master camera device can use this process to determine the aligned frame of each frame in its consecutive N frames.
  • FIG. 3 is a schematic diagram of determining an aligned frame from a master camera device provided by an embodiment of the present application.
  • FIG. 3 is an example of determining the aligned frame of the first frame among the consecutive N frames of the master imaging device.
  • in FIG. 3, the time stamp of the first frame among the consecutive N frames of the slave master camera device is equal to the time stamp of the third frame among the consecutive N frames of the master master camera device, so the absolute difference between the two is 0, which is the smallest absolute difference.
  • the aligned frame of the first frame of the consecutive N frames of the slave master camera device is therefore the third frame of the consecutive N frames of the master master camera device.
  • in the same way, the slave master camera device determines the aligned frame of the second frame in its consecutive N frames, and so on, until the aligned frame of the N-th frame in its consecutive N frames is determined.
  • Step S205 Determine whether the master master camera device and the slave master camera device are successfully synchronized.
  • the slave master camera device obtains the absolute difference between the time stamp of the i-th frame in its consecutive N frames and the time stamp of the aligned frame of the i-th frame, 1≤i≤N, and calculates the average delay according to the calculation formula D_avg = (1/N) × Σ_{i=1}^{N} |t_i - t_i'|, where D_avg represents the average delay, t_i represents the time stamp of the i-th frame in the consecutive N frames of the slave master camera device, and t_i' represents the time stamp of the aligned frame of the i-th frame.
  • if the average delay is less than the threshold, it is determined that the average delay meets the synchronization condition and the slave master camera device is frame synchronized with the master master camera device, that is, the synchronization is successful; if the average delay is greater than or equal to the threshold, it is determined that the average delay does not meet the synchronization condition and the slave master camera device is not frame synchronized with the master master camera device, that is, the synchronization fails.
  • the threshold can be represented by D_max, the maximum allowable delay error for synchronization between the slave master camera device and the master master camera device; D_max is an empirical value related to the camera device type and the frame rate and is obtained through calculation and debugging, and its specific value is not limited in the embodiments of the present application.
  • if the synchronization fails, the slave master camera device re-initializes, that is, restarts: it re-acquires the time stamp of each frame in the consecutive N frames of the master master camera device, re-intercepts its own consecutive N frames and collects the time stamp of each frame in the re-intercepted consecutive N frames, re-determines the aligned frames and the average delay, and then determines whether the newly determined average delay is less than the threshold. If it is less than the threshold, frame synchronization is successful; otherwise, the slave master camera device is initialized again until the average delay is less than the threshold.
  • FIG. 4 is a schematic diagram of determining whether frame synchronization is performed by a master camera device provided by an embodiment of the present application.
  • in FIG. 4, the newly determined average delay satisfies D_avg < D_max, which meets the synchronization condition, so the slave master camera device and the master master camera device are successfully synchronized.
  • the time stamps and threshold in FIG. 4 are only examples and do not constitute a limitation on the embodiments of the present application.
  • Step S206 If the synchronization between the slave master camera device and the master master camera device is successful, initialization within the nodes is performed.
  • the slave imaging device managed by the master node is initialized and the slave imaging device managed by the slave node is initialized.
  • the slave imaging device managed by the master node is turned on, and the slave imaging device managed by the slave node is turned on.
  • Step S207 the slave camera device and the master camera device perform frame synchronization.
  • the frame synchronization between the slave camera device and the master camera device includes: the master-slave camera device refers to the master master camera device for frame synchronization, and the slave camera device refers to the slave master camera device or the master master camera device for frame synchronization.
  • the process of frame synchronization between the slave camera device and the master camera device is the same as the process of frame synchronization between the master camera device and the master camera device, and will not be repeated here.
  • the effect of the slave-slave camera device performing frame synchronization with reference to the slave master camera device is generally better than performing frame synchronization with reference to the master master camera device.
  • Step S208 Determine whether the slave camera device and the master camera device are successfully synchronized.
  • the slave camera device determines whether the synchronization with the master camera device is successful, that is, whether D_avg < D_max; if the synchronization is unsuccessful, the slave camera device re-initializes until D_avg < D_max.
  • Step S209 If the slave camera device and the master camera device are successfully synchronized, all the camera devices perform synchronization time calibration.
  • after all slave camera devices are successfully synchronized with their master camera devices, all camera devices perform synchronization time calibration. Since a camera device may perform automatic image correction or drop frames during initialization, after all camera devices have been synchronized, several frames need to be dropped before formal image acquisition starts, so as to ensure that the first frame of each video stream received by the encoder corresponds to the same time, or to a time as close as possible to it; therefore, synchronization time calibration of all camera devices is necessary.
  • the main master imaging device determines the reference time stamp, and the reference time stamp is the time stamp of the start frame of the main master imaging device, and the start frame is the first frame of the official image collection.
  • the master master camera device can predict the reference time stamp when synchronizing between the nodes, or it can predict the reference time stamp after the slave master camera device synchronizes with the master master camera device.
  • the master master camera device synchronizes the reference time stamp to the slave master camera device.
  • the slave master camera device obtains the reference time stamp, it can synchronize the reference time stamp to the slave camera device.
  • the master master imaging device may synchronize the reference time stamp with the time stamp of each frame in the consecutive N frames of the master master imaging device to the slave master imaging device, or may synchronize separately, which is not limited in the embodiments of the present application.
  • the slave master imaging device determines the start frame of the slave master imaging device according to the reference time stamp.
  • the slave master camera device selects, from the frame time stamp sequence of the slave master camera device, the frame whose time stamp differs from the reference time stamp by a value within the preset range, and determines it as the start frame of the slave master camera device.
  • the preset range is [-Fp/2, Fp/2], and Fp is a single frame time interval.
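  • For example, assuming a frame rate of 30 frames per second (a value chosen purely for illustration), the single frame time interval Fp is about 33.3 ms, so the preset range [-Fp/2, Fp/2] is approximately [-16.7 ms, 16.7 ms]; the slave master camera device then picks the frame whose time stamp lies within about ±16.7 ms of the reference time stamp.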
  • FIG. 5 is a schematic diagram of determining a start frame provided by an embodiment of the present application.
  • the slave master camera device 1 and the slave master camera device 2 are aligned with the master master camera device.
  • the reference time stamp determined by the master master camera device is the time point indicated by the black arrow, and the frame corresponding to the reference time stamp is the start frame of the master master camera device.
  • for the slave master camera device 1, the frame indicated by the black arrow is taken as its start frame: the difference between the time stamp of this frame and the reference time stamp is within the preset range, and it is the frame closest to the start frame of the master master camera device.
  • for the slave master camera device 2, the frame indicated by the black arrow is taken as its start frame: the difference between the time stamp of this frame and the reference time stamp is within the preset range, and it is the frame closest to the start frame of the master master camera device; the difference between the time stamp of the frame indicated by the gray arrow and the reference time stamp is not within the preset range, so the frame indicated by the gray arrow is not taken as the start frame of the slave master camera device 2.
  • the slave camera device managed by the master node and the slave camera device managed by the slave node also determine their respective start frames according to the reference time stamp.
  • the embodiment shown in FIG. 2 realizes frame synchronization of multiple camera devices through software and can be applied to various types of camera devices, with low cost and a wide application range. By determining aligned frames, the time deviation can be effectively shortened, so that the time deviation between the camera devices in the frame synchronization system is controlled below the millisecond level. The frame synchronization process between the camera devices is performed before formal image acquisition, so the synchronization process does not affect the formal image acquisition process, which ensures the integrity of the video recording. In addition, each camera device determines its own start frame, thereby effectively shortening the time deviation during the formal image acquisition process, so that the images collected by the camera devices can be better processed.
  • FIG. 6 is a schematic flowchart of a frame synchronization method according to Embodiment 2 of the present application.
  • the method may include but is not limited to the following steps:
  • Step S601 The first camera device intercepts consecutive N frames of the first camera device, and collects the time stamp of each frame in the consecutive N frames of the first camera device.
  • the first camera device Before performing step S601, the first camera device needs to determine whether its role is the master camera device or the slave camera device. If it is a master master camera device, the first camera device is the reference camera device of the frame synchronization system; if it is a slave master camera device, the first camera device refers to the master master camera device for frame synchronization, which is managed by the slave node to which the slave master camera device belongs The slave camera device can refer to the first camera device for frame synchronization.
  • After the first camera device is started, it may intercept consecutive N frames of the first camera device from the frame time stamp sequence of the first camera device and collect the time stamp of each frame in those N frames. How the first camera device intercepts the consecutive N frames from its frame time stamp sequence is not limited in the embodiments of the present application.
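As an illustration of step S601, the sketch below captures N consecutive frames and records an arrival time stamp for each; the `camera` object and its `read()` method are hypothetical stand-ins for whatever capture API the device actually exposes.

```python
import time

def intercept_n_frames(camera, n):
    """Grab N consecutive frames and collect one time stamp per frame."""
    timestamps = []
    for _ in range(n):
        camera.read()                   # consume one frame from the hypothetical camera API
        timestamps.append(time.time())  # record the frame's arrival time in seconds
    return timestamps
```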
  • Step S602 The first camera device synchronizes the time stamp of each frame in consecutive N frames of the first camera device to the second camera device.
  • the second camera device obtains the time stamp of each frame in the consecutive N frames of the first camera device.
  • In one possible implementation, the first camera device is the master master camera device and the second camera device is a slave master camera device. In this case, the first camera device synchronizes the time stamp of each frame in the consecutive N frames of the first camera device to the master node through memory sharing; the master node synchronizes these time stamps to the slave node to which the second camera device belongs through a distributed process message; and the slave node synchronizes them to the second camera device through memory sharing.
  • In another possible implementation, the first camera device is the master master camera device and the second camera device is a slave camera device managed by the master node. In this case, the first camera device directly synchronizes the time stamp of each frame in the consecutive N frames of the first camera device to the second camera device through memory sharing.
  • In yet another possible implementation, the first camera device is a slave master camera device and the second camera device is a slave slave camera device managed by the same slave node. In this case, the first camera device directly synchronizes the time stamp of each frame in the consecutive N frames of the first camera device to the second camera device through memory sharing. Alternatively, when the first camera device has acquired the time stamp of each frame in the consecutive N frames of the master master camera device, it synchronizes those time stamps to the second camera device through memory sharing.
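The propagation path described above (camera to master node, master node to slave node, slave node to camera) can be pictured with the toy model below, in which in-process dictionaries stand in for shared memory and a queue stands in for the distributed process message between nodes; all names are illustrative assumptions and not part of the patent.

```python
import queue

master_node_memory = {}             # shared memory on the master node
slave_node_memory = {}              # shared memory on a slave node
inter_node_channel = queue.Queue()  # stands in for the distributed process message

def master_camera_publish(timestamps):
    # master master camera -> master node, via memory sharing
    master_node_memory["first_camera_timestamps"] = list(timestamps)
    # master node -> slave node, via a distributed process message
    inter_node_channel.put(list(timestamps))

def slave_node_receive():
    # slave node stores the received time stamps in its own shared memory
    slave_node_memory["first_camera_timestamps"] = inter_node_channel.get()

def slave_master_camera_fetch():
    # slave master camera reads the slave node's shared memory
    return slave_node_memory["first_camera_timestamps"]
```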
  • Step S603 The second camera device intercepts consecutive N frames of the second camera device, and collects the time stamp of each frame in the consecutive N frames of the second camera device.
  • After the second camera device is started, it may intercept consecutive N frames of the second camera device from the frame time stamp sequence of the second camera device and collect the time stamp of each frame in those N frames. How the second camera device intercepts the consecutive N frames from its frame time stamp sequence is not limited in the embodiments of the present application.
  • Step S604: Based on the time stamp of each frame in the consecutive N frames of the first camera device and the time stamp of each frame in the consecutive N frames of the second camera device, the second camera device determines, from the consecutive N frames of the first camera device, the aligned frame of each frame in the consecutive N frames of the second camera device.
  • Specifically, the second camera device calculates the absolute difference between the time stamp of the i-th frame in the consecutive N frames of the second camera device and the time stamp of each frame in the consecutive N frames of the first camera device, where 1 ≤ i ≤ N. From the consecutive N frames of the first camera device, it obtains the frame corresponding to the smallest absolute difference and determines it as the aligned frame of the i-th frame of the second camera device. In this manner, the aligned frame of the first frame of the second camera device is determined first, and the process is repeated until the aligned frame of every frame in the consecutive N frames of the second camera device is determined. For details, refer to the description of determining aligned frames in step S204 of the embodiment shown in FIG. 2.
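The alignment rule in step S604 amounts to a nearest-neighbour search on time stamps. A minimal sketch, assuming both cameras expose their N time stamps as lists of numbers:

```python
def find_aligned_frames(first_timestamps, second_timestamps):
    """For each frame of the second camera, return the index of the first camera's
    frame whose time stamp has the smallest absolute difference."""
    aligned_indices = []
    for t in second_timestamps:
        differences = [abs(t - t_ref) for t_ref in first_timestamps]
        aligned_indices.append(differences.index(min(differences)))
    return aligned_indices
```

Ties are broken here toward the earlier frame, a detail the patent does not specify.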
  • Step S605 The second camera device determines the average time delay according to the time stamp of each frame in consecutive N frames of the second camera device and the time stamp of the aligned frame of each frame in consecutive N frames of the second camera device.
  • For the specific implementation of step S605, reference may be made to the description of determining the average delay in the embodiment shown in FIG. 2, and details are not described here again.
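The average delay of step S605 is the mean absolute difference between each of the second camera's N time stamps and the time stamp of its aligned frame; a sketch under the same assumptions as the previous snippet:

```python
def average_delay(second_timestamps, aligned_timestamps):
    """D_avg = (1/N) * sum(|t_i - t_i'|) over the N frame pairs."""
    assert len(second_timestamps) == len(aligned_timestamps)
    n = len(second_timestamps)
    return sum(abs(t - t_prime)
               for t, t_prime in zip(second_timestamps, aligned_timestamps)) / n
```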
  • Step S606: If the average delay meets the synchronization condition, the second camera device determines that it is frame synchronized with the first camera device.
  • The synchronization condition may include a threshold. If the average delay is less than the threshold, it can be determined that the average delay meets the synchronization condition; in this case, the second camera device can determine that it is frame synchronized with the first camera device, that is, the second camera device and the first camera device are synchronized successfully.
  • If the average delay does not meet the synchronization condition, the second camera device restarts and executes steps S602-S605 again: it obtains the time stamp of each frame in the consecutive N frames of the first camera device again, intercepts consecutive N frames of the second camera device again, collects the time stamp of each frame in the re-intercepted N frames, determines the aligned frames again, determines the average delay again, and judges again whether the average delay meets the synchronization condition. If the average delay determined this time still does not satisfy the synchronization condition, the second camera device restarts once more and steps S602-S605 are repeated until the average delay meets the synchronization condition, at which point the second camera device is determined to be frame synchronized with the first camera device.
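Putting steps S602-S606 together, the retry behaviour can be sketched as the loop below. It reuses the `find_aligned_frames` and `average_delay` helpers sketched earlier; `fetch_first_camera_timestamps` and `restart_and_intercept` are hypothetical callbacks for obtaining the first camera's time stamps and for restarting the second camera and intercepting its N frames, and the bounded `max_attempts` is an assumption added for safety, since the patent simply repeats until the condition is met.

```python
def synchronize_with_first_camera(fetch_first_camera_timestamps,
                                  restart_and_intercept,
                                  threshold,
                                  max_attempts=10):
    """Repeat steps S602-S605 until the average delay meets the synchronization condition."""
    for _ in range(max_attempts):
        first_ts = fetch_first_camera_timestamps()   # S602: first camera's N time stamps
        second_ts = restart_and_intercept()          # S603: (re)start and intercept N frames
        aligned = [first_ts[i] for i in find_aligned_frames(first_ts, second_ts)]  # S604
        if average_delay(second_ts, aligned) < threshold:                          # S605/S606
            return True                              # frame synchronized
    return False                                     # still not synchronized
```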
  • In a possible implementation, when the first camera device is the master master camera device, the first camera device may determine a reference time stamp during frame synchronization with the second camera device. The reference time stamp is the time stamp of the start frame selected by the master master camera device, that is, the time stamp of the first frame of the images formally collected by the master master camera device.
  • The first camera device synchronizes the reference time stamp to the second camera device, and the second camera device determines the start frame of the second camera device according to the reference time stamp. The difference between the time stamp of the start frame of the second camera device and the reference time stamp is within a preset range, which is [-Fp/2, Fp/2], where Fp is the single-frame time interval.
  • If the second camera device is a slave master camera device, it may further synchronize the reference time stamp to the slave slave camera devices.
  • In this way, each camera device in the frame synchronization system can determine its own start frame according to the reference time stamp, which effectively shortens the time deviation during formal image acquisition and allows the images collected by the camera devices to be processed better.
  • In a possible implementation, when the first camera device is a slave master camera device, the first camera device obtains the reference time stamp. The master master camera device may synchronize the reference time stamp to the master node, the master node synchronizes it to the slave node to which the first camera device belongs, and the slave node synchronizes it to the slave master camera device, so that the slave master camera device can obtain the reference time stamp.
  • After obtaining the reference time stamp, the first camera device determines the start frame of the first camera device according to the reference time stamp. The difference between the time stamp of the start frame of the first camera device and the reference time stamp is within the preset range, which is [-Fp/2, Fp/2], where Fp is the single-frame time interval.
  • In the embodiment shown in FIG. 6, the second camera device determines, from the time stamp of each frame in the consecutive N frames of the first camera device and the time stamp of each frame in the consecutive N frames of the second camera device, the aligned frame of each frame in the consecutive N frames of the second camera device; it then determines the average delay, and when the average delay meets the synchronization condition, it determines that it is frame synchronized with the first camera device. Frame synchronization of multiple camera devices can thus be realized purely in software, without dedicated hardware, at low cost and with a wide range of application.
  • FIG. 7 is a schematic diagram of a logical structure of a frame synchronization device provided by an embodiment of the present application.
  • the frame synchronization device 70 includes a transceiver module 701 and a processing module 702.
  • The frame synchronization device 70 may be the first camera device or the second camera device.
  • For the case where the frame synchronization device 70 is the second camera device:
  • the transceiver module 701 is used to obtain a time stamp of each frame in consecutive N frames of the first camera device, where N is a positive integer greater than 1;
  • The processing module 702 is configured to: intercept consecutive N frames of the second camera device and collect the time stamp of each frame in the consecutive N frames of the second camera device; based on the time stamp of each frame in the consecutive N frames of the first camera device and the time stamp of each frame in the consecutive N frames of the second camera device, determine, from the consecutive N frames of the first camera device, the aligned frame of each frame in the consecutive N frames of the second camera device; determine the average delay according to the time stamp of each frame in the consecutive N frames of the second camera device and the time stamp of the aligned frame of each frame in the consecutive N frames of the second camera device; and, when the average delay meets the synchronization condition, determine that the second camera device is frame synchronized with the first camera device.
  • The processing module 702 is further configured to: when the average delay does not meet the synchronization condition, restart the second camera device and obtain the time stamp of each frame in the consecutive N frames of the first camera device again; intercept consecutive N frames of the second camera device again and collect the time stamp of each frame in the re-intercepted N frames; based on the time stamp of each frame in the re-acquired consecutive N frames of the first camera device and the time stamp of each frame in the re-intercepted consecutive N frames of the second camera device, determine, from the re-acquired consecutive N frames of the first camera device, the aligned frame of each frame in the re-intercepted consecutive N frames of the second camera device; determine the average delay again according to the time stamp of each frame in the re-intercepted consecutive N frames of the second camera device and the time stamp of the aligned frame of each of those frames; and, if the re-determined average delay meets the synchronization condition, determine that the second camera device is frame synchronized with the first camera device.
  • The transceiver module 701 is further configured to obtain a reference time stamp, which is the time stamp of the start frame selected by the reference camera device; the reference camera device is the master camera device among the camera devices managed by the master node.
  • the processing module 702 is also used to determine the start frame of the second camera device according to the reference time stamp.
  • When the processing module 702 is configured to determine the start frame of the second camera device according to the reference time stamp, it is specifically configured to select, from the frame time stamp sequence of the second camera device, the frame corresponding to the time stamp whose difference from the reference time stamp is within the preset range, and determine it as the start frame of the second camera device.
  • The preset range is [-Fp/2, Fp/2], where Fp is the single-frame time interval.
  • In a possible implementation, the first camera device is the reference camera device and the second camera device is the master camera device among the camera devices managed by a slave node. When the transceiver module 701 is configured to obtain the time stamp of each frame in the consecutive N frames of the first camera device, it is specifically configured to obtain these time stamps from the slave node through memory sharing.
  • In a possible implementation, the first camera device is the reference camera device and the second camera device is a slave camera device among the camera devices managed by the master node. When the transceiver module 701 is configured to obtain the time stamp of each frame in the consecutive N frames of the first camera device, it is specifically configured to obtain these time stamps from the first camera device through memory sharing.
  • In a possible implementation, the first camera device is the master camera device among the camera devices managed by a slave node and the second camera device is a slave camera device among the camera devices managed by that slave node. When the transceiver module 701 is configured to obtain the time stamp of each frame in the consecutive N frames of the first camera device, it is specifically configured to obtain these time stamps from the first camera device through memory sharing.
  • For the case where the frame synchronization device 70 is the first camera device:
  • the processing module 702 is configured to intercept consecutive N frames of the first camera device and collect time stamps of each frame in the consecutive N frames of the first camera device, where N is a positive integer greater than 1;
  • the transceiver module 701 is configured to synchronize the time stamp of each frame in consecutive N frames of the first camera device to the second camera device.
  • In a possible implementation, the first camera device is the reference camera device, the reference camera device is the master camera device among the camera devices managed by the master node, and the second camera device is a slave camera device among the camera devices managed by the master node. When the transceiver module 701 is configured to synchronize the time stamp of each frame in the consecutive N frames of the first camera device to the second camera device, it is specifically configured to synchronize these time stamps to the second camera device through memory sharing.
  • In a possible implementation, the first camera device is the reference camera device, the reference camera device is the master camera device among the camera devices managed by the master node, and the second camera device is the master camera device among the camera devices managed by a slave node. When the transceiver module 701 is configured to synchronize the time stamp of each frame in the consecutive N frames of the first camera device to the second camera device, it is specifically configured to synchronize these time stamps to the master node through memory sharing, so that the master node synchronizes them to the slave node, and so that the slave node synchronizes them to the second camera device through memory sharing.
  • In a possible implementation, after the second camera device is frame synchronized with the first camera device, the processing module 702 is further configured to determine the start frame of the first camera device and the time stamp of the start frame, and to use the time stamp of the start frame as the reference time stamp;
  • the transceiver module 701 is also used to synchronize the reference time stamp to the second camera device, and the reference time stamp is used by the second camera device to determine the start frame of the second camera device.
  • In a possible implementation, the first camera device is the master camera device among the camera devices managed by a slave node and the second camera device is a slave camera device among the camera devices managed by that slave node. When the transceiver module 701 is configured to synchronize the time stamp of each frame in the consecutive N frames of the first camera device to the second camera device, it is specifically configured to synchronize these time stamps to the second camera device through memory sharing.
  • The transceiver module 701 is further configured to obtain a reference time stamp, which is the time stamp of the start frame selected by the reference camera device; the reference camera device is the master camera device among the camera devices managed by the master node.
  • The processing module 702 is further configured to determine the start frame of the first camera device according to the reference time stamp.
  • For the case where the frame synchronization device 70 is the second camera device, the transceiver module 701 is configured to perform step S602 in the embodiment shown in FIG. 6, and the processing module 702 is configured to perform steps S603 to S605 in the embodiment shown in FIG. 6.
  • For the case where the frame synchronization device 70 is the first camera device, the transceiver module 701 is configured to perform step S602 in the embodiment shown in FIG. 6, and the processing module 702 is configured to perform step S601 in the embodiment shown in FIG. 6.
  • FIG. 8 is a simplified schematic diagram of the physical structure of the frame synchronization device provided by an embodiment of the present application.
  • the frame synchronization device may be a first camera device or a second camera device.
  • the frame synchronization device 80 includes a transceiver 801, a processor 802, and a memory 803.
  • the transceiver 801, the processor 802, and the memory 803 may be connected to each other through a bus 804, or may be connected through other methods.
  • the related functions implemented by the processing module 702 shown in FIG. 7 may be implemented by one or more processors 802.
  • the related functions implemented by the transceiver module 701 shown in FIG. 7 can be implemented by the transceiver 801.
  • The memory 803 includes, but is not limited to, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM), or a compact disc read-only memory (CD-ROM). The memory 803 is used to store related instructions and data.
  • the transceiver 801 is used to transmit data and/or signaling, and to receive data and/or signaling.
  • For the case where the frame synchronization device 80 is the first camera device, the transceiver 801 is used to communicate with the second camera device, for example, to perform step S602 in the embodiment shown in FIG. 6.
  • For the case where the frame synchronization device 80 is the second camera device, the transceiver 801 is used to communicate with the first camera device, for example, to perform step S602 in the embodiment shown in FIG. 6.
  • the processor 802 may include one or more processors, for example, one or more central processing units (CPUs).
  • When the processor 802 is a CPU, the CPU may be a single-core CPU or a multi-core CPU.
  • For the case where the frame synchronization device 80 is the first camera device, the processor 802 is used to execute step S601 in the embodiment shown in FIG. 6.
  • For the case where the frame synchronization device 80 is the second camera device, the processor 802 is used to execute steps S603 to S606 in the embodiment shown in FIG. 6.
  • the memory 803 is used to store the program code and data of the frame synchronization device 80.
  • FIG. 8 only shows a simplified design of the frame synchronization device.
  • In practical applications, the frame synchronization device may also include other necessary elements, including but not limited to any number of transceivers, processors, controllers, memories, communication units, and the like, and all devices that can implement this application fall within the protection scope of this application.
  • An embodiment of the present application further provides a frame synchronization system, which includes a first camera device and a second camera device, and further includes a master node and a slave node.
  • the first camera device is a reference camera device
  • the reference camera device is a master camera device among the camera devices managed by the master node
  • the second camera device is a slave camera device among the camera devices managed by the master node .
  • the first camera device is a reference camera device
  • the reference camera device is a master camera device among the camera devices managed by the master node
  • the second camera device is a master camera device among the camera devices managed by the slave nodes .
  • In a possible implementation, the first camera device is the master camera device among the camera devices managed by a slave node, the second camera device is a slave camera device among the camera devices managed by that slave node, and the reference camera device is still the master camera device among the camera devices managed by the master node.
  • A person of ordinary skill in the art may understand that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing relevant hardware. The program may be stored in a computer-readable storage medium, and when the program is executed, the processes of the foregoing method embodiments may be included.
  • The foregoing storage media include various media that can store program code, such as a ROM, a random access memory (RAM), a magnetic disk, or an optical disc. Accordingly, yet another embodiment of the present application provides a computer-readable storage medium that stores instructions which, when run on a computer, cause the computer to execute the methods described in the above aspects.
  • Yet another embodiment of the present application also provides a computer program product containing instructions, which, when run on a computer, causes the computer to perform the methods described in the above aspects.
  • the disclosed system, device, and method may be implemented in other ways.
  • The device embodiments described above are merely illustrative.
  • The division of the units is only a division by logical function; in actual implementation there may be other divisions, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical, or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a dedicated computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted through the computer-readable storage medium.
  • The computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or a wireless manner (for example, infrared, radio, or microwave).
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device including a server, a data center, and the like integrated with one or more available media.
  • the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)) or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

Embodiments of the present application provide a frame synchronization method and a device therefor. The method may include: a second camera device obtains the time stamp of each frame in consecutive N frames of a first camera device, N being a positive integer greater than 1; intercepts consecutive N frames of the second camera device and collects the time stamp of each frame in the consecutive N frames of the second camera device; based on the time stamp of each frame in the consecutive N frames of the first camera device and the time stamp of each frame in the consecutive N frames of the second camera device, determines, from the consecutive N frames of the first camera device, the aligned frame of each frame in the consecutive N frames of the second camera device; determines an average delay according to the time stamp of each frame in the consecutive N frames of the second camera device and the time stamp of the aligned frame of each frame in the consecutive N frames of the second camera device; and when the average delay meets a synchronization condition, the second camera device is frame synchronized with the first camera device. In the embodiments of the present application, frame synchronization of multiple camera devices is realized in software, with low cost and a wide range of application.

Description

帧同步方法及其装置 技术领域
本申请实施例涉及多媒体技术领域,具体涉及一种帧同步方法及其装置。
背景技术
随着信息化的发展,视频技术在各行各业得到广泛应用,极大的丰富和方便了人们的生活,推动了社会的快速发展。在视频领域的研究和应用工作中,多摄像机协同工作也日益普及。在自由视点电视(free-viewpoint television,FTV)、自由视点视频(free-viewpoint video,FVV)、全景视频、深度测量、三维立体测量等应用中,常常会涉及多摄像机对不同场景或不同视角的同时拍摄。在图像合成或计算的过程中,不同摄像机拍摄的帧必须进行同步。当多个摄像机拍摄的帧存在时间差异时,会导致图像拼接重影、深度估计偏差、三维重构失败等后果。因此,多摄像机的帧同步在机器视觉领域具有重要意义。
当前各种工业视频录制设备中,通常使用硬件触发机制去精确地同步各个视频录像设备,即通过硬件设备触发各个视频录像设备的帧同步。例如,利用激光分光器产生的每路激光经过光纤连接到一个光电转换器,光电转换器将激光信号转换为电脉冲信号,这些电脉冲信号作为同步信号去触发各个摄像机进行图像采集,准确性较高。
虽然通过硬件设备实现帧同步的准确性较高,但是硬件设备成本较高,应用范围有一定的局限性。
发明内容
本申请实施例所要解决的技术问题在于,提供一种帧同步方法及其装置,通过软件方式实现多个摄像设备的帧同步,成本低,应用范围广泛。
本申请实施例第一方面提供一种帧同步方法,包括:
第二摄像设备获取第一摄像设备的连续N帧中每帧的时间戳,N为大于1的正整数;
第二摄像设备截取第二摄像设备的连续N帧,并收集第二摄像设备的连续N帧中每帧的时间戳;
第二摄像设备根据第一摄像设备的连续N帧中每帧的时间戳和第二摄像设备的连续N帧中每帧的时间戳,从第一摄像设备的连续N帧中,确定第二摄像设备的连续N帧中每帧的对齐帧;
第二摄像设备根据第二摄像设备的连续N帧中每帧的时间戳以及第二摄像设备的连续N帧中每帧的对齐帧的时间戳,确定平均时延;
当平均时延满足同步条件,则第二摄像设备确定与第一摄像设备帧同步。
本申请实施例第一方面,第二摄像设备通过第二摄像设备的连续N帧中每帧的对齐帧来确定平均时延,进而根据平均时延来判断第二摄像设备与第一摄像设备是否帧同步,若平均时延满足同步条件,则第二摄像设备与第一摄像设备帧同步;若平均时延不满足同步条件,则第二摄像设备需要重新启动以使第二摄像设备与第一摄像设备帧同步,从而通过软件方式可以实现多个摄像设备的帧同步,成本低,应用范围广泛。
在一种可能的实现方式中,当平均时延不满足同步条件,则第二摄像设备重新启动,并再次获取第一摄像设备的连续N帧中每帧的时间戳;再次截取第二摄像设备的连续N帧,并收集再次截取的第二摄像设备的连续N帧中每帧的时间戳;根据再次获取的第一摄像设备的连续N帧中每帧的时间戳和再次截取的第二摄像设备的连续N帧中每帧的时间戳,从 再次获取的第一摄像设备的连续N帧中,确定再次截取的第二摄像设备的连续N帧中每帧的对齐帧;
根据再次截取的第二摄像设备的连续N帧中每帧的时间戳以及再次截取的第二摄像设备的连续N帧中每帧的对齐帧的时间戳,确定再次平均时延;若再次平均时延满足同步条件,则第二摄像设备确定与第一摄像设备帧同步。
当平均时延不满足同步条件时,第二摄像设备重新启动,以促使第二摄像设备与第一摄像设备帧同步,若第二摄像设备重启之后,依然不满足同步条件,则再次重启。换言之,第二摄像设备可多次重启,直到平均时延满足同步条件,从而第二摄像设备与第一摄像设备帧同步。
进一步的,若平均时延小于阈值,则第二摄像设备可确定平均时延满足同步条件;若平均时延大于或等于阈值,则第二摄像设备可确定平均时延不满足同步条件。其中,阈值为经验值,与摄像设备类型、帧率计算以及调试有关,具体数值在本申请实施例中不作限定。
在一种可能的实现方式中,待帧同步系统中的所有摄像设备帧同步之后,该帧同步系统中的各个摄像设备可确定各自的起始帧,起始帧即为开始采集图像的帧,也即正式采集的图像或视频的第一帧。
第二摄像设备确定第二摄像设备的起始帧,包括:第二摄像设备获取参考时间戳,参考时间戳为基准摄像设备选择的起始帧的时间戳;基准摄像设备为主节点管理的摄像设备中的主摄像设备;第二摄像设备根据参考时间戳确定第二摄像设备的起始帧。
各个摄像设备根据参考时间戳确定各自的起始帧,可有效缩短正式图像采集过程中的时间偏差,以便更好地处理各个摄像设备采集的图像。
在一种可能的实现方式中,第二摄像设备可从第二摄像设备的帧时间戳序列中,选择与参考时间戳之间的差值在预设范围内的时间戳对应的帧,并将其确定为第二摄像设备的起始帧;其中,预设范围为[-Fp/2,Fp/2],Fp为单帧时间间隔。换言之,第二摄像设备的起始帧的时间戳与参考时间戳之间的差值在预设范围内,以有效缩短正式图像采集过程中,第二摄像设备与基准摄像设备之间的时间偏差。
在一种可能的实现方式中,第一摄像设备为基准摄像设备,第二摄像设备为从节点管理的摄像设备中的主摄像设备,该种情况下,第二摄像设备通过内存共享,从从节点获取第一摄像设备的连续N帧中每帧的时间戳。具体的,第一摄像设备通过内存共享,将第一摄像设备的连续N帧中每帧的时间戳同步至主节点,主节点通过分布式进程消息将第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备所属的从节点,该从节点通过内存共享,将第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备。
在一种可能的实现方式中,第一摄像设备为基准摄像设备,第二摄像设备为主节点管理的摄像设备中的从摄像设备,该种情况下,第二摄像设备直接通过内存共享,从第一摄像设备获取第一摄像设备的连续N帧中每帧的时间戳。
在一种可能的实现方式中,第一摄像设备为从节点管理的摄像设备的主摄像设备,第二摄像设备为从节点管理的摄像设备中的从摄像设备,该种情况下,第二摄像设备直接通过内存共享,从第一摄像设备获取第一摄像设备的连续N帧中每帧的时间戳。该种情况下,基准摄像设备依然为主节点管理的摄像设备的主摄像设备。
在一种可能的实现方式中,在第一摄像设备为从节点管理的摄像设备的主摄像设备,第二摄像设备为从节点管理的摄像设备中的从摄像设备的情况下,若第一摄像设备获取到基准摄像设备的连续N帧中每帧的时间戳,则第一摄像设备可通过内存共享,将基准摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备,第二摄像设备根据基准摄像设备的连续N帧中每帧的时间戳和第二摄像设备的连续N帧中每帧的时间戳,从基准摄像设备的连续N帧中,确定第二摄像设备的连续N帧中每帧的对齐帧。
在一种可能的实现方式中,第二摄像设备从第二摄像设备的帧时间戳序列中,截取第二摄像设备的连续N帧。
在一种可能的实现方式中,第二摄像设备根据第一摄像设备的连续N帧中每帧的时间戳和第二摄像设备的连续N帧中每帧的时间戳,从第一摄像设备的连续N帧中,确定第二摄像设备的连续N帧中每帧的对齐帧,包括:第二摄像设备计算第二摄像设备的连续N帧中第i帧的时间戳与第一摄像设备的连续N帧中每帧的时间戳之间的绝对差值;1≤i≤N;从第一摄像设备的连续N帧中,获取最小绝对差值对应的帧,并将其确定为第二摄像设备的连续N帧中第i帧的对齐帧。
在一种可能的实现方式中,第二摄像设备根据第二摄像设备的连续N帧中每帧的时间戳以及第二摄像设备的连续N帧中每帧的对齐帧的时间戳,确定平均时延,包括:
第二摄像设备获取第二摄像设备的连续N帧中第i帧的时间戳与第i帧的对齐帧的时间戳之间的绝对差值;1≤i≤N;根据计算公式计算得到平均时延,计算公式为:
$D_{avg}=\frac{1}{N}\sum_{i=1}^{N}\left|t_i-t_i'\right|$
其中,$D_{avg}$为平均时延,$t_i$为第二摄像设备的连续N帧中第i帧的时间戳,$t_i'$为第i帧的对齐帧的时间戳。
本申请实施例第二方面提供一种第二摄像设备,该第二摄像设备具有实现第一方面提供方法的功能。所述功能可以通过硬件实现,也可以通过硬件执行相应的软件实现。所述硬件或软件包括一个或多个与上述功能相对应的模块。
在一种可能实现的方式中,该第二摄像设备包括:收发模块和处理模块;收发模块,用于获取第一摄像设备的连续N帧中每帧的时间戳,N为大于1的正整数;处理模块,用于截取第二摄像设备的连续N帧,并收集第二摄像设备的连续N帧中每帧的时间戳;根据第一摄像设备的连续N帧中每帧的时间戳和第二摄像设备的连续N帧中每帧的时间戳,从第一摄像设备的连续N帧中,确定第二摄像设备的连续N帧中每帧的对齐帧;根据第二摄像设备的连续N帧中每帧的时间戳以及第二摄像设备的连续N帧中每帧的对齐帧的时间戳,确定平均时延;当平均时延满足同步条件,则确定第二摄像设备与第一摄像设备帧同步。
在一种可能实现的方式中,该第二摄像设备包括:处理器、收发器和存储器,其中,收发器用于接收和发送信息,存储器中存储计算机程序,计算机程序包括程序指令,处理器通过总线与存储器和收发器连接,处理器执行存储器中存储的程序指令,以使该第二摄像设备执行以下操作:控制收发模块获取第一摄像设备的连续N帧中每帧的时间戳,N为大于1的正整数;截取第二摄像设备的连续N帧,并收集第二摄像设备的连续N帧中每帧 的时间戳;根据第一摄像设备的连续N帧中每帧的时间戳和第二摄像设备的连续N帧中每帧的时间戳,从第一摄像设备的连续N帧中,确定第二摄像设备的连续N帧中每帧的对齐帧;根据第二摄像设备的连续N帧中每帧的时间戳以及第二摄像设备的连续N帧中每帧的对齐帧的时间戳,确定平均时延;当平均时延满足同步条件,则确定第二摄像设备与第一摄像设备帧同步。
基于同一发明构思,由于该第二摄像设备解决问题的原理以及有益效果可以参见第一方面所述的方法以及所带来的有益效果,因此该装置的实施可以参见方法的实施,重复之处不再赘述。
本申请实施例第三方面提供一种计算机可读存储介质,所述计算机可读存储介质中存储有计算机程序,计算机程序包括程序指令,程序指令当被处理器执行时使得处理器执行上述第一方面所述的方法。
本申请实施例第四方面提供一种包含指令的计算机程序产品,当其在计算机上运行时,使得计算机执行上述第一方面所述的方法。
本申请实施例第五方面提供一种帧同步方法,包括:
第一摄像设备截取第一摄像设备的连续N帧,并收集第一摄像设备的连续N帧中每帧的时间戳,N为大于1的正整数;
第一摄像设备将第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备。
本申请实施例第五方面,第一摄像设备将其连续N帧中每帧的时间戳同步至第二摄像设备,以便第二摄像设备根据第一摄像设备的连续N帧中每帧的时间戳确定第二摄像设备与第一摄像设备是否帧同步。
在一种可能实现的方式中,第一摄像设备从第一摄像设备的帧时间戳序列中,截取第一摄像设备的连续N帧。
在一种可能实现的方式中,第一摄像设备为基准摄像设备,基准摄像设备为主节点管理的摄像设备中的主摄像设备,第二摄像设备为主节点管理的摄像设备中的从摄像设备,该种情况下,第一摄像设备直接通过内存共享,将第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备。
在一种可能实现的方式中,第一摄像设备为基准摄像设备,基准摄像设备为主节点管理的摄像设备中的主摄像设备,第二摄像设备为从节点管理的摄像设备中的主摄像设备,该种情况下,第一摄像设备通过内存共享,将第一摄像设备的连续N帧中每帧的时间戳同步至主节点,以使主节点将第一摄像设备的连续N帧中每帧的时间戳同步至从节点,以使从节点通过内存共享,将第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备。
在一种可能实现的方式中,在第一摄像设备为基准摄像设备的情况下,在第二摄像设备与第一摄像设备帧同步之后,第一摄像设备确定第一摄像设备的起始帧以及起始帧的时间戳,并将起始帧的时间戳作为参考时间戳,将参考时间戳同步至第二摄像设备,参考时间戳用于第二摄像设备确定第二摄像设备的起始帧。第一摄像设备将参考时间戳同步至第二摄像设备的过程,与将第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备的过程相同。
在一种可能实现的方式中,第一摄像设备为从节点管理的摄像设备的主摄像设备,第二摄像设备为从节点管理的摄像设备中的从摄像设备,该种情况下,第一摄像设备通过内 存共享,将第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备。该种情况下,基准摄像设备依然为主节点管理的摄像设备中的主摄像设备。
在一种可能实现的方式中,在第一摄像设备为从节点管理的摄像设备的主摄像设备的情况下,第一摄像设备获取参考时间戳,参考时间戳为基准摄像设备选择的起始帧的时间戳,根据参考时间戳确定第一摄像设备的起始帧,可有效缩短正式图像采集过程中,第一摄像设备与基准摄像设备之间的时间偏差。
本申请实施例第六方面提供一种第一摄像设备,该第一摄像设备具有实现第五方面提供方法的功能。所述功能可以通过硬件实现,也可以通过硬件执行相应的软件实现。所述硬件或软件包括一个或多个与上述功能相对应的模块。
在一种可能实现的方式中,该第一摄像设备包括:收发模块和处理模块;处理模块,用于截取第一摄像设备的连续N帧,并收集第一摄像设备的连续N帧中每帧的时间戳,N为大于1的正整数;收发模块,用于将第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备。
在一种可能实现的方式中,该第一摄像设备包括:处理器、收发器和存储器,其中,收发器用于接收和发送信息,存储器中存储计算机程序,计算机程序包括程序指令,处理器通过总线与存储器和收发器连接,处理器执行存储器中存储的程序指令,以使该第一摄像设备执行以下操作:截取第一摄像设备的连续N帧,并收集第一摄像设备的连续N帧中每帧的时间戳,N为大于1的正整数;控制收发器将第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备。
基于同一发明构思,由于该第一摄像设备解决问题的原理以及有益效果可以参见第五方面所述的方法以及所带来的有益效果,因此该装置的实施可以参见方法的实施,重复之处不再赘述。
本申请实施例第七方面提供一种计算机可读存储介质,所述计算机可读存储介质中存储有计算机程序,计算机程序包括程序指令,程序指令当被处理器执行时使得处理器执行上述第五方面所述的方法。
本申请实施例第八方面提供一种包含指令的计算机程序产品,当其在计算机上运行时,使得计算机执行上述第五方面所述的方法。
本申请实施例第九方面提供一种帧同步系统,包括:
第一摄像设备截取第一摄像设备的连续N帧,并收集第一摄像设备的连续N帧中每帧的时间戳;将第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备,N为大于1的正整数;
第二摄像设备截取第二摄像设备的连续N帧,并收集第二摄像设备的连续N帧中每帧的时间戳;根据第一摄像设备的连续N帧中每帧的时间戳和第二摄像设备的连续N帧中每帧的时间戳,从第一摄像设备的连续N帧中,确定第二摄像设备的连续N帧中每帧的对齐帧;根据第二摄像设备的连续N帧中每帧的时间戳以及第二摄像设备的连续N帧中每帧的对齐帧的时间戳,确定平均时延;当平均时延满足同步条件,则第二摄像设备确定与第一摄像设备帧同步。
本申请实施例第十方面提供一种帧同步系统,包括第一摄像设备和第二摄像设备,第二摄像设备用于执行第一方面提供的方法,第一摄像设备用于执行第五方面提供的方法。
结合第九方面和第十方面,在一种可能的实现方式中,第一摄像设备为基准摄像设备,基准摄像设备为主节点管理的摄像设备中的主摄像设备,第二摄像设备为主节点管理的摄像设备中的从摄像设备。
结合第九方面和第十方面,在一种可能的实现方式中,第一摄像设备为基准摄像设备,基准摄像设备为主节点管理的摄像设备中的主摄像设备,第二摄像设备为从节点管理的摄像设备中的主摄像设备。
结合第九方面和第十方面,在一种可能的实现方式中,第一摄像设备为从节点管理的摄像设备的主摄像设备,第二摄像设备为从节点管理的摄像设备中的从摄像设备,基准摄像设备依然为主节点管理的摄像设备中的主摄像设备。
附图说明
为了更清楚地说明本申请实施例或背景技术中的技术方案,下面将对本申请实施例或背景技术中所需要使用的附图进行说明。
图1为应用本申请实施例的网络架构示意图;
图2为本申请实施例一提供的帧同步方法的流程示意图;
图3为本申请实施例提供的从主摄像设备确定对齐帧的示意图;
图4为本申请实施例提供的从主摄像设备确定是否帧同步的示意图;
图5为本申请实施例提供的确定起始帧的示意图;
图6为本申请实施例二提供的帧同步方法的流程示意图;
图7为本申请实施例提供的帧同步装置的逻辑结构示意图;
图8为本申请实施例提供的帧同步装置的实体结构简化示意图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行描述。其中,在本申请的描述中,除非另有说明,“/”表示前后关联的对象是一种“或”的关系,例如,A/B可以表示A或B;本申请中的“和/或”仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况,其中A,B可以是单数或者复数。并且,在本申请的描述中,除非另有说明,“多个”是指两个或多于两个。“以下至少一项(个)”或其类似表达,是指的这些项中的任意组合,包括单项(个)或复数项(个)的任意组合。例如,a,b,或c中的至少一项(个),可以表示:a,b,c,a-b,a-c,b-c,或a-b-c,其中a,b,c可以是单个,也可以是多个。另外,为了便于清楚描述本申请实施例的技术方案,在本申请的实施例中,采用了“第一”、“第二”等字样对功能和作用基本相同的相同项或相似项进行区分。本领域技术人员可以理解“第一”、“第二”等字样并不对数量和执行次序进行限定,并且“第一”、“第二”等字样也并不限定一定不同。
此外,本申请实施例描述的网络架构以及业务场景是为了更加清楚的说明本申请实施例的技术方案,并不构成对于本申请实施例提供的技术方案的限定,本领域普通技术人员可知,随着网络架构的演变和新业务场景的出现,本申请实施例提供的技术方案对于类似的技术问题,同样适用。
下面将对本申请实施例涉及的技术名称或术语进行介绍。
帧同步系统,也可称为节点系统,节点系统包括主节点和从节点,节点负责管理一个 或多个摄像设备。节点所管理的摄像设备可分为主摄像设备和从摄像设备。帧同步系统可用于实现多个摄像设备的帧同步,以便多个摄像设备可以协同录制视频,协同监控等。
主节点,有且只有一个,该节点所管理的摄像设备中必须包括主摄像设备。
从节点,其数量为一个或多个,视具体情况而定。从节点所管理的摄像设备通过主节点所管理的主摄像设备进行帧同步。
主主摄像设备,主节点所管理的摄像设备中的主摄像设备,为帧同步系统中的基准摄像设备。节点之间的帧同步以及节点内的帧同步都参考主主摄像设备的时间戳。主主摄像设备在开启之后,只初始化一次。将主节点所管理的摄像设备中除主主摄像设备之外的其他摄像设备称为从主摄像设备,也可以称为主节点的从摄像设备。
基准摄像设备,为主节点所管理的摄像设备中的主摄像设备,即主主摄像设备。帧同步系统中除基准摄像设备之外的其他摄像设备需要参考基准摄像设备进行帧同步,以及参考基准摄像设备的参考时间戳确定各自的起始帧。需要说明的是,其他用于描述基准摄像设备的名称理应落入本申请实施例的保护范围,例如参考摄像设备等。
从主摄像设备,从节点所管理的摄像设备中的一个摄像设备被指定为从主摄像设备,也可以称为从节点的主摄像设备。在从节点与主节点之间的帧同步过程中,从主摄像设备参考主主摄像设备的时间戳进行帧同步;在某个从节点内的帧同步过程中,该从节点的从从摄像设备参考从主摄像设备的时间戳进行帧同步。将从节点所管理的摄像设备中除从主摄像设备之外的其他摄像设备称为从从摄像设备,也可以称为从节点的从摄像设备。
从摄像设备,接入主节点或从节点,参与节点内的帧同步。若某个从摄像设备属于主节点,则该从摄像设备与主主摄像设备进行帧同步;若某个从摄像设备属于从节点,则该从摄像设备与从主摄像设备进行帧同步。
参考时间戳,为主主摄像设备选择的主主摄像设备的起始帧的时间戳,用于所有摄像设备进行同步时间校准,即各个摄像设备根据参考时间戳确定各自的起始帧,可以有效缩短各个摄像设备的起始帧的时间戳之间的偏差,使得偏差可以控制在毫秒级以下,以便多个摄像设备所采集的图像可以更好地衔接。
起始帧,为开始采集图像或视频的帧,也即正式采集的图像或视频的第一帧。
请参见图1,为应用本申请实施例的网络架构示意图,该网络架构示意图包括主节点101、从节点102、主主摄像设备103、主从摄像设备104、从主摄像设备105和从从摄像设备106。需要说明的是,图1所示的设备的形态和数量并不构成对本申请实施例的限定,不过在本申请实施例中,主节点有且只有一个,从节点可以有一个或多个;主主摄像设备、从主摄像设备有且只有一个,主从摄像设备以及从从摄像设备可以有一个或多个,主从摄像设备与从从摄像设备的数量可以相等也可以不相等,视具体情况而定。
其中,摄像设备可以是图1所示的监控器,也可以是其他类型的监控器,还可以是各种类型的摄像机、视频录制设备或图像采集设备等。主节点101和从节点102可以是计算节点,每个计算节点管理一个或多个摄像设备。管理人员可设置哪个计算节点是主节点,哪些计算节点是从节点,也可由帧同步系统自主设定主节点和从节点。
图1中,主节点101负责管理主主摄像设备103和主从摄像设备104。主节点101可从其管理的摄像设备中选择一个摄像设备作为主节点101的主摄像设备,即主主摄像设备,其余摄像设备便作为主节点101的从摄像设备,即主从摄像设备。管理人员也可从主节点101 所管理的摄像设备中选择一个摄像设备作为主主摄像设备。主主摄像设备103为整个帧同步系统的基准摄像设备,即主节点101中的主从摄像设备104需要参考主主摄像设备103的时间戳进行帧同步,从节点102所管理的摄像设备也需要参考主主摄像设备103的时间戳进行帧同步。主从摄像设备104与主主摄像设备103进行帧同步,可以完成主节点101内的帧同步。
从节点102负责管理从主摄像设备105和从从摄像设备106。以一个从节点102为例,该从节点102可从其管理的摄像设备中选择一个摄像设备作为该从节点102的主摄像设备,即从主摄像设备,其余摄像设备作为该从节点102的从摄像设备,即从从摄像设备。管理人员也可从该从节点102所管理的摄像设备中选择一个摄像设备作为从主摄像设备。从主摄像设备105与主主摄像设备103进行帧同步,可以完成主节点101与该从节点102之间的帧同步。从从摄像设备106与从主摄像设备105进行帧同步,可以完成该从节点102内的帧同步。
需要说明的是,图1的主节点、从节点、主主摄像设备、主从摄像设备、从主摄像设备和从从摄像设备等仅是一个名字,名字对设备本身不构成限定。其它用于描述这些名字的名称理应落入本申请实施例的保护范围。
鉴于通过硬件设备实现帧同步的成本较高,应用范围有一定局限性的弊端,本申请实施例提供一种通过软件方式实现帧同步的方法及其装置,成本低且应用范围广泛。
目前也提出了一种通过软件方式实现帧同步的方法,该方法利用目标特征点来同步各个摄像机,该方法包括:获取在目标特征点的运动过程中,第一摄像机和第二摄像机拍摄得到的位置信息,包括第一摄像机得到的第一特征点位置序列和第二摄像机得到的第二特征点位置序列;第一特征点位置序列包括目标特征点的第一位置信息;第二特征点位置序列包括目标特征点的第二位置信息;根据第一特征点位置序列以及单应矩阵,得到目标特征点的第二特征点估计位置序列;根据第二特征点估计位置序列和第二特征点位置序列得到序列差异值;在序列差异值符合预设的同步条件时,确定第一摄像机和第二摄像机的帧同步位置。该方法利用特征点完成帧同步,不过该方法在拍摄过程中完成帧同步,在开始拍摄到获得特征点之间无法实现帧同步。而本申请实施例,在正式拍摄前可以完成各个摄像机的帧同步,即在多摄像机协同录制的准备阶段可以完成帧同步,帧同步过程不影响正式的视频录制过程,从而可以保证视频正式录制过程的完整性。
本申请实施例可以应用于多个摄像机协同工作的场景,例如多个摄像机协同录制视频的场景,还可以应用于视频监控场景,例如在人流密集的广场部署多个监控摄像机,根据某个监控摄像机下某一帧的罪犯信息可以快速提取出其他摄像机下该时间点的罪犯信息,从而可以得到完整的罪犯信息。
下面将对本申请实施例提供的帧同步方法进行详细的介绍。
以本申请实施例应用于图1所示的网络架构示意图为例,如图2所示,为本申请实施例一提供的帧同步方法的流程示意图。该方法可以包括但不限于如下步骤:
步骤S201,确定节点角色。
管理人员或帧同步系统为帧同步系统中的各个节点以及各个节点所管理的摄像设备划分角色,指定哪个节点是主节点,哪些节点是从节点,主节点所管理的摄像设备中哪个是主主摄像设备,哪些是主从摄像设备,从节点所管理的摄像设备中哪个是从主摄像设备,哪些是从从摄像设备。管理人员可在各个摄像设备初始化之前划分角色,各个角色也可由 帧同步系统默认设置。
在管理人员或帧同步系统划分角色之后,各个节点以及各个摄像设备可以清楚地获知各自的角色,从而可以确定各自的角色,以便执行各自角色的功能。
步骤S202,节点系统时间同步。
节点系统包括主节点和从节点,主节点有且只有一个,从节点有一个或多个。在各个摄像设备进行帧同步之前,各个节点进行系统时间同步。各个节点可使用精确时间协议(precision time protocol,PTP)来实现各个节点的系统时间同步,PTP可实现纳秒量级的同步。节点之间的系统时间可通过分布式进程间消息传递接口共享,例如,主节点通过分布式进程间消息向从节点传递主节点的系统时间,以便从节点根据主节点的系统时间进行系统时间同步。
应用在本申请实施例中,节点之间的时间戳信息可通过分布式进程间消息传递接口共享,节点内的时间戳信息可直接通过内存共享。其中,时间戳信息可以包括一个或多个时间戳,可以是系统时间的时间戳信息,也可以是主主摄像设备的时间戳信息。
具体的,主主摄像设备与从主摄像设备之间的时间戳信息同步可包括:主主摄像设备将其时间戳信息通过内存共享同步至主节点,主节点通过分布式进程间消息将主主摄像设备的时间戳信息同步至从节点,从节点通过内存共享将主主摄像设备的时间戳信息同步至从主摄像设备。
主主摄像设备与主从摄像设备之间的时间戳信息同步可包括:主主摄像设备通过内存共享将其时间戳信息同步至主从摄像设备。从主摄像设备与从从摄像设备之间的时间戳信息同步可包括:从主摄像设备通过内存共享将从主摄像设备的时间戳信息同步至从从摄像设备,或从主摄像设备通过内存共享将主主摄像设备的时间戳信息同步至从从摄像设备。
步骤S203,节点间初始化。
节点间初始化可包括主主摄像设备和从主摄像设备的初始化。初始化可通过摄像设备的开启与关闭来实现。在本申请实施例中,主主摄像设备只初始化一次,即主主摄像设备只开启一次;从主摄像设备可初始化多次,当从主摄像设备与主主摄像设备帧同步失败时,从主摄像设备进行初始化,即此时从主摄像设备重新启动,重新与主主摄像设备进行帧同步。
步骤S204,从主摄像设备与主主摄像设备进行帧同步。
主主摄像设备在启动之后,立即截取主主摄像设备的连续N帧,并收集主主摄像设备的连续N帧中每帧的时间戳。具体的,主主摄像设备可从主主摄像设备的帧时间戳序列中截取主主摄像设备的连续N帧。其中,N为大于1的正整数。用
$t_i^{m}$
来表示主主摄像设备的连续N帧中第i帧的时间戳。
主主摄像设备在收集主主摄像设备的连续N帧中每帧的时间戳之后,通过内存共享,将主主摄像设备的连续N帧中每帧的时间戳同步至主节点,主节点通过分布式进程间消息将主主摄像设备的连续N帧中每帧的时间戳同步至从节点,从节点通过内存共享,将主主摄像设备的连续N帧中每帧的时间戳同步至从主摄像设备,以便从主摄像设备能够获取主主摄像设备的连续N帧中每帧的时间戳,进而判断是否与主主摄像设备帧同步成功。
从主摄像设备在启动之后,立即截取从主摄像设备的连续N帧,并收集从主摄像设备的连续N帧中每帧的时间戳。具体的,从主摄像设备可从从主摄像设备的帧时间戳序列中 截取从主摄像设备的连续N帧。其中,N为大于1的正整数。用
$t_{j,i}^{s}$
来表示第j个从节点中的从主摄像设备的连续N帧中第i帧的时间戳,或表示第j个从主摄像设备的连续N帧中第i帧的时间戳。
主主摄像设备与从主摄像设备何时截取连续N帧,以及截取连续N帧的先后顺序,在本申请实施例中不作限定。主主摄像设备与从主摄像设备所截取的帧数相同,N的具体数值在本申请实施例中不作限定,可由管理人员设定或由帧同步系统设定。
从主摄像设备在获取主主摄像设备的连续N帧中每帧的时间戳,以及从主摄像设备的连续N帧中每帧的时间戳之后,从主主摄像设备的连续N帧中,确定出从主摄像设备的连续N帧中每帧的对齐帧。具体的,从主摄像设备根据主主摄像设备的连续N帧中每帧的时间戳,以及从主摄像设备的连续N帧中每帧的时间戳,从主主摄像设备的连续N帧中,确定出从主摄像设备的连续N帧中每帧的对齐帧。
$t_{j,1}^{s}$ 表示第j个从主摄像设备的连续N帧中第1帧的时间戳,从主摄像设备将 $t_{j,1}^{s}$ 依次与 $t_1^{m}, t_2^{m}, \ldots, t_N^{m}$
进行比较,得到绝对差值D n,1≤n≤N,在主主摄像设备的连续N帧中,将D n最小对应的主主摄像设备的帧,确定为从主摄像设备的连续N帧中第1帧的对齐帧。按照此过程确定从主摄像设备的连续N帧中第2帧的对齐帧,重复此过程,直到确定出第j个从主摄像设备的连续N帧中第N帧的对齐帧。对于每个从主摄像设备,均可按照此过程确定出各自的连续N帧中每帧的对齐帧。
可参见图3,为本申请实施例提供的从主摄像设备确定对齐帧的示意图。图3以确定从主摄像设备的连续N帧中的第1帧的对齐帧为例,从主摄像设备的连续N帧中的第1帧的时间戳为t,主主摄像设备的连续N帧中第3帧的时间戳为t,两者之间的绝对差值为0,为最小的绝对差值,那么从主摄像设备的连续N帧中第1帧的对齐帧,为主主摄像设备的连续N帧中的第3帧。从主摄像设备在确定出其连续N帧中第1帧的对齐帧之后,确定其连续N帧中第2帧的对齐帧,直到确定出其连续N帧中第N帧的对齐帧。
步骤S205,判断主主摄像设备与从主摄像设备是否同步成功。
从主摄像设备在确定出其连续N帧中每帧的对齐帧之后,根据从主摄像设备的连续N帧中每帧的时间戳以及从主摄像设备的连续N帧中每帧的对齐帧的时间戳,计算得到平均时延。
具体的,从主摄像设备获取从主摄像设备的连续N帧中第i帧的时间戳与该第i帧的对齐帧的时间戳之间的绝对差值,1≤i≤N,直到获取从主摄像设备的连续N帧中每帧的时间戳与各自的对齐帧的时间戳之间的绝对差值;然后根据计算公式计算得到平均时延,该计算公式如下:
$D_{avg}=\frac{1}{N}\sum_{i=1}^{N}\left|t_i-t_i'\right|$
$D_{avg}$表示平均时延,$t_i$表示从主摄像设备的连续N帧中第i帧的时间戳,$t_i'$表示第i帧的对齐帧的时间戳。
若平均时延小于阈值,则确定平均时延满足同步条件,从主摄像设备与主主摄像设备帧同步,即同步成功;若平均时延大于阈值,则确定平均时延不满足同步条件,从主摄像设备与主主摄像设备不同步,即同步失败。
其中,阈值可用D max表示,D max为从主摄像设备与主主摄像设备之间同步的最大延迟 误差值,其数值为经验值,与摄像设备类型、帧率计算以及调试有关,具体数值在本申请实施例不作限定。
在从主摄像设备与主主摄像设备同步不成功的情况下,从主摄像设备重新初始化,即重新启动,重新获取主主摄像设备的连续N帧中每帧的时间戳,重新截取从主摄像设备的连续N帧,并收集重新截取的连续N帧中每帧的时间戳,重新确定对齐帧以及平均时延,然后判断重新确定的平均时延是否小于阈值,若小于,则帧同步成功;否则再次初始化,直到平均时延小于阈值。
可参见图4,为本申请实施例提供的从主摄像设备确定是否帧同步的示意图。图4中,假设D max=3,左侧中D avg>D max,不满足同步条件,从主摄像设备与主主摄像设备同步失败,此时从主摄像设备重新初始化,由于主主摄像设备的帧序列一直在向前走,当重新截取主主摄像设备的连续N帧时,重新截取的连续N帧中第1帧的时间戳应该在t=5之后,假设主主摄像设备重新截取的连续N帧的第1帧的时间戳为t=11,从主摄像设备重新截取的连续N帧的第1帧的时间戳为t=13,重新确定的平均时延D avg<D max,满足同步条件,从主摄像设备与主主摄像设备同步成功。
需要说明的是,图4中的时间戳以及阈值用于举例,并不构成对本申请实施例的限定。
步骤S206,若从主摄像设备与主主摄像设备同步成功,则节点内初始化。
若从主摄像设备与主主摄像设备同步成功,即D avg<D max,则主节点所管理的从摄像设备进行初始化以及从节点所管理的从摄像设备进行初始化。换言之,在从主摄像设备与主主摄像设备同步成功的情况下,开启主节点所管理的从摄像设备,开启从节点所管理的从摄像设备。
步骤S207,从摄像设备与主摄像设备进行帧同步。
从摄像设备与主摄像设备进行帧同步包括:主从摄像设备参考主主摄像设备进行帧同步,以及从从摄像设备参考从主摄像设备或主主摄像设备进行帧同步。从摄像设备与主摄像设备进行帧同步的过程与主从摄像设备与主主摄像设备进行帧同步的过程相同,在此不再赘述。
从从摄像设备参考主主摄像设备进行帧同步,相比从从摄像设备参考从主摄像设备进行帧同步,效果更好。
步骤S208,判断从摄像设备与主摄像设备是否同步成功。
从摄像设备判断与主摄像设备是否同步成功,即是否D avg<D max,若同步不成功,则从摄像设备重新初始化,直到D avg<D max
步骤S209,若从摄像设备与主摄像设备同步成功,则所有摄像设备进行同步时间校准。
若所有从摄像设备与主摄像设备同步成功,则所有摄像设备进行同步时间校准。由于摄像设备在初始化时可能存在图像自动校正或丢帧的情况,在所有摄像设备同步完成后,需要丢掉若干帧再开始正式图像采集,从而保证编码器接收到的各路视频流的第1帧能对应于同一时刻,或最接近于同一时刻,因此需要对所有摄像设备进行同步时间校准。
具体的,主主摄像设备确定参考时间戳,参考时间戳为主主摄像设备的起始帧的时间戳,起始帧即为正式图像采集的第1帧。主主摄像设备可在节点间同步时,预测参考时间戳,也可在从主摄像设备与主主摄像设备同步之后,预测参考时间戳。
之后,主主摄像设备将参考时间戳同步至从主摄像设备,从主摄像设备在获取到参考 时间戳时,可将参考时间戳同步至从从摄像设备。主主摄像设备可将参考时间戳与主主摄像设备的连续N帧中每帧的时间戳一起同步至从主摄像设备,也可分别同步,本申请实施例中不作限定。
从主摄像设备根据参考时间戳确定从主摄像设备的起始帧。从主摄像设备从从主摄像设备的帧时间戳序列中,选择与参考时间戳之间的差值在预设范围内的时间戳对应的帧,并将其确定为从主摄像设备的起始帧。其中,预设范围为[-Fp/2,Fp/2],Fp为单帧时间间隔。
可参见图5,为本申请实施例提供的确定起始帧的示意图。图5中,在时间点T处,从主摄像设备1以及从主摄像设备2与主主摄像设备对齐,主主摄像设备所确定的参考时间戳为黑色箭头所指的时间点,参考时间戳对应的帧即为主主摄像设备的起始帧。对于从主摄像设备1而言,将黑色箭头所指的帧作为从主摄像设备1的起始帧,此时该帧的时间戳与参考时间戳之间的差值在预设范围内,为最接近主主摄像设备的起始帧的帧。对于从主摄像设备2而言,将黑色箭头所指的帧作为从主摄像设备2的起始帧,此时该帧的时间戳与参考时间戳之间的差值在预设范围内,为最接近主主摄像设备的起始帧的帧,而灰色箭头所指的帧的时间戳与参考时间戳之间的差值不在预设范围内,不将灰色箭头所指的帧作为从主摄像设备2的起始帧。
主节点所管理的从摄像设备以及从节点所管理的从摄像设备,同样根据参考时间戳确定各自的起始帧。
需要说明的是,图2所示实施例中,先执行从主摄像设备与主主摄像设备帧同步的过程,再执行主从摄像设备与主主摄像设备帧同步的过程,实际应用中,也可以同时执行这两个过程。
图2所示的实施例,通过软件方式实现多个摄像设备的帧同步,可以应用于各种类型的摄像设备,成本低,应用范围广泛;通过确定对齐帧可以有效缩短时间偏差,使得帧同步系统中各个摄像设备之间的时间偏差控制在毫秒级以下;各个摄像设备之间的帧同步过程在正式图像采集之前执行,使得同步过程不影响正式图像采集过程,以确保视频录制的完整性;通过确定参考时间戳,以便各个摄像设备确定各自的起始帧,从而有效缩短正式图像采集过程中的时间偏差,以便更好地处理各个摄像设备采集的图像。
请参见图6,为本申请实施例二提供的帧同步方法的流程示意图,该方法可以包括但不限于如下步骤:
步骤S601,第一摄像设备截取第一摄像设备的连续N帧,并收集第一摄像设备的连续N帧中每帧的时间戳。
在执行步骤S601之前,第一摄像设备需要确定其角色,是主主摄像设备还是从主摄像设备。若是主主摄像设备,则第一摄像设备是帧同步系统的基准摄像设备;若是从主摄像设备,则第一摄像设备参考主主摄像设备进行帧同步,该从主摄像设备所属从节点所管理的从从摄像设备可参考第一摄像设备进行帧同步。
第一摄像设备在启动之后,可从第一摄像设备的帧时间戳序列中,截取第一摄像设备的连续N帧,并收集第一摄像设备的连续N帧中每帧的时间戳。第一摄像设备如何从其帧时间戳序列中截取连续N帧,在本申请实施例中不作限定。
步骤S602,第一摄像设备将第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像 设备。相应的,第二摄像设备获取第一摄像设备的连续N帧中每帧的时间戳。
在一种可能的实现方式中,第一摄像设备为主主摄像设备,第二摄像设备为从主摄像设备。第一摄像设备通过内存共享,将第一摄像设备的连续N帧中每帧的时间戳同步至主节点,主节点通过分布式进程消息,将第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备所属的从节点,该从节点通过内存共享,将将第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备。
在一种可能的实现方式中,第一摄像设备为主主摄像设备,第二摄像设备为主从摄像设备。第一摄像设备直接通过内存共享,将第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备。
在一种可能的实现方式中,第一摄像设备为从主摄像设备,第二摄像设备为从从摄像设备。第一摄像设备直接通过内存共享,将第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备。或,第一摄像设备在获取到主主摄像设备的连续N帧中每帧的时间戳的情况下,通过内存共享,将主主摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备。
步骤S603,第二摄像设备截取第二摄像设备的连续N帧,并收集第二摄像设备的连续N帧中每帧的时间戳。
第二摄像设备在启动之后,可从第二摄像设备的帧时间戳序列中,截取第二摄像设备的连续N帧,并收集第二摄像设备的连续N帧中每帧的时间戳。第二摄像设备如何从其帧时间戳序列中截取连续N帧,在本申请实施例中不作限定。
步骤S604,第二摄像设备根据第一摄像设备的连续N帧中每帧的时间戳和第二摄像设备的连续N帧中每帧的时间戳,从第一摄像设备的连续N帧中,确定第二摄像设备的连续N帧中每帧的对齐帧。
具体的,第二摄像设备计算第二摄像设备的连续N帧中第i帧的时间戳与第一摄像设备的连续N帧中每帧的时间戳之间的绝对差值;1≤i≤N;从第一摄像设备的连续N帧中,获取最小绝对差值对应的帧,并将其确定为第二摄像设备的连续N帧中第i帧的对齐帧。按照该方式,确定出第二摄像设备的连续N帧中第1帧的对齐帧,直到确定出第二摄像设备的连续N帧中每帧的对齐帧。具体可参见图2所示实施例中步骤S204中对确定对齐帧的详细描述。
步骤S605,第二摄像设备根据第二摄像设备的连续N帧中每帧的时间戳以及第二摄像设备的连续N帧中每帧的对齐帧的时间戳,确定平均时延。
步骤S605的具体实现过程,可参见图2所示实施例中对确定平均时延的详细描述,在此不再赘述。
步骤S606,若平均时延满足同步条件,则第二摄像设备确定与第一摄像设备帧同步。
其中,同步条件可包括阈值,若平均时延小于阈值,则可以确定平均时延满足同步条件,此时,第二摄像设备可确定与第一摄像设备帧同步,即第二摄像设备与第一摄像设备同步成功。
若平均时延不满足同步条件,则第二摄像设备重新启动,并再次执行步骤S602-步骤S605,即再次获取第一摄像设备的连续N帧中每帧的时间戳,再次截取第二摄像设备的连续N帧,并收集再次截取的第二摄像设备的连续N帧中每帧的时间戳,再次确定对齐帧,再次确定平均时延,再次判断平均时延是否满足同步条件。若这次确定的平均时延也不满 足同步条件,则第二摄像设备再重新启动,再执行步骤S602-步骤S605,直到平均时延满足同步条件时,确定第二摄像设备与第一摄像设备帧同步。
在一种可能的实现的方式,第一摄像设备为主主摄像设备时,第一摄像设备在与第二摄像设备进行帧同步的过程中,可确定参考时间戳,该参考时间戳为主主摄像设备选择的起始帧的时间戳,即主主摄像设备正式采集图像的第1帧的时间戳。第一摄像设备将参考时间戳同步至第二摄像设备,第二摄像设备根据参考时间戳确定第二摄像设备的起始帧,第二摄像设备的起始帧的时间戳与参考时间戳之间的差值在预设范围内,该预设范围为[-Fp/2,Fp/2],Fp为单帧时间间隔。若第二摄像设备为从主摄像设备,则第二摄像设备可将参考时间戳同步至从从摄像设备。这样,帧同步系统中各个摄像设备可根据参考时间戳确定各自的起始帧,从而有效缩短正式图像采集过程中的时间偏差,以便更好地处理各个摄像设备采集的图像。
在一种可能的实现方式中,第一摄像设备为从主摄像设备时,第一摄像设备获取参考时间戳。主主摄像设备可将参考时间戳同步至主节点,主节点将参考时间戳同步至第一摄像设备所属的从节点,该从节点将参考时间戳同步至从主摄像设备,从而从主摄像设备可获取参考时间戳。第一摄像设备在获取到参考时间戳的情况下,根据参考时间戳确定第一摄像设备的起始帧,第一摄像设备的起始帧的时间戳与参考时间戳之间的差值在预设范围内,该预设范围为[-Fp/2,Fp/2],Fp为单帧时间间隔。
在图6所示的实施例中,第二摄像设备通过第一摄像设备的连续N帧中每帧的时间戳,以及第二摄像设备的连续N帧中每帧的时间戳,确定第二摄像设备的连续N帧中每帧的对齐帧,进而确定平均时延,并在平均时延满足同步条件时,确定第二摄像设备与第一摄像设备帧同步,从而可以实现多个摄像设备的帧同步,不涉及硬件设备,通过软件方式实现,成本低,应用范围广泛。
上述详细阐述了本申请实施例的方法,下面提供了本申请实施例的装置。
请参见图7,是本申请实施例提供的帧同步装置的逻辑结构示意图,该帧同步装置70包括收发模块701和处理模块702。该帧同步装置70可以是第一摄像设备,也可以是第二摄像设备。
对于该帧同步装置70是第二摄像设备的情况:
收发模块701,用于获取第一摄像设备的连续N帧中每帧的时间戳,N为大于1的正整数;
处理模块702,用于截取第二摄像设备的连续N帧,并收集第二摄像设备的连续N帧中每帧的时间戳;根据第一摄像设备的连续N帧中每帧的时间戳和第二摄像设备的连续N帧中每帧的时间戳,从第一摄像设备的连续N帧中,确定第二摄像设备的连续N帧中每帧的对齐帧;根据第二摄像设备的连续N帧中每帧的时间戳以及第二摄像设备的连续N帧中每帧的对齐帧的时间戳,确定平均时延;当平均时延满足同步条件,则确定第二摄像设备与第一摄像设备帧同步。
在一种可能的实现方式中,处理模块702还用于当平均时延不满足同步条件,则重新启动第二摄像设备,并再次获取第一摄像设备的连续N帧中每帧的时间戳;再次截取第二摄像设备的连续N帧,并收集再次截取的第二摄像设备的连续N帧中每帧的时间戳;根据再次获取的第一摄像设备的连续N帧中每帧的时间戳和再次截取的第二摄像设备的连续N帧 中每帧的时间戳,从再次获取的第一摄像设备的连续N帧中,确定再次截取的第二摄像设备的连续N帧中每帧的对齐帧;根据再次截取的第二摄像设备的连续N帧中每帧的时间戳以及再次截取的第二摄像设备的连续N帧中每帧的对齐帧的时间戳,确定再次平均时延;若再次平均时延满足同步条件,则确定第二摄像设备与第一摄像设备帧同步。
在一种可能的实现方式中,收发模块701还用于获取参考时间戳,参考时间戳为基准摄像设备选择的起始帧的时间戳;基准摄像设备为主节点管理的摄像设备中的主摄像设备;
处理模块702,还用于根据参考时间戳确定第二摄像设备的起始帧。
在一种可能的实现方式中,处理模块702用于根据参考时间戳确定第二摄像设备的起始帧时,具体用于从第二摄像设备的帧时间戳序列中,选择与参考时间戳之间的差值在预设范围内的时间戳对应的帧,并将其确定为第二摄像设备的起始帧;
其中,预设范围为[-Fp/2,Fp/2],Fp为单帧时间间隔。
在一种可能的实现方式中,第一摄像设备为基准摄像设备,第二摄像设备为从节点管理的摄像设备中的主摄像设备;
收发模块701用于获取第一摄像设备的连续N帧中每帧的时间戳时,具体用于通过内存共享,从从节点获取第一摄像设备的连续N帧中每帧的时间戳。
在一种可能的实现方式中,第一摄像设备为基准摄像设备,第二摄像设备为主节点管理的摄像设备中的从摄像设备;
收发模块701获取第一摄像设备的连续N帧中每帧的时间戳时,具体用于通过内存共享,从第一摄像设备获取第一摄像设备的连续N帧中每帧的时间戳。
在一种可能的实现方式中,第一摄像设备为从节点管理的摄像设备的主摄像设备,第二摄像设备为从节点管理的摄像设备中的从摄像设备;
收发模块701用于获取第一摄像设备的连续N帧中每帧的时间戳时,具体用于通过内存共享,从第一摄像设备获取第一摄像设备的连续N帧中每帧的时间戳。
对于该帧同步装置70为第一摄像设备的情况:
处理模块702,用于截取第一摄像设备的连续N帧,并收集第一摄像设备的连续N帧中每帧的时间戳,N为大于1的正整数;
收发模块701,用于将第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备。
在一种可能的实现方式中,第一摄像设备为基准摄像设备,基准摄像设备为主节点管理的摄像设备中的主摄像设备,第二摄像设备为主节点管理的摄像设备中的从摄像设备;
收发模块701用于将第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备时,具体用于通过内存共享,将第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备。
在一种可能的实现方式中,第一摄像设备为基准摄像设备,基准摄像设备为主节点管理的摄像设备中的主摄像设备,第二摄像设备为从节点管理的摄像设备中的主摄像设备;
收发模块701用于将第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备时,具体用于通过内存共享,将第一摄像设备的连续N帧中每帧的时间戳同步至主节点,以使主节点将第一摄像设备的连续N帧中每帧的时间戳同步至从节点,以使从节点通过内存共享,将第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备。
在一种可能的实现方式中,在第二摄像设备与第一摄像设备帧同步之后,处理模块702还用于确定第一摄像设备的起始帧以及起始帧的时间戳,并将起始帧的时间戳作为参考时 间戳;
收发模块701还用于将参考时间戳同步至第二摄像设备,参考时间戳用于第二摄像设备确定第二摄像设备的起始帧。
在一种可能的实现方式中,第一摄像设备为从节点管理的摄像设备的主摄像设备,第二摄像设备为从节点管理的摄像设备中的从摄像设备;
收发模块701用于将第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备时,具体用于通过内存共享,将第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备。
在一种可能的实现方式中,收发模块701,还用于获取参考时间戳,参考时间戳为基准摄像设备选择的起始帧的时间戳,基准摄像设备为主节点管理的摄像设备中的主摄像设备;
处理模块702,还用于根据参考时间戳确定第一摄像设备的起始帧。
对于该帧同步装置70是第二摄像设备的情况,收发模块701用于执行图6所示实施例中的步骤S602,处理模块702用于执行图6所示实施例中的步骤S603-步骤S605。对于该帧同步装置70是第一摄像设备的情况,收发模块701用于执行图6所示实施例中的步骤S602,处理模块702用于执行图6所示实施例中的步骤S601。
请参见图8,是本申请实施例提供的帧同步装置的实体结构简化示意图,该帧同步装置可以是第一摄像设备,也可以是第二摄像设备。该帧同步装置80包括收发器801、处理器802和存储器803。收发器801、处理器802和存储器803可以通过总线804相互连接,也可以通过其它方式相连接。图7所示的处理模块702所实现的相关功能可以通过一个或多个处理器802来实现。图7所示的收发模块701所实现的相关功能可以通过收发器801来实现。
存储器803包括但不限于是随机存储记忆体(random access memory,RAM)、只读存储器(read-only memory,ROM)、可擦除可编程只读存储器(erasable programmable read only memory,EPROM)、或便携式只读存储器(compact disc read-only memory,CD-ROM),该存储器803用于相关指令及数据。
收发器801用于发送数据和/或信令,以及接收数据和/或信令。
应用在本申请实施例中,对于帧同步装置80为第一摄像设备的情况,收发器801用于与第二摄像设备进行通信,例如执行图6所示实施例中的步骤S602。
应用在本申请实施例中,对于帧同步装置80为第二摄像设备的情况,收发器801用于与第一摄像设备进行通信,例如执行图6所示实施例中的步骤S602。
处理器802可以包括是一个或多个处理器,例如包括一个或多个中央处理器(central processing unit,CPU),在处理器802是一个CPU的情况下,该CPU可以是单核CPU,也可以是多核CPU。
应用在本申请实施例中,对于帧同步装置80为第一摄像设备的情况,处理器802用于执行图6所示实施例中的步骤S601。
应用在本申请实施例中,对于帧同步装置80为第二摄像设备的情况,处理器802用于执行图6所示实施例中的步骤S603-步骤S606。
存储器803用于存储帧同步装置80的程序代码和数据。
关于处理器802和收发器801所执行的步骤,具体可参见图6所示实施例的描述,在此不再赘述。
可以理解的是,图8仅仅示出了帧同步装置的简化设计。在实际应用中,帧同步装置 还可以分别包含必要的其他元件,包含但不限于任意数量的收发器、处理器、控制器、存储器、通信单元等,而所有可以实现本申请的设备都在本申请的保护范围之内。
本申请实施例还提供了一种帧同步系统,包括第一摄像设备和第二摄像设备,还包括主节点和从节点。
在一种可能的实现方式中,第一摄像设备为基准摄像设备,基准摄像设备为主节点管理的摄像设备中的主摄像设备,第二摄像设备为主节点管理的摄像设备中的从摄像设备。
在一种可能的实现方式中,第一摄像设备为基准摄像设备,基准摄像设备为主节点管理的摄像设备中的主摄像设备,第二摄像设备为从节点管理的摄像设备中的主摄像设备。
在一种可能的实现方式中,第一摄像设备为从节点管理的摄像设备的主摄像设备,第二摄像设备为从节点管理的摄像设备中的从摄像设备,基准摄像设备依然为主节点管理的摄像设备中的主摄像设备。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,该流程可以由计算机程序来指令相关的硬件完成,该程序可存储于计算机可读取存储介质中,该程序在执行时,可包括如上述各方法实施例的流程。而前述的存储介质包括:ROM或随机存储记忆体RAM、磁碟或者光盘等各种可存储程序代码的介质。因此,本申请又一实施例提供了一种计算机可读存储介质,所述计算机可读存储介质中存储有指令,当其在计算机上运行时,使得计算机执行上述各方面所述的方法。
本申请又一实施例还提供了一种包含指令的计算机程序产品,当其在计算机上运行时,使得计算机执行上述各方面所述的方法。
本领域普通技术人员可以意识到,结合本申请中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的系统、装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中,应该理解到,所揭露的系统、装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理模块中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产 品包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地产生按照本发明实施例所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者通过所述计算机可读存储介质进行传输。所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线(digital subscriber line,DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质,(例如,软盘、硬盘、磁带)、光介质(例如,DVD)、或者半导体介质(例如固态硬盘(solid state disk,SSD))等。

Claims (14)

  1. 一种帧同步方法,其特征在于,包括:
    第二摄像设备获取第一摄像设备的连续N帧中每帧的时间戳,N为大于1的正整数;
    所述第二摄像设备截取所述第二摄像设备的连续N帧,并收集所述第二摄像设备的连续N帧中每帧的时间戳;
    所述第二摄像设备根据所述第一摄像设备的连续N帧中每帧的时间戳和所述第二摄像设备的连续N帧中每帧的时间戳,从所述第一摄像设备的连续N帧中,确定所述第二摄像设备的连续N帧中每帧的对齐帧;
    所述第二摄像设备根据所述第二摄像设备的连续N帧中每帧的时间戳以及所述第二摄像设备的连续N帧中每帧的对齐帧的时间戳,确定平均时延;
    当所述平均时延满足同步条件,则所述第二摄像设备确定与所述第一摄像设备帧同步。
  2. 根据权利要求1所述的方法,其特征在于,所述方法还包括:
    所述第二摄像设备获取参考时间戳,所述参考时间戳为基准摄像设备选择的起始帧的时间戳;所述基准摄像设备为主节点管理的摄像设备中的主摄像设备;
    所述第二摄像设备根据所述参考时间戳确定所述第二摄像设备的起始帧。
  3. 根据权利要求2所述的方法,其特征在于,所述第二摄像设备根据所述参考时间戳确定所述第二摄像设备的起始帧,包括:
    所述第二摄像设备从所述第二摄像设备的帧时间戳序列中,选择与所述参考时间戳之间的差值在预设范围内的时间戳对应的帧,并将其确定为所述第二摄像设备的起始帧;
    其中,所述预设范围为[-Fp/2,Fp/2],Fp为单帧时间间隔。
  4. 根据权利要求2所述的方法,其特征在于,所述第一摄像设备为所述基准摄像设备,所述第二摄像设备为从节点管理的摄像设备中的主摄像设备;
    所述第二摄像设备获取第一摄像设备的连续N帧中每帧的时间戳,包括:
    所述第二摄像设备通过内存共享,从所述从节点获取第一摄像设备的连续N帧中每帧的时间戳。
  5. 根据权利要求2所述的方法,其特征在于,所述第一摄像设备为所述基准摄像设备,所述第二摄像设备为所述主节点管理的摄像设备中的从摄像设备;
    所述第二摄像设备获取第一摄像设备的连续N帧中每帧的时间戳,包括:
    所述第二摄像设备通过内存共享,从第一摄像设备获取所述第一摄像设备的连续N帧中每帧的时间戳。
  6. 根据权利要求2所述的方法,其特征在于,所述第一摄像设备为从节点管理的摄像设备的主摄像设备,所述第二摄像设备为所述从节点管理的摄像设备中的从摄像设备;
    所述第二摄像设备获取第一摄像设备的连续N帧中每帧的时间戳,包括:
    所述第二摄像设备通过内存共享,从第一摄像设备获取所述第一摄像设备的连续N帧中每帧的时间戳。
  7. 一种帧同步方法,其特征在于,包括:
    第一摄像设备截取所述第一摄像设备的连续N帧,并收集所述第一摄像设备的连续N帧中每帧的时间戳,N为大于1的正整数;
    所述第一摄像设备将所述第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备。
  8. 根据权利要求7所述的方法,其特征在于,所述第一摄像设备为基准摄像设备,所述基准摄像设备为主节点管理的摄像设备中的主摄像设备,所述第二摄像设备为所述主节点管理的摄像设备中的从摄像设备;
    所述第一摄像设备将所述第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备,包括:
    所述第一摄像设备通过内存共享,将所述第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备。
  9. 根据权利要求7所述的方法,其特征在于,所述第一摄像设备为基准摄像设备,所述基准摄像设备为主节点管理的摄像设备中的主摄像设备,所述第二摄像设备为从节点管理的摄像设备中的主摄像设备;
    所述第一摄像设备将所述第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备,包括:
    所述第一摄像设备通过内存共享,将所述第一摄像设备的连续N帧中每帧的时间戳同步至所述主节点,以使所述主节点将所述第一摄像设备的连续N帧中每帧的时间戳同步至所述从节点,以使所述从节点通过内存共享,将所述第一摄像设备的连续N帧中每帧的时间戳同步至所述第二摄像设备。
  10. 根据权利要求8或9所述的方法,其特征在于,所述方法还包括:
    在所述第二摄像设备与所述第一摄像设备帧同步之后,所述第一摄像设备确定所述第一摄像设备的起始帧以及所述起始帧的时间戳,并将所述起始帧的时间戳作为参考时间戳,
    所述第一摄像设备将所述参考时间戳同步至所述第二摄像设备,所述参考时间戳用于所述第二摄像设备确定所述第二摄像设备的起始帧。
  11. 根据权利要求7所述的方法,其特征在于,所述第一摄像设备为从节点管理的摄像设备的主摄像设备,所述第二摄像设备为所述从节点管理的摄像设备中的从摄像设备;
    所述第一摄像设备将所述第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备,包括:
    所述第一摄像设备通过内存共享,将所述第一摄像设备的连续N帧中每帧的时间戳同步至第二摄像设备。
  12. 根据权利要求11所述的方法,其特征在于,所述方法还包括:
    所述第一摄像设备获取参考时间戳,所述参考时间戳为基准摄像设备选择的起始帧的时间戳,所述基准摄像设备为主节点管理的摄像设备中的主摄像设备;
    所述第一摄像设备根据所述参考时间戳确定所述第一摄像设备的起始帧。
  13. 一种帧同步装置,其特征在于,所述帧同步装置包括处理器、收发器和存储器,所述存储器中存储计算机程序,计算机程序包括程序指令,所述处理器被配置用于调用 所述程序指令,实现如权利要求1-12任一项所述的帧同步方法。
  14. 一种计算机可读存储介质,其特征在于,所述计算机存储介质存储有计算机程序,所述计算机程序包括程序指令,所述程序指令当被处理器执行时使所述处理器执行如权利要求1-12任一项所述的帧同步方法。
PCT/CN2019/126055 2018-12-18 2019-12-17 帧同步方法及其装置 WO2020125643A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811550496.4 2018-12-18
CN201811550496.4A CN111343401B (zh) 2018-12-18 2018-12-18 帧同步方法及其装置

Publications (1)

Publication Number Publication Date
WO2020125643A1 true WO2020125643A1 (zh) 2020-06-25

Family

ID=71102550

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/126055 WO2020125643A1 (zh) 2018-12-18 2019-12-17 帧同步方法及其装置

Country Status (2)

Country Link
CN (1) CN111343401B (zh)
WO (1) WO2020125643A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022252854A1 (zh) * 2021-05-31 2022-12-08 华为技术有限公司 同步方法、电子设备、计算机可读存储介质及程序产品
CN114554242B (zh) * 2022-04-24 2022-08-05 深圳市前海日新数码科技有限公司 直播方法和可读存储介质
CN114710829B (zh) * 2022-06-06 2022-08-05 希诺麦田技术(深圳)有限公司 自组网装置的gps帧定时同步方法、设备及系统

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101146231A (zh) * 2007-07-03 2008-03-19 浙江大学 根据多视角视频流生成全景视频的方法
CN101521745A (zh) * 2009-04-14 2009-09-02 王广生 一组多镜头光心重合式全方位摄像装置及全景摄像、转播的方法
CN101807988A (zh) * 2009-02-17 2010-08-18 索尼公司 从装置、从装置的时刻同步化方法、主装置以及电子设备系统
US20130198264A1 (en) * 2012-02-01 2013-08-01 Erik Hellman Method and device for synchronizing a clock between a server communication device and a client communication device
CN105681632A (zh) * 2015-12-31 2016-06-15 深圳市华途数字技术有限公司 多目摄像机及其帧同步的方法
CN107277385A (zh) * 2017-06-12 2017-10-20 深圳市瑞立视多媒体科技有限公司 一种多相机系统同步曝光的控制方法、装置及终端设备
CN107404362A (zh) * 2017-09-15 2017-11-28 青岛海信移动通信技术股份有限公司 一种双摄像头数据帧的同步方法及装置

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050091311A1 (en) * 2003-07-29 2005-04-28 Lund Christopher D. Method and apparatus for distributing multimedia to remote clients
JP2005341132A (ja) * 2004-05-26 2005-12-08 Toshiba Corp 映像データ処理装置及び処理方法
US8660152B2 (en) * 2006-09-25 2014-02-25 Futurewei Technologies, Inc. Multi-frame network clock synchronization
US9160898B2 (en) * 2011-01-25 2015-10-13 Autofuss System and method for improved video motion control
US20130070751A1 (en) * 2011-09-20 2013-03-21 Peter Atwal Synchronization of time in a mobile ad-hoc network
CN102905054B (zh) * 2012-10-23 2017-11-21 上海佰贝科技发展有限公司 一种基于图像多维特征值比对的视频同步方法
CN103402109B (zh) * 2013-07-31 2015-07-08 上海交通大学 3d视频中左右视点间帧同步性的检测与保证方法
CN103702013B (zh) * 2013-11-28 2017-02-01 北京航空航天大学 一种用于多路实时视频的帧同步方法
US11228764B2 (en) * 2014-01-15 2022-01-18 Avigilon Corporation Streaming multiple encodings encoded using different encoding parameters
CN104063867B (zh) * 2014-06-27 2017-02-08 浙江宇视科技有限公司 一种多摄像机视频同步方法和装置
CN107135330B (zh) * 2017-07-04 2020-04-28 广东工业大学 一种视频帧同步的方法与装置

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101146231A (zh) * 2007-07-03 2008-03-19 浙江大学 根据多视角视频流生成全景视频的方法
CN101807988A (zh) * 2009-02-17 2010-08-18 索尼公司 从装置、从装置的时刻同步化方法、主装置以及电子设备系统
CN101521745A (zh) * 2009-04-14 2009-09-02 王广生 一组多镜头光心重合式全方位摄像装置及全景摄像、转播的方法
US20130198264A1 (en) * 2012-02-01 2013-08-01 Erik Hellman Method and device for synchronizing a clock between a server communication device and a client communication device
CN105681632A (zh) * 2015-12-31 2016-06-15 深圳市华途数字技术有限公司 多目摄像机及其帧同步的方法
CN107277385A (zh) * 2017-06-12 2017-10-20 深圳市瑞立视多媒体科技有限公司 一种多相机系统同步曝光的控制方法、装置及终端设备
CN107404362A (zh) * 2017-09-15 2017-11-28 青岛海信移动通信技术股份有限公司 一种双摄像头数据帧的同步方法及装置

Also Published As

Publication number Publication date
CN111343401B (zh) 2021-06-01
CN111343401A (zh) 2020-06-26

Similar Documents

Publication Publication Date Title
WO2020125643A1 (zh) 帧同步方法及其装置
CN109104259B (zh) 一种多传感器对时同步系统和方法
WO2018228352A1 (zh) 一种同步曝光方法、装置及终端设备
CN107277385B (zh) 一种多相机系统同步曝光的控制方法、装置及终端设备
US10630884B2 (en) Camera focusing method, apparatus, and device for terminal
US20170289646A1 (en) Multi-camera dataset assembly and management with high precision timestamp requirements
CA3153390C (en) Multi-camera synchronization method and distributed system
US10951804B2 (en) Photographing synchronization method and apparatus
WO2020258901A1 (zh) 传感器数据处理方法、装置、电子设备及系统
CN112153306A (zh) 图像采集系统、方法、装置、电子设备及可穿戴设备
WO2016041188A1 (zh) 确定拍照延迟时间的方法、装置及拍照设备
WO2021189647A1 (zh) 多媒体信息的确定方法、头戴设备、存储介质及电子设备
WO2020239079A1 (zh) 一种确定半同步曝光参数的方法及电子装置
CN112004023A (zh) 拍摄方法、多摄像头模组以及存储介质
CN111147690A (zh) 一种多图像传感器摄像机的帧同步装置及方法
US20200218700A1 (en) Image synchronized storage method and image processing device
CN205681558U (zh) 一种基于人脸识别的监控装置
CN111857462B (zh) 一种服务器及光标同步方法、装置、计算机可读存储介质
WO2023138339A1 (zh) 时间同步方法、装置及系统
WO2024002194A1 (zh) 一种同步校验方法、装置、电子设备及存储介质
KR20240072677A (ko) 라이다-카메라의 위상과 시간을 동기화할 수 있는 방법 및 장치
CN216673171U (zh) 图像采集系统及图像采集装置
WO2021147750A1 (zh) 实现3d拍摄的方法、装置及3d显示终端
JP2019003325A (ja) 画像処理装置、画像処理方法及びプログラム
US20230082766A1 (en) Image synchronization method and apparatus, and device and computer storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19899469

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19899469

Country of ref document: EP

Kind code of ref document: A1