WO2023201822A1 - Multi-camera synchronization correction method, device, and storage medium - Google Patents

Multi-camera synchronization correction method, device, and storage medium

Info

Publication number
WO2023201822A1
Authority
WO
WIPO (PCT)
Prior art keywords
time
signal
slave
image
difference
Prior art date
Application number
PCT/CN2022/093867
Other languages
English (en)
French (fr)
Inventor
邢冬逢
Original Assignee
深圳看到科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳看到科技有限公司
Publication of WO2023201822A1

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 - Image signal generators
    • H04N 13/296 - Synchronisation thereof; Control thereof
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 - Image signal generators
    • H04N 13/204 - Image signal generators using stereoscopic image cameras
    • H04N 13/243 - Image signal generators using stereoscopic image cameras using three or more 2D image sensors

Definitions

  • the present invention relates to the technical field of multi-camera systems, and in particular to a multi-camera synchronous correction method, device and storage medium.
  • Synchronous acquisition of multi-angle stereo images requires simultaneous shooting of multiple cameras.
  • the system times of the cameras, however, are independent of each other.
  • time synchronization is generally implemented through a network time protocol or a synchronization device.
  • even with these approaches, the system time differences between the cameras remain relatively large, and the output video shows an obvious lack of picture synchronization.
  • the purpose of this application is to provide a multi-camera synchronization correction method, aiming to solve the existing technical problem that the pictures output by multiple cameras shooting synchronously are not synchronized.
  • to achieve this purpose, this application provides a multi-camera synchronization correction method whose steps include: obtaining one signal interaction time group from a preset signal interaction time set as the current time group, where each signal interaction time group includes a host feedback signal time, the slave sending signal time adjacent to that host feedback signal time, and the slave receiving signal time adjacent to that host feedback signal time; calculating the current time difference of the current time group based on the host feedback signal time, the slave sending signal time, and the slave receiving signal time; and determining the target time difference between the host and the slave based on the current time difference, and correcting the image acquisition time of the slave based on the target time difference.
  • the present application also provides a multi-camera synchronization correction device.
  • the multi-camera synchronization correction device includes: a signal interaction time group acquisition module, used to obtain one signal interaction time group from a preset signal interaction time set as the current time group, where each signal interaction time group includes the host feedback signal time, the slave sending signal time adjacent to the host feedback signal time, and the slave receiving signal time adjacent to the host feedback signal time;
  • a current time difference calculation module, used to calculate the current time difference of the current time group based on the host feedback signal time, the slave sending signal time, and the slave receiving signal time;
  • an image acquisition time correction module, used to determine the target time difference between the host and the slave based on the current time difference, and to correct the image acquisition time of the slave based on the target time difference.
  • the present application also provides multi-camera synchronization correction equipment, which includes a processor, a memory, and a multi-camera synchronization correction program stored on the memory and executable by the processor; when the multi-camera synchronization correction program is executed by the processor, the steps of the multi-camera synchronization correction method described above are implemented.
  • the present application also provides a computer-readable storage medium.
  • a multi-camera synchronization correction program is stored on the computer-readable storage medium.
  • when the multi-camera synchronization correction program is executed by a processor, the steps of the multi-camera synchronization correction method described above are implemented.
  • This application provides a multi-camera synchronization correction method.
  • the method obtains one signal interaction time group from a preset signal interaction time set as the current time group, where each signal interaction time group includes a host feedback signal time, the slave sending signal time adjacent to the host feedback signal time, and the slave receiving signal time adjacent to the host feedback signal time; calculates the current time difference of the current time group based on the host feedback signal time, the slave sending signal time, and the slave receiving signal time; determines the target time difference between the host and the slave based on the current time difference; and corrects the image acquisition time of the slave based on the target time difference.
  • in this way, this application extracts the signal interaction time groups from the preset signal interaction time set, calculates the corresponding time differences, and corrects the acquisition time of the slave images; the first frame of the slave is then determined from the acquisition time of the host's first frame, and the host and slave machines output their captured images starting from their respective first frames, achieving synchronous output of the images of multiple cameras for the same moment.
  • because the acquisition time of the slave images is corrected based on the time difference between the slave and the host, the time error caused by network fluctuations affecting signal transmission is avoided; this not only improves the synchronization efficiency of multiple cameras but also improves the synchronization accuracy of the pictures they capture, solving the current technical problem that multi-camera footage is out of sync.
  • Figure 1 is a schematic diagram of the hardware structure of the multi-camera synchronization correction device involved in the embodiment of the present invention.
  • Figure 2 is a schematic flowchart of the first embodiment of the multi-camera synchronization correction method of the present application.
  • FIG. 3 is a flow chart of the second embodiment of the multi-camera synchronization correction method of the present application.
  • FIG. 4 is a schematic diagram of the request signal transmission process provided by this application.
  • Figure 5 is a schematic flowchart of a third embodiment of a multi-camera synchronization correction method provided by this application.
  • Figure 6 is a flow chart of the fourth embodiment of the multi-camera synchronization correction method provided by this application.
  • Figure 7 is a schematic flowchart of the fifth embodiment of the multi-camera synchronization correction method provided by this application.
  • Figure 8 is a functional module schematic diagram of the first embodiment of the multi-camera synchronization correction device of the present application.
  • the multi-camera synchronization correction method involved in the embodiment of the present application is mainly applied to multi-camera synchronization correction equipment.
  • the multi-camera synchronization correction equipment can be a PC, a portable computer, a mobile terminal and other equipment with display and processing functions.
  • the multi-camera synchronization correction device may include a processor 1001 (such as a CPU), a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005.
  • the communication bus 1002 is used to realize connection and communication between these components;
  • the user interface 1003 can include a display screen (Display) and an input unit such as a keyboard (Keyboard);
  • the network interface 1004 can optionally include a standard wired interface and a wireless interface (such as a Wi-Fi interface);
  • the memory 1005 can be a high-speed RAM memory or a stable memory (non-volatile memory), such as disk memory.
  • the memory 1005 may optionally be a storage device independent of the aforementioned processor 1001.
  • those skilled in the art will understand that the hardware structure shown in Figure 1 does not limit the multi-camera synchronization correction device, which may include more or fewer components than shown in the figure, combine certain components, or use a different component layout.
  • the memory 1005 as a computer-readable storage medium in FIG. 1 may include an operating system, a network communication module, and a multi-camera synchronization correction program.
  • the network communication module is mainly used to connect to the server and perform data communication with the server; the processor 1001 can call the multi-camera synchronization correction program stored in the memory 1005 and execute the multi-camera synchronization correction method provided by the embodiment of the present invention.
  • the embodiment of the present application provides a multi-camera synchronization correction method.
  • Figure 2 is a schematic flow chart of a first embodiment of a multi-camera synchronization correction method of the present application.
  • the multi-camera synchronous correction method involved in the embodiment of the present application is applied to a multi-camera synchronous correction system.
  • the multi-camera synchronous correction system includes a host machine and at least one slave machine. There is at least one slave machine that is synchronously connected to the host machine.
  • the synchronous connection method may be implemented through a device with a signal synchronization function such as a synchronization line and/or a synchronization controller.
  • the execution subjects of the multi-camera synchronization correction method are the host and the slave in the multi-camera synchronization correction system.
  • the multi-camera synchronous correction system includes a host machine, a slave machine and a server.
  • the execution subject of the multi-camera synchronous correction method is the server in the multi-camera synchronous correction system.
  • it will be appreciated that, to reduce the systematic error caused by network fluctuations affecting signal transmission and to improve the synchronization of the video output by multiple cameras, the host and slave machines in the multi-camera synchronization correction system are preferably used as the execution subjects.
  • the following takes the host and slave in a multi-camera synchronized correction system as execution subjects as an example.
  • the multi-camera synchronized correction method includes the following steps:
  • Step S10 In the preset signal interaction time set, obtain one signal interaction time group as the current time group, where each signal interaction time group includes the host feedback signal time, the slave sending signal time adjacent to the host feedback signal time, and the slave receiving signal time adjacent to the host feedback signal time.
  • a storage unit may be provided in the slave's sensor, and a storage area for the preset signal interaction time set, such as a time register, may be set aside in that storage unit to store the slave's signal interaction time groups. Because the slave can obtain multiple host feedback signal times, slave sending signal times, and slave receiving signal times, these signal times need to be grouped: when the slave obtains them, it uses the host feedback signal time as a label and extracts the adjacent slave sending signal time and slave receiving signal time to form one group of signal interaction times, which is stored in the preset signal interaction time set. This prevents mismatched signal time data from introducing interference, which would make the correction data inaccurate and degrade the correction effect.
  • after at least one signal interaction time group has been obtained, the slave can retrieve a signal interaction time group as the current time group and further process the signal times it contains.
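  • As a purely illustrative sketch (not the application's implementation), the grouping described above can be modeled as a small record type plus a register-like collection on the slave; all names below are hypothetical:

```python
from dataclasses import dataclass


@dataclass
class SignalInteractionGroup:
    """One signal interaction time group, labeled by its host feedback time."""
    slave_send_time: float     # S1: slave clock when the request signal was sent
    host_feedback_time: float  # M2: host clock when the request was received and echoed
    slave_recv_time: float     # S3: slave clock when the echoed request came back


class InteractionTimeSet:
    """Register-like store for the groups collected on the slave."""

    def __init__(self) -> None:
        self._groups: list[SignalInteractionGroup] = []

    def add(self, group: SignalInteractionGroup) -> None:
        self._groups.append(group)

    def pop_current(self) -> SignalInteractionGroup:
        # Take one group out of the set to act as the "current time group".
        return self._groups.pop(0)

    def remaining(self) -> list[SignalInteractionGroup]:
        return list(self._groups)

    def clear(self) -> None:
        # Cleared once a target time difference has been chosen, so stale
        # data cannot affect the next correction round.
        self._groups.clear()
```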
  • Step S20 Calculate the current time difference of the current time group based on the master feedback signal time, the slave signal sending time, and the slave signal receiving time.
  • in this embodiment, one signal interaction time group is extracted from the preset signal interaction time set.
  • each camera has a processor, and the processor can contain a data processing unit that analyzes the time information in the signal interaction time group and processes it according to a preset procedure. Because the host feedback signal time is adjacent to both the slave sending signal time and the slave receiving signal time, the processor can take the average of the sum of the slave sending signal time and the slave receiving signal time and compute the difference between that average and the host feedback signal time, which gives the current time difference of the extracted current time group; the processor then transfers this current time difference to a storage unit in the camera for later processing.
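  • A minimal sketch of this step, reusing the illustrative SignalInteractionGroup above and assuming times are plain floating-point seconds (function names are assumptions, not the application's API):

```python
def current_time_difference(group: SignalInteractionGroup) -> float:
    """|((S1 + S3) / 2) - M2|: magnitude of the slave/host offset for one group."""
    slave_feedback_time = (group.slave_send_time + group.slave_recv_time) / 2.0
    return abs(slave_feedback_time - group.host_feedback_time)


def signed_offset(group: SignalInteractionGroup) -> float:
    """Signed variant; a positive value means the slave clock reads ahead of the host."""
    return (group.slave_send_time + group.slave_recv_time) / 2.0 - group.host_feedback_time
```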
  • Step S30 Determine a target time difference between the master machine and the slave machine based on the current time difference, and correct the image acquisition time of the slave machine based on the target time difference.
  • determining the target time difference between the master and the slave according to the current time difference includes:
  • the processor in the camera can scan the preset signal interaction time set in the storage unit to check whether it contains multiple signal interaction time groups. If the preset signal interaction time set contains only one signal interaction time group, the processor takes the time difference calculated from that group as the target time difference between the host and the slave; if the set contains multiple signal interaction time groups, the processor calculates the time difference of each group in turn, obtaining multiple time difference values.
  • the data processing unit in the processor can compare and filter the time differences obtained in this way and select the smallest one. The processor keeps only this minimum time difference as the target time difference and clears the remaining time difference data and the preset signal interaction time set, so that historical data cannot affect the next correction.
  • after the processor extracts the target time difference, it does not change the slave's system time; instead it stores the target time difference in the storage unit and, after the target image has been captured, corrects the system time corresponding to the target image according to the target time difference.
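  • A sketch of how the target time difference could be selected and then held until capture is complete (building on the sketches above; keeping the signed offset of smallest magnitude combines the minimum search and the sign bookkeeping into one step, which is a simplification rather than the application's exact procedure):

```python
def select_target_offset(groups: list[SignalInteractionGroup]) -> float:
    """Keep the signed offset with the smallest magnitude: the exchange least
    disturbed by network delay is assumed to be closest to the true offset."""
    if not groups:
        raise ValueError("no signal interaction time groups collected")
    return min((signed_offset(g) for g in groups), key=abs)

# The slave's system clock is deliberately left untouched; the selected offset
# is only applied to image timestamps after capture (see the correction sketch
# further below).
```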
  • This embodiment provides a multi-camera synchronization correction method.
  • the method obtains one signal interaction time group from a preset signal interaction time set as the current time group, where each signal interaction time group includes a host feedback signal time, the slave sending signal time adjacent to the host feedback signal time, and the slave receiving signal time adjacent to the host feedback signal time; calculates the current time difference of the current time group based on the host feedback signal time, the slave sending signal time, and the slave receiving signal time; determines the target time difference between the host and the slave based on the current time difference; and corrects the image acquisition time of the slave based on the target time difference.
  • in this way, this embodiment extracts the signal interaction time groups from the preset signal interaction time set, calculates the corresponding time differences, determines the target time difference from them, and corrects the acquisition time of the images collected by the slave according to the target time difference. As a result, the image acquisition time of the slave is corrected based on the calculated system time difference between the slave and the host, avoiding interference from network fluctuations on the synchronization signal and the time error introduced by signal transmission; this not only improves the synchronization efficiency of multiple cameras but also improves the synchronization accuracy of the captured pictures, solving the current technical problem that multi-camera footage is out of sync.
  • Figure 3 is a flow chart of a second embodiment of a multi-camera synchronization correction method according to the present application.
  • based on the embodiment shown in Figure 2 above, in this embodiment, before step S10 the method further includes:
  • Step S01 Based on the synchronization signal, send a request signal to the host machine, and record the current slave machine system time as the slave machine sending signal time.
  • Step S02 When receiving the request signal fed back by the host, record the current slave system time as the slave receiving signal time.
  • Step S03 Based on the request signal fed back by the host, obtain the current host system time when the host sends the request signal as the host feedback signal time.
  • Step S04 Generate a signal interaction time group based on the master feedback signal time, the slave signal sending time, and the slave signal receiving time.
  • the slave triggers the synchronization function when receiving a synchronization signal sent by an external device (such as a server, a host, etc.).
  • the slave's processor can generate a request signal and, at the moment the request signal is sent, record the slave's current system time as the slave sending signal time; when the host receives the request signal, it can immediately take its own current system time at the moment of reception as the host feedback signal time and send the host feedback signal time together with the request signal straight back to the slave; when the slave receives the request signal fed back by the host, it can obtain the host feedback signal time and record its own current system time as the slave receiving signal time; the slave can then store these signal times in the preset signal interaction time set as one signal interaction time group.
  • the request signal can be sent repeatedly to obtain multiple signal interaction time groups.
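  • The exchange described above behaves like a simplified NTP-style probe. Below is a hedged slave-side sketch that collects several groups over UDP; the socket usage, port, JSON packet format, and function name are illustrative assumptions, not the application's protocol:

```python
import json
import socket
import time


def collect_groups(host_addr: tuple[str, int], rounds: int = 10,
                   timeout_s: float = 0.2) -> list[SignalInteractionGroup]:
    """Send the request signal `rounds` times and record (S1, M2, S3) for each round."""
    groups: list[SignalInteractionGroup] = []
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout_s)
        for _ in range(rounds):
            s1 = time.time()                                    # slave sending signal time
            sock.sendto(json.dumps({"slave_send": s1}).encode(), host_addr)
            try:
                payload, _ = sock.recvfrom(1024)
            except socket.timeout:
                continue                                        # drop lost probes
            s3 = time.time()                                    # slave receiving signal time
            m2 = json.loads(payload.decode())["host_feedback"]  # host feedback signal time
            groups.append(SignalInteractionGroup(s1, m2, s3))
    return groups
```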
  • a host is synchronously connected to at least one slave.
  • the request signal sent by a slave can carry identification information.
  • the host can receive and answer the information of all request signals, while each slave only accepts the information carried by the request signal corresponding to its own identification.
  • the identification information can take the form of unique letters, digits, or the like.
  • the request signal can be transmitted using the UDP protocol.
  • UDP is a connectionless transport-layer protocol in the OSI (Open Systems Interconnection) reference model: the source and the destination do not establish a connection before transmitting data; when data needs to be sent, UDP simply takes the data from the application and pushes it onto the network as quickly as possible. On the sending side, the rate at which UDP transmits data is limited only by the rate at which the application generates data, the capability of the computer, and the transmission bandwidth; on the receiving side, UDP places each message segment in a queue, and the application reads one message segment from the queue at a time. Because UDP is not a connection-oriented protocol, it consumes few resources and is fast to process.
  • the processors of the master camera and the slave cameras are connected within the same network, and using UDP to transmit the request signal can avoid the impact of network fluctuations on data transmission, allowing the request signal and the time information to be passed quickly; this shortens the transmission time and thereby improves the accuracy of multi-camera synchronization.
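  • For completeness, a matching host-side sketch that echoes each probe immediately with its own reception timestamp; again, the port and packet format are assumptions for illustration only:

```python
import json
import socket
import time


def run_host_responder(bind_addr: tuple[str, int] = ("0.0.0.0", 9999)) -> None:
    """Echo every request straight back, adding the host feedback signal time (M2)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(bind_addr)
        while True:
            payload, slave_addr = sock.recvfrom(1024)
            packet = json.loads(payload.decode())
            packet["host_feedback"] = time.time()  # stamp immediately, no other processing
            sock.sendto(json.dumps(packet).encode(), slave_addr)
```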
  • further, when the slave machine receives a synchronization signal sent by an external device (such as a server or the host), it triggers the synchronization function.
  • the slave can package the slave sending signal time as a data packet and send it to the host over UDP; after the host receives the data packet, it does not process its contents but immediately takes the current time as the host feedback signal time, adds it to the data packet, and sends the packet straight back to the slave; when the slave receives the data packet, it records the current slave system time as the slave receiving signal time; the slave sending signal time, the host feedback signal time, and the slave receiving signal time are then stored together in the preset signal interaction time set as one information interaction time group.
  • after the slave has finished extracting one information interaction time group, it can repeat the above steps to obtain N information interaction time groups; it can then calculate a time difference for each of the N groups and take the one with the smallest value as the target time difference, since the smaller the target time difference, the closer the host and slave clocks are.
  • here the data packet plays the role of the request signal, and the slave sending signal time contained in the data packet serves as that slave's identification information.
  • Figure 4 is a schematic diagram of the request signal transmission process provided by this application.
  • assume there is a reference time T, that the system time of the slave is S, and that the system time of the host is M. As shown in Figure 4, at reference time T1 the slave's current time is S1 and the host's current time is M1; the slave sends a request signal to the host at S1 and records S1 as the slave sending signal time; the host receives the request signal at reference time T2, when its own time is M2, records M2 as the host feedback signal time, and at M2 sends the request signal together with the host feedback signal time back to the slave; the slave receives the request signal at reference time T3 and records its system time at that moment, S3, as the slave receiving signal time.
  • it will be appreciated that transmission of the request signal between the slave and the host is normally undisturbed; nevertheless, to further avoid the influence of network fluctuations on signal transmission, which would desynchronize the subsequent picture frames of the host and the slave, the transmission time difference between host and slave is measured over several periods, and the measurement is repeated until the value obtained at an undisturbed moment is found, i.e. the time difference with the smallest absolute value is chosen as the target time difference. Specifically:
  • the transmission time of the request signal from the slave to the host is denoted C1 and the transmission time from the host back to the slave is denoted C2, so the total transmission time is C3 = C1 + C2 = T3 - T1, with C1 = T2 - T1 and C2 = T3 - T2. Measured in slave system time, the total transmission time is ΔS = S3 - S1. The host can be considered to receive the request at the midpoint between the time the slave sends it and the time the slave receives the echo, i.e. at slave time S2 = S1 + ΔS/2 = (S1 + S3)/2; if C1 = C2 then M2 = S2 and the host and slave system times are synchronized. Because of network fluctuations, however, C1 and C2 are generally not equal, and the slave and the host then differ by ΔT = S2 - M2. Repeating the exchange yields several values of ΔT; the closer a ΔT is to 0, the smaller the system time difference between slave and host, so when several ΔT values exist, the one with the smallest absolute value is selected as the target time difference. Correcting the acquisition time of the slave images with this minimal-magnitude target time difference further improves the calibration accuracy.
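  • A worked example with purely illustrative numbers (not taken from the application):

```latex
% Assume S1 = 10.000 s (slave send), S3 = 10.040 s (slave receive), M2 = 9.995 s (host feedback).
\[
S_2 = \frac{S_1 + S_3}{2} = \frac{10.000 + 10.040}{2} = 10.020\,\mathrm{s},
\qquad
\Delta T = S_2 - M_2 = 10.020 - 9.995 = 0.025\,\mathrm{s}.
\]
% A positive \Delta T means the slave clock reads 25 ms ahead of the host, so 25 ms would later be
% subtracted from each slave image timestamp; if another exchange yielded |\Delta T| = 0.021 s,
% that smaller-magnitude value would be preferred as the target time difference.
```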
  • Figure 5 is a schematic flowchart of a third embodiment of a multi-camera synchronization correction method provided by this application.
  • based on the embodiment shown in Figure 2 above, in this embodiment, step S20 further includes:
  • Step S21 Calculate the average of the sum of the signal sending time of the slave machine and the signal receiving time of the slave machine as the slave machine feedback signal time.
  • the slave's processor is equipped with a data processing unit that can extract the slave sending signal time and the slave receiving signal time from the storage unit; the extracted slave sending signal time and slave receiving signal time are adjacent to the same host feedback signal time, and the unit then calculates the average of the sum of these two times.
  • Step S22 Calculate the difference between the slave feedback signal time and the master feedback signal time, and use the absolute value of the difference as the current time difference.
  • after obtaining the average of the slave sending signal time and the slave receiving signal time, the data processing unit extracts the corresponding host feedback signal time and calculates the difference between that average and the host feedback signal time; because this difference may be negative, the data processing unit takes its absolute value as the current time difference.
  • Figure 6 is a flow chart of a fourth embodiment of a multi-camera synchronization correction method provided by this application.
  • based on the embodiment shown in Figure 2 above, in this embodiment, step S30 further includes:
  • Step S31 based on the image collected by the slave machine, determine the collection time of the current image as the initial image time;
  • Step S32 Calculate the initial image time based on the target time difference, and determine the corrected image time of the target image.
  • the host machine and the slave machine may collect images before synchronizing, or they may collect images after the target time difference has been obtained; after the slave calculates the target time difference, the slave system time itself is not corrected, so as not to affect the operation of the slave system.
  • the acquisition time of the slave system image is still the slave system time, and the slave system time corresponding to the slave image is used as the initial image time.
  • the initial image time of the slave image is corrected according to the target time difference, and whether the target time difference is added or subtracted is determined by the sign of the difference calculated in step S22.
  • further, step S32 also includes:
  • determining that the difference is positive, and calculating the difference between the initial image time and the target time difference as the corrected image time;
  • determining that the difference is negative, and calculating the sum of the initial image time and the target time difference as the corrected image time.
  • it will be appreciated that if the difference between the slave feedback signal time and the host feedback signal time is positive, the host had already received the request signal before the slave feedback signal time, indicating that the slave's system time runs ahead of the host's system time; in this case the target time difference needs to be subtracted from the initial image time to achieve the time correction of the slave image and obtain the corrected image time.
  • conversely, if the difference is negative, the host had not yet received the request signal at the slave feedback signal time, indicating that the slave's system time runs behind the host's system time; in this case the target time difference needs to be added to the initial image time to achieve the time correction of the slave image and obtain the corrected image time.
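  • A sketch of the correction step under the sign convention above (a positive signed offset means the slave clock is ahead of the host, so its magnitude is subtracted); the function name is illustrative:

```python
def corrected_image_time(initial_image_time: float, signed_target_offset: float) -> float:
    """Map a slave image timestamp onto the host timeline.

    signed_target_offset > 0: slave clock ahead of the host -> subtract its magnitude
    signed_target_offset < 0: slave clock behind the host   -> add its magnitude
    Both cases reduce to subtracting the signed offset.
    """
    return initial_image_time - signed_target_offset
```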
  • FIG. 7 is a schematic flowchart of a fifth embodiment of a multi-camera synchronization correction method provided by this application.
  • based on the embodiment shown in Figure 6 above, in this embodiment, after step S32 the method further includes:
  • Step S33 obtain the first frame image of the host as the first frame host image
  • Step S34 among the frame images corresponding to the slave machine, determine the slave machine image corresponding to the collection time of the target image as the first frame slave machine image;
  • Step S35 Output synchronized video based on the first frame of the host image and the first frame of the slave image.
  • the host can first determine the first frame of its output video and send the host first-frame image time corresponding to that frame to the slave machine.
  • after receiving the host's time information, the slave can match it against the corrected image times of the images it has collected and locate the target image time closest to the host's first-frame image time; the slave image corresponding to that target image time is used as the first frame of the slave's output video.
  • once the slave has determined its first frame, the host and the slave receive the video output instruction from the server and, starting from their respective first frames, output the video synchronously in chronological order; the output video consists of pictures taken by multiple cameras from multiple angles at the same moments.
  • further, after the slave camera finishes capturing, its image acquisition times are corrected and the corrected images can be output to the server as a video stream; the master camera likewise outputs its captured images to the server as a video stream; the server reads the video streams of the master and slave cameras, matches their image acquisition times, extracts the images whose times are closest as the first frame of each camera, and plays the video streams back synchronously in chronological order.
  • it will be appreciated that, because the master and slave cameras shoot synchronously, their exposure time and shooting duration settings are the same; that is, in the images taken by the master camera and a slave camera, the interval between two adjacent frames is theoretically equal and the number of pictures is theoretically the same. Once the first-frame image times of the master and slave cameras are synchronized, the image times output by the master and the slaves can be regarded as corresponding one to one and being synchronized.
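  • A sketch of the first-frame matching step, assuming the slave's timestamps have already been corrected onto the host timeline (names are illustrative):

```python
def first_slave_frame_index(corrected_slave_times: list[float],
                            host_first_frame_time: float) -> int:
    """Index of the slave frame whose corrected time is closest to the host's first frame."""
    if not corrected_slave_times:
        raise ValueError("slave has no frames to match")
    return min(range(len(corrected_slave_times)),
               key=lambda i: abs(corrected_slave_times[i] - host_first_frame_time))
```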
  • embodiments of the present application also provide a multi-camera synchronization correction device.
  • FIG. 8 is a functional module schematic diagram of the first embodiment of the multi-camera synchronization correction device of the present application.
  • the multi-camera synchronization correction device includes:
  • the current time group acquisition module 10 is used to obtain one signal interaction time group from the preset signal interaction time set as the current time group, where each signal interaction time group includes the host feedback signal time, the slave sending signal time adjacent to the host feedback signal time, and the slave receiving signal time adjacent to the host feedback signal time;
  • the current time difference calculation module 20 is used to calculate the current time difference of the current time group based on the host feedback signal time, the slave signal sending time, and the slave signal receiving time;
  • the image acquisition time correction module 30 is configured to determine the target time difference between the host machine and the slave machine based on the current time difference, and correct the image acquisition time of the slave machine based on the target time difference.
  • further, the image acquisition time correction module 30 specifically includes:
  • a plurality of time group acquisition units configured to acquire each signal interaction time group except the current time group in the signal interaction time set;
  • the target time difference determination unit is configured to calculate each time difference corresponding to each signal interaction time group, and determine the minimum time difference among each time difference and the current time difference as the target time difference.
  • the multi-camera synchronization correction device includes a signal interaction time group generation module, and the signal interaction time group generation module specifically includes:
  • a slave sending signal time acquisition unit, used to send a request signal to the host based on the synchronization signal, and to record the current slave machine system time as the slave machine sending signal time;
  • the slave machine receiving signal time acquisition unit is configured to record the current slave machine system time as the slave machine receiving signal time when receiving the request signal fed back by the host machine;
  • a host feedback signal time acquisition unit configured to obtain the current host system time when the host sends the request signal based on the request signal fed back by the host as the host feedback signal time;
  • a signal interaction time group generating unit is configured to generate one of the signal interaction time groups based on the host feedback signal time, the slave machine sending signal time, and the slave machine receiving signal time.
  • the current time difference calculation module 20 specifically includes:
  • the slave machine feedback signal time calculation unit is used to calculate the average of the slave machine sending signal time and the slave machine receiving signal time as the slave machine feedback signal time;
  • a current time difference calculation unit is used to calculate the difference between the slave feedback signal time and the host feedback signal time, and use the absolute value of the difference as the current time difference.
  • further, the image acquisition time correction module 30 also specifically includes:
  • the initial image time determination unit is used to determine the acquisition time of the current image as the initial image time based on the image collected by the slave machine.
  • the corrected image time determination unit is configured to calculate the initial image time based on the target time difference and determine the corrected image time of the target image.
  • the corrected image time determination unit specifically includes:
  • a difference time calculation subunit, used to determine that the difference is positive and to calculate the difference between the initial image time and the target time difference as the corrected image time;
  • a sum time calculation subunit, used to determine that the difference is negative and to calculate the sum of the initial image time and the target time difference as the corrected image time.
  • the multi-camera synchronization correction device includes a synchronization video output module, and the video synchronization output module specifically includes:
  • the first frame host image acquisition unit is used to acquire the first frame image of the host as the first frame host image.
  • the first frame slave image acquisition unit is used to determine the slave image corresponding to the acquisition time of the target image among the frame images corresponding to the slave machine as the first frame slave image.
  • a synchronized video output unit is configured to output synchronized video based on the first frame of the host image and the first frame of the slave image.
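  • As an illustration only, the module decomposition above maps naturally onto a small slave-side class skeleton; this reuses the earlier sketches, all names are hypothetical, and it is not the application's implementation:

```python
class MultiCameraSyncCorrector:
    """Slave-side skeleton mirroring modules 10, 20 and 30 described above."""

    def __init__(self, time_set: InteractionTimeSet) -> None:
        self.time_set = time_set

    def acquire_current_group(self) -> SignalInteractionGroup:              # module 10
        return self.time_set.pop_current()

    def current_difference(self, group: SignalInteractionGroup) -> float:   # module 20
        return current_time_difference(group)

    def correct(self, initial_image_time: float) -> float:                  # module 30
        groups = [self.acquire_current_group()] + self.time_set.remaining()
        target = select_target_offset(groups)
        self.time_set.clear()
        return corrected_image_time(initial_image_time, target)
```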
  • Each module in the above-mentioned multi-camera synchronization correction device corresponds to each step in the above-mentioned multi-camera synchronization correction method embodiment, and its functions and implementation processes will not be described in detail here.
  • embodiments of the present invention also provide a computer-readable storage medium.
  • the computer-readable storage medium of the present invention stores a multi-camera synchronization correction program.
  • the multi-camera synchronization correction program is executed by a processor, the steps of the multi-camera synchronization correction method as described above are implemented.
  • the application may be used in numerous general-purpose or special-purpose computer system environments or configurations, for example: personal computers, server computers, handheld or portable devices, tablet devices, multiprocessor systems, microprocessor-based systems, set-top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, and distributed computing environments that include any of the above systems or devices.
  • the application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer.
  • program modules include routines, programs, objects, components, data structures, etc. that perform specific tasks or implement specific abstract data types.
  • the present application may also be practiced in distributed computing environments where tasks are performed by remote processing devices connected through a communications network.
  • program modules may be located in both local and remote computer storage media including storage devices.
  • the methods of the above embodiments can be implemented by means of software plus the necessary general-purpose hardware platform; they can of course also be implemented by hardware, but in many cases the former is the better implementation.
  • based on this understanding, the part of the technical solution of the present invention that is essential or that contributes to the prior art can be embodied in the form of a software product; the computer software product is stored in a storage medium as mentioned above (such as ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions that cause a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the various embodiments of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

This application provides a multi-camera synchronization correction method. The method extracts signal interaction time groups from a preset signal interaction time set, calculates the corresponding time differences, and corrects the acquisition time of the slave images; the first frame image of the slave is then determined from the acquisition time of the host's first frame image, and the host and the slave output their captured images starting from their own first frame, achieving synchronous output of the images of multiple cameras for the same moment.

Description

Multi-camera synchronization correction method, device, and storage medium. Technical field
The present invention relates to the technical field of multi-camera systems, and in particular to a multi-camera synchronization correction method, device, and storage medium.
Background art
With the development of science and technology, users are no longer satisfied with capturing a single picture and instead pursue the synchronous acquisition of multi-angle stereo images.
Technical problem
Synchronous acquisition of multi-angle stereo images requires multiple cameras to shoot synchronously. The system times of the cameras, however, are independent of each other; time synchronization is generally achieved through a network time protocol or a synchronization device, but with this approach the system time differences between the cameras remain relatively large, and the output video shows obvious picture desynchronization.
Technical solution
The purpose of this application is to provide a multi-camera synchronization correction method, aiming to solve the existing technical problem that the pictures output by multiple cameras shooting synchronously are not synchronized.
To achieve the above purpose, this application provides a multi-camera synchronization correction method, the steps of which include: obtaining, in a preset signal interaction time set, one signal interaction time group as the current time group, where each signal interaction time group includes a host feedback signal time, the slave sending signal time adjacent to the host feedback signal time, and the slave receiving signal time adjacent to the host feedback signal time; calculating the current time difference of the current time group based on the host feedback signal time, the slave sending signal time, and the slave receiving signal time; and determining the target time difference between the host and the slave according to the current time difference, and correcting the image acquisition time of the slave according to the target time difference.
In addition, to achieve the above purpose, this application also provides a multi-camera synchronization correction device, which includes: a signal interaction time group acquisition module, used to obtain, in a preset signal interaction time set, one signal interaction time group as the current time group, where each signal interaction time group includes a host feedback signal time, the slave sending signal time adjacent to the host feedback signal time, and the slave receiving signal time adjacent to the host feedback signal time; a current time difference calculation module, used to calculate the current time difference of the current time group based on the host feedback signal time, the slave sending signal time, and the slave receiving signal time; and an image acquisition time correction module, used to determine the target time difference between the host and the slave according to the current time difference and to correct the image acquisition time of the slave according to the target time difference.
In addition, to achieve the above purpose, this application also provides multi-camera synchronization correction equipment, which includes a processor, a memory, and a multi-camera synchronization correction program stored on the memory and executable by the processor; when the multi-camera synchronization correction program is executed by the processor, the steps of the multi-camera synchronization correction method described above are implemented.
In addition, to achieve the above purpose, this application also provides a computer-readable storage medium on which a multi-camera synchronization correction program is stored; when the multi-camera synchronization correction program is executed by a processor, the steps of the multi-camera synchronization correction method described above are implemented.
Beneficial effects
This application provides a multi-camera synchronization correction method. The method obtains, in a preset signal interaction time set, one signal interaction time group as the current time group, where each signal interaction time group includes a host feedback signal time, the slave sending signal time adjacent to the host feedback signal time, and the slave receiving signal time adjacent to the host feedback signal time; calculates the current time difference of the current time group based on the host feedback signal time, the slave sending signal time, and the slave receiving signal time; determines the target time difference between the host and the slave according to the current time difference; and corrects the image acquisition time of the slave according to the target time difference. In this way, this application extracts the signal interaction time groups from the preset signal interaction time set, calculates the corresponding time differences, and corrects the acquisition time of the slave images; the first frame image of the slave is then determined from the acquisition time of the host's first frame image, and the host and the slaves output the captured images starting from their respective first frames, achieving synchronous output of the images of multiple cameras for the same moment. As a result, the acquisition time of the slave images is corrected based on the time difference between the slave and the host, avoiding the time error caused by network fluctuations affecting signal transmission; this not only improves the synchronization efficiency of multiple cameras but also improves the synchronization accuracy of the captured pictures, solving the current technical problem that multi-camera footage is out of sync.
Brief description of the drawings
Figure 1 is a schematic diagram of the hardware structure of the multi-camera synchronization correction equipment involved in the embodiments of the present invention.
Figure 2 is a schematic flowchart of the first embodiment of the multi-camera synchronization correction method of this application.
Figure 3 is a flowchart of the second embodiment of the multi-camera synchronization correction method of this application.
Figure 4 is a schematic diagram of the request signal transmission process provided by this application.
Figure 5 is a schematic flowchart of the third embodiment of the multi-camera synchronization correction method provided by this application.
Figure 6 is a flowchart of the fourth embodiment of the multi-camera synchronization correction method provided by this application.
Figure 7 is a schematic flowchart of the fifth embodiment of the multi-camera synchronization correction method provided by this application.
Figure 8 is a schematic diagram of the functional modules of the first embodiment of the multi-camera synchronization correction device of this application.
The realization of the purpose, functional characteristics, and advantages of the present invention will be further explained with reference to the accompanying drawings in conjunction with the embodiments.
Embodiments of the invention
It should be understood that the specific embodiments described here are only intended to explain the present invention and are not intended to limit it.
The multi-camera synchronization correction method involved in the embodiments of this application is mainly applied to multi-camera synchronization correction equipment, which can be a PC, a portable computer, a mobile terminal, or other equipment with display and processing functions.
Referring to Figure 1, Figure 1 is a schematic diagram of the hardware structure of the multi-camera synchronization correction equipment involved in the embodiments of the present invention. In the embodiments of the present invention, the multi-camera synchronization correction equipment may include a processor 1001 (such as a CPU), a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. The communication bus 1002 is used to realize connection and communication between these components; the user interface 1003 may include a display screen (Display) and an input unit such as a keyboard (Keyboard); the network interface 1004 may optionally include a standard wired interface and a wireless interface (such as a Wi-Fi interface); the memory 1005 may be a high-speed RAM memory or a stable (non-volatile) memory such as a disk memory, and may optionally be a storage device independent of the aforementioned processor 1001.
Those skilled in the art will understand that the hardware structure shown in Figure 1 does not limit the multi-camera synchronization correction equipment, which may include more or fewer components than shown, combine certain components, or use a different component layout.
Continuing to refer to Figure 1, the memory 1005 in Figure 1, as a computer-readable storage medium, may include an operating system, a network communication module, and a multi-camera synchronization correction program.
In Figure 1, the network communication module is mainly used to connect to the server and perform data communication with the server; the processor 1001 can call the multi-camera synchronization correction program stored in the memory 1005 and execute the multi-camera synchronization correction method provided by the embodiments of the present invention.
The embodiments of this application provide a multi-camera synchronization correction method.
Referring to Figure 2, Figure 2 is a schematic flowchart of the first embodiment of the multi-camera synchronization correction method of this application.
In this embodiment, the multi-camera synchronization correction method involved in the embodiments of this application is applied to a multi-camera synchronization correction system, which includes one host machine and at least one slave machine; at least one slave machine is synchronously connected to the host machine, and the synchronous connection can be realized through devices with a signal synchronization function, such as a synchronization line and/or a synchronization controller. The execution subjects of the multi-camera synchronization correction method are the host and the slave in the multi-camera synchronization correction system.
Alternatively, the multi-camera synchronization correction system includes one host machine, one slave machine, and one server, and the execution subject of the multi-camera synchronization correction method is the server in the multi-camera synchronization correction system.
It will be appreciated that, to reduce the systematic error caused by network fluctuations affecting signal transmission and to improve the synchronization of the video output by multiple cameras, the host and slave machines in the multi-camera synchronization correction system are preferably used as the execution subjects.
The following description takes the host and the slave in a multi-camera synchronization correction system as the execution subjects as an example; the multi-camera synchronization correction method includes the following steps:
Step S10: in the preset signal interaction time set, obtain one signal interaction time group as the current time group, where each signal interaction time group includes a host feedback signal time, the slave sending signal time adjacent to the host feedback signal time, and the slave receiving signal time adjacent to the host feedback signal time.
In this embodiment, a storage unit may be provided in the slave's sensor, and a storage area for the preset signal interaction time set, such as a time register, may be set aside in the storage unit to store the slave's signal interaction time groups. Because the slave can obtain multiple host feedback signal times, slave sending signal times, and slave receiving signal times, these signal times need to be grouped: when the slave obtains them, it uses the host feedback signal time as a label and extracts the adjacent slave sending signal time and slave receiving signal time to form one group of signal interaction times, which is stored in the preset signal interaction time set; this prevents mismatched signal time data from introducing interference, which would make the correction data inaccurate and degrade the correction effect.
In this embodiment, after at least one signal interaction time group has been obtained, the slave can retrieve a signal interaction time group as the current time group and further process the signal times contained in the current time group.
Step S20: calculate the current time difference of the current time group based on the host feedback signal time, the slave sending signal time, and the slave receiving signal time.
In this embodiment, one signal interaction time group is extracted from the preset signal interaction time set. Each camera has a processor, and the processor can contain a data processing unit that analyzes the time information in the signal interaction time group and processes it according to a preset procedure. Because the host feedback signal time is adjacent to both the slave sending signal time and the slave receiving signal time, the processor can take the average of the sum of the slave sending signal time and the slave receiving signal time and compute the difference between that average and the host feedback signal time, which gives the current time difference of the extracted current time group; the processor then transfers this current time difference to a storage unit in the camera for later processing.
Step S30: determine the target time difference between the host and the slave according to the current time difference, and correct the image acquisition time of the slave according to the target time difference.
Determining the target time difference between the host and the slave according to the current time difference includes:
obtaining, in the signal interaction time set, each signal interaction time group other than the current time group;
calculating the time difference corresponding to each signal interaction time group, and determining the minimum time difference among these time differences and the current time difference as the target time difference.
In this embodiment, the processor in the camera can scan the preset signal interaction time set in the storage unit to check whether it contains multiple signal interaction time groups. If the preset signal interaction time set contains only one signal interaction time group, the processor takes the time difference calculated from that group as the target time difference between the host and the slave; if the set contains multiple signal interaction time groups, the processor calculates the time difference of each group in turn, obtaining multiple time difference values. The data processing unit in the processor can compare and filter the obtained time differences and select the smallest one; the processor keeps only this minimum time difference as the target time difference and clears the remaining time difference data and the preset signal interaction time set, so that historical data cannot affect the next correction result. After the processor obtains the target time difference, it does not change the slave's system time; instead it stores the target time difference in the storage unit and, after the target image has been captured, corrects the system time corresponding to the target image according to the target time difference.
This embodiment provides a multi-camera synchronization correction method. The method obtains, in a preset signal interaction time set, one signal interaction time group as the current time group, where each signal interaction time group includes a host feedback signal time, the slave sending signal time adjacent to the host feedback signal time, and the slave receiving signal time adjacent to the host feedback signal time; calculates the current time difference of the current time group based on the host feedback signal time, the slave sending signal time, and the slave receiving signal time; determines the target time difference between the host and the slave according to the current time difference; and corrects the image acquisition time of the slave according to the target time difference. In this way, this embodiment extracts the signal interaction time groups from the preset signal interaction time set, calculates the corresponding time differences, determines the target time difference from them, and corrects the acquisition time of the images collected by the slave according to the target time difference. As a result, the image acquisition time of the slave is corrected based on the calculated system time difference between the slave and the host, avoiding interference from network fluctuations on the synchronization signal and the time error introduced by signal transmission; this not only improves the synchronization efficiency of multiple cameras but also improves the synchronization accuracy of the captured pictures, solving the current technical problem that multi-camera footage is out of sync.
Referring to Figure 3, Figure 3 is a flowchart of the second embodiment of the multi-camera synchronization correction method of this application.
Based on the embodiment shown in Figure 2 above, in this embodiment, before step S10 the method further includes:
Step S01: based on the synchronization signal, send a request signal to the host and record the current slave system time as the slave sending signal time.
Step S02: when the request signal fed back by the host is received, record the current slave system time as the slave receiving signal time.
Step S03: based on the request signal fed back by the host, obtain the current host system time at which the host sent back the request signal as the host feedback signal time.
Step S04: generate one signal interaction time group based on the host feedback signal time, the slave sending signal time, and the slave receiving signal time.
In this embodiment, when the slave receives a synchronization signal sent by an external device (such as a server or the host), it triggers the synchronization function. The slave's processor can generate a request signal and, at the moment the request signal is sent, record the slave's current system time as the slave sending signal time; when the host receives the request signal, it can immediately take its current system time at the moment of reception as the host feedback signal time and send the host feedback signal time and the request signal straight back to the slave; when the slave receives the request signal fed back by the host, it can obtain the host feedback signal time and record its current system time as the slave receiving signal time; the slave can then store these signal times, in the order in which they were generated (slave sending signal time, host feedback signal time, slave receiving signal time), in the preset signal interaction time set as one signal interaction time group. The request signal can be sent repeatedly to obtain multiple signal interaction time groups.
In this embodiment, one host is synchronously connected to at least one slave, and the request signal sent by a slave can carry identification information; the host can receive and answer the information of all request signals, while each slave only accepts the information carried by the request signal corresponding to its own identification. The identification information can take the form of unique letters, digits, or the like.
In this embodiment, the request signal can be transmitted using the UDP protocol. UDP is a connectionless transport-layer protocol in the OSI (Open Systems Interconnection) reference model: the source and the destination do not establish a connection before transmitting data; when data needs to be sent, UDP simply takes the data from the application and pushes it onto the network as quickly as possible. On the sending side, the rate at which UDP transmits data is limited only by the rate at which the application generates data, the capability of the computer, and the transmission bandwidth; on the receiving side, UDP places each message segment in a queue, and the application reads one message segment from the queue at a time. Because UDP is not a connection-oriented protocol, it consumes few resources and is fast to process.
In this embodiment, the processors of the master camera and the slave cameras are connected within the same network, and using UDP to transmit the request signal can avoid the impact of network fluctuations on data transmission, allowing the request signal and the time information to be passed quickly; this shortens the transmission time and thereby improves the accuracy of multi-camera synchronization.
Further, when the slave receives a synchronization signal sent by an external device (such as a server or the host), it triggers the synchronization function. The slave can package the slave sending signal time as a data packet and send it to the host over UDP; after the host receives the data packet, it does not process its contents but immediately takes the current time as the host feedback signal time, adds the host feedback signal time to the data packet, and sends the packet straight back to the slave; when the slave receives the data packet, it records the current slave system time as the slave receiving signal time, and the slave sending signal time, the host feedback signal time, and the slave receiving signal time are stored together in the preset signal interaction time set as one information interaction time group. After the slave has finished extracting one information interaction time group, it can repeat the above steps to obtain N information interaction time groups; it can then calculate a time difference for each of the N groups and take the smallest of the N time differences as the target time difference, since the smaller the target time difference, the closer the host and slave times are. Here the data packet plays the role of the request signal, and the slave sending signal time contained in the data packet serves as that slave's identification information.
Further, referring to Figure 4, Figure 4 is a schematic diagram of the request signal transmission process provided by this application.
In this embodiment, assume there is a reference time T, that the system time of the slave is S, and that the system time of the host is M. As shown in Figure 4, at reference time T1 the current time of the slave is S1 and the current time of the host is M1; the slave sends a request signal to the host at S1 and records S1 as the slave sending signal time; the host receives the request signal at reference time T2, when the host time is M2, records M2 as the host feedback signal time, and at M2 sends the request signal together with the host feedback signal time to the slave; the slave receives the request signal at reference time T3 and records its system time at that moment, S3, as the slave receiving signal time.
It will be appreciated that transmission of the request signal between the slave and the host is normally not disturbed; in this embodiment, however, to further avoid the influence of network fluctuations on signal transmission, which would desynchronize the subsequent picture frames of the host and the slave, the transmission time difference between the host and the slave needs to be obtained over several periods. The transmission time difference is therefore tested repeatedly until the value measured at a moment without interference is determined, i.e. the time difference with the smallest absolute value is selected as the target time difference. Specifically:
Denote the transmission time of the request signal from the slave to the host as C1 and the transmission time of the request signal back from the host to the slave as C2; the total transmission time of the request signal is then C3 = C1 + C2 = T3 - T1, where C1 = T2 - T1 and C2 = T3 - T2. Measured in slave system time, the total transmission time of the request signal is ΔS = S3 - S1. The host can be considered to receive the request signal at the midpoint between the time the slave sends the request and the time the slave receives it, i.e. at slave time S2 = S1 + ΔS/2 = (S1 + S3)/2; at that moment C1 = C2 and M2 = S2, which means the host and slave system times are synchronized. Because of network fluctuations and similar causes, however, C1 and C2 are generally not equal, and the slave and the host then differ by ΔT = S2 - M2. Repeating the above operation yields multiple slave-host time differences ΔT; the closer a value of ΔT is to 0, the smaller the system time difference between the slave and the host and the closer they are to time synchronization, so when multiple values of ΔT exist, the time difference with the smallest absolute value is determined among them and that ΔT is extracted as the target time difference. Correcting the acquisition time of the slave images with the target time difference of smallest absolute value further improves the calibration accuracy.
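A compact restatement of the relations in the preceding paragraph in standard notation (no new quantities are introduced; k indexes repeated exchanges):

```latex
\begin{aligned}
C_3 &= C_1 + C_2 = T_3 - T_1, & C_1 &= T_2 - T_1, & C_2 &= T_3 - T_2,\\
\Delta S &= S_3 - S_1, & S_2 &= S_1 + \tfrac{\Delta S}{2} = \tfrac{S_1 + S_3}{2},\\
\Delta T &= S_2 - M_2, & \text{target time difference} &= \Delta T_{k^*},\quad k^* = \arg\min_k \lvert \Delta T_k \rvert.
\end{aligned}
```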
Referring to Figure 5, Figure 5 is a schematic flowchart of the third embodiment of the multi-camera synchronization correction method provided by this application.
Based on the embodiment shown in Figure 2 above, in this embodiment, step S20 further includes:
Step S21: calculate the average of the sum of the slave sending signal time and the slave receiving signal time as the slave feedback signal time.
In this embodiment, the slave's processor is equipped with a data processing unit that can extract the slave sending signal time and the slave receiving signal time from the storage unit; the extracted slave sending signal time and slave receiving signal time are adjacent to the same host feedback signal time, and the average of the sum of these two times is then calculated.
Step S22: calculate the difference between the slave feedback signal time and the host feedback signal time, and take the absolute value of the difference as the current time difference.
In this embodiment, after obtaining the average of the sum of the slave sending signal time and the slave receiving signal time, the data processing unit extracts the corresponding host feedback signal time and calculates the difference between that average and the host feedback signal time; because this difference may be a negative time, the data processing unit takes the absolute value of the difference as the current time difference.
Referring to Figure 6, Figure 6 is a flowchart of the fourth embodiment of the multi-camera synchronization correction method provided by this application.
Based on the embodiment shown in Figure 2 above, in this embodiment, step S30 further includes:
Step S31: based on the images collected by the slave, determine the acquisition time of the current image as the initial image time;
Step S32: based on the target time difference, perform a calculation on the initial image time and determine the corrected image time of the target image.
In this embodiment, the host and the slave may collect images before synchronizing, or they may collect images after the target time difference has been obtained. After the slave calculates and obtains the target time difference, the slave system time itself is not corrected, so as not to affect the operation of the slave system; the acquisition time of the slave images is therefore still the slave system time, and the slave system time corresponding to a slave image is taken as the initial image time. The initial image time of the slave image is then corrected according to the target time difference, and whether the target time difference is added or subtracted is determined by the sign of the difference calculated in step S22 above.
Further, step S32 also includes:
determining that the difference is positive, and calculating the difference between the initial image time and the target time difference as the corrected image time;
determining that the difference is negative, and calculating the sum of the initial image time and the target time difference as the corrected image time.
It will be appreciated that if the difference between the slave feedback signal time and the host feedback signal time is positive, the host had already received the request signal before the slave feedback signal time, which indicates that the slave's system time runs ahead of the host's system time; in this case, the target time difference needs to be subtracted from the initial image time to achieve the time correction of the slave image and obtain the corrected image time.
Further, if the difference between the slave feedback signal time and the host feedback signal time is negative, the host had not yet received the request signal at the slave feedback signal time, which indicates that the slave's system time runs behind the host's system time; in this case, the target time difference needs to be added to the initial image time to achieve the time correction of the slave image and obtain the corrected image time.
Referring to Figure 7, Figure 7 is a schematic flowchart of the fifth embodiment of the multi-camera synchronization correction method provided by this application.
Based on the embodiment shown in Figure 6 above, in this embodiment, after step S32 the method further includes:
Step S33: obtain the first frame image of the host as the first-frame host image;
Step S34: among the frame images corresponding to the slave, determine the slave image corresponding to the acquisition time of the target image as the first-frame slave image;
Step S35: output synchronized video based on the first-frame host image and the first-frame slave image.
In this embodiment, the host can first determine the first frame image of its output video and send the host first-frame image time corresponding to that frame to the slave; after receiving the host's time information, the slave can match it against the corrected image times of the images it has collected and locate the target image time closest to the host's first-frame image time, and the slave image corresponding to that target image time is used as the first frame image of the slave's output video. Once the slave has determined its first frame image, the host and the slave receive the video output instruction from the server and, starting from their respective cameras' first frame images, output the video synchronously in chronological order; the output video consists of pictures taken by multiple cameras from multiple angles at the same moments.
Further, after the slave camera finishes capturing images, the image acquisition times are corrected and the corrected images can be output to the server as a video stream; the master camera outputs its captured images to the server as a video stream; the server reads the video streams of the master camera and the slave camera, matches their image acquisition times, extracts the images whose times are closest as the first frame image of each camera, and plays the video streams back synchronously in chronological order.
It will be appreciated that, because the master camera and the slave camera shoot synchronously, their exposure time and shooting duration settings are the same; that is, in the images taken by the master camera and the slave camera, the interval between two adjacent frames is theoretically equal and the number of pictures is theoretically the same. Once the first-frame image times of the master camera and the slave camera are synchronized, the image times output by the master camera and the slave camera can be regarded as corresponding one to one and being synchronized.
In addition, the embodiments of this application also provide a multi-camera synchronization correction device.
Referring to Figure 8, Figure 8 is a schematic diagram of the functional modules of the first embodiment of the multi-camera synchronization correction device of this application.
In this embodiment, the multi-camera synchronization correction device includes:
a current time group acquisition module 10, used to obtain, in the preset signal interaction time set, one signal interaction time group as the current time group, where each signal interaction time group includes a host feedback signal time, the slave sending signal time adjacent to the host feedback signal time, and the slave receiving signal time adjacent to the host feedback signal time;
a current time difference calculation module 20, used to calculate the current time difference of the current time group based on the host feedback signal time, the slave sending signal time, and the slave receiving signal time;
an image acquisition time correction module 30, used to determine the target time difference between the host and the slave according to the current time difference and to correct the image acquisition time of the slave according to the target time difference.
Further, the image acquisition time correction module 30 specifically includes:
a multiple time group acquisition unit, used to obtain, in the signal interaction time set, each signal interaction time group other than the current time group;
a target time difference determination unit, used to calculate the time difference corresponding to each signal interaction time group and to determine the minimum time difference among these time differences and the current time difference as the target time difference.
Further, the multi-camera synchronization correction device includes a signal interaction time group generation module, which specifically includes:
a slave sending signal time acquisition unit, used to send a request signal to the host based on the synchronization signal and to record the current slave system time as the slave sending signal time;
a slave receiving signal time acquisition unit, used to record the current slave system time as the slave receiving signal time when the request signal fed back by the host is received;
a host feedback signal time acquisition unit, used to obtain, based on the request signal fed back by the host, the current host system time at which the host sent back the request signal as the host feedback signal time;
a signal interaction time group generation unit, used to generate one signal interaction time group based on the host feedback signal time, the slave sending signal time, and the slave receiving signal time.
Further, the current time difference calculation module 20 specifically includes:
a slave feedback signal time calculation unit, used to calculate the average of the sum of the slave sending signal time and the slave receiving signal time as the slave feedback signal time;
a current time difference calculation unit, used to calculate the difference between the slave feedback signal time and the host feedback signal time and to take the absolute value of the difference as the current time difference.
Further, the image acquisition time correction module 30 also specifically includes:
an initial image time determination unit, used to determine, based on the images collected by the slave, the acquisition time of the current image as the initial image time;
a corrected image time determination unit, used to perform a calculation on the initial image time based on the target time difference and to determine the corrected image time of the target image.
Further, the corrected image time determination unit specifically includes:
a difference time calculation subunit, used to determine that the difference is positive and to calculate the difference between the initial image time and the target time difference as the corrected image time;
a sum time calculation subunit, used to determine that the difference is negative and to calculate the sum of the initial image time and the target time difference as the corrected image time.
Further, the multi-camera synchronization correction device includes a synchronized video output module, which specifically includes:
a first-frame host image acquisition unit, used to obtain the first frame image of the host as the first-frame host image;
a first-frame slave image acquisition unit, used to determine, among the frame images corresponding to the slave, the slave image corresponding to the acquisition time of the target image as the first-frame slave image;
a synchronized video output unit, used to output synchronized video based on the first-frame host image and the first-frame slave image.
Each module in the above multi-camera synchronization correction device corresponds to each step in the above embodiments of the multi-camera synchronization correction method, and their functions and implementation processes are not described in detail again here.
In addition, the embodiments of the present invention also provide a computer-readable storage medium.
A multi-camera synchronization correction program is stored on the computer-readable storage medium of the present invention; when the multi-camera synchronization correction program is executed by a processor, the steps of the multi-camera synchronization correction method described above are implemented.
The method implemented when the multi-camera synchronization correction program is executed can refer to the various embodiments of the multi-camera synchronization correction method of the present invention and is not described again here.
It should be noted that, in this document, the terms "comprise", "include", or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or system that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or system. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or system that includes that element.
The above serial numbers of the embodiments of the present invention are for description only and do not represent the superiority or inferiority of the embodiments.
This application may be used in numerous general-purpose or special-purpose computer system environments or configurations, for example: personal computers, server computers, handheld or portable devices, tablet devices, multiprocessor systems, microprocessor-based systems, set-top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, and distributed computing environments that include any of the above systems or devices. This application may be described in the general context of computer-executable instructions executed by a computer, such as program modules. Generally, program modules include routines, programs, objects, components, data structures, and the like that perform specific tasks or implement specific abstract data types. This application may also be practiced in distributed computing environments in which tasks are performed by remote processing devices connected through a communications network; in a distributed computing environment, program modules may be located in both local and remote computer storage media, including storage devices.
From the description of the above embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by means of software plus the necessary general-purpose hardware platform; they can of course also be implemented by hardware, but in many cases the former is the better implementation. Based on this understanding, the part of the technical solution of the present invention that is essential or that contributes to the prior art can be embodied in the form of a software product; the computer software product is stored in a storage medium as described above (such as ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions that cause a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the various embodiments of the present invention.
The above are only preferred embodiments of the present invention and do not thereby limit the patent scope of the present invention; any equivalent structural or process transformation made using the contents of the description and drawings of the present invention, whether applied directly or indirectly in other related technical fields, is likewise included within the scope of patent protection of the present invention.

Claims (15)

  1. A multi-camera synchronization correction method, comprising:
    obtaining, in a preset signal interaction time set, one signal interaction time group as a current time group, wherein each signal interaction time group comprises a host feedback signal time, a slave sending signal time adjacent to the host feedback signal time, and a slave receiving signal time adjacent to the host feedback signal time;
    calculating a current time difference of the current time group based on the host feedback signal time, the slave sending signal time, and the slave receiving signal time;
    determining a target time difference between the host and the slave according to the current time difference, and correcting an image acquisition time of the slave according to the target time difference.
  2. The multi-camera synchronization correction method according to claim 1, wherein determining the target time difference between the host and the slave according to the current time difference further comprises:
    obtaining, in the signal interaction time set, each signal interaction time group other than the current time group;
    calculating each time difference corresponding to each signal interaction time group, and determining the minimum time difference among these time differences and the current time difference as the target time difference.
  3. The multi-camera synchronization correction method according to claim 1, wherein calculating the current time difference of the current time group based on the host feedback signal time, the slave sending signal time, and the slave receiving signal time comprises:
    calculating the average of the sum of the slave sending signal time and the slave receiving signal time as a slave feedback signal time;
    calculating the difference between the slave feedback signal time and the host feedback signal time, and taking the absolute value of the difference as the current time difference.
  4. The multi-camera synchronization correction method according to claim 3, wherein correcting the image acquisition time of the slave according to the target time difference comprises:
    determining, based on the images collected by the slave, the acquisition time of the current image as an initial image time;
    performing a calculation on the initial image time based on the target time difference, and determining a corrected image time of the target image.
  5. The multi-camera synchronization correction method according to claim 4, wherein performing a calculation on the initial image time based on the target time difference and determining the corrected image time of the target image comprises:
    determining that the difference is positive, and calculating the difference between the initial image time and the target time difference as the corrected image time;
    determining that the difference is negative, and calculating the sum of the initial image time and the target time difference as the corrected image time.
  6. The multi-camera synchronization correction method according to claim 5, wherein after the image time of the slave is corrected according to the target time difference, the method further comprises:
    obtaining the first frame image of the host as a first-frame host image;
    determining, among the frame images corresponding to the slave, the slave image corresponding to the acquisition time of the target image as a first-frame slave image;
    outputting synchronized video based on the first-frame host image and the first-frame slave image.
  7. The multi-camera synchronization correction method according to claim 1, wherein before obtaining one signal interaction time group in the preset signal interaction time set, the method further comprises:
    sending a request signal to the host based on a synchronization signal, and recording the current slave system time as the slave sending signal time;
    when the request signal fed back by the host is received, recording the current slave system time as the slave receiving signal time;
    obtaining, based on the request signal fed back by the host, the current host system time at which the host sent back the request signal as the host feedback signal time;
    generating one signal interaction time group based on the host feedback signal time, the slave sending signal time, and the slave receiving signal time.
  8. A multi-camera synchronization correction device, comprising:
    a signal interaction time group acquisition module, used to obtain, in a preset signal interaction time set, one signal interaction time group as a current time group, wherein each signal interaction time group comprises a host feedback signal time, a slave sending signal time adjacent to the host feedback signal time, and a slave receiving signal time adjacent to the host feedback signal time;
    a current time difference calculation module, used to calculate a current time difference of the current time group based on the host feedback signal time, the slave sending signal time, and the slave receiving signal time;
    an image acquisition time correction module, used to determine a target time difference between the host and the slave according to the current time difference, and to correct an image acquisition time of the slave according to the target time difference.
  9. The multi-camera synchronization correction device according to claim 8, wherein the image acquisition correction module specifically comprises:
    a multiple time group acquisition unit, used to obtain, in the signal interaction time set, each signal interaction time group other than the current time group;
    a target time difference determination unit, used to calculate each time difference corresponding to each signal interaction time group, and to determine the minimum time difference among these time differences and the current time difference as the target time difference.
  10. The multi-camera synchronization correction device according to claim 8, wherein the current time difference calculation module specifically comprises:
    a slave feedback signal time calculation unit, used to calculate the average of the sum of the slave sending signal time and the slave receiving signal time as a slave feedback signal time;
    a current time difference calculation unit, used to calculate the difference between the slave feedback signal time and the host feedback signal time, and to take the absolute value of the difference as the current time difference.
  11. The multi-camera synchronization correction device according to claim 10, wherein the image acquisition correction module further specifically comprises:
    an initial image time determination unit, used to determine, based on the images collected by the slave, the acquisition time of the current image as an initial image time;
    a corrected image time determination unit, used to perform a calculation on the initial image time based on the target time difference, and to determine a corrected image time of the target image.
  12. The multi-camera synchronization correction device according to claim 11, wherein the corrected image time determination unit specifically comprises:
    a difference time calculation subunit, used to determine that the difference is positive and to calculate the difference between the initial image time and the target time difference as the corrected image time;
    a sum time calculation subunit, used to determine that the difference is negative and to calculate the sum of the initial image time and the target time difference as the corrected image time.
  13. The multi-camera synchronization correction device according to claim 12, wherein the multi-camera synchronization correction device comprises a synchronized video output module, which specifically comprises:
    a first-frame host image acquisition unit, used to obtain the first frame image of the host as a first-frame host image;
    a first-frame slave image acquisition unit, used to determine, among the frame images corresponding to the slave, the slave image corresponding to the acquisition time of the target image as a first-frame slave image;
    a synchronized video output unit, used to output synchronized video based on the first-frame host image and the first-frame slave image.
  14. The multi-camera synchronization correction device according to claim 8, wherein the multi-camera synchronization correction device comprises a signal interaction time group generation module, which specifically comprises:
    a slave sending signal time acquisition unit, used to send a request signal to the host based on a synchronization signal, and to record the current slave system time as the slave sending signal time;
    a slave receiving signal time acquisition unit, used to record the current slave system time as the slave receiving signal time when the request signal fed back by the host is received;
    a host feedback signal time acquisition unit, used to obtain, based on the request signal fed back by the host, the current host system time at which the host sent back the request signal as the host feedback signal time;
    a signal interaction time group generation unit, used to generate one signal interaction time group based on the host feedback signal time, the slave sending signal time, and the slave receiving signal time.
  15. A computer-readable storage medium, wherein a multi-camera synchronization correction program is stored on the computer-readable storage medium, and when the multi-camera synchronization correction program is executed by a processor, the steps of the multi-camera synchronization correction method according to claim 1 are implemented.
PCT/CN2022/093867 2022-04-20 2022-05-19 多相机同步校正方法、装置及存储介质 WO2023201822A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210418853.1 2022-04-20
CN202210418853.1A CN114827576A (zh) 2022-04-20 2022-04-20 多相机同步校正方法、装置、设备及存储介质

Publications (1)

Publication Number Publication Date
WO2023201822A1 true WO2023201822A1 (zh) 2023-10-26

Family

ID=82505866

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/093867 WO2023201822A1 (zh) 2022-04-20 2022-05-19 多相机同步校正方法、装置及存储介质

Country Status (2)

Country Link
CN (1) CN114827576A (zh)
WO (1) WO2023201822A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115883748A (zh) * 2022-11-28 2023-03-31 中汽创智科技有限公司 一种数据回放的同步方法、装置、电子设备及存储介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100209070A1 (en) * 2009-02-17 2010-08-19 Sony Corporation Slave device, time synchronization method in slave device, master device, and electronic equipment system
CN101820500A (zh) * 2009-02-27 2010-09-01 索尼公司 从装置、从装置的时刻同步化方法、主装置以及电子设备系统
CN110971818A (zh) * 2019-11-19 2020-04-07 北京奇艺世纪科技有限公司 一种时刻校准方法、装置、辅助从设备及辅助主设备
WO2021049406A1 (ja) * 2019-09-09 2021-03-18 日本電気株式会社 スレーブ装置、時刻同期システム、時刻同期方法、および時刻同期プログラム

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101121153B1 (ko) * 2009-12-08 2012-03-19 (주) 알디텍 영상신호와 센서신호의 동기화 시스템 및 방법

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100209070A1 (en) * 2009-02-17 2010-08-19 Sony Corporation Slave device, time synchronization method in slave device, master device, and electronic equipment system
CN101820500A (zh) * 2009-02-27 2010-09-01 索尼公司 从装置、从装置的时刻同步化方法、主装置以及电子设备系统
WO2021049406A1 (ja) * 2019-09-09 2021-03-18 日本電気株式会社 スレーブ装置、時刻同期システム、時刻同期方法、および時刻同期プログラム
CN110971818A (zh) * 2019-11-19 2020-04-07 北京奇艺世纪科技有限公司 一种时刻校准方法、装置、辅助从设备及辅助主设备

Also Published As

Publication number Publication date
CN114827576A (zh) 2022-07-29

Similar Documents

Publication Publication Date Title
CN107277385B (zh) 一种多相机系统同步曝光的控制方法、装置及终端设备
CN107231533B (zh) 一种同步曝光方法、装置及终端设备
EP3291551B1 (en) Image delay detection method and system
CN104375789B (zh) 拼接屏的同步显示方法及系统
CN107439000B (zh) 一种同步曝光的方法、装置及终端设备
CN112154669B (zh) 基于系统时钟的视频流帧时间戳的相关
JP6527289B2 (ja) 時刻同期方法、センサ収容端末、およびセンサネットワークシステム
KR101821145B1 (ko) 영상 라이브 스트리밍 시스템
CN111343415A (zh) 数据传输方法及装置
WO2023201822A1 (zh) 多相机同步校正方法、装置及存储介质
CN110278047A (zh) 用于时钟同步、设置流媒体帧的pts值的方法、装置及设备
WO2024060763A1 (zh) 一种无线智能可穿戴装置及其图像采集方法
JP2003179662A5 (zh)
CN107455006B (zh) 一种同步曝光的方法、装置及终端设备
US20220345290A1 (en) Communication apparatus, method for controlling communication apparatus, and storage medium
CN114866829A (zh) 同步播放的控制方法及装置
JP2019140643A (ja) 伝送装置
JP2006203817A (ja) マルチカメラシステム
WO2024002194A1 (zh) 一种同步校验方法、装置、电子设备及存储介质
TWI815693B (zh) 資料處理系統、資料處理方法和電腦可讀儲存媒體
CN113810243B (zh) 一种延时测试方法及装置
TWI837861B (zh) 資料處理系統、用於判定座標的方法以及電腦可讀儲存媒體
US11930299B2 (en) Measuring audio and video latencies in virtual desktop environments
CN117336419A (zh) 拍摄方法、装置、电子设备及可读存储介质
CN116156143A (zh) 数据生成方法、摄像设备、头戴式显示设备和可读介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22938054

Country of ref document: EP

Kind code of ref document: A1