CN116156074A - Multi-camera acquisition time synchronization method - Google Patents

Multi-camera acquisition time synchronization method

Info

Publication number
CN116156074A
Authority
CN
China
Prior art keywords
frame
image
images
camera
trigger
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211456144.9A
Other languages
Chinese (zh)
Other versions
CN116156074B (en)
Inventor
王远帆
朱韬
章健勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huixi Intelligent Technology Shanghai Co ltd
Original Assignee
Huixi Intelligent Technology Shanghai Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huixi Intelligent Technology Shanghai Co ltd
Priority to CN202211456144.9A
Publication of CN116156074A
Application granted
Publication of CN116156074B
Legal status: Active
Anticipated expiration

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/04Synchronising

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The invention belongs to the technical field of image processing and discloses a multi-camera acquisition time synchronization method, which comprises: sending a periodic trigger signal; each camera receiving the trigger signal and starting exposure and image output, while the processor receives the same trigger signal, reads the current system time, marks it as a trigger timestamp, and stores it in sequence in a trigger timestamp queue; the processor receiving the images uploaded frame by frame by each camera, synchronizing the frames, then calculating the trigger time corresponding to each frame of image and taking the closest timestamp in the trigger timestamp queue as the actual trigger timestamp; and outputting the frames with the same serial number from each camera, together with the corresponding actual trigger timestamps, for subsequent image processing.

Description

Multi-camera acquisition time synchronization method
Technical Field
The invention relates to the technical field of image processing, in particular to a multi-camera acquisition time synchronization method.
Background
Traditional implementations cannot accurately control frame synchronization among multiple cameras, and the frame rate cannot meet the requirement of 30 frames per second. The time at which the CPU (central processing unit) receives an image is taken as the generation time of the image, but this deviates significantly from the time at which exposure of the image actually started. As a result, the images cannot be accurately aligned and matched with the times at which Lidar point clouds are generated and cannot meet the requirements of image processing programs such as BEV multi-camera perception algorithms.
Disclosure of Invention
The invention provides a multi-camera acquisition time synchronization method that accurately calculates the exposure start time of each image through an algorithm and uses that time as the timestamp of the captured image. The method achieves frame synchronization across multiple cameras, ensures that captured images carry accurate timestamp information, provides the image information required by image processing programs such as BEV multi-camera perception algorithms, and solves the prior-art problem that the actual exposure time of an image is inconsistent with the time at which the image enters the program.
The invention can be realized by the following technical scheme:
a multi-camera acquisition time synchronization method comprises the following steps of
Sending a periodic trigger signal;
Each camera receives the trigger signal and starts exposure and image output; at the same time, the processor receives the trigger signal, reads the current system time, marks it as a trigger timestamp, and stores it in sequence in a trigger timestamp queue;
the processor receives the images uploaded frame by frame by each camera and synchronizes the frames, then calculates the trigger time corresponding to each frame of image and takes the closest timestamp in the trigger timestamp queue as the actual trigger timestamp (see the sketch following these steps);
and the frames with the same serial number from each camera are output together with the corresponding actual trigger timestamps for subsequent image processing.
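As a concrete illustration of the closest-timestamp lookup in the steps above, here is a minimal Python sketch, assuming monotonic nanosecond timestamps and a non-empty queue; all names are illustrative, not taken from the patent:

```python
import bisect
from collections import deque

# Trigger timestamps are appended in arrival order, so the queue stays
# sorted and supports binary search.
trigger_queue = deque()

def on_trigger_signal(system_time_ns):
    """Store the system time read when a trigger signal arrives."""
    trigger_queue.append(system_time_ns)

def find_actual_trigger_timestamp(estimated_trigger_ns):
    """Return the stored trigger timestamp closest to the estimate."""
    timestamps = list(trigger_queue)
    i = bisect.bisect_left(timestamps, estimated_trigger_ns)
    candidates = timestamps[max(0, i - 1):i + 1]
    return min(candidates, key=lambda t: abs(t - estimated_trigger_ns))
```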
Further, the frame serial number of each image uploaded by a camera is recorded as its Frame ID, and a Golden Frame ID is set and initialized;
the received images are stored in sequence in the image queue corresponding to each camera;
images are read frame by frame from each image queue, and the Frame ID of each frame is compared with the Golden Frame ID; if they are all the same, the cameras are frame-synchronized at this moment, and Golden Frame ID = Golden Frame ID + 1;
if the frame serial numbers differ, the maximum Frame ID is taken as the updated Golden Frame ID, the images corresponding to the other Frame IDs are discarded, and the next frame is read from each corresponding image queue until an image whose Frame ID equals the updated Golden Frame ID is found, whereupon the cameras are frame-synchronized and Golden Frame ID = Golden Frame ID + 1;
these steps are repeated until frame synchronization of all images in each image queue is completed, as sketched below.
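A minimal Python sketch of this Golden Frame ID alignment, assuming each camera delivers (frame_id, image) tuples through a FIFO queue; the function and variable names are illustrative assumptions:

```python
from queue import Queue

def synchronize_frames(image_queues, golden_frame_id):
    """Read one frame per camera and align them on the Golden Frame ID.

    image_queues: one Queue per camera, holding (frame_id, image) tuples
    in upload order. Returns the synchronized frames and the next
    Golden Frame ID.
    """
    frames = [q.get() for q in image_queues]        # current frame of each camera
    if any(fid != golden_frame_id for fid, _ in frames):
        # Frame IDs differ: take the maximum as the updated Golden Frame ID
        golden_frame_id = max(fid for fid, _ in frames)
        for cam, (fid, img) in enumerate(frames):
            while fid < golden_frame_id:            # discard stale frames
                fid, img = image_queues[cam].get()  # read the next frame
            frames[cam] = (fid, img)
    return frames, golden_frame_id + 1              # advance for the next round

# Usage sketch: two cameras, the first lagging by one frame
q0, q1 = Queue(), Queue()
q0.put((3, "cam0-frame3")); q0.put((4, "cam0-frame4"))
q1.put((4, "cam1-frame4"))
frames, next_golden = synchronize_frames([q0, q1], 3)  # both frame 4, golden 5
```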
Further, the method comprises the following steps:
step one, initializing
Each camera receives the periodic trigger signal, starts to capture images, and transmits them to the processor frame by frame;
the processor receives the periodic trigger signal, reads the current system time as a trigger timestamp, stores it in sequence in the corresponding trigger timestamp queue, and stores the received images in sequence in the image queue corresponding to each camera;
step two, frame synchronization
The current frame image is read from each image queue and its Frame ID is compared with the Golden Frame ID; if they are all the same, the cameras are frame-synchronized at this moment, and Golden Frame ID = Golden Frame ID + 1;
if the frame serial numbers differ, the maximum Frame ID is taken as the updated Golden Frame ID, the images corresponding to the other Frame IDs are discarded, and the next frame is read from each corresponding image queue until an image whose Frame ID equals the updated Golden Frame ID is found, whereupon the cameras are frame-synchronized and Golden Frame ID = Golden Frame ID + 1;
step three, calculating the actual trigger time
The actual trigger timestamps corresponding to the current frame images are calculated from the time at which each camera received the trigger signal and the processing time of the images on their way to the processor;
step four, synchronous transmission
The current frame images from each camera and the corresponding actual trigger timestamps are sent to an image processing module;
Step five: steps two to four are executed repeatedly to complete frame alignment and synchronous transmission of each frame of image in the image queues.
Further, the time taken by each camera from the start of exposure to the start of image output is recorded as T1; the moment at which the processor receives the first pixel of each camera's image is T2; the moment at which the processor is notified that ISP image processing is complete is T3; and the moment at which the image is stored in its image queue is T4;
the actual trigger timestamp = T4 - (T3 - T2) - T1.
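Expressed as a small helper with a worked example; the numeric values below are assumptions for illustration only:

```python
def actual_trigger_timestamp(t1, t2, t3, t4):
    """Implements: actual trigger timestamp = T4 - (T3 - T2) - T1.

    t1: camera latency from exposure start to the start of image output
    t2: moment the processor receives the first pixel of the image
    t3: moment ISP image processing completes
    t4: moment the image is stored in its image queue
    All values must share one clock and one unit (e.g. milliseconds).
    """
    return t4 - (t3 - t2) - t1

# Worked example with assumed values (milliseconds, illustrative only):
# T1 = 5, T2 = 110, T3 = 125, T4 = 126  =>  126 - (125 - 110) - 5 = 106
assert actual_trigger_timestamp(5, 110, 125, 126) == 106
```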
Further, the initial value of the Golden Frame ID is consistent with the cameras' initial Frame ID, and its increment step is consistent with that of the Frame ID.
A computer readable storage medium storing a computer program for execution by a processor to implement the method of any one of claims 1-5.
The beneficial technical effects of the invention are as follows:
the method comprises the steps of enabling each camera to simultaneously receive a trigger signal, starting exposure and shooting images at the same frequency, recording the time of the rising edge of each trigger signal as the starting time of image exposure, reading the system time of the time as the trigger time stamp of the images, storing the time into a trigger time stamp queue, calculating at the rate of 30 frames per second, storing 30 trigger time stamp information in the queue per second, and because the sequence of processing each frame of images by an ISP and the sequence of scheduling by a CPU (Central processing Unit), triggering each camera at the same time, the time of last receiving the images by an application program has a sequence relation.
Drawings
FIG. 1 is a schematic general flow diagram of the present invention;
FIG. 2 is a schematic diagram illustrating the routing of the trigger signal according to the present invention;
FIG. 3 is a schematic diagram of the processing flow of each camera's image data through the SoC (system on chip) according to the present invention;
FIG. 4 is a schematic diagram of a software state machine according to the present invention.
Detailed Description
The following detailed description of the invention refers to the accompanying drawings and preferred embodiments.
Referring to FIGS. 1-4, the invention provides a multi-camera acquisition time synchronization method, which comprises: sending a periodic trigger signal; each camera receiving the trigger signal and starting exposure and image output, while the processor receives the same trigger signal, reads the current system time, marks it as a trigger timestamp, and stores it in sequence in a trigger timestamp queue; the processor receiving the images uploaded frame by frame by each camera, synchronizing the frames, then calculating the trigger time corresponding to each frame of image and taking the closest timestamp in the trigger timestamp queue as the actual trigger timestamp; and outputting the frames with the same serial number from each camera, together with the corresponding actual trigger timestamps, for subsequent image processing.
The specific steps are as follows, as shown in FIG. 4:
step one, initializing
The frame serial number of each image uploaded by a camera is recorded as its Frame ID, and a Golden Frame ID is set and initialized; the initial value of the Golden Frame ID is consistent with the cameras' initial Frame ID, and its increment step is consistent with that of the Frame ID, which can be set according to the frequency of the trigger signal;
a periodic trigger signal is transmitted; the frequency and pulse width of the trigger signal are configurable;
each camera receives the periodic trigger signal, starts to shoot pictures and transmits the pictures to the processor frame by frame;
the processor receives the periodic trigger signal, reads the current system time as a trigger timestamp, and stores it in sequence in the corresponding trigger timestamp queue, as shown in FIG. 2; the received images are stored in sequence in the image queue corresponding to each camera, as shown in FIG. 3.
In the case of an SoC (system on chip), the processor receives RAW image data from each camera; an ISP image processing module in the processor processes each camera's RAW image and stores the processed image in that camera's image queue. Because the ISP module needs time to process each image, and only afterwards raises an interrupt to notify the SoC that processing is complete so the image can be stored in the queue, the time at which the image reaches the application is not the actual trigger time of the image, i.e. the actual exposure time. This makes subsequent alignment work in the application difficult, for example accurate image stitching is hard to achieve. The subsequent steps of frame synchronization and actual trigger timestamp calculation are therefore required to guarantee that the application processes the images synchronously.
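To make this concrete, the sketch below shows where the four instants used in step three could be captured along this receive path; the class and hook-method names are hypothetical and do not belong to any real SoC SDK:

```python
import time

class CameraReceivePath:
    """Captures the per-frame instants used by the calculation in step three.

    The hook methods are hypothetical: a real SoC SDK would invoke its own
    callbacks at the corresponding points of the RAW -> ISP -> queue path.
    """
    def __init__(self, t1_ns):
        self.t1_ns = t1_ns            # camera constant: exposure start -> output start
        self.t2_ns = self.t3_ns = None

    def on_first_pixel(self):         # receiver sees the first RAW pixel
        self.t2_ns = time.monotonic_ns()

    def on_isp_done(self):            # ISP interrupt: processed image is ready
        self.t3_ns = time.monotonic_ns()

    def on_enqueue(self, image, image_queue):
        t4_ns = time.monotonic_ns()   # image is stored into the camera's queue
        image_queue.append((image, self.t2_ns, self.t3_ns, t4_ns))
```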
Step two, frame synchronization
The current frame image is read from each image queue and its Frame ID is compared with the Golden Frame ID; if they are all the same, the cameras are frame-synchronized at this moment, and Golden Frame ID = Golden Frame ID + 1;
if the frame serial numbers differ, the maximum Frame ID is taken as the updated Golden Frame ID, the images corresponding to the other Frame IDs are discarded, and the next frame is read from each corresponding image queue until an image whose Frame ID equals the updated Golden Frame ID is found, whereupon the cameras are frame-synchronized and Golden Frame ID = Golden Frame ID + 1;
step three, calculating the actual trigger time
The actual trigger timestamps corresponding to the current frame images are calculated from the time at which each camera received the trigger signal and the processing time of the images on their way to the processor;
the time taken by each camera from the start of exposure to the start of image output is recorded as T1; the moment at which the processor receives the first pixel of each camera's image is T2; the moment at which the processor is notified that ISP image processing is complete is T3; and the moment at which the image is stored in its image queue is T4;
the actual trigger timestamp = T4 - (T3 - T2) - T1.
Step four, synchronous transmission
The current frame images from each camera and the corresponding actual trigger timestamps are sent to an image processing module. If frames are lost in this process and synchronous transmission fails, execution returns to steps two and three: frame synchronization and actual trigger timestamp calculation are performed again, frames with the same serial number from each camera are sought, and those images are sent together with their actual trigger timestamps to the subsequent image processing module;
Step five: steps two to four are executed repeatedly to complete frame alignment and synchronous transmission of each frame of image in the image queues, as sketched below.
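Pulling steps two to four together, a minimal main-loop sketch including the retry on frame loss described in step four; every name is an illustrative assumption, and send_to_image_processing is assumed to return False when frames are lost:

```python
def run_sync_loop(image_queues, t1_ns, send_to_image_processing):
    """image_queues: per-camera FIFOs of (frame_id, image, t2, t3, t4) records."""
    golden_frame_id = 0          # must match the cameras' initial Frame ID and step
    while True:
        # Step two: read one record per camera and align on the Golden Frame ID
        records = [q.get() for q in image_queues]
        ids = [rec[0] for rec in records]
        if any(fid != golden_frame_id for fid in ids):
            golden_frame_id = max(ids)
            for cam, rec in enumerate(records):
                while rec[0] < golden_frame_id:      # discard stale frames
                    rec = image_queues[cam].get()
                records[cam] = rec
        # Step three: actual trigger timestamp for each camera's current frame
        stamped = [(image, t4 - (t3 - t2) - t1_ns)
                   for _, image, t2, t3, t4 in records]
        # Step four: synchronous transmission; on frame loss the send fails and
        # the loop re-synchronizes from the queues (steps two and three again)
        if send_to_image_processing(stamped):
            golden_frame_id += 1
```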
To verify the feasibility of the multi-camera time alignment method of the invention, we applied it to BEV perception model training and to BEV perception inference with video recording, as follows:
example one, BEV perception model training
1. Collect laser radar data; the point cloud data carries timestamps;
2. collect the image data captured by multiple cameras and process it synchronously with the above time alignment method to obtain image data with actual trigger timestamps;
3. determine the correspondence between the timestamps of the point cloud data and the actual trigger timestamps of the image data, and annotate targets on the images using the radar data as ground truth;
4. train the BEV model with the annotated data;
5. the trained model can then perform inference and target detection on a real vehicle.
second embodiment, BEV aware reasoning and video recording
1. Process the image data of multiple cameras synchronously with the above time alignment method to obtain image data with actual trigger timestamps;
2. provide the data to the BEV perception module for inference and target detection;
3. provide the data to a recording module, which records the video of the multiple cameras, encodes it as H.265, and attaches the trigger timestamps to each video channel;
4. for playback, the multi-camera videos are H.265-decoded into YUV images; because the actual trigger timestamps across cameras are identical, the frame data of different cameras can be aligned during playback (see the sketch after this list);
5. the replayed multi-camera data is provided to the perception model so that, after retraining for abnormal conditions such as a target not being detected by the vehicle, the model can be tested further.
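As an illustration of the playback alignment in step 4 above, a sketch that groups decoded frames by their shared actual trigger timestamp; the function name and data layout are assumptions:

```python
from collections import defaultdict

def align_playback(streams):
    """streams: per-camera lists of (trigger_timestamp, yuv_frame) pairs,
    as obtained by H.265-decoding the recorded, timestamped videos.
    Returns trigger_timestamp -> [one frame per camera], keeping only
    instants for which every camera contributed a frame.
    """
    groups = defaultdict(list)
    for stream in streams:
        for ts, frame in stream:
            groups[ts].append(frame)
    return {ts: frames
            for ts, frames in sorted(groups.items())
            if len(frames) == len(streams)}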
According to an embodiment of the present invention, there is also provided a synchronization terminal comprising: a memory, a processor, and an application program stored on the memory and executable on the processor, wherein the application program, when executed by the processor, implements the steps of the method according to any one of the above embodiments.
According to one embodiment of the present invention, there is also provided a computer-readable storage medium.
The computer readable storage medium has stored thereon a time synchronization program which, when executed by a processor, implements the steps of the time synchronization method according to any one of the embodiments.
In this specification, each embodiment is described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for identical or similar parts the embodiments may be referred to one another.
It will be apparent to those skilled in the art that embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the invention may take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal device to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal device, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While particular embodiments of the present invention have been described above, it will be appreciated by those skilled in the art that these are merely illustrative, and that many changes and modifications may be made to these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims.

Claims (6)

1. A multi-camera acquisition time synchronization method, characterized by comprising:
Sending a periodic trigger signal;
each camera receives the trigger signal and starts exposure and image output; at the same time, a processor receives the trigger signal, reads the current system time, marks it as a trigger timestamp, and stores it in sequence in a trigger timestamp queue;
the processor receives the images uploaded frame by frame by each camera and synchronizes the frames, then calculates the trigger time corresponding to each frame of image and takes the closest timestamp in the trigger timestamp queue as the actual trigger timestamp;
and the frames with the same serial number from each camera are output together with the corresponding actual trigger timestamps for subsequent image processing.
2. The multi-camera acquisition time synchronization method of claim 1, wherein: the frame serial number of each image uploaded by a camera is recorded as its Frame ID, and a Golden Frame ID is set and initialized;
the received images are stored in sequence in the image queue corresponding to each camera;
images are read frame by frame from each image queue, and the Frame ID of each frame is compared with the Golden Frame ID; if they are all the same, the cameras are frame-synchronized at this moment, and Golden Frame ID = Golden Frame ID + 1;
if the frame serial numbers differ, the maximum Frame ID is taken as the updated Golden Frame ID, the images corresponding to the other Frame IDs are discarded, and the next frame is read from each corresponding image queue until an image whose Frame ID equals the updated Golden Frame ID is found, whereupon the cameras are frame-synchronized and Golden Frame ID = Golden Frame ID + 1;
these steps are repeated until frame synchronization of all images in each image queue is completed.
3. The multi-camera acquisition time synchronization method of claim 2, comprising the steps of:
step one, initializing
Each camera receives the periodic trigger signal, starts to capture images, and transmits them to the processor frame by frame;
the processor receives the periodic trigger signal, reads the current system time as a trigger timestamp, stores it in sequence in the corresponding trigger timestamp queue, and stores the received images in sequence in the image queue corresponding to each camera;
step two, frame synchronization
The current frame image is read from each image queue and its Frame ID is compared with the Golden Frame ID; if they are all the same, the cameras are frame-synchronized at this moment, and Golden Frame ID = Golden Frame ID + 1;
if the frame serial numbers differ, the maximum Frame ID is taken as the updated Golden Frame ID, the images corresponding to the other Frame IDs are discarded, and the next frame is read from each corresponding image queue until an image whose Frame ID equals the updated Golden Frame ID is found, whereupon the cameras are frame-synchronized and Golden Frame ID = Golden Frame ID + 1;
step three, calculating the actual trigger time
The actual trigger timestamps corresponding to the current frame images are calculated from the time at which each camera received the trigger signal and the processing time of the images on their way to the processor;
step four, synchronous transmission
The current frame images from each camera and the corresponding actual trigger timestamps are sent to an image processing module;
Step five: steps two to four are executed repeatedly to complete frame alignment and synchronous transmission of each frame of image in the image queues.
4. The multi-camera acquisition time synchronization method according to claim 3, wherein: the time taken by each camera from the start of exposure to the start of image output is recorded as T1; the moment at which the processor receives the first pixel of each camera's image is T2; the moment at which the processor is notified that ISP image processing is complete is T3; and the moment at which the image is stored in its image queue is T4;
the actual trigger timestamp = T4 - (T3 - T2) - T1.
5. The multi-camera acquisition time synchronization method of claim 2, wherein: the initial value of the Golden Frame ID is consistent with the cameras' initial Frame ID, and its increment step is consistent with that of the Frame ID.
6. A computer-readable storage medium storing a computer program, characterized in that the computer program is executed by a processor to implement the method of any one of claims 1-5.
CN202211456144.9A 2022-11-21 2022-11-21 Multi-camera acquisition time synchronization method Active CN116156074B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211456144.9A CN116156074B (en) 2022-11-21 2022-11-21 Multi-camera acquisition time synchronization method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211456144.9A CN116156074B (en) 2022-11-21 2022-11-21 Multi-camera acquisition time synchronization method

Publications (2)

Publication Number Publication Date
CN116156074A true CN116156074A (en) 2023-05-23
CN116156074B CN116156074B (en) 2024-03-15

Family

ID=86357220

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211456144.9A Active CN116156074B (en) 2022-11-21 2022-11-21 Multi-camera acquisition time synchronization method

Country Status (1)

Country Link
CN (1) CN116156074B (en)


Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103702013A (en) * 2013-11-28 2014-04-02 北京航空航天大学 Frame synchronization method for multiple channels of real-time videos
US20170134619A1 (en) * 2015-11-06 2017-05-11 Intel Corporation Synchronized capture of image and non-image sensor data
CN107439000A (en) * 2017-06-12 2017-12-05 深圳市瑞立视多媒体科技有限公司 A kind of method, apparatus synchronously exposed and terminal device
CN108924461A (en) * 2018-06-20 2018-11-30 斑马网络技术有限公司 Method of video image processing and device
CN110855851A (en) * 2019-11-25 2020-02-28 广州市奥威亚电子科技有限公司 Video synchronization device and method
CN111585684A (en) * 2020-05-14 2020-08-25 武汉大学 Multi-path camera time alignment method and system for networked monitoring video analysis
CN112751983A (en) * 2021-04-02 2021-05-04 湖北亿咖通科技有限公司 Image time synchronization method and device, electronic equipment and storage medium
CN114025055A (en) * 2021-11-29 2022-02-08 上海商汤临港智能科技有限公司 Data processing method, device, system, equipment and storage medium
CN114268706A (en) * 2021-12-13 2022-04-01 凌云光技术股份有限公司 Time service method and device of camera

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116915978A (en) * 2023-08-07 2023-10-20 昆易电子科技(上海)有限公司 Trigger time determining method, data acquisition system, vehicle and industrial personal computer
CN117041528A (en) * 2023-08-07 2023-11-10 昆易电子科技(上海)有限公司 Time difference measuring method and system and waveform processing module
CN116915978B (en) * 2023-08-07 2024-07-16 昆易电子科技(上海)有限公司 Trigger time determining method, data acquisition system, vehicle and industrial personal computer
CN117998026A (en) * 2024-04-03 2024-05-07 深圳市卓驭科技有限公司 Multi-sensor frame synchronization determination method, storage medium, and computer program product
CN117998026B (en) * 2024-04-03 2024-06-28 深圳市卓驭科技有限公司 Multi-sensor frame synchronization determination method, storage medium, and computer program product

Also Published As

Publication number Publication date
CN116156074B (en) 2024-03-15

Similar Documents

Publication Publication Date Title
CN116156074B (en) Multi-camera acquisition time synchronization method
CN107231533B (en) synchronous exposure method and device and terminal equipment
US11514606B2 (en) Information processing apparatus, information processing method, and non-transitory computer-readable storage medium
CN107613159B (en) Image time calibration method and system
CN107277385B (en) Multi-camera system synchronous exposure control method and device and terminal equipment
CN112584234B (en) Frame supplementing method and related device for video image
CN107948463B (en) Camera synchronization method, device and system
US8554017B2 (en) Imaging apparatus, data processing method, and program
CN103460248B (en) Image processing method and device
CN107439000B (en) Synchronous exposure method and device and terminal equipment
CN104202534A (en) Multi-camera synchronous control device based on GPS and pulse generator and method
CN110174120B (en) Time synchronization method and device for AR navigation simulation
CN113242431B (en) Marking data preprocessing method for road side perception
JP4186520B2 (en) Multi-view image recording apparatus, multi-view image frame synchronization processing method, and computer program
CN104837002A (en) Shooting device, three-dimensional measuring system, and video intra-frame interpolation method and apparatus
CN114071132B (en) Information delay detection method, device, equipment and readable storage medium
CN111193873A (en) Image rapid dimming system and method
CN113938631B (en) Reference monitor, image frame interception method and system
EP2940989A1 (en) Method and apparatus for generating composite image in electronic device
US9549112B2 (en) Image capturing apparatus, and control method therefor
CN113315926B (en) Image output control method and device, storage medium and electronic device
EP3882853A1 (en) Image processing method and apparatus
CN116455365B (en) Capturing circuit, micro-processing chip and device
CN118042103A (en) Audio and video synchronism detection method, detection device and control equipment
TW202343992A (en) Method and electronic device of correcting frame for optical camera communication

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant