CN112787740A - Multi-sensor time synchronization device and method - Google Patents

Multi-sensor time synchronization device and method

Info

Publication number
CN112787740A
CN112787740A (application CN202011570288.8A)
Authority
CN
China
Prior art keywords
camera
point cloud
data
time
lidar
Prior art date
Legal status
Withdrawn
Application number
CN202011570288.8A
Other languages
Chinese (zh)
Inventor
朱敦尧
周风明
郝江波
王志威
Current Assignee
Wuhan Kotei Informatics Co Ltd
Original Assignee
Wuhan Kotei Informatics Co Ltd
Priority date
Filing date
Publication date
Application filed by Wuhan Kotei Informatics Co Ltd filed Critical Wuhan Kotei Informatics Co Ltd
Priority to CN202011570288.8A priority Critical patent/CN112787740A/en
Publication of CN112787740A publication Critical patent/CN112787740A/en
Withdrawn legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04JMULTIPLEX COMMUNICATION
    • H04J3/00Time-division multiplex systems
    • H04J3/02Details
    • H04J3/06Synchronising arrangements
    • H04J3/0635Clock or time synchronisation in a network
    • H04J3/0638Clock or time synchronisation among nodes; Internode synchronisation
    • H04J3/0644External master-clock
    • H04J3/0658Clock or time synchronisation among packet nodes
    • H04J3/0661Clock or time synchronisation among packet nodes using timestamps

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention provides a multi-sensor time synchronization device and method. The device comprises a collected data processing module, a frequency divider, a camera, a lidar, and a data storage module. The pulse-per-second (PPS) signal output by a GPS module serves as the time reference for sensor synchronization and triggers camera and lidar data acquisition, while the UTC time that the GPS module obtains from satellites serves as the system clock source for adding timestamps to the image data and point cloud data. Multi-sensor time synchronization is thereby achieved with an accurate and reliable result, avoiding both the sensors' dependence on PTP protocol support and the influence of network instability.

Description

Multi-sensor time synchronization device and method
Technical Field
The invention relates to the field of automatic driving data acquisition, in particular to a multi-sensor time synchronization device and a multi-sensor time synchronization method.
Background
The core sensors of current autonomous driving perception systems are mainly the camera and the lidar. For obstacle detection and recognition the two are functionally redundant and compensate for each other's performance deficiencies. A key factor in the functional reliability of a perception system is its sensor fusion technology, and sensor fusion requires time synchronization among the sensors.
Current time synchronization methods for data acquisition systems include: directly recording the system time to stamp the data frames stored by each sensor, achieving only coarse synchronization; or using the PTP protocol to synchronize the clocks of multiple sensor nodes. In the former method, each sensor incurs a time deviation during data acquisition, transmission, and storage; the deviation is not fixed, and the interval between data frames varies, so the synchronization error is large. The latter method can theoretically reach millisecond-level accuracy, but the result is affected by factors that are usually unpredictable and uncontrollable, such as network instability, packet buffering in the network card, and operating-system process scheduling, making the clock synchronization unreliable; it also requires the sensors to support the Precision Time Protocol (PTP), which most cameras lack.
Disclosure of Invention
In view of this, embodiments of the present invention provide a multi-sensor time synchronization apparatus and method, so as to solve the problem that a time synchronization result of data acquired by a sensor is unreliable.
In a first aspect of the embodiments of the present invention, a multi-sensor time synchronization apparatus is provided, which at least includes an acquired data processing module, a frequency divider, a camera, a lidar and a data storage module, wherein the acquired data processing module, the frequency divider and the lidar are all connected to a GPS module, and at least one camera is connected to the frequency divider;
the frequency divider is used for receiving the PPS signals sent by the GPS module, dividing the frequency of the PPS signals and sending the PPS signals to the connected cameras;
the camera is used for triggering image data acquisition according to the PPS signal sent by the frequency divider;
the laser radar is used for triggering point cloud data acquisition according to the PPS signal sent by the GPS module;
the collected data processing module is used for receiving the GPGGA message sent by the GPS module, extracting UTC time from the GPGGA message as a system time source, receiving collected image data and point cloud data, and adding timestamps to the image data and the point cloud data according to the system time source;
and the data storage module is used for storing the image data added with the time stamp and the point cloud data.
In a second aspect of the embodiments of the present invention, there is provided a multi-sensor time synchronization method, including:
the GPS module sends a PPS signal to the frequency divider and the laser radar, the frequency divider divides the frequency of the PPS signal and sends the PPS signal to a camera connected with the frequency divider, the camera is triggered to collect images, and the laser radar collects point cloud data after receiving the PPS signal;
the acquisition data processing module receives a GPGGA message sent by a GPS module, and extracts UTC time from the GPGGA message as a system time source;
and the acquired data processing module receives the acquired image data and point cloud data and adds timestamps to the image data and the point cloud data according to the system time source.
In a third aspect of the embodiments of the present invention, there is provided an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor executes the computer program to implement the steps of the method according to the second aspect of the embodiments of the present invention.
In a fourth aspect of the embodiments of the present invention, a computer-readable storage medium is provided, in which a computer program is stored, and the computer program, when executed by a processor, implements the steps of the method provided in the second aspect of the embodiments of the present invention.
In the embodiment of the invention, the PPS signal output by the GPS module is used as a time synchronization reference to trigger the camera and the laser radar to acquire data, and the UTC time obtained from the satellite by the GPS module is used as a system time source to add timestamps to the image data and the point cloud data, so that the time synchronization of the data acquired by the sensor is realized, and the accuracy and reliability of the time synchronization result can be ensured. The dependence of the sensors on the function of the PTP is avoided, the limitation of sensor type selection in data acquisition is reduced, and meanwhile, the number of the sensors is easy to expand.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings described below are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a schematic structural diagram of a multi-sensor time synchronization apparatus according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a camera and a lidar horizontal acquisition angle provided by an embodiment of the invention;
fig. 3 is a flowchart illustrating a multi-sensor time synchronization method according to an embodiment of the present invention.
Detailed Description
In order to make the objects, features and advantages of the present invention more obvious and understandable, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the embodiments described below are only a part of the embodiments of the present invention, and not all of the embodiments. Based on the embodiments of the present invention, all other embodiments obtained by persons skilled in the art without any inventive work shall fall within the protection scope of the present invention, and the principle and features of the present invention shall be described below with reference to the accompanying drawings.
The terms "comprises" and "comprising," when used in this specification and claims, and in the accompanying drawings and figures, are intended to cover non-exclusive inclusions, such that a process, method, system, or apparatus that comprises a list of steps or elements is not limited to the listed steps or elements. In addition, "first" and "second" are used to distinguish different objects, not to describe a specific order.
Referring to fig. 1, fig. 1 is a schematic structural diagram of a multi-sensor time synchronizer according to an embodiment of the present invention, which at least includes an acquired data processing module 110, a frequency divider 120, a camera 130, a laser radar 140, and a data storage module 150, where the acquired data processing module 110, the frequency divider 120, and the laser radar 140 are all connected to a GPS module, and at least one camera 130 is connected to the frequency divider 120;
the frequency divider 120 is configured to receive the pulse-per-second (PPS) signal sent by the GPS module, divide its frequency, and send the divided signal to the connected camera 130;
the camera 130 is used for triggering image data acquisition according to the pulse per second PPS signal sent by the frequency divider 120;
the laser radar 140 is used for triggering point cloud data acquisition according to the PPS signal sent by the GPS module;
the collected data processing module 110 is configured to receive the GPGGA message sent by the GPS module, extract the UTC (Coordinated Universal Time) time from it as the system time source, receive the collected image data and point cloud data, and add timestamps to both according to the system time source;
the data storage module 150 is configured to store the image data and the point cloud data to which the time stamp is added.
Specifically, an interface RS232 of the GPS module for sending the GPGGA message is connected with the collected data processing module, the GPS module is respectively connected to an external trigger input interface of the laser radar and an input end of a frequency divider, and the frequency divider sends a pulse signal after frequency division to an external trigger input interface of a connected camera; and the data interfaces of the camera and the laser radar are connected with the collected data processing module.
The frequency divider, upon receiving the PPS signal, immediately and synchronously outputs pulse signals to the cameras to trigger exposure. Each camera's GPIO interface is configured to external-trigger exposure mode, and the external trigger pulses of all cameras come from the same pulse source, guaranteeing that the trigger signals are received synchronously; each camera acquires an image when it detects the trigger pulse.
Illustratively, each PPS signal passes through a frequency divider, triggering the camera to output 20 frames of image data in 1 second.
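As a rough illustration (not part of the patent text; the function name and arguments are assumptions), the divider's behavior can be sketched as expanding each 1 Hz PPS edge into 20 equally spaced camera trigger instants within that second:

```python
# Hypothetical sketch: one PPS edge in, a 20 Hz train of trigger instants out.
def camera_triggers(pps_edge_s: float, rate_hz: int = 20) -> list[float]:
    """Return the camera trigger instants generated from a single PPS edge."""
    step = 1.0 / rate_hz  # spacing between successive trigger pulses
    return [pps_edge_s + i * step for i in range(rate_hz)]
```

For example, a PPS edge at t = 100.0 s yields triggers at 100.00, 100.05, 100.10, ... within that second.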
The lidar's time source is configured to PPS mode through the upper computer, and its scanning frequency is set; if the scanning frequency is set to 10 Hz, then upon detecting a PPS signal the lidar outputs 10 frames of point cloud data in the following second.
It can be understood that the lidar's working trigger is the GPS PPS signal and the camera's working trigger is the pulse signal obtained by dividing that PPS signal, so both triggers share the moment of the PPS edge as their time reference and the lidar and camera begin acquisition at the same instant. Because the camera's acquisition frequency is an integer multiple of the lidar's scanning frequency, every frame of point cloud data has image data aligned with it, realizing time synchronization of the camera and the lidar.
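The integer-multiple relationship above can be made concrete with a small sketch (the 20 Hz / 10 Hz rates are the example values used in this description; the function name is an assumption): when the camera rate is an integer multiple of the lidar rate, lidar frame k lines up with image frame k times that multiple.

```python
# Illustrative index pairing under the assumed 20 Hz camera / 10 Hz lidar rates.
def aligned_image_index(lidar_frame: int, cam_hz: int = 20, lidar_hz: int = 10) -> int:
    """Index of the image frame triggered at the same instant as a lidar frame."""
    assert cam_hz % lidar_hz == 0, "camera rate must be an integer multiple of lidar rate"
    return lidar_frame * (cam_hz // lidar_hz)
```

With these rates, every second image frame coincides with a lidar frame, so each point cloud has a co-triggered image.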
The collected data processing module receives the GPGGA message sent by the GPS and extracts the UTC time from it as the system time source. It also receives the camera's image data and the lidar's point cloud data, and stamps each image and point cloud data packet with a timestamp derived from the extracted UTC time.
The dashed box in Fig. 1 represents the data acquisition system that requires time synchronization; the GPS module provides it with the PPS signal and UTC time accurate to the millisecond.
The exposure mode of the camera can be configured to external trigger via the GPIO interface: one pulse signal triggers one exposure, acquiring one frame of image data. Illustratively, when the frequency divider receives the PPS signal it immediately and synchronously outputs a 20 Hz pulse train, triggering the camera to acquire 20 consecutive frames at 20 Hz; since the external trigger source of all cameras is the same 20 Hz pulse output of the frequency divider, time synchronization of multiple cameras is realized.
The scanning frequency of the lidar is set to 10 Hz through the upper computer; since one horizontal scan revolution then takes 100 ms, the actual time difference between the first and last points of each frame of point cloud data is up to 100 ms. To reduce the effect of this difference on the synchronization between point cloud and image data, the starting scan angle of the lidar is set within the camera's field of view, so that the point cloud synchronized with the image is scanned within a shorter time. As shown in Fig. 2, assuming the camera and lidar are installed with the same horizontal heading and the camera's horizontal field of view is α degrees, the lidar's time source is configured to PPS mode by the upper computer and its PPS-synchronized angle parameter is set to −0.5α degrees. On receiving the PPS signal the lidar starts scanning from −0.5α degrees. With the camera's horizontal field of view α around 60 degrees, the time the lidar needs to sweep α degrees is far shorter than the 100 ms needed for a full 360-degree revolution, reducing the time synchronization deviation.
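A back-of-the-envelope check of the claim above (function name assumed): at a 10 Hz scan rate a full revolution takes 100 ms, so sweeping only the camera's field of view takes a proportionally small fraction of that.

```python
# Time for the lidar to sweep a given horizontal angle at a given scan rate.
def sweep_time_ms(fov_deg: float, scan_hz: float = 10.0) -> float:
    """Milliseconds needed to sweep fov_deg at scan_hz revolutions per second."""
    full_rev_ms = 1000.0 / scan_hz   # one full 360-degree revolution
    return full_rev_ms * fov_deg / 360.0
```

For a 60-degree field of view at 10 Hz this gives about 16.7 ms, versus 100 ms for the full revolution.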
The GPS module continuously sends the PPS signal and the GPGGA message at 1 Hz. The lidar starts collecting point cloud data when it receives the PPS signal, and the cameras, synchronously receiving the 20 Hz pulses divided from the PPS, start collecting image data, so the data acquired by the camera and lidar sensors correspond to the same instant. However, time deviations arise during transmission, so the point cloud data and the image data reach the collected data processing module at different times.
The mean transmission delays are measured in advance: T_delay_lidar, the mean delay between the lidar receiving the PPS signal and the collected data processing module receiving the point cloud data, and T_delay_camera, the mean delay between the frequency divider receiving the PPS signal and the module receiving the image data.
The collected data processing module receives the GPGGA message, extracts the UTC time, and updates the system time; because the GPGGA message and the PPS signal are transmitted at the same time, the UTC time extracted from the current GPGGA message is the time at which the sensors acquired their data. Let T_re_lidar be the system time at which the module receives the point cloud data and T_re_camera the system time at which it receives the image data. The timestamp of the lidar point cloud data is then T_lidar = T_re_lidar − T_delay_lidar, and the timestamp of the image data is T_camera = T_re_camera − T_delay_camera.
It should be noted that, in theory, T_lidar and T_camera are equal, but because each frame of lidar and camera data suffers slight time deviations and possible frame loss during transmission and reception, the collected data processing module must further process the received data to ensure each frame of point cloud and camera data is aligned in time. With the lidar sampling frequency half the camera's, the point cloud data serve as the core frames: the module reads each received point cloud frame from the buffer, uses the indexed timestamps to find the image frames immediately before and after it, compares their times, selects the image frame closest in time to the point cloud data, packages it together with the point cloud data, and records a sequence number to form a data packet for storage.
In the embodiment, the camera and the laser radar multi-sensor are synchronously triggered to acquire data by taking the high-precision PPS signal as a time reference, and the acquired image and point cloud data timestamps are recorded by using the precise UTC time acquired by the satellite, so that millisecond-level time synchronization of the multi-sensor of the data acquisition system is realized.
Compared with the prior art, the multi-sensor time synchronization method provided by this embodiment has the following advantages: 1. the GPS outputs a pulse-per-second (PPS) signal accurate to the millisecond, providing a precise trigger reference for the acquisition system's sensors, so synchronization accuracy can reach the millisecond level; 2. the sensors' dependence on PTP protocol support is avoided, reducing the constraints on sensor selection for the data acquisition system; 3. the number of synchronized sensors is easy to expand: adding or removing a lidar or camera only requires adding or removing pulse trigger signal lines.
Fig. 3 is a schematic flowchart of a multi-sensor time synchronization method according to an embodiment of the present invention, where the method includes:
s301, the GPS module sends a PPS signal to a frequency divider and a laser radar, the frequency divider divides the frequency of the PPS signal and sends the PPS signal to a camera connected with the frequency divider, the camera is triggered to collect images, and the laser radar collects point cloud data after receiving the PPS signal;
and configuring the GPIO interface mode of the cameras into an external trigger exposure mode, wherein external trigger pulse signals of the cameras come from the same pulse source.
Preferably, the laser radar is configured into a PPS mode through an upper computer, and the scanning frequency of the laser radar is set, wherein the acquisition frequency of the camera is integral multiple of the scanning frequency of the laser radar.
Preferably, the initial scanning angle of the laser radar is set within the visual angle range of the camera, so that the point cloud data synchronized with the image is scanned in a shorter time.
S302, the collected data processing module receives a GPGGA message sent by a GPS module, and extracts UTC time from the GPGGA message as a system time source;
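Extracting the UTC time from a GPGGA message, as in step S302, can be sketched as follows (a minimal parser, assuming a well-formed NMEA 0183 GGA sentence whose first field carries UTC as hhmmss.ss; the function name is an assumption):

```python
# Minimal GGA parser: UTC time is the first comma-separated field after the talker ID.
def gpgga_utc(sentence: str) -> tuple[int, int, float]:
    """Parse 'hhmmss.ss' out of a $GPGGA sentence into (hours, minutes, seconds)."""
    fields = sentence.split(",")
    if not fields[0].endswith("GGA"):
        raise ValueError("not a GGA sentence")
    t = fields[1]
    return int(t[0:2]), int(t[2:4]), float(t[4:])
```

For example, a sentence beginning "$GPGGA,123519.00,..." parses to 12:35:19.00 UTC.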
it should be noted that the collected data processing module is generally configured to receive image data collected by a camera and point cloud data collected by a laser radar, receive a GPGGA packet sent by a GPS module, and add a timestamp to the image data and the point cloud data according to UTC time provided by the GPS module.
And S303, the collected data processing module receives the collected image data and point cloud data, and adds timestamps to the image data and the point cloud data according to the system time source.
Specifically, the adding the timestamp to the image data and the point cloud data according to the system time source includes:
the mean delay T_delay_lidar between the lidar receiving the PPS signal and the collected data processing module receiving the point cloud data, and the mean delay T_delay_camera between the frequency divider receiving the PPS signal and the module receiving the image data, are measured respectively;
letting T_re_lidar be the system time at which the collected data processing module receives the point cloud data and T_re_camera the system time at which it receives the image data, the timestamp of the lidar point cloud data is T_lidar = T_re_lidar − T_delay_lidar, and the timestamp of the image data is T_camera = T_re_camera − T_delay_camera.
It can be understood that, because there is a slight time deviation and frame loss in the transmission and reception processes of the laser radar data and the camera data of each frame, the received point cloud and image data need to be further processed to ensure that the data of each frame of point cloud and camera are aligned in time, and the accuracy of the added timestamp is ensured.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
It will be appreciated that, in one embodiment, the electronic device comprises a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor implementing multi-sensor time synchronization when executing the computer program. Those skilled in the art will also understand that all or part of the steps of the above embodiments may be implemented by a program instructing the associated hardware; the program may be stored in a computer-readable storage medium and, when executed, performs steps S301 to S303. The storage medium includes, for example, ROM/RAM and optical disks.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A multi-sensor time synchronizer at least comprises a collected data processing module, a frequency divider, cameras, a laser radar and a data storage module, and is characterized in that the collected data processing module, the frequency divider and the laser radar are all connected with a GPS module, and at least one camera is connected with the frequency divider;
the frequency divider is used for receiving the PPS signals sent by the GPS module, dividing the frequency of the PPS signals and sending the PPS signals to the connected cameras;
the camera is used for triggering image data acquisition according to the PPS signal sent by the frequency divider;
the laser radar is used for triggering point cloud data acquisition according to the PPS signal sent by the GPS module;
the collected data processing module is used for receiving the GPGGA message sent by the GPS module, extracting UTC time from the GPGGA message as a system time source, receiving collected image data and point cloud data, and adding timestamps to the image data and the point cloud data according to the system time source;
and the data storage module is used for storing the image data added with the time stamp and the point cloud data.
2. The device of claim 1, wherein the GPIO interface mode of the camera is configured as an external trigger exposure mode, and the external trigger pulse signals of the multiple cameras are from the same pulse source.
3. The device of claim 1, wherein the laser radar is configured to be in a PPS mode through the upper computer, and the scanning frequency of the laser radar is set, wherein the camera collecting frequency is an integral multiple of the scanning frequency of the laser radar.
4. The apparatus of claim 1, wherein the starting scan angle of the lidar is set within a range of view angles of the camera.
5. The apparatus of claim 1, wherein the timestamping image data and point cloud data according to a system time source comprises:
respectively measuring the mean delay T_delay_lidar between the lidar receiving the PPS signal and the data processing module receiving the point cloud data, and the mean delay T_delay_camera between the frequency divider receiving the PPS signal and the data processing module receiving the image data;
letting T_re_lidar be the system time at which the collected data processing module receives the point cloud data and T_re_camera the system time at which it receives the image data, the timestamp of the laser point cloud data is T_lidar = T_re_lidar − T_delay_lidar, and the timestamp of the image data is T_camera = T_re_camera − T_delay_camera.
6. A multi-sensor time synchronization method, comprising:
the GPS module sends a PPS signal to the frequency divider and the laser radar, the frequency divider divides the frequency of the PPS signal and sends the PPS signal to a camera connected with the frequency divider, the camera is triggered to collect images, and the laser radar collects point cloud data after receiving the PPS signal;
the acquisition data processing module receives a GPGGA message sent by a GPS module, and extracts UTC time from the GPGGA message as a system time source;
and the acquired data processing module receives the acquired image data and point cloud data and adds timestamps to the image data and the point cloud data according to the system time source.
7. The method of claim 6, wherein the GPIO interface mode of the cameras is configured as an externally triggered exposure mode, and the externally triggered pulse signals of multiple cameras are from the same pulse source.
8. The method of claim 6, wherein the lidar is configured to the PPS mode by the host computer and a scanning frequency of the lidar is set, wherein the camera acquisition frequency is an integer multiple of the scanning frequency of the lidar.
9. The method of claim 6, wherein the starting scan angle of the lidar is set within a range of view angles of the camera.
10. The method of claim 6, wherein the time stamping image data and point cloud data according to the system time source comprises:
respectively measuring the mean delay T_delay_lidar between the lidar receiving the PPS signal and the data processing module receiving the point cloud data, and the mean delay T_delay_camera between the frequency divider receiving the PPS signal and the data processing module receiving the image data;
letting T_re_lidar be the system time at which the collected data processing module receives the point cloud data and T_re_camera the system time at which it receives the image data, the timestamp of the laser point cloud data is T_lidar = T_re_lidar − T_delay_lidar, and the timestamp of the image data is T_camera = T_re_camera − T_delay_camera.
CN202011570288.8A 2020-12-26 2020-12-26 Multi-sensor time synchronization device and method Withdrawn CN112787740A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011570288.8A CN112787740A (en) 2020-12-26 2020-12-26 Multi-sensor time synchronization device and method

Publications (1)

Publication Number Publication Date
CN112787740A (en) 2021-05-11

Family

ID=75752656

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011570288.8A Withdrawn CN112787740A (en) 2020-12-26 2020-12-26 Multi-sensor time synchronization device and method

Country Status (1)

Country Link
CN (1) CN112787740A (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113890665A (en) * 2021-09-23 2022-01-04 北京超星未来科技有限公司 Time synchronization method, system, storage medium and processor
CN113922910A (en) * 2021-10-09 2022-01-11 广东汇天航空航天科技有限公司 Sensor time synchronization processing method, device and system
CN114006672A (en) * 2021-09-17 2022-02-01 东风汽车集团股份有限公司 Vehicle-mounted multi-sensor data synchronous acquisition method and system
CN114440872A (en) * 2021-12-30 2022-05-06 广州幻境科技有限公司 Synchronous sampling method and device for multiple groups of inertial sensors
CN114499733A (en) * 2022-02-16 2022-05-13 东南大学 Four-legged robot-mounted SLAM device and sensor time synchronization method
CN115021851A (en) * 2022-05-31 2022-09-06 北京里尔诺科技有限公司 Multi-sensor collaborative time information synchronization equipment based on automatic driving
WO2023093054A1 (en) * 2021-11-29 2023-06-01 上海商汤智能科技有限公司 Data processing method, apparatus, and system, device, and storage medium
WO2023246050A1 (en) * 2022-06-20 2023-12-28 北京航天控制仪器研究所 Millisecond-level data synchronization apparatus and method for electro-optical pod
WO2024011408A1 (en) * 2022-07-12 2024-01-18 阿波罗智能技术(北京)有限公司 Method and apparatus for synchronously collecting data, synchronization determination method and apparatus, and autonomous vehicle
WO2024045069A1 (en) * 2022-08-31 2024-03-07 Apollo Intelligent Driving Technology (Beijing) Co., Ltd. Flexible lidar camera synchronization for driverless vehicle
WO2024193278A1 (en) * 2023-03-20 2024-09-26 先临三维科技股份有限公司 Data synchronization method and apparatus, and device and storage medium

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101949715A (en) * 2010-08-10 2011-01-19 武汉武大卓越科技有限责任公司 Multi-sensor integrated synchronous control method and system for high-precision time-space data acquisition
CN104655134A (en) * 2013-11-22 2015-05-27 哈尔滨功成科技创业投资有限公司 Multisensor data acquisition system based on GPS time scale
CN109104259A (en) * 2017-08-25 2018-12-28 杭州德泽机器人科技有限公司 Multi-sensor clock synchronization system and method
CN109587405A (en) * 2018-10-24 2019-04-05 科大讯飞股份有限公司 Method for synchronizing time and device
CN110133999A (en) * 2019-04-23 2019-08-16 武汉星源云意科技有限公司 Time synchronization method and clock synchronization system for a Nebula laser point cloud data acquisition platform
CN110217178A (en) * 2019-06-18 2019-09-10 浙江大学 Unmanned-driving perception system based on hardware synchronization and working method thereof
CN110517209A (en) * 2018-05-21 2019-11-29 北京京东尚科信息技术有限公司 Data processing method, device, system and computer readable storage medium
US20200064859A1 (en) * 2018-08-24 2020-02-27 Baidu Usa Llc Hardware centralized time synchronization hub for an autonomous driving vehicle
CN111309094A (en) * 2020-02-06 2020-06-19 上海图趣信息科技有限公司 Synchronous board card and method for data acquisition of sensor equipment
CN111381487A (en) * 2018-12-29 2020-07-07 阿里巴巴集团控股有限公司 Multi-sensor synchronous time service system, method and device and electronic equipment
CN111435162A (en) * 2020-03-03 2020-07-21 深圳市镭神智能系统有限公司 Laser radar and camera synchronization method, device, equipment and storage medium
CN111934843A (en) * 2020-07-31 2020-11-13 深圳市智绘科技有限公司 Multi-sensor data synchronous acquisition method for intelligent unmanned system

Similar Documents

Publication Publication Date Title
CN112787740A (en) Multi-sensor time synchronization device and method
CN109104259B (en) Multi-sensor time synchronization system and method
CN111309094A (en) Synchronous board card and method for data acquisition of sensor equipment
CN109905194B (en) Vehicle-mounted terminal system and synchronous data acquisition method and device
CN108923876B (en) Time synchronization method, device and system
CN109587405B (en) Time synchronization method and device
US20180003822A1 (en) Environmental sensing device and information acquiring method applied to environmental sensing device
CN112261283B (en) Synchronous acquisition method, device and system of high-speed camera
CN112541527A (en) Multi-sensor synchronization method and device, electronic equipment and storage medium
CN113496545B (en) Data processing system, method, sensor, mobile acquisition backpack and equipment
CN112945228B (en) Multi-sensor time synchronization method and synchronization device
CN103563287B (en) Synchronization device and synchronization method
CN111556224B (en) Multi-camera synchronous calibration method, device and system
WO2024193278A1 (en) Data synchronization method and apparatus, and device and storage medium
CN113572556A (en) Time synchronization method, sensor and system
CN113225152B (en) Method and device for synchronizing cameras and computer readable medium
CN110177220B (en) Camera with external time service function and control method thereof
CN113267186B (en) Data synchronous acquisition system and data synchronous acquisition method
CN112564883B (en) Time synchronization apparatus, method and automatic driving system
CN117152589A (en) Target recognition method, system and storage medium
WO2023093054A1 (en) Data processing method, apparatus, and system, device, and storage medium
KR101967027B1 (en) Improved UTC Time Synchronization Method for a Device using GPS Module in Abnormal Operation Condition
CN113965285B (en) Cross-system multi-sensor time synchronization and calibration method based on NTP protocol
KR101965932B1 (en) UTC Time Synchronization Method for a Device using GPS Module
CN114338951A (en) Sensor synchronization method, device and system and vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20210511