WO2020113358A1 - Systems and methods for synchronizing vehicle sensors and devices - Google Patents

Systems and methods for synchronizing vehicle sensors and devices

Info

Publication number: WO2020113358A1
Application number: PCT/CN2018/118866
Authority: WO (WIPO PCT)
Prior art keywords: signal, vehicle, reference timing, timing signal, control board
Other languages: French (fr)
Inventors: Lu Feng, Zhenqiang YAN, Teng MA, Jingnan LIU
Original assignee: Beijing Didi Infinity Technology And Development Co., Ltd.
Application filed by Beijing Didi Infinity Technology And Development Co., Ltd.
Priority to CN201880092869.XA (published as CN112041767A)
Priority to PCT/CN2018/118866 (published as WO2020113358A1)

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04Q: SELECTING
    • H04Q 9/00: Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W 56/00: Synchronisation arrangements
    • H04Q 2209/00: Arrangements in telecontrol or telemetry systems
    • H04Q 2209/70: Arrangements in the main station, i.e. central controller
    • H04Q 2209/75: Arrangements in the main station, i.e. central controller, by polling or interrogating the sub-stations

Definitions

  • synchronization control board 210 may generate a triggering signal based on the reference timing signal and the vehicle speed signal received in steps S602 and S604, respectively.
  • the triggering signal may be a pulsed-wave signal similar to the reference timing signal but at a higher frequency.
  • the frequency of the triggering signal may be determined based on the vehicle speed signal.
  • a ratio between the frequency of the triggering signal and the frequency of the reference timing signal may be linearly proportional to the vehicle speed. For example, when vehicle 100 moves at a relatively low speed, the triggering frequency may be set low, and when vehicle 100 moves at a relatively high speed, the triggering frequency may be set high to capture sufficient data frames to cover a certain area.
  • synchronization control board 210 may generate one triggering signal for sensor 141, and another one for sensor 142.
  • the triggering signal for sensor 141 may have a frequency different from the frequency of the triggering signal for sensor 142.
  • the triggering signal for LiDAR acquisition may be 1 Hz
  • the triggering signal for camera acquisition may be 10-20 Hz.
  • synchronization control board 210 may use the reference timing signal to correct the triggering signal and ensure that the triggering signal is synchronized at the start of every second. For example, synchronization control board 210 may detect a rising edge in the reference timing signal, indicating the GPS whole second time, and phase shift the triggering signal accordingly so that the next pulse in the triggering signal has a rising edge matching the rising edge in the reference timing signal. In some embodiments, triggering signal correction may be triggered by every rising edge in the reference timing signal.
  • the triggering signal(s) generated by synchronization control board 210 may be provided to respective sensors 141 and 142 for triggering the sensor acquisitions.
  • sensors 141 and 142 may be triggered to capture a frame of data upon the rising edge of each pulse in the triggering signal.
  • synchronization computer 230 may receive these captured data frames.
  • synchronization computer 230 may receive a sequence of images (image frames) from sensor 141 and point cloud frames from sensor 142.
  • synchronization computer 230 may perform post-acquisition synchronization on the acquired data.
  • reference image trigger time determination unit 506 may determine the reference image trigger time T corresponding to the GPS whole-second time, i.e., the rising edge of a pulse in the PPS signal.
  • time T may be determined by applying a floor operation on the GPS time in a GTIMU message.
  • reference image determination unit 508 may identify a reference image captured when sensor 141 was triggered at the reference image trigger time T.
  • the reference image may be identified as the second image in a pair of images acquired a time interval Tr apart.
  • reference image determination unit 508 may further determine an acquisition time T0 corresponding to the GPS time when the reference image is captured.
  • Time stamp Ts0 may be a camera parameter predetermined for sensor 141 or calculated automatically by sensor 141.
  • image acquisition time determination unit 510 may determine acquisition times corresponding to the GPS times when the remaining images are captured.
  • Tn may be the acquisition time for the nth image behind the reference image.
  • Raw time Trn and shutter time Tsn are both time stamps created by sensor 141.
  • the computer-readable medium may include volatile or non-volatile, magnetic, semiconductor, tape, optical, removable, non-removable, or other types of computer-readable medium or computer-readable storage devices.
  • the computer-readable medium may be the storage device or the memory module having the computer instructions stored thereon, as disclosed.
  • the computer-readable medium may be a disc or a flash drive having the computer instructions stored thereon.

Abstract

Embodiments of the disclosure provide methods and systems for synchronizing vehicle sensors. The vehicle synchronization system can include a GPS receiver configured to provide a reference timing signal and a bus configured to provide a vehicle speed signal. The vehicle synchronization system may further include at least one sensor. The vehicle synchronization system may also include a synchronization control board, configured to determine a triggering signal based on the reference timing signal and the vehicle speed signal, and trigger the at least one sensor using the triggering signal for acquiring a plurality of images.

Description

SYSTEMS AND METHODS FOR SYNCHRONIZING VEHICLE SENSORS AND DEVICES

TECHNICAL FIELD
The present disclosure relates to synchronizing vehicle sensors and devices, and more particularly to, methods and systems for synchronizing vehicle sensors and devices using a synchronization control circuit based on Pulse Per Second (PPS) signals.
BACKGROUND
Autonomous driving technology relies heavily on a high-definition map. For example, the accuracy of the navigation map is critical to functions of autonomous driving vehicles, such as positioning, environment recognition, decision making, and control. High-accuracy maps may be obtained by aggregating images and information acquired by various sensors, detectors, and other devices on vehicles as they drive around. For example, a vehicle may be equipped with Light Detection and Ranging (LiDAR) scanner(s), camera(s), a Global Positioning System (GPS) receiver, and Inertial Measurement Unit(s) (IMU), etc., to capture positions and features of the road the vehicle is driving on and of surrounding objects. Captured data may include, e.g., center line or border line coordinates of a lane, and coordinates and images of an object, such as a landmark or a traffic sign.
To aggregate the data captured by the different sensors and devices, the data needs to be synchronized. For instance, GPS coordinate data of a particular time point has to be registered with the LiDAR images of a surrounding object taken at the very same time point, in order for that object to be accurately positioned on the map using the GPS coordinates. Synchronization may be performed during two stages: the acquisition stage, when the data is acquired, and the post-processing stage, when the acquired data is processed and aggregated. Synchronization performed entirely during the post-processing stage tends to be less accurate and inefficient, while synchronization of the sensors and devices during acquisition is usually difficult to implement. One existing method uses custom-designed cameras, such as certain high-end models of panorama cameras or binocular cameras, that have built-in synchronization functions. However, due to the limited options for such cameras, the hardware configuration is inflexible and hard to extend. Another method uses triggering servers to trigger the sensors and devices through “soft” signals. Such a solution lacks accuracy and security. Therefore, an improved system and method for synchronizing the vehicle sensors and devices is needed in order to acquire synchronized data that can later be aggregated to create a high-definition map.
Embodiments of the disclosure address the above problem by systems and methods for synchronization of vehicle sensors and devices.
SUMMARY
Embodiments of the disclosure provide a vehicle synchronization system. The vehicle synchronization system can include a GPS receiver configured to provide a reference timing signal and a bus configured to provide a vehicle speed signal. The vehicle synchronization system may further include at least one sensor. The vehicle synchronization system may also include a synchronization control board, configured to determine a triggering signal based on the reference timing signal and the vehicle speed signal, and trigger the at least one sensor using the triggering signal for acquiring a plurality of images.
Embodiments of the disclosure further disclose a method for synchronizing vehicle sensors. The method may include receiving a reference timing signal from a GPS receiver and a vehicle speed signal from a bus. The method may further include determining, using a synchronization control board, a triggering signal based on the reference timing signal and the vehicle speed signal, and triggering at least one sensor using the triggering signal for acquiring a plurality of images.
Embodiments of the disclosure further disclose a synchronization control board. The synchronization control board may include at least one input pin configured to receive a reference timing signal and a vehicle speed signal. The synchronization control board may further include at least one output pin configured to provide a triggering signal to at least one sensor. The triggering signal may be determined based on the reference timing signal and the vehicle speed signal. The synchronization control board may also include a feedback pin configured to provide the reference timing signal to a processor.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 illustrates a schematic diagram of an exemplary vehicle equipped with sensors, according to embodiments of the disclosure.
FIG. 2 illustrates a schematic diagram of an exemplary vehicle synchronization system for synchronizing vehicle sensors, according to embodiments of the disclosure.
FIG. 3 illustrates a schematic diagram of an exemplary synchronization control board, according to embodiments of the disclosure.
FIG. 4 illustrates an exemplary triggering signal synchronized using a reference timing signal, according to embodiments of the disclosure.
FIG. 5 illustrates a schematic diagram of an exemplary synchronization computer for synchronizing data acquired by vehicle sensors, according to embodiments of the disclosure.
FIG. 6 illustrates a flowchart of an exemplary method for synchronizing vehicle sensors, according to embodiments of the disclosure.
DETAILED DESCRIPTION
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.
FIG. 1 illustrates a schematic diagram of an exemplary vehicle 100 having a plurality of sensors 141, 142, and 150, according to embodiments of the disclosure. Consistent with some embodiments, vehicle 100 may be a survey vehicle configured for acquiring data for constructing a high-definition map or three-dimensional (3-D) city modeling. It is contemplated that vehicle 100 may be an electric vehicle, a fuel cell vehicle, a hybrid vehicle, or a conventional internal combustion engine vehicle. Vehicle 100 may have a body 110 and at least one wheel 120. Body 110 may be any body style, such as a sports vehicle, a coupe, a sedan, a pick-up truck, a station wagon, a sports utility vehicle (SUV), a minivan, or a conversion van. In some embodiments, vehicle 100 may include a pair of front wheels and a pair of rear wheels, as illustrated in FIG. 1. However, it is contemplated that vehicle 100 may have more or fewer wheels, or equivalent structures that enable vehicle 100 to move around. Vehicle 100 may be configured to be all-wheel drive (AWD), front-wheel drive (FWD), or rear-wheel drive (RWD). In some embodiments, vehicle 100 may be configured to be operated by an operator occupying the vehicle, remotely controlled, and/or autonomous.
As illustrated in FIG. 1, vehicle 100 may be equipped with sensors 141 and 142 mounted to body 110 via a mounting structure 130. Mounting structure 130 may be an electro-mechanical device installed or otherwise attached to body 110 of vehicle 100. In some embodiments, mounting structure 130 may use screws, adhesives, or another mounting mechanism. Vehicle 100 may be additionally equipped with sensor 150 inside or outside body 110 using any suitable mounting mechanism. It is contemplated that the manners in which sensor 141, 142, or 150 can be equipped on vehicle 100 are not limited by the example shown in FIG. 1 and may be modified depending on the types of sensors 141, 142, or 150 and/or vehicle 100 to achieve desirable sensing performance.
In some embodiments,  sensors  141, 142, and 150 may be configured to capture data as vehicle 100 travels along a trajectory. Consistent with the present disclosure, sensor 141 may be a camera that takes pictures or otherwise collects image data. For example, sensor 141 may include a monocular, binocular, or panorama camera. Sensor 141 may acquire a plurality of images (each known as an image frame) as vehicle 100 moves along a trajectory. Each image frame is acquired at a time point as triggered by a triggering signal.
Consistent with the present disclosure, sensor 142 may be a LiDAR scanner configured to scan the surroundings and acquire point clouds. LiDAR measures the distance to a target by illuminating the target with pulsed laser light and measuring the reflected pulses with a sensor. Differences in laser return times and wavelengths can then be used to make digital 3-D representations of the target. The light used for a LiDAR scan may be ultraviolet, visible, or near infrared. Because a narrow laser beam can map physical features with very high resolution, a LiDAR scanner is particularly suitable for high-definition map surveys. In some embodiments, a LiDAR scanner may capture a point cloud. Similar to sensor 141, sensor 142 may also acquire a plurality of point clouds (each known as a point cloud frame) as vehicle 100 moves along a trajectory. Each point cloud frame is acquired at a time point as triggered by a triggering signal. The signal triggering sensor 142 may be the same as, or different from, the signal triggering sensor 141.
As vehicle 100 travels along the trajectory,  sensors  141 and 142 may continuously capture data. Each set of scene data captured at a certain time point is known as a data frame. For example, sensor 141 may record a video consisting of multiple image frames captured at multiple time points. Meanwhile, sensor 142 may capture a series of point cloud data at multiple time points.
As illustrated in FIG. 1, vehicle 100 may be additionally equipped with sensor 150, which may include sensors used in a navigation unit, such as a GPS receiver and one or more IMU sensors. GPS is a global navigation satellite system that provides geolocation and time information to a GPS receiver. An IMU is an electronic device that measures and provides a vehicle’s specific force, angular rate, and sometimes the magnetic field surrounding the vehicle, using various inertial sensors, such as accelerometers and gyroscopes, and sometimes also magnetometers. By combining the GPS receiver and the IMU sensor, sensor 150 can provide real-time pose information of vehicle 100 as it travels, including the positions and orientations (e.g., Euler angles) of vehicle 100 at each time point.
The GPS/IMU sensor uses an accurate Pulse Per Second (PPS) signal to time its acquisition. A PPS signal is an electrical signal that has a width of less than one second and a sharply rising or abruptly falling edge that accurately repeats once per second. In some embodiments, the PPS pulses may contain rising edges corresponding to the start of every Coordinated Universal Time (UTC) second. The accuracy of the correspondence may be within a nanosecond. Therefore, PPS signals may be used for precise timekeeping and time measurement.
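To make the edge semantics concrete, the following is an idealized sketch of detecting the rising edges of a sampled PPS waveform. On real hardware the edge is captured by an input-capture peripheral or an interrupt, so this software model is illustrative only.

```python
# Sketch: detect rising edges in a sampled PPS waveform. Samples are
# (time, level) pairs; each 0 -> 1 transition marks the start of a UTC
# second. Idealized model; real designs capture the edge in hardware.
def pps_rising_edges(samples: list[tuple[float, int]]) -> list[float]:
    edges = []
    for (t_prev, v_prev), (t_curr, v_curr) in zip(samples, samples[1:]):
        if v_prev == 0 and v_curr == 1:
            edges.append(t_curr)        # time stamp of the rising edge
    return edges


# 1 Hz square wave with 100 ms high pulses, sampled every 50 ms:
wave = [(k * 0.05, 1 if (k * 0.05) % 1.0 < 0.1 else 0) for k in range(60)]
print(pps_rising_edges(wave))           # -> [1.0, 2.0]
```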
Consistent with the present disclosure, vehicle 100 may include a synchronization system to synchronize sensors 141, 142, and 150 such that image frames captured by sensor 141, point clouds captured by sensor 142, and pose information captured by sensor 150 are all captured at the same time points. The synchronized image frame, point cloud, and associated pose information may be used collectively to position vehicle 100. Consistent with the present disclosure, the PPS signal provided by the GPS/IMU sensor may be used to synchronize the acquisition of sensors 141, 142, and 150.
FIG. 2 illustrates a schematic diagram of an exemplary vehicle synchronization system 200 for synchronizing vehicle sensors, according to embodiments of the disclosure. Vehicle synchronization system 200 may include a synchronization control board 210, a Controller Area Network (CAN) bus 220, and a synchronization computer 230. Consistent with the present disclosure, system 200 is configured to synchronize sensors 141, 142, and 150. It is contemplated that system 200 may include additional components.
Synchronization control board 210 may be a circuit board configured to receive various input signals and provide various output signals. In some embodiments, synchronization control board 210 may be a Printed Circuit Board (PCB), consisting of an insulator, e.g., fiberglass, with traces of conductive material acting as wires on the base of the board. In some embodiments, synchronization control board 210 may be an integrated circuit (also known as an IC, a chip, or a microchip), which consists of a large number of electronic components (e.g., transistors, resistors, capacitors, inductors, etc.) on one small flat piece of semiconductor material, e.g., silicon.
For example, FIG. 3 illustrates a schematic diagram of an exemplary synchronization control board 210, according to embodiments of the disclosure. Synchronization control board 210 may be a chip that has several input pins, e.g., pins 311-313, and several output pins, e.g., pins 314-317. Although FIG. 3 shows synchronization control board 210 as an STM32 IC chip, it is contemplated that other suitable chips or circuits may be used. It is also contemplated that synchronization control board 210 may include more or fewer pins than shown in FIG. 3.
In some embodiments, synchronization control board 210 may receive the PPS signal via pin 311 from sensor 150. For example, the PPS signal may be a square wave at 1 Hz; thus, each rising edge of the pulse corresponds to the start of a UTC second (referred to hereafter as a “GPS whole-second time”). Because GPS is considered a stratum-0 source, the PPS signal may be provided to the PC.2 pin (pin 311) using a low-latency, low-jitter wire connection. In some embodiments, synchronization control board 210 may additionally receive a vehicle speed signal via an RxD pin (pin 312) from CAN bus 220. CAN bus 220 may be a Controller Area Network bus configured to allow microcontrollers and devices in a vehicle to communicate with each other in applications without a host computer. Various sensor inputs from around vehicle 100 may be collected and collated via CAN bus 220. For example, CAN bus 220 may collect vehicle speed information measured by speed sensors. Synchronization control board 210 may be grounded via a GND pin (pin 313).
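As a concrete illustration of the speed input, the sketch below reads and decodes one vehicle speed frame from the CAN bus using the python-can library. The arbitration ID, byte layout, and scaling factor are hypothetical assumptions; on a real vehicle they come from the manufacturer's CAN database (DBC) file.

```python
# Sketch: read one vehicle speed frame from the CAN bus with python-can.
# SPEED_FRAME_ID and the decoding (16-bit little-endian field scaled to
# km/h) are hypothetical; real values are defined by the vehicle's DBC file.
from typing import Optional

import can

SPEED_FRAME_ID = 0x2F0   # hypothetical arbitration ID of the speed frame
KMH_PER_LSB = 0.01       # hypothetical scaling factor


def read_vehicle_speed_kmh(bus: can.BusABC, timeout: float = 1.0) -> Optional[float]:
    """Block until a speed frame arrives and decode it to km/h."""
    while True:
        msg = bus.recv(timeout=timeout)
        if msg is None:
            return None  # no frame arrived within the timeout
        if msg.arbitration_id == SPEED_FRAME_ID:
            raw = int.from_bytes(msg.data[0:2], byteorder="little")
            return raw * KMH_PER_LSB


if __name__ == "__main__":
    with can.interface.Bus(channel="can0", bustype="socketcan") as bus:
        print("vehicle speed:", read_vehicle_speed_kmh(bus), "km/h")
```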
Consistent with the present disclosure, synchronization control board 210 may use the PPS signal provided by GPS/IMU sensor 150 as a reference timing signal. In some embodiments, based on this reference timing signal and the vehicle speed signal provided by CAN bus 220, synchronization control board 210 determines a triggering signal for triggering sensors 141 and 142. In some embodiments, synchronization control board 210 may first convert the reference timing signal to a higher frequency. For example, synchronization control board 210 may convert a 1 Hz PPS signal to a 10-20 Hz signal. The converted signal may be used to trigger sensors 141 and 142 to acquire each data frame (i.e., an image frame or a point cloud frame). For example, the acquisition may be triggered by the rising edge of the pulses in the signal.
In some embodiments, the frequency of the triggering signal may be determined and adjusted based on the vehicle speed signal provided by CAN bus 220. For example, when vehicle 100 moves at a relatively low speed, the triggering frequency may be set low; when vehicle 100 moves at a relatively high speed, the triggering frequency may be set high to capture sufficient data frames to cover a certain area. In some embodiments, a ratio between the frequency of the triggering signal and the frequency of the reference timing signal may be proportional to the vehicle speed. For example, the ratio may be linearly proportional to the vehicle speed. In some embodiments, the triggering signal for sensor 142 may have a frequency different from that of the triggering signal for sensor 141. For example, a 1 Hz triggering signal might be sufficient for LiDAR acquisition, while a 10-20 Hz triggering signal may be used for camera acquisition.
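A minimal sketch of this frequency selection follows, assuming the linear speed-to-ratio mapping described above. The gain constant is an illustrative assumption; the 10-20 Hz camera range and the 1 Hz LiDAR rate follow the examples in the text.

```python
# Sketch: choose triggering frequencies from the vehicle speed. The ratio of
# triggering frequency to the 1 Hz reference (PPS) frequency is taken as
# linearly proportional to speed; the 0.25 gain is an assumed constant.
PPS_HZ = 1.0


def camera_trigger_hz(speed_kmh: float, gain: float = 0.25) -> float:
    """Camera trigger frequency, linear in speed and clamped to 10-20 Hz."""
    ratio = gain * speed_kmh            # ratio to the reference frequency
    return min(max(PPS_HZ * ratio, 10.0), 20.0)


def lidar_trigger_hz(speed_kmh: float) -> float:
    """Per the text, 1 Hz may already be sufficient for LiDAR acquisition."""
    return 1.0


assert camera_trigger_hz(60.0) == 15.0   # e.g., 15 Hz at 60 km/h
assert camera_trigger_hz(10.0) == 10.0   # clamped at the low end
```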
In some embodiments, synchronization control board 210 may use the reference timing signal to correct the triggering signal and ensure that the triggering signal is synchronized to UTC. Because the triggering signal is generated partially based on synchronization control board 210’s own clock, it may contain accumulated errors caused by the board clock. For example, FIG. 4 illustrates an exemplary triggering signal 400 synchronized using a reference timing signal, according to embodiments of the disclosure. Triggering signal 400 includes pulses 401-405 with rising edges at time points t0-t4, respectively. At time point t5, synchronization control board 210 may detect a rising edge in the reference timing signal, indicating the GPS whole-second time. Synchronization control board 210 may phase shift the triggering signal accordingly so that the next pulse 410 has a rising edge matching the rising edge in the reference timing signal. Subsequent pulse 411 will be shifted correspondingly to have its rising edge at time point t6. The time interval between t5 and t6 will be kept the same as that between t0 and t1, to maintain the triggering signal at the same frequency. In some embodiments, triggering signal correction may be triggered by every rising edge in the reference timing signal.
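The correction in FIG. 4 can be modeled as follows: at every rising edge of the reference timing signal, the next trigger pulse is re-anchored to that edge and subsequent pulses keep the same period, so board-clock drift is discarded at each second boundary instead of accumulating. This is a simplified software model of the board's behavior, not its actual firmware.

```python
# Simplified model of FIG. 4: each PPS rising edge (one per GPS whole
# second) re-anchors the next trigger pulse; later pulses keep the period.
def trigger_times(pps_edges_s: list[float], period_s: float) -> list[float]:
    """Return trigger pulse times, phase-aligned to each PPS rising edge."""
    times: list[float] = []
    for edge in pps_edges_s:
        t = edge                   # first pulse coincides with the PPS edge
        while t < edge + 1.0:      # emit pulses until the next whole second
            times.append(t)
            t += period_s
    return times


# With a 105 ms period, the pulse after 0.945 s would fall at 1.05 s;
# re-anchoring snaps it to the PPS edge at 1.0 s instead:
pulses = trigger_times([0.0, 1.0], period_s=0.105)
assert pulses[10] == 1.0           # the first pulse of the new second
```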
As shown in FIG. 3, synchronization control board 210 may provide triggering signals to sensor 141 (e.g., including cameras 141-1 and 141-2) and sensor 142 (e.g., LiDAR) via PA pins 315-317, respectively. In some embodiments, the triggering signals may be pulse-width modulated (PWM) . In addition, synchronization control board 210 may provide the reference timing signal (e.g., the PPS signal) and/or the triggering signal as a serial output signal to synchronization computer 230, via a feedback pin 314. The reference timing signal may be used by synchronization computer 230 for post-acquisition synchronization.
Returning to FIG. 2, in addition to synchronization control board 210, which is configured to synchronize the sensors during acquisition, system 200 may further include synchronization computer 230, which is configured to synchronize the sensor measurements after the acquisition. Synchronization computer 230 can be a general-purpose computer or a proprietary device specially designed for sensor synchronization. It is contemplated that synchronization computer 230 can be a separate system or an integrated component of a vehicle controller, such as an Engine Control Unit (ECU). In some embodiments, synchronization computer 230 may include sub-systems, some of which may be remote.
FIG. 5 illustrates a schematic diagram of an exemplary synchronization computer 230 for synchronizing data acquired by vehicle sensors, according to embodiments of the disclosure. In some embodiments, synchronization computer 230 may include a communication interface 502, a processor 504, a memory 512, and a storage 514. In some embodiments, synchronization computer 230 may have different modules in a single device, such as an integrated circuit (IC) chip (implemented as an application-specific integrated circuit (ASIC) or a field-programmable gate array (FPGA) ) , or separate devices with dedicated functions. In some embodiments, one or more components of synchronization computer 230 may be located inside vehicle 100, or remotely such as in a cloud, on a mobile device, or other distributed locations. Components of synchronization computer 230 may be in an integrated device, or distributed at different locations but communicate with each other through a network (not shown) .
Communication interface 502 may send data to and receive data from components such as sensor 150 and CAN bus 220 via communication cables, a Wireless Local Area Network (WLAN), a Wide Area Network (WAN), wireless networks such as radio waves, a cellular network, and/or a local or short-range wireless network (e.g., Bluetooth™), or other communication methods. In some embodiments, communication interface 502 can be an integrated services digital network (ISDN) card, a cable modem, a satellite modem, or a modem to provide a data communication connection. As another example, communication interface 502 can be a local area network (LAN) card to provide a data communication connection to a compatible LAN. Wireless links can also be implemented by communication interface 502. In such an implementation, communication interface 502 can send and receive electrical, electromagnetic, or optical signals that carry digital data streams representing various types of information via a network.
Consistent with some embodiments, communication interface 502 may receive a reference timing signal provided by sensor 150 and a triggering signal provided by synchronization control board 210. Communication interface 502 may further receive image frames captured by sensor 141, point cloud data captured by sensor 142, and pose information captured by sensor 150. Communication interface 502 may further provide the received data, signals, and information to memory 512/storage 514 for storage or to processor 504 for processing.
Processor 504 may include any appropriate type of general-purpose or special-purpose microprocessor, digital signal processor, or microcontroller. Processor 504 may be configured as a separate processor module dedicated to synchronizing the acquired sensor data. Alternatively, processor 504 may be configured as a shared processor module that also performs other functions unrelated to sensor synchronization.
As shown in FIG. 5, processor 504 may include multiple modules, such as a reference image trigger time determination unit 506, a reference image determination unit 508, an image acquisition time determination unit 510, and the like. These modules (and any corresponding sub-modules or sub-units) can be hardware units (e.g., portions of an integrated circuit) of processor 504 designed for use with other components, or software units implemented by processor 504 through executing at least part of a program. The program may be stored on a computer-readable medium and, when executed by processor 504, may perform one or more functions. Although FIG. 5 shows units 506-510 all within one processor 504, it is contemplated that these units may be distributed among multiple processors located near to or remotely from each other.
Reference image trigger time determination unit 506 may be configured to determine the time point T corresponding to the GPS whole-second time, i.e., the rising edge of a pulse in the PPS signal. In some embodiments, reference image trigger time determination unit 506 may query the GPS time in a GTIMU message. Because this GPS time usually lags the start of the GPS whole-second time slightly, reference image trigger time determination unit 506 may take the floor of the GPS time as the reference image trigger time T.
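A minimal sketch of this step, assuming the GTIMU message's GPS time is available as seconds with a fractional part:

```python
import math


# Sketch: the GPS time in a GTIMU message slightly lags the whole second at
# which the reference image was triggered, so flooring it recovers the GPS
# whole-second time T (the reference image trigger time).
def reference_trigger_time(gtimu_gps_time_s: float) -> int:
    return math.floor(gtimu_gps_time_s)


assert reference_trigger_time(123456.078) == 123456
```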
Reference image determination unit 508 may identify, among the sequence of images received from sensor 141, a reference image captured by sensor 141 when triggered at the reference image trigger time T. In some embodiments, to identify the reference image, synchronization control board 210 may designate the time interval between two triggering pulses as Ti. Assuming there are N pulses within a whole second, the time interval between the reference image and the image just prior to it will be Tr = 1000 ms - (N - 1) * Ti. Therefore, reference image determination unit 508 may identify a pair of images that are captured Tr apart, and determine the latter image in the pair as the reference image. For example, if N = 10 (i.e., a 10 Hz triggering signal) and Ti = 105 ms, then Tr = 1000 ms - (10 - 1) * 105 ms = 55 ms. Reference image determination unit 508 may identify the second image in the pair of images captured 55 ms apart as the reference image.
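The following sketch implements this identification over a list of image time stamps and reproduces the worked example (N = 10, Ti = 105 ms, so Tr = 55 ms); the 1 ms matching tolerance is an assumption.

```python
# Sketch: find the reference image by scanning consecutive image time stamps
# for a gap of Tr = 1000 - (N - 1) * Ti milliseconds; the latter image of
# the matching pair is the reference image.
from typing import Optional


def find_reference_index(timestamps_ms: list[float], n: int, ti_ms: float,
                         tol_ms: float = 1.0) -> Optional[int]:
    tr_ms = 1000.0 - (n - 1) * ti_ms
    for i in range(1, len(timestamps_ms)):
        if abs((timestamps_ms[i] - timestamps_ms[i - 1]) - tr_ms) <= tol_ms:
            return i                    # latter image of the pair
    return None


# Worked example from the text: N = 10 and Ti = 105 ms give Tr = 55 ms.
stamps = [105.0 * k for k in range(10)] + [1000.0]   # last gap is 55 ms
assert find_reference_index(stamps, n=10, ti_ms=105.0) == 10
```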
Between the reference image trigger time T, when sensor 141 is triggered, and an acquisition time T0, when the reference image is actually captured, there is a time delay Ts0, which corresponds to the shutter speed (or exposure time) of sensor 141. The GPS time of the reference image is therefore T0 = T + Ts0. Exposure time is the length of time during which the film or digital sensor inside a camera is exposed to light, i.e., while the camera's shutter is open when capturing an image. In some embodiments, time stamp Ts0 may be a camera parameter predetermined for sensor 141 or calculated automatically by sensor 141.
Once the reference image is identified, and its GPS time T0 is determined, image acquisition time determination unit 510 may determine the GPS times of the remaining images. For example, Tn may be the acquisition time for the nth image behind the reference image. In some embodiments, Tn may be determined as Tn = T0 + (Trn - Tr0) + Tsn, where T0 is the GPS time for the reference image as determined by reference image determination unit 508, Trn is the raw time of the nth image, Tr0 is the raw time of the reference image, and Tsn is the shutter time of the nth image. Raw time Trn and shutter time Tsn are both time stamps created by sensor 141.
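A hedged sketch of this arithmetic, with all times in seconds; the function name and the sample time stamps are illustrative assumptions, not values from the disclosure.

```python
def gps_acquisition_times(t0, tr0, raw_times, shutter_times):
    """GPS times of the images behind the reference image, per
    Tn = T0 + (Trn - Tr0) + Tsn.

    t0            -- GPS time of the reference image (T + Ts0)
    tr0           -- raw time stamp Tr0 of the reference image
    raw_times     -- raw time stamps Trn of the following images
    shutter_times -- shutter time stamps Tsn, aligned with raw_times
    """
    return [t0 + (trn - tr0) + tsn
            for trn, tsn in zip(raw_times, shutter_times)]

# Example: a 10 Hz camera with a 5 ms shutter on every frame.
times = gps_acquisition_times(t0=1543795200.005, tr0=0.000,
                              raw_times=[0.105, 0.210],
                              shutter_times=[0.005, 0.005])
# times[0] == 1543795200.115: T0 + 0.105 s raw offset + 0.005 s shutter
```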
In some embodiments, by synchronizing the reference image trigger time with the GPS whole-second time, GPS time errors will not accumulate. As a result, even if a reference image is missing, e.g., lost during transmission from sensor 141, the reference image determined for the previous second may still be used to calculate the GPS times for the sequence of images.
Memory 512 and storage 514 may include any appropriate type of mass storage provided to store any type of information that processor 504 may need to operate. Memory 512 and storage 514 may be a volatile or non-volatile, magnetic, semiconductor, tape, optical, removable, non-removable, or other type of storage device or tangible (i.e., non-transitory) computer-readable medium including, but not limited to, a ROM, a flash memory, a dynamic RAM, and a static RAM. Memory 512 and/or storage 514 may be configured to store one or more computer programs that may be executed by processor 504 to perform sensor synchronization functions disclosed herein. For example, memory 512 and/or storage 514 may be configured to store program (s) that may be executed by processor 504 to synchronize the sensor acquisitions.
Memory 512 and/or storage 514 may be further configured to store information and data used by processor 504. For instance, memory 512 and/or storage 514 may be configured to store the various types of signals (e.g., reference timing signal, vehicle speed signal, etc.) provided by sensor 150 and CAN bus 220, as well as various types of data (e.g., image frames, point cloud data, pose information, etc.) captured by sensors 141, 142, and 150. Memory 512 and/or storage 514 may also store intermediate signals and data such as the triggering signal, camera parameters, etc. The various types of data may be stored permanently, removed periodically, or disregarded immediately after a portion of data is processed.
FIG. 6 illustrates a flowchart of an exemplary method 600 for synchronizing vehicle sensors, according to embodiments of the disclosure. In some embodiments, method 600 may be implemented by a vehicle synchronization system 200 that includes, among other things, synchronization control board 210 and synchronization computer 230. For example, steps S602-S610 of method 600 may be performed by synchronization control board 210, and steps S612-S618 may be performed by synchronization computer 230. It is to be appreciated that some of the steps may be optional for performing the disclosure provided herein. Further, some of the steps may be performed simultaneously, or in a different order than shown in FIG. 6.
In step S602, synchronization control board 210 may receive a reference timing signal. In some embodiments, the reference timing signal may be a PPS signal provided by GPS/IMU sensor 150. For example, in a 1 Hz PPS signal, the rising edge of each pulse corresponds to the start of a UTC second. In step S604, synchronization control board 210 may receive a vehicle speed signal. For example, the vehicle speed signal may be provided by CAN bus 220, which acquires the information through a speed sensor of vehicle 100.
In step S606, synchronization control board 210 may generate a triggering signal based on the reference timing signal and the vehicle speed signal received in steps S602 and S604, respectively. The triggering signal may be a pulsed-wave signal similar to the reference timing signal but at a higher frequency. In some embodiments, the frequency of the triggering signal may be determined based on the vehicle speed signal. In some embodiments, a ratio between the frequency of the triggering signal and the frequency of the reference timing signal may be linearly proportional to the vehicle speed. For example, when vehicle 100 moves at a relatively low speed, the triggering frequency may be set low, and when vehicle 100 moves at a relatively high speed, the triggering frequency may be set high to capture sufficient data frames to cover a certain area.
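The speed-dependent frequency selection of step S606 can be sketched as follows; the linear gain and the clamping bounds are assumptions for illustration, as the disclosure does not fix specific values.

```python
def triggering_frequency_hz(speed_mps, ref_hz=1.0, gain_per_mps=0.5,
                            min_hz=5.0, max_hz=20.0):
    """Triggering frequency whose ratio to the reference frequency is
    linearly proportional to vehicle speed (in m/s)."""
    ratio = gain_per_mps * speed_mps       # ratio linear in vehicle speed
    freq = ref_hz * ratio
    return min(max(freq, min_hz), max_hz)  # keep within the sensor's range

print(triggering_frequency_hz(10.0))  # slow vehicle  -> 5.0 Hz
print(triggering_frequency_hz(30.0))  # fast vehicle  -> 15.0 Hz
```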
In some embodiments, as part of step S606, synchronization control board 210 may generate one triggering signal for sensor 141, and another one for sensor 142. The triggering signal for sensor 141 may have a frequency different from the frequency of the triggering signal for sensor 142. For example, the triggering signal for LiDAR acquisition may be 1 Hz, while the triggering signal for camera acquisition may be 10-20 Hz.
In step S608, synchronization control board 210 may use the reference timing signal to correct the triggering signal and ensure that the triggering signal is synchronized at the start of every second. For example, synchronization control board 210 may detect a rising edge in the reference timing signal, indicating the GPS whole-second time, and phase-shift the triggering signal accordingly so that the next pulse in the triggering signal has a rising edge matching the rising edge in the reference timing signal. In some embodiments, triggering signal correction may be triggered by every rising edge in the reference timing signal.
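A minimal software model of this per-second correction, assuming a 10 Hz trigger with Ti = 105 ms; the function name is an illustrative assumption, not part of the disclosure.

```python
def trigger_times_for_second(pps_edge_s, n_pulses=10, ti_s=0.105):
    """Schedule N trigger pulses, the first phase-aligned to the PPS edge."""
    return [pps_edge_s + k * ti_s for k in range(n_pulses)]

# Each new PPS rising edge restarts the schedule exactly at the whole
# second, so any drift accumulated during the previous second is discarded.
schedule = trigger_times_for_second(1543795200.0)
# -> [1543795200.0, 1543795200.105, ..., 1543795200.945]
```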
In step S610, the triggering signal(s) generated by synchronization control board 210 may be provided to respective sensors 141 and 142 for triggering the sensor acquisitions. For example, sensors 141 and 142 may be triggered to capture a frame of data upon the rising edge of each pulse in the triggering signal. In step S612, synchronization computer 230 may receive these captured data frames. For example, synchronization computer 230 may receive a sequence of images (image frames) from sensor 141 and point cloud frames from sensor 142.
During steps S614-S618, synchronization computer 230 may perform post-acquisition synchronization on the acquired data. In step S614, reference image trigger time determination unit 506 may determine the reference image trigger time T corresponding to the GPS whole-second time, i.e., the rising edge of a pulse in the PPS signal. In some embodiments, time T may be determined by applying a floor operation on the GPS time in a GTIMU message.
In step S616, reference image determination unit 508 may identify a reference image captured when sensor 141 was triggered at the reference image trigger time T. In some embodiments, the reference image may be identified as the second image in a pair of images acquired a time interval Tr apart. Tr may be determined as Tr = 1000 ms - (N-1)*Ti, where Ti is the time interval between two triggering pulses, and N is the number of pulses within a whole second. As part of step S616, reference image determination unit 508 may further determine an acquisition time T0 corresponding to the GPS time when the reference image is captured. In some embodiments, T0 may be determined as T0 = T + Ts0, where Ts0 is a time delay corresponding to the shutter speed (or exposure time) of sensor 141. Time stamp Ts0 may be a camera parameter predetermined for sensor 141 or calculated automatically by sensor 141.
In step S618, image acquisition time determination unit 510 may determine acquisition times corresponding to the GPS times when the remaining images are captured. For example, Tn may be the acquisition time for the nth image behind the reference image. In some embodiments, Tn may be determined as Tn = T0 + (Trn - Tr0) + Tsn, where T0 is the GPS time for the reference image as determined in step S616, Trn is the raw time of the nth image, Tr0 is the raw time of the reference image, and Tsn is the shutter time of the nth image. Raw time Trn and shutter time Tsn are both time stamps created by sensor 141.
Another aspect of the disclosure is directed to a non-transitory computer-readable medium storing instructions which, when executed, cause one or more processors to perform the methods, as discussed above. The computer-readable medium may include volatile or non-volatile, magnetic, semiconductor, tape, optical, removable, non-removable, or other types of computer-readable medium or computer-readable storage devices. For example, the computer-readable medium may be the storage device or the memory module having the computer instructions stored thereon, as disclosed. In some embodiments, the computer-readable medium may be a disc or a flash drive having the computer instructions stored thereon.
It will be apparent to those skilled in the art that various modifications and variations can be made to the disclosed system and related methods. Other embodiments will be apparent to those skilled in the art from consideration of the specification and practice of the disclosed system and related methods.
It is intended that the specification and examples be considered as exemplary only, with a true scope being indicated by the following claims and their equivalents.

Claims (20)

  1. A vehicle synchronization system, comprising:
    a GPS receiver configured to provide a reference timing signal;
    a bus configured to provide a vehicle speed signal;
    at least one sensor; and
    a synchronization control board configured to:
    determine a triggering signal based on the reference timing signal and the vehicle speed signal; and
    trigger the at least one sensor using the triggering signal for acquiring a plurality of images.
  2. The vehicle synchronization system of claim 1, wherein the reference timing signal is a PPS signal.
  3. The vehicle synchronization system of claim 1, wherein the reference timing signal includes pulses at a first frequency, and the triggering signal includes pulses at a second frequency, wherein a ratio between the second frequency and the first frequency corresponds to the vehicle speed signal.
  4. The vehicle synchronization system of claim 1, wherein the synchronization control board is configured to use a rising pulse edge of the reference timing signal as a reference for synchronizing the triggering signal.
  5. The vehicle synchronization system of claim 1, wherein the at least one sensor includes a camera or a LiDAR.
  6. The vehicle synchronization system of claim 1, further comprising a processor configured to:
    receive, from the synchronization control board, the reference timing signal;
    receive, from the at least one sensor, the plurality of images and a set of time stamps corresponding to each image; and
    determine an acquisition time for each image based on the reference timing signal and the time stamps.
  7. The vehicle synchronization system of claim 6, wherein the processor is further configured to:
    determine a reference image trigger time based on the reference timing signal;
    identify a reference image, among the plurality of images, acquired at the reference image trigger time; and
    determine the acquisition time for each of the remaining images based on the reference image trigger time and the corresponding set of time stamps.
  8. The vehicle synchronization system of claim 7, wherein the set of time stamps includes a raw time and a shutter time.
  9. A method for synchronizing vehicle sensors, comprising:
    receiving a reference timing signal from a GPS receiver;
    receiving a vehicle speed signal from a bus;
    determining, using a synchronization control board, a triggering signal based on the reference timing signal and the vehicle speed signal; and
    triggering at least one sensor using the triggering signal for acquiring a plurality of images.
  10. The method of claim 9, wherein the reference timing signal is a PPS signal generated by the GPS receiver.
  11. The method of claim 9, further including using a rising pulse edge of the reference timing signal as a reference for synchronizing the triggering signal.
  12. The method of claim 9, wherein the at least one sensor includes a camera or a LiDAR.
  13. The method of claim 9, further including:
    receiving the reference timing signal from the synchronization control board;
    receiving, from the at least one sensor, the plurality of images and a set of time stamps corresponding to each image; and
    determining, using a processor, an acquisition time for each image based on the reference timing signal and the time stamps.
  14. The method of claim 13, wherein determining the acquisition time further includes:
    determining a reference image trigger time based on the reference timing signal;
    identifying a reference image, among the plurality of images, acquired at the reference image trigger time; and
    determining the acquisition time for each of the remaining images based on the reference image trigger time and the corresponding set of time stamps.
  15. A synchronization control board, comprising:
    at least one input pin configured to receive a reference timing signal and a vehicle speed signal;
    at least one output pin configured to provide a triggering signal to at least one sensor, wherein the triggering signal is determined based on the reference timing signal and the vehicle speed signal; and
    a feedback pin configured to provide the reference timing signal to a processor.
  16. The synchronization control board of claim 15, wherein the reference timing signal is a PPS signal generated by a GPS or an IMU.
  17. The synchronization control board of claim 15, wherein the reference timing signal includes pulses at a first frequency and the triggering signal includes pulses at a second frequency, wherein a ratio between the second frequency and the first frequency corresponds to the vehicle speed.
  18. The synchronization control board of claim 15, wherein the synchronization control board is configured to use a rising pulse edge of the reference timing signal as a reference for synchronizing the triggering signal.
  19. The synchronization control board of claim 15, wherein the vehicle speed signal is received from a CAN bus.
  20. The synchronization control board of claim 15, wherein the at least one sensor includes a camera or a LiDAR.