WO2020135382A1 - Multi-sensor synchronous timing system, method, device, and electronic apparatus - Google Patents
- Publication number
- WO2020135382A1 (PCT/CN2019/127719)
- Authority
- WO
- WIPO (PCT)
Classifications
- G—PHYSICS
- G04—HOROLOGY
- G04G—ELECTRONIC TIME-PIECES
- G04G7/00—Synchronisation
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04J—MULTIPLEX COMMUNICATION
- H04J3/00—Time-division multiplex systems
- H04J3/02—Details
- H04J3/06—Synchronising arrangements
- This application relates to the field of automation technology, and in particular to multi-sensor synchronous timing systems, methods and devices, robots, and electronic equipment.
- a multi-sensor fusion sensing system uses multiple sensors to detect and sense the surrounding environment.
- the synchronization of sensor data is a key factor in ensuring perception quality and performance: ideally, each sensor achieves high-precision synchronous data collection.
- Method 1: there is no strict time synchronization between the sensors; after each sensor collects data, the data is marked with an approximate timestamp according to the operating-system time. The advantage of this method is that timing is easy to implement.
- Method 2: use a high-precision clock signal source, such as a GPS, BeiDou, or NTP clock, to accurately time each sensor in the same system (such as an unmanned vehicle); use high-precision pulse signals to trigger the sensors to collect data, and use the trigger clock signal of the high-precision clock signal source as the timestamp of the data collected by each sensor.
- the advantage of this method is that each sensor can obtain high collection time accuracy.
- 1) Method 1 cannot perform strict spatio-temporal alignment of the raw data, which increases the difficulty of fusing the perception data; 2) with Method 2, sensors that cannot accept timestamps from the high-precision clock signal source cannot be synchronized in this way, and it also cannot be verified at the data level whether the acquisition times are strictly aligned.
- the prior art has the problem of low accuracy of multi-sensor synchronous timing.
- the present application provides a multi-sensor synchronous timing system to solve the problem of low accuracy of multi-sensor synchronous timing in the prior art.
- the present application additionally provides multi-sensor synchronous timing methods and devices, robots, and electronic equipment.
- This application provides a multi-sensor synchronous time service system, including:
- a robot configured to trigger multiple sensors to collect sensing data through a trigger signal derived from a first clock signal source, and to send a synchronous timing request to a server; and to receive, from the server, the time offset of the sensing data between the multiple sensors, and to perform time alignment on the sensing data of the multiple sensors according to the time offset;
- the server is configured to receive the synchronous timing request; determine the sensing data of the multiple sensors; determine the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between changes in the sensing data of the multiple sensors; determine the time offset according to the synchronization frame difference; and send the time offset back to the robot.
- This application also provides a multi-sensor synchronous timing method, including:
- triggering multiple sensors to collect sensing data through a trigger signal derived from a first clock signal source, and sending a synchronous timing request to a server; receiving the time offset of the sensing data between the multiple sensors returned by the server; and performing time alignment on the sensing data of the multiple sensors according to the time offset.
- This application also provides a multi-sensor synchronous timing device, including:
- a sensor trigger unit configured to trigger multiple sensors to collect sensing data through a trigger signal derived from the first clock signal source;
- a sensing data sending unit configured to send the synchronous timing request to the server;
- a time offset receiving unit configured to receive the time offset of the sensory data sent back by the server
- the time alignment unit is configured to perform time alignment on the sensing data of the multiple sensors according to the time offset.
- This application also provides a robot, including:
- the first clock signal source
- the memory is configured to store a program implementing the multi-sensor synchronous timing method. After the device is powered on and the program is run by the processor, the following steps are performed: triggering multiple sensors to collect sensing data through a trigger signal derived from the first clock signal source, and sending a synchronous timing request to the server; receiving the time offset of the sensing data between the multiple sensors returned by the server; and performing time alignment on the sensing data of the multiple sensors according to the time offset.
- This application also provides a multi-sensor synchronous timing method, including:
- receiving a synchronous timing request sent by a robot; determining the sensing data of multiple sensors of the robot; determining the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between changes in the sensing data of the multiple sensors; determining the time offset according to the synchronization frame difference; and sending the time offset back to the robot.
- This application also provides a multi-sensor synchronous timing device, including:
- a request receiving unit configured to receive the synchronous timing request sent by the robot;
- a sensing data determining unit configured to determine the sensing data of the multiple sensors of the robot;
- a synchronization frame difference determining unit configured to determine the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between the changes of the sensing data between the multiple sensors;
- a time offset determining unit configured to determine the time offset according to the synchronization frame difference
- the time offset return unit is used to return the time offset to the robot.
- This application also provides an electronic device, including:
- the memory is configured to store a program implementing the multi-sensor synchronous timing method. After the device is powered on and the program is run by the processor, the following steps are performed: receiving a synchronous timing request sent by a robot; determining the sensing data of the multiple sensors of the robot; determining the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between changes in the sensing data frames of the multiple sensors; determining the time offset according to the synchronization frame difference; and sending the time offset back to the robot.
- This application also provides a multi-sensor synchronous timing method, including:
- triggering multiple sensors to collect sensing data through a trigger signal derived from a first clock signal source; determining the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between changes in the sensing data of the multiple sensors; determining the time offset of the sensing data between the multiple sensors according to the synchronization frame difference; and performing time alignment on the sensing data of the multiple sensors according to the time offset.
- the correlation degree is determined according to relative transformation data of the multiple sensors; the relative transformation data includes relative transformation data between two adjacent frames of sensing data collected by the same sensor.
- the determining the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between the changes of the sensing data between the multiple sensors includes:
- the synchronization frame difference corresponding to the maximum value of the correlation degree is used as the sensing data synchronization frame difference.
- the determining the time offset of the sensing data between the multiple sensors according to the synchronization frame difference includes:
- the average of the time differences of multiple pairs of sensing data separated by the synchronization frame difference is used as the time offset.
- the determining the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between the changes of the sensing data between the multiple sensors includes:
- the synchronization frame difference between the first sensor and the second sensor is determined according to the correlation between changes in the sensing data frames of the first sensor and the second sensor; wherein the first sensor includes a sensor whose sensing data carries the first time, and the second sensor includes a sensor whose sensing data carries the second time.
- the method before determining the correlation between the frames of sensing data between the multiple sensors and determining the synchronization frame difference of the sensing data between the multiple sensors, the method further includes:
- the determining the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between the changes of the sensing data between the multiple sensors includes:
- the synchronization frame difference is determined according to the correlation between the preset number of multi-frame sensing data frames among the multiple sensors.
- Optionally, the method further includes:
- generating the trigger signal and the first time corresponding to the sensor's trigger moment.
- This application also provides a multi-sensor synchronous timing device, including:
- the sensor trigger and time distribution unit is configured to trigger multiple sensors to collect sensing data through the trigger signal derived from the first clock signal source;
- a synchronization frame difference determining unit configured to determine the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between the changes of the sensing data between the multiple sensors;
- a time offset determining unit configured to determine the time offset of the sensing data between the multiple sensors according to the synchronization frame difference
- the synchronization calibration unit is configured to perform time alignment on the sensing data of the multiple sensors according to the time offset.
- This application also provides a robot, including:
- the first clock signal source
- the memory is configured to store a program implementing the multi-sensor synchronous timing method. After the device is powered on and the program is run by the processor, the following steps are performed: triggering multiple sensors to collect sensing data through a trigger signal derived from the first clock signal source; determining the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between changes in the sensing data frames of the multiple sensors; determining the time offset of the sensing data between the multiple sensors according to the synchronization frame difference; and performing time alignment on the sensing data of the multiple sensors according to the time offset.
- the present application also provides a computer-readable storage medium having instructions stored therein, which when executed on a computer, causes the computer to execute the various methods described above.
- the present application also provides a computer program product including instructions, which, when run on a computer, causes the computer to execute the various methods described above.
- the multi-sensor synchronous time service system includes a robot and a server.
- the robot triggers multiple sensors to collect sensing data through a trigger signal derived from a first clock signal source, and sends a synchronous timing request to the server.
- according to the request, the server determines the sensing data of the multiple sensors; determines the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between changes in the sensing data frames of the multiple sensors; determines the time offset of the sensing data between the multiple sensors according to the synchronization frame difference; and sends the time offset to the robot. The robot then performs time alignment on the sensing data of the multiple sensors according to the time offset. This processing makes it possible to detect the accuracy of multi-sensor synchronization at the data level.
- when the timestamps of the multiple sensors' sensing data are found to be misaligned, calibration of the multi-sensor synchronous timing can be carried out; therefore, the accuracy of multi-sensor synchronous timing can be effectively improved, ensuring accurate timing of each sensor in the same system.
- FIG. 1 is a flowchart of an embodiment of a multi-sensor synchronous timing method provided by this application;
- FIG. 2 is a schematic diagram of an embodiment of a multi-sensor synchronous timing method provided by this application.
- FIG. 3 is a schematic diagram of an embodiment of a multi-sensor synchronous timing method provided by this application.
- FIG. 5 is a schematic diagram of an embodiment of a multi-sensor synchronous timing method provided by this application.
- FIG. 6 is a schematic structural diagram of an embodiment of a multi-sensor synchronous time service device provided by this application.
- FIG. 7 is a specific schematic diagram of an embodiment of a multi-sensor synchronous timing device provided by this application.
- FIG. 8 is a schematic diagram of an embodiment of a robot provided by the present application.
- FIG. 9 is a specific schematic diagram of an embodiment of a robot provided by this application.
- FIG. 10 is a schematic diagram of an embodiment of a multi-sensor synchronous time service system provided by this application.
- FIG. 11 is a flowchart of an embodiment of a multi-sensor synchronous timing method provided by this application.
- FIG. 12 is a schematic diagram of an embodiment of a multi-sensor synchronous timing device provided by this application;
- FIG. 13 is a schematic diagram of an embodiment of a robot provided by the present application.
- FIG. 15 is a schematic diagram of an embodiment of a multi-sensor synchronous timing device provided by this application;
- FIG. 16 is a schematic diagram of an embodiment of a robot provided by the present application.
- FIG. 1 is a flowchart of an embodiment of a multi-sensor synchronous timing method provided by this application.
- the execution subject of the method includes a multi-sensor synchronous timing device, which can be deployed in a robot system.
- a multi-sensor synchronous timing method provided by this application includes:
- Step S101 Trigger multiple sensors to collect sensing data through a trigger signal derived from a first clock signal source.
- the method provided in the embodiment of the present application can trigger multiple heterogeneous sensors in the robot system to perform data collection through the first clock signal source.
- the robot may be an unmanned vehicle, a cleaning robot, a medical robot, a military robot, a disabled robot, and so on.
- an unmanned vehicle is also called a wheeled robot.
- its sensors can include the following: image sensor (camera), ultrasonic radar, lidar, and millimeter wave radar, etc.
- each sensor is typically connected to its own processor (such as an MCU or ECU).
- the most advanced smart cars use more than a dozen sensors for automatic driving functions alone, for example 5 low-beam lidars plus 6 cameras and 3 millimeter-wave radars; in the fusion sensing system built from these sensors, the multiple sensors detect and sense the surrounding environment.
- clock (timer) synchronization between processors of multiple heterogeneous sensors in a robot system can be achieved.
- the first clock signal source includes a high-precision clock signal source, such as a GPS (Global Positioning System) clock, a BeiDou clock, an NTP (Network Time Protocol) clock, and so on.
- the GPS clock can output time information in a format meeting the required specifications, according to user needs, to complete synchronous timing.
- the main principle is to discipline a crystal oscillator with the signal of GPS or another satellite navigation system, thereby outputting high-precision frequency and time signals. This is currently the most effective way to achieve nanosecond-level timing accuracy and frequency stability on the order of 1E-12.
- GPS clocks are mainly divided into two types. One is the GPS time meter, which mainly outputs time-scale information, including 1PPS (one pulse per second) and TOD (Time of Day: year, month, day, hour, minute, second) information. The other is the GPS synchronized clock, which outputs highly stable frequency information obtained by using the satellite signal to discipline an OCXO or rubidium clock, together with a more stable locally recovered time-scale signal.
- a GPS synchronized clock mainly consists of the following parts: a GPS/GNSS receiver (GPS/GLONASS/BD/GALILEO, etc.), a high-precision OCXO or rubidium clock, a local synchronization calibration unit, a difference measurement unit, an error processing and control structure, and input/output components.
- the multi-sensor synchronous timing device can generate a high-precision clock pulse signal, such as PPS, through the high-precision clock signal source; a high-precision clock pulse trigger unit then generates a trigger signal at a fixed frequency (such as 50 Hz) or a custom frequency, together with a timestamp corresponding to the trigger moment; finally, through the sensor trigger and timing unit, the trigger signal triggers the sensors to collect sensing data along with the timestamp.
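The fixed-frequency trigger train described above can be sketched as follows. This is a minimal illustration, not part of the original disclosure; the function name and the example PPS epoch value are assumptions.

```python
def trigger_times(pps_epoch: float, freq_hz: float, n: int) -> list:
    """Generate n trigger timestamps at a fixed frequency, phase-aligned
    to a PPS (pulse-per-second) edge of the high-precision clock source."""
    period = 1.0 / freq_hz
    return [pps_epoch + i * period for i in range(n)]

# e.g. a 50 Hz trigger train starting at a PPS edge (epoch value is illustrative)
stamps = trigger_times(1543579200.0, 50.0, 5)
```

Each timestamp can then be handed to the sensor trigger and timing unit together with the trigger pulse, so that every collected frame carries a time derived from the first clock signal source.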
- the plurality of sensors include a sensor that can accept the time stamp of the first clock signal source, and a sensor that does not accept the time stamp of the first clock signal source.
- the sensor that accepts the time stamp of the first clock signal source is referred to as the first sensor
- the sensor that does not accept the time stamp of the first clock signal source is referred to as the second sensor.
- the first sensor includes, but is not limited to, a laser rangefinder (LiDAR), a millimeter-wave radar, and so on.
- the second sensor includes, but is not limited to, an image sensor (Camera, camera for short), ultrasonic radar, and the like.
- the method provided by the embodiments of the present application triggers multiple heterogeneous sensors (including the first sensor and the second sensor) in the robot system to collect data through the first clock signal source, and uses the trigger clock signal of the first clock signal source as the timestamp of the sensing data collected by the first sensor, thereby enabling the first sensor's sensing data to obtain high collection-time accuracy and achieving accurate timing of the first sensor.
- for the second sensor, a timestamp can be assigned to its sensing data according to a second clock signal source.
- accordingly, the clocks between the processors of the second sensors need to be synchronized.
- the time stamp (data collection time information) corresponding to the sensing data includes the first time or the second time.
- the first time refers to the time corresponding to the sensor's trigger moment, determined according to the first clock signal source;
- this time is a timestamp To assigned directly by the timing unit; since the timestamp is provided by the timing unit, the timestamps are consistent.
- the second time includes the time of the second clock signal source.
- this time is a timestamp Ts assigned by a second clock signal source other than the timing unit.
- the second clock signal source, also known as a third-party clock signal source, may be the operating system clock of the robot (the system clock for short), and so on.
- the system clock can be a high-performance processor clock system based on the CMOS process, and is a circuit composed of an oscillator (signal source), a timing wake-up device, and a frequency divider. Commonly used signal sources are crystal oscillators and RC oscillators.
- an unmanned vehicle relies mainly on an in-vehicle, computer-based intelligent driving system to achieve unmanned driving; the clock signal of that computer system's operating system can be used as the second clock signal source.
- the robot's operating system can be Ubuntu, ROS, an Android operating system, and so on.
- in step S101, multiple sensors are triggered by a trigger signal derived from the first clock signal source to collect sensing data; the method can then proceed to the next step: determining the synchronization frame difference of the sensing data among the multiple sensors according to the correlation between changes in the sensing data of the sensors.
- Step S103 Determine the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between the changes of the sensing data between the multiple sensors.
- the method provided in this application is designed based on the following principle: assuming a rigid connection between multiple sensors, if each sensor is strictly synchronized, each sensor has the same receptive field, and the surrounding environment is stationary, then when all sensors move together, their sensing data also changes together, and when all sensors stop moving, their sensing data also stops changing. If the synchronization between the sensors has a relative time-delay offset, the corresponding changes in the sensing data will have the same offset. Based on this principle, the method provided by the present application uses the relative change of each sensor's data between successive moments to achieve online synchronous timing verification and synchronous calibration between different sensors.
- the rigid connection refers to connecting the sensors through a structure that does not deform, ensuring that the relative change between consecutively collected frames is due to the overall motion rather than relative motion between the sensors.
- a robot includes a sensor A and a sensor B.
- A and B may be the same type or different types of sensors.
- A is a lidar.
- B is a camera.
- A and B are rigidly connected.
- the surrounding environment is stationary or partially stationary.
- A and B can be triggered by time pulses of the same frequency or different frequencies.
- the time series of sensor A's sensing data changes is "Va1, Va2, Va3, ..., VaN", with corresponding timestamps "Ta1, Ta2, Ta3, ..., TaN";
- the time series of sensor B's sensing data changes is "Vb1, Vb2, Vb3, ..., VbN", with corresponding timestamps "Tb1, Tb2, Tb3, ..., TbN".
- one curve represents the frame-to-frame change of the lidar's sensing data over time, and the other curve represents the frame-to-frame change of the camera's sensing data over time.
- FIG. 3 shows the time-series change curves of the lidar and camera data actually collected in this embodiment. As can be seen from FIG. 3, the times corresponding to the peaks of the two sensors' sensing data are basically the same, as are the times corresponding to the valleys; it can be seen that the two sensors have basically achieved synchronous timing.
- the inventors of the technical solution of the present application found that the sensing data of the two sensors exhibit a strong correlation between frame-to-frame changes over time, and that the relative time offset between the two sensors can be obtained through this correlation.
- based on this technical concept, the method provided in the embodiments of the present application determines the correlation degree (also called the correlation coefficient) between the frame-to-frame changes of the two sensors' data, based on the multi-frame sensing data collected by each of the two sensors, and expresses the correlation as a function of the synchronization frame difference between the two sensors.
- the correlation can be determined based on the relative transformation data of the multiple sensors.
- the relative transformation data includes relative transformation data between two adjacent frames of sensing data collected by the same sensor.
- the synchronization frame difference is the difference in the frame numbers of the sensing data collected by the two sensors at the same moment. For example, at 2018/11/30 12:00:00, the lidar and the camera are triggered simultaneously to collect sensing data.
- the timestamp of the lidar's sensing data comes from the GPS clock, namely 2018/11/30 12:00:00, while the timestamp of the camera's sensing data comes from the operating system, namely 2018/11/30 12:00:01.
- assume the lidar collects a total of 10 frames of point cloud data in the one-second period from 2018/11/30 12:00:00 to 2018/11/30 12:00:01, so that the timestamp of the first frame is 2018/11/30 12:00:00 and the timestamp of the 10th frame is 2018/11/30 12:00:01. Since the camera data collected at 2018/11/30 12:00:00 carries the timestamp 2018/11/30 12:00:01, the synchronization frame difference between the lidar and the camera is 9.
- the correlation coefficient between the two sensors can be expressed as the following formula:
- Corr(k) = Σ_i Va(i−k) · Vb(i)
- k is the synchronization frame difference;
- Va(i−k) is sensor A's sensing data change of frame i−k;
- Vb(i) is sensor B's sensing data change of frame i.
- step S103 may include the following sub-steps:
- Step S1031 For multiple synchronization frame differences, obtain the correlation degree corresponding to the synchronization frame differences.
- the multiple synchronization frame differences may be 1 frame, 2 frames...10 frames, and so on. According to the above formula, the correlation degree corresponding to each synchronization frame difference can be calculated.
- Step S1032 Use the synchronization frame difference corresponding to the maximum value of the correlation degree as the sensing data synchronization frame difference.
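Steps S1031 and S1032 can be sketched as follows. This is an illustrative sketch rather than the disclosed implementation: the function name, the candidate range `max_k`, and the use of a plain dot product over the overlapping frames for Corr(k) are assumptions.

```python
import numpy as np

def sync_frame_difference(va, vb, max_k=10):
    """Steps S1031/S1032: evaluate Corr(k) = sum_i Va(i-k) * Vb(i) for each
    candidate synchronization frame difference k, and return the k whose
    correlation is maximal."""
    va = np.asarray(va, dtype=float)
    vb = np.asarray(vb, dtype=float)
    best_k, best_corr = 0, float("-inf")
    for k in range(max_k + 1):
        # overlap va[0 .. N-k-1] with vb[k .. N-1], i.e. Va(i-k) against Vb(i)
        corr = float(np.dot(va[: len(va) - k], vb[k:]))
        if corr > best_corr:
            best_k, best_corr = k, corr
    return best_k

# sensor B's change sequence is sensor A's shifted by 3 frames
va = [0, 0, 1, 5, 1, 0, 0, 0, 0, 0, 0, 0]
vb = [0, 0, 0, 0, 0, 1, 5, 1, 0, 0, 0, 0]
```

With these sequences the maximum of Corr(k) is reached at k = 3, which is the frame shift between the two sequences.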
- the number of frames of sensing data to use may be determined according to the scene in which the sensors collect data. For example, the synchronization frame difference may be solved based on 100 frames of sensing data; the more frames, the more pronounced the correlation peak.
- the above formula assumes that the inter-frame time difference is the same for the different sensors, that is, the time difference between Va(i+1) and Va(i) is the same as the time difference between Vb(i+1) and Vb(i), and is a fixed value.
- otherwise, sequences Va' and Vb' with the same time difference can be generated by interpolation or other methods, to ensure that the above correlation formula for the sensors holds.
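The interpolation step mentioned above can be sketched as follows, using linear interpolation onto one fixed-rate time grid. The function name and the common-grid rate are illustrative assumptions.

```python
import numpy as np

def resample_to_common_grid(ta, va, tb, vb, rate_hz=50.0):
    """Resample two change sequences whose timestamps differ in rate or
    jitter onto one fixed-rate time grid, so that the inter-frame time
    difference is the same for both and the correlation formula applies."""
    t0 = max(ta[0], tb[0])    # start of the overlap window
    t1 = min(ta[-1], tb[-1])  # end of the overlap window
    grid = np.arange(t0, t1, 1.0 / rate_hz)
    return grid, np.interp(grid, ta, va), np.interp(grid, tb, vb)
```

The resampled sequences Va' and Vb' can then be fed directly into the correlation search over candidate synchronization frame differences.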
- the method provided in this application usually only needs to be enabled when time-offset calibration of the robot's multiple sensors is required, for example for newly installed equipment, for equipment that has not been calibrated for a long time, or when, while the perception system is working, it is found that the sensing data of different sensors cannot be aligned with each other in real time.
- before step S103, it may be determined whether the time elapsed since the last synchronization calibration reaches a duration threshold; if so, step S103 is entered.
- the duration threshold may be determined according to business requirements, for example, set to one week, that is, the method provided in this application is executed once a week.
- before step S103, the following step may also be included: if the sensing data of the multiple sensors cannot be aligned when projected onto one another, step S103 is entered.
- for example, the time offset is found to be 0.5 ms at time tx, after which the operating system synchronizes, according to this value, the timing of all sensors that cannot be time-stamped by the GPS clock signal source; however, one hour later it is found that the sensing data of the multiple sensors can no longer be aligned with each other, so calibration is started again.
- the time offset is updated to 1 ms, and the multiple sensors are then synchronized based on this new value.
- this processing makes it possible to detect at the data level whether the collection times of the multiple sensors' sensing data are strictly aligned; therefore, the accuracy of multi-sensor synchronous timing can be effectively improved.
- the multi-sensor synchronous timing device combines the robot's multiple sensors pairwise and calculates a time offset for each pair of sensors.
- a pair consisting of one first sensor and one second sensor may be used to calculate the time offset between the second sensor and the first sensor; alternatively, a pair of two second sensors may be used to calculate the time offset between the two second sensors, in which case the time offset between one of the second sensors and the first sensor must also be calculated, from which the time offset between the other second sensor and the first sensor is derived.
- the next step may be entered to determine the time offset according to the synchronization frame difference.
- Step S105 Determine the time offset of the sensing data between the multiple sensors according to the synchronization frame difference.
- the time offset can be determined according to the time stamps corresponding to the two frames of data with the synchronous frame difference.
- the time stamp corresponding to each of the two frames of data with the synchronization frame difference can be taken, and the difference between the two time stamps can be used as the time offset.
- for example, if the synchronization frame difference between the lidar and the camera is 9, one may take the first sensing data of frame i of the lidar and the second sensing data of frame i+9 of the camera, and use the difference between the timestamps of the two frames as the time offset. If the timestamp difference between the two frames is 1 second, the time offset is 1 second.
- the average of the time differences of multiple pairs of sensing data separated by the synchronization frame difference is used as the time offset.
- the reason for this processing is that, even at a fixed trigger frequency, the per-frame times exhibit slight disturbances because each sensor computes its timestamps differently.
- for example, due to its calculation method, the timestamp of the laser point cloud has slight disturbances, with successive frames disturbed on the order of 0.001 seconds.
- therefore, the average of multiple timestamp differences between frame pairs is used as the time offset; this processing can effectively improve the accuracy of the time offset.
- for example, if the synchronization frame difference between the lidar and the camera is 9, take the first sensing data of frame i and the second sensing data of frame i+9, with a time difference of, say, 1 second; the first sensing data of frame i+1 and the second sensing data of frame i+1+9, with a time difference of, say, 1.001 seconds; ...; the first sensing data of frame i+50 and the second sensing data of frame i+50+9, with a time difference of, say, 0.999 seconds; the average of all these timestamp differences, namely (1 + 1.001 + ... + 0.999)/50, is used as the time offset.
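The averaging in this example can be sketched as follows. The function name and the sign convention (second-sensor timestamp minus first-sensor timestamp) are assumptions for illustration.

```python
def average_time_offset(ts_a, ts_b, k):
    """Step S105: pair frame i of sensor A with frame i+k of sensor B
    (k = synchronization frame difference) and average the per-pair
    timestamp differences to suppress per-frame timestamp jitter."""
    diffs = [tb - ta for ta, tb in zip(ts_a, ts_b[k:])]
    return sum(diffs) / len(diffs)

# frame difference k=1, sensor B lagging sensor A by roughly one second
offset = average_time_offset([0.0, 0.1, 0.2], [0.0, 1.0, 1.101, 1.199], 1)
```

Here the individual pair differences (1.0, 1.001, 0.999 seconds) average out to a time offset of about 1 second, as in the example above.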
- Step S107: Perform time alignment on the sensing data of the multiple sensors according to the time offset.
- Based on the time offset, time alignment can be performed on the sensing data of the multiple sensors.
- Specifically, when a first sensor collects sensing data, the timestamp of the sensing data may be set to the GPS time corresponding to the moment the sensor is triggered; when a second sensor collects sensing data, the timestamp is set to the operating-system time minus the time offset, thereby achieving synchronized timing across the multiple sensors.
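The two timestamping rules just described can be sketched as follows. This is an illustrative sketch with hypothetical names; the `"first"`/`"second"` labels simply distinguish the two sensor kinds described above.

```python
def align_timestamp(sensor_kind, gps_trigger_time, os_time, time_offset):
    """Assign a synchronized timestamp per the rules above.

    First-kind sensors (e.g. lidar) take the GPS time of the trigger
    moment directly; second-kind sensors (e.g. camera) take the
    operating-system time minus the estimated time offset.
    """
    if sensor_kind == "first":
        return gps_trigger_time
    return os_time - time_offset
```

Both branches then report the same time base: if the OS clock lags GPS by the estimated offset, a first-kind and a second-kind frame captured together receive equal timestamps.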
- In summary, the multi-sensor synchronous time service method triggers multiple sensors to collect sensing data through a trigger signal derived from a first clock signal source;
- the multiple sensors include at least one first sensor and at least one second sensor;
- the timestamp of the first sensing data collected by the first sensor is the first time, and the timestamp of the second sensing data collected by the second sensor is the second time;
- the first time refers to the time corresponding to the sensor trigger moment determined according to the first clock signal source, and the second time is determined according to a second clock signal source;
- the method determines the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between frame-to-frame changes of the sensing data, determines the time offset between the first time and the second time according to the synchronization frame difference, and performs time alignment on the sensing data of the multiple sensors according to the time offset;
- this processing makes it possible to detect the accuracy of multi-sensor synchronous timing at the data level, and to calibrate the synchronous timing of the multiple sensors when the data timestamps are detected to be misaligned.
- In the above embodiment, a multi-sensor synchronous time service method is provided; correspondingly, the present application also provides a multi-sensor synchronous time service device. The device corresponds to the embodiment of the above method.
- FIG. 6 is a schematic diagram of an embodiment of a multi-sensor synchronous timing device of the present application. Since the device embodiment is basically similar to the method embodiment, the description is relatively simple, and the relevant part can be referred to the description of the method embodiment. The device embodiments described below are only schematic.
- This application additionally provides a multi-sensor synchronous timing device, including:
- the sensor trigger unit 601 is configured to trigger a plurality of sensors to collect sensory data through a trigger signal derived from a first clock signal source;
- the synchronization frame difference determining unit 602 is configured to determine the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between the changes of the sensing data between the multiple sensors;
- the time offset determining unit 603 is configured to determine the time offset of the sensing data between the multiple sensors according to the synchronization frame difference;
- the synchronization calibration unit 604 is configured to perform time alignment on the sensing data of the multiple sensors according to the time offset.
- the correlation degree is determined according to relative transformation data of the multiple sensors; the relative transformation data includes relative transformation data between two adjacent frames of sensing data collected by the sensor.
- the synchronization frame difference determining unit 602 includes:
- a correlation degree obtaining subunit, configured to obtain, for multiple candidate synchronization frame differences, the correlation degree corresponding to each candidate;
- the synchronization frame difference determining subunit is configured to use the synchronization frame difference corresponding to the maximum value of the correlation degree as the synchronization frame difference of the sensing data.
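The selection performed by this subunit, taking the candidate with maximal correlation, reduces to an argmax. A minimal sketch, assuming the correlation scores have already been computed elsewhere (names are illustrative):

```python
def best_frame_diff(corr_by_diff):
    """Pick the candidate synchronization frame difference whose
    correlation degree is maximal.

    corr_by_diff: dict mapping candidate frame differences to
    correlation scores of the frame-change sequences.
    """
    return max(corr_by_diff, key=corr_by_diff.get)
```

For example, with scores `{8: 0.4, 9: 0.95, 10: 0.6}` the subunit would select 9 as the synchronization frame difference.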
- the time offset determining unit 603 is specifically configured to use the average of the time differences of multiple pairs of sensing data frames separated by the synchronization frame difference as the time offset.
- the synchronization frame difference determining unit 602 is specifically configured to, for each pairwise combination of a first sensor and a second sensor among the multiple sensors, determine the synchronization frame difference of the sensing data between the first sensor and the second sensor according to the correlation between frame-to-frame changes of their sensing data; wherein the first sensor is a sensor whose sensing data carries the first time, and the second sensor is a sensor whose sensing data carries the second time.
- FIG. 7 is a detailed schematic diagram of an embodiment of the multi-sensor synchronous timing device of the present application.
- Optionally, the device further includes:
- a first judging unit 701, used to judge whether the time elapsed since the last synchronous timing calibration reaches a duration threshold; if so, the synchronization frame difference determining unit 602 is started.
- Optionally, the device further includes:
- a second judging unit, used to judge whether the sensing data of the multiple sensors cannot be aligned when projected onto one another; if so, the synchronization frame difference determining unit 602 is started.
- the synchronization frame difference determining unit 602 is specifically configured to determine the synchronization frame difference according to a correlation of a preset number of multi-frame sensing data frame changes among the multiple sensors.
- Optionally, the device further includes:
- a pulse signal generating unit, configured to generate a first clock pulse signal according to the first clock signal source;
- a trigger signal and first time generating unit, configured to generate, according to the first clock pulse signal, the trigger signal and the first time corresponding to the sensor trigger moment.
- In summary, the multi-sensor synchronous timing device triggers multiple sensors to collect sensing data through a trigger signal derived from a first clock signal source;
- the multiple sensors include at least one first sensor and at least one second sensor;
- the timestamp of the first sensing data collected by the first sensor is the first time, and the timestamp of the second sensing data collected by the second sensor is the second time;
- the first time refers to the time corresponding to the sensor trigger moment determined according to the first clock signal source, and the second time is determined according to a second clock signal source;
- the device determines the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between frame-to-frame changes of the sensing data, determines the time offset between the first time and the second time according to the synchronization frame difference, and performs time alignment on the sensing data of the multiple sensors according to the time offset;
- this processing makes it possible to detect the accuracy of multi-sensor synchronous timing at the data level, and to calibrate the synchronous timing of the multiple sensors when the data timestamps are detected to be misaligned.
- FIG. 8 is a schematic diagram of an embodiment of the robot of the present application. Since the device embodiment is basically similar to the method embodiment, the description is relatively simple, and the relevant part can be referred to the description of the method embodiment. The device embodiments described below are only schematic.
- The robot of this embodiment includes: a first clock signal source 801, multiple sensors 802, a processor 803, and a memory 804.
- The memory is used to store a program implementing the multi-sensor synchronous time service method. After the device is powered on and the program is run by the processor, the following steps are performed: triggering multiple sensors to collect sensing data through a trigger signal derived from the first clock signal source; determining the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between frame-to-frame changes of the sensing data; determining the time offset of the sensing data between the multiple sensors according to the synchronization frame difference; and performing time alignment on the sensing data of the multiple sensors according to the time offset.
- The robot may be an unmanned vehicle, a cleaning robot, a medical robot, a military robot, an assistive robot for the disabled, and so on.
- In a specific example, the robot of this embodiment is an unmanned vehicle 900 that uses a GPS clock as the first clock signal source; the clock receives satellite signals transmitted by the GPS satellite system 999.
- components coupled to or included in unmanned vehicle 900 may include propulsion system 902, sensor system 904, control system 906, peripherals 908, power supply 910, computing device 911, and user interface 912.
- the computing device 911 may include a processor 913 and a memory 914.
- the computing device 911 may be a controller or part of the controller of the unmanned vehicle 900.
- the memory 914 may include instructions 916 executable by the processor 913, and may also store map data 915.
- The processor 913 may be the main processor chip (vehicle CPU) of the in-vehicle computer; it is the core component of the unmanned vehicle 900 and is responsible for computation, storage, and control.
- the map data 915 can be downloaded from the server through the network, and the program instructions 916 include the program for implementing the multi-sensor synchronous time service method.
- The sensor system 904 of the unmanned vehicle 900 may include the following sensors: a camera 934, an ultrasonic radar 936, a lidar (LIDAR) 932, a millimeter-wave radar (RADAR) 930, a global positioning system module 926, an inertial measurement unit 928, and so on; the global positioning system module 926 can serve as the GPS clock.
- the processor 913 executes the program for implementing the multi-sensor synchronous timing method, so that the synchronous timing of the multi-sensor is performed, and the sensor data of the synchronous timing of the multi-sensor is fused by the sensor fusion algorithm 944.
- The propulsion system 902 may include at least one of an engine/motor 918, an energy source 920, a transmission 922, and wheels/tires 924.
- the control system 906 may include at least one of steering 938, throttle 940, brake 942, sensor fusion algorithm 944, computer vision system 946, navigation/route control system 948, obstacle avoidance system 950.
- the peripheral device 908 may include at least one of a wireless communication system 952, a touch screen 954, a microphone 956, and a speaker 958.
- The robot provided by the embodiment of the present application triggers, through a trigger signal derived from a first clock signal source, the multiple sensors it includes to collect sensing data; the timestamp of the sensing data is either a first time or a second time, where the first time refers to the time corresponding to the sensor trigger moment determined according to the first clock signal source, and the second time is determined according to a second clock signal source.
- The robot determines the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between frame-to-frame changes of the sensing data, determines the time offset between the first time and the second time according to the synchronization frame difference, and performs time alignment on the sensing data of the multiple sensors according to the time offset; this processing makes it possible to detect the accuracy of multi-sensor synchronous timing at the data level.
- When the data timestamps of the various sensing data are found to be misaligned, calibration of the sensors' synchronous timing can be achieved; therefore, the accuracy of synchronous timing can be effectively improved.
- FIG. 10 is a schematic structural diagram of an embodiment of a multi-sensor synchronous timing system of the present application. Since the system embodiment is basically similar to the method embodiment, the description is relatively simple, and for relevant parts reference may be made to the description of the method embodiment. The system embodiments described below are only schematic.
- a multi-sensor synchronous time service system in this embodiment includes: a robot 1001 and a server 1002.
- the robot 1001 is configured to trigger a plurality of sensors to collect sensing data through a trigger signal derived from a first clock signal source, and send a synchronous timing request to the server 1002; and receive the plurality of sensors sent back by the server The time offset of the perceptual data, and according to the time offset, perform time alignment on the perceptual data of the multiple sensors.
- the synchronous time service request may include the sensing data of the multiple sensors.
- the plurality of sensors include at least one first sensor and at least one second sensor, the time of the first perception data collected by the first sensor includes the first time, and the time of the second perception data collected by the second sensor
- the time includes a second time, and the first time refers to a time corresponding to the triggering moment of the sensor determined according to the first clock signal source, and the second time includes a time determined according to the second clock signal source.
- The robot 1001 is also used to judge whether the time elapsed since the last synchronous timing calibration reaches the duration threshold; if so, it sends a synchronous timing request to the server 1002.
- the synchronous timing service request may include the sensing data of a preset frame number of each sensor.
- the preset number of frames may be determined according to business requirements, such as setting to 100 frames and so on.
- The robot 1001 is further configured to send a synchronous timing request to the server 1002 if the sensing data of the multiple sensors cannot be aligned when projected onto one another.
- The server 1002 is configured to receive the synchronous timing request, determine the sensing data of the multiple sensors, determine the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between frame-to-frame changes of the sensing data, determine the time offset according to the synchronization frame difference, and send the time offset back to the robot 1001.
- the server 1002 may be specifically configured to acquire the sensing data of the multiple sensors according to the synchronization timing request.
- In summary, the multi-sensor synchronous time service system includes a robot and a server.
- The robot triggers multiple sensors to collect sensing data through a trigger signal derived from a first clock signal source and sends a synchronous timing request to the server.
- The server determines the sensing data of the multiple sensors, determines the synchronization frame difference of the sensing data between the sensors according to the correlation between frame-to-frame changes of the sensing data, determines the time offset of the sensing data between the sensors according to the synchronization frame difference, and sends the time offset back to the robot.
- The robot performs time alignment on the sensing data of the multiple sensors according to the time offset; this processing makes it possible to detect the accuracy of multi-sensor synchronous timing at the data level and to calibrate it when misalignment is detected.
- FIG. 11 is a flowchart of an embodiment of a multi-sensor synchronous timing method of the present application. Since the method embodiment is basically similar to the system embodiment, the description is relatively simple, and the relevant part can be referred to the description of the system embodiment. The method embodiments described below are only schematic.
- Step S1101 trigger a plurality of sensors to collect sensing data through a trigger signal derived from the first clock signal source;
- Step S1103 send a synchronous time service request to the server
- Step S1105 Receive the time offset of the sensing data between the multiple sensors returned by the server;
- Step S1107 Perform time alignment on the sensing data of the multiple sensors according to the time offset.
- In summary, the multi-sensor synchronous timing method triggers multiple sensors to collect sensing data through a trigger signal derived from a first clock signal source and sends a synchronous timing request to the server, so that the server can determine the time offset and
- send it back to the robot; the robot then performs time alignment on the sensing data of the multiple sensors according to the time offset. This processing makes it possible to detect the accuracy of multi-sensor synchronous timing at the data level.
- In the above embodiment, a multi-sensor synchronous time service method is provided; correspondingly, the present application also provides a multi-sensor synchronous time service device. The device corresponds to the embodiment of the above method.
- FIG. 12 is a schematic diagram of an embodiment of a multi-sensor synchronous timing device of the present application. Since the device embodiment is basically similar to the method embodiment, the description is relatively simple, and the relevant part can be referred to the description of the method embodiment. The device embodiments described below are only schematic.
- This application additionally provides a multi-sensor synchronous timing device, including:
- the sensor trigger unit 1201 is configured to trigger multiple sensors to collect sensory data through a trigger signal derived from the first clock signal source;
- the sensing data sending unit 1203 is used to send a synchronous time service request to the server;
- a time offset receiving unit 1205, configured to receive the time offset of the sensory data sent back by the server
- the time alignment unit 1207 is configured to perform time alignment on the sensing data of the multiple sensors according to the time offset.
- FIG. 13 is a schematic diagram of an embodiment of the robot of the present application. Since the device embodiment is basically similar to the method embodiment, the description is relatively simple, and the relevant part can be referred to the description of the method embodiment. The device embodiments described below are only schematic.
- The robot includes: a first clock signal source 1301, multiple sensors 1302, a processor 1303, and a memory 1304. The memory is used to store a program implementing the multi-sensor synchronous timing method. After the device is powered on and the program is run by the processor, the following steps are performed: triggering multiple sensors to collect sensing data through a trigger signal derived from the first clock signal source and sending a synchronous timing request to the server; and receiving the time offset of the sensing data between the multiple sensors returned by the server, and performing time alignment on the sensing data of the multiple sensors according to the time offset.
- FIG. 14 is a flowchart of an embodiment of a multi-sensor synchronous timing method of the present application. Since the method embodiment is basically similar to the system embodiment, the description is relatively simple, and the relevant part can be referred to the description of the system embodiment. The method embodiments described below are only schematic.
- Step S1401 Receive the synchronous timing request sent by the robot
- Step S1403 Determine the sensing data of the multiple sensors of the robot
- Step S1405 Determine the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between the changes of the sensing data between the multiple sensors;
- Step S1407 Determine the time offset according to the synchronization frame difference
- Step S1409 Send the time offset back to the robot.
- In summary, the multi-sensor synchronous timing method receives, through the server, the synchronous timing request sent by the robot, determines the sensing data of the robot's multiple sensors, determines the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between frame-to-frame changes of the sensing data, determines the time offset of the sensing data between the multiple sensors according to the synchronization frame difference, and sends the time offset back to the robot, so that the robot performs time alignment on the sensing data of the multiple sensors according to the time offset. This processing makes it possible to detect the accuracy of multi-sensor synchronous timing at the data level.
- When the timestamps of the various sensing data are not aligned, calibration of the sensors' synchronous timing can be achieved; therefore, the accuracy of multi-sensor synchronous timing can be effectively improved, ensuring precise timing for each sensor in the same system.
- a multi-sensor synchronous time service method is provided.
- the present application also provides a multi-sensor synchronous time service device. This device corresponds to the embodiment of the above method.
- FIG. 15 is a schematic diagram of an embodiment of a multi-sensor synchronous timing device of the present application. Since the device embodiment is basically similar to the method embodiment, the description is relatively simple, and the relevant part can be referred to the description of the method embodiment. The device embodiments described below are only schematic.
- This application additionally provides a multi-sensor synchronous timing device, including:
- the request receiving unit 1501 is used to receive the synchronous time service request sent by the robot;
- a perceptual data determining unit 1503, used to determine the perceptual data of the multiple sensors of the robot;
- the synchronization frame difference determining unit 1505 is configured to determine the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between the changes of the sensing data between the multiple sensors;
- a time offset determining unit 1507 configured to determine the time offset according to the synchronization frame difference
- the time offset return unit 1509 is used to return the time offset to the robot.
- FIG. 16 is a schematic diagram of an embodiment of an electronic device of the present application. Since the device embodiment is basically similar to the method embodiment, the description is relatively simple, and the relevant part can be referred to the description of the method embodiment. The device embodiments described below are only schematic.
- The electronic device includes: a processor 1601 and a memory 1602. The memory is used to store a program implementing the multi-sensor synchronous timing method. After the device is powered on and the program is run by the processor,
- the following steps are performed: receiving the synchronous timing request sent by the robot; determining the sensing data of the robot's multiple sensors; determining the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between frame-to-frame changes of the sensing data; determining the time offset according to the synchronization frame difference; and sending the time offset back to the robot.
- the computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
- the memory may include non-permanent memory, random access memory (RAM) and/or non-volatile memory in a computer-readable medium, such as read only memory (ROM) or flash memory (flash RAM). Memory is an example of computer-readable media.
- Computer-readable media, including permanent and non-permanent, removable and non-removable media, can implement information storage by any method or technology.
- the information may be computer readable instructions, data structures, modules of programs, or other data.
- Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic tape cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, and any other non-transmission media that can be used to store information accessible by computing devices.
- As defined herein, computer-readable media does not include transitory media, such as modulated data signals and carrier waves.
- the embodiments of the present application may be provided as methods, systems, or computer program products. Therefore, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Moreover, the present application may take the form of a computer program product implemented on one or more computer usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer usable program code.
Abstract
A multi-sensor synchronous time service system, method, apparatus, and electronic device. A robot (1001) of the system triggers multiple sensors to collect sensing data through a trigger signal derived from a first clock signal source, and sends a synchronous time service request to a server (1002). According to the request, the server (1002) determines the sensing data of the multiple sensors, determines the synchronization frame difference of the sensing data between the sensors according to the correlation between frame-to-frame changes of their sensing data, determines the time offset of the sensing data between the sensors according to the synchronization frame difference, and returns the time offset to the robot (1001). The robot (1001) performs time alignment on the sensing data of the multiple sensors according to the time offset. This processing makes it possible to detect the accuracy of multi-sensor synchronous timing at the data level and to calibrate the synchronous timing of the multiple sensors; therefore, the accuracy of multi-sensor synchronous timing can be effectively improved, ensuring precise timing for each sensor.
Description
This application claims priority to Chinese Patent Application No. 201811647921.1, filed on December 29, 2018 and entitled "Multi-Sensor Synchronous Time Service System, Method, Apparatus, and Electronic Device", the entire contents of which are incorporated herein by reference.
This application relates to the field of automation technology, and in particular to multi-sensor synchronous time service systems, methods, and apparatuses, robots, and electronic devices.
In fields such as autonomous driving and robotics, machine perception is an essential component: a multi-sensor fusion perception system uses multiple sensors to detect and perceive the surrounding environment. Synchronization of the data from the individual sensors is a key link in guaranteeing perception quality and performance. Through synchronized triggering by a clock signal source, the sensors can achieve high-precision synchronized data collection.
At present, two multi-sensor synchronous timing methods are commonly used. In the first method, no strict time synchronization is performed between sensors; after each sensor collects its data, an approximate timestamp is assigned to the sensing data according to the operating-system time. This method has the advantage of being easy to implement. In the second method, a high-precision clock signal source such as GPS, BeiDou, or NTP accurately times each sensor within the same system (such as an unmanned vehicle); high-precision pulse signals trigger the sensors to collect data, and the trigger clock signal of the high-precision clock source serves as the timestamp of the collected data. The advantage of this method is that each sensor obtains high collection-time accuracy.
However, in the course of implementing the present invention, the inventors found at least the following problems in the above technical solutions: 1) the first method cannot perform strict spatio-temporal alignment of the raw data, which increases the difficulty of fusing the sensing data; 2) the second method cannot synchronize sensors that are unable to accept timestamps from the high-precision clock signal source, nor can it verify at the data level whether collection times are strictly aligned. In summary, the prior art suffers from low accuracy of multi-sensor synchronous timing.
Summary of the Invention
The present application provides a multi-sensor synchronous time service system to solve the problem of low multi-sensor synchronous timing accuracy in the prior art. The present application additionally provides multi-sensor synchronous time service methods and apparatuses, a robot, and an electronic device.
The present application provides a multi-sensor synchronous time service system, including:
a robot, configured to trigger multiple sensors to collect sensing data through a trigger signal derived from a first clock signal source, and send a synchronous time service request to the server; and to receive the time offset of the sensing data between the multiple sensors returned by the server, and perform time alignment on the sensing data of the multiple sensors according to the time offset;
a server, configured to receive the synchronous time service request, determine the sensing data of the multiple sensors, determine the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between frame-to-frame changes of the sensing data, determine the time offset according to the synchronization frame difference, and return the time offset to the robot.
The present application further provides a multi-sensor synchronous time service method, including:
triggering multiple sensors to collect sensing data through a trigger signal derived from a first clock signal source;
sending a synchronous time service request to a server;
receiving the time offset of the sensing data between the multiple sensors returned by the server;
performing time alignment on the sensing data of the multiple sensors according to the time offset.
The present application further provides a multi-sensor synchronous time service apparatus, including:
a sensor trigger unit, configured to trigger multiple sensors to collect sensing data through a trigger signal derived from a first clock signal source;
a sensing data sending unit, configured to send a synchronous time service request to a server;
a time offset receiving unit, configured to receive the time offset of the sensing data between the multiple sensors returned by the server;
a time alignment unit, configured to perform time alignment on the sensing data of the multiple sensors according to the time offset.
The present application further provides a robot, including:
multiple sensors;
a first clock signal source;
a processor; and
a memory for storing a program implementing the multi-sensor synchronous time service method. After the device is powered on and the program is run by the processor, the following steps are performed: triggering multiple sensors to collect sensing data through a trigger signal derived from the first clock signal source, and sending a synchronous time service request to a server; and receiving the time offset of the sensing data between the multiple sensors returned by the server, and performing time alignment on the sensing data of the multiple sensors according to the time offset.
The present application further provides a multi-sensor synchronous time service method, including:
receiving a synchronous time service request sent by a robot;
determining the sensing data of multiple sensors of the robot;
determining the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between frame-to-frame changes of the sensing data;
determining the time offset according to the synchronization frame difference;
returning the time offset to the robot.
The present application further provides a multi-sensor synchronous time service apparatus, including:
a request receiving unit, configured to receive a synchronous time service request sent by a robot;
a sensing data determining unit, configured to determine the sensing data of multiple sensors of the robot;
a synchronization frame difference determining unit, configured to determine the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between frame-to-frame changes of the sensing data;
a time offset determining unit, configured to determine the time offset according to the synchronization frame difference;
a time offset returning unit, configured to return the time offset to the robot.
The present application further provides an electronic device, including:
a processor; and
a memory for storing a program implementing the multi-sensor synchronous time service method. After the device is powered on and the program is run by the processor, the following steps are performed: receiving a synchronous time service request sent by a robot; determining the sensing data of multiple sensors of the robot; determining the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between frame-to-frame changes of the sensing data; determining the time offset according to the synchronization frame difference; and returning the time offset to the robot.
The present application further provides a multi-sensor synchronous time service method, including:
triggering multiple sensors to collect sensing data through a trigger signal derived from a first clock signal source;
determining the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between frame-to-frame changes of the sensing data;
determining the time offset of the sensing data between the multiple sensors according to the synchronization frame difference;
performing time alignment on the sensing data of the multiple sensors according to the time offset.
Optionally, the correlation is determined according to relative transformation data of the multiple sensors; the relative transformation data includes relative transformation data between two adjacent frames of sensing data collected by a sensor.
Optionally, determining the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between frame-to-frame changes of the sensing data includes:
for multiple candidate synchronization frame differences, obtaining the correlation corresponding to each candidate;
taking the synchronization frame difference corresponding to the maximum correlation as the synchronization frame difference of the sensing data.
Optionally, determining the time offset of the sensing data between the multiple sensors according to the synchronization frame difference includes:
taking the average of the time differences of multiple pairs of sensing data frames separated by the synchronization frame difference as the time offset.
Optionally, determining the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between frame-to-frame changes of the sensing data includes:
for each pairwise combination of a first sensor and a second sensor among the multiple sensors, determining the synchronization frame difference of the sensing data between the first sensor and the second sensor according to the correlation between frame-to-frame changes of their sensing data; wherein the first sensor is a sensor whose sensing data carries the first time, and the second sensor is a sensor whose sensing data carries the second time.
Optionally, before determining the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between frame-to-frame changes of the sensing data, the method further includes:
judging whether the time elapsed since the last synchronous timing calibration reaches a duration threshold; if so, proceeding to the next step.
Optionally, before determining the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between frame-to-frame changes of the sensing data, the method further includes:
if the sensing data of the multiple sensors cannot be aligned when projected onto one another, proceeding to the next step.
Optionally, determining the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between frame-to-frame changes of the sensing data includes:
determining the synchronization frame difference according to the correlation of frame-to-frame changes over a preset number of frames of sensing data between the multiple sensors.
Optionally, the method further includes:
generating a first clock pulse signal according to the first clock signal source;
generating, according to the first clock pulse signal, the trigger signal and the first time corresponding to the sensor trigger moment.
The present application further provides a multi-sensor synchronous time service apparatus, including:
a sensor triggering and timing unit, configured to trigger multiple sensors to collect sensing data through a trigger signal derived from a first clock signal source;
a synchronization frame difference determining unit, configured to determine the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between frame-to-frame changes of the sensing data;
a time offset determining unit, configured to determine the time offset of the sensing data between the multiple sensors according to the synchronization frame difference;
a synchronization calibration unit, configured to perform time alignment on the sensing data of the multiple sensors according to the time offset.
The present application further provides a robot, including:
multiple sensors;
a first clock signal source;
a processor; and
a memory for storing a program implementing the multi-sensor synchronous time service method. After the device is powered on and the program is run by the processor, the following steps are performed: triggering multiple sensors to collect sensing data through a trigger signal derived from the first clock signal source; determining the synchronization frame difference of the sensing data between the multiple sensors according to the correlation between frame-to-frame changes of the sensing data; determining the time offset of the sensing data between the multiple sensors according to the synchronization frame difference; and performing time alignment on the sensing data of the multiple sensors according to the time offset.
The present application further provides a computer-readable storage medium storing instructions that, when run on a computer, cause the computer to execute the various methods described above.
The present application further provides a computer program product containing instructions that, when run on a computer, cause the computer to execute the various methods described above.
Compared with the prior art, the present application has the following advantages:
The multi-sensor synchronous time service system provided by the embodiments of the present application includes a robot and a server. The robot triggers multiple sensors to collect sensing data through a trigger signal derived from a first clock signal source and sends a synchronous time service request to the server. According to the request, the server determines the sensing data of the multiple sensors, determines the synchronization frame difference of the sensing data between the sensors according to the correlation between frame-to-frame changes of the sensing data, determines the time offset of the sensing data between the sensors according to the synchronization frame difference, and returns the time offset to the robot. The robot performs time alignment on the sensing data of the multiple sensors according to the time offset. This processing makes it possible to detect the accuracy of multi-sensor synchronous timing at the data level and, when the data timestamps of the various sensing data are found to be misaligned, to calibrate the synchronous timing of the multiple sensors; therefore, the accuracy of multi-sensor synchronous timing can be effectively improved, ensuring precise timing for each sensor in the same system.
FIG. 1 is a flowchart of an embodiment of a multi-sensor synchronous time service method provided by the present application;
FIG. 2 is a schematic diagram of an embodiment of a multi-sensor synchronous time service method provided by the present application;
FIG. 3 is a schematic diagram of an embodiment of a multi-sensor synchronous time service method provided by the present application;
FIG. 4 is a detailed flowchart of an embodiment of a multi-sensor synchronous time service method provided by the present application;
FIG. 5 is a schematic diagram of an embodiment of a multi-sensor synchronous time service method provided by the present application;
FIG. 6 is a structural schematic diagram of an embodiment of a multi-sensor synchronous time service apparatus provided by the present application;
FIG. 7 is a detailed schematic diagram of an embodiment of a multi-sensor synchronous time service apparatus provided by the present application;
FIG. 8 is a schematic diagram of an embodiment of a robot provided by the present application;
FIG. 9 is a detailed schematic diagram of an embodiment of a robot provided by the present application;
FIG. 10 is a schematic diagram of an embodiment of a multi-sensor synchronous time service system provided by the present application;
FIG. 11 is a flowchart of an embodiment of a multi-sensor synchronous time service method provided by the present application;
FIG. 12 is a schematic diagram of an embodiment of a multi-sensor synchronous time service apparatus provided by the present application;
FIG. 13 is a schematic diagram of an embodiment of a robot provided by the present application;
FIG. 14 is a flowchart of an embodiment of a multi-sensor synchronous time service method provided by the present application;
FIG. 15 is a schematic diagram of an embodiment of a multi-sensor synchronous time service apparatus provided by the present application;
FIG. 16 is a schematic diagram of an embodiment of an electronic device provided by the present application.
Many specific details are set forth in the following description to facilitate a full understanding of the present application. However, the present application can be implemented in many ways other than those described herein, and those skilled in the art can make similar generalizations without departing from its spirit; therefore, the present application is not limited by the specific implementations disclosed below.
The present application provides multi-sensor synchronous time service systems, methods, and apparatuses, robots, and electronic devices. Each solution is described in detail in the following embodiments.
First Embodiment
Please refer to FIG. 1, which is a flowchart of an embodiment of a multi-sensor synchronous time service method provided by the present application. The method is executed by a multi-sensor synchronous time service apparatus, which can be deployed in a robot system. The multi-sensor synchronous time service method provided by the present application includes:
步骤S101:通过来源于第一时钟信号源的触发信号,触发多个传感器采集感知数据。
本申请实施例提供的方法,可通过第一时钟信号源触发机器人系统内多个异构传感器进行数据采集。所述机器人,可以是无人车、清洁机器人、医疗机器人、军用机器人、助残机器人等等。
以无人驾驶车辆(也称为轮式机器人)为例,其传感器可包括以下几种:图像传感器(摄像头)、超声波雷达、激光雷达以及毫米波雷达等等,每一个传感器都需要连接自己的处理器(如MCU,ECU等等)。当前最先进的智能汽车采用了十几个传感器(仅指应用于自动驾驶功能),如5颗低线束激光雷达,加上6颗摄像头、3颗毫米波雷达,由这些传感器融合的感知系统利用多个传感器探测感知周围的环境。应用本申请实施例提供的方法,可实现机器人系统内多个异构传感器的处理器之间的时钟(定时器)同步。
所述第一时钟信号源,包括高精度时钟信号源,如GPS(Global Positioning System,全球定位系统)时钟、北斗时钟、NTP(网络授时方式)时钟等等。
以GPS时钟为例,其能够按照用户需求输出符合规约的时间信息格式,从而完成同步授时服务,其主要原理是通过GPS或其他卫星导航系统的信号驯服晶振,从而实现高精度的频率和时间信号输出,是目前达到纳秒级授时精度和稳定度在1E12量级频率输出的最有效方式。
GPS时钟主要分为两类,一类是GPS授时仪,主要输出时标信息,包括1PPS(秒脉冲,一秒一个脉冲)及TOD(Time of Day,年月日时分秒)信息;另外一类是GPS同步时钟,后者输出利用卫星信号驯服OCXO或者铷钟得到的高稳定频率信息,以及本地恢复的更平稳的时标信号。
GPS同步时钟主要由以下几部分组成：GPS/GNSS接收机（可支持GPS/GLONASS/BD/GALILEO等）、高精度OCXO或铷钟、本地同步校准单元、测差单元、误差处理及控制结构，以及输入输出等几部分。
在本实施例中,所述多传感器同步授时装置,可通过高精度时钟信号源产生PPS等高精度时钟脉冲信号;再通过高精度时钟脉冲触发单元,生成触发传感器的固定频率(如50Hz等等)或者自定义频率的触发信号及对应触发时刻的时间戳;然后通过传感器触发及授时单元,利用触发信号触发传感器,采集感知数据并打时间戳。
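作为示意，下面用一个极简的Python草图说明由PPS对齐的起始时刻和固定触发频率生成各触发时刻时间戳的思路（函数名与参数均为本文假设，并非本申请的实际实现）：

```python
def trigger_timestamps(pps_epoch, freq_hz, n):
    """由 PPS 对齐的起始时刻 pps_epoch(秒)与固定触发频率 freq_hz(如 50Hz),
    生成前 n 个触发脉冲对应的时间戳, 用作传感器采集数据时的打戳依据。"""
    return [pps_epoch + i / freq_hz for i in range(n)]

# 50Hz 触发下, 前 3 个触发时刻依次为 0.00s、0.02s、0.04s
print(trigger_timestamps(0.0, 50, 3))
```

实际系统中触发信号由硬件脉冲产生，此处仅示意"触发时刻与时间戳一一对应"这一关系。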
所述多个传感器,包括可接受第一时钟信号源时间戳的传感器,也包括不接受第一时钟信号源时间戳的传感器。为了便于描述,下面将可接受第一时钟信号源时间戳的传感器称为第一传感器,将不接受第一时钟信号源时间戳的传感器称为第二传感器。
所述第一传感器，包括但不限于激光测距仪(LiDAR，简称激光雷达)、毫米波雷达(Radar)等等。所述第二传感器，包括但不限于图像传感器(Camera，简称摄像机)、超声波雷达等等。
本申请实施例提供的方法,通过第一时钟信号源触发机器人系统内多个异构传感器(包括第一传感器和第二传感器)进行数据采集,并将第一时钟信号源的触发时钟信号作为其中第一传感器所采集到的感知数据的时间戳,由此使得第一传感器的感知数据可以得到高的采集时间精度,实现对第一传感器的精确授时。对于机器人系统内的第二传感器,由于其无法接受第一时钟信号源时间戳,因此可根据第二时钟信号源为这类感知数据标记时间戳,第二传感器各自处理器之间的时钟需要进行同步。
由此可见,所述感知数据对应的时间戳(数据采集时间信息),包括第一时间或第二时间。其中,所述第一时间是指,根据所述第一时钟信号源确定的对应所述传感器触发时刻的时间,该时间是由授时单元直接赋值的时间戳To,该时间戳与授时单元所提供时间戳一致。所述第二时间,包括第二时钟信号源的时间,该时间是由授时单元之外的第二时钟信号源赋值的时间戳Ts,该时间戳由于系统的延迟等,会与授时单元的时钟有一个相对时延offset=Ts-To。
所述第二时钟信号源,又称为第三方时钟信号源,可以是机器人的操作系统时钟(简称系统时钟)等等。系统时钟,可以是基于CMOS工艺的高性能处理器时钟系统,是由振荡器(信号源)、定时唤醒器、分频器等组成的电路。常用的信号源有晶体振荡器和RC振荡器。
以无人驾驶车辆为例,其主要依靠车内的以计算机系统为主的智能驾驶仪来实现无人驾驶的目的,其中计算机系统的操作系统的时钟信号即可作为第二时钟信号源,机器人的操作系统可以是Ubuntu、ROS、Android操作系统等等。
在步骤S101通过来源于第一时钟信号源的触发信号触发多个传感器,以使所述传感器采集感知数据之后,就可以进入下一步骤,根据所述多个传感器间感知数据帧间变化的相关度,确定所述多个传感器间感知数据同步帧差。
步骤S103:根据所述多个传感器间感知数据帧间变化的相关度,确定所述多个传感器间感知数据的同步帧差。
本申请提供的方法，基于以下原理设计：假设多个传感器之间刚性连接，若各传感器是严格同步且各传感器感受野(Receptive Field)相同，并且周围环境静止，则在所有传感器一起运动时，所感知的数据也一起变化，当所有传感器停止运动时，所感知的数据也停止变化。如果各传感器之间的同步有一个相对的时间延迟offset，则对应的感知数据变化也会有一个相同的offset。基于此原理，本申请提供的方法利用同一传感器内前后两个时刻的相对变化，实现不同传感器之间的在线同步授时校验及同步校准。
所述刚性连接是指,传感器和传感器之间通过不会发生形变的结构连接,以保证采集到的前后帧相对变化是由于整体的运动产生的,而不是相对运动产生的。
例如,机器人包括传感器A和传感器B,A和B可以是同类也可以是不同类传感器,如A为激光雷达,B为摄像机,A和B刚性连接,周围环境静止或者部分静止。A和B可由相同频率或者不同频率的时间脉冲触发。通过实现一组静止-运动-静止-运动…的传感器运动,可获得一组时序的传感器感知数据,通过求同一个传感器内前后两帧感知数据的相对变换,得到传感器A的感知数据在时序上的变化「Va1,Va2,Va3…,VaN」,对应的时间戳为「Ta1,Ta2,Ta3…,TaN」,传感器B的感知数据在时序上的变化「Vb1,Vb2,Vb3…,VbN」,对应的时间戳为「Tb1,Tb2,Tb3…,TbN」。
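上述"求同一传感器内前后两帧感知数据的相对变换"的过程，可以用如下Python草图示意。此处用特征向量间的欧氏距离近似帧间变化量，仅为假设性的简化；实际的相对变换计算依传感器类型（点云配准、图像特征等）而定：

```python
import numpy as np

def frame_change_series(frames):
    """对同一传感器的时序感知数据, 求相邻两帧间的相对变化量序列。
    frames: (N, D) 数组, 每行为一帧感知数据的特征表示(假设)。
    返回长度为 N-1 的变化量序列, 对应文中的「Va1, Va2, ...」。"""
    frames = np.asarray(frames, dtype=float)
    # 相邻帧差分的范数作为帧间变化量
    return np.linalg.norm(np.diff(frames, axis=0), axis=1)

print(frame_change_series([[0, 0], [3, 4], [3, 4]]))
```

对传感器A和传感器B分别计算该序列，即得到文中用于比较的两条时序变化曲线。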
如图2所示,其中一条曲线表示激光雷达的感知数据在时序上的前后帧变化情况,另一条曲线表示摄像机的感知数据在时序上的前后帧变化情况,通过对比两个传感器的感知数据变化,如摄像机在t2时刻到t3时刻间的变化、与雷达在t3时刻到t4时刻间的变化规律相同,都是感知数据的数值保持不变阶段;摄像机在t3时刻到t4时刻间的变化、与雷达在t4时刻到t5时刻间的变化规律相同,都是感知数据的数值上升阶段;摄像机在t4时刻到t5时刻间的变化、与雷达在t5时刻到t6时刻间的变化规律相同,都是感知数据的数值下降阶段等等;由此可见,两种感知数据的时间戳的偏移量为offset。
图3为本实施例中实际采集的激光雷达和摄像机的数据时序变化曲线。由图3可见,两个传感器感知数据的峰值对应时间基本一致,同时谷值对应时间也基本一致,由此可见,这两个传感器基本达到了同步授时。
通过对比图2和图3，本申请技术方案的发明人发现两个传感器的感知数据在时序上的前后帧变化具有很强的相关性，可通过这种相关性获取两个传感器之间在时间上的相对位移offset。基于这种技术构思，本申请实施例提供的方法，根据两个传感器分别采集的多帧感知数据，确定两个传感器的感知数据在时序上的前后帧变化的相关度(又称为相关系数)，并将相关度表达为两个传感器的同步帧差的函数。
所述相关度,可根据所述多个传感器的相对变换数据确定。所述相对变换数据,包括同一传感器采集的相邻两帧感知数据间的相对变换数据。
所述同步帧差，包括两个传感器在同一时刻采集的感知数据在帧序号上的差值。例如，在2018/11/30 12:00:00同时触发激光雷达和摄像机采集感知数据，激光雷达的感知数据的时间戳来源于GPS时钟，即：2018/11/30 12:00:00，摄像机的感知数据的时间戳来源于操作系统，即：2018/11/30 12:00:01，假设激光雷达在2018/11/30 12:00:00-2018/11/30 12:00:01这1秒钟的时间段内共采集了10帧点云数据，第1帧的时间戳为2018/11/30 12:00:00，第10帧的时间戳为2018/11/30 12:00:01，由于摄像机在2018/11/30 12:00:00采集的数据的时间戳为2018/11/30 12:00:01，因此激光雷达和摄像机的同步帧差为9。
在本实施例中,可将两个传感器之间的相关系数(correlation coefficient)表示为如下公式:
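原文此处的公式图片在摘录中未能保留。基于上下文（相关度是同步帧差k的函数，且在感知数据对齐时取最大值），以下给出一个假设性的重构，采用带帧移k的归一化互相关形式；符号与上下文一致，但具体形式仅为推测：

```latex
C(k)=\frac{\sum_{i}\bigl(V_a(i+k)-\bar{V}_a\bigr)\bigl(V_b(i)-\bar{V}_b\bigr)}
{\sqrt{\sum_{i}\bigl(V_a(i+k)-\bar{V}_a\bigr)^{2}}\,\sqrt{\sum_{i}\bigl(V_b(i)-\bar{V}_b\bigr)^{2}}}
```

其中Va、Vb为两传感器的帧间变化量序列，上划线表示序列均值，k为候选同步帧差。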
如图4所示,步骤S103可包括如下子步骤:
步骤S1031:针对多个同步帧差,获取所述同步帧差对应的所述相关度。
所述多个同步帧差,可以是1帧、2帧…10帧等等。根据上述公式,可计算得到各个同步帧差对应的所述相关度。
步骤S1032:将所述相关度的最大值对应的同步帧差,作为所述感知数据同步帧差。
通过分析可知，在传感器A和传感器B两类感知数据对齐的时刻，即根据同步帧差对感知数据进行对齐后，如雷达数据的第i-k帧与相机的第i帧对齐，C(k)将得到最大值。如图5所示，当两个传感器间时间偏移量为0时，二者间的相关系数最大；偏移量越大，则相关系数越小，趋于0。因此，在确定相关度与同步帧差间的函数关系后，就可以根据两个传感器的多帧感知数据，求解使得C(k)达到最大值的同步帧差。
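步骤S1031与S1032的求解过程可以用如下Python草图示意（假设两序列已按相同帧间隔采样；np.corrcoef对应上文的相关系数，候选帧差上限max_shift为假设参数）：

```python
import numpy as np

def best_sync_frame_diff(va, vb, max_shift=10):
    """遍历候选同步帧差 k=0..max_shift, 返回使相关系数 C(k) 最大的 k。
    va, vb: 两个传感器的帧间变化量序列(假设 vb 相对 va 滞后)。"""
    va, vb = np.asarray(va, float), np.asarray(vb, float)
    best_k, best_c = 0, -np.inf
    for k in range(max_shift + 1):
        n = min(len(va) - k, len(vb))
        c = np.corrcoef(va[k:k + n], vb[:n])[0, 1]  # 帧移 k 下的相关系数
        if c > best_c:
            best_k, best_c = k, c
    return best_k
```

例如，若传感器B的变化序列恰为传感器A滞后3帧的副本，上述函数应返回3。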
具体实施时,可根据传感器采集数据的场景等因素,确定要依据的感知数据的帧数,如可根据100帧感知数据求解同步帧差,帧数越多,相关的峰值会越明显。
需要说明的是，上述公式假设不同传感器前后帧的时间差相同，即Va(i+1)与Va(i)的时间差与Vb(i+1)与Vb(i)的时间差相同，且具有固定的帧间时间差，对于时间差不同的情形，可通过插值等方法生成具有相同时间差的序列Va'与Vb'，以保证传感器的上述相关性公式成立。
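文中提到的插值方法，可用线性插值把不同帧间隔的变化量序列重采样到同一时间网格上，Python草图如下（仅为示意，实际可选用更高阶的插值方法）：

```python
import numpy as np

def resample_to_common_grid(t, v, t_grid):
    """把时间戳为 t 的变化量序列 v 线性插值到统一时间网格 t_grid 上,
    使两个传感器的序列具有相同的帧间时间差。"""
    return np.interp(t_grid, t, v)
```

使用时，先把Va与Vb各自重采样到同一固定频率（如50Hz）的网格，再代入相关性公式求解同步帧差。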
本申请提供的方法,通常只需要在有需求的情况下开启,以对机器人的多传感器做时间相关性校准。例如,对于新安装的设备,或者长时间没有校准过的设备,或者感知系统工作时实时发现不同传感器的感知数据相互投影时无法对齐等情况,通常要通过本申请提供的方法做相关性校准。
在一个示例中，在步骤S103之前，还可包括如下步骤：判断当前时间距离上一次同步授时校准时间的时长是否达到时长阈值；若是，则进入步骤S103。所述时长阈值，可根据业务需求确定，例如，设置为一周，即：每周执行一次本申请提供的方法。
在另一个示例中，在步骤S103之前，还可包括如下步骤：若所述多个传感器的感知数据相互投影时无法对齐，则进入步骤S103。例如，在采集感知数据过程中，tx时刻发现时间偏移量为0.5ms，此后操作系统根据该值对所有无法根据GPS时钟信号源打时间戳的传感器进行同步授时；但是，过了1小时后，发现多个传感器的感知数据相互投影无法对齐，则开启校准，通过执行本申请提供的方法，确定新的时间偏移量为1ms，此后根据该值对多传感器进行同步授时。采用这种处理方式，使得可在数据层面检测多传感器的感知数据是否在严格意义上实现采集时间对齐；因此，可以有效提升多传感器同步授时准确性。
在本实施例中,所述多传感器同步授时装置,针对机器人的多个传感器进行两两组合,针对每一对传感器计算时间偏移量。具体实施时,可以是将一个第一传感器和一个第二传感器组对,计算第二传感器与第一传感器的时间偏移量;也可以是将两个第二传感器组对,计算两个第二传感器间的时间偏移量,同时还要计算其中一个第二传感器与一个第一传感器的时间偏移量,由此推算出另一个第二传感器与第一传感器的时间偏移量。
在确定所述多个传感器间感知数据同步帧差之后，就可以进入下一步骤，根据所述同步帧差确定时间偏移量。
步骤S105:根据所述同步帧差,确定所述多个传感器间感知数据的时间偏移量。
在确定所述多个传感器间感知数据同步帧差后,就可以根据具有所述同步帧差的两帧数据各自对应的时间戳,确定所述时间偏移量。
在一个示例中，可以取任意两帧具有所述同步帧差的数据各自对应的时间戳，将两个时间戳的差值作为时间偏移量。例如，上例中的激光雷达与摄像机间同步帧差为9，则可以取激光雷达的第i帧第一感知数据和摄像机的第i+9帧第二感知数据，将这两帧感知数据各自的时间戳的差值作为时间偏移量，如这两帧数据的时间戳差值为1秒，则时间偏移量为1秒。
在本实施例的另一个示例中，将多对具有所述同步帧差的两帧感知数据的时间差值的平均值作为所述时间偏移量。采用这种处理方式的原因在于，即使在固定触发频率下，各传感器由于计算时间戳的方法不同，每一帧的时间会有一个很微小的扰动，例如激光点云的时间戳由于计算方法的不同会有稍微的扰动，前后帧会有0.001秒量级的差异；为减少该扰动对估计时间戳的影响，将多个两帧时间戳差值的平均值作为时间偏移量。采用这种处理方式，可以有效提升时间偏移量的准确性。
例如，上例中的激光雷达与摄像机间同步帧差为9，则可以取第i帧第一感知数据和第i+9帧第二感知数据，假设二者时间差值为1秒；第i+1帧第一感知数据和第i+1+9帧第二感知数据，假设二者时间差值为1.001秒；…，第i+49帧第一感知数据和第i+49+9帧第二感知数据，假设二者时间差值为0.999秒；将这50对感知数据的时间戳差值的平均值，即(1+1.001+…+0.999)/50，作为时间偏移量。
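将多对两帧时间戳差值取平均的做法，可以写成如下Python草图（ts_a、ts_b为两个传感器各帧的时间戳列表，k为已求得的同步帧差，输入形式均为本文假设）：

```python
def estimate_time_offset(ts_a, ts_b, k):
    """同步帧差为 k 时, 对多对(第 i 帧 A 数据, 第 i+k 帧 B 数据)
    求时间戳差值, 并以其平均值作为时间偏移量(秒), 以抑制单帧扰动。"""
    n = min(len(ts_a), len(ts_b) - k)
    diffs = [ts_b[i + k] - ts_a[i] for i in range(n)]
    return sum(diffs) / len(diffs)
```

例如，三对帧的时间戳差值分别为1.0、1.001、0.999秒时，估计出的时间偏移量为其平均值1.0秒。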
步骤S107:根据所述时间偏移量,对所述多个传感器的感知数据执行时间对齐。
在确定多个传感器间感知数据采集时间的偏移量后,就可以对所述多个传感器间感知数据执行时间对齐。当所述第一传感器采集感知数据时,可将该感知数据的时间戳设置为传感器触发时刻对应的GPS时间;当所述第二传感器采集感知数据时,将该感知数据的时间戳设置为:操作系统时间与时间偏移量的差值,由此实现多传感器的同步授时。
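上述授时规则可概括为如下Python草图：第一传感器直接取触发时刻对应的GPS时间，第二传感器取操作系统时间减去时间偏移量（函数与参数名为本文假设，offset即前文的Ts-To）：

```python
def aligned_timestamp(sensor_kind, gps_trigger_time, os_time, offset):
    """第一传感器: 时间戳 = 触发时刻对应的 GPS 时间;
    第二传感器: 时间戳 = 操作系统时间 - 时间偏移量(offset = Ts - To)。"""
    if sensor_kind == "first":
        return gps_trigger_time
    return os_time - offset
```

按此规则打戳后，两类传感器的感知数据即落在同一时间基准上。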
从上述实施例可见,本申请实施例提供的多传感器同步授时方法,通过来源于第一时钟信号源的触发信号触发多个传感器采集感知数据,所述多个传感器包括至少一个第一传感器和至少一个第二传感器,所述第一传感器采集的第一感知数据的时间包括第一时间,所述第二传感器采集的第二感知数据的时间包括第二时间,所述第一时间是指根据所述第一时钟信号源确定的对应所述传感器触发时刻的时间,所述第二时间包括根据第二时钟信号源确定的时间;根据所述多个传感器间感知数据帧间变化的相关度,确定所述多个传感器间感知数据同步帧差;根据所述同步帧差,确定所述第一时间与所述第二时间之间的时间偏移量;根据所述时间偏移量,对所述多个传感器的感知数据执行时间对齐;这种处理方式,使得可在数据层面检测多个传感器同步授时的准确性,在检测到多种感知数据的数据时间戳未对齐时,可实现多个传感器的同步授时的校准;因此,可以有效提升多传感器同步授时准确性,从而确保对同一系统内的各传感器进行精确授时。
在上述的实施例中，提供了一种多传感器同步授时方法，与之相对应的，本申请还提供一种多传感器同步授时装置。该装置与上述方法的实施例相对应。
第二实施例
请参看图6,其为本申请的多传感器同步授时装置的实施例的示意图。由于装置实施例基本相似于方法实施例,所以描述得比较简单,相关之处参见方法实施例的部分说明即可。下述描述的装置实施例仅仅是示意性的。
本申请另外提供一种多传感器同步授时装置,包括:
传感器触发单元601,用于通过来源于第一时钟信号源的触发信号,触发多个传感器采集感知数据;
同步帧差确定单元602,用于根据所述多个传感器间感知数据帧间变化的相关度,确定所述多个传感器间感知数据的同步帧差;
时间偏移量确定单元603,用于根据所述同步帧差,确定所述多个传感器间感知数据的时间偏移量;
同步校准单元604,用于根据所述时间偏移量,对所述多个传感器的感知数据执行时间对齐。
可选的,所述相关度根据所述多个传感器的相对变换数据确定;所述相对变换数据包括所述传感器采集的相邻两帧感知数据间的相对变换数据。
可选的,所述同步帧差确定单元602包括:
相关度获取子单元,用于针对多个同步帧差,获取所述同步帧差对应的所述相关度;
同步帧差确定子单元,用于将所述相关度的最大值对应的同步帧差,作为所述感知数据的同步帧差。
可选的,所述时间偏移量确定单元603,具体用于将多对具有所述同步帧差的两帧感知数据的时间差值的平均值,作为所述时间偏移量。
可选的,所述同步帧差确定单元602,具体用于针对所述多个传感器中的第一传感器和第二传感器的两两组合,根据所述第一传感器和第二传感器间感知数据帧间变化的相关度,确定所述第一传感器和第二传感器间感知数据同步帧差;其中,所述第一传感器包括感知数据的时间为第一时间的传感器,所述第二传感器包括感知数据的时间为第二时间的传感器。
请参考图7，其为本申请的多传感器同步授时装置的实施例的具体示意图。在本实施例中，所述装置还包括：
第一判断单元701,用于判断当前时间距离上一次同步授时校准时间的时长是否达到时长阈值;若是,则启动所述同步帧差确定单元602。
可选的,还包括:
第二判断单元,用于判断所述多个传感器的感知数据相互投影时是否无法对齐;若是,则启动所述同步帧差确定单元602。
可选的，所述同步帧差确定单元602，具体用于根据所述多个传感器间预设数量的多帧感知数据帧间变化的相关度，确定所述同步帧差。
可选的,还包括:
脉冲信号生成单元,用于根据所述第一时钟信号源,生成第一时钟脉冲信号;
触发信号及第一时间生成单元,用于根据所述第一时钟脉冲信号,生成所述触发信号和对应传感器触发时刻的所述第一时间。
从上述实施例可见,本申请实施例提供的多传感器同步授时装置,通过来源于第一时钟信号源的触发信号触发多个传感器采集感知数据,所述多个传感器包括至少一个第一传感器和至少一个第二传感器,所述第一传感器采集的第一感知数据的时间包括第一时间,所述第二传感器采集的第二感知数据的时间包括第二时间,所述第一时间是指根据所述第一时钟信号源确定的对应所述传感器触发时刻的时间,所述第二时间包括根据第二时钟信号源确定的时间;根据所述多个传感器间感知数据帧间变化的相关度,确定所述多个传感器间感知数据同步帧差;根据所述同步帧差,确定所述第一时间与所述第二时间之间的时间偏移量;根据所述时间偏移量,对所述多个传感器的感知数据执行时间对齐;这种处理方式,使得可在数据层面检测多个传感器同步授时的准确性,在检测到多种感知数据的数据时间戳未对齐时,可实现多个传感器的同步授时的校准;因此,可以有效提升多传感器同步授时准确性,从而确保对同一系统内的各传感器进行精确授时。
第三实施例
请参考图8,其为本申请的机器人的实施例的示意图。由于设备实施例基本相似于方法实施例,所以描述得比较简单,相关之处参见方法实施例的部分说明即可。下述描述的设备实施例仅仅是示意性的。
本实施例的一种机器人，该机器人包括：第一时钟信号源801、多个传感器802、处理器803和存储器804。
所述存储器，用于存储实现多传感器同步授时方法的程序，该设备通电并通过所述处理器运行该多传感器同步授时方法的程序后，执行下述步骤：通过来源于第一时钟信号源的触发信号，触发多个传感器采集感知数据；根据所述多个传感器间感知数据帧间变化的相关度，确定所述多个传感器间感知数据的同步帧差；根据所述同步帧差，确定所述多个传感器间感知数据的时间偏移量；根据所述时间偏移量，对所述多个传感器的感知数据执行时间对齐。
所述机器人,可以是无人车、清洁机器人、医疗机器人、军用机器人、助残机器人等等。
如图9所示,本实施例的机器人为无人车900,并将GPS时钟作为第一时钟信号源,该时钟接收GPS卫星系统999发射的卫星信号。由图9可见,耦合到无人车900或包括在无人车900中的组件可包括推进系统902、传感器系统904、控制系统906、外围设备908、电源910、计算装置911以及用户接口912。计算装置911可包括处理器913和存储器914。计算装置911可以是无人车900的控制器或控制器的一部分。存储器914可包括处理器913可运行的指令916,并且还可存储地图数据915。处理器913可采用车机主处理器芯片(车机CPU)等等,它是无人车900中最重要的部分,承担着运算,存储和控制的功能。其中,地图数据915可通过网络从服务器端请求下载,所述程序指令916包括所述实现多传感器同步授时方法的程序。
该无人车900的传感器系统904可包括以下多个传感器：相机934、超声波雷达936、激光雷达(LIDAR)932、毫米波雷达(RADAR)930、全球定位系统模块926以及惯性测量单元928等等，其中全球定位系统模块926可作为GPS时钟使用。通过处理器913执行所述实现多传感器同步授时方法的程序，对多传感器进行同步授时，同步授时后的多传感器的感知数据通过传感器融合算法944进行数据融合。
推进系统902可包括引擎/发动机918、能量源920、传动装置922、车轮/轮胎924中的至少一个。控制系统906可包括转向938、油门940、制动942、传感器融合算法944、计算机视觉系统946、导航/路线控制系统948、避障系统950中的至少一个。外围设备908可包括无线通信系统952、触摸屏954、麦克风956、扬声器958中的至少一个。
从上述实施例可见，本申请实施例提供的机器人，通过来源于第一时钟信号源的触发信号触发机器人包括的多个传感器，以使所述传感器采集感知数据，所述感知数据的时间包括第一时间或第二时间，所述第一时间是指根据所述第一时钟信号源确定的对应所述传感器触发时刻的时间，所述第二时间包括根据第二时钟信号源确定的时间；根据所述多个传感器间感知数据帧间变化的相关度，确定所述多个传感器间感知数据同步帧差；根据所述同步帧差，确定所述第一时间与所述第二时间之间的时间偏移量；根据所述时间偏移量，对所述多个传感器的感知数据执行时间对齐；这种处理方式，使得可在数据层面检测多个传感器同步授时的准确性，在检测到多种感知数据的数据时间戳未对齐时，可实现多个传感器的同步授时的校准；因此，可以有效提升多传感器同步授时准确性，从而确保对同一系统内的各传感器进行精确授时。
第四实施例
请参考图10，其为本申请的多传感器同步授时系统实施例的结构示意图。由于系统实施例基本相似于方法实施例，所以描述得比较简单，相关之处参见方法实施例的部分说明即可。下述描述的系统实施例仅仅是示意性的。
本实施例的一种多传感器同步授时系统,包括:机器人1001和服务器1002。
所述机器人1001,用于通过来源于第一时钟信号源的触发信号,触发多个传感器采集感知数据,向所述服务器1002发送同步授时请求;以及,接收所述服务器回送的所述多个传感器间感知数据的时间偏移量,并根据所述时间偏移量,对所述多个传感器的感知数据执行时间对齐。
所述同步授时请求,可包括所述多个传感器的感知数据。其中,所述多个传感器包括至少一个第一传感器和至少一个第二传感器,所述第一传感器采集的第一感知数据的时间包括第一时间,所述第二传感器采集的第二感知数据的时间包括第二时间,所述第一时间是指根据所述第一时钟信号源确定的对应所述传感器触发时刻的时间,所述第二时间包括根据第二时钟信号源确定的时间。
在一个示例中,所述机器人1001,还用于判断当前时间距离上一次同步授时校准时间的时长是否达到时长阈值;若是,则向所述服务器1002发送同步授时请求。所述同步授时请求可包括每个传感器的预设帧数的感知数据。
所述预设帧数,可根据业务需求确定,如设置为100帧等等。所述预设帧数越大,则同步授时的准确性越高,但是消耗的网络流量也越多。
在另一个示例中,所述机器人1001,还用于若所述多个传感器的感知数据相互投影时无法对齐,则向所述服务器1002发送同步授时请求。
所述服务器1002,用于接收所述同步授时请求,确定所述多个传感器的感知数据,并根据所述多个传感器间感知数据帧间变化的相关度,确定所述多个传感器间感知数据的同步帧差;根据所述同步帧差,确定所述时间偏移量;向所述机器人1001回送所述时间偏移量。
所述服务器1002,可具体用于根据所述同步授时请求获取所述多个传感器的感知数据。
从上述实施例可见,本申请实施例提供的多传感器同步授时系统,包括机器人和服务器,机器人通过来源于第一时钟信号源的触发信号触发多个传感器采集感知数据,向服务器发送同步授时请求;服务器根据该请求,确定多个传感器的感知数据,根据多个传感器间感知数据帧间变化的相关度,确定多个传感器间感知数据的同步帧差;根据同步帧差,确定多个传感器间感知数据的时间偏移量;向机器人回送时间偏移量;机器人根据时间偏移量,对多个传感器的感知数据执行时间对齐;这种处理方式,使得可在数据层面检测多个传感器同步授时的准确性,在检测到多种感知数据的数据时间戳未对齐时,可实现多个传感器的同步授时的校准;因此,可以有效提升多传感器同步授时准确性,从而确保对同一系统内的各传感器进行精确授时。
第五实施例
请参考图11,其为本申请的多传感器同步授时方法的实施例的流程图。由于方法实施例基本相似于系统实施例,所以描述得比较简单,相关之处参见系统实施例的部分说明即可。下述描述的方法实施例仅仅是示意性的。
本实施例的一种多传感器同步授时方法,包括如下步骤:
步骤S1101:通过来源于第一时钟信号源的触发信号,触发多个传感器采集感知数据;
步骤S1103:向服务器发送同步授时请求;
步骤S1105:接收所述服务器回送的所述多个传感器间感知数据的时间偏移量;
步骤S1107:根据所述时间偏移量,对所述多个传感器的感知数据执行时间对齐。
从上述实施例可见,本申请实施例提供的多传感器同步授时方法,通过来源于第一时钟信号源的触发信号触发多个传感器采集感知数据,向服务器发送同步授时请求,以使得服务器根据所述多个传感器间感知数据帧间变化的相关度,确定所述多个传感器间感知数据的同步帧差;根据所述同步帧差,确定所述多个传感器间感知数据的时间偏移量;向所述机器人回送所述时间偏移量;机器人根据所述时间偏移量,对所述多个传感器的感知数据执行时间对齐;这种处理方式,使得可在数据层面检测多个传感器同步授时的准确性,在检测到多种感知数据的数据时间戳未对齐时,可实现多个传感器的同步授时的校准;因此,可以有效提升多传感器同步授时准确性,从而确保对同一系统内的各传感器进行精确授时。
在上述的实施例中，提供了一种多传感器同步授时方法，与之相对应的，本申请还提供一种多传感器同步授时装置。该装置与上述方法的实施例相对应。
第六实施例
请参看图12,其为本申请的多传感器同步授时装置的实施例的示意图。由于装置实施例基本相似于方法实施例,所以描述得比较简单,相关之处参见方法实施例的部分说明即可。下述描述的装置实施例仅仅是示意性的。
本申请另外提供一种多传感器同步授时装置,包括:
传感器触发单元1201,用于通过来源于第一时钟信号源的触发信号,触发多个传感器采集感知数据;
感知数据发送单元1203,用于向服务器发送同步授时请求;
时间偏移量接收单元1205,用于接收所述服务器回送的所述多个传感器间感知数据的时间偏移量;
时间对齐单元1207,用于根据所述时间偏移量,对所述多个传感器的感知数据执行时间对齐。
第七实施例
请参考图13,其为本申请的机器人实施例的示意图。由于设备实施例基本相似于方法实施例,所以描述得比较简单,相关之处参见方法实施例的部分说明即可。下述描述的设备实施例仅仅是示意性的。
本实施例的一种机器人,该机器人包括:第一时钟信号源1301、多个传感器1302、处理器1303和存储器1304;所述存储器,用于存储实现多传感器同步授时方法的程序,该设备通电并通过所述处理器运行该多传感器同步授时方法的程序后,执行下述步骤:通过来源于所述第一时钟信号源的触发信号,触发多个传感器采集感知数据,向服务器发送同步授时请求;以及,接收所述服务器回送的所述多个传感器间感知数据的时间偏移量,并根据所述时间偏移量,对所述多个传感器的感知数据执行时间对齐。
第八实施例
请参考图14,其为本申请的多传感器同步授时方法的实施例的流程图。由于方法实施例基本相似于系统实施例,所以描述得比较简单,相关之处参见系统实施例的部分说明即可。下述描述的方法实施例仅仅是示意性的。
本实施例的一种多传感器同步授时方法,包括如下步骤:
步骤S1401:接收机器人发送的同步授时请求;
步骤S1403:确定所述机器人的多个传感器的感知数据;
步骤S1405:根据所述多个传感器间感知数据帧间变化的相关度,确定所述多个传感器间感知数据的同步帧差;
步骤S1407:根据所述同步帧差,确定所述时间偏移量;
步骤S1409:向所述机器人回送所述时间偏移量。
从上述实施例可见,本申请实施例提供的多传感器同步授时方法,通过服务器接收机器人发送的同步授时请求,确定所述机器人的多个传感器的感知数据,根据所述多个传感器间感知数据帧间变化的相关度,确定所述多个传感器间感知数据的同步帧差;根据所述同步帧差,确定所述多个传感器间感知数据的时间偏移量;向所述机器人回送所述时间偏移量,以使得机器人根据所述时间偏移量,对所述多个传感器的感知数据执行时间对齐;这种处理方式,使得可在数据层面检测多个传感器同步授时的准确性,在检测到多种感知数据的数据时间戳未对齐时,可实现多个传感器的同步授时的校准;因此,可以有效提升多传感器同步授时准确性,从而确保对同一系统内的各传感器进行精确授时。
在上述的实施例中，提供了一种多传感器同步授时方法，与之相对应的，本申请还提供一种多传感器同步授时装置。该装置与上述方法的实施例相对应。
第九实施例
请参看图15,其为本申请的多传感器同步授时装置的实施例的示意图。由于装置实施例基本相似于方法实施例,所以描述得比较简单,相关之处参见方法实施例的部分说明即可。下述描述的装置实施例仅仅是示意性的。
本申请另外提供一种多传感器同步授时装置,包括:
请求接收单元1501,用于接收机器人发送的同步授时请求;
感知数据确定单元1503,用于确定所述机器人的多个传感器的感知数据;
同步帧差确定单元1505,用于根据所述多个传感器间感知数据帧间变化的相关度,确定所述多个传感器间感知数据的同步帧差;
时间偏移量确定单元1507,用于根据所述同步帧差,确定所述时间偏移量;
时间偏移量回送单元1509,用于向所述机器人回送所述时间偏移量。
第十实施例
请参考图16，其为本申请的电子设备实施例的示意图。由于设备实施例基本相似于方法实施例，所以描述得比较简单，相关之处参见方法实施例的部分说明即可。下述描述的设备实施例仅仅是示意性的。
本实施例的一种电子设备,该电子设备包括:处理器1601和存储器1602;所述存储器,用于存储实现多传感器同步授时方法的程序,该设备通电并通过所述处理器运行该多传感器同步授时方法的程序后,执行下述步骤:接收机器人发送的同步授时请求;确定所述机器人的多个传感器的感知数据;根据所述多个传感器间感知数据帧间变化的相关度,确定所述多个传感器间感知数据的同步帧差;根据所述同步帧差,确定所述时间偏移量;向所述机器人回送所述时间偏移量。
本申请虽然以较佳实施例公开如上,但其并不是用来限定本申请,任何本领域技术人员在不脱离本申请的精神和范围内,都可以做出可能的变动和修改,因此本申请的保护范围应当以本申请权利要求所界定的范围为准。
在一个典型的配置中,计算设备包括一个或多个处理器(CPU)、输入/输出接口、网络接口和内存。
内存可能包括计算机可读介质中的非永久性存储器,随机存取存储器(RAM)和/或非易失性内存等形式,如只读存储器(ROM)或闪存(flash RAM)。内存是计算机可读介质的示例。
1、计算机可读介质包括永久性和非永久性、可移动和非可移动媒体，可以由任何方法或技术来实现信息存储。信息可以是计算机可读指令、数据结构、程序的模块或其他数据。计算机的存储介质的例子包括，但不限于相变内存(PRAM)、静态随机存取存储器(SRAM)、动态随机存取存储器(DRAM)、其他类型的随机存取存储器(RAM)、只读存储器(ROM)、电可擦除可编程只读存储器(EEPROM)、快闪记忆体或其他内存技术、只读光盘只读存储器(CD-ROM)、数字多功能光盘(DVD)或其他光学存储、磁盒式磁带、磁带磁盘存储或其他磁性存储设备或任何其他非传输介质，可用于存储可以被计算设备访问的信息。按照本文中的界定，计算机可读介质不包括暂存电脑可读媒体(transitory media)，如调制的数据信号和载波。
2、本领域技术人员应明白,本申请的实施例可提供为方法、系统或计算机程序产品。因此,本申请可采用完全硬件实施例、完全软件实施例或结合软件和硬件方面的实施例的形式。而且,本申请可采用在一个或多个其中包含有计算机可用程序代码的计算机可用存储介质(包括但不限于磁盘存储器、CD-ROM、光学存储器等)上实施的计算机程序产品的形式。
Claims (18)
- 一种多传感器同步授时系统,其特征在于,包括:机器人,用于通过来源于第一时钟信号源的触发信号,触发多个传感器采集感知数据,向服务器发送同步授时请求;以及,接收所述服务器回送的所述多个传感器间感知数据的时间偏移量,并根据所述时间偏移量,对所述多个传感器的感知数据执行时间对齐;服务器,用于接收所述同步授时请求,确定所述多个传感器的感知数据,并根据所述多个传感器间感知数据帧间变化的相关度,确定所述多个传感器间感知数据的同步帧差;根据所述同步帧差,确定所述时间偏移量;向所述机器人回送所述时间偏移量。
- 一种多传感器同步授时方法,其特征在于,包括:通过来源于第一时钟信号源的触发信号,触发多个传感器采集感知数据;向服务器发送同步授时请求;接收所述服务器回送的所述多个传感器间感知数据的时间偏移量;根据所述时间偏移量,对所述多个传感器的感知数据执行时间对齐。
- 一种多传感器同步授时装置,其特征在于,包括:传感器触发单元,用于通过来源于第一时钟信号源的触发信号,触发多个传感器采集感知数据;感知数据发送单元,用于向服务器发送同步授时请求;时间偏移量接收单元,用于接收所述服务器回送的所述多个传感器间感知数据的时间偏移量;时间对齐单元,用于根据所述时间偏移量,对所述多个传感器的感知数据执行时间对齐。
- 一种机器人，其特征在于，包括：多个传感器；第一时钟信号源；处理器；以及存储器，用于存储实现多传感器同步授时方法的程序，该机器人通电并通过所述处理器运行该多传感器同步授时方法的程序后，执行下述步骤：通过来源于所述第一时钟信号源的触发信号，触发多个传感器采集感知数据，向服务器发送同步授时请求；以及，接收所述服务器回送的所述多个传感器间感知数据的时间偏移量，并根据所述时间偏移量，对所述多个传感器的感知数据执行时间对齐。
- 一种多传感器同步授时方法,其特征在于,包括:接收机器人发送的同步授时请求;确定所述机器人的多个传感器的感知数据;根据所述多个传感器间感知数据帧间变化的相关度,确定所述多个传感器间感知数据的同步帧差;根据所述同步帧差,确定时间偏移量;向所述机器人回送所述时间偏移量。
- 一种多传感器同步授时装置,其特征在于,包括:请求接收单元,用于接收机器人发送的同步授时请求;感知数据确定单元,用于确定所述机器人的多个传感器的感知数据;同步帧差确定单元,用于根据所述多个传感器间感知数据帧间变化的相关度,确定所述多个传感器间感知数据的同步帧差;时间偏移量确定单元,用于根据所述同步帧差,确定所述时间偏移量;时间偏移量回送单元,用于向所述机器人回送所述时间偏移量。
- 一种电子设备,其特征在于,包括:处理器;以及存储器,用于存储实现多传感器同步授时方法的程序,该设备通电并通过所述处理器运行该多传感器同步授时方法的程序后,执行下述步骤:接收机器人发送的同步授时请求;确定所述机器人的多个传感器的感知数据;根据所述多个传感器间感知数据帧间变化的相关度,确定所述多个传感器间感知数据的同步帧差;根据所述同步帧差,确定时间偏移量;向所述机器人回送所述时间偏移量。
- 一种多传感器同步授时方法,其特征在于,包括:通过来源于第一时钟信号源的触发信号,触发多个传感器采集感知数据;根据所述多个传感器间感知数据帧间变化的相关度,确定所述多个传感器间感知数据的同步帧差;根据所述同步帧差,确定所述多个传感器间感知数据的时间偏移量;根据所述时间偏移量,对所述多个传感器的感知数据执行时间对齐。
- 根据权利要求8所述的方法，其特征在于，所述相关度根据所述多个传感器的相对变换数据确定；所述相对变换数据包括所述传感器采集的相邻两帧感知数据间的相对变换数据。
- 根据权利要求8所述的方法,其特征在于,所述根据所述多个传感器间感知数据帧间变化的相关度,并确定所述多个传感器间感知数据的同步帧差,包括:针对多个同步帧差,获取所述同步帧差对应的所述相关度;将所述相关度的最大值对应的同步帧差,作为所述感知数据的同步帧差。
- 根据权利要求8所述的方法,其特征在于,所述根据所述同步帧差,并确定所述多个传感器间感知数据的时间偏移量,包括:将多对具有所述同步帧差的两帧感知数据的时间差值的平均值,作为所述时间偏移量。
- 根据权利要求8所述的方法,其特征在于,所述根据所述多个传感器间感知数据帧间变化的相关度,并确定所述多个传感器间感知数据的同步帧差,包括:针对所述多个传感器中的第一传感器和第二传感器的两两组合,根据所述第一传感器和第二传感器间感知数据帧间变化的相关度,确定所述第一传感器和第二传感器间感知数据同步帧差;其中,所述第一传感器包括感知数据的时间为第一时间的传感器,所述第二传感器包括感知数据的时间为第二时间的传感器。
- 根据权利要求8所述的方法,其特征在于,在所述根据所述多个传感器间感知数据帧间变化的相关度,并确定所述多个传感器间感知数据的同步帧差之前,还包括:判断当前时间距离上一次同步授时校准时间的时长是否达到时长阈值;若是,则进入下一步。
- 根据权利要求8所述的方法,其特征在于,在所述根据所述多个传感器间感知数据帧间变化的相关度,并确定所述多个传感器间感知数据的同步帧差之前,还包括:若所述多个传感器的感知数据相互投影时无法对齐,则进入下一步。
- 根据权利要求8所述的方法,其特征在于,所述根据所述多个传感器间感知数据帧间变化的相关度,并确定所述多个传感器间感知数据的同步帧差,包括:根据所述多个传感器间预设数量的多帧感知数据帧间变化的相关度,确定所述同步帧差。
- 根据权利要求8所述的方法,其特征在于,还包括:根据所述第一时钟信号源,生成第一时钟脉冲信号;根据所述第一时钟脉冲信号,生成所述触发信号和对应传感器触发时刻的所述第一时间。
- 一种多传感器同步授时装置,其特征在于,包括:传感器触发单元,用于通过来源于第一时钟信号源的触发信号,触发多个传感器采集感知数据;同步帧差确定单元,用于根据所述多个传感器间感知数据帧间变化的相关度,确定所述多个传感器间感知数据的同步帧差;时间偏移量确定单元,用于根据所述同步帧差,确定所述多个传感器间感知数据的时间偏移量;同步校准单元,用于根据所述时间偏移量,对所述多个传感器的感知数据执行时间对齐。
- 一种机器人,其特征在于,包括:多个传感器;第一时钟信号源;处理器;以及存储器,用于存储实现多传感器同步授时方法的程序,该机器人通电并通过所述处理器运行该多传感器同步授时方法的程序后,执行下述步骤:通过来源于第一时钟信号源的触发信号,触发多个传感器采集感知数据;根据所述多个传感器间感知数据帧间变化的相关度,确定所述多个传感器间感知数据的同步帧差;根据所述同步帧差,确定所述多个传感器间感知数据的时间偏移量;根据所述时间偏移量,对所述多个传感器的感知数据执行时间对齐。
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811647921.1 | 2018-12-29 | ||
CN201811647921.1A CN111381487B (zh) | 2018-12-29 | 2018-12-29 | 多传感器同步授时系统、方法、装置及电子设备 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020135382A1 true WO2020135382A1 (zh) | 2020-07-02 |
Family
ID=71126940
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/127719 WO2020135382A1 (zh) | 2018-12-29 | 2019-12-24 | 多传感器同步授时系统、方法、装置及电子设备 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN111381487B (zh) |
WO (1) | WO2020135382A1 (zh) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112200768A (zh) * | 2020-09-07 | 2021-01-08 | 华北水利水电大学 | 一种基于地理位置的点云信息提取系统 |
CN112485806A (zh) * | 2020-09-27 | 2021-03-12 | 浙江众合科技股份有限公司 | 一种激光雷达和相机时间同步系统及方法 |
CN112907928A (zh) * | 2021-01-26 | 2021-06-04 | 徐州徐工矿业机械有限公司 | 一种挖掘机多信号无线同步采集及分类系统 |
CN113037458A (zh) * | 2021-03-02 | 2021-06-25 | 中国地震局地球物理研究所 | 一种高精度同步授时系统 |
CN113744532A (zh) * | 2021-09-14 | 2021-12-03 | 东风汽车集团股份有限公司 | 一种基于车路协同的城市交通客车盲区预警方法及装置 |
CN113839732A (zh) * | 2021-09-18 | 2021-12-24 | 阿里巴巴达摩院(杭州)科技有限公司 | 时钟同步方法、装置及设备 |
CN114200496A (zh) * | 2021-12-09 | 2022-03-18 | 桂林电子科技大学 | 一种可实时再生的卫星信号模拟系统及方法 |
CN114739445A (zh) * | 2022-01-27 | 2022-07-12 | 厦门万宾科技有限公司 | 一种城市级排水管网增强扫描方法及系统 |
CN114964175A (zh) * | 2022-03-30 | 2022-08-30 | 华南理工大学 | 多传感器数据同步采集装置及采集方法 |
CN115549884A (zh) * | 2022-09-30 | 2022-12-30 | 东风商用车有限公司 | 一种传感器时间同步方法、装置、设备及可读存储介质 |
CN116232524A (zh) * | 2023-05-11 | 2023-06-06 | 北京米波通信技术有限公司 | 接收机板间信号的同步方法及相关设备 |
CN117040678A (zh) * | 2023-10-10 | 2023-11-10 | 北京理工大学 | 一种基于硬件时间同步的时延控制方法 |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111934843A (zh) * | 2020-07-31 | 2020-11-13 | 深圳市智绘科技有限公司 | 一种面向智能无人系统的多传感器数据同步采集方法 |
CN111970351B (zh) * | 2020-08-11 | 2021-06-22 | 震坤行工业超市(上海)有限公司 | 一种基于数据对齐的物联网多维传感优化方法及系统 |
CN112363383B (zh) * | 2020-10-26 | 2022-04-05 | 上海感探号信息科技有限公司 | 一种时间轴统一系统及方法 |
WO2022116000A1 (zh) * | 2020-12-01 | 2022-06-09 | 华为技术有限公司 | 一种通信方法及装置 |
CN114647179A (zh) * | 2020-12-18 | 2022-06-21 | 华为技术有限公司 | 主时钟装置、从时钟装置和时间同步方法 |
CN112787740A (zh) * | 2020-12-26 | 2021-05-11 | 武汉光庭信息技术股份有限公司 | 一种多传感器时间同步装置及方法 |
CN113411156A (zh) * | 2021-06-24 | 2021-09-17 | 青岛蚂蚁机器人有限责任公司 | 一种slam导航agv的传感器时间同步方法 |
CN114006672B (zh) * | 2021-09-17 | 2024-04-02 | 东风汽车集团股份有限公司 | 一种车载多传感器数据同步采集方法及系统 |
CN114415489B (zh) * | 2021-12-02 | 2023-09-22 | 北京罗克维尔斯科技有限公司 | 一种车载传感器时间同步方法、装置、设备和介质 |
CN114338951A (zh) * | 2021-12-30 | 2022-04-12 | 智道网联科技(北京)有限公司 | 传感器同步方法、装置、系统及车辆 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102421187A (zh) * | 2011-12-01 | 2012-04-18 | 北京航天测控技术有限公司 | 一种无线传感器网络的高精度时钟同步方法 |
CN104009833A (zh) * | 2013-02-26 | 2014-08-27 | 赫克斯冈技术中心 | 传感器同步方法和与之有关的传感器测量系统 |
CN107923991A (zh) * | 2015-03-26 | 2018-04-17 | 英国石油勘探运作有限公司 | 地震勘测方法 |
US20180317245A1 (en) * | 2012-03-30 | 2018-11-01 | Texas Instruments Incorporated | Coexistence of Wireless Sensor Networks with Other Wireless Networks |
JP2018180795A (ja) * | 2017-04-08 | 2018-11-15 | 学校法人関西学院 | 無線通信同期回復方法およびそれを用いたセンサネットワークシステム |
CN108923876A (zh) * | 2018-06-27 | 2018-11-30 | 北京艾瑞思机器人技术有限公司 | 时间同步方法、装置及系统 |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE3227021A1 (de) * | 1982-07-20 | 1984-01-26 | Deutsche Bundespost, vertreten durch den Präsidenten des Fernmeldetechnischen Zentralamtes, 6100 Darmstadt | Verfahren zur bildung von lese- und schreibadressen |
DE69739757D1 (de) * | 1996-12-26 | 2010-03-25 | Nippon Telegraph & Telephone | Weiterreichenverfahren zur verringerung der phasendifferenz zur synchronisation einer mobilstation |
KR20040092259A (ko) * | 2003-04-25 | 2004-11-03 | 삼성전자주식회사 | 기지국 장치를 위한 위성 클럭 동기 시스템 및 이를이용한 기지국 시스템의 위성 클럭 동기화 방법 |
CN101282230B (zh) * | 2007-04-05 | 2011-04-20 | 中兴通讯股份有限公司 | 广播数据全网同步的实现方法 |
CN101394244B (zh) * | 2007-09-17 | 2011-10-26 | 中兴通讯股份有限公司 | 一种时分基站系统中非同源时钟域帧同步信号的产生方法 |
CN101631016B (zh) * | 2009-04-14 | 2011-09-14 | 华中科技大学 | 一种现场总线的时间同步方法 |
JP5838374B2 (ja) * | 2011-11-04 | 2016-01-06 | パナソニックIpマネジメント株式会社 | 無線通信システム |
JP6487386B2 (ja) * | 2016-07-22 | 2019-03-20 | ファナック株式会社 | 時刻精度を維持するためのサーバ、方法、プログラム、記録媒体、及びシステム |
-
2018
- 2018-12-29 CN CN201811647921.1A patent/CN111381487B/zh active Active
-
2019
- 2019-12-24 WO PCT/CN2019/127719 patent/WO2020135382A1/zh active Application Filing
Also Published As
Publication number | Publication date |
---|---|
CN111381487A (zh) | 2020-07-07 |
CN111381487B (zh) | 2022-01-11 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19904047 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 19904047 Country of ref document: EP Kind code of ref document: A1 |