CN111083309B - Time alignment method of multi-sensor data and data acquisition equipment - Google Patents


Info

Publication number
CN111083309B
CN111083309B (application CN201811218803.9A)
Authority
CN
China
Prior art keywords
data
time
sensor
image
reference sensor
Prior art date
Legal status
Active
Application number
CN201811218803.9A
Other languages
Chinese (zh)
Other versions
CN111083309A (en)
Inventor
陈亮
李晓东
芦超
辛梓
Current Assignee
BEIJING MOMENTA TECHNOLOGY Co.,Ltd.
Original Assignee
Beijing Momenta Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Momenta Technology Co., Ltd.
Priority to CN201811218803.9A
Publication of CN111083309A
Application granted
Publication of CN111083309B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/04 Synchronising
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04J MULTIPLEX COMMUNICATION
    • H04J3/00 Time-division multiplex systems
    • H04J3/02 Details
    • H04J3/06 Synchronising arrangements
    • H04J3/0635 Clock or time synchronisation in a network
    • H04J3/0638 Clock or time synchronisation among nodes; Internode synchronisation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L7/00 Arrangements for synchronising receiver with transmitter
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L7/00 Arrangements for synchronising receiver with transmitter
    • H04L7/0016 Arrangements for synchronising receiver with transmitter correction of synchronization errors

Abstract

A time alignment method of multi-sensor data and a data acquisition device are provided. The method comprises the following steps: determining a reference sensor from a plurality of sensors, wherein the reference sensor requires the longest transmission time for transmitting data to the processor; determining a difference between the transmission time of each sensor of the plurality of sensors other than the reference sensor and the transmission time of the reference sensor; and, for any target sensor other than the reference sensor, correcting the receiving time of first collected data collected by the reference sensor by using the difference corresponding to the target sensor to obtain a new receiving time of the first collected data, and determining second collected data and the first collected data as synchronized data. The second collected data is data collected by the target sensor whose receiving time is the same as the new receiving time of the first collected data. The method can improve the time alignment precision of multi-sensor data, thereby unifying the environmental data collected by each sensor.

Description

Time alignment method of multi-sensor data and data acquisition equipment
Technical Field
The invention relates to the technical field of automatic driving, in particular to a time alignment method of multi-sensor data and data acquisition equipment.
Background
In autonomous driving solutions, data acquisition is generally performed using a plurality of different types of sensors that sense the surroundings of the autonomous vehicle independently of each other. By fusing the data collected by the plurality of sensors, the surrounding environment can be perceived more accurately and the safety of autonomous driving improved. However, each sensor may collect a large amount of data, so one of the prerequisites for data fusion is that the set of sensor data used for fusion must be aligned in time.
In order to time-align sensor data, when a data processing device (e.g., a Central Processing Unit (CPU)) receives certain sensor data, the time at which the sensor data is received is recorded (the sensor data is time-stamped). When data fusion is needed, the data processing device reads the sensor data with the same timestamp, and the sensor data thus obtained are treated as synchronized data, realizing time alignment. However, it has been found in practice that the synchronized data obtained by this method still carries a certain error; when the vehicle is positioned using these data, the positioning accuracy required for autonomous driving cannot be satisfied.
Disclosure of Invention
The embodiment of the invention discloses a time alignment method of multi-sensor data and data acquisition equipment, which can improve the time alignment precision of the multi-sensor data so as to unify environmental data acquired by each sensor.
The embodiment of the invention discloses a method for time alignment of multi-sensor data in a first aspect, which comprises the following steps:
determining a reference sensor from the plurality of sensors, wherein the transmission time required for the reference sensor to transmit data to a processor is longer than the transmission time required for any sensor of the plurality of sensors except the reference sensor to transmit data to the processor;
determining a difference between the transit time of each of the plurality of sensors other than the reference sensor and the transit time of the reference sensor;
reading the receiving time of the first acquisition data of the reference sensor;
and, for any target sensor of the plurality of sensors other than the reference sensor, correcting the receiving time of the first acquired data by using the difference value corresponding to the target sensor so as to obtain a new receiving time of the first acquired data, and determining second acquired data and the first acquired data as synchronous data, wherein the second acquired data is acquired by the target sensor, and the receiving time of the second acquired data is the same as the new receiving time of the first acquired data.
As an optional implementation manner, in the first aspect of the embodiments of the present invention, the reference sensor is a camera, and the determining a difference between the transmission time of each sensor of the plurality of sensors except the reference sensor and the transmission time of the reference sensor includes:
acquiring an image shot by the camera, wherein the image comprises a marker at a target position, and the target position is known;
identifying the marker from the image to determine an actual location of the marker in the image;
for any sensor except the camera in the plurality of sensors, determining the positioning position of the sensor when the image is shot according to the data of the sensor;
projecting the marker into the image according to the target position, the positioning position and the relative position relationship between the sensor and the camera so as to determine the distance between the projection position of the marker projected into the image and the actual position of the marker in the image;
determining a difference between the transit time of the sensor and the transit time of the camera based on the speed of the vehicle and the distance.
As an optional implementation manner, in the first aspect of the embodiment of the present invention, when the target sensor is a satellite positioning device, the method further includes:
and recording the satellite positioning time contained in the second acquisition data as the receiving time of the second acquisition data, wherein the satellite positioning time is the atomic clock time.
As an optional implementation manner, in the first aspect of the embodiment of the present invention, when the target sensor is a satellite positioning device, the correcting the receiving time of the first collected data by using the difference value corresponding to the target sensor to obtain a new receiving time of the first collected data includes:
determining a corrected time according to the difference corresponding to the satellite positioning device and the receiving time of the first acquired data;
and converting the corrected time from computer system time to atomic clock time according to the correspondence between the computer system time and the atomic clock time, and determining the conversion result as the new receiving time of the first acquired data.
As an optional implementation manner, in the first aspect of this embodiment of the present invention, the method further includes:
and recording the correspondence between the computer system time and the atomic clock time at a specific frequency.
A second aspect of the embodiments of the present invention discloses a data acquisition device, including:
the selecting unit is used for determining a reference sensor from the plurality of sensors, and the transmission time required by the reference sensor for transmitting data to the processor is longer than the transmission time required by any sensor except the reference sensor in the plurality of sensors for transmitting data to the processor;
a determination unit configured to determine a difference between the transmission time of each of the plurality of sensors other than the reference sensor and the transmission time of the reference sensor;
the reading unit is used for reading the receiving time of the first acquisition data of the reference sensor;
and a synchronization unit configured to correct a reception timing of the first collected data by using the difference corresponding to the target sensor, with respect to any target sensor of the plurality of sensors other than the reference sensor, to obtain a new reception timing of the first collected data, and determine second collected data, which is data collected by the target sensor and whose reception timing is the same as the new reception timing of the first collected data, and the first collected data as synchronized data.
As an optional implementation manner, in the second aspect of the embodiment of the present invention, the reference sensor is a camera; and, the determining unit includes:
the acquisition subunit is used for acquiring an image shot by the camera, wherein the image comprises a marker at a target position, and the target position is known;
a determining subunit, configured to identify the marker from the image to determine an actual position of the marker in the image;
a calculation subunit, configured to determine, for any sensor of the plurality of sensors except the reference sensor, a positioning position of the sensor when the image is captured, based on data acquired by the sensor; projecting the marker into the image according to the target position, the positioning position and the relative position relationship between the sensor and the camera so as to determine the distance between the projection position of the marker projected into the image and the actual position of the marker in the image; determining a difference between the transit time of the sensor and the transit time of the camera based on the speed of the vehicle and the distance.
As an optional implementation manner, in the second aspect of the embodiment of the present invention, the data acquisition device further includes:
and the first recording unit is used for recording the satellite positioning time contained in the second acquisition data as the receiving time of the second acquisition data when the target sensor is a satellite positioning device, wherein the satellite positioning time is the atomic clock time.
As an optional implementation manner, in the second aspect of the embodiment of the present invention, when the target sensor is a satellite positioning device, a manner that the synchronization unit is configured to correct the receiving time of the first collected data by using the difference value corresponding to the target sensor to obtain a new receiving time of the first collected data is specifically:
the synchronization unit is used for determining a correction time according to the difference corresponding to the satellite positioning device and the receiving time of the first acquisition data; and converting the correction time from the computer time to the atomic clock time according to the corresponding relation between the computer system time and the atomic clock time, and determining a conversion result as a new receiving time of the first acquisition data.
As an optional implementation manner, in the second aspect of the embodiment of the present invention, the data acquisition device further includes:
and the second recording unit is used for recording the correspondence between the computer system time and the atomic clock time at a specific frequency.
A third aspect of the embodiments of the present invention discloses a data acquisition device, including:
a memory storing executable program code;
a processor coupled with the memory;
a sensor module including, but not limited to, a camera, a satellite positioning device, an inertial measurement unit, a wheel speed meter, and the like;
the processor calls the executable program code stored in the memory, receives data acquired by each sensor in the sensor module, and executes any one of the methods disclosed in the first aspect of the embodiments of the present invention.
A fourth aspect of the present invention discloses a computer-readable storage medium storing a computer program, wherein the computer program causes a computer to execute any one of the methods disclosed in the first aspect of the embodiments of the present invention.
A fifth aspect of the embodiments of the present invention discloses a computer program product, which, when running on a computer, causes the computer to execute any one of the methods disclosed in the first aspect of the embodiments of the present invention.
Compared with the prior art, the invention has the advantages that:
1. A reference sensor is selected from the plurality of sensors such that the transmission time required to transmit its collected data to the processor is longer than that of any other sensor (i.e., the reference sensor has the longest transmission time). The difference between the transmission time of each remaining sensor and that of the reference sensor is then determined. After the receiving time of the first collected data of the reference sensor is read, for any target sensor other than the reference sensor, the receiving time of the first collected data is corrected by the difference corresponding to that target sensor to obtain a new receiving time; second collected data collected by the target sensor, whose receiving time is the same as the new receiving time of the first collected data, is determined together with the first collected data as synchronized data. By using the sensor with the longest transmission time as the reference, once time alignment is completed, data collected by the largest possible number of sensors can be obtained, and these data are synchronized in time. Conversely, if the reference sensor were not the sensor with the longest transmission time, then for a target sensor whose transmission time is longer, the collected data might not yet have been transmitted to the processor at the new receiving time obtained by correction, i.e., the second collected data would not exist.
Therefore, the time alignment precision of the data of the multiple sensors can be improved by implementing the embodiment of the invention, so that the environmental data acquired by each sensor is unified.
2. The camera is selected as the reference sensor, and the visual information obtained by the camera can be used to calculate the data transmission delay between the reference sensor and the remaining sensors. This calculation requires neither hardware modification of each sensor nor experimental measurement of each sensor's transmission time, and can conveniently determine the transmission delay of the reference sensor relative to each sensor. In addition, compared with sensors such as an inertial measurement unit or a wheel speed meter, a single acquisition from the camera contains more information, and the transmission time required for image data is generally longer. Under the reference sensor selection criterion above, using the camera as the reference sensor allows visual information to be exploited as much as possible for data fusion, improving the precision of data fusion.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a schematic flow chart of a method for time alignment of multi-sensor data according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart diagram of another method for time-aligning multi-sensor data according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a data acquisition device according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of another data acquisition device disclosed in the embodiments of the present invention;
fig. 5 is a schematic structural diagram of another data acquisition device disclosed in the embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It is to be noted that the terms "comprises" and "comprising" and any variations thereof in the embodiments and drawings of the present invention are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
The embodiment of the invention discloses a time alignment method of multi-sensor data and data acquisition equipment, which can improve the time alignment precision of the multi-sensor data so as to unify environmental data acquired by each sensor. The following are detailed below.
Example one
Referring to fig. 1, fig. 1 is a schematic flowchart illustrating a method for time alignment of multi-sensor data according to an embodiment of the present invention. As shown in fig. 1, the method may include the following steps:
101. the data acquisition device determines a reference sensor from the plurality of sensors.
In an embodiment of the present invention, the plurality of sensors may include a camera, a satellite positioning device (e.g., a GPS device or a BeiDou positioning device), an inertial measurement unit (IMU), a wheel speed meter, a radar, and the like, which is not limited in the embodiment of the present invention.
The transmission time required by the reference sensor to transmit data to the processor of the data acquisition device is longer than that required by any other sensor of the plurality, i.e., the reference sensor has the longest transmission time. The transmission time of a sensor is influenced by various factors such as the size of the data and the circuit layout between the sensor and the processor. Suppose two different sensors each acquire data at the same instant; because of differing transmission times, the two data items may not reach the processor at the same time. If the CPU treats sensor data with the same receiving time as synchronized, the actual generation times of the items in a group of such sensor data may differ, and the precision of any result obtained by fusing that group is reduced. Determining the sensor with the longest transmission time as the reference sensor means that, after time alignment is completed, data collected by the largest possible number of sensors can be obtained and are synchronized in time; the time alignment precision of multi-sensor data is thus improved, and in turn the precision of data fusion using the synchronized data (such as positioning calculation and mapping) is improved. This is one of the points of the present invention.
102. The data acquisition device determines a difference between a transmission time of each sensor of the plurality of sensors other than the reference sensor and a transmission time of the reference sensor.
In the embodiment of the invention, after the reference sensor is determined, its transmission time is taken as the standard, and the difference of each other sensor's transmission time relative to it is determined. That is, for data collected at the same instant, the data of each sensor other than the reference sensor reaches the processor earlier than the data of the reference sensor by the corresponding difference.
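For illustration, steps 101 and 102 can be sketched as follows. The sensor names and transmission times are hypothetical values; the patent does not specify how the per-sensor transmission times are measured.

```python
# Hypothetical per-sensor transmission times to the processor, in seconds.
transmission_times = {"camera": 0.500, "imu": 0.005, "gps": 0.020}

# Step 101: the reference sensor is the one with the longest transmission time.
reference = max(transmission_times, key=transmission_times.get)

# Step 102: difference between the reference sensor's transmission time and
# each other sensor's (i.e., how much earlier the other data arrives).
deltas = {
    name: transmission_times[reference] - t
    for name, t in transmission_times.items()
    if name != reference
}
```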
103. The data acquisition device reads the reception time of the first acquisition data of the reference sensor.
In the embodiment of the invention, the data acquisition equipment can record the time when the data of each sensor is transmitted to the processor. As an optional implementation, the data file may also be named according to the receiving time of the data, so as to facilitate searching.
104. For any target sensor except the reference sensor in the plurality of sensors, the data acquisition equipment corrects the receiving time of the first acquired data by using the difference value corresponding to the target sensor to obtain the new receiving time of the first acquired data.
In the embodiment of the invention, the data acquisition device compensates the receiving time of the first acquired data by the difference corresponding to the target sensor, so as to eliminate the relative error between the transmission time of the target sensor and that of the reference sensor. As an alternative, the data acquisition device may subtract the difference corresponding to the target sensor from the receiving time of the first acquired data and take the result as the new receiving time. For example, if the CPU receives the first collected data from the reference sensor at 10:00:00.000 and the difference between the transmission time of the reference sensor and that of a certain target sensor is 500 ms, the corrected new receiving time of the first collected data is 09:59:59.500. The data acquisition device may determine the second collected data from the target sensor received at 09:59:59.500 as data synchronized with the first collected data.
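The correction of step 104 amounts to shifting the reference data's reception time back by the target sensor's difference. A minimal sketch (the date and the 500 ms difference are illustrative, following the example above):

```python
from datetime import datetime, timedelta

def corrected_reception_time(reception_time, delta_seconds):
    """Step 104: subtract the target sensor's transmission-time
    difference from the reference data's reception time."""
    return reception_time - timedelta(seconds=delta_seconds)

t1 = datetime(2018, 10, 19, 10, 0, 0)   # camera frame received at 10:00:00.000
t2 = corrected_reception_time(t1, 0.5)  # 500 ms difference for a target sensor
# t2 corresponds to 09:59:59.500, the reception time at which the
# target sensor's synchronized data should be looked up.
```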
105. The data acquisition equipment determines second acquisition data acquired by the target sensor and the first acquisition data as synchronous data, and the receiving time of the second acquisition data is the same as the new receiving time of the first acquisition data.
In the embodiment of the invention, suppose the automatic driving vehicle comprises at least a camera, an IMU and a GPS sensor. The camera may be determined as the reference sensor, and the difference Δt1 between the transmission time of the IMU and that of the camera, as well as the difference Δt2 between the transmission time of the GPS and that of the camera, may be determined. Consider an image (first collected data) shot by the camera and received by the data acquisition device at a certain time T1. When the target sensor is the IMU, the data acquisition device corrects T1 by Δt1 to obtain a new receiving time T2, and determines the IMU data (second collected data) received at time T2 as data synchronized with the image; when the target sensor is the GPS, the data acquisition device corrects T1 by Δt2 to obtain a new receiving time T3, and determines the GPS data (second collected data) received at time T3 as data synchronized with the image. After the above time correction, the data acquisition device can determine the image received at time T1, the IMU data received at time T2, and the GPS data received at time T3 as a group of sensor data synchronized in time. The IMU data and the GPS data can be considered as data generated at the moment the camera shot the image, and performing data fusion with this group of sensor data can improve the accuracy of the fusion result.
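Steps 104 and 105 taken together can be sketched as a lookup into per-sensor reception buffers. The buffers, keys (integer milliseconds), and difference values below are all hypothetical:

```python
# Hypothetical reception buffers keyed by reception time in milliseconds.
imu_buffer = {9500: "imu_sample_a", 9600: "imu_sample_b"}
gps_buffer = {9800: "gps_fix_a"}

t1_ms = 10_000        # reception time of the camera image (T1)
delta_imu_ms = 500    # IMU-vs-camera transmission-time difference (Δt1)
delta_gps_ms = 200    # GPS-vs-camera transmission-time difference (Δt2)

# Correct T1 for each target sensor and fetch the data whose reception
# time equals the corrected time; the result is one synchronized group.
synchronized = {
    "image": "camera_frame",
    "imu": imu_buffer.get(t1_ms - delta_imu_ms),   # T2 = 9500 ms
    "gps": gps_buffer.get(t1_ms - delta_gps_ms),   # T3 = 9800 ms
}
```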
It can be seen that in the method described in fig. 1, the data acquisition device may determine, from the plurality of sensors, the sensor with the longest transmission time required for transmitting data to the processor as a reference sensor, and then calculate a difference value of the transmission time of each sensor other than the reference sensor with respect to the transmission time of the reference sensor; aiming at any target sensor except the reference sensor, the data acquisition equipment corrects the relative error between the transmission time of the target sensor and the transmission time of the reference sensor, so that the time alignment precision of multi-sensor data can be improved, and the environmental data acquired by each sensor is unified.
Example two
Referring to fig. 2, fig. 2 is a schematic flowchart of another method for time alignment of multi-sensor data according to an embodiment of the present invention. As shown in fig. 2, the method may include the following steps:
201. the data acquisition device determines the camera from the plurality of sensors as a reference sensor.
Among the above sensors, the image data collected by the camera has a large data volume and involves operations such as encoding and decoding during transmission, so the transmission time of image data is generally longer than that of other sensor data. Therefore, the camera can be determined as the reference sensor. Using the camera as the reference sensor allows visual information to be exploited as much as possible for data fusion, further improving its precision. This is also one of the points of the present invention. In addition, by executing the following steps 202 to 206, the transmission delay of the camera relative to the remaining sensors can be calculated from the visual information captured by the camera; this calculation requires neither hardware modification of each sensor nor experimental measurement of each sensor's transmission time, and is likewise one of the points of the invention.
202. The data acquisition device acquires an image taken by the camera, the image containing the marker at the target location and the target location being known.
In the embodiment of the invention, a marker pole may be erected in advance, or a specific guideboard, speed limit sign, or the like may be selected as the marker, and the target position of the marker may be measured in advance by satellite positioning means such as GPS. In particular, markers such as specific guideboards and speed limit signs may be objects whose positions are known in an electronic map; in that case the calculation of the transmission delay of the reference sensor relative to the remaining sensors can be completed using objects already existing on the road, without erecting a dedicated marker pole, which is also one of the points of the invention.
203. The data acquisition device identifies the marker from the image to determine the actual location of the marker in the image.
204. For any sensor except the camera in the plurality of sensors, the data acquisition equipment determines the positioning position of the sensor when the image is shot according to the data of the sensor.
In the embodiment of the invention, the data acquisition device can read the receiving time T of the image containing the marker. For the satellite positioning device, the data acquisition device can directly read the satellite positioning data whose receiving time is T as the positioning position of the satellite positioning device; for the IMU, the IMU data whose receiving time is T may be read, and the positioning position of the IMU calculated from the triaxial acceleration information measured by the IMU.
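The IMU positioning mentioned here can be sketched as naive dead reckoning from the triaxial accelerations. This is a deliberately simplified model for illustration; a real IMU pipeline would also track orientation, gravity, and sensor biases:

```python
def integrate_position(p0, v0, accelerations, dt):
    """Naive dead reckoning: integrate triaxial accelerations twice
    to propagate position from an initial position and velocity."""
    p, v = list(p0), list(v0)
    for a in accelerations:
        for i in range(3):
            v[i] += a[i] * dt   # velocity update
            p[i] += v[i] * dt   # position update
    return tuple(p)

# Constant 1 m/s^2 along x for two 1 s samples, starting at rest:
# after 1 s: v = 1, p = 1; after 2 s: v = 2, p = 3.
position = integrate_position((0.0, 0.0, 0.0), (0.0, 0.0, 0.0),
                              [(1.0, 0.0, 0.0)] * 2, 1.0)
```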
205. The data acquisition equipment projects the marker into the image according to the target position, the positioning position and the relative position relationship between the sensor and the camera so as to determine the distance between the projection position of the marker projected into the image and the actual position of the marker in the image.
In embodiments of the present invention, the mounting locations of the various sensors on the autonomous vehicle may be different. For example, the camera may be mounted in the front windshield of the vehicle, and the IMU and GPS devices may be mounted in the trunk of the vehicle. Therefore, the data collected by each sensor has a correspondence in space as well as a correspondence in time. For a certain marker objectively present in the surrounding environment of the vehicle, the observation of the marker by different sensors may be understood as a representation of the marker in a coordinate system with the respective sensor as origin of coordinates. Meanwhile, according to the corresponding relation of each sensor on the space, the representation of the marker in a certain coordinate system can be converted into the representation in a different coordinate system.
For example, for any preset marker, since the target position of the marker is known and the positioning position of the IMU can be obtained from the received IMU data, the marker can be converted into an IMU coordinate system with the IMU as the origin; according to the relative position relationship between the IMU and the camera, the marker can then be converted from the IMU coordinate system into the camera coordinate system with the camera as the origin, and, according to the projection relationship of the camera, the projection position of the marker in the image shot by the camera can be calculated. If there is no relative error between the transmission time of the IMU and the transmission time of the camera, the IMU data and the image received at the above time T may be regarded as synchronized data; in this case, the projected position and the actual position of the same marker in the image should coincide. If there is a relative error between the transmission time of the IMU and the transmission time of the camera, assuming that the relative error is Δt1, the projected position calculated from the IMU data received at time T is actually the position of the marker relative to the camera at Δt1 before the image was captured. The distance between the projected position and the actual position of the same marker in the image is thus affected by the transmission delay between the sensors and the speed of the vehicle, so the difference between the transmission time of the sensor and the transmission time of the camera can be determined by performing step 206 described below.
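The chain of coordinate conversions just described can be sketched as follows. This is a simplified pinhole-projection sketch under the added assumption that the sensor frame shares the world axes (a real system would also apply the sensor's orientation); all function and parameter names are illustrative:

```python
def project_marker(p_world, sensor_pos, r_cam, t_cam, fx, fy, cx, cy):
    """Project a marker with known world position into the image.

    p_world: marker position in the world frame (the "target position").
    sensor_pos: sensor (e.g. IMU) origin in the world frame (the
        "positioning position"), so p_world - sensor_pos is the marker
        expressed in a sensor frame aligned with the world axes (a
        simplifying assumption of this sketch).
    r_cam (3x3 list) and t_cam (3-vector): fixed relative position
        between sensor and camera, taking sensor to camera coordinates.
    fx, fy, cx, cy: camera intrinsics (the internal reference matrix K).
    Returns pixel coordinates (u, v).
    """
    # marker in the sensor frame
    p_s = [p_world[i] - sensor_pos[i] for i in range(3)]
    # sensor frame -> camera frame: p_c = R * p_s + t
    p_c = [sum(r_cam[i][j] * p_s[j] for j in range(3)) + t_cam[i]
           for i in range(3)]
    # pinhole projection with intrinsics
    u = fx * p_c[0] / p_c[2] + cx
    v = fy * p_c[1] / p_c[2] + cy
    return u, v
```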
206. The data acquisition device determines a difference between the transmission time of the sensor and the transmission time of the camera according to the speed of the vehicle and the distance between the projected position and the actual position of the marker.
In an embodiment of the present invention, the steps 203 to 206 may be specifically represented by the following mathematical model:

p_proj(t) = K · T · (P_w − (t_b + V_b · t))

wherein t is the difference between the transmission time of a certain sensor other than the camera and the transmission time of the camera; K is the internal reference matrix of the camera; T is a conversion matrix representing the relative position relationship between the sensor and the camera; t_b is the positioning position of the sensor; P_w is the target position of the marker; V_b is the speed of the vehicle; and p_proj(t) is accordingly the projected position of the marker in the image after time compensation.

By continuously and iteratively adjusting the value of t until the error between the time-compensated projected position p_proj(t) of the marker and the actual position of the marker in the image is minimal, the value of t at which the error is minimal is determined as the difference between the transmission time of the sensor and the transmission time of the camera.
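The iterative adjustment of t can be sketched as a simple one-dimensional grid search. The callable `proj_at` is a hypothetical stand-in for the time-compensated projection described in the text (e.g. the sensor position advanced by the vehicle speed times t); all names are illustrative:

```python
def estimate_time_offset(proj_at, p_actual, t_lo, t_hi, steps=2000):
    """Grid-search the transmission-time difference t that minimises
    the distance between the time-compensated projection and the
    actual image position of the marker.

    proj_at(t) must return the (u, v) projection computed from sensor
    data compensated by t. Hypothetical interface for this sketch.
    """
    best_t, best_err = t_lo, float("inf")
    for k in range(steps + 1):
        t = t_lo + (t_hi - t_lo) * k / steps
        u, v = proj_at(t)
        err = (u - p_actual[0]) ** 2 + (v - p_actual[1]) ** 2
        if err < best_err:
            best_t, best_err = t, err
    return best_t
```

A coordinate-descent or gradient-based minimiser could replace the grid search; the grid keeps the sketch self-contained.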
In combination with the above mathematical model, the difference between the transmission time of each sensor other than the reference sensor and the transmission time of the reference sensor can be determined by performing the above steps 201 to 206. For any target sensor other than the reference sensor, the data acquisition equipment can then perform time correction according to the difference corresponding to that target sensor, so that the data acquired by the target sensor and the data acquired by the reference sensor are aligned in time. In addition, by executing the above steps, the vehicle only needs to be driven past the preset marker so that the camera can shoot it; the difference between the transmission time of the camera and the transmission time of each other sensor is then calculated automatically, without modifying hardware or measuring the transmission time of each sensor, so that the transmission time delay of the reference sensor relative to each sensor can be determined more conveniently. This is also one of the inventive points of the present invention.
207. The data acquisition equipment reads the receiving moment of the first acquired data of the camera.
As an alternative embodiment, when the target sensor is a satellite positioning device, the data acquisition device may perform the following steps:
208. the data acquisition device records the satellite positioning time included in the second acquisition data as the reception time of the second acquisition data.
In the embodiment of the present invention, the second collected data may be data collected by a satellite positioning device, such as GPS positioning information. In the satellite positioning information, the satellite positioning time is the instantaneous time corresponding to the satellite positioning position, and is an atomic clock time. The embodiment of performing time correction using the transmission time delay of each sensor relative to the reference sensor to improve the time alignment accuracy works well when the transmission time of each sensor is fixed. In some possible embodiments, however, when a sensor transmits data to the processor, the data transmission link may be congested by various factors, and the actual difference between the transmission time required for transmitting the sensor data to the processor and the transmission time of the reference sensor may then be inconsistent with the predetermined difference; in that case, if the predetermined difference is still used for time correction, the sensor data is time-aligned with lower accuracy. The satellite positioning data, by contrast, contains the satellite positioning time, which is an accurate atomic clock time and is not affected by the state of the data transmission link. This is also one of the inventive points of the present invention.
209. And the data acquisition equipment determines the correction time according to the difference corresponding to the satellite positioning device and the receiving time of the first acquired data.
210. And the data acquisition equipment converts the correction time from the computer time to the atomic clock time according to the corresponding relation between the computer system time and the atomic clock time, and determines the conversion result as the new receiving time of the first acquired data.
In the embodiment of the invention, the data acquisition equipment can maintain a time corresponding relation table, the table comprises the corresponding relation between the computer time and the atomic clock time, and the data acquisition equipment can record the corresponding relation between the computer system time and the atomic clock time according to the specific frequency and update the corresponding relation into the time corresponding relation table.
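The time correspondence table can be sketched as follows. Linear interpolation between recorded pairs is an added assumption of this sketch (the text itself only requires recording and lookup), and all names are illustrative:

```python
import bisect

class TimePairTable:
    """Correspondence table between computer (UNIX) time and atomic
    clock (GPS) time, recorded at a specific frequency."""

    def __init__(self):
        self.unix = []   # recorded computer-system times, ascending
        self.gps = []    # matching atomic-clock times

    def record(self, unix_t, gps_t):
        # called periodically to update the correspondence table
        self.unix.append(unix_t)
        self.gps.append(gps_t)

    def to_atomic(self, unix_t):
        """Convert a computer time to atomic clock time."""
        i = bisect.bisect_left(self.unix, unix_t)
        if i == 0:
            return self.gps[0] + (unix_t - self.unix[0])
        if i == len(self.unix):
            return self.gps[-1] + (unix_t - self.unix[-1])
        u0, u1 = self.unix[i - 1], self.unix[i]
        g0, g1 = self.gps[i - 1], self.gps[i]
        frac = (unix_t - u0) / (u1 - u0)
        return g0 + frac * (g1 - g0)
```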
211. The data acquisition equipment determines the second acquisition data acquired by the satellite positioning device and the first acquisition data as synchronous data, wherein the receiving time of the second acquisition data is the same as the new receiving time of the first acquisition data.
In some possible embodiments, the GPS device and the IMU of the autonomous vehicle may be integrated into one integrated navigation device, which measures the pose information of the vehicle and uses the atomic clock time of the GPS as its timing reference. Accordingly, the data acquisition device can determine the difference Δt3 between the transmission time of the camera and the transmission time of the integrated navigation device. The data acquisition device may store and maintain three types of data as shown in the following table:
IMG (UNIX-TIME) | TIME pair (UNIX-TIME, GPS-TIME) | POSE (GPS-TIME)
……t1……jpg | (t2, t3) | (t3, pose)

In the above table, IMG is the image data captured by the camera; POSE is the pose information of the autonomous vehicle, which at least includes the longitude, latitude, altitude, pitch angle, course angle, and roll angle of the autonomous vehicle, where position information such as longitude, latitude, and altitude can be obtained from the measurement data of the GPS device, and attitude information such as pitch angle, course angle, and roll angle can be obtained from the measurement data of the IMU. UNIX-TIME is the computer system time, GPS-TIME is the satellite positioning time, and the TIME pair records the correspondence between the computer time and the atomic clock time. Based on the time pair, the data acquisition device can acquire the vehicle POSE data synchronized with an image shot by the camera as follows: for an image captured by the camera at a certain time t1, the above Δt3 is used to correct t1, obtaining the correction time t2 of the image; t3 (GPS-TIME) corresponding to t2 (UNIX-TIME) is then found through the TIME pair as the new receiving time of the image, and the POSE data whose satellite positioning time is t3 is acquired as the data synchronized with the image shot by the camera at time t1.
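The three-step lookup just described can be sketched as follows. The dictionary-based exact matching and all names are illustrative assumptions of this sketch; a real system would tolerate small timestamp mismatches:

```python
def synced_pose(t1, delta_t3, time_pairs, poses):
    """Find the vehicle pose synchronized with an image.

    t1: UNIX reception time of the image.
    delta_t3: transmission-time difference between the camera and the
        integrated navigation device (assumed sign: camera is slower).
    time_pairs: dict mapping UNIX-TIME -> GPS-TIME (the TIME pair table).
    poses: dict mapping GPS-TIME -> pose record.
    All names and the exact-match lookup are illustrative.
    """
    t2 = t1 - delta_t3      # corrected UNIX time of the image
    t3 = time_pairs[t2]     # convert to atomic clock (GPS) time
    return poses[t3]        # pose whose satellite positioning time is t3
```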
It can be seen that, in the method described in fig. 2, the data acquisition device can correct the reception time of the first acquired data by the difference between the transmission time of the reference sensor and the transmission time of the target sensor, so as to improve the accuracy of the time alignment of the sensor data. Further, in the method described in fig. 2, the camera is used as the reference sensor, and the difference between the transmission time of the reference sensor and the transmission time of each other sensor is determined by calculating the distance between the projection position of a preset marker in the image captured by the camera and its actual position in the image, so that the transmission time delay of the reference sensor relative to each sensor can be determined conveniently. Further, in the method described in fig. 2, when the target sensor is a satellite positioning device, after the data acquisition device corrects the receiving time of the first acquired data by using the transmission time difference corresponding to the target sensor, the receiving time of the first acquired data may be further corrected by using the correspondence between the atomic clock time and the computer system time, so as to further improve the accuracy of time alignment.
EXAMPLE III
Referring to fig. 3, fig. 3 is a schematic structural diagram of a data acquisition device according to an embodiment of the present invention. As shown in fig. 3, the data acquisition apparatus may include:
the selecting unit 301 is configured to determine a reference sensor from the plurality of sensors, where a transmission time required for the reference sensor to transmit data to the processor is longer than a transmission time required for any one of the plurality of sensors except the reference sensor to transmit data to the processor;
in an embodiment of the present invention, the plurality of sensors may include a camera, a satellite positioning device, an inertial navigation unit, a wheel speed meter, a radar, and the like.
A determination unit 302 for determining a difference between the transmission time of each sensor of the plurality of sensors other than the reference sensor determined by the selection unit 301 and the transmission time of the reference sensor;
the reading unit 303 is configured to read the receiving time of the first acquired data of the reference sensor determined by the selecting unit 301;
a synchronization unit 304, configured to, for any target sensor of the plurality of sensors except the reference sensor, correct the reception time of the first collected data read by the reading unit 303 by using the difference corresponding to the target sensor determined by the determination unit 302 to obtain a new reception time of the first collected data, and determine the second collected data and the first collected data as synchronized data, where the second collected data is the data collected by the target sensor, and the reception time of the second collected data is the same as the new reception time of the first collected data. As an alternative embodiment, the synchronization unit 304 may subtract the difference corresponding to the target sensor from the receiving time of the first collected data and use the result as the new receiving time of the first collected data.
It can be seen that, with the data acquisition device shown in fig. 3, the difference between the transmission time of the reference sensor and the transmission time of each sensor other than the reference sensor can be determined, so that the receiving time of the first acquired data of the reference sensor is corrected by using the difference corresponding to each sensor, and the sensor data synchronized in time is acquired by using the corrected receiving time, so that a group of acquired sensor data can be aligned in time with the receiving time of the first acquired data as a reference, the time alignment accuracy of multi-sensor data can be improved, and the environmental data acquired by each sensor can be unified.
Example four
Referring to fig. 4, fig. 4 is a schematic structural diagram of another data acquisition device according to an embodiment of the present invention. The data acquisition device shown in fig. 4 can be optimized by the data acquisition device shown in fig. 3. In the data acquisition device shown in fig. 4:
the reference sensor determined by the selection unit 301 from the plurality of sensors may be a camera;
and, in the data acquisition apparatus shown in fig. 4, the determining unit 302 may include:
an obtaining subunit 3021, configured to obtain an image captured by the camera and determined by the selecting unit 301, where the image includes a marker at a target position, and the target position is known;
a determination subunit 3022 configured to identify the marker from the image acquired by the acquisition subunit 3021 to determine an actual position of the marker in the image;
in the embodiment of the invention, a marker pole can be set up in advance or a specific guideboard, a speed limit marker and the like can be selected as the marker, and the target position of the marker can be measured in advance by using a satellite positioning means such as GPS positioning and the like.
A calculation subunit 3023, configured to determine, for any sensor of the plurality of sensors except the reference sensor, a positioning position of the sensor when the image is captured, based on data acquired by the sensor; project the marker into the image according to the target position of the marker, the positioning position of the sensor, and the relative position relationship between the sensor and the camera, so as to determine the distance between the projection position of the marker projected into the image and the actual position of the marker in the image determined by the determining subunit 3022; and determine the difference between the transmission time of the sensor and the transmission time of the camera based on the speed of the vehicle and the distance described above.
It can be seen that, with the data acquisition apparatus shown in fig. 4, if the camera captures the marker while the vehicle is driving, the determination unit 302, using the above-described embodiment, can automatically determine the difference between the transmission time of the reference sensor and the transmission time of each sensor other than the reference sensor, without modifying the hardware of each sensor or measuring the transmission time.
Optionally, in the data acquisition device shown in fig. 4, the data acquisition device may further include:
a first recording unit 305, configured to record, as a reception time of the second acquisition data, a satellite positioning time included in the second acquisition data acquired by the satellite positioning device when the target sensor targeted by the synchronization unit 304 is the satellite positioning device, where the satellite positioning time is an atomic clock time. Specifically, the synchronization unit 304 described above determines the second collected data as data synchronized with the first collected data, based on the reception timing of the second collected data recorded by the first recording unit 305.
When the target sensor is a satellite positioning device, the manner that the synchronization unit 304 is configured to correct the receiving time of the first collected data by using the difference value of the target sensor to obtain a new receiving time of the first collected data may specifically be:
the synchronization unit 304 is configured to determine a correction time according to the difference corresponding to the satellite positioning apparatus and the receiving time of the first acquisition data; and converting the correction time from the computer time to the atomic clock time according to the corresponding relation between the computer system time and the atomic clock time, and determining the conversion result as the new receiving time of the first acquisition data.
Further, the data acquisition device shown in fig. 4 may further include:
and a second recording unit 306, configured to record a correspondence between the computer system time and the atomic clock time according to the specific frequency. Specifically, the synchronization unit 304 described above converts the correction timing from the computer time to the atomic clock time according to the correspondence between the computer system time and the atomic clock time recorded by the second recording unit 306.
In summary, with the data acquisition device shown in fig. 4, the receiving time of the first acquired data may be corrected by the difference between the transmission time of the reference sensor and the transmission time of the target sensor, so as to improve the accuracy of the time alignment of the sensor data. Further, by implementing the data acquisition device shown in fig. 4, the transmission delay of the reference sensor relative to each sensor can be conveniently determined. Furthermore, the data acquisition device shown in fig. 4 may further correct the receiving time of the first acquired data by using the corresponding relationship between the atomic clock time and the computer system time, so as to further improve the accuracy of time synchronization.
EXAMPLE five
Referring to fig. 5, fig. 5 is a schematic structural diagram of another data acquisition device according to an embodiment of the present disclosure. As shown in fig. 5, the data acquisition apparatus may include:
a memory 501 in which executable program code is stored;
a processor 502 coupled to a memory 501;
sensor modules 503, including but not limited to cameras, satellite positioning devices, inertial measurement units, wheel speed gauges, etc.;
the processor 502 calls the executable program code stored in the memory 501, receives data collected by each sensor in the sensor module 503, and executes the time alignment method of the multi-sensor data shown in fig. 1 or fig. 2.
An embodiment of the present invention discloses a computer-readable storage medium storing a computer program, wherein the computer program causes a computer to execute a time alignment method of multi-sensor data shown in fig. 1 or fig. 2.
An embodiment of the invention discloses a computer program product comprising a non-transitory computer readable storage medium storing a computer program, and the computer program is operable to cause a computer to perform the method of time-aligning multi-sensor data shown in fig. 1 or fig. 2.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. Those skilled in the art should also appreciate that the embodiments described in this specification are exemplary and alternative embodiments, and that the acts and modules illustrated are not required in order to practice the invention.
In various embodiments of the present invention, it should be understood that the sequence numbers of the above-mentioned processes do not imply an inevitable order of execution, and the execution order of the processes should be determined by their functions and inherent logic, and should not constitute any limitation on the implementation process of the embodiments of the present invention.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated units, if implemented as software functional units and sold or used as a stand-alone product, may be stored in a computer accessible memory. Based on such understanding, the technical solution of the present invention, which is a part of or contributes to the prior art in essence, or all or part of the technical solution, can be embodied in the form of a software product, which is stored in a memory and includes several requests for causing a computer device (which may be a personal computer, a server, a network device, or the like, and may specifically be a processor in the computer device) to execute part or all of the steps of the above-described method of each embodiment of the present invention.
It will be understood by those skilled in the art that all or part of the steps in the methods of the embodiments described above may be implemented by a program instructing the relevant hardware, and the program may be stored in a computer-readable storage medium, where the storage medium includes Read-Only Memory (ROM), Random Access Memory (RAM), Programmable Read-Only Memory (PROM), Erasable Programmable Read-Only Memory (EPROM), One-time Programmable Read-Only Memory (OTPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Compact Disc Read-Only Memory (CD-ROM) or other optical disc memory, magnetic disk memory, tape memory, or any other computer-readable medium that can be used to carry or store data.
The above detailed description is provided for the time alignment method and the data acquisition device of multi-sensor data disclosed in the embodiments of the present invention, and the specific examples are applied herein to explain the principles and embodiments of the present invention. Meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.

Claims (8)

1. A method of time-aligning multi-sensor data, comprising:
determining a reference sensor from the plurality of sensors, wherein the transmission time required for the reference sensor to transmit data to a processor is longer than the transmission time required for any sensor of the plurality of sensors except the reference sensor to transmit data to the processor;
determining a difference between the transit time of each of the plurality of sensors other than the reference sensor and the transit time of the reference sensor;
reading the receiving time of the first acquisition data of the reference sensor;
for any target sensor of the plurality of sensors except the reference sensor, correcting the receiving time of the first acquired data by using the difference value corresponding to the target sensor to obtain a new receiving time of the first acquired data, and determining second acquired data and the first acquired data as synchronous data, wherein the second acquired data is acquired by the target sensor, and the receiving time of the second acquired data is the same as the new receiving time of the first acquired data;
the reference sensor is a camera, and the determining a difference between the transmission time of each of the plurality of sensors other than the reference sensor and the transmission time of the reference sensor includes:
acquiring an image shot by the camera, wherein the image comprises a marker at a target position, and the target position is known;
identifying the marker from the image to determine an actual location of the marker in the image;
for any sensor except the camera in the plurality of sensors, determining the positioning position of the sensor when the image is shot according to the data of the sensor;
projecting the marker into the image according to the target position, the positioning position and the relative position relationship between the sensor and the camera so as to determine the distance between the projection position of the marker projected into the image and the actual position of the marker in the image;
determining a difference between the transit time of the sensor and the transit time of the camera based on the speed of the vehicle and the distance.
2. The method of time-aligning multi-sensor data according to claim 1, wherein when the target sensor is a satellite positioning device, the method further comprises:
and recording the satellite positioning time contained in the second acquisition data as the receiving time of the second acquisition data, wherein the satellite positioning time is the atomic clock time.
3. The method of claim 2, wherein when the target sensor is a satellite positioning device, the correcting the receiving time of the first collected data by using the difference corresponding to the target sensor to obtain a new receiving time of the first collected data comprises:
determining a correction moment according to the difference corresponding to the satellite positioning device and the receiving moment of the first acquisition data;
and converting the correction time from the computer time to the atomic clock time according to the corresponding relation between the computer system time and the atomic clock time, and determining a conversion result as a new receiving time of the first acquisition data.
4. The method for time-aligning multi-sensor data according to claim 3, further comprising:
and recording the corresponding relation between the computer system time and the atomic clock time according to the specific frequency.
5. A data acquisition device, comprising:
the selecting unit is used for determining a reference sensor from the plurality of sensors, and the transmission time required by the reference sensor for transmitting data to the processor is longer than the transmission time required by any sensor except the reference sensor in the plurality of sensors for transmitting data to the processor;
a determination unit configured to determine a difference between the transmission time of each of the plurality of sensors other than the reference sensor and the transmission time of the reference sensor;
the reading unit is used for reading the receiving time of the first acquisition data of the reference sensor;
a synchronization unit configured to correct a reception timing of the first collected data by using the difference corresponding to the target sensor to obtain a new reception timing of the first collected data for any target sensor of the plurality of sensors other than the reference sensor, and determine second collected data, which is data collected by the target sensor and has a reception timing identical to the new reception timing of the first collected data, and the first collected data as synchronized data;
the reference sensor is a camera;
and, the determining unit includes:
the acquisition subunit is used for acquiring an image shot by the camera, wherein the image comprises a marker at a target position, and the target position is known;
a determining subunit, configured to identify the marker from the image to determine an actual position of the marker in the image;
a calculation subunit, configured to determine, for any sensor of the plurality of sensors except the reference sensor, a positioning position of the sensor when the image is captured, based on data acquired by the sensor; projecting the marker into the image according to the target position, the positioning position and the relative position relationship between the sensor and the camera so as to determine the distance between the projection position of the marker projected into the image and the actual position of the marker in the image; determining a difference between the transit time of the sensor and the transit time of the camera based on the speed of the vehicle and the distance.
6. The data acquisition device of claim 5, further comprising:
and the first recording unit is used for recording the satellite positioning time contained in the second acquisition data as the receiving time of the second acquisition data when the target sensor is a satellite positioning device, wherein the satellite positioning time is the atomic clock time.
7. The data acquisition device according to claim 6, wherein when the target sensor is a satellite positioning device, the synchronization unit is configured to correct the receiving time of the first acquisition data by using the difference corresponding to the target sensor to obtain a new receiving time of the first acquisition data by:
the synchronization unit is used for determining a correction time according to the difference corresponding to the satellite positioning device and the receiving time of the first acquisition data; and converting the correction time from the computer time to the atomic clock time according to the corresponding relation between the computer system time and the atomic clock time, and determining a conversion result as a new receiving time of the first acquisition data.
8. The data acquisition device of claim 7, further comprising:
and the second recording unit is used for recording the corresponding relation between the computer system time and the atomic clock time according to the specific frequency.
CN201811218803.9A 2018-10-18 2018-10-18 Time alignment method of multi-sensor data and data acquisition equipment Active CN111083309B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811218803.9A CN111083309B (en) 2018-10-18 2018-10-18 Time alignment method of multi-sensor data and data acquisition equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811218803.9A CN111083309B (en) 2018-10-18 2018-10-18 Time alignment method of multi-sensor data and data acquisition equipment

Publications (2)

Publication Number Publication Date
CN111083309A CN111083309A (en) 2020-04-28
CN111083309B true CN111083309B (en) 2022-04-01

Family

ID=70308812

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811218803.9A Active CN111083309B (en) 2018-10-18 2018-10-18 Time alignment method of multi-sensor data and data acquisition equipment

Country Status (1)

Country Link
CN (1) CN111083309B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114070443A (en) * 2020-08-05 2022-02-18 北京万集科技股份有限公司 Multi-sensing data time synchronization method, system, device and computer equipment
JP2022136764A (en) * 2021-03-08 2022-09-21 株式会社トプコン Optical data processing device, optical data processing method, and optical data processing program
JP2022136754A (en) * 2021-03-08 2022-09-21 株式会社トプコン Optical data processing device, optical data processing method, and optical data processing program
CN114006672B (en) * 2021-09-17 2024-04-02 东风汽车集团股份有限公司 Vehicle-mounted multi-sensor data synchronous acquisition method and system
CN114520855B (en) * 2021-12-31 2024-03-15 广州文远知行科技有限公司 Image frame rendering method and device based on multi-module data and storage medium
CN117082105B (en) * 2023-10-16 2023-12-15 湖南尚医康医疗科技有限公司 Environment-friendly intelligent hospital facility monitoring system and method

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWM333730U (en) * 2007-12-18 2008-06-01 Compro Technology Inc Audio/video playing device
US7558157B1 (en) * 2006-04-26 2009-07-07 Itt Manufacturing Enterprises, Inc. Sensor synchronization using embedded atomic clocks
CN101949715A (en) * 2010-08-10 2011-01-19 武汉武大卓越科技有限责任公司 Multi-sensor integrated synchronous control method and system for high-precision time-space data acquisition
CN102508297A (en) * 2011-10-08 2012-06-20 天津大学 Accurate measurement and correction method and device of synchronous acquisition time errors of multiple codes
EP2493126A1 (en) * 2009-11-23 2012-08-29 ZTE Corporation Method and system for clock recovery in packet switched network
CN102857333A (en) * 2011-06-28 2013-01-02 富士通株式会社 Device and method for synchronizing data packet from sensor network
CN102869086A (en) * 2012-09-25 2013-01-09 京信通信系统(中国)有限公司 Method and device for adjusting signal transmission time
CN103069835A (en) * 2010-08-12 2013-04-24 索尼公司 Information processing device, method and program
CN104426624A (en) * 2013-08-26 2015-03-18 浙江大华技术股份有限公司 Synchronous image display method and apparatus
CN104773178A (en) * 2015-03-26 2015-07-15 合肥工业大学 Driving gesture detecting system based on MEMS (Micro-electromechanical Systems) sensors
CN105338393A (en) * 2015-10-29 2016-02-17 小米科技有限责任公司 Medium synchronization method and device
CN105676693A (en) * 2016-01-15 2016-06-15 武汉光庭科技有限公司 Method and system for data synchronization of vehicle sensor in automatic driving
CN105915308A (en) * 2016-07-08 2016-08-31 腾讯科技(深圳)有限公司 Time synchronization method, device and system
CN107000752A (en) * 2014-10-28 2017-08-01 罗伯特·博世有限公司 Method and apparatus for positioning vehicle in its surrounding environment
CN107659367A (en) * 2017-08-30 2018-02-02 安徽江淮汽车集团股份有限公司 More sensing unit method for synchronizing time and system
CN108449552A (en) * 2018-03-07 2018-08-24 北京理工大学 Tag image acquires the method and system at moment
CN108519610A (en) * 2018-03-29 2018-09-11 武汉大学 Multi-sensor combined navigation system time synchronization and real-time data acquisition method

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2174396B1 (en) * 2007-07-23 2015-09-09 ABB Technology AG Method and protection device for a power network accounting for route switching in a telecommunication network
JP5633636B2 (en) * 2011-03-18 2014-12-03 富士通株式会社 Transmission delay difference correction method, communication apparatus, and communication system
JP2013017104A (en) * 2011-07-06 2013-01-24 Fujitsu Ltd Communication system
US20170041688A1 (en) * 2013-11-12 2017-02-09 Qualcomm Incorporated Apparatus and methods for timestamping in a system synchronizing controller and sensors
US10599459B2 (en) * 2016-02-12 2020-03-24 Nutanix, Inc. Entity database distributed replication

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"多传感器数据融合处理的时间对准研究";宫峰勋;《辽宁工程技术大学学报(自然科学版)》;20060309;第24卷(第6期);全文 *
"多平台多传感器数据融合中的时间一致";陈嫣;《火力与指挥控制》;20071130;第32卷(第11期);全文 *

Also Published As

Publication number Publication date
CN111083309A (en) 2020-04-28

Similar Documents

Publication Publication Date Title
CN111083309B (en) Time alignment method of multi-sensor data and data acquisition equipment
CN102575933B (en) System that generates map image integration database and program that generates map image integration database
US10481277B2 (en) Position correction of a vehicle by referencing to objects in the surroundings
JP5590480B2 (en) Control device, surveying system, program and recording medium
US9207677B2 (en) Vehicle positioning method and its system
JP5656316B1 (en) System including a marker device and method using the same
US20170050080A1 (en) Pedaling measurement apparatus, pedaling measurement system, pedaling measurement method, and recording medium
WO2020168667A1 (en) High-precision localization method and system based on shared slam map
CN112601928A (en) Position coordinate estimation device, position coordinate estimation method, and program
CN113783652B (en) Data synchronization method and device of integrated navigation system
KR20190107692A (en) Accurate Altitude Estimation for Indoor Positioning
CN102137183A (en) Mobile communication terminal and method
EP2444822A2 (en) Method and system for computing universal hybrid navigation information for a GNSS enabled device
CN110851545A (en) Map drawing method, device and equipment
CN103376455A (en) Sensor posture coordinate recording device applicable to low-altitude unpiloted remote sensing platform
CN110906830B (en) Intelligent distance measuring wheel system and distance measuring method thereof
WO2018131546A1 (en) Information processing device, information processing system, information processing method, and information processing program
WO2020008791A1 (en) Positioning system, positioning device and center device
CN115389246B (en) Speed precision measuring method, system and device of motion capture system
CN111397602A (en) High-precision positioning method and device integrating broadband electromagnetic fingerprint and integrated navigation
KR20090053623A (en) Method and apparatus for acquiring road image information
CN108981700B (en) Positioning and attitude determining method and device
JP2002243444A (en) Method and apparatus for surveying by aerial photography
KR20200036405A (en) Apparatus and method for correcting longitudinal position error of fine positioning system
CN110766105A (en) Vehicle key positioning method, device and equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220228

Address after: 100083 unit 501, block AB, Dongsheng building, No. 8, Zhongguancun East Road, Haidian District, Beijing

Applicant after: BEIJING MOMENTA TECHNOLOGY Co.,Ltd.

Address before: 100083 room 28, 4 / F, block a, Dongsheng building, 8 Zhongguancun East Road, Haidian District, Beijing

Applicant before: BEIJING CHUSUDU TECHNOLOGY Co.,Ltd.

GR01 Patent grant