CN111351487A - Clock synchronization method and device of multiple sensors and computing equipment

Info

Publication number
CN111351487A
CN111351487A (application CN202010103837.4A)
Authority
CN
China
Prior art keywords
sensor
angular velocity
time
coordinate system
calculating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010103837.4A
Other languages
Chinese (zh)
Other versions
CN111351487B (en)
Inventor
侍世腾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cloudminds Robotics Co Ltd
Original Assignee
Cloudminds Shenzhen Robotics Systems Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cloudminds Shenzhen Robotics Systems Co Ltd
Priority to CN202010103837.4A
Publication of CN111351487A
Application granted
Publication of CN111351487B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 - Instruments for performing navigational calculations
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Gyroscopes (AREA)

Abstract

The embodiment of the invention relates to the technical field of sensor calibration, and discloses a clock synchronization method and device of multiple sensors and a computing device. The method comprises the following steps: acquiring a first angular velocity of a first sensor in a first sensor coordinate system; acquiring a second angular velocity of a second sensor in a second sensor coordinate system and a transformation relation between the first sensor coordinate system and the second sensor coordinate system; calculating a third angular velocity of the second sensor in the first sensor coordinate system according to the second angular velocity and the transformation relation; calculating the angular velocity residual sum of the first sensor and the second sensor at each preset deviation time according to the first angular velocity and the third angular velocity; determining the preset deviation time corresponding to the minimum angular velocity residual sum as the time deviation of the first sensor and the second sensor; and performing clock synchronization according to the time deviation. In this way, the embodiment of the invention can calculate the time deviation among multiple sensors, thereby improving the accuracy of the sensing result.

Description

Clock synchronization method and device of multiple sensors and computing equipment
Technical Field
The embodiment of the invention relates to the technical field of sensor calibration, in particular to a clock synchronization method and device of multiple sensors and computing equipment.
Background
In the fields of autonomous driving and robot navigation, environment perception is one of the most critical technologies. Environmental information is perceived by various sensors, but each kind of sensor has its own shortcomings. For example, an inertial measurement unit can measure its own motion rapidly and at high frequency, but its errors accumulate quickly over time; a lidar cannot rapidly measure its own motion, but provides stable information over time. Therefore, fusing multiple sensors can compensate for the deficiencies of the individual sensors and improve the perception capability of the system.
However, when multiple sensors are fused, time deviations exist among the sensors, which greatly affects the sensing result.
Disclosure of Invention
An object of the embodiments of the present invention is to provide a clock synchronization method and device for multiple sensors, and a computing device, which can calculate the time deviation among multiple sensors, thereby improving the accuracy of the sensing result.
According to an aspect of the embodiments of the present invention, there is provided a clock synchronization method for multiple sensors, the method including: acquiring a first angular velocity of a first sensor in a first sensor coordinate system; acquiring a second angular velocity of a second sensor in a second sensor coordinate system and a transformation relation between the first sensor coordinate system and the second sensor coordinate system; calculating a third angular velocity of the second sensor in the first sensor coordinate system according to the second angular velocity and the transformation relation between the first sensor coordinate system and the second sensor coordinate system; calculating the angular velocity residual sum of the first sensor and the second sensor at each preset deviation time according to the first angular velocity and the third angular velocity; determining the preset deviation time corresponding to the minimum angular velocity residual sum as the time deviation of the first sensor and the second sensor; and performing clock synchronization on the first sensor and the second sensor according to the time deviation of the first sensor and the second sensor.
In an optional manner, the acquiring a first angular velocity of the first sensor in the first sensor coordinate system specifically includes: acquiring a first point cloud of the first sensor under a first sensor coordinate system at a first moment; acquiring a second point cloud of the first sensor under the first sensor coordinate system at a second moment; calculating to obtain a rotation matrix between the first point cloud and the second point cloud based on an iterative closest point algorithm; and calculating the first angular speed according to a rotation matrix between the first point cloud and the second point cloud, the first time and the second time.
In an optional manner, the calculating the first angular velocity according to the rotation matrix between the first point cloud and the second point cloud, the first time, and the second time specifically includes: converting a rotation matrix between the first point cloud and the second point cloud into euler angles; dividing the Euler angle by the difference between the first time and the second time to obtain the angular velocity of the first sensor at the intermediate time between the first time and the second time; determining an angular velocity of the first sensor at a time intermediate the first time and the second time as the first angular velocity.
In an alternative manner, the transformation relationship of the first sensor coordinate system and the second sensor coordinate system includes a rotation matrix of the first sensor coordinate system and the second sensor coordinate system; the calculating, according to the second angular velocity and the transformation relationship between the first sensor coordinate system and the second sensor coordinate system, a third angular velocity of the second sensor in the first sensor coordinate system specifically includes: and multiplying the second angular velocity by the rotation matrix to calculate the third angular velocity.
In an optional manner, the calculating, according to the first angular velocity and the third angular velocity, a sum of residual angular velocities of the first sensor and the second sensor at each preset deviation time specifically includes: determining a first reference moment and the first angular speed corresponding to the first reference moment; adding the preset deviation time to the first reference time to obtain a second reference time; determining the third angular velocity corresponding to the second reference moment; determining an angular velocity pair according to the first angular velocity corresponding to the first reference moment and the third angular velocity corresponding to the second reference moment; and calculating the sum of the vector modular lengths of the difference values of all the angular velocity pairs as the angular velocity residual sum of the first sensor and the second sensor under the preset deviation time.
In an optional manner, the determining a third angular velocity corresponding to the second reference time specifically includes: if the third angular velocity corresponding to the second reference moment does not exist, acquiring the third angular velocity corresponding to the last timestamp and the third angular velocity corresponding to the next timestamp of the second reference moment; calculating a fourth angular velocity of the second sensor at the second reference moment according to the third angular velocity corresponding to the previous timestamp and the third angular velocity corresponding to the next timestamp; and determining the fourth angular velocity as the third angular velocity corresponding to the second reference time.
In an alternative form, the first sensor includes at least one of a laser radar, a millimeter wave radar, a microwave radar, and an image sensor, and the second sensor includes an inertial measurement unit.
According to another aspect of the embodiments of the present invention, there is provided a clock synchronization apparatus of multiple sensors, the apparatus including: the first acquisition module is used for acquiring a first angular speed of the first sensor in a first sensor coordinate system; the second acquisition module is used for acquiring a second angular velocity of a second sensor in a second sensor coordinate system and a transformation relation between the first sensor coordinate system and the second sensor coordinate system; the angular velocity conversion module is used for calculating a third angular velocity of the second sensor in the first sensor coordinate system according to the second angular velocity and the transformation relation between the first sensor coordinate system and the second sensor coordinate system; the angular speed residual sum calculating module is used for calculating the angular speed residual sum of the first sensor and the second sensor at each preset deviation time according to the first angular speed and the third angular speed; a time deviation determining module, configured to determine a minimum angular velocity residual in the angular velocity residual sum and a corresponding preset deviation time as a time deviation of the first sensor and the second sensor; and the clock synchronization module is used for carrying out clock synchronization on the first sensor and the second sensor according to the time deviation of the first sensor and the second sensor.
According to still another aspect of the embodiments of the present invention, there is provided a computing device including a processor and a memory, the memory storing executable instructions that, when executed by the processor while the computing device is running, cause the processor to perform the operations of the multi-sensor clock synchronization method described above.
According to yet another aspect of the embodiments of the present invention, there is provided a computer-readable storage medium having stored therein at least one executable instruction for causing a processor to perform the steps of the multi-sensor clock synchronization method described above.
In the embodiment of the invention, a first angular velocity of a first sensor in a first sensor coordinate system is acquired; a second angular velocity of a second sensor in a second sensor coordinate system and the transformation relation between the first sensor coordinate system and the second sensor coordinate system are acquired; a third angular velocity of the second sensor in the first sensor coordinate system is calculated according to the second angular velocity and the transformation relation; the angular velocity residual sum of the first sensor and the second sensor at each preset deviation time is calculated according to the first angular velocity and the third angular velocity; the preset deviation time corresponding to the minimum angular velocity residual sum is determined as the time deviation of the first sensor and the second sensor; and the first sensor and the second sensor are clock-synchronized according to this time deviation. In this way, the time deviation of the first sensor and the second sensor can be calculated accurately, and further the time deviation among multiple sensors can be calculated, which solves the time synchronization problem of multiple sensors and thereby improves the accuracy of the sensing result.
The foregoing is only an overview of the technical solutions of the embodiments of the present invention. In order that the technical means of the embodiments may be understood more clearly and implemented according to the content of this description, and in order to make the above and other objects, features, and advantages of the embodiments more apparent, the detailed description of the invention is provided below.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 is a flow chart illustrating a method for clock synchronization of multiple sensors according to an embodiment of the present invention;
FIG. 2 shows a schematic flow chart of step 110 of FIG. 1;
FIG. 3 shows a schematic flow chart of step 140 of FIG. 1;
FIG. 4 is a flow chart illustrating an application example of a multi-sensor clock synchronization method according to an embodiment of the present invention;
FIG. 5 is a schematic diagram showing the relationship between the preset offset time and the sum of the residuals of the angular velocities of the lidar and the IMU;
FIG. 6 is a schematic structural diagram of a multi-sensor clock synchronization apparatus according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of a computing device provided by an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the invention are shown in the drawings, it should be understood that the invention can be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
An Inertial Measurement Unit (IMU) is a common sensor that can measure self-movement rapidly at high frequency, but as time goes on, the IMU error gradually accumulates, and the accumulated error needs to be corrected by the observation of other sensors. External sensors (such as radars, cameras and the like) can measure information such as outlines, colors and the like of a real environment, and tasks such as robot drawing, positioning and sensing can be completed by matching with the IMU. However, there is often a time offset between the IMU and the external sensors, which can have a large impact on the sensing result.
Based on this, the embodiments of the present invention provide a clock synchronization method and apparatus for multiple sensors, and a computing device, which can calculate time deviations among multiple sensors, thereby improving accuracy of sensing results.
Specifically, the embodiments of the present invention will be further explained below with reference to the drawings.
It should be understood that the following examples are provided by way of illustration and are not intended to limit the invention in any way to the particular embodiment disclosed.
Fig. 1 is a schematic flowchart illustrating a clock synchronization method for multiple sensors according to an embodiment of the present invention. The method is used for clock synchronization of a first sensor and a second sensor, where the first sensor and the second sensor are fixed on the device on which they are applied. The first sensor may include a radar sensor capable of generating a three-dimensional point cloud, including but not limited to a laser radar, a millimeter wave radar, a microwave radar, a Flash radar, a MEMS radar, a phased array radar, and the like, or an image sensor that can be used for imaging, including but not limited to a monocular camera, a panoramic camera, a binocular camera, a structured light camera, and the like. The second sensor may comprise an IMU; the embodiments of the invention can be used to calculate the time offset between the IMU and the radar sensor, or between the IMU and the image sensor.
In the present embodiment, the first sensor is a laser radar, and the second sensor is an IMU. As shown in fig. 1, the method includes:
step 110, a first angular velocity of the first sensor in a first sensor coordinate system is obtained.
The first sensor coordinate system may be a three-dimensional coordinate system established by a center of the laser radar. Since the lidar is in rotational motion, the first angular velocity refers to the average angular velocity of the lidar in the first sensor coordinate system.
Specifically, as shown in fig. 2, step 110 includes:
step 111, acquiring a first point cloud of the first sensor at a first moment in the first sensor coordinate system;
step 112, acquiring a second point cloud of the first sensor under the first sensor coordinate system at a second moment;
step 113, calculating to obtain a rotation matrix between the first point cloud and the second point cloud based on an iterative closest point algorithm;
and step 114, calculating a first angular speed according to the rotation matrix between the first point cloud and the second point cloud, the first time and the second time.
The first time and the second time are two different times. For example, point cloud data of the first sensor in the first sensor coordinate system is acquired every preset time t_d, and the first time and the second time may be separated by the preset time; that is, first point cloud data P_i is acquired at the first time t_i, and second point cloud data P_{i+d} is acquired at the second time t_{i+d}. For example, if the preset time is 1 second, the first time and the second time may be separated by 1 second: if the first time is i seconds, the second time is i+1 seconds.
The first point cloud and the second point cloud are matched by an Iterative Closest Point (ICP) algorithm, so that the rotation matrix between the first point cloud and the second point cloud is calculated. The ICP algorithm may be embodied as follows: first, a point p_i is selected in the first point cloud P_i; second, the point q_i corresponding to p_i is found in the second point cloud P_{i+d} such that the distance between p_i and q_i is minimal (for example, the second point cloud P_{i+d} may be traversed, the distance from p_i to each point of P_{i+d} calculated, and the distances compared to find the point q_i nearest to p_i); third, a rotation matrix R and a translation matrix T are calculated so as to minimize an error function; fourth, the point p_i is transformed by rotation and translation using the rotation matrix R and the translation matrix T obtained in the previous step to obtain a new corresponding point set p_i'; fifth, the average distance d between the points p_i' and the points q_i is calculated; and sixth, if the average distance d is smaller than a given threshold, or the preset maximum number of iterations has been reached, the iterative calculation is stopped; otherwise, the procedure returns to the second step until the convergence condition is met.
In the present embodiment, first, a point p_i ∈ P_i is selected in the first point cloud P_i, and the corresponding point q_i ∈ P_{i+d} is found in the second point cloud P_{i+d}. The rotation matrix R and the translation matrix T are then calculated by minimizing an error function E(R, T); an error function threshold may be set, and when the error function is less than this threshold, the error function is considered to be minimal, where:

E(R, T) = \frac{1}{n} \sum_{i=1}^{n} \left\| q_i - (R \cdot p_i + T) \right\|^2
Further, p_i is transformed by rotation and translation using the rotation matrix R and the translation matrix T to obtain a new corresponding point set p_i':

p_i' = R \cdot p_i + T, \quad p_i \in P_i
The average distance d between the points p_i' and the points q_i is calculated:

d = \frac{1}{n} \sum_{i=1}^{n} \left\| p_i' - q_i \right\|
If d is smaller than the set threshold, or the preset number of iterations has been reached, the iteration is stopped; otherwise, the iteration is repeated until the convergence condition is met.
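By way of illustration only, a minimal point-to-point ICP sketch in Python (with numpy) might look as follows. The brute-force nearest-neighbour search and the SVD-based solution of the error function are one common way to realize the second and third steps described above; all function and variable names are assumptions of this sketch, not taken from the patent.

```python
import numpy as np

def icp_rotation(P_i, P_j, max_iters=50, tol=1e-6):
    """Estimate the rotation R (and translation t) aligning point cloud
    P_i (N x 3) to point cloud P_j (M x 3) with a basic point-to-point ICP."""
    R_total, t_total = np.eye(3), np.zeros(3)
    src = P_i.copy()
    prev_err = np.inf
    for _ in range(max_iters):
        # Step 2: for every source point, find the nearest point in P_j (brute force).
        d2 = ((src[:, None, :] - P_j[None, :, :]) ** 2).sum(axis=2)
        q = P_j[np.argmin(d2, axis=1)]
        # Step 3: closed-form R, T minimizing E(R, T) via SVD (Arun's / Kabsch method).
        src_c, q_c = src.mean(axis=0), q.mean(axis=0)
        H = (src - src_c).T @ (q - q_c)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        t = q_c - R @ src_c
        # Step 4: apply the current rotation and translation to the source points.
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        # Steps 5-6: stop when the mean residual distance no longer improves.
        err = np.linalg.norm(src - q, axis=1).mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R_total, t_total
```

In practice a k-d tree would replace the brute-force nearest-neighbour search for large point clouds.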
In step 114, after the rotation matrix between the first point cloud and the second point cloud is obtained through calculation, the first angular velocity is calculated according to the rotation matrix between the first point cloud and the second point cloud, the first time, and the second time, and specifically, the method may include: converting a rotation matrix between the first point cloud and the second point cloud into an Euler angle; dividing the Euler angle by the difference between the first time and the second time to obtain the angular velocity of the first sensor at the intermediate time between the first time and the second time; the angular velocity of the first sensor at an intermediate time between the first time and the second time is determined as a first angular velocity.
The rotation matrix R between the first point cloud and the second point cloud is converted into the Euler angle \vec{\theta} = (\theta_x, \theta_y, \theta_z)^T, which specifically comprises the following steps.

The rotation matrix R is represented as:

R = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{bmatrix}

The Euler angle \vec{\theta} can then be expressed (using, for example, the Z-Y-X rotation order) as:

\theta_x = \arctan 2(r_{32}, r_{33}), \quad \theta_y = \arctan 2\left(-r_{31}, \sqrt{r_{32}^2 + r_{33}^2}\right), \quad \theta_z = \arctan 2(r_{21}, r_{11})

The angular velocity of the first sensor at the intermediate time between the first time t_i and the second time t_{i+d} is:

\vec{\omega}_{L} = \frac{\vec{\theta}}{t_{i+d} - t_i}

\vec{\omega}_{L} is then determined as the first angular velocity.
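A compact sketch of this conversion, under the same Z-Y-X Euler-angle assumption as the formulas above (names are illustrative, not from the patent):

```python
import numpy as np

def rotation_to_euler_zyx(R):
    """Convert a rotation matrix into (roll, pitch, yaw) Euler angles,
    assuming the Z-Y-X rotation order used in the formulas above."""
    roll = np.arctan2(R[2, 1], R[2, 2])
    pitch = np.arctan2(-R[2, 0], np.hypot(R[2, 1], R[2, 2]))
    yaw = np.arctan2(R[1, 0], R[0, 0])
    return np.array([roll, pitch, yaw])

def first_angular_velocity(R, t_i, t_i_plus_d):
    """First angular velocity: the Euler angle divided by the time difference,
    taken as the angular velocity at the intermediate time."""
    theta = rotation_to_euler_zyx(R)
    omega = theta / (t_i_plus_d - t_i)
    t_mid = 0.5 * (t_i + t_i_plus_d)
    return t_mid, omega
```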
Step 120, acquiring a second angular velocity of the second sensor in a second sensor coordinate system and a transformation relation between the first sensor coordinate system and the second sensor coordinate system.
The second sensor coordinate system may be a three-dimensional coordinate system established by using the IMU center. Since the IMU is also in rotational motion, the second angular velocity refers to the average angular velocity of the IMU in the second sensor coordinate system. The IMU can be used to measure angular velocity and acceleration, and the second angular velocity can be obtained directly from the angular velocity measured by the IMU.
The transformation relation between the first sensor coordinate system and the second sensor coordinate system may include a rotation matrix and a translation matrix between the first sensor coordinate system and the second sensor coordinate system. Before the first sensor and the second sensor are clock-synchronized, extrinsic (external parameter) calibration is performed on them, and the transformation relation between the first sensor coordinate system and the second sensor coordinate system can be obtained from the extrinsic calibration result of the first sensor and the second sensor.
Step 130, calculating a third angular velocity of the second sensor in the first sensor coordinate system according to the second angular velocity and the transformation relation between the first sensor coordinate system and the second sensor coordinate system.
The third angular velocity is an angular velocity obtained by converting the second angular velocity into the first sensor coordinate system. Then, according to the second angular velocity and the transformation relationship between the first sensor coordinate system and the second sensor coordinate system, the third angular velocity of the second sensor in the first sensor coordinate system is calculated, which may specifically be: and multiplying the second angular velocity by the rotation matrix to calculate a third angular velocity. That is, the third angular velocity is:
\vec{\omega}_{I}^{L} = R' \cdot \vec{\omega}_{I}

where \vec{\omega}_{I}^{L} is the third angular velocity of the second sensor in the first sensor coordinate system, \vec{\omega}_{I} is the second angular velocity of the second sensor in the second sensor coordinate system, and R' is the rotation matrix.
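As a small illustration, once the extrinsic rotation matrix R' is known, a whole sequence of IMU angular velocities can be rotated into the first sensor coordinate system at once (a sketch with assumed names):

```python
import numpy as np

def imu_angular_velocity_in_lidar_frame(omega_imu, R_lidar_imu):
    """Third angular velocity: rotate the IMU (second) angular velocities into
    the first sensor coordinate frame, omega_L = R' @ omega_I for each sample.

    omega_imu   : (N, 3) IMU angular velocities in the IMU frame
    R_lidar_imu : (3, 3) rotation matrix R' from the IMU frame to the lidar frame,
                  taken from the extrinsic calibration result
    """
    return omega_imu @ R_lidar_imu.T
```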
Step 140, calculating the angular velocity residual sum of the first sensor and the second sensor at each preset deviation time according to the first angular velocity and the third angular velocity.
The preset deviation time is a time deviation between the first sensor and the second sensor that is set in advance, and there are several preset deviation times. For example, if the deviation range is set to -200 ms to 200 ms and the search step is 1 ms, the preset deviation times are -200 ms, -199 ms, -198 ms, ..., 199 ms, 200 ms. At each preset deviation time, the angular velocity residual sum of the first sensor and the second sensor is calculated from the angular velocities of the corresponding sampling points of the two sensors, so that the angular velocity residual sum of the first sensor and the second sensor at each preset deviation time is obtained.
Specifically, as shown in fig. 3, step 140 includes:
step 141, determining a first reference time and a first angular velocity corresponding to the first reference time;
step 142, adding a preset deviation time to the first reference time to obtain a second reference time;
step 143, determining a third angular velocity corresponding to the second reference moment;
step 144, determining an angular velocity pair according to the first angular velocity corresponding to the first reference time and the third angular velocity corresponding to the second reference time;
and step 145, calculating the sum of the vector modulus lengths of the differences of all the angular velocity pairs as the angular velocity residual sum of the first sensor and the second sensor at the preset deviation time.
Wherein, the first reference time refers to the time stamp of the first sensor, i.e. the time of the laser radar. For example, the laser radar acquires a group of point cloud data every 1 second, and acquires 4 groups of point cloud data, so that the first reference time may be 1s, 2s, 3s, and 4s, and the first angular velocity corresponding to the first reference time refers to the angular velocity corresponding to the laser radar at the time of 1s, 2s, 3s, and 4 s.
Wherein the second reference moment refers to a time stamp of the second sensor, i.e. the time of the IMU. For example, assuming that the first reference time is 1s, 2s, 3s, and 4s, if one of the preset deviation times is-200 ms, the second reference time is 0.8s, 1.8s, 2.8s, and 3.8s, and the angular velocity corresponding to the IMU at the time of 0.8s, 1.8s, 2.8s, and 3.8s is the third angular velocity corresponding to the second reference time; for another example, assuming that the first reference time is 1s, 2s, 3s, and 4s, and if one of the preset deviation times is 100ms, the second reference time is 1.1s, 2.1s, 3.1s, and 4.1s, and the angular velocity corresponding to the IMU at the time 1.1s, 2.1s, 3.1s, and 4.1s is the third angular velocity corresponding to the second reference time.
In steps 144 and 145, the first angular velocity corresponding to the first reference time and the third angular velocity corresponding to the second reference time together form an angular velocity pair; the sum of the vector modulus lengths of the differences of all the angular velocity pairs is then calculated. For example, if the first reference times are 1 s, 2 s, 3 s, 4 s and the second reference times are 1.1 s, 2.1 s, 3.1 s, 4.1 s, then the first angular velocity of the laser radar at 1 s and the third angular velocity of the IMU at 1.1 s constitute a first angular velocity pair, the first angular velocity of the laser radar at 2 s and the third angular velocity of the IMU at 2.1 s constitute a second angular velocity pair, the first angular velocity of the laser radar at 3 s and the third angular velocity of the IMU at 3.1 s constitute a third angular velocity pair, and the first angular velocity of the laser radar at 4 s and the third angular velocity of the IMU at 4.1 s constitute a fourth angular velocity pair, giving four angular velocity pairs in total.
The sum of the vector modulus lengths of the differences of all angular velocity pairs is calculated as:

\delta_t = \sum_{k=1}^{n} \left\| \vec{\omega}_{L,k} - \vec{\omega}_{I,k}^{L} \right\|

where t is the preset deviation time, \delta_t is the sum of the vector modulus lengths of the differences of all angular velocity pairs at the preset deviation time t (that is, the angular velocity residual sum of the first sensor and the second sensor at the preset deviation time t), n is the number of angular velocity pairs, k ∈ [1, n] and k is an integer, and \left\| \vec{\omega}_{L,k} - \vec{\omega}_{I,k}^{L} \right\| is the vector modulus length of the difference of the k-th angular velocity pair.
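Putting steps 141 to 145 together, one way to sketch the computation of δ_t for a single preset deviation time is shown below (Python with numpy; the per-axis np.interp call stands in for the interpolation of step 143, and all names are assumptions of this sketch):

```python
import numpy as np

def angular_velocity_residual_sum(offset, lidar_t, lidar_w, imu_t, imu_w_lidar):
    """Angular velocity residual sum delta_t for one preset deviation time.

    offset      : candidate time deviation (seconds) added to the lidar timestamps
    lidar_t     : (n,)  first-sensor timestamps (first reference times)
    lidar_w     : (n,3) first angular velocities at those times
    imu_t       : (m,)  second-sensor timestamps
    imu_w_lidar : (m,3) third angular velocities (IMU rotated into the lidar frame)
    """
    residual = 0.0
    for t_ref, w_lidar in zip(lidar_t, lidar_w):
        t2 = t_ref + offset            # second reference time
        if t2 < imu_t[0] or t2 > imu_t[-1]:
            continue                   # no IMU samples bracketing this time; skip the pair
        # Interpolate the third angular velocity at t2 (per-axis linear interpolation).
        w_imu = np.array([np.interp(t2, imu_t, imu_w_lidar[:, k]) for k in range(3)])
        residual += np.linalg.norm(w_lidar - w_imu)   # vector modulus of the difference
    return residual
```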
Step 150, determining the preset deviation time corresponding to the minimum angular velocity residual sum as the time deviation of the first sensor and the second sensor.
After the angular velocity residual sum at each preset deviation time has been calculated, the residual sums are compared, the smallest angular velocity residual sum is found, and the preset deviation time corresponding to it is determined as the time deviation of the first sensor and the second sensor. For example, if the angular velocity residual sum is minimal when the preset deviation time is 12 ms, then 12 ms is determined as the time deviation of the first sensor and the second sensor.
Step 160, performing clock synchronization on the first sensor and the second sensor according to the time deviation of the first sensor and the second sensor.
The clock of the first sensor or the second sensor is adjusted according to the time deviation of the first sensor and the second sensor, so that the clocks of the first sensor and the second sensor are synchronized.
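In software, the synchronization itself can be as simple as re-stamping one sensor's data stream. A minimal sketch follows; the sign convention and the names are assumptions, not taken from the patent:

```python
import numpy as np

def apply_time_deviation(timestamps, time_deviation):
    """Re-stamp one sensor's timestamps by the estimated time deviation so that
    both sensors report times on a common clock (illustrative sign convention)."""
    return np.asarray(timestamps, dtype=float) - time_deviation

imu_t = np.array([0.012, 0.017, 0.022])            # example raw IMU timestamps (s)
synced_imu_t = apply_time_deviation(imu_t, 0.012)  # align to the lidar clock
```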
In the embodiment of the invention, a first angular velocity of a first sensor in a first sensor coordinate system is acquired; a second angular velocity of a second sensor in a second sensor coordinate system and the transformation relation between the first sensor coordinate system and the second sensor coordinate system are acquired; a third angular velocity of the second sensor in the first sensor coordinate system is calculated according to the second angular velocity and the transformation relation; the angular velocity residual sum of the first sensor and the second sensor at each preset deviation time is calculated according to the first angular velocity and the third angular velocity; the preset deviation time corresponding to the minimum angular velocity residual sum is determined as the time deviation of the first sensor and the second sensor; and the first sensor and the second sensor are clock-synchronized according to this time deviation. In this way, the time deviation of the first sensor and the second sensor can be calculated accurately, and further the time deviation among multiple sensors can be calculated, which solves the time synchronization problem of multiple sensors and thereby improves the accuracy of the sensing result.
In some embodiments, in step 143, when the third angular velocity corresponding to the second reference time cannot be directly obtained from the measurement data, the third angular velocity corresponding to the second reference time needs to be calculated. Step 143 may specifically include:
step 1431, if the third angular velocity corresponding to the second reference time does not exist, acquiring a third angular velocity corresponding to a previous timestamp and a third angular velocity corresponding to a next timestamp of the second reference time;
step 1432, calculating a fourth angular velocity of the second sensor at the second reference time according to the third angular velocity corresponding to the previous timestamp and the third angular velocity corresponding to the next timestamp;
step 1433, determining the fourth angular velocity as the third angular velocity corresponding to the second reference time.
If no IMU data was acquired at the second reference time, then no third angular velocity corresponding to the second reference time can be found directly.
The previous timestamp of the second reference time refers to the time at which IMU data was acquired immediately before the second reference time, and the next timestamp of the second reference time refers to the time at which IMU data was acquired immediately after the second reference time. The fourth angular velocity of the second sensor at the second reference time is calculated from the third angular velocities at these two times. Assuming that the previous timestamp of the second reference time t_2 is t_{21} and the next timestamp is t_{22} (so that t_{21} < t_2 < t_{22}), the fourth angular velocity corresponding to the second reference time t_2 may be obtained, for example, by linear interpolation:

\vec{\omega}_{I}^{L}(t_2) = \vec{\omega}_{I}^{L}(t_{21}) + \frac{t_2 - t_{21}}{t_{22} - t_{21}} \left( \vec{\omega}_{I}^{L}(t_{22}) - \vec{\omega}_{I}^{L}(t_{21}) \right)

where \vec{\omega}_{I}^{L}(t_2) is the fourth angular velocity, \vec{\omega}_{I}^{L}(t_{21}) is the angular velocity of the IMU at time t_{21}, and \vec{\omega}_{I}^{L}(t_{22}) is the angular velocity of the IMU at time t_{22}.
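A direct transcription of this interpolation (assuming, as above, a linear combination; names are illustrative):

```python
import numpy as np

def fourth_angular_velocity(t2, t21, w21, t22, w22):
    """Fourth angular velocity at the second reference time t2, linearly
    interpolated between the third angular velocities w21 (at the previous
    timestamp t21) and w22 (at the next timestamp t22), with t21 < t2 < t22."""
    alpha = (t2 - t21) / (t22 - t21)
    return np.asarray(w21) + alpha * (np.asarray(w22) - np.asarray(w21))
```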
Fig. 4 is a flowchart illustrating an application example of a clock synchronization method for multiple sensors according to an embodiment of the present invention. The method comprises the steps of fixing a 10HZ multi-line laser radar and a 200HZ IMU on a robot body, moving the robot to a factory building with obvious characteristics and no obvious degradation scene, controlling the robot to move (including rotation and translation in all directions) and start to record data, finishing movement after one minute, storing the data, wherein the total of 600 frames of laser radar data and 12000 frames of IMU data are stored.
As shown in fig. 4, the method includes:
step 201, calculating a first angular velocity of the laser radar in a laser radar coordinate system.
A first point cloud P_i of the i-th frame of the laser radar and a second point cloud P_{i+1} of the (i+1)-th frame are obtained, and the rotation matrix R between P_i and P_{i+1} is calculated by the ICP algorithm. The method of calculating the rotation matrix between P_i and P_{i+1} by the ICP algorithm is described in detail in the above embodiments and is not repeated here. The rotation matrix R is converted into the Euler angle \vec{\theta}, and according to the Euler angle, the first angular velocity of the first sensor \vec{\omega}_{L} is calculated as:

\vec{\omega}_{L} = \frac{\vec{\theta}}{t_{i+1} - t_i}
Step 202, acquiring a second angular velocity of the IMU in the IMU coordinate system and a transformation relation between the laser radar coordinate system and the IMU coordinate system.

The second angular velocity \vec{\omega}_{I} of the IMU in the IMU coordinate system is obtained, and the transformation relation between the laser radar coordinate system and the IMU coordinate system is R'.
Step 203, calculating a third angular velocity of the IMU in the laser radar coordinate system according to the second angular velocity and the transformation relation between the laser radar coordinate system and the IMU coordinate system.
According to \vec{\omega}_{I}^{L} = R' \cdot \vec{\omega}_{I}, the third angular velocity \vec{\omega}_{I}^{L} of the IMU in the laser radar coordinate system is obtained through calculation.
Step 204, calculating the angular velocity residual sum of the laser radar and the IMU at each preset deviation time according to the first angular velocity and the third angular velocity.
The deviation range is set to -200 ms to 200 ms, the search step is 1 ms, and the preset deviation times are -200 ms, -199 ms, -198 ms, ..., 199 ms, 200 ms. The angular velocity residual sum of the laser radar and the IMU at each preset deviation time is calculated, as shown in Fig. 5.
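The per-offset computation of step 204 and the minimum search of step 205 can be sketched as a simple scan over the -200 ms to 200 ms grid. The snippet below repeats the residual computation from the earlier sketch so that it stays self-contained; all names are assumptions:

```python
import numpy as np

def search_time_deviation(lidar_t, lidar_w, imu_t, imu_w_lidar,
                          lo=-0.200, hi=0.200, step=0.001):
    """Scan candidate deviation times from lo to hi (seconds) with the given
    step and return the one with the smallest angular velocity residual sum."""
    def residual_sum(offset):
        total = 0.0
        for t_ref, w_lidar in zip(lidar_t, lidar_w):
            t2 = t_ref + offset
            if t2 < imu_t[0] or t2 > imu_t[-1]:
                continue
            w_imu = np.array([np.interp(t2, imu_t, imu_w_lidar[:, k]) for k in range(3)])
            total += np.linalg.norm(w_lidar - w_imu)
        return total

    offsets = np.arange(lo, hi + step / 2, step)   # -200 ms ... 200 ms in 1 ms steps
    residuals = np.array([residual_sum(o) for o in offsets])
    best = int(np.argmin(residuals))
    return offsets[best], residuals[best]
```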
Step 205, determining the preset deviation time corresponding to the minimum angular velocity residual sum as the time deviation of the laser radar and the IMU.
As shown in Fig. 5, the angular velocity residual sum of the laser radar and the IMU is minimal when the preset deviation time is 12 ms, so 12 ms is determined as the time deviation of the laser radar and the IMU.
Step 206, adjusting the clock of the laser radar or the IMU according to the time deviation of the laser radar and the IMU, so that the clocks of the laser radar and the IMU are synchronized.
In this embodiment of the invention, the third angular velocity of the IMU in the laser radar coordinate system is calculated according to the second angular velocity and the transformation relation between the laser radar coordinate system and the IMU coordinate system; the angular velocity residual sum of the laser radar and the IMU at each preset deviation time is calculated according to the first angular velocity and the third angular velocity; the preset deviation time corresponding to the minimum angular velocity residual sum is determined as the time deviation of the laser radar and the IMU; and the laser radar and the IMU are clock-synchronized according to this time deviation. In this way, the time deviation of the laser radar and the IMU can be calculated accurately, and further the time deviation among multiple sensors can be calculated, which solves the time synchronization problem of multiple sensors and thereby improves the accuracy of the sensing result.
Fig. 6 is a schematic structural diagram illustrating a clock synchronization apparatus for multiple sensors according to an embodiment of the present invention. As shown in fig. 6, the apparatus 300 includes: a first acquisition module 310, a second acquisition module 320, an angular velocity conversion module 330, an angular velocity residual sum calculation module 340, a time deviation determination module 350, and a clock synchronization module 360.
The first obtaining module 310 is configured to obtain a first angular velocity of the first sensor in a first sensor coordinate system; the second obtaining module 320 is configured to obtain a second angular velocity of the second sensor in a second sensor coordinate system and a transformation relationship between the first sensor coordinate system and the second sensor coordinate system; the angular velocity conversion module 330 is configured to calculate a third angular velocity of the second sensor in the first sensor coordinate system according to the second angular velocity and a transformation relationship between the first sensor coordinate system and the second sensor coordinate system; the angular velocity residual sum calculating module 340 is configured to calculate an angular velocity residual sum of the first sensor and the second sensor at each preset deviation time according to the first angular velocity and the third angular velocity; the time deviation determining module 350 is configured to determine a minimum angular velocity residual in the sum of angular velocity residuals and the corresponding preset deviation time as a time deviation of the first sensor and the second sensor; the clock synchronization module 360 is configured to perform clock synchronization on the first sensor and the second sensor according to the time deviation of the first sensor and the second sensor.
In an optional manner, the first obtaining module 310 is specifically configured to: acquiring a first point cloud of the first sensor under a first sensor coordinate system at a first moment; acquiring a second point cloud of the first sensor under the first sensor coordinate system at a second moment; calculating to obtain a rotation matrix between the first point cloud and the second point cloud based on an iterative closest point algorithm; and calculating the first angular speed according to a rotation matrix between the first point cloud and the second point cloud, the first time and the second time.
In an optional manner, the first obtaining module 310 is further specifically configured to: converting a rotation matrix between the first point cloud and the second point cloud into euler angles; dividing the Euler angle by the difference between the first time and the second time to obtain the angular velocity of the first sensor at the intermediate time between the first time and the second time; determining an angular velocity of the first sensor at a time intermediate the first time and the second time as the first angular velocity.
In an alternative manner, the transformation relationship of the first sensor coordinate system and the second sensor coordinate system includes a rotation matrix of the first sensor coordinate system and the second sensor coordinate system; the angular velocity conversion module 330 is specifically configured to: and multiplying the second angular velocity by the rotation matrix to calculate the third angular velocity.
In an optional manner, the angular velocity residual sum calculating module 340 is specifically configured to: determining a first reference moment and the first angular speed corresponding to the first reference moment; adding the preset deviation time to the first reference time to obtain a second reference time; determining the third angular velocity corresponding to the second reference moment; determining an angular velocity pair according to the first angular velocity corresponding to the first reference moment and the third angular velocity corresponding to the second reference moment; and calculating the sum of the vector modular lengths of the difference values of all the angular velocity pairs as the angular velocity residual sum of the first sensor and the second sensor under the preset deviation time.
In an optional manner, the angular velocity residual sum calculating module 340 is further specifically configured to: if the third angular velocity corresponding to the second reference moment does not exist, acquiring the third angular velocity corresponding to the last timestamp and the third angular velocity corresponding to the next timestamp of the second reference moment; calculating a fourth angular velocity of the second sensor at the second reference moment according to the third angular velocity corresponding to the previous timestamp and the third angular velocity corresponding to the next timestamp; and determining the fourth angular velocity as the third angular velocity corresponding to the second reference time.
In an alternative form, the first sensor includes at least one of a laser radar, a millimeter wave radar, a microwave radar, and an image sensor, and the second sensor includes an inertial measurement unit.
It should be noted that the multi-sensor clock synchronization apparatus provided in the embodiments of the present invention is an apparatus capable of executing the multi-sensor clock synchronization method, and all embodiments of the multi-sensor clock synchronization method are applicable to the apparatus and can achieve the same or similar beneficial effects.
In the embodiment of the invention, a first angular velocity of a first sensor in a first sensor coordinate system is acquired; a second angular velocity of a second sensor in a second sensor coordinate system and the transformation relation between the first sensor coordinate system and the second sensor coordinate system are acquired; a third angular velocity of the second sensor in the first sensor coordinate system is calculated according to the second angular velocity and the transformation relation; the angular velocity residual sum of the first sensor and the second sensor at each preset deviation time is calculated according to the first angular velocity and the third angular velocity; the preset deviation time corresponding to the minimum angular velocity residual sum is determined as the time deviation of the first sensor and the second sensor; and the first sensor and the second sensor are clock-synchronized according to this time deviation. In this way, the time deviation of the first sensor and the second sensor can be calculated accurately, and further the time deviation among multiple sensors can be calculated, which solves the time synchronization problem of multiple sensors and thereby improves the accuracy of the sensing result.
Embodiments of the present invention provide a computer-readable storage medium, where at least one executable instruction is stored, and the executable instruction causes a processor to execute the clock synchronization method of multiple sensors in any of the above method embodiments.
Embodiments of the present invention provide a computer program product comprising a computer program stored on a computer storage medium, the computer program comprising program instructions which, when executed by a computer, cause the computer to perform a method of multi-sensor clock synchronization in any of the above-described method embodiments.
Fig. 7 is a schematic structural diagram of a computing device according to an embodiment of the present invention, and a specific embodiment of the present invention does not limit a specific implementation of the computing device.
The computing device comprises a processor and a memory. The memory is configured to store at least one executable instruction that, when executed, causes the processor to perform the steps of the multi-sensor clock synchronization method according to any of the above-described method embodiments.
Alternatively, as shown in fig. 7, the computing device may include: a processor (processor)402, a Communications Interface 404, a memory 406, and a Communications bus 408.
Wherein: the processor 402, communication interface 404, and memory 406 communicate with each other via a communication bus 408. A communication interface 404 for communicating with network elements of other devices, such as clients or other servers. The processor 402 is configured to execute the program 410, and may specifically execute the multi-sensor clock synchronization method in any of the above-described method embodiments.
In particular, program 410 may include program code comprising computer operating instructions.
The processor 402 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement an embodiment of the present invention. The computing device includes one or more processors, which may be the same type of processor, such as one or more CPUs, or different types of processors, such as one or more CPUs and one or more ASICs.
The memory 406 is configured to store the program 410. The memory 406 may comprise a high-speed RAM memory, and may also include a non-volatile memory, such as at least one disk memory.
In the embodiment of the invention, a first angular velocity of a first sensor in a first sensor coordinate system is acquired; a second angular velocity of a second sensor in a second sensor coordinate system and the transformation relation between the first sensor coordinate system and the second sensor coordinate system are acquired; a third angular velocity of the second sensor in the first sensor coordinate system is calculated according to the second angular velocity and the transformation relation; the angular velocity residual sum of the first sensor and the second sensor at each preset deviation time is calculated according to the first angular velocity and the third angular velocity; the preset deviation time corresponding to the minimum angular velocity residual sum is determined as the time deviation of the first sensor and the second sensor; and the first sensor and the second sensor are clock-synchronized according to this time deviation. In this way, the time deviation of the first sensor and the second sensor can be calculated accurately, and further the time deviation among multiple sensors can be calculated, which solves the time synchronization problem of multiple sensors and thereby improves the accuracy of the sensing result.
The algorithms or displays presented herein are not inherently related to any particular computer, virtual system, or other apparatus. Various general purpose systems may also be used with the teachings herein. The required structure for constructing such a system will be apparent from the description above. In addition, embodiments of the present invention are not directed to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein, and any descriptions of specific languages are provided above to disclose the best mode of the invention.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the embodiments of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the invention and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that: that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The usage of the words first, second and third, etcetera do not indicate any ordering. These words may be interpreted as names. The steps in the above embodiments should not be construed as limiting the order of execution unless specified otherwise.

Claims (10)

1. A method of clock synchronization for multiple sensors, the method comprising:
acquiring a first angular velocity of a first sensor in a first sensor coordinate system;
acquiring a second angular velocity of a second sensor in a second sensor coordinate system and a transformation relationship between the first sensor coordinate system and the second sensor coordinate system;
calculating a third angular velocity of the second sensor in the first sensor coordinate system according to the second angular velocity and the transformation relationship between the first sensor coordinate system and the second sensor coordinate system;
calculating an angular velocity residual sum of the first sensor and the second sensor at each preset deviation time according to the first angular velocity and the third angular velocity;
determining the preset deviation time corresponding to the minimum of the angular velocity residual sums as the time deviation between the first sensor and the second sensor; and
performing clock synchronization on the first sensor and the second sensor according to the time deviation between the first sensor and the second sensor.
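As a non-authoritative illustration of the search described in claim 1, the following Python sketch assumes that the first and third angular velocity series have already been computed (for example by the steps of claims 2-4) and that a residual_sum helper with the signature sketched after claim 5 is available; the function name, array shapes and candidate-offset grid are assumptions, not part of the claim.

import numpy as np

def estimate_time_offset(t1, w1, t3, w3, offsets, residual_sum):
    """Pick the preset deviation time with the smallest angular velocity residual sum.

    t1, w1  : first-sensor timestamps (N,) and first angular velocities (N, 3)
    t3, w3  : second-sensor timestamps (M,) and third angular velocities (M, 3),
              i.e. already expressed in the first sensor coordinate system
    offsets : 1-D array of preset deviation times to evaluate
    residual_sum : callable (t1, w1, t3, w3, offset) -> float
    """
    residuals = np.array([residual_sum(t1, w1, t3, w3, d) for d in offsets])
    return offsets[int(np.argmin(residuals))]  # time deviation between the sensors

Clock synchronization then amounts to shifting one sensor's timestamps by the returned offset (for example t3_synced = t3 - offset); the sign convention depends on which clock is taken as the reference.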
2. The method of claim 1, wherein acquiring the first angular velocity of the first sensor in the first sensor coordinate system comprises:
acquiring a first point cloud of the first sensor in the first sensor coordinate system at a first time;
acquiring a second point cloud of the first sensor in the first sensor coordinate system at a second time;
calculating a rotation matrix between the first point cloud and the second point cloud based on an iterative closest point algorithm; and
calculating the first angular velocity according to the rotation matrix between the first point cloud and the second point cloud, the first time, and the second time.
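Claim 2 names the iterative closest point algorithm but not a particular implementation; one possible realisation uses Open3D's point-to-point ICP, as in the sketch below (the correspondence distance threshold and the identity initial guess are assumptions):

import numpy as np
import open3d as o3d

def rotation_between_scans(points_t1, points_t2, max_corr_dist=0.5):
    """Estimate the first sensor's rotation between two scans via ICP.

    points_t1, points_t2 : (N, 3) and (M, 3) point arrays in the first sensor frame.
    """
    src = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(points_t1))
    dst = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(points_t2))
    result = o3d.pipelines.registration.registration_icp(
        src, dst, max_corr_dist, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation[:3, :3]  # rotation block of the estimated 4x4 transform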
3. The method of claim 2, wherein calculating the first angular velocity from the rotation matrix between the first point cloud and the second point cloud, the first time, and the second time comprises:
converting the rotation matrix between the first point cloud and the second point cloud into Euler angles;
dividing the Euler angles by the difference between the first time and the second time to obtain the angular velocity of the first sensor at the time midway between the first time and the second time; and
determining the angular velocity of the first sensor at the time midway between the first time and the second time as the first angular velocity.
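A minimal sketch of the conversion in claim 3, using SciPy's Rotation helper; the "xyz" Euler convention and the sign of the time difference are assumptions the claim does not fix:

import numpy as np
from scipy.spatial.transform import Rotation

def first_angular_velocity(R_12, t1, t2):
    """Divide the Euler angles of the inter-scan rotation by the elapsed time.

    Returns the midpoint timestamp and the angular velocity attributed to it.
    """
    euler = Rotation.from_matrix(R_12).as_euler("xyz")  # radians
    omega = euler / (t2 - t1)                           # rad/s, 3-vector
    return 0.5 * (t1 + t2), omega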
4. The method of claim 1, wherein the transformation relationship of the first sensor coordinate system to the second sensor coordinate system comprises a rotation matrix of the first sensor coordinate system to the second sensor coordinate system;
and calculating the third angular velocity of the second sensor in the first sensor coordinate system according to the second angular velocity and the transformation relationship between the first sensor coordinate system and the second sensor coordinate system specifically comprises:
multiplying the second angular velocity by the rotation matrix to obtain the third angular velocity.
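Claim 4 reduces to a single matrix-vector product; in the sketch below R_1_2 is assumed to denote the extrinsic rotation taking vectors from the second sensor coordinate system into the first:

import numpy as np

def third_angular_velocity(R_1_2, omega_second):
    """Express the second sensor's angular velocity in the first sensor frame."""
    return np.asarray(R_1_2) @ np.asarray(omega_second)  # 3-vector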
5. The method according to claim 1, wherein calculating the angular velocity residual sum of the first sensor and the second sensor at each preset deviation time according to the first angular velocity and the third angular velocity specifically comprises:
determining a first reference time and the first angular velocity corresponding to the first reference time;
adding the preset deviation time to the first reference time to obtain a second reference time;
determining the third angular velocity corresponding to the second reference time;
determining an angular velocity pair from the first angular velocity corresponding to the first reference time and the third angular velocity corresponding to the second reference time; and
calculating the sum of the vector norms of the differences of all the angular velocity pairs as the angular velocity residual sum of the first sensor and the second sensor at the preset deviation time.
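The residual sum of claim 5 can be sketched as follows; the per-axis np.interp call stands in for the previous/next-timestamp handling of claim 6 and is an implementation choice rather than something the claim prescribes:

import numpy as np

def residual_sum(t1, w1, t3, w3, offset):
    """Angular velocity residual sum of the two sensors at one preset deviation time.

    t1, w1 : first-sensor timestamps (N,) and first angular velocities (N, 3)
    t3, w3 : second-sensor timestamps (M,) and third angular velocities (M, 3)
    """
    total = 0.0
    for t_ref, w_first in zip(t1, w1):       # first reference time and its velocity
        t_second = t_ref + offset            # second reference time
        w_third = np.array([np.interp(t_second, t3, w3[:, k]) for k in range(3)])
        total += np.linalg.norm(w_first - w_third)  # modulus of the pair difference
    return total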
6. The method according to claim 5, wherein determining the third angular velocity corresponding to the second reference time specifically comprises:
if no third angular velocity exists at the second reference time, acquiring the third angular velocity corresponding to the previous timestamp and the third angular velocity corresponding to the next timestamp relative to the second reference time;
calculating a fourth angular velocity of the second sensor at the second reference time according to the third angular velocity corresponding to the previous timestamp and the third angular velocity corresponding to the next timestamp; and
determining the fourth angular velocity as the third angular velocity corresponding to the second reference time.
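Claim 6 only states that the fourth angular velocity is computed from the samples at the previous and next timestamps; linear interpolation in time is one natural choice and is assumed in this sketch:

import numpy as np

def fourth_angular_velocity(t3, w3, t_query):
    """Interpolate the third angular velocity at a query time between two samples.

    t3 : sorted timestamps (M,) of the third angular velocity samples
    w3 : third angular velocities (M, 3) in the first sensor coordinate system
    """
    i = np.searchsorted(t3, t_query)        # index of the next timestamp
    if i == 0 or i == len(t3):              # query falls outside the recorded range
        return w3[min(i, len(t3) - 1)]
    t_prev, t_next = t3[i - 1], t3[i]
    alpha = (t_query - t_prev) / (t_next - t_prev)  # linear weight (assumption)
    return (1.0 - alpha) * w3[i - 1] + alpha * w3[i]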
7. The method of any one of claims 1-6, wherein the first sensor comprises at least one of a lidar, a millimeter-wave radar, a microwave radar, and an image sensor, and the second sensor comprises an inertial measurement unit.
8. A multi-sensor clock synchronization apparatus, the apparatus comprising:
a first acquisition module, configured to acquire a first angular velocity of a first sensor in a first sensor coordinate system;
a second acquisition module, configured to acquire a second angular velocity of a second sensor in a second sensor coordinate system and a transformation relationship between the first sensor coordinate system and the second sensor coordinate system;
an angular velocity conversion module, configured to calculate a third angular velocity of the second sensor in the first sensor coordinate system according to the second angular velocity and the transformation relationship between the first sensor coordinate system and the second sensor coordinate system;
an angular velocity residual sum calculating module, configured to calculate an angular velocity residual sum of the first sensor and the second sensor at each preset deviation time according to the first angular velocity and the third angular velocity;
a time deviation determining module, configured to determine the preset deviation time corresponding to the minimum of the angular velocity residual sums as the time deviation between the first sensor and the second sensor; and
a clock synchronization module, configured to perform clock synchronization on the first sensor and the second sensor according to the time deviation between the first sensor and the second sensor.
9. A computing device, comprising: a processor and a memory;
wherein the memory is configured to store at least one executable instruction which, when executed by the processor, causes the processor to perform the steps of the multi-sensor clock synchronization method according to any one of claims 1-7.
10. A computer-readable storage medium having stored therein at least one executable instruction for causing a processor to perform the steps of the multi-sensor clock synchronization method according to any one of claims 1-7.
CN202010103837.4A 2020-02-20 2020-02-20 Clock synchronization method and device for multiple sensors and computing equipment Active CN111351487B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010103837.4A CN111351487B (en) 2020-02-20 2020-02-20 Clock synchronization method and device for multiple sensors and computing equipment


Publications (2)

Publication Number Publication Date
CN111351487A (en) 2020-06-30
CN111351487B (en) 2024-06-25

Family

ID=71194068

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010103837.4A Active CN111351487B (en) 2020-02-20 2020-02-20 Clock synchronization method and device for multiple sensors and computing equipment

Country Status (1)

Country Link
CN (1) CN111351487B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006018791A1 (en) * 2004-08-17 2006-02-23 Koninklijke Philips Electronics N.V. Method and apparatus for calibrating the rotational relationship between two motion sensors of a sensor system
US20150285835A1 (en) * 2013-12-23 2015-10-08 InvenSense, Incorporated Systems and methods for sensor calibration
CN105953795A (en) * 2016-04-28 2016-09-21 南京航空航天大学 Navigation apparatus and method for surface inspection of spacecraft
US20180341263A1 (en) * 2017-05-25 2018-11-29 GM Global Technology Operations LLC Methods and systems for moving object velocity determination
US20190113347A1 (en) * 2017-10-12 2019-04-18 Hanwha Land Systems Co., Ltd. Inertia-based navigation apparatus and inertia-based navigation method based on relative preintegration
CN108680196A (en) * 2018-04-28 2018-10-19 远形时空科技(北京)有限公司 A kind of time delay adjustment method, system and computer-readable medium
CN109186592A (en) * 2018-08-31 2019-01-11 腾讯科技(深圳)有限公司 Method and apparatus and storage medium for the fusion of vision inertial navigation information
CN110196047A (en) * 2019-06-20 2019-09-03 东北大学 Robot autonomous localization method of closing a position based on TOF depth camera and IMU
CN110782496A (en) * 2019-09-06 2020-02-11 深圳市道通智能航空技术有限公司 Calibration method, calibration device, aerial photographing equipment and storage medium
CN110617813A (en) * 2019-09-26 2019-12-27 中国科学院电子学研究所 Monocular visual information and IMU (inertial measurement Unit) information fused scale estimation system and method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
赵夫群 (Zhao Fuqun): "Research on Optimized Registration Methods for Point Cloud Models" (《点云模型的优化配准方法研究》), 31 July 2018, Harbin Institute of Technology Press *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112083400A (en) * 2020-08-21 2020-12-15 达闼机器人有限公司 Calibration method, device and storage medium for moving object and sensor thereof
CN113848696A (en) * 2021-09-15 2021-12-28 北京易航远智科技有限公司 Multi-sensor time synchronization method based on position information
CN113848696B (en) * 2021-09-15 2022-09-16 北京易航远智科技有限公司 Multi-sensor time synchronization method based on position information

Also Published As

Publication number Publication date
CN111351487B (en) 2024-06-25

Similar Documents

Publication Publication Date Title
CN112051590B (en) Detection method and related device for laser radar and inertial measurement unit
US10948297B2 (en) Simultaneous location and mapping (SLAM) using dual event cameras
CN110873883B (en) Positioning method, medium, terminal and device integrating laser radar and IMU
CN110880189B (en) Combined calibration method and combined calibration device thereof and electronic equipment
US10109104B2 (en) Generation of 3D models of an environment
WO2020140431A1 (en) Camera pose determination method and apparatus, electronic device and storage medium
WO2018048353A1 (en) Simultaneous localization and mapping methods and apparatus
CN112051591A (en) Detection method and related device for laser radar and inertial measurement unit
CN110954134B (en) Gyro offset correction method, correction system, electronic device, and storage medium
US20180075609A1 (en) Method of Estimating Relative Motion Using a Visual-Inertial Sensor
CN112136137A (en) Parameter optimization method and device, control equipment and aircraft
CN112013877B (en) Detection method and related device for millimeter wave radar and inertial measurement unit
CN111351487B (en) Clock synchronization method and device for multiple sensors and computing equipment
CN113655453A (en) Data processing method and device for sensor calibration and automatic driving vehicle
CN112051575A (en) Method for adjusting millimeter wave radar and laser radar and related device
CN111623773A (en) Target positioning method and device based on fisheye vision and inertial measurement
CN111504314B (en) IMU and rigid body pose fusion method, device, equipment and storage medium
CN111383282A (en) Pose information determination method and device
CN111998870A (en) Calibration method and device of camera inertial navigation system
CN110887461B (en) Unmanned aerial vehicle real-time computer vision processing method based on GPS attitude estimation
CN112154480B (en) Positioning method and device for movable platform, movable platform and storage medium
CN116952229A (en) Unmanned aerial vehicle positioning method, device, system and storage medium
CN116721166A (en) Binocular camera and IMU rotation external parameter online calibration method, device and storage medium
CN116047481A (en) Method, device, equipment and storage medium for correcting point cloud data distortion
Ullah et al. EMoVI-SLAM: Embedded monocular visual inertial SLAM with scale update for large scale mapping and localization

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210127

Address after: 200000 second floor, building 2, no.1508, Kunyang Road, Minhang District, Shanghai

Applicant after: Dalu Robot Co.,Ltd.

Address before: 518000 Room 201, building A, No. 1, Qian Wan Road, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen, Guangdong (Shenzhen Qianhai business secretary Co., Ltd.)

Applicant before: CLOUDMINDS (SHENZHEN) ROBOTICS SYSTEMS Co.,Ltd.

CB02 Change of applicant information

Address after: 201111 Building 8, No. 207, Zhongqing Road, Minhang District, Shanghai

Applicant after: Dayu robot Co.,Ltd.

Address before: 200000 second floor, building 2, no.1508, Kunyang Road, Minhang District, Shanghai

Applicant before: Dalu Robot Co.,Ltd.

GR01 Patent grant