CN110879400A - Method, equipment and storage medium for fusion positioning of laser radar and IMU - Google Patents
- Publication number
- CN110879400A (application CN201911183681.9A)
- Authority
- CN
- China
- Prior art keywords
- imu
- pose
- zero offset
- moving object
- measurement data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
Abstract
The invention discloses a method, equipment and a storage medium for fusion positioning of a laser radar and an IMU (Inertial Measurement Unit). The method comprises the following steps: based on the IMU measurement data after zero offset correction, pre-integrating the IMU measurement data between two moments to obtain the pose change of the moving object between the two moments; using the obtained pose change as the initial pose for laser point cloud matching, and matching two laser frames in the laser point cloud based on the initial pose; and acquiring the matching result and outputting the corresponding matching pose. The moving object is thus positioned by fusing the laser radar and IMU measurement data, and the accuracy of positioning the moving object is improved.
Description
Technical Field
The invention relates to the field of robots, in particular to a method, equipment and a storage medium for fusion positioning of a laser radar and an IMU.
Background
At present, SLAM (Simultaneous Localization and Mapping) is mainly used to solve the problems of map construction and positioning/navigation for moving objects. Early laser SLAM was based on ranging sensors such as ultrasonic and infrared single-point rangefinders; the advent and popularization of laser radar (LiDAR, Light Detection and Ranging) have made measurement more convenient and accurate, and the information gathered far richer. The object information collected by the lidar is referred to as a point cloud because it appears as a series of scattered points carrying accurate angle and distance information. By matching and comparing two point clouds taken at different moments, a laser SLAM system obtains the distance moved and the angle rotated by the object over that interval, i.e. the change of pose; the accumulated pose changes form the path the moving object has traversed, so that the robot can be positioned.
When the laser radar is used to estimate the pose change of a moving object between two moments, a short interval means the environment described by the two point clouds changes little, so matching the two laser frames is relatively accurate. If, however, the environment described by the two point clouds has changed considerably and no accurate initial pose estimate is available, the matching result may lie far from the true pose: the matching is blind. The pose change can also be obtained from an Inertial Measurement Unit (IMU) alone, but because the IMU pose is obtained by integration, and the measured data suffer from zero offset, noise and other error sources, a pose estimated from the IMU drifts further and further as time passes.
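To illustrate the drift just described, a small numerical sketch (not part of the claimed method; the zero-offset value and sampling period are arbitrary) shows that an uncorrected accelerometer zero offset b_a produces a position error growing roughly as ½·b_a·t², i.e. quadratically with time:

```python
# Dead-reckoning drift from an uncorrected accelerometer zero offset.
# Values are illustrative only; p_err approximates 0.5 * ba * t**2.
ba = 0.05          # accelerometer zero offset, m/s^2 (arbitrary)
dt = 0.01          # IMU sampling period, s (arbitrary)
v_err = 0.0        # accumulated velocity error
p_err = 0.0        # accumulated position error
history = []
for _ in range(1000):          # integrate 10 s of samples
    v_err += ba * dt           # the bias leaks into velocity...
    p_err += v_err * dt        # ...and integrates again into position
    history.append(p_err)
# error after 1 s is ~0.025 m; after 10 s it has grown to ~2.5 m
```

The hundredfold growth over a tenfold time span is exactly the quadratic error accumulation that motivates correcting the zero offset with an external reference such as lidar matching.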
Disclosure of Invention
The invention provides a method, equipment and a storage medium for fusion positioning of a laser radar and an IMU (Inertial Measurement Unit), which realize the positioning of a moving object by fusing the laser radar and the IMU.
In order to achieve the above object, the present invention provides a method for fusion positioning of a laser radar and an IMU, where the method for fusion positioning includes:
based on the IMU measurement data after zero offset correction, pre-integrating the IMU measurement data between two moments to obtain the pose change of the moving object between the two moments;
the obtained pose change is used as an initial pose of laser point cloud matching, and two frames of laser frames in the laser point cloud are matched based on the initial pose;
and acquiring a matching result and outputting a corresponding matching pose.
Further, before the pre-integrating of the IMU measurement data between two moments based on the IMU measurement data after correcting the zero offset to obtain the pose change of the moving object between the two moments, the method further includes:
acquiring the zero offset of the IMU, and correcting the IMU measurement data based on the acquired zero offset.
Further, the acquiring zero offset of the IMU includes:
selecting IMU measurement data between two continuous moments, and performing pre-integration on the selected IMU measurement data between the two moments to obtain the estimated pose change of the moving object between the two moments;
taking the estimated pose change as an initial correction pose of laser point cloud matching, and matching two frames of laser frames in the laser point cloud based on the initial correction pose to obtain a matching correction pose;
subtracting the initial correction pose obtained by IMU pre-integration from the matching correction pose to obtain the error of the IMU pre-integration;
and calculating to obtain the zero offset of the IMU by taking the obtained error as the input quantity of the complementary filtering.
Further, the calculating the zero offset of the IMU by using the obtained error as an input quantity of the complementary filtering includes:
acquiring acceleration and angular velocity in IMU measurement data, and calculating acceleration error and angular velocity error respectively corresponding to the acceleration and the angular velocity;
and obtaining the zero offset of the acceleration and the zero offset of the angular velocity through a complementary filtering algorithm according to the acceleration error and the angular velocity error.
Further, the selecting IMU measurement data between two consecutive moments and pre-integrating the selected IMU measurement data between the two moments to obtain the estimated pose change of the moving object between the two moments includes:
normalizing the angular velocity data output by the IMU at each sampling moment to obtain a two-dimensional matrix corresponding to the angular velocity value;
converting the obtained two-dimensional matrix from an IMU coordinate system to a moving object coordinate system to obtain the acceleration corresponding to the moving object under the moving object coordinate system;
and integrating the obtained acceleration to obtain the displacement corresponding to the position variation of the moving object, namely obtaining the estimated pose change of the moving object.
Further, the acquiring zero offset of the IMU includes:
in the measurement data of the IMU, the acceleration is represented as a, the angular velocity as w, the zero offset of the acceleration as b_a, and the zero offset of the angular velocity as b_w;
the angular velocity data output by the IMU is corrected for zero offset, giving w_n = w − b_w (and likewise a_n = a − b_a for the acceleration);
and integrating the angular velocity at each IMU sampling moment to obtain the angle value θ of the angular velocity integration: θ = Σ_n w_n · Δt, where Δt is the IMU sampling period;
converting the angle value θ obtained by the angular velocity integration into a two-dimensional rotation matrix R_n: R_n = [cos θ, −sin θ; sin θ, cos θ];
converting the acceleration data output by the IMU into the moving object coordinate system to obtain the acceleration a_r^n corresponding to the moving object in that coordinate system:
a_r^n = R_r^i · a_n;
wherein R_r^i is a known 3 × 3 rotation matrix converting from the IMU coordinate system to the moving object coordinate system, and a_r^n represents the numerical value of the acceleration in the moving object coordinate system;
integrating the obtained acceleration twice to obtain the position variation corresponding to the moving object, namely the displacement P: P = ∬ a_r^n dt²;
taking the obtained angle value and displacement as the initial values for laser radar matching, and acquiring, based on these initial values, the matched angle and displacement information from the laser point cloud;
subtracting the initial values from the obtained angle and displacement information to obtain the angle difference and the displacement difference;
wherein the relation between the displacement difference ΔP accumulated over the matching interval T and the acceleration zero offset is approximately ΔP ≈ ½ · b_a · T²;
the acceleration error and angular velocity error are then e_a = 2·ΔP/T² and e_w = Δθ/T, where Δθ is the angle difference;
according to the angular velocity error and the acceleration error, the zero offset of the acceleration and that of the angular velocity are obtained through a complementary filtering algorithm as b_a = K1·e_a + I1·Σe_a and b_w = K2·e_w + I2·Σe_w;
wherein K1, K2, I1 and I2 are adjustable weight parameters, each taking a value in the range 0 to 1;
and returning to execute the next cycle of calculating the zero offset after the new zero offset is calculated.
Further, the method for fusion localization further comprises:
the operations of acquiring the zero offset of the IMU and outputting the matching pose based on the IMU measurement data after zero offset correction are executed cyclically throughout the whole fusion positioning process of the laser radar and the IMU.
In order to achieve the above object, the present invention further provides a device for fusion positioning of a laser radar and an IMU, where the device for fusion positioning includes:
the pose acquisition module is used for pre-integrating, based on the IMU measurement data after zero offset correction, the IMU measurement data between two moments to obtain the pose change of the moving object between the two moments;
the pose matching module is used for matching two frames of laser frames in the laser point cloud based on the initial pose by using the obtained pose change as the initial pose of the laser point cloud matching;
and the pose output module is used for acquiring the matching result and outputting the corresponding matching pose.
In order to achieve the above object, the present invention further provides an electronic device, which includes a memory and a processor, where the memory stores a fusion positioning program of a lidar and an IMU that is executable on the processor, and the fusion positioning program is executed by the processor to perform the method for fusion positioning of the lidar and the IMU.
To achieve the above object, the present invention further provides a computer storage medium storing a fusion positioning program of a lidar and an IMU, where the fusion positioning program is executable by one or more processors to implement the steps of the method for fusion positioning of the lidar and the IMU.
The method, the equipment and the storage medium for fusion positioning of the laser radar and the IMU can achieve the following beneficial effects:
based on the IMU measurement data after zero offset correction, the IMU measurement data between two moments is pre-integrated to obtain the pose change of the moving object between the two moments; the obtained pose change is used as the initial pose for laser point cloud matching, and two laser frames in the laser point cloud are matched based on the initial pose; the matching result is acquired and the corresponding matching pose is output. The purpose of positioning the moving object by fusing the laser radar and IMU measurement data is thereby achieved, and because the zero offset of the IMU measurement data is corrected cyclically and the pose matching of the moving object is performed on the corrected data, the accuracy of positioning the moving object is further improved.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
The technical solution of the present invention is further described below by means of the accompanying drawings and examples.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
FIG. 1 is a schematic flow chart diagram illustrating an embodiment of a method for positioning a laser radar and an IMU in a fusion manner according to the present invention;
FIG. 2 is a block flow diagram of one embodiment of a method for positioning a laser radar and an IMU in a fused manner according to the present invention;
FIG. 3 is a functional block diagram of an embodiment of the laser radar and IMU fusion positioning apparatus of the present invention;
fig. 4 is a schematic internal structure diagram of an embodiment of the electronic device of the present invention.
Detailed Description
The preferred embodiments of the present invention will be described in conjunction with the accompanying drawings, and it will be understood that they are described herein for the purpose of illustration and explanation and not limitation.
The invention provides a method, equipment and a storage medium for fusion positioning of a laser radar and an IMU (Inertial Measurement Unit), which realize the positioning of a moving object by fusing the laser radar and the IMU.
As shown in fig. 1, fig. 1 is a schematic flowchart of an embodiment of a method for positioning a laser radar and an IMU in a fusion manner according to the present invention; the laser radar and IMU fusion positioning method can be implemented as the following steps S10-S30:
step S10, pre-integrating IMU measurement data between two moments based on the IMU measurement data after zero offset correction to obtain the pose change of the moving object between the two moments;
in the embodiment of the invention, IMU measurement data adopted when fusion positioning is carried out on laser point cloud corresponding to a laser radar is IMU measurement data after zero offset correction; of course, when the laser radar and IMU fusion positioning method is implemented for the first time, the data obtained by pre-integrating the IMU measurement data between two moments is the IMU measurement data with uncorrected zero offset.
Pre-integrating the IMU measurement data between two moments based on the IMU measurement data after zero offset correction yields the pose change of the moving object between the two moments, namely the displacement of the moving object over that interval.
Step S20, using the obtained pose change as the initial pose of laser point cloud matching, and matching two laser frames in the laser point cloud based on the initial pose;
and step S30, obtaining a matching result and outputting a corresponding matching pose.
Because the adopted IMU measurement data has already had its zero offset corrected, the pose change obtained by pre-integrating the IMU measurement data between two moments serves as the initial pose, and an accurate pose change can be obtained when two laser frames in the laser point cloud are matched based on that initial pose. The matching pose matched against the initial pose is output, giving the pose change of the moving object, so that the moving object can be positioned. Meanwhile, the matching pose is fed back as a correction to the pose estimated by the IMU, so that the zero offset of the IMU measurement data can be estimated; once an accurate zero offset is obtained, a good initial pose can in turn be provided for the laser point cloud matching.
As shown in fig. 2, fig. 2 is a block flow diagram of an embodiment of a method for positioning a lidar and an IMU in a fusion manner according to the present invention. The IMU measurement data is pre-integrated, and the pose change obtained by pre-integration is taken as the initial pose; two laser frames in the laser point cloud are matched based on this initial pose to obtain the corresponding matching pose, which is output according to the matching result. At the same time, the obtained matching pose and the initial pose obtained by IMU pre-integration are differenced (among other data operations) to obtain the error of the IMU pre-integration, and the zero offset of the IMU is calculated using a complementary filtering algorithm. In other words, the operations of acquiring the zero offset of the IMU and outputting the matching pose based on the zero-offset-corrected IMU measurement data are executed cyclically throughout the whole fusion positioning process of the laser radar and the IMU.
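The cycle just described can be sketched in Python as follows. This is a schematic rendering only: the helper callables `preintegrate`, `match_scans` and `update_bias` are hypothetical stand-ins for the patent's components, introduced to make the data flow of fig. 2 concrete rather than to reproduce any exact implementation:

```python
def fusion_cycle(imu_samples, prev_scan, curr_scan, bias,
                 preintegrate, match_scans, update_bias):
    """One cycle of the lidar/IMU fusion loop of fig. 2 (schematic).

    preintegrate(imu_samples, bias): bias-corrected IMU pre-integration,
        returning an initial pose estimate.
    match_scans(prev_scan, curr_scan, init_pose): laser scan matching
        seeded with the IMU pose, returning the matched pose.
    update_bias(error, bias): complementary-filter update of the zero
        offset from the pre-integration error.
    """
    init_pose = preintegrate(imu_samples, bias)             # step S10
    matched = match_scans(prev_scan, curr_scan, init_pose)  # steps S20/S30
    error = matched - init_pose        # pre-integration error (feedback)
    new_bias = update_bias(error, bias)
    return matched, new_bias
```

Each output pose is the positioning result for that interval, while the updated bias feeds the next cycle's pre-integration, which is the closed loop the figure depicts.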
In an embodiment, when the method for fusion positioning of a laser radar and an IMU described in the embodiment of fig. 1 is implemented for the first time, in order to further ensure the accuracy of the matching pose of the moving object, before step S10 of the embodiment of fig. 1 (pre-integrating the IMU measurement data between two moments based on the IMU measurement data after zero offset correction to obtain the pose change of the moving object between the two moments), the method of the present invention further performs the following step:
acquiring the zero offset of the IMU, and correcting the IMU measurement data based on the acquired zero offset.
That is to say, before the method for fusion positioning of the laser radar and the IMU described in fig. 1 is implemented for the first time, the zero offset of the IMU is acquired and the IMU measurement data is corrected; the pose change of the moving object is then acquired based on the corrected measurement data, so that the positioning of the moving object is realized.
Further, in the embodiment of the present invention, based on the description of the embodiment illustrated in fig. 1 and fig. 2, obtaining the zero offset of the IMU may be implemented as follows:
selecting IMU measurement data between two continuous moments, and performing pre-integration on the selected IMU measurement data between the two moments to obtain the estimated pose change of the moving object between the two moments;
taking the estimated pose change as an initial correction pose of laser point cloud matching, and matching two frames of laser frames in the laser point cloud based on the initial correction pose to obtain a matching correction pose;
subtracting the initial correction pose obtained by IMU pre-integration from the matching correction pose to obtain the error of the IMU pre-integration;
and calculating to obtain the zero offset of the IMU by taking the obtained error as the input quantity of the complementary filtering.
It should be noted that, in order to distinguish the related concepts in this embodiment from those in the embodiment shown in fig. 1, the "pose change" is described as the "estimated pose change", the "initial pose" as the "initial correction pose", and the "matching pose" as the "matching correction pose". Each of the three pairs of concepts is substantially identical; in the embodiment shown in fig. 2, for example, their meanings coincide. The distinction merely separates the first run of the embodiment of fig. 1 from the subsequent cycles in which it is executed.
Further, in an embodiment, when acquiring the zero offset of the IMU measurement data, the calculating the zero offset of the IMU by using the obtained error as an input quantity of the complementary filtering may be implemented as follows:
acquiring acceleration and angular velocity in IMU measurement data, and calculating acceleration error and angular velocity error respectively corresponding to the acceleration and the angular velocity; and obtaining the zero offset of the acceleration and the zero offset of the angular velocity through a complementary filtering algorithm according to the acceleration error and the angular velocity error.
Based on the idea of calculating the acceleration zero offset and the angular velocity zero offset in the IMU measurement data from the obtained acceleration error and angular velocity error, in the embodiment of the present invention, selecting the IMU measurement data between two consecutive moments and pre-integrating the selected data to obtain the estimated pose change of the moving object between the two moments can be implemented as follows:
normalizing the angular velocity data output by the IMU at each sampling moment to obtain a two-dimensional matrix corresponding to the angular velocity value;
converting the obtained two-dimensional matrix from an IMU coordinate system to a moving object coordinate system to obtain the acceleration corresponding to the moving object under the moving object coordinate system;
and integrating the obtained acceleration to obtain the displacement corresponding to the position variation of the moving object, namely obtaining the estimated pose change of the moving object.
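The three steps above can be sketched in Python as follows. This is a minimal 2D sketch under stated assumptions (the IMU-to-object rotation R_r^i is taken as identity for brevity, and the integrated angle is assumed to rotate each bias-corrected acceleration sample before the double integration); it is not the patent's exact formulation, whose equations appear only as figures:

```python
import math

def preintegrate_2d(samples, dt, ba=(0.0, 0.0), bw=0.0):
    """Sketch of the 2D IMU pre-integration between two laser frames.

    samples: iterable of ((ax, ay), w) IMU readings.
    dt: IMU sampling period in seconds.
    ba, bw: current accelerometer / gyro zero-offset estimates.
    Returns (theta, (px, py)): estimated pose change between the frames.
    """
    theta = 0.0
    vx = vy = px = py = 0.0
    for (ax, ay), w in samples:
        theta += (w - bw) * dt                   # integrate angular velocity
        c, s = math.cos(theta), math.sin(theta)  # 2D rotation matrix R_n
        axc, ayc = ax - ba[0], ay - ba[1]        # correct the zero offset
        arx = c * axc - s * ayc                  # rotate acceleration into
        ary = s * axc + c * ayc                  # the object frame
        vx += arx * dt
        vy += ary * dt
        px += vx * dt                            # second integration gives
        py += vy * dt                            # the displacement P
    return theta, (px, py)
```

The returned angle and displacement are exactly what the method passes to the laser matcher as the initial pose.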
Corresponding to the implementation process of obtaining the zero offset of the IMU measurement data described above, in a specific application scenario, the zero offset of the IMU measurement data may be obtained as follows:
in the measurement data of the IMU, the acceleration is represented as a, the angular velocity as w, the zero offset of the acceleration as b_a, and the zero offset of the angular velocity as b_w;
the angular velocity data output by the IMU is corrected for zero offset, giving w_n = w − b_w (and likewise a_n = a − b_a for the acceleration);
because there are multiple IMU samples between the two laser frame moments, the angular velocity at each IMU sampling moment is integrated to obtain the angle value θ of the angular velocity integration: θ = Σ_n w_n · Δt, where Δt is the IMU sampling period;
converting the angle value θ obtained by the angular velocity integration into a two-dimensional rotation matrix R_n: R_n = [cos θ, −sin θ; sin θ, cos θ];
converting the acceleration data output by the IMU into the moving object coordinate system to obtain the acceleration a_r^n corresponding to the moving object in that coordinate system:
a_r^n = R_r^i · a_n;
wherein R_r^i is a known 3 × 3 rotation matrix converting from the IMU coordinate system to the moving object coordinate system, and a_r^n represents the numerical value of the acceleration in the moving object coordinate system. The acceleration value at this point is a three-dimensional vector; removing the component in the z direction turns it into a two-dimensional vector.
Integrating the obtained acceleration twice gives the position variation corresponding to the moving object, namely the displacement P: P = ∬ a_r^n dt²;
P is the displacement obtained by integrating the acceleration. The angle value and displacement obtained from the IMU measurement data are used as the initial values for 2D laser radar matching; after the laser radar matching succeeds, accurate angle and displacement information is returned. Subtracting the initial values from the matched angle and displacement yields the angle difference and the displacement difference.
wherein, when the smaller error caused by the angular velocity zero offset is neglected, the relation between the displacement difference ΔP accumulated over the matching interval T and the acceleration zero offset is approximately ΔP ≈ ½ · b_a · T²;
the acceleration error and angular velocity error are then e_a = 2·ΔP/T² and e_w = Δθ/T, where Δθ is the angle difference;
according to the angular velocity error and the acceleration error, the zero offset of the acceleration and that of the angular velocity are obtained through a complementary filtering algorithm as b_a = K1·e_a + I1·Σe_a and b_w = K2·e_w + I2·Σe_w;
wherein K1, K2, I1 and I2 are adjustable weight parameters, each taking a value in the range 0 to 1;
after the new zero offset is calculated, the process returns to the next cycle of calculating the zero offset, as described in the embodiment of fig. 2.
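Since the filter equations themselves appear only as figures in the original patent, the following is a hypothetical PI-style complementary-filter update consistent with the surrounding description (proportional gains K1/K2 on the current error plus integral gains I1/I2 on the accumulated error); the exact form in the patent may differ:

```python
def update_zero_offset(err_a, err_w, state, K1=0.5, K2=0.5, I1=0.05, I2=0.05):
    """Hypothetical PI-style zero-offset update (sketch, not the patent's
    exact filter).

    err_a / err_w: acceleration and angular-velocity errors derived from
        the displacement and angle differences of the lidar match.
    state: dict carrying the running error integrals between cycles.
    K1, K2, I1, I2: the weight parameters in [0, 1] named in the text.
    """
    state["int_a"] += err_a                 # integral of acceleration error
    state["int_w"] += err_w                 # integral of angular-velocity error
    ba = K1 * err_a + I1 * state["int_a"]   # new accelerometer zero offset
    bw = K2 * err_w + I2 * state["int_w"]   # new gyro zero offset
    return ba, bw
```

The integral terms let the estimate converge on a constant bias over repeated cycles, while the proportional terms track slow bias drift, which matches the cyclic correction scheme of fig. 2.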
In the laser radar and IMU fusion positioning method, the IMU measurement data between two moments is pre-integrated based on the IMU measurement data after zero offset correction, so that the pose change of the moving object between the two moments is obtained; the obtained pose change is used as the initial pose for laser point cloud matching, and two laser frames in the laser point cloud are matched based on the initial pose; the matching result is acquired and the corresponding matching pose is output. The purpose of positioning the moving object by fusing the laser radar and IMU measurement data is thereby achieved, and because the zero offset of the IMU measurement data is corrected cyclically and the pose matching of the moving object is performed on the corrected data, the accuracy of positioning the moving object is further improved.
Corresponding to the description of the embodiments of fig. 1 and fig. 2, the invention further provides a device for fusion positioning of a lidar and an IMU. As shown in fig. 3, fig. 3 is a schematic functional module diagram of an embodiment of the device for fusion positioning of a lidar and an IMU of the invention. The division of the device into the modules described in the embodiment of fig. 3 is functional only; the fusion positioning device includes:
the pose acquisition module 100 is configured to perform pre-integration on IMU measurement data between two moments based on the IMU measurement data after zero offset correction to obtain a pose change of the moving object between the two moments;
the pose matching module 200 is configured to use the obtained pose change as an initial pose for laser point cloud matching, and match two frames of laser frames in the laser point cloud based on the initial pose;
and the pose output module 300 is configured to acquire a matching result and output a corresponding matching pose.
In one embodiment, the pose acquisition module 100 is further configured to:
acquiring the zero offset of the IMU, and correcting the IMU measurement data based on the acquired zero offset.
In one embodiment, the pose acquisition module 100 is further configured to:
selecting IMU measurement data between two continuous moments, and performing pre-integration on the selected IMU measurement data between the two moments to obtain the estimated pose change of the moving object between the two moments;
taking the estimated pose change as an initial correction pose of laser point cloud matching, and matching two frames of laser frames in the laser point cloud based on the initial correction pose to obtain a matching correction pose;
subtracting the initial correction pose obtained by IMU pre-integration from the matching correction pose to obtain the error of the IMU pre-integration;
and calculating to obtain the zero offset of the IMU by taking the obtained error as the input quantity of the complementary filtering.
In one embodiment, the pose acquisition module 100 is further configured to:
acquiring acceleration and angular velocity in IMU measurement data, and calculating acceleration error and angular velocity error respectively corresponding to the acceleration and the angular velocity;
and obtaining the zero offset of the acceleration and the zero offset of the angular velocity through a complementary filtering algorithm according to the acceleration error and the angular velocity error.
In one embodiment, the pose acquisition module 100 is further configured to:
normalizing the angular velocity data output by the IMU at each sampling moment to obtain a two-dimensional matrix corresponding to the angular velocity value;
converting the obtained two-dimensional matrix from an IMU coordinate system to a moving object coordinate system to obtain the acceleration corresponding to the moving object under the moving object coordinate system;
and integrating the obtained acceleration to obtain the displacement corresponding to the position variation of the moving object, namely obtaining the estimated pose change of the moving object.
In one embodiment, the pose acquisition module 100 is further configured to:
in the measurement data of the IMU, the acceleration is represented as a, the angular velocity as w, the zero offset of the acceleration as b_a, and the zero offset of the angular velocity as b_w;
the angular velocity data output by the IMU is corrected for zero offset, giving w_n = w − b_w (and likewise a_n = a − b_a for the acceleration);
and integrating the angular velocity at each IMU sampling moment to obtain the angle value θ of the angular velocity integration: θ = Σ_n w_n · Δt, where Δt is the IMU sampling period;
converting the angle value θ obtained by the angular velocity integration into a two-dimensional rotation matrix R_n: R_n = [cos θ, −sin θ; sin θ, cos θ];
converting the acceleration data output by the IMU into the moving object coordinate system to obtain the acceleration a_r^n corresponding to the moving object in that coordinate system:
a_r^n = R_r^i · a_n;
wherein R_r^i is a known 3 × 3 rotation matrix converting from the IMU coordinate system to the moving object coordinate system, and a_r^n represents the numerical value of the acceleration in the moving object coordinate system;
integrating the obtained acceleration twice to obtain the position variation corresponding to the moving object, namely the displacement P: P = ∬ a_r^n dt²;
taking the obtained angle value and displacement as the initial values for laser radar matching, and acquiring, based on these initial values, the matched angle and displacement information from the laser point cloud;
subtracting the initial values from the obtained angle and displacement information to obtain the angle difference and the displacement difference;
wherein, the relation between the displacement difference and the acceleration zero offset is as follows:
then there are:
according to the angular velocity error and the acceleration error, obtaining the zero offset of the acceleration and the angular velocity through a complementary filtering algorithm as follows:
k1, K2, I1 and I2 are adjusted weight parameters, and the value range is 0-1;
and returning to execute the next cycle of calculating the zero offset after the new zero offset is calculated.
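A minimal sketch of the complementary-filter zero-offset update described above, assuming a PI-style form in which K1/K2 weight the instantaneous angle and displacement errors and I1/I2 weight their accumulated integrals. The class name and the exact update equation are hypothetical, since the patent's formulas are not reproduced here.

```python
class BiasEstimator:
    """PI-style complementary-filter zero-offset update (assumed form):
    the gyro offset b_w is driven by the angle error, the accelerometer
    offset b_a by the displacement error, each with a proportional gain
    (K1/K2) and an integral gain (I1/I2) in the range 0..1."""
    def __init__(self, K1=0.1, K2=0.1, I1=0.01, I2=0.01):
        self.K1, self.K2, self.I1, self.I2 = K1, K2, I1, I2
        self.b_w = 0.0      # angular-velocity zero offset
        self.b_a = 0.0      # acceleration zero offset
        self.int_w = 0.0    # accumulated angle error
        self.int_a = 0.0    # accumulated displacement error

    def update(self, angle_err, disp_err):
        # Accumulate the errors for the integral terms,
        # then apply the weighted correction to each offset.
        self.int_w += angle_err
        self.int_a += disp_err
        self.b_w += self.K1 * angle_err + self.I1 * self.int_w
        self.b_a += self.K2 * disp_err + self.I2 * self.int_a
        return self.b_w, self.b_a
```

Each cycle, the angle and displacement differences from the scan match are fed to `update`, and the returned offsets correct the next batch of IMU data.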
All functional modules in the device for fusion positioning of the laser radar and the IMU cooperate with one another, and the following operations are executed cyclically throughout the fusion positioning process: acquiring the zero offset of the IMU, and outputting the matching pose based on the IMU measurement data corrected for the zero offset, so as to position the moving object.
The device for fusion positioning of the laser radar and the IMU pre-integrates the IMU measurement data between two moments, based on the zero-offset-corrected IMU measurement data, to obtain the pose change of the moving object between the two moments; takes the obtained pose change as the initial pose for laser point cloud matching and matches two laser frames in the laser point cloud based on this initial pose; and acquires the matching result and outputs the corresponding matching pose. The device thus positions the moving object by fusing the laser radar and IMU measurement data; and because the zero offset of the IMU measurement data is corrected cyclically and the pose matching is performed on the corrected data, the positioning accuracy of the moving object is further improved.
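One cycle of the overall fusion loop summarized above might look as follows; the three injected callables (`preintegrate`, `scan_match`, `update_biases`) are hypothetical placeholders for the pre-integration, point-cloud matching, and complementary-filtering sub-steps, not APIs defined by the patent.

```python
def fusion_step(imu_window, prev_scan, curr_scan, biases,
                preintegrate, scan_match, update_biases):
    """One cycle of the lidar/IMU fusion loop (illustrative sketch):
    pre-integrate bias-corrected IMU data into an initial pose, refine
    it by matching two laser frames, then feed the residual back into
    the zero-offset estimate for the next cycle."""
    initial_pose = preintegrate(imu_window, biases)        # IMU prediction
    matched_pose = scan_match(prev_scan, curr_scan, initial_pose)
    error = matched_pose - initial_pose                    # pre-integration error
    new_biases = update_biases(biases, error)              # complementary filter
    return matched_pose, new_biases
```

Dependency injection keeps the loop structure testable with trivial stand-ins for the sub-steps, independent of any particular scan-matching or filtering implementation.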
The invention also provides an electronic device, which can position the moving object according to the method of laser radar and IMU fusion positioning shown in fig. 1. Fig. 4 is a schematic diagram of the internal structure of an embodiment of the electronic device of the present invention.
In the present embodiment, the electronic device 1 may be a PC (Personal Computer), or may be a terminal device such as a smartphone, a tablet Computer, or a mobile Computer. The electronic device 1 comprises at least a memory 11, a processor 12, a communication bus 13, and a network interface 14.
The memory 11 includes at least one type of readable storage medium, including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory), a magnetic memory, a magnetic disk, an optical disk, and the like. The memory 11 may in some embodiments be an internal storage unit of the electronic device 1, for example a hard disk of the electronic device 1. In other embodiments, the memory 11 may also be an external storage device of the electronic device 1, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the electronic device 1. Further, the memory 11 may also include both an internal storage unit and an external storage device of the electronic device 1. The memory 11 may be used not only to store application software installed in the electronic device 1 and various types of data, such as the code of the fusion positioning program 01 of the laser radar and the IMU, but also to temporarily store data that has been output or is to be output.
The processor 12 may in some embodiments be a Central Processing Unit (CPU), controller, microcontroller, microprocessor or other data processing chip, and is used to execute the program code stored in the memory 11 or to process data, for example to run the fusion positioning program 01.
The communication bus 13 is used to realize connection communication between these components.
The network interface 14 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface), and is typically used to establish a communication link between the electronic device 1 and other electronic devices.
Optionally, the electronic device 1 may further comprise a user interface, which may include a display (Display) and an input unit such as a keyboard (Keyboard); the optional user interface may also include a standard wired interface and a wireless interface. Optionally, in some embodiments, the display may be an LED display, a liquid crystal display, a touch liquid crystal display, an OLED (Organic Light-Emitting Diode) touch device, or the like. The display, which may also be referred to as a display screen or display unit, is used to display information processed in the electronic device 1 and to present a visualized user interface.
While fig. 4 only shows the electronic device 1 with the components 11-14 and the fusion positioning program 01, those skilled in the art will understand that the structure shown in fig. 4 does not constitute a limitation of the electronic device 1, which may include fewer or more components than shown, combine certain components, or arrange the components differently.
Based on the description of the embodiments in fig. 1 and fig. 2, in the embodiment of the electronic device 1 shown in fig. 4, the memory 11 stores a fusion positioning program 01; the fusion positioning program 01 stored in the memory 11 can run on the processor 12, and when executed by the processor 12, implements the following steps:
based on the IMU measurement data after zero offset correction, pre-integrating the IMU measurement data between two moments to obtain the pose change of the moving object between the two moments;
the obtained pose change is used as an initial pose of laser point cloud matching, and two frames of laser frames in the laser point cloud are matched based on the initial pose;
and acquiring a matching result and outputting a corresponding matching pose.
In one embodiment, the fusion positioning program 01 may further be executed by the processor 12 such that, before the IMU measurement data between two moments are pre-integrated based on the zero-offset-corrected IMU measurement data to obtain the pose change of the moving object between the two moments, the method further includes:
acquiring the zero offset of the IMU, and correcting the IMU measurement data based on the acquired zero offset.
In one embodiment, the fusion positioning program 01 may further be executed by the processor 12 to acquire the zero offset of the IMU, including:
selecting IMU measurement data between two continuous moments, and performing pre-integration on the selected IMU measurement data between the two moments to obtain the estimated pose change of the moving object between the two moments;
taking the estimated pose change as an initial correction pose of laser point cloud matching, and matching two frames of laser frames in the laser point cloud based on the initial correction pose to obtain a matching correction pose;
subtracting the initial correction pose obtained by IMU pre-integration from the matching correction pose to obtain the error of the IMU pre-integration;
and calculating to obtain the zero offset of the IMU by taking the obtained error as the input quantity of the complementary filtering.
In one embodiment, the fusion positioning program 01 may further be executed by the processor 12 to calculate the zero offset of the IMU by taking the obtained error as the input of the complementary filtering, including:
acquiring acceleration and angular velocity in IMU measurement data, and calculating acceleration error and angular velocity error respectively corresponding to the acceleration and the angular velocity;
and obtaining the zero offset of the acceleration and the zero offset of the angular velocity through a complementary filtering algorithm according to the acceleration error and the angular velocity error.
In one embodiment, the fusion positioning program 01 is further executable by the processor 12 to select IMU measurement data between two consecutive time instants, and pre-integrate the selected IMU measurement data between the two time instants to obtain an estimated pose change of the mobile object between the two time instants, including:
normalizing the angular velocity data output by the IMU at each sampling moment to obtain a two-dimensional rotation matrix corresponding to the angular velocity values;
converting, by means of the obtained two-dimensional matrix, the acceleration from the IMU coordinate system to the moving object coordinate system to obtain the acceleration of the moving object in the moving object coordinate system;
and integrating the obtained acceleration to obtain the displacement corresponding to the position variation of the moving object, namely the estimated pose change of the moving object.
In one embodiment, the fusion positioning program 01 may further be executed by the processor 12 to acquire the zero offset of the IMU, including:
in the measurement data of the IMU, the acceleration is represented as a, the angular velocity is represented as w, the zero offset of the acceleration is represented as ba, and the zero offset of the angular velocity is represented as bw;
normalizing the angular velocity data output by the IMU, as follows:
integrating the angular velocity at each IMU sampling moment to obtain the integrated angle value θ, as follows:
converting the angle value θ obtained by the angular velocity integration into a two-dimensional rotation matrix R_n, as follows:
converting the acceleration data output by the IMU into the moving object coordinate system to obtain the corresponding acceleration a_r^n of the moving object in the moving object coordinate system, as follows:
a_r^n = R_r^i · a_n;
wherein R_r^i is a known 3 × 3 rotation matrix from the IMU coordinate system to the moving object coordinate system, and a_r^n represents the value of the acceleration in the moving object coordinate system;
integrating the obtained acceleration to obtain the position variation of the moving object, namely the displacement P, as follows:
taking the obtained angle value and displacement as initial values for laser radar matching, and acquiring from the laser point cloud the angle and displacement information matching these initial values;
subtracting the obtained angle and displacement information from the initial values to obtain an angle difference and a displacement difference;
wherein the relation between the displacement difference and the acceleration zero offset is as follows:
then:
according to the angular velocity error and the acceleration error, obtaining the zero offset of the acceleration and the zero offset of the angular velocity through a complementary filtering algorithm, as follows:
wherein K1, K2, I1 and I2 are adjustable weight parameters with values in the range 0 to 1;
and after a new zero offset is calculated, returning to execute the next cycle of zero offset calculation.
The electronic device pre-integrates the IMU measurement data between two moments, based on the zero-offset-corrected IMU measurement data, to obtain the pose change of the moving object between the two moments; takes the obtained pose change as the initial pose for laser point cloud matching and matches two laser frames in the laser point cloud based on this initial pose; and acquires the matching result and outputs the corresponding matching pose. The device thus positions the moving object by fusing the laser radar and IMU measurement data; and because the zero offset of the IMU measurement data is corrected cyclically and the pose matching is performed on the corrected data, the positioning accuracy of the moving object is further improved.
The specific implementation of the computer-readable storage medium of the present invention is substantially the same in principle as the above embodiments of the method, device and electronic equipment for fusion positioning of the laser radar and the IMU, and is not described here again.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.
Claims (10)
1. A method for fusion positioning of a laser radar and an IMU (Inertial Measurement Unit), characterized by comprising the following steps:
based on the IMU measurement data after zero offset correction, pre-integrating the IMU measurement data between two moments to obtain the pose change of the moving object between the two moments;
the obtained pose change is used as an initial pose of laser point cloud matching, and two frames of laser frames in the laser point cloud are matched based on the initial pose;
and acquiring a matching result and outputting a corresponding matching pose.
2. The method for fusion positioning of lidar and the IMU of claim 1, wherein the pre-integrating the IMU measurement data between two time instants to obtain the pose change of the moving object between the two time instants based on the IMU measurement data after correcting the zero offset further comprises:
acquiring the zero offset of the IMU, and correcting the IMU measurement data based on the acquired zero offset.
3. The method for lidar and IMU fusion positioning of claim 2, wherein the obtaining the zero offset of the IMU comprises:
selecting IMU measurement data between two continuous moments, and performing pre-integration on the selected IMU measurement data between the two moments to obtain the estimated pose change of the moving object between the two moments;
taking the estimated pose change as an initial correction pose of laser point cloud matching, and matching two frames of laser frames in the laser point cloud based on the initial correction pose to obtain a matching correction pose;
subtracting the initial correction pose obtained by IMU pre-integration from the matching correction pose to obtain the error of the IMU pre-integration;
and calculating to obtain the zero offset of the IMU by taking the obtained error as the input quantity of the complementary filtering.
4. The method for fusion localization of lidar and an IMU of claim 3, wherein calculating the zero offset of the IMU using the obtained error as an input to a complementary filter comprises:
acquiring acceleration and angular velocity in IMU measurement data, and calculating acceleration error and angular velocity error respectively corresponding to the acceleration and the angular velocity;
and obtaining the zero offset of the acceleration and the zero offset of the angular velocity through a complementary filtering algorithm according to the acceleration error and the angular velocity error.
5. The method for fusion positioning of lidar and the IMU of claim 3, wherein the selecting the IMU measurement data between two consecutive moments and pre-integrating the selected IMU measurement data between the two moments to obtain the estimated pose change of the moving object between the two moments comprises:
normalizing the angular velocity data output by the IMU at each sampling moment to obtain a two-dimensional rotation matrix corresponding to the angular velocity values;
converting, by means of the obtained two-dimensional matrix, the acceleration from the IMU coordinate system to the moving object coordinate system to obtain the acceleration of the moving object in the moving object coordinate system;
and integrating the obtained acceleration to obtain the displacement corresponding to the position variation of the moving object, namely the estimated pose change of the moving object.
6. The method of lidar and IMU fusion positioning of claim 5, wherein the obtaining the zero offset of the IMU comprises:
in the measurement data of the IMU, the acceleration is represented as a, the angular velocity is represented as w, the zero offset of the acceleration is represented as ba, and the zero offset of the angular velocity is represented as bw;
normalizing the angular velocity data output by the IMU, as follows:
integrating the angular velocity at each IMU sampling moment to obtain the integrated angle value θ, as follows:
converting the angle value θ obtained by the angular velocity integration into a two-dimensional rotation matrix R_n, as follows:
converting the acceleration data output by the IMU into the moving object coordinate system to obtain the corresponding acceleration a_r^n of the moving object in the moving object coordinate system, as follows:
a_r^n = R_r^i · a_n;
wherein R_r^i is a known 3 × 3 rotation matrix from the IMU coordinate system to the moving object coordinate system, and a_r^n represents the value of the acceleration in the moving object coordinate system;
integrating the obtained acceleration to obtain the position variation of the moving object, namely the displacement P, as follows:
taking the obtained angle value and displacement as initial values for laser radar matching, and acquiring from the laser point cloud the angle and displacement information matching these initial values;
subtracting the obtained angle and displacement information from the initial values to obtain an angle difference and a displacement difference;
wherein the relation between the displacement difference and the acceleration zero offset is as follows:
then:
according to the angular velocity error and the acceleration error, obtaining the zero offset of the acceleration and the zero offset of the angular velocity through a complementary filtering algorithm, as follows:
wherein K1, K2, I1 and I2 are adjustable weight parameters with values in the range 0 to 1;
and after a new zero offset is calculated, returning to execute the next cycle of zero offset calculation.
7. The method for lidar and IMU fusion positioning of any of claims 1-6, wherein the method further comprises:
and the zero offset of the IMU is acquired, and the output operation of matching the pose is executed based on the IMU measurement data after the zero offset is corrected, and the operation is executed circularly in the whole fusion positioning process of the laser radar and the IMU.
8. An apparatus for fusion positioning of lidar and an IMU, the apparatus comprising:
the pose acquisition module is used for pre-integrating IMU measurement data between two moments to obtain the pose change of the moving object between the two moments based on the IMU measurement data after zero offset correction;
the pose matching module is used for matching two frames of laser frames in the laser point cloud based on the initial pose by using the obtained pose change as the initial pose of the laser point cloud matching;
and the pose output module is used for acquiring the matching result and outputting the corresponding matching pose.
9. An electronic device, comprising a memory and a processor, wherein the memory stores a lidar and IMU fusion positioning program operable on the processor, and wherein the fusion positioning program, when executed by the processor, performs the lidar and IMU fusion positioning method of any of claims 1-7.
10. A computer storage medium having stored thereon a lidar and IMU fusion positioning program executable by one or more processors to perform the steps of the method of lidar and IMU fusion positioning according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911183681.9A CN110879400A (en) | 2019-11-27 | 2019-11-27 | Method, equipment and storage medium for fusion positioning of laser radar and IMU |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110879400A true CN110879400A (en) | 2020-03-13 |
Family
ID=69730385
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911183681.9A Pending CN110879400A (en) | 2019-11-27 | 2019-11-27 | Method, equipment and storage medium for fusion positioning of laser radar and IMU |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110879400A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108507568A (en) * | 2017-02-27 | 2018-09-07 | 华为技术有限公司 | The method, apparatus and integrated navigation system of compensation temperature drift error |
CN108732603A (en) * | 2017-04-17 | 2018-11-02 | 百度在线网络技术(北京)有限公司 | Method and apparatus for positioning vehicle |
CN109465832A (en) * | 2018-12-18 | 2019-03-15 | 哈尔滨工业大学(深圳) | High-precision vision and the tight fusion and positioning method of IMU and system |
CN110017850A (en) * | 2019-04-19 | 2019-07-16 | 小狗电器互联网科技(北京)股份有限公司 | A kind of gyroscopic drift estimation method, device and positioning system |
CN110361010A (en) * | 2019-08-13 | 2019-10-22 | 中山大学 | It is a kind of based on occupy grating map and combine imu method for positioning mobile robot |
US20190324471A1 (en) * | 2018-04-19 | 2019-10-24 | Faraday&Future Inc. | System and method for ground plane detection |
- 2019-11-27: CN patent application CN201911183681.9A filed (published as CN110879400A, status: Pending)
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111442722A (en) * | 2020-03-26 | 2020-07-24 | 达闼科技成都有限公司 | Positioning method, positioning device, storage medium and electronic equipment |
WO2021189479A1 (en) * | 2020-03-27 | 2021-09-30 | 深圳市速腾聚创科技有限公司 | Pose correction method and device for roadbed sensor, and roadbed sensor |
CN113739785A (en) * | 2020-05-29 | 2021-12-03 | 杭州海康机器人技术有限公司 | Robot positioning method and device and storage medium |
CN111812668A (en) * | 2020-07-16 | 2020-10-23 | 南京航空航天大学 | Winding inspection device, positioning method thereof and storage medium |
CN111812669A (en) * | 2020-07-16 | 2020-10-23 | 南京航空航天大学 | Winding inspection device, positioning method thereof and storage medium |
CN112083433A (en) * | 2020-07-21 | 2020-12-15 | 浙江工业大学 | Laser radar distortion removal method applied to two-wheeled mobile robot |
CN112083433B (en) * | 2020-07-21 | 2023-06-13 | 浙江工业大学 | Laser radar distortion removal method applied to two-wheeled mobile robot |
CN114585879A (en) * | 2020-09-27 | 2022-06-03 | 深圳市大疆创新科技有限公司 | Pose estimation method and device |
CN112067007A (en) * | 2020-11-12 | 2020-12-11 | 湖北亿咖通科技有限公司 | Map generation method, computer storage medium, and electronic device |
CN112067007B (en) * | 2020-11-12 | 2021-01-29 | 湖北亿咖通科技有限公司 | Map generation method, computer storage medium, and electronic device |
CN112489176A (en) * | 2020-11-26 | 2021-03-12 | 香港理工大学深圳研究院 | Tightly-coupled graph building method fusing ESKF, g2o and point cloud matching |
CN112348897A (en) * | 2020-11-30 | 2021-02-09 | 上海商汤临港智能科技有限公司 | Pose determination method and device, electronic equipment and computer readable storage medium |
CN112781594A (en) * | 2021-01-11 | 2021-05-11 | 桂林电子科技大学 | Laser radar iteration closest point improvement algorithm based on IMU coupling |
WO2023000294A1 (en) * | 2021-07-23 | 2023-01-26 | 深圳市大疆创新科技有限公司 | Pose estimation method, laser-radar-inertial odometer, movable platform and storage medium |
CN114199236A (en) * | 2021-11-29 | 2022-03-18 | 北京百度网讯科技有限公司 | Positioning data processing method and device, electronic equipment and automatic driving vehicle |
CN114111775A (en) * | 2021-12-20 | 2022-03-01 | 国汽(北京)智能网联汽车研究院有限公司 | Multi-sensor fusion positioning method and device, storage medium and electronic equipment |
CN114111775B (en) * | 2021-12-20 | 2024-03-29 | 国汽(北京)智能网联汽车研究院有限公司 | Multi-sensor fusion positioning method and device, storage medium and electronic equipment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110879400A (en) | Method, equipment and storage medium for fusion positioning of laser radar and IMU | |
US20210190497A1 (en) | Simultaneous location and mapping (slam) using dual event cameras | |
CN110689585B (en) | Multi-phase external parameter combined calibration method, device, equipment and medium | |
US20210158567A1 (en) | Visual positioning method and apparatus, electronic device, and system | |
CN111649739B (en) | Positioning method and device, automatic driving vehicle, electronic equipment and storage medium | |
CN107748569B (en) | Motion control method and device for unmanned aerial vehicle and unmanned aerial vehicle system | |
CN110246182B (en) | Vision-based global map positioning method and device, storage medium and equipment | |
CN109752003B (en) | Robot vision inertia point-line characteristic positioning method and device | |
US20210183100A1 (en) | Data processing method and apparatus | |
CN110197615B (en) | Method and device for generating map | |
WO2022193508A1 (en) | Method and apparatus for posture optimization, electronic device, computer-readable storage medium, computer program, and program product | |
CN111721305B (en) | Positioning method and apparatus, autonomous vehicle, electronic device, and storage medium | |
KR20210093194A (en) | A method, an apparatus an electronic device, a storage device, a roadside instrument, a cloud control platform and a program product for detecting vehicle's lane changing | |
CN111791235A (en) | Robot multi-camera visual inertia point-line characteristic positioning method and device | |
WO2023005457A1 (en) | Pose calculation method and apparatus, electronic device, and readable storage medium | |
CN110942474B (en) | Robot target tracking method, device and storage medium | |
CN113610702B (en) | Picture construction method and device, electronic equipment and storage medium | |
KR20220100813A (en) | Automatic driving vehicle registration method and device, electronic equipment and a vehicle | |
CN113628284B (en) | Pose calibration data set generation method, device and system, electronic equipment and medium | |
WO2020019116A1 (en) | Multi-source data mapping method, related apparatus, and computer-readable storage medium | |
CN115560744A (en) | Robot, multi-sensor-based three-dimensional mapping method and storage medium | |
CN113946151A (en) | Data processing method and device for automatic driving vehicle and automatic driving vehicle | |
CN113758481A (en) | Grid map generation method, device, system, storage medium and electronic equipment | |
CN112683216B (en) | Method and device for generating vehicle length information, road side equipment and cloud control platform | |
CN108534757B (en) | Cloud-based visual map scale detection method and device |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
| CB02 | Change of applicant information | Address after: 518000 Room 401, block D, building 7, Shenzhen International Innovation Valley, Dashi 1st Road, Xili community, Xili street, Nanshan District, Shenzhen, Guangdong; Applicant after: JUXING TECHNOLOGY (SHENZHEN) Co.,Ltd. Address before: 518000 building 101, building R3b, Gaoxin industrial village, No.018, Gaoxin South 7th Road, community, high tech Zone, Yuehai street, Nanshan District, Shenzhen City, Guangdong Province; Applicant before: JUXING TECHNOLOGY (SHENZHEN) Co.,Ltd.
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20200313