CN113758491B - Relative positioning method and system based on multi-sensor fusion unmanned vehicle and vehicle

Relative positioning method and system based on multi-sensor fusion unmanned vehicle and vehicle

Info

Publication number
CN113758491B
CN113758491B (application CN202110897052.3A)
Authority
CN
China
Prior art keywords
imu
frame
map
data
point cloud
Prior art date
Legal status
Active
Application number
CN202110897052.3A
Other languages
Chinese (zh)
Other versions
CN113758491A (en)
Inventor
闫耀威
王宽
林鑫余
彭祥军
Current Assignee
Chongqing Changan Automobile Co Ltd
Original Assignee
Chongqing Changan Automobile Co Ltd
Priority date
Filing date
Publication date
Application filed by Chongqing Changan Automobile Co Ltd
Priority to CN202110897052.3A
Publication of CN113758491A
Application granted
Publication of CN113758491B
Legal status: Active
Anticipated expiration

Links

Classifications

    • G01C 21/28: Navigation in a road network with correlation of data from several navigational instruments
    • G01C 21/165: Dead reckoning by integrating acceleration or speed, i.e. inertial navigation, combined with non-inertial navigation instruments
    • G01C 21/30: Map- or contour-matching
    • G01C 21/32: Structuring or formatting of map data
    • G01S 17/06: Systems using reflection of electromagnetic waves other than radio waves, determining position data of a target
    • G01S 17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S 19/45: Determining position by combining measurements of signals from a satellite radio beacon positioning system with a supplementary measurement
    • G01S 19/47: Determining position, the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial
    • B60W 60/00: Drive control systems specially adapted for autonomous road vehicles
    • B60W 2400/00: Indexing codes relating to detected, measured or calculated conditions or factors

Abstract

The invention discloses a relative positioning method and system based on a multi-sensor fusion unmanned vehicle, and a vehicle. The method comprises the following steps: (1) data access; (2) positioning initialization; (3) IMU integral estimation; (4) map database matching; (5) fusion positioning modeling; and (6) fusion positioning solving. The invention reduces the influence of accumulated errors on the positioning result and effectively reduces mismatching during the matching process.

Description

Relative positioning method and system based on multi-sensor fusion unmanned vehicle and vehicle
Technical Field
The invention belongs to the technical field of positioning, and particularly relates to a relative positioning method and system based on a multi-sensor fusion unmanned vehicle and the vehicle.
Background
For an autonomous mobile unmanned vehicle, one of the core capabilities it must possess is sensing the surrounding environment with its sensors. When an unmanned vehicle performs autonomous navigation, real-time positioning within the environment has long been an active research problem. The traditional positioning method computes the current pose of the unmanned vehicle in real time from the data of encoders mounted on its wheels. This method is simple and easy to use, but it suffers from inherent accumulated error: the robot inevitably slips when moving on the ground, and errors introduced during system design and manufacture accumulate continuously as the robot moves, so that after running for some time the pose estimate can no longer be corrected and becomes unusable. The method therefore has corresponding limitations.
In research on robot positioning, lidar scan matching has gradually been adopted to correct the drift produced by odometry-based pose estimation. Scan matching registers the lidar scan at the current moment against the scan at the previous moment; the pose change between the two sets of laser point clouds obtained from this matching determines the pose of the robot body in the environment. This method compensates well for the accumulated error of odometry positioning, and when the radar itself is accurate, the positioning result is also more accurate. However, laser scan matching has its own limitation: the scanned environment must contain sufficiently many landmark feature points, and the landmark features must not be highly similar to one another, otherwise mismatching occurs.
Therefore, there is a need to develop a new method, system and vehicle for relative positioning based on multi-sensor fusion unmanned vehicles.
Disclosure of Invention
The invention aims to provide a relative positioning method, a relative positioning system and a relative positioning vehicle based on a multi-sensor fusion unmanned vehicle, which can reduce the influence of accumulated errors on positioning results and can effectively reduce mismatching results in a matching process.
In a first aspect, the invention provides a relative positioning method based on a multi-sensor fusion unmanned vehicle, which comprises the following steps:
step 1, data access;
receiving the data output by the GPS, IMU, wheel speed meter and lidar in real time, attaching the time stamp of the current moment to each frame of data, and storing the frames in their corresponding buffers;
step 2, positioning initialization;
popping two frames of GPS data from the GPS buffer, judging from them whether the vehicle has moved, and if so, calculating the current position P, speed V and heading angle of the vehicle;
step 3, IMU integral calculation;
obtaining the IMU state (R, p, V, ba, bg) at each IMU moment by integrating the linear acceleration and angular velocity of the IMU according to the IMU dynamics equations, wherein R denotes the attitude rotation, p the translation, ba the accelerometer bias and bg the gyroscope bias;
performing distortion correction on the real-time lidar point cloud according to the historical positioning poses and the pose (R, p) obtained from IMU integration;
step 4, matching a map database;
according to the translation p calculated by IMU integration, searching the map database with KNN for the map frame nearest to it, calculating the relative pose (R, p) between the real-time radar frame and the retrieved map frame with the NDT matching algorithm, and adding this pose (R, p) to the factor-graph computation graph as a unary factor;
step 5, fusion positioning modeling;
adding the received GPS signal to the factor-graph computation graph as a unary factor;
adding the received wheel speed signal to the factor-graph computation graph as a unary factor;
adding the IMU states (R, p, V, ba, bg) to the factor-graph computation graph as binary factors;
step 6, fusion positioning solution;
and sending the factors of all the data to the iSAM2 module to obtain the final lateral and longitudinal coordinates and the heading angle of the vehicle in the map.
Optionally, two frames of GPS data more than 0.1 m apart are used; the displacement of the later frame minus the earlier frame is divided by the elapsed time to obtain the speed, and the heading angle is obtained from the velocity components in the X and Y directions. The calculated speed is used as the initialization speed of the IMU, and the GPS coordinates are passed to the NDT database to retrieve the bin file of the nearest frame of NDT data. After retrieval, the current frame is registered against the map to obtain the current position P, speed V and heading angle of the vehicle; the calculated position P and heading angle are taken as the initialization pose of the IMU and, together with the speed V, as the IMU initialization parameters for the main thread.
Optionally, the IMU state vector at each moment is calculated, specifically:
first, the IMU data between the two moments is looked up and a 6×1 matrix comprising acceleration and angular velocity is created; together with the time difference between the two data frames it is fed into the imu_preintegrated function of the factor graph model for integration, and the IMU state (R, p, V, ba, bg) is predicted from each incoming frame of IMU data.
Optionally, the step 4 specifically includes:
the map construction process comprises: inputting a trajectory, i.e. the positioning results $\{R_i, p_i\}_{i=1\ldots N}$ calculated by the imufactor, where $R_i$ denotes the rotation and $p_i$ the translation, together with the radar point cloud frames at the corresponding times $\{L_i\}_{i=1\ldots N}$, where $L_i$ denotes the pose of each lidar frame in the local coordinate system; converting the point cloud of each frame into the world coordinate system, i.e. $L_i^W = R_i L_i + p_i$, where $L_i^W$ denotes the frame converted from the local to the world coordinate system and $p_i$ the translation of the coordinate conversion; and finally obtaining the lidar point cloud map;
map matching: matching is performed with NDT, i.e. with the current point as the center, the space occupied by the reference point cloud is divided into grids or voxels of a specified size; the lidar scan points falling into each cell are counted and, when a cell has collected no fewer than 4 laser points, the multidimensional normal distribution parameters of that cell are calculated, giving the corresponding mean and covariance; the odometry data obtained from the IMU is then used to initialize the transformation parameters, and the point cloud to be registered is transformed by the transformation matrix $T$ into the grid of the reference point cloud, i.e. $L_{new}^W = T\,L_{new}$, where $L_{new}^W$ denotes the registered point cloud transformed into map coordinates and $L_{new}$ denotes the lidar point cloud in the local coordinate system; the normal-distribution PDF of the points lying within a cell is calculated as $p(k) = \exp\left(-\tfrac{1}{2}(k-q)^T \Sigma^{-1} (k-q)\right)$, where $k$ denotes a scan point in the cell, $q$ the mean of the normal distribution and $\Sigma$ its covariance; the probability densities calculated for each cell are summed to give the NDT registration score, and the objective function score is optimized with a Newton optimization algorithm, i.e. the transformation parameters are sought that make the score optimal, giving the final NDT positioning result; after the coordinate information of the first GPS frame is received, the map frame closest to those coordinates is retrieved and stored in a container, the current lidar data, coordinate-converted after removal of point cloud distortion, is registered against that map frame to obtain the registered relative pose, and the obtained pose is passed as a unary factor to the factor-graph computation graph.
In a second aspect, the invention provides a relative positioning system based on a multi-sensor fusion unmanned vehicle, comprising a memory and a controller, the controller being connected with the memory; the memory stores a computer-readable program, and when the controller calls the computer-readable program, the controller can execute the steps of the relative positioning method based on a multi-sensor fusion unmanned vehicle provided by the invention.
In a third aspect, the vehicle of the invention adopts the relative positioning system based on a multi-sensor fusion unmanned vehicle described above.
The invention has the following advantages: it overcomes the inherent constraints of conventional odometry-based positioning and reduces the influence of accumulated errors on the positioning result; and by combining laser point cloud scan matching with IMU, GPS and odometer positioning, it effectively reduces mismatching during the matching process in large-scale environments or where the landmark features of the environment are highly similar.
Drawings
FIG. 1 is a diagram showing the interrelationship of the sensors and the process flow in the present embodiment;
FIG. 2 is a schematic diagram of the processing manner of each sensor data in the algorithm module in the present embodiment;
FIG. 3 is the point cloud map built with the NDT algorithm based on the GPS track in this embodiment;
FIG. 4 is a comparison of the positioning track with the high-precision GPS track in this embodiment (the two essentially coincide);
FIG. 5 is a comparison of the relative positioning result of an existing single lidar with the high-precision GPS track (showing a larger error).
Detailed Description
The invention is further described below with reference to the accompanying drawings.
As shown in fig. 1, in this embodiment, a method for positioning a multi-sensor fusion-based unmanned vehicle includes the following steps:
(1) Accessing data;
and respectively accessing data output by the GPS, the IMU, the wheel speed meter and the laser radar, attaching time stamp information of the current moment to the data of each frame, and respectively storing the time stamp information in a corresponding buffer area.
(2) Positioning and initializing;
the IMU is initialized and the vehicle is first stationary for more than 50 seconds. And initializing GPS, namely initializing the GPS by using two frames of GPS data larger than 0.1 meter, subtracting the displacement of the next frame from the previous frame, dividing by the time, calculating the speed, and calculating the heading angle yaw through the speed vectors in the X and Y directions. If no wheel speed is entered, GPS speed is used instead of wheel speed addition. The calculated speed is used as an initialization speed of the IMU and GPS coordinates are transmitted to an NDT (normal distribution transformation mapping) database for searching a bin file of NDT data from the nearest frame. After searching, registering the current frame and the map, so as to obtain the position P (namely the transverse and longitudinal positioning results x and y), the speed V and the course angle of the current vehicle, taking the calculated position P and the course angle as the initialization pose of the IMU, and taking the speed V calculated by the GPS as the IMU initialization parameter of the main line.
(3) IMU integral calculation;
after the IMU obtains the initialization parameters, the IMU state vector at each moment is required to start. Because the accuracy of the IMU in the transient state is higher, but the integral error is very large, the IMU pose calculating method firstly searches IMU data between two moments, creates a 6*1 matrix comprising acceleration and angular velocity, sends the time difference of two frames of data into an imu_pre-integrated function in a factor graph model to integrate, predicts the state of the IMU according to the IMU data of each next frame (R, p, V, ba, bg), and simultaneously needs other sensors as observables to calibrate continuously, and calculates the accumulated value of the IMU by using an imufactor function in the factor graph model to obtain the odometer information. And correcting the distortion of the real-time laser radar point cloud according to the historical positioning gesture and the gesture (R, p) obtained by IMU integration.
(4) Map database matching;
the drawing construction process comprises the following steps: inputting a track, i.e. positioning result { R } calculated by imufactor i ,p i } i=1....N The method comprises the steps of carrying out a first treatment on the surface of the Wherein R is i Representing the rotation vector, p i Representing a translation vector; radar point cloud frames at corresponding times: { L i } i=1....N Wherein L is i Representing pose information of each frame of the laser radar under a local coordinate system, and converting the pose of the point cloud of each frame into the coordinate system of the track, namelyWherein (1)>Representing pose information of each frame of radar converted from local coordinate system to world coordinate system, p i Representing the translation amount during coordinate conversion to finally obtain the laserAnd (5) a point cloud map of the radar.
Map matching: matching is performed with the NDT (normal distributions transform) algorithm. The general idea of NDT is to divide the space occupied by the reference point cloud, centered on the current point, into grids or voxels of a specified size, for example (10 cm) × (10 cm) cells. The lidar scan points falling into each cell are counted and, provided a cell has collected no fewer than 4 laser points, the multidimensional normal distribution parameters of that cell are computed, giving the corresponding mean and covariance. The transformation parameters are then initialized with the odometry data obtained above from the IMU, and the point cloud to be registered is transformed by the transformation matrix $T$ into the grid of the reference point cloud, i.e. $L_{new}^W = T\,L_{new}$, where $L_{new}^W$ denotes the registered point cloud transformed into map coordinates and $L_{new}$ denotes the lidar point cloud in the local coordinate system. The normal-distribution PDF of the points lying within a cell is calculated as $p(k) = \exp\left(-\tfrac{1}{2}(k-q)^T \Sigma^{-1} (k-q)\right)$, where $k$ denotes a scan point in the cell, $q$ the mean of the normal distribution and $\Sigma$ its covariance. Each PDF can be regarded as an approximation of a local plane, describing features such as the plane's position, orientation, shape and smoothness. The probability densities calculated for each cell are summed to give the NDT registration score, and the objective function score is optimized with a Newton optimization algorithm, i.e. the transformation parameters are sought that make the score optimal, giving the final NDT positioning result. After the coordinate information of the first GPS frame is received, the map frame closest to those coordinates is retrieved and stored in a container, and the current lidar data, coordinate-converted after removal of point cloud distortion, is registered against that map frame; this step solves for the Gaussian-distribution relationship between the current frame and each cell of the NDT map, giving the relative pose after registration. The obtained pose is passed to the factor graph model as a unary factor used to adjust the IMU data.
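To make the NDT pieces above concrete, here is a simplified numpy sketch of the voxel Gaussian fitting (with the 4-point minimum) and of the registration score for one candidate transform $T$; the Newton iteration that searches over $T$ is omitted, and all names are illustrative:

```python
import numpy as np
from collections import defaultdict

def fit_ndt_grid(ref_points, cell=0.1):
    """Fit a Gaussian (mean q, covariance C) per occupied voxel of the
    reference cloud; cells with fewer than 4 points are skipped."""
    cells = defaultdict(list)
    for pt in ref_points:
        cells[tuple(np.floor(pt / cell).astype(int))].append(pt)
    grid = {}
    for idx, pts in cells.items():
        if len(pts) >= 4:
            pts = np.asarray(pts)
            # Small regularizer keeps the covariance invertible.
            grid[idx] = (pts.mean(axis=0), np.cov(pts.T) + 1e-6 * np.eye(3))
    return grid

def ndt_score(grid, scan_points, T, cell=0.1):
    """Sum of p(k) = exp(-0.5 (k-q)^T C^-1 (k-q)) over the transformed scan."""
    hom = np.hstack([scan_points, np.ones((len(scan_points), 1))])
    moved = (hom @ T.T)[:, :3]          # L_new^W = T * L_new
    score = 0.0
    for k in moved:
        entry = grid.get(tuple(np.floor(k / cell).astype(int)))
        if entry is not None:
            q, C = entry
            d = k - q
            score += np.exp(-0.5 * d @ np.linalg.solve(C, d))
    return score
```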
(5) Fusion positioning modeling;
the GPS received data is also used as the position observation quantity of the IMU and added into a factor graph model calculation graph, a monobasic factor is built for the data of the wheel speed meter, the wheel speed meter information is mainly used as the observation quantity to infer the position information of the vehicle, and the position information is fused with the GPS information to be used as the position information of the IMU, so that the position information of the IMU is corrected continuously by one parameter.
(6) Fusion positioning and solving;
and sending the factors of all the data to an iSAM2 module (long-term sliding window) to obtain the transverse and longitudinal coordinates and the course angle of the final vehicle in the map.
Fig. 3 shows the point cloud map constructed in this embodiment with the NDT algorithm based on the GPS track.
As shown in fig. 4, which compares the positioning track with the high-precision GPS track in this embodiment, the two tracks essentially coincide, indicating that the method achieves high and stable positioning accuracy.
As shown in FIG. 5, which compares the relative positioning result of an existing single lidar with the high-precision GPS track, the error of the single-lidar relative positioning result is larger.
In this embodiment, the relative positioning system based on the multi-sensor fusion unmanned vehicle includes a memory and a controller, where the controller is connected to the memory, and the memory stores a computer readable program, and when the controller invokes the computer readable program, the controller can execute the steps of the relative positioning method based on the multi-sensor fusion unmanned vehicle described in this embodiment.
In this embodiment, a vehicle adopts the relative positioning system based on the multi-sensor fusion unmanned vehicle as described in this embodiment.

Claims (6)

1. A relative positioning method based on a multi-sensor fusion unmanned vehicle, characterized by comprising the following steps:
step 1, data access;
receiving the data output by the GPS, IMU, wheel speed meter and lidar in real time, attaching the time stamp of the current moment to each frame of data, and storing the frames in their corresponding buffers;
step 2, positioning initialization;
popping two frames of GPS data from the GPS buffer, judging from them whether the vehicle has moved, and if so, calculating the current position P, speed V and heading angle of the vehicle;
step 3, IMU integral calculation;
obtaining the IMU state (R, p, V, ba, bg) at each IMU moment by integrating the linear acceleration and angular velocity of the IMU according to the IMU dynamics equations, wherein R denotes the attitude rotation, p the translation, ba the accelerometer bias and bg the gyroscope bias;
performing distortion correction on the real-time lidar point cloud according to the historical positioning poses and the pose (R, p) obtained from IMU integration;
step 4, matching a map database;
according to the translation p calculated by IMU integration, searching the map database with KNN for the map frame nearest to it, calculating the relative pose (R, p) between the real-time radar frame and the retrieved map frame with the NDT matching algorithm, and adding this pose (R, p) to the factor-graph computation graph as a unary factor;
step 5, fusion positioning modeling;
adding the received GPS signal to the factor-graph computation graph as a unary factor;
adding the received wheel speed signal to the factor-graph computation graph as a unary factor;
adding the IMU states (R, p, V, ba, bg) to the factor-graph computation graph as binary factors;
step 6, fusion positioning solution;
and sending the factors of all the data to the iSAM2 module to obtain the final lateral and longitudinal coordinates and the heading angle of the vehicle in the map.
2. The relative positioning method based on a multi-sensor fusion unmanned vehicle according to claim 1, characterized in that: two frames of GPS data more than 0.1 m apart are used; the displacement of the later frame minus the earlier frame is divided by the elapsed time to obtain the speed, and the heading angle is obtained from the velocity components in the X and Y directions; the calculated speed is used as the initialization speed of the IMU, and the GPS coordinates are passed to the NDT database to retrieve the bin file of the nearest frame of NDT data; after retrieval, the current frame is registered against the map to obtain the current position P, speed V and heading angle of the vehicle; the calculated position P and heading angle are taken as the initialization pose of the IMU and, together with the speed V, as the IMU initialization parameters for the main thread.
3. The relative positioning method based on a multi-sensor fusion unmanned vehicle according to claim 2, characterized in that the IMU state vector at each moment is calculated as follows:
first, the IMU data between the two moments is looked up and a 6×1 matrix comprising acceleration and angular velocity is created; together with the time difference between the two data frames it is fed into the imu_preintegrated function of the factor graph model for integration, and the IMU state (R, p, V, ba, bg) is predicted from each incoming frame of IMU data.
4. The relative positioning method based on a multi-sensor fusion unmanned vehicle according to claim 2, characterized in that step 4 specifically comprises:
the map construction process comprises: inputting a trajectory, i.e. the positioning results $\{R_i, p_i\}_{i=1\ldots N}$ calculated by the imufactor, where $R_i$ denotes the rotation and $p_i$ the translation, together with the radar point cloud frames at the corresponding times $\{L_i\}_{i=1\ldots N}$, where $L_i$ denotes the pose of each lidar frame in the local coordinate system; converting the point cloud of each frame into the world coordinate system, i.e. $L_i^W = R_i L_i + p_i$, where $L_i^W$ denotes the frame converted from the local to the world coordinate system and $p_i$ the translation of the coordinate conversion; and finally obtaining the lidar point cloud map;
map matching: matching is performed with NDT, i.e. with the current point as the center, the space occupied by the reference point cloud is divided into grids or voxels of a specified size; the lidar scan points falling into each cell are counted and, when a cell has collected no fewer than 4 laser points, the multidimensional normal distribution parameters of that cell are calculated, giving the corresponding mean and covariance; the odometry data obtained from the IMU is then used to initialize the transformation parameters, and the point cloud to be registered is transformed by the transformation matrix $T$ into the grid of the reference point cloud, i.e. $L_{new}^W = T\,L_{new}$, where $L_{new}^W$ denotes the registered point cloud transformed into map coordinates and $L_{new}$ denotes the lidar point cloud in the local coordinate system; the normal-distribution PDF of the points lying within a cell is calculated as $p(k) = \exp\left(-\tfrac{1}{2}(k-q)^T \Sigma^{-1} (k-q)\right)$, where $k$ denotes a scan point in the cell, $q$ the mean of the normal distribution and $\Sigma$ its covariance; the probability densities calculated for each cell are summed to give the NDT registration score, and the objective function score is optimized with a Newton optimization algorithm, i.e. the transformation parameters are sought that make the score optimal, giving the final NDT positioning result; after the coordinate information of the first GPS frame is received, the map frame closest to those coordinates is retrieved and stored in a container, the current lidar data, coordinate-converted after removal of point cloud distortion, is registered against that map frame to obtain the registered relative pose, and the obtained pose is passed as a unary factor to the factor-graph computation graph.
5. A relative positioning system based on a multi-sensor fusion unmanned vehicle, comprising a memory and a controller, the controller being connected with the memory, characterized in that: the memory stores a computer-readable program, and when the controller calls the computer-readable program, the controller can execute the steps of the relative positioning method based on a multi-sensor fusion unmanned vehicle according to any one of claims 1 to 4.
6. A vehicle, characterized by adopting the relative positioning system based on a multi-sensor fusion unmanned vehicle according to claim 5.
CN202110897052.3A 2021-08-05 2021-08-05 Relative positioning method and system based on multi-sensor fusion unmanned vehicle and vehicle Active CN113758491B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110897052.3A CN113758491B (en) 2021-08-05 2021-08-05 Relative positioning method and system based on multi-sensor fusion unmanned vehicle and vehicle


Publications (2)

Publication Number Publication Date
CN113758491A CN113758491A (en) 2021-12-07
CN113758491B (en) 2024-02-23

Family

ID=78788807

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110897052.3A Active CN113758491B (en) 2021-08-05 2021-08-05 Relative positioning method and system based on multi-sensor fusion unmanned vehicle and vehicle

Country Status (1)

Country Link
CN (1) CN113758491B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110243358A (en) * 2019-04-29 2019-09-17 武汉理工大学 The unmanned vehicle indoor and outdoor localization method and system of multi-source fusion
WO2019188704A1 (en) * 2018-03-29 2019-10-03 パイオニア株式会社 Self-position estimation device, self-position estimation method, program, and recording medium
CN111413721A (en) * 2020-01-14 2020-07-14 华为技术有限公司 Vehicle positioning method, device, controller, intelligent vehicle and system
CN111522043A (en) * 2020-04-30 2020-08-11 北京联合大学 Unmanned vehicle laser radar rapid re-matching positioning method
CN111949943A (en) * 2020-07-24 2020-11-17 北京航空航天大学 Vehicle fusion positioning method for V2X and laser point cloud registration for advanced automatic driving
CN112083725A (en) * 2020-09-04 2020-12-15 湖南大学 Structure-shared multi-sensor fusion positioning system for automatic driving vehicle
CN112254729A (en) * 2020-10-09 2021-01-22 北京理工大学 Mobile robot positioning method based on multi-sensor fusion
CN112923931A (en) * 2019-12-06 2021-06-08 北理慧动(常熟)科技有限公司 Feature map matching and GPS positioning information fusion method based on fixed route

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3078935A1 (en) * 2015-04-10 2016-10-12 The European Atomic Energy Community (EURATOM), represented by the European Commission Method and device for real-time mapping and localization

Also Published As

Publication number Publication date
CN113758491A (en) 2021-12-07

Similar Documents

Publication Publication Date Title
US11002859B1 (en) Intelligent vehicle positioning method based on feature point calibration
CN109709801B (en) Indoor unmanned aerial vehicle positioning system and method based on laser radar
Zou et al. A comparative analysis of LiDAR SLAM-based indoor navigation for autonomous vehicles
KR102581263B1 (en) Method, apparatus, computing device and computer-readable storage medium for positioning
CN107239076B (en) AGV laser SLAM method based on virtual scanning and distance measurement matching
CN109211251B (en) Instant positioning and map construction method based on laser and two-dimensional code fusion
JP7086111B2 (en) Feature extraction method based on deep learning used for LIDAR positioning of autonomous vehicles
KR102628778B1 (en) Method and apparatus for positioning, computing device, computer-readable storage medium and computer program stored in medium
EP4170282A1 (en) Method for calibrating mounting deviation angle between sensors, combined positioning system, and vehicle
CN112083726B (en) Park-oriented automatic driving double-filter fusion positioning system
WO2022007776A1 (en) Vehicle positioning method and apparatus for target scene region, device and storage medium
CN114526745B (en) Drawing construction method and system for tightly coupled laser radar and inertial odometer
CN114018248B (en) Mileage metering method and image building method integrating code wheel and laser radar
CN112965063B (en) Robot mapping and positioning method
CN113920198B (en) Coarse-to-fine multi-sensor fusion positioning method based on semantic edge alignment
CN114004869A (en) Positioning method based on 3D point cloud registration
CN113192142A (en) High-precision map construction method and device in complex environment and computer equipment
CN113763549A (en) Method, device and storage medium for simultaneous positioning and mapping by fusing laser radar and IMU
CN116337045A (en) High-speed map building navigation method based on karto and teb
CN115311349A (en) Vehicle automatic driving auxiliary positioning fusion method and domain control system thereof
CN114689035A (en) Long-range farmland map construction method and system based on multi-sensor fusion
CN113758491B (en) Relative positioning method and system based on multi-sensor fusion unmanned vehicle and vehicle
CN117029870A (en) Laser odometer based on road surface point cloud
CN116067358A (en) Multi-source data fusion map building and positioning method and system and automatic driving vehicle
WO2022194110A1 (en) External parameter calibration method and apparatus, device, server and vehicle-mounted computing device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant