CN115951369A - Multi-sensor fusion positioning method for complex port environment

Info

Publication number: CN115951369A
Application number: CN202211604925.8A
Authority: CN (China)
Prior art keywords: GPS, positioning, coordinate system, environment, vehicle
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 张雪峰, 孙忠平, 殷嘉伦
Assignee: Changjia Fengxing Suzhou Intelligent Technology Co., Ltd.
Priority/filing date: 2022-12-14
Publication date: 2023-04-11

Classifications

    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T — CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 — Road transport of goods or passengers
    • Y02T10/10 — Internal combustion engine [ICE] based vehicles
    • Y02T10/40 — Engine management systems

Landscapes

  • Position Fixing By Use Of Radio Waves (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention discloses a multi-sensor fusion positioning method for complex port environments, which comprises the following steps: S1, unifying the multi-sensor coordinate systems; S2, extracting constraints in an environment with GPS signals and a rich structure; S3, extracting constraints in an environment without GPS signals but with a rich structure; S4, extracting constraints in an environment without GPS signals and with structural degradation; and S5, establishing a nonlinear fusion positioning framework for the different sensors and fusing all nonlinear constraints acquired by the different sensors in the different environments. The method achieves seamless vehicle positioning across changes of scene features and environment, effectively reduces linearization error, and solves the positioning problem in structurally degraded environments.

Description

Multi-sensor fusion positioning method for complex port environment
Technical Field
The invention relates to the technical field of positioning, in particular to a multi-sensor fusion positioning method for a complex port environment.
Background
Advances in autonomous navigation technology, particularly robot positioning in GPS-denied environments, have driven rapid growth in the use of robots for inspection tasks. Although many methods have been proposed to localize robots with onboard sensors such as cameras and lidar, robust localization in geometrically degenerate environments such as tunnels remains a challenging problem.
Lidar data offers high accuracy, good real-time performance, and stability, and the sensor is easy to install, making it well suited as the detector of a small environmental perception system. The Lidar Odometry and Mapping (LOAM) algorithm is a lidar-based SLAM algorithm. However, because lidar captures geometric information by scanning the environment, it is more easily affected in geometrically degraded settings such as tunnels.
UWB is an emerging wireless communication technology. A key advantage of UWB-based ranging is that its wide bandwidth allows time of arrival (ToA) and round-trip time (RTT) to be measured with higher accuracy than other wireless technologies. In recent years, UWB-based spatial positioning has been widely adopted in many military and civilian fields. UWB effectively addresses indoor GPS signal occlusion, but it is not suitable for global positioning.
Because of scene complexity, no single sensor can solve the positioning problem in all scenes, and multiple sensors are often fused. The laser sensor provides high data accuracy and strong environmental adaptability but cannot distinguish geometrically continuous, repetitive scenes; fused with UWB sensor data, it can still localize in degraded environments.
Traditional fusion frameworks such as Kalman filters (including variants such as the extended Kalman filter (EKF), the unscented Kalman filter (UKF), and the invariant EKF) are very popular owing to their mature technology, simple implementation, and high computational efficiency. In tunnels, however, the performance of EKF-based GNSS/INS integration degrades significantly.
In addition, each existing fusion framework is tailored to a single scene, solving positioning only in, for example, a degraded environment or a GPS-denied one; a general fusion framework for seamless, high-precision vehicle positioning across different environments is lacking.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides a multi-sensor fusion positioning method for a complex port environment.
Therefore, the invention adopts the following technical scheme:
a multi-sensor fusion positioning method for complex port environment comprises the following steps:
s1, unifying a multi-sensor coordinate system: constructed map based on world coordinate system
Figure BDA0003998256850000021
The vehicle coordinate system is based on the LiDAR coordinate system by transforming the matrix->
Figure BDA0003998256850000022
Converting the location result of the LiDAR odometer to the world coordinate system->
Figure BDA0003998256850000023
And the transformation matrix between the LiDAR coordinate system and the GPS coordinate system is ^ and/or is based on the SVD decomposition method>
Figure BDA0003998256850000024
Resolved into a rotation matrix->
Figure BDA0003998256850000025
And a translation matrix pick>
Figure BDA0003998256850000026
S2, extracting the constraint of the GPS signal in the environment with rich structure, comprising the following steps:
s21, acquiring GPS signals through a GPS receiver, and converting the signals into GPS positioning factors:
when the vehicle runs a certain distance, the GPS sensor receives a group of GPS data and a GPS positioning factor at the moment k
Figure BDA0003998256850000027
The definition is as follows:
Figure BDA0003998256850000028
Figure BDA0003998256850000029
wherein
Figure BDA00039982568500000210
Is the GPS fix at time k, including the x, y, and z coordinates of the vehicle; />
Figure BDA00039982568500000211
The translation vector in the vehicle state at the moment k is also the vehicle position, is an unknown quantity and comprises x, y and z coordinates of the vehicle; using covariance matrices
Figure BDA00039982568500000212
Weighting the location factor and->
Figure BDA00039982568500000213
The design is as follows:
Figure BDA00039982568500000214
wherein the content of the first and second substances,
Figure BDA00039982568500000215
and &>
Figure BDA00039982568500000216
The weights of the GPS positioning in the x direction, the y direction and the z direction are scalar quantities, and are determined by the GPS positioning precision and the GPS positioning state in the x direction, the y direction and the z direction;
s22, utilizing the laser radar sensor to carry out SLAM positioning, and acquiring relative pose factors in unit time:
based on the LiDAR odometry proposed in LOAM, relative pose measurements [ Δ α ] between consecutive LiDAR frames k and k +1 are obtained k Δβ k ] T Wherein
Figure BDA00039982568500000217
Relative pose factor of time->
Figure BDA00039982568500000218
Is defined as:
Figure BDA00039982568500000219
Figure BDA00039982568500000220
wherein
Figure BDA0003998256850000031
Is a translation vector in the vehicle state at the moment k-1 and k; />
Figure BDA0003998256850000032
The rotation vectors in the vehicle states at the k-1 moment and the k moment are unknown quantities; />
Figure BDA0003998256850000033
Is the world coordinate system>
Figure BDA0003998256850000034
And lidar coordinate system>
Figure BDA0003998256850000035
By the rotation matrix calculated in S1->
Figure BDA0003998256850000036
Replacing; by means of the covariance matrix->
Figure BDA0003998256850000037
Weighting the relative attitude factor;
s3, extracting constraints in the environment without GPS signals but with a rich structure;
s4, extracting the constraint under the environment without GPS signals and structural degradation;
s5, establishing a nonlinear fusion positioning frame facing different sensors, and fusing all nonlinear constraints acquired by the different sensors in different environments:
Figure BDA0003998256850000038
wherein, when the GPS has signals, a 1 =1,a 3 =0, otherwise, a 1 =0,a 3 =1;
When in a structurally rich environment, a 0 =1,a 2 =0, otherwise, a 0 =0,a 2 =1;
Solving the above formula by using a factor graph principle, and acquiring the vehicle state X under the condition of minimum constraint factors k =[t k r k ] T ,
Figure BDA0003998256850000039
The transformation matrix $T_L^W$ in step S1 above is represented as follows:

$$T_L^W = \arg\min_{T} \sum_{i} \left\| g_i - T\, l_i \right\|^2, \qquad g_i \in G, \ l_i \in L$$

where $G$ is the set of GPS positioning results and $L$ is the set of positioning results of the LiDAR SLAM.
In step S22 above, the covariance matrix $\Omega_{LiDAR}^k$ is designed as:

$$\Omega_{LiDAR}^k = \mathrm{diag}\left( w_r, w_t \right)$$

where $w_r$ is the weight of the rotation, corrected by plane detection, and $w_t$ is the weight of the translation, a scalar calculated as follows.

The vehicle position at time $k$ is predicted from the vehicle positions at the previous two times using the vehicle motion equation

$$\hat{t}_k = 2 t_{k-1} - t_{k-2}$$

The weight $w_t$ is related to the error distance between the predicted position $\hat{t}_k$ and the position $t_k$ calculated by the LiDAR odometer: the larger the error distance, the smaller the weight. Using a Gaussian model, the weight is modeled on the error distance as:

$$w_t = \exp\left( -\frac{\left\| \hat{t}_k - t_k \right\|^2}{2 \sigma^2} \right)$$

where $\sigma$ is set according to the actual situation.
Step S3 comprises the following sub-steps:

S31, as in step S22, performing SLAM positioning with the lidar sensor and acquiring relative pose factors per unit time;

S32, receiving the signals transmitted by the roadside cooperative positioning equipment with the UWB receiver and calculating the distance measurement factor between the UWB receiver and the roadside cooperative positioning equipment:

The UWB sensor measures the distance from each roadside co-located device to the receiver; the distance between the UWB receiver and the j-th roadside co-located device is denoted $d_j^k$, a scalar. Assuming the receiver is located at the origin of the LiDAR coordinate system $L$, the distance measurement factor from the UWB receiver is defined as:

$$f_{UWB}^k = \sum_{j} w_{UWB} \left( d_j^k - \left\| t_k - p_{U_j} \right\| \right)^2$$

where $p_{U_j}$ is the position of the j-th roadside cooperative positioning device $U_j$ in the $W$ coordinate system. The ranging error of the roadside co-located devices is at most 5 cm, so the factor is weighted by the scalar $w_{UWB}$, which should accordingly be set to a higher weight.
Step S4 comprises the following sub-steps:

S41, detecting wall surfaces with the lidar sensor, calculating the distance to each wall, matching the walls against those in the map, and converting the distances into distance measurement factors:

In the map generation step, the plane equations of all wall surfaces in $W$ are manually measured and expressed as:

$$n_i^T p + d_i = 0, \qquad n_i = \left[ n_i^x \ \ n_i^y \ \ n_i^z \right]^T$$

where $n_i$ is the normal vector of the i-th plane, whose components $n_i^x$, $n_i^y$, and $n_i^z$ in the x, y, and z directions are scalars, and $d_i$ is the intercept of the i-th plane, a scalar.

When the lidar sensor scans a lidar point cloud, planar lidar points are extracted by a plane detection method based on RANSAC. The distance between the lidar sensor and the i-th wall surface is denoted $z_i^k$, a scalar. The distance measurement factor from plane detection is defined as:

$$f_{plane}^k = \sum_{i=1}^{l} w_{plane} \left( z_i^k - \left| n_i^T t_k + d_i \right| \right)^2$$

where $l$ wall surfaces are detected at time $k$. The error of the distance measured from the i-th plane is at most 2 cm, so the factor is weighted by the scalar $w_{plane}$, which should accordingly be set to a higher weight.

S42, as in S32, receiving the signals transmitted by the roadside cooperative positioning equipment with the UWB receiver and calculating the distance measurement factor between the UWB receiver and the roadside cooperative positioning equipment.
Compared with the prior art, the invention has the following beneficial effects:

1. Compared with traditional fusion positioning methods, the method has a smaller error. Factor-graph data fusion builds a factor graph from the current data and the historical data together, establishes a batch optimization cost function on that graph, and then optimizes all historical and current information jointly. Because factor graph optimization iterates multiple times, the linearization error is effectively reduced.

2. The fusion positioning method achieves seamless vehicle positioning under changes of scene features and environment, for example from a structure-rich environment to a structurally degraded one, or from an environment with GPS signals to one without. Such environmental changes would otherwise cause positioning failures where GPS is unavailable, as well as problems of selecting and fusing different positioning methods.

3. The invention solves the positioning problem in structurally degraded environments.
Drawings

FIG. 1 is a schematic view of the vehicle sensor installation according to the present invention;

FIG. 2 is a schematic diagram of positioning in an environment with GPS and a rich structure;

FIG. 3 is a schematic diagram of positioning in an environment without GPS but with a rich structure;

FIG. 4 is a schematic diagram of positioning in an environment without GPS and with structural degradation.
Detailed Description
The fusion positioning method of the present invention is described in detail below with reference to the accompanying drawings and specific examples.

On the basis of a high-precision map, the multi-sensor fusion positioning method for complex port environments uses roadside cooperative positioning equipment and vehicle-mounted multi-sensor positioning equipment to form a multi-sensor assisted positioning system for the complex port environment. The roadside cooperative equipment is a UWB transmitting device, and the vehicle-mounted multi-sensors comprise a lidar sensor, a GPS receiver, and a UWB receiver. The sensor installation is shown in FIG. 1.

The multi-sensor assisted positioning system adopted by the invention comprises the vehicle-mounted multi-sensors, a lane-level map, roadside cooperative positioning equipment, and GPS, wherein:

the vehicle-mounted multi-sensors comprise a GPS receiver, a lidar sensor, and a UWB receiver; the GPS receiver acquires GPS signals; the lidar sensor acquires laser point clouds; the UWB receiver acquires the UWB signals sent by the roadside cooperative positioning devices;

the roadside cooperative positioning equipment consists of UWB transmitting devices, used to transmit UWB signals and installed only in areas without GPS signals;

the lane-level map contains the building structure information of the port and the position information of the roadside cooperative equipment;

the system collects GPS signals, three-dimensional laser point clouds, and UWB ranging information from the GPS, lidar, and UWB sensors, converts this information into constraints of various scales in combination with the lane-level map, and realizes positioning by factor graph fusion.

For an environment with GPS and a rich structure, GPS and lidar SLAM positioning are used, as shown in FIG. 2. For an environment without GPS but with a rich structure (such as an underground parking lot or a freight warehouse), fused lidar SLAM and UWB positioning is used, as shown in FIG. 3. For an environment without GPS and with structural degradation (such as under a gantry crane bridge), fused lidar and UWB positioning is used, as shown in FIG. 4.
The multi-sensor fusion positioning method for complex port environments of the invention comprises the following steps:

S1, unifying the multi-sensor coordinate systems:

In the whole vehicle positioning system in the tunnel environment, the coordinate systems involved are the world coordinate system $W$ and the LiDAR coordinate system $L$. The constructed map is based on the world coordinate system $W$, and the vehicle coordinate system is based on the LiDAR coordinate system, related by the transformation matrix $T_L^W$. The positions of the GPS receiver and the UWB receiver in the LiDAR coordinate system are determined; the lidar sensor receives a set of lidar data, and the lidar coordinate system of the first frame is taken as the LiDAR coordinate system. The transformation matrix between the LiDAR coordinate system and the GPS coordinate system is represented by:

$$T_L^W = \arg\min_{T} \sum_{i} \left\| g_i - T\, l_i \right\|^2, \qquad g_i \in G, \ l_i \in L$$

where $G$ is the set of GPS positioning results and $L$ is the set of LiDAR SLAM positioning results.

The positioning results of the LiDAR odometer are converted into the world coordinate system $W$ through the transformation matrix $T_L^W$, and $T_L^W$ is decomposed by the SVD method into a rotation matrix $R_L^W$ and a translation matrix $t_L^W$.
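For illustration, a minimal sketch of the SVD-based alignment described above follows: it estimates the rotation and translation that map time-synchronized LiDAR SLAM positions onto GPS positions via a Kabsch-style decomposition. The function and variable names (align_svd, gps_xyz, lidar_xyz) are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def align_svd(gps_xyz: np.ndarray, lidar_xyz: np.ndarray):
    """Find R, t minimizing sum_i ||g_i - (R l_i + t)||^2 (Kabsch-style)."""
    g_mean = gps_xyz.mean(axis=0)
    l_mean = lidar_xyz.mean(axis=0)
    # Cross-covariance of the two centered point sets
    H = (lidar_xyz - l_mean).T @ (gps_xyz - g_mean)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection solution (determinant -1)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = g_mean - R @ l_mean
    return R, t

# Usage: R_WL, t_WL = align_svd(G, L) with G, L as (N, 3) arrays of
# time-synchronized GPS and LiDAR SLAM positions.
```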
S2, extracting constraints in an environment with GPS signals and a rich structure, comprising the following steps:

S21, acquiring GPS signals through a GPS receiver and converting them into GPS positioning factors:

Each time the vehicle travels a certain distance, the GPS sensor receives a set of GPS data, and the GPS positioning factor at time $k$ is defined as:

$$f_{GPS}^k = \left\| t_k - p_k^{GPS} \right\|_{\Omega_{GPS}^k}^2 = \left( t_k - p_k^{GPS} \right)^T \Omega_{GPS}^k \left( t_k - p_k^{GPS} \right)$$

where $p_k^{GPS}$ is the GPS positioning result at time $k$, comprising the x, y, and z coordinates of the vehicle, and $t_k$ is the translation vector in the vehicle state at time $k$, i.e. the vehicle position, an unknown quantity also comprising the x, y, and z coordinates of the vehicle. The positioning factor is weighted by the covariance matrix $\Omega_{GPS}^k$, designed as:

$$\Omega_{GPS}^k = \mathrm{diag}\left( w_x, w_y, w_z \right)$$

where $w_x$, $w_y$, and $w_z$ are the scalar weights of the GPS positioning in the x, y, and z directions, determined by the GPS positioning accuracy and positioning state in each direction.
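As an illustration of S21, the sketch below evaluates the GPS factor with the diagonal matrix diag(w_x, w_y, w_z); the function name and the example weight values are assumptions chosen for clarity.

```python
import numpy as np

def gps_factor(t_k: np.ndarray, p_gps_k: np.ndarray, w_xyz: np.ndarray) -> float:
    """(t_k - p_k)^T diag(w_x, w_y, w_z) (t_k - p_k): weighted squared error."""
    r = t_k - p_gps_k              # residual in x, y, z
    return float(r @ (w_xyz * r))  # diagonal weighting, no full matrix needed

# Example: trust the horizontal GPS solution more than the vertical one.
cost = gps_factor(np.array([10.0, 5.0, 1.2]),
                  np.array([10.3, 4.8, 0.9]),
                  np.array([1.0, 1.0, 0.25]))
```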
s22, utilizing the laser radar sensor to perform SLAM positioning, and acquiring relative pose factors in unit time:
based on the LiDAR odometry proposed in LOAM, relative pose measurements [ Δ α ] between consecutive LiDAR frames k and k +1 are obtained k Δβ k ] T Wherein
Figure BDA00039982568500000713
Relative pose factor of time->
Figure BDA00039982568500000714
Is defined as:
Figure BDA00039982568500000715
Figure BDA00039982568500000716
wherein
Figure BDA00039982568500000717
Is a translation vector in the vehicle state at the moment k-1 and k; />
Figure BDA00039982568500000718
The rotation vectors in the vehicle states at the k-1 moment and the k moment are unknown quantities; />
Figure BDA00039982568500000719
Is the world coordinate system->
Figure BDA00039982568500000720
And the laser radar coordinate system->
Figure BDA00039982568500000721
By the rotation matrix calculated in S1->
Figure BDA00039982568500000722
Replacing; by means of the covariance matrix->
Figure BDA00039982568500000723
The relative pose factors are weighted. Covariance matrix ≥>
Figure BDA00039982568500000724
The design is as follows: />
Figure BDA00039982568500000725
Wherein
Figure BDA0003998256850000081
Is rotatedThe weight of (3), corrected by plane detection; />
Figure BDA0003998256850000082
Is the weight of the conversion, designed as follows:
Figure BDA0003998256850000083
wherein
Figure BDA0003998256850000084
Is a scalar quantity, and the calculation method is as follows:
the vehicle position at time k is predicted from the vehicle positions at the first two times using the following vehicle equation of motion:
Figure BDA0003998256850000085
predicted position calculated by LiDAR odometer
Figure BDA0003998256850000086
And position t k Error distance and weight between->
Figure BDA0003998256850000087
In relation, the larger the error distance, the smaller the weight; weighting based on error distance using a Gaussian model>
Figure BDA0003998256850000088
Modeling was performed as follows:
Figure BDA0003998256850000089
where the variable σ is the actual situation setting.
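The translation weight design above can be illustrated as follows: the constant-velocity prediction matches the motion equation given, while the function name and the default sigma are assumptions.

```python
import numpy as np

def translation_weight(t_km1, t_km2, t_k_lidar, sigma=0.5):
    """Gaussian weight w_t from the constant-velocity prediction error."""
    t_pred = 2.0 * t_km1 - t_km2                # motion-equation prediction of t_k
    err = np.linalg.norm(t_k_lidar - t_pred)    # error distance
    # Larger error distance -> smaller weight; sigma is set per the actual
    # situation (0.5 m here is an arbitrary example value).
    return float(np.exp(-err**2 / (2.0 * sigma**2)))
```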
S3, extracting constraints in an environment without GPS signals but with a rich structure, comprising the following steps:

S31, as in step S22, performing SLAM positioning with the lidar sensor and acquiring relative pose factors per unit time;

S32, receiving the signals transmitted by the roadside cooperative positioning equipment with the UWB receiver and calculating the distance measurement factor between the UWB receiver and the roadside cooperative positioning equipment:

The UWB sensor measures the distance from each roadside co-located device to the receiver; the distance between the UWB receiver and the j-th roadside co-located device is denoted $d_j^k$, a scalar. Assuming the receiver is located at the origin of the LiDAR coordinate system $L$, the distance measurement factor from the UWB receiver is defined as:

$$f_{UWB}^k = \sum_{j} w_{UWB} \left( d_j^k - \left\| t_k - p_{U_j} \right\| \right)^2$$

where $p_{U_j}$ is the position of the j-th roadside cooperative positioning device $U_j$ in the $W$ coordinate system. The ranging error of the roadside co-located devices is at most 5 cm, so the factor is weighted by the scalar $w_{UWB}$, which should accordingly be set to a higher weight.
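A minimal sketch of the UWB distance measurement factor follows, assuming the anchor positions are read from the lane-level map; the weight value shown (derived loosely from the 5 cm ranging error) and the names are illustrative.

```python
import numpy as np

def uwb_factor(t_k, anchor_positions, measured_ranges, w_uwb=400.0):
    """Sum of weighted squared range residuals over all visible anchors."""
    cost = 0.0
    for p_j, d_j in zip(anchor_positions, measured_ranges):
        predicted = np.linalg.norm(t_k - p_j)   # geometric range to anchor j
        cost += w_uwb * (d_j - predicted) ** 2  # w_uwb ~ (1 / 0.05 m)^2
    return cost
```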
S4, extracting constraints in an environment without GPS signals and with structural degradation, comprising the following steps:

S41, detecting wall surfaces with the lidar sensor, calculating the distance to each wall, matching the walls against those in the map, and converting the distances into distance measurement factors:

In the map generation step, the plane equations of all wall surfaces in $W$ are manually measured and expressed as:

$$n_i^T p + d_i = 0, \qquad n_i = \left[ n_i^x \ \ n_i^y \ \ n_i^z \right]^T$$

where $n_i$ is the normal vector of the i-th plane, whose components $n_i^x$, $n_i^y$, and $n_i^z$ in the x, y, and z directions are scalars, and $d_i$ is the intercept of the i-th plane, a scalar.

When the lidar sensor scans a lidar point cloud, planar lidar points are extracted by a plane detection method based on RANSAC. The distance between the lidar sensor and the i-th wall surface is denoted $z_i^k$, a scalar. The distance measurement factor from plane detection is defined as:

$$f_{plane}^k = \sum_{i=1}^{l} w_{plane} \left( z_i^k - \left| n_i^T t_k + d_i \right| \right)^2$$

where $l$ wall surfaces are detected at time $k$. The error of the distance measured from the i-th plane is at most 2 cm, so the factor is weighted by the scalar $w_{plane}$, which should accordingly be set to a higher weight.

S42, as in S32, receiving the signals transmitted by the roadside cooperative positioning equipment with the UWB receiver and calculating the distance measurement factor between the UWB receiver and the roadside cooperative positioning equipment.
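The wall-distance factor can be sketched as below, assuming unit plane normals so that |n_i^T t_k + d_i| is the point-to-plane distance; the names and the weight value (loosely derived from the 2 cm error) are illustrative.

```python
import numpy as np

def plane_factor(t_k, walls, measured_dists, w_plane=2500.0):
    """Sum of weighted squared wall-distance residuals; w_plane ~ (1/0.02 m)^2."""
    cost = 0.0
    for (n_i, d_i), z_i in zip(walls, measured_dists):
        dist = abs(float(n_i @ t_k) + d_i)      # point-to-plane distance in the map
        cost += w_plane * (z_i - dist) ** 2
    return cost
```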
S5, establishing a nonlinear fusion positioning framework for the different sensors and fusing all nonlinear constraints acquired by the different sensors in the different environments:

$$\hat{X} = \arg\min_{X} \sum_{k} \left( a_0 f_{LiDAR}^k + a_1 f_{GPS}^k + a_2 f_{plane}^k + a_3 f_{UWB}^k \right)$$

where $a_1 = 1, a_3 = 0$ when GPS signals are available, and otherwise $a_1 = 0, a_3 = 1$; and $a_0 = 1, a_2 = 0$ in a structure-rich environment, and otherwise $a_0 = 0, a_2 = 1$.

The above formula is solved by the factor graph principle, obtaining the vehicle state $X_k = [t_k \ \ r_k]^T$ that minimizes all constraint factors.
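To illustrate the switching behavior of S5, the sketch below gates the four factor types with the indicator coefficients a_0 through a_3 and minimizes their sum. The patent solves this with factor graph optimization; a general-purpose scipy minimizer stands in here purely as an assumption for demonstration, as do the function names.

```python
import numpy as np
from scipy.optimize import minimize

def total_cost(t_k, factors, gates):
    # factors: callables f(t_k) -> scalar, ordered [f_lidar, f_gps, f_plane, f_uwb]
    # gates:   indicator coefficients [a0, a1, a2, a3] from the environment
    return sum(a * f(t_k) for a, f in zip(gates, factors))

def fuse(t_init, factors, gps_ok: bool, structure_rich: bool):
    a0, a2 = (1, 0) if structure_rich else (0, 1)   # lidar SLAM vs. wall planes
    a1, a3 = (1, 0) if gps_ok else (0, 1)           # GPS vs. UWB
    result = minimize(total_cost, t_init, args=(factors, [a0, a1, a2, a3]),
                      method="Nelder-Mead")
    return result.x
```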
In one embodiment of the invention, the truck is 6.1 m long, 2.4 m wide, and 2.5 m high, and the GPS, lidar, and UWB devices are respectively mounted at three positions on the truck, as shown in FIG. 1. The sensor parameters are as follows:

1. Lidar: model RPLIDAR S2; scanning frequency 10 Hz; angular resolution 0.12°; range error ±30 mm.

2. GPS: model Cooper BD3U; positioning accuracy 2.5 m (CEP50, open area); positioning update rate 1 Hz by default, 10 Hz maximum.

3. UWB: model USB-S1-PRO; ranging range 600 m (clear line of sight); positioning error ±10 cm on the X and Y axes and ±20 cm on the Z axis; ranging accuracy ±5 cm.

Claims (5)

1. A multi-sensor fusion positioning method for complex port environments, comprising the following steps:

S1, unifying the multi-sensor coordinate systems: the constructed map is based on the world coordinate system $W$, and the vehicle coordinate system is based on the LiDAR coordinate system $L$; the positioning results of the LiDAR odometer are converted into the world coordinate system $W$ through the transformation matrix $T_L^W$, and the transformation matrix $T_L^W$ between the LiDAR coordinate system and the GPS coordinate system is decomposed by the SVD method into a rotation matrix $R_L^W$ and a translation matrix $t_L^W$;

S2, extracting constraints in an environment with GPS signals and a rich structure, comprising the following steps:

S21, acquiring GPS signals through a GPS receiver and converting them into GPS positioning factors:

each time the vehicle travels a certain distance, the GPS sensor receives a set of GPS data, and the GPS positioning factor at time $k$ is defined as:

$$f_{GPS}^k = \left\| t_k - p_k^{GPS} \right\|_{\Omega_{GPS}^k}^2 = \left( t_k - p_k^{GPS} \right)^T \Omega_{GPS}^k \left( t_k - p_k^{GPS} \right)$$

where $p_k^{GPS}$ is the GPS fix at time $k$, comprising the x, y, and z coordinates of the vehicle, and $t_k$ is the translation vector in the vehicle state at time $k$, i.e. the vehicle position, an unknown quantity also comprising the x, y, and z coordinates of the vehicle; the positioning factor is weighted by the covariance matrix $\Omega_{GPS}^k$, designed as:

$$\Omega_{GPS}^k = \mathrm{diag}\left( w_x, w_y, w_z \right)$$

where $w_x$, $w_y$, and $w_z$ are the scalar weights of the GPS positioning in the x, y, and z directions, determined by the GPS positioning accuracy and positioning state in each direction;

S22, performing SLAM positioning with the lidar sensor and acquiring relative pose factors per unit time:

based on the LiDAR odometry proposed in LOAM, the relative pose measurement $[\Delta\alpha_k \ \ \Delta\beta_k]^T$ between consecutive LiDAR frames $k-1$ and $k$ is obtained, where $\Delta\alpha_k$ and $\Delta\beta_k$ are the relative translation and relative rotation, respectively; the relative pose factor at time $k$ is defined as:

$$f_{LiDAR}^k = \left\| \begin{bmatrix} t_k - t_{k-1} - R_L^W \Delta\alpha_k \\ r_k - r_{k-1} - \Delta\beta_k \end{bmatrix} \right\|_{\Omega_{LiDAR}^k}^2$$

where $t_{k-1}$ and $t_k$ are the translation vectors in the vehicle states at times $k-1$ and $k$, and $r_{k-1}$ and $r_k$ are the rotation vectors in those states, all unknown quantities; the rotation between the world coordinate system $W$ and the lidar coordinate system $L$ is replaced by the rotation matrix $R_L^W$ calculated in S1; the relative pose factor is weighted by the covariance matrix $\Omega_{LiDAR}^k$;

S3, extracting constraints in an environment without GPS signals but with a rich structure;

S4, extracting constraints in an environment without GPS signals and with structural degradation;

S5, establishing a nonlinear fusion positioning framework for the different sensors and fusing all nonlinear constraints acquired by the different sensors in the different environments:

$$\hat{X} = \arg\min_{X} \sum_{k} \left( a_0 f_{LiDAR}^k + a_1 f_{GPS}^k + a_2 f_{plane}^k + a_3 f_{UWB}^k \right)$$

where $a_1 = 1, a_3 = 0$ when GPS signals are available, and otherwise $a_1 = 0, a_3 = 1$; and $a_0 = 1, a_2 = 0$ in a structure-rich environment, and otherwise $a_0 = 0, a_2 = 1$;

the above formula is solved by the factor graph principle, obtaining the vehicle state $X_k = [t_k \ \ r_k]^T$ that minimizes all constraint factors.
2. The multi-sensor fusion positioning method of claim 1, wherein the transformation matrix $T_L^W$ in step S1 is represented as follows:

$$T_L^W = \arg\min_{T} \sum_{i} \left\| g_i - T\, l_i \right\|^2, \qquad g_i \in G, \ l_i \in L$$

where $G$ is the set of GPS positioning results and $L$ is the set of positioning results of the LiDAR SLAM.
3. The multi-sensor fusion positioning method of claim 1, wherein in step S22 the covariance matrix $\Omega_{LiDAR}^k$ is designed as:

$$\Omega_{LiDAR}^k = \mathrm{diag}\left( w_r, w_t \right)$$

where $w_r$ is the weight of the rotation, corrected by plane detection, and $w_t$ is the weight of the translation, a scalar calculated as follows:

the vehicle position at time $k$ is predicted from the vehicle positions at the previous two times using the vehicle motion equation

$$\hat{t}_k = 2 t_{k-1} - t_{k-2}$$

and the weight $w_t$ is related to the error distance between the predicted position $\hat{t}_k$ and the position $t_k$ calculated by the LiDAR odometer: the larger the error distance, the smaller the weight; using a Gaussian model, the weight is modeled on the error distance as:

$$w_t = \exp\left( -\frac{\left\| \hat{t}_k - t_k \right\|^2}{2 \sigma^2} \right)$$

where $\sigma$ is set according to the actual situation.
4. The multi-sensor fusion positioning method of claim 1, wherein step S3 comprises the following sub-steps:

S31, as in step S22, performing SLAM positioning with the lidar sensor and acquiring relative pose factors per unit time;

S32, receiving the signals transmitted by the roadside cooperative positioning equipment with the UWB receiver and calculating the distance measurement factor between the UWB receiver and the roadside cooperative positioning equipment:

the UWB sensor measures the distance from each roadside co-located device to the receiver; the distance between the UWB receiver and the j-th roadside co-located device is denoted $d_j^k$, a scalar; assuming the receiver is located at the origin of the LiDAR coordinate system $L$, the distance measurement factor from the UWB receiver is defined as:

$$f_{UWB}^k = \sum_{j} w_{UWB} \left( d_j^k - \left\| t_k - p_{U_j} \right\| \right)^2$$

where $p_{U_j}$ is the position of the j-th roadside cooperative positioning device $U_j$ in the $W$ coordinate system; the ranging error of the roadside co-located devices is at most 5 cm, so the factor is weighted by the scalar $w_{UWB}$, which should accordingly be set to a higher weight.
5. The multi-sensor fusion positioning method of claim 1, wherein step S4 comprises the following sub-steps:

S41, detecting wall surfaces with the lidar sensor, calculating the distance to each wall, matching the walls against those in the map, and converting the distances into distance measurement factors:

in the map generation step, the plane equations of all wall surfaces in $W$ are manually measured and expressed as:

$$n_i^T p + d_i = 0, \qquad n_i = \left[ n_i^x \ \ n_i^y \ \ n_i^z \right]^T$$

where $n_i$ is the normal vector of the i-th plane, whose components $n_i^x$, $n_i^y$, and $n_i^z$ in the x, y, and z directions are scalars, and $d_i$ is the intercept of the i-th plane, a scalar;

when the lidar sensor scans a lidar point cloud, planar lidar points are extracted by a plane detection method based on RANSAC; the distance between the lidar sensor and the i-th wall surface is denoted $z_i^k$, a scalar; the distance measurement factor from plane detection is defined as:

$$f_{plane}^k = \sum_{i=1}^{l} w_{plane} \left( z_i^k - \left| n_i^T t_k + d_i \right| \right)^2$$

where $l$ wall surfaces are detected at time $k$; the error of the distance measured from the i-th plane is at most 2 cm, so the factor is weighted by the scalar $w_{plane}$, which should accordingly be set to a higher weight;

S42, as in S32, receiving the signals transmitted by the roadside cooperative positioning equipment with the UWB receiver and calculating the distance measurement factor between the UWB receiver and the roadside cooperative positioning equipment.
CN202211604925.8A — priority date 2022-12-14, filing date 2022-12-14 — Multi-sensor fusion positioning method for complex port environment (pending)

Priority Applications (1)

CN202211604925.8A — priority date 2022-12-14, filing date 2022-12-14 — Multi-sensor fusion positioning method for complex port environment

Publications (1)

CN115951369A — published 2023-04-11

Family

ID=87290028

Family Applications (1)

CN202211604925.8A — Multi-sensor fusion positioning method for complex port environment

Country Status (1)

CN — CN115951369A

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117268409A (en) * 2023-08-30 2023-12-22 苏州大成运和智能科技有限公司 Unmanned vehicle local positioning method based on iterative extended Kalman filtering



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination