CN110906923A - Vehicle-mounted multi-sensor tight coupling fusion positioning method and system, storage medium and vehicle

Info

Publication number
CN110906923A
Authority
CN
China
Prior art keywords
vehicle
data
imu
laser radar
positioning
Legal status
Granted
Application number
CN201911192614.3A
Other languages
Chinese (zh)
Other versions
CN110906923B (en)
Inventor
肖乔木
刘紫扬
熊周兵
陈卓
Current Assignee
Chongqing Changan Automobile Co Ltd
Original Assignee
Chongqing Changan Automobile Co Ltd
Application filed by Chongqing Changan Automobile Co Ltd
Priority to CN201911192614.3A
Publication of CN110906923A
Application granted
Publication of CN110906923B
Legal status: Active (current)

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C 21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C 21/12 Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C 21/16 Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C 21/165 Inertial navigation combined with non-inertial navigation instruments
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/87 Combinations of systems using electromagnetic waves other than radio waves
    • G01S 17/875 Combinations of systems using electromagnetic waves other than radio waves for determining attitude
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00 Road transport of goods or passengers
    • Y02T 10/10 Internal combustion engine [ICE] based vehicles
    • Y02T 10/40 Engine management systems

Abstract

The invention discloses a vehicle-mounted multi-sensor tight coupling fusion positioning method, system, storage medium and vehicle, wherein the method comprises the following steps: step 1, acquiring laser radar, GPS, IMU and chassis data, and performing time synchronization and space synchronization on the laser radar, IMU, GPS and chassis data; step 2, tightly coupling the laser radar data and the IMU data and performing matching calculation to obtain the vehicle position and attitude; step 3, performing kinematic calculation on the vehicle chassis data to obtain the vehicle position and attitude; step 4, acquiring GPS positioning data and carrying out UTM conversion on the GPS data to obtain the vehicle position and attitude; and step 5, establishing a Kalman state and observation model and performing a fusion operation to obtain the final positioning result of the vehicle. The invention enables the automatic driving automobile to stably obtain high-precision positioning data even at higher speeds or in severe environments.

Description

Vehicle-mounted multi-sensor tight coupling fusion positioning method and system, storage medium and vehicle
Technical Field
The invention belongs to the technical field of automobile positioning, and particularly relates to a vehicle-mounted multi-sensor tight coupling fusion positioning method, a system, a storage medium and a vehicle.
Background
The automatic driving automobile involves many technologies, among which positioning is a core technology that answers the question "where is the vehicle". It places extremely high requirements on reliability, stability and functional safety, and the positioning accuracy required by an automatic driving automobile must reach the centimeter level. The positioning technologies currently adopted by automatic driving automobiles can be roughly divided into three types according to their technical principle. The first type is signal-based positioning, namely GNSS (global navigation satellite system) positioning. The second type is positioning based on dead reckoning, typically using automobile chassis data, i.e. the current position and heading are deduced from the position and heading at the previous moment. The third type is positioning based on environmental feature matching, typically relative positioning based on a laser radar and a high-precision map.
the three modes have respective defects, and based on the positioning of the signals, the positioning result is extremely deviated in places where the signals can not be covered, such as tunnels, complex urban environments and the like; positioning based on dead reckoning can accumulate over time to generate integral errors, so that inaccurate positioning is caused; based on the matched positioning, depending on the environmental characteristics, when the environmental change is obvious, the positioning error is large.
Therefore, it is necessary to develop a vehicle-mounted multi-sensor close-coupled fusion positioning method and system.
Disclosure of Invention
The invention aims to provide a vehicle-mounted multi-sensor tight coupling fusion positioning method, system, storage medium and vehicle, which enable an automatic driving automobile to stably obtain high-precision positioning data even at higher speeds or in severe environments.
The invention relates to a vehicle-mounted multi-sensor tight coupling fusion positioning method, which comprises the following steps:
step 1, acquiring laser radar, GPS, IMU and chassis data, and performing time synchronization and space synchronization on the laser radar, the IMU, the GPS and the chassis data;
step 2, carrying out tight coupling processing on the laser radar data and the IMU data, and specifically comprising the following steps:
(2-1) judging whether the current iteration is the first iteration;
if yes, the IMU state is not updated;
if not, updating the IMU state according to the vehicle pose state error output by the last iteration;
(2-2) integrating the IMU data to obtain a pose state value relative to the IMU;
(2-3) when the laser radar data arrive, performing de-migration (i.e. de-skewing, removal of motion distortion) calculation on the laser radar data based on the pose data obtained by the IMU, so as to obtain a predicted laser radar pose;
(2-4) extracting and calculating characteristic points of the laser radar data;
(2-5) matching the extracted laser radar characteristic points to a pre-established local map;
(2-6) obtaining a pose measurement value relative to the laser radar according to the matching result;
(2-7) carrying out combined nonlinear optimization, and obtaining MAP state estimation in a local window according to the pose measurement value relative to the laser radar and the pose state value relative to the IMU to obtain a vehicle pose state error;
(2-8) feeding back the optimized vehicle pose state error to the IMU pose state calculation in the step (2-1), and carrying out coordinate transformation to obtain a position and pose value of the vehicle;
step 3, performing kinematic calculation on the vehicle chassis data to obtain a position attitude value of the vehicle;
step 4, acquiring GPS positioning data, and carrying out UTM conversion on the GPS data to obtain a position attitude value of the vehicle;
and 5, establishing a Kalman state and an observation model, taking a vehicle positioning result obtained by matching calculation after the laser data and the IMU are tightly coupled and a vehicle positioning result obtained by converting the GPS into UTM as a positioning observation value, taking a vehicle positioning result obtained by calculating vehicle chassis data as a positioning state measurement value, and carrying out fusion operation on the Kalman state and the observation model to obtain a final positioning result of the vehicle.
Further, in step 1, the spatial synchronization refers to calibration using the pre-measured vehicle mounting extrinsic parameters and the sensor intrinsic parameters;
the time synchronization refers to calibration based on GPS time service and the software time of the operating system.
Further, step 3 specifically includes acquiring a vehicle wheel pulse signal and a vehicle wheel rotation angle signal, and calculating according to a four-wheel differential motion model.
The invention further relates to a vehicle-mounted multi-sensor tight coupling fusion positioning system, which comprises four laser radars respectively arranged on the front, rear, left and right of a vehicle, a GPS receiving box, an IMU sensor, a controller and a memory, and is characterized in that the controller implements the above vehicle-mounted multi-sensor tight coupling fusion positioning method when the executable program stored in the memory is executed.
The storage medium stores an executable program, and when the executable program is executed by a processor, the vehicle-mounted multi-sensor tight coupling fusion positioning method is realized.
The vehicle adopts the vehicle-mounted multi-sensor tight coupling fusion positioning system.
The invention has the following advantages: by tightly coupling the laser radar data and the IMU data, an accurate state estimate can be output at the higher IMU update frequency, and the error can be kept within an acceptable range even when the laser radar data degrade for a long period of time. Meanwhile, to enhance the robustness and consistency of the positioning system, the back end also fuses GPS data and chassis dead-reckoning data, giving a more stable positioning result. The method has high computational efficiency, is reliable and feasible, and ensures that the automatic driving automobile can still stably obtain high-precision positioning data at higher speeds or in severe environments, making the automatic driving automobile safer and more reliable.
Drawings
FIG. 1 is a flow chart of the present invention;
fig. 2 is a schematic view of the installation of the sensor of the present invention.
Detailed Description
The invention will be further explained with reference to the drawings.
As shown in fig. 1, a vehicle-mounted multi-sensor tight coupling fusion positioning method includes the following steps:
step 1, acquiring Lidar (namely Lidar), GPS, IMU and chassis data, and performing time synchronization and space synchronization on the Lidar, the IMU, the GPS and the chassis data. The spatial synchronization refers to calibration through pre-measured vehicle installation external parameters and internal parameters of the sensor. The time synchronization refers to calibration based on GPS time service and soft time of an operating system.
Step 2, carrying out tight coupling processing on the laser radar data and the IMU data, and specifically comprising the following steps:
(2-1) judging whether the current iteration is the first iteration;
if yes, the IMU state is not updated;
if not, updating the IMU state according to the vehicle pose state error output by the last iteration;
and (2-2) integrating the IMU data to obtain a pose state value relative to the IMU.
And (2-3) when the laser radar data arrives, performing de-migration calculation on the laser radar data based on the pose data obtained by the IMU to obtain the predicted laser radar pose.
And (2-4) carrying out feature point extraction calculation on the laser radar data.
And (2-5) matching the extracted laser radar characteristic points into a pre-established local map.
(2-6) obtaining a pose measurement value relative to the laser radar according to the matching result;
(2-7) carrying out combined nonlinear optimization, and obtaining MAP state estimation in a local window according to the pose measurement value relative to the laser radar and the pose state value relative to the IMU to obtain a vehicle pose state error;
(2-8) feeding back the optimized vehicle pose state error to the IMU pose state calculation in the step (2-1), and carrying out coordinate transformation to obtain a position and pose value of the vehicle;
step 3, performing kinematic calculation on the vehicle chassis data to obtain a position attitude value of the vehicle; the method specifically comprises the following steps: and acquiring a vehicle wheel pulse signal and a vehicle wheel turning angle signal, and calculating according to a four-wheel differential motion model.
And 4, acquiring GPS positioning data, and carrying out UTM conversion on the GPS data, namely converting the longitude and latitude into local plane positioning coordinates.
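A minimal sketch of the latitude/longitude-to-UTM conversion used in this step, assuming the third-party pyproj library; the zone selection from the longitude, the WGS-84 input and the northern-hemisphere EPSG codes are assumptions made for illustration.

```python
from pyproj import Transformer

def gps_to_utm(lat, lon):
    """Convert a WGS-84 latitude/longitude fix to UTM easting/northing.
    The UTM zone is derived from the longitude (northern hemisphere assumed)."""
    zone = int((lon + 180.0) // 6) + 1
    transformer = Transformer.from_crs("EPSG:4326", f"EPSG:{32600 + zone}",
                                       always_xy=True)
    easting, northing = transformer.transform(lon, lat)
    return easting, northing, zone

if __name__ == "__main__":
    # Illustrative coordinates only (roughly the Chongqing area).
    print(gps_to_utm(29.56, 106.55))
```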
And 5, establishing a Kalman state and an observation model, taking a vehicle positioning result obtained by matching calculation after the laser data and the IMU are tightly coupled and a vehicle positioning result obtained by converting the GPS into UTM as a positioning observation value, taking a vehicle positioning result obtained by calculating vehicle chassis data as a positioning state measurement value, and carrying out fusion operation on the Kalman state and the observation model to obtain a final positioning result of the vehicle.
In the present embodiment:
(a) the state of the IMU is represented as:
x = [ p, v, q, b_a, b_g ]
T = [ R  p ; 0  1 ]
wherein x is the state vector of the IMU, p is the position vector, v is the velocity vector, q is the attitude vector, b_a is the accelerometer zero-bias error vector, b_g is the gyroscope zero-bias error vector, and T is the pose homogeneous transformation matrix (with R the rotation matrix corresponding to q);
(b) updating the state of the IMU:
p_j = p_i + Σ_{k=i}^{j-1} [ v_k·Δt + ½·( R_k·(a_k - n_a,k) - g_W )·Δt² ]
v_j = v_i + Σ_{k=i}^{j-1} ( R_k·(a_k - n_a,k) - g_W )·Δt
and the attitude is propagated in the same discrete fashion by integrating the angular-velocity measurements between times i and j;
wherein p_j is the position value at time j, p_i is the position value at time i, v_k is the velocity value at time k, Δt is the time difference between two adjacent times, g_W is the gravitational acceleration, R_k is the attitude rotation matrix at time k, a_k is the acceleration measurement at time k, and n_a,k is the acceleration noise value at time k.
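A small Python sketch of this discrete propagation, written under the stated model (world-frame acceleration obtained by rotating the measurement and removing gravity); the bias and noise terms, variable names and sample values are simplifications and assumptions, not the patent's code.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def propagate_imu(p, v, R, accel, gyro, g_w, dt):
    """One discrete propagation step: position and velocity are integrated from
    the accelerometer measurement rotated into the world frame with gravity
    removed (as in the equations above), and the attitude is integrated from
    the gyroscope.  Bias and noise terms are omitted for brevity."""
    a_world = R.apply(accel) - g_w
    p_next = p + v * dt + 0.5 * a_world * dt ** 2
    v_next = v + a_world * dt
    R_next = R * Rotation.from_rotvec(gyro * dt)   # body-frame angular increment
    return p_next, v_next, R_next

if __name__ == "__main__":
    p, v = np.zeros(3), np.zeros(3)
    R = Rotation.identity()
    g_w = np.array([0.0, 0.0, 9.81])               # gravity in the world frame
    accel = np.array([0.5, 0.0, 9.81])             # specific-force measurement
    gyro = np.array([0.0, 0.0, 0.01])              # rad/s, slow yaw
    for _ in range(100):                           # 1 s of data at 100 Hz
        p, v, R = propagate_imu(p, v, R, accel, gyro, g_w, 0.01)
    print(np.round(p, 3), np.round(v, 3), np.round(R.as_euler("zyx"), 4))
```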
(c) carrying out de-migration calculation on the laser radar data:
In this embodiment, the IMU measurements are used to remove the intra-frame offset (motion distortion) of the laser radar. The laser radar is assumed to move linearly while a frame of data is being collected, and linear interpolation is then performed between the vehicle pose at the start of the acquisition of frame S1 and the pose at the end of the acquisition of frame S1.
(d) Extracting characteristic points of laser radar data:
the points of one scanning are classified by curvature values, the points with the curvature larger than the threshold value of the characteristic point are edge points, and the points with the curvature smaller than the threshold value of the characteristic point are plane points. In order to distribute the feature points evenly in the environment, one scan is divided into 4 independent sub-regions, each providing at most 2 edge points and 4 planes. In addition, unstable characteristic points (blemishes) were excluded.
(e) In order to fuse the IMU pre-integration values, the position and attitude of the laser radar are constrained using the relative measurements of the laser radar, and a local map is established before searching for point correspondences, because the points of a single frame of data are not dense enough to compute accurate correspondences. The local map contains the feature points of N discrete time stamps, and its coordinate system is consistent with that of the laser radar.
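A sketch of the local-map bookkeeping and correspondence search; the KD-tree query, the window length and the data layout are assumptions chosen for illustration, not details given by the patent.

```python
import numpy as np
from collections import deque
from scipy.spatial import cKDTree

class LocalFeatureMap:
    """Keep the feature points of the last N time stamps, expressed in a common
    (lidar-consistent) coordinate frame, and provide nearest-neighbor
    correspondence search for the feature points of a new frame."""

    def __init__(self, window=10):
        self.frames = deque(maxlen=window)      # oldest frames are discarded

    def add_frame(self, points_world):
        self.frames.append(np.asarray(points_world))

    def correspondences(self, query_points, k=5, max_dist=1.0):
        map_points = np.vstack(self.frames)
        tree = cKDTree(map_points)
        dist, idx = tree.query(query_points, k=k)
        valid = dist[:, 0] < max_dist           # drop far-away, unreliable matches
        return idx[valid], dist[valid], map_points

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    lm = LocalFeatureMap(window=3)
    for _ in range(3):
        lm.add_frame(rng.normal(size=(100, 3)))
    q = rng.normal(size=(10, 3))
    idx, dist, _ = lm.correspondences(q)
    print(idx.shape, dist.shape)
```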
(f) After the local map is established, the matching between the laser radar key frame and the local map is computed, the optimized key-frame pose is output, and the output positioning data are then updated according to the incremental laser radar pose:
p_j = p_i + Δp
q_j = q_i + σθ_z
wherein p_j is the position value at time j, p_i is the position value at time i, Δp is the position increment between the two times, q_j is the attitude angle at time j, σθ_z is the attitude angle increment, and q_i is the attitude angle at time i;
(g) The joint nonlinear optimization is used to obtain the optimized state. A fixed-lag smoother with marginalization is employed: the smoother keeps N states inside a window for the smoothing computation, and the computed state value is fed back to the IMU state estimation of the first step to prevent the IMU state from diverging.
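The fixed-lag idea can be illustrated with a deliberately simplified 1-D toy: a window of N positions is optimized against relative (IMU-like) constraints and absolute (lidar-like) constraints, and a prior on the oldest state stands in for the marginalized history. This is a conceptual sketch only, not the patent's optimizer; all weights and values are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

def window_residuals(x, imu_deltas, lidar_meas, prior_x0,
                     w_imu=10.0, w_lidar=1.0, w_prior=5.0):
    """Residuals of a fixed-lag window of 1-D positions x[0..N-1]: relative
    (IMU-like) constraints between consecutive states, absolute (lidar-like)
    constraints on each state, and a prior on the first state standing in for
    the marginalized history."""
    r_imu = w_imu * (np.diff(x) - imu_deltas)
    r_lidar = w_lidar * (x - lidar_meas)
    r_prior = np.array([w_prior * (x[0] - prior_x0)])
    return np.concatenate([r_imu, r_lidar, r_prior])

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    true_x = np.cumsum(np.full(8, 1.0))                       # ground truth
    imu_deltas = np.diff(true_x) + rng.normal(0.0, 0.02, 7)   # accurate, relative
    lidar_meas = true_x + rng.normal(0.0, 0.10, 8)            # noisier, absolute
    sol = least_squares(window_residuals, lidar_meas.copy(),
                        args=(imu_deltas, lidar_meas, true_x[0]))
    print("estimation error:", np.round(sol.x - true_x, 3))
    # Sliding the window would drop sol.x[0], keep sol.x[1:] as the new window,
    # and fold the dropped state into an updated prior (marginalization).
```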
In this embodiment, the four-wheel differential motion model is specifically:
Ang_t = Ang_{t-1} + a_t
Odom_x_t = Odom_x_{t-1} + d_t·cos(Ang_t)
Odom_y_t = Odom_y_{t-1} + d_t·sin(Ang_t)
wherein Ang_t is the vehicle heading angle, a_t is the heading angle increment, Odom_x_t is the longitudinal X-axis position of the vehicle, Odom_y_t is the lateral Y-axis position of the vehicle, and d_t is the right-wheel travel distance minus the left-wheel travel distance;
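A small sketch of the chassis dead reckoning. The patent derives d_t and a_t from the wheel pulse and wheel rotation angle signals; purely as an assumption for illustration, the sketch below takes the travel increment as the mean of the two wheel distances and the heading increment as their difference divided by the track width.

```python
import math

def dead_reckon(x, y, heading, dist_left, dist_right, track_width=1.6):
    """One dead-reckoning step of the differential model:
    heading += a_t, x += d_t*cos(heading), y += d_t*sin(heading)."""
    a_t = (dist_right - dist_left) / track_width   # assumed heading increment
    d_t = 0.5 * (dist_right + dist_left)           # assumed travel increment
    heading += a_t
    x += d_t * math.cos(heading)
    y += d_t * math.sin(heading)
    return x, y, heading

if __name__ == "__main__":
    x = y = heading = 0.0
    for _ in range(100):                            # gentle left turn
        x, y, heading = dead_reckon(x, y, heading, 0.099, 0.101)
    print(round(x, 2), round(y, 2), round(math.degrees(heading), 1))
```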
in this embodiment, the kalman state and the observation model are specifically:
Figure BDA0002293950690000061
Pt=FPt-1FT+Q;
Kt=PtHT(HPtHT+R)-1
Figure BDA0002293950690000062
Ptt=(I-KtH)Pt
wherein the content of the first and second substances,
Figure BDA0002293950690000063
for the predicted state vector at time t,
Figure BDA0002293950690000064
is the predicted state vector at time t-1, F is the state transition matrix, FTIs a transposed matrix of F, R is an attitude rotation matrix, B is a control matrix, and represents a control quantity ut-1The influence on the current state, Q is the error matrix, H is the state observation matrix, HTTransposed matrix of H, KtIs a Kalman gain coefficient, ztIs the observation vector at time t, PtFor co-operation at time tVariance matrix, Pt-1Is the covariance matrix at time t-1,
Figure BDA0002293950690000065
for fusing the corrected state vector, P, at time tttAnd (4) fusing and correcting the covariance matrix at the time t, wherein I is an identity matrix.
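The following sketch implements the listed predict/gain/update equations, with the chassis dead-reckoning increment acting as the control input u and the lidar/IMU pose and GPS/UTM pose stacked into the observation z, as described in step 5; the 2-D position state and all noise values are illustrative assumptions, not the patent's parameters.

```python
import numpy as np

def kalman_fuse(x, P, u, z, F, B, H, Q, R):
    """One cycle of the Kalman equations from the description: prediction with
    the chassis increment u, then correction with the stacked lidar/IMU and
    GPS position observations z."""
    x_pred = F @ x + B @ u                       # x̂_t|t-1 = F·x̂_{t-1} + B·u_{t-1}
    P_pred = F @ P @ F.T + Q                     # P_t = F·P_{t-1}·F^T + Q
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)          # K_t
    x_new = x_pred + K @ (z - H @ x_pred)        # x̂_t
    P_new = (np.eye(len(x)) - K @ H) @ P_pred    # P_tt
    return x_new, P_new

if __name__ == "__main__":
    x = np.zeros(2)                              # planar position state [X, Y]
    P = np.eye(2)
    F = np.eye(2)
    B = np.eye(2)
    H = np.vstack([np.eye(2), np.eye(2)])        # two stacked position observations
    Q = 0.05 * np.eye(2)
    R = np.diag([0.02, 0.02, 1.0, 1.0])          # lidar/IMU tighter than GPS
    u = np.array([1.0, 0.1])                     # chassis dead-reckoning increment
    z = np.array([1.02, 0.11, 0.9, 0.3])         # [lidar/IMU pose, GPS/UTM pose]
    x, P = kalman_fuse(x, P, u, z, F, B, H, Q, R)
    print(np.round(x, 3))
```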
As shown in fig. 2, in the present embodiment, a vehicle-mounted multi-sensor tight coupling fusion positioning system includes four laser radars respectively installed on the front, rear, left and right of a vehicle, a GPS receiving box and an IMU sensor installed on the vehicle, a controller for receiving the laser radar data, GPS data, IMU data and chassis data, and a memory for storing an executable program, and is characterized in that the controller implements the vehicle-mounted multi-sensor tight coupling fusion positioning method of this embodiment when executing the executable program stored in the memory.
In this embodiment, a storage medium stores an executable program, and when the executable program is executed by a processor, the method for positioning in a vehicle-mounted multi-sensor tight coupling fusion manner as described in this embodiment is implemented.
In this embodiment, a vehicle adopts the vehicle-mounted multi-sensor tight coupling fusion positioning system described in this embodiment.

Claims (6)

1. A vehicle-mounted multi-sensor tight coupling fusion positioning method comprises the following steps:
step 1, acquiring laser radar, GPS, IMU and chassis data, and performing time synchronization and space synchronization on the laser radar, the IMU, the GPS and the chassis data;
step 2, carrying out tight coupling processing on the laser radar data and the IMU data, and specifically comprising the following steps:
(2-1) judging whether the current iteration times are the first time;
if yes, the IMU state is not updated;
if not, updating the IMU state according to the vehicle pose state error output by the last iteration;
(2-2) integrating the IMU data to obtain a pose state value relative to the IMU;
(2-3) when the laser radar data arrive, performing de-migration calculation on the laser radar data based on the pose data obtained by the IMU to obtain a predicted laser radar pose;
(2-4) extracting and calculating characteristic points of the laser radar data;
(2-5) matching the extracted laser radar characteristic points to a pre-established local map;
(2-6) obtaining a pose measurement value relative to the laser radar according to the matching result;
(2-7) carrying out combined nonlinear optimization, and obtaining MAP state estimation in a local window according to the pose measurement value relative to the laser radar and the pose state value relative to the IMU to obtain a vehicle pose state error;
(2-8) feeding back the optimized vehicle pose state error to the IMU pose state calculation in the step (2-1), and carrying out coordinate transformation to obtain a position and pose value of the vehicle;
step 3, performing kinematic calculation on the vehicle chassis data to obtain a position attitude value of the vehicle;
step 4, acquiring GPS positioning data, and carrying out UTM conversion on the GPS data to obtain a position attitude value of the vehicle;
and 5, establishing a Kalman state and an observation model, taking a vehicle positioning result obtained by matching calculation after the laser data and the IMU are tightly coupled and a vehicle positioning result obtained by converting the GPS into UTM as a positioning observation value, taking a vehicle positioning result obtained by calculating vehicle chassis data as a positioning state measurement value, and carrying out fusion operation on the Kalman state and the observation model to obtain a final positioning result of the vehicle.
2. The vehicle-mounted multi-sensor close-coupled fusion positioning method according to claim 1, characterized in that: in the step 1, spatial synchronization refers to calibration through pre-measured vehicle installation external parameters and internal parameters of a sensor;
the time synchronization refers to calibration based on GPS time service and soft time of an operating system.
3. The vehicle-mounted multi-sensor close-coupled fusion positioning method according to claim 1 or 2, characterized in that: and step 3 specifically, acquiring a vehicle wheel pulse signal and a vehicle wheel corner signal, and calculating according to a four-wheel differential motion model.
4. A vehicle-mounted multi-sensor tight coupling fusion positioning system comprises four laser radars, a GPS receiving box, an IMU sensor, a controller and a memory, wherein the four laser radars are respectively arranged on the front, the back, the left and the right of a vehicle; the method is characterized in that: the controller is used for realizing the vehicle-mounted multi-sensor close-coupling fusion positioning method according to any one of claims 1 to 3 by executing the executable program stored in the memory.
5. A storage medium storing an executable program which, when executed by a processor, implements the in-vehicle multi-sensor close-coupled fusion positioning method according to any one of claims 1 to 4.
6. A vehicle, characterized in that: the vehicle-mounted multi-sensor tightly-coupled fusion positioning system of claim 5 is adopted.
CN201911192614.3A 2019-11-28 2019-11-28 Vehicle-mounted multi-sensor tight coupling fusion positioning method and system, storage medium and vehicle Active CN110906923B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911192614.3A CN110906923B (en) 2019-11-28 2019-11-28 Vehicle-mounted multi-sensor tight coupling fusion positioning method and system, storage medium and vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911192614.3A CN110906923B (en) 2019-11-28 2019-11-28 Vehicle-mounted multi-sensor tight coupling fusion positioning method and system, storage medium and vehicle

Publications (2)

Publication Number Publication Date
CN110906923A true CN110906923A (en) 2020-03-24
CN110906923B CN110906923B (en) 2023-03-14

Family

ID=69820221

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911192614.3A Active CN110906923B (en) 2019-11-28 2019-11-28 Vehicle-mounted multi-sensor tight coupling fusion positioning method and system, storage medium and vehicle

Country Status (1)

Country Link
CN (1) CN110906923B (en)


Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120290146A1 (en) * 2010-07-15 2012-11-15 Dedes George C GPS/IMU/Video/Radar absolute/relative positioning communication/computation sensor platform for automotive safety applications
CN102508277A (en) * 2011-10-27 2012-06-20 中国矿业大学 Precise point positioning and inertia measurement tightly-coupled navigation system and data processing method thereof
CN103487822A (en) * 2013-09-27 2014-01-01 南京理工大学 BD/DNS/IMU autonomous integrated navigation system and method thereof
US20160299234A1 (en) * 2015-04-07 2016-10-13 GM Global Technology Operations LLC Fail operational vehicle speed estimation through data fusion of 6-dof imu, gps, and radar
US20160358477A1 (en) * 2015-06-05 2016-12-08 Arafat M.A. ANSARI Smart vehicle
CN106767787A (en) * 2016-12-29 2017-05-31 北京时代民芯科技有限公司 A kind of close coupling GNSS/INS combined navigation devices
CN107478214A (en) * 2017-07-24 2017-12-15 杨华军 A kind of indoor orientation method and system based on Multi-sensor Fusion
CN107246868A (en) * 2017-07-26 2017-10-13 上海舵敏智能科技有限公司 A kind of collaborative navigation alignment system and navigation locating method
CN108621161A (en) * 2018-05-08 2018-10-09 中国人民解放军国防科技大学 Method for estimating body state of foot type robot based on multi-sensor information fusion
CN109544638A (en) * 2018-10-29 2019-03-29 浙江工业大学 A kind of asynchronous online calibration method for Multi-sensor Fusion
CN109725339A (en) * 2018-12-20 2019-05-07 东莞市普灵思智能电子有限公司 A kind of tightly coupled automatic Pilot cognitive method and system
CN110261870A (en) * 2019-04-15 2019-09-20 浙江工业大学 It is a kind of to synchronize positioning for vision-inertia-laser fusion and build drawing method
CN110232736A (en) * 2019-06-18 2019-09-13 中国矿业大学 A kind of down-hole combined mining working three-dimensional scenic fast construction method
CN110428467A (en) * 2019-07-30 2019-11-08 四川大学 A kind of camera, imu and the united robot localization method of laser radar

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
SOLOVIEV, A: "Tight Coupling of Laser Scanner and Inertial Measurements for a Fully Autonomous Relative Navigation Solution", Proceedings of the 2007 National Technical Meeting of The Institute of Navigation *
周阳 (ZHOU, Yang): "Research on SLAM Algorithms for Mobile Robots Based on Multi-Sensor Fusion" (基于多传感器融合的移动机器人SLAM算法研究), China Master's Theses Full-text Database, Information Science and Technology *

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113819905A (en) * 2020-06-19 2021-12-21 北京图森未来科技有限公司 Multi-sensor fusion-based odometer method and device
CN111833631A (en) * 2020-06-24 2020-10-27 武汉理工大学 Target data processing method, system and storage medium based on vehicle-road cooperation
CN112102418A (en) * 2020-09-16 2020-12-18 上海商汤临港智能科技有限公司 Calibration method, calibration device, electronic equipment and storage medium
CN112102418B (en) * 2020-09-16 2022-02-11 上海商汤临港智能科技有限公司 Calibration method, calibration device, electronic equipment and storage medium
CN112346084A (en) * 2020-10-26 2021-02-09 上海感探号信息科技有限公司 Automobile positioning method, system, electronic equipment and storage medium
CN112505718A (en) * 2020-11-10 2021-03-16 奥特酷智能科技(南京)有限公司 Positioning method, system and computer readable medium for autonomous vehicle
CN112505718B (en) * 2020-11-10 2022-03-01 奥特酷智能科技(南京)有限公司 Positioning method, system and computer readable medium for autonomous vehicle
CN112781893A (en) * 2021-01-04 2021-05-11 重庆长安汽车股份有限公司 Spatial synchronization method and device for vehicle-mounted sensor performance test data and storage medium
CN112781893B (en) * 2021-01-04 2022-09-06 重庆长安汽车股份有限公司 Spatial synchronization method and device for vehicle-mounted sensor performance test data and storage medium
CN112666535A (en) * 2021-01-12 2021-04-16 重庆长安汽车股份有限公司 Environment sensing method and system based on multi-radar data fusion
CN112800159A (en) * 2021-01-25 2021-05-14 北京百度网讯科技有限公司 Map data processing method and device
CN112800159B (en) * 2021-01-25 2023-10-31 北京百度网讯科技有限公司 Map data processing method and device
US11866064B2 (en) 2021-01-25 2024-01-09 Beijing Baidu Netcom Science Technology Co., Ltd. Method and apparatus for processing map data
CN113449582A (en) * 2021-03-04 2021-09-28 同致电子科技(厦门)有限公司 Vehicle bottom blind area filling method, device, system, storage medium and computer program product
CN113514863A (en) * 2021-03-23 2021-10-19 重庆兰德适普信息科技有限公司 Multi-sensor fusion positioning method
CN113566833A (en) * 2021-07-28 2021-10-29 上海工程技术大学 Multi-sensor fusion vehicle positioning method and system
CN113654555A (en) * 2021-09-14 2021-11-16 上海智驾汽车科技有限公司 Automatic driving vehicle high-precision positioning method based on multi-sensor data fusion
CN113984044A (en) * 2021-10-08 2022-01-28 杭州鸿泉物联网技术股份有限公司 Vehicle pose acquisition method and device based on vehicle-mounted multi-perception fusion
CN114013449A (en) * 2021-11-02 2022-02-08 阿波罗智能技术(北京)有限公司 Data processing method and device for automatic driving vehicle and automatic driving vehicle
CN114013449B (en) * 2021-11-02 2023-11-03 阿波罗智能技术(北京)有限公司 Data processing method and device for automatic driving vehicle and automatic driving vehicle
CN113970332B (en) * 2021-11-30 2024-04-19 上海于万科技有限公司 Unmanned sweeping vehicle-based navigation output result smoothing method and system
CN114264301A (en) * 2021-12-13 2022-04-01 青岛慧拓智能机器有限公司 Vehicle-mounted multi-sensor fusion positioning method and device, chip and terminal
CN113970330A (en) * 2021-12-22 2022-01-25 比亚迪股份有限公司 Vehicle-mounted multi-sensor fusion positioning method, computer equipment and storage medium
CN113970330B (en) * 2021-12-22 2022-04-19 比亚迪股份有限公司 Vehicle-mounted multi-sensor fusion positioning method, computer equipment and storage medium
CN115523929A (en) * 2022-09-20 2022-12-27 北京四维远见信息技术有限公司 Vehicle-mounted integrated navigation method, device, equipment and medium based on SLAM
CN115540875A (en) * 2022-11-24 2022-12-30 成都运达科技股份有限公司 Method and system for high-precision detection and positioning of train vehicles in station track

Also Published As

Publication number Publication date
CN110906923B (en) 2023-03-14

Similar Documents

Publication Publication Date Title
CN110906923B (en) Vehicle-mounted multi-sensor tight coupling fusion positioning method and system, storage medium and vehicle
CN106840179B (en) Intelligent vehicle positioning method based on multi-sensor information fusion
US11002859B1 (en) Intelligent vehicle positioning method based on feature point calibration
CN108873038B (en) Autonomous parking positioning method and positioning system
CN108731670B (en) Inertial/visual odometer integrated navigation positioning method based on measurement model optimization
CN108226980B (en) Differential GNSS and INS self-adaptive tightly-coupled navigation method based on inertial measurement unit
WO2022007437A1 (en) Method for calibrating mounting deviation angle between sensors, combined positioning system, and vehicle
CN101846734B (en) Agricultural machinery navigation and position method and system and agricultural machinery industrial personal computer
Meduna et al. Closed-loop terrain relative navigation for AUVs with non-inertial grade navigation sensors
KR20210111180A (en) Method, apparatus, computing device and computer-readable storage medium for positioning
CN107132563B (en) Combined navigation method combining odometer and dual-antenna differential GNSS
CN108362288B (en) Polarized light SLAM method based on unscented Kalman filtering
CN112147651B (en) Asynchronous multi-vehicle cooperative target state robust estimation method
CN111595348A (en) Master-slave mode cooperative positioning method of autonomous underwater vehicle combined navigation system
CN114076610A (en) Error calibration and navigation method and device of GNSS/MEMS vehicle-mounted integrated navigation system
CN110133695A (en) A kind of double antenna GNSS location delay time dynamic estimation system and method
CN116047565A (en) Multi-sensor data fusion positioning system
CN115046540A (en) Point cloud map construction method, system, equipment and storage medium
CN113984044A (en) Vehicle pose acquisition method and device based on vehicle-mounted multi-perception fusion
CN114136310B (en) Autonomous error suppression system and method for inertial navigation system
CN113375664B (en) Autonomous mobile device positioning method based on dynamic loading of point cloud map
CN109059913A (en) A kind of zero-lag integrated navigation initial method for onboard navigation system
CN112904396A (en) High-precision positioning method and system based on multi-sensor fusion
CN116558512A (en) GNSS and vehicle-mounted sensor fusion positioning method and system based on factor graph
CN114915913A (en) UWB-IMU combined indoor positioning method based on sliding window factor graph

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant