CN112033422A - A multi-sensor fusion SLAM method - Google Patents

A multi-sensor fusion SLAM method

Info

Publication number
CN112033422A
CN112033422A
Authority
CN
China
Prior art keywords
camera
gps
imu
exp
threshold
Prior art date
Legal status
Granted
Application number
CN202010884318.6A
Other languages
English (en)
Other versions
CN112033422B (zh)
Inventor
马鑫军
Current Assignee
Dilu Technology Co Ltd
Original Assignee
Dilu Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Dilu Technology Co Ltd filed Critical Dilu Technology Co Ltd
Priority to CN202010884318.6A priority Critical patent/CN112033422B/zh
Publication of CN112033422A publication Critical patent/CN112033422A/zh
Application granted granted Critical
Publication of CN112033422B publication Critical patent/CN112033422B/zh
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 Map- or contour-matching
    • G01C21/32 Structuring or formatting of map data
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/3407 Route searching; Route guidance specially adapted for specific applications
    • G01C21/3415 Dynamic re-routing, e.g. recalculating the route when the user deviates from calculated route or after detecting real-time traffic data or accidents
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/45 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S19/47 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)
  • Studio Devices (AREA)
  • Navigation (AREA)

Abstract

The invention discloses a multi-sensor fusion SLAM method. Based on data collected in real time by a vehicle-mounted GPS, IMU and stereo camera, together with their real-time localization results, the fused localization result is computed as u_j_result = u_j_gps*w_j_gps + u_j_camera*w_j_camera + u_j_imu*w_j_imu, where u_j_result is the fused localization result in the j-th dimension; u_j_gps, u_j_camera and u_j_imu are the real-time localization results of the GPS, stereo camera and IMU in the j-th dimension, respectively; w_j_gps, w_j_camera and w_j_imu are the weights of u_j_gps, u_j_camera and u_j_imu, respectively; and j = 1, 2, ..., 6. The invention supports multiple scenarios: on a highway with few occlusions the GPS signal is good and plays the main role, while in an urban environment with many buildings the GPS signal is poor and the stereo camera and IMU play the main role.

Description

A multi-sensor fusion SLAM method
Technical Field
The present invention relates to a multi-sensor fusion SLAM method.
Background
In recent years, as applications of artificial-intelligence technology have broadened and deepened, autonomous driving has gradually entered the public eye, and to consumers driving has come to seem an easy task. An Advanced Driver Assistance System (ADAS) is an active safety technology that uses a wide variety of sensors mounted on the vehicle (detecting variables such as light, heat and pressure) to collect environmental data inside and outside the vehicle at the earliest possible moment, and to identify, detect and track static and dynamic objects, so that the driver can perceive a possible danger as early as possible, raising awareness and improving safety.
An ADAS mainly consists of a GPS and CCD camera detection module, a communication module and a control module. The GPS and CCD camera detection module receives GPS satellite signals through a GPS receiver to obtain the vehicle's longitude and latitude, speed, time and other information, and uses CCD cameras mounted at the front and rear of the vehicle to observe the conditions on both sides of the road in real time; the communication module can send the detected information and exchange driving information in real time between vehicles approaching each other; the control module can take active control when an accident is about to occur and thus avoid it.
In a vehicle ADAS, localization is an extremely important link, and in the absence of a map, SLAM technology provides a good solution. In the development of autonomous-driving technology, the first question to settle is whether LiDAR or cameras should serve as the primary sensor; the two choices represent two completely different systems, laser SLAM (Simultaneous Localization And Mapping) and visual SLAM. However, both pure-vision and pure-laser solutions have shortcomings: a pure-vision solution requires the scene to contain rich texture, while a pure-laser solution lacks a loop-closure detection method.
Summary of the Invention
The technical problem to be solved by the present invention is to provide a multi-sensor fusion SLAM method in which the visual SLAM part performs loop-closure detection with a semantic segmentation method based on a deep neural network, effectively improving the robustness and accuracy of loop-closure detection.
To solve the above technical problem, the present invention adopts the following technical solution:
A multi-sensor fusion SLAM method: based on data collected in real time by a vehicle-mounted GPS, IMU and stereo camera, together with their real-time localization results, the fused localization result is computed as u_j_result = u_j_gps*w_j_gps + u_j_camera*w_j_camera + u_j_imu*w_j_imu, where u_j_result is the fused localization result in the j-th dimension; u_j_gps, u_j_camera and u_j_imu are the real-time localization results of the GPS, stereo camera and IMU in the j-th dimension, respectively; w_j_gps, w_j_camera and w_j_imu are the weights of u_j_gps, u_j_camera and u_j_imu, respectively; and j = 1, 2, ..., 6;
the values of w_j_gps, w_j_camera and w_j_imu are:
w_j_gps = exp(w_j_gps') / (exp(w_j_gps') + exp(w_j_camera') + exp(w_j_imu')),
w_j_camera = exp(w_j_camera') / (exp(w_j_gps') + exp(w_j_camera') + exp(w_j_imu')),
w_j_imu = exp(w_j_imu') / (exp(w_j_gps') + exp(w_j_camera') + exp(w_j_imu'));
where w_j_gps' = w0_j_gps*(σ_j_gps/σ0_j_gps), w_j_camera' = w0_j_camera*(σ_camera/σ0_j_camera), w_j_imu' = w0_j_imu*(σ_j_imu/σ0_j_imu); w0_j_gps, w0_j_camera and w0_j_imu are preset initial weights of u_j_gps, u_j_camera and u_j_imu, respectively; σ0_j_gps, σ0_j_camera and σ0_j_imu are preset initial variances of the j-th dimension for the GPS, stereo camera and IMU, respectively; σ_j_gps and σ_j_imu are the real-time variances of the j-th dimension for the GPS and IMU, respectively; and σ_camera is the real-time variance of the stereo camera.
Further, σ_camera is computed as follows:
Step 1: extract and match feature points between the stereo camera's left and right images, and triangulate the matched feature points to obtain their 3D coordinates;
Step 2: based on the result of Step 1, estimate the stereo camera's pose with an ICP pose-estimation algorithm;
Step 3: based on the result of Step 2, compute the reprojection error;
Step 4: σ_camera = a/a_threshold, where a is the reprojection error computed in Step 3 and a_threshold is a preset threshold.
Further, σ_j_imu is computed as follows:
The IMU uses a pre-integration scheme, whose kinematic model is:
ω_measure = ω_truth + b_g + n_g,
a_measure = a_truth + b_a + n_a;
where ω_measure denotes the gyroscope measurement, ω_truth the gyroscope true value, b_g the gyroscope bias, and n_g the gyroscope's Gaussian white noise; a_measure denotes the accelerometer measurement, a_truth the accelerometer true value, b_a the accelerometer bias, and n_a the accelerometer's Gaussian white noise;
from the above kinematic model, the variance propagation formula is derived as:
Ξ_i = F * Ξ_{i-1} * F^T + G * Ξ_n * G^T
where Ξ_i is the variance at the current time step and Ξ_{i-1} the variance at the previous time step, with the initial variance set to the identity matrix; F is the first-order Jacobian of the above kinematic model with respect to the IMU localization state and F^T its transpose; G is the first-order Jacobian of the model with respect to the noise terms, Ξ_n the initial noise estimate, and G^T the transpose of G;
when real-time IMU data are received, the current variance Ξ_i is obtained from the above formula, and σ_j_imu is the j-th element of Ξ_i.
Further, if σ_j_gps > σ_gps_threshold and σ_camera <= σ_camera_threshold, the GPS localization result is deemed unreliable and the fused localization result is updated to u_j_result = u_j_camera*w_j_camera + u_j_imu*w_j_imu, where w_j_camera = exp(w_j_camera')/(exp(w_j_camera') + exp(w_j_imu')), w_j_imu = exp(w_j_imu')/(exp(w_j_camera') + exp(w_j_imu')), σ_gps_threshold is a preset GPS variance threshold, and σ_camera_threshold is a preset stereo-camera variance threshold.
Further, if σ_j_gps > σ_gps_threshold and σ_camera > σ_camera_threshold, neither the GPS nor the stereo-camera localization result is reliable, and the fused localization result is simply the IMU's real-time localization result; here σ_gps_threshold is the preset GPS variance threshold and σ_camera_threshold the preset stereo-camera variance threshold.
Compared with the prior art, the above technical solution of the present invention has the following technical effects:
(1) low cost: no expensive laser sensor is used;
(2) support for multiple scenarios: on a highway with few occlusions the GPS signal is good and plays the main role, while in an urban environment with many buildings the GPS signal is poor and the stereo camera and IMU play the main role;
(3) any single sensor has limitations; by fusing multiple kinds of data, the present method yields reliable localization results and overcomes the limitations of any individual sensor.
Detailed Description
The technical solution of the present invention is described in further detail below:
In the multi-sensor fusion SLAM method of the present invention, fused localization is performed from the real-time data of a vehicle-mounted GPS, IMU and stereo camera. First, the GPS, IMU and stereo camera are mounted on the vehicle body and the transformations among the three are calibrated. Calibration has two parts: calibration of each device itself and calibration between devices. Each device itself is calibrated with standard methods; the calibration between devices is obtained from each sensor's transformation to the vehicle coordinate frame, from which the transformations between the sensors follow, as sketched below.
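The inter-device calibration step amounts to composing homogeneous transforms. A minimal sketch in Python/NumPy under our own notation (the patent does not name these quantities): if T_vehicle_sensor maps sensor coordinates into the vehicle frame, the sensor-to-sensor extrinsic is obtained by chaining through the vehicle frame.

```python
import numpy as np

def compose_extrinsic(T_vehicle_a, T_vehicle_b):
    """Transform taking sensor-a coordinates into sensor-b coordinates,
    given each sensor's 4x4 pose in the vehicle frame:
    T_b_a = inv(T_vehicle_b) @ T_vehicle_a."""
    return np.linalg.inv(T_vehicle_b) @ T_vehicle_a

# Example: camera-to-IMU extrinsic from the two vehicle-frame calibrations.
T_vehicle_camera = np.eye(4)  # placeholder calibration results
T_vehicle_imu = np.eye(4)
T_imu_camera = compose_extrinsic(T_vehicle_camera, T_vehicle_imu)
```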
In an actual scene, if a GPS signal can be received, the weight of the satellite-inertial navigation result in the final output is increased; if no GPS signal can be received and the scene contains rich visual texture, the weight of the VIO result (Visual-Inertial Odometry, formed by the stereo camera and the IMU) in the final output is increased; if no GPS signal can be received and the scene lacks rich visual texture, the weight of the IMU result in the final output is increased.
In the multi-sensor fusion SLAM method of the present invention, based on data collected in real time by the vehicle-mounted GPS, IMU and stereo camera, together with their real-time localization results, the expression of the fused localization result is u_j_result = u_j_gps*w_j_gps + u_j_camera*w_j_camera + u_j_imu*w_j_imu, where u_j_result is the fused localization result in the j-th dimension; u_j_gps, u_j_camera and u_j_imu are the real-time localization results of the GPS, stereo camera and IMU in the j-th dimension, respectively; w_j_gps, w_j_camera and w_j_imu are the weights of u_j_gps, u_j_camera and u_j_imu, respectively; and j = 1, 2, ..., 6;
the values of w_j_gps, w_j_camera and w_j_imu are:
w_j_gps = exp(w_j_gps') / (exp(w_j_gps') + exp(w_j_camera') + exp(w_j_imu')),
w_j_camera = exp(w_j_camera') / (exp(w_j_gps') + exp(w_j_camera') + exp(w_j_imu')),
w_j_imu = exp(w_j_imu') / (exp(w_j_gps') + exp(w_j_camera') + exp(w_j_imu'));
where w_j_gps' = w0_j_gps*(σ_j_gps/σ0_j_gps), w_j_camera' = w0_j_camera*(σ_camera/σ0_j_camera), w_j_imu' = w0_j_imu*(σ_j_imu/σ0_j_imu); w0_j_gps, w0_j_camera and w0_j_imu are preset initial weights of u_j_gps, u_j_camera and u_j_imu, respectively (set to 0.5, 0.3 and 0.2); σ0_j_gps, σ0_j_camera and σ0_j_imu are preset initial variances of the j-th dimension for the GPS, stereo camera and IMU, respectively (set to 0.2, 0.3 and 0.5); σ_j_gps and σ_j_imu are the real-time variances of the j-th dimension for the GPS and IMU, respectively; and σ_camera is the real-time variance of the stereo camera.
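Read as a formula, the weighting above is a softmax over the scaled weights w_j_*'. A minimal sketch in Python/NumPy under that reading; the function name and the example values of u are illustrative, not from the patent:

```python
import numpy as np

def fuse_dimension(u, w0, sigma0, sigma):
    """Fuse one dimension j of the localization results as described above.

    All arguments are length-3 arrays ordered [gps, camera, imu]:
    u      - real-time localization results u_j_*,
    w0     - preset initial weights w0_j_*,
    sigma0 - preset initial variances sigma0_j_*,
    sigma  - real-time variances sigma_j_*.
    """
    w_prime = w0 * (sigma / sigma0)              # w_j_*' = w0_j_* * (sigma_j_* / sigma0_j_*)
    w = np.exp(w_prime) / np.exp(w_prime).sum()  # softmax normalization of the three weights
    return float(np.dot(u, w))                   # u_j_result

# Illustrative call with the preset values given above; the u values are invented.
u_j_result = fuse_dimension(
    u=np.array([1.02, 0.98, 1.10]),
    w0=np.array([0.5, 0.3, 0.2]),
    sigma0=np.array([0.2, 0.3, 0.5]),
    sigma=np.array([0.2, 0.3, 0.5]),
)
```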
Further, σ_camera is computed as follows (a code sketch follows these steps):
Step 1: extract and match feature points between the stereo camera's left and right images, and triangulate the matched feature points to obtain their 3D coordinates;
Step 2: based on the result of Step 1, estimate the stereo camera's pose with an ICP pose-estimation algorithm;
Step 3: based on the result of Step 2, compute the reprojection error;
Step 4: σ_camera = a/a_threshold, where a is the reprojection error computed in Step 3 and a_threshold is a preset threshold (set to 2).
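The patent fixes neither the feature detector nor the error norm, so the sketch below makes assumptions: cross-frame correspondences are taken as given, the 3D-3D pose of Step 2 is solved in closed form (the Kabsch/Umeyama alignment, i.e. the single ICP step needed once correspondences are known), and a is taken as the mean pixel reprojection error. All names (icp_pose, camera_sigma, K) are our own.

```python
import numpy as np

def icp_pose(P, Q):
    """Closed-form 3D-3D alignment: finds R, t with Q ~ R @ P + t
    for Nx3 arrays of corresponding points (Kabsch/Umeyama)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    return R, cq - R @ cp

def camera_sigma(pts3d_prev, pts3d_cur, pix_obs, K, a_threshold=2.0):
    """sigma_camera = a / a_threshold, with a the mean reprojection error (pixels).

    pts3d_prev/pts3d_cur: Nx3 triangulated points from Step 1 in two frames;
    pix_obs: Nx2 observed pixel coordinates in the current left image;
    K: 3x3 camera intrinsics. a_threshold = 2 per the embodiment above.
    """
    R, t = icp_pose(pts3d_prev, pts3d_cur)            # Step 2: ICP pose
    proj = (K @ (R @ pts3d_prev.T + t[:, None])).T    # Step 3: reproject points
    pix = proj[:, :2] / proj[:, 2:3]
    a = np.linalg.norm(pix - pix_obs, axis=1).mean()  # mean pixel error (an assumption)
    return a / a_threshold                            # Step 4
```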
Further, σ_j_imu is computed as follows. The IMU uses a pre-integration scheme, whose kinematic model is:
ω_measure = ω_truth + b_g + n_g,
a_measure = a_truth + b_a + n_a;
where ω_measure denotes the gyroscope measurement, ω_truth the gyroscope true value, b_g the gyroscope bias, and n_g the gyroscope's Gaussian white noise; a_measure denotes the accelerometer measurement, a_truth the accelerometer true value, b_a the accelerometer bias, and n_a the accelerometer's Gaussian white noise;
from the above kinematic model, the variance propagation formula is derived as:
Ξ_i = F * Ξ_{i-1} * F^T + G * Ξ_n * G^T
where Ξ_i is the variance at the current time step and Ξ_{i-1} the variance at the previous time step, with the initial variance set to the identity matrix; F is the first-order Jacobian of the above kinematic model with respect to the IMU localization state and F^T its transpose; G is the first-order Jacobian of the model with respect to the noise terms, Ξ_n the initial noise estimate, and G^T the transpose of G;
when real-time IMU data are received, the current variance Ξ_i is obtained from the above formula, and σ_j_imu is the j-th element of Ξ_i.
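The propagation step itself is one line of linear algebra; everything else below is scaffolding. The state dimension, the noise dimension, the placeholder F and G, and the reading of "the j-th element of Ξ_i" as the j-th diagonal entry are assumptions for illustration only:

```python
import numpy as np

def propagate_imu_covariance(Xi_prev, F, G, Xi_n):
    """One step of the propagation above: Xi_i = F Xi_{i-1} F^T + G Xi_n G^T."""
    return F @ Xi_prev @ F.T + G @ Xi_n @ G.T

# A 15-dimensional pre-integration state (pose, velocity, gyro/accel biases)
# and 12 noise channels are common choices, not values given by the patent;
# F and G would come from linearizing the kinematic model, so identity/zero
# placeholders stand in for them here.
n_state, n_noise = 15, 12
Xi = np.eye(n_state)              # initial variance: identity matrix, per the text
F = np.eye(n_state)               # placeholder Jacobian w.r.t. the state
G = np.zeros((n_state, n_noise))  # placeholder Jacobian w.r.t. the noise terms
Xi_n = 1e-4 * np.eye(n_noise)     # placeholder initial noise estimate
Xi = propagate_imu_covariance(Xi, F, G, Xi_n)
j = 1
sigma_j_imu = Xi[j - 1, j - 1]    # "j-th element of Xi_i", read as the diagonal entry
```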
Further, if σ_j_gps > σ_gps_threshold and σ_camera <= σ_camera_threshold, the GPS localization result is deemed unreliable and the fused localization result is updated to u_j_result = u_j_camera*w_j_camera + u_j_imu*w_j_imu, where w_j_camera = exp(w_j_camera')/(exp(w_j_camera') + exp(w_j_imu')), w_j_imu = exp(w_j_imu')/(exp(w_j_camera') + exp(w_j_imu')), σ_gps_threshold is a preset GPS variance threshold (set to 5), and σ_camera_threshold is a preset stereo-camera variance threshold (set to 15).
Further, if σ_j_gps > σ_gps_threshold and σ_camera > σ_camera_threshold, neither the GPS nor the stereo-camera localization result is reliable, and the fused localization result is simply the IMU's real-time localization result; here σ_gps_threshold is the preset GPS variance threshold and σ_camera_threshold the preset stereo-camera variance threshold.
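Combining the two gates above with the weight renormalization gives the following sketch (sensor ordering [gps, camera, imu]; the threshold defaults follow the embodiment, and the function name is ours):

```python
import numpy as np

def gated_fusion(u, w_prime, sigma_j_gps, sigma_camera,
                 gps_threshold=5.0, camera_threshold=15.0):
    """Variance gating before fusion; u and w_prime are length-3 arrays
    ordered [gps, camera, imu].

    If the GPS variance exceeds its threshold, the GPS term is dropped and
    the remaining weights are renormalized; if the camera variance also
    exceeds its threshold, the IMU result is returned alone.
    """
    if sigma_j_gps > gps_threshold and sigma_camera > camera_threshold:
        return float(u[2])                              # IMU-only fallback
    keep = slice(1, 3) if sigma_j_gps > gps_threshold else slice(0, 3)
    w = np.exp(w_prime[keep])
    w /= w.sum()                                        # renormalize over kept sensors
    return float(np.dot(u[keep], w))
```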
The above are only specific embodiments of the present invention, but the scope of protection of the present invention is not limited thereto. Any person familiar with the art may conceive of variations or substitutions within the technical scope disclosed by the present invention, and all such variations shall fall within the scope of the present invention. Accordingly, the scope of protection of the present invention shall be determined by the scope of the claims.

Claims (5)

1. A multi-sensor fusion SLAM method, characterized in that, based on data collected in real time by a vehicle-mounted GPS, IMU and stereo camera, together with their real-time localization results, the fused localization result is computed as u_j_result = u_j_gps*w_j_gps + u_j_camera*w_j_camera + u_j_imu*w_j_imu, where u_j_result is the fused localization result in the j-th dimension; u_j_gps, u_j_camera and u_j_imu are the real-time localization results of the GPS, stereo camera and IMU in the j-th dimension, respectively; w_j_gps, w_j_camera and w_j_imu are the weights of u_j_gps, u_j_camera and u_j_imu, respectively; and j = 1, 2, ..., 6;
the values of w_j_gps, w_j_camera and w_j_imu are:
w_j_gps = exp(w_j_gps') / (exp(w_j_gps') + exp(w_j_camera') + exp(w_j_imu')),
w_j_camera = exp(w_j_camera') / (exp(w_j_gps') + exp(w_j_camera') + exp(w_j_imu')),
w_j_imu = exp(w_j_imu') / (exp(w_j_gps') + exp(w_j_camera') + exp(w_j_imu'));
where w_j_gps' = w0_j_gps*(σ_j_gps/σ0_j_gps), w_j_camera' = w0_j_camera*(σ_camera/σ0_j_camera), w_j_imu' = w0_j_imu*(σ_j_imu/σ0_j_imu); w0_j_gps, w0_j_camera and w0_j_imu are preset initial weights of u_j_gps, u_j_camera and u_j_imu, respectively; σ0_j_gps, σ0_j_camera and σ0_j_imu are preset initial variances of the j-th dimension for the GPS, stereo camera and IMU, respectively; σ_j_gps and σ_j_imu are the real-time variances of the j-th dimension for the GPS and IMU, respectively; and σ_camera is the real-time variance of the stereo camera.
2. The multi-sensor fusion SLAM method according to claim 1, characterized in that σ_camera is computed as follows:
Step 1: extract and match feature points between the stereo camera's left and right images, and triangulate the matched feature points to obtain their 3D coordinates;
Step 2: based on the result of Step 1, estimate the stereo camera's pose with an ICP pose-estimation algorithm;
Step 3: based on the result of Step 2, compute the reprojection error;
Step 4: σ_camera = a/a_threshold, where a is the reprojection error computed in Step 3 and a_threshold is a preset threshold.
3. The multi-sensor fusion SLAM method according to claim 1, characterized in that σ_j_imu is computed as follows:
the IMU uses a pre-integration scheme, whose kinematic model is:
ω_measure = ω_truth + b_g + n_g,
a_measure = a_truth + b_a + n_a;
where ω_measure denotes the gyroscope measurement, ω_truth the gyroscope true value, b_g the gyroscope bias, and n_g the gyroscope's Gaussian white noise; a_measure denotes the accelerometer measurement, a_truth the accelerometer true value, b_a the accelerometer bias, and n_a the accelerometer's Gaussian white noise;
from the above kinematic model, the variance propagation formula is derived as:
Ξ_i = F * Ξ_{i-1} * F^T + G * Ξ_n * G^T
where Ξ_i is the variance at the current time step and Ξ_{i-1} the variance at the previous time step, with the initial variance set to the identity matrix; F is the first-order Jacobian of the above kinematic model with respect to the IMU localization state and F^T its transpose; G is the first-order Jacobian of the model with respect to the noise terms, Ξ_n the initial noise estimate, and G^T the transpose of G;
when real-time IMU data are received, the current variance Ξ_i is obtained from the above formula, and σ_j_imu is the j-th element of Ξ_i.
4. The multi-sensor fusion SLAM method according to claim 1, characterized in that if σ_j_gps > σ_gps_threshold and σ_camera <= σ_camera_threshold, the GPS localization result is deemed unreliable and the fused localization result is updated to u_j_result = u_j_camera*w_j_camera + u_j_imu*w_j_imu, where w_j_camera = exp(w_j_camera')/(exp(w_j_camera') + exp(w_j_imu')), w_j_imu = exp(w_j_imu')/(exp(w_j_camera') + exp(w_j_imu')), σ_gps_threshold is a preset GPS variance threshold, and σ_camera_threshold is a preset stereo-camera variance threshold.
5. The multi-sensor fusion SLAM method according to claim 1, characterized in that if σ_j_gps > σ_gps_threshold and σ_camera > σ_camera_threshold, neither the GPS nor the stereo-camera localization result is reliable, and the fused localization result is the IMU's real-time localization result; where σ_gps_threshold is the preset GPS variance threshold and σ_camera_threshold is the preset stereo-camera variance threshold.
CN202010884318.6A 2020-08-28 2020-08-28 A multi-sensor fusion SLAM method Active CN112033422B (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010884318.6A CN112033422B (zh) 2020-08-28 2020-08-28 A multi-sensor fusion SLAM method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010884318.6A CN112033422B (zh) 2020-08-28 2020-08-28 A multi-sensor fusion SLAM method

Publications (2)

Publication Number Publication Date
CN112033422A true CN112033422A (zh) 2020-12-04
CN112033422B CN112033422B (zh) 2023-11-21

Family

ID=73587276

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010884318.6A Active CN112033422B (zh) 2020-08-28 2020-08-28 A multi-sensor fusion SLAM method

Country Status (1)

Country Link
CN (1) CN112033422B (zh)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106780699A * 2017-01-09 2017-05-31 东南大学 A visual SLAM method aided by SINS/GPS and odometry
CN109991636A * 2019-03-25 2019-07-09 启明信息技术股份有限公司 Map construction method and system based on GPS, IMU and binocular vision
CN111156998A * 2019-12-26 2020-05-15 华南理工大学 A mobile-robot localization method based on fusion of RGB-D camera and IMU information

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
王立玲; 梁亮; 马东; 王洪瑞; 刘秀玲: "Autonomous localization of a biped robot based on multi-sensor information fusion", 中国惯性技术学报 (Journal of Chinese Inertial Technology), no. 05 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113218435A * 2021-05-07 2021-08-06 复旦大学 A multi-sensor time synchronization method

Also Published As

Publication number Publication date
CN112033422B (zh) 2023-11-21

Similar Documents

Publication Publication Date Title
Vu et al. Real-time computer vision/DGPS-aided inertial navigation system for lane-level vehicle navigation
US10788830B2 (en) Systems and methods for determining a vehicle position
CN106767853B A high-precision localization method for driverless vehicles based on multi-information fusion
CN107389064B A lane-change control method for unmanned vehicles based on inertial navigation
Rose et al. An integrated vehicle navigation system utilizing lane-detection and lateral position estimation systems in difficult environments for GPS
EP3008708B1 (en) Vision augmented navigation
CN102208035B Image processing system and position measurement system
Li et al. Simultaneous registration and fusion of multiple dissimilar sensors for cooperative driving
CN107567412A Object position measurement using vehicle motion data with an automotive camera
Hu et al. Real-time data fusion on tracking camera pose for direct visual guidance
JP7143857B2 Information processing device, information processing method, program, and mobile object
CN104729506A A vision-information-aided autonomous navigation and localization method for UAVs
KR101704405B1 Lane recognition system and method
Wang et al. Vehicle localization at an intersection using a traffic light map
CN109443348A An underground-garage parking-slot tracking method based on fusion of surround-view vision and inertial navigation
US10565863B1 (en) Method and device for providing advanced pedestrian assistance system to protect pedestrian preoccupied with smartphone
CN111025366A Grid-SLAM navigation system and method based on INS and GNSS
CN112033422B A multi-sensor fusion SLAM method
Hu et al. Fusion of vision, GPS and 3D gyro data in solving camera registration problem for direct visual navigation
CN113532499B Sensor safety detection method and device for unmanned systems, and storage medium
CN112862818B Vehicle localization method for underground parking lots combining inertial sensors with multiple fisheye cameras
EP4113063A1 (en) Localization of autonomous vehicles using camera, gps, and imu
Hu et al. Fusion of vision, 3D gyro and GPS for camera dynamic registration
CN114488244A Wearable assisted-navigation device and method for the blind based on semantic VISLAM and GNSS positioning
CN114910062A A navigation and localization method with multi-source information fusion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant