WO2022247306A1 - A UAV positioning method based on millimeter-wave radar - Google Patents

A UAV positioning method based on millimeter-wave radar

Info

Publication number
WO2022247306A1
WO2022247306A1 · PCT/CN2022/071564 (CN2022071564W)
Authority
WO
WIPO (PCT)
Prior art keywords
point cloud
cloud data
feature
radar
point
Prior art date
Application number
PCT/CN2022/071564
Other languages
English (en)
French (fr)
Inventor
何斌
李刚
沈润杰
周艳敏
陈杰
宋书平
Original Assignee
同济大学 (Tongji University)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 同济大学 (Tongji University)
Priority to US17/870,592 priority Critical patent/US20220383755A1/en
Publication of WO2022247306A1 publication Critical patent/WO2022247306A1/zh

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/06 Systems determining position data of a target
    • G01S13/42 Simultaneous measurement of distance and other co-ordinates
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1652 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR

Definitions

  • The invention relates to the field of UAV positioning, and in particular to a UAV positioning method based on millimeter-wave radar.
  • Current indoor UAV positioning methods mainly include: indoor positioning systems, which locate the drone with multiple positioning cameras arranged indoors and can reach centimeter-level accuracy; visual SLAM, which models the environment map with onboard cameras or radar and combines it with the drone's motion information; ultra-wideband (UWB) technology, comprising an airborne module and a base module, which computes the distance between the two modules from the time of arrival, time difference of arrival, or angle of arrival of radio waves; and radio-based schemes, such as positioning via Wifi, Zigbee, and other communications.
  • The purpose of the present invention is to overcome the defects of the above prior art by providing a UAV positioning method based on millimeter-wave radar, which performs feature point matching and positioning on the radar's point cloud data.
  • The positioning accuracy is high, and the millimeter-wave radar reduces the drone's payload, improves its positioning accuracy, and enables all-weather, day-and-night positioning.
  • A method for positioning an unmanned aerial vehicle based on millimeter-wave radar, the UAV carrying a millimeter-wave radar and an IMU, comprising a calibration phase and a positioning phase.
  • the calibration phase includes:
  • Preprocessing the radar point cloud data: extracting key points from the radar point cloud data, obtaining feature line segments based on the key points, and recording the key points on the feature line segments as feature points;
  • the positioning phase includes:
  • Step S4 is repeated.
  • The calibration phase also includes coordinate calibration of the millimeter-wave radar and the IMU, specifically: according to their installation positions, establish a coordinate transformation matrix between the radar point cloud data and the UAV motion data measured by the IMU to complete the coordinate calibration.
  • The preprocessing of the radar point cloud data includes obtaining deflection-free point cloud data, specifically: mapping the three-dimensional radar point cloud data onto a two-dimensional plane.
  • The preprocessing of the radar point cloud data includes noise filtering, specifically: removing discrete points from the radar point cloud data by direct elimination, and downsampling the radar point cloud data with a voxel grid filter.
  • The preprocessing of the radar point cloud data includes point cloud registration, specifically: searching for nearest-point matches with an iterative algorithm, and transforming the coordinates of the radar point cloud data at the head and tail of the same frame into the same coordinate system.
  • step S5 is specifically:
  • extracting key points in the radar point cloud data includes the following steps:
  • Condition 1: the Euclidean distance between the key point and the previous key point is greater than the preset point-to-point distance threshold.
  • Condition 2: the key point does not fall on the straight line l formed by the last two key points, and its distance to the line l exceeds the preset point-to-line distance threshold.
  • Obtaining the feature line segments based on the key points specifically uses a straight-line fitting method; each feature line segment is fitted from at least 4 key points.
  • step S7 is specifically:
  • For all feature line segments of the current frame: if a segment finds a matching segment, the feature points on it are matching feature points; otherwise, the feature points on it are newly added feature points.
  • step S8 is specifically:
  • The ground coordinates of the drone are updated from the coordinate change of the matching feature points between the previous frame and the current frame; the ground coordinates of the newly added feature points on the map are computed from the coordinate relationship between the newly added feature points and the matching feature points in the current frame.
  • the present invention has the following beneficial effects:
  • Feature point matching and positioning based on the point cloud data of the millimeter-wave radar has high positioning accuracy.
  • The millimeter-wave radar can reduce the drone's payload, improve its positioning accuracy, and enable all-weather, day-and-night positioning.
  • Preprocessing the point cloud data, including obtaining deflection-free point cloud data, noise filtering, and point cloud registration, yields point cloud data of higher accuracy and usability.
  • Fig. 1 is a flowchart of the present invention
  • Figure 2 is a schematic diagram of feature point tracking.
  • A UAV positioning method based on millimeter-wave radar, the UAV carrying a millimeter-wave radar and an IMU.
  • The UAV module uses the DJI Matrice M100 UAV open platform, with a maximum payload of 3.6 kg and an effective flight time of 30 minutes, carrying a Chengtai Technology LDBSD-100 millimeter-wave radar (size 77 mm x 55 mm x 21 mm, H x W x D; weight 500 g) and a Beiwei Sensing Modbus inertial measurement unit IMU527.
  • The system also includes an NVIDIA TX1 processor with 256 NVIDIA CUDA cores and a 64-bit CPU, which meets the real-time processing requirements of the point cloud data.
  • the UAV positioning method includes a calibration stage and a positioning stage, and the calibration stage includes:
  • Preprocessing the radar point cloud data: extracting key points from the radar point cloud data, obtaining feature line segments based on the key points, and recording the key points on the feature line segments as feature points;
  • The calibration phase also includes coordinate calibration of the millimeter-wave radar and the IMU, specifically: according to their installation positions, establish a coordinate transformation matrix between the radar point cloud data and the UAV motion data measured by the IMU, and complete the coordinate calibration.
  • the positioning phase includes:
  • Step S4 is repeated.
  • The millimeter-wave radar can reduce the drone's payload, improve its positioning accuracy, and enable all-weather, day-and-night positioning.
  • The preprocessing of the radar point cloud data includes obtaining deflection-free point cloud data, specifically: mapping the three-dimensional radar point cloud data onto a two-dimensional plane. Since changes in the UAV's flight attitude (roll angle, pitch angle) affect the radar readings, mapping the radar data in the three-dimensional environment onto the two-dimensional X-Y plane reduces this influence.
  • Map the radar data in the three-dimensional environment onto the two-dimensional X-Y plane and define three coordinate systems: the ground coordinate system E, the UAV coordinate system B, and the inertial coordinate system I.
  • The IMU measures the attitude angles of the drone, where φ is the roll angle, θ is the pitch angle, and ψ is the yaw angle.
  • From the installation position of the IMU, the projection matrix of I can be obtained:
  • The yaw angle is used as an aid so that the scanned data does not rotate with the rotation of the drone:
  • the preprocessing of radar point cloud data includes noise filtering.
  • Noise filtering is specifically: removing discrete points in radar point cloud data by direct elimination method, and down-sampling radar point cloud data by using a voxel grid filter.
  • The statistical outlier removal algorithm provided by the PCL library is used to analyze the point cloud and remove points that do not fit the statistics.
  • For downsampling, a voxel grid filter is employed to reduce the density of the point cloud.
  • the preprocessing of radar point cloud data includes point cloud registration.
  • The point cloud registration is specifically: nearest-point matches are searched with an iterative algorithm, and the coordinates of the radar point cloud data at the head and tail of the same frame are transformed into the same coordinate system. Because the millimeter-wave radar sits on a moving UAV, the coordinate system of the point cloud within a single frame drifts, producing a displacement between the head and the tail of each frame.
  • This application searches for nearest-point matches with an iterative algorithm and converts the coordinates of the head and tail point cloud data into the same coordinate system, yielding accurate point cloud data.
  • an iterative closest point (ICP) algorithm is used to perform registration and correct distortion.
  • Step S5 is specifically: obtain the UAV motion data measured by the IMU, including the attitude angle, angular velocity, and linear velocity; compute the UAV's flight distance from these; and correct the radar point cloud data based on the flight distance to obtain the corrected radar point cloud data.
  • The angular velocity and linear velocity of the drone can be read from the serial port through the DJI Onboard SDK protocol; the radar's motion information is obtained through integration, and the laser point cloud is then corrected using the obtained motion information.
  • The attitude angles of the drone, its height Z, and its horizontal flight velocity are obtained directly from the drone's IMU data; integrating the velocity over a given time interval yields the flight distance:
  • this application proposes a positioning method based on feature point matching.
  • In the calibration phase, i.e., when the UAV is initialized and its position is known, feature points are extracted from the UAV's point cloud data and their ground coordinates are recorded.
  • In the positioning phase, i.e., while the UAV is flying, for each collected frame of point cloud data, the feature points matching between the current frame and the previous frame are found, and the drone's ground coordinates are updated from the coordinate transformation of the matching points between the two frames.
  • Extracting feature points in radar point cloud data includes the following steps:
  • The feature line segments are obtained based on the key points, specifically by a straight-line fitting method such as least squares; each feature line segment is fitted from at least 4 key points.
  • For all feature line segments of the current frame: if a segment finds a matching segment, the feature points on it are matching feature points; otherwise, the feature points on it are newly added feature points.
  • One set represents the matching feature points, one is the set of feature points of the previous frame, and one is the set of feature points of the current frame.
  • The offset angle between two matched feature line segments is the relative yaw angle ψ of the two frames of point cloud images, which is used to compensate the yaw angle measured by the IMU. The point cloud is rotated while the relative yaw angle is obtained, so that only a translation remains between the current frame and the map.
  • If the slope of line L1 is given and the slope of L2 is k, the deflection angle Δψ can be obtained:
  • step S8 is specifically:
  • The ground coordinates of the matching feature points on the map are already determined. The drone's ground coordinates can be updated from the coordinate change of the matching feature points between the previous frame and the current frame; the ground coordinates of the newly added feature points on the map are computed from the coordinate relationship between the newly added feature points and the matching feature points, for use in the next match.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Navigation (AREA)

Abstract

A UAV positioning method based on millimeter-wave radar, comprising a calibration phase and a positioning phase. The calibration phase includes: obtaining the ground coordinates of the UAV; extracting feature points from the radar point cloud data and obtaining their ground coordinates. The positioning phase includes: obtaining and preprocessing the current frame of radar point cloud data; obtaining the UAV motion data and fusing it with the radar point cloud data; extracting feature line segments from the radar point cloud data; registering the feature line segments of the current frame against those of the previous frame to find matching feature points and newly added feature points; and, based on the ground coordinates of the matching feature points on the map, obtaining the ground coordinates of the UAV and of the newly added feature points. The method performs feature point matching and positioning on millimeter-wave radar point cloud data with high positioning accuracy; the millimeter-wave radar reduces the UAV's payload, improves its positioning accuracy, and enables all-weather, day-and-night UAV positioning.

Description

A UAV positioning method based on millimeter-wave radar
Technical Field
The present invention relates to the field of UAV positioning, and in particular to a UAV positioning method based on millimeter-wave radar.
Background Art
With advances in technology, UAVs are being applied ever more widely, and the positioning problem plays an important role in their safe operation. Based on GNSS, outdoor UAV positioning and navigation have matured and can reach meter-level accuracy; further, by deploying fixed differential base stations, centimeter-level accuracy can be achieved. However, in spaces with weak GPS signals, such as indoors and in tunnels, precise UAV positioning remains a major challenge and a research hotspot.
Current indoor UAV positioning methods mainly include: indoor positioning systems, which locate the UAV with multiple positioning cameras arranged indoors and can reach centimeter-level accuracy; visual SLAM, which models the environment map with onboard cameras or radar and combines it with the UAV's motion information; ultra-wideband (UWB) technology, comprising an airborne module and a base module, which computes the distance between the two modules from the time of arrival, time difference of arrival, or angle of arrival of radio waves; and radio-based schemes, such as positioning via Wifi, Zigbee, and other communications.
However, the existing techniques all have drawbacks to some degree, such as trade-offs between cost and positioning accuracy, limited applicable scenarios, poor generality in special environments, and real-time performance.
Summary of the Invention
The purpose of the present invention is to overcome the above defects of the prior art by providing a UAV positioning method based on millimeter-wave radar, which performs feature point matching and positioning on the point cloud data of a millimeter-wave radar with high positioning accuracy; the millimeter-wave radar reduces the UAV's payload, improves its positioning accuracy, and enables all-weather, day-and-night UAV positioning.
The purpose of the present invention can be achieved by the following technical solution:
A UAV positioning method based on millimeter-wave radar, the UAV carrying a millimeter-wave radar and an IMU, comprising a calibration phase and a positioning phase. The calibration phase includes:
S1. Obtain a map, obtain the radar point cloud data measured by the millimeter-wave radar, and obtain the ground coordinates of the UAV;
S2. Preprocess the radar point cloud data; extract key points from the radar point cloud data, obtain feature line segments based on the key points, and record the key points on the feature line segments as feature points;
S3. Project the feature points onto the map according to the ground coordinates of the UAV to obtain the ground coordinates of each feature point.
The positioning phase includes:
S4. Obtain the current frame of radar point cloud data measured by the millimeter-wave radar and preprocess it;
S5. Obtain the UAV motion data measured by the IMU and fuse the radar point cloud data with the UAV motion data to obtain corrected radar point cloud data;
S6. Extract key points from the radar point cloud data, obtain feature line segments based on the key points, and record the key points on the feature line segments as feature points;
S7. Register the feature line segments of the current frame of radar point cloud data against those of the previous frame; record the feature points of the current frame that match the previous frame as matching feature points, and the feature points added relative to the previous frame as newly added feature points;
S8. Based on the ground coordinates of the matching feature points on the map, obtain the ground coordinates of the UAV and the ground coordinates of the newly added feature points;
S9. Repeat from step S4.
Further, the calibration phase also includes coordinate calibration of the millimeter-wave radar and the IMU, specifically: according to the installation positions of the millimeter-wave radar and the IMU, establish a coordinate transformation matrix between the radar point cloud data and the UAV motion data measured by the IMU to complete the coordinate calibration.
Further, preprocessing the radar point cloud data includes obtaining deflection-free point cloud data, specifically: mapping the three-dimensional radar point cloud data onto a two-dimensional plane.
Further, preprocessing the radar point cloud data includes noise filtering, specifically: removing discrete points from the radar point cloud data by direct elimination, and downsampling the radar point cloud data with a voxel grid filter.
Further, preprocessing the radar point cloud data includes point cloud registration, specifically: searching for nearest-point matches with an iterative algorithm, and transforming the coordinates of the radar point cloud data at the head and tail of the same frame into the same coordinate system.
Further, step S5 is specifically:
Obtain the UAV motion data measured by the IMU, including the attitude angle, angular velocity, and linear velocity; compute the UAV's flight distance from the attitude angle, angular velocity, and linear velocity; and correct the radar point cloud data based on the flight distance to obtain the corrected radar point cloud data.
Further, extracting key points from the radar point cloud data includes the following steps:
Take the first point of the radar point cloud data as a key point; traverse the remaining points, and a point is a key point if it satisfies condition 1 or condition 2. Condition 1: the Euclidean distance between the point and the previous key point is greater than the preset point-to-point distance threshold. Condition 2: the point does not fall on the straight line l formed by the last two key points, and its distance to the line l exceeds the preset point-to-line distance threshold.
Further, obtaining the feature line segments based on the key points specifically uses a straight-line fitting method, and each feature line segment is fitted from at least 4 key points.
Further, step S7 is specifically:
Compute the angle difference and the distance difference between the feature line segments of the current frame and those of the previous frame; if both the angle difference and the distance difference between a feature line segment of the current frame and one of the previous frame are smaller than the preset registration thresholds, the two feature line segments match each other;
For all feature line segments of the current frame: if a segment finds a matching segment, the feature points on it are matching feature points; if it does not, the feature points on it are newly added feature points.
Further, step S8 is specifically:
Update the ground coordinates of the UAV according to the coordinate change of the matching feature points between the previous frame and the current frame; compute the ground coordinates of the newly added feature points on the map according to the coordinate relationship between the newly added feature points and the matching feature points in the current frame.
Compared with the prior art, the present invention has the following beneficial effects:
(1) Feature point matching and positioning based on millimeter-wave radar point cloud data has high positioning accuracy; the millimeter-wave radar reduces the UAV's payload, improves its positioning accuracy, and enables all-weather, day-and-night UAV positioning.
(2) Fusing the radar point cloud data with the inertial IMU measurements further improves positioning accuracy, reduces the amount of computation, saves computing resources, and ensures positioning speed and real-time performance.
(3) Preprocessing the point cloud data, including obtaining deflection-free point cloud data, noise filtering, and point cloud registration, yields point cloud data of higher accuracy and usability.
Brief Description of the Drawings
Fig. 1 is a flowchart of the present invention;
Fig. 2 is a schematic diagram of feature point tracking.
Detailed Description of the Embodiments
The present invention is described in detail below with reference to the drawings and specific embodiments. This embodiment is implemented on the premise of the technical solution of the present invention, and detailed implementations and specific operating procedures are given, but the scope of protection of the present invention is not limited to the following embodiments.
Embodiment 1:
A UAV positioning method based on millimeter-wave radar, with a millimeter-wave radar and an IMU carried on the UAV. In this embodiment, the UAV module uses the DJI Matrice M100 UAV open platform, with a maximum payload of 3.6 kg and an effective flight time of 30 minutes, carrying a Chengtai Technology LDBSD-100 millimeter-wave radar (size 77 mm x 55 mm x 21 mm, H x W x D; weight 500 g) and a Beiwei Sensing Modbus inertial measurement unit IMU527. The system also includes an NVIDIA TX1 processor with 256 NVIDIA CUDA cores and a 64-bit CPU, which meets the real-time processing requirements of the point cloud data.
As shown in Fig. 1, the UAV positioning method includes a calibration phase and a positioning phase. The calibration phase includes:
S1. Obtain a map, obtain the radar point cloud data measured by the millimeter-wave radar, and obtain the ground coordinates of the UAV;
S2. Preprocess the radar point cloud data; extract key points from the radar point cloud data, obtain feature line segments based on the key points, and record the key points on the feature line segments as feature points;
S3. Project the feature points onto the map according to the ground coordinates of the UAV to obtain the ground coordinates of each feature point.
The calibration phase also includes coordinate calibration of the millimeter-wave radar and the IMU, specifically: according to the installation positions of the millimeter-wave radar and the IMU, establish a coordinate transformation matrix between the radar point cloud data and the UAV motion data measured by the IMU to complete the coordinate calibration.
The positioning phase includes:
S4. Obtain the current frame of radar point cloud data measured by the millimeter-wave radar and preprocess it;
S5. Obtain the UAV motion data measured by the IMU and fuse the radar point cloud data with the UAV motion data to obtain corrected radar point cloud data;
S6. Extract key points from the radar point cloud data, obtain feature line segments based on the key points, and record the key points on the feature line segments as feature points;
S7. Register the feature line segments of the current frame of radar point cloud data against those of the previous frame; record the feature points of the current frame that match the previous frame as matching feature points, and the feature points added relative to the previous frame as newly added feature points;
S8. Based on the ground coordinates of the matching feature points on the map, obtain the ground coordinates of the UAV and the ground coordinates of the newly added feature points;
S9. Repeat from step S4.
This application performs feature point matching and positioning based on the point cloud data of the millimeter-wave radar with high positioning accuracy; the millimeter-wave radar reduces the UAV's payload, improves its positioning accuracy, and carries out UAV positioning all day and in all weather (except heavy rain).
Preprocessing the radar point cloud data includes obtaining deflection-free point cloud data, specifically: mapping the three-dimensional radar point cloud data onto a two-dimensional plane. Since changes in the UAV's flight attitude (roll angle, pitch angle) affect the radar readings, mapping the radar data in the three-dimensional environment onto the two-dimensional X-Y plane reduces this influence.
Map the radar data in the three-dimensional environment onto the two-dimensional X-Y plane and define three coordinate systems: the ground coordinate system E, the UAV coordinate system B, and the inertial coordinate system I. The IMU measures the attitude angles of the UAV
Figure PCTCN2022071564-appb-000001
where φ is the roll angle, θ is the pitch angle, and ψ is the yaw angle. From the installation position of the IMU, the projection matrix of I can be obtained:
Figure PCTCN2022071564-appb-000002
Let the polar coordinates of the data in coordinate system B be
Figure PCTCN2022071564-appb-000003
then in the Cartesian coordinate system this is expressed as
Figure PCTCN2022071564-appb-000004
After the rotation transformation we obtain:
Figure PCTCN2022071564-appb-000005
To keep the matching direction of the point cloud unchanged, the yaw angle is used as an aid so that the scanned data does not rotate with the rotation of the UAV:
Figure PCTCN2022071564-appb-000006
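The yaw-assisted projection above can be sketched as a simple de-rotation of the scan by the measured yaw; this is an illustrative reconstruction, since the patent's exact expression is given only in an equation image:

```python
import math

def derotate(points, yaw):
    """Rotate scan points by -yaw so the scan does not rotate with the
    UAV, per the yaw-assisted projection described above.

    points: list of (x, y) coordinates in the body frame.
    yaw: yaw angle psi in radians (from the IMU).
    """
    c, s = math.cos(-yaw), math.sin(-yaw)
    # Standard 2-D rotation applied to every scan point.
    return [(c * x - s * y, s * x + c * y) for x, y in points]
```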
Preprocessing the radar point cloud data includes noise filtering, specifically: removing discrete points from the radar point cloud data by direct elimination, and downsampling the radar point cloud data with a voxel grid filter. This embodiment uses the statistical outlier removal algorithm provided by the PCL library to analyze the point cloud and remove points that do not fit the statistics. For downsampling, a voxel grid filter is used to reduce the density of the point cloud.
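The voxel grid downsampling step can be sketched in plain Python (standing in for PCL's filter; the cell size here is an illustrative assumption, and cells are 2-D since the cloud has been mapped to the X-Y plane):

```python
from collections import defaultdict

def voxel_downsample(points, voxel=0.5):
    """Voxel grid filter: collapse all points that fall into the same
    grid cell to their centroid, reducing the point cloud density."""
    cells = defaultdict(list)
    for x, y in points:
        # Integer cell index of each point (floor division).
        cells[(int(x // voxel), int(y // voxel))].append((x, y))
    # One averaged point per occupied cell.
    return [
        (sum(x for x, _ in pts) / len(pts), sum(y for _, y in pts) / len(pts))
        for pts in cells.values()
    ]
```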
Preprocessing the radar point cloud data includes point cloud registration, specifically: searching for nearest-point matches with an iterative algorithm, and transforming the coordinates of the radar point cloud data at the head and tail of the same frame into the same coordinate system. Because the millimeter-wave radar is mounted on a moving UAV, the coordinate system of the point cloud within a single frame drifts, producing a displacement between the head and the tail of each frame. This application searches for nearest-point matches with an iterative algorithm and converts the coordinates of the head and tail point cloud data into the same coordinate system, yielding accurate point cloud data. In this embodiment, the iterative closest point (ICP) algorithm is used for registration and distortion correction.
Considering the motion of the UAV, the radar point cloud data needs to be corrected. Step S5 is specifically: obtain the UAV motion data measured by the IMU, including the attitude angle, angular velocity, and linear velocity; compute the UAV's flight distance from these; and correct the radar point cloud data based on the flight distance to obtain the corrected radar point cloud data.
In this embodiment, the UAV's angular velocity and linear velocity can be read from the serial port through the DJI Onboard SDK protocol; the radar's motion information is obtained by integration, and the laser point cloud is then corrected with the obtained motion information. The attitude angles of the UAV are obtained directly from its IMU data
Figure PCTCN2022071564-appb-000007
together with the UAV height Z and the horizontal flight velocity
Figure PCTCN2022071564-appb-000008
Integrating the velocity over a given time interval yields the flight distance:
Figure PCTCN2022071564-appb-000009
Correcting the laser point cloud with the obtained motion information, and then performing motion estimation with the feature point matching method, further reduces the mismatch rate.
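The velocity integral above reduces, for sampled IMU data, to a discrete sum; a minimal sketch (the sampling interval and rectangular integration rule are illustrative assumptions):

```python
def flight_distance(vel_samples, dt):
    """Numerically integrate horizontal velocity samples (vx, vy),
    taken at a fixed interval dt, into a displacement (Dx, Dy).

    This is the discrete form of the flight-distance integral
    described in the text (rectangular rule)."""
    dx = sum(vx for vx, _ in vel_samples) * dt
    dy = sum(vy for _, vy in vel_samples) * dt
    return dx, dy
```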
To achieve positioning, this application proposes a positioning method based on feature point matching. In the calibration phase, i.e., when the UAV is initialized and its position is known, feature points are extracted from the UAV's point cloud data and their ground coordinates are recorded. In the positioning phase, i.e., while the UAV is flying, for each collected frame of point cloud data, the feature points matching between the current frame and the previous frame are found, and the UAV's ground coordinates are updated from the coordinate transformation of the matching feature points between the two frames.
Extracting feature points from the radar point cloud data includes the following steps:
Take the first point of the radar point cloud data as a key point; traverse the remaining points, and a point is a key point if it satisfies condition 1 or condition 2. Condition 1: the Euclidean distance between the point and the previous key point is greater than the preset point-to-point distance threshold. Condition 2: the point does not fall on the straight line l formed by the last two key points, and its distance to the line l exceeds the preset point-to-line distance threshold. After the key points are obtained, the feature line segments are obtained from them using a straight-line fitting method, such as least squares; each feature line segment is fitted from at least 4 key points.
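The two key-point conditions above can be sketched as follows; the threshold values are illustrative assumptions, since the patent leaves them as presets:

```python
import math

def extract_key_points(points, d_pt=0.5, d_line=0.2):
    """Extract key points from one frame of 2-D point cloud data.

    points: list of (x, y) tuples.
    d_pt: point-to-point distance threshold (condition 1).
    d_line: point-to-line distance threshold (condition 2).
    """
    if not points:
        return []
    keys = [points[0]]  # the first point is always a key point
    for p in points[1:]:
        # Condition 1: far enough from the previous key point.
        if math.dist(p, keys[-1]) > d_pt:
            keys.append(p)
            continue
        # Condition 2: off the line l through the last two key points.
        if len(keys) >= 2:
            (x1, y1), (x2, y2) = keys[-2], keys[-1]
            num = abs((y2 - y1) * p[0] - (x2 - x1) * p[1] + x2 * y1 - y2 * x1)
            den = math.hypot(y2 - y1, x2 - x1)
            if den > 0 and num / den > d_line:
                keys.append(p)
    return keys
```

Key points collected this way are then fitted into feature line segments (at least 4 key points per segment, e.g. by least squares).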
The registration of the feature line segments is specifically:
Compute the angle difference and the distance difference between the feature line segments of the current frame and those of the previous frame; if both the angle difference and the distance difference between a feature line segment of the current frame and one of the previous frame are smaller than the preset registration thresholds, the two feature line segments match each other.
For all feature line segments of the current frame: if a segment finds a matching segment, the feature points on it are matching feature points; if it does not, the feature points on it are newly added feature points.
Referring to Fig. 2, let
Figure PCTCN2022071564-appb-000010
denote the set of matching feature points,
Figure PCTCN2022071564-appb-000011
the set of feature points of the previous frame, and
Figure PCTCN2022071564-appb-000012
the set of feature points of the current frame,
Figure PCTCN2022071564-appb-000013
Figure PCTCN2022071564-appb-000014
then the set of newly added feature points is
Figure PCTCN2022071564-appb-000015
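The segment-matching rule described above can be sketched as follows; the segment representation (an angle plus a midpoint) and the threshold values are illustrative assumptions:

```python
import math

def match_segments(prev_segs, cur_segs, ang_thresh=0.1, dist_thresh=0.3):
    """Register current-frame feature line segments against the
    previous frame: a pair matches when both the angle difference and
    the midpoint distance are below the preset registration thresholds.

    Each segment is a dict {"angle": radians, "mid": (x, y)}.
    Returns (matched current segments, newly added segments)."""
    matched, new = [], []
    for c in cur_segs:
        hit = any(
            abs(c["angle"] - p["angle"]) < ang_thresh
            and math.dist(c["mid"], p["mid"]) < dist_thresh
            for p in prev_segs
        )
        (matched if hit else new).append(c)
    return matched, new
```

Feature points on matched segments become matching feature points; those on unmatched segments become newly added feature points.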
Once the feature line segments are paired, the offset angle between two matched feature line segments is the relative yaw angle ψ of the two frames of point cloud images, which is used to compensate the yaw angle measured by the IMU. The point cloud is rotated while the relative yaw angle is obtained, so that only a translation remains between the current frame and the map. Suppose the slope of line L1 is
Figure PCTCN2022071564-appb-000016
and the slope of L2 is k; then the deflection angle Δψ can be obtained:
Figure PCTCN2022071564-appb-000017
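The deflection angle between two matched lines follows from the standard angle-between-lines identity; since the patent's exact expression appears only in an equation image, this is a hedged reconstruction:

```python
import math

def deflection_angle(k1, k2):
    """Relative yaw between two matched feature lines with slopes
    k1 (line L1) and k2 (line L2), assuming the standard identity
    tan(d_psi) = (k2 - k1) / (1 + k1 * k2)."""
    return math.atan((k2 - k1) / (1.0 + k1 * k2))
```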
After the matching feature points are obtained, positioning can be performed. Referring to Fig. 2, step S8 is specifically:
The ground coordinates of the matching feature points on the map are already determined. The UAV's ground coordinates can be updated from the coordinate change of the matching feature points between the previous frame and the current frame; the ground coordinates of the newly added feature points on the map are computed from the coordinate relationship between the newly added feature points and the matching feature points, for use in the next match.
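A minimal sketch of the position update, under the simplifying assumption that after yaw compensation each matched feature point satisfies map_xy = uav_xy + frame_xy (the patent's exact update is not spelled out in the text):

```python
def update_position(matches):
    """Update the UAV ground position from matched feature points.

    matches: list of (map_xy, frame_xy) pairs, where frame_xy is the
    point's position relative to the UAV after the relative yaw has
    been compensated. The estimate is the offset averaged over all
    matched points."""
    n = len(matches)
    ux = sum(m[0] - f[0] for m, f in matches) / n
    uy = sum(m[1] - f[1] for m, f in matches) / n
    return ux, uy
```

New feature points can then be placed on the map as (ux + fx, uy + fy) for use in the next match.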
The preferred specific embodiments of the present invention have been described in detail above. It should be understood that a person of ordinary skill in the art can make many modifications and variations according to the concept of the present invention without creative effort. Therefore, any technical solution that a person skilled in the art can obtain on the basis of the prior art through logical analysis, reasoning, or limited experiments in accordance with the concept of the present invention shall fall within the scope of protection determined by the claims.

Claims (10)

  1. A UAV positioning method based on millimeter-wave radar, characterized in that the UAV carries a millimeter-wave radar and an IMU, and the method comprises a calibration phase and a positioning phase, the calibration phase comprising:
    S1. Obtaining a map, obtaining the radar point cloud data measured by the millimeter-wave radar, and obtaining the ground coordinates of the UAV;
    S2. Preprocessing the radar point cloud data; extracting key points from the radar point cloud data, obtaining feature line segments based on the key points, and recording the key points on the feature line segments as feature points;
    S3. Projecting the feature points onto the map according to the ground coordinates of the UAV to obtain the ground coordinates of each feature point;
    the positioning phase comprising:
    S4. Obtaining the current frame of radar point cloud data measured by the millimeter-wave radar and preprocessing it;
    S5. Obtaining the UAV motion data measured by the IMU and fusing the radar point cloud data with the UAV motion data to obtain corrected radar point cloud data;
    S6. Extracting key points from the radar point cloud data, obtaining feature line segments based on the key points, and recording the key points on the feature line segments as feature points;
    S7. Registering the feature line segments of the current frame of radar point cloud data against those of the previous frame; recording the feature points of the current frame that match the previous frame as matching feature points, and the feature points added relative to the previous frame as newly added feature points;
    S8. Obtaining, based on the ground coordinates of the matching feature points on the map, the ground coordinates of the UAV and the ground coordinates of the newly added feature points;
    S9. Repeating from step S4.
  2. The UAV positioning method based on millimeter-wave radar according to claim 1, characterized in that the calibration phase further comprises coordinate calibration of the millimeter-wave radar and the IMU, specifically: according to the installation positions of the millimeter-wave radar and the IMU, establishing a coordinate transformation matrix between the radar point cloud data and the UAV motion data measured by the IMU to complete the coordinate calibration.
  3. The UAV positioning method based on millimeter-wave radar according to claim 1, characterized in that preprocessing the radar point cloud data comprises obtaining deflection-free point cloud data, specifically: mapping the three-dimensional radar point cloud data onto a two-dimensional plane.
  4. The UAV positioning method based on millimeter-wave radar according to claim 3, characterized in that preprocessing the radar point cloud data comprises noise filtering, specifically: removing discrete points from the radar point cloud data by direct elimination, and downsampling the radar point cloud data with a voxel grid filter.
  5. The UAV positioning method based on millimeter-wave radar according to claim 4, characterized in that preprocessing the radar point cloud data comprises point cloud registration, specifically: searching for nearest-point matches with an iterative algorithm, and transforming the coordinates of the radar point cloud data at the head and tail of the same frame into the same coordinate system.
  6. The UAV positioning method based on millimeter-wave radar according to claim 1, characterized in that step S5 is specifically:
    Obtaining the UAV motion data measured by the IMU, including the attitude angle, angular velocity, and linear velocity; computing the UAV's flight distance from the attitude angle, angular velocity, and linear velocity; and correcting the radar point cloud data based on the flight distance to obtain the corrected radar point cloud data.
  7. The UAV positioning method based on millimeter-wave radar according to claim 1, characterized in that extracting key points from the radar point cloud data comprises the following steps:
    Taking the first point of the radar point cloud data as a key point; traversing the remaining points, a point being a key point if it satisfies condition 1 or condition 2. Condition 1: the Euclidean distance between the point and the previous key point is greater than the preset point-to-point distance threshold. Condition 2: the point does not fall on the straight line l formed by the last two key points, and its distance to the line l exceeds the preset point-to-line distance threshold.
  8. The UAV positioning method based on millimeter-wave radar according to claim 1, characterized in that obtaining the feature line segments based on the key points specifically uses a straight-line fitting method, and each feature line segment is fitted from at least 4 key points.
  9. The UAV positioning method based on millimeter-wave radar according to claim 1, characterized in that step S7 is specifically:
    Computing the angle difference and the distance difference between the feature line segments of the current frame and those of the previous frame; if both the angle difference and the distance difference between a feature line segment of the current frame and one of the previous frame are smaller than the preset registration thresholds, the two feature line segments match each other;
    For all feature line segments of the current frame: if a segment finds a matching segment, the feature points on it are matching feature points; if it does not, the feature points on it are newly added feature points.
  10. The UAV positioning method based on millimeter-wave radar according to claim 1, characterized in that step S8 is specifically:
    Updating the ground coordinates of the UAV according to the coordinate change of the matching feature points between the previous frame and the current frame; computing the ground coordinates of the newly added feature points on the map according to the coordinate relationship between the newly added feature points and the matching feature points in the current frame.
PCT/CN2022/071564 2021-05-28 2022-04-02 A UAV positioning method based on millimeter-wave radar WO2022247306A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/870,592 US20220383755A1 (en) 2021-05-28 2022-07-21 Unmanned aerial vehicle positioning method based on millimeter-wave radar

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110588348.7 2021-05-28
CN202110588348.7A CN113419235A (zh) A UAV positioning method based on millimeter-wave radar

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/870,592 Continuation US20220383755A1 (en) 2021-05-28 2022-07-21 Unmanned aerial vehicle positioning method based on millimeter-wave radar

Publications (1)

Publication Number Publication Date
WO2022247306A1 true WO2022247306A1 (zh) 2022-12-01

Family

ID=77713129

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/071564 WO2022247306A1 (zh) 2021-05-28 2022-04-02 一种基于毫米波雷达的无人机定位方法

Country Status (2)

Country Link
CN (1) CN113419235A (zh)
WO (1) WO2022247306A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113419235A (zh) * 2021-05-28 2021-09-21 同济大学 A UAV positioning method based on millimeter-wave radar
CN114339989B (zh) * 2021-12-27 2023-04-28 同济大学 An azimuth-based distributed positioning method for multi-agent systems
CN114858226B (zh) * 2022-07-05 2022-10-25 武汉大水云科技有限公司 A UAV mountain torrent flow measurement method, apparatus, and device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111158035A (zh) * 2019-12-31 2020-05-15 广东科学技术职业学院 An unmanned vehicle positioning method and unmanned vehicle
CN111351493A (zh) * 2018-12-24 2020-06-30 上海欧菲智能车联科技有限公司 A positioning method and system
CN112130165A (zh) * 2020-09-15 2020-12-25 北京三快在线科技有限公司 A positioning method, apparatus, medium, and unmanned device
CN112241010A (zh) * 2019-09-17 2021-01-19 北京新能源汽车技术创新中心有限公司 Positioning method, apparatus, computer device, and storage medium
WO2021021862A1 (en) * 2019-07-29 2021-02-04 Board Of Trustees Of Michigan State University Mapping and localization system for autonomous vehicles
CN113419235A (zh) * 2021-05-28 2021-09-21 同济大学 A UAV positioning method based on millimeter-wave radar

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109945856B (zh) * 2019-02-18 2021-07-06 天津大学 Inertial/radar-based UAV autonomous positioning and mapping method
CN110456343B (zh) * 2019-07-22 2021-05-28 深圳普捷利科技有限公司 An FMCW millimeter-wave radar based instant positioning method and system
CN110570449B (zh) * 2019-09-16 2021-03-16 电子科技大学 A positioning and mapping method based on millimeter-wave radar and visual SLAM
CN110794392B (zh) * 2019-10-15 2024-03-19 上海创昂智能技术有限公司 Vehicle positioning method, apparatus, vehicle, and storage medium
CN110888125B (zh) * 2019-12-05 2020-06-19 奥特酷智能科技(南京)有限公司 An autonomous-driving vehicle positioning method based on millimeter-wave radar


Also Published As

Publication number Publication date
CN113419235A (zh) 2021-09-21

Similar Documents

Publication Publication Date Title
WO2022247306A1 (zh) A UAV positioning method based on millimeter-wave radar
CN112347840B (zh) Vision sensor and lidar fused UAV positioning and mapping apparatus and method
US20220383755A1 (en) Unmanned aerial vehicle positioning method based on millimeter-wave radar
CN111426320B (zh) A vehicle autonomous navigation method based on image matching/inertial navigation/odometry
CN113485441A (zh) Distribution network inspection method combining UAV high-precision positioning and visual tracking technology
Eynard et al. Real time UAV altitude, attitude and motion estimation from hybrid stereovision
CN112346104B (zh) A UAV information fusion positioning method
WO2022193106A1 (zh) A method for fused GPS and lidar positioning via inertial measurement parameters
Nezhadshahbodaghi et al. Fusing denoised stereo visual odometry, INS and GPS measurements for autonomous navigation in a tightly coupled approach
CN108225273B (zh) A real-time runway detection method based on sensor prior knowledge
CN111829514B (zh) A road-surface condition preview method for integrated vehicle chassis control
CN111208526B (zh) Multi-UAV cooperative positioning method based on lidar and positioning-vector matching
CN115902930A (zh) A UAV indoor mapping and positioning method for ship inspection
Ivancsits et al. Visual navigation system for small unmanned aerial vehicles
WO2021081958A1 (zh) Terrain detection method, movable platform, control device, system, and storage medium
CN108921896B (zh) A downward-looking visual compass fusing point and line features
CN110927765B (zh) Online target positioning method fusing lidar and satellite navigation
Kim et al. Target detection and position likelihood using an aerial image sensor
CN116878542A (zh) A laser SLAM method suppressing odometer height drift
CN112098926A (zh) An intelligent angle-measurement training-sample generation method using a UAV platform
CN114459467B (zh) A VI-SLAM-based target positioning method in unknown rescue environments
CN113960614A (zh) An elevation map construction method based on frame-to-map matching
CN113589848A (zh) Machine-vision-based multi-UAV detection, positioning, and tracking system and method
CN113436276A (zh) Multi-UAV formation method based on visual relative positioning
CN112859052A (zh) Integrated error calibration method for airborne lidar systems based on overlapping-strip conjugate primitives

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22810045

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22810045

Country of ref document: EP

Kind code of ref document: A1