CN106647784A - Miniaturized unmanned aerial vehicle positioning and navigation method based on Beidou navigation system
- Publication number
- CN106647784A CN106647784A CN201611032966.9A CN201611032966A CN106647784A CN 106647784 A CN106647784 A CN 106647784A CN 201611032966 A CN201611032966 A CN 201611032966A CN 106647784 A CN106647784 A CN 106647784A
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/08—Control of attitude, i.e. control of roll, pitch, or yaw
- G05D1/0808—Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
- G05D1/0816—Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft to ensure stability
- G05D1/0825—Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft to ensure stability using mathematical models
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
Abstract
Description
Technical Field
The invention relates to the field of unmanned aerial vehicles and flight control, and specifically to the positioning and navigation of multi-rotor aircraft using the Beidou satellite navigation system.
Background Art
With the advent of low-cost inertial navigation systems (INS) and global satellite positioning, unmanned aerial vehicle (UAV) systems are no longer confined to military applications; low-cost UAVs are increasingly used in civilian fields. The vast majority of current research results in the UAV field rely on the Global Positioning System (GPS) for positioning. However, a GPS receiver is a passive sensor that is highly susceptible to jamming and spoofing, and in an inertial navigation system built from common low-cost micro-electro-mechanical-system (MEMS) sensors, the accuracy of the position and velocity solution degrades rapidly once GPS position updates are lost. In many potential application scenarios for micro UAVs, the reliability of GPS signals cannot be guaranteed, so an alternative positioning method is urgently needed.
Obtaining the position and attitude of the aircraft from images captured by an onboard visual sensor, via suitable image processing algorithms, and then using them for control is one feasible route to autonomous positioning and navigation for micro UAVs. However, a visual navigation algorithm alone cannot provide accurate latitude and longitude of a target, which greatly limits applications such as battlefield reconnaissance.
The GPS system, controlled by the US military, was fully deployed in 1994. By comparison, China's recently developed Beidou navigation system has a unique two-way communication capability that gives it high resistance to jamming and spoofing signals. At present, the Beidou system is not yet complete and its coverage is limited to China and the Asia-Pacific region. Its positioning accuracy also lags behind the mature GPS system: about 10 m in mainland China, whereas GPS, after decades of refinement, has reached about 1 m. The relatively low positioning accuracy of the Beidou system poses a great challenge for the control of quadrotor aircraft that must hover, and the GPS-plus-INS fusion schemes used in existing mature commercial flight control systems are no longer applicable to the Beidou system.
One solution for achieving high-precision position and velocity estimation with the Beidou system is to combine it with an onboard visual navigation system. The literature already contains cases of visual navigation being used to assist GPS. For example, velocity information from an optical flow sensor has been used to improve the position and velocity measurement accuracy of GPS. The velocity provided by the optical flow method can also be compared with the GPS velocity to obtain the relative height above the ground. During space rendezvous and docking, visual navigation algorithms have been fused with GPS information to obtain the relative position between spacecraft.
Past research on UAV navigation with GPS has produced a comparatively large body of results, especially after the GPS-assisting Wide Area Augmentation System (WAAS) entered service and the US military removed Selective Availability (SA); civilian GPS positioning accuracy improved from roughly 100 m initially to about 1 m. The auxiliary information provided by WAAS improves the reliability and accuracy of GPS position measurements, and the shutdown of SA in 2000 greatly improved civilian GPS accuracy. In regions covered by WAAS, GPS positioning accuracy can reach 1 m, which allows even low-cost inertial navigation systems to obtain relatively accurate position estimates with the help of GPS, and has greatly promoted the development of civilian UAVs.
However, for China's Beidou satellite navigation system, which is still under construction, no augmentation system comparable to WAAS has yet been built, so there remain large gaps in both positioning accuracy and reliability. Raw measurements from the Beidou system and the GPS system were collected at the same place and time for comparison. Over 10 minutes of continuous measurement in a static state, all GPS position measurements in the east direction fell within ±1.5 m, while the Beidou position measurements were scattered over ±15 m.
Beyond insufficient accuracy, the Beidou system in practice has a limited number of satellites in orbit, so its positioning reliability is low and fixes are frequently lost due to occlusion. For a traditional system built around low-cost inertial devices, reliable position measurements are essential: without position updates, the position estimate of a low-cost, low-precision inertial navigation system diverges within seconds. Even with precise navigation-grade inertial devices, the position estimation error of an inertial navigation system exceeds 30 m within five minutes.
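The divergence described above can be illustrated with a short dead-reckoning simulation: double-integrating a noisy accelerometer with no position fixes. The noise level, sample rate and duration below are illustrative assumptions, not values from the patent.

```python
import numpy as np

# Illustrative sketch (not from the patent): dead-reckoning drift when a
# low-cost MEMS accelerometer is double-integrated without position updates.
rng = np.random.default_rng(0)
dt = 0.01                    # assumed 100 Hz IMU sampling period (s)
t_end = 10.0                 # simulate 10 s with no position fix
n = int(t_end / dt)
accel_noise_std = 0.05       # assumed accelerometer noise, m/s^2

vel = 0.0
pos = 0.0
for _ in range(n):
    a_meas = rng.normal(0.0, accel_noise_std)  # true acceleration is zero
    vel += a_meas * dt                         # first integration -> velocity
    pos += vel * dt                            # second integration -> position

# The standard deviation of the position error grows roughly as t^(3/2),
# so even small sensor noise produces meter-scale drift within seconds.
print(f"position drift after {t_end:.0f} s: {pos:.3f} m")
```

Even at this optimistic noise level the drift is already visible after ten seconds, which is why the fusion filter below treats the inertial data only as a short-term input between Beidou and optical-flow updates.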
Summary of the Invention
To achieve precise control of a quadrotor, it is very important to obtain an accurate, smooth estimate of position. However, because of the insufficient accuracy of the Beidou satellite navigation system, the traditional sensor fusion methods used with GPS cannot achieve stable autonomous hovering of a quadrotor. Moreover, the Beidou system is still under construction; the limited number of satellites in orbit makes it more susceptible to occlusion by buildings, trees and other obstacles, which degrades positioning quality. To solve these problems, the invention proposes a sensor fusion method suited to the Beidou system: by introducing visual navigation, both the accuracy and the robustness of the position estimate are improved. The specific technical scheme adopted by the invention is a positioning and navigation method for miniaturized UAVs based on the Beidou navigation system, comprising the following steps: obtain the velocity of the UAV with an optical flow sensor mounted on the bottom of the quadrotor; obtain acceleration from the onboard inertial navigation device and velocity from the onboard vision system; combine these with the raw Beidou position measurements and obtain position and velocity estimates through fusion filtering; then achieve position control of the aircraft through a nonlinear position control algorithm.
The state vector X of the fusion filter is defined as
X = (x, y, v_x, v_y)^T
where (x, y) is the horizontal position and (v_x, v_y) is the horizontal velocity.
The filter is a Kalman filter, whose state transition and observation equations are:
X(k) = A X(k-1) + B u(k-1) + ω(k-1)
Z(k) = C X(k) + ν(k)
where k denotes the time step, u is the input vector, Z is the observation vector, and ω and ν are mutually independent process and observation noise with normal distributions. The input vector u and the observation vector Z are defined as:
u = (a_x, a_y)^T
Z = (x_BD, y_BD, v_x_vis, v_y_vis)^T
where (x_BD, y_BD) is the Beidou position measurement and (v_x_vis, v_y_vis) is the velocity measurement from the vision system.
where (a_x, a_y) is the horizontal acceleration measured by the onboard inertial sensor. With the state ordered as (x, y, v_x, v_y), the state transition matrix A and the input control matrix B are
A = [1 0 δt 0; 0 1 0 δt; 0 0 1 0; 0 0 0 1]
B = [δt²/2 0; 0 δt²/2; δt 0; 0 δt]
where δt is the sampling period of the sensors. Since the Beidou receiver observes the position components and the vision system observes the velocity components, the observation matrix C is the 4×4 identity matrix.
The goal of the Kalman filter is to form an optimal estimate of the state at time k from the observation Z(k) at time k, the state estimate at the previous time step, and the previous control input u(k-1). This statistically optimal filter is given by:
X̂(k|k-1) = A X̂(k-1) + B u(k-1)
P(k|k-1) = A P(k-1) A^T + B Q B^T
H(k) = P(k|k-1) C^T (C P(k|k-1) C^T + R)^(-1)
X̂(k) = X̂(k|k-1) + H(k) (Z(k) - C X̂(k|k-1))
P(k) = (I - H(k) C) P(k|k-1)
where P is the state covariance matrix, the covariance matrix Q represents the noise of the acceleration data, and the covariance matrix R represents the noise of the Beidou and vision observations. Both matrices are diagonal and can be determined from recorded actual flight data. The entries of Q are small; in R, the entries corresponding to the Beidou observations are given large values, while those corresponding to the vision observations are small.
Achieving position control through a nonlinear position control algorithm means adopting a nonlinear robust controller, as follows.
The position and yaw angle of the aircraft are chosen as the system output, written η = [x y z ψ]^T. The control objective is to make the aircraft track a given trajectory η_d = [x_d y_d z_d ψ_d]^T. The lateral position x and longitudinal position y are obtained from the feedback of the onboard vision system, and the vertical position z from the reading of the onboard barometer. The controller consists of an inner (attitude) loop and an outer (position) loop. The inner loop uses a classic proportional-integral-derivative (PID) controller, and the outer loop uses a nonlinear robust controller. The desired roll and pitch angles and angular velocities are computed by the outer-loop controller. The simplified translational dynamics of the quadrotor are expressed as
ẍ = (cos φ sin θ cos ψ + sin φ sin ψ) u₁/m
ÿ = (cos φ sin θ sin ψ - sin φ cos ψ) u₁/m
z̈ = (cos φ cos θ) u₁/m - g
where u₁ is the total lift, m the mass, g the gravitational acceleration, and φ, θ, ψ the roll, pitch and yaw angles.
When the aircraft reaches the given trajectory η_d = [x_d y_d z_d ψ_d]^T, the auxiliary vector μ = [μ_x μ_y μ_z]^T is defined, where μ represents the desired acceleration vector, i.e. a virtual position control input.
The outer-loop position controller uses a novel robust controller based on the Robust Integral of the Signum of the Error (RISE). The position tracking error signals are defined as:
e_x1 = x_d - x,  e_y1 = y_d - y,  e_z1 = z_d - z
where x_d, y_d, z_d are the time-varying reference trajectory. The following auxiliary error signals are introduced:
e_x2 = ė_x1 + α_x e_x1,  e_y2 = ė_y1 + α_y e_y1,  e_z2 = ė_z1 + α_z e_z1
where α_x, α_y and α_z are positive gains. The position controller μ is designed as
μ_i(t) = (k_si + 1) e_i2(t) - (k_si + 1) e_i2(0) + ∫₀ᵗ [(k_si + 1) α_i e_i2(τ) + β_i sgn(e_i2(τ))] dτ,  i ∈ {x, y, z}
where k_sx, k_sy, k_sz, β_x, β_y, β_z are positive gains and sgn(·) is the standard signum function.
The components of μ(t) can be written out explicitly, from which the total lift u₁(t) and the desired attitude angles φ_d and θ_d are solved.
The control inputs u₂, u₃, u₄ of the inner-loop (attitude) controller are designed as
u₂ = k_pφ e_φ + k_iφ ∫ e_φ dt + k_dφ ė_φ
u₃ = k_pθ e_θ + k_iθ ∫ e_θ dt + k_dθ ė_θ
u₄ = k_pψ e_ψ + k_iψ ∫ e_ψ dt + k_dψ ė_ψ
where k_pφ, k_dφ, k_iφ are the proportional, derivative and integral gains of the roll attitude controller; k_pθ, k_dθ, k_iθ those of the pitch attitude controller; and k_pψ, k_dψ, k_iψ those of the yaw attitude controller; all gains are positive. The tracking errors e_φ, e_θ, e_ψ are defined as:
e_φ = φ_d - φ,  e_θ = θ_d - θ,  e_ψ = ψ_d - ψ
where φ_d and θ_d are obtained from the outer-loop controller, and ψ_d is the time-varying yaw trajectory.
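The cascaded structure described above, a RISE outer loop producing the virtual acceleration command μ and a PID inner loop for attitude, can be sketched per axis as follows. The gain values, class names and the simple forward-Euler discretization are illustrative assumptions, not taken from the patent.

```python
import numpy as np

dt = 0.005  # assumed control period (s)

class RiseAxis:
    """RISE law for one translational axis:
    mu(t) = (ks+1)*e2(t) - (ks+1)*e2(0) + int[(ks+1)*alpha*e2 + beta*sgn(e2)] dt
    """
    def __init__(self, alpha, ks, beta):
        self.alpha, self.ks, self.beta = alpha, ks, beta
        self.integral = 0.0
        self.e2_initial = None

    def update(self, e1, e1_dot):
        e2 = e1_dot + self.alpha * e1                 # filtered tracking error
        if self.e2_initial is None:
            self.e2_initial = e2                      # e2(0), sampled at first call
        self.integral += ((self.ks + 1) * self.alpha * e2
                          + self.beta * np.sign(e2)) * dt
        return (self.ks + 1) * (e2 - self.e2_initial) + self.integral

class PidAxis:
    """Classic PID used for the inner (attitude) loop."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0

    def update(self, e, e_dot):
        self.integral += e * dt
        return self.kp * e + self.ki * self.integral + self.kd * e_dot

# Usage: one outer-loop step for x, one inner-loop step for roll.
rise_x = RiseAxis(alpha=2.0, ks=4.0, beta=0.5)
mu_x = rise_x.update(e1=0.3, e1_dot=-0.1)    # position error -> accel command
pid_roll = PidAxis(kp=6.0, ki=0.5, kd=0.8)
u2 = pid_roll.update(e=0.05, e_dot=-0.02)    # attitude error -> torque command
print(mu_x, u2)
```

Subtracting e2(0) makes the RISE output start from zero, so the signum-integral term, which gives the controller its disturbance rejection, builds up smoothly instead of stepping at startup.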
Compared with the prior art, the technical features and effects of the invention are as follows:
The method provides an initial study of high-precision position control for UAVs using the comparatively low-precision domestic Beidou satellite navigation system. By fusing Beidou position information with the velocity information of the onboard visual navigation system, a high-precision position estimate free of cumulative error is obtained; at the same time, under a nonlinear position controller, high-precision position control using the Beidou system is achieved. Long-distance flight verification shows that the proposed sensor fusion scheme combines the high short-term position estimation accuracy of the optical flow method with the absence of long-term cumulative error in the Beidou system, and preliminarily realizes the application of the Beidou navigation system on UAVs capable of hovering flight.
Brief Description of the Drawings
Fig. 1 is a schematic diagram of one embodiment of the multi-sensor fusion filter of the invention.
Fig. 2 shows one embodiment of the airborne Beidou flight control system of the invention.
Fig. 3 shows the actual effect of the method of the invention.
Detailed Description
The technical problem to be solved by the invention is to provide an autonomous UAV positioning method based on the fusion of Beidou satellite navigation data with visual sensor data, achieving accurate, drift-free positioning of UAVs in outdoor environments.
The technical scheme adopted is to apply a fusion of Beidou satellite navigation data and optical flow sensor data in the positioning system of the UAV, comprising the following steps.
The invention adopts a "sensor fusion (filtering), then control" architecture, fusing Beidou, optical flow and inertial navigation through a filter, and achieves a flight control accuracy that the Beidou system alone cannot reach. In addition, a novel nonlinear robust controller based on the Robust Integral of the Signum of the Error (RISE) is used in the control algorithm of the aircraft, further improving control performance.
The velocity of the UAV is obtained with an optical flow sensor mounted on the bottom of the quadrotor and used to improve position estimation accuracy. Using the filter structure shown in Fig. 1, with the help of the acceleration from the inertial navigation unit and the velocity from the onboard vision system, combined with the raw Beidou position measurements, highly accurate and reliable estimates of position and velocity are obtained. High-precision position control of the aircraft is then achieved through a nonlinear position control algorithm.
The technical scheme of the invention fuses the visual sensor data with the position information of the Beidou navigation system to realize positioning of the UAV, comprising the following steps.
The state vector of the designed filter is defined as
X = (x, y, v_x, v_y)^T
where (x, y) is the horizontal position and (v_x, v_y) is the horizontal velocity.
The state transition and observation equations of the Kalman filter are:
X(k) = A X(k-1) + B u(k-1) + ω(k-1)
Z(k) = C X(k) + ν(k)
where u is the input vector, Z is the observation vector, and ω and ν are mutually independent process and observation noise with normal distributions. The input vector u and the observation vector Z are defined as:
u = (a_x, a_y)^T
where (a_x, a_y) is the horizontal acceleration measured by the onboard inertial sensor. With the state ordered as (x, y, v_x, v_y), the state transition matrix A and the input control matrix B are
A = [1 0 δt 0; 0 1 0 δt; 0 0 1 0; 0 0 0 1]
B = [δt²/2 0; 0 δt²/2; δt 0; 0 δt]
where δt is the sampling period of the sensors, and the observation matrix C is the 4×4 identity matrix since the Beidou receiver observes the position components and the vision system the velocity components.
The goal of the Kalman filter is to form an optimal estimate of the state at time k from the observation Z(k) at time k, the state estimate at the previous time step, and the previous control input u(k-1). This statistically optimal filter is given by:
X̂(k|k-1) = A X̂(k-1) + B u(k-1)
P(k|k-1) = A P(k-1) A^T + B Q B^T
H(k) = P(k|k-1) C^T (C P(k|k-1) C^T + R)^(-1)
X̂(k) = X̂(k|k-1) + H(k) (Z(k) - C X̂(k|k-1))
P(k) = (I - H(k) C) P(k|k-1)
where the covariance matrix Q represents the noise of the acceleration data and R the noise of the Beidou and vision observations; both are diagonal and can be determined from recorded actual flight data. In this system, the entries of Q are small; in R, the entries corresponding to the Beidou observations are large, while those corresponding to the vision observations are small.
To improve rejection of external disturbances, a nonlinear robust controller is used on the UAV. It is designed as follows.
The position and yaw angle of the aircraft are chosen as the system output, η = [x y z ψ]^T; the control objective is to track a given trajectory η_d = [x_d y_d z_d ψ_d]^T. The translational positions x and y are obtained from the feedback of the onboard vision system, and the vertical position z from the reading of the onboard barometer. The controller consists of an inner (attitude) loop and an outer (position) loop; the inner loop uses a classic proportional-integral-derivative (PID) controller and the outer loop a nonlinear robust controller. The desired attitude angles and angular velocities are computed by the outer-loop controller. The simplified translational dynamics of the quadrotor can be expressed as
ẍ = (cos φ sin θ cos ψ + sin φ sin ψ) u₁/m
ÿ = (cos φ sin θ sin ψ - sin φ cos ψ) u₁/m
z̈ = (cos φ cos θ) u₁/m - g
When the aircraft reaches the given trajectory η_d = [x_d y_d z_d ψ_d]^T, the auxiliary vector μ = [μ_x μ_y μ_z]^T is defined, where μ(t) represents the desired acceleration vector, i.e. a virtual position control input.
Because a micro UAV is light, it is easily affected by airflow and other external disturbances. To improve robustness, the outer-loop position controller uses a novel robust controller based on the Robust Integral of the Signum of the Error (RISE). The position tracking error signals are defined as:
e_x1 = x_d - x,  e_y1 = y_d - y,  e_z1 = z_d - z
where x_d, y_d, z_d are the time-varying reference trajectory. To facilitate the subsequent controller design, the following auxiliary error signals are introduced:
e_x2 = ė_x1 + α_x e_x1,  e_y2 = ė_y1 + α_y e_y1,  e_z2 = ė_z1 + α_z e_z1
where α_x, α_y and α_z are positive gains. The position controller μ(t) is designed as
μ_i(t) = (k_si + 1) e_i2(t) - (k_si + 1) e_i2(0) + ∫₀ᵗ [(k_si + 1) α_i e_i2(τ) + β_i sgn(e_i2(τ))] dτ,  i ∈ {x, y, z}
where k_sx, k_sy, k_sz, β_x, β_y, β_z are positive gains and sgn(·) is the standard signum function.
The components of μ(t) can be written out explicitly, from which the total lift u₁ and the desired attitude angles φ_d, θ_d can be solved.
The control inputs u₂, u₃, u₄ of the inner-loop (attitude) controller are designed as
u₂ = k_pφ e_φ + k_iφ ∫ e_φ dt + k_dφ ė_φ
u₃ = k_pθ e_θ + k_iθ ∫ e_θ dt + k_dθ ė_θ
u₄ = k_pψ e_ψ + k_iψ ∫ e_ψ dt + k_dψ ė_ψ
where k_pφ, k_dφ, k_iφ are the proportional, derivative and integral gains of the roll attitude controller; k_pθ, k_dθ, k_iθ those of the pitch attitude controller; and k_pψ, k_dψ, k_iψ those of the yaw attitude controller; all gains are positive. The tracking errors e_φ, e_θ, e_ψ are defined as:
e_φ = φ_d - φ,  e_θ = θ_d - θ,  e_ψ = ψ_d - ψ
where φ_d and θ_d are obtained from the outer-loop controller, and ψ_d is the time-varying yaw trajectory.
A specific example is given below.
1. System hardware connection and configuration
As shown in Fig. 2, the vision-based autonomous flight control method for a quadrotor UAV adopts a flight control structure based on an embedded architecture. The experimental platform comprises the quadrotor UAV itself, a ground station, and a remote controller. The quadrotor carries an embedded computer (with an Intel Core i3 dual-core processor at 1.8 GHz), an onboard PX4FLOW optical flow sensor, a GPS module, and a flight controller (including an inertial navigation unit, a barometer module, etc.). The ground station is a notebook computer running Linux, used to launch the onboard programs and for remote monitoring. The platform can be taken off and landed manually with the remote controller and switched to manual mode in an emergency to ensure experimental safety.
2. Flight experiment results
In this embodiment, multiple groups of flight control experiments were carried out on the above experimental platform in an outdoor campus environment. The goal of the experiments is to achieve high-precision, drift-free positioning of the unmanned aerial vehicle using Beidou navigation system information.
The flight trajectory curves from the outdoor handheld experiment are shown in Figure 3. The curve marked ● is the measurement of a high-precision GPS receiver used as reference; the curve marked ▲ is the visual-odometry estimate obtained by integrating the velocity information of the visual sensor; and the curve marked ■ is the result of the multi-sensor fusion method designed in the present invention. As the figure shows, the raw measurements of the Beidou satellite navigation system are of relatively low accuracy, and the visual odometry accumulates noticeable error over long-distance operation, whereas the fusion method designed in the present invention yields a high-precision position estimate free of accumulated error, demonstrating the effectiveness of the designed fusion algorithm.
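The drift-free behavior described above comes from combining two complementary sources: integrated optical-flow velocity (smooth but drifting) and satellite position fixes (noisy but drift-free). As a minimal sketch of such a fusion, the following is a one-dimensional constant-velocity Kalman filter; it is not the patent's actual multi-sensor filter, and the function name and noise parameters q and r are assumptions for illustration.

```python
# Sketch: fuse low-rate, noisy position fixes with odometer velocity.
# Between fixes the filter dead-reckons (and would drift); each fix
# corrects the estimate, so the accumulated error stays bounded.

def fuse(pos_fixes, flow_vels, dt=0.1, q=0.01, r=4.0):
    """pos_fixes: position measurements, None where no fix arrived;
    flow_vels: velocity from the optical-flow odometer, one per step."""
    x = pos_fixes[0] if pos_fixes[0] is not None else 0.0  # position estimate
    p = 1.0                                                # estimate variance
    est = []
    for z, v in zip(pos_fixes, flow_vels):
        # Predict: dead-reckon with the odometer velocity.
        x += v * dt
        p += q
        # Correct: a position fix pulls the estimate back toward truth.
        if z is not None:
            k = p / (p + r)          # Kalman gain
            x += k * (z - x)
            p *= (1.0 - k)
        est.append(x)
    return est

# Example: fixes arrive every other step while moving at 5 units/s.
track = fuse([0.0, None, 1.0, None], [5.0, 5.0, 5.0, 5.0])
```

The same structure extends to the full state (position and velocity per axis) with a matrix state-transition model, which is how a multi-rate GNSS/odometry filter is typically organized.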
Obviously, the above example is merely an illustration for the sake of clarity and does not limit the embodiments. On the basis of the above description, those of ordinary skill in the art may make changes or variations of other forms; it is neither necessary nor possible to exhaustively enumerate all embodiments here. Obvious changes or variations derived therefrom remain within the scope of protection of the present invention.
Claims (4)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611032966.9A CN106647784A (en) | 2016-11-15 | 2016-11-15 | Miniaturized unmanned aerial vehicle positioning and navigation method based on Beidou navigation system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106647784A true CN106647784A (en) | 2017-05-10 |
Family
ID=58808761
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201611032966.9A Pending CN106647784A (en) | 2016-11-15 | 2016-11-15 | Miniaturized unmanned aerial vehicle positioning and navigation method based on Beidou navigation system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106647784A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102829779A (en) * | 2012-09-14 | 2012-12-19 | 北京航空航天大学 | Aircraft multi-optical flow sensor and inertia navigation combination method |
CN103365297A (en) * | 2013-06-29 | 2013-10-23 | 天津大学 | Optical flow-based four-rotor unmanned aerial vehicle flight control method |
CN104062977A (en) * | 2014-06-17 | 2014-09-24 | 天津大学 | Full-autonomous flight control method for quadrotor unmanned aerial vehicle based on vision SLAM |
CN104808231A (en) * | 2015-03-10 | 2015-07-29 | 天津大学 | Unmanned aerial vehicle positioning method based on GPS and optical flow sensor data fusion |
2016-11-15: CN application CN201611032966.9A filed; published as CN106647784A; status: Pending
Non-Patent Citations (1)
Title |
---|
CAO Meihui et al.: "Vision-Based Autonomous Localization and Control System for a Quadrotor UAV", 《信息与控制》 (Information and Control) * |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107389968A (en) * | 2017-07-04 | 2017-11-24 | 武汉视览科技有限公司 | A kind of unmanned plane fixed-point implementation method and apparatus based on light stream sensor and acceleration transducer |
CN108007474A (en) * | 2017-08-31 | 2018-05-08 | 哈尔滨工业大学 | A kind of unmanned vehicle independent positioning and pose alignment technique based on land marking |
CN108052005A (en) * | 2017-12-07 | 2018-05-18 | 智灵飞(北京)科技有限公司 | Control method, the unmanned plane of a kind of interior unmanned plane speed limit and limit for height |
CN108535279A (en) * | 2018-03-09 | 2018-09-14 | 成都圭目机器人有限公司 | A kind of detection method detecting robot based on sewage pipeline |
CN108535279B (en) * | 2018-03-09 | 2021-05-25 | 成都圭目机器人有限公司 | Detection method based on sewage pipeline detection robot |
CN108536171A (en) * | 2018-03-21 | 2018-09-14 | 电子科技大学 | The paths planning method of multiple no-manned plane collaboration tracking under a kind of multiple constraint |
CN108536171B (en) * | 2018-03-21 | 2020-12-29 | 电子科技大学 | A Path Planning Method for Cooperative Tracking of Multiple UAVs under Multiple Constraints |
CN109900265A (en) * | 2019-03-15 | 2019-06-18 | 武汉大学 | A kind of robot localization algorithm of camera/mems auxiliary Beidou |
WO2020259185A1 (en) * | 2019-06-25 | 2020-12-30 | 京东方科技集团股份有限公司 | Method and apparatus for implementing visual odometer |
CN110646814A (en) * | 2019-09-16 | 2020-01-03 | 中国人民解放军国防科技大学 | Unmanned aerial vehicle deception method under combined navigation mode |
CN112539746A (en) * | 2020-10-21 | 2021-03-23 | 济南大学 | Robot vision/INS combined positioning method and system based on multi-frequency Kalman filtering |
CN112363525A (en) * | 2020-11-30 | 2021-02-12 | 扬州市久冠航空科技有限公司 | Aircraft control method |
CN114001736A (en) * | 2021-11-09 | 2022-02-01 | Oppo广东移动通信有限公司 | Positioning method, positioning device, storage medium and electronic equipment |
RU2796411C1 (en) * | 2022-06-24 | 2023-05-23 | Федеральное государственное казенное военное образовательное учреждение высшего образования "Военный учебно-научный центр Военно-воздушных сил "Военно-воздушная академия имени профессора Н.Е. Жуковского и Ю.А. Гагарина" (г. Воронеж) Министерства обороны Российской Федерации | Flight control device for ground-based radio-technical facilities of flight support |
CN115542362A (en) * | 2022-12-01 | 2022-12-30 | 成都信息工程大学 | High-precision space positioning method, system, equipment and medium for electric power operation site |
CN117269885A (en) * | 2023-11-23 | 2023-12-22 | 中国飞行试验研究院 | Aircraft positioning method and device based on opportunistic signal fusion |
CN117269885B (en) * | 2023-11-23 | 2024-02-20 | 中国飞行试验研究院 | Aircraft positioning method and device based on opportunistic signal fusion |
CN117455202A (en) * | 2023-12-25 | 2024-01-26 | 青岛民航凯亚系统集成有限公司 | Positioning and scheduling method, system and device for apron equipment |
CN117455202B (en) * | 2023-12-25 | 2024-06-28 | 青岛民航凯亚系统集成有限公司 | Positioning and scheduling method, system and device for apron equipment |
CN118225636A (en) * | 2024-05-23 | 2024-06-21 | 中国矿业大学 | A non-invasive method for estimating the motion state of tracer particles |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106647784A (en) | Miniaturized unmanned aerial vehicle positioning and navigation method based on Beidou navigation system | |
Barton | Fundamentals of small unmanned aircraft flight | |
Cho et al. | Wind estimation and airspeed calibration using a UAV with a single-antenna GPS receiver and pitot tube | |
Redding et al. | Vision-based target localization from a fixed-wing miniature air vehicle | |
CN104062977B (en) | Full-autonomous flight control method for quadrotor unmanned aerial vehicle based on vision SLAM | |
CN104460685A (en) | Control system for four-rotor aircraft and control method of control system | |
KR101574601B1 (en) | Multi rotor unmanned aerial vehicle, autonomous flight control method augmented by vision sensor thereof and record media recorded program for implement thereof | |
CN102692225A (en) | Attitude heading reference system for low-cost small unmanned aerial vehicle | |
CN103837151B (en) | A kind of aerodynamic model auxiliary navigation method of quadrotor | |
CN104729497A (en) | Ultra-small dual-duct unmanned plane combined navigation system and dual-mode navigation method | |
Dorobantu et al. | An airborne experimental test platform: From theory to flight | |
CN104808231A (en) | Unmanned aerial vehicle positioning method based on GPS and optical flow sensor data fusion | |
Song et al. | Towards autonomous control of quadrotor unmanned aerial vehicles in a GPS-denied urban area via laser ranger finder | |
CN105928515A (en) | Navigation system for unmanned plane | |
CN103712598A (en) | Attitude determination system and method of small unmanned aerial vehicle | |
Roh et al. | Dynamic accuracy improvement of a MEMS AHRS for small UAVs | |
CN108592911A (en) | A kind of quadrotor kinetic model/airborne sensor Combinated navigation method | |
Alarcon et al. | UAV helicopter relative state estimation for autonomous landing on moving platforms in a GPS-denied scenario | |
CN108873031B (en) | An optimization method for external parameter calibration of a 2-DOF pod | |
Crocoll et al. | Quadrotor inertial navigation aided by a vehicle dynamics model with in-flight parameter estimation | |
Toratani | Research and development of double tetrahedron hexa-rotorcraft (Dot-HR) | |
CN102706360B (en) | Method utilizing optical flow sensors and rate gyroscope to estimate state of air vehicle | |
CN108693372A (en) | A kind of course axis angular rate method of estimation of quadrotor | |
Perry et al. | Estimating angle of attack and sideslip under high dynamics on small UAVs | |
Bohdanov et al. | Vision-based quadrotor micro-uav position and yaw estimation and control |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication |
Application publication date: 20170510 |