CN113237478A - Unmanned aerial vehicle attitude and position estimation method and unmanned aerial vehicle - Google Patents


Info

Publication number
CN113237478A
Authority
CN
China
Prior art keywords
aerial vehicle
unmanned aerial
coordinate system
data
axis
Prior art date
Legal status
Granted
Application number
CN202110584179.XA
Other languages
Chinese (zh)
Other versions
CN113237478B (en)
Inventor
石硕
黄智图
李佳鹏
郑重
贺健
周聪
Current Assignee
Harbin Institute of Technology
Original Assignee
Harbin Institute of Technology
Priority date
Filing date
Publication date
Application filed by Harbin Institute of Technology filed Critical Harbin Institute of Technology
Priority to CN202110584179.XA priority Critical patent/CN113237478B/en
Publication of CN113237478A publication Critical patent/CN113237478A/en
Application granted granted Critical
Publication of CN113237478B publication Critical patent/CN113237478B/en
Legal status: Active

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • G05D1/0816Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft to ensure stability
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a method for estimating the attitude and position of an unmanned aerial vehicle (UAV), and a UAV, in the field of UAV control. It addresses the problem that existing UAVs using PID (proportional-integral-derivative) controllers do not exhibit sufficient robustness. The UAV comprises a UAV body and, fixed to it, an inertial sensing unit, an image processing unit and a micro control unit. The inertial sensing unit's own coordinate system coincides with the body coordinate system; it is connected to the micro control unit and measures the UAV's three-axis angular velocity data and three-axis acceleration data. The image processing unit is fixed below the UAV body, is connected to the micro control unit, and acquires images of AprilTags. The micro control unit comprises a storage device, a processor, and a computer program stored in the storage device and executable on the processor; the processor executes the program to implement one of the methods described herein.

Description

Unmanned aerial vehicle attitude and position estimation method and unmanned aerial vehicle
Technical Field
The invention relates to the field of unmanned aerial vehicle control.
Background
As members of the small-UAV family, quadrotor UAVs offer advantages such as small size, agile flight, vertical take-off and landing, fixed-point hovering, and easy portability. However, UAVs still face many practical problems: poor control stability, low networking communication efficiency among multiple UAVs, difficulty in maintaining relative position, and collisions. Stable control of the UAV is an important prerequisite for multi-UAV formation control, yet it is threatened by hardware problems such as clock temperature drift, sensor noise and signal interference; mechanical problems such as motor stall and high-temperature demagnetization; software problems such as memory leaks, compiler optimization errors and low arithmetic precision; and sensor problems such as laser ranging failure on light-absorbing surfaces and machine-vision camera failure caused by reflections.
On the control-algorithm side, existing positional or incremental PID controllers can solve most control problems, and quadrotor UAVs are no exception. However, when facing airflow disturbance, intermittent actuator failure and similar problems, existing PID controllers do not exhibit sufficient robustness.
Disclosure of Invention
The invention aims to solve the problem that existing UAVs using PID controllers do not exhibit sufficient robustness, and provides a UAV attitude and position estimation method and a UAV.
The disclosed method for estimating the attitude and position of a UAV includes a UAV attitude estimation method with the following specific steps:
Step 1: normalize the UAV's three-axis acceleration measurements to obtain the unit observed acceleration vector $\hat{a}^b$, the representation of the gravity direction in the body coordinate system;
Step 2: express the motion velocity vector $v^n$ computed in the navigation coordinate system in the body coordinate system, obtaining the unit estimated acceleration vector $\hat{v}^b$;
Step 3: compute the cross product of the unit observation vector $\hat{a}^b$ in the body coordinate system (the normalized three-axis acceleration in the body frame b, provided directly by the onboard IMU) and the unit estimated acceleration vector $\hat{v}^b$ (the projection of the normalized gravity vector onto the body frame b, computed by the attitude solution algorithm), obtaining the error vector $e$;
Step 4: use the error vector $e$ as the input of the adaptive Kalman filter, take the filter's output as the compensation for the three-axis angular velocity data, and obtain the compensated three-axis angular velocity data from the compensation and the uncompensated data;
Step 5: estimate the UAV's attitude from the compensated three-axis angular velocity data.
Further, the unit observed acceleration vector $\hat{a}^b$ in Step 1 is calculated by:

$$\hat{a}^b = \frac{1}{\sqrt{a_x^2 + a_y^2 + a_z^2}} \begin{bmatrix} a_x & a_y & a_z \end{bmatrix}^T$$

where $a_x$, $a_y$ and $a_z$ are the UAV's three-axis acceleration measurements after filtering, measured by the inertial sensing unit carried on the UAV.
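The normalization of Step 1 can be sketched in Python (the numpy dependency and function name are illustrative assumptions, not part of the original disclosure):

```python
import numpy as np

def unit_observed_acceleration(ax, ay, az):
    """Normalize filtered accelerometer readings into the unit observed
    acceleration vector a_hat^b (gravity direction in the body frame)."""
    a = np.array([ax, ay, az], dtype=float)
    n = np.linalg.norm(a)
    if n == 0.0:
        raise ValueError("zero acceleration vector cannot be normalized")
    return a / n
```

Because the result is normalized, the units of the raw readings cancel out.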
Further, the motion velocity vector $v^n$ in Step 2 is obtained as follows:
Step 2.1: rotate the acceleration vector $a^b$ in the body coordinate system to the navigation coordinate system through the rotation matrix $C_b^n$, and remove the gravity component from the rotated acceleration vector $a^n$, obtaining the motion acceleration vector $a_m^n$ in the navigation coordinate system (i.e. $a_m^n = C_b^n a^b - g^n$);
Step 2.2: integrate the motion acceleration vector $a_m^n$ in the navigation coordinate system to obtain the motion velocity vector $v^n$.
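Steps 2.1 and 2.2 can be sketched as a single velocity-update step. The quaternion convention (n→b, so $C_b^n$ is the transpose of $C_n^b$), the gravity direction along +z of the navigation frame, and the magnitude 9.81 m/s² are assumptions for illustration:

```python
import numpy as np

G = 9.81  # assumed gravity magnitude, m/s^2

def quat_to_rotmat(q):
    """Rotation matrix C_b^n taking body-frame vectors to the navigation
    frame, for quaternion q = [q0, q1, q2, q3] in the n -> b convention
    (we build C_n^b and return its transpose)."""
    q0, q1, q2, q3 = q
    c_nb = np.array([
        [q0**2 + q1**2 - q2**2 - q3**2, 2*(q1*q2 + q0*q3),             2*(q1*q3 - q0*q2)],
        [2*(q1*q2 - q0*q3),             q0**2 - q1**2 + q2**2 - q3**2, 2*(q2*q3 + q0*q1)],
        [2*(q1*q3 + q0*q2),             2*(q2*q3 - q0*q1),             q0**2 - q1**2 - q2**2 + q3**2],
    ])
    return c_nb.T

def velocity_update(v_n, a_b, q, dt):
    """Steps 2.1-2.2: rotate the body-frame acceleration into the navigation
    frame, remove the gravity component, and integrate one time step."""
    a_n = quat_to_rotmat(q) @ np.asarray(a_b, dtype=float)
    a_motion = a_n - np.array([0.0, 0.0, G])   # gravity assumed along +z of n
    return np.asarray(v_n, dtype=float) + a_motion * dt
```

A stationary, level vehicle (identity quaternion, accelerometer reading of one g along +z) yields zero motion acceleration, as expected.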
Further, the unit estimated acceleration vector $\hat{v}^b$ in Step 2 is calculated by:

$$\hat{v}^b = \begin{bmatrix} 2(q_1 q_3 - q_0 q_2) \\ 2(q_0 q_1 + q_2 q_3) \\ q_0^2 - q_1^2 - q_2^2 + q_3^2 \end{bmatrix}$$

where $q_0$, $q_1$, $q_2$, $q_3$ are the real part and the three imaginary parts of the quaternion $q$ representing the rotation from the navigation coordinate system n to the body coordinate system b, i.e. $q = [q_0, q_1, q_2, q_3]$.
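A direct transcription of this formula (function name is an illustrative assumption):

```python
import numpy as np

def unit_estimated_acceleration(q):
    """Unit estimated acceleration vector v_hat^b: the gravity direction that
    the current quaternion q = [q0, q1, q2, q3] (rotation n -> b) predicts in
    the body frame (third row of C_n^b)."""
    q0, q1, q2, q3 = q
    return np.array([
        2.0 * (q1 * q3 - q0 * q2),
        2.0 * (q0 * q1 + q2 * q3),
        q0**2 - q1**2 - q2**2 + q3**2,
    ])
```

For the identity quaternion this gives [0, 0, 1]; after a 90° roll about x it gives [0, 1, 0], i.e. gravity appears along the body y axis.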
Further, the specific method of Step 5 is as follows:
Solve the quaternion differential equation

$$\dot{q} = \frac{1}{2}\, q \otimes \begin{bmatrix} 0 & \mathrm{gyro}_x & \mathrm{gyro}_y & \mathrm{gyro}_z \end{bmatrix}^T$$

with a first-order Runge-Kutta method,

$$q_t = q_{t-1} + T\, \dot{q}_{t-1},$$

and compute the UAV's current attitude angles from the result to estimate its attitude.

Here $q_{t-1}$ is the previous quaternion estimate; $q_t$ is the current quaternion estimate, i.e. the solution of the quaternion differential equation; $T$ is the iteration period of the quaternion complementary filtering algorithm, in seconds; and $\mathrm{gyro}_x$, $\mathrm{gyro}_y$, $\mathrm{gyro}_z$ are the three-axis angular velocities in the body coordinate system b, measured directly by the onboard IMU and then filtered.

The angular velocity compensation is

$$\tilde{\omega} = \omega + \delta\omega$$

where $\omega$ is the currently uncompensated three-axis angular velocity data, $\tilde{\omega}$ the compensated three-axis angular velocity data, and $\delta\omega$ the compensation of the three-axis angular velocity data.
Further, the method also comprises a UAV position estimation method with the following specific steps:
Step 6.1: obtain initial displacement data for the UAV's X, Y and Z axes;
Step 6.2: obtain Z-axis displacement correction data for the UAV, and use it to correct the initial Z-axis displacement data;
Step 6.3: obtain the UAV's horizontal position data in the ground coordinate system through the AprilTags planar array, and use it to correct the initial X- and Y-axis displacement data, completing the position estimation of the UAV.
Further, the UAV's Z-axis displacement correction data is obtained by fusing the measured distance to the ground with the barometric altitude.
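The patent states only that the ground distance and the altitude are fused; a minimal illustrative blend (the weighting scheme, offset handling and parameter values are assumptions, not the disclosed fusion) might look like:

```python
def fuse_altitude(laser_agl, baro_alt, baro_offset, w_laser=0.8):
    """Blend laser above-ground distance with offset-corrected barometric
    altitude into a single Z estimate. w_laser weights the (typically more
    precise) laser reading; all parameters here are illustrative."""
    return w_laser * laser_agl + (1.0 - w_laser) * (baro_alt - baro_offset)
```

In practice the patent's fusion runs through the adaptive Kalman framework described later; this convex blend only illustrates the idea of combining the two height sources.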
Further:
Step 6.3.1: arrange a plurality of AprilTags into a planar array on the ground, each AprilTag having a pixel position and an ID number;
Step 6.3.2: the UAV acquires an image of an AprilTag and computes its own coordinates within the AprilTags array from the tag's pixel position and ID number together with the UAV's current attitude angles; these coordinates are the UAV's horizontal position data in the ground coordinate system.
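A sketch of the tag-to-position lookup. The grid layout (row-major IDs, fixed pitch), the pinhole back-projection and the level-camera simplification are all illustrative assumptions; the patent additionally corrects with the current attitude angles:

```python
TAG_SPACING = 0.5   # assumed grid pitch of the AprilTags array, metres
GRID_COLS = 10      # assumed number of tags per grid row

def tag_world_position(tag_id):
    """Map a tag ID to its (x, y) in the ground frame, assuming row-major
    IDs on a regular grid (this layout is an illustrative assumption)."""
    row, col = divmod(tag_id, GRID_COLS)
    return col * TAG_SPACING, row * TAG_SPACING

def drone_position(tag_id, px_offset, py_offset, altitude, focal_px):
    """Estimate the drone's horizontal ground-frame position from one
    detected tag via pinhole back-projection of the tag's pixel offset
    from the image centre, scaled by height above ground."""
    tx, ty = tag_world_position(tag_id)
    dx = px_offset * altitude / focal_px  # ground offset of tag vs. camera axis
    dy = py_offset * altitude / focal_px
    return tx - dx, ty - dy
```

A tag detected exactly at the image centre places the drone directly above that tag's grid coordinate.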
The invention also discloses a UAV comprising a UAV body and, fixed to it, an inertial sensing unit, an image processing unit and a micro control unit;
the inertial sensing unit's own coordinate system coincides with the body coordinate system; it is connected to the micro control unit, measures the UAV's three-axis angular velocity data and three-axis acceleration data, and sends them to the micro control unit;
the image processing unit is fixed below the UAV body, is connected to the micro control unit, acquires images of AprilTags, and sends them to the micro control unit;
the micro control unit comprises a storage device, a processor, and a computer program stored in the storage device and executable on the processor; the processor executes the program to implement one of the methods above.
Furthermore, the UAV also comprises a laser ranging module and a barometer;
the laser ranging module is fixed below the UAV body, is connected to the micro control unit, measures the distance to the ground, and sends it to the micro control unit;
the barometer is connected to the micro control unit, measures the altitude, and sends it to the micro control unit.
The beneficial effects of the invention are as follows:
Taking fixed-point UAV flight and UAV networking as the background, the UAV flight control circuit board and flight control system are designed and realized according to the basic principles and procedures of visual positioning and attitude control.
In flight tests with fixed-point hovering, the hovering performance was good, fully demonstrating that the designed UAV achieves the expected design and test targets: miniaturization, agile flight, accurate positioning and stable control. Compared with the classical quaternion complementary filtering algorithm, the attitude solution and integrated navigation pose estimation algorithm based on the adaptive Kalman filter proposed and applied in the invention is more robust to sudden disturbances (airflow, electromagnetic interference, large maneuvers) during UAV flight.
Drawings
Fig. 1 is a circuit topology diagram of a processor of the drone of the present invention;
fig. 2 is a circuit topology diagram of the connection between the processor of the unmanned aerial vehicle and the inertial sensing unit MPU9250 according to the present invention;
fig. 3 is a circuit topology diagram of the connection between the processor of the drone and the WIFI module ESP8266 according to the present invention;
fig. 4 is a circuit topology diagram of the connection of the processor of the drone of the present invention to the data storage module EEPROM;
fig. 5 is a circuit topology diagram of the connection of the processor of the drone of the present invention to the laser ranging module TOF10120, the image processing unit OpenMV4, the optical flow sensor PMW3901, and USART3;
fig. 6 is a circuit topology diagram of the connection between the processor of the drone of the present invention and the 2.4G wireless chip NRF24L01;
fig. 7 is a circuit topology diagram of the connection between the processor of the drone of the present invention and the program download interface;
FIG. 8 is a circuit topology diagram of the connection of the processor of the drone of the present invention to a memory Card TF Card;
fig. 9 is a circuit topology diagram of the connection of the processor of the drone of the present invention to the display screen;
fig. 10 is a circuit topology diagram of the connection of the processor and the motor drive of the unmanned aerial vehicle of the present invention;
fig. 11 is a circuit topology diagram of the connection of the processor of the drone of the present invention to the barometer;
fig. 12 is a schematic diagram of a modular structure of the drone of the present invention;
fig. 13 is a schematic diagram of an AprilTags tag planar array in a method for estimating the attitude and position of an unmanned aerial vehicle according to the present invention;
fig. 14 is a schematic diagram of a principle of an image coordinate transformation algorithm in the unmanned aerial vehicle attitude and position estimation method of the present invention.
Detailed Description
PCB fabrication technique of the drone:
Because the drone is small, the PCB routing requirements are strict: the flight control system must fit on a reduced board area, so crossings between components must be minimized as far as possible. A poor PCB layout increases interference from input AC signals, degrading the circuit's output and hence the performance of the whole system.
Targeting these characteristics of the drone, its PCB was designed with care and its layout and routing were optimized.
unmanned aerial vehicle attitude calculation technology
The attitude controller of a quad-rotor drone requires as input the difference between the current attitude angle and the target attitude angle. The inertial sensing unit provides triaxial acceleration and triaxial angular velocity data, and real-time attitude angle data needs to be calculated through an attitude calculation algorithm. Common estimation algorithms for the attitude angle of the unmanned aerial vehicle mainly include a direct integration method and a quaternion complementary filtering algorithm proposed by Madgwick.
Position estimation technology of unmanned aerial vehicle:
The position controller of a quadrotor drone requires as input the difference between the current position and the target position. Estimating the current position requires three-axis acceleration and three-axis attitude angle data, and is done by a position estimation algorithm. In general, the onboard inertial sensing unit can be mounted in one of two ways: platform type or strapdown type.
The drone designed by the invention uses the strapdown configuration, so position estimation correspondingly adopts a strapdown inertial navigation algorithm, combined for observation correction with several position estimation methods including a machine vision positioning algorithm. In the strapdown configuration, the coordinate system of the inertial sensing unit coincides with the body coordinate system, so the platform coordinate system must be defined in software before position estimation is performed; this is the strapdown inertial navigation algorithm. Its mechanics and hardware are simple to implement, but the algorithm itself is complex and computationally heavy.
Positioning technology based on machine vision:
the invention selects the STM32H743VIT6 model MCU as the OpenMV4 module of the micro control unit by the star pupil science and technology company as the running platform of the machine vision positioning algorithm, which is a suitable choice. In order to reduce weight, the light-weight OpenMV4 mini camera is selected as an unmanned aerial vehicle airborne image processing unit, April tags identification technology is combined, an April tags array capable of representing global coordinates is built, and a high-level language Python script is used for realizing an unmanned aerial vehicle navigation coordinate system positioning algorithm based on machine vision.
The circuit board carries:
  • motor drive module U1;
  • inertial sensor U2;
  • barometer U3, which converts the current ambient air pressure into an altitude above ground;
  • U4, which provides sufficient computing power for the drone's attitude solution and control algorithms;
  • 2.4G fixed-frequency antenna U5, used to control the drone's flight;
  • OLED screen U6, which displays the drone's current state;
  • EEPROM memory chip U7, used to store drone data;
  • voltage detection module U8, which monitors the drone's battery voltage;
  • WIFI module U9, which allows controlling the drone over WIFI;
  • SWD debugging interface U10, used to download and debug the drone's program;
  • voltage modules U11 and U12, which provide stable 3.3 V and 5 V supplies to power the drone.
In the drone's pose control and navigation positioning algorithms: an AKF algorithm based on ML estimation processes sensor measurements such as the IMU's; a distributed fusion algorithm achieves optimal information fusion of multiple IMUs; a PDA algorithm achieves fault-tolerant information fusion of multiple rangefinders; a loosely coupled integrated navigation algorithm based on the AKF fuses the INS with the visual positioning system; and a four-loop cascade PID algorithm achieves fixed-point pose control.
As shown in figs. 1 to 12: PC0, PC1, PC2 and PC3 of the processor stm32 are connected to the screen;
PA0 and PA1 of the processor stm32 are connected to TXD and RXD of the laser ranging sensor TOF10120;
PA2 and PA3 of the processor stm32 are connected to TXD and RXD of the optical flow sensor PMW3901;
PB10 and PB11 of the processor stm32 are connected to TXD and RXD of USART3;
PA9 and PA10 of the processor stm32 are connected to TXD and RXD of the image recognition unit OpenMV4;
PA4 of the processor stm32 is connected to the IMU device MPU9250;
PA5, PA6 and PA7 of the processor stm32 are connected to the air pressure sensor MS5611 and the inertial sensing unit MPU9250;
PC4 of the processor stm32 is connected to the air pressure sensor MS5611;
PC5 of the processor stm32 is connected to the voltage detection module ADC;
PB0 and PB1 of the processor stm32 are connected to SDA and SCL of the data storage module EEPROM;
PB12, PB13, PB14, PB15, PC6 and PC7 of the processor stm32 are connected to the 2.4G wireless chip NRF24L01;
PC8, PC9, PC10, PC11, PC12 and PD2 of the processor stm32 are connected to the memory card TF Card;
PB2, PA8, PA11 and PA12 of the processor stm32 are connected to the status lights;
PA13 and PA14 of the processor stm32 are connected to the SWD download interface;
PA15, PB3, PB4, PB5 and PC13 of the processor stm32 are connected to the WIFI module ESP8266;
PB6, PB7, PB8 and PB9 of the processor stm32 are connected to the motor driver;
NRST of the processor stm32 is connected to RESET;
pin 60 is connected to the BOOT LOADER;
the processor stm32 and the memory card TF Card form the micro control unit 1.
The following description is made in connection with the specific design of the unmanned aerial vehicle:
An NMOS driving module U1 is soldered at each of the four corners of the drone body; the gyroscope U2 and barometer U3 sit at the upper left of the central section; the OLED display U6 is on the outer side of the central section, with the micro control unit U4 and WIFI module U9 on its inner side; the 2.4G fixed-frequency antenna is at the lower left of the central section and the SWD download/debug interface U10 at the lower right; the centre also carries, from left to right, the EEPROM module U7, the voltage detection module U8, the 5 V boost circuit U11 and the 3.3 V buck circuit U12. The layout is compact and tidy with high area utilization, greatly reducing the drone's weight, lowering electrical losses and improving its endurance.
Some of the units are described in detail below:
the micro control unit uses an MCU model STM32F405RGT6 from ST corporation, which can provide enough computing power to realize the attitude calculation and control algorithm mentioned herein without excessive performance waste.
The storage unit uses a 24LC02BT-I/OT model EEPROM storage IC of Microchip company, and is responsible for power-down storage of important data, including but not limited to controller related parameters, angular velocity meter zero drift measurement data, accelerometer zero drift measurement data and the like, any general GPIO can be selected without considering a special hardware interface, and the micro package of SOT23-5 is beneficial to improvement of hardware space utilization rate.
The inertial sensing unit adopts an MPU9250 model IMU of Invensense company, so that the communication load is reduced to a great extent.
The auxiliary sensing unit selects an MS5611 type barometer of MEAS company and a TOF10120 laser ranging module of Sharp company, and the length can be complemented by a data fusion algorithm to realize the estimation of height information; laser rangefinder module carries in unmanned aerial vehicle's bottom.
The wireless communication unit adopts a Zigbee-based wireless self-organizing network communication structure, and can realize self-adaptive adjustment capability under the condition of high-speed mobile nodes.
The power management unit selects an LMR62014S model Boost voltage stabilizer of TI company to provide a 5V power supply, and establishes a topological route of the lithium battery pointing to the OpenMV module; a TPS63000.Q1 model Buck. boost buck-boost voltage stabilizer of a TI company is selected to provide a 3.3V power supply, and a topological route of a lithium battery pointing to a relevant unit is established.
The driver layer of the drone is the important bridge between the hardware and software parts. According to each hardware unit's function and interface definition, the invention co-compiles C and assembly and designs a driver for every hardware unit, ensuring the micro control unit can work with the other functional units. The functional layer defines the drone's tasks, mainly sensor data processing, attitude solution, attitude and position control, wireless communication, and memory refreshing. The drone's algorithm design is explained as follows:
For the digital filtering algorithm, adaptive Kalman filtering based on maximum likelihood estimation is applied for denoising. Although a Kalman filter can only fit a linear Gaussian system, its great advantages are a small computational load and a good filtering effect, making it suitable for filtering inertial data sampled at 1 kHz. To increase execution speed, and because the six axes of inertial data have no coupling in theory, each axis is processed by an independent one-dimensional adaptive linear Kalman filter and covariance between axes is ignored.
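One such independent one-dimensional filter can be sketched as follows (the random-walk state model and the Q/R values are illustrative assumptions; the per-axis adaptive noise tuning of the original is omitted here):

```python
class ScalarKalman:
    """Independent one-dimensional linear Kalman filter, one instance per
    inertial axis. State model: random walk (identity transition), so the
    filter acts as an adaptive low-pass denoiser for one sensor channel."""

    def __init__(self, q=1e-4, r=1e-2, x0=0.0, p0=1.0):
        self.q, self.r = q, r      # process / measurement noise variances
        self.x, self.p = x0, p0    # state estimate and its variance

    def update(self, z):
        # predict: identity state transition, variance grows by q
        self.p += self.q
        # correct with the new raw sample z
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x
```

Six instances, one per gyro/accelerometer axis, run side by side with no cross-covariance, matching the decoupling assumption above.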
For the attitude solution algorithm, a quaternion combined attitude estimation method based on adaptive Kalman filtering is adopted: the difference between the previously estimated attitude and the attitude computed from the current accelerometer reading is fed into the Kalman filter, whose output corrects the current three-axis angular velocity; finally, a first-order Runge-Kutta method integrates the corrected angular velocity onto the previous attitude estimate to obtain the current one. The algorithm is complex and computationally heavy, but it exploits the gyroscope's dynamic accuracy and the accelerometer's static accuracy simultaneously, and the maximum-likelihood adaptive Kalman filter can effectively adapt to the vehicle's maneuvering uncertainty and to the propagation of the covariance matrix in a nonlinear system. This guarantees long-term stability and accuracy, making it suitable for the drone.
The adaptive Kalman filtering algorithm:
Before iterating the adaptive Kalman filtering algorithm, the process noise covariance Q must be determined by sampling statistics, complete process and measurement equations must be established from a physical model, and the attenuation factor parameter γ and the number N of historical innovations to use must be chosen manually; γ and N then remain unchanged throughout.
First, confirm the initial values $\hat{x}_{k-1|k-1}$ and $P_{k-1|k-1}$.
Second, confirm the one-step state prediction $\hat{x}_{k|k-1}$ and its associated covariance matrix $P_{k|k-1}$:

$$\hat{x}_{k|k-1} = \Phi_{k-1}\hat{x}_{k-1|k-1}$$

$$P_{k|k-1} = \Phi_{k-1}P_{k-1|k-1}\Phi_{k-1}^{T} + G_{k-1}Q_{k-1}G_{k-1}^{T}$$
Third, confirm the one-step measurement prediction $\hat{z}_{k|k-1}$:

$$\hat{z}_{k|k-1} = H_k\hat{x}_{k|k-1}$$
Fourth, obtain the new measured value $z_k$ and compute its innovation, retaining the innovations of the most recent $N$ measurements $\{v_{k-N+1},\dots,v_k\}$:

$$v_k = z_k - \hat{z}_{k|k-1}$$
Fifth, confirm the maximum-likelihood estimate $\hat{C}_{v,k}$ of the one-step measurement prediction covariance matrix:

$$\hat{C}_{v,k} = \sum_{i=k-N+1}^{k}\xi_i\, v_i v_i^{T}$$

where $\xi_i$ is an exponential decay factor built from the attenuation parameter $\gamma$, normalized over the window of $N$ innovations:

$$\xi_i = \frac{\gamma^{\,k-i}}{\sum_{j=k-N+1}^{k}\gamma^{\,k-j}}$$
Sixth, confirm the maximum-likelihood estimate of the process noise covariance matrix

$$\hat{Q}_k = K_k\hat{C}_{v,k}K_k^{T}$$

and determine the adaptive Kalman filter gain $K_k$:

$$S_k = H_kP_{k|k-1}H_k^{T} + R_k$$

$$K_k = P_{k|k-1}H_k^{T}S_k^{-1}$$
Seventh, confirm the filtered estimate of the system state $\hat{x}_{k|k}$ and its associated covariance matrix $P_{k|k}$:

$$\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k v_k$$

$$P_{k|k} = P_{k|k-1} - K_kH_kP_{k|k-1}$$
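The seven steps above can be sketched as a single iteration loop. This is a hedged illustration rather than the patent's implementation: the exponential weights and the adaptation rule Q_k = K_k C_v K_k^T are common textbook choices that may differ in detail from the patent's formulas, and all names and tuning values are invented.

```python
import numpy as np
from collections import deque

# Sketch of one iteration of an innovation-based adaptive Kalman filter,
# in the spirit of the seven steps above. Assumptions: the weights xi_i are
# a normalized exponential decay with factor gamma over the last N
# innovations, and Q is adapted as K @ C_v @ K.T.

class AdaptiveKF:
    def __init__(self, Phi, G, H, Q0, R, x0, P0, gamma=0.95, N=20):
        self.Phi, self.G, self.H, self.R = Phi, G, H, R
        self.Q, self.x, self.P = Q0, x0, P0
        self.gamma, self.N = gamma, N
        self.innov = deque(maxlen=N)            # history of innovations v_i

    def step(self, z):
        # one-step state prediction and its covariance
        x_pred = self.Phi @ self.x
        P_pred = self.Phi @ self.P @ self.Phi.T + self.G @ self.Q @ self.G.T
        # innovation against the new measurement z_k
        v = z - self.H @ x_pred
        self.innov.append(v)
        # exponentially weighted ML estimate of the innovation covariance
        n = len(self.innov)
        w = np.array([self.gamma ** (n - 1 - i) for i in range(n)])
        w /= w.sum()
        C = sum(wi * np.outer(vi, vi) for wi, vi in zip(w, self.innov))
        # gain, state update and covariance update
        S = self.H @ P_pred @ self.H.T + self.R
        K = P_pred @ self.H.T @ np.linalg.inv(S)
        self.x = x_pred + K @ v
        self.P = P_pred - K @ self.H @ P_pred
        # adapt the process noise from the innovation statistics
        self.Q = K @ C @ K.T
        return self.x
```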
An attitude resolving algorithm based on adaptive Kalman filtering:
firstly, normalize the three-axis acceleration measurement data to obtain the unit observed acceleration vector $\bar{a}^{b}$, i.e. the representation of the gravitational acceleration direction in the body coordinate system:

$$\bar{a}^{b} = \frac{[a_x\ \ a_y\ \ a_z]^{T}}{\sqrt{a_x^{2}+a_y^{2}+a_z^{2}}}$$

wherein $a_x$, $a_y$ and $a_z$ denote the filtered three-axis acceleration data; since the vector is normalized, no particular unit is required.
Secondly, express the gravitational acceleration direction $g^{n}$ of the navigation coordinate system in the body coordinate system to obtain the unit estimated acceleration vector $\hat{v}^{b}$:

$$\hat{v}^{b} = \begin{bmatrix} 2(q_1q_3 - q_0q_2)\\ 2(q_2q_3 + q_0q_1)\\ q_0^{2}-q_1^{2}-q_2^{2}+q_3^{2} \end{bmatrix}$$
Thirdly, compute the cross product of the unit observed vector $\bar{a}^{b}$ and the unit estimated vector $\hat{v}^{b}$ in the body coordinate system to obtain the error vector $e^{b}$:

$$e^{b} = \bar{a}^{b} \times \hat{v}^{b}$$
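Steps one to three can be sketched in a few lines, assuming the standard quaternion expression for the body-frame gravity direction; the function name and its inputs are illustrative only.

```python
import numpy as np

# Sketch of steps one to three: normalise the accelerometer reading,
# express the navigation-frame gravity direction in the body frame via the
# quaternion q = (q0, q1, q2, q3), and form the cross-product error vector.

def gravity_error(acc_xyz, q):
    q0, q1, q2, q3 = q
    # unit observed acceleration (gravity direction seen by the accelerometer)
    a = np.asarray(acc_xyz, dtype=float)
    a = a / np.linalg.norm(a)
    # unit estimated gravity direction in the body frame
    v = np.array([2.0 * (q1 * q3 - q0 * q2),
                  2.0 * (q2 * q3 + q0 * q1),
                  q0 ** 2 - q1 ** 2 - q2 ** 2 + q3 ** 2])
    # error vector: cross product of observed and estimated directions
    return np.cross(a, v)
```

For a level, stationary vehicle (identity quaternion, accelerometer reading along +Z) the two directions coincide and the error vector vanishes.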
And fourthly, take the error vector as the input of the adaptive Kalman filter, and use the output $\delta\omega^{b}$ of the adaptive Kalman filter as the compensation of the three-axis angular velocity data:

$$\omega'^{b} = \omega^{b} + \delta\omega^{b}$$

wherein $\omega^{b}$ denotes the current uncompensated three-axis angular velocity data and $\omega'^{b}$ denotes the compensated three-axis angular velocity data.
And fifthly, solve the quaternion differential equation with a first-order Runge-Kutta method to obtain the estimated attitude of the unmanned aerial vehicle, i.e. the rotation relation between the navigation coordinate system n and the body coordinate system b, expressed by the normalized quaternion q; the quaternion $q_t$ at time t is then computed iteratively from $q_{t-1}$ at time t-1:

$$q_t = q_{t-1} + \frac{T}{2}\,q_{t-1}\otimes\begin{bmatrix}0\\ \omega'^{b}\end{bmatrix}$$

wherein $q_{t-1}$ is the last quaternion estimate, $q_t$ is the current quaternion estimate, and T is the iteration period of the quaternion complementary filtering algorithm, in seconds.
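The fifth step can be sketched as a single Euler (first-order Runge-Kutta) quaternion update; the function name is illustrative, and the final renormalisation is added here to keep q a unit quaternion.

```python
import numpy as np

# Sketch of step five: integrate the compensated body rates into the
# quaternion with one first-order Runge-Kutta (Euler) step of period T
# seconds, i.e. q_t = q_{t-1} + (T/2) * Omega(w) * q_{t-1}.

def quat_update(q, w, T):
    q0, q1, q2, q3 = q
    wx, wy, wz = w              # compensated three-axis angular velocity, rad/s
    # quaternion derivative q_dot = 0.5 * q (x) (0, w)
    dq = 0.5 * np.array([-q1 * wx - q2 * wy - q3 * wz,
                          q0 * wx + q2 * wz - q3 * wy,
                          q0 * wy - q1 * wz + q3 * wx,
                          q0 * wz + q1 * wy - q2 * wx])
    q_new = np.asarray(q, dtype=float) + T * dq
    return q_new / np.linalg.norm(q_new)   # keep the quaternion normalised
```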
Strapdown inertial navigation: the acceleration vector $a^{b}$ measured directly by the IMU device in the body coordinate system is rotated into the navigation coordinate system through the rotation matrix $C_b^{n}$, and the gravitational acceleration component is removed to obtain the motion acceleration vector $a^{n}$ in the navigation coordinate system. By integrating the motion acceleration vector $a^{n}$, the motion velocity vector $v^{n}$ and the displacement vector $p^{n}$ in the navigation coordinate system are obtained.
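The strapdown chain can be sketched as one integration step. Assumptions not stated in the original: gravity appears as (0, 0, g) on the rotated specific force, and all function names are illustrative.

```python
import numpy as np

# Sketch of the strapdown step: rotate the body-frame specific force into
# the navigation frame with the quaternion-derived matrix C_b^n, remove
# gravity, then integrate once for velocity and once more for displacement.

G = 9.81  # gravitational acceleration, m/s^2 (assumed convention)

def rot_matrix(q):
    """Rotation matrix C_b^n from quaternion q = (q0, q1, q2, q3)."""
    q0, q1, q2, q3 = q
    return np.array([
        [q0*q0 + q1*q1 - q2*q2 - q3*q3, 2*(q1*q2 - q0*q3), 2*(q1*q3 + q0*q2)],
        [2*(q1*q2 + q0*q3), q0*q0 - q1*q1 + q2*q2 - q3*q3, 2*(q2*q3 - q0*q1)],
        [2*(q1*q3 - q0*q2), 2*(q2*q3 + q0*q1), q0*q0 - q1*q1 - q2*q2 + q3*q3]])

def strapdown_step(q, acc_b, vel_n, pos_n, dt):
    acc_n = rot_matrix(q) @ np.asarray(acc_b, dtype=float)
    acc_n[2] -= G                       # remove the gravity component
    vel_n = vel_n + acc_n * dt          # velocity in the navigation frame
    pos_n = pos_n + vel_n * dt          # displacement in the navigation frame
    return vel_n, pos_n
```

In a perfect hover (level attitude, specific force exactly g along Z) both integrals stay at zero; in practice zero drift accumulates, which is why the auxiliary sensors below are needed.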
Auxiliary sensing and positioning: the motion velocity and displacement data produced by the inertial navigation algorithm running on the inertial sensing unit have good dynamic response characteristics, but the inertial sensing unit has an unavoidable zero-drift problem, so the resulting data drift, especially the twice-integrated displacement data. The TOF10120 laser ranging module and the MS5611 barometer provide distance-to-ground and altitude raw data respectively, from which fairly accurate Z-axis displacement data of the unmanned aerial vehicle are obtained and used as calibration values to correct the Z-axis result of strapdown inertial navigation. The PWM3901 optical flow sensor and the onboard OpenMV obtain accurate XY-axis displacement data of the unmanned aerial vehicle by processing images of the ground laid out according to a known rule, and these are used as calibration values to correct the XY-axis result of strapdown inertial navigation; the onboard OpenMV provides a function library for detecting the ID, position, deflection angle and other data of AprilTags in an image. By combining an AprilTags planar array laid out according to a followable rule with a coordinate transformation algorithm, the horizontal position data of the unmanned aerial vehicle in the ground coordinate system can be obtained from the tag coordinates in the image coordinate system.
Integrated navigation and positioning algorithm based on adaptive Kalman filtering: the invention adopts an integrated navigation and positioning algorithm based on adaptive Kalman filtering, which uses the position data obtained by the auxiliary sensing unit to correct the acceleration, velocity and position data of strapdown inertial navigation, and combines a kinematic model of the unmanned aerial vehicle with six degrees of freedom to realize optimal estimation of the pose of the unmanned aerial vehicle.
Positioning algorithm based on machine vision: to reduce weight, the invention selects a lightweight OpenMV4 mini camera as the onboard image processing unit of the unmanned aerial vehicle, combines AprilTags recognition technology, builds an AprilTags array that can express global coordinates, and implements a machine-vision-based positioning algorithm in the navigation coordinate system of the unmanned aerial vehicle through a high-level Python script.
AprilTags tag positioning: OpenMV4 can quickly identify AprilTags in pictures and provide the tag ID number and the tag position data in the image coordinate system. A number of AprilTags are arranged at equal intervals into a two-dimensional array according to a rule, and the coordinates of the unmanned aerial vehicle body within the AprilTags array can be computed directly from the current attitude angle of the unmanned aerial vehicle together with the pixel position and ID number of any AprilTag in the field of view provided by OpenMV4. The AprilTags array is shown in fig. 13. The rectangular coordinate system established by the AprilTags array is taken as the navigation coordinate system, so the position of the unmanned aerial vehicle can be observed. The position estimate of strapdown inertial navigation is then calibrated through a third-order fusion complementary filtering algorithm, yielding more reliable position data.
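The calibration of the strapdown position by the vision fix can be illustrated with a simplified first-order complementary blend; the patent uses a third-order fusion complementary filter, so this sketch only conveys the idea, and the weight alpha is an invented placeholder.

```python
# Simplified first-order complementary blend (not the patent's third-order
# filter): the fast but drifting strapdown position is pulled toward the
# slow but absolute AprilTags position fix.

def complementary_correct(pos_ins, pos_vision, alpha=0.98):
    # alpha weights the inertial estimate; (1 - alpha) applies the vision
    # fix, which bounds the drift of the double-integrated displacement
    return [alpha * pi + (1.0 - alpha) * pv
            for pi, pv in zip(pos_ins, pos_vision)]
```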
Image coordinate conversion algorithm: the onboard OpenMV4 camera is fixed relative to the airframe, with the camera pointing in the opposite direction of the Z axis of the body coordinate system, so the negative body-frame Z axis expresses the direction vector of the camera. The rotation matrix describing the rotation from the body-frame Z axis (as represented in the world coordinate system) to the navigation-frame Z axis (as represented in the world coordinate system) is applied to the image center coordinates to obtain the converted position coordinates. However, the image center coordinates must themselves be computed from the position of the tag in the image and the current attitude and height of the unmanned aerial vehicle. The positional relationship among the image center, the tag center and the unmanned aerial vehicle center is shown in fig. 14, where $OX_sY_sZ_s$ denotes a coordinate system centered on the airframe and parallel to the navigation coordinate system, hereinafter referred to as the s system, and $OX_bY_bZ_b$ denotes the body coordinate system, hereinafter referred to as the b system. The α plane denotes the AprilTags array plane, and the β plane denotes the image plane, whose normal is the Z axis of the b system. The point T is the position of the AprilTags tag in the image on the α plane, the point $T_i$ is its position on the β plane, and $T_iC$ is perpendicular to the intersection line of the α and β planes. A and B are the projection points of the image center and of the unmanned aerial vehicle body in the s system. PP' and PP'' describe the position transformation relation of the projection points A and B in the s system.
Let $\theta_T$ denote the angle between the first pair of vectors shown in fig. 14 and $\gamma_T$ the angle between the second pair, with the positive rotation direction defined by the right-hand screw rule. The current position $p^{n}$ of the unmanned aerial vehicle is then determined by two successive coordinate-system rotations: from the pixel coordinate system (a right-handed coordinate system whose XOY plane is the plane of the AprilTags image acquired by the image processing unit) to the body coordinate system, and then from the body coordinate system to the navigation coordinate system. The resulting expression is written in terms of $c_{(*)}$ and $s_{(*)}$, which denote $\cos(*)$ and $\sin(*)$ respectively, and of $t_{x,n}$ and $t_{y,n}$, the XY-axis coordinates of the detected tag in the navigation coordinate system defined by the AprilTags array.
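The overall pixel-to-navigation conversion can be illustrated under simplifying assumptions: a downward-looking pinhole camera with focal length f_px in pixels, near-level flight so that only the yaw angle matters, and a tag whose navigation-frame coordinates t_xn, t_yn are known from its ID. This is not the patent's exact two-rotation formula, only a sketch of the idea.

```python
import math

# Hedged sketch of recovering the drone's horizontal navigation-frame
# position from one detected tag. Assumptions (not from the patent):
# pinhole camera model, small roll/pitch, flat AprilTags plane.

def drone_position(tag_px, tag_py, cx, cy, f_px, height, yaw, t_xn, t_yn):
    # pixel offset of the tag from the image centre -> metres on the ground
    dx = (tag_px - cx) * height / f_px
    dy = (tag_py - cy) * height / f_px
    # rotate the camera-frame offset into the navigation frame by the yaw
    c, s = math.cos(yaw), math.sin(yaw)
    off_xn = c * dx - s * dy
    off_yn = s * dx + c * dy
    # the drone sits at the known tag position minus the offset to the tag
    return t_xn - off_xn, t_yn - off_yn
```

When the tag sits exactly at the image centre, the offset is zero and the drone's horizontal position equals the tag's navigation-frame position, regardless of yaw.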

Claims (10)

1. A method for estimating the attitude and position of an unmanned aerial vehicle, characterized by comprising the following steps:

step one, normalizing the three-axis acceleration measurement data of the unmanned aerial vehicle to obtain a unit observed acceleration vector $\bar{a}^{b}$, the unit observed acceleration vector $\bar{a}^{b}$ being the representation of the gravitational acceleration direction in the body coordinate system;
step two, expressing the gravitational acceleration direction of the navigation coordinate system in the body coordinate system to obtain a unit estimated acceleration vector $\hat{v}^{b}$;

step three, computing the cross product of the unit observed vector $\bar{a}^{b}$ and the unit estimated acceleration vector $\hat{v}^{b}$ in the body coordinate system to obtain an error vector $e^{b}$;

step four, taking the error vector $e^{b}$ as the input of an adaptive Kalman filter, obtaining the output of the adaptive Kalman filter as the compensation of the three-axis angular velocity data, and obtaining the compensated three-axis angular velocity data from the compensation and the uncompensated three-axis angular velocity data;
step five, estimating the attitude of the unmanned aerial vehicle from the compensated three-axis angular velocity data.
2. The unmanned aerial vehicle attitude and position estimation method according to claim 1, characterized in that the unit observed acceleration vector $\bar{a}^{b}$ in step one is calculated by the following formula:

$$\bar{a}^{b} = \frac{[a_x\ \ a_y\ \ a_z]^{T}}{\sqrt{a_x^{2}+a_y^{2}+a_z^{2}}}$$

wherein $a_x$, $a_y$ and $a_z$ represent the filtered three-axis acceleration measurement data of the unmanned aerial vehicle, obtained by measurement with an inertial sensing unit carried on the unmanned aerial vehicle.
3. The unmanned aerial vehicle attitude and position estimation method according to claim 2, characterized in that the motion velocity vector $v^{n}$ of the unmanned aerial vehicle in the navigation coordinate system is obtained by the following specific steps:

step two-one, rotating the acceleration vector $a^{b}$ in the body coordinate system into the navigation coordinate system through the rotation matrix $C_b^{n}$, and removing the gravitational acceleration component in the navigation coordinate system from the rotated acceleration vector to obtain the motion acceleration vector $a^{n}$ in the navigation coordinate system;

step two-two, integrating the motion acceleration vector $a^{n}$ in the navigation coordinate system to obtain the motion velocity vector $v^{n}$ in the navigation coordinate system.
4. The unmanned aerial vehicle attitude and position estimation method according to claim 3, characterized in that the unit estimated acceleration vector $\hat{v}^{b}$ in step two is calculated by the following formula:

$$\hat{v}^{b} = \begin{bmatrix} 2(q_1q_3 - q_0q_2)\\ 2(q_2q_3 + q_0q_1)\\ q_0^{2}-q_1^{2}-q_2^{2}+q_3^{2} \end{bmatrix}$$

wherein $q_0$, $q_1$, $q_2$ and $q_3$ respectively represent the real part and the three imaginary parts of the quaternion q used to represent the rotation from the navigation coordinate system to the body coordinate system.
5. The unmanned aerial vehicle attitude and position estimation method according to claim 3, characterized in that the specific method of step five is as follows:

solving the quaternion differential equation

$$q_t = q_{t-1} + \frac{T}{2}\,q_{t-1}\otimes\begin{bmatrix}0\\ \omega'^{b}\end{bmatrix}$$

and calculating the current attitude angle of the unmanned aerial vehicle by a first-order Runge-Kutta method to estimate the attitude of the unmanned aerial vehicle;

wherein $q_{t-1}$ is the last quaternion estimate, $q_t$ is the current quaternion estimate, i.e. the solution of the quaternion differential equation, and T is the iteration period of the quaternion complementary filtering algorithm, in seconds;

$$\omega'^{b} = \omega^{b} + \delta\omega^{b}$$

wherein $\omega^{b}$ represents the current uncompensated three-axis angular velocity data, $\omega'^{b}$ represents the compensated three-axis angular velocity data, and $\delta\omega^{b}$ represents the compensation of the three-axis angular velocity data.
6. The unmanned aerial vehicle attitude and position estimation method according to claim 1, characterized by further comprising an unmanned aerial vehicle position estimation method, which comprises the following specific steps:

step six-one, obtaining initial displacement data of the X axis, Y axis and Z axis of the unmanned aerial vehicle;

step six-two, obtaining Z-axis displacement correction data of the unmanned aerial vehicle, and correcting the initial Z-axis displacement data with the Z-axis displacement correction data;

step six-three, obtaining horizontal position data of the unmanned aerial vehicle in the ground coordinate system through an AprilTags planar tag array, and correcting the initial X-axis and Y-axis displacement data with the horizontal position data to complete the position estimation of the unmanned aerial vehicle.
7. The unmanned aerial vehicle attitude and position estimation method according to claim 6, characterized in that the Z-axis displacement correction data of the unmanned aerial vehicle are obtained by data fusion of the distance to the ground and the altitude.
8. The unmanned aerial vehicle attitude and position estimation method according to claim 6, characterized in that:

step six-three-one, a plurality of AprilTags are arranged on the ground into an AprilTags planar tag array, each AprilTag having a pixel position and an ID number;

step six-three-two, the unmanned aerial vehicle acquires an image of an AprilTag, and the coordinates of the unmanned aerial vehicle within the AprilTags array, namely the horizontal position data of the unmanned aerial vehicle in the ground coordinate system, are calculated from the pixel position and ID number of the AprilTag together with the current attitude angle of the unmanned aerial vehicle.
9. An unmanned aerial vehicle, characterized by comprising an unmanned aerial vehicle body, and an inertial sensing unit (1), an image processing unit (2) and a micro control unit (3) fixed on the unmanned aerial vehicle body;

the coordinate system of the inertial sensing unit (1) coincides with the body coordinate system, and the inertial sensing unit (1) is connected with the micro control unit (3) and is used for measuring the three-axis angular velocity data and three-axis acceleration data of the unmanned aerial vehicle and sending them to the micro control unit (3);

the image processing unit (2) is fixed below the unmanned aerial vehicle body, and the image processing unit (2) is connected with the micro control unit (3) and is used for acquiring an image of an AprilTag and sending it to the micro control unit (3);
the micro control unit (3) comprises a memory device, a processor and a computer program stored in the memory device and executable on the processor, the processor executing the computer program to implement the method of one of claims 1 to 8.
10. A drone according to claim 9, characterised by further comprising a laser ranging module (4) and a barometer (5);
the laser ranging module (4) is fixed below the unmanned aerial vehicle body, and the laser ranging module (4) is connected with the micro control unit (3) and used for measuring the distance to the ground and sending the distance to the micro control unit (3);
the barometer (5) is connected with the micro control unit (3) and used for measuring the altitude and sending the altitude to the micro control unit (3).
CN202110584179.XA 2021-05-27 2021-05-27 Unmanned aerial vehicle attitude and position estimation method and unmanned aerial vehicle Active CN113237478B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110584179.XA CN113237478B (en) 2021-05-27 2021-05-27 Unmanned aerial vehicle attitude and position estimation method and unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110584179.XA CN113237478B (en) 2021-05-27 2021-05-27 Unmanned aerial vehicle attitude and position estimation method and unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN113237478A true CN113237478A (en) 2021-08-10
CN113237478B CN113237478B (en) 2022-10-14

Family

ID=77139108

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110584179.XA Active CN113237478B (en) 2021-05-27 2021-05-27 Unmanned aerial vehicle attitude and position estimation method and unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN113237478B (en)


Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105136145A (en) * 2015-08-11 2015-12-09 哈尔滨工业大学 Kalman filtering based quadrotor unmanned aerial vehicle attitude data fusion method
CN106197422A (en) * 2016-06-27 2016-12-07 东南大学 A kind of unmanned plane based on two-dimensional tag location and method for tracking target
CN107139178A (en) * 2017-05-10 2017-09-08 哈尔滨工业大学深圳研究生院 A kind of grasping means of unmanned plane and its view-based access control model
CN108007474A (en) * 2017-08-31 2018-05-08 哈尔滨工业大学 A kind of unmanned vehicle independent positioning and pose alignment technique based on land marking
CN108225370A (en) * 2017-12-15 2018-06-29 路军 A kind of data fusion and calculation method of athletic posture sensor
CN108592917A (en) * 2018-04-25 2018-09-28 珠海全志科技股份有限公司 A kind of Kalman filtering Attitude estimation method based on misalignment
CN109724602A (en) * 2018-12-17 2019-05-07 南京理工大学 A kind of attitude algorithm system and its calculation method based on hardware FPU
CN109916407A (en) * 2019-02-03 2019-06-21 河南科技大学 Indoor mobile robot combined positioning method based on adaptive Kalman filter
CN109931926A (en) * 2019-04-04 2019-06-25 山东智翼航空科技有限公司 Unmanned aerial vehicle seamless autonomous navigation algorithm based on station center coordinate system
CN110017837A (en) * 2019-04-26 2019-07-16 沈阳航空航天大学 A kind of Combinated navigation method of the diamagnetic interference of posture
CN110081878A (en) * 2019-05-17 2019-08-02 东北大学 A kind of posture and location determining method of multi-rotor unmanned aerial vehicle
CN110375773A (en) * 2019-07-29 2019-10-25 兰州交通大学 MEMS inertial navigation system posture initial method
CN110702107A (en) * 2019-10-22 2020-01-17 北京维盛泰科科技有限公司 Monocular vision inertial combination positioning navigation method
CN110793515A (en) * 2018-08-02 2020-02-14 哈尔滨工业大学 Unmanned aerial vehicle attitude estimation method based on single-antenna GPS and IMU under large-mobility condition
CN111076722A (en) * 2019-11-18 2020-04-28 广州南方卫星导航仪器有限公司 Attitude estimation method and device based on self-adaptive quaternion
CN111197984A (en) * 2020-01-15 2020-05-26 重庆邮电大学 Vision-inertial motion estimation method based on environmental constraint
US20200218288A1 (en) * 2017-02-24 2020-07-09 Flir Detection, Inc. Control systems for unmanned aerial vehicles
CN111831962A (en) * 2020-07-14 2020-10-27 河北科技大学 Four-rotor unmanned aerial vehicle attitude calculation method and device and terminal equipment
CN112013836A (en) * 2020-08-14 2020-12-01 北京航空航天大学 Attitude heading reference system algorithm based on improved adaptive Kalman filtering
CN112630813A (en) * 2020-11-24 2021-04-09 中国人民解放军国防科技大学 Unmanned aerial vehicle attitude measurement method based on strapdown inertial navigation and Beidou satellite navigation system


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
JIAPENG LI等: "A Multi-source Fused Location Estimation Method for UAV Based on Machine Vision and Strapdown Inertial Navigation", 《ARTIFICIAL INTELLIGENCE FOR COMMUNICATIONS AND NETWORKS. SECOND EAI INTERNATIONAL CONFERENCE, AICON 2020》 *
PENG XIE等: "The Attitude Adjustment of Multi-rotor Unmanned Aerial Vehicles Based on Information Fusion Analysis", 《2019 INTERNATIONAL CONFERENCE ON SMART GRID AND ELECTRICAL AUTOMATION (ICSGEA)》 *
REN JIANQIU et al.: "Design of a multi-sensor quadrotor attitude control system based on STM32", 《APPLICATION OF ELECTRONIC TECHNIQUE》 *
LUO YANG: "Research on a UAV positioning and navigation system for building inspection", 《CHINA MASTERS' THESES FULL-TEXT DATABASE, ENGINEERING SCIENCE AND TECHNOLOGY II》 *
ZHENG ZHONG et al.: "Robust adaptive control of UAV formation under time-varying communication delays", 《JOURNAL OF CHINESE INERTIAL TECHNOLOGY》 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114007047A (en) * 2021-11-02 2022-02-01 广西电网有限责任公司贺州供电局 Power transformation field operation real-time monitoring and alarming system based on machine vision
CN114007047B (en) * 2021-11-02 2024-03-15 广西电网有限责任公司贺州供电局 Machine vision-based real-time monitoring and alarming system for operation of power transformation site
CN116298882A (en) * 2023-05-18 2023-06-23 深圳市好盈科技股份有限公司 Target unmanned aerial vehicle motor demagnetization detection method and device

Also Published As

Publication number Publication date
CN113237478B (en) 2022-10-14

Similar Documents

Publication Publication Date Title
US10732647B2 (en) Multi-sensor fusion for robust autonomous flight in indoor and outdoor environments with a rotorcraft micro-aerial vehicle (MAV)
US11279045B2 (en) Robot pose estimation method and apparatus and robot using the same
Achtelik et al. Autonomous navigation and exploration of a quadrotor helicopter in GPS-denied indoor environments
EP3388786A1 (en) Autonomous positioning and navigation device, positioning and navigation method and autonomous positioning and navigation system
EP1901153A1 (en) Control system for unmanned 4-rotor-helicopter
CN113237478B (en) Unmanned aerial vehicle attitude and position estimation method and unmanned aerial vehicle
Dijkshoorn Simultaneous localization and mapping with the AR.Drone
Qi et al. Autonomous landing solution of low-cost quadrotor on a moving platform
CN104280022A (en) Digital helmet display device tracking system of visual-aided inertial measuring unit
WO2018182524A1 (en) Real time robust localization via visual inertial odometry
CN114088087B (en) High-reliability high-precision navigation positioning method and system under unmanned aerial vehicle GPS-DENIED
Unicomb et al. Distance function based 6dof localization for unmanned aerial vehicles in gps denied environments
Riether Agile quadrotor maneuvering using tensor-decomposition-based globally optimal control and onboard visual-inertial estimation
Angeletti et al. Autonomous indoor hovering with a quadrotor
CN116952229A (en) Unmanned aerial vehicle positioning method, device, system and storage medium
Leng et al. An improved method for odometry estimation based on EKF and Temporal Convolutional Network
Hafner et al. An autonomous flying robot for testing bio-inspired navigation strategies
Luo et al. An autonomous helicopter with vision based navigation
Wu et al. Integrated navigation algorithm based on vision-inertial extended Kalman filter for low-cost unmanned aerial vehicle
Hou et al. Visual Inertial Navigation Optimization Method Based on Landmark Recognition
Ahmadinejad et al. A low-cost vision-based tracking system for position control of quadrotor robots
CN117408084B (en) Enhanced Kalman filtering method and system for unmanned aerial vehicle track prediction
Singhania et al. Autonomous navigation of a multirotor using visual odometry and dynamic obstacle avoidance
Wang et al. Real-time visual odometry for autonomous MAV navigation using RGB-D camera
Joberg Multirotor pickup of object in the sea

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant