CN114184194A - Unmanned aerial vehicle autonomous navigation positioning method in rejection environment - Google Patents

Unmanned aerial vehicle autonomous navigation positioning method in rejection environment

Info

Publication number
CN114184194A
CN114184194A (application CN202111445139.3A)
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
speed
data
coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111445139.3A
Other languages
Chinese (zh)
Inventor
何中翔
张礼
李鹏程
刘博
李津
袁双
王淑君
唐蔚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CETC 29 Research Institute
Original Assignee
CETC 29 Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CETC 29 Research Institute filed Critical CETC 29 Research Institute
Priority to CN202111445139.3A priority Critical patent/CN114184194A/en
Publication of CN114184194A publication Critical patent/CN114184194A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1656 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

The invention provides an unmanned aerial vehicle autonomous navigation positioning method in a rejection environment, which comprises the following processes: step 1, judging whether the unmanned aerial vehicle is currently in a static or uniform motion state; if so, entering step 2; otherwise, entering step 3; step 2, updating the initial attitude at regular intervals by extended Kalman filtering according to the data output by the accelerometer and the magnetometer, and entering step 4; step 3, performing inertial navigation strapdown calculation with the gyroscope data on the basis of the last updated attitude to acquire attitude information until the unmanned aerial vehicle returns to a static or uniform state, and entering step 4; step 4, outputting the updated attitude information of the unmanned aerial vehicle; and step 5, calculating the latest position, height and speed information through extended Kalman filtering fusion according to the output data of the optical flow sensor, the barometer and the gyroscope. The scheme provided by the invention enables the unmanned aerial vehicle to realize autonomous navigation and positioning in a rejection environment.

Description

Unmanned aerial vehicle autonomous navigation positioning method in rejection environment
Technical Field
The invention relates to the technical field of navigation and positioning, in particular to an unmanned aerial vehicle autonomous navigation and positioning method in a rejection environment.
Background
Positioning gives the position of the drone in a map or coordinate system. Navigation continuously provides, in real time, all or part of the motion information of the drone in a map or coordinate system, such as position, velocity, acceleration, attitude angular velocity and attitude angular acceleration. At present, the navigation positioning systems used on small unmanned aerial vehicles mainly comprise the inertial navigation system (INS), the Global Positioning System (GPS) and GPS/INS integrated navigation. From the angular rate and acceleration measured by an INS, motion parameters such as the position, speed and attitude of the unmanned aerial vehicle can be obtained by integration, but the INS accumulates errors over time, so its navigation accuracy degrades over long periods of operation. GPS and GPS/INS integrated navigation need to receive satellite navigation signals; when the unmanned aerial vehicle enters a satellite navigation signal rejection (denied) environment, achieving accurate navigation and positioning over a long time becomes a difficult problem.
In the prior art, autonomous navigation and positioning are generally realized by a loose combination of visual navigation and inertial navigation, which is mainly applied to platforms moving in two dimensions, such as vehicle-mounted navigation, and is not suitable for a platform moving in three dimensions such as an unmanned aerial vehicle. For example, patent application CN201710269515.5 discloses a loosely coupled visual/inertial navigation method that periodically uses the position and speed information of visual navigation to correct the inertial navigation parameters, thereby suppressing the accumulation of inertial navigation errors over time. That patent mainly performs autonomous navigation and positioning for a mobile robot; because the robot moves in a plane, changes of attitude need not be considered, so the method is effective there, but it is not applicable to unmanned aerial vehicle flight, where both attitude and position can change at any time.
Disclosure of Invention
Aiming at the problems in the prior art, an unmanned aerial vehicle autonomous navigation positioning method in a rejection environment is provided. Based on an INS/optical flow/magnetometer/barometer integrated navigation scheme using extended Kalman filtering, sensor data such as those of the INS, the optical flow sensor and the magnetometer are fused to estimate the speed, position and attitude information of the unmanned aerial vehicle, so that the unmanned aerial vehicle can realize autonomous navigation and positioning with its own sensors in a rejection environment.
The technical scheme adopted by the invention is as follows: an unmanned aerial vehicle autonomous navigation positioning method in a rejection environment is characterized by comprising the following processes:
step 1, when the unmanned aerial vehicle enters a rejection environment, acquiring initial position, speed and attitude information of the unmanned aerial vehicle through an inertial navigation system; and acquiring data output by an accelerometer, a magnetometer and a gyroscope of the unmanned aerial vehicle in real time for the subsequent updating of the initial position, speed and attitude information;
step 2, judging whether the unmanned aerial vehicle is in a static or uniform motion state at present, if so, entering step 3; otherwise, entering the step 4;
step 3, updating initial attitude information at regular time by using extended Kalman filtering according to data output by the accelerometer and the magnetometer, and entering step 5;
step 4, performing inertial navigation strapdown calculation by using gyroscope data on the basis of the attitude updated last time to acquire attitude information until the unmanned aerial vehicle returns to a static or uniform state, and entering step 5;
step 5, outputting the updated attitude information of the unmanned aerial vehicle;
and 6, fusing the output data of the optical flow sensor, the barometer, the accelerometer and the gyroscope of the unmanned aerial vehicle through extended Kalman filtering to solve for the latest position, height and speed information.
Further, the specific substeps of step 3 are:
step 3.1, acquiring the current roll angle and pitch angle of the unmanned aerial vehicle through the accelerometer output data;
step 3.2, acquiring the current course angle of the unmanned aerial vehicle according to the output data of the magnetometer and the current roll angle and pitch angle of the unmanned aerial vehicle;
and 3.3, resolving by using the current roll angle, pitch angle and course angle of the unmanned aerial vehicle through extended Kalman filtering, and updating the attitude acquired by the inertial navigation system according to the resolving result.
Further, the roll angle and pitch angle obtaining method in step 3.1 is as follows:

$$\begin{bmatrix} f_x^b \\ f_y^b \\ f_z^b \end{bmatrix} = C_n^b \begin{bmatrix} 0 \\ 0 \\ g \end{bmatrix} = \begin{bmatrix} -g\sin\gamma\cos\theta \\ g\sin\theta \\ g\cos\gamma\cos\theta \end{bmatrix}, \qquad \theta = \arcsin\left(\frac{f_y^b}{g}\right), \quad \gamma = -\arctan\left(\frac{f_x^b}{f_z^b}\right)$$

wherein [f_x^b, f_y^b, f_z^b]^T is the measured value of the accelerometer when the unmanned aerial vehicle is in a static state or a uniform motion state, and γ and θ are respectively the roll angle and the pitch angle of the unmanned aerial vehicle.
Further, the course angle obtaining method in step 3.2 is as follows:

$$\begin{bmatrix} M_x^b \\ M_y^b \\ M_z^b \end{bmatrix} = C_n^b(\gamma, \theta, \phi) \begin{bmatrix} M_x^n \\ M_y^n \\ M_z^n \end{bmatrix}$$

wherein [M_x^n, M_y^n, M_z^n]^T is the geomagnetic intensity in the geographic coordinate system, obtained by table lookup according to the known general geographic position of the unmanned aerial vehicle; [M_x^b, M_y^b, M_z^b]^T is the geomagnetic intensity given by the magnetometer on the unmanned aerial vehicle; and φ, the heading angle of the unmanned aerial vehicle, is solved from this relation with the known roll angle γ and pitch angle θ.
Further, step 6 specifically comprises: fusing the data of the barometer, the accelerometer and the gyroscope with the data of the optical flow sensor and the ultrasonic sensor by extended Kalman filtering, selecting the speed and position information of the unmanned aerial vehicle in the navigation coordinate system as state quantities, and using the outputs of the optical flow sensor, the ultrasonic sensor and the barometer of the unmanned aerial vehicle as observed quantities to estimate the position, height and speed information of the unmanned aerial vehicle.
Further, the step 6 comprises the following substeps:
step 61, projecting the three-dimensional motion onto the two-dimensional image plane of the camera by using the pinhole model of the optical flow estimation method, and acquiring the coordinates on the imaging plane of the camera and the component expressions of the optical flow vector in the three directions x, y and z;
step 62, solving the component expressions of the optical flow vector based on the data output by the barometer, the ultrasonic sensor and the gyroscope to obtain the average speed of the unmanned aerial vehicle in the camera coordinate system;
step 63, converting the average speed in the camera coordinate system into the speed of the unmanned aerial vehicle in the geographic coordinate system through the coordinate system transformation matrix;
and step 64, integrating the speed in the geographic coordinate system to obtain the position information of the unmanned aerial vehicle in the geographic coordinate system.
Compared with the prior art, the beneficial effects of the technical scheme are as follows: the scheme fuses the data of the INS, optical flow sensor, magnetometer, barometer and other sensors carried by the unmanned aerial vehicle to estimate its position, height, speed and attitude, so that in a satellite navigation signal rejection environment the unmanned aerial vehicle can still navigate and position itself accurately and autonomously over a long period using only its onboard sensors.
Drawings
Fig. 1 is a flowchart of an autonomous navigation positioning method for an unmanned aerial vehicle in a denial environment according to the present invention.
FIG. 2 is a diagram illustrating a pinhole model according to an embodiment of the present invention.
FIG. 3 is a schematic structural diagram of an extended Kalman filter in an embodiment of the present invention.
FIG. 4 is a schematic structural diagram of estimating a velocity and a position by extended Kalman filtering according to an embodiment of the present invention.
FIG. 5 is a schematic structural diagram of an extended Kalman filter estimation attitude in an embodiment of the present invention.
Detailed Description
The invention is further described below with reference to the accompanying drawings.
The invention adopts an INS/optical flow/magnetometer/barometer integrated navigation scheme based on extended Kalman filtering, fusing the INS, optical flow, magnetometer and other sensor data to estimate the speed, position and attitude information of the unmanned aerial vehicle, so that the unmanned aerial vehicle can realize autonomous navigation and positioning with its own sensors in a rejection environment. The specific scheme is as follows:
as shown in fig. 1, an autonomous navigation positioning method for an unmanned aerial vehicle in a rejection environment includes the following steps:
step 1, when the unmanned aerial vehicle enters a rejection environment, acquiring initial position, speed, height and attitude information of the unmanned aerial vehicle through an inertial navigation system; and acquiring data output by an accelerometer, a magnetometer and a gyroscope of the unmanned aerial vehicle in real time for the subsequent updating of the initial position, speed, height and attitude information;
step 2, judging whether the unmanned aerial vehicle is in a static or uniform motion state at present, if so, entering step 3; otherwise, entering the step 4;
step 3, updating initial attitude information at regular time by using extended Kalman filtering according to data output by the accelerometer and the magnetometer, and entering step 5;
step 4, performing inertial navigation strapdown calculation by using gyroscope data on the basis of the attitude updated last time to acquire attitude information until the unmanned aerial vehicle returns to a static or uniform state, and entering step 5;
step 5, outputting the updated attitude information of the unmanned aerial vehicle;
and 6, calculating the latest position, height and speed information through extended Kalman filtering fusion of the output data of the optical flow sensor, the barometer, the accelerometer and the gyroscope of the unmanned aerial vehicle together with the initial position, speed and height information obtained by the inertial navigation system.
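By way of illustration only, the attitude branch of the above flow (steps 2 to 5) can be sketched as a single dispatch function in Python. The three injected callables are hypothetical names, not part of the patent; concrete sketches for each of them are given later in this description.

def attitude_step(q, accel, gyro, mag, dt,
                  is_static_or_uniform, ekf_attitude_update, strapdown_update):
    """One iteration of the step 2-5 attitude logic of FIG. 1.

    q     : current attitude quaternion [q0, q1, q2, q3]
    accel : accelerometer specific-force reading (m/s^2)
    gyro  : gyroscope angular rate reading (rad/s)
    mag   : magnetometer reading
    The three trailing arguments are caller-supplied functions.
    """
    if is_static_or_uniform(accel, gyro):
        # Step 3: periodic EKF correction from accelerometer + magnetometer.
        q = ekf_attitude_update(q, accel, mag)
    else:
        # Step 4: gyroscope-only strapdown propagation.
        q = strapdown_update(q, gyro, dt)
    # Step 5: the updated attitude is returned (published) every cycle.
    return q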
Specifically, the speed, height and attitude information obtained by the inertial navigation system in step 1 carries a certain error, so this information needs to be updated with data from the other sensors. In the present embodiment, the extended Kalman filtering process is based on the model structure shown in FIG. 3, which is specialized into the filter structures of FIG. 4 and FIG. 5 for the different situations.
For the update of the attitude information, as described in step 3, the attitude information is updated with the model shown in FIG. 5 according to the data output by the accelerometer and the magnetometer. When the unmanned aerial vehicle has no acceleration of its own, the accelerometer can determine the roll angle and pitch angle of the unmanned aerial vehicle by sensing the gravity field, and the magnetometer can calculate the course angle of the unmanned aerial vehicle with the aid of the attitude information obtained from the accelerometer. Combining the two yields full attitude information free of time-accumulated error.
Specifically, the process of acquiring the roll angle and the pitch angle of the unmanned aerial vehicle with the accelerometer is as follows:

The component of the gravity vector in the geographic coordinate system is [0, 0, -g]^T. When the unmanned aerial vehicle is in a static state (no acceleration relative to the navigation coordinate system), the measured value of the accelerometer installed along the unmanned aerial vehicle coordinate system is

$$\begin{bmatrix} f_x^b \\ f_y^b \\ f_z^b \end{bmatrix} = C_n^b \begin{bmatrix} 0 \\ 0 \\ g \end{bmatrix} = \begin{bmatrix} -g\sin\gamma\cos\theta \\ g\sin\theta \\ g\cos\gamma\cos\theta \end{bmatrix} \tag{1}$$

Because the acceleration of gravity is perpendicular to the horizontal plane, the course angle of the unmanned aerial vehicle does not influence the acceleration output in the x and y directions, and then

$$\theta = \arcsin\left(\frac{f_y^b}{g}\right), \qquad \gamma = -\arctan\left(\frac{f_x^b}{f_z^b}\right)$$

wherein γ and θ are respectively the roll angle and the pitch angle of the unmanned aerial vehicle.
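As a numeric illustration of this leveling computation, the following minimal Python sketch implements the relations above; the axis and sign conventions are those assumed in the reconstruction and may need adapting to a particular airframe.

import math

def roll_pitch_from_accel(fx, fy, fz, g=9.80665):
    # Static-state leveling: theta from the y component, gamma from the
    # x and z components, per the relations above; the course angle
    # does not enter.
    theta = math.asin(max(-1.0, min(1.0, fy / g)))  # pitch (rad)
    gamma = math.atan2(-fx, fz)                     # roll (rad)
    return gamma, theta

# Level and static: the accelerometer reads +g along the body z axis.
print(roll_pitch_from_accel(0.0, 0.0, 9.80665))     # -> (0.0, 0.0)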
The specific process of acquiring the course angle of the unmanned aerial vehicle with the magnetometer is as follows:

Let the geomagnetic intensity be M^n = [M_x^n, M_y^n, M_z^n]^T. The magnetometer is installed along the three axes of the unmanned aerial vehicle coordinate system F_b, so the geomagnetic intensity measured on the unmanned aerial vehicle is M^b = [M_x^b, M_y^b, M_z^b]^T. The projections of the geomagnetic intensity on the axes of the geographic coordinate system and of the unmanned aerial vehicle coordinate system are related by the transformation matrix between the two coordinate systems, i.e.

$$\begin{bmatrix} M_x^b \\ M_y^b \\ M_z^b \end{bmatrix} = C_n^b(\gamma, \theta, \phi) \begin{bmatrix} M_x^n \\ M_y^n \\ M_z^n \end{bmatrix} \tag{2}$$

wherein φ is the course angle of the unmanned aerial vehicle; the value of M^b is given by the onboard magnetometer; and the value of M^n can be obtained by table lookup according to the known general geographic position of the unmanned aerial vehicle.

Assuming that the geomagnetic field remains constant during the flight, and combining the pitch angle θ and the roll angle γ determined by the accelerometer, the heading angle φ of the unmanned aerial vehicle in the geographic coordinate system can be calculated from formula (2).
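A minimal sketch of the corresponding tilt-compensated heading computation follows. The Z-X-Y rotation order, ENU axes and azimuth sign are assumptions consistent with the reconstruction above, not values fixed by the patent.

import numpy as np

def rx(a):  # rotation matrix about the x axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def ry(a):  # rotation matrix about the y axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def heading_from_mag(m_b, gamma, theta, m_n):
    # De-rotate roll then pitch so only the yaw difference remains.
    m_level = rx(theta).T @ ry(gamma).T @ np.asarray(m_b, float)
    # Heading = azimuth of the table field minus azimuth of the
    # levelled measured field, with azimuth = atan2(East, North).
    phi = np.arctan2(m_n[0], m_n[1]) - np.arctan2(m_level[0], m_level[1])
    return np.arctan2(np.sin(phi), np.cos(phi))  # wrap to [-pi, pi]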
The attitude of the unmanned aerial vehicle can also be obtained by integrating the angular rate output of the gyroscope, but micro-electro-mechanical system (MEMS) gyroscopes drift severely, whereas the accelerometer/magnetometer combination provides an attitude that is noisy but drift-free; the data of these sensors are therefore fused by extended Kalman filtering. The filtering process is shown in FIG. 5. The system state vector can be represented as

$$X = [q_0, q_1, q_2, q_3]^T$$

wherein [q_0, q_1, q_2, q_3]^T is the system state quaternion, propagated from the kinematic equation q̇ = ½Ω(ω)q by the fourth-order Runge-Kutta method, with [ω_x, ω_y, ω_z]^T the angular velocity output by the gyroscope.

The observation vector of the system can be expressed as

$$Z = [\tilde{q}_0, \tilde{q}_1, \tilde{q}_2, \tilde{q}_3]^T$$

where the four observed quaternion components are obtained from the readings of the accelerometer and the magnetometer combined with the Gauss-Newton method.
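For illustration, the fourth-order Runge-Kutta propagation of the state quaternion mentioned above can be sketched as follows, using the standard quaternion kinematics q̇ = ½Ω(ω)q; this is a generic sketch, not the patent's exact filter implementation.

import numpy as np

def omega_matrix(w):
    # 4x4 rate matrix Omega(w) of the kinematics q_dot = 0.5 * Omega(w) * q.
    wx, wy, wz = w
    return np.array([[0.0, -wx, -wy, -wz],
                     [wx,  0.0,  wz, -wy],
                     [wy, -wz,  0.0,  wx],
                     [wz,  wy, -wx,  0.0]])

def propagate_quaternion_rk4(q, w, dt):
    # One RK4 step with the gyro rate w held constant over the interval dt.
    f = lambda qq: 0.5 * omega_matrix(w) @ qq
    k1 = f(q)
    k2 = f(q + 0.5 * dt * k1)
    k3 = f(q + 0.5 * dt * k2)
    k4 = f(q + dt * k3)
    q = q + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
    return q / np.linalg.norm(q)  # renormalise to a unit quaternion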
In step 6, the data of the barometer, the accelerometer and the gyroscope and the data of the optical flow sensor and the ultrasonic sensor are fused by extended Kalman filtering: the speed and position of the unmanned aerial vehicle in the navigation coordinate system are selected as state quantities, and the outputs of the optical flow sensor, the ultrasonic sensor and the barometer of the unmanned aerial vehicle are used as observed quantities to estimate its position, height and speed. Through this extended Kalman filtering process, shown in FIG. 4, the speed, position and related information of the unmanned aerial vehicle in the geographic coordinate system can be obtained.
The optical flow sensor is an integrated visual sensor that combines an image acquisition system (IAS) and a digital signal processor (DSP) on one chip with an embedded optical flow algorithm. It measures visual motion and outputs a two-dimensional measurement, and can be used by a robot to measure visual motion, sense relative motion, and so on. Using optical flow for flight navigation and obstacle avoidance has also become an active research topic in the small unmanned aerial vehicle field in recent years.
The optical flow sensor continuously acquires images of the surface below at a certain rate through the IAS, and the DSP then analyses the resulting image matrices. Because two adjacent images always share common features, the average motion of the surface features can be determined by comparing the position changes of the feature points; the analysis result is finally converted into a two-dimensional coordinate offset and stored in a specific register as a pixel count, realizing detection of the moving object. If the position of the unmanned aerial vehicle were obtained from the optical flow sensor alone, map information or initial position information would need to be loaded in advance; combining this with the motion information calculated by the optical flow sensor yields the absolute position of the unmanned aerial vehicle.
Estimating the motion model of an object by optical flow essentially projects the three-dimensional motion onto the two-dimensional image plane of the camera. In this embodiment the pinhole model shown in FIG. 2 is used for the estimation. Let P_c(X_c, Y_c, Z_c) be a point in the camera coordinate system X_cY_cZ_c, and let f denote the focal length of the camera; the coordinate of point P_c in the imaging plane of the camera is p(x, y, f), and

$$x = f\frac{X_c}{Z_c} \tag{4}$$

$$y = f\frac{Y_c}{Z_c} \tag{5}$$

$$\vec{Op} = \frac{f}{Z_c}\vec{OP_c}$$

wherein Op and OP_c are the vectors pointing from the origin O to the points p and P_c respectively; P_c(X_c, Y_c, Z_c) may be obtained from the initial information acquired when the inertial navigation system enters the rejection environment.
Through a series of mathematical derivations, the components of the optical flow vector ṗ = [ẋ, ẏ, ż]^T in the three directions x, y, z are finally obtained as follows:

$$\dot{x} = \frac{T_z x - T_x f}{Z_c} - \omega_y f + \omega_z y + \frac{\omega_x x y - \omega_y x^2}{f}$$

$$\dot{y} = \frac{T_z y - T_y f}{Z_c} + \omega_x f - \omega_z x + \frac{\omega_x y^2 - \omega_y x y}{f}$$

$$\dot{z} = 0$$

wherein T = [T_x, T_y, T_z]^T is the translational velocity of the unmanned aerial vehicle in the camera coordinate system. In these formulae, the components ẋ and ẏ of the optical flow vector can be calculated by block matching with the minimum sum of absolute differences; the height Z_c can be measured by the barometer or by the ultrasonic sensor carried with the optical flow sensor; the angular velocity values ω_x, ω_y, ω_z are obtained from the gyroscope; and x and y are obtained from the foregoing formulae (4) and (5). The translational speed T of the aircraft in the camera coordinate system is thereby estimated; transforming it through the coordinate system transformation matrix C_c^n gives the speed of the unmanned aerial vehicle in the geographic coordinate system, and the position of the unmanned aerial vehicle in the geographic coordinate system is obtained after integration.
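A compact Python sketch of this velocity solution and position integration is given below. The gyro-compensation terms follow the reconstructed flow equations above; the downward-looking-camera assumption with T_z supplied externally (e.g. from the differentiated barometric height) is illustrative, since a single flow measurement alone under-determines the three velocity components.

import numpy as np

def camera_velocity_from_flow(u, v, x, y, f, Zc, w, Tz=0.0):
    # Subtract the rotation-induced part of the flow (gyro compensation),
    # then invert the remaining translational terms for Tx and Ty.
    wx, wy, wz = w
    u_t = u - (-wy * f + wz * y + (wx * x * y - wy * x * x) / f)
    v_t = v - ( wx * f - wz * x + (wx * y * y - wy * x * y) / f)
    # Translational flow: u_t = (Tz*x - Tx*f)/Zc, v_t = (Tz*y - Ty*f)/Zc.
    Tx = (Tz * x - u_t * Zc) / f
    Ty = (Tz * y - v_t * Zc) / f
    return np.array([Tx, Ty, Tz])

def integrate_position(p_n, T_cam, C_c_n, dt):
    # Rotate the camera-frame velocity into the geographic frame with the
    # transformation matrix C_c_n and integrate once to advance the position.
    return np.asarray(p_n, float) + (C_c_n @ T_cam) * dt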
In the attitude measurement process of this embodiment, when the unmanned aerial vehicle is static (hovering) or moving at a constant speed, the attitude obtained by integrating the gyroscope is corrected at regular intervals by extended Kalman filtering; when acceleration, deceleration or fast rotation is detected, no attitude correction is performed, and inertial navigation strapdown calculation is carried out with the gyroscope data on the basis of the last updated attitude until the unmanned aerial vehicle returns to the static (hovering) or uniform speed state.
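The patent does not spell out the motion-state test that gates these two branches; a common heuristic, sketched below with illustrative thresholds, checks that the specific-force magnitude stays near g (no acceleration of the vehicle's own) and that the angular rate is small (no fast rotation).

import numpy as np

G = 9.80665  # standard gravity, m/s^2

def is_static_or_uniform(accel, gyro, acc_tol=0.2, gyro_tol=np.deg2rad(3.0)):
    # Static or uniform motion: |f| close to g and |w| small.
    return (abs(np.linalg.norm(accel) - G) < acc_tol
            and np.linalg.norm(gyro) < gyro_tol)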
The invention is not limited to the foregoing embodiments. The invention extends to any novel feature or any novel combination of features disclosed in this specification and any novel method or process steps or any novel combination of features disclosed. Those skilled in the art to which the invention pertains will appreciate that insubstantial changes or modifications can be made without departing from the spirit of the invention as defined by the appended claims.
All of the features disclosed in this specification, or all of the steps in any method or process so disclosed, may be combined in any combination, except combinations of features and/or steps that are mutually exclusive.
Any feature disclosed in this specification may be replaced by alternative features serving equivalent or similar purposes, unless expressly stated otherwise. That is, unless expressly stated otherwise, each feature is only an example of a generic series of equivalent or similar features.

Claims (6)

1. An unmanned aerial vehicle autonomous navigation positioning method in a rejection environment is characterized by comprising the following processes:
step 1, when the unmanned aerial vehicle enters a rejection environment, acquiring initial position, speed and attitude information of the unmanned aerial vehicle through an inertial navigation system; and acquiring data output by an accelerometer, a magnetometer and a gyroscope of the unmanned aerial vehicle in real time for the subsequent updating of the initial position, speed and attitude information;
step 2, judging whether the unmanned aerial vehicle is in a static or uniform motion state at present, if so, entering step 3; otherwise, entering the step 4;
step 3, updating initial attitude information at regular time by using extended Kalman filtering according to data output by the accelerometer and the magnetometer, and entering step 5;
step 4, performing inertial navigation strapdown calculation by using gyroscope data on the basis of the attitude updated last time to acquire attitude information until the unmanned aerial vehicle returns to a static or uniform state, and entering step 5;
step 5, outputting the updated attitude information of the unmanned aerial vehicle;
and 6, fusing the output data of the optical flow sensor, the barometer, the accelerometer and the gyroscope of the unmanned aerial vehicle through extended Kalman filtering to solve for the latest position, height and speed information.
2. The unmanned aerial vehicle autonomous navigation positioning method under the rejection environment of claim 1, wherein the specific substeps of step 3 are:
step 3.1, acquiring the current roll angle and pitch angle of the unmanned aerial vehicle through the accelerometer output data;
step 3.2, acquiring the current course angle of the unmanned aerial vehicle according to the output data of the magnetometer and the current roll angle and pitch angle of the unmanned aerial vehicle;
and 3.3, resolving by using the current roll angle, pitch angle and course angle of the unmanned aerial vehicle through extended Kalman filtering, and updating the attitude acquired by the inertial navigation system according to the resolving result.
3. The unmanned aerial vehicle autonomous navigation positioning method under the rejection environment of claim 2, wherein the roll angle and pitch angle obtaining method in the step 3.1 is as follows:

$$\begin{bmatrix} f_x^b \\ f_y^b \\ f_z^b \end{bmatrix} = C_n^b \begin{bmatrix} 0 \\ 0 \\ g \end{bmatrix} = \begin{bmatrix} -g\sin\gamma\cos\theta \\ g\sin\theta \\ g\cos\gamma\cos\theta \end{bmatrix}, \qquad \theta = \arcsin\left(\frac{f_y^b}{g}\right), \quad \gamma = -\arctan\left(\frac{f_x^b}{f_z^b}\right)$$

wherein [f_x^b, f_y^b, f_z^b]^T is the measured value of the accelerometer when the unmanned aerial vehicle is in a static state or a uniform motion state, and γ and θ are respectively the roll angle and the pitch angle of the unmanned aerial vehicle.
4. The unmanned aerial vehicle autonomous navigation positioning method under the rejection environment of claim 3, wherein the course angle obtaining method of the step 3.2 is as follows:

$$\begin{bmatrix} M_x^b \\ M_y^b \\ M_z^b \end{bmatrix} = C_n^b(\gamma, \theta, \phi) \begin{bmatrix} M_x^n \\ M_y^n \\ M_z^n \end{bmatrix}$$

wherein [M_x^n, M_y^n, M_z^n]^T is the geomagnetic intensity in the geographic coordinate system, obtained by table lookup according to the known general geographic position of the unmanned aerial vehicle; [M_x^b, M_y^b, M_z^b]^T is the geomagnetic intensity given by the magnetometer on the unmanned aerial vehicle; and φ is the heading angle of the unmanned aerial vehicle.
5. The unmanned aerial vehicle autonomous navigation positioning method under the rejection environment of claim 4, wherein the step 6 specifically comprises: fusing the data of the barometer, the accelerometer and the gyroscope with the data of the optical flow sensor and the ultrasonic sensor by extended Kalman filtering, selecting the speed and position information of the unmanned aerial vehicle in the navigation coordinate system as state quantities, and using the outputs of the optical flow sensor, the ultrasonic sensor and the barometer of the unmanned aerial vehicle as observed quantities to estimate the position, height and speed information of the unmanned aerial vehicle.
6. The unmanned aerial vehicle autonomous navigation positioning method under the rejection environment of claim 1, wherein the step 6 comprises the following substeps:
step 61, projecting the three-dimensional motion onto the two-dimensional image plane of the camera by using the pinhole model of the optical flow estimation method, and acquiring the coordinates on the imaging plane of the camera and the component expressions of the optical flow vector in the three directions x, y and z;
step 62, solving the component expressions of the optical flow vector based on the data output by the barometer, the ultrasonic sensor and the gyroscope to obtain the average speed of the unmanned aerial vehicle in the camera coordinate system;
step 63, converting the average speed in the camera coordinate system into the speed of the unmanned aerial vehicle in the geographic coordinate system through the coordinate system transformation matrix;
and step 64, integrating the speed in the geographic coordinate system to obtain the position information of the unmanned aerial vehicle in the geographic coordinate system.
CN202111445139.3A 2021-11-30 2021-11-30 Unmanned aerial vehicle autonomous navigation positioning method in rejection environment Pending CN114184194A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111445139.3A CN114184194A (en) 2021-11-30 2021-11-30 Unmanned aerial vehicle autonomous navigation positioning method in rejection environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111445139.3A CN114184194A (en) 2021-11-30 2021-11-30 Unmanned aerial vehicle autonomous navigation positioning method in rejection environment

Publications (1)

Publication Number Publication Date
CN114184194A (zh) 2022-03-15

Family

ID=80541824

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111445139.3A Pending CN114184194A (en) 2021-11-30 2021-11-30 Unmanned aerial vehicle autonomous navigation positioning method in rejection environment

Country Status (1)

Country Link
CN (1) CN114184194A (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109916394A (en) * 2019-04-04 2019-06-21 山东智翼航空科技有限公司 Combined navigation algorithm fusing optical flow position and speed information

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109916394A (en) * 2019-04-04 2019-06-21 山东智翼航空科技有限公司 Combined navigation algorithm fusing optical flow position and speed information

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
化雪荟, 陈大力: "Application of INS/optical flow/magnetometer integrated navigation in small unmanned aerial vehicles" (INS/光流/磁强计组合导航在小型无人机中的应用) *
李涛; 梁建琦; 闫浩; 朱志飞; 唐军: "Application of an INS/optical flow/magnetometer/barometer integrated navigation system in unmanned aerial vehicles" (INS/光流/磁强计/气压计组合导航系统在无人机中的应用) *

Similar Documents

Publication Publication Date Title
CN109540126B (en) Inertial vision integrated navigation method based on optical flow method
CN110243358B (en) Multi-source fusion unmanned vehicle indoor and outdoor positioning method and system
CN106959110B (en) Cloud deck attitude detection method and device
CN107727079B (en) Target positioning method of full-strapdown downward-looking camera of micro unmanned aerial vehicle
CN106289275B (en) Unit and method for improving positioning accuracy
JP4989035B2 (en) Error correction of inertial navigation system
EP2133662B1 (en) Methods and system of navigation using terrain features
CN106767752B (en) Combined navigation method based on polarization information
US20080195316A1 (en) System and method for motion estimation using vision sensors
CN107014371A (en) UAV integrated navigation method and apparatus based on the adaptive interval Kalman of extension
JP2019074532A (en) Method for giving real dimensions to slam data and position measurement using the same
JP5602070B2 (en) POSITIONING DEVICE, POSITIONING METHOD OF POSITIONING DEVICE, AND POSITIONING PROGRAM
CN109186597B (en) Positioning method of indoor wheeled robot based on double MEMS-IMU
CN111426320B (en) Vehicle autonomous navigation method based on image matching/inertial navigation/milemeter
US20170074678A1 (en) Positioning and orientation data analysis system and method thereof
CN105928515B (en) A kind of UAV Navigation System
CN113670334B (en) Initial alignment method and device for aerocar
Mercado et al. Gps/ins/optic flow data fusion for position and velocity estimation
CN110325822B (en) Cradle head pose correction method and cradle head pose correction device
CN110736457A (en) combination navigation method based on Beidou, GPS and SINS
CN106813662A (en) A kind of air navigation aid based on light stream
CN112985420B (en) Small celestial body attachment optical navigation feature recursion optimization method
CN109143303B (en) Flight positioning method and device and fixed-wing unmanned aerial vehicle
CN111504323A (en) Unmanned aerial vehicle autonomous positioning method based on heterogeneous image matching and inertial navigation fusion
JP4986883B2 (en) Orientation device, orientation method and orientation program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20220315