CN112649001B - Attitude and position calculation method for small unmanned aerial vehicle - Google Patents

Attitude and position calculation method for small unmanned aerial vehicle

Info

Publication number
CN112649001B
CN112649001B (application CN202011382997.3A)
Authority
CN
China
Prior art keywords
aerial vehicle
unmanned aerial
triaxial
coordinate system
output
Prior art date
Legal status
Active
Application number
CN202011382997.3A
Other languages
Chinese (zh)
Other versions
CN112649001A (en)
Inventor
赵晋伟
王志刚
杨大鹏
Current Assignee
Shenyang Aircraft Design and Research Institute Aviation Industry of China AVIC
Original Assignee
Shenyang Aircraft Design and Research Institute Aviation Industry of China AVIC
Priority date
Filing date
Publication date
Application filed by Shenyang Aircraft Design and Research Institute, Aviation Industry of China (AVIC)
Priority to CN202011382997.3A
Publication of CN112649001A
Application granted
Publication of CN112649001B
Active legal status
Anticipated expiration of legal status


Classifications

    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/005 Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C21/165 Dead reckoning by integrating acceleration or speed (inertial navigation) combined with non-inertial navigation instruments
    • G01S19/49 Determining position by combining or switching between a satellite-positioning solution and an inertial position system, e.g. loosely coupled
    • Y02A90/30 Assessment of water resources

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Navigation (AREA)

Abstract

The application provides a method for calculating the attitude and position of a small unmanned aerial vehicle. The attitude calculation comprises the following steps: first, the initial attitude angles of the UAV are calculated from the triaxial accelerometer and triaxial magnetometer outputs under static conditions, and the attitude angles at each subsequent moment are then obtained by integrating the angular rates output by the triaxial gyroscope; second, the attitude angles are synchronously calculated from the accelerometer and magnetometer while the UAV is moving; finally, the two sets of attitude angles are high-pass and low-pass filtered respectively and fused to obtain the UAV attitude. The position calculation comprises the following steps: first, the longitude, latitude and altitude of the UAV are acquired from the satellite positioning module and converted into a ground coordinate system whose origin is a known fixed point; then, double integration of the triaxial accelerometer output, combined with the attitude angles, yields triaxial coordinates relative to the UAV's starting point of motion; finally, the two sets of position information are high-pass and low-pass filtered respectively and fused to obtain the UAV position.

Description

Attitude and position calculation method for small unmanned aerial vehicle
Technical Field
The application belongs to the technical field of aircraft control, and in particular relates to a method for calculating the attitude and position of a small unmanned aerial vehicle.
Background
The navigation system is an indispensable component for state estimation and environment perception during the autonomous flight of an unmanned aerial vehicle. Attitude and position calculation is among its most prominent functions and is critical to autonomous flight. Typically, the state estimator fuses multi-source information from different sensors to estimate the attitude, altitude, speed and position of the drone.
A traditional navigation system usually relies on the output of a single sensor for attitude and position calculation, and fails to make full and effective use of the complementary working characteristics of the inertial sensors and the satellite positioning system. Under relatively severe working conditions it is therefore strongly affected by external vibration interference: the stability and accuracy of the output data are poor, data divergence can even invalidate the output, and in serious cases the UAV can lose control.
Disclosure of Invention
The application aims to provide a method for calculating the attitude and position of a small unmanned aerial vehicle, so as to solve or alleviate at least one of the problems identified in the background.
The technical scheme of the application is as follows: a method for calculating the attitude and position of a small unmanned aerial vehicle, wherein:
the attitude calculation comprises:
calculating the initial attitude angles of the UAV from the triaxial accelerometer and triaxial magnetometer outputs while the UAV is static, and then obtaining a first attitude angle at each subsequent moment by integrating the angular rates output by the triaxial gyroscope;
while the UAV is moving, synchronously calculating a second attitude angle from the triaxial accelerometer and triaxial magnetometer outputs;
high-pass filtering the first attitude angle (calculated from the gyroscope output) and low-pass filtering the second attitude angle (calculated from the accelerometer and magnetometer outputs), then feeding both into a complementary filter for fusion, finally obtaining the UAV attitude;
the position calculation comprises:
acquiring the longitude, latitude and altitude of the UAV from the satellite positioning module, converting them into a ground coordinate system whose origin is a known fixed point, and outputting the position coordinates of the UAV in the ground coordinate system;
double-integrating the triaxial accelerometer output, combined with the attitude angles obtained above, to obtain triaxial coordinates relative to the UAV's starting point of motion;
finally, high-pass filtering the position information obtained by integrating with the UAV attitude angles and low-pass filtering the position information converted into the ground coordinate system, then feeding both into a complementary filter for fusion, finally obtaining the UAV position.
In the application, the process of calculating the initial attitude angles from the triaxial accelerometer and triaxial magnetometer outputs while the UAV is static comprises the following steps:
performing the coordinate transformation from the ground coordinate system to the body coordinate system:
where θ, ψ and φ in the transformation matrix are the pitch angle, yaw angle and roll angle respectively;
the sensitive axes of the triaxial accelerometer are arranged along the axes of the body coordinate system; during initial alignment on a static base, the specific-force relation between the triaxial accelerometer output and the ground coordinate system is:
where f^b is the three-axis accelerometer output, n_x, n_y and n_z are the ratios of the body-axis components of the UAV acceleration measured by the accelerometer to the local gravitational acceleration, and g is the local gravitational acceleration;
the sensitive axes of the triaxial magnetometer are arranged along the axes of the UAV coordinate system, and the measured triaxial magnetic field strength relates the navigation coordinate system and the UAV coordinate system as follows:
where M^b is the measured triaxial magnetic field strength in the UAV coordinate system, M^n is the triaxial magnetic field strength in the navigation coordinate system, and H_0 is the initial height;
when the UAV is static, the triaxial accelerometer measures the projection of gravitational acceleration onto the three UAV axes, and the triaxial magnetometer output is the projection of the geomagnetic field onto the three UAV axes, so the auxiliary attitude angles are calculated as:
where H_x, H_y and H_z are the three-axis magnetometer outputs.
In the application, the first attitude angle at each subsequent moment, obtained by integrating the angular rates output by the three-axis gyroscope, is:
where k is the time index, p_k, q_k and r_k are the angular rates of the system at the current moment measured by the triaxial gyroscope, and Δt is the gyroscope output interval.
In the application, the process of double-integrating the accelerometer output, combined with the finally obtained attitude angles, to obtain the triaxial coordinates relative to the UAV's starting point of motion comprises the following steps:
establishing a ground coordinate system S_g(o_g, x_g, y_g, z_g) with a point of known longitude, latitude and altitude as origin; the longitude and latitude coordinates are converted into ground-coordinate-system coordinates as:
where λ and φ are the longitude and latitude at the satellite positioning module, λ_0 and φ_0 are the longitude and latitude of the coordinate origin, R_m is the local meridian radius of curvature, and R_n is the local prime-vertical radius of curvature;
while the satellite positioning module outputs data and the coordinate conversion is performed, the inertial sensors are used to estimate the UAV position by double integration, and the two results are then fused to output position information with high accuracy and a high update rate,
where u, v and w are the components of the UAV velocity projected onto the three body axes, g is the local gravitational acceleration, n_x, n_y and n_z are the ratios of the body-axis acceleration components measured by the triaxial accelerometer to the local gravitational acceleration, p, q and r are the body-axis components of the UAV angular velocity measured by the triaxial gyroscope, and φ and θ are the roll and pitch angles of the UAV.
In the application, the finally obtained UAV position is:
where x, y and h are the horizontal position coordinates and the altitude of the UAV.
Compared with attitude and position calculation that relies on single-sensor data, the multi-sensor data fusion algorithm adopted here, based on inertial sensors and a GPS/BeiDou module, outputs attitude and position information with reduced noise, improved measurement accuracy, a higher information update rate and better long-term stability.
Drawings
In order to more clearly illustrate the technical solution provided by the present application, the following description will briefly refer to the accompanying drawings. It will be apparent that the figures described below are merely some embodiments of the application.
FIG. 1 is a schematic diagram of the attitude fusion process in the method of the present application.
FIG. 2 is a schematic diagram of a position fusion process in the method of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions in the embodiments of the present application are described in more detail below with reference to the accompanying drawings.
To solve the problems identified in the background, the application provides an attitude and position calculation method that fuses the outputs of different sensors according to their working characteristics, so that calculation accuracy is improved as far as possible while occupying limited computing resources, and the influence of external vibration interference on the stability of the calculated data is reduced.
The attitude and position calculation method for a small unmanned aerial vehicle mainly comprises the following two parts:
1. Attitude calculation:
as shown in fig. 1, first, the initial attitude angles of the UAV are calculated from the triaxial accelerometer and triaxial magnetometer outputs while the UAV is static, and the attitude angles at each subsequent moment are then obtained by integrating the angular rates output by the triaxial gyroscope;
then, while the UAV is moving, the attitude angles are synchronously calculated from the triaxial accelerometer and triaxial magnetometer outputs;
finally, the attitude angles calculated from the gyroscope output in the first step are high-pass filtered, the attitude angles calculated from the accelerometer and magnetometer outputs in the second step are low-pass filtered, both are fed into a complementary filter for fusion, and the UAV attitude is finally obtained.
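The complementary-filter fusion described above can be sketched per axis as follows. This is a minimal first-order form for illustration only: the filter coefficient `alpha` and the single-gain structure are assumptions, since the patent does not publish its filter coefficients.

```python
def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse one attitude axis: the gyro-integrated term passes the high
    frequencies (weight alpha), the accelerometer/magnetometer angle passes
    the low frequencies (weight 1 - alpha). alpha = 0.98 is an assumed value."""
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Example: stationary vehicle, zero gyro rate, accelerometer reporting a
# constant 0.05 rad tilt; the fused estimate converges toward 0.05 rad.
angle = 0.0
for _ in range(100):
    angle = complementary_filter(angle, gyro_rate=0.0, accel_angle=0.05, dt=0.003)
```

With zero gyro rate the estimate decays exponentially toward the accelerometer angle, which is exactly the low-pass behaviour the fusion relies on.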
The specific process comprises the following steps:
performing the coordinate transformation from the ground coordinate system to the body coordinate system:
the sensitive axes of the triaxial accelerometer are arranged along the axes of the body coordinate system; during initial alignment on a static base, the specific-force relation between the triaxial accelerometer output and the ground coordinate system is:
where f^b is the three-axis accelerometer output and g is the local gravitational acceleration.
The sensitive axes of the triaxial magnetometer are arranged along the axes of the UAV coordinate system, and the measured triaxial magnetic field strength relates the navigation coordinate system and the UAV coordinate system as follows:
where M^b is the measured triaxial magnetic field strength in the UAV coordinate system and M^n is the triaxial magnetic field strength in the navigation coordinate system.
When the UAV is static, the triaxial accelerometer measures the projection of gravitational acceleration onto the three UAV axes, and the triaxial magnetometer output is the projection of the geomagnetic field onto the three UAV axes. The auxiliary attitude angles can therefore be obtained from the above formulas:
where n_x, n_y and n_z are the three-axis accelerometer outputs, H_x, H_y and H_z are the three-axis magnetometer outputs, and θ, φ and ψ are the Euler angles of the UAV.
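Since the formula images are not reproduced in this text, the following sketch uses the standard accelerometer-levelling and tilt-compensated-magnetometer formulation for the auxiliary attitude angles. The sign conventions (NED navigation frame, forward-right-down body frame) are assumptions; the patent's figures fix the exact convention.

```python
import math

def auxiliary_attitude(n, H):
    """Static attitude from accelerometer ratios n = (nx, ny, nz)
    (body-axis specific force / g) and magnetometer output H = (Hx, Hy, Hz).
    Assumed conventions: NED navigation frame, forward-right-down body frame."""
    nx, ny, nz = n
    Hx, Hy, Hz = H
    theta = math.asin(max(-1.0, min(1.0, nx)))   # pitch from levelling
    phi = math.atan2(-ny, -nz)                   # roll from levelling
    # tilt-compensated heading from the magnetometer
    hx = (Hx * math.cos(theta)
          + Hy * math.sin(theta) * math.sin(phi)
          + Hz * math.sin(theta) * math.cos(phi))
    hy = Hy * math.cos(phi) - Hz * math.sin(phi)
    psi = math.atan2(-hy, hx)                    # yaw
    return phi, theta, psi

# level vehicle with its nose toward magnetic north:
# specific force is pure gravity, field is purely along body x
phi, theta, psi = auxiliary_attitude((0.0, 0.0, -1.0), (1.0, 0.0, 0.0))
```

For the level, north-pointing case all three angles come out zero, which is a quick sanity check on the chosen sign convention.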
The initial attitude information is obtained by the above algorithm. Then, from the angular rate information output by the triaxial gyroscope, the attitude angles can be calculated according to the following formula:
where p, q and r are the angular rates of the system at the current moment measured by the three-axis gyroscope, and Δt is the gyroscope output interval; the attitude update interval in the hardware platform of this system is 3 ms.
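One forward-Euler attitude-propagation step consistent with the description can be sketched as below. The Euler-angle kinematic equations used here are the standard aerospace form, stated as an assumption because the patent's formula image is not reproduced; dt = 3 ms matches the update interval quoted in the text.

```python
import math

def integrate_euler(phi, theta, psi, p, q, r, dt=0.003):
    """One forward-Euler update of (roll, pitch, yaw) from body rates
    (p, q, r), using the standard Euler-angle kinematics (assumed form)."""
    phi_dot = p + math.tan(theta) * (q * math.sin(phi) + r * math.cos(phi))
    theta_dot = q * math.cos(phi) - r * math.sin(phi)
    psi_dot = (q * math.sin(phi) + r * math.cos(phi)) / math.cos(theta)
    return phi + phi_dot * dt, theta + theta_dot * dt, psi + psi_dot * dt

# constant roll rate of 0.1 rad/s, starting level, integrated for ~1 s
phi = theta = psi = 0.0
for _ in range(int(1.0 / 0.003)):
    phi, theta, psi = integrate_euler(phi, theta, psi, p=0.1, q=0.0, r=0.0)
```

With pure roll rate the pitch and yaw stay exactly zero and roll grows linearly, as expected from the kinematics.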
In summary, the initial attitude angles are estimated while the UAV is stationary from the projections of gravitational acceleration and the geomagnetic field, measured by the triaxial accelerometer and triaxial magnetometer, onto the three UAV axes; the attitude angles are then propagated by integrating the body-axis angular rates output by the triaxial gyroscope. The auxiliary attitude angles calculated from the accelerometer and magnetometer are low-pass filtered, the attitude angles obtained by gyroscope integration are high-pass filtered, and both are fed into a complementary filter for fusion, completing the inertial-sensor-based attitude estimation.
2. Position calculation:
as shown in fig. 2, the longitude, latitude and altitude of the UAV are first acquired from the satellite positioning module (for example, by screening the BeiDou or GPS navigation messages received by a GNSS receiving module), then converted into a ground coordinate system whose origin is a known fixed point, and the coordinates of the UAV in the ground coordinate system are output;
then, the triaxial accelerometer output, combined with the attitude angles output by the attitude calculation above, is double-integrated to obtain triaxial coordinates relative to the UAV's starting point of motion;
finally, the inertially integrated position from the second step is high-pass filtered, the satellite-derived position from the first step is low-pass filtered, and both are fed into a complementary filter and fused, finally obtaining the UAV position.
The specific process comprises the following steps:
the position calculation first screens the required longitude, latitude and altitude information out of the navigation message, and then establishes a ground coordinate system S_g(o_g, x_g, y_g, z_g) with its origin at a point of known longitude, latitude and altitude, o_g x_g pointing north and o_g y_g pointing east. The longitude and latitude coordinates can be converted into ground-coordinate-system coordinates, in metres, according to the following formula.
where λ is the longitude, φ is the latitude, R_m is the local meridian radius of curvature and R_n is the local prime-vertical radius of curvature, both calculated by the following formulas.
In the above formulas, the Earth radius is taken as R_e = 6378137 m, the Earth flattening as e = 0.0033528131779, (λ_0, φ_0) are the longitude and latitude of the coordinate origin O, and (λ, φ) are the longitude and latitude of the GPS/BeiDou module receiving antenna.
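A sketch of this conversion, using the semi-major axis and flattening values quoted above, is given below. Because the patent's formula image is not reproduced, the first-order radii approximations Rm ≈ Re(1 − 2e + 3e·sin²φ) and Rn ≈ Re(1 + e·sin²φ) are an assumed standard form.

```python
import math

R_E = 6378137.0          # Earth radius (semi-major axis) from the text, m
E_F = 0.0033528131779    # Earth flattening from the text

def geodetic_to_ground(lat, lon, lat0, lon0):
    """Convert geodetic (lat, lon) in radians to local ground coordinates
    (x north, y east) in metres about origin (lat0, lon0).
    Radii formulas are a common first-order approximation (assumed)."""
    s2 = math.sin(lat) ** 2
    Rm = R_E * (1.0 - 2.0 * E_F + 3.0 * E_F * s2)   # meridian radius
    Rn = R_E * (1.0 + E_F * s2)                      # prime-vertical radius
    x = (lat - lat0) * Rm                  # northward displacement
    y = (lon - lon0) * Rn * math.cos(lat)  # eastward displacement
    return x, y

# one arc-second of latitude north of an equatorial origin (~30.7 m)
x, y = geodetic_to_ground(math.radians(1 / 3600), 0.0, 0.0, 0.0)
```

The familiar "one arc-second of latitude is about 30 m" rule of thumb gives a quick check on the output scale.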
While the system receives the GNSS module output and performs the coordinate conversion, the inertial sensors are used to estimate the UAV position by double integration, and the two results are then fused to output position information with high accuracy and a high update rate.
In the above formulas, u, v and w are the components of the UAV velocity projected onto the three body axes, g is the local gravitational acceleration, n_x, n_y and n_z are the ratios of the body-axis acceleration components measured by the triaxial accelerometer to the local gravitational acceleration, p, q and r are the body-axis components of the UAV angular velocity measured by the triaxial gyroscope, and φ and θ are the roll and pitch angles of the UAV:
where x, y and h are the horizontal position coordinates and the altitude of the UAV.
Finally, the present application also provides a computer device, comprising: one or more processors; and a storage means for storing one or more programs; when the one or more programs are executed by the one or more processors, the one or more processors implement any of the methods described above.
The embodiment of the application also provides a computer readable storage medium storing a computer program which, when executed by a processor, implements a method as described in any of the above.
It is to be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the various steps, methods, apparatuses or modules may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, they may be implemented using any one or a combination of the following techniques well known in the art: discrete logic circuits with logic gates for implementing logic functions on data signals, application-specific integrated circuits with suitable combinational logic gates, Programmable Gate Arrays (PGAs), Field-Programmable Gate Arrays (FPGAs), and the like.
Those of ordinary skill in the art will appreciate that all or a portion of the steps carried out in the method of the above-described embodiments may be implemented by a program to instruct related hardware, where the program may be stored in a computer readable storage medium, and where the program, when executed, includes one or a combination of the steps of the method embodiments.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any changes or substitutions easily contemplated by those skilled in the art within the scope of the present application should be included in the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (3)

1. A method for calculating the attitude and position of a small unmanned aerial vehicle, characterized in that:
the attitude calculation comprises:
calculating the initial attitude angles of the UAV from the triaxial accelerometer and triaxial magnetometer outputs while the UAV is static, and then obtaining a first attitude angle at each subsequent moment by integrating the angular rates output by the triaxial gyroscope;
while the UAV is moving, synchronously calculating a second attitude angle from the triaxial accelerometer and triaxial magnetometer outputs;
high-pass filtering the first attitude angle (calculated from the gyroscope output) and low-pass filtering the second attitude angle (calculated from the accelerometer and magnetometer outputs), then feeding both into a complementary filter for fusion, finally obtaining the UAV attitude;
the process for calculating the initial attitude angle of the unmanned aerial vehicle according to the output of the triaxial accelerometer and the triaxial magnetometer under the static condition of the unmanned aerial vehicle comprises the following steps:
performing coordinate matrix transformation from a ground coordinate system to a machine body coordinate system:
in the formula ,the angle of incidence, the angle of yaw and the angle of roll are respectively the transformation matrix;
the sensitive axes of the triaxial accelerometer are arranged along each axis of the machine body coordinate system, and when the static base is initially aligned, the specific force relation between the output of the triaxial accelerometer and the ground coordinate system is as follows:
in the formula ,fb For three-axis output of accelerometer, n x 、n y and nz The method comprises the steps that the ratio of components of the acceleration of the unmanned aerial vehicle in the three-axis directions of a machine body coordinate system, measured by a three-axis accelerometer, to the local gravitational acceleration is measured, and g is the local gravitational acceleration;
each sensitive axis of the triaxial magnetometer is arranged along an axis of the unmanned aerial vehicle coordinate system, and the measured triaxial magnetic field intensity satisfies the following relation between the navigation coordinate system and the unmanned aerial vehicle coordinate system:

$$M^b=C_g^b\,M^n$$

where $M^b$ is the measured triaxial magnetic field intensity in the unmanned aerial vehicle coordinate system, $M^n$ is the measured triaxial magnetic field intensity in the navigation coordinate system, and $H_0$ is the initial height;
when the unmanned aerial vehicle is static, the triaxial accelerometer measures the projection of gravitational acceleration on the three body axes and the triaxial magnetometer output is the projection of the geomagnetic field on the three body axes, from which the auxiliary attitude angles are calculated as

$$\theta=\arcsin(n_x),\qquad \phi=\operatorname{atan2}(-n_y,\,-n_z),$$
$$\psi=\operatorname{atan2}\!\big(H_z\sin\phi-H_y\cos\phi,\; H_x\cos\theta+H_y\sin\theta\sin\phi+H_z\sin\theta\cos\phi\big)$$

where $H_x$, $H_y$ and $H_z$ are the triaxial magnetometer outputs;
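The static initial alignment can be sketched as follows. The axis and sign conventions (body z-axis roughly downward, so a level vehicle reads $n_z\approx-1$) are assumptions consistent with the specific-force relation above, not taken verbatim from the patent figures.

```python
import math

def initial_attitude(nx, ny, nz, Hx, Hy, Hz):
    """Static-base initial alignment sketch (assumed sign conventions).

    With the accelerometer convention f_b = g*[sin(theta),
    -cos(theta)sin(phi), -cos(theta)cos(phi)], roll and pitch follow
    from (nx, ny, nz); yaw comes from the tilt-compensated
    magnetometer triad (Hx, Hy, Hz).
    """
    theta = math.asin(max(-1.0, min(1.0, nx)))   # pitch from x-axis specific force
    phi = math.atan2(-ny, -nz)                   # roll from y/z specific force
    # tilt-compensated heading from the magnetometer triad
    psi = math.atan2(Hz * math.sin(phi) - Hy * math.cos(phi),
                     Hx * math.cos(theta)
                     + Hy * math.sin(theta) * math.sin(phi)
                     + Hz * math.sin(theta) * math.cos(phi))
    return phi, theta, psi
```

For a level, north-pointing vehicle (`nx = ny = 0`, `nz = -1`, magnetic field along x) all three angles come out zero, as expected.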
the first attitude angle at each subsequent moment, obtained by integrating the angular rates output by the triaxial gyroscope, is

$$\begin{aligned}\phi_{k+1}&=\phi_k+\big[p_k+(q_k\sin\phi_k+r_k\cos\phi_k)\tan\theta_k\big]\Delta t\\ \theta_{k+1}&=\theta_k+(q_k\cos\phi_k-r_k\sin\phi_k)\,\Delta t\\ \psi_{k+1}&=\psi_k+\frac{q_k\sin\phi_k+r_k\cos\phi_k}{\cos\theta_k}\,\Delta t\end{aligned}$$

where $k$ is the time step, $p_k$, $q_k$ and $r_k$ are the angular rates of the system at the current moment measured by the triaxial gyroscope, and $\Delta t$ is the output interval of the triaxial gyroscope;
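One integration step of this attitude propagation can be sketched as below. The Euler-angle kinematics used here (body rates mapped to Euler-angle rates before integrating) are the standard form for a yaw-pitch-roll sequence; they are singular at θ = ±90°, a known limitation of this parameterization.

```python
import math

def propagate_euler(phi, theta, psi, p, q, r, dt):
    """One Euler-integration step of the attitude from gyro rates.

    (p, q, r) are body angular rates from the triaxial gyroscope;
    dt is the gyroscope output interval. Illustrative sketch -- the
    patent only states the discrete update, not this exact code.
    """
    sph, cph = math.sin(phi), math.cos(phi)
    # map body rates to Euler-angle rates for a yaw-pitch-roll sequence
    phi_dot = p + (q * sph + r * cph) * math.tan(theta)
    theta_dot = q * cph - r * sph
    psi_dot = (q * sph + r * cph) / math.cos(theta)
    # forward-Euler integration over one output interval
    return (phi + phi_dot * dt,
            theta + theta_dot * dt,
            psi + psi_dot * dt)
```

Starting from level attitude, one 10 ms step with rates (0.1, 0.2, 0.3) rad/s yields angles (0.001, 0.002, 0.003) rad, matching a direct hand calculation.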
the position calculation includes:
acquiring the longitude, latitude and altitude of the unmanned aerial vehicle from the longitude-and-latitude receiving module, converting them into a ground coordinate system whose origin is a known fixed point, and outputting the position coordinates of the unmanned aerial vehicle in the ground coordinate system;
performing double integration based on the finally obtained attitude angles of the unmanned aerial vehicle to obtain triaxial coordinates relative to the unmanned aerial vehicle's motion starting point;
finally, applying high-pass and low-pass filtering respectively to the position information converted into the ground coordinate system and the position information obtained by integration from the attitude angles, and fusing the two in a complementary filter to finally obtain the position of the unmanned aerial vehicle;
the process of obtaining triaxial coordinates relative to the unmanned aerial vehicle's motion starting point by double integration based on the finally obtained attitude angles comprises the following steps:
establishing a ground coordinate system $S_g(o_g x_g y_g z_g)$ whose origin is selected at a location of known longitude, latitude and height, and converting longitude-latitude coordinates into coordinates in the ground coordinate system:

$$x_g=(R_m+H_0)\,(L-L_0),\qquad y_g=(R_n+H_0)\cos L_0\,(\lambda-\lambda_0),\qquad h=H-H_0$$

where $\lambda$ and $L$ are respectively the longitude and latitude at the position of the longitude-and-latitude module, $\lambda_0$ and $L_0$ are the longitude and latitude of the coordinate origin, $R_m$ is the local meridian radius of curvature, and $R_n$ is the local prime-vertical radius of curvature;
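The geodetic-to-local conversion can be sketched as follows. The WGS-84 ellipsoid constants are an assumption (the patent does not name a specific ellipsoid), and the function name and signature are illustrative.

```python
import math

def geodetic_to_local(lat, lon, h, lat0, lon0, h0,
                      a=6378137.0, e2=6.69437999014e-3):
    """Convert geodetic (lat, lon in radians, h in metres) to a local
    ground frame with origin at (lat0, lon0, h0).

    a and e2 are the WGS-84 semi-major axis and first eccentricity
    squared (assumed here, not specified by the patent).
    """
    s = math.sin(lat0)
    rm = a * (1 - e2) / (1 - e2 * s * s) ** 1.5   # meridian radius R_m
    rn = a / math.sqrt(1 - e2 * s * s)            # prime-vertical radius R_n
    x = (rm + h0) * (lat - lat0)                  # north displacement
    y = (rn + h0) * math.cos(lat0) * (lon - lon0) # east displacement
    z = h - h0                                    # height above origin
    return x, y, z
```

At the origin the conversion returns (0, 0, 0); near the equator, a latitude offset of 1e-5 rad maps to roughly 63 m of northing, consistent with the meridian radius of curvature.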
while the output data of the longitude-and-latitude module are being received and converted, the position of the unmanned aerial vehicle is also estimated by double integration of the inertial sensors, and the two position estimates are then fused to output position information with high precision and a high update rate:

$$\begin{aligned}\dot u&=rv-qw-g\sin\theta+g\,n_x\\ \dot v&=pw-ru+g\cos\theta\sin\phi+g\,n_y\\ \dot w&=qu-pv+g\cos\theta\cos\phi+g\,n_z\end{aligned}$$

where $u$, $v$ and $w$ are the components of the unmanned aerial vehicle's velocity projected onto the three body axes, $g$ is the local gravitational acceleration, $n_x$, $n_y$ and $n_z$ are the ratios of the components of the acceleration along the three body axes, as measured by the triaxial accelerometer, to the local gravitational acceleration, $p$, $q$ and $r$ are the components of the angular velocity along the three body axes measured by the triaxial gyroscope, and $\phi$ and $\theta$ are the roll and pitch angles of the unmanned aerial vehicle;
the finally obtained position of the unmanned aerial vehicle follows by integrating the body velocity projected into the ground frame:

$$\begin{bmatrix}\dot x\\ \dot y\\ \dot h\end{bmatrix}=\begin{bmatrix}\cos\theta\cos\psi & \sin\phi\sin\theta\cos\psi-\cos\phi\sin\psi & \cos\phi\sin\theta\cos\psi+\sin\phi\sin\psi\\ \cos\theta\sin\psi & \sin\phi\sin\theta\sin\psi+\cos\phi\cos\psi & \cos\phi\sin\theta\sin\psi-\sin\phi\cos\psi\\ \sin\theta & -\sin\phi\cos\theta & -\cos\phi\cos\theta\end{bmatrix}\begin{bmatrix}u\\ v\\ w\end{bmatrix}$$

where $x$, $y$ and $h$ are the position coordinates and height coordinate of the unmanned aerial vehicle.
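The body-to-ground velocity projection used in that final step can be sketched as below; the rotation sequence (yaw-pitch-roll, matching the direction-cosine matrix earlier in the claim) is assumed, and the function name is illustrative.

```python
import math

def body_velocity_to_ground(u, v, w, phi, theta, psi):
    """Project body-frame velocity (u, v, w) into ground-frame rates
    (x_dot north, y_dot east, h_dot up). Sketch only; integrates to
    (x, y, h) when accumulated over time."""
    cph, sph = math.cos(phi), math.sin(phi)
    cth, sth = math.cos(theta), math.sin(theta)
    cps, sps = math.cos(psi), math.sin(psi)
    x_dot = (u * cth * cps
             + v * (sph * sth * cps - cph * sps)
             + w * (cph * sth * cps + sph * sps))
    y_dot = (u * cth * sps
             + v * (sph * sth * sps + cph * cps)
             + w * (cph * sth * sps - sph * cps))
    h_dot = u * sth - v * sph * cth - w * cph * cth
    return x_dot, y_dot, h_dot
```

For level flight straight ahead (all angles zero, `u = 10`), the ground rates are (10, 0, 0); with a 90° yaw the same body velocity maps entirely onto the east axis.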
2. A computer device, comprising:
one or more processors;
a storage means for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of claim 1.
3. A computer readable storage medium storing a computer program which, when executed by a processor, implements the method of claim 1.
CN202011382997.3A 2020-12-01 2020-12-01 Gesture and position resolving method for small unmanned aerial vehicle Active CN112649001B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011382997.3A CN112649001B (en) 2020-12-01 2020-12-01 Gesture and position resolving method for small unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011382997.3A CN112649001B (en) 2020-12-01 2020-12-01 Gesture and position resolving method for small unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN112649001A CN112649001A (en) 2021-04-13
CN112649001B true CN112649001B (en) 2023-08-22

Family

ID=75350123

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011382997.3A Active CN112649001B (en) 2020-12-01 2020-12-01 Gesture and position resolving method for small unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN112649001B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114400986A (en) * 2021-12-07 2022-04-26 华中光电技术研究所(中国船舶重工集团公司第七一七研究所) Cascade filtering method and device for portable relative gravimeter

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008241320A (en) * 2007-03-26 2008-10-09 Mitsubishi Electric Corp Flying object and method for aligning inertial device to be mounted on flying object
JP2013029512A (en) * 2011-07-28 2013-02-07 Memsic Inc System and method for portable electronic device that detect attitude and angular velocity using magnetic sensor and accelerometer
CN107063262A (en) * 2017-04-07 2017-08-18 武汉理工大学 A kind of complementary filter method resolved for UAV Attitude
CN108398128A (en) * 2018-01-22 2018-08-14 北京大学深圳研究生院 A kind of the fusion calculation method and device of attitude angle
CN109506646A (en) * 2018-11-20 2019-03-22 石家庄铁道大学 A kind of the UAV Attitude calculation method and system of dual controller
CN110081878A (en) * 2019-05-17 2019-08-02 东北大学 A kind of posture and location determining method of multi-rotor unmanned aerial vehicle
CN110146077A (en) * 2019-06-21 2019-08-20 台州知通科技有限公司 Pose of mobile robot angle calculation method
CN111637878A (en) * 2020-06-19 2020-09-08 四川陆垚控制技术有限公司 Unmanned aerial vehicle navigation filter

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9332476B2 (en) * 2013-10-04 2016-05-03 Blackberry Limited Method and apparatus to correct indoor positioning by utilizing Wi-Fi handovers


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
UAV attitude algorithm based on enhanced explicit complementary filtering; Ma Li et al.; Journal of Guilin University of Electronic Technology; 2019-10-25 (Issue 05); full text *

Also Published As

Publication number Publication date
CN112649001A (en) 2021-04-13

Similar Documents

Publication Publication Date Title
CN111323050B (en) Strapdown inertial navigation and Doppler combined system calibration method
CN109931926A (en) A kind of small drone based on topocentric coordinate system is seamless self-aid navigation algorithm
WO2019071916A1 (en) Antenna beam attitude control method and system
CN104698485B (en) Integrated navigation system and air navigation aid based on BD, GPS and MEMS
CN112505737B (en) GNSS/INS integrated navigation method
JP5602070B2 (en) POSITIONING DEVICE, POSITIONING METHOD OF POSITIONING DEVICE, AND POSITIONING PROGRAM
CN111121766B (en) Astronomical and inertial integrated navigation method based on starlight vector
CN105928515B (en) A kind of UAV Navigation System
CN104374388A (en) Flight attitude determining method based on polarized light sensor
CN102087110B (en) Miniature underwater moving vehicle autonomous attitude detecting device and method
US11408735B2 (en) Positioning system and positioning method
CN110440827B (en) Parameter error calibration method and device and storage medium
CN110017837A (en) A kind of Combinated navigation method of the diamagnetic interference of posture
CN103994766A (en) Anti-GPS-failure orientation method for fixed-wing unmanned aerial vehicle
CN112556724A (en) Initial coarse alignment method for low-cost navigation system of micro aircraft in dynamic environment
CN105928519B (en) Navigation algorithm based on INS inertial navigation and GPS navigation and magnetometer
CN112649001B (en) Gesture and position resolving method for small unmanned aerial vehicle
CN107807375B (en) Unmanned aerial vehicle attitude tracking method and system based on multiple GPS receivers
Spielvogel et al. A stable adaptive attitude estimator on SO (3) for true-North seeking gyrocompass systems: Theory and preliminary simulation evaluation
Tang et al. An attitude estimate method for fixed-wing UAV s using MEMS/GPS data fusion
CN114877881A (en) Fusion method and fusion system for course angle measurement data of unmanned aerial vehicle
Tripathi et al. Design considerations of orientation estimation system
CN111811500A (en) Target object pose estimation method and device, storage medium and electronic equipment
Kurniawan et al. Displacement estimation and tracking of quadrotor UAV in dynamic motion
Cheng et al. Modeling and simulation of low-cost integrated navigation system on vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant