CN117073669A - Aircraft positioning method - Google Patents

Aircraft positioning method

Info

Publication number
CN117073669A
Authority
CN
China
Prior art keywords
image
aircraft
points
matching
inertial navigation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311042901.2A
Other languages
Chinese (zh)
Inventor
郑敬浩
雷波
谭海
范强
刘松
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Huazhong Tianjing Tongshi Technology Co ltd
Original Assignee
Wuhan Huazhong Tianjing Tongshi Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Huazhong Tianjing Tongshi Technology Co ltd filed Critical Wuhan Huazhong Tianjing Tongshi Technology Co ltd
Priority to CN202311042901.2A priority Critical patent/CN117073669A/en
Publication of CN117073669A publication Critical patent/CN117073669A/en
Pending legal-status Critical Current


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1656 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/183 - Compensation of inertial measurements, e.g. for temperature effects
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 - Instruments for performing navigational calculations
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 - Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 - Complex mathematical operations
    • G06F17/16 - Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization

Abstract

The invention discloses an aircraft positioning method in which the aircraft photographs the ground directly below it at regular intervals; initial positioning coordinates of the aircraft are obtained from inertial navigation; the SuperPoint neural network and the SuperGlue matching algorithm match the captured photo against a pre-stored reference map; mismatched points are removed; the transformation matrix between the two images is calculated to obtain the position of the captured photo; and finally a Kalman filtering algorithm fuses the visual navigation and inertial navigation results to obtain the aircraft coordinate point.

Description

Aircraft positioning method
Technical Field
The invention belongs to the technical field of integrated navigation for aircraft, and particularly relates to an aircraft positioning method.
Background
Although satellite navigation has found wide application across industries, autonomous navigation of aircraft is still needed, particularly in areas where GPS signals are absent or jammed. Inertial navigation offers good real-time performance, but by its working principle positioning errors accumulate quickly, so it cannot be relied on for long periods.
Visual navigation offers high positioning accuracy, but it faces real-time problems caused by its heavy computational load and accuracy problems caused by the variability of natural scenes.
In addition, most common visual navigation methods perform positioning and navigation on stitched camera photos; map information prepared in advance is rarely used.
Disclosure of Invention
Aiming at the technical problems in the prior art, the invention provides an aircraft hybrid positioning method based on inertial navigation and image matching.
The technical solution adopted to solve the above technical problems is as follows: an aircraft positioning method comprising the following steps:
S1, extracting features from a reference satellite map of the mission area using the SuperPoint neural network to obtain, for each key point, its pixel coordinates (x, y), a 1×256-dimensional description vector with unit 2-norm and a confidence value between 0 and 1, and storing these as the reference map;
S2, calibrating the initial position of the aircraft as (x_0, y_0);
S3, while the aircraft executes its mission, taking a photo straight down at every interval T and recording it as image img;
S4, scaling and rotating the image img according to the altitude and attitude information from inertial navigation to obtain image A, which has the same spatial resolution and orientation as the reference satellite map, and extracting the coordinates, description vectors and confidences of the key points in image A using the SuperPoint neural network;
S5, determining from the inertial navigation information the section B of the reference map where the aircraft is located at shooting time t, whose center point (x_t, y_t) is the inertial navigation positioning coordinate;
S6, preliminarily matching the feature points of image A and section B using the SuperGlue matching algorithm, then removing mismatched points using the RANSAC algorithm, finally obtaining n matching point pairs (x_i^A, y_i^A) and (x_i^B, y_i^B), i = 1, 2, …, n;
S7, if n < 5, the number of matching points is considered too small and the matching poor, and the method returns to step S3; otherwise the matching is considered good and the method proceeds to step S8;
S8, calculating the transformation matrix M from image A to section B;
S9, transforming the center point of image A by the matrix M and taking the resulting coordinates as the aircraft position obtained by visual navigation;
S10, fusing the inertial navigation coordinates (x_t, y_t) and the visual navigation coordinates from step S9 using Kalman filtering to obtain the fused aircraft coordinate point.
In the above aircraft positioning method, in step S5, the inertial navigation positioning error is denoted Δ, and section B is defined as the region from (x_t − Δ, y_t − Δ) to (x_t + Δ, y_t + Δ).
In the above aircraft positioning method, in step S6, the SuperGlue matching algorithm uses a multi-head attention mechanism to iteratively update the description vectors of the feature points in image A and section B, then computes the inner product s between feature points and updates s using the Sinkhorn algorithm.
In the above aircraft positioning method, the preliminary matching uses cross-checking: if point m in section B is the best match for point n in image A, and point n in image A is also the best match for point m in section B, then {n, m} is considered a matching point pair.
In the above aircraft positioning method, in step S10, the positioning errors of visual navigation and inertial navigation are denoted VN and IN respectively, and the aircraft coordinate point is obtained from the two position estimates by Kalman filtering.
the beneficial effects of the invention are as follows:
1. Traditional ORB and SIFT match images using hand-designed information such as image gray levels and gray gradients. However, the camera photo and the reference map often differ in imaging characteristics, so the matching effect is poor. The SuperPoint neural network and the SuperGlue matching algorithm are introduced to extract and match feature points respectively; both use deep neural networks, so they can extract deep semantic information from the image rather than only pixel-level gray and gradient information. Experiments show that the image matching algorithm used in this method gives the best matching results in both rural and urban scenes, can accurately match camera images against the reference map, adapts well, and achieves a matching accuracy of 1-4 pixels.
2. The method uses inertial navigation information to pre-screen and focus the reference map, which narrows the range to be matched and reduces the time consumed by visual navigation.
3. After the visual navigation result is fused with inertial navigation, experiments show that the accumulated error of inertial navigation is greatly reduced. The fused method has better real-time performance and smaller positioning error and is better suited to long-range, long-endurance flight missions. Because no satellite signal is needed, the proposed navigation method has good anti-interference performance, strong autonomy and a wider range of application.
Drawings
FIG. 1 is a schematic illustration of an original image of the present invention;
FIG. 2 is a schematic diagram of the original image after rotation and scaling;
FIG. 3 is a graph showing the matching result of the conventional SIFT algorithm;
FIG. 4 is a comparison of the SuperPoint+SuperGlue positioning effect of the present invention;
FIG. 5 shows the result of removing mismatched points with the RANSAC algorithm of the present invention.
Detailed Description
The invention is described in further detail below with reference to the accompanying drawings.
The invention discloses a combined positioning method based on visual navigation and inertial navigation: a satellite map is used as the reference, the image captured in real time by the unmanned aerial vehicle is matched against the reference map in real time, and the positioning results are fused using Kalman filtering, so that the advantages of the two methods are combined, the positioning error is reduced and a better navigation effect is achieved. The method specifically comprises the following steps.
S1, a satellite map of the mission area is acquired, its features are extracted using the SuperPoint neural network to obtain the key-point coordinates (x, y), description vectors and confidences, and these are stored as the reference map.
Here (x, y) is the pixel location of the key point; the description vector is 1×256-dimensional with a 2-norm of 1; the confidence is a number between 0 and 1. Compared with storing the satellite map directly, this both reduces the storage space and speeds up the matching computation. The SuperPoint neural network performs feature extraction with two-dimensional convolutions and is more robust than traditional algorithms such as ORB and SIFT.
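As an illustration of this storage scheme, a minimal Python sketch follows; superpoint_extract is a hypothetical stand-in for a pretrained SuperPoint forward pass (the patent does not specify an implementation), and the file names are assumptions:

```python
import numpy as np

def superpoint_extract(image_path):
    # Hypothetical stand-in for a pretrained SuperPoint forward pass.
    # A real implementation would run the network on the image; here
    # random placeholder data with the shapes described in S1 is returned.
    n = 1000
    kp = np.random.rand(n, 2) * 10000                    # (x, y) pixel coordinates
    desc = np.random.randn(n, 256)
    desc /= np.linalg.norm(desc, axis=1, keepdims=True)  # unit 2-norm descriptors
    conf = np.random.rand(n)                             # confidence in (0, 1)
    return kp, desc, conf

kp, desc, conf = superpoint_extract("mission_area_satmap.png")
np.savez_compressed("reference_map.npz",
                    keypoints=kp, descriptors=desc, confidence=conf)
```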
S2, the initial position of the aircraft is calibrated and denoted (x_0, y_0). The aircraft used in this embodiment is a DJI M210 unmanned aerial vehicle.
S3, while the aircraft executes its mission, a photo is taken straight down every interval T and recorded as image img. In this embodiment T is set to 5 s.
S4, the image img is scaled and rotated according to the altitude and attitude information from inertial navigation to obtain image A, which has the same spatial resolution and orientation as the reference satellite map.
FIG. 1 shows an original photo; scaling and rotation yield the image shown in FIG. 2, whose spatial resolution is approximately the same as that of the reference map and whose azimuth deviation is reduced to within 10 degrees.
Let H denote the flight altitude, f the camera focal length and s the pixel size of the camera sensor; the spatial resolution of image img (the actual ground size represented by one pixel of the photo) is then (H·s)/f, and image img is scaled to the spatial resolution of the reference map to facilitate the subsequent matching. Let θ denote the azimuth of the aircraft; the image is rotated counter-clockwise by θ to reduce the effect of rotation on the matching. After image A is obtained, the coordinates, description vectors and confidences of its key points are extracted using the SuperPoint neural network; the description vectors are 1×256-dimensional with a 2-norm of 1, and the confidences lie between 0 and 1.
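A minimal OpenCV sketch of this scaling and rotation; the altitude, focal length, pixel size, azimuth and file name are illustrative assumptions, while the reference resolution of 0.597 m/pixel is the embodiment's value:

```python
import cv2

H = 500.0        # flight altitude [m], from inertial navigation (assumed)
f = 0.035        # camera focal length [m] (assumed)
s = 17.3e-6      # pixel size of the camera sensor [m] (assumed)
theta = 25.0     # aircraft azimuth [deg], from inertial navigation (assumed)
ref_res = 0.597  # reference-map spatial resolution [m/pixel]

img = cv2.imread("img.png")

gsd = H * s / f        # spatial resolution of img, (H*s)/f [m/pixel]
scale = gsd / ref_res  # factor that brings img to the map resolution
img = cv2.resize(img, None, fx=scale, fy=scale, interpolation=cv2.INTER_AREA)

# rotate counter-clockwise by theta about the image centre
h, w = img.shape[:2]
R = cv2.getRotationMatrix2D((w / 2, h / 2), theta, 1.0)  # positive angle = CCW
img_a = cv2.warpAffine(img, R, (w, h))
```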
S5, the section B of the reference map where the aircraft is located at shooting time t is determined from the inertial navigation information, with its center point (x_t, y_t) being the inertial navigation positioning coordinate. After section B is obtained, the coordinates, description vectors and confidences of the key points lying within it are taken from the reference map stored in step S1 (the description vectors are 1×256-dimensional with a 2-norm of 1; the confidences lie between 0 and 1).
The extent of section B reflects the positioning error of inertial navigation. If the error of the inertially determined position (x_t, y_t) is denoted Δ, section B is the region from (x_t − Δ, y_t − Δ) to (x_t + Δ, y_t + Δ). Δ depends on the performance of the inertial navigation unit, the flight speed of the aircraft and so on, and grows with the accumulated inertial navigation time. In this embodiment, considering the practical situation, Δ is set to 200 m.
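A minimal sketch of this focusing step, converting Δ to map pixels and selecting the stored keypoints inside section B; the inertial fix and map size are assumed example values, and the archive layout follows the S1 sketch above:

```python
import numpy as np

xt, yt = 10240, 8192               # inertial fix (x_t, y_t) in map pixels (assumed)
map_w, map_h = 20000, 20000        # reference-map size in pixels (assumed)
delta_m, ref_res = 200.0, 0.597    # Δ [m] and map resolution [m/pixel]
d = int(round(delta_m / ref_res))  # Δ expressed in pixels

x0, y0 = max(0, xt - d), max(0, yt - d)
x1, y1 = min(map_w, xt + d), min(map_h, yt + d)

# keep only the stored reference-map keypoints that fall inside section B
ref = np.load("reference_map.npz")
kp = ref["keypoints"]              # (N, 2) pixel coordinates
mask = ((kp[:, 0] >= x0) & (kp[:, 0] <= x1) &
        (kp[:, 1] >= y0) & (kp[:, 1] <= y1))
kp_b, desc_b = kp[mask], ref["descriptors"][mask]
```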
S6, the feature points of image A and section B are preliminarily matched using the SuperGlue matching algorithm, mismatched points are removed using the RANSAC algorithm, and finally n matching point pairs (x_i^A, y_i^A) and (x_i^B, y_i^B), i = 1, 2, …, n, are obtained. FIG. 3 shows the matching result of the conventional SIFT algorithm and FIG. 4 the positioning effect of the SuperPoint+SuperGlue combination of the invention; in each figure the satellite map is on the left and the captured photo on the right. The conventional SIFT result is poor because SIFT relies on information such as image gray levels and gray gradients, and the imaging difference between a satellite map and an aerial photo is large: the same area can appear dark in the satellite map and bright in the aerial photo. The SuperPoint+SuperGlue matching effect, by contrast, is very good.
The SuperGlue matching algorithm uses a multi-head attention mechanism to iteratively update the description vectors of the feature points in the two images A and B, then computes the inner product s between feature points and updates s using the Sinkhorn algorithm; the larger the value of s, the closer the description vectors of the two points are and the more likely the two points represent the same location.
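As a rough illustration of this score update, the sketch below applies plain Sinkhorn normalisation to an inner-product score matrix; SuperGlue's actual implementation works in the log domain and adds a dustbin row and column for unmatched points, which this simplified version omits, and the descriptors here are random placeholders:

```python
import numpy as np

def sinkhorn(scores, n_iters=20):
    # Turn a similarity matrix into an (approximately) doubly
    # stochastic assignment matrix by alternating normalisation
    # of exp(scores).
    P = np.exp(scores)
    for _ in range(n_iters):
        P /= P.sum(axis=1, keepdims=True)   # normalise rows
        P /= P.sum(axis=0, keepdims=True)   # normalise columns
    return P

# unit-norm placeholder descriptors standing in for image A / section B
rng = np.random.default_rng(0)
desc_a = rng.standard_normal((100, 256))
desc_a /= np.linalg.norm(desc_a, axis=1, keepdims=True)
desc_b = rng.standard_normal((120, 256))
desc_b /= np.linalg.norm(desc_b, axis=1, keepdims=True)

s = desc_a @ desc_b.T    # inner products between feature points
P = sinkhorn(s)          # updated matching scores
```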
The preliminary matching also uses cross-checking: if point m in section B is the best match for point n in image A, and point n in image A is also the best match for point m in section B, then {n, m} is considered a matching point pair.
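The cross-check itself is a mutual-nearest-neighbour test; a minimal sketch, continuing from the score matrix P of the previous example:

```python
def mutual_nearest_matches(P):
    # P[n, m]: matching score between point n of image A and point m of section B
    best_m = P.argmax(axis=1)   # best match in B for each point of A
    best_n = P.argmax(axis=0)   # best match in A for each point of B
    return [(n, m) for n, m in enumerate(best_m) if best_n[m] == n]

matches = mutual_nearest_matches(P)   # preliminary matching point pairs {n, m}
```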
The matching point pairs obtained in this step still contain many mismatches, which must be removed. Ignoring camera distortion, the two images differ only by translation, rotation and scaling. Given an error threshold e = 4, three matching point pairs are repeatedly sampled and a transformation matrix M is computed from them; every point (x^A, y^A) of image A is transformed to coordinates (x^{A→B}, y^{A→B}), and the number of point pairs for which the error between (x^{A→B}, y^{A→B}) and (x^B, y^B) is less than e is counted. The sample with the largest count yields the final n matching point pairs (x_i^A, y_i^A) and (x_i^B, y_i^B).
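A minimal NumPy/OpenCV sketch of this removal procedure, assuming the preliminary match coordinates are available as float32 arrays (the iteration count is an assumption; the threshold e = 4 and the 3-pair sampling follow the text):

```python
import numpy as np
import cv2

def ransac_filter(pts_a, pts_b, e=4.0, iters=1000, seed=0):
    # pts_a, pts_b: (n, 2) float32 coordinates of the preliminary matches
    rng = np.random.default_rng(seed)
    best = np.zeros(len(pts_a), dtype=bool)
    for _ in range(iters):
        idx = rng.choice(len(pts_a), size=3, replace=False)  # 3 matching pairs
        M = cv2.getAffineTransform(pts_a[idx], pts_b[idx])   # exact 2x3 fit
        proj = pts_a @ M[:, :2].T + M[:, 2]                  # (x_A→B, y_A→B)
        inliers = np.linalg.norm(proj - pts_b, axis=1) < e   # error below e
        if inliers.sum() > best.sum():
            best = inliers
    return best   # mask selecting the final n matching point pairs
```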
S7, if n < 5, the number of matching points is considered too small and the matching poor, and the method returns to step S3; otherwise the matching is considered good and the method proceeds to step S8.
S8, the transformation matrix M from image A to section B is calculated. M is a 2×3 matrix with parameters m_1, …, m_6 and maps a point (x^A, y^A) of image A into section B as [x^B, y^B]^T = M · [x^A, y^A, 1]^T; M is determined from the n matching point pairs.
s9, calculating the center point of the image ACoordinates processed by the transformation matrix MThe aircraft position obtained as a visual navigation.
In this embodiment the spatial resolution of the reference map is 0.597 m and the average error of image matching is 4 pixels, so the error of visual navigation is about 2.4 m.
S10, with the positioning errors of visual navigation and inertial navigation denoted VN and IN respectively, the inertial navigation coordinates (x_t, y_t) and the visual navigation coordinates from step S9 are fused using Kalman filtering to obtain the fused aircraft coordinate point.
To simplify the calculation, in this embodiment the positioning errors of visual navigation and inertial navigation are set to the fixed values VN = 2.4 and IN = 200, and the fused aircraft coordinate point is computed by the Kalman filter from the two position estimates.
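The text does not reproduce the fusion formula itself; below is a minimal sketch under the assumption that it is the standard static Kalman (inverse-variance-weighted) combination of the two fixes, with the fixed errors VN = 2.4 and IN = 200 from this embodiment:

```python
import numpy as np

VN, IN = 2.4, 200.0   # fixed visual / inertial positioning errors [m]

def fuse(p_ins, p_vis, sigma_ins=IN, sigma_vis=VN):
    # Assumed static Kalman update: each position estimate is weighted
    # by the inverse of its error variance.
    w = sigma_ins**2 / (sigma_ins**2 + sigma_vis**2)   # weight of the visual fix
    return w * np.asarray(p_vis) + (1.0 - w) * np.asarray(p_ins)

# With VN = 2.4 and IN = 200 the visual fix dominates (w ≈ 0.99986),
# consistent with the much smaller visual-navigation error of ~2.4 m.
p_fused = fuse(p_ins=(1500.0, 2300.0), p_vis=(1498.2, 2301.1))
```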
FIG. 5 shows the matching effect after the mismatched points are removed; it can be seen that the corresponding points between the two images are matched correctly and the matching effect is very good.
The above embodiments merely illustrate the principles and effects of the present invention together with some practical embodiments. Variations and modifications may be made by those skilled in the art without departing from the inventive concept, and all such variations fall within the scope of the present invention.

Claims (5)

1. An aircraft positioning method, characterized by comprising the following steps:
S1, extracting features from a reference satellite map of the mission area using the SuperPoint neural network to obtain, for each key point, its pixel coordinates (x, y), a 1×256-dimensional description vector with unit 2-norm and a confidence value between 0 and 1, and storing these as the reference map;
S2, calibrating the initial position of the aircraft as (x_0, y_0);
S3, while the aircraft executes its mission, taking a photo straight down at every interval T and recording it as image img;
S4, scaling and rotating the image img according to the altitude and attitude information from inertial navigation to obtain image A, and extracting the coordinates, description vectors and confidences of the key points in image A using the SuperPoint neural network;
S5, determining from the inertial navigation information the section B of the reference map where the aircraft is located at shooting time t, whose center point (x_t, y_t) is the inertial navigation positioning coordinate;
S6, preliminarily matching the feature points of image A and section B using the SuperGlue matching algorithm, removing mismatched points using the RANSAC algorithm, and finally obtaining n matching point pairs (x_i^A, y_i^A) and (x_i^B, y_i^B), i = 1, 2, …, n;
S7, if n < 5, returning to step S3; otherwise proceeding to step S8;
S8, calculating the transformation matrix M from image A to section B;
S9, transforming the center point of image A by the matrix M and taking the resulting coordinates as the aircraft position obtained by visual navigation;
S10, fusing the inertial navigation coordinates (x_t, y_t) and the visual navigation coordinates using Kalman filtering to obtain the fused aircraft coordinate point.
2. The aircraft positioning method according to claim 1, wherein in step S5 the inertial navigation positioning error is denoted Δ, and section B is defined as the region from (x_t − Δ, y_t − Δ) to (x_t + Δ, y_t + Δ).
3. The aircraft positioning method according to claim 2, wherein in step S6 the SuperGlue matching algorithm uses a multi-head attention mechanism to iteratively update the description vectors of the feature points in image A and section B, then computes the inner product s between feature points and updates s using the Sinkhorn algorithm.
4. The aircraft positioning method according to claim 3, wherein the preliminary matching further uses cross-checking: if point m in section B is the best match for point n in image A, and point n in image A is also the best match for point m in section B, then {n, m} is considered a matching point pair.
5. The aircraft positioning method according to claim 4, wherein in step S10 the positioning errors of visual navigation and inertial navigation are denoted VN and IN respectively, and the aircraft coordinate point is obtained from the two position estimates by Kalman filtering.
CN202311042901.2A 2023-08-18 2023-08-18 Aircraft positioning method Pending CN117073669A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311042901.2A CN117073669A (en) 2023-08-18 2023-08-18 Aircraft positioning method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311042901.2A CN117073669A (en) 2023-08-18 2023-08-18 Aircraft positioning method

Publications (1)

Publication Number Publication Date
CN117073669A 2023-11-17

Family

ID=88711016

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311042901.2A Pending CN117073669A (en) 2023-08-18 2023-08-18 Aircraft positioning method

Country Status (1)

Country Link
CN (1) CN117073669A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117274391A (en) * 2023-11-23 2023-12-22 长春通视光电技术股份有限公司 Digital map matching target positioning method based on graphic neural network
CN117274391B (en) * 2023-11-23 2024-02-06 长春通视光电技术股份有限公司 Digital map matching target positioning method based on graphic neural network

Similar Documents

Publication Publication Date Title
CN110966991B (en) Single unmanned aerial vehicle image positioning method without control point
CN109345588B (en) Tag-based six-degree-of-freedom attitude estimation method
US8723953B2 (en) Generation of aerial images
CN108534782B (en) Binocular vision system-based landmark map vehicle instant positioning method
CN112419374B (en) Unmanned aerial vehicle positioning method based on image registration
US9959625B2 (en) Method for fast camera pose refinement for wide area motion imagery
CN107560603B (en) Unmanned aerial vehicle oblique photography measurement system and measurement method
CN111507901B (en) Aerial image splicing and positioning method based on aerial GPS and scale invariant constraint
CN114216454B (en) Unmanned aerial vehicle autonomous navigation positioning method based on heterogeneous image matching in GPS refusing environment
CN114936971A (en) Unmanned aerial vehicle remote sensing multispectral image splicing method and system for water area
CN113624231B (en) Inertial vision integrated navigation positioning method based on heterogeneous image matching and aircraft
CN115187798A (en) Multi-unmanned aerial vehicle high-precision matching positioning method
CN113313659B (en) High-precision image stitching method under multi-machine cooperative constraint
CN111798373A (en) Rapid unmanned aerial vehicle image stitching method based on local plane hypothesis and six-degree-of-freedom pose optimization
CN117073669A (en) Aircraft positioning method
CN112016478A (en) Complex scene identification method and system based on multispectral image fusion
CN111815765A (en) Heterogeneous data fusion-based image three-dimensional reconstruction method
CN114238675A (en) Unmanned aerial vehicle ground target positioning method based on heterogeneous image matching
CN111583342B (en) Target rapid positioning method and device based on binocular vision
JP2023530449A (en) Systems and methods for air and ground alignment
CN113744315A (en) Semi-direct vision odometer based on binocular vision
CN115127554B (en) Unmanned aerial vehicle autonomous navigation method and system based on multi-source vision assistance
Zhang et al. An UAV navigation aided with computer vision
CN113239936A (en) Unmanned aerial vehicle visual navigation method based on deep learning and feature point extraction
Ge et al. A fast mosaicking method for small UAV image sequence using a small number of ground control points

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination