CN102353377A - High altitude long endurance unmanned aerial vehicle integrated navigation system and navigating and positioning method thereof - Google Patents

High altitude long endurance unmanned aerial vehicle integrated navigation system and navigating and positioning method thereof

Info

Publication number
CN102353377A
Authority
CN
China
Prior art keywords
information
unmanned plane
module
processing module
matching
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2011101935879A
Other languages
Chinese (zh)
Other versions
CN102353377B (en)
Inventor
王养柱
赵启兵
肖江阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN201110193587.9A priority Critical patent/CN102353377B/en
Publication of CN102353377A publication Critical patent/CN102353377A/en
Application granted granted Critical
Publication of CN102353377B publication Critical patent/CN102353377B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention provides a high-altitude long-endurance unmanned aerial vehicle (UAV) integrated navigation system and a navigation and positioning method thereof. The integrated navigation system comprises inertial navigation equipment, a scene matching positioning module, an atmospheric sensor, an information fusion processing module, an uplink communication interface and a downlink communication interface. The scene matching positioning module comprises an onboard imaging sensor, a digital map library and an image matching module. The navigation system employs a feature-based fast robust feature matching method to extract and match feature points of the reference image and the real-time image; the extracted feature points are invariant to scale, rotation and brightness and are tolerant to noise to a certain extent, which improves the matching robustness and real-time performance of the system for multi-source remote sensing images. In addition, a statistics-based Mahalanobis distance is introduced, further improving the matching accuracy and yielding high-precision positioning information. A modular processing architecture, in which matching/positioning and information fusion are carried out separately, increases the data processing speed of the system and facilitates the maintenance and debugging of each module.

Description

High-altitude long-endurance (HALE) unmanned aerial vehicle integrated navigation system and navigation and positioning method thereof
Technical field
The present invention relates to a high-altitude long-endurance (HALE) UAV integrated navigation system and a navigation and positioning method thereof, and belongs to the technical field of integrated navigation.
Background art
As an important branch of modern aircraft, UAVs have always played an important role in national defense. With the extensive application of high technology to defense, future warfare will change radically and UAVs will come fully into their own. Their use will expand from single reconnaissance missions to tasks such as reconnaissance, surveillance, early warning, relay communication and attacks on key targets, all of which place higher demands on the accuracy, reliability and adaptability of UAV navigation systems.
At present, UAVs at home and abroad mainly guarantee navigation accuracy by combining inertial navigation with satellite navigation. The best-known satellite navigation system is the GPS of the United States, whose use is controlled by the US military and whose availability cannot be guaranteed in wartime. China's independently developed BeiDou satellite navigation system is still under construction and has not yet formed a complete framework. Considering also that satellite signals are susceptible to electromagnetic interference (EMI), that transmission quality is affected by weather, and that the signals exchanged between the aircraft and the satellites are easily intercepted by an enemy and may expose the target, research on high-precision, anti-jamming, all-weather, autonomous navigation systems has become the focus of current UAV research and the development trend of future UAV navigation systems.
Given the current level of China's inertial navigation equipment, the combination of inertial navigation and scene matching navigation is the best technical option. In this navigation mode, the target scene is acquired by an advanced onboard imaging sensor, image matching yields the position of the UAV, and this position is fused with the measurements of the inertial navigation equipment to finally obtain high-precision navigation parameters for the UAV. Because of technical secrecy, no research on inertial/scene-matching integrated navigation systems has been reported abroad. In China, many universities and research institutes, such as Tsinghua University, Beihang University, Northwestern Polytechnical University and Nanjing University of Aeronautics and Astronautics, are actively carrying out research in this area from different angles and have achieved certain results.
The existing research on inertial/scene-matching integrated navigation systems still has the following shortcomings:
1) A single processor is used, so the information processing capacity is limited. Because its mission requires long-duration high-altitude surveillance flights, a UAV has limited payload capacity and its airborne equipment must be light and consume little power, so most UAVs use only one microprocessor. A single processor has to collect the data of the various sensors and then perform information fusion to obtain the navigation parameters, which reduces the response speed of the system.
2) When processing multi-source remote sensing images, the matching and positioning accuracy is not high. The reference image is usually taken by satellite, while the real-time image is acquired by the onboard imaging sensor; their gray levels differ considerably, and disturbances such as rotation, scale difference and noise also exist between them, so the image matching result contains many mismatches and the matching and positioning accuracy is low.
Summary of the invention
The object of the invention is to overcome the deficiencies of the prior art by proposing a HALE UAV integrated navigation system and a navigation and positioning method thereof, so as to improve the real-time performance, autonomy and accuracy of UAV flight navigation and positioning, reduce the UAV's dependence on GPS, and widen the wartime availability of the UAV.
The present invention provides a HALE UAV integrated navigation system comprising inertial navigation equipment, a scene matching positioning module, an atmospheric sensor, an information fusion processing module, an uplink communication interface and a downlink communication interface.
The inertial navigation equipment comprises a three-axis angular rate gyro and a three-axis accelerometer. The three-axis angular rate gyro is distributed along the three body axes of the UAV, is connected to the information fusion processing module through an amplification and filtering circuit, and measures the attitude information of the UAV, including the pitch, roll and heading angles. The three-axis accelerometer is distributed along the three body axes of the UAV and is connected to the information fusion processing module through an emitter follower; the position of the UAV, including longitude and latitude, is obtained from its output by integration. The three-axis angular rate gyro is a MEMS angular rate gyro, and the three-axis accelerometer is a MEMS accelerometer.
The scene matching positioning module comprises an onboard imaging sensor, a digital map library and an image matching module. The onboard imaging sensor is installed below and forward of the UAV nose and acquires the scene of the target area below the UAV in real time, serving as the real-time image input of the image matching module. The digital map library receives position information from the information fusion processing module and retrieves the map image of the corresponding area, including information such as longitude and latitude, as the reference image input of the image matching module. The image matching module comprises a feature extraction submodule and a matching and positioning submodule; the feature extraction submodule receives the real-time image acquired by the onboard imaging sensor and the reference image retrieved from the digital map library and extracts image feature points from both; the matching and positioning submodule computes navigation parameters from the feature points of the real-time and reference images and obtains the current position of the UAV as an input to the information fusion processing module. The onboard imaging sensor is a multispectral sensor, and the digital map library is a global digital map database obtained by satellite or geodetic survey.
The atmospheric sensor comprises an altimeter and an airspeed sensor (pitot tube). The altimeter and the airspeed sensor are both connected through an A/D converter to the onboard imaging sensor field-of-view calculation submodule of the information fusion processing module.
The information fusion processing module comprises an information filtering submodule and an onboard imaging sensor field-of-view calculation submodule that incorporates the atmospheric sensor information. The information filtering submodule is connected to the three-axis angular rate gyro and the three-axis accelerometer of the inertial navigation equipment, as well as to the image matching module of the scene matching positioning module, the altimeter and the airspeed sensor; it processes the measurements of the altimeter, the airspeed sensor and the inertial navigation equipment together with the position information from the scene matching positioning module. The field-of-view calculation submodule is also connected to the inertial navigation equipment.
The uplink communication interface is connected to the field-of-view calculation submodule of the information fusion processing module; it receives the remote commands of the ground station through a radio receiver and sets the flight state and parameters.
The downlink communication interface is connected to the information filtering submodule of the information fusion processing module and transmits the current attitude, position and altitude of the UAV to the ground station through a radio transmitter, enabling ground operators to monitor the UAV by real-time telemetry.
The present invention also proposes a navigation and positioning method for the HALE UAV integrated navigation system, comprising the following steps:
Step 1: the uplink communication interface receives the remote commands of the ground station through the radio receiver and sends the command information to the field-of-view calculation submodule of the information fusion processing module.
Step 2: the scene matching positioning module performs matching and positioning between remote sensing images, obtains the current position of the UAV, and supplies it to the information fusion processing module.
First, the onboard imaging sensor installed below and forward of the UAV nose acquires the scene of the target area below the UAV in real time as the real-time image input of the image matching module. At the same time, according to the position information of the field-of-view calculation submodule of the information fusion processing module, the image of the corresponding area is retrieved from the digital map library as the reference image input of the image matching module. The real-time image and the reference image are then processed by the feature extraction submodule and the matching and positioning submodule of the image matching module to obtain the current position of the UAV, which is finally sent over the bus to the information filtering submodule of the information fusion processing module.
Step 3: the information filtering submodule of the information fusion processing module performs multi-source information fusion to obtain the full state information of the UAV. It fuses and filters the measurements of the altimeter, the airspeed sensor and the inertial navigation equipment, compares the result with the information received on the uplink communication interface, and adjusts the flight state of the UAV.
Step 4: the downlink communication interface, connected to the information filtering submodule of the information fusion processing module, transmits the current navigation parameters of the UAV, such as position, altitude, velocity and attitude angles, to the ground station through the radio transmitter, enabling ground operators to monitor the UAV by real-time telemetry.
The advantages of the invention are:
1) The HALE UAV integrated navigation system provided by the invention adopts a modular processing architecture in which matching/positioning and information fusion are carried out separately, which increases the data processing speed of the system and facilitates the maintenance and debugging of each module.
2) The system uses a feature-based fast robust feature matching method to extract and match the feature points of the reference image and the real-time image. The extracted feature points are invariant to scale, rotation and brightness and are tolerant to noise to a certain extent, which improves the matching robustness and real-time performance of the system for multi-source remote sensing images. In addition, a statistics-based Mahalanobis distance is introduced, further improving the matching accuracy and yielding high-precision position information.
3) The inertial navigation equipment consists of MEMS sensors; an integrated design places as many components as possible on the same circuit board, giving the advantages of small volume, modularity, light weight and low cost.
Description of the drawings
Fig. 1: structural schematic diagram of the HALE UAV integrated navigation system provided by the invention;
Fig. 2: design flow chart of the image matching module in the invention.
In the figures: 1 - inertial navigation equipment; 2 - scene matching positioning module; 3 - atmospheric sensor; 4 - information fusion processing module; 5 - uplink communication interface; 6 - downlink communication interface; 7 - radio receiver; 8 - radio transmitter;
201 - onboard imaging sensor; 202 - digital map library; 203 - image matching module;
301 - altimeter; 302 - airspeed sensor.
Detailed description of the embodiments
The present invention is described in further detail below with reference to the accompanying drawings.
The present invention provides a HALE UAV integrated navigation system which, as shown in Fig. 1, comprises inertial navigation equipment 1, a scene matching positioning module 2, an atmospheric sensor 3, an information fusion processing module 4, an uplink communication interface 5 and a downlink communication interface 6.
The inertial navigation equipment 1 measures quantities such as the three-axis angular rates and accelerations in the UAV body frame and comprises a three-axis angular rate gyro and a three-axis accelerometer. The three-axis angular rate gyro is distributed along the three body axes of the UAV, is connected to the information fusion processing module 4 through an amplification and filtering circuit, and measures the attitude information of the UAV, including the pitch, roll and heading angles. The three-axis accelerometer is distributed along the three body axes of the UAV and is connected to the information fusion processing module 4 through an emitter follower; the position of the UAV, including longitude and latitude, is obtained from its output by integration. The three-axis angular rate gyro is a MEMS angular rate gyro, and the three-axis accelerometer is a MEMS accelerometer.
The scene matching positioning module 2 comprises an onboard imaging sensor 201, a digital map library 202 and an image matching module 203. The onboard imaging sensor 201 is installed below and forward of the UAV nose and acquires the scene of the target area below the UAV in real time, serving as the real-time image input of the image matching module 203. The digital map library 202 receives position information from the information fusion processing module 4 and retrieves the map image of the corresponding area, including information such as longitude and latitude, as the reference image input of the image matching module 203. The image matching module 203 comprises a feature extraction submodule and a matching and positioning submodule; the feature extraction submodule receives the real-time image acquired by the onboard imaging sensor 201 and the reference image retrieved from the digital map library 202 and extracts image feature points from both; the matching and positioning submodule computes navigation parameters from the feature points of the real-time and reference images and obtains the current position of the UAV as an input to the information fusion processing module 4. The onboard imaging sensor 201 is a multispectral sensor, and the digital map library 202 is a global digital map database obtained by satellite or geodetic survey.
The atmospheric sensor 3 comprises an altimeter 301 and an airspeed sensor 302. The altimeter 301 and the airspeed sensor 302 are both connected through A/D converters to the onboard imaging sensor field-of-view calculation submodule and the information filtering submodule of the information fusion processing module 4. The altimeter 301 measures the flight altitude of the UAV, and the airspeed sensor 302 measures the speed of the UAV relative to the air.
The information fusion processing module 4 comprises two interconnected submodules: an information filtering submodule and an onboard imaging sensor field-of-view calculation submodule that incorporates the information of the atmospheric sensor 3. The information filtering submodule is connected to the inertial navigation equipment 1 (the three-axis angular rate gyro and the three-axis accelerometer), the altimeter 301, the airspeed sensor 302 and the image matching module 203 of the scene matching positioning module 2; it processes the measurements of the altimeter 301, the airspeed sensor 302 and the inertial navigation equipment 1 together with the position information from the scene matching positioning module 2. The field-of-view calculation submodule is connected to the altimeter 301 and the airspeed sensor 302 through A/D converters and performs the attitude computation of the UAV based on the measurements of the inertial navigation equipment 1. It is also connected to the inertial navigation equipment 1, receives its measurements and computes the attitude by the quaternion method, finally obtaining the full attitude information of the UAV, including the pitch, roll and heading angles. The information filtering submodule of the information fusion processing module 4 filters the information of the inertial navigation equipment 1, the scene matching positioning module 2, the altimeter 301 and the airspeed sensor 302 to obtain the current navigation parameters of the UAV, including the attitude angles, position, velocity and altitude. These navigation data are then sent to the ground station through the downlink communication interface 6 and the radio transmitter 8.
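For illustration only: the embodiment names the quaternion method for the attitude computation but gives no formulas. The sketch below, in Python/NumPy, shows a minimal quaternion attitude propagation from three-axis gyro rates followed by extraction of the pitch, roll and heading angles; all function and variable names are illustrative and are not taken from the patent.

```python
import numpy as np

def quat_mult(p, q):
    """Hamilton product of two quaternions given as [w, x, y, z]."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([
        pw*qw - px*qx - py*qy - pz*qz,
        pw*qx + px*qw + py*qz - pz*qy,
        pw*qy - px*qz + py*qw + pz*qx,
        pw*qz + px*qy - py*qx + pz*qw,
    ])

def propagate_attitude(q, omega_body, dt):
    """One integration step of q_dot = 0.5 * q * [0, omega], with omega in rad/s in the body frame."""
    q_dot = 0.5 * quat_mult(q, np.concatenate(([0.0], omega_body)))
    q_new = q + q_dot * dt
    return q_new / np.linalg.norm(q_new)          # renormalize to keep a unit quaternion

def euler_angles(q):
    """Pitch, roll and heading (rad) from a unit quaternion, aerospace ZYX convention."""
    w, x, y, z = q
    roll  = np.arctan2(2*(w*x + y*z), 1 - 2*(x*x + y*y))
    pitch = np.arcsin(np.clip(2*(w*y - z*x), -1.0, 1.0))
    yaw   = np.arctan2(2*(w*z + x*y), 1 - 2*(y*y + z*z))
    return pitch, roll, yaw
```

In such a scheme the gyro rates would be sampled at the IMU rate and the resulting Euler angles passed on as the pitch, roll and heading angles mentioned above.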
The uplink communication interface 5 is connected to the field-of-view calculation submodule of the information fusion processing module 4; it receives the remote commands of the ground station through the radio receiver 7, forwards the received commands to the field-of-view calculation submodule, and sets the flight state and parameters.
The downlink communication interface 6 is connected to the information filtering submodule of the information fusion processing module 4 and transmits the current navigation parameters of the UAV (attitude angles, position, velocity and altitude) to the ground station through the radio transmitter 8, enabling ground operators to monitor the UAV by real-time telemetry.
In the present invention the body frame is an orthogonal coordinate system X-Y-Z fixed to the UAV: the X axis is perpendicular to the UAV plane of symmetry and points to the right; the Y axis lies in the plane of symmetry and points forward from the center of mass along the direction of motion; the Z axis lies in the plane of symmetry, perpendicular to the Y axis, and points upward. Through the corresponding computations of the information fusion processing module 4, the full state of the UAV, such as its current attitude and position, is calculated. The onboard imaging sensor 201 installed below and forward of the UAV nose photographs the target scene in real time to obtain the real-time image; at the same time, using the position computed by the information fusion processing module 4, the image of the corresponding area is retrieved from the digital map library 202 as the reference image. The real-time and reference images are fed to the feature extraction submodule of the image matching module 203, which extracts feature points that are invariant to the rotation, scale difference and noise between the two images; the matching and positioning submodule of the image matching module 203 then obtains the transformation parameters of the current aerial real-time image relative to the reference image, and from them the current position of the UAV, such as longitude and latitude.
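For illustration only: the patent states that the matching and positioning submodule converts the transformation parameters of the real-time image relative to the reference image into longitude and latitude, but does not give the conversion. The sketch below assumes the reference map is georeferenced by the coordinates of its top-left pixel and a constant angular resolution per pixel; the dictionary keys and the linear mapping are assumptions, not the patent's method.

```python
def pixel_to_lonlat(col, row, map_meta):
    """Map a matched pixel position in the reference image to longitude/latitude.

    map_meta is assumed to hold the geographic coordinates of the top-left pixel
    ("lon0", "lat0") and the per-pixel spacing in degrees; real map libraries may
    use other georeferencing conventions.
    """
    lon = map_meta["lon0"] + col * map_meta["deg_per_px_x"]
    lat = map_meta["lat0"] - row * map_meta["deg_per_px_y"]   # image rows grow southwards
    return lon, lat
```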
When designing the image matching module 203 of the scene matching positioning module 2, it was taken into account that the real-time image input to the image matching module 203 is acquired by the onboard imaging sensor 201 while the reference image is obtained by satellite, so that, unlike homologous remote sensing images, a large gray-level difference exists between the two. Image matching is therefore performed with a fast robust feature matching method based on a statistics-based Mahalanobis distance. The image matching method, shown in Fig. 2, comprises the following steps:
(1) feature extraction
First, the reference image and the real-time image are converted to grayscale, turning the color remote sensing images into gray-level images and simplifying the subsequent image matching. The integral image is then formed by traversing the gray-level image pixel by pixel once according to the following formula.
$I_{\Sigma}(x,y)=\sum_{i=0}^{x}\sum_{j=0}^{y} I(i,j)$    (1)
where I(i, j) denotes the pixel value at coordinates (i, j) of the gray-level image and I_Σ(x, y) denotes the value at coordinates (x, y) of the integral image.
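As a small illustration of formula (1), the sketch below builds the integral image with NumPy and shows how it gives constant-time sums over rectangular regions, which is what makes the box filters used below cheap; the function names are illustrative.

```python
import numpy as np

def integral_image(gray: np.ndarray) -> np.ndarray:
    """Integral image per formula (1): I_sum(x, y) = sum of I(i, j) for all i <= x, j <= y."""
    return np.cumsum(np.cumsum(gray.astype(np.float64), axis=0), axis=1)

def box_sum(ii: np.ndarray, top: int, left: int, bottom: int, right: int) -> float:
    """Sum of gray values inside a rectangle (inclusive bounds), in O(1) from the integral image."""
    a = ii[top - 1, left - 1] if top > 0 and left > 0 else 0.0
    b = ii[top - 1, right] if top > 0 else 0.0
    c = ii[bottom, left - 1] if left > 0 else 0.0
    return float(ii[bottom, right] - b - c + a)
```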
The Hessian matrix is then used for feature detection, and box filters replace the Gaussian filters to increase the detection speed. Denoting the values obtained by convolving the box filters with the image by D_xx, D_xy and D_yy, the Hessian matrix is expressed as:
$H=\begin{bmatrix} D_{xx} & D_{xy} \\ D_{xy} & D_{yy} \end{bmatrix}$    (2)
where D_xx is the convolution of the Gaussian second-order derivative $\partial^2 g/\partial x^2$ with the gray-level image I(i, j), D_xy is the convolution of $\partial^2 g/\partial x\,\partial y$ with I(i, j), and D_yy is the convolution of $\partial^2 g/\partial y^2$ with I(i, j). Traversing the gray-level image, the determinant det(H) of the Hessian matrix at point (x, y) is obtained as:
$\det(H)=D_{xx}D_{yy}-(0.9\,D_{xy})^2$    (3)
When det(H) > 0, the corresponding pixel (x, y) is regarded as an extremum candidate; if det(H) > 0 is not satisfied, the traversal of the gray-level image continues.
A scale space is then built and extrema are sought at different scales; the scale space is realized by changing the size of the box filters. Finally, non-maximum suppression is applied to the candidate extrema, and the remaining local extrema are taken as the feature points.
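For illustration only: a minimal Python sketch of the Hessian-determinant detection of formulas (2)-(3). The patent replaces the Gaussian second derivatives with box filters evaluated on the integral image for speed; for brevity this sketch computes the exact Gaussian derivatives with SciPy and keeps the 0.9 weighting of formula (3). The threshold value is data-dependent and purely illustrative, and the single-scale loop stands in for the full scale-space search.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def hessian_keypoints(gray: np.ndarray, sigma: float = 1.2, threshold: float = 1e3):
    """Candidate feature points from det(H) = Dxx*Dyy - (0.9*Dxy)^2 at a single scale."""
    g = gray.astype(np.float64)
    d_xx = gaussian_filter(g, sigma, order=(0, 2))   # second derivative along x (columns)
    d_yy = gaussian_filter(g, sigma, order=(2, 0))   # second derivative along y (rows)
    d_xy = gaussian_filter(g, sigma, order=(1, 1))   # mixed derivative
    det_h = d_xx * d_yy - (0.9 * d_xy) ** 2          # formula (3)

    # the patent keeps points with det(H) > 0; a positive threshold and a 3x3
    # non-maximum suppression are added here to limit the candidates
    keypoints = []
    for y in range(1, det_h.shape[0] - 1):
        for x in range(1, det_h.shape[1] - 1):
            v = det_h[y, x]
            if v > threshold and v == det_h[y - 1:y + 2, x - 1:x + 2].max():
                keypoints.append((x, y))
    return keypoints
```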
(2) Feature descriptor generation
First, the principal orientation of the feature point is determined to guarantee rotation invariance. Within the neighborhood of the feature point, the Haar wavelet responses in the x and y directions are computed and weighted with Gaussian weights, so that responses close to the feature point contribute most. The responses within a sector of a given angle (usually chosen in the range 30°-90°, preferably 60°) are summed to form a new vector; the whole circular neighborhood is traversed, and the direction of the longest vector is taken as the principal orientation of the feature point.
Then, with the feature point as center, the coordinate axes are rotated to the principal orientation of the feature point. A square region with a side length of 15s-25s (preferably 20s, where s is the scale at which the feature point was detected) is chosen and divided into n subregions. In each subregion, the Haar wavelet responses parallel and perpendicular to the principal orientation of the feature point are computed and denoted d_x and d_y. Each subregion therefore yields a 4-dimensional vector v = (Σd_x, Σd_y, Σ|d_x|, Σ|d_y|), so that each feature point is represented by a 4n-dimensional feature vector.
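For illustration only: a compact sketch of the subregion descriptor just described. The Haar wavelet responses are approximated by plain horizontal and vertical pixel differences, rotation to the principal orientation is omitted, and the roughly 20s window and the 4x4 subregion grid are only illustrative defaults; the feature point is assumed to lie far enough from the image border for the window to fit.

```python
import numpy as np

def describe(gray: np.ndarray, x: int, y: int, s: float, grid: int = 4) -> np.ndarray:
    """4n-dimensional descriptor: (sum dx, sum dy, sum |dx|, sum |dy|) per subregion,
    where x, y are the column and row of the feature point and s its scale."""
    half = int(round(10 * s))                      # half of the ~20s window preferred in the text
    patch = gray.astype(np.float64)[y - half:y + half, x - half:x + half]
    dx = patch[:, 1:] - patch[:, :-1]              # crude horizontal Haar-like response
    dy = patch[1:, :] - patch[:-1, :]              # crude vertical Haar-like response
    h, w = min(dx.shape[0], dy.shape[0]), min(dx.shape[1], dy.shape[1])
    dx, dy = dx[:h, :w], dy[:h, :w]

    vec, step_y, step_x = [], h // grid, w // grid
    for i in range(grid):
        for j in range(grid):
            sx = dx[i * step_y:(i + 1) * step_y, j * step_x:(j + 1) * step_x]
            sy = dy[i * step_y:(i + 1) * step_y, j * step_x:(j + 1) * step_x]
            vec += [sx.sum(), sy.sum(), np.abs(sx).sum(), np.abs(sy).sum()]
    return np.array(vec)                           # length 4 * grid * grid, i.e. the 4n-dim vector
```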
(3) Feature vector matching
The commonly used nearest-neighbor ratio matching method based on the Euclidean distance is adopted. This method, however, uses only the local information of the feature points and does not consider the geometric distribution of the feature points.
At the matching stage, the present invention adds a statistics-based Mahalanobis distance, which fully takes the geometric distribution of the feature points into account and improves the matching accuracy between image feature points.
For a sample space $Z=\{(x_1,y_1)^T,\ldots,(x_n,y_n)^T\}$ composed of n feature point samples, the Mahalanobis distance MD_i of an arbitrary sample point $z_i=(x_i,y_i)^T$ to the sample mean $\mu=(\mu_x,\mu_y)^T$ is defined as:
$MD_i=(z_i-\mu)^T C_z^{-1}(z_i-\mu)$    (4)
$\mu=(\mu_x,\mu_y)^T=\frac{1}{n}\left(\sum_{i=1}^{n}x_i,\ \sum_{i=1}^{n}y_i\right)^T, \qquad C_z=\frac{1}{n}\sum_{i=1}^{n}\begin{pmatrix}x_i-\mu_x\\ y_i-\mu_y\end{pmatrix}\bigl(x_i-\mu_x,\ y_i-\mu_y\bigr)$
where C_z is the covariance matrix and $C_z^{-1}$ is its inverse.
The steps for screening matched point pairs with the Mahalanobis distance are as follows:
1. Compute the Mahalanobis distances of corresponding points in the two matched point sets (the sets formed by the feature points on the reference image and on the real-time image, respectively) and take their difference Dist_i;
2. Find the maximum value among the differences Dist_i, denoted DistMax;
3. Delete the matched point pairs that satisfy the following condition:
$Dist_i > k \cdot DistMax$
The threshold k ranges from 0.0001 to 0.1. This completes the feature vector matching.
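A sketch of screening steps 1-3 in Python/NumPy, assuming the candidate pairs produced by the nearest-neighbor ratio matching are given as two equal-length coordinate arrays (the ratio matching itself is not shown; all names are illustrative).

```python
import numpy as np

def mahalanobis_distances(pts: np.ndarray) -> np.ndarray:
    """MD_i of each point to the sample mean, per formula (4); pts has shape (n, 2)."""
    mu = pts.mean(axis=0)
    diff = pts - mu
    c_z = diff.T @ diff / len(pts)                 # covariance matrix C_z
    c_inv = np.linalg.inv(c_z)
    return np.einsum('ij,jk,ik->i', diff, c_inv, diff)

def screen_matches(ref_pts: np.ndarray, live_pts: np.ndarray, k: float = 0.1) -> np.ndarray:
    """Indices of the pairs kept, i.e. those whose difference satisfies Dist_i <= k * DistMax."""
    dist = np.abs(mahalanobis_distances(ref_pts) - mahalanobis_distances(live_pts))  # step 1
    dist_max = dist.max()                                                            # step 2
    return np.flatnonzero(dist <= k * dist_max)                                      # step 3
```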
The present invention also proposes a navigation and positioning method for the HALE UAV integrated navigation system, comprising the following steps:
Step 1: the uplink communication interface 5 receives the remote commands of the ground station through the radio receiver 7 and sends the command information to the field-of-view calculation submodule of the information fusion processing module 4.
Step 2: the scene matching positioning module 2 performs matching and positioning between remote sensing images, obtains the current position of the UAV, and supplies it to the information fusion processing module 4.
First, the onboard imaging sensor 201 installed below and forward of the UAV nose acquires the scene of the target area below the UAV in real time as the real-time image input of the image matching module 203. At the same time, according to the position information of the field-of-view calculation submodule of the information fusion processing module 4, the image of the corresponding area is retrieved from the digital map library 202 as the reference image input of the image matching module 203. The real-time image and the reference image are then processed by the feature extraction submodule and the matching and positioning submodule of the image matching module 203 to obtain the current position of the UAV, which is finally sent over the bus to the information filtering submodule of the information fusion processing module 4.
Step 3: the information filtering submodule of the information fusion processing module 4 performs multi-source information fusion to obtain the full state information of the UAV. It fuses and filters the measurements of the altimeter 301, the airspeed sensor 302 and the inertial navigation equipment 1, compares the result with the information received on the uplink communication interface 5, and adjusts the flight state of the UAV.
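For illustration only: step 3 specifies fused filtering of the inertial, altimeter, airspeed and scene matching information but does not give the filter equations. The sketch below shows one Kalman-style measurement update in which a scene matching position fix corrects the inertially propagated position; the two-state position model, the covariance matrices and all names are assumptions, not the patent's filter.

```python
import numpy as np

def fuse_position(ins_pos: np.ndarray, ins_cov: np.ndarray,
                  scene_fix: np.ndarray, fix_cov: np.ndarray):
    """One measurement update: correct the INS-propagated position (e.g. latitude, longitude)
    with a scene matching fix of the same two components."""
    k_gain = ins_cov @ np.linalg.inv(ins_cov + fix_cov)     # Kalman gain
    fused_pos = ins_pos + k_gain @ (scene_fix - ins_pos)    # corrected position
    fused_cov = (np.eye(2) - k_gain) @ ins_cov              # updated covariance
    return fused_pos, fused_cov
```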
Step 4: downgoing communication interface 6 is connected with the information filter submodule of information fusion processing module 4; The position that unmanned plane is current, highly, navigational parameter information such as speed and attitude angle send land station to through wireless launcher 8, are convenient to the real-time telemetry monitoring of ground controller to unmanned plane.

Claims (5)

1. A high-altitude long-endurance (HALE) UAV integrated navigation system, characterized in that it comprises inertial navigation equipment, a scene matching positioning module, an atmospheric sensor, an information fusion processing module, an uplink communication interface and a downlink communication interface;
the inertial navigation equipment measures the three-axis angular rates and accelerations in the UAV body frame and comprises a three-axis angular rate gyro and a three-axis accelerometer; the three-axis angular rate gyro is connected to the information fusion processing module through an amplification and filtering circuit and measures the attitude information of the UAV; the three-axis accelerometer is connected to the information fusion processing module through an emitter follower, and the position of the UAV is obtained from its output by integration;
the scene matching positioning module comprises an onboard imaging sensor, a digital map library and an image matching module; the onboard imaging sensor is installed below and forward of the UAV nose and acquires the scene of the target area below the UAV in real time as the real-time image input of the image matching module; the digital map library receives position information from the information fusion processing module and retrieves the map image of the corresponding area as the reference image input of the image matching module; the image matching module comprises a feature extraction submodule and a matching and positioning submodule; the feature extraction submodule receives the real-time image acquired by the onboard imaging sensor and the reference image retrieved from the digital map library and extracts image feature points from the real-time image and the reference image; the matching and positioning submodule computes navigation parameters from the feature points of the real-time and reference images and obtains the current position of the UAV as an input to the information fusion processing module;
the atmospheric sensor comprises an altimeter and an airspeed sensor, both connected to the information fusion processing module through an A/D converter; the altimeter measures the flight altitude of the UAV, and the airspeed sensor measures the speed of the UAV relative to the air;
the information fusion processing module comprises an information filtering submodule and an onboard imaging sensor field-of-view calculation submodule that incorporates the atmospheric sensor information; the information filtering submodule is connected to the three-axis angular rate gyro, the three-axis accelerometer, the altimeter, the airspeed sensor and the image matching module of the scene matching positioning module; it processes the measurements of the altimeter, the airspeed sensor and the inertial navigation equipment together with the position information from the scene matching positioning module to obtain the current navigation parameters of the UAV; the field-of-view calculation submodule is connected to the altimeter and the airspeed sensor through A/D converters and performs the attitude computation of the UAV based on the measurements of the inertial navigation equipment; the field-of-view calculation submodule is also connected to the inertial navigation equipment, receives its measurements and computes the full attitude information of the UAV;
the uplink communication interface is connected to the field-of-view calculation submodule of the information fusion processing module; it receives the remote commands of the ground station through a radio receiver and forwards the received commands to the field-of-view calculation submodule of the information fusion processing module;
the downlink communication interface is connected to the information filtering submodule of the information fusion processing module and transmits the current navigation parameters of the UAV to the ground station through a radio transmitter.
2. The HALE UAV integrated navigation system according to claim 1, characterized in that: the scene matching positioning module performs image matching with a fast robust feature matching method based on a statistics-based Mahalanobis distance, the image matching method comprising the following steps:
(1) feature extraction
First, the reference image and the real-time image are converted to grayscale, turning the color remote sensing images into gray-level images; the integral image is then formed by traversing the gray-level image pixel by pixel once according to the following formula;
$I_{\Sigma}(x,y)=\sum_{i=0}^{x}\sum_{j=0}^{y} I(i,j)$
where I(i, j) denotes the pixel value at coordinates (i, j) of the gray-level image and I_Σ(x, y) denotes the value at coordinates (x, y) of the integral image;
the Hessian matrix is then used for feature detection, with box filters replacing the Gaussian filters; denoting the values obtained by convolving the box filters with the image by D_xx, D_xy and D_yy, the Hessian matrix is expressed as:
$H=\begin{bmatrix} D_{xx} & D_{xy} \\ D_{xy} & D_{yy} \end{bmatrix}$
where D_xx is the convolution of the Gaussian second-order derivative $\partial^2 g/\partial x^2$ with the gray-level image I(i, j), D_xy is the convolution of $\partial^2 g/\partial x\,\partial y$ with I(i, j), and D_yy is the convolution of $\partial^2 g/\partial y^2$ with I(i, j); the corresponding determinant det(H) of the Hessian matrix is obtained as:
$\det(H)=D_{xx}D_{yy}-(0.9\,D_{xy})^2$
when det(H) > 0, the corresponding pixel (x, y) is regarded as an extremum candidate; if det(H) > 0 is not satisfied, the traversal of the gray-level image continues; a scale space is built and extrema are sought at different scales; finally, non-maximum suppression is applied to the candidate extrema, and the remaining local extrema are taken as the feature points;
(2) Feature descriptor generation
First, the principal orientation of the feature point is determined to guarantee rotation invariance; within the neighborhood of the feature point, the Haar wavelet responses in the x and y directions are computed and weighted with Gaussian weights; the responses within a sector of 30°-90° are summed to form a new vector; the whole circular neighborhood is traversed, and the direction of the longest vector is taken as the principal orientation of the feature point;
then, with the feature point as center, the coordinate axes are rotated to the principal orientation of the feature point; a square region with a side length of 15s-25s is chosen, where s is the scale at which the feature point was detected, and is divided into n subregions; in each subregion the Haar wavelet responses parallel and perpendicular to the principal orientation of the feature point are computed, denoted d_x and d_y, so that each subregion yields a 4-dimensional vector v = (Σd_x, Σd_y, Σ|d_x|, Σ|d_y|) and each feature point is represented by a 4n-dimensional feature vector;
(3) Feature vector matching
for a sample space $Z=\{(x_1,y_1)^T,\ldots,(x_n,y_n)^T\}$ composed of n feature point samples, the Mahalanobis distance MD_i of an arbitrary sample point $z_i=(x_i,y_i)^T$ to the sample mean $\mu=(\mu_x,\mu_y)^T$ is defined as:
$MD_i=(z_i-\mu)^T C_z^{-1}(z_i-\mu)$
$\mu=(\mu_x,\mu_y)^T=\frac{1}{n}\left(\sum_{i=1}^{n}x_i,\ \sum_{i=1}^{n}y_i\right)^T, \qquad C_z=\frac{1}{n}\sum_{i=1}^{n}\begin{pmatrix}x_i-\mu_x\\ y_i-\mu_y\end{pmatrix}\bigl(x_i-\mu_x,\ y_i-\mu_y\bigr)$
where C_z is the covariance matrix and $C_z^{-1}$ is its inverse;
1. Compute the Mahalanobis distances of corresponding points in the two matched point sets and take their difference Dist_i;
2. Find the maximum value among the differences Dist_i, denoted DistMax;
3. Delete the matched point pairs that satisfy the following condition:
$Dist_i > k \cdot DistMax$
where k is a threshold; this completes the feature vector matching.
3. The HALE UAV integrated navigation system according to claim 1, characterized in that: the three-axis angular rate gyro is a MEMS angular rate gyro, and the three-axis accelerometer is a MEMS accelerometer.
4. The HALE UAV integrated navigation system according to claim 1, characterized in that: the onboard imaging sensor is a multispectral sensor, and the digital map library is a global digital map database obtained by satellite or geodetic survey.
5. A navigation and positioning method for a HALE UAV integrated navigation system, characterized in that it comprises the following steps:
Step 1: the uplink communication interface receives the remote commands of the ground station through the radio receiver and sends the command information to the field-of-view calculation submodule of the information fusion processing module;
Step 2: the onboard imaging sensor installed below and forward of the UAV nose first acquires the scene of the target area below the UAV in real time as the real-time image input of the image matching module; at the same time, according to the position information of the field-of-view calculation submodule of the information fusion processing module, the image of the corresponding area is retrieved from the digital map library as the reference image input of the image matching module; the real-time image and the reference image are then processed by the feature extraction submodule and the matching and positioning submodule of the image matching module to obtain the position of the UAV, which is finally sent over the bus to the information filtering submodule of the information fusion processing module;
Step 3: the information filtering submodule of the information fusion processing module performs multi-source information fusion to obtain the full state information of the UAV; it fuses and filters the measurements of the altimeter, the airspeed sensor and the inertial navigation equipment, compares the result with the information received on the uplink communication interface, and adjusts the flight state of the UAV;
Step 4: the downlink communication interface is connected to the information filtering submodule of the information fusion processing module and transmits the UAV position information to the ground station through the radio transmitter.
CN201110193587.9A 2011-07-12 2011-07-12 High altitude long endurance unmanned aerial vehicle integrated navigation system and navigating and positioning method thereof Expired - Fee Related CN102353377B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201110193587.9A CN102353377B (en) 2011-07-12 2011-07-12 High altitude long endurance unmanned aerial vehicle integrated navigation system and navigating and positioning method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201110193587.9A CN102353377B (en) 2011-07-12 2011-07-12 High altitude long endurance unmanned aerial vehicle integrated navigation system and navigating and positioning method thereof

Publications (2)

Publication Number Publication Date
CN102353377A true CN102353377A (en) 2012-02-15
CN102353377B CN102353377B (en) 2014-01-22

Family

ID=45576994

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201110193587.9A Expired - Fee Related CN102353377B (en) 2011-07-12 2011-07-12 High altitude long endurance unmanned aerial vehicle integrated navigation system and navigating and positioning method thereof

Country Status (1)

Country Link
CN (1) CN102353377B (en)

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102637040A (en) * 2012-04-23 2012-08-15 清华大学 Unmanned aerial vehicle cluster visual navigation task coordination method and system
CN103175524A (en) * 2013-02-20 2013-06-26 清华大学 Visual-sense-based aircraft position and attitude determination method under mark-free environment
CN103322999A (en) * 2013-05-24 2013-09-25 哈尔滨工程大学 History state reserved information filtering algorithm suitable for multi-boat navigation
CN103697889A (en) * 2013-12-29 2014-04-02 北京航空航天大学 Unmanned aerial vehicle self-navigation and positioning method based on multi-model distributed filtration
CN104159031A (en) * 2014-08-19 2014-11-19 湖北易瓦特科技有限公司 Method and equipment of locating and tracking target object
CN104154910A (en) * 2014-07-22 2014-11-19 清华大学 Indoor micro unmanned aerial vehicle location method
CN104581072A (en) * 2015-01-05 2015-04-29 惠州市加迈电器有限公司 Night shooting equipment
CN104590554A (en) * 2015-01-05 2015-05-06 惠州市加迈电器有限公司 Path-finding lighting equipment
CN103162687B (en) * 2013-03-07 2015-11-18 中国人民解放军国防科学技术大学 Based on the image/inertial navigation combination navigation method of information credibility
CN106444828A (en) * 2016-09-14 2017-02-22 芜湖扬展新材料科技服务有限公司 Small unmanned aerial vehicle ground station system based on LabView
CN106468547A (en) * 2015-08-17 2017-03-01 波音公司 Utilize multiple optical pickocffs is independent of global positioning system for self-conductance aircraft(“GPS”)Navigation system
CN106708075A (en) * 2016-12-30 2017-05-24 浙江大学 Long range oilseed rape field SPAD value remote sensing system and acquisition method based on fixed wing unmanned plane
CN106998447A (en) * 2017-03-31 2017-08-01 大庆安瑞达科技开发有限公司 Wide area, oil field infrared panorama imaging radar scout command and control system
CN108027248A (en) * 2015-09-04 2018-05-11 克朗设备公司 The industrial vehicle of positioning and navigation with feature based
CN108106635A (en) * 2017-12-15 2018-06-01 中国船舶重工集团公司第七0七研究所 Inertia defends the anti-interference posture course calibration method of long endurance for leading integrated navigation system
CN108569412A (en) * 2018-03-15 2018-09-25 徐琳 Unmanned plane during flying ability self-test platform
CN109533327A (en) * 2018-03-15 2019-03-29 徐琳 Unmanned plane during flying ability self checking method
CN110411532A (en) * 2012-04-13 2019-11-05 外托尼克斯有限公司 Movable property data logger and transmitter
CN111492326A (en) * 2017-12-21 2020-08-04 Wing航空有限责任公司 Image-based positioning for unmanned aerial vehicles and related systems and methods
CN113418527A (en) * 2021-06-15 2021-09-21 西安微电子技术研究所 Strong real-time double-structure continuous scene fusion matching navigation positioning method and system
CN113432594A (en) * 2021-07-05 2021-09-24 北京鑫海宜科技有限公司 Unmanned aerial vehicle automatic navigation system based on map and environment
CN114111795A (en) * 2021-11-24 2022-03-01 航天神舟飞行器有限公司 Unmanned aerial vehicle self-navigation based on terrain matching
CN114964209A (en) * 2022-05-13 2022-08-30 天健极光(北京)科技发展有限公司 Long-endurance unmanned aerial vehicle autonomous navigation method and system based on infrared array imaging

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101033973A (en) * 2007-04-10 2007-09-12 南京航空航天大学 Attitude determination method of mini-aircraft inertial integrated navigation system
CN101046387A (en) * 2006-08-07 2007-10-03 南京航空航天大学 Scene matching method for raising navigation precision and simulating combined navigation system
CN101046385A (en) * 2007-04-20 2007-10-03 北京航空航天大学 Method of realizing combined navigation system structure for aviation
CN101246012A (en) * 2008-03-03 2008-08-20 北京航空航天大学 Combinated navigation method based on robust dissipation filtering
CN101270993A (en) * 2007-12-12 2008-09-24 北京航空航天大学 Remote high-precision independent combined navigation locating method
WO2009098154A1 (en) * 2008-02-04 2009-08-13 Tele Atlas North America Inc. Method for map matching with sensor detected objects
CN101520328A (en) * 2009-04-01 2009-09-02 西北工业大学 Method for autonomous navigation using geomagnetic field line map
CN101598556A (en) * 2009-07-15 2009-12-09 北京航空航天大学 Unmanned plane vision/inertia integrated navigation method under a kind of circumstances not known
CN101858748A (en) * 2010-05-28 2010-10-13 南京航空航天大学 Fault-tolerance autonomous navigation method of multi-sensor of high-altitude long-endurance unmanned plane
US20100283832A1 (en) * 2004-08-04 2010-11-11 American Gnc Corporation Miniaturized GPS/MEMS IMU integrated board
CN102052925A (en) * 2010-12-16 2011-05-11 西北工业大学 Adaptive area scene matching method based on spatial relationship constraint

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100283832A1 (en) * 2004-08-04 2010-11-11 American Gnc Corporation Miniaturized GPS/MEMS IMU integrated board
CN101046387A (en) * 2006-08-07 2007-10-03 南京航空航天大学 Scene matching method for raising navigation precision and simulating combined navigation system
CN101033973A (en) * 2007-04-10 2007-09-12 南京航空航天大学 Attitude determination method of mini-aircraft inertial integrated navigation system
CN101046385A (en) * 2007-04-20 2007-10-03 北京航空航天大学 Method of realizing combined navigation system structure for aviation
CN101270993A (en) * 2007-12-12 2008-09-24 北京航空航天大学 Remote high-precision independent combined navigation locating method
WO2009098154A1 (en) * 2008-02-04 2009-08-13 Tele Atlas North America Inc. Method for map matching with sensor detected objects
CN101246012A (en) * 2008-03-03 2008-08-20 北京航空航天大学 Combinated navigation method based on robust dissipation filtering
CN101520328A (en) * 2009-04-01 2009-09-02 西北工业大学 Method for autonomous navigation using geomagnetic field line map
CN101598556A (en) * 2009-07-15 2009-12-09 北京航空航天大学 Unmanned plane vision/inertia integrated navigation method under a kind of circumstances not known
CN101858748A (en) * 2010-05-28 2010-10-13 南京航空航天大学 Fault-tolerance autonomous navigation method of multi-sensor of high-altitude long-endurance unmanned plane
CN102052925A (en) * 2010-12-16 2011-05-11 西北工业大学 Adaptive area scene matching method based on spatial relationship constraint

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Xu Chao et al.: "A UAV visual navigation method and an improvement of its filtering algorithm", Journal of Beijing University of Aeronautics and Astronautics *
Cao Juanjuan et al.: "Research and application of a low-cost multi-sensor integrated navigation system in the autonomous flight of small UAVs", Acta Aeronautica et Astronautica Sinica *
Wang Xinlong et al.: "High-precision autonomous positioning method for high-altitude long-endurance UAVs", Acta Aeronautica et Astronautica Sinica *

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110411532B (en) * 2012-04-13 2021-04-27 外托尼克斯有限公司 Mobile property data recorder and transmitter
CN110411532A (en) * 2012-04-13 2019-11-05 外托尼克斯有限公司 Movable property data logger and transmitter
CN102637040A (en) * 2012-04-23 2012-08-15 清华大学 Unmanned aerial vehicle cluster visual navigation task coordination method and system
CN103175524A (en) * 2013-02-20 2013-06-26 清华大学 Visual-sense-based aircraft position and attitude determination method under mark-free environment
CN103175524B (en) * 2013-02-20 2015-11-25 清华大学 A kind of position of aircraft without view-based access control model under marking environment and attitude determination method
CN103162687B (en) * 2013-03-07 2015-11-18 中国人民解放军国防科学技术大学 Based on the image/inertial navigation combination navigation method of information credibility
CN103322999A (en) * 2013-05-24 2013-09-25 哈尔滨工程大学 History state reserved information filtering algorithm suitable for multi-boat navigation
CN103697889A (en) * 2013-12-29 2014-04-02 北京航空航天大学 Unmanned aerial vehicle self-navigation and positioning method based on multi-model distributed filtration
CN103697889B (en) * 2013-12-29 2016-05-25 北京航空航天大学 A kind of unmanned plane independent navigation and localization method based on multi-model Distributed filtering
CN104154910A (en) * 2014-07-22 2014-11-19 清华大学 Indoor micro unmanned aerial vehicle location method
CN104159031A (en) * 2014-08-19 2014-11-19 湖北易瓦特科技有限公司 Method and equipment of locating and tracking target object
CN104581072B (en) * 2015-01-05 2018-10-02 嘉兴欧祥通讯设备有限公司 Night capture apparatus
CN104581072A (en) * 2015-01-05 2015-04-29 惠州市加迈电器有限公司 Night shooting equipment
CN104590554A (en) * 2015-01-05 2015-05-06 惠州市加迈电器有限公司 Path-finding lighting equipment
CN106468547A (en) * 2015-08-17 2017-03-01 波音公司 Utilize multiple optical pickocffs is independent of global positioning system for self-conductance aircraft(“GPS”)Navigation system
CN108027248A (en) * 2015-09-04 2018-05-11 克朗设备公司 The industrial vehicle of positioning and navigation with feature based
CN106444828A (en) * 2016-09-14 2017-02-22 芜湖扬展新材料科技服务有限公司 Small unmanned aerial vehicle ground station system based on LabView
CN106708075A (en) * 2016-12-30 2017-05-24 浙江大学 Long range oilseed rape field SPAD value remote sensing system and acquisition method based on fixed wing unmanned plane
CN106708075B (en) * 2016-12-30 2020-01-07 浙江大学 Wide-range rape field SPAD value remote sensing system based on fixed-wing unmanned aerial vehicle and acquisition method
CN106998447A (en) * 2017-03-31 2017-08-01 大庆安瑞达科技开发有限公司 Wide area, oil field infrared panorama imaging radar scout command and control system
CN108106635A (en) * 2017-12-15 2018-06-01 中国船舶重工集团公司第七0七研究所 Inertia defends the anti-interference posture course calibration method of long endurance for leading integrated navigation system
CN111492326A (en) * 2017-12-21 2020-08-04 Wing航空有限责任公司 Image-based positioning for unmanned aerial vehicles and related systems and methods
CN111492326B (en) * 2017-12-21 2024-04-19 Wing航空有限责任公司 Image-based positioning for unmanned aerial vehicles and related systems and methods
CN109533327B (en) * 2018-03-15 2020-04-10 拓航科技有限公司 Unmanned aerial vehicle flight capability self-checking method
CN109533327A (en) * 2018-03-15 2019-03-29 徐琳 Unmanned plane during flying ability self checking method
CN108569412A (en) * 2018-03-15 2018-09-25 徐琳 Unmanned plane during flying ability self-test platform
CN113418527A (en) * 2021-06-15 2021-09-21 西安微电子技术研究所 Strong real-time double-structure continuous scene fusion matching navigation positioning method and system
CN113418527B (en) * 2021-06-15 2022-11-29 西安微电子技术研究所 Strong real-time double-structure continuous scene fusion matching navigation positioning method and system
WO2022262164A1 (en) * 2021-06-15 2022-12-22 西安微电子技术研究所 Strong real-time double-structure continuous scene fusion matching navigation positioning method and system
CN113432594A (en) * 2021-07-05 2021-09-24 北京鑫海宜科技有限公司 Unmanned aerial vehicle automatic navigation system based on map and environment
CN114111795A (en) * 2021-11-24 2022-03-01 航天神舟飞行器有限公司 Unmanned aerial vehicle self-navigation based on terrain matching
CN114964209A (en) * 2022-05-13 2022-08-30 天健极光(北京)科技发展有限公司 Long-endurance unmanned aerial vehicle autonomous navigation method and system based on infrared array imaging

Also Published As

Publication number Publication date
CN102353377B (en) 2014-01-22

Similar Documents

Publication Publication Date Title
CN102353377B (en) High altitude long endurance unmanned aerial vehicle integrated navigation system and navigating and positioning method thereof
CN103822635B (en) The unmanned plane during flying spatial location real-time computing technique of view-based access control model information
CN107451593B (en) High-precision GPS positioning method based on image feature points
CN111077556B (en) Airport luggage tractor positioning device and method integrating Beidou and multiple sensors
CN111426320B (en) Vehicle autonomous navigation method based on image matching/inertial navigation/milemeter
CN111492326B (en) Image-based positioning for unmanned aerial vehicles and related systems and methods
CN107229063A (en) A kind of pilotless automobile navigation and positioning accuracy antidote merged based on GNSS and visual odometry
CN106017463A (en) Aircraft positioning method based on positioning and sensing device
CN107727079A (en) The object localization method of camera is regarded under a kind of full strapdown of Small and micro-satellite
CN106373159A (en) Simplified unmanned aerial vehicle multi-target location method
CN104777499A (en) Combined navigation method based on INS (inertial navigation system)/GPS (global position system)/SAR (synthetic aperture radar)
CN108151737A (en) A kind of unmanned plane bee colony collaborative navigation method under the conditions of the mutual observed relationships of dynamic
CN110361010A (en) It is a kind of based on occupy grating map and combine imu method for positioning mobile robot
CN111649737B (en) Visual-inertial integrated navigation method for precise approach landing of airplane
CN110186468B (en) High-precision map making method and device for automatic driving
CN109341686B (en) Aircraft landing pose estimation method based on visual-inertial tight coupling
JP7190699B2 (en) Flight system and landing control method
CN102788579A (en) Unmanned aerial vehicle visual navigation method based on SIFT algorithm
Suwandi et al. Low-cost IMU and GPS fusion strategy for apron vehicle positioning
CN106352897B (en) It is a kind of based on the silicon MEMS gyro estimation error of monocular vision sensor and bearing calibration
CN109974713A (en) A kind of navigation methods and systems based on topographical features group
CN104729482A (en) Ground tiny target detection system and ground tiny target detection method based on airship
CN113419235A (en) Unmanned aerial vehicle positioning method based on millimeter wave radar
CN114689030A (en) Unmanned aerial vehicle auxiliary positioning method and system based on airborne vision
CN109143303B (en) Flight positioning method and device and fixed-wing unmanned aerial vehicle

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20140122

Termination date: 20140712

EXPY Termination of patent right or utility model