CN102506868B - SINS (strapdown inertial navigation system)/SMANS (scene matching auxiliary navigation system)/TRNS (terrain reference navigation system) combined navigation method and system based on federated filtering - Google Patents


Info

Publication number: CN102506868B
Application number: CN201110371864.0A
Authority: CN (China)
Prior art keywords: module, information, navigation system, navigation, inertial navigation
Other languages: Chinese (zh)
Other versions: CN102506868A
Inventors: 程农, 胡海东, 李威, 杨霄
Current and original assignee: Tsinghua University (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Events: application filed by Tsinghua University; priority to CN201110371864.0A; publication of CN102506868A; application granted; publication of CN102506868B

Abstract

The invention discloses a SINS (strapdown inertial navigation system)/SMANS (scene matching auxiliary navigation system)/TRNS (terrain reference navigation system) combined navigation method based on federated filtering, and a corresponding system. The method includes the following steps: performing image matching for scene matching navigation, and determining the position of the aircraft from the affine transformation between the digital reference map and the captured image; performing terrain matching for terrain reference navigation, and determining the position of the aircraft from the elevation data; building an error model of the SINS and observation models of the SMANS and the TRNS; and performing information fusion on the outputs of the SINS, the SMANS and the TRNS to obtain an optimal estimate, which is then used to correct the SINS. The system includes an atmospheric inertial navigation system, a flight-path generator module, a SINS/SMANS integrated navigation system, a SINS/TRNS integrated navigation system, and a federated filter module. The method and the system can effectively improve the accuracy of the navigation system, and have high fault tolerance, autonomy and reliability.

Description

SINS/SMANS/TRNS combined navigation method and system based on federated filtering
Technical field
The present invention relates to the field of integrated navigation and positioning technology, and in particular to a SINS/SMANS/TRNS combined navigation method and system based on federated filtering.
Background technology
" navigation " is exactly that correct vectored flight device arrives destination at the appointed time along predetermined course line.In order to complete this task, need to know at any time instantaneous geographic position, the headway of aircraft, the parameters such as attitude course of navigation.These parameters, so-called navigational parameter.To manned aircraft, these navigational parameters can be by navigator by observing instrument and calculating.But, along with the continuous increase of speed and voyage, more and more higher to the requirement of navigation; In order to alleviate and replace navigator's work, just there are various navigational system, the various navigational parameters that need can be provided automatically.
Inertial navigation is an autonomous navigation method. It relies entirely on on-board equipment to complete the navigation task independently, without any optical or electrical contact with the outside world. It is therefore well concealed, and its operation is not restricted by meteorological conditions. This unique advantage has led to the wide use of inertial navigation systems on missiles, ships, aircraft and spacecraft, giving them a prominent position in navigation technology. Inertial navigation determines the position, velocity and attitude of the host carrier from the measurements provided by gyroscopes and accelerometers. By combining these two kinds of measurements, the translational motion of the carrier in the inertial frame can be determined and its position computed.
A strapdown inertial navigation system (SINS) removes most of the mechanical complexity of a platform system by attaching (or fixing) the sensors directly to the housing of the carrier. The potential benefits of this approach are reduced cost, reduced size and improved reliability. Small, accurate strapdown inertial navigation systems can be installed on all kinds of aircraft; the main problems they bring are a significant increase in computational complexity and the need for sensors that can measure high rotation rates. However, continuous progress in computer technology, combined with the development of suitable sensors, has made this design a reality.
The drawback of strapdown inertial navigation is that the position estimate it provides drifts over time. Over long time scales, the rate at which the navigation error grows is determined mainly by the initial alignment accuracy, the defects of the inertial sensors used by the system, and the dynamic characteristics of the carrier's trajectory. Although more accurate sensors can improve precision, the cost of the inertial system then becomes very high while the gain in accuracy remains limited. In recent years, a widely applicable way of solving the error-drift problem of SINS has been integrated navigation, in which one or more auxiliary navigation systems are used to correct the inertial navigation. Among these, the scene matching navigation system and the terrain referenced navigation system are two highly autonomous navigation systems.
Scene matching auxiliary navigation (SMANS) is a technique that obtains precise position information by matching, in real time, the terrain scene images gathered in flight by an airborne or missile-borne image sensor against previously prepared reference scene images. Scene matching navigation is autonomous positioning: it can provide precision guidance for the aircraft, its accuracy is independent of the distance flown, and its cost is relatively low. Scene matching uses an imaging system to build a picture of the terrain below the aircraft as it flies forward; when a position fix is needed, a portion of the scanned image is stored to form a "scene" of the terrain below the aircraft. Through this process the image is converted into an array of "pixels", each pixel holding a value that represents the local image brightness. The "captured" scene is processed to remove noise and enhance the features that may provide navigation information, and a correlation algorithm is then used to find a recognizable pattern pre-stored in the terrain feature database. Once the features in the scene have been matched to features in the database, a few calculations based on the attitude and height above terrain of the aircraft yield the position of the scene at the instant of capture.
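The correlation search described above can be sketched in a few lines. This is a generic illustration of template matching by normalized cross-correlation, not the patent's implementation; the array sizes and helper names are invented for the example.

```python
# Minimal sketch of correlation-based scene matching: slide the sensed
# "scene" patch over the stored reference map and pick the offset with the
# highest normalized cross-correlation. Data here is synthetic.
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def match_scene(reference, scene):
    """Return (row, col) of the best-matching placement of `scene` in `reference`."""
    H, W = reference.shape
    h, w = scene.shape
    best, best_pos = -2.0, (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            score = ncc(reference[r:r + h, c:c + w], scene)
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

# Toy example: embed a patch at a known offset and recover it.
rng = np.random.default_rng(0)
reference = rng.standard_normal((40, 40))
scene = reference[12:20, 25:33].copy()
print(match_scene(reference, scene))  # -> (12, 25)
```

A real system would restrict the search window around the inertial position estimate rather than scanning the whole map.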
Terrain reference navigation (TRNS) is an auxiliary navigation technique that has attracted wide attention and been used successfully in recent years; it is likewise an autonomous, covert, all-weather, low-altitude navigation technique whose positioning accuracy is independent of range. The most common terrain reference system uses a radio altimeter, an airborne baro-inertial navigation system, and a stored map of the terrain profile, in the vertical plane along the aircraft's velocity vector, of the region the aircraft flies over. The radio altimeter measures the height above the ground; combined with the inertial navigation system's estimate of altitude above sea level, the terrain profile beneath the theoretical flight path can be reconstructed by the on-board computer. The reconstructed terrain profile is then compared with the stored terrain map data to perform matching, from which the position of the carrier can be determined.
Summary of the invention
(1) Technical problem to be solved
The technical problem to be solved by the present invention is to provide a SINS/SMANS/TRNS combined navigation method and system based on federated filtering, so as to effectively improve the accuracy of the navigation system while offering high fault tolerance, high autonomy and high reliability.
(2) Technical scheme
To solve the above problem, the invention provides a SINS/SMANS/TRNS integrated navigation system based on federated filtering, comprising:
An atmospheric inertial navigation system, for obtaining and outputting inertial navigation position information;
A flight-path generator module, for simulating the flight path of the aircraft and obtaining the position, velocity and attitude information of the aircraft;
A SINS/SMANS integrated navigation system, comprising:
An image sensor field-of-view and position parameter computing module, for computing the field of view and position parameters of the airborne image sensor according to the position information output by the atmospheric inertial navigation system;
A digital reference map database, for obtaining a suitable digital reference map according to the computation result of the image sensor field-of-view and position parameter computing module;
An image sensor simulation module, for simulating the real-time image captured by the airborne image sensor according to the attitude and height transformation information of the aircraft obtained by the flight-path generator module;
An image matching module, for registering the digital reference map obtained from the digital reference map database with the real-time image simulated by the image sensor simulation module, and computing the real-time position of the aircraft;
A SINS/SMANS Kalman sub-filter module, for performing information fusion on the inertial navigation position information output by the atmospheric inertial navigation system and the real-time aircraft position information output by the image matching module;
A SINS/TRNS integrated navigation system, comprising:
A laser altimeter, for receiving the true elevation information obtained by the flight-path generator module and the true terrain height information obtained by the terrain matching module, and outputting the measured relative height;
A terrain elevation database, for providing terrain data;
A terrain matching module, for outputting the true terrain height information, the terrain slopes and the inertial-navigation terrain height information according to the inertial navigation position information output by the atmospheric inertial navigation system, the true position information output by the flight-path generator module, and the terrain data provided by the terrain elevation database;
A SINS/TRNS Kalman sub-filter module, for performing information fusion on the output signals of the laser altimeter, the terrain matching module and the atmospheric inertial navigation system;
A federated filter module, for fusing the signals output by the SINS/SMANS Kalman sub-filter module and the SINS/TRNS Kalman sub-filter module to obtain the final fused position result, while simultaneously performing error correction on the atmospheric inertial navigation system.
In another aspect, the invention also provides a method for realizing SINS/SMANS/TRNS integrated navigation based on the above integrated navigation system, comprising:
The atmospheric inertial navigation system obtains and outputs inertial navigation position information;
The flight-path generator module simulates the flight path of the aircraft and obtains its position, velocity and attitude information;
The image sensor field-of-view and position parameter computing module computes the field of view and position parameters of the airborne image sensor according to the position information output by the atmospheric inertial navigation system;
The digital reference map database obtains a suitable digital reference map according to the computation result of the image sensor field-of-view and position parameter computing module;
The image sensor simulation module simulates the real-time image captured by the airborne image sensor according to the attitude and height transformation information of the aircraft obtained by the flight-path generator module;
The image matching module registers the digital reference map obtained from the digital reference map database with the real-time image simulated by the image sensor simulation module, and computes the real-time position of the aircraft;
The SINS/SMANS Kalman sub-filter module performs information fusion on the inertial navigation position information output by the atmospheric inertial navigation system and the real-time aircraft position information output by the image matching module;
The laser altimeter receives the true elevation information obtained by the flight-path generator module and the true terrain height information obtained by the terrain matching module, and outputs the measured relative height;
The terrain matching module outputs the true terrain height information, the terrain slopes and the inertial-navigation terrain height information according to the inertial navigation position information output by the atmospheric inertial navigation system, the true position information output by the flight-path generator module, and the terrain data provided by the terrain elevation database;
The SINS/TRNS Kalman sub-filter module performs information fusion on the output signals of the laser altimeter, the terrain matching module and the atmospheric inertial navigation system;
The federated filter module fuses the signals output by the SINS/SMANS Kalman sub-filter module and the SINS/TRNS Kalman sub-filter module to obtain the final fused position result, while simultaneously performing error correction on the atmospheric inertial navigation system;
The method further comprises the following steps:
S1: scene matching navigation performs image matching using a corner-point method, and determines the position of the aircraft via the attitude and height transformation of the aircraft;
S2: terrain reference navigation performs terrain matching using a terrain matching method, and finally determines the position of the aircraft from the measured relative height and the inertial-navigation relative height;
S3: the error model of the strapdown inertial navigation system is built, together with the observation models of scene matching navigation and terrain reference navigation;
S4: information fusion is performed on the outputs of strapdown inertial navigation, scene matching navigation and terrain reference navigation to obtain the optimal estimate, and the strapdown inertial navigation system is corrected.
Preferably, step S1 specifically comprises the following steps:
S11: input the aerial image and the area image to be matched, extract feature points from both images, represent the images in scale space, and localize the feature points in three dimensions;
S12: build the descriptor vector of each interest point by determining its orientation characteristic, perform feature-point matching using a nearest-distance to second-nearest-distance ratio method, and reject mismatched point pairs;
S13: determine the homography matrix between the aerial image and the area image to be matched from the successfully matched interest-point pairs, and from the position and rotation relationship between the two images given by the homography matrix, determine the position of the aerial image within the area map to be matched, thereby determining the flight position of the aircraft.
Preferably, the descriptor vector of the interest points may be 64-dimensional or 128-dimensional.
Preferably, step S2 specifically comprises the following steps:
S21: input the terrain elevation data and build the terrain linearization model;
S22: compute the terrain-height difference from the horizontal position error of the aircraft, and compute the terrain relative-height difference from the absolute-height difference of the aircraft and the terrain-height difference;
S23: compare the computed terrain relative-height difference with the relative-height difference obtained at the observation point, apply the correction, and finally determine the position of the aircraft.
Preferably, step S4 specifically comprises the following steps:
S41: initialize the fused state and covariance matrix;
S42: form the inertial navigation/scene matching sub-filter and the inertial navigation/terrain reference sub-filter, and perform information distribution on the state, covariance matrix and state-noise matrix of each sub-filter;
S43: perform the time update and the measurement update separately for the inertial navigation/scene matching sub-filter and the inertial navigation/terrain reference sub-filter, obtaining the estimate of each sub-filter;
S44: fuse the estimates of all the sub-filters into the globally optimal state estimate.
(3) Beneficial effects
The present invention combines inertial navigation with scene matching navigation, which is suited to large flat or terrain-indistinct regions, and terrain reference navigation, which is suited to rougher or strongly varying terrain, to form an integrated navigation system. Scene matching navigation and terrain reference navigation effectively correct the drift error of the inertial navigation, so the accuracy of the navigation system is effectively improved, with high fault tolerance, high autonomy and high reliability.
Brief description of the drawings
Fig. 1 is a flow chart of the steps of the combined navigation method according to an embodiment of the present invention;
Fig. 2 is a flow chart of step 1 of the combined navigation method according to an embodiment of the present invention;
Fig. 3 is a flow chart of step 2 of the combined navigation method according to an embodiment of the present invention;
Fig. 4 is a flow chart of step 4 of the combined navigation method according to an embodiment of the present invention;
Fig. 5 is a simplified schematic diagram of the federated filtering fusion structure of the integrated navigation system implementing the method of the invention;
Fig. 6 is a structural schematic diagram of the integrated navigation system implementing the method of the invention.
Detailed description of the embodiments
The present invention is described in detail below with reference to the drawings and embodiments.
Embodiment 1:
As shown in Fig. 1, this embodiment describes a SINS/SMANS/TRNS combined navigation method based on federated filtering, comprising the following steps:
S1: scene matching navigation performs image matching using a corner-point method, and determines the position of the aircraft via the attitude and height transformation of the aircraft;
As shown in Fig. 2, step S1 specifically comprises the following steps:
S11: input the aerial image and the area image to be matched, extract feature points from both images — for example by approximating the Hessian matrix with box filters — represent the images in scale space, and localize the feature points in three dimensions;
S12: build the descriptor vector of each interest point by determining its orientation characteristic, perform feature-point matching using a nearest-distance to second-nearest-distance ratio method, and reject mismatched point pairs with the RANSAC algorithm;
S13: determine the homography matrix between the aerial image and the area image to be matched from the successfully matched interest-point pairs, and from the position and rotation relationship between the two images given by the homography matrix, determine the position of the aerial image within the area map to be matched, thereby determining the flight position of the aircraft.
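Step S13 can be illustrated with a small sketch: estimate a transform between the matched point sets by least squares, then map the aerial image centre into reference-map coordinates. For simplicity the sketch fits an affine transform rather than a full homography, and the matched points are synthetic stand-ins for the RANSAC inliers of step S12.

```python
# Sketch of step S13: fit the transform between the aerial image and the
# reference map from matched feature points, then map the aerial image
# centre into map coordinates to localize the aircraft.
import numpy as np

def fit_affine(src, dst):
    """Least-squares 2x3 affine A with dst ≈ [x, y, 1] @ A.T."""
    ones = np.ones((src.shape[0], 1))
    X = np.hstack([src, ones])                  # n x 3 design matrix
    A, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return A.T                                  # 2 x 3

def apply_affine(A, pts):
    ones = np.ones((pts.shape[0], 1))
    return np.hstack([pts, ones]) @ A.T

# Synthetic ground truth: rotation by 30 deg, scale 2, translation (100, 50).
theta = np.deg2rad(30.0)
R = 2.0 * np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
t = np.array([100.0, 50.0])
src = np.array([[0, 0], [10, 0], [0, 10], [10, 10], [5, 3]], dtype=float)
dst = src @ R.T + t                             # "matched" reference-map points

A = fit_affine(src, dst)
center = np.array([[5.0, 5.0]])                 # aerial image centre
pos = apply_affine(A, center)[0]                # its location in the map
print(np.round(pos, 3))
```

With exact correspondences the fit recovers the transform exactly; with noisy RANSAC inliers the least-squares fit averages out the residual error.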
S2: terrain reference navigation performs terrain matching using the SITAN (Sandia Inertial Terrain Aided Navigation) method, and finally determines the position of the aircraft from the measured relative height and the inertial-navigation relative height;
As shown in Fig. 3, step S2 specifically comprises the following steps:
S21: input the terrain elevation data and build the terrain linearization model;
S22: compute the terrain-height difference Δh_l from the horizontal position error (Δx, Δy) of the aircraft; the terrain relative-height difference Δh_r can then be expressed as:
Δh_r = Δh − Δh_l
where Δh is the absolute-height difference of the aircraft;
S23: compare the computed terrain relative-height difference with the relative-height difference obtained at the observation point, correct the position of the aircraft, and finally determine its position.
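Under the linearized terrain model, the quantities of steps S22 and S23 reduce to a few arithmetic operations. The following sketch uses illustrative numbers and a hypothetical helper name; with slopes (kx, ky) from the linearization, the terrain-height difference induced by the horizontal position error is kx·Δx + ky·Δy.

```python
# Sketch of steps S22-S23: dh_l = kx*dx + ky*dy from the linearized terrain
# model, and the relative-height difference dh_r = dh - dh_l, where dh is
# the absolute-height difference of the aircraft.
def relative_height_difference(kx, ky, dx, dy, dh):
    dh_l = kx * dx + ky * dy   # terrain-height difference from the slope model
    return dh - dh_l           # dh_r, compared against the measured value

# Illustrative numbers: slopes 0.05 and 0.02, a 30 m / 10 m horizontal
# position error, and a 2.0 m absolute-height difference.
print(round(relative_height_difference(0.05, 0.02, 30.0, 10.0, 2.0), 6))  # -> 0.3
```

The residual between this predicted Δh_r and the measured relative-height difference is what the SINS/TRNS sub-filter observes.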
S3: build the error model of the strapdown inertial navigation system and the observation models of scene matching navigation and terrain reference navigation;
The error model of the strapdown inertial navigation system is:
Ẋ = FX + Gw
where X is the system state vector, F is the 5 × 5 system matrix, G is the system-noise input matrix, and w is the system-noise vector,
X = [δL  δλ  δh  δv_e  δv_n]^T
with δL, δλ, δh, δv_e and δv_n respectively the latitude error, longitude error, height error, east velocity error and north velocity error. (The non-zero elements of F are given as equation images in the original document and are not reproduced here.)
The observation model of scene matching navigation is:
Z_SMANS = H_SMANS X + V_SMANS
H_SMANS = [I_{2×2}  0_{2×3}]
where X is the state vector of the strapdown inertial navigation error model, Z_SMANS is the observation of scene matching navigation, and V_SMANS is its observation noise.
The observation model of terrain reference navigation is:
Z_TRNS = H_TRNS X + V_TRNS
H_TRNS = [−k_x  −k_y  1  0  0]
where Z_TRNS is the observation of terrain reference navigation, V_TRNS is its observation noise, and k_x, k_y are the terrain slopes in the x and y directions.
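For the 5-state error vector above, the two observation matrices can be assembled directly; the sketch below is a plain transcription of H_SMANS and H_TRNS, with illustrative slope values.

```python
# Observation matrices for the 5-state error vector
# X = [dL, dlambda, dh, dv_e, dv_n]: SMANS observes the two position errors,
# TRNS observes a height combination involving the terrain slopes kx, ky.
import numpy as np

def H_smans():
    return np.hstack([np.eye(2), np.zeros((2, 3))])   # [I_2x2  0_2x3]

def H_trns(kx, ky):
    return np.array([[-kx, -ky, 1.0, 0.0, 0.0]])

print(H_smans().shape, H_trns(0.05, 0.02).shape)  # (2, 5) (1, 5)
```

These matrices are what each Kalman sub-filter uses as H in its measurement update.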
S4: perform information fusion on the outputs of strapdown inertial navigation, scene matching navigation and terrain reference navigation to obtain the optimal estimate, and correct the strapdown inertial navigation system.
The descriptor vector of the interest points in step S1 may be 64-dimensional or 128-dimensional.
As shown in Fig. 4, step S4 specifically comprises the following steps:
S41: configure the fused state x̂_f and covariance matrix P_f, and initialize both, i.e. let
x̂_f(0) = x(0),  P_f(0) = P(0)
S42: form the SINS/SMANS sub-filter and the SINS/TRNS sub-filter, and perform information distribution on the state, covariance matrix and state-noise matrix of each sub-filter, i.e. distribute information to the state x̂_i, covariance matrix P_i and state-noise matrix Q_i of the i-th sub-filter: let
x̂_i(k) = x̂_f(k)
P_i(k) = β_i^{−1}(k) P_f(k)
Q_i(k) = β_i^{−1}(k) Q_f(k)
where x̂_f, P_f and Q_f are respectively the fused state, covariance matrix and state-noise matrix, β_i is the information allocation factor, k ≥ 1, i = 1, 2, and the β_i satisfy:
β_1 + β_2 = 1.
S43: perform the time update and the measurement update separately for the inertial navigation/scene matching sub-filter and the inertial navigation/terrain reference sub-filter, obtaining the estimate of each sub-filter:
(1) the time update gives:
x(k|k−1) = Φ(k|k−1) x(k−1)
P(k|k−1) = Φ(k|k−1) P(k−1) Φ^T(k|k−1) + Q(k−1)
x(k−1) = x(k|k−1)
P(k−1) = P(k|k−1)
where x is the state vector of the filter, P is its covariance matrix, and Φ is the state-transition matrix corresponding to the matrix F;
(2) the measurement update gives:
K(k) = P(k|k−1) H^T(k) (H(k) P(k|k−1) H^T(k) + R(k))^{−1}
x(k) = x(k|k−1) + K(k) (z(k) − H(k) x(k|k−1))
P(k) = (I − K(k) H(k)) P(k|k−1)
where K is the Kalman gain matrix and H is the observation matrix.
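The time update and measurement update above are the standard Kalman recursion and can be sketched directly; the 2-state system and the numbers in the example are illustrative, not taken from the patent.

```python
# Sketch of a sub-filter cycle implementing exactly the equations above.
import numpy as np

def time_update(x, P, Phi, Q):
    x_pred = Phi @ x                              # x(k|k-1) = Phi x(k-1)
    P_pred = Phi @ P @ Phi.T + Q                  # P(k|k-1) = Phi P Phi^T + Q
    return x_pred, P_pred

def measurement_update(x_pred, P_pred, z, H, R):
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)           # Kalman gain
    x = x_pred + K @ (z - H @ x_pred)             # innovation correction
    P = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x, P

# One cycle on a 2-state toy system (position + velocity).
x = np.zeros(2); P = np.eye(2)
Phi = np.array([[1.0, 1.0], [0.0, 1.0]]); Q = 0.01 * np.eye(2)
H = np.array([[1.0, 0.0]]); R = np.array([[0.5]])
x, P = time_update(x, P, Phi, Q)
x, P = measurement_update(x, P, np.array([1.2]), H, R)
print(np.round(x, 3))  # -> [0.961 0.478]
```

Each sub-filter runs this recursion with its own H (SMANS or TRNS) and R.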
S44: fuse the estimates of the sub-filters into the globally optimal state estimate:
P_f^{−1}(k) = P_m^{−1}(k) + Σ_{i=1}^{n} P_i^{−1}(k)
x̂_f(k) = P_f(k) [P_m^{−1}(k) x̂_m(k) + Σ_{i=1}^{n} P_i^{−1}(k) x̂_i(k)]
where the subscript m denotes the master filter.
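In a reset-mode federated filter with no independent master-filter estimate, the distribution of step S42 and the fusion of step S44 reduce to the sketch below. The master-filter terms P_m, x̂_m are dropped here for simplicity, and all function names are hypothetical.

```python
# Sketch of the federated structure: distribute information to the
# sub-filters with factors beta_1 + beta_2 = 1, then fuse their estimates
# by information addition: P_f^-1 = sum(P_i^-1), x_f = P_f sum(P_i^-1 x_i).
import numpy as np

def distribute(x_f, P_f, Q_f, betas):
    """Information distribution: x_i = x_f, P_i = P_f/beta_i, Q_i = Q_f/beta_i."""
    assert abs(sum(betas) - 1.0) < 1e-12
    return [(x_f.copy(), P_f / b, Q_f / b) for b in betas]

def fuse(estimates):
    """Fuse sub-filter estimates (x_i, P_i) into the global optimum."""
    info = sum(np.linalg.inv(P) for _, P in estimates)
    P_f = np.linalg.inv(info)
    x_f = P_f @ sum(np.linalg.inv(P) @ x for x, P in estimates)
    return x_f, P_f

# Two sub-filters disagreeing about a 2-state estimate; the tighter
# (smaller-covariance) estimate gets the larger weight.
est1 = (np.array([1.0, 0.0]), 0.5 * np.eye(2))
est2 = (np.array([0.0, 1.0]), 1.0 * np.eye(2))
x_f, P_f = fuse([est1, est2])
print(np.round(x_f, 3))  # -> [0.667 0.333]

# Redistribute the fused information for the next cycle.
subs = distribute(x_f, P_f, 0.01 * np.eye(2), [0.5, 0.5])
```

The fused covariance is tighter than either sub-filter's, which is what allows the fused estimate to correct the inertial system.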
The federated filtering fusion structure implementing step S4 is shown in Fig. 5: the signals obtained by the SINS and the SMANS are input to the SINS/SMANS sub-filter, the signals obtained by the SINS and the TRNS are input to the SINS/TRNS sub-filter, and after the updates and related processing the fused result is output.
Embodiment 2:
As shown in Fig. 6, this embodiment describes an integrated navigation system realizing the above combined navigation method, comprising:
An atmospheric inertial navigation system, for obtaining and outputting inertial navigation position information;
A flight-path generator module, for simulating the flight path of the aircraft and obtaining the position, velocity and attitude information of the aircraft;
A SINS/SMANS integrated navigation system, comprising:
An image sensor field-of-view and position parameter computing module, for computing the field of view and position parameters of the airborne image sensor according to the position information output by the atmospheric inertial navigation system;
A digital reference map database, for obtaining a suitable digital reference map according to the computation result of the image sensor field-of-view and position parameter computing module;
An image sensor simulation module, for simulating the real-time image captured by the airborne image sensor according to the attitude and height transformation information of the aircraft obtained by the flight-path generator module;
An image matching module, for registering the digital reference map obtained from the digital reference map database with the real-time image simulated by the image sensor simulation module, and computing the real-time position of the aircraft;
A SINS/SMANS Kalman sub-filter module, for performing information fusion on the inertial navigation position information output by the atmospheric inertial navigation system and the real-time aircraft position information output by the image matching module;
A SINS/TRNS integrated navigation system, comprising:
A laser altimeter, for receiving the true elevation information obtained by the flight-path generator module and the true terrain height information obtained by the terrain matching module, and outputting the measured relative height;
A terrain elevation database, for providing terrain data;
A terrain matching module, for outputting the true terrain height information, the terrain slopes and the inertial-navigation terrain height information according to the inertial navigation position information output by the atmospheric inertial navigation system, the true position information output by the flight-path generator module, and the terrain data provided by the terrain elevation database;
A SINS/TRNS Kalman sub-filter module, for performing information fusion on the output signals of the laser altimeter, the terrain matching module and the atmospheric inertial navigation system;
A federated filter module, for fusing the signals output by the SINS/SMANS Kalman sub-filter module and the SINS/TRNS Kalman sub-filter module to obtain the final fused position result, while simultaneously performing error correction on the atmospheric inertial navigation system.
The above embodiments are only for illustrating the present invention and do not limit it; those of ordinary skill in the relevant technical field can make various changes and modifications without departing from the spirit and scope of the invention; therefore all equivalent technical solutions also belong to the scope of the invention, whose patent protection shall be defined by the claims.

Claims (6)

1. A federated-filter-based SINS/SMANS/TRNS integrated navigation system, characterized in that it comprises:
an atmosphere inertial navigation system, for outputting inertial navigation position information;
a flight path generator module, for simulating the flight path of an aircraft to obtain the position, velocity, and attitude information of the aircraft;
an SINS/SMANS integrated navigation subsystem, comprising:
an image sensor field-of-view and position parameter computing module, for computing the field of view and the position parameters of an airborne image sensor according to the position information output by the atmosphere inertial navigation system;
a digital reference map database, for retrieving a suitable digital reference map according to the computation result of the image sensor field-of-view and position parameter computing module;
an image sensor simulation module, for generating, by simulation, the real-time image captured by the airborne image sensor according to the attitude and height conversion information of the aircraft obtained by the flight path generator module;
an image matching module, for registering the digital reference map obtained from the digital reference map database against the real-time image produced by the image sensor simulation module, and computing the real-time position of the aircraft;
an SINS/SMANS Kalman subfilter module, for performing information fusion on the inertial navigation position information output by the atmosphere inertial navigation system and the real-time aircraft position information output by the image matching module;
an SINS/TRNS integrated navigation subsystem, comprising:
a laser altimeter, for receiving the true absolute height information obtained by the flight path generator module and the true terrain elevation information obtained by the terrain matching module, and outputting the measured relative height;
a terrain elevation database, for providing terrain data;
a terrain matching module, for outputting the true terrain elevation information, the terrain gradient, and the inertial navigation elevation information according to the inertial navigation position information output by the atmosphere inertial navigation system, the true position information output by the flight path generator module, and the terrain data provided by the terrain elevation database;
an SINS/TRNS Kalman subfilter module, for performing information fusion on the output signals of the laser altimeter, the terrain matching module, and the atmosphere inertial navigation system;
a federated filter module, for fusing the signals output by the SINS/SMANS Kalman subfilter module and the SINS/TRNS Kalman subfilter module to obtain the final fused position information result, while performing error correction on the atmosphere inertial navigation system.
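Each Kalman subfilter in claim 1 fuses the inertial navigation position with an auxiliary position fix. As a minimal illustration (not the patent's actual filter design, whose state vector and noise models are not reproduced here), the measurement update of one such subfilter can be sketched in NumPy:

```python
import numpy as np

def subfilter_update(x, P, z, H, R):
    """One Kalman measurement update, as each subfilter performs when
    fusing an auxiliary position fix (hypothetical minimal sketch)."""
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x = x + K @ (z - H @ x)                  # corrected error state
    P = (np.eye(len(x)) - K @ H) @ P         # updated covariance
    return x, P

# toy example: 2-state horizontal position error, observed directly
x0 = np.zeros(2)
P0 = np.eye(2) * 100.0           # large initial uncertainty
z = np.array([3.0, -1.5])        # e.g. SMANS fix minus INS position
H = np.eye(2)
R = np.eye(2) * 4.0              # fix measurement noise
x1, P1 = subfilter_update(x0, P0, z, H, R)
```

In practice the subfilter state would be the inertial error state (attitude, velocity, and position errors) and `z` the difference between the INS output and the scene-matching or terrain-matching fix.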
2. A method for realizing SINS/SMANS/TRNS integrated navigation based on the integrated navigation system as claimed in claim 1, characterized in that it comprises:
the atmosphere inertial navigation system outputs inertial navigation position information;
the flight path generator module simulates the flight path of the aircraft to obtain the position, velocity, and attitude information of the aircraft;
the image sensor field-of-view and position parameter computing module computes the field of view and the position parameters of the airborne image sensor according to the position information output by the atmosphere inertial navigation system;
the digital reference map database retrieves a suitable digital reference map according to the computation result of the image sensor field-of-view and position parameter computing module;
the image sensor simulation module generates, by simulation, the real-time image captured by the airborne image sensor according to the attitude and height conversion information of the aircraft obtained by the flight path generator module;
the image matching module registers the digital reference map obtained from the digital reference map database against the real-time image produced by the image sensor simulation module, and computes the real-time position of the aircraft;
the SINS/SMANS Kalman subfilter module performs information fusion on the inertial navigation position information output by the atmosphere inertial navigation system and the real-time aircraft position information output by the image matching module;
the laser altimeter receives the true absolute height information obtained by the flight path generator module and the true terrain elevation information obtained by the terrain matching module, and outputs the measured relative height;
the terrain matching module outputs the true terrain elevation information, the terrain gradient, and the inertial navigation elevation information according to the inertial navigation position information output by the atmosphere inertial navigation system, the true position information output by the flight path generator module, and the terrain data provided by the terrain elevation database;
the SINS/TRNS Kalman subfilter module performs information fusion on the output signals of the laser altimeter, the terrain matching module, and the atmosphere inertial navigation system;
the federated filter module fuses the signals output by the SINS/SMANS Kalman subfilter module and the SINS/TRNS Kalman subfilter module to obtain the final fused position information result, while performing error correction on the atmosphere inertial navigation system;
the method further comprises the following steps:
S1: the scene matching aided navigation performs image matching with a corner-point method and determines the position of the aircraft through the attitude and height conversion of the aircraft;
S2: the terrain reference navigation performs terrain matching with a terrain contour matching approach and finally determines the position of the aircraft from the true relative height and the inertial navigation relative height;
S3: the error model of the strapdown inertial navigation system and the observation models of the scene matching aided navigation and the terrain reference navigation are established;
S4: information fusion is performed on the outputs of the strapdown inertial navigation, the scene matching aided navigation, and the terrain reference navigation to obtain the optimal estimation result, and the strapdown inertial navigation system is corrected accordingly.
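Step S1 matches images with a corner-point method; the related application CN102506867 (cited below) names Harris corner matching, so the Harris response is assumed here purely for illustration. A minimal NumPy sketch of the response map:

```python
import numpy as np

def harris_response(img, k=0.04):
    """Harris corner response for a grayscale image (illustrative
    sketch; the claim says only 'corner-point method')."""
    # image gradients via central differences (axis 0 = y, axis 1 = x)
    Iy, Ix = np.gradient(img.astype(float))

    def box(a):
        # 3x3 box filter implemented by summing shifted copies
        p = np.pad(a, 1, mode='edge')
        return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
                   for i in range(3) for j in range(3)) / 9.0

    # smoothed structure-tensor entries
    Sxx, Syy, Sxy = box(Ix * Ix), box(Iy * Iy), box(Ix * Iy)
    det = Sxx * Syy - Sxy ** 2
    trace = Sxx + Syy
    return det - k * trace ** 2   # positive at corners, negative on edges

# toy image: a single bright square; its corners respond strongest
img = np.zeros((16, 16))
img[5:11, 5:11] = 1.0
R = harris_response(img)
```

On real imagery the strongest responses would be kept as the feature points that steps S11 and S12 then describe and match.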
3. The combined navigation method as claimed in claim 2, characterized in that said step S1 specifically comprises the following steps:
S11: the aerial image and the region image to be matched are input, feature points are extracted from the two images respectively, and each image is represented in scale space so that the feature points are localized in the three dimensions of position and scale;
S12: descriptor vectors of the interest points are constructed by determining the dominant orientation of each interest point; feature point matching is performed with a method based on the ratio of the nearest distance to the second-nearest distance, and mismatched point pairs are rejected;
S13: the homography matrix between the aerial image and the region image to be matched is determined from the successfully matched interest point pairs; according to the position and rotation relationship between the two images given by the homography matrix, the position of the aerial image within the region map to be matched is determined, thereby determining the flight position of the aircraft.
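The homography of step S13 maps points of the aerial image onto the reference region image. As a sketch (the claim does not prescribe the estimation algorithm; the standard direct linear transform is assumed here), the matrix can be recovered from matched point pairs:

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate the 3x3 homography mapping src -> dst point pairs by
    the direct linear transform (illustrative, unnormalized DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # the homography is the null vector of A (last right-singular vector)
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

# synthetic check: points related by a pure translation (tx=4, ty=-2)
src = np.array([[0, 0], [10, 0], [10, 10], [0, 10], [3, 7]], float)
dst = src + np.array([4.0, -2.0])
H = homography_dlt(src, dst)
```

With a real image pair the point pairs would come from the descriptor matching of step S12 (with outlier rejection removing remaining mismatches); the translation and rotation encoded in `H` then locate the aerial image in the reference map.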
4. The combined navigation method as claimed in claim 3, characterized in that the descriptor vector of an interest point may take a 64-dimensional or 128-dimensional form.
5. The combined navigation method as claimed in claim 2, characterized in that said step S2 specifically comprises the following steps:
S21: terrain elevation data are input, and a terrain linearization model is established;
S22: the terrain elevation difference is computed according to the horizontal position error of the aircraft, and the terrain relative height difference is computed from the difference between the absolute height and the relative height of the aircraft together with said terrain elevation difference;
S23: the computed terrain relative height difference is compared with the relative height difference obtained at the observation point, and the position of the aircraft is corrected and finally determined.
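Steps S21–S23 compare terrain heights derived from the measurements with the database. A minimal one-dimensional sketch (hypothetical: the patent linearizes the terrain model, whereas this sketch searches offsets exhaustively in the terrain-contour-matching style) slides the measured elevation profile along a DEM profile and picks the offset with the smallest mean-squared mismatch:

```python
import numpy as np

def tercom_fix(dem, assumed_start, measured_elev, max_shift):
    """Return the along-track shift (in DEM cells) that best aligns the
    measured terrain-elevation profile with the DEM profile."""
    n = len(measured_elev)
    errs = {s: np.mean((measured_elev -
                        dem[assumed_start + s:assumed_start + s + n]) ** 2)
            for s in range(-max_shift, max_shift + 1)}
    return min(errs, key=errs.get)   # shift with smallest mismatch

# toy DEM; the aircraft is actually 3 cells past where the INS believes
dem = 50.0 * np.sin(np.linspace(0.0, 6.0 * np.pi, 60))
truth_start, ins_start = 23, 20
abs_height = np.full(10, 500.0)                            # absolute height
rel_height = abs_height - dem[truth_start:truth_start + 10]  # laser altimeter
# terrain elevation implied by the measurements = absolute - relative
shift = tercom_fix(dem, ins_start, abs_height - rel_height, 5)
```

The recovered shift corresponds to the horizontal position correction that step S23 applies to the aircraft position.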
6. The combined navigation method as claimed in claim 2, characterized in that said step S4 specifically comprises the following steps:
S41: the fused state and covariance matrix are initialized;
S42: an inertial/scene-matching subfilter and an inertial/terrain-reference subfilter are formed, and information distribution is performed on the state, the covariance matrix, and the state noise matrix of each subfilter;
S43: the inertial/scene-matching subfilter and the inertial/terrain-reference subfilter each perform their own time update and measurement update to obtain the estimation information of each subfilter;
S44: the estimation information of all subfilters is fused into the globally optimal state estimate.
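Steps S41–S44 follow the standard federated (Carlson) filter structure. A minimal NumPy sketch of the fusion rule of S44 and the information distribution of S42, assuming two subfilters with equal sharing factors β₁ = β₂ = 0.5 (the patent does not fix the factors here):

```python
import numpy as np

def federated_fuse(estimates):
    """Global fusion of step S44:
    P_g = (sum_i P_i^{-1})^{-1},  x_g = P_g * sum_i P_i^{-1} x_i."""
    info = sum(np.linalg.inv(P) for _, P in estimates)   # information sum
    Pg = np.linalg.inv(info)                             # global covariance
    xg = Pg @ sum(np.linalg.inv(P) @ x for x, P in estimates)
    return xg, Pg

def distribute(xg, Pg, betas):
    """Information distribution of step S42: each subfilter restarts from
    the fused state with covariance inflated by 1/beta_i (sum beta_i <= 1)."""
    return [(xg.copy(), Pg / b) for b in betas]

# two subfilters estimating a 2-D position error
x1, P1 = np.array([2.0, 0.0]), np.eye(2) * 4.0   # SINS/SMANS subfilter
x2, P2 = np.array([0.0, 2.0]), np.eye(2) * 4.0   # SINS/TRNS subfilter
xg, Pg = federated_fuse([(x1, P1), (x2, P2)])
subs = distribute(xg, Pg, betas=[0.5, 0.5])
```

With equal covariances the fused state is the average of the two subfilter states and the global covariance is halved, which is the expected behavior of the information-weighted fusion.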
CN201110371864.0A 2011-11-21 2011-11-21 SINS (strap-down inertia navigation system)/SMANS (scene matching auxiliary navigation system)/TRNS (terrain reference navigation system) combined navigation method based on federated filtering and system Active CN102506868B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201110371864.0A CN102506868B (en) 2011-11-21 2011-11-21 SINS (strap-down inertia navigation system)/SMANS (scene matching auxiliary navigation system)/TRNS (terrain reference navigation system) combined navigation method based on federated filtering and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201110371864.0A CN102506868B (en) 2011-11-21 2011-11-21 SINS (strap-down inertia navigation system)/SMANS (scene matching auxiliary navigation system)/TRNS (terrain reference navigation system) combined navigation method based on federated filtering and system

Publications (2)

Publication Number Publication Date
CN102506868A CN102506868A (en) 2012-06-20
CN102506868B true CN102506868B (en) 2014-03-12

Family

ID=46218975

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201110371864.0A Active CN102506868B (en) 2011-11-21 2011-11-21 SINS (strap-down inertia navigation system)/SMANS (scene matching auxiliary navigation system)/TRNS (terrain reference navigation system) combined navigation method based on federated filtering and system

Country Status (1)

Country Link
CN (1) CN102506868B (en)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8483960B2 (en) 2002-09-20 2013-07-09 Visual Intelligence, LP Self-calibrated, remote imaging and data processing system
USRE49105E1 (en) 2002-09-20 2022-06-14 Vi Technologies, Llc Self-calibrated, remote imaging and data processing system
EP2888628A4 (en) * 2012-08-21 2016-09-14 Visual Intelligence Lp Infrastructure mapping system and method
CN102829785B (en) * 2012-08-30 2014-12-31 中国人民解放军国防科学技术大学 Air vehicle full-parameter navigation method based on sequence image and reference image matching
CN103353310B (en) * 2013-06-01 2017-06-09 西北工业大学 A kind of laser near-net shaping
CN103591955B (en) * 2013-11-21 2016-03-30 西安中科光电精密工程有限公司 Integrated navigation system
CN105547300A (en) * 2015-12-30 2016-05-04 航天恒星科技有限公司 All-source navigation system and method used for AUV (Autonomous Underwater Vehicle)
US10802135B2 (en) * 2016-12-21 2020-10-13 The Boeing Company Method and apparatus for raw sensor image enhancement through georegistration
WO2018218536A1 (en) * 2017-05-31 2018-12-06 深圳市大疆创新科技有限公司 Flight control method, apparatus and control terminal and control method therefor, and unmanned aerial vehicle
CN109214254B (en) * 2017-07-07 2020-08-14 北京臻迪科技股份有限公司 Method and device for determining displacement of robot
CN110388939A (en) * 2018-04-23 2019-10-29 湖南海迅自动化技术有限公司 Vehicle-mounted inertial navigation position error correction method based on aerial image matching
CN109029434A (en) * 2018-06-29 2018-12-18 电子科技大学 Sandia inertial terrain-aided navigation method under adaptive scale
CN112154389A (en) * 2019-07-30 2020-12-29 深圳市大疆创新科技有限公司 Terminal device and data processing method thereof, unmanned aerial vehicle and control method thereof
CN113074722A (en) * 2020-01-03 2021-07-06 上海航空电器有限公司 Method for improving, positioning and correcting terrain reference navigation precision based on vision assistance technology
CN111854728B (en) * 2020-05-20 2022-12-13 哈尔滨工程大学 Fault-tolerant filtering method based on generalized relative entropy
CN112859137A (en) * 2020-12-31 2021-05-28 国营芜湖机械厂 Airborne SINS/BDS/GNSS/TAN combined navigation semi-physical simulation system
CN113155126B (en) * 2021-01-04 2023-10-20 航天时代飞鸿技术有限公司 Visual navigation-based multi-machine cooperative target high-precision positioning system and method
CN113406566B (en) * 2021-06-04 2023-09-19 广东汇天航空航天科技有限公司 Method and device for positioning aircraft
CN114111795A (en) * 2021-11-24 2022-03-01 航天神舟飞行器有限公司 Unmanned aerial vehicle self-navigation based on terrain matching

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101046387A (en) * 2006-08-07 2007-10-03 南京航空航天大学 Scene matching method for raising navigation precision and simulating combined navigation system
CN101270993A (en) * 2007-12-12 2008-09-24 北京航空航天大学 Remote high-precision independent combined navigation locating method
CN102506867A (en) * 2011-11-21 2012-06-20 清华大学 SINS (strap-down inertia navigation system)/SMANS (scene matching auxiliary navigation system) combined navigation method based on Harris corner matching and combined navigation system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8244469B2 (en) * 2008-03-16 2012-08-14 Irobot Corporation Collaborative engagement for target identification and tracking

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Haidong Hu et al., "A Novel Algorithm for SINS/CNS/GPS Integrated Navigation System", Joint 48th IEEE Conference on Decision and Control and 28th Chinese Control Conference, Dec. 2009, pp. 1471-1475 *
谢建春 et al., "一种新的复合地形辅助导航方法" (A New Composite Terrain-Aided Navigation Method), 计算机仿真 (Computer Simulation), Mar. 2009, Vol. 26, No. 3, pp. 43-45, 128 *
杨恒 et al., "一种高效的图像局部特征匹配算法" (An Efficient Local Image Feature Matching Algorithm), 西北工业大学学报 (Journal of Northwestern Polytechnical University), Apr. 2010, Vol. 28, No. 2, pp. 291-297 *
江春红 et al., "信息融合技术在INS/GPS/TAN/SMN四组合系统中的应用" (Application of Information Fusion Technology in an INS/GPS/TAN/SMN Integrated System), 信息与控制 (Information and Control), Dec. 2001, Vol. 30, No. 6, pp. 537-542 *
王翌 et al., "惯性/卫星定位/地形匹配/景象匹配组合导航技术" (Inertial/Satellite Positioning/Terrain Matching/Scene Matching Integrated Navigation Technology), 新世纪 新机遇 新挑战——知识创新和高新技术产业发展(下册), 2001 *

Also Published As

Publication number Publication date
CN102506868A (en) 2012-06-20

Similar Documents

Publication Publication Date Title
CN102506868B (en) SINS (strap-down inertia navigation system)/SMANS (scene matching auxiliary navigation system)/TRNS (terrain reference navigation system) combined navigation method based on federated filtering and system
CN103697889B (en) Unmanned aerial vehicle autonomous navigation and positioning method based on multi-model distributed filtering
CN111102978B (en) Method and device for determining vehicle motion state and electronic equipment
CN107727079B (en) Target positioning method of full-strapdown downward-looking camera of micro unmanned aerial vehicle
Conte et al. Vision-based unmanned aerial vehicle navigation using geo-referenced information
Sim et al. Integrated position estimation using aerial image sequences
CN102506867B (en) SINS (strap-down inertia navigation system)/SMANS (scene matching auxiliary navigation system) combined navigation method based on Harris corner matching and combined navigation system
CN102829785B (en) Air vehicle full-parameter navigation method based on sequence image and reference image matching
CN106017463A (en) Aircraft positioning method based on positioning and sensing device
CN111338383B (en) GAAS-based autonomous flight method and system, and storage medium
CN112230242A (en) Pose estimation system and method
CN112146655A (en) Elastic model design method for BeiDou/SINS tight integrated navigation system
KR102239562B1 (en) Fusion system between airborne and terrestrial observation data
CN106352897B (en) Silicon MEMS gyro error estimation and correction method based on a monocular vision sensor
CN110388939A (en) Vehicle-mounted inertial navigation position error correction method based on aerial image matching
Andert et al. Optical-aided aircraft navigation using decoupled visual SLAM with range sensor augmentation
CN109375647A (en) Miniature multi-source perceptual computing system
CN113253325B (en) Inertial satellite sequential tight combination lie group filtering method
CN110068325A (en) Lever arm error compensation method for a vehicle-mounted INS/vision integrated navigation system
Mostafa et al. Optical flow based approach for vision aided inertial navigation using regression trees
Zhao et al. Distributed filtering-based autonomous navigation system of UAV
CN114897942B (en) Point cloud map generation method and device and related storage medium
CN115930948A (en) Orchard robot fusion positioning method
CN106123894B (en) InSAR/INS integrated navigation method based on interference fringe matching
CN109341685B (en) Fixed wing aircraft vision auxiliary landing navigation method based on homography transformation

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant