CN105222788B - Automatic correction method for aircraft route offset error based on feature matching - Google Patents

Automatic correction method for aircraft route offset error based on feature matching

Info

Publication number
CN105222788B
CN105222788B (application CN201510641575.6A)
Authority
CN
China
Prior art keywords
image
aircraft
matching
point
sift
Prior art date
Legal status: Active
Application number
CN201510641575.6A
Other languages
Chinese (zh)
Other versions
CN105222788A (en)
Inventor
陶晓明
刘喜佳
徐洁
葛宁
陆建华
Current Assignee
Tsinghua University
Original Assignee
Tsinghua University
Priority date
Filing date
Publication date
Application filed by Tsinghua University
Priority to CN201510641575.6A
Publication of CN105222788A
Application granted
Publication of CN105222788B


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

An automatic correction method for aircraft route offset error based on feature matching belongs to the technical field of aircraft route position offset correction. It is divided into an offline processing part and an online processing part. In offline processing, prior images of selected landmark regions are obtained by satellite or unmanned aerial vehicle, the SIFT feature parameters of the prior images and the geographic location information of each landmark region are obtained with SIFT algorithm software, and a prior database is generated. The aircraft loads the prior database into its memory; during flight, SIFT feature parameters are extracted from the acquired images with the SIFT algorithm software and matched against the prior-image SIFT features to form feature point pairs. Then, according to a set threshold on the number of feature point pairs in fine matching and the set aircraft yaw accuracy, different offset calculation methods are selected. Compared with real-time aircraft position calculation based on a single landmark point, the present invention overcomes the influence of height and attitude-angle information errors on calculation accuracy.

Description

Automatic correction method for aircraft route offset error based on feature matching
Technical field
The present invention provides an automatic correction method for aircraft route offset error based on feature matching, belonging to the combined field of image information processing and navigation. A route offset calculation method is selected adaptively according to the feature registration result, and the position offset of the aircraft is thereby obtained.
Background technology
During flight, an aircraft relies on an inertial navigation system (Inertial Navigation System, INS) for positioning, but because of the accumulated errors of the inertial devices, the position coordinates output by the INS diverge as time accumulates. To correct this accumulated error, the INS output is usually corrected by means of integrated navigation, which improves the output accuracy of the inertial navigation. INS combined with a satellite positioning system is currently the most common form of integrated navigation, for example with the Global Positioning System (GPS) or the BeiDou Navigation Satellite System (BDS).
However, in practical applications a satellite positioning system is not always available in real time; it is affected by the availability of the satellite platform, signal interference, and so on. When the satellite positioning system fails, the INS output must be corrected by other means. Because visual navigation offers strong independence, high accuracy, and good real-time performance, INS/visual navigation has become an important development direction for integrated navigation.
The patent of Ding Wenrui and Kang Chuanbo (application publication number CN 103822635A), a real-time calculation method for the spatial position of an unmanned aerial vehicle based on visual information, calculates the spatial position of the aircraft in real time from a single landmark point; the calculation needs the height information of the single landmark, so height error is introduced into the result. The present invention, by contrast, makes full use of the information of a landmark region and resolves the offset from SIFT matched feature point pairs. When the number of matched pairs exceeds the set threshold N_fine, the method of step (5.2) or step (5.3) can be used: step (5.2) eliminates the influence of height error on calculation accuracy, and step (5.3) eliminates the influence of attitude angles and height information on calculation accuracy.
Invention content
The object of the present invention is to provide a route offset calculation method based on feature registration that uses an image sensor: when the satellite positioning system fails, the inertial navigation error of the aircraft is corrected in real time through the acquisition of image information of known landmarks.
The system is divided into two parts: offline processing and aircraft online processing.
The purpose of offline processing is to select landmark regions and establish a prior information database; it includes the following steps:
Step (1): select a landmark region and obtain the prior image information of the landmark region.
Step (2): perform SIFT feature extraction on the prior image of the landmark region.
Step (3): generate the prior database of the landmark region, including the prior image information, the feature point information, and the geographic location information of the landmark region.
During aircraft flight, the online processing part uses the image sensor to acquire images of the landmark region, compares them with the prior database of the landmark region established offline, detects the landmark region, and calculates the position information of the aircraft from the pixel coordinates of the landmark region in the acquired image. It includes the following steps:
Step (1): store the prior database of the landmark region in the memory of the aircraft.
Step (2): capture a real-time image of the selected landmark region and perform image preprocessing:
Because of the influence of environmental, sensor, and platform disturbances, the image data acquired by the aircraft may contain noise and various forms of interference; to guarantee the quality and performance of the subsequent operations, the acquired image must first be preprocessed.
Step (3): extract the SIFT features of the acquired image:
SIFT feature extraction is performed on the preprocessed acquired image with SIFT algorithm software: the acquired image of the landmark region is the input of the SIFT algorithm, and the output is a set of stable SIFT feature parameters.
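As an illustration of this step, the following minimal sketch extracts SIFT keypoints and description vectors with OpenCV; the use of OpenCV and the function names below are assumptions, since the patent only refers generically to "SIFT algorithm software".

```python
# Minimal sketch: SIFT keypoint and descriptor extraction for one image, assuming
# OpenCV as the "SIFT algorithm software"; the patent does not name a library.
import cv2

def extract_sift_features(image_path):
    """Return SIFT keypoints and 128-D description vectors for one image."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)   # acquired or prior image
    sift = cv2.SIFT_create()                              # SIFT detector/descriptor
    keypoints, descriptors = sift.detectAndCompute(img, None)
    return keypoints, descriptors                         # descriptors: N x 128 array
```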
Step (4): feature matching:
Feature matching registers the SIFT feature parameter set of the prior image in the prior database of the landmark region against the SIFT feature parameter set of the image acquired by the image sensor. Feature matching is divided into two parts: coarse matching and fine matching.
Step (4.1): coarse feature matching:
Coarse matching establishes the mapping relationship between the prior image and the acquired image from the Euclidean distances between the description vectors of the SIFT features of the landmark-region prior image and of the acquired image, and performs a rough registration using those distances. The output of coarse matching contains the feature point pairs successfully matched between the prior image and the acquired image, together with a score for each matched pair; the score is the square of the Euclidean distance between the description vectors of the pair.
Step (4.2): count the coarse matching result:
If the number of successfully matched feature point pairs in the coarse registration is 0, output "calculation failed" and return to step (2); if the number of successfully matched feature point pairs in this step is greater than zero, count the number N of successful matches.
Step (4.3): fine feature matching:
Set the threshold for fine matching to N_fine (N_fine > 3) and the required calculation accuracy to DIST (in metres).
Fine matching further screens the coarse matches with MLESAC (Maximum Likelihood Estimation by Sample and Consensus): the erroneously matched feature pairs are removed, and the remaining matched pairs are passed to step (5) to calculate the aircraft position offset.
When the number N of successfully coarse-registered feature point pairs satisfies N ≤ N_fine, output the pair with the smallest score among the N pairs and calculate with the method of step (5.1). When N > N_fine, perform fine registration to screen out erroneous matches and output the correct feature point pairs, and then judge the set calculation accuracy DIST: if DIST ≤ 10 m, go to step (5.3) for the calculation; if DIST > 10 m, go to step (5.2) for the calculation.
Step (5): route offset calculation:
The route offset calculation computes the position offset of the aircraft from the pixel coordinates of the landmark region in the acquired image; this is the core of the present invention.
The present invention provides three route offset calculation methods (a), (b), (c); the corresponding method is selected according to the number N of successfully matched feature point pairs in fine matching and the set calculation accuracy DIST.
The advantages of the invention are:
(1) Good adaptability. No GPS or BeiDou information is needed: the spatial position of the aircraft can be determined directly from the coordinates of the known landmark region in the acquired image.
(2) Good stability. Three strategies are designed for calculating the position coordinates of the aircraft. Combining the three methods makes full use of the information of the landmark region and improves the calculation accuracy; at the same time, when the information near the landmark point is incomplete, the calculation can still be performed with method (a), so the approach is highly adaptable.
Description of the drawings
Fig. 1 is the flow chart of the processed offline part of the present invention.
Fig. 2 is the flow chart of the online processing part of the present invention.
Fig. 3 shows the output after coarse matching in the online processing.
Fig. 4 shows the output after fine matching in the online processing.
Specific embodiment
A specific embodiment of the present invention is described in detail below with reference to the accompanying drawings.
The flow chart of the offline processing part of the automatic correction method for aircraft route offset error based on feature matching of the present invention is shown in Fig. 1; it includes the following steps:
Step (1): select a landmark region and obtain the prior image of the landmark region:
The landmark region is selected according to the flight route of the aircraft drafted in advance. The selected landmark region should have rich, distinct image features and should not be easy to confuse; for example, skyscrapers, viaducts, and rivers have obvious characteristics and are difficult to duplicate.
Image acquisition of the chosen landmark region is carried out by satellite, unmanned aerial vehicle, or similar means, yielding the prior image of the landmark region and the geographic coordinate information of the landmark region.
Step (2): extract the SIFT features of the prior image:
SIFT feature extraction is performed on the prior image with SIFT algorithm software to obtain the description vector of each feature point, forming the set V_c of prior-image feature point description vectors, where N_c is the number of SIFT feature points of the prior image.
Step (3): generate the prior database of the landmark region: the image information obtained in steps (1) and (2), the geographic coordinates of the landmark region, and the SIFT feature parameters are stored in the prior database, completing the establishment of the prior database.
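As a data-structure illustration of this step, the sketch below bundles one landmark region's prior image, SIFT parameters, and geographic coordinates into a single record; the dictionary layout and the field names are assumptions for illustration, not taken from the patent.

```python
# Minimal sketch of one landmark region's prior-database record (prior image, SIFT
# parameters, geographic coordinates); field names are illustrative assumptions.
import numpy as np

def build_prior_record(prior_image, keypoints, descriptors, geo_coord):
    """geo_coord: geographic location of the landmark region, e.g. (lon, lat, alt)."""
    return {
        "image": prior_image,                                     # prior image
        "keypoint_xy": np.float32([kp.pt for kp in keypoints]),   # pixel coordinates
        "descriptors": descriptors,                               # N_c x 128 SIFT vectors
        "geo_coord": geo_coord,                                   # landmark geographic position
    }
```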
The flow chart of the online processing part of the automatic correction method for aircraft route offset error based on feature matching of the present invention is shown in Fig. 2; it includes the following steps:
Step (1): the prior database of the landmark region is stored in the memory of the aircraft in advance.
Step (2): capture a real-time image of the selected landmark region and perform image preprocessing:
The acquired image obtained by the aircraft image sensor is preprocessed: denoising with median filtering, image enhancement by grey-level histogram correction, and lens distortion correction using the camera intrinsic parameters obtained by calibration in advance.
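A minimal sketch of this preprocessing chain using OpenCV is given below; the kernel size and the calibration inputs camera_matrix / dist_coeffs (from an offline calibration) are assumptions, not values from the patent.

```python
# Minimal sketch of the preprocessing chain above (median filtering, grey-level
# histogram enhancement, lens-distortion correction) with OpenCV.
import cv2

def preprocess(raw_gray, camera_matrix, dist_coeffs):
    denoised = cv2.medianBlur(raw_gray, 3)                # median-filter denoising
    enhanced = cv2.equalizeHist(denoised)                 # grey-level histogram correction
    return cv2.undistort(enhanced, camera_matrix, dist_coeffs)   # lens distortion correction
```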
Step (3): perform SIFT feature extraction on the acquired image of the landmark region:
SIFT feature extraction is performed on the preprocessed acquired image with SIFT algorithm software, obtaining the description vector of each feature point and generating the set V_p of acquired-image SIFT feature description vectors, where N_p is the number of SIFT feature points of the acquired image.
Step (4): perform SIFT feature matching between the prior image and the acquired image:
The SIFT feature parameter set of the prior image in the prior database of the landmark region is matched against the SIFT feature parameter set of the image acquired by the image sensor. Feature matching is divided into two parts: coarse matching and fine matching.
Step (4.1): coarse feature matching:
Step (4.1.1): take the description vector of any one feature point from the acquired-image SIFT feature point set V_p and match it against the description vectors of all feature points in the prior-image SIFT feature description vector set V_c, performing the coarse matching of the feature point according to the following steps:
Step (4.1.1.1): compute the following parameters:
Compute the Euclidean distance between the chosen description vector and the description vector of every feature point in V_c, and sort the Euclidean distances from smallest to largest to form an ascending distance sequence; the smallest value in the sequence is the unique minimum, and the next value is the sub-minimum of the Euclidean distance sequence.
Step (4.1.1.2): judge whether the minimum distance is sufficiently smaller than the sub-minimum distance (whether their ratio falls below the preset matching threshold):
If it is, the match succeeds: a feature point pair is obtained, and the square of the minimum Euclidean distance is output as the score of this feature point pair.
If it is not, the coarse match of this point fails, i.e. this feature point of the acquired image has no corresponding feature point in the prior image that can be matched.
Step (4.1.1.3): perform steps (4.1.1.1) to (4.1.1.2) in turn for each remaining feature point in the acquired-image SIFT feature point set V_p, until all N_p SIFT feature points of the acquired image have been matched.
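A minimal sketch of this coarse matching loop is given below; the nearest/second-nearest comparison follows the steps above, while the concrete ratio threshold of 0.8 is an assumption, since the patent does not state the value.

```python
# Minimal sketch of the coarse matching of steps (4.1.1.1)-(4.1.1.3): for each
# acquired-image descriptor, find the nearest and second-nearest prior-image descriptors
# by Euclidean distance and accept the pair when the minimum is clearly below the
# sub-minimum. The ratio threshold 0.8 is an assumption.
import numpy as np

def coarse_match(desc_acquired, desc_prior, ratio=0.8):
    matches = []                                          # (index in V_p, index in V_c, score)
    for i, d in enumerate(desc_acquired):
        dists = np.linalg.norm(desc_prior - d, axis=1)    # Euclidean distances to all of V_c
        order = np.argsort(dists)
        d_min, d_sub = dists[order[0]], dists[order[1]]
        if d_min < ratio * d_sub:                         # minimum sufficiently below sub-minimum
            matches.append((i, int(order[0]), float(d_min ** 2)))   # score = squared distance
    return matches
```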
Step (4.2): count the coarse matching result:
If the number of successfully matched SIFT points is 0, output "calculation failed" and return to step (2); if the number of successfully matched SIFT feature points is greater than zero, count the number N of successful matches.
Fig. 3 shows the result after coarse matching: the red area is the prior image and the red circles represent the SIFT feature points of the prior image; the blue part is the acquired image and the green dots represent the SIFT feature points of the acquired image. As shown in the figure, the coarse registration still contains erroneously matched point pairs, so fine registration is used for further screening.
Step (4.3): fine feature point matching:
Step (4.3.1): initialisation:
Set the fine matching threshold N_define = 8 and the yaw accuracy DIST = 20 m.
Step (4.3.2): judge whether the number N of successful coarse matches counted in step (4.2) satisfies N ≤ 8:
If the number of coarsely matched feature points satisfies N ≤ 8, perform step (5-a);
If the number of coarsely matched feature points satisfies N > 8, perform step (4.3.3).
Step (4.3.3): perform fine registration using MLESAC (Maximum Likelihood Estimation by Sample and Consensus): the consistency of the coordinate relationship of the matches determined by coarse registration is analysed, the erroneously matched feature pairs are removed, and the remaining correctly matched feature points are output.
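A minimal sketch of this screening step is shown below. OpenCV does not expose MLESAC directly, so RANSAC estimation of a homography between the coarsely matched pixel coordinates is used here as a stand-in robust estimator; the 3.0-pixel reprojection threshold is an assumption. The inlier mask plays the role of the correct matches passed on to step (5).

```python
# Minimal sketch of the fine-match screening in step (4.3.3), using RANSAC on a
# homography as a stand-in for MLESAC; the reprojection threshold is an assumption.
import cv2

def fine_match(pts_prior, pts_acquired):
    """pts_*: (N, 2) float32 arrays of coarsely matched pixel coordinates."""
    _, inlier_mask = cv2.findHomography(pts_prior, pts_acquired,
                                        cv2.RANSAC, ransacReprojThreshold=3.0)
    keep = inlier_mask.ravel().astype(bool)
    return pts_prior[keep], pts_acquired[keep]            # correctly matched point pairs
```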
Fig. 4 shows the feature matching result output after fine registration; the erroneously matched points have been removed.
Step (4.3.4): judge whether the set yaw accuracy satisfies DIST ≤ 10 m:
If DIST > 10 m, perform step (5-b);
If DIST ≤ 10 m, perform step (5-c).
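The branching of steps (4.3.2) and (4.3.4) can be summarised by the small selection routine below; the string labels stand for methods (5-a), (5-b), and (5-c), and the default values mirror the thresholds set in step (4.3.1).

```python
# Summary sketch of the adaptive choice of offset solver from the coarse match count N
# and the required yaw accuracy DIST; labels and defaults are illustrative.
def select_solver(n_matches, n_define=8, dist=20.0):
    if n_matches <= n_define:
        return "method_5a"      # single best-scoring match; needs height H and attitude angles
    if dist > 10.0:
        return "method_5b"      # least squares over all matches; needs attitude angles only
    return "method_5c"          # iterative optimisation; needs neither height nor attitude
```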
Step (5): calculate the position offset of the aircraft from the pixel coordinates of the landmark region in the acquired image; this is the core of the present invention:
The data provided by the aircraft include the current latitude and longitude coordinates (B, L) of the aircraft, the current height H, and the current attitude angles of the aircraft: yaw angle α, pitch angle β, roll angle γ.
The three methods for calculating the position offset of the aircraft are implemented as follows:
Step (5.1): using the height information H, the three attitude angles α, β, γ, and the pixel coordinates of the landmark region M in the captured image, calculate the coordinates of the aircraft in the east-north-up (ENU) coordinate system established with the position coordinates provided by the aircraft inertial navigation as origin.
Let the coordinates of an arbitrary point m of the landmark region M in the camera coordinate system be (x_m, y_m, z_m), the coordinates of its corresponding imaging point p in the imaging plane coordinate system be (p_x, p_y), and the coordinates of the imaging point p in the image coordinate system (u-v coordinate system) be (u_p, v_p). The relation between the coordinates (u_p, v_p) in the image coordinate system and the coordinates (p_x, p_y) in the imaging plane coordinate system is given by formula 1-1. From projection theory, the relation between the coordinates (x_m, y_m, z_m) of m in the camera coordinate system and the imaging plane coordinates (p_x, p_y) is given by formula 1-2, where S_x and S_y are the numbers of pixels per unit physical size along the X-axis and Y-axis of the imaging plane respectively and f is the focal length of the camera.
Combining formula 1-1 and formula 1-2 gives formula 1-3.
An ENU coordinate system (X_w Y_w Z_w) is established with the optical centre O of the camera as origin; the relation between the camera coordinate system and this ENU coordinate system is given by formula 1-4, where R_w is the attitude rotation matrix between the camera coordinate system and the camera ENU coordinate system.
Formula 1-3 and formula 1-4 together give formula 1-5.
When one landmark point m can be correctly identified, the three attitude angles at that moment, yaw angle α, pitch angle β, and roll angle γ, and the shooting height H can be obtained; with z_w = H, the z-axis coordinate z_m of the landmark point m in the camera coordinate system is obtained, and hence the coordinates (x_w, y_w, z_w) of m in the camera ENU coordinate system are found.
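The formulas 1-1 to 1-5 referenced above are not reproduced in this text; the following LaTeX sketch gives a standard pinhole-camera reading consistent with the surrounding definitions (S_x, S_y pixel densities, f focal length, R_w the camera-to-ENU rotation). It is a reconstruction under those assumptions, not the patent's verbatim formulas, and it omits any principal-point offset.

```latex
% Reconstruction sketch of the referenced relations (not verbatim from the patent).
\begin{aligned}
&\text{(1-1)} && u_p = S_x\, p_x, \qquad v_p = S_y\, p_y \\
&\text{(1-2)} && p_x = f\,\frac{x_m}{z_m}, \qquad p_y = f\,\frac{y_m}{z_m} \\
&\text{(1-3)} && u_p = f S_x\,\frac{x_m}{z_m}, \qquad v_p = f S_y\,\frac{y_m}{z_m} \\
&\text{(1-4)} && (x_m,\, y_m,\, z_m)^{\mathsf T} = R_w\,(x_w,\, y_w,\, z_w)^{\mathsf T}
\end{aligned}
```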
This method relies on the height H of the aircraft, the attitude angles, and the coordinates of the landmark in the image. The height H is obtained from a height sensor measurement and contains error, and the attitude angles likewise contain errors; using data that already contain errors to calculate the position of the aircraft carries those errors into the result.
Step (5.2): following from method (5-a), assume the true position of the aircraft has coordinates (x_c, y_c, z_c) in the ENU coordinate system established with the aircraft navigation output position as origin, that the pixel coordinates of a landmark point m of the landmark region in the acquired image are (u_p, v_p), and that the coordinates of the landmark point m in the same ENU coordinate system are (x_m, y_m, z_m); then the imaging position (u_p, v_p) in the image coordinate system is determined by imaging equation 1-6:

$$
u_p = \frac{\bigl[(\cos\alpha\cos\gamma + \sin\alpha\sin\beta\sin\gamma)(x_m - x_c) - \cos\beta\sin\gamma\,(z_m - z_c) - (\cos\gamma\sin\alpha - \cos\alpha\sin\beta\sin\gamma)(y_m - y_c)\bigr]\, f S_x}{(\cos\alpha\sin\gamma - \cos\gamma\sin\alpha\sin\beta)(x_m - x_c) - (\sin\alpha\sin\gamma + \cos\alpha\cos\gamma\sin\beta)(y_m - y_c) + \cos\beta\cos\gamma\,(z_m - z_c)}
$$

$$
v_p = \frac{\bigl[\sin\beta\,(z_m - z_c) + \cos\alpha\cos\beta\,(y_m - y_c) + \cos\beta\sin\alpha\,(x_m - x_c)\bigr]\, f S_y}{(\cos\alpha\sin\gamma - \cos\gamma\sin\alpha\sin\beta)(x_m - x_c) - (\sin\alpha\sin\gamma + \cos\alpha\cos\gamma\sin\beta)(y_m - y_c) + \cos\beta\cos\gamma\,(z_m - z_c)}
\qquad (1\text{-}6)
$$

where (x_c, y_c, z_c) are the spatial coordinates of the camera; (α, β, γ) are the spatial attitude angles of the camera; and (f, S_x, S_y) are the imaging focal length and pixel distribution densities of the camera, i.e. its intrinsic parameters, which are provided by the manufacturer or obtained by offline calibration.
The above equations, with the attitude angles (α, β, γ) given, form a system that is linear in (x_c, y_c, z_c); solving for (x_c, y_c, z_c) with the least-squares criterion gives the current position coordinates (x_c, y_c, z_c) of the unmanned aerial vehicle, i.e. the offset of the true position of the aircraft relative to the position coordinates output by the aircraft inertial navigation. This method uses only the attitude angle information for the calculation, eliminating the influence of the height H measurement error on the final result.
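A minimal numerical sketch of this least-squares step is given below: with the attitude angles fixed at the INS values, imaging equation 1-6 is cross-multiplied so that each matched landmark point contributes two rows that are linear in (x_c, y_c, z_c). The row construction from rotation-matrix rows and all names here are illustrative assumptions.

```python
# Minimal sketch of the least-squares position solve: R is the rotation matrix built
# from (alpha, beta, gamma); each matched point gives two linear rows in (x_c, y_c, z_c).
import numpy as np

def solve_position_lsq(landmark_xyz, pixel_uv, R, f, Sx, Sy):
    """landmark_xyz: (N, 3) ENU coordinates of matched landmarks; pixel_uv: (N, 2)."""
    A, b = [], []
    for (xm, ym, zm), (up, vp) in zip(landmark_xyz, pixel_uv):
        m = np.array([xm, ym, zm])
        # u_p * (R[2].(m - c)) = f*Sx * (R[0].(m - c))
        #   =>  (f*Sx*R[0] - u_p*R[2]) . c = (f*Sx*R[0] - u_p*R[2]) . m   (and similarly for v_p)
        row_u = f * Sx * R[0] - up * R[2]
        row_v = f * Sy * R[1] - vp * R[2]
        A.extend([row_u, row_v])
        b.extend([row_u @ m, row_v @ m])
    c, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return c                                              # estimated (x_c, y_c, z_c)
```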
Step (5.3): when the number of matched feature pairs satisfies N ≥ 3, an optimisation function can also be constructed and the optimisation problem solved by an iterative method.
The optimisation function is constructed as in formula 1-7, where the predicted pixel coordinate of each finely matched feature point in the acquired image is computed from the position and attitude angles of the aircraft through the imaging equation and is therefore a function of (x_c, y_c, z_c) and (α, β, γ), and the observed value is the pixel coordinate of that finely matched feature point in the acquired image.
m is the number of feature point pairs output by fine matching, i = 1, 2, ..., m.
The true position coordinates and attitude angles of the aircraft are obtained by iterating on the optimisation function J, using MATLAB's built-in fminsearch function; the number of iterations is set to 10000 and the allowable iteration error is set to 1e-4, yielding (x_c, y_c, z_c). The aircraft position coordinates calculated by this method eliminate the influence of height error and attitude error.
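A minimal sketch of this iterative solution follows, using scipy's Nelder-Mead minimiser as an analogue of MATLAB's fminsearch. The cost J is assumed here to be the summed squared difference between predicted and observed pixel coordinates (formula 1-7 is not reproduced in the text), and predict_pixel() is an illustrative stand-in for imaging equation 1-6.

```python
# Minimal sketch of the iterative solve over position and attitude; the cost function
# form and the helper predict_pixel() are assumptions for illustration.
import numpy as np
from scipy.optimize import minimize

def solve_position_iterative(landmark_xyz, pixel_uv, predict_pixel, x0):
    """x0: initial guess (x_c, y_c, z_c, alpha, beta, gamma), e.g. the INS output."""
    def J(params):
        return sum(np.sum((predict_pixel(m, params) - uv) ** 2)
                   for m, uv in zip(landmark_xyz, pixel_uv))
    res = minimize(J, x0, method="Nelder-Mead",
                   options={"maxiter": 10000, "xatol": 1e-4, "fatol": 1e-4})
    return res.x[:3]                                      # estimated (x_c, y_c, z_c)
```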
Of the three methods, the first requires both attitude angle information and height information, so the calculated aircraft position coordinates contain attitude error and height error. However, its requirement on the matched feature pairs is the lowest: in theory it can be used whenever the number of matched feature pairs satisfies N ≥ 1.
The second method theoretically requires matched feature pairs with N ≥ 3; the corresponding threshold N_define can be set in the actual design according to the real system. It does not need height information and only needs the attitude angle information; the error of the calculated aircraft position coordinates is affected only by attitude error and is smaller than that of the first method.
The third method theoretically requires matched feature pairs with N ≥ 3; the corresponding threshold N_define can likewise be set according to the real system. It uses an iterative method and needs neither height information nor attitude angle information; the calculation accuracy can reach better than 10 m.
Combining the three calculation methods (a), (b), (c) makes full use of the information of the landmark region and improves the calculation accuracy; at the same time, when the information near the landmark point is incomplete, the calculation can still be performed with method (a), so the approach is highly adaptable.

Claims (1)

1. An automatic correction method for aircraft route offset error based on feature matching, characterised in that it is divided into two parts, ground offline processing and aircraft online processing:
The ground offline processing part is divided into the following steps:
Step (1): choose a region with obvious characteristics that is difficult to duplicate as the landmark region, and obtain the prior image of the landmark region using an unmanned aerial vehicle or a satellite;
Step (2): extract the SIFT features of the prior image:
Using SIFT algorithm software, extract the description vectors of the SIFT features of the prior image, forming the set V_c of prior-image feature point description vectors, where N_c is the number of SIFT feature points of the prior image;
Step (3): generate the prior database of the landmark region, including the prior image, the feature point information, and the geographic location information of the landmark region;
The aircraft online processing part is divided into the following steps:
Step (1): the prior database of the landmark region is loaded into the memory of the aircraft in advance;
Step (2): capture the real-time image of the selected landmark region and perform image preprocessing:
The landmark region is photographed during real-time flight of the aircraft to acquire the image; the image is then denoised with median filtering and enhanced by grey-level histogram adjustment in turn, and finally lens distortion correction is performed with the preset camera intrinsic parameters;
Step (3): extract the SIFT features of the acquired image:
Using SIFT algorithm software, extract the description vectors of the SIFT features of the acquired image, generating the set V_p of acquired-image SIFT feature point description vectors, where N_p is the number of SIFT feature points of the acquired image;
Step (4): perform SIFT feature matching between the prior image and the acquired image:
Step (4.1): coarse feature point matching:
Step (4.1.1): take the description vector of any one feature point from the acquired-image SIFT feature point set V_p and match it against the description vectors of all feature points in the prior-image SIFT feature description vector set V_c, performing the coarse matching of the feature point according to the following steps:
Step (4.1.1.1): compute the following parameters:
Compute the Euclidean distance between the chosen description vector and the description vector of every feature point in V_c, and sort the Euclidean distances from smallest to largest to form an ascending distance sequence; the smallest value in the sequence is the unique minimum, and the next value is the sub-minimum of the Euclidean distance sequence;
Step (4.1.1.2): judge whether the minimum distance is sufficiently smaller than the sub-minimum distance (whether their ratio falls below the preset matching threshold):
If it is, the match succeeds: a feature point pair is obtained, and the square of the minimum Euclidean distance is output as the score of this feature point pair;
If it is not, the coarse match of this point fails, i.e. this feature point of the acquired image has no corresponding feature point in the prior image that can be matched;
Step (4.1.1.3): perform steps (4.1.1.1) to (4.1.1.2) in turn for each remaining feature point in the acquired-image SIFT feature point set V_p, until all N_p SIFT feature points of the acquired image have been matched;
Step (4.2): count the coarse matching result:
If the number of successfully matched SIFT points is 0, output "calculation failed" and return to step (2); if the number of successfully matched SIFT feature points is greater than zero, count the number N of successful matches;
Step (4.3): fine feature point matching:
Step (4.3.1): initialisation:
Set the fine matching threshold N_define, where N_define should satisfy N_define > 3, and the yaw accuracy DIST, in metres;
Step (4.3.2): judge whether the number N of successful coarse matches counted in step (4.2) satisfies N ≤ N_define:
If the number of coarsely matched feature points satisfies N ≤ N_define, perform step (5.2);
If the number of coarsely matched feature points satisfies N > N_define, perform step (4.3.3);
Step (4.3.3): perform fine registration using MLESAC (Maximum Likelihood Estimation by Sample and Consensus): the consistency of the coordinate relationship of the matches determined by coarse registration is analysed, the erroneously matched feature pairs are removed, and the remaining correctly matched feature points are output;
Step (4.3.4): judge whether the set yaw accuracy satisfies DIST ≤ 10 m:
If DIST > 10 m, perform step (5.2);
If DIST ≤ 10 m, perform step (5.3);
Step (5): calculate the position offset of the aircraft from the pixel coordinates of the landmark region in the acquired image:
Step (5.1): the aircraft acquires the following parameters:
the current latitude and longitude coordinates (B, L) of the unmanned aerial vehicle, the current height H, and the current attitude angles of the unmanned aerial vehicle: yaw angle α, pitch angle β, roll angle γ;
Step (5.2): the steps are as follows:
Step (5.2.1): select the matched pair with the smallest score and perform step (5.2.2);
Step (5.2.2): calculate the position offset of the aircraft according to the following steps in turn:
Step (5.2.2.1): establish the ENU coordinate system (X_w Y_w Z_w) with the optical centre O of the camera as origin;
Step (5.2.2.2): calculate, according to the following relation, the coordinates (x_w, y_w, z_w) of the landmark point m in the ENU coordinate system established with the position coordinates provided by the aircraft inertial navigation system as origin:
where (u_p, v_p) are the pixel coordinates of the landmark point m in the acquired image,
z_m is the z-axis coordinate of the landmark point m in the camera coordinate system; z_w = H is known, so z_m is obtained, and hence x_w and y_w can be found,
R_w is the attitude rotation matrix between the camera coordinate system and the camera ENU coordinate system, and R_w^T is its transposed matrix,
S_x and S_y are respectively the numbers of pixels per unit physical size along the X-axis and Y-axis of the acquired-image imaging plane coordinate system, and are camera intrinsic parameters,
f is the camera focal length and is a camera intrinsic parameter,
Step (5.3): the steps are as follows:
Step (5.3.1): set:
the coordinates of the true position of the aircraft in the ENU coordinate system established with the aircraft navigation output position as origin are (x_c, y_c, z_c), the pixel coordinates of the landmark point m of the landmark region in the acquired image-sensor image are (u_p, v_p), and the coordinates of the landmark point m in the ENU coordinate system established with the aircraft navigation output position as origin are (x_m, y_m, z_m);
Step (5.3.1.1): the imaging equation is:

$$
u_p = \frac{\bigl[(\cos\alpha\cos\gamma + \sin\alpha\sin\beta\sin\gamma)(x_m - x_c) - \cos\beta\sin\gamma\,(z_m - z_c) - (\cos\gamma\sin\alpha - \cos\alpha\sin\beta\sin\gamma)(y_m - y_c)\bigr]\, f S_x}{(\cos\alpha\sin\gamma - \cos\gamma\sin\alpha\sin\beta)(x_m - x_c) - (\sin\alpha\sin\gamma + \cos\alpha\cos\gamma\sin\beta)(y_m - y_c) + \cos\beta\cos\gamma\,(z_m - z_c)}
$$

$$
v_p = \frac{\bigl[\sin\beta\,(z_m - z_c) + \cos\alpha\cos\beta\,(y_m - y_c) + \cos\beta\sin\alpha\,(x_m - x_c)\bigr]\, f S_y}{(\cos\alpha\sin\gamma - \cos\gamma\sin\alpha\sin\beta)(x_m - x_c) - (\sin\alpha\sin\gamma + \cos\alpha\cos\gamma\sin\beta)(y_m - y_c) + \cos\beta\cos\gamma\,(z_m - z_c)}
$$

Step (5.3.1.2): the above equations, with the attitude angles (α, β, γ) given, form a system that is linear in (x_c, y_c, z_c); solving for (x_c, y_c, z_c) with the least-squares criterion gives the current position coordinates (x_c, y_c, z_c) of the unmanned aerial vehicle, i.e. the offset of the true position of the aircraft relative to the position coordinates output by the aircraft inertial navigation;
Step (5.4): the steps are as follows:
Step (5.4.1): calculate the optimisation function J as follows,
where
m is the number of feature point pairs output by fine matching, i = 1, 2, ..., m,
the predicted pixel coordinate of each finely matched feature point in the acquired image is computed from the position and attitude angles of the aircraft through the imaging equation and is therefore a function of (x_c, y_c, z_c) and (α, β, γ),
the observed value is the pixel coordinate of that finely matched feature point in the acquired image;
Step (5.4.2): solve with the following optimisation method:
iterate on J with MATLAB's fminsearch function, with the number of iterations set to 10000 and the allowable iteration error set to 1e-4, obtaining (x_c, y_c, z_c).
CN201510641575.6A 2015-09-30 2015-09-30 Automatic correction method for aircraft route offset error based on feature matching Active CN105222788B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510641575.6A CN105222788B (en) 2015-09-30 2015-09-30 The automatic correcting method of the matched aircraft Route Offset error of feature based

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510641575.6A CN105222788B (en) 2015-09-30 2015-09-30 The automatic correcting method of the matched aircraft Route Offset error of feature based

Publications (2)

Publication Number Publication Date
CN105222788A CN105222788A (en) 2016-01-06
CN105222788B true CN105222788B (en) 2018-07-06

Family

ID=54991878

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510641575.6A Active CN105222788B (en) 2015-09-30 2015-09-30 The automatic correcting method of the matched aircraft Route Offset error of feature based

Country Status (1)

Country Link
CN (1) CN105222788B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106444832A (en) * 2016-09-28 2017-02-22 北京航天时代激光导航技术有限责任公司 Navigation method for low-altitude cruising state of antisubmarine aircraft

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105825517B (en) * 2016-03-31 2018-09-07 湖北航天技术研究院总体设计所 A kind of image correcting method and system of navigation height error
CN108253940B (en) 2016-12-29 2020-09-22 东莞前沿技术研究院 Positioning method and device
CN108958064B (en) * 2017-05-17 2021-10-01 上海微小卫星工程中心 Attitude guidance law error judgment method and system and electronic equipment
CN109307510A (en) * 2017-07-28 2019-02-05 广州极飞科技有限公司 Flight navigation method, apparatus and unmanned vehicle
CN109324337B (en) * 2017-07-31 2022-01-14 广州极飞科技股份有限公司 Unmanned aerial vehicle route generation and positioning method and device and unmanned aerial vehicle
CN107843240B (en) * 2017-09-14 2020-03-06 中国人民解放军92859部队 Method for rapidly extracting same-name point information of unmanned aerial vehicle image in coastal zone
CN107976146B (en) * 2017-11-01 2019-12-10 中国船舶重工集团公司第七一九研究所 Self-calibration method and measurement method of linear array CCD camera
CN109344970B (en) * 2018-11-27 2022-03-15 中国电子科技集团公司第二十研究所 Vision target-based dynamic reasoning method on unmanned aerial vehicle
CN109614998A (en) * 2018-11-29 2019-04-12 北京航天自动控制研究所 Landmark database preparation method based on deep learning
CN110113528B (en) * 2019-04-26 2021-05-07 维沃移动通信有限公司 Parameter obtaining method and terminal equipment
CN111815525B (en) * 2019-12-11 2024-04-09 长沙天仪空间科技研究院有限公司 Scene-based radiation calibration method and system
CN111899305A (en) * 2020-07-08 2020-11-06 深圳市瑞立视多媒体科技有限公司 Camera automatic calibration optimization method and related system and equipment
CN112033390B (en) * 2020-08-18 2022-07-12 深圳优地科技有限公司 Robot navigation deviation rectifying method, device, equipment and computer readable storage medium
CN112902957B (en) * 2021-01-21 2024-01-16 中国人民解放军国防科技大学 Missile-borne platform navigation method and system
CN113252079B (en) * 2021-07-05 2022-03-29 北京远度互联科技有限公司 Pod calibration method and device for unmanned aerial vehicle, electronic equipment and storage medium
CN114693807B (en) * 2022-04-18 2024-02-06 国网江苏省电力有限公司泰州供电分公司 Method and system for reconstructing mapping data of power transmission line image and point cloud
CN114578188B (en) * 2022-05-09 2022-07-08 环球数科集团有限公司 Power grid fault positioning method based on Beidou satellite
CN115618749B (en) * 2022-12-05 2023-04-07 四川腾盾科技有限公司 Error compensation method for real-time positioning of large unmanned aerial vehicle

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101532841A (en) * 2008-12-30 2009-09-16 华中科技大学 Method for navigating and positioning aerocraft based on landmark capturing and tracking
CN103411609A (en) * 2013-07-18 2013-11-27 北京航天自动控制研究所 Online composition based aircraft return route programming method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100250123A1 (en) * 2009-03-30 2010-09-30 Caterpillar Inc. Method and system for dispensing material from machines
US8195393B2 (en) * 2009-06-30 2012-06-05 Apple Inc. Analyzing and consolidating track file data

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101532841A (en) * 2008-12-30 2009-09-16 华中科技大学 Method for navigating and positioning aerocraft based on landmark capturing and tracking
CN103411609A (en) * 2013-07-18 2013-11-27 北京航天自动控制研究所 Online composition based aircraft return route programming method

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106444832A (en) * 2016-09-28 2017-02-22 北京航天时代激光导航技术有限责任公司 Navigation method for low-altitude cruising state of antisubmarine aircraft

Also Published As

Publication number Publication date
CN105222788A (en) 2016-01-06

Similar Documents

Publication Publication Date Title
CN105222788B (en) The automatic correcting method of the matched aircraft Route Offset error of feature based
CN105021184B (en) It is a kind of to be used for pose estimating system and method that vision under mobile platform warship navigation
CN106529495B (en) Obstacle detection method and device for aircraft
CN108534782B (en) Binocular vision system-based landmark map vehicle instant positioning method
Kolomenkin et al. Geometric voting algorithm for star trackers
CN110081881B (en) Carrier landing guiding method based on unmanned aerial vehicle multi-sensor information fusion technology
CN110296691A (en) Merge the binocular stereo vision measurement method and system of IMU calibration
EP1106505B1 (en) Attitude angle sensor correcting apparatus for an artificial satellite
CN107796391A (en) A kind of strapdown inertial navigation system/visual odometry Combinated navigation method
CN107560603B (en) Unmanned aerial vehicle oblique photography measurement system and measurement method
US20130322698A1 (en) Method and an apparatus for image-based navigation
CN113850126A (en) Target detection and three-dimensional positioning method and system based on unmanned aerial vehicle
CN108665499B (en) Near distance airplane pose measuring method based on parallax method
CN111829532B (en) Aircraft repositioning system and method
CN109387192B (en) Indoor and outdoor continuous positioning method and device
CN115187798A (en) Multi-unmanned aerial vehicle high-precision matching positioning method
CN110751123B (en) Monocular vision inertial odometer system and method
CN109443359A (en) A kind of geographic positioning of ground full-view image
CN109376208A (en) A kind of localization method based on intelligent terminal, system, storage medium and equipment
CN111238540A (en) Lopa gamma first camera-satellite sensitive installation calibration method based on fixed star shooting
CN111044037A (en) Geometric positioning method and device for optical satellite image
CN110246194A (en) Method for quickly calibrating rotation relation between camera and inertia measurement unit
CN109470239A (en) Field compensation method, relevant apparatus and computer program
CN112489091A (en) Full strapdown image seeker target tracking method based on direct-aiming template
CN114693754A (en) Unmanned aerial vehicle autonomous positioning method and system based on monocular vision inertial navigation fusion

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant