CN105043392A - Aircraft pose determining method and aircraft pose determining device - Google Patents

Aircraft pose determining method and aircraft pose determining device

Info

Publication number
CN105043392A
CN105043392A
Authority
CN
China
Prior art keywords
real-time image
texture image
feature point
image
pixel region
Prior art date
Legal status
Granted
Application number
CN201510507925.XA
Other languages
Chinese (zh)
Other versions
CN105043392B (en)
Inventor
李立春
周建亮
孙军
范文山
张伟
尚德生
苗毅
许颖慧
程肖
吴晓进
戴烨
Current Assignee
PEOPLES LIBERATION ARMY TROOP 63920
Original Assignee
PEOPLES LIBERATION ARMY TROOP 63920
Application filed by PEOPLES LIBERATION ARMY TROOP 63920
Priority to CN201510507925.XA
Publication of CN105043392A
Application granted
Publication of CN105043392B
Legal status: Active


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 Interpretation of pictures

Abstract

The invention discloses an aircraft pose determination method and device. The method comprises the steps of acquiring a reference map and the real-time image of an airborne camera; correcting the resolution of the real-time image so that the corrected real-time image is consistent with the resolution of the reference texture image; matching the corrected real-time image against the reference texture image and determining the matching region of the corrected real-time image within the reference texture image; selecting at least three non-collinear feature points in the matching region of the reference texture image and determining at least three corresponding control points; and constructing space resection equations from the control points, thereby determining the imaging position and attitude. Because the method back-solves with the three-dimensional information provided by the terrain map associated with the reference map, the flight position and the attitude data of the aircraft are obtained at the same time; the fix is obtained directly relative to the reference terrain map, accumulated error is avoided, and the precision is high.

Description

Aircraft pose determination method and device
Technical field
The present invention relates to the field of navigation and automatic control technology, and in particular to an aircraft pose determination method and device.
Background art
In the navigation and flight control of aircraft such as spacecraft and unmanned aerial vehicles, the position and attitude of the vehicle are key information for flying safely and carrying out the mission. Common methods for determining the flight position and attitude of an aircraft include inertial navigation, GPS navigation and terrain-matching navigation.
Inertial navigation uses sensors such as accelerometers and gyroscopes to measure motion: it integrates the real-time acceleration measured by the accelerometer to obtain the position and attitude during flight. Because the pose is obtained by integration, however, errors accumulate, and over long periods the accumulated error becomes large, so other, more accurate positioning methods are usually needed to correct the inertial solution.
In GPS positioning, a GPS receiver installed on the aircraft receives the signals of the GPS navigation satellites in orbit and computes the position of the aircraft from the signals of several satellites; it is one of the usual navigation methods for vehicles such as aircraft and automobiles. However, since the GPS signal is provided by foreign navigation satellites, high-accuracy data are usually unavailable, the computed solution is of limited precision, and whether the signal can be obtained at all is subject to external control. Moreover, because reception is affected by line-of-sight blockage of the orbiting satellites, both the accuracy and the update rate of GPS-derived positions can degrade.
Image-matching positioning comes in two kinds: terrain matching and scene matching. Terrain matching uses a laser altimeter carried by the aircraft to measure the terrain elevation along the flight path in real time and matches the measurements against a reference terrain map loaded in advance, thereby determining the position of the aircraft for navigation. Scene-matching navigation photographs the landforms along the flight path or around the target with an onboard imaging device and compares the images with a reference image stored on the aircraft to obtain a matching position fix. A laser altimeter is special-purpose equipment: it consumes considerable power, such precision instruments are expensive and hard to install on ordinary aircraft, and this one-dimensional matching is suited to flight over mountainous terrain. Scene matching is two-dimensional matching; it can determine the deviation of the aircraft in two coordinates and is suited only to navigation over flat terrain.
Position and attitude parameters are key information for an aircraft to fly safely and carry out its mission, and determining them is an important part of aircraft navigation and control. Among the usual methods, radio measurement and inertial navigation consume much energy and accumulate serious error; GPS positioning based on navigation satellite signals is subject to external control; and terrain matching requires an accurate, expensive altimeter that is difficult to deploy on ordinary aircraft.
Summary of the invention
To overcome the serious error accumulation of prior-art pose determination methods, one aspect of the present invention proposes an aircraft pose determination method.
The aircraft pose determination method provided by an embodiment of the present invention comprises:
obtaining a reference map and the real-time image of an airborne camera, the reference map comprising a reference texture image and a reference terrain map;
correcting the resolution of the real-time image, the corrected real-time image being consistent with the resolution of the reference texture image;
matching the corrected real-time image against the reference texture image and determining the matching region of the corrected real-time image within the reference texture image;
selecting at least three non-collinear feature points in the matching region of the reference texture image and querying the corresponding height values in the reference terrain map according to the feature points, thereby determining at least three corresponding control points;
constructing space resection equations from the control points and solving them to determine the imaging position and attitude.
In the above scheme, correcting the resolution of the real-time image comprises:
determining the resolution of the real-time image from the equivalent focal length of the imaging system and the flying height, and determining the zoom factor for correcting the real-time image from the resolution of the reference texture image;
scaling the real-time image by the zoom factor and resampling it, obtaining a corrected real-time image consistent with the resolution of the reference texture image.
In the above scheme, matching the corrected real-time image against the reference texture image and determining the matching region of the corrected real-time image within the reference texture image comprises:
selecting a feature point to be matched in the corrected real-time image, the interest value of the feature point to be matched being greater than a first preset threshold; wherein the interest value I_v is
$I_v = \det[M] - 0.04\,(\operatorname{tr}[M])^2$, where $M = \sum_{x,y} w(x,y)\begin{bmatrix} I_x^2 & I_x I_y \\ I_x I_y & I_y^2 \end{bmatrix}$ is the covariance matrix of the pixel region of preset width and height centered on the feature point to be matched, det[M] is the determinant of M, tr[M] is the trace of M, w(x,y) is a Gaussian weighting function, I_x is the horizontal difference value and I_y is the vertical difference value;
computing the correlation coefficients between the first pixel region of the corrected real-time image and second pixel regions of the reference texture image respectively, and selecting the second pixel region with the largest correlation coefficient as the matching region of the reference texture image; the first pixel region is the pixel region of preset width and height centered on the feature point to be matched, and a second pixel region is a pixel region of preset width and height centered on some pixel of the reference texture image.
In the above scheme, selecting the at least three non-collinear feature points comprises:
determining the interest value of every pixel in the matching region and choosing the pixel with the largest interest value as the first feature point; wherein the interest value I_v is
$I_v = \det[M] - 0.04\,(\operatorname{tr}[M])^2$, where $M = \sum_{x,y} w(x,y)\begin{bmatrix} I_x^2 & I_x I_y \\ I_x I_y & I_y^2 \end{bmatrix}$ is the covariance matrix of the pixel region of preset width and height centered on the feature point to be matched, det[M] is the determinant of M, tr[M] is the trace of M, w(x,y) is a Gaussian weighting function, I_x is the horizontal difference value and I_y is the vertical difference value;
determining a second feature point, the second feature point being a pixel whose distance from the first feature point is a first preset distance and whose interest value is greater than a second preset threshold;
determining a third feature point, the third feature point being a pixel whose distances from the first feature point and the second feature point are both greater than a second preset distance, whose distance from the feature line is greater than a third preset distance, and whose interest value is greater than the second preset threshold; the feature line is the straight line determined by the first feature point and the second feature point.
In the above scheme, constructing space resection equations from the control points and solving them to determine the imaging position and attitude comprises:
constructing range equations from the three control points and determining the three imaging distances between the imaging position and the three feature points;
constructing the space resection equations from the three control points and the three imaging distances, and solving for the imaging position and for the rotation matrix R expressing the attitude; wherein the rotation matrix R is
$$R = (E - S)^{-1}(E + S),\qquad S = \begin{pmatrix} 0 & -c & b \\ c & 0 & -a \\ -b & a & 0 \end{pmatrix}$$
where E is the identity matrix and a, b, c are the three parameters determined from the space resection equations.
Based on the same inventive concept, an embodiment of the present invention also provides an aircraft pose determination device, comprising:
an acquisition module for obtaining a reference map and the real-time image of an airborne camera, the reference map comprising a reference texture image and a reference terrain map;
a correction module for correcting the resolution of the real-time image, the corrected real-time image being consistent with the resolution of the reference texture image;
a matching module for matching the corrected real-time image against the reference texture image and determining the matching region of the corrected real-time image within the reference texture image;
a determination module for selecting at least three non-collinear feature points in the matching region of the reference texture image and querying the corresponding height values in the reference terrain map according to the feature points, thereby determining at least three corresponding control points;
a processing module for constructing space resection equations from the control points and solving them to determine the imaging position and attitude.
In the above scheme, the correction module comprises:
a computing unit for determining the resolution of the real-time image from the equivalent focal length of the imaging system and the flying height, and for determining the zoom factor for correcting the real-time image from the resolution of the reference texture image;
a scaling unit for scaling the real-time image by the zoom factor and resampling it, obtaining a corrected real-time image consistent with the resolution of the reference texture image.
In the above scheme, the matching module comprises:
a selecting unit for selecting a feature point to be matched in the corrected real-time image, the interest value of the feature point to be matched being greater than a first preset threshold; wherein the interest value I_v is
$I_v = \det[M] - 0.04\,(\operatorname{tr}[M])^2$, where $M = \sum_{x,y} w(x,y)\begin{bmatrix} I_x^2 & I_x I_y \\ I_x I_y & I_y^2 \end{bmatrix}$ is the covariance matrix of the pixel region of preset width and height centered on the feature point to be matched, det[M] is the determinant of M, tr[M] is the trace of M, w(x,y) is a Gaussian weighting function, I_x is the horizontal difference value and I_y is the vertical difference value;
a matching unit for computing the correlation coefficients between the first pixel region of the corrected real-time image and second pixel regions of the reference texture image respectively, and selecting the second pixel region with the largest correlation coefficient as the matching region of the reference texture image; the first pixel region is the pixel region of preset width and height centered on the feature point to be matched, and a second pixel region is a pixel region of preset width and height centered on some pixel of the reference texture image.
In the above scheme, the determination module comprises:
a first determining unit for determining the interest value of every pixel in the matching region and choosing the pixel with the largest interest value as the first feature point; wherein the interest value I_v is
$I_v = \det[M] - 0.04\,(\operatorname{tr}[M])^2$, where $M = \sum_{x,y} w(x,y)\begin{bmatrix} I_x^2 & I_x I_y \\ I_x I_y & I_y^2 \end{bmatrix}$ is the covariance matrix of the pixel region of preset width and height centered on the feature point to be matched, det[M] is the determinant of M, tr[M] is the trace of M, w(x,y) is a Gaussian weighting function, I_x is the horizontal difference value and I_y is the vertical difference value;
a second determining unit for determining a second feature point, the second feature point being a pixel whose distance from the first feature point is a first preset distance and whose interest value is greater than a second preset threshold;
a third determining unit for determining a third feature point, the third feature point being a pixel whose distances from the first feature point and the second feature point are both greater than a second preset distance, whose distance from the feature line is greater than a third preset distance, and whose interest value is greater than the second preset threshold; the feature line is the straight line determined by the first feature point and the second feature point.
In the above scheme, the processing module comprises:
a first construction unit for constructing range equations from the three control points and determining the three imaging distances between the imaging position and the three feature points;
a second construction unit for constructing the space resection equations from the three control points and the three imaging distances, and solving for the imaging position and for the rotation matrix R expressing the attitude; wherein the rotation matrix R is
$$R = (E - S)^{-1}(E + S),\qquad S = \begin{pmatrix} 0 & -c & b \\ c & 0 & -a \\ -b & a & 0 \end{pmatrix}$$
where E is the identity matrix and a, b, c are the three parameters determined from the space resection equations.
The aircraft pose determination method and device provided by the embodiments of the present invention match the real-time visual image against the reference terrain map of the flight area, obtained in advance, and its reference texture image of the landforms, and back-solve from the match to obtain the position and attitude parameters of the aircraft. In the matching step, the imaging-height information is used to correct the resolution of the real-time image, and the corrected real-time image is matched against the reference texture image on that basis, which increases the reliability of the matching; the back-solution also draws on the three-dimensional information provided by the terrain map associated with the reference map, so the flight position and the attitude data of the aircraft are obtained at the same time and the method yields more navigation information than other methods. With this position and attitude determination, a direct fix relative to the reference terrain map is obtained; there is no accumulated error and the precision is high. At the same time, this kind of pose determination takes optical imagery as the sensor input; it is a passive sensing mode with low energy consumption and little disturbance from external signals. Compared with scene-matching methods, it omits generating an orthophoto from the real-time image, simplifying part of the processing, and thus obtains more navigation data with less computation.
Other features and advantages of the present invention will be set forth in the description that follows, will in part be apparent from the description, or may be learned by practising the invention. The objects and other advantages of the invention are realized and attained by the structure particularly pointed out in the written description, the claims and the accompanying drawings.
The technical scheme of the present invention is described in further detail below through the drawings and embodiments.
Description of the drawings
The drawings are provided for a further understanding of the present invention and form a part of the description; together with the embodiments of the present invention they serve to explain the invention and are not to be construed as limiting it. In the drawings:
Fig. 1 is a flowchart of the aircraft pose determination method in an embodiment of the present invention;
Fig. 2 is a schematic diagram of the real-time image and the reference map in an embodiment of the present invention;
Fig. 3 is a flowchart of the aircraft pose determination method in embodiment one;
Fig. 4 is a schematic diagram of the selection and distribution of the control points in embodiment one;
Fig. 5 is a schematic diagram of the imaging of the three control points in embodiment one;
Fig. 6 is a schematic diagram of the known parameters in the position and attitude solution in embodiment one;
Fig. 7 is a structural diagram of the aircraft pose determination device in an embodiment of the present invention;
Fig. 8 is a structural diagram of the correction module in an embodiment of the present invention;
Fig. 9 is a structural diagram of the matching module in an embodiment of the present invention;
Fig. 10 is a structural diagram of the determination module in an embodiment of the present invention;
Fig. 11 is a structural diagram of the processing module in an embodiment of the present invention.
Detailed description
The specific embodiments of the present invention are described in detail below in conjunction with the drawings, but it should be understood that the scope of protection of the present invention is not limited by the specific embodiments.
According to an embodiment of the present invention, an aircraft pose determination method is provided; referring to Fig. 1, the method comprises:
Step 101: obtain a reference map and the real-time image of the airborne camera, the reference map comprising a reference texture image and a reference terrain map.
In this embodiment, the real-time image is the image of the flight area formed by the visual imaging device (i.e., the airborne camera) carried during flight, and the reference map is the terrain map of the flight area obtainable in advance, comprising a two-dimensional reference texture image and a three-dimensional reference terrain map. Referring to Fig. 2, from left to right are the real-time image, the reference texture image and the three-dimensional reference terrain map.
Step 102: correct the resolution of the real-time image so that the corrected real-time image is consistent with the resolution of the reference texture image.
Any of several resolution-correction methods may be used, so long as the corrected real-time image is consistent with the resolution of the reference texture image. This embodiment proceeds as follows, in steps A1 and A2:
Step A1: determine the resolution of the real-time image from the equivalent focal length of the imaging system and the imaging height, and determine the zoom factor for correcting the real-time image from the resolution of the reference texture image.
Step A2: scale the real-time image by the zoom factor and resample it, obtaining a corrected real-time image consistent with the resolution of the reference texture image.
Step 103: match the corrected real-time image against the reference texture image and determine the matching region of the corrected real-time image within the reference texture image.
Specifically, a feature point to be matched is selected in the corrected real-time image, the interest value of the feature point to be matched being greater than the first preset threshold; wherein the interest value I_v is
$I_v = \det[M] - 0.04\,(\operatorname{tr}[M])^2$, where $M = \sum_{x,y} w(x,y)\begin{bmatrix} I_x^2 & I_x I_y \\ I_x I_y & I_y^2 \end{bmatrix}$ is the covariance matrix of the pixel region of preset width and height centered on the feature point to be matched, det[M] is the determinant of M, tr[M] is the trace of M, w(x,y) is a Gaussian weighting function, I_x is the horizontal difference value and I_y is the vertical difference value;
the correlation coefficients between the first pixel region of the corrected real-time image and second pixel regions of the reference texture image are then computed respectively, and the second pixel region with the largest correlation coefficient is selected as the matching region of the reference texture image; the first pixel region is the pixel region of preset width and height centered on the feature point to be matched, and a second pixel region is a pixel region of preset width and height centered on some pixel of the reference texture image.
Step 104: select at least three non-collinear feature points in the matching region of the reference texture image and query the corresponding height values in the reference terrain map according to the feature points, thereby determining at least three corresponding control points.
Specifically, the interest value of each such feature point must also be greater than the preset threshold; the interest value is computed as described above and is not repeated here. The feature points are selected as follows, in steps B1 to B3:
Step B1: determine the interest value of every pixel in the matching region and choose the pixel with the largest interest value as the first feature point.
Step B2: determine the second feature point, a pixel whose distance from the first feature point is the first preset distance and whose interest value is greater than the second preset threshold.
Step B3: determine the third feature point, a pixel whose distances from the first and second feature points are both greater than the second preset distance, whose distance from the feature line is greater than the third preset distance, and whose interest value is greater than the second preset threshold; the feature line is the straight line determined by the first and second feature points.
The above merely illustrates how three feature points are chosen; when more than three feature points are wanted, the additional points can be chosen following the method for the second or the third feature point.
Because the reference texture image is two-dimensional, a feature point has only two variables; once the corresponding height value is looked up in the three-dimensional reference terrain map, a control point with three variables is defined.
Step 105: construct the space resection equations from the three control points and solve them to determine the imaging position and attitude.
Specifically, range equations are constructed from the three control points to determine the three imaging distances between the imaging position and the three feature points; the space resection equations are then constructed from the three control points and the three imaging distances and solved for the imaging position and for the rotation matrix R expressing the attitude; wherein the rotation matrix R is
$$R = (E - S)^{-1}(E + S),\qquad S = \begin{pmatrix} 0 & -c & b \\ c & 0 & -a \\ -b & a & 0 \end{pmatrix}$$
where E is the identity matrix and a, b, c are the three parameters determined from the space resection equations.
The aircraft pose determination method provided by this embodiment of the present invention matches the real-time visual image against the reference terrain map of the flight area, obtained in advance, and its reference texture image of the landforms, and back-solves from the match to obtain the position and attitude parameters of the aircraft. In the matching step, the imaging-height information is used to correct the resolution of the real-time image, and the corrected real-time image is matched against the reference texture image on that basis, which increases the reliability of the matching; the back-solution also draws on the three-dimensional information provided by the terrain map associated with the reference map, so the flight position and the attitude data of the aircraft are obtained at the same time and the method yields more navigation information than other methods. With this position and attitude determination, a direct fix relative to the reference terrain map is obtained; there is no accumulated error and the precision is high. At the same time, this kind of pose determination takes optical imagery as the sensor input; it is a passive sensing mode with low energy consumption and little disturbance from external signals. Compared with scene-matching methods, it omits generating an orthophoto from the real-time image, simplifying part of the processing, and thus obtains more navigation data with less computation.
The flow of this aircraft pose determination method is described in detail below through a concrete embodiment.
Embodiment one
In embodiment one, the real-time visual image is matched against the reference terrain map of the flight area, obtained in advance, and its reference texture image of the landforms, and the match is back-solved to obtain the position and attitude parameters of the aircraft. Specifically, referring to Fig. 3, the method comprises the following steps:
Step 301: determine the resolution of the real-time image from the equivalent focal length of the imaging system and the flying height, and determine the zoom factor for correcting the real-time image from the resolution of the reference texture image.
In embodiment one,
let the resolution of the reference texture image be d_R (metres/pixel), and let the real-time image taken when the aircraft flies at height L metres be I_L. Then, by the optical imaging relation, the resolution d_L of the real-time image is given by formula (1), where f is the equivalent focal length of the imaging system:
$$d_L = \frac{L}{f} \qquad (1)$$
To match the real-time image against the reference texture image automatically and reliably, the real-time image must be corrected to a resolution consistent with the reference texture image; the zoom factor Z_LR applied to the real-time image is given by formula (2):
$$Z_{LR} = \frac{d_L}{d_R} \qquad (2)$$
Step 302: scale the real-time image by the zoom factor and resample it, obtaining a corrected real-time image consistent with the resolution of the reference texture image.
In embodiment one, the real-time image I_L is resampled with bilinear interpolation at the zoom factor Z_LR, giving the corrected real-time image I'_L whose resolution scale is consistent with the reference texture image.
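As a concrete illustration of steps 301-302, the sketch below computes d_L and Z_LR and resamples the image with bilinear interpolation. It is a minimal sketch, not the patent's implementation: it assumes the equivalent focal length f is expressed in pixels (so that L/f comes out in metres per pixel) and uses OpenCV for the resampling; all names are illustrative.

```python
import cv2

def correct_resolution(live_img, height_m, focal_px, d_ref):
    """Correct the real-time image I_L to the reference texture image's resolution.

    height_m -- flying height L in metres
    focal_px -- equivalent focal length f of the imaging system, in pixels
    d_ref    -- resolution d_R of the reference texture image, metres/pixel
    """
    d_live = height_m / focal_px   # formula (1): d_L = L / f
    z_lr = d_live / d_ref          # formula (2): zoom factor Z_LR = d_L / d_R
    # Resample by Z_LR with bilinear interpolation, as in step 302
    return cv2.resize(live_img, None, fx=z_lr, fy=z_lr,
                      interpolation=cv2.INTER_LINEAR)
```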
Step 303: select a feature point to be matched in the corrected real-time image, the interest value of the feature point to be matched being greater than the first preset threshold.
Wherein the interest value I_v is
$I_v = \det[M] - 0.04\,(\operatorname{tr}[M])^2$, where $M = \sum_{x,y} w(x,y)\begin{bmatrix} I_x^2 & I_x I_y \\ I_x I_y & I_y^2 \end{bmatrix}$ is the covariance matrix of the pixel region of preset width and height centered on the feature point to be matched, det[M] is the determinant of M, tr[M] is the trace of M, w(x,y) is a Gaussian weighting function, I_x is the horizontal difference value and I_y is the vertical difference value;
Specifically, in embodiment one, starting from the center point p_0 of the corrected real-time image, the interest value I_v of that point is computed; if it is greater than the first preset threshold Iv_th (taken as 0.6 in embodiment one), the center point p_0 is selected as the feature point to be matched, p_r.
If the interest value of the center point p_0 is not greater than the first preset threshold Iv_th, a distance value d = 1 is set and the interest values of all pixels at distance d from p_0 are computed in turn; as soon as some pixel p_i has an interest value greater than the first preset threshold, the search stops and p_i is selected as the feature point to be matched, p_r. If no pixel qualifies, the distance d is increased by the step Δd (which may be 1), i.e. d = d + Δd, and the computation over all pixels at distance d from the center point is repeated, looping until the feature point to be matched p_r is determined.
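A minimal sketch of this interest-value test and the outward search is given below. It assumes a fixed 7x7 window, 3x3 Sobel differences, a unit-sigma Gaussian weight and interior pixels (border handling omitted); the patent fixes none of these details, and the threshold of 0.6 presupposes pixel intensities normalised to [0, 1].

```python
import numpy as np
import cv2

def interest_value(img, x, y, win=7, sigma=1.0):
    """Interest value I_v = det(M) - 0.04*tr(M)^2 at pixel (x, y), with M the
    Gaussian-weighted covariance of I_x, I_y over a win x win region."""
    half = win // 2
    patch = img[y - half:y + half + 1, x - half:x + half + 1].astype(np.float64)
    ix = cv2.Sobel(patch, cv2.CV_64F, 1, 0, ksize=3)   # horizontal difference I_x
    iy = cv2.Sobel(patch, cv2.CV_64F, 0, 1, ksize=3)   # vertical difference I_y
    g = cv2.getGaussianKernel(win, sigma)
    w = g @ g.T                                        # 2-D Gaussian weights w(x, y)
    a = np.sum(w * ix * ix)                            # sum of w * I_x^2
    b = np.sum(w * ix * iy)                            # sum of w * I_x * I_y
    c = np.sum(w * iy * iy)                            # sum of w * I_y^2
    det_m = a * c - b * b
    tr_m = a + c
    return det_m - 0.04 * tr_m ** 2

def find_candidate(img, iv_th=0.6):
    """Spiral outward from the image center until a pixel's I_v exceeds Iv_th."""
    h, w = img.shape
    cx, cy = w // 2, h // 2
    d = 0
    while d < min(cx, cy) - 4:
        # All pixels at Chebyshev distance d from the center point p_0
        ring = [(cx + dx, cy + dy)
                for dx in range(-d, d + 1) for dy in range(-d, d + 1)
                if max(abs(dx), abs(dy)) == d]
        for x, y in ring:
            if interest_value(img, x, y) > iv_th:
                return x, y                            # feature point p_r
        d += 1                                         # step Δd = 1
    return None
```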
Step 304: compute the correlation coefficients between the first pixel region of the corrected real-time image and second pixel regions of the reference texture image respectively, and select the second pixel region with the largest correlation coefficient as the matching region of the reference texture image.
Here the first pixel region is the pixel region of preset width and height centered on the feature point to be matched, and a second pixel region is a pixel region of preset width and height centered on some pixel of the reference texture image.
In embodiment one, once the feature point to be matched p_r is determined, a rectangular local region of width and height W (taken as 15 in embodiment one) centered on p_r is cut from the real-time image as the template image to be matched, I_M.
With the position (x_0, y_0) provided by the onboard inertial navigation system as the initial center on the reference texture image, within a search range of radius R, the correlation coefficient between the W x W local region centered on each pixel and the template image I_M from the real-time image is computed; the pixel with the largest correlation coefficient is selected as the matching position, and the W x W region I_B determined by this homologous match point on the reference texture image is taken as the matching region of the reference texture image. Computing the correlation coefficient between two images is common knowledge in this field and is not repeated here.
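This search is ordinary normalised cross-correlation template matching, which the sketch below expresses with OpenCV's matchTemplate. The clipping of the search window and the coordinate bookkeeping are illustrative assumptions, not taken from the patent.

```python
import cv2
import numpy as np

def match_in_reference(ref_img, template, x0, y0, radius):
    """Find the W x W region of the reference texture image best matching the
    template I_M, searching within `radius` of the inertial prediction
    (x0, y0). Returns the center of the matching region I_B."""
    ref = ref_img.astype(np.float32)
    tpl = template.astype(np.float32)
    half = tpl.shape[0] // 2                      # template is W x W
    # Clip the search window to the reference image bounds
    x_lo = max(x0 - radius - half, 0)
    y_lo = max(y0 - radius - half, 0)
    x_hi = min(x0 + radius + half + 1, ref.shape[1])
    y_hi = min(y0 + radius + half + 1, ref.shape[0])
    search = ref[y_lo:y_hi, x_lo:x_hi]
    # Correlation coefficient of the template with every W x W region
    scores = cv2.matchTemplate(search, tpl, cv2.TM_CCOEFF_NORMED)
    _, _, _, best = cv2.minMaxLoc(scores)         # (x, y) of the maximum score
    # Convert the top-left corner of the best window back to a region center
    return x_lo + best[0] + half, y_lo + best[1] + half
```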
Step 305: determine the interest value of every pixel in the matching region and choose the pixel with the largest interest value as the first feature point.
Step 306: determine the second feature point, a pixel whose distance from the first feature point is the first preset distance and whose interest value is greater than the second preset threshold.
Step 307: determine the third feature point, a pixel whose distances from the first and second feature points are both greater than the second preset distance, whose distance from the feature line is greater than the third preset distance, and whose interest value is greater than the second preset threshold; the feature line is the straight line determined by the first and second feature points.
In embodiment one, once the matching region I_B is determined, three feature points are selected on the principles that the feature points must not be collinear and must have sufficiently large interest values, i.e. steps 305-307 above. The selection can proceed as follows (a code sketch follows step C3):
Step C1: compute the interest value of every pixel in the matching region I_B on the reference texture image and choose the pixel with the largest interest value as the first feature point, p_B1.
Step C2: set the first preset distance d = D_th; examine in turn the points at distance d from the feature point p_B1; if the interest value of some point is greater than the second preset threshold Iv_th (which may be 0.6), select it as the second feature point p_B2; otherwise increase the distance d by a preset step (which may be 1) and repeat step C2 until a second feature point p_B2 satisfying the conditions is chosen.
Step C3: determine the straight line l_12 through the feature points p_B1 and p_B2 (the feature line); for each pixel p_Bi whose distance from the feature line l_12 is greater than the third preset distance D_toL, examine one by one its distances d_B1, d_B2 from p_B1 and p_B2 and its interest value Iv_Bi; if the distances satisfy min(d_B1, d_B2) > d_th (d_th being the second preset distance) and the interest value satisfies Iv_Bi > Iv_th, select this p_Bi as the third feature point p_B3.
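The following sketch renders steps C1-C3 over a precomputed interest-value map of the matching region. The default values of d_th, d_tol and iv_th, and the choice of the first qualifying pixel rather than the best one, are illustrative assumptions.

```python
import numpy as np

def pick_three_points(iv_map, d_th=10.0, d_tol=10.0, iv_th=0.6):
    """Steps C1-C3: pick three strong, well-separated, non-collinear feature
    points from an interest-value map of the matching region I_B."""
    h, w = iv_map.shape
    ys, xs = np.mgrid[0:h, 0:w]

    # C1: the pixel with the largest interest value is p_B1
    p1 = np.unravel_index(np.argmax(iv_map), iv_map.shape)[::-1]   # (x, y)

    # C2: grow the distance from p_B1 until a pixel clears the threshold
    dist1 = np.hypot(xs - p1[0], ys - p1[1])
    p2, d = None, d_th
    while p2 is None and d <= dist1.max():
        on_ring = (np.abs(dist1 - d) < 0.5) & (iv_map > iv_th)
        if on_ring.any():
            yy, xx = np.argwhere(on_ring)[0]
            p2 = (xx, yy)
        d += 1.0                                   # step of 1, as in step C2

    # C3: far from the line l_12 and from both points, and above threshold
    (x1, y1), (x2, y2) = p1, p2
    line_dist = (np.abs((y2 - y1) * (xs - x1) - (x2 - x1) * (ys - y1))
                 / np.hypot(x2 - x1, y2 - y1))
    dist2 = np.hypot(xs - p2[0], ys - p2[1])
    ok = (line_dist > d_tol) & (np.minimum(dist1, dist2) > d_th) & (iv_map > iv_th)
    yy, xx = np.argwhere(ok)[0]                    # first qualifying pixel p_B3
    return p1, p2, (xx, yy)
```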
Step 308: query the corresponding height values in the reference terrain map for the feature points, determining at least three corresponding control points.
Specifically, using image correlation coefficients with least-squares iteration, the precise homologous registration points p_r1(x_1, y_1), p_r2(x_2, y_2), p_r3(x_3, y_3) of the three feature points p_B1, p_B2, p_B3 are determined on the real-time image; the height values corresponding to the three feature points are then queried on the three-dimensional reference terrain map according to the image coordinates of p_B1, p_B2, p_B3, giving the control points P_B1(X_1, Y_1, Z_1), P_B2(X_2, Y_2, Z_2), P_B3(X_3, Y_3, Z_3). Fig. 4 is a schematic diagram of the selection and distribution of the control points used for solving the position and attitude.
Step 309: construct range equations from the three control points and determine the three imaging distances between the imaging position and the three feature points.
Specifically, referring to Fig. 5, the points 1, 2, 3 in Fig. 5 represent the three control points P_B1(X_1, Y_1, Z_1), P_B2(X_2, Y_2, Z_2), P_B3(X_3, Y_3, Z_3), and P_S is the position of the aircraft, the imaging position to be determined in embodiment one. S_1, S_2, S_3 denote the distances from the imaging optical center P_S to the three control points; they are the unknowns to be solved. S_12 is the distance between control points 1 and 2, S_23 the distance between control points 2 and 3, and S_31 the distance between control points 1 and 3. The angle between the projection rays P_S-1 and P_S-2, the angle between P_S-2 and P_S-3, and the angle between P_S-1 and P_S-3 can all be computed from the image points and the camera parameters.
The range equations constructed from the three control points determine the imaging position P_S and the three imaging distances S_1, S_2, S_3 between the imaging position and the three feature points.
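The available text does not reproduce the range equations themselves; with the distances and angles just defined they have the standard law-of-cosines (P3P) form, and the reconstruction below rests on that assumption, writing θ12, θ23, θ31 for the three ray angles:

```latex
S_1^2 + S_2^2 - 2\,S_1 S_2 \cos\theta_{12} = S_{12}^2,\qquad
S_2^2 + S_3^2 - 2\,S_2 S_3 \cos\theta_{23} = S_{23}^2,\qquad
S_3^2 + S_1^2 - 2\,S_3 S_1 \cos\theta_{31} = S_{31}^2 .
```

Solving this system yields the three imaging distances S_1, S_2, S_3.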
Step 310: construct the space resection equations from the three control points and the three imaging distances, and solve for the imaging position and the rotation matrix R expressing the attitude.
Wherein the rotation matrix R is:
$$R = (E - S)^{-1}(E + S),\qquad S = \begin{pmatrix} 0 & -c & b \\ c & 0 & -a \\ -b & a & 0 \end{pmatrix}$$
where E is the identity matrix and a, b, c are the three parameters determined from the space resection equations.
Specifically, referring to Fig. 6, Fig. 6 illustrates the sources of the known parameters in the position and attitude solution: P_S, as in Fig. 5, denotes the imaging optical center; S_i is the i-th control point; p_i denotes the image of the i-th control point; α_i is the imaging azimuth angle of the projection ray of the i-th control point; and β_i is the elevation angle of the projection ray of the i-th control point.
In embodiment one,
the imaging position P_S(X_S, Y_S, Z_S) and the rotation matrix R of the attitude relative to the reference frame are solved, with R as given above. Substituting the parameters of the three control points into the following expression in turn yields a system of equations, from which the position P_S and the rotation matrix R expressing the attitude are solved:
$$S_i\begin{pmatrix} \cos\alpha_i\cos\beta_i \\ \sin\alpha_i\cos\beta_i \\ \sin\beta_i \end{pmatrix} - \begin{pmatrix} X_i \\ Y_i \\ Z_i \end{pmatrix} = \begin{pmatrix} 0 & S_i\sin\beta_i + Z_i & -S_i\sin\alpha_i\cos\beta_i - Y_i \\ -S_i\sin\beta_i - Z_i & 0 & S_i\cos\alpha_i\cos\beta_i + X_i \\ S_i\sin\alpha_i\cos\beta_i + Y_i & -S_i\cos\alpha_i\cos\beta_i - X_i & 0 \end{pmatrix}\begin{pmatrix} a \\ b \\ c \end{pmatrix} - \begin{pmatrix} 1 & -c & b \\ c & 1 & -a \\ -b & a & 1 \end{pmatrix}\begin{pmatrix} X_S \\ Y_S \\ Z_S \end{pmatrix}$$
Substituting the parameters of the three control points into the above formula in turn gives three matrix expressions, from which a, b, c and the imaging position P_S(X_S, Y_S, Z_S) can be determined; once a, b, c are determined, the rotation matrix R is determined.
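Once a, b, c are solved, building R is mechanical. The sketch below assumes the Cayley-transform reading of the formula above, R = (E - S)^-1 (E + S) with S skew-symmetric in a, b, c:

```python
import numpy as np

def rotation_from_abc(a, b, c):
    """Attitude matrix R = (E - S)^-1 (E + S), the Cayley transform of the
    skew-symmetric matrix S built from the resection parameters a, b, c."""
    S = np.array([[0.0, -c,   b],
                  [c,   0.0, -a],
                  [-b,  a,  0.0]])
    E = np.eye(3)
    # Solve (E - S) R = (E + S) instead of forming an explicit inverse
    return np.linalg.solve(E - S, E + S)
```

For any real a, b, c this R is orthogonal with determinant +1, which is what makes the parameterisation convenient for resection.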
It should be noted that the computation adopted in embodiment one is only a preferred scheme; other similar approaches can also be used. For example, other mature resolution-alignment techniques can be adopted, the interest value I_v can be computed with other similar formulas, or the three non-collinear points with the largest interest values can be chosen as the three feature points, and so on; these are not repeated here.
The aircraft pose determination method provided by this embodiment of the present invention matches the real-time visual image against the reference terrain map of the flight area, obtained in advance, and its reference texture image of the landforms, and back-solves from the match to obtain the position and attitude parameters of the aircraft. In the matching step, the imaging-height information is used to correct the resolution of the real-time image, and the corrected real-time image is matched against the reference texture image on that basis, which increases the reliability of the matching; the back-solution also draws on the three-dimensional information provided by the terrain map associated with the reference map, so the flight position and the attitude data of the aircraft are obtained at the same time and the method yields more navigation information than other methods. With this position and attitude determination, a direct fix relative to the reference terrain map is obtained; there is no accumulated error and the precision is high. At the same time, this kind of pose determination takes optical imagery as the sensor input; it is a passive sensing mode with low energy consumption and little disturbance from external signals. Compared with scene-matching methods, it omits generating an orthophoto from the real-time image, simplifying part of the processing, and thus obtains more navigation data with less computation.
The flow of the aircraft pose determination method has been described in detail above; the method can also be implemented by a corresponding device, whose structure and function are described in detail below.
An aircraft pose determination device provided by an embodiment of the present invention, referring to Fig. 7, comprises:
an acquisition module 71 for obtaining a reference map and the real-time image of the airborne camera, the reference map comprising a reference texture image and a reference terrain map;
a correction module 72 for correcting the resolution of the real-time image, the corrected real-time image being consistent with the resolution of the reference texture image;
a matching module 73 for matching the corrected real-time image against the reference texture image and determining the matching region of the corrected real-time image within the reference texture image;
a determination module 74 for selecting at least three non-collinear feature points in the matching region of the reference texture image and querying the corresponding height values in the reference terrain map according to the feature points, thereby determining at least three corresponding control points;
a processing module 75 for constructing space resection equations from the control points and solving them to determine the imaging position and attitude.
Preferably, referring to Fig. 8, the correction module 72 comprises:
a computing unit 721 for determining the resolution of the real-time image from the equivalent focal length of the imaging system and the imaging height, and for determining the zoom factor for correcting the real-time image from the resolution of the reference texture image;
a scaling unit 722 for scaling the real-time image by the zoom factor and resampling it, obtaining a corrected real-time image consistent with the resolution of the reference texture image.
Preferably, referring to Fig. 9, the matching module 73 comprises:
a selecting unit 731 for selecting a feature point to be matched in the corrected real-time image, the interest value of the feature point to be matched being greater than the first preset threshold; wherein the interest value I_v is
$I_v = \det[M] - 0.04\,(\operatorname{tr}[M])^2$, where $M = \sum_{x,y} w(x,y)\begin{bmatrix} I_x^2 & I_x I_y \\ I_x I_y & I_y^2 \end{bmatrix}$ is the covariance matrix of the pixel region of preset width and height centered on the feature point to be matched, det[M] is the determinant of M, tr[M] is the trace of M, w(x,y) is a Gaussian weighting function, I_x is the horizontal difference value and I_y is the vertical difference value;
a matching unit 732 for computing the correlation coefficients between the first pixel region of the corrected real-time image and second pixel regions of the reference texture image respectively, and selecting the second pixel region with the largest correlation coefficient as the matching region of the reference texture image; the first pixel region is the pixel region of preset width and height centered on the feature point to be matched, and a second pixel region is a pixel region of preset width and height centered on some pixel of the reference texture image.
Preferably, referring to Fig. 10, the determination module 74 comprises:
a first determining unit 741 for determining the interest value of every pixel in the matching region and choosing the pixel with the largest interest value as the first feature point; wherein the interest value I_v is
$I_v = \det[M] - 0.04\,(\operatorname{tr}[M])^2$, where $M = \sum_{x,y} w(x,y)\begin{bmatrix} I_x^2 & I_x I_y \\ I_x I_y & I_y^2 \end{bmatrix}$ is the covariance matrix of the pixel region of preset width and height centered on the feature point to be matched, det[M] is the determinant of M, tr[M] is the trace of M, w(x,y) is a Gaussian weighting function, I_x is the horizontal difference value and I_y is the vertical difference value;
a second determining unit 742 for determining the second feature point, the second feature point being a pixel whose distance from the first feature point is the first preset distance and whose interest value is greater than the second preset threshold;
a third determining unit 743 for determining the third feature point, the third feature point being a pixel whose distances from the first and second feature points are both greater than the second preset distance, whose distance from the feature line is greater than the third preset distance, and whose interest value is greater than the second preset threshold; the feature line is the straight line determined by the first and second feature points.
Preferably, referring to Fig. 11, the processing module 75 comprises:
a first construction unit 751 for constructing range equations from the three control points and determining the three imaging distances between the imaging position and the three feature points;
a second construction unit 752 for constructing the space resection equations from the three control points and the three imaging distances, and solving for the imaging position and for the rotation matrix R expressing the attitude; wherein the rotation matrix R is
$$R = (E - S)^{-1}(E + S),\qquad S = \begin{pmatrix} 0 & -c & b \\ c & 0 & -a \\ -b & a & 0 \end{pmatrix}$$
where E is the identity matrix and a, b, c are the three parameters determined from the space resection equations.
The aircraft pose determination method and device provided by the embodiments of the present invention match the real-time visual image against the reference terrain map of the flight area, obtained in advance, and its reference texture image of the landforms, and back-solve from the match to obtain the position and attitude parameters of the aircraft. In the matching step, the imaging-height information is used to correct the resolution of the real-time image, and the corrected real-time image is matched against the reference texture image on that basis, which increases the reliability of the matching; the back-solution also draws on the three-dimensional information provided by the terrain map associated with the reference map, so the flight position and the attitude data of the aircraft are obtained at the same time and the method yields more navigation information than other methods. With this position and attitude determination, a direct fix relative to the reference terrain map is obtained; there is no accumulated error and the precision is high. At the same time, this kind of pose determination takes optical imagery as the sensor input; it is a passive sensing mode with low energy consumption and little disturbance from external signals. Compared with scene-matching methods, it omits generating an orthophoto from the real-time image, simplifying part of the processing, and thus obtains more navigation data with less computation.
The present invention can be embodied in many different forms. The technical scheme of the present invention has been explained above by way of example with reference to Figs. 1-11, but this does not mean that the concrete applications of the invention are confined to the particular flows or structures shown; those of ordinary skill in the art should understand that the specific embodiments given above are only some examples among many preferred usages, and any embodiment that falls under the claims of the present invention should be within the scope of the claimed technical scheme.
Finally, it should be noted that the above are only preferred embodiments of the present invention and do not limit it. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art may still modify the technical schemes described there or make equivalent replacements of some technical features; any modification, equivalent replacement or improvement made within the spirit and principles of the present invention should be included within its scope of protection.

Claims (10)

1. An aircraft pose determination method, characterized by comprising:
obtaining a reference map and the real-time image of an airborne camera, said reference map comprising a reference texture image and a reference terrain map;
correcting the resolution of said real-time image, the corrected real-time image being consistent with the resolution of said reference texture image;
matching the corrected real-time image against said reference texture image and determining the matching region of the corrected real-time image within said reference texture image;
selecting at least three non-collinear feature points in the matching region of said reference texture image and querying the corresponding height values in said reference terrain map according to the feature points, thereby determining at least three corresponding control points;
constructing space resection equations from said control points and solving them to determine the imaging position and attitude.
2. The method according to claim 1, characterized in that correcting the resolution of said real-time image comprises:
determining the resolution of said real-time image from the equivalent focal length and the flying height of the imaging system, and then determining the zoom factor for correcting said real-time image from the resolution of the reference texture image;
scaling said real-time image by said zoom factor and resampling it, obtaining a corrected real-time image consistent with the resolution of said reference texture image.
3. The method according to claim 1, characterized in that matching the corrected real-time image against said reference texture image and determining the matching region of the corrected real-time image within said reference texture image comprises:
selecting a feature point to be matched in the corrected real-time image, the interest value of said feature point to be matched being greater than a first preset threshold; wherein said interest value I_v is
$I_v = \det[M] - 0.04\,(\operatorname{tr}[M])^2$, where $M = \sum_{x,y} w(x,y)\begin{bmatrix} I_x^2 & I_x I_y \\ I_x I_y & I_y^2 \end{bmatrix}$ is the covariance matrix of the pixel region of preset width and height centered on said feature point to be matched, det[M] is the determinant of M, tr[M] is the trace of M, w(x,y) is a Gaussian weighting function, I_x is the horizontal difference value and I_y is the vertical difference value;
computing the correlation coefficients between the first pixel region of the corrected real-time image and second pixel regions of said reference texture image respectively, and selecting the second pixel region with the largest correlation coefficient as the matching region of said reference texture image; said first pixel region is the pixel region of preset width and height centered on said feature point to be matched, and said second pixel region is a pixel region of preset width and height centered on some pixel of the reference texture image.
4. The method according to any one of claims 1-3, characterized in that selecting the at least three non-collinear feature points comprises:
determining the interest value of every pixel in said matching region and choosing the pixel with the largest interest value as the first feature point; wherein said interest value I_v is
$I_v = \det[M] - 0.04\,(\operatorname{tr}[M])^2$, where $M = \sum_{x,y} w(x,y)\begin{bmatrix} I_x^2 & I_x I_y \\ I_x I_y & I_y^2 \end{bmatrix}$ is the covariance matrix of the pixel region of preset width and height centered on said feature point to be matched, det[M] is the determinant of M, tr[M] is the trace of M, w(x,y) is a Gaussian weighting function, I_x is the horizontal difference value and I_y is the vertical difference value;
determining a second feature point, said second feature point being a pixel whose distance from said first feature point is a first preset distance and whose interest value is greater than a second preset threshold;
determining a third feature point, said third feature point being a pixel whose distances from said first feature point and said second feature point are both greater than a second preset distance, whose distance from a feature line is greater than a third preset distance, and whose interest value is greater than the second preset threshold; said feature line is the straight line determined by the first feature point and the second feature point.
5. The method according to any one of claims 1-3, characterized in that constructing space resection equations from said control points and solving them to determine the imaging position and attitude comprises:
constructing range equations from the three control points and determining the three imaging distances between the imaging position and said three feature points;
constructing the space resection equations from said three control points and said three imaging distances, and solving for the imaging position and for the rotation matrix R expressing the attitude; wherein the rotation matrix R is
$$R = (E - S)^{-1}(E + S),\qquad S = \begin{pmatrix} 0 & -c & b \\ c & 0 & -a \\ -b & a & 0 \end{pmatrix}$$
where E is the identity matrix and a, b, c are the three parameters determined from the space resection equations.
6. An aircraft pose determination device, characterized by comprising:
an acquisition module for obtaining a reference map and the real-time image of an airborne camera, said reference map comprising a reference texture image and a reference terrain map;
a correction module for correcting the resolution of said real-time image, the corrected real-time image being consistent with the resolution of said reference texture image;
a matching module for matching the corrected real-time image against said reference texture image and determining the matching region of the corrected real-time image within said reference texture image;
a determination module for selecting at least three non-collinear feature points in the matching region of said reference texture image and querying the corresponding height values in said reference terrain map according to the feature points, thereby determining at least three corresponding control points;
a processing module for constructing space resection equations from said control points and solving them to determine the imaging position and attitude.
7. The device according to claim 6, wherein the correction module comprises:
a computing unit, configured to determine the resolution of the real-time image from the equivalent focal length of the real-time imaging and the flying height, and to determine the scaling factor for correcting the real-time image from the resolution of the standard texture image;
a scaling unit, configured to scale the real-time image by the scaling factor and to resample it, obtaining a corrected real-time image whose resolution matches that of the standard texture image.
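As one concrete reading of claim 7's two units, the sketch below derives a ground resolution from the equivalent focal length and flying height, then rescales and resamples. The pinhole ground-sample-distance formula, the `pixel_pitch_m` parameter, and OpenCV bilinear resampling are assumptions of this sketch, not details fixed by the patent.

```python
import cv2

def correct_resolution(realtime, focal_len_m, height_m, pixel_pitch_m,
                       reference_gsd_m):
    """Scale the real-time image so its ground resolution matches the
    standard texture image's (flat-terrain, nadir-view assumption)."""
    # Ground sample distance of the real-time image, meters per pixel.
    realtime_gsd = height_m * pixel_pitch_m / focal_len_m
    # Scaling factor that maps the real-time resolution onto the reference's.
    scale = realtime_gsd / reference_gsd_m
    h, w = realtime.shape[:2]
    # Resample (bilinear) to the corrected resolution.
    return cv2.resize(realtime, (round(w * scale), round(h * scale)),
                      interpolation=cv2.INTER_LINEAR)
```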
8. The device according to claim 6, wherein the matching module comprises:
a selection unit, configured to select a feature point to be matched in the corrected real-time image, the interest value of the feature point to be matched being greater than a first preset threshold; wherein the interest value $I_v$ is:
$I_v = \det[M] - 0.04\,(\operatorname{tr}[M])^2$, where $M = \sum_{x,y} w(x,y) \begin{pmatrix} I_x^2 & I_x I_y \\ I_x I_y & I_y^2 \end{pmatrix}$ is the covariance matrix of a pixel region of preset length and width centered on the feature point to be matched, $\det[M]$ is the determinant of $M$, $\operatorname{tr}[M]$ is the trace of $M$, $w(x,y)$ is a Gaussian weighting function, and $I_x$ and $I_y$ are the horizontal and vertical difference values;
a matching unit, configured to calculate correlation coefficients between the first pixel region of the corrected real-time image and second pixel regions of the standard texture image, and to select the second pixel region with the largest correlation coefficient as the matching region in the standard texture image; the first pixel region being a pixel region of preset length and width centered on the feature point to be matched, and each second pixel region being a pixel region of preset length and width in the standard texture image centered on a candidate pixel.
9. The device according to any one of claims 6 to 8, wherein the determination module comprises:
a first determining unit, configured to determine the interest value of every pixel in the matching region, and to select the pixel with the largest interest value as the first feature point; wherein the interest value $I_v$ is:
$I_v = \det[M] - 0.04\,(\operatorname{tr}[M])^2$, where $M = \sum_{x,y} w(x,y) \begin{pmatrix} I_x^2 & I_x I_y \\ I_x I_y & I_y^2 \end{pmatrix}$ is the covariance matrix of a pixel region of preset length and width centered on the pixel, $\det[M]$ is the determinant of $M$, $\operatorname{tr}[M]$ is the trace of $M$, $w(x,y)$ is a Gaussian weighting function, and $I_x$ and $I_y$ are the horizontal and vertical difference values;
a second determining unit, configured to determine a second feature point, the second feature point being a pixel whose distance from the first feature point is greater than a first preset distance and whose interest value is greater than a second preset threshold;
a third determining unit, configured to determine a third feature point, the third feature point being a pixel whose distances from both the first feature point and the second feature point are greater than a second preset distance, whose distance from the feature line is greater than a third preset distance, and whose interest value is greater than the second preset threshold; the feature line is the straight line determined by the first feature point and the second feature point.
10. The device according to any one of claims 6 to 8, wherein the processing module comprises:
a first construction unit, configured to construct distance equations from the three control points, and to determine the three imaging distances between the imaging position and the three feature points;
a second construction unit, configured to construct the space resection calculation equation from the three control points and the three imaging distances, and to solve for the imaging position and the rotation matrix $R$ representing the attitude; wherein the rotation matrix $R$ is:
$R = (E - S)^{-1}(E + S)$, where $S = \begin{pmatrix} 0 & -c & b \\ c & 0 & -a \\ -b & a & 0 \end{pmatrix}$,
in which $E$ is the identity matrix and $a$, $b$, $c$ are the three parameters determined by the space resection calculation equation.
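Read together, claims 6 to 10 describe a pipeline from acquisition to pose. The sketch below strings the earlier illustrative helpers together; `solve_resection` is a deliberately unimplemented placeholder for the patent's space resection equations, whose full form the published text does not spell out, and the matching-region half-size is invented for the example.

```python
import numpy as np

def solve_resection(control_points, image_points):
    """Placeholder: from three ground control points and their image
    coordinates, recover the imaging position and the parameters (a, b, c).
    The actual equations are not reproduced in this sketch."""
    raise NotImplementedError("space resection solve omitted")

def estimate_pose(realtime, reference, topo_height, cam):
    """End-to-end sketch: correct resolution, match, pick three feature
    points, look up heights to form control points, then solve for pose."""
    rt = correct_resolution(realtime, cam["focal_m"], cam["height_m"],
                            cam["pitch_m"], cam["ref_gsd_m"])
    (mx, my), _ = match_region(rt, reference,
                               rt.shape[1] // 2, rt.shape[0] // 2)
    half = 64  # illustrative matching-region half-size
    region = reference[my - half:my + half, mx - half:mx + half]
    p1, p2, p3 = pick_three(interest_values(region))
    ctrl = [np.array([mx - half + x, my - half + y,
                      topo_height[int(my - half + y), int(mx - half + x)]])
            for x, y in (p1, p2, p3)]
    position, (a, b, c) = solve_resection(ctrl, (p1, p2, p3))
    return position, rotation_from_abc(a, b, c)
```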
CN201510507925.XA 2015-08-17 2015-08-17 Aircraft pose determining method and device Active CN105043392B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510507925.XA CN105043392B (en) 2015-08-17 2015-08-17 Aircraft pose determining method and device

Publications (2)

Publication Number Publication Date
CN105043392A true CN105043392A (en) 2015-11-11
CN105043392B CN105043392B (en) 2018-03-02

Family

ID=54450126

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510507925.XA Active CN105043392B (en) Aircraft pose determining method and device

Country Status (1)

Country Link
CN (1) CN105043392B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080109184A1 (en) * 2006-11-06 2008-05-08 Canon Kabushiki Kaisha Position and orientation measurement method and apparatus
CN103424114A (en) * 2012-05-22 2013-12-04 同济大学 Visual navigation/inertial navigation full combination method
CN103927738A (en) * 2014-01-10 2014-07-16 北京航天飞行控制中心 Planet vehicle positioning method based on binocular vision images in large-distance mode

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LIU Yong, "Image-based self-localization and attitude determination of a mobile platform", China Doctoral Dissertations Full-text Database, Basic Sciences *
TANG Nian, "Ground hardware-in-the-loop simulation and verification system for scene matching guidance", China Master's Theses Full-text Database, Engineering Science & Technology II *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105571573A (en) * 2015-12-29 2016-05-11 南京中观软件技术有限公司 Oblique photography position and attitude determination method and system, and aircraft position and attitude determination method and system
CN105571573B (en) * 2015-12-29 2018-05-01 南京中观软件技术有限公司 Oblique photography position and attitude determination method and system, and aircraft position and attitude determination method and system
CN105606073A (en) * 2016-01-11 2016-05-25 谭圆圆 Unmanned aerial vehicle processing system and flight state data processing method thereof
CN105825517B (en) * 2016-03-31 2018-09-07 湖北航天技术研究院总体设计所 Image correction method and system for navigation height error
WO2019007258A1 (en) * 2017-07-07 2019-01-10 腾讯科技(深圳)有限公司 Method, apparatus and device for determining camera posture information, and storage medium
TWI683259B (en) * 2017-07-07 2020-01-21 大陸商騰訊科技(深圳)有限公司 Method and related device of determining camera posture information
US10963727B2 (en) 2017-07-07 2021-03-30 Tencent Technology (Shenzhen) Company Limited Method, device and storage medium for determining camera posture information
US11605214B2 (en) 2017-07-07 2023-03-14 Tencent Technology (Shenzhen) Company Limited Method, device and storage medium for determining camera posture information
CN108917753A (en) * 2018-04-08 2018-11-30 中国人民解放军63920部队 Aircraft position determination method based on structure from motion
CN108917753B (en) * 2018-04-08 2022-02-15 中国人民解放军63920部队 Aircraft position determination method based on structure from motion
CN109540173A (en) * 2018-09-17 2019-03-29 江西洪都航空工业集团有限责任公司 Vision-aided transfer alignment method
WO2021217403A1 (en) * 2020-04-28 2021-11-04 深圳市大疆创新科技有限公司 Method and apparatus for controlling movable platform, and device and storage medium

Also Published As

Publication number Publication date
CN105043392B (en) 2018-03-02

Similar Documents

Publication Publication Date Title
Stöcker et al. Quality assessment of combined IMU/GNSS data for direct georeferencing in the context of UAV-based mapping
CN105043392A (en) Aircraft pose determining method and aircraft pose determining device
Maaref et al. Lane-level localization and mapping in GNSS-challenged environments by fusing lidar data and cellular pseudoranges
Li Mobile mapping: An emerging technology for spatial data acquisition
US6639553B2 (en) Passive/ranging/tracking processing method for collision avoidance guidance
GREJNER‐BRZEZINSKA Direct exterior orientation of airborne imagery with GPS/INS system: Performance analysis
CN111426320B (en) Vehicle autonomous navigation method based on image matching/inertial navigation/milemeter
CN110470304B (en) High-precision target positioning and speed measuring method based on unmanned aerial vehicle photoelectric platform
Carle et al. Global rover localization by matching lidar and orbital 3d maps
CN105928518A (en) Indoor pedestrian UWB/INS tightly combined navigation system and method adopting pseudo range and position information
Wen et al. Object-detection-aided GNSS and its integration with lidar in highly urbanized areas
Ouyang et al. Cooperative navigation of UAVs in GNSS-denied area with colored RSSI measurements
CN108225282B (en) Remote sensing camera stereo mapping method and system based on multivariate data fusion
CN112923919A (en) Pedestrian positioning method and system based on graph optimization
CN113340272B (en) Ground target real-time positioning method based on micro-group of unmanned aerial vehicle
Niu et al. Camera-based lane-aided multi-information integration for land vehicle navigation
Chu et al. Performance comparison of tight and loose INS-Camera integration
CN103245948B (en) Image matching navigation method for dual-area imaging synthetic aperture radars
Toth et al. Terrain-based navigation: Trajectory recovery from LiDAR data
Adams et al. Velocimeter LIDAR-based multiplicative extended Kalman filter for Terrain relative navigation applications
CN115930948A (en) Orchard robot fusion positioning method
Emter et al. Stochastic cloning and smoothing for fusion of multiple relative and absolute measurements for localization and mapping
CN115388890A (en) Visual sense-based multi-unmanned aerial vehicle cooperative ground target positioning method
Aboutaleb et al. Examining the Benefits of LiDAR Odometry Integrated with GNSS and INS in Urban Areas
CN106123894A (en) InSAR/INS Combinated navigation method based on interference fringe coupling

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant