CN105043392B - Aircraft pose determination method and device - Google Patents

Aircraft pose determination method and device

Info

Publication number
CN105043392B
CN105043392B (application CN201510507925.XA)
Authority
CN
China
Prior art keywords
real-time image
reference
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510507925.XA
Other languages
Chinese (zh)
Other versions
CN105043392A (en)
Inventor
李立春 (Li Lichun)
周建亮 (Zhou Jianliang)
孙军 (Sun Jun)
范文山 (Fan Wenshan)
张伟 (Zhang Wei)
尚德生 (Shang Desheng)
苗毅 (Miao Yi)
许颖慧 (Xu Yinghui)
程肖 (Cheng Xiao)
吴晓进 (Wu Xiaojin)
戴烨 (Dai Ye)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
PEOPLES LIBERATION ARMY TROOP 63920
Original Assignee
PEOPLES LIBERATION ARMY TROOP 63920
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by PEOPLES LIBERATION ARMY TROOP 63920
Priority to CN201510507925.XA
Publication of CN105043392A
Application granted
Publication of CN105043392B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 Interpretation of pictures

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an aircraft pose determination method and device. The method includes: obtaining the reference map and the real-time image of the airborne camera; correcting the resolution of the real-time image so that the corrected real-time image has the same resolution as the reference texture image; matching the corrected real-time image against the reference texture image to determine the matching region of the corrected real-time image in the reference texture image; selecting at least three non-collinear feature points in the matching region of the reference texture image and determining at least three corresponding control points; and constructing and solving space resection equations from the control points to determine the imaging position and attitude. The method combines the three-dimensional information provided by the terrain map associated with the reference map in the inverse solution, can obtain the flight position and attitude data of the aircraft simultaneously, and yields positioning directly relative to the reference terrain map, with no accumulated error and high precision.

Description

Aircraft pose determination method and device
Technical field
The present invention relates to the field of navigation and automatic control within communications technology, and in particular to an aircraft pose determination method and device.
Background technology
In the navigation and control of aircraft such as spacecraft and unmanned aerial vehicles, the position and attitude of the aircraft are key information for safe flight and mission execution. Commonly used methods for determining the flight position and attitude of an aircraft include inertial navigation, GPS-based navigation, and terrain matching navigation.
In inertial navigation, motion information is measured with navigation sensors such as accelerometers and gyroscopes: the real-time acceleration measured by the accelerometer is integrated to obtain the position and attitude information during flight. However, because this method determines pose by integration, it suffers from accumulated error; over long periods the accumulated error becomes large, and the inertial navigation solution usually has to be corrected by some other accurate positioning method.
In GPS navigation positioning, a GPS component installed on the aircraft receives the signals of the GPS navigation satellites in orbit and computes the position of the aircraft from the signals of the multiple satellites received; it is one of the common navigation methods for moving vehicles such as aircraft and automobiles. However, because the GPS satellite signals are provided by foreign navigation satellites, high-accuracy data are usually unavailable, so the computed solution is of poor precision, and whether the signal can be obtained normally is subject to others' control. At the same time, because reception of the GPS satellite signals is affected by blocking of the line of sight to the orbiting satellites, both the accuracy and the update frequency of the position information computed by GPS navigation can decline.
Image matching positioning is of two kinds: terrain matching positioning and scene matching positioning. In terrain matching, a laser altimeter carried by the aircraft measures the terrain elevation along the flight path in real time, and the real-time measurements are matched against a reference terrain map pre-loaded on the aircraft to determine the position of the aircraft and thereby realize flight navigation. Scene matching navigation requires matchable topographic features: an imaging device such as a camera records the landforms around the flight path or near the target, which are compared with reference images stored on the aircraft to perform matching positioning and navigation. The laser altimeter is special-purpose equipment with relatively high energy consumption; such precision instruments are expensive and difficult to install on ordinary aircraft, and this one-dimensional matching navigation is suited to flight over mountainous terrain. Scene matching is two-dimensional matching navigation that can determine the deviation of the aircraft in two coordinates, and is suitable only for navigation over flat terrain.
Position and attitude parameters are key information for the safe flight of an aircraft and the smooth execution of its tasks, and an important part of aircraft navigation control. Among the usual methods, radio-measurement and inertial navigation methods have high energy consumption and serious error accumulation; GPS positioning relies on navigation satellite signals and is subject to others' control; and terrain matching positioning requires accurate and expensive elevation-measuring instruments, making it difficult to popularize on ordinary aircraft.
Summary of the invention
The present invention is made to overcome the serious error accumulation of prior-art pose determination methods; according to one aspect of the present invention, an aircraft pose determination method is proposed.
An aircraft pose determination method provided in an embodiment of the present invention includes:
obtaining the reference map and the real-time image of the airborne camera, the reference map including a reference texture image and a reference terrain map;
correcting the resolution of the real-time image so that the corrected real-time image has the same resolution as the reference texture image;
matching the corrected real-time image against the reference texture image to determine the matching region of the corrected real-time image in the reference texture image;
selecting at least three non-collinear feature points in the matching region of the reference texture image, querying the corresponding height values of the feature points on the reference terrain map, and determining at least three corresponding control points;
constructing and solving space resection equations from the control points to determine the imaging position and attitude.
In the above technical solution, correcting the resolution of the real-time image includes:
determining the resolution of the real-time image from the equivalent focal length of the real-time imaging and the flying height, and determining from the resolution of the reference texture image the zoom factor for correcting the real-time image;
zooming the real-time image by the zoom factor and resampling it to obtain a corrected real-time image whose resolution matches that of the reference texture image.
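The resolution-correction step above can be sketched in Python. This assumes, as in embodiment one's formulas (1) and (2), that the real-time resolution follows the pinhole relation dL = L/f with the equivalent focal length f expressed in pixels, and uses nearest-neighbor resampling as a simple stand-in for the resampling step; all function names are illustrative:

```python
def realtime_resolution(flying_height_m, focal_length_px):
    """Resolution of the real-time image in meters/pixel (assumed pinhole relation)."""
    return flying_height_m / focal_length_px

def zoom_factor(d_realtime, d_reference):
    """Zoom factor that brings the real-time image to the reference resolution."""
    return d_realtime / d_reference

def resample_nearest(img, zoom):
    """Nearest-neighbor resampling stand-in for the resampling in the patent."""
    h, w = len(img), len(img[0])
    nh, nw = max(1, round(h * zoom)), max(1, round(w * zoom))
    return [[img[min(h - 1, int(r / zoom))][min(w - 1, int(c / zoom))]
             for c in range(nw)] for r in range(nh)]

d_L = realtime_resolution(1000.0, 2000.0)   # 0.5 m/pixel at 1000 m with f = 2000 px
Z = zoom_factor(d_L, 0.25)                  # reference at 0.25 m/pixel: zoom by 2.0
small = [[1, 2], [3, 4]]
big = resample_nearest(small, Z)            # enlarged to a 4x4 image
```

The coarser airborne image (0.5 m/pixel) is enlarged to match the finer reference map (0.25 m/pixel), so the zoom factor comes out greater than one.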
In the above technical solution, matching the corrected real-time image against the reference texture image to determine the matching region of the corrected real-time image in the reference texture image includes:
selecting in the corrected real-time image a feature point to be matched whose interest value exceeds a first preset threshold, where the interest value Iv is:
Iv = det[M] - 0.04(tr[M])^2, where M is the covariance matrix of the pixel region of preset width and height centered on the feature point to be matched, M = Σ(x,y) w(x,y)·[Ix^2, Ix·Iy; Ix·Iy, Iy^2], det[M] is the determinant of the matrix M, tr[M] is the trace of the matrix M, w(x,y) is a Gaussian weighting function, Ix is the horizontal difference value, and Iy is the vertical difference value;
computing correlation coefficients between the first pixel region of the corrected real-time image and second pixel regions of the reference texture image, and selecting the second pixel region with the largest correlation coefficient as the matching region of the reference texture image; the first pixel region is the pixel region of preset width and height centered on the feature point to be matched, and a second pixel region is a pixel region of preset width and height centered on some pixel in the reference texture image.
In the above technical solution, selecting at least three non-collinear feature points includes:
determining the interest value of every pixel in the matching region and selecting the pixel with the largest interest value as the first feature point, where the interest value Iv is:
Iv = det[M] - 0.04(tr[M])^2, where M is the covariance matrix of the pixel region of preset width and height centered on the feature point to be matched, M = Σ(x,y) w(x,y)·[Ix^2, Ix·Iy; Ix·Iy, Iy^2], det[M] is the determinant of the matrix M, tr[M] is the trace of the matrix M, w(x,y) is a Gaussian weighting function, Ix is the horizontal difference value, and Iy is the vertical difference value;
determining the second feature point, which is a pixel whose distance from the first feature point is a first preset distance and whose interest value exceeds a second preset threshold;
determining the third feature point, which is a pixel whose distances from the first and second feature points both exceed a second preset distance, whose distance from the feature line exceeds a third preset distance, and whose interest value exceeds the second preset threshold; the feature line is the line determined by the first and second feature points.
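The selection rules above can be illustrated with a small sketch. The distance thresholds and the use of "at least" comparisons are illustrative simplifications (the patent fixes the second point at exactly the first preset distance), and all names are hypothetical:

```python
import math

def point_line_distance(p, a, b):
    """Distance from point p to the feature line through points a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    num = abs((by - ay) * px - (bx - ax) * py + bx * ay - by * ax)
    return num / math.hypot(bx - ax, by - ay)

def pick_three(points, iv, d2, d3, iv_th):
    """Pick three non-collinear feature points following the patent's rules.
    points: candidate (x, y) pixels; iv: dict mapping point -> interest value.
    d2: minimum distance to earlier picks; d3: minimum distance to the
    feature line; iv_th: interest-value threshold (all illustrative)."""
    p1 = max(points, key=lambda p: iv[p])                 # largest interest value
    p2 = next(p for p in points
              if p != p1 and math.dist(p, p1) >= d2 and iv[p] > iv_th)
    p3 = next(p for p in points
              if math.dist(p, p1) >= d2 and math.dist(p, p2) >= d2
              and point_line_distance(p, p1, p2) > d3 and iv[p] > iv_th)
    return p1, p2, p3
```

The point-line distance constraint is what enforces non-collinearity: a third point close to the line through the first two would make the later resection ill-conditioned.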
In the above technical solution, constructing and solving space resection equations from the control points to determine the imaging position and attitude includes:
constructing distance equations from the three control points and determining the three imaging distances between the imaging position and the three feature points;
constructing space resection equations from the three control points and the three imaging distances, and solving for the imaging position and the rotation matrix R representing the attitude, R being parameterized by the identity matrix E and three parameters a, b, c determined by solving the space resection equations.
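In classical space resection from three control points, the "distance equations" are law-of-cosines constraints relating the three unknown imaging distances to the angles between the viewing rays; the following sketch is written under that assumption, with a synthetic camera position used only to generate test data (all names are illustrative):

```python
import math

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(a): return math.sqrt(dot(a, a))

def p3p_residuals(cam, ctrl):
    """Residuals of the law-of-cosines distance equations of space resection:
        s_i^2 + s_j^2 - 2*s_i*s_j*cos(theta_ij) = |P_i - P_j|^2.
    Here the known camera position 'cam' is used only to synthesize the
    viewing directions and true distances; in the patent the distances s_i
    are the unknowns being solved for."""
    rays = [sub(p, cam) for p in ctrl]
    s = [norm(r) for r in rays]                       # imaging distances
    u = [tuple(x / si for x in r) for r, si in zip(rays, s)]  # unit rays
    res = []
    for i, j in [(0, 1), (1, 2), (0, 2)]:
        cos_ij = dot(u[i], u[j])
        dij2 = norm(sub(ctrl[i], ctrl[j])) ** 2
        res.append(s[i] ** 2 + s[j] ** 2 - 2 * s[i] * s[j] * cos_ij - dij2)
    return res
```

For a consistent camera/control-point configuration the three residuals vanish, which is exactly the condition a resection solver drives to zero.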
Based on the same inventive concept, an embodiment of the present invention also provides an aircraft pose determination device, including:
an acquisition module, for obtaining the reference map and the real-time image of the airborne camera, the reference map including a reference texture image and a reference terrain map;
a correction module, for correcting the resolution of the real-time image so that the corrected real-time image has the same resolution as the reference texture image;
a matching module, for matching the corrected real-time image against the reference texture image to determine the matching region of the corrected real-time image in the reference texture image;
a determining module, for selecting at least three non-collinear feature points in the matching region of the reference texture image, querying the corresponding height values of the feature points on the reference terrain map, and determining at least three corresponding control points;
a processing module, for constructing and solving space resection equations from the control points to determine the imaging position and attitude.
In the above technical solution, the correction module includes:
a computing unit, for determining the resolution of the real-time image from the equivalent focal length of the real-time imaging and the flying height, and determining from the resolution of the reference texture image the zoom factor for correcting the real-time image;
a zooming unit, for zooming the real-time image by the zoom factor and resampling it to obtain a corrected real-time image whose resolution matches that of the reference texture image.
In the above technical solution, the matching module includes:
a selection unit, for selecting in the corrected real-time image a feature point to be matched whose interest value exceeds the first preset threshold, where the interest value Iv is:
Iv = det[M] - 0.04(tr[M])^2, where M is the covariance matrix of the pixel region of preset width and height centered on the feature point to be matched, M = Σ(x,y) w(x,y)·[Ix^2, Ix·Iy; Ix·Iy, Iy^2], det[M] is the determinant of the matrix M, tr[M] is the trace of the matrix M, w(x,y) is a Gaussian weighting function, Ix is the horizontal difference value, and Iy is the vertical difference value;
a matching unit, for computing correlation coefficients between the first pixel region of the corrected real-time image and second pixel regions of the reference texture image, and selecting the second pixel region with the largest correlation coefficient as the matching region of the reference texture image; the first pixel region is the pixel region of preset width and height centered on the feature point to be matched, and a second pixel region is a pixel region of preset width and height centered on some pixel in the reference texture image.
In the above technical solution, the determining module includes:
a first determining unit, for determining the interest value of every pixel in the matching region and selecting the pixel with the largest interest value as the first feature point, where the interest value Iv is:
Iv = det[M] - 0.04(tr[M])^2, where M is the covariance matrix of the pixel region of preset width and height centered on the feature point to be matched, M = Σ(x,y) w(x,y)·[Ix^2, Ix·Iy; Ix·Iy, Iy^2], det[M] is the determinant of the matrix M, tr[M] is the trace of the matrix M, w(x,y) is a Gaussian weighting function, Ix is the horizontal difference value, and Iy is the vertical difference value;
a second determining unit, for determining the second feature point, which is a pixel whose distance from the first feature point is the first preset distance and whose interest value exceeds the second preset threshold;
a third determining unit, for determining the third feature point, which is a pixel whose distances from the first and second feature points both exceed the second preset distance, whose distance from the feature line exceeds the third preset distance, and whose interest value exceeds the second preset threshold; the feature line is the line determined by the first and second feature points.
In the above technical solution, the processing module includes:
a first construction unit, for constructing distance equations from the three control points and determining the three imaging distances between the imaging position and the three feature points;
a second construction unit, for constructing space resection equations from the three control points and the three imaging distances, and solving for the imaging position and the rotation matrix R representing the attitude, R being parameterized by the identity matrix E and three parameters a, b, c determined by solving the space resection equations.
An aircraft pose determination method and device provided in embodiments of the present invention match real-time visual imagery against the pre-acquired reference terrain map of the flight area and its landform reference texture image and compute the inverse solution, thereby obtaining the position and attitude parameters of the aircraft. In the matching between the aircraft's real-time imagery and the texture map, the imaging-height information is used to correct the resolution of the real-time image, and the corrected real-time image is matched with the reference texture map on this basis, which increases the reliability of the matching; the three-dimensional information provided by the terrain map associated with the reference map is combined in the inverse solution, so the flight position and attitude data of the aircraft can be obtained simultaneously, yielding more navigation information than other methods. With this position and attitude determination method, positioning is obtained directly relative to the reference terrain map, with no accumulated error and high precision. At the same time, this pose determination method takes optical imaging as its sensor input, a passive sensing mode with low energy consumption and little interference from external signals. Compared with scene matching methods, it eliminates the process of generating an orthoimage from the real-time image, simplifying part of the processing and obtaining more navigation data with less computation.
Other features and advantages of the present invention will be set forth in the following description and will in part become apparent from the description, or be understood by practicing the present invention. The objects and other advantages of the present invention can be realized and obtained by the structures particularly pointed out in the written description, the claims, and the accompanying drawings.
The technical solution of the present invention is described in further detail below through the drawings and embodiments.
Brief description of the drawings
The accompanying drawings provide a further understanding of the present invention and constitute a part of the specification; together with the embodiments of the present invention they serve to explain the present invention, and they do not limit the present invention. In the drawings:
Fig. 1 is a flowchart of the aircraft pose determination method in an embodiment of the present invention;
Fig. 2 is a schematic diagram of the real-time imaging and the reference map in an embodiment of the present invention;
Fig. 3 is a flowchart of the aircraft pose determination method in embodiment one;
Fig. 4 is a schematic diagram of the selection and distribution correspondence of the control points in embodiment one;
Fig. 5 is a schematic diagram of the imaging of the three control points in embodiment one;
Fig. 6 is a schematic diagram of the known parameters in the position and attitude solution in embodiment one;
Fig. 7 is a structural diagram of the aircraft pose determination device in an embodiment of the present invention;
Fig. 8 is a structural diagram of the correction module in an embodiment of the present invention;
Fig. 9 is a structural diagram of the matching module in an embodiment of the present invention;
Fig. 10 is a structural diagram of the determining module in an embodiment of the present invention;
Fig. 11 is a structural diagram of the processing module in an embodiment of the present invention.
Detailed description of the embodiments
The embodiments of the present invention are described in detail below with reference to the accompanying drawings. It should be understood that the protection scope of the present invention is not limited by the embodiments.
According to an embodiment of the present invention, an aircraft pose determination method is provided. Referring to Fig. 1, the method includes:
Step 101: obtain the reference map and the real-time image of the airborne camera, the reference map including a reference texture image and a reference terrain map.
In the embodiment of the present invention, the real-time image is the imaging of the flight area by the visual imaging device carried during the flight of the aircraft (i.e., the airborne camera), and the reference map is the terrain map of the flight area that can be obtained in advance, comprising a two-dimensional reference texture image and a three-dimensional reference terrain map. Referring to Fig. 2, from left to right are the real-time image, the reference texture image, and the three-dimensional reference terrain map.
Step 102: correct the resolution of the real-time image so that the corrected real-time image has the same resolution as the reference texture image.
Various resolution correction methods can be used, so long as the corrected real-time image and the reference texture image end up with the same resolution. The embodiment of the present invention specifically uses the following method, comprising steps A1-A2:
Step A1: determine the resolution of the real-time image from the equivalent focal length of the real-time imaging and the imaging height, and determine from the resolution of the reference texture image the zoom factor for correcting the real-time image;
Step A2: zoom the real-time image by the zoom factor and resample it to obtain a corrected real-time image whose resolution matches that of the reference texture image.
Step 103: match the corrected real-time image against the reference texture image to determine the matching region of the corrected real-time image in the reference texture image.
Specifically, a feature point to be matched whose interest value exceeds a first preset threshold is selected in the corrected real-time image, where the interest value Iv is:
Iv = det[M] - 0.04(tr[M])^2, where M is the covariance matrix of the pixel region of preset width and height centered on the feature point to be matched, M = Σ(x,y) w(x,y)·[Ix^2, Ix·Iy; Ix·Iy, Iy^2], det[M] is the determinant of the matrix M, tr[M] is the trace of the matrix M, w(x,y) is a Gaussian weighting function, Ix is the horizontal difference value, and Iy is the vertical difference value;
Correlation coefficients are then computed between the first pixel region of the corrected real-time image and second pixel regions of the reference texture image, and the second pixel region with the largest correlation coefficient is selected as the matching region of the reference texture image; the first pixel region is the pixel region of preset width and height centered on the feature point to be matched, and a second pixel region is a pixel region of preset width and height centered on some pixel in the reference texture image.
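A minimal sketch of the interest-value computation, assuming the standard Harris corner response that the formula above matches, with central-difference gradients and uniform weights standing in for the Gaussian w(x, y); function and variable names are illustrative:

```python
def interest_value(img, cx, cy, half=1, k=0.04):
    """Interest value Iv = det(M) - k*(tr M)^2 over a (2*half+1)^2 window
    centered at (cx, cy), with M accumulated from image differences."""
    a = b = c = 0.0   # M = [[a, b], [b, c]]
    for y in range(cy - half, cy + half + 1):
        for x in range(cx - half, cx + half + 1):
            ix = (img[y][x + 1] - img[y][x - 1]) / 2.0   # horizontal difference
            iy = (img[y + 1][x] - img[y - 1][x]) / 2.0   # vertical difference
            a += ix * ix
            b += ix * iy
            c += iy * iy
    det_m = a * c - b * b
    tr_m = a + c
    return det_m - k * tr_m * tr_m

flat = [[5.0] * 7 for _ in range(7)]
edge = [[0.0] * 3 + [9.0] * 4 for _ in range(7)]  # vertical step edge
iv_flat = interest_value(flat, 3, 3)   # 0.0: no gradients, M is the zero matrix
iv_edge = interest_value(edge, 3, 3)   # negative: rank-1 M, det = 0, trace > 0
```

Flat regions score zero and straight edges score negative, so thresholding Iv keeps only corner-like points, which is why the patent uses it to pick reliable matching features.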
Step 104: select at least three non-collinear feature points in the matching region of the reference texture image, query the corresponding height values of the three feature points on the reference terrain map, and determine at least three corresponding control points.
Specifically, the interest value of each of these feature points must also exceed the preset threshold; the interest value is computed as described above and is not detailed again here. The feature points are selected as follows, comprising steps B1-B3:
Step B1: determine the interest value of every pixel in the matching region, and select the pixel with the largest interest value as the first feature point;
Step B2: determine the second feature point, which is a pixel whose distance from the first feature point is a first preset distance and whose interest value exceeds a second preset threshold;
Step B3: determine the third feature point, which is a pixel whose distances from the first and second feature points both exceed a second preset distance, whose distance from the feature line exceeds a third preset distance, and whose interest value exceeds the second preset threshold; the feature line is the line determined by the first and second feature points.
The above merely illustrates the method of selecting three feature points; when more than three feature points exist, the remaining feature points can be selected by reference to the method of selecting the second or third feature point.
Because the reference texture image is a two-dimensional image, a feature point has only two variables; by determining the height value corresponding to the feature point on the three-dimensional reference terrain map, a control point with three variables can be determined.
Step 105: construct and solve space resection equations from the three control points to determine the imaging position and attitude.
Specifically, distance equations are constructed from the three control points to determine the three imaging distances between the imaging position and the three feature points; space resection equations are then constructed from the three control points and the three imaging distances, and the imaging position and the rotation matrix R representing the attitude are solved, R being parameterized by the identity matrix E and three parameters a, b, c determined by solving the space resection equations.
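The explicit form of R is not preserved in this text (the formula image is missing), but one common three-parameter construction consistent with "the identity matrix E and three parameters a, b, c" is the Cayley/Rodrigues form R = (E + S)(E - S)^-1 with S the antisymmetric matrix of (a, b, c); the following sketch assumes that parameterization, and does not claim it is the patent's exact formula:

```python
def matmul(A, B):
    """3x3 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def inv3(M):
    """3x3 matrix inverse via the adjugate."""
    (a, b, c), (d, e, f), (g, h, i) = M
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    adj = [[e * i - f * h, c * h - b * i, b * f - c * e],
           [f * g - d * i, a * i - c * g, c * d - a * f],
           [d * h - e * g, b * g - a * h, a * e - b * d]]
    return [[x / det for x in row] for row in adj]

def rotation_from_abc(a, b, c):
    """Assumed Cayley/Rodrigues construction R = (E + S)(E - S)^-1,
    with S the antisymmetric matrix built from the parameters a, b, c."""
    S = [[0.0, -c, b],
         [c, 0.0, -a],
         [-b, a, 0.0]]
    E = [[float(i == j) for j in range(3)] for i in range(3)]
    plus = [[E[i][j] + S[i][j] for j in range(3)] for i in range(3)]
    minus = [[E[i][j] - S[i][j] for j in range(3)] for i in range(3)]
    return matmul(plus, inv3(minus))
```

Whatever the exact parameterization, the defining property is that any (a, b, c) yields a proper rotation: R times its transpose is the identity.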
An aircraft pose determination method provided in an embodiment of the present invention matches real-time visual imagery against the pre-acquired reference terrain map of the flight area and its landform reference texture image and computes the inverse solution, thereby obtaining the position and attitude parameters of the aircraft. In the matching between the aircraft's real-time imagery and the texture map, the imaging-height information is used to correct the resolution of the real-time image, and the corrected real-time image is matched with the reference texture map on this basis, which increases the reliability of the matching; the three-dimensional information provided by the terrain map associated with the reference map is combined in the inverse solution, so the flight position and attitude data of the aircraft can be obtained simultaneously, yielding more navigation information than other methods. With this position and attitude determination method, positioning is obtained directly relative to the reference terrain map, with no accumulated error and high precision. At the same time, this pose determination method takes optical imaging as its sensor input, a passive sensing mode with low energy consumption and little interference from external signals. Compared with scene matching methods, it eliminates the process of generating an orthoimage from the real-time image, simplifying part of the processing and obtaining more navigation data with less computation.
The flow of the aircraft pose determination method is described in detail below through an embodiment.
Embodiment one
In embodiment one, real-time visual imagery is matched against the pre-acquired reference terrain map of the flight area and its landform reference texture image, and the inverse solution is computed to obtain the position and attitude parameters of the aircraft. Specifically, referring to Fig. 3, the method comprises the following steps:
Step 301: determine the resolution of the real-time image from the equivalent focal length of the real-time imaging and the flying height, and determine from the resolution of the reference texture image the zoom factor for correcting the real-time image.
In embodiment one, let the resolution of the reference texture image be dR (meters/pixel), and let IL be the real-time image taken when the flying height of the aircraft is L meters. Then, by the optical imaging relation, the resolution dL of the real-time image is calculated by formula (1), where f is the equivalent focal length of the imaging system:
dL = L / f    (1)
To realize automatic and reliable matching between the real-time image and the reference texture map, the real-time image must be corrected to the same resolution as the reference texture image; the zoom factor ZLR for correcting the real-time image is given by formula (2):
ZLR = dL / dR    (2)
Step 302: zoom the real-time image by the zoom factor and resample it to obtain a corrected real-time image whose resolution matches that of the reference texture image.
In embodiment one, bilinear interpolation is used to zoom the real-time image IL by a factor of ZLR and resample it, obtaining the corrected real-time image I′L whose resolution scale matches that of the reference texture map.
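The bilinear resampling of step 302 can be sketched as follows; this is a minimal pure-Python version, and the function names are illustrative:

```python
def bilinear_sample(img, x, y):
    """Bilinearly interpolated value of img at fractional coordinates (x, y)."""
    h, w = len(img), len(img[0])
    x0, y0 = min(int(x), w - 2), min(int(y), h - 2)
    dx, dy = x - x0, y - y0
    return (img[y0][x0] * (1 - dx) * (1 - dy)
            + img[y0][x0 + 1] * dx * (1 - dy)
            + img[y0 + 1][x0] * (1 - dx) * dy
            + img[y0 + 1][x0 + 1] * dx * dy)

def zoom_bilinear(img, z):
    """Resample img to z times its size with bilinear interpolation."""
    h, w = len(img), len(img[0])
    nh, nw = round(h * z), round(w * z)
    return [[bilinear_sample(img, c / z, r / z) for c in range(nw)]
            for r in range(nh)]
```

Bilinear interpolation is the usual compromise for this step: smoother than nearest-neighbor resampling, which matters because the subsequent correlation matching is sensitive to blocky artifacts, yet far cheaper than higher-order kernels.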
Step 303:Characteristic point to be matched is chosen in realtime graphic after calibration, the interest value of the characteristic point to be matched is big In the first predetermined threshold value.
Wherein, interest value IvFor:
Iv=det [M] -0.04 (tr [M])2, M is the pixel region of the default length and width centered on characteristic point to be matched in formula The covariance matrix in domain,Det [M] is matrix M determinant, and tr [M] is matrix M Mark, w (x, y) is Gauss weighting function, IxIt is the difference value of transverse direction, IyIt is the difference value of longitudinal direction;
Specifically, in embodiment one, the center point p0 of the corrected real-time image is taken as the initial point and its interest value Iv is calculated. If the interest value is greater than the first preset threshold IvTh (0.6 in embodiment one), the center point p0 is selected as the feature point pr to be matched.
If the interest value of the center point p0 is not greater than the first preset threshold IvTh, a distance value d = 1 is set, and the interest values of all pixels at distance d from the center point p0 are calculated in turn. If the interest value of some pixel pi exceeds the first preset threshold, the search stops and pi is selected as the feature point pr to be matched. If no point qualifies, the distance d is increased step by step, i.e. d = d + Δd with step Δd (which may be 1), and the step of calculating the interest values of all pixels at distance d from the center point is repeated, looping until the feature point pr to be matched is determined.
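The interest value of step 303 and the outward ring search can be sketched as follows. The window size, Gaussian width and gradient operator are illustrative assumptions; note that the absolute scale of Iv depends on how the image is normalized, so the threshold 0.6 of embodiment one only applies under a matching normalization.

```python
import numpy as np

def interest_value(img, py, px, half=7, k=0.04):
    """Iv = det(M) - k * tr(M)^2 for the (2*half+1)-square window centred
    at (py, px); `half` and the Gaussian width are illustrative choices."""
    win = img[py - half:py + half + 1, px - half:px + half + 1].astype(float)
    Iy, Ix = np.gradient(win)              # vertical / horizontal differences
    ax = np.arange(-half, half + 1)
    g = np.exp(-(ax ** 2) / (2.0 * (half / 2.0) ** 2))
    w = np.outer(g, g)                     # Gaussian weighting w(x, y)
    A = (w * Ix * Ix).sum()
    B = (w * Ix * Iy).sum()
    C = (w * Iy * Iy).sum()
    M = np.array([[A, B], [B, C]])         # covariance matrix
    return float(np.linalg.det(M) - k * np.trace(M) ** 2)

def find_feature_point(img, threshold, half=7):
    """Step 303: start at the image centre and search outwards, ring by
    ring, for the first pixel whose interest value exceeds the threshold."""
    h, w = img.shape
    cy, cx = h // 2, w // 2
    for d in range(0, min(cy, cx) - half):
        for y in range(cy - d, cy + d + 1):
            for x in range(cx - d, cx + d + 1):
                if max(abs(y - cy), abs(x - cx)) != d:
                    continue               # only the ring at distance d
                if interest_value(img, y, x, half) > threshold:
                    return y, x
    return None
```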
Step 304: Calculate the correlation coefficients between a first pixel region of the corrected real-time image and second pixel regions of the benchmark texture image, and select the second pixel region with the largest correlation coefficient as the matching region of the benchmark texture image.
Here the first pixel region is a pixel region of preset length and width centered on the feature point to be matched, and each second pixel region is a pixel region of preset length and width centered on a pixel of the benchmark texture image.
In embodiment one, after the feature point pr to be matched is determined, a local rectangular region of length and width W (15 in embodiment one) centered on pr is selected on the real-time image as the template image IM to be matched.
Taking the position (x0, y0) provided by the flight inertial navigation system as the initial center on the benchmark texture image, within a search range of radius R on the benchmark texture image, the correlation coefficient between the W × W local region centered on each pixel and the template image IM from the real-time image is calculated. The pixel with the maximum correlation coefficient is selected as the matched positioning result, and the W × W region Ib of the same name determined by this matched point of the benchmark texture image is taken as the matching region of the benchmark texture image. Calculating the correlation coefficient between two images is common knowledge in the art and is not repeated here.
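The correlation search of step 304 can be sketched with a normalized cross-correlation coefficient (one common definition of the image correlation coefficient; the patent does not fix a particular formula). The square search window below stands in for the radius-R range.

```python
import numpy as np

def ncc(a, b):
    """Normalised cross-correlation coefficient of two equal-size patches."""
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def match_template(base, template, x0, y0, radius):
    """Step 304 sketch: exhaustively search a window centred on the inertial
    prior (x0, y0) of the benchmark texture image for the W x W patch that
    best correlates with `template`; returns (best_y, best_x, best_coeff)."""
    W = template.shape[0]
    h = W // 2
    best = (-2.0, None, None)
    for y in range(max(h, y0 - radius), min(base.shape[0] - h, y0 + radius + 1)):
        for x in range(max(h, x0 - radius), min(base.shape[1] - h, x0 + radius + 1)):
            patch = base[y - h:y + h + 1, x - h:x + h + 1]
            c = ncc(patch, template)
            if c > best[0]:
                best = (c, y, x)
    return best[1], best[2], best[0]
```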
Step 305: Determine the interest values of all pixels in the matching region, and select the pixel with the maximum interest value as the first feature point.
Step 306: Determine a second feature point, the second feature point being a pixel whose distance from the first feature point is a first preset distance and whose interest value is greater than a second preset threshold.
Step 307: Determine a third feature point, the third feature point being a pixel whose distances from the first feature point and the second feature point are both greater than a second preset distance, whose distance from the feature line is greater than a third preset distance, and whose interest value is greater than the second preset threshold; the feature line is the straight line determined by the first feature point and the second feature point.
In embodiment one, after the matching region Ib is determined, three feature points are selected according to the principle that the feature points are not collinear and their interest values are sufficiently large, i.e. the above steps 305-307. The selection process may specifically be as follows:
Step C1: Calculate the interest values of all pixels in the matching region Ib on the benchmark texture image, and choose the point with the maximum interest value as the first feature point pb1.
Step C2: Set the first preset distance d = Dth. Examine in turn the points at distance d from the feature point pb1; if the interest value of some point is greater than the second preset threshold IvTh (which may be 0.6), that point is selected as the second feature point pb2; otherwise increase the distance d by a preset step (which may be 1) and repeat step C2 until a second feature point pb2 satisfying the condition is chosen.
Step C3: Determine the straight line l12 (i.e. the feature line) through the feature points pb1 and pb2. For each pixel pbi whose distance from the feature line l12 is greater than the third preset distance DtoL, examine one by one its distances db1, db2 from pb1 and pb2 and its interest value Ivbi; if the distances satisfy min(db1, db2) > dth (dth being the second preset distance) and the interest value satisfies Ivbi > IvTh, the point pbi is selected as the third feature point pb3.
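Steps C1-C3 can be sketched as a greedy selection over a precomputed list of candidate pixels and their interest values; the patent instead grows the distance threshold incrementally, and the helper names here are illustrative.

```python
import numpy as np

def point_line_distance(p, a, b):
    """Distance from 2-D point p to the straight line through a and b."""
    (ay, ax), (by, bx), (py, px) = a, b, p
    cross = (bx - ax) * (py - ay) - (by - ay) * (px - ax)
    return abs(cross) / np.hypot(bx - ax, by - ay)

def pick_three_points(points, iv, d_min, d_line, iv_th):
    """Steps C1-C3 over a candidate list of (y, x) pixels with interest
    values `iv` (same order); raises StopIteration if no point qualifies."""
    dist = lambda u, v: np.hypot(u[0] - v[0], u[1] - v[1])
    order = list(np.argsort(iv)[::-1])
    p1 = points[order[0]]                              # C1: max interest value
    p2 = next(points[i] for i in order[1:]             # C2: far enough from p1
              if iv[i] > iv_th and dist(points[i], p1) >= d_min)
    p3 = next(points[i] for i in order[1:]             # C3: far from both and
              if iv[i] > iv_th and points[i] != p2     #     off the line l12
              and min(dist(points[i], p1), dist(points[i], p2)) >= d_min
              and point_line_distance(points[i], p1, p2) >= d_line)
    return p1, p2, p3
```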
Step 308: Query the corresponding height values on the benchmark topographic map according to the feature points, and determine at least three corresponding control points.
Specifically, using the method of calculating image correlation coefficients, least-squares iteration is applied on the real-time image to determine the accurate registration points pr1(x1, y1), pr2(x2, y2), pr3(x3, y3) of the same name as the three feature points pb1, pb2, pb3. On the three-dimensional benchmark topographic map, the height values corresponding to the three feature points are queried according to the image coordinates of pb1, pb2, pb3, yielding the corresponding control points Pb1(X1, Y1, Z1), Pb2(X2, Y2, Z2), Pb3(X3, Y3, Z3). A schematic diagram of the selection and distribution of the control points used for the position and attitude calculation is shown in Fig. 4.
Step 309: Build range equations from the three control points, and determine the three imaging distances between the imaging position and the three feature points.
Specifically, referring to Fig. 5, the three points 1, 2, 3 in Fig. 5 respectively represent the three control points Pb1(X1, Y1, Z1), Pb2(X2, Y2, Z2), Pb3(X3, Y3, Z3), and PS is the position of the current aircraft, i.e. the imaging position to be determined in embodiment one. S1, S2, S3 denote the distances from the optical center PS to the three control points and are the unknowns to be solved. S12 is the distance between control points 1 and 2, S23 the distance between control points 2 and 3, and S31 the distance between control points 1 and 3. φ12 is the angle between the projection rays PS-1 and PS-2, φ23 the angle between the projection rays PS-2 and PS-3, and φ13 the angle between the projection rays PS-1 and PS-3; these three angles can be calculated from the image points and the camera parameters.
The range equations built from the three control points follow from the law of cosines:

    S1² + S2² − 2·S1·S2·cos φ12 = S12²
    S2² + S3² − 2·S2·S3·cos φ23 = S23²
    S1² + S3² − 2·S1·S3·cos φ13 = S31²

From these range equations the three imaging distances S1, S2, S3 between the imaging position PS and the three feature points can be determined.
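The construction of step 309 can be sketched as follows: the angles between projection rays are computed from the image points and the camera focal length, and the law-of-cosines residuals define the range equations (a root-finding step, not shown, would then recover S1, S2, S3). The camera model and parameter names are illustrative assumptions.

```python
import numpy as np

def view_angles(p1, p2, p3, f):
    """Angles between the projection rays of three image points; p1..p3 are
    pixel coordinates relative to the principal point, f is in pixels."""
    rays = [np.array([x, y, f], dtype=float) for (x, y) in (p1, p2, p3)]
    rays = [r / np.linalg.norm(r) for r in rays]
    ang = lambda u, v: float(np.arccos(np.clip(u @ v, -1.0, 1.0)))
    return ang(rays[0], rays[1]), ang(rays[1], rays[2]), ang(rays[0], rays[2])

def range_residuals(S, angles, sides):
    """Law-of-cosines residuals of the range equations of step 309:
    S_i^2 + S_j^2 - 2 S_i S_j cos(angle_ij) - S_ij^2 = 0 at a solution."""
    S1, S2, S3 = S
    a12, a23, a13 = angles
    S12, S23, S31 = sides
    return np.array([
        S1 ** 2 + S2 ** 2 - 2 * S1 * S2 * np.cos(a12) - S12 ** 2,
        S2 ** 2 + S3 ** 2 - 2 * S2 * S3 * np.cos(a23) - S23 ** 2,
        S1 ** 2 + S3 ** 2 - 2 * S1 * S3 * np.cos(a13) - S31 ** 2,
    ])
```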
Step 310: Build the space resection equations from the three control points and the three imaging distances, and solve for the imaging position and the rotation matrix R representing the attitude.
Here the rotation matrix R is:

    R = (E − A)⁻¹ (E + A)⁻¹, with

    A = |  1  −c   b |
        |  c   1  −a |
        | −b   a   1 |

where E is the identity matrix and a, b, c are the three parameters determined by the space resection equations.
Specifically, referring to Fig. 6, which illustrates the sources of the known parameters in the position and attitude calculation: PS is the same as in Fig. 5, namely the optical center; Si is the i-th control point; pi is the image of the i-th control point; αi is the imaging azimuth angle of the projection ray of the i-th control point; and βi is the elevation angle of the projection ray of the i-th control point.
In embodiment one, the imaging position PS(XS, YS, ZS) and the rotation matrix R of the attitude relative to the reference coordinate system are solved: with the rotation matrix R as above, substituting the parameters of the three control points in turn into the resection formula forms a system of equations, from which the position PS and the rotation matrix R representing the attitude are solved.
Substituting the parameters of the three control points into the formula yields three matrix expressions, from which a, b, c and the imaging position PS(XS, YS, ZS) can be determined; once a, b, c are determined, the rotation matrix R is determined.
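The patent's printed matrix expression for R appears to be a rendering artifact; a standard construction that produces a proper rotation matrix from three parameters a, b, c is the Cayley transform, assumed in this sketch.

```python
import numpy as np

def rotation_from_abc(a, b, c):
    """Build a rotation matrix from parameters a, b, c via the Cayley
    transform R = (E - S)^-1 (E + S) with S skew-symmetric. This is an
    assumed reading of the patent formula; it always yields a proper
    orthogonal matrix (R R^T = E, det R = +1)."""
    S = np.array([[0.0,  -c,   b],
                  [  c, 0.0,  -a],
                  [ -b,   a, 0.0]])
    E = np.eye(3)
    return np.linalg.inv(E - S) @ (E + S)
```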
It should be noted that the computation methods used in embodiment one are merely preferred implementations; other similar methods may also be adopted. For example, other mature resolution alignment techniques may be used, the interest value Iv may be calculated with other similar formulas, or the three non-collinear points with the maximum interest values may be chosen as the three feature points, which is not repeated here.
In the aircraft pose determination method provided by the embodiment of the present invention, the real-time visual image is matched against the pre-acquired benchmark topographic map of the flight area and its landform benchmark texture image, and the position and attitude parameters of the aircraft are obtained by inverse calculation. During the matching of the aircraft's real-time image with the texture map, the resolution of the real-time image is corrected using the imaging-height information, and the real-time image is matched with the benchmark texture map on this basis, which increases the matching reliability. The inverse calculation is combined with the three-dimensional information provided by the topographic map corresponding to the reference map, so the flight position and attitude data of the aircraft can be obtained simultaneously, yielding more navigation information than other methods. With this position and attitude determination method, positioning directly relative to the benchmark topographic map is obtained, with no accumulated error and high precision. Meanwhile, this type of pose determination method uses optical imaging as the sensor input and is a passive sensing mode, with low energy consumption and little interference from external signals. Compared with scene matching methods, the process of generating an orthoimage from the real-time image is eliminated, part of the processing is simplified, and more navigation data is obtained with less computation.
The flow of the aircraft pose determination method has been described in detail above. The method may also be implemented by a corresponding device, whose structure and functions are described in detail below.
An aircraft pose determination device provided by an embodiment of the present invention, referring to Fig. 7, includes:
an acquisition module 71 for acquiring a reference map and a real-time image from an airborne camera, the reference map including a benchmark texture image and a benchmark topographic map;
a correction module 72 for performing resolution correction on the real-time image, the resolution of the corrected real-time image being consistent with that of the benchmark texture image;
a matching module 73 for performing matching positioning on the corrected real-time image and the benchmark texture image, and determining the matching region of the corrected real-time image in the benchmark texture image;
a determination module 74 for choosing at least three non-collinear feature points in the matching region in the benchmark texture image, querying the corresponding height values on the benchmark topographic map according to the feature points, and determining at least three corresponding control points;
a processing module 75 for building the space resection equations according to the control points and determining the imaging position and attitude.
Preferably, referring to Fig. 8, the correction module 72 includes:
a computing unit 721 for determining the resolution of the real-time image according to the equivalent focal length of the real-time imaging and the imaging height, and determining the scaling multiple for correcting the real-time image according to the resolution of the benchmark texture image;
a scaling unit 722 for scaling and resampling the real-time image according to the scaling multiple to obtain a corrected real-time image whose resolution is consistent with that of the benchmark texture image.
Preferably, referring to Fig. 9, the matching module 73 includes:
a choosing unit 731 for choosing a feature point to be matched in the corrected real-time image, the interest value of the feature point to be matched being greater than a first preset threshold, where the interest value Iv is:
Iv = det[M] − 0.04·(tr[M])², M being the covariance matrix of a pixel region of preset length and width centered on the feature point to be matched,

    M = Σ(x,y) w(x, y) | Ix²    Ix·Iy |
                       | Ix·Iy  Iy²   |

det[M] being the determinant of the matrix M, tr[M] the trace of the matrix M, w(x, y) a Gaussian weighting function, Ix the horizontal difference value and Iy the vertical difference value;
a matching unit 732 for calculating the correlation coefficients between a first pixel region of the corrected real-time image and second pixel regions of the benchmark texture image, and selecting the second pixel region with the largest correlation coefficient as the matching region of the benchmark texture image; the first pixel region being a pixel region of preset length and width centered on the feature point to be matched, and each second pixel region being a pixel region of preset length and width centered on a pixel of the benchmark texture image.
Preferably, referring to Fig. 10, the determination module 74 includes:
a first determining unit 741 for determining the interest values of all pixels in the matching region and choosing the pixel with the maximum interest value as the first feature point, where the interest value Iv is:
Iv = det[M] − 0.04·(tr[M])², M being the covariance matrix of a pixel region of preset length and width centered on the feature point to be matched,

    M = Σ(x,y) w(x, y) | Ix²    Ix·Iy |
                       | Ix·Iy  Iy²   |

det[M] being the determinant of the matrix M, tr[M] the trace of the matrix M, w(x, y) a Gaussian weighting function, Ix the horizontal difference value and Iy the vertical difference value;
a second determining unit 742 for determining a second feature point, the second feature point being a pixel whose distance from the first feature point is a first preset distance and whose interest value is greater than a second preset threshold;
a third determining unit 743 for determining a third feature point, the third feature point being a pixel whose distances from the first feature point and the second feature point are both greater than a second preset distance, whose distance from the feature line is greater than a third preset distance, and whose interest value is greater than the second preset threshold; the feature line being the straight line determined by the first feature point and the second feature point.
Preferably, referring to Fig. 11, the processing module 75 includes:
a first construction unit 751 for building range equations according to the three control points and determining the three imaging distances between the imaging position and the three feature points;
a second construction unit 752 for building the space resection equations according to the three control points and the three imaging distances, and solving for the imaging position and the rotation matrix R representing the attitude, where the rotation matrix R is:

    R = (E − A)⁻¹ (E + A)⁻¹, with

    A = |  1  −c   b |
        |  c   1  −a |
        | −b   a   1 |

E being the identity matrix and a, b, c the three parameters determined by the space resection equations.
In the aircraft pose determination method and device provided by the embodiments of the present invention, the real-time visual image is matched against the pre-acquired benchmark topographic map of the flight area and its landform benchmark texture image, and the position and attitude parameters of the aircraft are obtained by inverse calculation. During the matching of the aircraft's real-time image with the texture map, the resolution of the real-time image is corrected using the imaging-height information, and the real-time image is matched with the benchmark texture map on this basis, which increases the matching reliability. The inverse calculation is combined with the three-dimensional information provided by the topographic map corresponding to the reference map, so the flight position and attitude data of the aircraft can be obtained simultaneously, yielding more navigation information than other methods. With this position and attitude determination method, positioning directly relative to the benchmark topographic map is obtained, with no accumulated error and high precision. Meanwhile, this type of pose determination method uses optical imaging as the sensor input and is a passive sensing mode, with low energy consumption and little interference from external signals. Compared with scene matching methods, the process of generating an orthoimage from the real-time image is eliminated, part of the processing is simplified, and more navigation data is obtained with less computation.
The present invention may have embodiments in many different forms. The technical solution of the present invention has been illustrated above by way of example with reference to Figs. 1-11, which does not mean that the specific instances to which the present invention is applied are limited to the specific flows or structures of the embodiments. Those of ordinary skill in the art should appreciate that the specific embodiments presented above are only some examples among many preferred usages, and any embodiment that embodies the claims of the present invention shall fall within the protection scope of the technical solution of the present invention.
Finally, it should be noted that the above are merely preferred embodiments of the present invention and are not intended to limit the present invention. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art may still modify the technical solutions described in the foregoing embodiments or make equivalent substitutions for some of the technical features therein. Any modification, equivalent substitution, improvement, etc. made within the spirit and principles of the present invention shall be included within the protection scope of the present invention.

Claims (8)

1. An aircraft pose determination method, characterized by comprising:
acquiring a reference map and a real-time image from an airborne camera, the reference map comprising a benchmark texture image and a benchmark topographic map;
performing resolution correction on the real-time image, the resolution of the corrected real-time image being consistent with that of the benchmark texture image;
performing matching positioning on the corrected real-time image and the benchmark texture image, and determining the matching region of the corrected real-time image in the benchmark texture image;
choosing at least three non-collinear feature points in the matching region in the benchmark texture image, querying the corresponding height values on the benchmark topographic map according to the feature points, and determining at least three corresponding control points;
building space resection equations according to the control points, and determining the imaging position and attitude;
wherein performing matching positioning on the corrected real-time image and the benchmark texture image and determining the matching region of the corrected real-time image in the benchmark texture image comprises:
choosing a feature point to be matched in the corrected real-time image, the interest value of the feature point to be matched being greater than a first preset threshold, wherein the interest value Iv is:
Iv = det[M] − 0.04·(tr[M])², M being the covariance matrix of a pixel region of preset length and width centered on the feature point to be matched,

    M = Σ(x,y) w(x, y) | Ix²    Ix·Iy |
                       | Ix·Iy  Iy²   |

det[M] being the determinant of the matrix M, tr[M] the trace of the matrix M, w(x, y) a Gaussian weighting function, Ix the horizontal difference value and Iy the vertical difference value;
calculating the correlation coefficients between a first pixel region of the corrected real-time image and second pixel regions of the benchmark texture image, and selecting the second pixel region with the largest correlation coefficient as the matching region of the benchmark texture image; the first pixel region being a pixel region of preset length and width centered on the feature point to be matched, and each second pixel region being a pixel region of preset length and width centered on a pixel of the benchmark texture image.
2. The method according to claim 1, characterized in that performing resolution correction on the real-time image comprises:
determining the resolution of the real-time image according to the equivalent focal length of the real-time imaging and the flying height, and then determining the scaling multiple for correcting the real-time image according to the resolution of the benchmark texture image;
scaling and resampling the real-time image according to the scaling multiple to obtain a corrected real-time image whose resolution is consistent with that of the benchmark texture image.
3. The method according to claim 1 or 2, characterized in that choosing at least three non-collinear feature points comprises:
determining the interest values of all pixels in the matching region, and choosing the pixel with the maximum interest value as the first feature point, wherein the interest value Iv is:
Iv = det[M] − 0.04·(tr[M])², M being the covariance matrix of a pixel region of preset length and width centered on the feature point to be matched,

    M = Σ(x,y) w(x, y) | Ix²    Ix·Iy |
                       | Ix·Iy  Iy²   |

det[M] being the determinant of the matrix M, tr[M] the trace of the matrix M, w(x, y) a Gaussian weighting function, Ix the horizontal difference value and Iy the vertical difference value;
determining a second feature point, the second feature point being a pixel whose distance from the first feature point is a first preset distance and whose interest value is greater than a second preset threshold;
determining a third feature point, the third feature point being a pixel whose distances from the first feature point and the second feature point are both greater than a second preset distance, whose distance from the feature line is greater than a third preset distance, and whose interest value is greater than the second preset threshold; the feature line being the straight line determined by the first feature point and the second feature point.
4. The method according to claim 1 or 2, characterized in that building space resection equations according to the control points and determining the imaging position and attitude comprises:
building range equations according to the three control points, and determining the three imaging distances between the imaging position and the three feature points;
building the space resection equations according to the three control points and the three imaging distances, and solving for the imaging position and the rotation matrix R representing the attitude, wherein the rotation matrix R is:

    R = (E − A)⁻¹ (E + A)⁻¹, with

    A = |  1  −c   b |
        |  c   1  −a |
        | −b   a   1 |

wherein E is the identity matrix and a, b, c are the three parameters determined by the space resection equations.
5. An aircraft pose determination device, characterized by comprising:
an acquisition module for acquiring a reference map and a real-time image from an airborne camera, the reference map comprising a benchmark texture image and a benchmark topographic map;
a correction module for performing resolution correction on the real-time image, the resolution of the corrected real-time image being consistent with that of the benchmark texture image;
a matching module for performing matching positioning on the corrected real-time image and the benchmark texture image, and determining the matching region of the corrected real-time image in the benchmark texture image;
a determination module for choosing at least three non-collinear feature points in the matching region in the benchmark texture image, querying the corresponding height values on the benchmark topographic map according to the feature points, and determining at least three corresponding control points;
a processing module for building space resection equations according to the control points and determining the imaging position and attitude;
wherein the matching module comprises:
a choosing unit for choosing a feature point to be matched in the corrected real-time image, the interest value of the feature point to be matched being greater than a first preset threshold, wherein the interest value Iv is:
Iv = det[M] − 0.04·(tr[M])², M being the covariance matrix of a pixel region of preset length and width centered on the feature point to be matched,

    M = Σ(x,y) w(x, y) | Ix²    Ix·Iy |
                       | Ix·Iy  Iy²   |

det[M] being the determinant of the matrix M, tr[M] the trace of the matrix M, w(x, y) a Gaussian weighting function, Ix the horizontal difference value and Iy the vertical difference value;
a matching unit for calculating the correlation coefficients between a first pixel region of the corrected real-time image and second pixel regions of the benchmark texture image, and selecting the second pixel region with the largest correlation coefficient as the matching region of the benchmark texture image; the first pixel region being a pixel region of preset length and width centered on the feature point to be matched, and each second pixel region being a pixel region of preset length and width centered on a pixel of the benchmark texture image.
6. The device according to claim 5, characterized in that the correction module comprises:
a computing unit for determining the resolution of the real-time image according to the equivalent focal length of the real-time imaging and the flying height, and determining the scaling multiple for correcting the real-time image according to the resolution of the benchmark texture image;
a scaling unit for scaling and resampling the real-time image according to the scaling multiple to obtain a corrected real-time image whose resolution is consistent with that of the benchmark texture image.
7. The device according to claim 5 or 6, characterized in that the determination module comprises:
a first determining unit for determining the interest values of all pixels in the matching region and choosing the pixel with the maximum interest value as the first feature point, wherein the interest value Iv is:
Iv = det[M] − 0.04·(tr[M])², M being the covariance matrix of a pixel region of preset length and width centered on the feature point to be matched,

    M = Σ(x,y) w(x, y) | Ix²    Ix·Iy |
                       | Ix·Iy  Iy²   |

det[M] being the determinant of the matrix M, tr[M] the trace of the matrix M, w(x, y) a Gaussian weighting function, Ix the horizontal difference value and Iy the vertical difference value;
a second determining unit for determining a second feature point, the second feature point being a pixel whose distance from the first feature point is a first preset distance and whose interest value is greater than a second preset threshold;
a third determining unit for determining a third feature point, the third feature point being a pixel whose distances from the first feature point and the second feature point are both greater than a second preset distance, whose distance from the feature line is greater than a third preset distance, and whose interest value is greater than the second preset threshold; the feature line being the straight line determined by the first feature point and the second feature point.
8. The device according to claim 5 or 6, characterized in that the processing module comprises:
a first construction unit for building range equations according to the three control points and determining the three imaging distances between the imaging position and the three feature points;
a second construction unit for building the space resection equations according to the three control points and the three imaging distances, and solving for the imaging position and the rotation matrix R representing the attitude, wherein the rotation matrix R is:

    R = (E − A)⁻¹ (E + A)⁻¹, with

    A = |  1  −c   b |
        |  c   1  −a |
        | −b   a   1 |

wherein E is the identity matrix and a, b, c are the three parameters determined by the space resection equations.
CN201510507925.XA 2015-08-17 2015-08-17 A kind of aircraft pose determines method and device Active CN105043392B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510507925.XA CN105043392B (en) 2015-08-17 2015-08-17 A kind of aircraft pose determines method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510507925.XA CN105043392B (en) 2015-08-17 2015-08-17 A kind of aircraft pose determines method and device

Publications (2)

Publication Number Publication Date
CN105043392A CN105043392A (en) 2015-11-11
CN105043392B true CN105043392B (en) 2018-03-02

Family

ID=54450126

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510507925.XA Active CN105043392B (en) 2015-08-17 2015-08-17 A kind of aircraft pose determines method and device

Country Status (1)

Country Link
CN (1) CN105043392B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105571573B (en) * 2015-12-29 2018-05-01 南京中观软件技术有限公司 Oblique photograph positioning and orientation method and system and Plane location method for determining posture and system
CN105606073A (en) * 2016-01-11 2016-05-25 谭圆圆 Unmanned aerial vehicle processing system and flight state data processing method thereof
CN105825517B (en) * 2016-03-31 2018-09-07 湖北航天技术研究院总体设计所 A kind of image correcting method and system of navigation height error
CN110517319B (en) 2017-07-07 2022-03-15 腾讯科技(深圳)有限公司 Method for determining camera attitude information and related device
CN108917753B (en) * 2018-04-08 2022-02-15 中国人民解放军63920部队 Aircraft position determination method based on motion recovery structure
CN109540173A (en) * 2018-09-17 2019-03-29 江西洪都航空工业集团有限责任公司 A kind of Transfer Alignment of vision auxiliary
WO2021217403A1 (en) * 2020-04-28 2021-11-04 深圳市大疆创新科技有限公司 Method and apparatus for controlling movable platform, and device and storage medium

Citations (2)

Publication number Priority date Publication date Assignee Title
CN103424114A (en) * 2012-05-22 2013-12-04 同济大学 Visual navigation/inertial navigation full combination method
CN103927738A (en) * 2014-01-10 2014-07-16 北京航天飞行控制中心 Planet vehicle positioning method based on binocular vision images in large-distance mode

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
JP5403861B2 (en) * 2006-11-06 2014-01-29 キヤノン株式会社 Information processing apparatus and information processing method

Non-Patent Citations (2)

Title
Image-based self-localization and attitude measurement of a moving platform; Liu Yong; China Doctoral Dissertations Full-text Database, Basic Sciences; 2015-05-15; pp. 66-67, 98 *
Ground hardware-in-the-loop simulation and verification system for scene-matching guidance; Tang Nian; China Master's Theses Full-text Database, Engineering Science & Technology II; 2012-02-15; pp. 36-43, 50-53 *

Similar Documents

Publication Publication Date Title
CN105043392B (en) A kind of aircraft pose determines method and device
Stöcker et al. Quality assessment of combined IMU/GNSS data for direct georeferencing in the context of UAV-based mapping
US10509983B2 (en) Operating device, operating system, operating method, and program therefor
EP3454008B1 (en) Survey data processing device, survey data processing method, and survey data processing program
CN105184776B (en) Method for tracking target
CN105737858B (en) A kind of Airborne Inertial Navigation System attitude parameter calibration method and device
CN106708066A (en) Autonomous landing method of unmanned aerial vehicle based on vision/inertial navigation
CN108235735A (en) Positioning method and device, electronic equipment and computer program product
CN107504981A (en) A kind of attitude of satellite error correcting method and equipment based on laser-measured height data
CN110470304B (en) High-precision target positioning and speed measuring method based on unmanned aerial vehicle photoelectric platform
CN102506867B (en) SINS (strap-down inertia navigation system)/SMANS (scene matching auxiliary navigation system) combined navigation method based on Harris comer matching and combined navigation system
CN106767791A (en) A kind of inertia/visual combination air navigation aid using the CKF based on particle group optimizing
CN107192375B (en) A kind of unmanned plane multiple image adaptive location bearing calibration based on posture of taking photo by plane
CN106408601A (en) GPS-based binocular fusion positioning method and device
CN108896957A (en) The positioning system and method in a kind of unmanned plane control signal source
CN107192376A (en) Unmanned plane multiple image target positioning correction method based on interframe continuity
CN113295174B (en) Lane-level positioning method, related device, equipment and storage medium
CN108364304A (en) A kind of system and method for the detection of monocular airborne target
CN105242252B (en) Downward Trendline Spotlight SAR Imaging radar fix method based on images match
Johnson et al. Design and analysis of map relative localization for access to hazardous landing sites on mars
CN107389968A (en) A kind of unmanned plane fixed-point implementation method and apparatus based on light stream sensor and acceleration transducer
CN107796370A (en) For obtaining the method, apparatus and mobile mapping system of conversion parameter
KR101764222B1 (en) System and method for high precise positioning
CN113340272B (en) Ground target real-time positioning method based on micro-group of unmanned aerial vehicle
CN110515110B (en) Method, device, equipment and computer readable storage medium for data evaluation

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant