CN105004354B - Method for locating targets in UAV visible light and infrared images under large squint angles - Google Patents

Method for locating targets in UAV visible light and infrared images under large squint angles

Info

Publication number
CN105004354B
CN105004354B CN201510347204.7A CN201510347204A CN105004354B CN 105004354 B CN105004354 B CN 105004354B CN 201510347204 A CN201510347204 A CN 201510347204A CN 105004354 B CN105004354 B CN 105004354B
Authority
CN
China
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510347204.7A
Other languages
Chinese (zh)
Other versions
CN105004354A (en
Inventor
Li Hongguang (李红光)
Ding Wenrui (丁文锐)
Liu Jialiang (刘家良)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN201510347204.7A priority Critical patent/CN105004354B/en
Publication of CN105004354A publication Critical patent/CN105004354A/en
Application granted granted Critical
Publication of CN105004354B publication Critical patent/CN105004354B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00 - Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 - Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 - Interpretation of pictures
    • G01C11/30 - Interpretation of pictures by triangulation
    • G01C11/34 - Aerial triangulation

Abstract

The invention discloses a method for locating targets in UAV visible light and infrared images under large squint angles, including Step 1: visible light and infrared image target positioning based on the imaging model; Step 2: extraction and representation of target position error features under multi-factor influence; Step 3: prediction and compensation of target position error under varying heights and large squint angles. Under large squint angle conditions, the invention can effectively improve UAV target location accuracy. The proposed target positioning and error compensation method has a small computational load and can meet the real-time requirements of on-board computing. The invention applies to a variety of image target positioning applications that satisfy the central projection imaging model, such as visible light images and infrared images.

Description

Method for locating targets in UAV visible light and infrared images under large squint angles
Technical field
The invention belongs to the technical field of remote sensing image processing, and in particular relates to a method for locating targets in UAV visible light and infrared images under large squint angles.
Background technology
As an advanced reconnaissance data processing technology, UAV target positioning has important application value in civil and military fields.
Object localization methods based on UAV reconnaissance images mainly fall into the following three kinds: (1) target positioning based on image matching; (2) target positioning based on the imaging model; (3) target positioning based on space intersection. Method 1 is accurate but computationally time-consuming, and it relies on control points or orthophotos, so it is hard to process in real time and to apply widely. Method 2 has a small computational load, but its positioning accuracy is easily affected by factors such as parameter errors and imaging attitude. Method 3, space intersection positioning, is inconvenient to implement; it usually performs single-point ranging with limited accuracy and can only position the ranging point in the image.
During UAV flight, the most common real-time target localization method is the second, based on the imaging model. Under fixed error conditions, the accuracy of this method decreases as the squint angle (the angle between the imaging sensor's optical axis and the vertically downward direction) increases. According to flight data statistics, when a UAV performs ordinary reconnaissance missions, the squint angle falls between 0 and 30 degrees with a probability of about 20%, between 30 and 60 degrees with a probability of about 60%, and above 60 degrees with a probability of about 20%. In some special reconnaissance cases the squint angle exceeds 70 degrees. A large squint angle severely degrades target location accuracy, greatly weakening the targeting capability of the UAV system.
If a UAV's target location accuracy is 70 m CEP when looking vertically down (squint angle of 0 degrees), then the position error grows as the squint angle increases; in particular, when the squint angle exceeds 50 degrees, the error growth approximates an exponential curve, as shown in Fig. 1.
Summary of the invention
The purpose of the invention is to solve the problem of UAV target positioning under large squint angle conditions. It proposes a method for locating targets in UAV visible light and infrared images under large squint angles which, on the basis of imaging-model-based target positioning, predicts and compensates the target position error vector and thereby improves target location accuracy.
The UAV visible light and infrared image target localization method under large squint angles of the invention includes the following steps:
Step 1: Visible light and infrared image target positioning based on the imaging model.
Visible light and infrared image target positioning based on the imaging model can be divided into building the collinearity equation based on the central projection imaging model, system geometric correction, and target position calculation.
Step 2: Extraction and representation of target position error features under multi-factor influence.
Analyze the various factors affecting the positioning accuracy of corrected UAV reconnaissance images; according to the coupling relations among the factors, simplify the error-affecting factors into equivalent ones, determine the characteristic factors finally used for position error prediction and compensation, and solve the expressions of the equivalent characteristic factors.
Step 3: Prediction and compensation of target position error under varying heights and large squint angles.
Establish the mathematical model of position error prediction with the characteristic factors obtained in Step 2, choose sufficient training samples with known position errors to train the model, and obtain the parameters of the prediction model. Use these parameters to predict the position error of the reconnaissance image to be corrected and obtain the error vector of the image; superimpose the error vector on the target position from Step 1 to finally determine the compensated image coordinates.
The advantages of the invention are:
(1) Under large squint angle conditions, the invention can effectively improve UAV target location accuracy;
(2) The target positioning and error compensation method proposed by the invention has a small computational load and can meet the real-time requirements of on-board computing;
(3) The invention applies to a variety of image target positioning applications that satisfy the central projection imaging model, such as visible light images and infrared images.
Brief description of the drawings
Fig. 1 is a chart illustrating the growth of the position error with the squint angle;
Fig. 2 is a schematic diagram of the coordinate system conversions in the geometric correction process of the invention;
Fig. 3 is a schematic diagram of the central collinearity condition equation used in establishing the correction model of the invention;
Fig. 4 is a schematic diagram of the position error factor equivalence model designed by the invention;
Fig. 5 is a schematic diagram of solving the optical axis endpoint coordinates of the equivalent characteristic factor;
Fig. 6 shows training sample selection according to aircraft height, squint angle and imaging direction angle;
Fig. 7 is a distribution map of training sample selection at the working flight height;
Fig. 8 is a flow chart of the method of the invention.
Embodiment
The present invention is described in further detail below in conjunction with the drawings and embodiments.
The invention is a method for locating targets in UAV visible light and infrared images under large squint angles; the flow is shown in Fig. 8, and the specific implementation steps are as follows:
Step 1: Visible light and infrared image target positioning based on the imaging model;
Specifically:
(1) Reconnaissance image system geometric correction based on coordinate system conversion;
The process of image correction is, in fact, the process of converting from the image coordinate system to the geodetic coordinate system.
System-level geometric correction based on the central projection imaging model requires establishing the transformation relations between the coordinate systems. The conversion from the image plane to the spatial rectangular coordinate system goes through the chain "pixel coordinate system (I system) -> camera coordinate system (C system) -> UAV coordinate system (P system) -> north-up-east coordinate system (N system) -> spatial rectangular coordinate system (G system) -> geodetic coordinate system (E system)"; Fig. 2 gives a schematic diagram of each coordinate system.
Between the pixel coordinate system (I system) and the camera coordinate system (C system) there exist a translation transformation T_I and a coordinate-axis flip transformation by which x is changed to its opposite number. For installation reasons, the center of the camera coordinate system (C system) does not coincide with the center of the aircraft coordinate system (P system), so a translation transformation T_C exists between the C and P systems. In addition, since the camera platform rotates relative to the aircraft platform with two degrees of freedom, in azimuth and in pitch, a rotation transformation M_C also exists between the C and P systems. The north-up-east coordinate system (N system) is established with the P-system origin as its origin; the P system rotates relative to the N system with three degrees of freedom: heading, pitch and roll. The change from the P system to the N system is represented by the rotation transformation M_p.
The transformation of the image plane F_I(x_I, y_I, z_I) in the I system to the image plane F_N(x_N, y_N, z_N) in the N system can be expressed as the following process, formula (1):

$$F_N = M_p \cdot M_c \cdot T_c \cdot Y_{inv} \cdot T_I \cdot F_I \quad (1)$$

where F_N represents the coordinates of the image after transformation into the N system, any point of the image being representable by (x_N, y_N, z_N); and F_I represents the coordinates of the image in the I system, any point being representable by (x_I, y_I, z_I).
The XOY plane of the G system is the Gauss-Krüger projection plane, and its origin is the intersection of the Greenwich meridian and the equator. The XOY plane is at zero altitude. The Z axis points upward, and XYZ satisfies the left-hand rule. The N system is parallel to the G system, differing only in origin, so a translation transformation exists between them; its three translation components are exactly the projections of the origin of the aircraft coordinate system (P system), which can be taken as the aircraft GPS position, onto the three coordinate axes of the G system. The conversion from the G system to the E system is carried out according to the prescribed projection (Gauss-Krüger projection).
In summary, the image plane is transformed from the I system to the N system by formula (1); then, using the relations among the N, G and E systems, spatial similar-triangle relations and Gauss-Krüger projection theory, the transformation of the image plane from the I system to the E system is finally completed. This process is the reconnaissance image system geometric correction.
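For illustration, the following minimal Python sketch composes the coordinate-system chain of formula (1) with homogeneous 4x4 matrices. All numerical parameters are hypothetical, and M_C and M_P are reduced to single-axis rotations for brevity; in the patent the platform transform carries azimuth and pitch, and the airframe transform carries heading, pitch and roll. The axis-flip matrix is the one written Y_inv in formula (1) (the claim writes it X_inv); it negates x as the text describes.

```python
import numpy as np

def translation(tx, ty, tz):
    """4x4 homogeneous translation matrix."""
    T = np.eye(4)
    T[:3, 3] = [tx, ty, tz]
    return T

def rot_z(angle_rad):
    """4x4 homogeneous rotation about the vertical axis."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    R = np.eye(4)
    R[0, 0], R[0, 1] = c, -s
    R[1, 0], R[1, 1] = s, c
    return R

# Axis flip for the I -> C step: x is changed to its opposite number.
Y_inv = np.diag([-1.0, 1.0, 1.0, 1.0])

# Hypothetical parameters: image-center offset, camera lever arm,
# platform azimuth, airframe heading (pitch/roll omitted).
T_I = translation(-512.0, -384.0, 0.0)
T_C = translation(0.10, 0.00, -0.20)
M_C = rot_z(np.deg2rad(15.0))
M_P = rot_z(np.deg2rad(-30.0))

# Formula (1): F_N = M_p . M_c . T_c . Y_inv . T_I . F_I
chain = M_P @ M_C @ T_C @ Y_inv @ T_I
pixel_h = np.array([640.0, 400.0, 0.0, 1.0])  # homogeneous image-plane point
print(chain @ pixel_h)
```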
(2) Target positioning based on the collinearity equation;
From the imaging model, visible light and infrared images both follow central projection: at the imaging moment, the object point, the perspective center and the image point lie on one straight line, i.e. they satisfy the collinearity condition. The spatial mapping relation between an image point and its corresponding ground point can therefore be built from the collinearity condition, so as to determine the position of the image point and realize target positioning in the image.
As in Fig. 3, suppose the image point a has coordinates (u, v) in the I system. By the coordinate system conversion of formula (1), the coordinates (X_da, Y_da, Z_da) of a in the N system can be obtained. Let the object point A corresponding to a have G-system coordinates (X_A, Y_A, Z_A) and the perspective center S have G-system coordinates (X_S, Y_S, Z_S). Since the aircraft's local N system and the G system are parallel to each other, the relation between the N-system coordinates (X_da, Y_da, Z_da) of the image point and the G-system coordinates (X_A, Y_A, Z_A) of the corresponding object point follows from similar triangles:

$$\frac{X_{da}}{X_A - X_S} = \frac{Y_{da}}{Y_A - Y_S} = \frac{Z_{da}}{Z_A - Z_S} = \frac{1}{\lambda} \quad (2)$$

Writing the above formula in matrix form:

$$\begin{bmatrix} X_{da} \\ Y_{da} \\ Z_{da} \end{bmatrix} = \frac{1}{\lambda} \begin{bmatrix} X_A - X_S \\ Y_A - Y_S \\ Z_A - Z_S \end{bmatrix} \quad (3)$$

where λ is a scale factor, and (X_S, Y_S, Z_S) is obtained by Gauss-Krüger projection from the E-system coordinates (B, L, H) of the aircraft.
Suppose the target point has I-system coordinates (u_o, v_o) in the corrected image; then the E-system coordinates (x_o, y_o) of the target point can be calculated by:

$$\begin{cases} x_o = \dfrac{Lng_{east} - Lng_{west}}{Width}\, u_o + Lng_{west} \\[6pt] y_o = Lat_{north} - \dfrac{Lat_{north} - Lat_{south}}{Height}\, v_o \end{cases} \quad (4)$$

where Lat_north, Lat_south, Lng_east and Lng_west are respectively the north latitude, south latitude, east longitude and west longitude boundary values of the corrected image, and Width and Height are respectively the width and height of the corrected image.
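As a sketch of this target position calculation, the Python function below maps a pixel of the corrected image to E-system longitude and latitude exactly as formula (4) prescribes; the image bounds and pixel in the example are hypothetical values.

```python
def pixel_to_geo(u_o, v_o, lat_north, lat_south, lng_east, lng_west,
                 width, height):
    """Formula (4): corrected-image pixel (u_o, v_o) -> E-system (x_o, y_o)."""
    x_o = (lng_east - lng_west) / width * u_o + lng_west      # longitude
    y_o = lat_north - (lat_north - lat_south) / height * v_o  # latitude
    return x_o, y_o

# Hypothetical corrected image covering a small area around 116.4E, 40.0N:
print(pixel_to_geo(512, 384, 40.01, 39.99, 116.42, 116.38, 1024, 768))
# -> (116.40, 40.00)
```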
Step 2: Extraction and representation of target position error features under multi-factor influence;
Specifically:
(1) Selecting target position error features
The target positioning equation introduces multivariate data such as image size, imaging height, image position, pixel size, focal length, aircraft heading angle, aircraft pitch angle, aircraft roll angle, platform azimuth and platform elevation angle. The combined action of multiple factors makes the analysis of the target position error difficult.
Analysis shows that, during flight, the changes of the aircraft attitude angles (heading angle, pitch angle, roll angle) and of the platform attitude angles (platform azimuth, platform elevation angle) are ultimately reflected together in the pointing of the sensor optical axis in three-dimensional space. When the UAV height, aircraft pitch angle, aircraft roll angle and platform elevation angle remain constant, as the UAV heading angle and the platform azimuth vary (0-360 degrees), the ground projection of the optical axis pointing traces a circle centered on the ground projection of the UAV. As the angle between the optical axis and the vertically downward direction (the squint angle) gradually increases from 0 degrees, the radius of the circle keeps increasing. At different heights and different squint angles, two optical axis pointings may also fall on the same circle, as in Fig. 4.
It can be concluded from the figure that, as the squint angle and the height change, the radius of the concentric circles can range from 0 to infinity. This shows that the optical axis pointing and the height are together representative of the whole reconnaissance region, so the reconnaissance region and the target position can be analyzed with the two factors of optical axis pointing and height. Considering the requirement of directionality, the optical axis pointing is decomposed into the squint angle (α in Fig. 4) and the angle between the horizontal projection of the optical axis and due north (the direction angle β in Fig. 4). Therefore, the squint angle, the imaging direction angle and the height are taken as the basic influencing parameters for subsequent analysis and are set as the target position error features.
(2) Target position error feature representation
When the UAV has no attitude angles, the airframe coordinate system coincides with the north-up-east coordinate system. In this case, assuming the optical axis length is the unit value 1 (the thick line in Fig. 5), the three-dimensional coordinate representation (x, y, z) of the optical axis endpoint in the north-up-east coordinate system is calculated, as shown in Fig. 5.
In the formula, the first angle represents the platform elevation angle and κ represents the platform azimuth.
When the attitude of the UAV itself changes, i.e. the N system is transformed into the P system, the rotation of the airframe drives the optical axis pointing to change. The three-dimensional coordinate representation (x', y', z') of the optical axis endpoint in the new coordinate system (P system) is then re-obtained by coordinate system conversion (coordinate rotation); in the corresponding formula, the first angle represents the aircraft pitch angle, ω2 the aircraft roll angle and κ2 the aircraft heading angle.
Using the three-dimensional coordinates (x', y', z') of the optical axis, the squint angle α and the imaging direction angle β can be solved:

$$\tan\alpha = \frac{|z'|}{\sqrt{(x')^2 + (y')^2}} \quad (7)$$

$$\alpha = \arctan\left(\frac{|z'|}{\sqrt{(x')^2 + (y')^2}}\right) \quad (8)$$

$$\begin{cases} \sin\beta = \dfrac{x'}{\sqrt{(x')^2 + (y')^2}} \\[6pt] \cos\beta = \dfrac{y'}{\sqrt{(x')^2 + (y')^2}} \end{cases} \quad (9)$$

$$\beta = \begin{cases} \arcsin\left(\dfrac{|x'|}{\sqrt{(x')^2 + (y')^2}}\right) & x' > 0,\ y' > 0 \\[6pt] \pi - \arcsin\left(\dfrac{|x'|}{\sqrt{(x')^2 + (y')^2}}\right) & x' > 0,\ y' < 0 \\[6pt] \pi + \arcsin\left(\dfrac{|x'|}{\sqrt{(x')^2 + (y')^2}}\right) & x' < 0,\ y' < 0 \\[6pt] \dfrac{3\pi}{2} + \arcsin\left(\dfrac{|x'|}{\sqrt{(x')^2 + (y')^2}}\right) & x' < 0,\ y' > 0 \end{cases} \quad (10)$$
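The angle recovery of formulas (7)-(10) can be sketched in Python as follows. The code transcribes the formulas exactly as written, including the |z'| numerator of formula (8) and the four quadrant cases of formula (10); points on the quadrant boundaries (x' = 0 or y' = 0) are not covered by formula (10) and are left unhandled here.

```python
import math

def squint_and_direction(xp, yp, zp):
    """Squint angle alpha per formula (8) and imaging direction angle
    beta per formula (10), from the optical-axis endpoint (x', y', z')."""
    r = math.sqrt(xp * xp + yp * yp)
    alpha = math.atan(abs(zp) / r)   # formula (8) as written
    a = math.asin(abs(xp) / r)       # common arcsin term of formula (10)
    if xp > 0 and yp > 0:
        beta = a
    elif xp > 0 and yp < 0:
        beta = math.pi - a
    elif xp < 0 and yp < 0:
        beta = math.pi + a
    elif xp < 0 and yp > 0:
        beta = 1.5 * math.pi + a
    else:
        raise ValueError("boundary case not covered by formula (10)")
    return alpha, beta

# Example: optical axis tilted into the x' > 0, y' < 0 quadrant.
print(squint_and_direction(0.3, -0.4, -0.6))
```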
Step 3: Prediction and compensation of target position error under varying heights and large squint angles;
Specifically:
(1) Modeling the target position error prediction model
The mathematical prediction model of the target position error is established by the method of error regression analysis, and the features chosen in Step 2 together with the predicted values form the sample data for model training. Observation of experimental data shows that the relation between the UAV target position error and features such as flight height and squint angle is nonlinear; the specific relation still needs further research and analysis.
According to the squint angle α, corrected image data with known errors form the corresponding training samples. The total number of training samples is denoted m, and the i-th training sample is denoted (x^(i), y^(i)), where y^(i) is the sample error value and x^(i) is the sample feature vector. Containing n features, it can be written x^(i) = [x_0, x_1, x_2, …, x_n]^T, where x_0 = 1, x_1 = H·tan(α), x_2 = H, x_3 = β, and the nonlinearity between error and features is embodied in x_4 = x_1², x_5 = x_2², x_6 = x_3², x_7 = x_1³, x_8 = x_2³, x_9 = x_3³, …;
The mathematical form of the constructed prediction model is:

$$y = h_\theta(x) = \theta_0 x_0 + \theta_1 x_1 + \theta_2 x_2 + \cdots + \theta_n x_n = \sum_{i=0}^{n} \theta_i x_i = \theta^T x \quad (11)$$

where h_θ(x) is the model to be predicted on feature x, and θ = [θ_0, θ_1, θ_2, …, θ_n]^T are the parameters to be determined in the model.
For the above model, the parameters of the prediction model must be designed according to the error characteristics reflected by real data: different aircraft types or imaging platforms have different target position error characteristics, and their prediction models also differ.
Next, a mechanism is needed to assess whether the parameters θ are good enough or meet the estimation requirement, so the h_θ(x) function is assessed with the following cost function:

$$J(\theta) = \frac{1}{2} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2 \quad (12)$$

The cost function J(θ) is minimized with the gradient descent method to solve for the optimal parameters θ. First take the partial derivatives of the cost function:

$$\nabla_\theta J = \left[ \frac{\partial J}{\partial \theta_0},\ \frac{\partial J}{\partial \theta_1},\ \ldots,\ \frac{\partial J}{\partial \theta_n} \right]^T \quad (13)$$
Then the value of θ is calculated iteratively:

$$\theta := \theta - k \nabla_\theta J \quad (14)$$

where k is the learning rate: the larger k is set, the faster the convergence, the shorter the learning time and the lower the precision; the smaller k is set, the slower the convergence, the longer the learning time and the higher the precision.
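A minimal Python sketch of this training step follows: it builds the feature vector [1, H·tan α, H, β] together with the squared and cubed terms, and minimizes the cost of formula (12) with the gradient update of formula (14). The sample values and the learning rate are illustrative, and feature standardization is added for numerical stability; the patent does not prescribe it.

```python
import numpy as np

def build_features(H, alpha, beta):
    """x0=1, x1=H*tan(alpha), x2=H, x3=beta, plus their squares and cubes."""
    base = np.array([H * np.tan(alpha), H, beta])
    return np.concatenate(([1.0], base, base ** 2, base ** 3))

def train(X, y, k=0.05, iters=20000):
    """Gradient descent on the cost of formula (12) via the rule of (14)."""
    mu, sigma = X.mean(axis=0), X.std(axis=0)
    mu[0], sigma[0] = 0.0, 1.0        # keep the intercept column x0 = 1
    sigma[sigma == 0.0] = 1.0
    Xs = (X - mu) / sigma             # standardized features
    theta = np.zeros(Xs.shape[1])
    m = len(y)
    for _ in range(iters):
        grad = Xs.T @ (Xs @ theta - y) / m   # gradient per formula (13)
        theta -= k * grad                    # formula (14)
    return theta, mu, sigma

# Hypothetical training samples (height m, squint rad, direction rad)
# with illustrative position-error lengths in meters:
conds = [(3000, np.deg2rad(40), np.deg2rad(90)),
         (5000, np.deg2rad(55), np.deg2rad(180)),
         (7000, np.deg2rad(65), np.deg2rad(270)),
         (6000, np.deg2rad(60), np.deg2rad(45))]
X = np.vstack([build_features(*c) for c in conds])
y_err = np.array([90.0, 160.0, 320.0, 240.0])
theta, mu, sigma = train(X, y_err)
```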
(2) Selection of training samples under different heights and different squint angles
Suppose that when a UAV performs reconnaissance missions, the imaging height varies from 3000 to 7000 meters, the squint angle varies from 0 to 90 degrees, and the imaging direction angle is uniformly distributed over 0 to 360 degrees.
To improve the adaptability of the target position error prediction model to varied conditions, the number of samples under various imaging heights, squint angles and imaging direction angles must be increased. To effectively select the samples the prediction model needs, to count the number of samples in a given region of the prediction space, and to judge the credibility of the current prediction result, a method of representing and measuring the sample distribution in the prediction space must be established.
The invention divides resolutions along the three components of imaging height, squint angle and imaging direction angle, obtaining the imaging height resolution, the squint angle resolution and the imaging direction angle resolution, and designs the sample distribution representation and measurement method according to these three resolutions.
For representing and measuring the sample distribution, two methods can be used (see the sketch after this paragraph). Method 1, as in Fig. 6: establish a three-dimensional coordinate system with the three components of imaging height, squint angle and imaging direction angle, express the distribution of the training samples in this "three-dimensional space", and measure the distance between a sample to be tested and the training samples. Training uses the aircraft's imaging height, squint angle and imaging direction angle; an ideal training sample set should fill the whole cuboid space determined by the working height range, the squint angle range and the imaging direction angle range. When the position in the coordinate system of Fig. 6 of a sample to be predicted, determined by its imaging height, squint angle and imaging direction angle, falls within the range covered by the training samples, a good prediction result can be obtained. Method 2, as in Fig. 7: transform the three components of imaging height, squint angle and imaging direction angle onto the two-dimensional disk of optical axis pointings, and express and measure the samples in the plane. Since a UAV often works at a relatively stable height, the selection of training samples is determined more by the squint angle and the imaging direction angle. When the training samples fill the corresponding annulus well, a good prediction effect can be reached.
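The sketch below illustrates the coverage measurement of Method 1: training samples are binned in the (imaging height, squint angle, imaging direction angle) space at chosen resolutions, and the cell of a sample to be predicted is checked for support; a sparsely populated cell means the current prediction result is less credible. The resolutions and the sample values are hypothetical; the patent only prescribes that a resolution be set on each of the three components.

```python
import numpy as np

def coverage(train_pts, query, res=(500.0, 5.0, 15.0)):
    """Count training samples that fall in the same (height, squint,
    direction) cell as the query, at the given bin resolutions."""
    res = np.asarray(res, dtype=float)
    cells = np.floor(np.asarray(train_pts, dtype=float) / res)
    qcell = np.floor(np.asarray(query, dtype=float) / res)
    return int(np.all(cells == qcell, axis=1).sum())

# Hypothetical samples: (height m, squint deg, direction deg)
train_pts = [(5000, 52, 100), (5100, 54, 110), (4900, 51, 95), (5050, 58, 130)]
print(coverage(train_pts, (5020, 53, 105)))  # -> 1: weak support in this cell
```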
(3) Prediction and compensation of target position error
Using the position error prediction model established above and the determined training samples, the prediction model is trained to obtain the model parameters. With these parameters, position error prediction is carried out on the image to be corrected, determining the error vector ΔE(x_e, y_e) of the image. This error is compensated onto the target longitude and latitude obtained in Step 1 to obtain more accurate longitude/latitude information (x_o', y_o'), as shown in formula (15):

$$\begin{cases} x_o' = x_o + x_e \\ y_o' = y_o + y_e \end{cases} \quad (15)$$

where x_e represents the predicted error length in the x direction under the geodetic coordinate system and y_e represents the predicted error length in the y direction under the geodetic coordinate system.
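A tiny Python sketch of the compensation of formula (15) follows; the Step-1 fix and the predicted error vector are hypothetical values, and expressing the error components in degrees of longitude/latitude is an assumption about units that the patent leaves implicit.

```python
def compensate(x_o, y_o, x_e, y_e):
    """Formula (15): superimpose the predicted error vector (x_e, y_e)
    on the Step-1 longitude/latitude to get the compensated position."""
    return x_o + x_e, y_o + y_e

# Hypothetical Step-1 fix and predicted error vector:
x_o, y_o = 116.4023, 39.9987
x_e, y_e = -0.0012, 0.0008
print(compensate(x_o, y_o, x_e, y_e))  # -> (116.4011, 39.9995)
```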

Claims (1)

1. A method for locating targets in UAV visible light and infrared images under large squint angles, the specific implementation steps being as follows:
Step 1: Visible light and infrared image target positioning based on the central projection imaging model;
Specifically:
(1) Reconnaissance image system geometric correction based on coordinate system conversion;
System-level geometric correction is carried out based on the central projection imaging model, establishing the transformation relations between the coordinate systems; the conversion of the image plane to the spatial rectangular coordinate system proceeds: pixel coordinate system (I system) -> camera coordinate system (C system) -> UAV coordinate system (P system) -> north-up-east coordinate system (N system) -> spatial rectangular coordinate system (G system) -> geodetic coordinate system (E system);
A translation transformation T_I and a coordinate-axis flip transformation exist between the pixel coordinate system (I system) and the camera coordinate system (C system):
x is changed to its opposite number;
A translation transformation T_C and a rotation transformation M_C exist between the camera coordinate system (C system) center and the UAV coordinate system (P system);
The north-up-east coordinate system (N system) is established with the P-system origin as origin; the P system rotates relative to the N system with three degrees of freedom (heading, pitch, roll), and the change from the P system to the N system is represented by the rotation transformation M_p;
The image plane F_I(x_I, y_I, z_I) in the I system is transformed to the image plane F_N(x_N, y_N, z_N) in the N system as:

$$F_N = M_p \cdot M_c \cdot T_c \cdot X_{inv} \cdot T_I \cdot F_I \quad (1)$$

A translation transformation exists between the N system and the G system; its three translation components are the projections of the UAV coordinate system (P system) origin onto the three coordinate axes of the G system;
The conversion from the G system to the E system is carried out according to the prescribed projection;
In summary, the image plane is transformed from the I system to the N system by formula (1); then, using the relations among the N, G and E systems, spatial similar triangles and Gauss-Krüger projection theory, the transformation of the image plane from the I system to the E system is finally completed;
(2) Target positioning based on the collinearity equation;
Suppose the image point a has coordinates (u, v) in the I system; then, by the coordinate system conversion of formula (1), the coordinates (X_da, Y_da, Z_da) of a in the N system are obtained. Let the object point A corresponding to a have G-system coordinates (X_A, Y_A, Z_A) and the perspective center S have G-system coordinates (X_S, Y_S, Z_S); the relation between the N-system coordinates (X_da, Y_da, Z_da) of the image point a and the G-system coordinates (X_A, Y_A, Z_A) of the corresponding object point A is:
$$\frac{X_{da}}{X_A - X_S} = \frac{Y_{da}}{Y_A - Y_S} = \frac{Z_{da}}{Z_A - Z_S} = \frac{1}{\lambda} \quad (2)$$
Writing the above formula in matrix form:
$$\begin{bmatrix} X_{da} \\ Y_{da} \\ Z_{da} \end{bmatrix} = \frac{1}{\lambda} \begin{bmatrix} X_A - X_S \\ Y_A - Y_S \\ Z_A - Z_S \end{bmatrix} \quad (3)$$
where λ is a scale factor, and (X_S, Y_S, Z_S) is obtained by Gauss-Krüger projection from the E-system coordinates (B, L, H) of the UAV;
Suppose the target point has I-system coordinates (u_o, v_o) in the corrected image; then the E-system coordinates (x_o, y_o) of the target point are:
$$\begin{cases} x_o = \dfrac{Lng_{east} - Lng_{west}}{Width}\, u_o + Lng_{west} \\[6pt] y_o = Lat_{north} - \dfrac{Lat_{north} - Lat_{south}}{Height}\, v_o \end{cases} \quad (4)$$
where Lat_north, Lat_south, Lng_east and Lng_west are respectively the north latitude, south latitude, east longitude and west longitude boundary values of the corrected image, and Width and Height are respectively the width and height of the corrected image;
Step 2: Extraction and representation of target position error features under multi-factor influence;
Specifically:
(1) Selecting target position error features;
The squint angle, the imaging direction angle and the height are set as the target position error features;
(2) Target position error feature representation;
When the UAV has no attitude angles, the UAV coordinate system coincides with the north-up-east coordinate system; assuming the optical axis length is the unit value 1, the three-dimensional coordinate representation (x, y, z) of the optical axis endpoint in the north-up-east coordinate system is obtained:
In the formula, the first angle represents the platform elevation angle and κ represents the platform azimuth;
When the attitude of the UAV itself changes, i.e. the N system is transformed into the P system, the rotation of the airframe drives the optical axis pointing to change; at this point, the three-dimensional coordinate representation (x', y', z') of the optical axis endpoint in the new coordinate system is re-obtained by coordinate system conversion:
where the first angle represents the UAV pitch angle, ω2 the UAV roll angle and κ2 the UAV heading angle;
Using the three-dimensional coordinates (x', y', z') of the optical axis, the squint angle α and the imaging direction angle β are solved:
$$\tan\alpha = \frac{|z'|}{\sqrt{(x')^2 + (y')^2}} \quad (7)$$

$$\alpha = \arctan\left(\frac{|z'|}{\sqrt{(x')^2 + (y')^2}}\right) \quad (8)$$

$$\begin{cases} \sin\beta = \dfrac{x'}{\sqrt{(x')^2 + (y')^2}} \\[6pt] \cos\beta = \dfrac{y'}{\sqrt{(x')^2 + (y')^2}} \end{cases} \quad (9)$$

$$\beta = \begin{cases} \arcsin\left(\dfrac{|x'|}{\sqrt{(x')^2 + (y')^2}}\right) & x' > 0,\ y' > 0 \\[6pt] \pi - \arcsin\left(\dfrac{|x'|}{\sqrt{(x')^2 + (y')^2}}\right) & x' > 0,\ y' < 0 \\[6pt] \pi + \arcsin\left(\dfrac{|x'|}{\sqrt{(x')^2 + (y')^2}}\right) & x' < 0,\ y' < 0 \\[6pt] \dfrac{3\pi}{2} + \arcsin\left(\dfrac{|x'|}{\sqrt{(x')^2 + (y')^2}}\right) & x' < 0,\ y' > 0 \end{cases} \quad (10)$$
Step 3: Prediction and compensation of target position error under varying heights and large squint angles;
Specifically:
(1) Modeling the target position error prediction model;
According to the squint angle α, corrected image data with known errors form the corresponding training samples. The total number of training samples is denoted m, and the i-th training sample is denoted (x^(i), y^(i)), where y^(i) is the sample error value and x^(i) is the sample feature vector; containing n features, it can be written x^(i) = [x_0, x_1, x_2, …, x_n]^T, where x_0 = 1, x_1 = H·tan(α), x_2 = H, x_3 = β, and the nonlinearity between error and features is embodied in x_4 = x_1², x_5 = x_2², x_6 = x_3², x_7 = x_1³, x_8 = x_2³, x_9 = x_3³, …;
The mathematical form of the constructed prediction model is:
$$y = h_\theta(x) = \theta_0 x_0 + \theta_1 x_1 + \theta_2 x_2 + \cdots + \theta_n x_n = \sum_{i=0}^{n} \theta_i x_i = \theta^T x \quad (11)$$
where h_θ(x) is the model to be predicted on feature x, and θ = [θ_0, θ_1, θ_2, …, θ_n]^T are the parameters to be determined in the model;
The h_θ(x) function is assessed with the following cost function:
$$J(\theta) = \frac{1}{2} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2 \quad (12)$$
The cost function J(θ) is minimized with the gradient descent method to solve for the optimal parameters θ; first take the partial derivatives of the cost function:
$$\nabla_\theta J = \left[ \frac{\partial J}{\partial \theta_0},\ \frac{\partial J}{\partial \theta_1},\ \ldots,\ \frac{\partial J}{\partial \theta_n} \right]^T \quad (13)$$
Then the value of θ is calculated iteratively:
$$\theta := \theta - k \nabla_\theta J \quad (14)$$
where k represents the learning rate;
(2) Selection of training samples under different heights and different squint angles;
Resolutions are divided along the three components of imaging height, squint angle and imaging direction angle, obtaining the imaging height resolution, the squint angle resolution and the imaging direction angle resolution; the sample distribution representation and measurement method is designed according to the three resolutions;
(3) Prediction and compensation of target position error;
Using the position error prediction model established above and the determined training samples, the prediction model is trained to obtain the model parameters; position error prediction is carried out on the image to be corrected with these parameters, determining the error vector ΔE(x_e, y_e) of the image; this error is compensated onto the target longitude and latitude obtained in Step 1 to obtain more accurate longitude/latitude information (x_o', y_o'), as shown in formula (15):
$$\begin{cases} x_o' = x_o + x_e \\ y_o' = y_o + y_e \end{cases} \quad (15)$$
where x_e represents the predicted error length in the x direction under the geodetic coordinate system and y_e represents the predicted error length in the y direction under the geodetic coordinate system.
CN201510347204.7A 2015-06-19 2015-06-19 Method for locating targets in UAV visible light and infrared images under large squint angles Active CN105004354B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510347204.7A CN105004354B (en) 2015-06-19 2015-06-19 Method for locating targets in UAV visible light and infrared images under large squint angles

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510347204.7A CN105004354B (en) 2015-06-19 2015-06-19 Method for locating targets in UAV visible light and infrared images under large squint angles

Publications (2)

Publication Number Publication Date
CN105004354A CN105004354A (en) 2015-10-28
CN105004354B true CN105004354B (en) 2017-12-05

Family

ID=54377118

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510347204.7A Active CN105004354B (en) 2015-06-19 2015-06-19 Method for locating targets in UAV visible light and infrared images under large squint angles

Country Status (1)

Country Link
CN (1) CN105004354B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111998823A (en) * 2020-08-26 2020-11-27 吉林大学 Target ranging method based on binocular different-light-source ranging device

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107590878A (en) * 2017-09-13 2018-01-16 中国人民解放军火箭军工程大学 A kind of unmanned plane during flying safe prediction apparatus for evaluating and method
CN108109118B (en) * 2017-12-15 2021-10-15 大连理工大学 Aerial image geometric correction method without control points
CN108388341B (en) * 2018-02-11 2021-04-23 苏州笛卡测试技术有限公司 Man-machine interaction system and device based on infrared camera-visible light projector
CN109493309A (en) * 2018-11-20 2019-03-19 北京航空航天大学 A kind of infrared and visible images variation fusion method keeping conspicuousness information
CN109636850B (en) * 2019-01-14 2021-02-19 刘翔宇 Visible light positioning method for indoor intelligent lamp
CN110081982B (en) * 2019-03-11 2021-01-15 中林信达(北京)科技信息有限责任公司 Unmanned aerial vehicle target positioning method based on double-spectrum photoelectric search
CN110113094B (en) * 2019-05-09 2021-08-13 西安爱生技术集团公司 Communication relay unmanned aerial vehicle visibility calculation method for lift-off
CN112835376B (en) * 2019-11-22 2023-03-10 中国电力科学研究院有限公司 Machine head positioning method and system for unmanned aerial vehicle electricity testing
CN113124901B (en) * 2021-04-01 2022-03-11 中铁第四勘察设计院集团有限公司 Position correction method and device, electronic device and storage medium
CN113284128B (en) * 2021-06-11 2023-05-16 中国南方电网有限责任公司超高压输电公司天生桥局 Image fusion display method and device based on power equipment and computer equipment
CN114926552B (en) * 2022-06-17 2023-06-27 中国人民解放军陆军炮兵防空兵学院 Method and system for calculating Gaussian coordinates of pixel points based on unmanned aerial vehicle image

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4985704A (en) * 1987-01-17 1991-01-15 Scicon Limited Processing parameter generator for synthetic aperture radar
CN102565797A (en) * 2011-12-21 2012-07-11 北京航空航天大学 Geometric correction method for spotlight-mode satellite SAR (synthetic aperture radar) image
CN102788978A (en) * 2012-07-20 2012-11-21 电子科技大学 Squint spaceborne/airborne hybrid bistatic synthetic aperture radar imaging method
CN103049751A (en) * 2013-01-24 2013-04-17 苏州大学 Improved weighting region matching high-altitude video pedestrian recognizing method
CN103207388A (en) * 2013-03-26 2013-07-17 中国科学院电子学研究所 Method for calibrating airborne interference synthesis aperture radar (SAR) under squint condition
CN103675815A (en) * 2013-09-27 2014-03-26 西安电子科技大学 Method for accurately estimating Doppler rate in large-strabismus SAR (Synthetic Aperture Radar) imaging mode

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
A large-squint SAR imaging method for UAVs; Zheng Liwen et al.; Radar Science and Technology; 2007-12-31; Vol. 5, No. 6; 431-435 *
Theoretical error analysis of virtual focal plane stitching for multi-field-of-view remote sensing images; Yue Chunyu et al.; Spacecraft Recovery & Remote Sensing; 2015-04-30; Vol. 36, No. 2; 60-68 *
Error analysis of geometric distortion correction for squint images of an area-array CCD aerial camera; Zhou Qianfei et al.; Chinese Journal of Scientific Instrument; 2014-06-30; Vol. 35, No. 6; 1-8 *

Also Published As

Publication number Publication date
CN105004354A (en) 2015-10-28

Similar Documents

Publication Publication Date Title
CN105004354B (en) Method for locating targets in UAV visible light and infrared images under large squint angles
CN105180963B (en) Unmanned plane telemetry parameter modification method based on online calibration
CN103345737B (en) A kind of UAV high resolution image geometric correction method based on error compensation
CN100541232C (en) The thick bearing calibration of aviation multiple spectrum scanner geometric under the no attitude information condition
CN104764443B (en) A kind of tight imaging geometry model building method of Optical remote satellite
CN102426025B (en) Simulation analysis method for drift correction angle during remote sensing satellite attitude maneuver
CN101813465B (en) Monocular vision measuring method of non-contact precision measuring corner
CN105160125B (en) A kind of simulating analysis of star sensor quaternary number
CN110220491B (en) Method for estimating installation error angle of optical pod of unmanned aerial vehicle
CN105548976A (en) Shipborne radar offshore precision identification method
CN105444778B (en) A kind of star sensor based on imaging geometry inverting is in-orbit to determine appearance error acquisition methods
CN110030978B (en) Method and system for constructing geometric imaging model of full-link optical satellite
CN111102981B (en) High-precision satellite relative navigation method based on UKF
CN106767714A (en) Improve the equivalent mismatch model multistage Calibration Method of satellite image positioning precision
CN105737858A (en) Attitude parameter calibration method and attitude parameter calibration device of airborne inertial navigation system
CN105891821A (en) Automatic tracking method of airborne downward-looking measurement target
CN106525054B (en) A kind of above pushed away using star is swept single star of remote sensing images information and independently surveys orbit determination method
CN111896009B (en) Method and system for correcting imaging sight line offset caused by satellite flight motion
CN111505608B (en) Laser pointing on-orbit calibration method based on satellite-borne laser single-chip footprint image
CN110646782A (en) Satellite-borne laser on-orbit pointing calibration method based on waveform matching
Li et al. Opportunity rover localization and topographic mapping at the landing site of Meridiani Planum, Mars
CN101793517B (en) Online quick method for improving accuracy of attitude determination of airborne platform
CN102519454B (en) Selenocentric direction correction method for sun-earth-moon navigation
CN105182315A (en) Method for obtaining remote sensing image ground resolution of large swing angle optical remote sensing satellite
CN104019800B (en) The method of big side-sway line array CCD remote sensing images positioning for ground

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20151028

Assignee: Beijing northern sky long hawk UAV Technology Co.,Ltd.

Assignor: BEIHANG University

Contract record no.: X2021990000039

Denomination of invention: Target location method of UAV in visible and infrared images with large squint angle

Granted publication date: 20171205

License type: Exclusive License

Record date: 20210119