CN104501779A - High-accuracy target positioning method of unmanned plane on basis of multi-station measurement - Google Patents

High-accuracy target positioning method of unmanned plane on basis of multi-station measurement

Info

Publication number
CN104501779A
CN104501779A (application CN201510010637.3A)
Authority
CN
China
Prior art keywords
camera
unmanned plane
target
high
step
Prior art date
Application number
CN201510010637.3A
Other languages
Chinese (zh)
Inventor
马传焱
时荔蕙
王春龙
郝博雅
孙宇翔
高洪兴
Original Assignee
中国人民解放军63961部队
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中国人民解放军63961部队
Priority application: CN201510010637.3A
Publication: CN104501779A

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 - Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying

Abstract

The invention discloses a high-accuracy target positioning method for an unmanned aerial vehicle (UAV) based on multi-station measurement. The method comprises the following steps: equipping the UAV with a satellite receiver, an inertial measurement unit, a camera gimbal and a camera; calibrating the intrinsic parameters of the camera before the UAV takes off; during flight, selecting a target point of interest and adjusting the azimuth and elevation angles of the camera gimbal so that the target point stays in the field of view for a set period; extracting the coordinates of the corresponding (homonymous) target points in images captured at different times, and calculating the coordinates (X, Y, Z) of the target point in the world coordinate system; and converting (X, Y, Z) into longitude, latitude and geodetic height, completing the positioning process. The method requires neither a photoelectric reconnaissance platform nor laser ranging, which effectively protects the UAV itself and reduces the cost of the measuring equipment. At the same time, the method offers high positioning accuracy and a wide range of applications, and is of practical significance to the field of high-accuracy UAV target positioning.

Description

UAV high-accuracy target positioning method based on multi-station measurement

Technical field

The present invention relates to high-accuracy passive target positioning for unmanned aerial vehicles (UAVs), and in particular to a UAV high-accuracy target positioning method based on multi-station measurement.

Background technology

Compared with manned aircraft, UAVs have the advantages of small size, low cost, ease of use and low environmental requirements. With the advancement of technology, UAVs are widely used in many fields, including reconnaissance and surveillance, rescue and disaster relief, terrain survey and target strike. Target positioning is one of a UAV's key functions; its purpose is to determine the three-dimensional coordinates of a target in the geodetic coordinate system. At present, high-accuracy UAV target positioning has become a focus of UAV research at home and abroad.

UAV target positioning methods can be divided into active and passive positioning. Active positioning is based on an attitude-measurement/laser-ranging model, in which the UAV must carry a photoelectric reconnaissance platform to perform target tracking, laser ranging and related functions. Active positioning is generally accurate, but the laser works in the visible band, which compromises the concealment of the UAV itself, and carrying a photoelectric reconnaissance platform is expensive. Passive positioning captures target images with a camera and obtains the target location through image analysis; two approaches dominate. The first is image-matching-based positioning, which exploits available multi-source imagery: given a reference image prepared in advance, a corrected UAV television image is matched against it to locate the target. Its accuracy is relatively high, but it performs poorly in real time and, being limited by data availability, is not widely applicable. The second is imaging-model-based positioning, which, given the carrier altitude, the camera focal length, the camera exterior orientation and similar parameters, uses the collinearity equations to compute the relative position of the ground target and the carrier. In practice this method must assume that the target area is flat ground, and its positioning accuracy is low.

Summary of the invention

The present invention proposes a UAV high-accuracy target positioning method based on multi-view vision: by repeatedly observing the same target point from different stations, an effect similar to multi-view stereo measurement is obtained.

The technical scheme of the present invention is a UAV high-accuracy target positioning method based on multi-station measurement, characterized by comprising the following process:

Step 1: equip the UAV with a satellite receiver, an inertial measurement unit (IMU), a camera gimbal and a camera, where the gimbal provides two degrees of freedom of motion.

Step 2: before the UAV takes off, calibrate the intrinsic parameters of the camera.

Step 3: during flight, select a target point of interest and adjust the azimuth and elevation angles of the gimbal so that the target point remains in the camera's field of view for a set period.

Step 4: extract the coordinates of the corresponding (homonymous) target points in images captured at different times.

Step 5: calculate the coordinates of the target point in the world coordinate system using the multi-station target positioning model based on weighted least squares.

Step 6: taking the geographic coordinate system of the first measurement point as the world coordinate system, convert the result to longitude, latitude and geodetic height, completing the positioning process.

In step 2, the calibration of the camera intrinsic parameters comprises the following steps:

(1) prepare a camera calibration target;

(2) capture 15 images of the target from different orientations, ensuring that the images are distributed across the whole field of view, and varying the shooting distance and the tilt angle of the target sufficiently;

(3) extract the corner coordinates from the captured images and calibrate the camera intrinsic parameters with Zhang Zhengyou's calibration method.

The technical effect of the present invention: the multi-view-vision-based UAV target positioning method needs neither a photoelectric reconnaissance platform nor laser ranging, which effectively protects the UAV itself and reduces the cost of the measuring equipment. At the same time, the method offers high positioning accuracy and a wide range of applications, and is of practical significance to the field of high-accuracy UAV target positioning.

Brief description of the drawings

Fig. 1 shows the UAV target positioning system used in the present invention;

Fig. 2 shows the camera calibration target of the present invention;

Fig. 3 is a schematic diagram of the intersection target positioning method;

Fig. 4 is a schematic diagram of the multi-station target positioning method;

Fig. 5 is a flow chart of the coordinate conversion.

Detailed description of the embodiments

A UAV high-accuracy target positioning method based on multi-station measurement according to the present invention comprises the following process:

Step 1: equip the UAV with a satellite receiver, an inertial measurement unit (IMU), a camera gimbal and a camera, where the gimbal provides two degrees of freedom of motion (azimuth rotation and elevation rotation).

Step 2: before the UAV takes off, determine the intrinsic parameters of the camera using the camera calibration method of the present invention.

Step 3: during flight, select a target point of interest and adjust the azimuth and elevation angles of the gimbal so that the target point remains in the camera's field of view for a certain period.

Step 4: extract the coordinates of the corresponding (homonymous) target points in images captured at different times.

Step 5: calculate the coordinates of the target point in the world coordinate system according to the weighted-least-squares multi-station target positioning model of the present invention.

Step 6: taking the geographic coordinate system of the first measurement point as the world coordinate system, convert the result to longitude, latitude and geodetic height, completing the positioning process.

The above workflow is elaborated below.

1 System composition and coordinate definitions

1.1 Overall system structure

The UAV target positioning system of the present invention comprises a satellite receiver 1, an inertial measurement unit (IMU) 2, a camera gimbal 3 and a camera 4. The camera 4 is mounted on the gimbal 3, which provides two degrees of freedom of motion (azimuth rotation and elevation rotation), as shown in Fig. 1 (the IMU measures the UAV attitude, the satellite receiver obtains the UAV position, and the gimbal adjusts the pointing of the camera). The gimbal 3 and the camera 4 are mounted on the aircraft in a pod structure. During target positioning, reconnaissance video and telemetry are transmitted to the ground station over the data link, and the operator controls the camera system with a joystick and other commands to search for targets. When a target of interest appears on the screen, the homonymous target point is photographed repeatedly; combining the aircraft attitude data, the position from the satellite receiver 1, and the azimuth and elevation angles of the gimbal 3, a series of calculations yields the three-dimensional coordinates of the target, completing the positioning process.

1.2 Coordinate definitions

First, the coordinate systems are defined as follows:

(1) World coordinate system

Also called the global coordinate system; in the present invention it is the geographic coordinate system of the UAV at the first measurement point of the positioning run.

(2) Camera coordinate system

The origin of the camera coordinate system is the optical center of the camera; its X-axis and Y-axis are parallel to the X-axis and Y-axis of the image, and its Z-axis is the camera optical axis, perpendicular to the image plane.

(3) Image coordinate systems

The image physical coordinate system is a rectangular coordinate system whose origin is the intersection of the optical axis with the image plane (called the principal point) and whose units are physical lengths (millimetres, micrometres, etc.). Its X-axis and Y-axis are parallel to the X-axis and Y-axis of the image pixel coordinate system.

The image pixel coordinate system is a rectangular coordinate system whose origin is the top-left corner of the image and whose unit is the pixel; X and Y are the row and column indices of the pixel in the digital image.
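As an illustration of these definitions, the conversion between pixel coordinates and image physical coordinates can be sketched as follows; the pixel pitch and principal point are invented values, not parameters from the patent.

```python
# Illustrative conversion between the image pixel coordinate system and the
# image physical coordinate system described above. Pixel pitch (dx, dy, in
# mm/pixel) and principal point (u0, v0) are assumed example values.

def pixel_to_physical(u, v, u0, v0, dx, dy):
    """Pixel coordinates (origin top-left) -> image physical coordinates
    (origin at the principal point, units in mm)."""
    return (u - u0) * dx, (v - v0) * dy

def physical_to_pixel(x, y, u0, v0, dx, dy):
    """Inverse mapping: physical coordinates back to pixel coordinates."""
    return x / dx + u0, y / dy + v0

# Example: 5-micron pixels, principal point at the centre of a 1280x720 image.
u0, v0, dx, dy = 640.0, 360.0, 0.005, 0.005
x, y = pixel_to_physical(900, 500, u0, v0, dx, dy)
print(x, y)  # about (1.3, 0.7): 1.3 mm and 0.7 mm from the principal point
```

The two functions are exact inverses of each other, which is easy to verify by round-tripping a point.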

2 Key technologies of UAV target positioning

2.1 Camera intrinsic parameter calibration

Camera calibration is the process of determining the intrinsic and extrinsic parameters of a camera. Intrinsic calibration determines the camera's internal geometric and optical parameters, which are independent of its pose, including the image centre coordinates, focal length, scale factors and lens distortion. The sensor carried by an aerial UAV is generally a high-resolution, wide-field digital video camera with noticeable lens distortion; the platform's unstable attitude and engine-induced airframe vibration act on the camera during aerial photography, and frequent refocusing is needed to cope with different photographic environments. Therefore, to improve the accuracy of ground target positioning, camera calibration is essential.

The camera intrinsic parameters comprise linear transformation parameters and nonlinear distortion parameters. The linear transformation is the classical perspective model, which expresses the mapping between the image coordinate system and the camera coordinate system. Deviations of the imaging process from the perspective model are called camera nonlinear distortion. Owing to the complexity of lens design and factors such as manufacturing imperfections, a real lens imaging system cannot strictly obey the perspective model; the result is so-called lens distortion, commonly radial distortion, tangential distortion and thin-prism distortion, which grows larger away from the image centre. In high-accuracy photogrammetry, and especially with wide-angle lenses, a nonlinear model should therefore be used to describe the imaging relation.

In camera imaging, an ideal image point obeys the perspective projection model: the object point, the optical centre and the image point are collinear. Owing to lens distortion, however, the actual image point does not coincide with the ideal image point; the difference between them is called the distortion error. Lens distortion is divided into radial distortion and tangential distortion, and a distortion model that accounts for both can be described by five coefficients:

δx = x(k1·r² + k2·r⁴ + k3·r⁶) + 2·p1·x·y + p2·(r² + 2x²)
δy = y(k1·r² + k2·r⁴ + k3·r⁶) + p1·(r² + 2y²) + 2·p2·x·y,  with r² = x² + y²  (1)

where k1, k2, k3, p1, p2 are the distortion coefficients, and x and y are the offsets of the image point from the principal point in the horizontal and vertical directions divided by the corresponding equivalent focal lengths.
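A minimal sketch of this five-coefficient radial-plus-tangential distortion model applied to normalized image coordinates; the coefficient values below are invented for illustration.

```python
# Sketch of the 5-coefficient (k1, k2, k3, p1, p2) distortion model applied
# to normalized image coordinates (offsets from the principal point divided
# by the focal length). Coefficient values are invented for illustration.

def distort(x, y, k1, k2, k3, p1, p2):
    """Map an ideal (undistorted) normalized point to its distorted position."""
    r2 = x * x + y * y
    radial = k1 * r2 + k2 * r2**2 + k3 * r2**3        # radial term
    dx = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    dy = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return x + dx, y + dy

# A typical barrel distortion (negative k1) pulls points toward the centre.
xd, yd = distort(0.2, -0.1, k1=-0.3, k2=0.1, k3=0.0, p1=1e-3, p2=-5e-4)
print(xd, yd)
```

The principal point itself is a fixed point of the model (zero distortion at the image centre), which matches the observation above that distortion grows away from the centre.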

At present, typical camera calibration methods fall into three classes: (1) calibration based on a 3D stereo target; (2) calibration based on the radial alignment constraint; (3) calibration based on a 2D planar target. Representative methods include Tsai's RAC-based camera calibration; Wen's neural-network approach, which replaces the imaging model with a neural network and needs no exact projection model; and the planar calibration of Zhang Zhengyou et al., which exploits the orthogonality of the rotation matrix together with nonlinear optimization. Because Zhang's method is convenient, easy to operate and reasonably accurate, it is widely adopted for intrinsic calibration. The method requires the camera to photograph a planar target from two or more orientations; the camera and the 2D target may move freely and the motion parameters need not be known. During calibration the intrinsic parameters are assumed constant: from whatever angle the target is photographed, the intrinsic parameters do not change, and only the extrinsic parameters vary. The present invention adopts Zhang Zhengyou's calibration method with the chessboard target shown in Fig. 2. The camera captures 15 images from different orientations; to improve calibration accuracy and reduce random error, the images are distributed across the whole field of view, the shooting distance spans a certain depth range (typically 4 m to 15 m), and the tilt angle of the target varies sufficiently (-45° to 45°). Tests show that a camera whose intrinsic parameters have been calibrated in this way can effectively improve UAV target positioning accuracy.

2.2 Corresponding point extraction

Corresponding points (homonymous points) are the images of the same target point in the camera's field of view at different times. The present invention extracts corresponding points across images using a region-based local matching criterion. Common region-based local matching criteria include the sum of absolute differences of corresponding pixels (SAD, Sum of Absolute Differences), the sum of squared differences of corresponding pixels (SSD, Sum of Squared Differences) and the normalized cross correlation of the images (NCC, Normalized Cross Correlation). SAD is the simplest, and once the template size is fixed it is also the fastest, so the present invention uses SAD to extract corresponding points.

SAD is one of the simplest matching algorithms. For a window W centred at (x, y) in the left image I_L and a candidate offset d in the right image I_R:

SAD(x, y, d) = Σ_{(i,j)∈W} | I_L(x + i, y + j) − I_R(x + i − d, y + j) |  (2)

d* = argmin_d SAD(x, y, d)  (3)

That is, a window is defined around the source match point in the left image; the same window is then slid step by step across the right image, the accumulated grey-level difference is computed at each position, and the position with the smallest difference is taken as the corresponding point. The concrete steps are as follows:

1. construct a small window, similar to a convolution kernel;

2. centre the window on the target pixel and cover the source image with it, selecting all pixels in the covered region;

3. cover the image to be matched with the same window and select the pixels of its covered region;

4. subtract the covered region of the image to be matched from that of the source image, and take the sum of the absolute values of all pixel differences;

5. move the window over the image to be matched and repeat steps 3 and 4 (a search range is imposed; the search stops once this range is exceeded);

6. the window with the smallest SAD value within the search range identifies the block of pixels in the image to be matched that best matches the target point of the source image.
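The steps above can be sketched as follows; images are plain lists of grey values, and the window size, search range and image contents are illustrative assumptions.

```python
# A minimal SAD block-matching sketch following steps 1-6 above. Images are
# lists of lists of grey values; window half-size and search range are
# illustrative, and the search is restricted to the same row for simplicity.

def sad(img_a, img_b, ax, ay, bx, by, half):
    """Sum of absolute differences between two (2*half+1)-square windows."""
    total = 0
    for j in range(-half, half + 1):
        for i in range(-half, half + 1):
            total += abs(img_a[ay + j][ax + i] - img_b[by + j][bx + i])
    return total

def match_point(src, dst, x, y, half=1, search=3):
    """Find the dst pixel on the same row whose window best matches (x, y) in src."""
    best_x, best_cost = None, None
    for cx in range(max(half, x - search), min(len(dst[0]) - half, x + search + 1)):
        cost = sad(src, dst, x, y, cx, y, half)
        if best_cost is None or cost < best_cost:
            best_x, best_cost = cx, cost
    return best_x, best_cost

# Toy example: dst is src shifted right by one pixel.
src = [[0, 0, 0, 0, 0, 0],
       [0, 9, 5, 0, 0, 0],
       [0, 5, 9, 0, 0, 0],
       [0, 0, 0, 0, 0, 0]]
dst = [[0, 0, 0, 0, 0, 0],
       [0, 0, 9, 5, 0, 0],
       [0, 0, 5, 9, 0, 0],
       [0, 0, 0, 0, 0, 0]]
bx, cost = match_point(src, dst, x=1, y=1)
print(bx, cost)  # best match at x=2 with cost 0: the pattern moved one pixel
```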

2.3 Intersection positioning principle

Let C1 and C2 be two aerial camera stations photographing the same ground target, forming a stereo pair as shown in Fig. 3. The ground target point P is imaged at p1 and p2 on the left and right photographs; clearly, the homonymous rays C1p1 and C2p2 intersect at the ground target point P.

From the perspective projection imaging relation, the collinearity equations for C1 and C2 are, respectively:

u1 = fx·Xc1/Zc1 + u0,  v1 = fy·Yc1/Zc1 + v0  (4)

u2 = fx·Xc2/Zc2 + u0,  v2 = fy·Yc2/Zc2 + v0  (5)

where (ui, vi) are the actual image coordinates of P in image i; fx and fy are the equivalent focal lengths; (u0, v0) are the principal point coordinates; and (Xci, Yci, Zci) are the coordinates of P in the camera coordinate system of station Ci.

From the relative pose of the camera coordinate system and the world coordinate system:

[Xci, Yci, Zci]ᵀ = Ri · ([X, Y, Z]ᵀ − Ti)  (6)

where (X, Y, Z) are the coordinates of the target point in the world coordinate system, Ri is the rotation matrix that aligns the world coordinate system with the attitude of camera coordinate system i, and Ti is the translation that moves the world origin to the origin of camera coordinate system i.
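The world-to-camera transform of equation (6) can be sketched as follows; the rotation matrix and camera position used are invented example values.

```python
# Sketch of equation (6): transform a world point into a camera frame given
# the rotation matrix R and the camera position T. R and T are invented.

def mat_vec(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def world_to_camera(p_world, r, t):
    """Xc = R (Xw - T): rotate the camera-relative vector into camera axes."""
    rel = [p_world[i] - t[i] for i in range(3)]
    return mat_vec(r, rel)

# Illustrative attitude: a 90-degree rotation about the Z axis.
R = [[0.0, 1.0, 0.0],
     [-1.0, 0.0, 0.0],
     [0.0, 0.0, 1.0]]
T = [10.0, 0.0, 5.0]
print(world_to_camera([10.0, 4.0, 5.0], R, T))  # -> [4.0, 0.0, 0.0]
```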

During UAV target positioning, the geographic coordinate system of the first measurement point is taken as the world coordinate system; the first measurement point T1 and the second measurement point T2 are then obtained from two satellite-positioning fixes. The inertial measurement unit and the camera gimbal give the aircraft yaw angle ψ, pitch angle θ and roll angle φ, and the camera azimuth α and elevation β, from which the rotation matrix is composed of the corresponding elementary rotations:

Ri = R(βi)·R(αi)·R(φi)·R(θi)·R(ψi)  (7)

Solving equations (4), (5), (6) and (7) simultaneously yields the coordinates (X, Y, Z) of the target point P.
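One way to realize the intersection of the two homonymous rays: the patent solves the collinearity equations simultaneously, while the sketch below uses the geometrically equivalent closest-approach midpoint of the two rays. All station positions and ray directions are invented.

```python
# Sketch of intersection ("cross") positioning: estimate the target as the
# midpoint of the closest approach of the two homonymous rays C1p1 and C2p2.
# This is an equivalent geometric formulation, not the patent's exact solver.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def triangulate(c1, d1, c2, d2):
    """Closest-point midpoint of rays c1 + t1*d1 and c2 + t2*d2."""
    w = [c1[i] - c2[i] for i in range(3)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b            # zero only for parallel rays
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    p1 = [c1[i] + t1 * d1[i] for i in range(3)]
    p2 = [c2[i] + t2 * d2[i] for i in range(3)]
    return [(p1[i] + p2[i]) / 2.0 for i in range(3)]

# Two stations at 100 m altitude, both looking at the point (0, 0, 0).
c1, c2 = [0.0, 0.0, 100.0], [50.0, 0.0, 100.0]
d1 = [0.0, 0.0, -1.0]                       # straight down
d2 = [-0.4472135955, 0.0, -0.894427191]     # toward the origin, normalized
print(triangulate(c1, d1, c2, d2))  # -> approximately [0, 0, 0]
```

With noisy rays the two lines no longer intersect exactly, and the midpoint of their closest approach is the natural two-station estimate that the weighted least-squares model of section 2.4 generalizes to n stations.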

2.4 Multi-station target positioning model based on weighted least squares

The positioning result of the method above is very sensitive to noise. For most photogrammetric systems the object distance is far larger than the focal length, so by the basic relation of perspective projection, a small deviation in the direction of an imaging ray, caused by camera parameter errors or point extraction errors, produces a greatly amplified error in the computed spatial position. On the basis of the intersection positioning principle, the same homonymous target point is therefore measured repeatedly and an optimal solution algorithm is applied, improving the accuracy and robustness of the target positioning algorithm.

As shown in Fig. 4, the UAV photographs the target point n (n > 3) times along its preset flight path, obtaining n images.

Then, from the collinearity equations, for each station i = 1, …, n:

ui = Fi(X, Y, Z),  vi = Gi(X, Y, Z)  (8)

where Fi and Gi are the projection functions obtained by substituting equation (6) into equations (4) and (5).

Expanding equation (8) in a first-order Taylor series about the initial value (X0, Y0, Z0):

ui ≈ Fi(X0, Y0, Z0) + (∂Fi/∂X)·ΔX + (∂Fi/∂Y)·ΔY + (∂Fi/∂Z)·ΔZ
vi ≈ Gi(X0, Y0, Z0) + (∂Gi/∂X)·ΔX + (∂Gi/∂Y)·ΔY + (∂Gi/∂Z)·ΔZ  (9)

where (ΔX, ΔY, ΔZ) is the correction to the initial value. Let

A = [∂Fi/∂X  ∂Fi/∂Y  ∂Fi/∂Z; ∂Gi/∂X  ∂Gi/∂Y  ∂Gi/∂Z],  i = 1, …, n  (10)

L = [ui − Fi(X0, Y0, Z0); vi − Gi(X0, Y0, Z0)],  i = 1, …, n  (11)

Combining equations (9), (10) and (11):

A·ΔX = L  (12)

According to least-squares estimation:

ΔX = (AᵀA)⁻¹AᵀL  (13)

During UAV target positioning, the aircraft attitude, flying height and camera azimuth and elevation all differ at each measurement point. Consequently, even with the same camera, the differing exterior parameters make the positioning accuracy at each measurement point different, as is its contribution to the overall error. Weighted least squares is therefore introduced.

Let P be the weighting matrix, with

P = diag(w1, w2, …, wn)  (14)

Then:

ΔX = (AᵀPA)⁻¹AᵀPL  (15)

and from equations (11) and (15) the updated estimate is

(X, Y, Z) = (X0, Y0, Z0) + (ΔX, ΔY, ΔZ)  (16)

The initial target position (X0, Y0, Z0) is found from the intersection positioning principle. Because the position error of the initial value is relatively large, and linearization adds further error, the first estimate can deviate considerably from the true value. An iterative procedure is therefore adopted: the iteration ends when the positioning result approaches a stationary value.

Estimating the weighting matrix is difficult; usually a diagonal matrix, or an even simpler identity matrix, is chosen. Studies show that even if the chosen weight matrix is in error, the weighted least-squares estimate of the unknown parameters remains unbiased. We adopt a convenient and principled way of obtaining the weight matrix that works well in practice. Its core idea is that measurement points causing larger errors receive smaller weights, and measurement points with smaller errors receive larger weights, strengthening the "contribution" of good measurement points and improving the accuracy of the least-squares estimate. During target positioning, the farther a measurement point is from the target, the worse its positioning accuracy; the distance between a measurement point and the target is determined jointly by the height of the measurement point and the pointing angle of its camera optical axis, which satisfy a basic triangular relation. This gives:

wi = sin(βi) / hi  (17)

where wi is an element of the weight matrix, βi is the pointing (elevation) angle of the camera optical axis, and hi is the height of the measurement point.
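A sketch of the weighted least-squares solve of equation (15) with diagonal weights built from station height and optical-axis elevation angle, as equation (17) suggests. The reading w = sin(β)/h of that triangular relation, and all numeric values, are our assumptions.

```python
# Sketch of x̂ = (AᵀPA)⁻¹AᵀPL for a 3-parameter linear model with a diagonal
# weight matrix P. Weights follow the assumed reading w_i = sin(β_i)/h_i of
# equation (17); all observation values below are invented.

import math

def solve3(m, b):
    """Gauss-Jordan elimination with partial pivoting for a 3x3 system m·x = b."""
    a = [row[:] + [b[i]] for i, row in enumerate(m)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        for r in range(3):
            if r != col:
                f = a[r][col] / a[col][col]
                a[r] = [a[r][k] - f * a[col][k] for k in range(4)]
    return [a[r][3] / a[r][r] for r in range(3)]

def weighted_lsq(rows, obs, weights):
    """Weighted normal equations (AᵀPA)·x = AᵀPL for a 3-parameter model."""
    ata = [[sum(w * r[i] * r[j] for r, w in zip(rows, weights))
            for j in range(3)] for i in range(3)]
    atl = [sum(w * r[i] * l for r, l, w in zip(rows, obs, weights))
           for i in range(3)]
    return solve3(ata, atl)

# Four observation equations of the 3-vector (X, Y, Z) = (1, 2, 3); the last
# observation is corrupted but comes from a distant, low-elevation station,
# so its weight is small and it barely disturbs the solution.
A = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0], [1.0, 1.0, 1.0]]
L = [1.0, 2.0, 3.0, 9.0]          # the exact last value would be 6.0
h, beta = [100.0, 100.0, 100.0, 500.0], [90.0, 90.0, 90.0, 10.0]
w = [math.sin(math.radians(b)) / hh for b, hh in zip(beta, h)]
print(weighted_lsq(A, L, w))  # close to (1, 2, 3)
```

Setting all weights equal would pull the solution noticeably toward the corrupted observation, which illustrates why down-weighting distant, low-elevation stations helps.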

In the present invention, the geographic coordinate system of the first measurement point is taken as the world coordinate system; the coordinates (X, Y, Z) are then transformed through the coordinate conversion process shown in Fig. 5 into the geodetic coordinate system (longitude, latitude and geodetic height), completing the target positioning process.
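The final conversion from the local world frame to longitude, latitude and geodetic height can be sketched as follows, assuming the world frame is an east-north-up frame at the first measurement point; the reference coordinates are invented.

```python
# Sketch of the final step: convert local world coordinates, taken here as an
# east-north-up (ENU) frame at the first measurement point, into WGS-84
# longitude, latitude and geodetic height. Reference point values are invented.

import math

A_WGS = 6378137.0             # WGS-84 semi-major axis, metres
E2 = 6.69437999014e-3         # WGS-84 first eccentricity squared

def geodetic_to_ecef(lat, lon, h):
    lat, lon = math.radians(lat), math.radians(lon)
    n = A_WGS / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)
    return ((n + h) * math.cos(lat) * math.cos(lon),
            (n + h) * math.cos(lat) * math.sin(lon),
            (n * (1.0 - E2) + h) * math.sin(lat))

def ecef_to_geodetic(x, y, z):
    lon = math.atan2(y, x)
    p = math.hypot(x, y)
    lat = math.atan2(z, p * (1.0 - E2))      # initial guess
    for _ in range(10):                      # fixed-point iteration
        n = A_WGS / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)
        h = p / math.cos(lat) - n
        lat = math.atan2(z, p * (1.0 - E2 * n / (n + h)))
    return math.degrees(lat), math.degrees(lon), h

def enu_to_geodetic(e, n, u, lat0, lon0, h0):
    """Rotate the ENU offset into ECEF at the reference point, then convert."""
    x0, y0, z0 = geodetic_to_ecef(lat0, lon0, h0)
    la, lo = math.radians(lat0), math.radians(lon0)
    dx = -math.sin(lo) * e - math.sin(la) * math.cos(lo) * n + math.cos(la) * math.cos(lo) * u
    dy = math.cos(lo) * e - math.sin(la) * math.sin(lo) * n + math.cos(la) * math.sin(lo) * u
    dz = math.cos(la) * n + math.sin(la) * u
    return ecef_to_geodetic(x0 + dx, y0 + dy, z0 + dz)

# A target 1000 m north of and 50 m above an assumed first measurement point.
print(enu_to_geodetic(0.0, 1000.0, 50.0, lat0=40.0, lon0=116.0, h0=100.0))
```

Moving 1000 m north shifts the latitude by roughly 0.009° and leaves the longitude unchanged, a quick sanity check on the conversion.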

Claims (2)

1. A UAV high-accuracy target positioning method based on multi-station measurement, characterized by comprising the following process:
Step 1: equip the UAV with a satellite receiver, an inertial measurement unit (IMU), a camera gimbal and a camera, where the gimbal provides two degrees of freedom of motion;
Step 2: before the UAV takes off, calibrate the intrinsic parameters of the camera;
Step 3: during flight, select a target point of interest and adjust the azimuth and elevation angles of the gimbal so that the target point remains in the camera's field of view for a set period;
Step 4: extract the coordinates of the corresponding (homonymous) target points in images captured at different times;
Step 5: calculate the coordinates of the target point in the world coordinate system using the multi-station target positioning model based on weighted least squares;
Step 6: taking the geographic coordinate system of the first measurement point as the world coordinate system, convert the result to longitude, latitude and geodetic height, completing the positioning process.
2. The UAV high-accuracy target positioning method based on multi-station measurement according to claim 1, characterized in that, in step 2, the calibration of the camera intrinsic parameters comprises the following steps:
(1) prepare a camera calibration target;
(2) capture 15 images of the target from different orientations, ensuring that the images are distributed across the whole field of view, and varying the shooting distance and the tilt angle of the target sufficiently;
(3) extract the corner coordinates from the captured images and calibrate the camera intrinsic parameters with Zhang Zhengyou's calibration method.
Priority application: CN201510010637.3A (CN), priority and filing date 2015-01-09, "High-accuracy target positioning method of unmanned plane on basis of multi-station measurement".

Publications (1)

Publication number: CN104501779A (en), published 2015-04-08
Family ID: 52943205
Country: CN (China)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104794490A (en) * 2015-04-28 2015-07-22 中测新图(北京)遥感技术有限责任公司 Slanted image homonymy point acquisition method and slanted image homonymy point acquisition device for aerial multi-view images
CN105606128A (en) * 2015-12-01 2016-05-25 中国科学院上海技术物理研究所 External-field calibration method of space-borne laser altimeter
CN105631886A (en) * 2015-12-01 2016-06-01 中国科学院上海技术物理研究所 Relative positioning method for laser light spot and foot print camera on basis of aviation image
CN105698762A (en) * 2016-01-15 2016-06-22 中国人民解放军国防科学技术大学 Rapid target positioning method based on observation points at different time on single airplane flight path
CN105841676A (en) * 2016-03-24 2016-08-10 北京林业大学 Forest fire night positioning terrestrial photogrammetry method
CN106096207A (en) * 2016-06-29 2016-11-09 武汉中观自动化科技有限公司 A kind of rotor wing unmanned aerial vehicle wind resistance appraisal procedure based on multi-vision visual and system
CN106403900A (en) * 2016-08-29 2017-02-15 上海交通大学 Flyer tracking and locating system and method
CN106454209A (en) * 2015-08-06 2017-02-22 航天图景(北京)科技有限公司 Unmanned aerial vehicle emergency quick action data link system and unmanned aerial vehicle emergency quick action monitoring method based on spatial-temporal information fusion technology
CN106595668A (en) * 2016-12-12 2017-04-26 中国航空工业集团公司洛阳电光设备研究所 Passive location algorithm for electro-optical pod
CN107065895A (en) * 2017-01-05 2017-08-18 南京航空航天大学 A kind of plant protection unmanned plane determines high-tech
CN107438808A (en) * 2016-10-31 2017-12-05 深圳市大疆创新科技有限公司 A kind of method, apparatus and relevant device of rod volume control
CN108253936A (en) * 2018-01-04 2018-07-06 南京航空航天大学 A kind of unmanned plane target localization method for reducing optical axis and being directed toward random error
CN108827147A (en) * 2017-05-18 2018-11-16 金钱猫科技股份有限公司 A kind of image measuring method and system based on Fast Calibration

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050021202A1 (en) * 2003-04-25 2005-01-27 Lockheed Martin Corporation Method and apparatus for video on demand
CN201517925U (en) * 2009-10-13 2010-06-30 经纬卫星资讯股份有限公司 Unmanned aerial vehicle remote sensing detector
CN102707306A (en) * 2011-12-29 2012-10-03 成都飞机工业(集团)有限责任公司 Combined navigation method applicable to unmanned aerial vehicle in glide landing stage
CN103822615A (en) * 2014-02-25 2014-05-28 北京航空航天大学 Unmanned aerial vehicle ground target real-time positioning method with automatic extraction and gathering of multiple control points


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
余家祥 et al., "A new UAV-to-ground positioning method based on homonymous points in multi-frame images" (基于多帧图像同名点的无人机对地定位新方法), Acta Armamentarii (《兵工学报》) *
迟健男 et al., "Vision Measurement Technology" (《视觉测量技术》), China Machine Press (机械工业出版社), 30 June 2011 *

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104794490B (en) * 2015-04-28 2018-10-02 中测新图(北京)遥感技术有限责任公司 The inclination image same place acquisition methods and device of aviation multi-view images
CN104794490A (en) * 2015-04-28 2015-07-22 中测新图(北京)遥感技术有限责任公司 Slanted image homonymy point acquisition method and slanted image homonymy point acquisition device for aerial multi-view images
CN106454209A (en) * 2015-08-06 2017-02-22 航天图景(北京)科技有限公司 Unmanned aerial vehicle emergency quick action data link system and unmanned aerial vehicle emergency quick action monitoring method based on spatial-temporal information fusion technology
CN106454209B (en) * 2015-08-06 2019-08-06 航天图景(北京)科技有限公司 The fast anti-data link system of unmanned plane emergency and method based on TEMPORAL-SPATIAL INFORMATION FUSION
CN105631886A (en) * 2015-12-01 2016-06-01 中国科学院上海技术物理研究所 Relative positioning method for laser light spot and foot print camera on basis of aviation image
CN105606128A (en) * 2015-12-01 2016-05-25 中国科学院上海技术物理研究所 External-field calibration method of space-borne laser altimeter
CN105698762B (en) * 2016-01-15 2018-02-23 中国人民解放军国防科学技术大学 Target method for rapidly positioning based on observation station at different moments on a kind of unit flight path
CN105698762A (en) * 2016-01-15 2016-06-22 中国人民解放军国防科学技术大学 Rapid target positioning method based on observation points at different time on single airplane flight path
CN105841676A (en) * 2016-03-24 2016-08-10 北京林业大学 Forest fire night positioning terrestrial photogrammetry method
CN106096207B (en) * 2016-06-29 2019-06-07 武汉中观自动化科技有限公司 Rotor unmanned aerial vehicle wind resistance assessment method and system based on multi-camera vision
CN106096207A (en) * 2016-06-29 2016-11-09 武汉中观自动化科技有限公司 Rotor unmanned aerial vehicle wind resistance assessment method and system based on multi-camera vision
CN106403900A (en) * 2016-08-29 2017-02-15 上海交通大学 Flying object tracking and locating system and method
CN107438808A (en) * 2016-10-31 2017-12-05 深圳市大疆创新科技有限公司 A kind of method, apparatus and relevant device of rod volume control
CN106595668A (en) * 2016-12-12 2017-04-26 中国航空工业集团公司洛阳电光设备研究所 Passive location algorithm for electro-optical pod
CN106595668B (en) * 2016-12-12 2019-07-09 中国航空工业集团公司洛阳电光设备研究所 Passive location algorithm for an electro-optical pod
CN107065895A (en) * 2017-01-05 2017-08-18 南京航空航天大学 Altitude-holding technique for a plant protection unmanned aerial vehicle
CN108827147A (en) * 2017-05-18 2018-11-16 金钱猫科技股份有限公司 Image measurement method and system based on fast calibration
CN108844456A (en) * 2017-05-18 2018-11-20 金钱猫科技股份有限公司 Rapid image measurement method and system
CN108253936A (en) * 2018-01-04 2018-07-06 南京航空航天大学 Unmanned aerial vehicle target positioning method for reducing optical axis pointing random error

Similar Documents

Publication Publication Date Title
US7071970B2 (en) Video augmented orientation sensor
CN102435188B (en) Monocular vision/inertia autonomous navigation method for indoor environment
US9607219B2 (en) Determination of position from images and associated camera positions
Xiang et al. Method for automatic georeferencing aerial remote sensing (RS) images from an unmanned aerial vehicle (UAV) platform
Zongjian UAV for mapping—low altitude photogrammetric survey
CN103119611B (en) The method and apparatus of the location based on image
Nagai et al. UAV-borne 3-D mapping system by multisensor integration
Vallet et al. Photogrammetric performance of an ultra light weight swinglet UAV
US20090262974A1 (en) System and method for obtaining georeferenced mapping data
Zhang et al. An unmanned aerial vehicle-based imaging system for 3D measurement of unpaved road surface distresses
Rinaudo et al. Archaeological site monitoring: UAV photogrammetry can be an answer
Li Potential of high-resolution satellite imagery for national mapping products
CN105928498A (en) Determination Of Object Data By Template-based Uav Control
US20060215935A1 (en) System and architecture for automatic image registration
Delacourt et al. DRELIO: An unmanned helicopter for imaging coastal areas
US9194954B2 (en) Method for geo-referencing an imaged area
Yahyanejad et al. Incremental mosaicking of images from autonomous, small-scale uavs
US9083859B2 (en) System and method for determining geo-location(s) in images
Kim et al. Feasibility of employing a smartphone as the payload in a photogrammetric UAV system
JP2009501996A (en) Image geometric correction method and apparatus
Nagai et al. UAV Borne mapping by multi sensor integration
CN103674063B (en) A kind of optical remote sensing camera geometric calibration method in-orbit
CN104215239B (en) Guidance method using vision-based autonomous unmanned plane landing guidance device
CN102829785A (en) Air vehicle full-parameter navigation method based on sequence image and reference image matching
CN105469405B (en) Positioning and map constructing method while view-based access control model ranging

Legal Events

Date Code Title Description
PB01 Publication
C06 Publication
SE01 Entry into force of request for substantive examination
C10 Entry into substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20150408