CN112950719A - Passive target rapid positioning method based on unmanned aerial vehicle active photoelectric platform - Google Patents

Passive target rapid positioning method based on unmanned aerial vehicle active photoelectric platform

Info

Publication number
CN112950719A
CN112950719A (application CN202110091538.8A)
Authority
CN
China
Prior art keywords
target
coordinates
coordinate system
camera
unmanned aerial vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110091538.8A
Other languages
Chinese (zh)
Other versions
CN112950719B (en)
Inventor
张通
王晨昕
庞明慧
江奕蕾
刘春江
符文星
杨忠龙
张添钧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CN202110091538.8A priority Critical patent/CN112950719B/en
Publication of CN112950719A publication Critical patent/CN112950719A/en
Application granted granted Critical
Publication of CN112950719B publication Critical patent/CN112950719B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/20: Instruments for performing navigational calculations
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/80: Geometric correction

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Automation & Control Theory (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a passive target rapid positioning method based on an unmanned aerial vehicle (UAV) active photoelectric platform. First, the UAV photoelectric platform identifies a target and determines its position in the field of view. The focal length f of the platform's onboard camera is then changed, and a picture of the target is taken once the target occupies more than one fifth of the camera frame. From the captured picture, the position of the target centroid in the imaging coordinate system is determined and converted into the geodetic coordinate system. Finally, the error range of the positioning result is calculated and compensated. The method merges the UAV attitude error, position error and photoelectric platform frame angle error into the exterior orientation element error of the camera for analysis, compensates the target position accordingly, and quickly yields an accurate target position.

Description

Passive target rapid positioning method based on unmanned aerial vehicle active photoelectric platform
Technical Field
The invention belongs to the technical field of unmanned aerial vehicles, and particularly relates to a passive target rapid positioning method.
Background
With the development of science and technology, unmanned combat systems are being applied ever more widely in the military field. When a UAV strikes ground targets, calculating the positioning error of the ground target is a key link: correcting and compensating the various distortions in the positioning process improves the accuracy of the positioning result, and, combined with error analysis and calculation of the error range, provides target indication for subsequent precision strikes.
The target positioning based on the unmanned aerial vehicle mainly comprises three methods, namely positioning based on collineation, positioning based on image matching and positioning based on attitude measurement or laser ranging.
Collinear photogrammetric positioning has many limitations: it generally assumes that the target region to be measured is flat, it requires the interior and exterior orientation elements of the camera, and both sets of elements may contain errors.
Positioning based on image matching requires a reference image to be established in advance and matched, reference point by reference point, against the corrected UAV image. This restricts the conditions under which it can be used, and its poor real-time performance makes it unsuitable for rapid UAV positioning.
Positioning based on attitude measurement or laser ranging requires the UAV to hover above the target while performing laser ranging, which is clearly unsuitable for rapid positioning.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides a passive target rapid positioning method based on an unmanned aerial vehicle (UAV) active photoelectric platform. First, the UAV photoelectric platform identifies a target and determines its position in the field of view. The focal length f of the platform's onboard camera is then changed, and a picture of the target is taken once the target occupies more than one fifth of the camera frame. From the captured picture, the position of the target centroid in the imaging coordinate system is determined and converted into the geodetic coordinate system. Finally, the error range of the positioning result is calculated and compensated. The method merges the UAV attitude error, position error and photoelectric platform frame angle error into the exterior orientation element error of the camera for analysis, compensates the target position accordingly, and quickly yields an accurate target position.
The technical scheme adopted by the invention for solving the technical problem comprises the following steps:
step 1: identifying a target by an unmanned aerial vehicle photoelectric platform, and determining the position of the target in a view field;
step 2: changing the focal length f of the onboard camera of the unmanned aerial vehicle photoelectric platform, and taking a picture of the target when the target occupies more than 1/L of the onboard camera frame;
and step 3: determining the position of the target in the geodetic coordinate system;
step 3-1: distortion correction is carried out on the imaging data;
the distortion correction formula is:
$$x_{ui} = x_{vi} + x_{vi}\left(k_1 r_i^2 + k_2 r_i^4\right) + p_1\left(r_i^2 + 2x_{vi}^2\right) + 2p_2 x_{vi} y_{vi} + s_1 r_i^2$$
$$y_{ui} = y_{vi} + y_{vi}\left(k_1 r_i^2 + k_2 r_i^4\right) + p_2\left(r_i^2 + 2y_{vi}^2\right) + 2p_1 x_{vi} y_{vi} + s_2 r_i^2, \qquad r_i^2 = x_{vi}^2 + y_{vi}^2 \tag{1}$$
where k_1, k_2, p_1, p_2, s_1, s_2 are the distortion coefficients, (x_{ui}, y_{ui}) are the ideal coordinates in the imaging coordinate system under the assumed error-free, distortion-free state, and (x_{vi}, y_{vi}) are the coordinates actually measured in the imaging coordinate system;
the distortion coefficient is calibrated by three pairs of known points, and the ideal coordinates of the three pairs of known points are assumed as follows: (x)u1,yu1),(xu2,yu2),(xu3,yu3) The actual coordinates are: (x)v1,yv1),(xv2,yv2),(xv3,yv3) Then the distortion coefficient is expressed as:
$$P = A^{-1} V \tag{2}$$
wherein:
$$P = \left[k_1, k_2, p_1, p_2, s_1, s_2\right]^T$$
$$V = \left[x_{u1}-x_{v1},\; y_{u1}-y_{v1},\; x_{u2}-x_{v2},\; y_{u2}-y_{v2},\; x_{u3}-x_{v3},\; y_{u3}-y_{v3}\right]^T$$
$$A = \begin{bmatrix}
x_{v1}r_1^2 & x_{v1}r_1^4 & r_1^2+2x_{v1}^2 & 2x_{v1}y_{v1} & r_1^2 & 0\\
y_{v1}r_1^2 & y_{v1}r_1^4 & 2x_{v1}y_{v1} & r_1^2+2y_{v1}^2 & 0 & r_1^2\\
x_{v2}r_2^2 & x_{v2}r_2^4 & r_2^2+2x_{v2}^2 & 2x_{v2}y_{v2} & r_2^2 & 0\\
y_{v2}r_2^2 & y_{v2}r_2^4 & 2x_{v2}y_{v2} & r_2^2+2y_{v2}^2 & 0 & r_2^2\\
x_{v3}r_3^2 & x_{v3}r_3^4 & r_3^2+2x_{v3}^2 & 2x_{v3}y_{v3} & r_3^2 & 0\\
y_{v3}r_3^2 & y_{v3}r_3^4 & 2x_{v3}y_{v3} & r_3^2+2y_{v3}^2 & 0 & r_3^2
\end{bmatrix}, \qquad r_i^2 = x_{vi}^2 + y_{vi}^2$$
step 3-2: determining centroid position of target
And performing linear fitting on the appearance of the target by adopting a least square method, wherein the fitting result is expressed as:
f(x)=ax+b (3)
Let (x_i, y_i), i = 1, 2, ..., n, denote the coordinates of the target edge points, in the imaging coordinate system, in the captured picture, where n is the number of edge points; equation (3) then satisfies:
$$\min_{a,b} \sum_{i=1}^{n}\left[y_i - \left(a x_i + b\right)\right]^2 \tag{4}$$
where a and b are the coefficients obtained by the linear fit; the centroid coordinates (x, y) of the target in the imaging coordinate system are then computed as:
[Equation (5), which computes the centroid (x, y) from the fitted edge lines, appears only as an image in the original and is not reproduced here.]
step 3-3: performing coordinate conversion on the centroid coordinates (x, y);
converting the target centroid coordinates (x, y) from the imaging coordinate system to a geodetic coordinate system, wherein the coordinates of a target point in the imaging coordinate system, the coordinates of a target point in the geodetic coordinate system and the projection center of the camera optical system satisfy a collinear equation:
$$x - x_0 = -f\,\frac{a_1\left(X - X_s\right) + b_1\left(Y - Y_s\right) + c_1\left(Z - Z_s\right)}{a_3\left(X - X_s\right) + b_3\left(Y - Y_s\right) + c_3\left(Z - Z_s\right)}$$
$$y - y_0 = -f\,\frac{a_2\left(X - X_s\right) + b_2\left(Y - Y_s\right) + c_2\left(Z - Z_s\right)}{a_3\left(X - X_s\right) + b_3\left(Y - Y_s\right) + c_3\left(Z - Z_s\right)} \tag{6}$$
where (X, Y, Z) are the coordinates of the target centroid in the geodetic coordinate system, with Z set to 0 when the area to be measured is treated as flat; x_0, y_0, f are the interior orientation elements of the camera, known parameters calibrated when the camera leaves the factory, (x_0, y_0) being the image coordinates of the principal point; (X_s, Y_s, Z_s) are the position coordinates of the projection center of the camera optical system; and a_i, b_i, c_i are the direction cosines, expressed as:
$$\begin{cases}
a_1 = \cos\varphi\cos\kappa - \sin\varphi\sin\omega\sin\kappa\\
a_2 = -\cos\varphi\sin\kappa - \sin\varphi\sin\omega\cos\kappa\\
a_3 = -\sin\varphi\cos\omega\\
b_1 = \cos\omega\sin\kappa\\
b_2 = \cos\omega\cos\kappa\\
b_3 = -\sin\omega\\
c_1 = \sin\varphi\cos\kappa + \cos\varphi\sin\omega\sin\kappa\\
c_2 = -\sin\varphi\sin\kappa + \cos\varphi\sin\omega\cos\kappa\\
c_3 = \cos\varphi\cos\omega
\end{cases} \tag{7}$$
where φ, ω and κ are respectively the heading tilt angle, side tilt angle and image rotation angle among the exterior orientation elements of the camera, i.e. the rotation angles of the camera about the Z, Y and X axes of the geodetic coordinate system; together with X_s, Y_s, Z_s they form the exterior orientation elements of the camera;
step 3-4: calculating the error range of the target positioning result;
the error range of the target point coordinates is expressed as follows by taking the derivation of equation (6):
Figure BDA0002912762680000043
and 4, step 4: and (3) calculating the absolute value of the formula (8) to obtain the error of the coordinates of the target point, thereby compensating the positioning result:
Figure BDA0002912762680000051
wherein:
Figure BDA0002912762680000052
Figure BDA0002912762680000053
Figure BDA0002912762680000054
the measurement result after error compensation is:
Figure BDA0002912762680000055
preferably, L ═ 5.
The invention has the following beneficial effects:
1. the target rapid positioning method provided by the invention can output the position information of the target in real time.
2. The target rapid positioning method provided by the invention merges the UAV attitude error, position error and photoelectric platform frame angle error into the exterior orientation element error of the camera for analysis, compensates the target position accordingly, and quickly yields an accurate target position.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
Fig. 2 is a schematic diagram of positioning photographing of an unmanned aerial vehicle photoelectric platform.
Fig. 3 is a schematic diagram of image distortion, in which diagram (a) is an ideal image, diagram (b-1) is radial distortion-barrel distortion, diagram (b-2) is radial distortion-pincushion distortion, and diagram (c) is tangential distortion.
Fig. 4 is a schematic diagram of coordinates of image points before correction according to an embodiment of the present invention.
Fig. 5 is a schematic diagram of the corrected coordinates of the image point according to the embodiment of the invention.
FIG. 6 is a diagram illustrating the result of the linear fitting of the image slice and the position of the centroid according to an embodiment of the present invention.
FIG. 7 is a diagram illustrating the direct location of the centroid position according to the embodiment of the present invention.
FIG. 8 is a diagram illustrating the centroid position location results after error compensation in accordance with an embodiment of the present invention.
Detailed Description
The invention is further illustrated with reference to the following figures and examples.
The invention aims to provide a passive target rapid positioning algorithm based on a UAV active photoelectric platform that can rapidly output the target position while the UAV flies at low altitude, overcoming the usage restrictions, poor positioning accuracy and low real-time performance of existing collinear photogrammetric positioning techniques.
As shown in fig. 1, a passive target fast positioning method based on an unmanned aerial vehicle active photoelectric platform includes the following steps:
step 1: identifying a target by an unmanned aerial vehicle photoelectric platform, and determining the position of the target in a view field;
step 2: a camera mounted on the photoelectric platform under the UAV photographs the target. The focal length f of the onboard camera is changed, and a picture of the target is taken when the target occupies more than one fifth of the camera frame;
and step 3: determining the position of the target in the geodetic coordinate system;
step 3-1: distortion correction is carried out on the imaging data;
As shown in fig. 3, lens distortion falls into three categories. Radial distortion, caused by curvature error of the camera lens, makes the actual image point fall farther from or closer to the image center than the ideal point, and is the main component of the distortion. Tangential distortion arises because the optical centers of the lens elements are not strictly coaxial when the lens is assembled. The distortion correction formula is:
$$x_{ui} = x_{vi} + x_{vi}\left(k_1 r_i^2 + k_2 r_i^4\right) + p_1\left(r_i^2 + 2x_{vi}^2\right) + 2p_2 x_{vi} y_{vi} + s_1 r_i^2$$
$$y_{ui} = y_{vi} + y_{vi}\left(k_1 r_i^2 + k_2 r_i^4\right) + p_2\left(r_i^2 + 2y_{vi}^2\right) + 2p_1 x_{vi} y_{vi} + s_2 r_i^2, \qquad r_i^2 = x_{vi}^2 + y_{vi}^2 \tag{1}$$
where k_1, k_2, p_1, p_2, s_1, s_2 are the distortion coefficients, (x_{ui}, y_{ui}) are the ideal coordinates in the imaging coordinate system under the assumed error-free, distortion-free state, and (x_{vi}, y_{vi}) are the coordinates actually measured in the imaging coordinate system;
the distortion coefficient is calibrated by three pairs of known points, and the ideal coordinates of the three pairs of known points are assumed as follows: (x)u1,yu1),(xu2,yu2),(xu3,yu3) The actual coordinates are: (x)v1,yv1),(xv2,yv2),(xv3,yv3) Then the distortion coefficient is expressed as:
$$P = A^{-1} V \tag{2}$$
wherein:
$$P = \left[k_1, k_2, p_1, p_2, s_1, s_2\right]^T$$
$$V = \left[x_{u1}-x_{v1},\; y_{u1}-y_{v1},\; x_{u2}-x_{v2},\; y_{u2}-y_{v2},\; x_{u3}-x_{v3},\; y_{u3}-y_{v3}\right]^T$$
$$A = \begin{bmatrix}
x_{v1}r_1^2 & x_{v1}r_1^4 & r_1^2+2x_{v1}^2 & 2x_{v1}y_{v1} & r_1^2 & 0\\
y_{v1}r_1^2 & y_{v1}r_1^4 & 2x_{v1}y_{v1} & r_1^2+2y_{v1}^2 & 0 & r_1^2\\
x_{v2}r_2^2 & x_{v2}r_2^4 & r_2^2+2x_{v2}^2 & 2x_{v2}y_{v2} & r_2^2 & 0\\
y_{v2}r_2^2 & y_{v2}r_2^4 & 2x_{v2}y_{v2} & r_2^2+2y_{v2}^2 & 0 & r_2^2\\
x_{v3}r_3^2 & x_{v3}r_3^4 & r_3^2+2x_{v3}^2 & 2x_{v3}y_{v3} & r_3^2 & 0\\
y_{v3}r_3^2 & y_{v3}r_3^4 & 2x_{v3}y_{v3} & r_3^2+2y_{v3}^2 & 0 & r_3^2
\end{bmatrix}, \qquad r_i^2 = x_{vi}^2 + y_{vi}^2$$
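As an illustrative aside (not part of the patent text), the correction of step 3-1 follows directly from the distortion model: radial terms k1, k2, tangential terms p1, p2 and thin-prism terms s1, s2 applied to the measured coordinates. The sketch below assumes the common Brown-style formulation with coordinates taken relative to the principal point:

```python
def undistort_point(xv, yv, k1, k2, p1, p2, s1, s2):
    """Map a measured image point (xv, yv) to its ideal position (xu, yu)
    using radial (k1, k2), tangential (p1, p2) and thin-prism (s1, s2)
    distortion coefficients. Coordinates are relative to the principal point."""
    r2 = xv * xv + yv * yv            # squared radial distance r^2
    radial = k1 * r2 + k2 * r2 * r2   # radial factor k1*r^2 + k2*r^4
    xu = xv + xv * radial + p1 * (r2 + 2 * xv * xv) + 2 * p2 * xv * yv + s1 * r2
    yu = yv + yv * radial + p2 * (r2 + 2 * yv * yv) + 2 * p1 * xv * yv + s2 * r2
    return xu, yu
```

With all six coefficients zero the point is returned unchanged; a nonzero k1 simply rescales points radially, as in the radial-distortion panels of fig. 3.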
step 3-2: determining centroid position of target
The geometric shape of the target is analyzed and a least-squares linear fit of its outline is performed; the fitting result is expressed as:
f(x)=ax+b (3)
Let (x_i, y_i), i = 1, 2, ..., n, denote the coordinates of the target edge points, in the imaging coordinate system, in the captured picture, where n is the number of edge points; equation (3) then satisfies:
$$\min_{a,b} \sum_{i=1}^{n}\left[y_i - \left(a x_i + b\right)\right]^2 \tag{4}$$
where a and b are the coefficients obtained by the linear fit; the centroid coordinates (x, y) of the target in the imaging coordinate system are then computed as:
[Equation (5), which computes the centroid (x, y) from the fitted edge lines, appears only as an image in the original and is not reproduced here.]
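The least-squares line fit of equation (3) has a familiar closed form. The sketch below is illustrative and not taken from the patent:

```python
def fit_line(points):
    """Least-squares fit of f(x) = a*x + b to points (x_i, y_i),
    minimizing sum_i (y_i - a*x_i - b)^2."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
    b = (sy - a * sx) / n                          # intercept
    return a, b
```

Fitting each of the four target edges this way yields four lines (cf. fig. 6) whose intersections bound the target and from which the centroid can be derived.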
step 3-3: performing coordinate conversion on the centroid coordinates (x, y);
converting the target centroid coordinates (x, y) from the imaging coordinate system to a geodetic coordinate system, wherein the coordinates of a target point in the imaging coordinate system, the coordinates of a target point in the geodetic coordinate system and the projection center of the camera optical system satisfy a collinear equation:
$$x - x_0 = -f\,\frac{a_1\left(X - X_s\right) + b_1\left(Y - Y_s\right) + c_1\left(Z - Z_s\right)}{a_3\left(X - X_s\right) + b_3\left(Y - Y_s\right) + c_3\left(Z - Z_s\right)}$$
$$y - y_0 = -f\,\frac{a_2\left(X - X_s\right) + b_2\left(Y - Y_s\right) + c_2\left(Z - Z_s\right)}{a_3\left(X - X_s\right) + b_3\left(Y - Y_s\right) + c_3\left(Z - Z_s\right)} \tag{6}$$
where (X, Y, Z) are the coordinates of the target centroid in the geodetic coordinate system, with Z set to 0 when the area to be measured is treated as flat; x_0, y_0, f are the interior orientation elements of the camera, known parameters calibrated when the camera leaves the factory, (x_0, y_0) being the image coordinates of the principal point; (X_s, Y_s, Z_s) are the position coordinates of the projection center of the camera optical system; and a_i, b_i, c_i are the direction cosines, expressed as:
$$\begin{cases}
a_1 = \cos\varphi\cos\kappa - \sin\varphi\sin\omega\sin\kappa\\
a_2 = -\cos\varphi\sin\kappa - \sin\varphi\sin\omega\cos\kappa\\
a_3 = -\sin\varphi\cos\omega\\
b_1 = \cos\omega\sin\kappa\\
b_2 = \cos\omega\cos\kappa\\
b_3 = -\sin\omega\\
c_1 = \sin\varphi\cos\kappa + \cos\varphi\sin\omega\sin\kappa\\
c_2 = -\sin\varphi\sin\kappa + \cos\varphi\sin\omega\cos\kappa\\
c_3 = \cos\varphi\cos\omega
\end{cases} \tag{7}$$
where φ, ω and κ are respectively the heading tilt angle, side tilt angle and image rotation angle among the exterior orientation elements of the camera, i.e. the rotation angles of the camera about the Z, Y and X axes of the geodetic coordinate system; together with X_s, Y_s, Z_s they form the exterior orientation elements of the camera;
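To illustrate step 3-3, the sketch below inverts the collinearity equations onto a flat-ground plane Z = 0. The phi-omega-kappa matrix used here follows one common photogrammetric convention and is an assumption standing in for the patent's equation (7):

```python
import math

def rotation_matrix(phi, omega, kappa):
    """Direction cosines (a_i, b_i, c_i) for a phi-omega-kappa rotation
    (heading tilt, side tilt, image rotation) in one common
    photogrammetric convention."""
    cp, sp = math.cos(phi), math.sin(phi)
    co, so = math.cos(omega), math.sin(omega)
    ck, sk = math.cos(kappa), math.sin(kappa)
    a = (cp * ck - sp * so * sk, -cp * sk - sp * so * ck, -sp * co)
    b = (co * sk, co * ck, -so)
    c = (sp * ck + cp * so * sk, -sp * sk + cp * so * ck, cp * co)
    return a, b, c

def image_to_ground(x, y, x0, y0, f, Xs, Ys, Zs, R, Z=0.0):
    """Invert the collinearity equations: project image point (x, y)
    through the camera at (Xs, Ys, Zs) onto the plane of height Z."""
    (a1, a2, a3), (b1, b2, b3), (c1, c2, c3) = R
    dx, dy = x - x0, y - y0
    denom = c1 * dx + c2 * dy - c3 * f
    X = Xs + (Z - Zs) * (a1 * dx + a2 * dy - a3 * f) / denom
    Y = Ys + (Z - Zs) * (b1 * dx + b2 * dy - b3 * f) / denom
    return X, Y
```

For a nadir-looking camera (all angles zero) at height 100 m with f = 0.05 m, an image offset of 0.01 m maps to a ground offset of 20 m, i.e. the familiar scale factor Zs/f.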
step 3-4: calculating the error range of the target positioning result;
the error range of the target point coordinates is expressed as follows by taking the derivation of equation (6):
Figure BDA0002912762680000091
and 4, step 4: and (3) calculating the absolute value of the formula (8) to obtain the error of the coordinates of the target point, thereby compensating the positioning result:
Figure BDA0002912762680000092
wherein:
Figure BDA0002912762680000093
Figure BDA0002912762680000094
Figure BDA0002912762680000095
the measurement result after error compensation is:
Figure BDA0002912762680000096
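Steps 3-4 and 4 amount to first-order error propagation of the exterior orientation errors through equation (6). As an illustrative alternative to the analytic partial derivatives (which the original gives only in image form), the bound can be estimated numerically; the function below is a sketch, not the patent's formula:

```python
def first_order_error_bound(fn, params, sigmas, h=1e-6):
    """Worst-case first-order error  sum_i |d fn / d p_i| * sigma_i,
    where fn maps a parameter list to a scalar (e.g. ground coordinate X)
    and sigmas are the assumed magnitudes of the parameter errors.
    Each partial derivative is estimated by a central difference."""
    total = 0.0
    for i, sigma in enumerate(sigmas):
        up = list(params); up[i] += h
        dn = list(params); dn[i] -= h
        partial = (fn(up) - fn(dn)) / (2 * h)  # central difference
        total += abs(partial) * sigma
    return total
```

Applied to the ground-coordinate function with the UAV attitude, position and frame-angle error magnitudes as sigmas, this plays the role of equation (8); the compensated result then corrects the directly computed position by the resulting error terms, as in step 4.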
the specific embodiment is as follows:
the region to be measured of the present embodiment can be assumed to be a flat area. And carrying out error analysis on the internal and external orientation elements of the camera, regarding the attitude error and the position error of the unmanned aerial vehicle and the errors of the light level platform frame angle as Gaussian distribution, and analyzing the external orientation element errors of the camera.
The unmanned aerial vehicle rapid positioning system mainly comprises the following three parts:
(1) photographing and imaging by using the photoelectric platform of the unmanned aerial vehicle;
(2) determining the position;
(3) and calculating an error range.
When the UAV photoelectric platform observes a ground target occupying more than one fifth of the area of the onboard camera frame, a photo of the target location is taken, the target position is estimated from the photo, and the error range of the target position is rapidly calculated.
1. The UAV photoelectric platform identifies the target and determines its position in the field of view;
2. As shown in fig. 2, a camera mounted on the photoelectric platform under the UAV photographs the target. The focal length f of the onboard camera is changed, and a picture of the target is taken when the target occupies more than one fifth of the camera frame;
3. Distortion correction is performed on the imaging data; fig. 4 shows the image point coordinates before correction, and fig. 5 the coordinates after correction.
4. The centroid position of the target is determined as shown in fig. 6. The hollow rhombus marks the position of the centroid of the target, the four straight lines are linear fitting results of the four edges of the target, and the points on the four edges are corrected image points.
5. The result of the centroid position coordinate transformation is shown in fig. 7, in which the black triangle represents the actual position of the target centroid, and the black square represents the target centroid position localization result obtained by direct calculation.
6. The final centroid localization result after error compensation is shown in fig. 8, in which the black triangle represents the actual position of the target centroid, the black square represents the target centroid position localization result obtained by direct calculation, and the black diamond represents the target centroid position localization result obtained after error compensation.

Claims (2)

1. A passive target rapid positioning method based on an unmanned aerial vehicle active photoelectric platform is characterized by comprising the following steps:
step 1: identifying a target by an unmanned aerial vehicle photoelectric platform, and determining the position of the target in a view field;
step 2: changing the focal length f of the onboard camera of the unmanned aerial vehicle photoelectric platform, and taking a picture of the target when the target occupies more than 1/L of the onboard camera frame;
and step 3: determining the position of the target in the geodetic coordinate system;
step 3-1: distortion correction is carried out on the imaging data;
the distortion correction formula is:
$$x_{ui} = x_{vi} + x_{vi}\left(k_1 r_i^2 + k_2 r_i^4\right) + p_1\left(r_i^2 + 2x_{vi}^2\right) + 2p_2 x_{vi} y_{vi} + s_1 r_i^2$$
$$y_{ui} = y_{vi} + y_{vi}\left(k_1 r_i^2 + k_2 r_i^4\right) + p_2\left(r_i^2 + 2y_{vi}^2\right) + 2p_1 x_{vi} y_{vi} + s_2 r_i^2, \qquad r_i^2 = x_{vi}^2 + y_{vi}^2 \tag{1}$$
where k_1, k_2, p_1, p_2, s_1, s_2 are the distortion coefficients, (x_{ui}, y_{ui}) are the ideal coordinates in the imaging coordinate system under the assumed error-free, distortion-free state, and (x_{vi}, y_{vi}) are the coordinates actually measured in the imaging coordinate system;
the distortion coefficient is calibrated by three pairs of known points, and the ideal coordinates of the three pairs of known points are assumed as follows: (x)u1,yu1),(xu2,yu2),(xu3,yu3) The actual coordinates are: (x)v1,yv1),(xv2,yv2),(xv3,yv3) Then the distortion coefficient is expressed as:
$$P = A^{-1} V \tag{2}$$
wherein:
$$P = \left[k_1, k_2, p_1, p_2, s_1, s_2\right]^T$$
$$V = \left[x_{u1}-x_{v1},\; y_{u1}-y_{v1},\; x_{u2}-x_{v2},\; y_{u2}-y_{v2},\; x_{u3}-x_{v3},\; y_{u3}-y_{v3}\right]^T$$
$$A = \begin{bmatrix}
x_{v1}r_1^2 & x_{v1}r_1^4 & r_1^2+2x_{v1}^2 & 2x_{v1}y_{v1} & r_1^2 & 0\\
y_{v1}r_1^2 & y_{v1}r_1^4 & 2x_{v1}y_{v1} & r_1^2+2y_{v1}^2 & 0 & r_1^2\\
x_{v2}r_2^2 & x_{v2}r_2^4 & r_2^2+2x_{v2}^2 & 2x_{v2}y_{v2} & r_2^2 & 0\\
y_{v2}r_2^2 & y_{v2}r_2^4 & 2x_{v2}y_{v2} & r_2^2+2y_{v2}^2 & 0 & r_2^2\\
x_{v3}r_3^2 & x_{v3}r_3^4 & r_3^2+2x_{v3}^2 & 2x_{v3}y_{v3} & r_3^2 & 0\\
y_{v3}r_3^2 & y_{v3}r_3^4 & 2x_{v3}y_{v3} & r_3^2+2y_{v3}^2 & 0 & r_3^2
\end{bmatrix}, \qquad r_i^2 = x_{vi}^2 + y_{vi}^2$$
step 3-2: determining centroid position of target
And performing linear fitting on the appearance of the target by adopting a least square method, wherein the fitting result is expressed as:
f(x)=ax+b (3)
Let (x_i, y_i), i = 1, 2, ..., n, denote the coordinates of the target edge points, in the imaging coordinate system, in the captured picture, where n is the number of edge points; equation (3) then satisfies:
$$\min_{a,b} \sum_{i=1}^{n}\left[y_i - \left(a x_i + b\right)\right]^2 \tag{4}$$
where a and b are the coefficients obtained by the linear fit; the centroid coordinates (x, y) of the target in the imaging coordinate system are then computed as:
[Equation (5), which computes the centroid (x, y) from the fitted edge lines, appears only as an image in the original and is not reproduced here.]
step 3-3: performing coordinate conversion on the centroid coordinates (x, y);
converting the target centroid coordinates (x, y) from the imaging coordinate system to a geodetic coordinate system, wherein the coordinates of a target point in the imaging coordinate system, the coordinates of a target point in the geodetic coordinate system and the projection center of the camera optical system satisfy a collinear equation:
$$x - x_0 = -f\,\frac{a_1\left(X - X_s\right) + b_1\left(Y - Y_s\right) + c_1\left(Z - Z_s\right)}{a_3\left(X - X_s\right) + b_3\left(Y - Y_s\right) + c_3\left(Z - Z_s\right)}$$
$$y - y_0 = -f\,\frac{a_2\left(X - X_s\right) + b_2\left(Y - Y_s\right) + c_2\left(Z - Z_s\right)}{a_3\left(X - X_s\right) + b_3\left(Y - Y_s\right) + c_3\left(Z - Z_s\right)} \tag{6}$$
where (X, Y, Z) are the coordinates of the target centroid in the geodetic coordinate system, with Z set to 0 when the area to be measured is treated as flat; x_0, y_0, f are the interior orientation elements of the camera, known parameters calibrated when the camera leaves the factory, (x_0, y_0) being the image coordinates of the principal point; (X_s, Y_s, Z_s) are the position coordinates of the projection center of the camera optical system; and a_i, b_i, c_i are the direction cosines, expressed as:
$$\begin{cases}
a_1 = \cos\varphi\cos\kappa - \sin\varphi\sin\omega\sin\kappa\\
a_2 = -\cos\varphi\sin\kappa - \sin\varphi\sin\omega\cos\kappa\\
a_3 = -\sin\varphi\cos\omega\\
b_1 = \cos\omega\sin\kappa\\
b_2 = \cos\omega\cos\kappa\\
b_3 = -\sin\omega\\
c_1 = \sin\varphi\cos\kappa + \cos\varphi\sin\omega\sin\kappa\\
c_2 = -\sin\varphi\sin\kappa + \cos\varphi\sin\omega\cos\kappa\\
c_3 = \cos\varphi\cos\omega
\end{cases} \tag{7}$$
where φ, ω and κ are respectively the heading tilt angle, side tilt angle and image rotation angle among the exterior orientation elements of the camera, i.e. the rotation angles of the camera about the Z, Y and X axes of the geodetic coordinate system; together with X_s, Y_s, Z_s they form the exterior orientation elements of the camera;
step 3-4: calculating the error range of the target positioning result;
the error range of the target point coordinates is expressed as follows by taking the derivation of equation (6):
Figure FDA0002912762670000033
and 4, step 4: and (3) calculating the absolute value of the formula (8) to obtain the error of the coordinates of the target point, thereby compensating the positioning result:
Figure FDA0002912762670000041
wherein:
Figure FDA0002912762670000042
Figure FDA0002912762670000043
Figure FDA0002912762670000044
the measurement result after error compensation is:
Figure FDA0002912762670000045
2. The passive target rapid positioning method based on the unmanned aerial vehicle active photoelectric platform according to claim 1, characterized in that L = 5.
CN202110091538.8A 2021-01-23 2021-01-23 Passive target rapid positioning method based on unmanned aerial vehicle active photoelectric platform Active CN112950719B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110091538.8A CN112950719B (en) 2021-01-23 2021-01-23 Passive target rapid positioning method based on unmanned aerial vehicle active photoelectric platform

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110091538.8A CN112950719B (en) 2021-01-23 2021-01-23 Passive target rapid positioning method based on unmanned aerial vehicle active photoelectric platform

Publications (2)

Publication Number Publication Date
CN112950719A true CN112950719A (en) 2021-06-11
CN112950719B CN112950719B (en) 2024-06-04

Family

ID=76236046

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110091538.8A Active CN112950719B (en) 2021-01-23 2021-01-23 Passive target rapid positioning method based on unmanned aerial vehicle active photoelectric platform

Country Status (1)

Country Link
CN (1) CN112950719B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114322940A (en) * 2021-12-02 2022-04-12 中国人民解放军96796部队 Method and system for measuring air explosion position center through air-ground integrated multi-view intersection
CN115144879A (en) * 2022-07-01 2022-10-04 燕山大学 Multi-machine multi-target dynamic positioning system and method
WO2023015566A1 (en) * 2021-08-13 2023-02-16 深圳市大疆创新科技有限公司 Control method, control device, movable platform, and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004092826A1 (en) * 2003-04-18 2004-10-28 Appro Technology Inc. Method and system for obtaining optical parameters of camera
CN104501779A (en) * 2015-01-09 2015-04-08 中国人民解放军63961部队 High-accuracy target positioning method of unmanned plane on basis of multi-station measurement
CN104574415A (en) * 2015-01-26 2015-04-29 南京邮电大学 Target space positioning method based on single camera
CN107976899A (en) * 2017-11-15 2018-05-01 中国人民解放军海军航空工程学院 A kind of precision target positioning and striking method based on someone/unmanned plane cooperative engagement systems
CN109684909A (en) * 2018-10-11 2019-04-26 武汉工程大学 A kind of unmanned plane target key point real-time location method, system and storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004092826A1 (en) * 2003-04-18 2004-10-28 Appro Technology Inc. Method and system for obtaining optical parameters of camera
CN104501779A (en) * 2015-01-09 2015-04-08 中国人民解放军63961部队 High-accuracy target positioning method of unmanned plane on basis of multi-station measurement
CN104574415A (en) * 2015-01-26 2015-04-29 南京邮电大学 Target space positioning method based on single camera
CN107976899A (en) * 2017-11-15 2018-05-01 中国人民解放军海军航空工程学院 A kind of precision target positioning and striking method based on someone/unmanned plane cooperative engagement systems
CN109684909A (en) * 2018-10-11 2019-04-26 武汉工程大学 A kind of unmanned plane target key point real-time location method, system and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHANG Xiaofei; DUAN Lijuan; FU Wenxing; YAN Jie: "UAV climb trajectory design based on the A* algorithm" (基于A*算法的无人机爬升轨迹设计), Flight Dynamics (飞行力学), no. 06, 15 December 2008 (2008-12-15) *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023015566A1 (en) * 2021-08-13 2023-02-16 深圳市大疆创新科技有限公司 Control method, control device, movable platform, and storage medium
CN114322940A (en) * 2021-12-02 2022-04-12 中国人民解放军96796部队 Method and system for measuring air explosion position center through air-ground integrated multi-view intersection
CN114322940B (en) * 2021-12-02 2024-04-09 中国人民解放军96796部队 Method and system for measuring center of idle explosion position by air-ground integrated multi-purpose intersection
CN115144879A (en) * 2022-07-01 2022-10-04 燕山大学 Multi-machine multi-target dynamic positioning system and method

Also Published As

Publication number Publication date
CN112950719B (en) 2024-06-04

Similar Documents

Publication Publication Date Title
CN112950719A (en) Passive target rapid positioning method based on unmanned aerial vehicle active photoelectric platform
US7479982B2 (en) Device and method of measuring data for calibration, program for measuring data for calibration, program recording medium readable with computer, and image data processing device
CN107316325B (en) Airborne laser point cloud and image registration fusion method based on image registration
CN106127697B (en) EO-1 hyperion geometric correction method is imaged in unmanned aerial vehicle onboard
CN103674063B (en) A kind of optical remote sensing camera geometric calibration method in-orbit
CN103822615B (en) A kind of multi-control point extracts and the unmanned aerial vehicle target real-time location method be polymerized automatically
CN106373159A (en) Simplified unmanned aerial vehicle multi-target location method
CN107564057B (en) High-orbit planar array optical satellite in-orbit geometric calibration method considering atmospheric refraction correction
CN113538595B (en) Method for improving geometric precision of remote sensing stereo image by using laser height measurement data in auxiliary manner
CN106595700A (en) Target channel space reference calibration method based on three-point coordinate measurement
CN113947638B (en) Method for correcting orthographic image of fish-eye camera
CN109341720A (en) A kind of remote sensing camera geometric calibration method based on fixed star track
CN113793270A (en) Aerial image geometric correction method based on unmanned aerial vehicle attitude information
CN111508028A (en) Autonomous in-orbit geometric calibration method and system for optical stereo mapping satellite camera
CN107784633B (en) Unmanned aerial vehicle aerial image calibration method suitable for plane measurement
CN113340272B (en) Ground target real-time positioning method based on micro-group of unmanned aerial vehicle
CN110595374A (en) Large structural part real-time deformation monitoring method based on image transmission machine
CN111561867A (en) Airplane surface appearance digital measurement method
CN114693807B (en) Method and system for reconstructing mapping data of power transmission line image and point cloud
CN116563699A (en) Forest fire positioning method combining sky map and mobile phone image
CN114608540A (en) Measurement network type determining method of digital photogrammetric system
CN108413989B (en) Distortion correction method based on image re-projection
CN111595289A (en) Three-dimensional angle measurement system and method based on image processing
CN118111478B (en) Nacelle system error rapid calibration method based on map matching
CN115994854B (en) Method and system for registering marker point cloud and image

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant