CN112950719B - Passive target rapid positioning method based on unmanned aerial vehicle active photoelectric platform - Google Patents


Info

Publication number
CN112950719B
Authority
CN
China
Prior art keywords
target
coordinate system
coordinates
unmanned aerial
aerial vehicle
Prior art date
Legal status
Active
Application number
CN202110091538.8A
Other languages
Chinese (zh)
Other versions
CN112950719A (en)
Inventor
张通
王晨昕
庞明慧
江奕蕾
刘春江
符文星
杨忠龙
张添钧
Current Assignee
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date
Filing date
Publication date
Application filed by Northwestern Polytechnical University
Priority to CN202110091538.8A
Publication of CN112950719A
Application granted
Publication of CN112950719B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/20 Instruments for performing navigational calculations
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/80 Geometric correction


Abstract

The invention discloses a passive target rapid positioning method based on an unmanned aerial vehicle active photoelectric platform. First, the unmanned aerial vehicle photoelectric platform identifies a target and determines its position in the field of view; then the focal length f of the onboard camera of the photoelectric platform is changed, and the target is photographed once it occupies more than one fifth of the camera photo; the position coordinates of the target centroid in the imaging coordinate system are determined from the photograph and converted into the geodetic coordinate system; finally, the error range of the target positioning result is calculated and compensated. The method merges the attitude error and position error of the unmanned aerial vehicle and the error of the photoelectric platform frame angles into the exterior orientation element error of the camera for analysis, reasonably compensates the target position, and can quickly obtain accurate positioning of the target.

Description

Passive target rapid positioning method based on unmanned aerial vehicle active photoelectric platform
Technical Field
The invention belongs to the technical field of unmanned aerial vehicles, and particularly relates to a passive target rapid positioning method.
Background
With the development of science and technology, unmanned combat systems are being applied more and more widely in the military field. When an unmanned aerial vehicle strikes ground targets, calculating the positioning error of the unmanned aerial vehicle with respect to the ground target is a key link: correcting and compensating the various distortions in the positioning process improves the accuracy of the positioning result, and combining error analysis with calculation of the error range then provides target indication for subsequent precise strikes.
Target positioning based on unmanned aerial vehicles mainly comprises three methods: positioning based on collinearity, positioning based on image matching, and positioning based on attitude measurement or laser ranging.
Among them, positioning based on collinear photography has many limitations: it is generally assumed that the target area to be measured is flat, the interior and exterior orientation elements of the camera must be acquired, and errors may exist in both the interior and exterior orientation elements.
Positioning based on image matching requires a reference image to be established in advance and matched against the corrected unmanned aerial vehicle image, which limits the conditions under which this mode can be used; moreover, the real-time performance of image matching is poor, so it is unsuitable for rapid positioning by an unmanned aerial vehicle.
Positioning based on attitude measurement or laser ranging requires the unmanned aerial vehicle to hover over the target and measure range by laser, which is clearly unsuitable for rapid positioning.
Disclosure of Invention
In order to overcome the defects in the prior art, the invention provides a passive target rapid positioning method based on an unmanned aerial vehicle active photoelectric platform. First, the unmanned aerial vehicle photoelectric platform identifies a target and determines its position in the field of view; then the focal length f of the onboard camera of the photoelectric platform is changed, and the target is photographed once it occupies more than one fifth of the camera photo; the position coordinates of the target centroid in the imaging coordinate system are determined from the photograph and converted into the geodetic coordinate system; finally, the error range of the target positioning result is calculated and compensated. The method merges the attitude error and position error of the unmanned aerial vehicle and the error of the photoelectric platform frame angles into the exterior orientation element error of the camera for analysis, reasonably compensates the target position, and can quickly obtain accurate positioning of the target.
The technical scheme adopted by the invention for solving the technical problems comprises the following steps:
step 1: the unmanned aerial vehicle photoelectric platform identifies the target and determines the position of the target in the field of view;
step 2: changing the focal length f of the onboard camera of the unmanned aerial vehicle photoelectric platform, and photographing the target when the target occupies more than 1/L of the onboard camera photo;
step 3: determining the position of a target in a geodetic coordinate system;
Step 3-1: performing distortion correction on the imaging data;
The distortion correction formula is:
Wherein k1, k2, p1, p2, s1, s2 are the distortion coefficients, (x_ui, y_ui) are the ideal coordinates in the imaging coordinate system under the assumed error-free and distortion-free state, and (x_vi, y_vi) are the coordinates measured in the imaging coordinate system during actual imaging;
The distortion coefficients are calibrated from three pairs of known points. Assuming the ideal coordinates of the three pairs of known points are (x_u1, y_u1), (x_u2, y_u2), (x_u3, y_u3) and the actual coordinates are (x_v1, y_v1), (x_v2, y_v2), (x_v3, y_v3), the distortion coefficient vector is expressed as:
Wherein:
P = [k1, k2, p1, p2, s1, s2]^T
step 3-2: determining centroid position of target
The shape of the target is linearly fitted by adopting a least square method, and the fitting result is expressed as follows:
f(x)=ax+b (3)
Taking (x_i, y_i), i = 1, 2, 3, …, n to represent the coordinates of the target edge points in the imaging coordinate system in the taken photo, where n is the number of target edge points, formula (3) satisfies:
wherein a and b are coefficients obtained by linear fitting, and centroid coordinates (x, y) of the target in an imaging coordinate system are obtained, and the calculation formula is as follows:
step 3-3: coordinate conversion is carried out on centroid coordinates (x, y);
converting the coordinates (x, y) of the target centroid from the imaging coordinate system to the geodetic coordinate system, wherein the target point coordinates in the imaging coordinate system, the target point coordinates in the geodetic coordinate system, and the projection center of the camera optical system satisfy the collinearity equation:

x − x0 = −f · [a1(X − Xs) + b1(Y − Ys) + c1(Z − Zs)] / [a3(X − Xs) + b3(Y − Ys) + c3(Z − Zs)]
y − y0 = −f · [a2(X − Xs) + b2(Y − Ys) + c2(Z − Zs)] / [a3(X − Xs) + b3(Y − Ys) + c3(Z − Zs)]   (6)

Wherein, (X, Y, Z) are the coordinates of the target centroid in the geodetic coordinate system; the region to be measured is regarded as flat, so Z = 0; x0, y0, f are the interior orientation elements of the camera, known parameters calibrated when the camera leaves the factory, with (x0, y0) the principal point coordinates; (Xs, Ys, Zs) are the position coordinates of the projection center of the camera optical system; and ai, bi, ci (i = 1, 2, 3) are direction cosines, respectively expressed as:
Wherein φ, ω, and κ are respectively the heading tilt angle, the side tilt angle, and the photo rotation angle among the exterior orientation elements of the camera, representing rotations of the camera about the Z axis, Y axis, and X axis of the geodetic coordinate system; together with Xs, Ys, Zs they constitute the exterior orientation elements of the camera;
step 3-4: calculating an error range of a target positioning result;
Differentiating equation (6), the error range of the target point coordinates is expressed as:
Step 4: the absolute value of the formula (8) is cancelled and calculated to obtain the error of the coordinates of the target point, so that the positioning result is compensated:
Wherein:
the measurement result after error compensation is:
Preferably, L = 5.
The beneficial effects of the invention are as follows:
1. The target rapid positioning method provided by the invention can output the position information of the target in real time.
2. The target rapid positioning method provided by the invention merges the attitude error and position error of the unmanned aerial vehicle and the error of the photoelectric platform frame angles into the exterior orientation element error of the camera for analysis, reasonably compensates the target position, and can quickly obtain accurate positioning of the target.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
Fig. 2 is a schematic view of unmanned aerial vehicle photoelectric platform positioning photographing.
Fig. 3 is a schematic view of image distortion, wherein fig. (a) is an ideal image, fig. (b-1) is radial distortion-barrel distortion, fig. (b-2) is radial distortion-pincushion distortion, and fig. (c) is tangential distortion.
FIG. 4 is a schematic diagram of pixel coordinates before correction according to an embodiment of the present invention.
Fig. 5 is a schematic diagram of corrected pixel coordinates according to an embodiment of the invention.
FIG. 6 is a graph showing the result of linear fitting and centroid position according to an embodiment of the present invention.
FIG. 7 is a graph showing the result of direct centroid position location in accordance with an embodiment of the present invention.
FIG. 8 is a graph showing the result of the centroid position location after error compensation according to the embodiment of the present invention.
Detailed Description
The invention will be further described with reference to the drawings and examples.
The invention aims to provide a passive target rapid positioning algorithm based on an unmanned aerial vehicle active photoelectric platform, which can rapidly output the position of a target during low-altitude flight of the unmanned aerial vehicle and which addresses the problems of the existing collinear photographic positioning technology: many restrictions on use, poor positioning accuracy, and poor real-time performance.
As shown in fig. 1, a passive target rapid positioning method based on an unmanned aerial vehicle active photoelectric platform comprises the following steps:
step 1: the unmanned aerial vehicle photoelectric platform identifies the target and determines the position of the target in the field of view;
Step 2: and photographing and imaging the target through a camera arranged on a lower photoelectric platform of the unmanned aerial vehicle. Changing the focal length f of an onboard camera of the unmanned aerial vehicle photoelectric platform, and photographing the target when the position occupied by the target reaches more than 1 of 5 minutes of the photo of the onboard camera;
step 3: determining the position of a target in a geodetic coordinate system;
Step 3-1: performing distortion correction on the imaging data;
As shown in fig. 3, lens distortion is classified into three types. Radial distortion is caused by curvature errors of the camera lens, makes the actual image point fall farther from or closer to the optical axis than the ideal image point, and is the main component of distortion; tangential distortion is caused by the optical centers of the lens elements not being strictly coaxial when the lenses are assembled; thin-prism distortion, associated with the coefficients s1 and s2, arises from imperfections in lens design and assembly. The distortion correction formula is:
Wherein k1, k2, p1, p2, s1, s2 are the distortion coefficients, (x_ui, y_ui) are the ideal coordinates in the imaging coordinate system under the assumed error-free and distortion-free state, and (x_vi, y_vi) are the coordinates measured in the imaging coordinate system during actual imaging;
The distortion coefficients are calibrated from three pairs of known points. Assuming the ideal coordinates of the three pairs of known points are (x_u1, y_u1), (x_u2, y_u2), (x_u3, y_u3) and the actual coordinates are (x_v1, y_v1), (x_v2, y_v2), (x_v3, y_v3), the distortion coefficient vector is expressed as:
Wherein:
P = [k1, k2, p1, p2, s1, s2]^T
step 3-2: determining centroid position of target
By analyzing the geometric shape of the target, the shape of the target is linearly fitted by adopting a least square method, and the fitting result is expressed as follows:
f(x)=ax+b (3)
Taking (x_i, y_i), i = 1, 2, 3, …, n to represent the coordinates of the target edge points in the imaging coordinate system in the taken photo, where n is the number of target edge points, formula (3) satisfies:
wherein a and b are coefficients obtained by linear fitting, and centroid coordinates (x, y) of the target in an imaging coordinate system are obtained, and the calculation formula is as follows:
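The least-squares fit of step 3-2 can be sketched as follows. Formula (5) for the centroid is not reproduced in the text; the sketch assumes, as the embodiment's fig. 6 suggests, that each of the four sides of the target is fitted with f(x) = ax + b and that the centroid is taken as the mean of the corner intersections. The function names and the corner-averaging rule are illustrative assumptions:

```python
import numpy as np

def fit_line(pts):
    """Least-squares fit of f(x) = a*x + b to edge points (normal equations)."""
    x = np.array([p[0] for p in pts])
    y = np.array([p[1] for p in pts])
    n = len(pts)
    a = (n * (x * y).sum() - x.sum() * y.sum()) / (n * (x ** 2).sum() - x.sum() ** 2)
    b = (y.sum() - a * x.sum()) / n
    return a, b

def intersect(l1, l2):
    """Intersection of y = a1*x + b1 and y = a2*x + b2."""
    a1, b1 = l1
    a2, b2 = l2
    x = (b2 - b1) / (a1 - a2)
    return x, a1 * x + b1

def centroid_from_sides(side_pts):
    """Fit the four sides in adjacent order, intersect neighbouring fits
    to get the corners, and average the corners (illustrative rule; the
    patent's formula (5) image is not reproduced in the text)."""
    lines = [fit_line(p) for p in side_pts]
    corners = [intersect(lines[i], lines[(i + 1) % 4]) for i in range(4)]
    return tuple(np.mean(corners, axis=0))
```

Note the y = ax + b form cannot represent vertical sides; a symmetric total-least-squares fit would handle those, at the cost of a slightly longer derivation.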
step 3-3: coordinate conversion is carried out on centroid coordinates (x, y);
Converting the coordinates (x, y) of the target centroid from the imaging coordinate system to the geodetic coordinate system, wherein the target point coordinates in the imaging coordinate system, the target point coordinates in the geodetic coordinate system, and the projection center of the camera optical system satisfy the collinearity equation:

x − x0 = −f · [a1(X − Xs) + b1(Y − Ys) + c1(Z − Zs)] / [a3(X − Xs) + b3(Y − Ys) + c3(Z − Zs)]
y − y0 = −f · [a2(X − Xs) + b2(Y − Ys) + c2(Z − Zs)] / [a3(X − Xs) + b3(Y − Ys) + c3(Z − Zs)]   (6)

Wherein, (X, Y, Z) are the coordinates of the target centroid in the geodetic coordinate system; the region to be measured is regarded as flat, so Z = 0; x0, y0, f are the interior orientation elements of the camera, known parameters calibrated when the camera leaves the factory, with (x0, y0) the principal point coordinates; (Xs, Ys, Zs) are the position coordinates of the projection center of the camera optical system; and ai, bi, ci (i = 1, 2, 3) are direction cosines, respectively expressed as:
Wherein φ, ω, and κ are respectively the heading tilt angle, the side tilt angle, and the photo rotation angle among the exterior orientation elements of the camera, representing rotations of the camera about the Z axis, Y axis, and X axis of the geodetic coordinate system; together with Xs, Ys, Zs they constitute the exterior orientation elements of the camera;
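Step 3-3 can be sketched as inverting the collinearity condition for a flat scene: cast the ray through the image point into object space and intersect it with Z = 0. Because the exact angle convention of formula (7) is not reproduced here, the sketch builds the rotation matrix from the stated Z-Y-X order and checks itself by round-tripping through the forward projection; the function names and conventions are illustrative assumptions:

```python
import numpy as np

def rotation(phi, omega, kappa):
    """Rotation matrix from the three exterior-orientation angles.
    Axis order (Z, then Y, then X) follows the text's description; the
    exact convention of the patent's formula (7) is an assumption."""
    cz, sz = np.cos(phi), np.sin(phi)
    cy, sy = np.cos(omega), np.sin(omega)
    cx, sx = np.cos(kappa), np.sin(kappa)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

def ground_to_image(X, Y, Z, x0, y0, f, Xs, Ys, Zs, R):
    """Forward collinearity projection (object space -> image plane)."""
    u = R.T @ np.array([X - Xs, Y - Ys, Z - Zs])
    return x0 - f * u[0] / u[2], y0 - f * u[1] / u[2]

def image_to_ground(x, y, x0, y0, f, Xs, Ys, Zs, R):
    """Invert the collinearity equations for a flat scene (Z = 0)."""
    d = R @ np.array([x - x0, y - y0, -f])  # ray direction in object space
    t = (0.0 - Zs) / d[2]                   # scale factor that reaches Z = 0
    return Xs + t * d[0], Ys + t * d[1]
```

Since the same rotation matrix is used in both directions, the round trip is exact regardless of which textbook angle convention the patent actually uses.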
step 3-4: calculating an error range of a target positioning result;
Differentiating equation (6), the error range of the target point coordinates is expressed as:
Step 4: the absolute value of the formula (8) is cancelled and calculated to obtain the error of the coordinates of the target point, so that the positioning result is compensated:
Wherein:
the measurement result after error compensation is:
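Steps 3-4 and 4 can be illustrated numerically. Formulas (8)-(10) are not reproduced in the text, so the sketch below instead propagates Gaussian exterior-orientation errors (position, attitude, and frame-angle errors merged, as the embodiment describes) through a simplified near-nadir localization model by Monte Carlo, yielding a systematic bias to subtract and a spread that plays the role of the error range. The model, the sigma values, and all names are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def locate(x, y, f, Xs, Ys, Zs, phi, omega):
    """Simplified flat-ground localization for a near-nadir camera: small
    exterior angles tilt the ray before it intersects Z = 0 (a simplified
    stand-in for the patent's formulas (8)-(10), whose images are not
    reproduced in the text)."""
    dx = (x / f) + np.tan(phi)
    dy = (y / f) - np.tan(omega)
    return Xs + Zs * dx, Ys + Zs * dy

# nominal localization from error-free exterior orientation
nominal = locate(0.002, -0.001, 0.05, 100.0, 200.0, 500.0, 0.0, 0.0)

# Gaussian exterior-orientation errors (1-sigma values are illustrative)
n = 20000
Xs = 100.0 + rng.normal(0.0, 2.0, n)   # position error [m]
Ys = 200.0 + rng.normal(0.0, 2.0, n)
Zs = 500.0 + rng.normal(0.0, 3.0, n)
phi = rng.normal(0.0, 0.002, n)        # merged attitude/frame-angle error [rad]
omega = rng.normal(0.0, 0.002, n)
X, Y = locate(0.002, -0.001, 0.05, Xs, Ys, Zs, phi, omega)

bias = (X.mean() - nominal[0], Y.mean() - nominal[1])   # systematic offset to compensate
spread = (X.std(), Y.std())                             # 1-sigma error range on the ground
```

With zero-mean Gaussian inputs the bias is near zero here; a real flight would estimate it from calibration data and subtract it, while the spread bounds the residual positioning error.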
Specific examples:
The area to be measured in this embodiment is assumed to be flat. Error analysis is performed on the interior and exterior orientation elements of the camera: the attitude error and position error of the unmanned aerial vehicle and the error of the photoelectric platform frame angles are regarded as Gaussian-distributed and merged into the exterior orientation element error of the camera for analysis.
The unmanned aerial vehicle rapid positioning system mainly comprises the following three parts:
(1) Photographing and imaging by using an unmanned aerial vehicle photoelectric platform;
(2) Position determination;
(3) And calculating an error range.
Photographing and imaging with the unmanned aerial vehicle photoelectric platform means that when a ground target occupies more than one fifth of the area of the onboard camera frame, a photograph of the target is taken, the position of the target is estimated from the photograph, and the error range of that position is rapidly calculated.
1. The photoelectric platform identifies the target and determines the position of the target in the field of view;
2. As shown in fig. 2, a camera mounted on the photoelectric platform under the unmanned aerial vehicle photographs and images the target. The focal length f of the onboard camera is changed, and the target is photographed when it occupies more than one fifth of the onboard camera photo;
3. Distortion correction is performed on the imaging data; fig. 4 shows the image point coordinates before correction and fig. 5 the image point coordinates after correction.
4. The centroid position of the target is determined as shown in fig. 6. The position of the target centroid is marked by the open diamond, four straight lines are the linear fitting result of four sides of the target, and four points are corrected image points.
5. The result of the centroid position coordinate transformation is shown in fig. 7, wherein a black triangle indicates the actual position of the target centroid, and a black square indicates the directly calculated target centroid position positioning result.
6. The final error-compensated centroid positioning result is shown in fig. 8, wherein the black triangle represents the actual position of the target centroid, the black square represents the directly calculated target centroid positioning result, and the black diamond represents the error-compensated target centroid positioning result.

Claims (2)

1. A passive target rapid positioning method based on an unmanned aerial vehicle active photoelectric platform is characterized by comprising the following steps:
step 1: the unmanned aerial vehicle photoelectric platform identifies the target and determines the position of the target in the field of view;
step 2: changing the focal length f of the onboard camera of the unmanned aerial vehicle photoelectric platform, and photographing the target when the target occupies more than 1/L of the onboard camera photo;
step 3: determining the position of a target in a geodetic coordinate system;
Step 3-1: performing distortion correction on the imaging data;
The distortion correction formula is:
Wherein k1, k2, p1, p2, s1, s2 are the distortion coefficients, (x_ui, y_ui) are the ideal coordinates in the imaging coordinate system under the assumed error-free and distortion-free state, and (x_vi, y_vi) are the coordinates measured in the imaging coordinate system during actual imaging;
the distortion coefficients are calibrated from three pairs of known points. Assuming the ideal coordinates of the three pairs of known points are (x_u1, y_u1), (x_u2, y_u2), (x_u3, y_u3) and the actual coordinates are (x_v1, y_v1), (x_v2, y_v2), (x_v3, y_v3), the distortion coefficient vector is expressed as:
Wherein:
P = [k1, k2, p1, p2, s1, s2]^T
step 3-2: determining centroid position of target
The shape of the target is linearly fitted by adopting a least square method, and the fitting result is expressed as follows:
f(x)=ax+b (3)
Taking (x_i, y_i), i = 1, 2, 3, …, n to represent the coordinates of the target edge points in the imaging coordinate system in the taken photo, where n is the number of target edge points, formula (3) satisfies:
wherein a and b are coefficients obtained by linear fitting, and centroid coordinates (x, y) of the target in an imaging coordinate system are obtained, and the calculation formula is as follows:
step 3-3: coordinate conversion is carried out on centroid coordinates (x, y);
converting the coordinates (x, y) of the target centroid from the imaging coordinate system to the geodetic coordinate system, wherein the target point coordinates in the imaging coordinate system, the target point coordinates in the geodetic coordinate system, and the projection center of the camera optical system satisfy the collinearity equation:
Wherein, (X, Y, Z) are the coordinates of the target centroid in the geodetic coordinate system; the region to be measured is regarded as flat, so Z = 0; x0, y0, f are the interior orientation elements of the camera, known parameters calibrated when the camera leaves the factory, with (x0, y0) the principal point coordinates; (Xs, Ys, Zs) are the position coordinates of the projection center of the camera optical system; and ai, bi, ci are direction cosines, respectively expressed as:
Wherein φ, ω, and κ are respectively the heading tilt angle, the side tilt angle, and the photo rotation angle among the exterior orientation elements of the camera, representing rotations of the camera about the Z axis, Y axis, and X axis of the geodetic coordinate system; together with Xs, Ys, Zs they constitute the exterior orientation elements of the camera;
step 3-4: calculating an error range of a target positioning result;
Differentiating equation (6), the error range of the target point coordinates is expressed as:
Step 4: the absolute value of the formula (8) is cancelled and calculated to obtain the error of the coordinates of the target point, so that the positioning result is compensated:
Wherein:
the measurement result after error compensation is:
2. The passive target rapid positioning method based on an unmanned aerial vehicle active photoelectric platform according to claim 1, wherein L = 5.
CN202110091538.8A 2021-01-23 2021-01-23 Passive target rapid positioning method based on unmanned aerial vehicle active photoelectric platform Active CN112950719B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110091538.8A CN112950719B (en) 2021-01-23 2021-01-23 Passive target rapid positioning method based on unmanned aerial vehicle active photoelectric platform


Publications (2)

Publication Number Publication Date
CN112950719A CN112950719A (en) 2021-06-11
CN112950719B true CN112950719B (en) 2024-06-04

Family

ID=76236046

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110091538.8A Active CN112950719B (en) 2021-01-23 2021-01-23 Passive target rapid positioning method based on unmanned aerial vehicle active photoelectric platform

Country Status (1)

Country Link
CN (1) CN112950719B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023015566A1 (en) * 2021-08-13 2023-02-16 深圳市大疆创新科技有限公司 Control method, control device, movable platform, and storage medium
CN114322940B (en) * 2021-12-02 2024-04-09 中国人民解放军96796部队 Method and system for measuring center of idle explosion position by air-ground integrated multi-purpose intersection
CN115144879A (en) * 2022-07-01 2022-10-04 燕山大学 Multi-machine multi-target dynamic positioning system and method

Citations (5)

Publication number Priority date Publication date Assignee Title
WO2004092826A1 (en) * 2003-04-18 2004-10-28 Appro Technology Inc. Method and system for obtaining optical parameters of camera
CN104501779A (en) * 2015-01-09 2015-04-08 中国人民解放军63961部队 High-accuracy target positioning method of unmanned plane on basis of multi-station measurement
CN104574415A (en) * 2015-01-26 2015-04-29 南京邮电大学 Target space positioning method based on single camera
CN107976899A (en) * 2017-11-15 2018-05-01 中国人民解放军海军航空工程学院 A kind of precision target positioning and striking method based on someone/unmanned plane cooperative engagement systems
CN109684909A (en) * 2018-10-11 2019-04-26 武汉工程大学 A kind of unmanned plane target key point real-time location method, system and storage medium


Non-Patent Citations (1)

Title
Design of Unmanned Aerial Vehicle Climb Trajectory Based on A* Algorithm; Chang Xiaofei; Duan Lijuan; Fu Wenxing; Yan Jie; Flight Dynamics; 2008-12-15 (No. 06); full text *

Also Published As

Publication number Publication date
CN112950719A (en) 2021-06-11

Similar Documents

Publication Publication Date Title
CN112950719B (en) Passive target rapid positioning method based on unmanned aerial vehicle active photoelectric platform
CN106127697B (en) EO-1 hyperion geometric correction method is imaged in unmanned aerial vehicle onboard
CN107886531B (en) Virtual control point acquisition method based on laser ranging and object space matching
CN103822615A (en) Unmanned aerial vehicle ground target real-time positioning method with automatic extraction and gathering of multiple control points
CN107564057B (en) High-orbit planar array optical satellite in-orbit geometric calibration method considering atmospheric refraction correction
CN106595700A (en) Target channel space reference calibration method based on three-point coordinate measurement
CN110006452B (en) Relative geometric calibration method and system for high-resolution six-size wide-view-field camera
KR100663836B1 (en) Motor control system for focus matching aerial photographic camera
CN115187798A (en) Multi-unmanned aerial vehicle high-precision matching positioning method
CN109341720A (en) A kind of remote sensing camera geometric calibration method based on fixed star track
CN113947638B (en) Method for correcting orthographic image of fish-eye camera
CN113900125A (en) Satellite-ground combined linear array imaging remote sensing satellite full-autonomous geometric calibration method and system
CN108594255B (en) Laser ranging auxiliary optical image joint adjustment method and system
CN106289317A (en) The unit calibration method of a kind of single-lens digital aviation measuring camera and device
CN110007440B (en) Full-color camera optical system for digital aviation mapping
CN114693807B (en) Method and system for reconstructing mapping data of power transmission line image and point cloud
CN110017833B (en) Full-screen image point geographic coordinate positioning method based on pixel type ground model
Clédat et al. Camera calibration models and methods for corridor mapping with UAVS
CN113421300B (en) Method and device for determining actual position of object in fisheye camera image
WO2023279529A1 (en) Method for joint estimation of atmospheric refraction and ground attitude of ground-based star tracker
CN115311366A (en) RPC model-based geometric calibration method and system for space-borne segmented linear array sensor
CN112257630A (en) Unmanned aerial vehicle detection imaging method and device of power system
CN116091546B (en) Observation construction method under push-broom mode of optical camera
CN113900122B (en) Satellite-ground combined area array imaging remote sensing satellite full-autonomous geometric calibration method and system
Li et al. A Robust Camera Self-calibration Method Based on Circular Oblique Images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant