CN115018908B - Aircraft landing point monocular measurement method based on shadows - Google Patents

Aircraft landing point monocular measurement method based on shadows

Info

Publication number
CN115018908B
Authority
CN
China
Prior art keywords
target
camera
point
shadow
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210623211.5A
Other languages
Chinese (zh)
Other versions
CN115018908A (en)
Inventor
郭凯
陈洪林
高新
田野
谷俊豪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chinese People's Liberation Army 63660
Original Assignee
Chinese People's Liberation Army 63660
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chinese People's Liberation Army 63660 filed Critical Chinese People's Liberation Army 63660
Priority to CN202210623211.5A priority Critical patent/CN115018908B/en
Publication of CN115018908A publication Critical patent/CN115018908A/en
Application granted granted Critical
Publication of CN115018908B publication Critical patent/CN115018908B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/70 - Determining position or orientation of objects or cameras
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/12 - Target-seeking control
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T - CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T90/00 - Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a shadow-based monocular measurement method for the aircraft landing point, belonging to the technical fields of photogrammetry, weapon testing, and the like. Using 3 marker points laid on the target surface, a single camera photographs the target at an arbitrary position together with its shadow on the target surface; the target axis and the shadow axis are then extracted from the image, and the intersection of the two axes is the imaging position of the landing point in the image. From this imaging position and the camera imaging model, the three-dimensional coordinates of the aircraft landing point are computed by line-plane intersection. The method removes the need for multiple cameras (binocular measurement) or for at least 4 marker points (existing monocular measurement), does not require an image captured at the instant of touchdown, places only modest demands on camera frame rate, and is therefore more efficient to use and more convenient to operate and implement.

Description

Aircraft landing point monocular measurement method based on shadows
Technical Field
The invention relates to the technical fields of photogrammetry, weapon testing, and the like, and in particular to a shadow-based monocular measurement method for the aircraft landing point.
Background
In the field of weapon testing, and in particular in aircraft tests such as missile tests, photogrammetry is often used to measure the landing point of a target aircraft. Depending on the number of devices used, photogrammetry can be divided into binocular vision measurement and monocular vision measurement. Binocular vision measurement, which uses no fewer than two cameras to obtain 3D information about a target such as its landing point, is widely used in weapon trials (Yu Qifeng, Shang Yang. Principles of Photogrammetry and Application Research [M]. Science Press). Monocular vision can only acquire 2D information about a target; 3D measurement of quantities such as the landing point requires additional constraints. A monocular measurement method has been proposed (Zhao Zinian, Zhang Yongdong, Chen Honglin, She Hu, Gu Junhao. A target miss-distance measurement method and system design [C]. 2021 China Automation Congress), but it is an approximate method and places specific requirements on the camera tilt angle. Because most weapon-trial targets are high-speed, the method must capture an image of the target close to the target surface and therefore needs a high-frame-rate camera. Existing methods are thus limited in weapon trials: they require multiple cameras, or rely on approximate measurement and high-frame-rate cameras. In practice, when not enough cameras are available, or when the camera frame rate is too low to capture an image at the instant of touchdown, the target landing point cannot be measured with high precision.
Disclosure of Invention
The invention aims to provide a shadow-based monocular measurement method for the aircraft landing point, which measures the landing point with a single camera by exploiting the target's shadow. It addresses the technical problems that traditional landing-point measurement needs at least 2 cameras and images captured at the instant of touchdown, or, when a single camera is used, needs at least 4 marker points, places specific requirements on the camera tilt angle, and still requires images captured at or near the instant of touchdown.
In order to achieve the above purpose and solve the above technical problems, the technical scheme of the present invention is as follows.
An aircraft landing point measurement method based on shadows comprises the following steps:
Step 1: place the camera at an elevated position so that its field of view fully covers the landing area and observes the target surface on which the target's shadow falls, and lay out 3 marker points on the target surface. Sunlight or an artificial light source is used to illuminate the region where the target will appear, so that the target casts a shadow on the target surface; the illuminated region is the region that the expected landing point must fall within for each measurement task;
Step 2: after the target enters the camera's field of view, capture an image containing the target and its shadow; the target need not be on or close to the target surface at this moment. Using a feature-line extraction algorithm, extract the axis of the target and the axis of the target's shadow in the image, obtaining the expressions of the two axes:

a1·u + b1·v + c1 = 0
a2·u + b2·v + c2 = 0    (1)
where u and v are the components in the image pixel coordinate system, a1, b1, c1 are the parameters of the straight line fitted to the target axis, and a2, b2, c2 are the parameters of the straight line fitted to the shadow axis; all of these parameters are known at this point. Solving equation set (1) gives the imaging position (ut, vt) of the target at the touchdown instant, i.e. the imaging position of the landing point;
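The intersection of the two image lines in equation set (1) is a 2×2 linear solve. A minimal sketch in Python/NumPy follows; the numeric coefficients are placeholders standing in for the values produced by the axis extraction, not values from the patent:

```python
import numpy as np

def intersect_image_lines(a1, b1, c1, a2, b2, c2):
    """Solve a1*u + b1*v + c1 = 0 and a2*u + b2*v + c2 = 0 for the pixel (u, v)."""
    A = np.array([[a1, b1], [a2, b2]], dtype=float)
    b = np.array([-c1, -c2], dtype=float)
    if abs(np.linalg.det(A)) < 1e-12:
        raise ValueError("target axis and shadow axis are (nearly) parallel in the image")
    return np.linalg.solve(A, b)

# Placeholder coefficients standing in for the output of the axis-extraction step.
u_t, v_t = intersect_image_lines(0.8, -0.6, 120.0, 0.3, 0.95, -900.0)
```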
Step 3: measure the three-dimensional coordinates of the 3 marker points with three-dimensional measuring equipment, obtaining coordinates (Xi, Yi, Zi), i = 1, 2, 3; from the coordinates of the 3 marker points, calibrate the camera's extrinsic parameters with a camera P3P calibration algorithm, obtaining a 3×3 rotation matrix R and a 3×1 translation vector t;
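The patent does not name a specific P3P implementation; one possible sketch uses OpenCV's solveP3P. The marker pixel positions and the intrinsic matrix below are illustrative assumptions, and P3P generally returns several candidate poses, so the physically plausible one has to be selected (here, by requiring the recovered camera centre to lie above the target surface):

```python
import numpy as np
import cv2

# Surveyed marker points (world frame) and their pixel positions; the pixel
# values and intrinsics below are illustrative assumptions, not patent data.
obj_pts = np.array([[-15.0, 10.0, 0.0],
                    [ 15.0, 10.0, 0.0],
                    [ 15.0, -10.0, 0.0]], dtype=np.float32)
img_pts = np.array([[ 412.0, 503.0],
                    [1501.0, 488.0],
                    [1420.0, 731.0]], dtype=np.float32)
K = np.array([[10000.0, 0.0, 960.0],        # intrinsic matrix A (e.g. 50 mm lens, 5 um pixels)
              [0.0, 10000.0, 540.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

# P3P returns up to four candidate poses; keep the physically plausible one
# (the recovered camera centre must lie above the target surface, Z > 0).
n_sol, rvecs, tvecs = cv2.solveP3P(obj_pts, img_pts, K, dist, flags=cv2.SOLVEPNP_P3P)
if n_sol == 0:
    raise RuntimeError("P3P found no pose for these correspondences")
R_sel, t_sel = None, None
for rvec, tvec in zip(rvecs, tvecs):
    R, _ = cv2.Rodrigues(rvec)
    if (-R.T @ tvec).ravel()[2] > 0:
        R_sel, t_sel = R, tvec
        break
```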
Step 4: using the coordinates of the 3 marker points, obtain the target-surface equation from formula (2)

[Xi Yi Zi] · [A B C]^T = 1, i = 1, 2, 3    (2)

which yields the target-surface equation AX + BY + CZ = 1;
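Formula (2) amounts to a 3×3 linear solve for the plane coefficients. A minimal sketch, under the assumption that the target surface does not pass through the world origin (otherwise the AX + BY + CZ = 1 form is degenerate); the marker coordinates are placeholders:

```python
import numpy as np

# Marker coordinates (Xi, Yi, Zi), i = 1..3 -- placeholder values for illustration.
P = np.array([[ 2.0, 1.0, 0.5],
              [10.0, 3.0, 0.4],
              [ 6.0, 9.0, 0.6]])

# Formula (2): [Xi Yi Zi] . [A B C]^T = 1 for each marker point. Note that this
# form assumes the target surface does not pass through the world origin.
A_coef, B_coef, C_coef = np.linalg.solve(P, np.ones(3))   # plane: A*X + B*Y + C*Z = 1
```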
Step 5: from the imaging position (ut, vt) of the landing point obtained in step 2 and the rotation matrix R and translation vector t obtained in step 3, the imaging relationship is

Zc · [ut vt 1]^T = A · [R t] · [Xt Yt Zt 1]^T = M · [Xt Yt Zt 1]^T    (3)
where A is the camera intrinsic-parameter matrix, which is known because the camera in use and all of its parameters are known; M = A·[R t] is a known 3×4 matrix; Zc is the Z-axis component of the target's three-dimensional coordinates in the camera coordinate system and is unknown; and (Xt, Yt, Zt) are the landing-point coordinates to be solved. Eliminating Zc yields two equations:

(m11 - ut·m31)·Xt + (m12 - ut·m32)·Yt + (m13 - ut·m33)·Zt = ut·m34 - m14
(m21 - vt·m31)·Xt + (m22 - vt·m32)·Yt + (m23 - vt·m33)·Zt = vt·m34 - m24    (4)

where mij denotes the element of M in row i and column j;
Equations (4) define the spatial straight line passing through the landing point (Xt, Yt, Zt) and the camera's optical center; the landing point lies on this line, but its position along the line is still unknown;
Step 6: solve for the landing point by line-plane intersection.
Since the landing point lies on the target surface, it satisfies
A·Xt + B·Yt + C·Zt = 1    (5)
The landing point is obtained by combining equations (4) and (5).
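Steps 5 and 6 together reduce to one 3×3 linear system: the two rows of (4) plus the plane equation (5). A minimal sketch in Python/NumPy, assuming M, (ut, vt), and the plane coefficients (A, B, C) have been obtained as in steps 2 to 5; the function name is introduced here only for illustration:

```python
import numpy as np

def solve_landing_point(M, u_t, v_t, plane_abc):
    """Intersect the back-projection line (4) with the target plane (5).

    M           : 3x4 matrix A.[R t] from steps 3 and 5
    (u_t, v_t)  : imaging position of the landing point from step 2
    plane_abc   : coefficients (A, B, C) of the plane A*X + B*Y + C*Z = 1 from step 4
    """
    G = np.vstack([M[0, :3] - u_t * M[2, :3],               # first equation of (4)
                   M[1, :3] - v_t * M[2, :3],               # second equation of (4)
                   np.asarray(plane_abc, dtype=float)])     # plane equation (5)
    h = np.array([u_t * M[2, 3] - M[0, 3],
                  v_t * M[2, 3] - M[1, 3],
                  1.0])
    return np.linalg.solve(G, h)                            # (Xt, Yt, Zt)
```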
The beneficial effects of the invention are as follows:
1. The invention provides a shadow-based aircraft landing-point measurement method that completes the measurement from a single image taken by a single low-frame-rate camera.
2. No image at the instant of touchdown is required; the missile only needs to appear in the camera's field of view. The required external condition is that the target casts a shadow on the target surface (such as the plane containing the missile impact point) under light.
3. Only 3 marker points are needed. A camera photographs the target at an arbitrary position together with its shadow on the target surface; the target axis and the shadow axis are then extracted from the image, and their intersection is the imaging position of the landing point in the image.
4. The method places low demands on the camera frame rate and imposes no requirement on the tilt angle at which the camera is placed.
5. The method applies to scenes in which the target aircraft casts a shadow on the target surface (such as the plane containing the missile impact point) under light, and it is more efficient to use and more convenient to operate and implement.
Drawings
FIG. 1 is a schematic diagram of the shadow cast by the target under artificial supplementary lighting;
FIG. 2 is a schematic diagram of the shadow cast by the target under sunlight;
FIG. 3 is a schematic diagram of the line-plane intersection solution using the shadow;
FIG. 4 is a schematic diagram of the simulated imaging of the marker points, the target, and its shadow.
Detailed Description
The invention is explained and illustrated in detail below with reference to the drawings and to specific embodiments.
The invention provides a shadow-based aircraft landing-point measurement method, which comprises the following steps.
Step 1: place the camera at an elevated position so that its field of view fully covers the landing area and observes the target surface on which the target's shadow falls, and lay out 3 marker points on the target surface. Sunlight or an artificial light source is used to illuminate the region where the target will appear, so that the target casts a shadow on the target surface; the illuminated region is the region that the expected landing point must fall within for each measurement task, as shown in FIG. 1 and FIG. 2.
Step 2: after the target enters the camera's field of view, capture an image containing the target and its shadow; the target need not be on or close to the target surface at this moment, because the intersection of the target axis and the shadow axis in the image remains unchanged as the target moves even when the target is not on the target surface, so the invention imposes no requirement on the target's height above the ground. Using a feature-line extraction algorithm, extract the axis of the target and the axis of the target's shadow in the image, obtaining the two axis expressions of equation set (1) above.
where u and v are the components in the image pixel coordinate system, a1, b1, c1 are the parameters of the straight line fitted to the target axis, and a2, b2, c2 are the parameters of the straight line fitted to the shadow axis; all of these parameters are known at this point. Solving equation set (1) gives the imaging position (ut, vt) of the target at the touchdown instant, i.e. the imaging position of the landing point.
It can be seen from this step that an image of the target at the actual instant of touchdown is not required: because the intersection of the target axis and the shadow axis in the image remains unchanged as the target moves, any frame in which the target and its shadow are both visible suffices, even when the target is not yet on the target surface.
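The patent does not prescribe a specific feature-line extraction algorithm. As one possible sketch, assuming the target and its shadow have already been segmented into binary masks (the segmentation step itself is not shown, and the mask names are placeholders), each axis can be fitted with OpenCV and converted to the normal form used in equation set (1):

```python
import numpy as np
import cv2

def axis_from_mask(mask):
    """Fit an image line a*u + b*v + c = 0 to the foreground pixels of a binary mask."""
    ys, xs = np.nonzero(mask)
    pts = np.column_stack([xs, ys]).astype(np.float32)
    vx, vy, x0, y0 = cv2.fitLine(pts, cv2.DIST_L2, 0, 0.01, 0.01).ravel()
    a, b = vy, -vx                       # normal of the fitted direction (vx, vy)
    c = -(a * x0 + b * y0)
    return a, b, c

# target_mask and shadow_mask are assumed to come from a prior segmentation step
# (e.g. background subtraction and thresholding), which is not shown here:
# a1, b1, c1 = axis_from_mask(target_mask)
# a2, b2, c2 = axis_from_mask(shadow_mask)
```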
Thus the method does not require an image captured at the instant of touchdown; the aircraft only needs to be in the camera's field of view, and the external condition is that, under light, the aircraft casts a shadow on the target surface (such as the plane containing the missile impact point). The shadow provides an additional constraint for the landing-point measurement; equivalently, it can be regarded as an image acquired by a second camera, so that two views of the target are available and the landing point can be measured with high precision. The method is suitable for single-camera measurement of the missile landing point whenever light is available (sunlight, or artificial supplementary lighting at night).
Step 3: measure the three-dimensional coordinates of the 3 marker points with three-dimensional measuring equipment such as an RTK receiver or a total station, obtaining coordinates (Xi, Yi, Zi), i = 1, 2, 3; from the coordinates of the 3 marker points, calibrate the camera's extrinsic parameters with a camera P3P calibration algorithm, obtaining the 3×3 rotation matrix R and the 3×1 translation vector t.
Step 4: using the coordinates of the 3 marker points, compute the target-surface equation from formula (2), obtaining the target-surface equation AX + BY + CZ = 1.
Step 5: from the imaging position (ut, vt) of the landing point obtained in step 2 and the rotation matrix R and translation vector t obtained in step 3, the imaging relationship (3) is obtained, where A is the known camera intrinsic-parameter matrix, M = A·[R t] is a known 3×4 matrix, Zc is the unknown Z-axis component of the target's three-dimensional coordinates in the camera coordinate system, and (Xt, Yt, Zt) are the landing-point coordinates to be solved. Eliminating Zc yields the two equations (4), which define the spatial straight line passing through the landing point (Xt, Yt, Zt) and the camera's optical center; the landing point lies on this line, but its position along the line is unknown.
Step 6: solve for the landing point by line-plane intersection.
Since the landing point lies on the target surface, it satisfies
A·Xt + B·Yt + C·Zt = 1    (5)
Combining equations (4) and (5) gives 3 equations in the 3 unknown landing-point coordinates (Xt, Yt, Zt), so the landing point can be solved.
Since (4) describes a spatial straight line and (5) a spatial plane, the solution is also called line-plane intersection; taking artificial supplementary lighting as an example, the geometric principle is shown in FIG. 3.
In summary, using the 3 marker points on the target surface, a camera photographs the target at an arbitrary position together with its shadow on the target surface; the target axis and the shadow axis are then extracted from the image, and the intersection of the two axes is the imaging position of the landing point in the image. From this imaging position and the camera imaging model, the three-dimensional coordinates of the aircraft landing point are computed by line-plane intersection.
The method removes the limitations of existing algorithms, namely that binocular landing-point measurement needs several cameras and images captured at the instant of touchdown, and that existing monocular measurement needs at least 4 marker points and images captured at or near the instant of touchdown; it also removes the high demands on camera frame rate. The measurement method of the invention is therefore more efficient and more convenient to operate and implement.
The invention is further illustrated by the following examples.
Example 1
Three marker points, P1 (-15, 10, 0), P2 (15, 10, 0), and P3 (15, -10, 0), are laid out on the target surface. The camera resolution is set to 1920 × 1080, the pixel size to 5 μm, and the focal length to 50 mm; the camera is positioned at (300, 300, 300) with its optical axis pointing at (0, 0, 0). The aircraft is a cylinder 0.6 m in diameter and 6 m long, located 10 m from the target surface, and its shadow is formed by sunlight. The simulated image of the marker points, the target, and its shadow is shown in FIG. 4, where grey is the target, black is the shadow, and the dotted points are the three marker points laid out for camera calibration. The image corresponds to a frame captured while the target is still in the air; the actual landing point of the target is (0, 1.763, 0).
The axes of the aircraft and of its shadow are extracted to obtain their expressions; the intersection of the two axes is then computed, giving the imaging position of the landing point in the image.
From this imaging position and the camera's intrinsic and extrinsic parameters, the line-plane intersection method gives the three-dimensional coordinates of the landing point as (0.075, 1.684, 0), an error of 0.109 m. The landing-point error of the proposed shadow-based monocular measurement method is thus very small, and the method can be used to measure the aircraft landing point.
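To make the geometry concrete, the following self-contained simulation is in the spirit of Example 1: the lens parameters, the camera position (300, 300, 300) aimed at the origin, the 6 m target about 10 m from the target surface z = 0, and the true landing point (0, 1.763, 0) are taken from the example, while the descent direction, the sun direction, and the helper functions (look_at, project, shadow) are assumptions introduced purely for illustration. It verifies that intersecting the image lines of the target axis and its shadow, followed by line-plane intersection, recovers the landing point:

```python
import numpy as np

# Intrinsics as in Example 1: 50 mm focal length, 5 um pixels, 1920 x 1080 sensor.
f_px = 50e-3 / 5e-6
K = np.array([[f_px, 0.0, 960.0], [0.0, f_px, 540.0], [0.0, 0.0, 1.0]])

def look_at(cam_pos, aim, up=np.array([0.0, 0.0, 1.0])):
    """World-to-camera rotation R and translation t for a camera at cam_pos aimed at 'aim'."""
    z = aim - cam_pos; z = z / np.linalg.norm(z)
    x = np.cross(z, up); x = x / np.linalg.norm(x)
    y = np.cross(z, x)
    R = np.vstack([x, y, z])
    return R, -R @ cam_pos

def project(M, X):
    p = M @ np.append(X, 1.0)
    return p / p[2]                                   # homogeneous pixel [u, v, 1]

R, t = look_at(np.array([300.0, 300.0, 300.0]), np.zeros(3))
M = K @ np.hstack([R, t.reshape(3, 1)])

P_true = np.array([0.0, 1.763, 0.0])                  # true landing point from Example 1
axis_up = np.array([0.2, 0.3, 1.0]); axis_up /= np.linalg.norm(axis_up)   # assumed descent axis
nose, tail = P_true + 10.0 * axis_up, P_true + 16.0 * axis_up             # 6 m target, ~10 m out
sun = np.array([0.4, 0.1, -1.0]); sun /= np.linalg.norm(sun)              # assumed sun direction

def shadow(Q):
    """Project a point onto the target surface z = 0 along the sun direction."""
    return Q - (Q[2] / sun[2]) * sun

# Step 2: image lines of the target axis and of its shadow, and their intersection.
l_target = np.cross(project(M, nose), project(M, tail))
l_shadow = np.cross(project(M, shadow(nose)), project(M, shadow(tail)))
x = np.cross(l_target, l_shadow)
u_t, v_t = x[0] / x[2], x[1] / x[2]

# Steps 5 and 6: line-plane intersection. The plane z = 0 is written in the general
# form n.X = e because the AX+BY+CZ=1 form of formula (2) assumes a plane that does
# not pass through the world origin.
n, e = np.array([0.0, 0.0, 1.0]), 0.0
G = np.vstack([M[0, :3] - u_t * M[2, :3], M[1, :3] - v_t * M[2, :3], n])
h = np.array([u_t * M[2, 3] - M[0, 3], v_t * M[2, 3] - M[1, 3], e])
P_est = np.linalg.solve(G, h)
print(P_est, np.linalg.norm(P_est - P_true))          # recovers P_true up to rounding
```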

Claims (2)

1. An aircraft landing point measurement method based on shadows is characterized by comprising the following steps:
step 1, place the camera at an elevated position so that its field of view fully covers the landing area and observes the target surface on which the target's shadow falls, and lay out 3 marker points on the target surface; sunlight or an artificial light source is used to illuminate the region where the target will appear, so that the target casts a shadow on the target surface, the illuminated region being the region that the expected landing point must fall within for each measurement task;
step 2, after the target enters the camera's field of view, capture an image containing the target and its shadow, the target not needing to be on or close to the target surface at this moment; using a feature-line extraction algorithm, extract the axis of the target and the axis of the target's shadow in the image, obtaining the expressions of the two axes:
a1·u + b1·v + c1 = 0
a2·u + b2·v + c2 = 0    (1)
where u and v are the components in the image pixel coordinate system, a1, b1, c1 are the parameters of the straight line fitted to the target axis, and a2, b2, c2 are the parameters of the straight line fitted to the shadow axis, all of which are known at this point; solving equation set (1) gives the imaging position (ut, vt) of the target at the touchdown instant, i.e. the imaging position of the landing point;
step 3, measure the three-dimensional coordinates of the 3 marker points with three-dimensional measuring equipment, obtaining coordinates (Xi, Yi, Zi), i = 1, 2, 3; from the coordinates of the 3 marker points, calibrate the camera's extrinsic parameters with a camera P3P calibration algorithm, obtaining a 3×3 rotation matrix R and a 3×1 translation vector t;
step 4, using the coordinates of the 3 marker points, obtain the target-surface equation from formula (2)
[Xi Yi Zi] · [A B C]^T = 1, i = 1, 2, 3    (2)
which yields the target-surface equation AX + BY + CZ = 1;
step 5, from the imaging position (ut, vt) of the landing point obtained in step 2 and the rotation matrix R and translation vector t obtained in step 3, obtain the imaging relationship
Zc · [ut vt 1]^T = A · [R t] · [Xt Yt Zt 1]^T = M · [Xt Yt Zt 1]^T    (3)
where A is the camera intrinsic-parameter matrix, which is known because the camera in use and all of its parameters are known; M = A·[R t] is a known 3×4 matrix; Zc is the unknown Z-axis component of the target's three-dimensional coordinates in the camera coordinate system; and (Xt, Yt, Zt) are the landing-point coordinates to be solved; eliminating Zc yields two equations:
(m11 - ut·m31)·Xt + (m12 - ut·m32)·Yt + (m13 - ut·m33)·Zt = ut·m34 - m14
(m21 - vt·m31)·Xt + (m22 - vt·m32)·Yt + (m23 - vt·m33)·Zt = vt·m34 - m24    (4)
where mij denotes the element of M in row i and column j; equations (4) define the spatial straight line passing through the landing point (Xt, Yt, Zt) and the camera's optical center, on which the landing point lies at an as yet unknown position;
step 6, solve for the landing point by line-plane intersection;
since the landing point lies on the target surface, it satisfies
A·Xt + B·Yt + C·Zt = 1    (5)
and the landing point is obtained by combining equations (4) and (5).
2. The shadow-based aircraft drop point measurement method of claim 1, wherein the three-dimensional measurement device is an RTK or a total station.
CN202210623211.5A 2022-06-01 2022-06-01 Aircraft landing point monocular measurement method based on shadows Active CN115018908B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210623211.5A CN115018908B (en) 2022-06-01 2022-06-01 Aircraft landing point monocular measurement method based on shadows

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210623211.5A CN115018908B (en) 2022-06-01 2022-06-01 Aircraft landing point monocular measurement method based on shadows

Publications (2)

Publication Number Publication Date
CN115018908A (en) 2022-09-06
CN115018908B (en) 2024-04-26

Family

ID=83072131

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210623211.5A Active CN115018908B (en) 2022-06-01 2022-06-01 Aircraft landing point monocular measurement method based on shadows

Country Status (1)

Country Link
CN (1) CN115018908B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105138999A (en) * 2015-09-16 2015-12-09 三峡大学 Shadow-based night object single-camera locating device and method
CN107202982A (en) * 2017-05-22 2017-09-26 徐泽宇 A kind of beacon arrangement calculated based on UAV position and orientation and image processing method
CN206649349U (en) * 2017-01-13 2017-11-17 天津津宇凯创航空科技发展有限公司 It is a kind of nobody with clapping aircraft
CN110675431A (en) * 2019-10-08 2020-01-10 中国人民解放军军事科学院国防科技创新研究院 Three-dimensional multi-target tracking method fusing image and laser point cloud
CN113129280A (en) * 2021-04-09 2021-07-16 中国人民解放军63660部队 Target drop point measuring method based on building contour features
CN113739765A (en) * 2021-08-23 2021-12-03 中国人民解放军63660部队 Binocular collaborative drop point measurement method without additional control point
KR20220069623A (en) * 2020-11-20 2022-05-27 한국항공우주연구원 Unmmaned aerial vehicle and operating method thereof

Also Published As

Publication number Publication date
CN115018908A (en) 2022-09-06

Similar Documents

Publication Publication Date Title
CN109272532B (en) Model pose calculation method based on binocular vision
CN105066909B (en) A kind of many laser stripe quick three-dimensional measuring methods of hand-held
CN104764440B (en) Rolling object monocular pose measurement method based on color image
CN103256896B (en) Position and posture measurement method of high-speed rolling body
CN107121125B (en) A kind of communication base station antenna pose automatic detection device and method
CN109238235B (en) Method for realizing rigid body pose parameter continuity measurement by monocular sequence image
CN104482924B (en) Body of revolution object pose vision measuring method
CN110849331B (en) Monocular vision measurement and ground test method based on three-dimensional point cloud database model
CN105091866A (en) Part position and posture identification visual system and calibration method thereof
CN108896017B (en) Method for measuring and calculating position parameters of projectile near-explosive fragment groups
CN108663043A (en) Distributed boss's POS node relative pose measurement method based on single camera auxiliary
CN108253935B (en) Ultra-high-speed free flight attitude measurement method for complex-shape model
Liu et al. High-precision pose measurement method in wind tunnels based on laser-aided vision technology
CN115526937A (en) Large-target-surface CCD vertical target rapid calibration system
Yuan et al. A precise calibration method for line scan cameras
CN108154535A (en) Camera Calibration Method Based on Collimator
CN115018908B (en) Aircraft landing point monocular measurement method based on shadows
CN110866954B (en) Method for measuring high-precision attitude of bullet target under length constraint
CN112461204A (en) Method for satellite to dynamic flying target multi-view imaging combined calculation of navigation height
CN104296658B (en) The detection of a kind of cliff blast hole based on virtual binocular vision and positioner and localization method
CN108876825A (en) A kind of space non-cooperative target Relative Navigation three-dimensional matching estimation method
CN109631849A (en) A kind of high gradient slope crag measurement method based on oblique photograph
Liu et al. Correction method for non-landing measuring of vehicle-mounted theodolite based on static datum conversion
CN108375350A (en) A kind of high-precision cylinder bullet angle measuring device based on image
CN104296657B (en) The detection of a kind of cliff blast hole based on binocular vision and positioner and localization method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant