CN112729305B - Multi-target positioning method based on single aircraft seeker image information - Google Patents

Multi-target positioning method based on single aircraft seeker image information

Info

Publication number
CN112729305B
CN112729305B (application CN202011519602.XA)
Authority
CN
China
Prior art keywords
target
coordinate system
seeker
aircraft
ground
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011519602.XA
Other languages
Chinese (zh)
Other versions
CN112729305A (en)
Inventor
白显宗
陈磊
宗康
朱子雄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
National Defense Technology Innovation Institute PLA Academy of Military Science
Original Assignee
National Defense Technology Innovation Institute PLA Academy of Military Science
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National Defense Technology Innovation Institute PLA Academy of Military Science filed Critical National Defense Technology Innovation Institute PLA Academy of Military Science
Priority to CN202011519602.XA
Publication of CN112729305A
Application granted
Publication of CN112729305B
Legal status: Active

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention belongs to the technical field of aircraft and provides a method for positioning multiple targets from the image information of a single aircraft seeker. Based on the seeker image information, multi-target positioning methods are given for the two cases in which the position of a reference target is known and unknown. When the position of the reference target is known, the relative orientation of the multiple targets with respect to the reference target, perpendicular to the line of sight, is measured from the seeker image information and converted into the ground coordinate system; since the reference target position is known, the position of an auxiliary point can be obtained. The line connecting the aircraft and the auxiliary point is then extended to the ground to obtain the multi-target position estimates. When the reference target position is unknown, the azimuths of the multiple targets are measured from the seeker image information to obtain unit relative position vectors from the aircraft to the targets; these are converted into the ground coordinate system and extended from the aircraft to the ground to obtain the target position estimates. The invention can be applied to ground-target tasks of various aircraft and improves environment adaptability and task execution capability.

Description

Multi-target positioning method based on single aircraft seeker image information
Technical Field
The invention belongs to the technical field of aircrafts, and particularly relates to a method for positioning a plurality of targets according to image information of a single aircraft seeker.
Background
When an aircraft reconnoiters and attacks ground targets, it faces difficulties such as a complex and changeable ground environment and the random appearance of time-sensitive targets, so its ability to sense and locate complex environments and time-sensitive targets needs to be improved. With the development of infrared and visible-light imaging seeker technology, the seeker can detect and identify multiple targets, which provides support for multi-target positioning. To meet the requirement of aircraft clusters to execute tasks cooperatively and to raise the autonomous intelligence level of the aircraft, on-board processing of multi-target positioning and real-time interaction of the positioning results need to be realized.
The existing multi-target positioning methods fall mainly into two classes. The first is based on multi-sensor platforms. For example, patent CN103795935A (a camera-type multi-target positioning method and device based on image correction) discloses a method for multi-target positioning using multiple sets of cameras, which realizes multi-target positioning by optical intersection measurement; patent CN105741261B (a planar multi-target positioning method based on four cameras) discloses a planar multi-target positioning method in which the target azimuth angles obtained by pairs of adjacent cameras are combined to position the targets. These methods require complex hardware, are not suitable for a single aircraft, and have poor real-time performance. The second class positions multiple targets from an unmanned-aerial-vehicle-mounted electro-optical platform. For example, patent CN106373159A (a simplified unmanned aerial vehicle multi-target positioning method) can position multiple targets simultaneously in real time, but it requires a laser range finder to measure the distance between the electro-optical platform and the main target and derives the required vehicle altitude from that distance. A 2020 paper in Optics and Precision Engineering on ground multi-target positioning with an airborne electro-optical platform proposes a multi-target positioning method based on a digital elevation model, but it likewise needs a laser range finder to measure the distance to the main target as well as the digital elevation model, so it is not suitable for use in a missile seeker. A 2020 paper in Ship Electronic Engineering on UAV multi-target positioning algorithms also requires accurate altitude information of the aircraft when positioning multiple targets.
Disclosure of Invention
The invention aims to solve the technical problems of existing multi-target positioning methods, such as complex equipment, high information requirements and large data storage, and provides a multi-target positioning method based on the image information of a single aircraft seeker. The method can be applied to ground-target tasks of various aircraft, features a small amount of computation and high stability, and enables fast and effective calculation of multi-target positions.
The specific technical scheme comprises the following steps:
setting a single aircraft as a reference target as a target 1, and setting any one of multiple targets as a target 2;
the relevant coordinate system definition requires: the ground coordinate system G is fixedly connected with the earth, and the origin of coordinates is selected as a certain reference point on the ground; the aircraft body coordinate system B is fixedly connected with the aircraft body, the origin of coordinates is the mass center of the aircraft, and the coordinate axes point to the direction of the inertial main shaft of the aircraft; a seeker mounting coordinate system S is fixedly connected with a mounting base of the seeker, and the origin of coordinates is generally the rotation origin of the seeker; the seeker view field coordinate system is fixedly connected with a seeker lens view field;
the transformation of the relevant coordinate system comprises: converting a ground coordinate system G into an aircraft body coordinate system B, wherein the conversion relation between the ground coordinate system G and the aircraft body coordinate system B is determined by an aircraft attitude angle;
the aircraft body coordinate system B is converted into a seeker mounting coordinate system S, and the conversion relation between the aircraft body coordinate system B and the seeker mounting coordinate system S is determined by a seeker non-coaxial mounting deflection angle;
the seeker mounting coordinate system S is converted into a seeker view field coordinate system V, and the conversion relation between the seeker mounting coordinate system S and the seeker view field coordinate system V is determined by the frame rotation angle of the seeker;
step 1, reading data
Reading the seeker data and judging whether the data are valid; if valid, proceeding to step 2, and if invalid, continuing to read data from the seeker;
step 2, determining the position of the target 2 in the visual field coordinate system of the seeker
The azimuth information of target 2 in the field-of-view coordinate system is represented by the seeker pitch angle deviation and yaw angle deviation; these two angles are calculated from the pixel deviation of target 2, within the field of view, relative to the imaging center point of target 1;
step 3, calculating the position of the target 2
3.1 target 1 location is known
3.1.1 calculating the relative position vector of target 1 and virtual target 2
Setting a virtual target 2', which is located on the line connecting the aircraft and target 2 and whose distance to the aircraft equals that of target 1; the representation of the relative position vector in the field-of-view coordinate system is calculated from the azimuth difference between target 1 and target 2' in the field-of-view coordinate system and the distance between the aircraft and the target;
3.1.2 coordinate System conversion of relative position vectors
Sequentially calculating the representation of the relative position vector in a seeker mounting coordinate system, an aircraft body coordinate system and a ground coordinate system through the conversion of a related coordinate system according to the coordinates of the relative position vector of the target 1 and the target 2' in a view field coordinate system;
3.1.3 calculating the estimated position of the target 2
The line connecting the aircraft and target 2' is extended to the ground, and the position estimate of target 2 is calculated from the geometric relationship;
3.2 target 1 location unknown
3.2.1 calculating the Unit relative position vector of the aircraft to the target 2
With target 1 located at the center of the seeker field of view and the azimuth of target 2 measured in the field-of-view coordinate system, the representation of the unit relative position vector from the aircraft to target 2 in the field-of-view coordinate system is calculated;
coordinate system conversion of 3.2.2 unit relative position vector
Starting from the coordinates of the unit relative position vector from the aircraft to target 2 in the field-of-view coordinate system, the representations of this vector in the mounting coordinate system, the aircraft body coordinate system and the ground coordinate system are calculated in turn through the coordinate system conversions;
3.2.3 calculating the estimated position of the target 2
The unit relative position vector is extended to the ground, and the position estimate of target 2 is obtained from the geometric relationship.
Further, to obtain a more accurate value for target 2, the method further comprises step 4, sequential weighted averaging and outlier rejection;
For the sequential observations at multiple time points, the position estimate of target 2 at each time point is obtained in turn, and an average value with better precision is obtained by the sequential weighted average method; during seeker imaging, outliers may appear in the observed pixel points, and to avoid their influence, outliers are rejected once the number of observations is not less than 5.
Further, a simpler outlier rejection method is provided: when the number of observations exceeds 10, the position of target 2 is calculated from the observation, and the distance between this position and the sequential average of the previous time point is then calculated; if the distance exceeds a threshold, the observation is an outlier and is rejected directly, without being counted in the subsequent weighted average.
Compared with the prior art, the method has the advantages that:
(1) the information of the seeker of the aircraft is utilized to carry out multi-target positioning, so that the information of the seeker of the aircraft is fully utilized, required equipment is reduced, and the complexity of a system is reduced;
(2) distance measurement information between the aircraft and the target is not needed, and the aircraft is not required to be provided with distance measurement equipment, so that the information requirement is simplified;
(3) the data storage capacity and the processing capacity are small, and the method is suitable for the rapid real-time processing of the processor on the aircraft.
(4) The method can be applied to ground target tasks of various aircrafts, and improves the environment adaptability and the task execution capability.
Drawings
FIG. 1 is a schematic diagram of coordinate system transformation according to the present invention;
FIG. 2 is a schematic view of the orientation of the object of the present invention in a field-of-view coordinate system;
FIG. 3 is a schematic view of dual target co-location in accordance with the present invention;
FIG. 4 is a schematic diagram of the solution of the present invention when the position of target 1 is known;
FIG. 5 is a schematic diagram of the solution of the present invention when the position of target 1 is unknown;
FIG. 6 is a position error distribution diagram of target 2 according to the present invention;
FIG. 7 is a graph showing the result of a continuous weighted average of 50 points according to the present invention;
FIG. 8 is a schematic diagram of the results of 50 successive point direct calculations according to the present invention;
FIG. 9 is an estimate of target 2 without the outlier rejection algorithm of the present invention;
FIG. 10 is an estimate of target 2 with the outlier rejection algorithm of the present invention added.
Detailed Description
The invention is explained and illustrated in detail below with reference to the figures and examples.
The design idea of the invention is as follows: against the background of cooperative reconnaissance and strike of ground time-sensitive targets by an aircraft cluster, multi-target positioning methods are provided, based on seeker image information, for the two cases in which the reference target position is known and unknown. When the position of the reference target is known, the relative orientation of the multiple targets with respect to the reference target, perpendicular to the line of sight, is measured from the seeker image information and converted into the ground coordinate system; since the position of the reference target is known, the position of the auxiliary point can be obtained. The line connecting the aircraft and the auxiliary point is then extended to the ground to obtain the multi-target position estimates. When the reference target position is unknown, the azimuths of the multiple targets are measured from the seeker image information to obtain unit relative position vectors from the aircraft to the targets, which are converted into the ground coordinate system. Extending these vectors from the aircraft to the ground gives the target position estimates. The method can be applied to ground-target tasks of various aircraft and improves environment adaptability and task execution capability.
In the following description, the reference target is set as target 1 and any one of the multiple targets is referred to as target 2; the other targets among the multiple targets are positioned in the same way as target 2.
Establishment of a coordinate system
Ground coordinate system G
The ground coordinate system G is fixedly connected with the earth, and the origin of coordinates is selected as a certain reference point on the ground.
For example, one choice of ground coordinate system is a north-up coordinate system G. The origin O may be selected as a ground reference point such as the ground projection of the launch point or entry point of the aircraft, or the ground target point. The X_G axis lies in the ground plane and points in the local north direction; the Y_G axis is along the ground normal through the origin and points away from the center of the earth; the Z_G axis lies in the ground plane perpendicular to the X_G axis, with its direction determined by the right-hand rule.
Aircraft body coordinate system B
The aircraft body coordinate system B is fixedly connected with the aircraft body, the origin of coordinates is generally the center of mass of the aircraft, and the coordinate axes point to the direction of the inertial main shaft of the aircraft.
For example, an origin O of a body coordinate system of the aircraft is taken at the center of mass of the aircraft, the coordinate system is fixedly connected with the aircraft, an x axis is in a symmetrical plane of the aircraft and points to the head in a direction parallel to the design axis of the aircraft, a y axis is in a left-right symmetrical plane of the aircraft and is perpendicular to the x axis and points to the upper part of the aircraft, and a z axis is perpendicular to the symmetrical plane of the aircraft and points to the right side of the aircraft.
Seeker mounting coordinate System S
The seeker mounting coordinate system is fixedly connected with a mounting base of the seeker, and the origin of coordinates is generally the rotation origin of the seeker.
For example, the origin O of the seeker mounting coordinate system is located at the seeker rotation origin. The X_S axis lies in the plane of symmetry of the aircraft and points toward the center of the seeker field of view when the rotation angle of the seeker turntable is zero; the Y_S axis lies in the plane of bilateral symmetry of the aircraft, perpendicular to the X_S axis and pointing above the aircraft; the Z_S axis is perpendicular to the plane of symmetry of the aircraft and points to the right side of the aircraft.
Seeker field of view coordinate System V
The seeker view field coordinate system is fixedly connected with the seeker lens view field.
For example, the origin O of the seeker field-of-view coordinate system is located at the seeker rotation origin. The X_V axis points toward the center of the seeker field of view; the Y_V axis is perpendicular to the field-of-view center direction and points upward; the Z_V axis is perpendicular to the field-of-view center direction and points to the right.
Transformation of coordinate systems
Conversion of the ground coordinate system G into an aircraft body coordinate system B
The conversion relation between the ground coordinate system G and the aircraft body coordinate system B is determined by the attitude angle of the aircraft.
For example, as shown in FIG. 1, the attitude angles are the pitch angle ϑ, the yaw angle ψ and the roll angle γ. The coordinate transformation matrix from G to B, obtained by rotating in 3-2-1 order, is M_G→B, with its inverse M_B→G (the equations are given as images in the original), where M_G→B is the coordinate transformation matrix from the ground coordinate system G to the aircraft body coordinate system B and M_B→G is the coordinate transformation matrix from the aircraft body coordinate system B to the ground coordinate system G.
Conversion of the aircraft body coordinate system B into the seeker mounting coordinate system S
The conversion relation between the aircraft body coordinate system B and the seeker mounting coordinate system S is determined by the non-coaxial mounting deflection angle of the seeker.
For example, between the aircraft body coordinate system B and the seeker mounting coordinate system S there is a deflection angle σ caused by the non-coaxial mounting of the seeker; for aircraft performing ground-target missions this angle is generally negative. The coordinate transformation matrix is M_B→S (given as an image in the original), where M_B→S is the coordinate transformation matrix from the aircraft body coordinate system B to the seeker mounting coordinate system S and M_S→B is the coordinate transformation matrix from the seeker mounting coordinate system S to the aircraft body coordinate system B.
Conversion of seeker mounting coordinate system S into seeker field of view coordinate system V
The conversion relation between the seeker mounting coordinate system S and the seeker view field coordinate system V is determined by the frame rotation angle of the seeker.
For example, the rotation angles of the seeker frame are the pitch rotation angle δ and the yaw rotation angle λ, and the coordinate transformation matrix obtained by rotating in 3-2 order is M_S→V (given as an image in the original), where M_S→V is the coordinate transformation matrix from the seeker mounting coordinate system S to the seeker field-of-view coordinate system V and M_V→S is the coordinate transformation matrix from the seeker field-of-view coordinate system V to the seeker mounting coordinate system S.
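By way of illustration, the following Python sketch composes the three coordinate transformations described above from elementary rotations. Because the transformation matrices are given only as images in the original, the angle-to-axis assignments (which elementary rotation each angle corresponds to) are assumptions, and the function names (rot_x, m_g2b, m_v2g, etc.) are hypothetical.

```python
import numpy as np

def rot_x(a):
    # Elementary (passive) rotation about the x axis by angle a.
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0,   c,   s],
                     [0.0,  -s,   c]])

def rot_y(a):
    # Elementary (passive) rotation about the y axis by angle a.
    c, s = np.cos(a), np.sin(a)
    return np.array([[  c, 0.0,  -s],
                     [0.0, 1.0, 0.0],
                     [  s, 0.0,   c]])

def rot_z(a):
    # Elementary (passive) rotation about the z axis by angle a.
    c, s = np.cos(a), np.sin(a)
    return np.array([[  c,   s, 0.0],
                     [ -s,   c, 0.0],
                     [0.0, 0.0, 1.0]])

def m_g2b(pitch, yaw, roll):
    # Ground -> body in 3-2-1 order: about axis 3 (z, pitch), then axis 2 (y, yaw),
    # then axis 1 (x, roll).  The angle-to-axis assignment is assumed.
    return rot_x(roll) @ rot_y(yaw) @ rot_z(pitch)

def m_b2s(sigma):
    # Body -> seeker mounting frame: the non-coaxial mounting deflection angle
    # sigma is assumed to be a pitch rotation about the body z axis.
    return rot_z(sigma)

def m_s2v(delta, lam):
    # Mounting frame -> field-of-view frame in 3-2 order: pitch rotation delta
    # about z, then yaw rotation lam about y (assumed assignment).
    return rot_y(lam) @ rot_z(delta)

def m_v2g(pitch, yaw, roll, sigma, delta, lam):
    # Field-of-view -> ground, i.e. M_B->G @ M_S->B @ M_V->S as in equation (9);
    # for rotation matrices the inverse equals the transpose.
    return m_g2b(pitch, yaw, roll).T @ m_b2s(sigma).T @ m_s2v(delta, lam).T
```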
Step 1, reading data
Reading the seeker data and judging whether the data are valid; if valid, proceeding to step 2, and if invalid, continuing to read data from the seeker;
step 2, determining the position of the target 2 in the visual field coordinate system of the seeker
The azimuth information of target 2 in the field-of-view coordinate system is represented by the pitch angle deviation ε_y and the yaw angle deviation ε_z; these two angles are calculated from the pixel deviation of target 2, within the field of view, relative to the imaging center point of target 1.
As shown in FIG. 2, for example, let the pixel deviations be n_y and n_z, the numbers of pixels of the seeker in the two directions be N_y and N_z, and the fields of view be Ω_y and Ω_z; the angular deviations then follow from equation (4) (given as an image in the original).
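As an illustration only: since equation (4) appears only as an image in the original, the sketch below assumes the common linear mapping in which the angular deviation is proportional to the pixel deviation across the field of view; the function and variable names are hypothetical.

```python
def pixel_to_angle(n_y, n_z, N_y, N_z, omega_y, omega_z):
    # Convert pixel deviations (n_y, n_z) of target 2 relative to the imaging
    # center point of target 1 into angular deviations (eps_y, eps_z),
    # assuming the deviation angle scales linearly with the pixel offset.
    eps_y = n_y / N_y * omega_y  # pitch-plane angular deviation
    eps_z = n_z / N_z * omega_z  # yaw-plane angular deviation
    return eps_y, eps_z
```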
Step 3, calculating the position of the target 2
3.1 Target 1 location is known
When the position of target 1 is known, the relative orientation of target 2 with respect to target 1, perpendicular to the line of sight, is measured from the seeker image information and converted into the ground coordinate system; since the position of target 1 is known, the position of the auxiliary point can be obtained. The line connecting the aircraft and the auxiliary point is then extended to the equal-altitude surface of target 1 to obtain the multi-target position estimate, as shown in FIG. 3.
A specific example of a solution is given below:
at a certain time t, the position vector of the aircraft in the ground coordinate system is
Figure BDA0002849060150000062
Assuming that the absolute position of the target 1 is known, it is expressed in the ground coordinate system as
Figure BDA0002849060150000063
After entering terminal guidance, the seeker locks onto the target; the coordinates of target 1 in the seeker field of view are ε_y1, ε_z1, and the coordinates of target 2 in the seeker field of view are ε_y2, ε_z2, both obtained from equation (4). ρ is the distance between the aircraft and target 1 at this time.
A virtual target 2' is defined, located on the line connecting the aircraft and target 2, at a distance from the aircraft equal to the distance ρ of target 1 from the aircraft. The relative position vector from target 1 to target 2' can then be expressed approximately in the seeker field-of-view coordinate system as
r_12'^V = [0  ρ·sinΔε_y  ρ·sinΔε_z]^T   (7)
where
Δε_y = ε_y2 - ε_y1,  Δε_z = ε_z2 - ε_z1   (8)
From the pitch rotation angle δ and yaw rotation angle λ of the seeker frame, the seeker mounting deflection angle σ, and the aircraft attitude pitch angle ϑ, yaw angle ψ and roll angle γ, the conversion from the field-of-view coordinate system to the ground coordinate system is
r_12'^G = M_B→G · M_S→B · M_V→S · r_12'^V   (9)
The position vector of target 2' in the ground coordinate system is
R_2' = R_1 + r_12'^G   (10)
The relative position vector from the aircraft to target 2' is
r_m2' = R_2' - R_m   (11)
Extending r_m2' to the ground gives the position vector of target 2, as shown in FIG. 4 (equation (12), given as an image in the original).
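To make the chain of steps 3.1.1-3.1.3 concrete, the following sketch assembles equations (7)-(11) together with an assumed form of equation (12), which appears only as an image in the original: the extension is taken to the altitude of target 1 along the vertical (Y) axis of the ground frame, and the function and variable names are hypothetical.

```python
import numpy as np

def locate_target2_known_ref(R_m, R_1, eps1, eps2, rho, M_v2g):
    # R_m, R_1 : aircraft / target-1 position vectors in the ground frame (Y axis up)
    # eps1, eps2 : (eps_y, eps_z) field-of-view angles of target 1 and target 2
    # rho : distance from the aircraft to target 1
    # M_v2g : 3x3 transformation matrix from the field-of-view frame to the ground frame
    d_eps_y = eps2[0] - eps1[0]                    # equation (8)
    d_eps_z = eps2[1] - eps1[1]
    # Equation (7): relative position of virtual target 2' with respect to target 1,
    # expressed in the field-of-view frame (perpendicular to the line of sight).
    r12_v = np.array([0.0, rho * np.sin(d_eps_y), rho * np.sin(d_eps_z)])
    r12_g = M_v2g @ r12_v                          # equation (9)
    R_2p = np.asarray(R_1, dtype=float) + r12_g    # equation (10): virtual target 2'
    r_m2p = R_2p - np.asarray(R_m, dtype=float)    # equation (11)
    # Assumed form of equation (12): extend the aircraft -> 2' line to the
    # equal-altitude surface of target 1 (vertical component at index 1).
    k = (R_1[1] - R_m[1]) / r_m2p[1]
    return np.asarray(R_m, dtype=float) + k * r_m2p
```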
3.2 target 1 location unknown case
When the position of target 1 is unknown, the azimuth of target 2 is measured from the seeker image information to obtain the unit relative position vector from the aircraft to target 2, which is converted into the ground coordinate system. The relative position vector from the aircraft to target 2 is then extended to the local horizontal plane to obtain the target position estimate.
A specific example of a solution is given below:
assuming that the absolute position of the target 1 is unknown, the seeker locks the target after entering the final guidance, namely the target 1 is at the center of the visual field of the seeker, and the target 2 has a deviation angle in the visual field
Figure BDA0002849060150000072
The unit relative position vector of the aircraft to the target 2 can be expressed in the field-of-view coordinate system as
Figure BDA0002849060150000073
Likewise, from the pitch rotation angle δ and yaw rotation angle λ of the seeker frame, the seeker mounting deflection angle σ, and the aircraft attitude pitch angle ϑ, yaw angle ψ and roll angle γ, the unit relative position vector is converted into the ground coordinate system analogously to equation (9) (the equation is given as an image in the original).
Extending this unit relative position vector to the ground gives the position vector of target 2, as shown in FIG. 5 (equation (16), given as an image in the original).
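For illustration, the sketch below implements the unknown-reference-position case under stated assumptions: the form of the unit line-of-sight vector in the field-of-view frame and the intersection with a horizontal ground plane are assumed, since the corresponding equations appear only as images in the original, and the names are hypothetical.

```python
import numpy as np

def locate_target2_unknown_ref(R_m, eps_y2, eps_z2, M_v2g, ground_height=0.0):
    # R_m : aircraft position vector in the ground frame (Y axis up)
    # eps_y2, eps_z2 : field-of-view deviation angles of target 2 (target 1 is
    #                  assumed to sit at the center of the seeker field of view)
    # M_v2g : 3x3 transformation matrix from the field-of-view frame to the ground frame
    # Assumed unit line-of-sight vector to target 2 in the field-of-view frame
    # (X along the field-of-view center, Y up, Z right).
    u_v = np.array([np.cos(eps_y2) * np.cos(eps_z2),
                    np.sin(eps_y2),
                    np.cos(eps_y2) * np.sin(eps_z2)])
    u_g = M_v2g @ u_v
    # Extend the line of sight from the aircraft to the local horizontal plane
    # Y = ground_height (assumed form of equation (16)).
    k = (ground_height - R_m[1]) / u_g[1]
    return np.asarray(R_m, dtype=float) + k * u_g
```

If used together with the earlier transformation sketch, M_v2g would be the output of the hypothetical m_v2g helper evaluated at the current attitude, mounting and seeker frame angles.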
Because no position information of target 1 is available and the position of target 2 is obtained purely by coordinate conversion, this method is less accurate and is strongly affected by the attitude error of the aircraft. FIG. 6 shows the error distributions of the target 2 position calculated by the two methods, sampled 5000 times with the two targets 412 m apart and the aircraft about 2976.6 m from the target.
Step 4, outlier rejection and sequential weighted average
For the sequential observations at multiple time points, the position estimate of target 2 at each time point is obtained in turn, and an average value with better precision is obtained by the sequential weighted average method.
For example, for the sequential observations at multiple time points, the position vector estimate of target 2 at each time point can be obtained from equation (12) or equation (16), and a sequential average can then be obtained by the weighted average method (the corresponding formulas are given as images in the original).
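A minimal sketch of the sequential averaging step, assuming equal weights for all observations (the weighted-average formula itself is given only as an image in the original); the names are hypothetical.

```python
import numpy as np

def sequential_average(estimates):
    # estimates: iterable of per-time-point target 2 position estimates,
    # e.g. the outputs of locate_target2_known_ref or locate_target2_unknown_ref.
    mean = np.zeros(3)
    history = []
    for k, r_hat in enumerate(estimates, start=1):
        mean = mean + (np.asarray(r_hat, dtype=float) - mean) / k  # running-mean update
        history.append(mean.copy())
    return history  # sequential average after each observation
```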
Taking sequential observations at 50 consecutive time points (5 s) as an example, FIG. 7 shows the 3σ position error ellipses of target 2 obtained by methods 1 and 2 and the variation with time of the error-ellipse size (semi-major and semi-minor axes) under the sequential weighted average method. It can be seen that the errors of both method 1 and method 2 decrease over time. Compared with the direct calculation shown in FIG. 8, the sequential fused positioning error decreases as the observation time increases; the decrease is large at first and becomes markedly smaller after about 1 s (10 data points).
In the actual seeker imaging process, outliers may appear in the observed pixel points. To avoid the influence of outliers, they are rejected once the number of observations exceeds a certain value.
In general, outliers can be removed by methods known to those skilled in the art, such as cluster analysis; the invention provides the following simple outlier rejection method:
and (3) calculating the position of the target 2 according to the observation value for more than 10 times, calculating the distance between the position and the sequential average value obtained last time, if the distance is more than a threshold value, directly eliminating the observation as a wild value, and not counting the weighted average at the back, wherein the threshold value is determined by the position, the speed and the navigation precision of the aircraft and the observation precision of the seeker, and a person skilled in the art can select the threshold value according to specific task requirements.
FIG. 9 shows the position estimate of target 2 obtained when an outlier is present in the observations and no outlier rejection algorithm is applied; after the outlier appears, the estimate is only gradually brought back to the true position of target 2 by the sequential averaging. FIG. 10 shows the position estimate of target 2 when the outlier rejection algorithm is added; after the outlier appears, it is rejected directly.

Claims (3)

1. A multi-target positioning method based on single aircraft seeker image information is characterized by comprising the following steps:
the reference target of the single aircraft is set as target 1, and any one of the multiple targets is set as target 2;
the relevant coordinate system definition requires: the ground coordinate system G is fixedly connected with the earth, and the origin of coordinates is selected as a certain reference point on the ground; the aircraft body coordinate system B is fixedly connected with the aircraft body, the origin of coordinates is the mass center of the aircraft, and the coordinate axes point to the direction of the inertial main shaft of the aircraft; a seeker mounting coordinate system S is fixedly connected with a mounting base of the seeker, and the origin of coordinates is the rotation origin of the seeker; the seeker view field coordinate system is fixedly connected with a seeker lens view field;
the transformation of the relevant coordinate system comprises: converting a ground coordinate system G into an aircraft body coordinate system B, wherein the conversion relation between the ground coordinate system G and the aircraft body coordinate system B is determined by an aircraft attitude angle;
the aircraft body coordinate system B is converted into a seeker mounting coordinate system S, and the conversion relation between the aircraft body coordinate system B and the seeker mounting coordinate system S is determined by a seeker non-coaxial mounting deflection angle;
the seeker mounting coordinate system S is converted into a seeker view field coordinate system V, and the conversion relation between the seeker mounting coordinate system S and the seeker view field coordinate system V is determined by the frame rotation angle of the seeker;
step 1, reading data
Reading the data of the seeker, judging whether the data is valid, if the data is valid, performing the step 2, and if the data is invalid, reading the data from the seeker;
step 2, determining the position of the target 2 in the visual field coordinate system of the seeker
Azimuth information of the target 2 in a field-of-view coordinate system is represented by a seeker pitch angle deviation and a yaw angle deviation, and the two angle information is calculated and obtained according to pixel deviation of the target 2 relative to an imaging center point of the target 1 in the field of view;
step 3, calculating the position of the target 2
3.1 target 1 location is known
3.1.1 calculating the relative position vector of object 1 and virtual object 2
Setting a virtual target 2 ', wherein the virtual target 2' is positioned on a connecting line between the aircraft and the target 2, the distance from the virtual target 2 'to the aircraft is equal to the distance from the target 1 to the aircraft, and calculating the representation of the relative position vector of the target 1 and the virtual target 2' in a view field coordinate system through the azimuth difference of the target 1 and the virtual target 2 'in the view field coordinate system and the distance from the aircraft to the virtual target 2';
3.1.2 coordinate System conversion of relative position vectors
Sequentially calculating the representation of the relative position vector in a seeker mounting coordinate system, an aircraft body coordinate system and a ground coordinate system through the conversion of a related coordinate system according to the coordinates of the relative position vector of the target 1 and the virtual target 2' in a view field coordinate system;
3.1.3 calculating the estimated position of the target 2
The connecting line between the aircraft and the virtual target 2' is extended to the ground, the estimated value of the position of the target 2 is calculated by the geometrical relationship,
3.2 target 1 location unknown
3.2.1 calculating the Unit relative position vector of the aircraft to the target 2
The target 1 is positioned at the center of the visual field of the seeker, the representation of the unit relative position vector of the aircraft to the target 2 in the visual field coordinate system is calculated according to the position of the target 2 in the visual field coordinate system,
coordinate system conversion of 3.2.2 unit relative position vector
Sequentially calculating the representation of the relative position vector in an installation coordinate system, an aircraft body coordinate system and a ground coordinate system through the coordinate system conversion of the coordinate system of the unit relative position vector from the aircraft to the target 2 in a view field coordinate system;
3.2.3 calculating the estimated position of the target 2
And extending the unit relative position vector to the ground, and calculating by a geometric relationship to obtain a position estimation value of the target 2.
2. The multi-target positioning method based on image information of the single aircraft seeker according to claim 1, characterized by further comprising step 4, sequential weighted averaging and outlier rejection, in order to obtain a more accurate value of the target 2; sequentially obtaining the target 2 position estimation value of each time point for the sequential observation values of the multiple time points, and obtaining an average value with better precision by using a sequential weighted average method; and in the imaging process of the seeker, outliers appear in the observed pixel points, and in order to avoid the influence of the outliers, when the number of observations is not less than 5, the outliers are eliminated.
3. The multi-target positioning method based on image information of a single aircraft seeker as claimed in claim 2, wherein, for the outlier rejection in step 4, when the number of observations is larger than 10, the position of the target 2 is calculated according to the observation value, then the distance between the position of the target 2 and the sequential average value of the previous time point is calculated, and if the distance is larger than a threshold value, the observation is an outlier and is directly rejected without being counted in the subsequent weighted average.
CN202011519602.XA 2020-12-21 2020-12-21 Multi-target positioning method based on single aircraft seeker image information Active CN112729305B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011519602.XA CN112729305B (en) 2020-12-21 2020-12-21 Multi-target positioning method based on single aircraft seeker image information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011519602.XA CN112729305B (en) 2020-12-21 2020-12-21 Multi-target positioning method based on single aircraft seeker image information

Publications (2)

Publication Number Publication Date
CN112729305A CN112729305A (en) 2021-04-30
CN112729305B (en) 2022-03-08

Family

ID=75604105

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011519602.XA Active CN112729305B (en) 2020-12-21 2020-12-21 Multi-target positioning method based on single aircraft seeker image information

Country Status (1)

Country Link
CN (1) CN112729305B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114234876B (en) * 2021-12-23 2023-06-23 中国人民解放军空军军医大学 Method for measuring width of remote target

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9638821B2 (en) * 2014-03-20 2017-05-02 Lockheed Martin Corporation Mapping and monitoring of hydraulic fractures using vector magnetometers
CN107726921B (en) * 2017-08-30 2019-12-03 湖北航天技术研究院总体设计所 A kind of Active Radar angle method of guidance suitable under the conditions of target maneuver
CN107976660B (en) * 2017-11-10 2021-06-08 西安电子科技大学 Missile-borne multi-channel radar ultra-low-altitude target analysis and multi-path echo modeling method
CN108594815B (en) * 2018-04-20 2021-02-02 武汉大学 Staged wheeled robot moving path planning method

Also Published As

Publication number Publication date
CN112729305A (en) 2021-04-30


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant