CN112985398A - Target positioning method and system - Google Patents

Target positioning method and system

Info

Publication number
CN112985398A
Authority
CN
China
Prior art keywords
coordinate system
aircraft
transformation matrix
camera
position coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911291169.6A
Other languages
Chinese (zh)
Inventor
陈宇楠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jingbangda Trade Co Ltd
Beijing Jingdong Qianshi Technology Co Ltd
Original Assignee
Beijing Jingbangda Trade Co Ltd
Beijing Jingdong Qianshi Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2019-12-16
Filing date
2019-12-16
Publication date
2021-06-18
Application filed by Beijing Jingbangda Trade Co Ltd, Beijing Jingdong Qianshi Technology Co Ltd filed Critical Beijing Jingbangda Trade Co Ltd
Priority to CN201911291169.6A
Publication of CN112985398A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 Interpretation of pictures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30181 Earth observation

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Navigation (AREA)

Abstract

The invention discloses a target positioning method and system, relating to the technical field of target positioning. The method comprises the following steps: acquiring, in real time, the position coordinates of the projected point of a target point on the pixel plane of a camera carried on an aircraft; acquiring flight control data of the aircraft in real time; constructing, based on the flight control data, a coordinate system transformation matrix that transforms the position coordinates of the target point from the camera coordinate system to an inertial coordinate system; and positioning the position coordinates of the target point in the world geographic coordinate system based on the position coordinates of the projected point and the coordinate system transformation matrix. The target positioning method provided by the embodiments of the invention avoids carrying a laser range finder on the aircraft, reduces the payload weight and cost of the aircraft, and enlarges the target positioning range.

Description

Target positioning method and system
Technical Field
The invention relates to the technical field of target positioning, and in particular to a target positioning method and system.
Background
Currently, aviation photoelectric (electro-optical) imaging systems are commonly used for detecting and tracking ground targets. Such an imaging system is carried on a fixed-wing or other manned or unmanned aircraft for high-altitude operation. To position a ground target, the position and attitude of the imaging system relative to the aircraft, and the distance between the imaging system and the ground target, must be acquired in real time. Target positioning is then completed through coordinate system transformations using the GPS information and inertial navigation data of the airborne end of the aircraft. In the whole target positioning process, acquiring the distance between the aircraft and the target is critical.
In the prior art, airborne target positioning techniques can be divided into active positioning and passive positioning. In active positioning, the airborne end directly measures the distance between the aircraft and the ground target by emitting electromagnetic waves, for example with a laser ranging sensor. In passive positioning, the airborne end emits no electromagnetic waves directed at the target; the distance between the aircraft and the target is obtained by indirect calculation using a vision camera, and the position of the image center is then solved by a proportional positioning method. Most airborne target positioning techniques rely on the target positioning model shown in Fig. 1. The coordinates A = (x_0, y_0, z_0) of target A in the camera coordinate system are:
$$x_0 = R\cos\beta\sin\alpha,\qquad y_0 = R\cos\beta\cos\alpha,\qquad z_0 = R\sin\beta \tag{1}$$

where z_0 is the distance between the electro-optical pod and the ground plane, α is the azimuth angle of the pod, β is the pitch angle of the pod, and R is the distance between the photoelectric imaging system and the target.
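For a rough numerical sense of this prior-art model, a minimal sketch is given below; it assumes the spherical reading of formula (1) as rendered above, and all numbers are placeholders rather than values from this disclosure:

```python
import numpy as np

# Prior-art proportional model (1): pod azimuth alpha, pitch beta, and range R
# give the camera-frame coordinates of target A. Placeholder inputs:
z0 = 1000.0               # pod height above the ground plane, meters
alpha = np.deg2rad(30.0)  # pod azimuth angle
beta = np.deg2rad(45.0)   # pod pitch (depression) angle

R = z0 / np.sin(beta)     # range follows from height and pitch
A = R * np.array([np.cos(beta) * np.sin(alpha),
                  np.cos(beta) * np.cos(alpha),
                  np.sin(beta)])   # (x0, y0, z0); z0 is recovered exactly
print(R, A)                        # R ~ 1414.2 m, A[2] == z0
```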
Active target positioning relies on laser ranging, which is fast and accurate; however, a fixed-wing or other manned or unmanned aircraft carrying an aviation photoelectric imaging system operates at altitudes on the order of kilometers and therefore requires a high-power, high-quality laser range finder, which increases the payload weight of the aircraft and the cost of target positioning. Meanwhile, the proportional positioning method used in passive target positioning can only solve for the position of the image center; it does not use the position information of the target point within the image, so its applicability is limited.
Disclosure of Invention
In view of this, embodiments of the present invention provide a target positioning method and system that reduce the payload weight and cost of the aircraft and enlarge the target positioning range.
According to a first aspect of the present invention, there is provided a target positioning method, comprising:
acquiring, in real time, the position coordinates of a projected point of a target point on the pixel plane of a camera carried on an aircraft;
acquiring flight control data of the aircraft in real time;
constructing a coordinate system transformation matrix for transforming the position coordinates of the target point from a camera coordinate system to an inertial coordinate system based on the flight control data;
and positioning the position coordinates of the target point in a world geographic coordinate system based on the position coordinates of the projected point and the coordinate system transformation matrix.
Optionally, positioning the position coordinates of the target point in the world geographic coordinate system based on the position coordinates of the projected point and the coordinate system transformation matrix includes:
transforming the position coordinates of the optical center of the camera from a camera coordinate system to an inertial coordinate system using the coordinate system transformation matrix;
converting the position coordinates of the projected point of the target point on the pixel plane into the inertial coordinate system using the coordinate system transformation matrix; and
calculating, based on the geometric relation between the Z_I-axis coordinate of the optical center of the camera in the inertial coordinate system and the Z_I-axis coordinate of the projected point, the image depth of the target point in the camera coordinate system.
Optionally, positioning the position coordinates of the target point in the world geographic coordinate system based on the position coordinates of the projected point and the coordinate system transformation matrix further includes:
transforming the position coordinates of the target point from the camera coordinate system to the inertial coordinate system using the coordinate system transformation matrix, based on the image depth and the position coordinates of the projected point on the pixel plane.
Optionally, positioning the position coordinates of the target point in the world geographic coordinate system based on the position coordinates of the projected point and the coordinate system transformation matrix further includes:
transforming the position coordinates of the target point from the inertial coordinate system to the world geographic coordinate system, and positioning the position coordinates of the target point in the world geographic coordinate system.
Optionally, the image depth is the Z_c-axis coordinate of the target point in the camera coordinate system.
Optionally, the flight control data includes: position information of the aircraft, attitude information of the aircraft, position information of the pan-tilt, and attitude information of the pan-tilt;
the position information of the aircraft includes: the position coordinates of the aircraft in the inertial coordinate system and the position coordinates of the aircraft in the pan-tilt coordinate system;
the attitude information of the aircraft includes: the roll angle, pitch angle, and yaw angle of the aircraft;
the pan-tilt position information includes: the translation of the rotational center of the pan-tilt relative to the optical center of the camera in the camera coordinate system; and
the pan-tilt attitude information includes: the pan-tilt yaw angle and the pan-tilt pitch angle.
Optionally, constructing the coordinate system transformation matrix that transforms the position coordinates of the target point from the camera coordinate system to the inertial coordinate system based on the flight control data includes:
constructing a first transformation matrix from the inertial coordinate system to the aircraft geographic coordinate system based on the position information of the aircraft;
constructing a second transformation matrix from the aircraft geographic coordinate system to the aircraft body coordinate system based on the attitude information of the aircraft;
constructing a third transformation matrix from the aircraft body coordinate system to the pan-tilt coordinate system based on the pan-tilt attitude information and the aircraft position information;
constructing a fourth transformation matrix from the pan-tilt coordinate system to the camera coordinate system based on the pan-tilt position information;
multiplying the first, second, third, and fourth transformation matrices to obtain a fifth transformation matrix; and
calculating the inverse of the fifth transformation matrix to obtain the coordinate system transformation matrix that transforms the position coordinates of the target point from the camera coordinate system to the inertial coordinate system.
Optionally, based on the image depth and the position coordinates of the projected point on the pixel plane, the position coordinates of the target point are transformed from the camera coordinate system to the inertial coordinate system by:

$$\tilde{P}_t^I = T_c^I \begin{bmatrix} C^{-1} & 0 \\ 0^T & 1 \end{bmatrix} \begin{bmatrix} \lambda I & 0 \\ 0^T & 1 \end{bmatrix} q$$

where $\tilde{P}_t^I$ is the position coordinate of the target point in the inertial coordinate system, q is the position coordinate of the projected point of the target point on the pixel plane, $T_c^I$ is the coordinate system transformation matrix that transforms the position coordinates of the target point from the camera coordinate system to the inertial coordinate system, $C^{-1}$ is the inverse of the camera intrinsic matrix, $\begin{bmatrix} \lambda I & 0 \\ 0^T & 1 \end{bmatrix}$ is a homogeneous matrix, I is the 3×3 identity matrix, and λ is the image depth.
According to a second aspect of the present invention, there is provided a target positioning system, comprising:
a first acquisition unit configured to acquire, in real time, the position coordinates of a projected point of a target point on the pixel plane of a camera carried on an aircraft;
a second acquisition unit configured to acquire flight control data of the aircraft in real time;
a construction unit configured to construct, based on the flight control data, a coordinate system transformation matrix that transforms the position coordinates of the target point from a camera coordinate system to an inertial coordinate system; and
a positioning unit configured to position the position coordinates of the target point in a world geographic coordinate system based on the position coordinates of the projected point and the coordinate system transformation matrix.
According to a third aspect of the present invention, there is provided a computer-readable storage medium having stored thereon computer instructions which, when executed, implement the target positioning method described above.
According to a fourth aspect of the present invention, there is provided a control apparatus for target positioning, comprising:
a memory for storing computer instructions;
a processor coupled to the memory, the processor being configured to perform the target positioning method described above based on the computer instructions stored by the memory.
One embodiment of the present invention has the following advantages or benefits: the position coordinates of the projected point of the target point on the pixel plane of the camera are acquired in real time using the camera carried on the aircraft, and the flight control data generated by the aircraft during flight are acquired in real time using an information acquisition instrument carried on the aircraft, so that carrying a laser range finder on the aircraft is avoided and the payload weight and cost of the aircraft are reduced.
Further, a coordinate system transformation matrix that transforms the position coordinates of the target point from the camera coordinate system to the inertial coordinate system is constructed based on the flight control data of the aircraft; based on the position coordinates of the projected point and the coordinate system transformation matrix, the position coordinates of the target point are transformed from the camera coordinate system to the inertial coordinate system, and then from the inertial coordinate system to the world geographic coordinate system, thereby positioning the target point in the world geographic coordinate system. Because the coordinate transformation is applied to the position coordinates, acquired in real time, of the projected point of the target point on the pixel plane of the camera, the projected point of any target point on the pixel plane can be used, not only the image center, so the position coordinates of any target point in the world geographic coordinate system can be located, which enlarges the target positioning range of the target positioning method of the aircraft.
Drawings
The above and other objects, features and advantages of the present invention will become more apparent from the following description of the embodiments of the present invention with reference to the accompanying drawings, in which:
Fig. 1 shows a schematic diagram of a conventional target positioning system.
Fig. 2 shows a flowchart of a target positioning method according to an embodiment of the present invention.
Fig. 3 shows a schematic diagram of the pinhole camera model of a target positioning method of one embodiment of the invention.
Fig. 4 shows a flowchart of a target positioning method according to an embodiment of the invention.
Fig. 5 shows a schematic diagram of the positioning principle of the target positioning method of one embodiment of the present invention.
Fig. 6 shows a schematic structural diagram of a target positioning system of one embodiment of the present invention.
Fig. 7 shows a block diagram of a control device for target positioning according to an embodiment of the invention.
Detailed Description
The present invention is described below based on examples, but it is not limited to these examples. In the following detailed description, certain specific details are set forth; it will be apparent to one skilled in the art that the invention may be practiced without these specific details. Well-known methods, procedures, and components have not been described in detail so as not to obscure the invention. The figures are not necessarily drawn to scale.
Fig. 2 is a flowchart of a target positioning method according to an embodiment of the present invention, and Fig. 3 is a schematic diagram of the pinhole camera model used by the method. As shown in Fig. 3, in the camera coordinate system {X_c, Y_c, Z_c} the origin O_c is located at the optical center of the camera and coordinates are in meters; in the image plane coordinate system {X_im, Y_im}, each axis is parallel to the corresponding axis of the camera coordinate system, the origin O_im lies on the image plane, and coordinates are in meters; in the pixel plane coordinate system {X_ip, Y_ip}, the origin O_ip is located at the upper-left corner of the image and coordinates are in pixels.

q = (x_ip, y_ip, 1, 1)^T is the homogeneous projection onto the pixel plane of the target point's coordinates $\tilde{P}^c = (x_c, y_c, z_c, 1)^T$ in the camera coordinate system, i.e. the position coordinates of the projected point of the target point on the pixel plane of the camera. The image plane coordinate system is an intermediate transformation coordinate system; the position coordinates of the projected point of the target point on the pixel plane of the camera can be converted from pixel units to meters by formulas (2) and (3):

$$x_{im} = (-y_{ip} + o_y)\,S_y \tag{2}$$
$$y_{im} = (-x_{ip} + o_x)\,S_x \tag{3}$$

where o_x and o_y are the translation offsets from the origin O_ip of the pixel plane coordinate system to the origin O_im of the image plane coordinate system, and S_x and S_y are the scale factors converting pixel units to meters.

From similar triangles:

$$x_{im} = f\,\frac{x_c}{z_c} \tag{4}$$
$$y_{im} = f\,\frac{y_c}{z_c} \tag{5}$$

where f is the camera focal length.

From the above formulas, the pinhole camera model transforming the position coordinates of any target point on the pixel plane to the camera coordinate system is obtained:

$$\tilde{P}^c = \begin{bmatrix} C^{-1} & 0 \\ 0^T & 1 \end{bmatrix} \begin{bmatrix} \lambda I & 0 \\ 0^T & 1 \end{bmatrix} q \tag{6}$$

where C is the camera intrinsic matrix, $\tilde{P}^c = (x_c, y_c, z_c, 1)^T$ is the position coordinate of the target point in the camera coordinate system, I is the 3×3 identity matrix, and λ is the image depth of the target point in the camera coordinate system, i.e. the Z_c-axis coordinate of the target point in the camera coordinate system.
The flowchart of the target positioning method according to an embodiment of the present invention shown in Fig. 2 is described below with reference to Fig. 3. The method comprises the following steps:
In step S210, the position coordinates of the projected point of the target point on the pixel plane of the camera are acquired in real time using the camera carried on the aircraft.
In this step, a telephoto monocular camera is carried on the airborne end of an aircraft such as an unmanned aerial vehicle (fixed-wing or multi-rotor), and the position coordinates of the projected point of the target point on the pixel plane of the camera, i.e. q = (x_ip, y_ip, 1, 1)^T, are acquired from the pictures captured in real time, either manually or by a detection and tracking program.
In step S220, flight control data of the aircraft is acquired in real time.
In this step, the flight control data generated by the aircraft during flight are collected in real time by an information acquisition instrument carried on the aircraft. The flight control data include: the position information of the aircraft, the attitude information of the aircraft, the position information of the pan-tilt, and the attitude information of the pan-tilt. The position information of the aircraft includes the position coordinates of the aircraft in the inertial coordinate system and in the pan-tilt coordinate system; the attitude information of the aircraft includes the roll, pitch, and yaw angles of the aircraft; the pan-tilt position information includes the translation of the rotational center of the pan-tilt relative to the optical center of the camera in the camera coordinate system; and the pan-tilt attitude information includes the pan-tilt yaw angle and the pan-tilt pitch angle.
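Purely as a sketch of how these quantities might be bundled in software (the field names below are illustrative assumptions, not identifiers from any real flight-control interface):

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class FlightControlData:
    p_aircraft_inertial: np.ndarray  # (3,) aircraft position, inertial frame, m
    p_aircraft_gimbal: np.ndarray    # (3,) aircraft position, pan-tilt frame, m
    roll: float                      # aircraft roll angle phi, rad
    pitch: float                     # aircraft pitch angle theta, rad
    yaw: float                       # aircraft yaw angle psi, rad
    t_gimbal_in_camera: np.ndarray   # (3,) pan-tilt rotational center relative
                                     # to the camera optical center, camera frame, m
    gimbal_yaw: float                # pan-tilt yaw alpha_az, rad
    gimbal_pitch: float              # pan-tilt pitch alpha_ph, rad
```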
In step S230, a coordinate system transformation matrix that transforms the position coordinates of the target point from a camera coordinate system to an inertial coordinate system is constructed based on the flight control data.
In an alternative embodiment of the invention, the world geographic coordinate system has its origin O at the geocenter; the X axis points to the intersection of the equator and the prime meridian; the Z axis points to the north pole, parallel to the earth's rotation axis; and the Y axis is perpendicular to the plane XOZ, forming a right-handed coordinate system with the X and Z axes. The GPS coordinates of any point are expressed as longitude, latitude, and altitude.
In the inertial coordinate system I, the origin O_I is the takeoff point of the aircraft; the X_I axis points east, the Y_I axis points north, and the Z_I axis points vertically upward, forming a right-handed coordinate system with the X_I and Y_I axes. It is a fixed coordinate system.
In the aircraft geographic coordinate system v, the origin O_v is located at the center of mass of the vehicle; the X_v axis points north, the Y_v axis points east, and the Z_v axis points toward the geocenter, forming a right-handed coordinate system with the X_v and Y_v axes. It is a moving coordinate system.
In the aircraft body coordinate system b, the origin O_b is located at the center of mass of the vehicle; the X_b axis points toward the nose, the Y_b axis points toward the right wing, and the Z_b axis is perpendicular to the X_bO_bY_b plane and points toward the ground, forming a right-handed coordinate system with the X_b and Y_b axes. It is a moving coordinate system.
In the pan-tilt coordinate system g, the origin O_g is located at the rotational center of the pan-tilt (the intersection of its two rotation axes); the X_g axis coincides with the optical axis of the onboard camera and is perpendicular to the pixel plane, with the shooting direction as positive; the Y_g axis points toward the right wing; and the Z_g axis is perpendicular to the X_gO_gY_g plane and points toward the ground, forming a right-handed coordinate system with the X_g and Y_g axes.
In the camera coordinate system c, the origin O_c is located at the optical center of the onboard camera; the X_c axis is parallel to the pixel plane and points horizontally to the right; the Y_c axis is perpendicular to the X_c axis and points downward; and the Z_c axis coincides with the optical axis of the onboard camera and is perpendicular to the pixel plane, with the shooting direction as positive, forming a right-handed coordinate system with the X_c and Y_c axes.
In this step, a first transformation matrix $T_I^v$ from the inertial coordinate system I to the aircraft geographic coordinate system v is constructed based on the position information of the aircraft. The calculation formula of the first transformation matrix $T_I^v$ is:

$$T_I^v = \begin{bmatrix} R_I^v & -R_I^v P_a^I \\ 0^T & 1 \end{bmatrix},\qquad R_I^v = \begin{bmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & -1 \end{bmatrix} \tag{7}$$

where $P_a^I$ is the position coordinate of the aircraft in the inertial coordinate system.
A second transformation matrix $T_v^b$ from the aircraft geographic coordinate system v to the aircraft body coordinate system b is constructed based on the attitude information of the aircraft. The calculation formula of the second transformation matrix $T_v^b$ is:

$$T_v^b = \begin{bmatrix} R_v^b & 0 \\ 0^T & 1 \end{bmatrix},\qquad R_v^b = R_x(\phi)\,R_y(\theta)\,R_z(\psi) \tag{8}$$

where $R_x$, $R_y$, $R_z$ are the elementary rotations about the corresponding axes, and φ, θ, ψ are the Euler angles of the aircraft: φ is the roll angle, θ is the pitch angle, and ψ is the yaw angle.
A third transformation matrix $T_b^g$ from the aircraft body coordinate system b to the pan-tilt coordinate system g is constructed based on the pan-tilt attitude information and the aircraft position information. The calculation formula of the third transformation matrix $T_b^g$ is:

$$T_b^g = \begin{bmatrix} R_y(\alpha_{ph})\,R_z(\alpha_{az}) & P_a^g \\ 0^T & 1 \end{bmatrix} \tag{9}$$

where $\alpha_{az}$ is the angle of rotation about the Z_g axis of the pan-tilt coordinate system g, i.e. the pan-tilt yaw angle; $\alpha_{ph}$ is the angle of rotation about the Y_g axis, i.e. the pan-tilt pitch angle; and $P_a^g$ is the position coordinate of the aircraft in the pan-tilt coordinate system.
A fourth transformation matrix $T_g^c$ from the pan-tilt coordinate system g to the camera coordinate system c is constructed based on the pan-tilt position information. The calculation formula of the fourth transformation matrix $T_g^c$ is:

$$T_g^c = \begin{bmatrix} R_g^c & P_g^c \\ 0^T & 1 \end{bmatrix},\qquad R_g^c = \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0 \end{bmatrix} \tag{10}$$

where $P_g^c$ is the translation of the rotational center of the pan-tilt relative to the optical center of the camera in the camera coordinate system.
The first transformation matrix $T_I^v$, the second transformation matrix $T_v^b$, the third transformation matrix $T_b^g$, and the fourth transformation matrix $T_g^c$ are multiplied to obtain a fifth transformation matrix:

$$T_I^c = T_g^c\, T_b^g\, T_v^b\, T_I^v \tag{11}$$

The inverse of the fifth transformation matrix is then calculated to obtain the coordinate system transformation matrix $T_c^I = (T_I^c)^{-1}$ that transforms the position coordinates of the target point from the camera coordinate system to the inertial coordinate system.
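A compact sketch of steps (7) through (11) follows, assuming the elementary-rotation and axis conventions reconstructed above and the illustrative FlightControlData fields sketched earlier; real sign conventions would have to be verified against the airframe and pan-tilt in use:

```python
import numpy as np

def Rx(a):  # passive rotation about X
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, s], [0, -s, c]])

def Ry(a):  # passive rotation about Y
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, -s], [0, 1, 0], [s, 0, c]])

def Rz(a):  # passive rotation about Z
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, s, 0], [-s, c, 0], [0, 0, 1]])

def homogeneous(R, t):
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def camera_to_inertial(d) -> np.ndarray:
    """Coordinate system transformation matrix T_c^I = (T_I^c)^-1 built from
    FlightControlData d by chaining formulas (7)-(11)."""
    R_Iv = np.array([[0, 1, 0], [1, 0, 0], [0, 0, -1]])       # ENU -> NED axes
    T_Iv = homogeneous(R_Iv, -R_Iv @ d.p_aircraft_inertial)   # (7) I -> v
    T_vb = homogeneous(Rx(d.roll) @ Ry(d.pitch) @ Rz(d.yaw),
                       np.zeros(3))                           # (8) v -> b
    T_bg = homogeneous(Ry(d.gimbal_pitch) @ Rz(d.gimbal_yaw),
                       d.p_aircraft_gimbal)                   # (9) b -> g
    R_gc = np.array([[0, 1, 0], [0, 0, 1], [1, 0, 0]])        # X_c=Y_g, Y_c=Z_g, Z_c=X_g
    T_gc = homogeneous(R_gc, d.t_gimbal_in_camera)            # (10) g -> c
    T_Ic = T_gc @ T_bg @ T_vb @ T_Iv                          # (11) fifth matrix
    return np.linalg.inv(T_Ic)
```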
In step S240, the position coordinates of the target point in the world geographic coordinate system are located based on the position coordinates of the projected point and the coordinate system transformation matrix.
In this step, the position coordinates of the target point are transformed from the camera coordinate system to the inertial coordinate system based on the position coordinates of the projected point and the coordinate system transformation matrix, and then from the inertial coordinate system to the world geographic coordinate system, so that the position coordinates of the target point in the world geographic coordinate system are located in real time.
According to the embodiment of the invention, the camera carried on the aircraft acquires in real time the position coordinates of the projected point of the target point on the pixel plane, and an information acquisition instrument carried on the aircraft acquires in real time the flight control data generated during flight, so that carrying a laser range finder on the aircraft is avoided and the payload weight and cost of the aircraft are reduced.
Meanwhile, the first, second, third, and fourth transformation matrices are constructed from the flight control data acquired by the aircraft and multiplied to obtain the fifth transformation matrix; the inverse of the fifth transformation matrix yields the coordinate system transformation matrix that transforms the position coordinates of the target point from the camera coordinate system to the inertial coordinate system, from which the position coordinates of the target point in the inertial coordinate system are obtained. Because the coordinate system transformation matrix is constructed from flight control data acquired in real time, time delay is avoided and the real-time performance of target positioning is improved.
Fig. 4 is a flowchart illustrating a target positioning method according to an embodiment of the present invention. Specifically, the process of step S240 in Fig. 2, positioning the position coordinates of the target point in the world geographic coordinate system based on the position coordinates of the projected point and the coordinate system transformation matrix, includes the following steps:
in step S410, the position coordinates of the optical center of the camera are transformed from the camera coordinate system to the inertial coordinate system using the coordinate system transformation matrix.
In this step, the position coordinates of the optical center of the camera are transformed from the camera coordinate system to the inertial coordinate system through the coordinate system transformation matrix:

$$\tilde{P}_{O_c}^I = T_c^I\,(0, 0, 0, 1)^T \tag{12}$$

where $\tilde{P}_{O_c}^I$ is the position coordinate of the camera optical center in the inertial coordinate system, $(0, 0, 0, 1)^T$ is the homogeneous coordinate of the camera optical center in the camera coordinate system, and $T_c^I$ is the coordinate system transformation matrix that transforms position coordinates from the camera coordinate system to the inertial coordinate system.
In step S420, the position coordinates of the projected point of the target point on the pixel plane are converted into the inertial coordinate system using the coordinate system transformation matrix.
In this step, the position coordinates of the projected point of the target point on the pixel plane are converted to the inertial coordinate system using the coordinate system transformation matrix. According to the pinhole camera model of formula (6), taking unit image depth, the projected point q of the target point on the pixel plane is converted to the inertial coordinate system by:

$$\tilde{P}_q^I = T_c^I \begin{bmatrix} C^{-1} & 0 \\ 0^T & 1 \end{bmatrix} q \tag{13}$$

where $\tilde{P}_q^I$ is the position coordinate in the inertial coordinate system of the projected point q of the target point on the pixel plane, q is the projected point of the target point on the pixel plane, $T_c^I$ is the coordinate system transformation matrix that transforms position coordinates from the camera coordinate system to the inertial coordinate system, and $C^{-1}$ is the inverse of the camera intrinsic matrix.
In step S430, the image depth of the target point in the camera coordinate system is calculated based on the geometric relation between the Z_I-axis coordinate of the optical center of the camera in the inertial coordinate system and the Z_I-axis coordinate of the projected point.
Fig. 5 shows a schematic diagram of the positioning principle of the target positioning method of one embodiment of the present invention. As shown in Fig. 5, in the inertial coordinate system {X_I, Y_I, Z_I} there is a geometric relation between the Z_I-axis coordinate of the camera optical center and the Z_I-axis coordinate of the projected point q of the target point on the pixel plane. If the terrain is flat, the target point lies on the ground plane Z_I = 0, and by similar triangles:

$$\frac{z_{O_c}^I - z_q^I}{1} = \frac{z_{O_c}^I}{\lambda} \tag{14}$$

The image depth λ of the target point in the camera coordinate system is the Z_c-axis coordinate of the target point in the camera coordinate system. The calculation formula for the image depth λ is:

$$\lambda = \frac{z_{O_c}^I}{z_{O_c}^I - z_q^I} \tag{15}$$

where $z_{O_c}^I$ is the Z_I-axis coordinate of the camera optical center in the inertial coordinate system, and $z_q^I$ is the Z_I-axis coordinate of the projected point q of the target point on the pixel plane in the inertial coordinate system.
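Continuing the sketch under the flat-terrain assumption (function and variable names remain illustrative), formulas (12), (13), and (15) reduce to a few lines:

```python
import numpy as np

def image_depth(T_cI: np.ndarray, C_inv: np.ndarray,
                x_ip: float, y_ip: float) -> float:
    """Flat-terrain image depth lambda per formula (15).

    T_cI: 4x4 camera-to-inertial transform; C_inv: inverse intrinsic matrix."""
    o_I = T_cI @ np.array([0.0, 0.0, 0.0, 1.0])   # (12) optical center, inertial frame
    ray = C_inv @ np.array([x_ip, y_ip, 1.0])     # unit-depth ray, camera frame
    q_I = T_cI @ np.append(ray, 1.0)              # (13) projected point, inertial frame
    return o_I[2] / (o_I[2] - q_I[2])             # (15) similar triangles
```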
In step S440, the position coordinates of the target point are transformed from the camera coordinate system to the inertial coordinate system using the coordinate system transformation matrix, based on the image depth and the position coordinates of the projected point on the pixel plane.
In this step, based on the image depth and the position coordinates of the projected point on the pixel plane, the position coordinates of the target point are transformed from the camera coordinate system to the inertial coordinate system through the coordinate system transformation matrix:

$$\tilde{P}_t^I = T_c^I \begin{bmatrix} C^{-1} & 0 \\ 0^T & 1 \end{bmatrix} \begin{bmatrix} \lambda I & 0 \\ 0^T & 1 \end{bmatrix} q \tag{16}$$

where $\tilde{P}_t^I$ is the position coordinate of the target point in the inertial coordinate system, q is the position coordinate of the projected point of the target point on the pixel plane, $T_c^I$ is the coordinate system transformation matrix that transforms the position coordinates of the target point from the camera coordinate system to the inertial coordinate system, $C^{-1}$ is the inverse of the camera intrinsic matrix, and I is the 3×3 identity matrix.
In step S450, the position coordinates of the target point are transformed from the inertial coordinate system to the world geographic coordinate system, and the position coordinates of the target point in the world geographic coordinate system are located.
According to the embodiment of the invention, a coordinate system transformation matrix that transforms the position coordinates of the target point from the camera coordinate system to the inertial coordinate system is constructed based on the flight control data of the aircraft; based on this matrix and the position coordinates of the projected point of the target point on the pixel plane of the camera, the position coordinates of the target point are transformed from the camera coordinate system to the inertial coordinate system, and then from the inertial coordinate system to the world geographic coordinate system, thereby locating the target point in the world geographic coordinate system. Because the coordinate transformation is applied to the acquired position coordinates of the projected point on the pixel plane, the projected point of any target point on the pixel plane can be used, not only the image center, so the position coordinates of any target point in the world geographic coordinate system can be located, which enlarges the target positioning range of the target positioning method of the aircraft.
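Tying the sketches together (reusing camera_to_inertial and image_depth from above; the inertial-to-world geographic step S450 would anchor the inertial origin at the takeoff point's GPS fix and apply a standard geodetic conversion, which is outside this sketch):

```python
import numpy as np

def locate_target_inertial(T_cI: np.ndarray, C_inv: np.ndarray,
                           x_ip: float, y_ip: float) -> np.ndarray:
    """Pixel observation -> target position in the inertial frame, per (16)."""
    lam = image_depth(T_cI, C_inv, x_ip, y_ip)    # image depth, formula (15)
    ray = C_inv @ np.array([x_ip, y_ip, 1.0])     # normalized ray, camera frame
    P_c = np.append(lam * ray, 1.0)               # camera-frame target, formula (6)
    return (T_cI @ P_c)[:3]                       # formula (16): inertial coordinates

# Usage sketch:
# T_cI = camera_to_inertial(flight_control_data)
# target_I = locate_target_inertial(T_cI, np.linalg.inv(C), 400.0, 300.0)
```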
Fig. 6 is a schematic diagram of the structure of a target positioning system according to an embodiment of the present invention. As shown in Fig. 6, the target positioning system includes: a first acquisition unit 610, a second acquisition unit 620, a construction unit 630, and a positioning unit 640.
The first acquisition unit 610 is configured to acquire, in real time, the position coordinates of a projected point of a target point on the pixel plane of a camera carried on an aircraft.
The second acquisition unit 620 is configured to acquire flight control data of the aircraft in real time.
The construction unit 630 is configured to construct, based on the flight control data, a coordinate system transformation matrix that transforms the position coordinates of the target point from the camera coordinate system to an inertial coordinate system.
The positioning unit 640 is configured to position the position coordinates of the target point in the world geographic coordinate system based on the position coordinates of the projected point on the pixel plane and the coordinate system transformation matrix.
Fig. 7 is a block diagram of a control device for target positioning according to an embodiment of the present invention. The device shown in Fig. 7 is only an example and should not limit the functionality or scope of use of embodiments of the present invention in any way.
Referring to Fig. 7, the device includes a processor 710, a memory 720, and an input-output device 730, connected by a bus. The memory 720 includes read-only memory (ROM) and random-access memory (RAM); the computer instructions and data required to perform system functions are stored in the memory 720, and the processor 710 reads instructions from the memory 720 to perform the appropriate actions and processes. The input-output device includes an input section such as a keyboard and a mouse; an output section such as a cathode ray tube (CRT) or liquid crystal display (LCD) and a speaker; a storage section such as a hard disk; and a communication section including a network interface card such as a LAN card or a modem. The memory 720 also stores the computer instructions that perform the operations specified by the target positioning method of embodiments of the present invention.
Accordingly, embodiments of the present invention provide a computer-readable storage medium storing computer instructions which, when executed, implement the operations specified in the target positioning method above.
The flowcharts and block diagrams in the figures illustrate possible architectures, functions, and operations of systems, methods, and apparatuses according to embodiments of the present invention. Each block may represent a module, a program segment, or a code segment of executable instructions for implementing the specified logical functions. It should also be noted that the executable instructions implementing the specified logical functions may be recombined to create new modules and program segments. The blocks of the drawings, and their order, are thus provided to better illustrate the processes and steps of the embodiments and should not be taken as limiting the invention itself.
The above description presents only a few embodiments of the present invention and is not intended to limit it; various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall be included in its protection scope.

Claims (11)

1. A target positioning method, comprising:
acquiring, in real time, the position coordinates of a projected point of a target point on the pixel plane of a camera carried on an aircraft;
acquiring flight control data of the aircraft in real time;
constructing, based on the flight control data, a coordinate system transformation matrix that transforms the position coordinates of the target point from a camera coordinate system to an inertial coordinate system; and
positioning the position coordinates of the target point in a world geographic coordinate system based on the position coordinates of the projected point and the coordinate system transformation matrix.
2. The method of claim 1, wherein positioning the position coordinates of the target point in a world geographic coordinate system based on the position coordinates of the projected point and the coordinate system transformation matrix comprises:
transforming the position coordinates of the optical center of the camera from a camera coordinate system to an inertial coordinate system using the coordinate system transformation matrix;
converting the position coordinates of the projected point of the target point on the pixel plane into the inertial coordinate system using the coordinate system transformation matrix; and
calculating, based on the geometric relation between the Z_I-axis coordinate of the optical center of the camera in the inertial coordinate system and the Z_I-axis coordinate of the projected point, the image depth of the target point in the camera coordinate system.
3. The method of claim 2, wherein positioning the position coordinates of the target point in the world geographic coordinate system based on the position coordinates of the projected point and the coordinate system transformation matrix further comprises:
transforming the position coordinates of the target point from the camera coordinate system to the inertial coordinate system using the coordinate system transformation matrix, based on the image depth and the position coordinates of the projected point on the pixel plane.
4. The method of claim 3, wherein positioning, in real time, the position coordinates of the target point in the world geographic coordinate system based on the position coordinates of the projected point and the coordinate system transformation matrix further comprises:
transforming the position coordinates of the target point from the inertial coordinate system to the world geographic coordinate system, and positioning the position coordinates of the target point in the world geographic coordinate system.
5. The method of claim 4, wherein the image depth is the Z_c-axis coordinate of the target point in the camera coordinate system.
6. The method of claim 1, wherein the flight control data comprises: position information of the aircraft, attitude information of the aircraft, position information of the pan-tilt, and attitude information of the pan-tilt;
the position information of the aircraft includes: the position coordinates of the aircraft in the inertial coordinate system and the position coordinates of the aircraft in the pan-tilt coordinate system;
the attitude information of the aircraft includes: the roll angle, pitch angle, and yaw angle of the aircraft;
the pan-tilt position information includes: the translation of the rotational center of the pan-tilt relative to the optical center of the camera in the camera coordinate system; and
the pan-tilt attitude information includes: the pan-tilt yaw angle and the pan-tilt pitch angle.
7. The method of claim 6, wherein constructing the coordinate system transformation matrix that transforms the position coordinates of the target point from the camera coordinate system to the inertial coordinate system based on the flight control data comprises:
constructing a first transformation matrix from the inertial coordinate system to the aircraft geographic coordinate system based on the position information of the aircraft;
constructing a second transformation matrix from the aircraft geographic coordinate system to the aircraft body coordinate system based on the attitude information of the aircraft;
constructing a third transformation matrix from the aircraft body coordinate system to the pan-tilt coordinate system based on the pan-tilt attitude information and the aircraft position information;
constructing a fourth transformation matrix from the pan-tilt coordinate system to the camera coordinate system based on the pan-tilt position information;
multiplying the first, second, third, and fourth transformation matrices to obtain a fifth transformation matrix; and
calculating the inverse of the fifth transformation matrix to obtain the coordinate system transformation matrix that transforms the position coordinates of the target point from the camera coordinate system to the inertial coordinate system.
8. The target positioning method according to claim 3, wherein, based on the image depth and the position coordinates of the projected point on the pixel plane, the position coordinates of the target point are transformed from the camera coordinate system to the inertial coordinate system by:

$$\tilde{P}_t^I = T_c^I \begin{bmatrix} C^{-1} & 0 \\ 0^T & 1 \end{bmatrix} \begin{bmatrix} \lambda I & 0 \\ 0^T & 1 \end{bmatrix} q$$

where $\tilde{P}_t^I$ is the position coordinate of the target point in the inertial coordinate system, q is the position coordinate of the projected point of the target point on the pixel plane, $T_c^I$ is the coordinate system transformation matrix that transforms the position coordinates of the target point from the camera coordinate system to the inertial coordinate system, $C^{-1}$ is the inverse of the camera intrinsic matrix, $\begin{bmatrix} \lambda I & 0 \\ 0^T & 1 \end{bmatrix}$ is a homogeneous matrix, I is the 3×3 identity matrix, and λ is the image depth.
9. A target positioning system, comprising:
a first acquisition unit configured to acquire, in real time, the position coordinates of a projected point of a target point on the pixel plane of a camera carried on an aircraft;
a second acquisition unit configured to acquire flight control data of the aircraft in real time;
a construction unit configured to construct, based on the flight control data, a coordinate system transformation matrix that transforms the position coordinates of the target point from a camera coordinate system to an inertial coordinate system; and
a positioning unit configured to position the position coordinates of the target point in a world geographic coordinate system based on the position coordinates of the projected point and the coordinate system transformation matrix.
10. A computer-readable storage medium storing computer instructions which, when executed, implement the target positioning method according to any one of claims 1 to 8.
11. A control device for object positioning, comprising:
a memory for storing computer instructions;
a processor coupled to the memory, the processor being configured to perform the target positioning method of any one of claims 1 to 8 based on the computer instructions stored by the memory.
CN201911291169.6A 2019-12-16 2019-12-16 Target positioning method and system Pending CN112985398A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911291169.6A CN112985398A (en) 2019-12-16 2019-12-16 Target positioning method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911291169.6A CN112985398A (en) 2019-12-16 2019-12-16 Target positioning method and system

Publications (1)

Publication Number Publication Date
CN112985398A true CN112985398A (en) 2021-06-18

Family

ID=76343035

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911291169.6A Pending CN112985398A (en) 2019-12-16 2019-12-16 Target positioning method and system

Country Status (1)

Country Link
CN (1) CN112985398A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113804187A * 2021-09-01 2021-12-17 河北汉光重工有限责任公司 Integrated system for photoelectric pod target positioning
CN114167900A * 2021-11-19 2022-03-11 北京环境特性研究所 Photoelectric tracking system calibration method and device based on unmanned aerial vehicle and differential GPS
CN114167900B * 2021-11-19 2023-06-30 北京环境特性研究所 Photoelectric tracking system calibration method and device based on unmanned aerial vehicle and differential GPS

Similar Documents

Publication Publication Date Title
CN111652964B (en) Auxiliary positioning method and system for power inspection unmanned aerial vehicle based on digital twinning
CN103822635B (en) The unmanned plane during flying spatial location real-time computing technique of view-based access control model information
CN107014380B (en) Combined navigation method based on visual navigation and inertial navigation of aircraft
CN104808558B (en) A kind of multitask load system suitable for extraordinary general-purpose aircraft
CN107247458A (en) UAV Video image object alignment system, localization method and cloud platform control method
CN111966133A (en) Visual servo control system of holder
Johnson et al. Real-time terrain relative navigation test results from a relevant environment for Mars landing
CN105627991A (en) Real-time panoramic stitching method and system for unmanned aerial vehicle images
CN105790155A (en) Differential-GPS-based unmanned-aerial-vehicle autonomous routing inspection system and method for power transmission line
CN111366148B (en) Target positioning method suitable for multiple observations of airborne photoelectric observing and sighting system
Madawalagama et al. Low cost aerial mapping with consumer-grade drones
CN104618689A (en) Method and system for monitoring offshore oil spillage based on UAV
CN110706273B (en) Real-time collapse area measurement method based on unmanned aerial vehicle
CN113240813B (en) Three-dimensional point cloud information determining method and device
CN105243364B (en) Photoelectric nacelle searching method, device and system
CN113640825A (en) Unmanned aerial vehicle composite three-dimensional surveying and mapping system and method
US20210208608A1 (en) Control method, control apparatus, control terminal for unmanned aerial vehicle
CN114004977A (en) Aerial photography data target positioning method and system based on deep learning
CN112985398A (en) Target positioning method and system
CN115439531A (en) Method and equipment for acquiring target space position information of target object
Qiao et al. Ground target geolocation based on digital elevation model for airborne wide-area reconnaissance system
CN114494423B (en) Unmanned platform load non-central target longitude and latitude positioning method and system
CN115876197A (en) Mooring lifting photoelectric imaging target positioning method
CN112489118B (en) Method for quickly calibrating external parameters of airborne sensor group of unmanned aerial vehicle
CN112132029B (en) Unmanned aerial vehicle remote sensing image rapid positioning method for earthquake emergency response

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination