CN115201746A - Phase interferometer static calibration method based on visual positioning - Google Patents

Phase interferometer static calibration method based on visual positioning

Info

Publication number
CN115201746A
Authority
CN
China
Prior art keywords
camera
point
calibration
points
positioning
Prior art date
Legal status
Pending
Application number
CN202210612919.0A
Other languages
Chinese (zh)
Inventor
陈鸿
杜靖
刘开元
李进杰
刘扬
郑修鹏
Current Assignee
Qingdao Campus of Naval Aviation University of PLA
Original Assignee
Qingdao Campus of Naval Aviation University of PLA
Priority date
Filing date
Publication date
Application filed by Qingdao Campus of Naval Aviation University of PLA filed Critical Qingdao Campus of Naval Aviation University of PLA
Priority to CN202210612919.0A priority Critical patent/CN115201746A/en
Publication of CN115201746A publication Critical patent/CN115201746A/en
Pending legal-status Critical Current


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S3/00 Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S3/02 Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using radio waves
    • G01S3/023 Monitoring or calibrating
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/75 Determining position or orientation of objects or cameras using feature-based methods involving models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30244 Camera pose

Abstract

The invention relates to a visual-positioning-based static calibration method for a phase interferometer. The calibration equipment comprises a camera, a feature board, positioning feature points, a signal source and a control computer; the control computer is connected to the camera and the signal source. The feature board is a black-and-white checkerboard used to position the camera; it is placed near the phase interferometer antenna array, its area is determined by the calibration distance of the camera, and the spacing d between adjacent corners is known. The positioning feature points are placed on the surface of the antenna array and are used to calculate the baseline of the antenna array on the carrier platform. The camera is a digital camera; the signal source is a general-purpose microwave signal source.

Description

Phase interferometer static calibration method based on visual positioning
Technical Field
The invention relates to a static calibration method of a phase interferometer based on visual positioning.
Background
In electronic reconnaissance, phase interferometer direction-finding systems offer a wide frequency band, high precision and high sensitivity, are moderate in size, and are widely used on airborne, shipborne and vehicle-mounted platforms. The basic principle of phase interferometer direction finding is to determine the bearing of a radiation source from the phase differences of the signals received from that same source by the elements of an antenna array. In engineering practice, the signal transmission paths from each antenna to the receiver and from the receiver to the processor cannot be made exactly identical, so an inherent phase difference inevitably exists among the antenna and receiving channels. In practical applications this inherent phase difference must therefore be removed by calibration to guarantee the correctness and accuracy of the final direction-finding result. In particular, after the direction-finding system has been installed on its application platform and used for some time, changes in the hardware state of the equipment (aging, replacement, etc.) require the system to be calibrated periodically, or as circumstances demand, to maintain direction-finding accuracy.
In the installed state, there are currently two main types of calibration methods for a phase interferometer direction-finding system:
One type is dynamic calibration, also known as internal-source calibration. A signal source built into the phase interferometer direction-finding equipment generates the calibration signal and corrects the phase inconsistency of each channel from the receiver to the processor. Because this is a built-in factory function, it is convenient to operate and needs no additional test equipment. However, the built-in calibration signal is generally injected at the receiver, so this mode cannot correct the phase inconsistency of the channel section from the antenna to the receiver, and dynamic calibration cannot achieve its purpose once the antenna, feeder or receiver has aged or been replaced.
The other type is static calibration, also called external-radiation calibration: an independent transmitting device radiates the calibration signal from outside the reconnaissance equipment, reproducing the complete working process from external illumination to direction finding, so that each channel can be calibrated over its whole path from antenna to processor. Static calibration is therefore more comprehensive and accurate; moreover, according to the maintenance requirements of interferometer direction-finding equipment, static calibration must be performed after the equipment has been in use for a period of time (generally several months) or when units such as the antenna, feeder or receiver are replaced.
The existing external-field static calibration scheme is implemented as follows:
The external-field static calibration equipment mainly comprises a total station, a signal source and a control computer; the conventional calibration process is shown in Fig. 1. At present, when static calibration of a phase interferometer direction-finding system is carried out in an external-field installed environment, measuring the normal of the direction-finding array is inefficient and of limited precision because of the attitude of the installation platform, ground flatness, manual operation and similar factors. The present invention is based on computer vision and measures the normal position of the interferometer array automatically through visual positioning, replacing the manual measurement means, realizing automatic high-precision calibration of the normal of the phase interferometer direction-finding array in the installed environment, and improving the overall efficiency and precision of static calibration of the system under external-field conditions.
First step, normal measurement: the normal OS of the phase interferometer antenna array is determined with a total station.
Second step, signal radiation: a signal source is placed at a sufficiently large distance along the normal OS direction and radiates the calibration signal.
Third step, calibration table generation: the phase differences between the signals received by the array elements of the antenna array are measured and taken as the inherent phase differences of the phase interferometer at that frequency; the frequency is then changed, and the phase differences between the element channels are measured at each frequency point to form a phase calibration table.
Fourth step, injection into the equipment: the phase calibration table is injected into the direction-finding equipment, where it is used to correct the direction-finding results in real time and guarantee direction-finding precision.
At present, under the control of a calibration computer, the second to fourth steps can be completed efficiently and automatically by the equipment. The main bottleneck affecting the speed and accuracy of external-field static calibration is the first step, namely measuring the normal of the phase interferometer antenna array.
Taking the most common total-station measurement method as an example, assume that A and B are two fixed points of the phase interferometer antenna array, determined when the direction-finding system is installed on the platform; the line connecting A and B represents the installation baseline of the interferometer array.
First, the projection A′B′ of the two points A and B onto the ground is found by the plumb-line method, and the centre O′ of A′B′ is located;
then, the total station is set up over the point O′ and levelled, i.e. O′ is aligned with the centre of the total station while its attitude is adjusted to horizontal;
next, the total station is aimed along the extension of A′B′ and that orientation is set to 0°;
finally, the total station is rotated by exactly 90°, which gives the normal OS.
It can thus be seen that determining the normal OS of the phase interferometer antenna array with a total station is essentially a manual process, with several disadvantages:
First, low efficiency: in the external field, operating the total station for projection, levelling and aiming often requires repeated correction and is time- and labour-consuming. Moreover, in current practical direction-finding systems the phase interferometer array usually has several baselines for resolving direction-finding ambiguity, so the normal of every baseline must be calibrated; finding the normals becomes very tedious, a single calibration often takes more than ten hours or even tens of hours, and calibration efficiency is low.
Second, limited precision: because the process is mainly manual and the projection, levelling and aiming steps are tightly interlinked, improper operation easily introduces errors that accumulate through the subsequent steps; the finally calibrated normal can deviate by a large angle, exceeding the direction-finding error budget of the equipment and failing to meet the calibration precision requirement.
Third, site constraints: because points are projected onto the ground for observation and calibration, a relatively flat and open site is required, and the attitude of the installation platform must be roughly horizontal with respect to the ground; otherwise the platform itself has to be levelled.
Disclosure of Invention
The technical problem the invention aims to solve is to provide a visual-positioning-based static calibration method for a phase interferometer. A common camera replaces the total station; based on computer vision and photogrammetric geometry, the three-dimensional coordinates of the phase interferometer array on its carrier platform are reconstructed in the external-field environment, and on this basis the normal position of the interferometer array is measured automatically through visual positioning. This replaces the manual total-station measurement, realizes automatic high-precision calibration of the phase interferometer normal, and improves the overall efficiency and precision of static calibration under external-field conditions.
In order to solve the problems, the technical scheme adopted by the invention is as follows:
the calibration equipment used in the invention comprises a camera, a feature board, a positioning feature point, a signal source and a control computer; the control computer is connected with the camera and the signal source; the characteristic board adopts black and white checkerboards which are used for positioning the camera and placed near the phase interferometer antenna array, the area of the checkerboards is determined by the calibration distance of the camera, and the distance d between every two checkerboards is known; positioning characteristic points, which are placed on the surface of the antenna array and used for calculating the base line of the antenna array on the carrier platform; the camera adopts a digital camera; the signal source uses a general microwave signal source.
The control computer is provided with a processing center of visual positioning processing and automatic calibration software and is used for interacting with the camera, acquiring photos shot by the camera in real time to reconstruct three-dimensional coordinates, dynamically calculating the current position of the camera and aligning errors and indicating an alignment correction direction; the control computer completes the cross-linking with the phase interferometer direction-finding equipment, controls the signal source work of the microwave, and automatically calculates and generates a full-frequency-band phase calibration table;
the camera, the signal source and the control computer are integrated on the calibration trolley with the small wheels;
the calibration method is as follows,
S1, constructing the static calibration environment: the calibration environment is built under external-field conditions, the three-dimensional reconstruction feature board and the positioning feature points are placed, and the camera, the signal source and the control computer are connected; the control computer is then connected to the phase interferometer direction-finding equipment to be calibrated;
S2, visually reconstructing three-dimensional coordinates: the camera takes several photographs and/or videos of the phase interferometer array from different angles, each containing the feature board and the positioning feature points; a computer vision algorithm then automatically reconstructs the three-dimensional coordinates of the phase interferometer array on the carrier platform in the external-field environment;
S3, calculating, in the reconstructed three-dimensional space, the mid-perpendicular plane of the interferometer array baseline, which contains the normal to be calibrated;
S4, dynamically indicating, through a visual positioning algorithm, the angle between the current shooting position of the camera and the mid-perpendicular plane, and guiding the calibration cart to correct the shooting position until that angle is 0°, thereby completing alignment with the normal OS of the phase interferometer antenna array;
S5, under the automatic control of the control computer, the signal source on the calibration cart at the normal position radiates calibration signals; the phase differences acquired by the antennas of the phase interferometer direction-finding equipment under test are measured and compared, and a phase calibration table is generated automatically, completing the static calibration.
In S2, step S2.1 is executed: the image taken by the camera at the set shooting position P1 fully captures the feature board and the positioning feature points placed in advance at the phase interferometer direction-finding antenna array. Because both the feature board and the positioning feature points are checkerboards of alternating light and dark squares, all feature corner points on the feature board and the positioning feature points are detected by a checkerboard corner detection algorithm, the two-dimensional coordinates of the corner points in the image are solved, and the corners are ordered from left to right and from top to bottom so that the index of the same corner point is consistent across different images; corner points with consistent indices are called matching points.
S2.2, initializing the three-dimensional coordinates of the feature board:
first, the first crossing corner point at the upper-left corner of the feature board is set as the origin O(0, 0, 0) of the world coordinate system, and the feature board is defined to lie in the XOY plane. Since the corner spacing d is known, the corner point in the m-th row and n-th column has world coordinates [(n-1)·d, (m-1)·d, 0]; the three-dimensional coordinates of all corner points on the three-dimensional reconstruction feature board in the world coordinate system are thus determined.
S2.3, calculating the camera pose: when the camera shoots from point P1, its camera coordinate system (P1-x1y1z1) is considered to have undergone a rigid-body motion relative to the world coordinate system (O-XYZ). According to the principle of three-dimensional rigid-body motion, the motion between the two coordinate systems is defined as one rotation plus one translation. For a corner point N of the three-dimensional reconstruction feature board, the three-dimensional coordinates in the world coordinate system are known and denoted N = [X, Y, Z]^T; its coordinates in the camera coordinate system (P1-x1y1z1) at P1 are denoted N1 = [x_n1, y_n1, z_n1]^T.
The coordinates of N1 are then: N1 = R·N + t;
where R is a 3×3 rotation matrix, t is a 3×1 translation vector, and R and t together are the position and attitude of the camera, i.e. the camera pose. Defining the augmented matrix [R|t] as a 3×4 matrix gives formula (1):
N1 = [R|t]·[X, Y, Z, 1]^T        (1)
The camera pose [R|t] is unknown; it is determined from the images of the corner points on the three-dimensional reconstruction board. N forms an image point n1 in the image taken at P1, with two-dimensional coordinates n1 = [u_n1, v_n1]^T.
The camera is modelled as a pinhole camera with focal length f; in homogeneous coordinates,
z_n1·[u_n1, v_n1, 1]^T = K·[x_n1, y_n1, z_n1]^T        (2)
i.e. z_n1·n1 = K·N1,
where K is the intrinsic parameter matrix of the camera, a coefficient common to all images and known in advance.
From formula (2):
N1 = z_n1·K^{-1}·[u_n1, v_n1, 1]^T        (3)
Substituting formula (3) into formula (1) and expanding eliminates z_n1 and yields two constraints per corner point. Writing p1^T, p2^T, p3^T for the three rows of [R|t], Ñ = [X, Y, Z, 1]^T, and [u', v', 1]^T = K^{-1}·[u_n1, v_n1, 1]^T:
u' = (p1^T·Ñ)/(p3^T·Ñ),  v' = (p2^T·Ñ)/(p3^T·Ñ)        (4)
The augmented camera-pose matrix [R|t] at P1 to be solved has 12 unknowns in total, while the camera intrinsics K and the three-dimensional coordinates (X, Y, Z) of the corner points on the three-dimensional reconstruction feature board are known, so each corner provides the two constraints of formula (4); 6 feature corner points therefore suffice to solve for [R|t], and with more than 6 points the least-squares solution of the equations is obtained by the SVD method.
Once the rotation matrix R and the translation vector t have been solved, the position and attitude of the camera at shooting point P1 are determined. The positions and attitudes of the camera at all subsequent shooting points (Pn, n = 2, 3, 4, …) are determined in the same way.
S2.4, calculating the three-dimensional coordinates of the positioning feature points: after the poses of the cameras at the two shooting points P1 and P2 have been determined in S2.3, the three-dimensional coordinates of all positioning feature points in space are determined by triangulation. Take a positioning feature point A; a1 and a2 are the pixel coordinates of this positioning feature point in the images taken by the camera at P1 and P2 respectively. All points on the ray P1A project to the same pixel point a1; when the spatial position of A is unknown, the ray P2a2 intersects the ray P1a1 at the point A, from which the spatial position of A is deduced, realizing three-dimensional reconstruction of the positioning feature points.
S2.5, when the error of the least-squares solution obtained from the images is larger than a set threshold, bundle adjustment is executed. The calculated coordinates of point A are projected into camera Pi to obtain two-dimensional coordinates ai, while the actual imaging coordinates of point A in that image are a'i; when ai and a'i do not coincide, the distance between the two points is called the reprojection error. Bundle adjustment minimizes the reprojection errors over all images by adjusting the camera poses and the coordinates of point A; a satisfactory result is obtained when the number of images M reaches 10.
In S3-S4, visual positioning guides the alignment.
In S3, the mid-perpendicular plane of the interferometer antenna array is measured: after the three-dimensional coordinates of all positioning feature points have been obtained, a straight line l is fitted through these points in space and the plane perpendicular to this line is computed; this plane is the mid-perpendicular plane of the phase interferometer antenna array, and it intersects the line l at a point C.
In S4, the calibration-position offset is dynamically guided into alignment: first, t in the camera pose is regarded as the spatial coordinates of the camera centre P at the current shooting position. With the coordinates of point P and point C known, subtracting them gives the direction vector t_pc of the line connecting the camera centre P and the point C. The current calibration position is the current shooting position, and its angle with the mid-perpendicular plane is the angle between the vector t_pc and the mid-perpendicular plane, denoted θ:
θ = arcsin( (t_pc · l) / (|t_pc|·|l|) )
where l denotes the direction vector of the fitted line (the normal of the mid-perpendicular plane).
Then, given the offset angle θ, the current camera position (shooting position = calibration position) is adjusted dynamically according to the magnitude and sign of the offset angle until θ is smaller than ±δ, the alignment accuracy; the measurement of the normal of the phase interferometer antenna array and the alignment with it are then complete.
The invention has the advantages that:
First, the calibration time is greatly shortened. By shooting a few dozen photographs (or a short video), the normal direction of the phase interferometer array antenna on the external-field platform can be determined quickly through fully automatic three-dimensional reconstruction, and the position and motion of the camera are then tracked by visual positioning so that the radiation position is adjusted dynamically until it is aligned with the normal. The traditional total-station procedure of manual projection, levelling, aiming and repeated correction, which is time- and labour-consuming, is avoided; a single calibration can be shortened from the original ten-plus hours to within half an hour, greatly improving calibration efficiency.
Secondly, the calibration precision is ensured. Firstly, normal measurement and deviation indication are completed automatically, so that manual errors are eliminated; the normal alignment precision is determined by three-dimensional reconstruction and vision positioning precision, which is an optical measurement mode in strict sense, and on the basis of correct calibration of camera internal parameters, the normal alignment precision is less than 0.1 degree (test value), thereby meeting the calibration requirement.
Thirdly, the requirement on the operation of the personnel is not high. In the calibration process, normal alignment can be finished under the guidance of the system only by an operator being capable of clearly shooting photos or videos containing the feature boards and the feature points at multiple angles; the operator is not required to master complicated measurement operations such as balancing and aiming of the total station; the calibration method has the advantages that the requirement on the capability of calibration personnel is reduced, the calibration method can be mastered by using and maintaining personnel of the phase interferometer direction-finding equipment, and the maintenance cost is reduced.
And fourthly, the requirements on the field and the platform state are lower. Under the condition that the field is uneven or the platform for installing the antenna array is inclined or rolled to a certain degree, the visual reconstruction of the camera shooting is not influenced, the three-dimensional coordinates of the antenna array can be reconstructed correctly, the laborious work of traction, platform jacking and the like is not needed, and the scene applicability of the invention is stronger.
Fifthly, the cost of the calibration equipment is not high. The total station in the traditional static calibration system is replaced by a camera, and meanwhile, vision positioning software is deployed in a control computer, so that the calibration system can be upgraded.
Drawings
FIG. 1 is a diagram of a prior art phase interferometer external field static calibration scheme.
FIG. 2 is a diagram of the outfield static calibration scheme of the present invention.
Fig. 3 is a flow chart of the implementation process of the present invention.
Fig. 4 is a schematic diagram of three-dimensional coordinates of a visually reconstructed antenna array according to the present invention.
Fig. 5 is a schematic diagram of the three-dimensional reconstruction feature plate corner point coordinates of the invention.
Fig. 6 is a schematic diagram of bundle adjustment according to the present invention.
Fig. 7 is a schematic view of the visual alignment guide of the present invention.
Detailed Description
With reference to Figs. 1-7, the calibration equipment used in the invention comprises a camera 3, a feature board 4, a signal source 1, a control computer 2, a phase interferometer antenna array 5 and positioning feature points 6; the overall implementation is shown in Fig. 2. The feature board 4 is the three-dimensional reconstruction feature board.
In Fig. 2 there are two kinds of markers: one is the black-and-white checkerboard feature board, placed near the phase interferometer antenna array, with an area within 1 square metre (determined by the calibration distance) and a known corner spacing d, used for visually reconstructing the three-dimensional coordinates of the space; the other is the positioning feature points, placed at the positioning points of the antenna array and used for visually positioning the baseline of the antenna array on the carrier platform.
The camera is a common digital camera whose performance is sufficient to take clear images of the feature board and feature points at the calibration distance. The signal source is a conventional general-purpose microwave signal source that covers the working frequency band and can be controlled by software. The control computer is the processing centre of the visual positioning and automatic calibration software: first, it interacts with the camera, acquires the photographs (video) taken by the camera in real time to reconstruct the three-dimensional coordinates, dynamically calculates the current camera position and alignment error, and indicates the direction of alignment correction; second, it interfaces with the phase interferometer direction-finding equipment, controls the microwave signal source, and automatically calculates and generates the full-band phase calibration table.
The specific steps of the implementation of the invention are shown in figure 3.
In the first step, the static calibration environment is constructed: the calibration environment is built under external-field conditions, the three-dimensional reconstruction feature board and the positioning feature points are placed, and the camera, the signal source and the control computer are connected (the three can be integrated on a small wheeled calibration cart); the control computer is then connected to the phase interferometer direction-finding equipment to be calibrated.
In the second step, the three-dimensional coordinates are reconstructed visually: the camera takes several photographs or videos of the phase interferometer array from different angles (each containing the feature board and the positioning feature points), and a computer vision algorithm then automatically reconstructs the three-dimensional coordinates of the phase interferometer array on the carrier platform in the external-field environment.
In the third step, the mid-perpendicular plane of the interferometer array baseline (which contains the normal to be calibrated) is calculated in the reconstructed three-dimensional space.
In the fourth step, a visual positioning algorithm dynamically indicates the angle between the current shooting position of the camera (the position of the calibration cart) and the mid-perpendicular plane, guiding a quick correction of the shooting position (the calibration cart) until the angle is 0°, thereby completing the alignment with the normal OS of the phase interferometer antenna array.
In the fifth step, under the automatic control of the control computer, the signal source on the calibration cart at the normal position radiates calibration signals; the phase differences acquired by the antennas of the phase interferometer direction-finding equipment under test are measured and compared, and a phase calibration table is generated automatically, completing the static calibration.
Two key technologies in the implementation process are analyzed in detail as follows:
As shown in Fig. 4, the two images taken by the camera at positions P1 and P2 fully capture the feature board and feature points placed in advance at the phase interferometer direction-finding antenna array. The feature board and feature points are formed of squares of alternating light and dark, like a chessboard; all feature corner points on the feature board and the positioning feature points are detected by a checkerboard corner detection algorithm, the two-dimensional coordinates of the corner points in the images are solved, and the corners are ordered from left to right and from top to bottom so that the index of the same corner point is consistent across different images. Corner points with consistent indices are called matching points.
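As an illustration of this corner-detection and ordering step, the following Python sketch uses OpenCV's chessboard detector; the function name and board size are illustrative assumptions rather than part of the patent. OpenCV returns the corners in a consistent row-major order, which keeps the index of each corner identical across images (the matching points).

```python
import cv2
import numpy as np

def detect_corners(image_path, board_size=(9, 6)):
    """Detect and order inner chessboard corners (left-to-right, top-to-bottom).

    board_size is (columns, rows) of inner corners, an assumed example value.
    Returns an N x 2 array of (u, v) pixel coordinates, or None if not found.
    """
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, board_size)
    if not found:
        return None
    # refine to sub-pixel accuracy for better pose estimation and triangulation
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)
    corners = cv2.cornerSubPix(gray, corners, (11, 11), (-1, -1), criteria)
    return corners.reshape(-1, 2)
```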
(1) Feature plate three-dimensional coordinate initialization
Since the feature board is planar, the first corner point at its upper-left corner can be set as the origin O(0, 0, 0) of the world coordinate system, with the feature board lying in the XOY plane, as shown in Fig. 5. The corner spacing d is known, so the corner point in the m-th row and n-th column has world coordinates [(n-1)·d, (m-1)·d, 0]; the three-dimensional coordinates of all corner points on the three-dimensional reconstruction feature board in the world coordinate system can thus be determined.
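Generating these world coordinates takes only a few lines. The sketch below assumes, for illustration, a board with a given number of inner-corner rows and columns and a spacing d, ordered the same way as the detected image corners (rows top-to-bottom, left-to-right within each row).

```python
import numpy as np

def board_world_points(rows, cols, d):
    """World coordinates of the feature-board corners.

    The corner in row m, column n (1-based) is [(n-1)*d, (m-1)*d, 0]; the first
    corner at the upper-left is the world origin O(0, 0, 0), and all corners lie
    in the XOY plane.
    """
    pts = [[n * d, m * d, 0.0] for m in range(rows) for n in range(cols)]
    return np.asarray(pts, dtype=np.float64)   # (rows*cols) x 3 array
```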
(2) Camera pose calculation
When the camera shoots from point P1, its camera coordinate system (P1-x1y1z1) can be considered to have undergone a rigid-body motion relative to the world coordinate system (O-XYZ). According to the principle of three-dimensional rigid-body motion, the motion between the two coordinate systems can be composed of one rotation plus one translation.
For example, take a corner point N of the three-dimensional reconstruction feature board; its three-dimensional coordinates in the world coordinate system are known and denoted N = [X, Y, Z]^T, and its coordinates in the camera coordinate system (P1-x1y1z1) at P1 are denoted N1 = [x_n1, y_n1, z_n1]^T.
The coordinates of N1 are then:
N1 = R·N + t
where R is a 3×3 rotation matrix, t is a 3×1 translation vector, and R and t are the position and attitude of the camera (the camera pose for short). Defining the augmented matrix [R|t] as a 3×4 matrix, the following equation can be written:
N1 = [R|t]·[X, Y, Z, 1]^T        (1)
At this point the camera pose [R|t] is unknown; it can be determined from the images of the corner points on the three-dimensional reconstruction board. N forms an image point n1 in image 1, whose two-dimensional coordinates are n1 = [u_n1, v_n1]^T.
From the pinhole camera model, with f the focal length of the camera, in homogeneous coordinates:
z_n1·[u_n1, v_n1, 1]^T = K·[x_n1, y_n1, z_n1]^T        (2)
that is,
z_n1·n1 = K·N1
where K is the intrinsic parameter matrix of the camera. The intrinsic parameters are fixed when the camera leaves the factory and do not change during use; some manufacturers state them directly, and otherwise a mature calibration algorithm (e.g. the Zhang Zhengyou calibration method) can be used to obtain them, so K can be considered known.
From formula (2):
N1 = z_n1·K^{-1}·[u_n1, v_n1, 1]^T        (3)
Substituting into formula (1) and expanding eliminates z_n1 and yields two constraints per corner point. Writing p1^T, p2^T, p3^T for the three rows of [R|t], Ñ = [X, Y, Z, 1]^T, and [u', v', 1]^T = K^{-1}·[u_n1, v_n1, 1]^T:
u' = (p1^T·Ñ)/(p3^T·Ñ),  v' = (p2^T·Ñ)/(p3^T·Ñ)        (4)
The augmented camera-pose matrix [R|t] at P1 to be solved has 12 unknowns in total, while the camera intrinsics K and the three-dimensional coordinates (X, Y, Z) of the corner points on the three-dimensional reconstruction feature board are known, so each corner provides the two constraints above. Hence 6 feature corner points are enough to solve for the matrix [R|t], and with more than 6 points the least-squares solution of the equations is found by methods such as SVD.
After the rotation matrix R and the translation vector t have been solved, the position and attitude of the camera at shooting point P1 are determined. In the same way, the position and attitude of the camera at P2, or at any other shooting point, can be determined.
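A hedged sketch of this pose-estimation step in Python: rather than hand-rolling the SVD solution of constraints (4), it relies on OpenCV's solvePnP, which solves the same perspective-n-point problem from the known world corners and their image projections (at least 6 correspondences; more give a least-squares fit). Function and variable names are illustrative.

```python
import cv2
import numpy as np

def camera_pose(world_pts, image_pts, K, dist=None):
    """Estimate the camera pose [R|t] at one shooting point.

    world_pts : N x 3 corner coordinates on the feature board (world frame)
    image_pts : N x 2 matched pixel coordinates in this image
    K         : 3 x 3 intrinsic matrix (assumed already calibrated, e.g. by
                the Zhang Zhengyou method, as noted above)
    Returns (R, t) such that N1 = R @ N + t maps world points into this
    camera's coordinate frame, matching the convention used in the text.
    """
    ok, rvec, tvec = cv2.solvePnP(
        world_pts.astype(np.float64), image_pts.astype(np.float64),
        K, dist, flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)   # rotation vector -> 3x3 rotation matrix
    return R, tvec
```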
(3) Three-dimensional coordinate calculation of positioning feature points
After the poses of the cameras at the two shooting points P1 and P2 have been determined, the three-dimensional coordinates of any other positioning feature point in space that is not on the three-dimensional reconstruction feature board can be determined by triangulation. Taking the positioning feature point A in Fig. 4 as an example, a1 and a2 are the pixel coordinates of this positioning feature point in the images taken by the camera at P1 and P2. In the first image, all points on the ray P1A project to the same pixel point a1; if the position of A is not yet known, then in the second image the ray P2a2 intersects the ray P1a1 at the point A, so the spatial position of A can be deduced, realizing three-dimensional reconstruction of the positioning feature point.
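This triangulation can be sketched with OpenCV as below; it is an illustration under the stated assumptions that the poses at P1 and P2 and the intrinsics K are already known, with the 3×4 projection matrices formed as K·[R|t].

```python
import cv2
import numpy as np

def triangulate(K, R1, t1, R2, t2, pix1, pix2):
    """Triangulate positioning feature points seen in two images.

    pix1, pix2 : N x 2 pixel coordinates of the same points (a1, a2) in the
                 images taken at P1 and P2.
    Returns N x 3 coordinates, e.g. point A at the intersection of rays P1a1 and P2a2.
    """
    P1 = K @ np.hstack([R1, t1.reshape(3, 1)])   # 3x4 projection matrix at P1
    P2 = K @ np.hstack([R2, t2.reshape(3, 1)])   # 3x4 projection matrix at P2
    X_h = cv2.triangulatePoints(P1, P2,
                                pix1.T.astype(np.float64),
                                pix2.T.astype(np.float64))   # 4 x N homogeneous
    return (X_h[:3] / X_h[3]).T                  # back to Euclidean, N x 3
```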
This process is sensitive to noise, and the least-squares solution obtained from only two images has a relatively large error. Therefore, after the three-dimensional coordinates of the positioning feature points have been calculated from P1 and P2, the subsequent M (M > 10) images are normally used to optimize the three-dimensional coordinates of the positioning feature points; this method is called bundle adjustment and is illustrated in Fig. 6.
The calculated coordinates of point A are projected into camera Pi to obtain two-dimensional coordinates ai, while the actual imaging coordinates of point A in that image are a'i; the distance between the two points ai and a'i is called the reprojection error. Bundle adjustment minimizes the reprojection errors of all cameras by adjusting the camera poses and the coordinates of point A. The positioning feature points and the camera poses are therefore refined in a continuous optimization: the more images are taken, the smaller the error, and a stable result is obtained when the number of images M reaches 10.
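A compact sketch of the bundle-adjustment idea: the reprojection residuals over all M images are minimised with a general least-squares solver. For brevity the camera poses are held fixed and only the point coordinates are refined; refining the poses as well, as described above, follows the same pattern. Names and the observation format are assumptions for illustration.

```python
import cv2
import numpy as np
from scipy.optimize import least_squares

def refine_points(points3d, observations, K, poses, dist=None):
    """Simplified bundle adjustment: refine 3-D point coordinates only.

    points3d     : initial N x 3 coordinates from triangulation
    observations : list of (cam_index, point_index, u, v) measured pixels a'_i
    poses        : list of (rvec, tvec) for the M cameras (held fixed here;
                   a full bundle adjustment would also optimise them)
    """
    def residuals(x):
        pts = x.reshape(-1, 3)
        res = []
        for cam_i, pt_i, u, v in observations:
            rvec, tvec = poses[cam_i]
            proj, _ = cv2.projectPoints(pts[pt_i].reshape(1, 3),
                                        rvec, tvec, K, dist)
            res.extend(proj.ravel() - (u, v))   # reprojection error a_i - a'_i
        return np.asarray(res)

    sol = least_squares(residuals, points3d.ravel())
    return sol.x.reshape(-1, 3)
```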
Regarding visual-positioning-guided alignment
(1) Determination of the mid-perpendicular plane of the interferometer antenna array
After the three-dimensional coordinates of all positioning feature points have been obtained, a straight line l is fitted through these points in space and the plane perpendicular to this line is computed; this plane is the mid-perpendicular plane of the phase interferometer antenna array, and it intersects the line l at a point C.
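A minimal sketch of this step: the baseline direction is obtained by a least-squares (SVD) line fit through the reconstructed positioning feature points, and that direction serves as the normal of the mid-perpendicular plane. Using the centroid of the points as the point C is an assumption made here for illustration; the text only states that the plane intersects the fitted line l at C.

```python
import numpy as np

def baseline_plane(points3d):
    """Fit the baseline direction and the mid-perpendicular plane.

    points3d : N x 3 reconstructed positioning feature points lying (nearly)
               on the antenna-array baseline.
    Returns (direction, C): the unit direction vector of the fitted line l,
    which is also the normal of the mid-perpendicular plane, and the point C
    through which the plane passes (taken here as the centroid -- an assumption).
    """
    C = points3d.mean(axis=0)               # centroid of the feature points
    _, _, vt = np.linalg.svd(points3d - C)  # principal direction = line l
    direction = vt[0] / np.linalg.norm(vt[0])
    return direction, C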
(2) Dynamic guidance of the calibration-position offset into alignment
t in the camera pose can be regarded as the spatial coordinates of the camera centre P at the current shooting position. With the coordinates of point P and point C known, subtracting them gives the direction vector t_pc of the line connecting the camera centre P and the point C. The current calibration position is the current shooting position, and its angle with the mid-perpendicular plane is the angle between the vector t_pc and the mid-perpendicular plane, denoted θ:
θ = arcsin( (t_pc · l) / (|t_pc|·|l|) )
where l denotes the direction vector of the fitted line (the normal of the mid-perpendicular plane).
Given the offset angle θ, the current camera position (shooting position = calibration position) can be adjusted dynamically according to the magnitude and sign of the offset angle until θ is smaller than ±δ (the alignment accuracy), at which point the measurement of the normal of the phase interferometer antenna array and the alignment with it are complete.
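The guidance computation itself is a small calculation. The sketch below assumes the camera centre P, the point C and the fitted line direction are already available from the previous steps, and returns a signed angle whose magnitude and sign drive the cart adjustment.

```python
import numpy as np

def offset_angle(P, C, direction):
    """Angle theta between the vector t_pc = C - P and the mid-perpendicular plane.

    'direction' is the unit direction vector of line l (the plane normal).
    theta -> 0 means the current shooting/calibration position lies in the
    plane, i.e. the cart is aligned with the array normal OS; the sign tells
    which side of the plane the cart is on.
    """
    t_pc = C - P
    # angle between a vector and a plane = 90 deg minus its angle with the normal
    s = np.dot(t_pc, direction) / (np.linalg.norm(t_pc) * np.linalg.norm(direction))
    return np.degrees(np.arcsin(np.clip(s, -1.0, 1.0)))

# usage idea: keep adjusting the cart while abs(offset_angle(P, C, d)) >= delta
```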
The present invention has been described in sufficient detail for clarity of disclosure and is not exhaustive of the prior art.
Finally, it should be noted that the above examples are intended only to illustrate the technical solution of the invention, not to limit it. Although the invention has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be replaced by equivalents, and that it is obvious to a person skilled in the art to combine several aspects of the invention. Such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the invention. Technical contents not described in detail herein are all known techniques.

Claims (4)

1. A static calibration method for a phase interferometer based on visual positioning, characterized in that it comprises the following steps:
S1, constructing the static calibration environment: the calibration environment is built under external-field conditions, the three-dimensional reconstruction feature board and the positioning feature points are placed, and the camera, the signal source and the control computer are connected; the control computer is then connected to the phase interferometer direction-finding equipment to be calibrated;
S2, visually reconstructing three-dimensional coordinates: the camera takes several photographs and/or videos of the phase interferometer array from different angles, each containing the feature board and the positioning feature points; a computer vision algorithm then automatically reconstructs the three-dimensional coordinates of the phase interferometer array on the carrier platform in the external-field environment;
S3, calculating, in the reconstructed three-dimensional space, the mid-perpendicular plane of the interferometer array baseline, which contains the normal to be calibrated;
S4, dynamically indicating, through a visual positioning algorithm, the angle between the current shooting position of the camera and the mid-perpendicular plane, and guiding the calibration cart to correct the shooting position until that angle is 0°, thereby completing alignment with the normal OS of the phase interferometer antenna array;
S5, under the automatic control of the control computer, the signal source on the calibration cart at the normal position radiates calibration signals; the phase differences acquired by the antennas of the phase interferometer direction-finding equipment under test are measured and compared, and a phase calibration table is generated automatically, completing the static calibration.
2. The visual-positioning-based static calibration method for a phase interferometer according to claim 1, characterized in that the calibration equipment comprises a camera, a feature board, a signal source and a control computer;
the control computer is electrically connected to the camera and the signal source; positioning feature points are arranged on the phase interferometer antenna array;
the feature board is a black-and-white checkerboard arranged near the phase interferometer antenna array, its area is determined by the calibration distance of the camera, and the spacing d between adjacent corners is known; it is used for visually reconstructing the three-dimensional space and positioning the camera;
the positioning feature points are placed on the surface of the antenna array and are used for visually positioning the baseline of the antenna array on the carrier platform;
the camera is a digital camera; the signal source is a general-purpose microwave signal source;
the control computer hosts the visual positioning processing and automatic calibration software; it interacts with the camera, acquires the photographs taken by the camera in real time to reconstruct three-dimensional coordinates, dynamically calculates the current camera position and the angular error between the camera and the antenna baseline, and indicates the direction of alignment correction; the control computer also interfaces with the phase interferometer direction-finding equipment, controls the microwave signal source, and automatically calculates and generates a full-band phase calibration table;
the camera, the signal source and the control computer are integrated on a small wheeled calibration cart.
3. The visual-positioning-based static calibration method for a phase interferometer according to claim 1, characterized in that in S2, step S2.1 is executed: the image taken by the camera at the set shooting position P1 fully captures the feature board and the positioning feature points placed in advance at the phase interferometer direction-finding antenna array; because both the feature board and the positioning feature points are checkerboards of alternating light and dark squares, all feature corner points on the feature board and the positioning feature points are detected by a checkerboard corner detection algorithm, the two-dimensional coordinates of the corner points in the image are solved, and the corners are ordered from left to right and from top to bottom so that the index of the same corner point is consistent across different images; corner points with consistent indices are called matching points;
S2.2, initializing the three-dimensional coordinates of the feature board:
first, the first crossing corner point at the upper-left corner of the feature board is set as the origin O(0, 0, 0) of the world coordinate system, and the feature board is defined to lie in the XOY plane; since the corner spacing d is known, the corner point in the m-th row and n-th column has world coordinates [(n-1)·d, (m-1)·d, 0], so the three-dimensional coordinates of all corner points on the three-dimensional reconstruction feature board in the world coordinate system are determined;
S2.3, calculating the camera pose: when the camera shoots from point P1, its camera coordinate system (P1-x1y1z1) is considered to have undergone a rigid-body motion relative to the world coordinate system (O-XYZ); according to the principle of three-dimensional rigid-body motion, the motion between the two coordinate systems is defined as one rotation plus one translation; for a corner point N of the three-dimensional reconstruction feature board, the three-dimensional coordinates in the world coordinate system are known and denoted N = [X, Y, Z]^T, and its coordinates in the camera coordinate system (P1-x1y1z1) at P1 are denoted N1 = [x_n1, y_n1, z_n1]^T;
the coordinates of N1 are then: N1 = R·N + t;
where R is a 3×3 rotation matrix, t is a 3×1 translation vector, and R and t are the position and attitude of the camera, i.e. the camera pose; defining the augmented matrix [R|t] as a 3×4 matrix gives formula (1):
N1 = [R|t]·[X, Y, Z, 1]^T        (1)
the camera pose [R|t] is unknown and is determined from the images of the corner points on the three-dimensional reconstruction board; N forms an image point n1 in the image taken at P1, with two-dimensional coordinates n1 = [u_n1, v_n1]^T;
the camera is modelled as a pinhole camera with focal length f; in homogeneous coordinates,
z_n1·[u_n1, v_n1, 1]^T = K·[x_n1, y_n1, z_n1]^T        (2)
i.e. z_n1·n1 = K·N1,
where K is the intrinsic parameter matrix of the camera, a coefficient common to all images and known in advance;
from formula (2):
N1 = z_n1·K^{-1}·[u_n1, v_n1, 1]^T        (3)
substituting formula (3) into formula (1) and expanding eliminates z_n1 and yields two constraints per corner point; writing p1^T, p2^T, p3^T for the three rows of [R|t], Ñ = [X, Y, Z, 1]^T, and [u', v', 1]^T = K^{-1}·[u_n1, v_n1, 1]^T:
u' = (p1^T·Ñ)/(p3^T·Ñ),  v' = (p2^T·Ñ)/(p3^T·Ñ)        (4)
the augmented camera-pose matrix [R|t] at P1 to be solved has 12 unknowns in total, while the camera intrinsics K and the three-dimensional coordinates (X, Y, Z) of the corner points on the three-dimensional reconstruction feature board are known, so each corner provides the constraints of formula (4); 6 feature corner points therefore suffice to solve for the matrix [R|t], and with more than 6 points the least-squares solution of the equations is found by the SVD method;
after the rotation matrix R and the translation vector t have been solved, the position and attitude of the camera at shooting point P1 are determined; the positions and attitudes of the camera at all subsequent shooting points (Pn, n = 2, 3, 4, …) are determined in the same way;
S2.4, calculating the three-dimensional coordinates of the positioning feature points: after the poses of the cameras at the two shooting points P1 and P2 have been determined in S2.3, the three-dimensional coordinates of the positioning feature points in space are determined by triangulation; take a positioning feature point A, with a1 and a2 the pixel coordinates of this positioning feature point in the images taken by the camera at P1 and P2; all points on the ray P1A project to the same pixel point a1; when the spatial position of A is unknown, the ray P2a2 intersects the ray P1a1 at the point A, from which the spatial position of A is deduced, realizing three-dimensional reconstruction of the positioning feature points;
S2.5, when the error of the least-squares solution obtained from the images is larger than a set threshold, bundle adjustment is executed; the calculated coordinates of point A are projected into camera Pi to obtain two-dimensional coordinates ai, while the actual imaging coordinates of point A in that image are a'i; when ai and a'i do not coincide, the distance between the two points is called the reprojection error; bundle adjustment minimizes the reprojection errors of all images by adjusting the camera poses and the coordinates of point A; the desired result is obtained when the number of images M reaches 10.
4. The visual-positioning-based static calibration method for a phase interferometer according to claim 1, characterized in that in S3-S4 visual positioning guides the alignment;
in S3, the mid-perpendicular plane of the interferometer antenna array is measured: after the three-dimensional coordinates of all positioning feature points have been obtained, a straight line l is fitted through these points in space and the plane perpendicular to this line is computed; this plane is the mid-perpendicular plane of the phase interferometer antenna array, and it intersects the line l at a point C;
in S4, the calibration-position offset is dynamically guided into alignment: first, t in the camera pose is regarded as the spatial coordinates of the camera centre P at the current shooting position; with the coordinates of point P and point C known, subtracting them gives the direction vector t_pc of the line connecting the camera centre P and the point C; the current calibration position is the current shooting position, and its angle with the mid-perpendicular plane is the angle between the vector t_pc and the mid-perpendicular plane, denoted θ:
θ = arcsin( (t_pc · l) / (|t_pc|·|l|) )
where l denotes the direction vector of the fitted line (the normal of the mid-perpendicular plane);
then, given the offset angle θ, the current camera position (shooting position = calibration position) is adjusted dynamically according to the magnitude and sign of the offset angle until θ is smaller than ±δ, the alignment accuracy, at which point the measurement of the normal of the phase interferometer antenna array and the alignment with it are complete.
CN202210612919.0A 2022-05-31 2022-05-31 Phase interferometer static calibration method based on visual positioning Pending CN115201746A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210612919.0A CN115201746A (en) 2022-05-31 2022-05-31 Phase interferometer static calibration method based on visual positioning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210612919.0A CN115201746A (en) 2022-05-31 2022-05-31 Phase interferometer static calibration method based on visual positioning

Publications (1)

Publication Number Publication Date
CN115201746A true CN115201746A (en) 2022-10-18

Family

ID=83576686

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210612919.0A Pending CN115201746A (en) 2022-05-31 2022-05-31 Phase interferometer static calibration method based on visual positioning

Country Status (1)

Country Link
CN (1) CN115201746A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination