CN108106637B - Precision calibration method and device for distributed POS (Position and Orientation System) - Google Patents


Info

Publication number
CN108106637B
CN108106637B
Authority
CN
China
Prior art keywords
target
sub
camera
node
coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810153914.XA
Other languages
Chinese (zh)
Other versions
CN108106637A (en)
Inventor
朱庄生
袁学忠
刘刚
李建利
顾宾
王世博
孙一弘
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing University of Aeronautics and Astronautics
Original Assignee
Beijing University of Aeronautics and Astronautics
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing University of Aeronautics and Astronautics filed Critical Beijing University of Aeronautics and Astronautics
Priority to CN201810153914.XA priority Critical patent/CN108106637B/en
Publication of CN108106637A publication Critical patent/CN108106637A/en
Application granted granted Critical
Publication of CN108106637B publication Critical patent/CN108106637B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass initial alignment, calibration or starting-up of inertial devices

Abstract

The invention relates to a precision calibration method and device for a distributed POS (Position and Orientation System). First, the relative pose relationship between a first camera and a second camera that do not share a common field of view is calibrated. The first camera and the second camera then respectively shoot a first sub-node target and a second sub-node target, and the poses of the first sub-node and the second sub-node in a camera coordinate system are measured. The pose data measured by the first camera and the second camera are unified into a measurement reference coordinate system, and the baseline length and baseline angle between the first sub-node and the second sub-node are calculated. Finally, the accuracy of the baseline length and baseline angle data measured by the distributed POS is checked. The method and device have the characteristics of high precision and strong anti-interference capability, and can be used for pose precision calibration of existing low-precision and high-precision distributed integrated navigation systems.

Description

Precision calibration method and device for distributed POS (Position and Orientation System)
Technical Field
The invention relates to the field of distributed measurement, and in particular to a precision calibration method and device for a distributed POS (Position and Orientation System).
Background
A high-precision POS consists of an Inertial Measurement Unit (IMU), a POS Computer System (PCS) and a GPS (Global Positioning System). It can provide high-frequency, high-precision time, position and attitude information for a high-resolution aerial remote sensing system, improves imaging precision and efficiency through motion error compensation, and is the key to realizing high-resolution imaging.
China has made some progress in single-POS imaging. However, driven by the demands of Earth observation payloads, such as multi-task payloads that integrate a high-resolution mapping camera, a full-spectrum imaging spectrometer and an SAR radar on the same carrier, airborne distributed array-antenna SAR, flexible multi-baseline interferometric SAR, and ship-borne sparse-array imaging radar, multiple payloads are installed at different positions on the aircraft, and a traditional single POS cannot provide multi-point high-precision position and attitude measurement or unify the time reference of the data of each payload.
In a distributed POS, a high-precision IMU installed on the fuselage serves as the main node, and a low-precision IMU installed at each imaging payload position on the wing serves as a sub-node. During flight, the main node and the sub-nodes each measure the position and attitude of their own node, and the high-precision data of the main node are transferred to the sub-nodes through lever-arm compensation and similar means, which improves the pose measurement accuracy of the sub-nodes and completes motion compensation for the multiple imaging payloads. The lever-arm position transfer is sketched below.
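As context for the lever-arm compensation mentioned above, the following is a minimal sketch of the standard lever-arm position transfer, assuming a known rigid lever arm expressed in the main-node body frame and the relation r_sub = r_main + C_b^n · l. It is illustrative only and is not the patent's own transfer algorithm; the function and variable names are hypothetical.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def lever_arm_position_transfer(r_main_n, C_b_n, lever_arm_b):
    """Standard lever-arm position transfer: r_sub^n = r_main^n + C_b^n @ l^b.

    r_main_n    : main-node position in the navigation frame (3-vector, metres)
    C_b_n       : 3x3 attitude matrix rotating body-frame vectors into the navigation frame
    lever_arm_b : lever arm from the main node to the sub-node, in the body frame (metres)
    """
    return (np.asarray(r_main_n, dtype=float)
            + np.asarray(C_b_n, dtype=float) @ np.asarray(lever_arm_b, dtype=float))

# Example: sub-node 5 m out along the wing (body Y axis), aircraft rolled by 2 degrees.
C_b_n = Rotation.from_euler("ZYX", [90.0, 0.0, 2.0], degrees=True).as_matrix()
print(lever_arm_position_transfer([0.0, 0.0, 0.0], C_b_n, [0.0, 5.0, 0.0]))
```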
Therefore, the distributed POS plays a crucial role in multi-payload and stereo imaging. Measurement accuracy is the key index that determines the performance of a distributed POS, and effective accuracy calibration of the distributed POS is one of the technical problems to be solved in the art.
Disclosure of Invention
The invention provides a precision calibration method and device for a distributed POS (Position and Orientation System), which are used for solving the problem that the prior art lacks an effective precision calibration technology for the distributed POS.
The technical scheme for solving the technical problem of the invention is as follows:
a precision calibration method of a distributed POS comprises the following steps:
calibrating the relative pose relationship between a first camera and a second camera that do not share a common field of view;
the first camera and the second camera respectively shoot a first sub-node target and a second sub-node target, and the poses of the first sub-node and the second sub-node in a camera coordinate system are measured;
unifying the pose data measured by the first camera and the second camera into a measurement reference coordinate system, and calculating the baseline length and baseline angle between the first sub-node and the second sub-node;
and checking the accuracy of the baseline length and baseline angle data measured by the distributed POS.
Preferably, the calibrating of the relative pose relationship between the first camera and the second camera that do not share a common field of view specifically comprises:
statically placing a third target and a fourth target as calibration targets, and calibrating spatial three-dimensional coordinates of feature points on the third target and the fourth target in the same world coordinate system in advance;
the first camera and the second camera respectively shoot the third target and the fourth target, one image each is obtained, and the relative pose relationship between the first camera and the second camera is calibrated according to the vector and distance relationships constructed between a plurality of feature points on the third target and the fourth target.
Preferably, the calibrating of the relative pose relationship between the first camera and the second camera according to the vector and distance relationships constructed between the plurality of feature points on the third target and the fourth target specifically comprises the following steps:
and (3) establishing an angle relation formula by using vector included angles constructed by any points on the third target and the fourth target to be equal in a target coordinate system and a camera coordinate system:
wherein the content of the first and second substances,
Figure GDA0002257013450000022
respectively constructing unit vectors for 0 and 1 points in targets a and b in a target coordinate system;
Figure GDA0002257013450000031
unit vectors respectively constructed for 0 and 1 points in a third target and a fourth target shot by a second camera and a first camera;
and establishing a position relation formula by the connecting line distance of the ith feature point on the third target and the fourth target in the target coordinate system and the camera coordinate system in an equal way:
Figure GDA0002257013450000032
wherein the content of the first and second substances,
Figure GDA0002257013450000033
coordinates of point i in a third target and a fourth target in a target coordinate system;third camera for second camera and first cameraCoordinates of point i in the target and the fourth target in the camera coordinate system.
Preferably, before the first camera and the second camera respectively shoot the first sub-node target and the second sub-node target, the method further comprises the following steps:
installing two sub IMUs on corresponding installation nodes, installing a first sub IMU on a first sub node, and installing a second sub IMU on a second sub node;
and rigidly and fixedly connecting the first sub-node target with the first sub-IMU, and rigidly and fixedly connecting the second sub-node target with the second sub-IMU.
Preferably, the measuring the poses of the first sub-node and the second sub-node in the camera coordinate system specifically includes:
the first camera and the second camera respectively shoot a first sub-node target and a second sub-node target at the same time; and taking a coordinate system of a first camera corresponding to the first sub-node as a measurement coordinate system, obtaining relative poses between the first sub-node target and the first sub-IMU and between the second sub-node target and the second sub-IMU based on a joint calibration method of orthogonal vectors and dynamic filtering, and converting the measured information of the first sub-node target and the measured information of the second sub-node target into corresponding pose information of the first sub-IMU and the second sub-IMU.
The invention also discloses a precision calibration device for the distributed POS, which comprises a first camera, a second camera, a first sub-node target, a second sub-node target, a calibration module, a measurement module and a checking module;
the calibration module is used for calibrating the relative pose relationship between the first camera and the second camera that do not share a common field of view;
the first camera and the second camera respectively shoot the first sub-node target and the second sub-node target, and the measurement module is used for measuring the poses of the first sub-node and the second sub-node in a camera coordinate system;
the checking module is used for unifying the pose data measured by the first camera and the second camera into a measurement reference coordinate system, calculating the baseline length and baseline angle between the first sub-node and the second sub-node, and checking the accuracy of the baseline length and baseline angle data measured by the distributed POS.
Preferably, the device further comprises a third target, a fourth target:
statically placing a third target and a fourth target as calibration targets, and calibrating spatial three-dimensional coordinates of feature points on the third target and the fourth target in the same world coordinate system in advance;
the first camera and the second camera respectively shoot the third target and the fourth target, one image each is obtained, and the relative pose relationship between the first camera and the second camera is calibrated according to the vector and distance relationships constructed between a plurality of feature points on the third target and the fourth target.
Preferably, the calibration module is configured to:
and (3) establishing an angle relation formula by using vector included angles constructed by any points on the third target and the fourth target to be equal in a target coordinate system and a camera coordinate system:
Figure GDA0002257013450000041
wherein the content of the first and second substances,respectively constructing unit vectors for 0 and 1 points in targets a and b in a target coordinate system;
Figure GDA0002257013450000043
unit vectors respectively constructed for 0 and 1 points in a third target and a fourth target shot by a second camera and a first camera;
and establishing a position relation formula by the connecting line distance of the ith feature point on the third target and the fourth target in the target coordinate system and the camera coordinate system in an equal way:
wherein the content of the first and second substances,
Figure GDA0002257013450000045
as a target coordinate systemCoordinates of point i in the third target and the fourth target;
Figure GDA0002257013450000046
coordinates of i points in the third target and the fourth target photographed by the second camera and the first camera in a camera coordinate system.
Preferably, the apparatus further comprises a first sub-IMU, a second sub-IMU:
the first sub-IMU is installed on the first sub-node, and the second sub-IMU is installed on the second sub-node;
the first sub-node target is rigidly connected with the first sub-IMU, and the second sub-node target is rigidly connected with the second sub-IMU.
Preferably, the first camera and the second camera in the device are used for shooting the first sub-node target and the second sub-node target respectively at the same time;
the measurement module is used for taking a coordinate system of the first camera corresponding to the first sub-node as a measurement coordinate system, obtaining relative poses between the first sub-node target and the first sub-IMU and between the second sub-node target and the second sub-IMU based on a joint calibration method of an orthogonal vector and dynamic filtering, and converting the measured information of the first sub-node target and the measured information of the second sub-node target into corresponding pose information of the first sub-IMU and the second sub-IMU;
the checking module is used for unifying the measurement reference according to the calibrated pose relationship between the first camera and the second camera, calculating the baseline length and baseline angle between the first sub-node and the second sub-node, and checking the measurement accuracy of the distributed POS.
Compared with the prior art, the invention can achieve the following technical effects:
Aiming at the problem of distributed POS precision calibration, the invention first calibrates the relative pose relationship between two cameras that do not share a common field of view, unifies the pose data of the two sub-nodes measured in the camera coordinate systems into a measurement reference coordinate system, and calculates the baseline length and baseline angle between the two sub-nodes in order to check the baseline length and baseline angle data measured by the distributed POS. Pose changes are measured by visual means, which realizes non-contact, high-frequency and high-precision measurement. Two targets with pre-calibrated coordinates are used to calibrate the pose relationship of multiple cameras, so that the placement of the cameras is not restricted by space and the measurement range is expanded. The method avoids the complex wing-deflection modeling required by traditional measurement methods such as fiber Bragg gratings, can measure the multi-point pose, deformation and vibration of large structural members in practical engineering applications, can calibrate more sub-nodes by increasing the number of cameras, and can be used for pose precision calibration of existing low-precision and high-precision distributed integrated navigation systems.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the embodiments are briefly described below. It is apparent that the drawings in the following description show only some embodiments of the present invention, and other drawings can be obtained by those skilled in the art from these drawings without creative effort.
FIG. 1 is a hardware system diagram of an embodiment of the method and apparatus for precision calibration of distributed POS according to the present invention;
FIG. 2 is a schematic view of a camera calibration according to an embodiment of the present invention;
fig. 3 is a flowchart illustrating an embodiment of a method for precision calibration of a distributed POS according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention discloses a method and a device for precision calibration of a distributed POS, wherein the method comprises the following steps: calibrating the relative pose relationship between a first camera and a second camera that do not share a common field of view; respectively shooting a first sub-node target and a second sub-node target with the first camera and the second camera, and measuring the poses of the first sub-node and the second sub-node in a camera coordinate system; unifying the pose data measured by the first camera and the second camera into a measurement reference coordinate system, and calculating the baseline length and baseline angle between the first sub-node and the second sub-node; and checking the accuracy of the baseline length and baseline angle data measured by the distributed POS.
The invention discloses a precision calibration device for a distributed POS, which mainly comprises a first camera, a second camera, a first sub-node target, a second sub-node target, a calibration module, a measurement module and a checking module. The calibration module is used for calibrating the relative pose relationship between the first camera and the second camera that do not share a common field of view; the measurement module is used for measuring the poses of the first sub-node and the second sub-node in a camera coordinate system; and the checking module is used for unifying the pose data measured by the first camera and the second camera into a measurement reference coordinate system, calculating the baseline length and baseline angle between the first sub-node and the second sub-node, and checking the accuracy of the baseline length and baseline angle data measured by the distributed POS.
Referring to fig. 1 and 2, as an implementation manner, the basic components of the system of the present invention include a first camera 111, a second camera 112, a first sub-node target 113, a second sub-node target 114, a main IMU 110, two sub-IMUs, a PCS, a GPS (PCS and GPS not shown in the figures), and a distributed-system installation structure.
The two sub-IMUs are the first sub-IMU 115 and the second sub-IMU 116. The main IMU is installed at the central point of the structure and serves as the main node; the two sub-IMUs are fixedly connected with the two targets 113 and 114 respectively and are installed at the nodes to be measured on the structure, serving as the sub-nodes; the main node, the sub-nodes and the GPS all communicate with the PCS to form the distributed POS system; and the two cameras respectively shoot the two sub-nodes. Time synchronization of the cameras and the IMUs is achieved by the GPS pulse-per-second signal.
The following exemplifies an embodiment of the accuracy calibration method of distributed POS, which comprises the following specific implementation steps:
step S11: firstly, a precision calibration test environment of the distributed POS is built, a main node (high-precision IMU) and two sub-nodes (low-precision IMU) are installed on corresponding installation nodes of a flexible structure frame, and visual targets (a first sub-node target and a second sub-node target) processed with high precision are respectively and rigidly connected with one surface of each sub-IMU. Each camera captures a target.
Step S12: A third target a and a fourth target b are statically placed as calibration targets, and the spatial three-dimensional coordinates of the feature points on targets a and b are calibrated in the same world coordinate system in advance. Cameras 111 and 112 respectively shoot targets a and b, each obtaining one image, and the pose relationship between cameras 111 and 112 is calibrated according to the vector and distance relationships constructed between a plurality of feature points on targets a and b.
The two cameras are calibrated using the two static targets a and b, and the three-dimensional coordinates of all the feature points on targets a and b are calibrated in advance with a laser tracker. Let the coordinate system of camera 111 be the measurement coordinate system, and denote the coordinates of the feature points of targets a and b in the world coordinate system as a_i^w and b_i^w (i = 1, ..., 9). The position and attitude relationship between cameras 111 and 112 is expressed by the rotation R and translation T from the camera 112 coordinate system to the camera 111 coordinate system. The coordinates of the feature points of target a in the camera 112 coordinate system are recorded as a_i^c2, and the coordinates of the feature points of target b in the camera 111 coordinate system are recorded as b_i^c1.
① First, the attitude relationship between cameras 111 and 112 is calibrated using vectors constructed on the targets.
From points a_0 and a_1, a unit vector is established in the target coordinate system:

v_a^w = (a_1^w - a_0^w) / ||a_1^w - a_0^w||

The unit vector v_b^w is obtained from b_0 and b_1 in the same way. The included angle θ1 between the two vectors satisfies:

cos θ1 = v_a^w · v_b^w

Meanwhile, the coordinates of the feature points a_0 and a_1 in the camera 112 coordinate system are calculated by the orthogonal iteration algorithm (OI algorithm), and the unit vector v_a^c2 is established from them. The unit vector v_b^c1 under camera 111 is obtained by the same method. The feature vector v_a^c2 of target a in the camera 112 coordinate system can be converted into the camera 111 coordinate system by the rotation transformation:

v_a^c1 = R · v_a^c2

In the measurement coordinate system, the vectors v_a^c1 and v_b^c1 form an included angle θ2:

cos θ2 = v_a^c1 · v_b^c1

Substituting v_a^c1 = R · v_a^c2 into the formula gives:

cos θ2 = (R · v_a^c2) · v_b^c1

Because θ2 and θ1 are equal:

(R · v_a^c2) · v_b^c1 = v_a^w · v_b^w

In the above formula only R is unknown. Writing v_a^c2 = [v_011, v_012, v_013]^T, where the subscripts 011, 012, 013 denote the vector components along the X, Y and Z axes, and letting r_ij be the element of R in row i and column j, the above equation can be expanded into an equation in the elements r_ij. A vector can be constructed from any two feature points on targets a and b, and any two such vectors establish one equation of the above form, so R can be solved from a sufficient number of vector pairs. A numerical sketch of this step is given below.
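The following sketch shows numerically how the rotation R can be recovered from such angle constraints, using a nonlinear least-squares fit over a rotation-vector parameterisation. It assumes the per-camera unit vectors have already been obtained (e.g. by the OI algorithm); the residual formulation and solver are an illustrative assumption, not the exact solving procedure of the patent.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def solve_rotation_from_angles(v_a_c2, v_b_c1, cos_w):
    """Estimate R (camera-2 frame -> camera-1 frame) from angle constraints.

    v_a_c2 : (N, 3) unit vectors built on target a, expressed in the camera-2 frame
    v_b_c1 : (N, 3) unit vectors built on target b, expressed in the camera-1 frame
    cos_w  : (N,)  cosines of the same vector pairs, computed in the world/target frame

    Each pair contributes one residual (R v_a_c2) . v_b_c1 - cos_w.
    """
    v_a_c2, v_b_c1, cos_w = (np.asarray(x, dtype=float) for x in (v_a_c2, v_b_c1, cos_w))

    def residuals(rotvec):
        R = Rotation.from_rotvec(rotvec).as_matrix()
        return np.einsum("ij,ij->i", (R @ v_a_c2.T).T, v_b_c1) - cos_w

    sol = least_squares(residuals, x0=np.zeros(3))
    return Rotation.from_rotvec(sol.x).as_matrix()

# Synthetic self-check: build a known rotation and verify it is recovered.
rng = np.random.default_rng(0)
R_true = Rotation.from_euler("ZYX", [10.0, -5.0, 3.0], degrees=True).as_matrix()
v_a_w = rng.normal(size=(20, 3)); v_a_w /= np.linalg.norm(v_a_w, axis=1, keepdims=True)
v_b_w = rng.normal(size=(20, 3)); v_b_w /= np.linalg.norm(v_b_w, axis=1, keepdims=True)
v_a_c2 = (R_true.T @ v_a_w.T).T          # camera 1 frame taken equal to the world frame here
R_est = solve_rotation_from_angles(v_a_c2, v_b_w, np.einsum("ij,ij->i", v_a_w, v_b_w))
print(Rotation.from_matrix(R_est.T @ R_true).magnitude())   # expected to be near zero
```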
② The position relationship between cameras 111 and 112, i.e. the translation T, is then calibrated through the distance relationship between the feature points on targets a and b.
The distance d between the corresponding feature points i on targets a and b is expressed in the target coordinate system w as:

d = ||a_i^w - b_i^w||

The coordinates of feature point a_i of target a in the camera 112 coordinate system can be converted into the camera 111 coordinate system by the rotation-translation transformation:

a_i^c1 = R · a_i^c2 + T

The distance between the two feature points on targets a and b is then expressed in the camera coordinate system as:

d' = ||R · a_i^c2 + T - b_i^c1||

Because the two distances are equal, the position calibration relation is obtained:

||R · a_i^c2 + T - b_i^c1|| = ||a_i^w - b_i^w||

In the above formula R has already been calculated in ①, so only T is unknown. T has 3 unknown components, so an equation set established from 3 feature points on the targets is sufficient to solve T, as sketched below.
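Continuing the sketch, with R fixed the translation T can be solved from the distance constraints. This is only an illustrative least-squares formulation under the assumptions above, not the patent's exact solving procedure.

```python
import numpy as np
from scipy.optimize import least_squares

def solve_translation_from_distances(R, a_c2, b_c1, d_w):
    """Estimate T (camera-2 frame -> camera-1 frame) from distance constraints, R known.

    R    : (3, 3) rotation from the camera-2 frame to the camera-1 frame
    a_c2 : (N, 3) feature points of target a in the camera-2 frame
    b_c1 : (N, 3) feature points of target b in the camera-1 frame
    d_w  : (N,)  distances between corresponding points, known from the world-frame calibration

    Each point pair contributes the residual ||R a_i + T - b_i|| - d_i.
    The text above uses 3 point pairs; more pairs over-determine and refine T.
    """
    R, a_c2, b_c1, d_w = (np.asarray(x, dtype=float) for x in (R, a_c2, b_c1, d_w))
    Ra = (R @ a_c2.T).T

    def residuals(T):
        return np.linalg.norm(Ra + T - b_c1, axis=1) - d_w

    return least_squares(residuals, x0=np.zeros(3)).x
```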
Step S13: The cameras 111 and 112, whose relative relationship has been calibrated, respectively shoot the first sub-node target 113 and the second sub-node target 114, i.e. the targets corresponding to the sub-nodes. The relative pose between each target and the sub-IMU at its sub-node is obtained based on a joint calibration method of orthogonal vectors and dynamic filtering, and the measured target information is converted into the pose information of the sub-node. The baseline length and the baseline angle (the included angle between the baseline and the horizontal plane) between the two sub-nodes are then calculated using the calibrated measurement reference of the two-camera system.
The two cameras, whose relative relationship has been calibrated, respectively shoot the target planes at the two sub-nodes to obtain the pose variation of each target relative to the camera coordinate system, and the relative pose relationship between each IMU and its corresponding target is then calibrated by the joint calibration method based on orthogonal vectors and dynamic filtering.
① The coordinate system of camera 111 is used as the measurement coordinate system. The position and attitude of sub-node 115 in the camera 111 coordinate system are recorded as cR1 and cT1, and the position and attitude of sub-node 116 in the camera 112 coordinate system are recorded as R2 and T2. Using the relative position and attitude relationship R, T between cameras 111 and 112 calibrated in step S12, the position cT2 of sub-node 116 in the measurement coordinate system is obtained as:

cT2 = R · T2 + T

Baseline length l measurement:

l = ||cT2 - cT1||

Baseline angle α:

α = arcsin((cT2z - cT1z) / l)

where cT2z and cT1z are the Z-axis components of cT2 and cT1, respectively.
② The distributed POS outputs the position and attitude of each of the sub-nodes 115 and 116 in the geographic coordinate system. Because the measurement precision of the sub-IMUs is low, the pose information of the high-precision main node is transferred to the sub-nodes through the lever-arm information, improving the pose precision of the sub-nodes. On this basis, the baseline length and baseline angle between the two sub-nodes are calculated and recorded as l_b and α_b.
③ The accuracy Δl and Δα of the l_b and α_b measured by the distributed POS are checked against the l and α measured in ①:

Δl = |l - l_b|,  Δα = |α - α_b|

A code sketch of steps ① to ③ follows.
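The sketch below mirrors steps ① to ③ in code: the vision-measured position of sub-node 116 is brought into the measurement frame, the baseline length and angle are computed, and they are compared with the POS-derived values l_b and α_b. The numerical inputs in the example are placeholders, not measured data.

```python
import numpy as np

def baseline_from_vision(T1_c, T2_cam2, R, T):
    """Baseline length and angle from the vision measurements.

    T1_c    : position of sub-node 115 in the measurement (camera 111) frame
    T2_cam2 : position of sub-node 116 in the camera 112 frame
    R, T    : calibrated rotation and translation from camera 112 to camera 111

    Returns (l, alpha) with l = ||cT2 - cT1|| and alpha = arcsin((cT2z - cT1z) / l).
    """
    T1_c = np.asarray(T1_c, dtype=float)
    T2_c = np.asarray(R, dtype=float) @ np.asarray(T2_cam2, dtype=float) + np.asarray(T, dtype=float)
    l = np.linalg.norm(T2_c - T1_c)
    alpha = np.arcsin((T2_c[2] - T1_c[2]) / l)
    return l, alpha

# Placeholder example comparing the vision baseline with a POS-derived baseline l_b, alpha_b.
l, alpha = baseline_from_vision([0.1, 0.2, 1.5], [0.0, -0.3, 1.6], np.eye(3), [2.0, 0.0, 0.05])
l_b, alpha_b = 1.95, 0.02   # hypothetical distributed-POS baseline length (m) and angle (rad)
dl, dalpha = abs(l - l_b), abs(alpha - alpha_b)
print(f"dl = {dl:.4f} m, dalpha = {np.degrees(dalpha):.3f} deg")
```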
referring to FIG. 3, as an alternative embodiment, the calibration method of the present invention can also be summarized into three main steps shown in FIG. 3, including calibrating the relative pose between two cameras using an inter-vector angular relationship; and (3) specifying the relative position between the two cameras by using the distance relation of the characteristic points, transferring the positions and postures of the sub-nodes respectively measured by the two cameras to a measurement coordinate system, and checking the baseline length and angle data measured by the distributed POS.
In summary, the precision calibration method and device for a distributed POS disclosed by the invention first complete the pose measurement of a single sub-node through monocular vision, calibrate the relative pose relationship of the two cameras using two targets whose coordinates have been calibrated in advance, and obtain the relative pose between the two sub-nodes after unifying the coordinate system, thereby measuring the baseline length and angle between the two sub-nodes and checking the accuracy of the baseline data measured by the distributed POS. The method uses visual measurement to realize pose measurement, overcomes the spatial limitation of traditional multi-camera calibration methods for cameras without a common field of view, calibrates the coordinates of the feature points on the two targets in advance with a high-precision three-dimensional coordinate measuring instrument, and then calibrates the poses of the two cameras. The method has the characteristics of high precision and strong anti-interference capability, expands the application scenarios, and can be used for pose precision calibration of existing low-precision and high-precision distributed inertial products on the ground.

Claims (4)

1. A precision calibration method of a distributed POS is characterized by comprising the following steps:
calibrating the relative pose relationship between a first camera and a second camera that do not share a common field of view;
the first camera and the second camera respectively shoot a first target and a second target at the same time, and the poses of the first sub-node and the second sub-node in a measurement coordinate system are measured;
calculating the baseline length and baseline angle between the first sub-node and the second sub-node according to the poses of the first sub-node and the second sub-node in the measurement coordinate system, wherein the baseline angle is the included angle between the baseline and the horizontal plane;
checking the accuracy of the baseline length and baseline angle data between the first sub-node and the second sub-node measured by the distributed POS;
before the step of shooting the first target and the second target by the first camera and the second camera respectively at the same time, the method further comprises the following steps:
installing two sub IMUs on corresponding installation nodes, wherein a first sub IMU is installed on a first sub node, and a second sub IMU is installed on a second sub node;
rigidly fixing the first target with the first sub-IMU, and rigidly fixing the second target with the second sub-IMU;
the step of measuring the poses of the first sub-node and the second sub-node in a measurement coordinate system respectively comprises the following steps:
and taking a coordinate system of the first camera corresponding to the first sub-node as a measurement coordinate system, obtaining relative poses between the first target and the first sub-IMU and between the second target and the second sub-IMU based on a joint calibration method of orthogonal vectors and dynamic filtering, and converting the measured information of the first target and the measured information of the second target into corresponding pose information of the first sub-IMU and the second sub-IMU.
2. The precision calibration method of the distributed POS according to claim 1, wherein the step of calibrating the relative pose relationship between the first camera and the second camera that do not share a common field of view specifically comprises:
statically placing a third target and a fourth target as calibration targets, and calibrating spatial three-dimensional coordinates of feature points on the third target and the fourth target in the same world coordinate system in advance;
the first camera shoots the fourth target, the second camera shoots the third target, one image each is obtained, and an angle relation formula is established from the fact that the included angle between the vector constructed by two feature points on the third target and the vector constructed by two feature points on the fourth target is equal in the target coordinate system and in the camera coordinate system:

(R · v_a^c2) · v_b^c1 = v_a^w · v_b^w

wherein v_a^w is the unit vector constructed from points 0 and 1 of the third target a in the target coordinate system; v_b^w is the unit vector constructed from points 0 and 1 of the fourth target b in the target coordinate system; v_a^c2 is the unit vector constructed from points 0 and 1 of the third target a shot by the second camera; v_b^c1 is the unit vector constructed from points 0 and 1 of the fourth target b shot by the first camera; and R is the rotation from the second camera coordinate system to the first camera coordinate system;

establishing a position relation formula from the fact that the distance between the ith feature point on the third target and the ith feature point on the fourth target is equal in the target coordinate system and in the camera coordinate system:

||R · a_i^c2 + T - b_i^c1|| = ||a_i^w - b_i^w||

wherein a_i^w and b_i^w are respectively the coordinates of the ith feature point of the third target a and the fourth target b in the target coordinate system; a_i^c2 and b_i^c1 are respectively the coordinates in the camera coordinate system of the ith feature point of the third target a shot by the second camera and of the fourth target b shot by the first camera; and T is the translation from the second camera coordinate system to the first camera coordinate system.
3. A precision calibration device for a distributed POS, characterized by comprising a first camera, a second camera, a first target, a second target, a first sub-IMU, a second sub-IMU, a calibration module, a measurement module and a checking module;
the first sub-IMU is installed on the first sub-node, and the second sub-IMU is installed on the second sub-node;
the first target is rigidly connected with the first sub IMU, and the second target is rigidly connected with the second sub IMU;
the calibration module is used for calibrating the relative pose relationship between the first camera and the second camera that do not share a common field of view;
the first camera and the second camera are used for shooting a first target and a second target respectively at the same time, and the measuring module is used for measuring the poses of the first sub-node and the second sub-node in a measuring coordinate system respectively;
the checking module is used for calculating the baseline length and baseline angle between the first sub-node and the second sub-node according to the poses of the first sub-node and the second sub-node in the measurement coordinate system, wherein the baseline angle is the included angle between the baseline and the horizontal plane, and for checking the accuracy of the baseline length and baseline angle data between the first sub-node and the second sub-node measured by the distributed POS;
the measurement module is configured to use a coordinate system of the first camera corresponding to the first sub-node as a measurement coordinate system, obtain relative poses between the first target and the first sub-IMU and between the second target and the second sub-IMU based on a joint calibration method of an orthogonal vector and dynamic filtering, and convert the measured information of the first target and the measured information of the second target into pose information of the corresponding first sub-IMU and the corresponding second sub-IMU.
4. The precision calibration device of the distributed POS according to claim 3, further comprising a third target and a fourth target:
statically placing a third target and a fourth target as calibration targets, and calibrating spatial three-dimensional coordinates of feature points on the third target and the fourth target in the same world coordinate system in advance;
the first camera shoots the fourth target, the second camera shoots the third target, and each image is obtained;
the calibration module is used for:
and (3) establishing an angle relation formula by enabling an included angle between a vector constructed by the two feature points on the third target and a vector constructed by the two feature points on the fourth target to be equal in a target coordinate system and a camera coordinate system:
Figure FDA0002257013440000031
wherein the content of the first and second substances,
Figure FDA0002257013440000032
constructing unit vectors for 0 and 1 points in a third target a in a target coordinate system;
Figure FDA0002257013440000033
a unit vector constructed for 0 and 1 point in a fourth target b in a target coordinate system;
Figure FDA0002257013440000034
a unit vector constructed for 0 and 1 points in the third target a photographed by the second camera,
Figure FDA0002257013440000035
0 to the fourth target b shot by the first camera,1, constructing a unit vector;
establishing a position relation formula by the connecting line distance between the ith characteristic point on the third target and the ith characteristic point on the fourth target in the target coordinate system and the camera coordinate system in an equal way:
Figure FDA0002257013440000036
wherein the content of the first and second substances,
Figure FDA0002257013440000037
respectively representing the coordinates of the ith characteristic point in a third target a and a fourth target b in a target coordinate system;
Figure FDA0002257013440000041
the coordinates of the ith feature point in the camera coordinate system in the third target a shot by the second camera and the fourth target b shot by the first camera are respectively.
CN201810153914.XA 2018-02-22 2018-02-22 Precision calibration method and device for distributed POS (Position and Orientation System) Active CN108106637B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810153914.XA CN108106637B (en) 2018-02-22 2018-02-22 Precision calibration method and device for distributed POS (Position and Orientation System)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810153914.XA CN108106637B (en) 2018-02-22 2018-02-22 Precision calibration method and device for distributed POS (Position and Orientation System)

Publications (2)

Publication Number Publication Date
CN108106637A CN108106637A (en) 2018-06-01
CN108106637B true CN108106637B (en) 2020-01-10

Family

ID=62205606

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810153914.XA Active CN108106637B (en) 2018-02-22 2018-02-22 Precision calibration method and device for distributed POS (Position and Orientation System)

Country Status (1)

Country Link
CN (1) CN108106637B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109470272B (en) * 2018-12-05 2020-11-03 中国科学院长春光学精密机械与物理研究所 Calibration method of IMU (inertial measurement Unit) measurement reference
CN110363988B (en) * 2019-07-11 2021-05-28 南京慧尔视智能科技有限公司 System and method for calculating vehicle passing efficiency at intersection
CN110672094B (en) * 2019-10-09 2021-04-06 北京航空航天大学 Distributed POS multi-node multi-parameter instant synchronous calibration method
CN110646016B (en) * 2019-11-11 2021-04-13 北京航空航天大学 Distributed POS calibration method and device based on theodolite and vision-assisted flexible base line
CN111189391B (en) * 2020-01-10 2021-04-20 天津大学 Coordinate unification method based on measurement of middle point of axis of carbon fiber rod
JP7367251B1 (en) 2023-02-03 2023-10-23 興和株式会社 How to understand
CN117557659B (en) * 2024-01-10 2024-03-19 吉林大学 Opposite camera global calibration method and system based on one-dimensional target and turntable

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103323855B (en) * 2012-03-22 2015-12-02 中国科学院电子学研究所 A kind of precision acquisition methods of baseline dynamic measurement system
CN103852760B (en) * 2012-12-04 2016-02-03 中国科学院电子学研究所 A kind of many base measurements method based on rigidity and flexible baseline combination
CN104075688B (en) * 2013-03-29 2016-09-28 中原工学院 A kind of binocular solid stares the distance-finding method of monitoring system
CN103606147B (en) * 2013-11-06 2016-10-19 同济大学 Multiple stage is not total to visual field and measures the coordinate system conversion scaling method of camera
TWI556198B (en) * 2015-09-11 2016-11-01 經緯航太科技股份有限公司 Positioning and directing data analysis system and method thereof
CN105444687B (en) * 2015-11-30 2017-06-16 中国人民解放军国防科学技术大学 Based on to the relative pose variation measuring method regarding videographic measurment and laser ranging
CN105698765B (en) * 2016-02-22 2018-09-18 天津大学 Object pose method under double IMU monocular visions measurement in a closed series noninertial systems
CN106054185B (en) * 2016-05-23 2018-01-09 北京航空航天大学 A kind of airborne dual-antenna InSAR baseline computational methods based on distributed POS
CN106289246B (en) * 2016-07-25 2018-06-12 北京航空航天大学 A kind of flexible link arm measure method based on position and orientation measurement system

Also Published As

Publication number Publication date
CN108106637A (en) 2018-06-01

Similar Documents

Publication Publication Date Title
CN108106637B (en) Precision calibration method and device for distributed POS (Position and Orientation System)
CN108375382B (en) Monocular vision-based position and attitude measurement system precision calibration method and device
CN110296691B (en) IMU calibration-fused binocular stereo vision measurement method and system
CN110390697B (en) Millimeter wave radar and camera combined calibration method based on LM algorithm
CN108759798B (en) Method for realizing precision measurement of high-precision spacecraft
CN109655079B (en) Method for measuring coordinate system from star sensor to prism coordinate system
CN110969665B (en) External parameter calibration method, device, system and robot
CN108663043B (en) Single-camera-assisted distributed POS main node and sub node relative pose measurement method
CN106885585B (en) Integrated calibration method of satellite-borne photogrammetry system based on light beam adjustment
CN107728182A (en) Flexible more base line measurement method and apparatus based on camera auxiliary
CN110646016B (en) Distributed POS calibration method and device based on theodolite and vision-assisted flexible base line
CN109087355A (en) The monocular camera pose measuring apparatus and method updated based on iteration
CN108375383A (en) The airborne distribution POS flexibility base line measurement method and apparatus of polyphaser auxiliary
CN113870366B (en) Calibration method and calibration system of three-dimensional scanning system based on pose sensor
CN108154535B (en) Camera calibration method based on collimator
CN112229323A (en) Six-degree-of-freedom measurement method of checkerboard cooperative target based on monocular vision of mobile phone and application of six-degree-of-freedom measurement method
CN113267794A (en) Antenna phase center correction method and device with base line length constraint
CN108225371B (en) Inertial navigation/camera installation error calibration method
CN108594255A (en) A kind of laser ranging auxiliary optical image association error compensation method and system
Pi et al. On-orbit geometric calibration using a cross-image pair for the linear sensor aboard the agile optical satellite
CN112857328B (en) Calibration-free photogrammetry method
CN111220118B (en) Laser range finder based on visual inertial navigation system and range finding method
CN109342008B (en) Wind tunnel test model attack angle single-camera video measuring method based on homography matrix
CN110672094B (en) Distributed POS multi-node multi-parameter instant synchronous calibration method
CN107330862B (en) Quaternion-based conversion method between two independent system coordinate systems

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant