CN113048985A - Camera relative motion estimation method under known relative rotation angle condition - Google Patents
- Publication number: CN113048985A
- Application number: CN202110596663.4A
- Authority: CN (China)
- Prior art keywords: camera, view, affine, matrix, translation vector
- Legal status: Granted
Classifications
- G01C21/20 — Instruments for performing navigational calculations
- G01C21/165 — Dead reckoning by integrating acceleration or speed (inertial navigation) combined with non-inertial navigation instruments
- G06F17/16 — Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
- G06T7/262 — Analysis of motion using transform domain methods, e.g. Fourier domain methods
- G06T7/73 — Determining position or orientation of objects or cameras using feature-based methods
- G06T7/75 — Determining position or orientation of objects or cameras using feature-based methods involving models
Abstract
The application relates to a camera relative motion estimation method based on affine matching point pairs under the condition of a known relative rotation angle. When an inertial measurement unit is rigidly mounted with a camera, the two undergo the same relative rotation angle during motion, so the relative rotation angle output by the inertial measurement unit can be used directly as the relative rotation angle of the camera or of a multi-camera system. The method therefore applies to broader scenarios in which the mounting relationship between the inertial measurement unit and the camera is unknown or varies. With the relative rotation angle of the camera or multi-camera system known, the relative motion of a single camera or of a multi-camera system can be estimated from the image coordinates of only two affine matching point pairs together with the local affine transformation matrices of their neighborhoods. This improves both the accuracy and the efficiency of the computation, making the method suitable for devices with limited computational capability, such as autonomously navigating unmanned aerial vehicles, self-driving cars, and augmented reality devices.
Description
Technical Field
The application relates to the technical field of positioning, in particular to a camera relative motion estimation method under the condition of known relative rotation angle.
Background
Relative motion estimation for a single camera or for multiple cameras is a fundamental problem in three-dimensional vision, with applications such as robot positioning and mapping, augmented reality, and autonomous driving. A single-camera system is modeled by a central perspective projection model, and single-camera relative motion estimation generally uses the essential matrix algorithm with 5 homonymous point pairs or the homography matrix algorithm with 4 homonymous point pairs. A multi-camera system consists of several single cameras fixed to one rigid body; it is modeled with a generalized camera model and has no single center of projection. Relative motion estimation for multiple cameras typically uses a minimal solution with 6 homonymous point pairs or a linear method with 17 homonymous point pairs.
At present, electronic devices such as mobile phones and unmanned aerial vehicles generally carry sensors including an inertial measurement unit and a camera, rigidly mounted together, so the inertial measurement unit can provide rotation angle information for camera relative motion estimation. Two main cases arise. (1) If the mounting relationship between the inertial measurement unit and the camera is accurately calibrated in advance, attitude angles can be provided to the camera directly from the rotation angles output by the inertial measurement unit, reducing the pose parameters to be solved in camera relative motion estimation. For example, when the inertial measurement unit provides a common gravity direction for the camera, the relative motion of a single-camera system can be estimated from 3 homonymous point pairs, and that of a multi-camera system from 4 homonymous point pairs. (2) If the mounting relationship between the inertial measurement unit and the camera is unknown, the relative rotation angle output by the inertial measurement unit can still be used directly as the camera's relative rotation angle, since a rigidly mounted inertial measurement unit and camera undergo the same relative rotation angle during motion; this likewise reduces the pose parameters to be solved. In this case the relative motion of a single-camera system and of a multi-camera system can be estimated from 4 and 5 homonymous point pairs, respectively.
Because the inertial measurement unit and the camera do not need to be calibrated, the integration of the inertial measurement unit and the camera is more flexible and convenient, and the method has wide application scenes in the fields of three-dimensional reconstruction, visual odometry and the like.
Traditional relative motion estimation algorithms usually obtain homonymous point pairs with feature algorithms such as SIFT and SURF. In multi-view geometry estimation, affine matching point pairs extracted by feature algorithms such as ASIFT and MODS are receiving growing attention because they carry more image point pair information: an affine matching point pair includes not only the image coordinates of the homonymous point pair but also the local affine matrix encoding the neighborhood information between the points. In epipolar geometry, each affine matching point pair yields three constraint equations, which effectively reduces the number of point pairs required by relative motion estimation methods.
Disclosure of Invention
In view of the above, it is necessary to provide a camera relative motion estimation method that solves the above problem under a known relative rotation angle by using affine matching point pairs.
A camera relative motion estimation method under a known relative rotation angle condition, the method comprising:
acquiring at least two affine matching point pairs from a first view and a second view captured by a camera, and selecting the jth affine matching point pair to establish a world reference system; the origin of the world reference system is the position of the jth affine matching point pair in three-dimensional space, and the coordinate axis directions of the world reference system coincide with those of the first view;
acquiring a first pose relation between the first view and the second view, a second pose relation between the first view and the world reference system, and a third pose relation between the second view and the world reference system; the first, second and third pose relations each comprise a rotation matrix and a translation vector;
parameterizing the rotation matrix and the translation vector, and determining the rotation parameter constraint on the unknowns of the rotation matrix from the relative rotation angle between the first view and the second view;
representing the first pose relation with the parameterized rotation matrix and translation vector, and obtaining the corresponding essential matrix;
acquiring the two affine transformation constraints determined by the jth affine matching point pair, which relate the essential matrix and the local affine transformation matrix of that point pair; and acquiring, from the other affine matching point pair, one epipolar geometric constraint between the first view and the second view together with its two affine transformation constraints relating the essential matrix and its local affine transformation matrix;
and solving for the rotation matrix and the translation vector from the two affine transformation constraints of the jth affine matching point pair and the epipolar geometric constraint and two affine transformation constraints of the other affine matching point pair, and determining the relative motion of the camera from the rotation matrix and the translation vector.
The above camera relative motion estimation method under a known relative rotation angle uses the relative rotation angle provided by the inertial measurement unit together with the constraints between affine matching point pairs and the camera motion model. It provides a minimal solution for estimating relative motion from two affine matching point pairs, for both single-camera and multi-camera systems. This greatly reduces the number of point pairs required for single-camera and multi-camera relative motion estimation, markedly improves the accuracy and robustness of the algorithm, and suits cases where the mounting relationship between the inertial measurement unit and the camera is unknown or varies.
Drawings
FIG. 1 is a diagram of a method for estimating relative camera motion under a known relative rotation angle condition in one embodiment;
FIG. 2 is a schematic diagram of a single camera parameter distribution in one embodiment;
FIG. 3 is a diagram illustrating multi-camera parameter distribution in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In one embodiment, as shown in fig. 1, there is provided a camera relative motion estimation method under a known relative rotation angle condition, comprising the steps of:
and 102, acquiring at least two affine matching point pairs in a first view and a second view shot by the camera, and selecting a jth affine matching point pair to establish a world reference system.
The origin of the world reference system is the position of the jth affine matching point pair in the three-dimensional space, and the coordinate axis direction of the world reference system is consistent with the first view direction.
And 104, acquiring a first posture relation between the first view and the second view, acquiring a second posture relation between the first view and the world reference system, and acquiring a third posture relation between the second view and the world reference system.
The first position relation, the second position relation and the third position relation all comprise a rotation matrix and a translation vector.
And 106, parameterizing the rotation matrix and the translation vector, determining rotation parameter constraints corresponding to unknown numbers of the rotation matrix according to the relative rotation angle between the first view and the second view, representing the first position and orientation relation by adopting the parameterized rotation matrix and translation vector, and acquiring a corresponding essential matrix.
And 108, acquiring two affine transformation constraints corresponding to the intrinsic matrix determined by the jth affine matching point and the affine matching matrix in the affine matching points, and acquiring one epipolar geometric constraint of the first view and the second view determined by other affine matching points and two affine transformation constraints corresponding to the intrinsic matrix and the affine matching matrix in the affine matching points.
And 110, obtaining a rotation matrix and a translation vector according to two affine transformation constraints corresponding to the jth affine matching point and one epipolar geometric constraint and two affine transformation constraints determined by other affine matching points, and determining a relative motion relation of the camera according to the rotation matrix and the translation vector.
The above camera relative motion estimation method under a known relative rotation angle uses the relative rotation angle provided by the inertial measurement unit together with the constraints between affine matching point pairs and the camera motion model. It provides a minimal solution for estimating relative motion from two affine matching point pairs, for both single-camera and multi-camera systems. This greatly reduces the number of point pairs required for single-camera and multi-camera relative motion estimation, markedly improves the accuracy and robustness of the algorithm, and suits cases where the mounting relationship between the inertial measurement unit and the camera is unknown or varies.
For single-camera relative motion estimation, assume that the jth affine matching point pair is represented as $(x_{1j}, x_{2j}, A_j)$, where $x_{1j}$ and $x_{2j}$ are the normalized homogeneous image coordinates of the homonymous point pair in the first view and the second view respectively, and $A_j$ is a $2\times 2$ local affine transformation matrix characterizing the affine transformation relation between infinitesimal neighborhoods surrounding $x_{1j}$ and $x_{2j}$. The corresponding unit direction vectors of the homonymous point pair are computed as $\bar{x}_{1j} = x_{1j}/\|x_{1j}\|$ and $\bar{x}_{2j} = x_{2j}/\|x_{2j}\|$. The input conditions of this embodiment are two affine matching point pairs (at least one affine matching point pair and one homonymous point pair) and the relative rotation angle of the single camera provided by the inertial measurement unit.
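As a small illustration (hypothetical helper names, not the patent's code), the unit direction vectors can be computed from the normalized homogeneous coordinates as follows:

```python
import numpy as np

def unit_direction(x):
    """Unit direction vector of a normalized homogeneous image point [u, v, 1]."""
    x = np.asarray(x, dtype=float)
    return x / np.linalg.norm(x)

x1j = unit_direction([0.2, -0.1, 1.0])    # point in the first view (example values)
x2j = unit_direction([0.25, -0.08, 1.0])  # matching point in the second view
```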
In one embodiment, the rotation matrix obtained by parameterizing the rotation is:

$$ R = \frac{1}{1+q_1^2+q_2^2+q_3^2}\begin{bmatrix} 1+q_1^2-q_2^2-q_3^2 & 2q_1q_2-2q_3 & 2q_1q_3+2q_2 \\ 2q_1q_2+2q_3 & 1-q_1^2+q_2^2-q_3^2 & 2q_2q_3-2q_1 \\ 2q_1q_3-2q_2 & 2q_2q_3+2q_1 & 1-q_1^2-q_2^2+q_3^2 \end{bmatrix}, $$

where $(1, q_1, q_2, q_3)$ is a homogeneous quaternion vector and $R$ denotes the rotation matrix of the first pose relation.

From the relative rotation angle $\theta$ between the first view and the second view, the rotation parameter constraint on the unknowns of the rotation matrix is determined as:

$$ q_1^2+q_2^2+q_3^2 = \tan^2(\theta/2). $$
In one embodiment, parameterizing the translation vectors, the translation vector of the second pose relation and the translation vector of the third pose relation are respectively:

$$ t_1 = \alpha_1\,\bar{x}_{1j}, \qquad t_2 = \alpha_2\,\bar{x}_{2j}, $$

where $t_1$ denotes the translation vector of the second pose relation, $\alpha_1$ its unknown depth parameter, $\bar{x}_{1j}$ the unit vector of the normalized homogeneous image coordinates in the first view, $t_2$ the translation vector of the third pose relation, $\alpha_2$ its unknown depth parameter, and $\bar{x}_{2j}$ the unit vector of the normalized homogeneous image coordinates in the second view.

With the parameterized rotation matrix and translation vectors, and noting that the rotation of the second pose relation is the identity matrix $I$, the first pose relation is represented by the translation

$$ t = \alpha_2\,\bar{x}_{2j} - \alpha_1\,R\,\bar{x}_{1j}, $$

where $t$ denotes the translation vector of the first pose relation. The corresponding essential matrix is obtained as:

$$ E = [t]_\times R. $$
In one embodiment, the epipolar geometric constraint is:

$$ x_{2i}^{\top} E\, x_{1i} = 0, $$

where $x_{1i}$ and $x_{2i}$ are the normalized homogeneous image coordinates of a homonymous point pair in the first view and the second view.

The affine transformation constraint is:

$$ (E^{\top} x_{2j})_{(1:2)} = -A_j^{\top}\, (E\, x_{1j})_{(1:2)}, $$

where the subscript $(1:2)$ denotes the first two equations (the first two components).
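A minimal numerical check (synthetic values, not the patent's data) that an essential matrix $E = [t]_\times R$ built this way satisfies the epipolar constraint for a point seen in both views:

```python
import numpy as np

def skew(v):
    """Skew-symmetric cross-product matrix [v]_x."""
    return np.array([[0, -v[2], v[1]],
                     [v[2], 0, -v[0]],
                     [-v[1], v[0], 0]])

# Synthetic two-view geometry (hypothetical values for illustration).
q = np.array([0.1, -0.05, 0.2])                       # Cayley rotation parameters
s = q @ q
R = ((1 - s) * np.eye(3) + 2 * skew(q) + 2 * np.outer(q, q)) / (1 + s)
t = np.array([0.3, -0.1, 0.05])

E = skew(t) @ R                                       # essential matrix E = [t]_x R

# A 3-D point seen in both views; x2^T E x1 should vanish.
X1 = np.array([0.4, -0.2, 3.0])
X2 = R @ X1 + t
x1 = X1 / X1[2]                                       # normalized homogeneous coords
x2 = X2 / X2[2]
residual = x2 @ E @ x1
```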
In one embodiment, according to the rotation parameter constraint, the relative motion of the single camera has four degrees of freedom. Selecting the two affine transformation constraints of the jth affine matching point pair and the one epipolar geometric constraint determined by the other affine matching point pair, a first solving model is constructed:

$$ M(q)\,\alpha = 0, \qquad \alpha = (\alpha_1, \alpha_2)^{\top}, $$

where the element entries of $M(q)$ are second-order terms in the unknowns $q_1, q_2, q_3$, and $M(q)$ has three rows and two columns.

Selecting the other affine matching point pair to establish the world reference system yields a second solving model of the same form. Since $\alpha$ is non-zero, all $2\times 2$ sub-determinants of each coefficient matrix must vanish, so the first and second solving models together give six equations in the unknowns $q_1, q_2, q_3$.

An algebraic solution of the six equations is obtained by the Gröbner basis method; the rotation matrix $R$ of the first pose relation is determined from this solution, $\alpha$ is determined from the null space of $M(q)$, and the translation vectors of the second and third pose relations are computed from $\alpha$.
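Once the rotation unknowns are known, the depth parameters can be read off the null space of the coefficient matrix; a sketch using SVD (the matrix below is a fabricated rank-deficient example, not the patent's actual coefficients):

```python
import numpy as np

def right_null_vector(M):
    """Smallest right singular vector of M: the (approximate) null space
    used to recover the depth parameters alpha once the rotation is known."""
    _, _, Vt = np.linalg.svd(M)
    return Vt[-1]

# Fabricated 3x2 coefficient matrix constructed to have the null vector alpha_true.
alpha_true = np.array([2.0, 1.0])
c = np.array([[1.0], [-3.0], [0.5]])
M = c @ np.array([[alpha_true[1], -alpha_true[0]]])   # rank-1 by construction

alpha = right_null_vector(M)
```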
In one embodiment, when the camera is a multi-camera system, the extrinsic parameters $(R_c, t_c)$ of each camera $c$ in the multi-camera system are acquired; the multi-camera system includes the perspective cameras that capture the first view and the second view.

The translation vectors in the multi-camera system are parameterized with Plücker vectors, so that the points on the observation ray are:

$$ X = d_{ckj} \times m_{ckj} + \alpha_{kj}\, d_{ckj}, $$

where $c$ is the camera index, $j$ is the index of the affine matching point pair, and $k$ is the view index. The unit direction vector can be computed as $d_{ckj} = \bar{x}_{ckj} = x_{ckj}/\|x_{ckj}\|$ expressed in the reference frame of the multi-camera system, where $x_{ckj}$ is the normalized homogeneous image coordinate corresponding to camera $c$ in view $k$.

From the parameterized rotation matrix and translation vectors, the fourth pose relation between the two perspective cameras corresponding to the first view and the second view is obtained (with $c_1, c_2$ the cameras of the first and second views) as:

$$ R' = R_{c_2}^{\top} R\, R_{c_1}, \qquad t' = R_{c_2}^{\top}\left(R\,t_{c_1} + t - t_{c_2}\right), $$

and the essential matrix corresponding to the fourth pose relation is computed as $E = [t']_\times R'$.
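The Plücker parameterization above can be sketched as follows (illustrative names, not the patent's notation; the moment vector satisfies $m = p \times d$ and points on the line are $d \times m + \alpha d$):

```python
import numpy as np

def pluecker_from_point_dir(p, d):
    """Pluecker coordinates (d, m) of the line through point p with direction d."""
    d = d / np.linalg.norm(d)          # unit direction vector
    return d, np.cross(p, d)           # moment vector m = p x d

def point_on_line(d, m, alpha):
    """Point at depth alpha along the line: X = d x m + alpha * d."""
    return np.cross(d, m) + alpha * d

p = np.array([0.2, -0.1, 0.5])         # hypothetical camera center
d0 = np.array([0.1, 0.3, 1.0])         # viewing ray direction
d, m = pluecker_from_point_dir(p, d0)
X = point_on_line(d, m, 2.0)
```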
In one embodiment, according to the rotation parameter constraint, the relative motion of the multi-camera system has five degrees of freedom.

Selecting the two affine transformation constraints of the jth affine matching point pair together with the epipolar geometric constraint and one affine transformation constraint of the other affine matching point pair, a third solving model is constructed:

$$ M(q)\,(\alpha_1, \alpha_2, 1)^{\top} = 0, $$

where $M(q)$ has four rows and three columns and its entries are second-order terms in $q_1, q_2, q_3$.

Selecting the other affine matching point pair to establish the world reference system yields a fourth solving model of the same form. The third and fourth solving models together give eight equations in the unknowns $q_1, q_2, q_3$.

An algebraic solution of the eight equations is obtained by the Gröbner basis method; the rotation matrix $R$ of the first pose relation of the multi-camera system is determined from this solution, $\alpha$ is determined from the null space of $M(q)$, and the translation vectors of the second and third pose relations are computed from $\alpha$.
The following further description will be made in the case of a single camera and a multi-camera, respectively.
Single camera
Any one affine matching point pair is selected to define the world reference frame W, as shown in fig. 2. Assuming the jth affine matching point pair is currently selected, its position in three-dimensional space is taken as the origin of W, and the coordinate axis directions of W coincide with those of view 1 (View 1 in fig. 2). The pose relationship between view 1 and view 2 (View 2 in fig. 2) is denoted $(R, t)$, the pose relationship between view 1 and the reference frame W is denoted $(I, t_1)$, and the pose relationship between view 2 and the reference frame W is denoted $(R, t_2)$; in particular, the rotation from W to view 1 is the identity and the rotation from W to view 2 equals $R$. Using the Cayley parameterization, the rotation matrix can be expressed as:

$$ R = \frac{1}{1+q_1^2+q_2^2+q_3^2}\begin{bmatrix} 1+q_1^2-q_2^2-q_3^2 & 2q_1q_2-2q_3 & 2q_1q_3+2q_2 \\ 2q_1q_2+2q_3 & 1-q_1^2+q_2^2-q_3^2 & 2q_2q_3-2q_1 \\ 2q_1q_3-2q_2 & 2q_2q_3+2q_1 & 1-q_1^2-q_2^2+q_3^2 \end{bmatrix}, $$

where $(1, q_1, q_2, q_3)$ is a homogeneous quaternion vector. The relative rotation angle $\theta$ between view 1 and view 2 is assumed known; notably, it can be provided directly by the inertial measurement unit even when the mounting relationship between the inertial measurement unit and the camera is unknown or varies. The three unknowns $q_1, q_2, q_3$ satisfy the constraint:

$$ q_1^2+q_2^2+q_3^2 = \tan^2(\theta/2). $$
Next, the following steps are carried outAndparametric to two unknown depth parametersLinear function of (c):
the relative motion between the two views is determined by the combination of two transformations: (i) from view1 to W, (ii) from W to view 2. Unknown number,Andis parameterized as. In the form of relative movementExpressed as:
the essential matrix can be represented as:
by general formulaCarry-in typeIt can be seen that each element in the essential matrix is associated withThe correlation is linear.
An affine matching point pair yields three independent constraints for geometric model estimation: one epipolar geometric constraint derived from the homonymous point pair relationship, and two affine transformation constraints derived from the local affine transformation matrix $A_j$. With known camera intrinsics, the epipolar geometric constraint between view 1 and view 2 is:

$$ x_{2j}^{\top} E\, x_{1j} = 0. $$

The affine transformation constraint describing the relationship between the essential matrix $E$ and the local affine transformation matrix $A_j$ can be expressed as:

$$ (E^{\top} x_{2j})_{(1:2)} = -A_j^{\top}\, (E\, x_{1j})_{(1:2)}, $$

where the subscript $(1:2)$ denotes the first two components.
Since an affine matching point pair has been selected as the origin of the world reference frame for the special parameterization of the translation vectors, the homonymous point correspondence within the selected pair contributes no new constraint: the coefficients of the resulting equation are all zero. Thus, when the jth affine matching point pair is used to establish the world reference frame W, two affine matching point pairs provide five equations: the jth pair provides two equations from its affine transformation constraints, and the other pair provides three equations (one epipolar geometric constraint and two affine transformation constraints). Substituting the parameterized essential matrix into these constraints and applying the hidden variable method, the five equations provided by the two affine matching point pairs can be written as $M(q)\,\alpha = 0$ with $\alpha = (\alpha_1, \alpha_2)^{\top}$.

After the relative rotation angle between the two cameras is obtained from the inertial measurement unit, the single-camera relative motion estimation problem has four degrees of freedom. However, two affine matching point pairs provide six independent constraints, so the number of constraints exceeds the number of unknowns and there are redundant constraints. Therefore at least one affine matching point pair and one homonymous point pair suffice to estimate single-camera relative motion under a known relative rotation angle. Three of the five equations can be chosen to explore the minimal-solution case. More specifically, combining the two affine transformation constraints of the jth affine matching point pair with one epipolar geometric constraint of the other affine matching point pair yields three equations in the five unknowns $(q_1, q_2, q_3, \alpha_1, \alpha_2)$, i.e. the first three equations, written as $M_3(q)\,\alpha = 0$ with $M_3(q)$ of size $3\times 2$.
Since the system $M_3(q)\,\alpha = 0$ has a non-zero solution $\alpha$, the coefficient matrix $M_3(q)$ must be rank-deficient. Therefore all $2\times 2$ sub-determinants of $M_3(q)$ must be zero, which gives three equations in the three unknowns $q_1, q_2, q_3$.
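The vanishing-minor step can be illustrated with a symbolic toy matrix (the entries below are fabricated, not the actual coefficient polynomials): requiring a non-zero null vector of a 3×2 matrix forces each of its three 2×2 minors to vanish.

```python
import sympy as sp

q1, q2, q3 = sp.symbols('q1 q2 q3')

# Fabricated 3x2 matrix with polynomial entries in the unknowns q1..q3.
M = sp.Matrix([[q1 + q2, q3],
               [q1 * q3, q2 - 1],
               [q2**2,   q1]])

# One equation per pair of rows: all 2x2 sub-determinants must be zero.
minors = [M[rows, :].det() for rows in ([0, 1], [0, 2], [1, 2])]
```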
In summary, suppose the jth affine matching point pair is selected to establish the world reference frame W. Since two affine matching point pairs are required in the minimal-solution case, the other affine matching point pair can equally be selected to establish W, yielding a similar system of three equations. Combining the two systems gives six equations in the three unknowns $q_1, q_2, q_3$.
For the polynomial system formed by these six equations, an algebraic solution can be obtained by the Gröbner basis method. To preserve numerical stability and avoid a large number of operations during the Gröbner basis computation, a random instance of the polynomial system is constructed over a finite prime field $\mathbb{Z}_p$. The Gröbner basis is then computed with the computer algebra system Macaulay2. Finally, the corresponding solutions are found with an automatically generated Gröbner basis solver.
The solution method described above has at most 20 complex solutions and an elimination template of size 36×56. Once the rotation parameters $q_1, q_2, q_3$ are obtained, the rotation matrix $R$ follows immediately from the Cayley parameterization. Then $\alpha = (\alpha_1, \alpha_2)^{\top}$ is determined by finding the null space of $M(q)$. Next, the translation vectors $t_1 = \alpha_1\,\bar{x}_{1j}$ and $t_2 = \alpha_2\,\bar{x}_{2j}$ can be computed. Finally, the relative motion of the single camera is calculated as $t = t_2 - R\,t_1$.
Multi-camera
The jth affine matching point pair is selected to define the world reference frame W, as shown in fig. 3. Its position in three-dimensional space is taken as the origin of W, and the coordinate axis directions of W coincide with those of view 1 (View 1 in fig. 3). The body frame of the multi-camera system is used as reference: the transformation between view 1 and the reference frame W is denoted $(I, t_1)$, and the transformation between view 2 (View 2 in fig. 3) and the reference frame W is denoted $(R, t_2)$. Then $t_1$ and $t_2$ are parameterized. All points on the line described by a Plücker vector $(d, m)$ can be parameterized as:

$$ X(\alpha) = d \times m + \alpha\, d, $$

where $d$ is the unit direction vector, $m$ is the moment vector, and $\alpha$ is an unknown depth parameter.
Suppose the jth affine matching point pair corresponds to the three-dimensional point $X_j$ that defines the origin of the world reference frame W. The Plücker line joining $X_j$ and the optical center of camera $c$ is denoted $(d_{ckj}, m_{ckj})$. Then in view $k$, the point $X_j$ satisfies:

$$ X_j = d_{ckj} \times m_{ckj} + \alpha_{kj}\, d_{ckj}, $$

where $c$ is the camera index, $j$ is the index of the affine matching point pair, and $k$ is the view index. The unit direction vector $d_{ckj}$ is computed from $x_{ckj}$, the normalized homogeneous image coordinate corresponding to camera $c$ in view $k$. Here, $t_1$ and $t_2$ are again parameterized as linear functions of the two unknown depth parameters $\alpha_{1j}, \alpha_{2j}$.
Each affine matching point pair is associated with two perspective cameras, one in view 1 and one in view 2. The relative motion between these two cameras is determined by a combination of four transformations: (i) from one perspective camera to view 1, (ii) from view 1 to W, (iii) from W to view 2, and (iv) from view 2 to the other perspective camera. Among these four transformations, (i) and (iv) are partly determined by the known extrinsic parameters. Transformations (ii) and (iii) contain the unknown rotation and depth parameters introduced above. The relative motion can thus be expressed as:
relative motion between two perspective cameras in each affine matching point pairAfter being represented, the essential matrixCan be expressed as:
Substituting the parameterized relative motion into the expression above makes every entry of the essential matrix linear in the unknowns. Substituting the essential matrix into the epipolar and affine transformation constraints then yields five equations: two affine transformation constraints from the jth affine matching point pair, and three equations (one epipolar geometric constraint and two affine transformation constraints) from the other affine matching point pair. These equations can be expressed as:
After the relative rotation angle between the two multi-camera poses is obtained from the inertial measurement unit, the multi-camera relative motion estimation problem has five degrees of freedom. Since two affine matching point pairs provide six independent constraints, the number of constraints exceeds the number of unknowns, leaving redundant constraints. Therefore, four equations are randomly selected from the system above to explore the minimal-solution cases. For example, the two affine transformation constraints of the jth affine matching point pair, together with one epipolar geometric constraint and the first affine transformation constraint of the other affine matching point pair, form four equations in five unknowns, i.e., the first four equations of the system:
Since the homogeneous system above has a non-zero solution, its coefficient matrix must be rank-deficient. Therefore, all 3 × 3 sub-determinants of the coefficient matrix must be zero. This yields four equations in the three rotation unknowns.
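The rank-deficiency argument — a homogeneous system with a non-trivial solution forces every 3 × 3 sub-determinant of its coefficient matrix to vanish — can be checked numerically; the 4 × 3 matrix below is a hypothetical stand-in for the coefficient matrix:

```python
import numpy as np
from itertools import combinations

def minors_3x3(M):
    """All 3 x 3 sub-determinants of a matrix with 3 columns."""
    return [np.linalg.det(M[list(rows), :])
            for rows in combinations(range(M.shape[0]), 3)]

# Rank-2 coefficient matrix: the third column is the sum of the first
# two, so the non-zero vector (1, 1, -1) lies in its null space.
M = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [2.0, 3.0, 5.0],
              [1.0, 1.0, 2.0]])
dets = minors_3x3(M)      # C(4,3) = 4 minors, all numerically zero
```

The four vanishing minors are exactly the "four equations" referred to in the text once the matrix entries are polynomials in the rotation unknowns.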
Likewise, the other affine matching point pair may be selected to establish the world reference frame W. Selecting that pair yields a similar system of equations:
Combining the two systems of equations yields eight equations in the three rotation unknowns;
The degree of these equations is 6. Furthermore, an additional constraint holds in this problem, namely that the norm of the quaternion vector is 1.
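The unit-norm quaternion constraint and the known-rotation-angle relation being exploited here (rotation angle θ = 2·arccos of the quaternion scalar part, equivalently trace(R) = 1 + 2 cos θ) can be sketched as follows; the quaternion values are arbitrary test data:

```python
import numpy as np

def quat_to_R(q):
    """Rotation matrix from a quaternion (w, x, y, z); q is normalized
    first, which is exactly the |q| = 1 constraint used in the text."""
    w, x, y, z = q / np.linalg.norm(q)
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

q = np.array([0.9, 0.1, -0.3, 0.2])
R = quat_to_R(q)
theta_from_R = np.arccos((np.trace(R) - 1.0) / 2.0)
theta_from_q = 2.0 * np.arccos(abs(q[0]) / np.linalg.norm(q))
```

Because the angle fixes one relation among the quaternion entries, a known relative rotation angle removes one rotational degree of freedom, which is why the single-camera and multi-camera problems drop to 4 and 5 degrees of freedom respectively.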
Affine transformation constraints provide additional equations for the above problem. The two affine transformation constraints of the jth affine matching point pair are used to construct additional equations only when the world reference frame W is established using the jth pair. Thus, for the multi-camera relative motion estimation problem, there are three additional equations:
The solution is obtained using the Gröbner basis method. The two groups of equations above are denoted as two sets of constraints. Using the first set of constraints alone, the relative motion can be estimated for both intersecting and non-intersecting affine matching point pairs. Using both sets simultaneously, however, reduces the number of possible solutions.
Once the rotation parameters are obtained, the rotation matrix can be calculated immediately. The translation-related unknowns are then determined by finding the null space of the corresponding coefficient matrix, from which the two unknown depth parameters are computed. Finally, the relative motion of the multi-camera system is calculated by combining the transformations.
The invention can achieve the following technical effects:
1) Under the condition that the inertial measurement unit directly provides the relative rotation angle, the method and device provided by the invention make full use of the affine matching point pair information between views, greatly reducing the number of point pairs required to solve the relative motion estimation problem for single-camera and multi-camera systems and significantly improving the accuracy and robustness of the algorithm.
2) The invention exploits the fact that, when the inertial measurement unit and the camera are rigidly connected, they undergo the same relative rotation angle during motion. The relative rotation angle output by the inertial measurement unit can therefore be used directly as the relative rotation angle of the camera or multi-camera system, which enables application in wider scenarios, such as when the installation relationship between the inertial measurement unit and the camera is unknown or changing.
3) When the inertial measurement unit directly provides the relative rotation angle of a single camera, the relative motion of the single camera has 4 degrees of freedom. A novel method for the minimal-configuration solution of single-camera relative motion estimation is provided, which can accurately estimate the relative motion of a single camera from 2 affine matching point pairs.
4) When the inertial measurement unit directly provides the relative rotation angle of the multi-camera system, the relative motion of the multi-camera system has 5 degrees of freedom. A novel method for the minimal-configuration solution of multi-camera relative motion estimation is provided, which can accurately estimate the relative motion of a multi-camera system from 2 intersecting or non-intersecting affine matching point pairs.
5) The method of the invention does not require calibrating the installation relationship between the inertial measurement unit and the camera, making their integration more flexible and convenient. It offers high accuracy and efficiency and is suitable for devices with limited computing capacity, such as autonomous unmanned aerial vehicles, self-driving cars, and augmented reality equipment.
For electronic equipment such as mobile phones and unmanned aerial vehicles rigidly fitted with both an inertial measurement unit and a camera, the inertial measurement unit provides relative rotation angle information for the camera, and two affine matching point pairs are used to estimate the relative motion of a single-camera or multi-camera system as follows:
1) extract affine matching point pairs between the two views of the camera's relative motion using algorithms such as ASIFT and MODS; each affine matching point pair comprises the image coordinates of a pair of corresponding points and the local affine matrix between their neighborhoods;
2) obtain the relative rotation angle information directly output by the inertial measurement unit rigidly connected to the camera;
3) solve the relative motion of the single-camera or multi-camera system using the affine matching point pairs extracted between the two views, the relative rotation angle output by the inertial measurement unit, and the relative motion estimation algorithm provided by the invention; meanwhile, eliminate mismatched point pairs within a RANSAC framework and recover the camera's relative motion result.
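The RANSAC integration in step 3) follows the usual hypothesize-and-verify pattern around the 2-point minimal solver. The sketch below uses placeholder solver and residual functions (both hypothetical; the real ones would be the solver of the invention and, e.g., a Sampson error):

```python
import random

def ransac_relative_motion(pairs, solve_minimal, residual,
                           iters=100, threshold=0.5):
    """Draw 2-pair minimal samples, solve, and keep the hypothesis with
    the largest inlier set; mismatched pairs are thereby rejected."""
    best_model, best_inliers = None, []
    for _ in range(iters):
        sample = random.sample(pairs, 2)          # minimal set: 2 AC pairs
        for model in solve_minimal(sample):       # solver may return several
            inliers = [p for p in pairs if residual(model, p) < threshold]
            if len(inliers) > len(best_inliers):
                best_model, best_inliers = model, inliers
    return best_model, best_inliers

# Toy stand-ins: "models" are scalars, and inliers cluster around 1.0.
random.seed(0)
pairs = [1.0, 1.1, 0.9, 5.0]                      # 5.0 plays a mismatch
solve = lambda s: [sum(s) / len(s)]
resid = lambda m, p: abs(m - p)
model, inliers = ransac_relative_motion(pairs, solve, resid)
```

Because the minimal set here is only 2 affine matching point pairs, far fewer iterations are needed than with classical 5-point samples, which is the practical benefit claimed for the method.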
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; nevertheless, any such combination shall be considered within the scope of this specification as long as it contains no contradiction.
The above-mentioned embodiments express only several implementations of the present application, and although their description is specific and detailed, they shall not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, all of which fall within its scope of protection. Therefore, the protection scope of this patent shall be subject to the appended claims.
Claims (7)
1. A camera relative motion estimation method under a known relative rotation angle condition, the method comprising:
acquiring at least two affine matching point pairs in a first view and a second view shot by a camera, and selecting a jth affine matching point pair to establish a world reference system; the origin of the world reference system is the position of the jth affine matching point pair in the three-dimensional space, and the coordinate axis direction of the world reference system is consistent with the first view direction;
acquiring a first pose relationship between the first view and the second view, a second pose relationship between the first view and the world reference system, and a third pose relationship between the second view and the world reference system; each of the first, second and third pose relationships comprises a rotation matrix and a translation vector;
parameterizing the rotation matrix and the translation vector, and determining a rotation parameter constraint on the unknowns of the rotation matrix according to the relative rotation angle between the first view and the second view;
representing the first pose relationship using the parameterized rotation matrix and translation vector, and acquiring the corresponding essential matrix;
acquiring two affine transformation constraints determined by the jth affine matching point pair, corresponding to the essential matrix and the affine matrix contained in that pair; and acquiring one epipolar geometric constraint between the first view and the second view and two affine transformation constraints determined by the other affine matching point pair, corresponding to the essential matrix and the affine matrix contained in that pair;
and solving for the rotation matrix and the translation vector according to the two affine transformation constraints corresponding to the jth affine matching point pair and the one epipolar geometric constraint and two affine transformation constraints determined by the other affine matching point pair, and determining the relative motion of the camera according to the rotation matrix and the translation vector.
2. The method of claim 1, wherein determining the rotation parameter constraint on the unknowns of the rotation matrix according to the relative rotation angle between the first view and the second view comprises:
obtaining the parameterized rotation matrix as follows:
wherein the quaternion homogeneous vector contains the rotation unknowns, and R represents the rotation matrix corresponding to the first pose relationship;
according to the relative rotation angle between the first view and the second view, determining the rotation parameter constraint of the unknown number corresponding to the rotation matrix as follows:
3. The method of claim 1, wherein the representing the first pose relationship using the parameterized rotation matrix and translation vector and obtaining a corresponding essential matrix comprises:
obtaining the parameterized translation vector in the second pose relationship and the parameterized translation vector in the third pose relationship, respectively:
wherein the translation vector in the second pose relationship is parameterized by an unknown depth parameter along the unit vector of the normalized homogeneous image coordinates in the first view, and the translation vector in the third pose relationship is parameterized by an unknown depth parameter along the unit vector of the normalized homogeneous image coordinates in the second view;
representing the first pose relationship using the parameterized rotation matrix and translation vector as follows:
wherein the translation vector is that of the first pose relationship, I represents the identity matrix, and R represents the rotation matrix corresponding to the first pose relationship;
and obtaining the corresponding essential matrix as follows:
4. The method of claim 3, wherein the epipolar geometry constraint is:
wherein the coordinates are the normalized homogeneous image coordinates of the corresponding point pair in the first view and the second view;
the affine transformation constraint is:
5. The method according to any one of claims 1 to 4, wherein solving the rotation matrix and the translation vector according to the two affine transformation constraints corresponding to the jth affine matching point and the one epipolar geometric constraint and the two affine transformation constraints determined by the other affine matching points comprises:
determining, according to the rotation parameter constraint, that the relative motion of the single camera has four degrees of freedom;
selecting the two affine transformation constraints corresponding to the jth affine matching point pair and the epipolar geometric constraint determined by the other affine matching point pair, and constructing a first solving model as follows:
wherein the element terms of the coefficient matrix are unknowns and second-order terms thereof, and the matrix has three rows and two columns;
selecting the other affine matching point pair to establish the world reference system, and obtaining a second solving model as follows:
obtaining, according to the first solving model and the second solving model, six equations in the unknowns as follows:
obtaining algebraic solutions of the six equations through the Gröbner basis method, determining the rotation matrix R of the first pose relationship from the algebraic solutions, determining the null space of the corresponding coefficient matrix according to the rotation matrix R, and calculating therefrom the translation vector in the second pose relationship and the translation vector in the third pose relationship; wherein the sub-matrices referenced in the models are the first two rows and first two columns of the respective coefficient matrices, the determinant is taken of those sub-matrices, and a two-row, two-column sub-matrix is formed by the unknowns.
6. The method of claim 5, wherein when the camera is a multi-camera system, the method further comprises:
acquiring the extrinsic parameters of each camera in the multi-camera system, comprising a rotation matrix and a translation vector; wherein the multi-camera system comprises the perspective cameras that capture the first view and the second view;
parameterizing the translation vector in the multi-camera system using a Plücker vector to obtain:
wherein the indices are the serial number of the camera, the serial number of the affine matching point pair, and the serial number of the view, respectively;
wherein the unit direction vector is calculated from the normalized homogeneous image coordinates observed by the corresponding camera, the moment vector is the moment of the Plücker line, and the depth parameter of the translation vector parameterization is unknown;
obtaining, from the parameterized rotation matrix and translation vector, a fourth pose relationship between the two perspective cameras corresponding to the first view and the second view, as follows:
and calculating an essential matrix corresponding to the fourth pose relation as follows:
wherein the two rotation matrices and the two translation vectors respectively represent the extrinsic parameters of the two perspective cameras;
7. The method according to claim 6, wherein solving for the rotation matrix and the translation vector according to the two affine transformation constraints corresponding to the jth affine matching point pair and the one epipolar geometric constraint and two affine transformation constraints determined by the other affine matching point pair further comprises:
determining, according to the rotation parameter constraint, that the relative motion of the multi-camera system has five degrees of freedom;
selecting the two affine transformation constraints corresponding to the jth affine matching point pair and one epipolar geometric constraint and one affine transformation constraint determined by the other affine matching point pair, and constructing a third solving model as follows:
selecting the other affine matching point pair to establish the world reference system, and obtaining a fourth solving model as follows:
obtaining, according to the third solving model and the fourth solving model, eight equations in the unknowns as follows:
obtaining algebraic solutions of the eight equations through the Gröbner basis method, determining the rotation matrix R of the first pose relationship of the multi-camera system from the algebraic solutions, determining the null space of the corresponding coefficient matrix according to the rotation matrix R, and calculating therefrom the translation vector in the second pose relationship and the translation vector in the third pose relationship.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110596663.4A CN113048985B (en) | 2021-05-31 | 2021-05-31 | Camera relative motion estimation method under known relative rotation angle condition |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113048985A true CN113048985A (en) | 2021-06-29 |
CN113048985B CN113048985B (en) | 2021-08-06 |
Family
ID=76518592
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110596663.4A Active CN113048985B (en) | 2021-05-31 | 2021-05-31 | Camera relative motion estimation method under known relative rotation angle condition |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113048985B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6137902A (en) * | 1997-07-22 | 2000-10-24 | Atr Human Information Processing Research Laboratories | Linear estimation method for three-dimensional position with affine camera correction |
CN101226638A (en) * | 2007-01-18 | 2008-07-23 | 中国科学院自动化研究所 | Method and apparatus for standardization of multiple camera system |
CN111476842A (en) * | 2020-04-10 | 2020-07-31 | 中国人民解放军国防科技大学 | Camera relative pose estimation method and system |
CN111696158A (en) * | 2020-06-04 | 2020-09-22 | 中国人民解放军国防科技大学 | Affine matching point pair-based multi-camera system relative pose estimation method and device |
CN112629565A (en) * | 2021-03-08 | 2021-04-09 | 中国人民解放军国防科技大学 | Method, device and equipment for calibrating rotation relation between camera and inertial measurement unit |
Non-Patent Citations (4)
Title |
---|
DÁNIEL BARÁTH ET AL.: "Making affine correspondences work in camera geometry computation", 《COMPUTER VISION - ECCV 2020 16TH EUROPEAN CONFERENCE》 *
GUAN BANGLEI: "Relative Pose Estimation With a Single Affine Correspondence", 《IEEE TRANSACTIONS ON CYBERNETICS》 *
TU GUOYONG ET AL.: "Heterologous image matching method under affine deformation", 《JOURNAL OF COMPUTER-AIDED DESIGN & COMPUTER GRAPHICS》 *
TIAN MIAO ET AL.: "A decoupled relative pose estimation method for multi-camera systems without a common field of view", 《ACTA OPTICA SINICA》 *
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115451958A (en) * | 2022-11-10 | 2022-12-09 | 中国人民解放军国防科技大学 | Camera absolute attitude optimization method based on relative rotation angle |
CN115451958B (en) * | 2022-11-10 | 2023-02-03 | 中国人民解放军国防科技大学 | Camera absolute attitude optimization method based on relative rotation angle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||