CN113048985B - Camera relative motion estimation method under known relative rotation angle condition - Google Patents


Info

Publication number
CN113048985B
CN113048985B (application CN202110596663.4A)
Authority
CN
China
Prior art keywords
camera
view
affine
matrix
translation vector
Prior art date
Legal status
Active
Application number
CN202110596663.4A
Other languages
Chinese (zh)
Other versions
CN113048985A (en)
Inventor
关棒磊
谭泽
李璋
Current Assignee
National University of Defense Technology
Original Assignee
National University of Defense Technology
Priority date
Filing date
Publication date
Application filed by National University of Defense Technology filed Critical National University of Defense Technology
Priority to CN202110596663.4A
Publication of CN113048985A
Application granted
Publication of CN113048985B
Legal status: Active

Classifications

    • G: PHYSICS
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/20: Navigation; Instruments for performing navigational calculations
    • G01C21/165: Navigation by integrating acceleration or speed (dead reckoning, inertial navigation) combined with non-inertial navigation instruments
    • G06F17/16: Complex mathematical operations; Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G06T7/262: Image analysis; Analysis of motion using transform domain methods, e.g. Fourier domain methods
    • G06T7/75: Image analysis; Determining position or orientation of objects or cameras using feature-based methods involving models

Abstract

The application relates to a camera relative motion estimation method based on affine matching point pairs under the condition of a known relative rotation angle. When an inertial measurement unit is rigidly mounted with a camera, the two undergo the same relative rotation angle during motion, so the relative rotation angle output by the inertial measurement unit can be used directly as the relative rotation angle of the camera or of a multi-camera system. The method therefore applies to broader scenarios in which the mounting relationship between the inertial measurement unit and the camera is unknown or varies. With the relative rotation angle of the camera or multi-camera system known, the relative motion of a single camera or a multi-camera system can be estimated from the image coordinates of two affine matching point pairs together with the local affine transformation matrices of their neighborhoods. Accuracy and efficiency are thereby improved, and the method suits devices with limited computing power, such as autonomous drones, self-driving cars, and augmented reality devices.

Description

Camera relative motion estimation method under known relative rotation angle condition
Technical Field
The application relates to the technical field of positioning, in particular to a camera relative motion estimation method under the condition of known relative rotation angle.
Background
Relative motion estimation of a single camera or of multiple cameras is a fundamental problem in three-dimensional vision, with applications such as robot localization and mapping, augmented reality, and autonomous driving. A single-camera system is modeled by the central perspective projection model, and single-camera relative motion is typically estimated with the essential matrix algorithm from 5 corresponding point pairs or the homography matrix algorithm from 4 corresponding point pairs. A multi-camera system consists of several single cameras fixed to one rigid body; it is modeled by the generalized camera model and has no single center of projection. Relative motion estimation for multiple cameras typically uses a minimal solution from 6 corresponding point pairs or a linear method from 17 corresponding point pairs.
At present, sensors such as an inertial measurement unit and a camera are commonly installed, rigidly mounted together, in electronic devices such as mobile phones and drones, so the inertial measurement unit can provide rotation angle information for camera relative motion estimation. Two main cases arise: (1) If the mounting relationship between the inertial measurement unit and the camera has been accurately calibrated in advance, attitude angles can be provided to the camera directly from the rotation output of the inertial measurement unit, reducing the number of pose parameters to be solved in camera relative motion estimation. For example, when the inertial measurement unit provides a common gravity direction for the camera, the relative motion of a single-camera system can be estimated from 3 corresponding point pairs and that of a multi-camera system from 4 corresponding point pairs. (2) If the mounting relationship between the inertial measurement unit and the camera is unknown, the relative rotation angle output by the inertial measurement unit can still be used directly as the relative rotation angle of the camera, because a rigidly mounted inertial measurement unit and camera undergo the same relative rotation angle during motion; this likewise reduces the pose parameters to be solved. The relative motion of a single-camera system and of a multi-camera system can then be estimated from 4 and 5 corresponding point pairs, respectively. Since no calibration between the inertial measurement unit and the camera is required, their integration is more flexible and convenient, with wide application in three-dimensional reconstruction, visual odometry, and related fields.
Traditional relative motion estimation algorithms usually obtain corresponding point pairs with feature algorithms such as SIFT and SURF. In multi-view geometric estimation, affine matching point pairs extracted by feature algorithms such as ASIFT and MODS are attracting growing attention because they carry more image information: an affine matching point pair includes not only the image coordinates of the corresponding points but also a local affine matrix describing the transformation of their neighborhoods. Each affine matching point pair yields three constraint equations in the epipolar geometry, which effectively reduces the number of point pairs required by a relative motion estimation method.
Disclosure of Invention
In view of the above, there is a need for a camera relative motion estimation method that can solve the problem under a known relative rotation angle using affine matching point pairs.
A camera relative motion estimation method under a known relative rotation angle condition, the method comprising:
acquiring at least two affine matching point pairs from a first view and a second view captured by a camera, and selecting the j-th affine matching point pair to establish a world reference frame; the origin of the world reference frame is the position of the j-th affine matching point pair in three-dimensional space, and the coordinate axes of the world reference frame are aligned with those of the first view;
acquiring a first pose relation between the first view and the second view, a second pose relation between the first view and the world reference frame, and a third pose relation between the second view and the world reference frame; each of the first, second, and third pose relations comprises a rotation matrix and a translation vector;
parameterizing the rotation matrix and the translation vectors, and determining a rotation parameter constraint on the unknowns of the rotation matrix from the relative rotation angle between the first view and the second view;
expressing the first pose relation with the parameterized rotation matrix and translation vector and obtaining the corresponding essential matrix;
acquiring the two affine transformation constraints determined by the j-th affine matching point pair from the essential matrix and its local affine transformation matrix, and acquiring, for the other affine matching point pair, one epipolar geometric constraint between the first view and the second view and the two affine transformation constraints determined by the essential matrix and its local affine transformation matrix;
and solving for the rotation matrix and the translation vectors from the two affine transformation constraints of the j-th affine matching point pair together with the epipolar geometric constraint and affine transformation constraints of the other affine matching point pair, and determining the relative motion of the camera from the rotation matrix and the translation vectors.
This camera relative motion estimation method under a known relative rotation angle uses the relative rotation angle provided by an inertial measurement unit together with the constraints between affine matching point pairs and the camera motion model. Two minimal solutions for relative motion estimation based on two affine matching point pairs are presented, estimating the motion of a single camera and of a multi-camera system respectively. The number of point pairs required to solve the relative motion estimation problem for single-camera and multi-camera systems is thus greatly reduced, the accuracy and robustness of the algorithm are markedly improved, and the method suits cases where the mounting relationship between the inertial measurement unit and the camera is unknown or varies.
Drawings
FIG. 1 is a diagram of a method for estimating relative camera motion under a known relative rotation angle condition in one embodiment;
FIG. 2 is a schematic diagram of a single camera parameter distribution in one embodiment;
FIG. 3 is a diagram illustrating multi-camera parameter distribution in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In one embodiment, as shown in fig. 1, there is provided a camera relative motion estimation method under a known relative rotation angle condition, comprising the steps of:
and 102, acquiring at least two affine matching point pairs in a first view and a second view shot by the camera, and selecting a jth affine matching point pair to establish a world reference system.
The origin of the world reference system is the position of the jth affine matching point pair in the three-dimensional space, and the coordinate axis direction of the world reference system is consistent with the first view direction.
And 104, acquiring a first posture relation between the first view and the second view, acquiring a second posture relation between the first view and the world reference system, and acquiring a third posture relation between the second view and the world reference system.
The first position relation, the second position relation and the third position relation all comprise a rotation matrix and a translation vector.
And 106, parameterizing the rotation matrix and the translation vector, determining rotation parameter constraints corresponding to unknown numbers of the rotation matrix according to the relative rotation angle between the first view and the second view, representing the first position and orientation relation by adopting the parameterized rotation matrix and translation vector, and acquiring a corresponding essential matrix.
And 108, acquiring two affine transformation constraints corresponding to the intrinsic matrix determined by the jth affine matching point and the affine matching matrix in the affine matching points, and acquiring one epipolar geometric constraint of the first view and the second view determined by other affine matching points and two affine transformation constraints corresponding to the intrinsic matrix and the affine matching matrix in the affine matching points.
And 110, obtaining a rotation matrix and a translation vector according to two affine transformation constraints corresponding to the jth affine matching point and one epipolar geometric constraint and two affine transformation constraints determined by other affine matching points, and determining a relative motion relation of the camera according to the rotation matrix and the translation vector.
The camera relative motion estimation method under the known relative rotation angle condition utilizes the relative rotation angle provided by the inertial measurement unit and is based on the constraint between the affine matching point pair and the camera motion model. Two relative motion estimation minimum solutions based on two affine matching point pairs are provided, and the motion of a single camera and a multi-camera system is estimated respectively, so that the number of the point pairs required for solving the relative motion estimation problem of the single camera and the multi-camera system is greatly reduced, the accuracy and the robustness of the algorithm are obviously improved, and the method is suitable for the condition that the installation relationship between an inertial measurement unit and the camera is unknown or has variation.
For single-camera relative motion estimation, assume the j-th affine matching point pair is represented as $(\mathbf{x}_{j1}, \mathbf{x}_{j2}, \mathbf{A}_j)$, where $\mathbf{x}_{j1}$ and $\mathbf{x}_{j2}$ are the normalized homogeneous image coordinates of the corresponding point pair in the first view and the second view, and $\mathbf{A}_j$ is a 2×2 local affine transformation matrix characterizing the affine relation in an infinitesimal neighborhood around $\mathbf{x}_{j1}$ and $\mathbf{x}_{j2}$. The unit direction vectors of the corresponding point pair are computed as
$$\bar{\mathbf{x}}_{jk} = \mathbf{x}_{jk} / \|\mathbf{x}_{jk}\|, \quad k = 1, 2.$$
The inputs of this embodiment are two affine matching point pairs (at minimum, one affine matching point pair and one corresponding point pair) and the relative rotation angle of the single camera provided by the inertial measurement unit.
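To make the data layout concrete, here is a minimal numpy sketch of an affine matching point pair and the unit-direction normalization above. The numeric values are hypothetical, chosen only for illustration.

```python
import numpy as np

def unit_direction(x):
    """Normalize a homogeneous image point to the unit direction x_bar = x / ||x||."""
    x = np.asarray(x, dtype=float)
    return x / np.linalg.norm(x)

# A hypothetical affine matching point pair (x_j1, x_j2, A_j):
x_j1 = np.array([0.12, -0.05, 1.0])   # normalized homogeneous coordinates, view 1
x_j2 = np.array([0.10, -0.02, 1.0])   # normalized homogeneous coordinates, view 2
A_j  = np.array([[1.02,  0.01],
                 [-0.03, 0.98]])      # 2x2 local affine transformation matrix

xb_j1, xb_j2 = unit_direction(x_j1), unit_direction(x_j2)
```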
In one embodiment, the rotation matrix is parameterized with the Cayley parameterization as
$$\mathbf{R} = \frac{1}{1+q_1^2+q_2^2+q_3^2}\begin{bmatrix} 1+q_1^2-q_2^2-q_3^2 & 2q_1q_2-2q_3 & 2q_1q_3+2q_2 \\ 2q_1q_2+2q_3 & 1-q_1^2+q_2^2-q_3^2 & 2q_2q_3-2q_1 \\ 2q_1q_3-2q_2 & 2q_2q_3+2q_1 & 1-q_1^2-q_2^2+q_3^2 \end{bmatrix},$$
where $\mathbf{q} = [1, q_1, q_2, q_3]^\top$ is a homogeneous quaternion vector and $\mathbf{R}$ denotes the rotation matrix of the first pose relation.
From the relative rotation angle between the first view and the second view, the rotation parameter constraint on the unknowns of the rotation matrix is determined as
$$q_1^2 + q_2^2 + q_3^2 = \tan^2(\theta/2),$$
where $\theta$ denotes the relative rotation angle.
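The parameterization and the angle constraint can be checked numerically. The sketch below assumes the standard Cayley form given above; `cayley_rotation` and `rotation_angle_residual` are illustrative helper names, not from the patent.

```python
import numpy as np

def cayley_rotation(q):
    """Rotation matrix from the Cayley parameters q = (q1, q2, q3)
    of the homogeneous quaternion [1, q1, q2, q3]^T."""
    q1, q2, q3 = q
    s = 1.0 + q1*q1 + q2*q2 + q3*q3
    return np.array([
        [1 + q1*q1 - q2*q2 - q3*q3, 2*(q1*q2 - q3),            2*(q1*q3 + q2)],
        [2*(q1*q2 + q3),            1 - q1*q1 + q2*q2 - q3*q3, 2*(q2*q3 - q1)],
        [2*(q1*q3 - q2),            2*(q2*q3 + q1),            1 - q1*q1 - q2*q2 + q3*q3],
    ]) / s

def rotation_angle_residual(q, theta):
    """Residual of the known-angle constraint q1^2 + q2^2 + q3^2 = tan^2(theta/2)."""
    q1, q2, q3 = q
    return q1*q1 + q2*q2 + q3*q3 - np.tan(theta / 2.0) ** 2
```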
In one embodiment, the translation vector of the second pose relation and the translation vector of the third pose relation, obtained by parameterizing the translation vectors, are respectively
$$\mathbf{t}_1 = \lambda_{j1}\,\bar{\mathbf{x}}_{j1}, \qquad \mathbf{t}_2 = \lambda_{j2}\,\bar{\mathbf{x}}_{j2},$$
where $\mathbf{t}_1$ denotes the translation vector of the second pose relation, $\lambda_{j1}$ the unknown depth parameter of its parameterization, $\bar{\mathbf{x}}_{j1}$ the unit vector of the normalized homogeneous image coordinates in the first view, $\mathbf{t}_2$ the translation vector of the third pose relation, $\lambda_{j2}$ its unknown depth parameter, and $\bar{\mathbf{x}}_{j2}$ the unit vector of the normalized homogeneous image coordinates in the second view.
With the parameterized rotation matrix and translation vectors, the first pose relation is expressed as
$$\{\mathbf{R}, \mathbf{t}\} = \{\mathbf{R},\; \lambda_{j2}\,\bar{\mathbf{x}}_{j2} - \lambda_{j1}\,\mathbf{R}\,\bar{\mathbf{x}}_{j1}\},$$
where $\mathbf{t}$ denotes the translation vector of the first pose relation and the rotation from the first view to the world reference frame is the identity matrix $\mathbf{I}$.
The corresponding essential matrix is obtained as
$$\mathbf{E} = [\mathbf{t}]_\times\,\mathbf{R},$$
where $\mathbf{E}$ denotes the essential matrix and $[\mathbf{t}]_\times$ the antisymmetric (cross-product) matrix of $\mathbf{t}$.
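A small sketch of the essential-matrix construction under these parameterizations, assuming the composed translation $\mathbf{t} = \lambda_{j2}\bar{\mathbf{x}}_{j2} - \lambda_{j1}\mathbf{R}\bar{\mathbf{x}}_{j1}$ derived above; function names are illustrative.

```python
import numpy as np

def skew(t):
    """Antisymmetric matrix [t]_x such that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0,  -t[2],  t[1]],
                     [t[2],  0.0,  -t[0]],
                     [-t[1], t[0],  0.0]])

def essential_matrix(R, xb_j1, xb_j2, lam1, lam2):
    """E = [t]_x R, with t composed from view1 -> W and W -> view2:
    t = lam2 * xb_j2 - lam1 * R @ xb_j1, where t1 = lam1*xb_j1, t2 = lam2*xb_j2."""
    t = lam2 * xb_j2 - lam1 * (R @ xb_j1)
    return skew(t) @ R
```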
In one embodiment, the epipolar geometric constraint is
$$\mathbf{x}_{i2}^\top\,\mathbf{E}\,\mathbf{x}_{i1} = 0,$$
where $\mathbf{x}_{i1}$ and $\mathbf{x}_{i2}$ are the normalized homogeneous image coordinates of a corresponding point pair in the first view and the second view.
The affine transformation constraint is
$$\big(\mathbf{E}^\top \mathbf{x}_{i2}\big)_{(1:2)} + \mathbf{A}_i^\top \big(\mathbf{E}\,\mathbf{x}_{i1}\big)_{(1:2)} = \mathbf{0},$$
where the subscript (1:2) denotes the first two equations (the first two elements of the vector).
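The three constraints of one affine matching point pair can be evaluated as residuals, which is also how inliers are scored later in a RANSAC loop. A minimal sketch, assuming the constraint forms above:

```python
import numpy as np

def ac_residuals(E, x1, x2, A):
    """Three constraints of one affine matching point pair:
    the epipolar constraint x2^T E x1 = 0 and the two affine
    transformation constraints (E^T x2)_(1:2) + A^T (E x1)_(1:2) = 0."""
    epipolar = float(x2 @ E @ x1)
    affine = (E.T @ x2)[:2] + A.T @ (E @ x1)[:2]
    return epipolar, affine  # a scalar and a length-2 vector; zero for an exact solution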
In one embodiment, the relative motion parameters of the single camera are determined to have four degrees of freedom under the rotation parameter constraint. Selecting the two affine transformation constraints of the j-th affine matching point pair and the epipolar geometric constraint of the other affine matching point pair, a first solving model is constructed:
$$\mathbf{C}_1(\mathbf{q})\,[\lambda_{j1}, \lambda_{j2}]^\top = \mathbf{0},$$
where the element terms of $\mathbf{C}_1$ are second-order terms of the unknowns $q_1$, $q_2$, $q_3$, and the matrix has three rows and two columns.
Selecting the other affine matching point pair to establish the world reference frame yields a second solving model
$$\mathbf{C}_2(\mathbf{q})\,[\lambda_{j'1}, \lambda_{j'2}]^\top = \mathbf{0},$$
whose element terms are likewise second-order terms of $q_1$, $q_2$, $q_3$.
Because each system has a nonzero solution, every 2×2 sub-determinant of $\mathbf{C}_1$ and of $\mathbf{C}_2$ must vanish; the first and second solving models together thus provide six equations in the unknowns $q_1$, $q_2$, $q_3$:
$$\det \mathbf{C}_m^{(u,v)} = 0, \quad m = 1, 2, \quad (u, v) \in \{(1,2), (1,3), (2,3)\},$$
where $\mathbf{C}_m^{(u,v)}$ denotes the 2×2 submatrix formed from rows $u$ and $v$ of $\mathbf{C}_m$.
Algebraic solutions of the six equations are obtained with the Gröbner basis method; the rotation matrix R of the first pose relation is determined from the algebraic solution, $[\lambda_{j1}, \lambda_{j2}]^\top$ is determined from the null space of $\mathbf{C}_1$, and the translation vector of the second pose relation and the translation vector of the third pose relation are computed from $\mathbf{t}_1 = \lambda_{j1}\bar{\mathbf{x}}_{j1}$ and $\mathbf{t}_2 = \lambda_{j2}\bar{\mathbf{x}}_{j2}$.
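Numerically, the depth recovery from the null space can be done with an SVD. A sketch assuming the coefficient matrix of the first solving model has already been assembled (the assembly itself is not shown):

```python
import numpy as np

def depth_direction(C):
    """Null-space vector of the system C(q) @ [lam_j1, lam_j2]^T = 0:
    the right singular vector for the smallest singular value.  For the
    single-camera problem the depths, and hence the translation, are
    recovered up to a common scale."""
    _, _, Vt = np.linalg.svd(C)
    return Vt[-1]  # [lam_j1, lam_j2], up to scale
```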
In one embodiment, when the camera is a multi-camera system, the extrinsic parameters $\{\mathbf{R}_i, \mathbf{t}_i\}$ of each camera $C_i$ in the multi-camera system are acquired; the multi-camera system includes the perspective cameras that capture the first view and the second view.
The translation vectors in the multi-camera system are parameterized with Plücker vectors: every point on the Plücker line $(\mathbf{u}_{ijk}; \mathbf{v}_{ijk})$ can be written as
$$\mathbf{X}(\lambda_{jk}) = \mathbf{u}_{ijk} \times \mathbf{v}_{ijk} + \lambda_{jk}\,\mathbf{u}_{ijk},$$
where $i$ is the camera index, $j$ is the index of the affine matching point pair, and $k$ is the view index. The unit direction vector $\mathbf{u}_{ijk}$ is computed as
$$\mathbf{u}_{ijk} = \frac{\mathbf{R}_i\,\mathbf{x}_{ijk}}{\|\mathbf{R}_i\,\mathbf{x}_{ijk}\|},$$
where $\mathbf{x}_{ijk}$ is the normalized homogeneous image coordinate of the $j$-th pair observed in view $k$ by camera $C_i$, $\mathbf{v}_{ijk}$ is the moment vector of the line, and $\lambda_{jk}$ is an unknown depth parameter.
From the parameterized rotation matrix and translation vectors, a fourth pose relation $\{\mathbf{R}', \mathbf{t}'\}$ between the two perspective cameras corresponding to the first view and the second view is obtained, and the essential matrix corresponding to the fourth pose relation is computed as
$$\mathbf{E}' = [\mathbf{t}']_\times\,\mathbf{R}'.$$
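A minimal sketch of the Plücker parameterization, assuming the direction formula $\mathbf{u} = \mathbf{R}_i\mathbf{x}/\|\mathbf{R}_i\mathbf{x}\|$ given above; function names are illustrative.

```python
import numpy as np

def pluecker_direction(R_i, x_ijk):
    """Unit direction u_ijk = R_i x_ijk / ||R_i x_ijk|| of the observation ray
    of camera C_i, expressed in the multi-camera body frame."""
    d = R_i @ x_ijk
    return d / np.linalg.norm(d)

def point_on_line(u, v, lam):
    """A point on the Pluecker line (u; v): X(lam) = u x v + lam * u,
    where v is the moment vector and lam an unknown depth parameter."""
    return np.cross(u, v) + lam * u
```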
in one embodiment, the relative motion parameters of the multi-camera system are determined to be five degrees of freedom according to the rotation parameter constraints;
selecting two affine transformation constraints corresponding to the jth affine matching point and an epipolar geometric constraint and an affine transformation constraint determined by the other affine matching points, and constructing a third solution model as follows:
Figure 657336DEST_PATH_IMAGE066
selecting other affine matching points to establish a world reference system, and obtaining a fourth solution model as follows:
Figure 788104DEST_PATH_IMAGE067
obtaining the unknown number according to the third solving model and the fourth solving model
Figure 926961DEST_PATH_IMAGE068
The eight equations of (1) are:
Figure 775968DEST_PATH_IMAGE069
obtaining algebraic solutions of eight equations through a Gr baby basis solution, determining a rotation matrix R of a first attitude relationship of the multi-camera system according to the algebraic solutions, and determining a rotation matrix R of the first attitude relationship of the multi-camera system according to the rotation matrix R
Figure 291263DEST_PATH_IMAGE070
Mid-null space determination
Figure 276537DEST_PATH_IMAGE071
According to
Figure 851874DEST_PATH_IMAGE072
And calculating to obtain a translation vector in the second position posture relation and a translation vector in the third position posture relation.
The single-camera and multi-camera cases are further described below.
Single camera
Any affine matching point pair may be selected to define the world reference frame W, as shown in FIG. 2. Assume the j-th affine matching point pair is currently selected: its position in three-dimensional space is taken as the origin of W, and the coordinate axes of W are aligned with those of view 1. The pose relation between view 1 and view 2 is denoted $\{\mathbf{R}, \mathbf{t}\}$, the pose relation between view 1 and the reference frame W is denoted $\{\mathbf{R}_1, \mathbf{t}_1\}$, and the pose relation between view 2 and W is denoted $\{\mathbf{R}_2, \mathbf{t}_2\}$. In particular, $\mathbf{R}_1 = \mathbf{I}$ and $\mathbf{R}_2 = \mathbf{R}$. Using the Cayley parameterization, the rotation matrix $\mathbf{R}$ can be expressed as
$$\mathbf{R} = \frac{1}{1+q_1^2+q_2^2+q_3^2}\begin{bmatrix} 1+q_1^2-q_2^2-q_3^2 & 2q_1q_2-2q_3 & 2q_1q_3+2q_2 \\ 2q_1q_2+2q_3 & 1-q_1^2+q_2^2-q_3^2 & 2q_2q_3-2q_1 \\ 2q_1q_3-2q_2 & 2q_2q_3+2q_1 & 1-q_1^2-q_2^2+q_3^2 \end{bmatrix},$$
where $\mathbf{q} = [1, q_1, q_2, q_3]^\top$ is a homogeneous quaternion vector. The relative rotation angle between view 1 and view 2 is assumed known. Notably, it can be provided directly by the inertial measurement unit even when the mounting relationship between the inertial measurement unit and the camera is unknown or varies. The three unknowns $q_1$, $q_2$, $q_3$ satisfy the constraint
$$q_1^2 + q_2^2 + q_3^2 = \tan^2(\theta/2),$$
where $\theta$ is the relative rotation angle between the two views.
Next, $\mathbf{t}_1$ and $\mathbf{t}_2$ are parameterized as linear functions of two unknown depth parameters $\lambda_{j1}$ and $\lambda_{j2}$:
$$\mathbf{t}_1 = \lambda_{j1}\,\bar{\mathbf{x}}_{j1}, \qquad \mathbf{t}_2 = \lambda_{j2}\,\bar{\mathbf{x}}_{j2}.$$
The relative motion between the two views is determined by the composition of two transformations: (i) from view 1 to W, and (ii) from W to view 2. The unknowns $\mathbf{R}$, $\mathbf{t}_1$, and $\mathbf{t}_2$ are parameterized by $q_1$, $q_2$, $q_3$, $\lambda_{j1}$, $\lambda_{j2}$. The relative motion $\{\mathbf{R}, \mathbf{t}\}$ is expressed as
$$\mathbf{t} = \lambda_{j2}\,\bar{\mathbf{x}}_{j2} - \lambda_{j1}\,\mathbf{R}\,\bar{\mathbf{x}}_{j1},$$
and the essential matrix can be represented as
$$\mathbf{E} = [\mathbf{t}]_\times\,\mathbf{R} = \big[\lambda_{j2}\,\bar{\mathbf{x}}_{j2} - \lambda_{j1}\,\mathbf{R}\,\bar{\mathbf{x}}_{j1}\big]_\times\,\mathbf{R}.$$
Substituting the parameterized translation into the essential matrix shows that each element of $\mathbf{E}$ is linear in $\lambda_{j1}$ and $\lambda_{j2}$.
An affine matching point pair yields three independent constraints for geometric model estimation: one epipolar geometric constraint derived from the corresponding point pair, and two affine transformation constraints derived from the local affine transformation matrix $\mathbf{A}_j$. With known camera intrinsics, the epipolar geometric constraint between view 1 and view 2 is
$$\mathbf{x}_{j2}^\top\,\mathbf{E}\,\mathbf{x}_{j1} = 0. \quad (1)$$
The affine transformation constraint describing the relation between the essential matrix $\mathbf{E}$ and the local affine transformation matrix $\mathbf{A}_j$ can be expressed as
$$\big(\mathbf{E}^\top \mathbf{x}_{j2}\big)_{(1:2)} + \mathbf{A}_j^\top \big(\mathbf{E}\,\mathbf{x}_{j1}\big)_{(1:2)} = \mathbf{0}, \quad (2)$$
where the subscript (1:2) denotes the first two equations.
Because one affine matching point pair has already been selected as the origin of the world reference frame for the special parameterization of the translation vectors, the point correspondence of the selected pair contributes no new constraint: the coefficients of the resulting epipolar equation are all zero. Thus, when the j-th affine matching point pair is used to establish the world reference frame W, two affine matching point pairs provide five equations. Specifically, the j-th pair provides two equations from the affine constraint (2), and the other pair provides three equations, one from the epipolar constraint (1) and two from the affine constraint (2). Substituting the parameterized essential matrix into (1) and (2) and applying the hidden variable method, the five equations provided by the two affine matching point pairs can be written as
$$\mathbf{C}(\mathbf{q})\begin{bmatrix} \lambda_{j1} \\ \lambda_{j2} \end{bmatrix} = \mathbf{0}, \quad (3)$$
where the element terms of the 5×2 coefficient matrix $\mathbf{C}(\mathbf{q})$ are second-order terms of the unknowns $q_1$, $q_2$, $q_3$.
Once the relative rotation angle between the two cameras is obtained from the inertial measurement unit, the single-camera relative motion estimation problem has four degrees of freedom. Two affine matching point pairs, however, provide six independent constraints, so the number of constraints exceeds the number of unknowns and there are redundant constraints. At least one affine matching point pair and one corresponding point pair therefore suffice to estimate the relative motion of a single camera under a known relative rotation angle. Three equations can be chosen from (3) to explore the minimal-solution case. More specifically, combining the two affine transformation constraints of the j-th affine matching point pair with the epipolar geometric constraint of the other pair yields three equations in five unknowns, i.e., the first three rows of (3):
$$\mathbf{M}(\mathbf{q})\begin{bmatrix} \lambda_{j1} \\ \lambda_{j2} \end{bmatrix} = \mathbf{0}, \quad (4)$$
where $\mathbf{M}(\mathbf{q})$ is a 3×2 matrix. Since (4) has a nonzero solution, $\mathbf{M}(\mathbf{q})$ satisfies $\operatorname{rank}(\mathbf{M}) < 2$, so all 2×2 sub-determinants of $\mathbf{M}(\mathbf{q})$ must be zero. This gives three equations in the three unknowns $q_1$, $q_2$, $q_3$.
In summary, assume the j-th affine matching point pair is selected to establish the world reference frame W. Since the minimal case requires two affine matching point pairs, the other pair, with index j′, can likewise be selected to establish W, giving a system analogous to (4):
$$\mathbf{M}'(\mathbf{q})\begin{bmatrix} \lambda_{j'1} \\ \lambda_{j'2} \end{bmatrix} = \mathbf{0}. \quad (5)$$
Combining (4) and (5) yields six equations in the three unknowns $q_1$, $q_2$, $q_3$:
$$\det \mathbf{M}^{(u,v)} = 0, \quad \det \mathbf{M}'^{(u,v)} = 0, \quad (u,v) \in \{(1,2),(1,3),(2,3)\}, \quad (6)$$
where the superscript $(u,v)$ denotes the 2×2 submatrix formed from rows $u$ and $v$. System (6) is of degree four in $q_1$, $q_2$, $q_3$.
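The rank-deficiency step can be reproduced symbolically. The sketch below uses sympy and stands in placeholder symbols for the entries of M(q); in the actual solver those entries are degree-two polynomials in q1, q2, q3 assembled from the two affine matching point pairs (the assembly is not shown here).

```python
import sympy as sp

# Placeholder entries m_rc for the 3x2 hidden-variable matrix M(q);
# each stands for a second-order polynomial in q1, q2, q3.
M = sp.Matrix(3, 2, lambda r, c: sp.Symbol(f"m{r}{c}"))

# Since M(q) @ [lam_j1, lam_j2]^T = 0 has a nonzero solution, rank(M) < 2,
# so every 2x2 sub-determinant of M must vanish: three equations in q.
minors = [M[[r1, r2], :].det() for r1, r2 in [(0, 1), (0, 2), (1, 2)]]
```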
The polynomial system formed by (4), (5), and (6) can be solved algebraically with the Gröbner basis method. To maintain numerical stability and avoid large intermediate computations while constructing the Gröbner basis, a random instance of the polynomial system is constructed over a finite prime field $\mathbb{Z}_p$. The Gröbner basis is then computed with the computer algebra system Macaulay2, and the corresponding solver is obtained with an automatic Gröbner basis solver generator. The problem has at most 20 complex solutions, and the elimination template has size 36×56. Once the rotation parameters $q_1$, $q_2$, $q_3$ are obtained, the rotation matrix $\mathbf{R}$ follows immediately from the Cayley parameterization. Then $[\lambda_{j1}, \lambda_{j2}]^\top$ is determined by finding the null space of $\mathbf{M}(\mathbf{q})$ in (4), and $\mathbf{t}_1 = \lambda_{j1}\bar{\mathbf{x}}_{j1}$ and $\mathbf{t}_2 = \lambda_{j2}\bar{\mathbf{x}}_{j2}$ are computed. Finally, the relative motion of the single camera is calculated from $\mathbf{t} = \lambda_{j2}\bar{\mathbf{x}}_{j2} - \lambda_{j1}\mathbf{R}\,\bar{\mathbf{x}}_{j1}$.
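Chaining the pieces, one candidate rotation from the polynomial solver yields the full single-camera motion as follows. `build_M` is a placeholder for assembling M(q) of equation (4) from the two affine matching point pairs, and `cayley_rotation` is the helper sketched earlier; both names are illustrative.

```python
import numpy as np

def recover_motion(q, xb_j1, xb_j2, build_M):
    """Recover {R, t} from one rotation solution q of system (6).
    build_M(q) must return the 3x2 matrix M(q) of equation (4)."""
    R = cayley_rotation(q)                     # from the earlier sketch
    _, _, Vt = np.linalg.svd(build_M(q))
    lam1, lam2 = Vt[-1]                        # depths, up to a global scale
    t = lam2 * xb_j2 - lam1 * (R @ xb_j1)      # relative translation
    return R, t / np.linalg.norm(t)            # translation up to scale
```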
Multi-camera
The j-th affine matching point pair is selected to define the world reference frame W, as shown in FIG. 3. Its position in three-dimensional space is taken as the origin of W, and the coordinate axes of W are aligned with those of view 1. The relative motion of the multi-camera system reference frame between view 1 and view 2 is denoted $\{\mathbf{R}, \mathbf{t}\}$; the transformation between view 1 and the reference frame W is denoted $\{\mathbf{R}_1, \mathbf{t}_1\}$ and that between view 2 and W is denoted $\{\mathbf{R}_2, \mathbf{t}_2\}$. Note that $\mathbf{R}_1 = \mathbf{I}$ and $\mathbf{R}_2 = \mathbf{R}$. Then $\mathbf{t}_1$ and $\mathbf{t}_2$ are parameterized. All points on the line described by the Plücker vector $(\mathbf{u}; \mathbf{v})$ can be parameterized as
$$\mathbf{X}(\lambda) = \mathbf{u} \times \mathbf{v} + \lambda\,\mathbf{u}, \quad (7)$$
where $\mathbf{u}$ is the unit direction vector, $\mathbf{v}$ is the moment vector, and $\lambda$ is an unknown depth parameter.
Suppose the three-dimensional position $\mathbf{X}_j$ of the j-th affine matching point pair is selected to define the origin of the world reference frame W. The Plücker line joining $\mathbf{X}_j$ and the optical center of camera $C_i$ is denoted $(\mathbf{u}_{ijk}; \mathbf{v}_{ijk})$. Then, in view $k$, the point $\mathbf{X}_j$ satisfies
$$\mathbf{t}_k = \mathbf{u}_{ijk} \times \mathbf{v}_{ijk} + \lambda_{jk}\,\mathbf{u}_{ijk}, \quad k = 1, 2, \quad (8)$$
where $i$ is the camera index, $j$ is the index of the affine matching point pair, and $k$ is the view index. The unit direction vector $\mathbf{u}_{ijk}$ is computed via $\mathbf{u}_{ijk} = \mathbf{R}_i\mathbf{x}_{ijk}/\|\mathbf{R}_i\mathbf{x}_{ijk}\|$, where $\mathbf{x}_{ijk}$ is the normalized homogeneous image coordinate of the j-th pair observed in view $k$ by camera $C_i$. In this way, $\mathbf{t}_1$ and $\mathbf{t}_2$ are parameterized as linear functions of the two unknown depth parameters $\lambda_{j1}$ and $\lambda_{j2}$.
Each affine matching point pair is associated with two perspective cameras, one in view 1 and one in view 2. The relative motion $\{\mathbf{R}', \mathbf{t}'\}$ between the two perspective cameras is determined by the composition of four transformations: (i) from one perspective camera to view 1, (ii) from view 1 to W, (iii) from W to view 2, and (iv) from view 2 to the other perspective camera. Of these four transformations, (i) and (iv) are determined by the known extrinsic parameters. Parts (ii) and (iii) contain the unknowns $\mathbf{R}$, $\mathbf{t}_1$, and $\mathbf{t}_2$, which are parameterized by $q_1$, $q_2$, $q_3$, $\lambda_{j1}$, $\lambda_{j2}$. The relative motion $\{\mathbf{R}', \mathbf{t}'\}$ can be expressed as
$$\mathbf{R}' = \mathbf{R}_{i2}^\top\,\mathbf{R}\,\mathbf{R}_{i1}, \qquad \mathbf{t}' = \mathbf{R}_{i2}^\top\big(\mathbf{R}\,\mathbf{t}_{i1} + \mathbf{t} - \mathbf{t}_{i2}\big), \quad (9)$$
where $\{\mathbf{R}_{i1}, \mathbf{t}_{i1}\}$ and $\{\mathbf{R}_{i2}, \mathbf{t}_{i2}\}$ are the extrinsic parameters of the perspective cameras observing the pair in view 1 and view 2. With the relative motion between the two perspective cameras expressed for each affine matching point pair, the essential matrix is
$$\mathbf{E}' = [\mathbf{t}']_\times\,\mathbf{R}'. \quad (10)$$
Substituting (9) into (10) shows that every element of the essential matrix is linear in $\lambda_{jk}$, $k = 1, 2$. Substituting (10) into the epipolar constraint (1) and the affine constraint (2), five equations are obtained: two from the affine transformation constraints of the j-th affine matching point pair, and three from the other pair (one epipolar and two affine). These equations can be written as
$$\mathbf{C}(\mathbf{q})\begin{bmatrix} \lambda_{j1} \\ \lambda_{j2} \\ 1 \end{bmatrix} = \mathbf{0}, \quad (11)$$
where all element terms of $\mathbf{C}(\mathbf{q})$ are second-order terms of the unknowns $q_1$, $q_2$, $q_3$.
After the relative rotation angle between the two multi-camera frames is obtained from the inertial measurement unit, the multi-camera relative motion estimation problem has five degrees of freedom. Since two affine matching point pairs provide six independent constraints, the number of constraints exceeds the number of unknowns and there are redundant constraints. Four equations are therefore selected from (11) to explore the minimal-solution case. For example, combining the two affine transformation constraints of the j-th affine matching point pair with the epipolar geometric constraint and the first affine transformation constraint of the other pair gives four equations in five unknowns, i.e., the first four rows of (11):
$$\mathbf{M}(\mathbf{q})\begin{bmatrix} \lambda_{j1} \\ \lambda_{j2} \\ 1 \end{bmatrix} = \mathbf{0}, \quad (12)$$
where $\mathbf{M}(\mathbf{q})$ is a 4×3 matrix. Since (12) has a nonzero solution, $\mathbf{M}(\mathbf{q})$ satisfies $\operatorname{rank}(\mathbf{M}) < 3$, so all 3×3 sub-determinants of $\mathbf{M}(\mathbf{q})$ must be zero. This gives four equations in the three unknowns $q_1$, $q_2$, $q_3$.
Likewise, the other affine matching point pair, with index j′, can be selected to establish the world reference frame W, giving a system analogous to (12):
$$\mathbf{M}'(\mathbf{q})\begin{bmatrix} \lambda_{j'1} \\ \lambda_{j'2} \\ 1 \end{bmatrix} = \mathbf{0}. \quad (13)$$
Combining (12) and (13), eight equations in the three unknowns $q_1$, $q_2$, $q_3$ are obtained from the vanishing 3×3 sub-determinants. These equations are of degree six. Furthermore, an additional constraint holds in this problem: the last element of the homogeneous depth vector $[\lambda_{j1}, \lambda_{j2}, 1]^\top$ equals 1.
The affine transformation constraints provide additional equations for the problem. Only when the world reference frame W is established with the j-th affine matching point pair are the two affine transformation constraints of the j-th pair used to construct the additional equations. Thus, for the multi-camera relative motion estimation problem, there are three additional equations, and this additional system is of degree four in $q_1$, $q_2$, $q_3$.
The system is solved with the Gröbner basis method. Denote the eight equations obtained from (12) and (13) as constraint set $\mathcal{E}_1$ and the three additional equations as constraint set $\mathcal{E}_2$. Using $\mathcal{E}_1$ alone supports relative motion estimation with either non-intersecting or intersecting affine matching point pairs, while using $\mathcal{E}_1$ and $\mathcal{E}_2$ together reduces the number of possible solutions.
Once the rotation parameters $q_1$, $q_2$, $q_3$ are obtained, the rotation matrix $\mathbf{R}$ can be computed immediately from the Cayley parameterization. Then $[\lambda_{j1}, \lambda_{j2}, 1]^\top$ is determined by finding the null space of $\mathbf{M}(\mathbf{q})$ in (12), and $\mathbf{t}_1$ and $\mathbf{t}_2$ are computed from (8). Finally, the relative motion of the multi-camera system is calculated by composing the transformations $\{\mathbf{I}, \mathbf{t}_1\}$ and $\{\mathbf{R}, \mathbf{t}_2\}$.
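For the multi-camera case, the recovered quantities are combined through the four transformations listed above. The sketch below follows the fourth pose relation written in (9); the exact frame conventions of the extrinsics are an assumption here, and the function name is illustrative.

```python
import numpy as np

def multi_camera_relative_pose(R, t, R_i1, t_i1, R_i2, t_i2):
    """Compose perspective camera -> view 1 -> W -> view 2 -> perspective camera
    into the relative pose between the two perspective cameras:
    R' = R_i2^T R R_i1,  t' = R_i2^T (R t_i1 + t - t_i2)."""
    R_p = R_i2.T @ R @ R_i1
    t_p = R_i2.T @ (R @ t_i1 + t - t_i2)
    return R_p, t_p
```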
The invention can achieve the following technical effects:
1) Under the condition that the inertial measurement unit directly provides the relative rotation angle, the method makes full use of the affine matching point pair information between views, greatly reduces the number of point pairs required to solve the relative motion estimation problem of single-camera and multi-camera systems, and markedly improves the accuracy and robustness of the algorithm.
2) The invention exploits the fact that a rigidly mounted inertial measurement unit and camera undergo the same relative rotation angle during motion, so the relative rotation angle output by the inertial measurement unit can be used directly as the relative rotation angle of the camera or multi-camera system; the method thus applies to broader scenarios in which the mounting relationship between the inertial measurement unit and the camera is unknown or varies.
3) When the inertial measurement unit directly provides the relative rotation angle of a single camera, the relative motion of the single camera has 4 degrees of freedom. A novel minimal solution for single-camera relative motion estimation is provided that accurately estimates the relative motion from 2 affine matching point pairs.
4) When the inertial measurement unit directly provides the relative rotation angle of a multi-camera system, the relative motion of the multi-camera system has 5 degrees of freedom. A novel minimal solution for multi-camera relative motion estimation is provided that accurately estimates the relative motion from 2 intersecting or non-intersecting affine matching point pairs.
5) The method requires no calibration between the inertial measurement unit and the camera, making their integration more flexible and convenient; it offers high accuracy and efficiency and suits devices with limited computing power, such as autonomous drones, self-driving cars, and augmented reality devices.
For electronic devices such as mobile phones and drones in which an inertial measurement unit and a camera are rigidly mounted together, the inertial measurement unit provides relative rotation angle information for the camera, and two affine matching point pairs are used to estimate the relative motion of a single-camera or multi-camera system as follows:
1) Extract affine matching point pairs between the two views of the camera's relative motion with algorithms such as ASIFT and MODS; each pair comprises the image coordinates of the corresponding points and the local affine matrix of their neighborhood correspondence.
2) Obtain the relative rotation angle directly output by the inertial measurement unit rigidly mounted with the camera.
3) Solve for the relative motion of the single-camera or multi-camera system from the extracted affine matching point pairs, the relative rotation angle output by the inertial measurement unit, and the relative motion estimation algorithm provided by the invention; meanwhile, reject mismatched pairs among the affine matching point pairs within a RANSAC framework and recover the camera's relative motion result.
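Step 3's RANSAC-based rejection of mismatches can be organized as a standard hypothesize-and-verify loop around the 2-AC minimal solver. A sketch with hypothetical callables `solver` and `inlier_fn` (the patent does not prescribe this interface):

```python
import numpy as np

def ransac_relative_motion(acs, theta, solver, inlier_fn, iters=1000, tol=1e-3):
    """RANSAC wrapper around a hypothetical 2-AC minimal solver.
    acs: list of affine matching point pairs (x1, x2, A);
    solver(ac_a, ac_b, theta): candidate (R, t) pairs from two ACs and the angle;
    inlier_fn(R, t, ac): residual of one pair under a candidate motion."""
    best, best_inliers = None, -1
    rng = np.random.default_rng(0)
    for _ in range(iters):
        i, j = rng.choice(len(acs), size=2, replace=False)
        for R, t in solver(acs[i], acs[j], theta):
            inliers = sum(inlier_fn(R, t, ac) < tol for ac in acs)
            if inliers > best_inliers:
                best, best_inliers = (R, t), inliers
    return best, best_inliers
```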
The technical features of the above embodiments can be combined arbitrarily. For brevity, not every possible combination of the technical features in the above embodiments is described; nevertheless, as long as a combination of technical features contains no contradiction, it should be considered within the scope of this specification.
The above-mentioned embodiments express only several implementations of the present application, and their description is specific and detailed, but should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art may make several variations and modifications without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (7)

1. A camera relative motion estimation method under a known relative rotation angle condition, the method comprising:
acquiring at least two affine matching point pairs from a first view and a second view captured by a camera, and selecting the j-th affine matching point pair to establish a world reference frame; the origin of the world reference frame being the position of the j-th affine matching point pair in three-dimensional space, and the coordinate axes of the world reference frame being aligned with those of the first view;
acquiring a first pose relation between the first view and the second view, a second pose relation between the first view and the world reference frame, and a third pose relation between the second view and the world reference frame; each of the first, second, and third pose relations comprising a rotation matrix and a translation vector;
parameterizing the rotation matrix and the translation vectors, and determining a rotation parameter constraint on the unknowns of the rotation matrix from the relative rotation angle between the first view and the second view;
expressing the first pose relation with the parameterized rotation matrix and translation vector and obtaining the corresponding essential matrix;
acquiring the two affine transformation constraints determined by the j-th affine matching point pair from the essential matrix and its local affine transformation matrix, and acquiring, for the other affine matching point pair, one epipolar geometric constraint between the first view and the second view and the two affine transformation constraints determined by the essential matrix and its local affine transformation matrix;
and solving for the rotation matrix and the translation vectors from the two affine transformation constraints of the j-th affine matching point pair together with the epipolar geometric constraint and affine transformation constraints of the other affine matching point pair, and determining the relative motion of the camera from the rotation matrix and the translation vectors.
2. The method of claim 1, wherein determining the rotation parameter constraint on the unknowns of the rotation matrix from the relative rotation angle between the first view and the second view comprises:
obtaining the parameterized rotation matrix as
$$\mathbf{R} = \frac{1}{1+q_1^2+q_2^2+q_3^2}\begin{bmatrix} 1+q_1^2-q_2^2-q_3^2 & 2q_1q_2-2q_3 & 2q_1q_3+2q_2 \\ 2q_1q_2+2q_3 & 1-q_1^2+q_2^2-q_3^2 & 2q_2q_3-2q_1 \\ 2q_1q_3-2q_2 & 2q_2q_3+2q_1 & 1-q_1^2-q_2^2+q_3^2 \end{bmatrix},$$
wherein $\mathbf{q} = [1, q_1, q_2, q_3]^\top$ is a homogeneous quaternion vector and R denotes the rotation matrix of the first pose relation;
and determining, from the relative rotation angle between the first view and the second view, the rotation parameter constraint on the unknowns of the rotation matrix as
$$q_1^2 + q_2^2 + q_3^2 = \tan^2(\theta/2),$$
wherein $\theta$ denotes the relative rotation angle.
3. The method of claim 1, wherein expressing the first pose relation with the parameterized rotation matrix and translation vector and obtaining the corresponding essential matrix comprises:
obtaining the translation vector of the second pose relation and the translation vector of the third pose relation by parameterizing the translation vectors as
$$\mathbf{t}_1 = \lambda_{j1}\,\bar{\mathbf{x}}_{j1}, \qquad \mathbf{t}_2 = \lambda_{j2}\,\bar{\mathbf{x}}_{j2},$$
wherein $\mathbf{t}_1$ denotes the translation vector of the second pose relation, $\lambda_{j1}$ the unknown depth parameter of its parameterization, $\bar{\mathbf{x}}_{j1}$ the unit vector of the normalized homogeneous image coordinates in the first view, $\mathbf{t}_2$ the translation vector of the third pose relation, $\lambda_{j2}$ its unknown depth parameter, and $\bar{\mathbf{x}}_{j2}$ the unit vector of the normalized homogeneous image coordinates in the second view;
expressing the first pose relation with the parameterized rotation matrix and translation vectors as
$$\{\mathbf{R}, \mathbf{t}\} = \{\mathbf{R},\; \lambda_{j2}\,\bar{\mathbf{x}}_{j2} - \lambda_{j1}\,\mathbf{R}\,\bar{\mathbf{x}}_{j1}\},$$
wherein $\mathbf{t}$ denotes the translation vector of the first pose relation, $\mathbf{I}$ denotes the identity matrix (the rotation from the first view to the world reference frame), and R denotes the rotation matrix of the first pose relation;
and obtaining the corresponding essential matrix as
$$\mathbf{E} = [\mathbf{t}]_\times\,\mathbf{R},$$
wherein E denotes the essential matrix and $[\mathbf{t}]_\times$ denotes the antisymmetric matrix of $\mathbf{t}$.
4. The method of claim 3, wherein the epipolar geometric constraint is
$$\mathbf{x}_{i2}^\top\,\mathbf{E}\,\mathbf{x}_{i1} = 0,$$
wherein $\mathbf{x}_{i1}$ and $\mathbf{x}_{i2}$ are the normalized homogeneous image coordinates of a corresponding point pair in the first view and the second view;
and the affine transformation constraint is
$$\big(\mathbf{E}^\top \mathbf{x}_{i2}\big)_{(1:2)} + \mathbf{A}_i^\top \big(\mathbf{E}\,\mathbf{x}_{i1}\big)_{(1:2)} = \mathbf{0},$$
wherein the subscript (1:2) denotes the first two equations and $\mathbf{A}_i$ denotes the local affine transformation matrix corresponding to the normalized homogeneous image coordinates.
5. The method according to claim 3 or 4, wherein solving the rotation matrix and the translation vector according to the two affine transformation constraints corresponding to the jth affine matching point and the one epipolar geometric constraint and the two affine transformation constraints determined by the other affine matching points comprises:

determining, according to the rotation parameter constraint, that the relative motion parameters of the single camera have four degrees of freedom;

selecting the two affine transformation constraints corresponding to the jth affine matching point and the epipolar geometric constraints determined by the other affine matching points, and constructing a first solution model [equation image not reproduced], wherein the element terms of its coefficient matrix are second-order terms of the unknowns and the matrix has three rows and two columns;

selecting other affine matching points to establish a world reference frame, and obtaining a second solution model [equation image not reproduced], wherein the element terms of its coefficient matrix are likewise second-order terms of the unknowns;

obtaining, from the first solution model and the second solution model, six equations in the unknowns [equation images not reproduced]; and

obtaining algebraic solutions of the six equations by a Gröbner basis method, determining the rotation matrix R of the first pose relation from the algebraic solutions, determining the translation unknowns from the null space according to the rotation matrix R, and calculating the translation vector in the second pose relation and the translation vector in the third pose relation, wherein the notation in the six equations denotes the submatrices of the first two rows and the first two columns of the coefficient matrices of the first and second solution models, det(·) denotes the determinant of a matrix, and the remaining symbol denotes a two-row, two-column submatrix composed of the unknowns; a numerical sketch of the determinant and null-space steps follows this claim.
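The equation images for the two solution models are not reproduced in this text-only record, but the mechanics the claim describes — a three-row, two-column coefficient matrix forced to drop rank by zeroing its 2x2 subdeterminants (three minors per model, which would be consistent with six equations in total), then the translation read off the null space once the rotation is fixed — can be sketched generically. The matrix below is a toy rank-1 example, not the patent's actual polynomial coefficients, and the Gröbner basis solve itself is omitted:

```python
import itertools
import numpy as np

def two_by_two_minors(M):
    """Determinants of all 2x2 submatrices of a 3x2 matrix M; forcing them
    to zero enforces rank(M) <= 1, i.e. a nontrivial null space exists."""
    return [np.linalg.det(M[list(rows), :])
            for rows in itertools.combinations(range(M.shape[0]), 2)]

def null_space_direction(M):
    """Right null-space vector of M via SVD (up to scale and sign); with the
    rotation known, the translation direction is recovered this way."""
    _, _, Vt = np.linalg.svd(M)
    return Vt[-1]

# Toy rank-1 coefficient matrix: the second column is 3x the first.
v = np.array([1.0, -2.0, 0.5])
M = np.column_stack([v, 3.0 * v])
print(two_by_two_minors(M))       # ~[0, 0, 0]
print(null_space_direction(M))    # ~ +/- (3, -1) / sqrt(10)
```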
6. The method of claim 5, wherein when the camera is a multi-camera system, the method further comprises:

acquiring the external parameters (R_i, t_i) of each camera C_i in the multi-camera system, R_i representing a rotation matrix and t_i representing a translation vector, wherein the multi-camera system comprises perspective cameras C_i that capture the first view or the second view;

parameterizing the translation vector in the multi-camera system by a Plücker vector, obtaining [equation image not reproduced], wherein i is the serial number of the camera, j is the serial number of the affine matching point pair, and k is the serial number of the view; the unit direction vector u is calculated [equation image not reproduced] from the normalized homogeneous image coordinates of the corresponding point in camera C_i, v represents the moment vector, and λ represents the unknown depth parameter of the translation vector parameterization;

obtaining, from the parameterized rotation matrix and translation vector, a fourth pose relation between the two perspective cameras corresponding to the first view and the second view [equation image not reproduced]; and

calculating the essential matrix corresponding to the fourth pose relation [equation image not reproduced], wherein R_i and R_j respectively represent the rotation matrices of cameras C_i and C_j, t_i and t_j respectively represent the translation vectors of cameras C_i and C_j, R represents the rotation matrix between the first view and the second view, and t represents the translation vector between the first view and the second view; a sketch of the Plücker ray construction and the cross-camera pose composition follows this claim.
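A sketch of the generalized-camera bookkeeping in this claim, under an assumed camera-to-rig extrinsics convention (X_rig = R_i X_cam + t_i, so the camera center sits at t_i in the rig frame). The composed cross-view pose below is one standard way to relate camera C_i in the first view to camera C_j in the second view; since the claim's own equation images are not reproduced, it should be read as an assumption, not the patent's exact formula:

```python
import numpy as np

def pluecker_ray(R_i, t_i, x):
    """Pluecker coordinates (u, v) of the viewing ray of rig camera C_i for a
    normalized homogeneous image point x: u is the unit ray direction in the
    rig frame, v = t_i x u is the moment vector; ray points are t_i + lam*u."""
    u = R_i @ x
    u = u / np.linalg.norm(u)
    v = np.cross(t_i, u)
    return u, v

def cross_view_pose(R, t, R_i, t_i, R_j, t_j):
    """Pose mapping camera C_i (view 1) into camera C_j (view 2), given the
    rig motion X2 = R X1 + t and camera-to-rig extrinsics (R_i,t_i), (R_j,t_j)."""
    R_ij = R_j.T @ R @ R_i
    t_ij = R_j.T @ (R @ t_i + t - t_j)
    return R_ij, t_ij

def skew(t):
    return np.array([[0.0,  -t[2],  t[1]],
                     [t[2],  0.0,  -t[0]],
                     [-t[1], t[0],  0.0]])

# Example: two cameras 0.5 m apart on a rig, rig motion of 10 degrees yaw.
R_i, t_i = np.eye(3), np.array([-0.25, 0.0, 0.0])
R_j, t_j = np.eye(3), np.array([0.25, 0.0, 0.0])
a = np.deg2rad(10.0)
R = np.array([[np.cos(a),  0.0, np.sin(a)],
              [0.0,        1.0, 0.0],
              [-np.sin(a), 0.0, np.cos(a)]])
t = np.array([0.1, 0.0, 0.3])

R_ij, t_ij = cross_view_pose(R, t, R_i, t_i, R_j, t_j)
E_ij = skew(t_ij) @ R_ij              # essential matrix of the composed pose
u, v = pluecker_ray(R_i, t_i, np.array([0.1, -0.05, 1.0]))
```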
7. The method according to claim 6, wherein solving the rotation matrix and the translation vector according to the two affine transformation constraints corresponding to the jth affine matching point and the one epipolar geometric constraint and the two affine transformation constraints determined by the other affine matching points further comprises:

determining, according to the rotation parameter constraint, that the relative motion parameters of the multi-camera system have five degrees of freedom;

selecting the two affine transformation constraints corresponding to the jth affine matching point and an epipolar geometric constraint and an affine transformation constraint determined by the other affine matching points, and constructing a third solution model [equation image not reproduced];

selecting other affine matching points to establish a world reference frame, and obtaining a fourth solution model [equation image not reproduced];

obtaining, from the third solution model and the fourth solution model, eight equations in the unknowns [equation images not reproduced]; and

obtaining algebraic solutions of the eight equations by a Gröbner basis method, determining the rotation matrix R of the first pose relation of the multi-camera system from the algebraic solutions, determining the translation unknowns from the null space, and calculating the translation vector in the second pose relation and the translation vector in the third pose relation; a toy illustration of the Gröbner basis step follows this claim.
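Claims 5 and 7 both culminate in solving a small polynomial system (six or eight equations) with a Gröbner basis. SymPy's groebner routine illustrates the idea on a stand-in system of two toy polynomials; the patent's actual equations, whose coefficients come from the solution models, are not reproduced here:

```python
import sympy as sp

q1, q2 = sp.symbols('q1 q2')

# Stand-in polynomial system (NOT the patent's six/eight equations):
# a unit-circle constraint plus one quadratic coupling.
polys = [q1**2 + q2**2 - 1,
         q1 * q2 - sp.Rational(1, 4)]

# A lexicographic Groebner basis triangularizes a zero-dimensional system:
# its last element is univariate, so roots can be back-substituted.
G = sp.groebner(polys, q1, q2, order='lex')
print(G.exprs)

# For a system this small, sympy can also solve it directly.
for sol in sp.solve(polys, [q1, q2]):
    print(sol)
```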
CN202110596663.4A 2021-05-31 2021-05-31 Camera relative motion estimation method under known relative rotation angle condition Active CN113048985B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110596663.4A CN113048985B (en) 2021-05-31 2021-05-31 Camera relative motion estimation method under known relative rotation angle condition

Publications (2)

Publication Number Publication Date
CN113048985A CN113048985A (en) 2021-06-29
CN113048985B (en) 2021-08-06

Family

ID=76518592

Country Status (1)

Country Link
CN (1) CN113048985B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115451958B (en) * 2022-11-10 2023-02-03 National University of Defense Technology Camera absolute attitude optimization method based on relative rotation angle

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6137902A (en) * 1997-07-22 2000-10-24 Atr Human Information Processing Research Laboratories Linear estimation method for three-dimensional position with affine camera correction
CN101226638A (en) * 2007-01-18 2008-07-23 Institute of Automation, Chinese Academy of Sciences Method and apparatus for standardization of multiple camera system
CN111476842A (en) * 2020-04-10 2020-07-31 National University of Defense Technology Camera relative pose estimation method and system
CN111696158A (en) * 2020-06-04 2020-09-22 National University of Defense Technology Affine matching point pair-based multi-camera system relative pose estimation method and device
CN112629565A (en) * 2021-03-08 2021-04-09 National University of Defense Technology Method, device and equipment for calibrating rotation relation between camera and inertial measurement unit

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Making affine correspondences work in camera geometry computation; Dániel Baráth et al.; Computer Vision - ECCV 2020, 16th European Conference; 2020-12-31; pp. 723-740 *
Relative Pose Estimation With a Single Affine Correspondence; Guan Banglei; IEEE Transactions on Cybernetics; 2021-04-28; pp. 1-12 *
A decoupled relative pose estimation method for multi-camera systems without a common field of view; Tian Miao et al.; Acta Optica Sinica; 2021-03-31; Vol. 41, No. 5; pp. 1-8 *
A matching method for heterologous images under affine deformation; Tu Guoyong et al.; Journal of Computer-Aided Design & Computer Graphics; 2015-08-31; Vol. 27, No. 8; pp. 1512-1517 *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant