CN113048985A - Camera relative motion estimation method under known relative rotation angle condition - Google Patents


Info

Publication number
CN113048985A
Authority
CN
China
Prior art keywords
camera
view
affine
matrix
translation vector
Prior art date
Legal status: Granted
Application number
CN202110596663.4A
Other languages
Chinese (zh)
Other versions
CN113048985B (en)
Inventor
关棒磊 (Guan Banglei)
谭泽 (Tan Ze)
李璋 (Li Zhang)
Current Assignee
National University of Defense Technology
Original Assignee
National University of Defense Technology
Priority date
Filing date
Publication date
Application filed by National University of Defense Technology filed Critical National University of Defense Technology
Priority to CN202110596663.4A
Publication of CN113048985A
Application granted
Publication of CN113048985B
Status: Active

Classifications

    • G01C21/20: Instruments for performing navigational calculations
    • G01C21/165: Navigation by integrating acceleration or speed (inertial navigation) combined with non-inertial navigation instruments
    • G06F17/16: Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G06T7/262: Analysis of motion using transform domain methods, e.g. Fourier domain methods
    • G06T7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/75: Determining position or orientation of objects or cameras using feature-based methods involving models


Abstract

The application relates to a camera relative motion estimation method based on affine matching point pairs under the condition of a known relative rotation angle. When an inertial measurement unit is fixedly connected to the camera, the relative rotation angles of the inertial measurement unit and the camera during motion are identical, so the relative rotation angle output by the inertial measurement unit can be used directly as the relative rotation angle of the camera or of a multi-camera system. The method therefore applies to broader scenarios in which the mounting relationship between the inertial measurement unit and the camera is unknown or varies. With the relative rotation angle of the camera or multi-camera system known, the relative motion of a single camera or a multi-camera system can be estimated from only two affine matching point pairs, using their image coordinates and the local affine transformation matrices of their neighborhoods. This improves both accuracy and efficiency, making the method suitable for devices with limited computing power, such as autonomously navigating unmanned aerial vehicles, self-driving cars, and augmented reality devices.

Description

Camera relative motion estimation method under known relative rotation angle condition
Technical Field
The application relates to the technical field of positioning, in particular to a camera relative motion estimation method under the condition of known relative rotation angle.
Background
Relative motion estimation of a single camera or of multiple cameras is a fundamental problem in three-dimensional vision, with applications such as robot positioning and mapping, augmented reality, and autonomous driving. A single-camera system is modeled by the central perspective projection model, and single-camera relative motion estimation typically uses the essential matrix algorithm with 5 homonymous point pairs or the homography matrix algorithm with 4 homonymous point pairs. A multi-camera system consists of several single cameras fixed to one rigid body; it is modeled by the generalized camera model and has no single center of projection. Multi-camera relative motion estimation typically uses a minimal solution with 6 homonymous point pairs or a linear solution method with 17 homonymous point pairs.
At present, electronic devices such as mobile phones and unmanned aerial vehicles commonly contain both an inertial measurement unit and a camera, fixedly connected to each other, so the inertial measurement unit can provide rotation angle information for camera relative motion estimation. Two cases mainly arise. (1) If the mounting relationship between the inertial measurement unit and the camera has been accurately calibrated in advance, the rotation angle output by the inertial measurement unit directly provides an attitude angle for the camera, reducing the number of pose parameters to be solved in camera relative motion estimation. For example, when the inertial measurement unit provides a common gravity direction for the camera, the relative motion of a single-camera system can be estimated with 3 homonymous point pairs, and that of a multi-camera system with 4 homonymous point pairs. (2) If the mounting relationship between the inertial measurement unit and the camera is unknown, the relative rotation angle output by the inertial measurement unit can still be used directly as the relative rotation angle of the camera, because under fixed mounting the two undergo the same relative rotation during motion; this likewise reduces the pose parameters to be solved. In this case the relative motion of single-camera and multi-camera systems can be estimated with 4 and 5 homonymous point pairs, respectively.
Because the inertial measurement unit and the camera then do not need to be calibrated to each other, integrating them is more flexible and convenient, and the approach has wide application in fields such as three-dimensional reconstruction and visual odometry.
Traditional relative motion estimation algorithms usually obtain homonymous point pairs with feature algorithms such as SIFT and SURF. In multi-view geometry estimation, affine matching point pairs extracted by feature algorithms such as ASIFT and MODS are receiving increasing attention because they carry more image point pair information. An affine matching point pair includes not only the image coordinates of a homonymous point pair but also the local affine matrix encoding the neighborhood information between the two points. In the epipolar geometric relationship, each affine matching point pair provides three constraint equations, which effectively reduces the number of point pairs required by relative motion estimation methods.
Disclosure of Invention
In view of the above, it is necessary to provide a camera relative motion estimation method that can solve the above problem under the known-relative-rotation-angle condition by using affine matching point pairs.
A camera relative motion estimation method under a known relative rotation angle condition, the method comprising:
acquiring at least two affine matching point pairs between a first view and a second view shot by a camera, and selecting the jth affine matching point pair to establish a world reference system; the origin of the world reference system is the position of the jth affine matching point pair in three-dimensional space, and the coordinate axes of the world reference system are aligned with the first view;
acquiring a first pose relation between the first view and the second view, a second pose relation between the first view and the world reference system, and a third pose relation between the second view and the world reference system; the first pose relation, the second pose relation and the third pose relation each comprise a rotation matrix and a translation vector;
parameterizing the rotation matrix and the translation vector, and determining a rotation parameter constraint on the unknowns of the rotation matrix according to the relative rotation angle between the first view and the second view;
representing the first pose relation with the parameterized rotation matrix and translation vector, and acquiring the corresponding essential matrix;
acquiring the two affine transformation constraints determined by the jth affine matching point pair from the essential matrix and its local affine matrix, and acquiring the one epipolar geometric constraint between the first view and the second view and the two affine transformation constraints determined by the other affine matching point pair from the essential matrix and its local affine matrix;
and solving for the rotation matrix and the translation vector according to the two affine transformation constraints corresponding to the jth affine matching point pair together with the epipolar geometric constraint and affine transformation constraints determined by the other affine matching point pair, and determining the relative motion of the camera from the rotation matrix and the translation vector.
The camera relative motion estimation method under the known-relative-rotation-angle condition uses the relative rotation angle provided by an inertial measurement unit together with the constraints between affine matching point pairs and the camera motion model. It provides a minimal solution for relative motion estimation based on two affine matching point pairs, estimating the motion of single-camera and multi-camera systems respectively. This greatly reduces the number of point pairs required to solve the relative motion estimation problem for single-camera and multi-camera systems, markedly improves the accuracy and robustness of the algorithm, and suits the case where the mounting relationship between the inertial measurement unit and the camera is unknown or varies.
Drawings
FIG. 1 is a diagram of a method for estimating relative camera motion under a known relative rotation angle condition in one embodiment;
FIG. 2 is a schematic diagram of a single camera parameter distribution in one embodiment;
FIG. 3 is a diagram illustrating multi-camera parameter distribution in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In one embodiment, as shown in fig. 1, there is provided a camera relative motion estimation method under a known relative rotation angle condition, comprising the steps of:
and 102, acquiring at least two affine matching point pairs in a first view and a second view shot by the camera, and selecting a jth affine matching point pair to establish a world reference system.
The origin of the world reference system is the position of the jth affine matching point pair in the three-dimensional space, and the coordinate axis direction of the world reference system is consistent with the first view direction.
And 104, acquiring a first posture relation between the first view and the second view, acquiring a second posture relation between the first view and the world reference system, and acquiring a third posture relation between the second view and the world reference system.
The first position relation, the second position relation and the third position relation all comprise a rotation matrix and a translation vector.
And 106, parameterizing the rotation matrix and the translation vector, determining rotation parameter constraints corresponding to unknown numbers of the rotation matrix according to the relative rotation angle between the first view and the second view, representing the first position and orientation relation by adopting the parameterized rotation matrix and translation vector, and acquiring a corresponding essential matrix.
And 108, acquiring two affine transformation constraints corresponding to the intrinsic matrix determined by the jth affine matching point and the affine matching matrix in the affine matching points, and acquiring one epipolar geometric constraint of the first view and the second view determined by other affine matching points and two affine transformation constraints corresponding to the intrinsic matrix and the affine matching matrix in the affine matching points.
And 110, obtaining a rotation matrix and a translation vector according to two affine transformation constraints corresponding to the jth affine matching point and one epipolar geometric constraint and two affine transformation constraints determined by other affine matching points, and determining a relative motion relation of the camera according to the rotation matrix and the translation vector.
For single-camera relative motion estimation, assume that the jth affine matching point pair is represented as (x_j, x'_j, A_j), where x_j and x'_j are the normalized homogeneous image coordinates of the homonymous point pair in the first view and the second view, respectively, and A_j is a 2 x 2 local affine transformation matrix that characterizes the affine transformation relation in an infinitesimal neighborhood around x_j and x'_j. The corresponding unit direction vectors of the homonymous point pair can be calculated as

f̄_j = x_j / ||x_j||,    f̄'_j = x'_j / ||x'_j||.

The input conditions of this embodiment are two affine matching point pairs (at least one affine matching point pair and one homonymous point pair) and the relative rotation angle of the single camera provided by the inertial measurement unit.
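The normalization of image coordinates into unit direction vectors described above can be sketched in a few lines of NumPy; the specific coordinate values and the matrix A below are illustrative, not taken from the patent:

```python
import numpy as np

def unit_direction(x):
    """Map a normalized homogeneous image point [u, v, 1] to the unit
    direction vector f = x / ||x|| used by the method."""
    x = np.asarray(x, dtype=float)
    return x / np.linalg.norm(x)

# A hypothetical affine matching point pair (x_j, x_j', A_j):
x1 = np.array([0.20, -0.10, 1.0])   # normalized coordinates, first view
x2 = np.array([0.25, -0.05, 1.0])   # normalized coordinates, second view
A = np.array([[1.02, 0.01],         # 2x2 local affine transformation of
              [-0.03, 0.98]])       # the infinitesimal neighborhood

f1, f2 = unit_direction(x1), unit_direction(x2)
```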
In one embodiment, the rotation matrix obtained by parameterization is:

R = (1 / (1 + q_x² + q_y² + q_z²)) ·
[ 1 + q_x² − q_y² − q_z²    2(q_x q_y − q_z)          2(q_x q_z + q_y)
  2(q_x q_y + q_z)          1 − q_x² + q_y² − q_z²    2(q_y q_z − q_x)
  2(q_x q_z − q_y)          2(q_y q_z + q_x)          1 − q_x² − q_y² + q_z² ]

where (1, q_x, q_y, q_z) is a quaternion homogeneous vector and R represents the rotation matrix of the first pose relation.
According to the relative rotation angle between the first view and the second view, the rotation parameter constraint on the unknowns of the rotation matrix is determined as:

q_x² + q_y² + q_z² = tan²(θ/2),

where θ denotes the relative rotation angle.
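As a concrete check of this parameterization, the following sketch builds R from (q_x, q_y, q_z) and verifies the rotation-angle constraint for a rotation about a single axis; the axis and the 30-degree angle are illustrative choices, not values from the patent:

```python
import numpy as np

def cayley_rotation(qx, qy, qz):
    """Rotation matrix from the Cayley parameterization with quaternion
    homogeneous vector (1, qx, qy, qz)."""
    s = 1.0 + qx*qx + qy*qy + qz*qz
    return np.array([
        [1 + qx*qx - qy*qy - qz*qz, 2*(qx*qy - qz),            2*(qx*qz + qy)],
        [2*(qx*qy + qz),            1 - qx*qx + qy*qy - qz*qz, 2*(qy*qz - qx)],
        [2*(qx*qz - qy),            2*(qy*qz + qx),            1 - qx*qx - qy*qy + qz*qz],
    ]) / s

theta = np.deg2rad(30.0)                           # known relative rotation angle
q = np.array([1.0, 0.0, 0.0]) * np.tan(theta / 2)  # rotation about the x-axis
R = cayley_rotation(*q)
# With q chosen this way, qx^2 + qy^2 + qz^2 = tan^2(theta/2) holds by
# construction, and R rotates by exactly theta.
```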
In one embodiment, the translation vectors of the second pose relation and the third pose relation obtained by parameterization are, respectively:

t_1 = d_j f̄_j,    t_2 = d'_j f̄'_j,

where t_1 is the translation vector of the second pose relation, d_j is the unknown depth parameter of its parameterization, f̄_j is the unit vector of the normalized homogeneous image coordinates in the first view, t_2 is the translation vector of the third pose relation, d'_j is its unknown depth parameter, and f̄'_j is the unit vector of the normalized homogeneous image coordinates in the second view.
With the parameterized rotation matrix and translation vectors, the first pose relation is represented as:

t = t_2 − R t_1 = d'_j f̄'_j − d_j R f̄_j,

where t is the translation vector of the first pose relation and the rotation of the first view relative to the world reference system is the identity matrix I.
The corresponding essential matrix is obtained as:

E = [t]× R,

where E denotes the essential matrix and [t]× denotes the antisymmetric (skew-symmetric) matrix of t.
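A minimal NumPy sketch of this depth parameterization follows; the rotation and image points are illustrative. It also shows numerically that, for the point placed at the world origin, the epipolar residual vanishes for any choice of depths, which is why that selected pair contributes no epipolar equation:

```python
import numpy as np

def skew(t):
    """Antisymmetric matrix [t]x such that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def essential_from_depths(R, f1, f2, d1, d2):
    """E = [t]x R with t = d2*f2 - d1*R@f1, i.e. the translation of the
    first pose relation expressed through the two unknown depths."""
    t = d2 * f2 - d1 * R @ f1
    return skew(t) @ R

# Illustrative unit direction vectors of the selected (origin) pair:
f1 = np.array([0.10, 0.20, 1.0]); f1 /= np.linalg.norm(f1)
f2 = np.array([0.15, 0.10, 1.0]); f2 /= np.linalg.norm(f2)
E = essential_from_depths(np.eye(3), f1, f2, d1=2.0, d2=3.0)
```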
In one embodiment, the epipolar geometric constraint is:

x'_jᵀ E x_j = 0,

where x_j and x'_j are the normalized homogeneous image coordinates of the homonymous point pair in the first view and the second view.
The affine transformation constraint is:

(Eᵀ x'_j)_(1:2) + A_jᵀ (E x_j)_(1:2) = 0,

where the subscript (1:2) represents the first two equations.
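Both constraints can be evaluated numerically as below. The pure-translation, fronto-parallel-plane configuration is a hand-built example (not from the patent) in which the local affine map is the identity, chosen only so that both residuals are exactly zero:

```python
import numpy as np

def skew(t):
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def epipolar_residual(E, x1, x2):
    """Scalar residual of x2^T E x1 = 0."""
    return x2 @ E @ x1

def affine_residual(E, x1, x2, A):
    """2-vector residual of (E^T x2)_(1:2) + A^T (E x1)_(1:2) = 0."""
    return (E.T @ x2)[:2] + A.T @ (E @ x1)[:2]

# Pure sideways translation t, fronto-parallel plane at depth Z:
t = np.array([0.4, -0.2, 0.0])
Z = 5.0
E = skew(t)              # R = I, so E = [t]x
x1 = np.array([0.3, 0.1, 1.0])
x2 = x1 + t / Z          # image motion of a point on the plane
A = np.eye(2)            # the local affine map is identity in this case
```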
In one embodiment, the relative motion of the single camera is determined to have four degrees of freedom under the rotation parameter constraint. Selecting the two affine transformation constraints corresponding to the jth affine matching point pair and the one epipolar geometric constraint determined by the other affine matching point pair, a first solving model is constructed:

M(q) [d_j, d'_j]ᵀ = 0,

where M(q) is a matrix of three rows and two columns whose element terms are second-order terms of the unknowns q_x, q_y and q_z.
Selecting the other affine matching point pair to establish the world reference system, a second solving model is obtained:

M̂(q) [d_i, d'_i]ᵀ = 0,

where the element terms of M̂(q) are likewise second-order terms of the unknowns q_x, q_y and q_z.
From the first solving model and the second solving model, six equations in the unknowns q_x, q_y, q_z are obtained: since each system admits a non-zero solution, every 2 x 2 sub-determinant of M(q) and of M̂(q) must vanish, i.e.

det(M(q)_(1:2)) = 0,  det(M(q)_(2:3)) = 0,  det(M(q)_(1,3)) = 0,

and the corresponding three determinants of M̂(q), where the subscripts select two rows of the matrix and det(·) denotes the determinant of the resulting 2 x 2 sub-matrix.
Algebraic solutions of the six equations are obtained with the Gröbner basis method; the rotation matrix R of the first pose relation is determined from the algebraic solution, the depth vector [d_j, d'_j]ᵀ is determined from the null space of M(q), and the translation vectors of the second pose relation and the third pose relation are then computed from the depths.
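The rank-deficiency step can be illustrated numerically: whenever a 3 x 2 coefficient matrix has a non-trivial null vector, all three of its 2 x 2 sub-determinants vanish, and the depth vector is recovered from the null space. The matrix below is hand-made rank-1 data, not a real M(q):

```python
import numpy as np
from itertools import combinations

def two_by_two_minors(M):
    """All 2x2 sub-determinants of a 3x2 matrix; they must all vanish
    when M @ d = 0 has a non-trivial solution d."""
    return [np.linalg.det(M[list(rows), :]) for rows in combinations(range(3), 2)]

def null_vector(M):
    """Unit null vector of M via SVD (the depth vector up to scale)."""
    _, _, Vt = np.linalg.svd(M)
    return Vt[-1]

# Rank-1 example: every row is proportional to [2, -1], so M @ d = 0
# has the non-trivial solution d proportional to [1, 2]:
M = np.outer(np.array([1.0, -0.5, 2.0]), np.array([2.0, -1.0]))
d = null_vector(M)
```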
In one embodiment, when the camera is a multi-camera system, the extrinsic parameters (R_k, t_k) of each camera C_k in the multi-camera system are acquired, where the multi-camera system comprises the perspective cameras that capture the first view and the second view.
The observation rays of the multi-camera system are parameterized with Plücker vectors, where k is the serial number of the camera, j is the serial number of the affine matching point pair, and i is the view serial number. The unit direction vector f̄_kj^i can be calculated by normalizing the corresponding normalized homogeneous image coordinate x_kj^i of camera C_k.
A fourth pose relation between the two perspective cameras corresponding to the first view and the second view is obtained from the parameterized rotation matrix and translation vectors, and the essential matrix corresponding to the fourth pose relation is calculated as

E = [t']× R',

where (R', t') denotes the fourth pose relation.
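A sketch of the Plücker parameterization of a rig camera's observation ray follows, with the rig body frame as reference; the extrinsics (a 0.3 m offset along the rig's x-axis) are illustrative assumptions:

```python
import numpy as np

def plucker_line(R_k, t_k, x):
    """Plücker coordinates (direction, moment) of the observation ray of a
    camera with extrinsics (R_k, t_k) inside the multi-camera rig, for a
    normalized homogeneous image point x."""
    f = R_k @ (x / np.linalg.norm(x))   # unit ray direction in the rig frame
    moment = np.cross(t_k, f)           # moment of the ray about the rig origin
    return f, moment

# Illustrative extrinsics: camera offset 0.3 m along the rig's x-axis.
R_k = np.eye(3)
t_k = np.array([0.3, 0.0, 0.0])
f, m = plucker_line(R_k, t_k, np.array([0.0, 0.0, 1.0]))
# Any point p = t_k + s*f on the ray satisfies the incidence relation
# cross(p, f) == m, which is what the Plücker pair (f, m) encodes.
p = t_k + 2.5 * f
```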
in one embodiment, the relative motion parameters of the multi-camera system are determined to be five degrees of freedom according to the rotation parameter constraints;
selecting two affine transformation constraints corresponding to the jth affine matching point and an epipolar geometric constraint and an affine transformation constraint determined by the other affine matching points, and constructing a third solution model as follows:
Figure 799356DEST_PATH_IMAGE067
selecting other affine matching points to establish a world reference system, and obtaining a fourth solution model as follows:
Figure 196839DEST_PATH_IMAGE068
obtaining the unknown number according to the third solving model and the fourth solving model
Figure 66706DEST_PATH_IMAGE069
The eight equations of (1) are:
Figure 942258DEST_PATH_IMAGE070
obtaining algebraic solutions of eight equations through a Gr baby basis solution, determining a rotation matrix R of a first attitude relationship of the multi-camera system according to the algebraic solutions, and determining a rotation matrix R of the first attitude relationship of the multi-camera system according to the rotation matrix R
Figure 96028DEST_PATH_IMAGE071
Mid-null space determination
Figure 336516DEST_PATH_IMAGE072
According to
Figure 83892DEST_PATH_IMAGE073
Is calculated to obtainA translation vector in the two-position posture relation and a translation vector in the third-position posture relation.
The single-camera and multi-camera cases are further described below, respectively.
Single camera
Any one affine matching point pair is selected to define the world reference frame W, as shown in fig. 2. Assuming that the jth affine matching point pair is currently selected, its position in three-dimensional space is taken as the origin of W, and the coordinate axes of W are aligned with view 1 (View1 in fig. 2). The pose relation between view 1 and view 2 (View2 in fig. 2) is denoted (R, t), the pose relation between view 1 and the reference frame W is denoted (R_1, t_1), and the pose relation between view 2 and the reference frame W is denoted (R_2, t_2). In particular, R_1 = I and R_2 = R.
The rotation matrix R is represented using the Cayley parameterization:

R = (1 / (1 + q_x² + q_y² + q_z²)) ·
[ 1 + q_x² − q_y² − q_z²    2(q_x q_y − q_z)          2(q_x q_z + q_y)
  2(q_x q_y + q_z)          1 − q_x² + q_y² − q_z²    2(q_y q_z − q_x)
  2(q_x q_z − q_y)          2(q_y q_z + q_x)          1 − q_x² − q_y² + q_z² ]

where (1, q_x, q_y, q_z) is a quaternion homogeneous vector. It is assumed that the relative rotation angle between view 1 and view 2 is known. Notably, the relative rotation angle can be provided directly by the inertial measurement unit even when the mounting relationship between the inertial measurement unit and the camera is unknown or varies. The three unknowns q_x, q_y, q_z satisfy the constraint:

q_x² + q_y² + q_z² = tan²(θ/2),

where θ is the relative rotation angle between the two views.
Next, t_1 and t_2 are parameterized as linear functions of two unknown depth parameters d_j and d'_j:

t_1 = d_j f̄_j,    t_2 = d'_j f̄'_j.
the relative motion between the two views is determined by the combination of two transformations: (i) from view1 to W, (ii) from W to view 2. Unknown number
Figure 515246DEST_PATH_IMAGE092
Figure 43310DEST_PATH_IMAGE093
And
Figure 944270DEST_PATH_IMAGE094
is parameterized as
Figure 21816DEST_PATH_IMAGE095
. In the form of relative movement
Figure 350029DEST_PATH_IMAGE096
Expressed as:
Figure 693286DEST_PATH_IMAGE097
Figure 7724DEST_PATH_IMAGE098
the essential matrix can be represented as:
Figure 815143DEST_PATH_IMAGE099
Figure 517520DEST_PATH_IMAGE100
by general formula
Figure 535023DEST_PATH_IMAGE101
Carry-in type
Figure 777786DEST_PATH_IMAGE102
It can be seen that each element in the essential matrix is associated with
Figure 642973DEST_PATH_IMAGE103
The correlation is linear.
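That linearity in the depths is easy to confirm numerically; the rotation and direction vectors below are random stand-ins:

```python
import numpy as np

def skew(t):
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def E_of(d1, d2, R, f1, f2):
    """Essential matrix for t = d2*f2 - d1*R@f1 (values illustrative)."""
    return skew(d2 * f2 - d1 * R @ f1) @ R

rng = np.random.default_rng(0)
R, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(R) < 0:
    R = -R                        # make it a proper rotation
f1 = rng.normal(size=3)
f2 = rng.normal(size=3)
E = E_of(1.0, 2.0, R, f1, f2)
# E is linear and homogeneous in (d1, d2): scaling the depths scales E,
# and the contributions of d1 and d2 superpose.
```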
An affine matching point pair yields three independent constraints for geometric model estimation: one epipolar geometric constraint derived from the homonymous point pair relationship, and two affine transformation constraints derived from the local affine transformation matrix A_j. With the camera intrinsics known, the epipolar geometric constraint between view 1 and view 2 is as follows:

x'_jᵀ E x_j = 0.

The affine transformation constraint describing the relationship between the essential matrix E and the local affine transformation matrix A_j can be expressed as follows:

(Eᵀ x'_j)_(1:2) + A_jᵀ (E x_j)_(1:2) = 0,

where the subscript (1:2) represents the first two equations.
Since an affine matching point pair has been selected as the origin of the world reference frame for the special parameterization of the translation vector, the homonymous point correspondence of the selected affine matching point pair cannot contribute a new epipolar constraint: the coefficients of the resulting equation are all zero. Thus, when the jth affine matching point pair is used to establish the world reference frame W, two affine matching point pairs can provide five equations. Specifically, the jth affine matching point pair provides two equations based on the affine transformation constraint, and the other affine matching point pair provides three equations: one from the epipolar geometric constraint and two from the affine transformation constraint. Substituting the parameterized essential matrix into these constraints and using the hidden variable method, the five equations provided by the two affine matching point pairs can be written as:

M(q) [d_j, d'_j]ᵀ = 0,

where the element terms of the coefficient matrix M(q) are second-order terms of the unknowns q_x, q_y and q_z.
After the relative rotation angle between the two cameras is obtained from the inertial measurement unit, the relative motion estimation problem of a single camera has four degrees of freedom. However, two affine matching point pairs provide six independent constraints. This means that the number of constraints exceeds the number of unknowns and redundant constraints exist. Therefore, at least one affine matching point pair and one same-name point correspondence are sufficient to estimate the relative motion of a single camera under the condition of a known relative rotation angle. Three equations can be selected from formula
Figure 367085DEST_PATH_IMAGE121
to explore the minimal-solution case. More specifically, simultaneously combining the two affine transformation constraints of the jth affine matching point pair with one epipolar geometric constraint of the other affine matching point pair yields 3 equations in 5 unknowns, i.e., the first three equations of formula
Figure 777206DEST_PATH_IMAGE121
:
Figure 514218DEST_PATH_IMAGE122
Figure 834341DEST_PATH_IMAGE123
Since formula
Figure 405131DEST_PATH_IMAGE124
has a non-zero solution,
Figure 381177DEST_PATH_IMAGE125
must satisfy
Figure 718617DEST_PATH_IMAGE126
. Therefore, all 2 × 2 sub-determinants of
Figure 752301DEST_PATH_IMAGE125
must be zero. This yields three equations in the three unknowns
Figure 87468DEST_PATH_IMAGE127
.
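The step above uses a standard rank argument: a homogeneous linear system has a non-zero solution only when all 2 × 2 sub-determinants of its coefficient matrix vanish. A small numeric illustration on a hypothetical 3 × 2 matrix (not the patent's actual coefficient matrix):

```python
import numpy as np
from itertools import combinations

def subdets_2x2(M):
    """All 2 x 2 sub-determinants of an n x 2 matrix M. A non-zero
    null vector exists (rank(M) < 2) iff every one of them is zero."""
    n = M.shape[0]
    return [np.linalg.det(M[[i, j], :]) for i, j in combinations(range(n), 2)]
```

Enumerating the vanishing minors is exactly what turns the rank condition into polynomial equations in the remaining unknowns.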
In summary, it was assumed above that the jth affine matching point pair is selected to establish the world reference frame W. Since two affine matching point pairs are required in the minimal-solution case, the other affine matching point pair can equally be selected to establish the world reference frame W. When it is selected, a system of equations similar to formula
Figure 347548DEST_PATH_IMAGE124
can be obtained:
Figure 426362DEST_PATH_IMAGE128
Figure 65285DEST_PATH_IMAGE129
Combining formula
Figure 633670DEST_PATH_IMAGE124
and formula
Figure 318729DEST_PATH_IMAGE130
yields six equations in the three unknowns
Figure 122606DEST_PATH_IMAGE131
:
Figure 6248DEST_PATH_IMAGE132
Figure 683217DEST_PATH_IMAGE133
The above is a fourth-order system of equations in
Figure 793256DEST_PATH_IMAGE134
.
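The rotation unknowns here come from the quaternion parameterization (see the claims), whose scalar part is pinned down by the known relative rotation angle through the standard identity w = cos(θ/2). A hedged sketch of that relationship (generic quaternion algebra; the helper names are illustrative, not the patent's symbols):

```python
import numpy as np

def quat_to_R(q):
    """Rotation matrix from a unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q / np.linalg.norm(q)
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

def quat_from_axis_angle(axis, theta):
    """Unit quaternion for a rotation by theta about a unit axis.
    A known relative rotation angle fixes the scalar part w = cos(theta/2),
    removing one of the four rotational unknowns."""
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    return np.concatenate([[np.cos(theta / 2)], np.sin(theta / 2) * axis])
```

This is why the single-camera problem drops to four degrees of freedom once the inertial measurement unit supplies the angle.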
The polynomial equation system formed by formula
Figure 213873DEST_PATH_IMAGE135
and
Figure 217601DEST_PATH_IMAGE136
can be solved algebraically by a Gröbner basis method. To preserve numerical stability and avoid excessive operations during the Gröbner basis computation, a random instance of the polynomial system is constructed over the finite field
Figure 986843DEST_PATH_IMAGE137
. The Gröbner basis is then computed with the computer algebra system Macaulay2. Finally, the corresponding solutions are found with an automatic Gröbner-basis solver.
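The patent computes its Gröbner basis with Macaulay2; the elimination idea behind such automatic solvers can be illustrated with a toy polynomial system in SymPy (a stand-in system, not the actual equations from the figures):

```python
import sympy as sp

x, y = sp.symbols('x y')
# Toy system: the lex-order Groebner basis eliminates x, leaving a
# univariate polynomial in y that can be solved first; x then follows
# by back-substitution -- the same pattern an automatic solver exploits.
polys = [x**2 + y**2 - 5, x*y - 2]
gb = sp.groebner(polys, x, y, order='lex')
```

Every basis element lies in the ideal generated by the input polynomials, so it vanishes at all common roots of the original system.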
The solution method described above has at most 20 complex solutions and an elimination template of size 36 × 56. Once the rotation parameters
Figure 646494DEST_PATH_IMAGE138
are obtained, formula
Figure 667540DEST_PATH_IMAGE139
is used immediately to obtain
Figure 729037DEST_PATH_IMAGE140
. Then, in formula
Figure 419912DEST_PATH_IMAGE124
, the null space of
Figure 566860DEST_PATH_IMAGE141
is found in order to determine
Figure 391596DEST_PATH_IMAGE142
. Next, through
Figure 228971DEST_PATH_IMAGE143
one computes
Figure 418644DEST_PATH_IMAGE144
and
Figure 849625DEST_PATH_IMAGE145
. Finally, according to formula
Figure 150156DEST_PATH_IMAGE146
the relative motion of the single camera is calculated.
Multi-camera
The jth affine matching point pair is selected to define the world reference frame W, as shown in FIG. 3. The position of the jth affine matching point pair in three-dimensional space is chosen as the origin of W, and the coordinate axis directions of W coincide with those of view1 (View 1 in FIG. 3). In the multi-camera system reference frame,
Figure 123929DEST_PATH_IMAGE147
is expressed as
Figure 281241DEST_PATH_IMAGE148
. The transition between view1 and reference frame W is denoted as
Figure 137201DEST_PATH_IMAGE149
The transition between view2 (View 2 in FIG. 3) and the reference frame W is denoted as
Figure 428374DEST_PATH_IMAGE150
. It is noted that,
Figure 318970DEST_PATH_IMAGE151
Figure 912762DEST_PATH_IMAGE152
. Then,
Figure 928123DEST_PATH_IMAGE153
and
Figure 570457DEST_PATH_IMAGE154
are parameterized. All points on the line described by the Plücker vector
Figure 377876DEST_PATH_IMAGE155
can be parameterized as:
Figure 80252DEST_PATH_IMAGE156
Figure 97756DEST_PATH_IMAGE157
where
Figure 543781DEST_PATH_IMAGE158
is a unit direction vector,
Figure 205706DEST_PATH_IMAGE159
is a moment vector, and
Figure 751088DEST_PATH_IMAGE160
is an unknown depth parameter.
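The Plücker parameterization above — a line given by a unit direction vector and a moment vector, with points on it indexed by a single depth parameter — can be sketched as follows (illustrative helper names; the moment convention m = p × d is one standard choice):

```python
import numpy as np

def plucker_line(p, d):
    """Plücker coordinates (d, m) of the line through point p with
    unit direction d, using the moment convention m = p x d."""
    d = np.asarray(d, dtype=float)
    d = d / np.linalg.norm(d)
    return d, np.cross(np.asarray(p, dtype=float), d)

def point_on_line(d, m, lam):
    """Points on the line as a linear function of the unknown depth
    parameter lam: X(lam) = d x m + lam * d."""
    return np.cross(d, m) + lam * d
```

The linearity in the depth parameter is what later keeps the translation unknowns linear in the equation system.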
Suppose that the three-dimensional space position
Figure 68937DEST_PATH_IMAGE161
corresponding to the jth affine matching point pair is selected to define the origin of the world reference frame W. The Plücker line connecting
Figure 380970DEST_PATH_IMAGE162
and the optical center of camera
Figure 835085DEST_PATH_IMAGE163
is denoted as
Figure 66215DEST_PATH_IMAGE164
. Then, in view k, a point
Figure 668097DEST_PATH_IMAGE165
satisfies the following conditions:
Figure 455925DEST_PATH_IMAGE166
Figure 967809DEST_PATH_IMAGE167
Equivalently, it can also be expressed as:
Figure 182889DEST_PATH_IMAGE168
Figure 537647DEST_PATH_IMAGE169
where
Figure 784958DEST_PATH_IMAGE170
is the camera index,
Figure 213665DEST_PATH_IMAGE171
is the index of the affine matching point pair, and
Figure 927543DEST_PATH_IMAGE172
is the view index. The unit direction vector
Figure 441701DEST_PATH_IMAGE173
can be computed through
Figure 243435DEST_PATH_IMAGE174
, where
Figure 854545DEST_PATH_IMAGE175
is the normalized homogeneous image coordinate in camera
Figure 411428DEST_PATH_IMAGE176
corresponding to
Figure 599833DEST_PATH_IMAGE177
. Here,
Figure 533154DEST_PATH_IMAGE178
and
Figure 998770DEST_PATH_IMAGE179
are parameterized as linear functions of the two unknown depth parameters
Figure 887409DEST_PATH_IMAGE180
.
Each affine matching point pair is associated with one perspective camera in view1 and one in view2. The relative motion between the two cameras
Figure 887409DEST_PATH_IMAGE181
is determined by a combination of four transforms: (i) from one perspective camera to view1, (ii) from view1 to W, (iii) from W to view2, and (iv) from view2 to the other perspective camera. Of these four transforms, (i) and (iv) are determined by the known extrinsic parameters. In parts (ii) and (iii), the unknowns
Figure 421159DEST_PATH_IMAGE182
Figure 678965DEST_PATH_IMAGE183
and
Figure 764601DEST_PATH_IMAGE184
are parameterized as
Figure 802964DEST_PATH_IMAGE185
. The relative motion
Figure 812509DEST_PATH_IMAGE186
can be expressed as:
Figure 128083DEST_PATH_IMAGE187
Figure 994408DEST_PATH_IMAGE188
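The chain of four transforms above can be sketched with 4 × 4 homogeneous matrices (generic composition; the names are illustrative, and in the patent the individual factors come from the known extrinsics and the parameterized view-to-W transforms):

```python
import numpy as np

def rt_to_T(R, t):
    """4x4 homogeneous transform from a rotation matrix and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def compose(*Ts):
    """Compose transforms left to right: compose(A, B) maps x -> A @ B @ x."""
    T = np.eye(4)
    for Ti in Ts:
        T = T @ Ti
    return T
```

Chaining camera-to-view1, view1-to-W, W-to-view2, and view2-to-camera is then a single matrix product.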
Once the relative motion
Figure 457751DEST_PATH_IMAGE189
between the two perspective cameras of each affine matching point pair is represented, the essential matrix
Figure 457937DEST_PATH_IMAGE190
can be expressed as:
Figure 690335DEST_PATH_IMAGE191
Figure 993140DEST_PATH_IMAGE192
By substituting formula
Figure 615882DEST_PATH_IMAGE193
into formula
Figure 232809DEST_PATH_IMAGE194
, all element entries of the essential matrix become linear in
Figure 382030DEST_PATH_IMAGE195
. Then, substituting formula
Figure 980371DEST_PATH_IMAGE194
into formula
Figure 152726DEST_PATH_IMAGE196
and formula
Figure 370081DEST_PATH_IMAGE197
, five equations are obtained: two from the affine transformation constraints of the jth affine matching point pair, and three from the other affine matching point pair. These equations can be expressed as:
Figure 577071DEST_PATH_IMAGE198
Figure 97045DEST_PATH_IMAGE199
The element terms of
Figure 819014DEST_PATH_IMAGE200
are all second-order terms in the unknowns
Figure 777742DEST_PATH_IMAGE030
Figure 760611DEST_PATH_IMAGE201
and
Figure 576120DEST_PATH_IMAGE202
.
After the relative rotation angle between the two positions of the multi-camera system is obtained from the inertial measurement unit, the relative motion estimation problem for multiple cameras has five degrees of freedom. Since two affine matching point pairs provide six independent constraints, the number of constraints exceeds the number of unknowns and redundant constraints exist. Thus, four equations are randomly selected from formula
Figure 723068DEST_PATH_IMAGE203
to explore the minimal-solution case. For example, the two affine transformation constraints of the jth affine matching point pair, together with one epipolar geometric constraint and the first affine transformation constraint of the other affine matching point pair, are combined into four equations in five unknowns, i.e., the first four equations of formula
Figure 423170DEST_PATH_IMAGE204
:
Figure 135911DEST_PATH_IMAGE205
Figure 60005DEST_PATH_IMAGE206
Since formula
Figure 881199DEST_PATH_IMAGE207
has a non-zero solution,
Figure 978468DEST_PATH_IMAGE208
must satisfy
Figure 14558DEST_PATH_IMAGE209
. Therefore, all 3 × 3 sub-determinants of
Figure 47236DEST_PATH_IMAGE210
must be zero. This yields four equations in the three unknowns
Figure 965513DEST_PATH_IMAGE211
.
Likewise, the other affine matching point pair may be selected to establish the world reference frame W. Supposing the
Figure 69735DEST_PATH_IMAGE212
th affine matching point pair is selected, a system of equations similar to formula
Figure 881702DEST_PATH_IMAGE213
can be obtained:
Figure 413178DEST_PATH_IMAGE214
Figure 818751DEST_PATH_IMAGE215
Combining formula
Figure 398768DEST_PATH_IMAGE213
and formula
Figure 143871DEST_PATH_IMAGE216
yields eight equations in the three unknowns
Figure 908564DEST_PATH_IMAGE217
:
Figure 739117DEST_PATH_IMAGE218
Figure 372092DEST_PATH_IMAGE219
The order of these equations is 6. Furthermore, an additional constraint is found for the above problem, namely that
Figure 971701DEST_PATH_IMAGE220
is 1.
Affine transformation constraints provide additional equations for the above problem. Only when the world reference frame W is established using the jth affine matching point pair are the two affine transformation constraints of that pair used to construct the additional equations. Thus, for the multi-camera relative motion estimation problem, there are three additional equations:
Figure 641717DEST_PATH_IMAGE221
Figure 897249DEST_PATH_IMAGE222
The above is a fourth-order system of equations in
Figure 881385DEST_PATH_IMAGE223
.
The solution is performed using the Gröbner basis method. Formula
Figure 663397DEST_PATH_IMAGE224
and formula
Figure 707576DEST_PATH_IMAGE225
are denoted as constraint
Figure 699672DEST_PATH_IMAGE226
and constraint
Figure 284237DEST_PATH_IMAGE227
, respectively. Using constraint
Figure 858437DEST_PATH_IMAGE226
alone, the relative motion can be estimated under crossed or non-crossed affine matching point pairs. Using
Figure 11201DEST_PATH_IMAGE226
and
Figure 38063DEST_PATH_IMAGE227
simultaneously, however, reduces the number of possible solutions.
Once the rotation parameters
Figure 426319DEST_PATH_IMAGE228
are obtained,
Figure 41977DEST_PATH_IMAGE229
can be calculated immediately. Then, using formula
Figure 427959DEST_PATH_IMAGE230
, the null space of
Figure 270013DEST_PATH_IMAGE231
is found in order to determine
Figure 399643DEST_PATH_IMAGE232
. Next, through
Figure 620540DEST_PATH_IMAGE233
one computes
Figure 239740DEST_PATH_IMAGE234
and
Figure 241194DEST_PATH_IMAGE235
. Finally, the relative motion of the multiple cameras is calculated by combining the transforms
Figure 361466DEST_PATH_IMAGE236
and
Figure 499186DEST_PATH_IMAGE237
.
The invention can achieve the following technical effects:
1) Under the condition that the inertial measurement unit directly provides the relative rotation angle, the method and device of the invention make full use of the affine matching point pair information between views, greatly reduce the number of point pairs required to solve the relative motion estimation problem for single-camera and multi-camera systems, and significantly improve the accuracy and robustness of the algorithm.
2) The invention exploits the fact that, when the inertial measurement unit and the camera are rigidly connected, their relative rotation angles during motion are identical. The relative rotation angle output by the inertial measurement unit can therefore be used directly as the relative rotation angle of the camera or multi-camera system, which extends applicability to wider scenarios, such as when the mounting relationship between the inertial measurement unit and the camera is unknown or changing.
3) When the inertial measurement unit directly provides the relative rotation angle of the single camera, the relative motion of the single camera has 4 degrees of freedom. A novel method for solving the minimal-configuration solution of single-camera relative motion estimation is provided, which can accurately estimate the relative motion of the single camera from 2 affine matching point pairs.
4) When the inertial measurement unit directly provides the relative rotation angle of the multi-camera system, the relative motion of the multiple cameras has 5 degrees of freedom. A novel method for solving the minimal-configuration solution of multi-camera relative motion estimation is provided, which can accurately estimate the relative motion of a multi-camera system from 2 crossed or non-crossed affine matching point pairs.
5) The method of the invention does not require calibrating the inertial measurement unit against the camera, making their integration more flexible and convenient. With its high accuracy and efficiency, it is suitable for devices with limited computing capacity, such as autonomous navigation for unmanned aerial vehicles, self-driving cars, and augmented reality equipment.
For electronic devices such as mobile phones and unmanned aerial vehicles in which an inertial measurement unit and a camera are rigidly mounted together, the inertial measurement unit provides relative rotation angle information for the camera, and two affine matching point pairs are used to estimate the relative motion of a single-camera or multi-camera system as follows:
1) Affine matching point pairs between the two views of the camera's relative motion are extracted by algorithms such as ASIFT and MODS; each affine matching point pair comprises the image coordinates of a same-name point pair and the local affine matrix between their corresponding neighborhoods;
2) The relative rotation angle information is directly output by the inertial measurement unit rigidly connected to the camera;
3) The relative motion of the single-camera or multi-camera system is solved from the affine matching point pairs extracted between the two views, the relative rotation angle output by the inertial measurement unit, and the relative motion estimation algorithm provided by the invention. Meanwhile, mismatched pairs among the affine matching point pairs are eliminated within the RANSAC framework, and the camera's relative motion result is recovered.
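Step 3's elimination of mismatched pairs can be sketched with a generic RANSAC loop; the minimal solver is abstracted here as a callback (in the patent's method it would be the two-affine-correspondence solver), demonstrated on a toy line-fitting problem with hypothetical data:

```python
import numpy as np

def ransac(data, fit_minimal, residual, n_min, thresh, iters=1000, rng=None):
    """Generic RANSAC: repeatedly fit a model on a random minimal sample
    (for the patent's method, 2 affine matching point pairs) and keep the
    model with the most inliers."""
    rng = np.random.default_rng(rng)
    best_model = None
    best_inliers = np.zeros(len(data), dtype=bool)
    for _ in range(iters):
        sample = rng.choice(len(data), size=n_min, replace=False)
        model = fit_minimal([data[i] for i in sample])
        if model is None:  # degenerate sample
            continue
        inliers = np.array([residual(model, d) < thresh for d in data])
        if inliers.sum() > best_inliers.sum():
            best_model, best_inliers = model, inliers
    return best_model, best_inliers
```

Because the minimal sample size is only 2 affine matching point pairs, far fewer iterations are needed than with point-pair-only solvers, which is the robustness advantage claimed above.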
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (7)

1. A camera relative motion estimation method under a known relative rotation angle condition, the method comprising:
acquiring at least two affine matching point pairs in a first view and a second view shot by a camera, and selecting a jth affine matching point pair to establish a world reference system; the origin of the world reference system is the position of the jth affine matching point pair in the three-dimensional space, and the coordinate axis direction of the world reference system is consistent with the first view direction;
acquiring a first posture relation between the first view and the second view, acquiring a second posture relation between the first view and the world reference system, and acquiring a third posture relation between the second view and the world reference system; the first position relation, the second position relation and the third position relation comprise a rotation matrix and a translation vector;
parameterizing the rotation matrix and the translation vector, and determining rotation parameter constraint of unknown numbers corresponding to the rotation matrix according to a relative rotation angle between the first view and the second view;
representing the first position and posture relation by adopting the parameterized rotation matrix and translation vector and acquiring a corresponding essential matrix;
acquiring two affine transformation constraints corresponding to the essential matrix and the affine matching matrix in the jth affine matching point, and acquiring one epipolar geometric constraint of the first view and the second view determined by the other affine matching point, together with two affine transformation constraints corresponding to the essential matrix and the affine matching matrix in that affine matching point;
and solving to obtain the rotation matrix and the translation vector according to the two affine transformation constraints corresponding to the jth affine matching point and the epipolar geometric constraint and the two affine transformation constraints determined by the other affine matching points, and determining the relative motion relationship of the camera according to the rotation matrix and the translation vector.
2. The method of claim 1, wherein determining a rotation parameter constraint for the rotation matrix corresponding to the unknown number based on the relative rotation angle between the first view and the second view comprises:
obtaining the rotation matrix obtained by parameterizing the rotation matrix as follows:
Figure 402615DEST_PATH_IMAGE001
wherein,
Figure 726280DEST_PATH_IMAGE002
is a quaternion homogeneous vector, R represents a rotation matrix corresponding to the first attitude relationship;
according to the relative rotation angle between the first view and the second view, determining the rotation parameter constraint of the unknown number corresponding to the rotation matrix as follows:
Figure 240438DEST_PATH_IMAGE003
wherein,
Figure 432385DEST_PATH_IMAGE004
indicating the relative angle of rotation.
3. The method of claim 1, wherein the representing the first pose relationship using the parameterized rotation matrix and translation vector and obtaining a corresponding essential matrix comprises:
obtaining, by parameterizing the translation vector, the translation vector in the second position posture relation and the translation vector in the third position posture relation, which are respectively:
Figure 981178DEST_PATH_IMAGE005
wherein,
Figure 725012DEST_PATH_IMAGE006
representing a translation vector in the second attitude relationship,
Figure 523203DEST_PATH_IMAGE007
an unknown depth parameter representing a translation vector parameterization in the second pose relationship,
Figure 456524DEST_PATH_IMAGE008
a unit vector representing the normalized homogeneous image coordinates in the first view,
Figure 797507DEST_PATH_IMAGE009
representing the translation vector in the third posture relationship,
Figure 322029DEST_PATH_IMAGE010
an unknown depth parameter representing the translation vector in the third pose relationship,
Figure 810779DEST_PATH_IMAGE011
a unit vector representing the normalized homogeneous image coordinates in the second view;
the parameterized rotation matrix and translation vector are adopted to represent the first attitude relationship as follows:
Figure 469163DEST_PATH_IMAGE012
wherein,
Figure 726969DEST_PATH_IMAGE013
representing a translation vector in the first position posture relation, I representing a unit matrix, and R representing a rotation matrix corresponding to the first position posture relation;
and obtaining the corresponding essential matrix as follows:
Figure 953551DEST_PATH_IMAGE014
wherein, E represents an essential matrix,
Figure 929597DEST_PATH_IMAGE015
representing an anti-symmetric matrix.
4. The method of claim 3, wherein the epipolar geometry constraint is:
Figure 611245DEST_PATH_IMAGE016
wherein,
Figure 51454DEST_PATH_IMAGE017
normalized homogeneous image coordinates for homonymous point pairs in the first view and the second view;
the affine transformation constraint is:
Figure 121041DEST_PATH_IMAGE018
where the subscript (1:2) represents the first two equations,
Figure 505755DEST_PATH_IMAGE019
representing a local affine transformation matrix corresponding to the normalized homogeneous image coordinates.
5. The method according to any one of claims 1 to 4, wherein solving the rotation matrix and the translation vector according to the two affine transformation constraints corresponding to the jth affine matching point and the one epipolar geometric constraint and the two affine transformation constraints determined by the other affine matching points comprises:
determining the relative motion parameters of the single camera to be four degrees of freedom according to the rotation parameter constraint;
selecting the two affine transformation constraints corresponding to the jth affine matching point and one epipolar geometric constraint determined by the other affine matching point, and constructing a first solution model as follows:
Figure 381307DEST_PATH_IMAGE020
wherein the element terms of
Figure 613705DEST_PATH_IMAGE021
are second-order terms in the unknowns
Figure 791877DEST_PATH_IMAGE022
Figure 539253DEST_PATH_IMAGE023
and
Figure 156179DEST_PATH_IMAGE024
, and
Figure 430034DEST_PATH_IMAGE025
represents
Figure 107003DEST_PATH_IMAGE026
, a matrix of three rows and two columns;
selecting the other affine matching point to establish the world reference system, and obtaining a second solution model as follows:
Figure 76096DEST_PATH_IMAGE027
wherein the element terms of
Figure 168817DEST_PATH_IMAGE028
are second-order terms in the unknowns
Figure 375808DEST_PATH_IMAGE029
Figure 285995DEST_PATH_IMAGE030
and
Figure 945646DEST_PATH_IMAGE031
;
according to the first solution model and the second solution modelSolving the model to obtain the unknown number
Figure 91326DEST_PATH_IMAGE032
The six equations of (1) are:
Figure 887243DEST_PATH_IMAGE033
obtaining algebraic solutions of the six equations through a Gröbner basis solver, determining the rotation matrix R of the first attitude relationship according to the algebraic solutions, and, according to the rotation matrix R, determining from the null space in
Figure 702753DEST_PATH_IMAGE034
the quantity
Figure 787383DEST_PATH_IMAGE035
; according to
Figure 549803DEST_PATH_IMAGE036
calculating to obtain a translation vector in the second position posture relation and a translation vector in the third position posture relation,
Figure 528123DEST_PATH_IMAGE037
denotes the submatrix of the first two rows and first two columns of matrix
Figure 452217DEST_PATH_IMAGE038
,
Figure 7832DEST_PATH_IMAGE039
denotes the submatrix of the first two rows and first two columns of matrix
Figure 370680DEST_PATH_IMAGE040
,
Figure 406769DEST_PATH_IMAGE041
denotes the determinant of a matrix, and
Figure 439448DEST_PATH_IMAGE042
denotes the two-row, two-column submatrix formed by the unknowns
Figure 560987DEST_PATH_IMAGE043
.
6. The method of claim 5, wherein when the camera is a multi-camera system, the method further comprises:
acquiring, for a camera
Figure 727526DEST_PATH_IMAGE044
in the multi-camera system, the external parameters
Figure 352543DEST_PATH_IMAGE045
, wherein
Figure 805390DEST_PATH_IMAGE046
represents a rotation matrix and
Figure 210963DEST_PATH_IMAGE047
represents a translation vector; the multi-camera system comprises: a camera
Figure 118876DEST_PATH_IMAGE048
for taking the first view or the second view, and a perspective camera;
parameterizing the translation vector in the multi-camera system by adopting a Plücker vector to obtain the translation vector as follows:
Figure 536082DEST_PATH_IMAGE049
wherein
Figure 300776DEST_PATH_IMAGE050
is the camera index,
Figure 131329DEST_PATH_IMAGE051
is the index of the affine matching point pair, and
Figure 764304DEST_PATH_IMAGE052
is the view index; the unit direction vector
Figure 363913DEST_PATH_IMAGE053
is calculated through
Figure 33929DEST_PATH_IMAGE054
, wherein
Figure 289461DEST_PATH_IMAGE055
is the normalized homogeneous image coordinate in camera
Figure 273597DEST_PATH_IMAGE056
corresponding to
Figure 55608DEST_PATH_IMAGE057
,
Figure 21159DEST_PATH_IMAGE058
represents the moment vector, and
Figure 826304DEST_PATH_IMAGE059
represents the unknown depth parameter of the translation vector parameterization;
obtaining a fourth attitude relationship between the two perspective cameras corresponding to the first view and the second view according to the parameterized rotation matrix and translation vector, wherein the fourth attitude relationship is as follows:
Figure 676449DEST_PATH_IMAGE060
and calculating an essential matrix corresponding to the fourth pose relation as follows:
Figure 250649DEST_PATH_IMAGE061
wherein
Figure 403413DEST_PATH_IMAGE062
and
Figure 430275DEST_PATH_IMAGE063
respectively represent the rotation matrices of camera
Figure 818531DEST_PATH_IMAGE064
and camera
Figure 434189DEST_PATH_IMAGE065
,
Figure 820171DEST_PATH_IMAGE066
and
Figure 662225DEST_PATH_IMAGE067
respectively represent the translation vectors of camera
Figure 791855DEST_PATH_IMAGE068
and camera
Figure 12752DEST_PATH_IMAGE069
,
Figure 631952DEST_PATH_IMAGE070
represents the rotation matrix between the first view and the second view, and
Figure 633406DEST_PATH_IMAGE071
represents the translation vector between the first view and the second view.
7. The method according to claim 6, wherein said solving for said rotation matrix and said translation vector according to two affine transformation constraints corresponding to the jth said affine matching point and one epipolar geometric constraint and two affine transformation constraints determined by other said affine matching points further comprises:
determining the relative motion parameters of the multi-camera system to be five degrees of freedom according to the rotation parameter constraint;
selecting the two affine transformation constraints corresponding to the jth affine matching point and one epipolar geometric constraint and one affine transformation constraint determined by the other affine matching point, and constructing a third solution model as follows:
Figure 753678DEST_PATH_IMAGE072
selecting other affine matching points to establish a world reference system, and obtaining a fourth solution model as follows:
Figure 891398DEST_PATH_IMAGE073
obtaining, according to the third solution model and the fourth solution model, eight equations in the unknown number
Figure 947079DEST_PATH_IMAGE074
, which are:
Figure 435829DEST_PATH_IMAGE075
obtaining algebraic solutions of the eight equations through a Gröbner basis solver, determining the rotation matrix R of the first attitude relationship of the multi-camera system according to the algebraic solutions, and, according to the rotation matrix R, determining from the null space in
Figure 844945DEST_PATH_IMAGE076
the quantity
Figure 102751DEST_PATH_IMAGE077
; according to
Figure 329333DEST_PATH_IMAGE078
calculating to obtain a translation vector in the second position posture relation and a translation vector in the third position posture relation.
CN202110596663.4A 2021-05-31 2021-05-31 Camera relative motion estimation method under known relative rotation angle condition Active CN113048985B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110596663.4A CN113048985B (en) 2021-05-31 2021-05-31 Camera relative motion estimation method under known relative rotation angle condition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110596663.4A CN113048985B (en) 2021-05-31 2021-05-31 Camera relative motion estimation method under known relative rotation angle condition

Publications (2)

Publication Number Publication Date
CN113048985A true CN113048985A (en) 2021-06-29
CN113048985B CN113048985B (en) 2021-08-06

Family

ID=76518592

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110596663.4A Active CN113048985B (en) 2021-05-31 2021-05-31 Camera relative motion estimation method under known relative rotation angle condition

Country Status (1)

Country Link
CN (1) CN113048985B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115451958A (en) * 2022-11-10 2022-12-09 中国人民解放军国防科技大学 Camera absolute attitude optimization method based on relative rotation angle

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6137902A (en) * 1997-07-22 2000-10-24 Atr Human Information Processing Research Laboratories Linear estimation method for three-dimensional position with affine camera correction
CN101226638A (en) * 2007-01-18 2008-07-23 中国科学院自动化研究所 Method and apparatus for standardization of multiple camera system
CN111476842A (en) * 2020-04-10 2020-07-31 中国人民解放军国防科技大学 Camera relative pose estimation method and system
CN111696158A (en) * 2020-06-04 2020-09-22 中国人民解放军国防科技大学 Affine matching point pair-based multi-camera system relative pose estimation method and device
CN112629565A (en) * 2021-03-08 2021-04-09 中国人民解放军国防科技大学 Method, device and equipment for calibrating rotation relation between camera and inertial measurement unit

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
DÁNIEL BARÁTH et al.: "Making affine correspondences work in camera geometry computation", Computer Vision – ECCV 2020, 16th European Conference *
GUAN BANGLEI: "Relative Pose Estimation With a Single Affine Correspondence", IEEE Transactions on Cybernetics *
TU GUOYONG et al.: "Heterogeneous image matching method under affine deformation", Journal of Computer-Aided Design & Computer Graphics *
TIAN MIAO et al.: "A decoupled relative pose estimation method for multi-camera systems without a common field of view", Acta Optica Sinica *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115451958A (en) * 2022-11-10 2022-12-09 中国人民解放军国防科技大学 Camera absolute attitude optimization method based on relative rotation angle
CN115451958B (en) * 2022-11-10 2023-02-03 中国人民解放军国防科技大学 Camera absolute attitude optimization method based on relative rotation angle

Also Published As

Publication number Publication date
CN113048985B (en) 2021-08-06

Similar Documents

Publication Publication Date Title
CN102737406B (en) Three-dimensional modeling apparatus and method
KR100855657B1 (en) System for estimating self-position of the mobile robot using monocular zoom-camara and method therefor
US8442305B2 (en) Method for determining 3D poses using points and lines
Orghidan et al. Camera calibration using two or three vanishing points
CN106960454B (en) Depth of field obstacle avoidance method and equipment and unmanned aerial vehicle
Clipp et al. Robust 6dof motion estimation for non-overlapping, multi-camera systems
CN108765498A (en) Monocular vision tracking, device and storage medium
CN110411476B (en) Calibration adaptation and evaluation method and system for visual inertial odometer
CN111754579B (en) Method and device for determining external parameters of multi-view camera
CN101563709A (en) Calibrating a camera system
CN112184824A (en) Camera external parameter calibration method and device
WO2009035183A1 (en) Method for self localization using parallel projection model
CN102750704A (en) Step-by-step video camera self-calibration method
CN105956074A (en) Single image scene six-degree-of-freedom positioning method of adjacent pose fusion guidance
CN113744340A (en) Calibrating cameras with non-central camera models of axial viewpoint offset and computing point projections
CN101377404A (en) Method for disambiguating space round gesture recognition ambiguity based on angle restriction
Shah et al. Depth estimation using stereo fish-eye lenses
CN113048985B (en) Camera relative motion estimation method under known relative rotation angle condition
Kurz et al. Bundle adjustment for stereoscopic 3d
CN111145267A (en) IMU (inertial measurement unit) assistance-based 360-degree panoramic view multi-camera calibration method
CN116630556A (en) Method, system and storage medium for reconstructing map based on aerial map data
Pagel Extrinsic self-calibration of multiple cameras with non-overlapping views in vehicles
CN116129031A (en) Method, system and storage medium for three-dimensional reconstruction
CN113223163A (en) Point cloud map construction method and device, equipment and storage medium
Kotake et al. A marker calibration method utilizing a priori knowledge on marker arrangement

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant