CN107798705B - Attitude angle estimation method based on feature point set grouping


Info

Publication number
CN107798705B
CN107798705B (application CN201710896806.7A)
Authority
CN
China
Prior art keywords
matrix
num
group
matching
angle
Prior art date
Legal status
Expired - Fee Related
Application number
CN201710896806.7A
Other languages
Chinese (zh)
Other versions
CN107798705A (en)
Inventor
董利达
高立娅
迟天阳
董文
Current Assignee
Hangzhou Normal University
Original Assignee
Hangzhou Normal University
Priority date
Filing date
Publication date
Application filed by Hangzhou Normal University
Priority to CN201710896806.7A
Publication of CN107798705A
Application granted
Publication of CN107798705B
Status: Expired - Fee Related
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 Sensing devices
    • B25J19/04 Viewing devices
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 Complex mathematical operations
    • G06F17/16 Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06T7/337 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving reference images or patches

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computational Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Algebra (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an attitude angle estimation method based on feature point set grouping, comprising the following steps: 1) detect and describe the image feature points with the SURF algorithm to obtain the feature point set to be input; 2) preprocess the input feature set and group the points; 3) compute a fundamental matrix for each group of feature points, compute for each group the mean of the summed distances from the matching points to the epipolar lines, and select the data subset with the minimum mean, giving the optimal fundamental matrix; 4) perform SVD on the optimal matrix, compute the rotation matrix, and solve it against the rotation matrix composed in roll-yaw-pitch order to obtain the attitude angle information. The invention improves computational efficiency while preserving computational accuracy.

Description

Attitude angle estimation method based on feature point set grouping
Technical Field
The invention relates to the fields of robot navigation, positioning and the like, in particular to an attitude angle estimation method based on feature point set grouping.
Background
With the development of signal-processing theory and computer technology, a long-standing goal of researchers has been to let computers, robots and other intelligent machines acquire and process visual information the way humans do: capture images of a three-dimensional scene with a camera and, by processing one or more images, make the computer recognize the surrounding scene. For two images with a known set of matching points, the epipolar geometry is the only information about the cameras that can be obtained from the matching points alone. It can be represented by a 3×3 matrix of rank 2, the fundamental matrix, which encodes the intrinsic and extrinsic parameters of the two cameras. The epipolar geometry problem thus becomes the problem of estimating the fundamental matrix F.
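To make the constraint concrete, the following is a minimal numpy sketch (with made-up values, not data from this patent) showing that a rank-2 matrix F satisfies the epipolar relation m'ᵀ F m = 0 for a matching pair of homogeneous points:

```python
# Minimal illustration of the epipolar constraint: for homogeneous matching
# points m1, m2 and a fundamental matrix F of rank 2, m2^T F m1 = 0.
# This F is a made-up skew-symmetric matrix, rank 2 by construction, which
# satisfies m^T F m = 0 for every m; a real F comes from estimation.
import numpy as np

F = np.array([[ 0.000, -0.001,  0.050],
              [ 0.001,  0.000, -0.030],
              [-0.050,  0.030,  0.000]])

m = np.array([120.0, 80.0, 1.0])      # an image point in homogeneous coordinates

print(np.linalg.matrix_rank(F))       # 2: the rank-2 property stated above
print(m @ F @ m)                      # 0.0: the epipolar residual of a true match
```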
Existing methods for estimating the fundamental matrix fall into three main categories: linear, iterative and robust. Linear methods have a small computational cost and are easy to implement, but mismatched correspondences or poorly localized points due to noise make their computation unstable, and the resulting accuracy is unsatisfactory. Iterative methods are more accurate than linear ones, but they are very time-consuming and cannot handle abnormal data. Robust methods can correctly fit the epipolar geometry and yield a better fundamental matrix, but when the data contain false matches the result is still not ideal, and the algorithm depends too heavily on the initial value obtained by least squares.
Disclosure of Invention
To address the instability that mismatched point pairs cause in linear fundamental-matrix estimation, the invention builds on the existing improved eight-point algorithm and provides an attitude angle estimation method based on feature point set grouping that overcomes the shortcomings of existing methods.
The technical scheme adopted by the invention for solving the technical problem is as follows: an attitude angle estimation method based on feature point set grouping comprises the following steps:
step (1), detecting and describing the feature points of two images by the SURF algorithm to obtain the input feature point matching set S_all, and sorting S_all by the ratio of horizontal-axis to vertical-axis variation to obtain a preprocessed matching point set S_r;
Step (2), sorting the preprocessed matching point set S_r (length N, usually greater than 150) by horizontal-axis coordinate and grouping, each group having K elements (8 ≤ K ≤ 15), so that the set is divided into num = ⌊N/K⌋ groups;
step (3), translating and scaling each of the num groups of point sets, and estimating a fundamental matrix F_i (i = 1, 2, 3, …, num) by the normalized eight-point algorithm;
Step (4), according to each group's fundamental matrix F_i (i = 1, 2, 3, …, num), calculating the mean of the summed distances from all matching points to the epipolar lines, finding the data position dex of the minimum mean, and taking the data at that position as the matching-point data group for calculating the fundamental matrix, thereby obtaining the optimal fundamental matrix F_z;
Step (5), performing SVD on the optimal fundamental matrix F_z to obtain two possible rotation matrices R_1 and R_2; deriving the rotation matrix R_zyx of a three-dimensional rotation composed in roll-yaw-pitch order; deciding between R_1 and R_2, by the first component being positive and the diagonal components being close to 1, to give the final matrix R; and obtaining the attitude angles of the motion estimation from the components of R.
Further, in step (4), the sum of the distances from all matching points to the epipolar lines is calculated according to each group's fundamental matrix by the following specific steps:
given the fundamental matrix F_i (i = 1, 2, 3, …, num) calculated for the num groups of data, the epipolar lines of each pair of images are:

I_1 = F_i^T M_2,  I_2 = F_i M_1    (1)

where M_1 and M_2 are the matching-point data sets of the group in the two images and F_i^T is the transpose of F_i;
using each group's fundamental matrix F_i (i = 1, 2, 3, …, num), the sum of the distances from all matching points to the epipolar lines is found:

Davg1_i = Σ_{j=1}^{K} |a(j,1)x_j + b(j,1)y_j + c(j,1)| / sqrt(a(j,1)^2 + b(j,1)^2)    (2)
where a(j,1), b(j,1) and c(j,1) are the coefficients in the j-th row of each group's epipolar line I_1 and (x_j, y_j) is the j-th matching point of picture 1, j = 1, 2, 3, …, K; equation (2) calculates the sum of the distances from the matching points to the epipolar lines in picture 1, and in the same way the sum of the distances from all matching points to the epipolar lines in picture 2 is Davg2_i; the final distance mean is Davg_i = mean(Davg1_i + Davg2_i), i = 1, 2, 3, …, num; the smallest mean among the num candidate groups selects the optimal fundamental matrix F_z: Davg = min(Davg_1, Davg_2, Davg_3, …, Davg_num), where z is the position of the minimum mean and 1 ≤ z ≤ num.
Further, step (5) comprises the following specific steps:
(5.1) performing SVD on the optimal fundamental matrix F_z, giving F_z = U diag(1,1,0) V^T;
(5.2) from the decomposition of the fundamental matrix F_z = SR, it follows that

U diag(1,1,0) V^T = F_z = SR = (U Z U^T)(U X V^T) = U (Z X) V^T

whence ZX = diag(1,1,0) can be deduced; since X is a rotation matrix, it further follows that X = W or X = W^T, where W is an orthogonal matrix and Z is an antisymmetric matrix;
(5.3) the above calculation gives two possible solutions for the rotation matrix, R_1 = U W V^T or R_2 = U W^T V^T, from which the rotation matrix R is determined, where

W = [0 -1 0; 1 0 0; 0 0 1],  Z = [0 1 0; -1 0 0; 0 0 0]
(5.4) for an arbitrary rotation in three-dimensional space composed in roll-yaw-pitch order, the corresponding rotation matrix can be expressed as:

R_zyx = R_z(φ)·R_y(θ)·R_x(ψ) =
[cosφcosθ, cosφsinθsinψ - sinφcosψ, cosφsinθcosψ + sinφsinψ;
 sinφcosθ, sinφsinθsinψ + cosφcosψ, sinφsinθcosψ - cosφsinψ;
 -sinθ, cosθsinψ, cosθcosψ]
where R_x(ψ), R_y(θ) and R_z(φ) denote the rotation matrices about the x-, y- and z-axes respectively:

R_x(ψ) = [1 0 0; 0 cosψ -sinψ; 0 sinψ cosψ],
R_y(θ) = [cosθ 0 sinθ; 0 1 0; -sinθ 0 cosθ],
R_z(φ) = [cosφ -sinφ 0; sinφ cosφ 0; 0 0 1]
(5.5) by combining steps (5.3) and (5.4), the attitude angle information can be solved:

ψ = arctan(R_32 / R_33),  θ = -arcsin(R_31),  φ = arctan(R_21 / R_11)

where ψ represents the pitch angle, θ represents the yaw angle, and φ represents the roll angle.
The invention has the following beneficial effects: by preprocessing the feature points and re-presenting them in grouped form on the basis of an improved eight-point method, the invention reduces the influence of mismatched data on the algorithm and computes the attitude-angle information of the motion estimation from the fundamental matrix. The overall procedure is simple and easy to implement, and the experimental tooling requirements are low; usage results show that the method improves computational efficiency while preserving accuracy, and its value is most evident when the number of feature points is large.
Drawings
FIG. 1 is a flowchart of an attitude angle estimation method based on feature point set grouping according to the present invention.
Detailed Description
Referring to fig. 1, the present invention provides an attitude angle estimation method based on feature point set grouping. First, the SURF algorithm detects and describes the image feature points, giving the feature point set to be input. This preliminary set is preprocessed, grouped and selected to form the initial feature point set, which largely removes the influence of mismatched feature points on the eight-point method and yields a good initial data set. An improved eight-point algorithm then computes an initial fundamental matrix between the two images for every group of samples; for each initial value, the sum of the distances from all matching points to the epipolar lines is computed, the per-group means of these sums are sorted from small to large, and the data subset with the smallest mean is selected as the initial value of the algorithm's accurate fundamental matrix. SVD of this matrix yields the two possible rotation matrices R_1 and R_2, and the final matrix R is selected mainly by comparing the signs of the two. For the order of rotation in three-dimensional space, the invention rotates in roll-yaw-pitch order to obtain the rotation matrix R_zyx; solving R_zyx against the matrix R gives the attitude-angle information of the motion estimation. The specific steps are as follows:
step (1), sort the matching point set S_all of the two initial images by the ratio of horizontal-axis to vertical-axis variation and remove points whose ratio differs from the neighboring value by more than 5, obtaining the preprocessed matching point set S_r.
The specific implementation steps are as follows:
(1.1) extract and match the feature points of the two images with the SURF algorithm to obtain the matching point set S_all;
(1.2) for S_all, compute the ratio of horizontal-axis to vertical-axis variation, sort on this ratio and, to remove mismatched data, delete the data points whose difference from adjacent entries in the sorted column exceeds 5, obtaining the preprocessed matching point set S_r.
Step (2), sort the point set S_r (length N, typically greater than 150) by horizontal-axis coordinate and divide it into groups of K elements each, so that it splits into num = ⌊N/K⌋ groups. A large body of experimental data shows that for 8 ≤ K ≤ 15 the accuracy of the algorithm is stable while the overall time efficiency improves; this implementation uses K = 8. Steps (1)-(2) are sketched below.
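A minimal sketch of steps (1)-(2), assuming SURF from the opencv-contrib package (cv2.xfeatures2d; SURF is patented and absent from stock OpenCV builds). The outlier rule, dropping a point when its sorted ratio jumps by more than 5 from its neighbour, is one reading of the preprocessing step, and match_and_group is an illustrative name, not the patent's:

```python
# Steps (1)-(2): SURF matching, ratio-based preprocessing, grouping.
import cv2
import numpy as np

def match_and_group(img1, img2, K=8):
    # Step (1): detect/describe with SURF and match to obtain S_all.
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    k1, d1 = surf.detectAndCompute(img1, None)
    k2, d2 = surf.detectAndCompute(img2, None)
    matches = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True).match(d1, d2)
    p1 = np.float32([k1[m.queryIdx].pt for m in matches])
    p2 = np.float32([k2[m.trainIdx].pt for m in matches])

    # Preprocess: sort by the ratio of horizontal to vertical variation and
    # drop points whose ratio jumps by more than 5 from the previous entry.
    dx, dy = p2[:, 0] - p1[:, 0], p2[:, 1] - p1[:, 1]
    ratio = dx / np.where(np.abs(dy) < 1e-9, 1e-9, dy)
    order = np.argsort(ratio)
    p1, p2, ratio = p1[order], p2[order], ratio[order]
    keep = np.r_[True, np.abs(np.diff(ratio)) <= 5]
    p1, p2 = p1[keep], p2[keep]                     # S_r, length N

    # Step (2): sort on the horizontal axis and split into num = N // K groups.
    order = np.argsort(p1[:, 0])
    p1, p2 = p1[order], p2[order]
    num = len(p1) // K
    return [(p1[i*K:(i+1)*K], p2[i*K:(i+1)*K]) for i in range(num)]
```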
Step (3), translate and scale each of the num groups of point sets and estimate the fundamental matrix F_i (i = 1, 2, 3, …, num) with the normalized eight-point algorithm, as in the sketch below.
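A sketch of step (3). OpenCV's cv2.FM_8POINT solver performs Hartley normalization (translation and scaling of the coordinates) internally, so it is used here as a stand-in for the normalized eight-point estimation; group_fundamentals is an illustrative helper name:

```python
# Step (3): one fundamental matrix F_i per group via the normalized
# eight-point algorithm (cv2.FM_8POINT normalizes coordinates internally).
import cv2
import numpy as np

def group_fundamentals(groups):
    Fs = []
    for g1, g2 in groups:                     # each group holds K >= 8 matches
        F, _ = cv2.findFundamentalMat(g1, g2, cv2.FM_8POINT)
        if F is not None:
            Fs.append(F[:3, :3])              # a single 3x3 candidate F_i
    return Fs
```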
Step (4), for each group's fundamental matrix F_i (i = 1, 2, 3, …, num), compute the mean of the summed distances from all matching points to the epipolar lines, find the data position dex of the minimum mean, and take the data at that position as the matching-point data group for computing the fundamental matrix, obtaining the optimal fundamental matrix F_z. The specific implementation steps are as follows:
(4.1) given the fundamental matrix F_i (i = 1, 2, 3, …, num) calculated for each of the num groups of data, the epipolar lines of each pair of images are:

I_1 = F_i^T M_2,  I_2 = F_i M_1    (1)

where M_1 and M_2 are the matching-point data sets of the group in the two images and F_i^T is the transpose of F_i;
(4.2) using each group's fundamental matrix F_i (i = 1, 2, 3, …, num), the sum of the distances from all matching points to the epipolar lines is found:

Davg1_i = Σ_{j=1}^{K} |a(j,1)x_j + b(j,1)y_j + c(j,1)| / sqrt(a(j,1)^2 + b(j,1)^2)    (2)
where a(j,1), b(j,1) and c(j,1) are the coefficients in the j-th row of each group's epipolar line I_1 and (x_j, y_j) is the j-th matching point of picture 1, j = 1, 2, 3, …, 8; equation (2) calculates the sum of the distances from the matching points to the epipolar lines in picture 1, and in the same way the sum of the distances from all matching points to the epipolar lines in picture 2 is Davg2_i; the final distance mean is Davg_i = mean(Davg1_i + Davg2_i), i = 1, 2, 3, …, num; the smallest mean among the num candidate groups selects the optimal fundamental matrix F_z: Davg = min(Davg_1, Davg_2, Davg_3, …, Davg_num), where z is the position of the minimum mean and 1 ≤ z ≤ num. A sketch of this selection follows.
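A numpy sketch of step (4) and equations (1)-(2), assuming the distances are averaged over all matching points (p1_all, p2_all); the helper names are illustrative:

```python
# Step (4): pick the F_i whose mean point-to-epipolar-line distance,
# summed over both images, is smallest.
import numpy as np

def mean_line_distance(F, pts_src, pts_dst):
    # Lines l = F m for every source point m; the distance of the destination
    # point (x, y) to the line (a, b, c) is |ax + by + c| / sqrt(a^2 + b^2).
    src = np.c_[pts_src, np.ones(len(pts_src))]     # homogeneous coordinates
    lines = src @ F.T                               # row j = F @ src_j = (a, b, c)
    a, b, c = lines[:, 0], lines[:, 1], lines[:, 2]
    x, y = pts_dst[:, 0], pts_dst[:, 1]
    return np.mean(np.abs(a * x + b * y + c) / np.hypot(a, b))

def best_fundamental(Fs, p1_all, p2_all):
    # Davg_i = mean distance in image 1 (lines I1 = F^T M2) plus image 2
    # (lines I2 = F M1); z is the position of the minimum, F_z the optimum.
    Davg = [mean_line_distance(F.T, p2_all, p1_all) +
            mean_line_distance(F, p1_all, p2_all) for F in Fs]
    z = int(np.argmin(Davg))
    return Fs[z], z
```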
Step (5), perform SVD on the optimal fundamental matrix F_z to obtain the two possible rotation matrices R_1 and R_2; derive the rotation matrix R_zyx of a three-dimensional rotation composed in roll-yaw-pitch order; decide between R_1 and R_2, by the first component being positive and the diagonal components being close to 1, to give the final matrix R; and obtain the attitude angles of the motion estimation from the components of R. The specific implementation steps are as follows:
(5.1) perform SVD on the optimal fundamental matrix F_z, giving F_z = U diag(1,1,0) V^T;
(5.2) from the decomposition of the fundamental matrix F_z = SR, it follows that

U diag(1,1,0) V^T = F_z = SR = (U Z U^T)(U X V^T) = U (Z X) V^T

whence ZX = diag(1,1,0) can be deduced; since X is a rotation matrix, it further follows that X = W or X = W^T, where W is an orthogonal matrix and Z is an antisymmetric matrix, generally defined in matrix form as:

W = [0 -1 0; 1 0 0; 0 0 1],  Z = [0 1 0; -1 0 0; 0 0 0]
(5.3) the above calculation gives two possible solutions for the rotation matrix, R_1 = U W V^T or R_2 = U W^T V^T, from which the rotation matrix R is determined, as in the sketch below.
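A sketch of steps (5.1)-(5.3) using the W matrix defined in (5.2); the determinant sign fix is an implementation detail added here so that both candidates are proper rotations:

```python
# Steps (5.1)-(5.3): SVD of F_z and the two rotation candidates
# R1 = U W V^T and R2 = U W^T V^T.
import numpy as np

W = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])

def rotation_candidates(Fz):
    U, _, Vt = np.linalg.svd(Fz)              # F_z = U diag(s1, s2, s3) V^T
    R1 = U @ W @ Vt
    R2 = U @ W.T @ Vt
    # A proper rotation needs det(R) = +1; flip the sign if necessary.
    if np.linalg.det(R1) < 0: R1 = -R1
    if np.linalg.det(R2) < 0: R2 = -R2
    return R1, R2
```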
The attitude-angle information of the motion estimation is obtained mainly by relating the rotation matrix decomposed from the fundamental matrix to the rotation matrix of an arbitrary rotation in three-dimensional space. For the order of rotation in three-dimensional space, the invention composes in roll-yaw-pitch order; the corresponding rotation matrix is the ordered product of three rotation matrices, where R_x(ψ), R_y(θ) and R_z(φ) denote the rotation matrices about the x-, y- and z-axes respectively:

R_x(ψ) = [1 0 0; 0 cosψ -sinψ; 0 sinψ cosψ],
R_y(θ) = [cosθ 0 sinθ; 0 1 0; -sinθ 0 cosθ],
R_z(φ) = [cosφ -sinφ 0; sinφ cosφ 0; 0 0 1]
(5.4) for an arbitrary rotation in three-dimensional space, composed in roll-yaw-pitch order, the corresponding rotation matrix can be expressed as:

R_zyx = R_z(φ)·R_y(θ)·R_x(ψ) =
[cosφcosθ, cosφsinθsinψ - sinφcosψ, cosφsinθcosψ + sinφsinψ;
 sinφcosθ, sinφsinθsinψ + cosφcosψ, sinφsinθcosψ - cosφsinψ;
 -sinθ, cosθsinψ, cosθcosψ]
(5.5) by combining steps (5.3) and (5.4), the attitude-angle information, namely the pitch angle ψ, the yaw angle θ and the roll angle φ, can be calculated:

ψ = arctan(R_32 / R_33),  θ = -arcsin(R_31),  φ = arctan(R_21 / R_11)

A sketch of this extraction, together with the choice between R_1 and R_2, follows.
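A sketch of steps (5.4)-(5.5) under the roll-yaw-pitch convention R = R_z(φ)·R_y(θ)·R_x(ψ) stated above. The test "first component positive, diagonal close to 1" is realized here as picking the candidate whose diagonal is nearest to 1, which is one reading of the selection rule:

```python
# Steps (5.4)-(5.5): choose between R1 and R2, then read the attitude
# angles off R. From the expanded R_zyx above: R[2,0] = -sin(theta),
# R[2,1]/R[2,2] = tan(psi), R[1,0]/R[0,0] = tan(phi).
import numpy as np

def pick_rotation(R1, R2):
    # The inter-frame rotation is small, so the diagonal of the true R
    # should be close to (1, 1, 1).
    score = lambda R: np.sum((np.diag(R) - 1.0) ** 2)
    return R1 if score(R1) <= score(R2) else R2

def attitude_angles(R):
    psi = np.arctan2(R[2, 1], R[2, 2])                  # pitch, about x
    theta = -np.arcsin(np.clip(R[2, 0], -1.0, 1.0))     # yaw, about y
    phi = np.arctan2(R[1, 0], R[0, 0])                  # roll, about z
    return psi, theta, phi
```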
By preprocessing the feature points and re-presenting them in grouped form on the basis of an improved eight-point method, the invention reduces the influence of mismatched data on the algorithm and computes the attitude-angle information of the motion estimation from the fundamental matrix. The overall algorithm flow is simple and easy to implement, and the experimental tooling requirements are low; usage results show that the method improves computational efficiency while preserving accuracy, and its value is most evident when the number of feature points is large.
The embodiments described in this specification merely illustrate implementations of the inventive concept; the scope of the invention should not be considered limited to the specific forms set forth in the embodiments, but also covers the equivalents that those skilled in the art may conceive on the basis of the inventive concept.

Claims (3)

1. An attitude angle estimation method based on feature point set grouping is characterized by comprising the following steps:
step (1), detecting and describing the feature points of two images by the SURF algorithm to obtain the input feature point matching set S_all, and sorting S_all by the ratio of horizontal-axis to vertical-axis variation to obtain a preprocessed matching point set S_r;
Step (2), sorting the preprocessed matching point set S_r by horizontal-axis coordinate and grouping, each group having K elements, so that the set is divided into num = ⌊N/K⌋ groups, where N is the length of S_r;
step (3), translating and scaling each of the num groups of point sets, and calculating a fundamental matrix F_i (i = 1, 2, 3, …, num) by an improved eight-point algorithm;
Step (4), according to each group's fundamental matrix F_i (i = 1, 2, 3, …, num), calculating the mean of the summed distances from all matching points to the epipolar lines, finding the data position dex of the minimum mean, and taking the data at that position as the matching-point data group for calculating the fundamental matrix, thereby obtaining the optimal fundamental matrix F_z;
Step (5), performing SVD on the optimal fundamental matrix F_z to obtain two possible rotation matrices R_1 and R_2; deriving the rotation matrix R_zyx of a three-dimensional rotation composed in roll-yaw-pitch order; deciding between R_1 and R_2, by the first component being positive and the diagonal components being close to 1, to give the final matrix R; and obtaining the attitude angles of the motion estimation from the components of R.
2. The attitude angle estimation method based on feature point set grouping according to claim 1, wherein in step (4) the sum of the distances from all matching points to the epipolar lines is calculated according to each group's fundamental matrix by the following specific steps:
given the fundamental matrix F_i (i = 1, 2, 3, …, num) calculated for the num groups of data, the epipolar lines of each pair of images are:

I_1 = F_i^T M_2,  I_2 = F_i M_1    (1)

where M_1 and M_2 are the matching-point data sets of the group in the two images and F_i^T is the transposed matrix of F_i;
using each group's fundamental matrix F_i (i = 1, 2, 3, …, num), the sum of the distances from all matching points to the epipolar lines is found:

Davg1_i = Σ_{j=1}^{K} |a(j,1)x_j + b(j,1)y_j + c(j,1)| / sqrt(a(j,1)^2 + b(j,1)^2)    (2)
where a(j,1), b(j,1) and c(j,1) are the coefficients in the j-th row of each group's epipolar line I_1 and (x_j, y_j) is the j-th matching point of picture 1, j = 1, 2, 3, …, K; equation (2) calculates the sum of the distances from the matching points to the epipolar lines in picture 1, and in the same way the sum of the distances from all matching points to the epipolar lines in picture 2 is Davg2_i; the final distance mean is Davg_i = mean(Davg1_i + Davg2_i), i = 1, 2, 3, …, num; the smallest mean among the num candidate groups selects the optimal fundamental matrix F_z: Davg = min(Davg_1, Davg_2, Davg_3, …, Davg_num), where z is the position of the minimum mean and 1 ≤ z ≤ num.
3. The method for estimating the attitude angle based on the feature point set grouping according to claim 2, wherein the step (5) comprises the following specific steps:
(5.1) performing SVD on the optimal fundamental matrix F_z, giving F_z = U diag(1,1,0) V^T;
(5.2) from the decomposition of the fundamental matrix F_z = SR, it follows that

U diag(1,1,0) V^T = F_z = SR = (U Z U^T)(U X V^T) = U (Z X) V^T

whence ZX = diag(1,1,0) can be deduced; since X is a rotation matrix, it further follows that X = W or X = W^T, where W is an orthogonal matrix and Z is an antisymmetric matrix;
(5.3) the above calculation gives two possible solutions for the rotation matrix, R_1 = U W V^T or R_2 = U W^T V^T, from which the rotation matrix R is determined, where

W = [0 -1 0; 1 0 0; 0 0 1],  Z = [0 1 0; -1 0 0; 0 0 0]
(5.4) for an arbitrary rotation in three-dimensional space composed in roll-yaw-pitch order, the corresponding rotation matrix can be expressed as:

R_zyx = R_z(φ)·R_y(θ)·R_x(ψ) =
[cosφcosθ, cosφsinθsinψ - sinφcosψ, cosφsinθcosψ + sinφsinψ;
 sinφcosθ, sinφsinθsinψ + cosφcosψ, sinφsinθcosψ - cosφsinψ;
 -sinθ, cosθsinψ, cosθcosψ]
where R_x(ψ), R_y(θ) and R_z(φ) denote the rotation matrices about the x-, y- and z-axes respectively:

R_x(ψ) = [1 0 0; 0 cosψ -sinψ; 0 sinψ cosψ],
R_y(θ) = [cosθ 0 sinθ; 0 1 0; -sinθ 0 cosθ],
R_z(φ) = [cosφ -sinφ 0; sinφ cosφ 0; 0 0 1]
(5.5) by combining steps (5.3) and (5.4), the attitude angle information can be solved:

ψ = arctan(R_32 / R_33),  θ = -arcsin(R_31),  φ = arctan(R_21 / R_11)

where ψ represents the pitch angle, θ represents the yaw angle, and φ represents the roll angle.
CN201710896806.7A 2017-09-28 2017-09-28 Attitude angle estimation method based on feature point set grouping Expired - Fee Related CN107798705B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710896806.7A CN107798705B (en) 2017-09-28 2017-09-28 Attitude angle estimation method based on feature point set grouping

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710896806.7A CN107798705B (en) 2017-09-28 2017-09-28 Attitude angle estimation method based on feature point set grouping

Publications (2)

Publication Number Publication Date
CN107798705A CN107798705A (en) 2018-03-13
CN107798705B true CN107798705B (en) 2020-06-16

Family

ID=61533918

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710896806.7A Expired - Fee Related CN107798705B (en) 2017-09-28 2017-09-28 Attitude angle estimation method based on feature point set grouping

Country Status (1)

Country Link
CN (1) CN107798705B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104880187A (en) * 2015-06-09 2015-09-02 北京航空航天大学 Dual-camera-based motion estimation method of light stream detection device for aircraft
CN106920259A (en) * 2017-02-28 2017-07-04 武汉工程大学 A kind of localization method and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8050458B2 (en) * 2007-06-18 2011-11-01 Honda Elesys Co., Ltd. Frontal view imaging and control device installed on movable object

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104880187A (en) * 2015-06-09 2015-09-02 北京航空航天大学 Dual-camera-based motion estimation method of light stream detection device for aircraft
CN106920259A (en) * 2017-02-28 2017-07-04 武汉工程大学 A kind of localization method and system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Attitude and attitude rate estimation for a nanosatellite using SVD and UKF; Demet Cilden et al.; 2015 7th International Conference on Recent Advances in Space Technologies; 2015-06-19 *
单目相机姿态估计的点云与图像融合 (Point cloud and image fusion for monocular camera attitude estimation); 熊光洋 et al.; 测绘科学 (Science of Surveying and Mapping); 2016-02; Vol. 41, No. 2 *

Also Published As

Publication number Publication date
CN107798705A (en) 2018-03-13

Similar Documents

Publication Publication Date Title
CN108665491B (en) Rapid point cloud registration method based on local reference points
CN102750704B (en) Step-by-step video camera self-calibration method
CN113393524B (en) Target pose estimation method combining deep learning and contour point cloud reconstruction
CN107818598B (en) Three-dimensional point cloud map fusion method based on visual correction
CN105956074A (en) Single image scene six-degree-of-freedom positioning method of adjacent pose fusion guidance
CN109766903B (en) Point cloud model curved surface matching method based on curved surface features
CN111027140A (en) Airplane standard part model rapid reconstruction method based on multi-view point cloud data
CN102663351A (en) Face characteristic point automation calibration method based on conditional appearance model
CN110838146A (en) Homonymy point matching method, system, device and medium for coplanar cross-ratio constraint
CN117132630A (en) Point cloud registration method based on second-order spatial compatibility measurement
CN113902779B (en) Point cloud registration method based on tensor voting method
CN115100277A (en) Method for determining position and pose of complex curved surface structure part
CN110688440A (en) Map fusion method suitable for less sub-map overlapping parts
CN107330934B (en) Low-dimensional cluster adjustment calculation method and system
CN116476070B (en) Method for adjusting scanning measurement path of large-scale barrel part local characteristic robot
CN117351078A (en) Target size and 6D gesture estimation method based on shape priori
CN107798705B (en) Attitude angle estimation method based on feature point set grouping
CN112509018B (en) Quaternion space optimized three-dimensional image registration method
CN109658489B (en) Three-dimensional grid data processing method and system based on neural network
Duan et al. Filtering 2D-3D outliers by camera adjustment for visual odometry
Del-Tejo-Catalá et al. Probabilistic pose estimation from multiple hypotheses
CN109949357B (en) Method for recovering relative posture of stereo image pair
CN111210507B (en) Initial view selection method for multi-view three-dimensional reconstruction
CN115861666B (en) 3D image point cloud matching method, system, equipment and medium
CN115294285B (en) Three-dimensional reconstruction method and system of deep convolutional network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200616