CN107798705B - An Attitude Angle Estimation Method Based on Feature Point Set Grouping - Google Patents
- Publication number
- CN107798705B (application CN201710896806.7A)
- Authority
- CN
- China
- Prior art keywords
- matrix
- num
- group
- angle
- points
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/04—Viewing devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/10—Complex mathematical operations
- G06F17/16—Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
- G06T7/337—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving reference images or patches
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Mathematical Physics (AREA)
- Pure & Applied Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computational Mathematics (AREA)
- Data Mining & Analysis (AREA)
- Mathematical Analysis (AREA)
- Mathematical Optimization (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Algebra (AREA)
- Computing Systems (AREA)
- Databases & Information Systems (AREA)
- Software Systems (AREA)
- General Engineering & Computer Science (AREA)
- Image Analysis (AREA)
Abstract
Description
Technical Field
The invention relates to the fields of robot navigation and positioning, and in particular to an attitude angle estimation method based on feature point set grouping.
Background Art
With the development of signal processing theory and computer technology, enabling computers, robots, and other intelligent machines to acquire and process visual information as humans do has become a goal for researchers: cameras are used to capture images of a three-dimensional scene, and by processing one or more of these images a computer can recognize the surrounding scene. For two images with a known set of matching points, the epipolar geometric constraint is the only information about the cameras that can be recovered from the matches. The epipolar relationship can be represented by a 3x3 matrix of rank 2, the fundamental matrix, which encodes the intrinsic and extrinsic parameters of the two cameras. The epipolar geometry problem is therefore reduced to estimating the fundamental matrix F.
Existing methods for estimating the fundamental matrix fall mainly into three classes: linear methods, iterative methods, and robust methods. Linear methods are computationally cheap and easy to implement, but when the correspondences contain mismatches, or points are poorly localized because of noise, the computation becomes unstable and the accuracy of the result is unsatisfactory. Iterative methods are more accurate than linear ones, but they are very time-consuming and cannot handle outlier data. Robust methods can model the epipolar geometry correctly and obtain a reasonably good fundamental matrix, but when the data contain false matches the result is still not ideal, since such algorithms depend too heavily on the initial value obtained by least squares.
Summary of the Invention
To overcome the instability that mismatched points cause in linear estimation of the fundamental matrix, the present invention builds on the existing improved eight-point algorithm and proposes an attitude angle estimation method based on feature point set grouping.
The technical solution adopted by the present invention to solve the technical problem is as follows: an attitude angle estimation method based on feature point set grouping, comprising the following steps.
Step (1): Use the SURF algorithm to detect and describe feature points in the two images, obtaining the candidate matching set S_all; sort S_all by the ratio of the horizontal to the vertical coordinate change of each match, yielding the preprocessed matching point set S_r.
Step (2): Sort the preprocessed matching point set S_r (of length N, usually greater than 150) by the horizontal axis and split it into groups of K elements each (8 ≤ K ≤ 15), giving num = [N/K] groups.
Step (3): Translate and scale each of the num groups of points, and estimate a fundamental matrix F_i (i = 1, 2, 3, ..., num) for each group with the normalized eight-point algorithm.
Step (4): From each group's fundamental matrix F_i (i = 1, 2, 3, ..., num), compute the mean of the sum of distances from all matching points to their epipolar lines; find the position dex of the group with the smallest mean, and use the matching point data at that position to compute the optimal fundamental matrix F_z.
Step (5): Apply a singular value decomposition (SVD) to the optimal fundamental matrix F_z to obtain two candidate rotation matrices R_1 and R_2. Derive the rotation matrix R_zyx corresponding to rotating three-dimensional space in the order roll angle - yaw angle - pitch angle. Select between R_1 and R_2 by requiring that the first component of the matrix be positive and the diagonal components be close to 1, yielding the final matrix R; the attitude angles of the motion estimate are then read off from the components of R.
Further, in step (4), the sum of the distances from all matching points to the epipolar lines is computed from each group's fundamental matrix as follows.
From the fundamental matrices F_i (i = 1, 2, 3, ..., num) computed for the num groups of data, the epipolar lines of the two images in each group are

I1 = M2 · F_i,  I2 = M1 · F_i^T,  (1)

where M1 and M2 are the matched point sets of the group in image 1 and image 2 respectively (in homogeneous row form), and F_i^T is the transpose of F_i; each row of I1 is the epipolar line F_i^T · m2 in image 1 induced by a point m2 of image 2, and likewise each row of I2 is the epipolar line F_i · m1 in image 2.
Using each group's fundamental matrix F_i (i = 1, 2, 3, ..., num), the sum of the distances from all matching points to their epipolar lines is

D_avg1_i = Σ_{j=1}^{K} |a(j,1)·x_j + b(j,1)·y_j + c(j,1)| / sqrt(a(j,1)^2 + b(j,1)^2),  (2)

where a(j,1), b(j,1) and c(j,1) are the coefficients of the j-th row (j = 1, 2, 3, ..., K) of the group's epipolar lines I1, and (x_j, y_j) is the corresponding matching point in image 1. Equation (2) gives the sum of distances from the matching points of image 1 to their epipolar lines; the sum D_avg2_i for all matching points of image 2 is computed in the same way, and the final mean distance is D_avg_i = mean(D_avg1_i + D_avg2_i), i = 1, 2, 3, ..., num. Among the num candidate groups, the one with the smallest mean yields the optimal fundamental matrix F_z: D_avg_z = min(D_avg_1, D_avg_2, D_avg_3, ..., D_avg_num), where z (1 ≤ z ≤ num) is the position of the smallest mean.
Further, the specific sub-steps of step (5) are as follows.
(5.1) Apply the singular value decomposition to the obtained optimal fundamental matrix F_z, giving F_z = U diag(1,1,0) V^T.
(5.2) From the factorization F_z = SR of the fundamental matrix,

U diag(1,1,0) V^T = F_z = SR = (U Z U^T)(U X V^T) = U (Z X) V^T,

so ZX = diag(1,1,0). Since X is a rotation matrix, it follows that X = W or X = W^T, where W is an orthogonal matrix and Z is a skew-symmetric matrix.
(5.3) From the above derivation, the two possible solutions for the rotation matrix are R_1 = U W V^T or R_2 = U W^T V^T, from which one rotation matrix R is determined, where

W = [ 0 −1 0 ; 1 0 0 ; 0 0 1 ],  Z = [ 0 1 0 ; −1 0 0 ; 0 0 0 ].
(5.4) An arbitrary rotation of three-dimensional space, performed in the order roll angle - yaw angle - pitch angle, corresponds to the rotation matrix

R_zyx = R_x(ψ) · R_y(θ) · R_z(φ),

where R_x(ψ), R_y(θ) and R_z(φ) denote the elementary rotation matrices about the x, y and z axes respectively:

R_x(ψ) = [ 1 0 0 ; 0 cosψ −sinψ ; 0 sinψ cosψ ],
R_y(θ) = [ cosθ 0 sinθ ; 0 1 0 ; −sinθ 0 cosθ ],
R_z(φ) = [ cosφ −sinφ 0 ; sinφ cosφ 0 ; 0 0 1 ].
(5.5) Equating the matrix R of step (5.3) with R_zyx of step (5.4) entry by entry, the attitude angle information is solved as

θ = arcsin(r_13),  ψ = arctan2(−r_23, r_33),  φ = arctan2(−r_12, r_11),

where r_ij is the (i, j) entry of R, ψ denotes the pitch angle, θ the yaw angle, and φ the roll angle.
The beneficial effects of the present invention are as follows. By preprocessing the feature points and re-presenting them in grouped form on the basis of the improved eight-point method, the invention reduces the influence of mismatched data on the algorithm and computes the attitude angles of the motion estimate from the fundamental matrix. The overall procedure is simple to implement and places low demands on experimental tools; results show that it improves computational efficiency while preserving accuracy, and its value is most apparent when the number of feature points is large.
Brief Description of the Drawings
FIG. 1 is a flowchart of the attitude angle estimation method based on feature point set grouping according to the present invention.
Detailed Description of the Embodiments
Referring to FIG. 1, the present invention provides an attitude angle estimation method based on feature point set grouping. First, the SURF algorithm is used to detect and describe image feature points, producing the candidate set of feature points. Preprocessing and group selection are then applied to this preliminary set to form the initial feature point sets, which largely removes the influence of mismatched feature points on the eight-point method and yields a good initial data set. Next, the improved eight-point algorithm computes an initial fundamental matrix for each group of samples between the two images; from each initial value, the mean of the distances from all matching points to their epipolar lines is computed, the means are sorted in ascending order, and the data subset with the smallest mean is selected as the accurate initial fundamental matrix of the algorithm. An SVD of this matrix yields two possible rotation matrices R_1 and R_2, and the final matrix R is selected mainly by comparing the signs of the two matrices. For the order of rotations in three-dimensional space, the invention rotates in the order roll angle - yaw angle - pitch angle to obtain the rotation matrix R_zyx; solving R_zyx against the matrix R then yields the attitude angle information of the motion estimate. The specific steps are as follows.
Step (1): Sort the matching point set S_all of the two initial images by the ratio of horizontal to vertical coordinate change, remove the points whose difference in the sorted sequence of changes exceeds 5, and obtain the preprocessed matching point set S_r. The specific implementation steps are as follows.
(1.1) Extract and match feature points in the two images with the SURF algorithm, obtaining the matching point set S_all.
(1.2) Compute for each match in S_all the ratio of horizontal to vertical coordinate change and sort the matches by this ratio; to remove mismatched data, delete the data points in the sorted column whose difference exceeds 5, giving the preprocessed matching point set S_r.
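The ratio-based preprocessing of steps (1.1)-(1.2) can be sketched as follows in numpy. The exact form of the "difference greater than 5" test is not spelled out in the text, so this sketch assumes it applies to consecutive values of the sorted ratio column; the function and variable names are ours, not the patent's:

```python
import numpy as np

def preprocess_matches(pts1, pts2, thresh=5.0):
    """Sort matches by the ratio of horizontal to vertical displacement and
    drop matches whose ratio jumps by more than `thresh` from the previous
    value in the sorted sequence (assumed reading of the patent's bound 5)."""
    d = pts2 - pts1                        # per-match displacement (dx, dy)
    ratio = d[:, 0] / (d[:, 1] + 1e-12)    # dx/dy, guarded against dy == 0
    order = np.argsort(ratio)
    r = ratio[order]
    keep = np.ones(len(r), dtype=bool)
    keep[1:] = np.abs(np.diff(r)) <= thresh   # reject large jumps
    idx = order[keep]
    return pts1[idx], pts2[idx]
```

A match whose displacement ratio is far from the common camera motion (a likely mismatch) is thus discarded before any fundamental-matrix estimation.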
Step (2): Sort the point set S_r (length N, usually greater than 150) by the horizontal axis. A lower bound K is set so that each group has at least K elements, splitting the set into about num = [N/K] groups. Extensive experiments show that for 8 ≤ K ≤ 15 the accuracy of the algorithm is stable and the overall time efficiency also improves; in this embodiment K = 8.
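A minimal sketch of the grouping in step (2), assuming matches left over beyond the last full group of K are simply dropped (the text only fixes the group count num = [N/K]):

```python
import numpy as np

def group_matches(pts1, pts2, K=8):
    """Sort the preprocessed matches by their x coordinate in image 1 and
    split them into num = N // K consecutive groups of K matches each."""
    order = np.argsort(pts1[:, 0])
    p1, p2 = pts1[order], pts2[order]
    num = len(p1) // K
    return [(p1[i * K:(i + 1) * K], p2[i * K:(i + 1) * K])
            for i in range(num)]
```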
Step (3): Translate and scale each of the num groups of points, and estimate a fundamental matrix F_i (i = 1, 2, 3, ..., num) for each group with the normalized eight-point algorithm.
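The translate-scale-estimate pipeline of step (3) is the classical normalized (Hartley) eight-point algorithm. A self-contained numpy sketch of it, not the patent's own code:

```python
import numpy as np

def normalize(pts):
    """Hartley normalization: translate the points to zero mean and scale
    them so the mean distance from the origin is sqrt(2)."""
    c = pts.mean(axis=0)
    d = np.linalg.norm(pts - c, axis=1).mean()
    s = np.sqrt(2.0) / d
    T = np.array([[s, 0, -s * c[0]],
                  [0, s, -s * c[1]],
                  [0, 0, 1.0]])
    homog = np.column_stack([pts, np.ones(len(pts))])
    return homog @ T.T, T

def eight_point(pts1, pts2):
    """Normalized eight-point estimate of the fundamental matrix from
    K >= 8 correspondences (rows of pts1/pts2 are matched (x, y) points)."""
    x1, T1 = normalize(pts1)
    x2, T2 = normalize(pts2)
    # One row of A per correspondence, from the constraint x2^T F x1 = 0.
    A = np.column_stack([
        x2[:, 0] * x1[:, 0], x2[:, 0] * x1[:, 1], x2[:, 0],
        x2[:, 1] * x1[:, 0], x2[:, 1] * x1[:, 1], x2[:, 1],
        x1[:, 0], x1[:, 1], np.ones(len(x1))])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)            # null vector of A
    U, S, Vt2 = np.linalg.svd(F)        # enforce rank 2
    F = U @ np.diag([S[0], S[1], 0.0]) @ Vt2
    F = T2.T @ F @ T1                   # undo the normalization
    return F / np.linalg.norm(F)        # fix the arbitrary scale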
Step (4): From each group's fundamental matrix F_i (i = 1, 2, 3, ..., num), compute the mean of the sum of distances from all matching points to their epipolar lines; find the position dex of the group with the smallest mean, and use the matching point data at that position to compute the optimal fundamental matrix F_z. The specific implementation steps are as follows.
(4.1) From the fundamental matrices F_i (i = 1, 2, 3, ..., num) computed for the num groups of data, the epipolar lines of the two images in each group are

I1 = M2 · F_i,  I2 = M1 · F_i^T,  (1)

where M1 and M2 are the matched point sets of the group in image 1 and image 2 respectively (in homogeneous row form), and F_i^T is the transpose of F_i; each row of I1 is the epipolar line F_i^T · m2 in image 1 induced by a point m2 of image 2, and likewise each row of I2 is the epipolar line F_i · m1 in image 2.
(4.2) Using each group's fundamental matrix F_i (i = 1, 2, 3, ..., num), the sum of the distances from all matching points to their epipolar lines is

D_avg1_i = Σ_{j=1}^{K} |a(j,1)·x_j + b(j,1)·y_j + c(j,1)| / sqrt(a(j,1)^2 + b(j,1)^2),  (2)

where a(j,1), b(j,1) and c(j,1) are the coefficients of the j-th row (j = 1, 2, 3, ..., 8 in this embodiment) of the group's epipolar lines I1, and (x_j, y_j) is the corresponding matching point in image 1. Equation (2) gives the sum of distances from the matching points of image 1 to their epipolar lines; the sum D_avg2_i for all matching points of image 2 is computed in the same way, and the final mean distance is D_avg_i = mean(D_avg1_i + D_avg2_i), i = 1, 2, 3, ..., num. Among the num candidate groups, the one with the smallest mean yields the optimal fundamental matrix F_z: D_avg_z = min(D_avg_1, D_avg_2, D_avg_3, ..., D_avg_num), where z (1 ≤ z ≤ num) is the position of the smallest mean.
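Steps (4.1)-(4.2) can be sketched as follows; `mean_epipolar_distance` averages the point-to-epipolar-line distance over both images as in equation (2), and `best_fundamental` returns the patent's F_z together with the position z. Function names are ours:

```python
import numpy as np

def mean_epipolar_distance(F, pts1, pts2):
    """Mean point-to-epipolar-line distance over both images. For a match
    (m1, m2), the line in image 2 is l2 = F m1 = (a, b, c) and the distance
    of m2 = (x, y) from it is |a*x + b*y + c| / sqrt(a^2 + b^2)."""
    h1 = np.column_stack([pts1, np.ones(len(pts1))])
    h2 = np.column_stack([pts2, np.ones(len(pts2))])
    l2 = h1 @ F.T        # epipolar lines in image 2, one per row
    l1 = h2 @ F          # epipolar lines in image 1, one per row
    d2 = np.abs(np.sum(l2 * h2, axis=1)) / np.linalg.norm(l2[:, :2], axis=1)
    d1 = np.abs(np.sum(l1 * h1, axis=1)) / np.linalg.norm(l1[:, :2], axis=1)
    return 0.5 * (d1.mean() + d2.mean())

def best_fundamental(Fs, pts1, pts2):
    """Among the per-group estimates F_i, return the one whose mean epipolar
    distance over ALL matches is smallest (the optimal F_z), and its index."""
    errs = [mean_epipolar_distance(F, pts1, pts2) for F in Fs]
    z = int(np.argmin(errs))
    return Fs[z], z
```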
Step (5): Apply a singular value decomposition (SVD) to the optimal fundamental matrix F_z to obtain two candidate rotation matrices R_1 and R_2. Derive the rotation matrix R_zyx corresponding to rotating three-dimensional space in the order roll angle - yaw angle - pitch angle. Select between R_1 and R_2 by requiring that the first component of the matrix be positive and the diagonal components be close to 1, yielding the final matrix R; the attitude angles of the motion estimate are then read off from the components of R. The specific implementation steps are as follows.
(5.1) Apply the singular value decomposition to the obtained optimal fundamental matrix F_z, giving F_z = U diag(1,1,0) V^T.
(5.2) From the factorization F_z = SR of the fundamental matrix,

U diag(1,1,0) V^T = F_z = SR = (U Z U^T)(U X V^T) = U (Z X) V^T,

so ZX = diag(1,1,0). Since X is a rotation matrix, it follows that X = W or X = W^T, where W is an orthogonal matrix and Z is a skew-symmetric matrix; they are generally taken in the following form:

W = [ 0 −1 0 ; 1 0 0 ; 0 0 1 ],  Z = [ 0 1 0 ; −1 0 0 ; 0 0 0 ].
(5.3) From the above derivation, the two possible solutions for the rotation matrix are R_1 = U W V^T or R_2 = U W^T V^T, from which one rotation matrix R is determined, with W as given above.
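Steps (5.1)-(5.3) can be sketched as follows. The sketch treats F_z as an essential matrix (i.e. it assumes calibrated, normalized image coordinates), forces both candidates to proper rotations (det = +1, a fix the patent does not state explicitly), and uses the trace of a candidate as a proxy for the patent's rule that the first component be positive and the diagonal be close to 1:

```python
import numpy as np

W = np.array([[0., -1., 0.],
              [1.,  0., 0.],
              [0.,  0., 1.]])

def rotations_from_F(F):
    """Decompose F (assumed essential) into the two candidate rotations
    R1 = U W V^T and R2 = U W^T V^T from its SVD."""
    U, _, Vt = np.linalg.svd(F)
    R1 = U @ W @ Vt
    R2 = U @ W.T @ Vt
    if np.linalg.det(R1) < 0:   # force proper rotations
        R1 = -R1
    if np.linalg.det(R2) < 0:
        R2 = -R2
    return R1, R2

def pick_rotation(R1, R2):
    """Paraphrase of the patent's selection rule: prefer the candidate whose
    leading entry is positive and whose diagonal is close to 1, i.e. the
    small-rotation member of the twisted pair (largest trace)."""
    score = lambda R: np.trace(R) if R[0, 0] > 0 else -np.inf
    return R1 if score(R1) >= score(R2) else R2
```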
The attitude angle information of the motion estimate is obtained mainly by solving between the rotation matrix decomposed from the fundamental matrix and the rotation matrix corresponding to an arbitrary rotation of three-dimensional space. For the order of rotations in three-dimensional space, the present invention rotates in the order roll angle - yaw angle - pitch angle; the corresponding rotation matrix is the ordered product of three elementary rotations, where R_x(ψ), R_y(θ) and R_z(φ) denote the rotation matrices about the x, y and z axes respectively:

R_x(ψ) = [ 1 0 0 ; 0 cosψ −sinψ ; 0 sinψ cosψ ],
R_y(θ) = [ cosθ 0 sinθ ; 0 1 0 ; −sinθ 0 cosθ ],
R_z(φ) = [ cosφ −sinφ 0 ; sinφ cosφ 0 ; 0 0 1 ].
(5.4) For an arbitrary rotation of three-dimensional space, the present invention rotates in the order roll angle - yaw angle - pitch angle; the corresponding rotation matrix can then be expressed as (writing c and s for cos and sin)

R_zyx = R_x(ψ) R_y(θ) R_z(φ) =
[ cθ·cφ              −cθ·sφ              sθ      ]
[ cψ·sφ + sψ·sθ·cφ    cψ·cφ − sψ·sθ·sφ   −sψ·cθ  ]
[ sψ·sφ − cψ·sθ·cφ    sψ·cφ + cψ·sθ·sφ    cψ·cθ  ].
(5.5) Equating the matrix R of step (5.3) with R_zyx of step (5.4) entry by entry, the attitude angle information is solved as the pitch angle ψ, yaw angle θ, and roll angle φ:

θ = arcsin(r_13),  ψ = arctan2(−r_23, r_33),  φ = arctan2(−r_12, r_11),

where r_ij is the (i, j) entry of R.
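The closed-form extraction in step (5.5) can be sketched and checked numerically. The symbol psi for the pitch angle is our choice (the patent's own symbol was an image that did not survive extraction), and the convention R = R_x(psi) @ R_y(theta) @ R_z(phi) matches step (5.4):

```python
import numpy as np

def euler_zyx(R):
    """Recover (pitch psi, yaw theta, roll phi) from a rotation matrix
    R = Rx(psi) @ Ry(theta) @ Rz(phi), i.e. a rotation applied in the
    patent's roll - yaw - pitch order."""
    theta = np.arcsin(np.clip(R[0, 2], -1.0, 1.0))   # yaw, about y
    psi = np.arctan2(-R[1, 2], R[2, 2])              # pitch, about x
    phi = np.arctan2(-R[0, 1], R[0, 0])              # roll, about z
    return psi, theta, phi
```

For small attitude changes between consecutive frames (|theta| well below 90 degrees), this extraction is unambiguous; near theta = ±90 degrees the usual gimbal-lock degeneracy applies.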
By preprocessing the feature points and re-presenting them in grouped form on the basis of the improved eight-point method, the present invention reduces the influence of mismatched data on the algorithm and computes the attitude angles of the motion estimate from the fundamental matrix. The overall algorithm is simple to implement and places low demands on experimental tools; results show that it improves computational efficiency while preserving accuracy, and its value is most apparent when the number of feature points is large.
The content described in the embodiments of this specification is merely an enumeration of the forms in which the inventive concept may be realized. The protection scope of the present invention should not be regarded as limited to the specific forms stated in the embodiments; it also extends to equivalent technical means that a person skilled in the art can conceive on the basis of the inventive concept.
Claims (3)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710896806.7A CN107798705B (en) | 2017-09-28 | 2017-09-28 | An Attitude Angle Estimation Method Based on Feature Point Set Grouping |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710896806.7A CN107798705B (en) | 2017-09-28 | 2017-09-28 | An Attitude Angle Estimation Method Based on Feature Point Set Grouping |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107798705A CN107798705A (en) | 2018-03-13 |
CN107798705B true CN107798705B (en) | 2020-06-16 |
Family
ID=61533918
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710896806.7A Expired - Fee Related CN107798705B (en) | 2017-09-28 | 2017-09-28 | An Attitude Angle Estimation Method Based on Feature Point Set Grouping |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107798705B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104880187A (en) * | 2015-06-09 | 2015-09-02 | 北京航空航天大学 | Dual-camera-based motion estimation method of light stream detection device for aircraft |
CN106920259A (en) * | 2017-02-28 | 2017-07-04 | 武汉工程大学 | A kind of localization method and system |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8050458B2 (en) * | 2007-06-18 | 2011-11-01 | Honda Elesys Co., Ltd. | Frontal view imaging and control device installed on movable object |
-
2017
- 2017-09-28 CN CN201710896806.7A patent/CN107798705B/en not_active Expired - Fee Related
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104880187A (en) * | 2015-06-09 | 2015-09-02 | 北京航空航天大学 | Dual-camera-based motion estimation method of light stream detection device for aircraft |
CN106920259A (en) * | 2017-02-28 | 2017-07-04 | 武汉工程大学 | A kind of localization method and system |
Non-Patent Citations (2)
Title |
---|
Demet Cilden et al., "Attitude and attitude rate estimation for a nanosatellite using SVD and UKF," 2015 7th International Conference on Recent Advances in Space Technologies, 2015-06-19, full text *
Xiong Guangyang et al., "Point cloud and image fusion for monocular camera pose estimation" (in Chinese), Science of Surveying and Mapping, vol. 41, no. 2, Feb. 2016, full text *
Also Published As
Publication number | Publication date |
---|---|
CN107798705A (en) | 2018-03-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108038902B (en) | High-precision three-dimensional reconstruction method and system for depth camera | |
CN106845515B (en) | Robot target identification and pose reconstruction method based on virtual sample deep learning | |
CN105913489B (en) | An Indoor 3D Scene Reconstruction Method Using Plane Features | |
CN104019799B (en) | A Relative Orientation Method for Computing Fundamental Matrix Using Local Parameter Optimization | |
CN104748683B (en) | A kind of on-line automatic measurement apparatus of Digit Control Machine Tool workpiece and measuring method | |
CN102750704B (en) | Step-by-step video camera self-calibration method | |
CN105354841B (en) | A kind of rapid remote sensing image matching method and system | |
CN105021124A (en) | Planar component three-dimensional position and normal vector calculation method based on depth map | |
CN113393524B (en) | Target pose estimation method combining deep learning and contour point cloud reconstruction | |
Byrod et al. | Improving numerical accuracy of gröbner basis polynomial equation solvers | |
CN109766903B (en) | Point cloud model curved surface matching method based on curved surface features | |
Belter et al. | Improving accuracy of feature-based RGB-D SLAM by modeling spatial uncertainty of point features | |
CN102663351A (en) | Face characteristic point automation calibration method based on conditional appearance model | |
CN112329726B (en) | Face recognition method and device | |
CN104036542A (en) | Spatial light clustering-based image surface feature point matching method | |
CN102404595A (en) | Epipolar line correction method capable of providing 3D program shooting guidance | |
CN108280858A (en) | A kind of linear global camera motion method for parameter estimation in multiple view reconstruction | |
Sweeney et al. | Computing similarity transformations from only image correspondences | |
CN109870106A (en) | A method of building volume measurement based on UAV images | |
CN108597016A (en) | Torr-M-Estimators basis matrix robust estimation methods based on joint entropy | |
CN116817920A (en) | Visual positioning method and device for plane mobile robot without three-dimensional map model | |
CN107330934B (en) | Low-dimensional cluster adjustment calculation method and system | |
CN111932628B (en) | A method and device for determining posture, electronic device, and storage medium | |
CN107798705B (en) | An Attitude Angle Estimation Method Based on Feature Point Set Grouping | |
CN113436249B (en) | Rapid and stable monocular camera pose estimation algorithm |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20200616 |
|
CF01 | Termination of patent right due to non-payment of annual fee |