CN113587904A - Target attitude and position measurement method integrating machine vision and laser reference point information - Google Patents

Target attitude and position measurement method integrating machine vision and laser reference point information

Info

Publication number
CN113587904A
CN113587904A (application CN202110866493.7A); granted publication CN113587904B
Authority
CN
China
Prior art keywords
reference point
laser reference
point
pose
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110866493.7A
Other languages
Chinese (zh)
Other versions
CN113587904B (en)
Inventor
张高鹏
彭建伟
梅超
廖加文
任龙
黄继江
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
XiAn Institute of Optics and Precision Mechanics of CAS
Original Assignee
XiAn Institute of Optics and Precision Mechanics of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by XiAn Institute of Optics and Precision Mechanics of CAS filed Critical XiAn Institute of Optics and Precision Mechanics of CAS
Priority to CN202110866493.7A priority Critical patent/CN113587904B/en
Publication of CN113587904A publication Critical patent/CN113587904A/en
Application granted granted Critical
Publication of CN113587904B publication Critical patent/CN113587904B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/04 Interpretation of pictures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/10 Complex mathematical operations
    • G06F 17/16 Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization

Abstract

The invention belongs to the field of spatial target pose measurement and relates to a target pose measurement method that fuses machine vision and laser reference point information, applied to non-cooperative target pose measurement in spatial near-field operation and on-orbit servicing. The method addresses the lower operational efficiency of the traditional orthogonal iterative algorithm relative to non-iterative algorithms. Using laser reference points as the spatial feature points of a monocular-vision pose measurement method, it streamlines the intermediate steps of the traditional orthogonal iterative algorithm: the intermediate values of the translation vector are eliminated from the iterative loop to accelerate the solution, and the initial value of the rotation matrix is solved with a parallel perspective projection model, improving both the performance and the efficiency of the algorithm.

Description

Target attitude and position measurement method integrating machine vision and laser reference point information
Technical Field
The invention belongs to the field of spatial target pose measurement and particularly relates to a target pose measurement method that fuses machine vision and laser reference point information, applied to non-cooperative target pose measurement in spatial near-field operation and on-orbit servicing.
Background
Accurate measurement of the relative position and attitude (collectively, pose) of a space target is key to tasks such as space rendezvous and docking, attack-defense confrontation, and on-orbit capture and maintenance. Compared with binocular and multi-view vision measurement, pose measurement based on monocular vision offers a simple system structure, simple and clear camera-calibration steps, a wide measurement field of view, low cost, and high real-time efficiency. Owing to these advantages, monocular-vision pose measurement is widely applied in the field of space non-cooperative target pose measurement. Monocular-vision pose measurement methods fall broadly into two categories. The first solves the pose parameters from the known geometric shape of the target to be measured, for example the circle/ellipse method or the target aspect-ratio method. The second solves the pose from target feature-marker information. The first requires the specific geometric shape of the target to be known; the second is not limited by the target's geometry and applies to regular targets as well as targets of various irregular shapes. The second category therefore has higher research and practical value and has gradually become a research hotspot in monocular-vision pose measurement. According to the feature-marker information used, it can be further divided into point-feature-based and line-feature-based monocular-vision pose measurement techniques.
The point-feature-based target pose estimation problem is also called the Perspective-n-Point (PnP) problem: given the intrinsic parameters of a camera, the coordinates of n spatial feature points in the target coordinate system, and the pixel coordinates of their corresponding image points in the image coordinate system, solve for the pose of the target in the camera coordinate system. Classified by solution method, PnP approaches divide into non-iterative and iterative methods. The most classical non-iterative method is the Direct Linear Transform (DLT), originally used for camera calibration; after modification it can solve the pose from at least 4 coplanar or at least 6 non-coplanar feature points. However, DLT cannot guarantee the orthogonality of the rotation matrix, is very sensitive to noise, and has low robustness. Iterative methods generally take the image-plane reprojection error as the objective function and solve by nonlinear optimization such as the Gauss-Newton or Levenberg-Marquardt method, minimizing the objective to obtain the pose with high accuracy and robustness. Their drawbacks are that the solution accuracy depends strongly on the initial value of the rotation matrix, the amount of computation is large, and the efficiency is low.
To address these shortcomings, DeMenthon et al. proposed the POSIT (Pose from Orthography and Scaling with ITerations) algorithm, which obtains an initial rotation matrix from a weak perspective projection model and then refines the pose with a related iterative procedure; POSIT has high accuracy, but the rotation matrix it produces is not the optimal one. Horaud proposed a pose solution based on parallel perspective iteration, replacing the weak perspective projection model with a parallel perspective model; this improves the stability of the algorithm and yields better solution speed. However, this algorithm likewise cannot guarantee the orthogonality of the rotation matrix; other methods are needed to find the closest orthogonal matrix, and the result, though close to the true rotation, is still not optimal. To solve this problem, Lu et al. proposed the orthogonal iterative algorithm ("Fast and globally convergent pose estimation from video images", C.-P. Lu, G. D. Hager, E. Mjolsness, IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 22, pp. 610-622, 2000), which takes the object-space collinearity error as the objective function and solves directly for an orthogonal rotation matrix. Because it has high accuracy, good robustness, and global convergence, and its accuracy does not depend on the initial value of the rotation matrix, it has gradually become one of the research hotspots of monocular-vision pose measurement. Nevertheless, it remains an iterative algorithm in nature, so its operational efficiency is necessarily lower than that of non-iterative algorithms; a space camera performing space-target pose measurement demands higher computational efficiency of the iterative algorithm, which therefore needs to be improved.
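As context for the orthogonality issue mentioned above: projecting a noisy, non-orthogonal estimate onto the closest rotation matrix is commonly done with the SVD (the orthogonal Procrustes solution). The sketch below is illustrative only and not part of the patent; the function name and test matrix are assumptions.

```python
import numpy as np

def nearest_rotation(M):
    """Return the rotation matrix closest to M in Frobenius norm.

    With M = U S V^T, the closest proper rotation is U V^T,
    with a sign fix on the last column so that det(R) = +1.
    """
    U, _, Vt = np.linalg.svd(M)
    R = U @ Vt
    if np.linalg.det(R) < 0:      # reflection: flip a column to restore det = +1
        U[:, -1] = -U[:, -1]
        R = U @ Vt
    return R

# A noisy, slightly non-orthogonal estimate of a rotation:
M = np.array([[0.98, -0.17, 0.05],
              [0.20,  0.95, 0.02],
              [-0.04, 0.03, 1.01]])
R = nearest_rotation(M)
print(np.allclose(R @ R.T, np.eye(3)))  # orthogonality restored
```
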
Disclosure of Invention
To solve the problem that the traditional orthogonal iterative algorithm is less efficient than non-iterative algorithms, the invention provides a method for measuring the pose of a spatial non-cooperative target that fuses monocular vision and laser reference point information.
The technical scheme of the invention provides a target pose measurement method fusing machine vision and laser reference point information, characterized by comprising the following steps:
step 1, taking laser reference points as spatial feature points, imaging the laser reference points, and extracting the coordinates of their imaging points in the camera pixel coordinate system;

step 2, constructing the object-space collinearity error and iteratively solving the attitude-parameter rotation matrix R with the minimum of the object-space collinearity error as the objective function;

step 2.1, determining the initial value of the rotation matrix;

step 2.2, determining the line-of-sight projection matrix W_i of the imaging point of each laser reference point using equation (7):

W_i = (v_i v_i^T) / (v_i^T v_i)    (7)

where p_i is the coordinate of the i-th laser reference point in the target coordinate system and v_i is the homogeneous coordinate of the normalized image point corresponding to the i-th laser reference point;

step 2.3, using the line-of-sight projection matrices W_i, the initial value of the rotation matrix R, and the translation vector t, constructing the object-space collinearity error:

E(R, t) = sum_{i=1..n} || (I - W_i)(R p_i + t) ||^2    (9)

where the translation vector t is solved with equation (10):

t(R) = (1/n) (I - (1/n) sum_{j=1..n} W_j)^{-1} sum_{i=1..n} (W_i - I) R p_i    (10)

and I is the identity matrix;

step 2.4, returning to step 2.3, obtaining the optimal solution of the absolute orientation problem by singular value decomposition (SVD), then iteratively updating the rotation matrix R until the iteration termination condition is met;

step 2.5, taking the minimum object-space collinearity error obtained during the iterative updating as the objective function of the pose estimation:

R* = argmin_R sum_{i=1..n} || (I - W_i)(R p_i + t(R)) ||^2    (11)

the optimal solution of the translation vector being:

t* = t(R*)    (12)
Further, step 2.1 solves the initial value of the rotation matrix using a parallel perspective model:

Let the homogeneous coordinates of the laser reference point set {P_i} be P_i = (X_i, Y_i, Z_i, 1)^T, the homogeneous coordinates of the corresponding image point set {p_i} (normalized pixel homogeneous coordinates) be p_i = (x_i, y_i, 1)^T, the homogeneous coordinate of the centroid of {P_i} be (X_0, Y_0, Z_0, 1)^T, and the homogeneous coordinate of the centroid image point be (x_0, y_0, 1)^T. The initial value R_0 of the rotation matrix is then given by equation (5) [rendered only as an image in the source]; its entries are expressed through two three-dimensional column vectors and the skew-symmetric (antisymmetric) matrices corresponding to them, and the two column vectors are determined by solving the linear system of equation (6) [likewise an image in the source].
further, step 1 specifically includes the following steps:
step 1.1, recording the spatial three-dimensional coordinates of the laser reference point;
step 1.2, imaging the laser reference point, and preprocessing the laser reference point imaging picture, wherein the preprocessing mainly comprises color image graying processing, image filtering, histogram equalization, edge sharpening and image denoising processing;
and step 1.3, extracting coordinates of an imaging point of the laser reference point in a camera pixel coordinate system by adopting a Harris angular point extraction algorithm.
Further, step 2 is followed by step 3, in which the measurement accuracy is analyzed by an orthogonal simulation test:

step 3.1, designing the orthogonal test;
Five factors are taken as the main factors influencing measurement accuracy: the image-coordinate extraction accuracy of the spatial feature points, the spatial-coordinate accuracy of the spatial feature points, the camera principal-point calibration accuracy, the normalized-focal-length calibration accuracy, and the number of spatial feature points. Three typical levels are chosen for each factor, using an L18(3^5) orthogonal table;

step 3.2, defining the robustness indexes of the product quality characteristics;
The pose-parameter errors (dA_x, dA_y, dA_z, dP_x, dP_y, dP_z) are taken respectively as the robustness indexes of the product quality characteristics, where dA_x, dA_y, dA_z are the rotation-angle errors about the x, y, and z directions and dP_x, dP_y, dP_z are the position errors in the x, y, and z directions;

step 3.3, analyzing the signal-to-noise ratio;
Eighteen groups of simulation tests are established based on the L18(3^5) table, and the signal-to-noise ratios of the different pose-parameter errors are calculated from the results of the 18 groups of simulation tests;

step 3.4, calculating the range of the signal-to-noise ratio;
The ranges of the signal-to-noise ratios of the different pose-parameter errors calculated in step 3.3 are obtained; the larger the range, the higher the influence level of the corresponding factor and the greater its influence;

step 3.5, calculating the influence ranking of each factor on each pose parameter;
Based on the range results of step 3.4, the influence ranking of each factor on each pose parameter is calculated, and the primary factors influencing the accuracy of the different pose parameters are determined.
The invention has the following beneficial effects:

1. The invention uses laser reference points as the spatial feature points of the monocular-vision pose measurement method and improves the traditional orthogonal iterative algorithm. In the traditional algorithm, the rotation matrix R and the translation vector t are each computed once in every iteration; in the improved algorithm, each iteration only updates the rotation matrix R, and the translation vector t does not need to be solved in each iteration. In other words, only the optimal solution of the translation vector t needs to be output after the last iteration. The intermediate values of t are thus eliminated from the iterative process, reducing the amount of computation per iteration and improving operational efficiency.
2. The method uses a parallel perspective model to solve the initial value of the rotation matrix. In theory the initial value R_0 of the rotation matrix can be chosen arbitrarily; in practice, however, the choice of R_0 strongly affects the operational efficiency of the algorithm, and a poorly chosen R_0 makes the computation very large and time-consuming. Solving the initial rotation matrix with the parallel perspective model further improves operational efficiency.
Drawings
FIG. 1 is a flow chart of the method of the present invention;
fig. 2 is a schematic diagram of an orthogonal iterative algorithm.
Detailed Description
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the drawings, which show only one embodiment of the invention and do not limit it in any way; any simple modification, equivalent change, or variation of the above embodiment according to the technical essence of the invention still falls within the scope of the technical solutions of the invention.
As shown in fig. 1, the basic work flow of the present invention includes the following steps:
Step 1, irradiating the target with a laser to form laser reference points on the target surface, taking the laser reference points as spatial feature points, imaging them, and extracting the coordinates of the imaging points in the camera pixel coordinate system with the Harris corner extraction algorithm.

Step 1.1, first recording the spatial three-dimensional coordinates of the laser reference points;

Step 1.2, preprocessing the laser reference point image, the preprocessing mainly comprising color-image graying, image filtering, histogram equalization, edge sharpening, and image denoising.

Step 1.3, performing corner detection with the Harris corner extraction algorithm and extracting the image pixel coordinates of the imaging point of each laser reference point in the image.
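For illustration only (not the patent's implementation, which would typically use an optimized library routine), the Harris corner response underlying step 1.3 can be sketched in pure NumPy; the box filter standing in for the usual Gaussian window, and the test image, are assumptions.

```python
import numpy as np

def harris_response(img, k=0.04):
    """Harris corner response R = det(M) - k * trace(M)^2 per pixel.

    img: 2-D float array (grayscale). A 3x3 box filter stands in for
    the usual Gaussian window to keep the sketch dependency-free.
    """
    Iy, Ix = np.gradient(img)               # central-difference gradients

    def box3(a):                            # crude 3x3 box smoothing
        p = np.pad(a, 1, mode="edge")
        return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
                   for i in range(3) for j in range(3)) / 9.0

    Sxx, Syy, Sxy = box3(Ix * Ix), box3(Iy * Iy), box3(Ix * Iy)
    return Sxx * Syy - Sxy ** 2 - k * (Sxx + Syy) ** 2

# A bright square on a dark background: corners should score highest.
img = np.zeros((20, 20))
img[5:15, 5:15] = 1.0
R = harris_response(img)
print(R[5, 5] > R[5, 10])   # corner response exceeds edge response
```

In practice a detector would also threshold the response and apply non-maximum suppression before reading off sub-pixel corner coordinates.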
Step 2, constructing the object-space collinearity error and iteratively solving the attitude-parameter rotation matrix with the minimum of the object-space collinearity error as the objective function.

Step 2.1, solving the initial value of the rotation matrix using a parallel perspective model.

In theory the initial value R_0 of the rotation matrix can be chosen arbitrarily; in practice, however, its choice has a major influence on the operational efficiency of the algorithm, and a poorly chosen R_0 makes the computation very large and time-consuming. In this method, a parallel perspective model is used to solve the initial value of the rotation matrix.
Under the parallel perspective model, the projection proceeds in two steps. First, the object is projected onto a plane that passes through the centroid and is parallel to the image plane; the projection lines are not parallel to the optical axis but to the line connecting the camera optical center and the centroid. The projected point is then projected perspectively onto the image plane. With normalized focal length, the parallel perspective model can be expressed as:

x = (X + X_0 - (X_0 / Z_0) Z) / Z_0,  y = (Y + Y_0 - (Y_0 / Z_0) Z) / Z_0    (1)

where (X_0, Y_0, Z_0) are the coordinates of the centroid. Equations (2) and (3) [rendered only as images in the source] give the equivalent matrix form of the model, whose coefficient matrix is called the parallel perspective projection matrix.

Writing the actual coordinates of a three-dimensional point P as (X, Y, Z)^T = (X_0 + dX, Y_0 + dY, Z_0 + dZ)^T, the imaging error I_err of the parallel perspective model follows from a Taylor expansion, equation (4) [image in the source].
the equation (4) shows that the imaging error of the parallel perspective model is second-order infinitesimal of the three-dimensional point coordinates. And the image point error under the amblyopia model is the first order infinitesimal of the three-dimensional point coordinates. Therefore, in the present invention, the solution of the initial value of the rotation matrix is performed using a parallel perspective model.
Let the homogeneous coordinates of the three-dimensional point set {P_i} be P_i = (X_i, Y_i, Z_i, 1)^T, the homogeneous coordinates of the corresponding image point set {p_i} be p_i = (x_i, y_i, 1)^T, the homogeneous coordinate of the centroid of {P_i} be (X_0, Y_0, Z_0, 1)^T, and the homogeneous coordinate of the centroid image point be (x_0, y_0, 1)^T. The initial value R_0 of the rotation matrix is then given by equation (5) [rendered only as an image in the source]; its entries are expressed through two three-dimensional column vectors and the skew-symmetric (antisymmetric) matrices corresponding to them, and the two column vectors are determined by solving the linear system of equation (6) [likewise an image in the source]. The initial value R_0 of the rotation matrix for the orthogonal iterative algorithm is thus obtained from equation (6).
Step 2.2, suppose there are n laser reference points in the target coordinate system, the coordinate of the i-th laser reference point in the target coordinate system being p_i and the corresponding normalized image-point homogeneous coordinate being v_i. The line-of-sight projection matrix of the image point is:

W_i = (v_i v_i^T) / (v_i^T v_i)    (7)

where W_i is called the line-of-sight projection matrix, the line of sight being the ray from the optical center through the image point. Applying the line-of-sight projection matrix to a laser reference point yields its projection onto the line of sight of the corresponding image point.
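As an illustration (not part of the patent), the line-of-sight projection matrix of equation (7) and its two defining properties, idempotence and projection onto the ray, can be checked directly in NumPy; the sample image point and 3-D point are assumptions.

```python
import numpy as np

def sight_projection(v):
    """Line-of-sight projection matrix W = v v^T / (v^T v), equation (7).

    v: homogeneous coordinate (x, y, 1) of a normalized image point.
    W projects any camera-frame point orthogonally onto the ray from
    the optical center through that image point.
    """
    v = np.asarray(v, dtype=float).reshape(3, 1)
    return (v @ v.T) / float(v.T @ v)

v = np.array([0.2, -0.1, 1.0])           # a normalized image point
W = sight_projection(v)
P = np.array([1.0, 2.0, 5.0])            # an arbitrary camera-frame point

print(np.allclose(W @ W, W))             # W is idempotent: W^2 = W
print(np.allclose(np.cross(W @ P, v), 0))  # W P lies on the line of sight
```
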
Step 2.3, constructing the object-space collinearity error.

The basic principle of the orthogonal iterative algorithm is that each spatial feature point should coincide with its projection onto the line of sight of the corresponding image point, a relationship described by the object-space collinearity equation:

R p_i + t = W_i (R p_i + t)    (8)

where R and t are the pose of the camera in the target coordinate system, R being the rotation matrix and t the translation vector.

From equation (8), the object-space collinearity error can be constructed:

E(R, t) = sum_{i=1..n} || (I - W_i)(R p_i + t) ||^2    (9)

The invention solves the translation vector t in equation (9) using equation (10):

t(R) = (1/n) (I - (1/n) sum_{j=1..n} W_j)^{-1} sum_{i=1..n} (W_i - I) R p_i    (10)

where I is the identity matrix.
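The closed-form translation of equation (10) can be exercised on synthetic data; the sketch below is illustrative (identity rotation, noise-free projections, arbitrary sample values) and not the patent's implementation.

```python
import numpy as np

def optimal_translation(R, pts, Ws):
    """Closed-form optimal translation, equation (10):

        t(R) = (1/n) (I - (1/n) sum_j W_j)^{-1} sum_i (W_i - I) R p_i

    R: 3x3 rotation; pts: (n, 3) reference points in the target frame;
    Ws: list of n line-of-sight projection matrices W_i.
    """
    n = len(pts)
    I = np.eye(3)
    W_mean = sum(Ws) / n
    rhs = sum((Wi - I) @ (R @ p) for Wi, p in zip(Ws, pts)) / n
    return np.linalg.solve(I - W_mean, rhs)

# Synthetic check: with exact projections and the true R, t is recovered.
rng = np.random.default_rng(0)
pts = rng.uniform(-1, 1, (6, 3))
t_true = np.array([0.2, -0.3, 8.0])
Ws = []
for p in pts:
    Pc = p + t_true                       # camera-frame point (R = I here)
    v = (Pc / Pc[2]).reshape(3, 1)        # normalized image point (x, y, 1)
    Ws.append((v @ v.T) / float(v.T @ v))
t = optimal_translation(np.eye(3), pts, Ws)
print(np.allclose(t, t_true))
```
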
Step 2.4, returning to step 2.3, the optimal solution of the absolute orientation problem is obtained by singular value decomposition (SVD), and R is then updated iteratively until the iteration termination condition is met, as shown in FIG. 2. In this embodiment, the termination condition is that the relative variation of the object-space collinearity error is smaller than a preset threshold; specifically, if the relative variation of the object-space collinearity error over five consecutive iterations is smaller than 0.001 mm, the iteration is considered to have stabilized and the whole iterative process terminates.
Step 2.5, from equation (9), the objective function of the pose estimation during iteration is:

R* = argmin_R sum_{i=1..n} || (I - W_i)(R p_i + t) ||^2    (11)

For equation (11), if the rotation matrix R is known, the optimal solution of the translation vector is

t* = t(R)    (12)

with t(R) as in equation (10). The problem is thus a minimization over the single variable R, which can be solved iteratively for the optimal value. The initial value of the iterative optimization is R_0; R is updated by obtaining the optimal solution of the absolute orientation problem with singular value decomposition (SVD), and this process continuously iterates and optimizes R and t.
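The accelerated loop described above (update only R each pass, recover t once at the end) can be sketched in NumPy. This is an illustrative reconstruction under the stated equations, not the patent's implementation; it uses a fixed iteration count in place of the collinearity-error stopping rule, and the synthetic pose is an assumption.

```python
import numpy as np

def sight_matrix(v):
    """Line-of-sight projection matrix of a normalized image point."""
    v = np.asarray(v, float).reshape(3, 1)
    return (v @ v.T) / float(v.T @ v)

def best_rotation(src, dst):
    """Absolute-orientation step: the rotation minimizing
    sum_i ||R s_i - d_i||^2 over centered point sets, via SVD."""
    U, _, Vt = np.linalg.svd(src.T @ dst)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    return Vt.T @ S @ U.T

def orthogonal_iteration(pts, img_pts, R0, n_iter=50):
    """Accelerated orthogonal iteration: only R is updated per pass;
    the optimal t is recovered once, at the end, from the closed form.

    pts: (n, 3) reference points in the target frame.
    img_pts: (n, 2) normalized image coordinates (x_i, y_i).
    """
    n = len(pts)
    Ws = [sight_matrix(np.append(v, 1.0)) for v in img_pts]
    I, W_mean = np.eye(3), sum(Ws) / n

    def t_of(R):                      # closed-form t(R), equation (10)
        rhs = sum((W - I) @ (R @ p) for W, p in zip(Ws, pts)) / n
        return np.linalg.solve(I - W_mean, rhs)

    R = R0
    for _ in range(n_iter):
        t = t_of(R)
        # Project camera-frame points onto their lines of sight ...
        Q = np.array([W @ (R @ p + t) for W, p in zip(Ws, pts)])
        # ... then re-fit R by absolute orientation on centered sets.
        R = best_rotation(pts - pts.mean(axis=0), Q - Q.mean(axis=0))
    return R, t_of(R)

# Synthetic check: recover a known pose from exact projections.
rng = np.random.default_rng(1)
pts = rng.uniform(-1, 1, (8, 3))
th = 0.3
R_true = np.array([[np.cos(th), -np.sin(th), 0.0],
                   [np.sin(th),  np.cos(th), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.1, -0.2, 6.0])
cam = (R_true @ pts.T).T + t_true
img = cam[:, :2] / cam[:, 2:3]
R, t = orthogonal_iteration(pts, img, np.eye(3))
print(np.allclose(R, R_true, atol=1e-6), np.allclose(t, t_true, atol=1e-6))
```

A production version would replace the fixed iteration count with the relative-variation stopping rule of step 2.4.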
Each iteration of the traditional orthogonal iterative algorithm computes the rotation matrix R and the translation vector t separately. In practice, however, after every update of R the optimal t can be solved linearly; each iteration is therefore really an iteration of the rotation matrix R alone, and the translation vector t need not be solved in every iteration. In other words, only the optimal solution of the translation vector t needs to be output after the last iteration. The invention thus eliminates the intermediate values of t from the iterative process, reducing the amount of computation per iteration and improving operational efficiency.
Step 3, quantitatively analyzing the accuracy of the pose measurement method based on an orthogonal simulation test.

Step 3.1, designing the orthogonal test.
The Taguchi method was created by the renowned Japanese quality-management expert Genichi Taguchi. By adjusting design parameters, it makes the function and performance of a product insensitive to sources of deviation, improving the product's resistance to interference. To describe product quality loss quantitatively, Taguchi proposed the concept of the quality loss function and measures the robustness of design parameters by the signal-to-noise ratio. The Taguchi method emphasizes that product design must pass through three stages, system design, parameter design, and tolerance design, of which parameter design is the core. Its main feature is to introduce the quality loss function and convert it into a signal-to-noise ratio; through statistical analysis of an experimental scheme built on orthogonal experimental design, the optimal combination of parameter levels is found, improving product performance and quality. Because an orthogonal experiment is uniformly dispersed and neatly comparable, arranging experiments with an orthogonal table is representative and comprehensively reflects the approximate influence of each level of each factor on the index.
TABLE 1 Numbering and levels of the influencing factors [table rendered only as an image in the source]
Based on the idea of the Taguchi method and the results of simulation analysis, the invention quantitatively studies, through reasonably designed orthogonal experiments, the influence on the final pose measurement result of the image-coordinate extraction accuracy of the spatial feature points, the spatial-coordinate accuracy of the spatial feature points, the camera principal-point calibration accuracy, the normalized-focal-length calibration accuracy, and the number of spatial feature points. The level numbers of the factors are shown in Table 1; each factor has three typical levels. Computing every combination would require 3^5 = 243 different combinations, a very time-consuming task; as shown in Table 2, with the L18(3^5) orthogonal table only 18 combinations need to be calculated, greatly accelerating the analysis.
TABLE 2 The L18(3^5) orthogonal table [table rendered only as an image in the source]
Step 3.2, defining the robustness indexes of the product quality characteristics.

For the pose measurement of a spatial target, the smaller the error of the measurement result, the better. The invention therefore takes the errors of the pose measurement result (dA_x, dA_y, dA_z, dP_x, dP_y, dP_z) respectively as the robustness indexes of the product quality characteristics (i.e., the objective functions of the quantitative accuracy analysis of the algorithm).
Step 3.3, analyzing the signal-to-noise ratio.

The Taguchi method measures the stability of product quality by the signal-to-noise ratio (SNR). The analysis uses an orthogonal test table with the SNR as the evaluation index of product robustness, helping researchers find which level of each control factor is more effective. In the invention, 18 groups of simulation tests were established based on Table 2; their results are shown in Table 3.
TABLE 3 Results of the simulation tests [table rendered only as an image in the source]
The signal-to-noise ratios of the different pose-parameter errors are calculated from the 18 groups of simulation results shown in Table 3; the calculation of the signal-to-noise ratio SNR_dPz of dP_z is taken as an example.

The signal-to-noise ratio (S/N) of dP_z is obtained from the smaller-the-better formula:

S/N = -10 log10( (1/n) sum_{i=1..n} y_i^2 )    (13)

where y_i is the error of the i-th repetition and n represents the number of times the test was repeated. Since each simulation result is itself the mean square of 100 tests, n is taken as 1. Based on equation (13), the SNR_dPz of dP_z in the 18 groups of simulation tests is obtained; the results are shown in the last column of Table 4. The signal-to-noise ratios of the other pose-parameter errors (dA_x, dA_y, dA_z, dP_x, dP_y) are obtained in the same way and are also shown in Table 4.
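For illustration only, the smaller-the-better signal-to-noise ratio of equation (13) is a one-liner; the sample error values below are assumptions, not values from the patent's tables.

```python
import numpy as np

def snr_smaller_better(y):
    """Taguchi smaller-the-better signal-to-noise ratio:

        S/N = -10 * log10( (1/n) * sum(y_i^2) )

    With n = 1 (each simulation result already being a mean square
    over 100 runs), this reduces to -10 * log10(y^2).
    """
    y = np.atleast_1d(np.asarray(y, dtype=float))
    return -10.0 * np.log10(np.mean(y ** 2))

# Smaller errors give larger (better) signal-to-noise ratios:
print(snr_smaller_better(0.01))                            # 40.0 dB
print(snr_smaller_better(0.01) > snr_smaller_better(0.1))  # True
```
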
TABLE 4 SNR of each pose-parameter error [table rendered only as an image in the source]
Step 3.4, calculating the range of the signal-to-noise ratio. The ranges of the signal-to-noise ratios of the different pose-parameter errors calculated in step 3.3 are obtained; the larger the range, the higher the influence level of the corresponding factor and the greater its influence.

Specifically, in Table 5, the first entry of row T_1 is the sum of the signal-to-noise ratios of the 6 trials in which factor A is at its first level; the other entries of rows T_1 to T_3 are obtained in the same way. The range of the signal-to-noise ratio is the difference between the maximum and minimum values among T_1 to T_3; the larger the difference, the higher the influence level of the factor.
TABLE 5 parameter pairs Δ PzAnalysis of the contribution ratio of
[Table 5 is rendered as an image (BDA0003187644280000152) in the original publication.]
Step 3.5: calculate the influence ranking of each influence factor. Based on the range calculation results of step 3.4, the influence ranking of each factor is computed, and the primary factors affecting the precision of the different pose parameters are determined.
The influence contribution rate and the ranking of each geometric factor are obtained according to the following method:
Ri = max Tj − min Tj  (j = 1, 2, 3)    (14)

ηi = Ri / Σj Rj × 100%    (15)
The signal-to-noise ratios of the other five pose measurement errors (ΔAx, ΔAy, ΔAz, ΔPx, ΔPy) and the influence of each factor on them are calculated by the same method; the processes are similar and are not repeated here.
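Formula (14) and the ranking step can be combined in a short sketch; note that the contribution-rate normalization Ri/ΣRj is an assumption here, since formula (15) is only available as an image in the source:

```python
def contribution_ranking(ranges):
    """Rank factors by contribution rate R_i / sum(R_j) (assumed
    normalization of the per-factor SNR ranges)."""
    total = sum(ranges.values())
    rates = {f: r / total for f, r in ranges.items()}
    ranking = sorted(rates, key=rates.get, reverse=True)
    return rates, ranking

# Toy ranges for three factors (illustrative values):
rates, ranking = contribution_ranking({"A": 24.0, "B": 6.0, "C": 10.0})
# ranking -> ["A", "C", "B"]; rates["A"] -> 0.6
```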
With the above quantitative accuracy calculation method for the pose measurement algorithm, the primary factors influencing each pose parameter can be determined, and the pose measurement result can be optimized in a targeted manner according to the requirements of different spatial pose measurement tasks.

Claims (4)

1. A target attitude measurement method fusing machine vision and laser reference point information is characterized by comprising the following steps:
step 1, taking a laser reference point as a spatial feature point, imaging the laser reference point, and extracting the coordinate of the imaging point of the laser reference point in the camera pixel coordinate system;
step 2, constructing the object-space collinearity error, and iteratively solving the attitude parameter rotation matrix R by taking the minimum of the object-space collinearity error as the objective function;
step 2.1, determining an initial value of a rotation matrix;
step 2.2, determining the line-of-sight projection matrix Wi of the imaging point of the i-th laser reference point using formula (7):

Wi = (v̂i · v̂i^T) / (v̂i^T · v̂i)    (7)

wherein Pi is the coordinate of the i-th laser reference point in the target coordinate system, and v̂i is the homogeneous coordinate of the normalized image point corresponding to the i-th laser reference point;
step 2.3, using the line-of-sight projection matrix Wi, the initial value of the rotation matrix R and the translation vector t, constructing the object-space collinearity error:

E(R, t) = Σi ‖(I − Wi)(R · Pi + t)‖²

wherein the translation vector t is solved using formula (10):

t = (1/n) · (I − (1/n) · Σi Wi)^(−1) · Σi (Wi − I) · R · Pi    (10)

wherein I is the identity matrix;
step 2.4, returning to step 2.3, obtaining the optimal solution of the absolute orientation problem by singular value decomposition (SVD), and iteratively updating the rotation matrix R until the iteration termination condition is met;
step 2.5, selecting the minimum object-space collinearity error obtained in the iterative updating process as the objective function of the pose estimation:

R* = argminR Σi ‖(I − Wi)(R · Pi + t)‖²

and the optimal solution of the translation vector t is:

t* = (1/n) · (I − (1/n) · Σi Wi)^(−1) · Σi (Wi − I) · R* · Pi.
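Read together, steps 2.2-2.5 describe an orthogonal-iteration style pose solver. A minimal numpy sketch under that reading follows; the function names, the synthetic test data and the fixed iteration count are illustrative choices, not the patent's implementation:

```python
import numpy as np

def los_projection(v):
    """Line-of-sight projection matrix W_i = v v^T / (v^T v) for a
    homogeneous normalized image point v (step 2.2)."""
    v = np.asarray(v, dtype=float).reshape(3, 1)
    return (v @ v.T) / float(v.T @ v)

def optimal_t(R, P, W):
    """Closed-form translation minimizing the object-space collinearity
    error for a fixed rotation R (step 2.3, formula (10))."""
    n = len(P)
    I = np.eye(3)
    A = I - sum(W) / n
    b = sum((Wi - I) @ (R @ Pi) for Wi, Pi in zip(W, P)) / n
    return np.linalg.solve(A, b)

def update_R(P, Q):
    """Absolute-orientation (SVD) rotation aligning point sets P -> Q
    (step 2.4), i.e. the Kabsch solution on centered points."""
    Pc = P - P.mean(axis=0)
    Qc = Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(Qc.T @ Pc)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    return U @ D @ Vt

def orthogonal_iteration(P, v, R0, iters=100):
    """Iterate: fix R -> solve t, project through each W_i, re-solve R
    by SVD (steps 2.3-2.4), returning the final (R, t)."""
    P = np.asarray(P, dtype=float)
    W = [los_projection(vi) for vi in v]
    R = R0
    for _ in range(iters):
        t = optimal_t(R, P, W)
        Q = np.array([Wi @ (R @ Pi + t) for Wi, Pi in zip(W, P)])
        R = update_R(P, Q)
    return R, optimal_t(R, P, W)
```

With noise-free image points of a non-coplanar target, the iteration recovers the true rotation and translation to numerical precision.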
2. The target pose measurement method fusing machine vision and laser reference point information according to claim 1, wherein step 2.1 solves the initial value of the rotation matrix using a parallel perspective model:
let the homogeneous coordinates of the laser reference point set {Pi} be Pi = (Xi, Yi, Zi, 1)^T and the homogeneous coordinates of the corresponding image point set {pi} be pi = (xi, yi, 1)^T; let the centroid of the laser reference point set {Pi} have homogeneous coordinate (X0, Y0, Z0, 1)^T and the centroid image point have homogeneous coordinate (x0, y0, 1)^T; then the initial value R0 of the rotation matrix is given by formula (11), which is rendered as an image (FDA0003187644270000025) in the original publication, wherein the two antisymmetric matrices appearing in formula (11) correspond to two three-dimensional column vectors, and those column vectors are determined by formula (12), likewise rendered as an image (FDA00031876442700000212) in the original publication.
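The antisymmetric matrix of a three-dimensional column vector used in claim 2 is the standard cross-product matrix [a]x; a short sketch makes the construction concrete (the helper name is illustrative):

```python
import numpy as np

def skew(a):
    """Antisymmetric (cross-product) matrix [a]x of a 3-D vector a,
    satisfying skew(a) @ b == np.cross(a, b) and skew(a).T == -skew(a)."""
    ax, ay, az = a
    return np.array([[0.0, -az,  ay],
                     [ az, 0.0, -ax],
                     [-ay,  ax, 0.0]])
```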
3. The target attitude measurement method incorporating machine vision and laser reference point information according to claim 1 or 2, wherein step 1 specifically comprises the following steps:
step 1.1, recording the spatial three-dimensional coordinates of the laser reference point;
step 1.2, imaging the laser reference point and preprocessing the laser reference point image, wherein the preprocessing mainly comprises graying of the color image, image filtering, histogram equalization, edge sharpening and image denoising;
and step 1.3, extracting the coordinates of the imaging point of the laser reference point in the camera pixel coordinate system by a Harris corner extraction algorithm.
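The Harris corner response of step 1.3 can be sketched from scratch with numpy; the window size, the sensitivity k and the synthetic test image below are illustrative choices, not the patent's parameters:

```python
import numpy as np

def harris_response(img, k=0.04, win=3):
    """Harris corner response R = det(M) - k * trace(M)^2, where M is
    the structure tensor of the image gradients accumulated over a
    win x win window around each pixel."""
    img = np.asarray(img, dtype=float)
    Ix = np.zeros_like(img)
    Iy = np.zeros_like(img)
    Ix[:, 1:-1] = (img[:, 2:] - img[:, :-2]) / 2.0  # central difference in x
    Iy[1:-1, :] = (img[2:, :] - img[:-2, :]) / 2.0  # central difference in y

    def box(a):
        # Sum over the win x win neighbourhood (wrap-around at borders).
        r = win // 2
        out = np.zeros_like(a)
        for dy in range(-r, r + 1):
            for dx in range(-r, r + 1):
                out += np.roll(np.roll(a, dy, axis=0), dx, axis=1)
        return out

    Sxx, Syy, Sxy = box(Ix * Ix), box(Iy * Iy), box(Ix * Iy)
    return (Sxx * Syy - Sxy ** 2) - k * (Sxx + Syy) ** 2

# The corner of a bright square gives the strongest response:
img = np.zeros((20, 20))
img[10:, 10:] = 1.0
y, x = np.unravel_index(np.argmax(harris_response(img)), img.shape)
```

Edges yield a negative response (one dominant gradient direction), while true corners, where both eigenvalues of M are large, yield a strongly positive one.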
4. The target pose measurement method fusing machine vision and laser reference point information according to claim 3, wherein step 2 is followed by step 3: analyzing the measurement accuracy based on an orthogonal simulation test:
step 3.1, designing an orthogonal test;
taking five factors, namely the image coordinate extraction precision of the spatial feature points, the spatial coordinate precision of the spatial feature points, the camera principal point calibration precision, the normalized focal length calibration precision and the number of spatial feature points, as the major factors influencing the measurement precision; three typical levels are chosen for each factor, using an L18(3^5) orthogonal table;
step 3.2, defining the robustness index of the product quality characteristic;
taking the pose parameter errors (ΔAx, ΔAy, ΔAz, ΔPx, ΔPy, ΔPz) respectively as the robustness indexes of the product quality characteristics, wherein ΔAx, ΔAy and ΔAz are the rotation angle errors in the x, y and z directions respectively, and ΔPx, ΔPy and ΔPz are the position errors in the x, y and z directions respectively;
step 3.3, analyzing the signal to noise ratio;
based on the L18(3^5) orthogonal table, establishing 18 sets of simulation tests, and calculating the signal-to-noise ratios of the different pose parameter errors from the results of the 18 sets of simulation tests;
step 3.4, calculating the range of the signal-to-noise ratio;
computing the range of the signal-to-noise ratios of the different pose parameter errors calculated in step 3.3, wherein the larger the range, the higher the influence level of the influence factor and the greater its influence;
step 3.5, calculating the influence ranking of each influence factor on each pose parameter;
based on the range calculation results of step 3.4, calculating the influence ranking of each influence factor on each pose parameter and determining the primary factors influencing the precision of the different pose parameters.
CN202110866493.7A 2021-07-29 2021-07-29 Target attitude and position measurement method integrating machine vision and laser reference point information Active CN113587904B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110866493.7A CN113587904B (en) 2021-07-29 2021-07-29 Target attitude and position measurement method integrating machine vision and laser reference point information


Publications (2)

Publication Number Publication Date
CN113587904A true CN113587904A (en) 2021-11-02
CN113587904B CN113587904B (en) 2022-05-20

Family

ID=78252087

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110866493.7A Active CN113587904B (en) 2021-07-29 2021-07-29 Target attitude and position measurement method integrating machine vision and laser reference point information

Country Status (1)

Country Link
CN (1) CN113587904B (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070040800A1 (en) * 2005-08-18 2007-02-22 Forlines Clifton L Method for stabilizing and precisely locating pointers generated by handheld direct pointing devices
CN106441151A (en) * 2016-09-30 2017-02-22 中国科学院光电技术研究所 Three-dimensional object European space reconstruction measurement system based on vision and active optics fusion
CN108489496A (en) * 2018-04-28 2018-09-04 北京空间飞行器总体设计部 Noncooperative target Relative Navigation method for estimating based on Multi-source Information Fusion and system
CN113052908A (en) * 2021-04-16 2021-06-29 南京工业大学 Mobile robot pose estimation method based on multi-sensor data fusion


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116363205A (en) * 2023-03-30 2023-06-30 中国科学院西安光学精密机械研究所 Space target pose resolving method based on deep learning and computer program product
CN117284499A (en) * 2023-11-24 2023-12-26 北京航空航天大学 Monocular vision-laser-based pose measurement method for spatial unfolding mechanism
CN117284499B (en) * 2023-11-24 2024-01-19 北京航空航天大学 Monocular vision-laser-based pose measurement method for spatial unfolding mechanism



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant