CN115908562A - Different-surface point cooperation marker and measuring method - Google Patents

Different-surface point cooperation marker and measuring method

Info

Publication number
CN115908562A
CN115908562A (application CN202211376512.9A)
Authority
CN
China
Prior art keywords
marker
mark
point
points
coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211376512.9A
Other languages
Chinese (zh)
Inventor
范晓鹏
郝颖明
魏景阳
付双飞
吴清潇
朱枫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenyang Institute of Automation of CAS
Original Assignee
Shenyang Institute of Automation of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenyang Institute of Automation of CAS filed Critical Shenyang Institute of Automation of CAS
Priority to CN202211376512.9A priority Critical patent/CN115908562A/en
Publication of CN115908562A publication Critical patent/CN115908562A/en
Pending legal-status Critical Current

Classifications

    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T — CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00 — Road transport of goods or passengers
    • Y02T 10/10 — Internal combustion engine [ICE] based vehicles
    • Y02T 10/40 — Engine management systems

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention belongs to the field of computer vision, and in particular relates to a different-surface point cooperation marker and a measuring method. The marker comprises: a base surface, upright columns, and a plurality of mark points. A plurality of upright columns are arranged on the base surface; the mark points are arranged on the base surface and on the columns respectively, and include circular ring mark points and dot mark points. The mark points arranged on the base surface are base surface mark points, and the mark points arranged on the top surfaces of the columns are cylindrical surface mark points. The base surface mark points are circular ring mark points and are uniformly distributed at the four corners of the base surface around its center point; the cylindrical surface mark points are circular ring mark points or dot mark points. The invention can be identified rapidly, has strong anti-interference capability, and can still accurately complete target identification when individual mark points are lost.

Description

Different-surface point cooperation marker and measuring method
Technical Field
The invention belongs to the field of computer vision, and particularly relates to a different-surface point cooperation marker and a measuring method.
Background
Visual measurement can be divided into monocular vision measurement, multi-view vision measurement, structured-light vision measurement and the like according to different application scenes and task requirements. Multi-view vision measurement mainly uses texture information of the target and the background in the scene to perform binocular feature matching and relative pose calculation; the algorithm is relatively complex and the calculation is time-consuming. Structured-light measurement is equivalent to adding texture information to the scene, which facilitates extraction of the three-dimensional point cloud of the target, but a dense point cloud also brings a large amount of computation and slow response. In many application scenes, with reasonable layout and configuration, marker measurement based on monocular vision can achieve high measurement accuracy, fast measurement response and strong anti-interference capability. Marker measurement based on monocular vision has been widely applied in fields such as space, underwater and industrial production, for example spacecraft rendezvous, on-orbit maintenance, submersible docking, and target grasping by robotic arms.
Zhou Xin and Ju Feng, in the paper "Several discussions on the conditions for a unique solution of the P3P problem" (Chinese Journal of Computers, 2003, Issue 12), discussed the multiple-solution problem and the unique-solution conditions of the P3P problem from the perspective of engineering application. The invention patent with application number 201310639611.6 relates to a monocular vision pose measurement method based on point features, which uses mark points distributed as an isosceles triangle to construct a unique-solution condition for the P3P problem and adopts a bisection iterative algorithm to achieve fast solution of the relative pose. Under relatively complex working conditions in some application scenes, reflection, occlusion, interference from a large number of light spots and other problems may occur after the mark points on the marker are imaged, so that identification fails or false identification seriously affects the efficiency of target measurement. Therefore, the anti-interference capability of mark point identification should be fully considered in marker design.
Disclosure of Invention
The invention aims to provide a marker with a different-surface (non-coplanar) structure, which consists of circular ring mark points and dot mark points, improves the anti-interference capability of marker identification by utilizing the exclusivity of the circular ring mark points in image processing, and improves the relative pose measurement accuracy of the marker by utilizing the different-surface structure. During marker identification, the search first starts from the small number of ring feature points; candidate minimum units that satisfy the model constraints are then sought in combination with the dot feature points; an initial value of the relative pose is calculated with the P3P algorithm; further feature points are searched according to the model constraints; and finally an accurate solution of the relative pose is obtained with a nonlinear optimization algorithm.
The technical scheme adopted by the invention for realizing the purpose is as follows: a different-surface point cooperation marker, comprising: a base surface, a column, and a plurality of mark points;
a plurality of upright columns are arranged on the base surface; the mark point is respectively arranged on the base surface and the upright column, and the mark point comprises: a ring mark point and a dot mark point;
wherein, the mark points arranged on the base surface are base surface mark points, and the mark points arranged on the top surface of the cylinder are cylindrical surface mark points;
the base mark points are circular ring mark points and are uniformly distributed on four corners of the base along the center point of the base; the cylindrical surface mark points are ring mark points or dot mark points.
The upright columns include: a first upright column, a second upright column and a third upright column; the first upright column is arranged at the central point of the base surface, and the second upright column and the third upright column are symmetrically arranged above and below it;
dot mark points are respectively arranged on the first upright post and the second upright post, and a circular ring mark point is arranged on the third upright post;
the center point of the dot mark point on the second upright post and the center points of the ring mark points at two corners of the top of the base surface are positioned on the same horizontal line; the center point of the ring mark point on the third upright post and the center points of the ring mark points at two corners of the bottom of the base surface are positioned on the same horizontal line;
the central points of the circular ring mark points on the base surface are sequentially connected to form a square;
the vertical projection of the first upright column on the base surface is positioned at the center point of the square, and the vertical projections of the second upright column and the third upright column on the base surface are respectively positioned at the middle points of the upper side and the lower side of the square.
The diameter of the dot mark points is two thirds of the outer diameter of the ring mark points, and the inner diameter of the ring mark points is equal to one half of the outer diameter of the ring mark points; the inner circle and the outer circle of the circular ring mark points are concentric circles.
A measuring method of a different-surface point cooperation marker comprises the following steps:
1) Marking the mark points on the marker, and establishing a marker coordinate system according to the mark points;
2) Setting a plurality of groups of minimum identification units according to the distribution condition of the mark points on the marker; each group of minimum identification units consists of a cylindrical surface mark point and two basal surface mark points;
3) The marker is placed in front of a camera lens, the inclination angle of the marker is within a set angle range, and after the marker is imaged by the camera lens, the circular ring mark points and the dot mark points in the image are identified, and distortion correction is carried out on the image coordinates of the mark points;
4) Starting to search the minimum identification unit by the circular ring mark points, namely traversing all the circular ring mark points found in the image processing process, selecting two circular ring mark points meeting set distance constraint, selecting a circular point mark point from all the circular point mark points, and forming a candidate minimum identification unit together with the two circular ring mark points;
5) Assuming that the candidate minimum identification unit is one of the multiple groups of minimum identification units set in the step 2), setting model coordinates of the mark points, and solving the initial value of the relative pose of the marker by combining the imaging coordinates of the mark points in the minimum identification unit to obtain the initial value of the pose of the coordinate system of the marker relative to the coordinate system of the camera;
6) Acquiring imaging coordinates of other mark points on the marker on an image by acquiring an initial pose value of a marker coordinate system relative to a camera coordinate system, space coordinates of the mark points on the marker in a three-dimensional model of the marker and calibrated camera model parameters, and matching the searched mark points in the image by using the imaging coordinates of the mark points to acquire successfully matched mark points;
7) According to the successfully matched marker points, taking the reprojection error as an objective function, taking the relative pose between the marker coordinate system and the camera coordinate system as an optimization variable, and taking the relative pose solved by the P3P algorithm as an initial value of the optimization variable to carry out nonlinear optimization relative pose solving to obtain the relative pose between the marker coordinate system and the camera coordinate system;
8) Converting the obtained relative pose between the marker coordinate system and the camera coordinate system into the relative pose between a passive end coordinate system fixedly connected with the marker and an active end coordinate system fixedly connected with the camera.
The step 1) specifically comprises the following steps:
the mark point on the first column is marked as a mark point No. 0, the mark point on the second column is marked as a mark point No. 1, the mark point on the upper right corner of the base surface is marked as a mark point No. 2, the mark point on the lower right corner of the base surface is marked as a mark point No. 3, the mark point on the third column is marked as a mark point No. 4, the mark point on the lower left corner of the base surface is marked as a mark point No. 5, and the mark point on the upper left corner of the base surface is marked as a mark point No. 6;
the projection point of the No. 0 mark point on the base surface is defined as the origin of the coordinate system of the marker, the direction from the No. 6 mark point to the No. 2 mark point is defined as the positive direction of the X axis of the coordinate system of the marker, and the direction from the No. 6 mark point to the No. 5 mark point is defined as the positive direction of the Y axis of the coordinate system of the marker.
In step 2), the setting of the plurality of groups of minimum identification units is:
{0,2,3},{0,3,5},{0,5,6},{0,6,2},{0,2,5},{0,6,3},{1,6,2},{4,3,5};
wherein, the numbers 0 to 6 correspond to the number 0 mark point to the number 6 mark point respectively;
wherein the layout of the three marker points of each of the first 4 sets {0,2,3}, {0,3,5}, {0,5,6}, {0,6,2} on the marker constitutes a clockwise ordering;
the last 4 groups {0,2,5}, {0,6,3}, {1,6,2}, {4,3,5}, the projection of the first landmark point of each group on the base is located in the middle of the second and third landmark points;
and screening candidate minimum identification units by setting a constraint condition.
In step 6), the matching against the mark points found in the image to obtain the successfully matched mark points is specifically:
if the Euclidean distance between a predicted imaging coordinate and the coordinate of a mark point found in the image is smaller than a set threshold, that mark point is successfully matched with the image feature point; if the matched points satisfy the set quantity constraint, the target is identified; otherwise, the remaining initial pose values of the marker coordinate system relative to the camera coordinate system solved by the P3P algorithm are judged in turn;
if none of the poses solved by the P3P algorithm meets the constraint condition on the number of matched points, the candidate minimum identification unit is assumed in turn to be another of the plural groups of minimum identification units set in step 2), until either all cases among the plural groups of minimum identification units have been traversed or the target is correctly identified.
The step 7) is specifically as follows:
acquiring a pose initial value of a marker coordinate system relative to a camera coordinate system, taking a reprojection error as an objective function E, taking a relative pose between the marker coordinate system and the camera coordinate system as an optimization variable, and solving through nonlinear optimization to obtain a rotation-translation relation (Rt) of the marker coordinate system relative to the camera coordinate system, namely:
$$E = \sum_{i=1}^{n} \left[ (u_i - \hat{u}_i)^2 + (v_i - \hat{v}_i)^2 \right]$$

$$\begin{bmatrix} X_i^{c} \\ Y_i^{c} \\ Z_i^{c} \end{bmatrix} = R \begin{bmatrix} X_i \\ Y_i \\ Z_i \end{bmatrix} + t, \qquad \hat{u}_i = u_0 + f_u \frac{X_i^{c}}{Z_i^{c}}, \qquad \hat{v}_i = v_0 + f_v \frac{Y_i^{c}}{Z_i^{c}}$$

wherein $(X_i, Y_i, Z_i)$ are the coordinates of the i-th mark point in the marker coordinate system, $(u_i, v_i)$ are the corresponding imaging coordinates, (R t) is the rotation-translation relation of the marker coordinate system relative to the camera coordinate system with corresponding position quantities $t_x, t_y, t_z$ and Euler angles $a_x, a_y, a_z$, $f_u, f_v, u_0, v_0$ are intrinsic parameters of the camera, and $(\hat{u}_i, \hat{v}_i)$ are the mark point imaging coordinates recalculated from $(X_i, Y_i, Z_i)$, the relative rotation-translation relation (R t) and the camera intrinsic parameters.
The step 8) specifically includes:
make the rotation translation matrix
Figure BDA0003926790840000041
Representing the relative position and posture relationship between the coordinate system of the actuated end and the coordinate system of the marker, and making the rotation and translation matrix->
Figure BDA0003926790840000042
Representing the relative pose relationship between the camera coordinate system and the active end mark system, and making the rotation and translation matrix ≥>
Figure BDA0003926790840000043
Representing the relative pose of the marker coordinate system relative to the camera coordinate system, and making the rotation translation matrix ≥>
Figure BDA0003926790840000044
Representing the relative pose of the coordinate system of the active end relative to the coordinate system of the active end:
Figure BDA0003926790840000045
wherein,
Figure BDA0003926790840000046
and &>
Figure BDA0003926790840000047
It is a fixed parameter which can be known by calibration>
Figure BDA0003926790840000048
Is calculated by nonlinear optimization (rt).
The invention has the following beneficial effects and advantages:
1. the marker is designed by using the layout of the different-surface marker points, so that the measurement precision can be effectively improved in the limited space of the marker.
2. The marker is designed by using the circular ring characteristic points with strong exclusivity, and under the condition of complex background, the searching range of the characteristic points can be reduced, and the characteristic point combination in the image can be quickly positioned.
3. The recognition algorithm uses a search strategy based on a minimum recognition unit, can still accurately complete target recognition under the condition of losing individual mark points, and has strong anti-interference capability.
Drawings
FIG. 1 is a schematic view of a marker structure according to the present invention;
FIG. 2 illustrates 8 sets of minimum identification cells for a marker set in an embodiment of the present invention;
FIG. 3 is a schematic view of marker marking points of the present invention;
FIG. 4 is a dimensioned two-dimensional view of the marker structure of the present invention.
Detailed Description
FIG. 1 is a schematic design diagram of the marker of the present invention. The mark points on the marker include circular ring mark points and dot mark points; the marker is composed of a base surface and three upright columns on the base surface, and the mark points are distributed on the base surface and on the columns;
dividing mark points into base mark points and cylindrical mark points according to a distribution area of the mark points on the marker, wherein the mark points distributed on a base are the base mark points, and the mark points distributed on a cylinder are the cylindrical mark points;
wherein, the base mark points are ring mark points and are distributed at four corners of the base of the marker; the marker is divided into left, middle and right areas, the cylinders are positioned in the middle of the marker, the three cylinders are distributed according to the upper part, the middle part and the lower part, wherein the cylindrical surface mark points on the lower side are also circular ring mark points, and the other two cylindrical surface mark points are circular dot mark points.
The upright columns comprise: a first upright column, a second upright column and a third upright column; the first upright column is arranged at the central point of the base surface, and the second upright column and the third upright column are symmetrically arranged above and below it;
dot mark points are respectively arranged on the first upright post and the second upright post, and a circular ring mark point is arranged on the third upright post;
the center point of the dot mark point on the second upright post and the center points of the ring mark points at two corners of the top of the base surface are positioned on the same horizontal line; the center point of the ring mark point on the third upright post and the center points of the ring mark points at two corners of the bottom of the base surface are positioned on the same horizontal line;
the diameter of each dot mark point is two thirds of the outer diameter of the circular mark point, and the inner diameter of each circular mark point is equal to half of the outer diameter of the circular mark point; the inner circle and the outer circle of the circular ring mark point are concentric circles.
The central connecting lines of the base mark points positioned at the four corners form a square, the vertical projection of the upper cylindrical mark point on the base is positioned at the middle point position of the upper two base mark points, the vertical projection of the lower cylindrical mark point on the base is positioned at the middle point position of the lower two base mark points, and the vertical projection of the middle cylindrical mark point on the base is positioned at the central position of the upper four base mark points.
As shown in FIG. 1, 4 circular ring mark points are machined on the base surface of the marker, together with 3 through holes of 6.35 mm (the 3 positions marked 7/8/9 in FIG. 2); the 3 through holes are mainly used by a laser tracker for precision testing and data collection.
FIG. 4 shows the specific dimensional information: the diameter of the dot mark points on the marker is 10 mm, the outer diameter of the ring mark points is 15 mm and the inner diameter is 7.5 mm; the height of the middle column is 30 mm and its diameter is 18 mm; the heights of the upper and lower columns are 20 mm, the diameter of the upper column is 18 mm and the diameter of the lower column is 25 mm. The center-to-center spacing of the base ring mark points is 65 mm. The surface of the marker is treated with a black light-absorbing material, and the mark point areas are recessed 0.2 mm below the surface and coated with diffusely reflective white paint.
The marker identification measuring method specifically comprises the following steps:
Setting a binarization threshold T0 according to the gray-level distribution of the image, binarizing the image, and defining pixels larger than the threshold as target pixels and the remaining pixels as background pixels;
growing regions and searching the image for connected regions whose circumscribed-rectangle aspect ratio is not larger than T1 (e.g. 1.6); according to the characteristic that the center of a ring feature point is a hole formed by background pixels while a dot feature point is a solid connected region, judging whether a hole exists near the center of the region: if the found region contains no hole, it is directly classified as a dot feature point; if a hole exists, the image coordinate of the hole center is further calculated and compared with the center coordinate of the connected region formed by the corresponding target pixels, and if the Euclidean distance between the two is smaller than T2 (e.g. 1.0 pixel), the ring feature point constraint is satisfied and the identified region is classified as a ring feature point;
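For illustration only, the following Python sketch (using OpenCV and NumPy; the function name and the concrete values of T0, T1 and T2 are assumptions consistent with the examples given in the text) shows one possible way to implement this binarization and ring/dot classification step:

```python
import cv2
import numpy as np

def classify_marker_features(gray, T0=128, T1=1.6, T2=1.0):
    """Return (ring_centers, dot_centers) found in a grayscale image.

    Bright (diffuse white) mark points on a dark, light-absorbing marker
    surface are assumed, as described in the embodiment.
    """
    # Binarize: pixels above T0 are target pixels, the rest background.
    _, binary = cv2.threshold(gray, T0, 255, cv2.THRESH_BINARY)

    rings, dots = [], []
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
    for i in range(1, n):                              # label 0 is the background
        x, y, w, h, area = stats[i]
        if max(w, h) / max(min(w, h), 1) > T1:         # aspect-ratio constraint
            continue
        cx, cy = centroids[i]
        # Look for an enclosed background hole near the component centre (ring test).
        patch = (labels[y:y + h, x:x + w] == i)
        holes = cv2.connectedComponentsWithStats((~patch).astype(np.uint8))
        hole_center = None
        for j in range(1, holes[0]):
            hx, hy, hw, hh, _ = holes[2][j]
            if hx > 0 and hy > 0 and hx + hw < w and hy + hh < h:  # fully enclosed
                hole_center = holes[3][j] + np.array([x, y])
        if hole_center is None:
            dots.append((cx, cy))                      # solid region: dot feature point
        elif np.hypot(cx - hole_center[0], cy - hole_center[1]) < T2:
            rings.append((cx, cy))                     # hole near centre: ring feature point
    return rings, dots
```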
1) Marking the mark points on the marker, as shown in fig. 2, marking the middle cylindrical mark point as a mark point No. 0, marking the upper cylindrical mark point as a mark point No. 1, marking the upper right corner base mark point as a mark point No. 2, marking the lower right corner mark point as a mark point No. 3, marking the lower cylindrical mark point as a mark point No. 4, marking the lower left corner mark point as a mark point No. 5, and marking the upper left corner mark point as a mark point No. 6; the projection point of the No. 0 mark point on the base surface is defined as the origin of the marker coordinate system, the direction from the No. 6 mark point to the No. 2 mark point is defined as the positive direction of the X axis of the marker coordinate system, and the direction from the No. 6 mark point to the No. 5 mark point is defined as the positive direction of the Y axis of the marker coordinate system.
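Purely as a reference model, the seven mark-point centers could be written out in this marker coordinate system as below (Python); the interpretation of the 65 mm spacing as the side length of the base square and the choice of negative Z for the column tops (right-handed frame with the columns rising toward the camera) are assumptions made for illustration only:

```python
# Illustrative 3D model coordinates (mm) of the seven mark-point centers in the
# marker coordinate system: origin at the projection of point 0 on the base
# surface, X from point 6 toward point 2, Y from point 6 toward point 5.
HALF = 65.0 / 2.0
MODEL_POINTS = {
    0: (0.0,   0.0,  -30.0),   # middle column (height 30 mm), dot mark point
    1: (0.0,  -HALF, -20.0),   # upper column (height 20 mm), dot mark point
    2: ( HALF, -HALF,  0.0),   # upper-right base ring
    3: ( HALF,  HALF,  0.0),   # lower-right base ring
    4: (0.0,   HALF, -20.0),   # lower column (height 20 mm), ring mark point
    5: (-HALF,  HALF,  0.0),   # lower-left base ring
    6: (-HALF, -HALF,  0.0),   # upper-left base ring
}
```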
2) The following 8 groups of minimum identification units are set according to the distribution of the marker points on the marker, as shown in fig. 3: {0,2,3},{0,3,5},{0,5,6},{0,6,2},{0,2,5},{0,6,3},{1,6,2},{4,3,5}.
Here, numbers 0 to 6 correspond to the No. 0 to No. 6 mark points respectively. Each group of minimum identification units consists of one cylindrical surface mark point and two base surface mark points. The layout on the marker of the three mark points of each of the first 4 groups {0,2,3}, {0,3,5}, {0,5,6}, {0,6,2} constitutes a clockwise ordering; in the last 4 groups {0,2,5}, {0,6,3}, {1,6,2}, {4,3,5}, the projection of the first mark point on the base surface is located at the midpoint between the second and third mark points. This information can be used to set constraint conditions for screening candidate minimum identification units, as sketched below. As shown in FIGS. 2 to 3, 0 to 6 denote mark points and 7/8/9 denote target seat hole sites used for precision testing;
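The sketch below (illustrative names, not from the patent) encodes these 8 minimum identification units and the two layout constraints — clockwise ordering for the first 4 groups and the midpoint condition for the last 4 groups — as simple geometric tests:

```python
import numpy as np

# The 8 minimum identification units from the patent: one column-surface
# mark point followed by two base-surface mark points (point numbers 0-6).
MIN_UNITS = [(0, 2, 3), (0, 3, 5), (0, 5, 6), (0, 6, 2),   # clockwise triples
             (0, 2, 5), (0, 6, 3), (1, 6, 2), (4, 3, 5)]   # midpoint triples

def is_clockwise(p0, p1, p2):
    """True if points p0 -> p1 -> p2 are ordered clockwise
    (image y axis pointing down), using the sign of the cross product."""
    v1 = np.asarray(p1) - np.asarray(p0)
    v2 = np.asarray(p2) - np.asarray(p0)
    return (v1[0] * v2[1] - v1[1] * v2[0]) > 0

def near_midpoint(p0, p1, p2, tol=0.15):
    """True if p0 lies close to the midpoint of segment p1-p2,
    with 'tol' given as a fraction of the segment length (assumed value)."""
    mid = (np.asarray(p1) + np.asarray(p2)) / 2.0
    seg = np.linalg.norm(np.asarray(p2) - np.asarray(p1))
    return np.linalg.norm(np.asarray(p0) - mid) < tol * seg
```

For the limited tilt angles assumed later (not greater than T3), the clockwise ordering and the approximate midpoint relation are preserved in the image, so these tests can also be applied to candidate image points.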
3) The marker is positioned in front of the camera lens, the inclination angle of the marker is within a certain range, and after the marker is normally imaged, the circular ring characteristic points and the dot characteristic points in the image are firstly identified, and distortion correction is carried out on the image coordinates of the characteristic points.
4) The method comprises the steps of starting to search a minimum identification unit by using a ring feature point with strong exclusivity, namely traversing all ring mark points found in the image processing process, selecting two rings meeting certain distance constraint from the ring mark points, selecting a dot feature point from all dot feature points, and forming a candidate minimum identification unit together with the two ring feature points.
5) Assuming that the candidate minimum identification unit is one of the 8 groups of minimum identification units set in step 2, the model coordinates of the mark points are set and, combined with the imaging coordinates of the minimum identification unit, the initial value of the relative pose is solved with the P3P algorithm. Theoretically the P3P algorithm yields 4 groups of solutions for the relative pose, possibly including complex solutions, and only the real solutions are checked.
Since the range of motion of the marker relative to the camera is limited, including the range of measurement distances and the possible attitude angles, it is assumed here that the tilt angle of the marker is not greater than T3 (e.g. 30°) and that the roll angle deviates from Theta by not more than T4 (e.g. 20°); this information can be used to further screen the initial pose values solved by the P3P algorithm.
For pose initial values that satisfy these constraint conditions, the model coordinates of the mark points on the marker and the camera model parameters can be combined to search whether the other mark points have matching feature points in the image. The matching error threshold is set to T5 (e.g. 1.0 pixel); if the number of matched points is not less than T6 (e.g. 5), the target is correctly identified, and neither the remaining P3P pose solutions nor the remaining cases of the 8 groups of minimum identification units set in step 2 need to be checked. Otherwise, the other poses solved by the P3P algorithm are examined in turn; if none of the real poses solved by the P3P algorithm satisfies the constraint on the number of matched points, the candidate minimum identification unit is assumed in turn to be another of the 8 groups of minimum identification units set in step 2, until all cases among the 8 groups have been traversed. If the motion range of the roll angle of the marker is limited to within 20° deviation around Theta, no identification ambiguity is caused even if the No. 1 and No. 4 mark points are lost simultaneously, so under this working condition the No. 1 and No. 4 mark points are allowed to be lost simultaneously during target identification;
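A non-authoritative sketch of this candidate search and screening loop is given below (Python with OpenCV; MODEL_POINTS and MIN_UNITS are the illustrative structures from the earlier sketches, cv2.solveP3P is assumed to be available in the installed OpenCV version, and the tilt test is a simplified stand-in for the pose-range screening described above):

```python
import itertools
import cv2
import numpy as np

def tilt_angle_deg(rvec):
    """Angle between the marker Z axis and the camera optical axis (degrees)."""
    R, _ = cv2.Rodrigues(rvec)
    return np.degrees(np.arccos(np.clip(abs(R[2, 2]), -1.0, 1.0)))

def find_initial_pose(rings, dots, model_pts, units, K, dist,
                      T5=1.0, T6=5, max_tilt_deg=30.0):
    """Try candidate minimum units (one column-surface point + two base rings)
    and return the first P3P pose that matches enough mark points."""
    all_img_pts = rings + dots
    for r1, r2 in itertools.combinations(rings, 2):    # two base ring candidates
        for top in dots + rings:                       # column-surface candidate
            img3 = np.float32([top, r1, r2])
            for unit in units:                         # hypothesised identities
                obj3 = np.float32([model_pts[j] for j in unit])
                n_sol, rvecs, tvecs = cv2.solveP3P(
                    obj3, img3, K, dist, flags=cv2.SOLVEPNP_P3P)
                if not n_sol:
                    continue
                for rvec, tvec in zip(rvecs, tvecs):
                    if tilt_angle_deg(rvec) > max_tilt_deg:    # pose screening
                        continue
                    proj, _ = cv2.projectPoints(
                        np.float32(list(model_pts.values())), rvec, tvec, K, dist)
                    matched = sum(
                        min(np.hypot(*(p - q)) for q in all_img_pts) < T5
                        for p in proj.reshape(-1, 2))
                    if matched >= T6:                  # enough matches: target found
                        return rvec, tvec
    return None
```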
the P3P algorithm for solving the relative pose specifically comprises the following steps:
the 1 st marking point in the minimum recognition unit is marked with symbol a, the 2 nd marking point is marked with symbol B, and the 3 rd marking point is marked with symbol C. Distance between A and B by d 1 Marks, the distance between A and C being d 2 Mark, distance between B and C is d 3 The mark, the camera optical center is marked with O, the distance between O and a is marked with a, the distance between O and B is marked with B, and the distance between O and C is marked with C. The angle AOB is marked by alpha, the alpha 0AOC is marked by beta, and the angle BOC is marked by gamma. Since the marker model is known and the coordinates of the marker point in the marker coordinate system can be obtained, d1, d2, and d3 are known quantities. An imaging point corresponding to a marker point A is marked by D, an imaging point corresponding to a marker point B is marked by E, an imaging point corresponding to a marker point C is marked by F, the O, A and D three points are collinear, the O, B and E three points are collinear, and the O, C and F three points are collinear, so that the method has the advantages of { [ AOB ] = [ DOE ], [ AOC = [ DOF ], [ BOC ] = [ EOF ], and the imaging points D, E and F are determined during image processing characteristic point searching, and corresponding imaging coordinates are also known quantities, so that specific values of alpha, beta and gamma can be calculated by using the cosine law.
And then according to the cosine theorem:
in ΔAOB:

$$d_1^2 = a^2 + b^2 - 2ab\cos\alpha$$

in ΔAOC:

$$d_2^2 = a^2 + c^2 - 2ac\cos\beta$$

in ΔBOC:

$$d_3^2 = b^2 + c^2 - 2bc\cos\gamma$$
In the above, $d_1$, $d_2$, $d_3$ and α, β, γ are known quantities; by combining the three equations, a, b and c can be solved. The direction vectors of the lines connecting A, B and C with the optical center in the camera coordinate system are obtained from the corresponding imaging coordinates, and the distances of A, B and C from the optical center are obtained from the above solution, so the three-dimensional coordinates of A, B and C in the camera coordinate system can be obtained as the product of direction vector and distance. With the three-dimensional coordinates of A, B and C known simultaneously in the camera coordinate system and in the marker coordinate system, the relative pose of the marker coordinate system relative to the camera coordinate system can be obtained.
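As a numerical illustration of this step only (not the closed-form P3P solver), the three law-of-cosines equations can be solved for a, b and c with a generic root finder such as SciPy's fsolve; the initial guess is an assumption:

```python
import numpy as np
from scipy.optimize import fsolve

def solve_distances(d1, d2, d3, alpha, beta, gamma, guess=1000.0):
    """Solve the three law-of-cosines equations for the distances a, b, c
    from the optical center O to the mark points A, B, C (angles in radians)."""
    def equations(x):
        a, b, c = x
        return [a*a + b*b - 2*a*b*np.cos(alpha) - d1*d1,
                a*a + c*c - 2*a*c*np.cos(beta)  - d2*d2,
                b*b + c*c - 2*b*c*np.cos(gamma) - d3*d3]
    return fsolve(equations, [guess, guess, guess])
```

Note that such a root finder returns only the solution nearest the initial guess, whereas, as stated above, the P3P problem can have up to four solutions.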
6) According to the successfully matched mark points and feature points found in step 5), the reprojection error is taken as the objective function, the relative pose between the marker coordinate system and the camera coordinate system is taken as the optimization variable, the pose solved by the recorded P3P algorithm is taken as the initial value of the optimization variable, and the relative pose is solved by nonlinear optimization; the nonlinear optimization adopts the LM (Levenberg-Marquardt) algorithm.
In the solving process of the marker identification algorithm, the relative pose obtained by the P3P algorithm can be compared with the possible pose range under the specific working condition; if the solved pose exceeds the possible pose range, it is directly eliminated and does not participate in the subsequent calculation.
In addition, if the range of the gesture angle of the marker relative to the motion of the camera is limited, the No. 1 marker point and the No. 4 marker point can be lost simultaneously under the condition of no identification ambiguity, otherwise at least one marker point of the No. 1 marker point and the No. 4 marker point is correctly identified.
7) Coordinate system transformation: the relative pose between the marker coordinate system and the camera coordinate system obtained in the above calculation is transformed, according to pre-calibrated parameters, into the relative pose between the passive end coordinate system fixedly connected with the marker and the active end coordinate system fixedly connected with the camera.
The active end is fixedly connected with the camera, the passive end is fixedly connected with the marker, and specific objects of the active end and the passive end are different according to different actual measurement tasks. For example, in the operation process of the mechanical arm, the camera is assembled at the tail end of the mechanical arm, the operation tool is assembled at the tail end of the mechanical arm, the marker is assembled near the operation part, and the mechanical arm is guided to carry the operation tool to complete the action of the operation part through visual measurement; or in the docking mission of the space spacecraft, the camera is installed on one of the spacecrafts, the marker is assembled on the other spacecraft, the docking mechanism of the spacecraft provided with the camera is called as an active end, and the spacecraft provided with the marker is called as a passive end.
The nonlinear optimization relative pose solution is generally an iterative optimization process, and three input conditions, namely an optimization variable, an initial value of the optimization variable and an objective function, need to be determined, and are specifically described below.
It is known that n mark points are successfully identified, where the coordinate of the i-th mark point in the marker coordinate system is $(X_i, Y_i, Z_i)$ and the corresponding imaging coordinate is $(u_i, v_i)$. The rotation-translation relationship of the marker coordinate system with respect to the camera coordinate system is expressed as a 3×4 matrix (R t), where R is a 3×3 rotation matrix with corresponding Euler angles $a_x, a_y, a_z$, and t is a 3×1 column vector $(t_x, t_y, t_z)'$. $f_u, f_v$ are the camera focal ratios and $u_0, v_0$ are the coordinates of the camera principal point; $f_u, f_v, u_0, v_0$ are collectively referred to as the camera intrinsic parameters and can be obtained by prior camera calibration.
$(\hat{u}_i, \hat{v}_i)$ are the reprojected image coordinates, which can be calculated from the model coordinates $(X_i, Y_i, Z_i)$, the relative rotation-translation relation (R t) and the camera intrinsic parameters as follows:

$$\begin{bmatrix} X_i^{c} \\ Y_i^{c} \\ Z_i^{c} \end{bmatrix} = R \begin{bmatrix} X_i \\ Y_i \\ Z_i \end{bmatrix} + t, \qquad \hat{u}_i = u_0 + f_u \frac{X_i^{c}}{Z_i^{c}}, \qquad \hat{v}_i = v_0 + f_v \frac{Y_i^{c}}{Z_i^{c}}$$

Let the position quantities $t_x, t_y, t_z$ and Euler angles $a_x, a_y, a_z$ be the optimization variables, and take the sum of squared deviations between the image coordinates of the identified feature points and the reprojected image coordinates of the mark points as the objective function of the nonlinear relative pose optimization:

$$E = \sum_{i=1}^{n} \left[ (u_i - \hat{u}_i)^2 + (v_i - \hat{v}_i)^2 \right]$$
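A minimal sketch of this refinement step, assuming SciPy's Levenberg-Marquardt implementation (scipy.optimize.least_squares with method='lm') as the nonlinear optimizer and an Euler-angle parameterization of R; all names are illustrative:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def reproject(params, pts3d, fu, fv, u0, v0):
    """Project marker-frame points with pose params (ax, ay, az, tx, ty, tz)."""
    ax, ay, az, tx, ty, tz = params
    R = Rotation.from_euler('xyz', [ax, ay, az]).as_matrix()
    pc = pts3d @ R.T + np.array([tx, ty, tz])          # camera-frame coordinates
    u = u0 + fu * pc[:, 0] / pc[:, 2]
    v = v0 + fv * pc[:, 1] / pc[:, 2]
    return np.column_stack([u, v])

def refine_pose(params0, pts3d, pts2d, fu, fv, u0, v0):
    """Refine the P3P initial pose by minimizing the reprojection error E."""
    def residuals(p):
        return (reproject(p, pts3d, fu, fv, u0, v0) - pts2d).ravel()
    result = least_squares(residuals, params0, method='lm')  # Levenberg-Marquardt
    return result.x
```

Here pts3d holds the marker-frame coordinates of the successfully matched mark points, pts2d their (undistorted) image coordinates, and params0 the pose solved by the P3P algorithm.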
Let the rotation-translation matrix $T_{MP}$ represent the relative pose relationship between the passive end coordinate system and the marker coordinate system, i.e. the pose of the passive end coordinate system relative to the marker coordinate system; let the rotation-translation matrix $T_{AC}$ represent the relative pose relationship between the camera coordinate system and the active end coordinate system, i.e. the pose of the camera coordinate system relative to the active end coordinate system; let the rotation-translation matrix $T_{CM}$ represent the relative pose of the marker coordinate system relative to the camera coordinate system; and let the rotation-translation matrix $T_{AP}$ represent the relative pose of the passive end coordinate system relative to the active end coordinate system. Then:

$$T_{AP} = T_{AC}\,T_{CM}\,T_{MP}$$

wherein $T_{MP}$ and $T_{AC}$ are fixed parameters that can be obtained by calibration, and $T_{CM}$ is the (R t) calculated by the nonlinear optimization.
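For illustration, the coordinate-system conversion of step 7) can be composed with 4×4 homogeneous transforms as follows (matrix names follow the reconstruction above and are assumptions):

```python
import numpy as np

def rt_to_homogeneous(R, t):
    """Pack a 3x3 rotation and 3x1 translation into a 4x4 homogeneous matrix."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = np.asarray(t).ravel()
    return T

def passive_in_active(T_AC, T_CM, T_MP):
    """T_AP = T_AC @ T_CM @ T_MP.

    T_AC: camera frame expressed in the active-end frame (calibrated, fixed)
    T_CM: marker frame expressed in the camera frame ((R t) from optimization)
    T_MP: passive-end frame expressed in the marker frame (calibrated, fixed)
    """
    return T_AC @ T_CM @ T_MP
```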
In conclusion, in the marker identification process the minimum identification units, each consisting of one cylindrical surface mark point and two base surface mark points, are set; the strong exclusivity of the circular ring mark points is used to narrow the search range and quickly locate the target; the initial value of the relative pose between the marker and the camera is solved with the P3P algorithm; this initial value is then used to search for the feature points matching the remaining mark points in the image; and finally an accurate solution of the relative pose is obtained by a nonlinear optimization method.
The above description is only an embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, improvement, extension, etc. made within the spirit and principle of the present invention are included in the protection scope of the present invention.

Claims (10)

1. A different-surface point cooperation marker, comprising: a base surface, a column, and a plurality of mark points;
a plurality of upright columns are arranged on the base surface; the mark point is respectively arranged on the base surface and the upright column, and the mark point comprises: a circular ring mark point and a circular dot mark point;
wherein, the mark points arranged on the base surface are base surface mark points, and the mark points arranged on the top surface of the cylinder are cylindrical surface mark points;
the base mark points are circular ring mark points and are uniformly distributed on four corners of the base along the center point of the base; the cylindrical surface mark points are ring mark points or dot mark points.
2. The different-surface point cooperation marker according to claim 1, wherein the column comprises: a first upright column, a second upright column and a third upright column; the first upright column is arranged at the central point of the base surface, and the second upright column and the third upright column are symmetrically arranged above and below it;
dot mark points are respectively arranged on the first upright post and the second upright post, and a circular ring mark point is arranged on the third upright post;
the center point of the dot mark point on the second upright post and the center points of the ring mark points at two corners of the top of the base surface are positioned on the same horizontal line; the center point of the circular ring mark point on the third upright post and the center points of the circular ring mark points at two corners of the bottom of the base plane are positioned on the same horizontal line.
3. The different-surface point cooperation marker according to claim 2, wherein the center points of the circular ring mark points on the base surface are connected in sequence to form a square;
the vertical projection of the first upright column on the base surface is positioned at the center point of the square, and the vertical projections of the second upright column and the third upright column on the base surface are respectively positioned at the middle points of the upper side and the lower side of the square.
4. The different-surface point cooperation marker according to claim 1, wherein the diameter of the dot mark points is two thirds of the outer diameter of the ring mark points, and the inner diameter of the ring mark points is equal to one half of the outer diameter of the ring mark points; the inner circle and the outer circle of the circular ring mark points are concentric circles.
5. The method for measuring the different-surface-point cooperation marker according to claim 1, characterized by comprising the following steps:
1) Marking the mark points on the marker, and establishing a marker coordinate system according to the mark points;
2) Setting a plurality of groups of minimum identification units according to the distribution condition of the mark points on the marker; each group of minimum identification units consists of a cylindrical surface mark point and two base surface mark points;
3) The marker is placed in front of a camera lens, the inclination angle of the marker is within a set angle range, and after the marker is imaged by the camera lens, the circular ring mark points and the dot mark points in the image are identified, and distortion correction is carried out on the image coordinates of the mark points;
4) Starting to search the minimum identification unit by the circular ring mark points, namely traversing all the circular ring mark points found in the image processing process, selecting two circular ring mark points meeting set distance constraint, selecting a circular point mark point from all the circular point mark points, and forming a candidate minimum identification unit together with the two circular ring mark points;
5) Assuming that the candidate minimum identification unit is one of the plural groups of minimum identification units set in step 2), setting the model coordinates of the mark points and, combining the imaging coordinates of the mark points in the minimum identification unit, solving the initial value of the relative pose of the marker to obtain the initial value of the pose of the marker coordinate system relative to the camera coordinate system;
6) Acquiring imaging coordinates of other mark points on the marker on an image by acquiring an initial pose value of a marker coordinate system relative to a camera coordinate system, space coordinates of the mark points on the marker in a three-dimensional model of the marker and calibrated camera model parameters, and matching the searched mark points in the image by using the imaging coordinates of the mark points to acquire successfully matched mark points;
7) According to the successfully matched marker points, taking the reprojection error as an objective function, taking the relative pose between the marker coordinate system and the camera coordinate system as an optimization variable, and taking the relative pose solved by the P3P algorithm as an initial value of the optimization variable to carry out nonlinear optimization relative pose solving to obtain the relative pose between the marker coordinate system and the camera coordinate system;
8) Converting the obtained relative pose between the marker coordinate system and the camera coordinate system into the relative pose between a passive end coordinate system fixedly connected with the marker and an active end coordinate system fixedly connected with the camera.
6. The method for measuring an out-of-plane point cooperation marker according to claim 5, wherein the step 1) specifically comprises:
the mark point on the first column is marked as a mark point No. 0, the mark point on the second column is marked as a mark point No. 1, the mark point on the upper right corner of the base surface is marked as a mark point No. 2, the mark point on the lower right corner of the base surface is marked as a mark point No. 3, the mark point on the third column is marked as a mark point No. 4, the mark point on the lower left corner of the base surface is marked as a mark point No. 5, and the mark point on the upper left corner of the base surface is marked as a mark point No. 6;
the projection point of the No. 0 mark point on the base surface is defined as the origin of the coordinate system of the marker, the direction from the No. 6 mark point to the No. 2 mark point is defined as the positive direction of the X axis of the coordinate system of the marker, and the direction from the No. 6 mark point to the No. 5 mark point is defined as the positive direction of the Y axis of the coordinate system of the marker.
7. The method for measuring an out-of-plane cooperative marker according to claim 5, wherein in step 2), the setting of the plurality of groups of minimum identification units is:
{0,2,3},{0,3,5},{0,5,6},{0,6,2},{0,2,5},{0,6,3},{1,6,2},{4,3,5};
wherein, the numbers 0 to 6 correspond to the number 0 mark point to the number 6 mark point respectively;
wherein the layout of the three marker points of each of the first 4 sets {0,2,3}, {0,3,5}, {0,5,6}, {0,6,2} on the marker constitutes a clockwise ordering;
the last 4 groups {0,2,5}, {0,6,3}, {1,6,2}, {4,3,5}, the projection of the first landmark point of each group on the base is located in the middle of the second and third landmark points;
and screening candidate minimum identification units by setting constraint conditions.
8. The method for measuring a different plane point cooperative marker according to claim 5, wherein in step 6), the matching is performed from the searched marker points in the image to obtain the successfully matched marker points, specifically:
if the Euclidean distance between the imaging coordinate and a certain mark point coordinate is smaller than a set threshold value, the mark point is successfully matched with the image feature point, and if the matching point meets the set quantity constraint, the target is identified; otherwise, continuously judging the initial pose values of the marker coordinate systems of other marker points solved by the P3P algorithm relative to the camera coordinate system;
if none of the poses solved by the P3P algorithm meets the constraint condition on the number of matched points, the candidate minimum identification unit is assumed in turn to be another of the plural groups of minimum identification units set in step 2), until either all cases among the plural groups of minimum identification units have been traversed or the target is correctly identified.
9. The method for measuring a different plane point cooperation marker according to claim 5, wherein the step 7) is specifically:
obtaining an initial pose value of a marker coordinate system relative to a camera coordinate system, taking a re-projection error as an objective function E, taking a relative pose between the marker coordinate system and the camera coordinate system as an optimization variable, and solving through nonlinear optimization to obtain a rotation-translation relation (Rt) of the marker coordinate system relative to the camera coordinate system, namely:
$$E = \sum_{i=1}^{n} \left[ (u_i - \hat{u}_i)^2 + (v_i - \hat{v}_i)^2 \right]$$

$$\begin{bmatrix} X_i^{c} \\ Y_i^{c} \\ Z_i^{c} \end{bmatrix} = R \begin{bmatrix} X_i \\ Y_i \\ Z_i \end{bmatrix} + t, \qquad \hat{u}_i = u_0 + f_u \frac{X_i^{c}}{Z_i^{c}}, \qquad \hat{v}_i = v_0 + f_v \frac{Y_i^{c}}{Z_i^{c}}$$

wherein $(X_i, Y_i, Z_i)$ are the coordinates of the i-th mark point in the marker coordinate system, $(u_i, v_i)$ are the corresponding imaging coordinates, (R t) is the rotation-translation relation of the marker coordinate system relative to the camera coordinate system with corresponding position quantities $t_x, t_y, t_z$ and Euler angles $a_x, a_y, a_z$, $f_u, f_v, u_0, v_0$ are intrinsic parameters of the camera, and $(\hat{u}_i, \hat{v}_i)$ are the mark point imaging coordinates recalculated from $(X_i, Y_i, Z_i)$, the relative rotation-translation relation (R t) and the camera intrinsic parameters.
10. The method for measuring a different-plane-point cooperation marker according to claim 5, wherein the step 8) is specifically:
let the rotation-translation matrix $T_{MP}$ represent the relative pose relationship between the passive end coordinate system and the marker coordinate system, i.e. the pose of the passive end coordinate system relative to the marker coordinate system; let the rotation-translation matrix $T_{AC}$ represent the relative pose relationship between the camera coordinate system and the active end coordinate system, i.e. the pose of the camera coordinate system relative to the active end coordinate system; let the rotation-translation matrix $T_{CM}$ represent the relative pose of the marker coordinate system relative to the camera coordinate system; and let the rotation-translation matrix $T_{AP}$ represent the relative pose of the passive end coordinate system relative to the active end coordinate system; then:

$$T_{AP} = T_{AC}\,T_{CM}\,T_{MP}$$

wherein $T_{MP}$ and $T_{AC}$ are fixed parameters that can be obtained by calibration, and $T_{CM}$ represents the (R t) calculated by nonlinear optimization.
CN202211376512.9A 2022-11-04 2022-11-04 Different-surface point cooperation marker and measuring method Pending CN115908562A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211376512.9A CN115908562A (en) 2022-11-04 2022-11-04 Different-surface point cooperation marker and measuring method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211376512.9A CN115908562A (en) 2022-11-04 2022-11-04 Different-surface point cooperation marker and measuring method

Publications (1)

Publication Number Publication Date
CN115908562A true CN115908562A (en) 2023-04-04

Family

ID=86491790

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211376512.9A Pending CN115908562A (en) 2022-11-04 2022-11-04 Different-surface point cooperation marker and measuring method

Country Status (1)

Country Link
CN (1) CN115908562A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116628786A (en) * 2023-07-26 2023-08-22 中南大学 Manufacturing method of special-shaped three-dimensional marking ball
CN116628786B (en) * 2023-07-26 2023-10-10 中南大学 Manufacturing method of special-shaped three-dimensional marking ball

Similar Documents

Publication Publication Date Title
CN107292927B (en) Binocular vision-based symmetric motion platform pose measurement method
CN107014312B (en) A kind of integral calibrating method of mirror-vibrating line laser structured light three-dimension measuring system
CN108562274B (en) Marker-based non-cooperative target pose measurement method
CN109269430B (en) Multi-standing-tree breast height diameter passive measurement method based on deep extraction model
CN111604598B (en) Tool setting method of mechanical arm feeding type laser etching system
CN102773862B (en) Quick and accurate locating system used for indoor mobile robot and working method thereof
CN101299270B (en) Multiple video cameras synchronous quick calibration method in three-dimensional scanning system
CN111968177B (en) Mobile robot positioning method based on fixed camera vision
CN103196370B (en) Measuring method and measuring device of conduit connector space pose parameters
CN108555908A (en) A kind of identification of stacking workpiece posture and pick-up method based on RGBD cameras
CN106651942A (en) Three-dimensional rotation and motion detecting and rotation axis positioning method based on feature points
CN110763204B (en) Planar coding target and pose measurement method thereof
CN105631844A (en) Image camera calibration method
CN110415304B (en) Vision calibration method and system
CN109472778B (en) Appearance detection method for towering structure based on unmanned aerial vehicle
CN113028990B (en) Laser tracking attitude measurement system and method based on weighted least square
CN115774265A (en) Two-dimensional code and laser radar fusion positioning method and device for industrial robot
CN112614188A (en) Dot-matrix calibration board based on cross ratio invariance and identification method thereof
CN114998448B (en) Multi-constraint binocular fisheye camera calibration and space point positioning method
CN110838146A (en) Homonymy point matching method, system, device and medium for coplanar cross-ratio constraint
CN115451964A (en) Ship scene simultaneous mapping and positioning method based on multi-mode mixed features
CN115908562A (en) Different-surface point cooperation marker and measuring method
CN115267747A (en) Calibration method for sparse laser radar and visible light/infrared imaging system
CN113963067B (en) Calibration method for calibrating large-view-field visual sensor by using small target
US20210272301A1 (en) Method for processing three-dimensional point cloud data

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination