CN115496802A - High-precision pose measurement method based on butt joint surface characteristics - Google Patents


Publication number
CN115496802A
Authority
CN
China
Prior art keywords
target
pin hole
component
coordinate system
displacement
Prior art date
Legal status
Pending
Application number
CN202210962107.9A
Other languages
Chinese (zh)
Inventor
罗华
李瑞峰
郭静
杨娜
张力力
郭超
Current Assignee
Xi'an Aerospace Times Precision Electromechanical Co ltd
Original Assignee
Xi'an Aerospace Times Precision Electromechanical Co ltd
Priority date
Filing date
Publication date
Application filed by Xi'an Aerospace Times Precision Electromechanical Co ltd filed Critical Xi'an Aerospace Times Precision Electromechanical Co ltd
Priority to CN202210962107.9A
Publication of CN115496802A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30108: Industrial image inspection
    • G06T 2207/30164: Workpiece; Machine component

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a high-precision pose measurement method based on butt-joint surface features, and aims to solve the technical problems that existing assembly methods for large components have high equipment cost, a complex calibration process, and are easily affected by the industrial environment. The method comprises the following steps: 1. build the assembly platform; 2. calibrate the internal parameters of the monocular vision sensor; 3. calibrate the monocular vision sensor and the motion execution assembly with a hand-eye calibration method; 4. calibrate the corresponding targets and pin holes; 5. measure the pose of the first component relative to the second component.

Description

High-precision pose measurement method based on butt joint surface characteristics
Technical Field
The invention relates to methods for the butt-joint assembly of pin holes on component mating surfaces, and in particular to a high-precision pose measurement method based on butt-joint surface features.
Background
With the continuous improvement in the performance of aerospace products such as aircraft and rockets, higher assembly precision and automation requirements have been placed on the butt-joint assembly of large components. This assembly work originally used manual methods: the docking attitude of the large components was estimated by eye and adjusted by hand. That approach is labor-intensive, inefficient, and dependent on worker experience. Automated assembly is therefore a necessary trend; during the assembly of large components, a measurement system must measure the relative spatial attitude between the docking components in real time and guide a motion execution assembly to adjust the component attitude.
At present, most mature automated assembly uses a laser tracker for pose measurement. This approach calculates the spatial pose from target balls or butt-joint surface features and offers high precision and a large measurement range, but the equipment cost is high, detailed calibration is needed for every assembly, the calibration process is relatively complex, and the requirements on the measurement environment are strict. Compared with laser-tracker measurement, three-dimensional measurement methods have the advantages of low cost and convenient use. In practical application, however, a three-dimensional measurement system is easily affected by the industrial environment: its system parameters must be calibrated regularly to guarantee measurement precision, and any vibration at the work site increases the calibration frequency, degrading the application effect.
Disclosure of Invention
The invention aims to solve the technical problems that the existing assembly method of large-scale components is high in equipment cost, complex in calibration process and easy to be influenced by industrial environment, and provides a high-precision pose measurement method based on butt joint surface characteristics.
The technical scheme of the invention is as follows:
a high-precision pose measurement method based on the characteristics of a butt joint surface is characterized by comprising the following steps:
s1, building an assembly platform: the assembly device comprises a first assembly unit and a second assembly unit which are oppositely arranged, wherein the first assembly unit comprises a first fixed butt joint vehicle, a movable motion execution assembly arranged on the upper surface of the first fixed butt joint vehicle and a first component fixedly arranged on the motion execution assembly, and the butt joint surface of the first component is provided with at least four first pin holes; the second assembly unit comprises a second fixed butting trolley, a fixed execution assembly arranged on the second fixed butting trolley and a second component fixedly arranged on the fixed execution assembly, and second pin holes corresponding to the first pin holes one by one are formed in the butting surface of the second component; establishing a three-dimensional coordinate system O-xyz of the first assembly unit, wherein O is at the center of the butt joint surface of the first component;
s2, carrying out internal reference calibration on the visual sensor to obtain calibration parameters of the visual sensor;
s3, installing the vision sensor with the calibrated internal reference in a first assembly unit, calibrating the vision sensor and the motion execution assembly by adopting a hand-eye calibration method, resolving the conversion relation between a vision sensor coordinate system and a motion execution assembly coordinate system, and acquiring a rotation matrix from the vision sensor coordinate system to the motion execution assembly coordinate system
Figure BDA0003793151390000021
And translation matrix
Figure BDA0003793151390000022
S4, arranging a first target on the first component and a second target on the second component, both within the field of view of the vision sensor of step S3; calibrating the positions of the first target and the first pin holes, and of the second target and the second pin holes, with a three-dimensional measuring system; according to a sorting rule, obtaining the first target mark point coordinate ordering $P_1^w$, the second target mark point coordinate ordering $P_2^w$, the first pin hole coordinate ordering $H_1^w$, and the second pin hole coordinate ordering $H_2^w$, all in the target world coordinate system;
S5, acquiring the first target image and the second target image simultaneously with the vision sensor of step S3, and solving the pitch angle θ, yaw angle ψ, roll angle φ, displacement x, displacement y and displacement z of the first component relative to the second component through an image processing algorithm and coordinate system transformation matrices, completing the pose measurement of the first component relative to the second component.
Further, step S5 specifically includes:
s5.1, simultaneously acquiring a first target image and a second target image by adopting the vision sensor in the step S3, respectively extracting the circle center coordinates of the first target mark point and the circle center coordinates of the second target mark point by an image processing algorithm, and sequencing according to the sequencing rule in the step S4 to obtain the first target mark point circle center coordinate sequencing p 1 And second target mark point circle center coordinate sequence p 2
S5.2, according to the calibration parameters of the vision sensor from step S2, the circle-center coordinate orderings $p_1$ and $p_2$, and the target mark point coordinate orderings $P_1^w$ and $P_2^w$ in the target world coordinate system, respectively calculating the coordinates $P_1^c$ and $P_2^c$ of the first and second targets in the vision sensor coordinate system; from these coordinates, computing the rotation-translation transformation $(R_c, T_c)$ between the vision sensor coordinate system and the target world coordinate system;
S5.3, according to the transformation $(R_c, T_c)$ between the vision sensor coordinate system and the target world coordinate system, and the pin hole coordinate orderings $H_1^w$ and $H_2^w$ in the target world coordinate system, respectively calculating the coordinates $H_1^c$ of the first pin hole and $H_2^c$ of the second pin hole in the vision sensor coordinate system;
S5.4, according to the rotation matrix $R_c^m$ and translation matrix $T_c^m$ from the vision sensor coordinate system to the motion execution assembly coordinate system from step S3, and the pin hole coordinates $H_1^c$ and $H_2^c$ in the vision sensor coordinate system, calculating the coordinates $H_1^m$ of the first pin hole and $H_2^m$ of the second pin hole in the motion execution assembly coordinate system;
S5.5, constructing an SVD function to solve, from $H_1^m$ and $H_2^m$, the rigid-body rotation matrix $R_b$ and translation matrix $t_b$ that bring the first pin holes into ideal butt joint with the second pin holes in the motion execution assembly coordinate system;
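Step S5.5 names an SVD-based solve for the rigid transform between the two pin-hole point sets. Below is a minimal numpy sketch of that technique (the classic Kabsch/SVD registration); the function name, array shapes, and centering details are our assumptions, not wording from the patent:

```python
import numpy as np

def rigid_transform_svd(src, dst):
    """Estimate R, t such that dst ~= R @ src + t (Kabsch/SVD method).

    src, dst: (N, 3) arrays of corresponding points, e.g. the first and
    second pin-hole coordinates in the motion-execution frame.
    """
    src_c = src - src.mean(axis=0)           # centre both point sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                       # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                  # guard against a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

With at least four non-degenerate pin holes (as step S1 requires), the cross-covariance matrix is well conditioned and the recovered transform is unique.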
S5.6, according to the rotation matrix $R_b$ and translation matrix $t_b$, and using the relation between Euler angles and the rotation and translation matrices, solving the pitch angle θ, yaw angle ψ, roll angle φ, displacement x, displacement y and displacement z of the first component relative to the second component, completing the pose measurement of the first component relative to the second component.
Further, the method also comprises a step S6,
S6, according to the pitch angle θ, yaw angle ψ, roll angle φ, displacement x, displacement y and displacement z acquired in step S5.6, calculating the alignment error value of the first and second pin holes; if this error value is smaller than a preset threshold value, the pose adjustment of the first component relative to the second component is complete; if it is larger than or equal to the preset threshold value, the first component is adjusted to align the first pin holes with the second pin holes, and steps S5.1 to S5.6 are repeated until the alignment error value is smaller than the preset threshold value.
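The measure-adjust-repeat logic of step S6 can be sketched as a simple control loop. Here `measure_pose`, `adjust_component`, and the scalar error proxy are hypothetical stand-ins for the patent's measurement (S5.1 to S5.6) and actuation steps, not its actual interfaces:

```python
def docking_loop(measure_pose, adjust_component, threshold, max_iters=50):
    """Iterate steps S5.1-S5.6 until the alignment error is below threshold.

    measure_pose() -> (theta, psi, phi, x, y, z) in the motion-execution
    frame; adjust_component(pose) commands the motion execution assembly.
    """
    for _ in range(max_iters):
        pose = measure_pose()
        error = max(abs(v) for v in pose)   # simple scalar error proxy
        if error < threshold:
            return pose                      # pin holes considered aligned
        adjust_component(pose)               # correct toward zero pose error
    raise RuntimeError("docking did not converge")
```

A real system would combine the six components into a task-specific error metric rather than a plain maximum.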
Further, in the step S4, the three-dimensional measurement system is adopted to calibrate the positions of the first target and the first pin hole and the positions of the second target and the second pin hole respectively; the method specifically comprises the following steps:
s4.1, respectively acquiring three-dimensional coordinates of a first target mark point, a first pin hole, a second target mark point and a second pin hole under a target world coordinate system by adopting a three-dimensional measurement system;
S4.2, sorting the three-dimensional coordinates obtained in step S4.1 according to the sorting rule, obtaining the first target mark point coordinate ordering $P_1^w$, the second target mark point coordinate ordering $P_2^w$, the first pin hole coordinate ordering $H_1^w$, and the second pin hole coordinate ordering $H_2^w$.
Further, in step S5.3, the coordinates $H_1^c$ of the first pin hole and $H_2^c$ of the second pin hole in the vision sensor coordinate system are calculated by the formulas

$H_1^c = R_c H_1^w + T_c$

$H_2^c = R_c H_2^w + T_c$
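The world-to-sensor frame change of step S5.3 is a single rotation plus translation applied to each point. A minimal numpy sketch, where the function name and the (N, 3) row-per-point layout are our own conventions:

```python
import numpy as np

def to_camera_frame(R_c, T_c, pts_w):
    """Map points from the target world frame into the vision-sensor frame:
    H^c = R_c @ H^w + T_c, applied row-wise to an (N, 3) array of points."""
    return (np.asarray(R_c) @ np.asarray(pts_w).T).T + np.asarray(T_c)
```

The same helper serves both pin-hole sets, since $(R_c, T_c)$ is one transformation shared by everything measured in the target world frame.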
Further, in step S5.4, the coordinates $H_1^m$ of the first pin hole and $H_2^m$ of the second pin hole in the motion execution assembly coordinate system are calculated by the formulas

$H_1^m = R_c^m H_1^c + T_c^m$

$H_2^m = R_c^m H_2^c + T_c^m$
Further, in step S5.6, the pitch angle θ, yaw angle ψ and roll angle φ of the first component relative to the second component are solved from the rotation matrix $R_b$ and translation matrix $t_b$ using the relation between Euler angles and the rotation and translation matrices.

The rotation-translation matrix M assembled from $R_b$ and $t_b$ is

$M = \begin{bmatrix} R_b & t_b \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} r_{11} & r_{12} & r_{13} & t_1 \\ r_{21} & r_{22} & r_{23} & t_2 \\ r_{31} & r_{32} & r_{33} & t_3 \\ 0 & 0 & 0 & 1 \end{bmatrix}$

wherein $r_{11}, r_{12}, r_{13}, \ldots, r_{32}, r_{33}$ are the elements of the rotation matrix $R_b$ and $t_1, t_2, t_3$ the elements of the translation matrix $t_b$. Under the common Z-Y-X (yaw-pitch-roll) Euler convention, the pitch angle θ, yaw angle ψ and roll angle φ of the first component relative to the second component are

$\theta = \arcsin(-r_{31}), \quad \psi = \operatorname{atan2}(r_{21}, r_{11}), \quad \phi = \operatorname{atan2}(r_{32}, r_{33})$
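The patent's Euler-angle extraction formulas survive only as an image, so the block below is a reconstruction assuming the common Z-Y-X (yaw-pitch-roll) convention; the patent's actual convention, and hence the exact formulas, may differ. Function name and argument layout are ours:

```python
import numpy as np

def euler_zyx_from_matrix(R_b):
    """Recover pitch, yaw and roll from a 3x3 rotation matrix, assuming the
    Z-Y-X (yaw-pitch-roll) convention: R = Rz(psi) @ Ry(theta) @ Rx(phi)."""
    theta = np.arcsin(-R_b[2, 0])             # pitch from r31
    psi = np.arctan2(R_b[1, 0], R_b[0, 0])    # yaw from r21, r11
    phi = np.arctan2(R_b[2, 1], R_b[2, 2])    # roll from r32, r33
    return theta, psi, phi
```

Note the usual caveat: this extraction degenerates when the pitch approaches plus or minus 90 degrees (gimbal lock), which is far from the small corrective angles expected during docking.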
Further, in step S3, a plurality of vision sensors may be installed in the first assembly unit;
in step S4.1, at least one first target and at least one second target are provided so that the field of view of each vision sensor contains both a first target and a second target simultaneously;
in step S5.6, each of the plurality of vision sensors solves one group of pitch angle θ, yaw angle ψ, roll angle φ, displacement x, displacement y and displacement z of the first component relative to the second component;
according to preset threshold ranges for the pitch angle, yaw angle, roll angle, displacement x, displacement y and displacement z: if θ, ψ, φ, x, y and z all lie within their corresponding threshold ranges, the group (θ, ψ, φ, x, y, z) is a valid value; if any of θ, ψ, φ, x, y or z lies outside its corresponding threshold range, the group (θ, ψ, φ, x, y, z) is an invalid value; the mean over the valid groups is taken as the pitch angle, yaw angle, roll angle, displacement x, displacement y and displacement z of the first component relative to the second component;
if all groups solved by the plurality of vision sensors are invalid values, return to step S5.1 and re-measure.
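The validity filtering and averaging above can be sketched in a few lines; the function name, the 6-tuple pose layout, and the (lo, hi) range representation are our assumptions:

```python
def fuse_poses(poses, ranges):
    """Keep pose groups whose every component lies inside its threshold
    range, then average the valid groups component-wise.

    poses: list of 6-tuples (theta, psi, phi, x, y, z), one per sensor;
    ranges: list of six (lo, hi) pairs. Returns None if no group is valid
    (the caller should then re-measure, as in step S5.1).
    """
    valid = [p for p in poses
             if all(lo <= v <= hi for v, (lo, hi) in zip(p, ranges))]
    if not valid:
        return None                           # every group invalid
    n = len(valid)
    return tuple(sum(vs) / n for vs in zip(*valid))
```

Averaging only the in-range groups gives a cheap robustness to one sensor mis-detecting a target, which is the point of deploying several sensors.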
Further, the first target and the second target are standard targets correspondingly mounted on the first component and the second component;
or the first target is a feature with target properties inherent on the first component and the second target is a feature with target properties inherent on the second component.
Further, in step S3, the visual sensor after the internal reference calibration is installed in the first assembly unit, and the calibration of the visual sensor and the motion execution component by using the hand-eye calibration method specifically includes:
installing a visual sensor on a first fixed butting vehicle, and fixedly installing a hand-eye calibration tool on a motion execution assembly;
or the vision sensor is arranged on the motion execution assembly, the hand-eye calibration tool is fixedly arranged in the region outside the motion execution assembly and the first component and is positioned in the visual field range of the vision sensor, and the position of the hand-eye calibration tool is kept unchanged.
The invention has the beneficial effects that:
1. In the high-precision pose measurement method based on butt-joint surface features, the conversion between coordinate systems during measurement is realized by calibrating the internal parameters of the vision sensor, calibrating the vision sensor against the motion execution assembly, and calibrating the positions of the first target and first pin holes and of the second target and second pin holes. These calibrations need to be performed only once and can then be used continuously, simplifying otherwise tedious calibration operations.
2. According to the invention, the position and posture relation of the first component relative to the second component under the coordinate system of the motion execution assembly is measured, and the motion execution assembly is adjusted to realize the butt joint of the first component and the second component, so that compared with a laser tracker measurement guiding butt joint method, the method is easy to operate and low in cost; compared with a three-dimensional measurement guiding butt joint method, the method has the advantages of high stability and good flexibility, and simplifies operation and maintenance processes.
3. The method is simple in calculation and high in operation speed.
4. The method provided by the invention can realize the pose measurement task when large components of different models are butted, and has the advantages of multiple application scenes, simple system structure, easy arrangement, simple calibration process and good stability.
Drawings
FIG. 1 is a schematic flow diagram of a high-precision pose measurement method based on the characteristics of a butt joint surface;
FIG. 2 is a schematic diagram of an embodiment of the high-precision pose measurement device based on the characteristics of the butt joint surfaces;
FIG. 3 is a schematic view of a first pin hole of the mating surface according to an embodiment of the present invention;
FIG. 4 is a schematic diagram illustrating a hand-eye calibration principle according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of measuring the transformation of the pose coordinate system of the first component relative to the second component in the embodiment of the invention;
the reference numbers are as follows:
11-first fixed docking car, 12-motion actuator, 13-first part, 14-first target, 15-first pin hole, 16-vision sensor, 21-second fixed docking car, 22-fixed actuator, 23-second part, 24-second target, 25-second pin hole.
Detailed Description
Referring to fig. 1, the present embodiment provides a high-precision pose measurement method based on the features of the butt-joint surfaces, including the following steps:
s1, building an assembly platform: referring to fig. 2, the device comprises a first assembly unit and a second assembly unit which are oppositely arranged, wherein the first assembly unit comprises a first fixed butting trolley 11, a movable motion executing assembly 12 arranged on the upper surface of the first fixed butting trolley 11 and a first part 13 fixedly arranged on the motion executing assembly 12, and the first part 13 is provided with at least four first pin holes 15, and is shown in fig. 3; the second assembly unit comprises a second fixed butting trolley 21, a fixed execution assembly 22 arranged on the second fixed butting trolley 21 and a second component 23 fixedly arranged on the fixed execution assembly 22, and the second component 23 is provided with second pin holes 25 corresponding to the first pin holes 15 one by one; a three-dimensional coordinate system O-xyz is established for the first assembly unit, O being centered on the abutment surface of the first component 23.
S2, performing internal-reference calibration on the vision sensor 16. In this embodiment the vision sensor 16 is a monocular vision sensor; in other embodiments it may be a binocular vision sensor or another measuring sensor. Here the internal-reference calibration of the monocular vision sensor is performed with a calibration plate, following Z. Zhang, "A flexible new technique for camera calibration", IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000, 22(11): 1330-1334.
S2.1, acquiring images of the internal-reference calibration plate at multiple different positions with the monocular vision sensor. Specifically, the calibration plate is freely moved 8-10 times in non-parallel poses within the field of view of the monocular vision sensor, with the positions distributed across the whole field of view as far as possible; the sensor captures one image of the calibration plate at each position.
S2.2, let the circle center P of any feature point on the internal-reference calibration plate have coordinates $(X_w, Y_w, Z_w)$ in the calibration-plate world coordinate system, with projection coordinates $(u, v)$ on the image plane. According to the perspective projection model of the monocular vision sensor, the effective focal lengths $(f_x, f_y)$ along the $X_1, Y_1$ directions and the principal point coordinates $(u_0, v_0)$ (the intersection of the sensor's optical axis with the image plane, in image pixel coordinates) are obtained from

$\lambda \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = A \begin{bmatrix} R & T \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}, \qquad A = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$

wherein: A is the internal parameter matrix of the monocular vision sensor; $(f_x, f_y)$ are the effective focal lengths along the $X_1$ and $Y_1$ axes of the image physical coordinate system, i.e. the ratio of the focal length to the physical size of a unit pixel along each axis; $(R, T)$ are the external parameters of the monocular vision sensor, representing the transformation between the calibration-plate world coordinate system and the camera coordinate system; and λ is an arbitrary non-zero scale factor.
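The projection model can be exercised with a small numpy sketch; the function name and the illustrative intrinsics below are ours, not values from the patent:

```python
import numpy as np

def project_point(A, R, T, Pw):
    """Project a world point (X_w, Y_w, Z_w) to a pixel (u, v) with the
    pinhole model: lambda * [u, v, 1]^T = A [R | T] [X_w, Y_w, Z_w, 1]^T."""
    Pc = np.asarray(R) @ np.asarray(Pw) + np.asarray(T)  # world -> camera
    uvw = np.asarray(A) @ Pc                 # homogeneous image coordinates
    return uvw[0] / uvw[2], uvw[1] / uvw[2]  # divide by lambda (= Z_c here)
```

The scale factor λ drops out in the final division, which is why it can be "arbitrary and non-zero" in the model.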
The radial distortion coefficients $(k_1, k_2)$ and tangential distortion coefficients $(p_1, p_2)$ are obtained from the lens distortion model of the monocular vision sensor:

$x_d = x_1 (1 + k_1 r^2 + k_2 r^4) + 2 p_1 x_1 y_1 + p_2 (r^2 + 2 x_1^2)$

$y_d = y_1 (1 + k_1 r^2 + k_2 r^4) + p_1 (r^2 + 2 y_1^2) + 2 p_2 x_1 y_1$

wherein $(x_1, y_1)$ are the ideal image coordinates, $(x_d, y_d)$ are the actual (distorted) image coordinates, and r is the distance from the image point to the principal point.
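The two distortion equations above translate directly into code. This is a sketch of the standard radial-plus-tangential (Brown-Conrady style) model in normalized image coordinates; the function name and test coefficients are illustrative only:

```python
def distort(x1, y1, k1, k2, p1, p2):
    """Apply radial (k1, k2) and tangential (p1, p2) lens distortion to
    ideal normalized image coordinates (x1, y1), returning (x_d, y_d)."""
    r2 = x1 * x1 + y1 * y1                  # squared distance to principal point
    radial = 1 + k1 * r2 + k2 * r2 * r2     # radial scaling term
    xd = x1 * radial + 2 * p1 * x1 * y1 + p2 * (r2 + 2 * x1 * x1)
    yd = y1 * radial + p1 * (r2 + 2 * y1 * y1) + 2 * p2 * x1 * y1
    return xd, yd
```

During measurement the model is used in the other direction: observed pixel coordinates are undistorted (usually by iterating this forward model) before the PnP solve.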
And S3, in order to guide the motion execution assembly 12 to realize accurate butt joint of the first component 13 and the second component 23, the conversion relation between the coordinate system of the monocular vision sensor and the coordinate system of the motion execution assembly needs to be solved through a hand-eye calibration method.
The specific hand-eye calibration process comprises the following steps: S3.1, installing the monocular vision sensor calibrated in step S2 on the first fixed docking car 11 of the first assembly unit, and fixedly mounting a hand-eye calibration tool on the motion execution assembly 12; the tool comprises a mechanical fixture and a third target, the third target being fixedly connected to the motion execution assembly 12 by the fixture; the motion execution assembly 12 is then moved along its own coordinate system, three positions in each direction, ensuring that the third target remains within the field of view of the monocular vision sensor throughout the movement;
S3.2, at each position, the monocular vision sensor acquires an image of the third target, the image containing all mark points of the third target;
S3.3, extracting the image coordinates of the third target's mark points and placing them in one-to-one correspondence with the mark point coordinates in the third target coordinate system;
S3.4, referring to fig. 4, define A as the rotation-translation transformation in the monocular vision sensor coordinate system, B as the rotation-translation transformation in the motion execution assembly coordinate system, and X as the unknown sensor-to-assembly transformation; according to the hand-eye calibration equation AX = XB, solve for X:

$\begin{bmatrix} R_A & t_A \\ 0 & 1 \end{bmatrix} \begin{bmatrix} R_X & t_X \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} R_X & t_X \\ 0 & 1 \end{bmatrix} \begin{bmatrix} R_B & t_B \\ 0 & 1 \end{bmatrix}$

obtaining the rotation matrix $R_c^m$ and the translation matrix $T_c^m$ from the monocular vision sensor coordinate system to the motion execution assembly coordinate system.
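The structure of the AX = XB relation can be verified numerically: if the hand-eye transform X is known, any actuator motion B induces the camera-frame motion A = X B X⁻¹, and the equation holds by construction. A small sketch (the helper name and example transforms are ours; a real solver, e.g. Tsai-Lenz, would estimate X from several (A, B) pairs):

```python
import numpy as np

def homogeneous(R, t):
    """Pack a 3x3 rotation and a translation into a 4x4 homogeneous transform."""
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = t
    return M
```

Collecting motions in three directions, as step S3.1 prescribes, gives enough independent (A, B) pairs for the rotation part of X to be uniquely determined.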
It is understood that, as an alternative in step S3.1, the monocular vision sensor may instead be mounted on the motion execution assembly 12, with the corresponding third target fixed by a mechanical fixture in an area outside the motion execution assembly 12 and the first component 13, within the field of view of the sensor, and the third target's position kept unchanged.
It should be noted that, in step S3, the mounting position of the monocular vision sensor used during hand-eye calibration with the motion execution assembly 12 must not change after calibration; subsequent operation must keep the same position, otherwise the calibration must be performed again.
S4, fixedly connecting the first target 14 to the first component 13 by sticking or spraying, and fixing the second target 24 on the second component 23 by sticking or spraying, with the first target 14 and the second target 24 both within the field of view of the monocular vision sensor of step S3. The number of mark points on the first target 14 and the second target 24 can be flexibly adjusted; the mark points at the four corners of each target are coded mark points used to identify the target and to sort the non-coded mark points on it. It will be appreciated that, alternatively, the first target 14 may be a feature with target properties inherent on the first component 13, and the second target 24 a feature with target properties inherent on the second component 23.
In addition, to improve measurement accuracy, a plurality of monocular vision sensors, first targets 14 and second targets 24 may be provided. Preferably, one monocular vision sensor is arranged on the left side of the first component 13, one on the right side and one on the upper side, while ensuring that the field of view of each monocular vision sensor contains both a first target 14 and a second target 24.
The positions of the first target 14 and the first pin hole 15 and the positions of the second target 24 and the second pin hole 25 are respectively calibrated by adopting a three-dimensional measuring system:
and S4.1, respectively obtaining three-dimensional coordinates of the first target mark point, the first pin hole 15, the second target mark point and the second pin hole 25 in a target world coordinate system by adopting a three-dimensional measurement system.
S4.2, sorting the three-dimensional coordinates obtained in step S4.1 according to the sorting rule, obtaining the first target mark point coordinate ordering $P_1^w$, the second target mark point coordinate ordering $P_2^w$, the first pin hole coordinate ordering $H_1^w$, and the second pin hole coordinate ordering $H_2^w$.
S5, referring to fig. 5, the pose of the first part 13 with respect to the second part 23 is measured:
S5.1, acquiring the first target image and the second target image simultaneously with the monocular vision sensor of step S3, with all mark points of the first target 14 and the second target 24 contained in the corresponding images. The image processing algorithm extracts the circle-center coordinates of the mark points of the first target 14 and the second target 24, which are sorted according to the sorting rule of step S4.2 to obtain the first target mark point circle-center coordinate ordering $p_1$ and the second target mark point circle-center coordinate ordering $p_2$.
S5.2, from the calibration parameters of the monocular vision sensor obtained in step S2, the first target mark point circle-center coordinate ordering p_1, the second target mark point circle-center coordinate ordering p_2, and the first and second target mark point coordinate orderings P_1^w and P_2^w in the target world coordinate system, calculate with a PnP algorithm the coordinates P_1^c and P_2^c of the first target 14 and the second target 24 in the monocular vision sensor coordinate system; from the coordinates P_1^c and P_2^c, calculate the rotation transformation (R_c, T_c) between the monocular vision sensor coordinate system and the target world coordinate system.
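Step S5.2 names a PnP algorithm without fixing a particular one. For the common case of a planar target (all mark points on one plane), the pose (R_c, T_c) can be recovered from a single view via the standard homography decomposition; the sketch below assumes an intrinsic matrix K from the step-S2 calibration and noise-free, undistorted image points, and is illustrative rather than the patent's own method:

```python
import numpy as np

def homography_dlt(world_xy, img_uv):
    # Direct linear transform for the plane-to-image homography H (3x3).
    A = []
    for (X, Y), (u, v) in zip(world_xy, img_uv):
        A.append([-X, -Y, -1, 0, 0, 0, u * X, u * Y, u])
        A.append([0, 0, 0, -X, -Y, -1, v * X, v * Y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    return Vt[-1].reshape(3, 3)          # null vector of A, up to scale

def planar_pnp(world_xy, img_uv, K):
    # Pose (R, t) of a planar target from one view (Zhang-style decomposition).
    H = homography_dlt(world_xy, img_uv)
    B = np.linalg.inv(K) @ H             # proportional to [r1 r2 t]
    if B[2, 2] < 0:                      # choose the sign placing the target in front
        B = -B
    lam = 1.0 / np.linalg.norm(B[:, 0])
    r1, r2, t = lam * B[:, 0], lam * B[:, 1], lam * B[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)          # re-orthonormalize the rotation
    return U @ Vt, t
```

With real, noisy detections an iterative refinement (or a library PnP solver) would normally follow this closed-form estimate.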
S5.3, from the rotation transformation (R_c, T_c) between the monocular vision sensor coordinate system and the target world coordinate system, and the first and second pin hole coordinate orderings Q_1^w and Q_2^w in the target world coordinate system, compute according to the formulas

Q_1^c = R_c · Q_1^w + T_c

Q_2^c = R_c · Q_2^w + T_c

the coordinates Q_1^c of the first pin hole 15 and Q_2^c of the second pin hole 25 in the monocular vision sensor coordinate system.
S5.4, from the rotation matrix R_c^b and the translation matrix T_c^b from the monocular vision sensor coordinate system to the motion execution assembly coordinate system obtained by the hand-eye calibration of step S3, and the coordinates Q_1^c of the first pin hole 15 and Q_2^c of the second pin hole 25 in the monocular vision sensor coordinate system, compute according to the formulas

Q_1^b = R_c^b · Q_1^c + T_c^b

Q_2^b = R_c^b · Q_2^c + T_c^b

the coordinates Q_1^b of the first pin hole 15 and Q_2^b of the second pin hole 25 in the motion execution assembly coordinate system.
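Steps S5.3 and S5.4 are both applications of a single rigid transform p' = R·p + T, first from world to camera frame and then from camera to motion-execution frame. A minimal sketch (the symbol names R_c, T_c, R_cb, T_cb follow the text; the helper itself is illustrative):

```python
import numpy as np

def to_frame(R, T, pts):
    # Rigid transform of an (N, 3) array of points: p' = R @ p + T for each row p.
    return pts @ np.asarray(R, float).T + np.asarray(T, float)
```

Chaining the two steps, to_frame(R_cb, T_cb, to_frame(R_c, T_c, Q_w)), is equivalent to one composed transform with rotation R_cb @ R_c and translation R_cb @ T_c + T_cb, which is why the two steps can also be collapsed into a single world-to-body mapping.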
S5.5, construct an SVD function and solve Q_2^b = R_b · Q_1^b + t_b, calculating the rotation matrix R_b and translation matrix t_b of the rigid-body transformation between the first pin hole 15 and the second pin hole 25 at ideal butt joint in the motion execution assembly coordinate system.
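The SVD solution referenced in step S5.5 corresponds, in standard form, to the Kabsch (orthogonal Procrustes) least-squares rigid registration between two point sets in known correspondence (which the sorting of step S4.2 guarantees). A sketch of that standard method, not necessarily the patent's exact construction:

```python
import numpy as np

def rigid_transform_svd(P, Q):
    # Least-squares rigid transform (R, t) with Q ≈ R @ P + t, via SVD (Kabsch).
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    Pc, Qc = P.mean(axis=0), Q.mean(axis=0)       # centroids
    H = (P - Pc).T @ (Q - Qc)                     # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the recovered rotation.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = Qc - R @ Pc
    return R, t
```

With at least three non-collinear corresponding points (the at-least-four pin holes of step S1 suffice) the solution is unique.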
S5.6, from the rotation matrix R_b and the translation matrix t_b, solve the pitch angle θ, yaw angle ψ, roll angle φ, displacement x, displacement y and displacement z of the first component 13 relative to the second component 23 using the relation between Euler angles and the rotation matrix, completing the pose measurement of the first component 13 relative to the second component 23. Specifically, the rotation-translation matrix M assembled from the rotation matrix R_b and the translation matrix t_b is

M = [ r_11 r_12 r_13 t_1 ; r_21 r_22 r_23 t_2 ; r_31 r_32 r_33 t_3 ]

wherein r_11, r_12, r_13 … r_32, r_33 are the elements of the rotation matrix R_b and t_1, t_2, t_3 are the elements of the translation matrix t_b. For a Z-Y-X Euler convention, the pitch angle θ, yaw angle ψ and roll angle φ of the first component 13 relative to the second component 23 are calculated as

θ = arcsin(−r_31), ψ = arctan2(r_21, r_11), φ = arctan2(r_32, r_33)

with displacements x = t_1, y = t_2, z = t_3. The measurement of the pose of the first component 13 relative to the second component 23 is complete.
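The Euler-angle extraction can be sketched as follows, assuming a Z-Y-X (yaw-pitch-roll) convention; the source text's exact convention is not recoverable from this copy, so treat the element-to-angle mapping as an assumption:

```python
import numpy as np

def pose_from_rt(Rb, tb):
    # Z-Y-X Euler angles and displacements from a rotation matrix and translation.
    # Assumes Rb = Rz(psi) @ Ry(theta) @ Rx(phi), so Rb[2,0] = -sin(theta).
    theta = np.arcsin(-Rb[2, 0])             # pitch
    psi = np.arctan2(Rb[1, 0], Rb[0, 0])     # yaw
    phi = np.arctan2(Rb[2, 1], Rb[2, 2])     # roll
    x, y, z = tb
    return theta, psi, phi, x, y, z
```

Near θ = ±90° (gimbal lock) the yaw/roll split degenerates; for the small misalignments expected between nearly docked components this is not a concern.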
When the plurality of vision sensors 16 are provided, the plurality of vision sensors 16 solve for a plurality of sets of pitch angle θ, yaw angle ψ, roll angle Φ, displacement x, displacement y, and displacement z of the first member 13 relative to the second member 23.
According to preset threshold ranges for the pitch angle, yaw angle, roll angle, displacement x, displacement y and displacement z: if the pitch angle θ, yaw angle ψ, roll angle φ, displacement x, displacement y and displacement z all lie within their corresponding threshold ranges, the group (θ, ψ, φ, x, y, z) is a valid value; if any of the pitch angle θ, yaw angle ψ, roll angle φ, displacement x, displacement y or displacement z lies outside its corresponding threshold range, the group (θ, ψ, φ, x, y, z) is an invalid value. The averages over the valid groups are taken as the pitch angle, yaw angle, roll angle, displacement x, displacement y and displacement z of the first component 13 relative to the second component 23. If all the groups solved by the plurality of vision sensors 16 are invalid values, the procedure returns to step S5.1 for re-measurement.
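The multi-sensor validity check and averaging just described can be sketched as follows (the threshold ranges are application-specific and assumed here):

```python
import numpy as np

def fuse_pose_estimates(groups, ranges):
    """Filter and average multi-sensor pose groups.

    groups: sequence of (theta, psi, phi, x, y, z) tuples, one per sensor.
    ranges: six (lo, hi) threshold pairs, one per component.
    A group is valid only if every component lies inside its range;
    returns the component-wise mean of the valid groups, or None if all
    groups are invalid (the caller then returns to step S5.1).
    """
    g = np.asarray(groups, float)
    lo = np.array([r[0] for r in ranges])
    hi = np.array([r[1] for r in ranges])
    valid = g[np.all((g >= lo) & (g <= hi), axis=1)]
    if valid.size == 0:
        return None
    return valid.mean(axis=0)
```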
S6, from the pitch angle θ, yaw angle ψ, roll angle φ, displacement x, displacement y and displacement z acquired in step S5.6, calculate the alignment error of the first pin hole 15 and the second pin hole 25. If the alignment error is smaller than the preset threshold value, the pose adjustment of the first component 13 relative to the second component 23 is complete; if the alignment error is greater than or equal to the preset threshold value, adjust the first component 13 to align the first pin hole 15 with the second pin hole 25, and repeat steps S5.1 to S5.6 until the alignment error is smaller than the preset threshold value.
In the high-precision pose measurement method based on butt-joint surface features, three calibrations are performed: 1. the internal parameters of the monocular vision sensor; 2. the transformation between the monocular vision sensor coordinate system and the motion execution assembly coordinate system; 3. the positions of the first target 14 and the first pin hole 15, and of the second target 24 and the second pin hole 25. Calibration is performed only once, after which continuous measurement is possible; recalibration is required only if the corresponding monocular vision sensor, first target 14 or second target 24 falls off and is replaced, or if the device suffers a severe impact or vibration. When the butt-joint pose is measured after calibration, the monocular vision sensor simultaneously captures a group of images containing all mark points of the first target 14 and the second target 24; the coordinates of all mark points of both targets are extracted by a target recognition technique and sorted according to the specified rule; the three-dimensional coordinates of the mark points are transformed into the motion execution assembly coordinate system using the internal parameters and distortion coefficients of the monocular vision sensor and the calibrated coordinate-system transformation; the relative pose of the first target 14 with respect to the second target 24 in the motion execution assembly coordinate system is then calculated from the calibrated positions of the targets and the corresponding pin holes.

Claims (10)

1. A high-precision pose measurement method based on butt joint surface features is characterized by comprising the following steps:
s1, building an assembly platform: the assembly platform comprises a first assembly unit and a second assembly unit which are oppositely arranged; the first assembly unit comprises a first fixed butting trolley (11), a movable motion execution assembly (12) arranged on the upper surface of the first fixed butting trolley (11) and a first component (13) fixedly arranged on the motion execution assembly (12), the butting surface of the first component (13) being provided with at least four first pin holes (15); the second assembly unit comprises a second fixed butting trolley (21), a fixed execution assembly (22) arranged on the second fixed butting trolley (21) and a second component (23) fixedly arranged on the fixed execution assembly (22), the butting surface of the second component (23) being provided with second pin holes (25) in one-to-one correspondence with the first pin holes (15); and establishing a three-dimensional coordinate system O-xyz of the first assembly unit, with O at the centre of the butting surface of the first component (13);
s2, performing internal reference calibration on the visual sensor (16) to obtain calibration parameters of the visual sensor (16);
s3, installing the vision sensor (16) with calibrated internal parameters on the first assembly unit, calibrating the vision sensor (16) and the motion execution assembly (12) by a hand-eye calibration method, resolving the transformation between the vision sensor coordinate system and the motion execution assembly coordinate system, and acquiring the rotation matrix R_c^b and the translation matrix T_c^b from the vision sensor coordinate system to the motion execution assembly coordinate system;
S4, arranging a first target (14) and a second target (24) on the first component (13) and the second component (23), the first target (14) and the second target (24) both lying within the field of view of the vision sensor (16) of step S3; calibrating the positions of the first target (14) and the first pin hole (15) and the positions of the second target (24) and the second pin hole (25) with a three-dimensional measuring system; and obtaining, according to the sorting rule, the first target mark point coordinate ordering P_1^w, the second target mark point coordinate ordering P_2^w, the first pin hole coordinate ordering Q_1^w and the second pin hole coordinate ordering Q_2^w;
S5, acquiring the first target image and the second target image simultaneously with the vision sensor (16) of step S3, calculating, through an image processing algorithm and coordinate-system transformation matrices, the pitch angle θ, yaw angle ψ, roll angle φ, displacement x, displacement y and displacement z of the first component (13) relative to the second component (23), and completing the pose measurement of the first component (13) relative to the second component (23).
2. The high-precision pose measurement method based on the butt joint surface features according to claim 1, characterized in that:
the step S5 specifically comprises the following steps:
s5.1, acquiring the first target image and the second target image simultaneously with the vision sensor (16) of step S3, with all mark points on the first target (14) and the second target (24) lying in the corresponding images; respectively extracting the circle-center coordinates of the first target mark points and the second target mark points through an image processing algorithm, and sorting according to the sorting rule of step S4 to obtain the first target mark point circle-center coordinate ordering p_1 and the second target mark point circle-center coordinate ordering p_2;
S5.2, from the calibration parameters of the vision sensor (16) obtained in step S2, the first target mark point circle-center coordinate ordering p_1, the second target mark point circle-center coordinate ordering p_2, and the first and second target mark point coordinate orderings P_1^w and P_2^w in the target world coordinate system, separately calculating the coordinates P_1^c and P_2^c of the first target (14) and the second target (24) in the vision sensor coordinate system; and, from the coordinates P_1^c and P_2^c, computing the rotation transformation (R_c, T_c) between the vision sensor coordinate system and the target world coordinate system;
S5.3, from the rotation transformation (R_c, T_c) between the vision sensor coordinate system and the target world coordinate system, and the first and second pin hole coordinate orderings Q_1^w and Q_2^w in the target world coordinate system, respectively calculating the coordinates Q_1^c of the first pin hole (15) and Q_2^c of the second pin hole (25) in the vision sensor coordinate system;
S5.4, from the rotation matrix R_c^b and the translation matrix T_c^b from the vision sensor coordinate system to the motion execution assembly coordinate system obtained in step S3, and the coordinates Q_1^c of the first pin hole (15) and Q_2^c of the second pin hole (25) in the vision sensor coordinate system, calculating the coordinates Q_1^b of the first pin hole (15) and Q_2^b of the second pin hole (25) in the motion execution assembly coordinate system;
S5.5, constructing an SVD function and solving Q_2^b = R_b · Q_1^b + t_b, calculating the rigid-body transformation rotation matrix R_b and translation matrix t_b of the first pin hole (15) and the second pin hole (25) at ideal butt joint in the motion execution assembly coordinate system;
S5.6, from the rotation matrix R_b and the translation matrix t_b, solving the pitch angle θ, yaw angle ψ, roll angle φ, displacement x, displacement y and displacement z of the first component (13) relative to the second component (23) using the relation between Euler angles and the rotation and translation matrices, and completing the pose measurement of the first component (13) relative to the second component (23).
3. The high-precision pose measurement method based on the butt joint surface features according to claim 2, wherein:
further comprising a step S6 of carrying out,
s6, from the pitch angle θ, yaw angle ψ, roll angle φ, displacement x, displacement y and displacement z acquired in step S5.6, calculating the alignment error of the first pin hole (15) and the second pin hole (25); if the alignment error is smaller than the preset threshold value, the pose adjustment of the first component (13) relative to the second component (23) is complete; if the alignment error is greater than or equal to the preset threshold value, adjusting the first component (13) to align the first pin hole (15) with the second pin hole (25), and repeating steps S5.1 to S5.6 until the alignment error is smaller than the preset threshold value.
4. The high-precision pose measurement method based on the butt joint surface features according to any one of claims 1 to 3, wherein:
in the step S4, a three-dimensional measuring system is adopted to calibrate the positions of the first target (14) and the first pin hole (15) and the positions of the second target (24) and the second pin hole (25) respectively; the method specifically comprises the following steps:
s4.1, respectively obtaining three-dimensional coordinates of a first target mark point, a first pin hole (15), a second target mark point and a second pin hole (25) in a target world coordinate system by adopting a three-dimensional measurement system;
s4.2, sorting the three-dimensional coordinates obtained in step S4.1 according to the sorting rule, obtaining the first target mark point coordinate ordering P_1^w, the second target mark point coordinate ordering P_2^w, the first pin hole coordinate ordering Q_1^w and the second pin hole coordinate ordering Q_2^w.
5. The high-precision pose measurement method based on the butt joint surface features according to claim 4, wherein the method comprises the following steps:
in step S5.3, the coordinates Q_1^c of the first pin hole (15) and Q_2^c of the second pin hole (25) in the vision sensor coordinate system are calculated by the formulas

Q_1^c = R_c · Q_1^w + T_c

Q_2^c = R_c · Q_2^w + T_c.
6. The high-precision pose measurement method based on the butt joint surface features according to claim 5, wherein the method comprises the following steps:
in step S5.4, the coordinates Q_1^b of the first pin hole (15) and Q_2^b of the second pin hole (25) in the motion execution assembly coordinate system are calculated by the formulas

Q_1^b = R_c^b · Q_1^c + T_c^b

Q_2^b = R_c^b · Q_2^c + T_c^b.
7. The high-precision pose measurement method based on the butt joint surface features according to claim 6, wherein the method comprises the following steps:
in step S5.6, from the rotation matrix R_b and the translation matrix t_b, the pitch angle θ, yaw angle ψ and roll angle φ of the first component (13) relative to the second component (23) are calculated using the relation between Euler angles and the rotation and translation matrices: the rotation-translation matrix M assembled from the rotation matrix R_b and the translation matrix t_b is

M = [ r_11 r_12 r_13 t_1 ; r_21 r_22 r_23 t_2 ; r_31 r_32 r_33 t_3 ]

wherein r_11, r_12, r_13 … r_32, r_33 are the elements of the rotation matrix R_b and t_1, t_2, t_3 are the elements of the translation matrix t_b; for a Z-Y-X Euler convention, the pitch angle θ, yaw angle ψ and roll angle φ of the first component (13) relative to the second component (23) are calculated as

θ = arcsin(−r_31), ψ = arctan2(r_21, r_11), φ = arctan2(r_32, r_33).
8. The high-precision pose measurement method based on the butt joint surface features according to claim 7, characterized in that:
in the step S3, the vision sensor (16) with calibrated internal reference is arranged in a first assembly unit, and specifically, the first assembly unit is provided with a plurality of vision sensors (16);
in step S4.1, at least one first target (14) and at least one second target (24) are provided for simultaneously including the first target (14) and the second target (24) in the field of view of the respective vision sensor (16);
in step S5.6, the plurality of vision sensors (16) calculate a plurality of groups of pitch angle theta, yaw angle psi, roll angle phi, displacement x, displacement y and displacement z of the first component (13) relative to the second component (23);
according to the preset pitch angle threshold range, yaw angle threshold range, roll angle threshold range, displacement x threshold range, displacement y threshold range and displacement z threshold range: if the pitch angle θ, yaw angle ψ, roll angle φ, displacement x, displacement y and displacement z all lie within their corresponding threshold ranges, the group (θ, ψ, φ, x, y, z) is a valid value; if any of the pitch angle θ, yaw angle ψ, roll angle φ, displacement x, displacement y or displacement z lies outside its corresponding threshold range, the group (θ, ψ, φ, x, y, z) is an invalid value; the averages of the valid groups are taken as the pitch angle, yaw angle, roll angle, displacement x, displacement y and displacement z of the first component (13) relative to the second component (23);
if all the groups of pitch angle θ, yaw angle ψ, roll angle φ, displacement x, displacement y and displacement z of the first component (13) relative to the second component (23) solved by the plurality of vision sensors (16) are invalid values, the procedure returns to step S5.1 for re-measurement.
9. The high-precision pose measurement method based on the butt joint surface features according to claim 8, wherein the method comprises the following steps:
the first target (14) and the second target (24) are standard targets which are correspondingly arranged on the first component (13) and the second component (23);
alternatively, the first target (14) is an inherent feature of the first component (13) having target properties, and the second target (24) is an inherent feature of the second component (23) having target properties.
10. The high-precision pose measurement method based on the butt joint surface features according to claim 9, wherein:
in the step S3, the visual sensor (16) with the calibrated internal reference is arranged in the first assembly unit, and the calibration of the visual sensor (16) and the motion execution component (12) by adopting a hand-eye calibration method specifically comprises the following steps:
the vision sensor (16) is arranged on a first fixed butt joint vehicle (11), and the hand-eye calibration tool is fixedly arranged on the movement execution component (12);
or the visual sensor (16) is arranged on the motion execution assembly (12), the hand-eye calibration tool is fixedly arranged in the region outside the motion execution assembly (12) and the first component (13) and is positioned in the visual field range of the visual sensor (16), and the position of the hand-eye calibration tool is kept unchanged.
CN202210962107.9A 2022-08-11 2022-08-11 High-precision pose measurement method based on butt joint surface characteristics Pending CN115496802A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210962107.9A CN115496802A (en) 2022-08-11 2022-08-11 High-precision pose measurement method based on butt joint surface characteristics


Publications (1)

Publication Number Publication Date
CN115496802A true CN115496802A (en) 2022-12-20

Family

ID=84467181



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination