CN114092552A - Method for carrying out butt joint on large rigid body member based on fixed end image - Google Patents


Info

Publication number
CN114092552A
Authority
CN
China
Prior art keywords: coordinate system; fixed end; docking; camera; moving end
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111295176.0A
Other languages
Chinese (zh)
Inventor
罗华
杨娜
郭静
程军
Current Assignee
Xian Aerospace Precision Electromechanical Institute
Original Assignee
Xian Aerospace Precision Electromechanical Institute
Priority date
Application filed by Xian Aerospace Precision Electromechanical Institute filed Critical Xian Aerospace Precision Electromechanical Institute
Priority to CN202111295176.0A priority Critical patent/CN114092552A/en
Publication of CN114092552A publication Critical patent/CN114092552A/en

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 — Image analysis
    • G06T 7/70 — Determining position or orientation of objects or cameras
    • G06T 7/73 — Determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 — Subject of image; Context of image processing
    • G06T 2207/30204 — Marker
    • G06T 2207/30208 — Marker matrix

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a method for docking a large rigid-body member based on fixed-end images. The large rigid-body member comprises a fixed end and a moving end; the moving end is mounted on a docking vehicle through a pose-adjusting platform; a camera is mounted on the docking vehicle with its lens facing the docking surface of the fixed end. The method comprises the following steps: 1. calibrate the intrinsic parameters of the vision system; 2. calibrate the extrinsic parameters of the vision system; 3. calibrate the pose of the moving end; 4. perform positioning-guided docking. The method solves the problem that existing vision techniques cannot determine the docking pose in scenarios where images of the fixed end and the moving end cannot be acquired simultaneously.

Description

Method for carrying out butt joint on large rigid body member based on fixed end image
Technical Field
The invention belongs to the field of industrial automation, and in particular relates to a method for docking a large rigid-body member based on fixed-end images.
Background
The assembly and docking of rigid equipment is an important link in aerospace, industrial manufacturing, and related fields. Most existing docking work is finished manually, with high labor intensity and low efficiency; moreover, controlling the relative pose of the two rigid bodies during manual docking is difficult, which greatly degrades docking accuracy.
To address the problems of manual operation, laser measurement and indoor-GPS positioning-and-guidance technologies have been proposed, but both have drawbacks: laser measurement offers high precision and strong anti-interference capability, but the equipment is expensive; indoor GPS has low precision and cannot meet the requirements of high-precision docking assembly.
Compared with these two methods, vision techniques reduce cost and operational complexity while preserving accuracy, so vision-guided rigid-body assembly docking is increasingly widely applied, and a number of patents apply vision techniques to large rigid-body docking.
For example, Chinese patent application CN201610309533.7 discloses a monocular-binocular position-deviation measuring method for cabin docking, in which two independent monocular vision systems collect artificial targets at the moving end and the fixed end respectively, and the docking pose is obtained by combining the pre-calibrated relationship between the two monocular cameras with the three-dimensional coordinates of the artificial targets.
Chinese patent application CN202010760878.0 discloses an automated, vision-aligned missile horizontal loading system and its operating method, which uses a photogrammetric system to pre-calibrate the relationship between an artificial target and the round hole of the docking face, and uses a binocular vision system to acquire target images on the moving end and the fixed end simultaneously to solve the docking pose.
Chinese patent application CN201910339187.0 discloses a vision measurement system and method for the relative pose in large-component docking assembly, which uses two sets of binocular vision systems for coarse and fine positioning respectively to improve docking accuracy.
Chinese patent application CN202011156320.8 discloses a monocular-vision-based automatic docking method for large rigid-body members, which uses a monocular vision system to acquire artificial target images of the moving end and the fixed end simultaneously and can solve the docking pose of two cylinder sections.
All of the above patents can measure the relative pose of two rigid bodies and guide the automatic assembly-and-docking process under ordinary conditions. However, when site conditions prevent the vision system from acquiring images of the fixed end and the moving end simultaneously, none of these methods can meet the positioning-and-guidance requirements of component docking.
Disclosure of Invention
Aiming at the prior-art problem that the docking pose cannot be solved when images of the fixed end and the moving end cannot be acquired simultaneously, the invention provides a method for docking a large rigid-body member based on fixed-end images.
The specific technical scheme of the invention is as follows:
A method for docking a large rigid-body member based on fixed-end images, wherein the large rigid-body member comprises a fixed end and a moving end; the moving end is mounted on a docking vehicle through a pose-adjusting platform; a camera is mounted on the docking vehicle with its lens facing the docking surface of the fixed end; the docking method comprises the following steps:
Step 1: calibrate the intrinsic parameters of the vision system;
Step 2: calibrate the extrinsic parameters of the vision system;
Use the calibration plate to calibrate the camera extrinsics and obtain the rotation-translation relationship [R_c^b | t_c^b] between the camera coordinate system and the pose-adjusting-platform coordinate system, where R_c^b is the rotation matrix from the camera coordinate system to the pose-adjusting-platform coordinate system and t_c^b is the corresponding translation vector;
Step 3: calibrate the pose of the moving end;
Step 3.1: mount a moving-end feature marker in front of the moving end's docking face, ensuring that the marker lies within the camera's field of view and is imaged clearly;
Step 3.2: establish the moving-end coordinate system and, from the relationship between the feature points of the moving-end marker and that coordinate system, obtain the coordinates P_m^m of the marker in the moving-end coordinate system;
Step 3.3: from P_m^m and the vision-system intrinsics obtained in step 1, compute the transformation matrix T_m^c from the moving-end coordinate system to the camera coordinate system with a PnP algorithm, and obtain the coordinates P_m-c of the marker in the camera coordinate system;
Step 3.4: combine the extrinsic calibration result [R_c^b | t_c^b] of step 2 to solve the coordinates P_m-b of the moving-end marker in the pose-adjusting-platform coordinate system;
Step 4: positioning-guided docking;
Step 4.1: establish the fixed-end coordinate system and preset the pose relationship T_m^f between the moving-end coordinate system and the fixed-end coordinate system in the docked state;
Step 4.2: from the relationship between the feature points of the fixed-end marker and the fixed-end coordinate system, obtain the coordinates P_f^f of the fixed-end marker in the fixed-end coordinate system;
Step 4.3: from P_f^f and the vision-system intrinsics obtained in step 1, compute the transformation matrix T_f^c from the fixed-end coordinate system to the camera coordinate system with a PnP algorithm;
Step 4.4: from the preset pose relationship T_m^f of step 4.1, solve the coordinates P_m-b' of the moving-end marker, in the docked state, in the pose-adjusting-platform coordinate system;
Step 4.5: from the coordinates P_m-b of the moving-end marker in the pose-adjusting-platform coordinate system (step 3.4) and the coordinates P_m-b' of the marker in that coordinate system in the docked state (step 4.4), compute the relative pose R_b, t_b of the moving end with respect to the fixed end; this is the docking pose;
Step 4.6: resolve the docking pose into the six degrees of freedom of the pose-adjusting platform to guide the docking pose adjustment.
Further, step 4.1 is implemented as follows:
Step 4.1.1: establish the fixed-end coordinate system, obtain by simulation the structural relationship between the feature points of the moving-end marker and the fixed end in the docked state, and obtain the coordinates P_m^f of all feature points of the moving-end marker, in the docked state, in the fixed-end coordinate system;
Step 4.1.2: compute the transformation between the coordinates P_m^m and the coordinates P_m^f; this is the transformation T_m^f between the moving-end and fixed-end coordinate systems in the docked state:
P_m^f = T_m^f · P_m^m    (1)
further, the moving end characteristic mark is fixed in front of the moving end butt joint surface by adopting an L-shaped bracket; one end of the L-shaped support is fixed at the moving end, and the other end is provided with a moving end characteristic mark.
Further, the fixed-end feature marker is either a fixed-end target with feature points adhered to the fixed-end docking face, or inherent feature points on that face, such as regularly shaped bosses, pits, or screws;
the moving-end feature marker is a moving-end target or an inherent feature arranged at the front of the moving end.
Further, the specific solution of the coordinates P_m-b' in step 4.4:
Step 4.4.1: using the result T_m^f of step 4.1, compute the coordinates of the moving-end marker's feature points, in the docked state, in the fixed-end coordinate system: P_m^f = T_m^f · P_m^m;
Step 4.4.2: using the result T_f^c of step 4.3, compute the coordinates of these feature points, in the docked state, in the pose-adjusting-platform coordinate system: P_m-b' = [R_c^b | t_c^b] · T_f^c · P_m^f;
Step 4.4.3: substituting formula (1) gives the full expression for P_m-b':
P_m-b' = [R_c^b | t_c^b] · T_f^c · T_m^f · P_m^m
Further, P_m-c in step 3.3 is computed as: P_m-c = T_m^c · P_m^m;
P_m-b in step 3.4 is computed as: P_m-b = [R_c^b | t_c^b] · P_m-c;
and R_b, t_b in step 4.5 satisfy: P_m-b' = R_b · P_m-b + t_b.
Further, the specific solving process of the step 4.5 is as follows:
Figure BDA00033362901600000510
wherein r is11To r33To represent a rotation matrix Rb,t1To t3Representing translation vector tb(ii) a And resolving a pitch angle theta, a yaw angle psi and a roll angle phi based on a posture adjusting platform coordinate system by utilizing the Euler angle and rotation matrix relation and the posture adjusting sequence.
Further, the internal parameters of the visual system in step 1 include: ratio (f) of unit pixel size value to focal length in X-axis and Y-axis directions in camera coordinate systemx,fy) And the coordinates (u) of the intersection of the optical axis of the camera and the image plane in the camera coordinate system0,v0);
The specific calibration process comprises the following steps:
step 1.1: placing the internal reference calibration plate within a field of view of a camera of the vision system;
step 1.2: moving the internal reference calibration plate in the field of view of the camera, traversing the field of view of the whole camera, and acquiring a plurality of images of the internal reference calibration plate;
step 1.3: processing the collected multiple internal reference calibration plate images to obtain visual coordinates (u, v) of each characteristic point in the internal reference calibration plate;
step 1.4: establishing a world coordinate system, and acquiring world coordinates (X) of each characteristic point of the internal reference calibration plate by using the known actual position relation of each characteristic point of the internal reference calibration platew,Yw,Zw);
Step 1.5: performing data fitting on the visual coordinates of the characteristic points obtained in the step 1.3 and the world coordinates of the characteristic points obtained in the step 1.4, and solving to obtain internal parameters of a visual system; the specific formula of the internal reference of the visual system is as follows:
Figure BDA0003336290160000061
in the formula, M is the conversion relation between the visual coordinate system and the world coordinate system.
Further, the specific calibration process in step 2 is as follows:
step 2.1: fixedly connecting the external reference calibration plate with the attitude adjusting platform, and determining an attitude adjusting platform coordinate system;
step 2.2: controlling the posture adjusting platform to move the external reference calibration plate at least two positions along one direction, wherein the external reference calibration plate is converted into B1 in a rotating mode under a posture adjusting platform coordinate system in the process, and the external reference calibration plate is converted into A1 in a rotating mode under a visual coordinate system; let the transformation relation from the camera coordinate system to the pose platform coordinate system be X, so as to construct a first typical hand-eye calibration equation: A1X ═ XB 1; when the vision system and the control posture adjustment platform are relatively fixed,
Figure BDA0003336290160000071
Figure BDA0003336290160000072
is a rotation matrix from a camera coordinate system to an attitude adjusting platform coordinate system,
Figure BDA0003336290160000073
a translation vector from a camera coordinate system to a pose adjusting platform coordinate system;
step 2.3: controlling the posture adjusting platform to move at least two positions along the other direction, wherein in the process, the external reference calibration plate is converted into B2 in a rotating mode under a posture adjusting platform coordinate system, and the rotating mode under a camera coordinate system is converted into A2; let the transformation relation from the camera coordinate system to the pose platform coordinate system be X, so as to construct a second typical hand-eye calibration equation: A2X ═ XB 2;
step 2.4: solving X according to the calibration equation of the two hands and eyes to obtain
Figure BDA0003336290160000074
Thereby completing the external reference calibration of the vision system.
The beneficial effects of the invention are:
1. The invention uses a vision system to pre-calibrate the pose relationship between the moving end and the pose-adjusting platform, then collects the fixed-end feature marker in real time to obtain the docking pose and guide the automatic docking of the moving-end rigid body, solving the problem that existing methods cannot solve the docking pose when images of the fixed end and the moving end cannot be collected simultaneously.
2. The invention also applies to the ordinary docking scenario in which fixed-end and moving-end images can be obtained simultaneously: the pose of the moving end is pre-calibrated with a vision system mounted near the moving end, and the fixed-end image is then collected in real time to obtain the docking pose.
3. The invention uses a monocular vision measuring system to guide the docking pose of the large rigid-body member; compared with binocular and multi-camera vision systems, a monocular system is simpler in structure, less complex to calibrate, more stable, and easier to operate and maintain.
Drawings
Fig. 1 is a schematic diagram of the docking process of the invention.
Fig. 2 is a schematic diagram of the implementation of step 4.1 of the method.
Fig. 3 is a schematic view of the intrinsic and extrinsic calibration plates.
Fig. 4 is a schematic diagram of the moving-end and fixed-end targets.
The reference numbers are as follows:
1 - fixed end; 2 - fixed-end target; 3 - L-shaped bracket; 4 - moving end; 5 - camera; 6 - pose-adjusting platform; 7 - mobile docking vehicle; 8 - transport vehicle.
Detailed Description
The method of the invention is described in further detail below with reference to the figures and embodiments.
When the diameters of the fixed end and the moving end differ greatly, or the lateral space beside them is insufficient, collecting images of both ends with a single camera leads to unclear imaging, and the pose cannot be solved.
The method mainly comprises the following steps: calibrate the vision-system intrinsics, calibrate the vision-system extrinsics, calibrate the pose of the moving end, and perform positioning-guided docking.
As shown in Fig. 1, the following preparations are required before executing the method:
1. Prepare a monocular vision system comprising a camera 5, a lens, and a light source; in implementing the method, the system is mounted on the mobile docking vehicle 7 that drives the moving end 4;
2. Select a feature marker on the docking face of the fixed end 1, which may be a fixed-end target 2 adhered to the face, or a regularly shaped boss, pit, or screw on the face; in this embodiment the feature marker is a fixed-end target, and the fixed end is mounted on the transport vehicle 8;
3. Prepare a moving-end feature marker; in this embodiment it is a moving-end target, which is mounted on the moving end 4 through an L-shaped bracket 3; the moving end 4 is mounted on the mobile docking vehicle 7 through the pose-adjusting platform 6.
During operation of the vision system, the fixed-end target remains adhered to the fixed-end docking face; the moving-end target is mounted on the moving end during the earlier moving-end pose calibration and must be removed before guided docking begins.
The method of the invention is implemented as follows:
Step 1: intrinsic calibration of the vision system
Use the calibration plate to calibrate the intrinsics. The vision-system intrinsics comprise the ratios (f_x, f_y) of the focal length to the unit pixel size along the X- and Y-axis directions of the camera coordinate system, and the coordinates (u_0, v_0) of the intersection of the camera's optical axis with the image plane.
Step 1.1: place the intrinsic calibration plate (see Fig. 3) within the camera's field of view and acquire a clear image;
Step 1.2: move the plate across the entire field of view and collect multiple images of it;
Step 1.3: process the collected images to obtain the image coordinates (u, v) of each feature point on the plate;
Step 1.4: establish a world coordinate system and, using the known actual positions of the plate's feature points, obtain their world coordinates (X_w, Y_w, Z_w);
Step 1.5: fit the image coordinates of step 1.3 against the world coordinates of step 1.4 to solve for the vision-system intrinsics; the intrinsic model is:
s · [u, v, 1]^T = [[f_x, 0, u_0], [0, f_y, v_0], [0, 0, 1]] · M · [X_w, Y_w, Z_w, 1]^T
where M is the transformation between the world coordinate system and the camera (vision) coordinate system and s is a scale factor.
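The intrinsic model of step 1.5 can be illustrated with a short numerical sketch. All values below (f_x, f_y, u_0, v_0 and the world-to-camera transform M) are made-up stand-ins, not calibration results; in practice the intrinsics are fitted from many calibration-plate images, e.g. with a routine such as OpenCV's calibrateCamera.

```python
import numpy as np

# Hypothetical intrinsics (fx, fy, u0, v0) -- illustrative values only.
fx, fy, u0, v0 = 1200.0, 1200.0, 640.0, 512.0
K = np.array([[fx, 0.0, u0],
              [0.0, fy, v0],
              [0.0, 0.0, 1.0]])

# M: world -> camera transform; here an identity rotation with the plate
# placed 2 m in front of the camera, again purely illustrative.
M = np.hstack([np.eye(3), np.array([[0.0], [0.0], [2.0]])])

def project(Pw):
    """Project a 3-D world point to pixel coordinates (u, v) via s*[u,v,1]^T = K*M*[Xw,Yw,Zw,1]^T."""
    p = K @ M @ np.append(Pw, 1.0)
    return p[:2] / p[2]   # divide out the scale factor s

uv = project(np.array([0.1, -0.05, 0.0]))   # a feature point on the plate
```

A point at the world origin projects exactly to the principal point (u_0, v_0), which is a quick sanity check on the model.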
Step 2: and calibrating external parameters of the visual system.
Use the extrinsic calibration plate to calibrate the camera extrinsics and obtain the rotation-translation relationship between the camera coordinate system and the pose-adjusting-platform coordinate system.
The specific process of the extrinsic calibration is:
Step 2.1: fix the extrinsic calibration plate to the pose-adjusting platform and determine the platform coordinate system;
Step 2.2: control the platform to move the plate through at least two positions along one direction; during this process the plate's rotation-translation in the platform coordinate system is B1, and in the camera coordinate system it is A1. Let X be the transformation from the camera coordinate system to the platform coordinate system; this yields the first classical hand-eye calibration equation A1·X = X·B1. Since the vision system and the platform are fixed relative to each other, X = [R_c^b | t_c^b], where R_c^b is the rotation matrix from the camera coordinate system to the platform coordinate system and t_c^b the corresponding translation vector;
Step 2.3: control the platform to move through at least two positions along another direction; the plate's rotation-translation in the platform coordinate system is B2 and in the camera coordinate system is A2, yielding the second hand-eye calibration equation A2·X = X·B2;
Step 2.4: solve the two hand-eye equations for X = [R_c^b | t_c^b], completing the extrinsic calibration of the vision system.
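The hand-eye structure of steps 2.2–2.4 can be checked on synthetic data: for the true camera-to-platform transform X, every platform motion B and the induced camera-frame motion A satisfy A·X = X·B. The transforms below are arbitrary illustrative values; a real system would solve for X from several measured (A, B) pairs, e.g. with a hand-eye solver such as OpenCV's calibrateHandEye.

```python
import numpy as np

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def homog(R, t):
    """Pack a rotation matrix and translation vector into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical fixed camera-to-platform transform X (the unknown of A*X = X*B).
X = homog(rot_z(0.3), np.array([0.10, -0.02, 0.50]))

# A platform motion B1 induces the camera-frame motion A1 = X * B1 * inv(X);
# hand-eye solvers recover X from several such (A, B) pairs.
B1 = homog(rot_z(0.5), np.array([0.2, 0.0, 0.0]))
A1 = X @ B1 @ np.linalg.inv(X)

residual = np.linalg.norm(A1 @ X - X @ B1)   # zero (to rounding) for the true X
```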
Step 3: calibrate the pose of the moving end
When the camera can only acquire fixed-end images in real time for computing the docking pose, the pose of the moving end must be calibrated in advance to obtain the pose relationship between the moving end and the pose-adjusting platform.
The specific process of moving-end pose calibration is:
Step 3.1: mount the moving-end target on the moving-end docking face with the L-shaped bracket, ensuring that the target lies within the camera's field of view and is imaged clearly;
Step 3.2: establish the moving-end coordinate system and, from the relationship between the target's feature points and that coordinate system, obtain the coordinates P_m^m of the moving-end target in the moving-end coordinate system;
Step 3.3: from P_m^m and the vision-system intrinsics obtained in step 1, compute the transformation matrix T_m^c from the moving-end coordinate system to the camera coordinate system with a PnP algorithm, and obtain the coordinates of the target's feature points in the camera coordinate system: P_m-c = T_m^c · P_m^m;
Step 3.4: combine the extrinsic calibration result of step 2 to solve the coordinates of the target's feature points in the pose-adjusting-platform coordinate system: P_m-b = [R_c^b | t_c^b] · P_m-c.
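Steps 3.3–3.4 amount to chaining two homogeneous transforms. A minimal numpy sketch, with made-up stand-ins for the calibrated transforms (T_mc would come from the PnP solution of step 3.3, T_cb from the extrinsic calibration of step 2):

```python
import numpy as np

# Illustrative stand-ins -- these values are NOT calibration results.
T_mc = np.eye(4); T_mc[:3, 3] = [0.0, 0.0, 1.5]      # moving-end frame -> camera frame
T_cb = np.eye(4); T_cb[:3, 3] = [0.10, -0.02, 0.50]  # camera frame -> platform frame

def to_platform(P_mm):
    """Map marker points P_m^m (N x 3, moving-end frame) into the platform frame (P_m-b)."""
    P_h = np.hstack([P_mm, np.ones((len(P_mm), 1))])  # homogeneous coordinates
    return (T_cb @ T_mc @ P_h.T).T[:, :3]

P_mm = np.array([[0.0, 0.0, 0.0],
                 [0.1, 0.0, 0.0]])
P_mb = to_platform(P_mm)
```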
Step 4: positioning-guided docking
Acquire in real time an image of the fixed-end target, whose coordinates on the fixed end are known; combine it with the data obtained from vision-system calibration and moving-end pose calibration to solve the final motion pose of the moving end relative to the fixed end and guide the pose-adjusting platform.
The specific process of positioning-guided docking is:
Step 4.1: establish the fixed-end coordinate system and preset the pose relationship T_m^f between the moving-end coordinate system and the fixed-end coordinate system in the docked state; specifically:
Step 4.1.1: as shown in Fig. 2, establish the fixed-end coordinate system, obtain by simulation the structural relationship between the moving-end target and the fixed end in the docked state, and obtain the coordinates P_m^f of all moving-end target feature points in the fixed-end coordinate system at docking;
Step 4.1.2: compute the transformation between the coordinates P_m^m and the coordinates P_m^f; this is the transformation T_m^f between the moving-end and fixed-end coordinate systems in the docked state: P_m^f = T_m^f · P_m^m;
Step 4.2: from the relationship between the fixed-end target's feature points and the fixed-end coordinate system, obtain the coordinates P_f^f of the fixed-end target in the fixed-end coordinate system;
Step 4.3: from P_f^f and the vision-system intrinsics obtained in step 1, compute the transformation matrix T_f^c from the fixed-end coordinate system to the camera coordinate system with a PnP algorithm;
Step 4.4: from the preset pose relationship T_m^f of step 4.1, solve the coordinates P_m-b' of the moving-end target, in the docked state, in the pose-adjusting-platform coordinate system:
Step 4.4.1: using the result T_m^f of step 4.1, compute the coordinates of the moving-end target's feature points, in the docked state, in the fixed-end coordinate system: P_m^f = T_m^f · P_m^m;
Step 4.4.2: using the result T_f^c of step 4.3, compute the coordinates of these feature points, in the docked state, in the pose-adjusting-platform coordinate system: P_m-b' = [R_c^b | t_c^b] · T_f^c · P_m^f;
Step 4.4.3: substituting the relation of step 4.1.2 gives the full expression:
P_m-b' = [R_c^b | t_c^b] · T_f^c · T_m^f · P_m^m
Step 4.5: from the coordinates P_m-b of the moving-end target's feature points in the pose-adjusting-platform coordinate system (step 3.4) and the coordinates P_m-b' of those points in the same coordinate system in the docked state (step 4.4), compute the relative pose R_b, t_b of the moving end with respect to the fixed end; this is the docking pose, which satisfies: P_m-b' = R_b · P_m-b + t_b.
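Step 4.5 recovers a rigid pose R_b, t_b relating the two point sets P_m-b and P_m-b'. The patent does not name a solver; one common least-squares choice is the SVD-based Kabsch method, sketched here on synthetic, noise-free data:

```python
import numpy as np

def rigid_fit(P, Q):
    """Least-squares R, t with Q ~= R @ P + t (Kabsch/SVD); P, Q are N x 3 point sets."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Fix the determinant so the result is a rotation, not a reflection.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, cq - R @ cp

# Synthetic check: transform some marker points and recover the pose.
rng = np.random.default_rng(0)
P = rng.normal(size=(5, 3))                   # stands in for P_m-b
ang = 0.4
R_true = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                   [np.sin(ang),  np.cos(ang), 0.0],
                   [0.0,          0.0,         1.0]])
t_true = np.array([0.3, -0.1, 0.05])
Q = (R_true @ P.T).T + t_true                 # stands in for P_m-b'

R_b, t_b = rigid_fit(P, Q)
```

With exact correspondences the recovered R_b, t_b match the true pose to rounding error; with measurement noise the fit is least-squares optimal over the marker points.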
Step 4.6: resolving the docking pose to six degrees of freedom of a pose adjusting platform, and guiding the docking pose adjustment, wherein the specific process of the step is as follows:
The docking pose matrix solved in this step is shown in formula (1): the upper-left three rows and three columns are the rotation matrix, and the right three rows and one column are the translation vector:

    [ r11 r12 r13 t1 ]
    [ r21 r22 r23 t2 ]   (1)
    [ r31 r32 r33 t3 ]
    [  0   0   0   1 ]

Using the relation between Euler angles and the rotation matrix, and following the attitude adjustment order (in this embodiment yaw angle - pitch angle - roll angle), the pitch angle θ, the yaw angle ψ and the roll angle φ in the attitude adjusting platform coordinate system are calculated (formula (2)), and the displacement required along each direction is separated out from the translation vector of formula (1).
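The exact form of the Euler-angle formula (2) depends on the platform's axis conventions, which are not recoverable from the text. As a hedged illustration, the numpy sketch below extracts yaw, pitch and roll assuming a Z-Y-X factorization R = Rz(yaw) · Ry(pitch) · Rx(roll); this is one common choice, not necessarily the patent's:

```python
import numpy as np

def euler_zyx(R):
    """Extract (yaw, pitch, roll) assuming R = Rz(yaw) @ Ry(pitch) @ Rx(roll).
    Valid away from the gimbal-lock singularity |pitch| = 90 deg."""
    pitch = np.arcsin(-R[2, 0])
    yaw = np.arctan2(R[1, 0], R[0, 0])
    roll = np.arctan2(R[2, 1], R[2, 2])
    return yaw, pitch, roll

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

# Round-trip check with arbitrary angles (radians)
yaw0, pitch0, roll0 = 0.3, -0.2, 0.1
R = rot_z(yaw0) @ rot_y(pitch0) @ rot_x(roll0)
yaw, pitch, roll = euler_zyx(R)
print(yaw, pitch, roll)
```

The round trip recovers the input angles, which is the property the guidance step relies on: the solved rotation matrix R_b is converted into the three angular commands sent to the platform, while the translation vector supplies the three linear commands.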
In addition, one point should be emphasized: in this embodiment, the intrinsic calibration plate used in step 1, the extrinsic calibration plate used in step 2, and the moving end target and fixed end target used in steps 3 and 4 are similar in form: each carries an array of feature points, and at least two feature points on the moving end target and the fixed end target serve as coded marker points, which identify the target and determine the ordering of the non-coded marker points on the target.

Claims (9)

1. A method for docking a large rigid body member based on a fixed end image, wherein the large rigid body member comprises a fixed end and a moving end; the moving end is mounted on a docking vehicle through an attitude adjusting platform; a camera is mounted on the docking vehicle with its lens facing the docking surface of the fixed end; the docking method comprises the following steps:
step 1: calibrating the intrinsic parameters of the vision system;
step 2: calibrating the extrinsic parameters of the vision system;
calibrating the extrinsic parameters of the camera using a calibration plate to obtain the rotation and translation relation (R_c^b, t_c^b) between the camera coordinate system and the attitude adjusting platform coordinate system, wherein R_c^b is the rotation matrix from the camera coordinate system to the attitude adjusting platform coordinate system, and t_c^b is the translation vector from the camera coordinate system to the attitude adjusting platform coordinate system;
step 3: calibrating the pose of the moving end;
step 3.1: arranging a moving end feature identifier in front of the docking end face of the moving end, ensuring that the identifier lies within the camera field of view and is clearly imaged;
step 3.2: establishing a moving end coordinate system, and obtaining the coordinate P_m^m of the moving end feature identifier in the moving end coordinate system according to the relation between the feature points in the identifier and the moving end coordinate system;
step 3.3: using the coordinate P_m^m of the moving end feature identifier in the moving end coordinate system and the vision system intrinsic parameters acquired in step 1, calculating the transformation (R_m^c, t_m^c) between the moving end coordinate system and the camera coordinate system by the PnP algorithm, and obtaining the coordinate P_m-c of the moving end feature identifier in the camera coordinate system;
step 3.4: combining the vision system extrinsic calibration result (R_c^b, t_c^b) of step 2, solving the coordinate P_m-b of the moving end feature identifier in the attitude adjusting platform coordinate system;
step 4: positioning, guiding and docking;
step 4.1: establishing a fixed end coordinate system, and presetting the pose relationship (R_m^f, t_m^f) between the moving end coordinate system and the fixed end coordinate system in the docked state;
step 4.2: obtaining the coordinate P_f^f of the fixed end feature identifier in the fixed end coordinate system according to the relation between the feature points in the fixed end feature identifier and the fixed end coordinate system;
step 4.3: using the coordinate P_f^f of the fixed end feature identifier in the fixed end coordinate system and the vision system intrinsic parameters acquired in step 1, calculating the transformation (R_f^c, t_f^c) between the fixed end coordinate system and the camera coordinate system by the PnP algorithm;
step 4.4: according to the pose relationship (R_m^f, t_m^f) preset in step 4.1, solving the coordinate P_m-b' of the moving end feature identifier in the docked state in the attitude adjusting platform coordinate system;
Step 4.5: according to the characteristic identification coordinate P of the moving end under the coordinate system of the attitude adjusting platform in the step 3.4m-bAnd 4.4, coordinate P of the characteristic mark of the mobile terminal in the coordinate system of the attitude adjusting platform in the butt joint statem-b' calculating the relative pose R of the moving end and the fixed endb、tbThe data is the butt joint pose;
step 4.6: and resolving the docking pose to six degrees of freedom of the pose adjusting platform to guide the docking pose adjustment.
2. The method for docking a large rigid body member based on a fixed end image according to claim 1, wherein the specific implementation of step 4.1 is as follows:
step 4.1.1: establishing a fixed end coordinate system, obtaining by simulation the structural relationship in the docked state between the feature points on the moving end feature identifier and the fixed end, and obtaining the coordinates P_m^f of all feature points of the moving end feature identifier in the docked state in the fixed end coordinate system;
step 4.1.2: calculating the transformation (R_m^f, t_m^f) between the coordinates P_m^m and the coordinates P_m^f, which is the transformation between the moving end coordinate system and the fixed end coordinate system in the docked state; the formula is:
    P_m^f = R_m^f · P_m^m + t_m^f
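The patent does not state how the transformation of step 4.1.2 is computed from the two corresponding point sets; one standard choice is a least-squares rigid alignment (the SVD-based Kabsch method), sketched here in numpy on synthetic data. The same alignment would also serve for solving R_b, t_b from P_m-b and P_m-b' in step 4.5:

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) with dst ~= R @ src + t,
    computed with the SVD-based Kabsch method.
    src, dst: (N, 3) arrays of corresponding points, N >= 3 non-collinear."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Sign correction guards against a reflection solution
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Synthetic check: feature points in the moving-end frame, transformed by a
# known rigid motion into the fixed-end frame (values are illustrative).
rng = np.random.default_rng(0)
P_m_m = rng.random((5, 3))
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])  # 90 degrees about Z
t_true = np.array([0.2, -0.1, 0.5])
P_m_f = (R_true @ P_m_m.T).T + t_true

R_mf, t_mf = rigid_transform(P_m_m, P_m_f)
print(np.allclose(R_mf, R_true), np.allclose(t_mf, t_true))
```

With noise-free simulated correspondences the recovery is exact; with measured points the same solver returns the least-squares best fit.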
3. the method of claim 2, wherein the method comprises the steps of: the moving end characteristic mark is fixed in front of the moving end butt joint surface by adopting an L-shaped bracket; one end of the L-shaped support is fixed at the moving end, and the other end is provided with a moving end characteristic mark.
4. The method for docking a large rigid body member based on a fixed end image according to claim 3, wherein the fixed end feature identifier is either a fixed end target with feature points adhered to the docking surface of the fixed end, or regularly shaped inherent features on the docking surface of the fixed end, such as protrusions, pits or screws;
the moving end feature identifier is a moving end target or an inherent feature arranged at the front end of the moving end.
5. The method for docking a large rigid body member based on a fixed end image according to claim 4, wherein the specific solving process of the coordinate P_m-b' in step 4.4 is as follows:
step 4.4.1: using the result (R_m^f, t_m^f) of step 4.1, calculating the coordinates of the feature points of the moving end feature identifier in the docked state in the fixed end coordinate system:
    P_m^f = R_m^f · P_m^m + t_m^f;
step 4.4.2: using the result (R_f^c, t_f^c) of step 4.3 together with the extrinsic calibration result (R_c^b, t_c^b) of step 2, calculating the coordinates of the feature points of the moving end feature identifier in the docked state in the attitude adjusting platform coordinate system:
    P_m-b' = R_c^b · (R_f^c · P_m^f + t_f^c) + t_c^b;
step 4.4.3: substituting the formula of step 4.4.1 into the formula of step 4.4.2 gives the full calculation formula of P_m-b':
    P_m-b' = R_c^b · (R_f^c · (R_m^f · P_m^m + t_m^f) + t_f^c) + t_c^b
6. The method for docking a large rigid body member based on a fixed end image according to claim 5, wherein:
the specific calculation formula of P_m-c in step 3.3 is: P_m-c = R_m^c · P_m^m + t_m^c;
the specific calculation formula of P_m-b in step 3.4 is: P_m-b = R_c^b · P_m-c + t_c^b;
the specific calculation formula of R_b, t_b in step 4.5 is: P_m-b' = R_b · P_m-b + t_b.
7. The method for docking a large rigid body member based on a fixed end image according to any one of claims 1-6, wherein the specific solving process of step 4.5 is as follows:

    [ r11 r12 r13 t1 ]
    [ r21 r22 r23 t2 ]
    [ r31 r32 r33 t3 ]
    [  0   0   0   1 ]

wherein r11 to r33 represent the rotation matrix R_b, and t1 to t3 represent the translation vector t_b; the pitch angle θ, the yaw angle ψ and the roll angle φ in the attitude adjusting platform coordinate system are then resolved from R_b using the relation between Euler angles and the rotation matrix, according to the attitude adjustment order.
8. The method for docking a large rigid body member based on a fixed end image according to claim 7, wherein the intrinsic parameters of the vision system in step 1 comprise: the ratios (f_x, f_y) of the focal length to the unit pixel size along the X-axis and Y-axis directions of the camera coordinate system, and the coordinates (u_0, v_0) of the intersection of the camera optical axis with the image plane;
the specific calibration process comprises the following steps:
step 1.1: placing the intrinsic calibration plate within the field of view of the camera of the vision system;
step 1.2: moving the intrinsic calibration plate within the camera field of view so as to traverse the whole field of view, and acquiring a plurality of images of the intrinsic calibration plate;
step 1.3: processing the acquired calibration plate images to obtain the image coordinates (u, v) of each feature point of the intrinsic calibration plate;
step 1.4: establishing a world coordinate system, and obtaining the world coordinates (X_w, Y_w, Z_w) of each feature point of the intrinsic calibration plate from the known actual positional relationship of the feature points;
step 1.5: fitting the image coordinates of the feature points obtained in step 1.3 to the world coordinates obtained in step 1.4, and solving for the intrinsic parameters of the vision system; the specific formula is:

    s · [u, v, 1]^T = [ f_x 0 u_0 ; 0 f_y v_0 ; 0 0 1 ] · M · [X_w, Y_w, Z_w, 1]^T

where M is the transformation between the world coordinate system and the camera coordinate system, and s is a scale factor.
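The projection formula of step 1.5 can be sketched directly: a world point is mapped through M = [R | t] into the camera frame and then through the intrinsic matrix to pixel coordinates. The intrinsic and extrinsic values below are illustrative, not calibration results from the patent:

```python
import numpy as np

# Illustrative intrinsics: fx, fy in pixels, principal point (u0, v0)
fx, fy, u0, v0 = 800.0, 800.0, 320.0, 240.0
K = np.array([[fx, 0.0, u0],
              [0.0, fy, v0],
              [0.0, 0.0, 1.0]])

# World-to-camera transform M = [R | t]; identity rotation for simplicity,
# camera 2 m in front of the target plane.
R = np.eye(3)
t = np.array([0.0, 0.0, 2.0])
M = np.hstack([R, t[:, None]])

Xw = np.array([0.1, -0.05, 0.0, 1.0])  # homogeneous world point (metres)
uvw = K @ M @ Xw                        # s * [u, v, 1]^T
u, v = uvw[0] / uvw[2], uvw[1] / uvw[2] # divide out the scale factor s
print(u, v)
```

During calibration this mapping is fitted in the opposite direction: the observed (u, v) and known (X_w, Y_w, Z_w) pairs from steps 1.3 and 1.4 constrain K (and M per image), and the fit recovers f_x, f_y, u_0, v_0.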
9. The method for docking a large rigid body member based on a fixed end image according to claim 8, wherein the specific calibration process of step 2 is as follows:
step 2.1: fixedly connecting the extrinsic calibration plate to the attitude adjusting platform, and determining the attitude adjusting platform coordinate system;
step 2.2: controlling the attitude adjusting platform to move the extrinsic calibration plate to at least two positions along one direction; in this process the rotation-translation transformation of the plate in the attitude adjusting platform coordinate system is B1, and its transformation in the camera coordinate system is A1; letting the transformation from the camera coordinate system to the attitude adjusting platform coordinate system be X, a first typical hand-eye calibration equation is constructed: A1·X = X·B1; since the vision system and the attitude adjusting platform are relatively fixed, X = (R_c^b, t_c^b), where R_c^b is the rotation matrix from the camera coordinate system to the attitude adjusting platform coordinate system, and t_c^b is the translation vector from the camera coordinate system to the attitude adjusting platform coordinate system;
step 2.3: controlling the attitude adjusting platform to move to at least two positions along another direction; in this process the rotation-translation transformation of the plate in the attitude adjusting platform coordinate system is B2, and its transformation in the camera coordinate system is A2; a second typical hand-eye calibration equation is constructed: A2·X = X·B2;
step 2.4: solving X from the two hand-eye calibration equations to obtain (R_c^b, t_c^b), thereby completing the extrinsic calibration of the vision system.
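The structure of the hand-eye equations in steps 2.2-2.3 can be checked numerically: under the transform conventions assumed here, a platform motion B and the camera-observed plate motion A generated from a known camera-to-platform transform X satisfy A·X = X·B. This numpy sketch only demonstrates that consistency with illustrative values; a real calibration solves the inverse problem for X from measured (A, B) pairs, e.g. with the Tsai-Lenz method:

```python
import numpy as np

def homogeneous(R, t):
    """Pack a 3x3 rotation R and translation t (3,) into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# Assumed ground-truth camera -> platform transform X (illustrative values)
X = homogeneous(rot_z(0.2), np.array([0.1, 0.0, 0.3]))

# A platform motion B1 (known from the controller) induces an apparent
# calibration-plate motion A1 in the camera frame: A1 = X @ B1 @ inv(X)
B1 = homogeneous(rot_z(0.5), np.array([0.05, 0.02, 0.0]))
A1 = X @ B1 @ np.linalg.inv(X)

print(np.allclose(A1 @ X, X @ B1))
```

Motions along at least two independent directions are needed (steps 2.2 and 2.3) because a single motion pair leaves the rotation axis of X under-constrained.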
CN202111295176.0A 2021-11-03 2021-11-03 Method for carrying out butt joint on large rigid body member based on fixed end image Pending CN114092552A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111295176.0A CN114092552A (en) 2021-11-03 2021-11-03 Method for carrying out butt joint on large rigid body member based on fixed end image


Publications (1)

Publication Number Publication Date
CN114092552A true CN114092552A (en) 2022-02-25

Family

ID=80298766

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111295176.0A Pending CN114092552A (en) 2021-11-03 2021-11-03 Method for carrying out butt joint on large rigid body member based on fixed end image

Country Status (1)

Country Link
CN (1) CN114092552A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114842079A (en) * 2022-04-23 2022-08-02 四川大学 Device and method for measuring pose of prefabricated intermediate wall in shield tunnel
CN114842079B (en) * 2022-04-23 2023-09-19 四川大学 Equipment and method for measuring pose of prefabricated intermediate wall in shield tunnel
CN116140987A (en) * 2023-04-17 2023-05-23 广东施泰德测控与自动化设备有限公司 Visual quick docking device and docking method for axle test board

Similar Documents

Publication Publication Date Title
CN107214703B (en) Robot self-calibration method based on vision-assisted positioning
CN109029299B (en) Dual-camera measuring device and method for butt joint corner of cabin pin hole
CN111775146B (en) Visual alignment method under industrial mechanical arm multi-station operation
CN107590835B (en) Mechanical arm tool quick-change visual positioning system and positioning method in nuclear environment
CN110276806B (en) Online hand-eye calibration and grabbing pose calculation method for four-degree-of-freedom parallel robot stereoscopic vision hand-eye system
CN109859272B (en) Automatic focusing binocular camera calibration method and device
CN101876532B (en) Camera on-field calibration method in measuring system
CN112833786B (en) Cabin attitude and pose measuring and aligning system, control method and application
CN114092552A (en) Method for carrying out butt joint on large rigid body member based on fixed end image
CN109448054A (en) The target Locate step by step method of view-based access control model fusion, application, apparatus and system
CN111707189B (en) Laser displacement sensor light beam direction calibration method based on binocular vision
CN112985293B (en) Binocular vision measurement system and measurement method for single-camera double-spherical mirror image
CN105323455B (en) A kind of location compensation method based on machine vision
CN110686595A (en) Laser beam space pose calibration method of non-orthogonal axis system laser total station
CN112958960B (en) Robot hand-eye calibration device based on optical target
CN113724337B (en) Camera dynamic external parameter calibration method and device without depending on tripod head angle
CN112894209A (en) Automatic plane correction method for intelligent tube plate welding robot based on cross laser
CN114434059A (en) Automatic welding system and method for large structural part with combined robot and three-dimensional vision
CN112697112A (en) Method and device for measuring horizontal plane inclination angle of camera
CN112381881B (en) Automatic butt joint method for large rigid body members based on monocular vision
CN109541626B (en) Target plane normal vector detection device and detection method
CN110533727B (en) Robot self-positioning method based on single industrial camera
CN114842079B (en) Equipment and method for measuring pose of prefabricated intermediate wall in shield tunnel
CN111598945B (en) Three-dimensional positioning method for curved bearing bush cover of automobile engine
CN114111578A (en) Automatic pose determination method for large-diameter element

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination