CN103558850A - Laser vision guided welding robot full-automatic movement self-calibration method - Google Patents

Laser vision guided welding robot full-automatic movement self-calibration method

Info

Publication number
CN103558850A
CN103558850A (application CN201310322092.0A)
Authority
CN
China
Prior art keywords
calibration
hand
points
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201310322092.0A
Other languages
Chinese (zh)
Other versions
CN103558850B (en)
Inventor
李新 (Li Xin)
郭新年 (Guo Xinnian)
白瑞林 (Bai Ruilin)
吉峰 (Ji Feng)
王秀平 (Wang Xiuping)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
XINJE ELECTRONIC CO Ltd
Jiangnan University
Original Assignee
XINJE ELECTRONIC CO Ltd
Jiangnan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by XINJE ELECTRONIC CO Ltd, Jiangnan University filed Critical XINJE ELECTRONIC CO Ltd
Priority to CN201310322092.0A priority Critical patent/CN103558850B/en
Publication of CN103558850A publication Critical patent/CN103558850A/en
Application granted granted Critical
Publication of CN103558850B publication Critical patent/CN103558850B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Manipulator (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

By deeply analyzing the camera imaging principle, the laser structured-light measurement principle and the working principle of the hand-eye system, the invention provides a simple and flexible full-automatic calibration method for a welding robot system guided by laser structured light. The method comprises sensor parameter calibration (camera intrinsic parameters and the line-laser plane equation) and hand-eye relation matrix calibration, followed by workpiece offset correction. It overcomes the drawbacks of conventional intrinsic-parameter, laser-plane and hand-eye calibration, which require the participation of professionals and involve complicated steps. Full-automatic calibration of the laser structured-light guided welding robot system is achieved with only four given poses and six groups of translational motions performed automatically by the robot. The method streamlines and automates the calibration of such systems, greatly improves the flexibility of system calibration, is of practical significance for vision measurement and three-dimensional tracking, and has good practicability.

Description

Laser vision guided welding robot full-automatic movement self-calibration method
Technical Field
The invention relates to a robot laser structured-light vision sensor and its calibration method, in particular to a full-automatic motion self-calibration method for the hand-eye relation matrix and the sensor parameters (camera intrinsic parameters and line-laser plane parameters) of a welding robot hand-eye system guided by a line-laser structured-light vision sensor.
Technical Background
Welding, as an important means of material processing, is widely used in industrial production. Automation and robotization of welding processes have become a trend, driven by factors such as stability of welding quality, flexibility of application, and safety and economy of operation. Welding robots show clear advantages in upgrading manufacturing technology, improving welding quality, reducing the labor intensity of workers, improving welding working conditions and guaranteeing welding stability. The key problem in welding automation is automatic seam tracking; a laser-vision guided welding robot combines weld-seam image recognition with robot motion control and can effectively solve this problem. Laser structured light used as an active optical vision sensor has become the mainstream of current welding robot vision systems. Vision system calibration refers to computing the parameters of the vision sensor and the relation between the sensor and the robot body. Calibration is a critical link in a vision measurement system: the precision of the calibration result and the stability and real-time performance of the algorithm directly affect the measurement and tracking precision in industrial production.
The laser vision system of a welding robot needs calibration in three respects: camera calibration, laser structured-light parameter calibration, and robot hand-eye calibration. Camera calibration obtains the intrinsic and extrinsic parameters of the camera according to a chosen camera model. The linear-model calibration of Faugeras et al. and Tsai's two-stage method based on the radial alignment constraint both use three-dimensional targets, while Zhang Z.Y.'s method needs only a planar target, and Ma S.D.'s active-vision method needs no specific calibration target at all, calibrating the camera parameters through two groups of translational motions of the camera in three-dimensional space. Line-laser structured-light parameter calibration determines the equation of the laser plane projected by the laser. Several methods exist, such as the wire-drawing calibration method proposed by R. Dewar, the cross-ratio invariance method proposed by D.Q. Huynh, and the active-vision method proposed by Chen Tianfei et al., which needs no target, calibrates the structured-light parameters by controlling the robot to perform specific motions, is robust, simple to implement, allows easy feature extraction, and meets the requirements of on-site calibration. Welding robots usually adopt the Eye-in-Hand configuration, and the relation between the vision system and the robot end coordinate system is obtained through hand-eye calibration. The common hand-eye calibration method controls the robot to observe a known calibration reference object (target) from different directions and deduces the rotation part R and translation part t of the hand-eye matrix; in contrast, realizing hand-eye calibration by controlling the robot to perform specific motions without a target effectively improves the automation and streamlining of the calibration.
Disclosure of Invention
The invention aims to overcome the defects of existing calibration techniques, namely high requirements on target manufacturing precision and on the expertise of calibration personnel, and provides a full-automatic self-calibration method for a welding robot system guided by a line-laser structured-light vision sensor. The algorithm is simple, flexible, accurate, real-time and easy to operate.
According to the technical scheme of the invention, the full-automatic calibration method for a welding robot guided by a laser structured-light vision sensor comprises the following steps:
firstly, a calibration object is placed in the working area of the robot, and an initial robot pose T0 is given so that the calibration object lies in the field of view of the camera. The robot is controlled to perform linearly independent translational motions; feature points are extracted and automatically matched, and the focus of expansion (FOE) is solved. For each motion it is judged whether the FOE meets the calibration requirement: if not, the motion is discarded; if so, it is retained, and the start and end poses $T_{i1}$, $T_{i2}$ of the motion and the FOE coordinates are stored. The next group of motions is performed until four groups meet the requirement. According to Ma S.D.'s property of the focus of expansion, the normalized coordinates of the FOE $e_i$ in the camera coordinate system represent the direction of the camera translation in the camera coordinate system before translation; combined with the robot translation vectors $k\vec{b}_i$, the camera intrinsic matrix

$$K = \begin{bmatrix} k_x & s & u_0 \\ 0 & k_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$$

and the rotation part R of the hand-eye relation matrix are solved linearly, where $k_x$, $k_y$ are the scale factors of the u and v axes of the image, s reflects the skew of the CCD photosensitive element array, and $(u_0, v_0)$ is the intersection of the camera optical axis with the CCD photosensitive element;
secondly, robot poses T1 and T2 are given (the two poses differ), with the calibration object in the camera field of view. Images of the calibration object are collected and processed, and the feature-point coordinates are stored. The laser is turned on and the robot is controlled to translate, capturing one image with a laser stripe at each motion interval Z1, for five images in total. From each image the laser stripe is extracted and thinned, a stripe line equation is fitted, and the blanking (vanishing) point is obtained; combining the camera parameters, the normalized coordinates of the blanking point in the camera coordinate system give the direction of each group of parallel stripes under the two poses T1 and T2. A plane is determined by the two intersecting direction lines, yielding the light-plane normal vector $(a_1, a_2, a_3)$.
Third, a robot pose T3 is given (T3 differs from T1 and T2), with the calibration object in the camera field of view. Images of the calibration object are collected and processed, and the feature-point coordinates are stored. The laser is turned on, an image with the laser stripe is collected, and a stripe line equation is fitted. From the calibrated camera intrinsics, the rotation part of the hand-eye matrix and the feature-point information of the three poses T1, T2 and T3, the translation part t of the hand-eye matrix and the coordinates of the three feature points in the robot base coordinate system are obtained. Since three non-collinear points determine a plane, the plane equation of the three feature points in the base coordinate system is solved. Taking the pixel coordinates of any point on the stripe line, transforming them into the robot base coordinate system and substituting them into the feature-point plane equation gives the coordinates of the light point in the robot base coordinate system; transforming these into the camera coordinate system and substituting them into the light-plane equation solves $a_4$, yielding the light-plane equation $a_1 x_c + a_2 y_c + a_3 z_c + a_4 = 0$.
And fourthly, after calibration of the line-laser structured-light guided welding robot is completed, the welding workpiece is controlled to touch feature point 1 of the calibration object accurately under a fixed end pose; the coordinates of the workpiece end point in the robot base coordinate system are computed, and the offset of the workpiece under this pose is calculated.
The invention has the following beneficial effects: by deeply analyzing the camera imaging principle, the laser structured-light measurement principle and the working principle of the hand-eye system, a simple and flexible full-automatic calibration method for a laser structured-light guided robot system is designed, comprising sensor parameter calibration (camera intrinsic parameters and the line-laser plane equation) and hand-eye relation matrix calibration, followed by workpiece offset correction. The method overcomes the drawbacks of traditional intrinsic-parameter, laser-plane and hand-eye calibration, which require professionals and involve complicated steps. Only 4 given poses and 6 groups of translational motions performed automatically by the robot are needed. The method streamlines and automates the calibration of the laser structured-light guided welding robot system, greatly improves the flexibility of system calibration, is significant for practical vision measurement and tracking, and has good practicability.
Drawings
FIG. 1 is a flow chart of the overall calibration operation of the present invention.
Fig. 2 shows a schematic diagram of the calibration object and of the motion direction selection.
Fig. 3 is a flow chart for automatically selecting the moving direction.
Fig. 4 is a dual representation of two laser intersections in projection space.
Fig. 5 is a schematic diagram of light plane depth information determination.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the following is a detailed description of the embodiments of the present invention with reference to the accompanying drawings.
The basic idea of the invention is as follows: existing techniques place high demands on target manufacturing precision and on the expertise of calibration personnel, so a full-automatic calibration algorithm requiring no precise target is particularly valuable in practice. Using the right-angled-triangle calibration object shown in Fig. 2, the invention rapidly locates and automatically matches the triangle corner points. By deeply analyzing the mathematical model of the hand-eye system, the product of the intrinsic parameter matrix and the rotation part of the hand-eye matrix is solved as a whole from at least four qualifying automatic motions, and is then decomposed into the intrinsic matrix and the rotation part. When solving the light-plane normal vector, two groups of qualifying translational motions are performed automatically by the designed motion algorithm, determining the normal vector. When calibrating the translation part of the hand-eye matrix, the coordinates of the feature points in the robot base coordinate system obtained during that calibration are reused to calibrate the depth information of the laser plane, effectively simplifying the calibration process.
FIG. 1 is the overall calibration flowchart of the invention. First the calibration object is placed in the robot working space such that at poses T0, T1, T2 and T3 it remains within the camera field of view. At pose T0 the robot fully automatically completes four qualifying translational motions, calibrating the camera intrinsic parameters and the rotation part of the hand-eye matrix; at poses T1 and T2 the light-plane normal vector is calibrated through two fully automatic translational motions; from the feature-point coordinates at poses T1, T2 and T3, the translation part of the hand-eye matrix and the light-plane depth information are calibrated.
The first step is as follows:
1.1 extraction of sub-pixel level coordinates of corner points
Images with feature points are collected online, and Harris corner detection yields pixel-level corner coordinates; the spatial moment method is then applied to obtain sub-pixel coordinates.
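As a minimal sketch of this stage (not the patented pipeline itself), corner candidates can be detected and refined with OpenCV; here cv2.cornerSubPix stands in for the spatial-moment refinement described above:

    # Sketch: Harris corner detection with sub-pixel refinement. OpenCV's
    # cornerSubPix (gradient-based) substitutes for the spatial moment method.
    import cv2
    import numpy as np

    def detect_corners(gray, max_corners=3):
        # Harris-based candidate detection at pixel accuracy.
        pts = cv2.goodFeaturesToTrack(gray, maxCorners=max_corners,
                                      qualityLevel=0.01, minDistance=20,
                                      useHarrisDetector=True, k=0.04)
        pts = np.float32(pts)
        # Iterative sub-pixel refinement around each candidate (in place).
        term = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 40, 1e-3)
        cv2.cornerSubPix(gray, pts, (5, 5), (-1, -1), term)
        return pts.reshape(-1, 2)  # sub-pixel (u, v) coordinates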
1.2 automatic matching of feature points
The calibration object is shown schematically in Fig. 2. The calibration triangle need not be manufactured accurately; it only has to satisfy the following criterion: comparing the pairwise distances of the three corner points, the point not belonging to the longest side is labeled point 1, the remaining point farther from point 1 is labeled point 2, and the last point is point 3. In Euclidean space any non-isosceles triangle satisfies this criterion; in practice a right-angled triangle with an acute angle of 30 degrees can be used.
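A minimal sketch of this distance-based labeling (function name and array layout are illustrative assumptions):

    # Sketch: label the three triangle corners by the distance criterion above.
    import numpy as np

    def label_corners(pts):
        """pts: (3, 2) array of corner coordinates -> (point1, point2, point3)."""
        d01 = np.linalg.norm(pts[0] - pts[1])
        d02 = np.linalg.norm(pts[0] - pts[2])
        d12 = np.linalg.norm(pts[1] - pts[2])
        # Point 1 is the vertex opposite the longest side.
        longest = np.argmax([d12, d02, d01])  # side opposite vertex 0, 1, 2
        p1 = pts[longest]
        rest = [pts[i] for i in range(3) if i != longest]
        # Point 2 is the remaining point farther from point 1.
        if np.linalg.norm(rest[0] - p1) >= np.linalg.norm(rest[1] - p1):
            p2, p3 = rest
        else:
            p3, p2 = rest
        return p1, p2, p3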
1.3 automatic selection of direction of motion
As shown in Fig. 2, the sub-pixel coordinates of the feature points in the image are extracted. At pose T0 the robot motion direction is set as the pair of angles $(\theta, \varphi)$ along the robot end coordinate system, where $\theta$ is the included angle between the projection of the motion direction on the XOY plane of the camera coordinate system and the positive X axis, and $\varphi$ is the included angle between the motion direction and the positive direction of the optical axis. $\theta$ controls the distribution of the FOE points around the image, and $\varphi$ controls the distance of the FOE point from the image center: the smaller $\varphi$, the closer the FOE point is to the image center. As shown in Fig. 2, the starting value $\theta_0$ is the included angle between the center-line direction of the selected feature point and the positive X axis of the image coordinate system.
For each motion, $\theta_i$ is incremented from $\theta_0$ in steps of $\pi/4$, where $i$ is the index of the motion.
The specific procedure for selecting $\theta_0$ is shown in Fig. 3: judge whether every corner point lies within the image-space range $(u_t, v_t)$ to $(u'_t, v'_t)$. If so, the center-line direction through point 1 is selected; if not, the motion direction is selected automatically as follows: compute the distance from each feature point to the four image vertices, denoted $[d_{i1}, d_{i2}, d_{i3}, d_{i4}]$ $(i = 1, 2, 3)$, take the minimum $d_i$ for each point, and take the center-line direction of the point with the smallest $d_i$; its included angle with the X axis gives $\theta_0$.
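A sketch of this fallback selection, assuming each feature point has an associated center-line angle (names are illustrative):

    # Sketch: automatic selection of theta_0 when a corner point leaves the
    # admissible image range (fallback branch of Fig. 3).
    import numpy as np

    def select_theta0(feature_pts, img_w, img_h, center_lines):
        """feature_pts: (3, 2); center_lines: center-line angle (rad) per point."""
        vertices = np.array([[0, 0], [img_w, 0],
                             [0, img_h], [img_w, img_h]], float)
        # d[i, j]: distance of feature point i to image vertex j.
        d = np.linalg.norm(feature_pts[:, None, :] - vertices[None, :, :], axis=2)
        d_min = d.min(axis=1)           # closest-vertex distance per point
        i_star = int(np.argmin(d_min))  # point most at risk of leaving the image
        return center_lines[i_star]     # theta_0: its center-line angle vs. X axis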
1.4 determination of the movement distance Zb
The robot first moves a distance Z0 along the chosen direction; the motion distance Zb is then determined from the proportional relation between the change of the feature-point pixel coordinates and Z0, so that the calibration object image finally stays within the image-space range $(u_0, v_0)$ to $(u'_0, v'_0)$. Concretely, with the motion direction set along the center line of point 1, let the feature-point coordinates before the motion be $(u_1, v_1)$, $(u_2, v_2)$, $(u_3, v_3)$ and, after moving the distance Z0, $(u'_1, v'_1)$, $(u'_2, v'_2)$, $(u'_3, v'_3)$; Zb is selected from these displacements.
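A hedged sketch of one plausible proportional rule (the patent's exact selection formula is an image not reproduced in the text, so this is an assumption, not the patented formula):

    # Sketch: choose Zb proportionally to Z0 (assumed rule; pixel displacement
    # is taken to scale roughly linearly with the motion distance).
    import numpy as np

    def choose_zb(pts_before, pts_after, z0, target_disp_px):
        """Scale Z0 so the expected pixel displacement reaches target_disp_px."""
        observed = np.linalg.norm(pts_after - pts_before, axis=1).max()
        return z0 * target_disp_px / observed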
1.5 determination of whether the FOE meets the requirement
A straight line is fitted for each feature point from its image-coordinate trajectory, and the three pairwise intersection points $e_1, e_2, e_3$ of the three lines are computed. Their mean is taken as the focus of expansion $e$ (the FOE point), and it is judged whether $\max_i |e - e_i| < \varepsilon$. If satisfied, the motion is retained and the coordinates of $e$ and the start and end poses $T_{i1}$, $T_{i2}$ of the translation are stored; otherwise the motion is discarded and the next group of motions is performed.
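A minimal numpy sketch of this consistency check, assuming each of the three lines is given in homogeneous form $(a, b, c)$ fitted from one feature point's trajectory:

    # Sketch: FOE as the mean of the pairwise intersections of three fitted
    # lines, with the max-deviation acceptance test of Section 1.5.
    import numpy as np
    from itertools import combinations

    def foe_check(lines, eps):
        """lines: (3, 3) array of homogeneous line coefficients (a, b, c)."""
        inters = []
        for l1, l2 in combinations(lines, 2):
            p = np.cross(l1, l2)          # homogeneous intersection point
            inters.append(p[:2] / p[2])   # normalize to (x, y)
        inters = np.array(inters)         # e1, e2, e3
        e = inters.mean(axis=0)           # candidate FOE
        ok = np.max(np.linalg.norm(inters - e, axis=1)) < eps
        return e, ok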
1.6 implementation of the calibration algorithm
From the camera intrinsic model and the image coordinates of $e_i$ (homogeneous coordinates $(u_{ei}, v_{ei}, 1)$), the coordinates $(x_{ci}, y_{ci}, z_{ci})$ of the imaging point in the camera coordinate system are obtained as

$$\begin{bmatrix} x_{ci} \\ y_{ci} \\ z_{ci} \end{bmatrix} = f K^{-1} \begin{bmatrix} u_{ei} \\ v_{ei} \\ 1 \end{bmatrix} = \overrightarrow{C_1 e_i},$$

where $\overrightarrow{C_1 e_i}$ represents the direction of the translation line $O_1 O_2$ in the camera coordinate system before translation.

The stored robot poses $T_{i1}$, $T_{i2}$ are end poses in the base coordinate system, and the translational motion is $k_1 \vec{b} = (T_{i1})^{-1} T_{i2}$.

The direction of $O_1 O_2$ is the unit vector $\vec{a} = O_1 e / |O_1 e|$, so $O_1 O_2 = k \vec{a}$, where $\vec{a} = (a_1, a_2, a_3)^T$ and $k > 0$.
Let P be a point in space, $X_w$ its coordinates in the robot base coordinate system, $X_e$ its coordinates in the robot end coordinate system, and $X_c$ its coordinates in the camera coordinate system; $R_{EW}$ and $t_{EW}$ relate the robot base and end coordinate systems, and $R_{EC}$ and $t_{EC}$ relate the robot end and camera coordinate systems, i.e. the hand-eye relation. Then

$$X_e = R_{EW} X_w + t_{EW}, \qquad X_c = R_{EC} X_e + t_{EC}.$$
The robot end translates from A to B with translation vector $k\vec{b}$. The coordinates of P in the end coordinate system at A and B are $X_e$ and $X_{e1}$ respectively, and in the camera coordinate system $X_c$ and $X_{c1}$. Then

$$X_{c1} = R_{EC} X_{e1} + t_{EC} = R_{EC}(X_e + k\vec{b}) + t_{EC} = X_c + k R_{EC} \vec{b},$$

whence $X_{c1} - X_c = k R_{EC} \vec{b} = k \vec{a}$, i.e. $\vec{a} = R_{EC} \vec{b}$.

Setting $\overrightarrow{C_1 e} = k_1 \vec{a}$ gives $k_1 \vec{a} = k_1 R_{EC} \vec{b} = \overrightarrow{C_1 e} = f K^{-1} [u_i, v_i, 1]^T$, from which $\tau K R_{EC} \vec{b} = [u_i, v_i, 1]^T$, where $\tau = k_1 / f$.
Let

$$A_1 = K R_{EC} = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix}.$$

Substituting into the above formula gives, for each motion,

$$\begin{cases} b_{i1} a_{11} + b_{i2} a_{12} + b_{i3} a_{13} - b_{i1} u_i a_{31} - b_{i2} u_i a_{32} - b_{i3} u_i a_{33} = 0 \\ b_{i1} a_{21} + b_{i2} a_{22} + b_{i3} a_{23} - b_{i1} v_i a_{31} - b_{i2} v_i a_{32} - b_{i3} v_i a_{33} = 0 \end{cases}$$
The camera is controlled to move n times, yielding n FOE points $e_i$ and hence 2n linear equations in the elements of $A_1$. Setting $a_{33} = 1$ leaves 2n linear equations in the remaining elements; when $n \geq 4$ a unique least-squares solution is obtained. To determine $a_{33}$, write the matrix in the form

$$a_{33} \begin{bmatrix} \bar a_1 \\ \bar a_2 \\ \bar a_3 \end{bmatrix} = \begin{bmatrix} k_x & s & u_0 \\ 0 & k_y & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} r_1 \\ r_2 \\ r_3 \end{bmatrix},$$

where $\bar a_j$ and $r_j$ are row vectors. The third row gives $a_{33} \bar a_3 = r_3$; since $r_3$ is a unit row of the orthonormal matrix R, $a_{33} = |r_3| / |\bar a_3| = 1 / |\bar a_3|$. Then $A = a_{33} A_1$ with rows $A_1, A_2, A_3$, and the intrinsic matrix K and the hand-eye rotation R are decomposed as

$$\begin{cases} r_3 = A_3 \\ v_0 = A_2 r_3^T \\ u_0 = A_1 r_3^T \\ k_y = |A_2 \times r_3| \\ r_1 = A_2 \times r_3 / |A_2 \times r_3| \\ r_2 = r_3 \times r_1 \\ k_x = A_1 r_1^T \\ s = A_1 r_2^T \end{cases}$$

where $|\cdot|$ denotes the modulus, and the signs of $r_1$, $r_2$ are chosen so that $k_x$, $k_y$ are positive.
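A compact numpy sketch of this linear solve and decomposition, assuming for each retained motion the translation direction $\vec{b}_i$ (robot end frame) and the FOE pixel coordinates $(u_i, v_i)$ are available:

    # Sketch: solve A1 = K * R_EC (with a33 fixed to 1) from n >= 4 motions,
    # then decompose into the intrinsics K and the hand-eye rotation R_EC.
    import numpy as np

    def calibrate_K_R(b_vecs, foes):
        """b_vecs: (n, 3) translation directions; foes: (n, 2) FOE pixels."""
        M, rhs = [], []
        for (b1, b2, b3), (u, v) in zip(b_vecs, foes):
            # unknowns: a11 a12 a13 a21 a22 a23 a31 a32 (a33 = 1)
            M.append([b1, b2, b3, 0, 0, 0, -b1*u, -b2*u]); rhs.append(b3*u)
            M.append([0, 0, 0, b1, b2, b3, -b1*v, -b2*v]); rhs.append(b3*v)
        x, *_ = np.linalg.lstsq(np.array(M), np.array(rhs), rcond=None)
        A1 = np.array([[x[0], x[1], x[2]],
                       [x[3], x[4], x[5]],
                       [x[6], x[7], 1.0]])
        A = A1 / np.linalg.norm(A1[2])   # a33 = 1 / |a3_bar|
        r3 = A[2]
        u0, v0 = A[0] @ r3, A[1] @ r3
        ky = np.linalg.norm(np.cross(A[1], r3))
        r1 = np.cross(A[1], r3) / ky
        r2 = np.cross(r3, r1)
        kx, s = A[0] @ r1, A[0] @ r2     # sign fixes for kx > 0 omitted here
        K = np.array([[kx, s, u0], [0, ky, v0], [0, 0, 1.0]])
        R = np.vstack([r1, r2, r3])
        return K, R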
The second step is as follows:
2.1, the translational motion directions $\theta$, $\varphi$ are chosen with the same values as in Section 1.3.
2.2, the collected image with the laser stripe is binarized with a threshold, and a morphological closing operation is applied to the binarized image to remove edge singular points;
2.3, within the light-stripe region, consider the 8-neighborhood of each pixel: the center point is denoted p1 and its 8 neighbors are p2, p3, ..., p8, p9, arranged clockwise around the center with p2 directly above p1. First, mark the boundary points that satisfy all of the following conditions:
① 2 ≤ N(p1) ≤ 6; ② S(p1) = 1; ③ p2·p4·p6 = 0; ④ p4·p6·p8 = 0;
where N(p1) is the number of non-zero neighbors of p1 and S(p1) is the number of 0 → 1 transitions when traversing p2, p3, ..., p9 in order. When all boundary points have been checked, all marked points are removed. The procedure is iterated until no point satisfies the marking condition, completing the thinning of the light stripe.
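A minimal sketch of one marking pass of this thinning rule on a 0/1 numpy image; a full thinning run repeats the pass until nothing changes:

    # Sketch: one iteration of the boundary-marking thinning rule of
    # Section 2.3 (mark, then delete all marked points at once).
    import numpy as np

    def thin_once(img):
        """img: 2-D array with values 0/1; returns (img, changed)."""
        marked = []
        for y in range(1, img.shape[0] - 1):
            for x in range(1, img.shape[1] - 1):
                if img[y, x] == 0:
                    continue
                # p2..p9 clockwise, starting directly above p1 = (y, x).
                nb = [img[y-1, x], img[y-1, x+1], img[y, x+1], img[y+1, x+1],
                      img[y+1, x], img[y+1, x-1], img[y, x-1], img[y-1, x-1]]
                N = sum(nb)
                S = sum(nb[i] == 0 and nb[(i+1) % 8] == 1 for i in range(8))
                p2, p4, p6, p8 = nb[0], nb[2], nb[4], nb[6]
                if 2 <= N <= 6 and S == 1 and p2*p4*p6 == 0 and p4*p6*p8 == 0:
                    marked.append((y, x))
        for y, x in marked:
            img[y, x] = 0
        return img, bool(marked)

Iterating thin_once until changed is False yields the one-pixel-wide skeleton of the stripe.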
2.4, a line equation is fitted by least squares to the points obtained after thinning, giving $a_i x + b_i y + c_i = 0$ $(i = 1, 2, \ldots, 5)$. The blanking point $(x_e, y_e)$ is then obtained as the optimal solution of

$$f(x_e, y_e) = \min \sum_{i=1}^{n} \left| \frac{a_i x_e + b_i y_e + c_i}{\sqrt{a_i^2 + b_i^2}} \right|^2.$$
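A short numpy sketch of this least-squares blanking point, assuming the lines are given as rows $(a_i, b_i, c_i)$:

    # Sketch: blanking (vanishing) point minimizing the summed squared
    # point-to-line distances of Section 2.4.
    import numpy as np

    def blanking_point(lines):
        """lines: (n, 3) rows (a, b, c) of fitted stripe lines."""
        norms = np.linalg.norm(lines[:, :2], axis=1, keepdims=True)
        L = lines / norms                 # normalize so residual = distance
        A, c = L[:, :2], -L[:, 2]
        xe, ye = np.linalg.lstsq(A, c, rcond=None)[0]
        return xe, ye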
Combining the camera intrinsic matrix K, the homogeneous coordinates of the blanking point on the normalized image plane are obtained; they represent the direction, in the camera coordinate system, of the corresponding group of parallel lines in the light plane.
2.5, let the equation of the light plane be $a_1 x_c + a_2 y_c + a_3 z_c + a_4 = 0$. As shown in Fig. 4, let $\gamma_1$, $\gamma_2$ be the directions of the two groups of non-parallel stripe lines in the light plane and $\gamma_3$ the normal of the light plane; then $\gamma_3 = \gamma_1 \times \gamma_2$, which completes the calibration of the light-plane normal direction $(a_1, a_2, a_3)$.
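A sketch connecting 2.4 and 2.5: back-projecting the two blanking points through $K^{-1}$ gives the stripe directions $\gamma_1$, $\gamma_2$, whose cross product is the normal (function and argument names are illustrative):

    # Sketch: light-plane normal from the two blanking points of poses T1, T2.
    import numpy as np

    def light_plane_normal(K, bp1, bp2):
        """K: (3,3) intrinsics; bp1, bp2: blanking points (u, v) in pixels."""
        Kinv = np.linalg.inv(K)
        g1 = Kinv @ np.array([bp1[0], bp1[1], 1.0])  # stripe group 1 direction
        g2 = Kinv @ np.array([bp2[0], bp2[1], 1.0])  # stripe group 2 direction
        n = np.cross(g1, g2)                          # gamma3 = gamma1 x gamma2
        return n / np.linalg.norm(n)                  # (a1, a2, a3), unit length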
The specific method of the third step is as follows:
3.1, after images are collected at poses T1, T2 and T3, the feature-point coordinates are extracted, automatically matched and stored. From the chain robot base coordinate system – robot end coordinate system – camera coordinate system – image coordinate system of the robot hand-eye system:

$$X_w = \epsilon\, R_{EW} M \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} + R_{EW}\, t + t_{EW}, \qquad M = R K^{-1},$$

where $\epsilon$ is the depth of the target point in the camera coordinate system, $X_w$ the three-dimensional coordinates of the feature point in the base coordinate system, $R_{EW}$, $t_{EW}$ the end-to-base transform read from the robot pose, $(u, v)$ the image coordinates of the feature point, K the camera intrinsic matrix, and R, t the rotation and translation parts of the hand-eye matrix. For the same feature point at the three different poses:
$$X_w = \epsilon_1 R^1_{EW} M [u_1, v_1, 1]^T + R^1_{EW}\, t + t^1_{EW}$$
$$X_w = \epsilon_2 R^2_{EW} M [u_2, v_2, 1]^T + R^2_{EW}\, t + t^2_{EW}$$
$$X_w = \epsilon_3 R^3_{EW} M [u_3, v_3, 1]^T + R^3_{EW}\, t + t^3_{EW}$$
Subtracting these pairwise gives

$$\begin{cases} \epsilon_1 A_1 - \epsilon_2 A_2 + B_1 t = C_1 \\ \epsilon_2 A_2 - \epsilon_3 A_3 + B_2 t = C_2 \end{cases}$$

where $A_i = R^i_{EW} M [u_i, v_i, 1]^T$ is a 3×1 vector, $B_i = R^i_{EW} - R^{i+1}_{EW}$ is a 3×3 matrix, and $C_i = t^{i+1}_{EW} - t^i_{EW}$ is 3×1. Writing

$$A = \begin{bmatrix} A_1 & -A_2 & 0_{3\times 1} & B_1 \\ 0_{3\times 1} & A_2 & -A_3 & B_2 \end{bmatrix}, \qquad X = \begin{bmatrix} \epsilon_1 \\ \epsilon_2 \\ \epsilon_3 \\ t \end{bmatrix}, \qquad b = \begin{bmatrix} C_1 \\ C_2 \end{bmatrix},$$

the equations take the matrix form $AX = b$; X is solved by least squares and contains the translation part t of the hand-eye matrix.
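A numpy sketch of stacking this system over the three feature points and solving it by least squares (the argument layout is an assumption; each feature point contributes six equations in twelve unknowns):

    # Sketch: least-squares solution of the depth / hand-eye-translation
    # system of Section 3.1, stacked over all three feature points.
    import numpy as np

    def solve_depths_and_t(R_ew, t_ew, M, uv):
        """R_ew: list of 3 end-to-base rotations at poses T1..T3;
        t_ew: list of 3 translations; M = R @ inv(K);
        uv[p][k]: pixel (u, v) of feature point p at pose k.
        Unknowns: eps[p][k] (9 depths) followed by t (3)."""
        rows, rhs = [], []
        for p in range(3):
            A = [R_ew[k] @ M @ np.array([uv[p][k][0], uv[p][k][1], 1.0])
                 for k in range(3)]
            for k in (0, 1):  # eps_k A_k - eps_{k+1} A_{k+1} + B_k t = C_k
                row = np.zeros((3, 12))
                row[:, 3*p + k] = A[k]
                row[:, 3*p + k + 1] = -A[k + 1]
                row[:, 9:] = R_ew[k] - R_ew[k + 1]      # B_k
                rows.append(row)
                rhs.append(t_ew[k + 1] - t_ew[k])       # C_k
        X = np.linalg.lstsq(np.vstack(rows), np.concatenate(rhs), rcond=None)[0]
        eps = X[:9].reshape(3, 3)   # eps[p, k]: depth of point p at pose k
        t = X[9:]                   # hand-eye translation part
        return eps, t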
3.2, the solved X contains the depth values $\epsilon_i$ of the three feature points at pose T3; from $\epsilon_i$, the three-dimensional coordinates $(x_{wi}, y_{wi}, z_{wi})$ $(i = 1, 2, 3)$ of each feature point in the robot base coordinate system are obtained. Since three non-collinear points determine a plane, the plane equation determined by the three feature points, $a_0 x + b_0 y + c_0 z + 1 = 0$, is obtained. As shown in Fig. 5, the homogeneous coordinates $(u_g, v_g, 1)$ of any point on the stripe line are substituted into the hand-eye system relation, giving the point's base-frame coordinates $\epsilon_g (x_{wg}, y_{wg}, z_{wg})^T$ containing the unknown $\epsilon_g$. Substituting these into the feature-point plane equation and solving for $\epsilon_g$ yields the light point's base-frame coordinates $(x_g, y_g, z_g)^T$. Transforming through T3 and the hand-eye matrix $\begin{bmatrix} R & t \\ 0 & 1 \end{bmatrix}$ into the camera coordinate system gives $(x_{CP}, y_{CP}, z_{CP})^T$, which is substituted into the light-plane equation to solve $a_4$, completing the equation of the light plane.
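A sketch of this depth recovery and the $a_4$ solve, assuming, as in 3.1, that the hand-eye pair (R, t) maps camera coordinates to the end frame and (R_EW, t_EW) maps end to base (all names illustrative):

    # Sketch: solve a4 of the light plane (Section 3.2) from one stripe pixel.
    import numpy as np

    def solve_a4(K, R, t, R_ew3, t_ew3, P_w_feats, uv_g, n_plane):
        """K: intrinsics; (R, t): hand-eye; (R_ew3, t_ew3): pose T3 (end->base);
        P_w_feats: (3, 3) feature points in base frame; uv_g: stripe pixel;
        n_plane: light-plane normal (a1, a2, a3)."""
        # Plane a0 x + b0 y + c0 z + 1 = 0 through the three feature points.
        abc = np.linalg.solve(P_w_feats, -np.ones(3))
        # Viewing ray of the stripe pixel, expressed in the base frame.
        d_w = R_ew3 @ R @ np.linalg.inv(K) @ np.array([uv_g[0], uv_g[1], 1.0])
        o_w = R_ew3 @ t + t_ew3                    # camera center, base frame
        eps_g = -(abc @ o_w + 1.0) / (abc @ d_w)   # ray-plane intersection
        P_w = o_w + eps_g * d_w                    # light point, base frame
        # Back to the camera frame, then solve a4 from the plane equation.
        P_c = np.linalg.inv(R) @ (np.linalg.inv(R_ew3) @ (P_w - t_ew3) - t)
        return -n_plane @ P_c                      # a4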
The fourth step, computing the offset of the workpiece at the current pose, is as follows:
4.1, the welding workpiece is controlled to touch feature point 1 under the fixed end pose, the current robot pose is read, and the robot end coordinates are obtained;
4.2, the coordinates of the solved specific point in the robot base coordinate system are compared with the coordinates obtained by the earlier touch, giving the offset of the workpiece under the current pose.

Claims (4)

1. A full-automatic motion self-calibration method for a laser-vision guided welding robot, characterized in that the translational and rotational motions are carried out automatically without human participation during calibration, improving the automation and streamlining of calibration; the calibration algorithm uses an ordinary right-angled-triangle calibration object common on industrial sites, placed in the working field of view, and rapidly locates and automatically matches the triangle corner points; by deeply analyzing the mathematical model of the hand-eye system, the product of the intrinsic parameter matrix and the rotation part of the hand-eye matrix is solved as a whole from at least four qualifying automatic motions and then decomposed into the intrinsic matrix and the rotation part; two groups of qualifying translational motions are carried out automatically by the designed motion algorithm to solve the light-plane normal vector information; finally, the translation part of the hand-eye matrix and the depth value of each feature point are obtained from three given poses, and the feature-point coordinates in the robot base coordinate system are used to calibrate the depth information of the laser plane; the whole algorithm comprises the following modules:
the intrinsic parameter and hand-eye rotation calibration module: a calibration object is placed in the working area of the robot, an initial robot pose T0 is given, four groups of qualifying translational motions are performed automatically, the feature points are matched automatically and the FOE points are computed; the intrinsic matrix K and the hand-eye rotation part R are calibrated from the motion information and the FOE points;
the light-plane normal vector calibration module: robot poses T1 and T2 are given (the two poses differ), translational motions are performed automatically, images of five light stripes are collected, the blanking points are computed, and the light-plane normal vector $(a_1, a_2, a_3)$ is obtained;
the hand-eye translation and light-plane depth calibration module: a robot pose T3 is given (T3 differs from T1 and T2); from the feature-point coordinates of the three poses T1, T2 and T3 combined with the hand-eye relation, the translation part t of the hand-eye matrix and the coordinates of the three feature points in the robot base coordinate system are solved; a point on the light stripe is taken and transformed through the hand-eye relation, the depth information of the light plane is solved, and the equation of the light plane is obtained;
and the workpiece end correction module: after calibration of the laser-vision guided welding robot system is completed, the welding workpiece is controlled to touch feature point 1 of the calibration object accurately under a fixed end pose; the coordinates of the workpiece end point in the robot base coordinate system are computed, and the offset of the workpiece under this pose is calculated.
2. The automatic matching of feature points and automatic selection of linearly independent translational motions according to claim 1, having the following features:
2.1, in the automatic matching of feature points, the feature points of the calibration object are extracted and matched automatically according to their distance information;
2.2, in the determination of the motion direction, the direction is set as the angle pair $(\theta_i, \varphi)$ along the robot end coordinate system, following the center-line direction of the qualifying feature point during the automatic motion; for each motion, $\theta_i$ is incremented in steps of $\pi/6$, with $i$ the index of the motion; and the motion distance Zb is determined from the proportional relation between the change of the feature-point pixel coordinates and Z0.
3. The solution of the blanking points and of the laser-plane normal vector according to claim 1, characterized in that: in solving the blanking points, line equations are fitted by least squares to the thinned points, and the blanking point is solved in the least-squares optimal sense; in solving the light-plane normal vector, a plane is determined by the two intersecting direction lines, giving the normal vector $(a_1, a_2, a_3)$.
4. The solution of the hand-eye matrix translation part and the calibration of the light-plane depth information according to the third step of claim 1, having the following features:
4.1, in solving the hand-eye translation part, from the feature-point coordinates extracted at poses T1, T2 and T3, combined with the relations among the robot base, robot end, camera and image coordinate systems of the robot hand-eye system, the translation part t of the hand-eye matrix and the depth value of each feature point in the camera coordinate system are computed;
4.2, the solved X contains the depth values $\epsilon_i$ of the three feature points at pose T3; from $\epsilon_i$, the three-dimensional coordinates $(x_{wi}, y_{wi}, z_{wi})$ $(i = 1, 2, 3)$ of each feature point in the robot base coordinate system are obtained; since three non-collinear points determine a plane, the plane equation determined by the three feature points, $a_0 x + b_0 y + c_0 z + 1 = 0$, is obtained; as shown in Fig. 5, the homogeneous coordinates $(u_g, v_g, 1)$ of any point on the stripe line are substituted into the hand-eye system relation, giving base-frame coordinates $\epsilon_g (x_{wg}, y_{wg}, z_{wg})^T$ containing the unknown $\epsilon_g$; substituting into the feature-point plane equation and solving $\epsilon_g$ yields the light point's base-frame coordinates $(x_g, y_g, z_g)^T$; transforming through T3 and the hand-eye matrix $\begin{bmatrix} R & t \\ 0 & 1 \end{bmatrix}$ into the camera coordinate system gives $(x_{CP}, y_{CP}, z_{CP})^T$, which is substituted into the light-plane equation to solve $a_4$, giving the equation of the light plane.
CN201310322092.0A 2013-07-26 2013-07-26 A kind of welding robot full-automatic movement self-calibration method of laser vision guiding Active CN103558850B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310322092.0A CN103558850B (en) 2013-07-26 2013-07-26 A kind of welding robot full-automatic movement self-calibration method of laser vision guiding

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310322092.0A CN103558850B (en) 2013-07-26 2013-07-26 A kind of welding robot full-automatic movement self-calibration method of laser vision guiding

Publications (2)

Publication Number Publication Date
CN103558850A true CN103558850A (en) 2014-02-05
CN103558850B CN103558850B (en) 2017-10-24

Family

ID=50013130

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310322092.0A Active CN103558850B (en) 2013-07-26 2013-07-26 A kind of welding robot full-automatic movement self-calibration method of laser vision guiding

Country Status (1)

Country Link
CN (1) CN103558850B (en)

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103878774A (en) * 2014-02-25 2014-06-25 西安航天精密机电研究所 Vision calibration method based on robot
CN104613899A (en) * 2015-02-09 2015-05-13 淮阴工学院 Full-automatic calibration method for structured light hand-eye three-dimensional measuring system
CN105157725A (en) * 2015-07-29 2015-12-16 华南理工大学 Hand-eye calibration method employing two-dimension laser vision sensor and robot
CN105303560A (en) * 2015-09-22 2016-02-03 中国计量学院 Robot laser scanning welding seam tracking system calibration method
CN106887023A (en) * 2017-02-21 2017-06-23 成都通甲优博科技有限责任公司 For scaling board and its scaling method and calibration system that binocular camera is demarcated
CN107256567A (en) * 2017-01-22 2017-10-17 梅卡曼德(北京)机器人科技有限公司 A kind of automatic calibration device and scaling method for industrial robot trick camera
CN107498558A (en) * 2017-09-19 2017-12-22 北京阿丘科技有限公司 Full-automatic hand and eye calibrating method and device
CN107741224A (en) * 2017-08-28 2018-02-27 浙江大学 A kind of AGV automatic-posture-adjustment localization methods of view-based access control model measurement and demarcation
CN108717715A (en) * 2018-06-11 2018-10-30 华南理工大学 A kind of line-structured light vision system automatic calibration method for arc welding robot
CN108972544A (en) * 2018-06-21 2018-12-11 华南理工大学 A kind of vision laser sensor is fixed on the hand and eye calibrating method of robot
CN109060074A (en) * 2018-08-10 2018-12-21 广州极飞科技有限公司 Device for storing liquid, by storage solution balance detection method, liquid storage detection device
CN109159114A (en) * 2018-08-16 2019-01-08 郑州大学 The accuracy method of SCARA manipulator fixed camera vision system hand and eye calibrating
CN109375628A (en) * 2018-11-28 2019-02-22 南京工程学院 A kind of Intelligent Mobile Robot air navigation aid positioned using laser orientation and radio frequency
CN109683710A (en) * 2018-12-20 2019-04-26 北京字节跳动网络技术有限公司 A kind of palm normal vector determines method, apparatus, equipment and storage medium
CN110009685A (en) * 2018-12-29 2019-07-12 南京衍构科技有限公司 A kind of laser camera hand and eye calibrating method increasing material applied to electric arc
CN110193849A (en) * 2018-02-27 2019-09-03 北京猎户星空科技有限公司 A kind of method and device of Robotic Hand-Eye Calibration
CN110245599A (en) * 2019-06-10 2019-09-17 深圳市超准视觉科技有限公司 A kind of intelligent three-dimensional weld seam Auto-searching track method
CN110378898A (en) * 2019-07-26 2019-10-25 金瓜子科技发展(北京)有限公司 A kind of method, apparatus, storage medium and the equipment of beacon positioning
CN110666798A (en) * 2019-10-11 2020-01-10 华中科技大学 Robot vision calibration method based on perspective transformation model
CN111390439A (en) * 2020-03-31 2020-07-10 北京博清科技有限公司 Welding seam detection method and device, welding robot and storage medium
CN111508027A (en) * 2019-01-31 2020-08-07 杭州海康威视数字技术股份有限公司 Method and device for calibrating external parameters of camera
CN112549018A (en) * 2020-11-03 2021-03-26 武汉数字化设计与制造创新中心有限公司 Robot line laser rapid hand-eye calibration method
US11040452B2 (en) 2018-05-29 2021-06-22 Abb Schweiz Ag Depth sensing robotic hand-eye camera using structured light
CN113524204A (en) * 2021-09-15 2021-10-22 苏州鼎纳自动化技术有限公司 Coordinate system coincidence calibration method and system
CN114289934A (en) * 2021-09-27 2022-04-08 西安知象光电科技有限公司 Three-dimensional vision-based automatic welding system and method for large structural part
CN114693770A (en) * 2020-12-31 2022-07-01 北京小米移动软件有限公司 Calibration method and device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101186038A (en) * 2007-12-07 2008-05-28 北京航空航天大学 Method for demarcating robot stretching hand and eye
CN102441719A (en) * 2011-08-26 2012-05-09 昆山工研院工业机器人研究所有限公司 Front laser vision sensing-based seam tracking offline planning method
CN102794763A (en) * 2012-08-31 2012-11-28 江南大学 Systematic calibration method of welding robot guided by line structured light vision sensor
CN102927908A (en) * 2012-11-06 2013-02-13 中国科学院自动化研究所 Robot eye-on-hand system structured light plane parameter calibration device and method
CN103175485A (en) * 2013-02-20 2013-06-26 天津工业大学 Method for visually calibrating aircraft turbine engine blade repair robot

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101186038A (en) * 2007-12-07 2008-05-28 北京航空航天大学 Method for demarcating robot stretching hand and eye
CN102441719A (en) * 2011-08-26 2012-05-09 昆山工研院工业机器人研究所有限公司 Front laser vision sensing-based seam tracking offline planning method
CN102794763A (en) * 2012-08-31 2012-11-28 江南大学 Systematic calibration method of welding robot guided by line structured light vision sensor
CN102927908A (en) * 2012-11-06 2013-02-13 中国科学院自动化研究所 Robot eye-on-hand system structured light plane parameter calibration device and method
CN103175485A (en) * 2013-02-20 2013-06-26 天津工业大学 Method for visually calibrating aircraft turbine engine blade repair robot

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103878774A (en) * 2014-02-25 2014-06-25 西安航天精密机电研究所 Vision calibration method based on robot
CN104613899A (en) * 2015-02-09 2015-05-13 淮阴工学院 Full-automatic calibration method for structured light hand-eye three-dimensional measuring system
CN105157725A (en) * 2015-07-29 2015-12-16 华南理工大学 Hand-eye calibration method employing two-dimension laser vision sensor and robot
CN105157725B (en) * 2015-07-29 2018-06-29 华南理工大学 A kind of hand and eye calibrating method of two-dimensional laser visual sensor and robot
CN105303560A (en) * 2015-09-22 2016-02-03 中国计量学院 Robot laser scanning welding seam tracking system calibration method
CN105303560B (en) * 2015-09-22 2018-01-12 中国计量学院 Robotic laser scanning type weld seam tracking system calibrating method
CN107256567A (en) * 2017-01-22 2017-10-17 梅卡曼德(北京)机器人科技有限公司 A kind of automatic calibration device and scaling method for industrial robot trick camera
CN107256567B (en) * 2017-01-22 2020-08-07 梅卡曼德(北京)机器人科技有限公司 Automatic calibration device and calibration method for hand-eye camera of industrial robot
CN106887023A (en) * 2017-02-21 2017-06-23 成都通甲优博科技有限责任公司 For scaling board and its scaling method and calibration system that binocular camera is demarcated
CN107741224A (en) * 2017-08-28 2018-02-27 浙江大学 A kind of AGV automatic-posture-adjustment localization methods of view-based access control model measurement and demarcation
CN107498558A (en) * 2017-09-19 2017-12-22 北京阿丘科技有限公司 Full-automatic hand and eye calibrating method and device
CN110193849A (en) * 2018-02-27 2019-09-03 北京猎户星空科技有限公司 A kind of method and device of Robotic Hand-Eye Calibration
US11040452B2 (en) 2018-05-29 2021-06-22 Abb Schweiz Ag Depth sensing robotic hand-eye camera using structured light
CN108717715A (en) * 2018-06-11 2018-10-30 华南理工大学 A kind of line-structured light vision system automatic calibration method for arc welding robot
CN108717715B (en) * 2018-06-11 2022-05-31 华南理工大学 Automatic calibration method for linear structured light vision system of arc welding robot
CN108972544A (en) * 2018-06-21 2018-12-11 华南理工大学 A kind of vision laser sensor is fixed on the hand and eye calibrating method of robot
CN109060074A (en) * 2018-08-10 2018-12-21 广州极飞科技有限公司 Device for storing liquid, by storage solution balance detection method, liquid storage detection device
CN109159114A (en) * 2018-08-16 2019-01-08 郑州大学 The accuracy method of SCARA manipulator fixed camera vision system hand and eye calibrating
CN109375628A (en) * 2018-11-28 2019-02-22 南京工程学院 A kind of Intelligent Mobile Robot air navigation aid positioned using laser orientation and radio frequency
CN109683710B (en) * 2018-12-20 2019-11-08 北京字节跳动网络技术有限公司 A kind of palm normal vector determines method, apparatus, equipment and storage medium
CN109683710A (en) * 2018-12-20 2019-04-26 北京字节跳动网络技术有限公司 A kind of palm normal vector determines method, apparatus, equipment and storage medium
CN110009685A (en) * 2018-12-29 2019-07-12 南京衍构科技有限公司 A kind of laser camera hand and eye calibrating method increasing material applied to electric arc
CN110009685B (en) * 2018-12-29 2022-02-22 南京衍构科技有限公司 Laser camera hand-eye calibration method applied to electric arc material increase
CN111508027B (en) * 2019-01-31 2023-10-20 杭州海康威视数字技术股份有限公司 Method and device for calibrating external parameters of camera
CN111508027A (en) * 2019-01-31 2020-08-07 杭州海康威视数字技术股份有限公司 Method and device for calibrating external parameters of camera
CN110245599A (en) * 2019-06-10 2019-09-17 深圳市超准视觉科技有限公司 A kind of intelligent three-dimensional weld seam Auto-searching track method
CN110378898B (en) * 2019-07-26 2021-07-16 金瓜子科技发展(北京)有限公司 Beacon positioning method, device, storage medium and equipment
CN110378898A (en) * 2019-07-26 2019-10-25 金瓜子科技发展(北京)有限公司 A kind of method, apparatus, storage medium and the equipment of beacon positioning
CN110666798A (en) * 2019-10-11 2020-01-10 华中科技大学 Robot vision calibration method based on perspective transformation model
CN111390439B (en) * 2020-03-31 2021-11-05 北京博清科技有限公司 Welding seam detection method and device, welding robot and storage medium
CN111390439A (en) * 2020-03-31 2020-07-10 北京博清科技有限公司 Welding seam detection method and device, welding robot and storage medium
CN112549018A (en) * 2020-11-03 2021-03-26 武汉数字化设计与制造创新中心有限公司 Robot line laser rapid hand-eye calibration method
CN114693770A (en) * 2020-12-31 2022-07-01 北京小米移动软件有限公司 Calibration method and device
CN113524204A (en) * 2021-09-15 2021-10-22 苏州鼎纳自动化技术有限公司 Coordinate system coincidence calibration method and system
CN114289934A (en) * 2021-09-27 2022-04-08 西安知象光电科技有限公司 Three-dimensional vision-based automatic welding system and method for large structural part
CN114289934B (en) * 2021-09-27 2023-10-13 西安知象光电科技有限公司 Automatic welding system and method for large structural part based on three-dimensional vision

Also Published As

Publication number Publication date
CN103558850B (en) 2017-10-24

Similar Documents

Publication Publication Date Title
CN103558850B (en) A kind of welding robot full-automatic movement self-calibration method of laser vision guiding
CN109308693B (en) Single-binocular vision system for target detection and pose measurement constructed by one PTZ camera
CN107876970B (en) Robot multilayer multi-pass welding seam three-dimensional detection and welding seam inflection point identification method
JP3735344B2 (en) Calibration apparatus, calibration method, and calibration program
CN103678754B (en) Information processor and information processing method
CN108717715A (en) A kind of line-structured light vision system automatic calibration method for arc welding robot
CN103065323B (en) Subsection space aligning method based on homography transformational matrix
CN111089569A (en) Large box body measuring method based on monocular vision
CN112964186B (en) Device and method for measuring clearance in automatic shaft hole assembly process
CN109470149B (en) Method and device for measuring position and posture of pipeline
CN111012506A (en) Robot-assisted puncture surgery end tool center calibration method based on stereoscopic vision
CN109448059B (en) Rapid X-corner sub-pixel detection method
CN111476841A (en) Point cloud and image-based identification and positioning method and system
CN115816471B (en) Unordered grabbing method, unordered grabbing equipment and unordered grabbing medium for multi-view 3D vision guided robot
CN114371472B (en) Automatic combined calibration device and method for laser radar and camera
CN112085708B (en) Method and equipment for detecting defects of straight line edges in outer contour of product
Ding et al. A robust detection method of control points for calibration and measurement with defocused images
CN104515502A (en) Robot hand-eye stereo vision measurement method
CN103247032A (en) Weak extended target positioning method based on attitude compensation
JPH0798214A (en) Method and device for three dimensional position and attitude recognition method based on sense of sight
CN109360267B (en) Rapid three-dimensional reconstruction method for thin object
CN113012238B (en) Method for quick calibration and data fusion of multi-depth camera
Li et al. Method for detecting pipeline spatial attitude using point cloud alignment
CN117584121A (en) Welding robot path planning method based on point cloud scene understanding
CN110849285A (en) Welding spot depth measuring method, system and medium based on monocular camera

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant