CN113386136B - Robot posture correction method and system based on standard spherical array target estimation - Google Patents
- Publication number: CN113386136B (application CN202110736372.0A)
- Authority: CN (China)
- Legal status: Active
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
Abstract
The invention belongs to the technical field of robots and discloses a robot pose correction method and system based on standard spherical array target estimation. The method comprises the following steps: S1, establishing a robot base coordinate system, a robot end coordinate system, a scanner measurement coordinate system and a standard spherical array local coordinate system; S2, driving the scanner with the robot to scan the standard spherical array from multiple angles, reading and recording the robot pose and the standard spherical array point cloud at each pose, and calculating the transformation matrix of the robot end coordinate system relative to the robot base coordinate system; S3, converting this transformation matrix into an actual six-dimensional vector, which is the actual pose of the robot, and calculating the error between the actual pose and the pose read in step S2 to correct the robot pose. The invention breaks through the bottleneck of existing methods and offers low cost, strong practicability, high correction efficiency and a wide application range.
Description
Technical Field
The invention belongs to the technical field of robots, and particularly relates to a robot posture correction method and system based on standard spherical array target estimation.
Background
Robots offer flexible operation and high dexterity, and replacing human workers with industrial robots in tasks such as palletizing, welding and assembly has become the mainstream trend in the field. However, the low absolute positioning accuracy of industrial robots limits the precision of their operations.
Many researchers have therefore studied accuracy compensation methods for industrial robots based on their kinematic parameters. These methods mostly involve complex mathematical derivations such as Jacobian matrix calculation and differential motion solution; they are inefficient when only specific robot poses need correction, and the measuring device they require (a laser tracker) greatly increases cost. Accordingly, there is a need in the art for a fast robot pose correction method based on standard spherical array target estimation that is low-cost, practical and efficient.
Disclosure of Invention
Aiming at the above defects or improvement requirements of the prior art, the invention provides a robot pose correction method and system based on standard spherical array target estimation, which enables rapid correction of specific robot poses, breaks through the bottleneck of existing methods, and offers low cost, strong practicability, high correction efficiency and a wide application range.
To achieve the above object, according to one aspect of the present invention, there is provided a robot pose correction method based on a standard sphere array target estimation, the method comprising the steps of:
s1, fixedly connecting the scanner at the tail end of the robot, placing the standard spherical array on the workbench, and establishing a robot base coordinate system { B }, a robot tail end coordinate system { E }, a scanner measurement coordinate system { S } and a standard spherical array local coordinate system { W };
S2, the robot drives the scanner to scan the standard spherical array from multiple angles, and the robot pose and the standard spherical array point cloud at each pose are read and recorded; the hand-eye matrix is calculated from the recorded pose and point cloud information; the recorded point clouds are converted to the robot base coordinate system and matched against the standard spherical array design model; the standard spherical array target estimation is completed with the matching results; and the transformation matrix T_BE of the robot end coordinate system {E} relative to the robot base coordinate system {B} is calculated based on the dimension chain transfer model;
S3, the transformation matrix T_BE of the robot end coordinate system relative to the robot base coordinate system is converted, through the vector-matrix transformation relation, into an actual six-dimensional vector ζ'; the actual six-dimensional vector ζ' is the actual robot pose obtained by the coordinate transformation calculation; the error between the actual robot pose and the robot pose read in step S2 is calculated to correct the robot pose.
S21, establishing the standard spherical array local coordinate system {W} based on the standard spherical array point cloud, and calculating the transformation matrix T_SW of the standard spherical array local coordinate system {W} relative to the scanner measurement coordinate system {S};
S22, performing hand-eye calibration on the robot to obtain the robot hand-eye relationship matrix, namely the transformation matrix T_ES of the scanner measurement coordinate system {S} relative to the robot end coordinate system {E};
S23, obtaining, based on rigid-body transformation matrices and point cloud matching, the transformation matrix T_WB of the robot base coordinate system {B} relative to the standard spherical array local coordinate system {W};
S24, constructing the relation among the transformation matrices T_SW, T_ES, T_WB and the transformation matrix T_BE, and calculating the required transformation matrix T_BE from this relation.
Further preferably, in step S21, the transformation matrix T_SW is calculated as follows:
(a) fitting the coordinates of the center of each standard sphere according to the data of the scanned standard spherical array point cloud under each posture;
(b) selecting the center of one standard sphere as the origin to establish the standard spherical array local coordinate system, the three-dimensional coordinates of the origin and the coordinate-axis directions of the coordinate system forming the transformation matrix T_SW.
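The center fit in sub-step (a) reduces to a linear least-squares problem, since ||q - c||² = r² is linear in the center c and in k = r² - c·c. A minimal NumPy sketch (illustrative, not the patent's exact fitting routine):

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere fit: ||q - c||^2 = r^2 linearizes to
    2*q.c + (r^2 - c.c) = q.q, a linear system in (c, k)."""
    Q = np.asarray(points, dtype=float)              # (N, 3) scanned surface points
    A = np.hstack([2.0 * Q, np.ones((len(Q), 1))])   # unknowns: [cx, cy, cz, k]
    b = (Q ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)       # r = sqrt(k + c.c)
    return center, radius
```

With noisy scans the same solve gives the algebraic best fit; each sphere of the array is fitted separately.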
Further preferably, in step S22, the transformation matrix T_ES is obtained according to the following steps:
(a) constructing the relational expression between the robot pose and the transformation matrix of the robot end coordinate system relative to the robot base coordinate system, and calculating, from the read robot poses, the transformation matrices T_BE,i of the robot end coordinate system {E} relative to the robot base coordinate system {B};
(b) constructing the relation among the transformation matrices T_BE,i, T_SW,i and T_ES, so as to calculate the transformation matrix T_ES.
Further preferably, in step (a), the transformation matrix T_BE,i is calculated according to the following relation:

T_BE,i = [ R_i  p_i ; 0 0 0 1 ],  R_i = R(z, Ez_i)·R(y, Ey_i)·R(x, Ex_i)

where T_BE,i represents the transformation matrix of {E} relative to {B} at the i-th measurement; R(z, Ez_i) represents the rotation matrix of a rotation by Ez_i about the z-axis (R(y, Ey_i) and R(x, Ex_i) by analogy); and p_i is a three-dimensional vector representing the position coordinates of the origin of {E} under {B} at the i-th measurement.
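The pose-to-matrix relation above can be sketched numerically; a minimal NumPy illustration (function names are this sketch's own, not the patent's):

```python
import numpy as np

def rot(axis, angle):
    """Elementary rotation matrix R(axis, angle), axis in {'x', 'y', 'z'}."""
    c, s = np.cos(angle), np.sin(angle)
    if axis == 'x':
        return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
    if axis == 'y':
        return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def pose_to_matrix(pose):
    """Six-dimensional pose [x, y, z, Ex, Ey, Ez] -> 4x4 homogeneous T_BE,
    using the z-y-x rotation order of the relation above."""
    x, y, z, ex, ey, ez = pose
    T = np.eye(4)
    T[:3, :3] = rot('z', ez) @ rot('y', ey) @ rot('x', ex)
    T[:3, 3] = [x, y, z]
    return T
```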
(a) converting each point cloud P_i^S measured in the scanner measurement coordinate system into the standard spherical array point clouds {P_1^B, P_2^B, P_3^B, ..., P_i^B, ..., P_n^B} under the robot base coordinate system through rigid-body transformation matrix multiplication;
(b) taking the standard spherical array point clouds {P_1^B, ..., P_n^B} under the robot base coordinate system as the measured data and the three-dimensional solid model of the standard spherical array as the reference model, matching with the ADF algorithm to obtain the transformation matrices T_WB,i of the robot base coordinate system relative to the standard spherical array local coordinate system;
(c) converting the transformation matrices T_WB,i of the robot base coordinate system relative to the standard spherical array local coordinate system into the six-dimensional vector set {ζ_WB,1, ..., ζ_WB,n}, solving the mean ζ̄_WB of the six-dimensional vector set, and solving the transformation matrix T_WB from the six-dimensional vector ζ̄_WB through vector-matrix transformation.
Further preferably, in step (a), each point cloud P_i^S measured in the scanner coordinate system is converted to the robot base coordinate system according to the following relation:

P_i^B = T_BE,i · T_ES · P_i^S

where P_i^B is the standard spherical array point cloud under the robot base coordinate system acquired at the i-th scan, P_i^S is the standard spherical array point cloud in the scanner measurement coordinate system acquired at the i-th scan, and T_BE,i is the transformation matrix of the robot end coordinate system relative to the robot base coordinate system at the i-th scan.
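Applying the conversion relation to a whole cloud is a single homogeneous-coordinate product; an illustrative helper (names are this sketch's own):

```python
import numpy as np

def transform_cloud(T_BE, T_ES, P_S):
    """P_i^B = T_BE . T_ES . P_i^S applied to an (N, 3) cloud
    via homogeneous coordinates."""
    P = np.hstack([P_S, np.ones((len(P_S), 1))])   # (N, 4) homogeneous points
    return (T_BE @ T_ES @ P.T).T[:, :3]
```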
Further preferably, in step S24, the relation among the transformation matrix T_BE,i, the matrices T_SW,i and T_ES, and the transformation matrix T_WB is the following:

T_BE,i = (T_WB)^-1 · (T_SW,i)^-1 · (T_ES)^-1,  i = 1, 2, ..., n

where i is the index of the scan measurement and n represents the total number of scan measurements.
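The dimension-chain relation chains the three inverse transforms; a minimal sketch that can be checked numerically (function name is illustrative):

```python
import numpy as np

def chain_T_BE(T_WB, T_SW_i, T_ES):
    """T_BE,i = T_WB^-1 . T_SW,i^-1 . T_ES^-1 (dimension-chain relation):
    base->target, target->scanner, scanner->end composed into base->end."""
    return np.linalg.inv(T_WB) @ np.linalg.inv(T_SW_i) @ np.linalg.inv(T_ES)
```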
Further preferably, the error is calculated according to the following relation:

Δζ_i = ζ'_i − ζ_i,  i = 1, 2, ..., n

where Δζ_i is the robot pose error in the base coordinate system at the i-th scan, ζ'_i is the actual robot pose in the robot base coordinate system at the i-th scan, ζ_i is the robot pose read from the robot controller at the i-th scan, i is the scan index, and n is the total number of scans.
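The error relation is a componentwise subtraction of six-dimensional poses; a small sketch, with an added wrap of the Euler-angle components into [-π, π) (an implementation detail assumed here, not stated in the patent):

```python
import numpy as np

def pose_errors(actual, read):
    """Delta zeta_i = zeta'_i - zeta_i for stacked (n, 6) pose arrays;
    the three Euler-angle components are wrapped into [-pi, pi)."""
    d = np.asarray(actual, float) - np.asarray(read, float)
    d[:, 3:] = (d[:, 3:] + np.pi) % (2 * np.pi) - np.pi
    return d
```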
According to another aspect of the present invention, there is provided a system for carrying out the above robot pose correction method, the system comprising a robot, a scanner and a standard spherical array, wherein the scanner is connected to the end of the robot, the standard spherical array comprises a plurality of standard spheres of different sizes arranged non-collinearly, and the standard spherical array is arranged within the scanning range of the scanner.
Generally, compared with the prior art, the technical scheme of the invention has the following beneficial effects:
1. the method can correct a specific robot pose simply by adding the pose to be corrected to the hand-eye calibration procedure, and is easy to implement; multiple specific robot poses can be corrected quickly using only matrix product forward/inverse operations and matrix-vector transformation, the new robot poses being determined from the dimension chain transfer model. The method thus breaks through the bottleneck of existing approaches and offers low cost, strong practicability, high correction efficiency and a wide application range;
2. the invention corrects the robot pose using a scanner and standard calibration spheres, which greatly reduces cost compared with the prior art that uses an expensive laser tracker and target ball; meanwhile, scanning the standard spherical array from multiple angles is fast, taking less time than the grid-type planning of robot spatial poses in the prior art;
3. the robot hand-eye calibration is completed based on a standard spherical array comprising at least three standard spheres of different diameters, making the rigid-body transformation matrices easy to solve. In addition, point cloud matching uses the ADF algorithm, and the standard spherical array target estimation is completed from multiple matching transformation matrices; the ADF matching algorithm combines point-to-point and point-to-plane distance functions, is less prone to local optima than ICP point-to-point matching, and has high computational efficiency and a wide application range.
Drawings
Fig. 1 is a schematic flow chart of a robot pose rapid correction method based on standard spherical array target estimation according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a system for finishing rapid robot pose correction based on standard ball array target estimation according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a standard spherical array design model according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
The invention provides a robot pose rapid correction method based on standard spherical array target estimation, which is suitable for rapid correction of robot poses.
As shown in fig. 2, the system for completing rapid robot pose correction based on standard spherical array target estimation is shown in the figure; the robot is a six-degree-of-freedom industrial robot, and the scanner is a grating-type binocular area-array scanner. As shown in fig. 3, in this embodiment the standard spherical array comprises three standard spheres (standard matte ceramic spheres) of different sizes arranged non-collinearly. Here {B} denotes the robot base coordinate system, {E} the robot end-flange coordinate system, {S} the scanner measurement coordinate system, and {W} the standard spherical array local coordinate system (or workpiece coordinate system); T_BE denotes the transformation matrix of the robot end-flange coordinate system relative to the robot base coordinate system, T_ES the transformation matrix of the scanner measurement coordinate system relative to the robot end-flange coordinate system, T_SW the transformation matrix of the standard spherical array local coordinate system relative to the scanner measurement coordinate system, and T_WB the transformation matrix of the robot base coordinate system relative to the standard spherical array local coordinate system. Among these, T_ES and T_WB are constant, while T_BE and T_SW depend on the scan; different scans correspond to different transformation matrices, hence they are denoted T_BE,i and T_SW,i.
As shown in fig. 1, a method for quickly correcting the pose of a robot based on standard spherical array target estimation includes the following steps:
step one, the robot drives the scanner to scan and measure the standard spherical array in multiple angles, and the calibration of the hands and eyes of the robot is completed.
Specifically, firstly, the robot is controlled to drive the scanner to scan and measure the standard spherical array from multiple angles, and the acquired standard spherical array point clouds are recorded as {P_1^S, P_2^S, P_3^S, ..., P_i^S, ..., P_n^S}, i = 1, 2, ..., n, where n is the number of scan measurements. Each point cloud P_i^S measured in the scanner coordinate system consists of N points {q_1, q_2, q_3, ..., q_m, ..., q_N}, where m = 1, 2, 3, ..., N and q_m = [x_m y_m z_m]^T denotes the coordinates of the m-th point. At the same time, the robot poses are recorded in sequence as ζ_i = [p_i^T ψ_i^T]^T, a six-dimensional vector representing the position and attitude of the robot at the i-th scan, where p_i = [x_i y_i z_i]^T represents the position of the robot and ψ_i = [Ex_i Ey_i Ez_i]^T represents its attitude. The scanning process includes at least three non-linearly-related robot poses, and the robot poses are guaranteed to be non-singular.
Secondly, the standard spherical array local coordinate system is established from the standard spherical array point cloud: the centers of the standard spheres and the plane containing them are fitted from the measured point cloud; the center of one sphere is selected as the origin; the center of another sphere defines the x-axis direction; one of the normal vectors of the plane of the centers is taken as the z-axis; and the y-axis direction is obtained by the right-hand rule. Taking the origin of the local coordinate system as the position vector and the x-, y- and z-axis directions as the attitude, the transformation matrices T_SW,i of the standard spherical array local coordinate system relative to the scanner measurement coordinate system are calculated. At the same time, through vector-matrix conversion, the transformation matrix T_BE,i of the robot end coordinate system relative to the robot base coordinate system is solved from the six-dimensional vector ζ_i (the transformation matrix is obtained directly from the read robot pose ζ_i):

T_BE,i = [ R_i  p_i ; 0 0 0 1 ],  R_i = R(z, Ez_i)·R(y, Ey_i)·R(x, Ex_i),  i = 1, 2, ..., n   (1)

where R(·, E·_i) represents the rotation matrix of a rotation about the given axis by the angle E·_i. Transformation matrices T_BE,i of the robot end coordinate system relative to the robot base coordinate system are thereby obtained for all poses.
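The local-frame construction described above (origin at one sphere center, x-axis toward another, z-axis normal to the centers' plane, y-axis by the right-hand rule) can be sketched as follows; which center plays which role is an assumption of this sketch:

```python
import numpy as np

def frame_from_centers(c0, c1, c2):
    """Build T_SW, the pose of the array frame {W} in the scanner frame {S}:
    origin at c0, x-axis toward c1, z-axis normal to the centers' plane."""
    c0, c1, c2 = (np.asarray(c, dtype=float) for c in (c0, c1, c2))
    x = c1 - c0
    x = x / np.linalg.norm(x)
    z = np.cross(c1 - c0, c2 - c0)          # normal of the plane of centers
    z = z / np.linalg.norm(z)
    y = np.cross(z, x)                      # right-hand rule
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x, y, z, c0
    return T
```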
Finally, the hand-eye relationship matrix X = T_ES is calculated based on the AX = XB model, where A_j = (T_BE,j+1)^-1 · T_BE,j and B_j = T_SW,j+1 · (T_SW,j)^-1, j = 1, 2, 3, ..., n−1, completing the robot hand-eye calibration.
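The patent does not detail how AX = XB is solved; a common closed-form approach is a Park-Martin-style least squares, sketched below as an illustrative stand-in (rotation by orthogonal Procrustes on the motion axes, then translation from (R_A − I)·t_X = R_X·t_B − t_A):

```python
import numpy as np

def log_so3(R):
    """Axis-angle vector of a rotation matrix (angle assumed in (0, pi))."""
    theta = np.arccos(np.clip((np.trace(R) - 1) / 2, -1.0, 1.0))
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return theta / (2 * np.sin(theta)) * w

def solve_ax_xb(As, Bs):
    """Least-squares X from the motion pairs A_j X = X B_j."""
    # Rotation: alpha_j = R_X beta_j, solved by Procrustes on H = sum beta alpha^T.
    H = sum(np.outer(log_so3(B[:3, :3]), log_so3(A[:3, :3]))
            for A, B in zip(As, Bs))
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])
    R = Vt.T @ D @ U.T
    # Translation: stack (R_Aj - I) t_X = R_X t_Bj - t_Aj and solve in least squares.
    C = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    d = np.concatenate([R @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    t, *_ = np.linalg.lstsq(C, d, rcond=None)
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = R, t
    return X
```

At least two motions with non-parallel rotation axes are needed for a unique solution, matching the patent's requirement of at least three non-linearly-related poses.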
And step two, acquiring a plurality of groups of standard spherical array point clouds under the robot base coordinate system based on the rigid body transformation matrix among the coordinate systems.
Specifically, each point cloud P_i^S measured in the scanner coordinate system is transferred to the robot base coordinate system through rigid-body transformation matrix multiplication, namely

P_i^B = T_BE,i · T_ES · P_i^S,  i = 1, 2, ..., n   (2)

where P_i^B denotes the standard spherical array point cloud under the robot base coordinate system obtained at the i-th scan. By formula (2), the n groups of standard spherical array point clouds {P_1^B, P_2^B, P_3^B, ..., P_i^B, ..., P_n^B} under the robot base coordinate system are obtained.
And step three, obtaining a transformation matrix set of the robot base coordinate systems relative to the standard spherical array local coordinate system through point cloud matching.
Specifically, taking the standard spherical array point clouds {P_1^B, ..., P_n^B} under the robot base coordinate system as the measured data and the three-dimensional solid model of the standard spherical array as the reference model, matching is performed with the ADF algorithm, and the transformation matrices of the robot base coordinate system relative to the standard spherical array local coordinate system are acquired in sequence and recorded as {T_WB,1, T_WB,2, ..., T_WB,n}.
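The patent matches the full point clouds with the ADF algorithm. As an illustrative simplification only: because the spheres have distinct diameters, the fitted sphere centers can be paired unambiguously with the model centers, and the rigid transform between the two center sets then has the closed-form Kabsch solution:

```python
import numpy as np

def kabsch(P, Q):
    """Rigid transform (R, t) minimizing sum ||R @ P_k + t - Q_k||^2
    over paired (N, 3) point sets P (model) and Q (measured)."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                 # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # guard against reflection
    R = Vt.T @ D @ U.T
    return R, cq - R @ cp
```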
And step four, determining a transformation matrix of the robot base coordinate system relative to the standard spherical array local coordinate system based on the matching result, and finishing target estimation.
Specifically, first, the transformation matrix set {T_WB,1, ..., T_WB,n} of the robot base coordinate system relative to the standard spherical array local coordinate system is converted to the six-dimensional vector set {ζ_WB,1, ..., ζ_WB,n} and averaged, namely:

ζ̄_WB = (1/n)·Σ_{i=1..n} ζ_WB,i   (3)

Finally, through vector-matrix transformation, the transformation matrix T_WB is solved from the six-dimensional vector ζ̄_WB; T_WB is the transformation matrix of the robot base coordinate system relative to the standard spherical array local coordinate system, which completes the standard spherical array target estimation.
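Step four can be sketched as a mean over the n six-dimensional estimates followed by a rebuild of the matrix. The z-y-x Euler convention of relation (1) is assumed; averaging Euler angles directly is reasonable here because the n estimates describe the same fixed target and are therefore close together:

```python
import numpy as np

def rot(axis, a):
    c, s = np.cos(a), np.sin(a)
    m = {'x': [[1, 0, 0], [0, c, -s], [0, s, c]],
         'y': [[c, 0, s], [0, 1, 0], [-s, 0, c]],
         'z': [[c, -s, 0], [s, c, 0], [0, 0, 1]]}[axis]
    return np.array(m)

def target_estimate(zetas_WB):
    """Average the six-dimensional estimates and rebuild T_WB."""
    z = np.asarray(zetas_WB, dtype=float).mean(axis=0)   # mean six-vector
    T = np.eye(4)
    T[:3, :3] = rot('z', z[5]) @ rot('y', z[4]) @ rot('x', z[3])
    T[:3, 3] = z[:3]
    return T
```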
Step five, determining the transformation matrices of the robot end-flange coordinate system relative to the robot base coordinate system based on the dimension chain transfer model.
Specifically, based on the relative pose relationships among the robot base coordinate system, the standard spherical array local coordinate system, the scanner measurement coordinate system and the robot end-flange coordinate system, the transformation matrices T'_BE,i of the robot end-flange coordinate system relative to the robot base coordinate system are solved in sequence, namely

T'_BE,i = (T_WB)^-1 · (T_SW,i)^-1 · (T_ES)^-1,  i = 1, 2, ..., n   (4)

The attitude matrix in the transformation matrix T'_BE,i is an orthogonal matrix.
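In floating point, products of inverses can leave the attitude block slightly non-orthogonal; the standard remedy is an SVD projection back onto SO(3). This is a numerical implementation detail assumed by this sketch, not a step stated in the patent:

```python
import numpy as np

def orthogonalize(T):
    """Replace the 3x3 attitude block of a 4x4 transform with the nearest
    rotation matrix (Frobenius sense), removing numerical drift."""
    U, _, Vt = np.linalg.svd(T[:3, :3])
    R = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    out = T.copy()
    out[:3, :3] = R
    return out
```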
And sixthly, obtaining a new robot pose through matrix-vector transformation, and finishing quick correction of the robot pose.
Specifically, the transformation matrices T'_BE,i are converted to six-dimensional vectors ζ'_i, so that the n new robot poses {ζ'_1, ζ'_2, ..., ζ'_n} are obtained; the pose deviations of the n robot poses are then calculated to perform the robot pose correction, namely

Δζ_i = ζ'_i − ζ_i,  i = 1, 2, ..., n   (5)
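The matrix-to-vector conversion inverts the pose-to-matrix relation of step one; a sketch assuming the z-y-x Euler convention and the non-singular poses required in step one (|Ey| ≠ π/2):

```python
import numpy as np

def matrix_to_pose(T):
    """4x4 transform -> [x, y, z, Ex, Ey, Ez] with R = R(z,Ez) R(y,Ey) R(x,Ex)."""
    R = T[:3, :3]
    ey = np.arcsin(-R[2, 0])            # R[2,0] = -sin(Ey)
    ex = np.arctan2(R[2, 1], R[2, 2])   # cos(Ey)sin(Ex), cos(Ey)cos(Ex)
    ez = np.arctan2(R[1, 0], R[0, 0])   # sin(Ez)cos(Ey), cos(Ez)cos(Ey)
    return np.array([T[0, 3], T[1, 3], T[2, 3], ex, ey, ez])
```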
It will be understood by those skilled in the art that the foregoing is only a preferred embodiment of the present invention, and is not intended to limit the invention, and that any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the present invention.
Claims (8)
1. A robot posture correction method based on standard spherical array target estimation is characterized by comprising the following steps:
s1, fixedly connecting the scanner at the tail end of the robot, placing the standard spherical array on the workbench, and establishing a robot base coordinate system { B }, a robot tail end coordinate system { E }, a scanner measurement coordinate system { S } and a standard spherical array local coordinate system { W };
S2, the robot drives the scanner to scan the standard spherical array from multiple angles, and the robot pose and the standard spherical array point cloud at each pose are read and recorded; the transformation matrix T_BE of the robot end coordinate system {E} relative to the robot base coordinate system {B} is calculated based on the dimension chain transfer model;
S3, converting, through the vector-matrix transformation relation, the transformation matrix T_BE of the robot end coordinate system relative to the robot base coordinate system into an actual six-dimensional vector ζ'; the actual six-dimensional vector ζ' is the actual robot pose obtained through the coordinate transformation calculation; and calculating the error between the actual robot pose and the robot pose read in step S2, so as to correct the robot pose;
S21, establishing the standard spherical array local coordinate system {W} based on the standard spherical array point cloud, and calculating the transformation matrix T_SW of the standard spherical array local coordinate system {W} relative to the scanner measurement coordinate system {S};
S22, performing hand-eye calibration on the robot to obtain the robot hand-eye relationship matrix, namely the transformation matrix T_ES of the scanner measurement coordinate system {S} relative to the robot end coordinate system {E};
S23, obtaining, based on rigid-body transformation matrices and point cloud matching, the transformation matrix T_WB of the robot base coordinate system {B} relative to the standard spherical array local coordinate system {W};
S24, constructing the relation among the transformation matrices T_SW, T_ES, T_WB and the transformation matrix T_BE, and calculating the required transformation matrix T_BE from this relation;
(a) converting each point cloud P_i^S measured in the scanner measurement coordinate system into the standard spherical array point clouds {P_1^B, P_2^B, P_3^B, ..., P_i^B, ..., P_n^B} under the robot base coordinate system through rigid-body transformation matrix multiplication;
(b) taking the standard spherical array point clouds {P_1^B, ..., P_n^B} under the robot base coordinate system as the measured data and the three-dimensional solid model of the standard spherical array as the reference model, matching with the ADF algorithm to obtain the transformation matrices T_WB,i of the robot base coordinate system relative to the standard spherical array local coordinate system;
(c) converting the transformation matrices T_WB,i of the robot base coordinate system relative to the standard spherical array local coordinate system into the six-dimensional vector set {ζ_WB,1, ..., ζ_WB,n}, solving the mean ζ̄_WB of the six-dimensional vector set, and solving the transformation matrix T_WB from the six-dimensional vector ζ̄_WB through vector-matrix transformation.
2. The robot pose correction method based on standard spherical array target estimation of claim 1, wherein in step S21 the transformation matrix T_SW is calculated as follows:
(a) fitting the coordinates of the center of each standard sphere according to the data of the scanned standard spherical array point cloud under each posture;
3. The robot pose correction method based on standard spherical array target estimation of claim 1, wherein in step S22 the transformation matrix T_ES is obtained according to the following steps:
(a) constructing the relational expression between the robot pose and the transformation matrix of the robot end coordinate system relative to the robot base coordinate system, and calculating, from the read robot poses, the transformation matrices T_BE,i of the robot end coordinate system {E} relative to the robot base coordinate system {B};
4. The robot pose correction method based on standard spherical array target estimation of claim 3, wherein in step (a) the transformation matrix T_BE,i is calculated according to the following relation:

T_BE,i = [ R_i  p_i ; 0 0 0 1 ],  R_i = R(z, Ez_i)·R(y, Ey_i)·R(x, Ex_i)

wherein T_BE,i represents the transformation matrix of {E} relative to {B} at the i-th measurement; R(z, Ez_i) represents the rotation matrix of a rotation by Ez_i about the z-axis (R(y, Ey_i) and R(x, Ex_i) by analogy); and p_i is a three-dimensional vector representing the position coordinates of the origin of {E} under {B} at the i-th measurement.
5. The method according to claim 4, wherein in step (a) each point cloud P_i^S measured in the scanner coordinate system is converted to the robot base coordinate system according to the following relation:

P_i^B = T_BE,i · T_ES · P_i^S

wherein P_i^B is the standard spherical array point cloud under the robot base coordinate system acquired at the i-th scan, P_i^S is the standard spherical array point cloud in the scanner measurement coordinate system acquired at the i-th scan, and T_BE,i is the transformation matrix of the robot end coordinate system relative to the robot base coordinate system at the i-th scan.
6. The robot pose correction method based on standard spherical array target estimation of claim 1, wherein in step S24 the relation among the transformation matrix T_BE,i, the matrices T_SW,i and T_ES, and the transformation matrix T_WB is the following:

T_BE,i = (T_WB)^-1 · (T_SW,i)^-1 · (T_ES)^-1,  i = 1, 2, ..., n

wherein i is the index of the scan measurement and n represents the total number of scan measurements.
7. The robot pose correction method based on standard spherical array target estimation of claim 1, wherein the error is calculated according to the following relation:

Δζ_i = ζ'_i − ζ_i,  i = 1, 2, ..., n

wherein Δζ_i is the robot pose error in the base coordinate system at the i-th scan, ζ'_i is the actual robot pose in the robot base coordinate system at the i-th scan, ζ_i is the robot pose read from the robot controller at the i-th scan, i is the scan index, and n is the total number of scans.
8. A system for carrying out the robot pose correction method of any one of claims 1 to 7, wherein the system comprises a robot, a scanner and a standard spherical array, the scanner is attached to the end of the robot, the standard spherical array comprises a plurality of standard spheres of different sizes arranged non-collinearly, and the standard spherical array is disposed within the scanning range of the scanner.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202110736372.0A (CN113386136B) | 2021-06-30 | 2021-06-30 | Robot posture correction method and system based on standard spherical array target estimation |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| CN113386136A | 2021-09-14 |
| CN113386136B | 2022-05-20 |
Family ID: 77624581
Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202110736372.0A (CN113386136B, active) | Robot posture correction method and system based on standard spherical array target estimation | 2021-06-30 | 2021-06-30 |
Country Status (1)

| Country | Publication |
|---|---|
| CN | CN113386136B |
Families Citing this family (7)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113770577B * | 2021-09-18 | 2022-09-20 | 宁波博视达焊接机器人有限公司 | Method for realizing generation of track of workpiece mounted on robot |
| CN113843792B * | 2021-09-23 | 2024-02-06 | 四川锋准机器人科技有限公司 | Hand-eye calibration method of surgical robot |
| CN114347027B * | 2022-01-08 | 2022-12-13 | 天晟智享(常州)机器人科技有限公司 | Pose calibration method of 3D camera relative to mechanical arm |
| CN114485468B * | 2022-01-28 | 2023-09-26 | 天津大学 | Multi-axis linkage composite measurement system and micro-part full-contour automatic measurement method |
| CN114589692B * | 2022-02-25 | 2024-03-26 | 埃夫特智能装备股份有限公司 | Zero calibration method and calibration equipment for robot |
| CN114770517B * | 2022-05-19 | 2023-08-15 | 梅卡曼德(北京)机器人科技有限公司 | Method for calibrating robot through point cloud acquisition device and calibration system |
| CN115249267B * | 2022-09-22 | 2022-12-30 | 海克斯康制造智能技术(青岛)有限公司 | Automatic detection method and device based on turntable and robot position and posture calculation |
Citations (11)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN1805830A * | 2003-06-11 | 2006-07-19 | ABB公司 | A method for fine tuning of a robot program |
| CN107398901A * | 2017-07-28 | 2017-11-28 | 哈尔滨工业大学 | The visual servo control method of robot for space maintainable technology on-orbit |
| CN107953336A * | 2017-12-27 | 2018-04-24 | 北京理工大学 | Measured piece is loaded the modification method and system of deviation in manipulator Ultrasonic NDT |
| CN108724181A * | 2017-04-19 | 2018-11-02 | 丰田自动车株式会社 | Calibration system |
| EP3402632A1 * | 2016-01-11 | 2018-11-21 | KUKA Deutschland GmbH | Determining an orientation of a robot relative to the direction of gravity |
| CN108994827A * | 2018-05-04 | 2018-12-14 | 武汉理工大学 | A kind of robot measurement-system of processing scanner coordinate system automatic calibration method |
| CN109373898A * | 2018-11-27 | 2019-02-22 | 华中科技大学 | A kind of complex parts pose estimating system and method based on three-dimensional measurement point cloud |
| CN110202582A * | 2019-07-03 | 2019-09-06 | 桂林电子科技大学 | A kind of robot calibration method based on three coordinates platforms |
| CN110480638A * | 2019-08-20 | 2019-11-22 | 南京博约智能科技有限公司 | A kind of self-compensating palletizing method of articulated robot position and attitude error and its palletizing system |
| CN111551111A * | 2020-05-13 | 2020-08-18 | 华中科技大学 | Part feature robot rapid visual positioning method based on standard ball array |
| CN112659112A * | 2020-12-03 | 2021-04-16 | 合肥富煌君达高科信息技术有限公司 | Robot eye calibration method based on line laser scanner |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3531062A1 (en) * | 2018-02-26 | 2019-08-28 | Renishaw PLC | Coordinate positioning machine |
2021
- 2021-06-30 CN CN202110736372.0A patent/CN113386136B/en active Active
Non-Patent Citations (1)
Title |
---|
Development of a robotic in-situ automatic optical inspection system for complex parts of nuclear main pumps; Li Wenlong et al.; Journal of Mechanical Engineering; 2020-07-31; Vol. 56, No. 13; Sections 1-5 *
Also Published As
Publication number | Publication date |
---|---|
CN113386136A (en) | 2021-09-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113386136B (en) | Robot posture correction method and system based on standard spherical array target estimation | |
CN107738254B (en) | Conversion calibration method and system for mechanical arm coordinate system | |
CN109822574B (en) | Industrial robot end six-dimensional force sensor calibration method | |
Wang et al. | A point and distance constraint based 6R robot calibration method through machine vision | |
CN109304730B (en) | Robot kinematic parameter calibration method based on laser range finder | |
CN109877840B (en) | Double-mechanical-arm calibration method based on camera optical axis constraint | |
CN111531547B (en) | Robot calibration and detection method based on vision measurement | |
CN112833786B (en) | Cabin attitude and pose measuring and aligning system, control method and application | |
CN111660295A (en) | Industrial robot absolute precision calibration system and calibration method | |
CN109323650B (en) | Unified method for measuring coordinate system by visual image sensor and light spot distance measuring sensor in measuring system | |
Zhuang et al. | Robot calibration with planar constraints | |
CN113160334B (en) | Dual-robot system calibration method based on hand-eye camera | |
CN111168719B (en) | Robot calibration method and system based on positioning tool | |
CN107817682A (en) | A kind of space manipulator on-orbit calibration method and system based on trick camera | |
Wang et al. | A vision-based fully-automatic calibration method for hand-eye serial robot | |
CN115284292A (en) | Mechanical arm hand-eye calibration method and device based on laser camera | |
CN115546289A (en) | Robot-based three-dimensional shape measurement method for complex structural part | |
Dehghani et al. | Vision-based calibration of a Hexa parallel robot | |
CN113681559A (en) | Line laser scanning robot hand-eye calibration method based on standard cylinder | |
TW202302301A (en) | Automated calibration system and method for the relation between a profile scanner coordinate frame and a robot arm coordinate frame | |
CN109059761B (en) | EIV model-based handheld target measuring head calibration method | |
CN113878586B (en) | Robot kinematics calibration device, method and system | |
CN116352710A (en) | Robot automatic calibration and three-dimensional measurement method for large aerospace component | |
CN116309879A (en) | Robot-assisted multi-view three-dimensional scanning measurement method | |
CN115179323A (en) | Machine end pose measuring device based on telecentric vision constraint and precision improving method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||