CN111482957B - Vision offline demonstrator registration method - Google Patents
- Publication number: CN111482957B (application CN201910630591.3A)
- Authority
- CN
- China
- Legal status: Active
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1692—Calibration of manipulator
Abstract
The invention provides a registration method for a visual offline demonstrator. A three-dimensional included-angle tool is designed so that the teaching tip can be seated against its apex; when the demonstrator is rotated in any direction within a small range, the teaching tip is guaranteed to remain at the same position. The method fits the position of the demonstrator tip with a software algorithm, achieving an accuracy far higher than machining and assembly can guarantee. The teaching tip can be computed accurately during registration without knowing the design parameters, and the procedure is simple and convenient.
Description
Technical Field
The invention relates to the technical field of industrial robot teaching and control, and in particular to a registration method for a visual offline demonstrator.
Background
With the continuous improvement of industrial automation, industrial robots are widely used in automobile and auto-parts manufacturing, heavy machinery, aerospace, shipbuilding, the chemical industry, the electronics industry, and other fields. Teaching of industrial robots is generally done in one of two ways, online or offline. In the robot vision offline teaching system of the present invention, the relative position of the teaching tip and the marker-point array is fixed, and this precise relative position in space must be obtained before the system can work. In principle the marker array and the teaching tip are precisely designed, i.e. their relative position could be read off the design drawing. In production, however, both machining and assembly introduce uncertain errors, so the relative position must be recomputed; otherwise the application suffers large accuracy errors, and operation becomes troublesome.
Disclosure of Invention
The main object of the invention is to provide a visual offline demonstrator registration method, whose specific operations are as follows:
first, designing a three-dimensional included-angle tool so that the teaching tip can be seated against its apex, ensuring that the teaching tip remains at the same position when the demonstrator is rotated in any direction within a small range;
second, rotating the demonstrator about the three-dimensional included-angle tool through n poses, where n is arbitrary, keeping the tip at the same position for every pose of the demonstrator body, and reconstructing the marker-point array at each pose with a binocular reconstruction system to obtain n point clouds;
third, matching the point clouds among the n point clouds to obtain the relative extrinsic parameters, namely a rotation parameter R and a translation parameter T;
fourth, if the matching succeeds, fitting the exact tip position from the matching relations of the point clouds as follows: let the tip position be P(x, y, z); the relation between point cloud Set1 and point cloud Seti is Seti = [Ri|Ti]·Set1, and the tip simultaneously satisfies P(x, y, z) = [Ri|Ti]·P(x, y, z); since several groups of [R|T] are obtained, the value of P(x, y, z) can be fitted by least squares from this information, giving the exact position of the tip relative to the array;
fifth, if the matching in the third step fails, discarding the point clouds that failed to match and re-acquiring at similar poses until n correctly matchable point clouds are obtained, then fitting the exact tip position from the matching relations exactly as in the fourth step;
sixth, recording the relative spatial relation between the point cloud and the tip.
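The least-squares fit of the fourth step can be sketched as follows. This is a minimal illustration with numpy, not the patent's implementation; the function name and the way the poses are stacked are our own. Since the tip rests on the same pivot in every acquisition, each pose (Ri, Ti) constrains P via P = Ri·P + Ti, i.e. (I − Ri)·P = Ti, and stacking all poses yields an overdetermined linear system:

```python
import numpy as np

def fit_tip_point(transforms):
    """Fit the fixed tip position P from relative poses (R_i, t_i).

    Each pose maps point cloud Set1 onto Seti.  Because the tip stays on
    the same pivot in every acquisition, P = R_i @ P + t_i for every i,
    i.e. (I - R_i) @ P = t_i.  Stacking all poses gives an overdetermined
    (3n x 3) system, solved here in the least-squares sense.
    """
    A = np.vstack([np.eye(3) - R for R, _ in transforms])
    b = np.concatenate([t for _, t in transforms])
    P, *_ = np.linalg.lstsq(A, b, rcond=None)
    return P
```

Note that each (I − Ri) is rank-deficient along Ri's rotation axis, so at least two poses with non-parallel rotation axes are needed before P is uniquely determined.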
LED lamps are arranged on the body of the demonstrator; no LED lamp is arranged at the tip of the demonstrator.
The beneficial effects of the invention are as follows: the position of the demonstrator tip is fitted by a software algorithm, with an accuracy far higher than machining and assembly can guarantee; the teaching tip can be computed accurately during registration without knowing the design parameters; and the method is simple and convenient.
In use, the system computes the tip position from the LED array, since the tip on the demonstrator carries no LED lamp. The invention therefore provides a registration method for computing the exact position of the tip relative to the array. In this process, the data acquired by the vision system and processed by the algorithm finally reach an accuracy higher than the machining accuracy.
Offline path generation does not depend on tooling: a binocular system tracks the feature array on the teaching pen, computes the pen's motion trajectory, and then converts it into the robot's trajectory.
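That conversion can be sketched as below, assuming the registration has already produced the tip offset P in the marker-array frame; the function and variable names are hypothetical, and the final camera-to-robot transform is only mentioned, not implemented:

```python
import numpy as np

def tip_trajectory(array_poses, tip_in_array):
    """Turn tracked marker-array poses into a tip trajectory.

    `array_poses` is a sequence of (R, t) pairs: the marker array's pose
    in the camera frame at each tracked frame.  `tip_in_array` is the tip
    position P relative to the array, produced by the registration method.
    Returns one tip position in camera coordinates per frame; a further
    fixed camera-to-robot transform would map these into the robot base.
    """
    return np.array([R @ tip_in_array + t for R, t in array_poses])
```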
Drawings
FIG. 1 is a schematic diagram of a registration process according to the present invention.
FIG. 2 is a schematic structural diagram of the visual offline teaching instrument of the present invention.
FIG. 3 shows the three-dimensional included-angle tool of the present invention.
Detailed Description
The present application is explained in detail with reference to the drawings and examples:
example 1
First, a three-dimensional included-angle tool is designed so that the teaching tip can be seated against its apex, ensuring that the teaching tip remains at the same position when the demonstrator is rotated in any direction within a small range;
second, the demonstrator is rotated about the three-dimensional included-angle tool through n poses, where n is arbitrary; the tip is kept at the same position for every pose of the demonstrator body, and a binocular reconstruction system reconstructs the marker-point array at each pose, yielding n point clouds;
third, the point clouds among the n point clouds are matched to obtain the relative extrinsic parameters, namely a rotation parameter R and a translation parameter T;
fourth, if the matching succeeds, the exact tip position is fitted from the matching relations of the point clouds as follows: let the tip position be P(x, y, z); the relation between point cloud Set1 and point cloud Seti is Seti = [Ri|Ti]·Set1, and the tip simultaneously satisfies P(x, y, z) = [Ri|Ti]·P(x, y, z); since several groups of [R|T] are obtained, the value of P(x, y, z) can be fitted by least squares from this information, giving the exact position of the tip relative to the array;
fifth, if the matching in the third step fails, the three-dimensional included-angle tool is redesigned from the first step onward until the point-cloud matching in the third step succeeds; the exact tip position is then fitted from the matching relations exactly as in the fourth step;
sixth, the relative spatial relation between the point cloud and the tip is recorded.
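The embodiments do not name the matching algorithm used in the third step. For ordered clouds of the same markers, one common choice is the SVD-based (Kabsch) rigid fit sketched below; this is an illustration under that assumption, not the patent's implementation, and the function name is our own:

```python
import numpy as np

def rigid_fit(src, dst):
    """Least-squares rigid transform between two ordered point clouds.

    Finds R, t minimizing ||(R @ src.T).T + t - dst||, so that
    dst ~= [R|t] . src (Seti ~= [Ri|Ti] . Set1 in the patent's notation).
    Assumes src[k] and dst[k] observe the same physical marker.
    """
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    src_c = src - src.mean(axis=0)          # center both clouds
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T  # guard against reflections
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

In practice the matching is declared failed (step five) when the residual between `(R @ src.T).T + t` and `dst` exceeds a tolerance, e.g. because a marker was mis-detected at that acquisition pose.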
Example 2
A three-dimensional included-angle tool is designed so that the teaching tip can be seated against its apex, ensuring that the teaching tip remains at the same position when the demonstrator is rotated in any direction within a small range. The demonstrator is rotated about the three-dimensional included-angle tool through 9 poses; the tip is kept at the same position for every pose of the demonstrator body, and a binocular reconstruction system reconstructs the marker-point array at each pose, yielding 9 point clouds of 6 points each.
The point clouds are matched to obtain the relative extrinsic parameters, namely a rotation parameter R and a translation parameter T. Let the tip position be P(x, y, z); the relation between point cloud Set1 and point cloud Seti is Seti = [Ri|Ti]·Set1, and the tip simultaneously satisfies P(x, y, z) = [Ri|Ti]·P(x, y, z); since several groups of [R|T] are obtained, the value of P(x, y, z) is fitted by least squares from this information, giving the exact position of the tip relative to the marker-point array.
the position of the cusp is calculated according to the algorithm
415.70850,468.03960,1702.70700。
The foregoing shows and describes the general principles, principal features, and advantages of the present invention. Those skilled in the art will understand that the invention is not limited to the embodiments described above, which merely illustrate its principles; various changes and modifications may be made without departing from its spirit and scope. The scope of the invention is defined by the appended claims and their equivalents.
Claims (2)
1. A registration method for a visual offline demonstrator, comprising the following operations:
first, designing a three-dimensional included-angle tool so that the teaching tip can be seated against its apex, ensuring that the teaching tip remains at the same position when the demonstrator is rotated in any direction within a small range;
second, rotating the demonstrator about the three-dimensional included-angle tool through n poses, keeping the tip at the same position for every pose of the demonstrator body, and reconstructing the marker-point array at each pose with a binocular reconstruction system to obtain n point clouds;
third, matching the point clouds among the n point clouds to obtain the camera extrinsic parameters, namely a rotation parameter R and a translation parameter T;
fourth, if the matching succeeds, fitting the exact tip position from the matching relations of the point clouds as follows: let the tip be P(x, y, z); the relation between point cloud Set1 and point cloud Seti is Seti = [Ri|Ti]·Set1, and the tip simultaneously satisfies P(x, y, z) = [Ri|Ti]·P(x, y, z); since several groups of [R|T] are obtained, fitting the value of P(x, y, z) by least squares from this information, thereby obtaining the exact position of the tip relative to the marker-point array;
fifth, if the matching in the third step fails, indicating that some acquisition poses are problematic, discarding the problematic point clouds and continuing acquisition until n correctly matchable point clouds are obtained, then fitting the exact tip position from the matching relations exactly as in the fourth step;
sixth, recording the exact position of the tip relative to the marker-point array.
2. The visual offline demonstrator registration method according to claim 1, wherein LED lamps are disposed on the body of the demonstrator and no LED lamp is disposed at the tip of the demonstrator.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910630591.3A CN111482957B (en) | 2019-07-12 | 2019-07-12 | Vision offline demonstrator registration method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111482957A CN111482957A (en) | 2020-08-04 |
CN111482957B true CN111482957B (en) | 2020-12-29 |
Family
ID=71788688
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910630591.3A Active CN111482957B (en) | 2019-07-12 | 2019-07-12 | Vision offline demonstrator registration method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111482957B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1734379A (en) * | 2004-08-02 | 2006-02-15 | 发那科株式会社 | Processing program generating device |
EP2955526A1 (en) * | 2014-06-10 | 2015-12-16 | Siemens Healthcare Diagnostics Products GmbH | Apparatus for determining the position of a gauge and method for determining position |
CN107717993A (en) * | 2017-11-03 | 2018-02-23 | 成都卡诺普自动化控制技术有限公司 | A kind of efficient easily Simple robot scaling method |
CN108453707A (en) * | 2018-04-12 | 2018-08-28 | 珞石(山东)智能科技有限公司 | Robot drags teaching orbit generation method |
CN108527332A (en) * | 2018-06-11 | 2018-09-14 | 华南理工大学 | A kind of seam track off-line calibration method based on structured light vision sensor |
CN108582076A (en) * | 2018-05-10 | 2018-09-28 | 武汉库柏特科技有限公司 | A kind of Robotic Hand-Eye Calibration method and device based on standard ball |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |
2021-03-25 | TR01 | Transfer of patent right | Patentee after: Zhichang Technology Group Co.,Ltd., 315400 Chengdong new district, Ningbo Economic Development Zone, Zhejiang Province. Patentee before: SHANGHAI GENE AUTOMATION TECHNOLOGY Co.,Ltd., Room 320, building 1, 358 Huayan village, Nanqiao Town, Fengxian District, Shanghai |