CN115200475A - Rapid correction method for arm-mounted multi-vision sensor

Rapid correction method for arm-mounted multi-vision sensor

Info

Publication number
CN115200475A
CN115200475A
Authority
CN
China
Prior art keywords
vision sensor
coordinate system
calibration
sensor
industrial robot
Prior art date
Legal status
Granted
Application number
CN202210827177.3A
Other languages
Chinese (zh)
Other versions
CN115200475B (en)
Inventor
苗庆伟
杨星奎
Current Assignee
Henan Alson Intelligent Technology Co ltd
Original Assignee
Henan Alson Intelligent Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Henan Alson Intelligent Technology Co., Ltd.
Priority to CN202210827177.3A
Priority claimed from CN202210827177.3A
Publication of CN115200475A
Application granted
Publication of CN115200475B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B21/00 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
    • G01B21/02 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant for measuring length, width, or thickness
    • G01B21/04 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant for measuring length, width, or thickness by measuring coordinates of points
    • G01B21/045 Correction of measurements

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manipulator (AREA)
  • Numerical Control (AREA)

Abstract

The invention belongs to the technical field of industrial robot vision guidance and measurement on industrial sites, and particularly relates to a rapid correction method for an arm-mounted multi-vision sensor. To address the problems of the prior art, in which position correction must be performed many times, the whole process is time-consuming, and the requirement of rapid fault recovery is difficult to meet, the following scheme is proposed. The method comprises the following steps: S1: a calibration plate is used to perform hand-eye calibration on each vision sensor, a correction program is executed, and the calibration points are photographed and identified to obtain the corresponding rotation-translation relations. During correction, only the corresponding correction trajectory needs to be executed once, so the position of a displaced vision sensor can be corrected quickly; by using the calibration points to compensate the error introduced by calibration, a more accurate correction result is obtained, the correction accuracy requirement is met, fault recovery is achieved quickly, and working efficiency is improved.

Description

Rapid correction method for arm-mounted multi-vision sensor
Technical Field
The invention belongs to the field of industrial robot vision guidance and measurement on industrial sites, and particularly relates to a rapid correction method for an arm-mounted multi-vision sensor.
Background Art
An industrial robot vision system acquires an image of the measured object through a vision sensor; a vision processor analyzes the image and converts it into pose coordinates that the industrial robot can recognize. For a large workpiece or a part with characteristic holes, several (usually 2-4) 2D vision sensors are carried to identify the feature points of the part, so that the pose of the measured object can be estimated. The calculated pose is converted into coordinate information the industrial robot can recognize, and the robot's process track points are then offset-corrected, achieving accurate positioning and measurement of the process points.
Before grasping, loading, or measuring, the industrial robot must calibrate the multiple sensors in advance and unify their positional relations into one common coordinate system. Throughout the recognition process, the relative position between each vision sensor and the robot end must remain unchanged.
Because the industrial site environment is complex and the vision sensors are mounted on the robot gripper, the gripper may shift due to collisions, equipment maintenance, and similar causes while guiding the robot, which changes the relative position between a vision sensor and the robot end flange. A workshop production line typically runs at high efficiency in a flow-line mode, and every station must be able to resume production quickly, otherwise the production plan is affected. In the traditional calibration method, the calibration plate is used again to re-calibrate the displaced vision sensor, and the displaced sensor's coordinate system is then transferred back to the common reference coordinate system. This requires the position correction to be executed many times, the whole process is time-consuming, it is difficult to meet the requirement of rapid fault recovery, and production efficiency suffers.
Disclosure of Invention
To address this technical problem, the invention provides a rapid correction method for an arm-mounted multi-vision sensor that uses an error-compensation principle to recalibrate the multiple vision sensors quickly. The method does not require the calibration plate to be used again: the displaced vision sensor only needs to photograph the calibration points once to complete its correction, which reduces the number of correction steps, shortens the correction time, effectively improves working efficiency, and meets the requirement for rapid fault recovery. The technical scheme adopted to achieve this purpose is as follows:

A rapid correction method for an arm-mounted multi-vision sensor uses a position-compensation principle to compensate the error in the position of a displaced vision sensor and converts the displaced sensor's coordinate system {CAi'} into the industrial robot flange coordinate system {fi}. It comprises the following steps (a short sketch of the corresponding transform bookkeeping is given below):

S1: Perform hand-eye calibration on each vision sensor to obtain the rotation-translation relation ${}^{fi}T_{Ci}$ between the industrial robot end-flange coordinate system {fi} and the coordinate system {Ci} of vision sensor i.

S2: Teach the trajectory points of each vision sensor's correction program in advance, so that every vision sensor i can photograph the calibration points; each taught photographing position Pi corresponds to one vision sensor i.

S3: Execute the correction program, letting each vision sensor i photograph and identify the calibration points in turn, and compute the rotation-translation relation ${}^{B}T_{Ci'}$ between the robot base {B} and the vision sensor i coordinate system {Ci'}.

S4: Compute the rotation-translation relation ${}^{fi}T_{Ci'}$ between the industrial robot end-flange coordinate system {fi} and the vision sensor i coordinate system {Ci'} obtained from the calibration points.

S5: Compute the relative error ${}^{Ci}T_{Ci'}$ between the coordinate system {Ci} of vision sensor i obtained from the calibration-plate calibration and the coordinate system {Ci'} obtained from the calibration points.

S6: When the position of vision sensor i changes, let vision sensor i photograph and identify the calibration points, and compute the rotation-translation matrix ${}^{fi}T_{CAi'}$ between the displaced sensor's coordinate system {CAi'} and the industrial robot end flange {fi}.

S7: Compensate the error in the displaced position of vision sensor i and unify the compensated coordinate system {CAi} to the industrial robot end-flange coordinate system {fi}.
Preferably, the vision sensors are 2D vision sensors.
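The rotation-translation relations named in the steps above can all be handled as 4x4 homogeneous transforms. The following Python/NumPy sketch shows this bookkeeping; the function names and the composition example are illustrative only and are not part of the patented method.

import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = np.asarray(t).ravel()
    return T

def invert_transform(T):
    """Invert a rigid transform without a general matrix inverse."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

# Example: if T_fi_Ci maps points from {Ci} to {fi} and T_B_fi maps {fi} to {B},
# the sensor frame expressed in the robot base is the composition
# T_B_Ci = T_B_fi @ T_fi_Ci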
Preferably, in S1, the hand-eye calibration of each vision sensor, which yields the rotation-translation relation ${}^{fi}T_{Ci}$ between the industrial robot end-flange coordinate system {fi} and the vision sensor i coordinate system {Ci}, is performed with the calibration plate.
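As an illustration only, plate-based eye-in-hand calibration of this kind can be performed with OpenCV's calibrateHandEye; the input lists of flange poses and plate poses below are hypothetical names for data gathered at each calibration shot, and the Tsai method is just one selectable solver.

import cv2
import numpy as np

def hand_eye_calibrate(R_flange2base, t_flange2base, R_board2cam, t_board2cam):
    """Return the 4x4 transform fiT_Ci of sensor i in the end-flange frame {fi}.
    Inputs are per-image lists: flange pose in the robot base (from the controller)
    and calibration-plate pose in the sensor frame (e.g. from a plate PnP solve)."""
    R_cam2flange, t_cam2flange = cv2.calibrateHandEye(
        R_gripper2base=R_flange2base, t_gripper2base=t_flange2base,
        R_target2cam=R_board2cam, t_target2cam=t_board2cam,
        method=cv2.CALIB_HAND_EYE_TSAI)
    T = np.eye(4)
    T[:3, :3] = R_cam2flange
    T[:3, 3] = t_cam2flange.ravel()
    return T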
Preferably, the calibration points are fixed on the fixture table; there are more than four of them, each is the upper end face of a cylinder, and their heights do not lie in the same horizontal plane.
Preferably, in S3, each vision sensor i photographs and identifies the calibration points in turn, and the rotation-translation relation ${}^{B}T_{Ci'}$ between the robot base {B} and the vision sensor i coordinate system {Ci'} is obtained with a PnP (perspective-n-point) algorithm.
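A sketch of how such a PnP estimate could be obtained with OpenCV is shown below. It assumes the calibration points' 3D positions are already expressed in the robot base {B} (as in the measurement step described later) and that the sensor's intrinsic matrix K and distortion coefficients dist are known from a prior camera calibration; solvePnP returns the base frame expressed in the sensor, which is then inverted to give ${}^{B}T_{Ci'}$.

import cv2
import numpy as np

def base_to_sensor_pose(points_in_base, pixels, K, dist):
    """Estimate BT_Ci' from >= 4 calibration points whose 3D positions are known
    in the robot base {B} and whose pixel centres were detected in the image."""
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(points_in_base, dtype=np.float64),
        np.asarray(pixels, dtype=np.float64),
        K, dist, flags=cv2.SOLVEPNP_EPNP)   # EPnP handles >= 4 non-coplanar points
    if not ok:
        raise RuntimeError("PnP failed")
    R, _ = cv2.Rodrigues(rvec)              # Ci'T_B: base frame expressed in the sensor
    T_Ci_B = np.eye(4)
    T_Ci_B[:3, :3] = R
    T_Ci_B[:3, 3] = tvec.ravel()
    return np.linalg.inv(T_Ci_B)            # BT_Ci': sensor frame expressed in the base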
Preferably, in S5, the relative error ${}^{Ci}T_{Ci'}$ is computed between the vision sensor i coordinate system {Ci} obtained from the calibration-plate calibration and the vision sensor i coordinate system {Ci'} computed from the calibration points.
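Expressed with homogeneous transforms, steps S4 and S5 amount to two matrix compositions. The sketch below assumes the flange pose ${}^{B}T_{fi}$ at the photographing position is available (e.g. from the teach pendant) and that the composition order shown matches the frame definitions above.

import numpy as np

def relative_error(T_fi_Ci, T_B_fi, T_B_Ci_prime):
    """S4 + S5 sketch: express the point-based sensor pose in the flange frame and
    compare it with the plate-based reference (composition order is an assumption)."""
    T_fi_Ci_prime = np.linalg.inv(T_B_fi) @ T_B_Ci_prime   # fiT_Ci'  (step S4)
    return np.linalg.inv(T_fi_Ci) @ T_fi_Ci_prime          # CiT_Ci'  (step S5)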
Preferably, in S6, after the position of vision sensor i has changed, the pre-taught correction program is executed again, the sensor photographs and identifies the calibration points, and the rotation-translation matrix ${}^{fi}T_{CAi'}$ between the displaced sensor's coordinate system {CAi'} and the industrial robot end flange {fi} is computed.
Preferably, in S7, the displaced position of vision sensor i is error-compensated and the compensated coordinate system {CAi} is unified to the industrial robot end-flange coordinate system {fi}; with the relative error defined in S5, the compensated transform can be written as ${}^{fi}T_{CAi} = {}^{fi}T_{CAi'} \cdot ({}^{Ci}T_{Ci'})^{-1}$.
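A minimal sketch of this compensation step, under the assumption that the systematic error of the calibration-point method is unchanged by the sensor's displacement:

import numpy as np

def compensate_moved_sensor(T_fi_CAi_prime, T_Ci_Ci_prime):
    """S6 + S7 sketch: remove the systematic point-calibration error from the
    re-measured pose of the moved sensor, giving fiT_CAi."""
    return T_fi_CAi_prime @ np.linalg.inv(T_Ci_Ci_prime)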
preferably, the characteristic position of the calibration point is the center of the upper end face of the cylinder, so that the measurement of three-coordinate measuring equipment is facilitated, and unnecessary measuring errors are reduced; the coordinate system of the calibration point is superposed with the Base coordinate system of the industrial robot, and the three-dimensional coordinate position under the Base coordinate system { B } of the industrial robot is obtained by measuring each calibration point by using a three-coordinate articulated arm measuring instrument or other three-coordinate measuring equipment B Pi(xi,yi,zi);
Preferably, the position and orientation ${}^{B}T_{fi}$ of the industrial robot end flange at photographing position Pi, expressed in the robot base coordinate system, can be read from the teach pendant (a sketch of converting such a pose into a homogeneous transform follows). Because the calibration points are fixed above the fixture table, they can be installed and measured with the equipment already on site during early installation and commissioning, which improves resource utilization and reduces cost; the calibration plate is used for the hand-eye calibration of the vision sensors because it yields a more accurate relation between each sensor and the industrial robot end flange and is the most commonly used vision-sensor calibration method.
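A sketch of turning a teach-pendant flange pose into the homogeneous transform ${}^{B}T_{fi}$. The Euler-angle convention differs between robot brands, so the axis order used here is only an assumed default and must be matched to the actual controller.

import numpy as np
from scipy.spatial.transform import Rotation

def pendant_pose_to_matrix(x, y, z, rx, ry, rz, order="xyz"):
    """Convert a teach-pendant flange pose (translation in mm, angles in degrees)
    into the 4x4 transform BT_fi; 'order' must match the controller's convention."""
    T = np.eye(4)
    T[:3, :3] = Rotation.from_euler(order, [rx, ry, rz], degrees=True).as_matrix()
    T[:3, 3] = [x, y, z]
    return T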
The invention has the following beneficial effects. Hand-eye calibration with the calibration plate yields the rotation-translation relation ${}^{fi}T_{Ci}$ between the industrial robot end flange and the vision sensor i coordinate system {Ci}, which serves as the reference. Calibration with the calibration points then yields the rotation-translation relation ${}^{fi}T_{Ci'}$ between the end flange and the vision sensor i coordinate system {Ci'}; comparing it with the reference gives the relative error ${}^{Ci}T_{Ci'}$. After a vision sensor has moved, the calibration points are used once more, the displaced sensor is error-compensated, the compensated sensor coordinate system is unified to the industrial robot end-flange coordinate system, and finally to the workpiece coordinate system, so that the coordinate systems of all vision sensors are unified. Only the corresponding correction trajectory needs to be executed once during correction, so a displaced vision sensor can be re-corrected quickly; compensating the error introduced by the calibration-point calibration gives a more accurate correction, meets the accuracy requirement, enables rapid fault recovery, and improves working efficiency.
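Putting the pieces together, the whole correction flow reduces to a handful of matrix products. The sketch below is illustrative only; all inputs are assumed to be 4x4 homogeneous transforms obtained as described above.

import numpy as np

def recalibrate_sensor(T_fi_Ci_ref, T_B_fi_old, T_B_Ci_prime_old,
                       T_B_fi_new, T_B_CAi_prime_new):
    """End-to-end sketch of the correction flow (composition order is an assumption):
      1. fiT_Ci'  from the original calibration-point shot        (S3-S4)
      2. CiT_Ci'  relative error w.r.t. the plate calibration     (S5)
      3. fiT_CAi' from the calibration-point shot after the move  (S6)
      4. fiT_CAi  compensated pose of the moved sensor            (S7)"""
    T_fi_Ci_prime = np.linalg.inv(T_B_fi_old) @ T_B_Ci_prime_old
    error = np.linalg.inv(T_fi_Ci_ref) @ T_fi_Ci_prime
    T_fi_CAi_prime = np.linalg.inv(T_B_fi_new) @ T_B_CAi_prime_new
    return T_fi_CAi_prime @ np.linalg.inv(error)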
Drawings
FIG. 1 is a diagram illustrating the calibration of a multi-vision sensor according to the present invention.
Fig. 2 is a flowchart of a method for quickly calibrating an arm-mounted multi-vision sensor according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all embodiments.
Referring to figs. 1-2, a rapid correction method for an arm-mounted multi-vision sensor uses a position-compensation principle to compensate the error in the position of a displaced vision sensor and converts the displaced sensor's coordinate system {CAi'} into the industrial robot flange coordinate system {fi}. It comprises the following steps:

S1: Perform hand-eye calibration on each vision sensor to obtain the rotation-translation relation ${}^{fi}T_{Ci}$ between the industrial robot end-flange coordinate system {fi} and the coordinate system {Ci} of vision sensor i.

S2: Teach the trajectory points of each vision sensor's correction program in advance, so that every vision sensor i can photograph the calibration points; each taught photographing position Pi corresponds to one vision sensor i.

S3: Execute the correction program, letting each vision sensor i photograph and identify the calibration points in turn, and compute the rotation-translation relation ${}^{B}T_{Ci'}$ between the robot base {B} and the vision sensor i coordinate system {Ci'}.

S4: Compute the rotation-translation relation ${}^{fi}T_{Ci'}$ between the industrial robot end-flange coordinate system {fi} and the vision sensor i coordinate system {Ci'} obtained from the calibration points.

S5: Compute the relative error ${}^{Ci}T_{Ci'}$ between the coordinate system {Ci} of vision sensor i obtained from the calibration-plate calibration and the coordinate system {Ci'} obtained from the calibration points.

S6: When the position of vision sensor i changes, let vision sensor i photograph and identify the calibration points, and compute the rotation-translation matrix ${}^{fi}T_{CAi'}$ between the displaced sensor's coordinate system {CAi'} and the industrial robot end flange {fi}.

S7: Compensate the error in the displaced position of vision sensor i and unify the compensated coordinate system {CAi} to the industrial robot end-flange coordinate system {fi}.
In this embodiment, the vision sensors are 2D vision sensors.
In this embodiment, in S1, the hand-eye calibration of each vision sensor, which yields the rotation-translation relation ${}^{fi}T_{Ci}$ between the industrial robot end-flange coordinate system {fi} and the vision sensor i coordinate system {Ci}, is performed with the calibration plate.
In this embodiment, the calibration points are fixed on the fixture table; there are more than four of them, each is the upper end face of a cylinder, and their heights do not lie in the same horizontal plane.
In this embodiment, in S3, each vision sensor i photographs and identifies the calibration points in turn, and the rotation-translation relation ${}^{B}T_{Ci'}$ between the robot base {B} and the vision sensor i coordinate system {Ci'} is obtained with a PnP algorithm.
In this embodiment, in S5, the relative error ${}^{Ci}T_{Ci'}$ is computed between the vision sensor i coordinate system {Ci} obtained from the calibration-plate calibration and the vision sensor i coordinate system {Ci'} computed from the calibration points.
In this embodiment, in S6, after the position of vision sensor i has changed, the pre-taught correction program is executed again, the sensor photographs and identifies the calibration points, and the rotation-translation matrix ${}^{fi}T_{CAi'}$ between the displaced sensor's coordinate system {CAi'} and the industrial robot end flange {fi} is computed.
In this embodiment, in S7, the displaced position of vision sensor i is error-compensated and the compensated coordinate system {CAi} is unified to the industrial robot end-flange coordinate system {fi}; with the relative error defined in S5, the compensated transform can be written as ${}^{fi}T_{CAi} = {}^{fi}T_{CAi'} \cdot ({}^{Ci}T_{Ci'})^{-1}$.

In this embodiment, the characteristic position of each calibration point is the center of the cylinder's upper end face, which facilitates measurement with coordinate-measuring equipment and reduces unnecessary measurement error. The calibration-point coordinate system coincides with the industrial robot base coordinate system, and each calibration point is measured with an articulated-arm coordinate-measuring machine or other three-coordinate measuring equipment to obtain its three-dimensional position ${}^{B}P_i(x_i, y_i, z_i)$ in the industrial robot base coordinate system {B}.

In this embodiment, the position and orientation ${}^{B}T_{fi}$ of the industrial robot end flange at photographing position Pi, expressed in the robot base coordinate system, can be read from the teach pendant. Because the calibration points are fixed above the fixture table, they can be installed and measured with the equipment already on site during early installation and commissioning, which improves resource utilization and reduces cost; the calibration plate is used for the hand-eye calibration of the vision sensors because it yields a more accurate relation between each sensor and the industrial robot end flange and is the most commonly used vision-sensor calibration method.
The foregoing descriptions of specific exemplary embodiments of the present invention are presented for purposes of illustration and description and are not limiting; the scope of the present application is not restricted to them. Although the present application has been described in detail with reference to the foregoing examples, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents, within the technical scope disclosed in the present application; such modifications, changes, or substitutions do not depart from the spirit and scope of the corresponding technical solutions and are intended to be covered by the scope of this application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (8)

1. A rapid correction method for an arm-mounted multi-vision sensor, which uses a position-compensation principle to compensate the error in the position of a displaced vision sensor and converts the displaced sensor's coordinate system {CAi'} into the industrial robot flange coordinate system {fi}, characterized by comprising the following steps:

S1: performing hand-eye calibration on each vision sensor to obtain the rotation-translation relation ${}^{fi}T_{Ci}$ between the industrial robot end-flange coordinate system {fi} and the coordinate system {Ci} of vision sensor i;

S2: teaching the trajectory points of each vision sensor's correction program in advance, so that every vision sensor i can photograph the calibration points, each taught photographing position Pi corresponding to one vision sensor i;

S3: executing the correction program, letting each vision sensor i photograph and identify the calibration points in turn, and computing the rotation-translation relation ${}^{B}T_{Ci'}$ between the robot base {B} and the vision sensor i coordinate system {Ci'};

S4: computing the rotation-translation relation ${}^{fi}T_{Ci'}$ between the industrial robot end-flange coordinate system {fi} and the vision sensor i coordinate system {Ci'};

S5: computing the relative error ${}^{Ci}T_{Ci'}$ between the coordinate system {Ci} of vision sensor i obtained from the calibration-plate calibration and the coordinate system {Ci'} obtained from the calibration points;

S6: when the position of vision sensor i changes, letting vision sensor i photograph and identify the calibration points and computing the rotation-translation matrix ${}^{fi}T_{CAi'}$ between the displaced sensor's coordinate system {CAi'} and the industrial robot end flange {fi};

S7: compensating the error in the displaced position of vision sensor i and unifying the compensated coordinate system {CAi} to the industrial robot end-flange coordinate system {fi}.
2. The rapid correction method for an arm-mounted multi-vision sensor according to claim 1, characterized in that the vision sensor is a 2D vision sensor.
3. The rapid correction method for an arm-mounted multi-vision sensor according to claim 1, characterized in that, in S1, the hand-eye calibration of each vision sensor, which yields the rotation-translation relation ${}^{fi}T_{Ci}$ between the industrial robot end-flange coordinate system {fi} and the vision sensor i coordinate system {Ci}, is performed with the calibration plate.
4. The rapid correction method for an arm-mounted multi-vision sensor according to claim 1, characterized in that the calibration points are fixed on the fixture table, there are more than four of them, each calibration point is the upper end face of a cylinder, and their heights do not lie in the same horizontal plane.
5. The rapid correction method for an arm-mounted multi-vision sensor according to claim 1, characterized in that, in S3, each vision sensor i photographs and identifies the calibration points in turn, and the rotation-translation relation ${}^{B}T_{Ci'}$ between the robot base {B} and the vision sensor i coordinate system {Ci'} is obtained with a PnP algorithm.
6. The rapid correction method for an arm-mounted multi-vision sensor according to claim 1, characterized in that, in S5, the relative error ${}^{Ci}T_{Ci'}$ is obtained between the vision sensor i coordinate system {Ci} obtained from the calibration-plate calibration and the vision sensor i coordinate system {Ci'} computed from the calibration points.
7. The rapid correction method for an arm-mounted multi-vision sensor according to claim 1, characterized in that, in S6, when the position of vision sensor i has changed, the pre-taught correction program is executed, the sensor photographs and identifies the calibration points, and the rotation-translation matrix ${}^{fi}T_{CAi'}$ between the displaced sensor's coordinate system {CAi'} and the industrial robot end flange {fi} is computed.
8. The rapid correction method for an arm-mounted multi-vision sensor according to claim 1, characterized in that, in S7, the displaced position of vision sensor i is error-compensated and the compensated coordinate system {CAi} is unified to the industrial robot end-flange coordinate system {fi}, giving ${}^{fi}T_{CAi} = {}^{fi}T_{CAi'} \cdot ({}^{Ci}T_{Ci'})^{-1}$.
CN202210827177.3A 2022-07-14 Rapid correction method for arm-mounted multi-vision sensor Active CN115200475B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210827177.3A CN115200475B (en) 2022-07-14 Rapid correction method for arm-mounted multi-vision sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210827177.3A CN115200475B (en) 2022-07-14 Rapid correction method for arm-mounted multi-vision sensor

Publications (2)

Publication Number Publication Date
CN115200475A 2022-10-18
CN115200475B CN115200475B (en) 2024-06-07


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117123520A (en) * 2023-02-06 2023-11-28 荣耀终端有限公司 Method for realizing glue wiping of target workpiece and glue wiping equipment


Similar Documents

Publication Publication Date Title
CN108818536B (en) Online offset correction method and device for robot hand-eye calibration
US8520067B2 (en) Method for calibrating a measuring system
US9519736B2 (en) Data generation device for vision sensor and detection simulation system
CN111256732B (en) Target attitude error measurement method for underwater binocular vision
CN113379849A (en) Robot autonomous recognition intelligent grabbing method and system based on depth camera
CN114714356A (en) Method for accurately detecting calibration error of hand eye of industrial robot based on binocular vision
CN113601158A (en) Bolt feeding and pre-tightening system based on visual positioning and control method
KR20080088165A (en) Robot calibration method
CN111862221A (en) UVW platform calibration method and device, deviation correction method and device and alignment system
CN112907679A (en) Robot repeated positioning precision measuring method based on vision
JP2012101306A (en) Apparatus and method for calibration of robot
CN109059755B (en) High-precision hand-eye calibration method for robot
TW201439572A (en) System and method of normalizing machine coordinate system
JP2007533963A (en) Non-contact optical measuring method and measuring apparatus for 3D position of object
JP7427370B2 (en) Imaging device, image processing device, image processing method, calibration method for imaging device, robot device, method for manufacturing articles using robot device, control program, and recording medium
JP2007533963A5 (en)
CN114459345B (en) Aircraft fuselage position and posture detection system and method based on visual space positioning
CN108627103A (en) A kind of 2D laser measurement methods of parts height dimension
CN115200475B (en) Rapid correction method for arm-mounted multi-vision sensor
CN115200475A (en) Rapid correction method for arm-mounted multi-vision sensor
CN113894793B (en) Method for acquiring relative pose relationship between part and vision sensor
JP2021024053A (en) Correction method of visual guidance robot arm
CN214200141U (en) Robot repeated positioning precision measuring system based on vision
CN109737902B (en) Industrial robot kinematics calibration method based on coordinate measuring instrument
CN115235340A (en) Rapid correction method for arm-mounted vision sensor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant