CN110842919A - Visual guide method for screwing of robot - Google Patents
Visual guide method for screwing of robot
- Publication number
- CN110842919A (Application CN201911068409.6A)
- Authority
- CN
- China
- Prior art keywords
- acquiring
- fixed point
- ideal
- coordinate
- robot
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B23—MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
- B23P—METAL-WORKING NOT OTHERWISE PROVIDED FOR; COMBINED OPERATIONS; UNIVERSAL MACHINE TOOLS
- B23P19/00—Machines for simply fitting together or separating metal parts or objects, or metal and non-metal parts, whether or not involving some deformation; Tools or devices therefor so far as not provided for in other classes
- B23P19/04—Machines for simply fitting together or separating metal parts or objects, or metal and non-metal parts, whether or not involving some deformation; Tools or devices therefor so far as not provided for in other classes for assembling or disassembling parts
- B23P19/06—Screw or nut setting or loosening machines
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/04—Viewing devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/02—Programme-controlled manipulators characterised by movement of the arms, e.g. cartesian coordinate type
- B25J9/023—Cartesian coordinate type
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Robotics (AREA)
- Manipulator (AREA)
- Numerical Control (AREA)
Abstract
The invention provides a visual guidance method for screw driving by a robot, which comprises the following steps: establishing a camera image coordinate system and a robot coordinate system; acquiring the mapping relation between the camera image coordinate system and the robot coordinate system; acquiring the distance between the camera and the robot tool; establishing an image template database, wherein the image template database comprises a plurality of image templates; setting a plurality of ideal fixed points on the image template and acquiring the ideal fixed point coordinates; photographing the actual fixed points on the product that correspond to the ideal fixed points and acquiring the actual fixed point coordinates; acquiring an angle difference from the ideal fixed point coordinates and the actual fixed point coordinates; acquiring the ideal screw coordinates on the image template; acquiring an offset value of the ideal screw coordinates from the ideal screw coordinates and the angle difference; and acquiring the actual screw coordinates from the offset value. The visual guidance method provided by the invention correctly guides the robot tool to the actual screw driving position, avoids deviation during screw driving, is simple to operate and is easy to implement.
Description
Technical Field
The invention relates to the technical field of visual guidance for robot screw driving, and in particular to a visual guidance method for robot screw driving.
Background
In the prior art, screws are usually locked manually, but this method is inefficient and prone to positional deviation. Some companies lock screws by mechanized operation, i.e., by using a robot tool to drive the screws, but the problem of offset still arises easily.
Disclosure of Invention
In order to solve the problem in the prior art that screw driving is prone to deviation, a visual guidance method for robot screw driving is provided.
In order to achieve the above object, the present invention provides a visual guidance method for screwing by a robot, comprising the following steps:
establishing a camera image coordinate system and a robot coordinate system;
acquiring a mapping relation between a camera image coordinate system and a robot coordinate system;
acquiring a distance between a camera and a robot tool;
establishing an image template database, wherein the image template database comprises a plurality of image templates;
setting a plurality of ideal fixed points on the image template, and acquiring coordinates of the ideal fixed points;
photographing the actual fixed point on the product that corresponds to the ideal fixed point, and acquiring the actual fixed point coordinate;
acquiring an angle difference according to the ideal fixed point coordinate and the actual fixed point coordinate;
acquiring the coordinates of an ideal screw on the image template;
acquiring an offset value of the ideal screw coordinate according to the ideal screw coordinate and the angle difference;
and acquiring actual screw coordinates according to the offset value.
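Under stated assumptions (two fixed points determine the product's rotation, and the screw position is rigidly attached to them), the steps above can be sketched end to end. `guide_screw` is a hypothetical helper for illustration, not the patent's exact formulas:

```python
import math

def guide_screw(ideal_pts, actual_pts, ideal_screw):
    """End-to-end sketch of the guidance steps (hypothetical helper, not
    the patent's exact formulas): estimate the product's rotation from two
    fixed points, then map the ideal screw position onto the product."""
    (m1, m2), (m11, m22) = ideal_pts, actual_pts
    # Angle difference between the ideal and actual fixed-point lines.
    theta1 = math.atan2(m2[1] - m1[1], m2[0] - m1[0])
    theta2 = math.atan2(m22[1] - m11[1], m22[0] - m11[0])
    d_theta = theta1 - theta2
    # Rotate the template vector m1 -> ideal_screw by the product's
    # rotation (-d_theta) and anchor it at the actual fixed point m11.
    vx, vy = ideal_screw[0] - m1[0], ideal_screw[1] - m1[1]
    c, s = math.cos(-d_theta), math.sin(-d_theta)
    return m11[0] + vx * c - vy * s, m11[1] + vx * s + vy * c
```

For an unrotated product the function simply returns the ideal screw coordinate anchored at the actual fixed point; for a rotated product the screw position is turned accordingly.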
Preferably, the mapping relationship is determined by a robot nine-point calibration method.
Preferably, the step of acquiring the distance between the camera and the robot tool comprises:
arbitrarily taking a fixed point q in a camera shooting plane;
shooting the fixed point q with the camera, and acquiring the fixed point pixel coordinate q(x, y, z) and the current first manipulator coordinate q′(x′, y′, z′);
the robot tool touches the fixed point q, and the current second manipulator coordinate q″(x″, y″, z″) is acquired;
acquiring the distance Q between the camera and the robot tool from the first manipulator coordinate q′(x′, y′, z′) and the second manipulator coordinate q″(x″, y″, z″).
preferably, the step of obtaining the angular difference comprises:
arbitrarily taking two ideal fixed points m1, m2 on the image template, wherein the ideal fixed point coordinates are m1(x1, y1) and m2(x2, y2);
according to the ideal fixed point coordinates m1(x1, y1) and m2(x2, y2), obtaining the angle θ1 between the line through the ideal fixed points and the X axis;
from the actual fixed point coordinates m11(x11, y11) and m22(x22, y22) corresponding to the ideal fixed points m1(x1, y1) and m2(x2, y2), obtaining the angle θ2 between the line through the actual fixed points and the X axis;
the angle difference is Δθ = θ1 − θ2.
Preferably, the step of obtaining an offset value of ideal screw coordinates comprises:
obtaining the ideal screw coordinate P1(xP1, yP1) on the image template;
the offset value comprises Δx and Δy, and the offset value is obtained by:
preferably, the step of acquiring actual screw coordinates comprises:
the actual screw coordinate is P2(xP2, yP2), wherein:
xP2=x11+Δx;
yP2=y11+Δy。
the technical scheme of the invention has the following technical effects:
By adopting the above technical solution, the camera image coordinate system and the robot coordinate system are established and the mapping relation between them is acquired, so that the image coordinates acquired by the camera and the coordinates of the robot tool can be converted into each other. The ideal fixed point coordinates and the actual fixed point coordinates are acquired to obtain the angle difference between the actual and ideal fixed points, and the offset is obtained from the ideal screw coordinates and the angle difference, so that the actual screw coordinates are determined. The visual guidance method provided by the invention correctly guides the robot tool to the actual screw driving position, avoids deviation during screw driving, is simple to operate and is easy to implement.
Drawings
FIG. 1 is a flow chart of a robot screwing visual guidance method involved in the present invention;
FIG. 2 is a schematic diagram of the positional relationship of the components of the robot involved in the present invention;
fig. 3 is an enlarged schematic view of portion i of fig. 2.
Detailed Description
The following describes an embodiment according to the present invention with reference to the drawings.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention; however, the invention may be practiced in ways other than those described herein, and therefore it is not limited to the specific embodiments disclosed below.
The invention provides a visual guiding method for a robot to drive screws, aiming at solving the problem that the screws are easy to deviate.
As shown in fig. 1, the present invention provides a visual guiding method for screwing by a robot, comprising the following steps:
S1, establishing a camera image coordinate system and a robot coordinate system;
In this step, a camera image coordinate system oP-xPyPzP and a robot coordinate system oR-xRyRzR are established. The method is applied here to screw driving, but it can also be applied in other practical scenarios.
For example, the setup shown in fig. 2 and 3 may be used, in which the camera is mounted on the manipulator and moves with it (eye-in-hand). In the camera image coordinate system oP-xPyPzP, the optical axis of the camera is the zP axis and the optical center is the origin oP (not shown in the figure); the xP and yP axes are set in the plane that is perpendicular to the zP axis and passes through the origin. In the robot coordinate system oR-xRyRzR, the origin of the robot tool coordinate system is the origin oR (not shown in the figure); the xR and yR axes are set in the plane that is perpendicular to the zR axis and passes through the origin. Here oA-xAyAzA is the base coordinate system and oB-xByBzB is the calibration plate coordinate system.
S2, acquiring a mapping relation between a camera image coordinate system and a robot coordinate system;
further, the mapping relation is determined by a robot nine-point calibration method.
In this step, by providing a robot nine-point calibration method, the coordinates of the image acquired by the camera and the coordinates of the robot tool are converted.
As shown in FIG. 3, the camera image coordinate P(xP, yP, zP) in the camera image coordinate system and the coordinate R(xR, yR, zR) of the robot tool in the robot coordinate system are acquired. Since the Z-axis direction and the relative distance between the camera image coordinate P and the robot tool coordinate R are fixed, the Z-axis components zP and zR may be disregarded in subsequent acquisitions.
wherein R is a rotation and M is a displacement;
the conversion formulas are:
xR = a·xP + b·yP + c;
yR = a′·xP + b′·yP + c′;
according to the formulas, the six coefficients a, b, c and a′, b′, c′ are obtained from corresponding coordinate pairs such as (xP0, yP0), (xP1, yP1), (xP2, yP2) in the image and (xR0, yR0), (xR1, yR1), (xR2, yR2) in the robot coordinate system; three non-collinear pairs determine the coefficients, and the nine calibration points give an over-determined system.
The robot nine-point calibration method is adopted to obtain the mapping relation between the camera image coordinate system and the robot coordinate system, namely the coordinate transformation relation between the camera and the robot tool.
In this embodiment, a fixed point q(x, y, 1) is placed on the camera shooting plane, the camera shoots it and makes a template, and the robot tool moves through nine points in sequence for shooting and matching; nine sets of pixel coordinates and manipulator coordinates are acquired correspondingly, and an affine transformation matrix is established from these nine coordinate sets through the image algorithm library. A conversion matrix is obtained from the robot nine-point calibration, and the pixel coordinates are converted to the manipulator coordinates q′(x′, y′, 1) by the conversion equation.
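The nine-point calibration above can be sketched as follows. `fit_affine` and `pixel_to_robot` are illustrative helpers (not from the patent) that estimate the 2×3 affine matrix by least squares from the nine pixel/manipulator coordinate pairs:

```python
import numpy as np

def fit_affine(pixel_pts, robot_pts):
    """Estimate the 2x3 affine map from image pixel coordinates to
    manipulator coordinates by least squares (nine-point calibration)."""
    pixel_pts = np.asarray(pixel_pts, dtype=float)
    robot_pts = np.asarray(robot_pts, dtype=float)
    # Homogeneous design matrix: one row (xP, yP, 1) per calibration point.
    A = np.column_stack([pixel_pts, np.ones(len(pixel_pts))])
    # Solve A @ [a, b, c]^T = xR and A @ [a', b', c']^T = yR jointly.
    coeffs, *_ = np.linalg.lstsq(A, robot_pts, rcond=None)
    return coeffs.T  # [[a, b, c], [a', b', c']]

def pixel_to_robot(T, p):
    """Apply the conversion equation to one pixel coordinate."""
    return T @ np.array([p[0], p[1], 1.0])
```

Nine points over-determine the six coefficients, so least squares also averages out small measurement noise; in production the matrix would come from the image algorithm library rather than a hand-rolled solver.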
S3, acquiring the distance between the camera and the robot tool;
Further, the step of acquiring the distance between the camera and the robot tool comprises:
S30, arbitrarily selecting a fixed point q in the camera photographing plane;
S31, shooting the fixed point q with the camera, and acquiring the fixed point pixel coordinate q(x, y, z) and the current first manipulator coordinate q′(x′, y′, z′);
S32, touching the fixed point q with the robot tool, and acquiring the current second manipulator coordinate q″(x″, y″, z″);
S33, acquiring the distance Q between the camera and the robot tool from the first manipulator coordinate q′(x′, y′, z′) and the second manipulator coordinate q″(x″, y″, z″);
wherein z′ = 0 and z″ = 0.
In the step, a fixed point arbitrarily selected in the camera photographing plane is provided, and the robot tool touches the fixed point, so that the distance from the center of the camera to the robot tool is obtained and is used for calculating the subsequent point position.
In this embodiment, the acquired manipulator coordinates of the grasped object are shifted to obtain the coordinates of the grasped object in the screw machine tool coordinate system. When the camera center is taken as the reference, MarkX and MarkY are the original coordinates; when the robot tool coordinate system is the reference, markX and markY are the shifted coordinates. Since Δx = x″ − x′, Δy = y″ − y′, and Δz = z″ − z′ = 0, it follows that markX = MarkX − Δx and markY = MarkY − Δy.
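A minimal sketch of this camera-to-tool compensation, assuming a constant working height so only X and Y matter (function names are illustrative, not from the patent):

```python
def camera_tool_offset(q_cam, q_tool):
    """Offset between the manipulator pose when the camera centre sees the
    fixed point (first coordinate q') and the pose when the tool touches
    it (second coordinate q''). z is assumed constant, so only X/Y matter."""
    return q_tool[0] - q_cam[0], q_tool[1] - q_cam[1]

def shift_to_tool(mark_x, mark_y, offset):
    """markX = MarkX - dx, markY = MarkY - dy, as in the embodiment."""
    dx, dy = offset
    return mark_x - dx, mark_y - dy
```

The offset is measured once during setup and then applied to every camera-centred coordinate before commanding the tool.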
S4, establishing an image template database, wherein the image template database comprises a plurality of image templates;
S5, setting a plurality of ideal fixed points on the image template, and acquiring the coordinates of the ideal fixed points;
in the step, an image template database is established, the image template database comprises a plurality of image templates, and a plurality of ideal fixed points are preset on the image templates and are used as the standard of the actual positions of the screws.
S6, photographing the actual fixed point on the product that corresponds to the ideal fixed point, and acquiring the actual fixed point coordinate;
S7, acquiring an angle difference according to the ideal fixed point coordinate and the actual fixed point coordinate;
further, the step of obtaining the angle difference comprises:
S70, arbitrarily taking two ideal fixed points m1 and m2 on the image template, wherein the ideal fixed point coordinates are m1(x1, y1) and m2(x2, y2);
S71, according to the ideal fixed point coordinates m1(x1, y1) and m2(x2, y2), obtaining the angle θ1 between the line through the ideal fixed points and the X axis;
S72, from the actual fixed point coordinates m11(x11, y11) and m22(x22, y22) corresponding to the ideal fixed points m1(x1, y1) and m2(x2, y2), obtaining the angle θ2 between the line through the actual fixed points and the X axis;
S73, the angle difference is Δθ = θ1 − θ2.
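Steps S70 to S73 can be sketched as follows, assuming the elided angle formulas use the arctangent of the line through the two points (a common choice; the patent's exact expressions are not reproduced in the extracted text):

```python
import math

def line_angle(p_a, p_b):
    """Angle between the line p_a -> p_b and the X axis, in radians."""
    return math.atan2(p_b[1] - p_a[1], p_b[0] - p_a[0])

def angle_difference(m1, m2, m11, m22):
    """Angle difference between the ideal line (m1 -> m2) and the
    actual line (m11 -> m22): delta-theta = theta1 - theta2."""
    return line_angle(m1, m2) - line_angle(m11, m22)
```

`atan2` is used rather than `atan` so that the sign of the angle is correct in all quadrants.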
S8, acquiring the coordinates of the ideal screw on the image template;
S9, acquiring an offset value of the ideal screw coordinate according to the ideal screw coordinate and the angle difference;
further, the step of obtaining an offset value of ideal screw coordinates includes:
obtaining the ideal screw coordinate P1(xP1, yP1) on the image template;
the offset value comprises Δx and Δy, and the offset value is obtained by:
and S10, acquiring the actual screw coordinate according to the offset value.
Further, the step of acquiring actual screw coordinates includes:
the actual screw coordinate is P2(xP2, yP2), wherein:
xP2=x11+Δx;
yP2=y11+Δy。
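The offset formula itself is not reproduced in the extracted text, so the following is only a plausible reconstruction consistent with S73 and S10: the vector from the ideal fixed point m1 to the ideal screw P1 is rotated by the product's rotation (−Δθ) and anchored at the actual fixed point m11:

```python
import math

def screw_offset(p1, m1, d_theta):
    """Offset (dx, dy) of the ideal screw coordinate p1. The vector from
    the ideal fixed point m1 to p1 is rotated by the product's rotation
    (-d_theta); this reconstruction is an assumption, since the patent's
    formula is not reproduced in the text."""
    vx, vy = p1[0] - m1[0], p1[1] - m1[1]
    c, s = math.cos(-d_theta), math.sin(-d_theta)
    return vx * c - vy * s, vx * s + vy * c

def actual_screw(m11, offset):
    """xP2 = x11 + dx, yP2 = y11 + dy, matching step S10."""
    return m11[0] + offset[0], m11[1] + offset[1]
```

With Δθ = 0 the offset degenerates to the template vector itself, which matches the intuition that an unrotated product needs no angular correction.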
In the above steps, the angle difference between the ideal and actual fixed points is acquired so that the offset value between each actual point and its ideal point can be obtained; the actual position of each screw-driving point is then obtained from the ideal screw coordinates, so that deviation during screw driving is avoided.
In summary, with the above technical solution, the ideal fixed point coordinates and the actual fixed point coordinates are acquired, the angle difference between the actual and ideal fixed points is obtained, and the offset is then obtained from the ideal screw coordinates and the angle difference, so that the actual screw coordinates are determined. The visual guidance method provided by the invention correctly guides the robot tool to the actual screw driving position, avoids deviation during screw driving, is simple to operate and is easy to implement.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (6)
1. A visual guide method for screwing by a robot is characterized by comprising the following steps:
establishing a camera image coordinate system and a robot coordinate system;
acquiring a mapping relation between the camera image coordinate system and a robot coordinate system;
acquiring a distance between a camera and a robot tool;
establishing an image template database, wherein the image template database comprises a plurality of image templates;
setting a plurality of ideal fixed points on the image template, and acquiring coordinates of the ideal fixed points;
photographing the actual fixed point on the product that corresponds to the ideal fixed point, and acquiring an actual fixed point coordinate;
acquiring an angle difference according to the ideal fixed point coordinate and the actual fixed point coordinate;
acquiring the coordinates of the ideal screw on the image template;
obtaining an offset value of the ideal screw coordinate according to the ideal screw coordinate and the angle difference;
and acquiring actual screw coordinates according to the offset value.
2. The visual guidance method for robot screwing according to claim 1, wherein said mapping relationship is determined by a robot nine-point calibration method.
3. The visual guidance method for robotic screwing according to claim 2, wherein said step of acquiring the distance between the camera and the robotic tool comprises:
arbitrarily taking a fixed point q in a camera shooting plane;
the camera shoots the fixed point q, and the fixed point pixel coordinate q(x, y, z) and the current first manipulator coordinate q′(x′, y′, z′) are acquired;
the robot tool touches the fixed point q, and the current second manipulator coordinate q″(x″, y″, z″) is acquired;
acquiring the distance Q between the camera and the robot tool from the first manipulator coordinate q′(x′, y′, z′) and the second manipulator coordinate q″(x″, y″, z″).
4. the visual guidance method for robotic screwing according to claim 3, wherein the step of acquiring said angular difference comprises:
arbitrarily taking two of the ideal fixed points m1, m2 on the image template, wherein the ideal fixed point coordinates are m1(x1, y1) and m2(x2, y2);
according to the ideal fixed point coordinates m1(x1, y1) and m2(x2, y2), obtaining the angle θ1 between the line through the ideal fixed points and the X axis;
from the actual fixed point coordinates m11(x11, y11) and m22(x22, y22) corresponding to the ideal fixed points m1(x1, y1) and m2(x2, y2), obtaining the angle θ2 between the line through the actual fixed points and the X axis;
the angle difference is Δθ = θ1 − θ2.
5. The visual guidance method for robotic screwing according to claim 4, wherein the step of obtaining an offset value of the ideal screw coordinates comprises:
acquiring the ideal screw coordinate P1(xP1, yP1) on the image template;
the offset value comprises Δx and Δy, and the offset value is obtained by:
6. the visual guidance method for robotic screwing according to claim 5, wherein the step of acquiring said actual screw coordinates comprises:
the actual screw coordinate is P2(xP2, yP2), wherein:
xP2=x11+Δx;
yP2=y11+Δy。
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911068409.6A CN110842919B (en) | 2019-11-05 | 2019-11-05 | Visual guide method for screwing of robot |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110842919A true CN110842919A (en) | 2020-02-28 |
CN110842919B CN110842919B (en) | 2021-01-22 |
Family
ID=69598932
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911068409.6A Active CN110842919B (en) | 2019-11-05 | 2019-11-05 | Visual guide method for screwing of robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110842919B (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112312666A (en) * | 2020-11-06 | 2021-02-02 | 浪潮电子信息产业股份有限公司 | Circuit board screw driving method and system |
CN112329530A (en) * | 2020-09-30 | 2021-02-05 | 北京航空航天大学 | Method, device and system for detecting mounting state of bracket |
CN113635286A (en) * | 2021-08-20 | 2021-11-12 | 菲烁易维(重庆)科技有限公司 | Device and method for controlling bolt tightening based on machine vision technology |
CN114178838A (en) * | 2021-12-28 | 2022-03-15 | 上海稊米汽车科技有限公司 | Multi-locking-point locking method and device applied to multi-plane workpiece |
CN114683214A (en) * | 2022-03-30 | 2022-07-01 | 武汉海微科技有限公司 | Visual positioning method for automatically screwing vehicle-mounted screen shell |
CN115582829A (en) * | 2021-07-05 | 2023-01-10 | 腾讯科技(深圳)有限公司 | Method and device for determining position of mechanical arm, electronic equipment and storage medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160082557A1 (en) * | 2014-09-22 | 2016-03-24 | Kuka Systems Corporation North America | Robotic apparatus and process for the installation of collars and nuts onto fasteners |
CN108312144A (en) * | 2017-12-25 | 2018-07-24 | 北京航天测控技术有限公司 | Automatically lock pays control system and method for robot based on machine vision |
CN108544531A (en) * | 2018-04-12 | 2018-09-18 | 江苏科技大学 | A kind of automatic chemical examination robot arm device, control system and its control method of view-based access control model calibration |
US20180297717A1 (en) * | 2017-04-18 | 2018-10-18 | Electroimpact, Inc. | Camera assisted robotic system for locating the end of a fastener extending through an aircraft part during manufacture thereof |
CN110125926A (en) * | 2018-02-08 | 2019-08-16 | 比亚迪股份有限公司 | The workpiece of automation picks and places method and system |
CN110315525A (en) * | 2018-03-29 | 2019-10-11 | 天津工业大学 | A kind of robot workpiece grabbing method of view-based access control model guidance |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112329530A (en) * | 2020-09-30 | 2021-02-05 | 北京航空航天大学 | Method, device and system for detecting mounting state of bracket |
CN112329530B (en) * | 2020-09-30 | 2023-03-21 | 北京航空航天大学 | Method, device and system for detecting mounting state of bracket |
CN112312666A (en) * | 2020-11-06 | 2021-02-02 | 浪潮电子信息产业股份有限公司 | Circuit board screw driving method and system |
CN112312666B (en) * | 2020-11-06 | 2023-08-15 | 浪潮电子信息产业股份有限公司 | Circuit board screw driving method and system |
CN115582829A (en) * | 2021-07-05 | 2023-01-10 | 腾讯科技(深圳)有限公司 | Method and device for determining position of mechanical arm, electronic equipment and storage medium |
CN113635286A (en) * | 2021-08-20 | 2021-11-12 | 菲烁易维(重庆)科技有限公司 | Device and method for controlling bolt tightening based on machine vision technology |
CN114178838A (en) * | 2021-12-28 | 2022-03-15 | 上海稊米汽车科技有限公司 | Multi-locking-point locking method and device applied to multi-plane workpiece |
CN114178838B (en) * | 2021-12-28 | 2024-03-22 | 上海稊米汽车科技有限公司 | Multi-locking point locking method and equipment applied to multi-plane workpiece |
CN114683214A (en) * | 2022-03-30 | 2022-07-01 | 武汉海微科技有限公司 | Visual positioning method for automatically screwing vehicle-mounted screen shell |
Also Published As
Publication number | Publication date |
---|---|
CN110842919B (en) | 2021-01-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110842919B (en) | Visual guide method for screwing of robot | |
JP7207851B2 (en) | Control method, robot system, article manufacturing method, program and recording medium | |
JP6427972B2 (en) | Robot, robot system and control device | |
CN104540648B (en) | Have the working rig and electronic part mounting of articulated robot | |
US20160354929A1 (en) | Robot, robot control device, and robot system | |
WO2020024178A1 (en) | Hand-eye calibration method and system, and computer storage medium | |
JP2019169156A (en) | Vision system for training assembly system through virtual assembly of objects | |
CN113001535A (en) | Automatic correction system and method for robot workpiece coordinate system | |
JP2014180720A (en) | Robot system and calibration method | |
JP5618770B2 (en) | Robot calibration apparatus and calibration method | |
CN105323455B (en) | A kind of location compensation method based on machine vision | |
JP2016187846A (en) | Robot, robot controller and robot system | |
CN110465946B (en) | Method for calibrating relation between pixel coordinate and robot coordinate | |
US20210154836A1 (en) | Trajectory control device | |
CN104552341A (en) | Single-point multi-view meter-hanging posture error detecting method of mobile industrial robot | |
TWI699264B (en) | Correction method of vision guided robotic arm | |
US20220395981A1 (en) | System and method for improving accuracy of 3d eye-to-hand coordination of a robotic system | |
CN114260908B (en) | Robot teaching method, apparatus, computer device and computer program product | |
EP4101604A1 (en) | System and method for improving accuracy of 3d eye-to-hand coordination of a robotic system | |
CN114663500A (en) | Vision calibration method, computer device and storage medium | |
CN111098306A (en) | Calibration method and device of robot, robot and storage medium | |
CN113211431A (en) | Pose estimation method based on two-dimensional code correction robot system | |
CN108582037B (en) | Method for realizing precise fitting by matching two cameras with robot | |
CN112238453A (en) | Vision-guided robot arm correction method | |
JP2021024053A (en) | Correction method of visual guidance robot arm |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||