CN110842919A - Visual guide method for screwing of robot - Google Patents

Visual guide method for screwing of robot

Info

Publication number
CN110842919A
Authority
CN
China
Prior art keywords
acquiring
fixed point
ideal
coordinate
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911068409.6A
Other languages
Chinese (zh)
Other versions
CN110842919B (en)
Inventor
吴鲁滨
李强
徐志超
张文强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sichuan Changhong Intelligent Manufacturing Technology Co Ltd
Original Assignee
Sichuan Changhong Intelligent Manufacturing Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan Changhong Intelligent Manufacturing Technology Co Ltd filed Critical Sichuan Changhong Intelligent Manufacturing Technology Co Ltd
Priority to CN201911068409.6A priority Critical patent/CN110842919B/en
Publication of CN110842919A publication Critical patent/CN110842919A/en
Application granted granted Critical
Publication of CN110842919B publication Critical patent/CN110842919B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23P METAL-WORKING NOT OTHERWISE PROVIDED FOR; COMBINED OPERATIONS; UNIVERSAL MACHINE TOOLS
    • B23P19/00 Machines for simply fitting together or separating metal parts or objects, or metal and non-metal parts, whether or not involving some deformation; Tools or devices therefor so far as not provided for in other classes
    • B23P19/04 Machines for simply fitting together or separating metal parts or objects, or metal and non-metal parts, whether or not involving some deformation; Tools or devices therefor so far as not provided for in other classes for assembling or disassembling parts
    • B23P19/06 Screw or nut setting or loosening machines
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 Sensing devices
    • B25J19/04 Viewing devices
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/02 Programme-controlled manipulators characterised by movement of the arms, e.g. cartesian coordinate type
    • B25J9/023 Cartesian coordinate type

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Numerical Control (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a visual guidance method for screwing by a robot, which comprises the following steps: establishing a camera image coordinate system and a robot coordinate system; acquiring a mapping relation between the camera image coordinate system and the robot coordinate system; acquiring a distance between a camera and a robot tool; establishing an image template database, wherein the image template database comprises a plurality of image templates; setting a plurality of ideal fixed points on the image template and acquiring the coordinates of the ideal fixed points; shooting the actual fixed points on the product that correspond to the ideal fixed points and acquiring the actual fixed point coordinates; acquiring an angle difference according to the ideal fixed point coordinates and the actual fixed point coordinates; acquiring the coordinates of an ideal screw on the image template; acquiring an offset value of the ideal screw coordinate according to the ideal screw coordinate and the angle difference; and acquiring actual screw coordinates according to the offset value. The visual guidance method provided by the invention correctly guides the robot tool to the actual screw driving position, avoids deviation during screw driving, is simple to operate and is easy to realize.

Description

Visual guide method for screwing of robot
Technical Field
The invention relates to the technical field of visual guidance of screwing of robots, in particular to a visual guidance method for screwing of robots.
Background
In the prior art, screws are usually locked by manual operation, but this is inefficient and prone to positional deviation; some companies lock screws mechanically with a robot tool, i.e., the robot drives the screws, but the offset problem still arises easily.
Disclosure of Invention
In order to solve the problem that screw driving in the prior art is prone to deviation, a visual guidance method for robot screw driving is provided.
In order to achieve the above object, the present invention provides a visual guidance method for screwing by a robot, comprising the following steps:
establishing a camera image coordinate system and a robot coordinate system;
acquiring a mapping relation between a camera image coordinate system and a robot coordinate system;
acquiring a distance between a camera and a robot tool;
establishing an image template database, wherein the image template database comprises a plurality of image templates;
setting a plurality of ideal fixed points on the image template, and acquiring coordinates of the ideal fixed points;
shooting the actual fixed point on the product that corresponds to the ideal fixed point, and acquiring the actual fixed point coordinate;
acquiring an angle difference according to the ideal fixed point coordinate and the actual fixed point coordinate;
acquiring the coordinates of an ideal screw on the image template;
acquiring an offset value of the ideal screw coordinate according to the ideal screw coordinate and the angle difference;
and acquiring actual screw coordinates according to the offset value.
Preferably, the mapping relationship is determined by a robot nine-point calibration method.
Preferably, the step of acquiring the distance between the camera and the robot tool comprises:
arbitrarily taking a fixed point q in a camera shooting plane;
shooting the fixed point q with the camera, and acquiring the fixed-point pixel coordinate q(x, y, z) and the current first manipulator coordinate q'(x', y', z');
touching the fixed point q with the robot tool, and acquiring the current second manipulator coordinate q''(x'', y'', z'');
acquiring the distance Q between the camera and the robot tool from the first manipulator coordinate q'(x', y', z') and the second manipulator coordinate q''(x'', y'', z''):

Q = √((x'' − x')² + (y'' − y')²)
preferably, the step of obtaining the angular difference comprises:
arbitrarily taking two ideal fixed points m1, m2 on the image template, wherein the ideal fixed point coordinates are m1(x1, y1), m2(x2, y2);
according to the ideal fixed point coordinates m1(x1, y1), m2(x2, y2), obtaining the angle θ1 between the ideal fixed-point line and the X axis:

θ1 = arctan((y2 − y1)/(x2 − x1))

the actual fixed point coordinates corresponding to the ideal fixed points m1(x1, y1), m2(x2, y2) are m11(x11, y11), m22(x22, y22); obtaining the angle θ2 between the actual fixed-point line and the X axis:

θ2 = arctan((y22 − y11)/(x22 − x11))

the angle difference is Δθ = θ1 − θ2.
Preferably, the step of obtaining an offset value of ideal screw coordinates comprises:
acquiring the ideal screw coordinate P1(xP1, yP1) on the image template;
the offset value comprises Δx and Δy, and the offset value is obtained by:

Δx = (xP1 − x1)·cos Δθ + (yP1 − y1)·sin Δθ
Δy = (yP1 − y1)·cos Δθ − (xP1 − x1)·sin Δθ
preferably, the step of acquiring actual screw coordinates comprises:
the actual screw coordinate is P2(xP2, yP2), wherein:
xP2 = x11 + Δx;
yP2 = y11 + Δy.
the technical scheme of the invention has the following technical effects:
by adopting the technical scheme, the mapping relation between the camera image coordinate system and the robot coordinate system is obtained by establishing the camera image coordinate system and the robot coordinate system, and the image coordinate obtained by the camera and the coordinate of the robot tool are converted; the method comprises the steps of obtaining an angle difference between an actual fixed point and an ideal fixed point by obtaining an ideal fixed point coordinate and an actual fixed point coordinate, and obtaining an offset according to the ideal screw coordinate and the angle difference, so that the actual screw coordinate is determined. The visual guidance method provided by the invention can correctly guide the robot tool to reach the actual screw driving position, avoids the phenomenon of deviation in the screw driving process, is simple to operate and is easy to realize.
Drawings
FIG. 1 is a flow chart of a robot screwing visual guidance method involved in the present invention;
FIG. 2 is a schematic diagram of the positional relationship of the components of the robot involved in the present invention;
fig. 3 is an enlarged schematic view of portion I of fig. 2.
Detailed Description
The following describes an embodiment according to the present invention with reference to the drawings.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention; however, the invention may be practiced in ways other than those described here, and it is therefore not limited to the specific embodiments disclosed below.
The invention provides a visual guiding method for a robot to drive screws, aiming at solving the problem that the screws are easy to deviate.
As shown in fig. 1, the present invention provides a visual guiding method for screwing by a robot, comprising the following steps:
s1, establishing a camera image coordinate system and a robot coordinate system;
In this step, a camera image coordinate system oP-xPyPzP and a robot coordinate system oR-xRyRzR are established. The method is applied here to driving screws, but can also be applied to other practical tasks.
For example, the setup shown in fig. 2 and 3 may be used: the camera is mounted on the manipulator and moves with it, i.e., eye-in-hand. The camera image coordinate system oP-xPyPzP takes the optical axis of the camera as the zP axis and the optical center as the origin oP (not shown in the figure); the xP axis and yP axis are set in the plane perpendicular to the zP axis and passing through the origin. The robot coordinate system oR-xRyRzR takes the origin of the robot tool coordinate system as the origin oR (not shown in the figure); the xR axis and yR axis are set in the plane perpendicular to the zR axis and passing through the origin. Here oA-xAyAzA is the base coordinate system and oB-xByBzB is the calibration plate coordinate system.
S2, acquiring a mapping relation between a camera image coordinate system and a robot coordinate system;
further, the mapping relation is determined by a robot nine-point calibration method.
In this step, by providing a robot nine-point calibration method, the coordinates of the image acquired by the camera and the coordinates of the robot tool are converted.
As shown in fig. 3, the camera image coordinate P(xP, yP, zP) in the camera image coordinate system and the coordinate R(xR, yR, zR) of the robot tool in the robot coordinate system are acquired. Since the Z-axis direction and the relative distance between the camera image coordinate P and the robot tool coordinate R are fixed, the Z-axis components zP and zR need not be considered in subsequent acquisitions.
[xR; yR] = R·[xP; yP] + M

wherein R is the rotation and M is the displacement;
converting the formula:
xR = a·xP + b·yP + c;
yR = a'·xP + b'·yP + c';
Substituting three non-collinear calibration points with pixel coordinates (xP0, yP0), (xP1, yP1), (xP2, yP2) and robot coordinates (xR0, yR0), (xR1, yR1), (xR2, yR2) into the formula gives:

xR0 = a·xP0 + b·yP0 + c;  xR1 = a·xP1 + b·yP1 + c;  xR2 = a·xP2 + b·yP2 + c;
yR0 = a'·xP0 + b'·yP0 + c';  yR1 = a'·xP1 + b'·yP1 + c';  yR2 = a'·xP2 + b'·yP2 + c';

from which the six coefficients a, b, c, a', b', c' are solved.
The robot nine-point calibration method is adopted to obtain the mapping relation between the camera image coordinate system and the robot coordinate system, namely the coordinate transformation relation between the camera and the robot tool.
In this embodiment, a fixed point q(x, y, 1) is placed on the camera shooting plane, the camera shoots it and a template is made; the robot tool moves through nine points in sequence, shooting and matching at each, so that nine corresponding pairs of pixel coordinates and manipulator coordinates are acquired, and these nine pairs are used to establish an affine transformation matrix through the image algorithm library. The nine-point calibration yields the conversion matrix

T = [a b c; a' b' c'; 0 0 1]

and the pixel coordinate is converted to the manipulator coordinate q'(x', y', 1) by the conversion equation:

(x', y', 1)ᵀ = T·(x, y, 1)ᵀ
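The nine-point fit above amounts to estimating the 2×3 affine coefficients a, b, c, a', b', c' from point correspondences. The patent only says the matrix is built through an unnamed image algorithm library, so the sketch below is an assumption: a plain least-squares fit in NumPy, with illustrative function names and a synthetic point grid.

```python
import numpy as np

def fit_affine(pixel_pts, robot_pts):
    """Least-squares fit of [xR, yR]^T = A @ [xP, yP, 1]^T from
    corresponding pixel/robot point pairs (nine in the patent's method)."""
    P = np.column_stack([np.asarray(pixel_pts, float),
                         np.ones(len(pixel_pts))])   # shape (N, 3)
    R = np.asarray(robot_pts, float)                 # shape (N, 2)
    coeffs, *_ = np.linalg.lstsq(P, R, rcond=None)   # shape (3, 2)
    return coeffs.T                                  # rows: (a, b, c), (a', b', c')

def pixel_to_robot(A, p):
    """Apply the fitted affine map to one pixel coordinate."""
    return A @ np.array([p[0], p[1], 1.0])

# Synthetic check: nine grid points under a known affine map.
true_A = np.array([[0.10, 0.02, 5.0],
                   [-0.03, 0.11, 2.0]])
pixels = [(100.0 * i, 100.0 * j) for i in range(3) for j in range(3)]
robots = [pixel_to_robot(true_A, p) for p in pixels]
A = fit_affine(pixels, robots)
```

With noisy real measurements the same call returns the best-fit matrix in the least-squares sense, which is one reason to use nine points rather than the minimal three.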
s3, acquiring the distance between the camera and the robot tool;
further, the step of acquiring the distance between the camera and the robot tool comprises:
S30, arbitrarily selecting a fixed point q in the camera shooting plane;
S31, shooting the fixed point q with the camera, and acquiring the fixed-point pixel coordinate q(x, y, z) and the current first manipulator coordinate q'(x', y', z');
S32, touching the fixed point q with the robot tool, and acquiring the current second manipulator coordinate q''(x'', y'', z'');
S33, acquiring the distance Q between the camera and the robot tool from the first manipulator coordinate q'(x', y', z') and the second manipulator coordinate q''(x'', y'', z''):

Q = √((x'' − x')² + (y'' − y')²)

wherein z'' − z' = 0.
In this step, a fixed point is selected arbitrarily in the camera shooting plane and then touched by the robot tool, so that the distance from the camera center to the robot tool is obtained for use in subsequent point-position calculations.
In this embodiment, the acquired manipulator coordinates of the located mark are shifted to obtain its coordinates in the screw machine tool coordinate system. With the camera center as reference, (markX, markY) are the original coordinates; with the robot tool coordinate system as reference, they are the shifted coordinates. Since Δx = x'' − x', Δy = y'' − y' and Δz = z'' − z' = 0, the shifted coordinates are markX − Δx and markY − Δy.
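Steps S30 to S33 and the mark-coordinate shift can be sketched as follows; all numeric poses and mark coordinates are made-up illustrative values, not figures from the patent:

```python
import math

def camera_tool_offset(q_cam, q_tool):
    """q_cam: manipulator pose when the camera is centered over point q;
    q_tool: manipulator pose when the tool touches q.
    z is unchanged between the two poses, so the offset is planar."""
    dx = q_tool[0] - q_cam[0]
    dy = q_tool[1] - q_cam[1]
    return dx, dy, math.hypot(dx, dy)   # Δx, Δy, and straight-line distance Q

dx, dy, Q = camera_tool_offset((200.0, 100.0), (260.0, 180.0))

# Shift a mark located at the camera center into tool-centered coordinates:
markX, markY = 500.0, 400.0
shifted = (markX - dx, markY - dy)
```

The shift is applied once per calibration and reused for every subsequent mark the camera locates.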
S4, establishing an image template database, wherein the image template database comprises a plurality of image templates;
s5, setting a plurality of ideal fixed points on the image template, and acquiring the coordinates of the ideal fixed points;
in the step, an image template database is established, the image template database comprises a plurality of image templates, and a plurality of ideal fixed points are preset on the image templates and are used as the standard of the actual positions of the screws.
S6, shooting the actual fixed point on the product that corresponds to the ideal fixed point, and acquiring the actual fixed point coordinate;
s7, acquiring an angle difference according to the ideal fixed point coordinate and the actual fixed point coordinate;
further, the step of obtaining the angle difference comprises:
S70, arbitrarily selecting two ideal fixed points m1 and m2 on the image template, wherein the ideal fixed point coordinates are m1(x1, y1), m2(x2, y2);
S71, according to the ideal fixed point coordinates m1(x1, y1), m2(x2, y2), obtaining the angle θ1 between the ideal fixed-point line and the X axis:

θ1 = arctan((y2 − y1)/(x2 − x1))

S72, the actual fixed point coordinates corresponding to the ideal fixed points m1(x1, y1), m2(x2, y2) are m11(x11, y11), m22(x22, y22); obtaining the angle θ2 between the actual fixed-point line and the X axis:

θ2 = arctan((y22 − y11)/(x22 − x11))

S73, the angle difference is Δθ = θ1 − θ2.
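Steps S70 to S73 can be sketched as below. Using atan2 instead of the bare arctangent quotient avoids division by zero when the fixed-point line is vertical; that is an implementation choice assumed here, not stated in the patent.

```python
import math

def angle_difference(m1, m2, m11, m22):
    """Δθ between the template line m1→m2 and the matched product line
    m11→m22, each measured against the X axis."""
    theta1 = math.atan2(m2[1] - m1[1], m2[0] - m1[0])      # template angle θ1
    theta2 = math.atan2(m22[1] - m11[1], m22[0] - m11[0])  # product angle θ2
    return theta1 - theta2                                 # Δθ = θ1 − θ2

# Product rotated 30° clockwise relative to the template:
rot = -math.pi / 6
d_theta = angle_difference((0.0, 0.0), (10.0, 0.0),
                           (0.0, 0.0),
                           (10.0 * math.cos(rot), 10.0 * math.sin(rot)))
```

Here d_theta comes out to +π/6, i.e. the template is rotated 30° counterclockwise relative to the product, matching the Δθ = θ1 − θ2 convention.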
S8, acquiring the coordinates of the ideal screw on the image template;
s9, obtaining an offset value of the ideal screw coordinate according to the ideal screw coordinate and the angle difference;
further, the step of obtaining an offset value of ideal screw coordinates includes:
obtaining the ideal screw coordinate P1(xP1, yP1) on the image template;
the offset value comprises Δx and Δy, and the offset value is obtained by:

Δx = (xP1 − x1)·cos Δθ + (yP1 − y1)·sin Δθ
Δy = (yP1 − y1)·cos Δθ − (xP1 − x1)·sin Δθ
and S10, acquiring the actual screw coordinate according to the offset value.
Further, the step of acquiring actual screw coordinates includes:
the actual screw coordinate is P2(xP2, yP2), wherein:
xP2 = x11 + Δx;
yP2 = y11 + Δy.
In the above steps, the angle difference between the ideal fixed point and the actual fixed point yields the offset value between them, and the actual position of each screw-driving point is obtained from the ideal screw coordinate, so that offset during screw driving is avoided.
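The patent's offset formulas for steps S9 and S10 are images that did not survive extraction, so the sketch below shows one standard construction consistent with xP2 = x11 + Δx and yP2 = y11 + Δy: rotate the template vector from the ideal fixed point m1 to the ideal screw P1 by −Δθ (the product's rotation relative to the template, given Δθ = θ1 − θ2) and re-anchor it at the actual fixed point m11. The sign convention is an assumption, not the patent's stated formula.

```python
import math

def actual_screw(p1, m1, m11, d_theta):
    """Hypothetical reading of S9-S10: since Δθ = θ1 − θ2, the product is
    rotated by −Δθ relative to the template; rotate the template vector
    m1→P1 by −Δθ and add it to the actual fixed point m11."""
    vx, vy = p1[0] - m1[0], p1[1] - m1[1]
    c, s = math.cos(d_theta), math.sin(d_theta)
    dx = c * vx + s * vy        # Δx: rotation by −Δθ
    dy = -s * vx + c * vy       # Δy
    return m11[0] + dx, m11[1] + dy   # (xP2, yP2)

# Pure translation (Δθ = 0): the screw point simply shifts with the fixed point.
x, y = actual_screw(p1=(30.0, 10.0), m1=(0.0, 0.0),
                    m11=(5.0, 7.0), d_theta=0.0)
```

Any other convention (e.g. rotating by +Δθ) only flips the sign passed to the rotation; the re-anchoring at m11 is the part fixed by the patent's final equations.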
In summary, with the above technical solution, the actual screw coordinate is determined by obtaining the ideal fixed point coordinate and the actual fixed point coordinate, obtaining the angle difference between the actual fixed point and the ideal fixed point, and then obtaining the offset according to the ideal screw coordinate and the angle difference. The visual guidance method provided by the invention can correctly guide the robot tool to reach the actual screw driving position, avoids the phenomenon of deviation in the screw driving process, is simple to operate and is easy to realize.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (6)

1. A visual guide method for screwing by a robot is characterized by comprising the following steps:
establishing a camera image coordinate system and a robot coordinate system;
acquiring a mapping relation between the camera image coordinate system and a robot coordinate system;
acquiring a distance between a camera and a robot tool;
establishing an image template database, wherein the image template database comprises a plurality of image templates;
setting a plurality of ideal fixed points on the image template, and acquiring coordinates of the ideal fixed points;
shooting the actual fixed point on the product that corresponds to the ideal fixed point, and acquiring the actual fixed point coordinate;
acquiring an angle difference according to the ideal fixed point coordinate and the actual fixed point coordinate;
acquiring the coordinates of the ideal screw on the image template;
obtaining an offset value of the ideal screw coordinate according to the ideal screw coordinate and the angle difference;
and acquiring actual screw coordinates according to the offset value.
2. The visual guidance method for robot screwing according to claim 1, wherein said mapping relationship is determined by a robot nine-point calibration method.
3. The visual guidance method for robotic screwing according to claim 2, wherein said step of acquiring the distance between the camera and the robotic tool comprises:
arbitrarily taking a fixed point q in a camera shooting plane;
shooting the fixed point q with the camera, and acquiring the fixed-point pixel coordinate q(x, y, z) and the current first manipulator coordinate q'(x', y', z');
touching the fixed point q with the robot tool, and acquiring the current second manipulator coordinate q''(x'', y'', z'');
acquiring the distance Q between the camera and the robot tool from the first manipulator coordinate q'(x', y', z') and the second manipulator coordinate q''(x'', y'', z''):

Q = √((x'' − x')² + (y'' − y')²)
4. the visual guidance method for robotic screwing according to claim 3, wherein the step of acquiring said angular difference comprises:
arbitrarily taking two of the ideal fixed points m1, m2 on the image template, wherein the ideal fixed point coordinates are m1(x1, y1), m2(x2, y2);
according to the ideal fixed point coordinates m1(x1, y1), m2(x2, y2), obtaining the angle θ1 between the ideal fixed-point line and the X axis:

θ1 = arctan((y2 − y1)/(x2 − x1))

the actual fixed point coordinates corresponding to the ideal fixed points m1(x1, y1), m2(x2, y2) are m11(x11, y11), m22(x22, y22); obtaining the angle θ2 between the actual fixed-point line and the X axis:

θ2 = arctan((y22 − y11)/(x22 − x11))

the angle difference is Δθ = θ1 − θ2.
5. The visual guidance method for robotic screwing according to claim 4, wherein the step of obtaining an offset value of the ideal screw coordinates comprises:
acquiring the ideal screw coordinate P1(xP1, yP1) on the image template;
the offset value comprises Δx and Δy, and the offset value is obtained by:

Δx = (xP1 − x1)·cos Δθ + (yP1 − y1)·sin Δθ
Δy = (yP1 − y1)·cos Δθ − (xP1 − x1)·sin Δθ
6. the visual guidance method for robotic screwing according to claim 5, wherein the step of acquiring said actual screw coordinates comprises:
the actual screw coordinate is P2(xP2, yP2), wherein:
xP2 = x11 + Δx;
yP2 = y11 + Δy.
CN201911068409.6A 2019-11-05 2019-11-05 Visual guide method for screwing of robot Active CN110842919B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911068409.6A CN110842919B (en) 2019-11-05 2019-11-05 Visual guide method for screwing of robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911068409.6A CN110842919B (en) 2019-11-05 2019-11-05 Visual guide method for screwing of robot

Publications (2)

Publication Number Publication Date
CN110842919A true CN110842919A (en) 2020-02-28
CN110842919B CN110842919B (en) 2021-01-22

Family

ID=69598932

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911068409.6A Active CN110842919B (en) 2019-11-05 2019-11-05 Visual guide method for screwing of robot

Country Status (1)

Country Link
CN (1) CN110842919B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112312666A (en) * 2020-11-06 2021-02-02 浪潮电子信息产业股份有限公司 Circuit board screw driving method and system
CN112329530A (en) * 2020-09-30 2021-02-05 北京航空航天大学 Method, device and system for detecting mounting state of bracket
CN114178838A (en) * 2021-12-28 2022-03-15 上海稊米汽车科技有限公司 Multi-locking-point locking method and device applied to multi-plane workpiece
CN114683214A (en) * 2022-03-30 2022-07-01 武汉海微科技有限公司 Visual positioning method for automatically screwing vehicle-mounted screen shell
CN115582829A (en) * 2021-07-05 2023-01-10 腾讯科技(深圳)有限公司 Method and device for determining position of mechanical arm, electronic equipment and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160082557A1 (en) * 2014-09-22 2016-03-24 Kuka Systems Corporation North America Robotic apparatus and process for the installation of collars and nuts onto fasteners
CN108312144A (en) * 2017-12-25 2018-07-24 北京航天测控技术有限公司 Automatically lock pays control system and method for robot based on machine vision
CN108544531A (en) * 2018-04-12 2018-09-18 江苏科技大学 A kind of automatic chemical examination robot arm device, control system and its control method of view-based access control model calibration
US20180297717A1 (en) * 2017-04-18 2018-10-18 Electroimpact, Inc. Camera assisted robotic system for locating the end of a fastener extending through an aircraft part during manufacture thereof
CN110125926A (en) * 2018-02-08 2019-08-16 比亚迪股份有限公司 The workpiece of automation picks and places method and system
CN110315525A (en) * 2018-03-29 2019-10-11 天津工业大学 A kind of robot workpiece grabbing method of view-based access control model guidance

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160082557A1 (en) * 2014-09-22 2016-03-24 Kuka Systems Corporation North America Robotic apparatus and process for the installation of collars and nuts onto fasteners
US20180297717A1 (en) * 2017-04-18 2018-10-18 Electroimpact, Inc. Camera assisted robotic system for locating the end of a fastener extending through an aircraft part during manufacture thereof
CN108312144A (en) * 2017-12-25 2018-07-24 北京航天测控技术有限公司 Automatically lock pays control system and method for robot based on machine vision
CN110125926A (en) * 2018-02-08 2019-08-16 比亚迪股份有限公司 The workpiece of automation picks and places method and system
CN110315525A (en) * 2018-03-29 2019-10-11 天津工业大学 A kind of robot workpiece grabbing method of view-based access control model guidance
CN108544531A (en) * 2018-04-12 2018-09-18 江苏科技大学 A kind of automatic chemical examination robot arm device, control system and its control method of view-based access control model calibration

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112329530A (en) * 2020-09-30 2021-02-05 北京航空航天大学 Method, device and system for detecting mounting state of bracket
CN112329530B (en) * 2020-09-30 2023-03-21 北京航空航天大学 Method, device and system for detecting mounting state of bracket
CN112312666A (en) * 2020-11-06 2021-02-02 浪潮电子信息产业股份有限公司 Circuit board screw driving method and system
CN112312666B (en) * 2020-11-06 2023-08-15 浪潮电子信息产业股份有限公司 Circuit board screw driving method and system
CN115582829A (en) * 2021-07-05 2023-01-10 腾讯科技(深圳)有限公司 Method and device for determining position of mechanical arm, electronic equipment and storage medium
CN114178838A (en) * 2021-12-28 2022-03-15 上海稊米汽车科技有限公司 Multi-locking-point locking method and device applied to multi-plane workpiece
CN114178838B (en) * 2021-12-28 2024-03-22 上海稊米汽车科技有限公司 Multi-locking point locking method and equipment applied to multi-plane workpiece
CN114683214A (en) * 2022-03-30 2022-07-01 武汉海微科技有限公司 Visual positioning method for automatically screwing vehicle-mounted screen shell

Also Published As

Publication number Publication date
CN110842919B (en) 2021-01-22

Similar Documents

Publication Publication Date Title
CN110842919B (en) Visual guide method for screwing of robot
JP7207851B2 (en) Control method, robot system, article manufacturing method, program and recording medium
CN107073719B (en) Robot and robot system
JP6427972B2 (en) Robot, robot system and control device
CN109015630B (en) Hand-eye calibration method and system based on calibration point extraction and computer storage medium
CN104540648B (en) Have the working rig and electronic part mounting of articulated robot
US20160354929A1 (en) Robot, robot control device, and robot system
WO2020024178A1 (en) Hand-eye calibration method and system, and computer storage medium
JP2014180720A (en) Robot system and calibration method
JP5618770B2 (en) Robot calibration apparatus and calibration method
JP2016166872A (en) Vision system for training assembly system by virtual assembly of object
WO2021169855A1 (en) Robot correction method and apparatus, computer device, and storage medium
CN105323455B (en) A kind of location compensation method based on machine vision
CN113001535A (en) Automatic correction system and method for robot workpiece coordinate system
CN113379849B (en) Robot autonomous recognition intelligent grabbing method and system based on depth camera
JP2016187846A (en) Robot, robot controller and robot system
CN104552341A (en) Single-point multi-view meter-hanging posture error detecting method of mobile industrial robot
TWI699264B (en) Correction method of vision guided robotic arm
CN110465946B (en) Method for calibrating relation between pixel coordinate and robot coordinate
US20210154836A1 (en) Trajectory control device
CN113211431A (en) Pose estimation method based on two-dimensional code correction robot system
CN114347013A (en) Method for assembling printed circuit board and FPC flexible cable and related equipment
EP4101604A1 (en) System and method for improving accuracy of 3d eye-to-hand coordination of a robotic system
CN112238453A (en) Vision-guided robot arm correction method
CN117340879A (en) Industrial machine ginseng number identification method and system based on graph optimization model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant