CN110842919B - Visual guide method for screwing of robot - Google Patents
- Publication number: CN110842919B (application CN201911068409.6A)
- Authority
- CN
- China
- Prior art keywords
- fixed point
- acquiring
- ideal
- coordinate
- robot
- Prior art date
- Legal status: Active (an assumption, not a legal conclusion)
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B23—MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
- B23P—METAL-WORKING NOT OTHERWISE PROVIDED FOR; COMBINED OPERATIONS; UNIVERSAL MACHINE TOOLS
- B23P19/00—Machines for simply fitting together or separating metal parts or objects, or metal and non-metal parts, whether or not involving some deformation; Tools or devices therefor so far as not provided for in other classes
- B23P19/04—Machines for simply fitting together or separating metal parts or objects, or metal and non-metal parts, whether or not involving some deformation; Tools or devices therefor so far as not provided for in other classes for assembling or disassembling parts
- B23P19/06—Screw or nut setting or loosening machines
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/04—Viewing devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/02—Programme-controlled manipulators characterised by movement of the arms, e.g. cartesian coordinate type
- B25J9/023—Cartesian coordinate type
Abstract
The invention provides a visual guidance method for screw driving by a robot, which comprises the following steps: establishing a camera image coordinate system and a robot coordinate system; acquiring the mapping relation between the camera image coordinate system and the robot coordinate system; acquiring the distance between the camera and the robot tool; establishing an image template database comprising a plurality of image templates; setting a plurality of ideal fixed points on the image template and acquiring the ideal fixed point coordinates; photographing, on the product, the actual fixed points corresponding to the ideal fixed points and acquiring the actual fixed point coordinates; acquiring an angle difference from the ideal fixed point coordinates and the actual fixed point coordinates; acquiring the ideal screw coordinates on the image template; acquiring an offset value from the ideal screw coordinates and the angle difference; and acquiring the actual screw coordinates from the offset value. The visual guidance method provided by the invention can correctly guide the robot tool to the actual screw driving position, avoids deviation during screw driving, is simple to operate and easy to implement.
Description
Technical Field
The invention relates to the technical field of visual guidance for robot screw driving, and in particular to a visual guidance method for screw driving by a robot.
Background
In the prior art, screws are usually locked manually, which is inefficient and prone to positional deviation; some companies lock screws mechanically with a robot tool, i.e., use a robot to drive the screws, but this approach is also prone to deviation.
Disclosure of Invention
In order to solve the problem that screw driving in the prior art is prone to deviation, the invention provides a visual guiding method for robot screw driving.
In order to achieve the above object, the present invention provides a visual guidance method for screwing by a robot, comprising the following steps:
establishing a camera image coordinate system and a robot coordinate system;
acquiring a mapping relation between a camera image coordinate system and a robot coordinate system;
acquiring a distance between a camera and a robot tool;
establishing an image template database, wherein the image template database comprises a plurality of image templates;
setting a plurality of ideal fixed points on the image template, and acquiring coordinates of the ideal fixed points;
photographing, on the product, the actual fixed points corresponding to the ideal fixed points, and acquiring the actual fixed point coordinates;
acquiring an angle difference according to the ideal fixed point coordinate and the actual fixed point coordinate;
acquiring the coordinates of an ideal screw on the image template;
acquiring an offset value of the ideal screw coordinate according to the ideal screw coordinate and the angle difference;
and acquiring actual screw coordinates according to the offset value.
Preferably, the mapping relationship is determined by a robot nine-point calibration method.
Preferably, the step of acquiring the distance between the camera and the robot tool comprises:
arbitrarily taking a fixed point q in a camera shooting plane;
shooting the fixed point q by the camera, and acquiring the fixed-point pixel coordinate q(x, y, z) and the current first manipulator coordinate q′(x′, y′, z′);
the robot tool touches the fixed point q, and the current second manipulator coordinate q″(x″, y″, z″) is acquired;
acquiring the distance Q between the camera and the robot tool from the first manipulator coordinate q′(x′, y′, z′) and the second manipulator coordinate q″(x″, y″, z″): Q = q″ − q′, i.e. Δx = x″ − x′, Δy = y″ − y′, Δz = z″ − z′ = 0.
preferably, the step of obtaining the angular difference comprises:
arbitrarily taking two ideal fixed points m1, m2 on the image template, wherein the ideal fixed point coordinates are m1(x1, y1), m2(x2, y2);
according to the ideal fixed point coordinates m1(x1, y1), m2(x2, y2), obtaining the angle θ1 between the line through the ideal fixed points and the X axis: θ1 = arctan((y2 − y1)/(x2 − x1));
for the actual fixed point coordinates m11(x11, y11), m22(x22, y22) corresponding to the ideal fixed points m1(x1, y1), m2(x2, y2), obtaining the angle θ2 between the line through the actual fixed points and the X axis: θ2 = arctan((y22 − y11)/(x22 − x11));
the angle difference is Δθ, and Δθ = θ1 − θ2.
Preferably, the step of obtaining an offset value of ideal screw coordinates comprises:
obtaining the ideal screw coordinate P1(xP1, yP1) on the image template;
the offset value comprises Δx and Δy, and is obtained from the ideal screw coordinate and the angle difference Δθ.
preferably, the step of acquiring actual screw coordinates comprises:
the actual screw coordinate is P2(xP2, yP2), wherein:
xP2 = x11 + Δx;
yP2 = y11 + Δy.
the technical scheme of the invention has the following technical effects:
With the above technical solution, the camera image coordinate system and the robot coordinate system are established, the mapping relation between them is acquired, and the image coordinates acquired by the camera are converted into robot tool coordinates. The angle difference between the actual and ideal fixed points is obtained from the ideal and actual fixed point coordinates, and the offset is then obtained from the ideal screw coordinates and the angle difference, so that the actual screw coordinates are determined. The visual guidance method provided by the invention can correctly guide the robot tool to the actual screw driving position, avoids deviation during screw driving, is simple to operate and easy to implement.
Drawings
FIG. 1 is a flow chart of a robot screwing visual guidance method involved in the present invention;
FIG. 2 is a schematic diagram of the positional relationship of the components of the robot involved in the present invention;
fig. 3 is an enlarged schematic view of portion i of fig. 2.
Detailed Description
The following describes an embodiment according to the present invention with reference to the drawings.
In the following description, numerous specific details are set forth to provide a thorough understanding of the present invention; however, the invention may be practiced in ways other than those described herein, and it is therefore not limited to the specific embodiments disclosed below.
The invention provides a visual guiding method for a robot to drive screws, aiming at solving the problem that the screws are easy to deviate.
As shown in fig. 1, the present invention provides a visual guiding method for screwing by a robot, comprising the following steps:
s1, establishing a camera image coordinate system and a robot coordinate system;
In this step, a camera image coordinate system oP-xPyPzP and a robot coordinate system oR-xRyRzR are established. The method is applied here to screw driving, but it can also be used in other practical applications.
For example, the setup shown in fig. 2 and 3 may be used: the camera is mounted on the manipulator and moves with it, i.e., eye-in-hand. The camera image coordinate system oP-xPyPzP takes the optical axis of the camera as the zP axis and the optical center as the origin oP (not shown in the figure); the xP and yP axes are set in the plane perpendicular to the zP axis and passing through the origin. The robot coordinate system oR-xRyRzR takes the origin of the robot tool coordinate system as the origin oR (not shown in the figure); the xR and yR axes are set in the plane perpendicular to the zR axis and passing through the origin. Here oA-xAyAzA is the base coordinate system and oB-xByBzB is the calibration plate coordinate system.
S2, acquiring a mapping relation between a camera image coordinate system and a robot coordinate system;
further, the mapping relation is determined by a robot nine-point calibration method.
In this step, the robot nine-point calibration method is used to convert between the image coordinates acquired by the camera and the coordinates of the robot tool.
As shown in fig. 3, the camera image coordinate P(xP, yP, zP) in the camera image coordinate system and the coordinate R(xR, yR, zR) of the robot tool in the robot coordinate system are acquired. Since the Z-axis direction and the relative distance between the camera image coordinate P and the robot tool coordinate R are fixed, the Z-axis data zP and zR need not be considered in subsequent acquisitions.
The conversion from the camera image coordinate to the robot coordinate can be written with a rotation R and a displacement M; in component form the conversion formulas are:
xR = a·xP + b·yP + c;
yR = a′·xP + b′·yP + c′.
From these formulas and the calibration point pairs, the coefficients a, b, c and a′, b′, c′ are obtained.
The robot nine-point calibration method is adopted to obtain the mapping relation between the camera image coordinate system and the robot coordinate system, namely the coordinate transformation relation between the camera and the robot tool.
In this embodiment, a fixed point q(x, y, 1) is placed in the camera shooting plane, the camera shoots it and makes a template, the robot tool moves through nine points in sequence for shooting and matching, nine pairs of pixel coordinates and manipulator coordinates are acquired correspondingly, and an affine transformation matrix is built from the nine coordinate pairs using an image algorithm library. The conversion matrix obtained from the robot nine-point calibration converts the pixel coordinates into the manipulator coordinates q′(x′, y′, 1).
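For illustration only, the nine-point calibration above can be sketched as a least-squares affine fit from pixel coordinates to manipulator coordinates. The function names and sample points below are ours, not from the patent; a real setup would use the nine measured pairs:

```python
import numpy as np

def fit_affine(pixel_pts, robot_pts):
    """Fit x_R = a*x_P + b*y_P + c and y_R = a'*x_P + b'*y_P + c'
    by least squares from corresponding point pairs (the patent uses
    nine pairs; three non-collinear pairs are the mathematical minimum)."""
    pixel_pts = np.asarray(pixel_pts, dtype=float)
    robot_pts = np.asarray(robot_pts, dtype=float)
    # Homogeneous design matrix: one row [x_P, y_P, 1] per point
    A = np.hstack([pixel_pts, np.ones((len(pixel_pts), 1))])
    coeffs, *_ = np.linalg.lstsq(A, robot_pts, rcond=None)
    return coeffs  # shape (3, 2): columns are (a, b, c) and (a', b', c')

def pixel_to_robot(coeffs, q):
    """Convert a pixel coordinate q = (x, y) to a manipulator coordinate."""
    x, y = q
    return np.array([x, y, 1.0]) @ coeffs

# Hypothetical calibration pairs: a pure translation by (100, 200)
pix = [(0, 0), (1, 0), (0, 1), (1, 1)]
rob = [(100, 200), (101, 200), (100, 201), (101, 201)]
M = fit_affine(pix, rob)
print(pixel_to_robot(M, (0.5, 0.5)))  # ≈ [100.5 200.5]
```

OpenCV's `cv2.estimateAffine2D` performs an equivalent fit in one call; the explicit least-squares version is shown here to mirror the patent's coefficient formulation.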
s3, acquiring the distance between the camera and the robot tool;
further, the step of acquiring the distance between the camera and the robot tool comprises:
s30, arbitrarily selecting a fixed point q in the camera photographing plane;
s31, shooting the fixed point q by the camera, and acquiring the fixed-point pixel coordinate q(x, y, z) and the current first manipulator coordinate q′(x′, y′, z′);
s32, the robot tool touches the fixed point q, and the current second manipulator coordinate q″(x″, y″, z″) is acquired;
s33, acquiring the distance Q between the camera and the robot tool from the first manipulator coordinate q′(x′, y′, z′) and the second manipulator coordinate q″(x″, y″, z″);
wherein z′ = 0 and z″ = 0.
In this step, a fixed point is arbitrarily selected in the camera photographing plane and the robot tool touches it, so that the distance from the camera center to the robot tool is obtained for calculating subsequent point positions.
In this embodiment, the acquired manipulator coordinates of the grasped object are shifted to obtain its coordinates in the screw machine tool coordinate system. With the camera center as the reference, MarkX and MarkY are the original coordinates; with the robot tool coordinate system as the reference, markX and markY are the shifted coordinates. Since Δx = x″ − x′, Δy = y″ − y′ and Δz = z″ − z′ = 0, it follows that markX = MarkX − Δx and markY = MarkY − Δy.
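A minimal sketch of this camera-to-tool shift, under the patent's assumption that Δz = 0 (the function names are illustrative, not from the source):

```python
def camera_tool_offset(q_photo, q_touch):
    """Per-axis offset between the manipulator pose q' at which the camera
    images the fixed point and the pose q'' at which the tool touches it."""
    dx = q_touch[0] - q_photo[0]
    dy = q_touch[1] - q_photo[1]
    return (dx, dy)  # delta-z is zero by construction

def mark_to_tool(mark_xy, offset):
    """Shift a camera-centered mark coordinate (MarkX, MarkY) into the
    robot tool frame: markX = MarkX - dx, markY = MarkY - dy."""
    dx, dy = offset
    return (mark_xy[0] - dx, mark_xy[1] - dy)

# Hypothetical poses: photographing at (10, 20), touching at (15, 28)
off = camera_tool_offset((10.0, 20.0, 0.0), (15.0, 28.0, 0.0))
print(off)                                 # (5.0, 8.0)
print(mark_to_tool((100.0, 100.0), off))   # (95.0, 92.0)
```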
S4, establishing an image template database, wherein the image template database comprises a plurality of image templates;
s5, setting a plurality of ideal fixed points on the image template, and acquiring the coordinates of the ideal fixed points;
In this step, an image template database comprising a plurality of image templates is established, and a plurality of ideal fixed points are preset on the image templates as the reference for the actual screw positions.
S6, photographing, on the product, the actual fixed points corresponding to the ideal fixed points, and acquiring the actual fixed point coordinates;
s7, acquiring an angle difference according to the ideal fixed point coordinate and the actual fixed point coordinate;
further, the step of obtaining the angle difference comprises:
s70, arbitrarily selecting two ideal fixed points m1 and m2 on the image template, wherein the ideal fixed point coordinates are m1(x1, y1), m2(x2, y2);
s71, according to the ideal fixed point coordinates m1(x1, y1), m2(x2, y2), obtaining the angle θ1 between the line through the ideal fixed points and the X axis: θ1 = arctan((y2 − y1)/(x2 − x1));
s72, for the actual fixed point coordinates m11(x11, y11), m22(x22, y22) corresponding to the ideal fixed points m1(x1, y1), m2(x2, y2), obtaining the angle θ2 between the line through the actual fixed points and the X axis: θ2 = arctan((y22 − y11)/(x22 − x11));
s73, the angle difference is Δθ, and Δθ = θ1 − θ2.
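Steps s70 to s73 amount to comparing the orientation of the line through the two fixed points on the template with its orientation on the actual part. A sketch (atan2 is used instead of a plain arctangent to keep the quadrant right; the function names are ours):

```python
import math

def angle_deg(p1, p2):
    """Angle between the line p1 -> p2 and the X axis, in degrees."""
    return math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))

def angle_difference(m1, m2, m11, m22):
    """Delta-theta = theta1 - theta2: template orientation minus actual orientation."""
    return angle_deg(m1, m2) - angle_deg(m11, m22)

# Template points lie on the X axis; the actual part is rotated by -30 degrees.
m11 = (0.0, 0.0)
m22 = (math.cos(-math.pi / 6), math.sin(-math.pi / 6))
dtheta = angle_difference((0.0, 0.0), (1.0, 0.0), m11, m22)
print(round(dtheta, 6))  # 30.0
```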
S8, acquiring the coordinates of the ideal screw on the image template;
s9, obtaining an offset value of the ideal screw coordinate according to the ideal screw coordinate and the angle difference;
further, the step of obtaining an offset value of ideal screw coordinates includes:
obtaining the ideal screw coordinate P1(xP1, yP1) on the image template;
the offset value comprises Δx and Δy, and is obtained from the ideal screw coordinate and the angle difference Δθ.
and S10, acquiring the actual screw coordinate according to the offset value.
Further, the step of acquiring actual screw coordinates includes:
the actual screw coordinate is P2(xP2, yP2), wherein:
xP2 = x11 + Δx;
yP2 = y11 + Δy.
In the above steps, the angle difference between the ideal and actual fixed points is obtained in order to derive the offset value between the actual and ideal positions, and the actual position of each screw driving point is then obtained from the ideal screw coordinates, so that deviation during screw driving is avoided.
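The source does not reproduce the offset formula itself, but one consistent reading of steps S9 and S10 is to rotate the template vector from the ideal fixed point to the ideal screw by the angle difference and attach it to the actual fixed point. A sketch under that assumption (the function name and the sign convention are ours):

```python
import math

def actual_screw(m1, m11, p1, dtheta_deg):
    """Hypothetical reconstruction of S9/S10: rotate the vector from the
    ideal fixed point m1 to the ideal screw p1 by -dtheta (since
    dtheta = theta1 - theta2, the actual part is rotated by -dtheta
    relative to the template), then attach it to the actual fixed point m11."""
    t = math.radians(-dtheta_deg)
    vx, vy = p1[0] - m1[0], p1[1] - m1[1]
    dx = vx * math.cos(t) - vy * math.sin(t)   # offset delta-x
    dy = vx * math.sin(t) + vy * math.cos(t)   # offset delta-y
    return (m11[0] + dx, m11[1] + dy)          # xP2 = x11 + dx, yP2 = y11 + dy

# Ideal screw one unit right of the ideal fixed point; part rotated by -30 degrees.
p2 = actual_screw((0.0, 0.0), (10.0, 10.0), (1.0, 0.0), 30.0)
print(p2)  # ≈ (10.866, 9.5)
```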
In summary, with the above technical solution, the ideal and actual fixed point coordinates are acquired, the angle difference between the actual and ideal fixed points is obtained, and the offset is then obtained from the ideal screw coordinates and the angle difference, thereby determining the actual screw coordinates. The visual guidance method provided by the invention can correctly guide the robot tool to the actual screw driving position, avoids deviation during screw driving, is simple to operate and easy to implement.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (6)
1. A visual guide method for screwing by a robot is characterized by comprising the following steps:
establishing a camera image coordinate system and a robot coordinate system;
acquiring a mapping relation between the camera image coordinate system and a robot coordinate system;
acquiring a distance between a camera and a robot tool;
establishing an image template database, wherein the image template database comprises a plurality of image templates;
setting a plurality of ideal fixed points on the image template, and acquiring coordinates of the ideal fixed points;
photographing, on the product, the actual fixed points corresponding to the ideal fixed points, and acquiring actual fixed point coordinates;
acquiring an angle θ1 between the ideal fixed points and the X axis according to the ideal fixed point coordinates; acquiring an angle θ2 between the actual fixed points and the X axis according to the actual fixed point coordinates; and obtaining an angle difference Δθ, wherein Δθ = θ1 − θ2;
acquiring the coordinates of the ideal screw on the image template;
obtaining an offset value of the ideal screw coordinate according to the ideal screw coordinate and the angle difference;
and acquiring actual screw coordinates according to the offset value.
2. The visual guidance method for robot screwing according to claim 1, wherein said mapping relationship is determined by a robot nine-point calibration method.
3. The visual guidance method for robotic screwing according to claim 2, wherein said step of acquiring the distance between the camera and the robotic tool comprises:
arbitrarily taking a fixed point q in a camera shooting plane;
the camera shoots the fixed point q, and the fixed-point pixel coordinate q(x, y, z) and the current first manipulator coordinate q′(x′, y′, z′) are acquired;
the robot tool touches the fixed point q, and the current second manipulator coordinate q″(x″, y″, z″) is acquired;
acquiring the distance Q between the camera and the robot tool from the first manipulator coordinate q′(x′, y′, z′) and the second manipulator coordinate q″(x″, y″, z″).
4. the visual guidance method for robotic screwing according to claim 3, wherein the step of acquiring said angular difference comprises:
taking any two of the ideal fixed points m1, m2 on the image template, wherein the ideal fixed point coordinates are m1(x1, y1), m2(x2, y2);
according to the ideal fixed point coordinates m1(x1, y1), m2(x2, y2), obtaining the angle θ1 between the line through the ideal fixed points and the X axis;
for the actual fixed point coordinates m11(x11, y11), m22(x22, y22) corresponding to the ideal fixed points m1(x1, y1), m2(x2, y2), obtaining the angle θ2 between the line through the actual fixed points and the X axis;
the angle difference is Δθ, and Δθ = θ1 − θ2.
5. The visual guidance method for robotic screwing according to claim 4, wherein the step of obtaining an offset value of the ideal screw coordinates comprises:
acquiring the ideal screw coordinate P1(xP1, yP1) on the image template;
the offset value comprises Δx and Δy, and is obtained from the ideal screw coordinate and the angle difference Δθ.
6. the visual guidance method for robotic screwing according to claim 5, wherein the step of acquiring said actual screw coordinates comprises:
the actual screw coordinate is P2(xP2, yP2), wherein:
xP2 = x11 + Δx;
yP2 = y11 + Δy.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911068409.6A CN110842919B (en) | 2019-11-05 | 2019-11-05 | Visual guide method for screwing of robot |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110842919A CN110842919A (en) | 2020-02-28 |
CN110842919B true CN110842919B (en) | 2021-01-22 |
Family
ID=69598932
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911068409.6A Active CN110842919B (en) | 2019-11-05 | 2019-11-05 | Visual guide method for screwing of robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110842919B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112329530B (en) * | 2020-09-30 | 2023-03-21 | 北京航空航天大学 | Method, device and system for detecting mounting state of bracket |
CN112312666B (en) * | 2020-11-06 | 2023-08-15 | 浪潮电子信息产业股份有限公司 | Circuit board screw driving method and system |
CN114178838B (en) * | 2021-12-28 | 2024-03-22 | 上海稊米汽车科技有限公司 | Multi-locking point locking method and equipment applied to multi-plane workpiece |
CN114683214A (en) * | 2022-03-30 | 2022-07-01 | 武汉海微科技有限公司 | Visual positioning method for automatically screwing vehicle-mounted screen shell |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160082557A1 (en) * | 2014-09-22 | 2016-03-24 | Kuka Systems Corporation North America | Robotic apparatus and process for the installation of collars and nuts onto fasteners |
US10435179B2 (en) * | 2017-04-18 | 2019-10-08 | Electroimpact, Inc. | Camera assisted robotic system for locating the end of a fastener extending through an aircraft part during manufacture thereof |
CN108312144B (en) * | 2017-12-25 | 2020-10-20 | 北京航天测控技术有限公司 | Robot automatic locking control system and method based on machine vision |
CN110125926B (en) * | 2018-02-08 | 2021-03-26 | 比亚迪股份有限公司 | Automatic workpiece picking and placing method and system |
CN110315525A (en) * | 2018-03-29 | 2019-10-11 | 天津工业大学 | A kind of robot workpiece grabbing method of view-based access control model guidance |
CN108544531B (en) * | 2018-04-12 | 2020-11-10 | 江苏科技大学 | Automatic chemical examination mechanical arm device based on visual calibration, control system and control method thereof |
- 2019-11-05: CN application CN201911068409.6A, granted as patent CN110842919B, status Active
Also Published As
Publication number | Publication date |
---|---|
CN110842919A (en) | 2020-02-28 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||