CN114734451A - Visual positioning method combining articulated manipulator and camera posture change - Google Patents


Info

Publication number
CN114734451A
Authority
CN
China
Prior art keywords
manipulator
camera
calibration
visual positioning
positioning method
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210530700.6A
Other languages
Chinese (zh)
Other versions
CN114734451B (en)
Inventor
黎伟权
王后方
罗祥祝
Current Assignee
Shenzhen Shuangyi Photoelectric Technology Co ltd
Original Assignee
Shenzhen Shuangyi Photoelectric Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Shuangyi Photoelectric Technology Co ltd filed Critical Shenzhen Shuangyi Photoelectric Technology Co ltd
Priority to CN202210530700.6A priority Critical patent/CN114734451B/en
Publication of CN114734451A publication Critical patent/CN114734451A/en
Application granted granted Critical
Publication of CN114734451B publication Critical patent/CN114734451B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to the technical field of visual positioning, and in particular to a visual positioning method that combines an articulated manipulator with camera posture change. Compared with the prior art, the method meets the industry's new requirement of mounting the camera on the second arm of the manipulator: the coordinate relationship between the camera and the manipulator is calibrated at a single position, after which the manipulator can move to any position for processing. This saves the hardware resource of the manipulator's theta axis and simplifies the calibration workload.

Description

Visual positioning method combining articulated manipulator and camera posture change
[ technical field ]
The invention relates to the technical field of visual positioning, and in particular to a visual positioning method combining an articulated manipulator and camera posture change.
[ background of the invention ]
Screw-locking applications require the locking action at multiple positions on the same product. The prior art offers two approaches:
1. The camera is mounted on the theta axis of the manipulator and its posture is kept unchanged. Although the relationship between the camera and manipulator coordinate systems then needs to be calibrated at only one location before the manipulator can move elsewhere for processing, this occupies the theta-axis hardware resource, which could otherwise serve as the electric screwdriver axis for locking screws.
2. The camera is mounted on the second arm of the manipulator, and its posture changes as the manipulator moves. Calibration must then be performed at every screw-hole position, which increases the complexity of production operation.
[ summary of the invention ]
To overcome the above problems, the present invention provides a visual positioning method combining an articulated manipulator and camera posture change, which can effectively solve them.
The technical scheme adopted by the invention to solve the technical problems is a visual positioning method combining an articulated manipulator and camera posture change, comprising the following steps:
Step S1: the camera acquires a calibration image containing a nine-point grid;
Step S2: the manipulator is guided to touch, in sequence, the actual positions of the points on the calibration plate;
Step S3: the relationship between the camera and each joint of the manipulator is calculated;
Step S4: the positioning is calculated when the manipulator reaches a new working point.
Preferably, step S1 includes the following steps:
Step S11: a grid calibration plate is laid flat at a position referred to as calibration position A;
Step S12: the camera, mounted on the second arm of the manipulator, guides the manipulator to move so that the camera can observe the complete calibration position A;
Step S13: the camera captures all grid pixel coordinates in one shot;
Step S14: the manipulator world coordinates XY and joint angles J1 and J2 at this moment are read back.
Preferably, step S2 includes the following steps:
Step S21: following the sequence in which the camera captured the grid-point pixel coordinates, the manipulator is guided to touch each grid point;
Step S22: the manipulator coordinates at the moment of each touch are read back.
Preferably, step S3 includes the following steps:
Step S31: using the pixel coordinates of the grid points and the mechanical coordinates of the manipulator touching those points as point pairs, the camera-manipulator relationship is calibrated at calibration position A;
Step S32: using the manipulator world coordinates XY and joint angles J1 and J2 read back at calibration position A, the relative relationship between the camera and the manipulator posture at calibration position A is calculated.
Preferably, step S4 includes the following steps:
Step S41: the manipulator moves to a new working point B, and the camera captures the target pixel PixelXY in its field of view;
Step S42: the manipulator world coordinates XYb and joint angles J1b and J2b at this moment are read back and combined with the manipulator world coordinates XYa and joint angles J1a and J2a at calibration position A to calculate the manipulator world coordinates at which the screwdriver bit can touch screw B at the working position.
Compared with the prior art, the visual positioning method combining an articulated manipulator and camera posture change meets the industry's new requirement of mounting the camera on the second arm of the manipulator: the coordinate relationship between the camera and the manipulator is calibrated at a single position, after which the manipulator can move to any position for processing. This saves the hardware resource of the manipulator's theta axis and simplifies the calibration workload.
[ description of the drawings ]
FIG. 1 is a general flowchart of the visual positioning method combining articulated manipulator and camera posture change according to the present invention;
FIG. 2 is a flowchart of step S1 of the visual positioning method according to the present invention;
FIG. 3 is a flowchart of step S2 of the visual positioning method according to the present invention;
FIG. 4 is a flowchart of step S3 of the visual positioning method according to the present invention;
FIG. 5 is a schematic diagram of the camera-manipulator calibration process in the visual positioning method combining articulated manipulator and camera posture change according to the present invention.
[ detailed description ]
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
It should be noted that all directional indications (such as up, down, left, right, front, and back) in the embodiments of the present invention are limited to relative positions in a given view, not absolute positions.
In addition, the descriptions related to "first", "second", etc. in the present invention are only for descriptive purposes and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Referring to FIGS. 1 to 5, the visual positioning method combining articulated manipulator and camera posture change according to the present invention is applicable to all four-axis planar articulated manipulators; the posture of the camera changes with the motion of the manipulator. The method comprises the following steps:
Step S1: the camera acquires a calibration image containing a nine-point grid.
Step S1 includes the following steps:
Step S11: a grid calibration plate is laid flat at a position referred to as calibration position A;
Step S12: the camera, mounted on the second arm of the manipulator, guides the manipulator to move so that the camera can observe the complete calibration position A;
Step S13: the camera captures all grid pixel coordinates in one shot;
Step S14: the manipulator world coordinates XY and joint angles J1 and J2 at this moment are read back.
Calibration requires pixel-to-physical point pairs; the pixel points were generated in step S13, i.e., the pixel coordinates of each target in the grid image. The positions read back here are the manipulator pose at the moment of calibration, where XY are the world coordinates of the manipulator's working point and the joint angles J1 and J2 are the corresponding joint coordinates, expressed as angles.
Step S2: the manipulator is guided to touch, in sequence, the actual positions of the points on the calibration plate.
Step S2 includes the following steps:
Step S21: following the sequence in which the camera captured the grid-point pixel coordinates, the manipulator is guided to touch each grid point;
Step S22: the manipulator coordinates at the moment of each touch are read back.
Step S3: the relationship between the camera and each joint of the manipulator is calculated.
Step S3 includes the following steps:
Step S31: using the pixel coordinates of the grid points and the mechanical coordinates of the manipulator touching those points as point pairs, the camera-manipulator relationship is calibrated at calibration position A;
Step S32: using the manipulator world coordinates XY and joint angles J1 and J2 read back at calibration position A, the relative relationship between the camera and the manipulator posture at calibration position A is calculated.
In step S31, the calibration process between the camera and the manipulator is shown in FIG. 5:
The rectangular coordinate system is the manipulator coordinate system and the oblique coordinate system is the camera coordinate system. A physical point P has coordinates [Wx, Wy] in the manipulator coordinate system and [Vx, Vy] in the camera coordinate system; the origin of the camera coordinate system lies at [Tx, Ty] in the manipulator coordinate system; the two coordinate systems form an included angle θ, and there is a scale factor f between them. The following relationships then hold:
Wx = Vx*cos(θ)*f - Vy*sin(θ)*f + Tx;
Wy = Vx*sin(θ)*f + Vy*cos(θ)*f + Ty;
Letting u = cos(θ)*f and v = sin(θ)*f, the two equations above simplify to:
Wx = Vx*u - Vy*v + Tx;
Wy = Vx*v + Vy*u + Ty;
Expressed in matrix form:
[Wx]   [u  -v  Tx]   [Vx]
[Wy] = [v   u  Ty] * [Vy]
                     [ 1]
In the above formula, [Vx, Vy] and [Wx, Wy] form one pixel/physical point pair. A least-squares model is constructed from multiple such point pairs and solved for the optimal parameter values, which completes the calibration between the camera and manipulator coordinate systems.
In step S32, the camera and manipulator are calibrated while the manipulator is at the pose given by world coordinates XYa and joint angles J1a and J2a; together these form one complete set of correspondences.
Step S4: the positioning is calculated when the manipulator reaches a new working point.
Step S4 includes the following steps:
Step S41: the manipulator moves to a new working point B, and the camera captures the target pixel PixelXY in its field of view;
Step S42: the manipulator world coordinates XYb and joint angles J1b and J2b at this moment are read back and combined with the manipulator world coordinates XYa and joint angles J1a and J2a at calibration position A to calculate the manipulator world coordinates at which the screwdriver bit can touch screw B at the working position.
Step S42 includes the following steps:
Step S421: calculate the change in camera angle between working point B and calibration position A:
DeltaAngle = (J1a + J2a) - (J1b + J2b);
Step S422: calculate the world-coordinate offset between working point B and calibration position A:
DeltaXY = XYb - XYa;
Step S423: convert the pixel coordinates PixelXY to manipulator world coordinates through the camera-manipulator calibration relationship:
WorldXY = SensorToWorld(PixelXY);
Step S424: rotate WorldXY by DeltaAngle around the origin of the manipulator world coordinate system to obtain RotateXY:
RotateXY.x = WorldXY.x*cos(DeltaAngle) - WorldXY.y*sin(DeltaAngle);
RotateXY.y = WorldXY.x*sin(DeltaAngle) + WorldXY.y*cos(DeltaAngle);
Step S425: combine RotateXY and DeltaXY to obtain the world coordinates RealXY of the target for the screwdriver tip:
RealXY = RotateXY + DeltaXY.
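Steps S421 to S425 can be sketched as a single function. This is an illustrative Python sketch under stated assumptions, not the patent's implementation: `pixel_to_world_fn` stands in for the SensorToWorld conversion obtained from the calibration at position A, and a pose is assumed to be a tuple (X, Y, J1, J2) with the joint angles in radians:

```python
import math

def locate_target(pixel_to_world_fn, pose_a, pose_b, pixel_xy):
    """Compute RealXY, the manipulator world coordinates the screwdriver
    bit should touch at work point B, reusing the calibration done at A.

    pose_a, pose_b: (X, Y, J1, J2) manipulator world coordinates plus
    J1/J2 joint angles (radians) at calibration position A / work point B.
    """
    xa, ya, j1a, j2a = pose_a
    xb, yb, j1b, j2b = pose_b
    # S421: camera angle change between work point B and calibration point A
    delta_angle = (j1a + j2a) - (j1b + j2b)
    # S422: world-coordinate offset between B and A
    dx, dy = xb - xa, yb - ya
    # S423: pixel -> world through the calibration done at A
    wx, wy = pixel_to_world_fn(pixel_xy)
    # S424: rotate WorldXY by DeltaAngle about the world origin
    c, s = math.cos(delta_angle), math.sin(delta_angle)
    rx, ry = wx * c - wy * s, wx * s + wy * c
    # S425: RealXY = RotateXY + DeltaXY
    return rx + dx, ry + dy
```

When B coincides with A the rotation and offset both vanish and the function reduces to the plain calibrated pixel-to-world conversion, which is the single-position calibration property the method claims.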
the invention relates to a visual positioning method combining an articulated manipulator and camera posture change, which needs to use a visual positioning system, wherein the visual positioning system comprises an image acquisition camera and a light source matched with the image acquisition camera, and the image acquisition camera and the light source are both connected to a visual controller. The algorithm of the visual positioning method combining the joint type mechanical arm and the camera posture change is uploaded to the visual controller.
Compared with the prior art, the visual positioning method combining an articulated manipulator and camera posture change meets the industry's new requirement of mounting the camera on the second arm of the manipulator: the coordinate relationship between the camera and the manipulator is calibrated at a single position, after which the manipulator can move to any position for processing. This saves the hardware resource of the manipulator's theta axis and simplifies the calibration workload.
The above description is only a preferred embodiment of the present invention and is not intended to limit its scope; any modifications, equivalents, or improvements made within the spirit of the invention shall be included within the scope of the invention.

Claims (5)

1. A visual positioning method combining an articulated manipulator and camera posture change, characterized by comprising the following steps:
Step S1: the camera acquires a calibration image containing a nine-point grid;
Step S2: the manipulator is guided to touch, in sequence, the actual positions of the points on the calibration plate;
Step S3: the relationship between the camera and each joint of the manipulator is calculated;
Step S4: the positioning is calculated when the manipulator reaches a new working point.
2. The visual positioning method combining an articulated manipulator and camera posture change according to claim 1, wherein step S1 comprises the following steps:
Step S11: a grid calibration plate is laid flat at a position referred to as calibration position A;
Step S12: the camera, mounted on the second arm of the manipulator, guides the manipulator to move so that the camera can observe the complete calibration position A;
Step S13: the camera captures all grid pixel coordinates in one shot;
Step S14: the manipulator world coordinates XY and joint angles J1 and J2 at this moment are read back.
3. The visual positioning method combining an articulated manipulator and camera posture change according to claim 1, wherein step S2 comprises the following steps:
Step S21: following the sequence in which the camera captured the grid-point pixel coordinates, the manipulator is guided to touch each grid point;
Step S22: the manipulator coordinates at the moment of each touch are read back.
4. The visual positioning method combining an articulated manipulator and camera posture change according to claim 1, wherein step S3 comprises the following steps:
Step S31: using the pixel coordinates of the grid points and the mechanical coordinates of the manipulator touching those points as point pairs, the camera-manipulator relationship is calibrated at calibration position A;
Step S32: using the manipulator world coordinates XY and joint angles J1 and J2 read back at calibration position A, the relative relationship between the camera and the manipulator posture at calibration position A is calculated.
5. The visual positioning method combining an articulated manipulator and camera posture change according to claim 1, wherein step S4 comprises the following steps:
Step S41: the manipulator moves to a new working point B, and the camera captures the target pixel PixelXY in its field of view;
Step S42: the manipulator world coordinates XYb and joint angles J1b and J2b at this moment are read back and combined with the manipulator world coordinates XYa and joint angles J1a and J2a at calibration position A to calculate the manipulator world coordinates at which the screwdriver bit can touch screw B at the working position.
CN202210530700.6A 2022-05-16 2022-05-16 Visual positioning method combining joint type mechanical arm and camera posture change Active CN114734451B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210530700.6A CN114734451B (en) 2022-05-16 2022-05-16 Visual positioning method combining joint type mechanical arm and camera posture change

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210530700.6A CN114734451B (en) 2022-05-16 2022-05-16 Visual positioning method combining joint type mechanical arm and camera posture change

Publications (2)

Publication Number Publication Date
CN114734451A (en) 2022-07-12
CN114734451B (en) 2024-08-27

Family

ID=82285332

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210530700.6A Active CN114734451B (en) 2022-05-16 2022-05-16 Visual positioning method combining joint type mechanical arm and camera posture change

Country Status (1)

Country Link
CN (1) CN114734451B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6681151B1 (en) * 2000-12-15 2004-01-20 Cognex Technology And Investment Corporation System and method for servoing robots based upon workpieces with fiducial marks using machine vision
US20160059419A1 (en) * 2014-09-03 2016-03-03 Canon Kabushiki Kaisha Robot apparatus and method for controlling robot apparatus
CN109015630A (en) * 2018-06-21 2018-12-18 深圳辰视智能科技有限公司 Hand and eye calibrating method, system and the computer storage medium extracted based on calibration point
CN110000790A (en) * 2019-04-19 2019-07-12 深圳科瑞技术股份有限公司 A kind of scaling method of SCARA robot eye-to-hand hand-eye system
CN110490942A (en) * 2019-08-23 2019-11-22 苏州精速智能科技有限公司 A kind of mobile camera calibration method based on the second arm of SCARA manipulator
CN111098306A (en) * 2019-12-31 2020-05-05 广东利元亨智能装备股份有限公司 Calibration method and device of robot, robot and storage medium
CN111300422A (en) * 2020-03-17 2020-06-19 浙江大学 Robot workpiece grabbing pose error compensation method based on visual image
CN112692840A (en) * 2020-12-10 2021-04-23 安徽巨一科技股份有限公司 Mechanical arm positioning guiding and calibrating method based on machine vision cooperation


Also Published As

Publication number Publication date
CN114734451B (en) 2024-08-27


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant