CN112894823B - Robot high-precision assembling method based on visual servo - Google Patents

Robot high-precision assembling method based on visual servo

Info

Publication number
CN112894823B
CN112894823B (application CN202110170583.2A; published as CN112894823A)
Authority
CN
China
Prior art keywords
pose
mechanical arm
image
workpiece
moving
Prior art date
Legal status
Active
Application number
CN202110170583.2A
Other languages
Chinese (zh)
Other versions
CN112894823A (en)
Inventor
袁顺宁
张彪
韩峰涛
曹华
庹华
耿旭达
李亚楠
任赜宇
Current Assignee
Rokae Shandong Intelligent Technology Co ltd
Original Assignee
Rokae Shandong Intelligent Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Rokae Shandong Intelligent Technology Co ltd
Priority to CN202110170583.2A
Publication of CN112894823A
Application granted
Publication of CN112894823B
Legal status: Active

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems

Abstract

The invention provides a high-precision robot assembly method based on visual servoing, which comprises the following steps: performing teaching and setting work; after the teaching and setting work is finished, moving the mechanical arm to the grabbing area to grab a workpiece; moving the mechanical arm to a position above the first camera, performing visual servo motion with the first camera and the mechanical arm until the pose of the workpiece in the image is M_b, and recording the relative pose T_d of the mechanical arm with respect to the pose T_b; photographing with the second camera, calculating the pose of the assembly groove on the image, and calculating the expected pose M_e of the visual mark on the image from M_c; moving the mechanical arm into the view of the second camera and performing visual servo motion with the second camera and the mechanical arm until the pose of the visual mark in the image is M_e; converting the relative pose T_d into a pose in the coordinate system of the tail end of the mechanical arm at that moment, and moving the mechanical arm by that pose; and moving the mechanical arm by the relative pose T_c, whereby the workpiece is accurately placed in the assembly groove.

Description

Robot high-precision assembling method based on visual servo
Technical Field
The invention relates to the technical field of industrial robots, in particular to a high-precision robot assembling method based on visual servo.
Background
With labor costs rising continuously, the assembly-line sector has seen a wave of automation. In modern automated production lines, the "pick-and-place" assembly action is usually performed by an industrial robot. To complete an assembly task, the robot must know the initial pose of the workpiece before it is manipulated and its target pose afterwards.
In a simple application scenario, the initial pose and the target pose of the workpiece are specified in advance, and the robot merely executes a fixed program.
In most practical application scenarios, however, the initial pose or the target pose of the workpiece is not strictly fixed. In such cases, vision-guided positioning is an ideal solution: the robot perceives changes in the working environment through a vision system and obtains the initial pose of the workpiece before manipulation and its target pose after manipulation, which guarantees completion of the task. This gives the production line a degree of flexibility, so that even if some error exists in an upstream operation, the assembly is not greatly affected. Note that the relationship between the camera and the robot must be calibrated before vision guidance can be used for positioning.
In some assembly scenarios that require high accuracy, simple vision-guided positioning does not completely solve the problem. First, computing the initial pose of the workpiece before manipulation and its target pose after manipulation in the robot coordinate system depends entirely on the calibration result between the camera and the robot, and that result is made inaccurate by the absolute positioning error of the robot, the computation of visual feature points during calibration, and other factors. Second, the workpiece may shift slightly at the moment it is gripped by the robot gripper, which introduces a positioning error.
Disclosure of Invention
The object of the present invention is to solve at least one of the technical drawbacks mentioned.
Therefore, the invention aims to provide a robot high-precision assembling method based on visual servo.
In order to achieve the above object, an embodiment of the present invention provides a high-precision robot assembling method based on visual servoing, including the following steps:
step S1, teaching and setting work is performed, including:
step S11, calibrating the motion relation between the image pixels of the two cameras and the robot respectively;
step S12, positioning the workpiece, the assembly groove and the visual mark;
step S13, determining the reference pose of the grabbed workpiece on the image;
step S14, determining the pose of the visual mark on the image relative to the assembly groove and the relative pose of the tail end of the mechanical arm during placement;
step S2, after the teaching and setting work is finished, the mechanical arm is moved to a grabbing area to grab a workpiece;
step S3, moving the mechanical arm to a position above the first camera, performing visual servo motion with the first camera and the mechanical arm until the pose of the workpiece in the image is M_b, and recording the relative pose T_d of the mechanical arm with respect to the pose T_b;
step S4, photographing with the second camera, calculating the pose of the assembly groove on the image, and calculating the expected pose M_e of the visual mark on the image from M_c;
step S5, moving the mechanical arm into the view of the second camera, and performing visual servo motion with the second camera and the mechanical arm until the pose of the visual mark in the image is M_e;
step S6, converting the relative pose T_d into a pose in the coordinate system of the tail end of the mechanical arm at that moment, and moving the mechanical arm by that pose;
and step S7, moving the mechanical arm by the relative pose T_c, whereby the workpiece is accurately placed in the assembly groove.
Further, in the step S11, the robot moves through k positions, and the robot displacements (x_1, y_1), (x_2, y_2), ..., (x_k, y_k) together with the corresponding displacements (u_1, v_1), (u_2, v_2), ..., (u_k, v_k) of the visual mark center point on the image are recorded; wherein k ≥ 2;
the transformation relationship between (u, v) and (x, y) is calculated using least squares and is denoted as M_ca.
Further, for the workpiece and the assembly groove, training is performed with an image template matching algorithm so that the system identifies the poses (x, y, θ) of the workpiece to be assembled and of the assembly groove in the image;
for the visual mark, a blob detection algorithm is first used to find the circle on the mark, and the position and direction of its center point are then determined by fitting the two lines inside the circle.
Further, in the step S13,
moving the mechanical arm to a fixed pose in a grabbing area and grabbing a workpiece;
moving the mechanical arm to a position above the first camera;
and the first camera takes a picture and calculates the pose M_b of the workpiece on the image, while the pose T_b of the tail end of the mechanical arm at that moment is recorded.
Further, in the step S14,
the second camera shoots and calculates the pose of the assembly groove on the image;
moving the mechanical arm to a position above the assembly groove, taking a picture with the second camera and calculating the pose M_c of the visual mark on the image relative to the assembly groove;
and moving the mechanical arm to place the workpiece into the assembly groove, and calculating, at the moment the workpiece contacts the assembly groove, the relative pose T_c of the tail end of the mechanical arm with respect to its pose at the time of the photographing in the step S14.
According to the high-precision robot assembly method based on visual servoing, the pose of the workpiece in the robot coordinate system does not need to be calculated at any point in the whole process; only the motion relationship between the image and the mechanical arm needs to be calibrated, and the continuous adjustment of the visual servo avoids the calibration error between the camera and the mechanical arm. After each grab, the relative pose deviation of the object is measured by visual servoing and compensated during assembly, which avoids the error caused by physical contact when the workpiece is grabbed; the effect is particularly noticeable for light and small objects such as foam parts. The invention also designs a visual mark, and accurate positioning can be performed using the circle on the mark and the two lines at its center.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a block flow diagram of a high precision assembly method for a robot based on visual servoing according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a high precision assembly method of a robot based on visual servoing according to an embodiment of the present invention;
fig. 3 is a flow chart of a high-precision assembly method of a robot based on visual servoing according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar functions throughout. The embodiments described below with reference to the drawings are exemplary, are intended to explain the invention, and are not to be construed as limiting the invention.
The invention provides a high-precision robot assembly method based on visual servoing, which uses the following hardware: a workbench, a second camera CA, a first camera CB, an industrial personal computer, a mechanical arm, a workpiece, an assembly groove and a visual mark. As shown in fig. 2, the workbench is a flat surface. One area holds the workpiece to be assembled, and another area holds the assembly groove. The mechanical arm must accurately place the workpiece to be assembled into the assembly groove.
Both cameras are fixed to the workbench. The second camera CA is mounted above the workbench with its lens facing downward and its field of view covering the assembly area. The first camera CB is fixed near the grabbing area of the workbench with its field of view facing upward. The industrial personal computer is used for image processing, mathematical calculation, and communication with the cameras and the mechanical arm. The mechanical arm is installed beside the workbench; a workpiece gripper is mounted at its tail end, and a visual mark (visual marker) is attached to the gripper. The visual mark is a circle with two mutually perpendicular line segments of different lengths at its center.
As shown in fig. 1 and 3, the high-precision assembly method of the robot based on the visual servo according to the embodiment of the present invention includes the following steps:
step S1, teaching and setting work is performed, including:
step S11, calibrating the motion relationships M_ca and M_cb (2×2 matrices) between the image pixels of the two cameras and the robot, respectively.
Taking the second camera CA as an example:
1. the robot moves through k positions (k ≥ 2), and the robot displacements (x_1, y_1), (x_2, y_2), ..., (x_k, y_k) together with the corresponding displacements (u_1, v_1), (u_2, v_2), ..., (u_k, v_k) of the visual mark center point (or any other recognizable point) on the image are recorded;
2. the transformation relationship between (u, v) and (x, y) is calculated using least squares and is denoted as M_ca, as in the sketch below.
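A minimal numpy sketch of this least-squares step is given below. It assumes the recorded robot displacements and image displacements are stacked as two k×2 arrays and fits a matrix that maps image displacements to robot displacements; the function name, the direction of the mapping and the numeric values are illustrative assumptions rather than the patent's implementation.

```python
import numpy as np

def calibrate_pixel_to_robot(uv_moves, xy_moves):
    """Least-squares fit of the 2x2 matrix M such that xy ≈ M @ uv,
    i.e. the mapping from image displacements to robot displacements."""
    UV = np.asarray(uv_moves, dtype=float)   # shape (k, 2), k >= 2
    XY = np.asarray(xy_moves, dtype=float)   # shape (k, 2)
    # Solve UV @ M.T ≈ XY in the least-squares sense.
    M_T, _, _, _ = np.linalg.lstsq(UV, XY, rcond=None)
    return M_T.T                             # the 2x2 matrix, e.g. M_ca

# Illustrative usage with three recorded motions (numbers are made up):
M_ca = calibrate_pixel_to_robot(
    uv_moves=[(120.0, 2.0), (-1.5, 118.0), (60.0, 61.0)],
    xy_moves=[(30.0, 0.4), (-0.3, 29.6), (15.1, 15.2)],
)
```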
Step S12, positioning the workpiece, the assembly groove, and the visual mark.
1. For the workpiece and the assembly groove, training is performed with an image template matching algorithm so that the system can accurately identify the poses (x, y, θ) of the workpiece to be assembled and of the assembly groove in the image;
2. For the visual mark, a blob detection algorithm is first used to find the circle on the mark, and the position and direction of its center point are then determined by fitting the two lines inside the circle, as sketched below.
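The following OpenCV sketch shows one plausible form of such a blob-plus-line-fitting detector. The detector parameters, the Canny and Hough settings, and the function name are assumptions for illustration, not the patent's actual implementation; in a full implementation the shorter of the two centre segments would be used to resolve the 180° ambiguity of the direction.

```python
import cv2
import numpy as np

def locate_visual_mark(gray):
    """Sketch: find the circular visual mark with a blob detector, then
    estimate its direction from the longest line segment inside the circle."""
    # 1. Blob detection to find the circle of the visual mark.
    params = cv2.SimpleBlobDetector_Params()
    params.filterByCircularity = True
    params.minCircularity = 0.8
    detector = cv2.SimpleBlobDetector_create(params)
    keypoints = detector.detect(gray)
    if not keypoints:
        return None
    kp = max(keypoints, key=lambda k: k.size)        # largest circular blob
    cx, cy, r = kp.pt[0], kp.pt[1], kp.size / 2.0

    # 2. Fit the line segments inside the circle to get the direction.
    x0, y0 = max(int(cx - r), 0), max(int(cy - r), 0)
    roi = gray[y0:int(cy + r), x0:int(cx + r)]
    edges = cv2.Canny(roi, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=20,
                            minLineLength=int(r / 2), maxLineGap=3)
    if lines is None:
        return None
    # Take the longest detected segment as the principal direction.
    x1, y1, x2, y2 = max(lines[:, 0, :],
                         key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))
    theta = float(np.arctan2(y2 - y1, x2 - x1))
    return cx, cy, theta                              # (u, v, direction)
```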
And step S13, determining the reference pose of the grabbed workpiece on the image.
1. Moving the mechanical arm to a fixed pose in a grabbing area and grabbing a workpiece;
2. moving the mechanical arm to a position above the first camera CB;
3. the first camera CB takes a picture and calculates the pose M_b of the workpiece on the image, while the pose T_b of the tail end of the mechanical arm at that moment is recorded.
And step S14, determining the relative pose of the visual mark on the image relative to the assembly groove and the relative pose of the tail end of the mechanical arm during placement.
1. The second camera CA takes a picture and calculates the pose of the assembly groove on the image;
2. moving the mechanical arm to a position above the assembly groove; the second camera CA takes a picture and calculates the pose M_c of the visual mark on the image relative to the assembly groove;
3. moving the mechanical arm to place the workpiece into the assembly groove, and calculating, at the moment the workpiece contacts the assembly groove, the relative pose T_c of the mechanical arm with respect to its pose when the picture in substep 2 was taken (see the sketch below).
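By way of illustration, the relative image pose M_c (and, analogously, the relative end pose T_c) can be obtained by composing one pose with the inverse of the other. The sketch below does this for planar (x, y, θ) image poses using homogeneous 3×3 matrices; the helper names and the numeric values are placeholders.

```python
import numpy as np

def se2(x, y, theta):
    """Homogeneous 3x3 matrix for a planar pose (x, y, theta)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0.0, 0.0, 1.0]])

def relative_pose(ref, target):
    """Pose of `target` expressed in the frame of `ref`: inv(ref) @ target."""
    return np.linalg.inv(se2(*ref)) @ se2(*target)

# M_c: pose of the visual mark on the image, relative to the assembly groove.
groove_in_image = (412.0, 233.0, 0.12)   # placeholder (u, v, theta) values
mark_in_image = (380.0, 250.0, 1.65)
M_c = relative_pose(groove_in_image, mark_in_image)
```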
And step S2, after finishing the teaching and setting work, moving the mechanical arm to a grabbing area to grab the workpiece.
And step S3, moving the mechanical arm to the pose T_b above the first camera CB, performing visual servo motion with the first camera CB and the mechanical arm until the pose of the workpiece in the image is M_b, and recording the relative pose T_d of the mechanical arm with respect to the pose T_b.
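A compact sketch of such an image-based visual servo loop follows. The camera.capture_and_locate and arm.move_relative interfaces are hypothetical, M_pixel_to_robot stands for the 2×2 matrix calibrated in step S11, and the orientation component (handled the same way in practice) is omitted for brevity; the same loop, fed by the second camera CA and the visual mark, also serves step S5.

```python
import numpy as np

def visual_servo(camera, arm, M_pixel_to_robot, target_uv,
                 gain=0.5, tol_px=0.5, max_iters=50):
    """Iteratively drive the tracked feature to `target_uv` in the image by
    mapping the pixel error through the calibrated matrix into small robot
    motions; a gain below 1 keeps the convergence stable."""
    for _ in range(max_iters):
        uv = camera.capture_and_locate()              # current (u, v) of feature
        err = np.asarray(target_uv, float) - np.asarray(uv, float)
        if np.linalg.norm(err) < tol_px:
            return True                               # image pose reached, e.g. M_b
        dxy = gain * (M_pixel_to_robot @ err)         # pixel error -> robot step
        arm.move_relative(dx=float(dxy[0]), dy=float(dxy[1]))
    return False                                      # did not converge
```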
In step S4, the second camera CA takes a picture and the pose of the assembly groove on the image is calculated, and the expected pose M_e of the visual mark on the image is calculated from M_c.
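For example, reusing the se2 helper, the numpy import and the M_c matrix from the earlier sketch, the expected pose could be composed as follows; the groove pose values are placeholders.

```python
# Compose the freshly measured groove pose with the taught relative pose M_c
# (se2 and M_c as in the earlier sketch) to obtain the expected mark pose M_e.
def compose(ref_pose, rel_matrix):
    T = se2(*ref_pose) @ rel_matrix
    return T[0, 2], T[1, 2], float(np.arctan2(T[1, 0], T[0, 0]))

groove_now = (405.3, 241.8, 0.09)      # groove pose measured in this cycle
M_e = compose(groove_now, M_c)         # desired (u, v, theta) of the visual mark
```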
And step S5, moving the mechanical arm into the view of the second camera CA, and performing visual servo motion with the second camera CA and the mechanical arm until the pose of the visual mark in the image is M_e.
And step S6, converting the relative pose T_d into a pose in the coordinate system of the tail end of the mechanical arm at that moment, and moving the mechanical arm by that pose.
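A short sketch of this frame change is shown below, with T_now the current end pose as a 4×4 homogeneous matrix; the robot API calls in the usage comment are hypothetical.

```python
import numpy as np

def apply_relative_in_end_frame(T_now, T_d):
    """Express the stored relative pose T_d in the arm's current end frame and
    return the resulting target pose in the base frame (4x4 homogeneous)."""
    return T_now @ T_d

# Hypothetical usage:
# T_now = arm.get_end_pose()                        # current end pose, 4x4
# arm.move_to(apply_relative_in_end_frame(T_now, T_d))
```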
And step S7, moving the mechanical arm by the relative pose T_c, whereby the workpiece is accurately placed in the assembly groove.
According to the high-precision robot assembly method based on visual servoing, the pose of the workpiece in the robot coordinate system does not need to be calculated at any point in the whole process; only the motion relationship between the image and the mechanical arm needs to be calibrated, and the continuous adjustment of the visual servo avoids the calibration error between the camera and the mechanical arm. After each grab, the relative pose deviation of the object is measured by visual servoing and compensated during assembly, which avoids the error caused by physical contact when the workpiece is grabbed; the effect is particularly noticeable for light and small objects such as foam parts. The invention also designs a visual mark, and accurate positioning can be performed using the circle on the mark and the two lines at its center.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made in the above embodiments by those of ordinary skill in the art without departing from the principle and spirit of the present invention. The scope of the invention is defined by the appended claims and equivalents thereof.

Claims (3)

1. A high-precision robot assembling method based on visual servo is characterized by comprising the following steps:
step S1, teaching and setting work is performed, including:
step S11, calibrating the motion relation between the image pixels of the two cameras and the robot respectively;
step S12, positioning the workpiece, the assembly groove and the visual mark;
step S13, determining the reference pose of the grabbed workpiece on the image;
in the step S13,
moving the mechanical arm to a fixed pose in a grabbing area and grabbing a workpiece;
moving the mechanical arm to a position above the first camera;
the first camera takes a picture and calculates the pose M_b of the workpiece on the image, while the pose T_b of the tail end of the mechanical arm at that moment is recorded;
step S14, determining the pose of the visual mark on the image relative to the assembly groove and the relative pose of the tail end of the mechanical arm during placement;
in the step S14,
the second camera takes a picture and calculates the pose of the assembly groove on the image;
moving the mechanical arm to a position above the assembly groove, taking a picture with the second camera and calculating the pose M_c of the visual mark on the image relative to the assembly groove;
moving the mechanical arm to place the workpiece into the assembly groove, and calculating, at the moment the workpiece contacts the assembly groove, the relative pose T_c of the mechanical arm with respect to its pose at the time of the photographing in the step S14;
step S2, after the teaching and setting work is finished, the mechanical arm is moved to a grabbing area to grab a workpiece;
step S3, moving the mechanical arm to a position above the first camera, performing visual servo motion with the first camera and the mechanical arm until the pose of the workpiece in the image is M_b, and recording the relative pose T_d of the mechanical arm with respect to the pose T_b;
step S4, photographing with the second camera, calculating the pose of the assembly groove on the image, and calculating the expected pose M_e of the visual mark on the image from M_c;
step S5, moving the mechanical arm into the view of the second camera, and performing visual servo motion with the second camera and the mechanical arm until the pose of the visual mark in the image is M_e;
step S6, converting the relative pose T_d into a pose in the coordinate system of the tail end of the mechanical arm at that moment, and moving the mechanical arm by that pose;
and step S7, moving the mechanical arm by the relative pose T_c, whereby the workpiece is accurately placed in the assembly groove.
2. The high-precision robot assembling method based on visual servoing as claimed in claim 1, wherein in the step S11, the robot moves through k positions, and the robot displacements (x_1, y_1), (x_2, y_2), ..., (x_k, y_k) together with the corresponding displacements (u_1, v_1), (u_2, v_2), ..., (u_k, v_k) of the visual mark center point on the image are recorded; wherein k ≥ 2;
the transformation relationship between (u, v) and (x, y) is calculated using least squares, denoted as M _ ca.
3. The high-precision robot assembling method based on visual servoing as claimed in claim 1, wherein in the step S12, an image template matching algorithm is used for training for the workpiece and the assembly groove, so that the system recognizes the poses (x, y, θ) of the workpiece to be assembled and of the assembly groove in the image;
for the visual mark, firstly, a circle on the visual mark is found by using a blob detection algorithm, and then the position and the direction of the central point of the visual mark are determined by fitting two lines in the circle.
CN202110170583.2A 2021-02-08 2021-02-08 Robot high-precision assembling method based on visual servo Active CN112894823B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110170583.2A CN112894823B (en) 2021-02-08 2021-02-08 Robot high-precision assembling method based on visual servo

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110170583.2A CN112894823B (en) 2021-02-08 2021-02-08 Robot high-precision assembling method based on visual servo

Publications (2)

Publication Number Publication Date
CN112894823A (en) 2021-06-04
CN112894823B (en) 2022-06-21

Family

ID=76123954

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110170583.2A Active CN112894823B (en) 2021-02-08 2021-02-08 Robot high-precision assembling method based on visual servo

Country Status (1)

Country Link
CN (1) CN112894823B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113814984B (en) * 2021-10-20 2023-04-18 环旭电子股份有限公司 Dynamic image positioning method and system for robot emptying
CN114089767B (en) * 2021-11-23 2024-03-26 成都瑞特数字科技有限责任公司 Positioning and grabbing method for bottle-shaped objects in application of mobile compound robot
CN114339058B (en) * 2022-03-16 2022-05-27 珞石(北京)科技有限公司 Mechanical arm flying shooting positioning method based on visual marks
CN114932541B (en) * 2022-06-15 2023-07-25 中迪机器人(盐城)有限公司 Robot-based automatic assembly system and method
CN115194773B (en) * 2022-08-22 2024-01-05 苏州佳祺仕科技股份有限公司 Visual guidance assembling method and device
CN115570562B (en) * 2022-09-05 2023-06-02 梅卡曼德(北京)机器人科技有限公司 Robot assembly pose determining method and device, robot and storage medium
CN115564847B (en) * 2022-11-17 2023-03-31 歌尔股份有限公司 Visual calibration method and device of visual assembly system and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110936369A (en) * 2018-09-25 2020-03-31 南京曼新智能科技有限公司 Binocular vision and mechanical arm based large-scale workpiece pose accurate measurement and grabbing device and method
CN111546344A (en) * 2020-05-18 2020-08-18 北京邮电大学 Mechanical arm control method for alignment

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5237468A (en) * 1991-10-15 1993-08-17 International Business Machines Corporation Camera and gripper assembly for an automated storage library
DE102007060653A1 (en) * 2007-12-15 2009-06-18 Abb Ag Position determination of an object
US10477154B2 (en) * 2013-03-07 2019-11-12 Cognex Corporation System and method for aligning two work pieces with a vision system in the presence of occlusion
CN105538345B (en) * 2016-01-27 2017-09-26 华南理工大学 A kind of puma manipulator and positioning assembly method based on many camera lenses
CN106272416B (en) * 2016-08-29 2020-12-29 上海交通大学 Robot slender shaft precision assembly system and method based on force sense and vision
JP6809245B2 (en) * 2017-01-20 2021-01-06 セイコーエプソン株式会社 robot
CN107398901B (en) * 2017-07-28 2019-03-01 哈尔滨工业大学 The visual servo control method of robot for space maintainable technology on-orbit
CN111958604A (en) * 2020-08-20 2020-11-20 扬州蓝邦数控制刷设备有限公司 Efficient special-shaped brush monocular vision teaching grabbing method based on CAD model

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110936369A (en) * 2018-09-25 2020-03-31 南京曼新智能科技有限公司 Binocular vision and mechanical arm based large-scale workpiece pose accurate measurement and grabbing device and method
CN111546344A (en) * 2020-05-18 2020-08-18 北京邮电大学 Mechanical arm control method for alignment

Also Published As

Publication number Publication date
CN112894823A (en) 2021-06-04

Similar Documents

Publication Publication Date Title
CN112894823B (en) Robot high-precision assembling method based on visual servo
CN110238849B (en) Robot hand-eye calibration method and device
EP3705239B1 (en) Calibration system and method for robotic cells
CN111452040B (en) System and method for associating machine vision coordinate space in a pilot assembly environment
CN108326850B (en) Method and system for robot to accurately move mechanical arm to reach specified position
JP6429473B2 (en) Robot system, robot system calibration method, program, and computer-readable recording medium
JP4265088B2 (en) Robot apparatus and control method thereof
CN111390901B (en) Automatic calibration method and calibration device for mechanical arm
EP0489919A1 (en) Calibration system of visual sensor
CN110276799B (en) Coordinate calibration method, calibration system and mechanical arm
CN112621743B (en) Robot, hand-eye calibration method for fixing camera at tail end of robot and storage medium
CN110666798A (en) Robot vision calibration method based on perspective transformation model
CN110148187A (en) A kind of the high-precision hand and eye calibrating method and system of SCARA manipulator Eye-in-Hand
CN112720458B (en) System and method for online real-time correction of robot tool coordinate system
US20200262080A1 (en) Comprehensive model-based method for gantry robot calibration via a dual camera vision system
CN112792814B (en) Mechanical arm zero calibration method based on visual marks
EP4101604A1 (en) System and method for improving accuracy of 3d eye-to-hand coordination of a robotic system
CN112658643A (en) Connector assembly method
EP3602214B1 (en) Method and apparatus for estimating system error of commissioning tool of industrial robot
CN112958960B (en) Robot hand-eye calibration device based on optical target
CN110533727B (en) Robot self-positioning method based on single industrial camera
CN111482964A (en) Novel robot hand-eye calibration method
CN110815177A (en) Migration method for 2D visual guidance teaching of composite robot
US20230123629A1 (en) 3d computer-vision system with variable spatial resolution
CN116681772A (en) Multi-camera online calibration method under non-common view

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant