CN110842928B - Visual guiding and positioning method for compound robot


Info

Publication number
CN110842928B
CN110842928B
Authority
CN
China
Prior art keywords
mechanical arm
calibration
tail end
robot
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911225526.9A
Other languages
Chinese (zh)
Other versions
CN110842928A (en)
Inventor
杨跞
朱小生
贺琪欲
李兵
刘一帆
李法设
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siasun Co Ltd
Original Assignee
Siasun Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siasun Co Ltd filed Critical Siasun Co Ltd
Priority to CN201911225526.9A
Publication of CN110842928A
Application granted
Publication of CN110842928B
Legal status: Active

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 - Programme-controlled manipulators
    • B25J9/16 - Programme controls
    • B25J9/1694 - Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 - Vision controlled systems
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 - Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/0095 - Means or methods for testing manipulators

Abstract

The invention provides a visual guiding and positioning method for a compound robot, in which the relative position and posture of the compound robot with respect to the worktable, including distance, position, and angle, are recorded in detail during calibration, so that this relative pose can be reproduced during job application. The vision and motion deviations of the compound robot caused by trolley navigation and positioning errors, uneven ground, and similar factors are thereby masked, and the operating accuracy of the robot in the actual environment is improved. Through accurate relative positioning based on the information of multiple sensors, the invention ensures that the operating accuracy of the compound robot is not affected by the various error factors encountered in practice.

Description

Visual guiding and positioning method for compound robot
Technical Field
The present disclosure belongs to the field of industrial robots, relates in particular to robot guiding and positioning technology, and more specifically to a visual guiding and positioning method for a compound robot.
Background
A compound robot is a mobile manipulator consisting of an AGV (automated guided vehicle) and a mechanical arm. As a new type of robot, it can be rapidly deployed in the 3C industry, automated factories, warehouse sorting, and automated supermarkets, performing automatic material handling, loading and unloading, and sorting.
At present, visual guiding and positioning of a compound robot is performed in the same way as for a fixed-base robot, mainly in the eye-in-hand configuration, in which the tail end of the mechanical arm carries the camera. However, the working environment of a compound robot is more complex than that of a fixed-base robot, which makes visually guided grasping and placing of objects more difficult.
A fixed-base robot faces a fixed working area. Usually, the hand-eye correspondence at each acquisition point is established by prior calibration and the calibration parameters are computed; afterwards, the displacement of the mechanical arm is calculated only from the pixel coordinates of the object in the field of view and the calibration parameters, and the arm is moved by this displacement to align accurately with the workpiece position, achieving accurate grasping and placing. Therefore, before the mechanical arm grasps or places a workpiece, alignment with the workpiece can be achieved by calculation and adjustment in a two-dimensional plane alone.
A compound robot with a movable base must first navigate the AGV automatically to a parking point, and then use the mechanical arm to grasp and place objects at a target position on a worktable. In this process, because the AGV parking position has errors and the ground at the parking point may be uneven, the position and posture of the tail end of the mechanical arm relative to the worktable after the AGV arrives differ from those at calibration time.
Disclosure of Invention
The invention aims to provide a visual guiding and positioning device and method for a compound robot, so as to solve the problem that the visual guidance accuracy of a compound robot is reduced in an actual working environment.
One aspect of the present invention provides a visual guiding and positioning device for a compound robot, including:
a measuring module, fixed at the tail end of the mechanical arm, for measuring the distance between the tail end of the mechanical arm and the worktable surface and the inclination angle of the tail end, and for acquiring an image of the identification area on the worktable that serves as a guide reference;
a visual guidance processing module, for controlling the motion of the mechanical arm through the robot motion controller according to the information acquired by the measuring module, so that the distance, position, and angle of the tail end of the mechanical arm relative to the worktable are the same during job application as during calibration; for controlling the automatic acquisition of calibration data and the calculation of calibration parameters; and for controlling the tail end of the mechanical arm to offset along the TCP (tool center point) coordinate system from the calibration origin to the workpiece placing position to complete workpiece grasping and placing.
Further, the measurement module includes a ranging sensor, a tilt sensor, and an industrial camera.
Further, the visual guidance processing module includes:
a measurement module communication unit: communicating with the measuring module, and acquiring and recording the distance and inclination angle information acquired by the measuring module in real time;
a robot communication unit: sending an instruction to a robot motion controller, commanding the robot motion controller to control the robot to move, and simultaneously acquiring the coordinate information of the robot in real time;
an image identification positioning unit: controlling the measuring module to collect images, identifying and positioning the received images, and acquiring and recording pixel coordinates of reference points and Rz angle values of the images;
a calibration parameter calculation unit: calculating calibration parameters and storing according to the pixel coordinates of the reference points acquired at the acquisition points and the offset of the mechanical arm;
a human-computer interaction unit: an input and output interface that provides instructions, data, images, and other information to the user.
Furthermore, the device also comprises a two-dimensional code label or a character label fixed in the identification area on the worktable.
The invention also provides a visual guiding and positioning method for the compound robot, suitable for the above device, comprising a visual guidance calibration step and a visual guidance job application step, wherein the visual guidance calibration step specifically includes:
acquiring a relative pose: after the trolley reaches a suitable parking point, adjusting the tail end of the mechanical arm to a suitable calibration origin, and acquiring and recording the pose of the mechanical arm, the inclination angle of the tail end, and the distance, position, and Rz angle value relative to the worktable surface;
calibration parameter calculation: collecting calibration data, and calculating the calibration parameters from the collected data;
measuring the TCP offset of the mechanical arm from the calibration origin to the workpiece placing position;
the step of visually guiding the job application specifically comprises:
reproducing the relative pose: after the trolley stops at the position and angle used during calibration, controlling the mechanical arm to move to the posture recorded during calibration, and then controlling the tail end of the mechanical arm to adjust to the relative pose recorded during calibration;
and controlling the mechanical arm to move according to the TCP offset from the calibration origin to the workpiece placing position, completing the pick-and-place task.
Further, the step of acquiring the relative pose specifically includes:
adjusting the tail end posture of the mechanical arm to enable a working plane of the mechanical arm to be parallel to a working table surface, and collecting and recording values of inclination angles Rx and Ry of the tail end of the mechanical arm;
adjusting the distance between the tail end of the mechanical arm and the workbench to enable the imaging view of the measuring module to be clear, and recording the current point as a calibration origin;
collecting and recording the distance from the tail end of the mechanical arm to the working table at the moment;
collecting the image of the identification area on the workbench, identifying and positioning the image, and extracting and recording the pixel coordinates of the reference point and the Rz angle value of the identification area;
and collecting and recording the pose of the mechanical arm at the moment.
Further, the method for identifying and positioning the image is as follows: identifying and positioning by shape matching, or, if a two-dimensional code label is arranged in the identification area, identifying and positioning by two-dimensional code recognition; a sketch of the latter is given below.
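By way of illustration only, the following is a minimal sketch of two-dimensional code identification and positioning, assuming OpenCV's QRCodeDetector; the function name locate_reference and the choice of the label center as the reference point are assumptions of this sketch, not requirements of the method.

```python
# A minimal sketch (not part of the disclosure) of two-dimensional code
# identification and positioning, assuming OpenCV's QRCodeDetector.
import cv2
import numpy as np

def locate_reference(image):
    """Return ((u, v), rz): the reference-point pixel coordinates (taken here
    as the label center, an assumption of this sketch) and the in-plane label
    angle Rz in degrees, or None if no label is found."""
    detector = cv2.QRCodeDetector()
    found, corners = detector.detect(image)       # corners: 4 label vertices
    if not found or corners is None:
        return None
    pts = corners.reshape(-1, 2)
    center = pts.mean(axis=0)                     # label center as reference point
    edge = pts[1] - pts[0]                        # top edge of the label
    rz = float(np.degrees(np.arctan2(edge[1], edge[0])))
    return (float(center[0]), float(center[1])), rz
```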
Further, the calibration data acquisition method comprises the following steps:
setting TCP offset of each acquisition point relative to the calibration origin;
controlling the mechanical arm to move to each acquisition point along the TCP coordinate system according to a preset offset;
and collecting the image of the identification area on the workbench at each collection point, identifying and positioning the image, and extracting and recording the pixel coordinates of the reference point.
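As an illustration of this acquisition procedure, the following sketch collects the three reference-point readings used in the calculation below; arm and grab_image are hypothetical wrappers (not interfaces defined by this disclosure), locate_reference is the helper sketched earlier, and the 10 mm offsets are arbitrary example values.

```python
# Hypothetical sketch of calibration data acquisition; `arm` and `grab_image`
# are assumed wrappers, not APIs defined by this disclosure.
def acquire_calibration_data(arm, grab_image, dx=10.0, dy=10.0):
    p0, _ = locate_reference(grab_image())   # reference pixel at calibration origin
    arm.move_along_tool_xy(dx, 0.0)          # offset +dx along the TCP X axis
    p1, _ = locate_reference(grab_image())
    arm.move_along_tool_xy(-dx, 0.0)         # return to the calibration origin
    arm.move_along_tool_xy(0.0, dy)          # offset +dy along the TCP Y axis
    p2, _ = locate_reference(grab_image())
    arm.move_along_tool_xy(0.0, -dy)         # return to the calibration origin
    return p0, p1, p2
```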
Further, the calibration parameter calculation method comprises the following steps:
assuming that when the tail end of the mechanical arm is at the calibration origin, the pixel coordinate of the reference point in the identification area is (u0, v0); after the tail end of the mechanical arm moves by Δx along the TCP coordinate system, the pixel coordinate of the reference point is (u1, v1); and after the tail end of the mechanical arm returns to the calibration origin and then moves by Δy along the TCP coordinate system, the pixel coordinate of the reference point is (u2, v2);
then, in job application, if the pixel coordinate of the reference point is obtained as (u, v), the TCP offset (Δx', Δy') of the mechanical arm relative to the calibration origin shall be:

$$\begin{pmatrix} \Delta x' \\ \Delta y' \end{pmatrix} = \begin{pmatrix} \Delta u_1/\Delta x & \Delta u_2/\Delta y \\ \Delta v_1/\Delta x & \Delta v_2/\Delta y \end{pmatrix}^{-1} \begin{pmatrix} \Delta u \\ \Delta v \end{pmatrix}$$

wherein
Δu1 = u1 - u0, Δv1 = v1 - v0, Δu2 = u2 - u0, Δv2 = v2 - v0, Δu = u - u0, Δv = v - v0.
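A minimal numeric sketch of this calculation follows, under the linear pixel-to-offset model above; the pixel values and the 10 mm moves are invented for illustration (roughly 5 px/mm camera scale).

```python
# A numeric sketch of the calibration solve above; all values are illustrative.
import numpy as np

def calibrate(dx, dy, p0, p1, p2):
    """Matrix M mapping a TCP offset (mm) to a pixel shift, built from the
    reference-point pixels at the origin (p0), after +dx (p1), after +dy (p2)."""
    du1, dv1 = p1[0] - p0[0], p1[1] - p0[1]
    du2, dv2 = p2[0] - p0[0], p2[1] - p0[1]
    return np.array([[du1 / dx, du2 / dy],
                     [dv1 / dx, dv2 / dy]])

def tcp_offset(M, p0, p):
    """Solve M @ (dx', dy') = (du, dv) for the TCP offset explaining pixel p."""
    return np.linalg.solve(M, np.array([p[0] - p0[0], p[1] - p0[1]]))

M = calibrate(10.0, 10.0, p0=(320, 240), p1=(370, 242), p2=(318, 290))
print(tcp_offset(M, (320, 240), (345, 265)))  # ~[5.19, 4.79] mm along TCP X/Y
```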
further, the step of reproducing the relative pose specifically includes:
controlling the mechanical arm to move to the posture recorded during calibration;
measuring the values of the inclination angles Rx and Ry of the tail end of the mechanical arm, and adjusting the posture of the tail end of the mechanical arm to enable the inclination angle to be the same as the inclination angle value recorded during calibration;
measuring the distance value between the tail end of the mechanical arm and the working table surface, and adjusting the posture of the mechanical arm to enable the distance to be equal to the distance value recorded in calibration;
acquiring an image of the identification area on the workbench, extracting an Rz angle value of the identification area, comparing the Rz angle value with an image Rz value recorded during calibration, and adjusting the terminal attitude Rz of the mechanical arm to make the Rz angle value and the image Rz value consistent;
and acquiring the image of the identification area again to obtain the pixel coordinate of the reference point, calculating the TCP offset of the mechanical arm according to the recorded calibration parameters, and controlling the mechanical arm to move according to the offset, so that the recorded pose relative to the working table surface can be reached during calibration.
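For illustration, the reproduction steps above can be sketched as follows; arm, tilt, rangefinder, grab_image, and calib are the same kind of hypothetical interfaces assumed earlier, and the sign convention of the final correction is an assumption of this sketch.

```python
# Hypothetical sketch of relative-pose reproduction; not an API of this disclosure.
def reproduce_relative_pose(arm, tilt, rangefinder, grab_image, calib):
    arm.move_to_pose(calib.arm_pose)                # posture recorded at calibration

    rx, ry = tilt.read()                            # align tail-end tilt Rx, Ry
    arm.adjust_rotation(calib.rx - rx, calib.ry - ry)

    d = rangefinder.read()                          # match distance to the worktable
    arm.move_along_tool_z(calib.distance - d)

    _, rz = locate_reference(grab_image())          # match the identification-area Rz
    arm.adjust_rz(calib.rz - rz)

    p, _ = locate_reference(grab_image())           # final in-plane correction
    dxp, dyp = tcp_offset(calib.M, calib.p0, p)     # solver sketched earlier
    arm.move_along_tool_xy(-dxp, -dyp)              # assumed sign: undo apparent offset
```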
Therefore, with the visual guiding and positioning device and method for the compound robot, the relative position and posture of the compound robot with respect to the worktable, including distance, position, and angle, are recorded in detail during calibration and then reproduced during job application. This masks the vision and motion deviations of the compound robot caused by trolley navigation and positioning errors, uneven ground, and similar factors, and improves the operating accuracy of the robot in the actual environment. Compared with the prior art, the beneficial effects mainly include: (1) the accuracy of object grasping and placing by the compound robot is guaranteed in various adverse working environments; (2) the module can communicate with the robot motion controller to control the robot's motion directly, without using a teach pendant, which is convenient and fast, keeps the data accurate, and avoids human input errors; (3) the pixel coordinates and image angle of the visual guidance reference are identified automatically, which reduces the operational complexity of the calibration process and improves calibration efficiency.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the scope of the invention, as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention.
FIG. 1 is a schematic composition diagram of an exemplary embodiment of the visual guiding and positioning device of the compound robot;
FIG. 2 is a general flow diagram of an exemplary embodiment of the visual guiding and positioning method of the compound robot;
FIG. 3 is a flowchart of the calibration step of an exemplary embodiment of the visual guiding and positioning method of the compound robot;
FIG. 4 is a flowchart of the job application step of an exemplary embodiment of the visual guiding and positioning method of the compound robot.
Detailed Description
Preferred embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While the preferred embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
A schematic composition diagram of an exemplary embodiment of the composite robot vision-guided positioning apparatus according to the present disclosure is shown in fig. 1.
The present disclosure provides a visual guiding and positioning device for a compound robot, including:
a measuring module, for measuring the distance between the tail end of the mechanical arm and the worktable surface and the inclination angle of the tail end, and for acquiring an image of the identification area on the worktable that serves as a guide reference;
a visual guidance processing module, for controlling the motion of the mechanical arm through the robot motion controller according to the information acquired by the measuring module, so that the distance, position, and angle of the tail end of the mechanical arm relative to the worktable are the same during job application as during calibration; for controlling the automatic acquisition of calibration data and the calculation of calibration parameters; and for controlling the tail end of the mechanical arm to offset along the TCP coordinate system from the calibration origin to the workpiece placing position to complete workpiece grasping and placing.
The identification area serving as a guide reference on the worktable should have features that are clearly different from the surrounding environment and are easy to recognize and distinguish. One point with such features is selected as the reference point, and the pixel-coordinate positioning of this point in the image is used to position the tail end of the mechanical arm relative to the worktable.
The position of the identification area can be chosen flexibly. If the workpiece to be grasped and placed sits in a positioning tool on the worktable, so that its position and posture relative to the tool do not change, the positioning tool itself can be chosen directly, with the relative positioning among the workpiece, the positioning tool, and the identification area kept unchanged. If the position of the target workpiece on the worktable can change, that is, it is not constrained by a positioning tool, the target workpiece itself can be chosen directly. In either case, a position should be selected that the camera at the tail end of the mechanical arm can conveniently capture.
Features can be provided in the identification area by printing, label pasting, or similar means; a preferred scheme is to paste a two-dimensional code label or a character label.
The measuring module provides additional information for measuring the pose of the robot relative to the worktable, on the basis of which the pose is subsequently adjusted; the vision deviation of the compound robot caused by factors such as trolley navigation and positioning errors and uneven ground can thus be effectively corrected, improving the accuracy of visually guided operation.
The measuring module can adopt any existing detection devices suitable for measuring the distance from the tail end of the mechanical arm to the worktable and the inclination angle of the tail end, and for acquiring an image of the identification area serving as the guide reference on the worktable.
A preferred solution is a ranging sensor, a tilt sensor, and an industrial camera, as shown in fig. 1.
The visual guidance processing module typically includes the robot's vision control computer and the visual guidance processing software running on it. The vision control computer, also called a vision controller, is an industrial control computer with communication interfaces to the robot motion controller and to the measuring module: it is generally connected to the robot motion controller by network cable and communicates over the TCP/IP protocol; it is connected to the camera by a high-speed network cable; and it is connected to the angle and ranging sensors through serial ports. The visual guidance processing software implements the functions of the visual guidance processing module.
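By way of illustration only, such connections might be opened as in the following minimal sketch; the IP address, port, serial device path, baud rate, and message strings are invented placeholders, since the actual protocols are vendor-specific.

```python
# Illustrative sketch of the controller-side connections described above;
# addresses, ports, and messages are assumed placeholders.
import socket
import serial  # pyserial

robot = socket.create_connection(("192.168.1.10", 5000), timeout=2.0)
robot.sendall(b"GET_POSE\n")           # hypothetical query to the motion controller
pose_reply = robot.recv(1024).decode()

tilt = serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=1.0)
tilt.write(b"READ\r\n")                # hypothetical command to the tilt sensor
tilt_reply = tilt.readline().decode()

print(pose_reply, tilt_reply)
```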
Preferably, the visual guidance processing module includes:
a measurement module communication unit: communicating with the measuring module, and acquiring and recording the distance and inclination angle information acquired by the measuring module in real time;
a robot communication unit: sending an instruction to a robot motion controller, commanding the robot motion controller to control the robot to move, and simultaneously acquiring the coordinate information of the robot in real time;
an image identification positioning unit: controlling the measuring module to collect images, identifying and positioning the received images, and acquiring and recording pixel coordinates of reference points and Rz angle values of the images;
a calibration parameter calculation unit: calculating calibration parameters and storing according to the pixel coordinates of the reference points acquired at the acquisition points and the offset of the mechanical arm;
a human-computer interaction unit: an input and output interface that provides instructions, data, images, and other information to the user.
A general flow diagram of an exemplary embodiment of the visual guiding and positioning method suitable for the compound-robot visual guiding device described above is given in fig. 2. As shown in fig. 2, the method mainly includes a visual guidance calibration step and a visual guidance job application step, wherein the visual guidance calibration step includes:
acquiring a relative pose: after the trolley reaches a suitable parking point, the tail end of the mechanical arm is adjusted to a suitable calibration origin, and the pose of the mechanical arm, the tail-end inclination angle, and the distance, position, and Rz angle value relative to the worktable surface are acquired and recorded. The position of the tail end relative to the worktable and the Rz angle value can be determined from the pixel coordinates of the reference point in the identification area serving as the guide reference on the worktable and the Rz angle of the identification area.
calibration parameter calculation: calibration data are collected, and the calibration parameters are calculated from the collected data.
The TCP offset of the mechanical arm from the calibration origin to the workpiece placing position is then measured.
The visual guidance job applying step includes:
reproducing the relative pose: after the trolley moves to the parking point recorded during calibration, the mechanical arm is controlled to move to the posture recorded during calibration, and the tail end of the mechanical arm is adjusted, according to the information collected by the measuring module, to the pose relative to the worktable recorded during calibration.
The mechanical arm is then controlled to offset according to the TCP offset from the calibration origin to the workpiece placing position acquired during calibration, completing the pick-and-place task.
In the above method, during relative pose acquisition, the pose of the mechanical arm and the pose of its tail end relative to the worktable at calibration are recorded in detail from the parameters obtained by the measuring module, including distance, position, and angle: the position is represented by the pixel coordinates of the reference point in the identification-area image, and the angle by the Rx and Ry values recorded by the tilt sensor together with the Rz angle of the identification-area image. The pixel coordinates of the reference point and the Rz angle of the identification-area image are obtained by identifying and positioning the image acquired by the measuring module.
Then, when the robot works, the relative pose is reproduced so that the pose of the compound robot relative to the worktable is the same as at calibration, shielding the vision and motion deviations of the robot caused by trolley navigation and positioning errors, uneven ground, and similar factors, and ensuring the working accuracy of the robot in the actual working environment.
As a preferred scheme, the step of collecting the relative pose specifically comprises the following steps:
adjusting the tail end posture of the mechanical arm to enable a working plane of the mechanical arm to be parallel to a working table surface, and collecting and recording values of inclination angles Rx and Ry of the tail end of the mechanical arm;
adjusting the distance between the tail end of the mechanical arm and the workbench to enable the imaging view of the measuring module to be clear, and recording the current point as a calibration origin;
collecting and recording the distance from the tail end of the mechanical arm to the working table at the moment;
collecting the image of the identification area on the workbench, identifying and positioning the image, and extracting and recording the pixel coordinates of the reference point and the Rz angle value of the identification area;
and collecting and recording the pose of the mechanical arm at the moment.
As a preferred scheme, the step of reproducing the relative pose specifically comprises:
controlling the mechanical arm to move to the posture recorded during calibration;
measuring the values of the inclination angles Rx and Ry of the tail end of the mechanical arm, and adjusting the posture of the tail end of the mechanical arm to enable the inclination angle to be the same as the inclination angle value recorded during calibration;
measuring the distance value between the tail end of the mechanical arm and the working table surface, and adjusting the posture of the mechanical arm to enable the distance to be equal to the distance value recorded in calibration;
acquiring an image of the identification area on the workbench, extracting an Rz angle value of the identification area, comparing the Rz angle value with an image Rz value recorded during calibration, and adjusting the terminal attitude Rz of the mechanical arm to make the Rz angle value and the image Rz value consistent;
and acquiring the image of the identification area again to obtain the pixel coordinate of the reference point, calculating the TCP offset of the mechanical arm according to the recorded calibration parameters, and controlling the mechanical arm to move according to the offset, so that the recorded pose relative to the working table surface can be reached during calibration.
Identifying and positioning images is a mature technique; shape matching is usually adopted, but if a two-dimensional code label is arranged in the identification area, two-dimensional code recognition and positioning can be adopted instead.
Any applicable prior-art method can be used to calculate the calibration parameters of the compound robot. The preferred scheme bases calibration data acquisition and parameter calculation on incremental compensation: the mechanical arm is controlled to reach preset points around the calibration origin according to preset TCP offsets in the X and Y directions, the pixel coordinates of the reference point on the worktable are acquired at each point, the calibration parameters are then calculated from the acquired data, and the correspondence between the reference-point pixel coordinates and the TCP offset of the mechanical arm relative to the calibration origin is established.
In addition, the TCP offset of the mechanical arm from the calibration origin to the workpiece placing position is measured in the calibration step; this measurement can be completed by traction teaching or similar means.
According to the visual guiding and positioning method for the compound robot of the present disclosure, detailed relative-pose measurement during calibration and its reproduction in the later job application shield the influence of the various error factors of the robot trolley, guarantee the accuracy of workpiece grasping and placing, and remain convenient, fast, and accurate.
Application example:
Fig. 1 shows an exemplary visual guiding and positioning device of a compound robot. In the figure, the compound robot consists of a mobile trolley (1) and a mechanical arm (2), and the worktable surface (10) is horizontal. For calibration or operation, a suitable parking point (11) is selected for the trolley so that the workpiece position on the worktable lies within the working range (9) of the tail end of the mechanical arm.
The visual guiding and positioning device of this robot includes:
an inclination angle sensor (3) mounted horizontally at the tail end of the mechanical arm of the compound robot, a ranging sensor (4) mounted at the tail end, and an industrial camera (5), used respectively to measure the inclination angle of the tail end, to measure the distance from the tail end to the worktable, and to acquire an image of the identification area (7) serving as the visual guide reference;
a visual guidance processing module, which has communication interfaces to the robot motion controller, the inclination angle sensor (3), the ranging sensor (4), and the industrial camera (5); it acquires the distance, inclination angle, and image data collected by the measuring module, identifies and positions the image to obtain the pixel coordinates of the reference point and the Rz angle value of the identification area, sends instructions to the robot motion controller to control the robot's motion, and acquires the robot's coordinate information in real time.
The visual guidance processing module is mainly used for: controlling the mechanical arm to move according to preset offsets, collecting the pixel coordinates of the reference point, and calculating the calibration parameters; acquiring and recording the distance, position, and angle of the tail end of the mechanical arm relative to the worktable surface during calibration, and controlling the tail end to reach the same relative pose during job application; and controlling the tail end of the mechanical arm to offset along the TCP from the calibration origin to the workpiece placing position to complete workpiece grasping and placing.
The visual guidance processing module comprises a robot vision controller and the visual guidance processing software running on it. The vision controller is an industrial personal computer, connected to the compound robot's motion controller and to the camera by network cables, and to the inclination angle sensor and ranging sensor through serial ports; the visual guidance processing software implements each specific function.
In addition, the visual guiding and positioning device also comprises a two-dimensional code label or character label arranged in the identification area (7); pixel-coordinate positioning is performed with the center point of the label as the reference point.
The visual guiding and positioning method of the compound robot suitable for this device comprises a visual guiding and positioning calibration step and an operation step, wherein:
S1. The calibration step, as shown in fig. 3, includes:
S11. After the AGV trolley arrives at a suitable working parking point (the AGV's motion and navigation are controlled by its own control device), the tail-end posture of the mechanical arm of the compound robot is adjusted so that the tail end is parallel to the worktable, and the visual guidance processing module records the Rx and Ry angle values of the inclination angle sensor at this moment.
S12. The distance between the mechanical arm of the compound robot and the worktable is adjusted, and the focal length and aperture of the industrial camera are adjusted until the camera's imaging field is clear; the visual guidance processing module records the value of the ranging sensor at this moment. This point is taken as the calibration origin.
S13. The visual guidance processing module communicates with the robot, records the pose of the mechanical arm at this moment, triggers the camera to acquire an image, and positions and records the pixel coordinates and angle value of the label.
S14. Offsets of the mechanical arm along the X and Y directions of the TCP coordinate system (tool coordinate system) are set. The arm is controlled to move to the first offset point, the camera captures an image, and the pixel coordinates of the label are located; the arm then returns to the calibration origin, is offset along the TCP to the second point, the camera captures an image and the label is located again, and the arm returns to the calibration origin.
S15. The visual guidance processing module calculates the calibration parameters from the obtained pixel coordinates and arm offsets, and saves the calibration result locally.
S16. By traction teaching, the mechanical arm is moved along the TCP from the calibration origin to the position of the target workpiece to be grasped and placed, and the TCP offset (ΔTx', ΔTy', ΔTz') of the arm is recorded.
S2. The operation step, as shown in fig. 4, includes:
S21. After the AGV trolley of the compound robot navigates from its initial point to the parking point used at calibration, the mechanical arm is moved to the calibration-origin posture;
S22. the tail-end posture of the mechanical arm is adjusted according to the tilt-sensor Rx and Ry values recorded during calibration, until it matches the recorded values;
S23. the distance from the tail end of the mechanical arm to the worktable is adjusted according to the ranging-sensor distance recorded during calibration;
S24. the industrial camera captures an image, the label angle is identified and compared with the Rz value recorded during calibration, and the angular deviation is sent to the mechanical arm of the compound robot;
S25. the mechanical arm of the compound robot adjusts its tail-end posture Rz by the angular deviation so that it is consistent with the posture at calibration;
S26. the camera captures another image, the pixel coordinates of the label are identified and located, the TCP offset of the mechanical arm is calculated from the calibration parameters, and the arm is commanded to perform the TCP offset. The position and angle of the tool coordinate system of the compound robot's arm relative to the label are now consistent with those at calibration.
S27. the mechanical arm is commanded to move according to the TCP offset between the calibration origin and the workpiece position recorded during calibration. The arm then runs accurately to the position of the workpiece to be grasped and placed, completing the pick-and-place task.
Thus, in the visual guiding and positioning device and method of this embodiment, the angle and distance sensor data and camera images are acquired in real time so that the distance, position, and angle of the mechanical arm relative to the worktable at calibration are recorded in detail, and the arm is then adjusted precisely to this relative pose during subsequent actual operation. The influence of trolley parking position errors, angle errors, uneven ground, and similar factors on the grasping and placing operation of the mechanical arm is thereby shielded, and the accuracy of visually guided operation of the compound robot is significantly improved over the prior art.
The foregoing is merely an illustrative embodiment of the present application; any equivalent changes and modifications made by those skilled in the art without departing from the spirit and principles of the present application shall fall within its protection scope.

Claims (6)

1. A vision guide positioning method of a compound robot comprises a vision guide calibration step and a vision guide operation application step, wherein a measuring module is fixed at the tail end of a mechanical arm of the compound robot and used for measuring the distance between the tail end of the mechanical arm and a working table surface, the inclination angle of the tail end of the mechanical arm and acquiring an image of an identification area serving as a guide reference on the working table;
the visual guidance calibration method specifically comprises the following steps:
acquiring a relative pose: after the trolley reaches a proper parking point, adjusting the tail end of the mechanical arm to a proper calibration original point, and acquiring and recording the pose of the mechanical arm, the inclination angle of the tail end of the mechanical arm, the distance relative to the working table surface, the position and the Rz angle value of the identification area;
calibration parameter calculation: collecting calibration data, and calculating calibration parameters according to the collected data;
measuring the TCP offset of the mechanical arm from the calibration original point to the workpiece placing position;
the step of visually guiding the job application specifically comprises:
and (3) reproduction of relative pose: when the trolley stops at the position and the angle during calibration, the mechanical arm is controlled to move to the posture recorded during calibration, and then the tail end of the mechanical arm is controlled to adjust to the relative posture recorded during calibration;
and controlling the mechanical arm to move according to the TCP offset from the calibration origin to the workpiece placing position to complete the grabbing and releasing task.
2. The visual guide positioning method of the compound robot according to claim 1, wherein the step of acquiring the relative pose specifically comprises:
adjusting the tail end posture of the mechanical arm to enable a working plane of the mechanical arm to be parallel to a working table surface, and collecting and recording values of inclination angles Rx and Ry of the tail end of the mechanical arm;
adjusting the distance between the tail end of the mechanical arm and the workbench to enable the imaging view of the measuring module to be clear, and recording the current point as a calibration origin;
collecting and recording the distance from the tail end of the mechanical arm to the working table at the moment;
collecting the image of the identification area on the workbench, identifying and positioning the image, and extracting and recording the pixel coordinates of the reference point and the Rz angle value of the identification area;
and collecting and recording the pose of the mechanical arm at the moment.
3. The visual guide positioning method for the robot of claim 2, wherein the method for identifying and positioning the image comprises: and identifying and positioning by using a shape matching mode, and if the identification area is provided with a two-dimension code label, identifying and positioning by using a two-dimension code identifying and positioning mode.
4. The visual guide positioning method of the compound robot as claimed in claim 1, wherein the calibration data collection method comprises:
setting TCP offset of each acquisition point relative to the calibration origin;
controlling the mechanical arm to move to each acquisition point along the TCP coordinate system according to a preset offset;
and collecting the image of the identification area on the workbench at each collection point, identifying and positioning the image, and extracting and recording the pixel coordinates of the reference point.
5. The visual guiding and positioning method of the compound robot as claimed in claim 4, wherein the calibration parameters are calculated by:
assuming that when the tail end of the mechanical arm is at the calibration origin, the pixel coordinate of the reference point in the identification area is (u0, v0); after the tail end of the mechanical arm moves by Δx along the TCP coordinate system, the pixel coordinate of the reference point is (u1, v1); and after the tail end of the mechanical arm returns to the calibration origin and then moves by Δy along the TCP coordinate system, the pixel coordinate of the reference point is (u2, v2);
then, in job application, if the pixel coordinate of the reference point is obtained as (u, v), the TCP offset (Δx', Δy') of the mechanical arm relative to the calibration origin shall be:

$$\begin{pmatrix} \Delta x' \\ \Delta y' \end{pmatrix} = \begin{pmatrix} \Delta u_1/\Delta x & \Delta u_2/\Delta y \\ \Delta v_1/\Delta x & \Delta v_2/\Delta y \end{pmatrix}^{-1} \begin{pmatrix} \Delta u \\ \Delta v \end{pmatrix}$$

wherein
Δu1 = u1 - u0, Δv1 = v1 - v0, Δu2 = u2 - u0, Δv2 = v2 - v0, Δu = u - u0, Δv = v - v0.
6. the visual guide positioning method of the compound robot according to claim 1, wherein the step of reproducing the relative pose specifically comprises:
controlling the mechanical arm to move to the posture recorded during calibration;
measuring the values of the inclination angles Rx and Ry of the tail end of the mechanical arm, and adjusting the posture of the tail end of the mechanical arm to enable the inclination angle to be the same as the inclination angle value recorded during calibration;
measuring the distance value between the tail end of the mechanical arm and the working table surface, and adjusting the posture of the mechanical arm to enable the distance to be equal to the distance value recorded in calibration;
acquiring an image of the identification area on the workbench, extracting an Rz angle value of the identification area, comparing the Rz angle value with an image Rz value recorded during calibration, and adjusting the terminal attitude Rz of the mechanical arm to make the Rz angle value and the image Rz value consistent;
and acquiring the image of the identification area again to obtain the pixel coordinate of the reference point, calculating the TCP offset of the mechanical arm according to the recorded calibration parameters, and controlling the mechanical arm to move according to the offset, so that the recorded pose relative to the working table surface can be reached during calibration.
CN201911225526.9A 2019-12-04 2019-12-04 Visual guiding and positioning method for compound robot Active CN110842928B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911225526.9A CN110842928B (en) 2019-12-04 2019-12-04 Visual guiding and positioning method for compound robot


Publications (2)

Publication Number Publication Date
CN110842928A (en) 2020-02-28
CN110842928B (en) 2022-02-22

Family

ID=69607796

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911225526.9A Active CN110842928B (en) 2019-12-04 2019-12-04 Visual guiding and positioning method for compound robot

Country Status (1)

Country Link
CN (1) CN110842928B (en)

Families Citing this family (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111452038B (en) * 2020-03-03 2021-08-24 重庆大学 High-precision workpiece assembly and assembly method thereof
CN113370203A (en) * 2020-03-10 2021-09-10 固高科技(深圳)有限公司 Robot control method, robot control device, computer device, and storage medium
CN111250406B (en) * 2020-03-16 2023-11-14 科为升视觉技术(苏州)有限公司 Automatic placement method and system for PCB detection assembly line based on visual positioning
CN111531580B (en) * 2020-04-27 2023-02-07 武汉工程大学 Vision-based multi-task robot fault detection method and system
CN112098137A (en) * 2020-08-05 2020-12-18 湖南华菱涟源钢铁有限公司 Automatic steel plate sampling method and automatic steel plate sampling system
CN112208113B (en) * 2020-08-13 2022-09-06 苏州赛米维尔智能装备有限公司 Automatic heat-conducting cotton attaching device based on visual guidance and attaching method thereof
CN112102289A (en) * 2020-09-15 2020-12-18 齐鲁工业大学 Cell sample centrifugal processing system and method based on machine vision
CN112357707B (en) * 2020-10-21 2022-06-21 日立楼宇技术(广州)有限公司 Elevator detection method and device, robot and storage medium
CN112454354B (en) * 2020-11-10 2022-05-03 中国电子工程设计院有限公司 Working method and device of industrial robot and storage medium
CN114538027A (en) * 2020-11-26 2022-05-27 合肥欣奕华智能机器股份有限公司 Full-automatic visual positioning transfer equipment and control method thereof
CN114643599B (en) * 2020-12-18 2023-07-21 沈阳新松机器人自动化股份有限公司 Three-dimensional machine vision system and method based on point laser and area array camera
CN112720474A (en) * 2020-12-21 2021-04-30 深圳市越疆科技有限公司 Pose correction method and device for robot, terminal device and storage medium
CN112936265B (en) * 2021-01-29 2022-09-20 山东莱钢永锋钢铁有限公司 System for remotely regulating ABB mechanical arm
CN113084808B (en) * 2021-04-02 2023-09-22 上海智能制造功能平台有限公司 Monocular vision-based 2D plane grabbing method for mobile mechanical arm
CN113211431B (en) * 2021-04-16 2022-07-01 中铁第一勘察设计院集团有限公司 Pose estimation method based on two-dimensional code correction robot system
CN115450447A (en) * 2021-06-08 2022-12-09 广东博智林机器人有限公司 Interaction system, brick laying device, brick laying manipulator and brick laying positioning method
JP7054036B1 (en) * 2021-07-09 2022-04-13 株式会社不二越 Robot vision system
CN113526125B (en) * 2021-07-28 2022-11-22 齐鲁工业大学 Cell specimen sample carrying system and method based on multi-label positioning
CN115892802A (en) * 2021-08-26 2023-04-04 深圳市海柔创新科技有限公司 Compensation parameter generation method and device for sensor device
CN113777336B (en) * 2021-09-08 2023-08-04 广州赛特智能科技有限公司 Automatic detection system and method for biological specimen
CN113989472A (en) * 2021-09-30 2022-01-28 深圳先进技术研究院 Method, system and equipment for accurately grabbing target object
CN113843798B (en) * 2021-10-11 2023-04-28 深圳先进技术研究院 Correction method and system for mobile robot grabbing and positioning errors and robot
CN113799140A (en) * 2021-10-14 2021-12-17 友上智能科技(苏州)有限公司 Flight vision positioning material taking method applied to composite robot
CN113858206B (en) * 2021-10-25 2023-05-23 联想(北京)有限公司 Robot job control method, robot, and computer-readable storage medium
CN114089767B (en) * 2021-11-23 2024-03-26 成都瑞特数字科技有限责任公司 Positioning and grabbing method for bottle-shaped objects in application of mobile compound robot
CN114055454B (en) * 2021-12-15 2023-07-14 重庆远创光电科技有限公司 Engine end cover and wire box robot vision guiding and positioning device
CN114252013B (en) * 2021-12-22 2024-03-22 深圳市天昕朗科技有限公司 AGV visual identification accurate positioning system based on wired communication mode
CN114310881A (en) * 2021-12-23 2022-04-12 中国科学院自动化研究所 Calibration method and system for mechanical arm quick-change device and electronic equipment
CN114571494B (en) * 2022-03-18 2023-06-02 贵州航天天马机电科技有限公司 Multi-degree-of-freedom general heavy-duty lifting manipulator structure based on visual guidance
CN114842089B (en) * 2022-03-29 2024-03-15 国营芜湖机械厂 Automatic modulation method for fly-by-wire computer potentiometer
CN114920000A (en) * 2022-04-22 2022-08-19 歌尔科技有限公司 Conveying device, mechanical equipment and control method of conveying device
CN115122331A (en) * 2022-07-04 2022-09-30 中冶赛迪工程技术股份有限公司 Workpiece grabbing method and device
CN114833872B (en) * 2022-07-04 2022-09-27 苏州亿迈视光电科技有限公司 Mechanical arm based on image recognition as guide information
CN115319737B (en) * 2022-07-12 2023-06-27 广州里工实业有限公司 Automatic feeding and discharging control method, system, device and storage medium
CN116135492B (en) * 2023-04-20 2023-09-05 成都盛锴科技有限公司 Automatic dismounting device and method for railway vehicle door
CN117549338B (en) * 2024-01-09 2024-03-29 北京李尔现代坦迪斯汽车系统有限公司 Grabbing robot for automobile cushion production workshop

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3398729A1 (en) * 2017-05-05 2018-11-07 Robert Bosch GmbH Facility, device and method for operating autonomous transport vehicles which can be loaded with small goods holders
CN109131630A (en) * 2018-08-28 2019-01-04 中科新松有限公司 A kind of control method of composite machine people and composite machine people
CN109129445A (en) * 2018-09-29 2019-01-04 先临三维科技股份有限公司 Hand and eye calibrating method, scaling board, device, equipment and the storage medium of mechanical arm
CN109848994A (en) * 2019-02-22 2019-06-07 浙江启成智能科技有限公司 A kind of robot vision guidance location algorithm

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9785911B2 (en) * 2013-07-25 2017-10-10 I AM Robotics, LLC System and method for piece-picking or put-away with a mobile manipulation robot

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3398729A1 (en) * 2017-05-05 2018-11-07 Robert Bosch GmbH Facility, device and method for operating autonomous transport vehicles which can be loaded with small goods holders
CN109131630A (en) * 2018-08-28 2019-01-04 中科新松有限公司 A kind of control method of composite machine people and composite machine people
CN109129445A (en) * 2018-09-29 2019-01-04 先临三维科技股份有限公司 Hand and eye calibrating method, scaling board, device, equipment and the storage medium of mechanical arm
CN109848994A (en) * 2019-02-22 2019-06-07 浙江启成智能科技有限公司 A kind of robot vision guidance location algorithm

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on a Vision-Guided Robotic Arm Grasping System; 卢海军; China Master's Theses Full-text Database, Information Science and Technology, 2019, No. 8; 2019-08-15; abstract and pp. 7, 8, 13, 21, 22, 26, 37, 38 *

Also Published As

Publication number Publication date
CN110842928A (en) 2020-02-28

Similar Documents

Publication Publication Date Title
CN110842928B (en) Visual guiding and positioning method for compound robot
CN110238845B (en) Automatic hand-eye calibration method and device for optimal calibration point selection and error self-measurement
CA2710669C (en) Method and system for the high-precision positioning of at least one object in a final location in space
CN111127568B (en) Camera pose calibration method based on spatial point location information
EP1488893A2 (en) Connector gripping device, connector inspection system comprising the device, and connector connection system
US20220314455A1 (en) Production system
CN106272424A (en) A kind of industrial robot grasping means based on monocular camera and three-dimensional force sensor
JP6855492B2 (en) Robot system, robot system control device, and robot system control method
US20110071675A1 (en) Visual perception system and method for a humanoid robot
JP2019093481A (en) Robot system and robot system control method
US20220331970A1 (en) Robot-mounted moving device, system, and machine tool
JP2021049607A (en) Controller of robot device for adjusting position of member supported by robot
CN112247525A (en) Intelligent assembling system based on visual positioning
CN114770461B (en) Mobile robot based on monocular vision and automatic grabbing method thereof
JP2019195885A (en) Control device and robot system
JPH1158273A (en) Mobile robot device
US20230123629A1 (en) 3d computer-vision system with variable spatial resolution
JP2016203282A (en) Robot with mechanism for changing end effector attitude
WO2023013740A1 (en) Robot control device, robot control system, and robot control method
WO2022075303A1 (en) Robot system
CN115409878A (en) AI algorithm for workpiece sorting and homing
CN114643577B (en) Universal robot vision automatic calibration device and method
JPH06218682A (en) Robot for assembly
US20220134577A1 (en) Image processing method, image processing apparatus, robot-mounted transfer device, and system
JP6832408B1 (en) Production system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: A Vision-Guided Localization Method for Composite Robots

Effective date of registration: 20220826

Granted publication date: 20220222

Pledgee: Industrial and Commercial Bank of China Limited Shanghai pilot Free Trade Zone New Area Branch

Pledgor: SIASUN Co.,Ltd.

Registration number: Y2022310000204

PC01 Cancellation of the registration of the contract for pledge of patent right

Date of cancellation: 20221227

Granted publication date: 20220222

Pledgee: Industrial and Commercial Bank of China Limited Shanghai pilot Free Trade Zone New Area Branch

Pledgor: SIASUN Co.,Ltd.

Registration number: Y2022310000204

PC01 Cancellation of the registration of the contract for pledge of patent right
TR01 Transfer of patent right

Effective date of registration: 20230316

Address after: 201306 No. 299, Xueyang Road, Lingang New District, pilot Free Trade Zone, Pudong New Area, Shanghai

Patentee after: SHANGHAI XINSONG ROBOT CO.,LTD.

Address before: Room 101, 201, West, building 11, No. 351 jinzang Road, Pudong New Area, Shanghai

Patentee before: SIASUN Co.,Ltd.

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20231023

Address after: 201206 rooms 101 and 201, west of building 11, 351 jinzang Road, China (Shanghai) pilot Free Trade Zone, Pudong New Area, Shanghai

Patentee after: SIASUN Co.,Ltd.

Address before: 201306 No. 299, Xueyang Road, Lingang New District, pilot Free Trade Zone, Pudong New Area, Shanghai

Patentee before: SHANGHAI XINSONG ROBOT CO.,LTD.

TR01 Transfer of patent right