CN115122331A - Workpiece grabbing method and device - Google Patents

Workpiece grabbing method and device

Info

Publication number
CN115122331A
Authority
CN
China
Prior art keywords
workpiece
real
posture
point
state feedback
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210846168.9A
Other languages
Chinese (zh)
Inventor
刘贵林
刘景亚
陈开�
万小丽
谭云龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CISDI Engineering Co Ltd
CISDI Research and Development Co Ltd
Original Assignee
CISDI Engineering Co Ltd
CISDI Research and Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CISDI Engineering Co Ltd, CISDI Research and Development Co Ltd filed Critical CISDI Engineering Co Ltd
Priority to CN202210846168.9A priority Critical patent/CN115122331A/en
Publication of CN115122331A publication Critical patent/CN115122331A/en
Pending legal-status Critical Current

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1628 Programme controls characterised by the control loop
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1692 Calibration of manipulator
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a workpiece grabbing method and device. A state feedback point is set, and before a workpiece is grabbed, the real-time theoretical posture of the state feedback point is obtained and checked against the grabbing tolerance requirement. If the requirement is met, the real-time theoretical posture of the workpiece grabbing point is further calculated, and the robot system is guided to grab the workpiece according to that posture. In other words, whether the grab can succeed is judged automatically at the state feedback point, and the robot grabs only when it can; this effectively improves the workpiece grabbing efficiency of the robot system and avoids grabbing failures caused by positioning-posture deviations that exceed the threshold.

Description

Workpiece grabbing method and device
Technical Field
The invention relates to the technical field of industrial automation, in particular to a workpiece grabbing method and device.
Background
With the advance of computer technology and automation hardware, automation equipment such as robots has become increasingly common in industrial settings. Workpiece grabbing is a typical robot task, and the posture of the workpiece must be acquired before the robot can grab it. For a workpiece that cannot be located directly, a common approach is to attach a target that is easy to locate, locate the workpiece indirectly through the target, and then guide the robot with the resulting positioning information to complete the grabbing action. If the positioning information deviates too far from the true workpiece posture, however, the grab fails, and conventional schemes provide no check of the positioning quality before the grab is attempted.
Therefore, a robot grabbing scheme that judges the positioning information and feeds the result back before grabbing is needed.
Disclosure of Invention
In view of the above drawbacks of the prior art, the present invention provides a workpiece grabbing technical scheme based on visual guidance, which first determines whether positioning information meets grabbing requirements before grabbing a workpiece, so as to avoid grabbing failure caused by inaccurate positioning information.
To achieve the above and other related objects, the present invention provides the following technical solutions.
A workpiece grabbing method, comprising:
setting a state feedback point and a workpiece grabbing point of a robot system, pre-calibrating a posture conversion relationship, and acquiring a posture transformation matrix from a sign board to the state feedback point and a posture transformation matrix from the state feedback point to the workpiece grabbing point, wherein the sign board is arranged on a workpiece to be grabbed;
acquiring the real-time posture of the sign board, and calculating the real-time theoretical posture of the state feedback point by combining the posture transformation matrix from the sign board to the state feedback point;
and judging whether the real-time theoretical posture of the state feedback point meets the grabbing tolerance requirement; if so, calculating the real-time theoretical posture of the workpiece grabbing point by combining the posture transformation matrix from the state feedback point to the workpiece grabbing point, and guiding the robot system to grab the workpiece to be grabbed according to the real-time theoretical posture of the workpiece grabbing point.
Optionally, before setting a state feedback point and a workpiece grabbing point of the robot system and pre-calibrating the posture conversion relationship, the workpiece grabbing method further includes:
building a positioning target and a sensing system, wherein the positioning target comprises a sign board, a baffle and a connecting plate; the sign board is arranged parallel to the baffle, the projections of their board surfaces overlap, and the sign board is fixedly connected to the baffle through the connecting plate; the sensing system comprises an industrial camera, a laser range finder and a posture calculation module; the robot system comprises a robot body and an end effector, and the industrial camera and the laser range finder are respectively arranged on the end effector.
Optionally, a visual positioning pattern is arranged on the surface of the sign board; a plurality of through holes with non-collinear centers are further arranged on the surface, and the radius of each through hole is smaller than or equal to the tolerance of the end effector when grabbing the workpiece to be grabbed.
Optionally, the step of setting a state feedback point and a workpiece grabbing point of the robot system, pre-calibrating the posture conversion relationship, and acquiring a posture transformation matrix from the sign board to the state feedback point and a posture transformation matrix from the state feedback point to the workpiece grabbing point includes:
acquiring an image of the visual positioning pattern through the industrial camera, and processing the image through the posture calculation module to obtain the posture of the sign board;
setting the state feedback point, adjusting the posture of the robot system, starting the laser range finder, enabling a light spot of the laser range finder to penetrate through the central area of the through hole in the sign board and reach the baffle, and obtaining an ideal distance parameter from the laser range finder to the baffle; calculating to obtain a posture transformation matrix from the mark plate to the state feedback point according to the posture of the robot system at the state feedback point and the posture of the mark plate through the posture calculation module;
and setting the workpiece grabbing point through teaching, and calculating to obtain a posture transformation matrix from the state feedback point to the workpiece grabbing point according to the posture of the robot system at the workpiece grabbing point and the posture of the robot system at the state feedback point through the posture calculation module.
Optionally, the step of obtaining the real-time posture of the sign board, and calculating to obtain the real-time theoretical posture of the state feedback point by combining a posture transformation matrix from the sign board to the state feedback point includes:
acquiring a real-time image of the visual positioning pattern through the industrial camera, and processing the real-time image through the posture calculation module to obtain the real-time posture of the sign board;
and calculating to obtain the real-time theoretical attitude of the state feedback point according to the real-time attitude of the mark plate and the attitude transformation matrix from the mark plate to the state feedback point through the attitude calculation module.
Optionally, the step of determining whether the real-time theoretical attitude of the state feedback point meets the requirement of the capture tolerance includes:
moving the robot system to the real-time theoretical attitude of the state feedback point, starting the laser range finder and acquiring the real-time distance parameter from the laser range finder to the baffle;
obtaining the tolerance range when the end effector grabs the workpiece to be grabbed, and comparing the ideal distance parameter with the real-time distance parameter: if the difference between the two is within the tolerance range, the real-time theoretical attitude of the state feedback point meets the grabbing tolerance requirement; if it is outside the tolerance range, the real-time theoretical attitude of the state feedback point does not meet the grabbing tolerance requirement.
Optionally, the step of calculating a real-time theoretical posture of the workpiece grabbing point by combining the posture transformation matrix from the state feedback point to the workpiece grabbing point, and guiding the robot system to grab the workpiece to be grabbed according to the real-time theoretical posture of the workpiece grabbing point includes:
calculating to obtain the real-time theoretical attitude of the workpiece grabbing point according to the real-time theoretical attitude of the state feedback point and the attitude transformation matrix from the state feedback point to the workpiece grabbing point through the attitude calculation module;
and guiding the robot system to move and driving the end effector to grab the workpiece to be grabbed according to the real-time theoretical posture of the workpiece grabbing point.
Optionally, if the real-time theoretical posture of the state feedback point does not meet the requirement of the grabbing tolerance, the state feedback point and the workpiece grabbing point are reestablished, the posture conversion relationship is precalibrated again, the real-time posture of the sign board is obtained again, and the real-time theoretical posture of the state feedback point is updated until the real-time theoretical posture of the state feedback point meets the requirement of the grabbing tolerance.
A workpiece gripping device comprising:
the positioning target is arranged on a workpiece to be grabbed;
a robotic system;
and the sensing system is arranged on the robot system and used for acquiring the real-time posture of the positioning target, acquiring the real-time theoretical posture of the state feedback point and the real-time theoretical posture of the workpiece grabbing point through a pre-calibrated posture conversion relation, judging whether the real-time theoretical posture of the state feedback point meets the grabbing tolerance requirement or not, and guiding the robot system to grab the workpiece to be grabbed according to the real-time theoretical posture of the workpiece grabbing point meeting the grabbing tolerance requirement.
Optionally, the positioning target includes a sign board, a baffle and a connecting plate; the sign board and the baffle are arranged in parallel, the projections of their board surfaces overlap, and the two are fixedly connected through the connecting plate. The robot system comprises a robot body and an end effector arranged on the robot body. The sensing system comprises an industrial camera, a laser range finder and a posture calculation module, the industrial camera and the laser range finder being respectively arranged on the end effector.
Optionally, a visual positioning pattern is arranged on the surface of the sign board; a plurality of through holes with non-collinear centers are further arranged on the surface, and the radius of each through hole is smaller than or equal to the tolerance of the end effector when grabbing the workpiece to be grabbed.
As described above, the workpiece grabbing method and apparatus provided by the invention at least have the following beneficial effects:
the method comprises the steps of setting a state feedback point, obtaining a real-time theoretical posture of the state feedback point and judging whether the real-time theoretical posture meets the requirement of grabbing tolerance or not before grabbing a workpiece, further calculating to obtain the real-time theoretical posture of a workpiece grabbing point if the real-time theoretical posture meets the requirement of grabbing tolerance, and guiding a robot system to grab the workpiece according to the real-time theoretical posture of the workpiece grabbing point, namely, automatically judging whether the real-time theoretical posture can successfully complete workpiece grabbing or not at the state feedback point, grabbing if the real-time theoretical posture can successfully grab the workpiece, and not grabbing the workpiece if the real-time theoretical posture cannot successfully grab the workpiece, so that the workpiece grabbing efficiency of the robot system is effectively improved, and the phenomenon of workpiece grabbing failure caused by the fact that the deviation of the positioning posture exceeds a threshold value is avoided.
Drawings
FIG. 1 is a schematic illustration of the steps of a method of gripping a workpiece according to the present invention;
FIG. 2 is a schematic view of a workpiece holding device according to an alternative embodiment of the invention;
fig. 3-4 are schematic structural views of the positioning target 1 of fig. 2;
fig. 5-6 are partial flow diagrams of the workpiece grabbing method of the invention.
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention.
It should be noted that the drawings provided with these embodiments only illustrate the basic idea of the invention schematically. The drawings show only the components related to the invention and are not drawn according to the number, shape and size of components in an actual implementation; the form, quantity and proportion of each component may vary in practice, and the component layout may be more complicated. The structures, proportions and sizes shown in the drawings and described in the specification are provided for understanding and reading, not to limit the scope of the invention, which is defined by the claims; any structural modification, change of proportion or adjustment of size that does not affect the efficacy or attainable purpose of the invention falls within the scope of the disclosure.
As described in the foregoing background, the inventors found through study that grabbing a workpiece that cannot be located directly is generally handled by presetting a target, locating the workpiece indirectly through the target, and then guiding the robot with the resulting positioning information to complete the grabbing action; when the positioning deviates beyond the grabbing tolerance, however, the grab simply fails.
Based on this, the inventor proposes a workpiece grabbing technical scheme based on visual guidance: a state feedback point is established before the workpiece grabbing point, and whether the corresponding real-time theoretical posture meets the grabbing tolerance requirement is judged at the state feedback point. If it does, the real-time theoretical posture of the workpiece grabbing point is further calculated and the robot system is guided by it to grab the workpiece; if it does not, the posture is continuously updated until the grabbing tolerance requirement is met.
As shown in fig. 1, the present invention provides a workpiece gripping method, which includes the steps of:
s1, setting a state feedback point and a workpiece grabbing point of the robot system, pre-calibrating a posture conversion relation, acquiring a posture conversion matrix from a marking plate to the state feedback point and a posture conversion matrix from the state feedback point to the workpiece grabbing point, and arranging the marking plate on the workpiece to be grabbed;
s2, acquiring the real-time posture of the marker plate, and calculating to obtain the real-time theoretical posture of the state feedback point by combining the posture transformation matrix from the marker plate to the state feedback point;
and S3, judging whether the real-time theoretical posture of the state feedback point meets the grabbing tolerance requirement, if so, calculating to obtain the real-time theoretical posture of the workpiece grabbing point by combining the posture transformation matrix from the state feedback point to the workpiece grabbing point, and guiding the robot system to grab the workpiece to be grabbed according to the real-time theoretical posture of the workpiece grabbing point.
In detail, as shown in fig. 1, before the step S1 of setting the state feedback point and the workpiece grabbing point of the robot system and pre-calibrating the posture conversion relationship, the workpiece grabbing method further includes the steps of:
s0, a positioning target 1 and a sensing system 2 are established, the positioning target 1 is arranged on a workpiece 0 to be grabbed, the positioning target 1 comprises a mark plate 11 and a baffle plate 12, the mark plate 11 and the baffle plate 12 are arranged in parallel, the sensing system 2 comprises an industrial camera 21, a laser range finder 22 and a gesture calculating module (not shown in the figure), the robot system 3 comprises a robot body 31 and an end effector 32, and the industrial camera 21 and the laser range finder 22 are respectively arranged on the end effector 32.
In detail, as shown in fig. 2, in the robot system 3 the end effector 32 is provided on the robot body 31; the robot body 31 is freely movable and the end effector 32 performs the grabbing action. In step S0, the positioning target 1 is first designed and constructed based on the physical characteristics of the workpiece 0 to be grabbed and the tolerance of the end effector 32 when grabbing it, and the positioning target 1 is fixedly mounted on the workpiece 0.
More specifically, as shown in fig. 2 and 3, the positioning target 1 is disposed on the workpiece 0 to be grabbed. The positioning target 1 includes a sign board 11, a baffle 12 and a connecting plate 13; the sign board 11 is parallel to the baffle 12, the projections of their board surfaces in the YZ plane overlap, and the two are fixedly connected by the connecting plate 13. The distance between the sign board 11 and the baffle 12 along the X axis, i.e. the dimension of the connecting plate 13 in the X direction, is adjustable and is usually set to 3 cm to 5 cm. The overall size of the positioning target 1 matches the physical characteristics of the workpiece 0 and the tolerance of the end effector 32 when grabbing it.
More specifically, as shown in fig. 4, a positioning area 112 is set at the center of the sign board 11 and the remaining area serves as the feedback area 111. A visual positioning pattern is set in the positioning area 112 on the board surface; it may be a common open-source visual positioning pattern, such as a positioning two-dimensional code with a matched image processing algorithm, for example an AprilTag-series pattern, from which the three-dimensional posture of the sign board 11 can be computed. In the feedback area 111, a plurality of through holes A with non-collinear centers are provided on the board surface. The number and radius of the through holes A are determined by the tolerance of the end effector 32 when grabbing the workpiece 0: if the end effector 32 has a tolerance of 2 mm in a certain direction, the radius of the through holes A can be designed as 2 mm. The higher the required tolerance precision, the more through holes are used; beyond guaranteeing the three-dimensional tolerance, additional holes further constrain the rotation-angle precision of the end effector 32. Usually 1 to 3 through holes suffice.
In more detail, as shown in fig. 2, the positioning target 1 is fixedly mounted on the workpiece 0 to be grabbed, at a position determined by the actual situation. The sign board 11 and the baffle 12 are parallel, and the YZ-plane projections of their board surfaces overlap, so that when judging whether the state feedback point meets the grabbing tolerance requirement, the light spot of the laser range finder 22 can pass through the through hole A in the sign board 11, strike the baffle 12 and be reflected.
In more detail, as shown in fig. 2, in step S0 the sensing system 2 is designed and built according to the arrangement of the positioning target 1. The sensing system 2 includes an industrial camera 21, a laser range finder 22 and a posture calculation module (not shown in the figure); the industrial camera 21 and the laser range finder 22 are respectively arranged on the end effector 32, and the posture calculation module can be arranged on the robot body 31. The number and positions of the laser range finders 22 correspond one-to-one with the number and positions of the through holes A on the sign board 11.
In detail, step S1 of setting a state feedback point and a workpiece grasping point of the robot system, pre-calibrating a posture conversion relationship, and obtaining a posture transformation matrix from the sign board to the state feedback point and a posture transformation matrix from the state feedback point to the workpiece grasping point further includes:
S11, as shown in fig. 5, acquiring an image of the visual positioning pattern by the industrial camera 21, and processing the image through the posture calculation module to obtain the posture of the sign board 11;
S12, as shown in fig. 6, setting the state feedback point, adjusting the posture of the robot system 3, and starting the laser range finder 22 so that its light spot passes through the central area of the through hole A on the sign board 11 and reaches the baffle 12, obtaining and storing the ideal distance parameter from the laser range finder 22 to the baffle 12; calculating, through the posture calculation module, the posture transformation matrix from the sign board 11 to the state feedback point according to the posture of the robot system 3 at the state feedback point and the posture of the sign board 11;
S13, setting the workpiece grabbing point through teaching, and calculating, through the posture calculation module, the posture transformation matrix from the state feedback point to the workpiece grabbing point according to the posture of the robot system 3 at the workpiece grabbing point and its posture at the state feedback point.
In more detail, as shown in fig. 5, in step S11 a photographing point of the robot is selected where the imaging of the visual positioning pattern in the positioning area 112 of the sign board 11 is best; an image is captured at that point, and the posture matrix of the sign board 11 is obtained through the image processing algorithm.
In more detail, as shown in fig. 6, in step S12 the state feedback point is established first and its distance parameter is obtained; after the state feedback point is set, the posture transformation matrix from the sign board 11 to the state feedback point is calculated from the current posture matrix of the robot system 3 and the posture matrix of the sign board obtained in step S11.
In more detail, in step S13 the workpiece grabbing point is set first, with the grasp position taught to the robot system 3 by demonstration; after the workpiece grabbing point is set, the posture transformation matrix from the state feedback point to the workpiece grabbing point is calculated from the current posture matrix of the robot system 3 and the posture matrix of the state feedback point obtained in step S12.
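The pre-calibration in steps S11 to S13 amounts to composing homogeneous pose matrices. The sketch below is illustrative only (it is not part of the patent, and all numeric poses are invented); it shows how the two stored transforms can be derived from three poses expressed in the robot base frame:

```python
import numpy as np

def make_pose(R, t):
    """Assemble a 4x4 homogeneous pose matrix from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical calibration poses, all expressed in the robot base frame:
#   T_marker : pose of the sign board 11 measured from the camera image (step S11)
#   T_fb     : robot pose at the chosen state feedback point (step S12)
#   T_grasp  : robot pose at the taught workpiece grabbing point (step S13)
T_marker = make_pose(np.eye(3), np.array([0.50, 0.10, 0.30]))
T_fb     = make_pose(np.eye(3), np.array([0.45, 0.10, 0.30]))
T_grasp  = make_pose(np.eye(3), np.array([0.45, 0.10, 0.10]))

# Pre-calibrated relative transforms stored for later use:
T_marker_to_fb = np.linalg.inv(T_marker) @ T_fb   # sign board -> state feedback point
T_fb_to_grasp  = np.linalg.inv(T_fb) @ T_grasp    # feedback point -> grabbing point
```

Because both stored transforms are relative, they remain valid when the workpiece, and with it the sign board, later moves; this is what the real-time steps rely on.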
After the positioning target 1 and the sensing system 2 are built and the posture conversion relationship is pre-calibrated, workpiece grabbing can be carried out; during an actual grab, the robot system 3 is guided to complete the operation according to the posture information calculated by the sensing system 2.
In detail, step S2 of obtaining the real-time posture of the signboard 11, and calculating the real-time theoretical posture of the state feedback point by combining the posture transformation matrix from the signboard 11 to the state feedback point, further includes:
s21, as shown in FIG. 5, acquiring a real-time image of the visual positioning pattern by the industrial camera 21, and processing the real-time image of the visual positioning pattern by the gesture calculation module to obtain a real-time gesture of the sign board 11;
and S22, calculating to obtain the real-time theoretical attitude of the state feedback point according to the real-time attitude of the sign board 11 and the attitude transformation matrix from the sign board 11 to the state feedback point through the attitude calculation module.
In more detail, in step S21, similar to the previous pre-calibration, the real-time image of the visual positioning pattern on the signboard 11 is acquired by the industrial camera 21 in the sensing system 2, and the real-time posture of the signboard 11 at the current moment is obtained through an image processing algorithm; in step S22, the real-time theoretical attitude (or the calculated attitude) of the state feedback point at the current time is calculated by combining the pre-calibrated attitude transformation matrix from the signboard 11 to the state feedback point.
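Steps S21 and S22 reduce to a single matrix product: the freshly measured sign-board pose chained with the pre-calibrated transform yields the expected feedback-point pose. The values below are hypothetical; a real system would obtain the real-time sign-board pose from the camera and the image processing algorithm:

```python
import numpy as np

# Pre-calibrated sign-board -> state-feedback-point transform (illustrative values):
# here a pure translation of -5 cm along X in the sign-board frame.
T_marker_to_fb = np.eye(4)
T_marker_to_fb[0, 3] = -0.05

# Real-time pose of the sign board in the robot base frame, as would be recovered
# from the current camera image (step S21); the workpiece has shifted slightly
# since calibration.
T_marker_rt = np.eye(4)
T_marker_rt[:3, 3] = [0.52, 0.11, 0.30]

# Step S22: chain the real-time sign-board pose with the calibrated transform to
# get the real-time theoretical posture of the state feedback point.
T_fb_rt = T_marker_rt @ T_marker_to_fb
```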
In detail, in step S3, the step of determining whether the real-time theoretical pose of the state feedback point meets the grabbing tolerance requirement further includes:
Stp1, moving the robot system 3 to the real-time theoretical attitude of the state feedback point, starting the laser range finder 22 and acquiring the real-time distance parameter from the laser range finder 22 to the baffle 12;
Stp2, obtaining the tolerance range when the end effector 32 grabs the workpiece 0 to be grabbed, and comparing the ideal distance parameter with the real-time distance parameter: if the difference between the two is within the tolerance range, the real-time theoretical attitude of the state feedback point meets the grabbing tolerance requirement; if it is outside the tolerance range, the requirement is not met.
In more detail, in step Stp1, the robot system 3 is moved to the real-time theoretical attitude of the state feedback point at the present time, the laser rangefinder 22 is turned on and the real-time distance parameter of the laser rangefinder 22 to the barrier 12 is acquired.
In more detail, in step Stp2, the tolerance range, ideal distance parameter and real-time distance parameter for the end effector 32 grabbing the workpiece 0 are determined. At the real-time theoretical attitude of the state feedback point, if the difference between the real-time and ideal distances measured by the laser range finder 22 is within the tolerance range, e.g. ±2 mm, the light spot of the laser range finder 22 passes through the central area of the through hole A on the sign board 11 and reaches the baffle 12; the positioning in the YZ plane then meets the grabbing tolerance requirement, and so does the distance along the X axis, i.e. the real-time theoretical attitude of the state feedback point meets the grabbing tolerance requirement in three-dimensional space.
In more detail, in step Stp2, when there are multiple laser range finders 22, the real-time theoretical posture of the state feedback point satisfies the grabbing tolerance requirement in three-dimensional space only if the difference between the ideal distance parameter and the real-time distance parameter is within the tolerance range for every laser range finder 22.
In more detail, in step Stp2, when there are multiple laser range finders 22, the positioning accuracy of the rotation angles can be checked in addition to the three-dimensional positioning accuracy: if the difference between the ideal distance parameter and the real-time distance parameter for at least one laser range finder 22 falls outside the tolerance range, some rotation angle of the state feedback point is out of specification, the real-time theoretical posture of the state feedback point does not satisfy the "three-dimensional space + rotation angle" grabbing tolerance requirement, and the posture must be confirmed again.
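The acceptance test of steps Stp1 to Stp2 reduces to a per-rangefinder comparison against the calibrated ideal reading, with a single out-of-range reading rejecting the whole posture. A minimal sketch in Python; the ±2 mm default and all names here are illustrative assumptions, not taken from the patent:

```python
def pose_within_tolerance(ideal_mm, measured_mm, tol_mm=2.0):
    """Return True only if every rangefinder's real-time reading deviates
    from its calibrated ideal reading by at most tol_mm.

    ideal_mm, measured_mm: sequences with one entry per laser range
    finder mounted on the end effector.
    """
    if len(ideal_mm) != len(measured_mm):
        raise ValueError("one measurement per rangefinder expected")
    # One out-of-range reading means some rotation angle of the state
    # feedback point is off, so the whole posture is rejected.
    return all(abs(i - m) <= tol_mm for i, m in zip(ideal_mm, measured_mm))

# Example: two rangefinders; the second reading is 3 mm off, so the
# posture fails the "three-dimensional space + rotation angle" check.
print(pose_within_tolerance([150.0, 150.0], [151.2, 153.0]))  # False
print(pose_within_tolerance([150.0, 150.0], [151.2, 148.5]))  # True
```

With a single rangefinder the same function degenerates to the plain distance check of step Stp2.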
In detail, in step S3, the step of calculating the real-time theoretical posture of the workpiece grabbing point by using the posture transformation matrix from the state feedback point to the workpiece grabbing point, and guiding the robot system to grab the workpiece to be grabbed according to the real-time theoretical posture of the workpiece grabbing point, further includes:
Stp3, calculating, through the posture calculation module, the real-time theoretical posture of the workpiece grabbing point from the real-time theoretical posture of the state feedback point and the posture transformation matrix from the state feedback point to the workpiece grabbing point;
Stp4, guiding the robot system 3 to move according to the real-time theoretical posture of the workpiece grabbing point and driving the end effector 32 to grab the workpiece 0 to be grabbed.
More specifically, in steps Stp3 to Stp4, the real-time theoretical posture of the workpiece grabbing point at the current time is calculated from the real-time theoretical posture of the state feedback point at the current time combined with the pre-calibrated posture transformation matrix from the state feedback point to the workpiece grabbing point, and the robot system 3 is guided by this posture information to complete the workpiece grabbing operation.
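With postures represented as 4×4 homogeneous transforms, the calculation chain is a pair of matrix products: the marker-plate posture observed by the camera is composed with the two pre-calibrated transformation matrices to reach first the state feedback point and then the workpiece grabbing point. A sketch using NumPy; the patent does not fix a representation, and all matrix values below are invented for illustration:

```python
import numpy as np

def make_pose(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Real-time sign-board posture measured by the industrial camera (illustrative).
T_base_marker = make_pose(np.eye(3), [1.0, 0.2, 0.5])
# Pre-calibrated: sign board -> state feedback point.
T_marker_fb = make_pose(np.eye(3), [0.0, 0.0, 0.3])
# Pre-calibrated: state feedback point -> workpiece grabbing point.
T_fb_grab = make_pose(np.eye(3), [0.0, 0.0, 0.1])

# Real-time theoretical posture of the state feedback point.
T_base_fb = T_base_marker @ T_marker_fb
# Stp3: real-time theoretical posture of the workpiece grabbing point.
T_base_grab = T_base_fb @ T_fb_grab

# Stp4 would hand this posture to the robot controller.
print(T_base_grab[:3, 3])
```

With the identity rotations used here the translations simply add, giving a grab-point position of (1.0, 0.2, 0.9).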
In more detail, in step S3, if the real-time theoretical posture of the state feedback point does not satisfy the grabbing tolerance requirement, the above steps are repeated: the state feedback point and the workpiece grabbing point are re-established, the posture conversion relationship is pre-calibrated again, the real-time posture of the marking plate 11 is re-acquired, and the real-time theoretical posture of the state feedback point is updated, until the real-time theoretical posture of the state feedback point satisfies the grabbing tolerance requirement. The real-time theoretical posture of the workpiece grabbing point is then calculated and the grab is completed.
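The fallback path of step S3 amounts to a verify-then-grab loop: move to the feedback-point posture, run the rangefinder check, and only proceed to the grab when it passes, otherwise re-calibrate and retry. A control-flow sketch in Python; every callable and the retry limit are placeholders of this illustration, not the patent's API:

```python
def grab_workpiece(acquire_fb_pose, check_tolerance, move_to, grab_pose_of,
                   recalibrate, max_attempts=5):
    """Verify-then-grab loop of step S3 (all callables supplied by the
    caller; this shows only the control flow)."""
    for _ in range(max_attempts):
        pose_fb = acquire_fb_pose()        # real-time feedback-point posture
        move_to(pose_fb)                   # Stp1: move to the state feedback point
        if check_tolerance(pose_fb):       # Stp2: rangefinder tolerance test
            move_to(grab_pose_of(pose_fb))  # Stp3/Stp4: compute grab posture, grab
            return True
        recalibrate()                      # re-establish points, re-calibrate
    return False                           # give up after max_attempts

# Toy run: the tolerance check passes on the second attempt.
outcomes = iter([False, True])
ok = grab_workpiece(
    acquire_fb_pose=lambda: "pose",
    check_tolerance=lambda p: next(outcomes),
    move_to=lambda p: None,
    grab_pose_of=lambda p: p,
    recalibrate=lambda: None,
)
print(ok)  # True
```

Injecting the sensing and motion steps as callables keeps the loop testable without robot hardware.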
In addition, in order to implement the above workpiece gripping method, the present invention also provides a workpiece gripping device, as shown in fig. 2, which includes:
the positioning target 1 is arranged on a workpiece 0 to be grabbed;
a robot system 3;
and the sensing system 2, which is arranged on the robot system 3 and is used for: acquiring the real-time posture of the positioning target 1; obtaining the real-time theoretical posture of the state feedback point and the real-time theoretical posture of the workpiece grabbing point through the pre-calibrated posture conversion relationship; judging whether the real-time theoretical posture of the state feedback point meets the grabbing tolerance requirement; and guiding the robot system 3 to grab the workpiece 0 to be grabbed according to a real-time theoretical posture of the workpiece grabbing point that meets the grabbing tolerance requirement.
As shown in fig. 2, the positioning target 1 includes a sign plate 11, a baffle plate 12 and a connecting plate 13, the sign plate 11 and the baffle plate 12 are arranged in parallel, and the projection of the plate surface of the sign plate 11 in the YZ plane and the projection of the plate surface of the baffle plate 12 in the YZ plane have an overlapping portion, and the sign plate 11 and the baffle plate 12 are fixedly connected by the connecting plate 13; the robot system 3 includes a robot body 31 and an end effector 32, the end effector 32 being provided on the robot body 31; the sensing system 2 includes an industrial camera 21, a laser range finder 22 and a posture calculation module, and the industrial camera 21 and the laser range finder 22 are respectively disposed on the end effector 32.
In detail, as shown in fig. 2, a visual positioning pattern is disposed on the surface of the sign board 11, a plurality of through holes a with non-collinear circle centers are further disposed on the surface of the sign board 11, and the radius of the through holes a is smaller than or equal to the tolerance when the end effector 32 grabs the workpiece 0 to be grabbed.
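The through-hole geometry is what turns a plain distance reading into a pass/fail posture check: the laser spot reaches the baffle 12 only if its lateral (YZ) deviation from the hole centre is smaller than the hole radius, and the radius is at most the grabbing tolerance. A numeric sketch; the dimensions are invented for illustration:

```python
import math

def spot_reaches_baffle(dy, dz, hole_radius):
    """The beam passes the sign-board through hole only if its YZ offset
    from the hole centre is within the hole radius; otherwise it hits
    the sign board and the rangefinder reads a much shorter distance."""
    return math.hypot(dy, dz) < hole_radius

hole_radius = 2.0  # mm; chosen <= grabbing tolerance, as the text requires
print(spot_reaches_baffle(0.8, 1.0, hole_radius))  # True: deviation ~1.28 mm
print(spot_reaches_baffle(1.5, 1.5, hole_radius))  # False: deviation ~2.12 mm
```

So a rangefinder reading near the calibrated baffle distance simultaneously certifies the depth (X) and the lateral (YZ) accuracy of the posture.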
It should be noted that the working principle of the workpiece grabbing device can refer to the workpiece grabbing method, and details are not described herein.
In summary, in the workpiece grabbing method and device provided by the invention, a state feedback point is set. Before grabbing the workpiece, the real-time theoretical posture of the state feedback point is obtained and checked against the grabbing tolerance requirement; only if the requirement is met is the real-time theoretical posture of the workpiece grabbing point further calculated and the robot system guided to grab the workpiece accordingly. In other words, at the state feedback point the system automatically judges whether the current posture can complete the grab: it grabs if it can and refrains if it cannot. This effectively improves the workpiece grabbing efficiency of the robot system and avoids grab failures caused by positioning-posture deviations exceeding the threshold.
The foregoing embodiments merely illustrate the principles and effects of the present invention and are not intended to limit it. Any person skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the present invention. Accordingly, all equivalent modifications or changes made by those of ordinary skill in the art without departing from the spirit and technical ideas disclosed herein shall be covered by the claims of the present invention.

Claims (11)

1. A method of workpiece grasping, comprising:
setting a state feedback point and a workpiece grabbing point of a robot system, pre-calibrating a posture conversion relation, and acquiring a posture conversion matrix from a mark plate to the state feedback point and a posture conversion matrix from the state feedback point to the workpiece grabbing point, wherein the mark plate is arranged on a workpiece to be grabbed;
acquiring the real-time posture of the marking plate, and calculating to obtain the real-time theoretical posture of the state feedback point by combining a posture transformation matrix from the marking plate to the state feedback point;
and judging whether the real-time theoretical posture of the state feedback point meets the grabbing tolerance requirement, and if so, calculating the real-time theoretical posture of the workpiece grabbing point by combining the posture transformation matrix from the state feedback point to the workpiece grabbing point, and guiding the robot system to grab the workpiece to be grabbed according to the real-time theoretical posture of the workpiece grabbing point.
2. The workpiece grasping method according to claim 1, wherein before setting up the state feedback point and the workpiece grasping point of the robot system and performing the posture conversion relationship pre-calibration, the workpiece grasping method further comprises:
providing a positioning target and a sensing system, wherein the positioning target comprises a marking plate, a baffle and a connecting plate, the marking plate is arranged in parallel with the baffle, the plate-surface projection of the marking plate and the plate-surface projection of the baffle have an overlapping portion, and the marking plate is fixedly connected with the baffle through the connecting plate; the sensing system comprises an industrial camera, a laser range finder and a posture calculation module; the robot system comprises a robot body and an end effector; and the industrial camera and the laser range finder are respectively arranged on the end effector.
3. The workpiece grabbing method according to claim 2, wherein a visual positioning pattern is arranged on the plate surface of the marking plate, a plurality of through holes with non-collinear circle centers are further arranged on the plate surface of the marking plate, and the radius of each through hole is smaller than or equal to the tolerance of the end effector when grabbing the workpiece to be grabbed.
4. The workpiece grabbing method according to claim 3, wherein the step of setting a state feedback point and a workpiece grabbing point of the robot system, performing posture transformation relationship pre-calibration, and obtaining a posture transformation matrix from the marking plate to the state feedback point and a posture transformation matrix from the state feedback point to the workpiece grabbing point comprises:
acquiring an image of the visual positioning pattern through the industrial camera, and processing the image of the visual positioning pattern through the posture calculation module to obtain the posture of the marking plate;
setting the state feedback point, adjusting the posture of the robot system, starting the laser range finder, enabling a light spot of the laser range finder to penetrate through the central area of the through hole in the sign board and reach the baffle, and obtaining an ideal distance parameter from the laser range finder to the baffle; calculating to obtain a posture transformation matrix from the mark plate to the state feedback point according to the posture of the robot system at the state feedback point and the posture of the mark plate through the posture calculation module;
and setting the workpiece grabbing point through teaching, and calculating to obtain a posture transformation matrix from the state feedback point to the workpiece grabbing point according to the posture of the robot system at the workpiece grabbing point and the posture of the robot system at the state feedback point through the posture calculation module.
5. The workpiece grabbing method according to claim 4, wherein the step of obtaining the real-time posture of the marking plate and calculating the real-time theoretical posture of the state feedback point by combining a posture transformation matrix from the marking plate to the state feedback point comprises the following steps:
acquiring a real-time image of the visual positioning pattern through the industrial camera, and processing the real-time image of the visual positioning pattern through the posture calculation module to obtain the real-time posture of the marking plate;
and calculating, through the posture calculation module, the real-time theoretical posture of the state feedback point according to the real-time posture of the marking plate and the posture transformation matrix from the marking plate to the state feedback point.
6. The workpiece grabbing method according to claim 5, wherein the step of judging whether the real-time theoretical posture of the state feedback point meets the grabbing tolerance requirement comprises:
moving the robot system to the real-time theoretical posture of the state feedback point, starting the laser range finder, and acquiring the real-time distance parameter from the laser range finder to the baffle;
and obtaining the tolerance range for the end effector grabbing the workpiece to be grabbed, and comparing the ideal distance parameter with the real-time distance parameter, wherein if the difference between the ideal distance parameter and the real-time distance parameter is within the tolerance range, the real-time theoretical posture of the state feedback point meets the grabbing tolerance requirement, and if the difference is outside the tolerance range, the real-time theoretical posture of the state feedback point does not meet the grabbing tolerance requirement.
7. The workpiece grabbing method according to claim 6, wherein the step of calculating a real-time theoretical posture of the workpiece grabbing point by combining the posture transformation matrix from the state feedback point to the workpiece grabbing point, and guiding the robot system to grab the workpiece to be grabbed according to the real-time theoretical posture of the workpiece grabbing point comprises the following steps:
calculating, through the posture calculation module, the real-time theoretical posture of the workpiece grabbing point according to the real-time theoretical posture of the state feedback point and the posture transformation matrix from the state feedback point to the workpiece grabbing point;
and guiding the robot system to move and driving the end effector to grab the workpiece to be grabbed according to the real-time theoretical posture of the workpiece grabbing point.
8. The workpiece grabbing method according to any one of claims 1 to 7, wherein if the real-time theoretical posture of the state feedback point does not meet the grabbing tolerance requirement, the state feedback point and the workpiece grabbing point are re-established, the posture transformation relationship pre-calibration is performed again, the real-time posture of the marking plate is re-acquired, and the real-time theoretical posture of the state feedback point is updated until the real-time theoretical posture of the state feedback point meets the grabbing tolerance requirement.
9. A workpiece grasping device, comprising:
the positioning target is arranged on a workpiece to be grabbed;
a robotic system;
and the sensing system is arranged on the robot system and used for acquiring the real-time posture of the positioning target, acquiring the real-time theoretical posture of the state feedback point and the real-time theoretical posture of the workpiece grabbing point through a pre-calibrated posture conversion relation, judging whether the real-time theoretical posture of the state feedback point meets the grabbing tolerance requirement or not, and guiding the robot system to grab the workpiece to be grabbed according to the real-time theoretical posture of the workpiece grabbing point meeting the grabbing tolerance requirement.
10. The workpiece grabbing device according to claim 9, wherein the positioning target comprises a mark plate, a baffle plate and a connecting plate, the mark plate and the baffle plate are arranged in parallel, the plate surface projection of the mark plate and the plate surface projection of the baffle plate have an overlapping part, and the mark plate and the baffle plate are fixedly connected through the connecting plate; the robot system comprises a robot body and an end effector, wherein the end effector is arranged on the robot body; the sensing system comprises an industrial camera, a laser range finder and a posture calculation module, wherein the industrial camera and the laser range finder are respectively arranged on the end effector.
11. The workpiece gripping device according to claim 10, wherein a visual positioning pattern is provided on the surface of the marking plate, a plurality of through holes with non-collinear circle centers are further provided on the surface of the marking plate, and the radius of the through holes is smaller than or equal to the tolerance of the end effector in gripping the workpiece to be gripped.
CN202210846168.9A 2022-07-04 2022-07-04 Workpiece grabbing method and device Pending CN115122331A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210846168.9A CN115122331A (en) 2022-07-04 2022-07-04 Workpiece grabbing method and device


Publications (1)

Publication Number Publication Date
CN115122331A true CN115122331A (en) 2022-09-30

Family

ID=83382986

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210846168.9A Pending CN115122331A (en) 2022-07-04 2022-07-04 Workpiece grabbing method and device

Country Status (1)

Country Link
CN (1) CN115122331A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190188435A1 (en) * 2011-08-30 2019-06-20 Digimarc Corporation Methods and arrangements for identifying objects
CN110842928A (en) * 2019-12-04 2020-02-28 中科新松有限公司 Visual guiding and positioning device and method for compound robot
CN111775146A (en) * 2020-06-08 2020-10-16 南京航空航天大学 Visual alignment method under industrial mechanical arm multi-station operation
CN112936257A (en) * 2021-01-22 2021-06-11 熵智科技(深圳)有限公司 Workpiece grabbing method and device, computer equipment and storage medium
WO2021228181A1 (en) * 2020-05-13 2021-11-18 中国科学院福建物质结构研究所 3d printing method and device
CN114074331A (en) * 2022-01-19 2022-02-22 成都考拉悠然科技有限公司 Disordered grabbing method based on vision and robot
JP2022039906A (en) * 2020-08-28 2022-03-10 中国計量大学 Multi-sensor combined calibration device and method


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
KENTARO NOZU: "Robotic bolt insertion and tightening based on in-hand object localization and force sensing", 2018 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, 2 September 2018 (2018-09-02) *
YU HAN: "Research on an Autonomous Grasping System for Explosive-Ordnance-Disposal Robots Based on Depth Vision", China Masters' Theses Full-text Database, Information Science and Technology, 15 July 2020 (2020-07-15) *
LIU JINGYA: "Autonomous Motion Planning of Industrial Robots Based on Reinforcement Learning", Robot Technology and Application, 30 May 2021 (2021-05-30) *

Similar Documents

Publication Publication Date Title
EP3222393B1 (en) Automated guidance system and method for a coordinated movement machine
US6681151B1 (en) System and method for servoing robots based upon workpieces with fiducial marks using machine vision
US9519736B2 (en) Data generation device for vision sensor and detection simulation system
US7200260B1 (en) Teaching model generating device
CN102448679B (en) Method and system for extremely precise positioning of at least one object in the end position in space
JP2022028672A (en) System and method for automatic hand-eye calibration of vision system for robot motion
JP5365379B2 (en) Robot system and robot system calibration method
US20050273199A1 (en) Robot system
CN212284935U (en) Sheet metal sorting device based on 3D vision
JP2015213973A (en) Picking device and picking method
JP2016099257A (en) Information processing device and information processing method
CN110740841B (en) Operating system
CN111278608A (en) Calibration article for 3D vision robot system
US20190030722A1 (en) Control device, robot system, and control method
US20230173660A1 (en) Robot teaching by demonstration with visual servoing
CN115122331A (en) Workpiece grabbing method and device
EP4116043A2 (en) System and method for error correction and compensation for 3d eye-to-hand coordination
WO2023013740A1 (en) Robot control device, robot control system, and robot control method
US20220395981A1 (en) System and method for improving accuracy of 3d eye-to-hand coordination of a robotic system
WO2023102647A1 (en) Method for automated 3d part localization and adjustment of robot end-effectors
CN110977950B (en) Robot grabbing and positioning method
US12017371B2 (en) Efficient and robust line matching approach
JPH09323280A (en) Control method and system of manupulator
Wang et al. Robotic assembly system guided by multiple vision and laser sensors for large scale components
US20230123629A1 (en) 3d computer-vision system with variable spatial resolution

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination