CN112123342B - Robot system and measurement and control method - Google Patents

Robot system and measurement and control method

Info

Publication number
CN112123342B
CN112123342B (application CN202011333511.7A)
Authority
CN
China
Prior art keywords
substrate
coordinate system
stacked
axis
top surface
Prior art date
Legal status
Active
Application number
CN202011333511.7A
Other languages
Chinese (zh)
Other versions
CN112123342A (en)
Inventor
姜峣
陈志远
李逢春
李铁民
Current Assignee
Tsinghua University
Original Assignee
Tsinghua University
Priority date
Filing date
Publication date
Application filed by Tsinghua University
Priority to CN202011333511.7A
Publication of CN112123342A
Application granted
Publication of CN112123342B

Classifications

    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J13/088 Controls for manipulators by means of sensing devices with position, velocity or acceleration sensors
    • B25J9/1612 Programme controls characterised by the hand, wrist, grip control
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • B65G61/00 Use of pick-up or transfer devices or of manipulators for stacking or de-stacking articles not otherwise provided for

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Manipulator (AREA)

Abstract

A robot system and a measurement and control method are provided. The measurement and control method comprises: driving the mechanical arm so that the end gripping module moves above the article to be stacked; measuring with at least 3 distance measuring elements on the end gripping module; calculating the amount of movement required to bring the plate surface of the substrate parallel to the top surface; driving the mechanical arm so that the substrate moves until the plate surface and the top surface are parallel; shooting the top of the article to be stacked with a first camera; calculating, from the picture, the amount of movement required to move a suction cup fixed relative to the substrate into alignment, along the normal direction of the substrate, with a predetermined gripping point on the top surface; and driving the mechanical arm to move the substrate so that the suction cup aligns with the predetermined gripping point and then approaches the article along the normal direction of the substrate until it grips the article. Gripping articles to be stacked with this measurement and control method achieves higher precision.

Description

Robot system and measurement and control method
Technical Field
The present disclosure relates to robotics, and more particularly, to a robot system and a measurement and control method.
Background
Efficient performance of repetitive operations is an important feature of robotic applications, and repetitive multi-component operations are especially common in logistics and assembly: palletizing in the logistics industry, wall stacking and tile laying in the construction industry, and so forth. However, in contrast to mature industrial environments, logistics warehouses and construction sites are generally unstructured: workpiece positions are not fixed, and the spatial relations among objects arise at random, which challenges the operating accuracy of the robot.
Even when the robot model is well determined, operating accuracy is limited by measurement errors, which arise in two stages. First, in the grasping stage, the robot acquires the 6D pose of a workpiece in the scene through sensors and then computes the grasp configuration, including the grasp point and grasp direction. Second, the robot must acquire the 6D pose of the workpiece to be placed or assembled so that the workpiece can be installed accurately at the desired position. Pose measurement in unstructured environments is currently performed mainly by vision, but at the present stage vision measurement alone cannot support robot tasks with high precision requirements. One practical approach is to first guide the robot end roughly to the vicinity of the desired position by vision, and then use other means with high-precision measurement capability to assist the robot in performing the operation.
Against this background, a stable and reliable target-pose measurement and control method and system meeting higher precision requirements is urgently needed.
Disclosure of Invention
The application provides a measurement and control method of a robot system, which can grab articles to be stacked with high precision.
The measurement and control method comprises the following steps:
driving the mechanical arm so that the end gripping module moves above the article to be stacked;
measuring with at least 3 distance measuring elements on the end gripping module to obtain the distance, in the normal direction of the substrate, from the measurement starting point of each measuring element to the top surface of the article to be stacked;
calculating, from the distance from each measurement starting point to the top surface and the relative positional relationship between each measurement starting point and the substrate of the end gripping module, the amount of movement required to bring the plate surface of the substrate parallel to the top surface;
driving the mechanical arm so that the substrate moves until the plate surface and the top surface are parallel to each other;
shooting the top of the article to be stacked with a first camera to obtain a top picture of the article to be stacked;
calculating, from the top picture and the relative positional relationship between the first camera and the substrate, the amount of movement of the substrate required to move a suction cup, fixed relative to the substrate, into alignment with a predetermined grasping point on the top surface in the normal direction of the substrate;
driving the mechanical arm to move the substrate so that the suction cup moves into alignment with the predetermined grasping point, and then moving the suction cup toward the article to be stacked along the normal direction of the substrate until the end gripping module grips the article to be stacked.
According to the invention, the substrate is first rotated parallel to the top surface of the article to be stacked according to the measurement results of the distance measuring elements; the top of the article is then photographed by the first camera, and the suction cup is moved, according to the photograph, into alignment with the predetermined grasping point on the top surface. High-precision alignment of the suction cup with the predetermined grasping point on the top surface of the article to be stacked is thus achieved.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the application. Other advantages of the present application may be realized and attained by the instrumentalities and combinations particularly pointed out in the specification and the drawings.
Drawings
The accompanying drawings provide an understanding of the present disclosure, are incorporated in and constitute a part of this specification, and illustrate embodiments that, together with the description, explain the principles of the disclosure without limiting it.
Fig. 1 is a schematic structural diagram of a robot system in an embodiment of the present application;
FIG. 2 is a schematic structural diagram of an end capture module in an embodiment of the present application;
FIG. 3 is a schematic diagram of a robotic system according to an embodiment of the present disclosure;
FIG. 4 is a flowchart of the measurement and control method in the embodiment of the present application at the grasping stage;
FIG. 5 is a schematic view of an embodiment of the present disclosure illustrating alignment of a substrate with a top surface of an object to be stacked;
FIG. 6 is a schematic diagram of an imaging model of a camera in an embodiment of the present application;
FIG. 7 is a schematic diagram of the operation of a first camera in the embodiment of the present application;
FIG. 8 is a schematic diagram of the operation of a reference surface generator in an embodiment of the present application;
FIG. 9 is a schematic diagram of the operation of a second camera in the embodiment of the present application;
fig. 10 is a flowchart of the measurement and control method in the embodiment of the present application at the stacking stage.
Detailed Description
Fig. 1 shows a robot system 100. The robot system 100 comprises a measuring assembly 2, a calculating component 3, and an executing device 1. The measuring assembly 2 and the executing device 1 are electrically connected to the calculating component 3.
The execution apparatus 1 includes a robot arm 11 and an end gripping module 12. The robot arm 11 may be a multi-axis robot arm 11, for example, a six-axis robot arm 11. The robotic arm 11 may be mounted on the ground. The end-grasping module 12 is provided at the end of the robot arm 11. The end gripping module 12 is used to grip the items 4 to be stacked.
As shown in fig. 2, the end gripping module 12 includes a base plate 121, a first connecting post 122, a second connecting post 126, a mounting plate 127, and a plurality of suction cups 128. The substrate 121 is configured as a flat plate. The substrate 121 may be a square flat plate. The first connection post 122 and the second connection post 126 are perpendicular to the substrate 121. The first connection post 122 and the second connection post 126 are respectively located at two opposite sides of the substrate 121. The first connection post 122 and the second connection post 126 vertically protrude from the opposite plate surfaces of the base plate 121, respectively. The first connection post 122 and the second connection post 126 may both be connected to the middle of the substrate 121.
The first connecting post 122 includes a post 125, a first flange 123 and a second flange 124. The pillar 125 has a straight bar shape. The cylinder 125 may be of a generally circular tubular configuration. The first flange 123 and the second flange 124 are respectively disposed at both ends of the cylinder 125. The first flange 123 and the second flange 124 are both disposed coaxially with the cylinder 125. The first flange 123 is used for screw connection with the robot arm 11. The second flange 124 is used for screw connection with the base plate 121.
One end of the second connection post 126 is connected to the base plate 121, and the other end of the second connection post 126 is connected to the mounting plate 127. The second connecting post 126 may be connected to the middle of the mounting plate 127. The mounting plate 127 may be a flat plate. The mounting plate 127 is parallel to the substrate 121.
The mounting plate 127 is provided with a plurality of mounting seats 120. A suction cup 128 is mounted on each mounting block 120. Suction cup 128 is located on a side of mounting base 120 facing away from second connecting post 126. In the present embodiment, four suction cups 128 are disposed on the mounting plate 127, and the four suction cups 128 are distributed in a matrix. The suction cup 128 can be externally connected to a vacuum line that can provide negative pressure inside the suction cup 128 so that the suction cup 128 can suck the items 4 to be stacked.
The computing assembly 3 is mounted on a substrate 121. The computing component 3 may be an embedded computer. The computing component 3 is located on the surface of the substrate 121 away from the second connecting column 126.
The measuring assembly 2 comprises a plurality of distance measuring elements 21, a first camera 22, a second camera 23, a reference plane generator 25, a plurality of detection elements 24, a first support 129 and a second support 133.
The distance measuring elements 21 are mounted on the plate surface of the substrate 121 near the suction cup 128. At least 3 distance measuring elements 21 are provided; in the present embodiment there are 4, disposed near the four edges of the substrate 121. Each distance measuring element 21 measures the distance between the article 4 to be stacked and the element in the direction perpendicular to the base plate 121. The distance measuring element 21 may be a laser displacement sensor: it emits laser light in the direction perpendicular to the base plate 121, and the light strikes the article 4 to be stacked, so that the distance from the element to the article 4 in that direction can be measured.
The first camera 22 is disposed on the plate surface of the substrate 121 near the suction cup 128, with its lens facing away from the substrate 121, so that it can photograph the area below the end gripping module. The first camera 22 is close to one corner of the substrate 121.
The first support 129 is a strip-shaped plate. The substrate 121 has a first side 1211, and the first side 1211 of the substrate 121 is perpendicular to the surface of the substrate 121. One end of the first bracket 129 is connected to the first side surface 1211 of the substrate 121, and the other end extends obliquely downward from the substrate 121. The first bracket 129 extends to a position substantially flush with the suction cup 128. The first bracket 129 includes a first section 130, a second section 131, and a third section 132 connected in series. The first segment 130 extends horizontally outward from the edge of the substrate 121. The second segment 131 extends obliquely downward away from the substrate 121 from the first end of the first segment 130 facing outward. The third section 132 extends vertically downward from an end of the second section 131 facing away from the first section 130.
The second camera 23 is arranged on an end of the first support 129 facing away from the substrate 121. In this embodiment, the second camera 23 is fixed on the third section 132 of the first support 129, and is located on one side of the third section 132 close to the suction cup 128. The lens of the second camera 23 faces the suction cup 128.
The second bracket 133 and the first connection post 122 are disposed on the same plate surface of the substrate 121. The second bracket 133 is adjacent to the first side 1211 of the substrate 121 and includes a vertical bar 135 and a horizontal cross bar 134. The vertical bar 135 is perpendicular to the substrate 121; one end of the vertical bar 135 is connected to the substrate 121, and the cross bar 134 is provided on its other end. The cross bar 134 is parallel to the substrate 121.
As shown in fig. 8, the reference surface generator 25 is fixed in the environment around the robot arm 11 and is used to generate a reference plane 251. The reference plane 251 may be a laser plane; for example, the reference plane generator 25 may emit a substantially fan-shaped laser plane as the reference plane 251. The reference plane 251 serves as a reference for the stacked articles and is flush with one side surface 51 of the stacked article 5. The reference plane 251 may be a vertically disposed plane.
At least 3 detecting elements 24 are provided. The detecting element 24 may be a PSD (Position Sensitive Device) sensor. The detecting elements 24 are fixed to the substrate 121. The detection element 24 includes a circuit board 242 and a sensing portion 241. The circuit board 242 may be of a generally box-like configuration. The sensing part 241 is disposed outside the circuit board 242. The sensing part 241 is located at the middle of one surface of the circuit board 242. The sensing portion 241 may have a substantially straight bar shape. The sensing part 241 senses the reference plane 251 established by the reference plane generator 25. The extending direction of the sensing portion 241 is parallel to the plate surface of the substrate 121 and perpendicular to the first side 1211 of the substrate 121. The sensing portions 241 of the plurality of sensing elements 24 have middle portions located in the same plane, which is parallel to the first side 1211 and located outside the first side 1211, and the middle portions of the plurality of sensing portions 241 are not located on the same straight line. In the present embodiment, the first detecting element 24 and the second detecting element 24 are respectively provided on both plate surfaces of the substrate 121, and the third detecting element 24 is provided on the cross bar 134 of the second bracket 133.
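Once each of the three sensing portions 241 reports where the laser reference plane 251 crosses it, the pose of the plane relative to the substrate can be recovered. A minimal sketch, with assumed example coordinates (the patent gives no numeric values): the plane through the three sensed points is obtained from a cross product, and its angle to the first side 1211 (whose outward normal is taken here as the substrate x-axis, an assumption for illustration) follows from a dot product.

```python
import numpy as np

# Positions (substrate frame) where the three PSD sensing strips detect
# the laser reference plane -- illustrative values, not from the patent.
p1 = np.array([0.30, 0.10, 0.00])
p2 = np.array([0.31, -0.10, 0.05])
p3 = np.array([0.32, 0.00, -0.08])

# Plane through three points: normal from the cross product of two edges.
n = np.cross(p2 - p1, p3 - p1)
n /= np.linalg.norm(n)

# Angle between the reference plane and the plane of the first side 1211,
# whose outward normal in this frame is assumed to be the x-axis.
cos_angle = abs(n @ np.array([1.0, 0.0, 0.0]))
angle_deg = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
```

During stacking, this angle (and the sensed offsets) would drive the fine alignment of the substrate with the reference plane.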
In the present embodiment, as shown in fig. 3, the article to be stacked 4 and the stacked article 5 are the same in shape and size. Both the article 4 to be stacked and the stacked article 5 are cuboids.
As shown in fig. 4, this embodiment further provides a measurement and control method. The measurement and control method comprises the following steps:
step S0: pre-establishing a substrate coordinate system, wherein the substrate coordinate system is fixed relative to the substrate 121;
pre-establishing a first camera coordinate system, which is fixed relative to the first camera 22;
a second camera coordinate system is pre-established, which is fixed relative to the second camera 23.
As shown in fig. 5, the substrate coordinate system may be a rectangular coordinate system with x, y, and z axes. The z-axis of the substrate coordinate system is perpendicular to the plate surface of the substrate 121, with its positive direction pointing to the side of the substrate 121 near the suction cup 128. The origin O of the substrate coordinate system may be located at the center of a plate surface of the substrate 121, for example the plate surface near the suction cup 128.
Step S1: driving the robot arm 11 so that the end gripping module 12 moves above the item 4 to be stacked;
the measurement of the target pose based on vision is generally a rough measurement. The position and posture information of the article 4 to be stacked can be roughly measured based on the visual measurement, and then the robot arm 11 is driven to move the end-grip module 12 to the vicinity above the article 4 to be stacked.
It is of course also possible to manually control the robot arm 11 to move the end gripping module 12 over the items 4 to be stacked.
Step S2: performing measurement by using at least 3 distance measuring elements 21 on the end grabbing module 12 to obtain the distance from the measurement starting point of each measuring element to the top surface of the object 4 to be stacked in the normal direction of the base plate 121;
in this step, as shown in fig. 5, the object 4 to be stacked is placed on the ground, the robot arm 11 moves the end-gripping module 12 above the object 4 to be stacked, the plate surface of the substrate 121 close to the suction cup 128 faces downward, the suction cup 128 is located between the object 4 to be stacked and the substrate 121, and the lens of the first camera 22 faces the object 4 to be stacked. Each ranging element 21 emits laser light toward the article 4 to be stacked, the laser light being incident on the top surface of the article 4 to be stacked from the measurement start point of the ranging element 21 in a direction parallel to the normal direction of the base plate 121. The laser emitted by the distance measuring element 21 irradiates on the top surface to form a light spot, and the position of the light spot is a measuring point on the top surface. The straight distance from the measurement start point measured by the distance measuring element 21 to the measurement point is the distance from the measurement start point of the measurement element to the top surface of the article 4 to be stacked in the normal direction of the base plate 121. In the present embodiment, the top surface of the article 4 to be stacked is measured using 4 distance measuring elements 21 to obtain the distance between the measurement starting point of each measuring element in the normal direction of the base plate 121 to the top surface of the article 4 to be stacked. Respectively corresponding to four measurement starting points K1、K2、K3、K4The corresponding measurement points are respectively point H1、H2、H3、H4Measuring the starting point K1And measuring point H1A distance d between1Measuring the starting point K2And measuring point H2A distance d between2Measuring the starting point K3And measuring point H3A distance d between3Measuring the starting point K4And measuring point H4A distance d between4. 
The distance measuring element 21 sends the measurement results to the calculation component 3.
Step S3: calculating, from the distance from each measurement starting point to the top surface and the relative positional relationship between each measurement starting point and the substrate 121 of the end gripping module 12, the amount of movement required to bring the plate surface of the substrate 121 parallel to the top surface;
in the present embodiment, the calculation component 3 calculates a rotation angle at which the substrate 121 needs to be rotated about the x-axis of the substrate coordinate system and a rotation angle at which the substrate 121 needs to be rotated about the y-axis of the substrate coordinate system in order to rotate the plate surface of the substrate 121 to be parallel to the top surface of the article 4 to be stacked, based on the distance between the measurement start point of each measurement element in the normal direction of the substrate 121 to the top surface of the article 4 to be stacked and the coordinates of each measurement start point in the substrate coordinate system.
The step S3 includes steps S31-S33;
step S31: calculating a plane equation of a plane of the top surface under the substrate coordinate system according to the distance between the measurement starting point of each measurement element in the normal direction of the substrate 121 and the top surface of the article 4 to be stacked and the coordinates of each measurement starting point under the substrate coordinate system;
the coordinates of the measurement starting point of the distance measuring element 21 in the substrate coordinate system are known. In the present embodiment, the starting point K is measured1、K2、K3、K4The coordinates in the substrate coordinate system are respectively
Figure 325482DEST_PATH_IMAGE001
Measuring the starting point K1And measuring point H1A distance d between1Measuring the starting point K2And measuring point H2A distance d between2Measuring the starting point K3And measuring point H3A distance d between3Measuring the starting point K4And measuring point H4A distance d between4Then measuring point H1、H2、H3、H4The coordinates in the substrate coordinate system are respectively
Figure 836097DEST_PATH_IMAGE003
According to the measuring point H on the top surface of the article 4 to be stacked1、H2、H3、H4The coordinates under the substrate coordinate system can adopt a space plane fitting algorithm to calculate a plane equation of the plane of the top surface
Figure 943731DEST_PATH_IMAGE004
. The spatial plane fitting algorithm may be a least squares method.
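The plane fit of step S31 can be sketched as a least-squares problem. The coordinates and distances below are illustrative, not values from the patent; each measurement point Hi is formed by offsetting the starting point Ki along the substrate z-axis by the measured distance di, and a plane z = alpha*x + beta*y + gamma is fitted:

```python
import numpy as np

# Measurement starting points K_i in the substrate frame (assumed example
# coordinates; the patent's actual values appear only in its figures).
K = np.array([
    [ 0.2,  0.2, 0.0],
    [-0.2,  0.2, 0.0],
    [-0.2, -0.2, 0.0],
    [ 0.2, -0.2, 0.0],
])
# Measured distances d_i along the substrate normal (+z toward the article).
d = np.array([0.51, 0.53, 0.55, 0.53])

# Measurement points on the top surface: H_i = K_i + d_i * z_hat.
H = K + d[:, None] * np.array([0.0, 0.0, 1.0])

# Least-squares fit of z = alpha*x + beta*y + gamma, i.e. the plane
# alpha*x + beta*y - z + gamma = 0 in the substrate frame.
A = np.column_stack([H[:, 0], H[:, 1], np.ones(len(H))])
(alpha, beta, gamma), *_ = np.linalg.lstsq(A, H[:, 2], rcond=None)

# Unit plane normal with a positive z component, as step S32 requires.
n = np.array([-alpha, -beta, 1.0])
n /= np.linalg.norm(n)
```

With four points the system is overdetermined, so least squares also averages out small per-sensor noise.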
Step S32: according to the plane equation of the plane 6 of the top surface under the substrate coordinate system, a first workpiece coordinate system is set, the origin of the first workpiece coordinate system is a point F, the x axis and the y axis of the first workpiece coordinate system are in the plane 6, the z axis of the first workpiece coordinate system is parallel to the normal direction of the plane 6, and a rotation matrix from the substrate coordinate system to the first workpiece coordinate system is obtained.
In this embodiment, the plane equation is Ax + By + Cz + D = 0, and the normal vector of the plane 6 is n = (A, B, C). The direction of the normal vector is selected so that the cosine of the angle between it and the positive z-axis of the substrate coordinate system is greater than or equal to 0, i.e., C ≥ 0. The direction vector of the z-axis of the first workpiece coordinate system in the substrate coordinate system is taken in the same direction as n, normalized to n/|n|. In this way, the z-axis of the first workpiece coordinate system is aligned with the normal vector direction.
The direction vector of the x-axis of the first workpiece coordinate system in the substrate coordinate system may be taken as the unit vector along the projection of the x-axis of the substrate coordinate system onto the plane 6, and the direction vector of the y-axis as the cross product of the z-axis and x-axis direction vectors, so that the three axes form a right-handed orthonormal frame.
A rotation matrix from the substrate coordinate system to the first workpiece coordinate system is then calculated from the direction vectors of the x, y, and z axes of the first workpiece coordinate system in the substrate coordinate system.
Step S33: calculating, from the rotation matrix from the substrate coordinate system to the first workpiece coordinate system, the rotation angle required about the x-axis of the substrate coordinate system and the rotation angle required about the y-axis of the substrate coordinate system to make the plate surface of the substrate 121 and the top surface of the article 4 to be stacked parallel to each other.
Calculating rotation angles from a rotation matrix is conventional in the art and is not described here.
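One standard way to obtain the two angles of step S33 (a sketch, not necessarily the patent's exact parameterization): rotate first about the substrate x-axis, then about the y-axis, choosing the angles so that the substrate z-axis lands on the fitted unit normal.

```python
import numpy as np

def Rx(t):
    # Rotation about the x-axis by angle t (radians).
    c, s = np.cos(t), np.sin(t)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def Ry(t):
    # Rotation about the y-axis by angle t (radians).
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def tilt_angles(n_hat):
    """Angles theta_x (about the substrate x-axis, applied first) and
    theta_y (about the y-axis) such that Ry(theta_y) @ Rx(theta_x)
    maps the substrate z-axis onto the unit plane normal n_hat."""
    a, b, c = n_hat
    return np.arcsin(-b), np.arctan2(a, c)

# Unit normal of the fitted top-surface plane (illustrative values for a
# slightly tilted article, not taken from the patent).
n_hat = np.array([0.05, 0.05, 1.0])
n_hat /= np.linalg.norm(n_hat)

theta_x, theta_y = tilt_angles(n_hat)
# Applying the two rotations to z_hat should reproduce the normal.
z_rotated = Ry(theta_y) @ Rx(theta_x) @ np.array([0.0, 0.0, 1.0])
```

Driving the arm by theta_x and theta_y then makes the plate surface parallel to the top surface, since their normals coincide.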
Step S4: driving the mechanical arm 11 so that the substrate 121 moves until its plate surface is parallel to the top surface;
the computing assembly 3 drives the robot arm 11 to rotate the substrate 121 about the x-axis and about the y-axis of the substrate coordinate system by the angular angle. This rotates the plate surface of the base plate 121 to be parallel to the top surface of the article 4 to be stacked.
Step S5: shooting the top of the article 4 to be stacked with the first camera 22 to obtain a top picture of the article 4 to be stacked;
as shown in fig. 7, the first camera 22 is controlled to perform shooting at the calculation component 3. The first camera 22 takes a picture of the item 4 to be stacked from the top of the item 4 to be stacked, and a picture of the top of the item 4 to be stacked can be obtained.
Step S6: from the top view, the relative positional relationship between the first camera 22 and the substrate 121 calculates the amount of movement required to move the suction cup 128, which is fixed relative to the substrate 121, to the substrate 121 when it is aligned with a predetermined grasping point on the top surface in the normal direction of the substrate 121;
the step S6 includes S61~ S64.
Step S61: the top surface 41 of the article 4 to be stacked is measured using the distance measuring element 21 to obtain the distance between the measurement start point of the measuring element and the top surface 41 of the article 4 to be stacked.
The computing assembly 3 controls the distance measuring unit 21 to measure the top surface 41 of the article 4 to be stacked after the robot 11 rotates the base plate 121 to the proper position. At this time, since the top surface 41 of the article 4 to be stacked is parallel to the plate surface of the substrate 121, the laser propagation direction of the measuring element is perpendicular to the top surface 41 of the article 4 to be stacked, and the distance measured by the measuring element is the distance d from the measurement starting point of the measuring element to the top surface 41.
In the present embodiment, 4 distance measuring elements 21 are used to measure simultaneously, obtaining four distances d5, d6, d7 and d8. The calculation component 3 averages these distances, and the average is taken as the distance d between the measurement starting point of the measuring element and the top surface 41 of the item 4 to be stacked.
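As an illustration of this averaging step, the sketch below (in Python; the numeric readings and variable names are made up for illustration, not from the embodiment) averages four range readings into the working distance d:

```python
# Step S61 sketch: average the four laser range readings d5..d8.
# The numeric readings are illustrative values in metres, not measured data.
readings = [0.502, 0.498, 0.501, 0.499]   # d5, d6, d7, d8
d = sum(readings) / len(readings)          # averaged distance to top surface 41
```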
Step S62: a first feature in the top picture is identified.
In this embodiment, as shown in fig. 7, the first feature comprises a right angle formed by two adjacent edges on the top surface 41 of the article 4 to be stacked. In the top picture, the right angle may be the right angle located at the lower left. The computing component 3 can identify the first feature from the top picture after performing edge detection on the top picture.
Step S63: the position of the first feature in the first camera coordinate system is obtained from the position of the measurement starting point of the ranging element 21 in the first camera coordinate system, the distance between the measurement starting point and the top surface 41 of the item 4 to be stacked, and the position of the first feature in the top picture.
A two-dimensional pixel coordinate system can be established in the top picture, and the position of the first feature in the top picture is the coordinate value of the first feature in the pixel coordinate system.
In the present embodiment, as shown in fig. 6, the origin of the first camera coordinate system is a point C, which is the optical center of the first camera 22, and the first camera coordinate system has mutually perpendicular Xc, Yc and Zc axes. The Zc axis is parallel to the normal of the substrate 121. The Xc axis of the first camera coordinate system is arranged parallel to the x-axis of the substrate coordinate system, the Yc axis parallel to the y-axis of the substrate coordinate system, and the Zc axis parallel to the z-axis of the substrate coordinate system. The measurement starting points of the 4 distance measuring elements 21 all lie in one plane, which is parallel to the XcCYc plane of the first camera coordinate system. The distance d0 from a measurement starting point to the XcCYc plane can be calibrated in advance.
The coordinates of any point in the first feature in the first camera coordinate system can be calculated using the following pinhole-model equation:

Zc · [u, v, 1]^T = K · [Xc, Yc, Zc]^T

where K is the internal reference matrix of the first camera 22, Zc is the depth value of the point, (u, v) is the coordinate value of the point in the pixel coordinate system, and (Xc, Yc, Zc) is the coordinate value of the point in the first camera coordinate system.
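The pinhole relation above can be inverted to recover camera-frame coordinates from a pixel and its depth. A minimal Python sketch, where the intrinsic matrix values are assumed for illustration:

```python
import numpy as np

def backproject(K, uv, depth):
    # Pinhole model: depth * [u, v, 1]^T = K @ [Xc, Yc, Zc]^T,
    # so [Xc, Yc, Zc] = depth * inv(K) @ [u, v, 1].
    u, v = uv
    return depth * np.linalg.inv(K) @ np.array([u, v, 1.0])

# Assumed intrinsics: focal lengths 1000 px, principal point (640, 360).
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])
p = backproject(K, (740.0, 460.0), 2.0)   # a pixel at depth Zc = 2 m
```

With these assumed intrinsics the pixel (740, 460) at depth 2 m maps to approximately (0.2, 0.2, 2.0) in the camera frame.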
The internal reference matrix of the first camera 22 is an intrinsic parameter of the first camera 22. It can be obtained by calibration, for example by Zhang Zhengyou's monocular checkerboard calibration method, or it can be provided directly by the camera manufacturer. Thus, the internal reference matrix of the first camera 22 is a known quantity.
Because the top surface 41 of the article 4 to be stacked is parallel to the XcCYc plane of the first camera coordinate system, the depth value Zc of any point on the top surface 41 can be obtained using the following equation:

Zc = d + d0

where d is the distance between the measurement starting point of the measuring element and the top surface 41 of the item 4 to be stacked, and d0 is the distance between the measurement starting point and the XcCYc plane. The distance d0 can be calibrated in advance and is a known quantity.
As shown in fig. 7, a second workpiece coordinate system is established. Its origin is the vertex Q of the first feature, its x-axis and y-axis extend along the two edges of the first feature, respectively, and its z-axis is perpendicular to the two edges. Using the above equation, the coordinates in the first camera coordinate system of the vertex of the first feature, and of any two other points respectively located on the two edges of the first feature, can be calculated. From these coordinates, the direction vectors of the x-axis and the y-axis of the second workpiece coordinate system in the first camera coordinate system can be calculated.
Since the two edges of the first feature coincide with the x-axis and the y-axis of the second workpiece coordinate system, the position of the first feature in the first camera coordinate system is known once the origin coordinates of the second workpiece coordinate system and the direction vectors of its x-axis and y-axis in the first camera coordinate system are obtained.
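The construction of the second workpiece coordinate system from three back-projected points (the corner vertex and one point on each edge) can be sketched as follows; the function and variable names are illustrative, not from the embodiment:

```python
import numpy as np

def feature_frame(vertex, point_on_x_edge, point_on_y_edge):
    """Origin and unit direction vectors (in the camera frame) of a workpiece
    frame whose x- and y-axes run along the two edges of a right-angle feature."""
    q = np.asarray(vertex, dtype=float)
    x_axis = np.asarray(point_on_x_edge, dtype=float) - q
    x_axis /= np.linalg.norm(x_axis)
    y_axis = np.asarray(point_on_y_edge, dtype=float) - q
    y_axis /= np.linalg.norm(y_axis)
    z_axis = np.cross(x_axis, y_axis)      # perpendicular to both edges
    return q, x_axis, y_axis, z_axis

# A corner at depth 2 m with its edges along the camera Xc and Yc directions:
q, ex, ey, ez = feature_frame((0.0, 0.0, 2.0), (0.3, 0.0, 2.0), (0.0, 0.4, 2.0))
```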
In addition, a camera may be additionally installed beside the first camera 22 so that the two cameras form a binocular depth camera system; the depth value Zc can then also be measured directly by the binocular depth camera system.
Step S64: according to the position of the first feature in the first camera coordinate system, the rotation matrix from the first camera coordinate system to the substrate coordinate system, and the first preset position in the substrate coordinate system, the translation amount of the substrate 121 that needs to move along the x-axis of the substrate coordinate system, the translation amount of the substrate that needs to move along the y-axis of the substrate coordinate system, and the rotation angle of the z-axis of the substrate coordinate system when the first feature is moved to the first preset position are calculated.
The substrate coordinate system has a first preset position. When the first feature is at the first preset position, the predetermined gripping point on the top surface 41 of the item 4 to be stacked is aligned with the suction cup 128, so that the suction cup 128 can abut against the predetermined gripping point when approaching the item 4 to be stacked along the normal direction of the base plate 121.
In this embodiment, the substrate coordinate system has a first predetermined coordinate, which is the coordinate of the vertex Q of the first feature in the substrate coordinate system when the suction cup 128 is just abutting against the predetermined gripping point on the top surface 41 of the object 4 to be stacked. When the origin of the second workpiece coordinate system is located at a first preset coordinate of the substrate coordinate system, the x-axis of the second workpiece coordinate system is parallel to the x-axis of the substrate coordinate system, and the y-axis of the second workpiece coordinate system is parallel to the y-axis of the substrate coordinate system, the first feature is located at the first preset position.
The rotation matrix from the first camera coordinate system to the substrate coordinate system is a known quantity and can be calibrated in advance. The position of the first feature in the substrate coordinate system can be obtained according to the rotation matrix from the first camera coordinate system to the substrate coordinate system and the position of the first feature in the first camera coordinate system.
According to the position of the first feature in the substrate coordinate system and the first preset position in the substrate coordinate system, the translation amount of the substrate 121 that needs to move along the x-axis of the substrate coordinate system, the translation amount of the substrate that needs to move along the y-axis of the substrate coordinate system, and the rotation angle of the z-axis of the substrate coordinate system when the first feature is moved to the first preset position can be calculated.
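The in-plane motion planning of step S64 can be sketched as below. This is a simplified illustration: it assumes the feature pose and the preset pose are both already expressed in the substrate coordinate system, and the sign convention (whether the substrate or the feature is treated as moving) must match the real setup; all names are illustrative.

```python
import math

def plan_inplane_motion(vertex_xy, x_edge_dir_xy, preset_xy):
    # Rotation about z that would align the feature's x edge with the
    # substrate x-axis, plus the residual translation of the vertex to
    # the preset coordinate (signs depend on which frame is moved).
    theta = math.atan2(x_edge_dir_xy[1], x_edge_dir_xy[0])
    dx = preset_xy[0] - vertex_xy[0]
    dy = preset_xy[1] - vertex_xy[1]
    return dx, dy, theta

# Feature vertex at (0.10, 0.05), edge already aligned with x, target (0.30, 0.25):
dx, dy, theta = plan_inplane_motion((0.10, 0.05), (1.0, 0.0), (0.30, 0.25))
```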
Step S7: driving the mechanical arm 11 to move the base plate 121 to enable the suction cup 128 to move to a position aligned with a preset grabbing point, and approaching the to-be-stacked object 4 along the normal direction of the base plate 121 until the suction cup 128 grabs the to-be-stacked object 4;
in the embodiment, the computing assembly 3 drives the mechanical arm 11 to move the substrate 121 according to the obtained translation amount along the x-axis of the substrate coordinate system, the translation amount along the y-axis of the substrate coordinate system, and the rotation angle about the z-axis of the substrate coordinate system, so that the first feature is located at the first preset position in the substrate coordinate system.
At this time, the suction cup 128 is aligned with the predetermined grabbing point on the top surface 41 of the to-be-stacked object 4, the computing assembly 3 controls the robot arm 11 to move the substrate 121 so that the substrate 121 approaches the to-be-stacked object 4 along the normal direction, the suction cup 128 approaches the to-be-stacked object 4 along the normal direction of the substrate 121 under the driving of the substrate 121 until abutting against the predetermined grabbing point on the top surface 41 of the to-be-stacked object 4, and then the suction cup 128 sucks the to-be-stacked object 4.
The distance of the base plate 121 in the normal direction close to the article 4 to be stacked can be calculated from the distance d between the measurement start point of the measuring element and the top surface 41 of the article 4 to be stacked and the relative position between the measurement start point of the measuring element and the suction cup 128.
Step S8: moving the item to be stacked 4 over the stacked item 5;
the position and posture information of the uppermost one of the stacked articles 5 can be roughly measured based on the visual measurement technique, and then the robot arm 11 is driven to move the end gripping module 12 to the vicinity above the stacked article 5.
Of course, it is also possible to manually control the robot arm 11 to move the end gripping module 12 over the stacked items 5.
Step S9: driving the reference surface generator 25 to set a reference plane 251, the reference plane 251 being flush with one side surface of the stacked article 5;
step S10: measuring the relative position between the reference plane 251 and the substrate 121 using at least 3 detection elements 24 on the end gripping module 12;
in the present embodiment, the coordinates in the substrate coordinate system of each position point on the sensing portion 241 of each detecting element 24 are known. When any one of the position points senses the reference plane 251, the coordinates, in the substrate coordinate system, of the position point where the reference plane 251 intersects the sensing part 241 can be obtained.
The sensing parts 241 of the 3 detecting elements 24 measure the reference plane 251 simultaneously to obtain the coordinates, in the substrate coordinate system, of the position points K1, K2 and K3 where the reference plane 251 intersects the three sensing parts 241.
Step S11: calculating the amount of movement of the base plate 121 required to move the side to be parallel to the reference plane 251, based on the relative position between the reference plane 251 and the base plate 121 and the relative position between the base plate 121 and the side of the article 4 to be stacked;
the step S11 includes steps S111 to S113.
Step S111: obtaining a plane equation of the reference plane 251 in the substrate coordinate system according to the coordinates, in the substrate coordinate system, of the position points at which the three detecting elements 24 sense the reference plane 251;
in the present embodiment, as shown in fig. 8, the sensing parts 241 of the three detecting elements 24 sense the reference plane 251 simultaneously, the position points where the three sensing parts 241 sense the reference plane 251 being K5, K6 and K7, whose coordinate values in the substrate coordinate system are (x5, y5, z5), (x6, y6, z6) and (x7, y7, z7), respectively.

Since the position points K5, K6 and K7 are all located on the reference plane 251, the plane equation of the reference plane 251 in the substrate coordinate system can be found from their coordinates:

a·x + b·y + c·z + e = 0
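Computing the plane equation from the three sensed points amounts to a cross product. A minimal sketch (the constant term is named e here to avoid clashing with the distance d used earlier; coordinates are illustrative):

```python
import numpy as np

def plane_from_points(k5, k6, k7):
    # Plane a*x + b*y + c*z + e = 0 through three non-collinear points:
    # the normal (a, b, c) is the cross product of two in-plane vectors.
    k5, k6, k7 = (np.asarray(k, dtype=float) for k in (k5, k6, k7))
    normal = np.cross(k6 - k5, k7 - k5)
    e = -float(np.dot(normal, k5))
    return normal, e

# Three points lying in the plane y = 1:
n, e = plane_from_points((0, 1, 0), (1, 1, 0), (0, 1, 1))
```

Here the recovered plane is -y + 1 = 0, i.e. y = 1.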
Step S112: a third object coordinate system is set according to the plane equation of the reference plane 251 in the substrate coordinate system, the x-axis and z-axis of the third object coordinate system are in the reference plane 251, the y-axis of the third object coordinate system is parallel to the normal of the reference plane 251, and a rotation matrix from the substrate coordinate system to the third object coordinate system is obtained.
In the present embodiment, as shown in fig. 8, the plane equation is a·x + b·y + c·z + e = 0, so the normal vector of the reference plane 251 is recorded as n = (a, b, c). The direction of the normal vector is selected such that the cosine of the included angle between it and the positive direction of the y-axis of the substrate coordinate system is greater than or equal to 0. The direction vector of the y-axis of the third workpiece coordinate system in the substrate coordinate system is set to be in the same direction as n, i.e. the unit vector n/|n|. In this way, the y-axis of the third workpiece coordinate system is aligned with the direction of the normal vector.

The direction vectors of the x-axis and the z-axis of the third workpiece coordinate system in the substrate coordinate system are then set as two mutually perpendicular unit vectors lying in the reference plane 251, i.e. perpendicular to n; for example, the x-axis direction vector may be obtained by projecting the x-axis of the substrate coordinate system onto the reference plane 251 and normalizing, and the z-axis direction vector by taking the cross product of the x-axis and y-axis direction vectors.
and calculating a rotation matrix from the substrate coordinate system to a third workpiece coordinate system according to the direction vectors of the x, y and z axes of the third workpiece coordinate system in the substrate coordinate system.
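The construction of the third workpiece frame and its rotation matrix can be sketched as follows. The choice of in-plane x direction here is one of several valid conventions (the embodiment's exact formulas were lost in extraction), so this is an assumption-laden illustration:

```python
import numpy as np

def third_frame_rotation(normal):
    # Columns are the third-workpiece-frame axes expressed in the substrate
    # frame: y along the plane normal (flipped to have a non-negative y
    # component), x and z mutually perpendicular and lying in the plane.
    y = np.asarray(normal, dtype=float)
    if y[1] < 0:
        y = -y
    y /= np.linalg.norm(y)
    # Seed with a substrate axis that is not (nearly) parallel to y,
    # then project it into the plane to obtain x.
    seed = np.array([1.0, 0.0, 0.0]) if abs(y[0]) < 0.9 else np.array([0.0, 0.0, 1.0])
    x = seed - np.dot(seed, y) * y
    x /= np.linalg.norm(x)
    z = np.cross(x, y)                     # completes a right-handed frame
    return np.column_stack([x, y, z])

R = third_frame_rotation((0.0, 0.0, 1.0))  # e.g. reference-plane normal along z
```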
Step S113: based on the rotation matrix of the substrate coordinate system to the third workpiece coordinate system and the relative position between the substrate 121 and the side of the object 4 to be stacked, the angle of rotation of the substrate 121 about the x-axis of the substrate coordinate system and the angle of rotation about the z-axis of the substrate coordinate system required to move the side to be parallel to the reference plane 251 are calculated.
In the present embodiment, after the object 4 to be stacked is accurately gripped by the end-grip module 12, the normal direction of one side surface of the object 4 to be stacked and the y-axis of the substrate coordinate system are parallel to each other. Therefore, the side of the article 4 to be stacked and the reference plane 251 can be made parallel simply by rotating the y-axis of the substrate coordinate system until it is parallel to the y-axis of the third workpiece coordinate system.
The rotation angle of the substrate 121 about the x-axis of the substrate coordinate system and the rotation angle of the substrate 121 about the z-axis of the substrate coordinate system can be determined from the rotation matrix from the substrate coordinate system to the third workpiece coordinate system when the y-axis of the substrate coordinate system is rotated to be parallel to the y-axis of the third workpiece coordinate system.
Step S12: driving the robot arm 11 so that the side of the article 4 to be stacked is parallel to the reference plane 251;
in the present embodiment, the computing unit 3 drives the robot arm 11 to move the substrate 121 according to the obtained rotation angle of the x-axis rotation of the substrate coordinate system and the rotation angle of the z-axis rotation of the substrate coordinate system, so that the side of the article 4 to be stacked is parallel to the reference plane 251.
Step S13: the position of the base plate 121 is adjusted in the normal direction perpendicular to the reference plane 251 until the reference plane 251 intersects the sensing element 24 at a preset position point on the sensing element 24 so that the side of the article 4 to be stacked is flush with the reference plane 251.
In the present embodiment, when the suction cup 128 is accurately sucked on the predetermined grabbing point of the top surface 41 of the to-be-stacked item 4, the preset position point of the sensing portion 241 of each detecting element 24 is in the same plane with the side surface 42 of the to-be-stacked item 4, and the preset position point may be the middle point of the sensing portion 241. When the reference plane 251 is sensed at a predetermined position of each sensing part 241, the side surface 42 is flush with the reference plane 251. The reference plane 251 is also flush with one side surface 51 of the stacked article 5, so that the one side surface 51 of the stacked article 5 is flush with the corresponding one side surface 42 of the article 4 to be stacked when the reference plane 251 intersects the detecting element 24 at a predetermined position point on the detecting element 24.
When the sensing portion 241 of a detection element 24 senses the reference plane 251, information on the position point where the sensing portion 241 senses the reference plane 251 is transmitted to the calculation unit 3.
The calculating component 3 controls the robot arm 11 to move the substrate 121 along the y-axis direction of the substrate coordinate system by the distance between the position point at which the sensing portion 241 of one of the detecting elements 24 senses the reference plane 251 and the preset position point of that detecting element 24, so that the reference plane 251 passes through the preset position point.
Step S14: a second camera 23 is used to take a picture of the side of the stacked item 5.
As shown in fig. 9, the computing unit 3 controls the second camera 23 to photograph the stacked article 5 from the side of the stacked article 5, obtaining a side picture of the stacked article 5.
Step S15: from the side view, the relative positional relationship between the second camera 23 and the base plate 121 calculates the amount of movement required to overlap the bottom surface of the article 4 to be stacked and the top surface of the stacked article 5 by the base plate 121;
the step S15 includes steps S151 to S153.
Step S151: a second feature in the side picture is identified.
In this embodiment, as shown in fig. 9, the second feature comprises a right angle formed by two adjacent edges on the side 51 of the uppermost stacked article 5. In the side view, the right angle may be a right angle located at the lower left. The calculation component 3 can identify the second feature from the side picture after performing edge detection on the side picture.
Step S152: the position of the second feature in the second camera coordinate system is obtained from the position of the plane in which the side 42 of the item 4 to be stacked lies in the second camera coordinate system and the position of the second feature in the side picture.
A two-dimensional pixel coordinate system can be established in the side image, and the position of the second feature in the side image is the coordinate value of the second feature in the pixel coordinate system.
In this embodiment, the origin of the second camera coordinate system is point D, which is the optical center of the second camera 23, and the second camera coordinate system has mutually perpendicular XD, YD and ZD axes. The YD axis is parallel to the normal of the substrate 121. The XD axis of the second camera coordinate system can be arranged parallel to the x-axis of the substrate coordinate system, and the ZD axis parallel to the y-axis of the substrate coordinate system. The rotation matrix from the second camera coordinate system to the substrate coordinate system is known.

The side of the item 4 to be stacked, the preset position points on the sensing portions 241 of the 3 detection elements 24, and the reference plane 251 all lie in the same plane, which is parallel to the XDDZD plane of the second camera coordinate system. The distance d1 from a preset position point to the XDDZD plane can be calibrated in advance. The distance d1 thus gives the position, in the second camera coordinate system, of the plane in which the side of the item 4 to be stacked lies.
The coordinates of any point in the second feature in the second camera coordinate system can be calculated using the following equation:

YD · [u, v, 1]^T = K · [XD, YD, ZD]^T

where K is the internal reference matrix of the second camera 23, YD is the depth value of the point, (u, v) is the coordinate value of the point in the pixel coordinate system, and (XD, YD, ZD) is the coordinate value of the point in the second camera coordinate system.
The internal reference matrix of the second camera 23 is an intrinsic parameter of the second camera 23. It can be obtained by calibration, for example by Zhang Zhengyou's monocular checkerboard calibration method, or it can be provided directly by the camera manufacturer. Therefore, the internal reference matrix of the second camera 23 is a known quantity.
The depth value YD is equal to the distance d1 from a preset position point to the XDDZD plane. The distance d1 can be calibrated in advance and is a known quantity.
As shown in fig. 9, a fourth workpiece coordinate system is established, the origin of the fourth workpiece coordinate system is the vertex W of the second feature, the x-axis and the y-axis of the fourth workpiece coordinate system respectively extend along two edges of the second feature, and the z-axis of the fourth workpiece coordinate system is perpendicular to the two edges. By using the above equation, the coordinates of the vertex of the second feature in the second camera coordinate system and the coordinates of any two other points respectively located on the two edges of the second feature in the second camera coordinate system can be calculated. And calculating the direction vector of the x axis and the y axis of the fourth workpiece coordinate system in the second camera coordinate system according to the coordinates of the vertex in the second camera coordinate system in the second characteristic and the coordinates of the two points respectively positioned on the two edges in the second camera coordinate system.
Because the two edges of the second feature coincide with the x-axis and the y-axis of the fourth workpiece coordinate system, the position of the second feature in the second camera coordinate system is known once the origin coordinates of the fourth workpiece coordinate system and the direction vectors of its x-axis and y-axis in the second camera coordinate system are obtained.
In addition, a camera may be additionally installed beside the second camera 23 so that the two cameras form a binocular depth camera system; the depth value can then also be measured directly by the binocular depth camera system.
Step S153: and calculating the translation amount of the substrate 121 which needs to move along the x axis of the substrate coordinate system, the translation amount of the substrate which needs to move along the z axis of the substrate coordinate system and the rotation angle of the substrate which rotates around the y axis of the substrate coordinate system when the second feature is moved to the second preset position according to the position of the second feature in the second camera coordinate system, the rotation matrix from the second camera coordinate system to the substrate coordinate system and the second preset position under the substrate coordinate system.
The base plate coordinate system has a second preset position, and when the second characteristic is at the second preset position, the bottom surface of the article 4 to be stacked is just overlapped with the top surface of the topmost stacked article 5.
In this embodiment, when the origin of the fourth workpiece coordinate system is located at the second predetermined coordinate of the substrate coordinate system, the x-axis of the fourth workpiece coordinate system is parallel to the z-axis of the substrate coordinate system, and the y-axis of the fourth workpiece coordinate system is parallel to the x-axis of the substrate coordinate system, the second feature is located at the second predetermined position.
The rotation matrix of the second camera coordinate system to the substrate coordinate system is a known quantity and can be calibrated in advance. And obtaining the position of the second feature in the substrate coordinate system according to the rotation matrix from the second camera coordinate system to the substrate coordinate system and the position of the second feature in the second camera coordinate system.
According to the position of the second feature in the substrate coordinate system and the second preset position in the substrate coordinate system, the translation amount of the substrate 121 that needs to move along the x-axis of the substrate coordinate system, the translation amount of the substrate that needs to move along the z-axis of the substrate coordinate system, and the rotation angle of the substrate that needs to rotate around the y-axis of the substrate coordinate system when the second feature is moved to the second preset position can be calculated.
Step S16: the robot arm 11 is driven to move the base plate 121 so that the bottom surface of the article 4 to be stacked overlaps the top surface of the stacked article 5, and then releases the article 4 to be stacked.
In the embodiment, the computing assembly 3 drives the mechanical arm 11 to move the substrate 121 according to the obtained translation amount along the x-axis of the substrate coordinate system, the translation amount along the z-axis of the substrate coordinate system, and the rotation angle about the y-axis of the substrate coordinate system, so that the second feature is located at the second preset position in the substrate coordinate system.
At this time, the bottom surface of the article 4 to be stacked is overlapped with the top surface of the stacked article 5, and the article 4 to be stacked is neatly placed on top of the stacked article 5. Finally, the computing assembly 3 controls the plurality of suction cups 128 to release the article 4 to be stacked simultaneously.
By repeating steps S1 to S16, regular articles can be stacked in sequence. Because every article is placed with the reference plane 251 as the reference, the accumulated error caused by repeated stacking can be reduced or even eliminated, so that the stacking is more accurate.
It will be understood by those of ordinary skill in the art that all or some of the steps of the methods, systems, and functional modules/units in the devices disclosed above may be implemented as software, firmware, hardware, and suitable combinations thereof. In a hardware implementation, the division between functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be performed by several physical components in cooperation. Some or all of the components may be implemented as software executed by a processor, such as a digital signal processor or microprocessor, or as hardware, or as an integrated circuit, such as an application-specific integrated circuit. Such software may be distributed on computer-readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). The term computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data, as is well known to those of ordinary skill in the art. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, Digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer. In addition, communication media typically embodies computer-readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism, and includes any information delivery media as known to those skilled in the art.

Claims (11)

1. A measurement and control method of a robot system is characterized by comprising the following steps:
driving the mechanical arm to enable the tail end grabbing module to move to the position above the articles to be stacked;
measuring by using at least 3 distance measuring elements of the tail end grabbing module to obtain the distance from the measuring starting point of each distance measuring element to the top surface of the object to be stacked in the normal direction of the substrate of the tail end grabbing module;
calculating the amount of movement required when the plate surface of the substrate moves to be parallel to the top surface according to the distance from each measurement starting point to the top surface and the relative position relationship between each measurement starting point and the substrate;
driving a mechanical arm to enable the substrate to move until the plate surface and the top surface are parallel to each other;
shooting the top of an article to be stacked by adopting a first camera of the tail end grabbing module to obtain a top picture of the article to be stacked;
calculating the motion amount required by the substrate when the sucker relatively fixed with the substrate is moved to a position opposite to a preset grabbing point on the top surface in the normal direction of the substrate according to the top picture and the relative position relation between the first camera and the substrate;
the mechanical arm is driven to move the base plate so that the sucker moves to the position aligned with the preset grabbing point, and the sucker is close to the object to be stacked along the normal direction of the base plate until the sucker sucks the object to be stacked.
2. The measurement and control method according to claim 1, further comprising, after the end gripping module has gripped the article to be stacked:
moving the article to be stacked above the stacked articles;
driving a reference plane generator to provide a reference plane flush with a side surface of the stacked articles;
measuring the relative position between the reference plane and the substrate with at least 3 detection elements on the end gripping module;
calculating, according to the relative position between the reference plane and the substrate and the relative position between the substrate and the side surface of the article to be stacked, the amount of movement required for the substrate to move the side surface parallel to the reference plane;
driving the robotic arm so that the side surface of the article to be stacked is parallel to the reference plane;
adjusting the position of the substrate along the normal direction of the reference plane until the reference plane crosses each detection element at a preset position point on that element, so that the side surface of the article to be stacked is flush with the reference plane;
photographing with a second camera to obtain a side picture of the stacked articles;
calculating, according to the side picture and the relative position relationship between the second camera and the substrate, the amount of movement required for the substrate to bring the bottom surface of the article to be stacked into coincidence with the top surface of the stacked articles;
driving the robotic arm to move the substrate so that the bottom surface of the article to be stacked coincides with the top surface of the stacked articles, and then releasing the article to be stacked.
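The flush-adjustment step above (translating the substrate along the reference-plane normal until the plane crosses each detection element at its preset point) can be sketched as follows. This is an illustrative numpy sketch, not part of the claim; the function name, the assumption that the three sensed crossing points are already expressed in the substrate coordinate system, and the sign convention of the fitted normal are all my own.

```python
import numpy as np

def flush_offset(sensed_pts, preset_pt):
    """Given three points where the reference plane crosses the detection
    elements (substrate coordinates) and the preset crossing point on one
    element, return the translation (substrate coordinates) along the plane
    normal that moves the substrate until the plane crosses the element at
    the preset point, i.e. until the carried side surface is flush."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in sensed_pts)
    preset_pt = np.asarray(preset_pt, dtype=float)
    # Unit normal of the reference plane fitted through the three points.
    n = np.cross(p2 - p1, p3 - p1)
    n /= np.linalg.norm(n)
    # Signed distance from the preset point to the fitted plane; translating
    # the substrate by -d * n brings the preset point onto the plane.
    d = np.dot(preset_pt - p1, n)
    return -d * n
```

The sign of the returned offset depends on the winding order of the three sensed points; a real controller would fix the normal's orientation from the known layout of the detection elements.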
3. The measurement and control method according to claim 2, further comprising, before the robotic arm is driven to move the end gripping module above the article to be stacked: establishing a substrate coordinate system fixed relative to the substrate;
wherein calculating the amount of movement required for the plate surface of the substrate to become parallel to the top surface comprises:
calculating a plane equation, in the substrate coordinate system, of the plane of the top surface, according to the distance in the normal direction of the substrate from the measurement starting point of each distance measuring element to the top surface of the article to be stacked and the coordinates of each measurement starting point in the substrate coordinate system;
setting a first workpiece coordinate system according to the plane equation of the top surface in the substrate coordinate system, wherein the x axis and y axis of the first workpiece coordinate system lie in the plane and its z axis is parallel to the normal direction of the plane, and obtaining a rotation matrix from the substrate coordinate system to the first workpiece coordinate system;
and calculating, according to the rotation matrix from the substrate coordinate system to the first workpiece coordinate system, the rotation angle through which the substrate needs to rotate about the x axis of the substrate coordinate system and the rotation angle about the y axis of the substrate coordinate system so that the plate surface of the substrate and the top surface of the article to be stacked are parallel to each other.
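The leveling computation of claim 3 can be sketched as follows: the three range readings, taken along the substrate normal, give three points on the top surface; their cross product gives the plane normal (the z axis of the first workpiece coordinate system), from which the rotations about the substrate x and y axes follow. This is an illustrative numpy sketch under my own assumptions (the substrate normal is taken as +z of the substrate frame, and the x-then-y Euler decomposition is one of several valid conventions), not the patented implementation.

```python
import numpy as np

def leveling_angles(start_points, distances, normal=np.array([0.0, 0.0, 1.0])):
    """Fit the plane of the top surface from three range readings taken along
    the substrate normal, and return the rotations (rad) about the substrate
    x and y axes that make the plate surface parallel to that plane.
    All quantities are expressed in the substrate coordinate system."""
    # Points where the range beams hit the top surface.
    q1, q2, q3 = (np.asarray(p, dtype=float) + d * normal
                  for p, d in zip(start_points, distances))
    # Plane normal of the top surface (z axis of the first workpiece frame).
    m = np.cross(q2 - q1, q3 - q1)
    m /= np.linalg.norm(m)
    if m[2] < 0:  # orient the normal toward the substrate
        m = -m
    # Solve Ry(beta) @ Rx(alpha) @ [0, 0, 1] = m for alpha, beta.
    alpha = -np.arcsin(np.clip(m[1], -1.0, 1.0))  # rotation about x
    beta = np.arctan2(m[0], m[2])                 # rotation about y
    return alpha, beta
```

With a top surface tilted by atan(0.1) about the y direction, the sketch returns a zero x rotation and a compensating y rotation of -atan(0.1).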
4. The measurement and control method according to claim 3, further comprising, before the robotic arm is driven to move the end gripping module above the article to be stacked: establishing a first camera coordinate system fixed relative to the first camera;
wherein calculating the amount of movement required for the substrate to move the suction cup fixed relative to the substrate to a position opposite the preset grabbing point on the top surface in the normal direction of the substrate comprises:
measuring the top surface of the article to be stacked with a distance measuring element to obtain the distance from the measurement starting point of that distance measuring element to the top surface;
identifying a first feature in the top picture;
obtaining the position of the first feature in the first camera coordinate system according to the position of the measurement starting point of the distance measuring element in the first camera coordinate system, the distance from that measurement starting point to the top surface of the article to be stacked, and the position of the first feature in the top picture;
calculating, according to the position of the first feature in the first camera coordinate system, the rotation matrix from the first camera coordinate system to the substrate coordinate system, and a first preset position in the substrate coordinate system, the translation the substrate needs along the x axis of the substrate coordinate system, the translation along the y axis of the substrate coordinate system, and the rotation angle about the z axis of the substrate coordinate system to move the first feature to the first preset position;
wherein the first preset position is the position of the first feature when the preset grabbing point on the top surface of the article to be stacked is aligned with the suction cup in the normal direction of the substrate.
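One way to realize the localization step of claim 4 is a pinhole back-projection: the range reading supplies the depth that a single image cannot, so the pixel of the first feature can be lifted into the first camera coordinate system and then transformed into the substrate coordinate system. The sketch below is illustrative only; the pinhole model, the intrinsic matrix `K`, the extrinsics `R_cam2sub`/`t_cam2sub`, and the treatment of the corner heading as a single image-plane angle are all assumptions of mine, not details from the patent.

```python
import numpy as np

def grasp_alignment(pixel_uv, corner_angle_img, depth, K,
                    R_cam2sub, t_cam2sub, preset_xy, preset_yaw):
    """Back-project the detected corner feature (pixel plus image-plane
    heading) into the substrate frame using the range reading as depth, then
    return the x/y translations and z rotation that move the feature to the
    preset grasp position. K is the pinhole intrinsic matrix; R_cam2sub and
    t_cam2sub map camera coordinates to substrate coordinates."""
    u, v = pixel_uv
    # Pinhole back-projection: X = depth * K^-1 @ [u, v, 1] (camera frame).
    ray = np.linalg.solve(K, np.array([u, v, 1.0]))
    p_cam = depth * ray / ray[2]
    # Express the feature in the substrate coordinate system.
    p_sub = R_cam2sub @ p_cam + np.asarray(t_cam2sub, dtype=float)
    # In-plane corrections: translations along substrate x and y, and the
    # rotation about the substrate z axis matching the corner heading.
    dx = preset_xy[0] - p_sub[0]
    dy = preset_xy[1] - p_sub[1]
    dtheta = preset_yaw - corner_angle_img
    return dx, dy, dtheta
```

In a full controller the yaw correction would be applied before recomputing the translations, since rotating the substrate moves the camera; the sketch ignores that coupling for clarity.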
5. The measurement and control method according to claim 4, wherein the first feature comprises a right angle formed by two adjacent edges on the top surface of the article to be stacked.
6. The measurement and control method according to claim 3, wherein calculating the amount of movement required for the substrate to move the side surface parallel to the reference plane comprises:
obtaining a plane equation of the reference plane in the substrate coordinate system according to the coordinates, in the substrate coordinate system, of the position points on the reference plane sensed by the three detection elements;
setting a third workpiece coordinate system according to the plane equation of the reference plane in the substrate coordinate system, wherein the x axis and y axis of the third workpiece coordinate system lie in the reference plane and its z axis is parallel to the normal direction of the reference plane, and obtaining a rotation matrix from the substrate coordinate system to the third workpiece coordinate system;
and calculating, according to the rotation matrix from the substrate coordinate system to the third workpiece coordinate system and the relative position between the substrate and the side surface of the article to be stacked, the rotation angle through which the substrate needs to rotate about the x axis of the substrate coordinate system and the rotation angle about the z axis of the substrate coordinate system to move the side surface parallel to the reference plane.
7. The measurement and control method according to claim 3, further comprising, before the robotic arm is driven to move the end gripping module above the article to be stacked: establishing a second camera coordinate system fixed relative to the second camera;
wherein calculating the amount of movement required for the substrate to bring the bottom surface of the article to be stacked into coincidence with the top surface of the stacked articles comprises:
identifying a second feature in the side picture;
obtaining the position of the second feature in the second camera coordinate system according to the position, in the second camera coordinate system, of the plane of the side surface of the article to be stacked and the position of the second feature in the side picture;
calculating, according to the position of the second feature in the second camera coordinate system, the rotation matrix from the second camera coordinate system to the substrate coordinate system, and a second preset position in the substrate coordinate system, the translation the substrate needs along the x axis of the substrate coordinate system, the translation along the z axis of the substrate coordinate system, and the rotation angle about the y axis of the substrate coordinate system to move the second feature to the second preset position;
wherein the second preset position is the position of the second feature when the bottom surface of the article to be stacked is aligned and coincident with the top surface of the stacked articles.
8. The measurement and control method according to claim 7, wherein the second feature comprises a right angle formed by two adjacent edges on the side surface of the stacked articles.
9. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the measurement and control method according to any one of claims 1 to 8.
10. A robot system, characterized by comprising:
an actuating device, comprising
a robotic arm; and
an end gripping module arranged at the end of the robotic arm, the end gripping module comprising a substrate;
a measuring assembly, comprising
at least 3 distance measuring elements fixed relative to the substrate, each distance measuring element being used to measure the distance from its measurement starting point to the top surface of an article to be stacked in the normal direction of the substrate; and
a first camera fixed relative to the substrate, used to photograph the top surface of the article to be stacked to obtain a top picture; and
a computing assembly, in which a substrate coordinate system fixed relative to the substrate and a first camera coordinate system fixed relative to the first camera are established;
the computing assembly being capable of: calculating a plane equation, in the substrate coordinate system, of the plane of the top surface according to the distance in the normal direction of the substrate from the measurement starting point of each distance measuring element to the top surface of the article to be stacked and the coordinates of each measurement starting point in the substrate coordinate system; setting a first workpiece coordinate system according to the plane equation of the top surface in the substrate coordinate system, wherein the x axis and y axis of the first workpiece coordinate system lie in the plane and its z axis is parallel to the normal direction of the plane, and obtaining a rotation matrix from the substrate coordinate system to the first workpiece coordinate system; calculating, according to the rotation matrix from the substrate coordinate system to the first workpiece coordinate system, the rotation angle through which the substrate needs to rotate about the x axis of the substrate coordinate system and the rotation angle about the y axis of the substrate coordinate system so that the plate surface of the substrate and the top surface of the article to be stacked are parallel to each other; and identifying a first feature in the top picture;
the computing assembly being further capable of: obtaining the position of the first feature in the first camera coordinate system according to the position of the measurement starting point of the distance measuring element in the first camera coordinate system, the distance from that measurement starting point to the top surface of the article to be stacked, and the position of the first feature in the top picture; and calculating, according to the position of the first feature in the first camera coordinate system, the rotation matrix from the first camera coordinate system to the substrate coordinate system, and a first preset position in the substrate coordinate system, the translation the substrate needs along the x axis of the substrate coordinate system, the translation along the y axis of the substrate coordinate system, and the rotation angle about the z axis of the substrate coordinate system to move the first feature to the first preset position;
the robotic arm being driven to move the substrate according to the calculation results so that the end gripping module accurately grips the article to be stacked;
wherein the first preset position is the position of the first feature when the preset grabbing point on the top surface of the article to be stacked is aligned with the suction cup in the normal direction of the substrate.
11. The robot system according to claim 10, wherein the measuring assembly further comprises:
a reference plane generator arranged in the environment, used to generate a reference plane;
at least 3 detection elements fixed relative to the substrate, used to detect the position of the reference plane relative to the substrate; and
a second camera fixed relative to the substrate, used to photograph a side picture of the stacked articles;
wherein the computing assembly is further used to calculate, according to the position of the reference plane relative to the substrate and the side picture, the amount of movement required for the substrate to accurately stack the article to be stacked on the top surface of the stacked articles, and to drive the robotic arm to move the substrate by that amount so that the article to be stacked moves onto the top surface of the stacked articles.
CN202011333511.7A 2020-11-25 2020-11-25 Robot system and measurement and control method Active CN112123342B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011333511.7A CN112123342B (en) 2020-11-25 2020-11-25 Robot system and measurement and control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011333511.7A CN112123342B (en) 2020-11-25 2020-11-25 Robot system and measurement and control method

Publications (2)

Publication Number Publication Date
CN112123342A CN112123342A (en) 2020-12-25
CN112123342B true CN112123342B (en) 2021-03-23

Family

ID=73852392

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011333511.7A Active CN112123342B (en) 2020-11-25 2020-11-25 Robot system and measurement and control method

Country Status (1)

Country Link
CN (1) CN112123342B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115701472A (en) * 2021-08-02 2023-02-10 广东博智林机器人有限公司 Wall brick paving equipment and wall brick paving method
CN114055444B (en) * 2021-08-27 2023-04-07 清华大学 Robot, control method and control device thereof, calibration method and calibration control device thereof, and storage medium
CN114102622B (en) * 2021-11-22 2023-07-14 清华大学 Robot system, measurement and control method thereof and surface laser receiver
CN114179090A (en) * 2021-12-28 2022-03-15 苏州优速软件研发有限公司 Rotation assembly control method, system, equipment and storage medium for manipulator
CN117506941B (en) * 2024-01-05 2024-05-03 珠海格力智能装备有限公司 Control method and device of mechanical arm, readable storage medium and electronic equipment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3333963B2 (en) * 1986-09-19 2002-10-15 テキサス インスツルメンツ インコーポレイテツド Visual guidance robot
JP6913573B2 (en) * 2017-09-07 2021-08-04 株式会社東芝 Unloading device and unloading method
US10369701B1 (en) * 2018-10-30 2019-08-06 Mujin, Inc. Automated package registration systems, devices, and methods
CN209125849U (en) * 2018-11-22 2019-07-19 北京电子科技职业学院 A kind of multi-functional end crawl structure of robot
CN109648587B (en) * 2018-12-30 2023-12-29 中冶南方工程技术有限公司 Mechanical arm end effector for grabbing and welding hanging welding nails and labels
CN109483554B (en) * 2019-01-22 2020-05-12 清华大学 Robot dynamic grabbing method and system based on global and local visual semantics

Also Published As

Publication number Publication date
CN112123342A (en) 2020-12-25

Similar Documents

Publication Publication Date Title
CN112123342B (en) Robot system and measurement and control method
US9026234B2 (en) Information processing apparatus and information processing method
US9604363B2 (en) Object pickup device and method for picking up object
US8295975B2 (en) Object picking device
JP6855492B2 (en) Robot system, robot system control device, and robot system control method
EP3173194A1 (en) Manipulator system, image capturing system, transfer method of object, and carrier medium
JP6879238B2 (en) Work picking device and work picking method
CN111278608B (en) Calibration article for 3D vision robot system
JP5370774B2 (en) Tray transfer apparatus and method
JP2016099257A (en) Information processing device and information processing method
CN110621447B (en) Robot conveyor calibration method, robot system and control system
CN112292235B (en) Robot control device, robot control method, and recording medium
CN112428248B (en) Robot system and control method
CN115582827A (en) Unloading robot grabbing method based on 2D and 3D visual positioning
US20220203549A1 (en) Suction pad and deformation measuring device
CN111344119B (en) Apparatus and method for robot
US20240003675A1 (en) Measurement system, measurement device, measurement method, and measurement program
JP7477633B2 (en) Robot System
CN114102622B (en) Robot system, measurement and control method thereof and surface laser receiver
JPH0545117A (en) Optical method for measuring three-dimensional position
WO2022202655A1 (en) Three-dimensional measurement system
JP5332873B2 (en) Bag-like workpiece recognition device and method
WO2020054724A1 (en) Picking system, information processing device, and computer-readable recording medium
US20230264352A1 (en) Robot device for detecting interference of constituent member of robot
TWI727628B (en) Dynamic tracking system with function of compensating pose and pose compensation method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant