CN115592662A - Picking method, picking system, and picking control device - Google Patents


Info

Publication number
CN115592662A
CN115592662A
Authority
CN
China
Prior art keywords
robot
picking
sensor
posture
gripped
Prior art date
Legal status
Pending
Application number
CN202210783229.1A
Other languages
Chinese (zh)
Inventor
吉田刚
Current Assignee
Seiko Epson Corp
Original Assignee
Seiko Epson Corp
Priority date
Filing date
Publication date
Application filed by Seiko Epson Corp filed Critical Seiko Epson Corp
Publication of CN115592662A

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1669 Programme controls characterised by programming, planning systems for manipulators characterised by special application, e.g. multi-arm co-operation, assembly, grasping
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

A picking method, a picking system, and a picking control device that can improve working efficiency. In the picking method, a picking action is repeatedly performed, the picking action including: a first step of acquiring an image including a plurality of objects using a sensor; a second step of calculating the pose of an object based on the image; a third step of determining a gripping posture for the robot to grip the object whose pose was calculated; a fourth step of gripping the object in the determined gripping posture; a fifth step of positioning the gripped object in the field of view of the sensor and acquiring an image including a plurality of objects not gripped by the robot and the object gripped by the robot; and a sixth step of calculating the pose of the object gripped by the robot based on that image. The fifth step in the n-th picking action also serves as the first step in the (n+1)-th picking action, where n is an integer of 1 or more.

Description

Picking method, picking system, and picking control device
Technical Field
The invention relates to a picking method, a picking system and a picking control device.
Background
Patent Document 1 describes a method of acquiring the posture of a robot hand while it holds a workpiece, acquiring the posture of the workpiece held by the robot hand based on an image captured by a camera, and then obtaining the relative posture of the robot hand and the workpiece from these two postures.
Patent Document 1: Japanese Patent Laid-Open Publication No. 2015-199155
According to such a method, since the posture of the workpiece while it is held by the robot hand is acquired, the relative posture of the robot hand and the workpiece can be obtained with high accuracy. However, Patent Document 1 does not describe the imaging timing of the camera in detail, and depending on that timing it is difficult to shorten the time required for the work.
Disclosure of Invention
The picking method of the present invention repeats a picking action including: a first step of acquiring an image including a plurality of objects not gripped by a robot using a sensor; a second step of calculating a pose of at least one of the objects based on the image acquired in the first step; a third step of determining a gripping posture of the robot when at least one of the objects whose poses were calculated in the second step is gripped by the robot; a fourth step of causing the robot to grip the object in the gripping posture determined in the third step; a fifth step of positioning the object gripped by the robot in the fourth step in a field of view of the sensor, and acquiring an image including a plurality of objects not gripped by the robot and the object gripped by the robot using the sensor; and a sixth step of calculating a pose of the object gripped by the robot based on the image acquired in the fifth step, the fifth step in the n-th picking action also serving as the first step in the (n+1)-th picking action, where n is an integer of 1 or more.
The picking system according to the present invention includes a robot and a sensor, and repeats a picking action including: a first step of acquiring an image including a plurality of objects not gripped by the robot using the sensor; a second step of calculating a pose of at least one of the objects based on the image acquired in the first step; a third step of determining a gripping posture of the robot when at least one of the objects whose poses were calculated in the second step is gripped by the robot; a fourth step of causing the robot to grip the object in the gripping posture determined in the third step; a fifth step of positioning the object gripped by the robot in the fourth step in a field of view of the sensor, and acquiring an image including a plurality of objects not gripped by the robot and the object gripped by the robot using the sensor; and a sixth step of calculating a pose of the object gripped by the robot based on the image acquired in the fifth step, the fifth step in the n-th picking action also serving as the first step in the (n+1)-th picking action, where n is an integer of 1 or more.
The picking control device according to the present invention repeats a picking action including: a first step of acquiring an image including a plurality of objects not gripped by a robot using a sensor; a second step of calculating a pose of at least one of the objects based on the image acquired in the first step; a third step of determining a gripping posture of the robot when at least one of the objects whose poses were calculated in the second step is gripped by the robot; a fourth step of causing the robot to grip the object in the gripping posture determined in the third step; a fifth step of positioning the object gripped by the robot in the fourth step in a field of view of the sensor, and acquiring an image including a plurality of objects not gripped by the robot and the object gripped by the robot using the sensor; and a sixth step of calculating a pose of the object gripped by the robot based on the image acquired in the fifth step, the fifth step in the n-th picking action also serving as the first step in the (n+1)-th picking action, where n is an integer of 1 or more.
Drawings
Fig. 1 is a diagram showing an overall configuration of a picking system.
Fig. 2 is a flowchart showing the picking method.
Fig. 3 is a diagram showing a first image.
Fig. 4 is a diagram showing a first image.
Fig. 5 is a diagram for explaining a gripping method.
Fig. 6 is a diagram for explaining a gripping method.
Fig. 7 is a diagram for explaining a gripping method.
Fig. 8 is a diagram showing a second image.
Fig. 9 is a diagram showing a positional relationship when the second image is captured.
Fig. 10 is a diagram showing a state in which the calculated posture is compared with the correct posture.
Description of the reference numerals
100…picking system; 200…mounting table; 300…sensor; 310…3D camera; 500…robot hand; 501…claw; 502…claw; 600…robot; 610…base; 620…robot arm; 621…arm; 622…arm; 623…arm; 624…arm; 625…arm; 626…arm; 640…robot control device; E…encoder; G1…first image; G2…second image; J1…joint; J2…joint; J3…joint; J4…joint; J5…joint; J6…joint; M…motor; P0…correct pose; P1…pose; Q…placement area; S1…first step; S2…second step; S3…third step; S4…fourth step; S5…fifth step; S6…sixth step; S7…seventh step; S8…eighth step; S9…ninth step; Sp…picking action; Sp(n)…picking action; Sp(n+1)…picking action; T1…region; T2…region; W…workpiece; Wa…workpiece; Wb…workpiece.
Detailed Description
Hereinafter, a picking method, a picking system, and a picking control device according to the present invention will be described in detail based on embodiments shown in the drawings.
Fig. 1 is a diagram showing the overall configuration of the picking system. Fig. 2 is a flowchart showing the picking method. Fig. 3 and 4 are diagrams each showing the first image. Fig. 5 to 7 are diagrams for explaining gripping methods. Fig. 8 is a diagram showing the second image. Fig. 9 is a diagram showing the positional relationship when the second image is captured. Fig. 10 is a diagram showing a state in which the calculated pose is compared with the correct pose.
The picking system 100 shown in fig. 1 includes a sensor 300 that measures a plurality of workpieces W as objects placed on a mounting table 200, and a robot 600 that includes a robot hand 500 for picking a specific workpiece W from the mounting table 200.
The robot 600 is a six-axis robot having six drive axes. It has a base 610 fixed to the floor and a robot arm 620 connected to the base 610. The robot arm 620 has a plurality of arms 621, 622, 623, 624, 625, and 626 connected rotatably to one another, and includes six joints J1 to J6. The joints J2, J3, and J5 are bending joints, and the joints J1, J4, and J6 are torsion joints. Each of the joints J1 to J6 is provided with a motor M as a drive source and an encoder E that detects the rotation amount (arm rotation angle) of the motor M.
A robot hand 500 is connected to the distal end portion of the arm 626. The hand 500 is not particularly limited and can be selected as appropriate for the target work; in the present embodiment, it grips the workpiece W with a pair of claws 501 and 502.
Further, the robot 600 has a robot control device 640 as a picking control device that picks the workpiece W by controlling driving of the joints J1 to J6 and of the robot hand 500. The robot control device 640 is constituted by a computer, for example, and includes a processor (CPU) that processes information, a memory communicably connected to the processor, and an external interface. The memory stores various programs executable by the processor, and the processor reads and executes the programs stored in the memory.
The robot 600 has been briefly described above, but its structure is not particularly limited. For example, instead of a six-axis robot, the robot 600 may be a horizontal articulated robot (SCARA robot), a two-arm robot having two of the robot arms 620 described above, or the like. The robot 600 also need not be fixed to the floor; it may instead be mounted on an automated guided vehicle such as an AMR (Autonomous Mobile Robot) or an AGV (Automatic Guided Vehicle).
The sensor 300 is disposed above the mounting table 200 so as not to interfere with the operation of the robot 600. The sensor 300 is not particularly limited as long as it can acquire the measurement information needed to recognize the pose of the workpieces W on the mounting table 200; in the present embodiment, a 3D camera 310 (stereo camera) is used as a 3D sensor that captures a distance image having depth information for each pixel. As the sensor 300, for example, a measuring device that measures a three-dimensional shape by a phase shift method can also be used.
The entire configuration of the picking system 100 has been explained above. Next, a picking method for picking the workpiece W using the picking system 100 will be described. The picking method is realized by the robot control device 640 controlling each part.
As shown in fig. 2, the picking method repeatedly executes the picking action Sp including the first step S1 to the ninth step S9 as one cycle. The following describes the first step S1 to the ninth step S9 in this order.
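Before walking through the steps, the overall control flow can be summarized in code. The following is a minimal Python sketch of one such cycle, assuming hypothetical helpers (estimate_poses, plan_grip, imaging_position, estimate_held_pose, pose_error, correct_command) and a simplified robot and sensor API; it is not the patent's implementation, but it shows how the image captured in the fifth step is carried over as the first-step image of the next cycle.

def picking_loop(sensor, robot):
    image = sensor.acquire_depth_image()                # S1: capture needed on the first cycle only
    while True:
        poses = estimate_poses(image)                   # S2: poses of candidate workpieces W
        grip = plan_grip(poses)                         # S3: gripping method and gripping pose
        robot.grip(grip)                                # S4: grip the selected workpiece Wa
        robot.move_to(imaging_position(poses))          # S5: bring the gripped workpiece Wb into view
        image = sensor.acquire_depth_image()            # S5: capture the second image G2
        held_pose = estimate_held_pose(image)           # S6: pose P1 of Wb in the hand
        error = pose_error(CORRECT_POSE, held_pose)     # S7: deviation from the correct pose P0
        command = correct_command(BASE_COMMAND, error)  # S8: corrected motion command
        robot.execute(command)                          # S9: carry out the task
        # The S5 image (its table region T1) doubles as the next cycle's
        # S1 image, so no fresh capture is needed at the top of the loop.
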
First step S1
As shown in fig. 3, in the first step S1, a first image G1 including the plurality of workpieces W placed on the mounting table 200 is acquired using the 3D camera 310. The first image G1 is a distance image, and each pixel of the first image G1 has depth information.
Second step S2
As shown in fig. 4, in the second step S2, at least one workpiece W is extracted from the first image G1 acquired in the first step S1, and the pose of the extracted workpiece W is calculated. In the present embodiment, a plurality of workpieces W are extracted from the first image G1, and the poses of the extracted workpieces W are calculated. Calculating the poses of a plurality of workpieces W in this way increases the options available in the fourth step S4 described later, so the work can be performed more smoothly and accurately. In fig. 4, the workpieces W whose poses have been calculated are shown colored. For convenience of explanation, a workpiece W whose pose has been calculated is hereinafter also referred to as a "workpiece Wa".
The method of calculating the pose of the workpiece W is not particularly limited; for example, template matching can be used. In this method, first, a 3D model of the workpiece W is created using CAD. Next, the created 3D model is placed in various poses (positions and orientations) over the full 360° range, the contour shape seen in each pose is recorded, and a learning model is generated by associating each contour with its pose. The pose of the workpiece W is then calculated by comparing the workpiece W extracted from the first image G1 against the learning model. With such a method, the pose of the workpiece W can be calculated with high accuracy.
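As a concrete illustration, here is a minimal Python sketch of such a matching step, assuming a precomputed library of (pose, template patch) pairs rendered from the CAD model at sampled orientations; the names and the normalized cross-correlation score are illustrative stand-ins, with the score doubling as the confidence value referred to later.

import numpy as np

def match_score(patch, template):
    # Normalized cross-correlation between same-size patches;
    # higher means the contours agree better.
    p = (patch - patch.mean()) / (patch.std() + 1e-9)
    t = (template - template.mean()) / (template.std() + 1e-9)
    return float((p * t).mean())

def estimate_pose(patch, template_library):
    # template_library: iterable of (pose, template) pairs rendered from CAD.
    best_pose, best_score = None, -np.inf
    for pose, template in template_library:
        score = match_score(patch, template)
        if score > best_score:
            best_pose, best_score = pose, score
    return best_pose, best_score  # the score serves as the confidence
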
However, the second step S2 is not limited to this; only one workpiece W may be extracted from the first image G1. This shortens the time taken for extraction and improves the efficiency of the picking action Sp.
Third step S3
In the third step S3, the gripping method and the gripping pose of the robot 600 are determined for at least one of the workpieces Wa whose poses were calculated in the second step S2. In the present embodiment, the gripping method and the gripping pose are determined for all the workpieces Wa. Determining them for a plurality of workpieces Wa in this way increases the options available in the fourth step S4 described later, so the work can be performed more smoothly and accurately.
First, the gripping methods of the robot 600 will be described. In the present embodiment, the workpiece W has a flat annular shape, and the robot hand 500 includes a pair of claws 501 and 502. There are therefore three candidates: a first gripping method of pinching the workpiece W with the pair of claws 501 and 502 from the lateral direction, as shown in fig. 5; a second gripping method of pinching the workpiece W with the pair of claws 501 and 502 from the vertical direction, as shown in fig. 6; and a third gripping method of inserting the pair of claws 501 and 502 into the hole of the workpiece W and pressing them against its inner peripheral surface, as shown in fig. 7. In the third step S3, it is determined which of the first, second, and third gripping methods is suitable for each workpiece Wa, that is, which method can grip the workpiece Wa easily and firmly. For example, the gripping method is determined in consideration of the pose of the workpiece Wa, the poses of the surrounding workpieces W, and the like.
However, depending on the shape of the workpiece W, the determination of the gripping method may be omitted when there is no real choice, for example when only one gripping method is practical, or when several gripping methods exist but only a specific one is used.
Next, the gripping pose of the robot 600 will be described. The gripping pose is the pose with which the robot 600 approaches the workpiece Wa, that is, the relative pose of the workpiece Wa and the hand 500. For example, the gripping pose is determined in consideration of the pose of the workpiece Wa, the poses of the surrounding workpieces W, the movable range of the robot arm 620, and the like.
However, the third step S3 is not limited to this; the gripping method and the gripping pose may be determined for only one of the workpieces Wa. This shortens the time taken to determine the gripping method and the gripping pose and improves the efficiency of the picking action Sp.
The method of selecting one workpiece Wa from among the plurality of workpieces Wa is not particularly limited; one example is selection based on the confidence of the template matching. The higher the confidence, the more likely it is that the pose of the workpiece Wa calculated in the second step S2 matches the actual pose, so selecting a workpiece Wa with high confidence allows the picking action Sp to be performed with high accuracy. In this case, the workpiece Wa with the highest confidence may be selected, or one workpiece Wa may be selected from among those whose confidence is equal to or higher than a predetermined value.
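The two selection policies just mentioned could look like the following sketch, where each candidate is a (workpiece, confidence) pair produced by the template matching and the 0.8 threshold is an arbitrary illustrative value.

def select_highest_confidence(candidates):
    # candidates: list of (workpiece, confidence) pairs.
    return max(candidates, key=lambda c: c[1])[0]

def select_above_threshold(candidates, threshold=0.8):
    # Accept any workpiece whose confidence clears the threshold;
    # here we simply return the first one found.
    for workpiece, confidence in candidates:
        if confidence >= threshold:
            return workpiece
    return None  # no candidate is confident enough
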
Fourth step S4
In the fourth step S4, first, the workpiece Wa to be gripped by the robot 600 is selected from among the plurality of workpieces Wa. The selection method is not particularly limited and can consider, for example, the confidence of the template matching, how difficult it is to approach the workpiece Wa, whether the operation after gripping can proceed smoothly, and the like. Next, driving of the robot arm 620 and the robot hand 500 is controlled so that the robot 600 grips the workpiece Wa with the gripping method and the gripping pose determined in the third step S3. For convenience of explanation, the workpiece Wa gripped by the robot 600 is hereinafter also referred to as a "workpiece Wb".
Fifth step S5
In the fifth step S5, first, driving of the robot arm 620 is controlled so that the workpiece Wb gripped by the robot 600 in the fourth step S4 is positioned at a predetermined position within the field of view of the 3D camera 310. Next, as shown in fig. 8, a second image G2 including the plurality of workpieces W placed on the mounting table 200 and the workpiece Wb gripped by the robot 600 is acquired by imaging with the 3D camera 310.
Here, the position of the workpiece Wb within the field of view of the 3D camera 310 will be described. In the present embodiment, the workpiece Wb is positioned within the field of view of the 3D camera 310 so as to overlap the plurality of workpieces W placed on the mounting table 200 as little as possible. Hereinafter, this position is also referred to as the "imaging position". This is because, as described later, the second image G2 is used as the first image G1 of the next picking action Sp, so including more workpieces W in the second image G2 is preferable in that it increases the options of workpieces W to be gripped.
In particular, as shown in fig. 8, in the present embodiment, the placement area Q of the workpieces W on the mounting table 200 is set based on the pose calculation result of the second step S2, and the position of the workpiece Wb within the field of view of the 3D camera 310 is determined so that the overlap area (overlap amount) with the placement area Q is as small as possible, preferably zero. With such a determination method, the first image G1 and the calculation result of the second step S2 can be reused, so the position of the workpiece Wb within the field of view of the 3D camera 310 can be determined easily and appropriately without newly detecting the workpieces W on the mounting table 200. For example, workpieces W whose pose confidence is equal to or higher than a certain value may be extracted, and the placement area Q may be set based on the positions of the extracted workpieces W. This suppresses overlap with the workpieces W of high confidence, so the second image G2 can be used effectively as the first image G1 of the next picking action Sp.
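A simple way to realize this is to score candidate imaging positions by their overlap with the placement area Q in the image plane. The sketch below assumes Q and the silhouette of the workpiece Wb are given as boolean pixel masks, and that the candidate offsets are supplied by the caller; all names are illustrative.

import numpy as np

def overlap_area(q_mask, wb_mask, dx, dy):
    # Shift the Wb silhouette by (dx, dy) pixels and count pixels that
    # fall inside the placement area Q. np.roll wraps at the border,
    # which this simplified sketch ignores.
    shifted = np.roll(np.roll(wb_mask, dy, axis=0), dx, axis=1)
    return int(np.logical_and(q_mask, shifted).sum())

def choose_imaging_position(q_mask, wb_mask, candidate_offsets):
    # Return the candidate (dx, dy) with the smallest overlap, ideally zero.
    return min(candidate_offsets,
               key=lambda c: overlap_area(q_mask, wb_mask, c[0], c[1]))
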
As shown in fig. 9, in the present embodiment, the workpiece Wb is positioned closer to the 3D camera 310 than the plurality of workpieces W placed on the mounting table 200. That is, the workpieces W on the mounting table 200 and the workpiece Wb are offset in the depth direction (optical axis direction) of the 3D camera 310. The second image G2 is a distance image having depth information for each pixel. Therefore, by offsetting the workpieces W on the mounting table 200 and the workpiece Wb in the depth direction of the 3D camera 310 in this way, the region T1 including the workpieces W on the mounting table 200 and the region T2 including the workpiece Wb gripped by the robot 600 can be distinguished from each other and extracted from the second image G2 based on the depth information, as shown in fig. 8.
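Because the two groups of workpieces sit at different depths, separating the regions T1 and T2 reduces to thresholding the distance image. A minimal sketch, assuming a split depth placed between the imaging position of the workpiece Wb and the table surface:

import numpy as np

def split_regions(depth_image, depth_split):
    # Pixels nearer than depth_split belong to the gripped workpiece Wb (T2);
    # farther pixels belong to the workpieces still on the table (T1).
    valid = np.isfinite(depth_image)
    t2_mask = valid & (depth_image < depth_split)
    t1_mask = valid & (depth_image >= depth_split)
    return t1_mask, t2_mask
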
Sixth step S6
In the sixth step S6, the pose P1 of the workpiece Wb gripped by the robot 600 is calculated based on the second image G2 acquired in the fifth step S5, in particular on the region T2 extracted from the second image G2. That is, the relative positional relationship between the hand 500 and the workpiece Wb (the position and orientation of the workpiece Wb as gripped by the hand 500) is calculated.
Seventh step S7
As shown in fig. 10, in the seventh step S7, the pose P1 of the workpiece Wb calculated in the sixth step S6 is compared with the correct pose P0, which is the ideal pose of the workpiece Wb, and the error between them is calculated. The correct pose P0 can be set by learning in advance, through teaching or the like, based on the shape of the workpiece W, the configuration of the hand 500, the content of the work performed by the robot 600, and so on.
However, the method of setting the correct pose P0 is not particularly limited. For example, the correct pose P0 may also be calculated by detecting the pose of the hand 500 from the second image G2 and calculating the pose the workpiece W should have relative to the hand 500 according to the gripping method determined in the third step S3. Such a method requires no learning in advance, which shortens the time taken for preparation. When the correct pose P0 is learned in advance, the pose of the hand 500 at the time the second image G2 is captured preferably matches the pose assumed for the correct pose P0, so that P0 can be compared with the second image G2. In contrast, with the method of calculating the correct pose P0 from the second image G2, the pose of the hand 500 at the time of imaging can be set arbitrarily, which makes it easier to choose an imaging position with a small overlap area.
In addition, the pose of the hand 500 may be calculated not from the second image G2 but from the relative positional relationship between the robot 600 and the 3D camera 310 and the pose of the robot arm 620 calculated from the outputs of the encoders E of the joints J1 to J6. Such a method gives the same effect as calculating the correct pose P0 from the second image G2.
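Whichever way the correct pose P0 is obtained, the step-S7 error can be expressed as the transform carrying the calculated pose P1 onto P0. A sketch, assuming both poses are given as 4x4 homogeneous transforms of the workpiece in the hand frame:

import numpy as np

def pose_error(p0, p1):
    # Transform that maps the actual pose P1 onto the correct pose P0.
    return p0 @ np.linalg.inv(p1)

def error_magnitudes(err):
    # Scalar summaries: translation distance and rotation angle (radians).
    translation = float(np.linalg.norm(err[:3, 3]))
    cos_theta = (np.trace(err[:3, :3]) - 1.0) / 2.0
    angle = float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))
    return translation, angle
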
Eighth step S8
Here, a motion command generated on the assumption that the workpiece Wb is in the correct pose P0 is input to the robot control device 640 from a host computer, not shown. Therefore, as shown in fig. 10, if the actual pose P1 deviates from the correct pose P0, the operation specified by the command may not proceed smoothly, and failures may lower the yield. In the eighth step S8, the motion command is therefore corrected based on the error, calculated in the seventh step S7, between the pose P1 and the correct pose P0, and a corrected motion command is generated.
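Under the same 4x4-transform convention as above, correcting the command amounts to composing it with the step-S7 error. A sketch, assuming the host's command is the target pose of the hand planned for a workpiece gripped at the correct pose P0:

def correct_command(target_hand_pose, err):
    # err = P0 @ inv(P1). Right-multiplying retargets the hand pose so the
    # workpiece, actually gripped at P1, still ends up where the original
    # command intended: (target @ err) @ P1 == target @ P0.
    return target_hand_pose @ err
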
Ninth step S9
In the ninth step S9, driving of the robot 600 is controlled based on the corrected motion command generated in the eighth step S8, and the robot 600 performs a predetermined task. The content of the task is not particularly limited.
The picking action Sp has been explained above. In the picking method, this picking action Sp is repeated. As shown in fig. 2, the fifth step S5 in the n-th picking action Sp(n) (where n is an integer of 1 or more) also serves as the first step S1 in the next, (n+1)-th picking action Sp(n+1). That is, the second image G2 acquired in the fifth step S5 of the n-th picking action Sp(n), more specifically the region T1 of that second image G2, is used as the first image G1 of the first step S1 of the (n+1)-th picking action Sp(n+1). With such a method, the first image G1 does not need to be newly acquired in the (n+1)-th picking action Sp(n+1), so the number of steps of the (n+1)-th picking action Sp(n+1) is reduced and the time taken for the work can be reduced accordingly.
Note that in the first picking action Sp there is no second image G2 available as the first image G1, so the first image G1 needs to be acquired. However, if the arrangement of the workpieces W on the mounting table 200 has not changed between the previous stop of the robot 600 and the current restart, the second image G2 acquired before the previous stop may be used as the first image G1 of the current first picking action Sp.
In addition, when the imaging position in the fifth step S5 of the picking action Sp(n+1) is determined using the second image G2 acquired in the picking action Sp(n), the bulk state of the workpieces W on the mounting table 200 within the region T2 is unknown (hidden behind the workpiece Wb and not visible). The placement area Q therefore cannot be set there, and the overlap area cannot be calculated. In the fifth step S5 of the picking action Sp(n+1), the overlap area at the same imaging position as in the picking action Sp(n) is therefore assumed to be equal to the overlap area calculated in the picking action Sp(n). This makes it possible to choose an imaging position with a smaller overlap area.
The picking method, the picking system 100, and the robot control device 640 as the picking control device have been described above. As described above, the picking method repeatedly performs a picking action Sp including: a first step S1 of acquiring a first image G1 including a plurality of workpieces W as objects not gripped by the robot 600 using the sensor 300; a second step S2 of calculating the pose of at least one workpiece W based on the first image G1 acquired in the first step S1; a third step S3 of determining the gripping pose of the robot 600 when at least one of the workpieces Wa whose poses were calculated in the second step S2 is gripped by the robot 600; a fourth step S4 of causing the robot 600 to grip the workpiece Wa in the gripping pose determined in the third step S3; a fifth step S5 of positioning the workpiece Wb gripped by the robot 600 in the fourth step S4 in the field of view of the sensor 300 and acquiring a second image G2 including the plurality of workpieces W not gripped by the robot 600 and the workpiece Wb gripped by the robot 600 using the sensor 300; and a sixth step S6 of calculating the pose of the workpiece Wb gripped by the robot 600 based on the second image G2 acquired in the fifth step S5. The fifth step S5 in the n-th picking action Sp(n) (where n is an integer of 1 or more) also serves as the first step S1 in the (n+1)-th picking action Sp(n+1). That is, the second image G2 acquired in the picking action Sp(n) is used as the first image G1 of the picking action Sp(n+1). Therefore, according to this method, the time (takt time) taken for the picking action Sp(n+1) can be shortened, and the picking work can be made more efficient.
As described above, the picking action Sp includes the seventh step S7 of calculating the error between the pose P1 calculated in the sixth step S6 and the correct pose P0, which is the ideal pose of the workpiece Wb gripped by the robot 600. This error can be reflected in the control of the robot 600, making the operation of the robot 600 smoother.
In addition, as described above, in the fifth step S5, the position of the workpiece Wb gripped by the robot 600 within the field of view of the sensor 300 is determined based on the first image G1 acquired in the first step S1. In this way, by using the first image G1, the position of the workpiece Wb within the field of view of the sensor 300 can be easily specified.
As described above, in the fifth step S5, the position of the workpiece Wb gripped by the robot 600 within the field of view of the sensor 300 is determined so that the overlap area with the workpieces W whose poses were calculated in the second step S2 is as small as possible. This increases the options of workpieces W that can be selected as the workpiece Wb in the next picking action Sp.
As described above, the sensor 300 is the 3D camera 310 as the 3D sensor, and the region T1 including the plurality of workpieces W not gripped by the robot 600 and the region T2 including the workpiece Wb gripped by the robot 600 are distinguished from each other in the second image G2 acquired in the fifth step S5 based on the depth information obtained from the 3D camera 310. By such a method, it becomes easy to distinguish the regions T1 and T2.
As described above, the picking system 100 includes the robot 600 and the sensor 300, and repeats a picking action Sp including: a first step S1 of acquiring a first image G1 including a plurality of workpieces W as objects not gripped by the robot 600 using the sensor 300; a second step S2 of calculating the pose of at least one workpiece W based on the first image G1 acquired in the first step S1; a third step S3 of determining the gripping pose of the robot 600 when at least one of the workpieces Wa whose poses were calculated in the second step S2 is gripped by the robot 600; a fourth step S4 of causing the robot 600 to grip the workpiece Wa in the gripping pose determined in the third step S3; a fifth step S5 of positioning the workpiece Wb gripped by the robot 600 in the fourth step S4 in the field of view of the sensor 300 and acquiring a second image G2 including the plurality of workpieces W not gripped by the robot 600 and the workpiece Wb gripped by the robot 600 using the sensor 300; and a sixth step S6 of calculating the pose of the workpiece Wb gripped by the robot 600 based on the second image G2 acquired in the fifth step S5. The fifth step S5 in the n-th picking action Sp(n) (where n is an integer of 1 or more) also serves as the first step S1 in the (n+1)-th picking action Sp(n+1). That is, the second image G2 acquired in the picking action Sp(n) is used as the first image G1 of the picking action Sp(n+1). Therefore, according to such a configuration, the time taken for the picking action Sp(n+1) can be shortened, and the picking work can be made more efficient.
As described above, the robot control device 640 as the picking control device repeatedly performs a picking action Sp including: a first step S1 of acquiring a first image G1 including a plurality of workpieces W as objects not gripped by the robot 600 using the sensor 300; a second step S2 of calculating the pose of at least one workpiece W based on the first image G1 acquired in the first step S1; a third step S3 of determining the gripping pose of the robot 600 when at least one of the workpieces Wa whose poses were calculated in the second step S2 is gripped by the robot 600; a fourth step S4 of causing the robot 600 to grip the workpiece Wa in the gripping pose determined in the third step S3; a fifth step S5 of positioning the workpiece Wb gripped by the robot 600 in the fourth step S4 in the field of view of the sensor 300 and acquiring a second image G2 including the plurality of workpieces W not gripped by the robot 600 and the workpiece Wb gripped by the robot 600 using the sensor 300; and a sixth step S6 of calculating the pose of the workpiece Wb gripped by the robot 600 based on the second image G2 acquired in the fifth step S5. The fifth step S5 in the n-th picking action Sp(n) (where n is an integer of 1 or more) also serves as the first step S1 in the (n+1)-th picking action Sp(n+1). That is, the second image G2 acquired in the picking action Sp(n) is used as the first image G1 of the picking action Sp(n+1). Therefore, according to this configuration, the time taken for the picking action Sp(n+1) can be shortened, and the picking work can be made more efficient.
The picking method, the picking system, and the picking control device according to the present invention have been described above based on the illustrated embodiments, but the present invention is not limited thereto, and the configurations of the respective portions may be replaced with arbitrary configurations having the same function. In addition, other arbitrary structures may be added.

Claims (7)

1. A picking method, wherein a picking action is repeated, the picking action comprising:
a first step of acquiring an image including a plurality of objects not gripped by a robot using a sensor;
a second step of calculating a pose of at least one of the objects based on the image acquired in the first step;
a third step of determining a gripping posture of the robot when at least one of the objects whose poses were calculated in the second step is gripped by the robot;
a fourth step of causing the robot to grip the object in the gripping posture determined in the third step;
a fifth step of positioning the object gripped by the robot in the fourth step in a field of view of the sensor, and acquiring an image including a plurality of objects not gripped by the robot and the object gripped by the robot using the sensor; and
a sixth step of calculating a pose of the object gripped by the robot based on the image acquired in the fifth step,
the fifth step in the n-th picking action also serving as the first step in the (n+1)-th picking action, where n is an integer of 1 or more.
2. The picking method according to claim 1, wherein
the picking action includes a seventh step of calculating an error between the pose calculated in the sixth step and a correct pose, which is an ideal pose of the object gripped by the robot.
3. The picking method according to claim 1 or 2, wherein
in the fifth step, a position of the object gripped by the robot within the field of view of the sensor is determined based on the image acquired in the first step.
4. The picking method according to claim 3, wherein
the position of the object gripped by the robot within the field of view of the sensor is determined so that the area of overlap with the object whose pose is calculated in the second step is as small as possible.
5. The picking method according to claim 1 or 2, wherein
the sensor is a 3D sensor, and
based on depth information obtained from the 3D sensor, a region including the plurality of objects not gripped by the robot and a region including the object gripped by the robot are distinguished from each other in the image acquired in the fifth step.
6. A picking system having a robot and a sensor, the picking system repeatedly performing a picking action, the picking action comprising:
a first step of acquiring an image including a plurality of objects not gripped by the robot using the sensor;
a second step of calculating a pose of at least one of the objects based on the image acquired in the first step;
a third step of determining a gripping posture of the robot when at least one of the objects whose poses were calculated in the second step is gripped by the robot;
a fourth step of causing the robot to grip the object in the gripping posture determined in the third step;
a fifth step of positioning the object gripped by the robot in the fourth step in a field of view of the sensor, and acquiring an image including a plurality of objects not gripped by the robot and the object gripped by the robot using the sensor; and
a sixth step of calculating a pose of the object gripped by the robot based on the image acquired in the fifth step,
the fifth step in the n-th picking action also serving as the first step in the (n+1)-th picking action, where n is an integer of 1 or more.
7. A picking control device, wherein a picking action is repeatedly performed, the picking action comprising:
a first step of acquiring an image including a plurality of objects not gripped by a robot using a sensor;
a second step of calculating a pose of at least one of the objects based on the image acquired in the first step;
a third step of determining a gripping posture of the robot when at least one of the objects whose poses were calculated in the second step is gripped by the robot;
a fourth step of causing the robot to grip the object in the gripping posture determined in the third step;
a fifth step of positioning the object gripped by the robot in the fourth step in a field of view of the sensor, and acquiring an image including a plurality of objects not gripped by the robot and the object gripped by the robot using the sensor; and
a sixth step of calculating a pose of the object gripped by the robot based on the image acquired in the fifth step,
the fifth step in the n-th picking action also serving as the first step in the (n+1)-th picking action, where n is an integer of 1 or more.
CN202210783229.1A 2021-07-08 2022-07-05 Picking method, picking system, and picking control device Pending CN115592662A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-113341 2021-07-08
JP2021113341A JP2023009776A (en) 2021-07-08 2021-07-08 Picking method, picking system and picking control device

Publications (1)

Publication Number Publication Date
CN115592662A 2023-01-13

Family

ID=84841973

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210783229.1A Pending CN115592662A (en) 2021-07-08 2022-07-05 Picking method, picking system, and picking control device

Country Status (2)

Country Link
JP (1) JP2023009776A (en)
CN (1) CN115592662A (en)

Also Published As

Publication number Publication date
JP2023009776A (en) 2023-01-20

Similar Documents

Publication Publication Date Title
US10456917B2 (en) Robot system including a plurality of robots, robot controller and robot control method
JP6429473B2 (en) Robot system, robot system calibration method, program, and computer-readable recording medium
JP3946711B2 (en) Robot system
US20200298411A1 (en) Method for the orientation of an industrial robot, and industrial robot
EP3272473B1 (en) Teaching device and method for generating control information
US11285609B2 (en) Working position correcting method and working robot
JP7306937B2 (en) A control device for a robot device that adjusts the position of a member supported by a robot
JP2007011978A (en) Motion controller for robot
CN111745640B (en) Object detection method, object detection device, and robot system
US20190030722A1 (en) Control device, robot system, and control method
JP6217322B2 (en) Robot control apparatus, robot, and robot control method
CN115194755A (en) Apparatus and method for controlling robot to insert object into insertion part
WO2020179416A1 (en) Robot control device, robot control method, and robot control program
JP6499272B2 (en) Teaching apparatus and control information generation method
US20230123629A1 (en) 3d computer-vision system with variable spatial resolution
CN115592662A (en) Picking method, picking system, and picking control device
WO2023013740A1 (en) Robot control device, robot control system, and robot control method
JP2016203282A (en) Robot with mechanism for changing end effector attitude
US20220134550A1 (en) Control system for hand and control method for hand
EP3224004B1 (en) Robotic system comprising a telemetric device with a laser measuring device and a passive video camera
JP2022139157A (en) Carrying system and control method thereof
US11826919B2 (en) Work coordinate generation device
WO2023013739A1 (en) Robot control device, robot control system, and robot control method
WO2023013698A1 (en) Robot control device, robot control system, and robot control method
Nguyen Situation-oriented behavior-based stereo vision to gain robustness and adaptation in the manipulator control

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination