WO2020066066A1 - End effector device (Dispositif effecteur terminal) - Google Patents

End effector device

Info

Publication number
WO2020066066A1
WO2020066066A1 (application PCT/JP2019/009920)
Authority
WO
WIPO (PCT)
Prior art keywords
end effector
palm
finger
approach
proximity sensor
Application number
PCT/JP2019/009920
Other languages
English (en)
Japanese (ja)
Inventor
小也香 土肥
寛規 古賀
実里 鍋藤
Original Assignee
オムロン株式会社 (OMRON Corporation)
Application filed by オムロン株式会社 (OMRON Corporation)
Publication of WO2020066066A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00: Controls for manipulators
    • B25J13/08: Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J15/00: Gripping heads and other end effectors
    • B25J15/08: Gripping heads and other end effectors having finger members

Definitions

  • The present disclosure relates to an end effector device.
  • Patent Literature 1 discloses a robot system including a gripper.
  • The grasping unit of that system has two plate-like fingers that are arranged to face each other and are movable so as to approach and separate from each other, and a proximity sensor is provided on each of the mutually facing surfaces of the fingers.
  • It is an object of the present disclosure to provide an end effector device that can approach an object to be grasped without contacting either the object or objects in the surrounding environment.
  • An end effector includes a palm portion and a plurality of finger portions, each finger portion having one end in its extending direction connected to the palm portion, the finger portions being capable of a gripping operation of gripping an object by moving in a direction crossing the extending direction so as to approach each other.
  • A proximity sensor portion is provided at the other end, in the extending direction, of at least one of the plurality of finger portions, and faces the grip target in the extending direction.
  • An end effector device includes: the end effector; a drive device for driving the palm portion and each of the plurality of finger portions of the end effector; and a movement control unit that moves the palm portion toward the grip target while the proximity sensor portion faces the grip target so that the grip target and surrounding objects can be detected by the proximity sensor portion.
  • The device further includes an approach position determining unit that determines an approach position around the palm approach direction (the direction in which the palm portion approaches the grip target) at which all of the plurality of finger portions are located between the grip target and objects in the surrounding environment; the approach position is farther from the grip target in the palm approach direction than a grip position at which the grip target can be gripped by the gripping operation.
  • When the end effector moves to the approach position, the movement control unit stops the approach movement of the palm portion toward the grip target if, based on the detection result detected by the proximity sensor portion, there is a possibility that the grip target or a surrounding object will come into contact with the palm portion or any of the plurality of finger portions.
  • In summary, the device comprises: an end effector in which a proximity sensor portion is provided on at least one of a plurality of finger portions; a driving device for driving the palm portion and each finger portion of the end effector; a movement control unit for moving the palm portion toward the object to be grasped; and an approach position determining unit that determines an approach position at which all of the finger portions are located, around the palm approach direction, between the grasp target and objects in the surrounding environment, the approach position being farther from the grasp target in the palm approach direction than the grip position at which the grasp target can be gripped by the gripping operation; the movement control unit operates based on the detection result detected by the proximity sensor portion.
  • FIG. 1 is a block diagram illustrating an end effector device according to an embodiment of the present disclosure.
  • FIG. 1 is a perspective view illustrating an end effector according to an embodiment of the present disclosure.
  • FIG. 3 is a front view of the end effector of FIG. 2.
  • FIG. 3 is a plan view of the end effector of FIG. 2.
  • FIG. 3 is an enlarged front view of a tip portion of a finger portion in the end effector of FIG. 2.
  • FIG. 6 is a perspective view showing a first modification of the end effector of FIG. 2.
  • FIG. 13 is a perspective view showing a second modification of the end effector of FIG. 2.
  • FIG. 13 is a perspective view showing a third modification of the end effector of FIG. 2.
  • FIG. 13 is a perspective view showing a fourth modification of the end effector of FIG. 2.
  • FIG. 13 is a perspective view showing a fifth modification of the end effector of FIG. 2.
  • FIG. 13 is an exemplary perspective view showing a sixth modification of the end effector shown in FIG. 2;
  • FIG. 17 is an exemplary perspective view showing a seventh modification of the end effector shown in FIG. 2;
  • FIG. 19 is a perspective view showing an eighth modification of the end effector shown in FIG. 2.
  • FIG. 19 is an exemplary perspective view showing a ninth modification of the end effector shown in FIG. 2;
  • FIG. 21 is an exemplary perspective view showing a tenth modification of the end effector shown in FIG. 2;
  • FIG. 21 is an exemplary perspective view showing an eleventh modification of the end effector shown in FIG. 2;
  • FIG. 21 is an exemplary perspective view showing a twelfth modification of the end effector shown in FIG. 2;
  • FIG. 21 is an exemplary perspective view showing a thirteenth modification of the end effector shown in FIG. 2;
  • FIG. 21 is an exemplary perspective view showing a fourteenth modification of the end effector shown in FIG. 2;
  • FIG. 2 is a first schematic diagram for explaining each unit constituting a control device of the end effector device of FIG. 1.
  • FIG. 2 is a second schematic diagram for explaining each unit constituting the control device of the end effector device in FIG. 1.
  • FIG. 3 is a third schematic diagram for describing each unit constituting the control device of the end effector device of FIG. 1.
  • FIG. 4 is a fourth schematic diagram for explaining each unit constituting the control device of the end effector device of FIG. 1.
  • FIG. 5 is a fifth schematic diagram for explaining each unit constituting the control device of the end effector device of FIG. 1.
  • A flowchart for explaining a first approach process of the end effector device of FIG. 1.
  • A first flowchart for explaining a second approach process of the end effector device of FIG. 1.
  • A second flowchart for explaining the second approach process of the end effector device of FIG. 1.
  • As illustrated in FIG. 1, the end effector 10 constitutes part of an end effector device 1 such as a manipulator.
  • The end effector device 1 includes, for example, the end effector 10, an arm 20 connected to the end effector 10, a driving device 30 for driving the end effector 10 and the arm 20, a control device 100 for controlling the driving device 30, an operation unit 40 connected to the control device 100, and a power supply 50 for supplying power to the driving device 30 and the control device 100.
  • The control device 100 controls the drive of the end effector 10 and the arm 20 by outputting commands to the driving device 30 based on operations received by the operation unit 40.
  • The arm 20 is connected to a palm 11 (described later) of the end effector 10 and can be moved by the driving device 30 so that the position and posture of the end effector 10 can be changed arbitrarily.
  • The driving device 30 includes a motor (not shown) that drives the palm portion 11 and each finger portion 12, and an encoder (not shown) that detects the rotation of the motor; information detected by the encoder is output to the control device 100.
  • The end effector 10 includes a palm portion 11 and a plurality of finger portions 12 (two finger portions 12 in this embodiment) connected to the palm portion 11.
  • Each finger portion 12 has a base end 121 (see FIG. 4) in its extending direction connected to the palm portion 11, and is configured to perform a gripping operation of gripping the gripping target object 60 by moving in a direction intersecting the extending direction so that the finger portions approach each other.
  • Each finger portion 12 has, for example, a substantially rectangular plate shape, is arranged so that the plate surfaces face each other, and is movable by the driving device 30 in a direction orthogonal to the plate surface. The mutually facing surfaces of the finger portions 12 intersect (for example, orthogonally) the direction of movement and constitute gripping surfaces 123 (see FIG. 4) that face the gripping target object 60.
  • The motor that drives each finger unit 12 can be, for example, a linear motor.
  • Each finger 12 has a proximity sensor 13 provided at its tip 122 (see FIG. 4) in the extending direction.
  • Each proximity sensor unit 13 is constituted by, for example, a capacitive proximity sensor, and is configured to detect the approach and separation of the proximity sensor unit 13 to and from an object in the extending direction of each finger unit 12, as well as the approach and separation of the object to and from the proximity sensor unit 13 in that direction.
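The detection principle of such a capacitive sensor can be sketched with the simple parallel-plate model: capacitance between the electrode and a facing object grows as the gap shrinks, so a measured capacitance can be inverted into an approximate distance. This is an illustrative sketch only, not part of the disclosure; the electrode area and the idealized model are assumptions.

```python
# Illustrative sketch (not from the disclosure): inverting the parallel-plate
# model C = eps0 * A / d to estimate the electrode-to-object gap of a
# capacitive proximity sensor. The model and electrode area are simplifying
# assumptions chosen for illustration.
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def estimate_gap(capacitance_f: float, electrode_area_m2: float) -> float:
    """Approximate electrode-to-object distance (m) from capacitance (F)."""
    return EPS0 * electrode_area_m2 / capacitance_f

def approaching(previous_c: float, current_c: float) -> bool:
    """Rising capacitance between samples indicates the object is approaching."""
    return current_c > previous_c
```

With a 1 cm^2 electrode, a reading of 8.854e-14 F corresponds to a gap of about 1 cm under this model.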
  • Each proximity sensor section 13 has a frame-shaped electrode 131 arranged along the edge of the substantially rectangular tip surface 124 of each finger section 12, as viewed from the extending direction of each finger section 12.
  • The electrode 131 is arranged symmetrically with respect to a center line CL that is orthogonal to the grip surface 123 and passes through the center of the grip surface 123. The electrode 131 forms a first detection region 14 that covers the edge of the distal end surface 124 of each finger 12 and a second detection region 15 that is disposed inside the first detection region 14.
  • Together, the first detection area 14 and the second detection area 15 cover substantially the entire distal end surface 124 of each finger 12.
  • The length of each finger 12 in its extending direction is substantially the same, and the distal end surfaces 124 of the fingers 12 lie on the same plane orthogonal to the extending direction.
  • The proximity sensor units 13 of the finger units 12 are therefore arranged so that their distances to the grip target 60 in the extending direction of each finger unit 12 are substantially the same.
  • Each finger 12 has the proximity sensor 13 at the distal end in its extending direction, and, as viewed from the extending direction of each finger 12, the first detection region 14 has a frame shape and covers the edge of the distal end portion 122.
  • This realizes the end effector 10 capable of detecting the approach and separation of the tip 122 of each finger with respect to an object (for example, the grasp target 60 or the surrounding environment object 70 shown in FIG. 20), wherever along the edge of the tip 122 the object is located.
  • As a result, the positional relationship between each finger 12 and the object in the extending direction of each finger 12 can be grasped with high accuracy.
  • When the end effector 10 approaches the grasping target 60, it can therefore be detected with high accuracy whether each finger 12 can approach without contacting the grasping target 60.
  • In this embodiment, the proximity sensor unit 13 is constituted by a capacitive proximity sensor.
  • The proximity sensor section 13 has a frame-shaped electrode 131 disposed along the edge of the tip 122 of each finger section 12, as viewed from the extending direction of each finger section 12.
  • The end effector 10 can therefore detect the approach and separation of an object with respect to the tip 122 of each finger 12 in the extending direction of each finger 12, and the end effector device 1 can detect the approach and separation of the tip portion 122 of each finger portion 12 with respect to the object.
  • The end effector 10 only needs to include a plurality of finger portions 12; it is not limited to two finger portions 12.
  • For example, the end effector 10 may include three fingers 12.
  • The shape of each finger 12 is not limited to a substantially rectangular shape as viewed from the direction in which each finger 12 extends.
  • For example, the distal end 122 of each finger 12 may have an arc shape as viewed from the direction in which each finger 12 extends.
  • The end face 124 of the distal end portion 122 is not limited to a plane orthogonal to the extending direction of the finger portion.
  • For example, the end surface 124 of the distal end portion 122 may be constituted by a curved surface 125 protruding toward the distal end in the extending direction of each finger portion, or by an inclined surface 126 intersecting the extending direction.
  • The proximity sensor unit 13 may be provided on at least one of the plurality of finger units 12; it need not be provided on every one of them.
  • The proximity sensor unit 13 only needs to have at least the first detection region 14, that is, a frame-shaped detection region covering the edge of the tip 122 of each finger 12 as viewed from the extending direction of each finger.
  • It is not limited to a capacitive proximity sensor having the frame-shaped electrode 131; it may instead be constituted by one or more capacitive sensors having electrodes 131 of arbitrary shape.
  • The proximity sensor unit 13 in FIG. 10 includes one solid electrode 131 that covers substantially the entire distal end surface 124 of each finger unit 12.
  • The proximity sensor unit 13 in FIG. 11 is configured by two frame-shaped electrodes 131 of different sizes, one electrode 131 disposed inside the other.
  • The proximity sensor unit 13 shown in FIGS. 12 and 13 includes two linear electrodes 131 of the same size.
  • The proximity sensor unit 13 shown in FIGS. 14 and 15 includes two substantially C-shaped electrodes 131 of the same size.
  • The proximity sensor unit 13 shown in FIG. 16 includes two frame-shaped electrodes 131 of the same size.
  • The proximity sensor unit 13 shown in FIGS. 17 and 18 includes two solid electrodes 131 of the same size.
  • In some of these modifications, each electrode 131 is disposed along one of a pair of sides orthogonal to the center line CL on the distal end surface 124 of each finger unit 12; in others, each electrode 131 is arranged along one of a pair of sides parallel to the center line CL.
  • In each case, the electrodes 131 are arranged symmetrically, as viewed from the direction in which the finger unit 12 extends, with respect to a plane perpendicular to the grip surface 123. The configuration of the proximity sensor unit 13 can thus be changed arbitrarily in accordance with the dimensional configuration of each finger unit 12 or the shape and size of the object 60 to be gripped, realizing an end effector 10 with high versatility.
  • The proximity sensor unit 13 is not limited to a capacitive proximity sensor; it may be an optical, inductive, magnetic, or ultrasonic proximity sensor of any type.
  • The proximity sensor unit 13 in FIGS. 5, 7, and 11 to 15 uses a loop electrode, which can reduce the parasitic capacitance of the proximity sensor unit 13 and thereby increase the detection sensitivity.
  • The proximity sensor unit 13 in FIGS. 10 and 16 to 18 uses a solid electrode, which can increase the electrode area and thereby increase the detection sensitivity.
  • The proximity sensor unit 13 in FIGS. 5, 7, and 10 is a self-capacitance proximity sensor having a single electrode, which can increase the electrode area and thereby increase the detection sensitivity.
  • The proximity sensor unit 13 in FIGS. 11 to 18 has a plurality of electrodes and can be configured as a plurality of self-capacitance proximity sensors or as one or more mutual-capacitance proximity sensors.
  • Configuring the proximity sensor unit 13 with a plurality of electrodes increases the number of detection "pixels" in the first detection area 14, making it possible to determine in which direction the finger 12 should be moved to avoid contact with the object 60 to be grasped or the object 70 in the surrounding environment when an object is located at the edge portion.
  • At least one of the fingers 12 may include one or both of a first auxiliary proximity sensor 16 and a second auxiliary proximity sensor 17.
  • The first auxiliary proximity sensor section 16 is provided on the grip surface 123 and is arranged so as to detect the approach and separation of the grip target 60 with respect to the grip surface 123.
  • The first auxiliary proximity sensor unit 16 allows the positional relationship between the grip surface 123 of each finger 12 and the grip target 60 to be grasped more accurately when gripping the grip target 60.
  • The second auxiliary proximity sensor unit 17 is provided on the surface 127 opposite to the grip surface 123 in a direction intersecting the extending direction of each finger unit 12, and detects the approach and separation of an object (for example, an obstacle in the surrounding environment) with respect to that surface 127.
  • The second auxiliary proximity sensor unit 17 allows, for example, the positional relationship between each finger unit 12 and an obstacle in the surrounding environment to be grasped more accurately.
  • Like the proximity sensor unit 13, the first and second auxiliary proximity sensor units 16 and 17 can be configured by any type of proximity sensor, such as a capacitive, optical, inductive, magnetic, or ultrasonic type. When configured as capacitive sensors, they can be constituted by one or more capacitive sensors having electrodes of arbitrary shape. The detection results of the first and second auxiliary proximity sensor units 16 and 17 are output to, for example, the control device 100.
  • Next, the control device 100 of the end effector device 1 will be described.
  • The control device 100 includes a CPU for performing calculations, storage media such as a ROM and a RAM for storing programs and data necessary for controlling the end effector 10, and an interface unit for transmitting signals to and from the outside of the end effector device 1.
  • The control device 100 includes an approach position determination unit 110, a finger arrangement determination unit 120, a finger movement determination unit 130, and a movement control unit 140.
  • Each of the approach position determination unit 110, the finger arrangement determination unit 120, the finger movement determination unit 130, and the movement control unit 140 is a function realized by the CPU executing a predetermined program.
  • The approach position determination unit 110 determines, from the position of the proximity sensor unit 13 relative to the grip target 60 in the palm approach direction (the direction of arrow A in FIG. 21, in which the palm 11 of the end effector 10 approaches the grip target 60), an approach position P1 (see FIG. 22) that is farther from the grasp target 60 in the palm approach direction A than a grasp position P2 (see FIG. 24) described later.
  • The approach position P1 is a position at which the end effector 10 contacts neither the target object 60 nor the object 70 in the surrounding environment; it is determined according to the required moving time from the approach position P1 to the gripping position P2, the performance of the driving device 30, the dimensional configuration of each finger portion 12, and the shape and size of the object 60 to be gripped. The palm approach direction substantially coincides with, for example, the extending direction of each finger 12 of the end effector 10.
  • The approach position determination unit 110 also determines a speed change position P3 that is farther from the grip target 60 than the approach position P1 in the palm approach direction A.
  • The speed change position P3 is the position at which the movement control section 140 switches from a first moving speed, used between a position P0 farther from the grip target 60 than P3 and the speed change position P3, to a second moving speed smaller than the first moving speed.
  • The speed change position P3 is determined according to the required movement time from the approach position P1 of the end effector 10 to the gripping position P2, the performance of the driving device 30, the dimensional configuration of each finger 12, and the shape and size of the gripping target 60.
  • Each of the approach position P1, the speed change position P3, the first moving speed, and the second moving speed may be determined by user input, or may be selected by the approach position determination unit 110 from a plurality of values stored in advance; the value determined by user input or selected by the approach position determination unit 110 may also be corrected based on the detection result detected by the proximity sensor unit 13.
  • The second moving speed is determined, for example, according to the required moving time from the approach position P1 to the gripping position P2, within a range in which the end effector 10 can be stopped at the approach position P1.
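The two-speed behavior around the speed change position P3 can be sketched as a simple selection rule; the function name and all numeric distances and speeds below are hypothetical values chosen for illustration, not taken from the disclosure.

```python
# Illustrative sketch of the two-speed approach: the palm moves at the first
# (faster) speed between P0 and the speed change position P3, then at the
# smaller second speed so it can stop reliably at the approach position P1.
def commanded_speed(distance_to_target: float,
                    p3_distance: float,
                    first_speed: float,
                    second_speed: float) -> float:
    """Return the palm speed for the current sensed distance to the target."""
    if distance_to_target > p3_distance:
        return first_speed   # still beyond P3: move quickly
    return second_speed      # past P3: slow down toward P1
```

For example, with P3 defined at 0.10 m, a sensed distance of 0.30 m commands the first speed and 0.08 m commands the second.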
  • The finger arrangement determination unit 120 determines, based on the detection result detected by the proximity sensor unit 13 while the palm 11 is moved toward the grip target 60, whether each finger portion 12 can be placed, without the end effector 10 contacting the object 70, at a gripping position P2 (see FIG. 24) at which the finger portion 12 is located in the space 80 between the gripping target object 60 and the object 70 in the surrounding environment.
  • As an example, this arrangement determination is performed on each finger 12 at the approach position P1.
  • The gripping position P2 is a position at which the gripping target 60 can be gripped by the gripping operation of each finger 12; it is determined in advance based on, for example, the dimensional configuration of each finger 12 of the end effector 10 and the size of the gripping target 60.
  • The finger arrangement determining unit 120 calculates the distance from the determined approach position P1 to the grip target object 60 in the palm approach direction A from information such as the shape and size of the grip target object 60 input in advance. The finger arrangement determination unit 120 then grasps, from the information on the dimensional configuration of each finger 12, the calculated distance, and the detection result detected by the proximity sensor unit 13 of each finger 12, the positional relationship between each finger unit 12 and the grasp target 60 and the surrounding environment object 70 in the direction orthogonal to the palm approach direction A. The finger arrangement determination unit 120 thereby determines whether each finger unit 12 can be inserted into the space 80 between the grasp target 60 and the surrounding environment object 70 without contacting either of them. This determination is performed, for example, before moving the end effector 10 to the approach position P1, and again after moving the end effector 10 to the approach position P1 and before moving it to the gripping position P2.
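In the simplest one-dimensional case, the arrangement determination described above reduces to checking whether a finger fits into the gap between the grasp target 60 and a surrounding object 70 with some clearance. The following sketch treats only that simplest case; the function name, parameters, and clearance model are hypothetical.

```python
def finger_fits(gap_width: float, finger_thickness: float,
                clearance: float) -> bool:
    """One-dimensional arrangement check: a finger can be inserted into the
    space (80) between the target (60) and a nearby object (70) only if the
    gap exceeds the finger thickness plus clearance on both sides."""
    return gap_width >= finger_thickness + 2.0 * clearance
```

With a 10 mm thick finger and 5 mm clearance per side, a 30 mm gap passes the check and an 18 mm gap fails it.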
  • When the finger arrangement determination unit 120 determines, for some of the plurality of finger portions 12, that they cannot be arranged at the grip position P2 without contacting the object to be gripped 60 or the object 70 in the surrounding environment, then after the movement control portion 140 performs the first movement described later, a first rearrangement determination is made as to whether the finger portions 12 that were determined to be placeable before the first movement can still be arranged at the grip position P2 after the first movement without contacting the object to be gripped 60 or the object 70 in the surrounding environment. The first rearrangement determination is performed in the same manner as the arrangement determination.
  • When the finger portions 12 for which the arrangement determination was performed before the first movement are determined to be placeable at the grip position P2 after the first movement, the finger arrangement determination unit 120 performs a second rearrangement determination as to whether the finger portions 12 for which the placement-impossible determination was made before the first movement can be placed at the grip position P2 after the first movement without contacting the holding target object 60 or the surrounding environment object 70. The second rearrangement determination is performed in the same manner as the arrangement determination.
  • The finger arrangement determination unit 120 performs the arrangement determination, for example, each time the movement control unit 140 moves each finger unit 12 at the approach position P1.
  • When the arrangement determination or the second rearrangement determination performed by the finger arrangement determination unit 120 results in a placement-impossible determination, the finger movement determination unit 130 determines whether each of the plurality of finger portions 12 can be moved in a direction intersecting the extending direction of each finger portion 12 so that the grip target 60 can be arranged between the grip surfaces 123 of the plurality of finger portions 12. When this movement determination concludes that the finger portions 12 cannot be so moved, the finger movement determination unit 130 determines that the grip target 60 cannot be gripped.
  • In this embodiment, the end effector 10 is moved to the approach position P1 with the plurality of finger portions 12 closed, that is, moved toward each other in the gripping direction B1 (the direction intersecting the extending direction of each finger portion 12 in which the finger portions approach each other). For this reason, when the placement-impossible determination is made in the arrangement determination, the movement control unit 140 determines the direction in which the finger portions 12 separate from each other as the movement direction of the plurality of finger portions 12, and moves each finger portion 12 stepwise along that direction. When the plurality of finger portions 12 cannot move in the direction determined by the movement control unit 140, the movement-impossible determination is made without performing the arrangement determination.
  • The movement control unit 140 controls the driving device 30 based on the determination results of the approach position determination unit 110, the finger arrangement determination unit 120, and the finger movement determination unit 130, and drives the palm unit 11 and each finger unit 12.
  • In the first approach processing, the movement control unit 140 drives the palm 11 and each finger 12 as follows.
  • When moving the end effector 10 toward the grip target 60 from a position farther from the grip target 60 than the approach position P1 in the palm approach direction A (for example, the position P0 illustrated in FIGS. 20 and 21), the movement control unit 140 determines, based on the detection result detected by the proximity sensor unit 13, whether the end effector 10 has moved to the approach position P1, and stops the movement of the end effector 10 when it determines that the approach position P1 has been reached. At this time, as shown in FIG. 21, the end effector 10 is moved to the approach position P1 with the plurality of finger portions 12 closed, that is, moved toward each other in the gripping direction B1. Whether the end effector 10 has moved to the approach position P1 is determined from the distance to the grip target 60 in the palm approach direction A detected by the proximity sensor unit 13.
  • Before moving the end effector 10 to the approach position P1, when the finger arrangement determining unit 120 determines that the position of the grasp target 60 relative to the end effector 10 in the palm approach direction A cannot be grasped, the movement control unit 140 moves the palm unit 11 in a direction intersecting (for example, orthogonal to) the palm approach direction A until it is determined that the position of the grasp target 60 relative to the end effector 10 in the palm approach direction A can be grasped.
  • When moving the palm unit 11 toward the grasp target 60 in the palm approach direction A to bring the end effector 10 to the approach position P1, the movement control unit 140 determines, based on the detection result of the proximity sensor unit 13, whether there is a possibility that the grasp target 60 or the surrounding environment object 70 will come into contact with the palm 11 or any finger 12. When such contact is judged possible, the movement control unit 140 stops the approach movement of the palm 11 toward the target object 60 in the palm approach direction A. In addition, based on the detection result detected by the proximity sensor unit 13, the movement control unit 140 moves the end effector 10 to the speed change position P3.
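The stop condition of the first approach processing can be sketched as a simple control loop. Here `read_distance` stands in for the proximity sensor reading at a given travel, and all thresholds and the step size are hypothetical; this is an illustrative sketch, not the disclosed control law.

```python
def approach_until(read_distance, step: float,
                   contact_threshold: float,
                   approach_distance: float) -> float:
    """Advance the palm in small steps, stopping either when a possible
    contact is detected (sensed distance at or below contact_threshold) or
    when the approach position P1 is reached. Returns total commanded travel."""
    traveled = 0.0
    while True:
        d = read_distance(traveled)
        if d <= contact_threshold:      # possible contact: stop immediately
            return traveled
        if d <= approach_distance:      # approach position P1 reached
            return traveled
        traveled += step
```

For a target initially 0.5 m away, stepping 0.1 m at a time with P1 defined at 0.1 m stops the palm after about 0.4 m of travel; a constant reading below the contact threshold stops it before any motion.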
  • the movement control unit 140 drives the palm unit 11 and each finger portion 12 as follows.
  • the second approach processing is performed after the first approach processing ends.
  • when a placeable determination is made in the placement determination or in the second rearrangement determination, the movement control unit 140 controls the driving device 30 to move the palm unit 11 toward the grip target 60 in the palm approach direction A, inserts each finger portion 12 into the space 80 between the grip target 60 and the surrounding environment object 70 without touching either of them, and moves the end effector 10 to the grip position P2.
  • when a placement rejection determination is made for all or some of the finger portions 12 in the placement determination, the movement control unit 140 controls the driving device 30 to perform a first movement of moving all of the plurality of finger portions 12 in a direction intersecting (for example, orthogonal to) the extending direction of each finger portion 12. The first movement is performed by moving the finger portions 12 in the direction B2 away from each other, in a direction intersecting the extending direction of each finger portion 12, and opening them.
  • when a placement rejection determination is made in the first rearrangement determination, the movement control unit 140 performs a second movement of moving the palm unit 11 in a direction intersecting the palm approach direction A, so that the finger portion 12 determined to be unplaceable before the first rearrangement determination moves away from the grip target 60 in a direction intersecting the extending direction of that finger portion 12 (that is, in the same direction as the direction B2 in which that finger portion 12 was moved in the first movement).
  • whether the end effector 10 has reached the approach position P1, the grip position P2, or the speed change position P3 is determined based on, for example, the detection result detected by the proximity sensor unit 13 that is close to the grip target 60 in the palm approach direction A among the plurality of proximity sensor units 13 (that is, the distance from each proximity sensor unit 13 to the grip target 60).
  • since the proximity sensor units 13 of the finger portions 12 are arranged so that their distances to the grip target 60 in the extending direction of each finger portion 12 are substantially the same, any of the proximity sensor units 13 may be used to determine whether the positions P1, P2, and P3 have been reached.
  • the first auxiliary proximity sensor unit 16 can also be used to determine whether the end effector 10 has reached the grip position P2; for example, when the first auxiliary proximity sensor unit 16 recognizes the grip target 60, the movement control unit 140 determines that the end effector 10 has reached the grip position P2.
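Because the finger-tip sensors sit at substantially the same distance to the target, arrival at P1, P2, or P3 can be decided from any one reading. A hedged sketch using the minimum reading follows; the function name, tolerance, and units are assumptions, not from the disclosure.

```python
def reached(position_distance_mm, sensor_distances_mm, tolerance_mm=1.0):
    """Return True when the closest finger-tip proximity reading says
    the end effector is at (or past) the given position's distance to
    the grip target along the palm approach direction A."""
    return min(sensor_distances_mm) <= position_distance_mm + tolerance_mm

# e.g. approach position P1 defined as 30 mm from the grip target:
print(reached(30.0, [30.4, 31.1]))  # within tolerance -> True
print(reached(30.0, [45.0, 46.2]))  # still approaching -> False
```

Using the sensed distance directly is what allows the arrival check without an extra device such as an encoder, as the fifth aspect below notes.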
  • the first approach processing and the second approach processing will be described with reference to FIGS. These processes described below are performed by the control device 100 executing a predetermined program.
  • the amount of movement in the directions B1 and B2 of the finger portions 12 approaching or separating from each other is a minute amount (for example, 1 mm).
  • first, the approach position determination unit 110 determines the approach position P1 and the speed change position P3, and also determines the first movement speed and the second movement speed (step S1).
  • next, the movement control unit 140 closes the plurality of finger portions 12 of the end effector 10 by bringing them close to each other in the gripping direction B1 (step S2), and detection by the proximity sensor unit 13 is started.
  • the finger placement determination unit 120 then determines, based on the detection result detected by the proximity sensor unit 13, whether the position of the grip target 60 relative to the palm unit 11 in the palm approach direction A can be grasped (step S3).
  • if it is determined that the position cannot be grasped, the movement control unit 140 moves the palm unit 11 in a direction intersecting the palm approach direction A (step S4). Steps S3 and S4 are repeated until it is determined that the position of the grip target 60 relative to the end effector 10 in the palm approach direction A can be grasped.
  • when it is determined that the position can be grasped, the movement control unit 140 causes the palm unit 11 to approach the grip target 60 in the palm approach direction A and starts moving the end effector 10 toward the approach position P1 (step S5).
  • when movement of the end effector 10 toward the approach position P1 starts, the movement control unit 140 first determines, based on the detection result detected by the proximity sensor unit 13, whether the grip target 60 or the surrounding environment object 70 may come into contact with the palm unit 11 or any finger portion 12, while also determining whether the end effector 10 has reached the speed change position P3 (step S6). Step S6 is repeated until the end effector 10 reaches the speed change position P3. If contact is judged possible, the movement control unit 140 stops the approach movement of the palm unit 11 toward the grip target 60 in the palm approach direction A.
  • when the end effector 10 reaches the speed change position P3, the movement control unit 140 changes the movement speed of the end effector 10 from the first movement speed to the second movement speed, which is smaller than the first movement speed (step S7).
  • after the speed change, the movement control unit 140 determines, based on the detection result detected by the proximity sensor unit 13, whether the end effector 10 has reached the approach position P1 (step S8). Step S8 is repeated until the end effector 10 reaches the approach position P1.
  • when the end effector 10 reaches the approach position P1, the movement control unit 140 stops the movement of the end effector 10 (step S9), and the first approach processing ends.
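Steps S5 to S9 above amount to a two-speed guarded approach loop. A minimal simulation sketch follows; all names, units, and the callback interface are illustrative assumptions, not the disclosed implementation.

```python
def first_approach(distance, step, p1, p3, v1, v2, contact_risk=lambda: False):
    """Drive toward the grip target: first movement speed v1 until the
    speed change position P3 (step S7), then the smaller speed v2 until
    the approach position P1 (steps S8-S9); abort on contact risk (S6)."""
    while distance() > p1:
        if contact_risk():
            return "stopped"                # approach movement aborted
        speed = v2 if distance() <= p3 else v1
        step(speed)                         # advance one control cycle
    return "arrived"                        # movement stopped at P1 (step S9)

# Simulated sensor/actuator: start 100 mm away, P3 at 50 mm, P1 at 10 mm.
state = {"d": 100.0}
result = first_approach(lambda: state["d"],
                        lambda v: state.__setitem__("d", state["d"] - v),
                        p1=10.0, p3=50.0, v1=5.0, v2=1.0)
print(result, state["d"])  # arrived 10.0
```

Slowing from v1 to v2 near the target trades cycle time for precision at the stop point, which matches the stated purpose of the speed change position P3.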
  • the end effector device 1 includes the end effector 10 in which a proximity sensor unit 13 is provided on each of the plurality of finger portions 12, the driving device 30 that drives the palm unit 11 and each finger portion 12 of the end effector 10, the movement control unit 140 that brings the palm unit 11 close to the grip target 60, and the approach position determination unit 110 that determines the approach position P1, at which all of the finger portions 12 face the space between the grip target 60 and the surrounding environment object 70 around the palm approach direction A, and which is farther from the grip target 60 in the palm approach direction A than the grip position P2 at which the grip target 60 can be gripped by the gripping operation. When moving the end effector 10 to the approach position P1, the movement control unit 140 stops the approach movement of the palm unit 11 toward the grip target 60 if there is a possibility that the grip target 60 or the surrounding environment object 70 contacts the palm unit 11 or any finger portion 12. With such a configuration, the end effector device 1 can move the end effector 10 to the approach position without contacting the grip target 60 and the surrounding environment object 70.
  • the movement control unit 140 moves the end effector 10 to the approach position P1 with the finger portions 12 brought close to each other in a direction intersecting their extending direction and closed.
  • since the proximity sensor units 13 of the finger portions 12 can be used as an array, the approach or separation of the grip target 60 and the surrounding environment object 70 relative to the end effector 10 in the palm approach direction A can be detected more accurately.
  • the movement control unit 140 determines, based on the detection result detected by the proximity sensor unit 13, whether the position of the grip target 60 relative to the end effector 10 in the palm approach direction A can be grasped. If it determines that the position cannot be grasped, the movement control unit 140 moves the end effector 10 in the direction B intersecting the palm approach direction A until it determines that the position can be grasped. With such a configuration, the end effector 10 can be moved toward the grip target 60 more accurately.
  • when the end effector 10 moves to the speed change position P3, which is farther from the grip target 60 than the approach position P1 in the palm approach direction A, the movement control unit 140 changes the movement speed from the first movement speed, used between the position P0 (farther from the grip target 60 than the speed change position P3) and the speed change position P3, to the second movement speed, which is smaller than the first movement speed.
  • steps S2 to S7 can be omitted as necessary. That is, when moving the end effector 10 to the approach position P1, it is not necessary to close the finger portions 12, to determine whether the position of the grip target 60 relative to the palm unit 11 in the palm approach direction A can be grasped, or to change the movement speed of the end effector 10. One or more of steps S2 to S7 may be omitted, or all of them may be omitted.
  • when step S3 of the first approach processing is omitted, the position of the grip target 60 relative to the palm unit 11 can be determined using an image sensor.
  • the proximity sensor unit 13 may be provided on at least one of the plurality of finger portions 12, and is not limited to being provided on each of them.
  • if it is determined in step S3 that the position of the grip target 60 relative to the end effector 10 in the palm approach direction A cannot be grasped, the first approach processing may be terminated at that point, instead of repeating the step until it is determined that the position can be grasped.
  • in the second approach processing, the movement control unit 140 first moves the plurality of finger portions 12 of the end effector 10 away from each other based on the input information of the grip target 60 (step S11).
  • the finger placement determining unit 120 determines whether or not all the fingers 12 can be placed at the grip position P2 (step S12).
  • FIGS. 26 and 27 show the process in the case where a placement rejection determination is made for all the finger portions 12 in this placement determination.
  • the movement control unit 140 may also be configured to gradually move the finger portions 12 toward each other in the gripping direction B1 in step S11 and close them.
  • when it is determined in step S12 that all the finger portions 12 can be placed, the movement control unit 140 moves the end effector 10 to the grip position P2 (step S13), and the second approach processing ends.
  • when a placement rejection determination is made in step S12, the finger movement determination unit 130 performs a movement determination to determine whether the finger portions 12 can be opened by moving them away from each other (step S14).
  • when the movement determination results in a movable determination, the movement control unit 140 moves the finger portions 12 in the direction B2 away from each other (step S15), and the process returns to step S12, where the finger placement determination unit 120 performs the placement determination again.
  • when the movement determination results in an immovable determination, the finger movement determination unit 130 determines that the grip target 60 cannot be gripped (step S16), and the second approach processing ends.
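The loop of steps S12 to S16 can be sketched as follows. The callbacks stand in for the finger placement determination unit 120, the finger movement determination unit 130, and the driving device 30; all names and the iteration cap are illustrative assumptions, not the disclosed implementation.

```python
def second_approach(can_place_all, can_open_further, open_step, max_iters=100):
    """Repeat: placement determination (S12); on rejection, movement
    determination (S14); open the fingers a minute amount (S15) and
    retry, or give up when they cannot open further (S16)."""
    for _ in range(max_iters):
        if can_place_all():
            return "move_to_grip_position"   # step S13
        if not can_open_further():
            return "ungrippable"             # step S16
        open_step()                          # step S15 (e.g. 1 mm in B2)
    return "ungrippable"

# Simulation: placement succeeds once the fingers have opened 3 steps.
opened = {"n": 0}
outcome = second_approach(lambda: opened["n"] >= 3,
                          lambda: opened["n"] < 10,
                          lambda: opened.__setitem__("n", opened["n"] + 1))
print(outcome, opened["n"])  # move_to_grip_position 3
```

Opening in minute increments (the 1 mm step mentioned earlier) keeps each retry of the placement determination cheap while avoiding contact with surrounding objects.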
  • the finger movement determination unit 130 determines whether the finger portions 12 can be opened by moving them away from each other (step S21).
  • the movement control unit 140 performs the first movement in which the finger units 12 are moved in the direction B2 away from each other and opened (step S22).
  • next, the finger placement determination unit 120 performs a first rearrangement determination to determine whether each finger portion 12 for which a placeable determination was made before the first movement can be placed at the grip position P2 after the first movement without contacting the grip target 60 or the surrounding environment object 70 (step S23).
  • the finger placement determination unit 120 also performs a second rearrangement determination to determine whether each finger portion 12 for which a placement rejection determination was made before the first movement can be placed at the grip position P2 after the first movement without contacting the grip target 60 or the surrounding environment object 70 (step S24).
  • when placeable determinations are made in both the first and second rearrangement determinations, the movement control unit 140 moves the end effector 10 to the grip position P2 (step S13), and the second approach processing ends.
  • when it is determined in step S21 that the finger portions 12 cannot be opened, the finger movement determination unit 130 determines that the grip target 60 cannot be gripped (step S16), and the second approach processing ends.
  • when the placement rejection determination is made in the first rearrangement determination, the movement control unit 140 performs a second movement of moving the palm unit 11 in a direction intersecting the palm approach direction A, so that the finger portion 12 for which the placeable determination was made before the first rearrangement determination moves away from the grip target 60 in a direction intersecting its extending direction (step S25). After the second movement, the process returns to step S12, and the finger placement determination unit 120 performs the placement determination again.
  • when the placement rejection determination is made in the second rearrangement determination in step S24, the process returns to step S21, and the finger movement determination unit 130 performs the movement determination again.
  • when the finger portions 12 are moved stepwise toward each other in the gripping direction B1 and closed in step S11, the second movement in step S25 is performed so that the finger portion 12 determined to be unplaceable before the first rearrangement determination approaches the grip target 60 in a direction intersecting its extending direction.
  • the end effector device 1 includes the end effector 10 in which a proximity sensor unit 13 is provided on each of the plurality of finger portions 12, the driving device 30 that drives the palm unit 11 and each finger portion 12 of the end effector 10, the finger placement determination unit 120 that determines whether each of the plurality of finger portions 12 can be placed at the grip position P2 without contacting the grip target 60 and the surrounding environment object 70, and the movement control unit 140 that moves the end effector 10 to the grip position P2 when the placement determination results in a placeable determination and moves each finger portion 12 in a direction intersecting its extending direction when it results in a placement rejection determination. With such a configuration, the end effector device 1 can be placed at a position where the grip target 60 can be gripped (that is, the grip position P2) without contacting the grip target 60 and objects in the surrounding environment.
  • when the placement rejection determination is made, the first movement of moving all of the plurality of finger portions 12 in a direction intersecting their extending direction is performed, and the finger placement determination unit 120 then performs a rearrangement determination to determine whether each finger portion 12 for which a placeable determination was made before the first movement can be placed at the grip position P2 after the first movement without contacting the grip target 60 or the surrounding environment object 70.
  • when the placement rejection determination is made in the rearrangement determination, the movement control unit 140 performs a second movement of moving the palm unit 11 in a direction intersecting the palm approach direction A, so that the finger portion 12 determined to be unplaceable before the rearrangement determination approaches or separates from the grip target 60 in a direction intersecting its extending direction. With such a configuration, the amount of driving of the arm 20 when moving the end effector 10 to the grip position P2 can be reduced.
  • the end effector device 1 further includes a finger movement determination unit 130 that performs a movement determination to determine whether each of the plurality of finger portions 12 can be moved in a direction intersecting its extending direction so that the grip target 60 can be placed between the plurality of finger portions 12.
  • when the movement determination results in an immovable determination, the finger movement determination unit 130 determines that the grip target 60 cannot be gripped. With such a configuration, whether the grip target 60 can be gripped can be determined accurately and quickly.
  • steps S16 and S25 can be omitted. That is, the finger movement determination unit 130 may be omitted, or the second movement need not be performed when the placement rejection determination is made in the first rearrangement determination.
  • alternatively, the palm unit 11 may be moved in a direction intersecting the palm approach direction A so that the finger portion 12 for which the placement rejection determination was made approaches the grip target 60, without performing the first movement in step S22.
  • an end effector device 1 according to a first aspect of the present disclosure includes an end effector 10 having a palm unit 11 and a plurality of finger portions 12, each connected to the palm unit 11 at one end 121 in its extending direction, the finger portions 12 being capable of a gripping operation of gripping the grip target 60 by moving in a direction intersecting the extending direction so as to approach each other;
  • a proximity sensor unit 13 is provided at the other end 122 in the extending direction of at least one of the plurality of finger portions 12, and is arranged so as to be able to detect, in the extending direction, the approach or separation of the grip target 60 and the surrounding environment object 70 relative to the proximity sensor unit 13;
  • the end effector device 1 further includes a driving device 30 that drives each of the palm unit 11 and the plurality of finger portions 12 of the end effector 10;
  • it also includes a movement control unit 140 that causes the palm unit 11 to approach the grip target 60 in a state where the proximity sensor unit 13 faces the grip target 60 and can detect the grip target 60 and the surrounding environment object 70, the approach being made toward an approach position P1 at which all of the plurality of finger portions 12 face the space between the grip target 60 and the object 70 around the palm approach direction A in which the palm unit 11 approaches the grip target 60;
  • when the end effector 10 moves to the approach position P1, the movement control unit 140 stops the approach movement of the palm unit 11 toward the grip target 60 when, based on the detection result detected by the proximity sensor unit 13, there is a possibility that the grip target 60 comes into contact with the palm unit 11 or any of the plurality of finger portions 12;
  • according to the end effector device 1 of the first aspect, the end effector 10 can be moved to the approach position without contacting the grip target 60 and the surrounding environment object 70.
  • in an end effector device 1 according to a second aspect, the proximity sensor unit 13 is provided on each of the plurality of finger portions 12, and the movement control unit 140 moves the end effector 10 to the approach position P1 with the plurality of finger portions 12 brought close to each other in a direction intersecting the extending direction and closed. Since the proximity sensor units 13 of the finger portions 12 can be used as an array, the approach or separation of the grip target 60 and the surrounding environment object 70 can be detected more accurately.
  • in an end effector device 1 according to a third aspect, the movement control unit 140 determines, before moving the end effector 10 to the approach position P1, whether the position of the grip target 60 relative to the end effector 10 in the palm approach direction A can be grasped based on the detection result detected by the proximity sensor unit 13. With such a configuration, the end effector 10 can be moved toward the grip target 60 more accurately.
  • in an end effector device 1 according to a fourth aspect, when the movement control unit 140 determines that the position of the grip target 60 relative to the end effector 10 in the palm approach direction A cannot be grasped, it moves the end effector 10 in a direction intersecting the palm approach direction A until it determines that the position can be grasped. With such a configuration, the end effector 10 can be moved toward the grip target 60 more accurately.
  • in an end effector device 1 according to a fifth aspect, the movement control unit 140 determines, based on the detection result detected by the proximity sensor unit 13, whether the end effector 10 has moved to the approach position P1. With such a configuration, it is possible to determine whether the end effector 10 has moved to the approach position P1 without providing an additional device such as an encoder.
  • in an end effector device 1 according to a sixth aspect, when the end effector 10 moves to the speed change position P3, which is farther from the grip target 60 than the approach position P1 in the palm approach direction A, the movement control unit 140 changes the movement speed of the end effector 10 from the first movement speed, which is the movement speed between the position P0 farther from the grip target 60 than the speed change position P3 and the speed change position P3, to the second movement speed, which is smaller than the first movement speed.
  • the end effector of the present disclosure can be applied to, for example, an end effector device of an industrial robot.
  • the end effector device of the present disclosure can be applied to, for example, an industrial robot.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Manipulator (AREA)

Abstract

The invention provides an end effector device including: an end effector having a proximity sensor unit disposed on the tip end of at least one finger; a driving device for driving the end effector; a movement control unit that, based on a detection result from the proximity sensor unit, moves a palm unit so as to approach an object to be gripped in a state in which the proximity sensor unit is positioned to face the object so that the proximity sensor unit can detect the object and other objects; and an approach position determination unit for determining an approach position that is farther from the object to be gripped, along the palm approach direction, than the grip position at which the object can be gripped by a gripping operation of the fingers.
PCT/JP2019/009920 2018-09-25 2019-03-12 Dispositif effecteur terminal WO2020066066A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-179302 2018-09-25
JP2018179302A JP6947145B2 (ja) 2018-09-25 2018-09-25 エンドエフェクタ装置

Publications (1)

Publication Number Publication Date
WO2020066066A1 true WO2020066066A1 (fr) 2020-04-02

Family

ID=69951296

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/009920 WO2020066066A1 (fr) 2018-09-25 2019-03-12 Dispositif effecteur terminal

Country Status (2)

Country Link
JP (1) JP6947145B2 (fr)
WO (1) WO2020066066A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112621795A (zh) * 2020-12-21 2021-04-09 深圳市越疆科技有限公司 机械臂末端执行器及其控制方法、机械臂和存储器
CN112621794A (zh) * 2020-12-21 2021-04-09 深圳市越疆科技有限公司 机械臂末端执行器及其控制方法、机械臂和存储器
CN112720486A (zh) * 2020-12-21 2021-04-30 深圳市越疆科技有限公司 机械臂末端执行器及其控制方法、机械臂和存储器

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS61105594U (fr) * 1984-12-14 1986-07-04
JPH04159094A (ja) * 1990-10-23 1992-06-02 Natl Space Dev Agency Japan<Nasda> 自動マニピユレーターハンド
JPH08141956A (ja) * 1994-11-17 1996-06-04 Sanyo Electric Co Ltd 物体把持ロボットの制御方式

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MOTOMAN ENGINEERING: "Non-official translation: Operation Education Manual, YRC/DX/NX", MOTOMAN YRC/DX/NX, December 2017 (2017-12-01) *

Also Published As

Publication number Publication date
JP2020049567A (ja) 2020-04-02
JP6947145B2 (ja) 2021-10-13

Similar Documents

Publication Publication Date Title
WO2020066066A1 (fr) Dispositif effecteur terminal
JP6661001B2 (ja) ワークオフセットを決定するシステムおよび方法
JP7147419B2 (ja) エンドエフェクタ装置
JP6663978B2 (ja) ツールオフセットを決定するシステムおよび方法
US9919424B1 (en) Analog control switch for end-effector
US10675767B2 (en) Robot system and robot controller
CN113165187A (zh) 图像信息处理装置、把持系统以及图像信息处理方法
WO2020066064A1 (fr) Effecteur terminal et dispositif d&#39;effecteur terminal
JP2015071207A (ja) ロボットハンドおよびその制御方法
JPH04240087A (ja) 把持装置および把持方法
WO2020066065A1 (fr) Dispositif effecteur d&#39;extrémité
JP2015085481A (ja) ロボット、ロボットシステム、ロボット制御部及び把持方法
JP6988757B2 (ja) エンドエフェクタおよびエンドエフェクタ装置
JP6314429B2 (ja) ロボット、ロボットシステム、及びロボット制御装置
JP2022115341A (ja) ケーブル終端検出方法およびハンド
US20200338720A1 (en) Robot
JP7226474B2 (ja) 把持装置
WO2020110237A1 (fr) Dispositif de reconnaissance d&#39;état de contact et système de robot
JP7022964B2 (ja) ロボットハンドの把持方法
WO2019098027A1 (fr) Système de préhension et son procédé de commande
JP2023006241A (ja) 圧電センサーおよびハンド
Su et al. A new, simple and universal four-finger gripper for 3D objects grasping
JP2015104763A (ja) ロボット装置の制御方法及びロボット装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19864918

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19864918

Country of ref document: EP

Kind code of ref document: A1