WO2022085408A1 - Robot control device and robot control method - Google Patents



Publication number
WO2022085408A1
Authority
WO
WIPO (PCT)
Prior art keywords
gripping
deformation
point
force
control device
Prior art date
Application number
PCT/JP2021/036652
Other languages
English (en)
Japanese (ja)
Inventor
浩司 白土
宏規 土橋
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 filed Critical 三菱電機株式会社
Priority to CN202180060765.2A priority Critical patent/CN116194255A/zh
Priority to JP2022557371A priority patent/JP7337285B2/ja
Priority to DE112021005493.7T priority patent/DE112021005493T5/de
Publication of WO2022085408A1 publication Critical patent/WO2022085408A1/fr


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/16: Programme controls
    • B25J 9/1612: Programme controls characterised by the hand, wrist, grip control
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00: Program-control systems
    • G05B 2219/30: Nc systems
    • G05B 2219/39: Robotics, robotics to robotics hand
    • G05B 2219/39469: Grip flexible, deformable plate, object and manipulate it
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00: Program-control systems
    • G05B 2219/30: Nc systems
    • G05B 2219/39: Robotics, robotics to robotics hand
    • G05B 2219/39507: Control of slip motion

Definitions

  • The present disclosure relates to a robot control device and a robot control method that issue operation commands to a robot and to an end effector attached to the robot's fingertip so that an object can be gripped without being dropped.
  • the robot control device controls the robot and the robot hand of the robot in order to grip the object, and includes a grip point generation unit that generates a grip point of the object to be gripped by the robot hand.
  • The grip point generation unit includes a deformation evaluation unit that calculates shape deformation information when the shape of the object is deformed by the gripping motion of the robot, and a grip point determination unit that determines the grip point of the object based on that shape deformation information.
  • Brief description of the drawings: a block diagram showing the structure of the gripping point generation unit according to Embodiment 3; a flowchart showing the operation of the robot control device according to Embodiment 3; diagrams showing the positional relationship between a finger and an object according to Embodiments 4 and 5; block diagrams showing the structure of the gripping point generation unit, and of another gripping point generation unit, according to Embodiment 6; a block diagram showing the structure of the gripping point generation unit according to Embodiment 7; a schematic diagram of the object before grasping according to Embodiment 9; and a diagram showing the hardware configuration of the robot control device according to Embodiments 1 to 9.
  • FIG. 1 is an overall view of a robot system according to the first embodiment for carrying out the present disclosure.
  • the robot system is based on the configuration of a robot and a robot control device that controls and operates the robot.
  • the robot 10 may perform a task called material handling, such as gripping an object.
  • A measuring device 60 that measures information such as the shape of the object 70, in order to acquire the position information and shape information of the object 70, and a robot hand 20 (end effector) for gripping the object 70 are also provided in this configuration.
  • The information on the object 70 measured by the measuring device 60 is processed by the measuring device controller 50, and the resulting information is input to the robot control device 30.
  • The robot control device 30 controls at least one of the joints of the arm of the robot 10 and the fingers of the robot hand 20 so that the fingers of the robot hand 20 move to an appropriate position.
  • The position and orientation and the shape information of the object 70 are examples of this information.
  • the robot control device 30 outputs the calculated position command value, the open position command value of the finger of the robot hand 20, and the closed position command value to the robot 10.
  • the robot control device 30 determines the timing at which the position command value for the robot hand 20 is executed with respect to the position command value of the robot 10, and transmits the position command value to the robot 10 as the position command value at each time t.
  • The position command value of the robot 10 has six degrees of freedom: three translational and three rotational.
  • The position command value of the fingers of the robot hand 20 depends on the type of hand; for a link structure it is defined by the fingertip position or the opening width. It may also refer to the position command value of each drive unit, but here it refers broadly to a position command value that can be specified regardless of the structure of the hand.
  • the finger of the robot hand 20 can control the gripping force when the actuator can control the pressure, the force, or the torque.
  • the gripping force command value is given to the gripping point candidate.
  • the "grasping point” means the position and posture of the finger in which the robot hand 20 can grip the object 70.
  • A position command value at each time t is required in addition to the position and orientation of the gripping point; the joint position target values of the robot 10 that allow the robot hand 20 to reach the gripping point of the object 70 are assumed to be calculated separately.
  • The information that can be used to calculate the gripping point of the object 70 need not be limited to the position and orientation and the shape information of the object 70. That is, in addition to direct information such as position information and shape information, indirect information such as temperature information, distance information, and color information of the object 70 can be used to estimate its position information and shape information.
  • FIG. 2 is a block diagram showing the configuration of the robot control device 30.
  • the robot control device 30 is mainly composed of a gripping point generation unit 31 and a command value generation unit 39.
  • the robot control device 30 calculates the position of the gripping point where the robot 10 should move, moves the robot hand 20 to the gripping point, and controls the robot 10 so that the robot 10 grips and operates.
  • the gripping point generation unit 31 outputs the gripping point of the object 70 by using the shape information of the object 70 to be gripped by the robot 10.
  • The target shape information is obtained as a point cloud calculated from the image information or distance information of the object 70 acquired by a visual sensor serving as the measuring device 60.
  • Alternatively, the object 70 may actually be gripped once by the fingers of the robot hand 20, and the shape information acquired from the position information of the fingers at the time of gripping.
  • the measuring device 60 may use a distance measuring sensor to acquire shape information based on the cross-sectional shape of the object.
  • the temperature sensor may be used as the measuring device 60 to acquire shape information based on the approximate position and shape of the object 70.
  • the measuring device 60 is not limited to the visual sensor. Further, position information, shape information, temperature information, distance information, color information and the like may be obtained from other than the measuring device 60.
  • FIG. 3 is a block diagram showing the configuration of the grip point generation unit 31.
  • the grip point generation unit 31 is composed of a grip point candidate generation unit 32, a deformation evaluation unit 33, and a grip point determination unit 36.
  • the deformation evaluation unit 33 calculates shape deformation information when the shape of the object 70 is deformed by the gripping operation of the robot hand 20.
  • The gripping point determination unit 36 determines the gripping point of the object based on the deformation amount of the object included in the shape deformation information and the geometric constraint conditions after the deformation of the object.
  • each component will be described.
  • The gripping point candidate generation unit 32 generates gripping point candidates that can be gripped by the robot hand 20 attached to the robot 10, based on the target shape information input to the robot control device 30. As a method of generating the candidates, it is possible to search exhaustively between any two points on the entire circumference obtained from the target shape information, subject to the stroke (opening width) of the fingers of the robot hand 20. A case taking a 2-finger gripper as an example is shown in FIG. 4, described later.
  • In the search, the object is assumed to have an elliptical shape; the search is the process of selecting any two points on the outer circumference of the object.
  • the deformation evaluation unit 33 moves a finger inside the object with respect to the two selected points, and performs deformation evaluation described later on the assumption that the grip is performed.
  • The search itself need not be exhaustive: with the opening/closing stroke L0 of the fingers as a constraint condition, a search can be executed that compares the distance between the two candidate points against L0, and the search method itself is not limited.
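The pairwise search under the stroke constraint can be sketched as follows. This is an illustrative reconstruction, not code from the patent: the function name `candidate_grip_pairs`, the sampled elliptical contour, and the numeric values are all assumptions.

```python
import math

def candidate_grip_pairs(contour, stroke_l0):
    """Pair contour points whose separation fits within the finger stroke L0."""
    pairs = []
    n = len(contour)
    for i in range(n):
        for j in range(i + 1, n):
            (x1, y1), (x2, y2) = contour[i], contour[j]
            d = math.hypot(x2 - x1, y2 - y1)
            # Keep the pair only if the hand can open wide enough to span both points.
            if d <= stroke_l0:
                pairs.append((contour[i], contour[j], d))
    return pairs

# Elliptical contour sampled at discrete angles, as in the 2-finger gripper example.
ellipse = [(0.05 * math.cos(t), 0.03 * math.sin(t))
           for t in (2 * math.pi * k / 36 for k in range(36))]
pairs = candidate_grip_pairs(ellipse, stroke_l0=0.08)
```

With a 0.05 m semi-major axis the full width (0.1 m) exceeds the assumed 0.08 m stroke, so pairs across the major axis are rejected while narrower pairs survive, which is exactly the comparison against L0 described above.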
  • The deformation evaluation unit 33 evaluates and outputs the expected shape deformation information, as shown in FIG. 4, for each of the plurality of gripping point candidates generated by the gripping point candidate generation unit 32.
  • the shape deformation information includes the shape information after deformation. In order to evaluate the shape deformation information, it is possible to calculate the shape deformation information expected in the model in which each finger causes deformation with respect to the object 70, assuming gripping by point contact by each finger.
  • the mechanical relationship in which deformation occurs with respect to force can be calculated at each point contact point.
  • The object 70 is treated as having a uniform shape, with spring constant K and damping coefficient C; by approximating the characteristics of the object as a rigid body, an elastic body, or a rheological body in the direction in which the finger force acts, the shape deformation information can be evaluated.
  • the relational expression between displacement and force is established as described in Non-Patent Document 1.
  • the expected shape deformation information can be obtained by designating the gripping point and causing the shape deformation under appropriate conditions.
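The displacement-force relation for such a spring-damper approximation can be illustrated with a minimal simulation. This is a generic parallel spring-damper (Kelvin-Voigt) sketch under assumed parameter values, not the specific model of Non-Patent Document 1.

```python
def simulate_deformation(K, C, force, dt=1e-3, t_end=1.0):
    """Displacement x(t) of a parallel spring-damper under a constant force.
    Governing relation: F = K*x + C*dx/dt  ->  dx/dt = (F - K*x) / C."""
    x, xs = 0.0, []
    for _ in range(int(t_end / dt)):
        x += dt * (force - K * x) / C  # forward-Euler integration step
        xs.append(x)
    return xs

# Assumed stiffness and damping; displacement settles toward the static value F/K.
xs = simulate_deformation(K=200.0, C=50.0, force=2.0)
```

With these values the time constant is C/K = 0.25 s, so after 1 s the displacement has essentially converged to F/K = 0.01.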
  • FIG. 4 shows a case in which the object is constrained, and FIG. 5 a case in which it is not.
  • FIG. 4 is a diagram showing the positional relationship between the fingers of the robot hand 20 and the object 70. FIG. 4(a) shows the positional relationship before the fingers of the robot hand 20 grip the object 70, and FIGS. 4(b) and 4(c) show the positional relationship after the fingers grip the object 70.
  • FIG. 5 is a diagram showing the positional relationship between the finger and the object 70 when the object 70 can move.
  • the direction in which the finger of the robot hand 20 opens and closes is defined as the X axis
  • the direction perpendicular to the direction along the finger is defined as the Y axis.
  • the black arrows pointing outward from the object 70 shown in FIGS. 4 and 5 indicate the direction in which the object 70 is to be moved by applying a force from the outside.
  • In FIG. 4, the object is geometrically constrained even if an external force is applied in the X-axis or Y-axis direction, so the gripped state is easy to maintain. This is because the geometric constraint is established from the shape information of the object 70 after deformation and the positional relationship of the gripping fingers after the deformation of the object 70.
  • the present embodiment is characterized by paying attention to this point and extracting gripping point candidates that realize stable gripping.
  • In this case, gripping stability due to the geometric constraint also acts in the X direction, and the object cannot move. Where the geometric constraint does not act, gripping stability is low and the object can move.
  • the deformation evaluation unit 33 outputs a plurality of discrete points DP1, DP2, ... As shape deformation information of the object 70.
  • the discrete points DP1, DP2, ... are set based on the contour of the shape expected in the model that causes deformation with respect to the object 70.
  • The gripping point determination unit 36 evaluates gripping stability by determining whether a geometric constraint is established from the relationship between the positions of the discrete points DP1, DP2, ... and the finger positions FP1 and FP2.
  • the gripping point determination unit 36 obtains a first approximate curve for a plurality of discrete points located in the vicinity of the finger position FP1.
  • the gripping point determination unit 36 sets a plurality of discrete points (not shown) with respect to the finger based on the contour of the finger at the position FP1, and obtains a second approximate curve for the plurality of discrete points.
  • Based on the first and second approximate curves, the gripping point determination unit 36 compares the shape of the object 70 near the finger position FP1 (unevenness information, etc.) with the shape of the finger at the position FP1 (arc, rectangle, etc.), and evaluates gripping stability by determining whether the geometric constraint is established.
  • Examples of the comparison method include the magnitude relation of the curvature between the shape of the object 70 and the shape of the finger, and the height difference between the maximum point and the minimum point of the first approximation curve.
  • the gripping point determination unit 36 similarly obtains approximate curves for a plurality of discrete points located in the vicinity of the finger position FP2 and a plurality of discrete points (not shown) relating to the finger at the position FP2, and the same method as described above. Evaluate grip stability.
  • The gripping point determination unit 36 checks the position coordinates of the discrete points DP1, DP2, ... when a virtual force Fvir is applied to the object 70, and determines whether the change from the coordinates before application is equal to or less than a predetermined value; in this way it determines whether the geometric constraint is established and evaluates gripping stability.
  • The gripping point determination unit 36 may determine that gripping stability is low when the amount of change exceeds the predetermined value at even one of the plurality of discrete points, or only when it exceeds the value at some subset of the discrete points. The virtual force Fvir is assumed to be applied to the object 70 from an arbitrary direction.
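The virtual-force check on the discrete points can be sketched as follows. This shows the strict variant described above, in which any single point exceeding the threshold fails the test; the function name, tolerance, and coordinates are illustrative assumptions.

```python
import math

def is_geometrically_constrained(points_before, points_after, tol):
    """Geometric constraint holds if no discrete point moved more than tol
    when the virtual force Fvir was applied (strict variant: one failing
    point marks the grip as unstable)."""
    return all(
        math.hypot(xa - xb, ya - yb) <= tol
        for (xb, yb), (xa, ya) in zip(points_before, points_after)
    )

# Discrete contour points before and after applying the virtual force Fvir.
before = [(0.0, 0.0), (0.01, 0.0), (0.02, 0.0)]
after = [(0.0, 0.0), (0.0102, 0.0), (0.02, 0.0003)]
stable = is_geometrically_constrained(before, after, tol=1e-3)
```

Here every point moved well under the 1 mm tolerance, so the constraint is judged to hold; a larger displacement at any single point would flip the result.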
  • the grip point closest to the position of the center of gravity can be selected.
  • When the distance between the center of gravity and the gripping point is short, the couple can be expected to remain small even when the gripping force near the gripping point fluctuates due to disturbance or the like.
  • FIG. 6 is a flowchart showing the operation of the robot control device.
  • the target shape information is input.
  • the gripping point candidate generation unit 32 generates gripping point candidates that can be gripped by the robot hand 20 based on the input target shape information.
  • the deformation evaluation unit 33 evaluates and outputs the shape deformation information for each case of the plurality of gripping point candidates.
  • the gripping point determining unit 36 determines the gripping point based on the shape deformation information.
  • As described above, the grip point generation unit 31 includes the deformation evaluation unit 33, which calculates shape deformation information when the shape of the object is deformed by the gripping operation of the hand, and the gripping point determination unit 36, which determines the gripping point of the object based on that shape deformation information. Even for an irregular object such as a flexible object, gripping at the selected gripping point therefore greatly reduces gripping failures: the object can be gripped with a high success rate, the takt time can be shortened, and production efficiency can be kept high, which is a particular effect.
  • Production efficiency refers to the speed of work such as picking, measured by takt time: if an operation taking 1 second is tried 100 times and succeeds 100 times, the average takt time is 1 second per operation; if the same work is tried 100 times and succeeds only 50 times, the average takt time is 2 seconds per operation. Thus, the fewer the failures, the higher the production efficiency.
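The takt-time arithmetic above can be written out directly; a trivial sketch, with an illustrative function name.

```python
def average_takt_time(cycle_time_s, attempts, successes):
    """Average takt time: total time spent divided by successful operations."""
    return cycle_time_s * attempts / successes

all_succeed = average_takt_time(1.0, 100, 100)  # every attempt succeeds
half_fail = average_takt_time(1.0, 100, 50)     # half the attempts fail
```

The 100-of-100 case yields 1 s per success and the 50-of-100 case 2 s per success, matching the example in the text.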
  • The present embodiment differs from the first in that the deformation evaluation unit 33 additionally evaluates whether the force exceeds an upper limit calculated from the relational expression between the force applied to the object 70 and the displacement of the object 70.
  • A plurality of gripping point candidates satisfying the geometric constraint condition are first extracted via the gripping point candidate generation unit 32. From these candidates, the deformation evaluation unit 33 evaluates whether the object 70 exceeds the allowable deformation amount under the gripping force F(t) (a value that changes with time t) expressed as a time series, and adds this to the constraint conditions; this is a feature of the present embodiment.
  • The deformation evaluation unit 33 calculates an upper limit of the gripping force applied to the object 70, based on the relational expression between the gripping force applied to the object 70 and its displacement together with the deformation amount the object 70 can tolerate. It then evaluates whether the gripping force applied to the object 70 by the robot hand 20 exceeds this upper limit. The deformation evaluation unit 33 also outputs the time-series information of the gripping force, calculated from the same relational expression, to the gripping point determination unit 36 as part of the shape deformation information.
  • Deformations include elastic bodies whose shape returns to their original shape when the gripping force is removed after deformation, rheological bodies whose shape does not completely return, and plastic bodies that deform by the amount of force applied.
  • Flexible objects have an upper limit of allowable deformation; if it is exceeded, the object 70 is damaged or its commercial value is impaired.
  • the amount of deformation is calculated by the force and the time to apply the force.
  • the relational expression between the force and the amount of deformation can be expressed mathematically, for example, by the description in Non-Patent Document 1.
  • a physical property model can be simulated by a configuration in which an elastic element and a damping element are connected in series as in the Maxwell model.
  • When a 2-finger hand is used, two grip points are given to the fingers; these are denoted PG1 and PG2. The gripping point PG2 is made to coincide with the point P2 at which the force is applied, and the vector (P1P2) is set parallel to the vector (PG1PG2). As for the displacements, x1 denotes the displacement of the joint between the spring element and the damping element, and x2 the displacement of the gripping point P2; the origins of both x1 and x2 are defined at the natural length. As the initial positional relationship, the length of the spring element k1 is X10 and the length of the damping element is X20.
  • Given the time-series data F(t) of the gripping force applied from outside, the time series of the displacements x1 and x2 can be obtained by solving the equation of motion.
  • the characteristics (having residual displacement) of the rheological object can be simulated by giving the damping coefficient C2 a non-linear characteristic.
  • the definition of the physical property model is not limited to this, and it can be applied to rigid bodies, elastic bodies, rheological objects, and plastic bodies by changing the coefficients and configurations.
  • the change in the position of the displacement x2 can be acquired, so it is possible to determine what kind of deformation occurs according to the time-series data F (t) of the appropriate gripping force.
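A minimal, massless discretization of the Maxwell model described above might look like this. The parameter values, step size, and the instantaneous-spring simplification are assumptions for illustration; the patent's actual computation follows Non-Patent Document 1.

```python
def simulate_maxwell(k1, c2, force_series, dt=1e-3):
    """Maxwell model: spring k1 in series with damper c2 (massless approximation).
    The same force passes through both elements:
        F = k1 * x1              (spring elongation x1, responds instantly)
        F = c2 * d(x2 - x1)/dt   (damper between the joint and grip point x2)
    """
    x1_hist, x2_hist = [], []
    damper_disp = 0.0
    for F in force_series:
        x1 = F / k1                 # spring elongation tracks the force
        damper_disp += dt * F / c2  # damper integrates the force over time
        x1_hist.append(x1)
        x2_hist.append(x1 + damper_disp)
    return x1_hist, x2_hist

# Apply a 0.5 s constant grip force, then release: the damper displacement
# remains, reproducing the residual deformation of a rheological object.
F_t = [2.0] * 500 + [0.0] * 500
x1, x2 = simulate_maxwell(k1=200.0, c2=100.0, force_series=F_t)
```

After the force is removed the spring term returns to zero while the accumulated damper displacement persists, which is the "residual displacement" behavior the text attributes to rheological objects; a nonlinear C2 would further shape that residual.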
  • the format of the physical property model is not particularly limited.
  • Since the deformation evaluation unit 33 can extract only gripping points that take into account the permissible deformation of the gripping object, i.e., the object 70, the ratio of selecting gripping points at which gripping fails is reduced, and a particular effect of improving production efficiency is obtained.
  • When the robot hand 20 grips the object 70, a flexible amorphous object, it is possible to select a gripping point with high gripping stability, without damaging the object 70, based on the time-series information of the deformed shape of the object 70 and of the force leading to it.
  • Even when the robot hand 20 grips the object 70 with a slightly large gripping force, the case where the deformation of the object 70 is permissible if the gripping force is removed within a predetermined time can be included. Failures are therefore reduced, the object 70 can be gripped with a high success rate, the takt time can be shortened, and production efficiency can be kept high, which is a particular effect.
  • If the determination were based only on whether a fixed upper limit is exceeded, as mentioned above, the number of gripping point candidates could become very small. In that case, the fact that the deformation amount remains within the permissible range even if the upper limit is slightly exceeded for a short time can be utilized.
  • The deformation evaluation unit 33 outputs, as part of the shape deformation information, the gripping force F(t), which is the magnitude of the force acting on the gripping point, and the time t during which a force at or above the allowable load is applied. It is then possible to evaluate whether the finally allowed deformation amount is reached based on the gripping force F(t) expressed as a time series.
  • The gripping point determination unit 36 uses the gripping point, the gripping force, and the gripping time as shape deformation information to determine whether the deformation amount is within the allowable range, based on the magnitude relationships between the force and time values and their thresholds.
  • the gripping point determining unit 36 can acquire the gripping point and the gripping force that realize the state in which the shape of the food is kept within a certain range.
  • the gripping point information includes information on the position of the gripping point and the gripping force (acting force) at the gripping point.
  • the threshold value regarding the gripping force and the time can be obtained by being converted into the gripping force and the gripping time based on the allowable range of the deformation amount of the object 70.
  • Alternatively, the deformation amount may be calculated using the gripping point, the gripping force, and the gripping time, and the upper limit set based on that deformation amount; the method is not limited to this.
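One way to check a gripping-force time series against an allowable deformation, as described above, is to integrate a deformation model step by step and compare against the limit. This sketch assumes a parallel spring-damper relation and illustrative parameter values; the thresholds and names are not from the patent.

```python
def within_allowable_deformation(force_series, dt, K, C, x_allow):
    """Check a gripping-force time series F(t) against the allowable
    deformation x_allow, integrating dx/dt = (F - K*x) / C."""
    x = 0.0
    for F in force_series:
        x += dt * (F - K * x) / C  # forward-Euler deformation update
        if x > x_allow:
            return False           # deformation limit exceeded
    return True

# A brief force overshoot can still pass if it is removed quickly enough,
# matching the "removed within a predetermined time" case in the text.
short_push = [4.0] * 50 + [1.0] * 950
ok = within_allowable_deformation(short_push, dt=1e-3, K=200.0, C=50.0, x_allow=0.015)
```

The 0.05 s overshoot never drives the deformation past the limit, whereas holding the same large force for the full second would.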
  • the upper limit of the deformation amount of the object 70 is provided in advance by the user of this system for each food.
  • Since the robot control device 30 can extract only gripping points that take into account the permissible deformation of the gripping object, i.e., the object 70, the ratio of selecting gripping points at which gripping fails is reduced: the object can be gripped with a high success rate, the takt time can be shortened, and production efficiency can be kept high, which is a particular effect.
  • The third embodiment further includes a gripping stability calculation unit that evaluates gripping stability as mechanical stability against a predetermined external force, with respect to the balance of forces after deformation near the gripping point.
  • FIG. 8 is a block diagram showing a configuration of a gripping point generation unit according to the third embodiment.
  • the grip point generation unit 31a is composed of a grip stability calculation unit 34 and a result DB (result database) 35.
  • the gripping stability calculation unit 34 evaluates the mechanical stability against a predetermined external force with respect to the balance of the deformed force of the object 70 in the vicinity of the gripping point of the object 70. Further, the gripping stability calculation unit 34 evaluates the balance of the deformed force of the object 70 in the vicinity of the gripping point of the object 70, and the gripping force of the robot hand 20 with respect to the object 70 is minimized. Extract the gripping point.
  • Shape deformation information is input to the grip stability calculation unit 34. First, the grip stability calculation unit 34 calculates, from the force vectors generated in the grip target after deformation, the forces at each point between the fingers of the robot hand 20 and the grip target. Next, it evaluates whether the object 70 remains stationary based on the balance of forces at the gripping points of the object 70. At this time, if the deformed object 70 and the fingers of the robot hand 20 are geometrically constrained (immovable), the object 70 and the fingers remain pressed together even when a force other than the gripping force of the fingers acts, and this is regarded as a stable state.
  • the grip stability calculation unit 34 determines whether or not the "stable state" can be maintained.
  • the stable state (stability) will be described.
  • the predetermined external force is defined as Fdis
  • the grip stability calculation unit 34 determines whether or not the "stable state" can be maintained even when the external force Fdis is set to a value other than 0.
  • The deformation when the gripping force F(t) and the external force Fdis are applied is added to the shape deformation information. The deformation is obtained from the relationship between displacement and force using the physical property model described above, and the "stable state" is determined based on this shape deformation information.
  • the grip stability calculation unit 34 can also determine the "stable state" under the condition that the robot 10 accelerates or decelerates.
  • an inertial force is generated on the object.
  • the external force Fdis (t) due to the inertial force can also be expressed as in Equation 1 by the mass m of the object 70 and the acceleration ⁇ _obj (t) of the object 70.
  • the acceleration ⁇ _obj (t) of the object 70 is a function of time t, but is basically obtained based on the command value regarding the finger of the robot 10.
  • Fdis(t) = m ⋅ α_obj(t) (Equation 1)
  • To account for the phenomenon in which the object 70 slides off the fingers of the robot hand 20 under the inertial force Finr, an upper limit Flim of the constraining force, beyond which the geometric constraint is lost, is set according to the physical properties of the object 70 (elastic modulus K and damping coefficient C). If this upper limit is exceeded, the geometric constraint is lost and the state is no longer stable.
  • When the "stable state" can be maintained, the gripping stability calculation unit 34 sets the gripping stability evaluation value high and outputs the stability evaluation result to the result DB 35; otherwise, it sets the evaluation value low and likewise outputs the result to the result DB 35.
  • the upper limit of the binding force from which the stable state disappears can also be defined by the friction coefficient ⁇ between the object 70 and the robot hand 20.
  • the binding force upper limit Flim can also be defined as in Equation 2.
  • Flim ⁇ ⁇ Fi (Equation 2)
  • the gripping stability Si can be defined as in the equation 3.
  • Si = Flim − max(Fdis(t)) (Equation 3)
  • the grip stability calculation unit 34 outputs the grip point candidates and the stability evaluation results calculated via the result DB 35 to the grip point determination unit 36.
  • the gripping point determination unit 36 can select the gripping point having the highest stability evaluation result based on the plurality of stored gripping point candidates and the stability evaluation result.
  • FIG. 9 is a flowchart showing the operation of the robot control device. Since steps S101 to S103 of FIG. 9 are the same as those of FIG. 6, the description thereof will be omitted.
  • In step S201, the grip stability calculation unit 34 determines whether the "stable state" can be maintained. If it can, the process proceeds to step S202, and the grip stability calculation unit 34 sets the gripping stability evaluation value high. If it cannot, the process proceeds to step S203, and the evaluation value is set low. Then, in step S204, the gripping point determination unit 36 selects, from the plurality of stored gripping point candidates and stability evaluation results, the gripping point with the highest stability evaluation, and determines it as the gripping point.
  • The deformation evaluation unit 33 performs simulations in which the gripping force Fi(t) at each gripping point is varied.
  • When a smaller gripping force Fi(t) is used, the binding force upper limit Flim becomes smaller according to Equation 2, and as a result the gripping stability Si tends to decrease.
  • The gripping stability Si can also be defined as in Equation 4, which adds a "minimum deformation" index to the "stable state" index of Equation 3.
  • Si = w1 × (Flim − max(Fdis(t))) + w2 / max(Fi(t)) (Equation 4)
  • w1 and w2 are appropriate weighting factors.
  • The weighting factors are designed by the user according to whether maintaining the stable state or gripping with the minimum gripping force is given priority.
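Equation 4 can be written directly as a function. The weights and force samples below are illustrative assumptions; note how the `w2` term rewards gripping with a smaller peak gripping force.

```python
# Sketch of Equation 4; w1, w2 and the force samples are illustrative only.

def grip_stability_weighted(flim, fdis_series, fi_series, w1, w2):
    """Equation 4: Si = w1*(Flim - max(Fdis(t))) + w2 / max(Fi(t))."""
    return w1 * (flim - max(fdis_series)) + w2 / max(fi_series)

# Larger w2 favors "minimum deformation": a smaller peak gripping force
# max(Fi(t)) raises the second term.
print(grip_stability_weighted(5.0, [1.0, 2.0], [4.0, 5.0], w1=1.0, w2=10.0))  # 5.0
```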
  • Because the gripping stability calculation unit 34 evaluates gripping stability based on Si, when the robot hand 20 grips a flexible amorphous object it becomes possible to select a gripping point with high gripping stability, based on the deformed shape of the object 70, without damaging the object 70. As a result, gripping failures are reduced, the object can be gripped with a high success rate, the tact time is shortened, and production efficiency improves.
  • the gripping stability calculation unit 34 evaluates the resistance to geometrical deviation based on the shape of the finger of the robot hand 20 and the shape deformation information, and outputs the result of the gripping stability evaluation.
  • In the earlier embodiments the gripping points are represented by points, but in the present embodiment the gripping points are given a geometric shape. In this case, a plurality of contact points can arise even for a single finger.
  • The grip stability calculation unit 34 evaluates how difficult it is for the object 70 to shift geometrically with respect to the robot hand 20, based on the shape of the finger of the robot hand 20 and the shape deformation information.
  • FIG. 10 is a diagram showing the positional relationship between the finger of the robot hand 20 and the object 70.
  • Equation 2, which defines the binding force upper limit Flim, is extended to account for geometrical restraint as Equation 5.
  • Flim = μ × A × Fi (Equation 5)
  • A is an effective contact area between the finger of the robot hand 20 and the object 70.
  • the effective contact area indicates the contact area when the finger makes surface contact with an object instead of point contact.
  • The coefficient of friction under surface contact is larger than under point contact.
  • The grip stability calculation unit 34 therefore defines the friction coefficient based on the effective contact area and calculates the grip stability from that friction coefficient, which is a feature of the present embodiment.
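The contrast between Equation 2 (point contact) and Equation 5 (surface contact through the effective contact area A) can be sketched as follows; the numeric values are illustrative only.

```python
# Sketch contrasting Equation 2 (point contact) with Equation 5 (surface
# contact via the effective contact area A). All numbers are illustrative.

def binding_force_limit_point(mu, fi):
    """Equation 2: Flim = mu * Fi (point contact)."""
    return mu * fi

def binding_force_limit_surface(mu, area, fi):
    """Equation 5: Flim = mu * A * Fi; a larger effective area raises Flim."""
    return mu * area * fi

print(binding_force_limit_point(0.5, 10.0))         # 5.0
print(binding_force_limit_surface(0.5, 2.0, 10.0))  # 10.0
```

With the same μ and Fi, the surface-contact limit scales with A, which is how the embodiment models the higher effective friction of surface contact.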
  • When the robot hand 20 grips a flexible amorphous object, the accuracy of the grip stability calculated from the deformed shape is improved, and it becomes possible to select a gripping point with high gripping stability without damaging the object. As a result, failures are reduced, the object can be gripped with a high success rate, the tact time is shortened, and production efficiency improves.
  • the shape deformation information after the lapse of time is called the second shape deformation information.
  • The difference amount between the original shape of the object 70 and the shape in the second shape deformation information is calculated and compared with a predetermined deformation tolerance: the gripping stability is evaluated low when the difference exceeds the tolerance, and high when it does not.
  • FIG. 11 is a diagram showing the positional relationship between the finger and the object 70.
  • The gripping stability calculation unit 34 obtains the difference between the curvature of the original shape of the object 70 and the curvature of its shape after unloading, and compares that difference with the predetermined deformation tolerance.
  • The difference amount of the curvature can be obtained as follows. The first shape deformation information and the second shape deformation information are superimposed using a non-deformed point (a point far from the gripping point) as a reference.
  • Between the two reference points there are two curves: a curve of length L1 from discrete point DP3 through discrete point DP5 to discrete point DP4, and a curve of length L2 from discrete point DP3 through discrete point DP1 to discrete point DP4.
  • Corresponding points are defined at fixed fractions of the respective lengths L1 and L2. For example, the points at 0.25 × L1 and 0.25 × L2 on the respective curves are treated as corresponding points, and the distance between them is obtained. These distances are computed for each fraction, and the maximum value is defined as the "difference amount of the curvature". In the case of FIG. 11, the difference amount of the curvature is the distance DC1 between discrete point DP1 and discrete point DP5.
  • The deformation evaluation unit 33 evaluates whether the difference amount is larger or smaller than the "deformation tolerance" predetermined by the user, and outputs the evaluation result as part of the shape deformation information.
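The corresponding-point procedure can be sketched as follows. Representing each curve as a polyline of (x, y) points and sampling at the fractions 0.25, 0.5, 0.75 are assumptions made for illustration.

```python
# Hedged sketch of the "difference amount of the curvature": sample both
# curves at equal fractions of their arc lengths and take the maximum
# distance between corresponding points. Polylines and fractions are assumed.
import math

def arc_lengths(points):
    """Cumulative arc length along a polyline of (x, y) points."""
    acc = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        acc.append(acc[-1] + math.hypot(x1 - x0, y1 - y0))
    return acc

def point_at_fraction(points, frac):
    """Point at fraction frac of the total arc length (linear interpolation)."""
    acc = arc_lengths(points)
    target = frac * acc[-1]
    for i in range(1, len(acc)):
        if acc[i] >= target:
            t = (target - acc[i - 1]) / ((acc[i] - acc[i - 1]) or 1.0)
            (x0, y0), (x1, y1) = points[i - 1], points[i]
            return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
    return points[-1]

def curvature_difference(curve1, curve2, fractions=(0.25, 0.5, 0.75)):
    """Max distance between corresponding points at equal arc-length fractions."""
    dists = []
    for f in fractions:
        xa, ya = point_at_fraction(curve1, f)
        xb, yb = point_at_fraction(curve2, f)
        dists.append(math.hypot(xa - xb, ya - yb))
    return max(dists)
```

For a flat curve versus a curve bulging to height 1 over the same endpoints, the maximum corresponding-point distance is found at the midpoint.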
  • By evaluating whether the shape after unloading, that is, the final shape after the work, is acceptable, a gripping force or gripping point that would be treated as a work failure can be excluded from the extraction targets. As a result, failures are reduced, the object can be gripped with a high success rate, the tact time is shortened, and production efficiency improves.
  • FIG. 12 is a block diagram showing the configuration of the gripping point generation unit 31b according to the sixth embodiment.
  • the grip point generation unit 31b is composed of a grip stability calculation unit 34 and a result DB (result database) 35.
  • the robot control device 30 includes a gripping point candidate learning unit 37 and a learning DB (learning database) 38.
  • the grip point candidate learning unit 37 has a neural network 40.
  • The gripping point candidate learning unit 37 takes as inputs the gripping point candidates output from the gripping stability calculation unit 34, the result data (the stability evaluation results), and the result labels obtained in actual work, and learns the relationship that outputs gripping point candidates from the shape deformation information. As shown in FIG. 12, one example is to learn a network that takes the target shape information (before deformation) as input and outputs a gripping point, a gripping force, and a gripping stability.
  • For multiple trials, the gripping point, gripping force, physical properties of the object, deformed shape of the object (shape information before and after deformation), grip stability, and a success/failure label for each trial are prepared, and the neural network is trained on this data.
  • The grip point generation unit 31b has a physical property model definition unit (not shown) that models the relationship between the force acting on the object 70 and the displacement of the object 70 with a model using a spring constant and a damping coefficient.
  • The physical property model definition unit applies a time-varying force to the object 70 and estimates the parameters of the physical property model (spring constant K and damping coefficient C) from the time-series displacement caused by the deformation of the object 70 under the applied force. The predetermined spring constant K and damping coefficient C can then be updated based on deformation results obtained from actual machine operation.
  • Alternatively, the relationship between force and displacement can be obtained by learning only from the actually measured time-series deformation information and gripping force, without assuming a spring constant K and a damping coefficient C.
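One way to estimate the spring constant K and damping coefficient C from force/displacement time series is ordinary least squares on the model F = K·x + C·v. This is a sketch under that assumption; the data below is synthetic and the velocity series is assumed to be available (e.g., by differencing the displacement).

```python
# Hedged sketch: estimate the physical property model F = K*x + C*v (spring
# constant K, damping coefficient C) by least squares. Data is synthetic.

def estimate_k_c(forces, disps, vels):
    """Solve min ||F - (K*x + C*v)||^2 for (K, C) via the 2x2 normal equations."""
    sxx = sum(x * x for x in disps)
    svv = sum(v * v for v in vels)
    sxv = sum(x * v for x, v in zip(disps, vels))
    sfx = sum(f * x for f, x in zip(forces, disps))
    sfv = sum(f * v for f, v in zip(forces, vels))
    det = sxx * svv - sxv * sxv
    return (sfx * svv - sfv * sxv) / det, (sfv * sxx - sfx * sxv) / det

# Synthetic series generated with K = 2.0, C = 0.5 should be recovered.
xs = [0.0, 0.1, 0.2, 0.3]
vs = [1.0, 0.9, 0.7, 0.4]
fs = [2.0 * x + 0.5 * v for x, v in zip(xs, vs)]
print(estimate_k_c(fs, xs, vs))  # approximately (2.0, 0.5)
```

On noisy real-machine data the same normal equations give the best-fit K and C, which is one concrete way the predetermined parameters could be updated.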
  • the gripping point candidate learning unit 37 has a physical characteristic model learning unit (not shown) that learns by modeling the relationship between the force acting on the object 70 and the displacement of the object 70 by a neural network.
  • the physical characteristic model learning unit applies a force that changes with time to the object 70, and learns a neural network 40 that is set based on time-series information of displacement based on the deformation of the object with respect to the applied force.
  • the gripping point candidate learning unit 37 performs learning processing based on the gripping point candidate stored in the result DB 35 and the stability evaluation result. For example, learning of the neural network 40 is exemplified.
  • The neural network 40 has a learning unit and an inference unit (not shown). The neural network 41, which reflects the learning parameters obtained in the learning unit, is incorporated into the inference unit. The inference unit can then take the target shape information as input and output the gripping point and the gripping force.
  • Examples of the learning parameters are the coefficients that define the network structure of the neural network.
  • FIG. 13 is a block diagram showing a configuration of another grip point generation unit 31c according to the sixth embodiment.
  • When the neural network 41 acquired in the above process is applied as the grip point candidate generation unit 32a and the target shape information is input, the grip point candidate generation unit 32a generates a plurality of grip point candidates and their grip stabilities and outputs them to the gripping point determination unit 36.
  • the gripping point determination unit 36 selects and outputs one gripping point candidate using the gripping stability.
  • A gripping point generation algorithm that corrects the modeling error observed in actual work can thus be acquired by learning. As a result, the calculation cost of computing gripping point candidates is reduced and the time needed to calculate the gripping point is shortened, which yields the special effect of increased production efficiency.
  • The gripping point, gripping force, physical properties of the object 70, deformed shape of the object 70 (shape information before and after deformation), and grip stability are prepared for trials with multiple success labels.
  • When the neural network 41 acquired in the above process is applied as the grip point candidate generation unit 32a and the target shape information is input, a plurality of grip point candidates and grip stabilities are generated and output to the gripping point determination unit 36.
  • the gripping point determination unit 36 selects and outputs one gripping point candidate using the gripping stability.
  • A gripping point generation algorithm that automatically outputs gripping points when a target shape is input can thus be acquired by learning. As a result, the calculation cost of computing gripping point candidates is reduced and the time needed to calculate the gripping point is shortened, so the tact time is shortened and production efficiency improves, which is a special effect.
  • Embodiment 7. The present embodiment differs from the third embodiment in that the gripping point candidate generation unit first defines a first gripping force, evaluates the gripping point candidates under the condition of gripping with the first gripping force, extracts effective gripping points, and then searches for gripping points efficiently by gripping the object with a second gripping force smaller than the first gripping force.
  • FIG. 14 is a block diagram showing the configuration of the gripping point generation unit 31d according to the seventh embodiment.
  • the gripping point candidate is input to the gripping point candidate generation unit 32 from the result DB 35.
  • The gripping point candidates obtained by the gripping stability calculation unit 34, together with their stability evaluation results, are input from the result DB 35 to the gripping point candidate generation unit 32 again.
  • The gripping point candidate generation unit 32 extracts a finite number of candidates with high stability evaluations and designates, for those extracted candidates, a second gripping force (smaller than the first gripping force).
  • The gripping point generation unit 31d has a result DB 35 that stores a plurality of gripping point candidates and the first gripping force to be applied by the robot hand 20 to the object 70, and a gripping point candidate generation unit 32 that outputs the first gripping point candidates, designated to be gripped with the first gripping force, to the deformation evaluation unit.
  • the gripping stability calculation unit 34 calculates the stability evaluation result for the first gripping point candidate, and outputs the first gripping point candidate and the stability evaluation result to the result DB 35.
  • The gripping point candidate generation unit 32 extracts a plurality of gripping point candidates from the first gripping point candidates stored in the result DB 35 based on the stability evaluation results, defines the second gripping force for those candidates, and outputs them to the deformation evaluation unit 33 again.
  • The gripping point candidate generation unit 32 can repeat the same process three or more times, for example with a third gripping force, a fourth gripping force, ..., a k-th gripping force. By reducing the gripping force under search, the gripping point at which effective deformation of the object 70 is obtained with the minimum gripping force can be extracted. This makes it possible to efficiently search for points that can be gripped stably with the smallest gripping force; since candidate points unlikely to fail can be extracted in a short time, the work time per robot operation is shortened, the tact time is reduced, and the special effect of increased production efficiency is obtained.
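The iterative search above can be sketched as follows. The `evaluate` callback, its threshold model, and the shrink factor are assumptions for illustration; the embodiment's actual stability evaluation comes from the deformation simulation.

```python
# Hedged sketch of the embodiment-7 search: evaluate candidates at the first
# gripping force, keep only the most stable, then retry with a smaller
# (second, third, ..., k-th) gripping force. evaluate() is an assumption.

def search_min_force(candidates, evaluate, f_first, shrink=0.8, keep=3, iters=5):
    """evaluate(candidate, force) -> stability; > 0 means gripping is stable.
    Returns the smallest tried force that still had stable candidates,
    together with the surviving highest-scoring candidates."""
    force, best = f_first, list(candidates)
    result = (f_first, best)
    for _ in range(iters):
        scored = sorted(((evaluate(c, force), c) for c in best), reverse=True)
        stable = [c for s, c in scored if s > 0]
        if not stable:
            break  # the previous force was the smallest workable one
        best = stable[:keep]
        result = (force, best)
        force *= shrink  # reduce the gripping force under search
    return result

# Illustrative model: each candidate needs at least its threshold force.
thresholds = {"a": 1.0, "b": 2.0, "c": 10.0}
force, best = search_min_force(thresholds, lambda c, f: f - thresholds[c], 8.0)
print(force, best)  # force shrinks while "a" and "b" remain stable
```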
  • The present embodiment differs from the first embodiment in that the gripping stability calculation unit 34 obtains a combination of gripping point candidates for stably gripping the object from information on the contour of the object.
  • The grip stability calculation unit 34 acquires information on the contour of the object 70 from the point cloud coordinates of the object's contour, and selects combinations of grip point candidates on the contour of the object 70. The grip stability calculation unit 34 then obtains, for each combination, an evaluation value for the robot hand 20 gripping the object 70 with a predetermined gripping force, and finds the combination of grip point candidates that grips the object 70 stably based on the evaluation values.
  • the grip stability calculation unit 34 derives a combination of stable grip points based on the evaluation of the magnitude of the minimum grip force required to grip the object 70.
  • The minimum gripping force required to grip the object 70 is the minimum fingertip force required to resist the gravity acting on the object 70. From the viewpoint of avoiding damage to the object 70, this value should be small.
  • the evaluation is performed using the value of the fingertip force obtained from the gripping force and the frictional force, and the search for the combination of stable gripping points is performed by the following procedure.
  • The points arranged in the two-dimensional plane are smoothly connected by spline interpolation, and the information on the contour of the object 70 is acquired.
  • Grip point candidates are taken on the contour of the object 70, and all combinations of the two points are stored.
  • the gripping force is set to a certain value, evaluation values are obtained for all combinations of gripping point candidates, and from the results, a combination of stable gripping points at that gripping force is obtained. Then, the gripping force is changed, evaluation values are obtained for all combinations of gripping point candidates, and from the results, a combination of stable gripping points at the gripping force is obtained. This operation is repeated to obtain a combination of stable grip points with the optimum grip force.
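The inner step of this procedure, scoring every two-point combination on the contour at a fixed gripping force, can be sketched as follows. The scoring function is an assumption for illustration (it simply prefers longer chords, i.e., roughly antipodal grasps); the embodiment's real evaluation uses the fingertip force obtained from the gripping force and friction.

```python
# Hedged sketch of the pairwise search over contour grip-point candidates.
# The chord-length score stands in for the embodiment's force-based evaluation.
import itertools
import math

def pair_search(contour_points, grip_force, score):
    """Evaluate every two-point combination; return the best pair and score."""
    best_score, best_pair = max(
        ((score(a, b, grip_force), (a, b))
         for a, b in itertools.combinations(contour_points, 2)),
        key=lambda t: t[0],
    )
    return best_pair, best_score

chord = lambda a, b, force: math.dist(a, b)  # illustrative score only

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
pair, s = pair_search(square, 1.0, chord)
print(pair, s)  # a diagonal pair, score sqrt(2)
```

Repeating this for each gripping force value, as the embodiment describes, then reduces to calling `pair_search` in a loop over forces and keeping the best overall result.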
  • Because the gripping stability calculation unit 34 obtains a combination of gripping point candidates for stably gripping the object 70, gripping failures caused by the selected gripping point are greatly reduced even for an amorphous object such as a flexible object; the object can be gripped with a high success rate, the tact time can be shortened, and production efficiency can be improved.
  • the gripping stability calculation unit 34 obtains an evaluation value based not only on the shape deformation information of the object 70 after gripping but also on the shape information of the object 70 before gripping.
  • FIG. 15 is a schematic view of the object 70 before gripping according to the ninth embodiment.
  • the deformation evaluation unit 33 outputs a plurality of discrete points DPB1, DPB2, ... As shape information of the object 70 before gripping.
  • the plurality of discrete points DPB1, DPB2, ... Are set based on the contour of the object 70.
  • the grip stability calculation unit 34 quantitatively evaluates the degree of depression of the object from the positional relationship of the plurality of discrete points DPB1, DPB2, ..., And outputs it as an evaluation value.
  • the gripping stability calculation unit 34 may output the discrete points DPB1 and DPB2 as gripping point candidates having high gripping stability.
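One simple way to quantify the "degree of depression" from the discrete points is to measure how far each interior point drops below the chord joining its two neighbours. This concavity measure is an assumption; the embodiment only states that the depression is evaluated from the positional relationship of the discrete points DPB1, DPB2, ....

```python
# Hedged sketch: degree of depression of the pre-grip contour, measured as
# the largest perpendicular drop of an interior point below its neighbour
# chord. The measure itself is an illustrative assumption.
import math

def depression_depth(points):
    """Max drop of an interior point below its neighbour chord (0 if none)."""
    worst = 0.0
    for (x0, y0), (x1, y1), (x2, y2) in zip(points, points[1:], points[2:]):
        cross = (x2 - x0) * (y1 - y0) - (y2 - y0) * (x1 - x0)  # signed area
        chord = math.hypot(x2 - x0, y2 - y0)
        depth = -cross / chord if chord else 0.0  # negative cross = depression
        worst = max(worst, depth)
    return worst

print(depression_depth([(0, 0), (1, -0.5), (2, 0)]))  # 0.5
```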
  • By obtaining the evaluation value based on the shape information of the object 70 before gripping, the gripping stability calculation unit 34 can select the gripping point accurately. As a result, the object can be gripped with a high success rate, and the special effect that the tact time can be shortened and production efficiency increased is obtained.
  • the processing circuit comprises at least one processor and at least one memory.
  • FIG. 16 is a diagram showing a hardware configuration of the robot control device according to the first to ninth embodiments.
  • the robot control device 30 can be realized by the control circuit shown in FIG. 16A, that is, the processor 81 and the memory 82.
  • An example of the processor 81 is a CPU (Central Processing Unit; also called a processing unit, arithmetic unit, microprocessor, processor, or DSP (Digital Signal Processor)) or a system LSI (Large Scale Integration).
  • The memory 82 is, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (registered trademark), or a magnetic disk, a flexible disk, an optical disk, a compact disc, a mini disc, a DVD (Digital Versatile Disc), or the like.
  • the robot control device 30 is realized by the processor 81 reading and executing a program stored in the memory 82 for executing the operation of the robot control device 30. It can also be said that this program causes a computer to execute the procedure or method of the robot control device 30.
  • The program executed by the robot control device 30 is loaded into the main storage device, and the grip point generation unit 31 and the command value generation unit 39 are generated on the main storage device.
  • the memory 82 stores obstacle information, target shape information, shape deformation information, and the like.
  • the memory 82 is also used as a temporary memory when the processor 81 executes various processes.
  • the robot control device 30 may be realized by dedicated hardware. Further, the functions of the robot control device 30 may be partially realized by dedicated hardware and partially realized by software or firmware.
  • the robot control device 30 may be realized by the dedicated processing circuit 83 shown in FIG. 16 (b). At least a part of the grip point generation unit 31 and the command value generation unit 39 may be realized by the processing circuit 83.
  • the processing circuit 83 is dedicated hardware.
  • The processing circuit 83 is, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.
  • a part of the function of the robot control device 30 may be realized by software or firmware, and the other part may be realized by dedicated hardware.


Abstract

The present invention relates to a robot control device for controlling a robot (10) and a robot hand (20) that grips an object (70), the robot control device comprising a gripping point generation unit (31) that generates a gripping point of the object (70) to be gripped by the robot hand (20). The gripping point generation unit (31) comprises a deformation evaluation unit (33) that calculates shape deformation information when the shape of the object (70) is deformed by a gripping operation of the robot hand (20), and a gripping point determination unit (36) that determines a gripping point of the object (70) on the basis of the shape deformation information.
PCT/JP2021/036652 2020-10-19 2021-10-04 Dispositif de commande de robot et procédé de commande de robot WO2022085408A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202180060765.2A CN116194255A (zh) 2020-10-19 2021-10-04 机器人控制装置及机器人控制方法
JP2022557371A JP7337285B2 (ja) 2020-10-19 2021-10-04 ロボット制御装置およびロボット制御方法
DE112021005493.7T DE112021005493T5 (de) 2020-10-19 2021-10-04 Robotersteuervorrichtung und robotersteuerverfahren

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-175203 2020-10-19
JP2020175203 2020-10-19

Publications (1)

Publication Number Publication Date
WO2022085408A1 true WO2022085408A1 (fr) 2022-04-28

Family

ID=81289731

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/036652 WO2022085408A1 (fr) 2020-10-19 2021-10-04 Dispositif de commande de robot et procédé de commande de robot

Country Status (4)

Country Link
JP (1) JP7337285B2 (fr)
CN (1) CN116194255A (fr)
DE (1) DE112021005493T5 (fr)
WO (1) WO2022085408A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102023110111B3 (de) 2023-04-20 2024-06-06 J.Schmalz Gmbh Verfahren zum Ansteuern einer Handhabungsanlage sowie Handhabungsanlage
DE102023110107B3 (de) 2023-04-20 2024-05-23 J.Schmalz Gmbh Verfahren zum Handhaben von Gegenständen sowie Handhabungsanlage

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016196077A (ja) * 2015-04-06 2016-11-24 キヤノン株式会社 情報処理装置、情報処理方法、およびプログラム
WO2018092254A1 (fr) * 2016-11-17 2018-05-24 株式会社安川電機 Système de réglage de force de préhension, procédé de réglage de force de préhension et système d'estimation de force de préhension
JP2019107725A (ja) * 2017-12-18 2019-07-04 国立大学法人信州大学 把持装置、学習装置、学習済みモデル、把持システム、判定方法、及び学習方法
JP2019188587A (ja) * 2018-04-24 2019-10-31 ファナック株式会社 ロボット制御装置およびシステム

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4211701B2 (ja) * 2004-07-21 2009-01-21 トヨタ自動車株式会社 ロボットハンドの把持制御装置
JP2008049459A (ja) 2006-08-28 2008-03-06 Toshiba Corp マニピュレータ制御システム、マニピュレータ制御方法およびプログラム
JP6807949B2 (ja) 2016-11-16 2021-01-06 三菱電機株式会社 干渉回避装置

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016196077A (ja) * 2015-04-06 2016-11-24 キヤノン株式会社 情報処理装置、情報処理方法、およびプログラム
WO2018092254A1 (fr) * 2016-11-17 2018-05-24 株式会社安川電機 Système de réglage de force de préhension, procédé de réglage de force de préhension et système d'estimation de force de préhension
JP2019107725A (ja) * 2017-12-18 2019-07-04 国立大学法人信州大学 把持装置、学習装置、学習済みモデル、把持システム、判定方法、及び学習方法
JP2019188587A (ja) * 2018-04-24 2019-10-31 ファナック株式会社 ロボット制御装置およびシステム

Also Published As

Publication number Publication date
CN116194255A (zh) 2023-05-30
JP7337285B2 (ja) 2023-09-01
JPWO2022085408A1 (fr) 2022-04-28
DE112021005493T5 (de) 2023-08-31

Similar Documents

Publication Publication Date Title
WO2022085408A1 (fr) Dispositif de commande de robot et procédé de commande de robot
Balasubramanian et al. Human-guided grasp measures improve grasp robustness on physical robot
Harada et al. Fast grasp planning for hand/arm systems based on convex model
Huang et al. Learning a real time grasping strategy
Zhou et al. 6dof grasp planning by optimizing a deep learning scoring function
De Farias et al. Simultaneous tactile exploration and grasp refinement for unknown objects
EP3812972A1 (fr) Procédé de commande d'un robot et dispositif de commande de robot
Sintov et al. Dynamic regrasping by in-hand orienting of grasped objects using non-dexterous robotic grippers
Rocchi et al. Stable simulation of underactuated compliant hands
Kawaharazuka et al. Object recognition, dynamic contact simulation, detection, and control of the flexible musculoskeletal hand using a recurrent neural network with parametric bias
Kumar et al. Contextual reinforcement learning of visuo-tactile multi-fingered grasping policies
Li et al. Manipulation skill acquisition for robotic assembly based on multi-modal information description
Hasan et al. Modelling and control of the barrett hand for grasping
Bo et al. Automated design of embedded constraints for soft hands enabling new grasp strategies
Lin et al. Grasp mapping using locality preserving projections and knn regression
Ciocarlie et al. On-line interactive dexterous grasping
Ruiz Garate et al. A bio-inspired grasp stiffness control for robotic hands
Platt Learning grasp strategies composed of contact relative motions
Perico et al. Learning robust manipulation tasks involving contact using trajectory parameterized probabilistic principal component analysis
Li et al. Dual loop compliant control based on human prediction for physical human-robot interaction
Añazco et al. Human-like object grasping and relocation for an anthropomorphic robotic hand with natural hand pose priors in deep reinforcement learning
JP5829103B2 (ja) ロボットハンド
Tabata et al. Casting manipulation of unknown string by robot arm
Tavassolian et al. Forward kinematics analysis of a 3-PRR planer parallel robot using a combined method based on the neural network
Vatsal et al. Augmenting vision-based grasp plans for soft robotic grippers using reinforcement learning

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21882549

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022557371

Country of ref document: JP

Kind code of ref document: A

122 Ep: pct application non-entry in european phase

Ref document number: 21882549

Country of ref document: EP

Kind code of ref document: A1