WO2022085408A1 - Robot control device and robot control method - Google Patents

Robot control device and robot control method Download PDF

Info

Publication number
WO2022085408A1
WO2022085408A1 (PCT/JP2021/036652)
Authority
WO
WIPO (PCT)
Prior art keywords
gripping
deformation
point
force
control device
Prior art date
Application number
PCT/JP2021/036652
Other languages
French (fr)
Japanese (ja)
Inventor
浩司 白土
宏規 土橋
Original Assignee
Mitsubishi Electric Corporation (三菱電機株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to DE112021005493.7T (DE112021005493T5)
Priority to CN202180060765.2A (CN116194255A)
Priority to JP2022557371A (JP7337285B2)
Publication of WO2022085408A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1612: Programme controls characterised by the hand, wrist, grip control
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00: Program-control systems
    • G05B2219/30: NC systems
    • G05B2219/39: Robotics, robotics to robotics hand
    • G05B2219/39469: Grip flexible, deformable plate, object and manipulate it
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00: Program-control systems
    • G05B2219/30: NC systems
    • G05B2219/39: Robotics, robotics to robotics hand
    • G05B2219/39507: Control of slip motion

Definitions

  • the present disclosure relates to a robot control device and a robot control method that give an operation command to a robot and an end effector attached to the fingertip of the robot in order to grip the object so as not to drop it.
  • the robot control device controls the robot and the robot hand of the robot in order to grip the object, and includes a grip point generation unit that generates a grip point of the object to be gripped by the robot hand.
  • the grip point generation unit includes a deformation evaluation unit that calculates shape deformation information when the shape of the object is deformed by the gripping motion of the robot, and a grip point determination unit that determines the grip point of the object based on the shape deformation information.
  • FIGS. 8 to 16 show, respectively: a block diagram of the configuration of the gripping point generation unit according to Embodiment 3; a flowchart of the operation of the robot control device according to Embodiment 3; a diagram of the positional relationship between a finger and an object according to Embodiment 4; a diagram of the positional relationship between a finger and an object according to Embodiment 5; block diagrams of the configurations of a gripping point generation unit and of another gripping point generation unit according to Embodiment 6; a block diagram of the configuration of the gripping point generation unit according to Embodiment 7; a schematic view of the object before gripping according to Embodiment 9; and a diagram of the hardware configuration of the robot control device according to Embodiments 1 to 9.
  • FIG. 1 is an overall view of a robot system according to the first embodiment for carrying out the present disclosure.
  • the robot system is based on the configuration of a robot and a robot control device that controls and operates the robot.
  • the robot 10 may perform a task called material handling, such as gripping an object.
  • a measuring device 60 that measures information such as the shape of the object 70 in order to acquire the position information and shape information of the object 70, and a robot hand 20 (end effector) for gripping the object 70 are additionally provided.
  • the information of the object 70 measured by the measuring device 60 is processed by the measuring device controller 50 and input to the robot control device 30.
  • the robot control device 30 controls at least one of the joints of the arm of the robot 10 and the fingers of the robot hand 20 so that the fingers of the robot hand 20 move to an appropriate position.
  • the position information and the shape information of the object 70 are examples of such information.
  • the robot control device 30 outputs the calculated position command value, the open position command value of the finger of the robot hand 20, and the closed position command value to the robot 10.
  • the robot control device 30 determines the timing at which the position command value for the robot hand 20 is executed with respect to the position command value of the robot 10, and transmits the position command value to the robot 10 as the position command value at each time t.
  • the position command value of the robot 10 is expressed in six degrees of freedom: three translational and three rotational.
  • the position command value of the fingers of the robot hand 20 depends on the type of hand; for a link structure it is defined by the fingertip position or the opening width. It may also refer to the position command value of each drive unit, but here it broadly refers to a position command value that can be specified regardless of the structure of the hand.
  • the finger of the robot hand 20 can control the gripping force when the actuator can control the pressure, the force, or the torque.
  • the gripping force command value is given to the gripping point candidate.
  • the "gripping point" means the position and posture of the fingers at which the robot hand 20 can grip the object 70.
  • a position command value at each time t is required in addition to the position and orientation of the gripping point, but the joint position target values of the robot 10 for bringing the robot hand 20 to the gripping point of the object 70 are assumed to be calculated separately.
  • the information that can be used to calculate the gripping point of the object 70 is not limited to the position information and shape information of the object 70. That is, in addition to direct information such as position information and shape information, indirect information such as temperature information, distance information, and color information of the object 70 can be used to estimate the position information and shape information of the object 70.
  • FIG. 2 is a block diagram showing the configuration of the robot control device 30.
  • the robot control device 30 is mainly composed of a gripping point generation unit 31 and a command value generation unit 39.
  • the robot control device 30 calculates the position of the gripping point where the robot 10 should move, moves the robot hand 20 to the gripping point, and controls the robot 10 so that the robot 10 grips and operates.
  • the gripping point generation unit 31 outputs the gripping point of the object 70 by using the shape information of the object 70 to be gripped by the robot 10.
  • the target shape information is obtained by acquiring the image information or distance information of the object 70 with a visual sensor used as the measuring device 60 and processing it into a point cloud.
  • alternatively, the object 70 may actually be gripped once by the fingers of the robot hand 20, and the shape information may be acquired based on the position information of the fingers at the time of gripping.
  • the measuring device 60 may use a distance measuring sensor to acquire shape information based on the cross-sectional shape of the object.
  • the temperature sensor may be used as the measuring device 60 to acquire shape information based on the approximate position and shape of the object 70.
  • the measuring device 60 is not limited to the visual sensor. Further, position information, shape information, temperature information, distance information, color information and the like may be obtained from other than the measuring device 60.
  • FIG. 3 is a block diagram showing the configuration of the grip point generation unit 31.
  • the grip point generation unit 31 is composed of a grip point candidate generation unit 32, a deformation evaluation unit 33, and a grip point determination unit 36.
  • the deformation evaluation unit 33 calculates shape deformation information when the shape of the object 70 is deformed by the gripping operation of the robot hand 20.
  • the gripping point determination unit 36 determines the gripping point of the object based on the deformation amount of the object included in the shape deformation information and on the geometric constraint condition after the deformation of the object.
  • each component will be described.
  • the gripping point candidate generation unit 32 generates gripping point candidates that can be gripped by the robot hand 20 attached to the robot 10, based on the target shape information input to the robot control device 30. At this time, the gripping point candidates can be generated, for example, by exhaustively searching all pairs of points on the entire circumference obtained from the target shape information, subject to the stroke (opening width) of the fingers of the robot hand 20. A case using a two-finger gripper as an example is shown in FIG. 4 described later.
  • in the search, the object is assumed to have an elliptical shape, and the process selects any two points on the outer circumference of the object.
  • the deformation evaluation unit 33 moves a finger inside the object with respect to the two selected points, and performs deformation evaluation described later on the assumption that the grip is performed.
  • the search itself does not have to be a full search; for example, a search can be executed under the constraint that the distance between the two candidate points is compared with the opening/closing distance L0 of the fingers, and the search method itself is not limited.
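  • as a concrete illustration of the candidate search above, the following is a minimal sketch, assuming the contour of the object 70 is available as a 2-D point cloud and that a simple pairwise search under the opening-width constraint L0 is acceptable; the function names, the safety margin, and the example values are hypothetical and not taken from this publication.

```python
import numpy as np

def generate_grip_point_candidates(contour_points, l0, margin=0.0):
    """Enumerate pairs of contour points whose separation fits within the
    finger opening width L0 (optionally reduced by a safety margin).

    contour_points: (N, 2) array of points on the outer contour of the object.
    l0:             maximum opening width of the two-finger gripper.
    Returns a list of (index_i, index_j, distance) candidate grip point pairs.
    """
    pts = np.asarray(contour_points, dtype=float)
    candidates = []
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            d = np.linalg.norm(pts[i] - pts[j])
            # Keep only pairs the gripper can actually span.
            if d <= l0 - margin:
                candidates.append((i, j, d))
    return candidates

# Example: an elliptical contour sampled at 36 points, gripper stroke 40 mm.
theta = np.linspace(0.0, 2.0 * np.pi, 36, endpoint=False)
ellipse = np.stack([30.0 * np.cos(theta), 15.0 * np.sin(theta)], axis=1)
cands = generate_grip_point_candidates(ellipse, l0=40.0)
print(len(cands), "candidate pairs")
```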
  • the deformation evaluation unit 33 evaluates and outputs the expected shape deformation information, as shown in FIG. 4, for each of the plurality of gripping point candidates generated by the gripping point candidate generation unit 32.
  • the shape deformation information includes the shape information after deformation. In order to evaluate the shape deformation information, it is possible to calculate the shape deformation information expected in the model in which each finger causes deformation with respect to the object 70, assuming gripping by point contact by each finger.
  • the mechanical relationship in which deformation occurs with respect to force can be calculated at each point contact point.
  • for example, the object 70 is treated as having a uniform shape with a spring constant K and a damping coefficient C, and the shape deformation information can be evaluated by approximating the characteristics of the object as a rigid body, an elastic body, or a rheological body in the direction in which the finger force is generated.
  • the relational expression between displacement and force is established as described in Non-Patent Document 1.
  • the expected shape deformation information can be obtained by designating the gripping point and causing the shape deformation under appropriate conditions.
  • FIG. 4 shows a case in which the object is geometrically constrained, and FIG. 5 shows a case in which it is not constrained.
  • FIG. 4 is a diagram showing the positional relationship between the fingers of the robot hand 20 and the object 70. FIG. 4(a) shows the positional relationship before the fingers of the robot hand 20 grip the object 70, and FIGS. 4(b) and 4(c) show the positional relationship after the fingers of the robot hand 20 grip the object 70.
  • FIG. 5 is a diagram showing the positional relationship between the finger and the object 70 when the object 70 can move.
  • the direction in which the finger of the robot hand 20 opens and closes is defined as the X axis
  • the direction perpendicular to the direction along the finger is defined as the Y axis.
  • the black arrows pointing outward from the object 70 shown in FIGS. 4 and 5 indicate the direction in which the object 70 is to be moved by applying a force from the outside.
  • in the case shown in FIG. 4, even if an external force is applied in the X-axis direction or the Y-axis direction, the object is geometrically constrained, so the gripped state is easy to maintain. This is because the geometric constraint is established from the shape information of the object 70 after deformation and the positional relationship of the gripping fingers after the deformation of the object 70.
  • the present embodiment is characterized by paying attention to this point and extracting gripping point candidates that realize stable gripping.
  • in FIG. 4, the gripping stability due to the geometric constraint also acts in the X direction, and the object cannot move.
  • in FIG. 5, the geometric constraint does not act, the gripping stability is low, and the object can move.
  • the deformation evaluation unit 33 outputs a plurality of discrete points DP1, DP2, ... As shape deformation information of the object 70.
  • the discrete points DP1, DP2, ... are set based on the contour of the shape expected in the model that causes deformation with respect to the object 70.
  • the gripping point determination unit 36 evaluates gripping stability by determining whether or not a geometric constraint is established from the relationship between the positions of the discrete points DP1, DP2, ... and the finger positions FP1, FP2.
  • the gripping point determination unit 36 obtains a first approximate curve for a plurality of discrete points located in the vicinity of the finger position FP1.
  • the gripping point determination unit 36 sets a plurality of discrete points (not shown) with respect to the finger based on the contour of the finger at the position FP1, and obtains a second approximate curve for the plurality of discrete points.
  • the gripping point determination unit 36 compares the shape of the object 70 near the finger position FP1 (unevenness information, etc.) with the shape of the finger at the position FP1 (arc, rectangle, etc.) based on the first approximate curve and the second approximate curve, and evaluates the gripping stability by determining whether or not the geometric constraint is established.
  • Examples of the comparison method include the magnitude relation of the curvature between the shape of the object 70 and the shape of the finger, and the height difference between the maximum point and the minimum point of the first approximation curve.
  • the gripping point determination unit 36 similarly obtains approximate curves for a plurality of discrete points located in the vicinity of the finger position FP2 and a plurality of discrete points (not shown) relating to the finger at the position FP2, and the same method as described above. Evaluate grip stability.
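  • one possible realization of the curve comparison described above is sketched below, assuming each local contour is approximated by a quadratic least-squares fit whose leading coefficient serves as a curvature proxy; the helper names, the criterion, and the threshold are hypothetical.

```python
import numpy as np

def local_quadratic_fit(points_xy):
    """Fit y = a*x^2 + b*x + c to nearby discrete points; returns (a, b, c).

    At least three points are needed for a degree-2 fit.
    """
    x, y = np.asarray(points_xy, dtype=float).T
    return np.polyfit(x, y, 2)

def geometric_constraint_established(object_pts, finger_pts, min_curvature_gap=1e-3):
    """Compare the object contour near the finger with the finger surface.

    A hypothetical criterion: the constraint is judged established when the
    object's local curvature proxy exceeds the finger's by some margin, i.e.
    the object wraps around the finger rather than resting flat against it.
    """
    a_obj, _, _ = local_quadratic_fit(object_pts)   # first approximate curve
    a_fin, _, _ = local_quadratic_fit(finger_pts)   # second approximate curve
    return abs(a_obj) - abs(a_fin) > min_curvature_gap
```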
  • the gripping point determination unit 36 checks the position coordinates of the discrete points DP1, DP2, ... when a virtual force Fvir is applied to the object 70, and evaluates the gripping stability by determining whether or not the geometric constraint is established, that is, whether or not the amount of change from the position coordinates before the force was applied is equal to or less than a predetermined value.
  • the gripping point determination unit 36 may determine that the gripping stability is low when the amount of change exceeds the predetermined value at even one of the plurality of discrete points, or may determine that the gripping stability is low only when the amount of change exceeds the predetermined value at some of the plurality of discrete points. The virtual force Fvir is assumed to be applied to the object 70 from an arbitrary direction.
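  • the displacement check under the virtual force Fvir could look like the following minimal sketch, assuming the deformed positions of the discrete points are supplied by the physical property model; the tolerance handling and the 20 % fraction in the alternative rule are illustrative assumptions.

```python
import numpy as np

def constraint_holds(points_before, points_after, tol, require_all=True):
    """Check whether the discrete contour points stay (almost) put when a
    virtual force Fvir is applied from an arbitrary direction.

    points_before, points_after: (N, 2) arrays of discrete points DP1, DP2, ...
    tol:         predetermined upper bound on the displacement of a point.
    require_all: if True, a single point exceeding tol marks the grip unstable;
                 if False, only a fraction of points exceeding tol does.
    """
    shifts = np.linalg.norm(
        np.asarray(points_after, dtype=float) - np.asarray(points_before, dtype=float),
        axis=1,
    )
    if require_all:
        return bool(np.all(shifts <= tol))
    # Alternative rule (hypothetical): tolerate up to 20 % of the points moving too far.
    return float(np.mean(shifts > tol)) <= 0.2
```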
  • the grip point closest to the position of the center of gravity can be selected.
  • if the distance between the center of gravity and the gripping point is short, the couple can be expected to remain small even when the gripping force near the gripping point fluctuates due to disturbance or the like.
  • FIG. 6 is a flowchart showing the operation of the robot control device.
  • the target shape information is input.
  • the gripping point candidate generation unit 32 generates gripping point candidates that can be gripped by the robot hand 20 based on the input target shape information.
  • the deformation evaluation unit 33 evaluates and outputs the shape deformation information for each case of the plurality of gripping point candidates.
  • the gripping point determining unit 36 determines the gripping point based on the shape deformation information.
  • since the gripping point generation unit 31 has the deformation evaluation unit 33, which calculates the shape deformation information when the shape of the object is deformed by the gripping operation of the hand, and the gripping point determination unit 36, which determines the gripping point of the object based on the shape deformation information, gripping failures caused by gripping an amorphous object such as a flexible object at the selected gripping point are greatly reduced, the object can be gripped with a high success rate, the tact time can be shortened, and the production efficiency can be maintained high, which is a special effect.
  • production efficiency refers to the speed of work such as picking work.
  • more precisely, production efficiency refers to tact time: if an operation that takes 1 second is tried 100 times and succeeds 100 times, it is evaluated as an average tact time of 1 second per item; if the same work is tried 100 times and succeeds only 50 times, it is evaluated as an average tact time of 2 seconds per item. Accordingly, the fewer the failures, the higher the production efficiency.
  • the present embodiment differs from the first embodiment in that the deformation evaluation unit 33 additionally evaluates whether or not the upper limit of the force, calculated based on the relational expression between the force applied to the object 70 and the displacement of the object 70, is exceeded.
  • a plurality of gripping point candidates that satisfy the condition of being geometrically constrained are extracted by the gripping point candidate generation unit 32. For these gripping point candidates, the deformation evaluation unit 33 evaluates whether or not the object 70 exceeds the allowable deformation amount under the gripping force F(t) (a value that changes with time t) expressed as a time series, and adds this to the constraint conditions; this is a feature of the present embodiment.
  • the deformation evaluation unit 33 calculates the upper limit value of the gripping force applied to the object 70 based on the relational expression between the gripping force applied to the object 70 and the displacement of the object 70, and on the deformation amount that the object 70 can tolerate. Then, the deformation evaluation unit 33 evaluates whether or not the gripping force applied to the object 70 from the robot hand 20 exceeds the upper limit value. Further, the deformation evaluation unit 33 outputs the time-series information of the gripping force, calculated based on the relational expression between the gripping force applied to the object 70 and the displacement of the object 70, to the gripping point determination unit 36 as a part of the shape deformation information.
  • Deformations include elastic bodies whose shape returns to their original shape when the gripping force is removed after deformation, rheological bodies whose shape does not completely return, and plastic bodies that deform by the amount of force applied.
  • flexible objects have an upper limit of allowable deformation, and if this is exceeded, the object 70 is damaged or its commercial value is impaired.
  • the amount of deformation is calculated by the force and the time to apply the force.
  • the relational expression between the force and the amount of deformation can be expressed mathematically, for example, by the description in Non-Patent Document 1.
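  • the upper-limit check can be illustrated by the following minimal sketch, which assumes a purely elastic approximation F = K·x so that the allowable deformation maps directly to an allowable force; the function names and example numbers are hypothetical.

```python
def gripping_force_upper_limit(spring_constant_k, allowable_deformation):
    """Elastic approximation: the largest force that keeps the deformation allowable."""
    return spring_constant_k * allowable_deformation

def exceeds_upper_limit(grip_force_series, f_max):
    """True if any sample of the time-series gripping force F(t) exceeds the limit."""
    return any(f > f_max for f in grip_force_series)

# Example: K = 500 N/m and 4 mm of allowable deformation give a 2.0 N upper limit.
f_max = gripping_force_upper_limit(500.0, 0.004)
print(f_max, exceeds_upper_limit([0.5, 1.2, 2.4], f_max))
```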
  • a physical property model can be simulated by a configuration in which an elastic element and a damping element are connected in series as in the Maxwell model.
  • when a two-finger hand is used, two gripping points are given to the fingers. These are set as gripping points PG1 and PG2. At this time, the gripping point PG2 and the point P2 to which the force is applied are made to coincide with each other. Further, the vector (P1P2) and the vector (PG1PG2) are set to be parallel to each other. Regarding the displacement, the displacement of the joint between the spring element and the damping element is defined as x1, and the displacement of the gripping point P2 is defined as x2. The origins of the displacements x1 and x2 can both be defined as the natural-length state. As the initial positional relationship, the length of the spring element k1 is set to X10, and the length of the damping element is set to X20.
  • given the time-series data F(t) of the gripping force applied from the outside, the time-series data of the displacement x1 and the displacement x2 can be obtained by solving the equation of motion.
  • the characteristics (having residual displacement) of the rheological object can be simulated by giving the damping coefficient C2 a non-linear characteristic.
  • the definition of the physical property model is not limited to this, and it can be applied to rigid bodies, elastic bodies, rheological objects, and plastic bodies by changing the coefficients and configurations.
  • since the change in the position of the displacement x2 can be acquired, it is possible to determine what kind of deformation occurs for a given time series F(t) of the gripping force.
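  • a minimal sketch of such a calculation is shown below, assuming the quasi-static Maxwell element described above (spring k1 in series with a damper) driven by a prescribed gripping-force history F(t); the function name, the optional nonlinear damping hook, and the example values are assumptions for illustration.

```python
import numpy as np

def simulate_maxwell(force_t, dt, k1, c2, c2_fn=None):
    """Quasi-static response of a Maxwell element (spring k1 in series with a
    damper c2) to a prescribed gripping-force history F(t).

    x1: displacement of the junction between the spring and the damper.
    x2: displacement of the gripping point (x2 = x1 + spring stretch).
    c2_fn: optional callable c2_fn(x1) giving a nonlinear damping coefficient,
           which lets the model retain residual displacement like a rheological body.
    """
    x1 = 0.0
    x1_hist, x2_hist = [], []
    for f in force_t:
        c = c2_fn(x1) if c2_fn is not None else c2
        x1 += (f / c) * dt          # the damper flows under the transmitted force
        x2 = x1 + f / k1            # the spring stretch adds on top of the damper motion
        x1_hist.append(x1)
        x2_hist.append(x2)
    return np.array(x1_hist), np.array(x2_hist)

# Example: 1 N gripping force held for 0.5 s, then released for 0.5 s.
dt = 0.001
force = np.concatenate([np.full(500, 1.0), np.zeros(500)])
x1, x2 = simulate_maxwell(force, dt, k1=200.0, c2=50.0)
print("residual displacement after unloading:", x2[-1])
```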
  • the format of the physical property model is not particularly limited.
  • since the deformation evaluation unit 33 can extract only gripping points that take into account the permissible deformation of the gripping object, which is the object 70, the ratio of selecting gripping points at which gripping fails is reduced, and a special effect of improving production efficiency is obtained.
  • when the robot hand 20 grips the object 70, which is a flexible amorphous object, it is possible to select a gripping point with high gripping stability that does not damage the object 70, based on the time-series information of the deformed shape of the object 70 and of the force that produced it.
  • even when the robot hand 20 grips the object 70 with a slightly large gripping force, the case where the deformation of the object 70 is acceptable as long as the gripping force is removed within a predetermined time can be included. Therefore, failures are reduced, the object 70 can be gripped with a high success rate, the tact time can be shortened, and the production efficiency can be maintained high, which is a special effect.
  • if the determination is made only by whether or not the fixed upper limit value mentioned above is exceeded, the number of gripping point candidates may become very small. In this case, the fact that the deformation amount remains within the permissible range as long as the upper limit is exceeded only briefly can be utilized.
  • the deformation evaluation unit 33 outputs, as a part of the shape deformation information, the gripping force F(t), which is the magnitude of the force acting on the gripping point, and the time t during which a force equal to or greater than the allowable load is applied. In this case, whether or not the finally allowed deformation amount is reached can be evaluated based on the gripping force F(t) expressed as a time series.
  • the gripping point determination unit 36 uses the gripping point, the gripping force, and the gripping time as the shape deformation information, and determines whether or not the deformation amount is within the allowable range based on the magnitude relationship between the force and the time threshold.
  • the gripping point determining unit 36 can acquire the gripping point and the gripping force that realize the state in which the shape of the food is kept within a certain range.
  • the gripping point information includes information on the position of the gripping point and the gripping force (acting force) at the gripping point.
  • the threshold value regarding the gripping force and the time can be obtained by being converted into the gripping force and the gripping time based on the allowable range of the deformation amount of the object 70.
  • alternatively, the deformation amount may be calculated using the gripping point, the gripping force, and the gripping time, and the upper limit value may be set based on the deformation amount; the method is not limited to a particular one.
  • the upper limit of the deformation amount of the object 70 is provided in advance by the user of this system for each food.
  • since the robot control device 30 can extract only gripping points that take into account the permissible deformation of the gripping object, which is the object 70, the ratio of selecting gripping points at which gripping fails is reduced, the object can be gripped with a high success rate, the tact time can be shortened, and the production efficiency can be maintained high, which is a special effect.
  • the embodiment further includes a gripping stability calculation unit that evaluates gripping stability by mechanical stability against a predetermined external force with respect to the balance of the deformed force near the gripping point.
  • FIG. 8 is a block diagram showing a configuration of a gripping point generation unit according to the third embodiment.
  • the grip point generation unit 31a is composed of a grip stability calculation unit 34 and a result DB (result database) 35.
  • the gripping stability calculation unit 34 evaluates the mechanical stability against a predetermined external force with respect to the balance of the deformed force of the object 70 in the vicinity of the gripping point of the object 70. Further, the gripping stability calculation unit 34 evaluates the balance of the deformed force of the object 70 in the vicinity of the gripping point of the object 70, and the gripping force of the robot hand 20 with respect to the object 70 is minimized. Extract the gripping point.
  • shape deformation information is input to the gripping stability calculation unit 34. First, the gripping stability calculation unit 34 calculates, for each point on the fingers of the robot hand 20 and on the gripping target, the force vector generated in the gripping target after deformation. Next, the gripping stability calculation unit 34 evaluates whether or not the object 70 remains stationary based on the balance of forces at the gripping point of the object 70. At this time, if the deformed object 70 and the fingers of the robot hand 20 are geometrically constrained (immovable), the object 70 remains pressed against the fingers even when a force other than the gripping force of the fingers of the robot hand 20 acts, and this is regarded as a stable state.
  • the grip stability calculation unit 34 determines whether or not the "stable state" can be maintained.
  • the stable state (stability) will be described.
  • the predetermined external force is defined as Fdis
  • the grip stability calculation unit 34 determines whether or not the "stable state" can be maintained even when the external force Fdis is set to a value other than 0.
  • the deformation when the gripping force F(t) and the external force Fdis are applied is added to the shape deformation information. The deformation is obtained from the relationship between displacement and force using the above-mentioned physical property model. The "stable state" is determined based on this shape deformation information.
  • the grip stability calculation unit 34 can also determine the "stable state" under the condition that the robot 10 accelerates or decelerates.
  • an inertial force is generated on the object.
  • the external force Fdis(t) due to the inertial force can be expressed as in Equation 1 using the mass m of the object 70 and the acceleration α_obj(t) of the object 70.
  • the acceleration α_obj(t) of the object 70 is a function of time t, and is basically obtained based on the command values for the fingertip of the robot 10.
  • Fdis(t) = m × α_obj(t)   (Equation 1)
  • the phenomenon in which the object 70 slides off the fingers of the robot hand 20 under the inertial force Finr, that is, the upper limit Flim of the binding force at which the geometric constraint is eliminated, is set according to the physical properties of the object 70 (elastic modulus K and damping coefficient C). If this upper limit of the binding force is exceeded, the geometric constraint is lost and the grip is no longer in a stable state.
  • when the "stable state" can be maintained, the gripping stability calculation unit 34 sets the gripping stability evaluation value high and outputs the stability evaluation result to the result DB 35; when it cannot be maintained, the gripping stability evaluation value is set low and the stability evaluation result is output to the result DB 35.
  • the upper limit of the binding force from which the stable state disappears can also be defined by the friction coefficient ⁇ between the object 70 and the robot hand 20.
  • the binding force upper limit Flim can also be defined as in Equation 2.
  • Flim = μ × Fi   (Equation 2)
  • the gripping stability Si can be defined as in the equation 3.
  • Si = Flim - max(Fdis(t))   (Equation 3)
  • the grip stability calculation unit 34 outputs the grip point candidates and the stability evaluation results calculated via the result DB 35 to the grip point determination unit 36.
  • the gripping point determination unit 36 can select the gripping point having the highest stability evaluation result based on the plurality of stored gripping point candidates and the stability evaluation result.
  • FIG. 9 is a flowchart showing the operation of the robot control device. Since steps S101 to S103 of FIG. 9 are the same as those of FIG. 6, the description thereof will be omitted.
  • in step S201, the gripping stability calculation unit 34 determines whether or not the "stable state" can be maintained. If the "stable state" can be maintained, the process proceeds to step S202, and the gripping stability calculation unit 34 sets the gripping stability evaluation value high. If the "stable state" cannot be maintained, the process proceeds to step S203, and the gripping stability calculation unit 34 sets the gripping stability evaluation value low. Then, in step S204, the gripping point determination unit 36 selects the gripping point with the highest stability evaluation result from the plurality of stored gripping point candidates and stability evaluation results, and determines the gripping point.
  • the deformation evaluation unit 33 performs a simulation in which the gripping force Fi (t) at each gripping point is variously changed.
  • when a small gripping force Fi(t) is included, the binding force upper limit Flim becomes smaller according to Equation 2, and as a result the gripping stability Si tends to become smaller.
  • the gripping stability Si can be defined as in the formula 4 including the index of "minimum deformation” as an index different from the “stable state” as in the formula 3.
  • Si = w1 × (Flim - max(Fdis(t))) + w2 / max(Fi(t))   (Equation 4)
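  • the sketch below combines Equations 1 to 4 into one stability score (with an optional contact-area factor in the spirit of Equation 5, described later); using the maximum of Fi(t) for the binding force limit and the example values are assumptions not stated in the publication.

```python
import numpy as np

def grip_stability(mass, accel_t, mu, grip_force_t, w1=1.0, w2=0.0, area=1.0):
    """Gripping stability index following Equations 1 to 4 (optionally Equation 5).

    Fdis(t) = m * a_obj(t)                                (Equation 1, inertial disturbance)
    Flim    = mu * area * (assumed: max transmitted force)  (Equations 2 / 5)
    Si      = w1*(Flim - max Fdis(t)) + w2 / max F(t)        (Equation 4; w2 = 0 gives Equation 3)
    """
    f_dis = mass * np.asarray(accel_t, dtype=float)                 # Equation 1
    f_lim = mu * area * float(np.max(np.asarray(grip_force_t)))     # Equations 2 / 5
    si = w1 * (f_lim - float(np.max(np.abs(f_dis))))
    if w2 != 0.0:
        si += w2 / float(np.max(np.asarray(grip_force_t)))
    return si

# Example: 0.2 kg object, up to 3 m/s^2 acceleration, mu = 0.4, about 5 N gripping force.
print(grip_stability(0.2, [0.0, 1.5, 3.0], 0.4, [4.0, 5.0, 5.0], w1=1.0, w2=0.5))
```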
  • w1 and w2 are appropriate weighting factors.
  • the weighting factors are designed by the user depending on whether priority is given to maintaining the stable state or to gripping with the minimum gripping force.
  • since the gripping stability calculation unit 34 evaluates the gripping stability based on the gripping stability Si, when the robot hand 20 grips a flexible amorphous object it becomes possible to select a gripping point with high gripping stability, based on the shape of the deformed object 70, without damaging the object 70. As a result, gripping failures are reduced, the object can be gripped with a high success rate, the tact time is shortened, and the production efficiency is improved.
  • the gripping stability calculation unit 34 evaluates the resistance to geometrical deviation based on the shape of the finger of the robot hand 20 and the shape deformation information, and outputs the result of the gripping stability evaluation.
  • the gripping points are represented by points, but in the present embodiment, the gripping points are given a geometric shape. In this case, a plurality of contact points are generated even for one finger.
  • the grip stability calculation unit 34 evaluates the difficulty of geometrically shifting the object 70 with respect to the robot hand 20 based on the shape of the finger of the robot hand 20 and the shape deformation information.
  • FIG. 10 is a diagram showing the positional relationship between the finger of the robot hand 20 and the object 70.
  • Equation 2, which defines the binding force upper limit Flim for the geometric constraint, is replaced by Equation 5.
  • Flim = μ × A × Fi   (Equation 5)
  • A is an effective contact area between the finger of the robot hand 20 and the object 70.
  • the effective contact area indicates the contact area when the finger makes surface contact with an object instead of point contact.
  • the coefficient of friction in the state of surface contact is larger than the coefficient of friction in the state of point contact, and modeling that reflects this is the feature of the present embodiment.
  • the grip stability calculation unit 34 defines the friction coefficient based on the effective contact area, and the grip stability is calculated based on the friction coefficient, which is a feature of the present embodiment.
  • when the robot hand 20 grips a flexible amorphous object, the accuracy of the gripping stability calculated based on the deformed shape is improved, and it becomes possible, more reliably than before, to select a gripping point with high gripping stability that does not damage the object. As a result, failures are reduced, the object can be gripped with a high success rate, the tact time is shortened, and the production efficiency is improved.
  • the shape deformation information after the lapse of time is called the second shape deformation information.
  • the difference amount between the original shape of the object 70 and the shape in the second shape deformation information is calculated and compared with a predetermined deformation tolerance value; gripping point candidates whose difference amount exceeds the deformation tolerance value are evaluated as having low gripping stability, and those whose difference amount does not exceed it are evaluated as having high gripping stability.
  • FIG. 11 is a diagram showing the positional relationship between the finger and the object 70.
  • the gripping stability calculation unit 34 obtains the difference amount between the curvature of the original shape of the object 70 and the curvature of the shape of the object 70 after unloading, and determines the difference amount of the curvature and the predetermined deformation allowable value. Compare and evaluate.
  • the amount of difference in curvature can be obtained as follows. The first shape deformation information and the second shape deformation information are superimposed on the non-deformed point (point far from the gripping point) as a reference.
  • there are two curves between the two reference points: a curve of length L1 from the discrete point DP3 through the discrete point DP5 to the discrete point DP4, and a curve of length L2 from the discrete point DP3 through the discrete point DP1 to the discrete point DP4.
  • corresponding points are defined at fixed ratios of the respective lengths L1 and L2. For example, on each curve, the points corresponding to 0.25 × L1 and 0.25 × L2 are defined as corresponding points, and the distance between the corresponding points is obtained. Each of these distances is found, and the maximum value is defined as the "difference amount of the curvature". In the case of FIG. 11, the difference amount of the curvature is the distance DC1 between the discrete point DP1 and the discrete point DP5.
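  • the corresponding-point computation can be sketched as follows, assuming both curves are given as polylines between the shared reference points DP3 and DP4; sampling many ratios instead of only 0.25 is an illustrative choice, and the function names are hypothetical.

```python
import numpy as np

def points_at_ratios(polyline, ratios):
    """Sample a polyline at the given fractions of its total arc length."""
    pts = np.asarray(polyline, dtype=float)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])
    s = s / s[-1]                              # normalise arc length to [0, 1]
    xs = np.interp(ratios, s, pts[:, 0])
    ys = np.interp(ratios, s, pts[:, 1])
    return np.stack([xs, ys], axis=1)

def curvature_difference(curve_before, curve_after, n_samples=20):
    """Maximum distance between corresponding points taken at equal arc-length
    ratios on the curve before deformation (length L1) and after it (length L2)."""
    ratios = np.linspace(0.0, 1.0, n_samples)
    p1 = points_at_ratios(curve_before, ratios)
    p2 = points_at_ratios(curve_after, ratios)
    return float(np.max(np.linalg.norm(p1 - p2, axis=1)))

# Example with two toy polylines sharing their end points.
before = [[0.0, 0.0], [1.0, 0.5], [2.0, 0.0]]
after = [[0.0, 0.0], [1.0, 0.2], [2.0, 0.0]]
print(curvature_difference(before, after))
```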
  • the deformation evaluation unit 33 evaluates whether the difference amount is larger or smaller than the "deformation allowable value" predetermined by the user, and outputs the evaluation result as a part of the shape deformation information.
  • by evaluating whether or not the shape is acceptable based on the shape after unloading, that is, the final shape after the work, gripping forces and gripping points that would be treated as work failures can be excluded from the extraction targets. As a result, failures are reduced, the object can be gripped with a high success rate, the tact time is shortened, and the production efficiency is improved.
  • FIG. 12 is a block diagram showing the configuration of the gripping point generation unit 31b according to the sixth embodiment.
  • the grip point generation unit 31b is composed of a grip stability calculation unit 34 and a result DB (result database) 35.
  • the robot control device 30 includes a gripping point candidate learning unit 37 and a learning DB (learning database) 38.
  • the grip point candidate learning unit 37 has a neural network 40.
  • the gripping point candidate learning unit 37 takes as inputs the gripping point candidates output from the gripping stability calculation unit 34, the result data that are the stability evaluation results, and the result labels obtained in actual work, and learns the relationship that outputs gripping point candidates for the object from the shape deformation information. As shown in FIG. 12, one example is to learn a network that outputs a gripping point, a gripping force, and a gripping stability when the target shape information (before deformation) is input.
  • for multiple trials, the gripping point, the gripping force, the physical properties of the object, the deformed shape of the object (the shape before deformation and the shape information after deformation), the gripping stability, and a success/failure label for each trial are prepared, and the process of training the neural network on these data is performed.
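  • as an illustration of such a training step, the sketch below assumes a small fully connected network, implemented here with PyTorch, that maps flattened pre-deformation shape information to a gripping point, a gripping force, and a gripping stability; the framework, network size, and loss are assumptions, since the publication does not specify them.

```python
import torch
from torch import nn

# Hypothetical sizes: 64 contour points (x, y) in; grip point (x1, y1, x2, y2),
# gripping force, and gripping stability out.
net = nn.Sequential(
    nn.Linear(64 * 2, 128), nn.ReLU(),
    nn.Linear(128, 64), nn.ReLU(),
    nn.Linear(64, 4 + 1 + 1),
)
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_step(shape_batch, target_batch):
    """shape_batch:  (B, 128) flattened pre-deformation shapes.
    target_batch: (B, 6) grip point, force, and stability labels from trials."""
    optimizer.zero_grad()
    pred = net(shape_batch)
    loss = loss_fn(pred, target_batch)
    loss.backward()
    optimizer.step()
    return loss.item()

# Example with random stand-in data for successful trials.
x = torch.randn(32, 128)
y = torch.randn(32, 6)
print(train_step(x, y))
```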
  • the gripping point generation unit 31b has a physical property model definition unit (not shown) that models the relationship between the force acting on the object 70 and the displacement of the object 70 using a model with a spring constant and a damping coefficient.
  • the physical property model definition unit applies a force that changes with time to the object 70 and estimates the parameters of the physical property model (the spring constant K and the damping coefficient C) based on the time-series information of the displacement caused by the deformation of the object 70 under the applied force. At this time, the spring constant K and the damping coefficient C can be updated from their predetermined values based on the deformation results obtained in actual machine work.
  • alternatively, the relationship between the force and the displacement can be obtained by learning only from the actually obtained time-series information of the deformation and the gripping force, without assuming the spring constant K and the damping coefficient C.
  • the gripping point candidate learning unit 37 has a physical characteristic model learning unit (not shown) that learns by modeling the relationship between the force acting on the object 70 and the displacement of the object 70 by a neural network.
  • the physical characteristic model learning unit applies a force that changes with time to the object 70, and learns a neural network 40 that is set based on time-series information of displacement based on the deformation of the object with respect to the applied force.
  • the gripping point candidate learning unit 37 performs learning processing based on the gripping point candidate stored in the result DB 35 and the stability evaluation result. For example, learning of the neural network 40 is exemplified.
  • the neural network 40 has a learning unit and an inference unit (not shown). Using the learning parameters in the learning unit, the neural network 41 that reflects the learning parameters is incorporated in the inference unit. Then, the target shape information can be input and the gripping point and the gripping force can be output.
  • the learning parameters are exemplified by the coefficients that define the network structure of the neural network.
  • FIG. 13 is a block diagram showing a configuration of another grip point generation unit 31c according to the sixth embodiment.
  • when the neural network 41 acquired in the above process is applied as the gripping point candidate generation unit 32a and the target shape information is input, the gripping point candidate generation unit 32a generates a plurality of gripping point candidates and gripping stabilities and outputs them to the gripping point determination unit 36.
  • the gripping point determination unit 36 selects and outputs one gripping point candidate using the gripping stability.
  • a gripping point generation algorithm that corrects the modeling error observed in actual work can thus be acquired by learning; as a result, the calculation cost for calculating gripping point candidates is reduced and the time for calculating the gripping point is shortened, so a special effect of increasing production efficiency is obtained.
  • the gripping point, the gripping force, the physical properties of the object 70, the deformed shape of the object 70 (the shape before deformation and the shape information after deformation), and the gripping stability are prepared for multiple trials labeled as successes.
  • when the neural network 41 acquired in the above process is applied as the gripping point candidate generation unit 32a and the target shape information is input, a plurality of gripping point candidates and gripping stabilities are generated and output to the gripping point determination unit 36.
  • the gripping point determination unit 36 selects and outputs one gripping point candidate using the gripping stability.
  • a gripping point generation algorithm that automatically outputs gripping points when a target shape is input can be acquired by learning; as a result, the calculation cost for calculating gripping point candidates is reduced and the time for calculating the gripping point is shortened, so the tact time is shortened and the production efficiency is improved, which is a special effect.
  • Embodiment 7. The present embodiment differs from the third embodiment in that, after the gripping point candidate generation unit defines a first gripping force, evaluates the candidates under the condition of gripping the gripping point with the first gripping force, and extracts effective gripping points, the gripping point can be searched for efficiently by then gripping the object with a second gripping force smaller than the first gripping force.
  • FIG. 14 is a block diagram showing the configuration of the gripping point generation unit 31d according to the seventh embodiment.
  • the gripping point candidate is input to the gripping point candidate generation unit 32 from the result DB 35.
  • the gripping point candidates obtained by the gripping stability calculation unit 34 and stored in the result DB, together with the stability evaluation results, are input to the gripping point candidate generation unit 32 again.
  • the gripping point candidate generation unit 32 extracts a finite number of candidates with high stability evaluations, and specifies, for the extracted gripping point candidates, a second gripping force (smaller than the first gripping force).
  • the gripping point generation unit 31d has the result DB 35, which stores a plurality of gripping point candidates, and the gripping point candidate generation unit 32, which defines the first gripping force to be output by the robot hand 20 to the object 70 and outputs the first gripping point candidates, designated to be gripped with the first gripping force, to the deformation evaluation unit.
  • the gripping stability calculation unit 34 calculates the stability evaluation result for the first gripping point candidate, and outputs the first gripping point candidate and the stability evaluation result to the result DB 35.
  • the gripping point candidate generation unit 32 extracts a plurality of gripping point candidates from the first gripping point candidates stored in the result DB 35 based on the stability evaluation results, defines the second gripping force for these gripping point candidates, and outputs them to the deformation evaluation unit 33 again.
  • the gripping point candidate generation unit 32 can repeat the same process three or more times. For example, by repeating the process with a third gripping force, a fourth gripping force, ..., a k-th gripping force and reducing the gripping force to be searched, it is possible to extract the gripping point at which effective deformation of the object 70 is obtained with the minimum gripping force. As a result, points that can be gripped stably with the smallest gripping force can be searched for efficiently, and candidate points that are unlikely to fail in gripping can be extracted in a short time, so the work time per robot operation is shortened, the tact time is shortened, and the production efficiency is increased, which is a special effect.
  • the gripping stability calculation unit 34 is different from the first embodiment in that the gripping stability calculation unit 34 obtains a combination of gripping point candidates for stably gripping the object from the information on the contour of the object.
  • the gripping stability calculation unit 34 acquires information on the contour of the object 70 from the point cloud coordinates of the contour of the object 70, and selects combinations of gripping point candidates on the contour of the object 70. Then, the gripping stability calculation unit 34 obtains an evaluation value for each combination when the robot hand 20 grips the object 70 with a predetermined gripping force, and obtains the combination of gripping point candidates for stably gripping the object 70 based on the evaluation values.
  • the grip stability calculation unit 34 derives a combination of stable grip points based on the evaluation of the magnitude of the minimum grip force required to grip the object 70.
  • the minimum gripping force required to grip the object 70 is the minimum fingertip force required to resist the gravity acting on the object 70. From the viewpoint of the difficulty of breaking the object 70, this value should be small.
  • the evaluation is performed using the value of the fingertip force obtained from the gripping force and the frictional force, and the search for the combination of stable gripping points is performed by the following procedure.
  • the points arranged in the two-dimensional plane are smoothly connected by spline interpolation, and the information on the contour of the object 70 is acquired.
  • Grip point candidates are taken on the contour of the object 70, and all combinations of the two points are stored.
  • the gripping force is set to a certain value, evaluation values are obtained for all combinations of gripping point candidates, and from the results, a combination of stable gripping points at that gripping force is obtained. Then, the gripping force is changed, evaluation values are obtained for all combinations of gripping point candidates, and from the results, a combination of stable gripping points at the gripping force is obtained. This operation is repeated to obtain a combination of stable grip points with the optimum grip force.
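  • this repeated search can be sketched as follows, assuming an evaluation function (standing in for the fingertip-force and friction evaluation) is supplied; the function names and the toy example are hypothetical.

```python
import itertools

def best_grip_pair(candidates, grip_forces, evaluate):
    """Exhaustive search over candidate pairs and a list of gripping forces.

    candidates:  indices of gripping point candidates on the contour.
    grip_forces: gripping force values to try, e.g. decreasing magnitudes.
    evaluate:    callable evaluate(i, j, f) returning a stability score
                 (assumed to be supplied by the deformation / stability model).
    Returns the best (i, j, f, score) found.
    """
    best = None
    for f in grip_forces:
        for i, j in itertools.combinations(candidates, 2):
            score = evaluate(i, j, f)
            if best is None or score > best[3]:
                best = (i, j, f, score)
    return best

# Example with a toy evaluation function standing in for the real model.
toy = lambda i, j, f: -abs(j - i - 5) - 0.1 * f
print(best_grip_pair(range(12), [3.0, 2.0, 1.0], toy))
```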
  • since the gripping stability calculation unit 34 obtains the combination of gripping point candidates of the object 70 for stably gripping the object 70, gripping failures caused by gripping an amorphous object such as a flexible object at the selected gripping point are greatly reduced, the object can be gripped with a high success rate, the tact time can be shortened, and the production efficiency can be improved.
  • the gripping stability calculation unit 34 obtains an evaluation value based not only on the shape deformation information of the object 70 after gripping but also on the shape information of the object 70 before gripping.
  • FIG. 15 is a schematic view of the object 70 before gripping according to the ninth embodiment.
  • the deformation evaluation unit 33 outputs a plurality of discrete points DPB1, DPB2, ... As shape information of the object 70 before gripping.
  • the plurality of discrete points DPB1, DPB2, ... Are set based on the contour of the object 70.
  • the grip stability calculation unit 34 quantitatively evaluates the degree of depression of the object from the positional relationship of the plurality of discrete points DPB1, DPB2, ..., And outputs it as an evaluation value.
  • the gripping stability calculation unit 34 may output the discrete points DPB1 and DPB2 as gripping point candidates having high gripping stability.
  • the gripping stability calculation unit 34 can accurately select the gripping point by obtaining the evaluation value based on the shape information of the object 70 before gripping. As a result, it is possible to grasp the object with a high success rate, and it is possible to obtain a special effect that the tact time can be shortened and the production efficiency can be increased.
  • the processing circuit comprises at least one processor and at least one memory.
  • FIG. 16 is a diagram showing a hardware configuration of the robot control device according to the first to ninth embodiments.
  • the robot control device 30 can be realized by the control circuit shown in FIG. 16A, that is, the processor 81 and the memory 82.
  • examples of the processor 81 include a CPU (Central Processing Unit; also called a processing unit, an arithmetic unit, a microprocessor, a processor, or a DSP (Digital Signal Processor)) and a system LSI (Large Scale Integration).
  • the memory 82 is, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (registered trademark), or a magnetic disk, a flexible disk, an optical disk, a compact disk, a mini disk, a DVD (Digital Versatile Disc), or the like.
  • the robot control device 30 is realized by the processor 81 reading and executing a program stored in the memory 82 for executing the operation of the robot control device 30. It can also be said that this program causes a computer to execute the procedure or method of the robot control device 30.
  • the program executed by the robot control device 30 causes the gripping point generation unit 31 and the command value generation unit 39 to be loaded into the main storage device and generated on the main storage device.
  • the memory 82 stores obstacle information, target shape information, shape deformation information, and the like.
  • the memory 82 is also used as a temporary memory when the processor 81 executes various processes.
  • the robot control device 30 may be realized by dedicated hardware. Further, the functions of the robot control device 30 may be partially realized by dedicated hardware and partially realized by software or firmware.
  • the robot control device 30 may be realized by the dedicated processing circuit 83 shown in FIG. 16 (b). At least a part of the grip point generation unit 31 and the command value generation unit 39 may be realized by the processing circuit 83.
  • the processing circuit 83 is dedicated hardware.
  • the processing circuit 83 is, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.
  • a part of the function of the robot control device 30 may be realized by software or firmware, and the other part may be realized by dedicated hardware.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

Provided is a robot control device for controlling a robot (10) and a robot hand (20) to grasp an object (70), the robot control device comprising a grasp point generation unit (31) that generates a grasp point of the object (70) to be grasped by the robot hand (20). The grasp point generation unit (31) includes a deformation evaluation unit (33) that calculates shape deformation information when the shape of the object (70) is deformed by a grasping operation of the robot hand (20), and a grasp point determination unit (36) that determines a grasp point of the object (70) on the basis of the shape deformation information.

Description

ロボット制御装置およびロボット制御方法Robot control device and robot control method
 本開示は、対象物を落とさないように把持するために、ロボットおよびそのロボットの指先部に備え付けられるエンドエフェクタに対して動作指令を行うロボット制御装置およびロボット制御方法に関する。 The present disclosure relates to a robot control device and a robot control method that give an operation command to a robot and an end effector attached to the fingertip of the robot in order to grip the object so as not to drop it.
 従来のロボット制御装置では、対象物に対するエンドエフェクタ(ロボットハンド)の把持点を決定するために、対象物の計測情報に基づいて対象物の形状および重量を計測し、対象物の重心位置を推定し、その重心位置の近傍を通過する点を把持点とする方法がある(例えば、特許文献1参照)。 In the conventional robot control device, in order to determine the gripping point of the end effector (robot hand) with respect to the object, the shape and weight of the object are measured based on the measurement information of the object, and the position of the center of gravity of the object is estimated. However, there is a method in which a point passing near the position of the center of gravity is set as a gripping point (see, for example, Patent Document 1).
特開2008-49459号公報(第9~10頁、第2図)Japanese Unexamined Patent Publication No. 2008-49459 (pages 9 to 10, FIG. 2)
 従来のロボット制御装置では、2個以上配置された不定形物体を対象物としたピッキング作業を行う場合に、干渉を考慮すると必ずしも重心位置を通過するような位置をつかめず、把持に伴う変形によってロボットハンドから対象物が落下してしまうことがある。その結果、ピッキング作業の成功率が下がり、生産効率が低下するという課題があった。 In the conventional robot control device, when picking an object with two or more irregular objects arranged, it is not always possible to grasp a position that passes through the position of the center of gravity in consideration of interference, and due to deformation due to gripping. An object may fall from the robot hand. As a result, there is a problem that the success rate of picking work is lowered and the production efficiency is lowered.
The present disclosure has been made to solve the above problems, and provides a robot control device that can grip a flexible or irregularly shaped object with a high success rate in a picking operation, shorten the takt time, and maintain high production efficiency.
The robot control device according to the present disclosure controls a robot and a robot hand of the robot in order to grip an object, and includes a grip point generation unit that generates a grip point of the object to be gripped by the robot hand. The grip point generation unit has a deformation evaluation unit that calculates shape deformation information when the shape of the object is deformed by the gripping motion of the robot, and a grip point determination unit that determines the grip point of the object based on the shape deformation information.
According to the present disclosure, in a picking operation on a flexible or irregularly shaped object, the object can be gripped with a high success rate, the takt time can be shortened, and high production efficiency can be maintained.
FIG. 1 is an overall view of a robot system according to Embodiment 1.
FIG. 2 is a block diagram showing the configuration of the robot control device according to Embodiment 1.
FIG. 3 is a block diagram showing the configuration of the grip point generation unit according to Embodiment 1.
FIG. 4 is a diagram showing the positional relationship between the fingers and the object according to Embodiment 1.
FIG. 5 is a diagram showing the positional relationship between the fingers and the object when the object according to Embodiment 1 can move.
FIG. 6 is a flowchart showing the operation of the robot control device according to Embodiment 1.
FIG. 7 shows physical property models combining a spring element and a damping element according to Embodiment 2.
FIG. 8 is a block diagram showing the configuration of the grip point generation unit according to Embodiment 3.
FIG. 9 is a flowchart showing the operation of the robot control device according to Embodiment 3.
FIG. 10 is a diagram showing the positional relationship between the fingers and the object according to Embodiment 4.
FIG. 11 is a diagram showing the positional relationship between the fingers and the object according to Embodiment 5.
FIG. 12 is a block diagram showing the configuration of the grip point generation unit according to Embodiment 6.
FIG. 13 is a block diagram showing the configuration of another grip point generation unit according to Embodiment 6.
FIG. 14 is a block diagram showing the configuration of the grip point generation unit according to Embodiment 7.
FIG. 15 is a schematic diagram of the object before gripping according to Embodiment 9.
FIG. 16 is a diagram showing the hardware configuration of the robot control device according to Embodiments 1 to 9.
Embodiment 1.
FIG. 1 is an overall view of a robot system according to Embodiment 1 for carrying out the present disclosure. The robot system is basically composed of a robot and a robot control device that controls and operates the robot. The robot 10 may perform a task called material handling, such as gripping an object. In this case, a configuration is added that includes a measuring device 60 that measures information such as the shape of the object 70 in order to acquire position information and shape information of the object 70, and a robot hand 20 (end effector) for gripping the object 70. The information on the object 70 measured by the measuring device 60 is processed by the measuring device controller 50, and the information on the object 70 is input to the robot control device 30.
The robot control device 30 controls the robot 10 and the robot hand 20 of the robot 10 in order to grip the object 70, and includes a grip point generation unit 31 that generates a grip point of the object 70 to be gripped by the robot hand 20. Based on the input information on the object 70, the robot control device 30 calculates a position command value for the robot 10 at the position where the object 70 is to be gripped and the opening position of the fingers (fingertips) of the robot hand 20 so that the robot hand 20 can grip the object 70. The robot control device 30 controls the joints of the arm of the robot 10 and the fingers of the robot hand 20 so that the fingers of the robot hand 20 are brought to the desired positions. Hereinafter, the robot control device 30 controls at least one of the joints of the arm of the robot 10 and the fingers of the robot hand 20 so that the fingers of the robot hand 20 move to appropriate positions. The position and shape information of the object 70 are examples of the information on the object 70.
Further, the robot control device 30 outputs, to the robot 10, the calculated position command value and the position command values for the open state and the closed state of the fingers of the robot hand 20. The robot control device 30 determines the timing at which the position command value for the robot hand 20 is executed relative to the position command value for the robot 10, and transmits them to the robot 10 as the position command value at each time t. This realizes an operation in which the robot hand 20 approaches with its fingers open and closes its fingers at the gripping point of the object 70. Here, unless otherwise specified, the position command value for the robot 10 concerns six degrees of freedom: three translational and three rotational. The position command value for the fingers of the robot hand 20 depends on the type of hand, but for a link structure it is defined by the fingertip position or the opening width. In other cases it may refer to the position command value of each drive unit, but here it broadly refers to any position command value that can be specified, without limitation to the structure of the hand. Further, the gripping force of the fingers of the robot hand 20 can be controlled when the actuator is capable of pressure control, force control, or torque control. Hereinafter, when a gripping force is specified, a gripping force command value is assumed to be given for the gripping point candidate.
The "gripping point" means the position and orientation of the fingers at which the robot hand 20 can grasp the object 70. In actual robot control, as described above, position command values at each time t are required in addition to the position and orientation of the gripping point, but the position target values of the joints of the robot 10 that allow the robot hand 20 to reach the gripping point of the object 70 are calculated separately. The information that can be used to calculate the gripping point of the object 70 need not be limited to the position and shape information of the object 70. That is, besides direct information such as position information and shape information, indirect information such as the temperature information, distance information, and color information of the object 70 can be used to estimate the position information and shape information of the object 70.
FIG. 2 is a block diagram showing the configuration of the robot control device 30. As shown in FIG. 2, the robot control device 30 is mainly composed of a grip point generation unit 31 and a command value generation unit 39. As shown in FIG. 2, the robot control device 30 calculates the position of the gripping point to which the robot 10 should move, moves the robot hand 20 to the gripping point, and controls the robot 10 so that the robot 10 performs the gripping operation. The grip point generation unit 31 outputs the gripping point of the object 70 by using the shape information of the object 70, which is the gripping target of the robot 10.
Specifically, the target shape information is obtained by acquiring, as a point cloud, the image information or distance information of the object 70 obtained by a visual sensor serving as the measuring device 60, and performing calculations on it. Alternatively, the object 70 may actually be gripped once with the fingers of the robot hand 20, and the shape information may be acquired based on the position information of the fingers at the time of gripping. A distance measuring sensor may also be used as the measuring device 60 to acquire the shape information based on the cross-sectional shape of the object. A temperature sensor may also be used as the measuring device 60 to acquire the shape information based on the approximate position and shape of the object 70. Thus, the measuring device 60 is not limited to a visual sensor. Position information, shape information, temperature information, distance information, color information, and the like may also be obtained from sources other than the measuring device 60.
FIG. 3 is a block diagram showing the configuration of the grip point generation unit 31. As shown in FIG. 3, the grip point generation unit 31 is composed of a grip point candidate generation unit 32, a deformation evaluation unit 33, and a grip point determination unit 36. The deformation evaluation unit 33 calculates shape deformation information when the shape of the object 70 is deformed by the gripping operation of the robot hand 20. The grip point determination unit 36 determines the gripping point of the object based on the deformation amount of the object included in this shape deformation information and on the geometric constraint conditions after the deformation of the object. Each component is described below.
The grip point candidate generation unit 32 generates gripping point candidates that can be gripped by the robot hand 20 attached to the robot 10, based on the target shape information input to the robot control device 30. As a method of generating the gripping point candidates, the target shape information can be searched exhaustively for arbitrary pairs of points on the entire circumference, based on the stroke (opening width) of the fingers of the robot hand 20. For example, the case of a two-finger gripper is shown in FIG. 4, described later.
The search over the object, here an elliptical shape, is a process of selecting arbitrary pairs of points on the outer circumference of the object. For the two selected points, the deformation evaluation unit 33 moves the fingers toward the inside of the object and performs the deformation evaluation described later, assuming that gripping is performed. Since the fingers have constraints such as their movable directions and movable ranges, the search itself need not be exhaustive; for example, the search can be performed under the constraint that the distance between the two candidate points is compared with the opening/closing distance L0 of the fingers, and the search method itself is not limited.
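As an illustration only, the following is a minimal sketch of such a candidate search over a sampled contour; the function and variable names (contour_points, l0) are hypothetical and do not appear in the embodiment, and the contour here is a coarsely sampled ellipse used purely as an example.

```python
import itertools
import math

def generate_grip_point_candidates(contour_points, l0):
    """Enumerate pairs of contour points that a two-finger gripper with
    opening/closing distance l0 could close on.

    contour_points: list of (x, y) samples of the object's outer contour.
    l0: maximum opening width of the fingers.
    """
    candidates = []
    for p1, p2 in itertools.combinations(contour_points, 2):
        width = math.dist(p1, p2)
        # Keep only pairs the gripper can physically span.
        if width <= l0:
            candidates.append((p1, p2))
    return candidates

# Example: a coarsely sampled elliptical contour and a 60 mm stroke.
contour = [(40 * math.cos(t), 20 * math.sin(t))
           for t in [i * 2 * math.pi / 36 for i in range(36)]]
print(len(generate_grip_point_candidates(contour, 60.0)))
```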
Next, for each of the plurality of gripping point candidates generated by the grip point candidate generation unit 32, the deformation evaluation unit 33 evaluates and outputs the expected shape deformation information as shown in FIG. 4. The shape deformation information includes the shape information after deformation. To evaluate the shape deformation information, point contact by each finger can be assumed, and the expected shape deformation information can be calculated with a model in which each contact deforms the object 70.
Further, for the point contact portions, of which there are as many as the number of fingers provided on the robot hand 20, an appropriate fixed gripping force Fi can be assumed at each contact point i (i = 1, 2, 3, ...), and the mechanical relationship by which deformation occurs in response to the force can be calculated at each point contact location. In calculating the amount of deformation with respect to the force, the object 70 is treated as having homogeneous properties, and, with a spring constant K and a damping coefficient C, the shape deformation information can be evaluated by approximating the characteristics of the object as a rigid body, an elastic body, or a rheological body in the direction in which the finger force acts. For a rheological object, for example, it is known that a relational expression between displacement and force holds, as described in Non-Patent Document 1. As described above, the expected shape deformation information can be obtained by specifying a gripping point and causing shape deformation under appropriate conditions.
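The sketch below illustrates only the simplest case of this idea, assuming a purely linear spring of constant K at each contact and pushing the contacted contour sample inward by Fi / K; the names (contacts, spring_k) and the point-per-contact displacement rule are illustrative assumptions, not the deformation model of the embodiment.

```python
def expected_deformation(contour_points, contacts, spring_k):
    """Very simplified sketch: estimate the deformed contour when each
    finger presses with a fixed force at a contact point.

    contacts: list of (index_into_contour, force_Fi, inward_unit_vector).
    spring_k: spring constant K approximating the local stiffness.
    Each contacted point is pushed inward by Fi / K (linear spring model).
    """
    deformed = list(contour_points)
    for idx, force, (nx, ny) in contacts:
        depth = force / spring_k          # static deflection of a linear spring
        x, y = deformed[idx]
        deformed[idx] = (x + nx * depth, y + ny * depth)
    return deformed
```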
Finally, the grip point determination unit 36 performs a process of determining the gripping point based on the shape deformation information generated by the deformation evaluation unit 33. From the shape deformation information generated for each of the plurality of gripping point candidates, the grip point determination unit 36 extracts those that are geometrically constrained as gripping points. Specifically, FIG. 4 shows a constrained case, and the case shown in FIG. 5 is an unconstrained case. Here, FIG. 4 is a diagram showing the positional relationship between the fingers of the robot hand 20 and the object 70. FIG. 4(a) shows the positional relationship before the fingers of the robot hand 20 grip the object 70, and FIGS. 4(b) and 4(c) show the positional relationship after the fingers of the robot hand 20 grip the object 70. FIG. 5 is a diagram showing the positional relationship between the fingers and the object 70 when the object 70 can move. In FIGS. 4 and 5, the direction in which the fingers of the robot hand 20 open and close is taken as the X axis, and the direction perpendicular to the direction along the fingers and perpendicular to the X axis is taken as the Y axis.
The black arrows pointing outward from the object 70 in FIGS. 4 and 5 indicate the directions in which an external force attempts to move the object 70. In the gripping method of FIG. 4, based on the deformed shape information of the object 70, the object is geometrically constrained even if an external force is applied in the X-axis direction or the Y-axis direction, so the gripped state is easy to maintain. This is because a geometric constraint is established from the deformed shape information of the object 70 and the positional relationship of the gripping fingers after the deformation of the object 70. The present embodiment is characterized by focusing on this point and extracting gripping point candidates that realize stable gripping. On the other hand, looking at the deformed shape information of the object 70 and the positional relationship of the fingers in FIG. 5, the object similarly cannot move in the X direction because gripping stability due to the geometric constraint acts there. In the minus Y direction, however, the geometric constraint does not act, the gripping stability is low, and the object can be moved.
An example of how the grip point determination unit 36 evaluates gripping stability is shown below. As shown in FIG. 4(c), the deformation evaluation unit 33 outputs a plurality of discrete points DP1, DP2, ... as the shape deformation information of the object 70. The discrete points DP1, DP2, ... are set based on the contour of the shape expected from the model in which the object 70 is deformed. As one example, the grip point determination unit 36 evaluates gripping stability by determining whether a geometric constraint holds from the relationship between the positions of the discrete points DP1, DP2, ... and the finger positions FP1 and FP2.
As another example, the grip point determination unit 36 obtains a first approximation curve for a plurality of discrete points located in the vicinity of the finger position FP1. The grip point determination unit 36 sets a plurality of discrete points (not shown) for the finger based on the contour of the finger at the position FP1 and obtains a second approximation curve for those discrete points. Based on the first and second approximation curves, the grip point determination unit 36 compares the shape of the object 70 near the finger position FP1 (such as concavity and convexity information) with the shape of the finger at the position FP1 (such as an arc or a rectangle), and evaluates gripping stability by determining whether a geometric constraint holds. Examples of the comparison method include the magnitude relationship between the curvature of the shape of the object 70 and that of the finger, and the height difference between the local maximum and local minimum points of the first approximation curve. The grip point determination unit 36 similarly obtains approximation curves for a plurality of discrete points located in the vicinity of the finger position FP2 and for a plurality of discrete points (not shown) relating to the finger at the position FP2, and evaluates gripping stability in the same manner as above.
As yet another example, the grip point determination unit 36 checks the position coordinates of the discrete points DP1, DP2, ... when a virtual force Fvir is applied to the object 70, and evaluates gripping stability by determining whether a geometric constraint holds, that is, whether the change in the position coordinates from before the force was applied is less than or equal to a predetermined value. The grip point determination unit 36 may determine that the gripping stability is low when the amount of change exceeds the predetermined value at even one of the discrete points, or when the amount of change exceeds the predetermined value at some of the discrete points. The virtual force Fvir is assumed to be applied to the object 70 from an arbitrary direction.
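A minimal sketch of this third check is given below; it performs only the threshold comparison, and assumes that the positions of the discrete points before and after applying the virtual force Fvir are supplied by the deformation model (the names points_before, points_after, and tolerance are illustrative).

```python
import math

def is_geometrically_constrained(points_before, points_after, tolerance):
    """Judge the geometric constraint from the discrete points DP1, DP2, ...

    points_before / points_after: (x, y) coordinates of the discrete points
    before and after the virtual force Fvir is applied.
    tolerance: maximum allowed displacement of any discrete point.
    """
    for (x0, y0), (x1, y1) in zip(points_before, points_after):
        if math.hypot(x1 - x0, y1 - y0) > tolerance:
            return False   # at least one point moved too much -> low stability
    return True
```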
The above three methods have been given as examples of how the grip point determination unit 36 evaluates gripping stability, but the methods are not limited to these.
When a plurality of gripping points are extracted, for example, the gripping point closest to the position of the center of gravity can be selected. When the distance between the center of gravity and the gripping point is short, the couple can be expected to remain small even when the gripping force near the gripping point fluctuates due to a disturbance or the like.
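As a rough sketch of this selection rule, the function below uses the centroid of the sampled contour as a stand-in for the center of gravity, which is an assumption; the names grip_point_pairs and contour_points are illustrative.

```python
import math

def closest_to_centroid(grip_point_pairs, contour_points):
    """Among the remaining constrained gripping points (each a pair of
    fingertip positions), pick the one whose midpoint lies closest to the
    contour centroid, used here as a proxy for the center of gravity."""
    cx = sum(x for x, _ in contour_points) / len(contour_points)
    cy = sum(y for _, y in contour_points) / len(contour_points)

    def distance_to_centroid(pair):
        (x1, y1), (x2, y2) = pair
        mx, my = (x1 + x2) / 2.0, (y1 + y2) / 2.0
        return math.hypot(mx - cx, my - cy)

    return min(grip_point_pairs, key=distance_to_centroid)
```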
The operation of the robot control device 30 is described here. FIG. 6 is a flowchart showing the operation of the robot control device. First, in step S101, the target shape information is input. Next, in step S102, the grip point candidate generation unit 32 generates gripping point candidates that can be gripped by the robot hand 20, based on the input target shape information. Next, in step S103, the deformation evaluation unit 33 evaluates and outputs the shape deformation information for each of the plurality of gripping point candidates. Then, in step S104, the grip point determination unit 36 determines the gripping point based on the shape deformation information.
As described above, in the robot control device 30 provided with the grip point generation unit 31, the grip point generation unit 31 in particular has the deformation evaluation unit 33, which calculates shape deformation information when the shape of the object is deformed by the gripping operation of the hand, and the grip point determination unit 36, which determines the gripping point of the object based on the shape deformation information. As a result, for an irregularly shaped object such as a flexible object, gripping failures caused by gripping the selected gripping point are greatly reduced, the object can be gripped with a high success rate, the takt time can be shortened, and high production efficiency can be maintained, which is a distinctive effect.
Production efficiency here refers to the speed of work such as a picking operation. For example, takt time is one measure of production efficiency: if a one-second operation is attempted 100 times and succeeds 100 times, it is evaluated as an average takt time of 1 second per operation, whereas if the same operation is attempted 100 times and succeeds only 50 times, it is evaluated as an average takt time of 2 seconds per operation. Thus, the fewer the failures, the higher the production efficiency.
Embodiment 2.
The present embodiment differs from Embodiment 1 in that the deformation evaluation unit 33 additionally evaluates whether an upper limit of the force, calculated based on the relational expression between the force applied to the object 70 and the displacement of the object 70, has been exceeded. In the robot control device 30 described in Embodiment 1, a plurality of gripping point candidates are extracted by the grip point candidate generation unit 32 after the condition of being geometrically constrained is satisfied. A feature of the present embodiment is that the deformation evaluation unit 33 evaluates, for these gripping point candidates, whether the object 70 exceeds its allowable deformation amount with respect to the time-series gripping force F(t) (a value that changes with time t), and adds this as a constraint condition.
The deformation evaluation unit 33 calculates the upper limit of the gripping force applied to the object 70, based on the relational expression between the gripping force applied to the object 70 and the displacement of the object 70 and on the deformation amount of the object 70 that can be tolerated. The deformation evaluation unit 33 then evaluates whether the gripping force applied to the object 70 from the robot hand 20 exceeds this upper limit. The deformation evaluation unit 33 also outputs, to the grip point determination unit 36 as part of the shape deformation information, time-series information on the gripping force calculated based on the relational expression between the gripping force applied to the object 70 and the displacement of the object 70.
In general, flexible objects can be classified into three categories according to how they deform: elastic bodies, whose shape returns to the original when the gripping force is removed after deformation; rheological bodies, whose shape does not completely return; and plastic bodies, which remain deformed by the amount of force applied. On the other hand, flexible objects have an upper limit of allowable deformation. If the deformation condition is exceeded, the object 70 is damaged or its commercial value is impaired.
The amount of deformation is calculated from the force and the time over which the force is applied. The relational expression between force and deformation amount can be expressed mathematically, for example, as described in Non-Patent Document 1. For example, a rheological body or a plastic body does not return to its original shape, so its physical properties can be simulated by a configuration in which an elastic element and a damping element are connected in series, as in the Maxwell model.
FIG. 7 shows physical property models combining spring elements and damping elements. As shown in FIG. 7, a physical property model is constructed by combining spring elements and damping elements. In FIG. 7, the spring constant of the spring element is denoted by K1, and the damping coefficients of the damping elements are denoted by C1 and C2. By connecting several of these elements in series or in parallel, physical property models such as those shown in FIG. 7 can be expressed. Here, the calculation of force and deformation amount is explained by taking the "Maxwell model" shown in FIG. 7(b) as an example.
First, a physical property model is set up in which the fixed point is P1, the point at which the force acts is P2, and a spring element with spring constant K1 and a damping element with damping coefficient C2 are arranged between P1 and P2. The coefficients of the spring element and the damping element are input in advance according to the target flexible object. Alternatively, the coefficients of the spring element and the damping element can be estimated by using, as the physical properties of the object 70, time-series position information obtained by applying known forces (values such as 1 N, 2 N, 3 N, ...). When such measurement is difficult, coefficients obtained in advance for a similar flexible object can be used, based on the physical properties of the object 70.
When a two-finger hand is used, two gripping points are given for the fingers. Let these be the gripping points PG1 and PG2. The gripping point PG2 is made to coincide with the point P2 at which the force acts. The vector (P1P2) and the vector (PG1PG2) are set to be parallel. For the displacements, the displacement of the joint between the spring element and the damping element is defined as x1, and the displacement of the gripping point P2 is defined as x2. The origins of the displacements x1 and x2 can both be defined as the natural-length state. As the initial positional relationship, the length of the spring element with constant K1 is X10 and the length of the damping element is X20.
Under these conditions, when the time-series gripping force data F(t) is applied from outside, the displacements x1 and x2 can be obtained as time-series data by solving the equation of motion. As described in Non-Patent Document 1, the characteristics of a rheological object (which has residual displacement) can be simulated by giving the damping coefficient C2 a nonlinear characteristic. However, the definition of the physical property model is not limited to this; by changing the coefficients and the configuration, it can be applied to rigid bodies, elastic bodies, rheological objects, and plastic bodies.
Since the change in the position of the displacement x2 can be obtained as the result of the above calculation, it is possible to determine what deformation occurs in response to a given time-series gripping force F(t). In particular, when a damping element is included, the position after unloading may not return to the original position (x1 = 0 and x2 = 0).
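As a simple numerical illustration of this behavior, the sketch below integrates a linear Maxwell element (spring K1 in series with damper C2) driven by a force series F(t), assuming the spring sits on the fixed-point side, which the embodiment does not specify; because both elements carry the same force, the spring deflection is x1 = F/K1 and the damper creeps at rate F/C2. All names and numerical values are illustrative.

```python
def maxwell_response(force_series, dt, k1, c2):
    """Time response of a linear Maxwell element under a force series F(t).
    Returns the time series of (x1, x2), where x1 is the spring deflection
    (joint displacement) and x2 is the displacement of the loaded point P2."""
    x1_series, x2_series = [], []
    damper_elong = 0.0
    for f in force_series:
        x1 = f / k1                      # instantaneous spring deflection
        damper_elong += (f / c2) * dt    # creep accumulated by the damper
        x1_series.append(x1)
        x2_series.append(x1 + damper_elong)
    return x1_series, x2_series

# Example: hold F0 = 2.0 N for 1 s, then unload; a residual displacement remains.
dt = 0.01
forces = [2.0] * 100 + [0.0] * 100
x1, x2 = maxwell_response(forces, dt, k1=500.0, c2=200.0)
print(x2[99], x2[-1])   # displacement at end of loading vs. after unloading
```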
In practice, there are many variations of the model depending on the characteristics of the object, such as adding another damping element in parallel with the elastic element after the elastic element and the damping element have been connected in series. For this reason, the form of the physical property model is not particularly limited in this embodiment.
As described above, according to this embodiment, the deformation evaluation unit 33 can extract only gripping points that take into account the allowable deformation of the gripped object, which is the object 70, so the proportion of gripping points that lead to gripping failure is reduced and production efficiency is improved, which is a distinctive effect. Further, when the robot hand 20 grips the object 70, a flexible and irregularly shaped object, a gripping point with high gripping stability can be selected without damaging the object 70, based on the shape of the deformed object 70 and the time-series information of the force leading up to it. In particular, even when the robot hand 20 grips the object 70 with a somewhat large gripping force, the case where deformation of the object 70 is permissible if the gripping force is unloaded within a predetermined time can be included. As a result, failures are reduced, the object 70 can be gripped with a high success rate, the takt time can be shortened, and high production efficiency can be maintained, which is a distinctive effect.
Next, another modification of this embodiment is described. When gripping food as the object 70, there may be an allowable deformation amount because excessive deformation impairs the commercial value in terms of appearance. In this case, if the judgment is made only on whether a fixed upper limit such as that described above has been exceeded, there may be very few gripping point candidates. In this case, the fact that the deformation amount remains within the allowable range when the time limit is exceeded only slightly can be utilized.
That is, in this embodiment, the deformation evaluation unit 33 outputs, as part of the shape deformation information, the gripping force F(t), which is the magnitude of the force acting on the gripping point, and the time t during which a force equal to or greater than the allowable load is applied. In this case, whether the finally allowed deformation amount is reached can be evaluated based on the time-series gripping force F(t). For example, using the gripping point, the gripping force, and the gripping time as the shape deformation information, the grip point determination unit 36 judges whether the deformation amount is within the allowable range from the magnitude relationship between the force and the time threshold. In this way, the grip point determination unit 36 can acquire a gripping point and a gripping force that keep the shape of the food within a certain range. In this configuration, the gripping point information includes information on the position of the gripping point and on the gripping force (acting force) at the gripping point.
The thresholds for the gripping force and the time can be obtained by converting the allowable range of the deformation amount of the object 70 into a gripping force and a gripping time. For the relationship between displacement and force, the description in Non-Patent Document 1 can be referred to. However, whether the deformation amount of the object 70 is within the allowable range may also be judged by calculating the deformation amount using the gripping point, the gripping force, and the gripping time, and setting an upper limit based on the deformation amount. The upper limit of the deformation amount of the object 70 is provided in advance for each food by the user of this system. In this case as well, since the robot control device 30 can extract only gripping points that take into account the allowable deformation of the gripped object, which is the object 70, the proportion of gripping points that lead to gripping failure is reduced, the object can be gripped with a high success rate, the takt time can be shortened, and high production efficiency can be maintained, which is a distinctive effect.
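A possible sketch of this check, built on the maxwell_response sketch above, is given below. It judges the finally remaining deformation, so a brief excursion above the load limit can still be acceptable; the interpretation of "final deformation" as the last simulated value, and the names force_series and allowed_deformation, are assumptions for illustration.

```python
def deformation_within_tolerance(force_series, dt, k1, c2, allowed_deformation):
    """Simulate the deformation caused by the time-series gripping force F(t)
    with the Maxwell sketch above and judge whether the deformation that
    finally remains is within a user-provided allowable value."""
    _, x2 = maxwell_response(force_series, dt, k1, c2)
    return x2[-1] <= allowed_deformation
```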
Embodiment 3.
This embodiment differs from Embodiment 2 in that it further includes a grip stability calculation unit that evaluates gripping stability in terms of mechanical stability against a predetermined external force, with respect to the balance of forces after deformation in the vicinity of the gripping point. FIG. 8 is a block diagram showing the configuration of the grip point generation unit according to Embodiment 3. In addition to the configuration of the grip point generation unit 31 shown in FIG. 3, the grip point generation unit 31a includes a grip stability calculation unit 34 and a result DB (result database) 35.
The grip stability calculation unit 34 evaluates the mechanical stability against a predetermined external force with respect to the balance of forces after deformation of the object 70 in the vicinity of its gripping point. The grip stability calculation unit 34 also evaluates the balance of forces after deformation of the object 70 in the vicinity of the gripping point and extracts the gripping point of the object at which the gripping force of the robot hand 20 on the object 70 is minimized.
The grip stability calculation unit 34 receives the shape deformation information. First, the grip stability calculation unit 34 performs its calculation based on the force vectors that arise on the gripping target after deformation, for each finger contact point of the robot hand 20 and the gripping target. Next, the grip stability calculation unit 34 evaluates, from the balance of forces at the gripping point of the object 70, whether the object 70 will move. At this time, if the deformed object 70 and the fingers of the robot hand 20 form a geometric constraint (the object cannot move), the object 70 and the fingers are pressed against each other by the action of forces other than the gripping force of the fingers of the robot hand 20, and this is regarded as a stable state.
The grip stability calculation unit 34 determines whether the "stable state" can be maintained. The stable state (stability) is explained as follows. Let the predetermined external force be Fdis, and let the state in which the object 70 has simply been deformed according to the force of the fingers of the robot hand 20 be the state of external force Fdis = 0. The stability in the state of Fdis = 0 can be evaluated, based on the balance of mechanical forces, by whether a couple or an accelerating force is generated on the object 70. Even when a couple or an accelerating force is generated on the object 70, if the fingers of the robot hand 20 and the deformed shape of the object are configured so as to prevent the object 70 from moving in the direction in which the couple or accelerating force is generated, that is, if the geometric constraint holds, the couple or acceleration is considered to be canceled out, and gripping the object at the specified gripping point is judged to be in the "stable state".
Further, the grip stability calculation unit 34 determines whether the "stable state" can be maintained even when the external force Fdis is set to a value other than 0. When the external force Fdis is applied, the deformation resulting from adding the gripping force F(t) and the external force Fdis together is further added to the shape deformation information. The deformation is obtained from the relationship between displacement and force using the physical property model described above. The "stable state" is judged based on this shape deformation information.
Further, the grip stability calculation unit 34 can also judge the "stable state" under conditions in which the robot 10 accelerates or decelerates. First, when the robot accelerates or decelerates while gripping, an inertial force is generated on the object. In the case of an inertial force, the external force Fdis(t) due to the inertial force can be expressed by the mass m of the object 70 and the acceleration α_obj(t) of the object 70 as in Equation 1. The acceleration α_obj(t) of the object 70 is a function of time t, but is basically obtained from the command values relating to the fingers of the robot 10.
  Fdis(t) = m · α_obj(t)   (Equation 1)
For the inertial force Finr, a binding force upper limit Flim at which the object 70 slides off the fingers of the robot hand 20, that is, at which the geometric constraint is lost, is determined in advance according to the physical properties of the object 70 (elastic coefficient K and damping coefficient C). When the binding force upper limit Flim is exceeded, the geometric constraint is regarded as lost and the state is no longer stable. When the "stable state" is maintained, the grip stability calculation unit 34 sets the evaluation value of the gripping stability high and outputs it to the result DB 35 as the stability evaluation result. When the "stable state" is no longer maintained, the grip stability calculation unit 34 sets the evaluation value of the gripping stability low and outputs it to the result DB 35 as the stability evaluation result.
The binding force upper limit at which the stable state is lost can also be defined by the friction coefficient μ between the object 70 and the robot hand 20. When the pressing force at the gripping point of the robot hand 20 is Fi, the binding force upper limit Flim can be defined as in Equation 2.
  Flim = μ · Fi   (Equation 2)
In this case, for example, when the "stable state" for the gripping point i of the fingers of the robot hand 20 is expressed as the gripping stability Si, the gripping stability Si can be defined as in Equation 3.
  Si = Flim - max(Fdis(t))   (Equation 3)
By using the gripping stability Si, gripping points can be compared with one another. The grip stability calculation unit 34 outputs this to the result DB 35 as the stability evaluation result. The stability evaluation result is not limited to this method.
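The following sketch simply evaluates Equations 2 and 3 for candidate gripping points and picks the one with the highest score; the numerical values and candidate names are hypothetical examples.

```python
def grip_stability(mu, pressing_force, disturbance_series):
    """Gripping stability Si per Equations 2 and 3:
    Flim = mu * Fi and Si = Flim - max(Fdis(t))."""
    flim = mu * pressing_force
    return flim - max(disturbance_series)

# Example: compare two candidate gripping points under the same disturbance.
fdis = [0.2, 0.5, 1.1, 0.8]             # hypothetical Fdis(t) samples [N]
candidates = {"A": 3.0, "B": 5.0}        # pressing force Fi [N] per candidate
scores = {name: grip_stability(0.4, fi, fdis) for name, fi in candidates.items()}
best = max(scores, key=scores.get)       # pick the candidate with the highest Si
print(scores, best)
```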
Based on the above rules, the grip stability calculation unit 34 outputs the calculated gripping point candidates and stability evaluation results to the grip point determination unit 36 via the result DB 35. Based on the plurality of stored gripping point candidates and stability evaluation results, the grip point determination unit 36 can select the gripping point with the highest stability evaluation result.
The operation of the robot control device 30 is described here. FIG. 9 is a flowchart showing the operation of the robot control device. Steps S101 to S103 in FIG. 9 are the same as in FIG. 6, so their description is omitted. In step S201, the grip stability calculation unit 34 determines whether the "stable state" can be maintained. If the "stable state" can be maintained, the process proceeds to step S202, and the grip stability calculation unit 34 sets the evaluation value of the gripping stability high. If the "stable state" is not maintained, the process proceeds to step S203, and the grip stability calculation unit 34 sets the evaluation value of the gripping stability low. Then, in step S204, the grip point determination unit 36 selects the gripping point with the highest stability evaluation result based on the plurality of stored gripping point candidates and stability evaluation results, and determines the gripping point.
With this configuration, a gripping point that is unlikely to cause dropping can be extracted, taking into account dropping due to shape deformation while the robot hand 20 is transporting the object 70. In this case as well, only gripping points that take into account the allowable deformation of the gripped object, which is the object 70, are extracted, so the proportion of gripping points that lead to gripping failure is reduced and production efficiency is improved, which is a distinctive effect.
Next, another modification of this embodiment is described. Consider a simulation in the deformation evaluation unit 33 in which the gripping force Fi(t) at each gripping point is varied in various ways. When candidates with a smaller gripping force Fi(t) are included, the binding force upper limit Flim becomes smaller according to Equation 2, and as a result the gripping stability Si tends to become smaller. On the other hand, when the gripping force Fi(t) at each gripping point becomes smaller, the deformation amount output as the shape deformation information calculated by the deformation evaluation unit 33 becomes smaller. In this case, the gripping stability Si can also be defined as in Equation 4, which adds an index of "minimum deformation" to the "stable state" index of Equation 3.
  Si = w1 · (Flim - max(Fdis(t))) + w2 / max(Fi(t))   (Equation 4)
Here, w1 and w2 are appropriate weighting factors. The weighting factors are designed by the user according to whether more importance is placed on easily maintaining the stable state or on gripping with the minimum gripping force.
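A possible sketch of Equation 4 is shown below, extending the grip_stability sketch above; using the peak of Fi(t) as the pressing force in Flim = μ · Fi is an interpretation made for illustration and is not fixed by the text.

```python
def grip_stability_weighted(mu, force_series, disturbance_series, w1, w2):
    """Gripping stability per Equation 4:
    Si = w1 * (Flim - max(Fdis(t))) + w2 / max(Fi(t)),
    where Flim = mu * Fi and the peak gripping force stands in for Fi."""
    fi_max = max(force_series)
    flim = mu * fi_max
    return w1 * (flim - max(disturbance_series)) + w2 / fi_max
```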
When the grip stability calculation unit 34 evaluates gripping stability based on the gripping stability Si, a gripping point with high gripping stability can be selected based on the shape of the deformed object 70, without damaging the object 70, when the robot hand 20 grips a flexible, irregularly shaped object. As a result, gripping failures are reduced, the object can be gripped with a high success rate, the takt time is shortened, and production efficiency is improved, which is a distinctive effect.
Embodiment 4.
In this embodiment, the grip stability calculation unit 34 evaluates resistance to geometric slipping based on the shape of the fingers of the robot hand 20 and the shape deformation information, and outputs it as the gripping stability evaluation result. In Embodiment 3, a gripping point was represented as a point, but in this embodiment a geometric shape is given to the gripping point. In this case, a plurality of contact points arise even for a single finger. The grip stability calculation unit 34 evaluates how resistant the object 70 is to geometric slipping with respect to the robot hand 20, based on the shape of the fingers of the robot hand 20 and the shape deformation information.
FIG. 10 is a diagram showing the positional relationship between the fingers of the robot hand 20 and the object 70. By introducing a physical property model for the case where there are a plurality of contact points associated with gripping the object 70, Equation 2, which defines the binding force upper limit Flim for the geometric constraint, can be replaced by Equation 5.
  Flim = μ · A · Fi   (Equation 5)
Here, A is the effective contact area between the fingers of the robot hand 20 and the object 70. The effective contact area denotes the contact area when, as shown in FIG. 10, a finger makes surface contact with the object rather than point contact. In general, the friction coefficient in the surface contact state is larger than that in the point contact state. Modeling that reflects this is the subject of this embodiment. A represents the amount of finger contact area that increases as the object 70 deforms. The physical property model is such that A = 1 when the object deforms beyond a certain amount, and 0 < A < 1 when the object is pinched lightly and the gripping force Fi(t) is small.
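As a small illustration of Equation 5, the sketch below uses a saturating linear ramp for the effective contact area A; the exact mapping from deformation to A is not specified in the embodiment, so this ramp and the parameter names are assumptions.

```python
def binding_force_limit(mu, pressing_force, deformation, deformation_full_contact):
    """Equation 5, Flim = mu * A * Fi, with an illustrative effective-contact-area
    model: A grows linearly with the deformation and saturates at 1 once the
    deformation reaches the value giving full surface contact."""
    a = min(1.0, max(0.0, deformation / deformation_full_contact))
    return mu * a * pressing_force
```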
In other words, this is equivalent to varying the effective coefficient of friction on the basis of the deformed shape of the object 70 and the amount of contact with the finger of the robot hand 20. The feature of the present embodiment is that the gripping stability calculation unit 34 defines the friction coefficient on the basis of the effective contact area and calculates the gripping stability on the basis of this friction coefficient.
With the above physical property model, the weaker the gripping force Fi(t), the smaller the effective contact area A, so the binding force upper limit Flim changes more readily in response to the gripping force Fi(t). As a result, the accuracy with which the deformation evaluation and the gripping stability calculation unit simulate the actual gripping state is improved.
According to the present embodiment, when the robot hand 20 grips a flexible, amorphous object, the accuracy of the gripping stability calculated on the basis of the deformed shape is improved, and a gripping point with high gripping stability can be selected without damaging the object, more reliably than before. As a result, failures are reduced, the object can be gripped with a high success rate, the tact time is shortened, and production efficiency improves.
Embodiment 5.
The present embodiment differs from Embodiment 3 in that shape deformation information is output for the case where the gripping force is removed a fixed time after it is applied. The deformation evaluation unit 33 outputs the shape deformation information obtained after the robot hand 20 applies the gripping force to the object 70 and then unloads it a fixed time later. The gripping stability calculation unit 34 then obtains the amount of difference between the original shape of the object 70 and the shape of the object 70 after unloading, and evaluates it by comparing the difference with a predetermined allowable deformation value.
While the robot hand 20 continues to apply the gripping force to the object 70, the gripping force can be set to Fi(t) = F0 (constant). In this case, the shape deformation information is expected to converge to a certain fixed shape. On the other hand, unless the deformation is completely plastic, the shape changes further when the gripping force is unloaded.
Consider, for example, a case where a gripping force F0 is applied from 0 seconds to t0 seconds and the load is removed after t0 seconds have elapsed. After unloading, Fi(t + t0) = 0. In the present embodiment, a configuration is added in which the shape after sufficient time has passed following unloading is treated as "shape deformation information after unloading" and is output from the deformation evaluation unit as part of the shape deformation information.
The shape deformation information output from the deformation evaluation unit 33 while the gripping force Fi(t) = F0 is applied is called the first shape deformation information, and the shape deformation information obtained after the gripping force has been unloaded at the fixed time t0 and sufficient time has then passed is called the second shape deformation information. The amount of difference between the original shape of the object 70 and the shape in the second shape deformation information is computed and compared with a predetermined allowable deformation value; gripping points for which the difference exceeds the allowable value are evaluated as having low gripping stability, and those that do not exceed it are evaluated as having high gripping stability.
FIG. 11 is a diagram showing the positional relationship between a finger and the object 70. An example of a method of taking the difference in shape before and after the gripping force is applied will be described with reference to FIG. 11. The gripping stability calculation unit 34 obtains the amount of difference between the curvature of the original shape of the object 70 and the curvature of the shape of the object 70 after unloading, and evaluates it by comparing the curvature difference with a predetermined allowable deformation value. The curvature difference can be obtained as follows. The first shape deformation information and the second shape deformation information are superimposed using points that are not deformed (points far from the gripping point) as a reference. After superimposition, the two points at which the contour changes from deformed to undeformed are selected, that is, the first points at which the curve segments whose positions changed before and after deformation coincide again (corresponding to the discrete points DP3 and DP4 in FIG. 11). The length of the curve between these two points is then obtained. There are two such curves: a curve of length L1 from the discrete point DP3 through the discrete point DP5 to the discrete point DP4, and a curve of length L2 from the discrete point DP3 through the discrete point DP1 to the discrete point DP4. Corresponding points are defined at fixed ratios along each curve on the basis of the lengths L1 and L2. For example, on each curve, the points at distances 0.25 × L1 and 0.25 × L2 from the end point are defined as corresponding points, and the distance between each pair of corresponding points is obtained. The maximum of these distances is defined as the "curvature difference." In the case of FIG. 11, the curvature difference is the distance DC1 between the discrete point DP1 and the discrete point DP5. The deformation evaluation unit 33 evaluates whether this difference is larger or smaller than an "allowable deformation value" predetermined by the user, and outputs the evaluation result as part of the shape deformation information. In FIG. 11, three discrete points are provided on each curve between the two points, but the number is not limited to three and may be, for example, nine. With nine discrete points, the points at 0.1 × L1 and 0.1 × L2 from the end points of the respective curves are defined as corresponding points.
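The sketch below illustrates one way the corresponding-point distance described above could be computed. It assumes that the two curves between DP3 and DP4 are already available as ordered 2-D point arrays superimposed on the undeformed reference; the function names and the equal-arc-length resampling are assumptions made for the example.

```python
import numpy as np

def resample_by_arclength(points, ratios):
    """Return points at the given arc-length ratios along an ordered 2-D polyline."""
    points = np.asarray(points, dtype=float)
    seg = np.linalg.norm(np.diff(points, axis=0), axis=1)
    s = np.concatenate(([0.0], np.cumsum(seg)))          # cumulative arc length
    targets = np.asarray(ratios) * s[-1]
    x = np.interp(targets, s, points[:, 0])
    y = np.interp(targets, s, points[:, 1])
    return np.stack([x, y], axis=1)

def curvature_difference(curve_before, curve_after, n_points=3):
    """Maximum distance between corresponding points taken at equal
    arc-length ratios along the two curves (the 'curvature difference')."""
    ratios = np.linspace(0.0, 1.0, n_points + 2)[1:-1]    # e.g. 0.25, 0.5, 0.75
    p_before = resample_by_arclength(curve_before, ratios)
    p_after = resample_by_arclength(curve_after, ratios)
    return float(np.max(np.linalg.norm(p_before - p_after, axis=1)))

# curve_before / curve_after: ordered points from DP3 to DP4 on each contour.
# A candidate is rejected when curvature_difference(...) > allowable deformation value.
```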
In the present embodiment, the gripping stability calculation unit 34 uses the shape deformation information and, when the difference is larger than the "allowable deformation value," outputs the candidate with a label indicating that it is rejected as a gripping point candidate.
According to the present embodiment, when the robot hand 20 grips a flexible, amorphous object, whether the shape is acceptable is evaluated on the basis of the shape after unloading, that is, the final shape after the work. Gripping forces or gripping points that would consequently be treated as work failures can thus be excluded from the extraction targets. As a result, failures are reduced, the object can be gripped with a high success rate, the tact time is shortened, and production efficiency improves.
Embodiment 6.
The present embodiment differs from Embodiment 1 in that the gripping point generation unit includes a gripping point candidate learning unit that constructs a neural network capable of taking the shape of the object as input and outputting a gripping point. The network is trained using, as inputs, success/failure labels (success/failure information) obtained as the results of work performed on gripping points in simulation or on the actual machine, together with the corresponding shape deformation information, gripping force, gripping point, and physical properties of the object.
FIG. 12 is a block diagram showing the configuration of the gripping point generation unit 31b according to Embodiment 6. In addition to the configuration of the gripping point generation unit 31 shown in FIG. 3, the gripping point generation unit 31b includes the gripping stability calculation unit 34 and a result DB (result database) 35. The robot control device 30 further includes a gripping point candidate learning unit 37 and a learning DB (learning database) 38. The gripping point candidate learning unit 37 has a neural network 40. The gripping point candidate learning unit 37 receives the gripping point candidates and the stability evaluation results output from the gripping stability calculation unit 34 as result data, together with the result labels obtained in actual work, and learns the relationship that outputs gripping point candidates for the object from the shape deformation information. As shown in FIG. 12, one example is to train a network that takes the target shape information (before deformation) as input and outputs a gripping point, a gripping force, and a gripping stability.
The gripping point candidate learning unit 37 will be described using an example that combines simulation and experiments on the actual machine. Consider using simulation (numerical calculation processing) to determine gripping point candidates on the basis of the shape information of the object 70 obtained immediately before gripping, and then actually attempting the gripping operation. With the configurations described up to Embodiment 5, gripping is expected to succeed with high probability when the work is performed on the actual machine. However, on the actual machine, gripping may fail owing to factors that could not be modeled.
In this case, success and failure result labels exist for all of the gripping point candidates designed by the deformation evaluation unit 33 and the gripping point determination unit 36. In general, however, it is difficult to express the causal relationship among success, the shape of the object (before and after deformation), the gripping point, and the gripping force as an explicit formula. A neural network framework, for example, can therefore be used to learn this nonlinear relationship and acquire it.
Success/failure labels for a plurality of trials, together with the gripping point, gripping force, physical properties of the object, deformed shape of the object (shape information before and after deformation), and gripping stability for each trial, are prepared, and the neural network is trained on these data.
Here, the gripping point generation unit 31b has a physical property model definition unit (not shown) that models the relationship between the force acting on the object 70 and the displacement of the object 70 using a model with a spring multiplier and a damping coefficient. The physical property model definition unit applies a time-varying force to the object 70 and estimates the parameters of the physical property model (spring multiplier K and damping coefficient C) set on the basis of the time-series information of the displacement caused by the deformation of the object 70 in response to the applied force. At this time, predetermined values of the spring multiplier K and the damping coefficient C can also be updated on the basis of the deformation results obtained through work on the actual machine. As another method, instead of assuming a spring multiplier K and a damping coefficient C, the relationship between force and displacement can be acquired by learning only from the time-series information of the actually obtained deformation information and gripping force. For example, a neural network can be constructed that is given time-series information of the deformation and the gripping force and outputs displacement information from the time series of the gripping force.
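The following sketch shows one way the spring multiplier K and damping coefficient C could be estimated from force and displacement time series, assuming the linear relation F(t) ≈ K·x(t) + C·ẋ(t). The function name, the least-squares formulation, and the synthetic example data are assumptions for illustration only.

```python
import numpy as np

def estimate_spring_damper(t, x, f):
    """Least-squares fit of F(t) ≈ K*x(t) + C*dx/dt(t) from time-series data.

    t : sample times [s], x : measured displacement [m], f : applied force [N]
    Returns the estimated spring multiplier K and damping coefficient C.
    """
    t = np.asarray(t, dtype=float)
    x = np.asarray(x, dtype=float)
    f = np.asarray(f, dtype=float)
    x_dot = np.gradient(x, t)                 # numerical velocity
    design = np.column_stack([x, x_dot])      # regression matrix [x, x_dot]
    (k, c), *_ = np.linalg.lstsq(design, f, rcond=None)
    return k, c

# Example with synthetic data generated from known parameters.
t = np.linspace(0.0, 1.0, 200)
x = 1e-3 * (1.0 - np.exp(-5.0 * t))           # assumed displacement response
f = 800.0 * x + 2.0 * np.gradient(x, t)       # K = 800 N/m, C = 2 N*s/m
print(estimate_spring_damper(t, x, f))        # approximately (800.0, 2.0)
```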
The gripping point candidate learning unit 37 also has a physical property model learning unit (not shown) that models and learns the relationship between the force acting on the object 70 and the displacement of the object 70 with a neural network. The physical property model learning unit applies a time-varying force to the object 70 and trains the neural network 40, which is set up on the basis of the time-series information of the displacement caused by the deformation of the object in response to the applied force.
The gripping point candidate learning unit 37 performs learning processing on the basis of the gripping point candidates and stability evaluation results stored in the result DB 35; training of the neural network 40 is one example. The neural network 40 has a learning unit and an inference unit (not shown). The learned parameters obtained by the learning unit are used to build, in the inference unit, a neural network 41 that reflects those parameters. The network can then take the target shape information as input and output a gripping point and a gripping force. The learned parameters are, for example, the coefficients that define the network structure of the neural network.
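A minimal training sketch of the kind of regression network described above is shown below. The feature dimension, the output layout [grip_x, grip_y, grip_force, stability], the placeholder random data, and the use of PyTorch are all assumptions made for the example; the patent does not prescribe a specific framework or architecture.

```python
import torch
from torch import nn

# Training data (assumed shapes): each row of `shape_features` encodes the
# pre-deformation target shape, and each row of `targets` holds the gripping
# point coordinates, gripping force, and gripping stability recorded for a trial.
shape_features = torch.randn(256, 32)      # placeholder for real trial data
targets = torch.randn(256, 4)              # [grip_x, grip_y, grip_force, stability]

model = nn.Sequential(
    nn.Linear(32, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 4),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):                   # learning unit: fit the parameters
    optimizer.zero_grad()
    loss = loss_fn(model(shape_features), targets)
    loss.backward()
    optimizer.step()

# Inference unit: the learned parameters are reused to predict a gripping point,
# gripping force, and stability for a newly observed target shape.
with torch.no_grad():
    prediction = model(torch.randn(1, 32))
```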
FIG. 13 is a block diagram showing the configuration of another gripping point generation unit 31c according to Embodiment 6. As shown in FIG. 13, the neural network 41 acquired by the above process is applied as a gripping point candidate generation unit 32a; when the target shape information is input, the gripping point candidate generation unit 32a generates a plurality of gripping point candidates and their gripping stabilities and outputs them to the gripping point determination unit 36. The gripping point determination unit 36 uses the gripping stability to select and output one gripping point candidate.
According to the present embodiment, a gripping point generation algorithm in which the modeling errors found in actual work have been corrected can be acquired by learning. As a result, the computational cost of calculating gripping point candidates is reduced and the time required to calculate a gripping point is shortened, which gives the particular effect of increasing production efficiency.
Next, another modification of the present embodiment will be described, in which the same processing as described above is performed starting from gripping point candidates obtained by simulation. As in the earlier description, consider using simulation (numerical calculation processing) to determine gripping point candidates on the basis of the shape information of the gripping target obtained immediately before gripping, and then actually attempting the gripping operation. In this case, the shape information of the gripping target is also a shape generated by the simulation. The trials of gripping the object 70 are themselves carried out in a physics simulation in which physical contact phenomena are simulated and the shape is observed precisely. Therefore, when a physics simulation free of disturbances and uncertainties is used, the gripping points with the highest success rate are known and all of them are expected to be labeled as successful.
For the trials labeled as successful over a plurality of runs, the gripping point, gripping force, physical properties of the object 70, deformed shape of the object 70 (shape information before and after deformation), and gripping stability are prepared together with the success/failure labels for the respective trials, and the neural network is trained on these data.
Although this is based on a physics simulation model, as in the case of using the actual machine, one example is to train a network that takes the shape information (before deformation) as input and outputs a gripping point, a gripping force, and a gripping stability.
The neural network 41 acquired by the above process is applied as the gripping point candidate generation unit 32a; when the target shape information is input, a plurality of gripping point candidates and gripping stabilities are generated and output to the gripping point determination unit 36. The gripping point determination unit 36 uses the gripping stability to select and output one gripping point candidate.
According to the present embodiment, when the robot hand 20 grips the object 70, which is a flexible, amorphous object, the single candidate point with the highest gripping stability can be extracted simply by inputting the shape information, and gripping failures are reduced. In addition, physical property information on the deformation of the object 70 can be acquired from the actual object, which improves the accuracy with which the simulation-based deformation evaluation unit simulates the deformation. In this way, a gripping point generation algorithm that, given the target shape, automatically outputs the gripping points and stable gripping points that would otherwise be determined from a complicated physics simulation model can be acquired by learning. As a result, the computational cost of calculating gripping point candidates is reduced, the time required to calculate a gripping point is shortened, the tact time is shortened, and production efficiency improves.
Embodiment 7.
The present embodiment differs from Embodiment 3 in that the gripping point candidate generation unit defines a first gripping force, evaluates gripping points under the condition that they are gripped with the first gripping force, extracts effective gripping points, and then grips the object with a second gripping force smaller than the first gripping force, so that gripping points can be searched for efficiently.
FIG. 14 is a block diagram showing the configuration of the gripping point generation unit 31d according to Embodiment 7. Compared with the configuration of the gripping point generation unit 31a shown in FIG. 8, gripping point candidates are fed from the result DB 35 back into the gripping point candidate generation unit 32. As shown in FIG. 14, the result database containing the gripping point candidates and stability evaluation results produced by the gripping stability calculation unit 34 is input to the gripping point candidate generation unit 32 again. The gripping point candidate generation unit 32 then extracts a finite number of candidates with high stability evaluations and specifies a second gripping force (smaller than the first gripping force) for those extracted gripping point candidates.
The gripping point generation unit 31d has the result DB 35, which stores a plurality of gripping point candidates, and the gripping point candidate generation unit 32, which defines the first gripping force that the robot hand 20 applies to the object 70 and outputs to the deformation evaluation unit first gripping point candidates designated to be gripped with the first gripping force. The gripping stability calculation unit 34 calculates stability evaluation results for the first gripping point candidates and outputs the first gripping point candidates and the stability evaluation results to the result DB 35. The gripping point candidate generation unit 32 extracts a plurality of gripping point candidates from the first gripping point candidates stored in the result DB 35 on the basis of the stability evaluation results, defines a second gripping force for the plurality of gripping point candidates, and outputs them to the deformation evaluation unit 33 again.
The gripping point candidate generation unit 32 can also repeat the same processing three or more times. For example, by repeating with a third gripping force, a fourth gripping force, ..., a k-th gripping force and reducing the searched gripping force each time, a gripping point at which an effective deformation of the object 70 is obtained with the minimum gripping force can be extracted. This makes it possible to efficiently search for a point that can be gripped stably with the smallest gripping force, and candidate points unlikely to cause gripping failure can be extracted in a short time. As a result, the work time per robot motion is shortened, the tact time is shortened, and production efficiency is increased.
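The iterative search can be outlined as in the following sketch, where the evaluate callback stands in for the deformation evaluation and gripping stability calculation performed by the units 33 and 34; the force schedule, threshold, and function names are assumptions made for the example.

```python
def search_minimum_force_grip(candidates, evaluate, forces, stability_threshold):
    """Iteratively re-evaluate gripping point candidates with decreasing force.

    candidates          : initial gripping point candidates
    evaluate(point, f)  : stability score when gripping `point` with force f
    forces              : first, second, ..., k-th gripping force in decreasing order
    stability_threshold : minimum acceptable stability score
    Returns (surviving candidates, smallest force at which they remained stable).
    """
    current = list(candidates)
    used_force = None
    for f in forces:                               # e.g. [10.0, 6.0, 3.0, 1.5] N
        results = [(p, evaluate(p, f)) for p in current]
        survivors = [p for p, s in results if s >= stability_threshold]
        if not survivors:                          # force too small: keep previous set
            break
        current, used_force = survivors, f
    return current, used_force
```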
Embodiment 8.
The present embodiment differs from Embodiment 1 in that the gripping stability calculation unit 34 obtains, from contour information of the object, combinations of gripping point candidates that grip the object stably. The gripping stability calculation unit 34 acquires the contour information of the object 70 from the point cloud coordinates of the contour of the object 70 and selects combinations of gripping point candidates on the contour of the object 70. The gripping stability calculation unit 34 then obtains, for each combination, an evaluation value for the case where the robot hand 20 grips the object 70 with a predetermined gripping force, and obtains, on the basis of the evaluation values, the combination of gripping point candidates that grips the object 70 stably.
An evaluation method for searching for combinations of gripping point candidates that grip the object 70 stably will be described. The gripping stability calculation unit 34 derives combinations of stable gripping points on the basis of an evaluation of the minimum gripping force required to grip the object 70. The minimum gripping force required to grip the object 70 is the minimum fingertip force of the robot hand 20 required to resist the gravity acting on the object 70. From the viewpoint of not breaking the object 70, this value should be small. The evaluation uses the fingertip force obtained from the gripping force and the frictional force, and the search for combinations of stable gripping points is performed by the following procedure.
First, from the point cloud coordinates of the contour of the object 70, the points arranged in a two-dimensional plane are connected smoothly by spline interpolation to obtain the contour information of the object 70. Gripping point candidates are taken on the contour of the object 70, and all combinations of two of these points are stored.
Next, the gripping force is set to a certain value, evaluation values are obtained for all combinations of gripping point candidates, and from the results the combination of stable gripping points for that gripping force is obtained. The gripping force is then changed, evaluation values are again obtained for all combinations of gripping point candidates, and the combination of stable gripping points for that gripping force is obtained. This operation is repeated to obtain the combination of stable gripping points with the optimum gripping force.
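One possible implementation of the contour smoothing and pairwise candidate enumeration is sketched below. SciPy is used here only as an example of a spline interpolation routine, and the fingertip_force callback is an assumption standing in for the fingertip-force evaluation described above.

```python
import numpy as np
from itertools import combinations
from scipy.interpolate import splprep, splev

def contour_grip_candidates(contour_xy, n_candidates=36):
    """Smooth the measured contour with a closed spline and sample candidate points."""
    x, y = np.asarray(contour_xy, dtype=float).T
    tck, _ = splprep([x, y], s=0.0, per=True)      # closed-curve spline interpolation
    u = np.linspace(0.0, 1.0, n_candidates, endpoint=False)
    xs, ys = splev(u, tck)
    return np.stack([xs, ys], axis=1)

def best_grip_pair(points, fingertip_force, grip_force):
    """Evaluate every two-point combination and keep the one with the best score.

    fingertip_force(p1, p2, grip_force) is assumed to return the fingertip force
    needed to resist gravity for that pair; smaller values are better.
    """
    pairs = combinations(range(len(points)), 2)
    scored = [((i, j), fingertip_force(points[i], points[j], grip_force))
              for i, j in pairs]
    return min(scored, key=lambda item: item[1])

# The outer loop of the procedure repeats best_grip_pair for several gripping
# forces and keeps the combination obtained with the most suitable force.
```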
As described above, because the gripping stability calculation unit 34 obtains combinations of gripping point candidates for gripping the object 70 stably, gripping failures caused by gripping the selected gripping points on an amorphous object such as a flexible object are greatly reduced, the object can be gripped with a high success rate, the tact time is shortened, and production efficiency can be increased.
Embodiment 9.
In the present embodiment, the gripping stability calculation unit 34 obtains the evaluation value on the basis not only of the shape deformation information of the object 70 after gripping but also of the shape information of the object 70 before gripping. FIG. 15 is a schematic view of the object 70 before gripping according to Embodiment 9. Specifically, the deformation evaluation unit 33 outputs a plurality of discrete points DPB1, DPB2, ... as the shape information of the object 70 before gripping. The discrete points DPB1, DPB2, ... are set on the basis of the contour of the object 70. The gripping stability calculation unit 34 quantitatively evaluates the degree of depression of the object from the positional relationship of the discrete points DPB1, DPB2, ... and outputs it as an evaluation value. In the case of FIG. 15, because depressions occur at the discrete points DPB1 and DPB2, the gripping stability calculation unit 34 may output the discrete points DPB1 and DPB2 as gripping point candidates with high gripping stability.
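The sketch below shows one possible quantitative measure of such depressions for a discretized contour; the patent does not fix a specific formula, so the cross-product-based score and the threshold are assumptions introduced for illustration.

```python
import numpy as np

def concavity_scores(contour_points):
    """Signed concavity at each discrete point of a counter-clockwise contour.

    Uses the z component of the cross product of the two edges meeting at each
    point; negative values indicate a locally concave (depressed) vertex.
    """
    p = np.asarray(contour_points, dtype=float)
    prev_pts = np.roll(p, 1, axis=0)
    next_pts = np.roll(p, -1, axis=0)
    v1 = p - prev_pts
    v2 = next_pts - p
    return v1[:, 0] * v2[:, 1] - v1[:, 1] * v2[:, 0]   # cross product z component

def depressed_candidates(contour_points, threshold=0.0):
    """Indices of discrete points whose score indicates a depression."""
    scores = concavity_scores(contour_points)
    return np.where(scores < threshold)[0]
```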
As described above, because the gripping stability calculation unit 34 obtains the evaluation value on the basis of the shape information of the object 70 before gripping, gripping points can be selected accurately. As a result, the object can be gripped with a high success rate, the tact time is shortened, and production efficiency can be increased.
The hardware configuration of the robot control device 30 will now be described. Each function of the robot control device 30 can be realized by a processing circuit. The processing circuit includes at least one processor and at least one memory.
FIG. 16 is a diagram showing the hardware configuration of the robot control device according to Embodiments 1 to 9. The robot control device 30 can be realized by the control circuit shown in FIG. 16(a), that is, by the processor 81 and the memory 82. Examples of the processor 81 include a CPU (Central Processing Unit; also referred to as a central processor, processing unit, arithmetic unit, microprocessor, microcomputer, processor, or DSP (Digital Signal Processor)) and a system LSI (Large Scale Integration).
The memory 82 is, for example, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), ROM (Read Only Memory), flash memory, EPROM (Erasable Programmable Read Only Memory), or EEPROM (registered trademark) (Electrically Erasable Programmable Read-Only Memory), or a magnetic disk, flexible disk, optical disc, compact disc, mini disc, or DVD (Digital Versatile Disc).
The robot control device 30 is realized by the processor 81 reading out and executing a program, stored in the memory 82, for performing the operations of the robot control device 30. The program can also be said to cause a computer to execute the procedures or methods of the robot control device 30. The program executed by the robot control device 30 includes the gripping point generation unit 31 and the command value generation unit 39, which are loaded onto and generated in the main storage device.
The memory 82 stores obstacle information, target shape information, shape deformation information, and the like. The memory 82 is also used as temporary memory when the processor 81 executes various kinds of processing.
The program executed by the processor 81 may be provided as a computer program product stored on a computer-readable storage medium as a file in an installable or executable format. The program executed by the processor 81 may also be provided to the robot control device 30 via a network such as the Internet.
The robot control device 30 may also be realized by dedicated hardware. Alternatively, the functions of the robot control device 30 may be realized partly by dedicated hardware and partly by software or firmware.
The robot control device 30 may also be realized by the dedicated processing circuit 83 shown in FIG. 16(b). At least part of the gripping point generation unit 31 and the command value generation unit 39 may be realized by the processing circuit 83. The processing circuit 83 is dedicated hardware, for example a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these. Part of the functions of the robot control device 30 may be realized by software or firmware and the remainder by dedicated hardware.
10 robot, 20 robot hand, 30 robot control device, 31, 31a, 31b, 31c, 31d gripping point generation unit, 32, 32a gripping point candidate generation unit, 33 deformation evaluation unit, 34 gripping stability calculation unit, 35 result DB, 36 gripping point determination unit, 37 gripping point candidate learning unit, 38 learning DB, 39 command value generation unit, 50 measuring device controller, 60 measuring device, 70 object, 81 processor, 82 memory, 83 processing circuit, FP1 to FP6 finger positions, DP1 to DP5, DPB1, DPB2 discrete points.

Claims (18)

1. A robot control device that controls a robot and a robot hand of the robot in order to grip an object, the robot control device comprising:
    a gripping point generation unit that generates a gripping point of the object to be gripped by the robot hand, wherein
    the gripping point generation unit includes a deformation evaluation unit that calculates shape deformation information when the shape of the object is deformed by a gripping operation of the robot hand, and
    a gripping point determination unit that determines the gripping point of the object on the basis of the shape deformation information.
2. The robot control device according to claim 1, wherein the gripping point determination unit determines the gripping point of the object on the basis of an amount of deformation of the object included in the shape deformation information and a geometric constraint condition after the deformation of the object.
3. The robot control device according to claim 2, wherein
    the deformation evaluation unit outputs a plurality of discrete points as the shape deformation information, and
    the gripping point determination unit determines the geometric constraint condition on the basis of the positional relationship between the plurality of discrete points and a finger of the robot hand.
4. The robot control device according to claim 2, wherein
    the deformation evaluation unit outputs a plurality of discrete points as the shape deformation information, and
    the gripping point determination unit determines the geometric constraint condition on the basis of the amount of change in the positions of the plurality of discrete points when a virtual force is applied to the object.
5. The robot control device according to any one of claims 1 to 4, wherein the deformation evaluation unit calculates an upper limit value of the gripping force applied to the object on the basis of a relational expression between the gripping force applied to the object and the displacement of the object and an allowable amount of deformation of the object, and evaluates whether the gripping force applied to the object from the robot hand exceeds the upper limit value.
6. The robot control device according to any one of claims 1 to 5, wherein the deformation evaluation unit outputs, as part of the shape deformation information, time-series information of the gripping force calculated on the basis of a relational expression between the gripping force applied to the object and the displacement of the object.
7. The robot control device according to any one of claims 1 to 6, wherein the gripping point generation unit includes a gripping stability calculation unit that evaluates, with respect to the balance of forces after deformation of the object in the vicinity of the gripping point of the object, mechanical stability against a predetermined external force.
8. The robot control device according to claim 7, wherein the gripping stability calculation unit evaluates the balance of forces after deformation of the object in the vicinity of the gripping point of the object and extracts the gripping point of the object at which the gripping force of the robot hand on the object is minimized.
9. The robot control device according to any one of claims 1 to 6, wherein the gripping point generation unit includes a gripping stability calculation unit that evaluates the resistance of the object to geometric slippage with respect to the robot hand on the basis of the shape of the fingertip of the robot hand and the shape deformation information.
10. The robot control device according to any one of claims 7 to 9, wherein
    the deformation evaluation unit outputs the shape deformation information after the robot hand applies a gripping force to the object and then unloads the gripping force a fixed time later, and
    the gripping stability calculation unit obtains an amount of difference between the original shape of the object and the shape of the object after unloading and evaluates it by comparing the difference with a predetermined allowable deformation value.
11. The robot control device according to any one of claims 7 to 9, wherein
    the deformation evaluation unit outputs the shape deformation information after the robot hand applies a gripping force to the object and then unloads the gripping force a fixed time later, and
    the gripping stability calculation unit obtains an amount of difference between the curvature of the original shape of the object and the curvature of the shape of the object after unloading and evaluates it by comparing the curvature difference with a predetermined allowable deformation value.
12. The robot control device according to any one of claims 1 to 6, wherein the gripping point generation unit includes a gripping stability calculation unit that acquires contour information of the object from point cloud coordinates of the contour of the object, selects combinations of gripping point candidates of the object on the contour of the object, obtains, for each combination of gripping point candidates of the object, an evaluation value for the case where the robot hand grips the object with a predetermined gripping force, and obtains, on the basis of the evaluation values, the combination of gripping point candidates of the object that grips the object stably.
13. The robot control device according to any one of claims 7 to 12, wherein
    the gripping point generation unit includes a result database that stores a plurality of gripping point candidates, and a gripping point candidate generation unit that defines a first gripping force to be output to the object by the robot hand and outputs, to the deformation evaluation unit, a first gripping point candidate designated to be gripped with the first gripping force,
    the gripping stability calculation unit calculates a stability evaluation result for the first gripping point candidate and outputs the first gripping point candidate and the stability evaluation result to the result database, and
    the gripping point candidate generation unit extracts a plurality of gripping point candidates from the first gripping point candidates stored in the result database on the basis of the stability evaluation result, defines a second gripping force for the plurality of gripping point candidates, and outputs them to the deformation evaluation unit again.
14. The robot control device according to any one of claims 7 to 12, wherein the gripping point generation unit includes a gripping point candidate learning unit that receives the result data output from the gripping stability calculation unit and the result labels obtained in actual work, and learns the relationship that outputs gripping point candidates of the object from the shape deformation information.
15. The robot control device according to any one of claims 1 to 14, wherein
    the gripping point generation unit includes a physical property model definition unit that models the relationship between the force acting on the object and the displacement of the object with a model using a spring multiplier and a damping coefficient, and
    the physical property model definition unit applies a time-varying force to the object and estimates the spring multiplier and the damping coefficient of the model set on the basis of time-series information of the displacement caused by the deformation of the object in response to the force.
16. The robot control device according to any one of claims 1 to 14, wherein
    the gripping point generation unit includes a physical property model learning unit that models the relationship between the force acting on the object and the displacement of the object with a neural network, and
    the physical property model learning unit applies a time-varying force to the object and trains the neural network set on the basis of time-series information of the displacement caused by the deformation of the object in response to the force.
17. The robot control device according to claim 12 or 13, wherein
    the deformation evaluation unit further outputs a plurality of discrete points as shape information of the object before gripping, and
    the gripping stability calculation unit evaluates a depression of the object from the positional relationship of the plurality of discrete points to obtain a depression evaluation value, and outputs the gripping point candidates on the basis of the depression evaluation value.
18. A robot control method for controlling a robot and a robot hand of the robot in order to grip an object, the method comprising:
    a step of calculating shape deformation information when the shape of the object is deformed by a gripping operation of the robot hand; and
    a step of determining a gripping point of the object on the basis of the shape deformation information and generating the gripping point of the object to be gripped by the robot hand.
PCT/JP2021/036652 2020-10-19 2021-10-04 Robot control device and robot control method WO2022085408A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
DE112021005493.7T DE112021005493T5 (en) 2020-10-19 2021-10-04 ROBOT CONTROL DEVICE AND ROBOT CONTROL METHOD
CN202180060765.2A CN116194255A (en) 2020-10-19 2021-10-04 Robot control device and robot control method
JP2022557371A JP7337285B2 (en) 2020-10-19 2021-10-04 ROBOT CONTROL DEVICE AND ROBOT CONTROL METHOD

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020175203 2020-10-19
JP2020-175203 2020-10-19

Publications (1)

Publication Number Publication Date
WO2022085408A1 true WO2022085408A1 (en) 2022-04-28

Family

ID=81289731

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/036652 WO2022085408A1 (en) 2020-10-19 2021-10-04 Robot control device and robot control method

Country Status (4)

Country Link
JP (1) JP7337285B2 (en)
CN (1) CN116194255A (en)
DE (1) DE112021005493T5 (en)
WO (1) WO2022085408A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102023110111B3 (en) 2023-04-20 2024-06-06 J.Schmalz Gmbh Method for controlling a handling system and handling system
DE102023110107B3 (en) 2023-04-20 2024-05-23 J.Schmalz Gmbh Method for handling objects and handling system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016196077A (en) * 2015-04-06 2016-11-24 キヤノン株式会社 Information processor, information processing method, and program
WO2018092254A1 (en) * 2016-11-17 2018-05-24 株式会社安川電機 Gripping force-setting system, gripping force-setting method and gripping force-estimating system
JP2019107725A (en) * 2017-12-18 2019-07-04 国立大学法人信州大学 Gripping device, learning device, learned model, gripping system, determination method, and learning method
JP2019188587A (en) * 2018-04-24 2019-10-31 ファナック株式会社 Robot control device and system

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4211701B2 (en) * 2004-07-21 2009-01-21 トヨタ自動車株式会社 Robot hand gripping control device
JP2008049459A (en) 2006-08-28 2008-03-06 Toshiba Corp System, method and program for controlling manipulator
WO2018092860A1 (en) 2016-11-16 2018-05-24 三菱電機株式会社 Interference avoidance device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016196077A (en) * 2015-04-06 2016-11-24 キヤノン株式会社 Information processor, information processing method, and program
WO2018092254A1 (en) * 2016-11-17 2018-05-24 株式会社安川電機 Gripping force-setting system, gripping force-setting method and gripping force-estimating system
JP2019107725A (en) * 2017-12-18 2019-07-04 国立大学法人信州大学 Gripping device, learning device, learned model, gripping system, determination method, and learning method
JP2019188587A (en) * 2018-04-24 2019-10-31 ファナック株式会社 Robot control device and system

Also Published As

Publication number Publication date
DE112021005493T5 (en) 2023-08-31
JPWO2022085408A1 (en) 2022-04-28
JP7337285B2 (en) 2023-09-01
CN116194255A (en) 2023-05-30

Similar Documents

Publication Publication Date Title
WO2022085408A1 (en) Robot control device and robot control method
Balasubramanian et al. Human-guided grasp measures improve grasp robustness on physical robot
Harada et al. Fast grasp planning for hand/arm systems based on convex model
Huang et al. Learning a real time grasping strategy
Zhou et al. 6dof grasp planning by optimizing a deep learning scoring function
EP3812972A1 (en) Method for controlling a robot and robot controller
Delgado et al. In-hand recognition and manipulation of elastic objects using a servo-tactile control strategy
Sintov et al. Dynamic regrasping by in-hand orienting of grasped objects using non-dexterous robotic grippers
Chang et al. On alternative uses of structural compliance for the development of adaptive robot grippers and hands
Rocchi et al. Stable simulation of underactuated compliant hands
Kawaharazuka et al. Object recognition, dynamic contact simulation, detection, and control of the flexible musculoskeletal hand using a recurrent neural network with parametric bias
Li et al. Manipulation skill acquisition for robotic assembly based on multi-modal information description
Kumar et al. Contextual reinforcement learning of visuo-tactile multi-fingered grasping policies
Bo et al. Automated design of embedded constraints for soft hands enabling new grasp strategies
Lin et al. Grasp mapping using locality preserving projections and knn regression
Ruiz Garate et al. A bio-inspired grasp stiffness control for robotic hands
Ciocarlie et al. On-line interactive dexterous grasping
Perico et al. Learning robust manipulation tasks involving contact using trajectory parameterized probabilistic principal component analysis
Li et al. Dual loop compliant control based on human prediction for physical human-robot interaction
Añazco et al. Human-like object grasping and relocation for an anthropomorphic robotic hand with natural hand pose priors in deep reinforcement learning
Patel et al. An Analysis of Unified Manipulation with Robot Arms and Dexterous Hands via Optimization-based Motion Synthesis
Tavassolian et al. Forward kinematics analysis of a 3-PRR planer parallel robot using a combined method based on the neural network
JP5829103B2 (en) Robot hand
Vatsal et al. Augmenting vision-based grasp plans for soft robotic grippers using reinforcement learning
Fernandez et al. Regrasping objects during manipulation tasks by combining genetic algorithms and finger gaiting

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21882549

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022557371

Country of ref document: JP

Kind code of ref document: A

122 Ep: pct application non-entry in european phase

Ref document number: 21882549

Country of ref document: EP

Kind code of ref document: A1