WO2023062941A1 - Gripping Control Device and Gripping Control Method - Google Patents
Gripping control device and gripping control method
- Publication number
- WO2023062941A1 (PCT/JP2022/031538)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- fingertip
- control device
- gripping
- force
- contact surface
- Prior art date
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J15/00—Gripping heads and other end effectors
- B25J15/08—Gripping heads and other end effectors having finger members
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
Definitions
- the present disclosure relates to a gripping control device and a gripping control method.
- Various technologies related to the control of robot hands and the like have been proposed (see, for example, Patent Documents 1 and 2 and Non-Patent Document 1).
- A robot hand or the like is required to stably grip even an unknown object (an object whose mass, center of gravity, coefficient of friction, and the like are unknown).
- To do so, the robot hand or the like must apply each fingertip force with an appropriate magnitude and direction.
- A gripping control device according to the present disclosure includes a detection unit that detects a displacement in the normal direction of each of a plurality of curved elastic bodies provided on each of a plurality of fingertips in contact with a gripped object, and a calculation unit that calculates the contact surface normal of each fingertip with respect to the gripped object based on the detection result of the detection unit.
- A gripping control method according to the present disclosure includes detecting a displacement in the normal direction of each of a plurality of curved elastic bodies provided on each of a plurality of fingertips in contact with a gripped object, and calculating the contact surface normal of each fingertip with respect to the gripped object based on the detection result.
- FIG. 1 is a configuration diagram schematically showing a state in which an object is brought into vertical contact with a fingertip of a curved elastic body and a shear displacement is then generated;
- FIG. 2 is an explanatory diagram showing an FEM analysis result of the state shown in FIG. 1;
- FIG. 3 is a configuration diagram schematically showing a state in which an object is brought into oblique contact with a fingertip of a curved elastic body;
- FIG. 4 is an explanatory diagram showing an FEM analysis result of the state shown in FIG. 3;
- FIG. 5 is a configuration diagram schematically showing an example of a grasping system according to a first embodiment of the present disclosure;
- FIG. 6 is an explanatory diagram of gripping force control;
- FIG. 7 is an explanatory diagram of the sticking rate;
- FIG. 8 is a configuration diagram schematically showing an example of a fingertip in the gripping device according to the first embodiment;
- FIG. 9 is an explanatory diagram showing an outline of a method of calculating a contact surface normal;
- FIG. 10 is an explanatory diagram showing an outline of a contact point calculation method;
- FIG. 11 is an explanatory diagram schematically showing a first example (situation 1) of an environment for evaluating contact surface normal detection accuracy;
- FIG. 12 is an explanatory diagram schematically showing a second example (situation 2) of the environment for evaluating contact surface normal detection accuracy;
- FIG. 13 is a configuration diagram showing an outline of a robot hand in the evaluation environment of FIG. 12;
- FIG. 14 is a characteristic diagram showing evaluation results of estimated contact surface normal angles in situation 1 of FIG. 11;
- FIG. 15 is a characteristic diagram showing evaluation results of estimated contact surface normal angles in situation 2 of FIG. 12;
- FIG. 16 is an explanatory diagram schematically showing an environment for evaluating the detection accuracy of contact surface normals for different object shapes;
- FIG. 17 is an explanatory diagram schematically showing a planar evaluation object;
- FIG. 18 is an explanatory diagram schematically showing a cylindrical evaluation object;
- FIG. 19 is an explanatory diagram schematically showing a spherical evaluation object;
- FIG. 20 is a characteristic diagram showing evaluation results when the evaluation object is planar;
- FIG. 21 is a characteristic diagram showing evaluation results when the evaluation object is cylindrical and the detection direction is the direction without curvature;
- FIG. 22 is a characteristic diagram showing evaluation results when the evaluation object is cylindrical and the detection direction is the direction with curvature;
- FIG. 23 is a characteristic diagram showing evaluation results when the evaluation object is spherical;
- FIG. 24 is an explanatory diagram of evaluation results when the evaluation object is cylindrical and the detection direction is the direction with curvature;
- FIG. 25 is an explanatory diagram showing evaluation results for each evaluation object;
- FIG. 26 is a block diagram schematically showing an example of hand control by the gripping control device according to the first embodiment;
- FIG. 27 is an explanatory diagram showing an example of parameters used for hand control;
- FIG. 28 is a configuration diagram schematically showing an example of the slip detection/gripping force determination unit;
- FIG. 29 is a control block diagram showing an example of position/orientation control of a grasped object by the gripping control device;
- FIG. 30 is a configuration diagram schematically showing an example of a robot hand used for evaluation of a hand control method by the gripping control device according to the first embodiment;
- FIG. 31 is an explanatory diagram showing a simulation result of evaluating the hand control method by the gripping control device according to the first embodiment;
- FIG. 32 is an explanatory diagram showing physical parameters of the robot hand used in the simulation for evaluating the hand control method;
- FIG. 33 is an explanatory diagram showing physical parameters of the gripped object and control parameters of hand control used in the simulation.
- In order to make the pressure distribution steep (condition 1), the fingertip is required to be an elastic body with a curved surface. In order to satisfy condition 2, it is common to use the contact point of the fingertip and the contact surface normal during the gripping operation. Therefore, in order to satisfy both condition 1 and condition 2, it is required to detect the contact surface normal and the contact point of the curved elastic fingertip during the gripping operation.
- FIG. 1 schematically shows a state in which an object 100 is brought into vertical contact with a fingertip 110 of a curved elastic body, and then a shear displacement is generated.
- FIG. 2 shows the FEM analysis results of the state shown in FIG. 1.
- FIG. 3 schematically shows a state in which the object 100 is in oblique contact with the fingertip 110 of the curved elastic body.
- FIG. 4 shows the FEM analysis results of the state shown in FIG. 3.
- a sensor 120 is provided on the bottom surface of the fingertip 110 .
- FIGS. 2 and 4 show, as FEM analysis results, the contact surface state (A), the normal stress distribution on the contact surface (B), and the normal stress distribution on the sensor surface (C). From the results of FIGS. 2 and 4, it can be seen that the normal stress distribution is maximized at a position shifted from the center both when the object 100 contacts the fingertip 110 vertically and when it contacts obliquely, and that the two distributions change in a similar way. It is therefore difficult to distinguish vertical contact followed by shear displacement from oblique contact.
- Non-Patent Document 1 (“Intelligent Fingertip Sensing for Contact Information Identification”, Advances in Reconfigurable Mechanisms and Robots, pp.599-608, 2012) is an existing technology related to contact surface normal and contact point detection.
- the technique described in Non-Patent Document 1 utilizes 6-axis haptic information to calculate the contact surface normal of a curved rigid body fingertip and the contact point from a mathematical model.
- The technique described in Non-Patent Document 1 assumes that the fingertip is a rigid body that does not deform. Therefore, it is difficult to simultaneously calculate the contact surface normal and contact point of a curved elastic fingertip and detect initial slippage.
- Patent Document 1 (Japanese Patent Application Laid-Open No. 2009-56593) is an existing technology related to contact surface normal and contact point detection.
- the fingertip surface is divided into elements, and the contact point and the contact surface normal are calculated in advance from the coordinates of each grid point.
- By image recognition, the coordinates of the target contact point on the object are determined; the fingertip contact point having a normal along the target normal vector is then determined, and the target joint angle is computed.
- The technology described in Patent Document 1 is not aimed at stable gripping after contact but at fingertip approach before contact, and detection after contact is not taken into consideration.
- In addition, the image recognition method may suffer from occlusion depending on the position and orientation of the robot hand and the object.
- Patent Document 2 (Japanese Patent Application Laid-Open No. 2007-75929) is an existing technology related to multi-fingered hand control using contact surface normals and contact points.
- FIG. 5 schematically shows a configuration example of a grasping system according to the first embodiment of the present disclosure.
- the gripping system 50 is a system that grips the object 100, and has a gripping control device 51 and a gripping device 52, as shown in FIG.
- the gripping control device 51 is communicably connected to the gripping device 52 and can control driving of the gripping device 52 .
- the gripping control device 51 can drive the gripping device 52 so as to grip the object 100 and control the gripping force (fingertip force) with which the gripping device 52 grips the object 100 .
- the gripping control device 51 can also acquire information obtained by the gripping device 52 .
- the gripping control device 51 can control driving of the gripping device 52 using information acquired from the gripping device 52 .
- the gripping control device 51 has a "detection unit” and a “calculation unit” in the technology of the present disclosure.
- As will be described later, the detection unit in the gripping control device 51 detects the displacement in the normal direction of each of a plurality of curved elastic bodies 10 (see FIG. 8, described later) provided on each of a plurality of fingertips 1. The calculation unit in the gripping control device 51 calculates the contact surface normal of each fingertip 1 with respect to the gripped object based on the detection result of the detection unit.
- The gripping control device 51 may be configured by a computer including, for example, one or more CPUs (Central Processing Units), one or more ROMs (Read Only Memories), and one or more RAMs (Random Access Memories).
- the processing of each unit by the gripping control device 51 can be realized by one or more CPUs executing processes based on programs stored in one or more ROMs or RAMs. Further, the processing of each unit by the gripping control device 51 may be realized by one or a plurality of CPUs executing processing based on a program externally supplied via a wired or wireless network, for example.
- the gripping device 52 is, for example, a robot hand, and performs processing related to gripping the object 100 .
- the gripping device 52 is driven under the control of the gripping control device 51 and can grip the object 100 with a gripping force designated by the gripping control device 51 .
- Robot tasks such as object gripping and walking require control of the contact force between the robot and the surrounding environment or the object 100.
- When the object is unknown, this control becomes difficult.
- In gripping control, it is necessary to apply a gripping force that causes the object 100 neither to slip nor to break.
- Determining such an appropriate gripping force is a difficult task in robot control.
- Initial slippage is a phenomenon in which only a part of the contact surface begins to slip, and is also called a precursor phenomenon of total slippage.
- "Stick" (sticking) refers to a state in which static friction acts over the entire contact surface between the fingertip and the object 100 as the gripped object, and there is no relative movement between the two.
- "Slip" (total slip) refers to a state in which dynamic friction acts over the entire contact surface between the fingertip and the gripped object, accompanied by relative movement between the two.
- Initial slippage is a phenomenon in which dynamic friction is generated on only a part of the contact surface between the fingertip and the gripped object. This initial slip state is said to exist during the transition from the "stick" state to the "slip" state; in the initial slip state, no relative motion occurs between the fingertip and the gripped object.
- The contact area is divided into the "sticking area" where initial slippage does not occur (that is, the partial area of the contact surface where static friction acts) and the "slip area" where initial slippage occurs (that is, the partial area of the contact surface where dynamic friction acts).
- the degree of slippage can be expressed as a ratio of these two regions.
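The ratio of the two regions described above can be sketched in code. The following is a minimal illustration, assuming the contact patch has already been classified cell by cell into stick and slip (the threshold-based classification itself is outside this sketch and not the patent's stated method):

```python
import numpy as np

def sticking_rate(stick_mask: np.ndarray) -> float:
    """Ratio of sticking cells to all contacting cells.

    stick_mask: boolean array over the contact patch,
    True where static friction holds (stick),
    False where local slip (dynamic friction) occurs.
    """
    total = stick_mask.size
    if total == 0:
        return 0.0  # no contact detected
    return float(np.count_nonzero(stick_mask)) / total

# Example: 3 of 4 contact cells still stick.
mask = np.array([True, True, True, False])
rate = sticking_rate(mask)  # 0.75
```

A sticking rate near 1 indicates a safe grip; as it falls toward 0, total slip is imminent and the gripping force should be increased.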
- FIG. 7 shows an example of FEM analysis results under conditions corresponding to gripping a spherical object as the object 100 with a planar fingertip (FIG. 7A) or with a curved fingertip (FIG. 7B).
- FIG. 7C shows how the sticking rate (the ratio of the sticking area to the contact area) changes on the contact surface.
- the areas shown in dark gray indicate sticking areas, and the areas shown in light gray indicate slip areas.
- F X : force in the shear (X) direction, in newtons (N)
- the "shear direction” is a direction perpendicular to the normal direction of the contact surface and indicates a direction parallel to the contact surface. It is the same as the direction in which slip occurs.
- a gripping device 52 such as a robot hand has a plurality of fingertips 1 .
- a configuration example of one fingertip 1 is shown in FIG.
- Each fingertip 1 has a plurality of elastic bodies 10 having a curvature (having a curved surface shape) and a sensor 20 such as a pressure distribution sensor provided on the bottom surface of each elastic body 10 .
- Based on the detection result of the sensor 20 of each fingertip 1, the detection unit of the gripping control device 51 detects the displacement of each elastic body 10 in the normal direction, the displacement of each fingertip 1 in the shear direction, and the initial slippage of the gripped object during gripping.
- the sensor 20 is not limited to the pressure distribution sensor, and a force sensor, an optical tactile sensor, a displacement sensor, or the like can also be used.
- the detection unit of the gripping control device 51 can directly detect force in the normal direction using the force sensor. Further, when using a force sensor, the detection unit of the gripping control device 51 can directly detect force in the shear direction using the force sensor and convert it into displacement in the shear direction.
- When an optical tactile sensor is used, the detection unit of the gripping control device 51 can detect force in the normal direction in the same way as with a pressure distribution sensor, and can directly detect the displacement in the shear direction. When a displacement sensor is used, the detection unit can directly detect the displacement in the normal direction.
- FIG. 9 shows an outline of a method for calculating the contact surface normal.
- FIG. 10 shows an overview of the contact point calculation method.
- The detection unit of the gripping control device 51 calculates the normal-direction displacement (push amount) δ of each elastic body 10 from the normal-direction force acting on each elastic body 10 and the contact surface information of each elastic body 10 with the gripped object, using, for example, equations (1) and (2) below based on Hertzian contact theory.
- The calculation unit of the gripping control device 51 linearly approximates the push amounts to estimate the contact surface of each fingertip 1 with respect to the gripped object (see FIGS. 9A and 9B).
- P max : maximum contact pressure
- F : contact force
- E* : contact elastic modulus
- R* : relative radius of curvature
- δ : displacement in the normal direction (indentation amount)
- the calculation unit of the grip control device 51 calculates a contact surface normal vector n xz (see FIG. 9A) from the estimated contact surface.
- the calculation unit of the gripping control device 51 calculates the contact point Ci between each fingertip 1 and the gripped object based on the contact surface of each fingertip 1 with respect to the gripped object (see FIGS. 10A and 10B).
- The calculation unit of the gripping control device 51 calculates the x and y coordinates of the contact point Ci from the center of pressure (CoP) position on the pressure distribution sensor.
- The calculation unit of the gripping control device 51 calculates the z coordinate of the contact point Ci from, for example, the intersection of the contact surface with those x, y coordinates (the height of the (x, y) point on the contact surface in the z-axis direction).
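The linear approximation of the push amounts, the resulting normal vector n xz, and the contact-point z coordinate can be sketched as follows. Two elastic bodies spaced along x are assumed, and all positions and push amounts are illustrative:

```python
import numpy as np

def contact_surface_xz(xs, deltas):
    """Linearly approximate the contact surface as z = a*x + b in the x-z
    plane from the push amounts delta at elastic-body positions xs."""
    a, b = np.polyfit(xs, deltas, 1)  # least-squares line fit
    return a, b

def contact_normal_xz(a):
    """Unit normal of the line z = a*x + b: proportional to (-a, 1)."""
    n = np.array([-a, 1.0])
    return n / np.linalg.norm(n)

def contact_point_z(a, b, x_cop):
    """z coordinate of the contact point: height of the estimated contact
    surface at the CoP-derived x coordinate."""
    return a * x_cop + b

# Two elastic bodies 10 mm apart, pushed in by 1 mm and 2 mm.
slope, offset = contact_surface_xz([0.0, 10.0], [1.0, 2.0])
n_xz = contact_normal_xz(slope)          # tilted unit normal
z_c = contact_point_z(slope, offset, x_cop=5.0)
```

With more elastic bodies per fingertip, the same least-squares fit averages out sensor noise; the yz-direction normal is obtained the same way.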
- the pressure distribution sensor has a plurality of nodes for detecting pressure formed in a matrix.
- N is the number of sensor nodes of the pressure distribution sensor
- xi is the coordinate of the i-th node in the x-axis direction
- yi is the coordinate of the i-th node in the y-axis direction
- p(xi ) is the pressure detected by the i-th node in the x-axis direction
- p(yi) is the pressure detected by the i-th node in the y-axis direction.
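Given those node definitions, the CoP is the pressure-weighted mean of the node coordinates. The following is a sketch of that computation under the assumption that the CoP takes this standard weighted-centroid form (the patent's own equations (3) and (4) are not reproduced in this text):

```python
import numpy as np

def center_of_pressure(coords, pressures):
    """Pressure-weighted centroid of the sensor nodes.

    coords:    (N, 2) node coordinates (xi, yi) of the matrix sensor
    pressures: (N,)  pressure detected at each node
    """
    coords = np.asarray(coords, dtype=float)
    p = np.asarray(pressures, dtype=float)
    total = p.sum()
    if total <= 0.0:
        raise ValueError("no contact pressure detected")
    return (coords * p[:, None]).sum(axis=0) / total

# 2x2 node patch: all pressure on the right column -> CoP at x = 1.
cop = center_of_pressure([[0, 0], [1, 0], [0, 1], [1, 1]],
                         [0.0, 2.0, 0.0, 2.0])
```

The resulting (x, y) pair supplies the contact point coordinates; shifts of this CoP over time are also what the shear-displacement estimate described later relies on.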
- FIG. 11 schematically shows a first example (situation 1) of the environment for evaluating the detection accuracy of the contact surface normal.
- FIG. 12 schematically shows a second example (situation 2) of the environment for evaluating the detection accuracy of the contact surface normal.
- FIG. 13 shows an overview of the robot hand 310 in the evaluation environment of FIG. 12.
- the evaluation tester 200 includes a Z stage 201 for adjusting the pushing amount and an X stage (translation/rotation stage 202) for generating shear displacement.
- The object 100 used here is planar.
- the robot 300 whose outline is shown in FIG. 12 is used.
- the robot 300 has a robot hand 310 and an arm 320 .
- the arm 320 is an arm with 7 degrees of freedom.
- the robot hand 310 is a parallel gripper having a first finger (first fingertip) 311 and a second finger (second fingertip) 312, the outline of which is shown in FIG.
- the first finger portion 311 and the second finger portion 312 have the same structure as the fingertip 1 shown in FIG.
- FIG. 14 shows the evaluation results of the estimated contact surface normal angle in situation 1 of FIG. 11.
- FIG. 15 shows the evaluation results of the estimated contact surface normal angle in situation 2 of FIG. 12. FIGS. 14 and 15 show the estimated angles in both the xz and yz directions.
- FIG. 16 schematically shows an evaluation environment for the detection accuracy of the contact surface normal of the other object shape.
- An evaluation tester 200A having basically the same structure as the evaluation tester 200 of FIG. 11 is used.
- Figures 17 to 19 show examples of evaluation objects.
- Three types of evaluation objects were prepared: a planar object 101, a cylindrical object 102, and a spherical object 103. Data were collected in 2° steps over contact angles from 0° to 10°. For the cylindrical object 102, data were collected in two directions: the direction without curvature and the direction with curvature.
- FIGS. 20 to 25 show the evaluation results for each evaluation object.
- In FIGS. 20 to 25, the horizontal axis indicates the jig angle θa (°) and the vertical axis indicates the estimated angle θb (°). Since the estimated angle θb depends on the Young's modulus of the object, its absolute value cannot be guaranteed, but high linearity and reproducibility were confirmed.
- In FIGS. 20 to 25, the closer the coefficient of determination R² is to 1, the higher the linearity.
- Each measurement was performed five times, and the standard deviation shown in FIG. 25 indicates the dispersion of the measured values. Note that for the cylindrical object 102 in the direction with curvature and for the spherical object 103, the object surface itself has curvature, so the jig angle does not provide a single true value.
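The linearity score used above can be computed as follows. The sample angles here are illustrative stand-ins, not the patent's measured data:

```python
import numpy as np

def r_squared(y_true, y_pred):
    """Coefficient of determination R^2: 1 means a perfect fit."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)          # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)   # total sum of squares
    return 1.0 - ss_res / ss_tot

# Fit estimated angle vs. jig angle with a line and score its linearity.
jig = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])   # jig angle (deg), 2-deg steps
est = np.array([0.1, 1.9, 4.2, 5.8, 8.1, 10.0])   # illustrative estimates (deg)
slope, intercept = np.polyfit(jig, est, 1)
r2 = r_squared(est, slope * jig + intercept)       # close to 1 -> highly linear
```

Because the absolute angle scale depends on the object's Young's modulus, this relative linearity measure is the meaningful accuracy metric here.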
- The calculation unit calculates the fingertip force (gripping force) of each fingertip 1 on the gripped object based on the initial slippage, the contact surface normal, and the contact point of each fingertip 1. The gripping control device 51 performs hand control for stably gripping an unknown object based on the calculated fingertip force (gripping force).
- FIG. 26 schematically shows an example of a control block diagram of hand control by the grip control device 51.
- FIG. 27 shows an example of parameters used for hand control.
- the grip control device 51 includes a slip detection/grip force determination unit 21, a grip control unit 22, a damping unit 24, and a subtractor 25 as control blocks for hand control.
- D ai : damping coefficient
- s : differential (Laplace) operator
- The slip detection/gripping force determination unit 21 detects and calculates the initial slip using the method described above based on the detection result from the sensor 20. It further calculates the target gripping force (fingertip force) f d based on the initial slip and the detection/calculation results of the contact surface normal, and inputs it to the gripping control unit 22 as a control signal.
- the hand 23 is hand-controlled based on the joint torque ⁇ calculated by the grip control unit 22 , the damping unit 24 and the subtractor 25 .
- the hand 23 has a first finger portion (first fingertip) 11 and a second finger portion (second fingertip) 12 as a plurality of fingertips to grip the object 100.
- The first finger portion 11 and the second finger portion 12 each have a structure of a plurality of curved elastic bodies 10 like the fingertip 1 shown in FIG. 8, but the structure is shown in simplified form in FIG. 26.
- the number of fingertips of the hand 23 is not limited to two, and is arbitrary.
- An example of the control signal τ i output from the gripping control unit 22 is shown in equation (5).
- J(q i ) : Jacobian matrix relating the joint angles q i of each finger of the hand 23 to the fingertip hemisphere center position
- a i : fingertip hemisphere center position of the hand 23
- O : geometric center of the contact points between the fingertips of the hand 23 and the object 100
- C i : each contact point between a fingertip of the hand 23 and the object 100
- f d : target gripping force (fingertip force)
- The control signal τ i (equation (5)) uses fingertip force vectors directed from each contact point C i between the fingertips of the hand 23 and the object 100 toward the geometric center O of the contact points, so that the resultant force and the resultant moment become zero.
- the control signal ⁇ i (equation (5)) can also be transformed into equation (6).
- X i , Y i : contact-surface tangential distances from the geometric center O to the contact point C i of each fingertip
- e Xi , e Yi : contact-surface tangential unit vectors
- e Zi : contact surface normal unit vector
- the first term indicates the control term in the fingertip normal direction
- the second and third terms indicate the control terms in the fingertip tangential direction.
- the second and third terms are terms that compensate for the moment that each fingertip receives from the object 100 .
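The common structural element of equation (5) is the mapping of a desired fingertip force to joint torques through the transposed Jacobian J(q i )ᵀ. A minimal planar two-link sketch of that mapping follows; the link lengths, joint angles, and force are illustrative assumptions, and the patent's actual hand geometry and full control law differ:

```python
import numpy as np

def two_link_jacobian(q1, q2, l1, l2):
    """Jacobian of a planar 2-link finger's tip position w.r.t. joint angles."""
    s1, c1 = np.sin(q1), np.cos(q1)
    s12, c12 = np.sin(q1 + q2), np.cos(q1 + q2)
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def joint_torque(q1, q2, l1, l2, f):
    """tau = J(q)^T f : joint torques realizing fingertip force f."""
    return two_link_jacobian(q1, q2, l1, l2).T @ np.asarray(f)

# Stretched-out finger (q1 = q2 = 0, links 0.1 m) pushing 1 N along +y.
tau = joint_torque(0.0, 0.0, 0.1, 0.1, [0.0, 1.0])  # tau = [0.2, 0.1] N*m
```

The torque is larger at the proximal joint because it carries the full moment arm, which matches the intuition behind the tangential compensation terms above.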
- FIG. 28 schematically shows a configuration example of the slip detection/gripping force determination unit 21.
- the slip detection/grip force determination unit 21 has an LPF (low pass filter) 31 , a reference value generation unit 32 , and a PID (Proportional Integral Differential) control unit 33 .
- Initial slippage can be detected from displacement in shear direction and contact surface information.
- the shear displacement can be estimated by using, for example, a pressure distribution sensor and based on the movement information of the pressure center position.
- For the target gripping force f d , for example, as shown in FIG. 28, an algorithm that determines a gripping force that does not cause initial slippage using PID control can be used.
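A PID loop of the kind just described can be sketched as follows. The gains, reference sticking rate, and sample time are illustrative assumptions; the patent only states that PID control on initial slip is used, not these values:

```python
class GripForcePID:
    """PID on the sticking-rate error: raises f_d when local slip grows."""

    def __init__(self, kp, ki, kd, ref_stick=0.9, dt=0.001):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.ref, self.dt = ref_stick, dt
        self.integ = 0.0
        self.prev_err = 0.0

    def update(self, stick_rate):
        # Error > 0 means too much local slip -> increase the grip force.
        err = self.ref - stick_rate
        self.integ += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        f_d = self.kp * err + self.ki * self.integ + self.kd * deriv
        return max(f_d, 0.0)  # a gripping force cannot be negative

pid = GripForcePID(kp=10.0, ki=50.0, kd=0.0)
f_d = pid.update(stick_rate=0.7)  # sticking rate below reference -> grip harder
```

Regulating to a sticking rate just below 1 is what allows gripping with the minimum necessary force, since the controller only pushes as hard as needed to suppress initial slip.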
- the initial slip can be used to prevent the occurrence of slip, and at the same time, the contact surface normal and the contact point can be used to achieve three-dimensional force balance.
- The calculation unit may calculate, as the fingertip force of each fingertip 1, a fingertip force for controlling the position and orientation of the gripped object based on the contact surface normal and the contact point of each fingertip 1.
- FIG. 29 schematically shows an example of a control block diagram for performing position/orientation control of a grasped object by the grasping control device 51.
- the grip control device 51 includes a position/attitude control unit 40 as a control block that controls the position/attitude of the gripped object.
- the position/orientation control section 40 has a position control section 41 and an orientation control section 42 .
- FIG. 29 shows an example in which the position and orientation of the grasped object are controlled by the first finger (first fingertip) 11 and the second finger (second fingertip) 12, but the number of fingertips is not limited to two and is arbitrary.
- Mode conversion by discrete Fourier transform is used for position/orientation control of the grasped object (equation (7)).
- DFT : discrete Fourier transform
- G : real scalar value after mode conversion
- g : real scalar value before mode conversion
- W : complex scalar value indicating a rotation factor
- N : scalar value representing an arbitrary integer
- the DFT can be expressed using the matrix FN shown in Equation (8) below.
- the DFT matrix is used for the extraction of grasping modes and manipulation modes.
- g denotes the external force vector [f 1 , ..., f N ].
- G is the force or position after mode extraction, whose components are divided into the grasping mode and the manipulation modes.
- the "grasping mode” means force balance control
- the "manipulation mode” means position and orientation control of the center of gravity of an object.
- the position control signal ⁇ pi of the object center of gravity in the manipulation mode can be expressed, for example, by Equation (9), and the attitude control signal ⁇ Oi can be expressed, for example, by Equation (10).
- Od: target object position
- Kp: position gain
- eXd, eYd, eZd: target object orientation
- KO: orientation gain
- J(Ωi): Jacobian matrix relating each joint angle to the orientation angular-velocity vector at the fingertip hemisphere center position.
- FIG. 30 schematically shows an example of a robot hand used for evaluation of the hand control technique by the grip control device 51.
- FIG. 31 shows simulation results when the hand control method according to the present technology described above is applied to the two-finger, four-degree-of-freedom (pitch-pitch/pitch-pitch) robot hand shown in FIG. 30.
- FIG. 32 shows the physical parameters of the robot hand used in the simulation; the physical parameters of the grasped object and the control parameters of the hand control are also shown. From the simulation results in FIG. 31, it can be confirmed that the values of Y1 and Y2 gradually approach each other; in other words, the rolling of the fingertips can be used to control the force and moment balance.
- the contact surface normal of each fingertip 1 with respect to the grasped object is calculated based on the normal-direction displacement of the plurality of curved elastic bodies 10 provided on the fingertips 1 contacting the grasped object. This makes it possible to achieve stable grasping of an unknown object using the contact surface normals.
- According to the gripping control device of the first embodiment, the following effects are obtained. (1) Simultaneously detecting the initial slip occurring during object grasping, the contact surface normals, and the contact points makes it possible to grasp an unknown object (with no prior information about the object 100). (2) Detecting initial slip to prevent slipping, and solving the force/moment balance using the contact surface normals and contact points, enables an unknown object to be grasped stably with the minimum necessary force. (3) The method can be applied to multi-degree-of-freedom hand control regardless of the hand configuration. (4) Applicability to multi-degree-of-freedom hands improves grasping stability and expands the range of graspable objects.
- the present technology can also have the following configuration.
- the contact surface normal of each fingertip with respect to the grasped object is calculated based on the normal-direction displacement of each of a plurality of elastic bodies having a curved surface shape provided on each of a plurality of fingertips contacting the grasped object. This makes it possible to achieve stable grasping of an unknown object using the contact surface normals.
- a gripping control device comprising a calculation unit that calculates a contact surface normal of each fingertip with respect to the grasped object based on a detection result of the detection unit.
- the calculation unit calculates the contact surface normal of each fingertip based on the contact surface of each fingertip with the grasped object, which is calculated from the normal-direction displacement of each elastic body of each fingertip.
- the calculation unit calculates, as the fingertip force of each fingertip, a fingertip force for controlling the position and orientation of the grasped object based on the contact surface normal and the contact point of each fingertip.
- the gripping control device according to any one of (9) to (11) above.
(13) The gripping control device according to (4) above, wherein the detection unit detects the normal-direction force acting on each elastic body with a force sensor provided on the bottom surface of each elastic body.
(14) The gripping control device according to (4) above, wherein the detection unit calculates the normal-direction force acting on each elastic body from the contact surface information and the maximum pressure calculated based on the detection result of an optical tactile sensor provided on the bottom surface of each elastic body.
(15) The gripping control device according to (1) above, wherein the detection unit detects the normal-direction displacement of each elastic body with a displacement sensor provided on the bottom surface of each elastic body.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Human Computer Interaction (AREA)
- Manipulator (AREA)
Abstract
Description
0. Comparative example (FIGS. 1 to 4)
1. First embodiment (FIGS. 5 to 33)
1.1 Configuration example of the grasping system
1.2 Configuration and operation examples of each part of the grasping system, and evaluation examples
1.3 Effects
2. Other embodiments
A robot hand or the like is required to stably grasp an object even when it is an unknown object (an object whose mass, center-of-gravity position, friction coefficient, and so on are unknown). To realize stable grasping, the magnitude and direction of each fingertip force must be given appropriately. There are two conditions for stable grasping: each fingertip force lies within the friction cone, preventing slip (Condition 1), and the resultant force and resultant moment are zero (Condition 2). Detecting initial slip is effective for satisfying Condition 1. Initial slip is a precursor to gross slip, in which only part of the fingertip contact surface begins to slide. The steeper the pressure distribution, the more gradually initial slip develops and the easier it is to detect; making the pressure distribution steep requires the fingertip to be a curved elastic body. For Condition 2, a common approach uses the contact points and contact surface normals of the fingertips during grasping. Therefore, to satisfy both conditions, the contact surface normal and contact point of the curved elastic fingertip must be detected during the grasping motion.
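The two stable-grasp conditions can be checked numerically. The sketch below, under assumed 3-D force and contact-point vectors, tests whether a fingertip force lies inside the friction cone (Condition 1) and whether the resultant force and moment vanish (Condition 2); the friction coefficient `mu` and the tolerance are illustrative parameters, not values from the patent:

```python
import numpy as np

def in_friction_cone(f, normal, mu):
    """Condition 1: the tangential part of fingertip force f stays within
    mu times the normal (pressing) component."""
    fn = float(np.dot(f, normal))                            # normal component
    ft = float(np.linalg.norm(f - fn * np.asarray(normal)))  # tangential magnitude
    return fn > 0.0 and ft <= mu * fn

def is_balanced(forces, contact_points, tol=1e-9):
    """Condition 2: resultant force and resultant moment are (near) zero."""
    total_f = sum(forces)
    total_m = sum(np.cross(p, f) for p, f in zip(contact_points, forces))
    return np.linalg.norm(total_f) < tol and np.linalg.norm(total_m) < tol
```

A two-finger pinch with equal and opposite normal forces passes both checks, which is exactly the configuration the control law aims to maintain.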
[1.1 Configuration example of the grasping system]
FIG. 5 schematically shows a configuration example of a grasping system according to the first embodiment of the present disclosure.
Robot tasks such as object grasping and walking require control of the contact forces with the surrounding environment or the object 100. However, control becomes difficult when the physical quantities of the environment or the object 100 are unknown. In grasp control, for example, the grip force must be controlled so that the object 100 neither slips nor is crushed; for an unknown object whose physical quantities (mass, center-of-gravity position, friction coefficient, and so on) are unknown, determining an appropriate grip force is difficult and remains a challenge in robot control.
To control initial slip, its degree must be quantified. The contact region can be divided into a "stick region" where no initial slip occurs (that is, the portion of the fingertip-object contact surface under static friction) and a "slip region" where initial slip occurs (that is, the portion under kinetic friction). The degree of slip can be expressed as the ratio of these two regions. Here, the ratio of the stick region to the contact region is defined as the "stick ratio". At a stick ratio of 1 (= 100%), the contact region is completely stuck with no slip region. Conversely, at a stick ratio of 0, the entire contact region is a slip region and gross slip is occurring.
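The stick ratio defined above can be computed directly from a per-taxel classification of the contact patch; the boolean masks below are illustrative inputs, not the patent's sensor format:

```python
import numpy as np

def stick_ratio(stick_mask: np.ndarray, contact_mask: np.ndarray) -> float:
    """Ratio of stick region to contact region:
    1.0 = fully stuck, 0.0 = whole contact region slipping (gross slip)."""
    contact_area = int(contact_mask.sum())
    if contact_area == 0:
        return 0.0  # no contact at all: no stick region exists
    return float((stick_mask & contact_mask).sum()) / contact_area
```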
(Example of a method for calculating the contact surface normal and contact point compatible with initial slip detection)
A grasping device 52 such as a robot hand has a plurality of fingertips 1. FIG. 8 shows a configuration example of one fingertip 1. Each fingertip 1 has a plurality of elastic bodies 10 with curvature (a curved surface shape) and a sensor 20, such as a pressure distribution sensor, provided on the bottom surface of each elastic body 10. Based on the detection results of the sensors 20 of each fingertip 1, the detection unit of the grip control device 51 can simultaneously detect the normal-direction displacement of each elastic body 10, the shear-direction displacement of each fingertip 1, and the initial slip of each fingertip 1 occurring while an object is being grasped. In the following, a pressure distribution sensor is used as the sensor 20 for detecting the normal-direction force and displacement acting on each elastic body 10 and the shear-direction displacement of each fingertip. However, the sensor 20 is not limited to a pressure distribution sensor; a force sensor, an optical tactile sensor, a displacement sensor, or the like may also be used. With a force sensor, the detection unit of the grip control device 51 can detect the normal-direction force directly, and can detect the shear-direction force directly and convert it into a shear-direction displacement. With an optical tactile sensor, the detection unit can detect the normal-direction force in the same manner as with a pressure distribution sensor, and can detect the shear-direction displacement directly. With a displacement sensor, the detection unit can detect the normal-direction displacement directly.
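For the pressure-distribution-sensor route, the shear-direction displacement can be estimated from the movement of the fingertip's center of pressure, as described for the detection unit. A minimal sketch, where the taxel grid layout and coordinate arrays are illustrative assumptions:

```python
import numpy as np

def center_of_pressure(p, xs, ys):
    """Pressure-weighted centroid over the taxel grid (illustrative layout)."""
    total = p.sum()
    return np.array([(p * xs).sum() / total, (p * ys).sum() / total])

def shear_displacement(p_before, p_after, xs, ys):
    """Shear-direction displacement estimated as the shift of the
    center of pressure between two pressure-distribution frames."""
    return center_of_pressure(p_after, xs, ys) - center_of_pressure(p_before, xs, ys)
```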
where
Pmax: maximum contact pressure,
F: contact force,
E*: contact elastic modulus,
R*: relative radius of curvature,
δ: normal-direction displacement (indentation depth).
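These symbols match standard Hertzian contact theory for a sphere pressed against a flat surface. The sketch below uses the textbook sphere-on-flat relations as an assumption; the patent's own equations are not reproduced in this excerpt:

```python
import math

def hertz_sphere(force, e_star, r_star):
    """Textbook Hertz sphere-on-flat relations (assumed form):
      F    = (4/3) * E* * sqrt(R*) * delta^(3/2)
      a    = sqrt(R* * delta)          (contact radius)
      Pmax = 3F / (2 * pi * a^2)       (peak contact pressure)
    Returns (delta, a, p_max) for a given contact force F."""
    delta = (3.0 * force / (4.0 * e_star * math.sqrt(r_star))) ** (2.0 / 3.0)
    a = math.sqrt(r_star * delta)
    p_max = 3.0 * force / (2.0 * math.pi * a ** 2)
    return delta, a, p_max
```

Inverting these relations is what would let a detection unit recover the normal displacement δ from a measured peak pressure and contact-surface information.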
(Evaluation example 1: evaluation of contact-surface-normal detection accuracy during grasping)
FIG. 11 schematically shows a first example (Situation 1) of the evaluation environment for contact-surface-normal detection accuracy. FIG. 12 schematically shows a second example (Situation 2). FIG. 13 outlines the robot hand 310 in the evaluation environment of FIG. 12.
- The object 100 is brought into perpendicular contact with a fingertip 1 having a plurality of curved elastic bodies 10 and the same structure as in FIG. 8, after which a shear displacement is applied (FIG. 11, Situation 1).
- A paper cup 401 is grasped by the robot hand 310 so as not to slip, and iron balls 402 are poured into it partway through (FIG. 12, Situation 2).
FIG. 16 schematically shows the evaluation environment for contact-surface-normal detection accuracy with other object shapes. An evaluation tester 200A with basically the same structure as the evaluation tester 200 of FIG. 11 is used; in addition to the Z stage 201 and the translation/rotation stage 202, however, the evaluation tester 200A further includes a rotation stage 203.
In the grip control device 51, the calculation unit calculates the fingertip force (grip force) of each fingertip 1 on the grasped object based on the initial slip of each fingertip 1, the contact surface normal of each fingertip 1, and the contact point of each fingertip 1. The grip control device 51 performs hand control for stably grasping an unknown object based on the calculated fingertip force (grip force).
where
J(qi): Jacobian matrix relating the joint angles qi of each finger of the hand 23 to the fingertip hemisphere center position,
Ai: fingertip center of the hand 23,
O: geometric center of the contact points between the fingertips of the hand 23 and the object 100,
Ci: contact point between each fingertip of the hand 23 and the object 100,
fd: target grip force (fingertip force).
where
Xi, Yi: tangential distances along the contact surface from the geometric center O to the contact point Ci of each fingertip,
eXi, eYi: contact-surface tangential unit vectors,
eZi: contact-surface normal unit vector.
In Equation (6), the first term is the control term in the fingertip normal direction, and the second and third terms are the control terms in the fingertip tangential directions; the second and third terms compensate for the moments each fingertip receives from the object 100.
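The exact form of Equation (6) is not reproduced in this excerpt; the sketch below only mirrors the structure described here, a normal-direction term plus two tangential moment-compensation terms, with an assumed gain `k_m` and the symbols defined above:

```python
import numpy as np

def fingertip_force_command(f_d, e_x, e_y, e_z, x_i, y_i, k_m=1.0):
    """Structural sketch of an Eq.(6)-like command (assumed form, not the
    patent's equation): term 1 acts along the contact normal e_z; terms 2
    and 3 act along the tangents e_x, e_y, scaled by the tangential
    distances X_i, Y_i, to compensate moments received from the object."""
    normal_term = f_d * np.asarray(e_z)
    moment_compensation = k_m * (x_i * np.asarray(e_x) + y_i * np.asarray(e_y))
    return normal_term + moment_compensation
```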
In addition to the above hand control, a method (gravity compensation) for controlling the position and orientation of the object 100 using the contact surface normals and contact points will now be described.
where
Od: target object position,
Kp: position gain,
eXd, eYd, eZd: target object orientation,
KO: orientation gain,
J(Ωi): Jacobian matrix relating each joint angle to the orientation angular-velocity vector at the fingertip hemisphere center position.
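Equations (9) and (10) themselves are not reproduced in this excerpt. A minimal proportional-control sketch consistent with the symbols above; the cross-product form of the orientation error is an assumption:

```python
import numpy as np

def position_control_signal(o_d, o, k_p):
    """Eq.(9)-style term: proportional to the target-position error Od - O."""
    return k_p * (np.asarray(o_d) - np.asarray(o))

def orientation_control_signal(e_d, e, k_o):
    """Eq.(10)-style term: proportional orientation error between the current
    and target orientation unit vectors, here taken via their cross product."""
    return k_o * np.cross(e, e_d)
```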
FIG. 30 schematically shows an example of the robot hand used to evaluate the hand control method by the grip control device 51. FIG. 31 shows simulation results when the hand control method according to the present technology described above is applied to the two-finger, four-degree-of-freedom (pitch-pitch/pitch-pitch) robot hand shown in FIG. 30.
As described above, according to the grip control device of the first embodiment, the contact surface normal of each fingertip 1 with respect to the grasped object is calculated based on the normal-direction displacement of each of the plurality of curved elastic bodies 10 provided on each of the plurality of fingertips 1 contacting the object. This makes it possible to realize stable grasping of an unknown object using the contact surface normals.
(1) Simultaneously detecting the initial slip occurring during object grasping, the contact surface normals, and the contact points makes it possible to grasp an unknown object (with no prior information about the object 100).
(2) Detecting initial slip to prevent slipping, and solving the force/moment balance using the contact surface normals and contact points, enables an unknown object to be grasped stably with the minimum necessary force.
(3) The method can be applied to multi-degree-of-freedom hand control regardless of the hand configuration.
(4) Applicability to multi-degree-of-freedom hands improves grasping stability and expands the range of graspable objects.
The technology according to the present disclosure is not limited to the description of the above embodiment, and various modifications are possible.
According to the present technology with the following configurations, the contact surface normal of each fingertip with respect to a grasped object is calculated based on the normal-direction displacement of each of a plurality of curved elastic bodies provided on each of a plurality of fingertips contacting the object. This makes it possible to realize stable grasping of an unknown object using the contact surface normals.
(1)
A grip control device including:
a detection unit that detects a normal-direction displacement of each of a plurality of elastic bodies having a curved surface shape, provided on each of a plurality of fingertips contacting a grasped object; and
a calculation unit that calculates a contact surface normal of each fingertip with respect to the grasped object based on a detection result of the detection unit.
(2)
The grip control device according to (1) above, wherein the calculation unit calculates the contact surface normal of each fingertip based on a contact surface of each fingertip with the grasped object, calculated from the normal-direction displacement of each elastic body of each fingertip.
(3)
The grip control device according to (2) above, wherein the calculation unit further calculates a contact point between each fingertip and the grasped object based on the contact surface of each fingertip with the grasped object.
(4)
The grip control device according to any one of (1) to (3) above, wherein the detection unit calculates the normal-direction displacement of each elastic body from a normal-direction force acting on each elastic body and contact surface information of each elastic body with the grasped object.
(5)
The grip control device according to (4) above, wherein the detection unit calculates the normal-direction force acting on each elastic body from the contact surface information and a maximum pressure calculated based on a detection result of a pressure distribution sensor provided on a bottom surface of each elastic body.
(6)
The grip control device according to (3) above, wherein the detection unit further detects a shear-direction displacement of each fingertip.
(7)
The grip control device according to (6) above, wherein the detection unit calculates the shear-direction displacement based on movement information of a pressure center position of each fingertip, calculated based on the detection result of a pressure distribution sensor provided on the bottom surface of each elastic body.
(8)
The grip control device according to (6) or (7) above, wherein the detection unit further detects an initial slip of each fingertip occurring during grasping of the grasped object.
(9)
The grip control device according to (8) above, wherein the calculation unit further calculates a fingertip force of each fingertip on the grasped object based on the initial slip, the contact surface normal, and the contact point of each fingertip.
(10)
The grip control device according to (9) above, wherein the calculation unit calculates, as the fingertip force of each fingertip, a fingertip force such that initial slip of each fingertip does not occur.
(11)
The grip control device according to (9) or (10) above, wherein the calculation unit calculates, as the fingertip force of each fingertip, a fingertip force such that a resultant force and a resultant moment become zero, based on the contact surface normal and the contact point of each fingertip.
(12)
The grip control device according to any one of (9) to (11) above, wherein the calculation unit calculates, as the fingertip force of each fingertip, a fingertip force for controlling a position and an orientation of the grasped object, based on the contact surface normal and the contact point of each fingertip.
(13)
The grip control device according to (4) above, wherein the detection unit detects the normal-direction force acting on each elastic body with a force sensor provided on the bottom surface of each elastic body.
(14)
The grip control device according to (4) above, wherein the detection unit calculates the normal-direction force acting on each elastic body from the contact surface information and a maximum pressure calculated based on a detection result of an optical tactile sensor provided on the bottom surface of each elastic body.
(15)
The grip control device according to (1) above, wherein the detection unit detects the normal-direction displacement of each elastic body with a displacement sensor provided on the bottom surface of each elastic body.
(16)
The grip control device according to (6) above, wherein the detection unit detects the shear-direction displacement with an optical tactile sensor provided on the bottom surface of each elastic body.
(17)
The grip control device according to (6) above, wherein the detection unit detects the shear-direction displacement with a force sensor provided on the bottom surface of each elastic body.
(18)
A grip control method including:
detecting a normal-direction displacement of each of a plurality of elastic bodies having a curved surface shape, provided on each of a plurality of fingertips contacting a grasped object; and
calculating a contact surface normal of each fingertip with respect to the grasped object based on a result of the detecting.
Claims (18)
- A grip control device comprising:
a detection unit that detects a displacement in a normal direction of each of a plurality of elastic bodies having a curved surface shape, provided on each of a plurality of fingertips contacting a grasped object; and
a calculation unit that calculates a contact surface normal of each fingertip with respect to the grasped object based on a detection result of the detection unit.
- The grip control device according to claim 1, wherein the calculation unit calculates the contact surface normal of each fingertip based on a contact surface of each fingertip with the grasped object, calculated from the normal-direction displacement of each elastic body of each fingertip.
- The grip control device according to claim 2, wherein the calculation unit further calculates a contact point between each fingertip and the grasped object based on the contact surface of each fingertip with the grasped object.
- The grip control device according to claim 1, wherein the detection unit calculates the normal-direction displacement of each elastic body from a normal-direction force acting on each elastic body and contact surface information of each elastic body with the grasped object.
- The grip control device according to claim 4, wherein the detection unit calculates the normal-direction force acting on each elastic body from the contact surface information and a maximum pressure calculated based on a detection result of a pressure distribution sensor provided on a bottom surface of each elastic body.
- The grip control device according to claim 3, wherein the detection unit further detects a shear-direction displacement of each fingertip.
- The grip control device according to claim 6, wherein the detection unit calculates the shear-direction displacement based on movement information of a pressure center position of each fingertip, calculated based on the detection result of a pressure distribution sensor provided on the bottom surface of each elastic body.
- The grip control device according to claim 6, wherein the detection unit further detects an initial slip of each fingertip occurring during grasping of the grasped object.
- The grip control device according to claim 8, wherein the calculation unit further calculates a fingertip force of each fingertip on the grasped object based on the initial slip, the contact surface normal, and the contact point of each fingertip.
- The grip control device according to claim 9, wherein the calculation unit calculates, as the fingertip force of each fingertip, a fingertip force such that initial slip of each fingertip does not occur.
- The grip control device according to claim 9, wherein the calculation unit calculates, as the fingertip force of each fingertip, a fingertip force such that a resultant force and a resultant moment become zero, based on the contact surface normal and the contact point of each fingertip.
- The grip control device according to claim 9, wherein the calculation unit calculates, as the fingertip force of each fingertip, a fingertip force for controlling a position and an orientation of the grasped object, based on the contact surface normal and the contact point of each fingertip.
- The grip control device according to claim 4, wherein the detection unit detects the normal-direction force acting on each elastic body with a force sensor provided on the bottom surface of each elastic body.
- The grip control device according to claim 4, wherein the detection unit calculates the normal-direction force acting on each elastic body from the contact surface information and a maximum pressure calculated based on a detection result of an optical tactile sensor provided on the bottom surface of each elastic body.
- The grip control device according to claim 1, wherein the detection unit detects the normal-direction displacement of each elastic body with a displacement sensor provided on the bottom surface of each elastic body.
- The grip control device according to claim 6, wherein the detection unit detects the shear-direction displacement with an optical tactile sensor provided on the bottom surface of each elastic body.
- The grip control device according to claim 6, wherein the detection unit detects the shear-direction displacement with a force sensor provided on the bottom surface of each elastic body.
- A grip control method comprising:
detecting a displacement in a normal direction of each of a plurality of elastic bodies having a curved surface shape, provided on each of a plurality of fingertips contacting a grasped object; and
calculating a contact surface normal of each fingertip with respect to the grasped object based on a result of the detecting.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280067963.6A CN118076467A (zh) | 2021-10-15 | 2022-08-22 | 把持控制装置和把持控制方法 |
JP2023554950A JPWO2023062941A1 (ja) | 2021-10-15 | 2022-08-22 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021169412 | 2021-10-15 | ||
JP2021-169412 | 2021-10-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023062941A1 true WO2023062941A1 (ja) | 2023-04-20 |
Family
ID=85987376
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/031538 WO2023062941A1 (ja) | 2021-10-15 | 2022-08-22 | 把持制御装置、および把持制御方法 |
Country Status (3)
Country | Link |
---|---|
JP (1) | JPWO2023062941A1 (ja) |
CN (1) | CN118076467A (ja) |
WO (1) | WO2023062941A1 (ja) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH01316194A (ja) * | 1988-06-16 | 1989-12-21 | Yamatake Honeywell Co Ltd | 滑り検出装置及びロボットハンドの滑り検出装置 |
JP2000254884A (ja) * | 1999-03-10 | 2000-09-19 | Keiogijuku | ハンド又はマニピュレータによる物体把持制御方法 |
JP2017177294A (ja) * | 2016-03-31 | 2017-10-05 | キヤノン株式会社 | ロボット制御装置、ロボット制御方法、ロボットシステムおよびコンピュータプログラム |
WO2020246263A1 (ja) * | 2019-06-05 | 2020-12-10 | ソニー株式会社 | 制御装置および方法、並びに、プログラム |
WO2022039058A1 (ja) * | 2020-08-20 | 2022-02-24 | ソニーグループ株式会社 | 情報処理装置、情報処理方法、およびプログラム |
-
2022
- 2022-08-22 CN CN202280067963.6A patent/CN118076467A/zh active Pending
- 2022-08-22 WO PCT/JP2022/031538 patent/WO2023062941A1/ja active Application Filing
- 2022-08-22 JP JP2023554950A patent/JPWO2023062941A1/ja active Pending
Also Published As
Publication number | Publication date |
---|---|
JPWO2023062941A1 (ja) | 2023-04-20 |
CN118076467A (zh) | 2024-05-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Costanzo et al. | Two-fingered in-hand object handling based on force/tactile feedback | |
Dollar et al. | Joint coupling design of underactuated grippers | |
JP2009269127A (ja) | 把持装置及びその制御方法 | |
Delgado et al. | In-hand recognition and manipulation of elastic objects using a servo-tactile control strategy | |
JP2014018931A (ja) | 制御システム、プログラム及び機械装置の制御方法 | |
CN106994685B (zh) | 一种机械手的手指姿态判断方法及机械手 | |
Costanzo | Control of robotic object pivoting based on tactile sensing | |
US20240009857A1 (en) | Information processing device, information processing method, and program | |
Costanzo et al. | Slipping control algorithms for object manipulation with sensorized parallel grippers | |
León et al. | Robot grasping foundations | |
Jia et al. | Pose and motion from contact | |
JP6003312B2 (ja) | ロボットシステム | |
Mazhitov et al. | Human–robot handover with prior-to-pass soft/rigid object classification via tactile glove | |
WO2023062941A1 (ja) | 把持制御装置、および把持制御方法 | |
Narwal et al. | Study of dynamics of soft contact rolling using multibond graph approach | |
Yussof et al. | Development of optical three-axis tactile sensor and its application to robotic hand for dexterous manipulation tasks | |
Kansal et al. | Tele-operation of an industrial robot by an arm exoskeleton for peg-in-hole operation using immersive environments | |
Lazher et al. | Modeling and analysis of 3D deformable object grasping | |
Yussof et al. | Analysis of tactile slippage control algorithm for robotic hand performing grasp-move-twist motions | |
Kawasaki et al. | Perception and haptic rendering of friction moments | |
Fasoulas et al. | Active control of rolling manoeuvres of a robotic finger with hemispherical tip | |
Nacy et al. | A novel fingertip design for slip detection under dynamic load conditions | |
Jia et al. | Observing pose and motion through contact | |
Fasoulas et al. | Modeling and grasp stability analysis for object manipulation by soft rolling fingertips | |
Cruz-Valverde et al. | Kinematic and dynamic modeling of the phantom premium 1.0 haptic device: Experimental validation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22880638 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023554950 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280067963.6 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 22880638 Country of ref document: EP Kind code of ref document: A1 |