WO2024004622A1 - Robot and robot control method - Google Patents
Robot and robot control method
- Publication number
- WO2024004622A1 (PCT/JP2023/021834)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- hand
- robot
- user
- unit
- handshake
- Prior art date
- 2022-06-29
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
Description
- the present technology relates to a robot and a method of controlling the robot, and particularly relates to a robot that can perform contact interaction and a method of controlling the robot.
- the present technology has been developed in view of this situation, and is intended to increase the sense of security of users, such as those with cognitive decline, when it comes to contact interactions such as handshakes with robots.
- A robot according to one aspect of the present technology includes a hand unit that can shake hands with a user's hand, a tactile detection unit that detects tactile information of the hand unit, and a handshake control unit that controls, based on the tactile information, the grip force of the hand unit while it is shaking hands with the user's hand.
- A robot control method according to one aspect of the present technology detects tactile information of a hand unit that can shake hands with a user's hand, and controls, based on the tactile information, the grip force of the hand unit while it is shaking hands with the user's hand.
- tactile information of a hand unit that can shake hands with a user's hand is detected, and based on the tactile information, the grip force of the hand unit that is shaking hands with the user's hand is controlled.
- FIG. 1 is a diagram for explaining important elements for recognition.
- FIG. 2 is a schematic diagram of an example of the external configuration of a robot to which the present technology is applied.
- FIG. 3 is a schematic diagram of an example of the external appearance and cross-sectional configuration of a hand portion of the robot.
- FIG. 4 is a diagram for explaining initial slippage.
- FIG. 5 is a diagram illustrating a functional configuration example of a handshake execution processing section of the robot.
- FIGS. 6 and 7 are flowcharts for explaining the handshake control process executed by the robot.
- FIG. 8 is a diagram for explaining a method for controlling the grip force of a hand section of the robot.
- FIG. 9 is a diagram for explaining a method of controlling the grip force of a hand section of the robot.
- FIG. 10 is a diagram for explaining a method for controlling the position and posture of a hand section of the robot.
- FIG. 11 is a diagram for explaining a method for controlling the position and posture of a hand section of the robot.
- FIG. 12 is a diagram showing modifications of the cross section of the hand section of the robot.
- FIG. 13 is a block diagram showing an example of the configuration of a computer.
- A sense of security refers to a state without fear, in other words a state free of anxiety or a sense of danger.
- A case in which the other party's actions become unpredictable is, for example, a case in which there is no indication of whether one's own input is being transmitted to the other party. Specifically, for example, a person grips someone's hand for a handshake but receives no reaction at all from the other person. Another case in which the other party's actions become unpredictable is one in which a person does not know what the other party is doing, or does not recognize the other party in the first place. Specifically, for example, a person who is suddenly touched by someone they have not noticed becomes surprised and anxious.
- the following three requirements can be considered to provide a sense of security to the person with whom the robot interacts (hereinafter referred to as the user).
- Requirement 1: The robot responds appropriately to the user's input, the user can predict the robot's next action, and the robot does not pose any threat to the user.
- Requirement 3: The contact area with the user is as large as possible, and the rate of change of the contact surface and of the reaction force from the robot is as small as possible.
- the purpose of this technology is to provide a robot that satisfies Requirements 1 to 3 and can perform contact interactions with a sense of security.
- The robot uses the following four means in order to achieve a touch interaction that gives a sense of security.
- Means 1: Based on initial slippage, the robot instantly detects the change when the user tries to move the hand being shaken, and gently grips back with just enough force to keep the user's hand from slipping, thereby showing the user that their input is being transmitted to the robot.
- Means 2: The robot links the functions of "seeing (gaze control)," "speaking (speech control)," and "touching (touch control)," which are the important elements in cognition shown in FIG. 1, so that its movements become easier to understand and recognize.
- For example, a method called auto-feedback, which is one of the nursing-care techniques, is effective. By performing auto-feedback in accordance with the robot's control and with changes in the observed values, the robot's movements become easier to understand, recognition improves, and the sense of security increases.
- the "seeing” and “touching” functions seem to be independent at first glance, but by making eye contact and “looking” with the other person, the user can clearly recognize the robot.
- the robot can give the user a sense of security by shifting to the "touching" action after confirming that the robot has made eye contact with the user and has been recognized.
- the robot not only looks at the user's eyes, but also looks at the area to be touched (for example, the user's hand in the case of a handshake) when touching, making it easier for the user to recognize what is being touched.
- Means 3: The robot moves the hand section along a trajectory that maximizes the contact area with the hand of the user who is shaking hands.
- Means 4: Based on initial slippage, the robot immediately detects the external force when the user tries to move the hand being shaken, and moves the hand section along a trajectory that reduces the reaction force against that external force.
- This allows the grip force control of Means 1 to transition from the state of gripping back to a state in which the robot itself gently relaxes its force, further enhancing the effect of Means 1.
- the effectiveness can be maximized by using a method that uses initial slippage to detect external force.
- <<Embodiment>> Next, embodiments of the present technology will be described with reference to FIGS. 2 to 11.
- FIG. 2 schematically shows an example of the external appearance of the robot 101.
- the robot 101 includes a head 111, a neck 112, a chest 113, an abdomen 114, a waist 115, a cart 116, an arm 117L, and an arm 117R.
- A head 111, a neck 112, a chest 113, an abdomen 114, and a waist 115 are connected in order from top to bottom, and the waist 115 is placed on the cart 116.
- An arm portion 117L corresponding to the left arm is connected to the left side of the chest 113, and an arm portion 117R corresponding to the right arm is connected to the right side of the chest 113.
- the neck 112 is rotatable, for example, around the roll axis, pitch axis, and yaw axis with respect to the chest 113.
- the chest 113 is rotatable about the pitch axis and the yaw axis with respect to the abdomen 114.
- the abdomen 114 is rotatable about the pitch axis and the yaw axis, for example, relative to the waist 115.
- The waist portion 115 is, for example, rotatable about the yaw axis with respect to the cart 116.
- the head 111 includes a sensor section 121, an eye 122L corresponding to the left eye, and an eye 122R corresponding to the right eye.
- the sensor section 121 is provided near the forehead of the head 111.
- the sensor unit 121 includes, for example, a sensor such as an image sensor that detects the state around the robot 101.
- the sensor unit 121 outputs sensor data indicating the detection results of each sensor.
- the eyes 122L and 122R each include a monitor (not shown).
- the monitor for the eye 122L displays the image of the left eye of the robot 101 and can move the image of the left eye.
- the monitor of the eye 122R displays the image of the right eye of the robot 101 and can move the image of the right eye. This causes the robot 101's line of sight to move and its facial expression to change.
- Hereinafter, when there is no need to distinguish between the eyes 122L and 122R, they will simply be referred to as the eyes 122.
- the arm portion 117L includes a shoulder portion 131L, an upper arm portion 132L, a forearm portion 133L, and a hand portion 134L.
- the shoulder portion 131L, the upper arm portion 132L, the forearm portion 133L, and the hand portion 134L are connected in order so as to extend from the left side of the chest 113.
- the shoulder portion 131L is rotatable around the pitch axis and the roll axis, for example, with respect to the chest 113.
- the upper arm portion 132L is fixed to the shoulder portion 131L.
- the forearm portion 133L is rotatable about the pitch axis and the yaw axis, for example, with respect to the upper arm portion 132L.
- the hand portion 134L is rotatable around the yaw axis relative to the forearm portion 133L, for example.
- the arm 117R includes a shoulder 131R, an upper arm 132R, a forearm 133R, and a hand 134R, and can move in the same way as the arm 117L.
- Hereinafter, when there is no need to distinguish between the arm 117L and the arm 117R, they will simply be referred to as the arm 117.
- Hereinafter, when there is no need to distinguish between the shoulder portion 131L and the shoulder portion 131R, they will simply be referred to as the shoulder portion 131.
- Hereinafter, when there is no need to distinguish between the upper arm portion 132L and the upper arm portion 132R, they will simply be referred to as the upper arm portion 132.
- Hereinafter, when there is no need to distinguish between the forearm portion 133L and the forearm portion 133R, they will simply be referred to as the forearm portion 133.
- Hereinafter, when there is no need to distinguish between the hand portion 134L and the hand portion 134R, they will simply be referred to as the hand portion 134.
- FIG. 3 shows an example of the configuration of the hand section 134L.
- FIG. 3A schematically shows an example of the external appearance of the hand portion 134L.
- B in FIG. 3 schematically shows a cross-sectional configuration example in the width direction of the part 151L of the hand portion 134L.
- the hand portion 134L includes a part 151L, a part 152L, and a yaw axis 153L. Part 151L and part 152L are connected to yaw axis 153L.
- The part 151L is a part corresponding to the palm, back, index finger, middle finger, ring finger, and little finger of a human hand. However, in the part 151L, the fingers are not separated but are integrated.
- Part 152L is a part corresponding to the thumb of a human hand.
- The part 151L includes pitch axes 161L to 163L that extend in the width direction and are parallel to each other. By individually rotating the pitch axes 161L to 163L, the part 151L can be opened and closed to grasp or release an object. Thereby, the hand portion 134L can wrap the user's palm with the part 151L, and has the degree of freedom to contact the user's palm side and back side simultaneously and apply force to both.
- the part 151L and the part 152L can rotate together around the yaw axis 153L.
- the part 151L includes a base portion 171L, a tactile sensor 172L, and an elastic body 173L.
- The surface of the base portion 171L is covered with the tactile sensor 172L.
- The surface of the tactile sensor 172L is covered with the elastic body 173L.
- The base portion 171L is made of metal, for example, and constitutes the main body of the hand portion 134L.
- the tactile sensor 172L detects a tactile sensation (for example, one or more of a contact sensation, a pressure sensation, a distributed pressure sensation, a force sensation, and a slip sensation) for (the part 151L of) the hand portion 134L, and outputs sensor data indicating the detection result.
- the elastic body 173L is made of a flexible and elastic material, such as a flexible gel material, that is close to the softness of human skin. As a result, when the user shakes hands with the hand portion 134L, a feeling similar to shaking hands with a human being can be obtained, and initial slippage is more likely to occur.
- the initial slip is a phenomenon that is a precursor to slip.
- Initial slip is, for example, a phenomenon in which, when two objects are in contact and one of them starts to move, only part of the contact surface of that object begins to slide against the contact surface of the other object. For example, when the user attempts to move a hand that is being shaken, only a portion of the palm-side contact surface of the user's hand begins to slide against the palm-side contact surface of the hand portion 134L.
- A to C in FIG. 4 schematically show initial slippage.
- Area A1 indicates a slip area where slipping of the user's hand on the contact surface of the hand portion 134L of the robot 101 is detected.
- Area A2 indicates a stick area where the user's hand is fixed without moving on the contact surface of the hand portion 134L.
- the arrow in the figure indicates the direction in which the user moves his or her hand.
- the entire contact surface of the user's hand does not begin to move at once relative to the hand portion 134L, but only a portion of it begins to slide. That is, initial slippage occurs. Then, the slip area A1 gradually becomes larger.
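Purely as an illustration (not part of the publication), the progress of initial slip on a distributed tactile sensor can be quantified by classifying contact cells into the slip area A1 and the stick area A2 and computing a stick ratio; the cell-level slip test below, which thresholds the change in each cell's shear displacement, is an assumption.

```python
import numpy as np

def stick_ratio(shear_prev, shear_curr, contact_mask, slip_thresh=0.05):
    """Estimate the stick ratio (stick area / contact area) of the contact patch.

    shear_prev, shear_curr: (H, W, 2) arrays of per-cell shear displacement
        vectors from a distributed tactile sensor [mm].
    contact_mask: (H, W) boolean array, True where a cell is in contact.
    slip_thresh: per-cycle displacement change [mm] above which a cell is
        treated as slipping (illustrative value, not from the publication).
    """
    delta = np.linalg.norm(shear_curr - shear_prev, axis=-1)
    contact_cells = np.count_nonzero(contact_mask)
    if contact_cells == 0:
        return 1.0  # no contact, so nothing is slipping
    slip_cells = np.count_nonzero(contact_mask & (delta > slip_thresh))
    return 1.0 - slip_cells / contact_cells
```

A stick ratio of 1.0 means no slip; as initial slip progresses the ratio falls toward 0, which can serve as a trigger for the grip-back behavior described later.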
- the robot 101 controls the movement of the hand portion 134L during a handshake based on the initial slippage.
- the hand portion 134R includes a part 151R, a part 152R, and a yaw axis 153R.
- The part 151R includes pitch axes 161R to 163R, a base portion 171R, a tactile sensor 172R, and an elastic body 173R.
- the hand portion 134R can move in the same manner as the hand portion 134L.
- Hereinafter, when there is no need to distinguish between the part 151L and the part 151R, they will simply be referred to as the part 151.
- Hereinafter, when there is no need to distinguish between the part 152L and the part 152R, they will simply be referred to as the part 152.
- Hereinafter, when there is no need to distinguish between the yaw axis 153L and the yaw axis 153R, they will simply be referred to as the yaw axis 153.
- Hereinafter, when there is no need to distinguish between the pitch axis 161L and the pitch axis 161R, they will simply be referred to as the pitch axis 161.
- Hereinafter, when there is no need to distinguish between the pitch axis 162L and the pitch axis 162R, they will simply be referred to as the pitch axis 162.
- Hereinafter, when there is no need to distinguish between the pitch axis 163L and the pitch axis 163R, they will simply be referred to as the pitch axis 163.
- Hereinafter, when there is no need to distinguish between the base portion 171L and the base portion 171R, they will simply be referred to as the base portion 171.
- Hereinafter, when there is no need to distinguish between the tactile sensor 172L and the tactile sensor 172R, they will simply be referred to as the tactile sensor 172.
- Hereinafter, when there is no need to distinguish between the elastic body 173L and the elastic body 173R, they will simply be referred to as the elastic body 173.
- FIG. 5 shows a functional configuration example of the handshake execution processing unit 201 that executes processing related to handshaking, which is one type of contact communication of the robot 101.
- The handshake execution processing unit 201 includes a handshake command unit 211, a line-of-sight detection unit 212, a tactile detection unit 213, a handshake state management unit 214, a line-of-sight control unit 215, a display control unit 216, a handshake control unit 217, a motion control unit 218, a speech control unit 219, and an audio output unit 220.
- the handshake command unit 211 gives a handshake execution command to the handshake state management unit 214 according to the situation around the robot 101, etc.
- the line-of-sight detection unit 212 detects the user's line-of-sight direction based on sensor data (for example, image data, etc.) from the sensor unit 121.
- the line-of-sight detection unit 212 supplies the line-of-sight control unit 215 with user line-of-sight information indicating the detection result of the user's line-of-sight direction.
- The line-of-sight detection unit 212 also controls the speech control unit 219 by giving it commands.
- The tactile detection unit 213 detects tactile information for (the part 151 of) the hand unit 134 of the robot 101 based on sensor data from the tactile sensor 172, and supplies it to the handshake state management unit 214, the handshake control unit 217, and the speech control unit 219.
- the tactile information includes, for example, the contact state (for example, presence or absence of contact, contact position, etc.), the amount of shear deformation, the grip force applied from the outside, and the like.
- the amount of shear deformation is the amount of deformation in the shear direction, which is the direction in which the elastic body 173 on the surface of the hand portion 134 is displaced in the plane direction.
- the handshake state management unit 214 manages the handshake state of the robot 101.
- The handshake state management unit 214 detects the handshake state of the robot 101 based on commands from the handshake command unit 211, the tactile information, line-of-sight state information from the line-of-sight control unit 215, and motion state information from the motion control unit 218.
- The handshake state management unit 214 also controls the handshake state of the robot 101 by giving commands to the line-of-sight control unit 215, the handshake control unit 217, and the speech control unit 219, and by notifying them of the handshake state.
- the handshake state of the robot 101 includes, for example, the position and posture of the hand unit 134 with respect to the user's hand, the grip strength of the hand unit 134, the line of sight direction of the robot 101, and the speaking state of the robot 101.
- The line-of-sight control unit 215 controls the line-of-sight direction of the robot 101 by giving commands to the display control unit 216 and the motion control unit 218 based on the user's line-of-sight information from the line-of-sight detection unit 212 and commands from the handshake state management unit 214.
- the line-of-sight control unit 215 supplies the handshake state management unit 214 with line-of-sight status information indicating the line-of-sight status of the user and the robot 101 (for example, the relative relationship of line-of-sight between the robot 101 and the user).
- the display control unit 216 controls the display of the eye image displayed on the monitor provided in the eye 122 of the robot 101 based on the command from the line of sight control unit 215.
- the handshake control unit 217 gives a command to the motion control unit 218 based on the tactile information and the command from the handshake state management unit 214, and controls the handshake motion by the hand unit 134.
- the handshake control section 217 includes an initial grip force setting section 231 , a grip force control section 232 , and a handshake position/posture control section 233 .
- the initial grip force setting unit 231 sets an initial grip force, which is an initial value of the grip force of the hand unit 134 when shaking hands, based on the tactile information.
- The grip force control unit 232 controls the grip force of the hand unit 134 by giving commands to the motion control unit 218 based on the tactile information, commands from the handshake state management unit 214, and the initial grip force set by the initial grip force setting unit 231.
- the handshake position/posture control section 233 gives a command to the motion control section 218 based on the tactile information and the command from the handshake state management section 214 to control the position and posture of (the handshake by) the hand section 134 .
- The motion control unit 218 controls the movement, position, and posture of the robot 101 by driving the actuators of each part of the robot 101 and controlling the joints of each part and the cart 116 based on commands from the line-of-sight control unit 215 and the handshake control unit 217.
- the motion control section 218 supplies motion state information indicating the motion state including the movement, position, and posture of the robot 101 to the handshake state management section 214 .
- The motion control unit 218 controls the speech control unit 219 by giving it commands.
- the audio output unit 220 includes, for example, an audio output device such as a speaker.
- The audio output unit 220 outputs uttered audio under the control of the speech control unit 219, for example.
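As a hypothetical sketch only (all class and attribute names are assumptions, not from the publication), the wiring of the units 211 to 220 described above can be summarized as:

```python
class HandshakeExecutionProcessor:
    """Hypothetical wiring of the functional units 211-220 described above."""

    def __init__(self, gaze_detect, tactile_detect, gaze_ctrl, display_ctrl,
                 handshake_ctrl, motion_ctrl, speech_ctrl, audio_out):
        self.gaze_detect = gaze_detect        # line-of-sight detection unit 212
        self.tactile_detect = tactile_detect  # tactile detection unit 213
        self.gaze_ctrl = gaze_ctrl            # line-of-sight control unit 215
        self.display_ctrl = display_ctrl      # display control unit 216
        self.handshake_ctrl = handshake_ctrl  # handshake control unit 217
        self.motion_ctrl = motion_ctrl        # motion control unit 218
        self.speech_ctrl = speech_ctrl        # speech control unit 219
        self.audio_out = audio_out            # audio output unit 220
        self.handshake_state = "idle"         # handshake state management unit 214

    def on_handshake_command(self):
        # The handshake command unit 211 triggers the handshake control process;
        # the state management then drives gaze, handshake, and speech control.
        self.handshake_state = "started"
```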
- Next, the handshake control process executed by the robot 101 will be described. This process is started when the handshake command unit 211 gives the handshake state management unit 214 a command to shake hands with the user.
- In step S1, the robot 101 moves to the front of the user's field of vision.
- the handshake state management unit 214 instructs the line of sight control unit 215 to prompt the user to face the robot 101.
- The line-of-sight control unit 215 drives the cart 116 via the motion control unit 218 based on the user's line-of-sight information from the line-of-sight detection unit 212, and moves the robot 101 to a position in front of the user's field of view.
- In step S2, the line-of-sight control unit 215 determines whether the user is looking toward the robot 101 based on the user's line-of-sight information from the line-of-sight detection unit 212. If it is determined that the user is not looking at the robot 101, the process advances to step S3.
- In step S3, the robot 101 adjusts the position of the robot 101 and the position of the hand unit 134 so that the hand unit 134 is within the user's field of vision.
- the line of sight control unit 215 drives the cart 116 via the motion control unit 218 to move the robot 101 to a position where the hand unit 134 is within the user's field of vision.
- the line of sight control section 215 drives the arm section 117 via the motion control section 218 and adjusts the position of the hand section 134 so that it is within the user's field of vision.
- In step S4, the robot 101 raises the hand section 134 to the level of the eyes 122 of the robot 101.
- the line of sight control section 215 drives the arm section 117 via the motion control section 218 to raise the hand section 134 to the level of the eye 122.
- the user's line of sight is guided to match the line of sight of the robot 101.
- After that, the process returns to step S2, and the processes of steps S2 to S4 are repeatedly executed until it is determined in step S2 that the user is looking at the robot 101.
- On the other hand, if the line-of-sight control unit 215 determines in step S2 that the user is looking at the robot 101, it notifies the handshake state management unit 214 that the user is looking at the robot 101. After that, the process proceeds to step S5.
- In step S5, the robot 101 approaches the user after attracting the user's attention.
- the handshake state management unit 214 instructs the line-of-sight control unit 215 and the speech control unit 219 to approach the user and make eye contact with the user.
- the line of sight control unit 215 controls the hand unit 134 and the like via the motion control unit 218 to cause the robot 101 to perform an action of waving at the user.
- The speech control unit 219 causes the audio output unit 220 to output an utterance that calls out to the user, such as "I'm going to see you now."
- The line-of-sight control unit 215 drives the cart 116 via the motion control unit 218 to bring the robot 101 closer to the user.
- In step S6, the robot 101 makes eye contact with the user.
- The line-of-sight control unit 215 drives each part of the robot 101 via the motion control unit 218 to adjust the posture of the robot 101 as necessary, and moves the eye images via the display control unit 216, thereby adjusting the direction of the line of sight of the robot 101 so that it aligns with the line of sight of the user.
- At this time, it is desirable that the eyes 122 of the robot 101 be at the same height as the user's eyes or at a position lower than the user's eyes.
- In step S7, the robot 101 moves into a position for holding out its hand and shaking hands.
- the handshake state management unit 214 instructs the line of sight control unit 215, handshake control unit 217, and speech control unit 219 to start shaking hands with the user.
- the handshake position/posture control section 233 moves the robot 101 so as to hold out the hand section 134 to the user and to shake hands with the user by driving the arm section 117 and the like via the motion control section 218 .
- In step S8, the robot 101 alternately looks at the hand unit 134 and the user while calling out to the user.
- The speech control unit 219 informs the user that they are about to shake hands by causing the audio output unit 220 to output speech such as "I'll shake your hand" or "Please hold my hand."
- The line-of-sight control unit 215 alternates the line of sight of the robot 101 between the direction of the hand unit 134 and the direction of the user by moving the head 111 via the motion control unit 218 and moving the eye images via the display control unit 216.
- In step S9, the handshake state management unit 214 determines whether the user's hand has touched the hand unit 134 based on the tactile information from the tactile detection unit 213. If it is determined that the user's hand is not in contact with the hand portion 134, the process returns to step S8.
- step S8 and step S9 are repeatedly executed until it is determined in step S9 that the user's hand has contacted the hand portion 134.
- On the other hand, if it is determined in step S9 that the user's hand has touched the hand portion 134, the process proceeds to step S10.
- In step S10, the robot 101 detects the user's grip strength while informing the user that it is shaking hands, and sets the initial grip strength. Specifically, the handshake state management unit 214 notifies the line-of-sight control unit 215, the handshake control unit 217, and the speech control unit 219 that the handshake has started.
- The speech control unit 219 informs the user that they are shaking hands by causing the audio output unit 220 to output a speech such as "I'm shaking your hand."
- the initial grip strength setting unit 231 sets the initial grip strength based on the user's grip strength included in the tactile information from the tactile detection unit 213. For example, the initial grip strength setting unit 231 sets the initial grip strength to approximately the same grip strength as the user's grip strength when the handshake is started.
- The grip force control unit 232 drives the hand unit 134 via the motion control unit 218 so as to grasp the user's hand with the initial grip force.
- the robot 101 immediately grips the user's hand back with approximately the same grip force. This can give the user a sense of security.
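As a minimal sketch (the function name and the safety cap value are assumptions, not from the publication), the initial grip force setting just described could look like:

```python
def set_initial_grip_force(user_grip_force, max_safe_grip_force=15.0):
    """Mirror the user's grip force measured at the start of the handshake,
    clamped to a preset safe maximum (the 15 N cap is an illustrative value)."""
    return min(max(user_grip_force, 0.0), max_safe_grip_force)
```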
- In step S11, the robot 101 controls the movement of the hand section 134 in accordance with the movement of the user's hand.
- Here, details of the process of step S11 will be explained with reference to FIGS. 8 to 11.
- For example, when the user tries to move the hand being shaken, initial slippage occurs at the contact surface between the hand portion 134 and the user's hand.
- When the robot 101 detects initial slippage, it controls the gripping force of the hand unit 134 so as to gently grasp the user's hand with just enough force to prevent the user's hand from slipping.
- FIG. 8 schematically shows an example of a physical model of elastic contact. Specifically, a cross section of a portion where a user's hand 301 and a finger 302 of a general robot hand are in contact is schematically shown.
- the finger 302 is one of a plurality of fingers of the robot's hand, and the surface of each finger is covered with an elastic material.
- A in FIG. 8 shows the state before the hand 301 moves, and B in FIG. 8 shows the state when the hand 301 moves in the direction of the arrow.
- Here, the radius of the contact surface of the finger 302 is a, and the radius of the stick (fixed) area is c.
- the shear direction of the contact surface of the finger 302 will be referred to as the x direction, and the normal direction will be referred to as the z direction.
- F_x indicates the shear force applied by the hand 301 to the finger 302 in the shear direction (x direction).
- F_z indicates the normal force applied by the hand 301 to the finger 302 in the normal direction (z direction).
- μ indicates the coefficient of friction between the hand 301 and the finger 302.
- u_x indicates the amount of deformation of the finger 302 in the shear direction.
- α(μ, F_z) is a function of the friction coefficient μ and the normal force F_z that simplifies the expression.
- the left side (c/a) of equation (1) indicates the proportion of the sticking area in the contact surface, and is called the sticking rate. Therefore, the initial slip can be quantified by a physical quantity called sticking rate.
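Equation (1) itself is not reproduced in this text. For reference, a standard Cattaneo–Mindlin result for elastic contact under Coulomb friction, written with the variables defined above, takes the form below; this is a hedged reconstruction, and the exact expression in the original publication may differ.

```latex
% Hedged reconstruction of the relation referenced as equation (1):
% stick ratio and shear deformation for elastic contact with Coulomb friction.
\begin{align}
  \frac{c}{a} = \left(1 - \frac{F_x}{\mu F_z}\right)^{1/3},
  \qquad
  u_x = \alpha(\mu, F_z)\left\{1 - \left(\frac{c}{a}\right)^{2}\right\}
\end{align}
```

With F_x = 0 the stick ratio c/a equals 1 (no slip), and it falls toward 0 as F_x approaches μF_z, that is, as gross slip is approached.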
- u_r indicates the component in an arbitrary translational direction r of the displacement caused by the shear-direction deformation of each finger (hereinafter referred to as the translational shear displacement amount), and u_r_ref indicates the reference value for the translational shear displacement amount of each finger (hereinafter referred to as the translational shear displacement reference value).
- u_θ indicates the component in the rotational direction θ of the displacement caused by the shear-direction deformation of each finger (hereinafter referred to as the rotational shear displacement amount), and u_θ_ref indicates the reference value for the rotational shear displacement amount of each finger (hereinafter referred to as the rotational shear displacement reference value).
- Σu_r represents the sum of the translational shear displacement amounts u_r of each finger.
- Σu_r_ref represents the sum of the translational shear displacement reference values u_r_ref of each finger.
- Σu_θ represents the sum of the rotational shear displacement amounts u_θ of each finger.
- Σu_θ_ref represents the sum of the rotational shear displacement reference values u_θ_ref of each finger.
- K_pr, K_ir, K_dr, K_pθ, K_iθ, and K_dθ each indicate a gain of the PID control.
- f_r indicates the grip force calculated based on the shear displacement amount that occurred in the translational direction of each finger of the hand section, and f_θ indicates the grip force calculated based on the shear displacement amount that occurred in the rotational direction of each finger of the hand section.
- The grip force f_d is calculated by applying PID (Proportional-Integral-Differential) control using as input the vector sum of the shear-direction deformation amounts u_x detected by each finger of the robot.
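Equations (2) to (4) are likewise not reproduced in this text. Based on the variable and gain definitions above and on the block diagram of FIG. 9, a plausible reconstruction is shown below; in particular, combining f_r and f_θ by simple addition in equation (4) is an assumption.

```latex
% Hedged reconstruction of the PID grip-force law referenced as equations (2) to (4),
% with e_r and e_theta denoting the summed shear-displacement errors.
\begin{align}
  f_r &= K_{pr}\,e_r + K_{ir}\!\int e_r\,dt + K_{dr}\,\dot{e}_r,
      \qquad e_r = \Sigma u_r - \Sigma u_{r\_ref} \tag{2}\\
  f_\theta &= K_{p\theta}\,e_\theta + K_{i\theta}\!\int e_\theta\,dt + K_{d\theta}\,\dot{e}_\theta,
      \qquad e_\theta = \Sigma u_\theta - \Sigma u_{\theta\_ref} \tag{3}\\
  f_d &= f_r + f_\theta \tag{4}
\end{align}
```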
- FIG. 9 shows a configuration example of the grip force control section 331 that implements the grip force control method shown in equations (2) to (4).
- The grip force control section 331 includes a reference generation section 341, a calculation section 342, a grip force calculation section 343, a calculation section 344, a torque calculation section 345, an actuator control section 346, a hand section 347, an LPF (Low Pass Filter) 348, and a calculation section 349.
- The reference generation unit 341 generates the translational shear displacement reference value u_r_ref and the rotational shear displacement reference value u_θ_ref of each finger based on the translational shear displacement amount u_ri and the rotational shear displacement amount u_θi of each finger detected by the hand unit 347.
- The reference generation unit 341 supplies the calculation unit 342 with information indicating the translational shear displacement reference value u_r_ref and the rotational shear displacement reference value u_θ_ref.
- The calculation unit 342 calculates the difference (Σu_r − Σu_r_ref) between the sum of the translational shear displacement amounts u_r of each finger of the hand unit 347 and the sum of the translational shear displacement reference values u_r_ref, and supplies information indicating the calculation result to the grip force calculation unit 343.
- The calculation unit 342 also calculates the difference (Σu_θ − Σu_θ_ref) between the sum of the rotational shear displacement amounts u_θ of each finger of the hand unit 347 and the sum of the rotational shear displacement reference values u_θ_ref, and supplies information indicating the calculation result to the grip force calculation unit 343.
- The grip force calculation unit 343 computes the above-mentioned equations (2) to (4) to calculate the grip force f_d of the hand unit 347 in the shear direction, and supplies information indicating the calculation result to the calculation unit 344.
- The calculation unit 344 calculates the grip force by adding the initial grip force of the hand unit 347 and the grip force f_d in the shear direction, and supplies information indicating the calculation result to the torque calculation unit 345. This grip force becomes the grip force applied by the hand section 347 to the user's hand.
- The torque calculation unit 345 calculates the torque τ_i for driving the joints of each finger so that the hand unit 347 grips with the calculated grip force, based on the grip force of the hand unit 347, the joint angle q_i of each finger, and the Jacobian matrix for the fingertip of each finger, and supplies information indicating the calculation result to the actuator control unit 346.
- The actuator control unit 346 drives each actuator that drives each finger joint of the hand unit 347 using the torque τ_i calculated by the torque calculation unit 345.
- The hand unit 347 detects the translational shear displacement amount u_ri and the rotational shear displacement amount u_θi of each finger, and supplies a signal indicating the detection results (hereinafter referred to as a shear displacement amount signal) to the reference generation unit 341 and the LPF 348.
- The hand section 347 also supplies a signal indicating the angle q_i of each finger joint (hereinafter referred to as a joint angle signal) to the torque calculation section 345 and the actuator control section 346.
- the LPF 348 reduces high frequency noise in the shear displacement amount signal and supplies the shear displacement amount signal after the high frequency noise reduction to the calculation unit 349.
- The calculation unit 349 calculates the sum Σu_r of the translational shear displacement amounts u_ri and the sum Σu_θ of the rotational shear displacement amounts u_θi of each finger based on the shear displacement amount signal, and supplies information indicating the calculation results to the calculation unit 342.
- In this way, the joints of each finger are driven so that the hand portion 347 grips with the grip force calculated from the shear displacement.
- The grip force control section 232 controls the grip force of the hand section 134 using the same method as the grip force control section 331 in FIG. 9. That is, when initial slippage is detected by the tactile detection section 213, the grip force control section 232 calculates the grip force of the hand section 134 based on the amount of shear deformation of the hand section 134 and the initial grip force set by the initial grip force setting section 231, so that the amount of shear deformation of the hand section 134 caused by the initial slippage approaches zero. The grip force control section 232 then drives the hand section 134 via the motion control section 218 so as to grasp the user's hand with the calculated grip force.
- the maximum value of the gripping force of the hand unit 134 is set in advance, and the gripping force of the hand unit 134 is continuously controlled within a range that does not cause harm to the user.
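The following Python sketch illustrates one control cycle corresponding to FIG. 9 and the description above; the gains, the safety clamp, and the Jacobian-transpose torque mapping are assumptions made for illustration, not values or details taken from the publication.

```python
import numpy as np

class PID:
    """Minimal PID controller on a scalar error (one of equations (2) or (3))."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def grip_force_step(pid_r, pid_t, sum_u_r, sum_u_r_ref, sum_u_t, sum_u_t_ref,
                    initial_grip, max_grip):
    """One cycle of the grip-force control: PID on the translational and rotational
    shear-displacement errors, added to the initial grip force and clamped to a
    preset safe maximum."""
    f_r = pid_r.step(sum_u_r - sum_u_r_ref)   # translational term, cf. eq. (2)
    f_t = pid_t.step(sum_u_t - sum_u_t_ref)   # rotational term, cf. eq. (3)
    f_d = f_r + f_t                           # combined shear-direction term, cf. eq. (4)
    return float(np.clip(initial_grip + f_d, 0.0, max_grip))

def fingertip_torques(grip_force, fingertip_jacobians):
    """Map the grip force to joint torques of each finger via the Jacobian
    transpose (a common choice; assumed here, as the publication does not
    detail it). Each Jacobian is 3 x n_joints, and the force is assumed to act
    along the fingertip's closing direction (local z axis)."""
    force = np.array([0.0, 0.0, grip_force])
    return [J.T @ force for J in fingertip_jacobians]
```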
- the handshake position/posture control unit 233 calculates the trajectory of the hand unit 134 so as to cancel the external force caused by the initial slippage of the user's hand, and moves the hand unit 134 along the calculated trajectory via the motion control unit 218.
- The handshake position/posture control unit 233 also adjusts the position and posture of the hand unit 134 via the motion control unit 218 so that the contact area between the hand unit 134 of the robot 101 and the user's hand is as large as possible.
- FIG. 10A schematically shows the positional relationship between the contact surface (palm side) of the hand section 134 and the contact surface (palm side) of the user's hand 301 before adjusting the posture of the hand section 134.
- B in FIG. 10 schematically shows the positional relationship between the contact surface of the hand section 134 and the contact surface of the user's hand 301 after the posture of the hand section 134 has been adjusted.
- Arrow L1 indicates the direction of the normal to the contact surface of hand portion 134.
- the posture of the hand section 134 is adjusted so that the normal L1 of the contact surface of the hand section 134 is approximately perpendicular to the contact surface of the user's hand 301.
- As a result, the contact surface of the hand portion 134 and the contact surface of the user's hand 301 face each other substantially in parallel.
- FIG. 11A schematically shows the positional relationship between the contact surface (palm side) of the hand section 134 and the contact surface (palm side) of the user's hand 301 before the position of the hand section 134 is adjusted.
- B in FIG. 11 schematically shows the positional relationship between the contact surface of the hand section 134 and the contact surface of the user's hand 301 after the position of the hand section 134 has been adjusted.
- the position of the hand section 134 is adjusted so that the contact surface of the hand section 134 and the contact surface of the user are aligned. That is, the position of the hand section 134 is adjusted so that the area where the contact surface of the hand section 134 and the contact surface of the user's hand 301 overlap is as large as possible.
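As an illustration only, the adjustments of FIGS. 10 and 11 can be expressed as rotating the hand section so its contact-surface normal is anti-parallel to the user's palm normal (the surfaces then face each other roughly in parallel) and translating it so the centers of the two contact surfaces coincide (maximizing overlap); the vector math below is a sketch under those assumptions, not the publication's algorithm.

```python
import numpy as np

def align_hand_to_palm(hand_normal, palm_normal, hand_center, palm_center):
    """Return (axis, angle, translation) that align the hand section with the
    user's palm: the rotation makes the hand's contact-surface normal point
    toward the palm (anti-parallel to the palm normal), and the translation
    moves the hand's contact-surface center onto the palm's center."""
    n_h = np.asarray(hand_normal, float)
    n_h = n_h / np.linalg.norm(n_h)
    n_p = np.asarray(palm_normal, float)
    n_p = n_p / np.linalg.norm(n_p)
    target = -n_p                              # face the palm
    axis = np.cross(n_h, target)
    sin_a = np.linalg.norm(axis)
    cos_a = float(np.dot(n_h, target))
    angle = float(np.arctan2(sin_a, cos_a))    # rotation angle for the hand section
    axis = axis / sin_a if sin_a > 1e-9 else np.array([0.0, 0.0, 1.0])
    translation = np.asarray(palm_center, float) - np.asarray(hand_center, float)
    return axis, angle, translation
```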
- In step S12, the robot 101 calls out to the user depending on the strength of the user's grip.
- For example, the speech control unit 219 causes the audio output unit 220 to output speech whose content changes depending on the strength of the user's grip detected by the tactile detection unit 213, such as "You have a strong grip today," "I'm glad you look healthy," or "How are you feeling?"
- In step S13, the handshake state management unit 214 determines whether to continue the handshake.
- For example, while the user is grasping the hand portion 134, the user's fingertips are in contact with the back side of the hand portion 134, and when the user releases the hand, the user's fingertips separate from the back side of the hand portion 134.
- Therefore, if the tactile detection unit 213 detects contact with the back side of the hand portion 134, the handshake state management unit 214 determines to continue the handshake, and the process returns to step S11.
- On the other hand, if the tactile detection unit 213 does not detect contact with the back side of the hand portion 134 in step S13, the handshake state management unit 214 determines that the handshake is to be ended, and the process proceeds to step S14.
- In step S14, the robot 101 opens its hand while calling out to the user.
- the handshake state management section 214 gives a command to the line of sight control section 215, the handshake control section 217, and the speech control section 219 to end the handshake.
- The speech control section 219 notifies the user of the end of the handshake by causing the audio output section 220 to output a speech such as "Let go of your hand."
- the handshake position/posture control section 233 causes the robot 101 to perform an action of opening its hand by opening the hand section 134 via the motion control section 218 .
- In step S15, the handshake state management unit 214 determines whether the user's hand has been released. If the tactile detection unit 213 detects contact with the hand unit 134, the handshake state management unit 214 determines that the user's hand has not left the hand unit 134, and repeats the determination process in step S15. On the other hand, if the tactile detection unit 213 does not detect contact with the hand unit 134, the handshake state management unit 214 determines that the user's hand has been released, and the process proceeds to step S16.
- In step S16, the robot 101 returns to its original posture while calling out to the user.
- the handshake state management section 214 notifies the line of sight control section 215, the handshake control section 217, and the speech control section 219 that the handshake has ended.
- For example, the speech control section 219 causes the audio output section 220 to output a speech such as "I'm going to lower my hand," informing the user that the hand section 134 will be returned to its original position.
- the handshake position/posture control section 233 drives each part of the robot 101 via the motion control section 218 to return the posture of the robot 101 to its original state.
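The flow of steps S1 to S16 described above can be summarized by the hypothetical outline below; every method name is an assumption and simply restates the corresponding step.

```python
def handshake_control_process(robot):
    """Hypothetical outline of the handshake control process (steps S1 to S16)."""
    robot.move_into_user_field_of_view()                 # S1
    while not robot.user_is_looking_at_robot():          # S2
        robot.bring_hand_into_user_view()                # S3
        robot.raise_hand_to_eye_level()                  # S4
    robot.approach_user_while_greeting()                 # S5
    robot.make_eye_contact()                             # S6
    robot.move_into_handshake_position()                 # S7
    while not robot.user_hand_touching():                # S9
        robot.look_alternately_at_hand_and_user()        # S8
    robot.set_initial_grip_to_user_grip()                # S10
    while robot.back_of_hand_touched():                  # S13
        robot.track_user_hand_with_grip_and_trajectory() # S11
        robot.speak_according_to_grip_strength()         # S12
    robot.open_hand_with_announcement()                  # S14
    while robot.hand_still_touched():                    # S15
        pass
    robot.return_to_original_posture_with_announcement() # S16
```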
- In the manner described above, a contact interaction close to a handshake between people is realized, and the user's sense of security regarding contact interaction by the robot 101 can be increased. For example, the following effects are brought about.
- The robot 101 immediately detects, based on initial slippage, the change when the user tries to move the hand being shaken, and gently grips back with just enough force to keep the hand from slipping, thereby showing the user that the user's movement (input) is being transmitted and giving the user a sense of security.
- the movements of the robot 101 become easier to understand and recognize, giving the user a sense of security.
- the hand section 134 moves in a trajectory that cancels out the external force caused by the user's initial slippage. As a result, the hand section 134 moves under trajectory control that reduces the rate of change and reaction force of the hand section 134, thereby eliminating the user's sense of danger.
- Furthermore, the robot 101 can transition from a state of gripping the user's hand back to a state of gently relaxing its own force.
- The user's sense of security thereby increases.
- As a result, the effect of increasing the user's sense of security is enhanced.
- A to E in FIG. 12 show modified examples of the cross-sectional configuration of the part 151 of the hand portion 134.
- In each of A to E in FIG. 12, the upper side shows the palm side of the part 151, and the lower side shows the back side of the part 151.
- the tactile sensor 172 and the elastic body 173 may not be provided on at least one side of the part 151.
- the elastic body 173 may be formed into a mountain shape that becomes thicker toward the center.
- the elastic body 173 may have a shape in which a plurality of chevrons are connected on the palm side and back side of the part 151.
- the tactile sensor 172 and the elastic body 173 may be provided only on the palm side of the part 151.
- the part 152L of the hand portion 134 may also be provided with a tactile sensor and an elastic body, or may be provided with a joint.
- the configuration of the hand section 134 is not limited to the example described above, and can be changed. For example, it is possible to change the number of fingers of the hand section 134, as well as the number and position of joints.
- For example, the robot 101 may use time-series state information about the user; this information is assumed to include the user's grip strength, facial expression, vital values (body temperature, heart rate, blood pressure, SpO2, etc.), speech content, line of sight, and movement.
- time-series state information may be used to control the initial grip strength and grip strength of the hand portion 134, as well as the position and posture of the hand portion 134 during handshaking.
- a six-axis force sensor provided at the wrist of the arm 117 of the robot 101, a torque sensor at each joint, etc. may be used to detect the external force applied to the hand 134.
- the robot 101 may also combine these sensors to distinguish between external forces caused by handshakes and other external forces to achieve safer contact interactions.
- the robot 101 may control the temperature of the surface of the hand section 134 in order to achieve a touch interaction that gives a sense of security.
- For example, the hand section 134 may be provided with a temperature sensor and a heating/cooling element, and the temperature of the surface of the hand section 134 may initially be set slightly higher than that of human skin and then controlled to match the temperature of the user's hand. By setting the initial temperature slightly higher than that of human skin, the user can more easily feel the contact with the hand section 134. Further, for example, the robot 101 may learn an appropriate temperature based on the time-series state information.
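As a minimal sketch of the temperature behaviour just described (the skin temperature, offset, and gain values are assumptions), a simple controller could be:

```python
def hand_surface_temp_target(user_hand_temp_c, contact_started,
                             skin_temp_c=33.0, initial_offset_c=1.0):
    """Target surface temperature of the hand section: slightly above typical
    human skin temperature before contact, then tracking the user's hand."""
    if not contact_started:
        return skin_temp_c + initial_offset_c
    return user_hand_temp_c

def heater_command(target_c, measured_c, gain=2.0):
    """Proportional drive signal for the heating/cooling element (illustrative gain)."""
    return gain * (target_c - measured_c)
```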
- the robot 101 may shake hands with both hand parts 134 (both hands).
- the grip force of each hand section 134 is controlled by performing PID control using the sum of the deformation amount vectors in the shear direction detected by each hand section 134 as input.
- the present technology can also be applied to, for example, an arm robot that only has arms.
- means 2 that uses the "view" function cannot be applied, but by applying other means, it is possible to increase the user's sense of security regarding the handshake.
- This technology can also be applied to contact interactions other than handshakes. For example, by applying a method applicable to the target contact interaction among the four methods described above, it is possible to increase the user's sense of security regarding the contact interaction.
- the user's emotion estimation may be performed based on the above-mentioned time series information, and the touch interaction may be controlled based on the estimated emotion.
- FIG. 13 is a block diagram showing an example of the hardware configuration of a computer that executes the above-described series of processes using a program.
- In the computer 1000, a CPU (Central Processing Unit) 1001, a ROM (Read Only Memory) 1002, and a RAM (Random Access Memory) 1003 are interconnected by a bus 1004.
- An input/output interface 1005 is further connected to the bus 1004.
- An input section 1006, an output section 1007, a storage section 1008, a communication section 1009, and a drive 1010 are connected to the input/output interface 1005.
- the input unit 1006 includes an input switch, a button, a microphone, an image sensor, and the like.
- the output unit 1007 includes a display, a speaker, and the like.
- the storage unit 1008 includes a hard disk, nonvolatile memory, and the like.
- the communication unit 1009 includes a network interface and the like.
- the drive 1010 drives a removable medium 1011 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
- In the computer 1000 configured as described above, the CPU 1001, for example, loads the program recorded in the storage unit 1008 into the RAM 1003 via the input/output interface 1005 and the bus 1004 and executes it, whereby the series of processes described above is performed.
- a program executed by the computer 1000 can be provided by being recorded on a removable medium 1011 such as a package medium, for example. Additionally, programs may be provided via wired or wireless transmission media, such as local area networks, the Internet, and digital satellite broadcasts.
- In the computer, the program can be installed in the storage unit 1008 via the input/output interface 1005 by loading the removable medium 1011 into the drive 1010. The program can also be received by the communication unit 1009 via a wired or wireless transmission medium and installed in the storage unit 1008. In addition, the program can be installed in the ROM 1002 or the storage unit 1008 in advance.
- The program executed by the computer may be a program in which processing is performed chronologically in the order described in this specification, or a program in which processing is performed in parallel or at necessary timing such as when a call is made.
- In this specification, a system refers to a collection of multiple components (devices, modules (parts), etc.), regardless of whether all the components are located in the same casing. Therefore, multiple devices housed in separate casings and connected via a network, and a single device in which multiple modules are housed in one casing, are both systems.
- embodiments of the present technology are not limited to the embodiments described above, and various changes can be made without departing from the gist of the present technology.
- each step described in the above flowchart can be executed by one device or can be shared and executed by multiple devices.
- Furthermore, when one step includes multiple processes, the multiple processes included in that one step can be executed by one device or can be shared and executed by multiple devices.
- the present technology can also have the following configuration.
- The robot according to (1), wherein the tactile information includes an amount of deformation of the hand section in a shear direction, and the handshake control section controls the grip force of the hand section based on the amount of deformation of the hand section in the shear direction.
- The handshake control section controls the hand section so as to grip the user's hand back to the extent that the user's hand does not slip.
- The robot according to (9) above, wherein the handshake control section controls the position and posture of the hand section such that the palm side of the hand section and the user's palm are approximately parallel and the overlapping portion of the palm side of the hand section and the user's palm is large.
- The handshake control section opens the hand section when contact of the user's hand with the back side of the hand section is no longer detected.
- a line-of-sight detection unit that detects the user's line-of-sight direction;
- a line-of-sight control unit that controls the robot's line of sight to align with the user's line of sight before shaking hands with the user.
- the handshake control unit puts the robot in a handshake position after the user and the robot make eye contact.
- the line of sight control unit controls the line of sight of the robot to alternately look at the hand and the user after the hand is held out to the user.
Landscapes
- Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2024530657A JPWO2024004622A1 | 2022-06-29 | 2023-06-13 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022104293 | 2022-06-29 | ||
JP2022-104293 | 2022-06-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024004622A1 (ja) | 2024-01-04 |
Family
ID=89382080
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2023/021834 WO2024004622A1 (ja) | Robot and robot control method | 2022-06-29 | 2023-06-13 |
Country Status (2)
Country | Link |
---|---|
JP (1) | JPWO2024004622A1 |
WO (1) | WO2024004622A1 |
- 2023-06-13 WO PCT/JP2023/021834 patent/WO2024004622A1/ja active Application Filing
- 2023-06-13 JP JP2024530657A patent/JPWO2024004622A1/ja active Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004283975A (ja) * | 2003-03-24 | 2004-10-14 | Advanced Telecommunication Research Institute International | Communication robot |
JP2006247780A (ja) * | 2005-03-10 | 2006-09-21 | Advanced Telecommunication Research Institute International | Communication robot |
JP2007185763A (ja) * | 2005-12-12 | 2007-07-26 | Honda Motor Co Ltd | Control device for legged mobile robot |
JP2009028859A (ja) * | 2007-07-27 | 2009-02-12 | Toshiba Corp | Manipulator and robot |
JP2019139758A (ja) * | 2018-02-09 | 2019-08-22 | Mastercard Asia/Pacific Pte. Ltd. | System and method for executing transactions |
WO2022039058A1 (ja) * | 2020-08-20 | 2022-02-24 | Sony Group Corporation | Information processing device, information processing method, and program |
Non-Patent Citations (2)
Title |
---|
OOWADA, KENICHI: " Let's Move the Popular Robot "Pepper" - Creating Apps for Greetings and Handshakes Activated When a Person's Face or Voice Is Recognized", NIKKEI LINUX, NIKKEI BUSINESS PUBLICATIONS, JP, vol. 17, no. 5, 1 May 2015 (2015-05-01), JP , pages 82 - 89, XP009552011, ISSN: 1345-0182 * |
RAVINDRA P., S. DE SILVA, NOSAKA, TATSUYA, FUKAMACHI, KENTA, TAKEDA, YASUTAKA: "The Effective Communication between "Mako no Te" and Human through the Patterns of Grasping", THE TRANSACTIONS OF HUMAN INTERFACE SOCIETY, vol. 17, no. 2, 1 January 2015 (2015-01-01), pages 191 - 200, XP093122644, ISSN: 2186-828X * |
Also Published As
Publication number | Publication date |
---|---|
JPWO2024004622A1 | 2024-01-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Noronha et al. | “Wink to grasp”—comparing eye, voice & EMG gesture control of grasp with soft-robotic gloves | |
Eid et al. | A novel eye-gaze-controlled wheelchair system for navigating unknown environments: case study with a person with ALS | |
Prattichizzo et al. | Towards wearability in fingertip haptics: a 3-dof wearable device for cutaneous force feedback | |
Cortese et al. | A mechatronic system for robot-mediated hand telerehabilitation | |
Gunasekara et al. | Control methodologies for upper limb exoskeleton robots | |
JP7315568B2 (ja) | 把持支援システムおよび方法 | |
CN114206557A (zh) | 控制装置、控制方法和控制程序 | |
Kirchner et al. | Intuitive interaction with robots–technical approaches and challenges | |
Haseeb et al. | Head gesture-based control for assistive robots | |
Sahadat et al. | Simultaneous multimodal access to wheelchair and computer for people with tetraplegia | |
WO2024004622A1 (ja) | ロボット及びロボットの制御方法 | |
Basdogan et al. | Perception of soft objects in virtual environments under conflicting visual and haptic cues | |
Krishnaswamy et al. | Toward the development of a BCI and gestural interface to support individuals with physical disabilities | |
Yamada et al. | Proposal of a psychophysiological experiment system applying the reaction of human pupillary dilation to frightening robot motions | |
CN114129392A (zh) | 可调控末端指尖力的自适应冗余驱动外骨骼康复机器人 | |
Kim et al. | QuadStretcher: a forearm-worn skin stretch display for bare-hand interaction in AR/VR | |
Weisz et al. | A user interface for assistive grasping | |
Khairuddin et al. | Assistive-as-needed strategy for upper-limb robotic systems: An initial survey | |
Schubö et al. | Movement coordination in applied human-human and human-robot interaction | |
Cao et al. | Musculoskeletal Model-Based Adaptive Variable Impedance Control With Flexible Prescribed Performance for Rehabilitation Robots | |
Toyama et al. | Cybernic robot hand-arm that realizes cooperative work as a new hand-arm for people with a single upper-limb dysfunction | |
Beckerle | Virtual Hand Experience | |
Rominger et al. | Supporting functional tasks in bi-manual robotic mirror therapy by coupling upper limb movements based on virtual reality | |
Hu et al. | MultiClear: Multimodal Soft Exoskeleton Glove for Transparent Object Grasping Assistance | |
Wang et al. | Recent advances in hand movement rehabilitation system and related strategies |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23831070; Country of ref document: EP; Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase | Ref document number: 2024530657; Country of ref document: JP |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 23831070; Country of ref document: EP; Kind code of ref document: A1 |