WO2024004622A1 - Robot, and robot control method - Google Patents

Robot, and robot control method

Info

Publication number
WO2024004622A1
Authority
WO
WIPO (PCT)
Prior art keywords
hand
robot
user
unit
handshake
Prior art date
Application number
PCT/JP2023/021834
Other languages
French (fr)
Japanese (ja)
Inventor
智子 水谷
康宏 松田
利充 坪井
良 寺澤
慶直 袖山
Original Assignee
ソニーグループ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニーグループ株式会社
Publication of WO2024004622A1 publication Critical patent/WO2024004622A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices

Definitions

  • the present technology relates to a robot and a method of controlling the robot, and particularly relates to a robot that can perform contact interaction and a method of controlling the robot.
  • the present technology has been developed in view of this situation, and is intended to increase the sense of security of users, such as those with cognitive decline, when it comes to contact interactions such as handshakes with robots.
  • A robot according to one aspect of the present technology includes a hand unit capable of shaking hands with a user's hand, a tactile detection unit that detects tactile information of the hand unit, and a handshake control unit that controls, based on the tactile information, the grip force of the hand unit while it is shaking hands with the user's hand.
  • A robot control method according to one aspect of the present technology detects tactile information of a hand unit capable of shaking hands with a user's hand, and controls, based on the tactile information, the grip force of the hand unit while it is shaking hands with the user's hand.
  • tactile information of a hand unit that can shake hands with a user's hand is detected, and based on the tactile information, the grip force of the hand unit that is shaking hands with the user's hand is controlled.
  • FIG. 1 is a diagram for explaining elements important for recognition.
  • FIG. 2 is a schematic diagram of an example of the external configuration of a robot to which the present technology is applied.
  • FIG. 3 is a schematic diagram of an example of the external appearance and cross-sectional configuration of a hand portion of the robot.
  • FIG. 4 is a diagram for explaining initial slippage.
  • FIG. 5 is a diagram illustrating a functional configuration example of a handshake execution processing unit of the robot.
  • FIGS. 6 and 7 are flowcharts for explaining handshake control processing executed by the robot.
  • FIGS. 8 and 9 are diagrams for explaining a method of controlling the grip force of the hand portion of the robot.
  • FIGS. 10 and 11 are diagrams for explaining a method of controlling the position and posture of the hand portion of the robot.
  • FIG. 12 is a diagram showing modifications of the cross section of the hand portion of the robot.
  • FIG. 13 is a block diagram showing a configuration example of a computer.
  • a safe state is a state without fear, and can be translated as a state without a sense of anxiety or danger.
  • For example, the other party's actions become unpredictable when there is no indication of whether one's input is being transmitted to the other party. Specifically, for example, a person grips someone's hand for a handshake, but there is no reaction at all from the other person. The other party's actions also become unpredictable when one does not know what the other party is doing, or when one does not recognize the other party in the first place. Specifically, for example, if a person is suddenly touched by someone whose presence they have not recognized, they become surprised and anxious.
  • the following three requirements can be considered to provide a sense of security to the person with whom the robot interacts (hereinafter referred to as the user).
  • Requirement 1: Respond appropriately to the user's input, so that the next action is predictable and the user has no fear of harm.
  • Requirement 2: Movements are easy to understand and are recognized by the user, so that the next action is predictable and the user has no fear of harm.
  • Requirement 3: The contact area with the user is as large as possible, and the rate of change of the contact surface and the reaction force from the robot are as small as possible.
  • the purpose of this technology is to provide a robot that satisfies Requirements 1 to 3 and can perform contact interactions with a sense of security.
  • the robot executes the following four measures in order to achieve a touch interaction that gives a sense of security.
  • Means 1: Based on initial slippage, the robot instantly detects a change when the user tries to move the hand being shaken, and gently grips back with just enough force to keep the user's hand from slipping, thereby presenting to the user that the user's input is being transmitted to the robot.
  • Means 2: The robot makes its movements easier to understand and easier to recognize by linking the functions of "seeing (gaze control)," "speaking (speech control)," and "touching (touch control)," which are the elements important for recognition shown in FIG. 1.
  • For example, a method called auto-feedback, which is one of the nursing care techniques, is effective, and by executing auto-feedback in accordance with the robot's control and changes in observed values, the robot's movements become easier to understand, recognition improves, and the sense of security increases.
  • The "seeing" and "touching" functions may seem independent at first glance, but by making eye contact with the user ("looking" at each other), the robot is clearly recognized by the user.
  • the robot can give the user a sense of security by shifting to the "touching" action after confirming that the robot has made eye contact with the user and has been recognized.
  • the robot not only looks at the user's eyes, but also looks at the area to be touched (for example, the user's hand in the case of a handshake) when touching, making it easier for the user to recognize what is being touched.
  • Means 3: The robot moves the hand portion along a trajectory that maximizes the contact area with the hand of the user who is shaking hands.
  • Means 4: Based on initial slippage, the robot immediately detects the external force when the user tries to move the hand after shaking hands, and moves the hand portion along a trajectory that reduces the reaction force against the external force.
  • Furthermore, the grip force control of Means 1 allows the robot to transition from a state in which it grips back to a state in which the robot itself gently relaxes its force, thereby further enhancing the effect of Means 1.
  • the effectiveness can be maximized by using a method that uses initial slippage to detect external force.
  • <<1. Embodiment>> Next, an embodiment of the present technology will be described with reference to FIGS. 2 to 11.
  • FIG. 2 schematically shows an example of the external appearance of the robot 101.
  • the robot 101 includes a head 111, a neck 112, a chest 113, an abdomen 114, a waist 115, a cart 116, an arm 117L, and an arm 117R.
  • A head 111, a neck 112, a chest 113, an abdomen 114, and a waist 115 are connected in order from top to bottom, and the waist 115 is placed on the cart 116.
  • An arm portion 117L corresponding to the left arm is connected to the left side of the chest 113, and an arm portion 117R corresponding to the right arm is connected to the right side of the chest 113.
  • the neck 112 is rotatable, for example, around the roll axis, pitch axis, and yaw axis with respect to the chest 113.
  • the chest 113 is rotatable about the pitch axis and the yaw axis with respect to the abdomen 114.
  • the abdomen 114 is rotatable about the pitch axis and the yaw axis, for example, relative to the waist 115.
  • the waist portion 115 is, for example, rotatable about the yaw axis with respect to the cart 116.
  • the head 111 includes a sensor section 121, an eye 122L corresponding to the left eye, and an eye 122R corresponding to the right eye.
  • the sensor section 121 is provided near the forehead of the head 111.
  • the sensor unit 121 includes, for example, a sensor such as an image sensor that detects the state around the robot 101.
  • the sensor unit 121 outputs sensor data indicating the detection results of each sensor.
  • the eyes 122L and 122R each include a monitor (not shown).
  • the monitor for the eye 122L displays the image of the left eye of the robot 101 and can move the image of the left eye.
  • the monitor of the eye 122R displays the image of the right eye of the robot 101 and can move the image of the right eye. This causes the robot 101's line of sight to move and its facial expression to change.
  • Hereinafter, when there is no need to distinguish between the eye 122L and the eye 122R, they will simply be referred to as the eyes 122.
  • the arm portion 117L includes a shoulder portion 131L, an upper arm portion 132L, a forearm portion 133L, and a hand portion 134L.
  • the shoulder portion 131L, the upper arm portion 132L, the forearm portion 133L, and the hand portion 134L are connected in order so as to extend from the left side of the chest 113.
  • the shoulder portion 131L is rotatable around the pitch axis and the roll axis, for example, with respect to the chest 113.
  • the upper arm portion 132L is fixed to the shoulder portion 131L.
  • the forearm portion 133L is rotatable about the pitch axis and the yaw axis, for example, with respect to the upper arm portion 132L.
  • the hand portion 134L is rotatable around the yaw axis relative to the forearm portion 133L, for example.
  • the arm 117R includes a shoulder 131R, an upper arm 132R, a forearm 133R, and a hand 134R, and can move in the same way as the arm 117L.
  • Hereinafter, when there is no need to distinguish between the arm 117L and the arm 117R, they will simply be referred to as the arm 117.
  • When there is no need to distinguish between the shoulder portion 131L and the shoulder portion 131R, they will simply be referred to as the shoulder portion 131.
  • When there is no need to distinguish between the upper arm portion 132L and the upper arm portion 132R, they will simply be referred to as the upper arm portion 132.
  • When there is no need to distinguish between the forearm portion 133L and the forearm portion 133R, they will simply be referred to as the forearm portion 133.
  • When there is no need to distinguish between the hand portion 134L and the hand portion 134R, they will simply be referred to as the hand portion 134.
  • FIG. 3 shows an example of the configuration of the hand section 134L.
  • FIG. 3A schematically shows an example of the external appearance of the hand portion 134L.
  • B in FIG. 3 schematically shows a cross-sectional configuration example in the width direction of the part 151L of the hand portion 134L.
  • the hand portion 134L includes a part 151L, a part 152L, and a yaw axis 153L. Part 151L and part 152L are connected to yaw axis 153L.
  • The part 151L corresponds to the palm, back, index finger, middle finger, ring finger, and little finger of a human hand. However, in the part 151L, the fingers are not separated but are integrated into one piece.
  • Part 152L is a part corresponding to the thumb of a human hand.
  • the part 151L includes pitch axes 161L to 163L that extend in the width direction and are parallel to each other. By individually rotating the pitch axes 161L to 163L, the part 151L can be opened and closed to grasp or release an object. Thereby, the hand portion 134L can wrap the user's palm with the part 151L, and has the degree of freedom to contact the palm side and the back side of the user's hand simultaneously and apply force to them.
  • the part 151L and the part 152L can rotate together around the yaw axis 153L.
  • the part 151L includes a base portion 171L, a tactile sensor 172L, and an elastic body 173L.
  • the surface of the base portion 171L is covered with the tactile sensor 172L.
  • the surface of the tactile sensor 172L is covered with the elastic body 173L.
  • the base portion 171L is made of metal, for example, and constitutes the main body of the hand portion 134L.
  • the tactile sensor 172L detects a tactile sensation (for example, one or more of a contact sensation, a pressure sensation, a distributed pressure sensation, a force sensation, and a slip sensation) for (the part 151L of) the hand portion 134L, and outputs sensor data indicating the detection result.
  • the elastic body 173L is made of a flexible and elastic material, such as a flexible gel material, that is close to the softness of human skin. As a result, when the user shakes hands with the hand portion 134L, a feeling similar to shaking hands with a human being can be obtained, and initial slippage is more likely to occur.
  • the initial slip is a phenomenon that is a precursor to slip.
  • Initial slip is, for example, a phenomenon in which, when two objects are in contact and one of them starts to move, only part of the contact surface of that object begins to slide against the contact surface of the other object. For example, when a user attempts to move a hand being shaken, only a portion of the contact surface on the palm side of the user's hand begins to slide against the contact surface on the palm side of the hand portion 134L.
  • A to C in FIG. 4 schematically show initial slippage.
  • Area A1 indicates a slip area where slipping of the user's hand on the contact surface of the hand portion 134L of the robot 101 is detected.
  • Area A2 indicates a stick area where the user's hand is fixed without moving on the contact surface of the hand portion 134L.
  • the arrow in the figure indicates the direction in which the user moves his or her hand.
  • the entire contact surface of the user's hand does not begin to move at once relative to the hand portion 134L, but only a portion of it begins to slide. That is, initial slippage occurs. Then, the slip area A1 gradually becomes larger.
  • the robot 101 controls the movement of the hand portion 134L during a handshake based on the initial slippage.
  • the hand portion 134R includes a part 151R, a part 152R, and a yaw axis 153R.
  • the part 151R includes pitch axes 161R to 163R, a base portion 171R, a tactile sensor 172R, and an elastic body 173R.
  • the hand portion 134R can move in the same manner as the hand portion 134L.
  • Hereinafter, when there is no need to distinguish between the part 151L and the part 151R, they will simply be referred to as the part 151.
  • When there is no need to distinguish between the part 152L and the part 152R, they will simply be referred to as the part 152.
  • When there is no need to distinguish between the yaw axis 153L and the yaw axis 153R, they will simply be referred to as the yaw axis 153.
  • When there is no need to distinguish between the pitch axis 161L and the pitch axis 161R, they will simply be referred to as the pitch axis 161.
  • When there is no need to distinguish between the pitch axis 162L and the pitch axis 162R, they will simply be referred to as the pitch axis 162.
  • When there is no need to distinguish between the pitch axis 163L and the pitch axis 163R, they will simply be referred to as the pitch axis 163.
  • When there is no need to distinguish between the base portion 171L and the base portion 171R, they will simply be referred to as the base portion 171.
  • When there is no need to distinguish between the tactile sensor 172L and the tactile sensor 172R, they will simply be referred to as the tactile sensor 172.
  • When there is no need to distinguish between the elastic body 173L and the elastic body 173R, they will simply be referred to as the elastic body 173.
  • FIG. 5 shows a functional configuration example of the handshake execution processing unit 201 that executes processing related to handshaking, which is one type of contact communication of the robot 101.
  • the handshake execution processing unit 201 includes a handshake command unit 211, a line-of-sight detection unit 212, a tactile detection unit 213, a handshake state management unit 214, a line-of-sight control unit 215, a display control unit 216, a handshake control unit 217, a motion control unit 218, a speech control unit 219, and an audio output unit 220.
  • the handshake command unit 211 gives a handshake execution command to the handshake state management unit 214 according to the situation around the robot 101, etc.
  • the line-of-sight detection unit 212 detects the user's line-of-sight direction based on sensor data (for example, image data, etc.) from the sensor unit 121.
  • the line-of-sight detection unit 212 supplies the line-of-sight control unit 215 with user line-of-sight information indicating the detection result of the user's line-of-sight direction.
  • the line of sight detection unit 212 also controls the speech control unit 219 by giving commands to the speech control unit 219.
  • the tactile detection unit 213 detects tactile information for (the part 151 of) the hand unit 134 of the robot 101 based on sensor data from the tactile sensor 172, and supplies the tactile information to the handshake state management unit 214, the handshake control unit 217, and the speech control unit 219.
  • the tactile information includes, for example, the contact state (for example, presence or absence of contact, contact position, etc.), the amount of shear deformation, the grip force applied from the outside, and the like.
  • the amount of shear deformation is the amount of deformation in the shear direction, which is the direction in which the elastic body 173 on the surface of the hand portion 134 is displaced in the plane direction.
  • the handshake state management unit 214 manages the handshake state of the robot 101.
  • the handshake state management unit 214 detects the handshake state of the robot 101 based on commands from the handshake command unit 211, the tactile information, gaze state information from the line-of-sight control unit 215, and motion state information from the motion control unit 218.
  • the handshake state management unit 214 also controls the handshake state of the robot 101 by giving commands to, and notifying the handshake state to, the line-of-sight control unit 215, the handshake control unit 217, and the speech control unit 219.
  • the handshake state of the robot 101 includes, for example, the position and posture of the hand unit 134 with respect to the user's hand, the grip strength of the hand unit 134, the line of sight direction of the robot 101, and the speaking state of the robot 101.
  • the line-of-sight control unit 215 controls the line-of-sight direction of the robot 101 by giving commands to the display control unit 216 and the motion control unit 218 based on the user line-of-sight information from the line-of-sight detection unit 212 and the commands from the handshake state management unit 214.
  • the line-of-sight control unit 215 supplies the handshake state management unit 214 with line-of-sight status information indicating the line-of-sight status of the user and the robot 101 (for example, the relative relationship of line-of-sight between the robot 101 and the user).
  • the display control unit 216 controls the display of the eye image displayed on the monitor provided in the eye 122 of the robot 101 based on the command from the line of sight control unit 215.
  • the handshake control unit 217 gives a command to the motion control unit 218 based on the tactile information and the command from the handshake state management unit 214, and controls the handshake motion by the hand unit 134.
  • the handshake control section 217 includes an initial grip force setting section 231 , a grip force control section 232 , and a handshake position/posture control section 233 .
  • the initial grip force setting unit 231 sets an initial grip force, which is an initial value of the grip force of the hand unit 134 when shaking hands, based on the tactile information.
  • the grip force control unit 232 controls the grip force of the hand unit 134 by giving commands to the motion control unit 218 based on the tactile information, the commands from the handshake state management unit 214, and the initial grip force set by the initial grip force setting unit 231.
  • the handshake position/posture control section 233 gives a command to the motion control section 218 based on the tactile information and the command from the handshake state management section 214 to control the position and posture of (the handshake by) the hand section 134 .
  • the motion control unit 218 controls the movement, position, and posture of the robot 101 by driving the actuators of each part of the robot 101 and controlling the joints of each part and the cart 116, based on commands from the line-of-sight control unit 215 and the handshake control unit 217.
  • the motion control section 218 supplies motion state information indicating the motion state including the movement, position, and posture of the robot 101 to the handshake state management section 214 .
  • the operation control unit 218 controls the speech control unit 219 by giving commands to the speech control unit 219.
  • the audio output unit 220 includes, for example, an audio output device such as a speaker.
  • the audio output unit 220 outputs uttered audio under the control of the utterance control unit 219, for example.
  • Next, handshake control processing executed by the robot 101 will be described with reference to the flowcharts of FIGS. 6 and 7. This process is started when the handshake command unit 211 gives a command to the handshake state management unit 214 to shake hands with the user.
  • In step S1, the robot 101 moves to a position in front of the user's field of vision.
  • the handshake state management unit 214 instructs the line of sight control unit 215 to prompt the user to face the robot 101.
  • the line-of-sight control unit 215 drives the cart 116 via the operation control unit 218 based on the user's line-of-sight information from the line-of-sight detection unit 212, and moves the robot 101 to a position in front of the user's field of view.
  • In step S2, the line-of-sight control unit 215 determines whether the user is looking toward the robot 101 based on the user line-of-sight information from the line-of-sight detection unit 212. If it is determined that the user is not looking at the robot 101, the process advances to step S3.
  • In step S3, the robot 101 adjusts the position of the robot 101 and the position of the hand unit 134 so that the hand unit 134 is within the user's field of vision.
  • the line of sight control unit 215 drives the cart 116 via the motion control unit 218 to move the robot 101 to a position where the hand unit 134 is within the user's field of vision.
  • the line of sight control section 215 drives the arm section 117 via the motion control section 218 and adjusts the position of the hand section 134 so that it is within the user's field of vision.
  • In step S4, the robot 101 raises the hand section 134 to the level of the eyes 122 of the robot 101.
  • the line of sight control section 215 drives the arm section 117 via the motion control section 218 to raise the hand section 134 to the level of the eye 122.
  • Thereby, the user's line of sight is guided so as to meet the line of sight of the robot 101.
  • After that, the process returns to step S2, and the processes of steps S2 to S4 are repeatedly executed until it is determined in step S2 that the user is looking at the robot 101.
  • On the other hand, if it is determined in step S2 that the user is looking at the robot 101, the line-of-sight control unit 215 notifies the handshake state management unit 214 of this, and the process proceeds to step S5.
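  • The gaze guidance of steps S1 to S4 can be summarized as a simple loop. The following is a minimal sketch (an illustration only, not the patent's implementation; predicate and action names such as user_looking_at_robot are hypothetical placeholders):

```python
# Minimal sketch of the gaze-guidance loop of steps S1 to S4: keep adjusting
# the robot and hand positions until the user is looking at the robot.
# All method names on `robot` are hypothetical placeholders.
def guide_user_gaze(robot, max_attempts: int = 10) -> bool:
    robot.move_in_front_of_user()              # step S1
    for _ in range(max_attempts):
        if robot.user_looking_at_robot():      # step S2
            return True                        # eye contact can now be made
        robot.move_hand_into_user_view()       # step S3
        robot.raise_hand_to_eye_level()        # step S4
    return False
```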
  • In step S5, the robot 101 attracts the user's attention and then approaches the user.
  • the handshake state management unit 214 instructs the line-of-sight control unit 215 and the speech control unit 219 to approach the user and make eye contact with the user.
  • the line of sight control unit 215 controls the hand unit 134 and the like via the motion control unit 218 to cause the robot 101 to perform an action of waving at the user.
  • the utterance control unit 219 causes the audio output unit 220 to output a utterance that calls out to the user, such as "I'm going to see you now.”
  • the line-of-sight control unit 215 then drives the cart 116 via the motion control unit 218 to bring the robot 101 closer to the user.
  • In step S6, the robot 101 makes eye contact with the user.
  • Specifically, the line-of-sight control unit 215 drives each part of the robot 101 via the motion control unit 218 to adjust the posture of the robot 101 as necessary, and moves the eye images via the display control unit 216, thereby adjusting the line-of-sight direction of the robot 101 so that the line of sight of the robot 101 meets the line of sight of the user.
  • At this time, it is desirable that the height of the eyes 122 of the robot 101 be the same as or lower than the height of the user's eyes.
  • In step S7, the robot 101 moves to a position where it can hold out its hand and shake hands.
  • the handshake state management unit 214 instructs the line of sight control unit 215, handshake control unit 217, and speech control unit 219 to start shaking hands with the user.
  • the handshake position/posture control section 233 moves the robot 101 so as to hold out the hand section 134 to the user and to shake hands with the user by driving the arm section 117 and the like via the motion control section 218 .
  • In step S8, the robot 101 alternately looks at the hand unit 134 and the user while calling out to the user.
  • the speech control unit 219 informs the user that they are about to shake hands by causing the speech output unit 220 to output speech such as "I'll shake your hand” or "Please hold my hand.”
  • the line-of-sight control unit 215 moves the head 111 via the motion control unit 218 and moves the eye images via the display control unit 216 so that the line of sight of the robot 101 alternates between the direction of the hand unit 134 and the direction of the user.
  • In step S9, the handshake state management unit 214 determines whether the user's hand has touched the hand unit 134 based on the tactile information from the tactile detection unit 213. If it is determined that the user's hand is not in contact with the hand portion 134, the process returns to step S8.
  • The processes of steps S8 and S9 are then repeatedly executed until it is determined in step S9 that the user's hand has contacted the hand portion 134.
  • On the other hand, if it is determined in step S9 that the user's hand has touched the hand portion 134, the process proceeds to step S10.
  • In step S10, the robot 101 detects the user's grip strength while informing the user that they are shaking hands, and sets the initial grip force. Specifically, the handshake state management unit 214 notifies the line-of-sight control unit 215, the handshake control unit 217, and the speech control unit 219 that the handshake has started.
  • the speech control unit 219 informs the user that they are shaking hands by causing the speech output unit 220 to output a speech such as "I'm shaking your hand.”
  • the initial grip strength setting unit 231 sets the initial grip strength based on the user's grip strength included in the tactile information from the tactile detection unit 213. For example, the initial grip strength setting unit 231 sets the initial grip strength to approximately the same grip strength as the user's grip strength when the handshake is started.
  • the grip force control unit 232 drives the hand unit 134 via the operation control unit 218 so as to grasp the user's hand with an initial grip force.
  • As a result, when the user grips the hand portion 134, the robot 101 immediately grips the user's hand back with approximately the same grip force. This can give the user a sense of security.
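  • As a rough illustration of this mirroring behavior, the following minimal sketch (not the patent's implementation; names such as read_user_grip_force and MAX_GRIP_N are hypothetical placeholders) sets the initial grip force to the user's measured grip force at the moment contact is detected, clamped to a preset safe maximum:

```python
# Minimal sketch: set the robot's initial grip force to roughly match the
# user's grip force at the start of the handshake, clamped to a safe maximum.
# All names (read_user_grip_force, command_grip_force, MAX_GRIP_N) are
# hypothetical placeholders, not APIs defined in the patent.

MAX_GRIP_N = 15.0  # preset upper limit so the robot never harms the user [N]

def set_initial_grip(read_user_grip_force, command_grip_force) -> float:
    """Mirror the user's grip at handshake start and return the value used."""
    user_grip = read_user_grip_force()         # from the tactile sensor 172
    initial_grip = min(user_grip, MAX_GRIP_N)  # approximately the same force
    command_grip_force(initial_grip)           # grip the user's hand back
    return initial_grip
```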
  • In step S11, the robot 101 controls the movement of the hand section 134 in accordance with the movement of the user's hand.
  • Here, details of the processing of step S11 will be described with reference to FIGS. 8 to 11.
  • For example, when the user tries to move the hand being shaken, initial slippage occurs at the contact surface between the hand portion 134 and the user's hand.
  • When the robot 101 detects the initial slippage, it controls the gripping force of the hand unit 134 so as to gently grasp the user's hand with just enough force to prevent the user's hand from slipping.
  • FIG. 8 schematically shows an example of a physical model of elastic contact. Specifically, a cross section of a portion where a user's hand 301 and a finger 302 of a general robot hand are in contact is schematically shown.
  • the finger 302 is one of a plurality of fingers of the robot's hand, and the surface of each finger is covered with an elastic material.
  • FIG. 8A shows the state before the hand 301 moves, and FIG. 8B shows the state when the hand 301 moves in the direction of the arrow.
  • the radius of the contact surface of the finger 302 is a, and the radius of the stick area is c.
  • the shear direction of the contact surface of the finger 302 will be referred to as the x direction, and the normal direction will be referred to as the z direction.
  • F_x indicates the shear force applied by the hand 301 to the finger 302 in the shear direction (x direction).
  • F_z indicates the normal force applied by the hand 301 to the finger 302 in the normal direction (z direction).
  • μ indicates the coefficient of friction between the hand 301 and the finger 302.
  • u_x indicates the amount of deformation of the finger 302 in the shear direction.
  • φ(μ, F_z) is a simplified function of the friction coefficient μ and the normal force F_z.
  • The left side (c/a) of equation (1) indicates the proportion of the stick area in the contact surface, and is called the stick ratio. Therefore, initial slip can be quantified by a physical quantity called the stick ratio.
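  • Equation (1) itself is not reproduced in this text. As a hedged reconstruction based on classical Cattaneo-Mindlin contact theory (an assumption; the patent's exact expression may differ), the stick ratio of a Hertzian-type elastic contact under a normal force F_z and a shear force F_x can be written as:

```latex
% Hedged reconstruction of equation (1): stick ratio of an elastic contact
% under combined normal force F_z and shear force F_x (Cattaneo-Mindlin form).
\frac{c}{a} = \left( 1 - \frac{F_x}{\mu F_z} \right)^{1/3},
\qquad
u_x = \phi(\mu, F_z) \left( 1 - \left( \frac{c}{a} \right)^{2} \right)
```

  • Here φ(μ, F_z) stands for the simplified function of the friction coefficient and normal force mentioned above, so under these assumptions the stick ratio c/a, and hence the degree of initial slip, can be estimated from the measured shear deformation u_x.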
  • u_r indicates the component, in an arbitrary translational direction r, of the shear-direction displacement caused by the deformation of each finger in the shear direction (hereinafter referred to as the translational shear displacement amount), and u_r_ref indicates the reference value for the translational shear displacement amount of each finger (hereinafter referred to as the translational shear displacement reference value).
  • u_θ indicates the component, in a rotational direction θ, of the shear-direction displacement caused by the deformation of each finger in the shear direction (hereinafter referred to as the rotational shear displacement amount), and u_θ_ref indicates the reference value for the rotational shear displacement amount of each finger (hereinafter referred to as the rotational shear displacement reference value).
  • Σu_r represents the sum of the translational shear displacement amounts u_r of each finger, Σu_r_ref represents the sum of the translational shear displacement reference values u_r_ref of each finger, Σu_θ represents the sum of the rotational shear displacement amounts u_θ of each finger, and Σu_θ_ref represents the sum of the rotational shear displacement reference values u_θ_ref of each finger.
  • K_pr, K_ir, K_dr, K_pθ, K_iθ, and K_dθ each indicate a gain of the PID control.
  • f_r indicates the grip force calculated based on the amount of shear displacement that occurred in the translational direction of each finger of the hand section, and f_θ indicates the grip force calculated based on the amount of shear displacement that occurred in the rotational direction of each finger of the hand section.
  • The grip force f_d is calculated by applying PID (Proportional-Integral-Derivative) control using, as input, the vector sum of the shear-direction deformation amounts u_x detected by each finger of the robot.
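  • Equations (2) to (4) are referenced above but not reproduced. A hedged reconstruction consistent with the variable definitions (the exact form, in particular how f_r and f_θ are combined in equation (4), is an assumption) is:

```latex
% Hedged reconstruction of equations (2) to (4): PID control on the summed
% shear displacements, driving them back toward their reference values.
f_r = K_{pr}\, e_r + K_{ir} \int e_r \, dt + K_{dr} \frac{d e_r}{dt},
\qquad e_r = \textstyle\sum u_r - \sum u_{r\_\mathrm{ref}} \tag{2}

f_\theta = K_{p\theta}\, e_\theta + K_{i\theta} \int e_\theta \, dt
         + K_{d\theta} \frac{d e_\theta}{dt},
\qquad e_\theta = \textstyle\sum u_\theta - \sum u_{\theta\_\mathrm{ref}} \tag{3}

f_d = f_r + f_\theta \tag{4}
```

  • The grip force actually applied to the user's hand is then the initial grip force plus f_d, as computed by the calculation unit 344 described below.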
  • FIG. 9 shows a configuration example of the grip force control section 331 that implements the grip force control method shown in equations (2) to (4).
  • the grip force control section 331 includes a reference generation section 341, a calculation section 342, a grip force calculation section 343, a calculation section 344, a torque calculation section 345, an actuator control section 346, a hand section 347, an LPF (Low Pass Filter) 348, and a calculation section 349.
  • the reference generation unit 341 generates the translational shear displacement reference value u_r_ref and the rotational shear displacement reference value u_θ_ref of each finger based on the translational shear displacement amount u_ri and the rotational shear displacement amount u_θi of each finger detected by the hand unit 347.
  • the reference generation unit 341 supplies information indicating the translational shear displacement reference value u_r_ref and the rotational shear displacement reference value u_θ_ref to the calculation unit 342.
  • the calculation unit 342 calculates the difference (Σu_r − Σu_r_ref) between the sum of the translational shear displacement amounts u_r of each finger of the hand unit 347 and the sum of the translational shear displacement reference values u_r_ref, and supplies information indicating the calculation result to the grip force calculation unit 343.
  • the calculation unit 342 also calculates the difference (Σu_θ − Σu_θ_ref) between the sum of the rotational shear displacement amounts u_θ of each finger of the hand unit 347 and the sum of the rotational shear displacement reference values u_θ_ref, and supplies information indicating the calculation result to the grip force calculation unit 343.
  • the grip force calculation unit 343 computes the above-mentioned equations (2) to (4) to obtain the grip force f_d of the hand unit 347 in the shear direction, and supplies information indicating the calculation result to the calculation unit 344.
  • the calculation unit 344 calculates the grip force by adding the initial grip force of the hand unit 347 and the grip force f_d in the shear direction, and supplies information indicating the calculation result to the torque calculation unit 345. This grip force becomes the grip force applied from the hand unit 347 to the user's hand.
  • the torque calculation unit 345 calculates the torque τ_i for driving the joints of each finger so that the hand unit 347 grips with the calculated grip force, based on the grip force of the hand unit 347, the angle q_i of each finger joint, and the Jacobian matrix for the fingertip of each finger, and supplies information indicating the calculation result to the actuator control unit 346.
  • the actuator control unit 346 drives each actuator that drives each finger joint of the hand unit 347 using the torque τ_i calculated by the torque calculation unit 345.
  • the hand unit 347 detects the translational shear displacement amount u_ri and the rotational shear displacement amount u_θi of each finger, and supplies a signal indicating the detection result (hereinafter referred to as a shear displacement amount signal) to the reference generation unit 341 and the LPF 348.
  • the hand unit 347 also supplies a signal indicating the angle q_i of each finger joint (hereinafter referred to as a joint angle signal) to the torque calculation unit 345 and the actuator control unit 346.
  • the LPF 348 reduces high frequency noise in the shear displacement amount signal and supplies the shear displacement amount signal after the high frequency noise reduction to the calculation unit 349.
  • the calculation unit 349 calculates the sum Σu_r of the translational shear displacement amounts u_ri and the sum Σu_θ of the rotational shear displacement amounts u_θi of each finger based on the shear displacement amount signal, and supplies information indicating the calculation results to the calculation unit 342.
  • In this manner, the joints of each finger can be driven so as to realize the set grip force of the hand unit 347.
  • The grip force control section 232 controls the grip force of the hand section 134 using the same method as the grip force control section 331 in FIG. 9. That is, when initial slippage is detected by the tactile detection section 213, the grip force control section 232 calculates, based on the amount of shear deformation of the hand section 134 and the initial grip force set by the initial grip force setting section 231, a grip force of the hand section 134 such that the amount of shear deformation of the hand section 134 caused by the initial slippage approaches zero. The grip force control section 232 then drives the hand section 134 via the motion control section 218 so as to grasp the user's hand with the calculated grip force.
  • the maximum value of the gripping force of the hand unit 134 is set in advance, and the gripping force of the hand unit 134 is continuously controlled within a range that does not cause harm to the user.
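  • The following is a minimal sketch of this control law in Python (an illustrative reconstruction, not the patent's code; the parameter and helper names are hypothetical). It applies PID control to the summed shear displacement error so that the displacement caused by initial slip is driven back toward its reference, adds the initial grip force, and clamps the result to the preset maximum:

```python
# Minimal sketch of the grip force control of FIG. 9, under the assumptions
# stated in the text: PID on the summed shear displacement error, plus the
# initial grip force, clamped to a preset safe maximum.
from dataclasses import dataclass

@dataclass
class PID:
    kp: float
    ki: float
    kd: float
    integral: float = 0.0
    prev_error: float = 0.0

    def step(self, error: float, dt: float) -> float:
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def grip_control_step(pid_r: PID, pid_t: PID,
                      shear_sum_r: float, shear_ref_r: float,
                      shear_sum_t: float, shear_ref_t: float,
                      initial_grip: float, max_grip: float,
                      dt: float) -> float:
    """One control cycle: returns the grip force commanded to the hand unit."""
    f_r = pid_r.step(shear_sum_r - shear_ref_r, dt)   # translational component
    f_t = pid_t.step(shear_sum_t - shear_ref_t, dt)   # rotational component
    f_d = f_r + f_t                                   # shear-direction grip force
    grip = initial_grip + f_d                         # add the initial grip force
    return max(0.0, min(grip, max_grip))              # keep within the safe range
```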
  • the handshake position/posture control unit 233 calculates the trajectory of the hand unit 134 so as to cancel the external force caused by the initial slippage of the user's hand, and moves the hand unit 134 along the calculated trajectory via the motion control unit 218.
  • the handshake position and posture control unit 233 adjusts the position and posture of the hand unit 134 via the operation control unit 218 so that the contact area between the hand unit 134 of the robot 101 and the user's hand is as large as possible.
  • FIG. 10A schematically shows the positional relationship between the contact surface (palm side) of the hand section 134 and the contact surface (palm side) of the user's hand 301 before adjusting the posture of the hand section 134.
  • B in FIG. 10 schematically shows the positional relationship between the contact surface of the hand section 134 and the contact surface of the user's hand 301 after the posture of the hand section 134 has been adjusted.
  • Arrow L1 indicates the direction of the normal to the contact surface of hand portion 134.
  • the posture of the hand section 134 is adjusted so that the normal L1 of the contact surface of the hand section 134 is approximately perpendicular to the contact surface of the user's hand 301.
  • As a result, the contact surface of the hand portion 134 and the contact surface of the user's hand 301 face each other substantially in parallel.
  • FIG. 11A schematically shows the positional relationship between the contact surface (palm side) of the hand section 134 and the contact surface (palm side) of the user's hand 301 before the position of the hand section 134 is adjusted.
  • B in FIG. 11 schematically shows the positional relationship between the contact surface of the hand section 134 and the contact surface of the user's hand 301 after the position of the hand section 134 has been adjusted.
  • the position of the hand section 134 is adjusted so that the contact surface of the hand section 134 and the contact surface of the user are aligned. That is, the position of the hand section 134 is adjusted so that the area where the contact surface of the hand section 134 and the contact surface of the user's hand 301 overlap is as large as possible.
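  • As a rough geometric sketch of this adjustment (an illustration only; how the user's palm plane is actually estimated is not described here, and the helper names are hypothetical), the hand unit's target orientation can be chosen so that its palm normal opposes the user's palm normal, and its target position so that the centers of the two contact surfaces coincide:

```python
# Minimal geometric sketch: align the robot's palm so that its normal opposes
# the user's palm normal (surfaces roughly parallel) and center the contact
# surfaces on each other (overlap as large as possible). Vector math only;
# user_palm_normal / user_palm_center would come from tactile or visual
# sensing (hypothetical inputs, not APIs defined in the patent).
import numpy as np

def hand_alignment_target(user_palm_center: np.ndarray,
                          user_palm_normal: np.ndarray,
                          standoff: float = 0.0):
    """Return a target palm normal and palm center for the hand unit 134."""
    n = user_palm_normal / np.linalg.norm(user_palm_normal)
    target_normal = -n                                 # face the user's palm squarely
    target_center = user_palm_center + standoff * n    # coincide with (or hover over) it
    return target_normal, target_center
```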
  • In step S12, the robot 101 calls out to the user according to the strength of the user's grip.
  • For example, the speech control unit 219 causes the audio output unit 220 to output speech whose content varies according to the strength of the user's grip detected by the tactile detection unit 213, such as "You seem strong today," "I'm glad you look healthy," or "How are you feeling?"
  • In step S13, the handshake state management unit 214 determines whether to continue the handshake.
  • For example, while the user is gripping the hand portion 134, the user's fingertips are in contact with the back side of the hand portion 134, and when the user tries to end the handshake, the user's fingertips separate from the back side of the hand portion 134.
  • Therefore, if the tactile detection unit 213 detects contact with the back side of the hand portion 134, the handshake state management unit 214 determines to continue the handshake, and the process returns to step S11.
  • On the other hand, if in step S13 the tactile detection unit 213 does not detect contact with the back side of the hand portion 134, the handshake state management unit 214 determines that the handshake is to be ended, and the process proceeds to step S14.
  • In step S14, the robot 101 opens its hand while calling out to the user.
  • the handshake state management section 214 gives a command to the line of sight control section 215, the handshake control section 217, and the speech control section 219 to end the handshake.
  • the speech control section 219 notifies the user of the end of the handshake by causing the audio output section 220 to output speech such as "I'll let go of your hand now."
  • the handshake position/posture control section 233 causes the robot 101 to perform an action of opening its hand by opening the hand section 134 via the motion control section 218 .
  • In step S15, the handshake state management unit 214 determines whether the user's hand has been released. If the tactile detection unit 213 detects contact with the hand unit 134, the handshake state management unit 214 determines that the user's hand has not left the hand unit 134, and repeats the determination process of step S15. On the other hand, if the tactile detection unit 213 does not detect contact with the hand unit 134, the handshake state management unit 214 determines that the user's hand has been released, and the process proceeds to step S16.
  • In step S16, the robot 101 returns to its original position while calling out to the user.
  • the handshake state management section 214 notifies the line of sight control section 215, the handshake control section 217, and the speech control section 219 that the handshake has ended.
  • the speech control section 219 causes the audio output section 220 to output speech such as "I'll lower my hand now" to inform the user that the hand section 134 will be returned to its original position.
  • the handshake position/posture control section 233 drives each part of the robot 101 via the motion control section 218 to return the posture of the robot 101 to its original state.
  • As described above, a contact interaction close to a handshake between people is realized, and the user's sense of security regarding contact interaction by the robot 101 can be increased. For example, the following effects are obtained.
  • Based on the initial slippage, the robot 101 immediately detects a change when the user tries to move the hand being shaken, and gently grips back with just enough force to keep the hand from slipping, thereby presenting to the user that the user's movement (input) is being transmitted and giving the user a sense of security.
  • the movements of the robot 101 become easier to understand and recognize, giving the user a sense of security.
  • the hand section 134 moves in a trajectory that cancels out the external force caused by the user's initial slippage. As a result, the hand section 134 moves under trajectory control that reduces the rate of change and reaction force of the hand section 134, thereby eliminating the user's sense of danger.
  • In addition, the robot 101 can transition from a state of gripping the user's hand back to a state of relaxing its own force, so the user's sense of security increases.
  • In particular, by using initial slippage to detect the external force, the effect of increasing the user's sense of security is enhanced.
  • FIGS. 12A to 12E show modified examples of the cross-sectional configuration of the part 151 of the hand portion 134.
  • In each of these figures, the upper side shows the palm side of the part 151, and the lower side shows the back side of the part 151.
  • the tactile sensor 172 and the elastic body 173 may not be provided on at least one side of the part 151.
  • the elastic body 173 may be formed into a mountain shape that becomes thicker toward the center.
  • the elastic body 173 may have a shape in which a plurality of chevrons are connected on the palm side and back side of the part 151.
  • the tactile sensor 172 and the elastic body 173 may be provided only on the palm side of the part 151.
  • the part 152L of the hand portion 134 may also be provided with a tactile sensor and an elastic body, or may be provided with a joint.
  • the configuration of the hand section 134 is not limited to the example described above, and can be changed. For example, it is possible to change the number of fingers of the hand section 134, as well as the number and position of joints.
  • For example, the robot 101 may collect time-series state information about the user during contact interaction. The time-series state information includes, for example, the user's grip strength, facial expression, vital values (body temperature, heart rate, blood pressure, SpO2, etc.), speech content, line of sight, movement, and the like.
  • time-series state information may be used to control the initial grip strength and grip strength of the hand portion 134, as well as the position and posture of the hand portion 134 during handshaking.
  • a six-axis force sensor provided at the wrist of the arm 117 of the robot 101, a torque sensor at each joint, etc. may be used to detect the external force applied to the hand 134.
  • the robot 101 may also combine these sensors to distinguish between external forces caused by handshakes and other external forces to achieve safer contact interactions.
  • the robot 101 may control the temperature of the surface of the hand section 134 in order to achieve a touch interaction that gives a sense of security.
  • For example, the hand section 134 may be provided with a temperature sensor and a heating/cooling element, and the temperature of the surface of the hand section 134 may initially be set slightly higher than human skin temperature and then controlled to match the temperature of the user's hand. By setting the initial temperature slightly higher than human skin temperature, the user can easily feel the sensation of contact with the hand section 134. Further, for example, the robot 101 may learn an appropriate temperature based on the time-series state information.
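  • A minimal sketch of such a temperature policy (assumed behavior; the patent does not specify a control law, and the names below are hypothetical) starts slightly above typical skin temperature and then gradually tracks the measured temperature of the user's hand:

```python
# Minimal sketch: the surface temperature of the hand unit starts slightly
# above human skin temperature, then is gradually driven toward the measured
# temperature of the user's hand. First-order tracking is an assumption.
from typing import Optional

SKIN_TEMP_C = 33.0          # typical human skin temperature
INITIAL_OFFSET_C = 1.5      # start slightly warmer so contact is easy to feel

def surface_temp_setpoint(user_hand_temp_c: Optional[float],
                          current_setpoint_c: Optional[float],
                          gain: float = 0.1) -> float:
    """Return the next surface temperature setpoint for the hand unit."""
    if current_setpoint_c is None:        # before contact: warm start
        return SKIN_TEMP_C + INITIAL_OFFSET_C
    if user_hand_temp_c is None:          # no reading yet: hold the setpoint
        return current_setpoint_c
    # gradually converge toward the user's hand temperature
    return current_setpoint_c + gain * (user_hand_temp_c - current_setpoint_c)
```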
  • the robot 101 may shake hands with both hand parts 134 (both hands).
  • the grip force of each hand section 134 is controlled by performing PID control using the sum of the deformation amount vectors in the shear direction detected by each hand section 134 as input.
  • the present technology can also be applied to, for example, an arm robot that only has arms.
  • In this case, Means 2, which uses the "seeing" function, cannot be applied, but by applying the other means, it is possible to increase the user's sense of security regarding the handshake.
  • This technology can also be applied to contact interactions other than handshakes. For example, by applying a method applicable to the target contact interaction among the four methods described above, it is possible to increase the user's sense of security regarding the contact interaction.
  • Further, for example, the user's emotion may be estimated based on the above-mentioned time-series state information, and the contact interaction may be controlled based on the estimated emotion.
  • FIG. 13 is a block diagram showing an example of the hardware configuration of a computer that executes the above-described series of processes using a program.
  • In the computer 1000, a CPU (Central Processing Unit) 1001, a ROM (Read Only Memory) 1002, and a RAM (Random Access Memory) 1003 are interconnected by a bus 1004.
  • An input/output interface 1005 is further connected to the bus 1004.
  • An input section 1006, an output section 1007, a storage section 1008, a communication section 1009, and a drive 1010 are connected to the input/output interface 1005.
  • the input unit 1006 includes an input switch, a button, a microphone, an image sensor, and the like.
  • the output unit 1007 includes a display, a speaker, and the like.
  • the storage unit 1008 includes a hard disk, nonvolatile memory, and the like.
  • the communication unit 1009 includes a network interface and the like.
  • the drive 1010 drives a removable medium 1011 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
  • In the computer 1000 configured as described above, the CPU 1001, for example, loads the program recorded in the storage unit 1008 into the RAM 1003 via the input/output interface 1005 and the bus 1004 and executes it, whereby the above-described series of processes is performed.
  • a program executed by the computer 1000 can be provided by being recorded on a removable medium 1011 such as a package medium, for example. Additionally, programs may be provided via wired or wireless transmission media, such as local area networks, the Internet, and digital satellite broadcasts.
  • In the computer 1000, the program can be installed in the storage unit 1008 via the input/output interface 1005 by loading the removable medium 1011 into the drive 1010. The program can also be received by the communication unit 1009 via a wired or wireless transmission medium and installed in the storage unit 1008. Alternatively, the program can be installed in advance in the ROM 1002 or the storage unit 1008.
  • The program executed by the computer may be a program in which processing is performed chronologically in the order described in this specification, or a program in which processing is performed in parallel or at necessary timing such as when a call is made.
  • In this specification, a system refers to a collection of multiple components (devices, modules (parts), etc.), regardless of whether all the components are located in the same casing. Therefore, multiple devices housed in separate casings and connected via a network, and a single device with multiple modules housed in one casing, are both systems.
  • embodiments of the present technology are not limited to the embodiments described above, and various changes can be made without departing from the gist of the present technology.
  • each step described in the above flowchart can be executed by one device or can be shared and executed by multiple devices.
  • Furthermore, when one step includes multiple processes, the multiple processes included in that one step can be executed by one device or can be shared and executed by multiple devices.
  • the present technology can also have the following configuration.
  • the tactile information includes an amount of deformation of the hand portion in a shear direction, The robot according to (1), wherein the handshake control section controls the grip force of the hand section based on the amount of deformation of the hand section in the shear direction.
  • the handshake control unit controls the grip force of the hand unit so as to grip the user's hand back to the extent that the user's hand does not slip.
  • the handshake control unit controls the position and posture of the hand unit such that the palm side of the hand unit and the user's palm are approximately parallel and the area in which the palm side of the hand unit and the user's palm overlap is large. The robot according to (9) above.
  • the handshake control unit opens the hand unit when contact of the user's hand with the back side of the hand unit is no longer detected.
  • a line-of-sight detection unit that detects the user's line-of-sight direction;
  • a line-of-sight control unit that controls the robot's line of sight to align with the user's line of sight before shaking hands with the user.
  • the handshake control unit puts the robot in a handshake position after the user and the robot make eye contact.
  • the line of sight control unit controls the line of sight of the robot to alternately look at the hand and the user after the hand is held out to the user.

Abstract

The present technology relates to a robot and a robot control method that enable an improvement in a user's sense of security with respect to a touch interaction with a robot. This robot comprises: a hand portion capable of gripping a user's hand; a haptic detecting unit for detecting haptic information relating to the hand portion; and a grip control unit for controlling a gripping force of the hand portion that is shaking hands with the hand, on the basis of the haptic information. This technology is applicable to robots capable of executing touch interactions, for example.

Description

Robot and robot control method
 The present technology relates to a robot and a method of controlling the robot, and particularly relates to a robot that can perform contact interaction and a method of controlling the robot.
 In recent years, there has been active development of robots that can perform contact interaction, that is, communication through direct contact with a user, such as a handshake (see, for example, Patent Document 1). In the future, it is expected that robots capable of performing contact interaction will be introduced into care settings for people with reduced cognitive function, such as the elderly, people with disabilities, and people with dementia (hereinafter referred to as people with reduced cognitive function).
Patent Document 1: International Publication No. 2016/035759
 On the other hand, in order for robots that can perform contact interaction to be accepted by people with reduced cognitive function, it is important to give them a sense of security.
 The present technology has been developed in view of this situation, and is intended to increase the sense of security of users, such as people with reduced cognitive function, with respect to contact interaction such as a handshake with a robot.
 A robot according to one aspect of the present technology includes a hand unit capable of shaking hands with a user's hand, a tactile detection unit that detects tactile information of the hand unit, and a handshake control unit that controls, based on the tactile information, the grip force of the hand unit while it is shaking hands with the user's hand.
 A robot control method according to one aspect of the present technology detects tactile information of a hand unit capable of shaking hands with a user's hand, and controls, based on the tactile information, the grip force of the hand unit while it is shaking hands with the user's hand.
 In one aspect of the present technology, tactile information of a hand unit capable of shaking hands with a user's hand is detected, and based on the tactile information, the grip force of the hand unit while it is shaking hands with the user's hand is controlled.
 FIG. 1 is a diagram for explaining elements important for recognition. FIG. 2 is a schematic diagram of an example of the external configuration of a robot to which the present technology is applied. FIG. 3 is a schematic diagram of an example of the external appearance and cross-sectional configuration of a hand portion of the robot. FIG. 4 is a diagram for explaining initial slippage. FIG. 5 is a diagram illustrating a functional configuration example of a handshake execution processing unit of the robot. FIGS. 6 and 7 are flowcharts for explaining handshake control processing executed by the robot. FIGS. 8 and 9 are diagrams for explaining a method of controlling the grip force of the hand portion of the robot. FIGS. 10 and 11 are diagrams for explaining a method of controlling the position and posture of the hand portion of the robot. FIG. 12 is a diagram showing modifications of the cross section of the hand portion of the robot. FIG. 13 is a block diagram showing a configuration example of a computer.
Hereinafter, modes for carrying out the present technology will be described. The description will be given in the following order.
0. Background of the present technology
1. Embodiment
2. Modifications
3. Others
<<0. Background of the present technology>>
First, the background of the present technology will be described.
It has long been known that, in communication with people with cognitive decline, "touching" is an important element that helps their recognition and gives them a sense of security.
Here, we consider what kind of contact interaction promotes recognition and gives a sense of security, and how a robot can realize such interaction.
A state of security is a state free of fear; in other words, a state free of anxiety and of a sense of danger.
In contact interaction, people tend to feel anxious when the other party's actions are unpredictable and there is concern about harm.
The other party's actions can become unpredictable, for example, when there is no indication of whether one's own input has been conveyed to the other party, such as when a person grasps another person's hand for a handshake but receives no reaction at all. They can also become unpredictable when it is unclear what the other party is doing, or when the other party has not even been recognized; for example, a person who is suddenly touched by someone whose presence they have not recognized will be startled and anxious.
People also tend to feel a sense of danger in contact interaction when the contact area is small, when the contact region changes rapidly, or when the force applied by the other party is large. For example, a person whose arm is grabbed and moved forcibly tends to feel a sense of danger.
From the above, the following three requirements can be considered for giving a sense of security to the person with whom the robot performs contact interaction (hereinafter referred to as the user).
Requirement 1: The robot responds appropriately to the user's input, its next action is predictable, and it does not make the user worry about harm.
Requirement 2: The robot's actions are easy to understand and are recognized by the user, its next action is predictable, and it does not make the user worry about harm.
Requirement 3: The contact area with the user is as large as possible, and the rate of change of the contact surface and the reaction force from the robot are as small as possible.
An object of the present technology is to provide a robot that satisfies Requirements 1 to 3 and can perform contact interaction that gives a sense of security.
Specifically, the robot executes the following four measures to realize contact interaction that gives a sense of security.
Measure 1: When the user tries to move the hand being shaken, the robot immediately detects the change by means of initial slip and gently grips back with just enough force to keep the user's hand from slipping, thereby indicating that the user's input has been conveyed to the robot.
Measure 2: The robot coordinates the functions of "seeing (gaze control)," "speaking (speech control)," and "touching (contact control)," which are elements important for recognition as shown in FIG. 1, so that the robot's movements are easy to understand and easy to recognize.
Note that a technique called Humanitude has been established as a care and communication technique for improving recognition and giving a sense of security; it is based on the four pillars of "seeing," "speaking," "touching," and "standing."
Regarding the "speaking" function, for example, a care technique called auto-feedback is effective. By executing auto-feedback in accordance with the robot's control and with changes in observed values, the robot's actions become easier to understand and to recognize, and the effect of giving a sense of security is enhanced.
The "seeing" and "touching" functions may appear independent at first glance, but by making eye contact and "seeing" the other person, the robot is clearly recognized by the user. In addition, by confirming that eye contact has been made and that it has been recognized before moving on to the "touching" action, the robot can give the user a sense of security. Furthermore, by looking not only at the user's eyes but also at the place to be touched (for example, the user's hand in the case of a handshake) while touching, the robot makes it easier for the user to recognize that they are being touched at that moment.
Measure 3: The robot moves the hand unit along a trajectory that makes the contact area with the hand of the user shaking hands as large as possible.
Measure 4: The robot immediately detects, by means of initial slip, the external force applied when the user tries to move the hand being shaken, and moves the hand unit along a trajectory that reduces the reaction force against that external force. Combining Measure 4 with Measure 1 makes it possible, for example, to transition from the state of gripping back under the grip force control of Measure 1 to a state in which the robot itself gently relaxes its force, further enhancing the effect of Measure 1. In particular, the effect can be maximized by using a method that detects the external force by means of initial slip.
<<1. Embodiment>>
Next, an embodiment of the present technology will be described with reference to FIGS. 2 to 11.
<Configuration example of the robot 101>
First, a configuration example of the robot 101, which is an embodiment of the present technology, will be described with reference to FIGS. 2 to 5.
FIG. 2 schematically shows an example of the external configuration of the robot 101.
The robot 101 includes a head 111, a neck 112, a chest 113, an abdomen 114, a waist 115, a cart 116, an arm 117L, and an arm 117R. The head 111, the neck 112, the chest 113, the abdomen 114, and the waist 115 are connected in order from top to bottom, and the waist 115 is mounted on the cart 116. The arm 117L, corresponding to the left arm, is connected to the left side of the chest 113, and the arm 117R, corresponding to the right arm, is connected to the right side of the chest 113.
The neck 112 is rotatable relative to the chest 113 about, for example, the roll, pitch, and yaw axes. The chest 113 is rotatable relative to the abdomen 114 about, for example, the pitch and yaw axes. The abdomen 114 is rotatable relative to the waist 115 about, for example, the pitch and yaw axes. The waist 115 is rotatable relative to the cart 116 about, for example, the yaw axis.
The head 111 includes a sensor unit 121, an eye 122L corresponding to the left eye, and an eye 122R corresponding to the right eye.
The sensor unit 121 is provided near the forehead of the head 111. The sensor unit 121 includes sensors, such as an image sensor, that detect the state of the surroundings of the robot 101, and outputs sensor data indicating the detection results of each sensor.
The eyes 122L and 122R each include a monitor (not shown). The monitor of the eye 122L displays an image of the left eye of the robot 101 and can move the left-eye image; the monitor of the eye 122R displays an image of the right eye of the robot 101 and can move the right-eye image. This allows the robot 101 to move its line of sight and change its facial expression.
Hereinafter, when there is no need to distinguish between the eyes 122L and 122R, they are simply referred to as the eyes 122.
The arm 117L includes a shoulder 131L, an upper arm 132L, a forearm 133L, and a hand unit 134L, which are connected in this order so as to extend from the left side of the chest 113.
The shoulder 131L is rotatable relative to the chest 113 about, for example, the pitch and roll axes. The upper arm 132L is, for example, fixed to the shoulder 131L. The forearm 133L is rotatable relative to the upper arm 132L about, for example, the pitch and yaw axes. The hand unit 134L is rotatable relative to the forearm 133L about, for example, the yaw axis.
Like the arm 117L, the arm 117R includes a shoulder 131R, an upper arm 132R, a forearm 133R, and a hand unit 134R, and is capable of the same movements as the arm 117L.
Hereinafter, when there is no need to distinguish left and right, the arms 117L and 117R are simply referred to as the arms 117, the shoulders 131L and 131R as the shoulders 131, the upper arms 132L and 132R as the upper arms 132, the forearms 133L and 133R as the forearms 133, and the hand units 134L and 134R as the hand units 134.
FIG. 3 shows a configuration example of the hand unit 134L. A of FIG. 3 schematically shows an example of the external configuration of the hand unit 134L, and B of FIG. 3 schematically shows a configuration example of a cross section, in the width direction, of a part 151L of the hand unit 134L.
The hand unit 134L includes a part 151L, a part 152L, and a yaw axis 153L. The parts 151L and 152L are connected to the yaw axis 153L.
The part 151L corresponds to the palm, the back of the hand, the index finger, the middle finger, the ring finger, and the little finger of a human hand. In the part 151L, however, the fingers are not separated but integrated.
The part 152L corresponds to the thumb of a human hand.
The part 151L includes pitch axes 161L to 163L, which extend in the width direction and are parallel to one another. By rotating the pitch axes 161L to 163L individually, the part 151L opens and closes and can grasp and release an object. This allows the hand unit 134L to wrap the part 151L around the user's palm, giving it the degree of freedom to contact the palm side and the back side of the user's hand simultaneously and to apply force to both.
The parts 151L and 152L can rotate together about the yaw axis 153L.
The part 151L further includes a base 171L, a tactile sensor 172L, and an elastic body 173L. The surface of the base 171L is covered with the tactile sensor 172L, and the surface of the tactile sensor 172L is covered with the elastic body 173L.
The base 171L is made of, for example, metal and constitutes the main body of the hand unit 134.
The tactile sensor 172L detects the tactile state of (the part 151L of) the hand unit 134L (for example, one or more of contact, pressure, pressure distribution, force, and slip), and outputs sensor data indicating the detection results.
The elastic body 173L is made of a flexible and elastic material close to the softness of human skin, such as a soft gel material. As a result, when the user shakes hands with the hand unit 134L, the user gets a feel close to that of shaking hands with a human, and initial slip is more likely to occur.
Here, initial slip is a precursor phenomenon of slip. Specifically, when two objects are in contact and one of them starts to move, initial slip is the phenomenon in which only part of the contact surface of that object starts to slide relative to the contact surface of the other object. For example, when the user tries to move the hand being shaken, at first only part of the palm-side contact surface of the user's hand starts to slide relative to the palm-side contact surface of the hand unit 134L.
A to C of FIG. 4 schematically show initial slip. A region A1 indicates a slip area, where slipping of the user's hand has been detected on the contact surface of the hand unit 134L of the robot 101. A region A2 indicates a stick area, where the user's hand remains stuck to the contact surface of the hand unit 134L without moving. The arrows in the figure indicate the direction in which the user moves the hand.
As shown in the figure, when the user tries to move the hand being shaken, the entire contact surface of the user's hand does not start to move relative to the hand unit 134L all at once; only part of it starts to slide first. That is, initial slip occurs, and the slip area A1 then gradually grows.
As described later, the robot 101 controls the movement of the hand unit 134L during a handshake on the basis of this initial slip.
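As a rough illustration of how initial slip might be quantified in software, the following Python sketch computes a stick ratio from a tactile sensor's per-taxel readings and flags initial slip when that ratio starts to drop. The taxel maps, thresholds, and function names are hypothetical and are not taken from the present description.

```python
import numpy as np

def stick_ratio(pressure_map: np.ndarray, slip_map: np.ndarray,
                contact_threshold: float = 0.01) -> float:
    """Fraction of the contact patch that is still sticking (not slipping).

    pressure_map: per-taxel normal pressure reported by the tactile sensor.
    slip_map:     boolean map, True where local micro-slip is detected.
    """
    in_contact = pressure_map > contact_threshold
    contact_area = np.count_nonzero(in_contact)
    if contact_area == 0:
        return 1.0  # no contact at all; treat as "fully stuck"
    stuck = np.count_nonzero(in_contact & ~slip_map)
    return stuck / contact_area

def initial_slip_detected(pressure_map: np.ndarray, slip_map: np.ndarray,
                          ratio_threshold: float = 0.9) -> bool:
    """Initial slip: part of the contact patch slips while the rest still sticks,
    i.e. the stick ratio falls below 1 before gross slip of the whole hand occurs."""
    return stick_ratio(pressure_map, slip_map) < ratio_threshold
```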
Like the hand unit 134L, the hand unit 134R includes a part 151R, a part 152R, and a yaw axis 153R. Like the part 151L, the part 151R includes pitch axes 161R to 163R, a base 171R, a tactile sensor 172R, and an elastic body 173R. The hand unit 134R is capable of the same movements as the hand unit 134L.
Hereinafter, when there is no need to distinguish left and right, the parts 151L and 151R are simply referred to as the parts 151, the parts 152L and 152R as the parts 152, the yaw axes 153L and 153R as the yaw axes 153, the pitch axes 161L and 161R as the pitch axes 161, the pitch axes 162L and 162R as the pitch axes 162, the pitch axes 163L and 163R as the pitch axes 163, the bases 171L and 171R as the bases 171, the tactile sensors 172L and 172R as the tactile sensors 172, and the elastic bodies 173L and 173R as the elastic bodies 173.
FIG. 5 shows a functional configuration example of a handshake execution processing unit 201, which executes processing related to a handshake, one form of contact communication of the robot 101.
The handshake execution processing unit 201 includes a handshake command unit 211, a gaze detection unit 212, a tactile detection unit 213, a handshake state management unit 214, a gaze control unit 215, a display control unit 216, a handshake control unit 217, a motion control unit 218, a speech control unit 219, and an audio output unit 220.
The handshake command unit 211 gives the handshake state management unit 214 a command to execute a handshake in accordance with, for example, the situation around the robot 101.
The gaze detection unit 212 detects the direction of the user's gaze on the basis of sensor data (for example, image data) from the sensor unit 121, and supplies user gaze information indicating the detection result to the gaze control unit 215. The gaze detection unit 212 also controls the speech control unit 219 by giving it commands.
The tactile detection unit 213 detects tactile information for (the part 151 of) the hand unit 134 of the robot 101 on the basis of sensor data from the tactile sensor 172, and supplies it to the handshake state management unit 214, the handshake control unit 217, and the speech control unit 219. The tactile information includes, for example, the contact state (for example, presence or absence of contact, contact position, and the like), the amount of shear deformation, and the grip force applied from outside. The amount of shear deformation is the amount of deformation of the elastic body 173 on the surface of the hand unit 134 in the shear direction, that is, the in-plane direction in which the elastic body is displaced.
The handshake state management unit 214 manages the handshake state of the robot 101. For example, the handshake state management unit 214 detects the handshake state of the robot 101 on the basis of commands from the handshake command unit 211, the tactile information, gaze state information from the gaze control unit 215, and motion state information from the motion control unit 218. The handshake state management unit 214 also controls the handshake state of the robot 101 on the basis of the commands from the handshake command unit 211, the tactile information, the gaze state information, the motion state information, and the handshake state, by giving commands to the gaze control unit 215, the handshake control unit 217, and the speech control unit 219 and by notifying them of the handshake state.
The handshake state of the robot 101 includes, for example, the position and posture of the hand unit 134 relative to the user's hand, the grip force of the hand unit 134, the gaze direction of the robot 101, and the speech state of the robot 101.
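For illustration only, the handshake state described above could be held in a small data structure such as the following; the field names, types, and default values are assumptions made for the sketch and are not defined in the present description.

```python
from dataclasses import dataclass

@dataclass
class HandshakeState:
    """Hypothetical container for the handshake state managed by the
    handshake state management unit 214."""
    hand_position: tuple = (0.0, 0.0, 0.0)          # position of hand unit 134 relative to the user's hand
    hand_orientation: tuple = (0.0, 0.0, 0.0, 1.0)  # posture of hand unit 134 as a quaternion
    grip_force: float = 0.0                         # current grip force of hand unit 134
    gaze_direction: str = "user_face"               # where the robot 101 is currently looking
    speech_state: str = "idle"                      # e.g. "idle", "announcing_handshake", "auto_feedback"
```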
The gaze control unit 215 gives commands to the display control unit 216 and the motion control unit 218 on the basis of the user gaze information from the gaze detection unit 212 and commands from the handshake state management unit 214, and thereby controls the gaze direction of the robot 101. The gaze control unit 215 supplies the handshake state management unit 214 with gaze state information indicating the gaze state of the user and the robot 101 (for example, the relative relationship between the gaze of the robot 101 and that of the user).
The display control unit 216 controls the display of the eye images shown on the monitors of the eyes 122 of the robot 101 on the basis of commands from the gaze control unit 215.
The handshake control unit 217 gives commands to the motion control unit 218 on the basis of the tactile information and commands from the handshake state management unit 214, and thereby controls the handshake motion of the hand unit 134. The handshake control unit 217 includes an initial grip force setting unit 231, a grip force control unit 232, and a handshake position/posture control unit 233.
The initial grip force setting unit 231 sets, on the basis of the tactile information, an initial grip force, which is the initial value of the grip force of the hand unit 134 during a handshake.
The grip force control unit 232 gives commands to the motion control unit 218 on the basis of the tactile information, commands from the handshake state management unit 214, and the initial grip force set by the initial grip force setting unit 231, and thereby controls the grip force of the hand unit 134.
The handshake position/posture control unit 233 gives commands to the motion control unit 218 on the basis of the tactile information and commands from the handshake state management unit 214, and thereby controls the position and posture of the hand unit 134 (of the handshake).
The motion control unit 218 controls the movement, position, posture, and the like of the robot 101 by driving the actuators of each part of the robot 101 and controlling the joints of each part and the cart 116 on the basis of commands from the gaze control unit 215 and the handshake control unit 217. The motion control unit 218 supplies the handshake state management unit 214 with motion state information indicating the motion state of the robot 101, including its movement, position, and posture. The motion control unit 218 also controls the speech control unit 219 by giving it commands.
The speech control unit 219 controls the output of speech from the audio output unit 220 on the basis of the tactile information and of commands from the gaze detection unit 212, the handshake state management unit 214, and the motion control unit 218.
The audio output unit 220 includes an audio output device such as a speaker, and outputs speech under the control of the speech control unit 219.
<Handshake control processing>
Next, the handshake control processing executed by the robot 101 will be described with reference to the flowcharts of FIGS. 6 and 7.
This processing is started when the handshake command unit 211 gives the handshake state management unit 214 a command to shake hands with the user.
In step S1, the robot 101 moves to the front of the user's field of view. Specifically, the handshake state management unit 214 instructs the gaze control unit 215 to prompt the user to face the robot 101. The gaze control unit 215 drives the cart 116 via the motion control unit 218 on the basis of the user gaze information from the gaze detection unit 212, and moves the robot 101 to a position in front of the user's field of view.
In step S2, the gaze control unit 215 determines, on the basis of the user gaze information from the gaze detection unit 212, whether the user is looking at the robot 101. If it is determined that the user is not looking at the robot 101, the processing proceeds to step S3.
In step S3, the robot 101 adjusts its own position and the position of the hand unit 134 so that the hand unit 134 enters the user's field of view. Specifically, the gaze control unit 215 drives the cart 116 via the motion control unit 218 to move the robot 101 to a position where the hand unit 134 is in the user's field of view, and drives the arm 117 via the motion control unit 218 to adjust the position of the hand unit 134 so that it enters the user's field of view.
In step S4, the robot 101 raises the hand unit 134 to the height of its eyes 122. Specifically, the gaze control unit 215 drives the arm 117 via the motion control unit 218 to raise the hand unit 134 to the height of the eyes 122.
In response, the user moves their gaze to follow the movement of the hand unit 134, so that the user's gaze is guided to meet the gaze of the robot 101.
The processing then returns to step S2, and steps S2 to S4 are repeated until it is determined in step S2 that the user is looking at the robot 101.
On the other hand, if the gaze control unit 215 determines in step S2 that the user is looking at the robot 101, it notifies the handshake state management unit 214 accordingly, and the processing proceeds to step S5.
In step S5, the robot 101 attracts the user's attention and then approaches the user. Specifically, the handshake state management unit 214 instructs the gaze control unit 215 and the speech control unit 219 to approach the user and make eye contact with the user. The gaze control unit 215 controls the hand unit 134 and the like via the motion control unit 218 to make the robot 101 wave to the user, and the speech control unit 219 causes the audio output unit 220 to output speech that calls out to the user, such as "I'm coming over to see you now."
This encourages the user to pay attention to the robot 101 and to recognize that the robot 101 is approaching.
Next, the gaze control unit 215 drives the cart 116 via the motion control unit 218 to bring the robot 101 closer to the user.
In step S6, the robot 101 makes eye contact with the user. Specifically, the gaze control unit 215 adjusts the gaze direction of the robot 101 as necessary, by driving each part of the robot 101 via the motion control unit 218 to adjust the robot 101's posture and by moving the eye images via the display control unit 216, so that the gaze of the robot 101 meets the gaze of the user.
At this time, it is desirable that the height of the eyes 122 of the robot 101 be the same as or lower than the height of the user's eyes.
In step S7, the robot 101 moves into a posture of holding out its hand for a handshake. Specifically, the handshake state management unit 214 instructs the gaze control unit 215, the handshake control unit 217, and the speech control unit 219 to start a handshake with the user. The handshake position/posture control unit 233 drives the arm 117 and the like via the motion control unit 218 to hold the hand unit 134 out to the user and move the robot 101 into a posture for shaking hands with the user.
In step S8, the robot 101 alternately directs its gaze at the hand unit 134 and at the user while speaking to the user. Specifically, the speech control unit 219 tells the user that a handshake is about to take place by causing the audio output unit 220 to output speech such as "I'm going to hold your hand" or "Please hold my hand." The gaze control unit 215 moves the gaze of the robot 101 back and forth between the direction of the hand unit 134 and the direction of the user by moving the head 111 via the motion control unit 218 and by moving the eye images via the display control unit 216.
This promotes the user's recognition that a handshake is about to take place.
In step S9, the handshake state management unit 214 determines, on the basis of the tactile information from the tactile detection unit 213, whether the user's hand has touched the hand unit 134. If it is determined that the user's hand has not touched the hand unit 134, the processing returns to step S8.
Steps S8 and S9 are then repeated until it is determined in step S9 that the user's hand has touched the hand unit 134.
On the other hand, if it is determined in step S9 that the user's hand has touched the hand unit 134, the processing proceeds to step S10.
In step S10, the robot 101 detects the user's grip force and sets the initial grip force while telling the user that they are shaking hands. Specifically, the handshake state management unit 214 notifies the gaze control unit 215, the handshake control unit 217, and the speech control unit 219 that the handshake has started.
The speech control unit 219 tells the user that they are shaking hands by causing the audio output unit 220 to output speech such as "I'm holding your hand."
The initial grip force setting unit 231 sets the initial grip force on the basis of the user's grip force included in the tactile information from the tactile detection unit 213. For example, the initial grip force setting unit 231 sets the initial grip force to approximately the same value as the user's grip force at the time the handshake started.
The grip force control unit 232 drives the hand unit 134 via the motion control unit 218 so that it grasps the user's hand with the initial grip force.
As a result, when the user touches the hand unit 134 to shake hands, the robot 101 immediately grips the user's hand back with approximately the same grip force, which can give the user a sense of security.
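A minimal sketch of this initial gripping-back behavior is shown below. The function name, the units, and the explicit safety cap are assumptions (the description only states later that a maximum grip force is preset), so this should be read as an illustration rather than the actual implementation.

```python
MAX_SAFE_GRIP_N = 15.0  # assumed preset upper limit on grip force, in newtons

def set_initial_grip_force(measured_user_grip_n: float) -> float:
    """Return the initial grip force for hand unit 134: roughly match the grip
    force the user applied when the handshake started, but never exceed the
    preset safe maximum."""
    return min(max(measured_user_grip_n, 0.0), MAX_SAFE_GRIP_N)
```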
In step S11, the robot 101 controls the movement of the hand unit 134 in accordance with the movement of the user's hand.
Here, the details of the processing in step S11 will be described with reference to FIGS. 8 to 11.
As described above, when the user shakes hands and tries to move the hand being shaken, initial slip occurs at the contact surface between the hand unit 134 and the user's hand. When the robot 101 detects initial slip, it controls the grip force of the hand unit 134 so as to gently grip the user's hand back with just enough force to keep the user's hand from slipping.
Here, a method of controlling the grip force of the hand unit 134 will be described with reference to FIGS. 8 and 9.
FIG. 8 schematically shows an example of a physical model of elastic contact. Specifically, it schematically shows a cross section of a portion where a user's hand 301 and a finger 302 of a generic robot hand are in contact. The finger 302 is one of a plurality of fingers of the robot hand, and the surface of each finger is covered with an elastic body. A of FIG. 8 shows the state before the hand 301 moves, and B of FIG. 8 shows the state after the hand 301 has moved in the direction of the arrow.
In the following, the radius of the contact surface of the finger 302 is denoted by a, and the radius of the stick area by c. The shear direction of the contact surface of the finger 302 is taken as the x direction, and the normal direction as the z direction.
When the curved surface of the finger 302 is treated as n-dimensional, the following equation (1) is derived from the Hertzian contact theory.
[Equation (1): a relation between the stick ratio c/a, the shear force Fx, the normal force Fz, the friction coefficient μ, and the shear deformation ux; rendered as an image in the original document.]
Here, Fx denotes the shear force applied by the hand 301 to the finger 302 in the shear direction (x direction), Fz denotes the normal force applied by the hand 301 to the finger 302 in the normal direction, μ denotes the coefficient of friction between the hand 301 and the finger 302, and ux denotes the amount of deformation of the finger 302 in the shear direction. Ψ(μ, Fz) is a simplified notation for a function of the friction coefficient μ and the normal force Fz.
The left-hand side of equation (1), c/a, represents the proportion of the contact surface occupied by the stick area and is called the stick ratio. Initial slip can therefore be quantified by a physical quantity, the stick ratio.
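For reference, in the classical Cattaneo–Mindlin analysis of partial slip for a spherical elastic contact, the stick ratio obeys

$$\frac{c}{a} = \left(1 - \frac{F_x}{\mu F_z}\right)^{1/3}.$$

Equation (1) of the present description can be read as a relation of this kind generalized to an n-dimensional surface profile, with $\Psi(\mu, F_z)$ absorbing the dependence on the friction coefficient and the normal force; the spherical formula above is a textbook example and is not necessarily identical to equation (1).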
Note that when the pressure distribution on the finger 302 is detected by a pressure sensor or the like, the decrease in the stick ratio can be measured from the shear deformation ux of the finger 302 without depending on the friction coefficient μ.
The following equations (2) to (4) show how to calculate the grip force fd of the robot hand that is required to control the shear deformation ux of the fingers of the hand to a reference value ux_ref.
[Equations (2) to (4): the PID control law that computes the grip force fd from the translational and rotational shear displacements of each finger; rendered as an image in the original document.]
Here, ur denotes the component, in an arbitrary translational direction r, of the shear displacement caused by the shear deformation of each finger (hereinafter referred to as the translational shear displacement), and ur_ref denotes its reference value (hereinafter referred to as the translational shear displacement reference value). uθ denotes the component, in the rotational direction θ, of the shear displacement caused by the shear deformation of each finger (hereinafter referred to as the rotational shear displacement), and uθ_ref denotes its reference value (hereinafter referred to as the rotational shear displacement reference value). Σur denotes the sum of the translational shear displacements ur of the fingers, and Σur_ref the sum of their reference values ur_ref; Σuθ denotes the sum of the rotational shear displacements uθ of the fingers, and Σuθ_ref the sum of their reference values uθ_ref. Kpr, Kir, Kdr, Kpθ, Kiθ, and Kdθ denote PID control gains. fr denotes the grip force calculated from the shear displacement generated in the translational direction of each finger of the hand, and fθ denotes the grip force calculated from the shear displacement generated in the rotational direction of each finger of the hand.
In this way, the grip force fd is calculated by applying PID (proportional-integral-derivative) control with the vector sum of the shear deformations ux detected by the fingers of the robot as the input.
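Based on the variable definitions above, equations (2) to (4) can be read as a PID law of roughly the following form. This is a reconstruction from the surrounding text (the equations themselves are rendered as images in the original), so the exact grouping of terms may differ:

$$e_r = \Sigma u_r - \Sigma u_{r\_ref}, \qquad e_\theta = \Sigma u_\theta - \Sigma u_{\theta\_ref}$$

$$f_r = K_{pr}\, e_r + K_{ir}\!\int e_r\, dt + K_{dr}\, \dot{e}_r, \qquad
f_\theta = K_{p\theta}\, e_\theta + K_{i\theta}\!\int e_\theta\, dt + K_{d\theta}\, \dot{e}_\theta$$

$$f_d = f_r + f_\theta$$

Here $f_r$ and $f_\theta$ correspond to the translational and rotational grip force terms of equations (2) and (3), and their combination to the total shear-direction grip force $f_d$ of equation (4).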
FIG. 9 shows a configuration example of a grip force control unit 331 that implements the grip force control method expressed by equations (2) to (4).
The grip force control unit 331 includes a reference generation unit 341, a computation unit 342, a grip force calculation unit 343, a computation unit 344, a torque calculation unit 345, an actuator control unit 346, a hand unit 347, a low-pass filter (LPF) 348, and a computation unit 349.
The reference generation unit 341 generates, on the basis of the translational shear displacement uri and the rotational shear displacement uθi of each finger detected by the hand unit 347, the translational shear displacement reference value ur_ref and the rotational shear displacement reference value uθ_ref of each finger, and supplies information indicating these reference values to the computation unit 342.
The computation unit 342 calculates the difference (Σur − Σur_ref) between the sum of the translational shear displacements ur of the fingers of the hand unit 347 and the sum of the translational shear displacement reference values ur_ref, and the difference (Σuθ − Σuθ_ref) between the sum of the rotational shear displacements uθ and the sum of the rotational shear displacement reference values uθ_ref, and supplies information indicating the calculation results to the grip force calculation unit 343.
The grip force calculation unit 343 performs the calculations of equations (2) to (4) described above to calculate the shear-direction grip force fd of the hand unit 347, and supplies information indicating the calculation result to the computation unit 344.
The computation unit 344 calculates the grip force obtained by adding the initial grip force of the hand unit 347 and the shear-direction grip force fd, and supplies information indicating the calculation result to the torque calculation unit 345. This grip force is the grip force applied from the hand unit 347 to the user's hand.
The torque calculation unit 345 calculates, on the basis of the grip force of the hand unit 347, the joint angles qi of each finger, and the Jacobian matrix of each fingertip, the torques τi for driving the finger joints so that the hand unit 347 grips with the calculated grip force, and supplies information indicating the calculation results to the actuator control unit 346.
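A common way to map a desired fingertip grip force to joint torques is the Jacobian-transpose relation; assuming that this is what the torque calculation unit 345 uses (the description names the Jacobian but not the exact formula), the relation would be

$$\tau_i = J_i(q_i)^{\top} f,$$

where $J_i(q_i)$ is the Jacobian of the fingertip of finger $i$ at the joint angles $q_i$, and $f$ is the commanded grip force expressed as a fingertip force.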
The actuator control unit 346 drives the actuators that move the finger joints of the hand unit 347 with the torques τi calculated by the torque calculation unit 345.
The hand unit 347 detects the translational shear displacement uri and the rotational shear displacement uθi of each finger, and supplies a signal indicating the detection results (hereinafter referred to as a shear displacement signal) to the reference generation unit 341 and the LPF 348. The hand unit 347 also supplies a signal indicating the joint angle qi of each finger (hereinafter referred to as a joint angle signal) to the torque calculation unit 345 and the actuator control unit 346.
The LPF 348 reduces high-frequency noise in the shear displacement signal and supplies the noise-reduced shear displacement signal to the computation unit 349.
The computation unit 349 calculates, on the basis of the shear displacement signal, the sum (Σur) of the translational shear displacements uri and the sum (Σuθ) of the rotational shear displacements uθi of the fingers, and supplies information indicating the calculation results to the computation unit 342.
Here, for example, by setting the translational shear displacement reference value ur_ref and the rotational shear displacement reference value uθ_ref to 0, the grip force of the hand unit 347 can be set, and the finger joints can be driven, so that the shear deformation of the hand unit 347 approaches 0 (that is, so that the deformation of the hand unit 347 in the shear direction is reduced).
The grip force control unit 232 controls the grip force of the hand unit 134 by the same method as the grip force control unit 331 of FIG. 9. That is, when initial slip is detected by the tactile detection unit 213, the grip force control unit 232 calculates the grip force of the hand unit 134 on the basis of the shear deformation of the hand unit 134 and the initial grip force set by the initial grip force setting unit 231, so that the shear deformation of the hand unit 134 caused by the initial slip approaches 0. The grip force control unit 232 then drives the hand unit 134 via the motion control unit 218 so that it grips the user's hand with the calculated grip force.
Note that although FIG. 9 shows an example in which the hand unit 347 has i fingers, the hand unit 134 of the robot 101 has one finger, so i = 1.
As a result, the user's attempt to move the hand being shaken is detected immediately from the initial slip, and, on the basis of the shear deformation of the hand unit 134, the user's hand is gently gripped back with just enough force to keep it from slipping and in such a way that the external force caused by the user's hand movement is reduced.
Note that, for safety, a maximum value of the grip force of the hand unit 134 is set in advance, and the grip force of the hand unit 134 is controlled continuously within a range that does not harm the user.
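Putting the elements of FIG. 9 together, a minimal sketch of the grip force loop might look like the following Python code. The class and signal names, the gains, and the simple clamp are placeholders chosen for the sketch; the actual units of FIG. 9 also include a reference generator, a low-pass filter, and a Jacobian-based torque stage that are simplified away here.

```python
import numpy as np

class GripForceController:
    """Sketch of the grip-force loop of FIG. 9: PID on the summed shear
    displacement, added to the initial grip force and clamped to a preset
    safe maximum grip force."""

    def __init__(self, initial_grip: float, max_grip: float,
                 kp: float = 50.0, ki: float = 5.0, kd: float = 1.0, dt: float = 0.005):
        self.initial_grip = initial_grip
        self.max_grip = max_grip
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, shear_sum: float, shear_ref: float = 0.0) -> float:
        # Error between the (filtered) summed shear displacement and its reference.
        # With shear_ref = 0, the controller drives the shear deformation of the
        # hand toward zero, i.e. it grips back just enough to stop the slip.
        error = shear_sum - shear_ref
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error

        f_shear = self.kp * error + self.ki * self.integral + self.kd * derivative
        grip = self.initial_grip + f_shear
        return float(np.clip(grip, 0.0, self.max_grip))  # stay within the safe range
```

The commanded grip force would then be converted into joint torques (for example via the Jacobian transpose, as sketched above) and sent to the actuators.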
In addition, the handshake position/posture control unit 233 calculates a trajectory of the hand unit 134 that cancels the external force caused by the initial slip of the user's hand, and moves the hand unit 134 along the calculated trajectory via the motion control unit 218.
Furthermore, the handshake position/posture control unit 233 adjusts the position and posture of the hand unit 134 via the motion control unit 218 so that the contact area between the hand unit 134 of the robot 101 and the user's hand becomes as large as possible.
Here, a method of controlling the position and posture of the hand unit 134 will be described with reference to FIGS. 10 and 11.
A of FIG. 10 schematically shows the positional relationship between the contact surface (palm side) of the hand unit 134 and the contact surface (palm) of the user's hand 301 before the posture of the hand unit 134 is adjusted. B of FIG. 10 schematically shows the positional relationship between the two contact surfaces after the posture of the hand unit 134 has been adjusted. The arrow L1 indicates the direction of the normal to the contact surface of the hand unit 134.
In this way, the posture of the hand unit 134 is adjusted so that the normal L1 of the contact surface of the hand unit 134 is approximately perpendicular to the contact surface of the user's hand 301. As a result, the contact surface of the hand unit 134 and the contact surface of the user's hand 301 face each other approximately in parallel.
A of FIG. 11 schematically shows the positional relationship between the contact surface (palm side) of the hand unit 134 and the contact surface (palm) of the user's hand 301 before the position of the hand unit 134 is adjusted. B of FIG. 11 schematically shows the positional relationship between the two contact surfaces after the position of the hand unit 134 has been adjusted.
In this way, the position of the hand unit 134 is adjusted so that the contact surface of the hand unit 134 and the contact surface of the user's hand are aligned; that is, the position of the hand unit 134 is adjusted so that the area where the contact surface of the hand unit 134 and the contact surface of the user's hand 301 overlap becomes as large as possible.
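As an illustration of this adjustment, the target pose of the hand unit can be chosen so that its palm normal opposes the normal of the user's palm and the centers of the two contact surfaces coincide. The vector names and the simple alignment criterion below are assumptions made for the sketch, not the concrete algorithm of the present description.

```python
import numpy as np

def target_hand_pose(user_palm_center: np.ndarray,
                     user_palm_normal: np.ndarray,
                     standoff: float = 0.0):
    """Return a target position and palm-normal direction for hand unit 134.

    The hand's palm normal is made antiparallel to the user's palm normal so
    that the two contact surfaces face each other roughly in parallel, and the
    hand's contact center is placed on (or just above) the user's palm center,
    which maximizes the overlap of the two contact surfaces.
    """
    n_user = user_palm_normal / np.linalg.norm(user_palm_normal)
    hand_palm_normal = -n_user                     # face the user's palm
    hand_position = user_palm_center + standoff * n_user
    return hand_position, hand_palm_normal
```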
Returning to FIG. 7, in step S12 the robot 101 calls out to the user in a way that depends on the strength of the user's grip. Specifically, the speech control unit 219 causes the audio output unit 220 to output speech whose content is varied according to the strength of the user's grip detected by the tactile detection unit 213, such as "You're strong today," "I'm glad you seem well," or "How are you feeling?"
This tells the user that they are shaking hands. In addition, since the content of the speech is changed according to the grip force, the user's physical condition and the like can be checked.
In step S13, the handshake state management unit 214 determines whether to continue the handshake.
For example, while the user continues the handshake and is gripping the hand unit 134, the user's fingertips are in contact with the back side of the hand unit 134. On the other hand, when the user finishes the handshake and tries to let go of the hand unit 134, the user's fingertips move away from the back side of the hand unit 134.
Accordingly, if contact with the back side of the hand unit 134 is detected by the tactile detection unit 213, the handshake state management unit 214 determines that the handshake is to be continued, and the processing returns to step S11.
 その後、ステップS13において、握手を終了すると判定されるまで、ステップS11乃至ステップS13の処理が繰り返し実行される。 Thereafter, the processes from step S11 to step S13 are repeatedly executed until it is determined in step S13 that the handshake is finished.
 一方、ステップS13において、握手状態管理部214は、触覚検出部213によりハンド部134の甲側への接触が検出されていない場合、握手を終了すると判定し、処理はステップS14に進む。 On the other hand, in step S13, if the tactile detection unit 213 does not detect contact with the back side of the hand part 134, the handshake state management unit 214 determines that the handshake is to be completed, and the process proceeds to step S14.
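 A minimal sketch of the decision in step S13 might look like the following; the region key "back", the pressure threshold, and the shape of the tactile reading are assumptions about how the tactile detection output could be represented.

```python
def handshake_should_continue(tactile_readings: dict, threshold: float = 0.1) -> bool:
    """Return True while the user's fingertips still press on the back side
    of the hand section, i.e. while the handshake should continue.

    tactile_readings maps sensor regions (e.g. "palm", "back") to pressure
    values; the region names and the threshold are illustrative.
    """
    return tactile_readings.get("back", 0.0) > threshold


# While contact on the back side remains, steps S11 to S13 keep repeating.
if handshake_should_continue({"palm": 0.8, "back": 0.4}):
    pass  # continue the handshake loop
```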
 ステップS14において、ロボット101は、声をかけながら手を開く。具体的には、握手状態管理部214は、視線制御部215、握手制御部217、及び、発話制御部219に握手の終了の指令を与える。発話制御部219は、「手を離すよ」等の発話音声を音声出力部220から出力させることにより、ユーザに握手の終了を伝える。握手位置姿勢制御部233は、動作制御部218を介して、ハンド部134を開くことにより、ロボット101に手を開く動作を実行させる。 In step S14, the robot 101 opens its hand while speaking to the user. Specifically, the handshake state management section 214 gives a command to end the handshake to the line-of-sight control section 215, the handshake control section 217, and the speech control section 219. The speech control section 219 informs the user of the end of the handshake by causing the audio output section 220 to output an utterance such as "I'm letting go of your hand." The handshake position/posture control section 233 causes the robot 101 to perform the motion of opening its hand by opening the hand section 134 via the motion control section 218.
 ステップS15において、握手状態管理部214は、ユーザの手が離れたか否かを判定する。握手状態管理部214は、触覚検出部213によりハンド部134への接触が検出されている場合、ユーザの手がハンド部134から離れていないと判定し、ステップS15の判定処理を繰り返す。一方、握手状態管理部214は、触覚検出部213によりハンド部134への接触が検出されていない場合、ユーザの手が離れたと判定し、処理はステップS16に進む。 In step S15, the handshake state management unit 214 determines whether the user's hand has been released. If the tactile detection unit 213 detects contact with the hand unit 134, the handshake state management unit 214 determines that the user's hand has not left the hand unit 134, and repeats the determination process in step S15. On the other hand, if the tactile detection unit 213 does not detect contact with the hand unit 134, the handshake state management unit 214 determines that the user's hand has been released, and the process proceeds to step S16.
 ステップS16において、ロボット101は、声をかけながら元の姿勢に戻る。具体的には、握手状態管理部214は、握手が終了したことを視線制御部215、握手制御部217、及び、発話制御部219に通知する。発話制御部219は、「手を下げるよ」等の発話音声を音声出力部220から出力させることにより、ハンド部134を元の位置に戻すことをユーザに伝える。握手位置姿勢制御部233は、動作制御部218を介してロボット101の各部を駆動し、ロボット101の姿勢を元の状態に戻す。 In step S16, the robot 101 returns to its original posture while speaking to the user. Specifically, the handshake state management section 214 notifies the line-of-sight control section 215, the handshake control section 217, and the speech control section 219 that the handshake has ended. The speech control section 219 informs the user that the hand section 134 will be returned to its original position by causing the audio output section 220 to output an utterance such as "I'm lowering my hand." The handshake position/posture control section 233 drives each part of the robot 101 via the motion control section 218 to return the posture of the robot 101 to its original state.
 その後、握手制御処理は終了する。 After that, the handshake control process ends.
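 Putting steps S14 to S16 together, the release sequence could be sketched as below. The robot API used here (say, open_hand, any_contact, return_to_home_posture) is hypothetical and only stands in for the speech, motion, and tactile interfaces described above.

```python
import time


def end_handshake(robot, poll_period_s: float = 0.1) -> None:
    """Sketch of steps S14-S16: announce the release, open the hand, wait
    until the tactile sensors report no contact, then announce the return
    and move back to the original posture."""
    robot.say("I'm letting go of your hand.")   # step S14: speak, then open
    robot.open_hand()

    while robot.any_contact():                  # step S15: wait for release
        time.sleep(poll_period_s)

    robot.say("I'm lowering my hand.")          # step S16: speak, then return
    robot.return_to_home_posture()
```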
 以上のようにして、人と人とが握手するような接触インタラクションが実現され、ロボット101による接触インタラクションに対するユーザの安心感を高めることができる。
例えば、以下の効果がもたらされる。
As described above, a contact interaction such as shaking hands between people is realized, and the user's sense of security regarding the contact interaction by the robot 101 can be increased.
For example, the following effects are brought about.
 ロボット101が、ユーザが握手した手を動かそうとしたときの変化を初期滑りにより即座に検出し、滑らない程度の力で優しく握り返すことにより、ユーザの動き(入力)が伝わっていることが提示され、ユーザに安心感が与えられる。 When the user tries to move the hand being shaken, the robot 101 immediately detects the change through the initial slippage and gently grips the hand back with just enough force to keep it from slipping, which shows the user that their movement (input) has been conveyed and gives the user a sense of security.
 認知において重要な要素である「見る」、「話す」、「触れる」機能が連携されることにより、ユーザがロボット101の動きを分かりやすく、認知しやすくなる。特に、「見る」と「触れる」が連携し、ロボット101がユーザを「見る」ことで、ユーザがロボット101を認識し、ロボット101も視線が合うことでユーザに認識されたことを確認した上で、「触れる」動作に移行する。これにより、ロボット101の動きが分かりやすくかつ認知しやすくなり、ユーザに安心感が与えられる。 By linking the functions of "seeing," "speaking," and "touching," which are important elements in recognition, the movements of the robot 101 become easier for the user to understand and recognize. In particular, "seeing" and "touching" are linked: by "seeing" the user, the robot 101 lets the user recognize it, and only after confirming through mutual eye contact that it has been recognized by the user does the robot 101 move on to the "touching" action. This makes the movement of the robot 101 easier to understand and recognize, giving the user a sense of security.
 「話す」機能が、介護手法の一つであるオートフィードバック手法により実行されることで、ロボット101の動きが分かりやすくかつ認知しやすくなり、ユーザに安心感が与えられる。 By executing the "speak" function using the auto-feedback method, which is one of the nursing care methods, the movements of the robot 101 become easier to understand and recognize, giving the user a sense of security.
 ロボット101が、全身を使った視線制御により「見る」機能を実行することで、ロボット101の動きが分かりやすくかつ認知しやすくなり、ユーザに安心感が与えられる。 When the robot 101 performs the "seeing" function by controlling the line of sight using its entire body, the movements of the robot 101 become easier to understand and recognize, giving the user a sense of security.
 ユーザの手とロボット101のハンド部134の接触面積が大きくなることにより、ユーザの危険感が排除される。 By increasing the contact area between the user's hand and the hand portion 134 of the robot 101, the user's sense of danger is eliminated.
 ユーザの初期滑りによる外力を打ち消すような軌道で、ハンド部134が移動する。これにより、ハンド部134の変化速度及び反力が小さくなる軌道制御の下にハンド部134が動くことにより、ユーザの危険感が排除される。 The hand section 134 moves in a trajectory that cancels out the external force caused by the user's initial slippage. As a result, the hand section 134 moves under trajectory control that reduces the rate of change and reaction force of the hand section 134, thereby eliminating the user's sense of danger.
 ユーザの手を優しく握り返す握力制御と、変化速度及び反力が小さくなる軌道制御とが組み合わされることにより、ロボット101が、ユーザの手を握り返す状態から自ら力を抜く状態に遷移できることより、ユーザの安心感が高まる。特に、初期滑りを利用して、ハンド部134の握力が制御され、ユーザの手による外力を検出する手法をとることにより、ユーザの安心感を高める効果が増大する。 By combining grip force control that gently grips the user's hand back with trajectory control that keeps the rate of change and the reaction force small, the robot 101 can transition from a state of gripping the user's hand back to a state of relaxing its grip on its own, which increases the user's sense of security. In particular, by using the initial slippage to control the gripping force of the hand section 134 and to detect the external force exerted by the user's hand, the effect of increasing the user's sense of security is enhanced.
<<2.変形例>>
 以下、上述した本技術の実施の形態の変形例について説明する。
<<2. Modified example >>
Modifications of the embodiment of the present technology described above will be described below.
 図12のA乃至Eは、ハンド部134のパーツ151の断面の構成の変形例を示している。なお、図12のA乃至Eにおいて、上側がパーツ151の掌側を示し、下側がパーツ151の甲側を示している。 12A to 12E show modified examples of the cross-sectional configuration of the part 151 of the hand portion 134. In addition, in A to E of FIG. 12, the upper side shows the palm side of the part 151, and the lower side shows the back side of the part 151.
 例えば、図12のA及びBに示されるように、パーツ151の側面の少なくとも一方において、触覚センサ172及び弾性体173を設けないようにしてもよい。 For example, as shown in FIGS. 12A and 12B, the tactile sensor 172 and the elastic body 173 may not be provided on at least one side of the part 151.
 例えば、図12のCに示されるように、弾性体173を中央になるほど厚くなる山型にしてもよい。 For example, as shown in FIG. 12C, the elastic body 173 may be formed into a mountain shape that becomes thicker toward the center.
 例えば、図12のDに示されるように、パーツ151の掌側及び甲側において、弾性体173を複数の山型が連なる形状としてもよい。 For example, as shown in FIG. 12D, the elastic body 173 may have a shape in which a plurality of chevrons are connected on the palm side and back side of the part 151.
 例えば、図12のEに示されるように、パーツ151の掌側にのみ触覚センサ172及び弾性体173を設けるようにしてもよい。 For example, as shown in FIG. 12E, the tactile sensor 172 and the elastic body 173 may be provided only on the palm side of the part 151.
 また、例えば、ハンド部134のパーツ152Lにも触覚センサ及び弾性体を設けたり、関節を設けたりするようにしてもよい。 Furthermore, for example, the part 152L of the hand portion 134 may also be provided with a tactile sensor and an elastic body, or may be provided with a joint.
 さらに、ハンド部134の構成は、上述した例に限定されるものではなく、変更することが可能である。例えば、ハンド部134の指の数、並びに、関節の数及び位置等を変更することが可能である。 Furthermore, the configuration of the hand section 134 is not limited to the example described above, and can be changed. For example, it is possible to change the number of fingers of the hand section 134, as well as the number and position of joints.
 例えば、図7のステップS13において、握手状態管理部214が、ユーザの状態の時系列の変化を観察し、その結果に基づいて、握手を継続するか終了するかを判定するようにしてもよい。判定条件に用いることが可能なユーザの状態に関する時系列の情報(以下、時系列状態情報と称する)として、例えば、ユーザの握力、表情、バイタル値(体温、心拍、血圧、SpO2等)、発話内容、視線、動き等が想定される。 For example, in step S13 of FIG. 7, the handshake state management unit 214 may observe changes in the user's state over time and determine whether to continue or end the handshake based on the result. Time-series information on the user's state that can be used as a determination condition (hereinafter referred to as time-series state information) includes, for example, the user's grip strength, facial expression, vital signs (body temperature, heart rate, blood pressure, SpO2, etc.), speech content, line of sight, and movements.
 なお、ハンド部134の初期握力及び握力、並びに、握手時のハンド部134の位置及び姿勢の制御に、時系列状態情報が用いられてもよい。 Note that time-series state information may be used to control the initial grip strength and grip strength of the hand portion 134, as well as the position and posture of the hand portion 134 during handshaking.
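 One way such time-series state information could feed the continue/end decision is sketched below; the window length, the choice of fields, and the rule itself are illustrative assumptions rather than anything specified in the publication.

```python
from collections import deque


class UserStateHistory:
    """Rolling window of user-state observations used as an additional
    condition for ending the handshake. Field names, window size, and the
    heuristic are illustrative only."""

    def __init__(self, window: int = 50):
        self.grip = deque(maxlen=window)
        self.heart_rate = deque(maxlen=window)

    def add(self, grip_force: float, heart_rate: float) -> None:
        self.grip.append(grip_force)
        self.heart_rate.append(heart_rate)

    def suggests_ending(self) -> bool:
        if len(self.grip) < self.grip.maxlen:
            return False  # not enough history yet
        recent = list(self.grip)[-10:]
        grip_fading = max(recent) < 0.3 * max(self.grip)   # grip weakening over time
        elevated_hr = self.heart_rate[-1] > 110.0          # user may be uneasy
        return grip_fading or elevated_hr
```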
 例えば、ハンド部134への外力の検出に、ロボット101の腕部117の手首に備えた6軸の力覚センサ、各関節のトルクセンサ等が用いられてもよい。また、ロボット101は、これらのセンサを組み合わせて、握手による外力かその他の外力かを識別して、より安全な接触インタラクションを実現するようにしてもよい。 For example, a six-axis force sensor provided at the wrist of the arm 117 of the robot 101, a torque sensor at each joint, etc. may be used to detect the external force applied to the hand 134. The robot 101 may also combine these sensors to distinguish between external forces caused by handshakes and other external forces to achieve safer contact interactions.
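 A very rough sketch of how the wrist force/torque reading could be combined with the tactile information to separate handshake forces from other external forces is shown below; the rule, the thresholds, and the input representation are purely illustrative assumptions.

```python
def classify_external_force(wrist_force_z: float,
                            palm_pressure: float,
                            shear_magnitude: float) -> str:
    """Label an external force event. Forces accompanied by palm contact and
    shear deformation are treated as part of the handshake; large forces
    without palm contact are treated as other disturbances. Thresholds are
    placeholders."""
    if palm_pressure > 0.5 and shear_magnitude > 0.0:
        return "handshake"
    if abs(wrist_force_z) > 10.0:
        return "other_disturbance"
    return "negligible"
```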
 例えば、ロボット101は、安心感のある接触インタラクションを実現するために、ハンド部134の表面の温度を制御するようにしてもよい。例えば、ハンド部134に温度センサや加熱冷却素子を設け、ハンド部134の表面の温度が、初期状態で人肌より少し高い温度に設定され、ユーザの手の温度になじむように制御されるようにしてもよい。初期温度が人肌より少し高い温度に設定されることにより、ハンド部134が接触した感覚をユーザが感じやすくなる。また、例えば、ロボット101は、時系列状態情報に基づいて、適切な温度を学習するようにしてもよい。 For example, the robot 101 may control the surface temperature of the hand section 134 in order to achieve a touch interaction that gives a sense of security. For example, the hand section 134 may be provided with a temperature sensor and a heating/cooling element so that the surface temperature of the hand section 134 is initially set slightly higher than human skin temperature and is then controlled to adapt to the temperature of the user's hand. By setting the initial temperature slightly higher than human skin temperature, the user can more easily feel the sensation of contact with the hand section 134. Further, for example, the robot 101 may learn an appropriate temperature based on the time-series state information.
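 A first-order sketch of the temperature behaviour described here is given below; the initial temperature, the adaptation rate, and the update rule are assumptions, and a real implementation would drive the heating/cooling element through its own control loop.

```python
from typing import Optional


def next_surface_temperature(current_c: float,
                             user_hand_c: Optional[float],
                             dt_s: float,
                             initial_c: float = 37.5,
                             rate_per_s: float = 0.05) -> float:
    """Return the next surface-temperature setpoint for the hand section.

    Before contact (user_hand_c is None) the surface is held slightly above
    human skin temperature; after contact it drifts toward the user's hand
    temperature with a first-order response. All numbers are placeholders."""
    if user_hand_c is None:
        return initial_c
    return current_c + rate_per_s * dt_s * (user_hand_c - current_c)
```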
 例えば、ロボット101は、ユーザの手の滑りが大きすぎる場合、両方のハンド部134(両手)で握手するようにしてもよい。この場合、各ハンド部134で検出されたせん断方向の変形量ベクトルの和を入力としてPID制御を行うことにより、各ハンド部134の握力が制御される。 For example, if the user's hand slips too much, the robot 101 may shake hands with both hand parts 134 (both hands). In this case, the grip force of each hand section 134 is controlled by performing PID control using the sum of the deformation amount vectors in the shear direction detected by each hand section 134 as input.
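 The two-handed variant could be sketched as a PID loop whose input is the sum of the shear-direction deformation vectors from both hands, as described above. The gains and the mapping from controller output to a grip-force command are placeholders.

```python
import numpy as np


class TwoHandGripPID:
    """PID controller sketch: the error is the magnitude of the summed
    shear-deformation vectors of both hand sections, which the controller
    drives toward zero (no slipping). Gains are illustrative."""

    def __init__(self, kp: float = 2.0, ki: float = 0.1, kd: float = 0.05):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def grip_adjustment(self, shear_left, shear_right, dt: float) -> float:
        # Sum the shear deformation vectors measured by the two hands.
        error = float(np.linalg.norm(np.asarray(shear_left) + np.asarray(shear_right)))
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt if dt > 0 else 0.0
        self.prev_error = error
        # Positive output -> grip slightly harder on both hands.
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```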
 本技術は、例えば、腕のみを備えるアームロボットにも適用することが可能である。この場合、上述した4つの手段のうち「見る」機能を用いる手段2は適用できないが、それ以外の手段が適用されることにより、握手に対するユーザの安心感を高めることができる。 The present technology can also be applied, for example, to an arm robot that has only an arm. In this case, among the four means described above, means 2, which uses the "seeing" function, cannot be applied, but applying the other means can still increase the user's sense of security regarding the handshake.
 本技術は、握手以外の接触インタラクションにも適用することができる。例えば、上述した4つの手段のうち、対象となる接触インタラクションに適用可能な手段を適用することにより、接触インタラクションに対するユーザの安心感を高めることができる。 This technology can also be applied to contact interactions other than handshakes. For example, by applying a method applicable to the target contact interaction among the four methods described above, it is possible to increase the user's sense of security regarding the contact interaction.
 例えば、上述した時系列情報に基づいて、ユーザの感情推定を実行し、推定された感情に基づいて、接触インタラクションの制御が行われるようにしてもよい。 For example, the user's emotion estimation may be performed based on the above-mentioned time series information, and the touch interaction may be controlled based on the estimated emotion.
<<3.その他>>
 <コンピュータの構成例>
 上述した一連の処理は、ハードウエアにより実行することもできるし、ソフトウエアにより実行することもできる。一連の処理をソフトウエアにより実行する場合には、そのソフトウエアを構成するプログラムが、コンピュータにインストールされる。ここで、コンピュータには、専用のハードウエアに組み込まれているコンピュータや、各種のプログラムをインストールすることで、各種の機能を実行することが可能な、例えば汎用のパーソナルコンピュータなどが含まれる。
<<3. Others>>
<Computer configuration example>
The series of processes described above can be executed by hardware or software. When a series of processes is executed by software, the programs that make up the software are installed on the computer. Here, the computer includes a computer built into dedicated hardware and, for example, a general-purpose personal computer that can execute various functions by installing various programs.
 図13は、上述した一連の処理をプログラムにより実行するコンピュータのハードウエアの構成例を示すブロック図である。 FIG. 13 is a block diagram showing an example of the hardware configuration of a computer that executes the above-described series of processes using a program.
 コンピュータ1000において、CPU(Central Processing Unit)1001、ROM(Read Only Memory)1002、RAM(Random Access Memory)1003は、バス1004により相互に接続されている。 In the computer 1000, a CPU (Central Processing Unit) 1001, a ROM (Read Only Memory) 1002, and a RAM (Random Access Memory) 1003 are interconnected by a bus 1004.
 バス1004には、さらに、入出力インタフェース1005が接続されている。入出力インタフェース1005には、入力部1006、出力部1007、記憶部1008、通信部1009、及びドライブ1010が接続されている。 An input/output interface 1005 is further connected to the bus 1004. An input section 1006, an output section 1007, a storage section 1008, a communication section 1009, and a drive 1010 are connected to the input/output interface 1005.
 入力部1006は、入力スイッチ、ボタン、マイクロフォン、撮像素子などよりなる。出力部1007は、ディスプレイ、スピーカなどよりなる。記憶部1008は、ハードディスクや不揮発性のメモリなどよりなる。通信部1009は、ネットワークインタフェースなどよりなる。ドライブ1010は、磁気ディスク、光ディスク、光磁気ディスク、又は半導体メモリなどのリムーバブルメディア1011を駆動する。 The input unit 1006 includes an input switch, a button, a microphone, an image sensor, and the like. The output unit 1007 includes a display, a speaker, and the like. The storage unit 1008 includes a hard disk, nonvolatile memory, and the like. The communication unit 1009 includes a network interface and the like. The drive 1010 drives a removable medium 1011 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
 以上のように構成されるコンピュータ1000では、CPU1001が、例えば、記憶部1008に記録されているプログラムを、入出力インタフェース1005及びバス1004を介して、RAM1003にロードして実行することにより、上述した一連の処理が行われる。 In the computer 1000 configured as described above, the CPU 1001, for example, loads the program recorded in the storage unit 1008 into the RAM 1003 via the input/output interface 1005 and the bus 1004, and executes the program. A series of processing is performed.
 コンピュータ1000(CPU1001)が実行するプログラムは、例えば、パッケージメディア等としてのリムーバブルメディア1011に記録して提供することができる。また、プログラムは、ローカルエリアネットワーク、インターネット、デジタル衛星放送といった、有線または無線の伝送媒体を介して提供することができる。 A program executed by the computer 1000 (CPU 1001) can be provided by being recorded on a removable medium 1011 such as a package medium, for example. Additionally, programs may be provided via wired or wireless transmission media, such as local area networks, the Internet, and digital satellite broadcasts.
 コンピュータ1000では、プログラムは、リムーバブルメディア1011をドライブ1010に装着することにより、入出力インタフェース1005を介して、記憶部1008にインストールすることができる。また、プログラムは、有線または無線の伝送媒体を介して、通信部1009で受信し、記憶部1008にインストールすることができる。その他、プログラムは、ROM1002や記憶部1008に、あらかじめインストールしておくことができる。 In the computer 1000, the program can be installed in the storage unit 1008 via the input/output interface 1005 by loading the removable medium 1011 into the drive 1010. The program can also be received by the communication unit 1009 via a wired or wireless transmission medium and installed in the storage unit 1008. Alternatively, the program can be installed in advance in the ROM 1002 or the storage unit 1008.
 なお、コンピュータが実行するプログラムは、本明細書で説明する順序に沿って時系列に処理が行われるプログラムであっても良いし、並列に、あるいは呼び出しが行われたとき等の必要なタイミングで処理が行われるプログラムであっても良い。 Note that the program executed by the computer may be a program in which processing is performed in time series in the order described in this specification, or a program in which processing is performed in parallel or at necessary timing, such as when a call is made.
 また、本明細書において、システムとは、複数の構成要素(装置、モジュール(部品)等)の集合を意味し、すべての構成要素が同一筐体中にあるか否かは問わない。したがって、別個の筐体に収納され、ネットワークを介して接続されている複数の装置、及び、1つの筐体の中に複数のモジュールが収納されている1つの装置は、いずれも、システムである。 Furthermore, in this specification, a system refers to a collection of multiple components (devices, modules (parts), etc.), regardless of whether all the components are located in the same casing. Therefore, multiple devices housed in separate casings and connected via a network, and a single device in which multiple modules are housed in one casing, are both systems.
 さらに、本技術の実施の形態は、上述した実施の形態に限定されるものではなく、本技術の要旨を逸脱しない範囲において種々の変更が可能である。 Further, the embodiments of the present technology are not limited to the embodiments described above, and various changes can be made without departing from the gist of the present technology.
 例えば、本技術は、1つの機能をネットワークを介して複数の装置で分担、共同して処理するクラウドコンピューティングの構成をとることができる。 For example, the present technology can take a cloud computing configuration in which one function is shared and jointly processed by multiple devices via a network.
 また、上述のフローチャートで説明した各ステップは、1つの装置で実行する他、複数の装置で分担して実行することができる。 Furthermore, each step described in the above flowchart can be executed by one device or can be shared and executed by multiple devices.
 さらに、1つのステップに複数の処理が含まれる場合には、その1つのステップに含まれる複数の処理は、1つの装置で実行する他、複数の装置で分担して実行することができる。 Further, when one step includes multiple processes, the multiple processes included in that one step can be executed by one device or can be shared and executed by multiple devices.
 <構成の組み合わせ例>
 本技術は、以下のような構成をとることもできる。
<Example of configuration combinations>
The present technology can also have the following configuration.
(1)
 ユーザの手と握手可能なハンド部と、
 前記ハンド部の触覚情報を検出する触覚検出部と、
 前記触覚情報に基づいて、前記手と握手している前記ハンド部の握力を制御する握手制御部と
 を備えるロボット。
(2)
 前記触覚情報は、前記ハンド部のせん断方向の変形量を含み、
 前記握手制御部は、前記ハンド部のせん断方向の変形量に基づいて、前記ハンド部の握力を制御する
 前記(1)に記載のロボット。
(3)
 前記握手制御部は、前記ハンド部のせん断方向の変形量に基づいて前記手の初期滑りが検出された場合、前記手が滑らない程度に前記手を握り返すように前記ハンド部を制御する
 前記(2)に記載のロボット。
(4)
 前記握手制御部は、前記ハンド部のせん断方向の変形量が小さくなるように、前記ハンド部の握力を制御する
 前記(3)に記載のロボット。
(5)
 前記握手制御部は、前記ハンド部のせん断方向の変形量に基づいて前記手の初期滑りが検出された場合、前記手により前記ハンド部に加わる外力が減少するように前記ハンド部の軌道を制御する
 前記(2)乃至(4)のいずれかに記載のロボット。
(6)
 前記触覚情報は、前記手の握力を含み、
 前記握手制御部は、前記手の握力に基づいて、前記ハンド部の握力を制御する
 前記(1)乃至(5)のいずれかに記載のロボット。
(7)
 前記握手制御部は、前記手による握手が開始されたときの前記手の握力に基づいて、前記ハンド部の握力の初期値を設定する
 前記(6)に記載のロボット。
(8)
 前記ユーザに握手していることを伝える発話音声の出力を制御するとともに、前記手の握力に基づいて、前記発話音声の内容を変更する発話制御部を
 さらに備える前記(6)又は(7)に記載のロボット。
(9)
 前記握手制御部は、前記手との接触面積が大きくなるように前記ハンド部の位置及び姿勢を制御する
 前記(1)乃至(8)のいずれかに記載のロボット。
(10)
 前記握手制御部は、前記ハンド部の掌側と前記ユーザの掌とがほぼ平行になり、前記ハンド部の掌側と前記ユーザの掌とが重なる部分が大きくなるように、前記ハンド部の位置及び姿勢を制御する
 前記(9)に記載のロボット。
(11)
 前記握手制御部は、前記ハンド部の甲側への前記手の接触が検出されなくなった場合、握手を終了する
 前記(1)乃至(10)のいずれかに記載のロボット。
(12)
 前記握手制御部は、前記ハンド部の甲側への前記手の接触が検出されなくなった場合、前記ハンド部を開く
 前記(11)に記載のロボット。
(13)
 前記ユーザの視線方向を検出する視線検出部と、
 前記ユーザと握手する前に、前記ユーザの視線と合わせるように前記ロボットの視線を制御する視線制御部と
 を備える前記(1)乃至(12)のいずれかに記載のロボット。
(14)
 前記握手制御部は、前記ユーザと前記ロボットの視線が合った後、前記ロボットを握手する体勢にする
 前記(13)に記載のロボット。
(15)
 前記視線制御部は、前記ハンド部が前記ユーザに差し出された後、前記ハンド部と前記ユーザとを交互に見るように前記ロボットの視線を制御する
 前記(14)に記載のロボット。
(16)
 前記ロボットを握手する体勢にする前及びした後の少なくとも一方において、これから前記ユーザと握手することを伝える発話音声の出力を制御する発話制御部を
 さらに備える前記(14)又は(15)に記載のロボット。
(17)
 前記ハンド部に対する前記触覚情報の検出に用いられる触覚センサと、
 前記触覚センサの表面を覆う弾性体と
 を備える前記(1)乃至(16)のいずれかに記載のロボット。
(18)
 ユーザの手と握手可能なハンド部の触覚情報を検出し、
 前記触覚情報に基づいて、前記手と握手している前記ハンド部の握力を制御する
 ロボットの制御方法。
(1)
a hand part that can shake hands with a user's hand;
a tactile detection section that detects tactile information of the hand section;
A handshake control unit that controls the grip force of the hand unit that is shaking hands with the hand based on the tactile information.
(2)
The tactile information includes an amount of deformation of the hand portion in a shear direction,
The robot according to (1), wherein the handshake control section controls the grip force of the hand section based on the amount of deformation of the hand section in the shear direction.
(3)
The robot according to (2), wherein, when initial slippage of the hand is detected based on the amount of deformation of the hand unit in the shear direction, the handshake control unit controls the hand unit so as to grip the hand back just enough that the hand does not slip.
(4)
The robot according to (3), wherein the handshake control unit controls the gripping force of the hand so that the amount of deformation of the hand in the shear direction is small.
(5)
The robot according to any one of (2) to (4), wherein the handshake control unit controls the trajectory of the hand unit so that an external force applied to the hand unit by the hand is reduced when initial slippage of the hand is detected based on the amount of deformation of the hand unit in the shear direction.
(6)
The tactile information includes grip strength of the hand,
The robot according to any one of (1) to (5), wherein the handshake control unit controls the grip force of the hand unit based on the grip force of the hand.
(7)
The robot according to (6), wherein the handshake control unit sets an initial value of the grip force of the hand unit based on the grip force of the hand when the handshake is started.
(8)
The robot according to (6) or (7), further comprising a speech control unit that controls the output of an utterance informing the user that the robot is shaking hands with them and changes the content of the utterance based on the grip strength of the hand.
(9)
The robot according to any one of (1) to (8), wherein the handshake control section controls the position and posture of the hand section so that the contact area with the hand becomes large.
(10)
The robot according to (9), wherein the handshake control unit controls the position and posture of the hand unit so that the palm side of the hand unit and the user's palm become substantially parallel and the portion where the palm side of the hand unit and the user's palm overlap becomes large.
(11)
The robot according to any one of (1) to (10), wherein the handshake control unit ends the handshake when contact of the hand to the back side of the hand unit is no longer detected.
(12)
The robot according to (11), wherein the handshake control unit opens the hand when contact of the hand to the back side of the hand is no longer detected.
(13)
a line-of-sight detection unit that detects the user's line-of-sight direction;
The robot according to any one of (1) to (12), further comprising: a line-of-sight control unit that controls the robot's line of sight to align with the user's line of sight before shaking hands with the user.
(14)
The robot according to (13), wherein the handshake control unit puts the robot in a handshake position after the user and the robot make eye contact.
(15)
The robot according to (14), wherein the line of sight control unit controls the line of sight of the robot to alternately look at the hand and the user after the hand is held out to the user.
(16)
The robot according to (14) or (15), further comprising a speech control unit that controls the output of an utterance informing the user that the robot is about to shake hands with them, at least either before or after the robot assumes the handshake posture.
(17)
a tactile sensor used to detect the tactile information on the hand portion;
The robot according to any one of (1) to (16), further comprising: an elastic body that covers a surface of the tactile sensor.
(18)
Detects tactile information of the hand part that can shake hands with the user's hand,
A method for controlling a robot, comprising: controlling the grip force of the hand part that is shaking hands with the hand based on the tactile information.
 なお、本明細書に記載された効果はあくまで例示であって限定されるものではなく、他の効果があってもよい。 Note that the effects described in this specification are merely examples and are not limited, and other effects may also exist.
 101 ロボット, 111 頭部, 112 首部, 116 台車, 117L,117R 腕部, 121 センサ部, 122L,122R 目, 134L,134R ハンド部, 151L乃至152R パーツ, 153L,153R ヨー軸, 161L乃至163R ピッチ軸, 172L,172R 触覚センサ, 173L,173R 弾性体, 201 握手実行処理部, 211 握手指令部, 212 視線検出部, 213 触覚検出部, 214 握手状態管理部, 215 視線制御部, 216 表示制御部, 217 握手制御部, 218 動作制御部, 219 発話制御部, 220 音声出力部, 231 初期握力設定部, 232 握力制御部, 233 握手位置姿勢制御部 101 Robot, 111 Head, 112 Neck, 116 Cart, 117L, 117R Arm, 121 Sensor, 122L, 122R Eyes, 134L, 134R Hand, 151L to 152R Parts, 153L, 153R Yaw axis, 161L to 163R Pitch axis , 172L, 172R tactile sensor, 173L, 173R elastic body, 201 handshake execution processing unit, 211 handshake command unit, 212 line of sight detection unit, 213 tactile detection unit, 214 handshake state management unit, 215 line of sight control unit, 216 Display control unit, 217 Handshake control unit, 218 Movement control unit, 219 Speech control unit, 220 Audio output unit, 231 Initial grip force setting unit, 232 Grip force control unit, 233 Handshake position and posture control unit

Claims (18)

  1.  ユーザの手と握手可能なハンド部と、
     前記ハンド部の触覚情報を検出する触覚検出部と、
     前記触覚情報に基づいて、前記手と握手している前記ハンド部の握力を制御する握手制御部と
     を備えるロボット。
    a hand part that can shake hands with a user's hand;
    a tactile detection section that detects tactile information of the hand section;
    A handshake control unit that controls the grip force of the hand unit that is shaking hands with the hand based on the tactile information.
  2.  前記触覚情報は、前記ハンド部のせん断方向の変形量を含み、
     前記握手制御部は、前記ハンド部のせん断方向の変形量に基づいて、前記ハンド部の握力を制御する
     請求項1に記載のロボット。
    The tactile information includes an amount of deformation of the hand portion in a shear direction,
    The robot according to claim 1, wherein the handshake control section controls the grip force of the hand section based on the amount of deformation of the hand section in the shear direction.
  3.  前記握手制御部は、前記ハンド部のせん断方向の変形量に基づいて前記手の初期滑りが検出された場合、前記手が滑らない程度に前記手を握り返すように前記ハンド部を制御する
     請求項2に記載のロボット。
    The robot according to claim 2, wherein, when initial slippage of the hand is detected based on the amount of deformation of the hand unit in the shear direction, the handshake control unit controls the hand unit so as to grip the hand back just enough that the hand does not slip.
  4.  前記握手制御部は、前記ハンド部のせん断方向の変形量が小さくなるように、前記ハンド部の握力を制御する
     請求項3に記載のロボット。
    The robot according to claim 3, wherein the handshake control unit controls the gripping force of the hand so that the amount of deformation of the hand in the shear direction becomes small.
  5.  前記握手制御部は、前記ハンド部のせん断方向の変形量に基づいて前記手の初期滑りが検出された場合、前記手により前記ハンド部に加わる外力が減少するように前記ハンド部の軌道を制御する
     請求項2に記載のロボット。
    The robot according to claim 2, wherein the handshake control unit controls the trajectory of the hand unit so that an external force applied to the hand unit by the hand is reduced when initial slippage of the hand is detected based on the amount of deformation of the hand unit in the shear direction.
  6.  前記触覚情報は、前記手の握力を含み、
     前記握手制御部は、前記手の握力に基づいて、前記ハンド部の握力を制御する
     請求項1に記載のロボット。
    The tactile information includes grip strength of the hand,
    The robot according to claim 1, wherein the handshake control section controls the gripping force of the hand section based on the gripping force of the hand.
  7.  前記握手制御部は、前記手による握手が開始されたときの前記手の握力に基づいて、前記ハンド部の握力の初期値を設定する
     請求項6に記載のロボット。
    The robot according to claim 6, wherein the handshake control unit sets an initial value of the grip force of the hand unit based on the grip force of the hand when the handshake is started.
  8.  前記ユーザに握手していることを伝える発話音声の出力を制御するとともに、前記手の握力に基づいて、前記発話音声の内容を変更する発話制御部を
     さらに備える請求項6に記載のロボット。
    The robot according to claim 6, further comprising a speech control unit that controls the output of an utterance informing the user that the robot is shaking hands with them and changes the content of the utterance based on the grip strength of the hand.
  9.  前記握手制御部は、前記手との接触面積が大きくなるように前記ハンド部の位置及び姿勢を制御する
     請求項1に記載のロボット。
    The robot according to claim 1, wherein the handshake control section controls the position and posture of the hand section so that the contact area with the hand increases.
  10.  前記握手制御部は、前記ハンド部の掌側と前記ユーザの掌とがほぼ平行になり、前記ハンド部の掌側と前記ユーザの掌とが重なる部分が大きくなるように、前記ハンド部の位置及び姿勢を制御する
     請求項9に記載のロボット。
    The robot according to claim 9, wherein the handshake control unit controls the position and posture of the hand unit so that the palm side of the hand unit and the user's palm become substantially parallel and the portion where the palm side of the hand unit and the user's palm overlap becomes large.
  11.  前記握手制御部は、前記ハンド部の甲側への前記手の接触が検出されなくなった場合、握手を終了する
     請求項1に記載のロボット。
    The robot according to claim 1, wherein the handshake control unit ends the handshake when contact of the hand to the back side of the hand unit is no longer detected.
  12.  前記握手制御部は、前記ハンド部の甲側への前記手の接触が検出されなくなった場合、前記ハンド部を開く
     請求項11に記載のロボット。
    The robot according to claim 11, wherein the handshake control unit opens the hand when contact of the hand to the back side of the hand is no longer detected.
  13.  前記ユーザの視線方向を検出する視線検出部と、
     前記ユーザと握手する前に、前記ユーザの視線と合わせるように前記ロボットの視線を制御する視線制御部と
     を備える請求項1に記載のロボット。
    a line-of-sight detection unit that detects the user's line-of-sight direction;
    The robot according to claim 1, further comprising: a line-of-sight control unit that controls the robot's line of sight to align with the user's line of sight before shaking hands with the user.
  14.  前記握手制御部は、前記ユーザと前記ロボットの視線が合った後、前記ロボットを握手する体勢にする
     請求項13に記載のロボット。
    The robot according to claim 13, wherein the handshake control unit puts the robot in a handshake position after the user and the robot make eye contact.
  15.  前記視線制御部は、前記ハンド部が前記ユーザに差し出された後、前記ハンド部と前記ユーザとを交互に見るように前記ロボットの視線を制御する
     請求項14に記載のロボット。
    The robot according to claim 14, wherein the line of sight control unit controls the line of sight of the robot to alternately look at the hand and the user after the hand is held out to the user.
  16.  前記ロボットを握手する体勢にする前及びした後の少なくとも一方において、これから前記ユーザと握手することを伝える発話音声の出力を制御する発話制御部を
     さらに備える請求項14に記載のロボット。
    The robot according to claim 14, further comprising a speech control unit that controls the output of an utterance informing the user that the robot is about to shake hands with them, at least either before or after the robot assumes the handshake posture.
  17.  前記ハンド部に対する前記触覚情報の検出に用いられる触覚センサと、
     前記触覚センサの表面を覆う弾性体と
     を備える請求項1に記載のロボット。
    a tactile sensor used to detect the tactile information on the hand portion;
    The robot according to claim 1, further comprising: an elastic body that covers the surface of the tactile sensor.
  18.  ユーザの手と握手可能なハンド部の触覚情報を検出し、
     前記触覚情報に基づいて、前記手と握手している前記ハンド部の握力を制御する
     ロボットの制御方法。
    Detects tactile information of the hand part that can shake hands with the user's hand,
    A method for controlling a robot, comprising: controlling the grip force of the hand part that is shaking hands with the hand based on the tactile information.
PCT/JP2023/021834 2022-06-29 2023-06-13 Robot, and robot control method WO2024004622A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-104293 2022-06-29
JP2022104293 2022-06-29

Publications (1)

Publication Number Publication Date
WO2024004622A1 true WO2024004622A1 (en) 2024-01-04

Family

ID=89382080

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/021834 WO2024004622A1 (en) 2022-06-29 2023-06-13 Robot, and robot control method

Country Status (1)

Country Link
WO (1) WO2024004622A1 (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004283975A (en) * 2003-03-24 2004-10-14 Advanced Telecommunication Research Institute International Communication robot
JP2006247780A (en) * 2005-03-10 2006-09-21 Advanced Telecommunication Research Institute International Communication robot
JP2007185763A (en) * 2005-12-12 2007-07-26 Honda Motor Co Ltd Controller of legged mobile robot
JP2009028859A (en) * 2007-07-27 2009-02-12 Toshiba Corp Manipulator and robot
JP2019139758A (en) * 2018-02-09 2019-08-22 マスターカード アジア パシフィック ピーティーイー リミテッドMastercard Asia/Pacific Pte.Ltd. System and method for conducting transaction
WO2022039058A1 (en) * 2020-08-20 2022-02-24 ソニーグループ株式会社 Information processing device, information processing method, and program

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
OOWADA, KENICHI: " Let's Move the Popular Robot "Pepper" - Creating Apps for Greetings and Handshakes Activated When a Person's Face or Voice Is Recognized", NIKKEI LINUX, NIKKEI BUSINESS PUBLICATIONS, JP, vol. 17, no. 5, 1 May 2015 (2015-05-01), JP , pages 82 - 89, XP009552011, ISSN: 1345-0182 *
RAVINDRA P., S. DE SILVA, NOSAKA, TATSUYA, FUKAMACHI, KENTA, TAKEDA, YASUTAKA: "The Effective Communication between "Mako no Te" and Human through the Patterns of Grasping", THE TRANSACTIONS OF HUMAN INTERFACE SOCIETY, vol. 17, no. 2, 1 January 2015 (2015-01-01), pages 191 - 200, XP093122644, ISSN: 2186-828X *

Similar Documents

Publication Publication Date Title
Noronha et al. “Wink to grasp”—comparing eye, voice & EMG gesture control of grasp with soft-robotic gloves
Eid et al. A novel eye-gaze-controlled wheelchair system for navigating unknown environments: case study with a person with ALS
Prattichizzo et al. Towards wearability in fingertip haptics: a 3-dof wearable device for cutaneous force feedback
Cortese et al. A mechatronic system for robot-mediated hand telerehabilitation
Tran et al. Patient-specific, voice-controlled, robotic flexotendon glove-ii system for spinal cord injury
Jackowski et al. Head motion and head gesture-based robot control: A usability study
Ben-Tzvi et al. The design evolution of a sensing and force-feedback exoskeleton robotic glove for hand rehabilitation application
Gunasekara et al. Control methodologies for upper limb exoskeleton robots
Kirchner et al. Intuitive interaction with robots–technical approaches and challenges
Goldau et al. Autonomous multi-sensory robotic assistant for a drinking task
JP7315568B2 (en) Grasping assistance system and method
WO2021015025A1 (en) Control device, control method, and control program
Penaloza et al. Towards intelligent brain-controlled body augmentation robotic limbs
Garcia et al. EEG control of an industrial robot manipulator
Haseeb et al. Head gesture-based control for assistive robots
WO2024004622A1 (en) Robot, and robot control method
Krishnaswamy et al. Toward the development of a BCI and gestural interface to support individuals with physical disabilities
Sahadat et al. Simultaneous multimodal access to wheelchair and computer for people with tetraplegia
Yamada et al. Proposal of a psychophysiological experiment system applying the reaction of human pupillary dilation to frightening robot motions
Modi et al. Interactive iiot-based 5dof robotic arm for upper limb telerehabilitation
Weisz et al. A user interface for assistive grasping
Singer et al. Automatic support control of an upper body exoskeleton—Method and validation using the Stuttgart Exo-Jacket
JP2004174644A (en) Control device for leg type mobile robot
Ketkar et al. Design and Development of a Spherical 5-Bar Thumb Exoskeleton Mechanism for Poststroke Rehabilitation
Khairuddin et al. Assistive-as-needed strategy for upper-limb robotic systems: an initial survey

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23831070

Country of ref document: EP

Kind code of ref document: A1