WO2022123618A1 - Information processing device, system, information processing method, and program - Google Patents
Information processing device, system, information processing method, and program
- Publication number
- WO2022123618A1 (PCT/JP2020/045387; JP2020045387W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- contact event
- contact
- robot
- information processing
- active
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H13/00—Toy figures with self-moving parts, with or without movement of the toy as a whole
- A63H13/02—Toy figures with self-moving parts, with or without movement of the toy as a whole imitating natural actions, e.g. catching a mouse by a cat, the kicking of an animal
- A63H13/04—Mechanical figures imitating the movement of players or workers
- A63H13/06—Mechanical figures imitating the movement of players or workers imitating boxing or fighting
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H30/00—Remote-control arrangements specially adapted for toys, e.g. for toy vehicles
- A63H30/02—Electrical arrangements
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/003—Manipulators for entertainment
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J13/085—Force or torque sensors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J13/088—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices with position, velocity or acceleration sensors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/0084—Programme-controlled manipulators comprising a plurality of manipulators
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1682—Dual arm manipulator; Coordination of several manipulators
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H2200/00—Computerized interactive toys, e.g. dolls
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40264—Human like, type robot arm
Definitions
- the present invention relates to an information processing device, a system, an information processing method and a program.
- Patent Document 1 discloses a robot device including a module that generates an operation synchronized with a user's operation according to a user's image input via a camera image input device.
- an object of the present invention is to provide an information processing device, a system, an information processing method, and a program that enable detection and use of unexpected movements of a robot.
- According to an aspect of the present invention, there is provided an information processing apparatus that acquires the detection result of a first contact event occurring in a housing portion of a first robot and the detection result of a second contact event occurring in a housing portion of a second robot, and that includes a score addition unit that adds a score to the first robot when the first contact event is active and the second contact event detected in synchronization with the first contact event is not active.
- According to another aspect of the present invention, there is provided a system including the above information processing apparatus and the first and second robots, wherein each of the first and second robots includes a contact event detection unit that detects a contact event occurring in its housing portion and an activity determination unit that determines that the contact event is active when the contact event is detected in an operation execution portion included in the housing portion, and the detection results of the first and second contact events and the activity determination results for the first and second contact events are transmitted from the first and second robots to the information processing apparatus.
- According to yet another aspect of the present invention, there is provided an information processing method including a step of acquiring the detection result of a first contact event occurring in the housing portion of a first robot and the detection result of a second contact event occurring in the housing portion of a second robot, and a step of adding a score to the first robot when the first contact event is active and the second contact event detected in synchronization with the first contact event is not active.
- According to still another aspect of the present invention, there is provided a program for causing a computer to function as an information processing device including a score addition unit that adds a score to the first robot when a first contact event occurring in the housing portion of a first robot is active and a second contact event occurring in the housing portion of a second robot and detected in synchronization with the first contact event is not active.
- FIG. 1 is a diagram schematically showing a configuration example of a system according to an embodiment of the present invention.
- the system 10 provides a fighting game by robots 100A and 100B.
- the system 10 further includes controllers 200A, 200B and a determination device 300.
- Robots 100A and 100B each include a head 101A or 101B (hereinafter collectively referred to as head 101), a body 102A or 102B (hereinafter collectively referred to as body 102), arms 103A, 103B, 104A, 104B (hereinafter collectively referred to as arms 103, 104), and legs 105A, 105B, 106A, 106B (hereinafter collectively referred to as legs 105, 106).
- the controllers 200A and 200B generate an operation signal according to a user's operation input to a button or stick (not shown), for example.
- the controllers 200A, 200B may generate operation signals according to user movements identified by motion capture, including cameras or sensors (not shown).
- the operation signal is transmitted from the controller 200A to the robot 100A and from the controller 200B to the robot 100B, respectively, and the robots 100A and 100B operate according to the respective operation signals.
- the fighting game is provided by the robots 100A and 100B operating according to the operation signals of the controllers 200A and 200B, respectively, and driving the arms 103 and 104 and the legs 105 and 106 to attack and defend against the opponent.
- the rules of the fighting game are not particularly limited, but for example, when an attack by either of the arms 103A and 104A of the robot 100A hits the head 101B or the body 102B of the robot 100B, points may be given to the robot 100A.
- the determination device 300 determines the points as described above according to the information transmitted from the robots 100A and 100B, respectively.
- FIG. 2 is a diagram schematically showing a configuration example of a robot in the system shown in FIG. 1.
- the robot 100 has, for example, an information processing device 110 mounted on a body 102.
- the information processing device 110 includes a CPU (Central Processing Unit) 111, a RAM (Random Access Memory) 112, a ROM (Read Only Memory) 113, an external memory 114, and the like that execute arithmetic processing.
- the information processing device 110 determines the operation of each part of the robot 100 according to the operation signal or control signal received by the communication interface 121.
- the communication interface 121 is connected to the information processing device 110 via the bus interface 115.
- the information processing apparatus 110 controls the motors 130 that rotationally drive the joints of the arms 103, 104, hands 103H, 104H, legs 105, 106, and feet 105F, 106F so that the determined movements are executed.
- the head 101 and the body 102 may also be provided with joints driven by the motor 130.
- the CPU 111 of the information processing apparatus 110 selects, from the motion patterns stored in the ROM 113 or the external memory 114, a pattern corresponding to the determined motion, sets the foot movements, the ZMP (Zero Moment Point) trajectory, the trunk movement, the upper limb movement, the horizontal position and height of the waist, and the like according to the selected pattern, and controls the motors 130 according to these set values.
- the robot 100 is equipped with sensors such as an inertial measurement unit (IMU) 122.
- the sensor is connected to the information processing device 110 via the bus interface 115, and the information processing device 110 controls each part of the robot 100 with reference to the output value of the sensor as needed.
- the information processing device 110 may transmit the determination information regarding the attack and the defense acquired by the process described later to the determination device 300 shown in FIG.
- Alternatively, the information processing device 110 may transmit at least a part of the sensor output values, the selected operation pattern, the set values for controlling the motors 130, and the like to the determination device 300, which then extracts the determination information regarding attack and defense from them.
- FIG. 3 is a diagram schematically showing a configuration example of a determination device in the system shown in FIG. 1.
- the determination device 300 has an information processing device 310 and a communication interface 321.
- the information processing apparatus 310 includes a CPU 311 that executes arithmetic processing, a RAM 312, a ROM 313, an external memory 314, and the like.
- the information processing device 310 determines the outcome of the fighting game based on the determination information received by the communication interface 321 from the robots 100A and 100B, respectively.
- the determination result or the game score may be transmitted to another device such as a user terminal via the communication interface 321 or may be displayed on the display 322.
- the communication interface 321 and the display 322 are connected to the information processing apparatus 310 via the bus interface 315.
- a contact event is defined as an element for determining the victory or defeat of a fighting game.
- the contact event occurs, for example, when the robot 100A hits any part of the housing of the robot 100B in a fighting game.
- In the robot 100, a torque change is generated both by the robot's own motion, including contact with the floor surface, and by the force applied when an external object, such as the opposing robot, comes into contact with the housing. The magnitude and timing of the former torque change can be predicted from the operation pattern set in the robot 100 and the sensor output values, whereas the latter is difficult to predict because its magnitude and timing fluctuate depending on the position and movement of the opposing robot.
- Such an event accompanied by a torque change generated by contact with an external object is referred to as a contact event in the present specification.
- the contact event defined above is further divided into a passive contact event and an active contact event.
- A passive contact event occurs on the side that is hit: when the robot 100A hits some part of the housing of the robot 100B, a contact event occurs in the robot 100B. Since the robot 100B does not perform the hitting operation, the occurrence of the event itself is difficult to predict on the robot 100B side; in such a case, a passive contact event is said to have occurred in the robot 100B. Conversely, since the robot 100A performs the hitting operation, the occurrence of the contact event itself can be predicted on the robot 100A side; in such a case, an active contact event is said to have occurred in the robot 100A.
- the predicted contact event may not occur, for example, when the robot 100B performs an avoidance operation. Further, even when a contact event occurs, the magnitude and timing of the torque change depend on the position and movement of the robot 100B, so that it is difficult to predict these even if the contact event is active.
- In the present embodiment, passive and active contact events as described above are detected on each of the robot 100A side and the robot 100B side, and the outcome of the fighting game is determined by integrating the detection results.
- the system 10 includes a torque measurement value acquisition unit 510, an angle difference value acquisition unit 520, a contact event detection unit 530, an activity determination unit 540, and a score addition unit 550. These functional configurations are implemented in the information processing device 110 of the robot 100 or the information processing device 310 of the determination device 300 in the system 10 shown in FIG.
- For example, the torque measurement value acquisition unit 510, the angle difference value acquisition unit 520, the contact event detection unit 530, and the activity determination unit 540 may be implemented in the robot 100, and the score addition unit 550 may be implemented in the determination device 300.
- In this case, the determination device 300 includes a contact event detection result acquisition unit 551 that receives the detection results of the contact events from the robots 100A and 100B, an activity determination result acquisition unit 552 that receives the activity determination result for each contact event, and the like.
- Alternatively, the torque measurement value acquisition unit 510 and the angle difference value acquisition unit 520 may be implemented in the robot 100, while the contact event detection unit 530, the activity determination unit 540, and the score addition unit 550 are implemented in the determination device 300.
- the functions of the contact event detection result acquisition unit and the activity determination result acquisition unit are included in the contact event detection unit 530 and the activity determination unit 540.
- the torque measurement value acquisition unit 510 acquires the torque measurement value measured by the motor that drives the joint portion of the housing of the robot 100.
- the housing of the robot 100 includes the head 101, the body 102, the arms 103, 104, and the like, and the joints connecting the respective portions are driven by the motor 130.
- the torque measurement value is acquired, for example, by measuring the current flowing through the motor 130.
- When a joint can be driven in a plurality of directions (for example, roll, pitch, and yaw), the torque measurement value of the motor that drives the joint in each direction may be acquired.
- the acquired torque measurement values may be processed in real time by the contact event detection unit 530, or the torque measurement values acquired in time series may be buffered for a predetermined time to calculate a time difference, a moving average, or the like.
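As a rough illustration of this buffering, the following Python sketch (the class name TorqueBuffer and the window length are hypothetical, not taken from the publication) accumulates time-series torque values and derives a time difference and a moving average from them:

```python
from collections import deque

class TorqueBuffer:
    """Hypothetical helper that buffers time-series torque measurement values."""

    def __init__(self, window_size: int = 10):
        self.samples = deque(maxlen=window_size)

    def push(self, torque: float) -> None:
        self.samples.append(torque)

    def time_difference(self) -> float:
        # Difference between the value at time t and the value at time t-1.
        if len(self.samples) < 2:
            return 0.0
        return self.samples[-1] - self.samples[-2]

    def moving_average(self) -> float:
        if not self.samples:
            return 0.0
        return sum(self.samples) / len(self.samples)
```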
- the angle difference value acquisition unit 520 acquires an angle difference value indicating the difference between the angle measurement value and the angle indication value in the motor that drives the joint portion of the housing of the robot 100.
- the angle measurement value is acquired as an output value of a potentiometer or encoder attached to a joint portion driven by a motor, for example.
- the angle indicated value is, for example, a target value of the rotation angle of each motor 130 determined by the information processing apparatus 110.
- When the torque measurement value acquired by the torque measurement value acquisition unit 510, or a value based on that torque measurement value, exceeds the threshold range, the contact event detection unit 530 detects a contact event occurring in the portions of the housing connected by the joint driven by the motor 130 from which the torque measurement value was acquired. Specifically, for example, when the torque measurement value of the motor that drives the joint connecting the head 101 and the body 102 of the robot 100, or a value based on it, exceeds the threshold range, the contact event detection unit 530 detects a contact event occurring in the head 101 or the body 102.
- the threshold range used by the contact event detection unit 530 to detect contact events corresponds to the range of torque measurement values observed at each motor during normal operation, that is, while the motors driving the joints of the robot 100 operate according to the operation pattern determined by the information processing device 110 and no external force is applied. It is determined, for example, by actually measuring the torque values while the robot 100 performs various operations.
- the threshold range may be set individually for each joint or motor drive direction (for example, Roll, Pitch, Yaw), or a common threshold range may be set for two or more joints or drive directions.
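As a minimal sketch of how such a threshold range might be derived from recorded normal-operation data (the margin parameter and function names are assumptions for illustration, not part of the publication):

```python
def calibrate_threshold_range(normal_torques, margin=0.1):
    """Derive a threshold range (lo, hi) for one joint/drive direction from torque
    values recorded during normal operation with no external force applied.
    The margin widening the observed range is a hypothetical tuning parameter."""
    lo, hi = min(normal_torques), max(normal_torques)
    span = hi - lo
    return lo - margin * span, hi + margin * span

def exceeds_threshold_range(torque, threshold_range):
    """True when the torque measurement value leaves the calibrated range."""
    lo, hi = threshold_range
    return torque < lo or torque > hi
```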
- The contact event detection unit 530 may simply detect a contact event when the torque measurement value exceeds the threshold range, that is, when the torque measurement value is larger than the maximum value or smaller than the minimum value of the threshold range. Alternatively, as in the examples described later, the contact event detection unit 530 may detect a contact event when a component of the torque measurement value below a predetermined frequency exceeds the threshold range, or when the time difference of the torque measurement value exceeds the threshold range. Further, as in the examples described later, the contact event detection unit 530 may detect a contact event based on the ratio of a plurality of torque measurement values or on the angle difference value acquired by the angle difference value acquisition unit 520.
- the activity determination unit 540 determines whether or not the contact event detected by the contact event detection unit 530 is active.
- the active contact event is an event whose occurrence itself was predictable because the robot 100 is performing an operation for inducing the contact event.
- the activeness determination unit 540 determines that the contact event is active when the contact event is detected in the operation execution portion included in the housing portion of the robot 100.
- the activity determination unit 540 can specify the motion execution portion in the robot 100 based on the motion pattern of the joint portion determined by the information processing apparatus 110.
- For example, when the hand portions 103H and 104H, which are the tip portions of the arm portions 103 and 104 supported by the body portion 102 of the robot 100, satisfy a predetermined positional relationship with respect to the body portion 102, the activity determination unit 540 may specify the arm portions 103 and 104 including the hand portions 103H and 104H as the motion execution portion.
- the contact event that the activeness determination unit 540 does not determine to be active is treated as a passive contact event.
- The score addition unit 550 determines the outcome of the fighting game based on the contact event detection results by the contact event detection unit 530 and the contact event activity determination results by the activity determination unit 540 for each of the robots 100A and 100B. Specifically, for example, when the first contact event detected for the robot 100A is active and the second contact event detected for the robot 100B in synchronization with the first contact event is not active, the score addition unit 550 determines that the robot 100A has successfully hit the robot 100B and adds a score to the robot 100A. A specific example of this processing by the score addition unit 550 will be described later.
- FIG. 5 is a diagram showing an example of a specific functional configuration for detecting a contact event in one embodiment of the present invention.
- In the illustrated example, the torque measurement value acquisition unit 510 acquires the torque measurement values in three directions of the motor that drives the joint connecting the head 101 and the body 102 (Head_Roll, Head_Pitch, Head_Yaw), the torque measurement values in three directions of the motor that drives the joint connecting the two portions within the body 102 (Trunk_Roll, Trunk_Pitch, Trunk_Yaw), and the torque measurement values in two directions of each of the motors that drive the shoulder joints connecting the body 102 and the arms 103 and 104 (Left_Shoulder_Roll, Left_Shoulder_Pitch, Right_Shoulder_Roll, Right_Shoulder_Pitch).
- Meanwhile, the angle difference value acquisition unit 520 acquires the angle difference value of the motor that drives the joint connecting the two portions within the body 102 (Trunk_Yaw) and the angle difference values of the motors that drive the shoulder joints (Left_Shoulder_Roll, Left_Shoulder_Pitch, Right_Shoulder_Roll, Right_Shoulder_Pitch). In FIG. 5, these directions may be abbreviated as R: Roll, P: Pitch, Y: Yaw.
- The contact event detection unit 530 includes contact event detection units 530A to 530D, which detect contact events by comparing torque measurement values with threshold ranges in different ways, and a contact event detection unit 530E, which integrates their contact event detection results. Each part is further described below.
- The contact event detection unit 530A determines whether each of the torque measurement values (Head_Roll, Head_Pitch, Head_Yaw) of the head 101 and the torque measurement value (Trunk_Roll) of the body 102 exceeds its threshold range. Examples of comparisons between torque measurement values and threshold ranges are shown in FIGS. 6A and 6B. As shown in FIG. 6A, for the motors that drive each part of the robot 100, the torque measurement value during normal operation does not exceed the threshold range R. As described above, the threshold range R is determined, for example, by collecting actual measurements of the torque values during normal operation of the robot 100. In the illustrated example, the threshold range R is set symmetrically about 0, but it may also be a range biased toward either positive or negative.
- The contact event detection unit 530A detects a contact event in at least one of the head 101 and the body 102 when any of the torque measurement values (Head_Roll, Head_Pitch, Head_Yaw, Trunk_Roll) exceeds the threshold range R set for it.
- The contact event detection unit 530A (and likewise the other detection units) may apply a low-pass filter to the torque measurement value and detect a contact event when the component of the torque measurement value below a predetermined frequency exceeds the threshold range R. Since the torque measurement value may include high-frequency components due to noise and measurement error, removing frequency components higher than the fluctuation frequency produced by a contact event reduces false detections and improves the accuracy of contact event detection.
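A minimal sketch of this idea, using a simple first-order (exponential) low-pass filter before the threshold comparison; the smoothing factor and threshold values are placeholders, and the publication does not specify the filter type:

```python
class LowPassContactDetector:
    """Filters the torque signal and flags a contact event when the low-frequency
    component leaves the threshold range R."""

    def __init__(self, threshold_range=(-0.5, 0.5), alpha=0.2):
        self.lo, self.hi = threshold_range
        self.alpha = alpha          # smaller alpha -> lower cutoff frequency
        self.filtered = 0.0

    def update(self, torque: float) -> bool:
        # Exponential smoothing acts as a simple low-pass filter.
        self.filtered = self.alpha * torque + (1.0 - self.alpha) * self.filtered
        return self.filtered < self.lo or self.filtered > self.hi
```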
- Referring again to FIG. 5, when the torque measurement value (Trunk_Pitch) of the body 102 exceeds the threshold range, the contact event detection unit 530B detects a contact event in the head 101 if the ratio of the torque measurement value (Head_Pitch) of the head 101 to the torque measurement value (Trunk_Pitch) of the body 102 (Head_Pitch / Trunk_Pitch) exceeds a threshold value, and detects a contact event in the body 102 if the ratio does not exceed the threshold value. By such a determination, as described below, it is possible to correctly identify whether the contact event has occurred in the head 101 or in the body 102.
- In the robot 100, the legs 105 and 106 at the bottom are restrained ends in contact with the floor surface, while the head 101 at the top is a free end. That is, the housing of the robot 100 includes a first portion consisting of the legs 105 and 106 and the lower part of the body 102, a second portion consisting of the upper part of the body 102 and supported by the first portion, and a third portion, the head 101, which is supported by the second portion and constitutes the free end.
- In such a structure, the fluctuation of the torque measurement value when a force F is applied by a contact event occurring in the third portion, which constitutes the free end, tends to be observed more prominently at the joint between the second and first portions than at the joint between the third and second portions.
- FIGS. 9A and 9B show the torque measurement values (Head_Pitch, Trunk_Pitch) and their ratio (Head_Pitch / Trunk_Pitch) for the case where a contact event due to an external impact occurs in the body 102 and the case where it occurs in the head 101, respectively.
- In both cases, the torque measurement value (Trunk_Pitch) of the body 102 fluctuates greatly in the positive direction and exceeds the threshold value (the maximum value of the threshold range R), while the torque measurement value (Head_Pitch) of the head 101 shows only a small fluctuation and does not exceed the threshold value. The ratio of the torque measurement values (Head_Pitch / Trunk_Pitch), however, is almost 0 in the example of FIG. 9A, whereas it is 0.25 or more in the example of FIG. 9B, so a clear difference arises between the two cases. This is because the torque measurement value of the head 101 hardly fluctuates in the case of FIG. 9A, in which the contact event occurs in the body 102, whereas in the case of FIG. 9B, in which the contact event occurs in the head 101, the torque measurement value of the head 101 also changes, although not as much as that of the body 102.
- Therefore, for example, by setting 0.25 as a threshold value for the ratio of the torque measurement values (Head_Pitch / Trunk_Pitch), it can be determined that the contact event has occurred in the head 101 if the ratio exceeds the threshold value, and in the body 102 if it does not.
- The above threshold value for the ratio is only an example; in another example, an appropriate threshold value can be set by collecting measured values of the ratio for cases where the contact event occurs in the head 101 and for cases where it occurs in the body 102.
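A schematic sketch of this ratio-based decision (the threshold range and the 0.25 ratio threshold are used only as placeholders reflecting the example above):

```python
def classify_contact_location(head_pitch, trunk_pitch,
                              trunk_range=(-0.5, 0.5), ratio_threshold=0.25):
    """Return 'head', 'body', or None (no contact event), following the 530B-style
    logic: evaluate the ratio only when Trunk_Pitch leaves its threshold range."""
    lo, hi = trunk_range
    if lo <= trunk_pitch <= hi:
        return None                                   # no contact event detected
    ratio = head_pitch / trunk_pitch if trunk_pitch != 0 else 0.0
    return "head" if ratio > ratio_threshold else "body"
```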
- Referring again to FIG. 5, the contact event detection unit 530C detects a contact event occurring in the body 102 when the torque measurement value (Trunk_Yaw) of the motor that drives the joint connecting the two portions of the body 102 exceeds the threshold range and the angle difference value of that motor also exceeds its threshold range.
- As described below, the torque measurement value fluctuates relatively greatly depending on the posture of the robot 100, whereas the angle difference value depends little on the posture; therefore, using the angle difference value improves the accuracy of contact event detection.
- FIGS. 10A and 10B show torque measurement values and angle difference values for each of the case where the contact event does not actually occur in the body portion 102 and the case where the contact event occurs.
- In both cases, the torque measurement values exceed the threshold range.
- On the other hand, the angle difference value does not exceed the threshold value in the example of FIG. 10A, whereas it does in the example of FIG. 10B. Therefore, by making the angle difference value exceeding its threshold a condition for detecting a contact event, erroneous detection can be prevented in cases such as FIG. 10A, where the torque measurement value fluctuates due to a change in the posture of the robot 100 even though no contact event has occurred.
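A minimal sketch of the combined condition used by the contact event detection unit 530C; the numeric limits are illustrative placeholders, not calibrated values from the publication:

```python
def detect_body_yaw_contact(trunk_yaw_torque, trunk_yaw_angle_diff,
                            torque_range=(-0.4, 0.4), angle_diff_limit=3.0):
    """Report a contact event in the body only when the yaw torque leaves its
    threshold range AND the angle difference (indicated vs. measured angle)
    exceeds its own limit, which suppresses posture-induced false positives."""
    torque_out = not (torque_range[0] <= trunk_yaw_torque <= torque_range[1])
    angle_out = abs(trunk_yaw_angle_diff) > angle_diff_limit
    return torque_out and angle_out
```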
- Referring again to FIG. 5, the contact event detection unit 530D detects a contact event occurring in the respective arm 103 or 104 when the time difference of the torque measurement values (Left_Shoulder_Roll, Left_Shoulder_Pitch, Right_Shoulder_Roll, Right_Shoulder_Pitch) of the motors that drive the shoulder joints exceeds the threshold range and the angle difference value of each motor also exceeds its threshold range.
- Specifically, instead of the torque detection value at a single time t, the contact event detection unit 530D compares the time difference, that is, the difference between the torque detection values at time t and time t-1, with the threshold range. In parts such as the arms 103 and 104, the difference between the fluctuation of the torque detection value during normal operation and the fluctuation when a contact event occurs is small, so detection accuracy is difficult to improve if the torque detection value is simply compared with a threshold range. In contrast, the fluctuation of the torque detection value per unit time when a contact event occurs is larger than during normal operation, so using the time difference allows the contact event to be detected correctly.
- Note that the contact event detection unit 530D enables more accurate event detection by also using the angle difference value of the motor, as in the above example of the contact event detection unit 530C; however, combining the time difference of the torque detection value with the angle difference value is not strictly necessary.
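A minimal sketch of the 530D-style condition for a single shoulder axis, assuming per-frame sampling; the limit values are placeholders:

```python
def detect_arm_contact(torque_now, torque_prev, angle_diff,
                       torque_step_limit=0.15, angle_diff_limit=3.0):
    """Report a contact event in the arm when the frame-to-frame torque difference
    (time t minus time t-1) and the angle difference both exceed their limits."""
    torque_step = abs(torque_now - torque_prev)
    return torque_step > torque_step_limit and abs(angle_diff) > angle_diff_limit
```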
- The contact event detection unit 530E integrates the contact event detection results from the contact event detection units 530A to 530D. Specifically, for example, the contact event detection unit 530E may determine that a contact event has occurred in the head 101 when both the contact event detection units 530A and 530B detect a contact event in the head 101. Similarly, it may determine that a contact event has occurred in the body 102 when all of the contact event detection units 530A, 530B, and 530C detect a contact event in the body 102. For the arms 103 and 104, the contact event detection unit 530E may use the determination result of the contact event detection unit 530D as it is.
- The contact event detection unit 530E may determine the occurrence of a contact event by the logical product (AND determination) of the individual detection results as described above, or, in another example, by the logical sum (OR determination) of the individual detection results.
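The integration by the contact event detection unit 530E could be sketched as follows; the per-detector booleans and the mode switch are assumptions made for illustration:

```python
def integrate_detections(head_530a, head_530b, body_530a, body_530b, body_530c,
                         arm_530d, mode="and"):
    """Combine per-detector results into final head/body/arm contact flags.
    mode="and" requires agreement of the listed detectors (logical product);
    mode="or" accepts any single detection (logical sum). Arm results from 530D
    are used as they are."""
    combine = all if mode == "and" else any
    return {
        "head": combine([head_530a, head_530b]),
        "body": combine([body_530a, body_530b, body_530c]),
        "arm": arm_530d,
    }
```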
- the activeness determination unit 540 may determine that the contact event is active when the contact event is detected in the motion execution portion of the robot 100.
- The motion execution portion may be specified, for example, based on the motion pattern of the joints determined by the information processing apparatus 110, or based on the fluctuation of the torque measurement values of the joints before and after the contact event is detected. For example, when the torque measurement value of a joint remains within the threshold range but fluctuates relatively more than that of other joints before and after the detection of the contact event, the activity determination unit 540 may specify the portion connected by that joint as the motion execution portion. Further, as described below, the activity determination unit 540 may specify the motion execution portion based on the positional relationship between the portions of the housing at the time the contact event is detected.
- In the present embodiment, the activity determination unit 540 specifies the motion execution portion based on the positional relationship between the hand portions 103H and 104H, which are the tip portions of the arm portions 103 and 104, and the body portion 102 at the time a contact event occurs. In the illustrated example, when the position of either hand portion 103H or 104H is in front of the body portion 102 and satisfies a predetermined positional condition with respect to the shoulder joint connecting the body portion 102 and the arm portions 103 and 104, as shown in FIGS. 11A and 11B, the arm portion 103 or 104 that includes the hand portion satisfying the condition is specified as the motion execution portion.
- the dimensions shown in FIGS. 11A and 11B are examples when a small robot 100 is used, and the dimensions and their ratios are arbitrarily set according to the size and shape of the robot 100.
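A rough sketch of such a positional check; the coordinate convention (x pointing forward from the body), the reference plane, and the numeric offset are assumptions for illustration only, since the exact geometric condition depends on the dimensions in FIGS. 11A and 11B:

```python
def identify_motion_execution_arms(hand_positions, shoulder_positions,
                                   body_front_x=0.0, reach_offset=0.05):
    """Return the arms ('left'/'right') whose hands are in front of the body and
    extended forward of the shoulder joint by more than reach_offset (in metres),
    treating those arms as the motion execution portion."""
    execution_arms = []
    for arm, hand in hand_positions.items():          # e.g. {"left": (x, y, z), ...}
        shoulder = shoulder_positions[arm]
        in_front_of_body = hand[0] > body_front_x     # x axis assumed to point forward
        extended_forward = (hand[0] - shoulder[0]) > reach_offset
        if in_front_of_body and extended_forward:
            execution_arms.append(arm)
    return execution_arms
```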
- FIG. 12 is a diagram schematically showing a process for detecting an active contact event in the present embodiment.
- the contact event is detected based on the torque measurement values (Left_Shoulder_Yaw, Left_Elbow_Pitch, Right_Shoulder_Yaw, Right_Elbow_Pitch) of the motors driving the shoulder joints and elbow joints of the arms 103 and 104.
- The contact event may be detected by simply comparing the torque measurement value with the threshold range, as in the example described above with reference to FIG. 5, by comparing a component of the torque measurement value below a predetermined frequency or its time difference with the threshold range, as also described above with reference to FIG. 5, or based on the ratio of a plurality of torque measurement values or on the angle difference value.
- When a contact event is detected, the activity of the contact event is determined. Specifically, for example, the positions of the hand portions 103H and 104H are calculated from the angle measurement values at the respective joints, and it is determined whether the positional relationship between the body portion 102 and the hand portions 103H and 104H described above with reference to FIGS. 11A and 11B is satisfied. In addition, in order to prevent one contact event from being detected multiple times, a predetermined invalid period is set after an active contact event is detected, and no further active contact event is detected until the invalid period has elapsed. When a contact event is detected, the positions of the hands 103H and 104H satisfy the condition, and the invalid period has already elapsed, the activity determination unit 540 determines that the contact event is active.
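A schematic sketch of this activity determination, including the invalid (refractory) period; the period length is a placeholder and the time source is assumed to be seconds from an arbitrary origin:

```python
class ActivityDeterminer:
    """Judge a detected contact event as active only when the hand-position
    condition holds and the invalid period since the last active event has elapsed."""

    def __init__(self, invalid_period_s: float = 1.0):
        self.invalid_period_s = invalid_period_s
        self.last_active_time = float("-inf")

    def is_active(self, now_s: float, contact_detected: bool,
                  hand_condition_met: bool) -> bool:
        if not (contact_detected and hand_condition_met):
            return False
        if now_s - self.last_active_time < self.invalid_period_s:
            return False          # still within the invalid period; suppress re-detection
        self.last_active_time = now_s
        return True
```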
- FIG. 13 is a flowchart showing an example of the processing of the score addition unit in one embodiment of the present invention.
- In the illustrated example, the score addition unit 550 acquires the detection results of the contact events generated in each of the robots 100A and 100B (step S101), and acquires the activity determination result for each of the contact events (step S102).
- When the robots 100A and 100B detect contact events in synchronization (YES in step S103), one of the synchronously detected contact events is active (YES in step S104), and the other contact event is not active (YES in step S105), the score addition unit 550 adds a score to whichever of the robots 100A and 100B detected the active contact event (step S106).
- On the other hand, when contact events are not detected synchronously by the robots 100A and 100B (NO in step S103), when neither of the synchronously detected contact events is active (NO in step S104), or when both of the synchronously detected contact events are active (NO in step S105), the score addition unit 550 does not add a score to either robot 100A or 100B.
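The decision of steps S103 to S106 can be summarized in a short sketch; the event representation (None for no detection, a flag for activity) is an assumption made for illustration:

```python
def score_delta(event_a, event_b):
    """Return (points for robot 100A, points for robot 100B) for one pair of
    synchronously acquired detection results. event_a / event_b are None when no
    contact event was detected on that robot, otherwise e.g. {"active": True}."""
    if event_a is None or event_b is None:            # NO in step S103
        return 0, 0
    if event_a["active"] and not event_b["active"]:   # 100A hit 100B
        return 1, 0
    if event_b["active"] and not event_a["active"]:   # 100B hit 100A
        return 0, 1
    return 0, 0                                       # both active or both passive
```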
- The contact events detected in synchronization with each other do not necessarily have to be contact events detected at exactly the same time; contact events whose detection results are acquired at times within a predetermined allowable time difference of each other may also be treated as synchronous.
- the allowable time difference is determined in consideration of, for example, processing delay and communication delay.
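A one-function sketch of this synchrony test; the 0.2 s allowance is a placeholder chosen only to illustrate absorbing processing and communication delays:

```python
def are_synchronous(time_a_s: float, time_b_s: float, allowed_lag_s: float = 0.2) -> bool:
    """Treat two contact events as synchronous when the times at which their
    detection results were acquired differ by no more than the allowed lag."""
    return abs(time_a_s - time_b_s) <= allowed_lag_s
```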
- The score addition unit 550 adds scores for each of the robots 100A and 100B as described above, and determines the outcome of the fighting game based on the scores within a predetermined match time. Alternatively, the score addition unit 550 may subtract a score from the side on which the active contact event was not detected in the above processing, or the game may be ended at the point when the score of either robot 100A or 100B reaches an upper or lower threshold value.
- a contact event generated in the housing portion is detected based on a measured value indicating a change in the motion state of the housing portion of the robots 100A and 100B.
- the activeness of the detected contact event is determined based on the relationship with the operation performing portion.
- a score is added to the robots 100A and 100B according to the synchrony of the contact events detected by the robots 100A and 100B and the activity of each contact event, thereby determining the outcome of the fighting game.
- In the above example, the torque measurement value of the motor that drives a joint of the housing is used as the measurement value indicating a change in the motion state of the housing portion, but it is also possible to use other measured values, acquired for example with an acceleration sensor.
- the contact event does not necessarily have to be detected in the housing of the robot, and the contact event may be detected in another device capable of actively operating.
- 10 ... system, 100, 100A, 100B ... robot, 101, 101A, 101B ... head, 102, 102A, 102B ... body, 103, 103A, 103B, 104, 104A, 104B ... arm, 103H, 104H ... hand, 105, 105A, 105B, 106, 106A, 106B ... leg, 105F, 106F ... foot, 110 ... information processing device, 111 ... CPU, 112 ... RAM, 113 ... ROM, 114 ... external memory, 115 ... bus interface, 121 ... communication interface, 130 ... motor, 200A, 200B ... controller, 300 ... determination device, 310 ... information processing device, 311 ... CPU, 312 ... RAM, 313 ... ROM, 314 ... external memory, 315 ... bus interface, 321 ... communication interface, 322 ... display, 510 ... torque measurement value acquisition unit, 520 ... angle difference value acquisition unit, 530, 530A, 530B, 530C, 530D, 530E ... contact event detection unit, 540 ... activity determination unit, 550 ... score addition unit, 551 ... contact event detection result acquisition unit, 552 ... activity determination result acquisition unit.
Abstract
Description
FIG. 1 is a diagram schematically showing a configuration example of a system according to an embodiment of the present invention. In the present embodiment, the system 10 provides a fighting game played by robots 100A and 100B. The system 10 further includes controllers 200A and 200B and a determination device 300.
FIG. 5 is a diagram showing an example of a specific functional configuration for detecting a contact event in one embodiment of the present invention. In the illustrated example, the torque measurement value acquisition unit 510 acquires the torque measurement values in three directions of the motor that drives the joint connecting the head 101 and the body 102 (Head_Roll, Head_Pitch, Head_Yaw), the torque measurement values in three directions of the motor that drives the joint connecting the two portions within the body 102 (Trunk_Roll, Trunk_Pitch, Trunk_Yaw), and the torque measurement values in two directions of each of the motors that drive the shoulder joints connecting the body 102 and the arms 103 and 104 (Left_Shoulder_Roll, Left_Shoulder_Pitch, Right_Shoulder_Roll, Right_Shoulder_Pitch). Meanwhile, the angle difference value acquisition unit 520 acquires the angle difference value of the motor that drives the joint connecting the two portions within the body 102 (Trunk_Yaw) and the angle difference values of the motors that drive the shoulder joints (Left_Shoulder_Roll, Left_Shoulder_Pitch, Right_Shoulder_Roll, Right_Shoulder_Pitch). In FIG. 5, these directions may be abbreviated as R: Roll, P: Pitch, Y: Yaw.
The contact event detection unit 530A determines whether each of the torque measurement values of the head 101 (Head_Roll, Head_Pitch, Head_Yaw) and the torque measurement value of the body 102 (Trunk_Roll) exceeds its threshold range. Examples of comparisons between torque measurement values and threshold ranges are shown in FIGS. 6A and 6B. As shown in FIG. 6A, for the motors that drive each part of the robot 100, the torque measurement value during normal operation does not exceed the threshold range R. As described above, the threshold range R is determined, for example, by collecting actual measurements of the torque values during normal operation of the robot 100. In the illustrated example, the threshold range R is set symmetrically about 0, but it may also be a range biased toward either positive or negative.
Referring again to FIG. 5, when the torque measurement value of the body 102 (Trunk_Pitch) exceeds the threshold range, the contact event detection unit 530B detects a contact event in the head 101 if the ratio of the torque measurement value of the head 101 (Head_Pitch) to the torque measurement value of the body 102 (Trunk_Pitch), that is, Head_Pitch/Trunk_Pitch, exceeds a threshold value, and detects a contact event in the body 102 if the ratio does not exceed the threshold value. By such a determination, as described below, it is possible to correctly identify whether the contact event has occurred in the head 101 or in the body 102.
Referring again to FIG. 5, the contact event detection unit 530C detects a contact event occurring in the body 102 when the torque measurement value (Trunk_Yaw) of the motor that drives the joint connecting the two portions within the body 102 exceeds the threshold range and the angle difference value of that motor also exceeds its threshold range. As described below, the torque measurement value fluctuates relatively greatly depending on the posture of the robot 100, whereas the angle difference value depends little on the posture, so using the angle difference value improves the accuracy of contact event detection.
Referring again to FIG. 5, the contact event detection unit 530D detects a contact event occurring in the respective arm 103 or 104 when the time difference of the torque measurement values of the motors that drive the shoulder joints (Left_Shoulder_Roll, Left_Shoulder_Pitch, Right_Shoulder_Roll, Right_Shoulder_Pitch) exceeds the threshold range and the angle difference value of each motor also exceeds its threshold range. Specifically, instead of the torque detection value at a single time t, the contact event detection unit 530D compares the time difference, that is, the difference between the torque detection values at time t and time t-1, with the threshold range. In parts such as the arms 103 and 104, the difference between the fluctuation of the torque detection value during normal operation and the fluctuation when a contact event occurs is small, so detection accuracy is difficult to improve if the torque detection value is simply compared with a threshold range. In contrast, the fluctuation of the torque detection value per unit time when a contact event occurs is larger than during normal operation, so using the time difference allows the contact event to be detected correctly. Note that the contact event detection unit 530D enables more accurate event detection by also using the angle difference value of the motor, as in the above example of the contact event detection unit 530C, in addition to the time difference of the torque detection value; however, combining the time difference of the torque detection value with the angle difference value is not strictly necessary.
As described above, the activity determination unit 540 may determine that a contact event is active when the contact event is detected in the motion execution portion of the robot 100. The motion execution portion may be specified, for example, based on the motion pattern of the joints determined by the information processing device 110, or based on the fluctuation of the torque measurement values of the joints before and after the contact event is detected. For example, when the torque measurement value of a joint remains within the threshold range but fluctuates relatively more than that of other joints before and after the detection of the contact event, the activity determination unit 540 may specify the portion connected by that joint as the motion execution portion. Further, as described below, the activity determination unit 540 may specify the motion execution portion based on the positional relationship between the portions of the housing at the time the contact event is detected.
Claims (6)
- An information processing device comprising: a contact event detection result acquisition unit that acquires a detection result of a first contact event occurring in a housing portion of a first robot and a detection result of a second contact event occurring in a housing portion of a second robot; an activity determination result acquisition unit that acquires an activity determination result for each of the first and second contact events; and a score addition unit that adds a score to the first robot when the first contact event is active and the second contact event detected in synchronization with the first contact event is not active.
- The information processing device according to claim 1, wherein the second contact event detected in synchronization with the first contact event includes a second contact event whose detection result is acquired at a time within a predetermined time difference from the time at which the detection result of the first contact event is acquired.
- The information processing device according to claim 1 or claim 2, wherein the score addition unit does not add the score when there is no second contact event detected in synchronization with the first contact event, and when the first and second contact events detected in synchronization with each other are either both active or both not active.
- A system comprising the information processing device according to any one of claims 1 to 3 and the first and second robots, wherein each of the first and second robots comprises: a contact event detection unit that detects a contact event occurring in a housing portion; and an activity determination unit that determines that the contact event is active when the contact event is detected in an operation execution portion included in the housing portion, and wherein the detection results of the first and second contact events and the activity determination results for the first and second contact events are transmitted from the first and second robots to the information processing device.
- An information processing method comprising: a step of acquiring a detection result of a first contact event occurring in a housing portion of a first robot and a detection result of a second contact event occurring in a housing portion of a second robot; a step of acquiring an activity determination result for each of the first and second contact events; and a step of adding a score to the first robot when the first contact event is active and the second contact event detected in synchronization with the first contact event is not active.
- A program for causing a computer to function as an information processing device comprising: a contact event detection result acquisition unit that acquires a detection result of a first contact event occurring in a housing portion of a first robot and a detection result of a second contact event occurring in a housing portion of a second robot; an activity determination result acquisition unit that acquires an activity determination result for each of the first and second contact events; and a score addition unit that adds a score to the first robot when the first contact event is active and the second contact event detected in synchronization with the first contact event is not active.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/045387 WO2022123618A1 (ja) | 2020-12-07 | 2020-12-07 | Information processing device, system, information processing method, and program |
US18/255,147 US20240001251A1 (en) | 2020-12-07 | 2020-12-07 | Information processing device, system, information processing method, and program |
EP20965006.8A EP4257222A4 (en) | 2020-12-07 | 2020-12-07 | INFORMATION PROCESSING DEVICE, SYSTEM, INFORMATION PROCESSING METHOD, AND PROGRAM |
JP2022567723A JP7456002B2 (ja) | 2020-12-07 | 2020-12-07 | 情報処理装置、システム、情報処理方法およびプログラム |
CN202080107554.5A CN116490252A (zh) | 2020-12-07 | 2020-12-07 | 信息处理设备、系统、信息处理方法和程序 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/045387 WO2022123618A1 (ja) | 2020-12-07 | 2020-12-07 | Information processing device, system, information processing method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022123618A1 true WO2022123618A1 (ja) | 2022-06-16 |
Family
ID=81974298
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/045387 WO2022123618A1 (ja) | 2020-12-07 | 2020-12-07 | Information processing device, system, information processing method, and program |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240001251A1 (ja) |
EP (1) | EP4257222A4 (ja) |
JP (1) | JP7456002B2 (ja) |
CN (1) | CN116490252A (ja) |
WO (1) | WO2022123618A1 (ja) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7285703B2 (ja) * | 2019-06-17 | 2023-06-02 | Sony Interactive Entertainment Inc. | Robot control system |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009000286A (ja) * | 2007-06-21 | 2009-01-08 | Taito Corp | Game system and remotely operable game robot |
-
2020
- 2020-12-07 WO PCT/JP2020/045387 patent/WO2022123618A1/ja active Application Filing
- 2020-12-07 EP EP20965006.8A patent/EP4257222A4/en active Pending
- 2020-12-07 CN CN202080107554.5A patent/CN116490252A/zh active Pending
- 2020-12-07 JP JP2022567723A patent/JP7456002B2/ja active Active
- 2020-12-07 US US18/255,147 patent/US20240001251A1/en active Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH048484A (ja) * | 1990-04-26 | 1992-01-13 | Hitachi Ltd | Cooperative operation control method and control device for a plurality of robots |
JP2001287184A (ja) * | 2000-04-07 | 2001-10-16 | Moode:Kk | Autonomous and semi-autonomous action device |
JP2003181152A (ja) * | 2001-12-14 | 2003-07-02 | Takara Co Ltd | Robot toy |
JP2005342873A (ja) | 2004-06-07 | 2005-12-15 | Sony Corp | Robot device and operation control method thereof |
JP2007301004A (ja) * | 2006-05-09 | 2007-11-22 | Mechatracks Kk | Control device, competitive robot system, and robot device |
KR101336802B1 (ko) * | 2012-07-26 | 2013-12-03 | Minirobot Co., Ltd. | Humanoid robot performing boxing |
Non-Patent Citations (1)
Title |
---|
See also references of EP4257222A4 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022123618A1 (ja) | 2022-06-16 |
EP4257222A1 (en) | 2023-10-11 |
JP7456002B2 (ja) | 2024-03-26 |
CN116490252A (zh) | 2023-07-25 |
US20240001251A1 (en) | 2024-01-04 |
EP4257222A4 (en) | 2024-08-21 |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20965006; Country of ref document: EP; Kind code of ref document: A1
 | WWE | Wipo information: entry into national phase | Ref document number: 202080107554.5; Country of ref document: CN
 | WWE | Wipo information: entry into national phase | Ref document number: 18255147; Country of ref document: US
 | ENP | Entry into the national phase | Ref document number: 2022567723; Country of ref document: JP; Kind code of ref document: A
 | NENP | Non-entry into the national phase | Ref country code: DE
 | ENP | Entry into the national phase | Ref document number: 2020965006; Country of ref document: EP; Effective date: 20230707