WO2017104199A1 - Robot, robot control method, and program
- Publication number
- WO2017104199A1 (application PCT/JP2016/077358)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- robot
- input
- setting
- movable
- Prior art date
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
  - B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    - B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
      - B25J9/00—Programme-controlled manipulators
        - B25J9/16—Programme controls
          - B25J9/1602—Programme controls characterised by the control system, structure, architecture
          - B25J9/1612—Programme controls characterised by the hand, wrist, grip control
          - B25J9/1628—Programme controls characterised by the control loop
            - B25J9/1633—Programme controls characterised by the control loop compliant, force, torque control, e.g. combined with position control
          - B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
      - B25J13/00—Controls for manipulators
        - B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
          - B25J13/081—Touching devices, e.g. pressure-sensitive
          - B25J13/088—Controls for manipulators by means of sensing devices with position, velocity or acceleration sensors
            - B25J13/089—Determining the position of the robot with reference to its environment
- G—PHYSICS
  - G05—CONTROLLING; REGULATING
    - G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
      - G05B2219/00—Program-control systems
        - G05B2219/30—Nc systems
          - G05B2219/40—Robotics, robotics mapping to robotics vision
            - G05B2219/40387—Modify without repeating teaching operation
            - G05B2219/40391—Human to robot skill transfer
            - G05B2219/40411—Robot assists human in non-industrial environment like home or office
Definitions
- The present invention relates to a robot whose parameter settings can be changed, a method for controlling the robot, and a program for causing a computer to function as the robot.
- Patent Document 1 discloses a robot phone that allows people to communicate by synchronizing the shape, movement, position, and the like of a plurality of robots placed at distant locations.
- With the robot phones of Patent Document 1, when the user shakes the hand of one robot phone, the hand of the other robot phone is shaken as well, so a robot capable of communicating through gestures can be realized.
- Patent Document 1: Japanese Patent Laid-Open No. 2003-305670 (published October 28, 2003)
- However, the user's operation on a robot phone of Patent Document 1 serves only to operate the other robot phone. For the user to set a predetermined parameter, such as the volume, on his or her own robot phone, some input unit for that setting (for example, an input key, a button, or a touch panel) must be provided on the robot phone in advance.
- An object of the present invention is to provide a robot, a robot control method, and a program that allow a user to set parameters without the robot having a dedicated input unit for parameter setting.
- To solve the above problem, a robot according to one aspect of the present invention includes a movable unit, a drive unit that drives the movable unit, a position information acquisition unit that acquires position information regarding the position of the movable unit, and a setting unit that sets the value of a predetermined parameter to a value corresponding to the position information acquired by the position information acquisition unit.
- The robot according to one aspect of the present invention thus has the effect that the user can set parameters even though the robot has no dedicated input unit for parameter setting.
- FIG. 1 is a block diagram illustrating a configuration of a robot according to a first embodiment.
- FIG. 2: (a) is a block diagram showing the external configuration of the robot according to the first embodiment, and (b) is a diagram showing the skeleton of the robot shown in (a).
- FIG. 3: (a) is a table showing an example of the voice table, and (b) is a diagram defining the angle of the right shoulder pitch when user input is received via the right arm of the robot.
- FIG. 4: (a) and (b) are tables showing examples of the input posture table.
- FIG. 5: (a) is a flowchart showing the flow of the parameter setting process by the robot control method of the first embodiment, and (b) to (d) are diagrams showing the posture of the robot at points in the flowchart shown in (a).
- FIG. 6: (a) is a flowchart showing the flow of the parameter setting process by the robot control method of Embodiment 2, (b), (d), and (f) are front views of the robot showing the angle of the right shoulder pitch corresponding to the current set value, and (c), (e), and (g) are side views of the robot shown in (b), (d), and (f), respectively.
- FIG. 7: (a) is a flowchart showing the flow of the parameter setting process by the robot control method of Embodiment 3, (b) is a front view of the robot showing a state in which the current value of a setting item is reflected in a joint other than the input joint, (c) is a side view of the robot shown in (b), (d) is a front view showing a state in which the robot shown in (b) is being operated by the user, and (e) is a side view of the robot shown in (d).
- Embodiment 1. Embodiment 1 according to the present invention will be described below with reference to FIGS. 1 to 5.
- FIG. 2A is a block diagram showing an external configuration of the robot 1 which is a humanoid robot according to the present embodiment.
- As shown in this figure, the robot 1 includes a head 2 (movable part), a trunk 3, a right arm 4 (movable part), a left arm 5 (movable part, second movable part), a right leg 6 (movable part), and a left leg 7 (movable part).
- FIG. 2A shows an external appearance when the robot 1 is viewed from the front.
- The head 2 is provided with a voice input unit 20 (microphone), LEDs (Light Emitting Diodes) 22, and a speaker 23. The LEDs 22 are provided around both eyes of the robot 1, and the voice input unit 20 and the LEDs 22 are provided in left-right pairs, corresponding to the ears and eyes of the robot.
- The right arm 4 includes an upper right arm 41, a right forearm 42, and a right hand 43, arranged in this order from one end (base side) to the other end (tip side) of the right arm 4. The base end of the right arm 4 is connected to the location corresponding to the right shoulder of the trunk 3.
- The left arm 5 likewise includes a left upper arm 51, a left forearm 52, and a left hand 53, arranged in this order from the base side to the tip side. The base end of the left arm 5 is connected to the location corresponding to the left shoulder of the trunk 3.
- The right leg 6 is composed of a right thigh 61 and a right foot 62. The base end of the right thigh 61 is connected to the location corresponding to the waist of the trunk 3, and the right foot 62 is connected to the tip end of the right thigh 61.
- The left leg 7 is composed of a left thigh 71 and a left foot 72. The base end of the left thigh 71 is connected to the location corresponding to the waist of the trunk 3, and the left foot 72 is connected to the tip end of the left thigh 71.
- FIG. 2B is a diagram illustrating a skeleton configuration of the robot 1 according to the present embodiment.
- As shown in this figure, the robot 1 further includes, as a plurality of drive units 40 (see FIG. 1) that individually drive the movable parts, a neck roll 11a, a neck pitch 11b, a neck yaw 11c, a right shoulder pitch 12, a left shoulder pitch 13, a right elbow roll 14, a left elbow roll 15, a right hip pitch 16, a left hip pitch 17, a right ankle pitch 18b, a right ankle roll 18a, a left ankle pitch 19b, and a left ankle roll 19a.
- In the present embodiment, the neck roll 11a through the left ankle roll 19a are all servomotors. The term "neck roll 11a" indicates that this servomotor rotates the corresponding movable part in the roll direction; the same naming applies to the other members such as the neck pitch 11b.
- By issuing instructions to each drive unit 40, the control unit 10 (see FIG. 1), described later, performs control such as rotating the drive unit 40 to a specified angle or turning its torque on and off. The robot 1 can thereby perform operations such as changing its posture or walking. In the following, those drive units 40 whose angle can be adjusted are referred to in particular as joints.
- The torque of a drive unit 40 being ON means that force (driving force) can be transmitted from the drive unit 40 to the movable part; the torque being OFF means that transmission of force from the drive unit 40 to the movable part is stopped.
- The neck roll 11a, the neck pitch 11b, and the neck yaw 11c are arranged at the location corresponding to the neck of the robot 1. By controlling these, the control unit 10 can control the movement of the head 2.
- The right shoulder pitch 12 is arranged at the location corresponding to the right shoulder of the robot 1. By controlling it, the control unit 10 can control the movement of the entire right arm 4.
- The left shoulder pitch 13 is arranged at the location corresponding to the left shoulder of the robot 1. By controlling it, the control unit 10 can control the movement of the entire left arm 5.
- The right elbow roll 14 is arranged at the location corresponding to the right elbow of the robot 1. By controlling it, the control unit 10 can control the movement of the right forearm 42 and the right hand 43.
- The left elbow roll 15 is arranged at the location corresponding to the left elbow of the robot 1. By controlling it, the control unit 10 can control the movement of the left forearm 52 and the left hand 53.
- The right hip pitch 16 is arranged at the location corresponding to the right hip of the robot 1. By controlling it, the control unit 10 can control the movement of the entire right leg 6.
- The left hip pitch 17 is arranged at the location corresponding to the left hip of the robot 1. By controlling it, the control unit 10 can control the movement of the entire left leg 7.
- The right ankle pitch 18b and the right ankle roll 18a are arranged at the location corresponding to the right ankle of the robot 1. By controlling these, the control unit 10 can control the movement of the right foot 62.
- The left ankle pitch 19b and the left ankle roll 19a are arranged at the location corresponding to the left ankle of the robot 1. By controlling these, the control unit 10 can control the movement of the left foot 72.
- Each drive unit 40 notifies the control unit 10 of its status, such as its angle, at predetermined intervals. This status notification is performed even while the torque of the servomotor is OFF, so operation of a movable part by the user can be detected. By receiving the status notifications, the control unit 10 can recognize the angle of each servomotor.
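- The periodic status notification described above can be pictured as a simple polling loop. The following sketch is illustrative only and is not part of the disclosure; `Servo`, `set_torque`, and `poll_status` are hypothetical names invented for the example.

```python
import time

class Servo:
    """Hypothetical servo wrapper: the angle can still be read even while
    the torque is off, mirroring the behaviour of the drive units 40."""
    def __init__(self, name, angle=0.0):
        self.name = name
        self.angle = angle        # degrees
        self.torque_on = True

    def set_torque(self, on):
        # Torque off: the motor stops driving, but angle readout continues.
        self.torque_on = on

    def read_angle(self):
        return self.angle

def poll_status(servos, interval_s=0.05, cycles=3):
    """Poll each drive unit at a fixed interval, like the periodic status
    notification from each drive unit 40 to the control unit 10."""
    for _ in range(cycles):
        for s in servos:
            state = "on" if s.torque_on else "off"
            print(f"{s.name}: {s.read_angle():.1f} deg (torque {state})")
        time.sleep(interval_s)

right_shoulder = Servo("right_shoulder_pitch")
right_shoulder.set_torque(False)   # the user can now move the arm by hand
poll_status([right_shoulder])
```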
- FIG. 1 is a block diagram showing the configuration of the robot 1. As illustrated in FIG. 1, the robot 1 includes a control unit 10, a voice input unit 20 (setting instruction acquisition unit), a storage unit 30, and a drive unit 40.
- The drive units 40 are as described above with reference to FIG. 2 (b).
- The control unit 10 controls the operations and processing of the robot 1 in an integrated manner. The specific configuration of the control unit 10 will be described later.
- The voice input unit 20 (setting instruction acquisition unit) acquires voice from the user. In the present embodiment, the voice input unit 20 is a microphone.
- The storage unit 30 is a storage medium that stores various types of information used by the control unit 10 for processing; specific examples include a hard disk and a flash memory. The storage unit 30 stores a voice table 31, an input posture table 32, and the like, which will be described later.
- The control unit 10 includes a voice recognition unit 101, a voice determination unit 102, an input posture specifying unit 103, a drive control unit 104 (stop unit), an acquisition unit 105 (position information acquisition unit), an input determination unit 106, and a setting unit 107.
- The voice recognition unit 101 recognizes the voice input to the voice input unit 20.
- The voice determination unit 102 determines whether the voice recognized by the voice recognition unit 101 is a predetermined voice included in the voice table 31 of the storage unit 30.
- The input posture specifying unit 103 refers to the input posture table 32 and specifies the input posture and the input joint of the robot 1. The input posture is the posture the robot 1 takes to receive input from the user, and an input joint is a joint used for the input from the user.
- The drive control unit 104 controls the drive units 40 so that the robot 1 takes the input posture specified by the input posture specifying unit 103.
- The acquisition unit 105 acquires position information regarding the position of a movable part after the movable part has been operated by the user. In the present embodiment, the acquisition unit 105 acquires, as the position information, angle information regarding the angle of the movable part.
- The input determination unit 106 determines the value input by the user based on the position information.
- The setting unit 107 sets the value of a predetermined parameter to a value corresponding to the input value. Examples of the predetermined parameter include the volume of the sound output from the speaker 23 and the brightness of the LEDs 22. The parameter value set here need not be the position information or the input value itself; it may be a different value calculated (converted) from either of them.
- In the present embodiment, the control unit 10 is a CPU. A program for causing the control unit 10 to function as each of the above units, for example the acquisition unit 105 and the setting unit 107, is stored in the storage unit 30. In other words, the robot 1 contains a computer made up of the control unit 10 and the storage unit 30.
- FIG. 3 (a) is a table showing an example of the voice table 31. As shown in this table, the voice table 31 is a data table showing the correspondence between voices recognized by the voice recognition unit 101 and functions executed by the robot 1.
- For example, when the voice recognition unit 101 recognizes the voice "volume change", the voice determination unit 102 determines that the voice starts a change of the volume setting value. When the voice recognition unit 101 recognizes the voice "brightness change", the voice determination unit 102 determines that the voice starts a change of the brightness setting value of the LEDs 22. When the voice recognition unit 101 recognizes the voice "I'm done", the voice determination unit 102 determines that the voice notifies the robot 1 of the end of the setting value change.
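- The role of the voice table 31 can be sketched as a plain lookup, as below. This is a minimal illustration under the assumption that the table is a dictionary; the phrase strings and function names are placeholders modelled on FIG. 3 (a), not disclosed identifiers.

```python
# Hypothetical rendering of the voice table 31: recognized phrases mapped
# to the functions of FIG. 3 (a). Keys and values are placeholders.
VOICE_TABLE = {
    "volume change":     "start_volume_setting_change",
    "brightness change": "start_led_brightness_setting_change",
    "I'm done":          "end_setting_change",
}

def judge_voice(recognized_text):
    """Mimics the voice determination unit 102: return the function
    associated with the phrase, or None if it is not a setting
    instruction."""
    return VOICE_TABLE.get(recognized_text)

assert judge_voice("volume change") == "start_volume_setting_change"
assert judge_voice("Good morning") is None   # not a setting instruction
```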
- FIG. 4 (a) is a table showing an example of the input posture table 32. As shown here, the input posture table 32 may be a data table indicating the correspondence between posture information and input joints. The posture information indicates the rotation angle of the servomotor in each drive unit 40, used to control the drive units 40 so that the robot 1 assumes the input posture. In the example of FIG. 4 (a), the posture information is set so that a sitting posture becomes the input posture, and the right shoulder pitch 12 is set as the input joint.
- The drive units 40 controlled when shifting to the input posture need not be all of the drive units 40; it suffices to control at least one of the plurality of drive units 40 according to the type of input posture. That is, when the posture of the robot changes to the input posture, at least one of the plurality of movable parts is driven to a position corresponding to the input posture.
- FIG. 4 (b) is a table showing another example of the input posture table 32. As shown here, the input posture table 32 may be a data table indicating the correspondence between the type of set value to be changed, the posture information, and the input joint. In this example, the input posture and the input joint differ depending on the type of set value: when changing the volume setting, the robot 1 takes a sitting posture and uses the right shoulder pitch 12 as the input joint, whereas when changing the brightness setting of the LEDs 22, the robot 1 takes an upright posture and uses the left shoulder pitch 13 as the input joint.
- In FIG. 4 (b), an input posture and an input joint are associated with each type of set value. Alternatively, the input posture may be common to all types of set value, with only the input joint differing by type, or the input joint may be shared, with only the input posture differing.
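- The input posture table 32 of FIG. 4 (b) can likewise be sketched as a small data structure keyed by the type of set value. The joint names are taken from the text, but every angle below is an illustrative placeholder; the embodiment does not disclose numeric posture information.

```python
# Hypothetical rendering of the input posture table 32 in the style of
# FIG. 4 (b). All angle values are placeholders.
INPUT_POSTURE_TABLE = {
    "volume": {
        "posture": {"right_hip_pitch": 90.0, "left_hip_pitch": 90.0},  # sitting
        "input_joint": "right_shoulder_pitch",
    },
    "led_brightness": {
        "posture": {"right_hip_pitch": 0.0, "left_hip_pitch": 0.0},    # upright
        "input_joint": "left_shoulder_pitch",
    },
}

def specify_input_posture(setting_type):
    """Mimics the input posture specifying unit 103: look up the posture
    information and the input joint for the requested set-value type."""
    entry = INPUT_POSTURE_TABLE[setting_type]
    return entry["posture"], entry["input_joint"]

posture, joint = specify_input_posture("volume")   # sitting, right shoulder
```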
- FIG. 5A is a flowchart showing a flow of parameter setting processing by the control method of the robot 1 of the present embodiment.
- In the initial state, the robot 1 waits for the input of a setting instruction (S11). Here, a setting instruction is a voice associated with a setting value change in the voice table 31; in the example shown in FIG. 3 (a), voices No. 1 to 4 correspond to setting instructions.
- When voice is input to the voice input unit 20, the voice determination unit 102 determines whether the voice is a setting instruction. If it is not a setting instruction (NO in S11), the robot 1 waits until there is another input to the voice input unit 20.
- If the voice is a setting instruction (YES in S11), the input posture specifying unit 103 refers to the input posture table 32 and specifies the input posture and the input joint. The drive control unit 104 then drives the drive units 40 to shift the robot 1 to the input posture (S12) and turns off the torque of the input joint (S13).
- When the user operates the movable part, the acquisition unit 105 acquires the position information notified from the input joint (position information acquisition step). The acquisition unit 105 grasps the posture of the robot 1 from the position information and notifies the input determination unit 106 of the position information of the input joint. The position information may be the position itself or the amount of change in position. The input determination unit 106 determines the value input by the user from the position information, and the setting unit 107 sets the parameter based on the input value (setting step) (S14).
- While the voice ending the setting value change has not been input to the voice input unit 20 (NO in S15), the robot 1 remains in the input posture with the torque of the input joint turned off.
- When the ending voice is input (YES in S15), the input posture specifying unit 103 instructs the drive control unit 104 to cancel the input posture, and the drive control unit 104 turns on the torque of the input joint and cancels the input posture (S16).
- The posture of the robot 1 after the input posture is cancelled may be a predetermined posture set in advance, or the posture at the time the setting instruction was acquired in step S11. Alternatively, the posture may be left unchanged from step S15 and only the torque of the input joint turned back on.
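- Steps S11 to S16 can be summarized in code form as follows. This is a non-authoritative sketch of the control flow only; `robot` is a hypothetical object whose methods (`wait_for_voice`, `move_to`, `set_torque`, `read_angle`, `set_parameter`) stand in for the units described above.

```python
def parameter_setting_flow(robot, setting_type):
    """Illustrative S11-S16 loop; all robot methods are assumed helpers."""
    # S11: wait until the voice matching this setting instruction arrives.
    while robot.wait_for_voice() != setting_type:
        pass
    posture, joint = specify_input_posture(setting_type)  # earlier sketch
    robot.move_to(posture)             # S12: shift to the input posture
    robot.set_torque(joint, False)     # S13: the joint becomes user-movable
    while True:
        angle = robot.read_angle(joint)            # position information
        robot.set_parameter(setting_type, angle)   # S14: update the value
        if robot.wait_for_voice(timeout=0.1) == "end_setting_change":
            break                                  # S15: "I'm done"
    robot.set_torque(joint, True)      # S16: restore torque and
    robot.move_to("initial_posture")   #      cancel the input posture
```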
- The above flow is illustrated with a concrete example. Suppose the user says "volume change" to the robot 1. The voice recognition unit 101 recognizes the voice, and the voice determination unit 102 refers to the voice table 31 to determine whether the voice is a setting instruction. In the voice table 31, the voice "volume change" is associated with the function "change the volume setting value", so the voice determination unit 102 determines that this voice is a setting instruction (YES in S11). On the other hand, no setting-value-changing function corresponds to a voice such as "Good morning", so the voice determination unit 102 would determine that such a voice is not a setting instruction (NO in S11).
- Next, step S12 will be described, assuming that the input posture table shown in FIG. 4 (b) is used as the input posture table 32. The input posture specifying unit 103 specifies the sitting posture as the input posture according to the posture information in the "volume" row of the input posture table 32, and specifies the right shoulder pitch 12 as the input joint. The drive control unit 104 controls the drive units 40 so that the robot 1 takes the input posture (S12), then turns off the torque of the right shoulder pitch 12, the input joint, making it ready to receive input (S13).
- When the user operates the right arm 4, the acquisition unit 105 acquires the user's input, that is, the angle information of the right arm 4, via the right shoulder pitch 12. The input determination unit 106 obtains the change in angle of the right shoulder pitch 12 from the angle information and determines the value input by the user, and the setting unit 107 changes the volume setting value according to the input value (S14).
- When the user finishes the operation and says "I'm done", the voice recognition unit 101 recognizes the voice and the voice determination unit 102 judges it. In the voice table 31, the voice "I'm done" is associated with "end of setting value change", so the voice determination unit 102 determines that the change is to be ended (YES in S15). The input posture specifying unit 103 then notifies the drive control unit 104 to cancel the input posture of the robot 1, and the drive control unit 104 controls the drive units 40 to cancel the input posture (S16). The robot 1 of the present embodiment is set to return to the posture of step S11 when the input posture is cancelled.
- The setting unit 107 may change the volume setting value each time the user operates the right arm 4 in step S14 (each time the angle of the right arm 4 changes), or only after it is determined in step S15 that the user's operation has ended. In the latter case, the volume is changed (set) just once, according to the angle of the right arm 4 fixed by the user immediately before the end of the operation.
- FIGS. 5 (b) to (d) are views showing the posture of the robot 1 during the flowchart shown in FIG. 5 (a). In step S11, the robot 1 is in an arbitrary posture, for example the upright posture shown in FIG. 5 (b). In step S12, the robot 1 takes the input posture, for example the sitting posture shown in FIG. 5 (c). In step S14, the angle of the right arm 4 of the robot 1 changes as shown in FIG. 5 (d). Thereafter, when the setting change ends in step S15 and the input posture is cancelled in step S16, the robot returns to the upright posture shown in FIG. 5 (b).
- The operation of the input determination unit 106 will now be described in more detail. When the user operates a movable part of the robot 1 in order to change a parameter setting, the amount of operation of the movable part by the user must be associated with the amount of change of the set value. The input determination unit 106 performs this association.
- FIG. 3 (b) is a diagram defining the angle of the right shoulder pitch 12 when the right shoulder pitch 12 of the robot 1 is the input joint. The angle of the right shoulder pitch 12 is 0° when the right arm 4 hangs perpendicular to the plane on which the robot 1 sits, 90° when the right arm 4 extends horizontally in front of the robot 1, 180° when the right arm 4 is raised perpendicular to that plane, and 270° when the right arm 4 extends horizontally behind the robot 1.
- For example, suppose the volume of the speaker 23 can be set in 10 steps from "0" to "9". The input determination unit 106 divides the movable range of the right shoulder pitch 12 into the 10 volume steps and associates an angle range with each step: letting θ be the angle of the right shoulder pitch 12, the volume is "0" if 0° ≤ θ < 36°, "1" if 36° ≤ θ < 72°, and so on up to "9" if 324° ≤ θ < 360°.
- There are also cases where the movable range of the right shoulder pitch 12 serving as the input joint should be limited. Even if the right shoulder pitch 12 can rotate through 360° in the normal state, it is not always appropriate to accept the user's input over the full 360°. For example, when the angle of the right shoulder pitch 12 is near 0° as shown in FIG. 3 (b), the right arm 4 may contact the ground, and for a user operating the right arm 4 from in front of the robot 1, moving the arm behind the robot 1 (near 270° in FIG. 3 (b)) is more awkward than moving it near 90°.
- For this reason, the movable range of the right shoulder pitch 12 may differ between the normal (autonomous) state and the state in which user input is received. For example, the movable range of the right shoulder pitch 12 may be 0° to 360° in the autonomous state and 30° to 150° in the input-receiving state. In that case, the input determination unit 106 divides the 120° range from 30° to 150° into the 10 volume steps.
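- The association performed by the input determination unit 106 amounts to quantizing the movable range, which might be sketched as below. The function name and the clamping are assumptions for illustration; only the 36° and 12° step widths follow from the examples above.

```python
def angle_to_step(theta, lo=30.0, hi=150.0, steps=10):
    """Map a joint angle in [lo, hi) onto one of `steps` discrete set
    values, as the input determination unit 106 does for the right
    shoulder pitch 12. With lo=30 and hi=150 each volume step spans
    12 degrees; with lo=0 and hi=360 each step spans 36 degrees."""
    theta = min(max(theta, lo), hi - 1e-9)   # clamp to the movable range
    width = (hi - lo) / steps
    return int((theta - lo) // width)

assert angle_to_step(30.0)  == 0   # 30-42 deg   -> volume "0"
assert angle_to_step(95.0)  == 5   # 90-102 deg  -> volume "5"
assert angle_to_step(149.0) == 9   # 138-150 deg -> volume "9"
assert angle_to_step(36.0, lo=0.0, hi=360.0) == 1   # full-range variant
```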
- The robot 1 may also use a plurality of joints as input joints. This is particularly effective when the set value has many steps. For example, suppose the volume of the speaker 23 can be set in 100 steps from "0" to "99". In this case, both the right shoulder pitch 12 and the left shoulder pitch 13 are used as joints that receive user input, with the ones digit set via the left shoulder pitch 13 and the tens digit via the right shoulder pitch 12. The movable range of each joint may be 0° to 360°, 30° to 150°, or some other range.
- For example, when the left shoulder pitch 13 is set to "3" and the right shoulder pitch 12 is set to "4", the volume is set to "43"; to set the maximum volume "99", both the left shoulder pitch 13 and the right shoulder pitch 12 are set to "9".
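- Combining two input joints into a two-digit value can be sketched as follows, reusing `angle_to_step()` from the sketch above. The digit-to-joint assignment follows the text; everything else is an illustrative assumption.

```python
def two_joint_value(right_shoulder_angle, left_shoulder_angle):
    """Tens digit from the right shoulder pitch 12, ones digit from the
    left shoulder pitch 13, giving a volume from 0 to 99."""
    tens = angle_to_step(right_shoulder_angle)
    ones = angle_to_step(left_shoulder_angle)
    return tens * 10 + ones

# Right shoulder at a "4" position, left shoulder at a "3" position -> 43.
assert two_joint_value(right_shoulder_angle=85.0, left_shoulder_angle=70.0) == 43
```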
- To give the user feedback during input, the setting unit 107 may instruct the drive control unit 104 to temporarily turn on the torque of the joint receiving the input at the timing when the set value switches, so that the joint stops briefly. Alternatively, the set value may be announced from the speaker 23 at that timing. For example, at the timing when the set value switches to "2", the joint may be stopped, a voice such as "setting value 2" may be output from the speaker 23, or both may be done; likewise, at the timing when the set value switches to "1", the joint may be stopped, a voice such as "setting value 1" may be output, or both.
- To show the user the range over which input is accepted, the drive control unit 104 may, after shifting the robot 1 to the input posture and before turning off the torque of the input joint, drive the joint through that angle range. For example, when input from the user is accepted within the 30° to 150° range of the right shoulder pitch 12, the drive control unit 104 may, after shifting the robot 1 to the input posture, move the right shoulder pitch 12 from 30° to 150° and then set it to its initial position in the input posture.
- As described above, with the robot 1 of the present embodiment, the user can set parameters even though the robot has no dedicated input unit for parameter setting. The robot 1 can therefore be a robot whose design is not impaired by such an input unit.
- The robot 1 of the present embodiment may be an animal-type or insect-type robot as well as the humanoid robot described above, as long as it has joints. Furthermore, any other robot can serve as the robot 1 of the present embodiment as long as it has a part whose angle can be adjusted; for example, a plant-type robot whose flowers, stems, or branches can change angle can also be the robot 1 of this embodiment.
- The robot 1 may include a display unit such as an LCD (Liquid Crystal Display) or an input unit, as long as their size and shape do not impair the design of the robot 1. Examples of the input unit include input keys or buttons, or a touch panel integrated with the display unit.
- In the above description, the setting value change of the robot 1 is started and ended based on the voice input to the voice input unit 20 in steps S11 and S15. However, the setting value change may be started and ended by means other than voice; for example, if the robot 1 includes an input unit, the change may be started and ended by an input to that input unit. In that case as well, the setting value itself is changed by operating the movable part of the robot 1 as described above.
- Embodiment 2. Embodiment 2 according to the present invention will be described below with reference to FIG. 6. Members common to Embodiment 1 described above are given the same reference numerals, and detailed description thereof is omitted.
- FIG. 6A is a flowchart showing the flow of parameter setting processing by the control method of the robot 1 of this embodiment.
- FIGS. 6 (b), (d), and (f) are front views of the robot 1 showing the angle of the right shoulder pitch 12 corresponding to the current set value, and FIGS. 6 (c), (e), and (g) are side views of the robot 1 shown in (b), (d), and (f), respectively.
- The flowchart shown in FIG. 6 (a) differs from the flowchart shown in FIG. 5 (a) in that step S21 is included between steps S12 and S13.
- In step S21, the drive control unit 104 controls the input joint so that the current set value of the setting item is reflected in the movable part corresponding to the input joint of the robot 1 (S21). The current set value is stored, for example, in the storage unit 30.
- For example, suppose the volume can be set in 11 steps from 0 to 10 and the movable range of the right shoulder pitch 12 is 0° to 180°. When the current volume is "0", the right arm 4 hangs straight down as shown in FIGS. 6 (b) and (c). When the volume is "10", the right arm 4 is raised straight up as shown in FIGS. 6 (f) and (g). When the volume is "5", the drive control unit 104 controls the right shoulder pitch 12 so that the right arm 4 extends straight forward as viewed from the robot 1 (FIGS. 6 (d) and (e)).
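- Reflecting the current set value in the input joint (step S21) is the inverse of the quantization sketched earlier. The linear mapping below is an assumption consistent with the 0°, 90°, and 180° examples above, not a disclosed formula.

```python
def step_to_angle(value, max_value=10, lo=0.0, hi=180.0):
    """Inverse mapping for step S21: drive the input joint to the angle
    representing the current set value. Volume 0 -> 0 deg (arm down),
    volume 5 -> 90 deg (arm straight forward), volume 10 -> 180 deg
    (arm straight up)."""
    return lo + (hi - lo) * value / max_value

assert step_to_angle(0)  == 0.0
assert step_to_angle(5)  == 90.0
assert step_to_angle(10) == 180.0
```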
- Embodiment 3. Embodiment 3 according to the present invention will be described below with reference to FIG. 7. Members common to Embodiments 1 and 2 described above are given the same reference numerals, and detailed description thereof is omitted.
- FIG. 7 (a) is a flowchart showing the flow of parameter setting processing by the control method of the robot 1 of the present embodiment. FIG. 7 (b) is a front view of the robot 1 showing a state in which the current value of the setting item is reflected in a joint other than the input joint, and FIG. 7 (c) is a side view of the robot 1 shown in (b). FIG. 7 (d) is a front view showing a state in which the robot 1 shown in (b) is being operated by the user, and FIG. 7 (e) is a side view of the robot 1 shown in (d).
- The flowchart shown in FIG. 7 (a) differs from the flowchart shown in FIG. 5 (a) in that step S31 is included between steps S12 and S13. In step S31, the drive control unit 104 reflects the current set value of the setting item in the second movable part (S31). The second movable part is an arbitrary movable part other than the movable part corresponding to the input joint. The second movable part may always be the same, or a second movable part corresponding to each type of set value may be defined in the input posture table 32.
- In the example of FIG. 7, the drive control unit 104 controls the left shoulder pitch 13, which is the second drive unit, so that the left arm 5 projects horizontally in front of the robot 1, as shown in FIGS. 7 (b) to (e). At this time, the right arm 4 receiving the input may be in a lowered state as shown in FIGS. 7 (b) and (c), or the current set value may also be reflected in it, as shown in FIGS. 7 (d) and (e).
- In this way, the user can change the setting value while checking the value set before the change, which makes the input operation for changing the setting easier.
- An application example will now be described in which the robot 1 of the present embodiment is a robot phone equipped with an LCD. When input is required, the drive control unit 104 shifts the robot 1 to the input posture. The input posture varies with the number of values whose input is required: when adjusting the volume or the LCD brightness, a single value is required, so the robot 1 shifts to a one-value input posture (for example, the sitting posture shown in FIG. 5 (c)); when the robot is used like a joystick for a game, two values are required, so the robot 1 shifts to a two-value input posture (for example, the upright posture shown in FIG. 5 (b)).
- At this time, the angle of the joint receiving the input (the position of the movable part) may be moved in accordance with the current set value of the item whose input is required. When the input is finished, the robot 1 is notified of the end by voice or by a touch panel superimposed on the LCD, and on receiving the notification the robot 1 returns to its initial posture.
- Two examples of changing the set values of a plurality of setting items at the same time are described below. One example is to provide, in the voice table 31, a setting instruction that sets a plurality of setting items simultaneously. For example, in addition to the correspondences between voices and functions shown in FIG. 3 (a), the correspondence "voice: 'batch setting change', function: volume / LED brightness" is stored. In this case, when the voice recognition unit 101 recognizes the voice "batch setting change", the acquisition unit 105 acquires the amount of angle change of each of the input joint corresponding to the volume and the input joint corresponding to the LED brightness, and the parameter settings are changed accordingly.
- Another example of changing a plurality of set values at the same time is accepting a plurality of setting instructions in an overlapping manner. For example, if the voice "brightness change" is acquired after the setting instruction "volume change" but before the volume change is completed, the volume and the LED brightness may be changed simultaneously. In either example, the input joint corresponding to the volume and the input joint corresponding to the LED brightness differ from each other; for example, the input joint corresponding to the volume may be the right shoulder pitch 12 and the input joint corresponding to the LED brightness the left shoulder pitch 13.
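- A batch setting change of this kind amounts to reading several input joints in one session, one per parameter. The sketch below assumes the joint assignment named above and reuses `angle_to_step()` from the earlier sketch; `read_angle` is a hypothetical callable.

```python
# Hypothetical joint assignment for the batch setting change.
BATCH_JOINTS = {
    "volume":         "right_shoulder_pitch",
    "led_brightness": "left_shoulder_pitch",
}

def apply_batch_change(read_angle):
    """read_angle maps a joint name to its current angle; each angle is
    quantized into a set value with angle_to_step()."""
    return {param: angle_to_step(read_angle(joint))
            for param, joint in BATCH_JOINTS.items()}

angles = {"right_shoulder_pitch": 95.0, "left_shoulder_pitch": 40.0}
assert apply_batch_change(angles.get) == {"volume": 5, "led_brightness": 0}
```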
- The control blocks of the robot 1 (in particular the control unit 10) may be realized by logic circuits (hardware) formed in an integrated circuit (IC chip) or the like, or by software using a CPU (Central Processing Unit).
- In the latter case, the robot 1 includes a CPU that executes the instructions of a program, which is software realizing each function; a ROM (Read Only Memory) or storage device (referred to as a "recording medium") in which the program and various data are recorded so as to be readable by a computer (or CPU); a RAM (Random Access Memory) into which the program is loaded; and the like. The object of the present invention is achieved by the computer (or CPU) reading the program from the recording medium and executing it. As the recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used. The program may also be supplied to the computer via any transmission medium capable of transmitting it (such as a communication network or a broadcast wave). The present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
- A robot (1) according to aspect 1 of the present invention includes a movable part (the right arm 4, the left arm 5, and the like), a drive unit (40) that drives the movable part, a position information acquisition unit (105) that acquires position information regarding the position of the movable part, and a setting unit (107) that sets the value of a predetermined parameter to a value corresponding to the position information acquired by the position information acquisition unit.
- Here, the movable part is a part of the robot driven by the drive unit, such as an arm or a leg. According to the above configuration, the user can set parameters without the robot having a dedicated input unit for parameter setting.
- A robot according to aspect 2 of the present invention, in aspect 1 above, further includes a stop unit that stops transmission of force from the drive unit to the movable part, and the position information acquisition unit acquires the position information when the movable part is operated while the stop unit has stopped the transmission of force from the drive unit to the movable part.
- According to the above configuration, since transmission of force from the corresponding drive unit to the movable part is stopped when the movable part is operated, the user can operate the movable part reliably.
- A robot according to aspect 3 of the present invention is characterized in that, in aspect 1 or 2 above, the position information acquisition unit acquires, as the position information, angle information regarding the angle of the movable part.
- According to the above configuration, the user can input parameter setting values to the robot by operating the movable part so as to change its angle.
- A robot according to aspect 4 of the present invention, in any one of aspects 1 to 3 above, further includes a plurality of different movable parts including the movable part; a plurality of drive units, including the drive unit, that individually drive the plurality of movable parts; a setting instruction acquisition unit (voice input unit 20) that acquires a setting instruction for the parameter; and a drive control unit that, when the setting instruction acquisition unit acquires the setting instruction, controls at least one of the plurality of drive units so as to drive at least one of the plurality of movable parts to a position corresponding to the posture of the robot for receiving an operation on the movable part.
- According to the above configuration, by visually recognizing the change in the robot's posture after giving the parameter setting instruction, the user can know that the robot has shifted to a state in which parameter setting values can be input by operating the movable part. The user can therefore start operating the movable part for parameter setting at an appropriate timing.
- A robot according to aspect 5 of the present invention, in any one of aspects 1 to 3 above, further includes a setting instruction acquisition unit that acquires a setting instruction for the parameter, and a drive control unit that, when the setting instruction acquisition unit acquires the setting instruction, controls the drive unit so as to drive the movable part to a position corresponding to the current value of the parameter.
- According to the above configuration, the user can know the current value of the parameter to be set by visually recognizing the position of the movable part after giving the parameter setting instruction to the robot. The user can then operate the movable part with reference to its current position, and can therefore easily operate it to input a desired set value.
- A robot according to aspect 6 of the present invention, in any one of aspects 1 to 3 above, further includes a second movable part (left arm 5) different from the movable part; a second drive unit (left shoulder pitch 13), different from the drive unit, that drives the second movable part; a setting instruction acquisition unit that acquires a setting instruction for the parameter; and a drive control unit that, when the setting instruction acquisition unit acquires the setting instruction, controls the second drive unit so as to drive the second movable part to a position corresponding to the current value of the parameter.
- According to the above configuration, the user can know the current value of the parameter to be set by visually recognizing the position of the second movable part after giving the parameter setting instruction to the robot. The user can then operate the movable part for parameter setting while grasping the current value of the parameter from the position of the second movable part, and can therefore easily operate the movable part to input a desired set value.
- A robot control method according to aspect 7 of the present invention is a method of controlling a robot that includes a movable part and a drive unit that drives the movable part, the method including a position information acquisition step of acquiring position information regarding the position of the movable part, and a setting step of setting the value of a predetermined parameter to a value corresponding to the position information acquired in the position information acquisition step.
- The robot according to each aspect of the present invention may be realized by a computer. In that case, a program that realizes the robot on the computer by causing the computer to operate as each unit (software element) of the robot, and a computer-readable recording medium on which the program is recorded, also fall within the scope of the present invention.
Description
Embodiment 1 according to the present invention will be described below with reference to FIGS. 1 to 5.
FIG. 2 (a) is a block diagram showing the external configuration of the robot 1, a humanoid robot according to the present embodiment. As shown in this figure, the robot 1 includes a head 2 (movable part), a trunk 3, a right arm 4 (movable part), a left arm 5 (movable part, second movable part), a right leg 6 (movable part), and a left leg 7 (movable part). FIG. 2 (a) shows the appearance of the robot 1 as viewed from the front.
FIG. 2 (b) is a diagram showing the skeleton configuration of the robot 1 according to the present embodiment. As shown in this figure, in addition to the members shown in FIG. 1, the robot 1 further includes, as a plurality of drive units 40 (see FIG. 1) that individually drive the movable parts, a neck roll 11a, a neck pitch 11b, a neck yaw 11c, a right shoulder pitch 12, a left shoulder pitch 13, a right elbow roll 14, a left elbow roll 15, a right hip pitch 16, a left hip pitch 17, a right ankle pitch 18b, a right ankle roll 18a, a left ankle pitch 19b, and a left ankle roll 19a. In the present embodiment, the neck roll 11a through the left ankle roll 19a are all servomotors. The term "neck roll 11a" indicates that this servomotor rotates the corresponding movable part in the roll direction; the same applies to the other members such as the neck pitch 11b.
FIG. 1 is a block diagram showing the configuration of the robot 1. As shown in this figure, the robot 1 includes a control unit 10, a voice input unit 20 (setting instruction acquisition unit), a storage unit 30, and drive units 40. The drive units 40 are as described above with reference to FIG. 2 (b).
The control unit 10 includes a voice recognition unit 101, a voice determination unit 102, an input posture specifying unit 103, a drive control unit 104 (stop unit), an acquisition unit 105 (position information acquisition unit), an input determination unit 106, and a setting unit 107.
FIG. 3 (a) is a table showing an example of the voice table 31. As shown in this table, the voice table 31 is a data table showing the correspondence between voices recognized by the voice recognition unit 101 and functions executed by the robot 1. For example, when the voice recognition unit 101 recognizes the voice "volume change" ("音量変更" or "ボリューム変更"), the voice determination unit 102 determines that the voice starts a change of the volume setting value. When the voice recognition unit 101 recognizes the voice "brightness change" ("明るさ変更" or "輝度変更"), the voice determination unit 102 determines that the voice starts a change of the brightness setting value of the LEDs 22. Further, when the voice recognition unit 101 recognizes the voice "I'm done" ("終わったよ"), the voice determination unit 102 determines that the voice notifies the end of the setting value change of the robot 1.
FIG. 5 (a) is a flowchart showing the flow of parameter setting processing by the control method of the robot 1 of the present embodiment. In the initial state, the robot 1 waits for the input of a setting instruction (S11). Here, a setting instruction is a voice associated with a setting value change in the voice table 31; in the example shown in FIG. 3 (a), voices No. 1 to 4 correspond to setting instructions.
FIGS. 5 (b) to (d) are views showing the posture of the robot 1 during the flowchart shown in FIG. 5 (a). In step S11, the robot 1 is in an arbitrary posture, for example the upright posture shown in FIG. 5 (b). As described above, in step S12 the robot 1 takes the input posture, for example the sitting posture shown in FIG. 5 (c).
The operation of the input determination unit 106 will be described in more detail. When the user operates a movable part of the robot 1 in order to change a parameter setting of the robot 1, the amount of operation of the movable part by the user must be associated with the amount of change of the set value. The input determination unit 106 performs this association.
Embodiment 2 according to the present invention will be described below with reference to FIG. 6. Members common to Embodiment 1 described above are given the same reference numerals, and detailed description thereof is omitted.
Embodiment 3 according to the present invention will be described below with reference to FIG. 7. Members common to Embodiment 1 or 2 described above are given the same reference numerals, and detailed description thereof is omitted.
Two examples of changing the set values of a plurality of setting items at the same time are described below. One example is to provide, in the voice table 31, a setting instruction that sets a plurality of setting items simultaneously. For example, in addition to the correspondences between voices and functions shown in FIG. 3 (a), the correspondence "voice: 'batch setting change' ('一括設定変更'), function: volume / LED brightness" is stored. In this case, when the voice recognition unit 101 recognizes the voice "batch setting change", the acquisition unit 105 acquires the amount of angle change of each of the input joint corresponding to the volume and the input joint corresponding to the LED brightness, and the parameter settings are changed.
With a setting item such as "volume" or "brightness of the LEDs 22", one setting item has one set value. A case where one setting item has a plurality of set values is described below. For example, consider the case where the setting item "colour of the LEDs 22" has three set values: the intensity of R (red), the intensity of G (green), and the intensity of B (blue). In this case, the R intensity can be set with the right shoulder pitch 12, the G intensity with the left shoulder pitch 13, and the B intensity with the right hip pitch 16. In this way, for a setting item having a plurality of set values, each set value can easily be changed by assigning a different joint as the input joint for each set value.
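A setting item with several set values, such as the LED colour above, extends the same pattern to three joints. The sketch below follows the R/G/B joint assignment described in the text and reuses angle_to_step() from the earlier sketch; the 256-step channel depth is an illustrative assumption, not a disclosed value.

```python
# Joint assignment for the LED-colour example: one joint per channel.
COLOR_JOINTS = {
    "r": "right_shoulder_pitch",
    "g": "left_shoulder_pitch",
    "b": "right_hip_pitch",
}

def read_led_color(read_angle, steps=256):
    """Quantize each joint's angle into a 0-255 channel intensity using
    angle_to_step() from the earlier sketch."""
    return {ch: angle_to_step(read_angle(joint), steps=steps)
            for ch, joint in COLOR_JOINTS.items()}
```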
The control blocks of the robot 1 (in particular the control unit 10) may be realized by logic circuits (hardware) formed in an integrated circuit (IC chip) or the like, or by software using a CPU (Central Processing Unit). In the latter case, the robot 1 includes a CPU that executes the instructions of a program, which is software realizing each function; a ROM (Read Only Memory) or storage device (referred to as a "recording medium") in which the program and various data are recorded so as to be readable by a computer (or CPU); a RAM (Random Access Memory) into which the program is loaded; and the like. The object of the present invention is achieved by the computer (or CPU) reading the program from the recording medium and executing it. As the recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used. The program may also be supplied to the computer via any transmission medium capable of transmitting it (such as a communication network or a broadcast wave). The present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
A robot (1) according to aspect 1 of the present invention includes a movable part (the right arm 4, the left arm 5, and the like), a drive unit (40) that drives the movable part, a position information acquisition unit (105) that acquires position information regarding the position of the movable part, and a setting unit (107) that sets the value of a predetermined parameter to a value corresponding to the position information acquired by the position information acquisition unit.
Claims (8)
1. A robot comprising: a movable part; a drive unit that drives the movable part; a position information acquisition unit that acquires position information regarding the position of the movable part; and a setting unit that sets the value of a predetermined parameter to a value corresponding to the position information acquired by the position information acquisition unit.
2. The robot according to claim 1, further comprising a stop unit that stops transmission of force from the drive unit to the movable part, wherein the position information acquisition unit acquires the position information when the movable part is operated while the stop unit has stopped the transmission of the force from the drive unit to the movable part.
3. The robot according to claim 1, wherein the position information acquisition unit acquires, as the position information, angle information regarding the angle of the movable part.
4. The robot according to any one of claims 1 to 3, further comprising: a plurality of different movable parts including the movable part; a plurality of drive units, including the drive unit, that individually drive the plurality of movable parts; a setting instruction acquisition unit that acquires a setting instruction for the parameter; and a drive control unit that, when the setting instruction acquisition unit acquires the setting instruction, controls at least one of the plurality of drive units so as to drive at least one of the plurality of movable parts to a position corresponding to a posture of the robot for receiving an operation on the movable part.
5. The robot according to any one of claims 1 to 3, further comprising: a setting instruction acquisition unit that acquires a setting instruction for the parameter; and a drive control unit that, when the setting instruction acquisition unit acquires the setting instruction, controls the drive unit so as to drive the movable part to a position corresponding to the current value of the parameter.
6. The robot according to any one of claims 1 to 3, further comprising: a second movable part different from the movable part; a second drive unit, different from the drive unit, that drives the second movable part; a setting instruction acquisition unit that acquires a setting instruction for the parameter; and a drive control unit that, when the setting instruction acquisition unit acquires the setting instruction, controls the second drive unit so as to drive the second movable part to a position corresponding to the current value of the parameter.
7. A method of controlling a robot that includes a movable part and a drive unit that drives the movable part, the method comprising: a position information acquisition step of acquiring position information regarding the position of the movable part; and a setting step of setting the value of a predetermined parameter to a value corresponding to the position information acquired in the position information acquisition step.
8. A program for causing a computer to function as the robot according to any one of claims 1 to 6, the program causing the computer to function as the position information acquisition unit and the setting unit.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201680061701.3A CN108472813A (zh) | 2015-12-18 | 2016-09-16 | Robot, robot control method, and program |
JP2017556360A JPWO2017104199A1 (ja) | 2015-12-18 | 2016-09-16 | Robot, robot control method, and program |
US15/766,784 US20180319017A1 (en) | 2015-12-18 | 2016-09-16 | Robot and control method for robot |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015248040 | 2015-12-18 | ||
JP2015-248040 | 2015-12-18 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017104199A1 (ja) | 2017-06-22 |
Family
ID=59056531
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/077358 WO2017104199A1 (ja) | Robot, robot control method, and program | 2015-12-18 | 2016-09-16 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20180319017A1 (en) |
JP (1) | JPWO2017104199A1 (ja) |
CN (1) | CN108472813A (ja) |
WO (1) | WO2017104199A1 (ja) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113119120B (zh) * | 2021-03-30 | 2022-06-07 | Shenzhen UBTECH Robotics Co., Ltd. | Robot control method and apparatus, and robot |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06190753A (ja) * | 1992-12-25 | 1994-07-12 | Fujitsu Ltd | Robot control device |
JP2014213399A (ja) * | 2013-04-24 | 2014-11-17 | Yokohama National University | Robot teaching method and teaching system |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0122347B1 (en) * | 1983-02-15 | 1988-04-20 | Graham S. Hawkes | Audio feedback for remotely controlled manipulators |
US5243513A (en) * | 1991-04-23 | 1993-09-07 | Peters John M | Automation control with improved operator/system interface |
US5954621A (en) * | 1993-07-09 | 1999-09-21 | Kinetecs, Inc. | Exercise apparatus and technique |
US6028593A (en) * | 1995-12-01 | 2000-02-22 | Immersion Corporation | Method and apparatus for providing simulated physical interactions within computer generated environments |
JP2000254360A (ja) * | 1999-03-11 | 2000-09-19 | Toybox KK | Interactive toy |
US6244429B1 (en) * | 1999-05-04 | 2001-06-12 | Kalish Canada Inc. | Automatic adjustable guide rails |
JP4022478B2 (ja) * | 2002-02-13 | 2007-12-19 | Todai TLO, Ltd. (University of Tokyo TLO) | Robot phone |
WO2003068461A1 (fr) * | 2002-02-13 | 2003-08-21 | Toudai Tlo, Ltd. | Robot-telephone |
US7252299B2 (en) * | 2005-02-15 | 2007-08-07 | Marine Travelift, Inc. | Steering system for crane |
US8190292B2 (en) * | 2005-08-29 | 2012-05-29 | The Board Of Trustees Of The Leland Stanford Junior University | High frequency feedback in telerobotics |
WO2010069430A1 (de) * | 2008-12-17 | 2010-06-24 | Kuka Roboter Gmbh | Method for traversing a predetermined path by a manipulator, and control device for carrying out such a method |
WO2011161765A1 (ja) * | 2010-06-22 | 2011-12-29 | Toshiba Corporation | Robot control device |
JP2013071239A (ja) * | 2011-09-29 | 2013-04-22 | Panasonic Corp | Control device and control method for robot arm, robot, control program for robot arm, and integrated electronic circuit |
DE102012202181A1 (de) * | 2012-02-14 | 2013-08-29 | Kuka Roboter Gmbh | Method for determining a torque, and industrial robot |
JP5948932B2 (ja) * | 2012-02-16 | 2016-07-06 | Seiko Epson Corporation | Robot control device, robot control method, robot control program, and robot system |
WO2013158689A2 (en) * | 2012-04-16 | 2013-10-24 | Cornell University | Digitally controlled musical instrument |
JP5910491B2 (ja) * | 2012-12-28 | 2016-04-27 | Toyota Motor Corporation | Robot arm teaching system and robot arm teaching method |
JP5946859B2 (ja) * | 2014-04-14 | 2016-07-06 | Fanuc Corporation | Robot control device and robot system for a robot that moves in response to force |
- 2016-09-16: WO application PCT/JP2016/077358 filed as WO2017104199A1 (active, Application Filing)
- 2016-09-16: CN application 201680061701.3A filed as CN108472813A (active, Pending)
- 2016-09-16: JP application 2017556360A filed as JPWO2017104199A1 (active, Pending)
- 2016-09-16: US application 15/766,784 filed as US20180319017A1 (not active, Abandoned)
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06190753A (ja) * | 1992-12-25 | 1994-07-12 | Fujitsu Ltd | Robot control device |
JP2014213399A (ja) * | 2013-04-24 | 2014-11-17 | Yokohama National University | Robot teaching method and teaching system |
Also Published As
Publication number | Publication date |
---|---|
JPWO2017104199A1 (ja) | 2018-04-12 |
CN108472813A (zh) | 2018-08-31 |
US20180319017A1 (en) | 2018-11-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9939913B2 (en) | Smart home control using modular sensing device | |
EP1669172B1 (en) | Communication robot control system | |
CN110832439A (zh) | 发光用户输入设备 | |
JP2019069268A5 (ja) | ||
US20190389075A1 (en) | Robot system and robot dialogue method | |
US11104005B2 (en) | Controller for end portion control of multi-degree-of-freedom robot, method for controlling multi-degree-of-freedom robot by using controller, and robot operated thereby | |
JP6113897B1 (ja) | 仮想空間を提供する方法、仮想体験を提供する方法、プログラム、および記録媒体 | |
CN112154047A (zh) | 远程操作系统、信息处理方法以及程序 | |
EP4325335A1 (en) | Multi-stage gestures detected based on neuromuscular-signal sensors of a wearable device to activate user-interface interactions with low-false positive rates, and systems and methods of use thereof | |
WO2017104199A1 (ja) | ロボット、ロボットの制御方法、およびプログラム | |
JPWO2021059359A1 (ja) | アニメーション制作システム | |
JP6610609B2 (ja) | 音声対話ロボットおよび音声対話システム | |
CN105955488A (zh) | 一种操控终端的方法和装置 | |
JP7470345B2 (ja) | アニメーション制作システム | |
CN108392831A (zh) | 一种舞台演绎系统及方法 | |
JP2022153477A (ja) | アニメーション制作システム | |
JP2015050657A (ja) | 制御装置、制御装置の制御方法、制御システム、および、制御プログラム | |
JP2018032413A (ja) | 仮想空間を提供する方法、仮想体験を提供する方法、プログラム、および記録媒体 | |
JP2019020836A (ja) | 情報処理方法、装置、および当該情報処理方法をコンピュータに実行させるためのプログラム | |
KR101402908B1 (ko) | 행위기반 로봇 제어장치 및 그 제어방법 | |
JP3194056U (ja) | ロボット | |
JP6356494B2 (ja) | ロボット | |
WO2024095329A1 (ja) | 操作装置、情報処理方法およびコンピュータプログラム | |
WO2024095328A1 (ja) | 操作装置、情報処理システム、情報処理方法およびコンピュータプログラム | |
US20240192766A1 (en) | Controlling locomotion within an artificial-reality application using hand gestures, and methods and systems of use thereof |
Legal Events
- 121 (EP designated): the EPO has been informed by WIPO that EP was designated in this application. Ref document number: 16875185; country of ref document: EP; kind code of ref document: A1.
- ENP (entry into the national phase): ref document number: 2017556360; country of ref document: JP; kind code of ref document: A.
- WWE (WIPO information, entry into national phase): ref document number: 15766784; country of ref document: US.
- NENP (non-entry into the national phase): ref country code: DE.
- 122 (PCT application non-entry in European phase): ref document number: 16875185; country of ref document: EP; kind code of ref document: A1.