WO2002065825A2 - Robot and method for controlling the operation of said robot - Google Patents
Robot and method for controlling the operation of said robot
- Publication number
- WO2002065825A2 (PCT/IB2002/000544)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- external force
- robot device
- control signal
- detecting
- unit
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/1633—Programme controls characterised by the control loop compliant, force, torque control, e.g. combined with position control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
Definitions
- The present invention relates to a robot device, an operation control method for the robot device, an external force detection device, an external force detection method, a program for controlling the operation of the robot device, and a recording medium on which the program is recorded. More specifically, it relates to a robot device that autonomously determines and expresses an action according to the surrounding situation or an external force, a method of controlling the operation of such a robot device, a program for controlling that operation, and a recording medium on which this program is recorded.
- BACKGROUND ART In recent years, a four-legged walking robot device that acts in response to commands from a user or to the surrounding environment has been developed and marketed by the present applicant.
- Such a robot device is equipped with a CCD (Charge Coupled Device) camera and a microphone, and determines the surrounding situation and the presence or absence of commands from the user based on the surrounding conditions captured by the CCD camera and on command sounds from the user and ambient sounds collected by the microphone, and autonomously decides on and expresses an action based on the result of this determination.
- Such a robot device is usually used in an ordinary home, where there are many obstacles such as door sills and cords, so some ingenuity is needed so that the device does not easily fall over while walking even in such an environment. For example, a method is used in which an obstacle is detected based on the image signal output from the CCD camera, and the robot device controls its own action so as to avoid the obstacle based on the detection result.
- As another method of obstacle detection, for example, a method has been considered in which a force sensor is specially installed on each leg of the robot device and a collision with an obstacle is detected based on its output.
- Further, such a robot device can be provided with a plurality of sensors at predetermined positions so that it not only detects obstacles but also performs processing corresponding to a pressed sensor, for example expressing a predetermined operation (such as sitting down). Therefore, by providing more sensors, the user can enjoy more advanced interaction with the robot device.
- However, the positions at which contact can be detected are then necessarily limited, so that interaction with the user is limited.
- Also, if an advanced sensor that detects the direction in which a force is applied is used, the configuration of the robot device becomes complicated, which leads to an increase in manufacturing cost and weight.
- Therefore, an object of the present invention is to solve the problems of the conventional robot device and to provide an external force detection device and method capable of reliably detecting a collision with an obstacle with a simple configuration and of detecting information on an external force, such as its direction, without using a dedicated sensor, as well as a robot device to which the external force detection device is applied, an operation control method for the robot device, a program for the robot device, and a recording medium on which the program is recorded.
- To achieve this object, the robot device according to the present invention includes an operation unit whose operation is controlled by a control signal, a control unit that outputs the control signal so as to execute a predetermined operation, and external force detecting means that detects an external force based on a change in the control signal when the external force is applied to the operation unit.
- The operation control method of the robot device includes a control step of controlling the operation of an operation unit of the robot device by a control signal, and an external force detection step of detecting an external force applied to the robot device based on a change in the control signal when the external force is applied.
- The external force detection device includes operation means that drives each unit of the robot device, control means that gives the operation means an instruction value of the operation amount, operation amount detection means that detects the actual value of the operation amount of the operation means, and external force detecting means that determines the presence or absence of an external force based on the instruction value given to the operation means by the control means and the actual value of the drive amount of the operation means detected by the operation amount detection means.
- The external force detection method includes a step of giving an instruction value of a drive amount to operation means that drives each unit of the robot device, a step of detecting the actual value of the drive amount, and a step of judging the presence or absence of an external force based on the instruction value and the actual value.
- The program according to the present invention causes the robot device to execute a control step of controlling the operation of an operation unit of the robot device by a control signal, and an external force detection step of detecting an external force applied to the robot device based on a change in the control signal when the external force is applied.
- The recording medium records a program for causing the robot device to execute a control step of controlling the operation of an operation unit of the robot device by a control signal, and an external force detection step of detecting an external force applied to the robot device based on a change in the control signal when the external force is applied.
- FIG. 2 is a diagram showing processing executed by the basic configuration of FIG. 1.
- FIG. 3 shows a program described so as to execute step S3 in FIG. 2.
- FIG. 4 is a perspective view showing an external configuration of the robot device.
- FIG. 5 is a block diagram showing the circuit configuration of the robot.
- FIG. 6 is a block diagram illustrating a configuration of the signal processing circuit.
- FIG. 7 is a block diagram conceptually showing a software configuration of the control program.
- FIG. 8 is a block diagram conceptually showing a software configuration of the middleware layer.
- FIG. 9 is a block diagram conceptually showing a software configuration of an application layer.
- FIG. 10 is a conceptual diagram showing the configuration of the behavior model library.
- FIG. 11 is a conceptual diagram illustrating a stochastic automaton.
- FIG. 12 is a conceptual diagram showing a state transition table.
- FIG. 13 is a block diagram showing a main configuration for detecting an external force in the robot device.
- FIG. 14 is a characteristic diagram showing the relationship between the PWM pulse and the measured torque.
- FIG. 15 is a characteristic diagram showing the relationship between the PWM pulse and the measured torque.
- FIG. 16 is a characteristic diagram showing the relationship between the PWM pulse and the measured torque.
- FIG. 17 is a characteristic diagram showing the relationship between the PWM pulse and the measured torque.
- FIG. 18 is a diagram showing a configuration of the force sensor system.
- FIG. 19 is a characteristic diagram of the calibration used for describing the force sensor system.
- FIG. 20 is a diagram illustrating a modeled robot device.
- FIG. 21 is a characteristic diagram showing the relationship between the actually measured torque and the torque obtained by the conversion in the joint 1.
- FIG. 22 is a characteristic diagram showing the relationship between the actually measured torque and the torque obtained by the conversion in the joint 1.
- FIG. 23 is a characteristic diagram showing a relationship between the actually measured torque and the torque obtained by the conversion in the joint 2.
- FIG. 24 is a characteristic diagram showing the relationship between the measured torque and the torque obtained by the conversion in the joint 2.
- FIG. 25 is a characteristic diagram showing a relationship between the actually measured torque and the torque obtained by the conversion in the joint 3.
- FIG. 26 is a characteristic diagram showing a relationship between the actually measured torque and the torque obtained by the conversion in the joint 3.
- FIG. 27 is a characteristic diagram showing results when forces are applied from various directions.
- FIG. 28 is a characteristic diagram showing results when forces are applied from various directions.
- FIG. 29 is a characteristic diagram showing results when forces are applied from various directions.
- Fig. 30 is a characteristic diagram showing the results when a force is applied to the body in one turn.
- FIG. 31 is a waveform chart showing the relationship between the indicated value and the actual value for the pitch direction actuator.
- FIG. 32 is a waveform chart showing various waveforms.
- Fig. 33 is a conceptual diagram for explaining the stability margin.
- FIG. 34 is a waveform chart for explaining the stability margin.
- FIG. 35 is a waveform diagram for explaining the obstacle detection by the first to third obstacle detection methods.
- FIG. 36 is a waveform diagram for explaining the obstacle detection by the first to third obstacle detection methods.
- FIG. 37 is a waveform diagram for describing obstacle detection by the first to third obstacle detection methods.
- FIG. 38 is a conceptual diagram explaining the method of calculating the stability margin.
- FIG. 39 is a flowchart showing an obstacle detection processing procedure.
- BEST MODE FOR CARRYING OUT THE INVENTION The robot device shown as a specific example of the present invention is an autonomous robot device that acts autonomously according to the surrounding environment (external factors) and its internal state (internal factors).
- The external factors include, for example, a force applied by the user or an external force generated by contact with an obstacle, and the robot device is configured to detect such external forces.
- Specifically, the robot device has an external force measurement unit 101, an external force analysis unit 102 and an action selection unit 103, as shown in FIG. 1. The external force measurement unit 101 measures the position, direction, magnitude, etc. of the external force applied to the robot device (step S1), the external force analysis unit 102 classifies the type of the external force according to the input from the external force measurement unit 101 (step S2), and the action selection unit 103 changes the posture of the robot device according to the analysis result of the external force analysis unit 102 (step S3).
- the action selection in step S3 is realized, for example, according to the program shown in FIG.
- the program in FIG. 3 is described so that the next action can be selected based on the type of external force applied to the robot device and the behavior of the robot device.
- In the program of FIG. 3, the ForceKind variable, the BehaviorStatus variable and the ForceDirection variable indicate the position to which the external force is applied, the behavior of the robot device, and the direction in which the force is applied, respectively.
- the action selection unit 103 selects an action based on the information of the external force.
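- For illustration only, a minimal Python sketch of this kind of selection logic is given below; the variable values and action names are hypothetical stand-ins and are not the contents of the actual program of FIG. 3.

```python
# Hypothetical sketch of selecting an action from external-force information.
# ForceKind / BehaviorStatus / ForceDirection values and the action names are
# illustrative only; they are not taken from the program of FIG. 3.
def select_action(force_kind: str, behavior_status: str, force_direction: str) -> str:
    """Return the next action given where the force was applied, what the
    robot device was doing, and the direction in which the force was applied."""
    if behavior_status == "STANDING":
        if force_kind == "BACK" and force_direction == "DOWNWARD":
            return "SIT_DOWN"            # e.g. pressing down on the back -> sit
        if force_kind == "HEAD" and force_direction == "DOWNWARD":
            return "LIE_DOWN"
    if behavior_status == "WALKING" and force_direction == "BACKWARD":
        return "STOP_AND_BRACE"          # resist being pushed backward
    return "KEEP_CURRENT_BEHAVIOR"       # default: no posture change

print(select_action("BACK", "STANDING", "DOWNWARD"))   # -> SIT_DOWN
```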
- a specific example of the present invention describes an external force detection method in an autonomous robot device that selects an action based on an applied external force.
- The robot device 1 is a so-called pet-type robot shaped like an animal such as a "dog". Leg units 3A, 3B, 3C and 3D are connected to the front, rear, left and right of a body unit 2, and a head unit 4 and a tail unit 5 are connected to the front end and the rear end of the body unit 2, respectively.
- The body unit 2 houses a control unit 16, formed by connecting a CPU (Central Processing Unit) 10, a DRAM (Dynamic Random Access Memory) 11, a flash ROM (Read Only Memory) 12, a PC (Personal Computer) card interface circuit 13 and a signal processing circuit 14 to one another via an internal bus 15, and a battery 17 serving as the power source of the robot device 1.
- The body unit 2 also houses an angular velocity sensor 18 and an acceleration sensor 19 for detecting the orientation of the robot device 1 and the acceleration of its movement.
- The head unit 4 has, at predetermined positions, a CCD (Charge Coupled Device) camera 20 for imaging the external situation, a contact sensor 21 for detecting the pressure received from physical actions by the user such as "stroking" and "hitting", a distance sensor 22 for measuring the distance to an object located ahead, a microphone 23 for collecting external sounds, a speaker 24 for outputting sounds such as squeals, and LEDs (Light Emitting Diodes) (not shown) corresponding to the "eyes" of the robot device 1.
- Actuators 25_1 to 25_n and potentiometers 26_1 to 26_n, corresponding to the respective degrees of freedom, are provided at the joints of the leg units 3A to 3D, at the connecting portions between the leg units 3A to 3D and the body unit 2, and at the connecting portion between the head unit 4 and the body unit 2.
- Each of the actuators 25_1 to 25_n includes a servo motor. By driving the servo motors, the leg units 3A to 3D are controlled and shift to the target posture or operation.
- The various sensors such as the angular velocity sensor 18, the acceleration sensor 19, the contact sensor 21, the distance sensor 22, the microphone 23, the speaker 24 and the potentiometers 26_1 to 26_n, as well as the LEDs and the actuators 25_1 to 25_n, are connected to the signal processing circuit 14 of the control unit 16 via corresponding hubs 27_1 to 27_n, and the CCD camera 20 and the battery 17 are directly connected to the signal processing circuit 14.
- The signal processing circuit 14 sequentially takes in the sensor data, image data and audio data supplied from each of the above-described sensors and sequentially stores them at predetermined positions in the DRAM 11 via the internal bus 15. In addition, the signal processing circuit 14 sequentially takes in remaining battery data indicating the remaining battery power supplied from the battery 17 and stores it at a predetermined position in the DRAM 11.
- The sensor data, image data, audio data and remaining battery data stored in the DRAM 11 in this way are used when the CPU 10 controls the operation of the robot device 1. In practice, when the power of the robot device 1 is first turned on, the CPU 10 reads the control program stored in a memory card 28 inserted into a PC card slot (not shown) of the body unit 2 or in the flash ROM 12, via the PC card interface circuit 13 or directly, and stores it in the DRAM 11. The CPU 10 then judges its own and surrounding conditions, and the presence or absence of instructions and actions from the user, based on the sensor data, image data, audio data and remaining battery data sequentially stored in the DRAM 11 from the signal processing circuit 14 as described above.
- Further, the CPU 10 determines the subsequent action based on this judgment result and the control program stored in the DRAM 11, and drives the necessary actuators 25_1 to 25_n based on the determination result, thereby causing the robot device to perform actions such as swinging the head unit 4 up, down, left and right, moving the tail 5A of the tail unit 5, and driving the leg units 3A to 3D to walk.
- At this time, the CPU 10 also generates audio data as needed and gives it to the speaker 24 as an audio signal via the signal processing circuit 14 so that a sound based on the audio signal is output to the outside, or turns the above-mentioned LEDs on, off or blinking.
- In this way, the robot device 1 is capable of acting autonomously in accordance with its own and surrounding conditions and with instructions and actions from the user.
- FIG. 6 shows a specific configuration of the signal processing circuit 14.
- The signal processing circuit 14 includes a DMA (Direct Memory Access) controller 30, a DSP (Digital Signal Processor) 31, a peripheral interface 32, a timer 33, an FBK/CDT (Filter Bank/Color Detection) 34, an IPE (Inner Product Engine) 35, a serial bus host controller 36 and a serial bus 37, which are connected to a bus 38.
- The bus 38 is connected, via a bus arbiter 39 that arbitrates the right to use the bus, to a bus 40, and the bus 40 is connected to a DRAM interface 41, a host interface 42 and a ROM interface 43, which are connected to the DRAM 11 (FIG. 5), the CPU 10 (FIG. 5) and the flash ROM 12 (FIG. 5), respectively, as well as to a parallel port 44, a battery manager 45 and a serial port 46.
- The various devices described above, such as the potentiometers 26 (26_1 to 26_n), are connected to the serial bus host controller 36 via the hubs 27 (27_1 to 27_n), the CCD camera 20 (FIG. 5) is connected to the FBK/CDT 34, and the battery 17 (FIG. 5) is connected to the battery manager 45.
- The serial bus host controller 36 sequentially takes in sensor data from each of the connected devices, such as the angular velocity sensor 18, the acceleration sensor 19, the contact sensor 21, the distance sensor 22 and the potentiometers 26 (26_1, 26_2, 26_3, ...), and, under the control of the DMA controller 30 which functions as a bus master for data transfer, transfers these sensor data to the DRAM 11 via the bus 38, the bus arbiter 39, the bus 40 and the DRAM interface 41 in that order, and stores them there.
- The serial bus host controller 36 also sends the audio data supplied from the microphone 23 to the DSP 31. The DSP 31 performs predetermined data processing on the audio data and, under the control of the DMA controller 30, transfers the processed data to the DRAM 11 via the bus 38, the bus arbiter 39, the bus 40 and the DRAM interface 41 in that order, and stores it in a predetermined storage area in the DRAM 11.
- The FBK/CDT 34 takes in the image data supplied from the CCD camera 20 while dividing it into a plurality of resolutions and performing color recognition, and, under the control of the DMA controller 30, transfers the obtained image data to the DRAM 11 (FIG. 5) via the bus 38, the bus arbiter 39, the bus 40 and the DRAM interface 41 in that order, and stores it in a predetermined storage area in the DRAM 11 as described later.
- The battery manager 45, under the control of the DMA controller 30, transfers the remaining battery data indicating the remaining energy notified from the battery 17 to the DRAM 11 via the peripheral interface 32, the bus 38, the bus arbiter 39, the bus 40 and the DRAM interface 41 in that order, and stores it in a predetermined storage area in the DRAM 11.
- On the other hand, the signal processing circuit 14 receives, via the host interface 42, the first drive signals for driving the actuators 25 (25_1, 25_2, 25_3, ...), the audio signals for the speaker 24 and the second drive signals for driving the LEDs, which are given from the CPU 10 (FIG. 5) via the bus 15 (FIG. 5), and sends them via the bus 40, the bus arbiter 39, the bus 38, the serial bus host controller 36 and the corresponding hubs 27 (27_1 to 27_n) (FIG. 5) in that order to the corresponding actuators 25 (25_1, 25_2, 25_3, ...) (FIG. 5), the speaker 24 (FIG. 5) or the LEDs.
- In this way, the signal processing circuit 14 performs, between the CPU 10 on the one hand and each of the sensors, the CCD camera 20, the microphone 23, the speaker 24 and the actuators 25 (25_1 to 25_n) on the other, the various signal processing necessary for the CPU 10 to control the behavior of the robot device 1.
- FIG. 7 shows a software configuration of the above-described control program in the robot device 1.
- The device driver layer 50 is located at the lowest layer of the control program and includes a device driver set 51 consisting of a plurality of device drivers.
- Each device driver is an object that is allowed to directly access hardware used in an ordinary computer, such as the CCD camera 20 (FIG. 5), and performs processing in response to an interrupt from the corresponding hardware.
- The robotic server object 52 is located above the device driver layer 50 and is composed of a virtual robot 53, which is a group of software that provides an interface for accessing hardware such as the above-described various sensors and actuators 25_1 to 25_n, a power manager 54, which is a group of software that manages switching of the power supply and the like, a group of software that manages various other device drivers, and a designed robot 56, which is a group of software that manages the mechanism of the robot device 1.
- The manager object 57 is composed of an object manager 58 and a service manager 59. The object manager 58 is a group of software that manages the starting and termination of each software group included in the robotic server object 52, the middleware layer 60 and the application layer 61, and the service manager 59 is a group of software that manages the connections between objects based on the connection information between objects described in a connection file stored in the memory card 28 (FIG. 5).
- The middleware layer 60 is located above the robotic server object 52 and is composed of software groups that provide basic functions of the robot device 1, such as image processing and audio processing. The application layer 61 is located above the middleware layer 60 and is composed of software groups that determine the behavior of the robot device 1 based on the processing results of the software groups constituting the middleware layer 60.
- FIGS. 8 and 9 show the specific software configurations of the middleware layer 60 and the application layer 61, respectively.
- As is clear from FIG. 8, the middleware layer 60 includes a recognition system 700, which has signal processing modules 70 to 78 for noise detection, temperature detection, brightness detection, scale recognition, distance detection, posture detection, contact detection, motion detection and color recognition as well as an input semantics converter module 79, and an output system 800, which has an output semantics converter module 88 and signal processing modules 81 to 87 for posture management, tracking, motion playback, walking, fall recovery, LED lighting and sound playback.
- Each of the signal processing modules 70 to 78 of the recognition system 700 takes in the corresponding data among the sensor data, image data and audio data read from the DRAM 11 (FIG. 5) by the virtual robot 53 of the robotic server object 52, performs predetermined processing based on the data, and gives the processing result to the input semantics converter module 79.
- The input semantics converter module 79 recognizes, based on the processing results given from these signal processing modules 70 to 78, its own and the surrounding situation, such as "noisy", "hot", "bright", "a ball has been detected", "a fall has been detected", "stroked", "hit", "the do-mi-so scale has been heard", "a moving object has been detected" or "an obstacle has been detected", as well as commands and actions from the user, and outputs the recognition result to the application layer 61 (FIG. 7).
- The application layer 61 consists of five modules: a behavior model library 90, a behavior switching module 91, a learning module 92, an emotion model 93 and an instinct model 94, as shown in FIG. 9.
- The behavior model library 90 includes, as shown in FIG. 10, independent behavior models 90_1 to 90_n corresponding to several pre-selected condition items such as "when the remaining battery power is low", "when recovering from a fall", "when avoiding an obstacle", "when expressing an emotion" and "when a ball is detected".
- These behavior models 90_1 to 90_n, when a recognition result is given from the input semantics converter module 79 or when a certain time has passed since the last recognition result was given, determine the subsequent action while referring, as necessary, to the corresponding emotion parameter value held in the emotion model 93 and the corresponding desire parameter value held in the instinct model 94, as described later, and output the determination result to the behavior switching module 91.
- As a method of determining the next action, each behavior model 90_1 to 90_n uses an algorithm called a probabilistic automaton, shown in FIG. 11, in which the transition from one node (state) NODE_0 to NODE_n to another (or the same) node NODE_0 to NODE_n is determined probabilistically based on the transition probabilities P_1 to P_n+1 set for the arcs ARC_1 to ARC_n+1 connecting the nodes NODE_0 to NODE_n.
- Specifically, each behavior model 90_1 to 90_n has, corresponding to each of the nodes NODE_0 to NODE_n forming its own behavior model 90_1 to 90_n, a state transition table 100 as shown in FIG. 12 for each of these nodes NODE_0 to NODE_n.
- In the node NODE_100 represented by the state transition table 100 of FIG. 12, when a recognition result of "a ball has been detected (BALL)" is given, the condition for transitioning to another node is that the "SIZE" of the ball given together with the recognition result is in the range of "0 to 1000", and when a recognition result of "an obstacle has been detected (OBSTACLE)" is given, the condition for transitioning to another node is that the "distance (DISTANCE)" to the obstacle given together with the recognition result is in the range of "0 to 100".
- Also, even when no recognition result is input, this node NODE_100 can transition to another node when, among the parameter values of each emotion and each desire held in the emotion model 93 and the instinct model 94, which the behavior models 90_1 to 90_n refer to periodically, the parameter value of "joy (JOY)", "surprise (SURPRISE)" or "sadness (SADNESS)" held in the emotion model 93 is in the range of "50 to 100".
- In the state transition table 100, the names of the nodes to which a transition can be made from the node NODE_0 to NODE_n are listed in the "transition destination node" row of the "transition probability to another node" column, the probability of transitioning to each of the other nodes NODE_0 to NODE_n to which a transition can be made when all the conditions described in the "input event name", "data value" and "data range" rows are satisfied is described in the corresponding place of the "transition probability to another node" column, and the action to be output when transitioning to that node NODE_0 to NODE_n is described in the "output action" row of the "transition probability to another node" column. The sum of the probabilities of each row in the "transition probability to another node" column is 100 [%].
- Therefore, in the node NODE_100 represented by the state transition table 100 of FIG. 12, for example, when a recognition result that "a ball has been detected (BALL)" and that the "SIZE" of the ball is in the range of "0 to 1000" is given, a transition to "node 120 (NODE_120)" can be made with a probability of "30 [%]", and the action "ACTION 1" is output at that time.
- Each of the behavior models 90_1 to 90_n is constructed by connecting a number of nodes NODE_0 to NODE_n described by such state transition tables 100, so that, for example, when a recognition result is given from the input semantics converter module 79, the next action is determined stochastically using the state transition table 100 of the corresponding node NODE_0 to NODE_n, and the determination result is output to the behavior switching module 91.
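- As an illustration of how such a probabilistic automaton can be evaluated, the following sketch picks a destination node and output action at random according to the transition probabilities; the node names, actions and probabilities are made up and are not the contents of the actual state transition table 100.

```python
import random

# Hypothetical transition entries for one node; the real state transition table 100
# also carries input event names, data values and data ranges as conditions.
TRANSITIONS_NODE100 = [
    # (destination node, output action, transition probability in %)
    ("NODE_120", "ACTION_1", 30),
    ("NODE_150", "ACTION_2", 50),
    ("NODE_100", "NO_ACTION", 20),   # transitioning to the same node is allowed
]

def step(transitions):
    """Choose the next node and the action to output, weighted by probability."""
    total = sum(p for _, _, p in transitions)
    assert total == 100, "the probabilities of one row must sum to 100 [%]"
    r = random.uniform(0, total)
    acc = 0.0
    for node, action, prob in transitions:
        acc += prob
        if r <= acc:
            return node, action
    return transitions[-1][0], transitions[-1][1]

next_node, action = step(TRANSITIONS_NODE100)
print(next_node, action)
```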
- The behavior switching module 91 selects, from among the actions output from the behavior models 90_1 to 90_n of the behavior model library 90, the action output by the behavior model 90_1 to 90_n with the predetermined highest priority, and sends a command to execute that action (hereinafter referred to as an action command) to the output semantics converter module 88 of the middleware layer 60. In this embodiment, a higher priority is set for the behavior models 90_1 to 90_n listed lower in FIG. 10.
- Further, the behavior switching module 91 notifies the learning module 92, the emotion model 93 and the instinct model 94 that an action has been completed, based on the action completion information given from the output semantics converter module 88 after the action is completed.
- On the other hand, the learning module 92 receives, among the recognition results given from the input semantics converter module 79, the recognition results of instructions received as actions from the user, such as "hit" and "stroked".
- Based on this recognition result and the notification from the behavior switching module 91, the learning module 92 changes the corresponding transition probabilities of the corresponding behavior models 90_1 to 90_n in the behavior model library 90 so as to lower the probability of occurrence of the action when "hit" (scolded) and to raise the probability of occurrence of the action when "stroked" (praised).
- The emotion model 93 holds, for each of six emotions in total, namely "joy", "sadness", "anger", "surprise", "disgust" and "fear", a parameter indicating the intensity of that emotion. The emotion model 93 periodically updates the parameter values of these emotions based on specific recognition results such as "hit" and "stroked" given from the input semantics converter module 79, the elapsed time, the notification from the behavior switching module 91, and the like.
- Specifically, the emotion model 93 calculates the parameter value E[t + 1] of the emotion in the next cycle by equation (1) below, where ΔE[t] is the amount of variation of that emotion at that time calculated by a predetermined arithmetic expression based on the recognition result given from the input semantics converter module 79, the behavior of the robot device 1 at that time, the elapsed time since the previous update, and so on, E[t] is the current parameter value of that emotion, and k_e is a coefficient representing the sensitivity of that emotion, and updates the parameter value of that emotion by replacing the current parameter value E[t] with the calculated value:
- E[t + 1] = E[t] + k_e × ΔE[t]   ... (1)
- The emotion model 93 updates the parameter values of all the emotions in the same manner.
- The degree to which the recognition result and the notification from the output semantics converter module 88 affect the amount of variation ΔE[t] of the parameter value of each emotion is determined in advance. For example, a recognition result such as "hit" has a relatively large effect on the amount of variation ΔE[t] of the parameter value of the emotion "anger", and a recognition result such as "stroked" has a relatively large effect on the amount of variation ΔE[t] of the parameter value of the emotion "joy".
- The notification from the output semantics converter module 88 is so-called action feedback information (action completion information), that is, information on the result of the appearance of an action, and the emotion model 93 also changes its emotions according to such information. This is the case, for example, when an action of barking lowers the emotion level of anger.
- The notification from the output semantics converter module 88 is also input to the above-described learning module 92, and the learning module 92 changes the corresponding transition probabilities of the behavior models 90_1 to 90_n based on the notification.
- The feedback of the action result may also be made by the output of the behavior switching module 91 (the action to which the emotion is added).
- On the other hand, the instinct model 94 holds, for each of four independent desires, namely "exercise", "affection", "appetite" and "curiosity", a parameter indicating the strength of that desire. The instinct model 94 periodically updates the parameter values of these desires based on the recognition results given from the input semantics converter module 79, the elapsed time, the notification from the behavior switching module 91, and the like.
- Specifically, for "exercise", "affection" and "curiosity", the instinct model 94 calculates the parameter value I[k + 1] of the desire in the next cycle by equation (2) below, where ΔI[k] is the amount of variation of that desire at that time calculated by a predetermined arithmetic expression based on the recognition result, the elapsed time, the notification from the output semantics converter module 88, and so on, I[k] is the current parameter value of that desire, and k_i is a coefficient representing the sensitivity of that desire, and updates the parameter value of that desire by replacing the current parameter value I[k] with this operation result:
- I[k + 1] = I[k] + k_i × ΔI[k]   ... (2)
- The degree to which the recognition result, the notification from the output semantics converter module 88, and the like affect the amount of variation ΔI[k] of the parameter value of each desire is determined in advance.
- For example, the notification from the output semantics converter module 88 has a relatively large effect on the amount of variation ΔI[k] of the parameter value of "fatigue".
- In this embodiment, the parameter values of each emotion and each desire (instinct) are regulated so as to vary in the range from 0 to 100, and the coefficients k_e and k_i are also set individually for each emotion and each desire.
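- A minimal sketch of update rules (1) and (2), assuming an illustrative sensitivity coefficient and variation amount, is shown below.

```python
def update_parameter(current: float, delta: float, sensitivity: float) -> float:
    """Apply E[t+1] = E[t] + k_e * dE[t] (or the analogous rule (2) for a desire)
    and keep the result inside the regulated range 0 to 100."""
    value = current + sensitivity * delta
    return max(0.0, min(100.0, value))

# Example: being "hit" raises "anger"; the numbers are purely illustrative.
anger = 40.0
anger = update_parameter(anger, delta=+25.0, sensitivity=0.8)
print(anger)   # 60.0
```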
- On the other hand, the output semantics converter module 88 of the middleware layer 60 gives, as shown in FIG. 8, the abstract action commands such as "move forward", "be pleased", "squeal" or "track (chase the ball)" given from the behavior switching module 91 of the application layer 61 as described above to the corresponding signal processing modules 81 to 87 of the output system 800.
- When an action command is given, these signal processing modules 81 to 87 generate, based on the action command, servo command values to be given to the corresponding actuators 25_1 to 25_n (FIG. 5) in order to perform the action, audio data of a sound to be output from the speaker 24 (FIG. 5) and/or drive data to be given to the LEDs serving as the "eyes", and send these data to the corresponding actuators 25_1 to 25_n, the speaker 24 or the LEDs in that order via the virtual robot 53 of the robotic server object 52 and the signal processing circuit 14 (FIG. 5).
- In this way, the robot device 1 can perform autonomous actions according to its own (internal) and surrounding (external) conditions, and to instructions and actions from the user, based on the control program.
- As shown in FIG. 13, the robot device 1 includes external force detecting means 7 as a portion for detecting an external force applied from the outside.
- the external force detecting means 7 detects an external force based on a change in a control signal output from the control means 6 to the driving section 8 for controlling the operation section 9 when an external force is applied.
- the external force detecting means 7 is configured as a so-called module or object which is an arithmetic processing program.
- The control means 6 is, for example, the CPU 10.
- the operating unit 9 is the leg unit 3A, 3B, 3C, 3D, the head unit 4 or the tail unit 5 described above.
- the control signal is, for example, a PWM (pulse width modulation) pulse whose pulse width is modulated by data.
- the PWM pulse is a signal frequently used as a control signal or the like.
- the external force applied to the robot device 1 can be detected by the configuration described above as follows.
- First, the control means 6 outputs a control signal for the operation unit 9.
- the robot device 1 has a plurality of predetermined action plan data.
- the action plan data is information on actions that can be executed by the robot device 1.
- The robot device 1 can cause various actions to appear by means of the plurality of action plan data.
- The control means 6 selects one action plan data item from the plurality of action plan data prepared in this way, based on a predetermined condition, and outputs a control signal based on the selected action plan data.
- The above-mentioned predetermined condition is, for example, that the emotion defining the action of the robot device 1 is at a predetermined level.
- The control signal output from the control means 6 is output to the driving unit 8.
- The drive unit 8 controls the operation unit 9 based on the control signal. Specifically, the operation unit 9 is controlled based on the control signals sequentially output from the control means 6 based on the action plan data.
- The external force detecting means 7 monitors the control signal output from the control means 6 to the drive unit 8 as described above, and detects the external force based on the control signal at the time when an external force is applied to the operation unit 9.
- For example, when the control signal is a PWM pulse, the external force is detected based on the pulse width of the PWM pulse, which changes when the external force is applied.
- the external force is detected by the external force detecting means 7.
- the external force detecting means 7 obtains the direction and magnitude as information of the external force.
- In this way, the robot device 1 controls its posture and operation, and can also detect an external force by using the control signal used for that posture and operation control.
- For example, when its body is pushed backward by an external force, the robot device 1 outputs a control signal for posture control to maintain its posture, and information on the external force can be obtained based on this control signal.
- Further, by linking the information on the external force with a predetermined operation, the robot device 1 can, when the external force is applied, detect the information on the external force and express an operation corresponding to it (for example, "sitting down").
- In the first step, the relationship between the torque A (actual measurement) and the PWM pulse is determined.
- In the second step, the relationship between the force (actual measurement) measured by a sensor or the like and the torque B (calculated value) is determined.
- In the third step, the relationship between the torque A and the torque B (for example, a relational expression) is obtained.
- In the fourth step, force information is obtained from the PWM pulse alone, using the relationship obtained in the third step.
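- A rough sketch of how these four steps could be chained together is shown below; the regression helper and the sample data are illustrative placeholders, not the calibration actually used for the robot device 1.

```python
import numpy as np

def fit_line(x, y):
    """Least-squares regression line y = a*x + b."""
    a, b = np.polyfit(np.asarray(x, dtype=float), np.asarray(y, dtype=float), 1)
    return a, b

# First step: torque A (actually measured) versus PWM pulse width (illustrative data).
pwm_samples      = [330.0, 380.0, 430.0, 480.0]
torque_a_samples = [0.05, 0.12, 0.19, 0.26]          # [N*m]
a1, b1 = fit_line(pwm_samples, torque_a_samples)

# Second step: torque B (calculated value) versus force measured by a sensor.
torque_b_samples = [0.06, 0.13, 0.20, 0.27]          # [N*m]
force_samples    = [1.0, 2.0, 3.0, 4.0]              # [N]
a2, b2 = fit_line(torque_b_samples, force_samples)

# Third and fourth steps: relate torque A to torque B (here simply by chaining the
# two fits) and obtain force information from the PWM pulse alone.
def force_from_pwm(pwm: float) -> float:
    torque = a1 * pwm + b1          # PWM pulse -> torque A
    return a2 * torque + b2         # torque (treated as torque B) -> force

print(force_from_pwm(400.0))
```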
- FIGS. 14 to 17 show, as a graph, the relationship between the torque and the value of the PWM pulse for each leg.
- The horizontal axis indicates the PWM pulse value, and the vertical axis indicates the torque value.
- The results in FIG. 14 are for the right front leg, those in FIG. 15 for the left front leg, those in FIG. 16 for the right hind leg, and those in FIG. 17 for the left hind leg.
- Each figure shows the relationship between the torque at each joint (joint 1 to joint 3) from the shoulder to the toe and the PWM pulse width. Here, the leg is pulled with a spring scale, the torque is obtained based on the tension, and the PWM pulse width is the value given at that time.
- As these figures show, a dead zone occurs in the range where the PWM pulse is within about ±320.
- In the dead zone, the torque is indeterminate and cannot be used to determine the force.
- Therefore, only the usable regions are used, and the relationship is obtained from the values in these usable regions using a regression line.
- Regression lines were calculated for all the joints and averaged to obtain the relational expression.
- Here, the usable regions are those where the PWM pulse is from +320 to +512 and from −320 to −512.
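- The following sketch illustrates the use of only the usable PWM regions together with a regression line; the coefficients are invented for illustration and would in practice come from the averaged regression lines described above.

```python
DEAD_ZONE = 320      # |PWM| below this value: torque is indeterminate
PWM_MAX   = 512

# Illustrative regression coefficients (torque = slope * pwm + offset) for the
# positive and negative usable regions; not the values measured in FIGS. 14 to 17.
POS_SLOPE, POS_OFFSET = 0.0014, -0.40
NEG_SLOPE, NEG_OFFSET = 0.0014, 0.40

def torque_from_pwm(pwm: float):
    """Estimate the joint torque from the PWM pulse value, or return None
    when the pulse lies in the dead zone."""
    if abs(pwm) < DEAD_ZONE:
        return None                      # dead zone: the force cannot be determined
    pwm = max(-PWM_MAX, min(PWM_MAX, pwm))
    if pwm > 0:
        return POS_SLOPE * pwm + POS_OFFSET
    return NEG_SLOPE * pwm + NEG_OFFSET

print(torque_from_pwm(200))   # None (dead zone)
print(torque_from_pwm(450))   # about 0.23 with these illustrative coefficients
```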
- FIG. 18 shows a system (hereinafter, referred to as a force sensor system) for measuring a force applied to the robot device 1.
- This force sensor system includes a sensor 201, a differential amplifier 202, an A / D converter 203, an interface 204, and an analyzer 205.
- the sensor 201 is a three-axis force sensor that measures a three-axis force, for example.
- Specifically, single-axis force sensors 201a are arranged in three directions, and each force sensor 201a is in contact with the surface of a metal ball 201b.
- With this configuration, when an external force is applied to the metal ball 201b, each force sensor 201a outputs a signal corresponding to it.
- each force sensor 201a is a strain gauge, and the sensor 201 is configured by a bridge circuit using such a strain gauge.
- the signal detected by the sensor 201 is output to the differential amplifier 202, amplified by the differential amplifier 202, and output to the A / D converter 203.
- the A / D converter 203 is configured as, for example, an 8-bit data converter.
- The data obtained by the conversion in the A/D converter 203 has its voltage level converted by an interface 204 such as an RS-232C interface and is output to the analysis unit 205.
- the analysis unit 205 analyzes the force based on the data obtained by the sensor 201.
- the analysis unit 205 obtains the direction and magnitude of the force as the information of the force by the analysis.
- The analysis unit 205 is, for example, a personal computer (PC) running suitable analysis software, and performs the force analysis according to the description of that analysis software.
- FIG. 19 shows a characteristic diagram used for force standardization (calibration). Calibration is performed by checking the output voltage when a known weight is applied, for example using a jig, and it can be confirmed from this characteristic diagram that the relationship is proportional.
- When the output voltage is V0 for a weight W0 and V1 for a weight W1, the output voltage V for a force F is given by
- V = V0 + F × (V1 − V0) / (W1 − W0)
- and the force applied to the robot device 1 can thereby be measured.
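- Under this linear relationship, a measured output voltage can be converted back into a force as in the following sketch; the calibration points W0, V0, W1 and V1 are hypothetical.

```python
# Hypothetical calibration points: output voltage V0 at weight W0, V1 at weight W1.
W0, V0 = 0.0, 0.50     # [N], [V]
W1, V1 = 9.8, 2.50     # [N], [V]

def force_from_voltage(v: float) -> float:
    """Invert V = V0 + F * (V1 - V0) / (W1 - W0) to recover the force F."""
    return (v - V0) * (W1 - W0) / (V1 - V0)

print(force_from_voltage(1.5))   # -> 4.9 [N] with these hypothetical points
```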
- a conversion formula that converts a force measured by the above-described force sensor system into a torque will be described.
- The conversion formula is obtained by modeling each link of the robot device 1 so as to convert a force assumed to act on the end effector into torque.
- FIG. 20 shows a coordinate system of a robot device modeling each link. By modeling each link of the robot device in this way, the measured force can be converted into torque.
- The conversion uses, for example, a Jacobian matrix.
- Let F be the force applied at the end effector and τ be the torques required at each joint (link) when that force is given; then, using the Jacobian matrix J, the following equation (6) holds between the force and the torques:
- τ = Jᵀ F   ... (6)
- Here, the base coordinate system is the coordinate system of the body of the robot device.
- the force obtained by the above-described force sensor system can be converted into torque by the conversion using the Jacobian matrix.
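- Equation (6) corresponds to the usual static relation between an end-effector force and the joint torques; a minimal numeric sketch, with an invented Jacobian for a three-joint leg, is given below.

```python
import numpy as np

def joint_torques(jacobian: np.ndarray, force: np.ndarray) -> np.ndarray:
    """tau = J^T * F : convert a force at the end effector into joint torques."""
    return jacobian.T @ force

# Illustrative 3x3 Jacobian of a three-joint leg (not taken from the patent).
J = np.array([[0.10, 0.05, 0.00],
              [0.00, 0.08, 0.04],
              [0.12, 0.00, 0.06]])
F = np.array([1.0, 0.0, 2.0])     # force applied at the end effector [N]

print(joint_torques(J, F))        # torques at joints 1 to 3 [N*m]
```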
- Here, the actually measured torque is the torque measured as described above, and the torque obtained by the conversion is the torque obtained from the force sensor by the conversion described above.
- FIGS. 21 to 26 graphically show the relationship between the measured torque and the torque obtained by conversion from the force sensor values.
- The vertical axis is the torque obtained by the conversion (the value obtained based on the force sensor), and the horizontal axis is the actually measured torque.
- FIGS. 21 and 22 show the relationship at joint 1
- FIGS. 23 and 24 show the relationship at joint 2
- FIGS. 25 and 26 show the relationship at joint 3.
- FIGS. 21, 23, and 25 show the positive regions of the PWM pulse
- FIGS. 22, 24, and 26 show the negative regions thereof.
- FIGS. 27 to 30 show the forces calculated using the above relational expressions when forces were actually applied.
- FIGS. 27 to 29 show the results in the case where a force is applied from each direction to the robot apparatus 1 in the standing state
- FIG. 27 shows the results in the case where the force is applied in the forward direction
- Fig. 28 shows the result when a force is applied in the backward direction
- Fig. 29 shows the result when the left leg is lifted.
- Applying a force in the forward direction corresponds, for example, to the way a force is applied when trying to push the robot device 1 forward onto its face. In this case, the forces on the front legs and in the left-right direction are calculated well, while almost no force appears on the rear legs. This is because, when a force is applied to the robot device 1 in the forward direction, the force is concentrated on the front legs.
- FIG. 30 shows the results when a force is applied to the robot device 1 in the standing posture in the order of forward, rightward, leftward and rearward, that is, when a force is applied all the way around the body of the robot device 1. Here, Fx and Fy are the horizontal forces. It can be seen from FIG. 30 that the robot device 1 reliably senses the force applied from each direction.
- The above is a specific example of external force detection in which information on an externally applied force (such as its magnitude and direction) is calculated based on the PWM pulse used as the control signal.
- As described above, the robot device 1 can control its posture and operation by means of the control signal and can also detect an external force using that control signal. For example, the robot device 1 can make a transition to another posture using the external force detected in this way as a trigger.
- Thereby, the user can cause the robot device 1 to express a certain action, for example by pressing its buttocks, and can enjoy the operations that the robot device 1 expresses in response.
- Since the robot device 1 detects the external force using the control signal that is originally used for control, it can detect an external force with a simple configuration, without needing a dedicated sensor.
- When walking, the robot device 1 determines, using kinematics, the respective target angles of the pitch direction and roll direction of the shoulder joints and the target angles of the knee joints of the leg units 3A to 3D.
- It then drives and controls the corresponding actuators 25 in the shoulder joint mechanisms and/or knee joint mechanisms so that the pitch and roll angles of the shoulder joints and the pitch angles of the knee joints of the leg units 3A to 3D become the respective instruction values, and walking is thereby realized.
- In addition, for the actuator 25 for the pitch direction of the shoulder joint of each leg unit 3A to 3D (hereinafter referred to as the shoulder joint pitch direction actuator 25), which is the joint most sensitive to obstacles, the difference between the instruction value given to that actuator and the corresponding pitch direction angle obtained based on the output of the corresponding potentiometer 26 (hereinafter referred to as the actual value) is detected, and when the magnitude of the detected difference is larger than a preset threshold, it is determined that the leg has collided with an obstacle, whereby a collision with an obstacle is detected.
- Hereinafter, this method applied to the front leg units 3A and 3B is referred to as the first obstacle detection method, and the same method applied to the rear leg units 3C and 3D is referred to as the second obstacle detection method.
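- A minimal sketch of this difference-based detection (the first and second obstacle detection methods), assuming an illustrative threshold and that the instruction and actual angles are available in degrees, is shown below.

```python
def collision_suspected(indicated_deg: float, actual_deg: float,
                        threshold_deg: float = 15.0) -> bool:
    """Flag a possible collision when the difference between the instruction
    value given to the shoulder joint pitch direction actuator and the actual
    value obtained from the potentiometer exceeds a preset threshold
    (the threshold used here is illustrative)."""
    return abs(indicated_deg - actual_deg) > threshold_deg

# Example during the swing period of one leg unit:
print(collision_suspected(indicated_deg=30.0, actual_deg=8.0))    # True
print(collision_suspected(indicated_deg=30.0, actual_deg=27.5))   # False
```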
- FIG. 31 is a graph showing the relationship between the instruction value given to the shoulder joint pitch direction actuator 25 of a leg unit 3A to 3D and the actual value when there is no obstacle.
- In FIG. 31, K1 indicates the instruction value given to the shoulder joint pitch direction actuator 25, K2 indicates the actual value, and K3 indicates the difference between the instruction value and the actual value.
- FIGS. 32A to 32C are graphs showing the relationship between the output K4 (FIG. 32A) of the acceleration sensor 19 (FIG. 5) obtained when the robot device 1 walks and collides with an obstacle, and the instruction values and actual values (FIGS. 32B and 32C) given to the shoulder joint pitch direction actuators 25 of the leg units 3A to 3D.
- In FIG. 32B, K5 in the lower part indicates the instruction value given to the shoulder joint pitch direction actuators 25 of the front leg units 3A and 3B of the robot device 1, and K6 indicates the actual values detected based on the outputs of the corresponding potentiometers 26, while K7 to K9 in the upper part respectively indicate the swing period and ground contact period of each of the leg units 3A and 3B, the difference between the instruction value and the actual value, and a detection signal generated depending on whether that difference is larger than the preset threshold.
- Similarly, in FIG. 32C, K10 in the upper part indicates the instruction value given to the shoulder joint pitch direction actuators 25 of the rear leg units 3C and 3D of the robot device 1, and K11 indicates the actual values detected based on the outputs of the corresponding potentiometers 26, while K12 to K14 in the lower part respectively indicate the swing period and ground contact period of each of the leg units 3C and 3D, the difference between the instruction value and the actual value, and a detection signal generated depending on whether that difference is larger than the preset threshold.
- From FIGS. 32A to 32C, it can be confirmed that, immediately before the robot device 1 falls after colliding with an obstacle (the circled portion in FIG. 32A), pulses PL1 to PL3 indicating that a collision with an obstacle was detected (more precisely, that the difference between the instruction value and the actual value became larger than the preset threshold) are generated in the detection signals for the leg units. Therefore, as is clear from FIGS. 32B and 32C, the robot device 1 can detect a collision with an obstacle by the first and second obstacle detection methods.
- However, a load may be applied to the driving of the leg units 3A to 3D by factors other than a collision with an obstacle, for example when walking on a long-haired carpet, and in such a case an obstacle may be erroneously detected as if a collision had occurred even though no obstacle actually exists.
- Therefore, in the robot device 1, the detection threshold based on the difference between the instruction value and the actual value in the first and second obstacle detection methods is set relatively high, and a collision with an obstacle that cannot be detected thereby is detected by another method (hereinafter referred to as the third obstacle detection method).
- In the third obstacle detection method, a method that introduces the concept of the stability margin is used.
- Here, the stability margin refers to the distances L1 to L3 from the point at which the center of gravity of the robot device 1 is projected onto the ground (hereinafter referred to as the center-of-gravity projection point PG) to the sides TRE1 to TRE3 of the triangle TR formed by connecting the ground contact points PA, PB and PD of the grounded leg units among the leg units 3A to 3D (the leg units 3A, 3B and 3D in FIG. 33).
- If any of the stability margins L1 to L3 becomes a negative value, that is, if the center-of-gravity projection point PG goes outside the triangle TR, the robot device 1 falls down. Therefore, the robot device 1 walks while controlling its posture so that the stability margins L1 to L3 never take negative values; however, if it collides with an obstacle, its posture is disturbed and one of the stability margins L1 to L3 becomes negative, or, even if none becomes negative, the stability margins L1 to L3 become extremely small and the posture becomes unstable.
- Therefore, the robot device 1 constantly monitors each of the stability margins L1 to L3 during walking, and determines that it has collided with an obstacle when the smallest of the stability margins L1 to L3 at that time becomes smaller than a preset threshold.
- In FIG. 34, K20 indicates the theoretical value, K21 indicates the measured value, and K22 indicates the phase showing the swing period and the ground contact period of the leg units 3A to 3D.
- In this way, the robot device 1 sequentially determines, during walking, whether or not a collision with an obstacle has occurred, and when it determines that a collision has occurred it performs a prescribed countermeasure, such as changing the way of walking or the direction of walking, to prevent a fall, so that a fall due to a collision with an obstacle can be effectively prevented in advance.
- Figures 35 to 37 show the results of obstacle detection by the above obstacle detection method.
- In each of FIGS. 35 to 37, the four graphs from the top show, for the front right, front left, rear left and rear right leg units 3B, 3A, 3C and 3D respectively, the detection signals (K31B, K31A, K31C and K31D) generated depending on whether the difference between the instruction value and the actual value of the shoulder joint pitch direction actuator is larger than the threshold.
- The fifth graph from the top shows the theoretical value (K32) and the measured value (K33) of the minimum of the stability margins L1 to L3 together with the preset threshold, and the lower graphs show the output of the acceleration sensor (K35) and the detection signal (K36) representing the final obstacle detection result.
- In FIG. 35, a collision with an obstacle is detected by the first obstacle detection method immediately before the robot device 1 loses its posture; in FIG. 36, a collision with an obstacle is detected by the third obstacle detection method; and in FIG. 37, a collision with an obstacle is detected by the second obstacle detection method just before the robot device 1 falls.
- From these results, it could be confirmed that the first to third obstacle detection methods can accurately and reliably detect a collision with an obstacle before the robot device 1 falls down.
- The stability margins are calculated as follows, as shown in FIG. 38. Let the ground contact points of the three grounded legs be P10, P11 and P12, and obtain the coordinates of these points in three-dimensional space from the joint angles using kinematics.
- Next, let a be the unit vector from P10 toward P11 and b be the unit vector from P10 toward P12, and calculate the direction vector u, which is the unit vector of the cross product of the vector a and the vector b.
- Then, by calculating the inner product of the cross product (u × a) of the direction vector u and the vector a with the vector from P10 to the center-of-gravity projection point PG, the stability margin sm1 corresponding to the side P10P11 can be obtained. Here, since the direction vector u and the vector a are orthogonal, the magnitude of the cross product (u × a) is 1.
- Similarly, the stability margin sm2 corresponding to the side P10P12 can be obtained from the inner product of the cross product of the vector b and the direction vector u with the vector from P10 to PG, and the stability margin sm3 corresponding to the side P11P12 can be obtained in the same manner using the unit vector from P11 toward P12 and the vector from P11 to PG.
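- A sketch of this stability margin calculation, using unit edge vectors and the center-of-gravity projection point (with sign conventions chosen here so that the margins are positive while PG lies inside the support triangle), is shown below.

```python
import numpy as np

def unit(v):
    return v / np.linalg.norm(v)

def stability_margins(p10, p11, p12, pg):
    """Return (sm1, sm2, sm3) for the sides P10P11, P10P12 and P11P12.
    p10, p11, p12 are the ground contact points and pg is the
    center-of-gravity projection point, all in the base coordinate system."""
    p10, p11, p12, pg = (np.asarray(p, dtype=float) for p in (p10, p11, p12, pg))
    a = unit(p11 - p10)               # unit vector from P10 toward P11
    b = unit(p12 - p10)               # unit vector from P10 toward P12
    u = unit(np.cross(a, b))          # direction vector normal to the support plane
    sm1 = np.dot(np.cross(u, a), pg - p10)    # margin for side P10P11
    sm2 = np.dot(np.cross(b, u), pg - p10)    # margin for side P10P12
    c = unit(p12 - p11)               # unit vector from P11 toward P12
    sm3 = np.dot(np.cross(u, c), pg - p11)    # margin for side P11P12
    return sm1, sm2, sm3

# Illustrative grounded feet and projected center of gravity (units arbitrary):
sm = stability_margins([0, 0, 0], [0.2, 0, 0], [0, 0.15, 0], [0.05, 0.04, 0])
print(sm, "collision suspected:", min(sm) < 0.002)   # 2 [mm] threshold, illustrative
```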
- The above-described obstacle detection processing in the robot device 1 is performed by the CPU 10 (FIG. 5) according to the obstacle detection processing procedure RT1 shown in FIG. 39.
- That is, the CPU 10 starts this obstacle detection processing procedure RT1 in step SP0 at the same time as the start of the walking motion, and in the subsequent step SP1 determines whether or not the left front leg unit 3B is in the swing period.
- If the CPU 10 obtains a negative result in step SP1, it proceeds to step SP4, whereas if it obtains a positive result it proceeds to step SP2, in which it calculates the difference between the instruction value last given to the shoulder joint pitch direction actuator 25 of the left front leg unit 3B and the actual value of the pitch direction of the shoulder joint at that time obtained based on the output of the potentiometer 26 corresponding to that actuator 25.
- Next, the CPU 10 proceeds to step SP3, in which it determines whether or not the difference value calculated in step SP2 is larger than a preset threshold.
- In this embodiment, this threshold is set to 18 [°]. If the CPU 10 obtains a negative result in step SP3, it proceeds to step SP4, and thereafter processes steps SP4 to SP6 for the right front leg unit 3A in the same manner as steps SP1 to SP3.
- In this way, the CPU 10 determines in steps SP1 to SP6 whether or not a collision with an obstacle has occurred by the first obstacle detection method. If the CPU 10 detects a collision with an obstacle by obtaining a positive result in step SP3 or step SP6, it proceeds to step SP16 and, after executing predetermined fall prevention processing such as changing the way of walking or the direction of walking, returns to step SP1 via step SP17 and starts the obstacle detection processing procedure RT1 again from step SP1.
- The CPU 10 then proceeds to step SP7 and determines whether or not the left rear leg unit 3C is in the swing period.
- If the CPU 10 obtains a negative result in step SP7, it proceeds to step SP8, where it temporarily stores "0" as the difference value for the pitch-direction actuator 25 of the shoulder joint in the left rear leg unit 3C, and then proceeds to step SP11.
- If, on the other hand, the CPU 10 obtains a positive result in step SP7, it proceeds to step SP9 to calculate the corresponding difference value, and then, in the following step SP10, determines whether the difference value calculated in step SP9 is larger than a preset threshold value and whether the current difference value is larger than the previous difference value (that is, whether the difference value has increased).
- the value of this threshold is set to 15 (:.).
- If the CPU 10 obtains a negative result in step SP10, it proceeds to step SP11, and thereafter processes steps SP11 to SP14 for the right rear leg unit 3D in the same manner as steps SP7 to SP10.
- In this way, the CPU 10 determines in steps SP7 to SP14 whether or not there is a collision with an obstacle by the second obstacle detection method. If a collision with an obstacle is detected at this point by obtaining a positive result in step SP10 or step SP14, the CPU 10 proceeds to step SP16 and executes the above-described fall-prevention processing.
- The CPU 10 then, in the following step SP17, updates the difference value stored at that time for each of the left and right rear leg units 3C and 3D to the value obtained in step SP9 or step SP13, returns to step SP1, and thereafter starts the obstacle detection processing procedure RT1 again from step SP1.
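For the rear legs the check adds one condition: the commanded-versus-measured difference must not only exceed a threshold but also be growing from one cycle to the next, with the stored difference reset to zero while the leg is grounded. The sketch below illustrates this second method; REAR_THRESHOLD_DEG (the exact value in the text is not legible) and the function signature are illustrative assumptions rather than names from the patent.

```python
# Placeholder threshold for the rear-leg check.
REAR_THRESHOLD_DEG = 1.5

def rear_leg_collision(commanded_deg: float, measured_deg: float,
                       previous_diff_deg: float, in_swing: bool) -> tuple[bool, float]:
    """Second obstacle detection method (sketch).

    Returns (collision_detected, diff_to_store_for_the_next_cycle). While the
    leg is grounded the stored difference is reset to 0, mirroring steps
    SP8/SP12 of the procedure described above."""
    if not in_swing:
        return False, 0.0
    diff = abs(commanded_deg - measured_deg)
    detected = diff > REAR_THRESHOLD_DEG and diff > previous_diff_deg
    return detected, diff
```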
- If no collision with an obstacle is detected in steps SP7 to SP14, the CPU 10 proceeds to step SP15, and in this step SP15 it judges whether or not there is a collision with an obstacle by the third obstacle detection method.
- Specifically, the three stability margins L1 to L3 (FIG. 33) at that time are derived by the above-described method using the equations given above, and the smallest of them is compared with a preset threshold value.
- In this embodiment, this threshold value is set to 2 [mm].
- Obtaining a positive result in step SP15 means that a collision with an obstacle has been detected by the third obstacle detection method.
- At this time, the CPU 10 proceeds to step SP16 to execute the above-described fall-prevention processing, and also proceeds to step SP17, where it updates the difference value stored at that time for each of the left rear and right rear leg units 3C and 3D to the value temporarily stored ("0") in step SP8 or step SP12. The process then returns to step SP1, and thereafter the obstacle detection processing procedure RT1 is started again from step SP1.
- Obtaining a negative result in step SP15, on the other hand, means that a collision with an obstacle was not detected by any of the first to third obstacle detection methods.
- In this case, the CPU 10 advances to step SP17, updates the difference value stored at that time for each of the left and right rear leg units 3C and 3D to the value temporarily stored ("0") in step SP8 or step SP12, returns to step SP1, and thereafter starts the obstacle detection processing procedure RT1 again from step SP1.
- In this way, the CPU 10 repeatedly performs the obstacle detection processing procedure RT1 during the walking operation, thereby controlling the walking of the robot device 1 so that it does not fall down even if it collides with an obstacle.
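For orientation only, the following sketch strings the three checks together in one control-cycle function in the spirit of RT1. The robot/state interface used here (joint_sample, in_swing, commanded, measured, support_triangle, cog_projection, run_fall_prevention, prev_diff) is entirely assumed for illustration; the helper functions are the ones from the earlier sketches, and only the 2 mm margin threshold is taken from the text.

```python
MARGIN_THRESHOLD_MM = 2.0   # threshold on the smallest stability margin

def rt1_cycle(robot, state) -> None:
    """One pass of an RT1-style obstacle detection loop (sketch)."""
    # First method: front legs, commanded-vs-measured shoulder pitch.
    for leg in ("front_left", "front_right"):
        if front_leg_collision(robot.joint_sample(leg), robot.in_swing(leg)):
            robot.run_fall_prevention()
            return
    # Second method: rear legs, growing commanded-vs-measured difference.
    for leg in ("rear_left", "rear_right"):
        detected, diff = rear_leg_collision(robot.commanded(leg), robot.measured(leg),
                                            state.prev_diff[leg], robot.in_swing(leg))
        state.prev_diff[leg] = diff
        if detected:
            robot.run_fall_prevention()
            return
    # Third method: smallest stability margin of the support triangle.
    p10, p11, p12 = robot.support_triangle()
    if min_stability_margin(p10, p11, p12, robot.cog_projection()) < MARGIN_THRESHOLD_MM:
        robot.run_fall_prevention()
```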
- In the above configuration, the CPU 10 of the robot device 1 determines the presence or absence of a collision with an obstacle based on the instruction value given to the pitch-direction actuator 25 of the shoulder joint in whichever of the leg units 3A to 3D is the free (swing) leg and on the actual value detected at that time from the output of the corresponding potentiometer 26; if a collision with an obstacle cannot be detected in this way, it calculates the stability margins L1 to L3 and determines the presence or absence of an obstacle based on the calculation results.
- Accordingly, the robot device 1 detects a collision with an obstacle based on the actual state and posture of each of the leg units 3A to 3D, and can therefore detect such a collision with far higher accuracy than when an obstacle is detected by, for example, image processing based on image data from the CCD camera 20 (FIG. 5).
- Moreover, this obstacle detection uses existing sensors (the potentiometers 26 for the pitch direction of the shoulder joints in the leg units 3A to 3D). The configuration as a whole can therefore be simplified compared with, for example, a case where special force sensors are installed, the manufacturing cost can be reduced, and an increase in weight can be effectively prevented.
- According to the above configuration, the presence or absence of a collision with an obstacle is determined based on the instruction value given to the pitch-direction actuator 25 of the shoulder joint in whichever of the leg units 3A to 3D is the free leg and on the actual value detected at that time from the output of the corresponding potentiometer 26, and when a collision with an obstacle cannot be detected in this way, the stability margins L1 to L3 are calculated and the presence or absence of an obstacle is judged based on the calculation results. A collision with an obstacle can thus be reliably detected without complicating the mechanism, and a robot device of simple configuration capable of stable walking even on uneven terrain can be realized.
- In the embodiment described above, the present invention is configured as shown in FIG. 4 and applied to the robot device 1; however, the present invention is not limited to this and can be widely applied to various other types of robots.
- Further, the embodiment described above uses all of the first to third obstacle detection methods; however, the present invention is not limited to this, and only one, or any two, of these methods may be used.
- Further, in the embodiment described above, a collision with an obstacle is detected based on the instruction value and the actual value for the pitch-direction actuator 25 of the shoulder joint in each of the leg units 3A to 3D.
- The present invention is not limited to this; for example, a collision with an obstacle may be detected based on the instruction values and actual values for the roll-direction actuators of the shoulder joint mechanisms in the leg units 3A to 3D, or on the instruction values and actual values for the actuators of the knee joint mechanisms. In short, as long as a collision with an obstacle is detected based on the instruction value and the actual value for driving means that drives a predetermined joint mechanism in a robot device that walks while driving such joint mechanisms as necessary, various other driving means can be widely applied as the target driving means.
- Further, in the embodiment described above, the presence or absence of an obstacle is judged based on the instruction value of the drive amount given by the drive control means to the pitch-direction actuator 25 of the shoulder joint and on the corresponding actual value, and also by using the triangle TR formed by connecting the grounding points PA, PB and PD of the grounded units among the leg units 3A to 3D (for a robot with four or more grounded legs, this region would in general be another polygon).
- Further, in the embodiment described above, the potentiometer 26 is used as the drive amount detection means for detecting the actual value of the drive amount of the driving means during driving; however, the present invention is not limited to this, and various other types of detection means can be widely applied according to the type of driving means.
- As described above, the robot device according to the present invention includes an operation unit whose operation is controlled by a control signal, control means for outputting the control signal so as to execute a predetermined operation, and external force detecting means for detecting an external force applied to the operation unit based on a change in the control signal when the external force is applied. The robot device can thereby detect the external force based on the control signal that controls its own operation, without using a dedicated sensor for detecting the external force.
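As a toy illustration of this idea (not the patent's implementation): an external force can be inferred purely from the mismatch between what the control signal commands and what the operation unit actually does, with no dedicated force sensor. The function name and tolerance value below are assumptions.

```python
def external_force_detected(commanded: float, observed: float,
                            tolerance: float = 1.0) -> bool:
    """If an operation unit does not reach the state its control signal commands,
    an external force is presumed to be acting on it. The tolerance is an
    illustrative value, not one taken from the patent."""
    return abs(commanded - observed) > tolerance
```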
- The robot device according to the present invention further includes a joint mechanism, driving means for driving the joint mechanism, drive control means for giving the driving means an instruction value of the drive amount when the joint mechanism is driven, and drive amount detection means for detecting the actual value of the drive amount of the driving means when the joint mechanism is driven.
- It further includes a plurality of legs driven in a predetermined pattern, and uses the sides of the polygon formed by connecting the grounding positions of those of the plurality of legs that are grounded to judge the presence or absence of an obstacle.
- The operation control method of a robot device according to the present invention includes a control step of controlling the operation of an operation unit of the robot device by a control signal, and an external force detection step of detecting an external force applied to the robot device based on a change in the control signal when the external force is applied, so that the external force can be detected based on the control signal that controls the operation of the robot device without using a dedicated sensor.
- The method further includes a drive control step of giving an instruction value of the drive amount to the operation unit, a drive amount detection step of detecting the actual value of the drive amount of the operation unit, and a step of determining the presence or absence of an external force based on the instruction value and the actual value. The external force can thereby be detected without requiring a complicated configuration, a physical collision with an actual obstacle can be reliably detected, and thus a collision with an obstacle can be reliably detected with a simple configuration.
- The method further includes a walking step of walking by driving a plurality of legs in a predetermined pattern, and the presence or absence of an obstacle is determined based on the distances to the sides of the polygon formed by connecting the grounding positions of those of the plurality of legs that are grounded, so that a physical collision with an obstacle can be reliably detected without adding any special parts.
- The external force detection device according to the present invention includes operation means for driving each unit of the robot device, control means for giving the operation means an instruction value of the operation amount of the operation means, detection means for detecting the actual value of the operation amount of the operation means, and judgment means for judging the presence or absence of an external force based on the instruction value and the actual value. Obstacle detection can thereby be performed without requiring a complicated configuration, a physical collision with an actual obstacle can be reliably detected, and a collision with an obstacle can be reliably detected with a simple configuration.
- The program according to the present invention includes a control step of controlling the operation of the operation unit of the robot device by a control signal and an external force detection step of detecting an external force applied to the robot device based on a change in the control signal when the external force is applied.
- A robot device that operates according to such a program can therefore detect the external force based on the control signal that controls its own operation, without using a dedicated sensor for detecting the external force.
- The recording medium according to the present invention records a program that includes a control step of controlling the operation of the operation unit of the robot device by a control signal and an external force detection step of detecting an external force applied to the robot device based on a change in the control signal when the external force is applied.
- A robot device that operates according to a program recorded on such a recording medium can therefore detect the external force based on the control signal that controls its own operation, without using a dedicated sensor for detecting the external force.
- The program and the recording medium further include a step of determining the presence or absence of an obstacle based on the instruction value and the actual value, so that obstacle detection can be performed without requiring a complicated configuration, a physical collision with an obstacle can be reliably detected, and thus a collision with an obstacle can be reliably detected with a simple configuration.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Human Computer Interaction (AREA)
- Manipulator (AREA)
- Toys (AREA)
Description
Claims
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP02703771A EP1386699A4 (en) | 2001-02-21 | 2002-02-21 | ROBOT DEVICE AND METHOD FOR CONTROLLING THE OPERATION OF THE ROBOT DEVICE |
US10/258,152 US6865446B2 (en) | 2001-02-21 | 2002-02-21 | Robot device and method of controlling robot device operation |
KR1020027014119A KR100864340B1 (ko) | 2001-02-21 | 2002-02-21 | 로봇 장치 및 로봇 장치의 동작 제어 방법 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2001045693A JP2002239963A (ja) | 2001-02-21 | 2001-02-21 | ロボット装置、ロボット装置の動作制御方法、プログラム及び記録媒体 |
JP2001055669A JP2002254375A (ja) | 2001-02-28 | 2001-02-28 | ロボット装置及びその制御方法並びに障害物検出装置及び方法 |
Publications (3)
Publication Number | Publication Date |
---|---|
WO2002065825A2 true WO2002065825A2 (fr) | 2002-08-29 |
WO2002065825A3 WO2002065825A3 (fr) | 2002-11-07 |
WO2002065825B1 WO2002065825B1 (fr) | 2003-03-06 |
Family
ID=26609848
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2002/000544 WO2002065825A2 (fr) | 2001-02-21 | 2002-02-21 | Robot et procede de commande du fonctionnement dudit robot |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP1386699A4 (ja) |
KR (1) | KR100864340B1 (ja) |
CN (1) | CN100445047C (ja) |
WO (1) | WO2002065825A2 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1586423A1 (en) * | 2002-12-10 | 2005-10-19 | HONDA MOTOR CO., Ltd. | Robot control device, robot control method, and robot control program |
US12023811B2 (en) | 2019-01-31 | 2024-07-02 | Sony Group Corporation | Robot control device and robot control method |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101028814B1 (ko) * | 2007-02-08 | 2011-04-12 | 삼성전자주식회사 | 소프트웨어 로봇 장치와 그 장치에서 소프트웨어 로봇의행동 발현 방법 |
KR101487782B1 (ko) * | 2008-12-22 | 2015-01-29 | 삼성전자 주식회사 | 로봇 및 그 균형 제어방법 |
KR101487783B1 (ko) * | 2008-12-22 | 2015-01-29 | 삼성전자 주식회사 | 로봇 및 그 제어방법 |
JP5893664B2 (ja) * | 2014-04-14 | 2016-03-23 | ファナック株式会社 | 作用された力に応じて移動されるロボットを制御するロボット制御装置 |
JP5893666B2 (ja) | 2014-04-14 | 2016-03-23 | ファナック株式会社 | 力に応じて動かすロボットのロボット制御装置およびロボットシステム |
JP6034895B2 (ja) * | 2015-02-20 | 2016-11-30 | ファナック株式会社 | 外力に応じてロボットを退避動作させる人間協調ロボットシステム |
JP6034900B2 (ja) | 2015-03-06 | 2016-11-30 | ファナック株式会社 | 動作プログラムの再開を判断するロボット制御装置 |
CN114514091A (zh) * | 2019-10-23 | 2022-05-17 | Abb瑞士股份有限公司 | 机器人控制方法和装置 |
CN111037564B (zh) * | 2019-12-27 | 2022-03-18 | 深圳市越疆科技有限公司 | 机器人碰撞检测方法、装置、设备及计算机可读存储介质 |
CN113984057A (zh) * | 2021-10-19 | 2022-01-28 | 山东中瑞电气有限公司 | 基于多数据分析的移动机器人定位方法 |
CN114714350B (zh) * | 2022-03-31 | 2024-03-26 | 北京云迹科技股份有限公司 | 服务机器人的控制方法、装置、设备及介质 |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS63237883A (ja) * | 1987-03-27 | 1988-10-04 | 株式会社 スタ−精機 | チャック駆動装置 |
JP6092076B2 (ja) * | 2013-11-15 | 2017-03-08 | 株式会社東芝 | 汚染飛灰の処理方法及び処理システム |
-
2002
- 2002-02-21 KR KR1020027014119A patent/KR100864340B1/ko not_active IP Right Cessation
- 2002-02-21 EP EP02703771A patent/EP1386699A4/en not_active Withdrawn
- 2002-02-21 CN CNB028010337A patent/CN100445047C/zh not_active Expired - Fee Related
- 2002-02-21 WO PCT/IB2002/000544 patent/WO2002065825A2/ja active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0692076B2 (ja) * | 1989-11-22 | 1994-11-16 | 工業技術院長 | 歩行脚運動制御装置 |
JPH1142576A (ja) * | 1997-07-28 | 1999-02-16 | Matsushita Electric Ind Co Ltd | ロボットの制御方法および装置 |
JP4048590B2 (ja) * | 1998-03-11 | 2008-02-20 | セイコーエプソン株式会社 | プラスチックレンズの製造方法 |
WO2000040377A1 (fr) * | 1999-01-07 | 2000-07-13 | Sony Corporation | Appareil de type machine, procede d'actionnement de celui-ci et support enregistre |
EP1070571A1 (en) * | 1999-02-10 | 2001-01-24 | Sony Corporation | Device and method for controlling joint mechanism, joint device, robot device, and method for controlling robot device |
JP2001025984A (ja) * | 1999-05-10 | 2001-01-30 | Sony Corp | ロボット装置及びその制御方法並びに記録媒体 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1386699A2 * |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1586423A1 (en) * | 2002-12-10 | 2005-10-19 | HONDA MOTOR CO., Ltd. | Robot control device, robot control method, and robot control program |
EP1586423A4 (en) * | 2002-12-10 | 2009-12-16 | Honda Motor Co Ltd | CONTROL DEVICE, CONTROL PROCEDURE AND CONTROL PROGRAM FOR A ROBOT |
US7873448B2 (en) | 2002-12-10 | 2011-01-18 | Honda Motor Co., Ltd. | Robot navigation system avoiding obstacles and setting areas as movable according to circular distance from points on surface of obstacles |
US12023811B2 (en) | 2019-01-31 | 2024-07-02 | Sony Group Corporation | Robot control device and robot control method |
Also Published As
Publication number | Publication date |
---|---|
CN1460052A (zh) | 2003-12-03 |
KR20030007543A (ko) | 2003-01-23 |
WO2002065825B1 (fr) | 2003-03-06 |
WO2002065825A3 (fr) | 2002-11-07 |
EP1386699A2 (en) | 2004-02-04 |
KR100864340B1 (ko) | 2008-10-17 |
EP1386699A4 (en) | 2006-04-19 |
CN100445047C (zh) | 2008-12-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6865446B2 (en) | Robot device and method of controlling robot device operation | |
US7058476B2 (en) | Robot apparatus, control method for robot apparatus, and toy for robot apparatus | |
US7515992B2 (en) | Robot apparatus and emotion representing method therefor | |
US20240075998A1 (en) | Control of robotic devices with non-constant body pitch | |
WO2002065825A2 (fr) | Robot et procede de commande du fonctionnement dudit robot | |
US11691289B2 (en) | Systems and methods for robotic behavior around moving bodies | |
JP2002239960A (ja) | ロボット装置の動作制御方法、プログラム、記録媒体及びロボット装置 | |
JP3855812B2 (ja) | 距離計測方法、その装置、そのプログラム、その記録媒体及び距離計測装置搭載型ロボット装置 | |
JPH11156765A (ja) | ロボツト装置 | |
US20040153212A1 (en) | Robot apparatus, and behavior controlling method for robot apparatus | |
JP2005074620A (ja) | 簡易地面反力センサを用いる歩行ロボット及びその制御方法 | |
JP2003159674A (ja) | ロボット装置、ロボット装置の外力検出方法及びロボット装置の外力検出プログラム、並びにロボット装置の外力検出のためのキャリブレーション方法及びロボット装置の外力検出のためのキャリブレーションプログラム | |
Neville et al. | A bipedal running robot with one actuator per leg | |
JP4649913B2 (ja) | ロボット装置及びロボット装置の移動制御方法 | |
JP4905041B2 (ja) | ロボット制御装置 | |
JP2004034169A (ja) | 脚式移動ロボット装置及び脚式移動ロボット装置の移動制御方法 | |
JP2002239952A (ja) | ロボット装置、ロボット装置の行動制御方法、プログラム及び記録媒体 | |
JP2003136456A (ja) | ロボット装置、ロボット装置の明るさ検出方法、明るさ検出プログラム及び記録媒体 | |
JP2003271958A (ja) | 画像処理方法、その装置、そのプログラム、その記録媒体及び画像処理装置搭載型ロボット装置 | |
JP2002059384A (ja) | ロボットのための学習システム及び学習方法 | |
JP2001154707A (ja) | ロボット装置及びその制御方法 | |
JP2021133487A (ja) | 脚式ロボット、および脚式ロボットの制御方法 | |
JP2001157979A (ja) | ロボット装置及びその制御方法 | |
JP2002254375A (ja) | ロボット装置及びその制御方法並びに障害物検出装置及び方法 | |
JP2003136439A (ja) | ロボット装置及びロボット装置の歩行制御方法並びにロボット装置の歩行制御プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A2 Designated state(s): CN KR US |
|
AL | Designated countries for regional patents |
Kind code of ref document: A2 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2002703771 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020027014119 Country of ref document: KR |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
AK | Designated states |
Kind code of ref document: A3 Designated state(s): CN KR US |
|
AL | Designated countries for regional patents |
Kind code of ref document: A3 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 028010337 Country of ref document: CN |
|
WWP | Wipo information: published in national office |
Ref document number: 1020027014119 Country of ref document: KR |
|
AK | Designated states |
Kind code of ref document: B1 Designated state(s): CN KR US |
|
AL | Designated countries for regional patents |
Kind code of ref document: B1 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR |
|
B | Later publication of amended claims | ||
WWE | Wipo information: entry into national phase |
Ref document number: 10258152 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 2002703771 Country of ref document: EP |