WO2003090980A1 - Self-position estimating device for legged mobile robots - Google Patents
- Publication number
- WO2003090980A1 (PCT/JP2003/005448)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- estimated
- posture
- coordinate system
- gait
- robot
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J5/00—Manipulators mounted on wheels or on carriages
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D57/00—Vehicles characterised by having other propulsion or other ground- engaging means than wheels or endless track, alone or in addition to wheels or endless track
- B62D57/02—Vehicles characterised by having other propulsion or other ground- engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members
- B62D57/032—Vehicles characterised by having other propulsion or other ground- engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and legs; with alternately or sequentially lifted feet or skid
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J13/088—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices with position, velocity or acceleration sensors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D57/00—Vehicles characterised by having other propulsion or other ground- engaging means than wheels or endless track, alone or in addition to wheels or endless track
- B62D57/02—Vehicles characterised by having other propulsion or other ground- engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/183—Compensation of inertial measurements, e.g. for temperature effects
- G01C21/185—Compensation of inertial measurements, e.g. for temperature effects for gravity
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/183—Compensation of inertial measurements, e.g. for temperature effects
- G01C21/188—Compensation of inertial measurements, e.g. for temperature effects for accumulated errors, e.g. by coupling inertial systems with absolute positioning systems
Definitions
- the present invention relates to a device for estimating the self-position of a legged mobile robot.
- the posture rotation in the desired gait is referred to as a desired posture rotation.
- the phenomenon that is mainly a problem in the present specification is that the actual rotation of the entire posture of the robot (or of a representative portion such as the upper body) deviates from the target posture rotation.
- "spin" refers to the phenomenon in which the entire robot rotates around the vertical axis and deviates from the direction of the target gait.
- an environment recognition device such as a video camera
- an object to be watched such as a landmark
- the object to be watched, such as a landmark registered in the map information, is captured at a predetermined position in the image (such as the center of the image); therefore, it is necessary to accurately estimate the self-position and posture.
- an environment recognition device such as a video camera measures the position of an object in a global coordinate system (a coordinate system fixed to the floor of the robot's moving environment) set in the environment, so it is necessary to accurately estimate the self-position and orientation in the global coordinate system.
- an inertial navigation device used for guiding a rocket or the like is known.
- the robot must be statically fixed to the floor (ground) in the initial state, so that the initial values of the position and speed of the robot can be set to 0 before movement starts.
- the inertial navigation calculation starts while the initial values of the position and velocity are not completely 0, so the estimation error was prone to increase.
- GPS (Global Positioning System)
- external devices such as satellites, ground-based radars, and beacons. A similar system could be introduced in a robot, but it is difficult to achieve, at low cost, an accuracy with which the height of the robot's foot (leg end) from the floor is measured on the order of millimeters or centimeters.
- the present invention has been made in view of such a background, and an object of the present invention is to provide a self-position estimating device capable of accurately estimating the self-position of a robot.
- a self-position estimating device capable of accurately estimating the self-position and posture even in a state in which posture rotation (or spin) of the entire robot occurs.
- it is an object of the present invention to provide a self-position estimating device capable of appropriately performing gaze control, which controls the direction of an environment recognition device, such as a video camera mounted on the robot, so that the target object is captured at an appropriate position.
- an object whose accurate position information is known in advance, such as a landmark, can be recognized by environment recognition means such as a video camera, and the accuracy of the robot's estimated self-position can be improved based on the information obtained thereby. It is intended to provide such a self-position estimation device.
- a first invention of the legged mobile robot self-position estimating device of the present invention applies to a legged mobile robot controlled to follow a determined target gait.
- the temporal change amount of the posture rotation deviation, which is the difference between the detected or estimated value of the actual posture of a predetermined part of the robot and the target posture of the predetermined part in the target gait, is determined as the posture rotation deviation change amount.
- a position estimating means for determining an estimated position, which is an estimated value of the position of the robot.
- it is assumed that a change in the posture rotation deviation, which is the difference between the detected or estimated value of the actual posture of the predetermined portion of the robot and the target posture of the predetermined portion in the target gait, rotates the entirety of the robot around a certain rotation center by the posture rotation deviation change amount.
- the detected or estimated value of the actual posture of the predetermined part may, for example, be detected or estimated using a sensor such as an inclinometer or a gyro sensor, but may also be appropriately corrected by using the estimated position of the robot determined by the present invention, the acceleration detection value of the acceleration sensor, and so on.
- the detected or estimated value of the actual posture of the predetermined part may basically be any value that represents the actual posture of the predetermined part with relatively high accuracy, regardless of the means for obtaining it. The same applies to the inventions other than the first invention.
- the position estimating means rotates the first coordinate system, which is a coordinate system describing the desired gait, around the rotation center by the posture rotation deviation change amount to obtain a second coordinate system.
- based on at least one of the target motion of the target gait, the detected displacement values of the joints of the robot, and the target displacement values of the joints, the estimated position of the robot as viewed from the global coordinate system is determined so that the robot position as viewed from the first coordinate system is the same as the estimated position of the robot as viewed from the second coordinate system.
- the global coordinate system is a coordinate system fixed to the floor (ground) of the environment in which the robot moves.
- that is, on the assumption that the robot does not rotate or slip on the first coordinate system and is moving in accordance with the target motion of the target gait, the joint displacement detection values, or the joint displacement target values, the robot position grasped from at least one of the target motion, the joint displacement detection values, and the joint displacement target values (the position of the robot viewed from the first coordinate system) and the estimated position of the robot viewed from the second coordinate system, obtained by rotating the first coordinate system around the rotation center by the posture rotation deviation change amount, are made the same (each coordinate component of the position is the same in both coordinate systems), and the estimated position of the robot as viewed from the global coordinate system is determined accordingly.
- the estimated position of the robot can be determined by appropriately reflecting the change in the posture rotation deviation, and an accurate estimated position can be obtained.
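The frame-rotation update described above can be sketched as follows. This is an illustrative reading of the claim, not the patent's implementation; all function names are hypothetical. The first (gait-describing) coordinate system is rotated about the rotation center by the yaw component of the posture rotation deviation change, and the kinematically computed robot position, expressed in that frame, is re-expressed in the global frame.

```python
import numpy as np

def rot_z(theta):
    """Rotation matrix for a yaw rotation of theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def rotate_gait_frame(origin, orientation, center, d_theta):
    """Rotate the first coordinate system (origin and orientation given
    in global coordinates) about the rotation center by the posture
    rotation deviation change d_theta, yielding the second system."""
    R = rot_z(d_theta)
    return center + R @ (origin - center), R @ orientation

def estimated_global_position(origin2, orientation2, p_local):
    """The robot position viewed from the second system is taken equal
    to the kinematic position p_local viewed from the first system;
    re-express it in the global coordinate system."""
    return origin2 + orientation2 @ p_local
```

For example, with the rotation center at the global origin and d_theta = 90 degrees, a first-frame origin at (1, 0, 0) swings to (0, 1, 0), and a robot standing 1 m ahead along the frame's x axis is estimated at (0, 2, 0).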
- the first coordinate system is, for example, a coordinate system set at each landing of the robot in the vicinity of the ground contact surface of the foot of the leg that lands by the robot's landing operation (the supporting leg coordinate system described in an embodiment to be described later), but may also be the same coordinate system as the global coordinate system.
- the predetermined portion is preferably the upper body of the robot (the base body from which the legs extend) (a third invention). This is because the influence of the rotational slip of the robot and the like tends to appear as a change in the posture of the upper body of the robot.
- the posture rotation deviation preferably includes at least a posture rotation deviation component of the predetermined portion in the yaw direction (a fourth invention). This is because the position of the robot is easily affected by a change in the posture rotation deviation in the yaw direction.
- it is preferable that the rotation center is determined at the overall center of gravity of the robot during the aerial period, and, at times other than the aerial period, at or near any point between the actual floor reaction force center point and the target ZMP of the desired gait (a fifth invention). That is, in the aerial period, the change in the posture rotation deviation described above is a rotation about the overall center of gravity of the robot, and at times other than the aerial period, that is, when at least one of the legs is in contact with the ground,
- the rotation center of the change in the posture rotation deviation lies approximately near the actual floor reaction force center point (the central point of the total floor reaction force actually acting on the robot) and the target ZMP. Therefore, by determining the center of rotation as described above, the rotation center becomes appropriate according to the actual situation, and the accuracy of the estimated position of the robot can be appropriately secured.
- when the target gait is a gait that does not have an aerial period, such as walking of the robot,
- the rotation center may be determined, for example, at either the actual floor reaction force center point or the target ZMP, or in the vicinity thereof.
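The rotation-center selection rule can be summarized in a short sketch. The helper name is hypothetical, and the midpoint of the two candidate points is used as one plausible reading of "any point thereof or in the vicinity":

```python
import numpy as np

def select_rotation_center(in_aerial_period, com, frf_center, target_zmp):
    """Fifth-invention rule: use the robot's overall center of gravity
    while airborne; otherwise use a point at or near the actual floor
    reaction force center point / target ZMP (here, their midpoint)."""
    if in_aerial_period:
        return np.asarray(com, dtype=float)
    return 0.5 * (np.asarray(frf_center, dtype=float)
                  + np.asarray(target_zmp, dtype=float))
```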
- the posture rotation deviation calculating means is means for sequentially calculating the posture rotation deviation change amount at each moment
- the position estimating means sequentially determines the estimated position of the robot at each moment using the posture rotation deviation change amount at each moment (a sixth invention). According to this, since the estimated position at each moment is determined sequentially, the self-position of the robot can be grasped almost continuously.
- the posture rotation deviation change amount corresponds to the change speed of the posture rotation deviation.
- each time a leg of the robot lands due to the landing operation of the robot, the posture rotation deviation calculating means preferably obtains, as the posture rotation deviation change amount, the change in the posture rotation deviation over the period from the previous landing of a leg to the present landing (a seventh invention). According to this,
- the reliability of the estimated position can be improved.
- the position of the robot estimated by the position estimating means is preferably the ground contact position of a leg landed by the landing operation of the robot (an eighth invention).
- the landing position of the leg of the robot is more stable because it is less likely to undergo minute fluctuations compared with the upper body and other parts of the robot; therefore, by estimating the landing position as the position of the robot, a highly reliable estimated position can be obtained.
- it is preferable to provide floor reaction force detecting means for detecting the floor reaction force acting on the robot,
- the position estimating means estimating a deformation amount of the robot based on the detected floor reaction force and determining the estimated position of the robot using at least the estimated deformation amount (a ninth invention). According to this, the estimated position of the robot is determined in consideration of the deformation amount of the foot and the like of the robot due to the floor reaction force acting on the robot, so the accuracy of the estimated position can be improved.
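One common way to realize such deformation compensation is a linear spring model of the foot compliance. The stiffness value and function names below are illustrative assumptions, not taken from the patent:

```python
def foot_compression(vertical_force_n, stiffness_n_per_m=3.0e5):
    """Estimate the vertical compression (m) of the foot/sole under the
    detected vertical floor reaction force (N), using a linear spring
    model; the stiffness is a hypothetical value."""
    return max(vertical_force_n, 0.0) / stiffness_n_per_m

def corrected_contact_height(kinematic_height_m, vertical_force_n,
                             stiffness_n_per_m=3.0e5):
    """Lower the kinematically computed foot height by the estimated
    compression so the estimated contact position reflects the actual,
    deformed configuration."""
    return kinematic_height_m - foot_compression(vertical_force_n,
                                                 stiffness_n_per_m)
```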
- a tenth invention of the self-position estimating device of the present invention applies to a legged mobile robot controlled to follow the determined target gait, and comprises an acceleration sensor, mounted on the robot, for detecting translational acceleration;
- an angular velocity sensor, mounted on the robot, for detecting an angular velocity with respect to inertial space; and
- geometric position estimating means for determining a first estimated position as a geometric estimated value of the vertical position of a predetermined part of the robot or of the overall center of gravity of the robot, based on at least one of the target motion of the target gait, the detected displacement values of the joints of the robot, and the target displacement values of the joints.
- inertial navigation position estimating means for determining a second estimated position as an inertial-navigation estimated value of the vertical position of the predetermined part or the overall center of gravity of the robot by inertial navigation, based on at least the detected value of the acceleration sensor and the detected value of the angular velocity sensor, and for correcting the second estimated position based on at least the difference between the first estimated position and the second estimated position.
- the vertical position of the predetermined part or the overall center of gravity of the robot is determined as a first estimated position, a geometric estimated value (a vertical position obtained by a kinematics calculation or the like), based on at least one of the target motion of the target gait, the detected displacement values of the joints of the robot, and the target displacement values of the joints.
- in addition, a second estimated position, as an inertial-navigation estimated value of the vertical position of the predetermined part or the overall center of gravity of the robot, is determined by inertial navigation based on at least the detected value of the acceleration sensor and the detected value of the angular velocity sensor. At this time, the second estimated position is corrected based on its difference from the first estimated position.
- the second estimated position based on the inertial navigation is corrected using the first estimated position determined by a geometric method different from the inertial navigation.
- the predetermined portion is preferably the upper body of the above-mentioned robot (an eleventh invention).
- it is preferable to provide floor reaction force detecting means for detecting the floor reaction force acting on the robot, the geometric position estimating means estimating the deformation amount of the robot based on the detected floor reaction force and determining the first estimated position using the estimated deformation amount (a twelfth invention). According to this, similarly to the ninth invention, the first estimated position is determined in consideration of the deformation amount of the robot due to the floor reaction force, so the accuracy of the first estimated position can be improved. As a result, the second estimated position can be corrected more appropriately, and its accuracy can be increased.
- the inertial navigation position estimating means preferably corrects the second estimated position so that the difference between the first estimated position and the second estimated position approaches zero (a thirteenth invention). Thereby, the correction of the second estimated position can be performed appropriately so that its accuracy is increased (its error is reduced).
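Driving the difference toward zero can be done by adding a feedback term to the inertial integration, in the manner of a complementary filter. The sketch below is a one-dimensional illustration with made-up gains, not the patent's actual correction law:

```python
def inertial_step(p2, v2, accel, p1, dt, k_p=1.0, k_v=0.1):
    """One inertial-navigation update: integrate the (world-frame)
    acceleration, then feed the difference between the geometric
    estimate p1 and the inertial estimate p2 back into the velocity
    and position so that the difference decays toward zero."""
    err = p1 - p2
    v2 = v2 + (accel + k_v * err) * dt
    p2 = p2 + (v2 + k_p * err) * dt
    return p2, v2
```

Iterating with zero acceleration and a fixed geometric estimate, the inertial estimate converges to the geometric one rather than drifting.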
- when the target gait of the robot is a gait having an aerial period in which all the legs of the robot float in the air, the inertial navigation position estimating means preferably sets the correction amount of the second estimated position to approximately 0 during the aerial period (a fourteenth invention).
- that is, since the first estimated position is unreliable in the aerial period, the correction amount of the second estimated position is set to 0 in that period. This prevents the second estimated position from being improperly corrected using the unreliable first estimated position, and allows the long-term accuracy of the second estimated position to be secured stably.
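Zeroing the correction during the aerial period amounts to gating the feedback gains on a contact flag; a minimal sketch with hypothetical names and illustrative gain values:

```python
def correction_gains(any_leg_in_contact, k_p=1.0, k_v=0.1):
    """Return feedback-correction gains for the inertial estimate:
    the nominal (illustrative) gains while at least one leg touches
    the ground, and zero in the aerial period, when the geometric
    first estimated position is unreliable."""
    if any_leg_in_contact:
        return k_p, k_v
    return 0.0, 0.0
```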
- a fifteenth invention of the self-position estimating device of the present invention applies to a legged mobile robot controlled to follow the determined target gait, and comprises an acceleration sensor, mounted on the robot, for detecting translational acceleration;
- posture rotation deviation calculating means for obtaining, as the posture rotation deviation change amount, the temporal change amount of the posture rotation deviation, which is the difference between the detected or estimated value of the actual posture of the predetermined part of the robot and the target posture of the predetermined part in the target gait;
- rotation center determining means for determining the rotation center of the change in the posture rotation deviation;
- geometric position estimating means for determining, on the assumption that the robot rotates around the rotation center by the posture rotation deviation change amount, a first estimated position, which is an estimated value of the position of the predetermined part or the overall center of gravity of the robot; and
- inertial navigation position estimating means for calculating a second estimated position as an estimated value of the position of the predetermined part or the overall center of gravity of the robot by inertial navigation, based on at least the detected value of the acceleration sensor and the detected or estimated value of the actual posture of the predetermined part, and for correcting the second estimated position based on at least the difference between the first estimated position and the second estimated position.
- the first estimated position, which is an estimated value of the position of the predetermined part or the overall center of gravity of the robot, is determined using the method described in the first invention.
- it is preferable that the landing position of the leg landed by the landing operation of the robot is determined as described in the eighth invention, the desired gait is determined with reference to that landing position, and the first estimated position is determined based on at least one of the target motion of the desired gait, the detected joint displacement values of the robot, and the target joint displacement values.
- a second estimated position is calculated as an estimated value of the position of the predetermined part or the overall center of gravity by inertial navigation, based on at least a detected value of the acceleration sensor and the detected or estimated value of the actual posture of the predetermined part, and the second estimated position is corrected based on its difference from the first estimated position.
- the second estimated position based on inertial navigation is corrected using the first estimated position, which is geometrically determined in consideration of the position change of the robot due to slipping of the robot and the like.
- as a result, drift (accumulation of integration error), which tends to be included in the second estimated position, can be prevented.
- the geometric position estimating means preferably rotates the first coordinate system, which is a coordinate system describing the desired gait, around the rotation center by the posture rotation deviation change amount to obtain a second coordinate system, and determines the first estimated position of the predetermined part or the overall center of gravity as viewed from the global coordinate system so that the position of the predetermined part or the overall center of gravity as viewed from the first coordinate system and the first estimated position as viewed from the second coordinate system are the same (a sixteenth invention).
- according to the sixteenth invention, the same operation and effect as in the second invention can be achieved in determining the first estimated position.
- the predetermined part is preferably the upper body of the robot (a seventeenth invention).
- as in the fourth invention, it is preferable that the posture rotation deviation includes at least a posture rotation deviation component of the predetermined part in the yaw direction (an eighteenth invention).
- it is preferable that floor reaction force detecting means for detecting the floor reaction force acting on the robot is provided, and that the geometric position estimating means estimates the deformation amount of the robot based on the detected floor reaction force and determines the first estimated position using the estimated deformation amount (a nineteenth invention). According to this, similarly to the ninth invention, the first estimated position is determined in consideration of the deformation amount of the robot due to the floor reaction force, so the accuracy of the first estimated position can be improved. As a result, the second estimated position can be corrected more appropriately, and its accuracy can be increased.
- the rotation center determining means preferably determines the rotation center at the overall center of gravity of the robot in the aerial period, and, in periods other than the aerial period, at or near any point between the actual floor reaction force center point and the target ZMP of the desired gait (a twentieth invention). According to this, similarly to the fifth invention, the accuracy of the first estimated position can be appropriately secured by setting the rotation center appropriately according to the actual situation.
- the inertial navigation position estimating means preferably corrects the second estimated position so that the difference between the first estimated position and the second estimated position approaches zero (a twenty-first invention).
- the correction of the second estimated position can be appropriately performed so as to increase the accuracy of the second estimated position (reduce the error).
- when the target gait of the robot is a gait having an aerial period in which all the legs of the robot float in the air, the inertial navigation position estimating means preferably sets the correction amount of the second estimated position to substantially 0 in the aerial period (a twenty-second invention).
- thereby, as in the fourteenth invention, the correction amount of the second estimated position is set to 0 in the aerial period, so the second estimated position can be prevented from being improperly corrected using the unreliable first estimated position in the aerial period, and the long-term accuracy of the second estimated position can be secured stably.
- a twenty-third invention of the self-position estimating device of the present invention applies to a legged mobile robot controlled to follow a determined target gait, and comprises floor reaction force detecting means for detecting the floor reaction force acting on the robot;
- posture rotation deviation calculating means for obtaining, as the posture rotation deviation change amount, the temporal change amount of the posture rotation deviation, which is the difference between the detected or estimated value of the actual posture of the predetermined part of the robot and the target posture of the predetermined part in the target gait;
- rotation center determining means for determining the rotation center of the change in the posture rotation deviation;
- geometric position estimating means for determining, on the assumption that the robot rotates around the rotation center by the posture rotation deviation change amount, a first estimated position as an estimated position of the predetermined part or the overall center of gravity of the robot; and
- dynamic position estimating means for calculating a second estimated position as an estimated position of the predetermined part or the overall center of gravity of the robot by a dynamics calculation, based on at least the detected value of the floor reaction force detecting means and the detected or estimated value of the actual posture, and for correcting the second estimated position based on at least the difference between the first estimated position and the second estimated position.
- the first estimated position is determined in exactly the same manner as in the fifteenth invention.
- a second estimated position is calculated as an estimated position of the predetermined part or the overall center of gravity of the robot by a dynamics calculation, based on at least the detected value of the floor reaction force detecting means and the detected or estimated value of the actual posture of the predetermined part, and the second estimated position is corrected based on its difference from the first estimated position.
- here, the value obtained by dividing the floor reaction force by the overall mass of the robot, subtracting the gravitational acceleration, and inverting the sign is equivalent to the acceleration detection value referred to in the fifteenth invention,
- so the second estimated position is consequently determined and corrected in the same manner as in the fifteenth invention. Therefore, according to the twenty-third invention, similarly to the fifteenth invention, drift (accumulation of integration error) that is likely to be included in the second estimated position can be prevented, and the accuracy of the second estimated position can be improved. That is, the accuracy of estimating the position of the predetermined part or the overall center of gravity can be improved.
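The equivalence noted here follows from Newton's second law for the overall center of gravity: m·a = F + m·g, so the true acceleration used in the dynamics calculation is F/m + g, while F/m alone corresponds to an accelerometer's specific-force reading. A minimal sketch with illustrative names:

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])  # m/s^2, z axis pointing up

def com_acceleration(floor_reaction_force, total_mass):
    """Acceleration of the overall center of gravity computed from the
    detected total floor reaction force: a = F/m + g. Standing still
    (F = -m*g) gives a = 0; in the aerial period (F = 0), a = g."""
    return np.asarray(floor_reaction_force, dtype=float) / total_mass + GRAVITY
```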
- the geometric position estimating means preferably rotates the first coordinate system, which is a coordinate system describing the desired gait, around the rotation center by the posture rotation deviation change amount to obtain a second coordinate system, and determines the first estimated position of the predetermined part or the overall center of gravity as viewed from the global coordinate system so that the position of the predetermined part or the overall center of gravity as viewed from the first coordinate system and the first estimated position as viewed from the second coordinate system are the same (a twenty-fourth invention).
- the same operation and effect as in the second invention can be achieved in determining the first estimated position.
- the predetermined part is preferably the upper body of the robot (a twenty-fifth invention).
- it is preferable that the posture rotation deviation includes at least a posture rotation deviation component of the predetermined part in the yaw direction, as in the fourth invention (a twenty-sixth invention).
- it is preferable that the geometric position estimating means estimates the deformation amount of the robot based on the detected floor reaction force and determines the first estimated position using the estimated deformation amount (a twenty-seventh invention). According to this, the first estimated position is determined in consideration of the deformation amount of the robot due to the floor reaction force, as in the ninth invention, so the accuracy of the first estimated position can be improved. As a result, the second estimated position can be corrected more appropriately, and its accuracy can be improved.
- when the desired gait of the robot is a gait having an aerial phase in which all the legs of the robot float in the air, it is preferable that the rotation center determining means determines the rotation center at the center of gravity of the robot during the aerial phase, and determines the rotation center at or near any point among the actual floor reaction force central point and the desired ZMP of the desired gait during periods other than the aerial phase (the twenty-eighth invention). According to this, similarly to the fifth invention, the accuracy of the first estimated position can be appropriately secured by setting the rotation center appropriately according to the actual situation.
- it is preferable that the dynamic position estimating means corrects the second estimated position such that the difference between the first estimated position and the second estimated position approaches 0 (the twenty-ninth invention).
- According to this, the correction of the second estimated position can be performed appropriately so that the accuracy of the second estimated position is increased (the error is reduced).
- it is preferable that the dynamic position estimating means sets the correction amount of the second estimated position to approximately 0 during the aerial phase (the thirtieth invention).
- According to this, since the correction amount of the second estimated position is set to approximately 0 during the aerial phase, as in the fourteenth invention, the second estimated position in the aerial phase can be prevented from being improperly corrected by the first estimated position, whose reliability is low in that phase, and the long-term accuracy of the second estimated position can be stably secured.
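The correction described in the two preceding items can be sketched as a simple difference feedback with a phase-dependent gain (the scalar-gain form and the names are illustrative assumptions; the patent's actual correction law is given later in the embodiments):

```python
def correct_second_estimate(second_est, first_est, gain, in_aerial_phase):
    """Pull the inertially obtained second estimated position toward the
    geometrically obtained first estimated position so that their difference
    approaches 0. During the aerial phase the correction gain is set to
    (approximately) 0, because the geometric estimate is unreliable while no
    leg is in contact with the floor."""
    k = 0.0 if in_aerial_phase else gain
    return tuple(s + k * (f - s) for s, f in zip(second_est, first_est))
```

With a small positive gain this suppresses drift of the inertial estimate in the support phases while leaving it untouched in the aerial phase.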
- it is preferable that at least one of the detected value or estimated value of the actual posture of the predetermined part and the estimated position of the robot is corrected based on at least position information of an object such as a floor surface or an obstacle on a map stored in advance and relative positional relationship information of the robot with respect to the object recognized by environment recognition means such as an imaging device mounted on the robot (the 31st invention).
- Alternatively, it is preferable to provide a means for correcting at least one of the first estimated position, the second estimated position, and the detected value or estimated value of the actual posture of the predetermined part, based on at least position information of an object such as a floor surface or an obstacle on a map stored in advance and relative positional relationship information between the object and the robot recognized by environment recognition means such as an imaging device mounted on the robot (the 32nd invention).
- According to these, when an object such as a landmark whose position information on the map is clearly determined is recognized by the environment recognition means, the estimated position (the first or second estimated position in the tenth to thirty-third inventions) or the detected value or estimated value of the actual posture of the predetermined part required to determine the estimated position (in the case of the first to ninth inventions or the fifteenth to thirtieth inventions) is corrected based on the relative positional relationship between the object and the robot, so that the accuracy of the estimated position (in the tenth to thirty-third inventions, the accuracy of the second estimated position) can be improved.
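A landmark-based correction of this kind can be sketched as follows (names, the blending-gain form, and the assumption that the observed relative position is already expressed in global orientation are all illustrative, not the patent's implementation):

```python
def correct_position_with_landmark(estimated_pos, landmark_map_pos,
                                   observed_rel, gain=1.0):
    """When a landmark with a known map position is recognized, the robot
    position implied by the observation is landmark_map_pos - observed_rel,
    where observed_rel is the landmark position relative to the robot
    (assumed here to be expressed in global orientation). The implied
    position is blended into the current estimate with the given gain."""
    implied = tuple(m - r for m, r in zip(landmark_map_pos, observed_rel))
    return tuple(e + gain * (i - e) for e, i in zip(estimated_pos, implied))
```

A gain of 1 snaps the estimate to the observation; a smaller gain trades observation noise against accumulated estimation error.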
- Further, it is preferable to provide a means for estimating the position of the object based on at least the position of the robot estimated by the position estimating means and relative positional relationship information of the robot with respect to the object recognized by environment recognizing means such as an imaging device mounted on the robot (the 34th invention).
- Alternatively, it is preferable to provide a means for estimating the position of the object based on at least the second estimated position and relative positional relationship information of the robot with respect to the object recognized by environment recognizing means such as an imaging device mounted on the robot (the 35th invention).
- According to these, the position of the object can be accurately estimated using the estimated position (the second estimated position in the tenth to thirtieth inventions), which is the self-position of the robot determined as described above.
- Further, it is preferable to provide a means for determining the direction in which the environment recognition means, such as an imaging device mounted on the robot, should gaze, based on at least the estimated position of the robot and the position of an object such as a floor surface or an obstacle on a map stored in advance (the 36th invention).
- According to this, using the estimated position (the second estimated position in the tenth to thirtieth inventions), which is the self-position of the robot determined as described above, the robot can accurately recognize the positional relationship between itself and the object, and can therefore direct the gaze of the environment recognition means in the direction in which it should observe (for example, the direction in which the object appears at a predetermined position, such as the center of the image of the imaging device constituting the environment recognition means).
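The gaze-direction determination of the 36th invention reduces, in its simplest form, to computing the bearing from the robot's estimated position to the map position of the object. The following 2-D sketch (hypothetical names, not the patent's implementation) illustrates this:

```python
import math

def gaze_direction(robot_pos, object_pos):
    """Direction (unit vector and yaw angle) from the robot's estimated
    position toward a map object, i.e. where the imaging device should look
    so that the object falls at the image center."""
    dx, dy = object_pos[0] - robot_pos[0], object_pos[1] - robot_pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        raise ValueError("object coincides with robot position")
    return (dx / dist, dy / dist), math.atan2(dy, dx)
```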
- FIG. 1 is a schematic diagram showing an outline of the overall configuration of a bipedal mobile robot as a legged mobile robot according to an embodiment of the present invention,
- FIG. 2 is a diagram schematically showing the configuration of the foot portion of each leg in FIG. 1, and FIGS. 3 and 4 are a side view and a bottom view, respectively, showing the detailed configuration of the foot of each leg,
- FIG. 5 is a block diagram showing the configuration of a control unit provided in the robot of FIG. 1, FIG. 6 is a block diagram showing the functional configuration of the control unit shown in FIG. 5, and FIG. 7 is an explanatory diagram showing a running gait,
- FIGS. 8(a), 8(b), and 8(c) are diagrams showing the floor reaction force vertical component of the desired gait, the desired ZMP, and a gain used for self-position/posture estimation, respectively,
- FIG. 9 is a flowchart showing a process of a main part of the control unit in the first embodiment.
- FIG. 10 is a flowchart showing the self-position/posture estimation process in the flowchart of FIG. 9, and FIG. 11 is a diagram for explaining the self-position/posture estimation process in the flowchart of FIG. 10,
- FIG. 12 is a flowchart showing the self-position / posture estimation processing of the second embodiment, and
- FIG. 13 is a view for explaining the self-position / posture estimation processing of the second embodiment.
- FIG. 14 is a flowchart showing the self-position / posture estimation processing of the third embodiment
- FIG. 15 is a view for explaining the self-position / posture estimation processing of the third embodiment
- FIG. 16 is a flowchart showing the self-position / posture estimation processing of the fourth embodiment.
- FIG. 17 is a block diagram showing the main part of the self-position/posture estimation processing of the fourth embodiment, FIGS. 18(a) and (b) are diagrams for explaining the self-position/posture estimation processing of the fourth embodiment, and FIG. 19 is a graph showing an example of gain settings used in the fourth embodiment.
- FIG. 20 is a flowchart showing a self-position/posture estimation process according to a fifth embodiment, and FIG. 21 is a flowchart showing a self-position/posture estimation process according to a sixth embodiment.
- FIG. 22 is a block diagram showing the main part of the self-position / posture estimation processing according to the sixth embodiment.
- FIGS. 23 and 24 are a front view and a side view, respectively, showing the internal structure of the head of the robot according to a seventh embodiment, and
- FIG. 25 is a flowchart showing a self-position / posture estimation process according to a seventh embodiment.
- In the following, a bipedal robot is taken as an example of a legged mobile robot.
- FIG. 1 is a schematic diagram showing an overall outline of the bipedal robot as a legged mobile robot according to this embodiment.
- As shown in the figure, the bipedal robot (hereinafter referred to as the robot) 1 is provided with a pair of left and right legs (leg links) 2, 2 extending downward from an upper body (the base body of the robot 1) 3.
- the two legs 2 have the same structure, each having six joints.
- The six joints are, in order from the upper body 3 side, joints 10R, 10L for rotation (swing) of the crotch (waist) in the yaw direction with respect to the upper body 3 (the symbols R and L denote correspondence to the right and left legs, respectively; the same applies hereinafter), joints 12R, 12L for rotation of the crotch (waist) in the roll direction (around the X axis), joints 14R, 14L for rotation of the crotch (waist) in the pitch direction (around the Y axis), joints 16R, 16L for rotation of the knee in the pitch direction, joints 18R, 18L for rotation of the ankle in the pitch direction, and joints 20R, 20L for rotation of the ankle in the roll direction.
- A foot (foot portion) 22R(L) constituting the distal end of each leg 2 is attached below the two joints 18R(L), 20R(L) of the ankle of each leg 2, and the upper body 3 is attached to the uppermost ends of both legs 2, 2 via the three joints 10R(L), 12R(L), 14R(L) of the crotch of each leg 2. A control unit 60 and the like, which will be described in detail later, are housed inside the upper body 3. In FIG. 1, the control unit 60 is shown outside the upper body 3 for convenience of illustration.
- In each leg 2, the hip joint (waist joint) is composed of the joints 10R(L), 12R(L), and 14R(L), the knee joint is composed of the joint 16R(L), and the ankle joint is composed of the joints 18R(L) and 20R(L).
- the hip joint and the knee joint are connected by a thigh link 24 R (L), and the knee joint and the ankle joint are connected by a lower leg link 26 R (L).
- a pair of left and right arms 5 are attached to both upper sides of the upper body 3, and a head 4 is disposed at the upper end of the upper body 3. Since the arms 5, 5 and the head 4 do not directly relate to the gist of the present invention, a detailed description is omitted.
- By driving the six joints of each leg 2 (twelve joints in all) at appropriate angles, desired movements of both feet 22R and 22L can be performed, so that the robot 1 can move arbitrarily in three-dimensional space.
- As shown in FIG. 1, a known six-axis force sensor 50 is interposed between the ankle joints 18R(L), 20R(L) and the foot 22R(L) of each leg 2. The six-axis force sensor 50 detects the presence or absence of landing of the foot 22R(L) of each leg 2, the floor reaction force (ground contact load) acting on each leg 2, and the like, and outputs detection signals of the three-direction components Fx, Fy, Fz of the translational force of the floor reaction force and the three-direction components Mx, My, Mz of the moment to the control unit 60.
- The upper body 3 is provided with a tilt sensor 54 for detecting the inclination (posture angle) of the upper body 3 with respect to the Z axis (the vertical direction (direction of gravity)) and its angular velocity, and a detection signal from the tilt sensor 54 is output to the control unit 60.
- The tilt sensor 54 includes a three-axis acceleration sensor and a three-axis gyro sensor (not shown), and the detection signals of these sensors are used for detecting the inclination of the upper body 3 and its angular velocity and for estimating the self-position/posture of the robot 1. Although the detailed structure is not shown, each joint of the robot 1 is provided with an electric motor 64 (see FIG. 5) for driving it and an encoder (rotary encoder) 65 (see FIG. 5) for detecting the rotation amount of the electric motor 64 (the rotation angle of each joint).
- A detection signal from each encoder 65 is output to the control unit 60.
- Further, a joystick (operation device) 73 (see FIG. 5) is provided at an appropriate position on the robot 1. By operating the joystick 73, a request concerning the gait of the robot 1, such as turning the robot 1 that is moving straight ahead, can be input to the control unit 60 as required.
- FIG. 2 is a view schematically showing a basic configuration of a tip portion (including each foot 22 R (L)) of each leg 2 in the present embodiment.
- As shown in the figure, a spring mechanism 70 is provided between each foot 22R(L) and the six-axis force sensor 50, and a sole elastic body 71 made of rubber or the like is affixed to the sole (the bottom surface of each foot 22R(L)).
- the compliance mechanism 72 is constituted by the spring mechanism 70 and the sole elastic body 71.
- Specifically, the spring mechanism 70 is composed of a rectangular guide member (not shown in FIG. 2) attached to the upper surface of the foot 22R(L), and a piston-like member (not shown in FIG. 2) attached to the ankle joint 18R(L) (the ankle joint 20R(L) is omitted in FIG. 2) and six-axis force sensor 50 side and housed in the guide member via an elastic material (rubber or spring) so as to be slightly movable.
- The foot 22R(L) shown by the solid line in FIG. 2 is in the state in which it receives no floor reaction force. When each leg 2 receives a floor reaction force, the spring mechanism 70 and the sole elastic body 71 of the compliance mechanism 72 flex, and the foot 22R(L) moves to the position and posture illustrated by the dotted line in the figure.
- The structure of the compliance mechanism 72 is important not only for reducing the landing impact but also for enhancing controllability, as described in detail in, for example, Japanese Patent Application Laid-Open No. 5-305558 previously proposed by the present applicant.
- FIG. 3 is a cross-sectional side view of the foot mechanism 22R(L), and FIG. 4 is a plan view of the foot mechanism 22R(L) viewed from its bottom surface side.
- the foot mechanism 22 R (L) includes a generally flat foot plate member 102 as a skeletal member.
- The foot plate member 102 has a front end portion (toe portion) and a rear end portion (heel portion) that are curved slightly upward, but is otherwise flat.
- a guide member 103 having a rectangular cross section is fixed to the upper surface of the foot plate member 102 with its axis centered in the vertical direction.
- a movable plate (piston-like member) 104 is provided inside the guide member 103 so as to be movable substantially vertically along the inner peripheral surface of the guide member 103.
- the movable plate 104 is connected to the ankle joints 18 R (L) and 20 R (L) via a six-axis force sensor 50.
- The movable plate 104 is connected to the foot plate member 102 at its lower peripheral edge via a plurality of elastic members 106 made of an elastic material such as spring or rubber (illustrated as springs in the figure). Therefore, the foot plate member 102 is connected to the ankle joint 18R(L) via the elastic members 106, the movable plate 104, and the six-axis force sensor 50.
- The interior of the guide member 103 (the space below the movable plate 104) is open to the atmosphere through holes or gaps (not shown), so that air in the atmosphere can freely enter and exit the guide member 103. Further, the guide member 103, the movable plate 104, and the elastic members 106 constitute the spring mechanism 70 shown in FIG. 2.
- The grounding member 71 serving as the sole elastic body 71 shown in FIG. 2 is attached to the bottom surface (lower surface) of the foot plate member 102. The grounding member 71 is an elastic member interposed between the foot plate member 102 and the floor (an elastic member that directly contacts the floor). In this embodiment, the grounding member 71 is fixed at the four corners of the contact surface of the foot plate member 102 (both side portions of the toe portion and both side portions of the heel portion of the foot plate member 102).
- the grounding member 71 is formed by vertically stacking a soft layer 107 a made of a relatively soft rubber material and a hard layer 107 b made of a relatively hard rubber material.
- a hard layer 107 b is provided on the lowermost surface side as a grounding surface portion that comes into direct contact with the floor surface when the leg 2 is placed on the floor.
- the foot mechanism 22 R (L) is provided with a landing shock absorbing device 108 in addition to the above configuration.
- The landing shock absorbing device 108 includes a bag-shaped member 109 attached to the bottom surface of the foot plate member 102, and a flow passage 110 for letting a pressurized fluid (air in the atmosphere in this embodiment) in and out of the interior of the bag-shaped member 109.
- the bag-shaped member 109 is provided substantially at the center of the bottom surface of the foot plate member 102 such that the grounding member 71 is present around the bag-shaped member 109.
- The bag-shaped member 109 is made of an elastically deformable material such as rubber, and in a natural state in which no elastic deformation is caused by external force, it has the shape of a cylindrical container opening upward, as shown by the solid line in FIG. 3. The open end of the bag-shaped member 109 is fixed to the bottom surface of the foot plate member 102 over its entire circumference and is closed by the foot plate member 102. Further, the bag-shaped member 109 is provided such that, in the natural state of the cylindrical container shape, its bottom protrudes below the grounding member 71.
- In other words, the height of the bag-shaped member 109 (the distance from the lower surface of the foot plate member 102 to the bottom of the bag-shaped member 109) is larger than the thickness of the grounding member 71. Therefore, in the state in which the foot plate member 102 is grounded via the grounding member 71 (the landing state of the leg 2), the bag-shaped member 109 is compressed in its height direction by the floor reaction force, as shown by the phantom line in FIG. 3.
- the natural state in which the bag-shaped member 109 has the shape of a cylindrical container is the inflated state of the bag-shaped member 109. Since the bag-shaped member 109 is made of a flexible material, it has a shape restoring force to a natural state (cylindrical container shape) when compressed.
- the flow passage 110 constitutes inflow / outflow means for inflow / outflow of air to / from the bag-like member 109.
- In this embodiment, the flow passage 110 is formed as a communication hole in the foot plate member 102 that communicates the interior of the bag-shaped member 109 with the interior of the guide member 103. Since the interior of the guide member 103 is open to the atmosphere as described above, the flow passage 110 connects the interior of the bag-shaped member 109 to the atmosphere side. Therefore, air in the atmosphere can freely enter and exit the interior of the bag-shaped member 109 through the flow passage 110, and in the inflated state (natural state) of the bag-shaped member 109, its interior is filled with air and its internal pressure is equal to atmospheric pressure. In addition, the flow passage 110 is a throttle passage, which generates a fluid resistance when air enters and exits the bag-shaped member 109.
- FIG. 5 is a block diagram showing the configuration of the control unit 60.
- The control unit 60 is constituted by a microcomputer, and includes a first arithmetic unit 90 and a second arithmetic unit 92 each comprising a CPU, an A/D converter 80, a counter 86, a D/A converter 96, a RAM 84, a ROM 94, and a bus line 82 for exchanging data among them.
- In the control unit 60, the output signals of the six-axis force sensor 50 of each leg 2, the tilt sensor 54 (the acceleration sensor and the rate gyro sensor), the joystick 73, and the like are converted into digital values by the A/D converter 80 and then sent to the RAM 84 via the bus line 82.
- the output of the encoder 65 (rotary encoder) of each joint of the robot 1 is input to the RAM 84 via the counter 86.
- The first arithmetic unit 90 generates a desired gait as described later, calculates a joint angle displacement command (a command value of the displacement angle of each joint or of the rotation angle of each electric motor 64), and sends it to the RAM 84. The second arithmetic unit 92 reads the joint angle displacement command and the actual measured value of each joint angle detected based on the output signal of the encoder 65 from the RAM 84, calculates the operation amount required for driving each joint, and outputs it via the D/A converter 96 and a servo amplifier 64a to the electric motor 64 that drives each joint.
- FIG. 6 is a block diagram showing the overall functional configuration of the legged mobile robot control device according to this embodiment.
- The part other than the "actual robot" in FIG. 6 is constituted by the processing functions executed by the control unit 60 (mainly the functions of the first arithmetic unit 90 and the second arithmetic unit 92). In the following description, the symbols R and L are omitted when it is not necessary to distinguish between the left and right legs 2.
- The control unit 60 includes a gait generator 200 that generates and outputs a desired gait of the robot 1 freely and in real time, a self-position/posture estimating unit 214, and the like.
- The self-position/posture estimating unit 214 executes processing related to the characteristic features of the present invention, and estimates the position/posture of the robot 1 (its position/posture in the global coordinate system).
- the desired gait output by the gait generator 200 is a desired body position / posture trajectory (trajectory of the desired position and desired posture of the body 3), a desired foot position / posture trajectory (the desired position of each foot 22) And the trajectory of the desired posture), the trajectory of the desired arm posture (the trajectory of the desired posture of each arm 5), the trajectory of the desired total floor reaction force center point (the desired ZMP), and the trajectory of the desired total floor reaction force.
- When a movable part is provided on the upper body 3 in addition to the legs 2 and the arms 5, the desired position/posture trajectory of that movable part is added to the desired gait.
- the “trajectory” in the above gait means a temporal change pattern (time-series pattern), and in the following description, may be referred to as “pattern” instead of “trajectory”.
- the “posture” of each part is a general term for the inclination and direction of the part.
- Here, the inclination of a part is the angle of that part with respect to the vertical direction, and its direction is the direction of the vector obtained by projecting a vector indicating the front direction of that part onto a horizontal plane.
- For example, the inclination of the body posture consists of the inclination angle (posture angle) of the upper body 3 in the roll direction (around the X axis) with respect to the Z axis (vertical axis) and the inclination angle (posture angle) of the upper body 3 in the pitch direction (around the Y axis) with respect to the Z axis.
- The direction of the upper body 3 is represented by the rotation angle in the yaw direction (around the Z axis) of the vector obtained by projecting the vector indicating the front direction of the upper body 3 onto a horizontal plane.
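The direction defined above can be computed directly from a 3-D forward vector. The following sketch (hypothetical names, not part of the patent) projects the vector onto the horizontal plane and takes its rotation angle about the Z axis:

```python
import math

def body_direction(forward_vec):
    """Yaw direction of the body: project the 3-D forward vector onto the
    horizontal (XY) plane and return the rotation angle of the projection
    about the Z axis, in radians."""
    x, y, _z = forward_vec
    if x == 0.0 and y == 0.0:
        raise ValueError("forward vector is vertical; direction is undefined")
    return math.atan2(y, x)
```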
- the foot posture is represented by a two-axis spatial azimuth fixedly set for each foot 22.
- the landing posture of foot 22 basically indicates the direction of the foot 22 that has landed, and specifically, the direction of the vector that projects the vector from the heel of the landed foot 22 to the toe onto the horizontal plane.
- The desired arm posture is represented by relative postures with respect to the upper body 3 regarding all parts of the arms 5.
- the body position means a predetermined position of the body 3, specifically, a position of a predetermined representative point of the body 3.
- the foot position means the position of a predetermined representative point of each foot 22R, 22L.
- the body speed means the moving speed of the representative point of the body 3, and the foot speed means the moving speed of the representative point of each of the feet 22 R and 22 L.
- With respect to elements of the desired gait such as the desired body position/posture, "desired" is often omitted in the following description when there is no risk of misunderstanding.
- The components of the gait other than those relating to the floor reaction force, that is, those relating to the motion of the robot 1 such as the foot position/posture and the body position/posture, are referred to generically as "motion".
- The floor reaction force of each foot 22 (the floor reaction force consisting of a translational force and a moment) is called the "each-foot floor reaction force", and the resultant of the floor reaction forces of all (two) feet 22R and 22L is called the "total floor reaction force".
- The desired floor reaction force is generally expressed by a point of action and by the force (translational force) and the moment of force acting on that point. Since the point of action may be set anywhere, countless expressions are conceivable for the same desired floor reaction force; when the desired floor reaction force is expressed with the aforementioned desired total floor reaction force center point as the point of action, the moment of force is zero except for its vertical-axis component.
- In a gait that satisfies the dynamic equilibrium condition, the ZMP calculated from the desired motion trajectory (the point at which the moment of the resultant of the inertial force and gravity of the robot calculated from the desired motion trajectory, acting about that point, is zero except for the vertical-axis component) coincides with the desired total floor reaction force center point (for details of the ZMP trajectory, see, for example, PCT Publication WO 02/40224 by the present applicant).
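The ZMP condition stated above can be illustrated for a single lumped mass (a sketch of the general definition only, not the patent's multi-link computation; the function name and the flat-floor, sagittal-plane assumptions are illustrative):

```python
def zmp_x(g, com, com_accel):
    """X coordinate of the ZMP on a flat floor at z = 0, for a single lumped
    mass at center-of-mass position com = (x, z) with acceleration
    com_accel = (ax, az). The mass cancels out of the moment balance
    m*(az + g)*(x - x_zmp) - m*ax*z = 0 about the ZMP."""
    x, z = com
    ax, az = com_accel
    return x - ax * z / (az + g)
```

For a stationary robot the ZMP reduces to the vertical projection of the center of mass, consistent with the statement that the desired ZMP is set inside the support polygon.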
- a desired gait in a broad sense is a set of a desired motion trajectory and a desired floor reaction force trajectory during one or more steps.
- the target gait in a narrow sense is a set of the target motion trajectory and its ZMP trajectory during one step.
- a desired gait is a set of a desired motion trajectory during one step, its ZMP trajectory, and a floor reaction force vertical component trajectory.
- the target gait will be used in the narrow sense of the target gait unless otherwise specified.
- Further, "one step" of the desired gait is used to mean the period from when one leg 2 of the robot 1 lands to when the other leg 2 lands.
- the aerial period is the period during which both legs 2, 2 are off the floor (floating in the air).
- The leg 2 that does not support the weight of the robot 1 during the one-leg support period is called the "free leg", and the leg 2 that supports the weight is called the "support leg".
- In the walking of the robot 1, the two-leg support period and the one-leg support period are alternately repeated, and in the running of the robot 1, the one-leg support period and the aerial period are alternately repeated.
- During the aerial period, neither leg 2 supports the weight of the robot 1; however, the leg 2 that was the free leg and the leg 2 that was the support leg in the one-leg support period immediately preceding the aerial period are still referred to as the free leg and the support leg, respectively, even during the aerial period.
- The positions/postures of the parts of the robot 1 in the desired gait, such as the desired body posture, the desired body position, the desired foot position/posture, and the desired arm posture, are described in terms of the support leg coordinate system.
- The support leg coordinate system is a coordinate system fixed to the floor with its origin in the vicinity of the ground contact surface of the foot 22 of the support leg. More specifically, as described in Patent No. 3273443 by the present applicant, the support leg coordinate system has its origin at the point obtained by projecting the center of the ankle joint of the support leg vertically onto the ground contact surface while the foot 22 of the support leg is rotated to a horizontal posture without sliding on the ground surface; the horizontal axis toward the toe of the foot 22 of the support leg (the axis in the front-rear direction of the foot 22) is the X axis, the vertical axis is the Z axis, and the coordinate axis orthogonal to these (the axis in the left-right direction of the foot 22) is the Y axis.
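The construction of such a frame can be sketched in 2-D (XY plane only; hypothetical names, not the patent's implementation) as an origin plus a yaw angle, together with a world-to-support-leg transform:

```python
import math

def support_leg_frame(ankle_proj_xy, toe_dir_xy):
    """Support leg coordinate system: origin at the vertical projection of the
    ankle-joint center onto the contact surface, X axis toward the toe,
    Z vertical, Y orthogonal. Returns the origin, the yaw of the X axis, and
    a world->support-leg transform for 2-D points."""
    yaw = math.atan2(toe_dir_xy[1], toe_dir_xy[0])
    c, s = math.cos(yaw), math.sin(yaw)

    def world_to_support(p_xy):
        dx, dy = p_xy[0] - ankle_proj_xy[0], p_xy[1] - ankle_proj_xy[1]
        return (c * dx + s * dy, -s * dx + c * dy)

    return ankle_proj_xy, yaw, world_to_support
```

A point one meter ahead of the origin along the toe direction maps to (1, 0) in the support leg frame, as expected.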
- The gait generator 200 receives as inputs the required values (targets) of the landing position/posture and the landing time of the foot 22 of the free leg up to two steps ahead, and generates a desired gait consisting of the desired body position/posture trajectory, the desired foot position/posture trajectory, the desired ZMP trajectory, the desired floor reaction force vertical component trajectory, and the desired arm posture trajectory. At this time, some of the parameters that define these trajectories (called gait parameters) are corrected so as to satisfy the continuity of the gait.
- A dynamic model of the robot 1 is used to generate the desired gait. Examples of the dynamic model include the simplified model described in PCT Publication WO 02/40224 or the multi-mass-point model (full model) described in Japanese Patent Application Laid-Open No. 2002-326173 proposed by the present applicant.
- The gait generator 200 generates desired gaits in order, taking as the unit the desired gait for one step (the desired gait in the narrow sense described above) from when one leg 2 of the robot 1 lands until when the other leg 2 lands. Here, the gait that is being generated now or is to be generated is called the "current time gait", the next gait is called the "next time gait", and the desired gait generated just before the current time gait is called the "previous gait".
- a part of a desired gait generated by the gait generator 200 will be outlined.
- For example, the desired foot position/posture trajectory is generated using the finite-duration setting filter disclosed in Patent No. 3233450 by the present applicant.
- In this case, the foot position trajectory is generated such that the foot 22 starts moving while gradually accelerating toward the target landing position (the required value of the landing position), gradually decelerates to a speed of 0 or nearly 0 by the target landing time (the required value of the landing time), and reaches the target landing position and stops at the target landing time.
- The desired foot position/posture trajectory generated in this way has a ground-relative speed of 0 or nearly 0 at the moment of landing; it is therefore particularly suitable when the robot 1 runs, since the landing impact when landing from the aerial period can be reduced.
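A trajectory with this accelerate-then-decelerate-to-zero-velocity property can be sketched with a smooth cubic blend (this stands in for the finite-duration setting filter of the patent, whose exact form is not reproduced here; names are hypothetical):

```python
def foot_position(t, T, p0, p1):
    """One-dimensional foot position at time t: starts at p0, accelerates,
    then decelerates so that both the position reaches the target landing
    position p1 and the velocity reaches 0 at the target landing time T."""
    s = max(0.0, min(1.0, t / T))
    blend = 3 * s * s - 2 * s ** 3  # cubic blend with zero slope at s = 0 and s = 1
    return p0 + (p1 - p0) * blend
```

Because the blend has zero slope at s = 1, the foot's velocity at the landing instant is 0, which is the property that reduces the landing impact.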
- When the robot 1 runs in the manner shown in FIG. 7, the desired floor reaction force vertical component trajectory and the desired ZMP trajectory (specifically, the desired ZMP trajectory in the X-axis direction of the support leg coordinate system (the front-rear direction of the support leg foot 22)) are set in the patterns shown by the solid lines in FIGS. 8(a) and 8(b), respectively.
- The first to third diagrams in FIG. 7 schematically show the motion states of the two legs 2, 2 of the robot 1 at the start, the middle, and the end of the one-leg support period, respectively, and the fourth and fifth diagrams schematically show the motion states of the two legs 2, 2 of the robot 1 at the middle of the aerial period and at the end of the aerial period (the start of the next one-leg support period), respectively.
- When the robot 1 runs, the desired floor reaction force vertical component trajectory basically has an upwardly convex pattern during the one-leg support period and is maintained at 0 during the aerial period.
- When the robot 1 walks, the desired floor reaction force vertical component trajectory is set, for example, as shown by the two-dot chain line in FIG. 8(a). In that case, the upwardly convex portion of the two-dot chain line corresponds to the two-leg support period, and the downwardly convex portion corresponds to the one-leg support period.
- the target ZMP is basically set near the center of the ground contact surface of the legs 2 of the robot 1 (more specifically, within the so-called support polygon) regardless of whether the robot is running or walking.
- FIG. 9 is a flowchart (structured flowchart) showing the gait generation processing of the gait generator 200 and the self-position/posture estimation processing of the self-position/posture estimating section 214.
- the process proceeds to S014 via S012, and waits for a timer interrupt at each control cycle; the control cycle is Δt.
- the process proceeds to S016, where the self-position/posture estimation is performed by the self-position/posture estimating unit 214.
- the processing in S016 is a feature of the self-position estimating device of the legged mobile robot of the present application, and will be described later.
- the process then proceeds to S018, where it is determined whether it is a gait switching point (the time at which generation of the previous gait is completed and generation of a new current time gait starts); if the result is YES, the process proceeds to S020, and if the result is NO, the process proceeds to S032.
- the processing after S020 described below is described in detail in PCT publication WO 02/40224 previously proposed by the present applicant or in the specification of PCT application PCT/JP02/13596 mentioned above, so only a brief explanation is given here.
- time t is initialized to 0.
- next, in S022, the next time gait support leg coordinate system and the next-next time gait support leg coordinate system (specifically, their positions and orientations), and the current time gait cycle and the next time gait cycle, are read.
- the next time gait support leg coordinate system and the next-next time gait support leg coordinate system are determined, in accordance with the definition of the support leg coordinate system described above, according to the required value (target landing position/posture) of the landing position/posture of the first step of the free leg foot 22 and the required value (target landing position/posture) of the landing position/posture of the second step of the free leg foot 22 (the free leg foot 22 of the next time gait), respectively, as specified by the operation of the joystick 73 or the like.
- similarly, the current time gait cycle and the next time gait cycle are determined according to the required value (target landing time) of the landing time of the first step of the free leg foot 22 and the required value (target landing time) of the landing time of the second step of the free leg foot 22, respectively.
- the above-described required values of the landing position/posture and the landing time of the free leg foot 22, or the positions and orientations of the support leg coordinate systems and the gait cycles, may be stored in advance as a walking schedule, or may be determined based on a command (request) from a control device such as the joystick 73 and the walking history up to that time.
- next, the gait parameters of the normal turning gait following the current time gait are determined based on the next time gait support leg coordinate system and the next-next time gait support leg coordinate system determined in S022, the current time gait cycle, the next time gait cycle, and so on.
- specifically, foot trajectory parameters that define the target foot position/posture trajectory, reference body posture trajectory parameters that define the reference trajectory of the target body posture, arm posture trajectory parameters that define the target arm posture trajectory, ZMP trajectory parameters that define the target ZMP trajectory, and floor reaction force vertical component trajectory parameters that define the desired floor reaction force vertical component trajectory are determined. Taking the floor reaction force vertical component trajectory parameters as an example, the times and values of the break points of the pattern shown in Fig. 8(a) are determined as those parameters.
- here, the normal turning gait means a cyclic gait such that, when the gait is repeated, no discontinuity occurs in the motion state of the robot 1 at the gait boundaries (hereinafter, "normal turning gait" may be abbreviated to "normal gait").
- a gait for one cycle of the normal gait consists of a first turning gait and a second turning gait.
- the first turning gait corresponds to the gait in which the support leg foot 22 corresponding to the support leg coordinate system of the current time gait is moved to the position and orientation corresponding to the next-next time gait support leg coordinate system, and the second turning gait corresponds to the gait in which the support leg foot 22 corresponding to the next time gait support leg coordinate system is moved to the position and orientation corresponding to the next-next-next time gait support leg coordinate system.
- the next-next-next time gait support leg coordinate system corresponds to the target landing position/posture of the free leg foot 22 of the second turning gait, and is set such that its position and orientation viewed from the next-next time gait support leg coordinate system (the support leg coordinate system of the second turning gait) match the position and orientation of the next time gait support leg coordinate system (the landing position/posture of the free leg foot 22 of the current time gait) viewed from the current time gait support leg coordinate system.
- the term "turning" is used for the normal turning gait because, when the turning rate is zero, it means going straight, so that going straight can be included in turning in a broad sense.
- the normal turning gait is a virtual cyclic gait provisionally created by the gait generator 200 in order to determine the divergent component and the body vertical position/velocity at the end of the current time gait, and it is not output as-is from the gait generator 200 for actually controlling the robot 1.
- note that "divergence" means that the position of the upper body shifts to a position far away from the positions of both feet 22, 22, and the value of the divergent component is a numerical value representing how far the upper body 3 of the bipedal robot 1 is from the positions of both feet 22, 22 (strictly speaking, from the origin of the support leg coordinate system set on the support leg contact surface); it is expressed as a function of the horizontal position of the upper body 3 and its speed.
- in the present embodiment, a normal gait to be connected after the current time gait to be generated is set according to the movement requests (the required values of the landing position/posture and landing time of the free leg foot 22 up to two steps ahead), and the current time gait is generated so that the terminal divergent component of the current time gait matches the initial divergent component of the normal gait.
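- the patent defers the exact definition of the divergent component to WO 02/40224; purely as an illustration, a commonly used form for a linear inverted pendulum approximation can be sketched as follows (the LIPM reduction, the function name, and the numerical values are assumptions, not taken from the patent):

```python
import math

def divergent_component(x, vx, com_height, g=9.81):
    """Illustrative divergent component q = x + vx / omega0 of a linear
    inverted pendulum, where omega0 = sqrt(g / h).  The patent's exact
    definition is given in WO 02/40224; this is only the common LIPM form."""
    omega0 = math.sqrt(g / com_height)
    return x + vx / omega0

# sample body horizontal position/velocity; if q stays bounded, the upper
# body does not diverge from the support leg coordinate system origin
q = divergent_component(x=0.05, vx=0.2, com_height=0.8)
```

matching the terminal divergent component of the current time gait to the initial divergent component of the normal gait then amounts to equating two such values.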
- after performing the processing from S010 to S024 as described above, the process proceeds to S026, where the initial state of the normal turning gait (the initial body horizontal position/velocity components, the initial body vertical position/velocity, the initial divergent component, and the initial body posture angle and angular velocity) is determined.
- the details of S026 are described in PCT publication WO 02/40224 or PCT/JP02/13596, so further description is omitted here.
- next, in S028, the gait parameters of the current time gait are determined (some of them provisionally); similarly to the gait parameters of the normal turning gait, they are mainly the foot trajectory parameters, the reference body posture trajectory parameters, the arm posture trajectory parameters, the target ZMP trajectory parameters, and the target floor reaction force vertical component trajectory parameters. Among these, however, the target ZMP trajectory parameters are provisional.
- the details of the processing of S028 are described in the above-mentioned PCT publication WO 02/40224, PCT/JP02/13596, and the like, so further description is omitted here.
- the process proceeds to S030, in which the gait parameters of the current time's gait are corrected such that the terminal divergent component of the current time's gait matches the initial divergent component of the normal gait.
- the gait parameters corrected here are the target ZMP trajectory parameters.
- through the above, the target gait generation processing in the gait generator 200 and the self-position/posture estimation processing of the self-position/posture estimating unit 214 are performed, and the target gait is generated.
- the desired body position/posture (trajectory) and the desired arm posture trajectory are sent directly to the robot geometric model (inverse kinematics calculation unit) 202.
- the desired foot position/posture (trajectory), the desired ZMP trajectory (target total floor reaction force center point trajectory), and the desired total floor reaction force (trajectory) (the target floor reaction force horizontal component and the target floor reaction force vertical component) are sent directly to the composite compliance operation determination unit 204, and are also sent to the target floor reaction force distributor 206.
- in the target floor reaction force distributor 206, the desired total floor reaction force is distributed to the feet 22R and 22L, and the desired foot floor reaction force center point and the desired foot floor reaction force are determined.
- the determined desired foot floor reaction force center point and the desired foot floor reaction force are sent to the composite compliance operation determination unit 204.
- the composite compliance motion determination unit 204 generates a corrected target foot position/posture trajectory with mechanism deformation compensation, and sends it to the robot geometric model 202.
- upon receiving the target body position/posture (trajectory) and the corrected target foot position/posture (trajectory) with mechanism deformation compensation, the robot geometric model 202 calculates joint displacement commands (values) of the 12 joints of the legs 2, 2 that satisfy them, and sends the commands to the displacement controller 208.
- the displacement controller 208 performs tracking control of the displacements of the 12 joints of the robot 1, using the joint displacement commands (values) calculated by the robot geometric model 202 as target values.
- the floor reaction force generated at the robot 1 (specifically, the floor reaction force of each foot) is detected by the six-axis force sensor 50.
- the detected value is sent to the composite compliance operation determining unit 204.
- the angle deviations θerrx and θerry are sent to the posture stabilization control calculation unit 212; θerrx is the inclination component in the roll direction (around the X axis), and θerry is the inclination component in the pitch direction (around the Y axis).
- the posture stabilization control calculation unit 212 calculates the compensating total floor reaction force moment Mdmd for restoring the inclination of the actual body posture of the robot 1 to the inclination of the body posture of the target gait, and gives this compensating total floor reaction force moment Mdmd to the composite compliance operation determination unit 204.
- the composite compliance operation determination unit 204 corrects the desired foot position/posture based on the input values; specifically, the target foot position/posture given by the gait generator 200 is corrected so that the actual total floor reaction force (the resultant force of all the actual foot floor reaction forces, including both the translational force and the moment) matches the resultant of the desired total floor reaction force (the resultant of the targets of the respective foot floor reaction forces) and the compensating total floor reaction force moment Mdmd, thereby determining the corrected target foot position/posture (trajectory).
- in this case, weights are given to the control deviations of the respective targets of the foot position/posture and the floor reaction force, and control is performed so that the weighted average of the control deviations (or of the squares of the control deviations) is minimized.
- the corrected target foot position/posture (trajectory) with mechanism deformation compensation is the target foot position/posture (trajectory) obtained by calculating, using a mechanical model of the deformation mechanisms (a spring-damper model or the like), the deformations of the foot deformation mechanisms (the columnar rubber, the sole sponge, and the shock-absorbing bag-shaped air damper) necessary to generate the target value of the floor reaction force corrected by the composite compliance operation determination unit 204, and correcting the target foot position/posture so as to generate those deformations.
- FIG. 10 is a flowchart of the self-position/posture estimation processing of S016 in Fig. 9; the self-position/posture estimation processing of the first embodiment described below is the processing in the case where the robot 1 is made to walk.
- the detected value of the gyro sensor provided in the inclination sensor 54 of the body 3, that is, the detected value of the angular velocity of the body 3 (angular velocities about three axes), is integrated by an integrator to obtain the estimated body posture. This estimated body posture is described in the global coordinate system.
- in order to suppress accumulation (drift) of the integration error of the gyro sensor detection values, the direction of gravity detected by the acceleration sensor provided in the inclination sensor 54 is used for drift correction of the inclination component of the estimated body posture.
- specifically, the difference between the relative angle of the body 3 with respect to the direction of gravity obtained from the detected value of the acceleration sensor and the inclination of the estimated body posture is calculated, and a value obtained from the difference by a feedback control law such as a PI control law is additionally input to the integrator that integrates the gyro sensor detection values so that the difference converges to 0. Since this correction method is publicly known, further description is omitted.
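- the drift correction described here can be sketched as a one-axis structure: integrate the gyro rate, and feed the difference between the accelerometer-derived tilt and the estimate back into the integrator through a PI law. The gains, time step, and all names below are illustrative assumptions, not values from the patent:

```python
class TiltEstimator:
    """One-axis sketch of the described drift correction: integrate the gyro
    rate, and add a PI feedback of (accelerometer tilt - estimate) into the
    integrator input so the integration error converges to zero."""
    def __init__(self, kp=1.0, ki=0.1, dt=0.005):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.tilt = 0.0      # estimated body tilt [rad]
        self.integ = 0.0     # integral term of the PI law

    def update(self, gyro_rate, accel_tilt):
        err = accel_tilt - self.tilt           # difference driven to 0
        self.integ += self.ki * err * self.dt
        corrected_rate = gyro_rate + self.kp * err + self.integ
        self.tilt += corrected_rate * self.dt
        return self.tilt

est = TiltEstimator()
for _ in range(20000):               # constant gyro bias, level accelerometer
    est.update(gyro_rate=0.01, accel_tilt=0.0)
# the PI feedback absorbs the bias, so the tilt estimate settles near zero
```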
- the process proceeds to S2002, and it is determined whether or not the landing of the robot 1 has been confirmed.
- this determination may be made from the timing of the desired gait, as in the methods listed below, or may be made based on the detected value of the six-axis force sensor 50 (floor reaction force sensor) or of the acceleration sensor; alternatively, a comprehensive judgment may be made from those detected values and the timing of the desired gait. a) Whether a predetermined time has elapsed since the start of the one-leg support period.
- the landing should preferably be judged as confirmed at a time when an appropriate amount of time has elapsed from the moment of landing, so that the landed foot 22 will not slip or leave the floor again afterwards; the predetermined times of a) to c) are therefore preferably set so as to satisfy such a condition.
- when the landing is confirmed, the process first proceeds to S2004, and the difference between the estimated body posture viewed from the global coordinate system and the body posture of the desired gait (target body posture) viewed from the global coordinate system is calculated as the posture rotation deviation change amount.
- here, the body posture of the target gait viewed from the global coordinate system is the body posture viewed from the global coordinate system when it is assumed that the robot 1 has moved in accordance with the desired gait on the current estimated support leg coordinate system, without posture inclination or spin of the body 3 occurring during the one step.
- the estimated support leg coordinate system is a support leg coordinate system corresponding to the estimated position and orientation of the actual support leg foot 22 of the robot 1.
- more specifically, in accordance with the definition of the support leg coordinate system described above, it is a coordinate system whose origin is the point obtained by vertically projecting the center of the ankle of the support leg onto the ground contact surface when the actual support leg foot 22 of the robot 1 is rotated to the horizontal without slipping with respect to the ground contact surface, with the horizontal axis toward the toe of the support leg foot 22 as the X axis, the vertical axis as the Z axis, and the coordinate axis orthogonal to these as the Y axis.
- in the first embodiment, the estimated support leg coordinate system is updated only when the landing is confirmed, so it is a support leg coordinate system corresponding to the estimated position and orientation of the actual support leg foot 22 of the robot at the time of landing. After all, in the first embodiment, the position and orientation of the estimated support leg coordinate system is estimated as the estimated value of the self-position of the robot 1.
- in the first embodiment, the difference between the estimated body posture and the body posture of the target gait is calculated as the posture rotation deviation change amount; strictly speaking, however, it is better to calculate, as the posture rotation deviation change amount, the difference between the change amount of the estimated body posture during the period of one step and the change amount of the body posture of the target gait during that period.
- in the first embodiment, the estimated body posture automatically coincides with the body posture of the target gait at the beginning of one step, so the difference between the two postures is the same as the difference between the two change amounts during the period of one step, and either may be used.
- next, the process proceeds to S2006, where the posture rotation center is determined; specifically, for example, the target ZMP of the previous time gait in the state in which the rear-side support leg foot 22 of the previous time gait is in toe contact is determined as the posture rotation center.
- next, the process proceeds to S2008, where the position and posture obtained by rotating the current estimated support leg coordinate system (specifically, the estimated support leg coordinate system determined when the landing of the support leg foot 22 of the previous gait was confirmed; the estimated support leg coordinate system before the occurrence of slip shown in Fig. 11) about the posture rotation center determined in S2006 by the posture rotation deviation change amount are determined anew as the current estimated support leg coordinate system (the estimated support leg coordinate system after the occurrence of slip shown in Fig. 11).
- the origin of the estimated support leg coordinate system and the directions of its coordinate axes are represented in the global coordinate system.
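- the update in S2008 rotates the estimated support leg coordinate system, expressed in the global coordinate system, about the posture rotation center by the posture rotation deviation change amount. A minimal planar (spin-only) sketch of such a frame rotation follows; the 2D reduction, names, and values are illustrative assumptions:

```python
import math

def rotate_frame_about_center(origin, heading, center, dtheta):
    """Rotate a frame (origin position + heading angle, both expressed in
    global coordinates) about a given posture rotation center by dtheta.
    Planar (spin) case only; a sketch, not the patent's full 3D update."""
    c, s = math.cos(dtheta), math.sin(dtheta)
    dx, dy = origin[0] - center[0], origin[1] - center[1]
    new_origin = (center[0] + c * dx - s * dy,
                  center[1] + s * dx + c * dy)
    return new_origin, heading + dtheta

# estimated support leg frame at (1.0, 0.0), rotated 90 deg about a
# posture rotation center (e.g. the target ZMP) at the global origin
origin, heading = rotate_frame_about_center((1.0, 0.0), 0.0, (0.0, 0.0),
                                            math.pi / 2)
```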
- next, the process proceeds to S2010, and the next time gait estimation support leg coordinate system (the position/posture in the global coordinate system shown in Fig. 11) is determined so that the relative position/posture relationship of the next time gait estimation support leg coordinate system with respect to the current estimated support leg coordinate system is the same as the relative position/posture relationship of the next time gait support leg coordinate system with respect to the support leg coordinate system in the target gait (the previous gait).
- note that the next time gait support leg coordinate system referred to here is not the next time gait support leg coordinate system for the current time gait to be generated from now on, but the support leg coordinate system of the gait following the previous gait (that is, of the current time gait).
- here, the rotation center of the estimated support leg coordinate system means the rotation center of the slip rotation of the support leg foot 22 in normal walking, which has no mid-air period.
- in a mid-air period, the posture rotation (or spin) is also preferably expressed as a rotation about the rotation center of the estimated support leg coordinate system, and in a broad sense "slip of the support leg foot" may be defined as posture rotation about the rotation center of the estimated support leg coordinate system.
- incidentally, the support leg coordinate system only has to be set with respect to the ground contact surface of the support leg foot 22, and its origin need not be the point obtained by vertically projecting the center of the ankle of the support leg onto the ground contact surface as described above. That is, the support leg coordinate system is a local coordinate system set on a virtual floor near the support leg foot 22 for describing the motion of the robot.
- the posture rotation (or spin) phenomenon of the robot 1 can be regarded as a phenomenon in which, while the robot 1 keeps moving on the virtual floor without rotating its posture (or spinning) relative to that floor, the virtual floor itself, together with the robot 1, rotates in posture (or spins) in the global coordinate system about a predetermined point as the rotation center.
- in other words, the motion of the robot 1 can be regarded as a perturbational rotation, about a certain point (the posture rotation center), of the local coordinate system together with the entire robot 1 that is moving in that local coordinate system according to the target gait or the detected joint displacement values, with respect to the global coordinate system.
- therefore, the posture rotation center can be said to be the rotation center of the estimated support leg coordinate system, and also the rotation center of this perturbational rotation.
- note that since the current estimated support leg coordinate system is updated only at each landing, it is not preferable that the current estimated support leg coordinate system remains inclined due to the body posture inclination at the moment of landing; the current estimated support leg coordinate system should be determined so that the direction of its Z axis is vertical. Therefore, in the first embodiment, after the current estimated support leg coordinate system is rotated in S2008, the direction of the Z axis of the rotated current estimated support leg coordinate system is returned to the vertical direction. Specifically, by rotating the rotated current estimated support leg coordinate system by the inclination angle of its Z axis with respect to the vertical direction, the Z axis of the estimated support leg coordinate system is returned to the vertical direction.
- alternatively, in S2008, the current estimated support leg coordinate system may be rotated only by the vertical axis component (spin component) of the posture rotation deviation change amount.
- alternatively, the direction of the Z axis of the next time gait estimation support leg coordinate system may be returned to the vertical direction.
- in short, the current estimated support leg coordinate system should be determined so that the direction of its Z axis is vertical; in the third and subsequent embodiments, the same may be applied if importance is placed on the self-position shift due to spin.
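- the Z-axis verticalization described above amounts to discarding the inclination of the rotated frame while keeping its heading (spin component). A small sketch under the assumption that the frame orientation is a row-major 3x3 rotation matrix; the yaw-extraction convention is an illustrative choice, not taken from the patent:

```python
import math

def yaw_of(R):
    """Heading (spin) angle of a rotation matrix R (row-major 3x3 list)."""
    return math.atan2(R[1][0], R[0][0])

def verticalize(R):
    """Replace R by the pure rotation about the vertical (Z) axis having the
    same heading, i.e. discard the tilt so the frame's Z axis is vertical."""
    y = yaw_of(R)
    c, s = math.cos(y), math.sin(y)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

# frame with 0.3 rad spin about Z composed with a 0.1 rad tilt about Y
c1, s1 = math.cos(0.1), math.sin(0.1)
c3, s3 = math.cos(0.3), math.sin(0.3)
R_tilted = [[c3 * c1, -s3, c3 * s1],
            [s3 * c1,  c3, s3 * s1],
            [-s1,     0.0, c1]]
R_vert = verticalize(R_tilted)   # same heading, Z axis vertical again
```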
- in the first embodiment, as described above, the actual behavior of the robot 1 is regarded as follows: the robot moves in accordance with the desired gait on the estimated support leg coordinate system, which is a local coordinate system describing the motion of the robot 1, and the estimated support leg coordinate system, together with the robot, rotates about the predetermined posture rotation center determined by the posture rotation center determining means (the processing of S2006), with the posture rotation deviation change amount (or the vertical axis component of the posture rotation deviation change amount) as the change amount.
- the first embodiment described above corresponds to the first to fifth inventions and the seventh and eighth inventions of the present invention.
- FIG. 12 is a flowchart showing the self-position / posture estimation processing in the second embodiment.
- the self-position estimation process according to the second embodiment described below is a process performed when the robot 1 is walking.
- the detected value (angular velocity detection value) of the gyro sensor provided in the inclination sensor 54 of the body 3 is integrated by an integrator to obtain the estimated body posture. Then, in S2102, similarly to the first embodiment, it is determined whether or not the landing has been confirmed.
- when the landing is confirmed, the process proceeds to S2104, and, as in the first embodiment, the difference between the estimated body posture viewed from the global coordinate system and the body posture of the desired gait viewed from the global coordinate system is calculated as the posture rotation deviation change amount.
- next, the process proceeds to S2108, and, as in the first embodiment, the position and posture obtained by rotating the current estimated support leg coordinate system about the posture rotation center by the posture rotation deviation change amount are determined anew as the current estimated support leg coordinate system.
- Figure 13 illustrates the current estimated supporting leg coordinate system before rotation and the current estimated supporting leg coordinate system after rotation.
- next, the process proceeds to S2110, and the position/posture of the actual free leg foot at the time of landing in the current estimated support leg coordinate system (the position/posture of the front foot 22 in the example of Fig. 13) is estimated by a kinematics calculation based on at least the detected joint displacement values.
- hereinafter, the estimated position/posture of the actual free leg foot at the time of landing will be referred to as the estimated landing free leg foot position/posture.
- here, it is assumed that, on the current estimated support leg coordinate system, the robot 1 maintains the body posture of the target gait and moves in accordance with the detected joint displacement values without the support leg foot 22 slipping on the virtual floor of the current estimated support leg coordinate system.
- to supplement, the robot 1 is assumed to maintain the body posture of the desired gait on the current estimated support leg coordinate system rotated about the posture rotation center by the posture rotation deviation change amount; the reason is that, at this moment, the body posture thus assumed on that coordinate system coincides with the estimated body posture in the global coordinate system.
- therefore, when the current estimated support leg coordinate system is rotated only by the component about the vertical axis (spin component) of the posture rotation deviation change amount instead of by the whole difference, it is assumed that, on the current estimated support leg coordinate system, the body inclination of the robot 1 coincides with the inclination component of the posture rotation deviation change amount, and that the robot moves in accordance with the detected joint displacement values without the support leg foot 22 slipping on the virtual floor of the current estimated support leg coordinate system.
- instead of using the detected joint displacement values as they are, the amounts of deformation of the deformation mechanisms of the foot 22 (the elastic member 106, the ground contact member (foot sole elastic body) 71, and the bag-shaped member 109 shown in Figs. 3 and 4) may be obtained using a dynamic model of the deformation mechanisms (a spring-damper model or the like), and the estimated landing free leg foot position/posture may be obtained including (taking into account) those deformation amounts.
- more specifically, as with the robot 1 shown in Fig. 13, assuming the whole-body configuration in which the body posture is the target body posture on the current estimated support leg coordinate system and the joint displacements are the joint displacement detection values, the deformations of the deformation mechanisms may be estimated from the floor reaction force detection values at that moment, and the estimated landing free leg foot position/posture may be obtained for the whole configuration of the robot 1 including those deformations.
- alternatively, a load acting on the speed reducers and links may be estimated using a disturbance observer based on the motor current commands or detected values, the deformations of the speed reducers and links may be estimated based on the estimated load, and the estimated landing free leg foot position/posture may be obtained including (taking into account) the deformations of the speed reducers and links.
- next, the process proceeds to S2112, and, based on the estimated landing free leg foot position/posture, the relative position/posture of the next time gait estimation support leg coordinate system with respect to the current estimated support leg coordinate system is obtained as shown in Fig. 13.
- the correspondence between the estimated landing free leg foot position/posture (that is, the landing position and orientation of the support leg foot 22 of the next time gait) and the position and orientation of the next time gait estimation support leg coordinate system is made the same as the correspondence between a support leg foot 22 and its support leg coordinate system described above.
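- the kinematics step of S2110 computes the landing free leg foot pose in the estimated support leg coordinate system from the detected joint displacements. A deliberately reduced planar sketch follows; the link lengths, joint convention (angles measured from the downward vertical), and the 2D reduction are illustrative assumptions, whereas the patent uses the robot's full leg kinematics:

```python
import math

def free_foot_position_2d(hip, q_hip, q_knee, thigh=0.45, shank=0.45):
    """Planar forward-kinematics sketch: position (x, z) of the free leg
    foot (ankle) in the estimated support leg coordinate system, computed
    from detected joint displacements q_hip and q_knee [rad]."""
    knee_x = hip[0] + thigh * math.sin(q_hip)
    knee_z = hip[1] - thigh * math.cos(q_hip)
    foot_x = knee_x + shank * math.sin(q_hip + q_knee)
    foot_z = knee_z - shank * math.cos(q_hip + q_knee)
    return (foot_x, foot_z)

# straight leg hanging from a hip 0.9 m above the virtual floor: the foot
# lies directly below the hip, on the floor
foot = free_foot_position_2d(hip=(0.0, 0.9), q_hip=0.0, q_knee=0.0)
```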
- in the second embodiment, as described above, the actual behavior of the robot 1 is regarded as follows: the robot moves in the body posture of the desired gait on the estimated support leg coordinate system, which is a local coordinate system describing the motion of the robot 1, and the estimated support leg coordinate system, together with the robot, rotates about the predetermined posture rotation center determined by the posture rotation center determining means (the processing of S2106), with the posture rotation deviation change amount (the change amount, during one step, of the difference between the estimated value of the actual posture of the upper body 3 and its target posture) as the rotation amount; on that basis, a new estimated support leg coordinate system is determined based on the free leg position/posture at landing, and the next support leg position/posture, that is, the position and orientation of the landing point of the foot 22, in other words the footprint, is estimated.
- the second embodiment described above corresponds to the first to fifth inventions of the present invention.
- the third embodiment differs from the first embodiment only in the self-position/posture estimation processing of S016 in Fig. 9 described above; the other configuration and processing of the control unit 60 are the same as in the first embodiment.
- FIG. 14 is a flowchart of the self-position estimating process of S 016 in FIG. 9 in the third embodiment.
- in the third embodiment, the posture rotation (or spin) of the estimated support leg coordinate system is estimated and the estimated support leg coordinate system is updated at each control cycle, and the body position (strictly speaking, the position of the representative point of the body) is also estimated at each control cycle.
- first, in S2200, an estimated body posture is obtained as in the first embodiment.
- next, the process proceeds to S2202, in which the difference between the change amount of the estimated body posture viewed from the global coordinate system during the control cycle and the change amount of the body posture of the target gait viewed from the global coordinate system during the control cycle is calculated as the posture rotation deviation change amount.
- that is, whereas in the first and second embodiments the difference between the change amount of the estimated body posture and the change amount of the target gait body posture during the period of one step was obtained as the posture rotation deviation change amount, in the third embodiment the difference between the two change amounts for each control cycle is obtained as the posture rotation deviation change amount.
- next, the process proceeds to S2204, and the posture rotation center is determined as in the first embodiment; specifically, the target ZMP at that moment (the current value of the target ZMP) is set as the posture rotation center.
- next, the process proceeds to S2206, and the position and posture obtained by rotating the current estimated support leg coordinate system (the estimated support leg coordinate system determined in the previous control cycle; the estimated support leg coordinate system at time t−Δt shown in Fig. 15) about the posture rotation center by the posture rotation deviation change amount are determined anew as the current estimated support leg coordinate system (the estimated support leg coordinate system at time t shown in Fig. 15).
- the way of rotation is the same as in the first embodiment.
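The frame update above (rotating the current estimated support leg coordinate system about the posture rotation center by the posture rotation deviation change amount) can be sketched as follows. This is a simplified planar, yaw-only illustration under assumed names, not the patent's full 3D implementation.

```python
import math

def rotate_frame_about_center(frame_pos, frame_yaw, center, dtheta):
    """Rotate a 2D coordinate frame (position + yaw) about a given
    posture rotation center by angle dtheta (the posture rotation
    deviation change amount for this control cycle)."""
    c, s = math.cos(dtheta), math.sin(dtheta)
    dx = frame_pos[0] - center[0]
    dy = frame_pos[1] - center[1]
    new_pos = (center[0] + c * dx - s * dy,
               center[1] + s * dx + c * dy)
    return new_pos, frame_yaw + dtheta
```

In the 3D case the same idea applies with a rotation matrix about the posture rotation center instead of a scalar yaw angle.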
- the process proceeds to S2210, and as in the first embodiment, the next time's gait estimated support leg coordinate system is determined so that the relative position and orientation relationship of the next time's gait estimated support leg coordinate system with respect to the current estimated support leg coordinate system matches the relative position and orientation relationship of the next time's gait support leg coordinate system with respect to the support leg coordinate system in the target gait (previous gait).
- that is, the actual robot behavior is regarded as the robot moving in accordance with the desired gait on the estimated support leg coordinate system, which is the local coordinate system that describes the motion of the robot.
- at each moment, the robot is assumed to have rotated, together with the estimated support leg coordinate system, around the predetermined posture rotation center determined by the posture rotation center determining means (the processing of S2204), by the posture rotation deviation change amount, which is the difference between the body posture change amount (change speed) obtained by the posture detecting means (the processing of S2200) and the body posture change amount (change speed) of the target gait.
- a new estimated support leg coordinate system and an estimated body position are determined (updated) at each moment (for each control cycle), and at the time of landing, the position and orientation of a new estimated support leg coordinate system corresponding to the free leg position and orientation, that is, the position and orientation of a new landing point, or in other words, a new footprint, are estimated for each step (for each landing). This applies in the present embodiment not only when the robot 1 walks but also when the robot 1 runs.
- when determining the posture rotation center in S2204 during running, the posture rotation center may be set to the target ZMP at each moment (its current value) during the one-leg support period, but during the mid-air period in which both legs 2, 2 of the robot 1 float in the air, the posture rotation center is determined to be the position of the overall center of gravity of the robot 1 in the target gait at each moment (its current value).
- in addition, the estimated body position is determined in S2214 based on at least the current estimated support leg coordinate system and the detected joint displacement values.
- specifically, the position of the body with respect to the support leg coordinate system is determined by the same method as that for determining the free leg foot position and orientation, and the estimated body position may be determined so that this positional relationship matches the positional relationship of the estimated body position with respect to the estimated support leg coordinate system.
- the relationship among the estimated body position, the estimated support leg coordinate system, and the detected joint displacement values in this case is as shown in FIG.
- the posture may be obtained, and the relationship between the estimated body position and posture and the estimated supporting leg coordinate system may be made to match this.
- the estimated body position / posture can be determined with higher accuracy.
- the processing from S2110 to S2114 in FIG. 12 in the second embodiment may be executed instead of S2210. This makes it possible to determine the estimated support leg coordinate system at the time of landing more accurately.
- in this case, the processing from S2110 to S2114 in the second embodiment is executed, and in S2214 the estimated body position and posture may be obtained by kinematics calculation based on at least the detected joint displacement values, as described above.
- it is more preferable to determine an estimated landing free leg foot position and posture in consideration of the amount of deformation of the foot 22, and to use this to determine the next time's gait estimated support leg coordinate system.
- the actual behavior of the robot may be regarded as follows: the robot, moving according to at least the detected joint displacement values while maintaining the body posture of the desired gait on the estimated support leg coordinate system, which is the local coordinate system describing the movement of the robot, is considered to have rotated, together with the estimated support leg coordinate system, around the predetermined posture rotation center at each moment determined by the posture rotation center determining means (the processing of S2204), by the posture rotation deviation change amount, which is the difference between the body posture change amount (per control cycle, corresponding to the change speed) obtained by the posture detecting means (the processing of S2200) and the body posture change amount of the target gait (the change amount per control cycle, corresponding to the change speed).
- based on this, a new estimated support leg coordinate system and an estimated body position are determined (updated), and at the time of landing, the position and orientation of a new estimated support leg coordinate system corresponding to the free leg position and posture, that is, the position and orientation of a new landing point, or in other words, a new footprint, may be estimated for each step (for each landing).
- the third embodiment described above corresponds to the first to sixth inventions of the present invention, as well as the eighth and ninth inventions.
- the estimated body position determined as in the third embodiment is hereinafter referred to as “geometrically estimated body position”.
- the geometrically estimated body position is the position viewed from the global coordinate system.
- the estimated body posture obtained in S2200 of the third embodiment is the inertial navigation estimated body posture.
- since the geometrically estimated body position is, as described in the third embodiment, estimated geometrically with the floor as the reference, its estimation accuracy is likely to decrease during the aerial period, such as when running.
- on the other hand, the accuracy of the inertial navigation estimated body position and posture does not depend on whether the robot is walking or running, or on whether it is in the one-leg support phase, the two-leg support phase, or the aerial phase; the short-term accuracy does not decrease, but long-term drift is likely to occur because integration is used. In consideration of these characteristics, in the fourth embodiment, the inertial navigation estimated body position is corrected using the geometrically estimated body position.
- in addition, in the fourth embodiment, the detection drift of the gyro sensor is corrected, and thereby the drift of the tilt component of the estimated body posture is corrected. Furthermore, depending on the situation, drift correction of the gyro sensor detection value in the yaw direction is also performed.
- FIG. 16 is a flowchart of the self-position estimation process in S016 of FIG. 9 in the fourth embodiment
- FIG. 17 is a block diagram of the estimation process.
- the self-position estimation processing of S016 in FIG. 9 in the fourth embodiment will be described.
- first, the detected value of the gyro sensor is integrated to obtain the estimated body posture.
- at that time, the detection drift of the gyro sensor is corrected using the motion acceleration calculated from the motion of the geometrically estimated body position obtained in the previous control cycle (and earlier) and the detection value of the acceleration sensor, and thereby the drift of the tilt component of the estimated body posture is corrected.
- the process of S2300 will now be described in detail. First, the geometrically estimated body acceleration, which is the second-order derivative of the geometrically estimated body position, is calculated based on the motion of the geometrically estimated body position determined in S2302 (described later) in the previous control cycle and before. This processing is executed in block 301 of FIG. 17. The processing for obtaining the geometrically estimated body position is executed in block 300 of FIG. 17. Supplementally, if the body representative point and the position of the acceleration sensor coincide, the estimated body posture is not necessary for calculating the geometrically estimated body acceleration (strictly speaking, the geometrically estimated acceleration of the acceleration sensor position in the body 3).
- the acceleration sensor detection value (body acceleration detection value) is converted into a global coordinate system using the estimated body posture, and a global coordinate system converted value is obtained.
- this processing is executed in block 302 of FIG. 17.
- if they do not coincide, a kinematics calculation may be used to convert the acceleration sensor detection value and the gyro sensor detection value into the acceleration and angular velocity at the body representative point, or to convert the acceleration and angular velocity at the body representative point into the acceleration and angular velocity at the position of the acceleration sensor and the position of the gyro sensor.
- the geometrically estimated body acceleration is subtracted (vector subtraction is performed) from the acceleration sensor detected value global coordinate system converted value.
- the acceleration sensor detection value global coordinate system conversion value includes the acceleration component due to gravity acting on the acceleration sensor, but the geometrically estimated body acceleration does not include the acceleration component due to gravity. Accordingly, the estimated gravity acceleration can be obtained by subtracting the geometrically estimated body acceleration from the acceleration sensor detected value global coordinate system converted value.
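The subtraction above can be sketched as a one-line vector operation; the function name and the tuple representation are illustrative only.

```python
def estimated_gravity_acceleration(acc_sensor_global, geom_body_acc):
    """Estimated gravitational acceleration: the accelerometer reading
    converted to the global frame (which includes the gravity component)
    minus the geometrically estimated body acceleration (which does not).
    The remainder is the gravity vector as seen by the robot."""
    return tuple(a - b for a, b in zip(acc_sensor_global, geom_body_acc))
```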
- Fig. 18(a) shows the state in which there is no error in the estimated body posture,
- Fig. 18(b) shows the state in which there is an error in the estimated body posture (hereinafter, this error is referred to as the estimated body posture error, and its angular magnitude as the estimated body posture error angle).
- the geometrically estimated body acceleration is described in the global coordinate system.
- the actual robot 1 moves so as to follow the target gait on the assumption that the estimated body posture correctly estimates the actual body posture.
- in other words, the robot moves so as to follow the desired gait on the estimated global coordinate system.
- that is, the robot 1, believing that the global coordinate system estimated based on the estimated body position and posture is correct and moving according to the desired gait on the estimated support leg coordinate system, which is the local coordinate system set in that global coordinate system, is considered to have rotated at each moment, together with the estimated support leg coordinate system, around the predetermined posture rotation center determined by the posture rotation center determining means, by the posture rotation deviation change amount, which is the difference between the obtained (detected or estimated) body posture change speed and the desired gait body posture change speed.
- Estimated body posture error angle = angle between assumed gravitational acceleration and estimated gravitational acceleration … Equation 42
- here, the difference between the estimated gravitational acceleration and the assumed gravitational acceleration is called the gravitational acceleration estimation error. If there is no error in the detected value of the acceleration sensor, this error is caused by the error of the estimated body posture, and the relationship of Equation 43 holds between the horizontal component of the gravitational acceleration estimation error as viewed from the global coordinate system estimated by the robot 1 based on the estimated body posture (that is, the component of the estimated gravitational acceleration orthogonal to the assumed gravitational acceleration) and the estimated body posture error angle. Equation 43 expresses the relationship between the longitudinal component (X component) of the gravitational acceleration estimation error and the component of the estimated body posture error angle around the Y axis; when expressing the relationship between the lateral component (Y component) of the gravitational acceleration estimation error and the component of the estimated body posture error angle around the X axis, the minus sign on the right side may be deleted. Here, the gravitational acceleration is positive.
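As a hedged illustration of Equation 42 and of the small-angle relation described here (consistent with Equation 44 given later), the error angle could be computed as below. The exact form of Equation 43 in the original is not reproduced, and the symbols and sign conventions are assumptions.

```python
import math

G = 9.81  # magnitude of the assumed gravitational acceleration [m/s^2]

def error_angle_eq42(est_gravity, assumed_gravity):
    """Equation 42: the estimated body posture error angle is the angle
    between the assumed and the estimated gravitational acceleration."""
    dot = sum(a * b for a, b in zip(est_gravity, assumed_gravity))
    n1 = math.sqrt(sum(a * a for a in est_gravity))
    n2 = math.sqrt(sum(b * b for b in assumed_gravity))
    return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))

def error_about_y_small_angle(grav_err_x):
    """Small-angle reading of the relation described in the text: the
    Y-axis posture error angle equals minus the X (fore-aft) component
    of the gravitational acceleration estimation error divided by G."""
    return -grav_err_x / G
```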
- in the fourth embodiment, the estimated body posture is corrected using the estimated body posture error angle calculated from Equation 42 or Equation 43 so that the estimated body posture error converges to 0.
- specifically, the estimated body posture error angle is calculated from the angle between the assumed gravitational acceleration and the estimated gravitational acceleration at the current moment (strictly, the moment one control cycle earlier).
- alternatively, the estimated body posture error angle may be calculated from the horizontal component of the gravitational acceleration estimation error at the current moment (strictly, the moment one control cycle earlier). The process of calculating the estimated body posture error angle is executed in block 304 of FIG. 17, and the estimated body posture error angle is then converted into the sensor coordinate system (coordinate axes),
- and the value obtained by multiplying the converted value by the integration gain Ka is integrated to obtain the estimated gyro sensor drift (the estimated value of the gyro sensor drift).
- further, the estimated gyro sensor drift is subtracted from the gyro sensor detection value ωin (the detected value of the body angular velocity) to obtain an angular velocity in which the drift has been corrected.
- in this subtraction, the yaw-rate drift is also subtracted as appropriate; this will be described later.
- the angular velocity corrected for this drift is converted into the global coordinate system using the estimated body posture in block 350, thereby obtaining the global body angular velocity ωgl.
- next, the value obtained by multiplying the estimated body posture error angle by the gain Kb in block 309 of FIG. 17 is subtracted from the global body angular velocity ωgl in block 308, and the value after subtraction (the output of block 308) is integrated in block 310 (an integrator) to obtain the new estimated body posture θestm.
- the estimated body posture θestm and the angular velocities are expressed by quaternions, rotation matrices, or Euler angles.
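The drift-correction loop of blocks 304 to 310 can be sketched for a single axis as below. The real processing is three-dimensional, involves the sensor-frame conversion, and may use quaternions or rotation matrices, so this scalar version with an assumed control cycle DT and illustrative gains is only a sketch.

```python
DT = 0.005  # control cycle [s] (assumed value)

class PostureEstimator:
    """Single-axis sketch: the posture error deduced from gravity drives
    both an integral drift estimate (gain Ka) and a proportional
    correction of the angular velocity (gain Kb) before the angular
    velocity is integrated into the estimated posture."""
    def __init__(self, ka=0.1, kb=1.0):
        self.ka, self.kb = ka, kb
        self.drift = 0.0     # estimated gyro sensor drift
        self.theta = 0.0     # estimated body posture (one axis)

    def update(self, omega_in, posture_error):
        self.drift += self.ka * posture_error * DT  # integrate Ka * error
        omega = omega_in - self.drift               # subtract estimated drift
        omega -= self.kb * posture_error            # subtract Kb * error
        self.theta += omega * DT                    # integrate to posture
        return self.theta
```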
- in order for the drift correction to be performed appropriately, it is a precondition that each element of the value (vector) obtained by converting the estimated body posture error angle into the sensor coordinate system,
- that is, each estimated body posture error angle around a sensor detection axis (each sensor-local estimated body posture error angle), is affected only by the drift of the gyro sensor corresponding to that element,
- and is not affected, or is hardly affected, by the drift of the other gyro sensors.
- in other words, it is a precondition that the component of the estimated body posture error angle around the X (or Y) axis is affected by the drift of the X (or Y) axis gyro sensor, but not by the drift of the Y (or X) axis gyro sensor.
- if an estimated body posture error angle component around the X axis arises due to the drift of the X-axis gyro sensor and the body is then suddenly rotated 90 degrees around the Z axis of the sensor coordinate system, the estimated body posture error angle remains accumulated in the global coordinate system; as a result, the component of the estimated body posture error angle around the sensor-local X axis moves to the component around the sensor-local Y axis. Therefore, in order to satisfy the above precondition, it is a necessary condition that the absolute value of the rotation speed around the Z axis of the sensor coordinate system be sufficiently small.
- therefore, when the absolute value of the vertical component ωglz of the global body angular velocity, obtained by converting the angular velocity detected by the gyro sensor into the global coordinate system (or the angular velocity detection value of the Z-axis gyro sensor, which is close to it), is large, it is preferable to reduce the integral gain Ka or set it to 0 (that is, not to perform gyro sensor drift correction).
- similarly, when the absolute value of the component around the vertical axis of the rotation speed of the body 3, obtained based on at least one of the angular velocity detection value of the gyro sensor and the target gait, is large, the gain Ka is preferably set to a small value or set to zero.
- in addition, the actual robot 1 moves so as to follow the desired gait on the estimated global coordinate system on the assumption that the global coordinate system estimated based on the estimated body position and posture is correct. Therefore, while the support leg is in contact with the ground, even if there is a large error in the estimated body posture, the actual body acceleration is not governed by the actual gravitational acceleration and almost matches the geometrically estimated body acceleration on the estimated global coordinate system. On the other hand, in the mid-air period, the actual body acceleration is governed by the actual gravitational acceleration, so that the direction of the geometrically estimated body acceleration on the estimated global coordinate system deviates significantly, and the relationship of Fig. 18(b) does not hold.
- in the mid-air period, the accuracy of the geometrically estimated body position tends to be lower than that of the body position obtained by inertial navigation based on the relatively high-accuracy acceleration sensor and gyro sensor. Therefore, it is better to set the gain Ka to a small value or to 0 in the aerial period as well.
- the gain Ka is preferably set to a small value or set to zero.
- the state in which the foot 22 of the leg 2 is in good contact with the floor refers, specifically, to a state in which at least one of the following conditions, or a plurality of them simultaneously, is satisfied:
- the target ZMP (or the actual floor reaction force center point) is at or near the center of the foot 22
- the above state may be determined based on, for example, the timing (phase) of the desired gait, the desired ZMP, and the detected floor reaction force.
- alternatively, the above state may be determined based on the detection values of a distributed pressure sensor or contact sensors.
- in situations where the absolute value of the component around the vertical axis of the rotation speed of the body 3, obtained based on at least one of the angular velocity detection value of the gyro sensor and the target gait, is large, where the absolute value of the value obtained by subtracting the assumed gravitational acceleration from the acceleration sensor detection value global coordinate system converted value, or the absolute value of the geometrically estimated body acceleration, is large, or in the aerial period, the gain Ka is preferably set to a small value or set to zero.
- the integral gain Ka may be determined according to the instantaneous value of these states or the long-term tendency.
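The gain scheduling described in this passage might be sketched as follows. The threshold values and the boolean contact flags are assumptions for illustration, since the text only states the qualitative tendency: open the gains during full sole contact, close them in the aerial period or during fast rotation about the vertical axis.

```python
def correction_gain(sole_full_contact, aerial, yaw_rate_abs,
                    k_max=1.0, yaw_rate_limit=0.5):
    """Sketch of the gain scheduling for Ka (and similarly Kb): the
    correction gain is opened up only while the sole is in full contact
    with the floor, and is closed to 0 in the aerial period or when the
    body rotates quickly about the vertical axis (large |wglz|)."""
    if aerial or not sole_full_contact:
        return 0.0
    if yaw_rate_abs > yaw_rate_limit:  # suppress drift correction
        return 0.0
    return k_max
```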
- the integral gain Kb is set similarly to the gain Ka.
- in the fourth embodiment, a yaw-rate correction (drift correction in the yaw direction) is also performed depending on the situation, as described below.
- at least one or more of the following conditions are prepared as judgment conditions, and when these judgment conditions are satisfied, it is judged that the yaw-rate correction is to be performed.
- that is, the yaw-rate correction is performed in situations where slippage (rotational slip) does not occur, or hardly occurs, on the contact surface between the support leg foot 22 and the floor.
- the gyro detection values in d) and f) are values obtained by correcting the gyro sensor detection value itself (the attitude angular velocity detection value represented by the raw output of the gyro sensor itself).
- when it is determined that the yaw-rate correction is to be performed, as shown in FIG. 17, in block 312, the body posture is calculated on the assumption that no slip has occurred between the floor and the foot 22 corresponding to the estimated support leg coordinate system stored at the time of the latest landing (hereinafter referred to as the estimated support leg coordinate system at landing), based on that coordinate system and at least one of the target gait, the target body posture, the target joint displacements, and the detected joint displacement values (this body posture is hereinafter referred to as the slip-free estimated body posture). Then, the difference between the estimated body posture and the slip-free estimated body posture is obtained in block 313, and the difference converted into the sensor coordinate system in block 314 is input to block 315 of a feedback control law to determine the yaw-rate drift.
- when the judgment conditions are not satisfied, the input to block 315 of the feedback control law is shut off (switch 316 in FIG. 17 is opened), the immediately preceding yaw-rate drift value is held, and that value is subtracted from the angular velocity sensor detection value ωin.
- in calculating the slip-free estimated body posture, the amount of deformation of the deformation mechanism of the foot 22 (the elastic member 106 shown in FIGS.) may be estimated based on the floor reaction force detection value and/or the floor reaction force of the desired gait, in the same way as when estimating the free leg foot position and posture at landing in S2110 of FIG. 12, and the slip-free estimated body posture may be obtained in consideration of this amount of deformation. Alternatively, the load acting on the speed reducers and links may be estimated using a disturbance observer based on the motor current commands or detection values, the deformation of the speed reducers and links may be estimated based on the estimated load, and the slip-free estimated body posture may be obtained including (in consideration of) that deformation.
- more specifically, the estimated free leg foot position and posture at landing may be obtained using a kinematics calculation.
- next, the process proceeds to S2302, and S2202 through S2214 of FIG. 14 of the third embodiment described above are executed to determine the position and orientation of the estimated support leg coordinate system and the geometrically estimated body position.
- in this case as well, the processes of S2110 to S2114 in the second embodiment may be executed instead of S2210 of FIG. 14.
- the estimated body position / posture may be obtained by kinematics calculation based on at least the detected joint displacement value as described above.
- it is more preferable to determine an estimated landing free leg foot position and posture based on the amount of deformation of the foot 22, and to use this to determine the next time's gait estimated support leg coordinate system.
- the inertial navigation estimated body position and posture are obtained by inertial navigation based on the acceleration sensor and the gyro sensor, and the inertial navigation estimated body position is corrected so that the difference between the geometrically estimated body position and the inertial navigation estimated body position converges to zero.
- specifically, as shown in FIG. 17, the difference between the previous value of the inertial navigation estimated body position Xinertestm and the geometrically estimated body position is obtained in block 317, and the sum of the value obtained by multiplying the time derivative of this difference (or the amount of change of the difference during the control cycle) by a gain Kc and the value obtained by multiplying the difference by a gain Kd is obtained in block 318. That is, the above sum is obtained from the above difference by a PD control law as the feedback control law. Further, the difference between the acceleration sensor detected value global coordinate system converted value output from block 302 and the assumed gravitational acceleration G is obtained in block 319.
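A one-axis sketch of this correction loop, assuming a control cycle DT and illustrative gains Kc and Kd, could look like the following; the actual processing is three-dimensional and uses the block structure of FIG. 17.

```python
DT = 0.005  # control cycle [s] (assumed value)

class InertialNavCorrector:
    """One-axis sketch: the body position is obtained by double
    integration of the gravity-compensated global acceleration, while a
    PD feedback (gains Kc, Kd) on the difference from the geometrically
    estimated body position pulls the long-term drift toward zero."""
    def __init__(self, kc=1.0, kd=10.0):
        self.kc, self.kd = kc, kd
        self.pos = 0.0        # inertial navigation estimated body position
        self.vel = 0.0
        self.prev_diff = None

    def update(self, acc_minus_gravity, geometric_pos):
        diff = self.pos - geometric_pos              # difference block
        if self.prev_diff is None:
            self.prev_diff = diff                    # avoid startup kick
        # PD control law: Kc * d(diff)/dt + Kd * diff
        feedback = self.kc * (diff - self.prev_diff) / DT + self.kd * diff
        self.prev_diff = diff
        acc = acc_minus_gravity - feedback           # corrected acceleration
        self.vel += acc * DT                         # first integration
        self.pos += self.vel * DT                    # second integration
        return self.pos
```

With zero acceleration input, the estimated position converges to the geometrically estimated position, illustrating how the PD feedback removes drift.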
- the gains Kc and Kd are preferably set small or to 0 in situations where large errors in the geometrically estimated body position are likely to occur.
- when the contact area between the bottom surface of the support leg foot 22 and the floor is small, the error of the geometrically estimated body position increases. Therefore, in such a case the gains Kc and Kd are preferably set small or to zero; in particular, when the sole of the foot 22 is in full, flat contact with the floor, the gains Kc and Kd are preferably increased.
- to summarize, the correction gains Ka, Kb, Kc, and Kd should be set high during the period in which the sole is in full contact with the floor, and to 0 or approximately 0 otherwise, as shown in the graph of the correction gain K in Fig. 8(c). Note that the graph of the correction gain K in Fig. 8(c) shows the tendency of the change in the magnitudes of Ka, Kb, Kc, and Kd, not exact values. In Fig. 8(c), the correction gain K is normalized so that its maximum value is 1. Therefore, K may be considered to mean the aperture (attenuator) of the correction gains Ka, Kb, Kc, and Kd.
- in the above description, the estimated body posture is corrected on the basis of the estimated body posture error calculated using Equation 42 or Equation 43, but the estimated body posture may be corrected directly on the basis of the horizontal component of the gravitational acceleration estimation error without using Equation 42 or Equation 43. That is, the following Equation 44 may be used instead of Equation 43.
- Horizontal component of gravitational acceleration estimation error = − estimated body posture error angle × gravitational acceleration … Equation 44
- supplementally, in the mid-air period, the robot 1 as a whole is in a state equivalent to zero gravity, and the output of the acceleration sensor is not affected by the estimated body posture error.
- therefore, if the condition that the model parameters almost match those of the actual robot 1 (hereinafter referred to as condition A) is satisfied, the estimated gravitational acceleration and the assumed gravitational acceleration always almost match. Strictly speaking, the estimated gravitational acceleration and the assumed gravitational acceleration deviate from the true value by the same amount according to the error in the estimated body posture, and as a result the two always almost coincide. Therefore, in the mid-air period, the direction of gravity cannot be estimated.
- however, in that case the calculated estimated body posture error is almost 0, and even without setting the gain Ka small, the correction amount obtained by multiplying the estimated body posture error by the gain Ka is almost zero, so it is unlikely to have a large adverse effect on the estimated body posture.
- FIG. 20 is a flowchart of the self-position estimation process of S016 in FIG. 9 in the fifth embodiment
- FIG. 21 is a block diagram showing the means for estimating the overall center-of-gravity position in the self-position estimation process of S016 in the fifth embodiment.
- an inertial navigation estimated overall center of gravity position XGinertestm described later is determined for each control cycle.
- in FIG. 21, the same components as those in FIG. 17 are denoted by the same reference numerals as in FIG. 17.
- the process proceeds to S2404, and based on the position and orientation of the estimated support leg coordinate system, the target body position and posture (or the geometrically estimated body position) and the detected joint displacement values (or their target values), the overall center-of-gravity position is calculated by kinematics calculation in consideration of the actual behavior (posture rotation) of the robot, as in the third embodiment. This processing is executed in block 330 of FIG. 21.
- the overall center-of-gravity position calculated in this way is called the geometrically estimated overall center-of-gravity position.
- next, the process proceeds to S2406, where the acceleration of the overall center of gravity is calculated by kinematics calculation based on the detected joint angle values (or target values), the estimated body posture (or target body posture), and the acceleration sensor detection value global coordinate system converted value. This processing is executed in block 331 of FIG. 21.
- the acceleration of the overall center of gravity obtained in this manner is referred to as an inertial navigation estimated overall center of gravity acceleration.
- next, the process proceeds to S2408, where the inertial navigation estimated overall center-of-gravity position XGinertestm is obtained by inertial navigation based on the inertial navigation estimated overall center-of-gravity acceleration, and the inertial navigation estimated overall center-of-gravity position is corrected so that the difference between the geometrically estimated overall center-of-gravity position and the inertial navigation estimated overall center-of-gravity position converges to zero. More specifically, as shown in FIG. 21, the difference between the inertial navigation estimated overall center-of-gravity position XGinertestm determined in the previous control cycle and the geometrically estimated overall center-of-gravity position is obtained, and the sum of the value obtained by multiplying the time derivative of this difference (or the amount of change in the difference during the control cycle) by a gain Kc and the value obtained by multiplying the difference by a gain Kd is obtained in block 333.
- that is, the above sum is obtained from the above difference by a PD control law as the feedback control law. Then, the value obtained by subtracting, in block 334, the above sum (the output of block 333) from the inertial navigation estimated overall center-of-gravity acceleration output from block 331 is second-order integrated to determine the new inertial navigation estimated overall center-of-gravity position XGinertestm.
- the inertial navigation estimated overall center of gravity acceleration may be determined.
- since the robot 1 receives no external force from the environment in the aerial period, the overall center of gravity undergoes a parabolic motion, so that the position of the overall center of gravity can be estimated more accurately.
- whether or not the aerial period is attained may be determined by using at least one of a target gait timing (phase), a floor reaction force of the target gait, and a floor reaction force detection value.
- for example, whether it is the airborne period may be determined based on whether or not the floor reaction force detection value is equal to or less than a predetermined value.
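A minimal sketch of the aerial-period logic described here; the threshold value and the planar parabola model are assumptions for illustration.

```python
G = 9.81  # gravitational acceleration [m/s^2]

def is_aerial(floor_reaction_force, threshold=30.0):
    """Assumed simple test from the text: a detected floor reaction
    force (vertical component, [N]) at or below a predetermined
    threshold marks the aerial period; the threshold is illustrative."""
    return floor_reaction_force <= threshold

def aerial_cog(p0, v0, t):
    """During the aerial period no external force acts on the robot, so
    the overall center of gravity follows a parabola (planar, z-up):
    constant-velocity horizontal motion plus free fall vertically."""
    (x0, z0), (vx, vz) = p0, v0
    return (x0 + vx * t, z0 + vz * t - 0.5 * G * t * t)
```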
- the fifth embodiment described above corresponds to the first to sixth inventions and the eighth to 22nd inventions of the present invention.
- FIG. 22 is a block diagram showing the means for estimating the overall center-of-gravity position in the self-position estimation process of S016 in FIG. 9 in the sixth embodiment.
- in the sixth embodiment, the dynamically estimated overall center-of-gravity position is calculated based on the floor reaction force detection values of the six-axis force sensor (floor reaction force sensor) 50. Except for this, the sixth embodiment is the same as the fifth embodiment. In FIG. 22, the same reference numerals as in FIG. 21 are used for the same components as those in FIG. 21.
- specifically, the detection value of the floor reaction force sensor (the six-axis force sensor 50) is converted into a value in the global coordinate system in block 340, using the detected joint angle values (or target values) and the estimated body posture (or target body posture).
- the converted floor reaction force sensor detected value is called the floor reaction force sensor detected value global coordinate system converted value.
- the overall center-of-gravity acceleration obtained in this manner is referred to as a dynamically estimated overall center-of-gravity acceleration.
- the dynamically estimated overall center-of-gravity position is obtained by second-order integration of the dynamically estimated overall center-of-gravity acceleration, and is corrected so that the difference between the geometrically estimated overall center-of-gravity position and the dynamically estimated overall center-of-gravity position converges to 0.
- the difference between the dynamically estimated overall center-of-gravity position determined in the previous control cycle and the geometrically estimated overall center-of-gravity position is obtained in block 343. Then, the sum of the value obtained by multiplying the time differential of this difference (or the change in the difference during a control cycle) by a gain Kc and the value obtained by multiplying the difference by a gain Kd is obtained in block 344. That is, the sum is obtained from the difference by the PD control law as a feedback control law.
- the value obtained by subtracting the above sum (the output of block 344) from the dynamically estimated overall center-of-gravity acceleration, which is the output of block 342, in block 345 is second-order integrated in block 346, and a new dynamically estimated overall center-of-gravity position XGinertestm is determined.
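The chain of blocks described above (force-sensor reading converted to a center-of-gravity acceleration, a PD feedback term computed from the difference against the geometric estimate, and double integration) can be sketched in a simplified one-dimensional form. The function name, gains, time step, and the vertical-axis simplification are illustrative assumptions, not values from the patent.

```python
def update_dynamic_cog(x_est, v_est, prev_diff, force, mass, x_geom,
                       Kc=2.0, Kd=10.0, g=9.81, dt=0.005):
    """One control cycle of the dynamically estimated overall COG position
    (vertical axis only). Returns the new position, velocity, and the
    difference to reuse as prev_diff in the next cycle."""
    a_dyn = force / mass - g                               # COG acceleration from the force sensor
    diff = x_est - x_geom                                  # error vs. the geometric estimate
    correction = Kc * (diff - prev_diff) / dt + Kd * diff  # PD feedback law
    a_corr = a_dyn - correction                            # subtract the PD sum (cf. block 345)
    v_new = v_est + a_corr * dt                            # first integration
    x_new = x_est + v_new * dt                             # second integration
    return x_new, v_new, diff
```

Iterating this while the sensed force balances gravity drives the estimate toward the geometric value, mirroring the convergence behavior described above.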
- in a further block, the body position is estimated based on the above-mentioned dynamically estimated overall center-of-gravity position, the joint angle detection values (or target values), and the estimated body posture (or target body posture).
- the means for estimating the overall center-of-gravity position and the body position in the self-position estimation processing of S016 in the sixth embodiment has been described above.
- the dynamic estimation overall center of gravity acceleration may be determined.
- in the aerial period the entire center of gravity undergoes parabolic motion, so the position of the overall center of gravity can be estimated more accurately.
- whether or not the robot is in the aerial period may be determined using at least one of the target gait timing (phase), the floor reaction force of the target gait, and the floor reaction force detection value.
- for example, whether the robot is in the aerial period may be judged by whether or not the floor reaction force detection value is equal to or less than a predetermined value.
- the sixth embodiment described above corresponds to the first to sixth inventions, the eighth and ninth inventions, and the 23rd to 30th inventions of the present invention.
- supplementary remarks on the posture rotation center determined in each of the above embodiments are given below.
- a so-called supporting polygon (the minimum convex polygon including the ground contact surface, i.e., the possible existence range of the ZMP or of the central point of the total floor reaction force)
- strictly speaking, any contact point other than the posture rotation center point slips; however, since the actual sole of the foot 22 is made of an elastic body such as rubber, it is considered that no slippage occurs near the posture rotation center.
- in the aerial period, the robot 1 is considered to rotate about its overall center of gravity.
- the posture rotation center is considered to lie between the supporting polygon and the position of the overall center of gravity (or the position of the body representative point).
- the attitude rotation center determined in S2204 (see FIG. 14) executed in the third and subsequent embodiments is generally one of the following.
- the posture rotation center should be set within the supporting polygon; specifically, it may be set as in a) or b) above. Alternatively, the predetermined point in c) above may be set so that the posture rotation center is included in the supporting polygon, for example, at the origin of the supporting leg coordinate system (usually below the ankle joint).
- the posture rotation center is preferably set as in d) above; however, since the overall center-of-gravity position lies near the position of the body representative point, it may also be set as in e) above.
- a posture rotation center point on the surface of, or inside, the smallest convex body including the region swept by the entire robot 1 during the most recent step.
- a posture rotation center point on the surface of, or inside, the smallest convex body including the entire robot 1 at a certain moment, for example, at the moment of landing.
- in the fourth embodiment, a correction input (the output of block 318 in FIG. 17) was additionally fed into the integrator (block 320 in FIG. 17) that calculates the inertial navigation estimated body position, so that the inertial navigation estimated body position was corrected in every control cycle.
- instead, the inertial navigation estimated body position may be determined without correction for a certain period, and an interpolated estimated body position, obtained as a weighted average of the inertial navigation estimated body position and the geometrically estimated body position by Equation 45 below, may be obtained and output as the finally determined estimated body position.
- in Equation 45, the interpolation weight gradually changes from 1 to 0 during the predetermined period.
- the predetermined period is set to a period other than the aerial period.
- similarly, an interpolated estimated overall center-of-gravity position may be determined by interpolating between the inertial navigation (or dynamically) estimated overall center-of-gravity position and the geometrically estimated overall center-of-gravity position.
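The weighted average of Equation 45, with the weight ramping from 1 to 0 over the predetermined period, can be sketched as follows for a position vector; the function name and the symbol alpha for the weight are assumptions.

```python
def interpolated_body_position(inertial_pos, geometric_pos, alpha):
    """Weighted average in the spirit of Equation 45: weight alpha on the
    inertial-navigation estimate, ramped from 1 to 0 over the predetermined
    (non-aerial) period, so the output slides toward the geometric estimate."""
    return [alpha * xi + (1.0 - alpha) * xg
            for xi, xg in zip(inertial_pos, geometric_pos)]
```

At alpha = 1 the output is purely the inertial estimate; at alpha = 0 it has converged to the geometric estimate, which is why the ramp ends the uncorrected period smoothly.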
- the component to be corrected of the inertial navigation estimated body position, the inertial navigation estimated center-of-gravity position, or the dynamically estimated center-of-gravity position may be limited to the vertical position component (the component perpendicular to the floor surface) only; alternatively, only the horizontal component may be corrected.
- the values of the correction gains Ka, Kb, Kc, and Kd may be determined using the stationary Kalman filter or the non-stationary Kalman filter method.
- however, the characteristics of the system noise (disturbance) and the observation noise do not sufficiently satisfy the preconditions of the Kalman filter, so it does not always produce a sufficient effect.
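As one illustration of how a stationary Kalman filter could supply such a gain, the sketch below iterates the scalar Riccati recursion for a random-walk state with a noisy observation until the gain settles; the noise model and the values used are assumptions, not taken from the patent.

```python
def stationary_kalman_gain(q: float, r: float, iters: int = 500) -> float:
    """Steady-state Kalman gain for x_k = x_{k-1} + w (var q),
    z_k = x_k + v (var r), found by iterating the covariance recursion."""
    p = 1.0  # initial error covariance (arbitrary positive value)
    k = 0.0
    for _ in range(iters):
        p_pred = p + q              # predict: covariance grows by process noise
        k = p_pred / (p_pred + r)   # Kalman gain
        p = (1.0 - k) * p_pred      # update: covariance shrinks after measurement
    return k
```

A small process-noise-to-observation-noise ratio yields a small gain, which matches the intuition that a slowly drifting estimate should be corrected only gently.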
- the acceleration sensor and/or the gyro sensor may be mounted on (built into) a part other than the body 3, for example, the head 4.
- in that case, kinematics calculations based on the displacement command (target displacement) or detected displacement of the neck joint may be performed on the detected values of the acceleration sensor and/or the gyro sensor to convert them into the acceleration and angular acceleration of the body representative point; the self-position may then be estimated in the same manner as in the above-described embodiments.
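The kinematics conversion mentioned above can be sketched for a pan/tilt neck as a pair of rotations of the head-frame accelerometer reading into the body frame; the rotation convention is an assumption, and lever-arm terms (joint offset times angular acceleration, centripetal terms) are omitted for brevity.

```python
import math

def head_accel_to_body(acc_head, pan, tilt):
    """Rotate a head-frame acceleration vector into the body frame.
    pan: neck rotation about the body z-axis; tilt: rotation about the y-axis."""
    ax, ay, az = acc_head
    # undo the tilt (rotation about y)
    cx, sx = math.cos(tilt), math.sin(tilt)
    ax1 = cx * ax + sx * az
    az1 = -sx * ax + cx * az
    # undo the pan (rotation about z)
    cz, sz = math.cos(pan), math.sin(pan)
    bx = cz * ax1 - sz * ay
    by = sz * ax1 + cz * ay
    return (bx, by, az1)
```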
- depending on the component of the estimated position/posture (vertical position component, horizontal position component, direction component about the vertical axis, or inclination component), different ones of the self-position estimation processes of the first to sixth embodiments may be selected.
- for example, the horizontal position of the body 3 may be estimated geometrically according to the self-position estimation processing of the third embodiment, while the vertical position is estimated by inertial navigation according to the self-position estimation processing of the fourth embodiment, with the inertial navigation estimated body vertical position corrected using the geometrically estimated body vertical position.
- as the joint displacement used here, the joint displacement of the target gait, the detected joint displacement value, or a weighted average of the two may be used; the weight at this time may have a frequency characteristic.
- the estimated positions/postures such as the estimated supporting leg coordinate system and the estimated body position/posture may be expressed as perturbations from the target position/posture, instead of being referenced to the global coordinate system as in the above embodiments.
- the estimated body posture error angle in the global coordinate system is determined based on the estimated gravitational acceleration in the global coordinate system, as shown in FIG.
- the estimated body posture was corrected by additionally inputting the value obtained by multiplying the error angle by the gain Kb into the integrator (block 310 in FIG. 17) that integrates the global body angular velocity ωgl. That is, the estimated body posture is corrected in the global coordinate system; instead, it may be corrected in the local coordinate system of the gyro sensor (the coordinate system fixed to the body 3). Specifically, in FIG. 17, the gain-Kb block 309 and the adder (block 308) that subtracts its output from the global body angular velocity ωgl may be deleted, and the integrator Ka/S (block 306) may be changed to Ka/S + Kb, that is, to a block of the PI control law.
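A discrete-time sketch of the Ka/S + Kb (PI control law) block is given below; the Euler discretization, the gains, and the step size are illustrative assumptions.

```python
def make_pi_corrector(Ka=0.1, Kb=1.0, dt=0.005):
    """Discrete Ka/S + Kb block: a closure holding the integral state,
    applied to the posture error angle each control cycle."""
    state = {"integral": 0.0}

    def correct(error_angle: float) -> float:
        state["integral"] += Ka * error_angle * dt   # Ka/S term (Euler integral)
        return state["integral"] + Kb * error_angle  # plus the proportional Kb term

    return correct
```

Folding the Kb path into the integrator block in this way gives the same correction signal while keeping the whole computation in one place, which is the simplification the text describes.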
- the estimated gravitational acceleration may be obtained by subtracting the body acceleration of the target gait from the acceleration sensor detected value global coordinate system converted value.
- a posture rotation about the posture rotation center occurs in the robot that is attempting to move according to the desired gait, and the body posture deviates from the desired gait due to this posture rotation.
- the posture inclination component of the posture rotation is 0 on average, even though it oscillates back and forth and left and right.
- the spin component of the posture rotation is also close to 0 on average, because the spin direction reverses with each step. Therefore, excluding forces such as centrifugal force that act in almost the same direction regardless of the direction of the posture rotation, the positive and negative effects of the posture rotation cancel each other, and the long-term effect is almost zero.
- “long term” refers to a time longer than the settling time for correcting the estimated body posture.
- the detection drift of the gyro sensor may be corrected using the motion acceleration calculated from the geometrically estimated body position/posture motion and the detected value of the acceleration sensor. Also, in the first to third embodiments, as in the fourth embodiment, the drift of the inclination component of the estimated body posture may be corrected, and the yaw-rate correction may likewise be performed.
- FIGS. 23 and 24 show the internal structure of the head 4 of the robot 1 in the seventh embodiment.
- FIG. 23 is a front view
- FIG. 24 is a side view.
- the head 4 is connected to the upper part of the body 3 via a neck joint 120 that rotates in the pan and tilt directions.
- as shown in the figures, the neck joint 120 is equipped with motors 121, 122 with encoders (joint displacement detectors) and reduction gears 123, 124, and is controlled so as to follow joint displacement commands from the control unit 60 via a motor control device (not shown).
- the head 4 is provided with two left and right video cameras 125, 125 as environment recognition means so that an object can be viewed stereoscopically.
- the outputs (imaging information) of the video cameras 125, 125 are input to the control unit 60, and the control unit 60 recognizes the distance to an object from the imaging information.
- the following environment recognition means may be provided.
- Non-contact multi-point distance measuring devices such as range finder and scanning laser distance meter
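For the two-camera stereo arrangement, distance can be recovered from disparity with the pinhole relation z = f * B / d; the focal length and baseline below are illustrative assumptions, not parameters of the robot 1.

```python
def stereo_depth(disparity_px: float, focal_px: float = 500.0,
                 baseline_m: float = 0.12) -> float:
    """Depth (m) from stereo disparity (px) for a rectified pinhole pair:
    z = focal_length * baseline / disparity."""
    if disparity_px <= 0:
        raise ValueError("object at infinity or unmatched")
    return focal_px * baseline_m / disparity_px
```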
- the seventh embodiment differs from the fourth embodiment in the self-position estimation processing of S016 in FIG. 9, in addition to the configuration of the head 4 described above.
- Other configurations and processes are the same as in the fourth embodiment. Note that other configurations and processes may be the same as those of the fifth or sixth embodiment, for example, instead of the fourth embodiment.
- FIG. 25 is a flowchart showing the self-position estimation process of S016 of FIG. 9 in the seventh embodiment.
- roughly speaking, gaze control is performed in which the neck joint 120 is controlled, using the estimated self-position/posture as in the fourth embodiment, so that an object is positioned at the center or another appropriate position of the images of the video cameras 125, 125 (hereinafter simply referred to as camera images), thereby controlling the direction of the video cameras 125, 125.
- in addition, a landmark or the like whose exact position in the global coordinate system is stored in advance is recognized by the video cameras 125, 125 (or a range finder or the like), and the estimated self-position/posture is corrected on that basis.
- further, a certain object is recognized by the video cameras 125, 125, and the position and posture, or the shape, of the object in the global coordinate system is recognized from the information obtained by the video cameras 125, 125 and the estimated self-position/posture.
- processing from S2500 to S2504 is the same as that of S2300 to S2304 in the fourth embodiment.
- in order to determine which object to look at, the map information describes, for each object, the area in which the object can be observed and the relative positional relationship between the robot 1 and the object can be measured (hereinafter referred to as the observable area). Based on the current estimated body posture, the inertial navigation estimated body position, and the future target route, it is then predicted whether the robot is, or will be, in any observable area during the current and a future specified period; if it is expected to remain in the observable area for a while, it is determined that gaze control should be performed.
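The observable-area prediction described above can be sketched as follows, modeling the observable area, as an assumption, as an axis-aligned rectangle and the future route as straight-line motion at the current velocity.

```python
def should_gaze(pos, vel, observable_rect, horizon_s=1.0, steps=10):
    """Predict body positions over a short horizon and decide to gaze only
    if every predicted position stays inside the observable area.
    pos, vel: (x, y); observable_rect: (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = observable_rect
    for i in range(steps + 1):
        t = horizon_s * i / steps
        x, y = pos[0] + vel[0] * t, pos[1] + vel[1] * t
        if not (xmin <= x <= xmax and ymin <= y <= ymax):
            return False
    return True
```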
- the target object to be gazed at may be specified in advance by the action planning section of the robot 1 (a layer above the movement control section, the portion shown in FIG. 6, of the robot 1; not shown), may be determined according to the travel route, or may be specified by an operator via a man-machine interface.
- the map information consists of information such as the positions, shape characteristics, and observable areas of landmarks, obstacles, departure points, destinations, movable areas, roads, and the like, and is assumed to be stored in memory before the robot moves. The process then proceeds to S2508, in which the relative relationship between the position of the gaze target object on the map and the estimated body position/posture is obtained.
- the control unit 60 outputs the target neck joint displacements to the motor control devices (motor drivers) of the neck joint motors 121 and 122, and the motor control devices control the motors 121, 122 accordingly.
- next, the process proceeds to S2514, where it is determined whether or not the object is a landmark, based on the attribute information of the retrieved object.
- the process proceeds to S2516, where the position and posture in the global coordinate system of the support leg coordinate system corresponding to the current actual support leg foot position/posture of the robot 1 (hereinafter referred to as the visually estimated support leg coordinate system) are estimated by geometric calculation from the camera images, the map information, the estimated body position/posture, and the neck joint displacements.
- next, the difference between the position/posture of the visually estimated support leg coordinate system and the position/posture of the estimated support leg coordinate system is determined.
- then, the product of this difference and a predetermined gain Ke is obtained, and the position/posture obtained by adding the product to the position/posture of the estimated support leg coordinate system is used as the new position/posture of the estimated support leg coordinate system.
- that is, a new estimated supporting leg coordinate system is determined by Equation 46 below. In other words, the position/posture of the new estimated supporting leg coordinate system is determined by interpolation (that is, internal division or weighted average) of the position/posture of the visually estimated supporting leg coordinate system and the position/posture of the estimated supporting leg coordinate system.
- the gain Ke may be determined based on the certainty of each estimated value, so that the new estimated supporting leg coordinate system has the highest probability of being the true value.
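Equation 46, the gain-Ke blend of the estimated and visually estimated supporting leg coordinate systems, reduces to the following interpolation for the position components; the vector representation is an assumption.

```python
def update_estimated_support_leg(est_pos, visual_pos, Ke=0.3):
    """New estimate = old estimate + Ke * (visual estimate - old estimate),
    i.e. an internal division between the two estimates with weight Ke."""
    return [e + Ke * (v - e) for e, v in zip(est_pos, visual_pos)]
```

With Ke = 0 the visual measurement is ignored; with Ke = 1 it replaces the estimate outright; intermediate values trade responsiveness against noise, which is the certainty-based tuning the text mentions.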
- the process proceeds to S2520, and the shape and position of the target object are calculated by geometric calculation from the camera images, the map information, the estimated body position/posture, and the neck joint displacements. The calculated object shape/position is referred to as the visually estimated object shape/position.
- next, the process proceeds to S2522, and the object shape/position in the map information is corrected based on the visually estimated object shape/position. Specifically, the object shape/position of the map information is modified so that the difference between the visually estimated object shape/position and the object shape/position of the map information converges to 0.
- specifically, the difference between the visually estimated object shape/position and the object shape/position of the map information is obtained.
- then, the product of the difference and a predetermined gain Kf is obtained, and the shape/position obtained by adding the product to the object shape/position of the map information is set as the object shape/position of the new map information.
- that is, the object shape/position of the new map information is determined by Equation 47 below.
- in other words, the object shape/position of the new map information is determined by interpolation (that is, internal division or weighted average) of the visually estimated object shape/position and the object shape/position of the map information.
- supplementally, the position of the visually estimated support leg coordinate system is estimated based on the inertial navigation estimated body position, and the inertial navigation estimated body position is determined so as to converge to the geometrically estimated body position/posture based on the estimated support leg coordinate system. Therefore, when the position/posture of the estimated support leg coordinate system is changed by the processing of S2518, the position of the visually estimated support leg coordinate system also changes, with a delay. Consequently, in the processing of S2518, oscillation may occur if the gain Ke is made too large in an attempt to speed up convergence.
- to prevent this, when the position/posture of the estimated support leg coordinate system is changed, the inertial navigation estimated body position may also be corrected directly, so as to shorten the delay with which the position of the visually estimated support leg coordinate system changes.
- the seventh embodiment described above corresponds to the thirty-first to thirty-seventh inventions of the present invention.
- the actual movement of the robot 1 is made according to the desired gait on the estimated supporting leg coordinate system, which is a local coordinate system describing the movement of the robot 1.
- the robot 1 in motion is assumed to have rotated, together with the estimated supporting leg coordinate system, about a predetermined posture rotation center, by the difference between the posture detected or estimated by the posture detecting means and the target posture (or the vertical-axis component of that difference) during one step. On this assumption, the position/posture of the new estimated supporting leg coordinate system corresponding to the swing leg position/posture at landing, that is, the position and direction of the landing point, is estimated for each step (at each landing), so the position and direction of the landing point can be accurately estimated.
- therefore, the position and orientation of the landing point can be estimated accurately even when the acceleration fluctuates violently in the horizontal or vertical direction, when all the legs 2, 2 are off the floor as in the aerial period during running, or when rotational slip occurs between the feet 22 and the floor and a posture rotation (or spin) arises in the entire robot 1.
- in the third embodiment, the actual behavior of the robot 1 is regarded as that of a robot which, on the estimated supporting leg coordinate system (a local coordinate system describing the motion of the robot 1), moves at least according to the detected joint displacement values while maintaining the body posture of the desired gait, and which is assumed to have rotated, together with the estimated supporting leg coordinate system, about the posture rotation center determined by the posture rotation center determining means, by the difference between the posture detected or estimated by the posture detecting means and the target posture during one step. On this assumption, the position/posture of the new estimated supporting leg coordinate system corresponding to the swing leg position/posture at landing, that is, the position and direction of the landing point, is estimated for each step (at each landing), so the position and orientation of the landing point can be estimated with higher accuracy.
- in the fourth embodiment, the actual behavior of the robot 1 is regarded as that of a robot which is moving according to the target gait on the estimated supporting leg coordinate system (a local coordinate system describing the motion of the robot 1), and which is assumed to have rotated, together with the estimated supporting leg coordinate system, about the predetermined posture rotation center determined at each instant by the posture rotation center determining means, at the rate of change of the difference in the body posture change rate. On this assumption, a new estimated supporting leg coordinate system and estimated body position/posture are determined (updated) at each instant (in each control cycle).
- in addition, the position/posture of the new estimated supporting leg coordinate system corresponding to the swing leg position/posture at landing, that is, the position and orientation of the new landing point, is estimated for each step (at each landing); therefore, the position and orientation of the landing point can be estimated more accurately, and the position and orientation of the robot 1 can be estimated continuously and accurately.
- further, the estimated body position (inertial navigation estimated body position) and the estimated body posture are obtained by inertial navigation, and are corrected using the geometrically estimated body position, which is the estimated body position determined as in the third embodiment; therefore, the self-position/posture of the robot 1 and the landing position and orientation (the position/orientation of the estimated supporting leg coordinate system) can be estimated more accurately.
- the estimated overall center-of-gravity position by inertial navigation is determined in each control cycle, and the self-position/posture of the robot 1 and the landing position/orientation (the position/orientation of the estimated supporting leg coordinate system) can be accurately estimated, as in the fourth embodiment.
- the dynamic estimated overall center of gravity position is calculated based on the floor reaction force sensor detected value.
- the self-position/posture of the robot 1 and the landing position/orientation (the position/orientation of the estimated supporting leg coordinate system) are estimated continuously (in each control cycle) by using a geometric self-position estimation operation, or by using a geometric self-position estimation operation and an inertial navigation operation in combination.
- based on the estimated self-position/posture and the positions in the global coordinate system, stored in advance on the map, of objects such as the floor and obstacles, the direction to be gazed at by the environment recognition means such as the video cameras 125, 125 mounted on the robot 1 is determined; the object can therefore be continuously captured at the center or another appropriate position of the image of the environment recognition means.
- when the environment recognition means recognizes the object, the position and posture, or the shape, of the object in the global coordinate system can be accurately recognized from the information obtained by the environment recognition means and the estimated self-position/posture.
- accurate position information on an object such as a landmark may be stored in advance; the self-position/posture estimated by the geometric self-position estimation operation, or by the geometric self-position estimation operation and the inertial navigation operation in combination, can then be corrected using the stored position information of the object and the relative position information of the robot 1 with respect to the object obtained by the environment recognition means, so that the accuracy of the self-position/posture estimation can be improved.
- since the self-position/posture is corrected based on a plurality of continuous images, the correction is less susceptible to noise and erroneous recognition.
- the posture inclination of the robot 1 can be accurately estimated without being affected by the motion acceleration. This can also be applied to the other embodiments.
- the yaw rate is corrected, and the posture (particularly the horizontal direction component) and the position of the robot 1 can be estimated more accurately. This can also be applied to the other embodiments.

Industrial applicability
- the present invention is useful in providing a technique capable of accurately estimating the position of a legged mobile robot such as a biped mobile robot.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/512,263 US7337040B2 (en) | 2002-04-26 | 2003-04-28 | Self-position estimating device for leg type movable robots |
KR1020047017293A KR100956520B1 (ko) | 2002-04-26 | 2003-04-28 | 다리식 이동 로봇의 자기위치 추정 장치 |
DE60332224T DE60332224D1 (de) | 2002-04-26 | 2003-04-28 | Selbstpositionsabschätzvorrichtung für mobile roboter mit beinen |
EP03725698A EP1502712B1 (en) | 2002-04-26 | 2003-04-28 | Self-position estimating device for leg type movable robots |
JP2004501989A JP4246696B2 (ja) | 2002-04-26 | 2003-04-28 | 脚式移動ロボットの自己位置推定装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2002127066 | 2002-04-26 | ||
JP2002-127066 | 2002-04-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2003090980A1 true WO2003090980A1 (fr) | 2003-11-06 |
Family
ID=29267632
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2003/005449 WO2003090981A1 (fr) | 2002-04-26 | 2003-04-28 | Systeme permettant d'estimer l'attitude d'un robot mobile monte sur des jambes |
PCT/JP2003/005448 WO2003090980A1 (fr) | 2002-04-26 | 2003-04-28 | Dispositif d'estimation automatique de position pour robots mobiles montes sur des jambes |
PCT/JP2003/005447 WO2003090979A1 (fr) | 2002-04-26 | 2003-04-28 | Systeme permettant d'estimer l'attitude d'un robot mobile monte sur des jambes |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2003/005449 WO2003090981A1 (fr) | 2002-04-26 | 2003-04-28 | Systeme permettant d'estimer l'attitude d'un robot mobile monte sur des jambes |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2003/005447 WO2003090979A1 (fr) | 2002-04-26 | 2003-04-28 | Systeme permettant d'estimer l'attitude d'un robot mobile monte sur des jambes |
Country Status (6)
Country | Link |
---|---|
US (3) | US6963185B2 (ja) |
EP (5) | EP1502712B1 (ja) |
JP (3) | JP4181113B2 (ja) |
KR (3) | KR100956520B1 (ja) |
DE (5) | DE60332224D1 (ja) |
WO (3) | WO2003090981A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008284690A (ja) * | 2003-03-27 | 2008-11-27 | Sony Corp | ロボット装置及びロボット装置の制御方法 |
JP2008309781A (ja) * | 2007-05-11 | 2008-12-25 | Commiss Energ Atom | 連接構造の動作をキャプチャするための処理方法 |
US7541764B2 (en) | 2003-11-27 | 2009-06-02 | Honda Motor Co., Ltd. | Control system for mobile body |
JP2011230213A (ja) * | 2010-04-26 | 2011-11-17 | Honda Motor Co Ltd | ロボット、制御システムおよび制御プログラム |
Families Citing this family (148)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3726058B2 (ja) * | 2001-12-28 | 2005-12-14 | 本田技研工業株式会社 | 脚式移動ロボットおよびその床反力検出装置 |
WO2003057423A1 (fr) * | 2001-12-28 | 2003-07-17 | Honda Giken Kogyo Kabushiki Kaisha | Dispositif de production de demarche pour robot se deplaçant sur des jambes |
JP3726057B2 (ja) * | 2001-12-28 | 2005-12-14 | 本田技研工業株式会社 | 脚式移動ロボットおよびその床反力検出装置 |
WO2003057427A1 (fr) * | 2001-12-28 | 2003-07-17 | Honda Giken Kogyo Kabushiki Kaisha | Dispositif de commande pour robot mobile sur jambes |
WO2003064116A2 (en) * | 2002-01-31 | 2003-08-07 | Braintech Canada, Inc. | Method and apparatus for single camera 3d vision guided robotics |
US6963185B2 (en) * | 2002-04-26 | 2005-11-08 | Honda Giken Kogyo Kabushiki Kaisha | System for estimating attitude of leg type moving robot itself |
EP1504858B1 (en) * | 2002-04-26 | 2014-06-04 | Honda Giken Kogyo Kabushiki Kaisha | Control device and footstep determination device for legged mobile robot |
JP3938326B2 (ja) * | 2002-05-10 | 2007-06-27 | 川田工業株式会社 | ロボット用付加的支持構造 |
US20040064195A1 (en) | 2002-07-15 | 2004-04-01 | Hugh Herr | Variable-mechanical-impedance artificial legs |
US7343223B2 (en) * | 2003-03-13 | 2008-03-11 | Alps Electric Co., Ltd. | Robot apparatus and load sensor |
EP1642689B1 (en) * | 2003-06-27 | 2011-02-16 | Honda Motor Co., Ltd. | Controller of legged mobile robot |
US8075633B2 (en) | 2003-09-25 | 2011-12-13 | Massachusetts Institute Of Technology | Active ankle foot orthosis |
KR100620118B1 (ko) * | 2004-03-31 | 2006-09-13 | 학교법인 대양학원 | 관성센서를 이용한 보행패턴 분석장치 및 그 방법 |
KR100571839B1 (ko) * | 2004-03-31 | 2006-04-17 | 삼성전자주식회사 | 인간형 로봇 |
JP2006068872A (ja) * | 2004-09-03 | 2006-03-16 | Honda Motor Co Ltd | 脚式移動ロボット |
JP2006136962A (ja) * | 2004-11-11 | 2006-06-01 | Hitachi Ltd | 移動ロボット |
JP4316477B2 (ja) * | 2004-11-18 | 2009-08-19 | パナソニック株式会社 | 移動ロボットの追従方法 |
JP4492322B2 (ja) * | 2004-12-02 | 2010-06-30 | トヨタ自動車株式会社 | 歩行ロボット |
US7313463B2 (en) * | 2005-03-31 | 2007-12-25 | Massachusetts Institute Of Technology | Biomimetic motion and balance controllers for use in prosthetics, orthotics and robotics |
US8864846B2 (en) | 2005-03-31 | 2014-10-21 | Massachusetts Institute Of Technology | Model-based neuromechanical controller for a robotic leg |
US8500823B2 (en) | 2005-03-31 | 2013-08-06 | Massachusetts Institute Of Technology | Powered artificial knee with agonist-antagonist actuation |
US20070162152A1 (en) | 2005-03-31 | 2007-07-12 | Massachusetts Institute Of Technology | Artificial joints using agonist-antagonist actuators |
US20070123997A1 (en) | 2005-03-31 | 2007-05-31 | Massachusetts Institute Of Technology | Exoskeletons for running and walking |
US8512415B2 (en) | 2005-03-31 | 2013-08-20 | Massachusetts Institute Of Technology | Powered ankle-foot prosthesis |
US10080672B2 (en) | 2005-03-31 | 2018-09-25 | Bionx Medical Technologies, Inc. | Hybrid terrain-adaptive lower-extremity systems |
US20060249315A1 (en) | 2005-03-31 | 2006-11-09 | Massachusetts Institute Of Technology | Artificial human limbs and joints employing actuators, springs, and variable-damper elements |
US20070043449A1 (en) | 2005-03-31 | 2007-02-22 | Massachusetts Institute Of Technology | Artificial ankle-foot system with spring, variable-damping, and series-elastic actuator components |
US10307272B2 (en) | 2005-03-31 | 2019-06-04 | Massachusetts Institute Of Technology | Method for using a model-based controller for a robotic leg |
US11278433B2 (en) | 2005-03-31 | 2022-03-22 | Massachusetts Institute Of Technology | Powered ankle-foot prosthesis |
JP2007007796A (ja) * | 2005-07-01 | 2007-01-18 | Toyota Motor Corp | Walking robot |
JP2007041733A (ja) * | 2005-08-01 | 2007-02-15 | Toyota Motor Corp | Attitude angle detection device for a moving body |
JP2009509779A (ja) * | 2005-09-23 | 2009-03-12 | ブレインテック カナダ インコーポレイテッド | System and method of visual tracking |
KR100883792B1 (ko) * | 2005-12-29 | 2009-02-18 | 한국생산기술연구원 | Position estimation system for a mobile robot, and method therefor |
KR20070073273A (ko) * | 2006-01-04 | 2007-07-10 | 삼성전자주식회사 | Apparatus and method for detecting the rotation state of a folder in a portable terminal |
JP5034235B2 (ja) * | 2006-01-16 | 2012-09-26 | ソニー株式会社 | Control system, control method, and computer program |
JP4173513B2 (ja) * | 2006-06-12 | 2008-10-29 | ヤマザキマザック株式会社 | Device for detecting whether equipment has been relocated, and equipment provided with such a device |
US8437535B2 (en) | 2006-09-19 | 2013-05-07 | Roboticvisiontech Llc | System and method of determining object pose |
JP4281777B2 (ja) | 2006-10-05 | 2009-06-17 | トヨタ自動車株式会社 | Mobile body having a tilt angle estimation mechanism |
WO2008076942A1 (en) * | 2006-12-15 | 2008-06-26 | Braintech Canada, Inc. | System and method of identifying objects |
JP4365402B2 (ja) * | 2006-12-20 | 2009-11-18 | 本田技研工業株式会社 | Movement angle detection device for a mobile body |
JP4143103B2 (ja) | 2006-12-20 | 2008-09-03 | 本田技研工業株式会社 | Mobile device, and control system, control program, and supervisory system therefor |
JP4171510B2 (ja) | 2006-12-20 | 2008-10-22 | 本田技研工業株式会社 | Mobile device, and control system, control program, and supervisory system therefor |
KR100864607B1 (ko) * | 2007-05-31 | 2008-10-22 | 한국과학기술원 | Inertial measurement apparatus and method using a tilt sensor and an angular velocity sensor for a humanoid biped walking robot |
JP5161498B2 (ja) * | 2007-06-18 | 2013-03-13 | 株式会社豊田中央研究所 | Attitude signal computation device |
KR100922494B1 (ko) * | 2007-07-19 | 2009-10-20 | 삼성전자주식회사 | Method for measuring the pose of a mobile robot, and method and apparatus for measuring position using the same |
WO2009040885A1 (ja) * | 2007-09-25 | 2009-04-02 | Fujitsu Limited | Robot control device, robot control method, and robot control program |
US20100250177A1 (en) * | 2007-11-13 | 2010-09-30 | Koninklijke Philips Electronics N.V. | Orientation measurement of an object |
JP4968684B2 (ja) | 2007-12-10 | 2012-07-04 | 本田技研工業株式会社 | Target route setting support system |
KR100926783B1 (ko) * | 2008-02-15 | 2009-11-13 | 한국과학기술연구원 | Method for self-localization of a robot based on object recognition and environment information around the recognized objects |
KR100988568B1 (ko) * | 2008-04-30 | 2010-10-18 | 삼성전자주식회사 | Robot and map building method thereof |
CN101581936B (zh) * | 2008-05-16 | 2012-01-25 | 深圳富泰宏精密工业有限公司 | System and method for controlling a biped robot using a mobile phone |
KR20090126038A (ko) * | 2008-06-03 | 2009-12-08 | 삼성전자주식회사 | Walking robot and control method thereof |
KR101008360B1 (ko) * | 2008-07-01 | 2011-01-14 | (주)마이크로인피니티 | Apparatus and method for correcting gyro sensor errors in a mobile robot |
US20110082566A1 (en) | 2008-09-04 | 2011-04-07 | Herr Hugh M | Implementing a stand-up sequence using a lower-extremity prosthesis or orthosis |
US9554922B2 (en) | 2008-09-04 | 2017-01-31 | Bionx Medical Technologies, Inc. | Hybrid terrain-adaptive lower-extremity systems |
US8559699B2 (en) * | 2008-10-10 | 2013-10-15 | Roboticvisiontech Llc | Methods and apparatus to facilitate operations in image based systems |
US8127871B2 (en) * | 2008-11-03 | 2012-03-06 | Robert J Viola | Frame walker predicated on a parallel mechanism |
KR20100078248A (ko) * | 2008-12-30 | 2010-07-08 | 삼성전자주식회사 | Walking robot and control method thereof |
JP4957753B2 (ja) * | 2009-06-15 | 2012-06-20 | セイコーエプソン株式会社 | Robot, transfer device, and control method using an inertial sensor |
JP5219956B2 (ja) * | 2009-07-23 | 2013-06-26 | 本田技研工業株式会社 | Control device for a mobile body |
KR20110015765A (ko) * | 2009-08-10 | 2011-02-17 | 삼성전자주식회사 | Path planning apparatus for a robot, and method therefor |
US8924015B2 (en) * | 2009-09-14 | 2014-12-30 | Honda Motor Co., Ltd. | Whole-body humanoid control from upper-body task specifications |
JP5232120B2 (ja) * | 2009-10-01 | 2013-07-10 | 本田技研工業株式会社 | Control device for a mobile body |
EP2555716A2 (en) | 2010-04-05 | 2013-02-13 | Iwalk, Inc. | Controlling torque in a prosthesis or orthosis |
JP5652042B2 (ja) * | 2010-08-06 | 2015-01-14 | セイコーエプソン株式会社 | Robot apparatus, and control method and program for the robot apparatus |
EP2663267B1 (en) | 2011-01-10 | 2019-07-31 | Iwalk, Inc. | Powered joint orthosis |
US20120259430A1 (en) | 2011-01-12 | 2012-10-11 | Zhixiu Han | Controlling powered human augmentation devices |
US9687377B2 (en) | 2011-01-21 | 2017-06-27 | Bionx Medical Technologies, Inc. | Terrain adaptive powered joint orthosis |
WO2012125562A1 (en) | 2011-03-11 | 2012-09-20 | Iwalk, Inc. | Biomimetic joint actuators |
JP5930892B2 (ja) * | 2011-09-07 | 2016-06-08 | 本田技研工業株式会社 | Contact state estimation device and trajectory generation device |
WO2013067407A1 (en) | 2011-11-02 | 2013-05-10 | Iwalk, Inc. | Biomimetic transfemoral prosthesis |
JP5927031B2 (ja) | 2011-11-26 | 2016-05-25 | 本田技研工業株式会社 | Inverted pendulum type vehicle |
KR20130063230A (ko) * | 2011-12-06 | 2013-06-14 | 삼성전자주식회사 | Walking robot and control method thereof |
US9032635B2 (en) | 2011-12-15 | 2015-05-19 | Massachusetts Institute Of Technology | Physiological measurement device or wearable device interface simulator and method of use |
KR101913332B1 (ko) * | 2011-12-23 | 2018-10-31 | 삼성전자주식회사 | Mobile device and localization method therefor |
JP5929224B2 (ja) * | 2012-01-20 | 2016-06-01 | セイコーエプソン株式会社 | Robot |
JP5185473B1 (ja) * | 2012-02-13 | 2013-04-17 | パナソニック株式会社 | Legged robot |
US9221177B2 (en) | 2012-04-18 | 2015-12-29 | Massachusetts Institute Of Technology | Neuromuscular model-based sensing and control paradigm for a robotic leg |
JP6081081B2 (ja) | 2012-05-14 | 2017-02-15 | 本田技研工業株式会社 | Inverted pendulum type vehicle |
JP5916520B2 (ja) | 2012-05-14 | 2016-05-11 | 本田技研工業株式会社 | Inverted pendulum type vehicle |
JP5959927B2 (ja) | 2012-05-14 | 2016-08-02 | 本田技研工業株式会社 | Inverted pendulum type vehicle |
JP5959928B2 (ja) | 2012-05-14 | 2016-08-02 | 本田技研工業株式会社 | Inverted pendulum type vehicle |
JP5808289B2 (ja) | 2012-05-14 | 2015-11-10 | 本田技研工業株式会社 | Inverted pendulum type vehicle |
JP5921950B2 (ja) | 2012-05-14 | 2016-05-24 | 本田技研工業株式会社 | Inverted pendulum type vehicle |
JP5927032B2 (ja) | 2012-05-14 | 2016-05-25 | 本田技研工業株式会社 | Inverted pendulum type vehicle |
CA2876187C (en) | 2012-06-12 | 2021-01-26 | Iwalk, Inc. | Prosthetic, orthotic or exoskeleton device |
KR101441187B1 (ko) * | 2012-07-19 | 2014-09-18 | 고려대학교 산학협력단 | Path planning method for an autonomous walking robot |
JP5945477B2 (ja) | 2012-08-31 | 2016-07-05 | 本田技研工業株式会社 | Inverted pendulum type vehicle, and control method for an inverted pendulum type vehicle |
JP5919143B2 (ja) * | 2012-08-31 | 2016-05-18 | 本田技研工業株式会社 | Drive device |
JP5840109B2 (ja) | 2012-11-01 | 2016-01-06 | 本田技研工業株式会社 | Mobile body |
DE102012224107A1 (de) * | 2012-12-20 | 2014-06-26 | Continental Teves Ag & Co. Ohg | Method for determining a reference position as a starting position for an inertial navigation system |
JP6081238B2 (ja) | 2013-03-12 | 2017-02-15 | 本田技研工業株式会社 | Mobile body |
JP6095436B2 (ja) | 2013-03-27 | 2017-03-15 | 本田技研工業株式会社 | Inverted pendulum type vehicle |
JP6081271B2 (ja) | 2013-03-29 | 2017-02-15 | 本田技研工業株式会社 | Inverted pendulum type vehicle |
JP6081270B2 (ja) | 2013-03-29 | 2017-02-15 | 本田技研工業株式会社 | Inverted pendulum type vehicle |
JP6111119B2 (ja) | 2013-03-29 | 2017-04-05 | 本田技研工業株式会社 | Inverted pendulum type vehicle |
JP6062784B2 (ja) | 2013-03-29 | 2017-01-18 | 本田技研工業株式会社 | Inverted pendulum type vehicle |
JP6062785B2 (ja) | 2013-03-29 | 2017-01-18 | 本田技研工業株式会社 | Inverted pendulum type vehicle |
JP6099484B2 (ja) | 2013-05-31 | 2017-03-22 | 本田技研工業株式会社 | Inverted pendulum type vehicle |
JP6099483B2 (ja) | 2013-05-31 | 2017-03-22 | 本田技研工業株式会社 | Inverted pendulum type vehicle |
JP6099485B2 (ja) | 2013-05-31 | 2017-03-22 | 本田技研工業株式会社 | Inverted pendulum type vehicle |
US20150176989A1 (en) * | 2013-12-23 | 2015-06-25 | Hti Ip, Llc | Accelerometer-Based Hill Angle Estimation |
JP6380828B2 (ja) * | 2014-03-07 | 2018-08-29 | セイコーエプソン株式会社 | Robot, robot system, control device, and control method |
US9259838B1 (en) | 2014-07-24 | 2016-02-16 | Google Inc. | Systems and methods for ground plane estimation |
US9618937B1 (en) | 2014-08-25 | 2017-04-11 | Google Inc. | Slip detection using robotic limbs |
US9387588B1 (en) | 2014-08-25 | 2016-07-12 | Google Inc. | Handling gait disturbances with asynchronous timing |
US9403275B2 (en) * | 2014-10-17 | 2016-08-02 | GM Global Technology Operations LLC | Dynamic obstacle avoidance in a robotic system |
US9352470B1 (en) * | 2014-11-11 | 2016-05-31 | Google Inc. | Yaw slip handling in a robotic device |
JP6240590B2 (ja) * | 2014-11-12 | 2017-11-29 | 本田技研工業株式会社 | Control device for a mobile robot |
US9499218B1 (en) | 2014-12-30 | 2016-11-22 | Google Inc. | Mechanically-timed footsteps for a robotic device |
WO2016142794A1 (en) | 2015-03-06 | 2016-09-15 | Wal-Mart Stores, Inc | Item monitoring system and method |
US20180099846A1 (en) | 2015-03-06 | 2018-04-12 | Wal-Mart Stores, Inc. | Method and apparatus for transporting a plurality of stacked motorized transport units |
US9801517B2 (en) | 2015-03-06 | 2017-10-31 | Wal-Mart Stores, Inc. | Shopping facility assistance object detection systems, devices and methods |
JP6700546B2 (ja) * | 2015-06-01 | 2020-05-27 | 富士通株式会社 | Load detection method, load detection device, and load detection program |
JP6450273B2 (ja) * | 2015-07-08 | 2019-01-09 | 本田技研工業株式会社 | Operating environment information generation device for a mobile robot |
CN105180929A (zh) * | 2015-09-01 | 2015-12-23 | 深圳市华颖泰科电子技术有限公司 | Method for mounting an inertial sensor in a vehicle-mounted inertial navigation system |
US9586316B1 (en) | 2015-09-15 | 2017-03-07 | Google Inc. | Determination of robotic step path |
US9925667B1 (en) * | 2016-01-25 | 2018-03-27 | Boston Dynamics, Inc. | Continuous slip recovery |
US10179619B1 (en) * | 2016-03-30 | 2019-01-15 | Schaft Inc. | Robotic foot sensor |
CA2961938A1 (en) | 2016-04-01 | 2017-10-01 | Wal-Mart Stores, Inc. | Systems and methods for moving pallets via unmanned motorized unit-guided forklifts |
JP2017196704A (ja) * | 2016-04-28 | 2017-11-02 | セイコーエプソン株式会社 | Vibration measurement method for a movable part, vibration measurement method for a robot, and control device |
JP2017196705A (ja) * | 2016-04-28 | 2017-11-02 | セイコーエプソン株式会社 | Robot and robot system |
CN106184463B (zh) * | 2016-08-15 | 2019-03-08 | 北京钢铁侠科技有限公司 | Humanoid robot joint mechanism with zero-position detection function |
US10828767B2 (en) | 2016-11-11 | 2020-11-10 | Sarcos Corp. | Tunable actuator joint modules having energy recovering quasi-passive elastic actuators with internal valve arrangements |
US10821614B2 (en) | 2016-11-11 | 2020-11-03 | Sarcos Corp. | Clutched joint modules having a quasi-passive elastic actuator for a robotic assembly |
CN110831725A (zh) * | 2017-06-29 | 2020-02-21 | 索尼互动娱乐股份有限公司 | Robot control device, control method, and control program |
CN109693235B (zh) * | 2017-10-23 | 2020-11-20 | 中国科学院沈阳自动化研究所 | Human-eye-mimicking visual tracking device and control method therefor |
CN110053039B (zh) * | 2018-01-17 | 2021-10-29 | 深圳市优必选科技有限公司 | Method and device for gravity compensation during robot walking, and robot |
SE542305C2 (en) * | 2018-01-30 | 2020-04-07 | Infonomy Ab | Arrangement and system for wearable balance meter |
EP3812729A4 (en) | 2018-06-22 | 2021-08-25 | Sony Group Corporation | SLIP DETECTION DEVICE |
CN109394095B (zh) * | 2018-10-23 | 2020-09-15 | 珠海市一微半导体有限公司 | Control method, chip, and cleaning robot for carpet-induced drift in robot motion |
CN109184673B (zh) * | 2018-11-12 | 2023-11-24 | 美钻深海能源科技研发(上海)有限公司 | Mechanical pipe-string coupling detection device and method |
US11241801B2 (en) | 2018-12-31 | 2022-02-08 | Sarcos Corp. | Robotic end effector with dorsally supported actuation mechanism |
JP7088318B2 (ja) * | 2019-01-11 | 2022-06-21 | オムロン株式会社 | Control device |
KR102658278B1 (ko) * | 2019-02-20 | 2024-04-18 | 삼성전자주식회사 | Mobile robot and robot arm alignment method thereof |
KR20210000095A (ko) * | 2019-06-24 | 2021-01-04 | 삼성전자주식회사 | Pose estimation method and apparatus |
WO2021049227A1 (ja) * | 2019-09-13 | 2021-03-18 | ソニー株式会社 | Information processing system, information processing device, and information processing program |
JP7391616B2 (ja) * | 2019-11-05 | 2023-12-05 | 本田技研工業株式会社 | Attitude estimation device for a mobile body |
US11745902B1 (en) | 2019-12-11 | 2023-09-05 | Government Of The United States As Represented By The Secretary Of The Air Force | Systems, methods and apparatus for multifunctional central pattern generator |
US11833676B2 (en) | 2020-12-07 | 2023-12-05 | Sarcos Corp. | Combining sensor output data to prevent unsafe operation of an exoskeleton |
CN112596531B (zh) * | 2021-03-04 | 2021-06-22 | 德鲁动力科技(成都)有限公司 | Adaptive load parameter adjustment method for a quadruped robot |
CN113091999A (zh) * | 2021-04-01 | 2021-07-09 | 燕山大学 | Method and system for calibrating a one-dimensional leg force sensor of a legged robot |
CN114794953A (zh) * | 2022-04-19 | 2022-07-29 | 净豹智能机器人(台州)有限公司 | Deep-learning-based edge cleaning system and method for an intelligent cleaning robot |
US11738452B1 (en) * | 2022-07-29 | 2023-08-29 | Sarcos Corp. | Sole with various compliant regions for robots |
US11826907B1 (en) | 2022-08-17 | 2023-11-28 | Sarcos Corp. | Robotic joint system with length adapter |
US11717956B1 (en) | 2022-08-29 | 2023-08-08 | Sarcos Corp. | Robotic joint system with integrated safety |
US11897132B1 (en) | 2022-11-17 | 2024-02-13 | Sarcos Corp. | Systems and methods for redundant network communication in a robot |
US11924023B1 (en) | 2022-11-17 | 2024-03-05 | Sarcos Corp. | Systems and methods for redundant network communication in a robot |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11272983A (ja) * | 1998-03-19 | 1999-10-08 | Fujitsu Ltd | Route planning device, arrival time prediction device, travel record storage device, and route planning/arrival time prediction system |
JP2000153476A (ja) * | 1998-09-14 | 2000-06-06 | Honda Motor Co Ltd | Legged mobile robot |
EP1120203A1 (en) | 1998-04-20 | 2001-08-01 | Honda Giken Kogyo Kabushiki Kaisha | Controller for legged mobile robot |
WO2002040224A1 (fr) | 2000-11-17 | 2002-05-23 | Honda Giken Kogyo Kabushiki Kaisha | Gait pattern generating device for a legged mobile robot |
Family Cites Families (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS63150176A (ja) * | 1986-12-15 | 1988-06-22 | 工業技術院長 | Walking control method for a dynamic walking robot |
JP2520019B2 (ja) * | 1989-06-29 | 1996-07-31 | 本田技研工業株式会社 | Drive control device for a legged mobile robot |
JP2592340B2 (ja) * | 1989-12-14 | 1997-03-19 | 本田技研工業株式会社 | Joint structure for a legged walking robot |
DE69124486T2 (de) * | 1990-11-30 | 1997-05-15 | Honda Motor Co Ltd | System for controlling the locomotion of a legged walking robot |
US5355064A (en) * | 1992-03-04 | 1994-10-11 | Honda Giken Kogyo Kabushiki Kaisha | Control system for legged mobile robot |
JP3148827B2 (ja) | 1992-04-30 | 2001-03-26 | 本田技研工業株式会社 | Walking control device for a legged mobile robot |
US5432417A (en) * | 1992-04-30 | 1995-07-11 | Honda Giken Kogyo Kabushiki Kaisha | Locomotion control system for legged mobile robot |
JP3273443B2 (ja) | 1992-05-22 | 2002-04-08 | 本田技研工業株式会社 | Method and device for generating trajectories for robot links and the like |
JP3233450B2 (ja) | 1992-05-22 | 2001-11-26 | 本田技研工業株式会社 | Specified-time arrival function generator |
JP3132156B2 (ja) * | 1992-05-22 | 2001-02-05 | 本田技研工業株式会社 | Gait generation device for a legged mobile robot |
JP3269852B2 (ja) * | 1992-05-29 | 2002-04-02 | 本田技研工業株式会社 | Posture stabilization control device for a legged mobile robot |
US5404086A (en) * | 1992-07-20 | 1995-04-04 | Honda Giken Kogyo Kabushiki Kaisha | System for controlling locomotion of legged mobile robot and correcting inclinometer's output thereof |
JPH09272083A (ja) * | 1996-04-08 | 1997-10-21 | Mitsubishi Electric Corp | Biped walking robot |
JP3663034B2 (ja) | 1996-07-25 | 2005-06-22 | 本田技研工業株式会社 | Gait generation device for a legged mobile robot |
JP3672406B2 (ja) * | 1997-01-31 | 2005-07-20 | 本田技研工業株式会社 | Gait generation device for a legged mobile robot |
JP3629133B2 (ja) | 1997-01-31 | 2005-03-16 | 本田技研工業株式会社 | Control device for a legged mobile robot |
JP3655056B2 (ja) | 1997-08-04 | 2005-06-02 | 本田技研工業株式会社 | Control device for a legged mobile robot |
US6260862B1 (en) * | 1998-02-11 | 2001-07-17 | Joseph C. Klann | Walking device |
JP4213310B2 (ja) * | 1999-08-30 | 2009-01-21 | 本田技研工業株式会社 | Biped legged mobile robot |
JP4475708B2 (ja) | 1999-11-12 | 2010-06-09 | ソニー株式会社 | Legged mobile robot and motion control method therefor |
JP4480843B2 (ja) * | 2000-04-03 | 2010-06-16 | ソニー株式会社 | Legged mobile robot, control method therefor, and relative movement measurement sensor for a legged mobile robot |
JP2002144260A (ja) | 2000-11-13 | 2002-05-21 | Sony Corp | Legged mobile robot and control method therefor |
DE60142850D1 (de) * | 2000-11-17 | 2010-09-30 | Honda Motor Co Ltd | Remote control of a biped robot |
TW499349B (en) * | 2000-11-17 | 2002-08-21 | Sony Corp | Legged mobile robot, leg structure of legged mobile robot, and mobile leg unit for legged mobile robot |
JP3726032B2 (ja) * | 2001-04-27 | 2005-12-14 | 本田技研工業株式会社 | Target motion generation device for a legged mobile robot |
JP4188607B2 (ja) * | 2001-06-27 | 2008-11-26 | 本田技研工業株式会社 | Method of estimating floor reaction forces of a biped walking body and method of estimating joint moments of a biped walking body |
WO2003057423A1 (fr) | 2001-12-28 | 2003-07-17 | Honda Giken Kogyo Kabushiki Kaisha | Gait generation device for a legged mobile robot |
WO2003061917A1 (fr) * | 2002-01-18 | 2003-07-31 | Honda Giken Kogyo Kabushiki Kaisha | Control device for a biped robot |
US6963185B2 (en) * | 2002-04-26 | 2005-11-08 | Honda Giken Kogyo Kabushiki Kaisha | System for estimating attitude of leg type moving robot itself |
KR101121020B1 (ko) * | 2004-01-13 | 2012-03-15 | 혼다 기켄 고교 가부시키가이샤 | Gait generating device for a mobile robot |
- 2003
- 2003-04-28 US US10/511,128 patent/US6963185B2/en not_active Expired - Lifetime
- 2003-04-28 DE DE60332224T patent/DE60332224D1/de not_active Expired - Lifetime
- 2003-04-28 WO PCT/JP2003/005449 patent/WO2003090981A1/ja active Application Filing
- 2003-04-28 DE DE60332233T patent/DE60332233D1/de not_active Expired - Lifetime
- 2003-04-28 WO PCT/JP2003/005448 patent/WO2003090980A1/ja active Application Filing
- 2003-04-28 DE DE60332227T patent/DE60332227D1/de not_active Expired - Lifetime
- 2003-04-28 US US10/511,451 patent/US7145305B2/en not_active Expired - Lifetime
- 2003-04-28 JP JP2004501988A patent/JP4181113B2/ja not_active Expired - Fee Related
- 2003-04-28 EP EP03725698A patent/EP1502712B1/en not_active Expired - Fee Related
- 2003-04-28 JP JP2004501990A patent/JP4181114B2/ja not_active Expired - Fee Related
- 2003-04-28 EP EP03728007A patent/EP1508408B1/en not_active Expired - Fee Related
- 2003-04-28 KR KR1020047017293A patent/KR100956520B1/ko not_active IP Right Cessation
- 2003-04-28 EP EP09009476A patent/EP2106886B1/en not_active Expired - Fee Related
- 2003-04-28 EP EP09009477A patent/EP2110210B1/en not_active Expired - Fee Related
- 2003-04-28 KR KR1020047017278A patent/KR100956539B1/ko active IP Right Grant
- 2003-04-28 JP JP2004501989A patent/JP4246696B2/ja not_active Expired - Fee Related
- 2003-04-28 EP EP03725699A patent/EP1504855B1/en not_active Expired - Fee Related
- 2003-04-28 US US10/512,263 patent/US7337040B2/en active Active
- 2003-04-28 WO PCT/JP2003/005447 patent/WO2003090979A1/ja active Application Filing
- 2003-04-28 KR KR1020047017286A patent/KR100956537B1/ko active IP Right Grant
- 2003-04-28 DE DE60336519T patent/DE60336519D1/de not_active Expired - Lifetime
- 2003-04-28 DE DE60336518T patent/DE60336518D1/de not_active Expired - Lifetime
Non-Patent Citations (1)
Title |
---|
See also references of EP1502712A4 |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008284690A (ja) * | 2003-03-27 | 2008-11-27 | Sony Corp | Robot apparatus and control method for the robot apparatus |
US7541764B2 (en) | 2003-11-27 | 2009-06-02 | Honda Motor Co., Ltd. | Control system for mobile body |
US7603199B2 (en) | 2003-11-27 | 2009-10-13 | Honda Motor Co., Ltd. | Control device for mobile body |
US7606634B2 (en) | 2003-11-27 | 2009-10-20 | Honda Motor Co., Ltd. | Control device for mobile body |
JP2008309781A (ja) * | 2007-05-11 | 2008-12-25 | Commiss Energ Atom | Processing method for capturing the motion of an articulated structure |
JP2011230213A (ja) * | 2010-04-26 | 2011-11-17 | Honda Motor Co Ltd | Robot, control system, and control program |
Also Published As
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2003090980A1 (fr) | Self-position estimation device for a legged mobile robot | |
US7664572B2 (en) | Control device of legged mobile robot | |
US8204625B2 (en) | Gait creation device of leg-type mobile robot | |
KR100685339B1 (ko) | Mobile robot | |
US9120512B2 (en) | Control device and gait generating device for bipedal mobile robot | |
JP4800037B2 (ja) | Gait generation device for a mobile robot | |
WO2003090982A1 (fr) | Control device and footstep determination device for a legged mobile robot | |
JP3901694B2 (ja) | Walking robot and position moving method therefor | |
JPWO2005068136A1 (ja) | Gait generation device for a mobile robot | |
WO2006064597A1 (ja) | Legged mobile robot and control program therefor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): JP KR US |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): DE FR GB |
|
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2004501989 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10512263 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020047017293 Country of ref document: KR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2003725698 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 1020047017293 Country of ref document: KR |
|
WWP | Wipo information: published in national office |
Ref document number: 2003725698 Country of ref document: EP |