WO2005087452A1 - Robot device, behavior control method for the robot device, and mobile device
Robot device, behavior control method for the robot device, and mobile device
- Publication number
- WO2005087452A1 (PCT application PCT/JP2005/004838)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- plane
- tread
- information
- stair
- robot device
- Prior art date
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D57/00—Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track
- B62D57/02—Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members
- B62D57/032—Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and legs; with alternately or sequentially lifted feet or skid
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/97—Determining parameters from multiple pictures
Definitions
- Robot device, operation control method thereof, and moving device
- The present invention relates to a robot apparatus, a moving apparatus, and an operation control method therefor, the apparatus having moving means such as legs and being capable of a stair climbing operation over stairs comprising a plurality of steps.
- As shown in FIG. 3, there is also a method in which infrared sensors are provided on the side of the sole and a stair climbing operation is performed by applying landmark tape to the stairs (Japanese Patent No. 3330710).
- In this method, a plurality of optical sensor detectors 682 are provided on the left and right sides of the feet 622R/L of a bipedal walking robot device, and the stairs carry a land marker 680, a linear surface region of predetermined width painted with a paint, such as black paint, that absorbs light well.
- The relative direction with respect to the land marker 680 is detected by comparing paired sensor outputs.
- The present invention has been proposed in view of this conventional situation. An object of the present invention is to provide a robot apparatus, a moving apparatus, and an operation control method for the robot apparatus that allow the moving body itself to acquire information on stairs and perform an autonomous stair climbing operation.
- A robot device according to the present invention is a robot device movable by moving means, characterized by comprising: plane detection means for detecting one or more planes included in the environment from three-dimensional distance data and outputting the result as plane information; stair recognition means for recognizing, from the plane information, stairs having a movable plane and outputting stair information including tread information and riser (kick-up) information relating to the treads of the stairs; and stair climbing control means for determining, based on the stair information, whether the stairs can be climbed or descended and, if it is determined that climbing is possible, controlling the climbing operation by autonomously positioning the device with respect to the tread.
- In a robot device movable with, for example, legs as the moving means, it is thus determined from the tread information, for example the size and position of the tread, whether the sole can be placed on the tread, and from the riser information whether the device can move to a tread at that height; if movement is judged possible, the device positions itself autonomously, making it possible to climb and descend stairs.
- The stair recognition means can comprise stair detection means, which detects stairs having a movable plane from the given plane information and outputs pre-integration stair information, and stair integration means, which statistically processes multiple pieces of pre-integration stair information output from the stair detection means at different times and outputs the combined, integrated stair information as the above-mentioned stair information. For example, for a robot device with a narrow field of view, or when the stairs cannot be recognized well in a single detection pass, accurate and reliable recognition results can still be obtained by using stair information that is statistically integrated over time.
- The stair detection means recognizes the size and spatial position of each tread based on the plane information and outputs the resulting tread information as the pre-integration stair information. If a tread group is detected, across pre-integration tread information, consisting of two or more treads whose overlapping area is larger than a predetermined threshold and whose relative difference in height is equal to or less than a predetermined threshold, those treads can be integrated into a single tread; by integrating all treads selected for integration at that time, recognition results can be obtained over a wide range.
- The stair recognition means can recognize the size and spatial position of each tread based on the plane information and output the result as the tread information.
- The tread information can include at least information on the front edge, indicating the boundary of the tread on the near side with respect to the moving direction, and the back edge, indicating the boundary on the far side. By recognizing the front edge and the back edge of each tread, even a spiral staircase, for example, can be recognized and climbed or descended.
- The tread information can further include: right margin information and left margin information indicating the margin areas adjacent to the left and right sides of the safety area, the region sandwiched between the front edge and the back edge that is estimated to have a high probability of being movable;
- reference point information indicating the center of gravity of the region estimated to be a tread based on the above-described plane information; and
- three-dimensional coordinate information of the point group forming the tread plane.
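- The tread information enumerated above can be collected into a simple record. The following is a minimal Python sketch under illustrative assumptions; the field names are hypothetical, and the patent does not prescribe a concrete data layout:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Point3D = Tuple[float, float, float]
Edge = Tuple[Point3D, Point3D]  # an edge given by its two end points

@dataclass
class TreadInfo:
    """One recognized tread; all field names are illustrative only."""
    front_edge: Edge            # boundary on the near side (robot side)
    back_edge: Edge             # boundary on the far side
    left_margin: float          # width of the margin area left of the safety area
    right_margin: float         # width of the margin area right of the safety area
    reference_point: Point3D    # e.g. center of gravity of the tread region
    points: List[Point3D] = field(default_factory=list)  # 3D points on the tread plane
```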
- The stair recognition means can extract the boundary of a plane based on the plane information to obtain a polygon, and calculate the tread information from that polygon.
- The polygon can be a convex polygonal area circumscribing the boundary of the plane extracted from the plane information, that is, an area that includes the actually detected plane.
- Alternatively, the polygon may be a convex polygonal area inscribed in the boundary of the extracted plane, that is, an area contained within the actually detected plane; in this way the tread can be detected accurately by cutting off noisy portions.
- The stair climbing control means can control the device so that it moves to a predetermined position opposite the back edge of the current moving surface and then performs the climbing or descending operation.
- When the front edge and the back edge overlap, for example on a staircase with a small riser, the device may instead take the front edge as the target and perform the climbing operation.
- The stair climbing control means can also control the device so that it moves to a predetermined position opposite the front edge of the next tread to be climbed and then performs the climbing operation. For example, if stairs are detected while moving on the floor, the back edge of the floor may not overlap the front edge of the first step; the device can nevertheless climb by moving to the front edge of the tread of the first step, that is, of the next surface to be climbed.
- The stair climbing control means can detect the tread to be moved to next and perform a series of operations of moving to a predetermined position opposite that tread and then climbing. Each time the device moves onto a new tread, a search-align-approach operation is performed on that tread, enabling the climbing or descending movement.
- The stair climbing control means can search for the next step to be moved to using stair information obtained in the past. By acquiring information on several steps up or down in advance, the robot device can climb or descend even when, owing to a narrow field of view or a configuration that cannot obtain up-to-date information, the latest stair data is unavailable.
- The stair climbing control means can detect the next tread to be moved to after moving to the predetermined position on the current moving surface opposite the back edge, and then perform the climbing operation of moving onto that tread from the predetermined position opposite its front edge. By using the front edge and the back edge together in this way, even a spiral staircase can be climbed or descended.
- The stair climbing control means can control the climbing operation using parameters that define the position of the moving means with respect to the tread; such a parameter can be determined based on, for example, the height to which a leg is raised or lowered. It is further possible to provide parameter switching means that changes these parameter values between the stair-climbing operation and the stair-descending operation, so that both operations can be controlled in the same way.
- The plane detection means can comprise line segment extraction means, which extracts a line segment for each group of distance data points estimated to lie on the same plane in three-dimensional space, and plane area extension means, which extracts from the resulting line segment group a plurality of line segments estimated to belong to the same plane and calculates a plane from them.
- The line segment extraction means can extract line segments adaptively according to the distribution of the distance data points. Although three-dimensional distance data lying on the same plane should fall on the same straight line, the distribution of the data points varies because of noise and other factors. By extracting line segments adaptively according to this distribution (adaptive line fitting), line segments can be extracted accurately and robustly against noise, and planes can be obtained from the many extracted line segments by the line segment extension method. This makes it possible to extract planes accurately without merging multiple planes into a single plane, or detecting multiple planes where only one exists.
- The line segment extraction means can first extract a group of distance data points estimated to lie on the same plane based on the distances between the points, and then re-estimate, from the distribution of the points within that group, whether they really lie on the same plane. By first grouping the points by their distances in three-dimensional space and then re-estimating coplanarity from the distribution, line segments can be extracted accurately.
- Specifically, the line segment extraction means extracts a first line segment from the group of distance data points estimated to lie on the same plane, and takes the data point whose distance from this first line segment is largest as the point of interest. If that distance is equal to or smaller than a predetermined threshold, a second line segment is extracted from the point group, and it is determined whether a predetermined number or more of consecutive data points lie on one side of the second line segment; if so, the point group is divided at the point of interest.
- That is, the line segment connecting the end points of the point group is taken as the first line segment; if there is a point far from it, a second line segment is generated by, for example, the least squares method. If many data points lie consecutively on one side of this line, the point group can be assumed to be biased, for example zigzag-shaped with respect to the line segment, and is therefore divided at the noted point.
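- A minimal sketch of this adaptive split test, assuming ordered 2D points already projected into the measurement plane; helper names and thresholds are illustrative, not values from the patent:

```python
import numpy as np

def _dist_to_chord(p, a, b):
    """Perpendicular distance from point p to the line through a and b (2D)."""
    ab, ap = b - a, p - a
    return abs(ab[0] * ap[1] - ab[1] * ap[0]) / (np.linalg.norm(ab) + 1e-12)

def _fit_line_lsq(points):
    """Least-squares line fit: returns a point on the line and its unit direction."""
    c = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - c)
    return c, vt[0]

def _is_zigzag(points, line_pt, line_dir, min_run=3):
    """True if min_run or more consecutive points lie on one side of the line,
    i.e. the point group is biased (zigzag-shaped) with respect to it."""
    normal = np.array([-line_dir[1], line_dir[0]])
    sides = np.sign((points - line_pt) @ normal)
    run = 1
    for prev, cur in zip(sides, sides[1:]):
        run = run + 1 if cur == prev and cur != 0 else 1
        if run >= min_run:
            return True
    return False

def extract_segments(points, dist_thresh=0.01):
    """points: (N,2) ordered array. The first line is the chord joining the
    end points; the farthest point from it is the point of interest."""
    points = np.asarray(points, dtype=float)
    if len(points) < 3:
        return [points]
    a, b = points[0], points[-1]
    dists = np.array([_dist_to_chord(p, a, b) for p in points])
    i = int(dists.argmax())
    if dists[i] > dist_thresh:
        # clearly bent: divide the group at the point of interest
        return (extract_segments(points[:i + 1], dist_thresh)
                + extract_segments(points[i:], dist_thresh))
    line_pt, line_dir = _fit_line_lsq(points)   # second line, least squares
    if _is_zigzag(points, line_pt, line_dir):
        j = i if 0 < i < len(points) - 1 else len(points) // 2
        return (extract_segments(points[:j + 1], dist_thresh)
                + extract_segments(points[j:], dist_thresh))
    return [points]                             # accepted as one line segment
```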
- The plane area extension means selects one or more line segments estimated to belong to the same plane and calculates a reference plane from them. It then repeats the process of retrieving, from the line segment group, a line segment estimated to belong to the same plane as the reference plane as an extension line segment, updating the reference plane with it, and expanding the area of the reference plane, and finally outputs the result as the updated plane.
- In this way, both the plane area extension processing and the plane update processing are performed using line segments that belong to the same plane.
- Plane recalculation means can further be provided: if there is a distance data point whose distance from the updated plane exceeds a predetermined threshold, the plane is recalculated from the group of distance data points belonging to the updated plane with that point excluded. Since the updated plane is obtained as an average plane of all the line segments belonging to it, recalculating it in this way yields a detection result in which the influence of noise and the like is further reduced.
- The plane area extension means can estimate whether a line segment belongs to the same plane as the reference plane based on the error between the plane determined by that line segment and the reference plane.
- By judging, based on the root mean square error or the like, whether a deviation is due to noise or to a genuinely different plane, planes can be detected more accurately.
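- The region growing and plane update steps might be sketched as follows, assuming each extracted line segment is given by the 3D points supporting it; names and thresholds are illustrative, and the final refit mirrors the plane recalculation described above:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through (N,3) points: unit normal n and offset d, n.x = d."""
    c = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - c)
    n = vt[-1]
    return n, float(n @ c)

def rms_error(points, n, d):
    """Root mean square distance of the points from the plane (n, d)."""
    return float(np.sqrt(np.mean((points @ n - d) ** 2)))

def grow_plane(segments, seed_idx, rms_thresh=0.005):
    """segments: list of (k,3) point arrays, one per extracted line segment.
    Fit a reference plane to the seed segments, then repeatedly absorb
    segments whose inclusion keeps the rms plane error below the threshold."""
    region = set(seed_idx)
    pts = np.vstack([segments[i] for i in region])
    changed = True
    while changed:
        changed = False
        for i in range(len(segments)):
            if i in region:
                continue
            cand = np.vstack([pts, segments[i]])
            n, d = fit_plane(cand)
            if rms_error(cand, n, d) < rms_thresh:
                region.add(i)        # extension line segment accepted
                pts = cand           # the reference plane is updated with it
                changed = True
    # plane recalculation: drop points far from the updated plane and refit
    n, d = fit_plane(pts)
    keep = np.abs(pts @ n - d) <= 3 * rms_error(pts, n, d)
    if keep.any() and not keep.all():
        n, d = fit_plane(pts[keep])
    return region, n, d
```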
- An operation control method for a robot device according to the present invention is an operation control method for a robot device movable by moving means, comprising: a plane detection step of detecting one or more planes included in the environment from three-dimensional distance data and outputting the result as plane information;
- a stair recognition step of recognizing, from the plane information, stairs having a movable plane and outputting stair information including tread information and riser information relating to the treads of the stairs; and
- a stair climbing control step of determining, based on the stair information, whether the stairs can be climbed or descended and, if it is determined that the climbing or descending operation is possible, controlling the stair climbing operation by autonomously positioning the device with respect to the tread.
- A mobile device according to the present invention is a mobile device movable by moving means, comprising: plane detection means for detecting one or more planes included in the environment from three-dimensional distance data and outputting the result as plane information;
- stair recognition means for recognizing, from the plane information, stairs having a movable plane and outputting stair information including tread information and riser (kick-up) information relating to the treads of the stairs; and
- stair climbing control means for determining, based on the stair information, whether stair climbing is possible and controlling the stair climbing operation by autonomously positioning the device with respect to the tread.
- In a robot device and a moving device movable with, for example, legs as the moving means, it can thus be judged from the tread information, for example the size and position of the tread, whether the sole can be placed on the tread, and from the riser information, which indicates the step height, whether the device can move to a tread at that height; if movement is judged possible, positioning itself autonomously makes it possible to climb and descend stairs.
- FIG. 1 is a diagram illustrating a conventional stair climbing operation.
- FIG. 2 is a diagram illustrating a conventional stair climbing operation.
- FIG. 3A and FIG. 3B are diagrams illustrating a conventional stair climbing operation.
- FIG. 4 is a perspective view showing an overview of a robot device according to an embodiment of the present invention.
- FIG. 5 is a view schematically showing a configuration of a degree of freedom of a joint included in the robot apparatus.
- FIG. 6 is a schematic diagram showing a control system configuration of a robot device.
- FIG. 7 is a functional block diagram showing the system that executes processing from the stereo data of the robot apparatus through to the stair climbing operation.
- FIG. 8A is a schematic diagram showing a state in which the robot apparatus photographs the outside world
- FIG. 8B is a view showing the size of the sole of the robot apparatus.
- FIG. 9 is a diagram for explaining staircase detection.
- FIG. 9A is a diagram of the stairs viewed from the front
- FIG. 9B is a diagram of the stairs viewed from the side
- FIG. 10 is a diagram illustrating another example of staircase detection.
- FIG. 10A is a diagram of the stairs viewed from the front
- FIG. 10B is a diagram of the stairs viewed from the side
- FIG. 11 is a diagram showing an example of a result of detecting the stairs in FIG. 9;
- FIG. 11A is a schematic diagram showing an image obtained by photographing the stairs in FIG. 9;
- FIG. 11B is a diagram showing three-dimensional distance data acquired from the image shown in FIG. 11A.
- FIG. 12 is a diagram showing an example of a result of detecting the stairs in FIG. 10;
- FIG. 12A is a schematic diagram showing an image obtained by photographing the stairs in FIG. 10;
- FIG. 12B is a diagram showing three-dimensional distance data acquired from the image shown in FIG. 12A.
- FIG. 13A is a schematic diagram showing an image of a staircase, and
- FIG. 13B is a diagram showing the result of detecting four plane areas A, B, C, and D from the three-dimensional distance data obtained from FIG. 13A.
- FIG. 14 is a functional block diagram showing a staircase recognizer.
- FIG. 15 is a flowchart showing a procedure of a staircase detection process.
- FIG. 16A and FIG. 16B are schematic diagrams showing polygons.
- FIG. 17 is a schematic diagram for explaining the algorithm of Melkman.
- FIG. 18A and FIG. 18B are schematic diagrams for explaining a method for obtaining a polygon by Sklansky's algorithm.
- FIG. 19 is a schematic diagram for explaining the problem that occurs with a non-convex polygonal staircase;
- FIG. 19A is a diagram showing the input plane, and FIG. 19B is a diagram showing the polygon representation result for the non-convex polygonal staircase.
- FIG. 20 is a schematic diagram showing a method for obtaining a polygon including the input plane by smoothing;
- FIG. 20A is a diagram illustrating the input plane,
- FIG. 20B is a diagram illustrating the polygon from which discontinuous gaps have been removed, and
- FIG. 20C is a diagram illustrating the polygon obtained by further smoothing the polygon of FIG. 20B by line fitting.
- FIG. 21 is a diagram showing a program example of the process of obtaining a polygon including the input plane by gap removal and smoothing by line fitting.
- FIG. 22A and FIG. 22B are schematic diagrams for explaining a method of calculating staircase parameters.
- FIG. 23 is a schematic diagram for explaining tread and staircase parameters finally recognized.
- FIG. 24A and FIG. 24B are schematic diagrams showing stairs.
- FIG. 25 is a flowchart showing a method of staircase integration processing.
- FIG. 26 is a schematic diagram for explaining a process of integrating overlapping staircase data.
- FIG. 27 is a diagram for explaining an alignment operation.
- FIG. 28 is a schematic diagram for explaining an approach operation.
- FIG. 29 is a flowchart showing a procedure of a stair climbing operation.
- FIG. 30 is a flowchart showing a search-alignment-approach processing method.
- FIG. 31 is a flowchart showing a method of the climbing operation process.
- FIG. 32 is a schematic diagram showing a staircase surface recognized or scheduled to be recognized by the robot device.
- FIG. 33 is a flowchart showing a method of the climbing operation process.
- FIG. 34 is a flowchart showing a method of the climbing operation process.
- FIG. 35A is a diagram for explaining a relationship between a tread and a sole recognized by a robot device
- FIG. 35B is a diagram illustrating dimensions of respective parts.
- FIG. 36 is a traced image of the robot device performing a stair climbing operation.
- FIG. 37 is a traced image of the robot apparatus performing a stair climbing operation.
- FIG. 38 is a diagram showing the relationship between a single step and the sole of the robot apparatus.
- FIG. 39 is a view showing the relationship between a single recess and the sole of the robot apparatus.
- FIG. 40 is a functional block diagram showing the plane detection device in this modification.
- FIG. 41 is a diagram for explaining a robot apparatus having a means for giving a texture.
- FIG. 42 is a diagram illustrating a plane detection method by a line segment extension method in this modification.
- FIG. 43 is a flowchart showing plane detection processing by the line segment extension method.
- FIG. 44 is a flowchart showing details of processing in a line segment extraction unit in this modification.
- FIG. 45 is a diagram showing distributions of distance data points;
- FIG. 45A shows a case where the data are distributed in a zigzag with respect to a line segment, and
- FIG. 45B is a schematic diagram showing a case where the data are distributed uniformly in the vicinity of the line segment.
- FIG. 46 is a flowchart showing a Zig-Zag-Shape discrimination method according to the present modification.
- FIG. 47 is a diagram showing a program example of the Zig-Zag-Shape discrimination processing.
- FIG. 48 is a block diagram illustrating a processing unit that performs a Zig-Zag-Shape determination process.
- FIG. 49 is a schematic diagram for explaining an area extension process in the present modification.
- FIG. 50 is a flowchart showing a procedure of a process of searching for an area type and an area expanding process in an area expanding unit in the present modification.
- FIG. 51 is a diagram showing an example in which the root mean square error rms of the plane equation differs even when the distances between the end points and the straight line are equal;
- FIG. 51A shows a case where the line segment deviates from the plane due to the influence of noise or the like, and
- FIG. 51B is a schematic diagram showing a case where there is another plane to which the line segment belongs.
- FIG. 52 is a diagram showing an area type selection process.
- FIG. 53 is a diagram showing an area extension process.
- FIG. 54A is a schematic view showing the floor surface as seen by the robot apparatus standing and looking down at it, in which the vertical axis is x, the horizontal axis is y, and the z axis is expressed by the shading of each data point;
- FIG. 54B is a diagram showing the straight lines detected, by line segment extraction from the three-dimensional distance data along the pixel columns in the row direction, from the data point groups assumed to lie on a plane; and
- FIG. 54C is a diagram showing the plane region obtained from the straight-line group shown in FIG. 54B by the region-expanding process.
- FIG. 55 is a diagram for explaining the difference between the results of the plane detection method of this modification and of the conventional plane detection method when one step is placed on the floor surface;
- FIG. 55A is a schematic diagram showing the observed image,
- FIG. 55B is a diagram showing the experimental conditions,
- FIG. 55C is a diagram showing the result of plane detection by the plane detection method of this modification, and
- FIG. 55D is a diagram showing the result of plane detection by the conventional plane detection method.
- FIG. 56A is a schematic diagram showing an image of a floor surface, and FIGS. 56B and 56C are diagrams showing, for the three-dimensional distance data obtained by capturing the floor shown in FIG. 56A, the line segments detected in the horizontal and vertical directions by the line segment detection of this modification and by conventional line segment detection, respectively.
- In the embodiment described below, the present invention is applied to an autonomously operable robot device equipped with a stair recognition device for recognizing steps, such as staircases, existing in the surrounding environment.
- The robot device uses distance information obtained by stereo vision or the like.
- A bipedal walking robot device will be described as an example of such a robot device.
- This robot device is a practical robot that supports human activities in various situations of the living environment and other everyday life; it is an entertainment robot device that can act according to its internal state (anger, sadness, joy, enjoyment, etc.) and can display basic actions performed by humans.
- Although a bipedal walking robot device is described as an example, the stair recognition device is not limited to bipedal walking robot devices; any legged mobile robot device equipped with it can perform the stair climbing operation.
- FIG. 4 is a perspective view showing an overview of the robot device according to the present embodiment.
- As shown in FIG. 4, a head unit 203 is connected to a predetermined position of a trunk unit 202, to which two left and right arm units 204R/L and two left and right leg units 205R/L are also connected.
- R and L are suffixes indicating right and left, respectively; the same applies hereinafter.
- FIG. 5 schematically shows the configuration of the degrees of freedom of the joints provided in the robot apparatus 201.
- The neck joint supporting the head unit 203 has three degrees of freedom: a neck joint yaw axis 101, a neck joint pitch axis 102, and a neck joint roll axis 103.
- Each arm unit 204R/L constituting an upper limb comprises a shoulder joint pitch axis 107, a shoulder joint roll axis 108, an upper arm yaw axis 109, an elbow joint pitch axis 110, a forearm yaw axis 111, a wrist joint pitch axis 112, a wrist joint roll axis 113, and a hand 114.
- The hand 114 is actually a multi-joint, multi-degree-of-freedom structure including a plurality of fingers. However, since the movement of the hand 114 contributes little to the posture control and walking control of the robot apparatus 201, it is assumed in this specification to have zero degrees of freedom for simplicity. Each arm therefore has seven degrees of freedom.
- The trunk unit 202 has three degrees of freedom: a trunk pitch axis 104, a trunk roll axis 105, and a trunk yaw axis 106.
- Each leg unit 205R/L constituting a lower limb comprises a hip joint yaw axis 115, a hip joint pitch axis 116, a hip joint roll axis 117, a knee joint pitch axis 118, an ankle joint pitch axis 119, an ankle joint roll axis 120, and a sole 121.
- the intersection of the hip joint pitch axis 116 and the hip joint roll axis 117 defines the hip joint position of the robot device 201.
- Although the human sole is actually a multi-joint, multi-degree-of-freedom structure, the sole of the robot apparatus 201 is assumed in this specification to have zero degrees of freedom for simplicity. Each leg therefore has six degrees of freedom.
- In total, the robot device 201 as described above has 32 degrees of freedom (3 + 7 × 2 + 3 + 6 × 2). An entertainment robot device is, however, not necessarily limited to 32 degrees of freedom; the number of degrees of freedom, that is, the number of joints, can be increased or decreased as appropriate according to design constraints, production constraints, required specifications, and the like.
- Each degree of freedom of the robot apparatus 201 described above is actually implemented using an actuator. Because of demands such as eliminating extra bulges from the appearance to approximate the human body shape and controlling the posture of an unstable bipedal walking structure, the actuators are preferably small and lightweight.
- Such a robot device includes a control system that controls the operation of the entire robot device, for example, in the trunk unit 202 or the like.
- FIG. 6 is a schematic diagram illustrating the control system configuration of the robot device 201. As shown in FIG. 6, the control system comprises a thought control module 200, which dynamically responds to user input and the like and performs emotion judgment and emotional expression, and a motion control module 300, which controls the whole-body cooperative movement of the robot device 201, such as driving the actuators 350.
- The thought control module 200 is an independently driven information processing device comprising a CPU (Central Processing Unit) 211 that executes arithmetic processing related to emotion judgment and emotional expression, a RAM (Random Access Memory) 212, a ROM (Read Only Memory) 213, an external storage device (hard disk drive or the like) 214, and so on, and can perform self-contained processing within the module.
- The thought control module 200 determines the current emotion and intention of the robot device 201 according to external stimuli, such as image data input from the image input device 251 and audio data input from the voice input device 252. That is, by recognizing, for example, the user's facial expression in the input image data and reflecting that information in the emotions and intentions of the robot device 201, it can express an action corresponding to the user's facial expression.
- The image input device 251 includes, for example, a plurality of CCD (Charge Coupled Device) cameras, and a distance image can be obtained from the images captured by these cameras.
- the audio input device 252 includes, for example, a plurality of microphones.
- The thought control module 200 issues commands to the motion control module 300 to execute motion or action sequences based on its decisions, that is, movements of the limbs.
- The motion control module 300 controls the whole-body cooperative motion of the robot device 201. It is an independently driven information processing device comprising a CPU 311, a RAM 312, a ROM 313, an external storage device (hard disk drive or the like) 314, and so on, and can perform self-contained processing within the module. The external storage device 314 can store, for example, walking patterns calculated offline, target ZMP trajectories, and other action plans.
- Connected to the motion control module 300 via a bus interface (I/F) 310 are various devices: the actuators 350 realizing the joint degrees of freedom distributed over the whole body of the robot apparatus 201 shown in FIG. 5; a distance measurement sensor (not shown) for measuring the distance to an object; a posture sensor 351 for measuring the posture and inclination of the trunk unit 202; ground contact confirmation sensors 352 and 353 for detecting take-off and landing of the left and right soles; load sensors provided on the soles 121; a power supply control device 354 that manages power sources such as the battery; and so on.
- The posture sensor 351 is configured by, for example, a combination of an acceleration sensor and a gyro sensor, and the ground contact confirmation sensors 352 and 353 are configured by proximity sensors, micro switches, or the like.
- The thought control module 200 and the motion control module 300 are built on a common platform and are interconnected via bus interfaces 210 and 310.
- The motion control module 300 controls the whole-body cooperative movement by the actuators 350 so as to embody the action instructed by the thought control module 200. That is, the CPU 311 retrieves an operation pattern corresponding to the instructed action from the external storage device 314, or internally generates an operation pattern. Then, according to the specified motion pattern, the CPU 311 sets the foot motion, ZMP trajectory, trunk motion, upper limb motion, horizontal position and height of the waist, and so on, and transfers command values instructing motions according to these settings to the actuators 350.
- The CPU 311 also detects the posture and inclination of the trunk unit 202 of the robot apparatus 201 from the output signal of the posture sensor 351, and detects whether each leg unit 205R/L is a swing leg or a stance leg from the output signals of the ground contact confirmation sensors 352 and 353, thereby adaptively controlling the whole-body cooperative movement of the robot apparatus 201. Furthermore, the CPU 311 controls the posture and operation of the robot device 201 so that the ZMP position is always directed toward the center of the ZMP stable region.
- The motion control module 300 returns to the thought control module 200 the degree to which the action determined by the thought control module 200 has been carried out, that is, the state of processing. In this way, the robot device 201 can judge its own state and the surrounding situation based on the control program and act autonomously.
- a stereo vision system is mounted on the head unit 203, and three-dimensional distance information of the outside world can be acquired.
- In the present embodiment, planes are detected using the three-dimensional distance data acquired from the surrounding environment by the stereo vision system, which is preferably mounted on such a robot device.
- FIG. 7 is a functional block diagram showing the system that executes processing from the stereo data of the robot apparatus through to the start of the stair climbing operation.
- As shown in FIG. 7, the robot apparatus comprises a stereo vision system (Stereo Vision System) 1 as distance data measuring means for acquiring three-dimensional distance data, a plane detector 2 that receives stereo data D1 from the stereo vision system 1 and detects planes, a staircase recognizer 3, and a stair climbing controller 4.
- The robot apparatus first observes the outside world by stereo vision and outputs stereo data D1, three-dimensional distance information calculated from the parallax between the two eyes, as an image. That is, it compares the image inputs of two cameras, corresponding to the left and right human eyes, for each pixel neighborhood, estimates the distance to the target from the parallax map, and outputs the three-dimensional distance information as an image (distance image).
- By detecting planes with the plane detector 2 using this distance image, a plurality of planes existing in the environment can be recognized.
- The staircase recognizer 3 extracts, from these planes, the planes the robot can step onto, recognizes stairs from them, and outputs stair data.
- FIG. 8A is a schematic diagram illustrating the robot apparatus 201 capturing an image of the outside world. Taking the floor as the x-y plane and the height direction as the z direction, the visual field range of the robot device 201, whose image input unit (stereo camera) is in the head unit 203, is a predetermined range in front of the robot device 201, as shown in FIG. 8A.
- The robot device 201 realizes the various processes described below in software: the CPU 211 described above receives a color image and a parallax image from the image input device 251, together with sensor data such as all the joint angles of the actuators 350, and executes the processing.
- The software in the robot device 201 is configured object by object; it performs various recognition processes that recognize the position and movement amount of the robot device, surrounding obstacles, an environment map, and the like, and outputs an action sequence for the action the robot device should ultimately take.
- As coordinates indicating the position of the robot apparatus, two coordinate systems are used: a world reference coordinate system (hereinafter also called absolute coordinates) whose origin is a predetermined position based on a specific object such as a landmark, and a robot-centered coordinate system (hereinafter also called relative coordinates) whose origin is the robot device itself.
- The robot-centered coordinate system is determined using the joint angles obtained from the sensor data. A homogeneous transformation matrix from the robot-centered coordinate system to the camera coordinate system is derived, and a distance image comprising this homogeneous transformation matrix and the corresponding three-dimensional distance data is output.
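- As a generic illustration of carrying a homogeneous transformation matrix together with the distance data (not the patent's actual data format; the pose values below are placeholders):

```python
import numpy as np

def make_homogeneous(R, t):
    """Build a 4x4 homogeneous transformation matrix from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def transform_points(T, points):
    """Apply the 4x4 transform T to an (N,3) array of points."""
    hom = np.hstack([points, np.ones((len(points), 1))])
    return (hom @ T.T)[:, :3]

# example: camera pose expressed in the robot-centered frame (illustrative values)
T_robot_from_camera = make_homogeneous(np.eye(3), np.array([0.05, 0.0, 1.2]))
camera_points = np.array([[0.0, 0.0, 2.0]])          # a point 2 m ahead of the camera
robot_points = transform_points(T_robot_from_camera, camera_points)
```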
- With the above configuration, the robot device can recognize a staircase included in its visual field and perform the stair climbing operation using the recognition result (hereinafter referred to as stair data). For the stair climbing operation, the robot device must make various judgments about the size of the stairs: whether the tread is large enough for its sole, whether the riser is a height it can climb or descend, and so on.
- The size of the sole of the robot apparatus is defined as shown in FIG. 8B. That is, the forward direction of the robot apparatus 201 is taken as the x-axis direction, the direction parallel to the floor surface and orthogonal to the x direction as the y direction, and the distance between the two feet in the y direction when the robot apparatus 201 stands upright as the base width.
- The width of the sole forward of the ankle (the joint between the leg and the sole) is foot_front_size, and the width of the sole behind the ankle is foot_back_size.
- Stairs detected by the robot apparatus 201 in the environment include, for example, those shown in FIGS. 9 and 10. FIGS. 9A and 10A show the stairs viewed from the front, FIGS. 9B and 10B show them viewed from the side, and FIGS. 9C and 10C show them viewed obliquely.
- The surface used by humans and robot devices to go up and down stairs (the surface on which the feet or movable legs are placed) is referred to as the tread, and the height from one tread to the next (the height of one step) is called the riser (kick-up).
- Steps are counted as the first step, the second step, and so on, climbing from the side closer to the ground.
- The staircase ST1 shown in FIG. 9 is a three-step staircase with a riser of 4 cm; the treads of the first and second steps are 30 cm wide and 10 cm deep, and only the third, top step is 30 cm wide and 21 cm deep.
- The staircase ST2 shown in FIG. 10 is also a three-step staircase, with a 3 cm riser; the treads of the first and second steps are 33 cm wide and 12 cm deep, and only the third, top step is 33 cm wide and 32 cm deep. The results of the robot device recognizing these stairs are described later.
- The plane detector 2 detects a plurality of planes existing in the environment from the distance data D1 and outputs plane data D2.
- For the plane detection, a known plane detection technique using the Hough transform can be applied, in addition to the line segment extension method described later. In the present embodiment, planes are detected accurately by performing plane detection by the line segment extension method described later.
- FIGS. 11 and 12 are diagrams illustrating examples of staircase detection results.
- FIGS. 11 and 12 show examples in which three-dimensional distance data were acquired from images of the stairs shown in FIGS. 9 and 10, respectively, and plane detection was performed by the plane detection method described later. That is, FIG. 11A is a schematic diagram showing the image of the photographed stairs, and FIGS. 11B to 11D are diagrams showing the three-dimensional distance data acquired from the image shown in FIG. 11A.
- Similarly, FIG. 12A is a schematic diagram showing the image of the photographed staircase of FIG. 10, and FIGS. 12B to 12D are diagrams showing the three-dimensional distance data acquired from the image shown in FIG. 12A.
- In both cases, all treads were successfully detected as planes.
- FIG. 11B shows an example in which the first, second, and third steps from the bottom were each detected as a plane.
- FIG. 12B shows that part of the floor surface was also successfully detected as a separate plane.
- The areas A to D are detected as planes indicating the floor surface and the treads of the first, second, and third steps, respectively.
- The point cloud contained in each of the areas A to D indicates the group of distance data points estimated to constitute the same plane.
- The plane data D2 detected by the plane detector 2 are input to the stair recognizer 3, which recognizes the shape of the stairs, that is, the size of the treads, the height of each step (the riser), and so on.
- As described later in detail, the staircase recognizer 3 in the present embodiment recognizes, for the area (polygon) included in a tread recognized by the robot apparatus 201, the boundary on the near side (the side closer to the robot apparatus), hereinafter referred to as the front edge FE, and the boundary on the far side, hereinafter referred to as the back edge BE.
- The stair climbing controller 4 controls the stair climbing operation using these stair data.
- The stair climbing control method of the robot device will now be described specifically.
- The staircase recognition method of the robot device, the stair climbing operation using the recognized stairs, and the plane detection method by the line segment extension method as a specific example of plane detection are explained below, in this order.
- FIG. 14 is a functional block diagram showing the staircase recognizer shown in FIG.
- As shown in FIG. 14, the stair recognizer 3 comprises a stair detector (Stair Extraction) 5, which detects stairs from the plane data D2 output by the plane detector 2, and a stair merging unit (Stair Merging) 6, which recognizes stairs more accurately by integrating the time-series stair data D3 detected by the stair detector 5, that is, multiple stair data D3 detected at different times.
- The stair data D4 integrated by the stair merging unit 6 are the output of the stair recognizer 3.
- The staircase detector 5 detects staircases from the plane data D2 input from the plane detector 2.
- The plane data D2 input from the plane detector 2 carry several pieces of information for each plane, and plane data are input for each of the multiple planes detected from the image captured by the stereo vision system 1; for each plane, the data include the plane parameters (normal vector and distance from the origin).
- From these planes, the robot device selects those substantially horizontal to the surface it stands on, such as the floor or a tread, and calculates the following information (hereinafter referred to as staircase parameters).
- The front edge FE and the back edge BE recognized by the robot device indicate the boundaries (lines) of the tread of the stairs as described above: the boundary near the robot device (near-side boundary) is the front edge FE, and the boundary far from the robot device (far-side boundary) is the back edge BE.
- A minimum polygon including all points constituting the plane can be obtained, and its near-side and far-side boundaries taken as these edges.
- The information on the front edge FE and the back edge BE can be the information on their end points. Information such as the width W of the stairs and the length L of the stairs can also be obtained from the polygon.
- The riser height can be calculated as the difference in height between the center points of the planes in the given plane data D2, or as the difference in height between the centers of gravity of the polygons. The riser may also be based on the difference in height between the back edge BE of the nearer step and the front edge FE of the farther step.
- In addition to the front edge FE and the back edge BE, the regions adjacent to the left and right of the region sandwiched between the front edge FE and the back edge BE (the safety region), which are estimated to have a high probability of being movable, are recognized as margins (margin regions). How these are obtained is described later.
- Staircase parameters such as the number of points in the data point group forming the tread and the information on the reference point defining the center of gravity mentioned above can also be included in the stair data D3.
- From the above stair data, a plane (stair) satisfying the following conditions is extracted (the filter is sketched in code after this list):
- the lengths of the front edge FE and the back edge BE are greater than or equal to a predetermined threshold;
- the riser height is less than or equal to a predetermined threshold;
- the stair width W is greater than or equal to a predetermined threshold; and
- the stair length L is greater than or equal to a predetermined threshold.
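- A minimal Python sketch of this extraction filter, with illustrative threshold values and a hypothetical record type (the patent does not define concrete names):

```python
from collections import namedtuple

# illustrative container; the patent does not define a concrete data layout
StairCandidate = namedtuple(
    "StairCandidate", "front_edge_len back_edge_len rise width length")

def is_climbable_stair(s,
                       min_edge_len=0.10,  # m: front/back edge length threshold
                       max_rise=0.05,      # m: riser height the robot can manage
                       min_width=0.10,     # m: tread width W threshold
                       min_length=0.15):   # m: tread length L threshold
    """Apply the four extraction conditions to one detected plane."""
    return (s.front_edge_len >= min_edge_len
            and s.back_edge_len >= min_edge_len
            and abs(s.rise) <= max_rise
            and s.width >= min_width
            and s.length >= min_length)

# example: a 30 cm wide, 10 cm deep tread with a 4 cm riser, as in staircase ST1
print(is_climbable_stair(StairCandidate(0.30, 0.30, 0.04, 0.10, 0.30)))  # True
```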
- FIG. 15 is a flowchart showing the procedure of the staircase detection process of the staircase detector 5.
- First, it is judged whether the input plane is a plane that can be walked on or moved on, for example whether it is horizontal to the contact surface (step S1).
- The condition for what counts as a horizontal or movable plane may be set according to the capabilities of the robot device. For example, if the normal vector of the input plane is v = (v_x, v_y, v_z), the plane can be judged horizontal when v_z is sufficiently close to 1, that is, when |v_z| ≥ v_min, where v_min is a threshold value for judging a horizontal plane.
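- As a sketch, this horizontality test of step S1 is a single comparison on the unit normal (the threshold value is illustrative):

```python
def is_horizontal(normal, v_min=0.95):
    """Step S1: treat a plane with unit normal (vx, vy, vz) as horizontal
    when |vz| is sufficiently close to 1 (the normal is nearly vertical)."""
    return abs(normal[2]) >= v_min
```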
- If it is judged in step S1 that the plane is not horizontal (step S1: No), a detection failure is output and the processing for that plane ends; the same processing is then performed on the next plane data.
- If the plane is horizontal (step S1: Yes), processing for recognizing the boundary (shape) of the plane is performed.
- Here, a convex hull is obtained by Sklansky's algorithm (J. Sklansky, "Measuring concavity on a rectangular mosaic," IEEE Trans. Comput., C-21, 1972, pp. 1355-1364) or Melkman's algorithm (A. Melkman, "On-line Construction of the Convex Hull of a Simple Polyline," Information Processing Letters 25, 1987, p. 11), or a polygon encompassing the input plane is obtained by smoothing that removes noise (step S2).
- The boundary lines at the front and back of this polygon are determined as the staircase parameters front edge and back edge (step S3). In the present embodiment, the width W and the length L of the tread, the plane indicating the stair tread, are then determined from these two boundary lines, and it is judged whether these values are larger than predetermined thresholds (step S4).
- If they are smaller than the thresholds (step S4: No), it is judged that the plane is not one the robot apparatus can move on, and the processing from step S1 is repeated for the next plane data.
- If the width and length of the plane are equal to or larger than the predetermined thresholds (step S4: Yes), the tread is judged to be movable, the left and right margins (Left Margin, Right Margin) are calculated (step S5), and the result is output as stair data D3.
- FIG. 16 is a schematic diagram showing a convex polygon.
- FIG. 16A shows all the supporting points judged to belong to one input plane (contained in a continuous area on the same plane), and FIG. 16B shows the convex polygon obtained from the figure shown in FIG. 16A.
- The convex polygon shown here can use a convex hull for finding the minimum convex set including the given plane figure (the area including the supporting points).
- The point indicated by G is used when calculating the width W of the tread and indicates, for example, a point such as the center of gravity of the area including the supporting points (the reference point).
- FIG. 17 is a schematic diagram for explaining Melkman's algorithm. As shown in FIG. 17, three points P1, P2, and P3 are taken from the points in the given figure, a line segment connecting P1 and P2 is drawn, and straight lines are drawn through P1 and P3 and through P2 and P3. The plane is thereby divided into five regions AR1 to AR5, including the triangle AR4 formed by the three points P1, P2, and P3.
- The process of determining which region the next selected point P4 falls in and re-forming the polygon is repeated to update the convex polygon. For example, if P4 lies in region AR1, the region surrounded by segments connected in the order P1, P2, P4, P3 becomes the updated convex polygon. If P4 lies in region AR3 or AR5, the convex polygon is updated as the region surrounded by segments connected in the order P1, P2, P3, P4 or in the order P1, P4, P2, P3, respectively.
- If P4 lies inside the triangle AR4, the convex polygon is not updated. If P4 lies in region AR2, point P3 is excluded and the convex polygon is updated as the region surrounded by segments connecting the remaining points in order.
- By taking into account the region in which each point falls in this way, a convex polygon can be generated for all the supporting points.
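- Melkman's algorithm maintains the hull on-line in a deque; for a fixed point set, Andrew's monotone chain is a compact standard substitute that produces the same convex hull. A sketch for 2D points given as (x, y) tuples (this is a stand-in, not the patent's exact procedure):

```python
def convex_hull(points):
    """Andrew's monotone chain: returns hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z component of (a - o) x (b - o); positive for a left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:                       # build the lower hull
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):             # build the upper hull
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]      # drop duplicated end points
```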
- FIG. 18 is a schematic diagram for explaining the method of obtaining a polygon by Sklansky's algorithm.
- The polygons extracted by Sklansky's algorithm are called weakly externally visible polygons.
- Its computational complexity is small, so high-speed operation is possible.
- In this method, when a half line can be drawn from an arbitrary point x on the boundary of the given figure 131 to a circle 132 enclosing the figure 131 without crossing the figure, as shown in FIG. 18A, that point is taken to form part of the boundary of the convex polygon.
- As shown in FIG. 18B, no half line can be drawn from another point y on the boundary of the given figure 133 to a circle 134 enclosing the figure 133 without crossing the figure 133; such a point y therefore does not form part of the boundary of the convex polygon.
- By performing this judgment on all boundary points of a figure such as that shown in FIG. 16A, the convex polygon shown in FIG. 16B can be obtained.
- In consideration of the accuracy, characteristics, and the like of the stereo vision system 1, when obtaining the convex polygon from FIG. 16A, a convex polygon circumscribing the figure of FIG. 16A may be obtained, as shown in FIG. 16B, or a convex polygon inscribed in the figure of FIG. 16A may be obtained. These methods can also be used selectively according to the degree of inclination of the plane and the surrounding situation.
- When the tread is not convex, the convex hull deviates greatly from the actual tread shape. FIG. 19 is a schematic diagram showing this problem.
- FIG. 19A shows an input plane step0 that is a non-convex polygonal step.
- FIG. 19B shows the polygonal representation of step0 using the convex hull, with significant deviations from the desired result in the non-convex portions.
- FIG. 20 is a schematic diagram showing this smoothing.
- FIG. 20A shows all the supporting points judged to belong to one input plane (the region including the distance data point group contained in a continuous area on the same plane), that is, the input polygon.
- FIG. 20B shows the polygon smoothed by removing discontinuous gaps (close gaps) from the polygon indicating the input plane.
- FIG. 20C shows the polygon obtained by further smoothing the polygon of FIG. 20B by line fitting (fit line segments).
- FIG. 21 shows an example of a program for the process of obtaining a polygon including the input plane by gap removal (Close gaps) and smoothing by line fitting (Fit line segments).
- First, the gap removal method: three consecutive vertices are selected from the vertices representing the polygon, and if the center point is far from the straight line connecting the two end points, the center point is removed. This process is continued on the remaining vertices until there are no more points to remove.
- Next, the line fitting method: three consecutive vertices are selected from the vertices of the polygon, and a straight line approximating these three points, together with the error between the line and the points, is obtained by the least squares method; vertices that the fitted line approximates within a small error can then be merged.
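- The two smoothing passes can be sketched as follows; the thresholds and the least-squares collinearity test are illustrative choices, not values given in the patent:

```python
import numpy as np

def _dist_to_line(p, a, b):
    """Perpendicular distance from vertex p to the line through a and b (2D)."""
    p, a, b = (np.asarray(v, dtype=float) for v in (p, a, b))
    ab, ap = b - a, p - a
    return abs(ab[0] * ap[1] - ab[1] * ap[0]) / (np.linalg.norm(ab) + 1e-12)

def _lsq_line_rms(pts):
    """RMS distance of the points from their least-squares line."""
    pts = np.asarray(pts, dtype=float)
    s = np.linalg.svd(pts - pts.mean(axis=0), compute_uv=False)
    return float(s[-1] / np.sqrt(len(pts)))

def close_gaps(vertices, gap_thresh=0.02):
    """Remove the middle of any three consecutive vertices that lies far
    from the straight line connecting its two neighbors (spike removal)."""
    v = list(vertices)
    changed = True
    while changed and len(v) > 3:
        changed = False
        for i in range(len(v)):
            if _dist_to_line(v[i], v[i - 1], v[(i + 1) % len(v)]) > gap_thresh:
                v.pop(i)
                changed = True
                break
    return v

def fit_line_segments(vertices, rms_thresh=0.005):
    """Drop the middle of three consecutive vertices whenever a least-squares
    line approximates all three within a small error (merging collinear runs)."""
    v = list(vertices)
    changed = True
    while changed and len(v) > 3:
        changed = False
        for i in range(len(v)):
            if _lsq_line_rms([v[i - 1], v[i], v[(i + 1) % len(v)]]) < rms_thresh:
                v.pop(i)
                changed = True
                break
    return v
```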
- FIG. 22 is a schematic diagram for explaining the method of calculating the staircase parameters. As shown in FIG. 22A, assume that the obtained polygon 140 is the region surrounded by its vertices, including the points 142 and 147. The line segment forming the front boundary of the polygon 140 as seen from the robot apparatus 201 is the front edge FE, and the line segment forming the rear boundary is the back edge BE.
- The width W of the stair tread is the length of the line connecting the center point C of the front edge FE to the reference point G.
- The reference point G can be set at the approximate center of the plane that is to be a tread: for example, the center point of all the supporting points, the center of gravity of the polygon 140, a point determined from the end points of the front edge FE and the back edge BE, or the center of gravity of the safety area 152 shown in FIG. 22B can be used.
- The length L of the stairs is, for example, the shorter (or the longer) of the lengths of the front edge FE and the back edge BE, or of the front edge FE and the back edge BE including the left and right margins described below.
- FIG. 23 is a schematic diagram for explaining the tread and stair parameters finally recognized.
- Margins M_L and M_R are provided at the left and right ends of the safety area 152, and the area 151 including the left and right margins M_L and M_R is finally taken as the tread.
- For the left and right margins M_L and M_R, if polygon vertices lie outside the safety area 152 defined by the front edge FE and the back edge BE, those points are considered first. In FIG. 22A, for example, to find the right margin M_R, the point 142 farthest from the safety area 152 is selected, and perpendiculars are dropped from this point 142 onto the front edge FE and the back edge BE. The area 151 bounded by these perpendiculars, the front edge FE, and the back edge BE is then recognized as the tread.
- Alternatively, the margin may be obtained simply by drawing a line that passes through the point 142 and intersects the front edge FE or the back edge BE.
- Let the length of the left margin M_L on the same line as the front edge FE be lfm, the length of the left margin M_L on the same line as the back edge BE be lbm, and the lengths of the right margin M_R on the same lines as the front edge FE and the back edge BE be rfm and rbm, respectively.
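- Putting these parameters together, the width W (front-edge midpoint C to reference point G) and the length L can be computed as below; edges are assumed to be given by their two end points:

```python
import numpy as np

def tread_width(front_edge, reference_point):
    """W: distance from the midpoint C of the front edge FE to the reference point G."""
    a, b = np.asarray(front_edge[0], dtype=float), np.asarray(front_edge[1], dtype=float)
    c = (a + b) / 2.0
    return float(np.linalg.norm(np.asarray(reference_point, dtype=float) - c))

def stair_length(front_edge, back_edge, use_shorter=True):
    """L: the shorter (or longer) of the front-edge and back-edge lengths,
    optionally measured on the margin-inclusive edges."""
    lf = float(np.linalg.norm(np.asarray(front_edge[1]) - np.asarray(front_edge[0])))
    lb = float(np.linalg.norm(np.asarray(back_edge[1]) - np.asarray(back_edge[0])))
    return min(lf, lb) if use_shorter else max(lf, lb)
```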
- FIGS. 24A and 24B are schematic diagrams showing two types of stairs.
- FIG. 24A shows stairs having rectangular treads, as in FIGS. 9 and 10, while FIG. 24B shows stairs having a spiral shape.
- In the spiral staircase of FIG. 24B, the back edge BE is not parallel to the front edge FE, so an algorithm that simply extracts a rectangular area from the detected plane may not be applicable. By obtaining a polygon from the detected plane and determining the front edge FE and the back edge BE, as in the present embodiment, the robot apparatus can climb and descend even such a spiral staircase.
- The staircase integrator 6 receives the stair data (stair parameters) D3 detected by the stair detector 5 as input and integrates the stair data D3 over time to estimate more accurate stair information. For example, when the field of view of the robot device is narrow, the entire staircase may not be recognizable at once. In such a case, spatially overlapping pairs of stairs are searched for in the old stair data, such as the previous frame, and the new stair data, such as the current frame, and each overlapping pair is integrated to define a new virtual staircase. By continuing this operation until no overlapping stairs remain, the stairs can be recognized accurately.
- FIG. 25 is a flowchart showing a method of the staircase integration process in the staircase integrator 6.
- As shown in FIG. 25, the current stair data (New Stairs) and the old stair data (Old Stairs) are input (step S11), and all of the new and old stair data are combined into one set (union) (step S12).
- In this combined staircase data set, spatially overlapping stair data are searched for (step S13). If overlapping pairs exist (step S14: Yes), those stair data are integrated and registered in the stair data set (step S15). The processing of steps S13 and S14 is continued until no spatially overlapping pair of stairs remains (step S14: No), and the finally updated stair data set is output as the stair data D4.
- FIG. 26 is a schematic diagram for explaining the processing in step S13 for integrating the overlapping staircase data.
- FIG. 26 shows staircase data ST11 and ST12 that overlap spatially. To judge whether two stair data overlap spatially, for example, the difference in height at the reference points G of the two stair data ST11 and ST12 and the size of the overlapping area of their tread regions including the left and right margins can be used. That is, when the difference between the heights of the reference points of the two steps is equal to or less than a threshold (max_dz) and the overlap of the tread areas is sufficiently large, the stair data ST11 and ST12 are integrated and the reference point G of the integrated stair data ST13 is calculated.
- The area of the outer frame containing both stair data ST11 and ST12 is defined as the integrated step ST13. The safety areas (the tread areas excluding the left and right margins) of the stair data ST11 and ST12 before integration are merged to define a new safety area 165, and the areas obtained by removing the safety area 165 from the staircase data ST13 are defined as the margins M_L and M_R.
- The integrated front edge FE and back edge BE are obtained as follows.
- Both end points of the front edge FE of the integrated staircase data ST13 are obtained by comparing the end points of the front edge FE of the staircase data ST11 and of the staircase data ST12: for the right end the point lying further to the right (point 163) is taken, and for the left end the point lying further to the left. The position of the line of the front edge FE is taken as the line position closer to the robot apparatus (front side) of the two front edges of ST11 and ST12. For the back edge BE, conversely, the position on the farther side is selected, and the left and right end points 161 and 162 are selected so as to spread outward to the left and right.
- Note that the integration method is not limited to the above. For example, in consideration of the narrow field of view of the robot device, the integration may be performed so that the rectangular area determined by the front edge FE and the back edge BE of the integrated data ST13 becomes largest; if the field of view is wide or the accuracy of the distance data is sufficiently high, a combined area of the two staircase data may be used as the integrated staircase data. Further, the reference point G after integration can be obtained as a weighted average according to the ratio of the numbers of supporting points included in the staircase data ST11 and ST12. The loop and merge rule are sketched below.
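- The integration loop of FIG. 25 and FIG. 26 can be sketched as follows. The stair entries are represented here as dictionaries with an axis-aligned bounding box, a reference point g, and a supporting-point count; this representation, the helper names, and the default max_dz value are assumptions, and the merge rule is simplified to the outer frame and the weighted reference point.

```python
import numpy as np

def overlaps(s, t, max_dz=0.02):
    """Spatial overlap test: reference-point height difference within
    max_dz and intersecting tread rectangles (margins included)."""
    if abs(s["g"][2] - t["g"][2]) > max_dz:
        return False
    (sx0, sy0), (sx1, sy1) = s["bbox"]
    (tx0, ty0), (tx1, ty1) = t["bbox"]
    return sx0 < tx1 and tx0 < sx1 and sy0 < ty1 and ty0 < sy1

def merge(s, t):
    """Integrate two overlapping entries: the outer frame spreads to
    contain both, and the reference point is the average weighted by
    the numbers of supporting points."""
    (sx0, sy0), (sx1, sy1) = s["bbox"]
    (tx0, ty0), (tx1, ty1) = t["bbox"]
    w = s["n_support"] + t["n_support"]
    g = (np.asarray(s["g"]) * s["n_support"]
         + np.asarray(t["g"]) * t["n_support"]) / w
    return {"bbox": ((min(sx0, tx0), min(sy0, ty0)),
                     (max(sx1, tx1), max(sy1, ty1))),
            "g": tuple(g), "n_support": w}

def integrate(old_stairs, new_stairs):
    stairs = list(old_stairs) + list(new_stairs)    # union (step S12)
    merged_any = True
    while merged_any:                               # steps S13-S15
        merged_any = False
        for i in range(len(stairs)):
            for j in range(i + 1, len(stairs)):
                if overlaps(stairs[i], stairs[j]):
                    stairs.append(merge(stairs.pop(j), stairs.pop(i)))
                    merged_any = True
                    break
            if merged_any:
                break
    return stairs                                   # stair data D4
```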
- The stair climbing controller 4 uses the stair data D4, detected by the stair detector 5 and integrated by the staircase integrator 6, to control the robot apparatus so that it actually performs the stair climbing operation. This ascent/descent control also includes the operation of searching for the stairs.
- The stair climbing operation realized in this embodiment can be constructed from the following five state machines (search, align, approach, climb, and finish).
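- A minimal sketch of how these five states might be sequenced is shown below; the state names follow the operations described next, while the robot interface (the method names and their success/failure return values) is assumed purely for illustration.

```python
from enum import Enum, auto

class State(Enum):
    SEARCH = auto()
    ALIGN = auto()
    APPROACH = auto()
    CLIMB = auto()
    FINISH = auto()

def stair_climbing(robot):
    """Drive the search -> align -> approach -> climb -> finish cycle;
    each robot method is assumed to return True on success."""
    state = State.SEARCH
    while True:
        if state is State.SEARCH:
            state = State.ALIGN if robot.search_stairs() else State.FINISH
        elif state is State.ALIGN:
            state = State.APPROACH if robot.align_to_stairs() else State.SEARCH
        elif state is State.APPROACH:
            state = State.CLIMB if robot.approach_first_step() else State.SEARCH
        elif state is State.CLIMB:
            # Repeat while a next tread is observed after each step.
            state = State.CLIMB if robot.climb_one_step() else State.FINISH
        else:
            robot.finish()          # State.FINISH
            return
```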
- FIG. 27 is a diagram for explaining the alignment operation.
- In FIG. 27, the area 170 is the tread of the first step of the stairs recognized by the robot apparatus.
- The robot apparatus moves to a target position (hereinafter, the align position) separated from the center point of the front edge FE of the tread 170 by a predetermined distance ad (align_distance) in the direction orthogonal to the front edge FE. When the robot device is farther than a predetermined threshold max_d from the align position 172, or when the angle difference between the direction the robot device is facing and the direction perpendicular to the front edge FE is equal to or greater than a predetermined threshold max_a, the robot device moves again; the alignment operation is regarded as completed when the robot is within max_d of the align position 172 and the angle difference is below max_a (see the sketch below).
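- The completion test can be written as the following predicate; positions are taken as 2-D, angles are in radians, and the threshold values are placeholders, not the embodiment's.

```python
import math

def alignment_done(robot_pos, robot_yaw, align_pos, fe_normal_yaw,
                   max_d=0.05, max_a=math.radians(5.0)):
    """True when the robot is within max_d of the align position and
    its heading deviates from the direction perpendicular to the
    front edge FE by less than max_a."""
    dist = math.hypot(align_pos[0] - robot_pos[0],
                      align_pos[1] - robot_pos[1])
    # Wrap the heading difference into [-pi, pi] before comparing.
    ang = abs((robot_yaw - fe_normal_yaw + math.pi) % (2.0 * math.pi) - math.pi)
    return dist <= max_d and ang <= max_a
```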
- FIG. 28 is a schematic diagram for explaining the approach operation. As shown in FIG. 28, the robot apparatus 201 that has moved to the align position 172, the target position separated by align_distance, and completed the alignment operation next moves toward the stairs so that the center point C of the front edge FE of the tread 170 and the robot apparatus 201 squarely face each other and their distance becomes a predetermined value ax (approach_x); this target position is hereinafter referred to as the approach position.
- A stair climbing operation is then performed based on the stair data obtained by the stair recognition. When the robot moves to the next step and a further step is observed, the ascent or descent is continued. By repeating this operation until there is no next step, the stair climbing operation is realized.
- FIG. 29 is a flowchart showing the procedure of the stair climbing operation.
- First, the stairs are found by the search-align-approach operation: the robot searches for stairs, moves to a predetermined position facing the found stairs (alignment), and executes an approach operation toward the first step of the stairs (step S21). If this search-align-approach operation succeeds (step S22: Yes), the robot moves up or down the stairs (step S23) and outputs success. If it fails (step S22: No), failure is output and the processing ends; in this case, the processing from step S21 may be repeated.
- FIG. 30 is a flowchart showing the search-align-approach processing method.
- When the search-align-approach processing is started, the search operation (1) is executed (step S31). In the search operation (1), the head is swung to collect information over as wide a range as possible. Next, it is determined whether or not there are stairs that can be climbed (step S32). In step S32, using the height n_z of the plane forming the first tread among the detected stairs, ascent/descent is judged possible if the height satisfies step_min_z < n_z < step_max_z. If there is a climbable stair (step S32: Yes), an alignment operation is performed to move to the position at the specified distance (align_distance) from the stairs in order to recognize them close up (step S33), and the stairs about to be climbed are recognized again (step S34).
- the operation in step S34 is the search operation (2).
- Next, whether there is a climbable stair is checked again (step S35). If the search operation (2) succeeds, it is further checked whether the robot has successfully moved to the align position at the predetermined distance from the re-recognized stairs, that is, whether the alignment operation in step S33 has succeeded (step S36). If there is a climbable stair and the alignment has succeeded (steps S35 and S36: Yes), an approach operation is performed to advance to the front edge of the first step of the stairs (step S37). On the other hand, if there is no climbable stair in step S35, the process returns to step S31; if the alignment operation has not succeeded in step S36, the processing from step S33 is repeated.
- The stair climbing operation consists of ascent/descent operation processing 1, used when the robot can recognize, from the current moving plane, the next upper or lower step (hereinafter, the next step) and the step after that, and ascent/descent operation processing 2 and 3, used when only steps two or more steps ahead (hereinafter, two or more steps ahead) can be recognized.
- FIG. 31, FIG. 33, and FIG. 34 are flowcharts showing the processing methods of the ascent/descent operation processing 1 to 3, respectively.
- In the following, the step on which the robot is currently moving is denoted step-0, the next step step-1, the step after that step-2, and the step m steps ahead step-m.
- First, an operation of ascending or descending the stairs (climb operation (1)) is executed (step S41). In the climb operation (1), since the height n_z of the stairs has already been recognized in step S32 described above, whether to ascend or descend is determined from the sign of n_z, and the control parameter values used for the operation differ accordingly. That is, the climbing operation and the descending operation can be switched simply by switching the control parameters.
- Next, it is determined whether or not the climb operation (1) has succeeded (step S42). If it has (step S42: Yes), the search operation (3) is executed (step S43). This search operation (3) is an operation process in which the head unit equipped with the stereo vision system is moved, the surrounding distance data are acquired, and the next step is detected.
- FIG. 32 is a schematic diagram showing the staircase surfaces recognized, or scheduled to be recognized, by the robot device. As shown in FIG. 32, it is assumed that the sole 121L/R of the currently moving robot apparatus is on the tread 181. In FIG. 32, the safety area sandwiched between the front edge FE and the back edge BE and the left and right margins M_L, M_R adjacent to it are recognized as the tread. In this state, the robot apparatus recognizes the tread 182 of the next step (step-1); note that a gap 184 may exist between the tread 181 and the tread 182 because of the kick-up (riser) or the like.
- In the climb operation (1), it is determined whether it is possible to move from the tread 181 of the current step (step-0) to the tread 182 of the next step (step-1). Movement is judged possible when all of the following criteria are met.
- The robot is sufficiently close to the front edge FE of the tread 182 of the next step (step-1), and the angular deviation is below a predetermined threshold.
- The size of the tread 182 of the next step (step-1) is sufficiently large.
- The distance front_x from the front edge FE to the rear end of the sole 121L/R is larger than the control parameter front_x_limit of the specified climbing mode.
- The distance back_x from the back edge BE to the front end of the sole 121L/R is larger than the control parameter back_x_limit of the climbing mode.
- Further, from the difference (z2 - z1) between the height z1 at the reference point of the tread 182 of the next step (step-1) and the height z2 at the reference point of the tread 183 of the step after next (step-2), it can be determined whether moving from the tread 182 to the tread 183 will be an ascent or a descent. If the tread 183 of the step two steps ahead (step-2) cannot be recognized, the current ascending/descending mode may simply be maintained. A sketch of this movability and mode decision follows.
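- A sketch of the decision, under the assumption that front_x, back_x, and the reference-point heights have already been measured; the parameter container and the function name are ours.

```python
def next_step_decision(front_x, back_x, params, z1, z2=None):
    """Movability test for the next tread (step-1) and ascent/descent
    selection from the reference-point heights z1 (step-1) and
    z2 (step-2).  params carries front_x_limit and back_x_limit."""
    movable = (front_x > params["front_x_limit"]
               and back_x > params["back_x_limit"])
    if z2 is None:
        mode = None                      # step-2 unseen: keep current mode
    else:
        mode = "ascend" if (z2 - z1) >= 0 else "descend"
    return movable, mode
```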
- For example, the robot device aligns with the back edge BE of the tread 181 while on the tread 181 of the current step (step-0) and then moves to the tread 182 of the next step. If the gap 184 between the tread 181 and the tread 182 is large, the robot aligns with the front edge FE of the tread 182 of the next step (step-1) before moving onto it, and then aligns with its back edge BE. In the next climb operation it likewise aligns with the front edge FE of the tread 183 of the step after next (step-2), moves onto the tread 183, and aligns with its back edge BE. That is, the climbing operation is performed by aligning with the front edge FE of the tread of the next step, moving up or down, and then aligning with the back edge BE of the tread just reached. Note that the alignment operation may also be performed on only one of the edges: the robot aligns with the back edge BE of the current tread 181 and moves to the tread 182 of the next step, omitting the alignment with the front edge FE of the next step and performing the alignment only with the back edge BE of the tread after moving.
- The ascent/descent operation processing 1 described above can be applied when the robot device can observe the tread of the next movable step (step-1) during the stair climbing operation. For this, a biped robot device needs to be equipped with a stereo vision system that can look down at its own feet.
- Next, ascent/descent operation processing 2 and 3 will be described. These apply when, because of restrictions such as the movable angle of the joint between the head unit and the trunk unit of the robot apparatus, the tread of the next step (step-1) cannot be observed from the tread of the current step (step-0), but the tread of the step two steps ahead (step-2) or of steps further ahead (up to step-m) can be recognized.
- First, ascent/descent operation processing 2, for the case where the tread of the step two steps ahead (step-2) can be recognized, will be described.
- As shown in FIG. 33, the search operation (4) is first executed (step S51). This search operation (4) is the same processing as step S43 and the like described above, except that the tread of the step two steps ahead (step-2) is the recognition target.
- Next, the climb operation (2) is executed (step S52). The climb operation (2) is the same operation as the climb operation (1); here too, switching between ascending and descending is determined by the tread height n_z of the next step (step-1). Note, however, that the tread information of the next step used here is data observed earlier in time, before the robot moved onto the tread of the current step (step-0).
- If the climb operation (2) succeeds (step S53: Yes) and the tread of the next step is observed (step S54: Yes), the processing from step S51 is repeated for the next step (step S56). If the tread of the next step is not observed (step S54: No), the finish operation is executed (step S55), and the process ends.
- Next, ascent/descent operation processing 3, used when the treads up to a plurality of steps ahead (hereinafter, m steps ahead) can be observed and recognized, will be described. As shown in FIG. 34, the search operation (5) executed first is basically the same as step S51, except that the treads up to the recognizable m steps ahead (step-m) are the observation target.
- Next, the climb operation (3) is performed for k steps (step S62).
- In this case, the ascending/descending mode can also be determined from the differences between the heights of the plurality of treads observed so far. That is, if the height difference z_i - z_(i-1) of the i-th tread is negative, the operation descends the stairs; if z_i - z_(i-1) is zero or positive, the operation ascends.
- Note that the tread information used for movement in this climb operation (3) is data that was observed up to m steps before the current step.
- If the climb operation (3) succeeds (step S63: Yes) and a next tread is observed, the processing is repeated; if there is no tread to move to next (step S64: No), the finish operation is performed (step S65), and the process ends.
- As described above, climbing and descending the stairs can be executed by the same procedure, changing only the control parameters used for the ascending and descending operations in the climb operation. The control parameters used for the stair climbing operation regulate the position of the sole of the robot device with respect to the current tread.
- FIG. 35A is a diagram for explaining the relationship between the tread recognized by the robot device and the sole, and FIG. 35B is a diagram showing an example of the control parameters used for the climb operation. Each control parameter shown in FIG. 35A indicates the following.
- step_min_z: the minimum value of the height difference (kick-up) between the current step and the next step that can be climbed
- step_max_z: the maximum value of the height difference (kick-up) between the current step and the next step that can be climbed
- ax (approach_x): the distance between the front edge FE and the robot at the approach position
- front_x_limit: the limit value (minimal x-value) of the distance between the front edge FE of the tread and the rear end of the sole 121
- back_x_limit: the limit value (maximal x-value) of the distance between the back edge BE of the tread and the front end of the sole 121
- back_x_desired: the desired value of the distance between the back edge BE and the front end of the sole 121
- align_distance is a parameter used only for the alignment operation, when starting the stair climbing operation processing, that is, before the ascent or descent of the first step. approach_x is likewise a parameter used only for the approach operation, when starting the stair ascent/descent processing.
- front_x_limit and back_x_limit specify the relationship between the tread recognized by the robot device and the sole of the robot device. That is, if the distances between the front edge FE or back edge BE of the tread and the corresponding ends of the sole would fall below these values when moving onto the tread, it is determined that the tread cannot be moved onto or that, even if it can be, the subsequent ascent or descent is impossible. Negative values of front_x_limit and back_x_limit indicate that a tread smaller than the sole is acceptable; that is, in the climbing operation the tread can be judged movable even if it is smaller than the sole.
- back_x_desired indicates the desired distance between the back edge BE and the front end of the sole at the position where the robot device aligns with the back edge BE on the current tread. When ascending, as shown in FIG. 35B, the sole is placed before the back edge BE, in this embodiment 15 mm before it; when descending, on the other hand, the sole protrudes beyond the back edge BE, in this embodiment by 5 mm. This is because ascending requires a certain distance before stepping up to the next step, whereas descending does not, and a position protruding beyond the tread makes it easier to look down and observe the next and subsequent treads. The sketch below collects these parameters.
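- A sketch of the two parameter sets might look as follows; apart from the 15 mm / 5 mm figures for back_x_desired taken from the text, all numeric values are illustrative placeholders.

```python
from dataclasses import dataclass

@dataclass
class ClimbParams:
    step_min_z: float      # minimum climbable kick-up
    step_max_z: float      # maximum climbable kick-up
    approach_x: float      # front edge <-> robot distance at approach
    front_x_limit: float   # minimal front edge <-> sole rear-end distance
    back_x_limit: float    # maximal back edge <-> sole front-end distance
    back_x_desired: float  # desired back edge <-> sole front-end distance
    align_distance: float  # used only for the initial alignment

# Ascending: the sole stops 15 mm before the back edge BE.
ASCEND = ClimbParams(0.03, 0.20, 0.30, -0.02, -0.02, 0.015, 0.50)
# Descending: the sole protrudes 5 mm beyond the back edge BE.
DESCEND = ClimbParams(-0.20, -0.03, 0.30, -0.02, -0.02, -0.005, 0.50)

def params_for(n_z: float) -> ClimbParams:
    # Switching between ascent and descent is just a parameter switch.
    return ASCEND if n_z >= 0 else DESCEND
```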
- FIG. 36 and FIG. 37 are traces photographed while the robot apparatus actually performed the ascent/descent operation using the control parameters shown in FIG. 35B. FIG. 36 shows the robot apparatus climbing the stairs; when the search operation (4) was performed and no next tread was observed (No. 17), the stair climbing operation was finished (finish) (No. 18).
- FIG. 37 shows the descending operation: the search operation (No. 1, No. 4, No. 7, No. 10, No. 13, No. 16) and the climb operation (including the alignment operation) (No. 5, No. 6, No. 8, No. 9, No. 11, No. 14, No. 15) are repeated, and when the tread of the next step is no longer observed, the finish operation (No. 18) ends the ascent/descent operation.
- FIG. 38 is a diagram showing the relationship between a single step and the sole of the robot device. A case will be described in which a step 191 is located at the next step (step-1) and the robot device moves from its lower side to its upper side. Once on the step, the movement from the step 191 to the next area can be determined to be a descending operation.
- In accordance with this determination, the values of the above-mentioned control parameters may simply be changed in the climb operation.
- front_x_limit and back_x_limit for the climbing operation are both negative values, indicating that, as shown in FIG. 38, the sole 121 of the robot apparatus is judged able to move onto the step 191 even if it protrudes from it.
- FIG. 39 is a diagram showing the relationship between a single recess and the sole of the robot device. A case will be described in which a recess 192 is located at the next step (step-1) and the robot device moves from its lower side to its upper side. The movement out of the recess can be determined to be a climbing operation; therefore, as with the step 191, it suffices to change the control parameter values in the climb operation according to this determination. front_x_limit and back_x_limit for the descending operation are both positive values, so that, as shown in FIG. 39, the sole 121 of the robot apparatus is judged able to move into the recess 192 only when it is smaller than the recess.
- As described above, in the present embodiment, planes that can be judged movable are extracted from the detected planes, and the tread of the stairs is recognized as a polygon containing such a region. The stair climbing operation is then performed using tread information such as the polygonal front edge FE and back edge BE, and stair information including the height from the floor.
- In the ascent/descent operation, a search operation is performed for the tread to move to, and an align operation is performed on the front edge FE of the searched tread or on the back edge BE of the current moving surface. By judging from the height of the next moving surface relative to the current one whether to ascend or descend, and switching the control parameters accordingly, the robot can climb not only ordinary stairs with rectangular treads but also spiral stairs and the like, and the ascending and descending operations can be executed by the same procedure merely by changing the control parameters. Movement onto a single step or into a single recess can likewise be handled by the same control method. Furthermore, even a robot device with a limited field of view, because the stairs are large relative to the size of the robot device or because of restrictions on the position of the mounted stereo vision system, can recognize the stairs across multiple steps by integrating the stair data.
- Further, the plane detection device can reliably detect a plurality of planes by the line segment expansion method even when the field of view contains not one dominant plane but several planes, as with stairs. In the line segment extraction performed for plane detection, fitting line segments adaptively according to the distribution of the distance data points yields plane detection results that are robust against measurement noise.
- FIG. 40 is a functional block diagram showing the plane detection device according to this modification. As shown in FIG. 40, the plane detection device 100 comprises a stereo vision system (Stereo Vision System) 1 as distance data measuring means for acquiring three-dimensional distance data, and a plane detection unit 2 for detecting, by the line segment extension method, the planes existing in the distance image composed of the three-dimensional distance data. The plane detection unit 2 has a line segment extraction unit 2a, which selects from the distance data points forming the image the groups of points estimated to lie on the same plane and extracts a line segment for each group, and an area extension unit 2b, which detects one or a plurality of plane areas existing in the image from the group of line segments extracted by the line segment extraction unit 2a.
- The area extension unit 2b selects any three line segments estimated to lie on the same plane from the group of line segments and obtains a reference plane from them. It then determines whether a line segment adjacent to the selected three belongs to the same plane as the reference plane; if so, the reference plane is updated with that line segment as a region-extending line segment, and the area of the reference plane is extended.
- The line segment extraction unit 2a extracts, from each data string (each column or row of the distance image), a group of distance data points estimated to lie on the same plane in three-dimensional space, and generates one or more line segments from this group according to the distribution of the points. That is, if the distribution is judged to be biased, the data points are judged not to lie on the same plane and the group is divided, and the bias test is repeated on each divided group; if the distribution is not biased, a line segment is generated from the group. This is performed for all data strings, and the resulting line segment group D11 is output to the area extension unit 2b.
- The area extension unit 2b selects, from the line segment group D11, three line segments estimated to belong to the same plane and obtains from them a plane serving as a seed (the reference plane). It then extends the region (the seed region) over the distance image by integrating into it the line segments belonging to the same plane as the seed, and outputs the plane group D2.
- By performing these processes periodically, the robot 201 obtains the plane information, such as stairs, floors, and walls, that is important for actions such as avoiding obstacles and climbing stairs.
- Note that a pattern (texture) is required on the surface of the staircase ST2. Since parallax is obtained by matching the images of two cameras, parallax cannot be calculated for objects without a pattern and their distance cannot be measured accurately; that is, the measurement accuracy of the distance data in the stereo vision system depends on the texture of the measured object. Here, parallax denotes the displacement between the projections of a point in space onto the left and right images, and it varies with the distance from the cameras.
- The head unit of the robot apparatus is therefore provided not only with the stereo camera 11R/L constituting the stereo vision system but also with a light source 12 that outputs, for example, infrared light, as projection means. The light source 12 projects (irradiates) light onto objects having little or no texture, such as a stair ST3 without a pattern or a wall, and operates as pattern applying means for applying a random pattern PT. The means for applying the random pattern PT is not limited to a light source projecting infrared light; the pattern may also be physically drawn, but with infrared light a pattern can be applied that is invisible to human eyes yet observable with the CCD cameras mounted on the robot device.
- FIG. 42 is a diagram illustrating the plane detection method by the line segment extension method. In this method, processing is performed on each data string in the row or column direction of the image 11 captured from the focal point F. Distance data points belonging to the same plane form a straight line within a row of pixels (an image row), so line segments are generated from the distance data points estimated to belong to the same plane, and the plane is then estimated and detected from the line segments in the resulting group that are supposed to constitute the same plane.
- FIG. 43 is a flowchart showing the plane detection processing by the line segment extension method. First, a distance image is input (step S71), and line segments are obtained from the data points estimated to belong to the same plane in each pixel string in the row direction (or column direction) of the distance image (step S72). Then, line segments presumed to belong to the same plane are extracted from the line segment group, and the plane composed of them is obtained (step S73). In step S73, first a region serving as the seed of a plane (hereinafter, a "seed region") is selected; the condition is that three line segments, one each in vertically adjacent rows (or horizontally adjacent columns), lie on the same plane.
- The plane to which the three line segments of the selected seed region belong is set as the reference plane, obtained by averaging the three line segments, and the area composed of the three line segments is the reference plane area. Then, it is determined by comparing spatial distances whether a line segment in the row-direction (or column-direction) pixel string adjacent to the selected seed region lies on the same plane as the reference plane. If so, the adjacent line segment is added to the reference plane area (region extension processing), and the reference plane is updated to include the added line segment (plane update processing). This operation is repeated until no line segment on the same plane exists in the adjacent data strings, and the search for seed regions with the plane update and region extension processing is repeated until no region that can serve as a seed (three line segments) remains. Finally, of the plurality of obtained regions, those forming the same plane are connected.
- In addition, a plane recalculation process (step S74) is provided, which re-obtains each plane after excluding those of its line segments that deviate from it by a predetermined threshold or more; details will be described later.
- The process of detecting line segments from the three-dimensional distance data and merging the regions on the same plane into a single plane is the same as in the conventional plane detection by the line segment extension method; the line segment extraction method in step S72, however, differs from the conventional one. As described above, even if a line segment is generated so as to fit the distance data points as well as possible, over-segmentation or under-segmentation occurs unless the threshold is changed according to the accuracy of the distance data. Therefore, in this modification, the line segment extraction adaptively changes the threshold according to the accuracy of, and the noise in, the distance data by analyzing the distribution of the data points.
- The line segment extraction unit (Line Extraction) 2a receives the three-dimensional distance image from the stereo vision system 1 and detects, in each column or row of the distance image, the line segments estimated to lie on the same plane in three-dimensional space. To prevent the over-segmentation and under-segmentation problems caused by measurement noise and the like, that is, a single plane being split into several or multiple planes being recognized as one, it uses an algorithm (Adaptive Line Fitting) that adaptively fits line segments according to the distribution of the data points.
- Specifically, the line segment extraction unit 2a first roughly extracts a line segment (first line segment) using a relatively large threshold, and then analyzes the distribution of the data points belonging to the first line segment with respect to a line segment (second line segment) obtained from them by the least squares method described later. In other words, the data points are first roughly grouped by estimating whether they lie on the same plane, and the extracted group is then analyzed for bias in its distribution to estimate again whether the points really lie on the same plane. If the data point group fits the zig-zag shape described later, its distribution is judged to be biased and the group is divided; by repeating this, line segments are extracted adaptively with respect to the noise contained in the data point group.
- FIG. 44 is a flowchart showing the details of the processing in the line segment extraction unit 2a, that is, of step S72 in FIG. 43.
- First, distance data are input to the line segment extraction unit 2a, and a data point group presumed to lie on the same plane in three-dimensional space is extracted. Data points estimated to lie on the same plane are those whose mutual distance in three-dimensional space is below a predetermined threshold, for example sets of points whose adjacent points are no more than 6 cm apart; such a set is extracted as the data point group P[0..n-1] (step S81). It is then checked whether the number n of samples in P[0..n-1] is greater than the required minimum number of samples min_n (step S82); if n is less than min_n, an empty set is output as the detection result and the process ends (see the grouping sketch below).
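- The grouping of a scan line into data point groups can be sketched as follows; the 6 cm gap comes from the text, while min_n and the names are placeholders.

```python
import numpy as np

def split_into_groups(points, gap=0.06, min_n=4):
    """Split one row (or column) of 3-D distance data into groups
    P[0..n-1] whose adjacent points are closer than `gap`; groups
    with fewer than min_n points are discarded as empty results."""
    groups, current = [], []
    for p in points:
        p = np.asarray(p, dtype=float)
        if current and np.linalg.norm(p - current[-1]) >= gap:
            if len(current) >= min_n:
                groups.append(np.array(current))
            current = []
        current.append(p)
    if len(current) >= min_n:
        groups.append(np.array(current))
    return groups
```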
- Next, a line segment L1 connecting the end points of the data point group P[0..n-1] is generated (step S83), and the point of interest (division point) brk with the largest distance dist to the line segment L1 is obtained (step S84). If this distance dist exceeds the data point group division threshold max_d (S84: YES), the group is divided at brk into the two data point groups P[0..brk] and P[brk..n-1] (step S88). If dist does not exceed max_d (S84: NO), a line segment L2 is obtained from the data point group by the least squares method (step S85), and it is checked whether the data point group P[0..n-1] forms a Zig-Zag-Shape, described later, with respect to this line segment L2 (step S86). If it is not a Zig-Zag-Shape (S86: NO), the equation of the obtained line is added to the line segment extraction result list (step S87), and the process ends. If it is judged to be a Zig-Zag-Shape (S86: YES), the process proceeds to step S88 and, as in step S84, the group is divided at the point of interest brk into P[0..brk] and P[brk..n-1]. For each divided data point group, the processing from step S81 is performed again recursively, and this is repeated until no data point group is divided any further, that is, until every group has passed through step S87, finally yielding the line segment list.
- In this way, the influence of noise in the data point group P[0..n-1] is eliminated, and a line segment group consisting of a plurality of line segments can be detected with high accuracy.
- Note that for the division in step S84 following step S83, the point of interest brk is the point with the largest distance to the line segment L1 connecting the end points of P[0..n-1], whereas for the division in step S88 following step S86 it is the point with the largest distance to the line segment obtained by the least squares method as described above. If there are several points whose distance is equal to or greater than the data point group division threshold max_d, the data point group P[0..n-1] may be divided at all of them or at one or more selected points. A sketch of this recursive division follows.
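- The recursive division can be sketched as below, with the points taken as 2-D for brevity; the least-squares fit and the Zig-Zag-Shape test of steps S85 and S86 are sketched separately in the following subsections, so only the chord-distance division of steps S83, S84, and S88 is coded here, and the names are ours.

```python
import numpy as np

def chord_distances(pts):
    """Distances of the points to the line segment L1 joining the two
    end points of the group (steps S83 and S84)."""
    a, b = pts[0], pts[-1]
    ab = b - a
    n = np.array([-ab[1], ab[0]]) / np.linalg.norm(ab)   # unit normal to L1
    return np.abs((pts - a) @ n)

def extract_segments(pts, max_d, min_n=4, out=None):
    """Split at the farthest point brk while its distance exceeds
    max_d, otherwise accept the group as one segment candidate (the
    S85/S86 fit-and-zigzag check would be applied at that point)."""
    if out is None:
        out = []
    pts = np.asarray(pts, dtype=float)
    if len(pts) < min_n:
        return out                       # too few samples: discard
    d = chord_distances(pts)
    brk = int(np.argmax(d))
    if d[brk] > max_d:                   # step S88: divide at brk
        extract_segments(pts[:brk + 1], max_d, min_n, out)
        extract_segments(pts[brk:], max_d, min_n, out)
    else:
        out.append(pts)
    return out
```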
- Next, the method of generating a line segment by least squares (Least-Squares Line Fitting) in step S85 will be described: given the n data points P[0..n-1], we show how to find the equation of the straight line that best fits the data point group.
- The model of the equation of the straight line is expressed by the following equation (1):
x cos α + y sin α + d = 0 ... (1)
The total fitting error of the data points to this line is given by equation (2):
E_fit = Σ_i (x_i cos α + y_i sin α + d)^2 ... (2)
The straight line that best fits the data points is therefore the one minimizing the total error of equation (2). The α and d that minimize equation (2) can be obtained as in (3), using the mean and the variance-covariance matrix of the data point group P; a sketch of the computation follows.
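- Since the minimizing normal of equation (2) is the eigenvector of the covariance matrix of the points for the smallest eigenvalue, the fit can be sketched as follows; this is our formulation of the closed form referred to as (3), not the patent's own notation.

```python
import numpy as np

def fit_line_lsq(pts):
    """Fit x*cos(a) + y*sin(a) + d = 0 (equation (1)) by minimizing
    the total error E_fit of equation (2)."""
    pts = np.asarray(pts, dtype=float)
    mean = pts.mean(axis=0)
    w, v = np.linalg.eigh(np.cov(pts.T))   # eigenvalues in ascending order
    normal = v[:, 0]                       # direction of least variance
    alpha = float(np.arctan2(normal[1], normal[0]))
    d = -float(normal @ mean)              # the line passes through the mean
    return alpha, d
```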
- Next, the method of determining the zigzag shape (Zig-Zag-Shape) in step S86 will be described. FIG. 46 is a flowchart showing the Zig-Zag-Shape discrimination method.
- First, the data point group P[0..n-1] and the straight line Line(α, d, σ) are input (step S90), where σ denotes the standard deviation of the point sequence. Which side of the straight line the data point P[0] lies on is determined by val = sign(sdist(0)) (step S91), and the count value count of a counter counting the number of consecutive data points on the same side (hereinafter, the continuous point counter) is set to 1 (step S92). Here, sign(x) is a function returning the sign (+ or -) of the value x, and sdist(i) = P[i].x cos α + P[i].y sin α + d is the signed distance between the i-th data point and the straight line Line; val thus indicates on which side of the line Line the data point lies. Next, the count value i of the counter counting the data points (hereinafter, the data point counter) is set to 1 (step S93).
- While the count value i of the data point counter is smaller than the number n of data points (step S94: YES), it is determined on which side of the line the i-th data point P[i] lies, and the result sign(sdist(i)) is assigned to val_i (step S95). Then val and val_i are compared (step S96). If they differ (step S96: NO), val_i is substituted for val, 1 is substituted for the count value count of the continuous point counter (step S98), the count value i of the data point counter is incremented (step S100), and the processing returns to step S94.
- If val and val_i are the same (step S96: YES), the count value count of the continuous point counter is incremented (step S97), and it is determined whether count is larger than the minimum number of data points min_c for judging a Zig-Zag-Shape (step S99). If it is larger (step S99: YES), the group is judged to be a Zig-Zag-Shape, TRUE is output, and the process ends. Otherwise (step S99: NO), the process proceeds to step S100, the count value i of the data point counter is incremented, and the processing from step S94 is repeated; when all data points have been examined without the run length exceeding min_c, FALSE is output.
- The processing from step S91 to step S100 can also be expressed compactly, as in the sketch below.
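- A minimal sketch of the discrimination, using the sdist definition above; the value of min_c is a placeholder.

```python
import math

def sdist(p, alpha, d):
    # Signed distance of p = (x, y) to the straight line Line(alpha, d).
    return p[0] * math.cos(alpha) + p[1] * math.sin(alpha) + d

def is_zigzag(points, alpha, d, min_c=8):
    """Steps S91-S100: a run of more than min_c consecutive points on
    the same side of the fitted line indicates a systematic deviation
    (several planes) rather than random noise."""
    val = math.copysign(1.0, sdist(points[0], alpha, d))
    count = 1
    for p in points[1:]:
        v = math.copysign(1.0, sdist(p, alpha, d))
        if v == val:
            count += 1
            if count > min_c:
                return True
        else:
            val, count = v, 1
    return False
```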
- FIG. 48 is a block diagram illustrating a processing unit that performs the Zig-Zag-Shape discrimination. As shown in FIG. 48, the Zig-Zag-Shape discrimination processing unit 20 receives the n data points P[0..n-1] and comprises a direction discriminator 21 that sequentially determines on which side of the straight line each data point P[i] lies and outputs the discrimination result val; a delay unit 22 that holds the result of the direction discriminator 21 for comparison with the next data point; a comparison unit 23 that compares the direction discrimination result val at the data point P[i] with that at the data point P[i-1]; a counter 24; and a comparison unit 25 that compares the count value count of the counter 24 with the minimum number of data points min_c read from the minimum data point number storage unit 26.
- The Zig-Zag-Shape discrimination processing unit operates as follows. The direction discriminator 21 obtains the straight line Line from the data point group P[0..n-1] by the least squares method, calculates the signed distance between each data point P[i] and the straight line Line, and outputs its sign. When the sign for the data point P[i-1] is input, the delay unit 22 stores it until the sign of the next data point P[i] is input. The comparison unit 23 compares the signs of the data point P[i] and the data point P[i-1]; if they are the same, it outputs a signal incrementing the count value count of the counter 24, and if they differ, a signal substituting 1 for count. The comparison unit 25 compares count with the minimum number of data points min_c; if count is larger than min_c, it outputs a signal indicating that the data point group P[0..n-1] is zigzag-shaped.
- The region extension unit 2b receives the line segment group obtained by the line segment extraction unit 2a, determines by plane fitting (Plane Fitting) which plane each line segment belongs to, and divides the given line segments into a plurality of planes (plane regions). The planes are separated by the following method. First, three line segments estimated to lie on the same plane are selected; the plane obtained from these three line segments (the reference plane) is the seed of a plane, and the region consisting of the three line segments is called the seed region.
- FIG. 49 is a schematic diagram for explaining the area expansion processing.
- For example, the three line segments 32a to 32c indicated by thick lines are selected as the seed region: the region consisting of these three line segments 32a-32c is the seed region, and one plane (the reference plane) P is obtained from them.
- Next, a line segment lying on the same plane as the plane P is searched for in the data strings 33 and 34 adjacent to the outermost line segments 32a and 32c of the seed region. If, for example, the line segment 33a is selected, a plane P' consisting of these four line segments is obtained, and the reference plane P is updated. When a further line segment on the same plane is added, a plane P'' composed of the five line segments is obtained, and the plane P' is updated.
- the second tread of the stairs 31 is obtained as the plane 45 surrounded by the broken line.
- The area expansion processing is performed with the selected seed region as the seed until no line segment remains to be added. The process of retrieving three line segments as a new seed region from the image 30 and executing the area expansion processing is then repeated until no seed region of three line segments remains, that is, the processing of step S73 in FIG. 43 is repeated.
- The values a and b can be updated as shown in (7) below, and the computation can thereby be extended to a plane update process over n data point groups.
- FIG. 50 is a flowchart showing the procedures of the seed region search processing and the area expansion processing.
- To select the seed region, three line segments (l1, l2, l3) adjacent in the row- or column-direction data strings used in the line segment extraction are first selected such that the pixel positions of the line segments overlap in the direction orthogonal to the data strings (step S101). Each data point carries an index indicating its pixel position in the image; for line segments from row-direction data strings, for example, these indices are compared to determine whether the segments overlap in the column direction. If the search succeeds (step S102: YES), the quantity (6-1) is calculated using equation (7) above; as a result, the plane parameters n and d can be determined and are used to calculate the root mean square error rms(l1, l2, l3) of the plane equation shown in equation (8) above.
- If the root mean square error rms(l1, l2, l3) of this plane equation is smaller than a predetermined threshold th1, for example 1 cm, the three line segments are adopted as the seed region (step S103); if it is larger than th1, the process returns to step S101 (step S104).
- The region is then extended by the line segment extension method from the seed region selected in this way. First, a line segment that is a candidate for addition to the region of the seed region is searched for (step S105); when the seed region has already been extended, this region includes the updated region. The candidates are the line segments in the data strings adjacent to the line segments included in the region. If a candidate line segment is found (step S106: YES), the root mean square error rms of the plane equation including the candidate is calculated, and it is determined whether it is smaller than a predetermined threshold th2 (step S107). If it is, the plane parameters are updated (step S108), and the processing from step S105 is repeated; this continues until no candidate line segment remains. When there is no candidate (step S106: NO), the process returns to step S101 and a seed region is searched for again. When no seed region remains in the line segment group (step S102: NO), the plane parameters obtained so far are output and the processing ends.
- Both when searching for the seed region, to determine whether the three line segments belong to the same plane, and in the region extension processing, to determine whether a line segment belongs to the reference plane or to the plane updated so far, equation (8) above is used. That is, only when the root mean square error rms of the plane equation is below a predetermined threshold (th_rms) is the line segment (group) estimated to belong to the same plane, and the plane including the line segment is recalculated. By using the root mean square error rms of the plane equation for this same-plane determination, planes can be extracted accurately and robustly against noise even when the data contain fine steps. The reason is described below.
- FIG. 51 illustrates this effect with an example in which the root mean square error rms of the plane equation differs even though the distances between the end points and the straight line are equal. FIG. 51A shows a straight line La intersecting the plane P, and FIG. 51B a straight line Lb parallel to the plane P but shifted from it by a predetermined distance. Comparing the two, the root mean square error rms(Lb) of the plane equation obtained from the straight line Lb in FIG. 51B is larger than rms(La) obtained from the straight line La in FIG. 51A. That is, when a straight line intersects the plane P as in FIG. 51A, the root mean square error of the plane equation remains relatively small and the deviation is often a mere effect of noise, whereas a straight line Lb with a large root mean square error has a high probability of lying not on the plane P but on a different plane P'.
- Therefore, it is preferable to calculate the root mean square error rms of the plane equation as in this modification and to judge a line segment to lie on the same plane only when the value is below a predetermined threshold. Depending on the environment and the properties of the distance data, the conventional criterion, including the line segment in the plane when the distance between its end points and the plane is below a predetermined threshold, or a combination of the two, may also be used. Moreover, once the plane parameters n and d have been calculated, the root mean square error rms of the plane equation can easily be computed by equation (8) above from the values of the two moments obtained for the data point group during line segment extraction.
- Here, rms(l1, l2, l3) is the root mean square error of the plane equation calculated by equation (6) above over all the data points of the three line segments, and Neighbor(index) is a function that returns the indices adjacent to the given index, for example {index-1, index+1}. A sketch of the plane fit and the rms-based same-plane test follows.
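- Under the assumption that the plane is fitted, as for the line, by taking the least-variance eigenvector of the point covariance, the same-plane test can be sketched as follows; th_rms = 1 cm echoes the threshold example in the text, and the function names are ours.

```python
import numpy as np

def fit_plane(points):
    """Plane n . p + d = 0 by least squares: n is the eigenvector of
    the covariance matrix for the smallest eigenvalue."""
    pts = np.asarray(points, dtype=float)
    mean = pts.mean(axis=0)
    w, v = np.linalg.eigh(np.cov(pts.T))
    n = v[:, 0]
    return n, -float(n @ mean)

def plane_rms(points, n, d):
    # Root mean square error of the plane equation over the points.
    pts = np.asarray(points, dtype=float)
    return float(np.sqrt(np.mean((pts @ n + d) ** 2)))

def same_plane(points, n, d, th_rms=0.01):
    # A segment joins the plane only while the rms stays below th_rms.
    return plane_rms(points, n, d) < th_rms
```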
- After the region extension processing in step S73 of FIG. 43 has updated the plane equation, the plane equation is recalculated in step S74 (post-processing). That is, for the plane represented by the finally obtained and updated plane equation, the distances of the data points or line segments regarded as belonging to it are calculated, those deviating from the plane by a predetermined value or more are excluded, and the plane equation is updated once more; this further reduces the effect of noise.
- The processing of step S74 will now be described in detail; the plane equation is recalculated in two steps. First, data points deviating from the plane to which they were assigned by a predetermined value or more are excluded. Second, data points that do not yet belong to any plane but whose distance to a plane is equal to or smaller than a relatively large threshold, for example 1.5 cm, are detected and included in that plane. These processes can be performed by examining the data points near the boundary of each plane region. After the above processing is completed, the plane equation is calculated again (see the sketch below).
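- A sketch of the two-step recalculation for one plane; the 1.5 cm adoption threshold is from the text, while the rejection threshold and the label convention (-1 for unassigned) are assumptions.

```python
import numpy as np

def recalc_plane(points, labels, plane_id, n, d,
                 reject=0.01, adopt=0.015):
    """Step 1: drop assigned points deviating by more than `reject`.
    Step 2: adopt unassigned points closer than `adopt`.  Then refit
    the plane from the surviving members."""
    pts = np.asarray(points, dtype=float)
    labels = np.asarray(labels).copy()
    dist = np.abs(pts @ n + d)
    labels[(labels == plane_id) & (dist > reject)] = -1   # exclude outliers
    labels[(labels == -1) & (dist <= adopt)] = plane_id   # include near points
    member = pts[labels == plane_id]
    mean = member.mean(axis=0)
    w, v = np.linalg.eigh(np.cov(member.T))               # refit the plane
    n_new = v[:, 0]
    return n_new, -float(n_new @ mean), labels
```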
- FIG. 54A is a schematic diagram showing the floor surface as seen when the robot device stands and looks down at it. FIG. 54B shows the three-dimensional distance data, with x on the vertical axis, y on the horizontal axis, and z expressed by the shading of each data point; it also shows the straight lines detected by the row-direction line segment extraction processing from the data point groups assumed to lie on the same plane in each pixel string. FIG. 54C shows the plane region obtained by the region extension processing from the group of straight lines shown in FIG. 54B.
- FIG. 55 shows the results when one step is placed on the floor: in FIG. 55A, a single step ST3 is placed on the floor F. FIG. 55B shows the experimental conditions; the data point group is divided when the distance between the point of interest and the straight line (line segment) exceeds max_d. Correct extraction (horizontal) indicates the number of successful plane detections in ten line segment extraction trials on the row-direction data strings, and correct extraction (vertical) indicates the corresponding success or failure for the column-direction data strings. FIGS. 55C and 55D show the results of plane detection by the line segment extension method, for the method of this modification and for the conventional line segment extension method (comparative example), respectively.
- FIGS. 56B and 56C show cases where three-dimensional distance data are acquired from a captured image; in each, the left diagram shows line segments extracted from the row-direction pixel strings (distance data strings), and the right diagram shows line segments extracted from the column-direction pixel strings.
- In this manner, three-dimensional distance data can be acquired from images of the various stairs described above and plane detection can be performed. For example, as shown in FIGS. 11 and 12, all treads were detected as planes in every case, and in FIG. 12B a part of the floor surface was also successfully detected as a separate plane.
- As described above, a large threshold is set initially, and even when a line segment contains no data point exceeding the threshold, if the point group has a zigzag shape the Zig-Zag-Shape discrimination processing divides the line segment, on the assumption that the shape arises from multiple planes rather than from noise. A plurality of planes can therefore be detected with high accuracy even from distance information containing noise. As a result, an uneven floor surface composed of a plurality of planes is not erroneously recognized as a single walkable plane, and the movement of the robot device is made easier.
- One or more of the plane detection processing, stair recognition processing, and stair climbing control processing described above may be configured in hardware, or may be realized by causing an arithmetic unit (CPU) to execute a computer program. In the latter case, the computer program can be provided recorded on a recording medium, or transmitted via the Internet or other transmission media.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Image Processing (AREA)
- Manipulator (AREA)
Abstract
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006511065A JP4618247B2 (ja) | 2004-03-17 | 2005-03-17 | ロボット装置及びその動作制御方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004-077214 | 2004-03-17 | ||
JP2004077214 | 2004-03-17 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2005087452A1 true WO2005087452A1 (fr) | 2005-09-22 |
WO2005087452A9 WO2005087452A9 (fr) | 2008-03-13 |
Family
ID=34975410
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/004838 WO2005087452A1 (fr) | 2004-03-17 | 2005-03-17 | Dispositif robot, procede de commande de comportement pour ce dispositif robot et dispositif mobile |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP4618247B2 (fr) |
WO (1) | WO2005087452A1 (fr) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5623362B2 (ja) * | 2011-09-28 | 2014-11-12 | Honda Motor Co., Ltd. | Step recognition device |
NL2008490C2 (nl) | 2012-03-15 | 2013-09-18 | Ooms Otto Bv | Method, device and computer program for extracting information about one or more spatial objects |
JP7280700B2 (ja) | 2019-01-21 | 2023-05-24 | Kabushiki Kaisha Toshiba | Holding device, control system, and inspection system |
JP7462466B2 (ja) | 2020-04-20 | 2024-04-05 | Kabushiki Kaisha Toshiba | Holding device, inspection system, moving method, and inspection method |
Application Events
- 2005-03-17: Japanese national-phase application JP2006511065A filed, granted as patent JP4618247B2 (status: not active, Expired - Fee Related)
- 2005-03-17: PCT application PCT/JP2005/004838 filed, published as WO2005087452A1 (status: active, Application Filing)
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH05223549A (ja) * | 1992-02-10 | 1993-08-31 | Honda Motor Co Ltd | Method for recognizing stairs and the like for a moving body |
Non-Patent Citations (1)
Title |
---|
OKADA, S. ET AL.: "Jitsujikan Plane Segment Finder no Kenkyu" [Study on a Real-Time Plane Segment Finder], DAI 6 KAI ROBOTICS SYMPOSIA YOKOSHU [Proceedings of the 6th Robotics Symposia], 18 March 2001 (2001-03-18), pages 51-56, XP002994251 *
Cited By (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8289321B2 (en) | 2004-03-17 | 2012-10-16 | Sony Corporation | Method and apparatus for detecting plane, and robot apparatus having apparatus for detecting plane |
JP2008246609A (ja) * | 2007-03-29 | 2008-10-16 | Honda Motor Co Ltd | Legged mobile robot |
KR101591471B1 (ko) | 2008-11-03 | 2016-02-04 | Samsung Electronics Co., Ltd. | Apparatus and method for extracting feature information of an object, and apparatus and method for generating a feature map using the same |
JP2011093024A (ja) * | 2009-10-28 | 2011-05-12 | Honda Motor Co Ltd | Control device for legged mobile robot |
EP2372652A1 (fr) | 2010-03-08 | 2011-10-05 | Optex Co. Ltd. | Procédé pour estimer un plan dans une image de plage et caméra à plage d'image |
US8599278B2 (en) | 2010-03-08 | 2013-12-03 | Optex Co., Ltd. | Method for estimating a plane in a range image and range image camera |
US9552640B2 (en) | 2011-11-23 | 2017-01-24 | Samsung Electronics Co., Ltd. | Method of recognizing stairs in three dimensional data image |
JP2013109750A (ja) * | 2011-11-23 | 2013-06-06 | Samsung Electronics Co Ltd | Method of recognizing stairs in three-dimensional data image |
KR101820299B1 (ko) | 2011-11-23 | 2018-03-02 | Samsung Electronics Co., Ltd. | Method of recognizing stairs in three-dimensional data image |
US9441319B2 (en) | 2014-02-26 | 2016-09-13 | Brother Kogyo Kabushiki Kaisha | Embroidery data generating device and non-transitory computer-readable medium storing embroidery data generating program |
CN103879471B (zh) * | 2014-04-05 | 2016-04-13 | Ling Xin | Mountain-climbing vehicle |
CN103879471A (zh) * | 2014-04-05 | 2014-06-25 | Ling Xin | Mountain-climbing vehicle |
JP2017522195A (ja) * | 2014-07-23 | 2017-08-10 | Google Inc. | Predictively adjustable hydraulic pressure rails |
US11077898B2 (en) | 2014-07-23 | 2021-08-03 | Boston Dynamics, Inc. | Predictively adjustable hydraulic pressure rails |
JP2016212824A (ja) * | 2015-05-06 | 2016-12-15 | Korea University Industry-Academic Cooperation Foundation | Method for extracting outer static structure of space from geometric data of space |
JP2018088275A (ja) * | 2015-05-06 | 2018-06-07 | Korea University Industry-Academic Cooperation Foundation | Method for extracting outer static structure of space from geometric data of space |
US10434649B2 (en) | 2017-02-21 | 2019-10-08 | Fanuc Corporation | Workpiece pick up system |
US11396101B2 (en) | 2018-11-08 | 2022-07-26 | Kabushiki Kaisha Toshiba | Operating system, control device, and computer program product |
JP2020075340A (ja) * | 2018-11-08 | 2020-05-21 | Kabushiki Kaisha Toshiba | Operating system, control device, and program |
US11123869B2 (en) | 2019-04-12 | 2021-09-21 | Boston Dynamics, Inc. | Robotically negotiating stairs |
US11660752B2 (en) | 2019-04-12 | 2023-05-30 | Boston Dynamics, Inc. | Perception and fitting for a stair tracker |
US11548151B2 (en) | 2019-04-12 | 2023-01-10 | Boston Dynamics, Inc. | Robotically negotiating stairs |
CN111127497A (zh) * | 2019-12-11 | 2020-05-08 | Ubtech Robotics Corp Ltd | Robot and stair-climbing control method and device therefor |
US11644841B2 (en) | 2019-12-11 | 2023-05-09 | Ubtech Robotics Corp Ltd | Robot climbing control method and robot |
US12094195B2 (en) | 2020-04-20 | 2024-09-17 | Boston Dynamics, Inc. | Identifying stairs from footfalls |
WO2021216235A1 (fr) * | 2020-04-20 | 2021-10-28 | Boston Dynamics, Inc. | Identifying stairs from footfalls |
US11599128B2 (en) | 2020-04-22 | 2023-03-07 | Boston Dynamics, Inc. | Perception and fitting for a stair tracker |
WO2021216264A1 (fr) * | 2020-04-22 | 2021-10-28 | Boston Dynamics, Inc. | Perception and fitting for a stair tracker |
US20210331754A1 (en) * | 2020-04-22 | 2021-10-28 | Boston Dynamics, Inc. | Stair Tracking for Modeled and Perceived Terrain |
US12077229B2 (en) * | 2020-04-22 | 2024-09-03 | Boston Dynamics, Inc. | Stair tracking for modeled and perceived terrain |
CN112699734B (zh) * | 2020-12-11 | 2024-04-16 | Shenzhen Silver Star Intelligent Group Co., Ltd. | Threshold detection method, mobile robot, and storage medium |
CN112699734A (zh) * | 2020-12-11 | 2021-04-23 | Shenzhen Silver Star Intelligent Technology Co., Ltd. | Threshold detection method, mobile robot, and storage medium |
CN112597857B (zh) * | 2020-12-16 | 2022-06-14 | Wuhan University of Science and Technology | Kinect-based fast pose estimation method for indoor robot stair climbing |
CN112597857A (zh) * | 2020-12-16 | 2021-04-02 | Wuhan University of Science and Technology | Kinect-based fast pose estimation method for indoor robot stair climbing |
CN114766975A (zh) * | 2022-04-13 | 2022-07-22 | Jiangsu Vocational College of Business | Sweeping robot dedicated to stair cleaning |
CN114766975B (zh) * | 2022-04-13 | 2023-06-02 | Jiangsu Vocational College of Business | Sweeping robot dedicated to stair cleaning |
CN115256470A (zh) * | 2022-08-09 | 2022-11-01 | Qiteng Robot Co., Ltd. | Depth-vision-based stair measurement method and system, and quadruped robot |
Also Published As
Publication number | Publication date |
---|---|
JPWO2005087452A1 (ja) | 2008-01-24 |
JP4618247B2 (ja) | 2011-01-26 |
WO2005087452A9 (fr) | 2008-03-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4618247B2 (ja) | Robot device and operation control method thereof | |
US8289321B2 (en) | Method and apparatus for detecting plane, and robot apparatus having apparatus for detecting plane | |
JP4479372B2 (ja) | Environment map creation method, environment map creation device, and mobile robot device | |
US20210331317A1 (en) | Perception and Fitting for a Stair Tracker | |
Lee et al. | RGB-D camera based wearable navigation system for the visually impaired | |
Pérez-Yus et al. | Detection and modelling of staircases using a wearable depth sensor | |
JP6617830B2 (ja) | Skeleton estimation device, skeleton estimation method, and skeleton estimation program | |
JP3994950B2 (ja) | Environment recognition device and method, path planning device and method, and robot device | |
EP3680618A1 (fr) | Procédé et système de suivi d'un dispositif mobile | |
JP4100239B2 (ja) | Obstacle detection device, autonomous mobile robot using the same, obstacle detection method, and obstacle detection program | |
US20240193936A1 (en) | Identifying stairs from footfalls | |
Tang et al. | Plane-based detection of staircases using inverse depth | |
CN115702405A (zh) | Stair tracking for modeled and perceived terrain | |
JP2006054681A (ja) | Moving body surroundings monitoring device | |
JP2007041656A (ja) | Moving body control method and moving body | |
US11073842B1 (en) | Perception and fitting for a stair tracker | |
CN114683290B (zh) | Method, device, and storage medium for pose optimization of a legged robot | |
JP2003271975A (ja) | Plane extraction method, device, program, and recording medium therefor, and robot device equipped with plane extraction device | |
Schwarze et al. | Stair detection and tracking from egocentric stereo vision | |
Pradeep et al. | Piecewise planar modeling for step detection using stereo vision | |
Struebig et al. | Stair and ramp recognition for powered lower limb exoskeletons | |
JP2007041657A (ja) | Moving body control method and moving body | |
JP2006053754A (ja) | Plane detection device and detection method | |
CN110694252A (zh) | Running posture detection method based on a six-axis sensor | |
CN113720323B (zh) | Monocular visual-inertial SLAM method and device based on point-line feature fusion | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AK | Designated states | Kind code of ref document: A1. Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
| AL | Designated countries for regional patents | Kind code of ref document: A1. Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| WWE | Wipo information: entry into national phase | Ref document number: 2006511065. Country of ref document: JP |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWW | Wipo information: withdrawn in national office | Country of ref document: DE |
| 122 | Ep: pct application non-entry in european phase | |