CN117270551A - Method and system for autonomous climbing stairs by quadruped robot based on local environment perception - Google Patents


Info

Publication number
CN117270551A
Authority
CN
China
Prior art keywords
stair
robot
plane
stairs
climbing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311393752.4A
Other languages
Chinese (zh)
Inventor
陈腾
姜含
刘大宇
荣学文
李贻斌
范永
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Youbaote Intelligent Robot Co ltd
Shandong University
Original Assignee
Shandong Youbaote Intelligent Robot Co ltd
Shandong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Youbaote Intelligent Robot Co., Ltd. and Shandong University
Priority to CN202311393752.4A
Publication of CN117270551A
Legal status: Pending


Abstract

The invention discloses a method and a system for a quadruped robot to climb stairs autonomously based on local environment perception, in which a single depth camera is mounted on the abdomen of the quadruped robot body. The method comprises the following steps: obtain depth data of a structured stair at the robot's current viewing angle, then cluster the depth data and segment it into planes with the PEAC algorithm to obtain stair plane point cloud data; compute each plane of the current stair with a PCA algorithm and determine whether each plane is horizontal or vertical; from these planes, compute the stair characteristic parameters, the yaw angle error and the out-of-stair flag data; generate the foot-end motion trajectories, introduce a PD correction algorithm and a terrain adaptation algorithm to generate the robot centroid trajectory, and track the generated trajectories with model predictive control and whole-body control to obtain the joint commands that drive the robot through the stair-climbing task. The invention achieves reliable perceptive motion, and thus reliable stair-climbing operation, with only a single depth camera.

Description

Method and system for autonomous climbing stairs by quadruped robot based on local environment perception
Technical Field
The invention relates to the technical field of quadruped robots, and in particular to a method and a system for a quadruped robot to climb stairs autonomously based on local environment perception.
Background
With continuous progress in electrically driven joints and motion control algorithms, quadruped robots have begun to enter real-world scenarios, most of which consist of flat ground, slopes and stairs. Existing motion controllers for quadruped robots cope well with uneven terrain such as dirt and slopes, but cannot adapt stably and robustly to strongly undulating terrain such as stairs, which limits the range of practical applications of quadruped robots.
Without an external sensor to perceive the terrain, a robot can design swing-leg and support-leg control strategies from proprioceptive information such as posture, joint angles and velocities to cope with unknown terrain. There are successful cases of this approach, but walking blindly inevitably leads to stumbling on steps or stepping into the air, causing large posture fluctuations and even falls; its robustness is low and its stability poor, far from meeting the performance requirements of real operation.
With the development of vision, lidar and other sensors and of SLAM (Simultaneous Localization and Mapping) technology, existing research mounts external sensing devices around the robot's trunk so that the robot actively acquires the geometry of the terrain, improving operational performance and stability. The common approach fuses the robot's pose estimate with data from visual and laser sensors, builds a global terrain elevation map of the environment, and plans foot-end trajectories over the step terrain from the robot's position in that map. However, this requires an accurate robot position, while legged-robot pose estimation drifts severely: the accumulated error degrades the terrain map as the travelled distance grows, until the stair steps can no longer be localized accurately. Second, a global elevation map generally needs multi-sensor fusion to reach sufficient accuracy, and most such sensors are expensive, greatly raising the application cost. Finally, algorithms such as multi-sensor fusion, mapping and localization are highly complex and usually require fast processors with parallel computing cores, which are also costly.
Disclosure of Invention
In order to overcome these defects of the prior art, the invention provides a method and a system for a quadruped robot to climb stairs autonomously based on local environment perception: the effective field of view of a single camera scans the quadruped robot's foothold area in real time to acquire real-time local data, and motion trajectories are planned from real-time local distance information alone, without depending on a global map. Reliable perceptive motion, and thus reliable stair-climbing operation, is achieved with only a single depth camera, with low algorithmic complexity and low hardware cost.
In a first aspect, the invention provides a method for autonomous climbing of stairs by a quadruped robot based on local environment perception.
A method for a quadruped robot to climb stairs autonomously based on local environment perception, wherein a vision camera module comprising a single depth camera is mounted on the abdomen of the quadruped robot body, the method comprising:
obtaining depth data of a structured stair at the robot's current viewing angle, clustering the depth data and segmenting it into planes with the PEAC algorithm to obtain stair plane point cloud data;
computing each plane of the current stair from the stair plane point cloud data with a PCA algorithm and determining the type of each plane, the types being horizontal and vertical;
computing stair characteristic parameters, the yaw angle error and the out-of-stair flag data from the horizontal and vertical planes of the stairs;
generating foot-end motion trajectories from the stair characteristic parameters, the yaw angle error and the out-of-stair flag data; introducing a PD correction algorithm and a terrain adaptation algorithm to generate the robot centroid trajectory; tracking the generated trajectories with model predictive control and whole-body control to obtain the joint commands; and controlling the leg joints of the robot to complete the stair-climbing task.
In a further embodiment, controlling the leg joints of the robot to complete the stair-climbing task comprises:
continuously monitoring the out-of-stair flag data and determining the robot's current climbing phase;
in each climbing phase, computing the distance between each of the four foot ends and the current vertical plane from the stair characteristic parameters, solving for the foothold at the next gait step, and generating a foot-end trajectory between the swing-leg start point and the foothold with a Bezier curve;
correcting the robot's heading with a PD correction algorithm driven by the yaw angle error to determine the desired yaw angle; fitting a plane through the four foot ends with the terrain adaptation algorithm and taking its inclination as the desired pitch angle; generating the robot centroid trajectory from the desired yaw and pitch angles;
tracking the foot-end trajectories and the centroid trajectory with model predictive control and whole-body control to obtain the joint commands, and controlling the leg joints of the robot to complete the stair-climbing task.
In a second aspect, the invention provides a four-legged robot autonomous climbing stair system based on local environmental awareness.
A quadruped robot autonomous stair-climbing system based on local environment perception, comprising:
a vision camera module comprising a single depth camera, the single depth camera being mounted on the abdomen of the quadruped robot body and acquiring depth data of the structured stairs at the robot's current viewing angle;
a router module for transmitting the depth data acquired by the vision camera module to the on-board computing module;
an on-board computing module for processing the depth data and executing the motion control algorithm;
a quadruped robot body for carrying the vision camera module, the router module and the on-board computing module and executing the stair-climbing task;
wherein processing the depth data and executing the motion control algorithm comprises:
clustering the depth data and segmenting it into planes with the PEAC algorithm to obtain stair plane point cloud data;
computing each plane of the current stair from the stair plane point cloud data with a PCA algorithm and determining the type of each plane, the types being horizontal and vertical;
computing stair characteristic parameters, the yaw angle error and the out-of-stair flag data from the horizontal and vertical planes of the stairs;
generating foot-end motion trajectories from the stair characteristic parameters, the yaw angle error and the out-of-stair flag data; introducing a PD correction algorithm and a terrain adaptation algorithm to generate the robot centroid trajectory; tracking the generated trajectories with model predictive control and whole-body control to obtain the joint commands; and controlling the leg joints of the robot to complete the stair-climbing task.
In a further embodiment, controlling the leg joints of the robot to complete the stair-climbing task comprises:
continuously monitoring the out-of-stair flag data and determining the robot's current climbing phase;
in each climbing phase, computing the distance between each of the four foot ends and the current vertical plane from the stair characteristic parameters, solving for the foothold at the next gait step, and generating a foot-end trajectory between the swing-leg start point and the foothold with a Bezier curve;
correcting the robot's heading with a PD correction algorithm driven by the yaw angle error to determine the desired yaw angle; fitting a plane through the four foot ends with the terrain adaptation algorithm and taking its inclination as the desired pitch angle; generating the robot centroid trajectory from the desired yaw and pitch angles;
tracking the foot-end trajectories and the centroid trajectory with model predictive control and whole-body control to obtain the joint commands, and controlling the leg joints of the robot to complete the stair-climbing task.
The technical solutions above have the following beneficial effects:
1. The invention provides a method and a system for a quadruped robot to climb stairs autonomously based on local environment perception: the effective field of view of a single camera scans the quadruped robot's foothold area in real time to obtain real-time local data, and motion trajectories are planned from real-time local distance information alone, without a global map. Reliable perceptive motion, and thus reliable stair-climbing operation, is achieved with a single depth camera, with low algorithmic complexity and low cost.
2. In the invention, the single depth camera is mounted on the abdomen of the quadruped robot, a placement that saves surface space on the robot and keeps the quadruped robot's foothold area inside the camera's effective field of view in real time, so that the robot can "step where it sees". The stair-climbing algorithm does not depend on a global map and plans motion trajectories only from the real-time local distance information acquired by the single depth camera; it avoids the complex steps of multi-sensor fusion and mapping/localization, runs in real time, is unaffected by drift in the robot's pose estimate, and avoids the degraded localization accuracy and distorted foothold positions caused by accumulated errors in global mapping.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention.
FIG. 1 is a schematic diagram of a four-legged robot in an embodiment of the present invention;
fig. 2 is a coordinate system definition diagram of a method for autonomous climbing stairs by a quadruped robot based on local environment perception according to an embodiment of the invention;
fig. 3 is a schematic diagram of a stair climbing method of a quadruped robot based on local environment awareness according to an embodiment of the present invention.
Fig. 4 is a visual information processing flow chart of a stair climbing method of a quadruped robot based on local environment perception according to an embodiment of the invention.
Fig. 5 is a flow chart of a motion control algorithm of a stair climbing method of a quadruped robot based on local environment perception according to an embodiment of the invention.
Detailed Description
It should be noted that the following detailed description is exemplary only for the purpose of describing particular embodiments and is intended to provide further explanation of the invention and is not intended to limit exemplary embodiments according to the invention. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Furthermore, it will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, steps, operations, devices, components, and/or groups thereof.
As described in the background, existing motion control methods with which quadruped robots climb stairs have several problems. First, current quadruped perception systems generally require multi-sensor fusion, which is expensive and of limited applicability. Second, algorithms such as multi-sensor fusion, mapping and localization are highly complex. Finally, the robot plans its motion inside a constructed global map, which requires accurate pose estimation; legged-robot pose estimation drifts severely, so the final planning error is large and algorithm performance degrades greatly. To solve these problems, the invention provides a method and a system for a quadruped robot to climb stairs autonomously based on local environment perception, which plan foot-end trajectories in real time from real-time local visual information alone, without depending on a global map, giving the quadruped robot a mature and stable stair-climbing capability.
Example 1
This embodiment provides a method for a quadruped robot to climb stairs autonomously based on local environment perception. The quadruped robot comprises a quadruped robot body, a vision camera module, an on-board computing module and a router module, and the method comprises the following steps:
Step S1: obtain depth data of the structured stair at the robot's current viewing angle, cluster the depth data and segment it into planes with the PEAC algorithm to obtain stair plane point cloud data.
Step S2: compute each plane of the current stair from the stair plane point cloud data with a PCA algorithm and determine the type of each plane, the types being horizontal and vertical.
Step S3: compute stair characteristic parameters, the yaw angle error and the out-of-stair flag data from the horizontal and vertical planes of the stairs.
Step S4: generate foot-end motion trajectories from the stair characteristic parameters, the yaw angle error and the out-of-stair flag data; introduce a PD correction algorithm and a terrain adaptation algorithm to generate the robot centroid trajectory; track the generated trajectories with model predictive control and whole-body control to obtain the joint commands; and control the leg joints of the robot to complete the stair-climbing task.
Controlling the leg joints of the robot to complete the stair-climbing task comprises:
continuously monitoring the out-of-stair flag data and determining the robot's current climbing phase;
in each climbing phase, computing the distance between each of the four foot ends and the current vertical plane from the stair characteristic parameters, solving for the foothold at the next gait step, and generating a foot-end trajectory between the swing-leg start point and the foothold with a Bezier curve;
correcting the robot's heading with a PD correction algorithm driven by the yaw angle error to determine the desired yaw angle; fitting a plane through the four foot ends with the terrain adaptation algorithm and taking its inclination as the desired pitch angle; generating the robot centroid trajectory from the desired yaw and pitch angles;
tracking the foot-end trajectories and the centroid trajectory with model predictive control and whole-body control to obtain the joint commands, and controlling the leg joints of the robot to complete the stair-climbing task.
The method for a quadruped robot to climb stairs autonomously based on local environment perception proposed by this embodiment is described in more detail below.
In this embodiment, the quadruped robot as shown in fig. 1 includes a quadruped robot body, a vision camera module, an on-board computing module, and a router module.
The vision camera module is mounted on the abdomen of the quadruped robot body and acquires depth data of the structured stairs at the robot's current viewing angle. The module comprises a single depth camera, fixed to the abdomen of the quadruped robot body through a camera bracket and tilted 30 degrees downward relative to the trunk, so that the foothold positions of the two front feet stay inside the field of view in real time, realizing "step where it sees" perceptive control.
The on-board computing module is arranged at the upper front of the quadruped robot body and processes the visual information (i.e. the depth data) and runs the feature extraction algorithms, e.g. extracting the geometric parameters of the stairs, the robot's yaw angle error and the robot's out-of-stair flag.
The router module is arranged at the upper front of the quadruped robot body and transfers data between the modules and devices. In this embodiment the on-board computing module comprises an on-board computer and the router module comprises a router; an outer protective shell is fixed above the trunk of the quadruped robot body, the router and the on-board computer are mounted inside it from top to bottom, and the router acts as the bridge that forwards the data acquired by the vision camera module to the on-board computer.
The quadruped robot body carries the above modules and devices and executes the stair-climbing operation.
The quadruped robot is controlled to perform the stair-climbing operation so that it climbs stairs autonomously, as shown in fig. 4, through the following steps.
in step S1, depth data of a structured stair under the current view angle of the robot is obtained, and the depth data is clustered and segmented into planes by using a PEAC algorithm to obtain stair plane point cloud data. Specifically, a single depth camera is used for acquiring depth data of a structured stair under the current visual angle of the robot, the depth data are transmitted to an onboard computer through a router, and a PEAC algorithm is operated in the onboard computer to cluster and separate the depth data into planes. Because the sensor has noise and miscellaneous points irrelevant to stair scenes exist around the camera, the operations of cutting, filtering, point cloud conversion and the like are performed on the data after plane segmentation to obtain available stair plane point cloud data.
In step S2, each plane of the current stair is computed from the stair plane point cloud data with a PCA algorithm, and the type of each plane, horizontal or vertical, is determined. Specifically, Principal Component Analysis (PCA) is run on the stair plane point cloud data to compute the center point and normal vector of each plane; these vectors are expressed in the body coordinate system, whose definition is shown in fig. 2. Meanwhile, the robot sends the gravity vector in the body coordinate system to the on-board computer, and the type of each plane is judged from the included angle between the plane normal and the gravity vector, i.e. whether the plane is a horizontal or a vertical stair plane, according to formula (1-1):
in the above, delta 1 And delta 2 Whether the vectors satisfy a threshold value set in parallel or perpendicular (delta for plane determination) 1 And delta 2 Respectively-5 degrees and 5 degrees, delta for judging vertical plane 1 And delta 2 85 deg. and 95 deg., respectively), g is a gravity vector in the machine body coordinate system,is the i-th plane p i Is defined in the specification.
In step S3, the stair characteristic parameters, the yaw angle error and the out-of-stair flag data are computed from the horizontal and vertical planes of the stairs. Specifically, the stair characteristic parameters comprise the height and depth of a single step: the stair height d_h is computed from the distance between two adjacent horizontal planes, and the stair depth d_s from the distance between two adjacent vertical planes, according to formulas (1-2) and (1-3):
$$d_h = \left|\mathbf{n}_{h_i}\cdot\left(\mathbf{c}_{h_{i+1}} - \mathbf{c}_{h_i}\right)\right| \tag{1-2}$$
$$d_s = \left|\mathbf{n}_{v_i}\cdot\left(\mathbf{c}_{v_{i+1}} - \mathbf{c}_{v_i}\right)\right| \tag{1-3}$$
In the above, n_{v_i} and n_{h_i} are the normal vectors of the vertical plane v_i and the horizontal plane h_i, and c_{v_i} and c_{h_i} are their center points.
The yaw angle error θ_error, used to correct the yaw angle during motion, is obtained from the forward unit vector l_b along the x-axis of the body coordinate system (l_b can be taken as (1, 0, 0)) and the normal vector n_{v_i} of the vertical plane, according to formula (1-4):
$$\theta_{error} = \arccos\frac{-\,\mathbf{l}_b\cdot\mathbf{n}_{v_i}}{\|\mathbf{l}_b\|\,\|\mathbf{n}_{v_i}\|} \tag{1-4}$$
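A minimal sketch of this yaw-error computation, under the assumption (not stated in the patent) that the riser normal points back toward the robot and that projecting onto the horizontal plane and using a signed angle is acceptable for the downstream PD law:

```python
import math

def yaw_error(riser_normal):
    """Signed yaw error between the body x-axis l_b = (1, 0, 0) and the
    horizontal projection of a riser normal (cf. formula 1-4). Assumes the
    normal points back toward the robot (negative body-x direction)."""
    nx, ny = riser_normal[0], riser_normal[1]   # drop the z component
    return math.atan2(-ny, -nx)                  # zero when facing the riser squarely
```

A head-on riser normal (-1, 0, 0) gives zero error; a riser rotated 0.1 rad gives an error of 0.1 rad with a sign the PD correction can act on.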
considering that after the front foot steps out of the stair surface, the rear foot of the robot can lose visual information, the track planning algorithm of the rear foot needs to be changed at the moment, and therefore the stair zone bit needs to be continuously monitored in the process of climbing the stair. Judging whether the robot is in a stair-out state or not through the difference value of the center distances of two continuous planes, wherein the robot is used for changing a foot end planning method, and the formula is shown as 1-6:
in the above-mentioned method, the step of,representing the distance between the centers of the two levels detected according to equation 1-2 at the i-th time step. The distance between the central points increases when going out of stairs, and whether the threshold value of the error distance is larger than delta or not is detected l To determine whether the condition is met, typically the threshold is 1.5d s
The vector l_cv from the camera to the center point of the stair vertical plane is transformed into the vector l_bv from the base centroid to that center point, used for computing the foot-end distances, according to formula (1-5):
$$\mathbf{l}_{bv} = {}^{b}\mathbf{T}_{c}\,\mathbf{l}_{cv} \tag{1-5}$$
In the above, ^bT_c is the transformation matrix from the camera coordinate system to the body coordinate system.
The variables computed above are transmitted through the router to the on-board computing device in the robot, which executes the motion control algorithm.
In step S4, foot-end motion trajectories are generated from the stair characteristic parameters, the yaw angle error and the out-of-stair flag data; a PD correction algorithm and a terrain adaptation algorithm are introduced to generate the robot centroid trajectory; and the generated trajectories are tracked with model predictive control and whole-body control to obtain the joint commands that drive the robot through the stair-climbing task.
Specifically, as shown in fig. 5, once the on-board computer has received the computed stair characteristic parameters, yaw angle error and out-of-stair flag data, the motion control algorithm starts. First, the whole climbing process is divided into 5 phases according to the robot's position relative to the stairs; the out-of-stair flag data are monitored continuously and the current climbing phase is determined. Second, in each phase the distances between the four foot ends and the current vertical plane are computed from the stair characteristic parameters and the footholds at the next gait step are solved. Then, a foot-end trajectory between the swing-leg start point and the foothold is generated with a Bezier curve. Next, a PD correction algorithm computes a desired yaw angle from the yaw angle error to correct the robot's heading so that the head always faces the middle of the stairs; a terrain adaptation algorithm fits the plane through the four foot ends and takes its inclination as the desired pitch angle, improving motion stability. Finally, Model Predictive Control (MPC) and Whole-Body Control (WBC) track the generated foot-end and centroid trajectories (the leg and centroid trajectories for short) to obtain the joint commands, which are sent to the motors to drive the robot reliably through the stair-climbing task.
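The Bezier swing trajectory named above can be sketched as a cubic curve whose two inner control points are lifted to clear the step edge; the control-point placement and clearance height are assumptions for illustration, not the patent's exact parameterization:

```python
def bezier_swing(p_start, p_end, clearance, s):
    """Cubic Bezier in the sagittal (x, z) plane from swing start p_start
    to foothold p_end; the two inner control points are raised by
    `clearance` so the foot clears the step edge. s in [0, 1] is the
    swing phase."""
    (x0, z0), (x3, z3) = p_start, p_end
    ctrl_x = (x0, x0, x3, x3)
    ctrl_z = (z0, z0 + clearance, z3 + clearance, z3)

    def bez(c):
        return ((1 - s) ** 3 * c[0] + 3 * (1 - s) ** 2 * s * c[1]
                + 3 * (1 - s) * s ** 2 * c[2] + s ** 3 * c[3])

    return bez(ctrl_x), bez(ctrl_z)
```

The curve starts and ends exactly at the given points, and its apex rises above the straight line between them, which is the property needed to step over a riser.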
First, the stair depth d_s and height d_h delivered by the vision end and the vector l_bv from the torso centroid to the vertical plane are used to compute the foot-end distances. The vector from each leg to the torso centroid is obtained from kinematics and subtracted from l_bv to obtain the vectors l_fv and l_rv from the front and rear legs to the vertical plane; transforming l_fv and l_rv into the stair coordinate system and taking the x coordinate yields the distances d_fv and d_rv of the front and rear legs from the vertical plane.
Because the detected vertical plane is not necessarily the riser of the step on which a given leg currently stands, case-by-case processing is needed to obtain each foot's distance from the riser of its own step. The whole process is divided into 5 phases according to how many of the quadruped robot's legs are on the stairs: four legs on the ground, two legs on the stairs, four legs on the stairs, two legs off the stairs, and four legs off the stairs, as shown in fig. 3. The first three phases have visual information and the last two do not, so the flag must be monitored continuously to determine the climbing phase; different phases have different visual information and use different algorithms.
When visual information is available (the first three phases, i.e. the robot is on or entering the stairs), the distance d_f of the current front leg from the riser of its step is obtained directly with a remainder operation, and the distance of the current rear leg from the riser of its step with a kinematic transformation, according to formulas (1-7) and (1-8):
$$d_f = d_{fv} \bmod d_s \tag{1-7}$$
$$d_{rv} = d_f + d_{span} - n_s\,d_s \tag{1-8}$$
In the above, d_span is the x coordinate in the stair coordinate system of the vector between the diagonal leg positions, i.e. the spacing between the robot's diagonal legs; n_s is the number of steps the front leg has already stepped onto; d_s is the width of a stair step (i.e. the stair depth); d_f is the distance of the robot's front leg from the riser of the step where it stands; phase denotes which of the 5 stair-climbing phases the robot is in (phase = 1 means phase 1, i.e. all four legs on the ground); and d_rv is the distance of the robot's rear leg from the riser of the step where it stands.
In the last two phases (the robot walking off the stairs), where there is no visual information, d_span is used to judge whether a rear leg has touched the stairs. d_span is the spacing between the robot's diagonal legs; during normal walking it stays below a threshold, but when a rear leg touches a riser it is blocked from stepping forward and d_span grows. The spacing is therefore measured and compared against the set threshold to decide whether the rear leg has touched the stairs. When a touch is detected, the rear leg's distance to the riser is manually set to a small number, so that the next foothold is automatically planned onto the next step.
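This blind-phase logic can be sketched as follows; `margin` and `pinned_distance` are illustrative values, not numbers from the patent:

```python
def rear_leg_touched(d_span, nominal_span, margin=0.05, pinned_distance=0.02):
    """Blind-phase riser detection: the diagonal-leg spacing d_span grows
    past its nominal value when a rear foot is blocked by a riser; the
    rear-leg-to-riser distance is then pinned to a small value so the next
    foothold is planned up onto the step."""
    if d_span > nominal_span + margin:
        return True, pinned_distance   # touched: force a step-up foothold
    return False, None
```

With a nominal spacing of 0.55 m, a measured 0.62 m signals a blocked rear foot, while 0.56 m is treated as normal walking.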
After the distance from each foot end to the vertical surface of its step is obtained, the footholds are processed in stages according to whether the nominal swing-leg span exceeds the current distance: when the span is far smaller than the distance, the foothold is left unchanged; when the span approaches the distance, an increment is added to the x and z coordinates of the original foothold (expressed in the world coordinate system) so that the foot crosses the current step.
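The staged foothold treatment just described can be sketched as follows; the proximity margin and the x/z increments are illustrative assumptions rather than the patent's values:

```python
def adjust_foothold(foothold_xz, swing_span, dist_to_riser,
                    margin=0.05, dx=0.08, dz=0.12):
    """Stage the nominal foothold (world-frame x and z) against the distance
    to the current riser: leave it untouched while the swing span is well
    below that distance, and push it forward and upward to clear the step
    edge once the span approaches it. margin/dx/dz are illustrative."""
    x, z = foothold_xz
    if swing_span < dist_to_riser - margin:
        return (x, z)            # span far smaller than distance: no processing
    return (x + dx, z + dz)      # span near the distance: cross the step
```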
Secondly, the pitch angle and the yaw angle of the quadruped robot need to be corrected in real time while going upstairs to increase stability. For the pitch angle, a plane is fitted to the four feet with a plane function and its inclination angle is taken as the desired pitch angle; for the yaw angle, the desired yaw angle is corrected using the yaw-angle error obtained by the vision processing end and a PD model, as shown in formula 1-9:
In the above, k_p and k_d represent the stiffness and damping coefficients of the PD model, θ_err,i represents the yaw-angle error at the i-th time step, and ψ_i^des indicates the desired yaw angle at time step i.
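The body of formula 1-9 is not reproduced in this text, so the following discrete PD law is a hedged reconstruction consistent with the symbols listed above; the exact form and the gain values are assumptions:

```python
def pd_yaw_correction(yaw_err, prev_yaw_err, dt, k_p=1.0, k_d=0.1):
    """Discrete PD correction built from the vision-side yaw error
    theta_err,i: a proportional term on the current error plus a damping
    term on its finite-difference rate. Gains are placeholders."""
    d_err = (yaw_err - prev_yaw_err) / dt
    return k_p * yaw_err + k_d * d_err
```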
Finally, based on the generated desired trajectories for torso (i.e., centroid) and legs (i.e., foot ends), the tracking trajectories and system inputs are optimized using MPC, the optimization equations being shown in equations 1-10:
In the above, x_i and u_i respectively represent the state (composed of centroid position, velocity, posture and angular velocity) and the plantar-force input at the i-th time step, and x_i,ref represents the reference state at the i-th time step; Q_i and R_i respectively represent the weight matrices of trajectory tracking and of the input. Formulas 1-11 give the dynamics, inequality and equality constraints of the optimization objective.
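The body of formulas 1-10 is not reproduced in this text; a standard finite-horizon MPC tracking cost consistent with the symbols defined above would read as follows (a hedged reconstruction, with the linearized dynamics and the form of the constraint set assumed rather than taken from the patent):

```latex
\min_{x,\,u} \; \sum_{i=0}^{N-1}
  \left( \lVert x_{i+1} - x_{i+1,\mathrm{ref}} \rVert_{Q_i}^{2}
       + \lVert u_i \rVert_{R_i}^{2} \right)
\quad \text{s.t.} \quad
x_{i+1} = A_i x_i + B_i u_i, \qquad u_i \in \mathcal{U}_i
```

where the admissible set \(\mathcal{U}_i\) stands for the inequality constraints (e.g. friction cones on the plantar forces) and equality constraints (e.g. zero force on swing legs) referred to in formulas 1-11.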
With the support-leg plantar-force input optimized by the above task, WBC is then used to reduce the trajectory-tracking error and improve the dynamic response. Define the configuration space of the robot q = [q_f, q_j], where q_f ∈ R^6 and q_j ∈ R^12 respectively represent the floating-base degrees of freedom and the joint degrees of freedom. The torso-translation, torso-rotation and swing-leg tasks are mapped into the null space of the support-leg task according to task priority, and the tracking errors are converted into task accelerations, as shown in formulas 1-12:
In the above, q̈_j represents the acceleration of the j-th priority task, the pseudo-inverse term represents the inverse of the null-space mapping matrix of the higher-priority tasks, J_j represents the Jacobian matrix of the j-th priority task, ẍ_j^ref is the second derivative of the reference trajectory of the j-th task, and q̈_{j-1} indicates the acceleration of the (j-1)-th priority task.
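The body of formulas 1-12 is likewise not reproduced here. The symbol list above matches the standard null-space task-priority recursion, sketched below as a hedged reconstruction (the projected-Jacobian pseudo-inverse notation is an assumption):

```latex
\ddot{q}_j \;=\; \ddot{q}_{j-1}
  \;+\; \bigl( J_j N_{j-1} \bigr)^{\dagger}
         \bigl( \ddot{x}_j^{\mathrm{ref}} - J_j \, \ddot{q}_{j-1} \bigr)
```

where \(N_{j-1}\) is the null-space projector of all tasks with priority higher than j, so a lower-priority task can only contribute acceleration that does not disturb the higher-priority tasks.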
All task accelerations are added to obtain the total desired configuration-space acceleration q̈^des. The floating variables δ_q and δ_F of the acceleration and plantar force are then optimized simultaneously to satisfy the dynamics constraints; the optimization equation is shown in formulas 1-13:
In the above, W_1 and W_2 are the weight matrices of the optimized floating variables; formulas 1-14 give the dynamics constraint, the equality constraint and the inequality constraint in the optimization process, respectively.
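The body of formulas 1-13 is also not reproduced in the text. With the floating variables δ_q and δ_F defined above, this relaxation QP commonly takes the following form; the constraint notation (M, b, g for the inertia matrix, Coriolis and gravity terms, S_f for the floating-base row selector, J_c for the contact Jacobian) is assumed for illustration, not quoted from the patent:

```latex
\min_{\delta_q,\,\delta_F} \;
  \delta_q^{\top} W_1 \, \delta_q + \delta_F^{\top} W_2 \, \delta_F
\quad \text{s.t.} \quad
S_f \bigl( M(q)\,(\ddot{q}^{\mathrm{des}} + \delta_q) + b + g \bigr)
  \;=\; S_f \, J_c^{\top} \bigl( F^{\mathrm{MPC}} + \delta_F \bigr)
```

together with the equality and friction-cone inequality constraints of formulas 1-14 on the corrected contact forces.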
The joint-space action quantities obtained through the above steps are transmitted to the drivers to control the leg-joint motion of the robot, completing the stair-climbing task.
Example two
This embodiment provides a system for a quadruped robot to autonomously climb stairs based on local environment perception, comprising:
the visual camera module comprises a single depth camera, wherein the single depth camera is arranged at the abdomen of the quadruped robot body and is used for acquiring depth data of the structured stairs under the current visual angle of the robot;
the router module is used for transmitting the depth data obtained by the vision camera module to the on-board computing module;
the on-board computing module is used for processing the depth data and executing a motion control algorithm;
the four-foot robot body is used for carrying the vision camera module, the router module and the on-board computing module and executing stair climbing tasks.
Wherein processing depth data and executing a motion control algorithm comprises:
clustering depth data and dividing planes by using a PEAC algorithm to obtain stair plane point cloud data;
based on the stair plane point cloud data, each plane of the current stair is calculated by utilizing a PCA algorithm, and the type of each plane of the stair is determined, wherein the type comprises a horizontal plane and a vertical plane;
calculating stair characteristic parameters, yaw angle errors and stair-out zone bit data based on the horizontal plane and the vertical plane of the stairs;
based on stair characteristic parameters, yaw angle errors and stair-out zone bit data, generating foot-end movement tracks, introducing a PD correction algorithm and a terrain adaptation algorithm, generating robot mass center movement tracks, and tracking the generated tracks by model prediction control and whole body movement control to obtain joint action amount, controlling the leg joint movement of the robot, and completing stair climbing tasks.
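The perception-side steps above (PCA plane-type classification against gravity and stair-parameter extraction, cf. claims 3-4) can be sketched in Python with numpy. The thresholds and the center/normal projection used for the step dimensions are one plausible reading of the claims, labeled as assumptions:

```python
import numpy as np

def classify_plane(normal, gravity, delta1=0.95, delta2=0.05):
    """Classify a PCA plane by the angle between its normal and the gravity
    vector in the body frame: near-parallel -> horizontal plane (tread),
    near-perpendicular -> vertical plane (riser). delta1/delta2 are
    illustrative thresholds, not the patent's values."""
    n = np.asarray(normal, dtype=float)
    g = np.asarray(gravity, dtype=float)
    c = abs(n @ g) / (np.linalg.norm(n) * np.linalg.norm(g))
    if c > delta1:
        return "horizontal"
    if c < delta2:
        return "vertical"
    return "unknown"

def step_height_and_depth(c_h1, c_h2, n_h, c_v1, c_v2, n_v):
    """Step height d_h as the distance between two consecutive
    horizontal-plane centers along the horizontal normal, and step depth d_s
    between two vertical-plane centers along the vertical normal (an assumed
    reading of the center/normal formulation in claim 4)."""
    n_h = np.asarray(n_h, dtype=float) / np.linalg.norm(n_h)
    n_v = np.asarray(n_v, dtype=float) / np.linalg.norm(n_v)
    d_h = abs(n_h @ (np.asarray(c_h2, float) - np.asarray(c_h1, float)))
    d_s = abs(n_v @ (np.asarray(c_v2, float) - np.asarray(c_v1, float)))
    return d_h, d_s
```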
The steps involved in the second embodiment correspond to those of the first embodiment of the method, and the detailed description of the second embodiment can be found in the related description section of the first embodiment.
It will be appreciated by those skilled in the art that the modules or steps of the invention described above may be implemented by general-purpose computer means, alternatively they may be implemented by program code executable by computing means, whereby they may be stored in storage means for execution by computing means, or they may be made into individual integrated circuit modules separately, or a plurality of modules or steps in them may be made into a single integrated circuit module. The present invention is not limited to any specific combination of hardware and software.
While the present invention has been described in connection with the preferred embodiments, it should be understood that the present invention is not limited to the specific embodiments, but is set forth in the following claims.

Claims (10)

1. A method for autonomous climbing stairs by a quadruped robot based on local environment perception, characterized in that the abdomen of the quadruped robot body is provided with a vision camera module, the vision camera module comprises a single depth camera, the method comprises:
obtaining depth data of a structured stair under the current view angle of a robot, clustering the depth data and dividing a plane by using a PEAC algorithm to obtain stair plane point cloud data;
based on the stair plane point cloud data, each plane of the current stair is calculated by utilizing a PCA algorithm, and the type of each plane of the stair is determined, wherein the type comprises a horizontal plane and a vertical plane;
calculating stair characteristic parameters, yaw angle errors and stair-out zone bit data based on the horizontal plane and the vertical plane of the stairs;
based on stair characteristic parameters, yaw angle errors and stair-out zone bit data, generating foot-end movement tracks, introducing a PD correction algorithm and a terrain adaptation algorithm, generating robot mass center movement tracks, and tracking the generated tracks by model prediction control and whole body movement control to obtain joint action amount, controlling the leg joint movement of the robot, and completing stair climbing tasks.
2. The method for autonomous four-legged robot to climb stairs based on local environment awareness according to claim 1, wherein controlling the movement of the robot leg joints to complete the task of climbing stairs comprises:
continuously monitoring stair zone bit data, and judging the current climbing stage of the robot;
calculating the distance between four foot ends and the current vertical plane according to the stair characteristic parameters in different climbing stages, solving the foot drop point of the next gait moment, and generating a foot end movement track between the starting point of the swing leg and the foot drop point by using a Bezier curve;
correcting the direction of the robot through a PD correction algorithm according to the yaw angle error, and determining a desired yaw angle; fitting planes of the four foot ends by a terrain adaptation algorithm, and taking the inclination angle of the planes as an expected pitch angle; generating a robot centroid movement track based on the expected yaw angle and the expected pitch angle;
based on the foot end movement track and the robot mass center movement track, model prediction control and whole body movement control tracking are utilized to obtain joint action quantity, and the leg joint movement of the robot is controlled to complete stair climbing tasks.
3. The method for autonomous climbing of stairs by a quadruped robot based on local environment perception according to claim 1, characterized in that each plane of the current stairs is calculated by using a PCA algorithm based on the stair plane point cloud data, and the type of each plane of the stairs is determined, wherein the type comprises a horizontal plane and a vertical plane, and the method comprises the following steps:
processing the stair plane point cloud data by using a PCA algorithm, and calculating to obtain a center point and a normal vector of each plane under a machine body coordinate system;
the method comprises the steps of combining gravity vectors of the robot under a machine body coordinate system, judging the type of each plane according to the included angle between the normal vector of each plane and the gravity vector, wherein the judging formula is as follows:
In the above, δ_1 and δ_2 are the thresholds set for judging whether the vectors are parallel or perpendicular, g is the gravity vector in the body coordinate system, and n_{p_i} is the normal vector of the i-th plane p_i.
4. The method for autonomous four-legged robot to climb stairs based on local environment awareness according to claim 1, characterized in that, based on the horizontal plane and vertical plane of the stairs, stair characteristic parameters are calculated, including the height and depth of a single step, the calculation method comprises:
calculating the height d_h of the stairs from the distance between two horizontal planes, and the depth d_s of the stairs from the distance between two vertical planes, with the following calculation formula:
In the above, n_{v_i} and n_{h_i} respectively denote the normal vectors of the vertical plane v_i and the horizontal plane h_i, and c_{v_i} and c_{h_i} respectively denote the center points of the vertical plane v_i and the horizontal plane h_i.
5. The method for autonomous climbing stairs by a quadruped robot based on local environment perception according to claim 1, wherein the yaw angle error is obtained by the included angle between the forward vector of the robot and the normal vector of the vertical plane, and the yaw angle is used for correcting the motion, and the calculation formula is as follows:
In the above, l_b represents the forward vector of the robot, and n_{v_i} represents the normal vector of the vertical plane v_i.
6. The method for autonomous climbing of stairs by a quadruped robot based on local environment perception according to claim 1, wherein the difference of the center distances of two consecutive horizontal planes is used for judging whether the robot is in the stair-out state, with the following calculation formula:
In the above, l_i represents the distance between the centers of the two horizontal planes detected at the i-th time step, l_{i-1} represents the distance between the centers of the two horizontal planes detected at time step i-1, and δ_l indicates the set threshold;
if the detected distance difference is greater than the set threshold δ_l, the robot is judged to be in the stair-out state.
7. The method for autonomous climbing of stairs by a quadruped robot based on local environment perception according to claim 2, wherein the robot climbing stairs comprises 5 climbing stages, namely four legs on the plane, two legs in the stairs, four legs in the stairs, two legs out of the stairs, and four legs out of the stairs; wherein four legs on the plane, two legs in the stairs, and four legs in the stairs correspond to the robot entering the stairs, while two legs out of the stairs and four legs out of the stairs correspond to the robot exiting the stairs.
8. The method for automatically climbing stairs by using the quadruped robot based on local environment perception according to claim 7, wherein the stair marker data are continuously monitored, and the current climbing stage of the robot is judged; when the robot is in a stage that four legs enter a stair on a plane or two legs enter the stair or four legs enter the stair, calculating the distance between four foot ends and the current vertical plane according to the stair characteristic parameters, wherein the method comprises the following steps:
obtaining the distance d_f from the current front leg to the vertical surface of the step by a remainder operation, and the distance d_r from the current rear leg to the vertical surface of the step by a kinematic transformation, with the following calculation formula:
d_f = d_fv % d_s
In the above, d_span represents the x-coordinate of the vector between the diagonal leg positions in the stair coordinate system, i.e. the distance between the diagonal legs of the robot; n_s indicates the number of steps on the stairs the front leg has already stepped on; d_s represents the width of the stair step; d_f represents the distance of the front leg of the robot from the vertical surface of the step where it is located; Phase denotes one of the 5 stages of the robot climbing stairs; d_rv indicates the distance of the rear leg of the robot from the vertical plane of the step where it is located.
9. A system for a quadruped robot to autonomously climb stairs based on local environment perception, characterized by comprising:
the visual camera module comprises a single depth camera, wherein the single depth camera is arranged at the abdomen of the quadruped robot body and is used for acquiring depth data of the structured stairs under the current visual angle of the robot;
the router module is used for transmitting the depth data obtained by the visual information module to the on-board computer module;
the on-board computing module is used for processing the depth data and executing a motion control algorithm;
the four-foot robot body is used for carrying a vision camera module, a router module and an on-board computing module and executing a stair climbing task;
the processing depth data and executing a motion control algorithm includes:
clustering depth data and dividing planes by using a PEAC algorithm to obtain stair plane point cloud data;
based on the stair plane point cloud data, each plane of the current stair is calculated by utilizing a PCA algorithm, and the type of each plane of the stair is determined, wherein the type comprises a horizontal plane and a vertical plane;
calculating stair characteristic parameters, yaw angle errors and stair-out zone bit data based on the horizontal plane and the vertical plane of the stairs;
based on stair characteristic parameters, yaw angle errors and stair-out zone bit data, generating foot-end movement tracks, introducing a PD correction algorithm and a terrain adaptation algorithm, generating robot mass center movement tracks, and tracking the generated tracks by model prediction control and whole body movement control to obtain joint action amount, controlling the leg joint movement of the robot, and completing stair climbing tasks.
10. The system for a quadruped robot to autonomously climb stairs based on local environment perception according to claim 9, wherein controlling the movement of the robot leg joints to complete the stair-climbing task comprises:
continuously monitoring stair zone bit data, and judging the current climbing stage of the robot;
calculating the distance between four foot ends and the current vertical plane according to the stair characteristic parameters in different climbing stages, solving the foot drop point of the next gait moment, and generating a foot end movement track between the starting point of the swing leg and the foot drop point by using a Bezier curve;
correcting the direction of the robot through a PD correction algorithm according to the yaw angle error, and determining a desired yaw angle; fitting planes of the four foot ends by a terrain adaptation algorithm, and taking the inclination angle of the planes as an expected pitch angle; generating a robot centroid movement track based on the expected yaw angle and the expected pitch angle;
based on the foot end movement track and the robot mass center movement track, model prediction control and whole body movement control tracking are utilized to obtain joint action quantity, and the leg joint movement of the robot is controlled to complete stair climbing tasks.
CN202311393752.4A 2023-10-25 2023-10-25 Method and system for autonomous climbing stairs by quadruped robot based on local environment perception Pending CN117270551A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311393752.4A CN117270551A (en) 2023-10-25 2023-10-25 Method and system for autonomous climbing stairs by quadruped robot based on local environment perception

Publications (1)

Publication Number Publication Date
CN117270551A true CN117270551A (en) 2023-12-22

Family

ID=89200866

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311393752.4A Pending CN117270551A (en) 2023-10-25 2023-10-25 Method and system for autonomous climbing stairs by quadruped robot based on local environment perception

Country Status (1)

Country Link
CN (1) CN117270551A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination