CN108621161B - Method for estimating body state of foot type robot based on multi-sensor information fusion - Google Patents


Info

Publication number
CN108621161B
CN108621161B
Authority
CN
China
Prior art keywords
leg
information
imu
body state
acceleration
Prior art date
Legal status
Active
Application number
CN201810431905.2A
Other languages
Chinese (zh)
Other versions
CN108621161A (en)
Inventor
马宏绪
司振飞
安宏雷
韦庆
王剑
王发林
张芷僮
杨宇
刘轶
Current Assignee
National University of Defense Technology
Original Assignee
National University of Defense Technology
Priority date
Filing date
Publication date
Application filed by National University of Defense Technology
Priority to CN201810431905.2A
Publication of CN108621161A
Application granted
Publication of CN108621161B

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B25 — HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J — MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 — Programme-controlled manipulators
    • B25J 9/16 — Programme controls
    • B25J 19/00 — Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators

Abstract

The invention discloses a method for estimating the body state of a legged robot based on multi-sensor information fusion, comprising the following steps: S1, acquire the motion information of each leg through sensors in the target robot and perform a single-leg kinematic solution for each leg to obtain its kinematic information; fuse the kinematic information of the support legs to obtain an initial estimate of the body state. S2, obtain IMU information from an IMU system and perform inertial navigation modeling and solution based on it to obtain an inertial navigation result. S3, fuse the initial body state estimate with the inertial navigation result to obtain the final body state estimate. The method combines the advantages of IMU-based and kinematics-based estimation: the estimate does not drift, spike noise can be filtered out, and the accuracy of the estimate is improved.

Description

Method for estimating body state of foot type robot based on multi-sensor information fusion
Technical Field
The invention relates to the technical field of legged-robot state estimation, and in particular to a method for estimating the body state of a legged robot based on multi-sensor information fusion.
Background
A legged robot is a robot that imitates a human or animal, with serially connected multi-joint legs. Depending on the object being imitated and the required tasks, legged robots can be divided into several categories: 1. single-legged robots, which imitate a single mammalian leg for research on hopping; 2. biped walking robots, which imitate humans for research on stable walking, running, interaction, and similar functions; 3. quadruped robots, which imitate four-legged mammals such as dogs, cats, horses, and cheetahs for research on stability, agility, load capacity, and environmental adaptability; 4. multi-legged robots, which imitate crawling insects such as spiders and centipedes for research on tasks such as exploration and rescue in dangerous, confined spaces.
An important goal of legged-robot research is to give the robot the ability to walk in the real world like the animal it imitates, and perception is a particularly important part of this. In general, robot perception falls into two broad categories: first, perception of the external environment, including foot-ground contact conditions, the presence and position of obstacles, and external disturbances, which can be used for obstacle avoidance and obstacle crossing; second, perception of the robot's own state, including leg flexion, joint motion, and the body's acceleration, velocity, position, direction of motion, and attitude angles, which is fed back to the controller to improve control performance and the robot's motion.
Accurate estimation of the body velocity and position of a legged robot not only describes the robot's motion in the real world more faithfully, but also provides high-quality input and feedback signals for control, improving both control performance and overall robot performance. The sources of sensing information are generally the onboard sensors, mainly the inertial measurement unit (IMU), GPS, radar, cameras, joint displacement sensors, joint force sensors, and foot-end force sensors. At present, state estimation for legged robots is generally either an inertial navigation solution based on the IMU or kinematic modeling and analysis based on internal sensors. The inertial navigation solution is accurate over short periods but drifts as time accumulates, the drift arising mainly from IMU device bias and integration error; the kinematic analysis does not drift, but contains a large amount of noise, and in particular the spike noise caused by vibration and impact at touchdown cannot be removed by ordinary filtering algorithms. Because of these defects, neither inertial navigation solution nor kinematic analysis based on a single class of sensor information can estimate the body state accurately, nor can either meet the real-time control requirements of long-duration robot motion.
To address these problems, some practitioners have proposed multi-source information fusion for body state estimation of legged robots. Generally, however, the robot velocity is estimated by directly fusing inertial navigation information with kinematic information: position estimation is not considered, nor is the fusion of information internal to the kinematics. Such schemes perform only a single level of information fusion, so the drift of IMU-based estimation and the heavy noise of the kinematics-based solution remain, and the accuracy of the estimate is still limited.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: in view of the technical problems in the prior art, the invention provides a method for estimating the body state of a legged robot based on multi-sensor information fusion that is simple to implement, combines the advantages of IMU-based and kinematics-based estimation, produces an estimate that does not drift, filters out spike noise, and achieves high accuracy.
In order to solve the technical problems, the technical scheme provided by the invention is as follows:
a method for estimating the body state of a legged robot based on multi-sensor information fusion comprises the following steps:
s1, primary information fusion: acquiring the motion information of each leg through a sensor in the target robot respectively, and performing single-leg kinematics calculation on each leg according to the acquired information to obtain the kinematics information of each leg; fusing the kinematic information of the support legs in each leg to obtain an estimation result of the initial state of the body;
s2, inertial navigation modeling resolving: obtaining IMU information through an IMU system installed on a target robot, and performing inertial navigation modeling calculation based on the obtained IMU information to obtain an inertial navigation calculation result;
s3, second-level information fusion: and fusing the obtained body initial state estimation result and the inertial navigation resolving result to obtain a final body state estimation result.
As a further improvement of the present invention, the specific steps of step S1 are:
s11, single-leg kinematics calculation: acquiring joint displacement information of each single leg of the target robot through a joint displacement sensor, and performing single-leg kinematic solution to obtain position and speed information of each leg;
s12, support period judgment: detecting the contact force between the foot end and the ground through a force sensor at the foot end of each leg in the target robot, judging whether each leg is in a supporting period, and determining to obtain the supporting leg in each leg;
s13, information fusion: and acquiring position information corresponding to each supporting leg, fusing the position information corresponding to each supporting leg to obtain an initial position of the body, and obtaining an initial speed of the body according to the initial position of the body.
As a further improvement of the present invention, the specific steps of the single-leg kinematic solution in step S11 are as follows:
S111, defining a body-fixed coordinate frame and a reference inertial frame, and solving the position of the foot end in the body-fixed frame from the joint displacement information of a single leg of the target robot;
S112, acquiring the attitude angle information output by the IMU, and projecting the position solved in step S111 onto the three axes of the reference inertial frame according to the acquired attitude angle information;
S113, solving back from the projection of step S112 to obtain the position and velocity of the body in the reference inertial frame.
As a further improvement of the invention: in step S13, the position information of each support leg is weighted to realize fusion, and the initial position of the body is obtained from the weighted result.
As a further improvement of the invention: the initial position of the body is specifically the average value of the weighted sum of the position information of all the supporting legs.
As a further improvement of the present invention, the specific steps of step S2 are:
S21, calculating the attitude transformation matrix between the carrier frame and the reference inertial frame using the attitude angles output by the IMU system;
S22, converting the carrier-frame acceleration output by the IMU system into the reference-inertial-frame acceleration through a coordinate transformation based on the attitude transformation matrix;
S23, statically calibrating the IMU system drift error to compensate the deterministic bias and obtain a preliminary true acceleration;
S24, solving the inertial navigation value of the body state from the body initial velocity and body initial position given by the initial body state estimate, to obtain the inertial navigation solution result.
As a further improvement of the invention: in the step S22, the reference inertial system acceleration is obtained through conversion according to the following formula;
Figure GDA0002815922930000031
wherein the content of the first and second substances,
Figure GDA0002815922930000032
as an attitude transformation matrix, fbAcceleration of the carrier system, g, for IMU direct outputnIn order to be the acceleration of the gravity,
Figure GDA0002815922930000033
is the converted reference inertial system acceleration.
As a further improvement of the present invention, in step S23, a six-position method is specifically adopted to obtain the preliminary true acceleration, with the following specific steps:
S231, construct the simplified error model
$$D_{i}=\frac{M_{i}-B_{i}}{S_{i}},\qquad i\in\{x,y,z\}$$
where $D_{x},D_{y},D_{z}$ are the true values of the accelerometer along its three axes in the IMU system; $M_{x},M_{y},M_{z}$ are the measured values along the three axes; $S_{x},S_{y},S_{z}$ are the accelerometer scale factors; and $B_{x},B_{y},B_{z}$ are the zero offsets along the three axes;
S232, solve for S and B in the simplified error model by the six-position rolling calibration method, and compensate them into the reference-inertial-frame acceleration to obtain the true reference-inertial-frame acceleration.
As a further improvement of the invention: in step S24, an inertial navigation solution of the body state is obtained through an integration algorithm specifically in combination with the body initial velocity and position obtained from the body initial state estimation result.
As a further improvement of the invention: in the step S3, a Kalman filtering method is specifically used to fuse the initial estimation result and the inertial navigation solution result.
Compared with the prior art, the invention has the advantages that:
1) The method for estimating the body state of a legged robot based on multi-sensor information fusion performs two-level fusion of the two classes of sensor information, kinematic information and IMU information, to estimate the body state. This makes full use of the advantages of both classes: on one hand the drift produced by IMU integration is suppressed, and on the other the noise caused by oscillation and impact in the kinematic solution is reduced, yielding an optimal estimate of the body velocity and position and improving estimation accuracy.
2) By fusing multiple kinds of perception information to estimate the body state, the method enables a deeper and finer imitation of animal perception, behaviour, and consciousness, which helps to further raise the robot's level of intelligence.
3) The second-level fusion of inertial navigation information and kinematic information is based on Kalman filtering, which estimates the robot's velocity and position while greatly reducing the amount of computation; it therefore offers good real-time performance and further improves estimation efficiency and accuracy.
Drawings
Fig. 1 is a schematic flow chart of the implementation of the method for estimating the body state of the legged robot based on multi-sensor information fusion in the present embodiment.
Fig. 2 is a schematic diagram of the principle of the embodiment for realizing the state estimation of the body of the legged robot.
Fig. 3 is a schematic diagram of the simplified model of the quadruped robot constructed in the present embodiment.
Fig. 4 is a schematic diagram of the implementation flow of step S1 in this embodiment.
Fig. 5 is a schematic diagram of a single-leg mechanical structure and a simplified model of the robot constructed in the embodiment.
Detailed Description
The invention is further described below with reference to the drawings and specific preferred embodiments of the description, without thereby limiting the scope of protection of the invention.
As shown in fig. 1 and 2, the method for estimating the body state of the legged robot based on multi-sensor information fusion in the present embodiment includes the following steps:
s1, primary information fusion: acquiring the motion information of each leg through a sensor in the target robot respectively, and performing single-leg kinematics calculation on each leg according to the acquired information to obtain the kinematics information of each leg; the method comprises the steps of fusing kinematic information of supporting legs in each leg to obtain an initial state estimation result of a body, wherein the state of the body comprises the speed and the position of the body;
s2, inertial navigation modeling resolving: obtaining IMU information of a target robot through an IMU system, and performing inertial navigation modeling resolving based on the obtained IMU information to obtain an inertial navigation resolving result;
s3, second-level information fusion: and fusing the obtained body initial state estimation result and the inertial navigation resolving result to obtain a final body state estimation result.
In this embodiment, two-level fusion of the two classes of sensor information, kinematic information and IMU information, realizes the estimation of the body state of the legged robot. This makes full use of the advantages of both classes: on one hand the drift produced by IMU integration is suppressed, and on the other the noise caused by oscillation and impact in the kinematic solution is reduced, yielding an optimal estimate of the body velocity and position and improving estimation accuracy. By fusing multiple kinds of perception information, the embodiment also enables a deeper and finer imitation of animal perception, behaviour, and consciousness, which helps to further raise the robot's level of intelligence.
The robot of this embodiment is specifically a quadruped bionic robot, whose simplified model is shown in Fig. 3. The robot consists mainly of a body and four articulated legs and imitates a quadruped mammal; each leg has four joints, from bottom to top: ankle, knee, fore-aft hip, and lateral hip. Here $\{O_n\text{-}X_nY_nZ_n\}$ denotes the reference inertial frame of the robot motion and $\{O_b\text{-}X_bY_bZ_b\}$ denotes the body-fixed coordinate frame. Both the IMU and the kinematic solutions of the four support legs can be regarded as information sources for body state estimation; the support legs provide homogeneous data among themselves, while the support legs and the IMU are mutually heterogeneous. This embodiment therefore first fuses the homogeneous data to complete the first-level information fusion, then fuses the preliminary result heterogeneously with the IMU to complete the second-level information fusion, finally realizing the robot body state estimation, as shown in Fig. 2.
As shown in fig. 4, the specific step of step S1 in this embodiment is:
s11, single-leg kinematics calculation: acquiring joint displacement information of each single leg of the target robot through a joint displacement sensor, and performing single-leg kinematic solution to obtain position and speed information of each leg;
s12, support period judgment: detecting the contact force between the foot end and the ground through a force sensor at the foot end of each leg in the target robot, judging whether each leg is in a supporting period, and determining to obtain the supporting leg in each leg;
s13, information fusion: and acquiring position information corresponding to each supporting leg, fusing the position information corresponding to each supporting leg to obtain an initial position of the body, and obtaining an initial speed of the body according to the initial position of the body.
In this embodiment, the body state is first solved by kinematic modeling based on the internal displacement sensors: forward kinematic analysis of a single leg yields the pose of the body relative to the foot end of a support leg, and first-level fusion of this internal-sensor kinematic information then yields the velocity and position of the body.
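The support-period judgment of step S12 can be sketched as a simple threshold test on the foot-end contact force. The threshold value, the leg names used as dictionary keys, and the function name below are illustrative assumptions, not details from the patent.

```python
def detect_support_legs(foot_forces, threshold=20.0):
    """Return the legs whose foot-end contact force indicates ground support.

    foot_forces: mapping of leg name -> vertical contact force in newtons.
    threshold:   assumed contact-force level separating stance from swing.
    """
    return [leg for leg, force in foot_forces.items() if force > threshold]

# Trot-like sample: LF and LH loaded, RF and RH swinging (illustrative values).
forces = {"LF": 85.0, "RF": 3.0, "LH": 90.0, "RH": 2.5}
support = detect_support_legs(forces)  # -> ["LF", "LH"]
```

Only the legs returned here contribute their kinematic solutions to the first-level fusion of step S13.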
In this embodiment, the specific steps of the single-leg kinematic solution in step S11 are as follows:
S111, defining a body-fixed coordinate frame and a reference inertial frame, and solving the position of the foot end in the body-fixed frame from the joint displacement information of a single leg of the target robot;
S112, acquiring the attitude angle information output by the IMU, and projecting the position solved in step S111 onto the three axes of the reference inertial frame according to the acquired attitude angle information;
S113, solving back from the projection of step S112 to obtain the position and velocity of the body in the reference inertial frame.
For the single-leg kinematic solution of this embodiment, the origin of the body-fixed frame is defined at the body's centre of mass, and the IMU's installation error is neglected, so the three axes of the IMU carrier frame coincide with those of the body-fixed frame. First, based on the internal joint displacement sensors and the robot's geometry, the position of the foot end in the body-fixed frame is solved by single-leg kinematics; next, the attitude angles output by the IMU are used to project this position onto the three axes of the reference inertial frame; finally, the position and velocity of the body in the reference inertial frame are solved back. The detailed steps in a specific application embodiment are as follows:
(1) Solving the position of the foot end in the body-fixed coordinate frame.
As shown in Fig. 5, which gives the single-leg mechanical structure and simplified model of the robot, a coordinate frame $\{O_{i}\text{-}X_{i}Y_{i}Z_{i}\}$, $i\in\{1,2,3,4,5\}$, is established at each joint. The transformation from the foot-end frame $\{O_{5}\}$ to the hip frame $\{O_{0}\}$ is then
$${}^{0}T_{5}={}^{0}T_{1}\,{}^{1}T_{2}\,{}^{2}T_{3}\,{}^{3}T_{4}\,{}^{4}T_{5}\qquad(1)$$
Let the position of the foot-end point be $P_{5}$ in the foot-end frame and $P_{0}$ in the hip frame; the position of the foot-end point in the hip frame can then be expressed as
$$P_{0}={}^{0}T_{5}\,P_{5}\qquad(2)$$
the same principle can solve the conversion relation between the hip coordinate system and the body fixed connection coordinate system, and then the position of the foot end point in the body system can be obtained.
(2) Projecting the position obtained in step (1) onto the reference inertial frame of the robot motion using IMU information.
Three attitude angles are output by the IMU; from them, the transformation matrix between the body frame at the current moment and the reference inertial frame at the initial moment is computed, and multiplying this matrix by the position calculated in step (1) gives the projection of that position in the reference inertial frame.
(3) And solving the position and the speed.
From the results of steps (1) and (2), the change in position of the body centre of mass relative to the foot end of the support leg is solved back, and the velocity is obtained by differentiating the position. When the centre-of-mass position at the initial moment is known, the current centre-of-mass position of the robot is obtained by accumulating the displacement over successive time periods.
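The accumulate-and-differentiate scheme just described can be sketched as follows; the 1 kHz sampling period and the displacement values are illustrative assumptions.

```python
dt = 0.001  # assumed sampling period (s)
# Illustrative per-period displacements of the body centre of mass (m).
displacements = [0.0012, 0.0011, 0.0013]

position = 0.0  # centre-of-mass position at the initial moment, assumed known
positions = []
for dp in displacements:
    position += dp              # accumulate displacement over each period
    positions.append(position)

# Differentiate position to recover velocity (finite difference over dt).
velocities = [dp / dt for dp in displacements]
```

When dt is short, each finite-difference velocity can be taken as the real-time velocity at that moment, as the text notes.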
In this embodiment, in step S13, the position information of each support leg is weighted to realize fusion, and the initial position of the body is obtained from the weighting result, where the initial position of the body is specifically an average value of weighted sums of the position information of all the support legs.
In order to reduce the error caused by foot-end slipping, mechanism clearance, and the like, when several legs are in support at once the mean of their displacement solutions is taken as the final displacement of the body centre of mass; for the quadruped robot of this embodiment,
$$\Delta P_{\mathrm{MEA}}=\frac{1}{n}\sum_{i=1}^{n}\Delta P_{i},\qquad n\in\{1,2,3,4\}\qquad(3)$$
That is, there are in total 16 possible combinations of legs in contact with the ground, and the kinematic solutions of the support legs are fused by weighting, as detailed in Table 1. In the table, LF, RF, LH, and RH denote the left-front, right-front, left-rear, and right-rear legs of the quadruped robot, respectively. Correspondingly, the body centre-of-mass velocity $V_{\mathrm{MEA}}$ is obtained by differentiating the position, and a leg is judged to be in support according to the contact force between its foot end and the ground.
Table 1. Correspondence between the body centre-of-mass displacement and the support-leg condition
In this embodiment, after the centre-of-mass displacement from moment k−1 to moment k is solved via Table 1, the position at the current moment is obtained by accumulation, the position at the initial moment being known; the average velocity over the sampling period is obtained by dividing the displacement by the sampling time, and when the sampling period is short this average can be taken as the real-time velocity at the current moment.
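The mean-value fusion over the support legs can be sketched as below. The leg names follow the patent's LF/RF/LH/RH convention, while the displacement values and the function name are illustrative assumptions.

```python
def fuse_displacement(leg_dp, support_legs):
    """Average the per-leg CoM displacement estimates over the current support legs."""
    if not support_legs:
        return None  # all legs in swing: no kinematic update this period
    return sum(leg_dp[leg] for leg in support_legs) / len(support_legs)

# Illustrative displacement estimates (m) from each leg's kinematic solution.
leg_dp = {"LF": 0.0040, "RF": 0.0044, "LH": 0.0038, "RH": 0.0042}

dp_two = fuse_displacement(leg_dp, ["LF", "RH"])               # diagonal support pair
dp_all = fuse_displacement(leg_dp, ["LF", "RF", "LH", "RH"])   # full stance
```

Averaging over whichever subset of legs is in support covers all 16 contact combinations with one rule, which is why the same scheme extends to bipeds and hexapods.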
For other robots, such as bipeds or hexapods, the leg contact combinations differ, but the displacement can be solved by the same mean-value fusion principle as for the quadruped robot.
In this embodiment, the specific steps of step S2 are as follows:
S21, calculating the attitude transformation matrix between the carrier frame and the reference inertial frame using the attitude angles output by the IMU system;
S22, converting the carrier-frame acceleration output by the IMU system into the reference-inertial-frame acceleration through a coordinate transformation based on the attitude transformation matrix;
S23, statically calibrating the IMU system drift error to compensate the deterministic bias and obtain a preliminary true acceleration;
S24, solving the inertial navigation value of the body state from the body initial velocity and body initial position given by the initial body state estimate, to obtain the inertial navigation solution result.
In this embodiment, in step S22, the reference-inertial-frame acceleration is obtained through the conversion
$$\dot{V}^{n}=C_{b}^{n}f^{b}+g^{n}\qquad(4)$$
where $C_{b}^{n}$ is the attitude transformation matrix, $f^{b}$ is the carrier-frame acceleration directly output by the IMU, $g^{n}$ is the gravitational acceleration, and $\dot{V}^{n}$ is the converted reference-inertial-frame acceleration.
In this embodiment, in step S23, the six-position method is specifically adopted to obtain the preliminary true acceleration, with the following specific steps:
S231, construct the simplified error model
$$D_{i}=\frac{M_{i}-B_{i}}{S_{i}},\qquad i\in\{x,y,z\}\qquad(5)$$
where $D_{x},D_{y},D_{z}$ are the true values of the accelerometer along its three axes in the IMU system; $M_{x},M_{y},M_{z}$ are the measured values along the three axes; $S_{x},S_{y},S_{z}$ are the accelerometer scale factors; and $B_{x},B_{y},B_{z}$ are the zero offsets along the three axes;
S232, solve for S and B in the simplified error model by the six-position rolling calibration method, and compensate them into the reference-inertial-frame acceleration to obtain the true reference-inertial-frame acceleration.
In this embodiment, in step S24, the inertial navigation solution value of the body state is obtained through an integral algorithm specifically by combining the body initial velocity and the body initial position obtained from the body initial state estimation result.
When the IMU carrier coordinate frame is converted into the reference inertial frame of the robot motion, the X axis of the IMU body-fixed frame is perpendicular to the sensor's interface surface (positive inward), the Z axis is perpendicular to the sensor's upper and lower faces (positive downward), and the Y axis is determined by the right-hand rule. Installation and platform errors are neglected during mounting, so that the IMU X axis points along the robot's direction of travel and the Z axis along the direction of gravitational acceleration. To perform the inertial navigation solution, the IMU is initialized to output the angles of the current IMU body-fixed frame relative to the body-fixed frame at the initial moment (i.e., the ground reference inertial frame): the roll angle $\phi$, pitch angle $\theta$, and yaw angle $\psi$, whose value ranges under actual motion conditions are $\phi\in[-\pi,\pi]$, $\theta\in[-\pi/2,\pi/2]$, and $\psi\in[-\pi,\pi]$. The transformation between the ground reference inertial frame and the IMU body-fixed frame can be represented by a direction cosine matrix, composed of the transformation matrix $RM_{\phi}$ generated by rotation about the X axis, $RM_{\theta}$ generated by rotation about the Y axis, and $RM_{\psi}$ generated by rotation about the Z axis. Expressed in terms of the Euler angles, the direction cosine matrix DCM that projects IMU-frame vectors into the reference inertial frame, as used in (8) and (9) below, is
$$\mathrm{DCM}=\begin{bmatrix}\cos\theta\cos\psi & \sin\phi\sin\theta\cos\psi-\cos\phi\sin\psi & \cos\phi\sin\theta\cos\psi+\sin\phi\sin\psi\\ \cos\theta\sin\psi & \sin\phi\sin\theta\sin\psi+\cos\phi\cos\psi & \cos\phi\sin\theta\sin\psi-\sin\phi\cos\psi\\ -\sin\theta & \sin\phi\cos\theta & \cos\phi\cos\theta\end{bmatrix}\qquad(6)$$
Moreover, for any such transformation matrix the inverse is equal to the transpose:
$$\mathrm{DCM}^{-1}=\mathrm{DCM}^{T}\qquad(7)$$
The acceleration and velocity vectors of the body in the reference inertial frame can then be obtained from the corresponding vectors in the IMU coordinate frame:
$$a_{\mathrm{NED}}=\mathrm{DCM}\cdot a_{\mathrm{IMU}}\qquad(8)$$
$$V_{\mathrm{NED}}=\mathrm{DCM}\cdot V_{\mathrm{IMU}}\qquad(9)$$
where the three attitude angles $\phi$, $\theta$, and $\psi$ are the outputs of the IMU's built-in extended Kalman filter after information fusion.
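The direction-cosine-matrix projection of equations (8) and (9) can be sketched as follows. The ZYX (yaw-pitch-roll) rotation order is the convention assumed here.

```python
import math

def dcm_body_to_nav(phi, theta, psi):
    """Direction cosine matrix taking IMU-frame vectors into the reference frame
    (roll phi, pitch theta, yaw psi; ZYX rotation order assumed)."""
    cph, sph = math.cos(phi), math.sin(phi)
    cth, sth = math.cos(theta), math.sin(theta)
    cps, sps = math.cos(psi), math.sin(psi)
    return [
        [cth * cps, sph * sth * cps - cph * sps, cph * sth * cps + sph * sps],
        [cth * sps, sph * sth * sps + cph * cps, cph * sth * sps - sph * cps],
        [-sth, sph * cth, cph * cth],
    ]

def apply(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

# Level body yawed 90 degrees: an IMU-frame x acceleration appears along the
# reference frame's y axis (a_NED = DCM * a_IMU).
DCM = dcm_body_to_nav(0.0, 0.0, math.pi / 2)
a_nav = apply(DCM, [1.0, 0.0, 0.0])
```

Because the matrix is orthonormal, its transpose doubles as its inverse, matching the relation DCM⁻¹ = DCMᵀ in the text.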
For the inertial navigation modeling solution, the attitude transformation matrix between the carrier frame and the reference inertial frame is first computed from the attitude angles output by the inertial measurement unit (IMU), and the carrier-frame acceleration output by the IMU is transformed into the reference-inertial-frame acceleration; the velocity-update differential equation arranged in the ground reference inertial frame is given by (4) above. Next, the IMU drift error is statically calibrated and the deterministic bias is compensated by the six-position method to obtain the preliminary true acceleration: the simplified deterministic-error model is given by (5), and after S and B are solved by the six-position rolling calibration, they are compensated into the reference-inertial-frame acceleration to give the true reference-inertial-frame acceleration. Finally, combining the body initial velocity and initial position from the initial state estimate, the inertial navigation solution of the body state is obtained by an integration algorithm.
In this embodiment, in step S3, the initial estimation result and the inertial navigation solution result are fused specifically by a Kalman filtering method: a state model and a measurement model of the estimated object are established based on the Kalman filter, the required sensor information is included in the two models, and the optimal estimate of the state is then obtained through a series of prediction and update equations. Since the Kalman filtering method needs only the state and measurement data of the previous moment to compute the state at the current moment, performing the second-level fusion of the inertial navigation information and the kinematic information with a Kalman filter to estimate the velocity and position of the robot greatly reduces the amount of computation, offers high real-time performance, and can further improve the estimation efficiency and accuracy. In a specific application embodiment, improved Kalman filters such as the EKF (extended Kalman filter), the UKF (unscented Kalman filter), or the particle filter may be used; the EKF is suitable for weakly nonlinear systems, while for strongly nonlinear systems the UKF or particle-filter methods may be considered.
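The prediction/update cycle described above can be sketched with a minimal linear Kalman filter per axis (an illustration only, with assumed state layout and noise values; the patent does not fix these): the prediction step propagates the inertial model with the reference-frame acceleration as input, and the update step corrects it with the position from the first-level kinematic fusion, using only the previous state and the current measurement.

```python
import numpy as np

class FusionKF:
    """Sketch of the second-level fusion for one axis.
    State x = [position, velocity]; measurement z = kinematic position."""

    def __init__(self, dt, q=1e-3, r=1e-2):
        self.F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
        self.G = np.array([[0.5 * dt * dt], [dt]])   # acceleration input
        self.H = np.array([[1.0, 0.0]])              # position is measured
        self.Q = q * np.eye(2)                       # process noise (assumed)
        self.R = np.array([[r]])                     # measurement noise (assumed)
        self.x = np.zeros((2, 1))
        self.P = np.eye(2)

    def step(self, a_ned, z_kin):
        # Predict with the inertial-navigation acceleration.
        self.x = self.F @ self.x + self.G * a_ned
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with the kinematic position estimate.
        y = np.array([[z_kin]]) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return self.x.ravel()
```

Fed a constant kinematic position and zero acceleration, the estimate settles on the measured position while the velocity estimate stays near zero.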
The foregoing is only a preferred embodiment of the present invention and is not intended to limit the present invention in any way. Although the present invention has been disclosed above with reference to preferred embodiments, it is not thereby limited. Therefore, any simple modification, equivalent change or improvement made to the above embodiments according to the technical essence of the present invention, without departing from the content of the technical scheme of the present invention, shall fall within the protection scope of the technical scheme of the present invention.

Claims (7)

1. A method for estimating the body state of a legged robot based on multi-sensor information fusion is characterized by comprising the following steps:
s1, first-level information fusion: acquiring the motion information of each leg through sensors in the target legged robot, and performing a single-leg kinematic solution for each leg according to the acquired information to obtain the kinematic information of each leg; fusing the kinematic information of the supporting legs among the legs to obtain an initial estimation result of the body state, wherein the body state comprises displacement and velocity, and the legged robot comprises a body and legs;
s2, inertial navigation modeling resolving: obtaining IMU information through an IMU system installed on a target legged robot, and performing inertial navigation modeling calculation based on the obtained IMU information to obtain an inertial navigation calculation result;
s3, second-level information fusion: fusing the obtained initial estimation result of the body state and the inertial navigation resolving result to obtain a final body state estimation result;
the specific steps of step S1 are:
s11, single-leg kinematics calculation: acquiring joint displacement information of each single leg of the target legged robot through a joint displacement sensor, and performing single-leg kinematic solution to obtain position and speed information of each leg;
s12, support period judgment: detecting the contact force between the foot end and the ground through a force sensor at the foot end of each leg in the target foot type robot, judging whether each leg is in a supporting period, and determining to obtain a supporting leg in each leg;
s13, information fusion: acquiring position information corresponding to each supporting leg, fusing the position information corresponding to each supporting leg to obtain an initial position of the body, and obtaining an initial speed of the body according to the initial position of the body;
the specific steps of the one-leg kinematics calculation in the step S11 are as follows:
s111, defining a robot body fixed coordinate system and a reference inertia system, resolving according to joint displacement information of a single leg of the target foot type robot, and solving the position of a foot end in the body fixed coordinate system;
s112, acquiring attitude angle information output by an IMU system, and projecting the position solved in the step S111 to three coordinate systems of a reference inertial system according to the acquired attitude angle information;
s113, back-solving the position and the velocity of the body in the reference inertial system from the projections of step S112;
the specific steps of step S2 are:
s21, calculating the attitude transformation matrix between the IMU carrier frame and the reference inertial frame by using the attitude angles output by the IMU system, wherein the three coordinate axes of the IMU carrier frame are aligned in direction with those of the body-fixed coordinate frame;
s22, converting the carrier system acceleration output by the IMU system into a reference inertial system acceleration through coordinate system transformation according to the attitude conversion matrix;
s23, carrying out static calibration on the IMU system drift error to compensate for deterministic deviation and obtain a primary real acceleration;
and S24, solving an inertial navigation solution value of the body state according to the initial speed and the initial position of the body obtained by the initial estimation result of the body state, and obtaining the inertial navigation solution result.
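As a small illustration of the single-leg kinematic solution in steps S111-S113, a planar two-segment leg can be forward-solved as below (a sketch only: the leg geometry, link lengths, and function names are assumptions, not the patent's robot):

```python
import numpy as np

def leg_forward_kinematics(q1, q2, l1=0.35, l2=0.35):
    """Foot-end position in the body-fixed frame from hip and knee joint
    displacements (step S111), for an assumed planar two-link leg. With the
    foot planted, the body position in the reference frame follows by
    negating this vector and rotating it by the IMU attitude (S112-S113)."""
    x = l1 * np.sin(q1) + l2 * np.sin(q1 + q2)
    z = -(l1 * np.cos(q1) + l2 * np.cos(q1 + q2))
    return np.array([x, z])
```

With both joints at zero the foot hangs straight below the hip at the full leg length.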
2. The method for estimating the body state of the legged robot based on the multi-sensor information fusion according to claim 1, characterized in that: in step S13, the position information of each support leg is weighted to realize fusion, and the initial position of the body is obtained from the weighted result.
3. The method for estimating the body state of the legged robot based on multi-sensor information fusion as claimed in claim 2, wherein the initial position of the body is the weighted average of the position information of all the supporting legs.
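The weighted fusion of claims 2-3 amounts to a weighted mean over the support legs. A minimal sketch (the choice of weighting source, e.g. contact-force magnitude, is an assumption):

```python
import numpy as np

def body_position_from_support_legs(leg_positions, weights):
    """Weighted average of the body positions back-solved from each
    supporting leg. leg_positions: (N, 3) array; weights: N confidence
    weights, e.g. derived from the foot-end contact forces (assumed)."""
    w = np.asarray(weights, dtype=float)
    p = np.asarray(leg_positions, dtype=float)
    return (w[:, None] * p).sum(axis=0) / w.sum()
```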
4. The method for estimating the body state of the legged robot based on multi-sensor information fusion as claimed in claim 1, wherein in step S22 the reference-inertial-frame acceleration is obtained by the following conversion:

\[
\dot{V}^{n} = C_{b}^{n} f^{b} + g^{n}
\]

wherein \(C_{b}^{n}\) is the attitude transformation matrix, \(f^{b}\) is the carrier-frame acceleration output by the IMU system, \(g^{n}\) is the gravitational acceleration, and \(\dot{V}^{n}\) is the converted reference-inertial-frame acceleration.
5. The method for estimating the body state of the legged robot based on multi-sensor information fusion as claimed in claim 4, wherein in step S23 the preliminary true acceleration is obtained by calculation using the six-position method, with the following specific steps:
s231, constructing the simplified error model:

\[
\begin{bmatrix} D_x \\ D_y \\ D_z \end{bmatrix} =
\begin{bmatrix} S_x & 0 & 0 \\ 0 & S_y & 0 \\ 0 & 0 & S_z \end{bmatrix}
\begin{bmatrix} M_x \\ M_y \\ M_z \end{bmatrix} +
\begin{bmatrix} B_x \\ B_y \\ B_z \end{bmatrix}
\]

wherein \(D_x, D_y, D_z\) are the true values of the accelerometer along the three axes of the IMU carrier frame; \(M_x, M_y, M_z\) are the measured values of the accelerometer along the three axes of the IMU carrier frame; \(S_x, S_y, S_z\) are the scale factors of the accelerometer; and \(B_x, B_y, B_z\) are the zero offsets of the accelerometer along the three axes;

s232, solving \(S_x, S_y, S_z\) and \(B_x, B_y, B_z\) in the simplified error model by the six-position rolling calibration method, and compensating them into the reference-inertial-frame acceleration to obtain the true acceleration in the reference inertial frame.
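Given the six-position samples, the per-axis scale factor S and zero offset B of the simplified model D = S·M + B can be fitted by least squares. A sketch only: the actual six-position rolling procedure additionally schedules the orientations so that each axis sees +g and -g, which this snippet does not show.

```python
import numpy as np

def calibrate_axis(measured, true_vals):
    """Fit scale factor S and zero offset B of D = S * M + B by least
    squares over the calibration samples of one accelerometer axis."""
    A = np.column_stack([measured, np.ones_like(measured)])
    (S, B), *_ = np.linalg.lstsq(A, true_vals, rcond=None)
    return S, B
```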
6. The method for estimating the body state of the legged robot based on the multi-sensor information fusion as claimed in claim 5, wherein in step S24, the inertial navigation solution value of the body state is obtained by an integration algorithm specifically combining the initial velocity and the initial position of the body obtained from the initial estimation result of the body state.
7. The method for estimating the body state of the legged robot based on multi-sensor information fusion according to any one of claims 1-4, characterized in that: in the step S3, a kalman filtering method is specifically used to fuse the initial estimation result of the body state and the inertial navigation solution result.
CN201810431905.2A 2018-05-08 2018-05-08 Method for estimating body state of foot type robot based on multi-sensor information fusion Active CN108621161B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810431905.2A CN108621161B (en) 2018-05-08 2018-05-08 Method for estimating body state of foot type robot based on multi-sensor information fusion

Publications (2)

Publication Number Publication Date
CN108621161A CN108621161A (en) 2018-10-09
CN108621161B true CN108621161B (en) 2021-03-02

Family

ID=63695826

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810431905.2A Active CN108621161B (en) 2018-05-08 2018-05-08 Method for estimating body state of foot type robot based on multi-sensor information fusion

Country Status (1)

Country Link
CN (1) CN108621161B (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110412921B (en) * 2019-08-09 2021-07-27 山东大学 Robot single-leg high-real-time control system based on EtherCAT
CN110861123A (en) * 2019-11-14 2020-03-06 华南智能机器人创新研究院 Method and device for visually monitoring and evaluating running state of robot
CN110906923B (en) * 2019-11-28 2023-03-14 重庆长安汽车股份有限公司 Vehicle-mounted multi-sensor tight coupling fusion positioning method and system, storage medium and vehicle
CN111086001B (en) * 2019-12-25 2021-09-14 广东省智能制造研究所 State estimation method and system for multi-modal perception of foot robot
CN111158482B (en) * 2019-12-30 2023-06-27 华中科技大学鄂州工业技术研究院 Human body motion gesture capturing method and system
CN110928420B (en) * 2019-12-30 2023-06-13 华中科技大学鄂州工业技术研究院 Human body motion gesture capturing method and system
CN111189577B (en) * 2020-01-16 2022-01-07 腾讯科技(深圳)有限公司 Sensor calibration and data measurement method, device, equipment and storage medium
CN111896007B (en) * 2020-08-12 2022-06-21 智能移动机器人(中山)研究院 Attitude calculation method for quadruped robot for compensating foot-ground impact
CN111949929B (en) * 2020-08-12 2022-06-21 智能移动机器人(中山)研究院 Design method of multi-sensor fusion quadruped robot motion odometer
CN111993391B (en) * 2020-08-25 2022-02-15 深圳市优必选科技股份有限公司 Robot pose estimation method and device, humanoid robot and storage medium
CN114131604B (en) * 2020-08-26 2023-11-03 北京市商汤科技开发有限公司 Method and device for determining state of robot, robot and storage medium
CN112596534A (en) * 2020-12-04 2021-04-02 杭州未名信科科技有限公司 Gait training method and device for quadruped robot based on deep reinforcement learning, electronic equipment and medium
CN113091999A (en) * 2021-04-01 2021-07-09 燕山大学 Foot type robot leg one-dimensional force sensor calibration method and system
CN113358121A (en) * 2021-06-10 2021-09-07 中国北方车辆研究所 Electrically-driven insect configuration foot type robot foot-to-ground slip estimation method
CN113405548A (en) * 2021-06-10 2021-09-17 中国北方车辆研究所 Foot ground slip estimation method for electrically-driven lactating configuration foot type robot
CN113267181B (en) * 2021-06-22 2022-08-16 武汉大学 Construction method of foot-type odometer suitable for foot-type robot
CN114216456B (en) * 2021-11-27 2023-12-08 北京工业大学 Attitude measurement method based on fusion of IMU and robot body parameters

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102426353B (en) * 2011-08-23 2013-08-14 北京航空航天大学 Method for offline acquisition of airborne InSAR (Interferometric Synthetic Aperture Radar) motion error by utilizing high-precision POS (Position and Orientation System)
CN102508954A (en) * 2011-10-21 2012-06-20 天津大学 Full-digital simulation method and device for global positioning system (GPS)/strapdown inertial navigation system (SINS) combined navigation
US8805584B2 (en) * 2011-11-22 2014-08-12 Disney Enterprises, Inc Kinematic and dynamic calibration methods for legged robots with force-controlled joints
CN104215262A (en) * 2014-08-29 2014-12-17 南京航空航天大学 On-line dynamic inertia sensor error identification method of inertia navigation system

Similar Documents

Publication Publication Date Title
CN108621161B (en) Method for estimating body state of foot type robot based on multi-sensor information fusion
Camurri et al. Probabilistic contact estimation and impact detection for state estimation of quadruped robots
Lin et al. Sensor data fusion for body state estimation in a hexapod robot with dynamical gaits
US20210309310A1 (en) Control of Robotic Devices with Non-Constant Body Pitch
US20230278214A1 (en) Robot localization using variance sampling
CN114022824A (en) Narrow environment-oriented quadruped robot motion planning method
Chitta et al. Proprioceptive localization for a quadrupedal robot on known terrain
CN108621201B (en) Method for measuring state of body of foot type robot
Masuya et al. Com motion estimation of a humanoid robot based on a fusion of dynamics and kinematics information
Benallegue et al. Humanoid flexibility deformation can be efficiently estimated using only inertial measurement units and contact information
Wawrzyński et al. Robust estimation of walking robots velocity and tilt using proprioceptive sensors data fusion
Zhao et al. A real-time low-computation cost human-following framework in outdoor environment for legged robots
Liu et al. State estimation of a heavy-duty hexapod robot with passive compliant ankles based on the leg kinematics and IMU data fusion
Gür et al. Model-based proprioceptive state estimation for spring-mass running
Lowrey et al. Real-time state estimation with whole-body multi-contact dynamics: A modified UKF approach
Lubbe et al. State estimation for a hexapod robot
Wawrzyński et al. Robust velocity estimation for legged robot using on-board sensors data fusion
Ravichandran et al. Joint angle measurement using MEMs based inertial sensors for biped robot
Hauschildt et al. Multi body kalman filtering with articulation constraints for humanoid robot pose and motion estimation
Leng et al. An improved method for odometry estimation based on EKF and Temporal Convolutional Network
Babu et al. Trajectory Following using Nonlinear Model Predictive Control and 3D Point-Cloud-based Localization for Autonomous Driving
Gupta State estimation for legged robots using proprioceptive sensors
A. Martínez-García et al. Multi-legged robot dynamics navigation model with optical flow
CN115145292B (en) Terrain detection method based on wheel-foot robot joint motion analysis
Piperakis Robust nonlinear state estimation for humanoid robots

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant