CN116901084A - Track generation and tracking control method and system for leg-arm cooperative robot dancing - Google Patents

Track generation and tracking control method and system for leg-arm cooperative robot dancing

Info

Publication number
CN116901084A
Authority
CN
China
Prior art keywords
leg
robot
arm
track
mechanical
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311103584.0A
Other languages
Chinese (zh)
Inventor
柴汇
杨志远
刘松
罗闯
袁铭
张国璇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong University
Original Assignee
Shandong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong University filed Critical Shandong University
Priority to CN202311103584.0A priority Critical patent/CN116901084A/en
Publication of CN116901084A publication Critical patent/CN116901084A/en
Pending legal-status Critical Current


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1628 Programme controls characterised by the control loop
    • B25J9/1633 Programme controls characterised by the control loop compliant, force, torque control, e.g. combined with position control
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1669 Programme controls characterised by programming, planning systems for manipulators characterised by special application, e.g. multi-arm co-operation, assembly, grasping

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to the technical field of leg-arm cooperative robots and provides a track generation and tracking control method and system for leg-arm cooperative robot dancing. The method comprises: obtaining dance movements of an animation model of the leg-arm cooperative robot in animation software; performing track extraction and track optimization on the dance movements to obtain trunk, mechanical arm and mechanical leg foot end tracks and plantar forces that conform to robot dynamics; after classifying the trunk, mechanical arm and mechanical leg tasks by priority, calculating the position, velocity and acceleration of each task through null-space mapping; and, based on the accelerations of the tasks at all levels and the plantar forces at the mechanical leg foot ends, obtaining the joint torques through the whole-body dynamics equation of the robot after relaxation optimization of the trunk dynamics equation of the leg-arm cooperative robot, then combining the positions and velocities of the tasks at all levels to obtain the joint torque control law, thereby completing reproduction of the dance movements. The method improves track generation efficiency and ensures that the robot reproduces the dance movements as far as possible on the premise of stability.

Description

Track generation and tracking control method and system for leg-arm cooperative robot dancing
Technical Field
The invention belongs to the technical field of leg-arm cooperative robots, and particularly relates to a track generation and tracking control method and system for leg-arm cooperative robot dancing.
Background
The statements in this section merely provide background information related to the present disclosure and may not necessarily constitute prior art.
The leg-arm cooperative robot is a novel intelligent mobile robot system with active operation capability, formed by adding a mechanical arm to a legged mobile platform. It combines the strong terrain adaptability of legged robots with the powerful operation capability of mechanical arms, and has great development prospects in fields such as disaster relief and logistics warehousing. Realizing leg-arm cooperative robot dancing can break through traditional industrial production scenes, expand into fields such as entertainment performance, and broaden the usability of robots in the real world.
Realizing leg-arm cooperative robot dancing requires considering several problems:
First, track generation for the multiple task ends. The leg-arm cooperative robot has three kinds of task ends: the legs, the trunk and the mechanical arm. Multi-end task tracks can be obtained from a given task objective function through a track planning algorithm, but leg-arm cooperative robot dancing must have aesthetic appeal and overall coordination, which places high demands on the design of the planning algorithm, and different dance movements require different planning algorithms, so the difficulty is high and the efficiency is low. Multi-task tracks with dance aesthetics can be obtained by capturing the motion of real creatures, but this track generation method has high requirements on the corresponding equipment and a certain entry threshold. Reinforcement learning can avoid the complexity of a track planning algorithm and the entry threshold of motion capture, but it usually requires a large amount of training time, its efficiency is low, and deployment of the trained model is difficult.
Second, modeling of the leg-arm cooperative robot. After the mechanical arm is added to the legged robot, the interactions between the legged robot and the ground and between the mechanical arm and the robot body must be considered, so the dynamic behavior of the system becomes particularly complex, which brings great difficulty to system modeling. For the leg-arm cooperative robot, the modeling approaches can be divided into decentralized modeling and overall modeling according to whether the robot body and the mechanical arm are modeled separately. Decentralized modeling treats the influence of the mechanical arm on the robot body as an external disturbance; although this simplifies the modeling, the uncertainty of the disturbance makes overall control of the robot particularly difficult. Meanwhile, because the overall joint configuration changes greatly during the dancing of the leg-arm cooperative robot, offline dynamics modeling cannot reflect the real dynamic relations of the robot, and one can only turn to online dynamics modeling with higher complexity.
Third, stability of the leg-arm cooperative robot. Stability is the primary problem faced by all robots. During dance motions, the poses of all three ends of the robot (legs, trunk and mechanical arm) change, and tracking control of the preset dance motion must be realized as far as possible on the premise of preventing the robot from becoming unstable.
Disclosure of Invention
In order to solve the technical problems in the background art, the invention provides a track generation and tracking control method and system for leg-arm cooperative robot dancing. Animation software is used to edit the dance movements of the leg-arm cooperative robot and to generate and optimize the tracks, which, compared with mainstream track generation approaches such as track planning algorithms, biological motion capture and reinforcement learning, reduces the equipment threshold and complexity and improves track generation efficiency; and by assigning priorities to the multiple task ends of the leg-arm robot and performing null-space-mapping whole-body control, the robot is ensured to reproduce the dance movements as far as possible on the premise of stability.
In order to achieve the above purpose, the present invention adopts the following technical scheme:
a first aspect of the present invention provides a trajectory generation and tracking control method of a leg-arm cooperative robot dance, comprising:
obtaining dance movements of an animation model of the leg-arm cooperative robot in animation software;
performing track extraction and track optimization on the dance movements to obtain trunk tracks, mechanical arm tracks, mechanical leg foot end tracks and mechanical leg foot end plantar forces that conform to robot dynamics;
after the tasks of the trunk, the mechanical arm and the mechanical legs are classified by priority, calculating the positions, velocities and accelerations of the tasks at all levels through null-space mapping, in combination with the trunk track, the mechanical arm track, the mechanical leg foot end tracks and the mechanical leg foot end plantar forces;
based on the accelerations of the tasks at all levels and the plantar forces at the mechanical leg foot ends, obtaining the joint torques through the whole-body dynamics equation of the robot after relaxation optimization of the trunk dynamics equation of the leg-arm cooperative robot;
and taking the joint torques as a feedforward quantity and, in combination with the positions and velocities of the tasks at all levels, performing proportional-derivative control on the robot joints to obtain the joint torque control law, thereby completing reproduction of the dance movements.
Further, the robot whole body dynamics equation is:
M_{24×24}·q̈_changed + C_{24×1} + G_{24×1} = τ_{24×1} + J_out^T·f_out,changed
wherein M_{24×24} is the robot mass distribution matrix, C_{24×1} is the generalized Coriolis force matrix, G_{24×1} is the generalized gravity matrix, f_out,changed = f_out + δf and q̈_changed = q̈ + δq̈, q̈ is obtained through the null-space iteration, f_out is the mechanical leg plantar force, δf and δq̈ are the relaxation optimization solution quantities satisfying the trunk dynamics equation, J_out denotes the contact Jacobian matrix, and τ_{24×1} denotes the joint torque.
Further, the constraint condition of the trajectory optimization includes: rigid body dynamics model, force constraint during foot end support, force constraint during foot end suspension, friction cone constraint of foot end, position constraint during foot end support and position constraint of the tail end of the mechanical arm.
Further, the null-space mapping includes: task-level null-space mapping, position-level null-space mapping, and acceleration-level null-space mapping.
Further, the trunk, the mechanical arm and the mechanical leg are all composed of a plurality of rigid bodies, and the rigid bodies are connected through joints.
Further, the null-space mapping is based on the Jacobian matrix of each task;
the Jacobian matrix is obtained by deriving, through the virtual work principle, the mapping relation from the Cartesian space to the joint space of the leg-arm cooperative robot.
Further, the priority ranking is in sequence: support legs, torso, robotic arms, and swing legs.
A second aspect of the present invention provides a trajectory generation and tracking control system of a leg-arm cooperative robot dance, comprising:
a data acquisition module configured to obtain dance movements of an animation model of the leg-arm cooperative robot in animation software;
a trajectory extraction module configured to perform track extraction and track optimization on the dance movements to obtain trunk tracks, mechanical arm tracks, mechanical leg foot end tracks and mechanical leg foot end plantar forces that conform to robot dynamics;
a null-space mapping module configured to, after the tasks of the trunk, the mechanical arm and the mechanical legs are classified by priority, calculate the positions, velocities and accelerations of the tasks at all levels through null-space mapping in combination with the trunk track, the mechanical arm track, the mechanical leg foot end tracks and the mechanical leg foot end plantar forces;
a relaxation optimization module configured to obtain the joint torques, based on the accelerations of the tasks at all levels and the plantar forces at the mechanical leg foot ends, through the whole-body dynamics equation of the robot after relaxation optimization of the trunk dynamics equation of the leg-arm cooperative robot; and
a control module configured to take the joint torques as a feedforward quantity and, in combination with the positions and velocities of the tasks at all levels, perform proportional-derivative control on the robot joints to obtain the joint torque control law, thereby completing reproduction of the dance movements.
A third aspect of the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps in a trajectory generation and tracking control method of a leg-arm cooperative robot dancing as described above.
A fourth aspect of the present invention provides a computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps in a trajectory generation and tracking control method of a leg-arm collaborative robot dancing as described above when the program is executed.
Compared with the prior art, the invention has the beneficial effects that:
The invention uses animation software to edit the dance movements of the leg-arm cooperative robot and to generate and optimize the tracks; compared with mainstream track generation approaches such as track planning algorithms, biological motion capture and reinforcement learning, this reduces the equipment threshold and complexity and improves track generation efficiency.
The invention assigns priorities to the multiple task ends of the leg-arm robot and performs null-space-mapping whole-body control, so that the robot reproduces the dance movements as far as possible on the premise of stability.
The invention introduces spatial vectors, a kinematic tree and advanced dynamics algorithms to model the leg-arm cooperative robot as a whole, ensuring the accuracy of the dynamics model during the robot dancing process.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention.
Fig. 1 is a schematic diagram of a leg-arm cooperative robot according to a first embodiment of the present invention;
FIG. 2 is a flowchart of a method for generating and tracking a dance track of a leg-arm cooperative robot according to an embodiment of the present invention;
FIG. 3 is a skeletal relationship diagram of a robot animation model according to a first embodiment of the present invention;
FIG. 4 is a schematic diagram of a robot kinematic tree according to a first embodiment of the present invention;
FIG. 5 is an overall schematic diagram of a robot coordinate system according to a first embodiment of the present invention;
FIG. 6 is a diagram of the relationship between the robot rigid body coordinate systems T_{λ(i),i} and T_i according to the first embodiment of the present invention.
Detailed Description
The invention will be further described with reference to the drawings and examples.
It should be noted that the following detailed description is illustrative and is intended to provide further explanation of the invention. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
Example 1
The embodiment provides a track generation and tracking control method for a leg-arm cooperative robot dancing.
In the track generation and tracking control method for leg-arm cooperative robot dancing provided by this embodiment, the track generation method makes robot dance design and track generation more efficient: a designer can interactively create and modify the robot actions with visual feedback, and after track optimization a multi-end track satisfying the robot trunk dynamics relation is obtained. The tracking control method ensures that the robot reproduces the dance movements as far as possible on the premise of accurate modeling and overall stability.
In the track generation and tracking control method for leg-arm cooperative robot dancing provided by this embodiment, the leg-arm cooperative robot, as shown in Fig. 1, comprises a 6-degree-of-freedom mechanical arm and a quadruped robot with 12 degrees of freedom.
The track generation and tracking control method for leg-arm cooperative robot dancing provided by this embodiment first introduces a technical route in which the dance movements of the leg-arm cooperative robot are edited, the tracks generated and the tracks optimized through animation software; compared with mainstream track generation approaches such as track planning algorithms, biological motion capture and reinforcement learning, this reduces the equipment threshold and complexity and improves track generation efficiency. Secondly, by introducing spatial vectors, a kinematic tree and advanced dynamics algorithms, the leg-arm cooperative robot is modeled as a whole, ensuring the accuracy of the dynamics model during the robot dancing process. Finally, the multiple task ends of the leg-arm robot are assigned priorities and null-space-mapping whole-body control is performed, ensuring that the robot reproduces the dance movements as far as possible on the premise of stability.
The track generation and tracking control method for the leg-arm cooperative robot dancing provided in this embodiment, as shown in fig. 2, specifically includes the following steps:
Step 1: building the Blender animation model of the leg-arm cooperative robot.
STL files of all parts of the leg-arm cooperative robot are exported through the three-dimensional CAD software SolidWorks and imported into the Blender animation software. Blender realizes motion control of a model through the skinned-animation principle, i.e. a mesh entity is bound to a specific skeleton. The skeleton is made up of a series of connected bones, each with a position and an orientation, and bones can be combined into a layered structure to form the skeleton. By rotating, translating and scaling the bones in the skeleton, the shape and actions of the model or object associated with the skeleton can be controlled. In the design of the leg-arm cooperative robot, bones are used to represent the four legs, the arm and the body structure of the robot and to control their actions.
To ensure the accuracy of the leg-arm cooperative robot model in Blender, the number and hierarchical structure of the bones in the skeleton are designed accordingly. The robot model is composed of 31 bones in total, including 6 control bones that control the left front leg, right front leg, left rear leg, right rear leg, mechanical arm and trunk of the leg-arm cooperative robot. Each leg contains 4 part bones, the mechanical arm contains 7 part bones, and the trunk consists of 2 part bones; the bone relation diagram is shown in Fig. 3. On the legs and the mechanical arm, the part bones are nested level by level through parent-child relationships, and a rotation constraint is applied to each bone. The movements of the bones are then driven by controlling the six control bones, realizing visual dance motion design for the three ends of the robot: trunk, legs and arm.
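As an illustration of this skeleton-driven workflow only, a control bone can be posed and keyframed through Blender's Python API roughly as follows; the armature and bone names and the frame number are hypothetical placeholders, not values taken from the robot model described above.

```python
import bpy
from math import radians

# Minimal sketch: pose one control bone of the robot armature and record the pose
# as an animation keyframe. "Armature" and "arm_ctrl" are placeholder names; the
# real rig would use the 6 control bones described above.
arm_obj = bpy.data.objects["Armature"]
ctrl = arm_obj.pose.bones["arm_ctrl"]

ctrl.rotation_mode = 'XYZ'
ctrl.rotation_euler = (radians(30.0), 0.0, 0.0)             # rotate the control bone 30 deg about X
ctrl.keyframe_insert(data_path="rotation_euler", frame=24)  # store the pose at frame 24
```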
Step 2: extracting and optimizing the multi-task-end tracks. Track extraction and track optimization are performed on the dance movements to obtain the trunk track, the mechanical arm track, the mechanical leg foot end tracks and the mechanical leg foot end plantar forces that satisfy single rigid body dynamics.
After the design of the robot dance motion is completed, a Python-based track extraction and visualization plug-in is written to extract the tracks of the robot's multiple task ends and to provide data support for the subsequent track optimization and tracking control.
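A minimal sketch of what such a Python extraction plug-in can look like is given below; the armature and bone names, the sampling of bone head positions and the CSV output format are illustrative assumptions rather than the actual plug-in.

```python
import bpy
import csv

# Minimal sketch: sample the world-space position of selected control bones at every
# frame of the dance animation and dump them to CSV for the later track optimization.
# Armature/bone names and the file path are placeholders.
ARMATURE = "Armature"
BONES = ["torso_ctrl", "arm_ctrl", "leg_RF_ctrl", "leg_LF_ctrl", "leg_RH_ctrl", "leg_LH_ctrl"]

scene = bpy.context.scene
arm_obj = bpy.data.objects[ARMATURE]

with open("/tmp/robot_tracks.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["frame"] + [f"{b}_{ax}" for b in BONES for ax in "xyz"])
    for frame in range(scene.frame_start, scene.frame_end + 1):
        scene.frame_set(frame)                        # advance the animation
        row = [frame]
        for name in BONES:
            pb = arm_obj.pose.bones[name]
            world = arm_obj.matrix_world @ pb.head    # bone head in world coordinates
            row += [world.x, world.y, world.z]
        writer.writerow(row)
```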
The robot track extracted from the animation software generally conforms only to the robot's kinematic relations, so the animation track is optimized to make it conform to the robot dynamics while keeping the optimized track as close as possible to the animation track. To this end, the following optimization problem is constructed:

min  Σ_{k=0}^{N-1} [ (x_k - x_k,ref)^T·Q_x·(x_k - x_k,ref) + (p_k - p_k,ref)^T·Q_p·(p_k - p_k,ref) + (f_k - f_k,ref)^T·Q_f·(f_k - f_k,ref) ] + (x_N - x_N,ref)^T·Q_N·(x_N - x_N,ref)

In the objective function, the inputs are the reference tracks exported by the animation software, and the outputs are the optimal solution quantities, i.e. the tracks that satisfy the constraint conditions.
Taking the robot trunk track as an example, the robot track changes continuously during the motion. A very small time interval dt is taken; the track is regarded as unchanged within each dt period and changes after each dt interval, so dt is the time interval between one trunk track point and the next. Accordingly, x_k is the optimized trunk track of the k-th dt time step, and x_N is the track of the last time step.
(1) x_ref is the reference trunk track exported from the animation track (the subscript ref is an abbreviation of reference): x_ref = [roll, pitch, yaw, comx, comy, comz, dr, dp, dyaw, dx, dy, dz, g]_{1×13}, where roll, pitch and yaw are the trunk attitude angles (roll, pitch and yaw angle), (comx, comy, comz) is the trunk centroid position vector, (dr, dp, dyaw) is the trunk attitude angular velocity vector, (dx, dy, dz) is the trunk centroid velocity vector, and g is the gravitational acceleration. x is the trunk track that satisfies the constraint conditions after the optimization is solved. x_N,ref is the reference trunk track at the last time point exported from the animation track (because of factors such as motion transitions, the end position of the robot trunk motion track is important, so it is taken out separately as an optimization term), i.e. x_ref at the last time point N; x_N is the optimized trunk track at the last time point that satisfies the constraint conditions.
(2) p_ref contains the reference positions of the mechanical arm end and of each foot end exported from the animation track: p_ref = [x_1, y_1, z_1, x_2, y_2, z_2, x_3, y_3, z_3, x_4, y_4, z_4, x_arm, y_arm, z_arm], where (x_1, y_1, z_1) is the right front leg foot end position vector, (x_2, y_2, z_2) the left front leg foot end position vector, (x_3, y_3, z_3) the right rear leg foot end position vector, (x_4, y_4, z_4) the left rear leg foot end position vector, and (x_arm, y_arm, z_arm) the mechanical arm end position vector. p is the optimized track of the mechanical arm end and of each foot end that satisfies the constraint conditions.
(3) f_ref = [f_RF^T, f_LF^T, f_RH^T, f_LH^T]_{1×12} is the reference ground contact force of the robot's contact foot ends. Taking the right front leg (RF) as an example, f_RF = [f_RF,x, f_RF,y, f_RF,z]^T, where the lower-right subscripts x, y, z denote the directions of the right front leg linear plantar force. The animation software only involves the robot kinematic (position, velocity) relations and cannot export f_ref, so the values of f_ref are specified as follows: if a leg touches the ground (does not swing in the air), its z-direction linear force is equal to the robot's weight divided by the number of touchdown legs, and its x- and y-direction linear forces are 0. f is the optimized contact force that satisfies the constraints.
(4) Q_x, Q_p, Q_f and Q_N are the optimization weight parameters of x, p, f and x_N in the objective function (for example, increasing Q_x while decreasing Q_p, Q_f and Q_N makes the overall optimization process more biased towards solving the trunk track).
The optimization constraint conditions mainly consider the robot dynamics. In order to improve the optimization solving speed, the pose transformations of the four legs and of the mechanical arm are omitted, and only the trunk state quantity x_k and the contact forces f_k between the four feet and the ground are taken as input quantities to construct the simplified single rigid body dynamics model of the leg-arm cooperative robot:

ẋ_k = A·x_k + B·f_k

wherein A(12,13) = 1, ^wI_body is the representation of the trunk inertia in the world frame, r_i is the position of the foot end relative to the centroid, and m_body is the mass of the body. Because the tracks exported by the animation software already satisfy the kinematic relations, no constraints are imposed on the robot kinematics. For the other constraints, f_k,z,stance ≥ 0 is the z-direction force constraint when a foot end is in support, and f_k,z,swing = 0 means the z-direction force is 0 when a foot end is in the air. -μ·f_k,z ≤ f_k,x ≤ μ·f_k,z and -μ·f_k,z ≤ f_k,y ≤ μ·f_k,z are the friction cone constraints of the foot ends, which ensure the stability of the body. p_k,foot,z,stance = 0 is the position constraint when a foot end is in support, and p_arm,z > 0 is the position constraint of the mechanical arm end. f_k is f at the k-th time step, i.e. the optimized solution f of the k-th dt time step; f_k,x, f_k,y and f_k,z are the x-, y- and z-direction components of the linear force f at the k-th time step; μ is the sliding friction coefficient; the subscript swing denotes a leg swinging in the air and stance denotes a leg touching the ground.
The nonlinear optimization problem is solved with a planning solver, finally obtaining the trunk track x, the mechanical arm and foot end tracks p, and the corresponding plantar forces f that satisfy the single rigid body dynamics.
It should be noted that the single rigid body model is built starting from the following two formulas:
(1) Force analysis of the robot trunk (translational dynamics): the support legs provide vertically upward support forces (the z-axis components f_z of the contact forces f), and the trunk is subject to gravity; from Newton's second law F = m·a, the trunk acceleration is obtained: m_body·p̈_com = Σ_i f_i + m_body·g.
(2) Rotational dynamics: according to the angular momentum theorem, the time derivative of the angular momentum of a rigid body about its centroid equals the vector sum of the moments generated about the centroid by all external forces acting on the rigid body, i.e. d/dt(^wI_body·ω) = Σ_i r_i × f_i, where ω is the trunk attitude angular velocity vector (dr, dp, dyaw) and f_i is the three-dimensional linear force of the i-th contact leg.
(3) The two formulas above are combined and written in matrix form, giving the single rigid body model used as the optimization constraint.
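For illustration only, a track optimization of this shape can be posed with a generic nonlinear programming front end such as CasADi's Opti stack; the horizon length, weights, placeholder A and B matrices and solver choice below are assumptions, not the patent's actual solver configuration.

```python
import casadi as ca
import numpy as np

# Sketch of the track optimization over an N-step horizon: decision variables are
# the trunk state x (13), arm/foot positions p (15) and plantar forces f (12) at each
# dt step. The cost keeps them close to the animation references while the constraints
# impose the simplified single-rigid-body dynamics and the friction cones.
# References, weights and the A/B model matrices here are placeholder stand-ins.
N, dt, mu = 50, 0.02, 0.6
x_ref = np.zeros((13, N + 1)); p_ref = np.zeros((15, N + 1)); f_ref = np.zeros((12, N))
Qx, Qp, Qf, QN = 10.0, 1.0, 0.1, 50.0              # scalar weights for brevity
A = ca.DM.zeros(13, 13); A[11, 12] = -1.0          # gravity coupling entry (placeholder)
B = ca.DM.zeros(13, 12)                            # placeholder; built from r_i, inertia, mass in the real model

opti = ca.Opti()
x = opti.variable(13, N + 1)
p = opti.variable(15, N + 1)
f = opti.variable(12, N)

cost = QN * ca.sumsqr(x[:, N] - x_ref[:, N])       # terminal trunk term
for k in range(N):
    cost += Qx * ca.sumsqr(x[:, k] - x_ref[:, k])
    cost += Qp * ca.sumsqr(p[:, k] - p_ref[:, k])
    cost += Qf * ca.sumsqr(f[:, k] - f_ref[:, k])
    # simplified single-rigid-body dynamics, forward-Euler discretised
    opti.subject_to(x[:, k + 1] == x[:, k] + dt * (ca.mtimes(A, x[:, k]) + ca.mtimes(B, f[:, k])))
    for leg in range(4):
        fz = f[3 * leg + 2, k]
        opti.subject_to(fz >= 0)                                             # unilateral support force
        opti.subject_to(opti.bounded(-mu * fz, f[3 * leg, k], mu * fz))      # friction cone, x direction
        opti.subject_to(opti.bounded(-mu * fz, f[3 * leg + 1, k], mu * fz))  # friction cone, y direction

opti.minimize(cost)
opti.solver("ipopt")
sol = opti.solve()
x_opt, p_opt, f_opt = sol.value(x), sol.value(p), sol.value(f)
```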
Step 3: modeling the robot kinematic tree and kinematics.
To ensure the accuracy of the model, the URDF file of the leg-arm cooperative robot is exported through the three-dimensional CAD software SolidWorks, and the robot kinematic tree is established from the URDF file, as shown in Fig. 4. Each sequence number in the kinematic tree represents one rigid body of the robot, and a connecting line between rigid bodies represents the joint between them.
In order to realize control of the floating-base quadruped robot, a virtual 6-degree-of-freedom joint of the trunk is introduced; meanwhile, to meet programming requirements, the sequence number of the trunk rigid body of the leg-arm cooperative robot is marked as 5 in the kinematic tree. The whole kinematic tree has 18 rigid bodies and 18 joints: number 5 represents the trunk of the robot, the right front leg comprises the three rigid bodies numbered 6-7-8 (6 the hip, 7 the thigh, 8 the calf), the left front leg is 9-10-11, the right rear leg is 12-13-14, the left rear leg is 15-16-17, and 18-23 are the rigid bodies of the mechanical arm. The relative positional relationship, mass and inertia tensor of each rigid body, as well as the joint type between rigid bodies and the position of each rotation axis, are obtained directly from the robot configuration.
To facilitate the subsequent derivation of the kinematic and dynamic parameters, rigid body relation symbols are introduced. λ(i) denotes the parent rigid body of rigid body i. μ(i) denotes the child rigid bodies of rigid body i. ν(i) denotes the subtree starting from rigid body i and ending at the branch tips, including rigid body i. κ(i) denotes the set of rigid bodies on the path from the root to rigid body i, not including rigid body i. According to Fig. 4, simple examples of the rigid body relation symbols are as follows:
λ(6)=λ(9)=λ(12)=λ(15)=λ(18)=5
λ(19)=18,λ(23)=22
μ(5)={6,9,12,15,18}
μ(18)=19,μ(20)=21
ν(5)={5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23}
ν(18)={18,19,20,21,22,23}
κ(11)={5,9,10}
κ(22)={5,18,19,20,21}
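These parent-child relations and set-valued helpers map directly onto a small data structure; the sketch below (rigid body numbering as in Fig. 4, helper names are ours) reproduces the examples above.

```python
# Sketch of the kinematic-tree bookkeeping for the 18-rigid-body robot of Fig. 4.
# parent[i] is lambda(i); the floating-base trunk (body 5) is the root. Body numbers
# follow the text: 5 trunk, 6-8 right front leg, 9-11 left front leg, 12-14 right rear
# leg, 15-17 left rear leg, 18-23 mechanical arm.
parent = {5: None,
          6: 5, 7: 6, 8: 7,          # right front leg: hip, thigh, calf
          9: 5, 10: 9, 11: 10,       # left front leg
          12: 5, 13: 12, 14: 13,     # right rear leg
          15: 5, 16: 15, 17: 16,     # left rear leg
          18: 5, 19: 18, 20: 19, 21: 20, 22: 21, 23: 22}  # mechanical arm

def mu(i):
    """mu(i): the child rigid bodies of rigid body i."""
    return sorted(j for j, p in parent.items() if p == i)

def nu(i):
    """nu(i): the subtree rooted at i, including i."""
    out = [i]
    for c in mu(i):
        out += nu(c)
    return sorted(out)

def kappa(i):
    """kappa(i): rigid bodies on the path from the root to i, excluding i."""
    path, p = [], parent[i]
    while p is not None:
        path.append(p)
        p = parent[p]
    return sorted(path)

assert mu(5) == [6, 9, 12, 15, 18]
assert nu(18) == [18, 19, 20, 21, 22, 23]
assert kappa(11) == [5, 9, 10]
assert kappa(22) == [5, 18, 19, 20, 21]
```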
The kinematic relations of the leg-arm cooperative robot are derived based on the kinematic tree, and a coordinate system is established for each rigid body of the robot. As shown in Figs. 5 and 6, w denotes the world coordinate system and B denotes the body-fixed coordinate system at the trunk centroid; every other rigid body carries two coordinate systems, T_{λ(i)} and T_{λ(i),i} (hereafter, unless otherwise specified, the rigid body coordinate system referred to is T_{λ(i)}). T_{λ(i)} and T_{λ(i),i} are determined by the configuration of the leg-arm cooperative robot; the joint angle between the parent rigid body λ(i) and the child rigid body i is defined as q_i, and when q_i = 0, T_{λ(i),i} and T_i coincide. T_{λ(i)} and T_{λ(i),i} are both coordinate systems fixed on rigid body λ(i), while T_i is the coordinate system established on rigid body i; λ(i) denotes the parent rigid body of rigid body i, T_{λ(i)} denotes the first coordinate system on rigid body λ(i), and T_{λ(i),i} denotes the second coordinate system on rigid body λ(i). Two coordinate systems are established on each rigid body so that the translational and rotational dynamics can be computed with a single set of formulas, which reduces the number of formulas and improves computational efficiency. The condition for integrating translational and rotational dynamics is that no relative displacement occurs between two successive rigid body coordinate systems; if only a single coordinate system were established on each rigid body, this condition could not be met, so the two coordinate systems T_{λ(i)} and T_{λ(i),i} are established on each rigid body. T_{λ(i)} and T_{λ(i),i} together represent the geometry of the rigid body and undergo no relative displacement or rotation, while T_{λ(i),i} and the T_i of the next rigid body represent the joint connecting the two rigid bodies λ(i) and i, which rotates relatively without relative displacement. In the subscript, the first parameter denotes the rigid body λ(i) on which the coordinate system is established, and the second parameter i denotes the child rigid body i connected to it.
To address the low computational efficiency of traditional kinematics and dynamics algorithms caused by the complex structure of the leg-arm cooperative robot, the amount of formula derivation is simplified based on spatial vectors, improving modeling efficiency. A 6-dimensional vector [ω_x, ω_y, ω_z, v_x, v_y, v_z]^T represents the spatial velocity of a rigid body, where ω is the angular velocity and v the linear velocity. A 6-dimensional vector [n_ox, n_oy, n_oz, f_x, f_y, f_z]^T represents the spatial force on a rigid body, where n is the moment and f the force.
The transformation of spatial velocity from rigid body coordinate system A to rigid body coordinate system B is:

^BX_A = [ E        0_{3×3}
          -E·(r×)  E       ]

and the transformation from B back to A is:

^AX_B = (^BX_A)^{-1} = [ E^T       0_{3×3}
                         (r×)·E^T  E^T     ]

wherein E = ^BR_A is the three-dimensional rotation matrix from the A coordinate system to the B coordinate system, r is the three-dimensional vector pointing from the origin of the A coordinate system to the origin of the B coordinate system, and (r×) is the 3×3 cross-product operator of that three-dimensional vector.
The transformation of spatial force from rigid body coordinate system A to rigid body coordinate system B is:

^BX*_A = [ E        -E·(r×)
           0_{3×3}  E       ]

and from B back to A:

^AX*_B = [ E^T      (r×)·E^T
           0_{3×3}  E^T      ]

The spatial velocity transformation matrix ^BX_A and the spatial force transformation matrix ^BX*_A satisfy the following relation:

^BX*_A = (^BX_A)^{-T}
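For illustration, the transforms can be assembled directly from E and r; the sketch below, written under the conventions just stated with helper names of our own, builds the motion and force transforms and numerically checks the X* = X^(-T) relation.

```python
import numpy as np

def skew(r):
    """Cross-product (skew-symmetric) operator (r x) of a 3-vector."""
    return np.array([[0.0, -r[2], r[1]],
                     [r[2], 0.0, -r[0]],
                     [-r[1], r[0], 0.0]])

def motion_transform(E, r):
    """Spatial-velocity transform B_X_A for rotation E = B_R_A and offset r (A origin -> B origin, in A coords)."""
    X = np.zeros((6, 6))
    X[:3, :3] = E
    X[3:, :3] = -E @ skew(r)
    X[3:, 3:] = E
    return X

def force_transform(E, r):
    """Spatial-force transform B_X*_A, equal to the inverse transpose of the motion transform."""
    Xs = np.zeros((6, 6))
    Xs[:3, :3] = E
    Xs[:3, 3:] = -E @ skew(r)
    Xs[3:, 3:] = E
    return Xs

# quick numerical check of B_X*_A == (B_X_A)^-T for a random rotation and offset
rng = np.random.default_rng(0)
axis = rng.normal(size=3); axis /= np.linalg.norm(axis)
theta = 0.7
K = skew(axis)
E = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)   # Rodrigues' formula
r = rng.normal(size=3)
assert np.allclose(force_transform(E, r), np.linalg.inv(motion_transform(E, r)).T)
```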
The kinematics of the leg-arm cooperative robot are derived through the constructed kinematic tree and the spatial vectors. The trunk attitude rotation ^5E_w is computed from the robot trunk IMU, the position r_{w5} of the robot centroid relative to the world coordinate system is obtained by state estimation, and from these the spatial velocity transformation matrix ^5X_w from the world coordinate system w to the trunk centroid coordinate system is obtained (5 is the number of the trunk in the kinematic tree, hereinafter abbreviated as trunk 5). After the relation of the trunk-5 rigid body coordinate system relative to the world frame is obtained, the kinematic tree of the leg-arm cooperative robot is traversed through the rigid body relations, and the kinematic relations are derived with the following formulas:

^iX_{λ(i)} = X_J(i)·X_T(i)
^iv_i = ^iX_{λ(i)}·^{λ(i)}v_{λ(i)} + ^iS_i·q̇_i

wherein ^iX_{λ(i)} is the transformation of spatial velocity from the coordinate system of the parent rigid body λ(i) of rigid body i to the coordinate system of rigid body i. X_J(i) represents the transformation at joint i between the coordinate systems T_{λ(i),i} and T_i (the first coordinate system on rigid body i); since all joints of the leg-arm cooperative robot are revolute joints, X_J(i) can be expressed, according to the orientation of the joint rotation axis and the rotation angle value q_i fed back by each joint, as a pure rotation about that axis by q_i. X_T(i) represents the fixed positional relationship between the two coordinate systems T_i and T_{i,μ(i)} on rigid body i; it contains no angle transformation and can be determined directly from the actual configuration of the leg-arm cooperative robot. ^iv_i is the spatial velocity of rigid body i expressed in the coordinate system of rigid body i. ^iS_i is the joint subspace mapping matrix of rigid body i expressed in the coordinate system of rigid body i: if the joint corresponding to rigid body i has a single degree of freedom and rotates about the x axis, ^iS_i = [1,0,0,0,0,0]^T, while the trunk has 6 spatial degrees of freedom, so in the trunk-5 coordinate system ^5S_5 is a six-dimensional identity matrix; for all joints of the robot, ^iS_i can be determined directly from the joint type. q̇_i, the joint angle change rate of joint i, is obtained directly from the joint sensor.
For the 6-degree-of-freedom branch of the robot kinematic tree, the transformation matrix of spatial velocity from the world frame to the robot 23 coordinate system can be expressed as:

^{23}X_w = ^{23}X_{22}·^{22}X_{21}·^{21}X_{20}·^{20}X_{19}·^{19}X_{18}·^{18}X_5·^5X_w

and, accumulating the joint contributions along the branch, the spatial velocity of rigid body 23 is

^{23}v_{23} = ^{23}X_5·^5v_5 + Σ_{i=18}^{23} ^{23}X_i·^iS_i·q̇_i

wherein S_i is the joint subspace mapping matrix; because q̇_i, the joint angle change rate, is 1-dimensional while a spatial velocity such as ^{18}v_{18} is 6-dimensional, S_i is required to map it into 6 dimensions.
By block-wise inversion of ^{23}X_w, ^wX_{23} is obtained and the spatial velocity ^wv_{23} of the mechanical arm end is computed; combining this with the relation of the mechanical arm end relative to the 23 coordinate system gives the position ^wp_arm of the mechanical arm end in the world coordinate system.
For the four 3-degree-of-freedom leg branches of the kinematic tree, taking the right front leg as an example, the spatial velocity transformation matrix can be expressed as:
^8X_w = ^8X_7·^7X_6·^6X_5·^5X_w
^wv_8 = ^wX_8·^8v_8
^wv_RF = ^wv_8
In the same way as for the mechanical arm, the positions ^wp_RF, ^wp_LF, ^wp_RH, ^wp_LH of the end rigid bodies of the other legs relative to the world coordinate system and their spatial velocities ^wv_RF, ^wv_LF, ^wv_RH, ^wv_LH are obtained.
Thus, the mapping from the joint space of the leg-arm cooperative robot to the Cartesian space is completed, and the forward kinematics calculation is completed.
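A compact sketch of this forward traversal is given below; it reuses the parent table and the skew/motion_transform helpers from the sketches above, and the per-joint fixed transforms X_T, joint axes and joint states are assumed inputs (in practice taken from the URDF and the sensors).

```python
import numpy as np

# Sketch of the forward-kinematics pass over the kinematic tree: propagate the
# spatial-velocity transform i_X_w and the spatial velocity i_v_i from the trunk
# (body 5) out to every rigid body.
def joint_transform(axis_i, q_i):
    """X_J(i): pure rotation by q_i about the joint axis (revolute joints only)."""
    K = skew(axis_i)
    E = np.eye(3) + np.sin(q_i) * K + (1 - np.cos(q_i)) * (K @ K)
    return motion_transform(E, np.zeros(3))

def forward_kinematics(X_T, axis, q, qd, X_5_w, v_5):
    X_w = {5: X_5_w}          # i_X_w for every body, seeded with the trunk from IMU + state estimation
    v = {5: v_5}              # i_v_i for every body
    for i in sorted(k for k in parent if k != 5):          # parents are numbered before children
        S_i = np.concatenate([axis[i], np.zeros(3)])       # revolute joint subspace (angular part only)
        X_pi = joint_transform(axis[i], q[i]) @ X_T[i]     # i_X_lambda(i) = X_J(i) * X_T(i)
        X_w[i] = X_pi @ X_w[parent[i]]
        v[i] = X_pi @ v[parent[i]] + S_i * qd[i]           # velocity propagation along the tree
    return X_w, v
```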
The mapping relation from the Cartesian space to the joint space of the leg-arm cooperative robot is then derived through the virtual work principle, and the task Jacobian matrices corresponding to the multiple task ends are solved to complete the inverse kinematics calculation of the robot.
When the robot is in static equilibrium, for the whole robot, the work done by the set of spatial force vectors F acting on the task ends (Cartesian space) over the set of relative displacements δx equals the work done at the joints (joint space) by the set of joint torques τ over the set of joint angle changes δq; this is the virtual work principle:
F^T·δx = τ^T·δq
F^T = [F_RF^T, F_LF^T, F_RH^T, F_LH^T, F_Arm^T]_{1×30}
δx = [δx_RF, δθ_RF, ..., δx_LH, δθ_LH, δx_Arm, δθ_Arm]^T_{30×1}
τ^T = [τ_0, ..., τ_6, ..., τ_23]_{1×24}
δq = [δq_0, ..., δq_6, ..., δq_23]^T_{24×1}
wherein F_RF, δx_RF and δθ_RF are the spatial force vector, displacement and attitude change of the right front leg foot end; F_LF, δx_LF and δθ_LF those of the left front leg foot end; F_LH, δx_LH and δθ_LH those of the left rear leg foot end; F_RH, δx_RH and δθ_RH those of the right rear leg foot end; F_Arm, δx_Arm and δθ_Arm the spatial force vector, displacement and attitude change of the mechanical arm; and τ_i and δq_i are the torque and the angle change of the joint corresponding to rigid body i of the robot.
Further, δx=jδq may be the following relationship in the whole robot:
τ=J T F
for a single rigid body and corresponding joint τ ii S i T F i ,F i For the spatial force received by the i rigid body, the spatial force vector received by the corresponding rigid body is mapped to the joint spatial torque through the transposition of the joint subspace mapping matrix.
Combining the above formulas and traversing the kinematic tree, the Jacobian matrix relating the task space and the generalized joint space is derived.
As shown in the kinematic tree of Fig. 4, taking the right front leg term as an example, the upper left corner RF denotes the right front leg, the lower right corner 5 denotes the trunk (its kinematic tree rigid body number is 5), the upper right corner T denotes the transpose of the matrix, X denotes a spatial velocity transformation matrix (^iX_{λ(i)} being the transformation of spatial velocity from the parent rigid body λ(i) of rigid body i to the coordinate system of rigid body i), and the upper right corner * denotes the corresponding spatial force transformation matrix.
Motion transformation matrices such as ^{RF}X_5 are obtained through forward kinematics, and the corresponding joint subspace mapping matrices ^iS_i are obtained from the joint types. This completes the solution of the Jacobian matrix J relating the task space and the generalized joint space; the Jacobian matrix of each task of the multiple task ends can then be obtained by partitioning J, completing the inverse kinematics of the robot. Taking a 3-degree-of-freedom mechanical arm as an example, the task space is the arm end position [x, y, z] and the generalized joint space is the joint angle change quantities [q1, q2, q3] of the arm: when the arm end reaches a position, the joint angles correspond to that position, so the task space corresponds to the arm end position and the joint space to the arm joint angles. This embodiment speaks of the generalized joint space because it contains not only the mechanical arm angles but also the angles of each leg and the virtual 6 degrees of freedom of the trunk.
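As a toy illustration of the mapping τ = J^T·F only, with a planar two-link chain whose link lengths, angles and force are arbitrary placeholders rather than the robot's real geometry:

```python
import numpy as np

# Toy example: map a desired Cartesian contact force at a task end to joint torques
# through the transpose of the task Jacobian (tau = J^T F), using a planar 2-link leg.
l1, l2 = 0.3, 0.3
q1, q2 = 0.4, -0.8                      # joint angles [rad]

# position Jacobian of the foot point for the planar 2-link chain
J = np.array([
    [-l1 * np.sin(q1) - l2 * np.sin(q1 + q2), -l2 * np.sin(q1 + q2)],
    [ l1 * np.cos(q1) + l2 * np.cos(q1 + q2),  l2 * np.cos(q1 + q2)],
])

F_foot = np.array([0.0, -60.0])         # desired contact force at the foot [N]
tau = J.T @ F_foot                      # joint torques statically consistent with that force
print(tau)
```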
Step 4: modeling the whole-body dynamics of the robot.
Leg-arm cooperative robot dancing is realized through an inverse dynamics solving process that obtains the corresponding joint torques τ_{24×1}. Although the generalized joint acceleration q̈ could be obtained by twice differentiating the joint angle changes when the animation software exports the track, the resulting error is too large. Therefore, the whole-body dynamics modeling in this step regards q̈ as a known quantity, and the acquisition of q̈ from q and q̇ is described in step 5 below.
The dynamics equation of the leg-arm cooperative robot has the following form:

M_{24×24}·q̈ + C_{24×1} + G_{24×1} = τ_{24×1} + J_c,RF^T·f_RF + J_c,LF^T·f_LF + J_c,RH^T·f_RH + J_c,LH^T·f_LH

wherein M_{24×24} is the robot mass distribution matrix, C_{24×1} is the generalized Coriolis force matrix and G_{24×1} is the generalized gravity matrix; all three matrices are related to the robot configuration q and the generalized joint velocity q̇. q̈ is the generalized joint acceleration (a spatial vector over the generalized joints of the robot), τ_{24×1} is the joint torque, and J_c,RF is the contact Jacobian of the right front leg foot end. The foot end contacts involve only the three-dimensional linear forces (f_RF)_{3×1}, (f_LF)_{3×1}, (f_RH)_{3×1} and (f_LH)_{3×1}, and these plantar forces f are obtained through the track optimization of step 2; the corresponding contact Jacobian matrices (J_c,RF)_{3×24}, (J_c,LF)_{3×24}, (J_c,RH)_{3×24} and (J_c,LH)_{3×24} are obtained by partitioning the task-space-to-generalized-joint-space Jacobian matrix J obtained in step 3. Taken together, apart from the joint torque τ_{24×1}, the currently unknown quantities in the dynamics equation of the leg-arm cooperative robot are M_{24×24}, C_{24×1} and G_{24×1}.
First, the robot mass distribution matrix M_{24×24} is obtained by the composite rigid body method.
The total kinetic energy of the robot kinematic tree equals the sum of the kinetic energies of the rigid bodies in the tree, so that

E_kin = (1/2)·Σ_k v_k^T·I_k·v_k

wherein v_k is the spatial velocity of rigid body k, κ(k) is the rigid body relation symbol mentioned in step 3 (the set of rigid bodies on the path from the root to rigid body k, excluding rigid body k), and I_k is the spatial inertia matrix

I_k = [ Ī_k + m_k·(r×)·(r×)^T   m_k·(r×)
        m_k·(r×)^T              m_k·1_{3×3} ]

where m_k is the mass of rigid body k, (r×) is the cross-product operator of the centroid position expressed in the rigid body coordinate system, Ī_k is the rotational inertia tensor of the rigid body, and 1_{3×3} is the third-order identity matrix.
Substituting the velocity of rigid body k, v_k = Σ_{i∈κ(k)∪{k}} ^kX_i·S_i·q̇_i, into the kinetic energy equation gives:

E_kin = (1/2)·Σ_k Σ_{i,j∈κ(k)∪{k}} q̇_i·S_i^T·^kX_i^T·I_k·^kX_j·S_j·q̇_j

This equation traverses the entire kinematic tree; noting that rigid bodies i and j jointly support rigid body k only when k ∈ ν(i)∩ν(j), it can be re-expressed as:

E_kin = (1/2)·Σ_i Σ_j q̇_i·S_i^T·( Σ_{k∈ν(i)∩ν(j)} ^kX_i^T·I_k·^kX_j )·S_j·q̇_j

The total kinetic energy of the robot kinematic tree can also be expressed through the mass distribution matrix M_{24×24} and the joint space state quantities:

E_kin = (1/2)·q̇^T·M_{24×24}·q̇

Comparing the two expressions of the robot kinetic energy gives the expression for M_ij:

M_ij = S_i^T·( Σ_{k∈ν(i)} ^kX_i^T·I_k·^kX_j )·S_j = ( ^jX_i^*·I_i^c·S_i )^T·S_j,  for j ∈ κ(i)∪{i},  with M_ji = M_ij

wherein I_i^c denotes the spatial inertia of the composite rigid body formed by rigid body i together with all the rigid bodies it supports (ν(i)), and ^jX_i^* denotes the spatial force transformation matrix. With this, the calculation of the robot mass distribution matrix M_{24×24} is completed.
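A sketch of how this composite rigid body construction of M can be implemented is shown below; the input dictionaries (spatial inertias, joint subspaces, parent transforms, coordinate index map) are placeholders assumed to come from the URDF and the current configuration.

```python
import numpy as np

# Sketch of the composite-rigid-body construction of the mass distribution matrix M.
# I_body[i]: 6x6 spatial inertia of rigid body i in its own frame; S[i]: joint subspace
# matrix (6x6 identity for the floating trunk 5, a 6x1 column for each revolute joint);
# i_X_p[i]: motion transform from the parent frame to frame i; dof[i]: generalized
# coordinate indices of joint i (6 for the trunk, 1 for every other joint).
def mass_matrix(bodies, parent, I_body, S, i_X_p, dof, ndof=24):
    Ic = {i: I_body[i].copy() for i in bodies}        # composite inertia of the subtree nu(i)
    M = np.zeros((ndof, ndof))
    for i in sorted(bodies, reverse=True):            # children first (numbered after parents)
        p = parent[i]
        if p is not None:
            # fold body i into its parent: Ic[p] += (pX*_i) Ic[i] (iX_p), with pX*_i = (iX_p)^T
            Ic[p] += i_X_p[i].T @ Ic[i] @ i_X_p[i]
        F = Ic[i] @ S[i]                              # spatial force of a unit motion of joint i
        M[np.ix_(dof[i], dof[i])] = S[i].T @ F        # diagonal block M_ii
        j = i
        while parent[j] is not None:                  # walk up the supporting chain kappa(i)
            F = i_X_p[j].T @ F                        # express F in the parent frame
            j = parent[j]
            M[np.ix_(dof[i], dof[j])] = F.T @ S[j]    # off-diagonal block M_ij, j in kappa(i)
            M[np.ix_(dof[j], dof[i])] = (F.T @ S[j]).T
    return M
```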
Next, the generalized Coriolis force matrix C_{24×1} and the generalized gravity matrix G_{24×1} are obtained. Observing the robot dynamics equation, when the generalized acceleration q̈ is 0 and the foot end spatial contact forces F_foot are 0, the dynamics equation becomes C_{24×1} + G_{24×1} = τ_{24×1}; therefore C_{24×1} + G_{24×1} can be obtained by computing τ_{24×1} with the recursive Newton-Euler method. The recursive Newton-Euler method is as follows:

^ia_i = ^iX_{λ(i)}·^{λ(i)}a_{λ(i)} + ^iS_i·q̈_i + ^iv_i × ^iS_i·q̇_i
^iF_i = I_i·^ia_i + ^iv_i ×* I_i·^iv_i - ^iX_w^*·^wF_i^ext + Σ_{j∈μ(i)} ^iX_j^*·^jF_j
τ_i = ^iS_i^T·^iF_i

wherein ^ia_i is the spatial acceleration of rigid body i expressed in the coordinate system T_i; ^{λ(i)}a_{λ(i)} is the spatial acceleration of rigid body λ(i) expressed in the coordinate system T_{λ(i)}; ^iv_i is the spatial velocity of rigid body i expressed in the coordinate system T_i; I_i is the spatial inertia of rigid body i; × and ×* are the cross-product operators of spatial vectors acting on motion and force vectors respectively; a_i is the spatial acceleration of rigid body i and v_i its spatial velocity; ^iX_w^* is the spatial force transformation matrix from the world frame w to the coordinate system of rigid body i; ^iX_j^* is the spatial force transformation matrix from the rigid body coordinate system T_j to the rigid body coordinate system T_i; ^wF_i^ext is the external force acting on rigid body i expressed in the world frame; and ^jF_j is the spatial force borne by rigid body j expressed in the coordinate system T_j.
Setting q̈ = 0 and F_foot = 0 in the above formulas, the computed τ_i are stacked into a column vector, which is C_{24×1} + G_{24×1}.
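A sketch of using this recursion to obtain C + G is given below; it reuses the placeholder inputs and the skew helper from the sketches above, and injects gravity as a fictitious base acceleration, which is one common implementation choice rather than the only possible one.

```python
import numpy as np

# Sketch of the recursive Newton-Euler pass used to obtain C + G: run the forward
# velocity/acceleration recursion and the backward force recursion with zero joint
# acceleration and zero foot-end contact force, then project onto the joint subspaces.
def crm(v):
    """Spatial cross product operator (v x) acting on motion vectors."""
    w, vo = v[:3], v[3:]
    return np.block([[skew(w), np.zeros((3, 3))], [skew(vo), skew(w)]])

def crf(v):
    """Spatial cross product operator (v x*) acting on force vectors: -crm(v)^T."""
    return -crm(v).T

def coriolis_and_gravity(bodies, parent, S, i_X_p, I_body, dof, qd, a_gravity):
    v, a, f = {}, {}, {}
    for i in sorted(bodies):                                  # forward pass, root first
        qd_i = qd[dof[i]]
        if parent[i] is None:                                 # floating trunk: "joint velocity" is the base twist
            v[i] = S[i] @ qd_i
            a[i] = -a_gravity                                 # gravity as a fictitious base acceleration (trunk frame)
        else:
            v[i] = i_X_p[i] @ v[parent[i]] + S[i] @ qd_i
            a[i] = i_X_p[i] @ a[parent[i]] + crm(v[i]) @ (S[i] @ qd_i)   # qdd = 0 everywhere
        f[i] = I_body[i] @ a[i] + crf(v[i]) @ (I_body[i] @ v[i])         # no external foot forces
    CG = np.zeros(sum(len(dof[i]) for i in bodies))
    for i in sorted(bodies, reverse=True):                    # backward pass, leaves first
        CG[dof[i]] = S[i].T @ f[i]
        if parent[i] is not None:
            f[parent[i]] += i_X_p[i].T @ f[i]                 # pX*_i = (iX_p)^T
    return CG
```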
Step 5: task hierarchical control and relaxation optimization. After the tasks of the trunk, the mechanical arm and the mechanical legs are classified by priority, the positions, velocities and accelerations of the tasks at all levels are calculated through null-space mapping in combination with the trunk track, the mechanical arm track, the mechanical leg foot end tracks and the mechanical leg foot end plantar forces; based on the accelerations of the tasks at all levels and the plantar forces at the mechanical leg foot ends, the joint torques are obtained through the whole-body dynamics equation of the robot after relaxation optimization of the trunk dynamics equation of the leg-arm cooperative robot; and, taking the joint torques as a feedforward quantity and combining the positions and velocities of the tasks at all levels, proportional-derivative control is performed on the robot joints to obtain the joint torque control law, completing reproduction of the dance movements.
The tasks of the trunk, the legs and the arm of the robot are classified by priority, the whole-body control of the leg-arm cooperative robot is completed through the null-space mapping method, and the dance movements are reproduced on the premise of guaranteeing the stability of the robot.
The tasks are classified, in order of priority, as: support legs, trunk pose, mechanical arm pose, and swing legs (the foot ends of the support legs always touch the ground, the foot ends of the swing legs swing in the air, and the ground contact information of each leg is obtained when the animation model is exported). The support leg task provides the plantar forces required for executing the action and is the foundation for realizing the robot dance. The trunk pose task is a precondition for keeping the robot body stable, so the trunk is placed at the second priority and the mechanical arm at the third. Accurate control of the mechanical arm pose lets the leg-arm cooperative robot show a better dance effect, for example the "chicken head mode". The swing legs have little influence on the stability and dance effect of the robot, so the swing leg foot end tasks are ranked at the lowest priority.
The trunk track x corresponds to the trunk pose task, the plantar forces f correspond to the support leg task, the foot end tracks correspond to the swing leg task, and the mechanical arm track corresponds to the mechanical arm pose task.
For two tasks of the leg-arm cooperative robot, the Jacobian matrices J_1 and J_2 of tasks 1 and 2 relate the respective task quantities to the generalized joint state quantity (the generalized joint space, i.e. the angle values of the mechanical arm joints, the leg joints and the virtual 6-degree-of-freedom trunk joint), namely δx_1 = J_1·δq and δx_2 = J_2·δq. In general J_1 is not square, so its pseudo-inverse J_1^+ is taken, and δq = J_1^+·δx_1 can be extended to δq = J_1^+·δx_1 + N_1·z, where N_1 = I - J_1^+·J_1 is the null-space projection matrix; its effect is to map an arbitrary vector z into the null space of J_1 so that it has no effect on task 1. Substituting this δq into δx_1 = J_1·δq gives J_1·(J_1^+·δx_1 + N_1·z) = δx_1, so task 1 remains satisfied, and tasks 1 and 2 can therefore be realized simultaneously by a single generalized space vector. Since N_1 is an idempotent Hermitian matrix (N_1·N_1 = N_1 and N_1^T = N_1), denoting the generalized space state quantity that completes task 1 as δq_1 = J_1^+·δx_1 and the generalized space state quantity that realizes tasks 1 and 2 simultaneously as δq_2, and letting z solve (J_2·N_1)·z = δx_2 - J_2·δq_1 in the least-squares sense, there is δq_2 = δq_1 + (J_2·N_1)^+·(δx_2 - J_2·δq_1). By analogy, mapping the third task into the null space of the second task gives δq_3 = δq_2 + (J_3·N_2)^+·(δx_3 - J_3·δq_2), wherein N_2 = N_1·(I - (J_2·N_1)^+·(J_2·N_1)). When the number of tasks is extended to n, the task-level null-space mapping is δq_n = δq_{n-1} + (J_n·N_{n-1})^+·(δx_n - J_n·δq_{n-1}), wherein N_{n-1} = N_{n-2}·(I - (J_{n-1}·N_{n-2})^+·(J_{n-1}·N_{n-2})) and N_0 = I.
Since differentiating position gives velocity and differentiating attitude angle gives angular velocity, the position-level null-space mapping iterative formula follows from the above relation; differentiating it gives the velocity-level formula, and differentiating again gives the acceleration-level null-space mapping iterative formula. Through the null-space mapping, multi-task priority control of the leg-arm cooperative robot can be realized. The position-, velocity- and acceleration-level iterative formulas of the n-th task are summarized as follows:
Position-level iteration formula:
Δq_n = Δq_{n-1} + (J_n·N_{n-1})^+·(Δx_n - J_n·Δq_{n-1})
Differentiating gives the velocity-level iteration formula:
q̇_n = q̇_{n-1} + (J_n·N_{n-1})^+·(ẋ_n - J_n·q̇_{n-1})
Differentiating again gives the acceleration-level iteration formula:
q̈_n = q̈_{n-1} + (J_n·N_{n-1})^+·(ẍ_n - J̇_n·q̇ - J_n·q̈_{n-1})
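A numerical sketch of this prioritized iteration at the velocity level is given below; the plain pseudo-inverse is used here, and a damped or dynamically consistent inverse could be substituted.

```python
import numpy as np

# Sketch of the task-priority null-space iteration at the velocity level: each task n
# is solved inside the null space of all higher-priority tasks. `tasks` is a list of
# (J_n, xdot_n) pairs ordered support legs -> trunk -> arm -> swing legs; ndof = 24.
def prioritized_qdot(tasks, ndof=24, eps=1e-6):
    qdot = np.zeros(ndof)
    N = np.eye(ndof)                                  # N_0 = identity
    for J, xdot in tasks:
        J_pre = J @ N                                 # task Jacobian projected into the remaining null space
        J_pre_pinv = np.linalg.pinv(J_pre, rcond=eps)
        qdot = qdot + J_pre_pinv @ (xdot - J @ qdot)
        N = N @ (np.eye(ndof) - J_pre_pinv @ J_pre)   # shrink the null space for the next task
    return qdot
```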
After the iteration is completed, [(J_c,RF^T)_{24×3}·(f_RF)_{3×1} + ... + (J_c,LH^T)_{24×3}·(f_LH)_{3×1}] is denoted as J_out^T·f_out, and the trunk dynamics equation of the leg-arm cooperative robot is:

S_base·(M_{24×24}·q̈ + C_{24×1} + G_{24×1}) = S_base·J_out^T·f_out

wherein J_c,RF is the contact Jacobian matrix of the right front foot (c denotes contact, because only the leg foot ends are in contact with the ground, and RF denotes the right front leg); it is obtained by partitioning the task-space-to-generalized-joint-space Jacobian matrix J finally obtained in step 3. J_out is the total contact Jacobian combined from the contact Jacobians of the support legs, f_out is the total contact force vector combined from the contact forces, and S_base is the 6-dimensional trunk selection matrix; without S_base, the equation is the whole-body dynamics equation.
The two sides of this equation are not exactly equal, because the q̈ on the left is obtained through the null-space iteration while the plantar forces on the right are obtained through the optimization of the simplified single rigid body model of the robot (step 2). Therefore, relaxation optimization is applied to the trunk dynamics equation of the robot to obtain suitable corrections δq̈ and δf so that the generalized acceleration and the contact forces satisfy the trunk dynamics equation.
The relaxation optimization process is as follows:

min_{δq̈, δf}  ||δq̈||² + ||δf||²
s.t.  S_base·(M_{24×24}·(q̈ + δq̈) + C_{24×1} + G_{24×1}) = S_base·J_out^T·(f_out + δf)
      -μ·f_k,z ≤ f_k,x ≤ μ·f_k,z
      -μ·f_k,z ≤ f_k,y ≤ μ·f_k,z

wherein δf and δq̈ are the relaxation optimization solution quantities, applied respectively to the plantar force f_out obtained from the track optimization and to the joint space acceleration q̈ obtained from the null-space iteration so as to satisfy the trunk dynamics equation, and -μ·f_k,z ≤ f_k,x ≤ μ·f_k,z and -μ·f_k,z ≤ f_k,y ≤ μ·f_k,z are the plantar force constraint quantities from the track optimization of step 2.
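For illustration, the relaxation step can be written as a small quadratic program; the sketch below uses cvxpy with equal weights on the two relaxation variables, and the weighting and solver choice are assumptions.

```python
import cvxpy as cp
import numpy as np

# Sketch of the relaxation QP: find small corrections d_qdd (to the iterated joint
# acceleration) and d_f (to the optimized plantar forces) such that the trunk rows of
# the dynamics are satisfied and the corrected forces stay inside the friction cones.
# M, C, G, qdd, J_out, f_out, S_base and mu are assumed given.
def relaxation_qp(M, C, G, qdd, J_out, f_out, S_base, mu, n_contact):
    d_qdd = cp.Variable(M.shape[0])
    d_f = cp.Variable(3 * n_contact)
    cost = cp.sum_squares(d_qdd) + cp.sum_squares(d_f)

    f_new = f_out + d_f
    constraints = [
        S_base @ (M @ (qdd + d_qdd) + C + G) == S_base @ (J_out.T @ f_new)
    ]
    for k in range(n_contact):
        fx, fy, fz = f_new[3 * k], f_new[3 * k + 1], f_new[3 * k + 2]
        constraints += [fz >= 0,
                        cp.abs(fx) <= mu * fz,      # friction cone, x direction
                        cp.abs(fy) <= mu * fz]      # friction cone, y direction
    cp.Problem(cp.Minimize(cost), constraints).solve()
    return qdd + d_qdd.value, f_out + d_f.value     # qdd_changed, f_out_changed
```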
After q̈_changed and f_out,changed are obtained, the corresponding joint torque τ_{24×1} can be obtained through the robot whole-body dynamics equation.
The obtained τ_{24×1} is taken as the feedforward quantity and, combined with the positions Δq_n and velocities q̇_n of the tasks at all levels after the null-space iteration, the physical joints of the robot are controlled by joint PD (proportional-derivative) control, with the control law:

τ_cmd = τ_joint + k_p·(q_des - q) + k_d·(q̇_des - q̇)

wherein τ_joint, the motor torque of the physical joints of the robot, is the vector formed by removing from the torque τ the torques of the first 6-dimensional virtual trunk joint; k_p and k_d are the proportional and derivative coefficients of the proportional-derivative control; q_des and q̇_des are the desired joint position and velocity given by the null-space iteration (Δq_n and q̇_n); q is the current joint angle value; and q̇ is the current joint angular velocity.
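A minimal sketch of this feedforward-plus-PD control law is given below; the gain values are placeholders.

```python
import numpy as np

# Sketch of the joint-level control law: whole-body-dynamics feedforward torque plus a
# proportional-derivative term tracking the positions/velocities from the null-space
# iteration. The first 6 generalized coordinates (virtual trunk joint) are dropped.
def joint_torque_command(tau_wb, q_des, qd_des, q_meas, qd_meas, kp=60.0, kd=2.0):
    tau_joint = tau_wb[6:]                         # remove the 6 virtual trunk "joint" torques
    return tau_joint + kp * (q_des - q_meas) + kd * (qd_des - qd_meas)
```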
After the control law τ_cmd for each joint torque of the robot is calculated, the reproduction of the dance motions of the leg-arm cooperative robot is finally completed.
Example two
This embodiment provides a track generation and tracking control system for leg-arm cooperative robot dancing, which specifically comprises:
a data acquisition module configured to obtain dance movements of an animation model of the leg-arm cooperative robot in animation software;
a trajectory extraction module configured to perform track extraction and track optimization on the dance movements to obtain trunk tracks, mechanical arm tracks, mechanical leg foot end tracks and mechanical leg foot end plantar forces that conform to robot dynamics;
a null-space mapping module configured to, after the tasks of the trunk, the mechanical arm and the mechanical legs are classified by priority, calculate the positions, velocities and accelerations of the tasks at all levels through null-space mapping in combination with the trunk track, the mechanical arm track, the mechanical leg foot end tracks and the mechanical leg foot end plantar forces;
a relaxation optimization module configured to obtain the joint torques, based on the accelerations of the tasks at all levels and the plantar forces at the mechanical leg foot ends, through the whole-body dynamics equation of the robot after relaxation optimization of the trunk dynamics equation of the leg-arm cooperative robot; and
a control module configured to take the joint torques as a feedforward quantity and, in combination with the positions and velocities of the tasks at all levels, perform proportional-derivative control on the robot joints to obtain the joint torque control law, thereby completing reproduction of the dance movements.
It should be noted that each module in this embodiment corresponds one-to-one to a step in the first embodiment, and the implementation process is the same, so it is not repeated here.
Example III
The present embodiment provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps in the trajectory generation and tracking control method of the leg-arm cooperative robot dance as described in the above embodiment.
Example IV
The present embodiment provides a computer device, including a memory, a processor, and a computer program stored in the memory and capable of running on the processor, where the processor executes the program to implement the steps in the track generation and tracking control method for dance of the leg-arm cooperative robot according to the above embodiment.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.

Claims (10)

1. A track generation and tracking control method for leg-arm cooperative robot dancing, characterized by comprising the following steps:
obtaining dance movements of an animation model of the leg-arm cooperative robot in animation software;
track extraction and track optimization are carried out on the dance movements, so that trunk tracks, mechanical arm tracks, mechanical leg foot end tracks and mechanical leg foot end plantar forces which accord with robot dynamics are obtained;
after the tasks of the trunk, the mechanical arm and the mechanical leg are classified according to priority, the positions, velocities and accelerations of the tasks at all levels are calculated through null-space mapping by combining the trunk track, the mechanical arm track, the mechanical leg foot end track and the mechanical leg foot end plantar force;
based on the accelerations of the tasks at all levels and the plantar force at the mechanical leg foot end, the joint torque is obtained through the whole-body dynamics equation of the robot after relaxation optimization of the trunk dynamics equation of the leg-arm cooperative robot;
and taking the joint torque as a feedforward quantity, combining the positions and velocities of the tasks at all levels, and performing proportional-derivative control on the joints of the robot to obtain the control law of the joint torque, thereby completing the reproduction of the dance motions.
2. The track generation and tracking control method for leg-arm cooperative robot dancing according to claim 1, wherein the robot whole-body dynamics equation is:
M_{24×24} q̈_changed + C_{24×1} + G_{24×1} = τ_{24×1} + J_out^T f_out,changed
wherein M_{24×24} is the robot mass distribution matrix, C_{24×1} is the generalized Coriolis force matrix, G_{24×1} is the generalized gravity matrix, f_out,changed = f_out + δf, q̈_changed = q̈ + δq̈, q̈ is obtained by null-space iteration, f_out is the mechanical leg foot end plantar force, δf and δq̈ are both relaxation optimization solution quantities satisfying the trunk dynamics equation, J_out denotes the Jacobian matrix, and τ_{24×1} denotes the joint torque.
3. The track generation and tracking control method for leg-arm cooperative robot dancing according to claim 1, wherein the constraint conditions of the trajectory optimization include: a rigid-body dynamics model, force constraints during foot end support, force constraints while the foot end is airborne, friction cone constraints of the foot ends, position constraints during foot end support, and position constraints of the mechanical arm end.
4. The track generation and tracking control method for leg-arm cooperative robot dancing according to claim 1, wherein the null-space mapping comprises: task-level null-space mapping, position-level null-space mapping, and acceleration-level null-space mapping.
5. The track generation and tracking control method for leg-arm cooperative robot dancing according to claim 1, wherein the trunk, the mechanical arms and the mechanical legs are each composed of a plurality of rigid bodies, and the rigid bodies are connected through joints.
6. The track generation and tracking control method for leg-arm cooperative robot dancing according to claim 1, wherein the null-space mapping is based on the Jacobian matrix of each task;
the Jacobian matrix is obtained by deriving the mapping relation from the Cartesian space to the joint space of the leg-arm cooperative robot through the principle of virtual work.
7. The track generation and tracking control method for leg-arm cooperative robot dancing according to claim 1, wherein the priority ranking, in descending order of precedence, is: the support legs, the trunk, the mechanical arms, and the swing legs.
8. A track generation and tracking control system for leg-arm cooperative robot dancing, characterized by comprising:
a data acquisition module configured to: obtain the dance movements of an animation model of the leg-arm cooperative robot in animation software;
a trajectory extraction module configured to: perform track extraction and track optimization on the dance movements to obtain a trunk track, mechanical arm tracks, mechanical leg foot end tracks and mechanical leg foot end plantar forces that conform to the robot dynamics;
a null-space mapping module configured to: after the tasks of the trunk, the mechanical arms and the mechanical legs are classified according to priority, calculate the positions, velocities and accelerations of the tasks at all levels through null-space mapping by combining the trunk track, the mechanical arm tracks, the mechanical leg foot end tracks and the mechanical leg foot end plantar forces;
a relaxation optimization module configured to: based on the accelerations of the tasks at all levels and the plantar forces at the mechanical leg foot ends, obtain the joint torque through the whole-body dynamics equation of the robot after relaxation optimization of the trunk dynamics equation of the leg-arm cooperative robot;
a control module configured to: take the joint torque as a feedforward quantity, combine the positions and velocities of the tasks at all levels, and perform proportional-derivative control on the joints of the robot to obtain the control law of the joint torque, thereby completing the reproduction of the dance motions.
9. A computer-readable storage medium on which a computer program is stored, characterized in that the program, when executed by a processor, implements the steps in the track generation and tracking control method for leg-arm cooperative robot dancing as claimed in any one of claims 1 to 7.
10. A computer device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the program, implements the steps in the track generation and tracking control method for leg-arm cooperative robot dancing of any one of claims 1 to 7.
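For illustration only (this sketch is not part of the claims), the whole-body dynamics relation in claim 2 can be rearranged to yield the feedforward joint torque once the relaxed accelerations and plantar forces are known, assuming the reconstructed form M q̈_changed + C + G = τ + J_out^T f_out,changed. The function name, array shapes, and placeholder values below are assumptions introduced for this sketch.

import numpy as np

def feedforward_torque(M, C, G, J_out, ddq_changed, f_out_changed):
    """Rearrange the whole-body dynamics equation to compute the joint torque (sketch).

    M              : (24, 24) robot mass distribution matrix
    C, G           : (24,) generalized Coriolis and gravity terms
    J_out          : (k, 24) output (contact) Jacobian
    ddq_changed    : (24,) relaxed generalized acceleration
    f_out_changed  : (k,) relaxed mechanical leg foot end plantar force
    """
    # tau = M * ddq_changed + C + G - J_out^T * f_out_changed
    return M @ ddq_changed + C + G - J_out.T @ f_out_changed

# Hypothetical usage with placeholder values (four feet, 3-D force each -> k = 12)
n, k = 24, 12
tau = feedforward_torque(np.eye(n), np.zeros(n), np.zeros(n),
                         np.zeros((k, n)), np.zeros(n), np.zeros(k))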
CN202311103584.0A 2023-08-30 2023-08-30 Track generation and tracking control method and system for leg-arm cooperative robot dancing Pending CN116901084A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination