CN106078752A - Kinect-based human-behavior imitation method for a humanoid robot - Google Patents

Kinect-based human-behavior imitation method for a humanoid robot

Info

Publication number
CN106078752A
Authority
CN
China
Prior art keywords
robot
action
kinect
joint
human body
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610480222.7A
Other languages
Chinese (zh)
Other versions
CN106078752B (en)
Inventor
朱光明
张亮
宋娟
沈沛意
程志浩
刘宇飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Institute Of Computing Technology Xi'an University Of Electronic Science And Technology
Original Assignee
Xidian University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xidian University filed Critical Xidian University
Priority to CN201610480222.7A priority Critical patent/CN106078752B/en
Publication of CN106078752A publication Critical patent/CN106078752A/en
Application granted granted Critical
Publication of CN106078752B publication Critical patent/CN106078752B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/42 Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine
    • G05B19/4202 Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine; preparation of the programme medium using a drawing, a model
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/1605 Simulation of manipulator lay-out, design, modelling of manipulator
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1671 Programme controls characterised by programming, planning systems for manipulators; characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a Kinect-based human-behavior imitation method for a humanoid robot, comprising the following steps: 1) extract human skeleton information from the RGB-D images collected by Kinect and build a motion model of the human body; 2) build a motion model of the humanoid robot according to the structure of its own limbs; 3) convert the captured human actions into actions in the robot motion model through a mapping model; 4) adjust the robot joint angles with balance-control techniques so that the robot remains stable during imitation; 5) after self-collision avoidance processing of the robot state, send the final joint configuration data to the robot to complete the imitated action. The present invention uses a mapping technique based on eight kinematic chains so that the humanoid robot imitates as many human joint positions as possible; the imitated actions closely resemble the human ones and can be executed online in real time. The method can be applied to humanoid robot learning of human actions, remote control, and similar areas.

Description

Kinect-based human-behavior imitation method for a humanoid robot
Technical field
The present invention relates to the technical field of humanoid robot motion control, and in particular to a Kinect-based human-behavior imitation method for a humanoid robot.
Background technology
A humanoid robot is a robot with a human-like outer appearance; it can work in the environments people live in and use the tools of humans. The earliest humanoid robot is generally considered to be WABOT-1, built in 1973 by the laboratory of Ichiro Kato at Waseda University. After more than forty years of development, humanoid robots are now widely used in many fields. At present, humanoid robotics is entering a new research stage: combining humanoid robots with artificial intelligence will give them the ability to reason, allowing them to blend into human life better and to serve and work for mankind.
As a very active research field, humanoid robots have shown a unique appeal to human society. The ASIMO robot developed by Honda of Japan has performed jumping, running, pouring drinks for guests and other actions at exhibitions, demonstrating the strong service capability of humanoid robots. The mobile-phone robot developed by Sharp of Japan is a miniature humanoid robot; besides the usual functions of a smartphone, it can also walk and dance, and this robot phone has been put on the market and warmly welcomed. The applications of humanoid robots have therefore begun to reach the market and are bound to spread into the daily life of mankind in the future.
In humanoid robot research, combining artificial intelligence with humanoid robots is the key to letting robots blend into human life. At present, however, artificial intelligence is still immature, and most AI products target specific domains, so a humanoid robot cannot yet possess anything like complete awareness. Under these circumstances, robots can, on the one hand, take over heavily repetitive mechanical actions, such as assembly work on production lines; on the other hand, humans can teleoperate robots to complete dangerous or difficult tasks in their place. Because robot intelligence is low, a robot cannot make correct judgements about the current situation and can only act on received instructions. If a humanoid robot is given the ability to imitate, so that it follows a person and performs similar actions, the application range of humanoid robots can be widened greatly and their serviceability improved.
Imitation technology acquires human motion data with a motion-capture device and uses an imitation algorithm to map human actions onto humanoid robot actions. Imitation lets the robot perform actions similar to a human's while guaranteeing its stability, so a human can remotely control the robot to do dangerous things such as fire rescue or grasping dangerous goods. A robot can also learn basic actions from human actions; for example, after learning sign-language actions, a humanoid robot can communicate with deaf people in sign language. In addition, with action learning a robot can record human dances and reproduce the dance movements in robot form, which helps preserve intangible cultural heritage such as the dances of ethnic minorities that are on the verge of disappearing.
However, most known imitation algorithms are either offline or restricted to the upper limbs. Although a small number of whole-body imitation algorithms exist, they imitate few body positions, so the imitation effect is unsatisfactory. To address these problems, the present invention proposes a real-time whole-body imitation algorithm that imitates as many body positions as possible in order to improve the imitation effect.
Summary of the invention
The object of the present invention is to provide a Kinect-based human-behavior imitation method for a humanoid robot which builds an eight-kinematic-chain motion model of the human body from the human skeleton information extracted from the RGB-D images collected by Kinect, and which maps the actions into the robot motion model through a mapping model and a balance-control mechanism so that the robot reproduces the imitated actions in real time, thereby solving the problems raised in the background above.
To achieve the above object, the present invention provides the following technical solution:
A Kinect-based human-behavior imitation method for a humanoid robot, which builds an eight-kinematic-chain motion model of the human body on the basis of the human skeleton information extracted from the RGB-D images collected by Kinect, and maps the actions into the robot motion model for real-time reproduction through a mapping model and a balance-control mechanism, comprising the following steps:
1) Three modules are set up: action data acquisition, action data processing and joint configuration. The action data acquisition module builds an abstract model of the human action; the action data processing module maps the human action onto the robot and optimizes the robot action; the joint configuration module sends the processed joint data to the robot to execute the imitated action.
2) Action data acquisition: the human motion model is extracted frame by frame from the RGB-D images collected by Kinect, and every frame of the currently captured human behavior is represented by the abstract model.
3) Action data processing: every collected frame goes through three steps, namely mapping-model processing, balance-control processing and self-collision avoidance processing, finally yielding the joint-angle configuration of the corresponding imitated pose of the humanoid robot.
4) Mapping-model processing: the eight kinematic-chain vectors expressed in the human motion model are transformed frame by frame into the robot motion model; the mapped vectors are used to compute the position of each end effector of the robot, and the configuration of all joints on all kinematic chains is then solved with inverse kinematics.
5) Balance-control processing: the robot's center-of-mass position is computed for every frame, and the balance state is determined by judging whether the vertical projection of the center of mass lies within the support polygon; when imbalance is detected, the joint configuration of the robot's legs is corrected so that the vertical projection of the center of mass falls within the support polygon.
6) Self-collision avoidance processing: the space occupied by the torso is represented with quantized cubes, and forward and inverse kinematics are used to judge whether an end effector of the robot collides with the body; when a self-collision is detected, the upper-limb angles are adjusted to move the end effector away from the torso, and the self-collision state is judged again.
7) Joint configuration: whether the robot's legs move is judged; when the upper limbs move but the legs do not, the action execution is sped up, shortening the configuration time and increasing the execution speed of the robot.
As a further aspect of the invention: real-time action imitation is carried out online on a continuous image sequence, and the start and end time points of the human behavior contained in the image sequence are both unknown.
As a further aspect of the invention: in step 1, each collected image frame goes through the joint configuration process after its action processing has finished.
As a further aspect of the invention: in step 2, the human motion model extracted from Kinect is represented by eight kinematic chains, which consist of the two forearms and two upper arms of the upper limbs and the two thighs and two shanks of the lower limbs; in the abstract model each kinematic chain is represented by a vector, and the posture is expressed as

$$P = \left[ V_{LShoulder}^{LElbow},\ V_{LElbow}^{LHand},\ V_{RShoulder}^{RElbow},\ V_{RElbow}^{RHand},\ V_{LHip}^{LKnee},\ V_{LKnee}^{LAnkle},\ V_{RHip}^{RKnee},\ V_{RKnee}^{RAnkle} \right];$$

where the vector $V_{From}^{To}$ denotes the kinematic chain from joint $From$ to joint $To$.
As a further aspect of the invention: in step 3, the processing module performs action mapping, balance control and self-collision avoidance on every frame of data in that order.
As a further aspect of the invention: in step 4, the mapping is performed by matching the unit vectors of the human kinematic chains with those of the corresponding robot kinematic chains, and the joint-angle corrections on a kinematic chain are computed by inverse kinematics as

$$\Delta q = W^{-1} J^{T} \left( J W^{-1} J^{T} + \lambda I \right)^{-1} \Delta x;$$

where $J$ is the Jacobian matrix; the damping coefficient $\lambda$ is a positive number that changes according to the effect of the solution on the pose error; $I$ is the identity matrix; $\Delta x$ is the difference between the desired and current joint positions; $\Delta q$ is the vector of angle corrections of all corresponding joints on the kinematic chain; and $W$ is a weight matrix obtained from the joint-angle range and the current angle $\theta_i$ of joint $i$ as

$$w_i = 1 + \left| \frac{\partial H(\theta)}{\partial \theta_i} \right|;$$

$$\frac{\partial H(\theta)}{\partial \theta_i} = \frac{(\theta_{i,\max} - \theta_{i,\min})^{2}\,(2\theta_i - \theta_{i,\max} - \theta_{i,\min})}{4\,(\theta_{i,\max} - \theta_i)^{2}\,(\theta_i - \theta_{i,\min})^{2}}.$$
As a further aspect of the invention: in step 5, the robot performs quasi-static motion during imitation so that the vertical projection of the center of mass on the ground approximates the zero-moment point; the robot is judged stable when the zero-moment point lies within the support polygon. The center of mass is computed as

$$P_{CoM} = \frac{\sum_{i=0}^{N} m_i p_i}{\sum_{i=0}^{N} m_i};$$

where $p_i$ is the centroid position of robot link $i$ in its local coordinate system and $m_i$ is the mass of link $i$.
As a further aspect of the invention: in step 6, the body parts checked for self-collision include the torso and the hands.
As a further aspect of the invention: in step 7, when leg motion is detected, the robot's action execution speed is slowed down during configuration so that the robot approximates quasi-static motion.
Compared with the prior art, the beneficial effects of the invention are as follows. The present invention proposes a Kinect-based human-behavior imitation method for a humanoid robot that requires no behavior calibration between the robot and the person before imitation; with a mapping model built on actions represented by eight kinematic chains, the actions reproduced by the robot are highly similar to the human's. The method keeps the line between the robot's center of mass and the supporting foot approximately perpendicular to the ground to control the robot's balance, which ensures that the robot can imitate stably and in real time. Throughout the imitation process the action data acquisition module, the action data processing module and the joint configuration module all run in parallel and communicate through read/write buffers, so the execution efficiency is high and online real-time execution is achieved. The eight-kinematic-chain motion model of the human body is built from the human skeleton information extracted from the RGB-D images collected by Kinect, and the actions are mapped into the robot motion model through the mapping model and the balance-control mechanism so that the robot reproduces the imitated actions in real time.
Brief description of the drawings
Fig. 1 is the workflow diagram of the present invention.
Fig. 2 is a schematic diagram of the human motion model in the present invention.
Fig. 3 is a schematic diagram of the robot motion model in the present invention.
Fig. 4 is a schematic diagram of support-mode switching in the present invention.
Fig. 5 is a schematic diagram of the support polygon in the present invention.
Fig. 6 is a schematic diagram of the robot's single-foot support mode in the present invention.
Fig. 7 is a schematic diagram of the robot's double-foot support mode in the present invention.
Fig. 8 is a side view of the robot's double-foot support mode in the present invention.
Fig. 9 is the system architecture diagram of the present invention.
Detailed description of the invention
The technical solution in the embodiments of the present invention is described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some embodiments of the present invention rather than all of them. Based on the embodiments of the present invention, all other embodiments obtained by a person of ordinary skill in the art without creative work fall within the protection scope of the present invention.
Referring to Figs. 1 to 9, in an embodiment of the present invention, a Kinect-based human-behavior imitation method for a humanoid robot is implemented according to the following steps, with the algorithm flow shown in Fig. 1: building the motion model of the human body, building the motion model of the robot, action mapping, balance control, and self-collision avoidance.
A) Building the motion model of the human body

The motion model of the human body sets up its coordinate system at the center of the human torso as captured by Kinect. The human motion model extracted from the RGB-D images is shown in Fig. 2; the motion model represented by the eight kinematic chains can represent an arbitrary human behavior, and the posture is expressed as

$$P = \left[ V_{LShoulder}^{LElbow},\ V_{LElbow}^{LHand},\ V_{RShoulder}^{RElbow},\ V_{RElbow}^{RHand},\ V_{LHip}^{LKnee},\ V_{LKnee}^{LAnkle},\ V_{RHip}^{RKnee},\ V_{RKnee}^{RAnkle} \right];$$

where the vector $V_{From}^{To}$ denotes the kinematic chain from joint $From$ to joint $To$.
The depth data of each joint in the Kinect coordinate system is transformed into the human torso coordinate system as follows. For any point whose coordinates $C_i^{Torso}$ in the torso coordinate system are to be found, the coordinates of all joint points in the Kinect coordinate system are known, including the torso position $C_{Torso}^{Kinect}$ and orientation $R_{Torso}^{Kinect}$. If the point has coordinates $C_i^{Kinect}$ in the Kinect coordinate system, the following relation holds:

$$C_i^{Kinect} = R_{Torso}^{Kinect}\, C_i^{Torso} + C_{Torso}^{Kinect};$$

from which $C_i^{Torso}$ is solved as

$$C_i^{Torso} = \left( R_{Torso}^{Kinect} \right)^{T} \left( C_i^{Kinect} - C_{Torso}^{Kinect} \right);$$

where the inverse is replaced by the transpose because every rotation matrix is orthogonal, so $R^{T} = R^{-1}$.
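By way of illustration, a minimal Python sketch of step A) is given below: it transforms Kinect-frame joint positions into the torso frame using the orthogonality relation above and assembles the eight chain vectors of the posture P. The joint names, the dictionary layout and the availability of the torso rotation matrix R_torso from the skeleton tracker are assumptions made for the example, not details of the disclosed implementation.

```python
import numpy as np

# The eight kinematic chains of the posture P, as (From, To) joint pairs.
CHAINS = [
    ("LShoulder", "LElbow"), ("LElbow", "LHand"),
    ("RShoulder", "RElbow"), ("RElbow", "RHand"),
    ("LHip", "LKnee"), ("LKnee", "LAnkle"),
    ("RHip", "RKnee"), ("RKnee", "RAnkle"),
]

def to_torso_frame(joints_kinect, R_torso, c_torso):
    """C_i^Torso = (R_Torso^Kinect)^T (C_i^Kinect - C_Torso^Kinect);
    the transpose replaces the inverse because R is orthogonal."""
    return {name: R_torso.T @ (p - c_torso) for name, p in joints_kinect.items()}

def posture_vectors(joints_torso):
    """One chain vector V_From^To per (From, To) pair of CHAINS."""
    return [joints_torso[to] - joints_torso[frm] for frm, to in CHAINS]
```

Applied to one frame of tracked joints, posture_vectors yields the eight vectors of P for that frame.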
B) Building the motion model of the robot

The humanoid robot sets up a coordinate system at its torso center according to its own structural features, and its motion model is likewise represented by eight kinematic chains, as shown in Fig. 3; the posture of the robot is expressed in the same way:

$$P = \left[ V_{LShoulder}^{LElbow},\ V_{LElbow}^{LHand},\ V_{RShoulder}^{RElbow},\ V_{RElbow}^{RHand},\ V_{LHip}^{LKnee},\ V_{LKnee}^{LAnkle},\ V_{RHip}^{RKnee},\ V_{RKnee}^{RAnkle} \right];$$

where the vector $V_{From}^{To}$ denotes the kinematic chain from joint $From$ to joint $To$.
C) Action mapping

According to the motion models built above, the conversion from human behavior to robot behavior is completed by matching the unit vectors of the robot kinematic chains with those of the corresponding chains of the human motion model. After mapping, a robot joint coordinate is computed as, for example,

$$C_{LElbow} = uLINK(LShoulder).p + \frac{V_{LShoulder}^{LElbow}}{\left\| V_{LShoulder}^{LElbow} \right\|} \cdot L_{LShoulder}^{LElbow};$$

where $uLINK(LShoulder).p$ is a constant denoting the three-dimensional position of the shoulder in the robot torso coordinate system (the distance between the shoulder and the torso center is fixed); the joint LShoulder, short for Left Shoulder, denotes the left shoulder joint; $L_{LShoulder}^{LElbow}$, also a constant, denotes the robot's upper-arm length; LElbow, short for Left Elbow, denotes the left elbow joint; the vector $V_{LShoulder}^{LElbow}$ is the vector representation of the human kinematic chain; and $\|\cdot\|$ is the Euclidean norm.
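A short sketch of this retargeting step, assuming illustrative link constants (the shoulder offset uLINK(LShoulder).p and the link lengths below are placeholders, not the values of any particular robot): the direction of each chain is taken from the human, the length from the robot.

```python
import numpy as np

SHOULDER_POS = np.array([0.0, 0.098, 0.100])  # uLINK(LShoulder).p (placeholder)
UPPER_ARM_LEN = 0.105                          # L_LShoulder^LElbow (placeholder)
FOREARM_LEN = 0.114                            # L_LElbow^LHand (placeholder)

def map_left_arm(v_upper, v_fore):
    """Scale the unit direction of each human chain by the robot link length."""
    c_elbow = SHOULDER_POS + v_upper / np.linalg.norm(v_upper) * UPPER_ARM_LEN
    c_hand = c_elbow + v_fore / np.linalg.norm(v_fore) * FOREARM_LEN
    return c_elbow, c_hand
```

The same pattern applies chain by chain to the other arm and to both legs.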
After the three-dimensional positions of the robot joints have been mapped, the joint angles of the corresponding kinematic chain are computed from the joint positions with

$$\Delta q = W^{-1} J^{T} \left( J W^{-1} J^{T} + \lambda I \right)^{-1} \Delta x;$$

where $J$ is the Jacobian matrix; the damping coefficient $\lambda$ is a positive number that changes according to the effect of the solution on the pose error; $I$ is the identity matrix; $\Delta x$ is the difference between the desired and current joint positions; $\Delta q$ is the vector of angle corrections of all corresponding joints on the kinematic chain; and $W$ is a weight matrix obtained from the joint-angle range and the current angle $\theta_i$ of joint $i$ as

$$w_i = 1 + \left| \frac{\partial H(\theta)}{\partial \theta_i} \right|;$$

$$\frac{\partial H(\theta)}{\partial \theta_i} = \frac{(\theta_{i,\max} - \theta_{i,\min})^{2}\,(2\theta_i - \theta_{i,\max} - \theta_{i,\min})}{4\,(\theta_{i,\max} - \theta_i)^{2}\,(\theta_i - \theta_{i,\min})^{2}}.$$
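The weighted damped-least-squares update above can be sketched directly from the formulas; the Jacobian J is assumed to be supplied by the robot model (for example by numerical differentiation of its forward kinematics), and the damping value is a placeholder.

```python
import numpy as np

def joint_limit_weight(theta, t_min, t_max):
    """w_i = 1 + |dH/dtheta_i| with the joint-limit measure of the text."""
    dH = ((t_max - t_min) ** 2 * (2 * theta - t_max - t_min)
          / (4.0 * (t_max - theta) ** 2 * (theta - t_min) ** 2))
    return 1.0 + np.abs(dH)

def dls_step(J, dx, theta, t_min, t_max, lam=0.01):
    """One update dq = W^-1 J^T (J W^-1 J^T + lambda I)^-1 dx."""
    W_inv = np.diag(1.0 / joint_limit_weight(theta, t_min, t_max))
    JWJt = J @ W_inv @ J.T
    dq = W_inv @ J.T @ np.linalg.solve(JWJt + lam * np.eye(J.shape[0]), dx)
    return theta + dq
```

Because w_i grows without bound as a joint approaches either of its limits, joints near their limits receive small corrections, which is the joint-limit avoidance behavior the weight matrix W is meant to provide.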
D) Balance control

During imitation the robot performs quasi-static motion so that the projection of the center of mass on the ground approximately represents the zero-moment point; as long as the zero-moment point stays inside the support polygon, the robot remains stable. The center of mass is computed as

$$P_{CoM} = \frac{\sum_{i=0}^{N} m_i p_i}{\sum_{i=0}^{N} m_i};$$

where $p_i$ is the centroid position of robot link $i$ in its local coordinate system and $m_i$ is the mass of link $i$.
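The center-of-mass formula is a plain mass-weighted average and can be sketched in a few lines; the link masses and centroid positions, already expressed in a common frame, are assumed inputs.

```python
import numpy as np

def center_of_mass(masses, centroids):
    """P_CoM = sum(m_i * p_i) / sum(m_i) for N links.
    masses: (N,) array; centroids: (N, 3) array of link centroid positions."""
    m = np.asarray(masses, dtype=float)
    p = np.asarray(centroids, dtype=float)
    return (m[:, None] * p).sum(axis=0) / m.sum()
```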
Fig. 4 shows the switching between support modes. The robot is considered to switch from the double-foot support mode to a single-foot support mode only when the center of mass moves over one supporting foot and the height difference between the free foot and the supporting foot becomes large enough. Likewise, it is considered to switch from a single-foot support mode back to the double-foot support mode only when the center of mass leaves the supporting foot and the height difference between the free foot and the supporting foot falls below a certain value. The concrete switching conditions are expressed as

$$Cond_1: \left| Torso.y - LAnkle.y \right| \le OFFSET \ \text{ and } \ RAnkle.z - LAnkle.z \ge MIN\_HEIGHT;$$

$$Cond_2: \left| Torso.y - RAnkle.y \right| \le OFFSET \ \text{ and } \ LAnkle.z - RAnkle.z \ge MIN\_HEIGHT;$$

$$Cond_3: \left| Torso.y - LAnkle.y \right| \ge OFFSET \ \text{ and } \ RAnkle.z - LAnkle.z \le MIN\_HEIGHT;$$

$$Cond_4: \left| Torso.y - RAnkle.y \right| \ge OFFSET \ \text{ and } \ LAnkle.z - RAnkle.z \le MIN\_HEIGHT;$$

where Torso is the three-dimensional coordinate of the human torso center collected by Kinect, used to approximate the human center of mass; LAnkle and RAnkle (Left Ankle and Right Ankle) are the three-dimensional coordinates of the human left and right ankle joints collected by Kinect; and OFFSET and MIN_HEIGHT are thresholds preset before imitation, used to eliminate the influence of fluctuations in the Kinect measurements.
The meanings of some of the joints appearing in Fig. 4 are as shown in the table:
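To make the switching logic concrete, the four conditions can be sketched as a small state machine; the threshold values are placeholders, and the convention that y is the lateral axis and z the vertical axis follows the conditions above.

```python
OFFSET = 0.10       # lateral CoM-over-foot threshold (placeholder, meters)
MIN_HEIGHT = 0.05   # free-foot lift threshold (placeholder, meters)

def support_mode(torso, l_ankle, r_ankle, mode):
    """Switch among 'double', 'left' and 'right' support per Cond1..Cond4;
    each argument point is an (x, y, z) triple from the Kinect skeleton."""
    if mode == "double":
        if abs(torso[1] - l_ankle[1]) <= OFFSET and r_ankle[2] - l_ankle[2] >= MIN_HEIGHT:
            return "left"    # Cond1: CoM over the left foot, right foot lifted
        if abs(torso[1] - r_ankle[1]) <= OFFSET and l_ankle[2] - r_ankle[2] >= MIN_HEIGHT:
            return "right"   # Cond2
    elif mode == "left":
        if abs(torso[1] - l_ankle[1]) >= OFFSET and r_ankle[2] - l_ankle[2] <= MIN_HEIGHT:
            return "double"  # Cond3
    elif mode == "right":
        if abs(torso[1] - r_ankle[1]) >= OFFSET and l_ankle[2] - r_ankle[2] <= MIN_HEIGHT:
            return "double"  # Cond4
    return mode
```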
The mathematical model of the support polygon is shown in Fig. 5. A plane coordinate system is set up at the center of one foot, and the six borders $l_1, l_2, l_3, l_4, l_5, l_6$ of the support polygon are each expressed in the form $y = ax + b$; given the three-dimensional coordinates of the robot's center of mass, it can then be determined whether its vertical projection on the ground lies within the support polygon, and hence the robot's current balance state.
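One way to run the inside test against the six borders y = ax + b is a half-plane check; which side of each border counts as the interior is robot-specific, so the sign convention below is an assumption of the sketch.

```python
def inside_support_polygon(com_xy, borders):
    """borders: iterable of (a, b, sign); each border is y = a*x + b, and
    sign=+1 means the interior satisfies y >= a*x + b, sign=-1 the opposite.
    com_xy: (x, y) ground projection of the center of mass."""
    x, y = com_xy
    return all(sign * (y - (a * x + b)) >= 0.0 for a, b, sign in borders)
```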
Fig. 6 shows the robot's single-foot support mode. The vector $[0,0,1]^T$ is first rotated by an angle $\alpha$ about the roll axis and then by an angle $\beta$ about the pitch axis, which yields the vector Ankle-CoM, the line from the supporting ankle joint Ankle to the center of mass CoM (Center of Mass). In fact, the pitch axis here is expressed not in the world coordinate system but in the local coordinate system of Ankle, because the roll axis is a pitch axis relative to its parent link, the ground. The angles $\alpha$ and $\beta$ are solved by the space-vector method with the following formulas:

$$\vec{n} = \vec{V}_{roll} \times \vec{V}_{Ankle\text{-}CoM};$$

$$\left| \alpha \right| = \arccos\left( \frac{\vec{n} \cdot \vec{V}_{pitch}}{\left\| \vec{n} \right\| \cdot \left\| \vec{V}_{pitch} \right\|} \right);$$

$$\vec{V}_{start} = \left[ 0,\ \sin\left|\alpha\right|,\ \cos\alpha \right]^{T};$$

$$\left| \beta \right| = \arccos\left( \frac{\vec{V}_{start} \cdot \vec{V}_{Ankle\text{-}CoM}}{\left\| \vec{V}_{start} \right\| \cdot \left\| \vec{V}_{Ankle\text{-}CoM} \right\|} \right);$$

where $\vec{V}_{roll}$ and $\vec{V}_{pitch}$ are the direction vectors of the roll and pitch axes, and $\|\cdot\|$ is the Euclidean norm.
The ankle joint angles are corrected repeatedly with the angles $\alpha$ and $\beta$; the correction must be repeated because the position of the center of mass changes as the ankle joint angle is adjusted. The present invention uses this cyclic approximation to obtain the final ankle joint configuration, looping until the maximum number of iterations is reached or the projected position of the center of mass meets the requirement. Although this method resembles general numerical approximation methods such as least squares, it is faster, because the first few adjustments usually bring the three-dimensional position of the center of mass to the required condition without many iterations; it simply makes Ankle-CoM as close to perpendicular to the ground as possible, which makes the center of mass fall within the support polygon.
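A sketch of this cyclic approximation under single-foot support is given below; the routines com_relative_to_ankle and adjust_ankle stand in for robot-specific forward-kinematics and joint-update code, which is assumed rather than specified here.

```python
import numpy as np

def correction_angles(v_ankle_com, v_roll, v_pitch):
    """Solve |alpha| and |beta| by the space-vector method of Fig. 6."""
    n = np.cross(v_roll, v_ankle_com)
    alpha = np.arccos(n @ v_pitch / (np.linalg.norm(n) * np.linalg.norm(v_pitch)))
    v_start = np.array([0.0, np.sin(alpha), np.cos(alpha)])
    beta = np.arccos(v_start @ v_ankle_com /
                     (np.linalg.norm(v_start) * np.linalg.norm(v_ankle_com)))
    return alpha, beta

def balance_single_support(robot, v_roll, v_pitch, tol=0.01, max_iter=10):
    """Nudge the ankle roll/pitch until Ankle-CoM is nearly vertical
    or the iteration budget is spent."""
    for _ in range(max_iter):
        v = robot.com_relative_to_ankle()       # assumed FK/CoM routine
        v = v / np.linalg.norm(v)
        if np.hypot(v[0], v[1]) < tol:          # CoM projection near the ankle
            return
        alpha, beta = correction_angles(v, v_roll, v_pitch)
        robot.adjust_ankle(alpha, beta)         # assumed joint update
```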
Figs. 7 and 8 show the robot's double-foot support mode, that is, the double-foot support model after the ankle joints have been adjusted. In the ideally adjusted posture the vector Ankle-Hip lies in the pitch-yaw plane, where Hip is the robot's hip joint. AB is the rotation axis of the joint Hip and is orthogonal to the vector Ankle-Hip; $\gamma$ is the angle between the pitch-yaw plane and the plane AB-CoM formed by the vectors AB and Hip-CoM. The angle of the robot joint Hip is updated with $\gamma$; once the vector Ankle-CoM is parallel to the pitch-yaw plane, the robot is stable. The computation for the double-foot support mode is

$$\vec{n} = \vec{V}_{AB} \times \vec{V}_{Hip\text{-}CoM};$$

$$\left| \gamma \right| = \arccos\left( \frac{\vec{V}_{roll} \cdot \vec{n}}{\left\| \vec{V}_{roll} \right\| \cdot \left\| \vec{n} \right\|} \right);$$

where $\vec{n}$ is the normal vector of the plane AB-CoM; the corresponding adjustment angle $\gamma$ is obtained by computing the angle between the normal vectors of the plane AB-CoM and of the yaw-pitch plane, and $\|\cdot\|$ is the Euclidean norm.
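The corresponding double-support correction, sketched under the same assumptions (the roll-axis direction is used as the normal of the yaw-pitch plane, as in the formula above):

```python
import numpy as np

def hip_correction_angle(v_ab, v_hip_com, v_roll):
    """|gamma| = angle between the normal of plane AB-CoM and the normal of
    the yaw-pitch plane (represented here by the roll-axis direction)."""
    n = np.cross(v_ab, v_hip_com)               # normal of plane AB-CoM
    cos_g = v_roll @ n / (np.linalg.norm(v_roll) * np.linalg.norm(n))
    return np.arccos(np.clip(cos_g, -1.0, 1.0))
```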
E) self collision is avoided
Self-collision is analyzed after balance control and is realized through forward and inverse kinematics. First, the robot body parts where self-collision may occur are abstracted into simple solids, which can be assembled from cubic units or modeled as simple cylinders. Given the three-dimensional position of the robot hand in the torso coordinate system, whether a self-collision occurs can be determined by judging whether this point lies within the abstracted collision cylinder. If a self-collision occurs, the position of the end effector is moved a certain distance away from the cylinder axis; the new joint configuration is then solved again with inverse kinematics, and finally forward kinematics is used to update the robot's pose.
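A minimal sketch of the cylinder test: the torso is approximated by a vertical cylinder in the torso frame, and a hand target that falls inside it is pushed radially outward before being handed back to inverse kinematics. The radius, half-height and margin are placeholders.

```python
import numpy as np

def resolve_torso_collision(hand, radius=0.09, half_height=0.12, margin=0.02):
    """Return the hand target unchanged if it is outside the torso cylinder
    (torso frame, z up); otherwise push it radially out of the cylinder."""
    radial = np.array([hand[0], hand[1], 0.0])
    r = np.linalg.norm(radial)
    if r < radius and abs(hand[2]) < half_height:
        direction = radial / r if r > 1e-9 else np.array([1.0, 0.0, 0.0])
        return direction * (radius + margin) + np.array([0.0, 0.0, hand[2]])
    return hand
```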
Fig. 9 shows the implementation framework of the imitation system. The system consists of three modules: an acquisition module, a data processing module and a configuration module. The acquisition module builds the human motion model from the coordinates of the key joints obtained by Kinect; it can also control the acquisition frame rate and filter out bad frames. The processing module is divided into three steps: first, a coordinate system with the torso as the origin is set up from the joint coordinates and the eight-chain human motion model is constructed; then the unit vectors of the robot's kinematic chains are matched with the unit vectors of the corresponding human chains to achieve the imitation effect; finally, the balance-control and self-collision avoidance mechanisms are integrated. To shorten the processing time, the inverse kinematics (IK) and forward kinematics (FK) computations of the upper-limb and lower-limb joints are processed in parallel, and the acquisition module, processing module and configuration module also run simultaneously to improve system performance.
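As a sketch of that module layout, the pipeline below wires three workers together with bounded queues standing in for the read/write buffers described above; get_skeleton, map_frame and send_to_robot are assumed stand-ins for the acquisition, processing and configuration routines.

```python
import queue
import threading

frames = queue.Queue(maxsize=2)    # acquisition -> processing buffer
configs = queue.Queue(maxsize=2)   # processing -> configuration buffer

def run(get_skeleton, map_frame, send_to_robot):
    """Run acquisition, processing and configuration in parallel."""
    def acquire():
        while True:
            frames.put(get_skeleton())            # eight-chain model per frame
    def process():
        while True:
            configs.put(map_frame(frames.get()))  # mapping + balance + collision
    def configure():
        while True:
            send_to_robot(configs.get())          # final joint angles out
    for worker in (acquire, process, configure):
        threading.Thread(target=worker, daemon=True).start()
```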
The keys to the humanoid robot's imitation of human behavior in the present invention are the action mapping and the balance control; together they imitate human behavior efficiently while guaranteeing the robot's stability.
It is obvious to a person skilled in the art that the invention is not restricted to the details of the above exemplary embodiments and that the present invention can be realized in other specific forms without departing from its spirit or essential attributes. The embodiments should therefore be regarded in every respect as exemplary and non-restrictive; the scope of the present invention is defined by the appended claims rather than by the above description, and all changes falling within the meaning and range of equivalency of the claims are intended to be embraced therein. No reference sign in the claims should be construed as limiting the claim concerned.
Moreover, although this specification is described in terms of embodiments and not every embodiment contains only one independent technical solution, this manner of narration is adopted only for clarity; those skilled in the art should treat the specification as a whole, and the technical solutions in the embodiments may be suitably combined to form other embodiments understandable to those skilled in the art.

Claims (9)

1. A Kinect-based human-behavior imitation method for a humanoid robot, which builds an eight-kinematic-chain motion model of the human body on the basis of the human skeleton information extracted from the RGB-D images collected by Kinect, and maps the actions into the robot motion model for real-time reproduction through a mapping model and a balance-control mechanism, characterized by comprising the following steps:
1) establishing three modules: action data acquisition, action data processing and joint configuration, wherein the action data acquisition module builds an abstract model of the human action, the action data processing module maps the human action onto the robot and optimizes the robot action, and the joint configuration module sends the processed joint data to the robot to execute the imitated action;
2) action data acquisition: extracting the human motion model frame by frame from the RGB-D images collected by Kinect, each frame of the currently captured human behavior being represented by the abstract model;
3) action data processing: subjecting every collected frame to three steps, namely mapping-model processing, balance-control processing and self-collision avoidance processing, finally obtaining the joint-angle configuration of the corresponding imitated pose of the humanoid robot;
4) mapping-model processing: transforming, frame by frame, the eight kinematic-chain vectors expressed in the human motion model into the robot motion model, computing the position of each end effector of the robot from the mapped vectors, and then solving the configuration of all joints on all kinematic chains with inverse kinematics;
5) balance-control processing: computing the robot's center-of-mass position for every frame and judging whether its vertical projection lies within the support polygon to determine the robot's balance state, and, when imbalance is detected, correcting the joint configuration of the robot's legs so that the vertical projection of the center of mass falls within the support polygon;
6) self-collision avoidance processing: representing the space occupied by the torso with quantized cubes, judging with forward and inverse kinematics whether an end effector of the robot collides with the body, and, when a self-collision is detected, adjusting the upper-limb angles to move the end effector away from the torso and judging the self-collision state again;
7) joint configuration: judging whether the robot's legs move, and, when the upper limbs move but the legs do not, speeding up the action execution to shorten the configuration time and increase the execution speed of the robot.
2. The Kinect-based human-behavior imitation method for a humanoid robot according to claim 1, characterized in that the real-time action imitation is carried out online on a continuous image sequence, the start and end time points of the human behavior contained in the image sequence both being unknown.
3. The Kinect-based human-behavior imitation method for a humanoid robot according to claim 1, characterized in that in step 1 each collected image frame goes through the joint configuration process after its action processing has finished.
4. The Kinect-based human-behavior imitation method for a humanoid robot according to claim 1, characterized in that in step 2 the human motion model extracted from Kinect is represented by eight kinematic chains, which consist of the two forearms and two upper arms of the upper limbs and the two thighs and two shanks of the lower limbs; in the abstract model each kinematic chain is represented by a vector, and the posture is expressed as

$$P = \left[ V_{LShoulder}^{LElbow},\ V_{LElbow}^{LHand},\ V_{RShoulder}^{RElbow},\ V_{RElbow}^{RHand},\ V_{LHip}^{LKnee},\ V_{LKnee}^{LAnkle},\ V_{RHip}^{RKnee},\ V_{RKnee}^{RAnkle} \right];$$

where the vector $V_{From}^{To}$ denotes the kinematic chain from joint $From$ to joint $To$.
5. The Kinect-based human-behavior imitation method for a humanoid robot according to claim 1, characterized in that in step 3 the processing module performs action mapping, balance control and self-collision avoidance on every frame of data in that order.
6. The Kinect-based human-behavior imitation method for a humanoid robot according to claim 1, characterized in that in step 4 the mapping is performed by matching the unit vectors of the human kinematic chains with those of the corresponding robot kinematic chains, and the joint-angle corrections on a kinematic chain are computed by inverse kinematics as

$$\Delta q = W^{-1} J^{T} \left( J W^{-1} J^{T} + \lambda I \right)^{-1} \Delta x;$$

where $J$ is the Jacobian matrix; the damping coefficient $\lambda$ is a positive number that changes according to the effect of the solution on the pose error; $I$ is the identity matrix; $\Delta x$ is the difference between the desired and current joint positions; $\Delta q$ is the vector of angle corrections of all corresponding joints on the kinematic chain; and $W$ is a weight matrix obtained from the joint-angle range and the current angle $\theta_i$ of joint $i$ as

$$w_i = 1 + \left| \frac{\partial H(\theta)}{\partial \theta_i} \right|; \qquad \frac{\partial H(\theta)}{\partial \theta_i} = \frac{(\theta_{i,\max} - \theta_{i,\min})^{2}\,(2\theta_i - \theta_{i,\max} - \theta_{i,\min})}{4\,(\theta_{i,\max} - \theta_i)^{2}\,(\theta_i - \theta_{i,\min})^{2}}.$$
7. The Kinect-based human-behavior imitation method for a humanoid robot according to claim 1, characterized in that in step 5 the robot performs quasi-static motion during imitation so that the vertical projection of the center of mass on the ground approximates the zero-moment point, the robot being judged stable when the zero-moment point lies within the support polygon; the center of mass is computed as

$$P_{CoM} = \frac{\sum_{i=0}^{N} m_i p_i}{\sum_{i=0}^{N} m_i};$$

where $p_i$ is the centroid position of robot link $i$ in its local coordinate system and $m_i$ is the mass of link $i$.
8. The Kinect-based human-behavior imitation method for a humanoid robot according to claim 1, characterized in that in step 6 the body parts checked for self-collision include the torso and the hands.
9. The Kinect-based human-behavior imitation method for a humanoid robot according to claim 1, characterized in that in step 7 the robot's action execution speed is slowed down during configuration when leg motion is detected, so that the robot approximates quasi-static motion.
CN201610480222.7A 2016-06-27 2016-06-27 Kinect-based human-behavior imitation method for a humanoid robot Active CN106078752B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610480222.7A CN106078752B (en) 2016-06-27 2016-06-27 Kinect-based human-behavior imitation method for a humanoid robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610480222.7A CN106078752B (en) 2016-06-27 2016-06-27 Kinect-based human-behavior imitation method for a humanoid robot

Publications (2)

Publication Number Publication Date
CN106078752A true CN106078752A (en) 2016-11-09
CN106078752B CN106078752B (en) 2019-03-19

Family

ID=57252883

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610480222.7A Active CN106078752B (en) 2016-06-27 2016-06-27 Kinect-based human-behavior imitation method for a humanoid robot

Country Status (1)

Country Link
CN (1) CN106078752B (en)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050049752A1 (en) * 2003-08-28 2005-03-03 Sony Corporation Robot apparatus, control method for robot apparatus, and toy for robot apparatus
US20080244468A1 (en) * 2006-07-13 2008-10-02 Nishihara H Keith Gesture Recognition Interface System with Vertical Display
CN102708582A (en) * 2012-05-08 2012-10-03 电子科技大学 Character movement redirecting method for heterogeneous topology
CN104777775A (en) * 2015-03-25 2015-07-15 北京工业大学 Two-wheeled self-balancing robot control method based on Kinect device
CN104794722A (en) * 2015-04-30 2015-07-22 浙江大学 Dressed human body three-dimensional bare body model calculation method through single Kinect
CN104932254A (en) * 2015-05-12 2015-09-23 北京理工大学 Humanoid robot front-fall protection control strategy
CN105137973A (en) * 2015-08-21 2015-12-09 华南理工大学 Method for robot to intelligently avoid human under man-machine cooperation scene

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
于超 (YU Chao): "Research and Development of a Somatosensory-Based NAO Robot Display System", China Master's Theses Full-Text Database, Information Science and Technology *

Cited By (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106600000A (en) * 2016-12-05 2017-04-26 中国科学院计算技术研究所 Method and system for human-robot motion data mapping
CN106971050B (en) * 2017-04-18 2020-04-28 华南理工大学 Kinect-based Darwin robot joint mapping analysis method
CN106971050A (en) * 2017-04-18 2017-07-21 华南理工大学 A kind of Darwin joint of robot Mapping Resolution methods based on Kinect
CN107283386A (en) * 2017-05-27 2017-10-24 江苏物联网研究发展中心 Man-machine synchronous method
CN107225573A (en) * 2017-07-05 2017-10-03 上海未来伙伴机器人有限公司 The method of controlling operation and device of robot
CN107127760A (en) * 2017-07-12 2017-09-05 清华大学 A kind of track combined anthropomorphic robot of foot
CN107443396A (en) * 2017-08-25 2017-12-08 魔咖智能科技(常州)有限公司 A kind of intelligence for imitating human action in real time accompanies robot
CN107953331B (en) * 2017-10-17 2019-12-10 华南理工大学 human body posture mapping method applied to humanoid robot action simulation
CN107953331A (en) * 2017-10-17 2018-04-24 华南理工大学 A kind of human body attitude mapping method applied to anthropomorphic robot action imitation
CN107598897A (en) * 2017-10-19 2018-01-19 北京工业大学 A kind of method of humanoid robot gait's planning based on human body teaching
CN107598897B (en) * 2017-10-19 2020-11-27 北京工业大学 Humanoid robot gait planning method based on human body teaching
CN107932510A (en) * 2017-11-28 2018-04-20 中国人民解放军陆军工程大学 NAO robot systems based on action collection
CN108284436B (en) * 2018-03-17 2020-09-25 北京工业大学 Remote mechanical double-arm system with simulation learning mechanism and method
CN108284436A (en) * 2018-03-17 2018-07-17 北京工业大学 Remote mechanical dual arm system and method with learning by imitation mechanism
CN108908353A (en) * 2018-06-11 2018-11-30 安庆师范大学 Robot expression based on the reverse mechanical model of smoothness constraint imitates method and device
CN108908353B (en) * 2018-06-11 2021-08-13 安庆师范大学 Robot expression simulation method and device based on smooth constraint reverse mechanical model
CN109015631A (en) * 2018-07-03 2018-12-18 南京邮电大学 The method that anthropomorphic robot based on more working chains imitates human motion in real time
CN109079794B (en) * 2018-09-18 2020-12-22 广东省智能制造研究所 Robot control and teaching method based on human body posture following
CN109079794A (en) * 2018-09-18 2018-12-25 广东省智能制造研究所 It is a kind of followed based on human body attitude robot control and teaching method
CN109840923A (en) * 2019-01-22 2019-06-04 绍兴文理学院 The method for obtaining orientative feature based on Robot dancing posture mirror image subgraph
CN109821243A (en) * 2019-01-25 2019-05-31 丰羽教育科技(上海)有限公司 A method of simulation reappears shooting process
CN109830078B (en) * 2019-03-05 2021-03-30 智慧眼科技股份有限公司 Intelligent behavior analysis method and intelligent behavior analysis equipment suitable for narrow space
CN109830078A (en) * 2019-03-05 2019-05-31 北京智慧眼科技股份有限公司 Intelligent behavior analysis method and intelligent behavior analytical equipment suitable for small space
WO2020215213A1 (en) * 2019-04-23 2020-10-29 西门子股份公司 Multi-axis motion controller, multi-axis motion control method and system
CN113396032A (en) * 2019-04-23 2021-09-14 西门子股份公司 Multi-axis motion controller, multi-axis motion control method and system
CN110135303B (en) * 2019-04-30 2022-09-13 西安理工大学 Dance non-genetic bearing and interactive learning method
CN110135303A (en) * 2019-04-30 2019-08-16 西安理工大学 The method with interactive learning is held in a kind of non-heredity of dancing class
US20220226996A1 (en) * 2019-06-17 2022-07-21 Sony Interactive Entertainment Inc. Robot control system
CN110480634A (en) * 2019-08-08 2019-11-22 北京科技大学 A kind of arm guided-moving control method for manipulator motion control
CN111208783A (en) * 2019-12-30 2020-05-29 深圳市优必选科技股份有限公司 Action simulation method, device, terminal and computer storage medium
CN111208783B (en) * 2019-12-30 2021-09-17 深圳市优必选科技股份有限公司 Action simulation method, device, terminal and computer storage medium
CN111113429B (en) * 2019-12-31 2021-06-25 深圳市优必选科技股份有限公司 Action simulation method, action simulation device and terminal equipment
CN111113429A (en) * 2019-12-31 2020-05-08 深圳市优必选科技股份有限公司 Action simulation method, action simulation device and terminal equipment
CN111300421A (en) * 2020-03-17 2020-06-19 北京理工大学 Mapping method applied to simulation of actions of both hands of humanoid robot
CN112044013B (en) * 2020-09-18 2021-11-23 宿州赛尔沃德物联网科技有限公司 Robot fire rescue system
CN112044013A (en) * 2020-09-18 2020-12-08 宿州赛尔沃德物联网科技有限公司 Robot fire rescue system
WO2022134702A1 (en) * 2020-12-24 2022-06-30 达闼机器人股份有限公司 Action learning method and apparatus, storage medium, and electronic device
CN112847336A (en) * 2020-12-24 2021-05-28 达闼机器人有限公司 Action learning method, action learning device, storage medium and electronic equipment
CN112847336B (en) * 2020-12-24 2023-08-22 达闼机器人股份有限公司 Action learning method and device, storage medium and electronic equipment
CN112580582A (en) * 2020-12-28 2021-03-30 达闼机器人有限公司 Action learning method, action learning device, action learning medium and electronic equipment
CN112959330A (en) * 2021-02-02 2021-06-15 浙江大学 Robot double-arm motion man-machine corresponding device and method based on master-slave dynamic motion elements
CN112894828B (en) * 2021-03-02 2022-05-20 乐聚(深圳)机器人技术有限公司 Robot motion simulation method, device, equipment and storage medium
CN112894828A (en) * 2021-03-02 2021-06-04 乐聚(深圳)机器人技术有限公司 Robot motion simulation method, device, equipment and storage medium
CN113492404A (en) * 2021-04-21 2021-10-12 北京科技大学 Humanoid robot action mapping control method based on machine vision
CN113492404B (en) * 2021-04-21 2022-09-30 北京科技大学 Humanoid robot action mapping control method based on machine vision

Also Published As

Publication number Publication date
CN106078752B (en) 2019-03-19

Similar Documents

Publication Publication Date Title
CN106078752A (en) Method is imitated in a kind of anthropomorphic robot human body behavior based on Kinect
CN106313049B (en) A kind of apery mechanical arm motion sensing control system and control method
CN103440037B (en) Real-time interaction virtual human body motion control method based on limited input information
CN106607910B (en) A kind of robot imitates method in real time
CN106980385A (en) A kind of Virtual assemble device, system and method
CN109202901A (en) A kind of biped robot's stair climbing gait planning method, apparatus and robot
CN107598897A (en) A kind of method of humanoid robot gait's planning based on human body teaching
CN107050763B (en) Novel ankle joint rehabilitation robot and control method thereof
CN107818318B (en) Humanoid robot simulation similarity evaluation method
Wang et al. A real-time human imitation system
CN107330967A (en) Knight's athletic posture based on inertia sensing technology is caught and three-dimensional reconstruction system
CN109333506A (en) A kind of humanoid intelligent robot system
CN109483534A (en) A kind of grasping body methods, devices and systems
CN109086466A (en) Single leg multiaxis biped robot kinematics joint simulation method
CN101246601A (en) Three-dimensional virtual human body movement generation method based on key frame and space-time restriction
CN107529630A (en) A kind of method that robot for space establishes kinetic model
CN109079794A (en) It is a kind of followed based on human body attitude robot control and teaching method
Vukobratovic When were active exoskeletons actually born?
Barros et al. Bimanual haptics for humanoid robot teleoperation using ROS and V-REP
CN107529498A (en) A kind of method that robot for space arrests noncooperative target
CN111300408A (en) Humanoid double-arm robot motion planning control method combining shape similarity and expression similarity
Boutin et al. From human motion capture to humanoid locomotion imitation Application to the robots HRP-2 and HOAP-3
Boutin et al. An auto-adaptable algorithm to generate human-like locomotion for different humanoid robots based on motion capture data
Yoo et al. Recent progress and development of the humanoid robot HanSaRam
CN110694286B (en) Method for simulating palm puppet performance by using mechanical arm

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20210111

Address after: Building 28 and 29, Tian'an Digital City, No.88 Chunyang Road, Chengyang District, Qingdao City, Shandong Province

Patentee after: Qingdao Institute of Computing Technology, Xi'an University of Electronic Science and Technology

Address before: No.2, Taibai South Road, Yanta District, Xi'an City, Shaanxi Province

Patentee before: XIDIAN University