CN106078752B - Kinect-based human-behavior imitation method for humanoid robots - Google Patents

Kinect-based human-behavior imitation method for humanoid robots

Info

Publication number
CN106078752B
CN106078752B (application CN201610480222.7A)
Authority
CN
China
Prior art keywords
robot
joint
movement
human
kinect
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610480222.7A
Other languages
Chinese (zh)
Other versions
CN106078752A (en)
Inventor
朱光明
张亮
宋娟
沈沛意
程志浩
刘宇飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Institute of Computing Technology, Xi'an University of Electronic Science and Technology
Original Assignee
Xidian University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xidian University
Priority to CN201610480222.7A
Publication of CN106078752A
Application granted
Publication of CN106078752B
Status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/42 Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine
    • G05B19/4202 Recording and playback systems: preparation of the programme medium using a drawing, a model
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/1605 Simulation of manipulator lay-out, design, modelling of manipulator
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1671 Programme controls characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a Kinect-based human-behavior imitation method for humanoid robots, comprising the following steps: 1) extract human skeleton information from the RGB-D images captured by Kinect and construct a motion model of the human body; 2) construct a motion model of the humanoid robot according to the structural features of the robot's own limbs; 3) convert the captured human actions into actions in the robot motion model through a mapping model; 4) adjust the robot's joint angles with a balance-control technique so that the robot remains stable during imitation; 5) after self-collision avoidance processing of the robot's state, send the final joint-configuration data to the robot to perform the imitated action. By using a mapping technique based on eight kinematic chains, the humanoid robot imitates as many human joint positions as possible; the imitated action is similar to the original and can be executed online in real time. The method can be used for humanoid robots to learn human actions, for remote control, and so on.

Description

A Kinect-based human-behavior imitation method for humanoid robots
Technical field
The present invention relates to the technical field of humanoid-robot motion control, and specifically to a Kinect-based human-behavior imitation method for humanoid robots.
Background technique
A humanoid robot is a robot whose outer appearance resembles a human being; it can work in the environments people live in and can use human tools. The earliest humanoid robot in the world was WABOT-1, completed in 1973 by the laboratory of Ichiro Kato at Waseda University. After more than forty years of development, humanoid robots are now widely used in many fields. At present, humanoid robotics is entering a new research stage: combining humanoid robots with artificial intelligence will give them the ability to think, so that they can blend into human life better and serve and work for people.
As a popular research field, humanoid robotics has shown unique appeal in human society. The ASIMO robot developed by Honda of Japan has performed jumping and running at exhibitions and poured drinks for guests, demonstrating the strong service capability of humanoid robots. The mobile-phone robot developed by Sharp of Japan is a miniature humanoid robot that, besides offering the functions of an ordinary smartphone, can also walk and dance; this robot phone has been put on the market and is warmly welcomed. Humanoid-robot applications have therefore begun to reach the market and are bound to spread into people's daily lives in the future.
In humanoid-robot research, combining artificial intelligence with humanoid robots is the key to letting robots join human life. However, artificial intelligence is still immature, and most artificial-intelligence products target specific domains, so humanoid robots cannot yet act with full autonomy. Given the present situation, robots can, on the one hand, perform simple repetitive mechanical actions; assembly-line work, for example, can be handed over to robots. On the other hand, humans can teleoperate robots to complete dangerous or difficult tasks in their place. Because robot intelligence is low, a robot cannot make correct judgments about the current situation by itself and can only act on instructions. If a humanoid robot is given the ability to imitate, so that it follows a person and performs similar actions, the range of application of humanoid robots can be widened greatly and their service capability improved.
Imitation technology acquires human motion data with a motion-capture device and uses an imitation algorithm to map human motion onto humanoid-robot motion. Imitation lets the robot make motions similar to a human's while remaining stable, so a person can remotely control the robot to do dangerous work such as fire rescue or grasping hazardous goods. Based on action learning from humans, a humanoid robot can also learn basic actions such as sign language and afterwards communicate with deaf-mute people in sign language. In addition, an action-learning robot can record human dances and reproduce the dance movements in robot form, which is a good way to preserve intangible cultural heritage such as the dances of some small ethnic groups that are about to disappear.
However, most known imitation algorithms either work offline or handle only the upper limbs; although a few whole-body imitation algorithms exist, they imitate so few body parts that the result is unsatisfactory. In view of these problems, the present invention proposes a real-time whole-body imitation algorithm that imitates as many body parts as possible in order to improve the imitation quality.
Summary of the invention
The purpose of the present invention is to provide a Kinect-based human-behavior imitation method for humanoid robots that builds a human motion model of eight kinematic chains from the human skeleton information extracted from the RGB-D images captured by Kinect, and that maps the motion into the robot motion model through a mapping model and a balance-control mechanism so that the robot reproduces the imitated action in real time, thereby solving the problems mentioned in the background section.
To achieve the above object, the present invention provides the following technical scheme:
A Kinect-based human-behavior imitation method for humanoid robots builds a human motion model of eight kinematic chains from the human skeleton information extracted from the RGB-D images captured by Kinect, and maps the motion into the robot motion model for real-time reproduction through a mapping model and a balance-control mechanism, comprising the following steps:
1) Establish three modules: action-data acquisition, action-data processing, and joint configuration. The acquisition module builds the abstract model of the human action; the processing module maps the human action onto the robot and optimizes the robot's motion; the joint-configuration module sends the processed joint data to the robot to execute the imitated action.
2) Action-data acquisition: extract the human motion model frame by frame from the RGB-D images captured by Kinect; every frame represents the currently captured human behavior with the abstract model.
3) Action-data processing: apply mapping-model processing, balance-control processing, and self-collision avoidance to every captured frame, and finally obtain the joint-angle configuration of the imitation posture for the humanoid robot.
4) Mapping-model processing: transform the eight kinematic-chain vectors of the human motion model, frame by frame, into the representation of the robot motion model; use the mapped vectors to compute the position of each end-effector of the robot, and then solve the joint configuration on all robot kinematic chains with inverse kinematics.
5) Balance control: compute the robot's center-of-gravity position for every frame and judge whether its vertical projection lies within the support polygon to determine the robot's balance state; when the robot is judged unbalanced, modify the lower-limb joint configuration so that the vertical projection of the center of gravity falls back within the support polygon.
6) Self-collision avoidance: represent the space occupied by the torso with cubic cells, and use forward and inverse kinematics to judge whether a robot end-effector collides with the body; when a self-collision is detected, adjust the upper-limb angles to move the end-effector away from the torso and check the collision state again.
7) Joint configuration: judge whether the robot's lower limbs move; when the upper limbs move but the lower limbs do not, speed up the action and shorten the configuration time to increase the robot's execution speed.
As a further scheme of the invention: the real-time imitation runs online on a continuous image sequence in which the start and end times of the human behavior are unknown.
As a further scheme of the invention: in step 1, every captured frame executes the joint-configuration process after its action-data processing finishes.
As a further scheme of the invention: in step 2, the human motion model extracted from Kinect is represented by eight kinematic chains, consisting of the two forearms and two upper arms of the upper limbs and the two thighs and two lower legs of the lower limbs; in the abstract model each kinematic chain is represented by a vector, and the posture is expressed as:
In the formula, each vector from a joint From to a joint To represents the kinematic chain between those two joints.
As a further scheme of the invention: in step 3, the processing module applies action mapping, balance control, and self-collision avoidance to every frame in that order.
As a further scheme of the invention: in step 4, the mapping is performed by fitting the unit vectors of the human kinematic chains to the unit vectors of the corresponding robot kinematic chains, and the joint-angle corrections on a kinematic chain are computed with inverse kinematics as:
Δq = W⁻¹Jᵀ(JW⁻¹Jᵀ + λI)⁻¹Δx;
In the formula, J is the Jacobian matrix; the damping coefficient λ is a positive number that is changed according to how the solution affects the pose error; I is the identity matrix; Δx is the difference between the desired and current position; Δq contains the angle corrections of all corresponding joints on the kinematic chain; and W is a weight matrix obtained from the joint-angle ranges and the current angle θi of each joint i.
As a further scheme of the invention: in step 5, the robot performs quasi-static motion during imitation, so that the vertical projection of the center of gravity on the ground is taken as the zero-moment point; when the zero-moment point lies inside the support polygon, the robot is judged stable. The center of gravity is computed as:
In the formula, pi is the centroid position of robot link i in its local coordinate system and mi is the mass of link i.
As a further scheme of the invention: in step 6, the self-collision check covers the torso and the hands.
As a further scheme of the invention: in step 7, when the lower limbs are detected to move, the robot's execution speed is lowered so that the robot achieves quasi-static motion.
Compared with the prior art, the beneficial effects of the present invention are as follows. The method needs no calibration between the robot's behavior and the person's before imitation, and the mapping model, which represents motion with eight kinematic chains, lets the robot show imitated actions highly similar to the human ones. The method keeps the line between the robot's center of gravity and the support foot approximately perpendicular to the ground to control balance, ensuring that the robot imitates stably in real time. Throughout the imitation process the action-data acquisition, action-data processing, and joint-configuration modules all run in parallel and communicate by reading and writing buffers, so execution efficiency is high and online real-time execution is achievable. In summary, the method builds a human motion model of eight kinematic chains from the human skeleton information extracted from the RGB-D images captured by Kinect, and maps the motion into the robot motion model through a mapping model and a balance-control mechanism so that the robot reproduces the imitated action in real time.
Detailed description of the invention
Fig. 1 is the workflow diagram of the present invention.
Fig. 2 is a schematic diagram of the human motion model in the present invention.
Fig. 3 is a schematic diagram of the robot motion model in the present invention.
Fig. 4 is a schematic diagram of support-mode switching in the present invention.
Fig. 5 is a schematic diagram of the support polygon in the present invention.
Fig. 6 is a schematic diagram of the robot's single-foot support mode in the present invention.
Fig. 7 is a schematic diagram of the robot's double-foot support mode in the present invention.
Fig. 8 is a side view of the robot's double-foot support mode in the present invention.
Fig. 9 is the system architecture diagram of the present invention.
Specific embodiment
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art from the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Referring to Figs. 1 to 9, in an embodiment of the present invention, a Kinect-based human-behavior imitation method for humanoid robots, whose algorithm flow is shown in Fig. 1, implements the construction of the human motion model, the construction of the robot motion model, action mapping, balance control, and self-collision avoidance through the following steps:
A) Construct the human motion model
The human motion model establishes its coordinate system at the center of the human torso captured by Kinect. The human motion model extracted from the RGB-D image is shown in Fig. 2; represented by eight kinematic chains, the model can express an arbitrary human behavior, and the posture is expressed as:
In the formula, each vector from a joint From to a joint To represents the kinematic chain between those two joints.
The depth data of each joint in the Kinect coordinate system is transformed into the human-torso coordinate system as follows: for any point in the torso coordinate system, the coordinates of all joints in the Kinect coordinate system are known, including the position and orientation of the torso; if the coordinates of the same point in the Kinect coordinate system are given, the two representations satisfy the following relationship:
The point in the torso coordinate system is then solved according to the following calculation:
In the formula, the inverse of the rotation matrix equals its transpose: because every rotation matrix is orthogonal, R satisfies Rᵀ = R⁻¹.
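As a concrete illustration, the change of frame described above can be written in a few lines of NumPy. A minimal sketch, assuming the torso pose is given as a position vector plus a 3x3 rotation matrix; the function and variable names are ours, and the example pose (a 90-degree rotation about the vertical axis) is purely illustrative:

```python
import numpy as np

def to_torso_frame(p_kinect, torso_pos, torso_rot):
    """Express a joint position measured in the Kinect frame in the
    human-torso coordinate system. torso_rot is the 3x3 rotation of the
    torso in the Kinect frame; rotation matrices are orthogonal, so the
    inverse is simply the transpose (R^T = R^-1)."""
    return torso_rot.T @ (np.asarray(p_kinect, float) - np.asarray(torso_pos, float))

# Illustrative pose: torso at (1, 2, 0.5), rotated 90 degrees about z.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
p_local = to_torso_frame([1.0, 3.0, 0.5], [1.0, 2.0, 0.5], R)
```

Applying this to every joint of a frame yields the torso-centered coordinates from which the eight chain vectors are formed.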
B) Construct the robot motion model
The humanoid robot establishes a coordinate system at its torso center according to its own structural features, and its motion model is likewise represented by eight kinematic chains, as shown in Fig. 3; the robot posture can also be expressed as:
In the formula, each vector from a joint From to a joint To represents the kinematic chain between those two joints;
C) Action mapping
Based on the two motion models, the conversion from human behavior to robot behavior is completed by fitting the unit vectors of the corresponding kinematic chains of the robot and human motion models. The robot joint coordinates after mapping are computed as:
In the formula, uLINK(LShoulder).p is a constant that gives the three-dimensional position of the shoulder in the robot torso coordinate system, since the shoulder is fixed relative to the torso center; joint LShoulder is short for Left Shoulder, the left shoulder joint; the robot's upper-arm length is likewise a constant; LElbow is short for Left Elbow, the left elbow joint; the remaining vector is the vector representation of the human kinematic chain; and || · || is the Euclidean norm;
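The per-chain mapping can be sketched as follows: the direction of the human segment is kept as a unit vector and rescaled with the robot's own link length, so the two bodies need not share proportions. The lengths and coordinates below are illustrative, not taken from the patent:

```python
import numpy as np

def map_chain_endpoint(parent_pos_robot, human_from, human_to, link_len):
    """Map one human kinematic-chain segment onto the robot: keep the
    human segment's unit direction, scale it by the robot's link length,
    and offset it from the already-known parent joint position."""
    v = np.asarray(human_to, float) - np.asarray(human_from, float)
    u = v / np.linalg.norm(v)              # unit vector of the human chain
    return np.asarray(parent_pos_robot, float) + link_len * u

# Human upper arm pointing straight down; robot upper arm 0.09 m long.
shoulder_robot = np.array([0.0, 0.098, 0.1])   # fixed offset from torso center
elbow_robot = map_chain_endpoint(shoulder_robot,
                                 human_from=[0.0, 0.2, 0.0],
                                 human_to=[0.0, -0.1, 0.0],
                                 link_len=0.09)
```

The real method applies the same scaling to all eight chains, walking outward from the torso so each parent joint position is known before its child is mapped.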
After the three-dimensional positions of the robot joints have been mapped, the joint angles of the corresponding kinematic chain are computed from the joint positions with the formula:
Δq = W⁻¹Jᵀ(JW⁻¹Jᵀ + λI)⁻¹Δx;
In the formula, J is the Jacobian matrix; the damping coefficient λ is a positive number that is changed according to how the solution affects the pose error; I is the identity matrix; Δx is the difference between the desired and current position; Δq contains the angle corrections of all corresponding joints on the kinematic chain; and W is a weight matrix obtained from the joint-angle ranges and the current angle θi of each joint i.
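The damped-least-squares update above translates directly into NumPy. A sketch under one stated assumption: the weight construction shown is a common limit-avoidance heuristic standing in for the patent's W formula, which is not reproduced in the text:

```python
import numpy as np

def dls_step(J, dx, joint_angles, joint_ranges, lam=0.05):
    """One damped-least-squares IK update,
        dq = W^-1 J^T (J W^-1 J^T + lam * I)^-1 dx.
    The diagonal weight W grows as a joint approaches its limits, so the
    solver prefers to move joints that still have room (an illustrative
    choice, not the patent's exact W)."""
    w = np.array([1.0 + abs(q) / (hi - lo)
                  for q, (lo, hi) in zip(joint_angles, joint_ranges)])
    W_inv = np.diag(1.0 / w)
    JW = J @ W_inv
    A = JW @ J.T + lam * np.eye(J.shape[0])
    return W_inv @ J.T @ np.linalg.solve(A, dx)

# Toy example: identity Jacobian, joints centered in their ranges.
dq = dls_step(np.eye(2), np.array([0.1, 0.0]),
              [0.0, 0.0], [(-1.0, 1.0), (-1.0, 1.0)])
```

The damping term λI keeps the inversion well conditioned near singular poses, which is why the method tolerates Kinect noise better than an undamped pseudo-inverse.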
D) Balance control
During imitation the robot performs quasi-static motion, so the projection of the center of gravity on the ground approximates the zero-moment point; as long as the zero-moment point stays inside the support polygon, the robot remains stable. The center of gravity is computed as:
In the formula, pi is the centroid position of robot link i in its local coordinate system and mi is the mass of link i;
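The center-of-gravity formula is the mass-weighted average of the link centroids and can be sketched as follows (in practice the centroids would first be brought into a common frame with forward kinematics):

```python
import numpy as np

def center_of_gravity(link_centroids, link_masses):
    """Mass-weighted average of the per-link centroid positions:
    CoG = sum(m_i * p_i) / sum(m_i)."""
    p = np.asarray(link_centroids, float)
    m = np.asarray(link_masses, float)
    return (m[:, None] * p).sum(axis=0) / m.sum()

# Two-link toy body: a light link at the origin, a heavier one 1 m up.
cog = center_of_gravity([[0.0, 0.0, 0.0], [0.0, 0.0, 1.0]], [1.0, 3.0])
```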
The support-mode switching is illustrated in Fig. 4. The robot switches from double-foot to single-foot support only when the center of gravity has moved over the support foot and the height difference between the free foot and the support foot exceeds a certain level; likewise, it switches from single-foot back to double-foot support only when the center of gravity leaves the support foot and the height difference between the free foot and the support foot falls below a certain value. The switching conditions are expressed as:
In the formula, Torso is the three-dimensional coordinate of the human torso center captured by Kinect, used to approximate the position of the human center of gravity; LAnkle and RAnkle, short for Left Ankle and Right Ankle, are the Kinect coordinates of the human left and right ankle joints; OFFSET and MIN_HEIGHT are preset thresholds that suppress the influence of Kinect measurement fluctuations;
The meanings of the joint abbreviations in Fig. 4 are listed in the following table:
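A sketch of the switching logic with its hysteresis. The patent only states that OFFSET and MIN_HEIGHT are preset thresholds absorbing Kinect jitter, so the numeric values and the flat (x, y) joint layout below are assumptions:

```python
def support_mode(torso, l_ankle, r_ankle, mode, OFFSET=0.05, MIN_HEIGHT=0.04):
    """Switch between 'double' and 'single' foot support. Each argument
    is an (x, y) pair with x lateral and y vertical; torso approximates
    the human center of gravity."""
    lift = l_ankle[1] - r_ankle[1]        # height difference between feet
    if mode == 'double':
        # Single support requires one foot clearly lifted AND the CoG
        # already over the foot that stays on the ground.
        if abs(lift) > MIN_HEIGHT:
            support = r_ankle if lift > 0 else l_ankle
            if abs(torso[0] - support[0]) < OFFSET:
                return 'single'
        return 'double'
    # Back to double support once both feet are at nearly the same height.
    return 'double' if abs(lift) < MIN_HEIGHT else 'single'
```

Requiring both conditions before switching prevents a single noisy frame from flipping the support mode back and forth.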
The mathematical model of the support polygon is shown in Fig. 5. A plane coordinate system is established at the center of one foot, and the six boundaries l1, l2, l3, l4, l5, and l6 of the support polygon are represented in the form y = ax + b; given the three-dimensional coordinates of the robot's center of gravity, whether its vertical projection on the ground lies within the support polygon determines the robot's current balance state;
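Because every boundary is stored as a line y = ax + b plus the side on which the interior lies, the balance test reduces to a handful of half-plane checks. A sketch; the triangular example region is illustrative, and truly vertical edges would need a different parameterization:

```python
def inside_support_polygon(x, y, edges):
    """edges: (a, b, interior_below) triples describing boundary lines
    y = a*x + b, mirroring the patent's support-polygon model. Returns
    True when the CoG ground projection (x, y) lies on the interior
    side of every boundary."""
    for a, b, interior_below in edges:
        boundary = a * x + b
        if interior_below and y > boundary:
            return False
        if not interior_below and y < boundary:
            return False
    return True

# Triangular toy region with vertices (-1, 0), (1, 0), (0, 1).
edges = [(0.0, 0.0, False),   # interior satisfies y >= 0
         (-1.0, 1.0, True),   # interior satisfies y <= -x + 1
         (1.0, 1.0, True)]    # interior satisfies y <=  x + 1
```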
The single-foot support mode of the robot is shown in Fig. 6. The vector [0, 0, 1]ᵀ is first rotated by angle α about the roll axis and then by angle β about the pitch axis to obtain the vector Ankle-CoM, the line from the support ankle joint (Ankle) to the center of mass (CoM, for Center of Mass). Note that the pitch axis here is defined in the Ankle local coordinate system rather than the world coordinate system, because the roll axis is the parent link of the pitch axis relative to the ground. The angles α and β are solved by the space-vector method with the following formulas:
In the formulas, || · || is the Euclidean norm;
The ankle-joint angles are corrected repeatedly with α and β; the repetition is necessary because adjusting the ankle angles also moves the center of gravity. The present invention obtains the final ankle configuration with this cyclic-approximation approach: the loop ends when the maximum number of iterations is reached or the projected position of the center of gravity meets the requirement. Although this method resembles general numerical approximation methods such as least squares, it converges faster, because the first few adjustments already bring the three-dimensional position of the center of gravity close to the required condition without many iterations; the idea is simply to make Ankle-CoM perpendicular to the ground, which helps the center of gravity fall within the support polygon.
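Under an assumed axis convention (x forward, y left, z up; the patent's own formulas did not survive extraction), the two tilt angles can be recovered from the ankle-to-CoM direction as below; in the cyclic approximation the ankle would be corrected by (-α, -β) and the CoM recomputed each round:

```python
import math

def ankle_tilt(ankle, com):
    """Angles (alpha, beta) such that rotating [0, 0, 1] by alpha about
    the roll (x) axis and then by beta about the pitch (y) axis gives
    the ankle-to-CoM direction. The rotated vector is
    (sin(beta)cos(alpha), -sin(alpha), cos(beta)cos(alpha)),
    so both angles can be read off the normalized direction."""
    dx = com[0] - ankle[0]
    dy = com[1] - ankle[1]
    dz = com[2] - ankle[2]
    n = math.sqrt(dx * dx + dy * dy + dz * dz)
    dx, dy, dz = dx / n, dy / n, dz / n
    alpha = -math.asin(dy)        # roll about x
    beta = math.atan2(dx, dz)     # pitch about y, in the ankle frame
    return alpha, beta

# CoM directly above the ankle: no correction needed.
a0, b0 = ankle_tilt((0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
# CoM leaning 30 degrees sideways.
a1, b1 = ankle_tilt((0.0, 0.0, 0.0), (0.0, 0.5, math.sqrt(0.75)))
```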
The double-foot support mode of the robot is shown in Figs. 7 and 8. With the ankle joints already adjusted, the robot is in the double-foot support model; the ideal adjusted posture has Ankle-Hip lying in the pitch-yaw plane, where Hip is the robot's hip joint. AB is the rotation axis of joint Hip and is orthogonal to the vector Ankle-Hip; γ is the angle between the pitch-yaw plane and the plane AB-CoM formed by the vectors AB and Hip-CoM. The angle of joint Hip is updated with γ; when the vector Ankle-CoM is parallel to the pitch-yaw plane, the robot is stable. The double-foot support mode is computed as:
In the formula, the normal vector of plane AB-CoM is used: the adjustment angle γ is obtained by computing the angle between the normal vectors of plane AB-CoM and of the yaw-pitch plane; || · || is the Euclidean norm;
E) Self-collision avoidance
Self-collision analysis follows balance control and is realized with forward and inverse kinematics. First, the robot parts where self-collision may occur are abstracted into simple solids, either built from cubic cells or modeled as simple cylinders. Given the three-dimensional position of the robot hand in the torso coordinate system, a self-collision is detected by judging whether this point falls within the abstracted collision cylinder. When a self-collision occurs, the end-effector position is moved a certain distance away from the cylinder axis, a new joint configuration is then solved with inverse kinematics, and finally forward kinematics updates the robot's pose.
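A minimal sketch of the cylinder test and the push-out step; the torso dimensions and clearance are illustrative stand-ins, since the patent gives no concrete radii or heights:

```python
import math

def hand_hits_torso(hand, radius=0.08, half_height=0.12):
    """Collision test against a vertical cylinder centered at the torso
    origin. hand is the end-effector position in the torso frame."""
    x, y, z = hand
    return math.hypot(x, y) < radius and abs(z) < half_height

def push_out(hand, clearance=0.10):
    """Move a colliding hand radially away from the cylinder axis; a new
    IK solve on the returned target then yields collision-free joints."""
    x, y, z = hand
    r = math.hypot(x, y)
    if r == 0.0:                      # hand exactly on the axis
        return (clearance, 0.0, z)
    s = clearance / r
    return (x * s, y * s, z)

hand = (0.03, 0.04, 0.0)              # inside the torso cylinder
safe = push_out(hand) if hand_hits_torso(hand) else hand
```

The returned target is re-checked, matching the re-judging of the collision state described above.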
Fig. 9 illustrates the architecture of the imitation system, which consists of three modules: the acquisition module, the data-processing module, and the configuration module. The acquisition module builds the human motion model from the key-joint coordinates obtained from Kinect; it also controls the capture frame rate and filters out bad frames. The processing module works in three steps: first, a coordinate system with the torso as origin is established from the joint coordinates and the eight-chain human motion model is constructed; then, the unit vector of every robot kinematic chain is fitted to the unit vector of the corresponding human chain to achieve the imitation; finally, the balance-control and self-collision-avoidance mechanisms are applied. To shorten the processing time, the inverse-kinematics (IK) and forward-kinematics (FK) computations for the upper-limb and lower-limb joints are processed in parallel, and the acquisition, processing, and configuration modules also run simultaneously to improve system performance.
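The three parallel modules communicating through buffers can be sketched with standard queues and threads; the stage bodies below are placeholders standing in for skeleton capture, the mapping/balance/collision steps, and the robot command interface:

```python
import queue
import threading

# Three-stage pipeline mirroring Fig. 9: acquisition -> processing ->
# configuration, coupled only through bounded buffers.
capture_q = queue.Queue(maxsize=8)
config_q = queue.Queue(maxsize=8)

def acquisition(frames):
    for f in frames:
        capture_q.put(f)                  # one human motion model per frame
    capture_q.put(None)                   # end-of-stream marker

def processing():
    while (f := capture_q.get()) is not None:
        config_q.put(('joints', f))       # stands in for the three steps
    config_q.put(None)

def configuration(sink):
    while (c := config_q.get()) is not None:
        sink.append(c)                    # would be sent to the robot here

sent = []
threads = [threading.Thread(target=acquisition, args=(range(3),)),
           threading.Thread(target=processing),
           threading.Thread(target=configuration, args=(sent,))]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

The bounded buffers give the same decoupling the text describes: a slow stage back-pressures its producer instead of dropping or duplicating frames.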
The keys to the human-behavior imitation of the present invention are the action mapping and the balance control; together they guarantee that the robot imitates human behavior stably and efficiently.
It is obvious to those skilled in the art that the invention is not limited to the details of the above exemplary embodiments, and that the present invention can be realized in other specific forms without departing from its spirit or essential attributes. The embodiments should therefore be regarded in all respects as illustrative and not restrictive; the scope of the invention is defined by the appended claims rather than by the above description, and all changes that fall within the meaning and range of equivalency of the claims are intended to be embraced therein. Any reference signs in the claims shall not be construed as limiting the claims concerned.
In addition, although this specification is described in terms of embodiments, not every embodiment contains only one independent technical solution; this manner of description is adopted merely for clarity. Those skilled in the art should take the specification as a whole; the technical solutions in the embodiments may be combined appropriately to form other embodiments understandable to those skilled in the art.

Claims (1)

1. A Kinect-based human-behavior imitation method for humanoid robots, which builds a human motion model of eight kinematic chains from the human skeleton information extracted from the RGB-D images captured by Kinect, and maps the motion into the robot motion model for real-time reproduction through a mapping model and a balance-control mechanism, the real-time imitation running online on a continuous image sequence in which the start and end times of the human behavior are unknown; characterized by comprising the following steps:
1) Establish three modules: action-data acquisition, action-data processing, and joint configuration. The acquisition module builds the abstract model of the human action; the processing module maps the human action onto the robot and optimizes the robot's motion; the joint-configuration module sends the processed joint data to the robot to execute the imitated action; every captured frame executes the joint-configuration process after its action-data processing finishes;
2) Action-data acquisition: extract the human motion model frame by frame from the RGB-D images captured by Kinect; every frame represents the currently captured human behavior with the abstract model; the human motion model extracted from Kinect is represented by eight kinematic chains, consisting of the two forearms and two upper arms of the upper limbs and the two thighs and two lower legs of the lower limbs; in the abstract model each kinematic chain is represented by a vector, and the posture is expressed as:
In the formula, each vector from a joint From to a joint To represents the kinematic chain between those two joints;
3) action data treatment process: to collected every frame data carry out respectively mapping model processing, balance control processing with And three steps of processing are avoided from collision, the joint angles configuration that corresponding anthropomorphic robot imitates posture is finally obtained, mould is handled To every frame data, execution movement maps block, balance controls in order and avoids process from collision;
4) mapping model processing: the eight kinematic chain vectors of the human motion model are transformed frame by frame into the robot motion model; the vectors obtained after mapping are used to compute the position of each of the robot's end effectors, and inverse kinematics is then used to solve for all joint configurations on all of the robot's kinematic chains; the mapping is executed by fitting the unit vector of each human kinematic chain to the unit vector of the corresponding robot kinematic chain, and the joint angle corrections on a kinematic chain are computed by inverse kinematics as:
Δq = W⁻¹Jᵀ(JW⁻¹Jᵀ + λI)⁻¹Δx;
where J is the Jacobian matrix; the damping coefficient λ is a positive number that is adjusted according to how the solution affects the pose error; I is the identity matrix; Δx is the difference between the desired position and the current position; Δq is the vector of corrections for all corresponding joint angles on the kinematic chain; and W is a weight matrix obtained from each joint's angle range and the current angle θᵢ of joint i;
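A minimal sketch of the damped least-squares update of step 4). The patent's explicit formula for the weight matrix W is not reproduced here, so the joint-limit weighting below (joints near their limits get larger weights and therefore move less) is an assumed, commonly used choice:

```python
import numpy as np

def dls_step(J, dx, theta, theta_min, theta_max, lam=0.1):
    """One damped-least-squares IK correction:
        dq = W^-1 J^T (J W^-1 J^T + lam * I)^-1 dx
    J:      (m, n) Jacobian of the kinematic chain
    dx:     (m,) desired-minus-current end-effector position
    theta:  (n,) current joint angles, with limits theta_min/theta_max
    lam:    damping coefficient (positive)
    """
    mid = (theta_min + theta_max) / 2.0
    half = (theta_max - theta_min) / 2.0
    # Assumed weighting: grows quadratically toward the joint limits.
    w = 1.0 + (np.abs(theta - mid) / half) ** 2
    W_inv = np.diag(1.0 / w)
    A = J @ W_inv @ J.T + lam * np.eye(J.shape[0])
    return W_inv @ J.T @ np.linalg.solve(A, dx)
```

The damping term λI keeps the solve well-conditioned near singular poses, at the cost of a small residual pose error.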
5) balance control process: for every frame, the robot's center of gravity is computed, and it is judged whether the vertical projection of the center of gravity lies within the support polygon, which determines the robot's balance state; when imbalance is detected, the joint configuration of the robot's lower limbs is modified so that the vertical projection of the center of gravity falls back within the support polygon; during imitation the robot is made to move quasi-statically, so that the vertical projection of the center of gravity onto the ground coincides with the zero-moment point, and a zero-moment point inside the support polygon guarantees that the robot is stable; the center of gravity is computed as:
c = (Σᵢ mᵢpᵢ) / (Σᵢ mᵢ), where pᵢ denotes the centroid position of robot link i in its local coordinate system, and mᵢ is the mass of link i;
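The center-of-gravity computation and support-polygon test of step 5) can be sketched as follows. Representing the support polygon as counter-clockwise (x, y) vertices of a convex region is an assumption for illustration:

```python
import numpy as np

def center_of_gravity(masses, positions):
    """Mass-weighted mean of link centroids: c = sum(m_i * p_i) / sum(m_i)."""
    m = np.asarray(masses, float)
    p = np.asarray(positions, float)
    return (m[:, None] * p).sum(axis=0) / m.sum()

def is_balanced(com, support_polygon_xy):
    """Check whether the CoG's vertical projection onto the ground plane
    lies inside the support polygon, given as counter-clockwise (x, y)
    vertices of a convex region."""
    x, y = float(com[0]), float(com[1])
    pts = np.asarray(support_polygon_xy, float)
    for (x0, y0), (x1, y1) in zip(pts, np.roll(pts, -1, axis=0)):
        # The point must lie on the left side of every CCW edge.
        if (x1 - x0) * (y - y0) - (y1 - y0) * (x - x0) < 0:
            return False
    return True
```

When `is_balanced` returns False, the claim's lower-limb correction would be triggered to pull the projection back inside the polygon.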
6) self-collision avoidance process: the space occupied by the torso is quantized and represented by a cuboid; forward and inverse kinematics are combined to judge whether a robot end effector collides with the body; if a self-collision is determined to occur, the upper limb joint angles are adjusted to move the end effector away from the torso, and the self-collision state is then checked again; the body parts considered in the self-collision judgment include the torso and the hands;
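A sketch of the cuboid torso test of step 6), treating the torso as an axis-aligned box and the end effector (e.g. a hand position from forward kinematics) as a point; the safety margin parameter is an assumed addition:

```python
import numpy as np

def torso_collision(end_effector, box_min, box_max, margin=0.02):
    """Report a self-collision when the end-effector point enters the
    torso cuboid inflated by a safety margin (margin is assumed, in
    the same length units as the positions)."""
    p = np.asarray(end_effector, float)
    lo = np.asarray(box_min, float) - margin
    hi = np.asarray(box_max, float) + margin
    return bool(np.all(p >= lo) and np.all(p <= hi))
```

If the test fires, the upper-limb angles would be perturbed and the check repeated, as the claim describes.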
7) joint configuration process: it is judged whether the robot's lower limbs are moving; when the upper limbs move but the lower limbs are stationary, the action is accelerated and the configured execution time is shortened to increase the robot's execution speed; when lower limb motion is detected, the configured execution speed is slowed so that the robot achieves quasi-static movement.
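Step 7)'s speed adjustment can be sketched as below; the motion-detection threshold and the scale factors are assumed values, not taken from the patent:

```python
import numpy as np

def lower_limbs_moving(prev_angles, new_angles, leg_indices, eps=1e-3):
    """Detect lower-limb motion: any leg joint changing by more than eps rad."""
    d = np.abs(np.asarray(new_angles, float) - np.asarray(prev_angles, float))
    return bool(np.any(d[list(leg_indices)] > eps))

def execution_time(base_time, legs_moving, speedup=0.5, slowdown=2.0):
    """Per the claim: shorten the configured time when only the upper limbs
    move, lengthen it (quasi-static motion) when the lower limbs move.
    speedup/slowdown factors are assumed values."""
    return base_time * (slowdown if legs_moving else speedup)
```

The slowed execution keeps the lower-limb motion quasi-static, matching the balance-control assumption in step 5).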
CN201610480222.7A 2016-06-27 2016-06-27 A kind of anthropomorphic robot human body behavior imitation method based on Kinect Active CN106078752B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610480222.7A CN106078752B (en) 2016-06-27 2016-06-27 A kind of anthropomorphic robot human body behavior imitation method based on Kinect


Publications (2)

Publication Number Publication Date
CN106078752A CN106078752A (en) 2016-11-09
CN106078752B true CN106078752B (en) 2019-03-19

Family

ID=57252883

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610480222.7A Active CN106078752B (en) 2016-06-27 2016-06-27 A kind of anthropomorphic robot human body behavior imitation method based on Kinect

Country Status (1)

Country Link
CN (1) CN106078752B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109015631A (en) * 2018-07-03 2018-12-18 南京邮电大学 A method for a humanoid robot based on multiple working chains to imitate human motion in real time

Families Citing this family (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106600000A (en) * 2016-12-05 2017-04-26 中国科学院计算技术研究所 Method and system for human-robot motion data mapping
CN106971050B (en) * 2017-04-18 2020-04-28 华南理工大学 Kinect-based Darwin robot joint mapping analysis method
CN107283386A (en) * 2017-05-27 2017-10-24 江苏物联网研究发展中心 Human-machine synchronization method
CN107225573A (en) * 2017-07-05 2017-10-03 上海未来伙伴机器人有限公司 Robot motion control method and device
CN107127760A (en) * 2017-07-12 2017-09-05 清华大学 A foot-track hybrid humanoid robot
CN107443396A (en) * 2017-08-25 2017-12-08 魔咖智能科技(常州)有限公司 An intelligent companion robot that imitates human actions in real time
CN107953331B (en) * 2017-10-17 2019-12-10 华南理工大学 human body posture mapping method applied to humanoid robot action simulation
CN107598897B (en) * 2017-10-19 2020-11-27 北京工业大学 Humanoid robot gait planning method based on human body teaching
CN107932510A (en) * 2017-11-28 2018-04-20 中国人民解放军陆军工程大学 NAO robot system based on action acquisition
CN108284436B (en) * 2018-03-17 2020-09-25 北京工业大学 Remote dual-arm robot system with an imitation learning mechanism, and method
CN108908353B (en) * 2018-06-11 2021-08-13 安庆师范大学 Robot expression imitation method and device based on a smoothness-constrained inverse mechanical model
CN109079794B (en) * 2018-09-18 2020-12-22 广东省智能制造研究所 Robot control and teaching method based on human body posture following
CN109840923B (en) * 2019-01-22 2020-12-01 绍兴文理学院 Method for obtaining azimuth characteristics based on robot dance posture mirror image
CN109821243A (en) * 2019-01-25 2019-05-31 丰羽教育科技(上海)有限公司 A method for reproducing a shooting process through simulation
CN109830078B (en) * 2019-03-05 2021-03-30 智慧眼科技股份有限公司 Intelligent behavior analysis method and intelligent behavior analysis equipment suitable for narrow space
CN113396032A (en) * 2019-04-23 2021-09-14 西门子股份公司 Multi-axis motion controller, multi-axis motion control method and system
CN110135303B (en) * 2019-04-30 2022-09-13 西安理工大学 Dance intangible cultural heritage inheritance and interactive learning method
JP7285703B2 (en) * 2019-06-17 2023-06-02 株式会社ソニー・インタラクティブエンタテインメント robot control system
CN110480634B (en) * 2019-08-08 2020-10-02 北京科技大学 Arm guide motion control method for mechanical arm motion control
CN111208783B (en) * 2019-12-30 2021-09-17 深圳市优必选科技股份有限公司 Action simulation method, device, terminal and computer storage medium
CN111113429B (en) * 2019-12-31 2021-06-25 深圳市优必选科技股份有限公司 Action simulation method, action simulation device and terminal equipment
CN111300421A (en) * 2020-03-17 2020-06-19 北京理工大学 Mapping method applied to simulation of actions of both hands of humanoid robot
CN112044013B (en) * 2020-09-18 2021-11-23 宿州赛尔沃德物联网科技有限公司 Robot fire rescue system
CN112847336B (en) * 2020-12-24 2023-08-22 达闼机器人股份有限公司 Action learning method and device, storage medium and electronic equipment
CN112580582B (en) * 2020-12-28 2023-03-24 达闼机器人股份有限公司 Action learning method, action learning device, action learning medium and electronic equipment
CN112959330B (en) * 2021-02-02 2022-05-17 浙江大学 Human-robot correspondence device and method for dual-arm robot motion based on master-slave dynamic movement primitives
CN112894828B (en) * 2021-03-02 2022-05-20 乐聚(深圳)机器人技术有限公司 Robot motion simulation method, device, equipment and storage medium
CN113492404B (en) * 2021-04-21 2022-09-30 北京科技大学 Humanoid robot action mapping control method based on machine vision

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102708582A (en) * 2012-05-08 2012-10-03 电子科技大学 Character motion retargeting method for heterogeneous topologies
CN104777775A (en) * 2015-03-25 2015-07-15 北京工业大学 Control method for a two-wheeled self-balancing robot based on a Kinect device
CN104794722A (en) * 2015-04-30 2015-07-22 浙江大学 Method for computing a three-dimensional unclothed-body model of a dressed human using a single Kinect
CN104932254A (en) * 2015-05-12 2015-09-23 北京理工大学 Humanoid robot forward-fall protection control strategy
CN105137973A (en) * 2015-08-21 2015-12-09 华南理工大学 Method for a robot to intelligently avoid humans in a human-machine cooperation scene

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3963162B2 (en) * 2003-08-28 2007-08-22 ソニー株式会社 Robot apparatus and control method of robot apparatus
US8180114B2 (en) * 2006-07-13 2012-05-15 Northrop Grumman Systems Corporation Gesture recognition interface system with vertical display


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research and Development of a Somatosensory NAO Robot Demonstration System; Yu Chao; China Master's Theses Full-Text Database, Information Science and Technology Series; 20150515; pp. 15-31, 45-64


Also Published As

Publication number Publication date
CN106078752A (en) 2016-11-09

Similar Documents

Publication Publication Date Title
CN106078752B (en) A kind of anthropomorphic robot human body behavior imitation method based on Kinect
CN106607910B A real-time robot imitation method
CN109202901A A biped robot stair-climbing gait planning method, apparatus, and robot
CN103440037B (en) Real-time interaction virtual human body motion control method based on limited input information
CN107330967A Equestrian athletic posture capture and three-dimensional reconstruction system based on inertial sensing technology
CN107416195A An eagle-inspired grasping system for aerial-operation multi-rotor aircraft
CN107598897A A humanoid robot gait planning method based on human body teaching
CN102375416B Humanoid robot kicking action information processing method based on a rapid search tree
CN101579238A Human motion capture three-dimensional playback system and method
Gong et al. Bionic quadruped robot dynamic gait control strategy based on twenty degrees of freedom
CN109333506A A humanoid intelligent robot system
Wang et al. A real-time human imitation system
CN107953331A A human body posture mapping method applied to humanoid robot action imitation
CN101246601A Three-dimensional virtual human motion generation method based on key frames and space-time constraints
CN109079794A A robot control and teaching method based on human body posture following
CN107818318A A similarity evaluation method for humanoid robot imitation
CN105892626A (en) Lower limb movement simulation control device used in virtual reality environment
CN109086466A (en) Single leg multiaxis biped robot kinematics joint simulation method
CN109048897A A master-slave robot teleoperation method
CN108447077A An equestrian jockey posture information acquisition and analysis system
CN107529498A A method for a space robot to capture a non-cooperative target
CN107719709B A space debris removal system configuration and its design method
Barros et al. Bimanual haptics for humanoid robot teleoperation using ROS and V-REP
CN114474066A (en) Intelligent humanoid robot control system and method
Rosado et al. Reproduction of human arm movements using Kinect-based motion capture data

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20210111

Address after: Building 28 and 29, Tian'an Digital City, No.88 Chunyang Road, Chengyang District, Qingdao City, Shandong Province

Patentee after: Qingdao Institute of computing technology Xi'an University of Electronic Science and technology

Address before: No.2, Taibai South Road, Yanta District, Xi'an City, Shaanxi Province

Patentee before: XIDIAN University

TR01 Transfer of patent right