CN112949084A - Force-feedback-free stress motion capture error correction method - Google Patents

Force-feedback-free stress motion capture error correction method

Info

Publication number
CN112949084A
CN112949084A (application CN202110318227.0A)
Authority
CN
China
Prior art keywords
human body
force
model
anybody
motion capture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110318227.0A
Other languages
Chinese (zh)
Other versions
CN112949084B (en)
Inventor
葛哲学
戚祝琦
杨拥民
罗旭
张弈
李强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
National University of Defense Technology
Original Assignee
National University of Defense Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National University of Defense Technology
Priority to CN202110318227.0A
Publication of CN112949084A
Application granted
Publication of CN112949084B
Legal status: Active (current)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00: Computer-aided design [CAD]
    • G06F30/20: Design optimisation, verification or simulation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F2119/00: Details relating to the type or aim of the analysis or the optimisation
    • G06F2119/14: Force analysis or force optimisation, e.g. static or dynamic forces
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T90/00: Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Evolutionary Computation (AREA)
  • Geometry (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention discloses a force-feedback-free stress motion capture error correction method, which comprises the following steps. Step 1: establish a biomechanical model of the human upper limb through AnyBody simulation software. Step 2: obtain theoretical action parameter change curves of the human upper limb under different loads from the biomechanical model of the human upper limb. Step 3: perform third-order polynomial fitting on the theoretical action parameter change curves to obtain fitting functions of the corresponding action. Step 4: use the fitting functions to predict and correct the corresponding force-feedback-free motion capture data under different loads. The invention can effectively reduce the motion capture error, and its accuracy meets the precision requirement of most tasks.

Description

Force-feedback-free stress motion capture error correction method
Technical Field
The invention relates to the technical field of mechanical engineering and human-machine engineering, and in particular to a force-feedback-free stress motion capture error correction method.
Background
Motion capture technology avoids complex human body modeling work, can generate highly realistic motion in real time, and is widely applied in the field of human motion simulation. However, limited by the current development level of force feedback technology and by application cost, motion capture systems used under actual stress conditions are generally built without force feedback: the real person cannot feel the true forces during the operation, so the influence of carrying, twisting, pushing and similar loads on the virtual human's action cannot be accurately reflected during capture, and the result inevitably contains errors compared with the true stressed action. For application fields with high requirements on motion accuracy, such as virtual maintenance, virtual assembly and virtual surgery, these motion errors may make the subsequent human factor analysis and evaluation results unable to meet the use requirements.
For error correction of motion capture, researchers at home and abroad have carried out a series of studies based on video methods and on kinematics and dynamics methods. Grochow et al. proposed an inverse kinematics method based on physical kinematic characteristics, which combines a global nonlinear dimensionality reduction technique, the Gaussian process latent variable model (GPLVM) and a prior kinematics model; it is suitable for correcting similar small-scale human motion data but may not be suitable for large-scale heterogeneous data [2]. Wolfgang Seemann et al. proposed projecting the observed positions, velocities and accelerations onto the corresponding constraint manifold to generate a new corrected trajectory, ensuring consistency of the motion parameters [3]. Liang Zhang and G. Brunnet et al. proposed a hybrid method for real-time human motion capture using a simplified marker set and monocular video, which performs pose estimation from the marker positions with an improved inverse kinematics solver and refines the pose using the video images [4]. Qiu Shiguang et al. proposed building a real-time virtual human motion control model using dynamic constraints based on joint rotation information, and introduced grey system theory to build a motion compensation model that realizes online compensation of missing data [5]. Shi Xue et al. used MotionBuilder software to post-process the captured raw motion data, modified the human joints frame by frame with reference to the actual video motion, successfully corrected the motions of the virtual human's ankle joints and fingers, formed a dedicated virtual human body by recording the basic postures of the experimenters, and built an easy-to-use dedicated motion library [6]. Zhang Hongbo et al. proposed a method that recognizes human behavior using the Motion Difference Histogram (MDH) as a motion feature descriptor, corrects the motion estimation result through background motion estimation, and encodes the motion difference between the background and the object [7].
It can be seen that most existing motion capture error correction targets the influence of sensor precision, environmental factors and the like, and mainly addresses the kinematics of the human body itself under unloaded conditions; there is little research on the error caused by the missing feedback of external forces. To address this problem, a dynamics method is adopted here for human body modeling and action correction, the posture change law of the human upper limb joints under different external load conditions is summarized, and a corresponding action compensation and correction method is explored.
Disclosure of Invention
The invention aims to provide a force-feedback-free stress motion capture error correction method, so as to solve the problem that existing motion capture systems are built without force feedback and therefore inevitably contain errors.
In order to achieve the purpose, the invention provides the following technical scheme:
A force-feedback-free stress motion capture error correction method comprises the following steps:
Step 1: establishing a biomechanical model of the human upper limb through AnyBody simulation software;
Step 2: obtaining theoretical action parameter change curves of the human upper limb under different loads through the biomechanical model of the human upper limb;
Step 3: carrying out third-order polynomial fitting on the theoretical action parameter change curves to obtain fitting functions of the corresponding action;
Step 4: predicting and correcting the corresponding force-feedback-free motion capture data under different loads through the fitting functions.
Preferably, establishing the biomechanical model of the human upper limb through AnyBody simulation software comprises:
Step 11: establishing a human upper limb structure model, wherein the human upper limb structure model comprises a shoulder joint, an elbow joint, a wrist joint and a lumbar vertebra joint;
Step 12: based on the human upper limb structure model, defining human body and environment models in an AnyScript script of the AnyBody simulation software, thereby obtaining the biomechanical model of the human upper limb.
Preferably, defining the human body and environment models in the AnyScript script of the AnyBody simulation software comprises: editing the skeleton dimensions, inertia parameters, kinetic parameters and biological parameters of the human body.
Preferably, defining the human body and environment models in the AnyScript script of the AnyBody simulation software according to the characteristics of the subject and the maintenance task comprises: editing the skeleton dimensions and inertia parameters of the human body, the action type, the action path, the task duration and the external stress state.
Preferably, obtaining the theoretical action parameter change curves of the human upper limb under different loads through the biomechanical model of the human upper limb comprises: the AnyBody simulation software drives the human model to execute the action based on its Hill-model muscle system, calculates the change law of the joint angles from Euler angle transformations between the different coordinate systems, solves the muscle forces and moments with the Lagrangian method of forward or inverse dynamics, and displays the theoretical action parameter change curves after the simulation is finished.
Preferably, the theoretical action parameter change curve is the variation curve of the elbow flexion-extension angle with the load.
By adopting the technical scheme, the invention has the following technical effects:
for carrying loads of different masses, the corrected action calculated from the dynamics model is closer to the real action than the force-feedback-free capture result, so the motion capture error can be effectively reduced and the accuracy meets the precision requirement of most tasks; the method is physically clear, easy to solve, and suitable for wide popularization and application.
Drawings
FIG. 1 is a flowchart of a force-feedback-free stress motion capture error correction method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a model of a human upper limb structure according to an embodiment of the present invention;
FIG. 3 is a schematic diagram illustrating the relationship between the flexion-extension angle of the right elbow joint and the variation of the load mass according to the theoretical calculation provided by the embodiment of the present invention;
FIG. 4 is a schematic diagram illustrating the variation of the flexion-extension angle of the right elbow joint with the load mass according to the motion capture provided by the embodiment of the present invention;
fig. 5 is a schematic diagram illustrating comparison between the correction result and the actual capture result according to the embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the following figures and embodiments:
as shown in fig. 1, a force-feedback-free stress motion capture error correction method comprises:
Step 1: establishing a biomechanical model of the human upper limb through AnyBody simulation software.
The method comprises the following steps:
step 11: establishing a human upper limb structure model, wherein the human upper limb structure model comprises a shoulder joint, an elbow joint, a wrist joint and a lumbar vertebra joint;
step 12: based on the human upper limb structure model, human body and environment models are defined in an AnyScript script of AnyBody simulation software, so that a biomechanics model of the human upper limb is obtained.
Preferably, defining the human body and environment models in the AnyScript script of the AnyBody simulation software comprises: editing the skeleton dimensions, inertia parameters, kinetic parameters and biological parameters of the human body.
Specifically, establishing the human upper limb structure model comprises: integrating an anthropometric model and a biomechanical model, simplifying the main movable joints of the human body into 11 joints including the clavicle, shoulder, elbow, wrist, hip, thoracic vertebra and lumbar vertebra, and establishing a simplified structure model of the upper half of the human body; on this basis, the cervical vertebra and the left and right clavicular joints are ignored and the thoracic and lumbar joints are merged, giving the further simplified model shown in fig. 2, in which the simplified human upper limb structure model comprises the shoulder joint, elbow joint, wrist joint and lumbar vertebra joint.
Based on the human upper limb structure model and according to the characteristics of the subject and the maintenance task, the human body and environment models are defined in the AnyScript script of the AnyBody simulation software, and the skeleton dimensions, inertia parameters, action type, action path, task duration and external stress state of the human body are edited, thereby obtaining the biomechanical model of the human upper limb. AnyBody is modeling, simulation and analysis software based on ergonomics and biomechanics that can calculate the biomechanical response characteristics of the human body to the environment. AnyBody builds a complete human musculoskeletal system based on anatomy and the Hill muscle model, and solves dynamic parameters such as muscle forces and moments with the Lagrangian method; its muscle recruitment optimization criterion and Lagrangian dynamics derivation have been internationally verified and accepted. In a single simulation, AnyBody can obtain the static and dynamic kinematic, dynamic and biomechanical parameter indexes of the whole musculoskeletal model, including bones, muscles and joints, and displays them intuitively through a visual model.
Step 2: obtaining theoretical action parameter change curves of the upper limbs of the human body under different loads through a biomechanics model of the upper limbs of the human body;
specifically: the AnyBody simulation software drives the human model to execute the action based on its Hill-model muscle system, calculates the change law of the joint angles from Euler angle transformations between the different coordinate systems, solves the muscle forces and moments with the Lagrangian method of forward or inverse dynamics, and displays the theoretical action parameter change curves after the simulation is finished.
Preferably, the theoretical motion parameter variation curve is a variation curve of the flexion-extension angle of the elbow joint along with the change of the load.
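As a minimal illustration of the joint-angle step described above (and assuming, purely for illustration, that the elbow flexion-extension angle can be approximated by the included angle between the upper-arm and forearm long axes), the sketch below derives that angle from two segment orientations given as Euler angles; the function name, axis convention and sample orientations are illustrative assumptions and not the patent's actual AnyBody output.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def elbow_angle_deg(r_upper_arm: R, r_forearm: R,
                    long_axis=np.array([0.0, 0.0, 1.0])) -> float:
    """Included angle (deg) between the upper-arm and forearm long axes,
    used here as a simple stand-in for the elbow flexion-extension angle."""
    u = r_upper_arm.apply(long_axis)   # upper-arm axis expressed in the base frame
    f = r_forearm.apply(long_axis)     # forearm axis expressed in the base frame
    cos_angle = np.clip(np.dot(u, f), -1.0, 1.0)
    return float(np.degrees(np.arccos(cos_angle)))

# Illustrative segment orientations for a single simulation frame (not patent data):
# Euler angles of each segment frame relative to the base coordinate system.
upper_arm = R.from_euler("zyx", [10.0, 5.0, 0.0], degrees=True)
forearm = R.from_euler("zyx", [10.0, 80.0, 0.0], degrees=True)
print(f"elbow flexion-extension angle: {elbow_angle_deg(upper_arm, forearm):.1f} deg")
```

In the patent's workflow these orientations come from the AnyBody simulation; the same relative-orientation idea applies to any pair of adjacent segment frames.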
Specifically, the variation of the right elbow joint angle with the weight of the carried box is chosen as the research object. A spatial rectangular coordinate system is established with the midpoint of the line connecting the two feet as the origin; the forward, rightward and upward directions of the human body are the positive directions of the x, y and z axes, respectively. The external dimension of the carried weight is 0.4 × 0.2 m, and its center of mass is located at the center of the geometric solid. The initial position of the weight's center of mass is set to (0.5, -0.2, 1.3) m, and the weight moves horizontally to the right at an approximately constant speed. The carrying duration is 1 s, and the transverse displacement of the weight's center of mass is 0.4 m.
The calculation is carried out for six carrying load values: m = 0, 1 kg, 2.5 kg, 5 kg, 10 kg and 20 kg. The external force applied to each hand is F = mg/2 (g = 9.81 m/s²), directed vertically downwards throughout. During the action the palms stay in contact with the carried object, so the posture change of the upper limb caused by the wrist joint angle change can be approximately ignored. Following MATLAB-based human upper limb movement analysis and simulation, the calculation formulas for the main dynamic parameters of the upper half of the human body are given in Table 1.
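For reference, the hand load in each of the six simulated cases follows directly from F = mg/2; the short sketch below only tabulates those values, assuming (as stated above) that the load is shared equally by the two hands.

```python
# Hand force F = m*g/2 for the six carrying-load cases described above.
g = 9.81  # gravitational acceleration, m/s^2
loads_kg = [0.0, 1.0, 2.5, 5.0, 10.0, 20.0]
for m in loads_kg:
    force_per_hand = m * g / 2.0  # N, directed vertically downwards
    print(f"m = {m:5.1f} kg  ->  F per hand = {force_per_hand:6.2f} N")
```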
Table 1 Calculation formulas for the main dynamic parameters of the human upper limb (units: m, kg; H and W are height and weight respectively)

Segment | Length | Center-of-gravity position | Weight | Radius of gyration | Moment of inertia
Forearm | L1 = 0.157H | O1 = 0.43L1 | W1 = 0.018W | R1 = 0.526L3 | I1 = W1 × R1²
Upper arm | L2 = 0.172H | O2 = 0.436L2 | W2 = 0.0375W | R2 = 0.542L3 | I2 = W2 × R2²
Trunk | L3 = 0.3H | O3 = 0.66L3 | W3 = 0.5804W | R3 = 0.837L3 | I3 = W3 × R3²
In this example, the height H of the subject was 1.80m, and the weight W was 70 kg. The parameters of the corresponding upper limb obtained by calculation are as follows:
[The computed upper-limb parameter values appear only as formula images in the original publication.]
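Since the computed parameter values are only available as images, the following sketch simply re-evaluates the Table 1 formulas for H = 1.80 m and W = 70 kg; the printed numbers are derived values, not figures copied from the patent.

```python
# Upper-limb dynamic parameters from the Table 1 formulas, for H = 1.80 m, W = 70 kg.
H, W = 1.80, 70.0
L3 = 0.3 * H  # trunk length; Table 1 expresses every radius of gyration as a multiple of L3

#        segment      length   c.o.g.   weight   gyration (x L3)
table1 = [("forearm",   0.157,  0.430,   0.0180,  0.526),
          ("upper arm", 0.172,  0.436,   0.0375,  0.542),
          ("trunk",     0.300,  0.660,   0.5804,  0.837)]

for name, k_len, k_cog, k_w, k_gyr in table1:
    length = k_len * H          # segment length
    cog = k_cog * length        # center-of-gravity position along the segment
    mass = k_w * W              # segment mass
    radius = k_gyr * L3         # radius of gyration
    inertia = mass * radius**2  # moment of inertia
    print(f"{name:9s}: L = {length:.3f} m, O = {cog:.3f} m, "
          f"W = {mass:.3f} kg, R = {radius:.3f} m, I = {inertia:.4f} kg*m^2")
```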
to simplify the calculation process, it is assumed that the lower limb portion of the human body remains stationary while ignoring the influence of the clavicular joint. And determining the motion of the coordinate system of each body joint relative to the base coordinate system by adopting coordinate transformation and a rotation matrix. The initial rotation angles and constraint intervals of the main joints relative to the origin of the coordinate system of the main joints in the static state are shown in table 2.
Table 2 Initial angle values and constraint intervals of the main joints (units: °)
[The Table 2 values appear only as an image in the original publication.]
The upper limb dynamic parameters and the initial rotation angles and constraint intervals of the main joints relative to their coordinate system origins in the static state are input into the AnyBody simulation software to obtain the variation curve of the right elbow flexion-extension angle with the load mass. The 1 s carrying action is divided into 16 steps, the theoretical joint postures at different action stages (steps 1, 4, 7, 10, 13 and 16) are recorded as the load value changes, and a scatter diagram is drawn as shown in fig. 3.
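The bookkeeping of this step (six load values, sixteen time steps, six sampled stages) can be sketched as follows; the angle array is a placeholder, since the actual AnyBody output is only shown as figures in the patent.

```python
import numpy as np

loads_kg = np.array([0.0, 1.0, 2.5, 5.0, 10.0, 20.0])
n_steps = 16                                   # the 1 s carrying action split into 16 steps
stages = np.array([1, 4, 7, 10, 13, 16]) - 1   # sampled stages as 0-based step indices

# Placeholder for the simulated right-elbow flexion-extension angles, shape (loads, steps);
# in the real workflow these values come from the AnyBody simulation output.
theta_sim = np.zeros((loads_kg.size, n_steps))

# One (load mass, angle) point per load value and sampled stage, as in the Fig. 3 scatter plot.
scatter_points = [(m, theta_sim[i, s])
                  for i, m in enumerate(loads_kg) for s in stages]
print(f"{len(scatter_points)} scatter points recorded")
```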
A human motion capture experiment is carried out for verification using a Perception Neuron inertial motion capture system. The horizontal carrying action of the subject is captured for the same load groups, the 1 s carrying action is likewise divided into 16 steps, the right elbow joint posture angle data of the corresponding frames obtained in repeated motion capture experiments are recorded, and curves are drawn as shown in fig. 4.
Comparing the joint flexion-extension angle trajectory curves obtained from the motion capture experiment and from the theoretical calculation, the following common laws can be found: as the carrying load increases, the right elbow flexion-extension angle shows a decreasing trend at every stage; when the load is large (the 10 kg and 20 kg groups), the deviation of the right elbow flexion-extension angle from the unloaded state generally exceeds 10° over several stages of the carrying action and can reach 30-40° at most, so the influence of this deviation cannot be ignored and the angle needs to be reasonably corrected; and the results calculated from the dynamics model and the actual motion capture results follow similar variation laws under the different loads and match each other more closely as the load increases.
On this basis, the dynamics calculation results can be considered accurate enough, i.e. the actual stressed motion can be approximated by correcting the motion capture results obtained without force feedback.
Step 3: carrying out third-order polynomial fitting on the theoretical action parameter change curves to obtain fitting functions of the corresponding action;
Step 4: predicting and correcting the corresponding force-feedback-free motion capture data under different loads through the fitting functions.
Specifically, the groups of theoretically calculated curves in fig. 3 are fitted with third-order polynomials, giving the fitting functions corresponding to the curves as follows:
[The fitting functions, equation (1), appear only as a formula image in the original publication.]
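A minimal sketch of this fitting-and-prediction step, assuming (as the text indicates) that, for each sampled stage, the theoretical elbow angle is fitted as a third-order polynomial of the load mass; the angle values below are placeholders standing in for the Fig. 3 curves, so the printed predictions are illustrative only.

```python
import numpy as np

loads_kg = np.array([0.0, 1.0, 2.5, 5.0, 10.0, 20.0])

# Placeholder theoretical elbow flexion-extension angles (deg) at one sampled stage,
# one value per simulated load; the real values are read from the Fig. 3 curves.
theta_stage = np.array([95.0, 93.0, 90.0, 85.0, 75.0, 62.0])

# Third-order polynomial fit of angle versus load mass for this stage (analogue of formula (1)).
coeffs = np.polyfit(loads_kg, theta_stage, deg=3)

# Predict/correct the angle for loads that were not simulated, e.g. m = 7.5 kg and m = 15 kg.
for m in (7.5, 15.0):
    print(f"m = {m:4.1f} kg  ->  corrected elbow angle = {np.polyval(coeffs, m):.1f} deg")
```

The same fit is repeated independently for each of the six sampled stages, giving one correction function per stage.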
according to the results obtained by the above formula, the flexion and extension angles of the right elbow joint corresponding to different loads at each stage can be predicted and corrected.
The correction method is verified using m = 7.5 kg and m = 15 kg as examples. Substituting these values into formula (1), the right elbow flexion-extension angles at the different stages are calculated as follows:
[The predicted angles for m = 7.5 kg and m = 15 kg, equations (2) and (3), appear only as formula images in the original publication.]
real-person motion capture was performed by setting the carrying loads to 7.5kg and 15kg, and the captured files were saved and the corresponding data were recorded as shown in table 3.
Table 3 Right elbow flexion-extension angle capture results for m = 7.5 kg and m = 15 kg
[The Table 3 capture results appear only as images in the original publication.]
A scatter diagram is drawn from equations (2) and (3) and the corresponding actual motion capture data, as shown in fig. 5, where NF-CAP is the unloaded motion capture result, and CAP and COR denote the actual capture result and the correction result based on the dynamics calculation, respectively.
Comparing the actual capture results with the theoretical correction results, the corresponding standard deviations are s1 = 3.138 (m = 7.5 kg) and s2 = 2.48 (m = 15 kg).
The relative error λ of the action is defined by equation (4) (the formula itself appears only as an image in the original publication): it compares the actual stressed motion capture result θi at each sampled stage (i = 1, 4, 7, 10, 13 and 16) with the corresponding fitting correction result based on the dynamics calculation. If λ < 0.05, the relative error is considered small enough, i.e. the fitting correction result has sufficient accuracy.
Substituting the data gives relative errors λ1 = 0.0333 (m = 7.5 kg) and λ2 = 0.0368 (m = 15 kg).
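The two error measures used above can be sketched as follows; because equation (4) appears only as an image, the mean relative deviation used here is an assumption that is merely consistent with the text (a single λ per load case, compared against the 0.05 threshold), and the angle arrays are placeholders rather than the Table 3 data.

```python
import numpy as np

def correction_errors(theta_captured, theta_corrected):
    """Standard deviation of the correction residual and an assumed mean relative
    error over the sampled stages (i = 1, 4, 7, 10, 13, 16)."""
    captured = np.asarray(theta_captured, dtype=float)
    corrected = np.asarray(theta_corrected, dtype=float)
    residual = corrected - captured
    s = np.std(residual, ddof=1)                        # sample standard deviation
    lam = np.mean(np.abs(residual) / np.abs(captured))  # assumed form of equation (4)
    return s, lam

# Placeholder angles (deg) at the six sampled stages for one load case (not Table 3 data).
captured = [94.0, 90.0, 85.0, 80.0, 74.0, 68.0]
corrected = [92.5, 91.0, 83.0, 82.5, 75.0, 66.0]

s, lam = correction_errors(captured, corrected)
print(f"standard deviation s = {s:.3f} deg, relative error lambda = {lam:.4f} "
      f"({'acceptable' if lam < 0.05 else 'too large'})")
```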
In summary, for carrying loads of different masses, the corrected actions based on the dynamics calculation are closer to the actual actions than the force-feedback-free capture results, i.e. the dynamics correction method effectively reduces the motion capture error; the correction results fall approximately on the corresponding actual capture curves, the maximum angle error at any single moment does not exceed 6°, and the accuracy of the correction results meets the precision requirement of most tasks.
The foregoing is merely an embodiment of the present invention; common general knowledge, such as specific structures and characteristics well known in the art, is not described in further detail here. It should be noted that those skilled in the art can make several variations and modifications without departing from the technical solution of the present invention, and these should also be regarded as falling within the protection scope of the present invention without affecting the effect of the implementation of the invention or the practicability of the patent. The scope of protection of the present application shall be determined by the contents of the claims, and the description of the embodiments in the specification may be used to interpret the contents of the claims.

Claims (5)

1. A force-feedback-free stress motion capture error correction method, characterized by comprising the following steps:
Step 1: establishing a biomechanical model of the human upper limb through AnyBody simulation software;
Step 2: obtaining theoretical action parameter change curves of the human upper limb under different loads through the biomechanical model of the human upper limb;
Step 3: carrying out third-order polynomial fitting on the theoretical action parameter change curves to obtain fitting functions of the corresponding action;
Step 4: predicting and correcting the corresponding force-feedback-free motion capture data under different loads through the fitting functions.
2. The force-feedback-free stress motion capture error correction method according to claim 1, wherein establishing the biomechanical model of the human upper limb through AnyBody simulation software comprises:
Step 11: establishing a human upper limb structure model, wherein the human upper limb structure model comprises a shoulder joint, an elbow joint, a wrist joint and a lumbar vertebra joint;
Step 12: based on the human upper limb structure model, defining human body and environment models in an AnyScript script of the AnyBody simulation software, thereby obtaining the biomechanical model of the human upper limb.
3. The force-feedback-free stress motion capture error correction method according to claim 2, wherein defining the human body and environment models in the AnyScript script of the AnyBody simulation software comprises: editing the skeleton dimensions, inertia parameters, kinetic parameters and biological parameters of the human body.
4. The force-feedback-free stress motion capture error correction method according to claim 3, wherein obtaining the theoretical action parameter change curves of the human upper limb under different loads through the biomechanical model of the human upper limb comprises: the AnyBody simulation software drives the human model to execute the action based on its Hill-model muscle system, calculates the change law of the joint angles from Euler angle transformations between the different coordinate systems, solves the muscle forces and moments with the Lagrangian method of forward or inverse dynamics, and displays the theoretical action parameter change curves after the simulation is finished.
5. The force-feedback-free stress motion capture error correction method according to any one of claims 1 to 4, wherein the theoretical action parameter change curve is the variation curve of the elbow flexion-extension angle with the load.
CN202110318227.0A 2021-03-25 2021-03-25 Force action capturing error correction method based on weak feedback Active CN112949084B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110318227.0A CN112949084B (en) 2021-03-25 2021-03-25 Force action capturing error correction method based on weak feedback

Publications (2)

Publication Number Publication Date
CN112949084A 2021-06-11
CN112949084B 2023-04-25

Family

ID=76227796

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110318227.0A Active CN112949084B (en) 2021-03-25 2021-03-25 Force action capturing error correction method based on weak feedback

Country Status (1)

Country Link
CN (1) CN112949084B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101533528A (en) * 2009-04-18 2009-09-16 大连大学 Optical motion capture data processing method based on module piecewise linear model
JP2011224048A (en) * 2010-04-15 2011-11-10 Institute Of National Colleges Of Technology Japan Upper limb movement model
CN105160139A (en) * 2015-10-16 2015-12-16 中国电子科技集团公司第三十八研究所 Hybrid driving method for virtual human maintenance actions
US20170004358A1 (en) * 2010-08-26 2017-01-05 Blast Motion Inc. Motion capture system that combines sensors with different measurement ranges
US10507121B1 (en) * 2015-10-15 2019-12-17 Hrl Laboratories, Llc Device and method to decode volitional motor commands using a biomechanical model for controlling a prosthetic limb
CN111369626A (en) * 2020-03-04 2020-07-03 刘东威 Markless point upper limb movement analysis method and system based on deep learning
CN112446162A (en) * 2020-11-23 2021-03-05 四川大学华西医院 Intervertebral disc stress measuring device and method based on attitude recognition

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
HARWIN W S, LOUREIRO R C V: "Analysis of the Fugl-Meyer Outcome Measures Assessing the Effectiveness of Robot-Mediated Stroke Therapy" *
Chang Ying; Meng Fandong; Zheng Fujian; Zhu Shuaifei: "Establishment of an upper limb skeletal model" *
Yang Yuchen: "Simulation analysis and application research of gait stability based on AnyBody", China Master's Theses Full-text Database (Basic Sciences) *
Wang Yinjie; Shen Linyong; Zhang Yanan: "Measurement and analysis of human upper limb motion trajectory parameters based on a motion capture system" *
Gao Fei: "Research on human-bicycle riding simulation and frame parameter optimization based on AnyBody", China Master's Theses Full-text Database (Engineering Science and Technology II) *

Also Published As

Publication number Publication date
CN112949084B (en) 2023-04-25

Similar Documents

Publication Publication Date Title
CN111281743B (en) Self-adaptive flexible control method for exoskeleton robot for upper limb rehabilitation
CN107953331B (en) human body posture mapping method applied to humanoid robot action simulation
CN112605996B (en) Model-free collision avoidance control method for redundant mechanical arm
CN113829343B (en) Real-time multitasking and multi-man-machine interaction system based on environment perception
CN111975771A (en) Mechanical arm motion planning method based on deviation redefinition neural network
CN114641375A (en) Dynamic programming controller
CN114102600B (en) Multi-space fusion human-machine skill migration and parameter compensation method and system
CN111208730B (en) Rapid terminal sliding mode impedance control algorithm
CN114800532B (en) Mechanical arm control parameter determination method, device, equipment, medium and robot
Chen et al. Lower-body control of humanoid robot NAO via Kinect
CN112949084B (en) Force action capturing error correction method based on weak feedback
CN113171271A (en) Gravity compensation method for upper limb rehabilitation robot
KR20120048106A (en) Motion control system and method for robot
CN104667488B (en) The method and system that omnirange displacement is offset is produced in motion platform
Bohlin et al. Unified solution of manikin physics and positioning. Exterior root by introduction of extra parameters
Wang et al. Hand movement prediction based collision-free human-robot interaction
Qi et al. Human Motion Capture Error Correction Method without Force-Feedback
CN111002292B (en) Robot arm humanoid motion teaching method based on similarity measurement
CN112057083B (en) Wearable human upper limb pose acquisition equipment and acquisition method
CN112894794B (en) Human body arm action simulation method and device, terminal equipment and storage medium
CN113160295A (en) Method and device for correcting joint point position
CN113158910A (en) Human skeleton recognition method and device, computer equipment and storage medium
Chen et al. Optimized 3D stable walking of a bipedal robot with line-shaped massless feet and sagittal underactuation
Li et al. Path planning for a cable-driven parallel waist rehabilitation robot based on discriminant analysis model
Zhang et al. Research on humanoid movements of a 7-DOF manipulator for planar grasping

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant