CN112885465A - Motion data analysis method, device, system and computer readable storage medium - Google Patents


Info

Publication number: CN112885465A
Application number: CN202110276755.4A
Authority: CN (China)
Prior art keywords: data, user, analysis, motion data, limb
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Inventors: 朱庆棠, 杨建涛, 张哲锦, 戚剑, 王洪刚, 刘小林, 吕璐璐, 顾凡彬, 范景元, 朱长兵
Current Assignee: Guangdong Bangbang Health Management Co ltd; First Affiliated Hospital of Sun Yat Sen University
Original Assignee: Guangdong Bangbang Health Management Co ltd; First Affiliated Hospital of Sun Yat Sen University
Application filed by Guangdong Bangbang Health Management Co ltd and First Affiliated Hospital of Sun Yat Sen University
Publication of CN112885465A

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Pathology (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

An embodiment of the invention provides a motion data analysis method, apparatus, system, and computer-readable storage medium. The method comprises the following steps: acquiring motion data of a user through a data acquisition device, wherein the motion data comprises limb action and/or posture data of the user; analyzing the motion data to obtain the degree of completion of the user's limb actions and/or postures, and analyzing the user's limb function according to that degree of completion to obtain an analysis result; and outputting the analysis result. The invention enables limb function assessment based on analysis of user motion data, has a wide range of application, improves the objectivity, accuracy, convenience, and efficiency of assessment, lowers the assessment threshold, and saves labor cost.

Description

Motion data analysis method, device, system and computer readable storage medium
Technical Field
The present invention relates to the field of data analysis technologies, and in particular, to a method, an apparatus, a system, and a computer-readable storage medium for analyzing motion data.
Background
Limb function assessment is widely used in clinical medicine and is also important in specialty medicine (such as aerospace, aviation, nautical, and special-skills medicine), forensic identification, and work-capacity assessment. At present, assessment of the functional status of a limb still depends mainly on the results of clinical examinations performed by a physician. This makes labor costs high, and because the accuracy of clinical examination results depends on the physician's medical knowledge and experience, inaccurate assessment results easily arise.
Therefore, finding an assessment method that improves the accuracy and objectivity of limb assessment while reducing labor cost is a major problem to be solved in the industry.
Disclosure of Invention
In view of this, the present invention is directed to a motion data analysis method, apparatus, system, and computer-readable storage medium that implement limb function assessment based on analysis of user motion data. This provides important data references for the final assessment of limb functional status, thereby improving the objectivity, accuracy, convenience, and efficiency of assessment, lowering the assessment threshold, and saving labor cost.
An embodiment of the present invention provides a motion data analysis method applied to a computer device, including:
acquiring motion data of a user through a data acquisition device, wherein the motion data comprises limb action and/or posture data of the user;
analyzing the motion data to obtain the completion degree of the limb action and/or posture of the user, and analyzing the limb function of the user according to the completion degree to obtain an analysis result;
and outputting the analysis result.
An aspect of an embodiment of the present invention further provides a motion data analysis apparatus, including:
the data acquisition module is used for acquiring motion data of a user through a data acquisition device, wherein the motion data comprises limb action and/or posture data of the user;
the data analysis module is used for analyzing the motion data to obtain the completion degree of the limb action and/or posture of the user, and analyzing the limb function of the user according to the completion degree to obtain an analysis result;
and the output module is used for outputting the analysis result.
An aspect of an embodiment of the present invention further provides a motion data analysis system, including: the system comprises a data acquisition device, a data analysis device and a server;
wherein the data analysis device is used for executing each step in the motion data analysis method;
and the server is used for storing the analysis result output by the data analysis device.
An aspect of an embodiment of the present invention further provides an electronic apparatus, including:
a memory and a processor;
the memory stores executable program code;
the processor, coupled to the memory, invokes the executable program code stored in the memory to perform the steps of the motion data analysis method described above.
An aspect of the embodiments of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements the motion data analysis method.
In each embodiment of the invention, the user's motion data is acquired with a data acquisition device, the motion data is analyzed to obtain the degree of completion of the user's limb actions and/or postures, the user's limb function is analyzed according to that degree of completion to obtain an analysis result, and the analysis result is output. The user therefore only needs to stand, sit, or lie (supine, prone, or on one side) in the test area and perform the specified actions and/or postures following the prompts and guidance of the computer equipment for all motion data to be collected and analyzed automatically. Compared with the existing manual assessment approach, the invention uses computer equipment to realize fully automatic, standardized function assessment based on motion data analysis; the assessment process is thus unified and standardized within the computer equipment, providing important data references for the final assessment of limb function. As a result, the objectivity, accuracy, convenience, and efficiency of assessment are greatly improved, the assessment threshold is lowered, and assessment costs are effectively saved. The method is applicable to assessment scenarios for a wide variety of limb functional states and therefore also has the advantage of a wide application range.
Drawings
Fig. 1 is a flowchart illustrating an implementation of a motion data analysis method according to an embodiment of the present invention.
Fig. 2 is a flowchart illustrating an implementation of a motion data analysis method according to another embodiment of the present invention.
Fig. 3 is a schematic diagram of a motion data analysis method according to an embodiment of the present invention.
Fig. 4 is a schematic structural diagram of a motion data analysis apparatus according to an embodiment of the present invention.
Fig. 5 is a schematic structural diagram of a motion data analysis device according to another embodiment of the present invention.
Fig. 6 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present invention.
Fig. 7 is a schematic structural diagram of a motion data analysis system according to an embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
Referring to fig. 1, an implementation flowchart of a motion data analysis method according to an embodiment of the present invention is provided. The method can be implemented by a computer device with data processing capability, such as a mobile terminal (e.g., a mobile phone, a tablet computer, or a smart wearable device) or an electronic terminal (e.g., a desktop computer, a server, or an all-in-one machine). The computer device may establish a data connection over a data network with a data acquisition device used to collect motion data; alternatively, a data acquisition module may be built into the computer device, or the computer device may be electrically connected to a data acquisition peripheral. As shown in fig. 1, the method may specifically include the following steps:
Step S101, acquiring motion data of a user through a data acquisition device;
specifically, the data acquisition device may include, but is not limited to: optical imaging technologies such as an optical general camera and an optical depth camera, and at least one of other data acquisition devices such as an electrode plate, a triaxial inertial navigation sensor and a pressure sensor. Wherein the motion data comprises limb motion and/or gesture data of the user, such as: when the user makes limb motions and/or postures, the data acquisition device acquires data at a preset region of interest on the limb of the user. Wherein, the preset region of interest may include, but is not limited to: the hand, wrist, forearm, elbow, arm, shoulder, foot, ankle, calf, knee, thigh, hip, etc.
Step S102, analyzing the motion data to obtain the completion degree of the limb action and/or posture of the user, and analyzing the limb function of the user according to the completion degree to obtain an analysis result;
specifically, the motion data is compared with preset standard motion data to obtain an error between the motion data and the preset standard motion data, and the completion degree of the limb action and/or the posture of the user is determined according to the error. Then, matching out the function state matched with the completion degree in a plurality of preset function states, and taking the matched function state and preset description data of the function state as an analysis result. The standard motion data is consistent with the motion data type and can be obtained based on big data analysis.
It is understood that, in practical applications, the object of analysis may be only the body movement of the user, only the posture of the user, or both the body movement and the posture of the user, as required. The limb movement refers to a process of changing the position of the limb, and the posture refers to a state that the limb assumes.
Step S103, outputting the analysis result.
Specifically, the analysis result may be output in forms including, but not limited to, data, images, video, or text. Optionally, the analysis result may be output to the local memory of the computer device, uploaded to a cloud server for storage, or shown to the user on the computer device's display screen or an external display.
In the embodiment of the invention, the user's motion data is acquired with the data acquisition device, the motion data is analyzed to obtain the degree of completion of the user's limb actions and/or postures, the user's limb function is analyzed according to that degree of completion to obtain the analysis result, and the analysis result is output. The user therefore only needs to stand, sit, or lie in the test area and perform the designated actions and/or postures following the prompts and guidance of the computer equipment for all motion data to be collected and analyzed automatically. Compared with the existing manual assessment approach, the invention uses computer equipment to realize fully automatic, standardized function assessment based on motion data analysis; the assessment process is thus unified and objective within the computer equipment, providing important data references for the final assessment of limb function. As a result, the objectivity, accuracy, convenience, and efficiency of assessment are greatly improved, the assessment threshold is lowered, and assessment costs are effectively saved. The method is applicable to assessment scenarios for a wide variety of limb functional states and therefore also has the advantage of a wide application range.
Referring to fig. 2, an implementation flowchart of a motion data analysis method according to another embodiment of the present invention is provided. The method can be implemented by a computer device with data processing capability, such as a mobile terminal (e.g., a mobile phone, a tablet computer, or a smart wearable device) or an electronic terminal (e.g., a desktop computer, a server, or an all-in-one machine). The computer device may establish a data connection over a data network with a data acquisition device used to collect motion data; alternatively, a data acquisition module may be built into the computer device, or the computer device may be electrically connected to a data acquisition peripheral. As shown in fig. 2, the method may specifically include the following steps:
step S201, creating an action database;
the data stored in the action database includes: description data of a plurality of preset actions and/or postures, description data of limbs corresponding to the preset actions and/or postures, description data of functional states of the limbs corresponding to the preset actions and/or postures, and analysis rules of the functional states;
The limbs include the bilateral upper and lower limbs. From the proximal to the distal end of the body, the parts of the upper limb are the shoulder joint, the upper arm, the elbow joint, the forearm, the wrist joint, and the hand (including the finger joints). From the proximal to the distal end of the body, the parts of the lower limb are the hip joint (on the body surface, the buttock), the thigh, the knee joint, the calf, the ankle joint, and the foot (including the toe joints).
The functional state of a limb may be analyzed from the degree of completion of at least one preset action and/or posture (or a set of them) performed using that limb or an associated limb. A limb action refers to a process of changing the position of the limb, while a posture refers to a state that the limb assumes.
Specifically, the description data of the preset action and/or gesture is used for describing what the preset action and/or gesture is, and the specific form includes: at least one of descriptive text, descriptive speech, and presentation video.
The description data of the limb corresponding to a preset action and/or posture is used to describe the correspondence between the preset action and/or posture and the limb, and what the limb is; for example, it may include the name of the limb.
The description data of the functional state of the limb corresponding to a preset action and/or posture describes a concrete expression of the functional state, for example: pathological state, 40% loss of thumb function. Further, it can also describe the cause of the functional state, a suggested corrective method, and so on, for example: for a thumb fracture, plaster fixation is recommended.
Specifically, in connection with fig. 3, the functional states of a limb include physiological and pathological states. The analysis rule for each functional state is as follows: each functional state comprises a plurality of sub-states, each corresponding to a threshold interval, where the threshold intervals include a normal interval and variation intervals. Optionally, the physiological state comprises a normal sub-state (corresponding to the normal threshold interval) and one or more variant sub-states (corresponding to one or more variant threshold intervals, e.g. variant interval 1, variant interval 2, ..., variant interval N in fig. 3), while the pathological state comprises one or more variant sub-states (one or more variant threshold intervals).
When the user's motion data falls into any threshold interval, or when the limb score obtained by analyzing the motion data falls into any threshold interval, the functional state of the limb reflected by the motion data can be determined to match the sub-state corresponding to that interval.
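This interval matching might look as follows in code; the state names, sub-state names, and threshold intervals are illustrative assumptions, not values from the patent:

```python
# Illustrative analysis rule: each functional state contains sub-states,
# each tied to a score threshold interval [low, high). The interval
# boundaries and names below are invented for illustration.
ANALYSIS_RULE = {
    "physiological": {
        "normal sub-state":   (90, 101),  # normal interval
        "variant interval 1": (75, 90),
    },
    "pathological": {
        "variant interval 1": (40, 75),
        "variant interval 2": (0, 40),
    },
}

def match_sub_state(score, rule):
    """Return the (state, sub-state) whose interval contains the limb
    score, or None if the score falls outside every interval."""
    for state, sub_states in rule.items():
        for sub_state, (low, high) in sub_states.items():
            if low <= score < high:
                return state, sub_state
    return None
```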
It is understood that, referring to fig. 3, the principle of step S201 is to make clinical biological information of each limb in different functional states into a reference standard for performing limb functional state analysis based on user motion data.
Specifically, a number of specific actions and/or postures (i.e., the preset actions and/or postures) used for limb functional state analysis are defined in the action database. These specific actions and/or postures are determined from medical background knowledge and clinical experience, so they carry guiding significance, and the resulting analysis is more accurate, targeted, and usable as a reference. The data recorded in the action database describe the medical significance reflected by each specific limb action and/or posture, the normal or variant intervals reflected by different degrees of completion of each specific limb action and/or posture, and the evaluation criteria or analysis rules of the functional states represented by different limb actions and/or postures or combinations thereof. The evaluation criteria or analysis rules may specifically include, but are not limited to: decision logic, threshold intervals, and the like.
Optionally, the preset actions and/or postures may include, but are not limited to, any combination of one or more of the following:
(1) Resting positions of the limbs: the posture a limb assumes when naturally relaxed and free of external force (e.g., the rest position of the hand).
(2) All actions and/or postures within a limb's full range of motion: all movements of a specific joint of a limb about a specific axis, including the maximum range of active movement (e.g., flexion-extension of the knee joint).
(3) Target actions and/or postures included in movements customized for disease analysis: actions and/or postures designed by clinical experts for analyzing particular diseases, having clinical pathophysiological significance and disease-discriminating value (e.g., single-foot hopping or the OK gesture), for example, all actions and/or postures from the start to the end of a movement.
Step S202, when a data acquisition command is triggered, inquiring the action database to obtain description data of at least one or a group of target preset actions and/or postures pointed by the data acquisition command;
specifically, the computer device provides an operation interface or a physical function button, and the data acquisition instruction may be triggered according to a control operation performed by a user on the operation interface, such as clicking a button in the operation interface for triggering the data acquisition instruction, or according to an operation performed by the user pressing the physical function button.
When the data acquisition instruction is triggered, acquiring identification information of an analysis object which triggers the data acquisition instruction to point to, and querying the action database according to the identification information to obtain the description data of the at least one or one group of target preset actions and/or gestures.
The identification information of the analysis object may include, but is not limited to: the name or number of the limb or analysis target to be analyzed that is associated with the at least one target preset action and/or posture (or group thereof). The action database also stores correspondences between the identification information of different analysis objects and the description data of each preset action and/or posture. Using these correspondences, the description data of the at least one target preset action and/or posture (or group thereof) the user must perform in the analysis task can be retrieved from the action database.
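A minimal sketch of such a lookup, with a hypothetical in-memory action database (all identifiers, action names, and file paths are invented for illustration):

```python
# Hypothetical in-memory action database: maps an analysis object's
# identifier (here, a limb name) to the description data of the target
# preset actions/postures the user must perform.
ACTION_DB = {
    "shoulder": [
        {"action": "shoulder abduction",
         "text": "Raise the arm sideways to shoulder height.",
         "video": "demo/shoulder_abduction.mp4"},
    ],
    "knee": [
        {"action": "knee flexion-extension",
         "text": "Bend and straighten the knee through its full range.",
         "video": "demo/knee_flexion.mp4"},
    ],
}

def query_actions(object_id):
    """Look up the preset action description data for an analysis object."""
    return ACTION_DB.get(object_id, [])
```

A real implementation would presumably back this with persistent storage; the dictionary only illustrates the correspondence the text describes.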
Step S203, outputting first prompt information according to the description data of the at least one or one group of target preset actions and/or postures;
specifically, the first prompt information is output through an output device of the computer device or an output device such as an external display or a player connected to the computer device according to the description data of the at least one or one group of target preset actions and/or gestures, so as to prompt the user to make the at least one or one group of target preset actions and/or gestures. For example: displaying the descriptive text of the at least one or one group of target preset actions and/or gestures in the screen of the computer equipment, playing a demonstration video of the at least one or one group of target preset actions and/or gestures, and playing descriptive voice of the at least one or one group of target preset actions and/or gestures through a loudspeaker of the computer equipment.
It will be appreciated that, in practical applications, depending on the purpose of the assessment, the user may need to perform one or more specified limb actions (or sets of them), one or more specified postures (or sets of them), or a combination of both.
Optionally, when there are multiple target preset actions and/or postures, the first prompt information may be output all at once, containing the description data of every target preset action and/or posture; or it may be output in stages, one action and/or posture (or one group) at a time in data-acquisition order, with each output containing only the description data of the action and/or posture the user currently needs to perform.
Step S204, acquiring motion data of a user through a data acquisition device;
specifically, the data acquisition device may include, but is not limited to: optical imaging technologies such as an optical general camera and an optical depth camera, and at least one of other data acquisition devices such as an electrode plate, a triaxial inertial navigation sensor and a pressure sensor.
It is understood that, in conjunction with fig. 3, the principle of step S204 is that the data acquisition device collects the biological information contained in the limb actions and/or postures the user makes according to the first prompt information, and converts the collected raw data into physical signals (sound, light, electricity, magnetism, heat, force, etc.) that serve as the user's motion data.
Wherein the motion data comprises limb motion and/or gesture data of the user when making the at least one or a set of target preset motions and/or gestures.
It will be appreciated that any limb movement can be decomposed into a combination of joint movements. A joint movement can be seen as a change in the relative position of two adjacent limb segments, with the joint between them as the center of motion. For example, elbow flexion is the process of the upper arm and forearm approaching each other around the elbow joint; elbow extension is the opposite process, in which the upper arm and forearm move away from each other around the elbow joint.
Optionally, the above-mentioned limb motion and/or posture data may include, but is not limited to: at least one of a motion trajectory, a position parameter, a motion direction, and a motion angle of each joint of the limb used by the user when making the at least one or the set of target preset motions and/or postures.
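One possible in-memory representation of this per-joint motion and posture data (the field names and sample values are assumptions for illustration, not part of the patent):

```python
from dataclasses import dataclass, field

@dataclass
class JointMotion:
    """One joint's share of the limb action/posture data listed above."""
    joint: str                                      # e.g. "elbow"
    trajectory: list = field(default_factory=list)  # sampled (x, y, z) positions
    direction: str = ""                             # e.g. "flexion"
    angle_deg: float = 0.0                          # current joint angle

# A hypothetical sample captured while the user flexes the elbow.
sample = JointMotion(joint="elbow",
                     trajectory=[(0.0, 0.0, 0.0), (0.1, 0.2, 0.0)],
                     direction="flexion",
                     angle_deg=95.0)
```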
Step S205, analyzing the motion data, and judging whether the limb action and/or the posture of the user meet preset requirements;
specifically, the acquired motion data of the user is compared with the preset feature data of the target preset motion and/or posture to obtain the similarity between the motion and/or posture actually made by the user and the target preset motion and/or posture, whether the similarity is greater than the preset similarity is judged, if so, the limb motion and/or posture of the user is determined to meet the preset requirement, otherwise, the limb motion and/or posture of the user is determined not to meet the preset requirement.
It can be understood that if the user's motion data includes only limb action data, then only the user's limb actions need to be checked against the preset requirement; that is, only the similarity between the actions actually performed and the target preset actions needs to be calculated. If the motion data includes only posture data, then only the user's postures need to be checked, i.e., only the similarity between the postures actually assumed and the target preset postures needs to be calculated. If the motion data includes both limb action data and posture data, then both must meet the preset requirements, and step S206 is executed as soon as either the limb action or the posture fails to meet them.
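The acceptance check described in step S205 might be sketched as follows; cosine similarity and the 0.8 threshold are illustrative choices, since the patent does not fix a particular similarity measure:

```python
# Sketch of the acceptance check: compare the user's feature vector
# against the target preset's feature data with a similarity score.
# Cosine similarity and the 0.8 threshold are illustrative assumptions.
def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(y * y for y in b) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def meets_requirement(user_features, preset_features, threshold=0.8):
    """True if the performed action/posture is close enough to the preset."""
    return cosine_similarity(user_features, preset_features) > threshold
```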
Step S206, if the preset requirement is not met, outputting a second prompt message, and returning to execute the step S204: acquiring motion data of a user through a data acquisition device;
the second prompt message is used for prompting the user to make the action and/or gesture indicated by the second prompt message.
In particular, when the acquired motion data is data collected by the user while making a single or single group of actions and/or gestures, the second prompt message is used to prompt the user to make the last action and/or gesture made again. When the acquired motion data is data acquired when the user makes one or more groups of actions and/or gestures, the second prompt information is used for prompting the user to make one or more groups of actions and/or gestures which do not meet the preset requirement again, namely, the user does not need to do any more when meeting the preset requirement.
Step S207, if the preset requirement is met, using the motion data to analyze whether the limb used by the user corresponds to a physiological state or a pathological state;
specifically, if the limb movement and/or posture actually made by the user meets the preset requirement, which indicates that the acquired motion data meets the analysis requirement, the limb movement and/or posture data of the user is compared with the characteristic data range of the target preset movement and/or posture in the physiological state and the characteristic data range of the target preset movement and/or posture in the pathological state, respectively.
And if the body motion and/or posture data of the user falls into the characteristic data range of the target preset motion and/or posture in the physiological state, determining that the body used by the user when the user performs the body motion and/or posture corresponds to the physiological state.
And if the body motion and/or posture data of the user falls into the characteristic data range of the target preset motion and/or posture in the case state, determining that the body used by the user when the user performs the body motion and/or posture corresponds to the pathological state.
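A minimal sketch of this range comparison, using a single hypothetical measurement (a shoulder-abduction angle in degrees) and assumed characteristic ranges:

```python
# Sketch of step S207: decide which state's characteristic data range the
# user's data falls into. The measurement (a shoulder-abduction angle in
# degrees) and both ranges are illustrative assumptions.
PHYSIOLOGICAL_RANGE = (150.0, 180.0)  # assumed normal abduction range
PATHOLOGICAL_RANGE = (0.0, 150.0)     # assumed impaired range

def classify_state(angle_deg):
    low, high = PHYSIOLOGICAL_RANGE
    if low <= angle_deg <= high:
        return "physiological"
    low, high = PATHOLOGICAL_RANGE
    if low <= angle_deg < high:
        return "pathological"
    return "unclassified"
```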
Step S208, obtaining a target analysis rule corresponding to the current analysis result by inquiring the action database;
specifically, by querying the action database, the first prompt information is obtained to indicate the target preset action and/or gesture made by the user, and the target analysis rule corresponding to the analysis result obtained in step S207 is, for example: if the target preset action is a shoulder abduction action and the analysis result is that the shoulder abduction action made by the user corresponds to a pathological state, obtaining an analysis rule (such as a threshold interval and judgment logic of each sub-state in the pathological state) of the shoulder abduction action as a target analysis rule by querying the action database.
Step S209, extracting target characteristic data of the region of interest corresponding to the target analysis rule from the motion data;
specifically, the region of interest is a preset region on the limb used when the user performs the target preset action and/or posture, and the preset region can ensure that valuable biological information data are collected from the limb as much as possible on the premise of maintaining data accuracy, and remove false and impurity from interference caused by complicated and polymorphic information in the collection process, for example, the region of interest can be at each joint of the limb. In practical application, the region of interest can be selected according to a user-defined operation.
Step S210, analyzing the target characteristic data to obtain the completion degree of the limb action and/or posture of the user, and analyzing the limb function of the user according to the completion degree and the target analysis rule to obtain an analysis result;
with reference to fig. 3, the principle of step S210 is to convert the collected physical information into computer-recognizable information (e.g. 0 for a normal state and 1 for a pathological state) describing the limb movement or posture part, its completion situation, and so on, and then to analyze and evaluate this information to obtain an analysis result. For example, input information such as the initial position, range of motion, real-time movement position and movement completion degree of the limb movement or posture is analyzed, and operations such as assignment and calculation are performed according to the corresponding index, so as to obtain the computer-processed analysis result. The analysis result is then converted into a functional assessment result described in natural language that the user can understand (i.e., converted back into biological information) for the user's reference. The assignment according to the index can be realized as an interval division of the completion degree or completion state of any limb action and/or posture.
Specifically, the target characteristic data is compared with preset standard characteristic data of the target preset action and/or posture to obtain an error of the target characteristic data and the standard characteristic data, and the completeness of the limb action and/or posture of the user is calculated according to the error. And then assigning values to the limb actions and/or postures of the user according to the calculated completion degree and a preset assignment rule to obtain an assignment result.
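A minimal sketch of the completion-degree computation: compare a measured value (e.g. peak joint angle) against the standard characteristic data for the action, and map the relative error to a 0-1 completion degree. The linear mapping is an assumed example, not a rule stated in the text:

```python
# Completion degree from the error between measured and standard data.
def completion_degree(measured: float, standard: float) -> float:
    error = abs(standard - measured)
    degree = max(0.0, 1.0 - error / standard)  # clamp so degree never goes negative
    return round(degree, 3)

# A user reaching 135 degrees of shoulder abduction against a 180-degree standard:
degree = completion_degree(measured=135.0, standard=180.0)  # -> 0.75
```

The preset assignment rule would then map this degree onto a score, for instance by interval division as described above.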
When the user performs a single limb action and/or posture, the sub-state corresponding to the assignment result is determined according to the target analysis rule, and the action database is queried to obtain the description data of that sub-state as the analysis result.
When the user performs a plurality of limb actions and/or postures, the weight corresponding to each limb action and posture is determined according to a preset weight distribution rule. A total score is then calculated from the assignment results, the weights and a preset algorithm. The sub-state corresponding to the total score is then determined according to the target analysis rule, and the action database is queried to obtain the description data of that sub-state as the analysis result. The preset algorithm may be, for example, a weighting algorithm or a weighted average algorithm.
For example, assuming that the assignment of the shoulder abduction motion made by the user is 5, and this value falls into the threshold interval corresponding to sub-state 2 of the shoulder abduction motion in the pathological state (e.g. action 3 - pathological state - variation interval 2 in fig. 3), the description data of sub-state 2 is obtained as the analysis result by querying the action database. The description data of sub-state 2 may, for example but without limitation, describe what sub-state 2 specifically is, the cause of its occurrence, the suggested corrective method, and so on.
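The single-action and multi-action scoring paths above can be sketched together: one action's assignment maps directly to a sub-state via its threshold interval, while several actions are first combined into a weighted total score. All intervals, descriptions, weights and scores are illustrative assumptions, not values from the action database:

```python
# Sub-state threshold intervals and description data (illustrative).
SUB_STATES = [
    ((0, 3), "sub-state 1: severe restriction"),
    ((3, 6), "sub-state 2: moderate restriction"),
    ((6, 10), "sub-state 3: mild restriction"),
]

def describe(score):
    """Return the description data of the sub-state whose interval holds the score."""
    for (low, high), description in SUB_STATES:
        if low <= score < high:
            return description
    return "score outside all known intervals"

def weighted_total(scores, weights):
    """Weighted-average variant of the preset algorithm for multiple actions."""
    total = sum(scores[a] * weights[a] for a in scores)
    return total / sum(weights[a] for a in scores)

# Single action: an assignment of 5 falls into the (3, 6) interval -> sub-state 2.
single = describe(5)

# Multiple actions: combine per-action assignments into one total score.
scores = {"shoulder_abduction": 5.0, "elbow_flexion": 8.0}
weights = {"shoulder_abduction": 2.0, "elbow_flexion": 1.0}
total = weighted_total(scores, weights)  # (5*2 + 8*1) / 3 = 6.0
combined = describe(total)              # 6.0 falls into (6, 10) -> sub-state 3
```

The shoulder-abduction example in the text corresponds to the single-action path here.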
It will be appreciated that the physiological and pathological states described above reflect whether the range of motion of each joint of the measured limb is within the normally acceptable range. The motion data analysis mainly compares the acquired motion data with preset standard data or data ranges to determine whether the range of motion of each joint of the measured limb deviates from the normal range (if so, the interval corresponding to the function loss is determined, for evaluation, according to the data ranges preset in the action database), and whether the joints of the measured limb can complete a specific combination of joint angles (i.e., form a special motion posture that can complete a specific daily-life task).
It can be understood that, in this embodiment, through the three-level signal acquisition, transmission and conversion system, the process of clinically defining biological information (limb movement and/or posture), converting the biological information into physical information (optical capture and photoelectric conversion into transmitted electrical information), converting the physical information into data information, analyzing and processing the data information with the analysis module software, and then restoring the data information into biological information (the evaluation result) is completed, thereby finally realizing automatic, standardized and noninvasive function evaluation based on motion data and computer equipment.
And step S211, outputting the analysis result.
Specifically, the analysis result may be output in the form of, but not limited to, data, images, video or text. With reference to fig. 3, the analysis result may optionally be stored locally in the computer device, or in a storage device in the cloud, such as a cloud server.
Optionally, outputting the analysis result specifically includes: acquiring a preset file format, content description information and a file template; extracting target content from the analysis result according to the content description information; and generating and outputting an analysis report file which contains the target content and has the file format by using the file template. For example: and filling the target content into the corresponding position in the file template, and then packaging the file template filled with the target content into the file with the preset file format and outputting the file.
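The report-generation step above can be sketched as extracting the target content from the analysis result and filling it into a template. The template layout, placeholder names and result fields are assumptions for the example:

```python
from string import Template

# Hypothetical file template with placeholders for the target content.
TEMPLATE = Template(
    "Limb Function Assessment Report\n"
    "Action: $action\n"
    "Completion degree: $degree\n"
    "Conclusion: $conclusion\n"
)

def build_report(analysis_result: dict, fields: list) -> str:
    """Extract the fields named by the content description and fill the template."""
    target_content = {k: analysis_result[k] for k in fields}
    return TEMPLATE.substitute(target_content)

report = build_report(
    {"action": "shoulder abduction", "degree": 0.75,
     "conclusion": "sub-state 2 (moderate restriction)", "raw_frames": 1024},
    fields=["action", "degree", "conclusion"],  # the content description information
)
```

Note how fields not listed in the content description (here `raw_frames`) are excluded from the generated report, matching the "extract target content" step.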
Further, according to a query instruction triggered by a user, a plurality of analysis results pointed by the query instruction are obtained from all stored analysis results, and a series of function evaluation result reports containing the plurality of analysis results are generated and output.
Optionally, in the steps of acquiring and analyzing the motion data, the motion data of all actions and/or gestures performed by the user may be acquired at one time; the acquired motion data is analyzed to determine all target actions and/or gestures that do not meet the preset requirement; second prompt information is output to prompt the user to perform these target actions and/or gestures again in sequence; the motion data of the target actions and/or gestures performed by the user is then acquired, and the analysis step is repeated until all actions and/or gestures performed by the user meet the preset requirement.
Alternatively, each time the user makes an action or gesture, the motion data obtained for that action or gesture is analyzed to determine whether it meets the preset requirement. If it does, the user is prompted to make the next action or gesture, and the analysis step is performed on the motion data of that next action or gesture. If it does not, the user is prompted to make the current action or gesture again, and the analysis step is performed on the newly acquired motion data. This is repeated until the user has made all the actions and/or gestures indicated by the first prompt information and all of them meet the preset requirement.
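The per-action flow just described can be sketched as a prompt/acquire/validate loop. Here `acquire()` and `meets_requirement()` are stand-ins for the real capture and validation logic, and the retry limit is an assumed parameter:

```python
# For each prompted action: acquire motion data and re-prompt (up to a retry
# limit) until the data meets the preset requirement.
def run_session(actions, acquire, meets_requirement, max_retries=3):
    collected = {}
    for action in actions:
        for attempt in range(max_retries):
            data = acquire(action)               # prompt the user and capture data
            if meets_requirement(action, data):  # validate before moving on
                collected[action] = data
                break
    return collected

# Simulated session: the first attempt at each action "fails", the second passes.
attempts = {}
def fake_acquire(action):
    attempts[action] = attempts.get(action, 0) + 1
    return {"attempt": attempts[action]}

def fake_check(action, data):
    return data["attempt"] >= 2  # pretend the first try never meets the requirement

result = run_session(["shoulder_abduction", "elbow_flexion"], fake_acquire, fake_check)
```

The batch variant described earlier would instead collect all actions first and then loop only over those that failed validation.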
Optionally, the device may be switched to a manual mode according to a selection operation performed by the user on the operation interface provided by the computer device; in this mode, whether the action and/or gesture performed by the user meets the preset requirement is determined according to a determination instruction triggered by the user on the operation interface.
Specifically, a voice prompt and example action module is configured in the computer equipment, so that the tested person can be guided through hearing and vision to make the specified limb action and/or posture; whether the actual action and/or posture meets the requirements, and whether a certain action and/or posture needs to be redone or skipped, is detected automatically; and the tested person is guided into the next limb action data acquisition step until the whole data acquisition process is completed automatically. Alternatively, the device can be switched into a manual mode as required by the tester, in which case whether an action meets the requirements is judged manually, the content of the next limb action data acquisition is determined manually, and the computer equipment assists in the measurement process.
Optionally, each interval in the action database, and the various standard data, standard data ranges or standard characteristic data used as references, may be obtained based on big data analysis and may optionally be stored in the action database. The normal state (i.e., physiological state), the normal-state variation interval, the pathological state and the pathological-state variation interval are distinguished by analyzing large samples of collected motion data using algorithms such as the normal distribution and confidence intervals. Furthermore, by dividing each interval into sections with function scores, the completion state of any limb action and/or posture can be mapped to a specific interval in the physiological state or pathological state, yielding a corresponding function evaluation score or prompt.
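One way to derive such a reference interval from sample big data, assuming the measurements are roughly normally distributed, is to take the mean plus/minus z standard deviations as the approximate 95% normal interval; values outside it would fall into variation intervals. The sample values below are illustrative:

```python
import statistics

# Approximate normal interval (mean +/- z * sample standard deviation).
def normal_interval(samples, z=1.96):
    mean = statistics.mean(samples)
    sd = statistics.stdev(samples)
    return (mean - z * sd, mean + z * sd)

# Peak shoulder-abduction angles (degrees) from a hypothetical sample.
samples = [178, 181, 175, 180, 179, 177, 182, 176, 180, 178]
low, high = normal_interval(samples)
```

A production system would likely use far larger samples and more careful statistics (e.g. testing the normality assumption first), but the interval-derivation idea is the same.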
It should be noted that, with reference to fig. 3, the data acquisition step and the data analysis step may be performed continuously at the same time or may be performed separately at different times. The data collection result and the data analysis result may be stored in an internal storage device or an external storage device (e.g., a cloud server) of the computer device, and then output through an internal output device or an external output device (e.g., a display, a router, a printer, etc.) of the computer device.
Further, the data acquisition result and the analysis result may be part of big data for specifying reference data such as the sections and the standard data.
In the embodiment of the invention, the motion data of the user is acquired by the data acquisition device, the motion data is analyzed to obtain the completion degree of the limb action and/or posture of the user, the limb function of the user is analyzed according to the completion degree to obtain an analysis result, and the analysis result is output. The user therefore only needs to stand, sit or lie in the test area and make the designated actions and/or postures under the prompts and guidance of the computer equipment to complete the automatic acquisition and automatic analysis of all motion data. Compared with the existing manual evaluation mode, the invention realizes fully automatic, standardized function evaluation based on motion data analysis by using computer equipment, so that the evaluation process is unified and objective and provides an important data reference for the final evaluation of limb function. The objectivity, accuracy, convenience and efficiency of the evaluation are thereby greatly improved, the evaluation threshold is lowered, and the evaluation cost is effectively reduced. The method is applicable to evaluation scenarios for various limb functional states and thus also has the advantage of a wide application range.
Referring to fig. 4, a schematic structural diagram of a motion data analysis apparatus according to an embodiment of the present invention is provided. For convenience of explanation, only portions related to the embodiments of the present application are shown. The apparatus may be a computer device or a software module configured with a computer device. As shown in fig. 4, the apparatus includes: a data acquisition module 401, a data analysis module 402 and an output module 403.
A data acquisition module 401, configured to acquire motion data of a user through a data acquisition device, where the motion data includes limb movement and/or posture data of the user;
a data analysis module 402, configured to analyze the motion data to obtain a degree of completion of the limb movement and/or posture of the user, and analyze the limb function of the user according to the degree of completion to obtain an analysis result;
an output module 403, configured to output the analysis result.
The specific process of the modules to implement their respective functions may refer to the related contents in the embodiments shown in fig. 1 to fig. 3, and will not be described herein again.
In the embodiment of the invention, the motion data of the user is acquired by the data acquisition device, the motion data is analyzed to obtain the completion degree of the limb action and/or posture of the user, the limb function of the user is analyzed according to the completion degree to obtain an analysis result, and the analysis result is output. The user therefore only needs to stand, sit or lie in the test area and make the designated actions and/or postures under the prompts and guidance of the computer equipment to complete the automatic acquisition and automatic analysis of all motion data. Compared with the existing manual evaluation mode, the invention realizes fully automatic, standardized function evaluation based on motion data analysis by using computer equipment, so that the evaluation process is unified and objective and provides an important data reference for the final evaluation of limb function. The objectivity, accuracy, convenience and efficiency of the evaluation are thereby greatly improved, the evaluation threshold is lowered, and the evaluation cost is effectively reduced. The method is applicable to evaluation scenarios for various limb functional states and thus also has the advantage of a wide application range.
Referring to fig. 5, a schematic structural diagram of a motion data analysis apparatus according to another embodiment of the present invention is provided. For convenience of explanation, only portions related to the embodiments of the present application are shown. The apparatus may be a computer device or a software module configured with a computer device. Unlike the embodiment shown in fig. 4, as shown in fig. 5,
further, the apparatus further comprises:
an assignment module 501, configured to create an action database, where data stored in the action database includes: description data of a plurality of preset actions and/or postures, description data of limbs corresponding to the preset actions and/or postures, description data of functional states of the limbs corresponding to the preset actions and/or postures, and analysis rules of the functional states;
wherein the functional state includes a physiological state and a pathological state, and the analysis rule includes: each of the functional states includes a plurality of sub-states each corresponding to a threshold interval, and the threshold interval includes a normal interval and a variation interval.
Further, the apparatus further comprises:
the action prompt module 502 is configured to, when a data acquisition instruction is triggered, query the action database to obtain description data of at least one or a group of target preset actions and/or gestures pointed by the data acquisition instruction;
and outputting first prompt information according to the description data of the at least one or one group of target preset actions and/or gestures to prompt the user to make the at least one or one group of target preset actions and/or gestures.
Further, the apparatus further comprises:
a determining module 503, configured to analyze the motion data and determine whether the body motion and/or posture of the user meets a preset requirement;
the action prompting module 502 is further configured to output second prompting information if the determination result of the determining module 503 is that the preset requirement is not met, where the second prompting information is used to prompt the user to make an action and/or a gesture indicated by the prompting information;
the data acquisition module 401 is further configured to analyze whether the limb used by the user corresponds to the physiological state or the pathological state by using the motion data if the determination result of the determination module 503 is that the preset requirement is met; obtaining a target analysis rule corresponding to an analysis result by inquiring the action database; and extracting target characteristic data of the region of interest corresponding to the target analysis rule from the motion data.
Further, the data analysis module 402 is further configured to compare the target feature data with preset standard feature data to obtain the completion degree of the limb action and/or posture of the user; assign a value to the limb action and/or posture of the user according to the completion degree and a preset assignment rule; when the user performs a single limb action and/or posture, determine the sub-state corresponding to the assignment result according to the target analysis rule, and query the action database to obtain the description data of that sub-state as the analysis result; when the user performs a plurality of limb actions and/or postures, determine the weight corresponding to each limb action and posture according to a preset weight distribution rule; calculate a total score according to the assignment results, the weights and a preset algorithm; and determine the sub-state corresponding to the total score according to the target analysis rule, and query the action database to obtain the description data of that sub-state as the analysis result.
Further, the output module 403 is further configured to obtain a preset file format, content description information, and a file template; extracting target content from the analysis result according to the content description information; and generating an analysis report file with the file format by using the file template and outputting the analysis report file, wherein the analysis report file comprises the target content.
The specific process of the modules to implement their respective functions may refer to the related contents in the embodiments shown in fig. 1 to fig. 3, and will not be described herein again.
In the embodiment of the invention, the motion data of the user is acquired by the data acquisition device, the motion data is analyzed to obtain the completion degree of the limb action and/or posture of the user, the limb function of the user is analyzed according to the completion degree to obtain an analysis result, and the analysis result is output. The user therefore only needs to stand, sit or lie in the test area and make the designated actions and/or postures under the prompts and guidance of the computer equipment to complete the automatic acquisition and automatic analysis of all motion data. Compared with the existing manual evaluation mode, the invention realizes fully automatic, standardized function evaluation based on motion data analysis by using computer equipment, so that the evaluation process is unified and objective and provides an important data reference for the final evaluation of limb function. The objectivity, accuracy, convenience and efficiency of the evaluation are thereby greatly improved, the evaluation threshold is lowered, and the evaluation cost is effectively reduced. The method is applicable to evaluation scenarios for various limb functional states and thus also has the advantage of a wide application range.
Referring to fig. 6, a hardware structure diagram of an electronic device according to an embodiment of the present invention is provided. As shown in fig. 6, the electronic device 60 includes: a network interface 61, a processor 62, a memory 63, a computer program 64 stored in the memory 63 and executable on the processor 62, and a system bus 65. The system bus 65 is used to connect the network interface 61, the processor 62, and the memory 63. The processor 62, when executing the computer program 64, implements the steps in the various motion data analysis method embodiments described above, such as steps S101 through S103 shown in fig. 1, or steps S201 through S211 shown in fig. 2.
The network interface 61 is used to communicate with other servers.
Illustratively, the Processor 62 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, and so on. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
For example, the memory 63 may be a hard disk drive memory, a non-transitory memory, a non-volatile memory (e.g., a flash memory or other electrically erasable programmable memory used to form a solid state drive, etc.), a volatile memory (e.g., a static or dynamic random access memory, etc.), and the like; the embodiments of the present application are not limited thereto. The memory 63 may include both an internal storage unit of the electronic device 60 and an external storage device. The memory 63 is used for storing the computer program and other programs and data required by the electronic device 60, and may also be used to temporarily store data that has been output or is to be output.
Illustratively, the computer program 64 may be partitioned into one or more modules/units that are stored in the memory 63 and executed by the processor 62 to implement the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 64 in the electronic device 60. For example, the data acquisition module 401, the data analysis module 402 and the output module 403 shown in fig. 4, and the assignment module 501, the action prompt module 502 and the judgment module 503 shown in fig. 5. For the detailed functions of the modules, please refer to the related descriptions in the embodiments shown in fig. 4 and fig. 5, which are not repeated herein.
Further, stored in memory 63 are device drivers, which may be network and interface drivers.
Those skilled in the art will appreciate that fig. 6 is merely an example of the electronic device 60, and does not constitute a limitation of the electronic device 60, and in practical applications may include more or less components than those shown, or combine some components, or different components, for example, the electronic device 60 may further include: input/output devices (e.g., keyboard, microphone, camera, speaker, display screen, etc.).
Further, the present application provides a non-transitory computer readable storage medium, which can be configured in the computer device or the control apparatus in the foregoing embodiments, and the non-transitory computer readable storage medium stores a computer program, which when executed by a processor, implements the motion data analysis method described in the foregoing embodiments shown in fig. 1 to 3.
Referring to fig. 7, a schematic structural diagram of a motion data analysis system according to an embodiment of the present invention is provided. As shown in fig. 7, the system includes: a data acquisition device 701, a data analysis device 702, and a server 703.
The data acquisition device 701 is used for acquiring motion data of a user.
The data analysis device 702 may be, for example, a computer device described in the embodiments shown in fig. 1 and fig. 2, and its structure may be specifically shown in fig. 6. The data analysis device 702 is configured to perform the steps of the motion data analysis method provided in the foregoing embodiment, please refer to the related description in the embodiments shown in fig. 1 to fig. 3, which is not repeated herein.
And a server 703 for storing the analysis result output by the data analysis apparatus 702.
Further, the server 703 is also used for creating the action database, or for storing the action database created by the data analysis device 702.
Further, the server 703 is further configured to provide an analysis result query, obtain, according to the received query request, a plurality of analysis results pointed by the query request from all stored analysis results, generate and output a series of function evaluation result reports including the plurality of analysis results.
The specific processes of the data acquisition device 701, the data analysis device 702, and the server 703 for implementing their respective functions may refer to the related descriptions in the embodiments shown in fig. 1 to fig. 6, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative modules/elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal and method may be implemented in other ways. For example, the above-described apparatus/terminal embodiments are merely illustrative; the division into modules or units is merely a division by logical function, and in actual implementation there may be other divisions, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow in the method of the above embodiments of the present invention can also be implemented by instructing related hardware through a computer program. The computer program may be stored in a computer readable storage medium, and when executed by a processor, may implement the steps of the various method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer readable medium may include: any entity or device capable of carrying computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content of the computer readable medium may be appropriately increased or decreased as required by legislation and patent practice in the jurisdiction; for example, in some jurisdictions, computer readable media may not include electrical carrier signals and telecommunications signals in accordance with legislation and patent practice.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (10)

1. A motion data analysis method applied to computer equipment is characterized by comprising the following steps:
acquiring motion data of a user through a data acquisition device, wherein the motion data comprises limb action and/or posture data of the user;
analyzing the motion data to obtain the completion degree of the limb action and/or posture of the user, and analyzing the limb function of the user according to the completion degree to obtain an analysis result;
and outputting the analysis result.
2. The method of claim 1, wherein prior to obtaining the motion data of the user via the data collection device, the method further comprises:
creating an action database, wherein the data stored in the action database comprises: description data of a plurality of preset actions and/or gestures, description data of limbs corresponding to each preset action and/or gesture, description data of functional states of limbs corresponding to each preset action and/or gesture, and analysis rules of each functional state;
wherein the functional states include physiological states and pathological states, and the analysis rule includes: each of the functional states includes a plurality of sub-states each corresponding to a threshold interval, and the threshold interval includes a normal interval and a variant interval.
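The action database of claim 2 can be pictured as a keyed mapping from preset actions to their description data and analysis rules. The following Python sketch is purely illustrative: every action name, field name, sub-state label, and interval boundary is a hypothetical placeholder, not a value taken from the patent.

```python
# Illustrative sketch of the action database of claim 2. All names and
# interval boundaries below are assumptions for illustration only.
ACTION_DB = {
    "wrist_extension": {
        "description": "Extend the wrist upward as far as possible",
        "limb": "right forearm",
        "functional_state": "physiological",
        # Analysis rule: the functional state has several sub-states,
        # each corresponding to a threshold interval on a 0-100 score.
        "analysis_rule": {
            "normal": (80, 100),    # normal interval
            "reduced": (40, 80),    # variant interval
            "impaired": (0, 40),    # variant interval
        },
    },
}

def substate_for_score(action: str, score: float) -> str:
    """Return the sub-state whose threshold interval contains the score."""
    for substate, (low, high) in ACTION_DB[action]["analysis_rule"].items():
        if low <= score <= high:
            return substate
    raise ValueError(f"score {score} matches no interval for {action}")
```

With this layout, `substate_for_score("wrist_extension", 90)` would fall in the normal interval, while a score of 10 would fall in the lowest variant interval.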
3. The method of claim 2, wherein the obtaining of the motion data of the user by the data acquisition device comprises:
when a data acquisition instruction is triggered, inquiring the action database to obtain description data of at least one or a group of target preset actions and/or postures pointed by the data acquisition instruction;
outputting first prompt information according to the description data of the at least one or one group of target preset actions and/or gestures to prompt the user to make the at least one or one group of target preset actions and/or gestures;
and acquiring the motion data through the data acquisition device.
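The acquisition flow of claim 3 (query the database, output the first prompt information, then collect data) can be sketched as below. The instruction format, the database layout, and the `capture_fn` callback standing in for the data acquisition device are all assumptions for illustration.

```python
# Illustrative sketch of claim 3's acquisition flow; the field names
# "target_actions"/"description" and the capture callback are assumed.
def acquire_motion_data(instruction, action_db, capture_fn):
    """Query the database for the targets the instruction points to,
    build the first prompt information, then collect motion data."""
    # Keep only target actions the database actually describes
    targets = [a for a in instruction["target_actions"] if a in action_db]
    # First prompt information: tell the user which action/posture to make
    prompts = [f"Please perform: {action_db[a]['description']}" for a in targets]
    # Acquire raw motion data for each prompted action via the device
    data = {a: capture_fn(a) for a in targets}
    return prompts, data
```

For example, calling it with a one-action instruction and a stub capture function returns one prompt string and one keyed data entry.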
4. The method of claim 2 or 3, wherein after the acquiring of the motion data by the data acquisition device, the method further comprises:
analyzing the motion data, and judging whether the limb action and/or the posture of the user meet preset requirements;
if the preset requirement is not met, outputting second prompt information and returning to the step of acquiring the motion data of the user through the data acquisition device, wherein the second prompt information is used for prompting the user to make the action and/or posture indicated by the second prompt information;
if the preset requirement is met, analyzing, by using the motion data, whether the limb used by the user is in the physiological state or the pathological state;
obtaining a target analysis rule corresponding to the current analysis result by inquiring the action database;
and extracting target feature data of the region of interest corresponding to the target analysis rule from the motion data.
5. The method according to claim 4, wherein the analyzing the motion data to obtain a degree of completion of the user's limb movement and/or posture, and analyzing the user's limb function according to the degree of completion to obtain an analysis result comprises:
comparing the target characteristic data with preset standard characteristic data to obtain the completion degree of the limb action and/or posture of the user;
assigning values to the limb actions and/or postures of the user according to the completion degree and a preset assignment rule;
when the number of the limb actions and/or postures of the user is one, determining the sub-state corresponding to the assignment result according to the target analysis rule, and inquiring the action database to obtain description data of the sub-state corresponding to the assignment result as the analysis result;
when the number of the limb actions and/or postures of the user is more than one, determining the weight corresponding to each limb action and each posture according to a preset weight distribution rule;
calculating a total score according to the assignment result, the weight and a preset algorithm;
and determining the sub-state corresponding to the total score according to the target analysis rule, and inquiring the action database to obtain the description data of the sub-state corresponding to the total score as the analysis result.
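The scoring logic of claim 5 (assign a score per action, then combine several actions by weight) can be sketched as follows. The 0-100 assignment rule, the equal-weight default, and the function name are assumptions; the patent leaves the assignment rule, weight distribution rule, and algorithm unspecified.

```python
# Minimal sketch of claim 5's scoring; the 0-100 scale and equal-weight
# default are assumed, not specified by the patent.
def score_actions(completions, weights=None):
    """completions: action -> completion degree in [0, 1].
    One action returns its own score; several actions return a
    weighted total per a preset weight distribution rule."""
    # Assignment rule (assumed): completion degree scaled to 0-100
    scores = {a: c * 100 for a, c in completions.items()}
    if len(scores) == 1:
        return next(iter(scores.values()))
    if weights is None:
        # Default weight distribution rule (assumed): equal weights
        weights = {a: 1 / len(scores) for a in scores}
    # Preset algorithm (assumed): weighted sum over all actions
    return sum(scores[a] * weights[a] for a in scores)
```

The resulting single score or total score would then be mapped to a sub-state through the target analysis rule's threshold intervals.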
6. The method of claim 1, wherein the outputting the analysis results comprises:
acquiring a preset file format, content description information and a file template;
extracting target content from the analysis result according to the content description information;
and generating an analysis report file with the file format by using the file template and outputting the analysis report file, wherein the analysis report file comprises the target content.
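Claim 6's report generation (extract target content per the content description information, fill a template, emit a file in a preset format) can be sketched like this. The field names, the `str.format` template syntax, and the JSON wrapper are assumptions for illustration.

```python
import json

# Illustrative sketch of claim 6's report generation; template syntax
# and output formats are assumed, not taken from the patent.
def generate_report(analysis_result, content_fields, template, file_format="txt"):
    # Extract target content according to the content description information
    target = {k: v for k, v in analysis_result.items() if k in content_fields}
    # Fill the file template with the extracted target content
    body = template.format(**target)
    # Emit the report in the preset file format
    if file_format == "json":
        return json.dumps({"report": body})
    return body
```

For instance, extracting only a sub-state description and rendering it through a plain-text template yields the report body directly, while the JSON path wraps the same body in a serialized object.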
7. An apparatus for analyzing motion data, the apparatus comprising:
the data acquisition module is used for acquiring motion data of a user through a data acquisition device, wherein the motion data comprises limb action and/or posture data of the user;
the data analysis module is used for analyzing the motion data to obtain the completion degree of the limb action and/or posture of the user, and analyzing the limb function of the user according to the completion degree to obtain an analysis result;
and the output module is used for outputting the analysis result.
8. A motion data analysis system, the system comprising: the system comprises a data acquisition device, a data analysis device and a server;
wherein the data analysis device is configured to perform each step in the motion data analysis method according to any one of claims 1 to 6;
and the server is used for storing the analysis result output by the data analysis device.
9. An electronic device, comprising:
a memory and a processor;
the memory stores executable program code;
the processor, coupled to the memory, invokes the executable program code stored in the memory to perform the steps in the motion data analysis method of any of claims 1 to 6.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out a method of motion data analysis according to any one of claims 1 to 6.
CN202110276755.4A 2021-02-23 2021-03-15 Motion data analysis method, device, system and computer readable storage medium Pending CN112885465A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2021102013241 2021-02-23
CN202110201324 2021-02-23

Publications (1)

Publication Number Publication Date
CN112885465A (en) 2021-06-01

Family

ID=76042030


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113436736A (en) * 2021-06-16 2021-09-24 深圳英鸿骏智能科技有限公司 Rehabilitation assessment method, system, device and storage medium
CN114495262A (en) * 2021-12-24 2022-05-13 北京航空航天大学 Method, system, computer equipment and storage medium for limb evaluation
CN118335282A (en) * 2024-04-01 2024-07-12 青岛黄海学院 Rehabilitation gait pattern targeting generation method and system based on hybrid FES exoskeleton system fusion control

Citations (7)

Publication number Priority date Publication date Assignee Title
KR20090058803A (en) * 2007-12-05 2009-06-10 전북대학교산학협력단 Method for learning video exercise
CN106984027A (en) * 2017-03-23 2017-07-28 华映科技(集团)股份有限公司 Action comparison analysis method and device and display
CN107422852A (en) * 2017-06-27 2017-12-01 掣京机器人科技(上海)有限公司 Healing hand function training and estimating method and system
CN109288651A (en) * 2018-08-20 2019-02-01 中国科学院苏州生物医学工程技术研究所 Personalized upper-limbs rehabilitation training robot system and its recovery training method
CN110232963A (en) * 2019-05-06 2019-09-13 中山大学附属第一医院 A kind of upper extremity exercise functional assessment system and method based on stereo display technique
CN110755084A (en) * 2019-10-29 2020-02-07 南京茂森电子技术有限公司 Motion function evaluation method and device based on active and passive staged actions
CN110755085A (en) * 2019-10-29 2020-02-07 南京茂森电子技术有限公司 Motion function evaluation method and equipment based on joint mobility and motion coordination

Similar Documents

Publication Publication Date Title
US9761011B2 (en) Motion information processing apparatus obtaining motion information of a subject performing a motion
US11266341B2 (en) Measuring dynamic body movement
CN112885465A (en) Motion data analysis method, device, system and computer readable storage medium
US9710920B2 (en) Motion information processing device
US20150005910A1 (en) Motion information processing apparatus and method
US20150325004A1 (en) Motion information processing apparatus and method
US20150320343A1 (en) Motion information processing apparatus and method
CN111325745B (en) Fracture region analysis method and device, electronic equipment and readable storage medium
WO2017161733A1 (en) Rehabilitation training by means of television and somatosensory accessory and system for carrying out same
Song et al. Cellphone-based automated Fugl-Meyer assessment to evaluate upper extremity motor function after stroke
Hidayat et al. LOVETT scalling with MYO armband for monitoring finger muscles therapy of post-stroke people
JP2018092493A (en) Care support system, control method, and program
CN117546255A (en) Capturing data from a user for assessment of disease risk
CN111820902B (en) Ankle joint ligament injury intelligent decision-making system based on activity degree characteristics
US20240082098A1 (en) Stroke rehabilitation therapy predictive analysis
CN114403858B (en) Human body movement function assessment method, device and system
CN113598714B (en) Automatic palpation method, device, electronic equipment and storage medium
WO2021149629A1 (en) Posture diagnosis system, posture diagnosis method, and data set for posture diagnosis
CN113113108A (en) Motion data analysis method, device and system and computer readable storage medium
CN113705435A (en) Behavior acquisition method, behavior acquisition device, behavior acquisition terminal and storage medium for symptom evaluation
JP2023512351A (en) Method of confirming the position of the intended target on the body
CN114403855B (en) Paralyzed upper limb movement function evaluation method, system and computer readable storage medium
CN219021189U (en) Automatic upper limb movement function evaluation system based on clinical scale
Averell et al. A real-time algorithm for the detection of compensatory movements during reaching
US20220020459A1 (en) Patient assessment system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination