CN112998700A - Apparatus, system and method for assisting assessment of a motor function of an object - Google Patents


Info

Publication number
CN112998700A
Authority
CN
China
Prior art keywords
joint
gait
video
subject
phase
Prior art date
Legal status
Granted
Application number
CN202110575579.4A
Other languages
Chinese (zh)
Other versions
CN112998700B (en)
Inventor
Xu Ke (许可)
Zhang Jianhua (张建华)
Du Xiaogang (杜晓刚)
Li Jingyang (李景阳)
Yu Jianzhou (喻剑舟)
Current Assignee
Beijing Ouying Information Technology Co Ltd
Original Assignee
Beijing Ouying Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Ouying Information Technology Co Ltd filed Critical Beijing Ouying Information Technology Co Ltd
Priority to CN202110575579.4A
Publication of CN112998700A
Application granted
Publication of CN112998700B
Legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/112: Gait analysis
    • A61B5/1126: Measuring movement of the entire body or parts thereof using a particular sensing technique
    • A61B5/1128: Measuring movement of the entire body or parts thereof using a particular sensing technique using image analysis

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Dentistry (AREA)
  • Biomedical Technology (AREA)
  • Physiology (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention provides an apparatus, a method and a system for assisting assessment of the motor function of a subject. The system includes a capturing device and an evaluation assisting device independent of the capturing device. The capturing device is configured to capture video of the subject's motion. The evaluation assisting device is configured to process the captured video to obtain quantitative data for assisting in evaluating the motor function of the subject.

Description

Apparatus, system and method for assisting assessment of a motor function of an object
Technical Field
The invention relates to an apparatus, a system and a method for assisting assessment of a motor function of a subject, and to a corresponding machine-readable storage medium.
Background
The condition of a person's motor function is critical to participation in daily life, so it is important to evaluate the motor function of people who may have a motor deficit. Studies have shown that abnormal gait characteristics of the human body are often associated with impaired or absent motor functions. Therefore, in the prior art, the motor function of the human body is generally evaluated by analyzing gait characteristics.
Existing gait analysis solutions often require large medical facilities and specialized test sites. For example, sensing devices such as accelerometers, gyroscopes and pressure sensors are attached to the human body, and the person to be evaluated, wearing these sensors, is then required to walk on a professional test site for gait feature detection. The professional test site requires equipment for receiving the sensing signals from the sensors and equipment for processing those signals. Such solutions suffer from high costs and long detection and analysis times, and demand considerable expertise from both the test operator and the evaluator. These deficiencies limit, to varying degrees, the scope and effectiveness of motor function assessment by gait analysis.
Therefore, it is desirable to provide a solution to the above-mentioned problems in the prior art.
Disclosure of Invention
In view of the above-mentioned problems in the prior art, according to an embodiment of one aspect of the present invention, there is provided an apparatus for assisting assessment of a motor function of a subject, including: an acquisition module configured to acquire a video of the subject's motion; a joint coordinate determination module configured to perform joint point detection on each frame of the video and determine joint coordinates of the detected joint points; a gait phase determination module configured to determine a gait cycle of the subject based on the joint coordinates in each frame and divide the gait cycle into a plurality of gait phases; a joint angle determination module configured to calculate, based on the joint coordinates, the joint angles of the joint points detected in each frame within a gait cycle, and to plot a joint angle-frame number curve representing the trend of the subject's joint mobility over the gait cycle; an adjustment module configured to adjust, within each gait phase, the segments of a standard curve representing the standard trend of joint mobility over a gait cycle, so that the number of picture frames covered by each standard curve segment equals the number of picture frames covered by the corresponding segment of the joint angle-frame number curve in that gait phase; and an evaluation assistance module configured to generate quantitative data, including the joint angle-frame number curve and the adjusted standard curve, for assisting in evaluating the motor function of the subject.
In a possible embodiment, the apparatus further comprises a verification module configured to: after the video is obtained, sample a predetermined number of pictures from it, the number being predetermined based on both the operation speed and the verification accuracy of the verification module; verify whether the video is valid by examining the sampled pictures, the video being valid only when it contains the specified subject motion, the specified number of subjects, and acceptable video quality; continue with subsequent operations when the video is verified as valid; and end the operation when the video is verified as invalid.
In one possible embodiment, the specified subject motion is a single-pass straight-line walk; the specified number of subjects is 1; and acceptable video quality means that the blurriness of the video does not exceed a blurriness threshold.
In a possible embodiment, the evaluation assistance module is further configured to determine, as the quantitative data, at least one of the following based on the joint angle-frame number curve and the adjusted standard curve:
-a degree of matching between the adjusted standard curve segment and a corresponding curve segment of the joint angle-frame number curve within each gait phase;
-the difference between the maximum joint angle of the subject and the maximum joint angle in the corresponding standard curve segment in each gait phase;
-the difference between the minimum joint angle of the subject and the minimum joint angle in the corresponding standard curve segment in each gait phase; and
-a degree of matching between the range of joint motion of the subject and the range of joint motion in the corresponding standard curve segment in each gait phase.
In a possible embodiment, the evaluation assistance module is further configured to determine, as the quantitative data, at least one of the following based on the joint angle-frame number curve and the adjusted standard curve:
-the duration of the gait cycle;
-the duration of each gait phase of a gait cycle;
-the ratio between the duration of each gait phase and the duration of the gait cycle;
-a ratio between the durations of any two gait phases within a gait cycle;
- symmetry of joint motion of the left and right feet of the subject in each gait phase.
In a possible embodiment, the joint coordinate determination module is configured to calculate the coordinates of the subject's joints in the direction of travel and in the height direction, respectively, and the gait phase determination module is configured to determine the gait cycle and each of its gait phases based on both sets of coordinates.
In a possible embodiment, the joints of the subject comprise: shoulder, hip, knee, ankle, heel and toe.
In a possible embodiment, said plurality of gait phases comprises two phases, four phases or eight phases; wherein the two phases include: a support phase and a swing phase; the four phases include: double-support phase 1, single-support phase 1, double-support phase 2 and single-support phase 2; and the eight phases include: initial contact, loading response, mid-stance, terminal stance, pre-swing, initial swing, mid-swing and terminal swing.
In a possible embodiment, the video comprises videos taken from the side, the front and the rear of the subject, respectively.
According to an embodiment of another aspect of the present invention, there is provided a system for assisting assessment of a motor function of a subject, comprising a capturing device and an evaluation assisting device independent of the capturing device, wherein the capturing device is configured to capture video of the subject's motion, and the evaluation assisting device, optionally an apparatus as described above, is configured to process the captured video to obtain quantitative data for assisting assessment of the motor function of the subject.
According to an embodiment of a further aspect of the present invention, there is provided a method for assisting assessment of a motor function of a subject, optionally performed by an apparatus as described above and/or a system as described above, the method comprising: acquiring a video of the subject's motion; performing joint point detection on each frame of the video and determining joint coordinates of the detected joint points; determining a gait cycle of the subject based on the joint coordinates in each frame and dividing the gait cycle into a plurality of gait phases; calculating, based on the joint coordinates, the joint angles of the joint points detected in each frame within a gait cycle, and plotting a joint angle-frame number curve representing the trend of the subject's joint mobility over the gait cycle; adjusting, within each gait phase, the segments of a standard curve representing the standard trend of joint mobility over a gait cycle, so that the number of picture frames covered by each standard curve segment equals the number of picture frames covered by the corresponding segment of the joint angle-frame number curve in that gait phase; and generating quantitative data, including the joint angle-frame number curve and the adjusted standard curve, for assisting in evaluating the motor function of the subject.
According to an embodiment of a further aspect of the present invention, there is provided a machine-readable storage medium having stored thereon executable instructions, wherein the executable instructions, when executed, cause a machine to perform the method as described above.
Thus, according to the technical solution of the embodiments of the invention, quantitative data for evaluating the motor function of a subject can be obtained without attaching any sensor to the subject and without a special test site, with the advantages of simple operation, broad applicability and high processing speed.
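The adjustment step described above amounts to resampling each standard curve segment so that it covers the same number of picture frames as the corresponding subject segment. A minimal sketch under that assumption (the function name and the use of linear interpolation are illustrative; the patent does not prescribe a specific interpolation method):

```python
def resample_segment(standard_angles, target_frames):
    """Linearly interpolate a standard-curve segment (one joint angle
    per frame) so that it spans exactly `target_frames` frames."""
    src = [float(a) for a in standard_angles]
    if target_frames == 1:
        return [src[0]]
    out = []
    step = (len(src) - 1) / (target_frames - 1)
    for i in range(target_frames):
        pos = i * step          # fractional position in the source segment
        lo = int(pos)
        hi = min(lo + 1, len(src) - 1)
        frac = pos - lo
        out.append(src[lo] * (1 - frac) + src[hi] * frac)
    return out

# Example: an 8-frame standard segment stretched onto a 5-frame
# subject segment; the first and last angles are preserved.
adjusted = resample_segment([10, 12, 15, 20, 25, 28, 30, 31], 5)
```

Applying this per gait phase keeps the standard curve aligned frame-by-frame with the subject's joint angle-frame number curve, which is what makes the later segment-wise comparisons meaningful.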
Drawings
Fig. 1 is a schematic diagram of a system for assisting assessment of a motor function of a subject, according to an embodiment of the invention.
FIG. 2 is a schematic block diagram of one implementation of an evaluation aid of the system of FIG. 1.
Fig. 3 is a flow diagram of a process for assisting assessment of a motor function of a subject, according to an embodiment of the present invention.
Fig. 4 and 5 are schematic diagrams for illustrating joint coordinates of an object.
Fig. 6 is a schematic diagram for illustrating a gait cycle.
Fig. 7 is a schematic diagram for illustrating the principle of adjusting a standard curve representing the change in joint mobility over a gait cycle.
Fig. 8 to 11 are graphs for illustrating changes in joint mobility of a subject in a gait cycle.
FIG. 12 is a flow diagram of a method for assisting in assessment of a subject's motor function in accordance with an embodiment of the present invention.
Detailed Description
Embodiments of the present invention relate to a computer-implemented technique for detecting and analyzing gait parameters of a subject by means of a simple, easy-to-use system, in order to provide quantitative data for assisting assessment of the subject's motor function. This quantitative data can serve as an objective basis for assessing the subject's motor function.
The technical solution according to embodiments of the invention therefore has great application value and promising prospects for popularization in the field of sports rehabilitation. Moreover, it can provide an objective basis for selecting the surgical approach, selecting the intra-operative implant, and guiding post-operative rehabilitation in orthopedic disease.
In the present invention, the "subject" may be a human, for example a patient with an orthopedic disorder, a person suspected of having an orthopedic disorder, or a non-patient who requires an assessment of motor function (for example, because of performing a special task or special exercise). The "subject" may also be an animal.
Hereinafter, specific embodiments of the present invention will be described in detail with reference to the accompanying drawings.
Fig. 1 schematically shows a system 100 for assisting assessment of a motor function of a subject according to an embodiment of the present invention, which mainly comprises a capture device 10 and an evaluation aid 20.
The capture device 10 is used to capture a motion video of the subject and transmit it to the evaluation aid 20, so that the evaluation aid 20 can process the video and obtain quantitative data for assisting assessment of the subject's motor function.
The capture device 10 may be implemented as a camera and/or a mobile phone with a camera function. It may be placed in any space that facilitates capturing video of the subject's motion, for example a treatment room or the subject's bedroom.
During shooting, the whole body of the subject should be captured, and the motion video should contain the subject performing the specified motion (i.e., a one-way straight-line walk).
In one embodiment, the capture device 10 may comprise a plurality of capturing devices for capturing video of the subject's motion from multiple angles. The capturing devices are configured to capture the subject's movements synchronously, for example from the front (preferably directly in front), the side (preferably directly to the side) and the rear (preferably directly behind) of the subject.
As for video shooting, in one case the subject operates the capturing device and films his or her own motion; for example, the subject places a mobile phone serving as the capturing device at a position in the bedroom that meets the shooting requirements, so as to film a one-way straight-line walk. In another case, specialized personnel assist with the capture; for example, a specialist guides the subject through a straight-line walk and operates the capturing device to film it.
The evaluation aid 20 is configured to be in wireless and/or wired communication with the capturing device 10 in order to receive the captured video from the capturing device 10.
The evaluation aid 20 may be implemented in hardware, in software, or in a combination of both. Hardware portions may be implemented in one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, electronic units designed to perform the functions, or a combination thereof. Software portions may be implemented by means of microcode, program code or code segments, which may be stored in a machine-readable storage medium such as a storage component.
In an embodiment, the evaluation aid 20 can be implemented to include a plurality of functional modules (e.g., software modules). Referring to fig. 2, these may include an acquisition module 21, a verification module 22, a joint coordinate determination module 23, a gait phase determination module 24, a joint angle determination module 25, an adjustment module 26 and an evaluation assistance module 27. It will be appreciated that the division and naming of these modules is logical (i.e., functional) and does not constrain their physical location: the modules may reside in the same chip or circuit or in different ones, one or more of them may be further divided into sub-modules, and two or more of them may be combined into one module.
In an embodiment, the evaluation aid 20 can be implemented to include a memory and a processor, the memory containing instructions that, when executed by the processor, cause the processor to perform an assessment assistance method according to an embodiment of the invention.
In one embodiment, the evaluation aid 20 may be provided in a server computer. For example, the evaluation aid 20 is deployed in a cloud server; after capturing a video, the capture device 10 uploads it to the cloud, where it is analyzed and processed to obtain quantitative data for assisting in the evaluation of motor function. The quantitative data is then sent to one or more of the following: an electronic device on the subject side (e.g., the subject's smartphone or computer), a medical system cloud platform, and a medical information system on the doctor side (e.g., a hospital information system, HIS).
Fig. 3 schematically illustrates a process 300 for assisting in evaluating a subject motion function according to an embodiment of the present invention, which mainly includes an acquisition process 310 and an evaluation assistance process 320.
First, the acquisition process 310 is described with reference to fig. 3. The acquisition process 310 may be implemented by means of the capture device 10, so the description of the capture device 10 above applies here as well.
In block 311, the capture device 10 captures a motion video of the subject. The video should show the subject performing a one-way straight-line motion and should be of high quality.
It is understood that a captured motion video may be invalid for various reasons, such as shaking of the capturing device or interference. Whether the captured video is a valid video meeting the requirements is judged in a subsequent step.
In block 312, the capture device 10 transmits the captured motion video to the evaluation aid 20. For example, the capture device 10 includes a communication unit (not shown) through which it wirelessly uploads the captured video to the cloud-based evaluation aid 20.
When multiple capturing devices shoot motion video from multiple angles, they capture synchronously and transmit the video synchronously at the same transmission rate. In this way, the strengths and weaknesses of videos shot from different angles complement one another, which benefits joint point localization in the subsequent process: side-angle video facilitates comparing the toe and the heel, front-angle video facilitates observing the toe state, and rear-angle video facilitates observing the heel state. Therefore, by comprehensively processing the multi-angle videos in the subsequent analysis, more accurate joint coordinates, and joint angles derived from them, can be obtained.
Next, the assessment assistance process 320 is described with reference to fig. 3. The process 320 can be implemented by means of the evaluation aid 20, so the above description of the evaluation aid 20 applies here as well.
In block 321, the acquisition module 21 acquires video of subject motion from the capture device 10.
In block 322, the verification module 22 verifies the validity of the acquired video. This validation ensures that the video data used in subsequent analysis is valid, avoids wasting the subject's time waiting for results computed from invalid video (e.g., video whose quality is too low or which does not contain enough data for the analysis), and saves unnecessary computation.
A video is considered invalid, and is rejected, when at least one of the following occurs in it: multiple subjects (2 or more), no subject, back-and-forth movement of the subject, or video blur. Conversely, when the video contains only one subject performing a single straight-line walk and the video quality meets a predetermined criterion, the video is considered valid, i.e., it qualifies.
In one embodiment, the validity of a video is verified by sampling a plurality of frames of pictures from the video and performing a plurality of verifications on the plurality of frames of pictures. This embodiment can be realized by the following procedure.
First, a predetermined number of picture frames are sampled from the acquired video. The sampling may select consecutive frames, frames at predetermined equal intervals, or random frames. In one embodiment, the size of the captured video is 5-10 MB and 5 picture frames are sampled.
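The three sampling strategies mentioned above can be sketched as follows (the function name and defaults are illustrative assumptions, not part of the patent):

```python
import random

def sample_frame_indices(total_frames, count, mode="interval"):
    """Pick `count` frame indices from a video with `total_frames` frames.

    mode="consecutive": the first `count` frames in a row.
    mode="interval":    frames at (approximately) equal spacing.
    mode="random":      a random selection without replacement.
    """
    count = min(count, total_frames)
    if mode == "consecutive":
        return list(range(count))
    if mode == "interval":
        step = total_frames / count
        return [int(i * step) for i in range(count)]
    if mode == "random":
        return sorted(random.sample(range(total_frames), count))
    raise ValueError("unknown sampling mode: " + mode)

# Example: 5 equally spaced frames from a 100-frame video.
indices = sample_frame_indices(100, 5, "interval")
```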
Then, multiple verifications of the video are achieved by detecting the sampled pictures.
One verification checks whether the number of subjects in the video meets the requirement by detecting the number of objects (e.g., people) in the sampled pictures. The number of subjects in the video should be 1, and thus the number of subjects in each sampled picture should be 1. A picture showing several people, or no person, is not qualified.
In one embodiment of this verification, when the ratio of the number of sampled pictures containing exactly one subject to the total number of sampled pictures is equal to or greater than a predetermined percentage, the number of subjects in the video is considered satisfactory and the verification passes; otherwise it fails. For example, N1 picture frames are sampled and a machine learning model performs object detection on them, the largest number of objects detected in a picture being taken as the number of objects in that picture. Suppose exactly one object is detected in N2 of the frames. Then N2 is divided by N1, and the resulting ratio is compared with the predetermined percentage to determine whether the number of subjects in the video is satisfactory.
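The ratio test can be sketched as follows, assuming per-frame object counts have already been produced by a detector (the function name and the 0.8 default percentage are illustrative assumptions):

```python
def object_count_ok(per_frame_counts, required=1, min_ratio=0.8):
    """Return True when the share of sampled frames containing exactly
    `required` detected objects reaches `min_ratio` (N2 / N1 above)."""
    n1 = len(per_frame_counts)
    if n1 == 0:
        return False
    n2 = sum(1 for c in per_frame_counts if c == required)
    return n2 / n1 >= min_ratio

# Example: 4 of 5 sampled frames contain exactly one person.
ok = object_count_ok([1, 1, 1, 1, 2])
```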
Another verification checks, by detecting the subject's position in the sampled pictures, whether the motion in the video is the specified motion, i.e., a one-way straight-line walk. "One-way straight-line walking" mainly excludes back-and-forth movement; it does not require the subject's trajectory to be an exact straight line.
In one embodiment of this verification, the subject's position is detected in each sampled picture to obtain the change in position over time, from which the direction of travel is determined. Whether the subject moves back and forth is judged from the determined direction of travel; the verification passes when no back-and-forth movement occurs, and fails otherwise.
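A possible sketch of the back-and-forth check, assuming the subject's per-frame X-coordinates are available (the function name and jitter tolerance are illustrative assumptions; the patent does not prescribe a specific test):

```python
def is_one_way(x_positions, tolerance=5.0):
    """Check that the subject's X-coordinate trace has no direction
    reversal larger than `tolerance` (e.g., pixels), i.e. no
    back-and-forth movement. The tolerance absorbs detection jitter."""
    if len(x_positions) < 2:
        return True
    # Overall travel direction: +1 rightward, -1 leftward.
    direction = 1 if x_positions[-1] >= x_positions[0] else -1
    extreme = x_positions[0]
    for x in x_positions[1:]:
        if direction * (x - extreme) > 0:
            extreme = x          # still progressing in the travel direction
        elif direction * (extreme - x) > tolerance:
            return False         # moved back past the jitter tolerance
    return True
```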
A further verification detects the blurriness of the sampled pictures to check whether the video quality is acceptable.
In one embodiment of this verification, the blurriness of each sampled picture is calculated separately, and the average of these values is taken as the blurriness of the video. For example, the blurriness score ranges from 0 to 100 and a threshold of 70 is predetermined. If the average exceeds 70, the video quality is considered unacceptable; otherwise, the video quality is considered acceptable. The threshold is chosen such that blurriness beyond it would affect the effectiveness of subsequent processing operations.
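This averaging-and-threshold rule can be sketched as follows, assuming per-frame blurriness scores on the 0-100 scale are already available (the function name is an illustrative assumption):

```python
def video_quality_ok(frame_blur_scores, threshold=70.0):
    """Average per-frame blurriness scores (0 = sharp, 100 = fully
    blurred) and accept the video only when the mean does not exceed
    the threshold, as in the example above."""
    mean_blur = sum(frame_blur_scores) / len(frame_blur_scores)
    return mean_blur <= threshold

# Example: three sampled frames with mean blurriness 65 pass the check.
ok = video_quality_ok([60, 70, 65])
```

In practice the per-frame score could come from any blur metric (e.g., a variance-of-Laplacian measure rescaled to 0-100); the patent text only fixes the averaging and the threshold comparison.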
It is understood that the number of picture frames to sample in the above verification process is predetermined by weighing operation speed against verification accuracy: sampling too many pictures leads to heavy computation and slow operation, while sampling too few leads to inaccurate verification.
It will also be appreciated that AI techniques, for example trained machine learning models, may be employed to implement one or more of the above verifications. Using a machine learning model offers high operational efficiency, a small memory footprint and high verification accuracy; for example, all of the above verifications together take only a few seconds.
It is to be understood that the order of the above verifications is not limited; they may be performed sequentially or simultaneously.
In the event that at least one of the above-described verifications fails, the verification module 22 verifies that the video is invalid and the process 300 proceeds to block 323. In block 323, the operation is ended and a reminder indicating that the video is invalid is issued to recapture the video. In the event that each of the above-described verifications pass, the verification module 22 verifies that the video is valid, and the process 300 proceeds to block 324.
In block 324, the joint coordinate determination module 23 identifies and analyzes each frame of picture in the video, thereby identifying joint points in each frame of picture and calculating joint coordinates of the identified joint points.
In one embodiment, the joint coordinate determination module 23 parses each frame of the video, identifies the subject's joint points in each frame, and obtains the 2D position of each joint point. For each identified joint point, it outputs a joint part label plus the joint coordinates (X, Y), where X is the position along the subject's direction of travel and Y is the position in the height direction. For example, X is the position of the subject's foot in the stepping direction, and Y is the height of the foot above the ground, i.e., how far the foot is raised or lowered. Based on the joint coordinates, a graph of the joint's X-direction coordinate versus frame number can be plotted, as can a graph of the joint's Y-direction coordinate versus frame number.
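Once 2D joint coordinates are available, a joint angle such as the knee angle can be computed from three adjacent joint points with standard vector geometry. This is a common approach and a plausible reading of the joint angle determination module, not necessarily the patented formula:

```python
import math

def joint_angle(a, b, c):
    """Angle (degrees) at joint `b` formed by segments b->a and b->c,
    e.g. the knee angle from hip (a), knee (b) and ankle (c)
    coordinates (X, Y)."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    # Clamp to [-1, 1] to guard against floating-point drift.
    cos_angle = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_angle))
```

Evaluating this per frame over one gait cycle yields the joint angle-frame number curve described above; for the knee, `a` and `c` would be the hip and ankle coordinates of the same leg in the same frame.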
For example, the X-direction coordinates of the heels and toes of the left and right feet in each frame are shown in fig. 4, where the abscissa is the frame number and the ordinate is the distance of the heel and toe from the starting point along the direction of travel. From fig. 4, the displacement of the foot in the stepping direction and the changes in foot-joint posture can be read.
The Y-direction coordinates of the heels and toes of the left and right feet in each frame are shown in fig. 5, where the abscissa is the frame number and the ordinate is the height of the heel and toe above the ground. For clarity, only the off-ground and on-ground states are shown. The lifting and falling of the heel and toe, and the changes in foot-joint posture, can be read from fig. 5.
It is to be understood that the numerical values in fig. 4 and 5 and fig. 7 to 11 to be described later are illustrative, not restrictive.
In block 325, the gait phase determination module 24 calculates a gait cycle of the subject and a plurality of gait phases of the gait cycle based on the joint coordinates.
The gait cycle is the progression from one heel strike to the next heel strike of the same foot during walking. Referring to fig. 6, a gait cycle runs from the first ground contact of the heel on one side to the second ground contact of that same heel. A gait cycle involves a series of typical posture changes; for example, two intermediate postures occur between the first and second heel contacts in fig. 6. Based on these typical posture changes, a gait cycle can be divided into a plurality of gait phases.
One gait cycle can be divided into two gait phases, four gait phases, or eight gait phases.
The two gait phases include: a support phase and a swing phase.
The four gait phases include: double-support phase 1, single-support phase 1, double-support phase 2 and single-support phase 2.
The eight gait phases include: initial contact, loading response, mid-stance, terminal stance, pre-swing, initial swing, mid-swing, and terminal swing.
It will be appreciated that the definition of each gait phase follows the generally accepted definitions in the medical field; the gait phases in embodiments of the invention adopt these definitions.
Referring to figs. 4 and 5, changes in the position of the heel and toe can be accurately determined from the changes in both the X and Y coordinates of the joints. From these position changes, the times at which the heel lands and lifts off the ground, and the times at which the series of typical postures occur, can be determined, from which the gait cycle of the subject and the gait phases within that cycle can be derived. For example, from figs. 4 and 5 it can be determined that frames 20 to 63 constitute one gait cycle. This gait cycle divides into four gait phases: frames 20-27 are the first phase (double-support phase 1), frames 27-40 the second phase (single-support phase 1), frames 40-46 the third phase (double-support phase 2), and frames 46-63 the fourth phase (single-support phase 2).
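One plausible way to locate heel strikes from a Y-coordinate trace like that of fig. 5 (which shows only ground-off and ground-on states) is to look for transitions from off-ground to on-ground. The sketch below is illustrative only; the threshold `ground_eps` and all function names are assumptions, not the patented algorithm.

```python
def heel_strike_frames(heel_y, ground_eps=1e-6):
    """Frames where the heel transitions from off the ground to on the ground."""
    strikes = []
    for i in range(1, len(heel_y)):
        if heel_y[i - 1] > ground_eps and heel_y[i] <= ground_eps:
            strikes.append(i)
    return strikes

def gait_cycle(heel_y):
    """One gait cycle = the span between two consecutive heel strikes of the same foot."""
    strikes = heel_strike_frames(heel_y)
    return (strikes[0], strikes[1]) if len(strikes) >= 2 else None

# Toy trace: heel on ground, lifts, lands (frame 4), lifts, lands again (frame 8).
y = [0, 1, 2, 1, 0, 1, 2, 1, 0]
print(gait_cycle(y))  # -> (4, 8)
```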
It will be appreciated that different subjects (for example, young versus elderly persons, or persons with good versus poor motor function) will have different gait cycles and gait phase durations; the curves in figs. 4 and 5 are merely examples and are not intended to be limiting.
In block 326, the joint angle determination module 25 calculates the joint angles in each frame of the gait cycle based on the joint positions and plots a joint angle-frame number graph. In an embodiment, the joints of the subject may include the shoulder, hip, knee, ankle, heel, and toe, and a joint angle-frame number graph may be plotted for each joint. Such a graph represents the trend of the subject's joint activity in each gait phase of the gait cycle.
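A joint angle such as the knee angle can be computed from the 2D coordinates of three adjacent joints (e.g., hip-knee-ankle) as the angle between the two limb vectors meeting at the middle joint. The following is a minimal sketch under that assumption; the source does not specify the exact angle convention used.

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by points a-b-c, e.g. hip-knee-ankle."""
    v1 = (a[0] - b[0], a[1] - b[1])  # vector from the middle joint to the first joint
    v2 = (c[0] - b[0], c[1] - b[1])  # vector from the middle joint to the third joint
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

# Fully extended leg: hip, knee, ankle collinear -> 180 degrees.
print(round(joint_angle((0, 2), (0, 1), (0, 0))))  # -> 180
```

Evaluating this per frame over the gait cycle yields the joint angle-frame number curve described above.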
In block 327, the adjustment module 26 adjusts a standard curve representing a standard trend of joint activity with reference to the subject's gait cycle and its gait phases. For example, the adjustment module 26 adjusts the segment of the standard curve in each gait phase separately, so that the number of picture frames in each standard curve segment equals the number of picture frames in the corresponding segment of the joint angle-frame number curve for that gait phase.
By adjusting the standard curve segments separately for each gait phase of each subject's gait cycle, a standard curve that adapts better to different subjects is obtained, and more accurate quantized data can therefore be output.
Fig. 7 illustrates the principle of adjusting the standard curve. Referring to fig. 7, the original standard curve spans 100 picture frames, and its four gait phases cover frames 1-10, 10-48, 48-60, and 60-100 respectively (see the "standard values" of the original standard curve above fig. 7). The subject's gait cycle determined from figs. 4 and 5 spans, for example, 43 picture frames, with gait phases covering frames 20-27, 27-40, 40-46, and 46-63. For each gait phase, the corresponding standard curve segment is mapped onto the curve segment of that gait phase obtained by analyzing the subject's motion video: the standard segment for frames 1-10 is mapped to frames 20-27, the segment for frames 10-48 to frames 27-40, the segment for frames 48-60 to frames 40-46, and the segment for frames 60-100 to frames 46-63 (see the "mapping" arrows and the adjusted standard curve in fig. 7). This mapping yields an adjusted standard curve in which the number of picture frames in each phase's segment matches the number of picture frames in the corresponding gait phase segment obtained from the subject's motion video.
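The per-phase mapping of fig. 7 amounts to resampling each standard curve segment to the frame count of the corresponding subject gait phase. Below is a minimal sketch using linear interpolation; the source does not specify the interpolation method (splines could equally be used), and all names here are hypothetical.

```python
def resample_segment(values, n):
    """Linearly resample a curve segment to n points."""
    if n == 1:
        return [values[0]]
    m = len(values)
    out = []
    for i in range(n):
        t = i * (m - 1) / (n - 1)      # fractional index into the original segment
        lo = int(t)
        hi = min(lo + 1, m - 1)
        frac = t - lo
        out.append(values[lo] * (1 - frac) + values[hi] * frac)
    return out

def adjust_standard_curve(standard, std_bounds, subj_bounds):
    """Map each standard-curve phase segment onto the subject's phase lengths.

    std_bounds / subj_bounds: phase boundary frame indices, e.g. [0, 10, 48, 60, 100].
    """
    adjusted = []
    for k in range(len(std_bounds) - 1):
        seg = standard[std_bounds[k]:std_bounds[k + 1] + 1]
        n = subj_bounds[k + 1] - subj_bounds[k] + 1
        piece = resample_segment(seg, n)
        # Drop the first point of later segments so shared boundaries aren't doubled.
        adjusted.extend(piece if k == 0 else piece[1:])
    return adjusted
```

After this adjustment, each phase of the standard curve and of the subject's joint angle-frame number curve covers the same number of frames, so the two can be compared point by point.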
In block 328, the assessment assistance module 27 derives quantitative data for assisting in the assessment of the motion function of the subject based on the determined joint angle versus frame number curve and the adjusted standard curve. The quantization data may include a joint angle-frame number curve, an adjusted standard curve, and a quantization parameter calculated based on both curves.
In one embodiment, the joint angle-frame number curve and the adjusted standard curve are plotted in one graph, or the curves for the same subject at different periods of time (e.g., pre- and post-surgery) are plotted in one graph, so that the curves can be compared to obtain quantitative parameters for assisting in assessing motor function.
Figs. 8-11 show graphs of the joint activity trends for different joints of a subject together with the corresponding standard curves (adjusted in the manner described above).
Fig. 8 illustrates a hip joint motion tendency graph of the subject, in which the abscissa is the frame number and the ordinate is the hip joint angle. Fig. 8 contains the joint angle-frame number curve (curve indicated by a solid line) of the subject and the adjusted standard curve (curve indicated by a dotted line) for the hip joint of the subject obtained by analyzing the video. It will be appreciated that the adjusted standard curve in the figure is adjusted based on the standard curve of the subject's hip joint.
Fig. 9 is a graph illustrating a knee joint mobility trend of the subject, in which the abscissa is a frame number and the ordinate is a knee joint angle. Fig. 9 contains the knee angle-frame number curve (curve indicated by a solid line) of the subject and the adjusted standard curve (curve indicated by a dashed-dotted line) for the knee of the subject obtained by analyzing the video. It will be appreciated that the adjusted standard curve in this figure is adjusted based on the standard curve of the subject's knee joint.
Fig. 10 illustrates an ankle joint activity trend graph of the subject, in which the abscissa is the frame number and the ordinate is the ankle joint angle. Fig. 10 contains the ankle angle-frame number curve (the curve indicated by the solid line) of the subject and the adjusted standard curve (the curve indicated by the dotted line) for the ankle of the subject obtained by analyzing the video. It will be appreciated that the adjusted standard curve in this figure is adjusted based on the standard curve of the subject's ankle joint.
Fig. 11 illustrates, for the subject's knee joint, a preoperative activity trend curve (solid line), a postoperative activity trend curve (dotted line), and an adjusted standard curve (dash-dotted line). The abscissa is the frame number and the ordinate is the knee joint angle. From this figure, quantitative data on the subject's joint mobility before and after the operation can be obtained, assisting in evaluating the subject's postoperative recovery. It will be appreciated that the adjusted standard curve in this figure is adjusted based on the standard curve of the subject's knee joint.
For each of figs. 8-11, these curves can be analyzed, compared, and computed to yield quantitative parameters that can assist in assessing the motor function of the subject. The quantization parameters may include one or more of:
-a degree of matching between the adjusted standard curve segment and a corresponding curve segment of the joint angle-frame number curve within each gait phase;
-a degree of match between the subject's range of joint angle motion and a standard range of joint angle motion in each gait phase;
-the difference between the maximum and minimum joint motion angles of the subject and the standard maximum and minimum motion angles, respectively, in each gait phase;
-the duration of the gait cycle;
-a duration of each of a plurality of gait phases;
-a ratio of the duration of each gait phase relative to the duration of a gait cycle;
-a ratio of the durations of any two of the plurality of gait phases;
-symmetry of the articulation angles of the left and right feet of the subject in the respective gait phases;
- changes in the subject's pre- and post-operative joint mobility trends.
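Several of the listed quantitative parameters reduce to simple arithmetic on the phase boundaries and curves. Below is a sketch that assumes a hypothetical frame rate of 30 fps (the source does not state one) and a simple normalized-error matching degree (the source does not define how the matching degree is computed); all names are illustrative.

```python
def phase_durations(bounds, fps):
    """Duration (seconds) of each gait phase from frame boundaries."""
    return [(bounds[i + 1] - bounds[i]) / fps for i in range(len(bounds) - 1)]

def phase_ratios(bounds):
    """Each gait phase's share of the full gait cycle duration."""
    total = bounds[-1] - bounds[0]
    return [(bounds[i + 1] - bounds[i]) / total for i in range(len(bounds) - 1)]

def curve_match(subject, standard):
    """A simple matching degree in [0, 1]: 1 minus normalized mean absolute error."""
    err = sum(abs(s, ) if False else abs(s - t) for s, t in zip(subject, standard)) / len(subject)
    span = max(standard) - min(standard) or 1.0
    return max(0.0, 1.0 - err / span)

# Phase boundaries from figs. 4/5: frames 20-27, 27-40, 40-46, 46-63.
bounds = [20, 27, 40, 46, 63]
print(phase_durations(bounds, fps=30))
print(phase_ratios(bounds))
```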
It is to be understood that fig. 8-11 are given by way of illustration only and that other similar graphs may be obtained in accordance with embodiments of the present invention.
After obtaining the above-described quantitative data, the evaluation assisting apparatus 20 may transmit the quantitative data (e.g., the graph and the calculated quantitative parameters) to the electronic apparatus on the subject side, the medical system cloud platform, and the medical information system (e.g., the hospital information system HIS) on the doctor side through its communication unit (not shown).
Fig. 12 shows a flow diagram of a method 400 for assisting assessment of a motor function of a subject, according to an embodiment of the invention. This method may be performed by the above-described evaluation assistance apparatus 20 as well as by the above-described evaluation assistance system 100, and therefore, the above description of the evaluation assistance apparatus 20 and the evaluation assistance system 100 is also applicable here.
Referring to fig. 12, in step 410, a video of the motion of an object is acquired.
In step 420, object joint point detection is performed on each frame of picture in the video and joint coordinates of the detected joint points are determined.
In step 430, a gait cycle of the subject is determined based on the joint coordinates in each frame of picture and divided into a plurality of gait phases.
In step 440, the joint angle of the joint point detected in each frame of picture in the gait cycle is calculated based on the joint coordinates, and a joint angle-frame number curve representing the change trend of the joint activity of the object in the gait cycle is drawn.
In step 450, the standard curve segments, in each gait phase, of the standard curve representing the standard variation trend of joint activity over a gait cycle are adjusted separately, so that the number of picture frames in each standard curve segment equals the number of picture frames in the corresponding segment of the joint angle-frame number curve for that gait phase; and
in step 460, quantized data for assisting in evaluating the motion function of the subject is generated, which includes a joint angle-frame number curve and an adjusted standard curve.
Embodiments of the present invention also provide a machine-readable storage medium having stored thereon executable instructions that, when executed, cause the machine to perform the method 400 as described above.
It should be appreciated that examples of machine-readable storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Storage media may include, but are not limited to: random Access Memory (RAM), Read Only Memory (ROM), Programmable Read Only Memory (PROM), Erasable Programmable Read Only Memory (EPROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, Compact Discs (CD), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium capable of being used to store information.
In some embodiments, a machine-readable storage medium may store executable computer program instructions that, when executed by one or more processing units, cause the processing units to perform the above-described methods. The executable computer program instructions may include any suitable type of code, for example, source code, compiled code, interpreted code, executable code, static code, dynamic code, and the like. The executable computer program instructions may be implemented using any suitable high-level, low-level, object-oriented, visual, compiled and/or interpreted programming language.
It is to be understood that various operations may be described as multiple discrete actions or operations in sequence, in a manner that is most helpful in understanding the claimed subject matter. However, the order of description should not be construed as to imply that these operations are necessarily order dependent. In particular, the operations may be performed out of the order presented. In other implementations, various additional operations may be performed and/or various operations that have been described may be omitted.
What has been described above includes examples of the disclosed architecture. It is, of course, not possible to describe every conceivable combination of components and/or methodologies, but one of ordinary skill in the art may recognize that many further combinations and permutations are possible. Accordingly, the novel architecture is intended to embrace all such alterations, modifications and variations that fall within the spirit and scope of the appended claims.

Claims (12)

1. An apparatus for assisting assessment of a subject's motor function, comprising:
an acquisition module configured to acquire a video of a motion of an object;
a joint coordinate determination module configured to perform object joint point detection on each frame of picture in the video and determine joint coordinates of the detected joint points;
a gait phase determination module configured to determine a gait cycle of the object based on the joint coordinates in each frame of picture and divide the gait cycle into a plurality of gait phases;
the joint angle determining module is configured to calculate joint angles of joint points detected in each frame of picture in a gait cycle based on joint coordinates and draw a joint angle-frame number curve representing the change trend of joint activity of the object in the gait cycle;
an adjusting module configured to adjust, in each gait phase, the standard curve segments of a standard curve representing the standard variation trend of joint activity over a gait cycle, so that the number of picture frames contained in each standard curve segment equals the number of picture frames contained in the corresponding segment of the joint angle-frame number curve for that gait phase; and
an assessment assistance module configured to generate quantitative data for assisting in assessing a motion function of a subject, including a joint angle-frame number curve and an adjusted standard curve.
2. The device of claim 1, wherein the device further comprises a verification module configured to:
after obtaining a video, sampling a predetermined number of pictures from the video, wherein the predetermined number is set based on both the operation speed and the verification effectiveness of the verification module;
verifying whether the video is valid by examining the sampled pictures, wherein the video is valid only when it contains the specified object motion and the specified number of objects and its video quality is qualified;
continuing subsequent operations when the video is verified to be valid; and
ending the operation when the video is verified to be invalid.
4. The apparatus of claim 2, wherein the specified object motion is a single-pass linear motion of the object; the specified number of objects is 1; and qualified video quality means that the blurriness of the video does not exceed a blurriness threshold.
4. The device of any one of claims 1-3, wherein the assessment assistance module is further configured to determine, as the quantized data, based on a joint angle-frame number curve and an adjusted standard curve, at least one of:
-a degree of matching between the adjusted standard curve segment and a corresponding curve segment of the joint angle-frame number curve within each gait phase;
-the difference between the maximum joint angle of the subject and the maximum joint angle in the corresponding standard curve segment in each gait phase;
-the difference between the minimum joint angle of the subject and the minimum joint angle in the corresponding standard curve segment in each gait phase; and
-a degree of matching between the range of joint motion of the subject and the range of joint motion in the corresponding standard curve segment in each gait phase.
5. The device of any one of claims 1-3, wherein the assessment assistance module is further configured to determine, as the quantized data, based on a joint angle-frame number curve and an adjusted standard curve, at least one of:
-the duration of the gait cycle;
-the duration of each gait phase of a gait cycle;
-the ratio between the duration of each gait phase and the duration of the gait cycle;
-a ratio between the durations of any two gait phases within a gait cycle;
- symmetry of joint motion of the left and right feet of the subject in each gait phase.
6. The apparatus of claim 1, wherein the joint coordinate determination module is configured to calculate coordinates in a direction of travel and in a direction of height of the subject joint, respectively, and the gait phase determination module is configured to determine a gait cycle and each gait phase of the gait cycle based on both coordinates.
7. The apparatus of claim 1, wherein the joint of the subject comprises: shoulder, hip, knee, ankle, heel and toe.
8. The apparatus of claim 1, wherein the plurality of gait phases comprises two phases, four phases, or eight phases;
wherein the two phases include: a support time phase and a swing time phase;
the four phases include: the system comprises a double-support phase 1, a single-support phase 1, a double-support phase 2 and a single-support phase 2;
the eight phases include: initial contact, loading response, mid-stance, terminal stance, pre-swing, initial swing, mid-swing, and terminal swing.
9. The apparatus of claim 1, wherein the video comprises video taken from the side, front, and back, respectively, of the subject.
10. A system for assisting assessment of a motor function of a subject, comprising an acquisition device and an assessment assisting device independent of the acquisition device,
wherein the acquisition device is configured to acquire video of subject motion,
and the assessment assisting device, optionally according to any of claims 1-9, is configured to process the acquired video to obtain quantitative data for assisting the assessment of the motor function of the object.
11. A method for assisting assessment of a motor function of a subject, optionally performed by a device according to any of claims 1-9 and/or a system according to claim 10, the method comprising:
acquiring a video of object motion;
carrying out object joint point detection on each frame of picture in the video and determining joint coordinates of the detected joint points;
determining a gait cycle of the object based on the joint coordinates in each frame of picture and dividing the gait cycle into a plurality of gait phases;
calculating the joint angle of a joint point detected in each frame of picture in a gait cycle based on joint coordinates, and drawing a joint angle-frame number curve representing the change trend of the joint activity of an object in the gait cycle;
adjusting, in each gait phase, the standard curve segments of a standard curve representing the standard variation trend of joint activity over a gait cycle, so that the number of picture frames contained in each standard curve segment equals the number of picture frames contained in the corresponding segment of the joint angle-frame number curve for that gait phase; and
quantitative data for assisting in evaluating the motion function of the subject is generated, which includes a joint angle-frame number curve and an adjusted standard curve.
12. A machine-readable storage medium having stored thereon executable instructions, wherein the executable instructions, when executed, cause a machine to perform the method of claim 11.
CN202110575579.4A 2021-05-26 2021-05-26 Apparatus, system and method for assisting assessment of a motor function of an object Active CN112998700B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110575579.4A CN112998700B (en) 2021-05-26 2021-05-26 Apparatus, system and method for assisting assessment of a motor function of an object


Publications (2)

Publication Number Publication Date
CN112998700A (en) 2021-06-22
CN112998700B (en) 2021-09-24

Family

ID=76380798

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110575579.4A Active CN112998700B (en) 2021-05-26 2021-05-26 Apparatus, system and method for assisting assessment of a motor function of an object

Country Status (1)

Country Link
CN (1) CN112998700B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113642506A (en) * 2021-08-26 2021-11-12 北京复数健康科技有限公司 Method and system for image annotation based on data matching
CN113807323A (en) * 2021-11-01 2021-12-17 北京大学 Accurate hand function evaluation system and method based on image recognition
CN114569411A (en) * 2022-02-21 2022-06-03 长沙优龙机器人有限公司 Gait self-adaptive control method and system for hemiplegic patient

Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1286962A (en) * 2000-10-09 2001-03-14 清华大学 Real-time body gait image detecting method
KR20120017948A (en) * 2010-08-20 2012-02-29 경북대학교 산학협력단 Rehabilitation device using motion analysis based on motion capture and method thereof
WO2014104360A1 (en) * 2012-12-28 2014-07-03 株式会社東芝 Motion information processing device and method
CN104408718A (en) * 2014-11-24 2015-03-11 中国科学院自动化研究所 Gait data processing method based on binocular vision measuring
US20170231532A1 (en) * 2016-02-12 2017-08-17 Tata Consultancy Services Limited System and method for analyzing gait and postural balance of a person
CN107174255A (en) * 2017-06-15 2017-09-19 西安交通大学 Three-dimensional gait information gathering and analysis method based on Kinect somatosensory technology
CN107609523A (en) * 2017-09-19 2018-01-19 东华大学 Gait cycle and three-dimensional limb activity angle algorithm based on Python
CN108022248A (en) * 2016-11-03 2018-05-11 北京航空航天大学 A kind of lower limb gait rehabilitation assessment system of view-based access control model collecting device
CN108388887A (en) * 2018-03-20 2018-08-10 济南大学 Biped robot's Analytical Methods of Kinematics based on toddlerhood child's Gait extraction
CN109063661A (en) * 2018-08-09 2018-12-21 上海弈知信息科技有限公司 Gait analysis method and device
CN109255293A (en) * 2018-07-31 2019-01-22 浙江理工大学 Model's showing stage based on computer vision walks evaluation method
CN109330605A (en) * 2018-09-07 2019-02-15 福建工程学院 A kind of gait cycle Automated Partition Method and computer equipment
US20190076060A1 (en) * 2016-03-31 2019-03-14 Nec Solution Innovators, Ltd. Gait analyzing device, gait analyzing method, and computer-readable recording medium
WO2019100754A1 (en) * 2017-11-23 2019-05-31 乐蜜有限公司 Human body movement identification method and device, and electronic device
CN110021398A (en) * 2017-08-23 2019-07-16 陆晓 A kind of gait analysis, training method and system
CN110738192A (en) * 2019-10-29 2020-01-31 腾讯科技(深圳)有限公司 Human motion function auxiliary evaluation method, device, equipment, system and medium
CN110801231A (en) * 2019-10-14 2020-02-18 西安理工大学 Interactive evaluation method for joint function of knee arthritis patient
KR20200084567A (en) * 2019-01-03 2020-07-13 전자부품연구원 Health abnormality detection system and method using gait pattern
CN111950383A (en) * 2020-07-21 2020-11-17 燕山大学 Joint angle-based rhythm and motion collaborative analysis method
CN112438723A (en) * 2019-08-29 2021-03-05 松下电器(美国)知识产权公司 Cognitive function evaluation method, cognitive function evaluation device, and storage medium




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant