CN114663913B - Kinect-based human gait parameter extraction method


Info

Publication number
CN114663913B
CN114663913B (application CN202210185789.7A)
Authority
CN
China
Prior art keywords
joint
foot
gait
knee
ankle
Prior art date
Legal status
Active
Application number
CN202210185789.7A
Other languages
Chinese (zh)
Other versions
CN114663913A (en)
Inventor
汪玲
荀小飞
邱静
程洪
陈路锋
Current Assignee
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China
Priority to CN202210185789.7A
Publication of CN114663913A
Application granted
Publication of CN114663913B
Legal status: Active


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/112 Gait analysis
    • A61B 5/1121 Determining geometric values, e.g. centre of rotation or angular range of movement
    • A61B 5/1126 Measuring movement of the entire body or parts thereof using a particular sensing technique
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235 Details of waveform analysis
    • A61B 5/725 Details of waveform analysis using specific filters therefor, e.g. Kalman or adaptive filters
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/243 Classification techniques relating to the number of classes
    • G06F 18/24323 Tree-organised classifiers


Abstract

The invention provides a Kinect-based human gait parameter extraction method with better universality. The method acquires the original joint point position sequence data extracted by a Kinect somatosensory depth sensor; filters the original joint point position sequence data and calculates the original gait parameter values; constructs a gait feature space from the original gait parameter values, comprising static posture features, continuous motion change features and overall change features; uses the gait feature space with a gait phase division strategy based on a random forest model to obtain a preliminary judgment of the unilateral state of each frame, and then performs misjudgment correction on abnormal phases to obtain the final gait phase division result; and determines the start and end frames of the swing period from the final gait phase division result to obtain the start position, end position, start time and end time of the swing period, from which the gait spatiotemporal parameters are calculated. The invention offers a comprehensive gait feature representation, non-contact measurement and low cost.

Description

Kinect-based human gait parameter extraction method
Technical Field
The invention relates to human body sensing technology, and in particular to human gait parameter extraction technology.
Background
Walking, one of the most basic human motion states, reflects a great deal of information about human movement and arises from the coupled cooperation of the skeletal, muscular and nervous systems. The coordination, periodicity and stability of human gait carry distinctive individual characteristics, as well as rich kinematic, kinetic, muscle-activity and oxygen-consumption information. Gait analysis acquires the gait characteristics of a subject from measurements of the subject's movement in order to study the changes and regularities of human posture and motion; by instrumented, scientific means it obtains a range of gait parameter information during walking, covering basic concepts such as the walking cycle, the composition of a normal walking cycle, and the spatiotemporal parameters.
At present, gait analysis of a subject falls into two classes according to how the motion information is acquired. The first is qualitative analysis, which relies on visual observation by examiners; it is simple and requires no equipment, but it depends on the examiners' long-term working experience and subjective judgment, different examiners often evaluate the same individual differently, and the results are neither quantitative nor sufficiently accurate. The second is quantitative analysis, which relies on sensors to acquire the initial motion data; different sensor types capture different motion signals and extract the motion information differently, and these approaches can be classified by sensor type into plantar-pressure, electromyographic-signal, inertial-sensor and marker-based visual-capture methods.
Benocci et al. developed a wireless system for gait and posture analysis; the portability and flexibility of such gait data acquisition aroused great interest among researchers, but because plantar-pressure data carry limited information, they cannot fully reflect human posture and can only serve as reference information for lower-limb movement. The Finnish Mega ME6000 electromyography test system outputs various quantified motion parameters through its own analysis software MegaWin, but electromyographic signals suffer from susceptibility to interference and strong signal randomness. Luis et al. proposed a system that places a three-axis accelerometer and a two-axis gyroscope on one side of the pelvis, uses wavelet decomposition to detect and distinguish the steps of each foot, and combines forward and reverse integrals of the high-pass-filtered acceleration to compute step length; however, it cannot remove the effect of sensor position changes on the actual human joint-angle changes, resulting in data disturbance and accumulated error. The Oqus motion-capture system developed by the Swedish company Qualisys AB offers high precision, high reliability, simple installation and multi-device synchronization, but is expensive and complex to operate.
A motion analysis approach that is objectively quantitative, low-cost and portable is therefore desirable. The Kinect somatosensory depth sensor can locate multiple joint points of the human body and construct a human skeleton model, from which feature information over the human motion cycle can be obtained for gait analysis at low cost, providing a new way to address the problem.
The patent application published as CN108030497A provides a gait analysis device and method based on inertial measurement units (IMUs): one IMU is placed on each foot to acquire gait data, and the two IMUs exchange data via Bluetooth. However, it does not solve the problem of accumulated IMU error, nor does it consider the influence of the wearable device itself on the actual human gait.
The patent application published as CN108960155A provides a Kinect-based adult gait extraction and anomaly analysis method that calculates the gait spatiotemporal parameters from the change of the foot-end trajectory. However, that scheme divides the gait phases with only a single threshold; since different individuals often show distinctly different gait characteristics, the result lacks universality.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a Kinect-based human gait parameter extraction method with better universality.
The technical scheme adopted by the invention to solve this problem is a Kinect-based human gait parameter extraction method comprising the following steps:
1) Acquiring original joint point position sequence data extracted by a Kinect somatosensory depth sensor, wherein the original joint point position sequence data comprises spatial position information of a hip joint, a knee joint, an ankle joint and a foot joint;
2) Filtering the original joint point position sequence data and calculating original gait parameter values, wherein the original gait parameter values comprise knee joint angle, ankle joint angle, hip joint angle, foot joint speed, ankle joint angular speed and knee joint angular speed;
3) Constructing a gait feature space according to the original gait parameter value, wherein the gait feature space comprises a static posture feature, a continuous motion change feature and an integral change feature;
the static posture features include foot joint coordinates, foot joint velocity, knee joint angle, and knee joint angular velocity of the current frame;
the continuous motion change features include differences between foot joint coordinates, foot joint velocity, knee joint angle, and knee joint angular velocity of the current frame and the previous frame;
the overall change characteristics include differences between the foot joint coordinates, foot joint velocity, knee joint angle, and knee joint angular velocity of the current frame and the first frame;
4) Using the gait feature space, a preliminary judgment of the unilateral state of each frame is obtained with a gait phase division strategy based on a random forest model, and misjudgment correction is then performed on abnormal phases to obtain the final gait phase division result; the method of misjudgment correction for abnormal phases is: when the preliminary judgment is the swing phase, further judge whether the walking distance of the foot during that swing stage is greater than or equal to a half-step preset value; if so, the final gait phase division result for the unilateral state of that frame is the swing phase, otherwise it is corrected to the support phase;
5) Determining the start frame and end frame of the swing period from the final gait phase division result, and obtaining the start position, end position, start time and end time of the swing period from the joint point position coordinates in the start and end frames, thereby calculating the gait spatiotemporal parameters.
The beneficial effects of the method are that it computes joint angles at multiple parts of the human body, the gait phase states and the gait spatiotemporal parameters, and its improved gait phase division strategy based on a random forest model overcomes the lack of universality of threshold methods. The invention constructs a novel gait spatiotemporal characteristic parameter description model whose gait feature representation is more comprehensive, with the further advantages of non-contact measurement and low cost.
Drawings
Fig. 1 is a schematic diagram of human gait parameter extraction based on random-forest gait phase division.
Fig. 2 is a schematic diagram of an embodiment of a camera coordinate system.
Detailed Description
1) Acquiring the original joint point position sequence data extracted by a Kinect somatosensory depth sensor; the joints here specifically include the hip, knee, ankle and foot joints, and the original joint point position sequence data comprise the spatial position information of the hip, knee, ankle and foot joints;
2) Filtering the original joint point position sequence data and calculating the original gait parameter values, which specifically comprise: knee joint angle, ankle joint angle, foot joint velocity, ankle angular velocity, knee angular velocity and hip joint angle;
Embodiments preferably filter the Kinect raw data with the Savitzky-Golay filtering algorithm.
Because the Savitzky-Golay algorithm follows the idea of local polynomials, performing a least-squares best fit within a moving window, it preserves distribution characteristics of the observed samples such as maxima, minima and width, making it suitable for processing the Kinect raw data to eliminate certain errors. In the invention the sliding-window width is set to 17, and the original joint point position sequence data are filtered accordingly.
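For illustration, a minimal Python sketch of this filtering step using SciPy's savgol_filter; the polynomial order is an assumption, as the patent specifies only the window width of 17:

```python
import numpy as np
from scipy.signal import savgol_filter

def smooth_joints(positions, window=17, polyorder=3):
    """Savitzky-Golay smoothing of an (N, 3) joint-position sequence, per axis.

    window=17 follows the patent; polyorder=3 is an assumed value, since
    the patent does not state the polynomial degree.
    """
    return savgol_filter(positions, window_length=window,
                         polyorder=polyorder, axis=0)

# Example: smooth a noisy ankle trajectory
ankle_raw = np.cumsum(np.random.randn(300, 3) * 0.01, axis=0)
ankle_smooth = smooth_joints(ankle_raw)
```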
The knee joint and ankle joint are treated as rigid-body hinge models. The knee joint angle $\theta_{Knee}$ can therefore be calculated from the filtered spatial position information of the hip, knee and ankle joints; the information used here is the knee-to-hip direction vector $e_1$ and the knee-to-ankle direction vector $e_2$:

$$\theta_{Knee} = \arccos\frac{e_1 \cdot e_2}{\|e_1\|\,\|e_2\|}$$

The ankle joint angle $\theta_{Ankle}$ is calculated from the knee, ankle and foot joint information; the information used here is the ankle-to-knee direction vector $e_3$ and the ankle-to-foot direction vector $e_4$:

$$\theta_{Ankle} = \arccos\frac{e_3 \cdot e_4}{\|e_3\|\,\|e_4\|}$$

The hip joint angle is expressed by the angle $\theta_{Hip\_Transverse}$ between the vector $e_h$ formed by the hip and knee joints and the horizontal plane, the angle $\theta_{Hip\_Coronal}$ with the sagittal plane, and the angle $\theta_{Hip\_Sagittal}$ with the coronal plane; relative to this vector, the normal vector of the horizontal plane is $l_1$, that of the sagittal plane $l_2$ and that of the coronal plane $l_3$:

$$\theta_{Hip\_Transverse} = \arcsin\frac{|e_h \cdot l_1|}{\|e_h\|\,\|l_1\|}, \qquad \theta_{Hip\_Coronal} = \arcsin\frac{|e_h \cdot l_2|}{\|e_h\|\,\|l_2\|}, \qquad \theta_{Hip\_Sagittal} = \arcsin\frac{|e_h \cdot l_3|}{\|e_h\|\,\|l_3\|}$$
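These angle computations reduce to dot products; the sketch below is an illustrative implementation, with angle_between and vector_plane_angle as hypothetical helper names, and the joint positions and y-up horizontal-plane normal as assumed example values:

```python
import numpy as np

def angle_between(u, v):
    """Angle (degrees) between two 3-D vectors, e.g. theta_Knee from e1, e2."""
    c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

def vector_plane_angle(u, normal):
    """Angle (degrees) between a vector and the plane with the given normal."""
    s = abs(np.dot(u, normal)) / (np.linalg.norm(u) * np.linalg.norm(normal))
    return np.degrees(np.arcsin(np.clip(s, 0.0, 1.0)))

# Joint positions (metres, camera coordinates) for one frame -- illustrative values
hip, knee = np.array([0.0, 0.9, 2.0]), np.array([0.05, 0.5, 2.0])
ankle, foot = np.array([0.05, 0.1, 2.1]), np.array([0.05, 0.05, 1.95])

e1, e2 = hip - knee, ankle - knee      # knee->hip, knee->ankle
e3, e4 = knee - ankle, foot - ankle    # ankle->knee, ankle->foot
theta_knee = angle_between(e1, e2)
theta_ankle = angle_between(e3, e4)

l1 = np.array([0.0, 1.0, 0.0])         # horizontal-plane normal (assumed y-up)
theta_hip_transverse = vector_plane_angle(knee - hip, l1)
```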
In the camera coordinate system shown in Fig. 2, the velocity of the foot joint point is calculated as

$$v_{foot} = f \cdot \left( x_t - x_{t-1},\; y_t - y_{t-1},\; z_t - z_{t-1} \right)$$

where the z-axis of the camera coordinate system points in the human walking direction, $(x_t, y_t, z_t)$ and $(x_{t-1}, y_{t-1}, z_{t-1})$ are the three-dimensional position coordinates of the foot joint at time t and the preceding time t-1, and f is the sampling frequency of the Kinect somatosensory depth sensor.
The ankle angular velocity $\omega_{Ankle}$ is calculated from the ankle angle $\theta_{t\_Ankle}$ at the current time t and the ankle angle $\theta_{t-1\_Ankle}$ at the preceding time t-1:

$$\omega_{Ankle} = f \left( \theta_{t\_Ankle} - \theta_{t-1\_Ankle} \right)$$

and the knee angular velocity $\omega_{Knee}$ from the knee angle $\theta_{t\_Knee}$ at time t and the knee angle $\theta_{t-1\_Knee}$ at time t-1:

$$\omega_{Knee} = f \left( \theta_{t\_Knee} - \theta_{t-1\_Knee} \right)$$
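These velocity and angular-velocity definitions are plain first-order finite differences scaled by the sampling frequency; a short sketch, where the 30 Hz rate and the synthetic inputs are assumptions:

```python
import numpy as np

def finite_difference(series, f):
    """First-order difference scaled by the sampling frequency f (Hz)."""
    return np.diff(series, axis=0) * f

f = 30.0                                      # Kinect sampling frequency (assumed)
foot_xyz = np.cumsum(np.random.randn(100, 3) * 0.01, axis=0)
theta_knee = np.random.rand(100) * 60.0       # stand-in knee-angle sequence (deg)

v_foot = finite_difference(foot_xyz, f)       # per-axis foot-joint velocity
omega_knee = finite_difference(theta_knee, f) # knee angular velocity
```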
3) Gait phase division:
after the original gait parameter value is obtained, the characteristic construction is carried out through Savitzky Golay filter processing again. The c-frame original gait feature comprises: c-frame three-dimensional foot joint coordinate x after Savitzky Golay filtering treatment foot_c ,y foot_c ,z foot_c Three-dimensional foot joint velocity v of c-th frame foot_x_c ,v foot_y_c ,v foot_z_c One-dimensional knee joint angle theta of c frame Knee_c One-dimensional knee joint angular velocity omega of c frame Knee_c Composition of the compositionIs a 8-dimensional data of (2). C frame gait feature space f c Consisting of three channels of original gait features, including static posture features f cc Characteristic of continuous motion change f ci And overall change characteristic f cp ,f c 24 dimensions in total:
f c =[f cc ,f cp ,f ci ]。
where the static posture feature $f_{cc}$ is simply the original feature:

$$f_{cc} = [x_{foot\_c},\, y_{foot\_c},\, z_{foot\_c},\, v_{foot\_x\_c},\, v_{foot\_y\_c},\, v_{foot\_z\_c},\, \theta_{Knee\_c},\, \omega_{Knee\_c}]$$
the continuous motion change feature $f_{ci}$ consists of the change of these 8 features of frame c relative to the previous frame i:

$$f_{ci} = [x_{foot\_c} - x_{foot\_i},\, y_{foot\_c} - y_{foot\_i},\, z_{foot\_c} - z_{foot\_i},\, v_{foot\_x\_c} - v_{foot\_x\_i},\, v_{foot\_y\_c} - v_{foot\_y\_i},\, v_{foot\_z\_c} - v_{foot\_z\_i},\, \theta_{Knee\_c} - \theta_{Knee\_i},\, \omega_{Knee\_c} - \omega_{Knee\_i}]$$
and the overall change feature $f_{cp}$ consists of the change of the same 8 features of frame c (three-dimensional foot joint coordinates, three-dimensional foot joint velocity, one-dimensional knee joint angle and one-dimensional knee angular velocity) relative to the first frame p:

$$f_{cp} = [x_{foot\_c} - x_{foot\_p},\, y_{foot\_c} - y_{foot\_p},\, z_{foot\_c} - z_{foot\_p},\, v_{foot\_x\_c} - v_{foot\_x\_p},\, v_{foot\_y\_c} - v_{foot\_y\_p},\, v_{foot\_z\_c} - v_{foot\_z\_p},\, \theta_{Knee\_c} - \theta_{Knee\_p},\, \omega_{Knee\_c} - \omega_{Knee\_p}]$$
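A compact sketch of assembling the 24-dimensional feature space from the 8-dimensional original features; build_feature_space is a hypothetical helper name, and padding the first frame of $f_{ci}$ with zeros is an assumption the patent leaves open:

```python
import numpy as np

def build_feature_space(base):
    """Stack the three feature channels f_c = [f_cc, f_cp, f_ci].

    base: (N, 8) per-frame original features
          [x_foot, y_foot, z_foot, v_x, v_y, v_z, theta_knee, omega_knee].
    Returns (N, 24): the static posture features, their change relative to
    the first frame (overall change), and their change relative to the
    previous frame (continuous motion change; zeros for the first frame).
    """
    f_cc = base
    f_cp = base - base[0]
    f_ci = np.vstack([np.zeros((1, base.shape[1])), np.diff(base, axis=0)])
    return np.hstack([f_cc, f_cp, f_ci])

# Example: 100 frames of synthetic original features -> (100, 24) feature space
f_c = build_feature_space(np.random.randn(100, 8))
```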
and obtaining a unilateral state S by adopting a gait phase division strategy based on a random forest model according to an integral gait feature space formed by continuous multiframes, wherein 0 represents a supporting phase and 1 represents a swinging phase. The judgment of the supporting phase and the swinging phase corresponding to each frame can be obtained by utilizing the integral gait feature space and the random forest model, but the false judgment condition still appears in the actual result, for example, the shaking in the supporting period is misjudged as the swinging period, and the shaking in the swinging period is misjudged as the supporting period. Therefore, erroneous judgment correction of the abnormal phase is required. Since the step length of each step of a normal person is about 1m and the half step length is about 0.5m, if the position offset corresponding to a certain phase is smaller than 0.5m, the same phase should be judged, and the misjudgment correction is carried out on the single-side state S:
wherein D is t For the walking distance of the foot in a certain swing stage, L t_start For the position coordinates (x) t_start ,y t_start ,z t_start ),L t_end For the position coordinates (x) t_end ,y t_end ,z t_end ),Representing the inversion of the state S, S processed The single-side state after erroneous judgment correction processing.
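As an illustration of this step, the sketch below trains a scikit-learn random forest as the preliminary per-frame classifier and then applies the misjudgment correction, demoting swing segments whose foot travel falls short of the 0.5 m half-step preset value; the training data here are synthetic stand-ins, and the estimator settings are assumptions:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X_train = rng.normal(size=(600, 24))    # stand-in 24-D gait feature rows
y_train = rng.integers(0, 2, size=600)  # 0 = support phase, 1 = swing phase

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

def correct_phases(states, foot_xyz, half_step=0.5):
    """Demote any contiguous swing segment whose foot travel is < half_step (m)."""
    states = states.copy()
    start = 0
    for end in range(1, len(states) + 1):
        if end == len(states) or states[end] != states[start]:
            if states[start] == 1:  # preliminary judgment: swing phase
                travel = np.linalg.norm(foot_xyz[end - 1] - foot_xyz[start])
                if travel < half_step:
                    states[start:end] = 0  # corrected to support phase
            start = end
    return states

# X: features of the sequence to segment; foot_xyz: matching foot positions
X = rng.normal(size=(200, 24))
foot_xyz = np.cumsum(rng.normal(scale=0.01, size=(200, 3)), axis=0)
s_processed = correct_phases(clf.predict(X), foot_xyz)
```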
4) Gait spatiotemporal parameter calculation
After the gait phase division is obtained, the start frame and end frame of each swing period are known; the corresponding joint point position coordinates give the start position $L_{t\_start}$ and end position $L_{t\_end}$ of the swing period, together with the start time $t_{start}$ and end time $t_{end}$. From these the gait spatiotemporal parameters are calculated: step length, stride length, cadence and walking speed.
Step length is the distance, in m, in the direction of travel between the left and right foot joints during adjacent support periods while walking. It is computed by taking the start of a support period (support phase) of one lower limb and the start of the first subsequent support period of the opposite lower limb, measuring the distance between the two feet in the direction of travel, and averaging over the observed support periods:

$$StepLength = \frac{1}{N_1}\sum_{i=1}^{N_1} \left| z_{foot\_ri} - z_{foot\_li} \right|$$

where $li$ denotes the i-th support period of the left lower limb and $ri$ the i-th support period of the right lower limb; $N_1$ is the total number of unilateral support periods observed; $L_{foot\_li}$ is the spatial coordinate $(x_{foot\_li}, y_{foot\_li}, z_{foot\_li})$ of the left foot at the start of its i-th support period, and $L_{foot\_ri}$ is the spatial coordinate $(x_{foot\_ri}, y_{foot\_ri}, z_{foot\_ri})$ of the right foot at the start of the adjacent right-foot support period corresponding to $L_{foot\_li}$. Averaging the forward-direction position differences of adjacent left and right support periods completes the step-length calculation.
Stride length is the distance, in m, travelled in the direction of travel by the ipsilateral foot joint between adjacent support periods. It is computed from the start of a support period of the ipsilateral foot joint and the end of the corresponding swing period, taking the spatial distance change of the ipsilateral foot joint over several time periods and averaging:

$$StrideLength = \frac{1}{N_2 - 1}\sum_{k=1}^{N_2 - 1} \left\| L^{k+1}_{foot\_l} - L^{k}_{foot\_l} \right\|$$

where $N_2$ is the total number of unilateral support periods observed; $L^{k}_{foot\_l}$ is the spatial coordinate of the left foot joint at the start of the left foot's k-th support period, and $L^{k+1}_{foot\_l}$ the spatial coordinate at the start of its (k+1)-th support period. Averaging the position differences between adjacent support and swing periods on the left and right sides completes the stride-length calculation.
Cadence is the number of steps per unit time during walking, in steps/minute. The number of left and right swing periods gives the number of steps, and the time from the start of the first swing period to the end of the last swing period is taken as the accumulated walking time:

$$Cadence = \frac{60\,N}{t^{N}_{end} - t^{1}_{start}}$$

where $N$ is the number of left and right swing periods, $t^{1}_{start}$ is the start time of the first swing period and $t^{N}_{end}$ the end time of the last swing period.
Walking speed (velocity) is the distance walked per unit time, in m/s, and can evidently be calculated as

$$Velocity = \frac{\left| z^{N}_{end} - z^{1}_{start} \right|}{t^{N}_{end} - t^{1}_{start}}$$

where $z^{1}_{start}$ and $z^{N}_{end}$ are the forward-direction foot-joint coordinates at the start of the first swing period and the end of the last swing period.
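A sketch of these four computations under the conventions above; the z-axis is the walking direction, and taking each support period to begin where the preceding swing period of the same side ends is an assumption for illustration:

```python
import numpy as np

def spatiotemporal_params(swings_l, swings_r, foot_l, foot_r, times):
    """Sketch of the step/stride/cadence/speed computations.

    swings_l, swings_r: per-side lists of (start_frame, end_frame) swing periods;
    foot_l, foot_r: (N, 3) foot-joint trajectories, z being the walking direction;
    times: (N,) frame timestamps in seconds.
    """
    sup_l = [end for _, end in swings_l]  # left support-period start frames
    sup_r = [end for _, end in swings_r]  # right support-period start frames

    # Step length: forward (z) gap between adjacent left/right support starts
    step_len = float(np.mean([abs(foot_r[r, 2] - foot_l[l, 2])
                              for l, r in zip(sup_l, sup_r)]))

    # Stride length: same-foot displacement between consecutive support starts
    stride_len = float(np.mean([np.linalg.norm(foot_l[b] - foot_l[a])
                                for a, b in zip(sup_l, sup_l[1:])]))

    # Cadence: steps per minute from first swing start to last swing end
    swings = sorted(swings_l + swings_r)
    duration = times[swings[-1][1]] - times[swings[0][0]]
    cadence = 60.0 * len(swings) / duration

    # Walking speed: forward displacement per unit time (m/s)
    speed = abs(foot_l[swings[-1][1], 2] - foot_l[swings[0][0], 2]) / duration
    return step_len, stride_len, cadence, speed

# Example with synthetic straight-line walking at 30 Hz
times = np.arange(200) / 30.0
foot_l = np.zeros((200, 3)); foot_l[:, 2] = np.linspace(0.0, 2.0, 200)
foot_r = np.zeros((200, 3)); foot_r[:, 2] = np.linspace(0.3, 2.3, 200)
print(spatiotemporal_params([(10, 40), (70, 100)], [(40, 70), (100, 130)],
                            foot_l, foot_r, times))
```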

Claims (6)

1. A Kinect-based human gait parameter extraction method, characterized by comprising the following steps:
1) Acquiring original joint point position sequence data extracted by a Kinect somatosensory depth sensor, wherein the original joint point position sequence data comprises spatial position information of a hip joint, a knee joint, an ankle joint and a foot joint;
2) Filtering the original joint point position sequence data and calculating original gait parameter values, wherein the original gait parameter values comprise knee joint angle, ankle joint angle, hip joint angle, foot joint speed, ankle joint angular speed and knee joint angular speed;
3) Constructing a gait feature space according to the original gait parameter value, wherein the gait feature space comprises a static posture feature, a continuous motion change feature and an integral change feature;
the static posture features include foot joint coordinates, foot joint velocity, knee joint angle, and knee joint angular velocity of the current frame;
the continuous motion change features include differences between foot joint coordinates, foot joint velocity, knee joint angle, and knee joint angular velocity of the current frame and the previous frame;
the overall change characteristics include differences between the foot joint coordinates, foot joint velocity, knee joint angle, and knee joint angular velocity of the current frame and the first frame;
4) Using the gait feature space, a preliminary judgment of the unilateral state of each frame is obtained with a gait phase division strategy based on a random forest model, and misjudgment correction is then performed on abnormal phases to obtain the final gait phase division result; the method of misjudgment correction for abnormal phases is: when the preliminary judgment is the swing phase, further judge whether the walking distance of the foot during that swing stage is greater than or equal to a half-step preset value; if so, the final gait phase division result for the unilateral state of that frame is the swing phase, otherwise it is corrected to the support phase;
5) Determining the start frame and end frame of the swing period from the final gait phase division result, and obtaining the start position, end position, start time and end time of the swing period from the joint point position coordinates in the start and end frames, so as to calculate the gait spatiotemporal parameters;
the specific method of misjudgment correction for abnormal phases is:

$$S_{processed} = \begin{cases} S, & D_t \ge 0.5 \\ \bar{S}, & D_t < 0.5 \end{cases}, \qquad D_t = \left\| L_{t\_end} - L_{t\_start} \right\|$$

where $S_{processed}$ is the final gait phase division; $S$ is the preliminary judgment result, 0 denoting that the unilateral state is the support phase and 1 the swing phase; $\bar{S}$ denotes the inversion of $S$; $L_{t\_start}$ is the spatial position coordinate $(x_{t\_start}, y_{t\_start}, z_{t\_start})$ at the start of the swing stage, and $L_{t\_end}$ the spatial position coordinate $(x_{t\_end}, y_{t\_end}, z_{t\_end})$ at its end; 0.5 is the half-step preset value, in m;
wherein, with the joint point position coordinates giving the start position $L_{t\_start}$ and end position $L_{t\_end}$ of the swing stage, together with the start time $t_{start}$ and end time $t_{end}$, the specific steps of calculating the gait spatiotemporal parameters are as follows:
the gait spatiotemporal parameters comprise step length, stride length, cadence and walking speed;
the step length is obtained by taking the start of a support period of one lower limb and the start of the first subsequent support period of the opposite lower limb; the distance $D_i$ between the two feet in the direction of travel is obtained from the foot joint coordinates at these two instants, and the final result is the mean of the distances over several support periods:

$$StepLength = \frac{1}{N_1}\sum_{i=1}^{N_1} \left| z_{foot\_ri} - z_{foot\_li} \right|$$

where $li$ denotes the i-th support period of the left lower limb and $ri$ the i-th support period of the right lower limb; $N_1$ is the total number of unilateral support periods observed; $L_{foot\_li}$ is the spatial coordinate $(x_{foot\_li}, y_{foot\_li}, z_{foot\_li})$ of the left foot at the start of its i-th support period, and $L_{foot\_ri}$ is the spatial coordinate $(x_{foot\_ri}, y_{foot\_ri}, z_{foot\_ri})$ of the right foot at the start of the adjacent right-foot support period corresponding to $L_{foot\_li}$; the step length is obtained by averaging the forward-direction position differences of several adjacent left and right support periods;
the stride length uses the start of a support period of the ipsilateral foot joint and the end of the corresponding swing period to compute the spatial distance change $D_k$ of the ipsilateral foot joint over several time periods, taking the mean:

$$StrideLength = \frac{1}{N_2 - 1}\sum_{k=1}^{N_2 - 1} \left\| L^{k+1}_{foot\_l} - L^{k}_{foot\_l} \right\|$$

where $N_2$ is the total number of unilateral support periods observed; $L^{k}_{foot\_l}$ is the spatial coordinate of the left foot joint at the start of the left foot's k-th support period, and $L^{k+1}_{foot\_l}$ the spatial coordinate at the start of its (k+1)-th support period; the stride length is obtained by averaging the position differences of adjacent support and swing periods on the left and right sides;
the cadence represents the number of steps by the number of left and right swing periods, and the time from the start of the first swing period to the end of the last swing period is taken as the accumulated walking time:

$$Cadence = \frac{60\,N}{t^{N}_{end} - t^{1}_{start}}$$

where $N$ is the number of left and right swing periods, $t^{1}_{start}$ is the start time of the first swing period and $t^{N}_{end}$ the end time of the last swing period;
the walking speed is calculated, in m/s, as

$$Velocity = \frac{\left| z^{N}_{end} - z^{1}_{start} \right|}{t^{N}_{end} - t^{1}_{start}}$$

where $z^{1}_{start}$ and $z^{N}_{end}$ are the forward-direction foot-joint coordinates at the start of the first swing period and the end of the last swing period.
2. The method of claim 1, wherein the filtering of the original joint point position sequence data uses Savitzky-Golay filtering.
3. The method of claim 2, wherein the sliding-window width used for the Savitzky-Golay filtering is set to 17.
4. The method of claim 1, wherein the spatial position information of the hip, knee, ankle and foot joints specifically comprises the knee-to-hip direction vector $e_1$, the knee-to-ankle direction vector $e_2$, the ankle-to-knee direction vector $e_3$, the ankle-to-foot direction vector $e_4$, the normal vector $l_1$ of the horizontal plane corresponding to the vector formed by the hip and knee joints, the normal vector $l_2$ of the corresponding sagittal plane, and the normal vector $l_3$ of the corresponding coronal plane;

the knee joint angle $\theta_{Knee}$ is calculated as

$$\theta_{Knee} = \arccos\frac{e_1 \cdot e_2}{\|e_1\|\,\|e_2\|};$$

the ankle joint angle $\theta_{Ankle}$ is calculated as

$$\theta_{Ankle} = \arccos\frac{e_3 \cdot e_4}{\|e_3\|\,\|e_4\|};$$

the hip joint angle is expressed by the angle $\theta_{Hip\_Transverse}$ between the vector $e_h$ formed by the hip and knee joints and the horizontal plane, the angle $\theta_{Hip\_Coronal}$ with the sagittal plane, and the angle $\theta_{Hip\_Sagittal}$ with the coronal plane:

$$\theta_{Hip\_Transverse} = \arcsin\frac{|e_h \cdot l_1|}{\|e_h\|\,\|l_1\|}, \qquad \theta_{Hip\_Coronal} = \arcsin\frac{|e_h \cdot l_2|}{\|e_h\|\,\|l_2\|}, \qquad \theta_{Hip\_Sagittal} = \arcsin\frac{|e_h \cdot l_3|}{\|e_h\|\,\|l_3\|}.$$
5. The method of claim 1, wherein the velocity of the foot joint in the camera coordinate system is calculated as

$$v_{foot} = f \cdot \left( x_t - x_{t-1},\; y_t - y_{t-1},\; z_t - z_{t-1} \right)$$

where the z-axis of the camera coordinate system points in the human walking direction, $(x_t, y_t, z_t)$ and $(x_{t-1}, y_{t-1}, z_{t-1})$ are the three-dimensional position coordinates of the foot joint at time t and the preceding time t-1, and f is the sampling frequency of the Kinect somatosensory depth sensor.
6. The method of claim 1, wherein the ankle angular velocity $\omega_{Ankle}$ is calculated as

$$\omega_{Ankle} = f \left( \theta_{t\_Ankle} - \theta_{t-1\_Ankle} \right)$$

and the knee angular velocity $\omega_{Knee}$ as

$$\omega_{Knee} = f \left( \theta_{t\_Knee} - \theta_{t-1\_Knee} \right)$$

where $\theta_{t\_Ankle}$ is the ankle joint angle at the current time t, $\theta_{t-1\_Ankle}$ the ankle joint angle at the preceding time t-1, $\theta_{t\_Knee}$ the knee joint angle at time t, $\theta_{t-1\_Knee}$ the knee joint angle at time t-1, and f is the sampling frequency of the Kinect somatosensory depth sensor.
CN202210185789.7A 2022-02-28 2022-02-28 Kinect-based human gait parameter extraction method Active CN114663913B

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210185789.7A CN114663913B (en) 2022-02-28 2022-02-28 Kinect-based human gait parameter extraction method


Publications (2)

Publication Number Publication Date
CN114663913A CN114663913A (en) 2022-06-24
CN114663913B true CN114663913B (en) 2023-10-31

Family

ID=82028130

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210185789.7A Active CN114663913B (en) 2022-02-28 2022-02-28 Kinect-based human gait parameter extraction method

Country Status (1)

Country Link
CN (1) CN114663913B

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115153517B (en) * 2022-07-18 2023-03-28 北京中科睿医信息科技有限公司 Testing method, device, equipment and storage medium for timing, standing and walking test
CN117059227B (en) * 2023-10-13 2024-01-30 华南师范大学 Motion monitoring method and device based on gait data and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101403925A (en) * 2008-10-28 2009-04-08 北京理工大学 Control method and system for touchdown time of stable walking feet of humanoid robot
CN104296750A (en) * 2014-06-27 2015-01-21 大连理工大学 Zero speed detecting method, zero speed detecting device, and pedestrian navigation method as well as pedestrian navigation system
CN110021398A (en) * 2017-08-23 2019-07-16 陆晓 A kind of gait analysis, training method and system
CN112115923A (en) * 2020-10-12 2020-12-22 武汉艾格美康复器材有限公司 Multichannel time sequence gait analysis algorithm based on direct feature extraction

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6272735B2 (en) * 2014-06-19 2018-01-31 本田技研工業株式会社 Walking assistance device and walking control program


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
"基于Kinect的人体步态分析和重心估算方法研究";荀小飞;《中国优秀硕士学位论文全文数据库 基础科学辑》(第(2022)01期);A006-600,第二章、第三章 *
Body Sensor Network-Based Gait Quality Assessment for Clinical Decision-Support via Multi-Sensor Fusion;SEN QIU 等;IEEE Access;第7卷;59884 - 59894 *
多运动状态下自适应阈值步态检测算法;宁一鹏 等;中国惯性技术学报;第28卷(第02期);172-178+185 *
异常步态3维人体建模和可变视角识别;罗坚 等;中国图象图形学报;第25卷(第08期);1539-1550 *

Also Published As

Publication number Publication date
CN114663913A 2022-06-24

Similar Documents

Publication Publication Date Title
CN114663913B (en) Kinect-based human gait parameter extraction method
CN107174255B (en) Three-dimensional gait information acquisition and analysis method based on Kinect somatosensory technology
CN106821391B (en) Human body gait acquisition and analysis system and method based on inertial sensor information fusion
CN110916679A (en) Human body lower limb pose gait detection device and method
Gujarathi et al. Gait analysis using imu sensor
Yoon et al. Improvement of dynamic respiration monitoring through sensor fusion of accelerometer and gyro-sensor
CN108245164B (en) Human body gait information acquisition and calculation method for wearable inertial device
Wagenaar et al. Continuous monitoring of functional activities using wearable, wireless gyroscope and accelerometer technology
Stamatakis et al. Gait feature extraction in Parkinson's disease using low-cost accelerometers
CN109579853A (en) Inertial navigation indoor orientation method based on BP neural network
Laudanski et al. A concurrent comparison of inertia sensor-based walking speed estimation methods
US10285628B2 (en) Method for detecting ambulatory status and device for detecting ambulatory status
JP6127873B2 (en) Analysis method of walking characteristics
Han et al. Gait detection from three dimensional acceleration signals of ankles for the patients with Parkinson’s disease
Andrade et al. Pelvic movement variability of healthy and unilateral hip joint involvement individuals
CN113768471B (en) Parkinson disease auxiliary diagnosis system based on gait analysis
Loose et al. Gait patterns in standard scenarios: Using Xsens MTw inertial measurement units
CN114271812A (en) Three-dimensional gait analysis system and method based on inertial sensor
Li et al. Multi-body sensor data fusion to evaluate the hippotherapy for motor ability improvement in children with cerebral palsy
Clément et al. Instantaneous velocity estimation for the four swimming strokes using a 3-axis accelerometer: Validation on paralympic athletes
CN114287890A (en) Method for evaluating motion function of Parkinson patient based on MEMS sensor
JP2019122609A (en) System and method for analysis of operation smoothness
Tham et al. Biomechanical ambulatory assessment of 3D knee angle using novel inertial sensor-based technique
Jamali et al. Quantitative evaluation of parameters affecting the accuracy of microsoft kinect in gait analysis
CN208876547U (en) A kind of gait analysis device based on IMU inertial sensor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant