CN114694252A - Old people falling risk prediction method - Google Patents

Old people falling risk prediction method Download PDF

Info

Publication number
CN114694252A
CN114694252A
Authority
CN
China
Prior art keywords
risk
time
leg
angle
clip
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210321855.9A
Other languages
Chinese (zh)
Other versions
CN114694252B (en)
Inventor
李洪波
刘勇国
朱嘉静
张云
李巧勤
傅翀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China
Priority to CN202210321855.9A
Publication of CN114694252A
Application granted
Publication of CN114694252B
Legal status: Active
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1116 Determining posture transitions
    • A61B5/1117 Fall detection
    • A61B5/112 Gait analysis
    • A61B5/1121 Determining geometric values, e.g. centre of rotation or angular range of movement
    • A61B5/1126 Measuring movement of the entire body or parts thereof using a particular sensing technique
    • A61B5/1128 Measuring movement of the entire body or parts thereof using a particular sensing technique using image analysis
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7271 Specific aspects of physiological measurement analysis
    • A61B5/7275 Determining trends in physiological measurement data; Predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G06N3/08 Learning methods
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Theoretical Computer Science (AREA)
  • Animal Behavior & Ethology (AREA)
  • Physiology (AREA)
  • Public Health (AREA)
  • Surgery (AREA)
  • Medical Informatics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Artificial Intelligence (AREA)
  • Data Mining & Analysis (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Dentistry (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Computational Linguistics (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Evolutionary Biology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Psychiatry (AREA)
  • Signal Processing (AREA)
  • Geometry (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention discloses a method for predicting the fall risk of elderly people, comprising the following steps: S1, data acquisition and preprocessing; S2, skeleton modeling; S3, video time segmentation based on motion features: the video is divided into 7 segments, denoted S0~S6, and the 6 time division points between the 7 segments are denoted T1~T6 in order; S4, time risk judgment; S5, extraction of the posture features of each action stage, frame by frame, from the stage videos obtained by the segmentation of step S3; S6, posture risk judgment according to the posture features obtained in step S5; and S7, final comprehensive risk judgment according to the time risk and posture risk obtained in steps S4 and S6. By integrating the temporal and postural characteristics of the movement process, the obtained risk assessment result reflects true balance and movement ability, and the accuracy of fall risk prediction is improved.

Description

Old people falling risk prediction method
Technical Field
The invention relates to a method for predicting the fall risk of elderly people.
Background
As people age, physical fitness and the function of bones, muscles and other systems naturally decline, the risk of falling increases, and falls easily cause injuries such as fractures. To detect fall risk as early as possible and take effective protective measures, fall risk assessment for the elderly has important practical significance. The TUG test (Timed Up and Go test) is currently a relatively mature method for predicting fall risk in the elderly; it has received wide attention and has been proven effective.
In recent years, researchers have applied computer technology to the TUG test, i.e., automated it. Computer automation of the TUG test retains the utility of the test itself while reducing the direct involvement of professional medical personnel, which improves the accessibility of the test and effectively reduces human error during operation.
Patent application No. 201911412802.2 discloses a method and system for assessing fall risk in the elderly, which extract gait features such as single-leg suspension displacement, knee joint angle and ankle joint angle during walking and use them to judge fall risk.
Existing TUG automation methods are limited to the timing characteristics of the TUG and ignore the rich motion and posture characteristics of the TUG process, and the features and algorithms they use for time segmentation give them poor robustness in practical applications. A complete TUG motion sequence includes standing up, walking, turning and sitting down, and these actions reflect the physical strength, balance level and other characteristics of the elderly from different angles. Application 201911412802.2 focuses mainly on extracting features of the walking stage for fall risk judgment, which limits the accuracy of the result.
Disclosure of Invention
The invention aims to overcome the shortcomings of the prior art and provide a fall risk prediction method for the elderly that integrates the temporal and postural characteristics of the movement process, so that the obtained risk assessment result reflects true balance and movement ability and the accuracy of fall risk prediction is improved.
The purpose of the invention is achieved by the following technical solution: a method for predicting the fall risk of elderly people, comprising the following steps:
S1, data acquisition and preprocessing, comprising the following substeps:
S11, arrange the data acquisition environment according to the TUG standard: a chair 60 cm high facing a flat walkway 3 m long; the camera must have a resolution of at least 720p and is fixed at a height of 1.5 m;
S12, recording starts with the subject seated; the subject then stands up, walks forward, turns around after reaching the end of the walkway, walks back, returns to the starting point and sits down again, and recording stops; the resulting video is preprocessed to 1280x720 pixels at 30 frames per second;
S13, the subject's actual number of falls in the last two years is used as the label of the sample data to form a training data set;
S14, fall risk is classified into 3 levels: low risk corresponds to no fall record within 2 years, medium risk to 1 or 2 fall records within 2 years, and high risk to 3 or more fall records;
S2, skeleton modeling: the human body in the video is modeled with OpenPose as a skeleton of 13 nodes, denoted J0~J12 and corresponding to the joints: J0 head, J1 left shoulder, J2 right shoulder, J3 left elbow, J4 right elbow, J5 left wrist, J6 right wrist, J7 left hip, J8 right hip, J9 left knee, J10 right knee, J11 left ankle, J12 right ankle;
The horizontal-axis and vertical-axis coordinates on each 2D image frame are obtained through the OpenPose framework;
The absolute coordinates obtained by OpenPose are then reconstructed into a relative coordinate system: in each frame, the center of the person, i.e., J1, J2, J7 and J8, is located from the absolute node coordinates, the human body is enclosed by a rectangular outer frame of fixed aspect ratio, the lower-left corner of the frame is taken as the coordinate origin with the axis directions unchanged, and a new coordinate system is established;
S3, video time segmentation based on motion features: the video is divided into 7 segments, denoted S0~S6; the 6 time division points between the 7 segments are denoted T1~T6 in order, and the start and end time points of the whole video are denoted T0 and T7; S0 corresponds to sitting still, S1 to standing up, S2 to walking forward, S3 to turning in place, S4 to walking back, S5 to turning and sitting down, and S6 to sitting at the end;
S4, time risk judgment according to the time features obtained from the segmentation of step S3, as follows: the 5 time features are defined as:
total time: f1=|S1|+|S2|+|S3|+|S4|+|S5|=T6-T1
Rising stage time: f2=|S1|=T2-T1
Duration of walking phase: f3=|S2|+|S4|=T5+T3-T4-T2
Turning stage time: f4=|S3|=T4-T3
Turning body and sitting down stage time: f5=|S5|=T6-T5
Thresholds Th1~Th5 are set for the five time features; the risk assessment weights different time features differently, so a risk score weight W1~W5 is set for each time feature. The time-feature risk score R_time is
R_time = Σ_{i=1}^{5} W_i · u_i
wherein
u_i = 1 if F_i > Th_i, and u_i = 0 otherwise;
If the time-feature risk score R_time exceeds the preset comprehensive time threshold Th_sum, the risk level is directly judged as high and the subsequent steps are not performed; otherwise, step S5 is executed;
S5, extract the posture features of each action stage, frame by frame, from the stage videos obtained by the segmentation of step S3;
S6, judge the posture risk according to the posture features obtained in step S5;
and S7, perform the final comprehensive risk judgment according to the time risk and posture risk obtained in steps S4 and S6.
Further, in step S3, the specific method of video time segmentation based on motion features is as follows:
S31, construct the following motion features:
The height difference between nodes J7, J8 (hips) and J9, J10 (knees) is defined as:
F_clip1 = |(Y(J7) + Y(J8))/2 - (Y(J9) + Y(J10))/2|
Y(J7), Y(J8), Y(J9), Y(J10) denote the ordinates of J7, J8, J9 and J10, respectively;
The horizontal distance between nodes J1 and J2 is defined as:
F_clip2 = |X(J1) - X(J2)|
X(J1), X(J2) denote the abscissas of J1 and J2, respectively;
The horizontal distance between nodes J7 and J8 is defined as:
F_clip3 = |X(J7) - X(J8)|
X(J7), X(J8) denote the abscissas of J7 and J8, respectively;
S32, compute the three motion features of S31 for each frame to obtain three discrete sequences F_clip1, F_clip2, F_clip3; apply the binary change-point search algorithm BinSeg with an L2 cost model, using empirical parameters, to obtain the inflection points of the sequences, and derive the stage division of the actions from these inflection points; specifically:
Using the F_clip1 sequence and an inflection-point search with prior knowledge, the division point T1 between S0 and S1 is determined; specifically, T1 appears at the 1st inflection point of F_clip1;
Using F_clip1, the division point T2 between S1 and S2 is determined; T2 appears at the 2nd inflection point of F_clip1;
Using F_clip2, the division point T3 between S2 and S3 is determined; T3 appears at the 1st inflection point of F_clip2;
Using F_clip3, the division point T4 between S3 and S4 is determined; T4 appears at the 3rd inflection point of F_clip3;
Using F_clip2, the division point T5 between S4 and S5 is determined; T5 appears at the 4th inflection point of F_clip2;
Using F_clip1, the division point T6 between S5 and S6 is determined; T6 appears at the 4th inflection point of F_clip1.
Further, step S5 is implemented as follows:
S51, standing-up fore-aft offset feature F6 of the central axis: the maximum angle by which the central axis deviates from the static sitting posture during standing up; the central axis of the trunk is obtained from the coordinates of the left shoulder, right shoulder, left hip and right hip. The specific calculation is as follows:
First compute the slope of the trunk's central axis:
K = ((Y(J1) + Y(J2))/2 - (Y(J7) + Y(J8))/2) / ((X(J1) + X(J2))/2 - (X(J7) + X(J8))/2)
The slope is then converted into an angle in radians, denoted θ.
For each time t_i, the slope K(t_i) is computed by the above formula and converted into the radian angle θ(t_i).
F6 is then obtained as:
F6 = max_{T1 < t_i < T2} |θ(t_i) - θ_sit|
where T1 < t_i < T2 corresponds to all moments during standing up, and θ_sit denotes the mean offset angle during the sitting stage;
S52, compute the mean offset angle F7 of the central axis during turning:
F7 = mean_{T3 < t_i < T4} |θ(t_i) - θ_stand|
S53, compute the maximum angle F8 by which the central axis deviates from the upright standing state during sitting down:
F8 = max_{T5 < t_i < T6} |θ(t_i) - θ_stand|
where T5 < t_i < T6, and θ_stand denotes the mean offset angle during the standing stage;
S54, compute the walking step-length-to-height ratio feature F9: the horizontal distance between two successive landing points of one ankle is called a single step; the single steps of both feet during walking are averaged, and the ratio of the average step length to body height is computed.
The sets of landing time points are obtained from the extrema of the ankle-height sequences:
Left-foot landing time points:
A_floor_left = { t_m | T2 < t_m < T3 or T4 < t_m < T5, and the left-ankle height Y'(t_m) is a local minimum }
t_m is the time point of the m-th local minimum of the left-ankle height in the absolute coordinate system obtained by OpenPose, and X'(t_m) denotes the left-ankle abscissa in the absolute coordinate system at t_m; the landing time points of the incomplete steps at the start and end are removed, and the average single step is computed;
Right-foot landing time points:
A_floor_right = { t_n | T2 < t_n < T3 or T4 < t_n < T5, and the right-ankle height Y'(t_n) is a local minimum }
t_n is the time point of the n-th local minimum of the right-ankle height, and X'(t_n) is the right-ankle abscissa in the absolute coordinate system at t_n; the landing time points of the incomplete steps at the start and end are removed, and the average single step is computed;
The left-foot single step is:
Step_left(m) = |X'(t_{m+1}) - X'(t_m)|
where X'(t_{m+1}), X'(t_m) are the (m+1)-th and m-th left-ankle abscissas in the absolute coordinate system;
The right-foot single step is:
Step_right(n) = |X'(t_{n+1}) - X'(t_n)|
where X'(t_{n+1}), X'(t_n) are the (n+1)-th and n-th right-ankle abscissas in the absolute coordinate system;
The step-length-to-height ratio feature F9 is computed as:
F9 = ((Σ_m Step_left(m) + Σ_n Step_right(n)) / (M + N)) / BH
where M and N are the numbers of left- and right-foot single steps, and BH denotes body height;
S55, compute the left-right offset feature F10 of the central axis during walking, i.e., the average horizontal angular deviation of the central axis from upright while walking:
F10 = mean_{T2 < t_i < T3 or T4 < t_i < T5} |θ(t_i) - θ_std|
where θ_std is the central-axis angle when standing upright, taken as the mean axis angle over the walking frames:
θ_std = mean_{T2 < t_i < T3 or T4 < t_i < T5} θ(t_i)
S56, walking arm-swing amplitude feature F11: obtained from the angles of the vectors formed by the left shoulder and left elbow and by the right shoulder and right elbow, i.e., the angular difference between the elbow-shoulder vector at the farthest backward swing and the elbow-shoulder vector at the farthest forward swing.
First compute the arm-swing slopes of the left and right arms:
K_left_arm = (Y'(J3) - Y'(J1)) / (X'(J3) - X'(J1)), K_right_arm = (Y'(J4) - Y'(J2)) / (X'(J4) - X'(J2))
X'(Jc), Y'(Jc) are the coordinates of Jc (c = 1, ..., 4) in the absolute coordinate system obtained by OpenPose;
Convert the slopes into the radian angles α_left_arm and α_right_arm;
Compute the arm-swing angles of the left and right arms in each frame to form left- and right-arm swing-angle sequences, and find the extrema of the sequences to obtain the maximum forward-swing angles A_af_left (left arm) and A_af_right (right arm) and the maximum backward-swing angles A_ab_left (left arm) and A_ab_right (right arm);
Align adjacent A_af_left and A_ab_left values and take their differences to obtain the left-arm single-swing angle set A_as_left;
Align adjacent A_af_right and A_ab_right values and take their differences to obtain the right-arm single-swing angle set A_as_right;
Remove the first and last data of A_as_left and A_as_right, then compute F11 from the remaining sets A'_as_left and A'_as_right:
F11 = (Σ_{α_m ∈ A'_as_left} α_m + Σ_{α_n ∈ A'_as_right} α_n) / (|A'_as_left| + |A'_as_right|)
where |A'_as_left| and |A'_as_right| denote the numbers of elements in A'_as_left and A'_as_right;
S57, compute the walking leg-swing amplitude feature F12: obtained from the angles of the vectors formed by the left hip and left knee and by the right hip and right knee.
Compute the leg-swing slopes of the left and right legs:
K_left_leg = (Y'(J9) - Y'(J7)) / (X'(J9) - X'(J7)), K_right_leg = (Y'(J10) - Y'(J8)) / (X'(J10) - X'(J8))
X'(Jc), Y'(Jc) are the coordinates of Jc (c = 7, ..., 10) in the absolute coordinate system obtained by OpenPose;
Convert K_left_leg and K_right_leg from slopes into the radian angles β_left_leg and β_right_leg;
Compute the leg-swing angles of the left and right legs in each frame to form left- and right-leg swing-angle sequences, and find the extrema to obtain the maximum forward-swing angles A_lf_left and A_lf_right and the maximum backward-swing angles A_lb_left and A_lb_right;
Align adjacent A_lf_left and A_lb_left values and take their differences to obtain the left-leg swing-angle sequence A_ls_left;
Align adjacent A_lf_right and A_lb_right values and take their differences to obtain the right-leg swing-angle sequence A_ls_right;
Remove the first and last data of A_ls_left and A_ls_right, then compute F12 from the remaining sequences A'_ls_left and A'_ls_right:
F12 = (Σ_{β_m ∈ A'_ls_left} β_m + Σ_{β_n ∈ A'_ls_right} β_n) / (|A'_ls_left| + |A'_ls_right|)
where |A'_ls_left| and |A'_ls_right| denote the numbers of elements in A'_ls_left and A'_ls_right.
Further, step S6 is implemented as follows: the posture features F6~F12 are input into a multilayer perceptron for regression prediction to obtain the posture risk value R_posture.
Further, step S7 is implemented as follows: if the time risk R_time obtained in step S4 is greater than the threshold Th_sum, the comprehensive risk judgment directly yields a result and the fall risk is high; otherwise, the following judgment is made:
The weights W_time and W_posture are set by linear fitting, and the final risk is R_final = W_time·R_time + W_posture·R_posture.
The final risk R_final is compared with the high-risk reference threshold Th_danger and the medium-risk reference threshold Th_alarm: if R_final ≥ Th_danger, the subject is considered at high fall risk; if Th_alarm ≤ R_final < Th_danger, the subject is considered at medium fall risk; and if R_final < Th_alarm, the subject is considered at low fall risk.
The beneficial effects of the invention are: the method divides the action video into different stages, predicts the time-dimension fall risk from the time features of each stage, and makes a preliminary fall risk judgment; it then constructs posture features for the different stages based on the video segmentation results and inputs them into a multilayer perceptron for regression prediction, obtaining the posture risk prediction result in the spatial dimension; finally, the risks of the two dimensions, time and space, are integrated to obtain the final fall risk prediction. By integrating the temporal and postural characteristics of the movement process, the obtained risk assessment result reflects true balance and movement ability, and the accuracy of fall risk prediction is improved.
Drawings
FIG. 1 is a flow chart of the fall risk prediction method for elderly people of the present invention;
FIG. 2 is a schematic diagram of the data acquisition environment of the present invention;
FIG. 3 is a schematic diagram of the skeleton modeling of the present invention;
FIG. 4 is a diagram illustrating the natural division of the action phases in the present invention;
FIG. 5 is a diagram illustrating the stage division of the actions according to inflection points in the present invention.
Detailed Description
The technical solution of the invention is further explained below in combination with the drawings.
As shown in fig. 1, the method of the present invention for predicting the fall risk of elderly people includes the following steps:
S1, data acquisition and preprocessing, comprising the following substeps:
S11, arrange the data acquisition environment according to the TUG standard: a chair 60 cm high facing a flat walkway 3 m long; the camera must have a resolution of at least 720p and is fixed at a height of 1.5 m, with the line between the camera and the end of the walkway perpendicular to the walkway, as shown in fig. 2;
S12, recording starts with the subject seated; the subject then stands up, walks forward, turns around after reaching the end of the walkway, walks back, returns to the starting point and sits down again, and recording stops; the resulting video is preprocessed to 1280x720 pixels at 30 frames per second;
S13, the subject's actual number of falls in the last two years is used as the label of the sample data, forming a training data set for parameter fitting;
S14, fall risk is classified into 3 levels: low risk corresponds to no fall record within 2 years, medium risk to 1 or 2 fall records within 2 years, and high risk to 3 or more fall records;
S2, skeleton modeling: the human body in the video is modeled with OpenPose as a skeleton of 13 nodes, denoted J0~J12 and corresponding, as shown in fig. 3, to the joints: J0 head, J1 left shoulder, J2 right shoulder, J3 left elbow, J4 right elbow, J5 left wrist, J6 right wrist, J7 left hip, J8 right hip, J9 left knee, J10 right knee, J11 left ankle, J12 right ankle;
The horizontal-axis and vertical-axis coordinates on each 2D image frame are obtained through the OpenPose framework; in the formulas below, X denotes the horizontal coordinate and Y the vertical coordinate.
During human motion, the size of the body's projection in the image changes. To reduce the influence of this size change on the feature values, the absolute coordinates obtained by OpenPose are reconstructed into a relative coordinate system as follows: in each frame, the center of the person, i.e., J1, J2, J7 and J8, is located from the absolute node coordinates, and the human body is enclosed by a rectangular outer frame of fixed aspect ratio; the lower-left corner of the frame is taken as the coordinate origin with the axis directions unchanged, and a new coordinate system is established. The unit length of the new coordinate system is scaled by the size of the outer frame, i.e., the images inside the frame are scaled to a uniform size. The frame size is chosen by considering both the height (the absolute difference of the Y coordinates) and the width (the absolute difference of the X coordinates): when the height-to-width ratio is greater than a ratio r, the frame is set using the height as the standard; when the ratio is smaller than r, the fixed-aspect-ratio frame is set using the width as the standard, corresponding to a curled body posture.
The resulting framed image has the following properties: when the person is upright, the body nearly fills the vertical axis; when the body is curled, the body width nearly fills the horizontal axis; and the distance between the person and the lens does not affect the body's proportion in the image. Unless otherwise indicated, the reconstructed relative coordinate system is used throughout.
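A minimal sketch of this normalization is given below, assuming the 13 keypoints of one frame arrive as a 13x2 NumPy array with the y-axis pointing up; the helper name to_relative, the ratio value r = 2.0 and the unit-box output size are illustrative assumptions, and deriving the box from the keypoint extremes rather than from the torso center is a simplification of the patent's description:

```python
import numpy as np

def to_relative(joints_abs: np.ndarray, r: float = 2.0) -> np.ndarray:
    """Map 13x2 absolute keypoints into a fixed-aspect-ratio body frame
    whose lower-left corner is the origin (a sketch of step S2)."""
    xs, ys = joints_abs[:, 0], joints_abs[:, 1]
    width = xs.max() - xs.min()
    height = ys.max() - ys.min()
    # Fixed aspect ratio r: when the body is upright (height/width > r)
    # the box is sized from the height; when curled, from the width.
    if height / max(width, 1e-6) > r:
        box_h, box_w = height, height / r
    else:
        box_w, box_h = width, width * r
    origin = np.array([xs.min(), ys.min()])  # lower-left corner of the box
    # Dividing by the box size makes the coordinates independent of the
    # person's distance from the lens.
    return (joints_abs - origin) / np.array([box_w, box_h])
```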
S3, video time segmentation based on motion features: the video is divided into 7 segments, denoted S0~S6; the 6 time division points between the 7 segments are denoted T1~T6 in order, and the start and end time points of the whole video are denoted T0 and T7; S0 corresponds to sitting still, S1 to standing up, S2 to walking forward, S3 to turning in place, S4 to walking back, S5 to turning and sitting down, and S6 to sitting at the end, as shown in fig. 4;
The specific method of video time segmentation based on motion features is:
S31, construct the following motion features:
The height difference between nodes J7, J8 (hips) and J9, J10 (knees) is defined as:
F_clip1 = |(Y(J7) + Y(J8))/2 - (Y(J9) + Y(J10))/2|
Y(J7), Y(J8), Y(J9), Y(J10) denote the ordinates of J7, J8, J9 and J10, respectively;
The horizontal distance between nodes J1 and J2 is defined as:
F_clip2 = |X(J1) - X(J2)|
X(J1), X(J2) denote the abscissas of J1 and J2, respectively;
The horizontal distance between nodes J7 and J8 is defined as:
F_clip3 = |X(J7) - X(J8)|
X(J7), X(J8) denote the abscissas of J7 and J8, respectively;
S32, compute the three motion features of S31 for each frame to obtain three discrete sequences F_clip1, F_clip2, F_clip3; apply the binary change-point search algorithm BinSeg with an L2 cost model, using empirical parameters, to obtain the inflection points of the sequences, and derive the stage division of the actions from these inflection points, as shown in fig. 5; specifically:
Using the F_clip1 sequence and an inflection-point search with prior knowledge, the division point T1 between S0 and S1 is determined; specifically, T1 appears at the 1st inflection point of F_clip1;
Using F_clip1, the division point T2 between S1 and S2 is determined; T2 appears at the 2nd inflection point of F_clip1;
Using F_clip2, the division point T3 between S2 and S3 is determined; T3 appears at the 1st inflection point of F_clip2;
Using F_clip3, the division point T4 between S3 and S4 is determined; T4 appears at the 3rd inflection point of F_clip3;
Using F_clip2, the division point T5 between S4 and S5 is determined; T5 appears at the 4th inflection point of F_clip2;
Using F_clip1, the division point T6 between S5 and S6 is determined; T6 appears at the 4th inflection point of F_clip1.
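This change-point search matches the Binseg detector of the Python ruptures library with an L2 cost; a minimal sketch follows, where the numbers of change points requested per sequence follow the rules above and min_size is an assumed smoothing parameter, not a value given in the patent:

```python
import numpy as np
import ruptures as rpt

def inflection_points(signal, n_bkps):
    """First n_bkps change points (frame indices) of a 1-D feature
    sequence, via binary segmentation with an L2 cost (BinSeg)."""
    sig = np.asarray(signal, dtype=float).reshape(-1, 1)
    algo = rpt.Binseg(model="l2", min_size=10).fit(sig)
    return algo.predict(n_bkps=n_bkps)[:-1]  # drop trailing end-of-signal index

def segment(f_clip1, f_clip2, f_clip3):
    """Map detected inflection points to T1..T6 per the rules above."""
    c1 = inflection_points(f_clip1, 4)  # 1st, 2nd and 4th points used
    c2 = inflection_points(f_clip2, 4)  # 1st and 4th points used
    c3 = inflection_points(f_clip3, 3)  # 3rd point used
    t1, t2, t6 = c1[0], c1[1], c1[3]
    t3, t5 = c2[0], c2[3]
    t4 = c3[2]
    return t1, t2, t3, t4, t5, t6
```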
S4, time risk judgment according to the time features obtained from the segmentation of step S3, as follows: the 5 time features are defined as:
total time: f1=|S1|+|S2|+|S3|+|S4|+|S5|=T6-T1
Rising stage time: f2=|S1|=T2-T1
Duration of walking phase: f3=|S2|+|S4|=T5+T3-T4-T2
Turn-around phase time: f4=|S3|=T4-T3
Turning body and sitting down stage time: f5=|S5|=T6-T5
Thresholds Th1~Th5 are set for the five time features; the risk assessment weights different time features differently, so a risk score weight W1~W5 is set for each time feature. The time-feature risk score R_time is
R_time = Σ_{i=1}^{5} W_i · u_i
wherein
u_i = 1 if F_i > Th_i, and u_i = 0 otherwise.
The thresholds and weights in the above formula are obtained by fitting actual risk data. If the time-feature risk score R_time exceeds the preset comprehensive time threshold Th_sum, the risk level is directly judged as high and the subsequent steps are not performed; otherwise, step S5 is executed. The threshold Th_sum with the greatest discrimination on the data set is obtained by analyzing actual risk data.
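Read as an indicator-weighted sum, the time risk score can be computed as below; the numeric values in the usage comment are placeholders for illustration, not fitted parameters:

```python
import numpy as np

def time_risk(F, Th, W):
    """R_time = sum of the weights W_i of the time features F_i that
    exceed their thresholds Th_i (all length-5 sequences)."""
    F, Th, W = (np.asarray(a, dtype=float) for a in (F, Th, W))
    return float(np.sum(W * (F > Th)))

# Illustrative call (feature values in seconds, not fitted):
# r = time_risk(F=[14.2, 2.1, 7.9, 2.4, 1.8],
#               Th=[13.5, 2.0, 7.0, 2.5, 2.0],
#               W=[0.4, 0.15, 0.15, 0.15, 0.15])
# if r > Th_sum: the risk level is directly "high"
```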
S5, extract the posture features of each action stage, frame by frame, from the stage videos obtained by the segmentation of step S3. The specific implementation is as follows:
S51, standing-up fore-aft offset feature F6 of the central axis: the maximum angle by which the central axis deviates from the static sitting posture during standing up; the central axis of the trunk is obtained from the coordinates of the left shoulder, right shoulder, left hip and right hip. The specific calculation is as follows:
First compute the slope of the trunk's central axis:
K = ((Y(J1) + Y(J2))/2 - (Y(J7) + Y(J8))/2) / ((X(J1) + X(J2))/2 - (X(J7) + X(J8))/2)
The slope is then converted into an angle in radians, denoted θ.
For each time t_i, the slope K(t_i) is computed by the above formula and converted into the radian angle θ(t_i).
F6 is then obtained as:
F6 = max_{T1 < t_i < T2} |θ(t_i) - θ_sit|
where T1 < t_i < T2 corresponds to all moments during standing up (the moments t_i are in units of frames: the slope is computed for each frame and converted into a radian angle, and since the video is fixed at 30 frames per second, frame-sequence and time-sequence descriptions are equivalent); θ_sit denotes the mean offset angle during the sitting stage;
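A sketch of the per-frame axis angle and of F6 follows, assuming frames is a sequence of 13x2 keypoint arrays in the relative coordinate system; using arctan2 of the horizontal over the vertical offset makes the angle zero for an upright axis, which matches the slope-to-radians conversion up to the choice of reference:

```python
import numpy as np

def axis_angle(joints: np.ndarray) -> float:
    """Central-axis angle (radians) of the trunk for one frame: tilt of
    the line from the hip midpoint (J7, J8) to the shoulder midpoint
    (J1, J2); zero when the axis is vertical."""
    shoulder_mid = (joints[1] + joints[2]) / 2.0
    hip_mid = (joints[7] + joints[8]) / 2.0
    dx, dy = shoulder_mid - hip_mid
    return float(np.arctan2(dx, dy))

def f6(frames, t1, t2, theta_sit):
    """F6: maximum deviation of the axis angle from the sitting-stage
    mean theta_sit over the standing-up frames t1 < t_i < t2."""
    thetas = np.array([axis_angle(j) for j in frames[t1 + 1:t2]])
    return float(np.max(np.abs(thetas - theta_sit)))
```

The features F7 and F8 of steps S52 and S53 follow the same pattern, with the mean or maximum taken over the turning and sitting-down frame ranges instead.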
S52, compute the mean offset angle F7 of the central axis during turning:
F7 = mean_{T3 < t_i < T4} |θ(t_i) - θ_stand|
S53, compute the maximum angle F8 by which the central axis deviates from the upright standing state during sitting down:
F8 = max_{T5 < t_i < T6} |θ(t_i) - θ_stand|
where T5 < t_i < T6, and θ_stand denotes the mean offset angle during the standing stage;
S54, compute the walking step-length-to-height ratio feature F9: the horizontal distance between two successive landing points of one ankle is called a single step; the single steps of both feet during walking are averaged, and the ratio of the average step length to body height is computed. In the specific calculation, the two smallest step values are removed by magnitude, i.e., the two incomplete steps taken when starting and when stopping at the end point. The absolute coordinate system obtained by OpenPose is used here, with the absolute coordinates denoted X' and Y'.
The sets of landing time points are obtained from the extrema of the ankle-height sequences:
Left-foot landing time points:
A_floor_left = { t_m | T2 < t_m < T3 or T4 < t_m < T5, and the left-ankle height Y'(t_m) is a local minimum }
t_m is the time point of the m-th local minimum of the left-ankle height in the absolute coordinate system obtained by OpenPose, and X'(t_m) denotes the left-ankle abscissa in the absolute coordinate system at t_m; the landing time points of incomplete steps are removed (at the start and end of walking the feet are together, which does not satisfy the definition of a single step, so only the two time points at the very start and end are removed), and the average single step is computed;
Right-foot landing time points:
A_floor_right = { t_n | T2 < t_n < T3 or T4 < t_n < T5, and the right-ankle height Y'(t_n) is a local minimum }
t_n is the time point of the n-th local minimum of the right-ankle height, and X'(t_n) is the right-ankle abscissa in the absolute coordinate system at t_n; the landing time points of the incomplete steps at the start and end are removed, and the average single step is computed;
The left-foot single step is:
Step_left(m) = |X'(t_{m+1}) - X'(t_m)|
where X'(t_{m+1}), X'(t_m) are the (m+1)-th and m-th left-ankle abscissas in the absolute coordinate system;
The right-foot single step is:
Step_right(n) = |X'(t_{n+1}) - X'(t_n)|
where X'(t_{n+1}), X'(t_n) are the (n+1)-th and n-th right-ankle abscissas in the absolute coordinate system;
The step-length-to-height ratio feature F9 is computed as:
F9 = ((Σ_m Step_left(m) + Σ_n Step_right(n)) / (M + N)) / BH
where M and N are the numbers of left- and right-foot single steps, and BH denotes body height;
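A sketch of the landing detection and of F9 follows, assuming ankle_x and ankle_y are the per-frame absolute ankle coordinates of one foot within a single walking segment (T2..T3 or T4..T5), so that no spurious step is measured across the turn; the steps of both feet and both segments are pooled before the two smallest, incomplete steps are dropped:

```python
import numpy as np
from scipy.signal import find_peaks

def single_steps(ankle_x, ankle_y):
    """Single-step lengths of one foot in one walking segment: the
    horizontal distance between consecutive landings, where a landing
    is a local minimum of the ankle-height sequence."""
    landings, _ = find_peaks(-np.asarray(ankle_y, dtype=float))
    return np.abs(np.diff(np.asarray(ankle_x, dtype=float)[landings]))

def f9(all_steps, body_height):
    """F9: mean single-step length over body height; all_steps pools
    the steps of both feet and both walking segments."""
    steps = np.sort(np.asarray(all_steps, dtype=float))
    if len(steps) > 2:
        steps = steps[2:]  # drop the two smallest (incomplete) steps
    return float(np.mean(steps)) / body_height
```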
S55, compute the left-right offset feature F10 of the central axis during walking, i.e., the average horizontal angular deviation of the central axis from upright while walking:
F10 = mean_{T2 < t_i < T3 or T4 < t_i < T5} |θ(t_i) - θ_std|
where θ_std is the central-axis angle when standing upright, taken as the mean axis angle over the walking frames:
θ_std = mean_{T2 < t_i < T3 or T4 < t_i < T5} θ(t_i)
S56, walking arm-swing amplitude feature F11: obtained from the angles of the vectors formed by the left shoulder and left elbow and by the right shoulder and right elbow, i.e., the angular difference between the elbow-shoulder vector at the farthest backward swing and the elbow-shoulder vector at the farthest forward swing.
First compute the arm-swing slopes of the left and right arms:
K_left_arm = (Y'(J3) - Y'(J1)) / (X'(J3) - X'(J1)), K_right_arm = (Y'(J4) - Y'(J2)) / (X'(J4) - X'(J2))
X'(Jc), Y'(Jc) are the coordinates of Jc (c = 1, ..., 4) in the absolute coordinate system obtained by OpenPose;
Convert the slopes into the radian angles α_left_arm and α_right_arm;
Compute the arm-swing angles of the left and right arms in each frame to form left- and right-arm swing-angle sequences, and find the extrema of the sequences to obtain the maximum forward-swing angles A_af_left (left arm) and A_af_right (right arm) and the maximum backward-swing angles A_ab_left (left arm) and A_ab_right (right arm);
Align adjacent A_af_left and A_ab_left values and take their differences to obtain the left-arm single-swing angle set A_as_left;
Align adjacent A_af_right and A_ab_right values and take their differences to obtain the right-arm single-swing angle set A_as_right;
Remove the first and last data of A_as_left and A_as_right (these two are not pure swings and are therefore removed), then compute F11 from the remaining sets A'_as_left and A'_as_right:
F11 = (Σ_{α_m ∈ A'_as_left} α_m + Σ_{α_n ∈ A'_as_right} α_n) / (|A'_as_left| + |A'_as_right|)
where |A'_as_left| and |A'_as_right| denote the numbers of elements in A'_as_left and A'_as_right;
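Given the per-frame elbow-shoulder angle sequence of one arm (from the slope formulas above), the single-swing amplitudes and F11 can be sketched as follows; pairing forward and backward extrema by index is a simplification of the "align adjacent extrema" step:

```python
import numpy as np
from scipy.signal import argrelmax, argrelmin

def swing_amplitudes(angles: np.ndarray) -> np.ndarray:
    """Single-swing amplitudes of one limb: difference each forward-swing
    extreme with the adjacent backward-swing extreme, then drop the
    first and last (non-pure) swings."""
    fwd = argrelmax(angles)[0]  # frames of farthest forward swing
    bwd = argrelmin(angles)[0]  # frames of farthest backward swing
    n = min(len(fwd), len(bwd))
    amps = np.abs(angles[fwd[:n]] - angles[bwd[:n]])
    return amps[1:-1] if len(amps) > 2 else amps

def f11(angles_left: np.ndarray, angles_right: np.ndarray) -> float:
    """F11: mean single-swing amplitude across both arms."""
    a_l = swing_amplitudes(angles_left)
    a_r = swing_amplitudes(angles_right)
    return float((a_l.sum() + a_r.sum()) / (len(a_l) + len(a_r)))
```

The leg-swing feature F12 of step S57 can reuse the same routine on the hip-knee angle sequences.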
S57, compute the walking leg-swing amplitude feature F12: obtained from the angles of the vectors formed by the left hip and left knee and by the right hip and right knee.
Compute the leg-swing slopes of the left and right legs:
K_left_leg = (Y'(J9) - Y'(J7)) / (X'(J9) - X'(J7)), K_right_leg = (Y'(J10) - Y'(J8)) / (X'(J10) - X'(J8))
X'(Jc), Y'(Jc) are the coordinates of Jc (c = 7, ..., 10) in the absolute coordinate system obtained by OpenPose;
Convert K_left_leg and K_right_leg from slopes into the radian angles β_left_leg and β_right_leg;
Compute the leg-swing angles of the left and right legs in each frame to form left- and right-leg swing-angle sequences, and find the extrema to obtain the maximum forward-swing angles A_lf_left and A_lf_right and the maximum backward-swing angles A_lb_left and A_lb_right;
Matching in time order, align adjacent A_lf_left and A_lb_left values and take their differences to obtain the left-leg swing-angle sequence A_ls_left;
Align adjacent A_lf_right and A_lb_right values and take their differences to obtain the right-leg swing-angle sequence A_ls_right;
Remove the first and last data of A_ls_left and A_ls_right, then compute F12 from the remaining sequences A'_ls_left and A'_ls_right:
F12 = (Σ_{β_m ∈ A'_ls_left} β_m + Σ_{β_n ∈ A'_ls_right} β_n) / (|A'_ls_left| + |A'_ls_right|)
where |A'_ls_left| and |A'_ls_right| denote the numbers of elements in A'_ls_left and A'_ls_right.
S6, judge the posture risk according to the posture features obtained in step S5. The specific implementation is: larger values of the step-length-to-height ratio F9 and of the arm-swing F11 and leg-swing F12 amplitudes indicate better movement and balance ability, while smaller values of the central-axis-offset-related features F6, F7, F8 and F10 indicate better balance ability. The posture features F6~F12 are input into a multilayer perceptron for regression prediction to obtain the posture risk value R_posture.
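The patent specifies only "a multilayer perceptron"; a minimal regression sketch with scikit-learn's MLPRegressor is shown below, where the hidden-layer sizes and iteration budget are illustrative choices rather than values from the source:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def fit_posture_model(X: np.ndarray, y: np.ndarray) -> MLPRegressor:
    """Fit the posture-risk regressor. X holds one row of posture
    features [F6, ..., F12] per training subject; y holds the risk
    labels derived from the fall records of steps S13/S14."""
    model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000,
                         random_state=0)
    return model.fit(X, y)

def posture_risk(model: MLPRegressor, features: np.ndarray) -> float:
    """R_posture for one subject's 7 posture features."""
    return float(model.predict(features.reshape(1, -1))[0])
```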
S7, make the final comprehensive risk judgment according to the time risk and posture risk obtained in steps S4 and S6. The specific implementation is: if the time risk R_time obtained in step S4 is greater than the threshold Th_sum, the comprehensive risk judgment directly yields a result and the fall risk is high; otherwise, the following judgment is made:
The weights W_time and W_posture are set by linear fitting, and the final risk is R_final = W_time·R_time + W_posture·R_posture.
The final risk R_final is compared with the high-risk reference threshold Th_danger and the medium-risk reference threshold Th_alarm: if R_final ≥ Th_danger, the subject is considered at high fall risk; if Th_alarm ≤ R_final < Th_danger, the subject is considered at medium fall risk; and if R_final < Th_alarm, the subject is considered at low fall risk.
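The complete decision rule of steps S4 to S7 then reduces to the following sketch, where all weights and thresholds stand for the data-fitted parameters described above:

```python
def final_risk_level(r_time, r_posture, w_time, w_posture,
                     th_sum, th_danger, th_alarm) -> str:
    """Three-level fall-risk rating from the time and posture scores."""
    if r_time > th_sum:          # early exit taken in step S4
        return "high"
    r_final = w_time * r_time + w_posture * r_posture
    if r_final >= th_danger:
        return "high"
    if r_final >= th_alarm:
        return "medium"
    return "low"
```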
It will be appreciated by those of ordinary skill in the art that the embodiments described here are intended to help the reader understand the principles of the invention, and should not be construed as limiting the invention to the specifically described embodiments and examples. Those skilled in the art can make various other specific changes and combinations based on the teachings of the present invention without departing from its spirit, and these changes and combinations remain within the scope of the invention.

Claims (5)

1. A method for predicting fall risk of an elderly person, comprising the steps of:
S1, data acquisition and preprocessing, comprising the following substeps:
S11, arrange the data acquisition environment according to the TUG standard: a chair 60 cm high facing a flat walkway 3 m long; the camera must have a resolution of at least 720p and is fixed at a height of 1.5 m;
S12, recording starts with the subject seated; the subject then stands up, walks forward, turns around after reaching the end of the walkway, walks back, returns to the starting point and sits down again, and recording stops; the resulting video is preprocessed to 1280x720 pixels at 30 frames per second;
S13, the subject's actual number of falls in the last two years is used as the label of the sample data to form a training data set;
S14, fall risk is classified into 3 levels: low risk corresponds to no fall record within 2 years, medium risk to 1 or 2 fall records within 2 years, and high risk to 3 or more fall records;
S2, skeleton modeling: the human body in the video is modeled with OpenPose as a skeleton of 13 nodes, denoted J0~J12 and corresponding to the joints: J0 head, J1 left shoulder, J2 right shoulder, J3 left elbow, J4 right elbow, J5 left wrist, J6 right wrist, J7 left hip, J8 right hip, J9 left knee, J10 right knee, J11 left ankle, J12 right ankle;
The horizontal-axis and vertical-axis coordinates on each 2D image frame are obtained through the OpenPose framework;
The absolute coordinates obtained by OpenPose are then reconstructed into a relative coordinate system: in each frame, the center of the person, i.e., J1, J2, J7 and J8, is located from the absolute node coordinates, the human body is enclosed by a rectangular outer frame of fixed aspect ratio, the lower-left corner of the frame is taken as the coordinate origin with the axis directions unchanged, and a new coordinate system is established;
S3, video time segmentation based on motion features: the video is divided into 7 segments, denoted S0~S6; the 6 time division points between the 7 segments are denoted T1~T6 in order, and the start and end time points of the whole video are denoted T0 and T7; S0 corresponds to sitting still, S1 to standing up, S2 to walking forward, S3 to turning in place, S4 to walking back, S5 to turning and sitting down, and S6 to sitting at the end;
S4, time risk judgment according to the time features obtained from the segmentation of step S3, as follows: the 5 time features are defined as:
total time: f1=|S1|+|S2|+|S3|+|S4|+|S5|=T6-T1
Rising stage time: f2=|S1|=T2-T1
Duration of walking phase: f3=|S2|+|S4|=T5+T3-T4-T2
Turning stage time: f4=|S3|=T4-T3
Turning body and sitting down stage time: f5=|S5|=T6-T5
Thresholds Th1~Th5 are set for the five time features, and a risk score weight W1~W5 is set for each time feature; the time-feature risk score R_time is
R_time = Σ_{i=1}^{5} W_i · u_i
wherein
u_i = 1 if F_i > Th_i, and u_i = 0 otherwise;
If the time-feature risk score R_time exceeds the preset comprehensive time threshold Th_sum, the risk level is directly judged as high and the subsequent steps are not performed; otherwise, step S5 is executed;
S5, extract the posture features of each action stage, frame by frame, from the stage videos obtained by the segmentation of step S3;
S6, judge the posture risk according to the posture features obtained in step S5;
and S7, perform the final comprehensive risk judgment according to the time risk and posture risk obtained in steps S4 and S6.
2. The method for predicting fall risk of the elderly as claimed in claim 1, wherein in step S3 the specific method of video time segmentation based on motion features is as follows:
S31, construct the following motion features:
The height difference between nodes J7, J8 (hips) and J9, J10 (knees) is defined as:
F_clip1 = |(Y(J7) + Y(J8))/2 - (Y(J9) + Y(J10))/2|
Y(J7), Y(J8), Y(J9), Y(J10) denote the ordinates of J7, J8, J9 and J10, respectively;
The horizontal distance between nodes J1 and J2 is defined as:
F_clip2 = |X(J1) - X(J2)|
X(J1), X(J2) denote the abscissas of J1 and J2, respectively;
The horizontal distance between nodes J7 and J8 is defined as:
F_clip3 = |X(J7) - X(J8)|
X(J7), X(J8) denote the abscissas of J7 and J8, respectively;
S32, compute the three motion features of S31 for each frame to obtain three discrete sequences F_clip1, F_clip2, F_clip3; apply the binary change-point search algorithm BinSeg with an L2 cost model, using empirical parameters, to obtain the inflection points of the sequences, and derive the stage division of the actions from these inflection points; specifically:
Using the F_clip1 sequence and an inflection-point search with prior knowledge, the division point T1 between S0 and S1 is determined; specifically, T1 appears at the 1st inflection point of F_clip1;
Using F_clip1, the division point T2 between S1 and S2 is determined; T2 appears at the 2nd inflection point of F_clip1;
Using F_clip2, the division point T3 between S2 and S3 is determined; T3 appears at the 1st inflection point of F_clip2;
Using F_clip3, the division point T4 between S3 and S4 is determined; T4 appears at the 3rd inflection point of F_clip3;
Using F_clip2, the division point T5 between S4 and S5 is determined; T5 appears at the 4th inflection point of F_clip2;
Using F_clip1, the division point T6 between S5 and S6 is determined; T6 appears at the 4th inflection point of F_clip1.
3. The method for predicting fall risk of the elderly as claimed in claim 1, wherein step S5 is implemented as follows:
S51, standing-up fore-aft offset feature F6 of the central axis: the maximum angle by which the central axis deviates from the static sitting posture during standing up; the central axis of the trunk is obtained from the coordinates of the left shoulder, right shoulder, left hip and right hip. The specific calculation is as follows:
First compute the slope of the trunk's central axis:
K = ((Y(J1) + Y(J2))/2 - (Y(J7) + Y(J8))/2) / ((X(J1) + X(J2))/2 - (X(J7) + X(J8))/2)
The slope is then converted into an angle in radians, denoted θ.
For each time t_i, the slope K(t_i) is computed by the above formula and converted into the radian angle θ(t_i).
F6 is then obtained as:
F6 = max_{T1 < t_i < T2} |θ(t_i) - θ_sit|
where T1 < t_i < T2 corresponds to all moments during standing up, and θ_sit denotes the mean offset angle during the sitting stage;
S52, compute the mean offset angle F7 of the central axis during turning:
F7 = mean_{T3 < t_i < T4} |θ(t_i) - θ_stand|
S53, compute the maximum angle F8 by which the central axis deviates from the upright standing state during sitting down:
F8 = max_{T5 < t_i < T6} |θ(t_i) - θ_stand|
where T5 < t_i < T6, and θ_stand denotes the mean offset angle during the standing stage;
S54, compute the walking step-length-to-height ratio feature F9: the horizontal distance between two successive landing points of one ankle is called a single step; the single steps of both feet during walking are averaged, and the ratio of the average step length to body height is computed;
The sets of landing time points are obtained from the extrema of the ankle-height sequences:
Left-foot landing time points:
A_floor_left = { t_m | T2 < t_m < T3 or T4 < t_m < T5, and the left-ankle height Y'(t_m) is a local minimum }
t_m is the time point of the m-th local minimum of the left-ankle height in the absolute coordinate system obtained by OpenPose, and X'(t_m) denotes the left-ankle abscissa in the absolute coordinate system at t_m; the landing time points of the incomplete steps at the start and end are removed, and the average single step is computed;
Right-foot landing time points:
A_floor_right = { t_n | T2 < t_n < T3 or T4 < t_n < T5, and the right-ankle height Y'(t_n) is a local minimum }
t_n is the time point of the n-th local minimum of the right-ankle height, and X'(t_n) is the right-ankle abscissa in the absolute coordinate system at t_n; the landing time points of the incomplete steps at the start and end are removed, and the average single step is computed;
The left-foot single step is:
Step_left(m) = |X'(t_{m+1}) - X'(t_m)|
where X'(t_{m+1}), X'(t_m) are the (m+1)-th and m-th left-ankle abscissas in the absolute coordinate system;
The right-foot single step is:
Step_right(n) = |X'(t_{n+1}) - X'(t_n)|
where X'(t_{n+1}), X'(t_n) are the (n+1)-th and n-th right-ankle abscissas in the absolute coordinate system;
The walking step-length-to-height ratio feature F9 is then computed as:
F9 = ((Σ_m Step_left(m) + Σ_n Step_right(n)) / (M + N)) / BH
where M and N are the numbers of left- and right-foot single steps, and BH denotes body height;
S55, compute the left-right offset feature F10 of the central axis during walking, i.e., the average horizontal angular deviation of the central axis from upright while walking:
F10 = mean_{T2 < t_i < T3 or T4 < t_i < T5} |θ(t_i) - θ_std|
where θ_std is the central-axis angle when standing upright, taken as the mean axis angle over the walking frames:
θ_std = mean_{T2 < t_i < T3 or T4 < t_i < T5} θ(t_i)
S56, walking arm-swing amplitude feature F11: obtained from the angles of the vectors formed by the left shoulder and left elbow and by the right shoulder and right elbow, i.e., the angular difference between the elbow-shoulder vector at the farthest backward swing and the elbow-shoulder vector at the farthest forward swing.
First compute the arm-swing slopes of the left and right arms:
K_left_arm = (Y'(J3) - Y'(J1)) / (X'(J3) - X'(J1)), K_right_arm = (Y'(J4) - Y'(J2)) / (X'(J4) - X'(J2))
X'(Jc), Y'(Jc) are the coordinates of Jc (c = 1, ..., 4) in the absolute coordinate system obtained by OpenPose;
Convert the slopes into the radian angles α_left_arm and α_right_arm;
Compute the arm-swing angles of the left and right arms in each frame to form left- and right-arm swing-angle sequences, and find the extrema of the sequences to obtain the maximum forward-swing angles A_af_left (left arm) and A_af_right (right arm) and the maximum backward-swing angles A_ab_left (left arm) and A_ab_right (right arm);
Align adjacent A_af_left and A_ab_left values and take their differences to obtain the left-arm single-swing angle set A_as_left;
Align adjacent A_af_right and A_ab_right values and take their differences to obtain the right-arm single-swing angle set A_as_right;
Remove the first and last data of A_as_left and A_as_right, then compute F11 from the remaining sets A'_as_left and A'_as_right:
F11 = (Σ_{α_m ∈ A'_as_left} α_m + Σ_{α_n ∈ A'_as_right} α_n) / (|A'_as_left| + |A'_as_right|)
where |A'_as_left| and |A'_as_right| denote the numbers of elements in A'_as_left and A'_as_right;
s57, calculating walking leg swing amplitude characteristic F12: obtaining the vector angle difference formed by the left hip, the left knee, the right hip and the right knee through the nodes of the left hip, the left knee, the right hip and the right knee;
calculating the leg swinging slopes of the left leg and the right leg respectively:
Figure FDA0003572039450000052
X'(Jc)、Y'(Jc) J in absolute coordinate systems respectively obtained for OpenPosecC-7, …, 10;
will Kleft_leg and Kright_legConversion from slope to radian angle betaleft_legAnd betaright_leg
calculate the leg-swing angles of the left and right legs in each frame to form left- and right-leg swing-angle sequences, and find the extrema of each sequence to obtain the maximum front-swing angles $A_{lf\_left}$ of the left leg, the maximum back-swing angles $A_{lb\_left}$ of the left leg, the maximum front-swing angles $A_{lf\_right}$ of the right leg and the maximum back-swing angles $A_{lb\_right}$ of the right leg;

align and difference adjacent $A_{lf\_left}$ and $A_{lb\_left}$ values to obtain the left-leg single-swing angle sequence $A_{ls\_left}$;

align and difference adjacent $A_{lf\_right}$ and $A_{lb\_right}$ values to obtain the right-leg single-swing angle sequence $A_{ls\_right}$;

remove the first and last data in $A_{ls\_left}$ and $A_{ls\_right}$, then compute $F_{12}$ from the remaining sequences $A'_{ls\_left}$ and $A'_{ls\_right}$:

$$F_{12} = \frac{\sum_{\beta_m \in A'_{ls\_left}} \beta_m + \sum_{\beta_n \in A'_{ls\_right}} \beta_n}{|A'_{ls\_left}| + |A'_{ls\_right}|}$$

where $|A'_{ls\_left}|$ and $|A'_{ls\_right}|$ denote the numbers of elements in $A'_{ls\_left}$ and $A'_{ls\_right}$.
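$F_{12}$ follows the same pipeline on the leg angles; a brief usage of the `swing_amplitude` helper sketched above, with `k_left_leg` and `k_right_leg` assumed to be the per-frame leg-swing slope arrays:

```python
# beta angles come from the hip-knee slopes of each frame (step S57)
beta_left = np.arctan(k_left_leg)     # slope -> radian angle, per frame
beta_right = np.arctan(k_right_leg)
f12 = np.concatenate([swing_amplitude(beta_left),
                      swing_amplitude(beta_right)]).mean()
```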
4. The method for predicting fall risk of the elderly as claimed in claim 3, wherein the step S6 is implemented by: inputting the posture features $F_6$–$F_{12}$ into a multilayer perceptron for regression prediction to obtain the posture risk value $R_{posture}$.
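A minimal sketch of this regression step using scikit-learn; the network size, iteration count and the random placeholder data are illustrative, not the patent's parameters:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# placeholder training data: one row of features F6..F12 per gait sample,
# with an annotated posture-risk score as the regression target
X_train = np.random.rand(200, 7)
y_train = np.random.rand(200)

mlp = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000)
mlp.fit(X_train, y_train)

features = np.random.rand(1, 7)       # F6..F12 of a new subject
r_posture = mlp.predict(features)[0]  # posture risk value R_posture
```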
5. The method for predicting fall risk of the elderly as claimed in claim 1, wherein the step S7 is implemented by: if the time risk $R_{time}$ obtained in step S4 is greater than the threshold $Th_{sum}$, the comprehensive risk judgment directly yields the result that the fall risk is high; otherwise, the following judgment is performed:

setting the weights $W_{time}$ and $W_{posture}$ by linear fitting, the final risk being $R_{final} = W_{time} \cdot R_{time} + W_{posture} \cdot R_{posture}$;

comparing the final risk $R_{final}$ with the high-risk reference threshold $Th_{danger}$ and the medium-risk reference threshold $Th_{alarm}$: if $R_{final}$ is greater than or equal to $Th_{danger}$, the subject is considered to have a high fall risk; if $R_{final}$ is greater than or equal to $Th_{alarm}$ but less than $Th_{danger}$, the subject is considered to have a medium fall risk; and if $R_{final}$ is less than $Th_{alarm}$, the subject is considered to have a low fall risk.
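A minimal sketch of the judgment logic of step S7; all weights and thresholds are assumed to be supplied from the linear fitting and validation described in the claims:

```python
def fall_risk(r_time, r_posture, w_time, w_posture,
              th_sum, th_danger, th_alarm):
    """Step S7 sketch: fuse time risk and posture risk into a risk grade."""
    if r_time > th_sum:          # time risk alone is already decisive
        return "high"
    r_final = w_time * r_time + w_posture * r_posture
    if r_final >= th_danger:
        return "high"
    if r_final >= th_alarm:
        return "medium"
    return "low"
```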
CN202210321855.9A 2022-03-30 2022-03-30 Old people falling risk prediction method Active CN114694252B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210321855.9A CN114694252B (en) 2022-03-30 2022-03-30 Old people falling risk prediction method

Publications (2)

Publication Number Publication Date
CN114694252A true CN114694252A (en) 2022-07-01
CN114694252B CN114694252B (en) 2023-04-28

Family

ID=82140920

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210321855.9A Active CN114694252B (en) 2022-03-30 2022-03-30 Old people falling risk prediction method

Country Status (1)

Country Link
CN (1) CN114694252B (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105279483A (en) * 2015-09-28 2016-01-27 华中科技大学 Fall-down behavior real-time detection method based on depth image
CN108629300A (en) * 2018-04-24 2018-10-09 北京科技大学 A kind of fall detection method
KR102035586B1 (en) * 2018-05-17 2019-10-23 화남전자 주식회사 Method for Automatic Finding a Triangle from Camera Images and System Therefor
CN109670396A (en) * 2018-11-06 2019-04-23 华南理工大学 A kind of interior Falls Among Old People detection method
CN109919132A (en) * 2019-03-22 2019-06-21 广东省智能制造研究所 A kind of pedestrian's tumble recognition methods based on skeleton detection
CN111582158A (en) * 2020-05-07 2020-08-25 济南浪潮高新科技投资发展有限公司 Tumbling detection method based on human body posture estimation
CN112287759A (en) * 2020-09-26 2021-01-29 浙江汉德瑞智能科技有限公司 Tumble detection method based on key points
CN113569793A (en) * 2021-08-06 2021-10-29 上海亲孝行健康科技有限公司 Fall identification method and device
CN113887335A (en) * 2021-09-13 2022-01-04 华南理工大学 Fall risk real-time evaluation system and method based on multi-scale space-time hierarchical network

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
ABDERRAZAK IAZZI et al.: "Fall Detection System-Based Posture-Recognition for Indoor Environments" *
SAHAR ABDELHEDI et al.: "Development of a two-threshold-based fall detection algorithm for elderly health monitoring" *
刘朗 et al.: "Research progress in automated assessment of motor function in stroke patients" *
姜珊: "Research on fall behavior detection of hospitalized elderly people based on classification learning" *
王平 et al.: "A fall detection method based on human posture in video" *

Also Published As

Publication number Publication date
CN114694252B (en) 2023-04-28

Similar Documents

Publication Publication Date Title
US11328534B2 (en) Monitoring the performance of physical exercises
CN111368810B (en) Sit-up detection system and method based on human body and skeleton key point identification
CN110222665B (en) Human body action recognition method in monitoring based on deep learning and attitude estimation
CN108597578B (en) Human motion assessment method based on two-dimensional skeleton sequence
CN114724241A (en) Motion recognition method, device, equipment and storage medium based on skeleton point distance
US10186041B2 (en) Apparatus and method for analyzing golf motion
CN110490109B (en) Monocular vision-based online human body rehabilitation action recognition method
CN113255522B (en) Personalized motion attitude estimation and analysis method and system based on time consistency
CN110298279A (en) A kind of limb rehabilitation training householder method and system, medium, equipment
CN112800905A (en) Pull-up counting method based on RGBD camera attitude estimation
CN110032940A (en) A kind of method and system that video pedestrian identifies again
CN114219984A (en) Improved YOLOv 3-based micro pest detection system and method
CN108573197A (en) Video actions detection method and device
Yadav et al. YogaTube: a video benchmark for Yoga action recognition
CN116805433B (en) Human motion trail data analysis system
CN113283373A (en) Method for enhancing detection of limb motion parameters by depth camera
CN116721471A (en) Multi-person three-dimensional attitude estimation method based on multi-view angles
CN114694252A (en) Old people falling risk prediction method
KR102181828B1 (en) 4d rig reconstructing device and a method thereof
Washino et al. Projected frontal area and its components during front crawl depend on lung volume
CN115631155A (en) Bone disease screening method based on space-time self-attention
Zhao et al. Human motion reconstruction from monocular images using genetic algorithms
CN112017211A (en) Temporomandibular joint movement tracking method and system
Liu et al. Analysis of human walking posture using a wearable camera
Robertini et al. Capture of arm-muscle deformations using a depth-camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant