CN114795189A - Fall monitoring method and device and storage medium - Google Patents

Fall monitoring method and device and storage medium

Info

Publication number
CN114795189A
CN114795189A
Authority
CN
China
Prior art keywords
monitored user
falling
judgment model
model
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210430076.2A
Other languages
Chinese (zh)
Inventor
李勇强
许扬锦
张翔欣
张帅
张军
胡丹娟
励建安
赵薇薇
杨欣
郑蒙蒙
沈滢
肖莎
陆晓
许光旭
陈雅婷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Internet Of Things Co ltd
Original Assignee
Shanghai Internet Of Things Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Internet Of Things Co ltd filed Critical Shanghai Internet Of Things Co ltd
Priority to CN202210430076.2A priority Critical patent/CN114795189A/en
Publication of CN114795189A publication Critical patent/CN114795189A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1116 Determining posture transitions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1116 Determining posture transitions
    • A61B5/1117 Fall detection
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1121 Determining geometric values, e.g. centre of rotation or angular range of movement
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 Details of waveform analysis
    • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/746 Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2503/00 Evaluating a particular growth phase or type of persons or animals
    • A61B2503/08 Elderly
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00 Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02 Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0219 Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Physiology (AREA)
  • Artificial Intelligence (AREA)
  • Biophysics (AREA)
  • Theoretical Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Dentistry (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Emergency Alarm Devices (AREA)
  • Geometry (AREA)
  • Computing Systems (AREA)
  • Fuzzy Systems (AREA)
  • Psychiatry (AREA)
  • Signal Processing (AREA)
  • Evolutionary Biology (AREA)

Abstract

The application provides a fall monitoring method, a fall monitoring device and a storage medium, and relates to the field of human body monitoring. The fall monitoring method comprises the following steps: acquiring real-time trunk posture data of a monitored user, wherein the real-time trunk posture data comprise the acceleration and attitude angle of the monitored user's body; judging whether the monitored user has fallen based on the acceleration and a first fall judgment model; and when the first fall judgment model judges that the monitored user has fallen, confirming whether the monitored user has fallen based on the attitude angle and a second fall judgment model, wherein the second fall judgment model is a machine learning model that performs machine learning based on the trunk posture data of the monitored user. Because the judgment result of the first fall judgment model is checked by the second fall judgment model, misjudgments by the first fall judgment model can be effectively reduced, and the accuracy of the fall judgment result is improved.

Description

Fall monitoring method and device and storage medium
Technical Field
The application relates to the field of human body monitoring, in particular to a fall monitoring method, a fall monitoring device and a computer readable storage medium.
Background
A fall easily injures the human body, especially for the elderly, for whom a fall readily leads to fractures, concussion and similar conditions; if medical attention is not sought promptly after a fall, life may even be endangered. It is therefore particularly important to monitor whether the human body has fallen.
In the existing technology for identifying human body fall events, misjudgment easily occurs; for example, jumping or going up and down stairs may be judged as a fall, so the accuracy of fall judgment is not high.
Disclosure of Invention
In view of the above, the present invention is directed to a fall monitoring method, a fall monitoring apparatus, an electronic device and a computer readable storage medium, which are used to improve the accuracy of fall judgment and reduce false judgments of falls.
In a first aspect, an embodiment of the present application provides a fall monitoring method, including: acquiring real-time trunk posture data of a monitored user, wherein the real-time trunk posture data comprise the acceleration and attitude angle of the monitored user's body; judging whether the monitored user has fallen based on the acceleration and a first fall judgment model; and when the first fall judgment model judges that the monitored user has fallen, confirming whether the user has fallen based on the attitude angle and a second fall judgment model, wherein the second fall judgment model is a machine learning model that performs machine learning based on the monitored user's past trunk posture data.
In the embodiment of the application, the real-time trunk posture data of the monitored user are acquired, the acceleration and the attitude angle of the real-time trunk posture data are input into the first fall judgment model and the second fall judgment model respectively, and whether the monitored user has fallen is judged from the two aspects of acceleration and attitude angle. Because the second fall judgment model re-examines the judgment result of the first fall judgment model, false fall results output by the hierarchical threshold judgment model can be effectively reduced, and the accuracy of the fall judgment result is improved. The second fall judgment model is a machine learning model and can perform machine learning on the monitored user's past trunk posture data, so that the fall judgment made by the second fall judgment model after learning better matches the personal behavior habits of the monitored user; this reduces misjudgments caused by individual differences and further improves the accuracy of judging whether the monitored user has fallen.
In one embodiment, the acquiring of real-time trunk posture data of the monitored user includes: acquiring the acceleration of the monitored user collected by an accelerometer in real time; acquiring the rotation speed of the monitored user collected by a gyroscope in real time; and determining the attitude angle of the monitored user based on the acceleration and the rotation speed.
In the embodiment of the application, the acceleration and the rotation speed of the monitored user are acquired by the accelerometer and the gyroscope respectively, from which the attitude angle of the monitored user can be calculated, so that whether the user has fallen can be judged from the attitude angle.
In an embodiment, the judging of whether the monitored user has fallen based on the acceleration and the first fall judgment model includes: when it is determined that the acceleration remains below a first threshold for a first time period, determining that the monitored user is in a weightless state; within a second time period after the monitored user is determined to be in the weightless state, if the instantaneous value of the acceleration is determined to be larger than a second threshold, determining that the user has suffered an impact; and if the acceleration then remains below a third threshold for a fourth time period, determining that the monitored user is in a stationary state, the stationary state indicating that the monitored user has fallen.
In the embodiment of the application, the first fall judgment model is a hierarchical threshold judgment model that identifies the weightlessness, impact and stationary stages of the fall process, so that the body state of the monitored user is checked against the multiple stages of a fall. This effectively avoids misjudging a fall on the basis of a single factor and improves the judgment accuracy.
In an embodiment, when the first fall judgment model judges that the user has fallen, the determining of whether the user has fallen based on the attitude angle and the second fall judgment model includes: acquiring the attitude angle of the monitored user within a fifth time period after the monitored user enters the weightless state; and when the attitude angle at any moment within the fifth time period is determined to be larger than a preset angle, confirming that the monitored user has fallen.
In the embodiment of the application, when the attitude angle at any moment within the fifth time period is determined to be larger than the preset angle, it can be confirmed that the user has fallen; at the same time, the judgment result of the first fall judgment model is further verified, which improves the accuracy of judging whether the monitored user has fallen.
In one embodiment, before the judging of whether the monitored user has fallen based on the acceleration and the first fall judgment model, the method further includes: constructing a first initial fall judgment model; and setting preset judgment conditions of the first initial fall judgment model, wherein the preset judgment conditions comprise the judgment threshold and judgment time of each level in the first initial fall judgment model, and the first fall judgment model is the first initial fall judgment model after the preset judgment conditions have been set. Before the confirming of whether the user has fallen based on the attitude angle and the second fall judgment model, the method further includes: constructing a second initial fall judgment model; and training the second initial fall judgment model with a pre-acquired training set to obtain the second fall judgment model, wherein the pre-acquired training set comprises attitude angles of a normal person's body in various motion states.
In the embodiment of the application, by constructing the first initial fall judgment model and setting its judgment conditions, the resulting first fall judgment model can be used to judge whether a user has fallen. The constructed second initial fall judgment model is trained with the pre-acquired training set so that it can also judge whether the user has fallen. Therefore, the first fall judgment model and the second fall judgment model can be used together to judge whether the monitored user has fallen, which improves the accuracy of the judgment result.
In one embodiment, the method further comprises: storing the trunk posture data into a training set; training the second fall judgment model based on the training set.
In the embodiment of the application, the real-time trunk posture data of the monitored user's daily behavior are collected and used as a training set to train the second fall judgment model, so that the judgment conditions of the second fall judgment model better match the behavior habits of the monitored user and the judgment accuracy for the monitored user gradually improves.
In one embodiment, the method further comprises: determining that the monitored user is at risk of falling when the real-time trunk posture data deviate significantly from the standard trunk posture data in the training set over several consecutive frames.
In the embodiment of the application, because the training set contains the trunk posture data of the monitored user's daily behavior, the standard trunk posture data of the monitored user in a normal state can be determined from the training set. By comparing the real-time trunk posture data with this standard trunk posture data, a large deviation between the two indicates that the monitored user is currently in an abnormal state and may be at risk of falling; the monitored user can therefore be reminded to adjust in time and avoid a fall.
In a second aspect, embodiments of the present application provide a fall monitoring device, comprising: a data acquisition module, comprising an accelerometer and a gyroscope, for collecting real-time trunk posture data of the monitored user's body; and a processing module for performing the fall monitoring method of any of the first aspects.
In one embodiment, the fall monitoring device further comprises a housing, and the data acquisition module and the processing module are both arranged in the housing; a spring clip is arranged on one outer surface of the housing and is used to fix the fall monitoring device on the monitored user's body so as to monitor whether the monitored user falls.
In a third aspect, embodiments of the present application provide a computer-readable storage medium having a computer program stored thereon, which, when run on a computer, causes the computer to perform a fall monitoring method as described in the first aspect.
Additional features and advantages of the disclosure will be set forth in the description which follows, or in part may be learned by the practice of the above-described techniques of the disclosure, or may be learned by practice of the disclosure.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered as limiting the scope; those skilled in the art can obtain other related drawings from these drawings without inventive effort.
Fig. 1 is a block diagram of a fall monitoring device according to an embodiment of the present application;
fig. 2 is a schematic diagram of a fall monitoring apparatus according to an embodiment of the present application;
fig. 3 is a flowchart of a fall monitoring method according to an embodiment of the present application.
Reference numerals: fall monitoring device 200; data acquisition module 210; accelerometer 211; gyroscope 212; processing module 220; switch 230; operating status breathing light 240; power interface 250; spring clip 260.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
Referring to fig. 1, fig. 1 is a block diagram of a fall monitoring device according to an embodiment of the present disclosure. The fall monitoring device 200 comprises: a data acquisition module 210 and a processing module 220.
The data acquisition module 210, which includes the accelerometer 211 and the gyroscope 212, is used to collect real-time trunk posture data of the monitored user. The accelerometer 211 may be a three-axis MEMS accelerometer for acquiring the acceleration of the monitored user's body, and the gyroscope 212 may be a three-axis MEMS gyroscope for acquiring the rotation speed of the monitored user's body.
The processing module 220 is configured to obtain the real-time trunk posture data of the monitored user collected by the data acquisition module 210, judge whether the monitored user has fallen based on the acceleration and the first fall judgment model, and, when the first fall judgment model judges that the monitored user has fallen, confirm whether the monitored user has fallen based on the attitude angle and the second fall judgment model.
The fall monitoring device 200 further comprises a positioning module, a communication module, a power module, a storage module and a housing.
The positioning module is used for acquiring real-time position information of a monitored user.
The communication module is used to send fall information to a contact preset by the user after it is confirmed that the monitored user has fallen. In particular, the fall information may include the time and location of the fall.
The power module is used to supply power to the fall monitoring device 200.
The storage module is configured to store the data collected by the data acquisition module 210.
Referring to fig. 2, fig. 2 is a schematic diagram of the use of a fall monitoring device 200 according to an embodiment of the present application, where (a) in fig. 2 is a top view of the fall monitoring device 200 and (b) in fig. 2 is a side view of the fall monitoring device 200. The fall monitoring device 200 further includes a housing, and the data acquisition module 210, the processing module 220, the positioning module, the communication module, the power module and the storage module are all disposed in the housing.
A switch 230, an operating status breathing light 240 and a power interface 250 are arranged on one surface of the housing. The switch 230 is used to turn the fall monitoring device 200 on and off; the operating status breathing light 240 indicates whether the fall monitoring device 200 is in operation; and the power interface 250 is used to connect a power source to charge the fall monitoring device 200.
A surface of the housing is provided with a spring clip 260 that enables the fall monitoring device 200 to be clipped onto the monitored user, in particular onto the clothing worn by the person. As shown in fig. 2 (c), in some embodiments the fall monitoring device 200 is clipped on the front of the chest at the level of the second thoracic vertebra. Because this position lies on the midline of the body and is flat and easy to place stably, the fall monitoring device 200 keeps a consistent position and follows the body's movement, so that the data collected by the data acquisition module 210 are more accurate and the processing module 220 can accurately judge whether the human body has fallen.
In one embodiment, the fall monitoring device 200 may further include mobile terminal software, which is configured to collect the user's personal information so as to register the monitored user and establish a correspondence between the fall monitoring device 200 and the monitored user.
In this embodiment, the mobile terminal software may further set a preset contact for the monitored user, so that when the fall monitoring device 200 determines that the monitored user has fallen, it sends fall information to the preset contact through the mobile terminal software.
It can be understood that the fall monitoring device 200 provided in the present application corresponds to the fall monitoring method provided in the present application; for brevity of description, identical or similar parts are described in detail in the subsequent sections on the fall monitoring method.
Referring to fig. 3, fig. 3 is a flowchart of a fall monitoring method according to an embodiment of the present application, the fall monitoring method includes:
and S110, acquiring real-time trunk posture data of the monitored user, wherein the real-time trunk posture data comprises the acceleration and the posture angle of the body of the monitored user.
In this embodiment, whether the monitored user has fallen is judged by analyzing the changes in the trunk attitude angle and acceleration caused by the user's behavior; therefore, the acquired trunk posture data include the acceleration and attitude angle of the monitored user's body.
In one embodiment, the acceleration of the monitored user collected by an accelerometer is acquired in real time; the rotation speed of the monitored user collected by a gyroscope is acquired in real time; and the attitude angle of the monitored user is determined based on the acceleration and the rotation speed.
In this embodiment, an accelerometer may be used to obtain the movement speed and movement angle of the human body produced by the user's behavior. For example, the accelerometer may be a three-axis MEMS accelerometer placed on the chest at the level of the second thoracic vertebra to acquire the accelerations of the monitored user's body in different directions. The resultant acceleration can be calculated from the acquired accelerations in the different directions, and because the resultant acceleration has a direction, the movement angle can also be determined. The calculation of the resultant acceleration and the determination of the movement angle from its direction may follow the prior art and are not described here.
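As a simple illustration of this step, the following sketch computes the resultant acceleration as the vector magnitude of the three axis readings; the function name and the unit convention (readings in g) are assumptions for illustration, not taken from the patent.

```python
import math

def resultant_acceleration(ax: float, ay: float, az: float) -> float:
    """Magnitude of the acceleration vector from three-axis accelerometer readings (in g)."""
    return math.sqrt(ax * ax + ay * ay + az * az)

# A body at rest reads roughly 1 g; readings well below that suggest free fall.
print(resultant_acceleration(0.0, 0.0, 1.0))   # 1.0 g
print(resultant_acceleration(0.1, 0.2, 0.3))   # ~0.37 g
```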
In this embodiment, the rotation speed of the monitored user's body can be acquired by the gyroscope, and the intensity of the body's twisting motion is judged from the rotation speed. For example, the gyroscope may be a three-axis MEMS gyroscope that acquires the rotation speed, i.e. the angular velocity, of the monitored user in different directions.
In this embodiment, after the acceleration and the rotation speed are acquired, the posture angle of the human body may be acquired by processing the acceleration and the rotation speed.
In some embodiments of the present application, the attitude angle may be calculated using quaternions.
Specifically, the quaternion corresponding to the Euler angles can be written as

$$
\begin{cases}
q_0 = \cos\frac{\gamma}{2}\cos\frac{\theta}{2}\cos\frac{\Psi}{2} + \sin\frac{\gamma}{2}\sin\frac{\theta}{2}\sin\frac{\Psi}{2} \\
q_1 = \sin\frac{\gamma}{2}\cos\frac{\theta}{2}\cos\frac{\Psi}{2} - \cos\frac{\gamma}{2}\sin\frac{\theta}{2}\sin\frac{\Psi}{2} \\
q_2 = \cos\frac{\gamma}{2}\sin\frac{\theta}{2}\cos\frac{\Psi}{2} + \sin\frac{\gamma}{2}\cos\frac{\theta}{2}\sin\frac{\Psi}{2} \\
q_3 = \cos\frac{\gamma}{2}\cos\frac{\theta}{2}\sin\frac{\Psi}{2} - \sin\frac{\gamma}{2}\sin\frac{\theta}{2}\cos\frac{\Psi}{2}
\end{cases}
$$

where $(q_0, q_1, q_2, q_3)$ is the quaternion, $\gamma$ is the angle of rotation about the x-axis, $\theta$ is the angle of rotation about the y-axis, and $\Psi$ is the angle of rotation about the z-axis. The quaternion itself is well known; its derivation can be found in the prior art and is not repeated here.

Thus, the rotation angles about the x-axis, the y-axis and the z-axis, that is, the Euler angles, can be determined from the quaternion. By differentiating the quaternion representation of the attitude over time, the angular velocity of rotation about each axis can be related to the quaternion, specifically

$$
\begin{pmatrix} \dot q_0 \\ \dot q_1 \\ \dot q_2 \\ \dot q_3 \end{pmatrix}
= \frac{1}{2}
\begin{pmatrix}
0 & -\omega_x & -\omega_y & -\omega_z \\
\omega_x & 0 & \omega_z & -\omega_y \\
\omega_y & -\omega_z & 0 & \omega_x \\
\omega_z & \omega_y & -\omega_x & 0
\end{pmatrix}
\begin{pmatrix} q_0 \\ q_1 \\ q_2 \\ q_3 \end{pmatrix}
$$

where $\dot q_0, \dot q_1, \dot q_2, \dot q_3$ denote the time derivatives of the quaternion components and $\omega_x, \omega_y, \omega_z$ denote the angular velocities of rotation about the x, y and z axes, respectively.
In this way, the Euler angles and the angular velocities of the monitored user's body about the different axes can be calculated, and the real-time attitude angle of the monitored user's body can then be determined.
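The sketch below is a minimal, assumption-laden illustration of this calculation rather than the patent's implementation: it performs one Euler-integration step of the quaternion kinematic equation above using the gyroscope rates, and then extracts the Euler angles from the quaternion. The function names, the sampling step and the use of the identity quaternion as the starting attitude are assumptions.

```python
import math

def integrate_quaternion(q, wx, wy, wz, dt):
    """One integration step of the quaternion kinematics dq/dt = 0.5 * Omega(w) * q,
    with gyroscope angular rates wx, wy, wz in rad/s."""
    q0, q1, q2, q3 = q
    dq0 = 0.5 * (-q1 * wx - q2 * wy - q3 * wz)
    dq1 = 0.5 * ( q0 * wx + q2 * wz - q3 * wy)
    dq2 = 0.5 * ( q0 * wy - q1 * wz + q3 * wx)
    dq3 = 0.5 * ( q0 * wz + q1 * wy - q2 * wx)
    q0, q1, q2, q3 = (q0 + dq0 * dt, q1 + dq1 * dt, q2 + dq2 * dt, q3 + dq3 * dt)
    n = math.sqrt(q0 * q0 + q1 * q1 + q2 * q2 + q3 * q3)
    return (q0 / n, q1 / n, q2 / n, q3 / n)   # renormalise to a unit quaternion

def euler_angles(q):
    """Roll (about x), pitch (about y) and yaw (about z) in degrees."""
    q0, q1, q2, q3 = q
    roll = math.atan2(2 * (q0 * q1 + q2 * q3), 1 - 2 * (q1 * q1 + q2 * q2))
    pitch = math.asin(max(-1.0, min(1.0, 2 * (q0 * q2 - q3 * q1))))
    yaw = math.atan2(2 * (q0 * q3 + q1 * q2), 1 - 2 * (q2 * q2 + q3 * q3))
    return tuple(math.degrees(a) for a in (roll, pitch, yaw))

# Example: start from the identity quaternion and integrate one 10 ms gyroscope sample.
q = (1.0, 0.0, 0.0, 0.0)
q = integrate_quaternion(q, 0.01, 0.02, 0.00, dt=0.01)
print(euler_angles(q))
```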
It should be understood that existing methods, such as the Euler angle method or the quaternion method, may be used to calculate the attitude angle of the human body from its acceleration and rotation speed, and they are not expanded upon here. It should be noted that different calculation methods may use different parameters, so that the real-time trunk posture data to be acquired differ, and different data acquisition devices or apparatus may be used accordingly. For example, in some embodiments the attitude angle is determined directly by an angle sensor; details are omitted here.
S120, judging whether the monitored user has fallen based on the acceleration and the first fall judgment model.
In this embodiment, the first fall judgment model is a hierarchical threshold judgment model divided into three stages, namely weightlessness interrupt detection, impact interrupt detection and stillness interrupt detection, which are used to detect the different stages of a human fall and thereby judge whether the human body has fallen.
In one embodiment, when it is determined that the acceleration remains below a first threshold for a first time period, the monitored user is determined to be in a weightless state; within a second time period after the monitored user is determined to be in the weightless state, if the instantaneous value of the acceleration is determined to be larger than a second threshold, the user is determined to have suffered an impact; and if the acceleration then remains below a third threshold for a fourth time period, the monitored user is determined to be in a stationary state, the stationary state indicating that the monitored user has fallen.
Specifically, the first stage, weightlessness interrupt detection, targets the moment of falling, when weightlessness occurs; it judges whether the monitored user experiences weightlessness and thus whether a fall is possible. Weightlessness interrupt detection judges whether the user's acceleration stays below a first threshold for a certain time, where the first threshold may be 0.6 g and g denotes the gravitational acceleration. Timing starts from the moment the acceleration drops below the first threshold; when the duration reaches the first time period, the user is determined to have experienced weightlessness (the user is in the weightless state throughout the first time period) and the process proceeds to the next stage, impact interrupt detection. It should be noted that when the first fall judgment model compares the acceleration with a threshold, the acceleration used is the resultant acceleration of the user's body. It can be understood that a very short period of weightlessness may be caused by the monitored user jumping or making some other movement, and the user easily returns to the normal state from such a brief weightless state; therefore, weightlessness is only confirmed when its duration reaches the first time period. The judgment conditions can be set reasonably according to the monitored user's actual personal situation; the values here are only examples and should not limit the present application. In some embodiments, the acceleration threshold in the judgment condition may be chosen between 0.5 g and 0.7 g.
During a fall, after the weightlessness stage, the monitored user's body hits the ground, so the acceleration reaches a peak that is far larger than the acceleration of normal movement. Specifically, the acceleration of normal human movement does not exceed 2 g, whereas the peak acceleration during a fall can reach 4 g; the second threshold may therefore be set to 3.5 g, and the second time period is the 1 second counted from the moment the user is determined to be in the weightless state. Within this 1 second, the model checks whether the instantaneous value of the acquired acceleration exceeds 3.5 g; when an acceleration greater than 3.5 g occurs, the user is determined to have suffered an impact and the process enters stillness interrupt detection. If no impact occurs within the 1 second, the user is determined not to have fallen and weightlessness interrupt detection is performed again. In some embodiments, the acceleration threshold in this judgment condition may be chosen between 3 g and 3.5 g.
After the impact, a person who has not fallen can quickly resume movement, for example jumping, whereas a person who has fallen will remain still for a period of time; stillness interrupt detection therefore further judges whether the user has fallen. Specifically, stillness interrupt detection may start 0.5 second after the moment of impact, that is, the third time period is the 0.5 second following the impact, after which detection of the instantaneous acceleration begins. The detection then lasts for two seconds, which is the fourth time period: the model judges whether the acceleration stays below 1.5 g throughout 0.5 to 2.5 seconds after the impact, i.e. the third threshold is 1.5 g. If the acceleration remains below 1.5 g for the whole fourth time period, the user is determined to be in a stationary state and is judged to have fallen. If the acceleration during the fourth time period exceeds 1.5 g, the earlier weightlessness and impact can be regarded as having been caused by, for example, landing after a jump, so the user is determined not to have fallen and the process returns to weightlessness interrupt detection.
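A minimal sketch of the hierarchical threshold logic described above, written as a small state machine fed with the resultant acceleration sample by sample. The thresholds (0.6 g, 3.5 g, 1.5 g) and the 1 s, 0.5 s and 2 s windows follow the example values in this description; the sampling rate and the duration of the first time period are not specified in the text and are assumed here.

```python
from enum import Enum, auto

class Stage(Enum):
    FREE_FALL = auto()    # weightlessness interrupt detection
    IMPACT = auto()       # impact interrupt detection
    STILLNESS = auto()    # stillness interrupt detection

class FirstFallModel:
    """Hierarchical threshold model; call update() with each resultant-acceleration sample (in g)."""

    def __init__(self, fs_hz=50, free_fall_g=0.6, free_fall_s=0.3,
                 impact_g=3.5, impact_window_s=1.0,
                 still_g=1.5, still_start_s=0.5, still_window_s=2.0):
        self.free_fall_g = free_fall_g
        self.free_fall_n = int(free_fall_s * fs_hz)        # assumed first time period
        self.impact_g = impact_g
        self.impact_n = int(impact_window_s * fs_hz)       # second time period (1 s)
        self.still_g = still_g
        self.still_skip_n = int(still_start_s * fs_hz)     # third time period (0.5 s)
        self.still_n = int(still_window_s * fs_hz)         # fourth time period (2 s)
        self._reset()

    def _reset(self):
        self.stage, self.count = Stage.FREE_FALL, 0

    def update(self, a_g: float) -> bool:
        """Return True when free fall, impact and sustained stillness have all been detected."""
        if self.stage is Stage.FREE_FALL:
            self.count = self.count + 1 if a_g < self.free_fall_g else 0
            if self.count >= self.free_fall_n:             # sustained weightlessness
                self.stage, self.count = Stage.IMPACT, 0
        elif self.stage is Stage.IMPACT:
            self.count += 1
            if a_g > self.impact_g:                        # impact spike inside the window
                self.stage, self.count = Stage.STILLNESS, 0
            elif self.count >= self.impact_n:              # no impact within 1 s: not a fall
                self._reset()
        else:  # Stage.STILLNESS
            self.count += 1
            if self.count <= self.still_skip_n:            # ignore the first 0.5 s after impact
                return False
            if a_g >= self.still_g:                        # the user moved again: not a fall
                self._reset()
            elif self.count >= self.still_skip_n + self.still_n:
                self._reset()                              # sustained stillness: candidate fall
                return True
        return False
```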
It can be understood that, during the judgment by the first fall judgment model, the acceleration values are acquired and calculated in real time at a preset frequency; therefore, the acceleration values at different times may differ.
S130, when the first fall judgment model judges that the monitored user has fallen, confirming whether the monitored user has fallen based on the attitude angle and the second fall judgment model.
In an embodiment, the user is determined to have fallen only when both the first fall judgment model and the second fall judgment model judge that the user has fallen.
In this embodiment, because using the first fall judgment model alone may produce more misjudgments, the judgment result of the first fall judgment model is confirmed by the second fall judgment model, thereby reducing misjudgments.
In an embodiment, the second fall judgment model is a machine learning model that performs machine learning based on the monitored user's past trunk posture data.
In this embodiment, because the second fall judgment model is a machine learning model, it can be trained on data so that its fall judgment is continually optimized. Specifically, the fall monitoring device can collect the monitored user's trunk posture data in normal states, such as walking, running and going up and down stairs, and use the trunk posture data acquired in these normal states as a training set to train, i.e. perform machine learning on, the second fall judgment model. The collected data therefore allow the second fall judgment model to be trained continuously and its fall judgment to be optimized, so that it can judge more accurately whether the monitored user has fallen. Because the monitored user's own trunk posture data are used for training, the trained second fall judgment model better matches the monitored user's behavior habits, which reduces misjudgments caused by differences between individuals and further improves the judgment accuracy.
In one embodiment, the attitude angle of the monitored user at any moment within a fifth time period after the monitored user enters the weightless state is acquired; and when the attitude angle at any moment within the fifth time period is determined to be larger than a preset angle, the user is determined to have fallen.
In the embodiment of the application, a preset angle is set in the second fall judgment model, and the second fall judgment model compares the acquired attitude angle of the monitored user with the preset angle so as to judge whether the user has fallen.
In the embodiment of the application, when the angle of the monitored user's body is abnormal, there may be a risk of falling, such as leaning excessively forwards or backwards; therefore, whether the user has fallen can be determined from the attitude angle of the user's body.
It can be understood that a fall usually begins with weightlessness; therefore, in this embodiment, once the user is in the weightless state, the attitude angle of the user at any moment within the fifth time period is acquired and checked for abnormality, for example a large deviation from the normal angle range, so as to judge whether the user has fallen. For example, within 1.5 seconds of determining that the user is in the weightless state, that is, within a fifth time period of 1.5 seconds, the attitude angle information of the user about the x, y and z axes of the coordinate system is acquired and checked for any angle greater than 70 degrees; when an attitude angle greater than 70 degrees exists, the user is determined to have fallen. It can be understood that, after training with the monitored user's own trunk posture data collected in a normal state, the preset angle set in the second fall judgment model can be adjusted so that its fall judgment for the monitored user becomes more accurate.
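Shown below is a minimal sketch, under assumptions, of this second-stage confirmation: within the fifth time period (1.5 s here) after weightlessness is detected, a fall is confirmed if any attitude angle exceeds the preset angle (70 degrees in this example). The data layout, which passes a window of (roll, pitch, yaw) samples, is a hypothetical choice for illustration.

```python
def confirm_fall(attitude_window_deg, preset_angle_deg=70.0) -> bool:
    """attitude_window_deg: iterable of (roll, pitch, yaw) tuples, in degrees, sampled
    within the fifth time period (e.g. 1.5 s) after the weightless state was detected."""
    return any(abs(angle) > preset_angle_deg
               for sample in attitude_window_deg
               for angle in sample)

# The fall is only reported when both models agree, e.g.:
# fell = first_model.update(a_g) and confirm_fall(window_angles)
```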
In the embodiment of the application, the real-time trunk posture data of the monitored user are acquired, the acceleration and the attitude angle of the real-time trunk posture data are input into the first fall judgment model and the second fall judgment model respectively, and whether the monitored user has fallen is judged from the two aspects of acceleration and attitude angle. Because the second fall judgment model verifies the judgment result of the first fall judgment model, misjudgments of falls by the first fall judgment model can be effectively reduced, and the accuracy of the fall judgment result is improved.
In an embodiment, the acquired real-time trunk posture data of the monitored user's body may include only the acceleration and the rotation speed; after the monitored user is determined to be in the weightless state, the attitude angle of the monitored user is calculated from the acceleration and the rotation speed so as to judge whether the user has fallen. The amount of calculation during fall monitoring can thereby be reduced.
In an embodiment, the present application further provides methods for obtaining the first fall judgment model and the second fall judgment model.
In one embodiment, the method of obtaining the first fall judgment model comprises: constructing an initial first fall judgment model; setting preset judgment conditions of the initial first fall judgment model, wherein the preset judgment conditions comprise the judgment threshold and judgment time of each level in the initial first fall judgment model; and determining the initial first fall judgment model after the preset judgment conditions have been set to be the first fall judgment model.
In this embodiment, the constructed first fall judgment model includes weightlessness interrupt detection, impact interrupt detection and stillness interrupt detection, which are respectively used to detect whether the monitored user is in each stage of the fall process.
The judgment condition of weightlessness interrupt detection is that the acceleration remains below 0.6 g for a preset time, where g is the gravitational acceleration; when this condition is met, the monitored user is judged to be in the weightless state.
The judgment condition of impact interrupt detection is whether an acceleration greater than 3.5 g occurs within one second after the monitored user is detected to be in the weightless state; when an acceleration greater than 3.5 g occurs, the monitored user is determined to have suffered an impact.
The judgment condition of stillness interrupt detection is whether the accelerations during the 0.5th to 2.5th seconds after the impact are all less than 1.5 g; when they are all less than 1.5 g, the monitored user is determined to be in a stationary state and is determined to have fallen.
In this embodiment, while the monitored user is being monitored, the various accelerations of the monitored user during daily behavior can be collected to adjust the thresholds in real time, so that the judgment conditions of the first fall judgment model better match the monitored user's daily behavior habits and misjudgments are reduced.
In one embodiment, the method of obtaining the second fall judgment model comprises: constructing an initial fall judgment model; and training the initial fall judgment model with a pre-acquired training set, the trained initial fall judgment model being determined to be the second fall judgment model, wherein the pre-acquired training set includes the attitude angles of a normal human body.
In this embodiment, the initial fall judgment model may be a classifier for judging whether a fall has occurred, constructed with a deep learning model such as a neural network model or an SVM (support vector machine). The initial fall judgment model is trained with a pre-acquired training set containing the behavior characteristics of normal persons, so that it can judge whether a normal person has fallen. The preset angle with which the trained initial fall judgment model judges whether the user has fallen is the fall attitude angle of a normal person, so the trained initial fall judgment model can be determined to be the second fall judgment model.
In this embodiment, before the initial fall judgment model is trained, PCA (Principal Component Analysis) may be used to process the training set and extract its feature quantities, so that the initial fall judgment model is trained on the extracted features. It can be understood that PCA may also be used to process the collected data and extract its features, so that whether the monitored user has fallen is judged on the basis of the extracted features.
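As one possible concrete form of this training step, the sketch below fits an SVM classifier on PCA-reduced trunk-posture features with scikit-learn. The feature layout, file names and label convention are assumptions for illustration; the description only states that a classifier such as a neural network or SVM may be used and that PCA may extract the features.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical training set: one row per time window of trunk posture data
# (attitude angles, accelerations, ...); labels: 1 = fall, 0 = normal activity.
X = np.load("trunk_posture_features.npy")
y = np.load("trunk_posture_labels.npy")

second_fall_model = make_pipeline(
    StandardScaler(),
    PCA(n_components=0.95),      # keep the components explaining 95% of the variance
    SVC(kernel="rbf", C=1.0),
)
second_fall_model.fit(X, y)

# Windows collected from the monitored user during daily use can later be appended
# to X and y and the pipeline refit, so the model adapts to that user's habits.
```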
In one embodiment, the method further comprises: acquiring the real-time trunk posture data of the monitored user; storing the trunk posture data in the training set; and training the second fall judgment model based on the training set.
In this embodiment, after training, the second fall judgment model can judge the fall behavior of most people. However, because the personal characteristics of monitored users differ, such as age, height, weight and physical fitness, the manner of falling may differ from person to person. Therefore, the monitored user's real-time trunk posture data can be collected during daily use and used as training data to train the second fall judgment model and adjust its preset angle, so that the preset angle of the second fall judgment model better matches the monitored user's personal behavior habits.
In one embodiment, when the deviation between several consecutive frames of the real-time trunk posture data and the standard trunk posture data in the training set is large, the monitored user is determined to be at risk of falling.
In the embodiment of the application, a training set is constructed by collecting the monitored user's real-time trunk posture data, and the constructed training set is used to train the second fall judgment model, so that the trained second fall judgment model conforms to the monitored user's behavior characteristics; the training set also contains data on the monitored user's behavior characteristics in different motion postures. Therefore, the real-time trunk posture data acquired for judgment can be compared with the standard trunk posture data of the different motion postures in the training set, and when the acquired real-time trunk posture data deviate from the standard trunk posture data over several consecutive frames, an early warning of fall risk can be sent to the monitored user. It can be understood that the real-time trunk posture data of the monitored user are collected at a certain frequency and are therefore described in frames.
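A rough sketch of the early-warning comparison described above: each real-time frame is compared with a reference frame of standard trunk posture data from the training set, and a warning is raised once the deviation stays large over several consecutive frames. The distance metric, the frame pairing and the threshold values are assumptions for illustration.

```python
import numpy as np

def fall_risk_warning(realtime_frames, reference_frames,
                      deviation_threshold=1.0, consecutive_frames=10) -> bool:
    """Return True when the frame-to-frame deviation exceeds the threshold for
    `consecutive_frames` frames in a row."""
    run = 0
    for frame, ref in zip(realtime_frames, reference_frames):
        deviation = np.linalg.norm(np.asarray(frame, dtype=float) - np.asarray(ref, dtype=float))
        run = run + 1 if deviation > deviation_threshold else 0
        if run >= consecutive_frames:
            return True
    return False
```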
In one embodiment, after the monitored user falls down, alarm information is sent to a contact preset by the monitored user.
In this embodiment, after the monitored user falls, alarm information is sent to the contact preset by the monitored user, so that the fall is discovered in time. For example, if an elderly person falls at home and cannot move, the alarm information is sent to the elderly person's family so that the elderly person can be taken to a doctor in time.
Based on the same inventive concept, embodiments of the present application further provide a computer-readable storage medium, in which a computer program is stored, and when the computer program runs on a computer, the computer is enabled to execute a fall monitoring method.
In the embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other ways. The above-described apparatus embodiments are merely illustrative. The functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
It should be noted that, in this document, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.

Claims (10)

1. A fall monitoring method, comprising:
acquiring real-time trunk posture data of a monitored user, wherein the real-time trunk posture data comprise the acceleration and attitude angle of the monitored user's body;
judging whether the monitored user has fallen based on the acceleration and a first fall judgment model;
when the first fall judgment model judges that the monitored user has fallen, confirming whether the monitored user has fallen based on the attitude angle and a second fall judgment model, wherein the second fall judgment model is a machine learning model that performs machine learning based on the trunk posture data of the monitored user.
2. The method of claim 1, wherein the acquiring of real-time trunk posture data of the monitored user comprises:
acquiring the acceleration of the monitored user acquired by an accelerometer in real time;
acquiring the rotation speed of the monitored user acquired by a gyroscope in real time;
determining the attitude angle of the monitored user's body in real time based on the acceleration and the rotation speed.
3. The method of claim 1, wherein the first fall judgment model is a hierarchical threshold judgment model, and the judging of whether the monitored user has fallen based on the acceleration and the first fall judgment model comprises: when it is determined that the acceleration remains below a first threshold for a first time period, determining that the monitored user is in a weightless state;
within a second time period after the monitored user is determined to be in the weightless state, if the instantaneous value of the acceleration is determined to be larger than a second threshold, determining that the user has suffered an impact;
and, starting from the moment corresponding to a third time period after the monitored user suffers the impact, if the acceleration remains below a third threshold for a fourth time period, determining that the monitored user is in a stationary state, wherein the stationary state indicates that the monitored user has fallen.
4. The method of claim 3, wherein, when the first fall judgment model judges that the monitored user has fallen, the confirming of whether the user has fallen based on the attitude angle and the second fall judgment model comprises:
acquiring the attitude angle of the monitored user at any moment within a fifth time period after the monitored user is in the weightless state;
and when the attitude angle at any moment within the fifth time period is determined to be larger than a preset angle, determining that the user has fallen.
5. The method of claim 1, wherein, before the judging of whether the monitored user has fallen based on the acceleration and the first fall judgment model, the method further comprises: constructing a first initial fall judgment model; and setting preset judgment conditions of the first initial fall judgment model, wherein the preset judgment conditions comprise a judgment threshold and a judgment time of each level in the first initial fall judgment model, and the first fall judgment model is the first initial fall judgment model after the preset judgment conditions have been set;
before the confirming of whether the user has fallen based on the attitude angle and the second fall judgment model, the method further comprises: constructing a second initial fall judgment model; and training the second initial fall judgment model with a pre-acquired training set to obtain the second fall judgment model, wherein the pre-acquired training set comprises attitude angles of a normal person's body in various motion states.
6. The method of claim 1, further comprising: acquiring the real-time trunk posture data of the monitored user; storing the trunk posture data in a training set; and training the second fall judgment model based on the training set.
7. The method of claim 6, further comprising: determining that the monitored user is at risk of falling when the deviation between several consecutive frames of the real-time trunk posture data and the standard trunk posture data in the training set exceeds a preset threshold.
8. A fall monitoring device, comprising:
the data acquisition module comprises an accelerometer and a gyroscope and is used for collecting real-time trunk posture data of the body of the monitored user;
a processing module for performing the fall monitoring method of any one of claims 1 to 7.
9. The fall monitoring device of claim 8, further comprising a housing, wherein the data acquisition module and the processing module are both arranged in the housing;
and a spring clip is arranged on one outer surface of the housing and is used to fix the fall monitoring device on the monitored user's body so as to monitor whether the monitored user falls.
10. A computer-readable storage medium, in which a computer program is stored which, when run on a computer, causes the computer to carry out the method according to any one of claims 1 to 7.
CN202210430076.2A 2022-04-22 2022-04-22 Fall monitoring method and device and storage medium Pending CN114795189A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210430076.2A CN114795189A (en) 2022-04-22 2022-04-22 Fall monitoring method and device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210430076.2A CN114795189A (en) 2022-04-22 2022-04-22 Fall monitoring method and device and storage medium

Publications (1)

Publication Number Publication Date
CN114795189A true CN114795189A (en) 2022-07-29

Family

ID=82504646

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210430076.2A Pending CN114795189A (en) 2022-04-22 2022-04-22 Fall monitoring method and device and storage medium

Country Status (1)

Country Link
CN (1) CN114795189A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115024717A (en) * 2022-08-09 2022-09-09 广东百年医疗健康科技发展有限公司 Fall detection method, device, equipment and storage medium
CN118021294A (en) * 2024-04-11 2024-05-14 四川省铁路建设有限公司 Fall detection method and system based on multiple sensors

Similar Documents

Publication Publication Date Title
CN114795189A (en) Fall monitoring method and device and storage medium
EP2451351B1 (en) Fall prevention
JP5437549B2 (en) Activity pattern monitoring method and apparatus
JP5740285B2 (en) Gait analyzer and gait analysis program
US20100049096A1 (en) System for fall prevention and a method for fall prevention using such a system
JP2019512366A (en) System and method for automatic attitude calibration
WO2015009951A1 (en) Fall detection using machine learning
CN107233099A (en) Fall detection system for analyzing severity of fall and wearing device thereof
US9159213B1 (en) Motion detection method and device
JP2008212298A (en) Sleepiness judging apparatus and program
Zhong et al. A real-time pre-impact fall detection and protection system
CN112370048A (en) Movement posture injury prevention method and system based on joint key points and storage medium
CN113164034A (en) System for detecting whether a user wears a visual behavior monitor
CN114469074A (en) Fall early warning method, system, equipment and computer storage medium
CN109799624A (en) A kind of intelligence children's protective spectacles and terminal device
WO2024174674A1 (en) Inertial sensing-based method for early warning of fall and similar action
CN106650300B (en) Old man monitoring system and method based on extreme learning machine
CN116491935B (en) Exercise health monitoring method, system and medium of intelligent wearable equipment
CN112115827A (en) Falling behavior identification method based on human body posture dynamic characteristics
JP6044670B2 (en) Gait analyzer
KR102268445B1 (en) Apparatus for estimation of gait stability based on inertial information and method thereof
Zhuang et al. A novel wearable smart button system for fall detection
KR102422132B1 (en) Fall Prediction Method and Wearable Fall Prediction Device Based Jerk
CN206639357U (en) Falls in Old People detection means based on inertia, position sensor
US20210386329A1 (en) Measurement device, control method, and program recording medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination