CN114886417A - Intelligent safety nursing monitoring system and method - Google Patents


Info

Publication number
CN114886417A
Authority
CN
China
Prior art keywords
user
monitoring
time
monitoring part
analysis module
Prior art date
Legal status
Granted
Application number
CN202210504779.5A
Other languages
Chinese (zh)
Other versions
CN114886417B (en)
Inventor
倪顺康
仲恒平
陈国春
Current Assignee
Nanjing Bullteam Medical Technology Development Co ltd
Original Assignee
Nanjing Bullteam Medical Technology Development Co ltd
Priority date
Filing date
Publication date
Application filed by Nanjing Bullteam Medical Technology Development Co., Ltd.
Priority claimed from CN202210504779.5A
Publication of CN114886417A
Application granted
Publication of CN114886417B
Active legal status
Anticipated expiration

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1126: Measuring movement of the entire body or parts thereof using a particular sensing technique
    • A61B 5/1128: Measuring movement of the entire body or parts thereof using a particular sensing technique using image analysis
    • A61B 5/02: Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B 5/0205: Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B 5/74: Details of notification to user or communication with user or patient; user input means
    • A61B 5/746: Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms

Abstract

The invention discloses an intelligent safety nursing monitoring system and method. The system comprises a monitoring image extraction module and a monitoring image analysis module, where the monitoring image analysis module analyzes each action image of a user acquired by the monitoring image extraction module to obtain the degree of association between each action image and each of the user's characteristic monitoring parts. A physiological data analysis module monitors the user's physiological data in real time through a sensor and analyzes the monitored data to obtain the fluctuation rate of the user's physiological data. The invention thereby monitors the user's health state effectively, raises an alarm quickly when the state value is abnormal, and notifies the nursing staff to perform the corresponding nursing operation; this reduces the nursing staff's workload, improves their monitoring of the user, and ensures that nursing is timely and effective.

Description

Intelligent safety nursing monitoring system and method
Technical Field
The invention relates to the technical field of health monitoring, in particular to an intelligent safety nursing monitoring system and method.
Background
With the development of science and technology and the improvement of living standards, people pay ever more attention to their health, and both the elderly and the sick attach importance to safety care. In current nursing and medical institutions, the elderly and the sick are usually cared for manually, that is, nursing staff attend to their condition. However, the number of nursing staff is limited and far below the number of patients and elderly people, so nursing staff cannot monitor their health status in real time, and this mode imposes a heavy workload on the staff. Manual nursing of patients and the elderly therefore has significant shortcomings.
In view of the above, there is a need for an intelligent safety nursing monitoring system and method.
Disclosure of Invention
The invention aims to provide an intelligent safety nursing monitoring system and method to solve the problems in the background technology.
In order to solve the technical problems, the invention provides the following technical scheme: an intelligent safety care monitoring system, comprising:
a user personal information acquisition module, which acquires the user's basic information and, from a database, the characteristic monitoring parts corresponding to the user's illnesses, obtaining a first characteristic monitoring part set Dn corresponding to the nth user;
the monitoring image extraction module acquires the acquisition results of the user action images by the cameras at different times and binds the acquisition time with the corresponding user action images;
the monitoring image analysis module analyzes each action image of the user acquired by the monitoring image extraction module to obtain the association degree between each action image and each corresponding characteristic monitoring part of the user;
the physiological data analysis module monitors the physiological data of the user in real time through the sensor and analyzes the monitored physiological data of the user to obtain the fluctuation rate of the physiological data of the user;
the user state analysis module obtains a state value of the user according to the correlation degree between each action image obtained by the monitoring image analysis module and each corresponding characteristic monitoring part of the user and the fluctuation rate of the physiological data of the user obtained by the physiological data analysis module;
and the monitoring and early warning module monitors the user state value obtained by the user state analysis module in real time and performs early warning when the user state value is abnormal.
Through the cooperation of all the modules, the invention acquires the user's personal information and the corresponding set of characteristic monitoring parts; by acquiring and analyzing user images it obtains the degree of association between each user action image and each of the user's characteristic monitoring parts, and, combined with the fluctuation rate of the user's physiological data obtained by the physiological data analysis module, quantifies the user's health state in real time as a state value. This enables effective monitoring of the user's health state: when the state value is abnormal the system alarms quickly and notifies the nursing staff to perform the corresponding nursing operation, which reduces the staff's workload, improves their monitoring of the user, and ensures that nursing is timely and effective.
Further, the user personal information acquisition module is respectively connected with the monitoring image extraction module and the physiological data analysis module;
the user basic information obtained by the user personal information acquisition module comprises a user number and the user's disease types, and the same user may have one or more disease types;
the user personal information acquisition module compares each disease suffered by the same user with the database respectively to obtain a set of characteristic monitoring parts corresponding to the corresponding disease types of the user;
the corresponding characteristic monitoring parts of different disease types in the database are different, and the number of the corresponding characteristic monitoring parts of the same disease type in the database is one or more;
the user personal information acquisition module acquires a priority numerical value corresponding to each disease type preset in a database, the number of the disease types corresponding to the same priority numerical value is one or more,
the user personal information acquisition module records a priority value corresponding to the ith disease suffered by the nth user as Ani, records a set of characteristic monitoring parts corresponding to the ith disease suffered by the nth user as Cni, and records a monitoring priority coefficient corresponding to each element in Cni as Bni, wherein the monitoring priority coefficient corresponding to each element in the characteristic monitoring part set is equal to the priority value of the disease type corresponding to the corresponding characteristic monitoring part set, namely Bni equals Ani;
the number of the disease types suffered by the nth user is recorded as iZn, the number of the feature monitoring part sets corresponding to the nth user is iZn, iZn is more than or equal to 1,
the user personal information acquisition module records a union set of iZn feature monitoring part sets corresponding to the nth user as Cn; obtaining a relation value Cn corresponding to the jth element in Cn and the corresponding iZn feature monitoring part sets j And a first feature monitoring site set Dn corresponding to the nth user,
the above-mentioned
Figure BDA0003637025770000031
Wherein F (j, Cni) represents the relationship value of the j-th element in Cn corresponding to the feature monitoring site set Cni,
when the jth element in Cn belongs to the feature monitoring site set Cni, then F (j, Cni) is Bni,
when the jth element in Cn does not belong to the feature monitoring site set Cni, F (j, Cni) is 0;
comparing the relationship values of any two different elements in Cn with the corresponding iZn feature monitoring part sets respectively,
if the relationship values of the two elements and the corresponding iZn feature monitoring part sets are different, adjusting the position between the two elements, arranging the element with the larger relationship value in front of the element with the smaller relationship value,
if the two elements are respectively the same as the corresponding relation values of the corresponding iZn feature monitoring part sets, further adjusting the position between the two elements according to the corresponding priority values of the feature monitoring parts prefabricated in the database, arranging the element with the larger priority value in front of the element with the smaller priority value, wherein the priority values corresponding to different feature monitoring parts prefabricated in the database are all different,
and recording the corresponding set after the element position adjustment in the Cn is finished as a first feature monitoring part set Dn corresponding to the nth user.
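As a sketch, the construction of the first feature monitoring part set Dn described above can be written out as follows; the disease names, priority values and per-part priorities are illustrative assumptions, not values from the patent:

```python
def build_first_monitoring_set(disease_sets, disease_priority, part_priority):
    """disease_sets: {disease: set of feature monitoring parts} (the sets Cni);
    disease_priority: {disease: priority value Ani (= Bni)};
    part_priority: {part: distinct tie-breaking priority from the database}.
    Returns the ordered first feature monitoring part set Dn."""
    # Union Cn of all iZn feature monitoring part sets.
    cn = set().union(*disease_sets.values())

    # Relationship value Cnj = sum of F(j, Cni): each set the part belongs
    # to contributes that disease's priority value Bni.
    def relationship_value(part):
        return sum(disease_priority[d]
                   for d, parts in disease_sets.items() if part in parts)

    # Larger relationship value first; ties broken by the per-part priority
    # value prefabricated in the database (all distinct).
    return sorted(cn,
                  key=lambda p: (relationship_value(p), part_priority[p]),
                  reverse=True)

# Illustrative inputs (assumed, not from the patent).
disease_sets = {"hypertension": {"head", "chest"}, "arthritis": {"knee", "chest"}}
disease_priority = {"hypertension": 3, "arthritis": 2}
part_priority = {"head": 9, "chest": 8, "knee": 7}
print(build_first_monitoring_set(disease_sets, disease_priority, part_priority))
# → ['chest', 'head', 'knee']
```

Here "chest" leads because it belongs to both disease sets (relationship value 3 + 2 = 5), matching the rule that elements with larger relationship values are placed in front.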
In having the user personal information acquisition module obtain the first characteristic monitoring part set Dn corresponding to the nth user, the invention confirms, in light of the user's illnesses, both the user's characteristic monitoring parts and their ordering; different illnesses correspond to different characteristic monitoring parts and different orderings. The aim is targeted monitoring according to the user's condition, which improves the effectiveness of monitoring the user's health state. The ordering of the characteristic monitoring parts takes into account the timeliness of the monitoring information and the speed at which the system processes data, ensuring that the user's health state is judged within the specified time; the characteristic monitoring parts ranked first are therefore processed preferentially, which guarantees both the timeliness and the accuracy of the monitoring result.
Further, the monitoring image extraction module is connected with the monitoring image analysis module;
the monitoring image acquisition module acquires a user action image every a first preset time T through the camera.
Further, the monitoring image analysis module obtains the first feature monitoring part set Dn corresponding to the nth user and the user action images acquired by the monitoring image extraction module, and analyzes one by one the degree of association between the feature monitoring part corresponding to each element in Dn and the user action image at each acquisition time. The correlation coefficient between the feature monitoring part corresponding to the j1th element in Dn and the user action image at acquisition time t1 is recorded as G(j1, t1). (Here and below, named symbols such as G(j1, t1) stand in for formulas that appear only as images in the source.)
When acquiring G(j1, t1), the monitoring image analysis module identifies, in the user action image at time t1, the positions of the four limbs and of the feature monitoring part corresponding to the j1th element in Dn:
when the feature monitoring part corresponding to the j1th element in Dn is located on one of the four limbs, the pixel distances between the limb carrying that part and each of the remaining three limbs are calculated;
when the feature monitoring part corresponding to the j1th element in Dn is not located on any of the four limbs, the minimum pixel distance between each of the four limbs and that part is calculated.
The monitoring image analysis module compares the acquired pixel distances with a first preset pixel distance; the average of the acquired pixel distances smaller than the first preset pixel distance is recorded as d(j1, t1), and their number as k(j1, t1). The module also locates the feature monitoring part corresponding to the j1th element in Dn in the user action image at time t1, and records the pixel distance by which that part has moved relative to the image at time t1 − T as s(j1, t1).
From these quantities the monitoring image analysis module obtains the correlation coefficient G(j1, t1), where e1, a first coefficient entering the formula, is a constant greater than 0 obtained by database query.
Within a defined time, the monitoring image analysis module obtains the degree of association GD(t1) between the feature monitoring parts corresponding to Dn and the user action image at time t1, where j2 denotes the number of elements in Dn whose correlation coefficients with the image at time t1 were obtained within the defined time, with 1 ≤ j2 ≤ (number of elements in Dn). The defined time is the upper limit, stored in the database, on the total time for computing the correlation coefficients between the elements of Dn and the user action image at time t1.
If, in the process of acquiring the correlation coefficient between the feature monitoring part corresponding to the J0th element in Dn and the user action image at time t1, the cumulative time from the start of computing the coefficient for the 1st element in Dn through the J0th element is recorded as T(J0) and compared with the defined time, then:
when T(J0) is greater than the defined time while T(J0 − 1) is less than or equal to it, or when T(J0) is greater than the defined time and J0 ≠ 1, then j2 = J0 − 1;
when T(J0) is less than the defined time and J0 is less than the number of elements in Dn, acquisition continues with G(J0 + 1, t1);
when T(J0) is equal to the defined time, or J0 is equal to the number of elements in Dn, or J0 = 1, then j2 = J0.
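A minimal sketch of the time-limited loop above. The per-element coefficient functions are placeholders, and summing the computed coefficients into the degree of association is an assumption (the source gives the combination only as a formula image):

```python
import time

def association_degree(coeff_fns, time_limit):
    """coeff_fns: one callable per element of Dn, in order; each returns the
    correlation coefficient G(j1, t1) for that feature monitoring part.
    time_limit: the defined time, in seconds. Returns (degree, j2), where
    j2 is the number of coefficients obtained within the limit."""
    total, j2 = 0.0, 0
    start = time.monotonic()
    for fn in coeff_fns:
        g = fn()
        elapsed = time.monotonic() - start
        if elapsed > time_limit and j2 >= 1:
            break                  # over budget and J0 != 1: j2 = J0 - 1
        total += g
        j2 += 1
        if elapsed >= time_limit:
            break                  # exactly on budget, or first element: j2 = J0
    return total, j2
```

The loop keeps the patent's branching: an element whose computation overruns the budget is discarded (j2 = J0 − 1) unless it is the very first one, and computation otherwise continues until the budget or the end of Dn is reached.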
When the monitoring image analysis module acquires the correlation coefficient between the feature monitoring part corresponding to the j1th element in Dn and the user action image at time t1, it identifies the positions of the four limbs and of that feature monitoring part because a user whose feature monitoring part is uncomfortable may touch it or move it with a limb, which in turn affects the user's health state. In obtaining d(j1, t1) and k(j1, t1), limbs that are far from the feature monitoring part are taken, by default, not to affect it or the user's health state; the closer a limb is, and the more limbs are in contact, the greater the influence on the feature monitoring part. Together with the movement distance s(j1, t1), these quantities reflect the influence of the limb motion in the image at time t1 on the feature monitoring part corresponding to the j1th element in Dn. In obtaining the degree of association GD(t1) within the defined time, the time limit is imposed to control how long acquiring the degree of association may take and to ensure the timeliness of the finally obtained state value.
Further, in the process of obtaining the degree of association between each feature monitoring part corresponding to Dn and the user action image at time t1, the monitoring image analysis module calibrates the correlation coefficient between the feature monitoring part corresponding to the j1th element in Dn and that image.
The monitoring image analysis module obtains the m user action images preceding time t1. For each window consisting of the first m1 of those images (m1 taking each value from 1 to m), it records the average of the pixel distances between the feature monitoring part corresponding to the j1th element in Dn and the four limbs that are smaller than the first preset pixel distance as dm1(j1), their number as km1(j1), and computes the absolute difference between the limb-contact quantity over the window and the corresponding quantity for the image at time t1, recorded as Δ(m1). (These symbols stand in for formulas that appear only as images in the source.)
When Δ(m1) remains less than or equal to a second preset value for every m1, the module judges that the positions of the four limbs in the image at time t1 are unrelated to the feature monitoring part corresponding to the j1th element in Dn; the correlation coefficient between that part and the image at time t1 must then be calibrated, and the calibrated coefficient equals G(j1, t1) with its limb-contact term removed.
When Δ(m1) exceeds the second preset value for some m1, the module judges that the limb positions in the image at time t1 are related to the feature monitoring part corresponding to the j1th element in Dn, and does not calibrate the correlation coefficient.
The previous m images are obtained in order to check whether the limb positions in the resulting m + 1 user action images are in contact with the feature monitoring part corresponding to the j1th element in Dn, and whether that contact state changes. When contact occurs and stays within the error range across the m + 1 images (Δ(m1) always less than or equal to the second preset value), the limb actions in the image at time t1 are judged not to have changed: they belong to the user's habits and do not affect the user's health state, so the contact term is deleted from the corresponding correlation coefficient, i.e. the calibrated coefficient omits it. This makes the finally obtained user state value more accurate and ensures the accuracy of the monitoring result.
Furthermore, the physiological data analysis module uses the sensor to monitor the user's physiological data in real time; the monitored data are blood pressure and heart rate. The blood pressure of the nth user at time t is recorded as XYnt and the heart rate as XLnt. With T1 the duration of the second unit time, the average blood pressure of the nth user over the second unit time before time t is
XYnt_avg = (1/T1) ∫_{t−T1}^{t} XYnτ dτ,
and the average heart rate over the same interval is
XLnt_avg = (1/T1) ∫_{t−T1}^{t} XLnτ dτ.
The physiological data analysis module then obtains the physiological data fluctuation rate SLBnt of the nth user at time t from the two integral terms
∫_{t−T1}^{t} |XYnτ − XYnt_avg| dτ and ∫_{t−T1}^{t} |XLnτ − XLnt_avg| dτ.
(The source gives the averages and SLBnt only as formula images; the averages above are reconstructed from the surrounding description, while the exact combination of the two integral terms in SLBnt is not reproduced.)
In monitoring the user's physiological data in real time through the sensor, the physiological data analysis module judges the user's health state from the two aspects of blood pressure and heart rate according to how stable the physiological data are: the higher the fluctuation rate, the worse the user's health state. The first integral term reflects the accumulated difference between the user's blood pressure at each time point from time t − T1 to time t and its average value, and so reflects, to a certain extent, how stable the user's blood pressure is over that interval; the second integral term does the same for the user's heart rate.
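A discretized sketch of the fluctuation-rate computation, with sampled readings standing in for the continuous signals; the weighted-sum combination of the blood-pressure and heart-rate terms is an assumption, since the source gives SLBnt only as a formula image:

```python
def fluctuation_rate(bp_samples, hr_samples, dt, r=1.0):
    """bp_samples / hr_samples: blood-pressure and heart-rate readings over
    the second unit time T1 before time t, sampled every dt seconds. The
    integrals of |value - mean| are approximated by Riemann sums, and the
    two terms are combined by an assumed weight r."""
    bp_mean = sum(bp_samples) / len(bp_samples)
    hr_mean = sum(hr_samples) / len(hr_samples)
    bp_term = dt * sum(abs(x - bp_mean) for x in bp_samples)
    hr_term = dt * sum(abs(x - hr_mean) for x in hr_samples)
    return bp_term + r * hr_term
```

Perfectly flat readings give a rate of zero, and larger swings around the interval averages raise it, matching the interpretation that a higher fluctuation rate indicates a worse health state.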
Further, the user state analysis module is connected to the monitoring image analysis module, the physiological data analysis module and the monitoring and early warning module, respectively. The user state analysis module records the state value of the nth user at time t as ZTZnt, obtained from the degree of association and the physiological data fluctuation rate (the explicit expression is given only as a formula image in the source), where r1 is a conversion coefficient between the action association degree and the physiological fluctuation rate, a constant obtained from the database.
When the user state analysis module acquires the state value ZTZnt of the nth user at time t, it considers both the corresponding degree of association and the corresponding physiological data fluctuation rate; because the two influence the user state value to different degrees, they are reconciled through r1 so that their influence on the state value is commensurate.
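A minimal sketch of the state-value combination, assuming a weighted sum via r1 (the exact expression for ZTZnt appears only as a formula image in the source):

```python
def state_value(association_degree, fluctuation_rate, r1):
    """Combine the degree of association and the physiological data
    fluctuation rate into the user state value ZTZnt, reconciling the two
    through the conversion coefficient r1 (weighted sum is an assumption)."""
    return association_degree + r1 * fluctuation_rate
```

With r1 chosen so that typical fluctuation rates and association degrees land on comparable scales, neither quantity dominates the resulting state value.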
Further, the monitoring and early warning module acquires the state value of the nth user at time t and compares it with a third preset value:
when the state value corresponding to the current time of the nth user is greater than or equal to the third preset value, it judges the state value abnormal and immediately gives an alarm;
when the state value corresponding to the current time of the nth user is less than the third preset value, it judges the state value normal, and no alarm is needed.
An intelligent safety care monitoring method, the method comprising the steps of:
s1, the user personal information acquisition module acquires the basic information of the user and each corresponding characteristic monitoring part of the corresponding illness state in the database to obtain a first characteristic monitoring part set Dn corresponding to the nth user;
s2, the monitoring image extraction module acquires the acquisition results of the user action images by the cameras at different times and binds the acquisition time with the corresponding user action images;
s3, the monitoring image analysis module analyzes each action image of the user collected by the monitoring image extraction module to obtain the association degree between each action image and each corresponding characteristic monitoring part of the user;
s4, the physiological data analysis module monitors the physiological data of the user in real time through the sensor and analyzes the monitored physiological data of the user to obtain the fluctuation rate of the physiological data of the user;
s5, the user state analysis module obtains the state value of the user according to the correlation degree between each action image obtained by the monitoring image analysis module and each corresponding characteristic monitoring part of the user and the fluctuation rate of the physiological data of the user obtained by the physiological data analysis module;
and S6, the monitoring and early warning module monitors the user state value obtained by the user state analysis module in real time and carries out early warning when the user state value is abnormal.
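Steps S1 to S6 can be strung together as a hypothetical end-to-end cycle; every stage below is a stub standing in for the corresponding module, and all constants and return values are invented for illustration:

```python
R1, THIRD_PRESET = 0.5, 10.0            # assumed database constants

def acquire_feature_parts(user):        # S1: first feature monitoring set Dn
    return ["chest", "head"]

def extract_images(user):               # S2: time-stamped action images
    return [("t1", "frame-001")]

def analyze_images(images, dn):         # S3: degree of association GD(t1)
    return 4.0

def physiological_fluctuation(user):    # S4: fluctuation rate SLBnt
    return 3.0

def run_monitoring_cycle(user):
    """One pass through S1-S6 for a single user."""
    dn = acquire_feature_parts(user)
    images = extract_images(user)
    degree = analyze_images(images, dn)
    slb = physiological_fluctuation(user)
    ztz = degree + R1 * slb             # S5: assumed weighted-sum combination
    status = "alarm" if ztz >= THIRD_PRESET else "normal"   # S6
    return ztz, status
```

In deployment each stub would be replaced by the corresponding module's real output; the cycle itself only fixes the order of the six steps.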
Compared with the prior art, the invention has the following beneficial effects: the invention not only acquires the user's personal information and the corresponding set of characteristic monitoring parts, but also obtains, by acquiring and analyzing user images, the degree of association between each user action image and each of the user's characteristic monitoring parts; combined with the fluctuation rate of the user's physiological data obtained by the physiological data analysis module, it quantifies the user's health state in real time as a state value. It thus monitors the user's health state effectively, alarms quickly when the state value is abnormal, and notifies the nursing staff to perform the corresponding nursing operation, which reduces the nursing staff's workload, improves their monitoring of the user, and ensures that nursing is timely and effective.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
FIG. 1 is a schematic diagram of an intelligent safety care monitoring system according to the present invention;
FIG. 2 is a schematic flow chart of the calibration of the correlation coefficient between the feature monitoring part corresponding to the j1 th element in Dn and the user action image corresponding to the acquisition time t1 by the intelligent security nursing monitoring system according to the present invention;
fig. 3 is a flow chart of an intelligent safety nursing monitoring method according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, the present invention provides a technical solution: an intelligent safety care monitoring system, comprising:
the user personal information acquisition module, which acquires the basic information of a user and the characteristic monitoring parts corresponding to the user's illnesses in a database, to obtain a first characteristic monitoring part set Dn corresponding to the nth user;
the monitoring image extraction module acquires the acquisition results of the user action images by the cameras at different times and binds the acquisition time with the corresponding user action images;
the monitoring image analysis module analyzes each action image of the user acquired by the monitoring image extraction module to obtain the association degree between each action image and each corresponding characteristic monitoring part of the user;
the physiological data analysis module monitors the physiological data of the user in real time through the sensor and analyzes the monitored physiological data of the user to obtain the fluctuation rate of the physiological data of the user;
the user state analysis module obtains a state value of the user according to the correlation degree between each action image obtained by the monitoring image analysis module and each corresponding characteristic monitoring part of the user and the fluctuation rate of the physiological data of the user obtained by the physiological data analysis module;
and the monitoring and early warning module monitors the user state value obtained by the user state analysis module in real time and performs early warning when the user state value is abnormal.
The user personal information acquisition module is respectively connected with the monitoring image extraction module and the physiological data analysis module;
the user basic information obtained by the user personal information acquisition module comprises the user number and the disease types suffered by the user, the same user having one or more disease types,
the user personal information acquisition module compares each disease suffered by the same user with the database respectively to obtain a set of characteristic monitoring parts corresponding to the corresponding disease types of the user;
the corresponding characteristic monitoring parts of different disease types in the database are different, and the number of the corresponding characteristic monitoring parts of the same disease type in the database is one or more;
the user personal information acquisition module acquires a priority numerical value corresponding to each disease type preset in a database, the number of the disease types corresponding to the same priority numerical value is one or more,
the user personal information acquisition module records a priority value corresponding to the ith disease suffered by the nth user as Ani, records a set of characteristic monitoring parts corresponding to the ith disease suffered by the nth user as Cni, and records a monitoring priority coefficient corresponding to each element in Cni as Bni, wherein the monitoring priority coefficient corresponding to each element in the characteristic monitoring part set is equal to the priority value of the disease type corresponding to the corresponding characteristic monitoring part set, namely Bni equals Ani;
the number of disease types suffered by the nth user is recorded as iZn, so the number of feature monitoring part sets corresponding to the nth user is iZn, with iZn ≥ 1;
the user personal information acquisition module records the union of the iZn feature monitoring part sets corresponding to the nth user as Cn, obtains the relation value Cn_j between the jth element in Cn and the corresponding iZn feature monitoring part sets, and obtains the first feature monitoring part set Dn corresponding to the nth user,
the Cn_j = F(j, Cn1) + F(j, Cn2) + … + F(j, CniZn), that is, the sum of F(j, Cni) over i = 1 to iZn,
wherein F(j, Cni) represents the relation value between the jth element in Cn and the feature monitoring part set Cni,
when the jth element in Cn belongs to the feature monitoring part set Cni, F(j, Cni) = Bni,
when the jth element in Cn does not belong to the feature monitoring part set Cni, F(j, Cni) = 0;
comparing the relationship values of any two different elements in Cn with the corresponding iZn feature monitoring part sets respectively,
if the relationship values of the two elements and the corresponding iZn feature monitoring part sets are different, adjusting the position between the two elements, arranging the element with the larger relationship value in front of the element with the smaller relationship value,
if the two elements have the same relation values with the corresponding iZn feature monitoring part sets, the position between the two elements is further adjusted according to the priority values of the feature monitoring parts preset in the database, the element with the larger priority value being arranged in front of the element with the smaller priority value, wherein the priority values corresponding to different feature monitoring parts preset in the database are all different,
and recording the corresponding set after the element position adjustment in the Cn is finished as a first feature monitoring part set Dn corresponding to the nth user.
In this embodiment, for example, the 01st user suffers from two diseases, where the 1st disease corresponds to the first feature monitoring part and the second feature monitoring part, and the 2nd disease corresponds to the first feature monitoring part and the third feature monitoring part;
the set for the 1st disease of the 01st user is then recorded as C011 = {first feature monitoring part, second feature monitoring part}, and the set for the 2nd disease of the 01st user as C012 = {first feature monitoring part, third feature monitoring part};
if the priority value A011 of the 1st disease is 2 and the priority value A012 of the 2nd disease is 1,
then the 01st user corresponds to C01 = {first feature monitoring part, second feature monitoring part, third feature monitoring part};
since the 1st element in C01 belongs to C011, F(1, C011) = 2;
since the 1st element in C01 belongs to C012, F(1, C012) = 1;
since the 2nd element in C01 belongs to C011, F(2, C011) = 2;
since the 2nd element in C01 does not belong to C012, F(2, C012) = 0;
since the 3rd element in C01 does not belong to C011, F(3, C011) = 0;
since the 3rd element in C01 belongs to C012, F(3, C012) = 1;
the relation value of the 1st element in C01 with the corresponding C011 and C012 is C01_1 = F(1, C011) + F(1, C012) = 2 + 1 = 3,
the relation value of the 2nd element in C01 with the corresponding C011 and C012 is C01_2 = F(2, C011) + F(2, C012) = 2 + 0 = 2,
the relation value of the 3rd element in C01 with the corresponding C011 and C012 is C01_3 = F(3, C011) + F(3, C012) = 0 + 1 = 1,
and since 3 > 2 > 1, C01_1 > C01_2 > C01_3; that is, the first feature monitoring part set D01 corresponding to the 01st user is {first feature monitoring part, second feature monitoring part, third feature monitoring part}.
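For illustration, the construction of Dn described above (union of the per-disease sets, relation values, and ordering) can be sketched in Python. This is a minimal sketch, not from the patent itself: all function and variable names are illustrative, and the per-part tie-break priority table is an assumed value.

```python
def build_Dn(disease_sets, priority_values, part_priority):
    """Sketch of the Dn construction described above.
    disease_sets: the sets Cni of feature monitoring parts, one per disease;
    priority_values: the priority values Ani (= Bni), one per disease;
    part_priority: assumed per-part tie-break priorities preset in the database.
    """
    # Union Cn of all feature monitoring part sets, keeping first-seen order
    Cn = []
    for Cni in disease_sets:
        for part in Cni:
            if part not in Cn:
                Cn.append(part)
    # Relation value Cn_j = sum of F(j, Cni), where F(j, Cni) = Ani if the
    # jth element belongs to Cni and 0 otherwise
    relation = {part: sum(Ani for Cni, Ani in zip(disease_sets, priority_values)
                          if part in Cni)
                for part in Cn}
    # Larger relation value first; ties broken by the preset part priority
    return sorted(Cn, key=lambda p: (-relation[p], -part_priority[p]))

# Worked example from the description: user 01 with two diseases
C011 = ["first part", "second part"]   # disease 1, A011 = 2
C012 = ["first part", "third part"]    # disease 2, A012 = 1
tie_break = {"first part": 3, "second part": 2, "third part": 1}  # assumed
D01 = build_Dn([C011, C012], [2, 1], tie_break)
```

With these inputs the relation values are 3, 2 and 1, so D01 orders the parts as in the worked example above.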
The monitoring image extraction module is connected with the monitoring image analysis module;
the monitoring image acquisition module acquires a user action image every a first preset time T through the camera.
The monitoring image analysis module obtains the first feature monitoring part set Dn corresponding to the nth user and the user action images acquired by the monitoring image extraction module, analyzes one by one the association degree between the feature monitoring part corresponding to each element in Dn and the user action image corresponding to each acquisition time, and records the association coefficient between the feature monitoring part corresponding to the j1th element in Dn and the user action image corresponding to acquisition time t1 as GX(j1, t1).
When acquiring the association coefficient between the feature monitoring part corresponding to the j1th element in Dn and the user action image corresponding to acquisition time t1, the monitoring image analysis module identifies in that image the positions of the four limbs and of the feature monitoring part corresponding to the j1th element in Dn,
when the feature monitoring part corresponding to the j1th element in Dn is located on one of the four limbs, the pixel distances in the user action image corresponding to acquisition time t1 between the limb on which that feature monitoring part lies and each of the remaining three limbs are calculated respectively,
when the feature monitoring part corresponding to the j1th element in Dn is not located on any of the four limbs, the minimum pixel distances in the user action image corresponding to acquisition time t1 between each of the four limbs and that feature monitoring part are calculated respectively,
the monitoring image analysis module compares the acquired pixel distance with a first preset pixel distance, and records the average value of all pixel distances smaller than the first preset pixel distance in the acquired pixel distance as
Figure BDA0003637025770000121
Recording the number of the acquired pixel distances smaller than a first preset pixel distance
Figure BDA0003637025770000122
The monitoring image analysis module obtains a characteristic monitoring part corresponding to the j1 th element in Dn in the user action image corresponding to the acquisition time T1, and the pixel distance of the movement of the characteristic monitoring part is recorded as the pixel distance of the movement of the characteristic monitoring part corresponding to the j1 th element in Dn in the user action image corresponding to the acquisition time T1-T
Figure BDA0003637025770000123
The monitoring image analysis module then obtains the association coefficient GX(j1, t1) between the feature monitoring part corresponding to the j1th element in Dn and the user action image corresponding to acquisition time t1 as a function of the average pixel distance P(j1, t1), the number N(j1, t1), the movement pixel distance Y(j1, t1), and the first coefficient e1 [the explicit expression appears only as a formula image in the source], where e1 is a constant greater than 0 obtained by a database query.
the monitoring image analysis module obtains the association degree between each characteristic monitoring part corresponding to Dn in limited time and the user action image corresponding to the acquisition time t1
Figure BDA0003637025770000125
Wherein j2 represents the number of elements in the defined time to obtain the correlation coefficient between the images of the user action corresponding to the acquisition time t1 in Dn, j2 is more than or equal to 1, j2 is less than or equal to the number of elements in Dn,
the limited time is the total upper limit time of the correlation coefficient between the characteristic monitoring part corresponding to each element in the corresponding calculation Dn in the database and the user action image corresponding to the acquisition time t1,
if in the process of acquiring the correlation coefficient between the characteristic monitoring part corresponding to the J0 th element in Dn and the user action image corresponding to the acquisition time t1,
and the sum of the time from the calculation of the correlation coefficient between the feature monitoring part corresponding to the 1 st element in the Dn and the user motion image corresponding to the acquisition time t1 to the calculation of the correlation coefficient between the feature monitoring part corresponding to the J0 th element in the Dn and the user motion image corresponding to the acquisition time t1 is recorded as
Figure BDA0003637025770000126
Will be provided with
Figure BDA0003637025770000127
And a comparison is made with the defined time,
when in use
Figure BDA0003637025770000128
Greater than a defined time and
Figure BDA0003637025770000129
less than or equal to a defined time, or
Figure BDA00036370257700001210
If the time is longer than the limit time and J0 ≠ 1, J2 ═ J0-1;
when in use
Figure BDA00036370257700001211
When the time is less than the limit time and J0 is less than the number of elements in Dn, the acquisition is continued
Figure BDA00036370257700001212
When in use
Figure BDA00036370257700001213
When the time is equal to the defined time or J0 is equal to the number of elements in Dn or J0 is equal to 1, then J2 is equal to J0.
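The time-limited accumulation described above can be sketched as follows. This is an illustrative sketch only: the coefficient calculation itself is given in the source only as a formula image, so `compute_gx` is a hypothetical stand-in, and all names are assumptions.

```python
import time

def association_degree(Dn, action_image, time_limit, compute_gx):
    """Accumulate association coefficients element by element until the
    cumulative computation time SJ exceeds the defined time limit or Dn is
    exhausted; returns the association degree and j2, the number of
    coefficients actually summed. compute_gx stands in for the patent's
    coefficient formula (given only as an image in the source)."""
    degree = 0.0   # association degree for this action image
    elapsed = 0.0  # cumulative computation time SJ(J0)
    j2 = 0
    for j1, part in enumerate(Dn, start=1):
        start = time.perf_counter()
        gx = compute_gx(part, action_image)
        elapsed += time.perf_counter() - start
        if elapsed > time_limit and j1 > 1:
            break            # over budget with J0 != 1: j2 = J0 - 1
        degree += gx         # J0 = 1 is always counted, even if over budget
        j2 = j1
        if elapsed >= time_limit:
            break            # exactly at the limit: j2 = J0
    return degree, j2

# Example with a dummy coefficient function and a generous time budget
gd, j2 = association_degree(["head", "left arm", "chest"], None, 10.0,
                            lambda part, img: 1.0)
```

With the generous budget the loop visits all three elements, so j2 equals the number of elements in Dn.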
As shown in fig. 2, in the process of obtaining the association degree between each feature monitoring part corresponding to Dn and the user action image corresponding to acquisition time t1, the monitoring image analysis module also calibrates the association coefficient between the feature monitoring part corresponding to the j1th element in Dn and the user action image corresponding to acquisition time t1:
the monitoring image analysis module obtains the m user action images preceding acquisition time t1; for the first m1 of these images, it records the average value of those pixel distances between the feature monitoring part corresponding to the j1th element in Dn and the four limbs that are smaller than the first preset pixel distance, records the number of such pixel distances, and computes from them a reference quantity [the expression appears only as a formula image in the source];
letting m1 take each different value among the m user action images preceding acquisition time t1, it calculates for each m1 the absolute value of the difference between this reference quantity and the corresponding quantity for acquisition time t1, recorded as ΔG_m1(j1, t1);
when ΔG_m1(j1, t1) is less than or equal to the second preset value for every value of m1, it is judged that the positions of the four limbs in the user action image corresponding to acquisition time t1 are not related to the feature monitoring part corresponding to the j1th element in Dn; the association coefficient between that feature monitoring part and the user action image corresponding to acquisition time t1 then needs to be calibrated, the calibrated association coefficient being taken accordingly [its value appears only as a formula image in the source];
when ΔG_m1(j1, t1) is greater than the second preset value for some value of m1, it is judged that the positions of the four limbs in the user action image corresponding to acquisition time t1 are related to the feature monitoring part corresponding to the j1th element in Dn, and the association coefficient between that feature monitoring part and the user action image corresponding to acquisition time t1 is not calibrated.
The physiological data analysis module monitors the physiological data of the user in real time through the sensor; the monitored data are blood pressure and heart rate,
the blood pressure corresponding to the nth user at time t is recorded as XYnt, and the heart rate corresponding to the nth user at time t is recorded as XLnt,
the average value of the blood pressure of the nth user in the second unit time before time t is recorded as avgXYnt, i.e. the mean of all blood pressure values of the nth user monitored in the interval (t − T1, t],
the average value of the heart rate of the nth user in the second unit time before time t is recorded as avgXLnt, i.e. the mean of all heart rate values of the nth user monitored in the interval (t − T1, t],
and the time duration corresponding to the second unit time is T1.
the physiological data analysis module obtains the physiological data fluctuation rate SLBnt corresponding to the nth user at the time t,
is marked as
Figure BDA00036370257700001314
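As a sketch, the fluctuation-rate computation can be expressed as follows. The relative-deviation form of SLBnt is an assumed reading, since the source gives the formula only as an image, and the sample values are purely illustrative.

```python
def fluctuation_rate(bp_window, hr_window, bp_now, hr_now):
    """Physiological data fluctuation rate SLBnt, assumed (the source shows
    the formula only as an image) to be the sum of the relative deviations
    of the current blood pressure XYnt and heart rate XLnt from their
    averages over the second unit time T1 before time t."""
    avg_bp = sum(bp_window) / len(bp_window)  # avgXYnt over (t - T1, t]
    avg_hr = sum(hr_window) / len(hr_window)  # avgXLnt over (t - T1, t]
    return abs(bp_now - avg_bp) / avg_bp + abs(hr_now - avg_hr) / avg_hr

# Illustrative samples: steady window, current values slightly elevated
rate = fluctuation_rate([120, 118, 122], [70, 72, 68], 126, 77)
```

Here the averages are 120 and 70, so the relative deviations 0.05 and 0.1 give a fluctuation rate of 0.15.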
The user state analysis module is respectively connected with the monitoring image analysis module, the physiological data analysis module and the monitoring early warning module;
the user state analysis module records the state value corresponding to the nth user at the time t as ZTZnt,
the above-mentioned
Figure BDA0003637025770000141
Wherein r1 is a conversion coefficient of the action correlation degree and the physiological fluctuation rate, and r1 is a constant obtained from the database.
The monitoring and early warning module acquires a state value corresponding to the nth user at the time t, compares the corresponding state value with a third preset value,
when the state value corresponding to the current time of the nth user is greater than or equal to a third preset value, judging that the state value corresponding to the current time of the nth user is abnormal, and immediately giving an alarm;
and when the state value corresponding to the current time of the nth user is less than the third preset value, judging that the state value corresponding to the current time of the nth user is normal, without alarming.
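The state evaluation and early-warning rule can be sketched as follows. The additive combination in `state_value` is an assumption (the patent's state-value formula appears only as an image in the source), and the threshold and sample values are illustrative.

```python
def state_value(gd, slb, r1):
    """State value ZTZnt combining the action association degree gd and the
    physiological fluctuation rate slb via the conversion coefficient r1.
    The additive form is an assumed reading of the source's formula image."""
    return gd + r1 * slb

def needs_alarm(ztz, third_preset):
    """Alarm when the state value reaches or exceeds the third preset value."""
    return ztz >= third_preset

ztz = state_value(1.0, 0.15, 2.0)  # 1.0 + 2.0 * 0.15 = 1.3
alarm = needs_alarm(ztz, 1.2)      # abnormal: alarm immediately
```

A state value below the threshold (e.g. `needs_alarm(1.0, 1.2)`) is treated as normal and raises no alarm.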
Referring to fig. 3, an intelligent safety care monitoring method includes the following steps:
S1, the user personal information acquisition module acquires the basic information of the user and each corresponding characteristic monitoring part of the corresponding illness in the database to obtain a first characteristic monitoring part set Dn corresponding to the nth user;
S2, the monitoring image extraction module acquires the acquisition results of the user action images by the cameras at different times and binds the acquisition time with the corresponding user action images;
S3, the monitoring image analysis module analyzes each action image of the user collected by the monitoring image extraction module to obtain the association degree between each action image and each corresponding characteristic monitoring part of the user;
S4, the physiological data analysis module monitors the physiological data of the user in real time through the sensor and analyzes the monitored physiological data of the user to obtain the fluctuation rate of the physiological data of the user;
S5, the user state analysis module obtains the state value of the user according to the association degree between each action image obtained by the monitoring image analysis module and each corresponding characteristic monitoring part of the user and the fluctuation rate of the physiological data of the user obtained by the physiological data analysis module;
and S6, the monitoring and early warning module monitors the user state value obtained by the user state analysis module in real time and carries out early warning when the user state value is abnormal.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Finally, it should be noted that: although the present invention has been described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that changes may be made in the embodiments and/or equivalents thereof without departing from the spirit and scope of the invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (9)

1. An intelligent safety care monitoring system, comprising:
the user personal information acquisition module, which acquires the basic information of a user and the characteristic monitoring parts corresponding to the user's illnesses in a database, to obtain a first characteristic monitoring part set Dn corresponding to the nth user;
the monitoring image extraction module acquires the acquisition results of the user action images by the cameras at different times and binds the acquisition time with the corresponding user action images;
the monitoring image analysis module is used for analyzing each action image of the user, which is acquired by the monitoring image extraction module, so as to obtain the association degree between each action image and each corresponding characteristic monitoring part of the user;
the physiological data analysis module monitors the physiological data of the user in real time through the sensor and analyzes the monitored physiological data of the user to obtain the fluctuation rate of the physiological data of the user;
the user state analysis module obtains a state value of the user according to the correlation degree between each action image obtained by the monitoring image analysis module and each corresponding characteristic monitoring part of the user and the fluctuation rate of the physiological data of the user obtained by the physiological data analysis module;
and the monitoring and early warning module monitors the user state value obtained by the user state analysis module in real time and performs early warning when the user state value is abnormal.
2. The intelligent safety care monitoring system of claim 1, wherein: the user personal information acquisition module is respectively connected with the monitoring image extraction module and the physiological data analysis module;
the user basic information obtained by the user personal information acquisition module comprises a user number and the disease types suffered by the user, the same user having one or more disease types,
the user personal information acquisition module compares each disease suffered by the same user with the database respectively to obtain a set of characteristic monitoring parts corresponding to the corresponding disease types of the user;
the corresponding characteristic monitoring parts of different disease types in the database are different, and the number of the corresponding characteristic monitoring parts of the same disease type in the database is one or more;
the user personal information acquisition module acquires a priority numerical value corresponding to each disease type preset in a database, the number of the disease types corresponding to the same priority numerical value is one or more,
the user personal information acquisition module records a priority value corresponding to the ith disease suffered by the nth user as Ani, records a set of characteristic monitoring parts corresponding to the ith disease suffered by the nth user as Cni, and records a monitoring priority coefficient corresponding to each element in Cni as Bni, wherein the monitoring priority coefficient corresponding to each element in the characteristic monitoring part set is equal to the priority value of the disease type corresponding to the corresponding characteristic monitoring part set, namely Bni equals Ani;
recording the number of the disease types suffered by the nth user as iZn, wherein the number of the feature monitoring part sets corresponding to the nth user is iZn, iZn is more than or equal to 1,
the user personal information acquisition module records a union set of iZn feature monitoring part sets corresponding to the nth user as Cn; obtaining a relation value Cn corresponding to the jth element in Cn and the corresponding iZn feature monitoring part sets j And a first feature monitoring site set Dn corresponding to the nth user,
the above-mentioned
Figure FDA0003637025760000021
Wherein F (j, Cni) represents the corresponding relation value of the j-th element in Cn and the characteristic monitoring part set Cni,
when the jth element in Cn belongs to the feature monitoring site set Cni, then F (j, Cni) is Bni,
when the jth element in Cn does not belong to the feature monitoring site set Cni, F (j, Cni) is 0;
comparing the relationship values of any two different elements in Cn with the corresponding iZn feature monitoring part sets respectively,
if the relationship values of the two elements and the corresponding iZn feature monitoring part sets are different, adjusting the position between the two elements, arranging the element with the larger relationship value in front of the element with the smaller relationship value,
if the two elements are respectively the same as the corresponding relation values of the corresponding iZn feature monitoring part sets, the position between the two elements is adjusted, the element with the higher priority value is arranged in front of the element with the lower priority value, the priority values corresponding to different feature monitoring parts preset in the database are all different,
and recording the corresponding set after the element position adjustment in the Cn is finished as a first feature monitoring part set Dn corresponding to the nth user.
3. The intelligent safety care monitoring system of claim 1, wherein: the monitoring image extraction module is connected with the monitoring image analysis module;
the monitoring image acquisition module acquires a user action image every a first preset time T through the camera.
4. The intelligent safety care monitoring system of claim 2, wherein: the monitoring image analysis module obtains the first feature monitoring part set Dn corresponding to the nth user and the user action images acquired by the monitoring image extraction module, analyzes one by one the association degree between the feature monitoring part corresponding to each element in Dn and the user action image corresponding to each acquisition time, and records the association coefficient between the feature monitoring part corresponding to the j1th element in Dn and the user action image corresponding to acquisition time t1 as GX(j1, t1);
when acquiring the association coefficient between the feature monitoring part corresponding to the j1th element in Dn and the user action image corresponding to acquisition time t1, the monitoring image analysis module identifies in that image the positions of the four limbs and of the feature monitoring part corresponding to the j1th element in Dn,
when the feature monitoring part corresponding to the j1th element in Dn is located on one of the four limbs, the pixel distances in the user action image corresponding to acquisition time t1 between the limb on which that feature monitoring part lies and each of the remaining three limbs are calculated respectively,
when the feature monitoring part corresponding to the j1th element in Dn is not located on any of the four limbs, the minimum pixel distances in the user action image corresponding to acquisition time t1 between each of the four limbs and that feature monitoring part are calculated respectively,
the monitoring image analysis module compares the acquired pixel distance with a first preset pixel distance, and records the average value of all pixel distances smaller than the first preset pixel distance in the acquired pixel distance as
Figure FDA0003637025760000031
Recording the number of the acquired pixel distances smaller than a first preset pixel distance
Figure FDA0003637025760000032
The monitoring image analysis module obtains a characteristic monitoring part corresponding to the j1 th element in the Dn of the user action image corresponding to the acquisition time T1, and the pixel distance of the movement of the characteristic monitoring part corresponding to the j1 th element in the Dn of the user action image corresponding to the acquisition time T1-T is recorded as the pixel distance of the movement of the characteristic monitoring part
Figure FDA0003637025760000033
the monitoring image analysis module obtains the association coefficient GX(j1, t1) between the feature monitoring part corresponding to the j1th element in Dn and the user action image corresponding to acquisition time t1 as a function of the average pixel distance P(j1, t1), the number N(j1, t1), the movement pixel distance Y(j1, t1), and the first coefficient e1 [the explicit expression appears only as a formula image in the source],
wherein e1 is a constant greater than 0 obtained by a database query;
the monitoring image analysis module obtains the association degree between each characteristic monitoring part corresponding to Dn in limited time and the user action image corresponding to the acquisition time t1
Figure FDA0003637025760000035
Wherein j2 represents the number of elements in the defined time to obtain the correlation coefficient between the images of the user action corresponding to the acquisition time t1 in Dn, j2 is more than or equal to 1, j2 is less than or equal to the number of elements in Dn,
the limited time is the total upper limit time of the correlation coefficient between the characteristic monitoring part corresponding to each element in the corresponding calculation Dn in the database and the user action image corresponding to the acquisition time t1,
if, in the process of obtaining the correlation coefficient between the characteristic monitoring part corresponding to the J0 th element in Dn and the user action image corresponding to the acquisition time t1,
the total time from the calculation of the correlation coefficient between the characteristic monitoring part corresponding to the 1st element in Dn and the user action image corresponding to the acquisition time t1 up to the calculation of the correlation coefficient between the characteristic monitoring part corresponding to the J0 th element in Dn and the user action image corresponding to the acquisition time t1 is recorded as [formula FDA0003637025760000041],
then [formula FDA0003637025760000042] is compared with the limited time:
when [formula FDA0003637025760000043] is greater than the limited time and [formula FDA0003637025760000044] is less than or equal to the limited time, or [formula FDA0003637025760000045] is greater than the limited time and J0 ≠ 1, then j2 = J0-1;
when [formula FDA0003637025760000046] is less than the limited time and J0 is less than the number of elements in Dn, the acquisition of [formula FDA0003637025760000047] continues;
when [formula FDA0003637025760000048] is equal to the limited time, or J0 is equal to the number of elements in Dn, or J0 = 1, then j2 = J0.
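The limited-time cut-off rule above (stop computing correlation coefficients once the cumulative computation time exceeds the limited time, and set j2 accordingly) can be sketched as follows. This is an illustrative reading of the claim, not the patent's implementation; the names `elapsed_times` and `limited_time` are hypothetical.

```python
def elements_within_limit(elapsed_times, limited_time):
    """Return j2: the number of leading elements of Dn whose correlation
    coefficients can be computed before the cumulative computation time
    exceeds limited_time (per the claim, J0 = 1 still yields j2 = 1)."""
    total = 0.0
    j2 = 0
    for j0, t in enumerate(elapsed_times, start=1):
        total += t
        if total > limited_time:
            # Cumulative time exceeded: fall back to the previous element
            # count, except when the very first element already exceeds it.
            j2 = j0 - 1 if j0 > 1 else 1
            break
        j2 = j0  # cumulative time still within (or equal to) the limit
    return j2
```

For example, with per-element times `[1.0, 2.0, 3.0]` and a limit of `4.0`, the third element pushes the total to 6.0, so only the first two coefficients count toward the association degree.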
5. The intelligent safety care monitoring system of claim 4, wherein: in the process of obtaining the association degree between each characteristic monitoring part corresponding to Dn and the user action image corresponding to the acquisition time t1, the monitoring image analysis module also calibrates the correlation coefficient between the characteristic monitoring part corresponding to the j1 th element in Dn and the user action image corresponding to the acquisition time t1;
the monitoring image analysis module obtains the first m user action images preceding the acquisition time t1 and, within the first m1 of those images, records the average of the pixel distances between the characteristic monitoring part corresponding to the j1 th element in Dn and the four limbs that are smaller than the first preset pixel distance as [formula FDA0003637025760000049], and records the number of the acquired pixel distances smaller than the first preset pixel distance as [formula FDA00036370257600000410];
it then computes [formula FDA00036370257600000411];
with m1 taking different values over the first m user action images preceding the acquisition time t1, the absolute value of the difference between [formula FDA00036370257600000412] and [formula FDA00036370257600000413] is computed for each m1 and recorded as [formula FDA00036370257600000414];
when, for every value of m1, the corresponding [formula FDA00036370257600000415] is less than or equal to the second preset value, the positions of the four limbs in the user action image corresponding to the acquisition time t1 are judged to be unrelated to the characteristic monitoring part corresponding to the j1 th element in Dn, the correlation coefficient between that characteristic monitoring part and the user action image corresponding to the acquisition time t1 needs to be calibrated, and the calibrated correlation coefficient is [formula FDA00036370257600000416];
when, for some value of m1, the corresponding [formula FDA00036370257600000417] is greater than the second preset value, the positions of the four limbs in the user action image corresponding to the acquisition time t1 are judged to be related to the characteristic monitoring part corresponding to the j1 th element in Dn, and the correlation coefficient between that characteristic monitoring part and the user action image corresponding to the acquisition time t1 is not calibrated.
6. The intelligent safety care monitoring system of claim 4, wherein: the physiological data analysis module monitors the physiological data of the user in real time through sensors, the monitored data being blood pressure and heart rate;
the blood pressure of the nth user at the time t is recorded as XYnt, and the heart rate of the nth user at the time t is recorded as XLnt;
the average value of the blood pressure of the nth user over the second unit time before the time t is calculated and recorded as [formula FDA0003637025760000051], where [formula FDA0003637025760000052];
the average value of the heart rate of the nth user over the second unit time before the time t is calculated and recorded as [formula FDA0003637025760000053], where [formula FDA0003637025760000054];
the second unit time corresponds to a time period T1;
the physiological data analysis module obtains the physiological data fluctuation rate SLBnt of the nth user at the time t, recorded as [formula FDA0003637025760000055].
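The fluctuation-rate formula survives only as an image in the source, so its exact form is unknown. One plausible form consistent with the surrounding definitions (current readings XYnt and XLnt measured against their second-unit-time averages) is the mean relative deviation. A hedged sketch; all names and the formula itself are assumptions:

```python
def fluctuation_rate(xy_t, xl_t, xy_avg, xl_avg):
    """Assumed form of SLBnt (the patent's formula is not recoverable):
    the average of the relative deviations of the current blood pressure
    and heart rate from their second-unit-time means."""
    return (abs(xy_t - xy_avg) / xy_avg + abs(xl_t - xl_avg) / xl_avg) / 2
```

Readings equal to their averages give a rate of 0, and the rate grows with either vital sign's deviation, which matches the role SLBnt plays in claim 7.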
7. The intelligent safety care monitoring system of claim 6, wherein: the user state analysis module is connected to the monitoring image analysis module, the physiological data analysis module and the monitoring early warning module respectively;
the user state analysis module records the state value of the nth user at the time t as ZTZnt, where [formula FDA0003637025760000056], in which r1 is a conversion coefficient between the action association degree and the physiological fluctuation rate, r1 being a constant obtained from the database.
8. The intelligent safety care monitoring system of claim 7, wherein: the monitoring early warning module acquires the state value of the nth user at the time t and compares it with a third preset value;
when the state value corresponding to the current time of the nth user is greater than or equal to the third preset value, the state of the nth user at the current time is judged to be abnormal, and an alarm is given immediately;
when the state value corresponding to the current time of the nth user is less than the third preset value, the state of the nth user at the current time is judged to be normal, and no alarm is given.
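Claims 7 and 8 combine the action association degree and the physiological fluctuation rate into a state value and alarm when it reaches the third preset value. The exact ZTZnt formula is an image; an additive combination scaled by r1 is assumed here, and all names are hypothetical:

```python
def state_value(association, fluctuation, r1):
    """Assumed ZTZnt form: the association degree plus the fluctuation
    rate converted onto the same scale by the coefficient r1."""
    return association + r1 * fluctuation

def should_alarm(ztz, third_preset):
    """Claim 8's decision: the state is abnormal (alarm) iff the state
    value is greater than or equal to the third preset value."""
    return ztz >= third_preset
```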
9. An intelligent safety care monitoring method using the intelligent safety care monitoring system according to any one of claims 1-8, characterized in that the method comprises the following steps:
S1, the user personal information acquisition module acquires the basic information of the user and, from the database, each characteristic monitoring part corresponding to the user's illness state, obtaining the first characteristic monitoring part set Dn corresponding to the nth user;
S2, the monitoring image extraction module acquires the user action images captured by the cameras at different times and binds each acquisition time to the corresponding user action image;
S3, the monitoring image analysis module analyzes each user action image collected by the monitoring image extraction module to obtain the association degree between each action image and each corresponding characteristic monitoring part of the user;
S4, the physiological data analysis module monitors the physiological data of the user in real time through the sensors and analyzes the monitored data to obtain the fluctuation rate of the physiological data of the user;
S5, the user state analysis module obtains the state value of the user from the association degrees obtained by the monitoring image analysis module and the fluctuation rate obtained by the physiological data analysis module;
S6, the monitoring and early warning module monitors the user state value obtained by the user state analysis module in real time and gives an early warning when the user state value is abnormal.
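Steps S1-S6 describe a per-tick pipeline from image and sensor analysis to early warning. A structural sketch of the last two steps (the module internals are stubbed with an assumed additive fusion; every name here is hypothetical):

```python
def run_monitoring_tick(association_degree, fluctuation_rate, r1, third_preset):
    """S5 + S6 of the claimed method: fuse the image-association degree
    (output of S3) and the physiological fluctuation rate (output of S4)
    into a state value, then decide whether the early-warning module
    should raise an alarm."""
    state = association_degree + r1 * fluctuation_rate  # assumed fusion form
    return {"state": state, "alarm": state >= third_preset}
```

A real system would call this once per acquisition time t, feeding it the latest camera-analysis and sensor-analysis outputs.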
CN202210504779.5A 2022-05-10 2022-05-10 Intelligent safety nursing monitoring system and method Active CN114886417B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210504779.5A CN114886417B (en) 2022-05-10 2022-05-10 Intelligent safety nursing monitoring system and method


Publications (2)

Publication Number Publication Date
CN114886417A true CN114886417A (en) 2022-08-12
CN114886417B CN114886417B (en) 2023-09-22


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005101279A2 (en) * 2004-04-12 2005-10-27 Baxter International Inc. System and method for medical data tracking, analysis and reporting for a healthcare system
JP2010273756A (en) * 2009-05-27 2010-12-09 Yoshida Dental Mfg Co Ltd Temporomandibular arthrosis diagnosis support system and apparatus equipped with pain detector
WO2012153744A1 (en) * 2011-05-12 2012-11-15 日本電気株式会社 Information processing device, information processing method, and information processing program
JP2015197803A (en) * 2014-04-01 2015-11-09 キヤノン株式会社 Behavior record device, behavior record method and program
JP2016080671A (en) * 2014-10-20 2016-05-16 純一 水澤 Robot measuring apparatus measuring human motions
CN109558824A (en) * 2018-11-23 2019-04-02 卢伟涛 A kind of body-building movement monitoring and analysis system based on personnel's image recognition
CN113569996A (en) * 2021-08-30 2021-10-29 平安医疗健康管理股份有限公司 Method, device, equipment and storage medium for classifying medical record information
WO2021218542A1 (en) * 2020-04-28 2021-11-04 腾讯科技(深圳)有限公司 Visual perception device based spatial calibration method and apparatus for robot body coordinate system, and storage medium
CN114129151A (en) * 2021-11-30 2022-03-04 心智动科技(深圳)有限公司 Method for defining human body action, posture and each joint relation by visual recognition




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant