CN114886417B - Intelligent safety nursing monitoring system and method - Google Patents


Info

Publication number: CN114886417B (application CN202210504779.5A)
Authority: CN (China)
Prior art keywords: user, monitoring, time, feature, analysis module
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN114886417A
Inventors: 倪顺康, 仲恒平, 陈国春
Current and original assignee: Nanjing Bullteam Medical Technology Development Co ltd
Application filed by Nanjing Bullteam Medical Technology Development Co ltd; priority to CN202210504779.5A; published as CN114886417A, granted as CN114886417B.

Classifications

    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/1128 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb, using a particular sensing technique using image analysis
    • A61B5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B5/746 Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms


Abstract

The invention discloses an intelligent safety nursing monitoring system and method. A monitoring image analysis module analyzes each action image of a user acquired by a monitoring image extraction module to obtain the association degree between each action image and each corresponding feature monitoring part of the user; a physiological data analysis module monitors the user's physiological data in real time through sensors and analyzes the monitored data to obtain the fluctuation rate of the user's physiological data. The invention thereby monitors the user's health state effectively, raises an alarm quickly when the state value is abnormal, and notifies the nursing staff to perform the corresponding nursing operation, which not only reduces the working intensity of the nursing staff but also improves their monitoring effect and ensures timely and effective care of the user.

Description

Intelligent safety nursing monitoring system and method
Technical Field
The invention relates to the technical field of health monitoring, in particular to an intelligent safety nursing monitoring system and method.
Background
With the development of science and technology and the improvement of living standards, people pay more attention to their own health, and the elderly and patients in particular require safety care. In current elderly-care and medical institutions, patients or the elderly are usually cared for manually, i.e. nursing staff watch over their state. However, because the number of nursing staff is limited and far lower than the number of patients and elderly people, nursing staff cannot monitor their health state in real time, and this mode imposes a high working intensity on the nursing staff; caring for patients or the elderly purely by manual means therefore has obvious defects.
In view of the foregoing, there is a need for an intelligent safety care monitoring system and method.
Disclosure of Invention
The invention aims to provide an intelligent safety nursing monitoring system and method for solving the problems in the background technology.
In order to solve the technical problems, the invention provides the following technical scheme: an intelligent safety care monitoring system, comprising:
the system comprises a user personal information acquisition module, a database and a storage module, wherein the user personal information acquisition module acquires basic information of a user and each feature monitoring part corresponding to a corresponding illness state in the database to obtain a first feature monitoring part set Dn corresponding to an nth user;
the monitoring image extraction module, which acquires the camera's acquisition results of the user's action images at different times and binds each acquisition time with the corresponding user action image;
the monitoring image analysis module is used for analyzing each action image of the user acquired by the monitoring image extraction module to obtain the association degree between each action image and each corresponding characteristic monitoring part of the user;
the physiological data analysis module monitors physiological data of a user in real time through a sensor and analyzes the monitored physiological data of the user to obtain the fluctuation rate of the physiological data of the user;
the user state analysis module is used for obtaining a state value of the user according to the association degree between each action image obtained by the monitoring image analysis module and each corresponding characteristic monitoring part of the user and the fluctuation rate of the physiological data of the user obtained by the physiological data analysis module;
and the monitoring and early warning module is used for monitoring the user state value obtained by the user state analysis module in real time and carrying out early warning when the user state value is abnormal.
Through the cooperation of the modules, the invention acquires the user's personal information and the corresponding feature monitoring part set, obtains the association degree between each user action image and each corresponding feature monitoring part through acquisition and analysis of user images, and quantifies the user's health state in real time in combination with the physiological data fluctuation rate obtained by the physiological data analysis module, yielding the user's state value. It thereby monitors the user's health state effectively, alarms quickly when the state value is abnormal, and notifies the nursing staff to perform the corresponding nursing operation, which reduces the working intensity of the nursing staff, improves their monitoring effect on the user, and ensures timely and effective care of the user.
Further, the user personal information acquisition module is respectively connected with the monitoring image extraction module and the physiological data analysis module;
the user basic information acquired by the user personal information acquisition module comprises a user number and the disease types of the user; the same user may have one or more disease types;
the user personal information acquisition module compares each disease of the same user against the database to obtain the set of feature monitoring parts corresponding to each of the user's disease types;
The feature monitoring parts corresponding to different disease types in the database are different, and the number of feature monitoring parts corresponding to the same disease type in the database is one or more;
the user personal information acquisition module acquires the priority value prefabricated in the database for each disease type; the number of disease types corresponding to the same priority value is one or more;
the user personal information acquisition module marks the priority value corresponding to the i-th disease of the nth user as an, marks the set of feature monitoring parts corresponding to the i-th disease of the nth user as Cni, and marks the monitoring priority coefficient corresponding to each element in Cni as Bni; the monitoring priority coefficient of each element in a feature monitoring part set equals the priority value of the disease type corresponding to that set, i.e. Bni = an;
the number of disease types suffered by the nth user, and hence the number of feature monitoring part sets corresponding to the nth user, is iZn, with iZn ≥ 1;
the user personal information acquisition module records the union of the iZn feature monitoring part sets corresponding to the nth user as Cn, obtains the relation value Cnj between the j-th element in Cn and the iZn feature monitoring part sets, and from it the first feature monitoring part set Dn corresponding to the nth user,

Cnj = Σ (i = 1 to iZn) F(j, Cni),

where F(j, Cni) denotes the relation value between the j-th element in Cn and the feature monitoring part set Cni:
when the j-th element in Cn belongs to the feature monitoring part set Cni, F(j, Cni) = Bni;
when the j-th element in Cn does not belong to the feature monitoring part set Cni, F(j, Cni) = 0;
the relation values of any two different elements in Cn with respect to the iZn feature monitoring part sets are compared:
if the relation values of the two elements differ, their positions are adjusted so that the element with the larger relation value is arranged in front of the element with the smaller relation value;
if the relation values of the two elements are the same, the priority values prefabricated in the database for the corresponding feature monitoring parts are further compared, and the positions of the two elements are adjusted so that the element with the larger priority value is arranged in front of the element with the smaller priority value (the priority values corresponding to different feature monitoring parts in the database are different);
the set obtained after adjusting the element positions in Cn is recorded as the first feature monitoring part set Dn corresponding to the nth user.
The purpose of having the user personal information acquisition module obtain the first feature monitoring part set Dn corresponding to the nth user is to determine, according to the user's illness, which feature monitoring parts apply to the user and in what order; different illnesses correspond to different feature monitoring parts and different orderings.
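As a hedged illustration of the ordering just described, the following Python sketch (all names and the sample data are hypothetical, not taken from the patent) computes each part's relation value as the sum of the priority values of the diseases whose sets contain it, then sorts by relation value with the part's own database priority as tie-break:

```python
def build_dn(disease_parts, disease_priority, part_priority):
    """Sketch of building Dn. disease_parts maps a disease to its set of
    feature monitoring parts (Cni); disease_priority maps a disease to its
    priority value an (= Bni); part_priority holds the per-part tie-break
    priority prefabricated in the database. All inputs are hypothetical."""
    cn = set().union(*disease_parts.values())  # union Cn of all sets Cni

    def relation(part):
        # Cnj = sum over i of F(j, Cni): add an when the part belongs to Cni
        return sum(a for d, a in disease_priority.items()
                   if part in disease_parts[d])

    # larger relation value first; equal values ordered by the part priority
    return sorted(cn, key=lambda p: (relation(p), part_priority[p]),
                  reverse=True)


parts = {"hypertension": {"chest", "head"}, "arthritis": {"knee", "chest"}}
dn = build_dn(parts, {"hypertension": 3, "arthritis": 2},
              {"chest": 1, "head": 2, "knee": 3})
print(dn)  # chest has relation value 5, head 3, knee 2
```

A part shared by several diseases accumulates their priority values, so it is monitored ahead of parts tied to a single disease.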
Further, the monitoring image extraction module is connected with the monitoring image analysis module;
The monitoring image extraction module acquires user action images at intervals of a first preset time T through the camera.
Further, the monitoring image analysis module acquires the first feature monitoring part set Dn corresponding to the nth user and the user action images acquired by the monitoring image extraction module, analyzes one by one the association degree between the feature monitoring part corresponding to each element in Dn and the user action image corresponding to each acquisition time, and records the association coefficient between the feature monitoring part corresponding to the j1-th element in Dn and the user action image corresponding to acquisition time t1 as GLXj1t1.
When the monitoring image analysis module obtains the association coefficient between the feature monitoring part corresponding to the j1-th element in Dn and the user action image corresponding to acquisition time t1, it first identifies the positions of the four limbs in that image and the feature monitoring part corresponding to the j1-th element in Dn;
when the feature monitoring part corresponding to the j1-th element in Dn is located on one of the four limbs, the pixel distances in the user action image corresponding to acquisition time t1 between the other three limbs and the limb on which that feature monitoring part is located are calculated respectively;
when the feature monitoring part corresponding to the j1-th element in Dn is not located on any of the four limbs, the minimum pixel distance in that image between each limb and the feature monitoring part corresponding to the j1-th element in Dn is calculated respectively;
the monitoring image analysis module compares the obtained pixel distances with a first preset pixel distance, marks the average value of the obtained pixel distances smaller than the first preset pixel distance as Pj1t1, and records the number of obtained pixel distances smaller than the first preset pixel distance as Kj1t1;
the monitoring image analysis module identifies the feature monitoring part corresponding to the j1-th element in Dn in the user action image corresponding to acquisition time t1, and records the pixel distance by which that part has moved relative to the user action image corresponding to acquisition time t1−T as Sj1t1;
the monitoring image analysis module obtains the association coefficient between the feature monitoring part corresponding to the j1-th element in Dn and the user action image corresponding to acquisition time t1 as

GLXj1t1 = e1·Kj1t1/Pj1t1 + Sj1t1,

where e1 is a first coefficient, a constant greater than 0 obtained by database query;
the monitoring image analysis module obtains the association degree between each characteristic monitoring part corresponding to Dn in the limited time and the user action image corresponding to the acquisition time t1 Wherein j2 represents the number of elements of which the correlation coefficient between the user action images corresponding to the acquisition time t1 in Dn is obtained in a limited time, j2 is more than or equal to 1 and j2 is less than or equal to the number of elements in Dn,
the limiting time is the total upper limit time of the association coefficient between the characteristic monitoring part corresponding to each element in the corresponding calculation Dn in the database and the user action image corresponding to the acquisition time t1,
if the association coefficient between the feature monitoring part corresponding to the J0-th element in Dn and the user action image corresponding to acquisition time t1 has been obtained, the total time elapsed from beginning to calculate the association coefficient for the 1st element in Dn to obtaining the association coefficient for the J0-th element is recorded as TJ0, and TJ0 is compared with the limit time:
when TJ0 is greater than the limit time while TJ0−1 is less than or equal to the limit time, or TJ0 is greater than the limit time and J0 ≠ 1, then j2 = J0 − 1;
when TJ0 is less than the limit time and J0 is smaller than the number of elements in Dn, the calculation continues with the association coefficient for the (J0+1)-th element;
when TJ0 equals the limit time, or J0 equals the number of elements in Dn, or J0 = 1, then j2 = J0.
When the monitoring image analysis module obtains the association coefficient between the feature monitoring part corresponding to the j1-th element in Dn and the user action image corresponding to acquisition time t1, it identifies the positions of the limbs in that image and the feature monitoring part corresponding to the j1-th element in Dn, in consideration of the situation in which the user, because a feature monitoring part is uncomfortable, touches or moves it with a limb, which affects the user's health state. In obtaining the average pixel distance and the count of distances within the first preset pixel distance, it is assumed by default that when no limb is near the feature monitoring part, the user's limb activity affects neither that part nor the user's health state; but when a limb is close to the feature monitoring part, the limb action does affect it, and the closer the distance and the greater the number of limbs in contact, the greater the influence. These quantities thus reflect the influence of the limb movements in the image acquired at time t1 on the feature monitoring part corresponding to the j1-th element in Dn. In obtaining, within the limit time, the association degree between the feature monitoring parts corresponding to Dn and the user action image corresponding to acquisition time t1, the limit time bounds how long the association degree may take to compute, ensuring the timeliness of the finally obtained state value.
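The limit-time selection of j2 described above can be sketched as follows (a hypothetical Python illustration; the clock is injected so the cutoff logic is testable, and the coefficient values are placeholders):

```python
import time

def coefficients_within_limit(parts, coeff_fn, limit, clock=time.monotonic):
    """Compute association coefficients element by element, stopping at the
    limit time. The first coefficient is always kept (j2 >= 1); a coefficient
    finished after the limit has passed is discarded (j2 = J0 - 1)."""
    results = []
    start = clock()
    for part in parts:
        c = coeff_fn(part)
        elapsed = clock() - start
        if elapsed > limit and results:
            break              # over the limit and J0 != 1: j2 = J0 - 1
        results.append(c)
        if elapsed >= limit:
            break              # time exhausted exactly at J0: j2 = J0
    return results


# fake clock ticking one "second" per call: start=0, then 1, 2, 3, ...
ticks = iter(range(10))
kept = coefficients_within_limit(["a", "b", "c"], lambda p: p.upper(),
                                 limit=1.5, clock=lambda: next(ticks))
print(kept)  # ['A'] -- the second coefficient finishes after the limit
```

Injecting the clock keeps the cutoff rule deterministic in tests while `time.monotonic` serves in real use.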
Further, the monitoring image analysis module also calibrates the association coefficient between the feature monitoring part corresponding to the j1 th element in Dn and the user action image corresponding to the acquisition time t1 in the process of obtaining the association degree between each feature monitoring part corresponding to Dn and the user action image corresponding to the acquisition time t1,
the monitoring image analysis module acquires the previous m user action images relative to acquisition time t1; for the first m1 of these images, it records the average value of the pixel distances smaller than the first preset pixel distance between the feature monitoring part corresponding to the j1-th element in Dn and the four limbs as Pm1, records the number of such pixel distances as Km1, and calculates the contact quantity Gm1 = e1·Km1/Pm1;
for each value of m1 among the previous m user action images, it calculates the absolute value of the difference between Gm1 and the corresponding contact quantity Gt1 = e1·Kj1t1/Pj1t1 of the image acquired at time t1 (Pj1t1 and Kj1t1 being the average and the count of the pixel distances smaller than the first preset pixel distance in that image), recorded as Em1 = |Gm1 − Gt1|;
when Em1 is less than or equal to the second preset value for every value of m1 among the previous m user action images, it is judged that the position of the limbs in the user action image corresponding to acquisition time t1 is unrelated to the feature monitoring part corresponding to the j1-th element in Dn, and the association coefficient between them needs to be calibrated: the calibrated association coefficient equals Sj1t1, the pixel distance by which that feature monitoring part has moved between the images acquired at times t1−T and t1;
when Em1 is greater than the second preset value for some value of m1, it is judged that the position of the limbs in the image acquired at time t1 is related to that feature monitoring part, and the association coefficient does not need to be calibrated.
In calibrating the association coefficient between the feature monitoring part corresponding to the j1-th element in Dn and the user action image corresponding to acquisition time t1, the monitoring image analysis module acquires the previous m user action images in order to verify whether the contact state of the limbs relative to that feature monitoring part changes across the m+1 images. When contact occurs but the contact state in the corresponding m+1 monitoring images always remains within the error range (Em1 always less than or equal to the second preset value), the limb motion is judged to be unchanging and a habit of the user that does not affect the user's health state, so the contact term Gt1 is deleted from the association coefficient, i.e. the calibrated association coefficient equals Sj1t1. This makes the finally obtained user state value more accurate and ensures the accuracy of the monitoring result.
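The habitual-contact test behind the calibration can be sketched as follows (hypothetical names throughout; `contact_history` stands for the e1-scaled count-to-distance contact quantities of the m prior images, `contact_term` for the one at t1, and `movement_term` for the part's own movement distance):

```python
def needs_calibration(contact_history, current, second_preset):
    """True when the contact quantity of every prior image stays within the
    second preset value of the current one, i.e. the contact pattern is
    habitual and its term should be removed from the association coefficient."""
    return all(abs(prev - current) <= second_preset
               for prev in contact_history)


def calibrated_coefficient(contact_term, movement_term,
                           contact_history, second_preset):
    """Association coefficient with the contact term deleted when the
    contact pattern is judged habitual (a sketch of the rule above)."""
    if needs_calibration(contact_history, contact_term, second_preset):
        return movement_term            # keep only the movement term
    return contact_term + movement_term
```

A single prior image whose contact quantity deviates beyond the threshold is enough to keep the full coefficient.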
Further, in the process that the sensor monitors the physiological data of the user in real time, the physiological data analysis module monitors the blood pressure and the heart rate,
the blood pressure of the nth user at time t is recorded as XYnt, and the heart rate of the nth user at time t is recorded as XLnt;
the average value of the blood pressure of the nth user over the second unit time before time t is calculated and recorded as XYAnt:

XYAnt = (1/T1) ∫ from t−T1 to t of XYnτ dτ;

the average value of the heart rate of the nth user over the second unit time before time t is calculated and recorded as XLAnt:

XLAnt = (1/T1) ∫ from t−T1 to t of XLnτ dτ;

where T1 is the duration corresponding to the second unit time;
the physiological data analysis module obtains the physiological data fluctuation rate SLBnt of the nth user at time t as

SLBnt = ∫ from t−T1 to t of |XYnτ − XYAnt| / XYAnt dτ + ∫ from t−T1 to t of |XLnτ − XLAnt| / XLAnt dτ.

In monitoring the user's physiological data in real time through the sensor, the physiological data analysis module considers the two angles of blood pressure and heart rate and judges the user's health state from the stability of the physiological data: the higher the fluctuation rate, the worse the user's health state. The first integral reflects the accumulated degree of difference between the user's blood pressure at each time point between t−T1 and t and the average blood pressure, and so reflects to a certain extent the stability of the user's blood pressure over that interval; the second integral likewise reflects the accumulated degree of difference between the user's heart rate and the average heart rate between t−T1 and t, and so the stability of the user's heart rate over that interval.
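A discrete-time approximation of the fluctuation rate just described might look like this (a sketch under the assumption of evenly spaced samples; the integrals are approximated by sums of relative deviations from the window means):

```python
def fluctuation_rate(bp_samples, hr_samples, dt):
    """Approximate the two integrals over the window [t - T1, t]:
    accumulated relative deviation of blood pressure and heart rate
    from their window means, with dt the sampling interval."""
    bp_mean = sum(bp_samples) / len(bp_samples)
    hr_mean = sum(hr_samples) / len(hr_samples)
    bp_term = sum(abs(x - bp_mean) / bp_mean for x in bp_samples) * dt
    hr_term = sum(abs(x - hr_mean) / hr_mean for x in hr_samples) * dt
    return bp_term + hr_term


# perfectly stable vital signs give a fluctuation rate of zero
print(fluctuation_rate([120, 120, 120], [70, 70, 70], dt=1.0))  # 0.0
```

Any deviation from the window mean in either signal raises the rate, matching the rule that a higher fluctuation rate indicates a worse health state.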
Further, the user state analysis module is respectively connected with the monitoring image analysis module, the physiological data analysis module and the monitoring early warning module;
the user state analysis module records the state value of the nth user at time t as ZTZnt:

ZTZnt = r1·GLDnt + SLBnt,

where GLDnt denotes the association degree between the user action image acquired at time t and the feature monitoring parts corresponding to the user, SLBnt is the physiological data fluctuation rate of the nth user at time t, and r1 is the conversion coefficient between the action association degree and the physiological fluctuation rate, a constant obtained from the database.
When the user state analysis module obtains the state value ZTZnt of the nth user at time t, it considers the two angles of the corresponding association degree and the corresponding physiological data fluctuation rate; since the two influence the user state value to different degrees, they are balanced by r1 so that their influence on the user state value is commensurate.
Further, the monitoring and early warning module obtains a state value corresponding to the nth user at time t, compares the corresponding state value with a third preset value,
when the state value corresponding to the current time of the nth user is greater than or equal to a third preset value, judging that the state value corresponding to the current time of the nth user is abnormal, and immediately giving an alarm;
when the state value corresponding to the current time of the nth user is less than the third preset value, the state value corresponding to the current time of the nth user is judged to be normal, and no alarm is needed.
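Combining the two quantities with the threshold check gives a minimal sketch (the additive form and all names are assumptions consistent with the description above, not the patent's exact formula):

```python
def state_value(association_degree, fluctuation_rate, r1):
    """ZTZ = r1 * association degree + fluctuation rate: r1 converts the
    action association degree onto the scale of the physiological rate."""
    return r1 * association_degree + fluctuation_rate


def should_alarm(ztz, third_preset):
    """Alarm when the state value reaches or exceeds the third preset value."""
    return ztz >= third_preset


ztz = state_value(association_degree=4.0, fluctuation_rate=0.5, r1=0.25)
print(ztz, should_alarm(ztz, third_preset=1.5))  # 1.5 True
```

The threshold comparison is deliberately inclusive at the boundary, matching the "greater than or equal to" rule for abnormal states.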
An intelligent safety care monitoring method, the method comprising the steps of:
S1, the user personal information acquisition module acquires the basic information of a user and each feature monitoring part corresponding to the corresponding illness state in the database to obtain the first feature monitoring part set Dn corresponding to the nth user;
S2, the monitoring image extraction module acquires the camera's acquisition results of the user's action images at different times, and binds each acquisition time with the corresponding user action image;
S3, the monitoring image analysis module analyzes each action image of the user acquired by the monitoring image extraction module to obtain the association degree between each action image and each corresponding feature monitoring part of the user;
S4, the physiological data analysis module monitors the physiological data of the user in real time through the sensor, and analyzes the monitored physiological data to obtain the fluctuation rate of the user's physiological data;
S5, the user state analysis module obtains the state value of the user according to the association degree between each action image obtained by the monitoring image analysis module and each feature monitoring part of the user, and the fluctuation rate of the physiological data of the user obtained by the physiological data analysis module;
and S6, the monitoring and early warning module monitors the user state value obtained by the user state analysis module in real time and performs early warning when the user state value is abnormal.
Compared with the prior art, the invention has the following beneficial effects: the invention not only acquires the user's personal information and the corresponding feature monitoring part set, but also obtains, through acquisition and analysis of user images, the association degree between each user action image and each corresponding feature monitoring part of the user, and quantifies the user's health state in real time in combination with the physiological data fluctuation rate obtained by the physiological data analysis module, yielding the user's state value. It thereby monitors the user's health state effectively, alarms quickly when the state value is abnormal, and notifies the nursing staff to perform the corresponding nursing operation, which not only reduces the working intensity of the nursing staff, but also improves their monitoring effect and ensures timely and effective care of the user.
Drawings
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate the invention and together with the embodiments of the invention, serve to explain the invention. In the drawings:
FIG. 1 is a schematic diagram of an intelligent safety care monitoring system according to the present invention;
FIG. 2 is a schematic flow chart of an intelligent safety care monitoring system for calibrating a correlation coefficient between a feature monitoring part corresponding to a j1 element in Dn and a user action image corresponding to acquisition time t 1;
fig. 3 is a schematic flow chart of an intelligent safety care monitoring method of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1, the present invention provides the following technical solutions: an intelligent safety care monitoring system, comprising:
The system comprises a user personal information acquisition module, a database and a storage module, wherein the user personal information acquisition module acquires basic information of a user and each feature monitoring part corresponding to a corresponding illness state in the database to obtain a first feature monitoring part set Dn corresponding to an nth user;
the monitoring image extraction module acquires the user action images captured by the camera at different times, and binds each acquisition time with the corresponding user action image;
the monitoring image analysis module is used for analyzing each action image of the user acquired by the monitoring image extraction module to obtain the association degree between each action image and each corresponding characteristic monitoring part of the user;
the physiological data analysis module monitors physiological data of a user in real time through a sensor and analyzes the monitored physiological data of the user to obtain the fluctuation rate of the physiological data of the user;
the user state analysis module is used for obtaining a state value of the user according to the association degree between each action image obtained by the monitoring image analysis module and each corresponding characteristic monitoring part of the user and the fluctuation rate of the physiological data of the user obtained by the physiological data analysis module;
And the monitoring and early warning module is used for monitoring the user state value obtained by the user state analysis module in real time and carrying out early warning when the user state value is abnormal.
The user personal information acquisition module is respectively connected with the monitoring image extraction module and the physiological data analysis module;
the user basic information acquired by the user personal information acquisition module comprises a user number and the disease types suffered by the user, and the same user may suffer from one or more disease types;
the user personal information acquisition module is used for comparing each disease suffered by the same user with the database to obtain a set of characteristic monitoring parts corresponding to the corresponding disease types of the user;
the corresponding characteristic monitoring positions of different disease types in the database are different, and the number of the corresponding characteristic monitoring positions of the same disease type in the database is one or more;
the user personal information acquisition module acquires the priority value corresponding to each disease type prefabricated in the database, the number of the disease types corresponding to the same priority value is one or more,
the user personal information acquisition module marks the priority value corresponding to the ith disease of the nth user as an, marks the set of feature monitoring parts corresponding to the ith disease of the nth user as Cni, and marks the monitoring priority coefficient corresponding to each element in Cni as Bni; the monitoring priority coefficient corresponding to each element in a feature monitoring part set is equal to the priority value of the disease type corresponding to that set, namely Bni = an;
The number of disease types suffered by the nth user is iZn, the number of feature monitoring part sets corresponding to the nth user is iZn, iZn is more than or equal to 1,
the user personal information acquisition module marks the union of the iZn feature monitoring part sets corresponding to the nth user as Cn, obtains the relation value Cn_j between the j-th element in Cn and the corresponding iZn feature monitoring part sets, and obtains the first feature monitoring part set Dn corresponding to the nth user,
the said Cn_j = Σ(i = 1 to iZn) F(j, Cni),
wherein F(j, Cni) represents the relation value between the j-th element in Cn and the feature monitoring part set Cni,
when the j-th element in Cn belongs to the feature monitoring site set Cni, F (j, cni) = Bni,
when the j-th element in Cn does not belong to the feature monitoring part set Cni, F (j, cni) =0;
any two different elements in Cn are compared according to their relation values with the corresponding iZn feature monitoring part sets:
if the relation values of the two elements are different, the positions of the two elements are adjusted so that the element with the larger relation value is arranged before the element with the smaller relation value;
if the relation values of the two elements are the same, the priority values corresponding to each feature monitoring part prefabricated in the database are further consulted, and the positions of the two elements are adjusted so that the element with the larger priority value is arranged before the element with the smaller priority value; the priority values corresponding to different feature monitoring parts in the database are different,
And marking the corresponding set after the element position adjustment in Cn as a first characteristic monitoring part set Dn corresponding to the nth user.
In this embodiment, for example, the 01st user suffers from two diseases: the 1st disease corresponds to the first feature monitoring part and the second feature monitoring part, and the 2nd disease corresponds to the first feature monitoring part and the third feature monitoring part;
the set of feature monitoring parts corresponding to the 1st disease of the 01st user is denoted as C011 = {first feature monitoring part, second feature monitoring part}, and the set of feature monitoring parts corresponding to the 2nd disease of the 01st user is denoted as C012 = {first feature monitoring part, third feature monitoring part};
if the priority value a011 corresponding to the first disease is 2, the priority value a012 corresponding to the second disease is 1,
then c01= { first feature monitoring site, second feature monitoring site, third feature monitoring site },
because the 1 st element in C01 belongs to C011, F (1, C011) =2,
since the 1 st element in C01 belongs to C012, F (1, C012) =1,
because the 2 nd element in C01 belongs to C011, F (2, C011) =2,
Because the 2 nd element in C01 does not belong to C012, F (2, C012) =0,
because the 3 rd element in C01 does not belong to C011, F (3, C011) =0,
since the 3 rd element in C01 belongs to C012, F (3, C012) =1,
the relation value between the 1st element in C01 and the corresponding C011 and C012 is C01_1 = F(1, C011) + F(1, C012) = 2 + 1 = 3,
the relation value between the 2nd element in C01 and the corresponding C011 and C012 is C01_2 = F(2, C011) + F(2, C012) = 2 + 0 = 2,
the relation value between the 3rd element in C01 and the corresponding C011 and C012 is C01_3 = F(3, C011) + F(3, C012) = 0 + 1 = 1,
and since 3 > 2 > 1, C01_1 > C01_2 > C01_3; that is, the first feature monitoring part set D01 corresponding to the 01st user is {first feature monitoring part, second feature monitoring part, third feature monitoring part}.
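The construction of Dn worked through above can be sketched as follows. This is a minimal sketch: relation values are the sums of the priorities of the disease sets containing each part, parts are ordered by descending relation value, and the tie-breaking by per-part priority values described earlier is omitted; the function and variable names are illustrative, not from the source.

```python
def build_first_monitoring_set(disease_sets, priorities):
    """disease_sets: one list of part names per disease; priorities: matching priority values."""
    union = []
    for parts in disease_sets:
        for p in parts:
            if p not in union:
                union.append(p)  # preserve first-seen order, like the union of the Cni sets
    # relation value Cn_j = sum of F(j, Cni) over all disease sets containing the part
    relation = {p: sum(a for parts, a in zip(disease_sets, priorities) if p in parts)
                for p in union}
    # stable sort: larger relation value first, ties keep union order
    return sorted(union, key=lambda p: -relation[p])

# Worked example from the description: user 01 with two diseases
C011 = ["first part", "second part"]   # disease 1, priority a011 = 2
C012 = ["first part", "third part"]    # disease 2, priority a012 = 1
D01 = build_first_monitoring_set([C011, C012], [2, 1])
print(D01)  # ['first part', 'second part', 'third part']
```

The relation values reproduce the worked numbers above: 3 for the first part, 2 for the second, 1 for the third, giving the same ordering for D01.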
The monitoring image extraction module is connected with the monitoring image analysis module;
the monitoring image extraction module acquires user action images at intervals of a first preset time T through the camera.
The monitoring image analysis module acquires the first feature monitoring part set Dn corresponding to the nth user and the user action images acquired by the monitoring image extraction module, analyzes, one by one, the association degree between the feature monitoring part corresponding to each element in Dn and the user action image corresponding to each acquisition time, and records an association coefficient between the feature monitoring part corresponding to the j1-th element in Dn and the user action image corresponding to acquisition time t1.
When the monitoring image analysis module obtains the association coefficient between the characteristic monitoring part corresponding to the j1 element in Dn and the user action image corresponding to the acquisition time t1, the monitoring image analysis module respectively identifies the position of the limbs in the user action image corresponding to the acquisition time t1 and the characteristic monitoring part corresponding to the j1 element in Dn,
when the feature monitoring part corresponding to the j1 th element in Dn is positioned on one limb in the limbs, respectively calculating the pixel distances between the other three limbs and the limb corresponding to the feature monitoring part corresponding to the j1 st element in Dn in the user action image corresponding to the acquisition time t1,
when the feature monitoring part corresponding to the j1 th element in Dn is not positioned on one limb of the limbs, respectively calculating the minimum pixel distance between the limb and the feature monitoring part corresponding to the j1 st element in Dn in the user action image corresponding to the acquisition time t1,
the monitoring image analysis module compares the acquired pixel distances with a first preset pixel distance, records the average value of the acquired pixel distances that are smaller than the first preset pixel distance, and records the number of the acquired pixel distances that are smaller than the first preset pixel distance.
The monitoring image analysis module further records the pixel distance moved by the feature monitoring part corresponding to the j1-th element in Dn between the user action image corresponding to acquisition time t1-T and the user action image corresponding to acquisition time t1.
The monitoring image analysis module obtains, from these quantities, the association coefficient between the feature monitoring part corresponding to the j1-th element in Dn and the user action image corresponding to acquisition time t1, wherein e1 is a first coefficient, e1 is a constant greater than 0, and e1 is obtained by database query;
the monitoring image analysis module obtains, within a limited time, the association degree between the feature monitoring parts corresponding to Dn and the user action image corresponding to acquisition time t1, wherein j2 represents the number of elements in Dn for which the association coefficient with the user action image corresponding to acquisition time t1 is obtained within the limited time, and 1 ≤ j2 ≤ the number of elements in Dn;
the limited time is an upper limit, stored in the database, on the total time for calculating the association coefficients between the feature monitoring parts corresponding to the elements in Dn and the user action image corresponding to acquisition time t1,
if the association coefficient between the feature monitoring part corresponding to the J0-th element in Dn and the user action image corresponding to acquisition time t1 has been obtained,
the total time, from starting to calculate the association coefficient for the 1st element in Dn to obtaining the association coefficient for the J0-th element in Dn, is recorded and compared with the limited time:
when the total time is greater than the limited time (the total time for the first J0-1 elements being less than or equal to the limited time) and J0 ≠ 1, then j2 = J0-1;
when the total time is less than the limited time and J0 is smaller than the number of elements in Dn, the association coefficient for the (J0+1)-th element continues to be obtained;
when the total time is equal to the limited time, or J0 is equal to the number of elements in Dn, or J0 = 1, then j2 = J0.
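The limited-time rule above can be sketched as follows: per-element computation times are accumulated in Dn order, and j2 is the number of association coefficients that fit within the limit, with at least one element always kept, since the source sets j2 = J0 when J0 = 1. The function name and inputs are illustrative assumptions, not the patented implementation.

```python
def elements_within_limit(times, limit):
    """times: seconds spent computing each element's coefficient, in Dn order."""
    total = 0.0
    for j0, dt in enumerate(times, start=1):
        total += dt
        if total > limit:
            # the J0-th coefficient overran the limit: keep the previous ones,
            # but never fewer than one (the source keeps j2 = J0 when J0 = 1)
            return max(j0 - 1, 1)
        if total == limit:
            return j0  # landing exactly on the limit keeps the J0-th element
    return len(times)  # all coefficients computed within the limit

print(elements_within_limit([1.0, 1.0, 1.0], 2.5))  # 2: the third element overran
```

A hedged sketch of the cutoff logic only; the actual per-element timing and storage of the limit in the database are as described in the text.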
As shown in fig. 2, the monitoring image analysis module further calibrates the association coefficient between the feature monitoring part corresponding to the j1 th element in Dn and the user action image corresponding to the acquisition time t1 in the process of obtaining the association degree between each feature monitoring part corresponding to Dn and the user action image corresponding to the acquisition time t1,
the monitoring image analysis module acquires the m user action images preceding acquisition time t1; for the first m1 of these images, it records the average value of the pixel distances, smaller than the first preset pixel distance, between the feature monitoring part corresponding to the j1-th element in Dn and the four limbs, and records the number of such pixel distances;
for each value of m1 from 1 to m, it calculates the absolute value of the difference between this average over the first m1 preceding images and the corresponding average for the user action image at acquisition time t1;
when this absolute difference is always smaller than or equal to a second preset value for every value of m1 from 1 to m, it is judged that the position of the limbs in the user action image corresponding to acquisition time t1 is unrelated to the feature monitoring part corresponding to the j1-th element in Dn, and the association coefficient between the feature monitoring part corresponding to the j1-th element in Dn and the user action image corresponding to acquisition time t1 is calibrated accordingly;
when this absolute difference is greater than the second preset value for some value of m1 from 1 to m, it is judged that the position of the limbs in the user action image corresponding to acquisition time t1 is related to the feature monitoring part corresponding to the j1-th element in Dn, and the association coefficient between the feature monitoring part corresponding to the j1-th element in Dn and the user action image corresponding to acquisition time t1 does not need to be calibrated.
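The calibration check above can be sketched as follows: the mean of limb-to-part pixel distances below the first preset distance is computed for the current frame, then cumulatively over the previous m frames, and the part is treated as unrelated to limb position when every cumulative mean stays within the second preset value of the current mean. All names, thresholds, and input shapes are illustrative assumptions, not the patented implementation.

```python
from statistics import mean

def below_threshold_mean(distances, first_preset):
    """Mean and count of the pixel distances below the first preset distance."""
    close = [d for d in distances if d < first_preset]
    return (mean(close), len(close)) if close else (0.0, 0)

def is_unrelated(current_dists, history_dists, first_preset, second_preset):
    """history_dists: list of the m previous frames, each a list of limb distances."""
    cur_mean, _ = below_threshold_mean(current_dists, first_preset)
    pooled = []
    for frame in history_dists:           # cumulative windows m1 = 1 .. m
        pooled.extend(frame)
        hist_mean, _ = below_threshold_mean(pooled, first_preset)
        if abs(hist_mean - cur_mean) > second_preset:
            return False                  # some window deviates: part is related
    return True                           # all windows close: treat as unrelated

print(is_unrelated([10, 12], [[11, 13], [9, 12]], first_preset=50, second_preset=5))
```

When the check returns True, the module would then overwrite the association coefficient with the calibrated value described in the text.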
The physiological data analysis module monitors the physiological data of the user in real time by the sensor, the monitored data are blood pressure and heart rate,
the blood pressure corresponding to the nth user at time t is marked as XYnt, the heart rate corresponding to the nth user at time t is marked as XLnt,
the average value of the blood pressure of the nth user within the second unit time immediately preceding time t is calculated and recorded,
the average value of the heart rate of the nth user within the second unit time immediately preceding time t is calculated and recorded,
the second unit time corresponds to a time length T1,
and the physiological data analysis module obtains the physiological data fluctuation rate SLBnt corresponding to the nth user at time t from these averages and the instantaneous values XYnt and XLnt.
The user state analysis module is respectively connected with the monitoring image analysis module, the physiological data analysis module and the monitoring early warning module;
The user state analysis module marks the state value corresponding to the nth user at time t as ZTZnt; ZTZnt is obtained from the association degrees between the user action images and the feature monitoring parts and from the physiological data fluctuation rate SLBnt,
wherein r1 is a conversion coefficient between the action association degree and the physiological fluctuation rate, and r1 is a constant obtained from the database.
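The formula images for SLBnt and ZTZnt were not reproduced in the source text, so the sketch below only assumes a plausible shape consistent with the surrounding description: each vital sign's relative deviation from its recent average feeds the fluctuation rate, and the state value combines the action association degree with the fluctuation rate via the conversion coefficient r1. The function names and formula shapes are assumptions, not the patented formulas.

```python
def fluctuation_rate(bp, bp_avg, hr, hr_avg):
    # assumed SLBnt: relative deviation of blood pressure plus that of heart rate
    return abs(bp - bp_avg) / bp_avg + abs(hr - hr_avg) / hr_avg

def state_value(association_degree, slb, r1):
    # assumed ZTZnt: association degree plus the fluctuation rate scaled by r1
    return association_degree + r1 * slb

slb = fluctuation_rate(bp=130, bp_avg=125, hr=88, hr_avg=80)   # 0.04 + 0.10 = 0.14
zt = state_value(association_degree=1.5, slb=slb, r1=2.0)      # 1.5 + 0.28 = 1.78
print(zt >= 3.0)  # compare against a third preset value to decide whether to alarm
```

The final comparison mirrors the monitoring and early warning module: a state value at or above the third preset value is abnormal and triggers an alarm.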
The monitoring and early warning module obtains a state value corresponding to the nth user at time t and compares the corresponding state value with a third preset value,
When the state value corresponding to the current time of the nth user is greater than or equal to a third preset value, judging that the state value corresponding to the current time of the nth user is abnormal, and immediately giving an alarm;
when the state value corresponding to the current time of the nth user is less than the third preset value, the state value corresponding to the current time of the nth user is judged to be normal, and no alarm is needed.
Referring to fig. 3, an intelligent safety care monitoring method comprises the following steps:
s1, a user personal information acquisition module acquires basic information of a user and each feature monitoring part corresponding to a corresponding illness state in a database to obtain a first feature monitoring part set Dn corresponding to an nth user;
s2, the monitoring image extraction module acquires the user action images captured by the camera at different times, and binds each acquisition time with the corresponding user action image;
s3, analyzing each action image of the user acquired by the monitoring image extraction module by the monitoring image analysis module to obtain the association degree between each action image and each corresponding characteristic monitoring part of the user;
s4, a physiological data analysis module monitors physiological data of the user in real time through a sensor, and analyzes the monitored physiological data of the user to obtain the fluctuation rate of the physiological data of the user;
S5, the user state analysis module obtains a state value of the user according to the association degree between each action image obtained by the monitoring image analysis module and each feature monitoring part of the user and the fluctuation rate of the physiological data of the user obtained by the physiological data analysis module;
and S6, the monitoring and early warning module monitors the user state value obtained by the user state analysis module in real time and performs early warning when the user state value is abnormal.
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Finally, it should be noted that the foregoing description is only a preferred embodiment of the present invention and is not intended to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art may still modify the technical solutions described in the foregoing embodiments or make equivalent replacements of some of their technical features. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (5)

1. An intelligent safety care monitoring system, comprising:
the system comprises a user personal information acquisition module, a database and a storage module, wherein the user personal information acquisition module acquires basic information of a user and each feature monitoring part corresponding to a corresponding illness state in the database to obtain a first feature monitoring part set Dn corresponding to an nth user;
the monitoring image extraction module acquires the user action images captured by the camera at different times, and binds each acquisition time with the corresponding user action image;
the monitoring image analysis module is used for analyzing each action image of the user acquired by the monitoring image extraction module to obtain the association degree between each action image and each corresponding characteristic monitoring part of the user;
the physiological data analysis module monitors physiological data of a user in real time through a sensor and analyzes the monitored physiological data of the user to obtain the fluctuation rate of the physiological data of the user;
the user state analysis module is used for obtaining a state value of the user according to the association degree between each action image obtained by the monitoring image analysis module and each corresponding characteristic monitoring part of the user and the fluctuation rate of the physiological data of the user obtained by the physiological data analysis module;
The monitoring and early warning module monitors the user state value obtained by the user state analysis module in real time and performs early warning when the user state value is abnormal;
the user personal information acquisition module is respectively connected with the monitoring image extraction module and the physiological data analysis module;
the user basic information acquired by the user personal information acquisition module comprises a user number and the disease types suffered by the user, and the same user may suffer from one or more disease types;
the user personal information acquisition module is used for comparing each disease suffered by the same user with the database to obtain a set of characteristic monitoring parts corresponding to the corresponding disease types of the user;
the corresponding characteristic monitoring positions of different disease types in the database are different, and the number of the corresponding characteristic monitoring positions of the same disease type in the database is one or more;
the user personal information acquisition module acquires the priority value corresponding to each disease type prefabricated in the database, the number of the disease types corresponding to the same priority value is one or more,
the user personal information acquisition module marks the priority value corresponding to the ith disease of the nth user as an, marks the set of feature monitoring parts corresponding to the ith disease of the nth user as Cni, and marks the monitoring priority coefficient corresponding to each element in Cni as Bni; the monitoring priority coefficient corresponding to each element in a feature monitoring part set is equal to the priority value of the disease type corresponding to that set, namely Bni = an;
The number of disease types suffered by the nth user is iZn, the number of feature monitoring part sets corresponding to the nth user is iZn, iZn is more than or equal to 1,
the user personal information acquisition module marks the union of the iZn feature monitoring part sets corresponding to the nth user as Cn, obtains the relation value Cn_j between the j-th element in Cn and the corresponding iZn feature monitoring part sets, and obtains the first feature monitoring part set Dn corresponding to the nth user,
the said Cn_j = Σ(i = 1 to iZn) F(j, Cni),
wherein F(j, Cni) represents the relation value between the j-th element in Cn and the feature monitoring part set Cni,
when the j-th element in Cn belongs to the feature monitoring site set Cni, F (j, cni) = Bni,
when the j-th element in Cn does not belong to the feature monitoring part set Cni, F (j, cni) =0;
any two different elements in Cn are compared according to their relation values with the corresponding iZn feature monitoring part sets:
if the relation values of the two elements are different, the positions of the two elements are adjusted so that the element with the larger relation value is arranged before the element with the smaller relation value;
if the relation values of the two elements are the same, the priority values corresponding to each feature monitoring part prefabricated in the database are further consulted, and the positions of the two elements are adjusted so that the element with the larger priority value is arranged before the element with the smaller priority value; the priority values corresponding to different feature monitoring parts in the database are different,
The corresponding set after the element position adjustment in Cn is finished is marked as a first characteristic monitoring part set Dn corresponding to the nth user;
the monitoring image analysis module acquires the first feature monitoring part set Dn corresponding to the nth user and the user action images acquired by the monitoring image extraction module, analyzes, one by one, the association degree between the feature monitoring part corresponding to each element in Dn and the user action image corresponding to each acquisition time, and records an association coefficient between the feature monitoring part corresponding to the j1-th element in Dn and the user action image corresponding to acquisition time t1;
When the monitoring image analysis module obtains the association coefficient between the characteristic monitoring part corresponding to the j1 element in Dn and the user action image corresponding to the acquisition time t1, the monitoring image analysis module respectively identifies the position of the limbs in the user action image corresponding to the acquisition time t1 and the characteristic monitoring part corresponding to the j1 element in Dn,
when the feature monitoring part corresponding to the j1 th element in Dn is positioned on one limb in the limbs, respectively calculating the pixel distances between the other three limbs and the limb corresponding to the feature monitoring part corresponding to the j1 st element in Dn in the user action image corresponding to the acquisition time t1,
When the feature monitoring part corresponding to the j1 th element in Dn is not positioned on one limb of the limbs, respectively calculating the minimum pixel distance between the limb and the feature monitoring part corresponding to the j1 st element in Dn in the user action image corresponding to the acquisition time t1,
the monitoring image analysis module compares the acquired pixel distances with a first preset pixel distance, records the average value of the acquired pixel distances that are smaller than the first preset pixel distance, and records the number of the acquired pixel distances that are smaller than the first preset pixel distance;
the monitoring image analysis module further records the pixel distance moved by the feature monitoring part corresponding to the j1-th element in Dn between the user action image corresponding to acquisition time t1-T and the user action image corresponding to acquisition time t1;
the monitoring image analysis module obtains, from these quantities, the association coefficient between the feature monitoring part corresponding to the j1-th element in Dn and the user action image corresponding to acquisition time t1, wherein e1 is a first coefficient, e1 is a constant greater than 0, and e1 is obtained by database query;
the monitoring image analysis module obtains, within a limited time, the association degree between the feature monitoring parts corresponding to Dn and the user action image corresponding to acquisition time t1, wherein j2 represents the number of elements in Dn for which the association coefficient with the user action image corresponding to acquisition time t1 is obtained within the limited time, and 1 ≤ j2 ≤ the number of elements in Dn;
the limited time is an upper limit, stored in the database, on the total time for calculating the association coefficients between the feature monitoring parts corresponding to the elements in Dn and the user action image corresponding to acquisition time t1,
if the association coefficient between the feature monitoring part corresponding to the J0-th element in Dn and the user action image corresponding to acquisition time t1 has been obtained,
the total time, from starting to calculate the association coefficient for the 1st element in Dn to obtaining the association coefficient for the J0-th element in Dn, is recorded and compared with the limited time:
when the total time is greater than the limited time (the total time for the first J0-1 elements being less than or equal to the limited time) and J0 ≠ 1, then j2 = J0-1;
when the total time is less than the limited time and J0 is smaller than the number of elements in Dn, the association coefficient for the (J0+1)-th element continues to be obtained;
when the total time is equal to the limited time, or J0 is equal to the number of elements in Dn, or J0 = 1, then j2 = J0;
The physiological data analysis module monitors the physiological data of the user in real time by the sensor, the monitored data are blood pressure and heart rate,
the blood pressure corresponding to the nth user at time t is marked as XYnt, the heart rate corresponding to the nth user at time t is marked as XLnt,
the average value of the blood pressure of the nth user within the second unit time immediately preceding time t is calculated and recorded,
the average value of the heart rate of the nth user within the second unit time immediately preceding time t is calculated and recorded,
the second unit time corresponds to a time length T1,
and the physiological data analysis module obtains the physiological data fluctuation rate SLBnt corresponding to the nth user at time t from these averages and the instantaneous values XYnt and XLnt;
The user state analysis module is respectively connected with the monitoring image analysis module, the physiological data analysis module and the monitoring early warning module;
the user state analysis module marks the state value corresponding to the nth user at time t as ZTZnt; ZTZnt is obtained from the association degrees between the user action images and the feature monitoring parts and from the physiological data fluctuation rate SLBnt,
wherein r1 is a conversion coefficient between the action association degree and the physiological fluctuation rate, and r1 is a constant obtained from the database.
2. An intelligent safety care monitoring system according to claim 1, wherein: the monitoring image extraction module is connected with the monitoring image analysis module;
The monitoring image extraction module acquires user action images at intervals of a first preset time T through the camera.
3. An intelligent safety care monitoring system according to claim 1, wherein: in the process of obtaining the association degree between each feature monitoring part corresponding to Dn and the user action image corresponding to acquisition time t1, the monitoring image analysis module also calibrates the association coefficient between the feature monitoring part corresponding to the j1-th element in Dn and the user action image corresponding to acquisition time t1:
the monitoring image analysis module acquires the previous m user action images before acquisition time t1; for the m1-th of these images it records the average of the pixel distances, smaller than a first preset pixel distance, between the feature monitoring part corresponding to the j1-th element in Dn and the user's limbs as P(m1), and records the number of such pixel distances as N(m1); it computes the same average P(t1) for the user action image corresponding to acquisition time t1 and, for each value of m1 among the previous m user action images, the absolute difference |P(m1) − P(t1)|;
when |P(m1) − P(t1)| is smaller than or equal to a second preset value for every value of m1 among the previous m user action images, the acquired limb position in the user action image corresponding to acquisition time t1 is judged to be unrelated to the feature monitoring part corresponding to the j1-th element in Dn, and the association coefficient between that feature monitoring part and the user action image corresponding to acquisition time t1 is recalibrated;
when |P(m1) − P(t1)| exceeds the second preset value for some value of m1, the acquired limb position in the user action image corresponding to acquisition time t1 is judged to be related to the feature monitoring part corresponding to the j1-th element in Dn, and the association coefficient between them does not need to be recalibrated.
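The calibration rule of claim 3 can be sketched as follows. The source's own notation is not preserved here, so the helper names (`avg_small_distances`, `needs_recalibration`) and the example distance lists are illustrative assumptions: for each of the previous m images, the average of the part-to-limb pixel distances below the first preset distance is compared with the same average for the image at t1, and the part is judged unrelated when no absolute difference exceeds the second preset value.

```python
def avg_small_distances(distances, first_preset):
    """Mean of the pixel distances below the first preset pixel
    distance between a feature monitoring part and the limbs
    (0.0 when no distance qualifies)."""
    small = [d for d in distances if d < first_preset]
    return sum(small) / len(small) if small else 0.0

def needs_recalibration(prev_images, t1_distances, first_preset, second_preset):
    """prev_images: per-image lists of part-to-limb pixel distances
    for the m images before t1. Returns True when every absolute
    difference from the t1 average stays within the second preset
    value, i.e. the limb never moved relative to the part, so the
    part is judged unrelated to the t1 action image and its
    association coefficient should be recalibrated."""
    t1_avg = avg_small_distances(t1_distances, first_preset)
    diffs = [abs(avg_small_distances(img, first_preset) - t1_avg)
             for img in prev_images]
    return all(d <= second_preset for d in diffs)
```

When `needs_recalibration` returns True, the coefficient for that part would be reset; the patent's exact calibrated value is not legible in this text.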
4. An intelligent safety care monitoring system according to claim 1, wherein: the monitoring and early warning module obtains the state value corresponding to the nth user at time t and compares it with a third preset value;
when the state value corresponding to the nth user at the current time is greater than or equal to the third preset value, the state of the nth user at the current time is judged to be abnormal, and an alarm is given immediately;
when the state value corresponding to the nth user at the current time is less than the third preset value, the state of the nth user at the current time is judged to be normal, and no alarm is needed.
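The comparison in claim 4 reduces to a single threshold test; the function name `check_state` is illustrative, not from the patent.

```python
def check_state(ztz_nt, third_preset):
    """Claim 4's decision: a state value at or above the third
    preset value is abnormal and triggers an immediate alarm;
    below it, the state is normal and no alarm is raised."""
    return "alarm" if ztz_nt >= third_preset else "normal"
```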
5. An intelligent safety care monitoring method using the intelligent safety care monitoring system according to any one of claims 1 to 4, characterized in that: the method comprises the following steps:
S1, the user personal information acquisition module acquires the user's basic information and, from the database, each feature monitoring part corresponding to the user's illness, obtaining the first feature monitoring part set Dn corresponding to the nth user;
S2, the monitoring image extraction module acquires the user action images captured by the camera at different times and binds each acquisition time to the corresponding user action image;
S3, the monitoring image analysis module analyses each user action image acquired by the monitoring image extraction module to obtain the association degree between each action image and each corresponding feature monitoring part of the user;
S4, the physiological data analysis module monitors the user's physiological data in real time through sensors and analyses the monitored data to obtain the fluctuation rate of the user's physiological data;
S5, the user state analysis module obtains the user's state value from the association degrees obtained by the monitoring image analysis module and the fluctuation rate of the physiological data obtained by the physiological data analysis module;
S6, the monitoring and early warning module monitors in real time the user state value obtained by the user state analysis module and gives an early warning when the state value is abnormal.
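Steps S3 through S6 above can be condensed into one illustrative monitoring cycle; the fluctuation-rate metric and the linear state-value form below are assumptions for the sketch, not the patented formulas.

```python
def fluctuation_rate(readings):
    """Relative spread of the physiological readings; an assumed
    metric, since the patent's definition is not legible here."""
    mean = sum(readings) / len(readings)
    return (max(readings) - min(readings)) / mean

def monitoring_cycle(assoc_degrees, readings, r1, third_preset):
    """Condensed S3-S6: combine the per-image association degrees
    (S3) with the physiological fluctuation rate (S4) into a state
    value (S5, assumed linear form weighted by r1), then decide on
    an early warning (S6)."""
    fluct = fluctuation_rate(readings)
    ztz = r1 * (sum(assoc_degrees) / len(assoc_degrees)) + fluct
    return ztz, ("alarm" if ztz >= third_preset else "normal")
```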
CN202210504779.5A 2022-05-10 2022-05-10 Intelligent safety nursing monitoring system and method Active CN114886417B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210504779.5A CN114886417B (en) 2022-05-10 2022-05-10 Intelligent safety nursing monitoring system and method


Publications (2)

Publication Number Publication Date
CN114886417A CN114886417A (en) 2022-08-12
CN114886417B true CN114886417B (en) 2023-09-22

Family

ID=82720976

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210504779.5A Active CN114886417B (en) 2022-05-10 2022-05-10 Intelligent safety nursing monitoring system and method

Country Status (1)

Country Link
CN (1) CN114886417B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005101279A2 (en) * 2004-04-12 2005-10-27 Baxter International Inc. System and method for medical data tracking, analysis and reporting for a healthcare system
JP2010273756A (en) * 2009-05-27 2010-12-09 Yoshida Dental Mfg Co Ltd Temporomandibular arthrosis diagnosis support system and apparatus equipped with pain detector
WO2012153744A1 (en) * 2011-05-12 2012-11-15 日本電気株式会社 Information processing device, information processing method, and information processing program
JP2015197803A (en) * 2014-04-01 2015-11-09 キヤノン株式会社 Behavior record device, behavior record method and program
JP2016080671A (en) * 2014-10-20 2016-05-16 純一 水澤 Robot measuring apparatus measuring human motions
CN109558824A (en) * 2018-11-23 2019-04-02 卢伟涛 A kind of body-building movement monitoring and analysis system based on personnel's image recognition
CN113569996A (en) * 2021-08-30 2021-10-29 平安医疗健康管理股份有限公司 Method, device, equipment and storage medium for classifying medical record information
WO2021218542A1 (en) * 2020-04-28 2021-11-04 腾讯科技(深圳)有限公司 Visual perception device based spatial calibration method and apparatus for robot body coordinate system, and storage medium
CN114129151A (en) * 2021-11-30 2022-03-04 心智动科技(深圳)有限公司 Method for defining human body action, posture and each joint relation by visual recognition



Similar Documents

Publication Publication Date Title
US9927305B2 (en) Method and apparatus for accurate detection of fever
JP7108267B2 (en) Biological information processing system, biological information processing method, and computer program
US6816603B2 (en) Method and apparatus for remote medical monitoring incorporating video processing and system of motor tasks
CN104883962B (en) The patient-monitoring for subacute patient based on active state and posture
US20130090571A1 (en) Methods and systems for monitoring and preventing pressure ulcers
US20210225510A1 (en) Human body health assessment method and system based on sleep big data
DE112006003199T5 (en) Non-obstructive, substantially continuous acquisition of the daily activities of a patient to indicate a change of state of the patient for the access of a remote operator
US8634900B2 (en) Mask comfort diagnostic method
CN104699931A (en) Neural network blood pressure prediction method and mobile phone based on human face
CN109863562A (en) For carrying out patient-monitoring to predict and prevent the equipment, system and method for falling from bed
CN114766057A (en) Accurate health management and risk early warning method and system based on association of family genetic disease and sign data
CN105615852A (en) Blood pressure detection system and method
CN108289633A (en) Sleep study system and method
CN115662631A (en) AI intelligence discrimination-based nursing home management system
CN109935327A (en) Hypertensive patient's cardiovascular risk grading appraisal procedure based on intelligence decision support system
CN114886417B (en) Intelligent safety nursing monitoring system and method
CN116098595B (en) System and method for monitoring and preventing sudden cardiac death and sudden cerebral death
EP3234827B1 (en) Monitoring the exposure of a patient to an environmental factor
US20220254502A1 (en) System, method and apparatus for non-invasive & non-contact monitoring of health racterstics using artificial intelligence (ai)
US20100249529A1 (en) Pain Monitoring Apparatus and Methods Thereof
CN106264479A (en) body temperature data processing method
CN114831596A (en) Deep learning-based sleep monitoring network model construction method and system and sleep monitoring method
Sujin et al. Public e-health network system using arduino controller
CN117831745B (en) Remote nursing management method and system based on data analysis
CN114974538B (en) Ward nursing early warning management system based on big data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant