CN117954079A - Health management intelligent system and human cardiopulmonary function signal monitoring method - Google Patents


Info

Publication number
CN117954079A
CN117954079A (application number CN202410115762.XA)
Authority
CN
China
Prior art keywords: data, information, heart, monitoring, monitored object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410115762.XA
Other languages
Chinese (zh)
Inventor
谢俊
朱宏飞
何辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yihuiyun Intelligent Technology Shenzhen Co ltd
Original Assignee
Yihuiyun Intelligent Technology Shenzhen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yihuiyun Intelligent Technology Shenzhen Co ltd
Publication of CN117954079A
Legal status: Pending

Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213 Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60 Protecting data
    • G06F21/62 Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218 Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245 Protecting personal data, e.g. for financial or medical purposes
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/30 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biomedical Technology (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Pathology (AREA)
  • Artificial Intelligence (AREA)
  • Physiology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Epidemiology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Bioethics (AREA)
  • Veterinary Medicine (AREA)
  • Primary Health Care (AREA)
  • Cardiology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Computer Hardware Design (AREA)
  • Pulmonology (AREA)
  • Software Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Psychiatry (AREA)
  • Signal Processing (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Measuring Pulse, Heart Rate, Blood Pressure Or Blood Flow (AREA)

Abstract

The invention discloses a health management intelligent system and a human cardiopulmonary function signal monitoring method, wherein the method comprises the following steps: S1, preprocessing data and performing frequency domain and time domain transformation; S2, evaluating the importance of the data features and selecting sign data; S3, collecting surrounding video images in real time and identifying active persons; S4, tracking the position of the monitored object and identifying emotion; S5, collecting cardiopulmonary heat signals and establishing a deep learning model; S6, outputting a cardiopulmonary health trend table to emergency personnel; S7, triggering an alarm and outputting abnormality information, contacting the monitored object and enabling emergency facilities, and, if the monitored object is alone, starting rescue and transmitting help-seeking information. By combining the monitoring method and the intelligent system, the invention performs comprehensive analysis and intelligent processing of cardiopulmonary function signals from multidimensional data, achieves a high degree of intelligence, meets the demand for personalized evaluation, and improves the efficiency and timeliness of emergency response.

Description

Health management intelligent system and human cardiopulmonary function signal monitoring method
Technical Field
The invention relates to the technical field of health management, in particular to an intelligent health management system and a human cardiopulmonary function signal monitoring method.
Background
With the aging of the population and the continuous development of medical technology, monitoring the cardiopulmonary functions of elderly people and patients has become an important means of health management. Cardiopulmonary function signals are important indexes reflecting the health condition of the human body; by monitoring and analyzing them, potential health problems can be found in time, providing a basis for subsequent treatment, rehabilitation and health management. However, as health management demands grow and daily monitoring becomes the norm, traditional cardiopulmonary function signal monitoring and health management still have certain limitations: the degree of intelligence is low; the monitoring analysis of cardiopulmonary function signals is relatively single and lacks comprehensive analysis and intelligent processing of multidimensional data; it is difficult to adjust and optimize for individual user differences or to provide accurate and timely monitoring and management of the health condition; and the lag in emergency response, together with the lack of timely abnormality alarms and complete emergency help-seeking information, impairs the efficiency and timeliness of first aid.
Aiming at the problems, the invention provides a health management intelligent system and a human cardiopulmonary function signal monitoring method.
Disclosure of Invention
In view of the above, the present invention aims at overcoming the drawbacks of the prior art, and its primary object is to provide a health management intelligent system and a human cardiopulmonary function signal monitoring method which solve the technical problems of traditional cardiopulmonary function signal monitoring: a low degree of intelligence, single data analysis, lack of personalized evaluation, and lag in emergency response.
In order to achieve the above purpose, the present invention adopts the following technical scheme:
the invention relates to a human cardiopulmonary function signal monitoring method, which comprises the following steps:
S1: acquiring historical diagnosis information, setting cardiopulmonary thresholds, acquiring and storing the body state characteristic information of the monitored object, preprocessing it and outputting preprocessed data, performing Fourier transform on the preprocessed data to obtain a time domain signal, performing wavelet transform, and calculating the covariance matrix, the between-class scatter matrix and the within-class scatter matrix of the preprocessed data to maximize the discrimination ratio;
S2: carrying out data feature importance evaluation using feature evaluation indexes, obtaining a data feature evaluation result, performing selection with a feature selection algorithm, outputting sign data, and carrying out tracking identification according to the sign data, wherein the feature evaluation indexes include, but are not limited to, correlation, variance and information gain;
S3: collecting video image information of the surrounding environment of the monitored object in real time, establishing a background model, excluding the monitored object by comparing the current frame of the video image information with the background model, extracting the moving variant object, judging whether the moving variant object is an active person by combining a deep learning algorithm, and if so, turning to step S4, otherwise turning to step S5;
S4: adopting a privacy processing algorithm to mask the monitoring of the moving variant object, continuously tracking the position and movement of the monitored object based on the sign data, acquiring the state information of the monitored object through filtering processing, performing emotion recognition and analysis, and outputting the state data; if the state data is abnormal, turning to step S5;
S5: collecting a heart and lung heat signal of the monitored object by using a non-contact detector, capturing heart and lung micro-movements of the monitored object, establishing a standard data structure, synchronizing a time stamp, outputting a comprehensive heart and lung function data set, establishing a deep learning model, performing normal and abnormal heart and lung function feature mode training, performing processing analysis based on the comprehensive heart and lung function data set, acquiring analysis result data and storing the analysis result data, and if the analysis result data is normal, switching to step S6, otherwise switching to step S7;
S6: the analysis result data are called in real time, and a trend table is output, wherein the trend table is used for emergency personnel to judge the trend of the heart and lung health condition of the monitored object, and the emergency personnel make personalized strategies and feed back the personalized strategies to the monitored object;
S7: triggering an alarm and outputting abnormality information to emergency personnel, wherein the emergency personnel acquire the geographical position of the monitored object from the abnormality information, formulate an emergency strategy, and automatically identify and enable the required emergency facilities; whether the monitored object is alone is judged from the abnormality information; if the monitored object is not alone, the emergency personnel carry out emergency communication and guidance through the display device of the monitored object; if the monitored object is alone, the nearest community service station is contacted for rescue, an authorized signal transmitting end is bridged, the signal coverage is expanded stepwise from near to far, and emergency help-seeking information is transmitted.
As a preferred embodiment, based on step S1, the method comprises the following steps:
s101: setting the heart lung threshold according to the historical diagnosis information of the monitored object;
s102: collecting the physical characteristic information of the monitored object, performing preliminary processing and transcoding to generate preprocessing data;
S103: performing Fourier transform based on the preprocessed data to obtain the time domain signal, including the following formula:
F(k) = ∫ f(t) · e^(−2πikt) dt, with f(t) = |f(t)| · e^(i·∠f(t)),
wherein F(k) represents the frequency domain signal; f(t) represents the time domain signal; k represents the frequency, which is a real number; t represents the time variable; |f(t)| represents the amplitude of the time domain signal; ∠f(t) represents the phase angle of the time domain signal, with value range [0, 2π];
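The Fourier step of S103 can be illustrated with a discrete sketch; the 20 Hz sampling rate and the two-component test signal are illustrative assumptions, not values given in the patent.

```python
import numpy as np

# Hypothetical preprocessed vital-sign sample: a 2 Hz "heartbeat-like"
# component plus a 0.3 Hz "respiration-like" component, sampled at 20 Hz.
fs = 20.0
t = np.arange(200) / fs
f_t = np.sin(2 * np.pi * 2.0 * t) + 0.5 * np.sin(2 * np.pi * 0.3 * t)

# Discrete analogue of F(k) from step S103.
F_k = np.fft.rfft(f_t)
freqs = np.fft.rfftfreq(len(f_t), d=1 / fs)

amplitude = np.abs(F_k)    # |F(k)| per frequency bin
phase = np.angle(F_k)      # phase angle per frequency bin

# The dominant spectral peak (DC excluded) recovers the 2 Hz component.
peak_freq = freqs[1:][np.argmax(amplitude[1:])]
```

Here `np.fft.rfft` stands in for the continuous integral; real monitoring data would of course replace the synthetic `f_t`.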
S104: performing wavelet transformation on the time domain signal, including the following formula:
W(a, p, q) = a⁻¹ · Σ_{x, y} f(x, y) · ψ((x − p)/a, (y − q)/a),
wherein W(a, p, q) represents the two-dimensional wavelet transform result; f(x, y) represents the input two-dimensional signal or image, where x and y represent the position coordinates of the pixel; ψ(x, y) represents the two-dimensional wavelet basis function; a represents the scale parameter; p, q represent the translation parameters; X and Y represent the direction parameters; a⁻¹ represents the inverse of the scale, used to maintain energy conservation of the wavelet transform; Σ_{x, y} represents a summation over all pixel locations;
S105: calculating the covariance matrix, including the following formula:
PCA(X) = argmin_{Z, E} ‖X − Z − E‖_F² + λ · ‖E‖_F²,
wherein X represents the original data matrix; Z represents the projected data matrix; E represents the residual matrix; ‖X − Z − E‖_F² represents the squared Frobenius norm of the difference between the original data and the sum of the projected data and the residual, used to measure the reconstruction error; λ represents a regularization parameter balancing the reconstruction error against the size of the residual matrix; ‖E‖_F² represents the squared Frobenius norm of the residual matrix, used to measure the size of the residual;
s106: based on the preprocessed data, calculating the between-class scatter matrix and the within-class scatter matrix, and maximizing the discrimination ratio.
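A minimal sketch of the scatter-matrix computation in S106, assuming an LDA-style discrimination ratio; the two synthetic classes and their separation are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical classes of preprocessed feature vectors
# (e.g. "normal" vs "abnormal" sign data), 2-D for illustration.
X0 = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(50, 2))
X1 = rng.normal(loc=[3.0, 3.0], scale=0.5, size=(50, 2))
X = np.vstack([X0, X1])
mean_all = X.mean(axis=0)

S_b = np.zeros((2, 2))  # between-class scatter matrix
S_w = np.zeros((2, 2))  # within-class scatter matrix
for Xc in (X0, X1):
    mc = Xc.mean(axis=0)
    d = (mc - mean_all).reshape(-1, 1)
    S_b += len(Xc) * (d @ d.T)
    S_w += (Xc - mc).T @ (Xc - mc)

# The direction maximising w^T S_b w / w^T S_w w is the leading
# eigenvector of S_w^{-1} S_b; its eigenvalue is the discrimination ratio.
eigvals = np.linalg.eigvals(np.linalg.inv(S_w) @ S_b)
ratio = float(eigvals.real.max())
```

With well-separated classes the ratio is large; near-identical classes would drive it toward zero.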
As a preferred embodiment, based on step S2, the method comprises the following steps:
S201: based on the wavelet transformation, the covariance matrix, the inter-class walk matrix and the intra-class walk matrix, performing data feature importance assessment by using feature assessment indexes, and outputting a data feature assessment result;
s202: based on the data characteristic evaluation result, performing characteristic selection by using the characteristic selection algorithm, and outputting the sign data;
S203: establishing a tracking model, training the tracking model based on the sign data, and adjusting parameters of the tracking model or improving a feature extraction method according to the accuracy and robustness of the tracking model;
s204: and based on the sign data, using the trained tracking model for identifying and tracking the monitored object.
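The feature importance scoring of S201/S202 might look like the following correlation-based sketch; the feature names, the 0.5 relevance threshold and the synthetic data are all assumptions, not values from the patent.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Hypothetical candidate features: two label-dependent vital signs and one
# pure-noise channel. Labels: 0 = normal, 1 = abnormal.
y = rng.integers(0, 2, size=n)
hr = 60 + 25 * y + rng.normal(0, 3, n)      # heart-rate-like, informative
resp = 14 + 6 * y + rng.normal(0, 1, n)     # respiration-like, informative
noise = rng.normal(0, 1, n)                 # uninformative
X = np.column_stack([hr, resp, noise])

# Correlation-based importance score per feature (one of the indexes of S201).
scores = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])

# Keep features clearing a (hypothetical) relevance threshold, per S202.
selected = np.where(scores > 0.5)[0]
```

Variance and information gain would slot in as alternative scoring functions over the same candidate matrix.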
As a preferred embodiment, based on step S3, the method comprises the following steps:
s301: capturing the video image information, using the formula I(x, y, t) = f(x, y, t) to represent the pixel intensity and colour at the three-dimensional coordinates (x, y, t), where x and y are the pixel coordinates on the image plane, t represents time, and f(x, y, t) is the capture function;
s302: establishing a background model, which comprises the following formula:
B(x, y) = (1/N) · Σ_{t=1}^{N} I(x, y, t),
wherein N and M respectively represent the number of time frames and the pixel grid size;
S303: comparing the current frame with the background model, including the following formula:
D(x, y, t) = |I(x, y, t) − B(x, y)| + α · |C(x, y, t) − B_c(x, y)|,
wherein C(x, y, t) represents the colour and texture features of the pixel, obtained by colour space conversion or a texture extraction algorithm; B_c(x, y) represents the colour and texture features of the background; α is a weight parameter balancing the importance of the colour and texture differences;
s304: generating a segmentation mask, including the following formula:
M(x, y, t) = 1 if D_Gaussian(x, y, t) > T, and M(x, y, t) = 0 otherwise,
wherein D_Gaussian represents the difference image after Gaussian blur processing and T is a segmentation threshold;
S305: the segmentation mask is further processed using morphological operations and a region growing algorithm to obtain the position coordinates of the moving variant object, including the following formula:
moving object region = erosion(M(x, y, t)) ∧ dilation(M(x, y, t)),
wherein erosion(M(x, y, t)) denotes the erosion operation on the segmentation mask M(x, y, t), and dilation(M(x, y, t)) denotes the dilation operation on the segmentation mask M(x, y, t);
S306: determining whether the moving variant object is an active person using the deep learning algorithm, including the following formula: P_person(x, y, t) = CNN(P(x, y, t)), where CNN is a multi-layer convolutional neural network;
S307: decision making, including the following formula:
if P_person(x, y, t) > θ₁ and P_person(x, y, t) > θ₂, then a person is detected,
where θ₁ and θ₂ are two preset thresholds; a position is considered human when P_person(x, y, t) exceeds the thresholds, and a person is detected when at least one position (x, y, t) satisfies the condition.
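Steps S301 to S307 can be sketched as a frame-mean background model with a threshold mask; the CNN decision of S306 is replaced here by a simple pixel-count stand-in, and all sizes, intensities and thresholds are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# N hypothetical grayscale frames of a static scene.
N, H, W = 10, 8, 8
frames = rng.normal(100, 2, size=(N, H, W))
B = frames.mean(axis=0)        # background model B(x, y), per S302

# Current frame: same scene plus a bright 3x3 "moving object" patch.
current = rng.normal(100, 2, size=(H, W))
current[2:5, 2:5] += 60

D = np.abs(current - B)        # difference image, per S303 (no colour term)
T = 20.0
M = (D > T).astype(np.uint8)   # binary segmentation mask, per S304

# Stand-in for the CNN decision of S306/S307: any masked pixel counts.
moving_pixels = int(M.sum())
object_detected = moving_pixels > 0
```

A production system would feed the masked region to the classifier rather than thresholding pixel counts directly.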
As a preferred embodiment, based on step S305, the method includes the following steps:
S3051: Erosion operation;
S3052: Dilation operation;
S3053: region growing;
s3054: edge detection;
s3055: extracting contours and boundaries;
s3056: calculating the position and size of the moving object, wherein C(x′, y′, t′) represents the segmentation mask at time t′ and 1 − C(x′, y′, t′) represents the corresponding background mask, the position and size being computed over the mask region.
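The erosion and dilation of S3051/S3052 and the position estimate of S3056 can be sketched with a hand-rolled 3x3 binary morphology; the mask size and object placement are invented for illustration.

```python
import numpy as np

def erode(mask):
    """3x3 binary erosion: a pixel survives only if its whole neighbourhood is set."""
    h, w = mask.shape
    p = np.pad(mask, 1)
    out = np.ones_like(mask)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out &= p[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
    return out

def dilate(mask):
    """3x3 binary dilation: a pixel is set if any neighbour is set."""
    h, w = mask.shape
    p = np.pad(mask, 1)
    out = np.zeros_like(mask)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out |= p[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
    return out

# Segmentation mask with a solid 4x4 object plus one speck of noise.
M = np.zeros((10, 10), dtype=np.uint8)
M[3:7, 3:7] = 1
M[0, 9] = 1

# Morphological opening (erosion then dilation) removes the isolated speck.
opened = dilate(erode(M))

# Object position estimate as the mask centroid, per S3056.
ys, xs = np.nonzero(opened)
centroid = (float(ys.mean()), float(xs.mean()))
```

`scipy.ndimage.binary_erosion`/`binary_dilation` would replace the hand-rolled loops in practice.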
As a preferred embodiment, based on step S4, the method comprises the following steps:
S401: collecting monitoring information and executing privacy algorithm processing based on the monitoring information, wherein the privacy algorithm processing comprises the following steps:
Introducing random noise to confuse the monitoring information;
Processing the monitoring information by adopting a fuzzy algorithm;
Introducing a differential privacy mechanism, and preventing leakage of individual information by adding the random noise to the monitoring information;
encrypting the monitoring information by adopting an encryption algorithm;
desensitizing the sensitive part in the monitoring information;
obtaining the monitoring information subjected to privacy treatment;
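The differential-privacy mechanism mentioned above can be sketched with the standard Laplace mechanism; the sensitivity of 1, the epsilon of 0.5 and the heart-rate value are illustrative assumptions, not parameters given in the patent.

```python
import math
import random

def laplace_noise(scale, rng):
    """Zero-mean Laplace sample via the inverse-CDF transform."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_release(value, sensitivity, epsilon, rng):
    """Epsilon-DP Laplace mechanism: value + Lap(sensitivity / epsilon)."""
    return value + laplace_noise(sensitivity / epsilon, rng)

rng = random.Random(0)
true_heart_rate = 72.0

# Each released reading is masked by noise; only the aggregate stays accurate.
released = [dp_release(true_heart_rate, sensitivity=1.0, epsilon=0.5, rng=rng)
            for _ in range(5000)]
avg = sum(released) / len(released)
```

Smaller epsilon means stronger privacy and noisier individual releases, which is exactly the trade-off the privacy processing step has to tune.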
S402: continuously tracking the position and movement of the monitored object;
S403: filtering and processing the state information of the monitored object;
S404: emotion recognition and analysis, comprising the following steps:
extracting features from the physiological signal of the monitored object, the physiological signal being denoted S(t), where t represents time;
combining the extracted features into a feature vector X = [x₁, x₂, …, xₙ];
setting an emotion label Y and using a training set {(X_i, Y_i)}, where X_i is the feature vector of the i-th sample and Y_i is the corresponding emotion label;
setting the emotion model as Ŷ = f(X; θ), where θ represents the model parameters;
defining a loss function L(θ) that measures the deviation between the predictions f(X_i; θ) and the labels Y_i;
optimizing the model parameters: θ* = argmin_θ L(θ);
extracting a feature vector X_new from the new physiological signal;
performing emotion prediction with the trained model: Ŷ_new = f(X_new; θ*);
setting a judgment threshold τ to grade the emotion intensity, the emotion being judged intense when the predicted score exceeds τ and mild otherwise;
visualizing the emotion change trend, performing association analysis on abnormal emotions, and outputting the emotion analysis result;
S405: combining the filtered state information of the monitored object with the emotion analysis result to output comprehensive state data;
S406: designing an anomaly detection algorithm, judging whether an anomaly exists in the state data, and if so, turning to step S5.
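The emotion model f(X; θ) and threshold τ of S404 are left abstract in the text; the sketch below uses a nearest-centroid classifier as a minimal stand-in, with invented feature ranges and labels.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical physiological feature vectors X_i (heart rate, skin conductance)
# with emotion labels Y_i: 0 = calm, 1 = agitated.
X_calm = rng.normal([60.0, 2.0], [4.0, 0.3], size=(100, 2))
X_agit = rng.normal([95.0, 5.0], [4.0, 0.3], size=(100, 2))
X = np.vstack([X_calm, X_agit])
Y = np.array([0] * 100 + [1] * 100)

# Minimal stand-in for the trained model f(X; theta): nearest class centroid.
centroids = np.array([X[Y == c].mean(axis=0) for c in (0, 1)])

def predict(x):
    d = np.linalg.norm(centroids - x, axis=1)
    score = d[0] - d[1]          # positive means closer to "agitated"
    return int(score > 0), float(score)

# Judgment threshold tau grades the emotion intensity.
tau = 10.0
label, score = predict(np.array([97.0, 5.1]))
intense = score > tau
```

Any trained classifier producing a scalar score could replace `predict`; the thresholding step stays the same.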
As a preferred embodiment, based on step S5, the method comprises the following steps:
S501: collecting the cardiopulmonary heat signals of the monitored object by using the non-contact detector, and capturing cardiopulmonary micro-motion signals of the monitored object;
S502: performing temperature calibration and background denoising on the acquired cardiopulmonary heat signals, performing spectrum analysis by applying fast Fourier transform, and extracting respiratory frequency data and heart rate basic component data;
S503: doppler processing and time gating are carried out on the heart and lung micro-motion signals, vital sign motion information of the monitored object is extracted, wavelet transformation is carried out on the basis of the vital sign motion information, and heartbeat transient characteristics and respiratory transient characteristics are captured in a time-frequency domain;
S504: establishing the standard data structure, mapping the respiratory rate data, heart rate basic component data, heart beat transient characteristics and respiratory transient characteristics into the standard data structure, ensuring the synchronization of the time stamps, and scaling or converting the respiratory rate data, heart rate basic component data, heart beat transient characteristics and respiratory transient characteristics of different units or magnitudes;
S505: performing time synchronization on the acquired respiratory frequency data, heart rate basic component data, heart beat transient characteristics and respiratory transient characteristics, selecting a proper data fusion method according to application scenes and signal characteristics, giving weights to different non-contact detectors, processing by using the selected data fusion method, and outputting the comprehensive heart and lung function data set;
S506: establishing the deep learning model, training normal and abnormal heart and lung function characteristic modes, inputting the comprehensive heart and lung function data set into the deep learning model for real-time analysis, and outputting and storing analysis result data;
S507: matching the analysis result data against the cardiopulmonary thresholds; if a threshold is exceeded, the result is judged abnormal, and otherwise it is judged normal.
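The spectrum analysis of S502 and the threshold matching of S507 can be sketched as band-limited FFT peak picking; the simulated chest signal, the band edges and the per-patient thresholds are assumptions for illustration.

```python
import numpy as np

fs = 25.0
t = np.arange(1000) / fs     # 40 s of simulated non-contact chest signal

# Respiration at 0.25 Hz (15 breaths/min) plus a weaker heartbeat
# component at 1.25 Hz (75 beats/min).
signal = np.sin(2 * np.pi * 0.25 * t) + 0.3 * np.sin(2 * np.pi * 1.25 * t)

spec = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

def dominant(f_lo, f_hi):
    """Frequency of the strongest spectral peak inside a band."""
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return float(freqs[band][np.argmax(spec[band])])

resp_hz = dominant(0.1, 0.7)    # typical respiration band
heart_hz = dominant(0.8, 3.0)   # typical heart-rate band
heart_bpm = heart_hz * 60

# Threshold matching per S507: a value outside the band counts as abnormal.
hr_low, hr_high = 50, 120       # hypothetical per-patient thresholds
abnormal = not (hr_low <= heart_bpm <= hr_high)
```

Separating the two bands before peak picking is what lets a single thermal or radar channel yield both respiration and heart-rate estimates.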
As a preferred scheme, the emergency help-seeking information includes, but is not limited to, the name, age, abnormality information and geographical position information of the monitored object.
The health management intelligent system is used for monitoring the human cardiopulmonary function signals based on the human cardiopulmonary function signal monitoring method.
Compared with the prior art, the invention has obvious advantages and beneficial effects. Specifically, the technical scheme above mainly achieves the following:
1. Personalized assessment and intelligence enhancement: personalized evaluation and tracking are achieved by means of historical diagnosis information and body state characteristics; the feature set is optimized through feature evaluation and selection algorithms, improving evaluation accuracy and tracking precision; and the degree of tracking intelligence is improved by combining the surrounding video information with a deep learning algorithm;
2. Data privacy protection and multidimensional comprehensive analysis: a privacy processing algorithm is introduced to ensure the security of the monitoring information, and multidimensional data analysis of the cardiopulmonary function signals is realized by means of Fourier transform, wavelet transform and related techniques, improving the comprehensiveness and accuracy of the analysis;
3. Improved timeliness and completeness of emergency response: real-time cardiopulmonary function analysis is realized through the deep learning model, and the designed anomaly detection algorithm and emotion analysis provide complete emergency help-seeking information and abnormality alarms, enhancing the timeliness and completeness of the emergency response.
In order to more clearly illustrate the structural features and efficacy of the present invention, the present invention will be described in detail below with reference to the accompanying drawings and examples.
Drawings
Fig. 1 is a flowchart of a method for monitoring cardiopulmonary function signals of a human body according to an embodiment of the present invention.
Detailed Description
For the purpose of making the technical solution and advantages of the present invention more apparent, the present invention will be further described in detail below with reference to the accompanying drawings and examples of implementation. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
It will be understood that when an element is referred to as being "fixed to" another element, it can be directly on the other element or intervening elements may also be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present. The terms "vertical," "horizontal," "left," "right," and the like are used herein for illustrative purposes only.
Example 1:
referring to fig. 1, an embodiment of the present invention provides a method for monitoring cardiopulmonary function signals of a human body, including the following steps:
S1: Acquiring historical diagnosis information, setting cardiopulmonary thresholds, acquiring and storing the body state characteristic information of the monitored object, preprocessing it and outputting preprocessed data, performing Fourier transform on the preprocessed data to obtain a time domain signal, performing wavelet transform, and calculating the covariance matrix, the between-class scatter matrix and the within-class scatter matrix of the preprocessed data to maximize the discrimination ratio.
By the aid of the method, acquiring historical diagnosis information and setting cardiopulmonary thresholds ensures comprehensive knowledge of the health condition of the monitored object. Meanwhile, the body state characteristic information is collected and preprocessed, providing an accurate and reliable data base for subsequent analysis. The inherent characteristics of the data are further mined through Fourier transform, wavelet transform and related techniques, providing a more precise basis for discrimination, and statistics such as the covariance matrix are calculated to improve the accuracy and stability of the discrimination.
S2: and carrying out data feature importance assessment by using feature assessment indexes, obtaining a data feature assessment result, selecting by using a feature selection algorithm, outputting sign data, and carrying out tracking identification according to the sign data, wherein the feature assessment indexes comprise, but are not limited to, correlation, variance and information gain.
It is clear that the feature evaluation index is utilized to evaluate the importance of the data features, so that the features most sensitive to the state change of the monitored object can be screened out. These features are critical to tracking and identifying changes in the state of the monitored object. Through a feature selection algorithm, a feature set is further optimized, the dimension is reduced, and a more accurate data basis is provided for follow-up tracking and recognition.
S3: video image information of the surrounding environment of the monitored object is collected in real time, a background model is established, the monitored object is eliminated by comparing the current frame of the video image information with the background model, the moving change object is extracted, whether the moving change object is a moving person or not is judged by combining a deep learning algorithm, if the moving change object is the moving person, the step S4 is carried out, and otherwise, the step S5 is carried out.
By establishing the background model and comparing the current frame with it, the method can exclude the monitored object and extract the moving variant object, and, combined with a deep learning algorithm, can further judge whether the moving variant object is an active person, thereby providing a basis for judging whether the monitored object is alone and more accurate dynamic information for subsequent tracking and identification.
S4: blocking monitoring of the moving change object with a privacy processing algorithm, continuously tracking the position and movement of the monitored object based on the sign data, acquiring state information of the monitored object through filtering, performing emotion recognition and analysis, and outputting state data; if the state data is abnormal, proceeding to step S5.
It should be noted that the privacy processing algorithm ensures that the privacy of the moving change object is not violated, while the filtering process extracts the key state information of the monitored object for emotion recognition and analysis.
S5: collecting heart and lung heat signals of the monitored object with a non-contact detector, capturing the heart and lung micro-movements of the monitored object, establishing a standard data structure, synchronizing time stamps, and outputting a comprehensive cardiopulmonary function data set; establishing a deep learning model and training it on normal and abnormal cardiopulmonary function feature patterns; performing processing and analysis based on the comprehensive cardiopulmonary function data set, and obtaining and storing analysis result data; if the analysis result data is normal, proceeding to step S6, otherwise to step S7.
It is clear that establishing the standard data structure and the synchronization time stamp ensures the accuracy and reliability of the data. By establishing a deep learning model and training normal and abnormal heart and lung function characteristic modes, the accuracy and reliability of heart and lung function analysis can be further improved. Based on the comprehensive cardiopulmonary function data set, processing analysis is carried out, analysis result data is obtained and stored, and powerful support is provided for subsequent decisions.
S6: calling the analysis result data in real time and outputting a trend table, wherein the trend table is used by emergency personnel to judge the trend of the heart and lung health condition of the monitored object, and the emergency personnel formulate a personalized strategy and feed it back to the monitored object.
In this step, calling the analysis result data in real time and outputting the trend table allows emergency personnel to grasp the trend of the monitored object's heart and lung health condition intuitively; by formulating a personalized strategy, the emergency personnel can give the monitored object more targeted guidance, strengthening the interaction between them and improving the efficiency and effect of health management.
S7: triggering an alarm and outputting abnormality information to emergency personnel; the emergency personnel acquire the geographic position information of the monitored object based on the abnormality information, formulate an emergency strategy, and automatically identify and enable the needed emergency facilities; whether the monitored object is alone is judged based on the abnormality information; if the monitored object is not alone, the emergency personnel give emergency guidance through the display device at the monitored object's side, and if the monitored object is alone, a community service station is contacted for rescue according to the nearest-first principle, an authorized signal transmitting end is bridged, the signal coverage range is widened step by step from near to far, and emergency help-seeking information is transmitted.
This step ensures that an emergency response can be initiated quickly when an emergency occurs, providing timely and effective rescue for the monitored object. The emergency help-seeking information includes, but is not limited to, the name, age, abnormality information and geographic position information of the monitored object; this information is vital to emergency work, enabling the patient's condition to be located and understood quickly and accurately and improving emergency efficiency. The abnormality information may describe the abnormal cardiopulmonary function of the monitored object, which helps potential health problems to be found and handled in time; the geographic position information helps locate the patient quickly, shortening the emergency response time.
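The stepped, near-to-far widening of the help-seeking broadcast in step S7 can be sketched as follows; the station names, coordinates and step radii are illustrative assumptions, not values from the patent:

```python
import math

def broadcast_help(stations, origin, step_radii):
    """Widen the broadcast coverage step by step (near to far) until at
    least one station falls inside the radius.

    stations: list of (name, x, y) in metres; origin: (x, y) of the
    monitored object. Returns (radius_used, names_reached), or
    (None, []) if no station is reached at any step.
    """
    for radius in step_radii:
        reached = [name for name, sx, sy in stations
                   if math.hypot(sx - origin[0], sy - origin[1]) <= radius]
        if reached:  # nearest responders found; stop widening the range
            return radius, reached
    return None, []

# Illustrative stations around the monitored object at the origin.
stations = [("community_station", 300.0, 400.0), ("city_center", 4000.0, 0.0)]
radius, reached = broadcast_help(stations, (0.0, 0.0), [500, 2000, 5000])
```

The first radius that reaches a responder stops the escalation, so the nearest community station is notified before the wider, city-level broadcast is ever used.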
In the present embodiment, based on step S1, the following steps are included:
S101: according to the historical diagnosis information of the monitored object, the heart-lung threshold value is set, and by referring to the historical diagnosis information, the heart-lung function status can be estimated and monitored more accurately.
S102: and collecting the physical characteristic information of the monitored object, performing preliminary processing and transcoding to generate preprocessing data, and providing a basis for subsequent cardiopulmonary function evaluation.
S103: performing Fourier transform based on the preprocessed data to obtain a frequency domain signal, including the following formula:

F(k) = ∫ f(t)·e^(−2πikt) dt,

wherein F(k) represents the frequency domain signal; f(t) represents the time domain signal; k represents frequency and is a real number; t represents the time variable; |f(t)| represents the amplitude of the time domain signal; and ∠f(t) represents the phase angle of the time domain signal, whose value range is [0, 2π].
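In discrete form, the transform of S103 can be sketched as a naive DFT; the sample count and the test sinusoid below are illustrative, not part of the patent:

```python
import cmath
import math

def dft(samples):
    """Naive discrete Fourier transform:
    F[k] = sum_t f[t] * exp(-2*pi*i*k*t/N)."""
    n = len(samples)
    return [sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
            for k in range(n)]

# A sinusoid completing 8 cycles over 64 samples concentrates its
# spectral magnitude in bin k = 8 (and the mirror bin 64 - 8).
n = 64
signal = [math.sin(2 * math.pi * 8 * t / n) for t in range(n)]
spectrum = dft(signal)
peak_bin = max(range(1, n // 2), key=lambda k: abs(spectrum[k]))
```

The index of the dominant bin maps back to a physical frequency via the sampling rate, which is how periodic components of a physiological signal are located in the spectrum.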
S104: performing wavelet transform on the time domain signal, including the following formula:

W_ψ(a, p, q) = a⁻¹ · Σ_{x,y} f(x, y) · ψ_{X,Y}((x − p)/a, (y − q)/a),

wherein W_ψ(a, p, q) represents the two-dimensional wavelet transform result; f(x, y) represents the input two-dimensional signal or image, where x and y are the position coordinates of a pixel; ψ(x, y) represents the two-dimensional wavelet basis function; a represents the scale parameter; p and q represent translation parameters; X and Y represent direction parameters; a⁻¹ is the inverse of the scale, used to maintain energy conservation of the wavelet transform; and Σ_{x,y} represents summation over all pixel positions.
In this step, the time signals can be subjected to multi-scale analysis and grading through wavelet transformation, the characteristics of different frequencies and time scales in the signals are extracted, and the details and trends in the cardiopulmonary function signals can be better understood and analyzed.
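A minimal illustration of this multi-scale splitting, using a one-level 1-D Haar decomposition rather than the patent's two-dimensional basis (the signal values are made up):

```python
def haar_step(signal):
    """One level of a Haar wavelet transform: split an even-length signal
    into an approximation (local trend) and a detail (local change)."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

# Approximation keeps the coarse trend; detail isolates fine variation.
approx, detail = haar_step([4.0, 6.0, 10.0, 12.0])
```

Applying `haar_step` again to the approximation yields the next, coarser scale, which is the recursive structure that lets wavelet analysis separate slow respiratory trends from fast cardiac detail.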
S105: calculating a covariance matrix, comprising the following formula:

PCA(X) = argmin_{Z,E} ||X − Z − E||_F² + λ·||E||_F²,

wherein X represents the original data matrix; Z represents the projected data matrix; E represents the residual matrix; ||X − Z − E||_F² is the squared Frobenius norm of the difference between the original data and the sum of the projected data and the residual, used to measure the reconstruction error; λ is a regularization parameter balancing the reconstruction error against the size of the residual matrix; and ||E||_F² is the squared Frobenius norm of the residual matrix, used to measure the size of the residual.
S106: based on the preprocessed data, calculating a between-class scatter matrix and a within-class scatter matrix, maximizing the discrimination ratio.

This step calculates the between-class scatter matrix and the within-class scatter matrix to distinguish differences between classes from variation within the same class; maximizing the discrimination ratio improves the accuracy and reliability of classification and helps evaluate the cardiopulmonary function condition more precisely.
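For one-dimensional features, the scatter matrices and discrimination ratio of S106 reduce to a scalar between-class over within-class ratio; a sketch with made-up class samples:

```python
from statistics import mean

def fisher_ratio(classes):
    """Between-class over within-class scatter for 1-D features.

    classes: one list of feature values per class. A larger ratio means
    the classes are easier to separate by a threshold on this feature.
    """
    overall = mean(v for c in classes for v in c)
    s_between = sum(len(c) * (mean(c) - overall) ** 2 for c in classes)
    s_within = sum(sum((v - mean(c)) ** 2 for v in c) for c in classes)
    return s_between / s_within

# Two hypothetical features: one separates the classes, one does not.
well_separated = fisher_ratio([[1.0, 1.1, 0.9], [5.0, 5.1, 4.9]])
overlapping = fisher_ratio([[1.0, 5.0, 3.0], [1.1, 4.9, 3.1]])
```

Maximizing this ratio over feature directions is exactly what linear discriminant analysis does with the full scatter matrices; the scalar version shows why a large ratio indicates a discriminative feature.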
Based on step S2, the method comprises the following steps:
S201: based on the wavelet transform, the covariance matrix, the between-class scatter matrix and the within-class scatter matrix, evaluating the importance of the data features with the feature evaluation indexes, and outputting the data feature evaluation result.
Here, by using the feature evaluation index, the features most useful for the evaluation of cardiopulmonary function can be screened out, improving the accuracy and efficiency of the subsequent analysis.
S202: and based on the data characteristic evaluation result, performing characteristic selection by using a characteristic selection algorithm, and outputting sign data.
The feature set can be further optimized through the feature selection algorithm, redundant and irrelevant features are removed, and the accuracy and reliability of cardiopulmonary function signal analysis are improved.
S203: and establishing a tracking model, training the tracking model based on the sign data, and adjusting parameters of the tracking model or improving a feature extraction method according to the accuracy and the robustness of the tracking model to better track and identify the cardiopulmonary function state of the monitored object.
S204: based on the sign data, the trained tracking model is used for identifying and tracking the monitored object, so that the monitoring object's cardiopulmonary function state can be monitored and tracked in real time.
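The evaluation and selection of S201–S202 can be sketched filter-style, scoring features by correlation and variance (two of the indexes named in step S2); the feature names and values below are invented for illustration:

```python
from statistics import mean, pvariance

def pearson(xs, ys):
    """Pearson correlation; returns 0.0 when either series has zero variance."""
    mx, my = mean(xs), mean(ys)
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    if sx == 0 or sy == 0:
        return 0.0
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)

def rank_features(features, target):
    """Score each named feature by (|correlation with target|, variance)
    and return the names ordered best-first (a simple filter selection)."""
    scores = {name: (abs(pearson(vals, target)), pvariance(vals))
              for name, vals in features.items()}
    return sorted(scores, key=lambda name: scores[name], reverse=True)

target = [60, 72, 85, 95]                 # e.g. observed heart rate
features = {
    "breath_rate": [12, 14, 17, 19],      # tracks the target closely
    "constant":    [5, 5, 5, 5],          # zero variance: uninformative
}
ranking = rank_features(features, target)
```

Keeping only the top-ranked names is the "output sign data" step: redundant, constant or uncorrelated features are dropped before the tracking model is trained.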
Based on step S3, the method comprises the following steps:
S301: capturing the video image information; the pixel intensity and color at three-dimensional coordinates (x, y, t) are represented by the formula I(x, y, t) = f(x, y, t), where x and y are the coordinates of a pixel on the image plane, t represents time and f(x, y, t) is the capture function; in particular, I(x, y, t) may be represented as RGB values or in a more complex color space such as HSV or YUV.
S302: establishing a background model, which comprises the following formula:

B(x, y) = (1 / (N·M)) · Σ_{t=1}^{N} Σ_{(x′,y′)∈Ω(x,y)} I(x′, y′, t),

where N and M represent the number of time frames and the pixel grid size respectively, and Ω(x, y) denotes the M-pixel neighbourhood of (x, y); the model takes both the spatial and the temporal average of the pixels into account to represent the background more accurately.
S303: comparing the current frame to the background model, comprising the following formula:

D(x,y,t)=|I(x,y,t)-B(x,y)|+α·|C(x,y,t)-Bc(x,y)|,

wherein C(x, y, t) represents the color and texture characteristics of the pixel, obtained by color space conversion or a texture extraction algorithm; Bc(x, y) represents the color and texture features of the background; and α is a weight parameter used to balance the importance of the color and texture differences.
S304: generating a segmentation mask comprising the following formula:

M(x, y, t) = 1 if D_Gaussian(x, y, t) > T, otherwise 0,

wherein D_Gaussian represents the difference image after Gaussian blur processing and T is a segmentation threshold.
S305: the segmentation mask is further processed using morphological operations and a region growing algorithm to obtain the position coordinates of the moving change object, comprising the following formula:

moving object position = Erode(M(x, y, t)) & Dilate(M(x, y, t)),

wherein Erode(M(x, y, t)) denotes the erosion of the segmentation mask M(x, y, t) and Dilate(M(x, y, t)) denotes its dilation; the results of the two operations are combined to obtain the position coordinates of the moving object.
S306: determining whether the moving change object is a moving person using a deep learning algorithm, comprising the following formula: P_person(x, y, t) = CNN(P(x, y, t)), where CNN is a multi-layer neural network whose input is the image of the moving change object and whose output is a probability value; to improve classification accuracy, the CNN may be trained and optimized with techniques such as data augmentation and transfer learning.
S307: decision making, comprising the following formula:

if (P_person(x, y, t) > θ1 and P_person(x, y, t) > θ2) then PersonDetected,

wherein θ1 and θ2 are two preset thresholds; other rules and features, such as motion trajectories and behavior patterns, may also be considered to improve robustness. The moving change object is considered a person when at least one position (x, y, t) satisfies both threshold conditions.
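Steps S302–S304 can be sketched with a per-pixel temporal mean and a thresholded difference; the frame size, pixel values and threshold below are illustrative:

```python
def mean_background(frames):
    """Per-pixel temporal mean of a stack of grayscale frames: B(x, y)."""
    n = len(frames)
    h, w = len(frames[0]), len(frames[0][0])
    return [[sum(f[y][x] for f in frames) / n for x in range(w)]
            for y in range(h)]

def motion_mask(frame, background, threshold):
    """Binary segmentation mask: 1 where |I - B| exceeds the threshold."""
    return [[1 if abs(p - b) > threshold else 0
             for p, b in zip(frame_row, bg_row)]
            for frame_row, bg_row in zip(frame, background)]

# Five identical frames of a static 3x3 scene, then a bright object
# appears at pixel (1, 1) of the current frame.
frames = [[[10, 10, 10] for _ in range(3)] for _ in range(5)]
background = mean_background(frames)
current = [[10, 10, 10],
           [10, 200, 10],
           [10, 10, 10]]
mask = motion_mask(current, background, threshold=30)
```

The mask isolates exactly the changed pixel, which is the input the later morphological and classification stages operate on; the color/texture term and Gaussian blur of S303–S304 are omitted here for brevity.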
Further, based on step S305, the method includes the following steps:
s3051: an erosion operation comprising the following formula: M_eroded(x, y, t) = M(x, y, t) ⊖ B1, wherein B1 is the structuring element for the erosion operation.
S3052: a dilation operation comprising the following formula: M_dilated(x, y, t) = M_eroded(x, y, t) ⊕ B2, wherein B2 is the structuring element for the dilation operation.
S3053: region growing, comprising the following formula: R(x, y, t) = ⋃_{i=1}^{N} Region_grow_i(M_dilated(x, y, t)), wherein Region_grow_i(M_dilated(x, y, t)) represents the i-th region growing operation performed on M_dilated(x, y, t) and N is the number of region growing operations.
S3054: edge detection, comprising the following formula: e (x, y, t) =edge_detect (R (x, y, t)), where edge_detect (R (x, y, t)) represents performing an edge detection operation on R (x, y, t).
S3055: extracting contours and boundaries, including the following formula: C(x, y, t) = extract_contours(E(x, y, t)), where extract_contours(E(x, y, t)) represents performing contour and boundary extraction on E(x, y, t); this step helps capture the distinct edge features between the moving change object and the background.
S3056: calculating the position and size of the moving object, including the following formulas:

position(t′) = ( Σ_{x′,y′} x′·C(x′, y′, t′) / Σ_{x′,y′} C(x′, y′, t′), Σ_{x′,y′} y′·C(x′, y′, t′) / Σ_{x′,y′} C(x′, y′, t′) ),

size(t′) = Σ_{x′,y′} C(x′, y′, t′),

wherein C(x′, y′, t′) represents the segmentation mask at time t′ and 1 − C(x′, y′, t′) represents the background mask at time t′.
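The morphological cleanup of S3051–S3052 followed by the position/size computation of S3056 can be sketched on a small binary mask; the structuring-element size and the toy mask are illustrative:

```python
def erode(mask, size=3):
    """Binary erosion with a size x size square structuring element:
    a pixel survives only if its whole neighbourhood is foreground."""
    h, w, r = len(mask), len(mask[0]), size // 2
    return [[1 if all(mask[i][j]
                      for i in range(max(0, y - r), min(h, y + r + 1))
                      for j in range(max(0, x - r), min(w, x + r + 1)))
             else 0
             for x in range(w)] for y in range(h)]

def dilate(mask, size=3):
    """Binary dilation: a pixel becomes foreground if any neighbour is."""
    h, w, r = len(mask), len(mask[0]), size // 2
    return [[1 if any(mask[i][j]
                      for i in range(max(0, y - r), min(h, y + r + 1))
                      for j in range(max(0, x - r), min(w, x + r + 1)))
             else 0
             for x in range(w)] for y in range(h)]

def centroid_and_size(mask):
    """position = (sum x*C / sum C, sum y*C / sum C), size = sum C."""
    total = xsum = ysum = 0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                total += 1
                xsum += x
                ysum += y
    return ((xsum / total, ysum / total), total) if total else (None, 0)

# A 3x3 blob plus one speck of noise at (row 2, col 4): erosion removes
# the speck, dilation restores the blob (a morphological "opening").
noisy = [[0, 0, 0, 0, 0],
         [0, 1, 1, 1, 0],
         [0, 1, 1, 1, 1],
         [0, 1, 1, 1, 0],
         [0, 0, 0, 0, 0]]
opened = dilate(erode(noisy))
(position, size) = centroid_and_size(opened)
```

Erosion-then-dilation keeps solid regions while discarding isolated noise pixels, so the centroid and size computed afterwards describe the actual moving object rather than segmentation artefacts.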
Still further, based on step S4, the method comprises the following steps:
S401: collecting monitoring information, denoted I(x, y, t), and executing the privacy processing; the privacy processing algorithm P comprises the following steps:
introducing random noise to confuse the monitoring information and make it harder to recover, including the following formula:
Inoisy(x, y, t) = I(x, y, t) + η,
wherein η is random noise;
processing the monitoring information with a blurring algorithm so that specific details are difficult to identify, including the following formula: Iblurred(x, y, t) = Blur(I(x, y, t));
introducing a differential privacy mechanism to prevent leakage of individual information by adding random noise to the monitoring information, including the following formula: IDP(x, y, t) = I(x, y, t) + NoiseDP;
encrypting the monitoring information with an encryption algorithm so that only authorized users can decrypt and obtain the original information, including the following formula: Iencrypted(x, y, t) = Encrypt(I(x, y, t));
desensitizing sensitive parts of the monitoring information, for example by replacing sensitive values or using generalized labels, including the following formula: Ide-identified(x, y, t) = De-identify(I(x, y, t));
obtaining the privacy-processed monitoring information, including the following formula:
Iprivate(x, y, t) = P(I(x, y, t)) = Combination(Inoisy, Iblurred, IDP, Iencrypted, Ide-identified);
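Three of the privacy operations above (noise injection, blurring, desensitization) can be sketched on a toy grayscale patch; the noise σ, the 3x3 blur kernel and the redaction region are illustrative assumptions, and encryption and differential privacy are omitted for brevity:

```python
import random

def add_noise(image, sigma, seed=0):
    """I_noisy = I + eta: seeded Gaussian noise (sigma is illustrative)."""
    rng = random.Random(seed)
    return [[p + rng.gauss(0, sigma) for p in row] for row in image]

def box_blur(image):
    """3x3 mean blur so fine details are no longer identifiable."""
    h, w = len(image), len(image[0])
    def neighbourhood_mean(y, x):
        vals = [image[i][j]
                for i in range(max(0, y - 1), min(h, y + 2))
                for j in range(max(0, x - 1), min(w, x + 2))]
        return sum(vals) / len(vals)
    return [[neighbourhood_mean(y, x) for x in range(w)] for y in range(h)]

def redact(image, region, fill=0):
    """Desensitize a rectangular region given as (top, left, bottom, right)."""
    top, left, bottom, right = region
    return [[fill if top <= y < bottom and left <= x < right else p
             for x, p in enumerate(row)]
            for y, row in enumerate(image)]

# Illustrative 3x3 grayscale patch; the bright column is the detail to hide.
patch = [[50, 200, 50],
         [50, 200, 50],
         [50, 50, 50]]
private = redact(box_blur(add_noise(patch, sigma=2.0)), region=(0, 0, 2, 3))
```

Chaining the operations mirrors the Combination step: each stage degrades a different kind of identifying detail, and the redacted region is destroyed outright while the rest stays usable for state monitoring.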
S402: continuously tracking the position and motion of the monitored object, comprising the following formulas:

dx/dt = v_x, dy/dt = v_y, dv_x/dt = F_x + ξ_x, dv_y/dt = F_y + ξ_y,

where (x, y) is the position of the monitored object, (v_x, v_y) is its velocity, F_x and F_y are external forces, and ξ_x and ξ_y are random noise.
S403: filtering and processing the state information of the monitored object, comprising the following formulas:

the state equation: x_{t+1} = x_t + v_{x,t}·Δt + ξ_x, y_{t+1} = y_t + v_{y,t}·Δt + ξ_y;

the observation equation: z_{x,t} = x_t + η_x, z_{y,t} = y_t + η_y,

where Δt is the time step, ξ_x and ξ_y are system noise, and η_x and η_y are observation noise.
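A filter consistent with the constant-velocity state and observation equations described here can be sketched as a minimal one-dimensional Kalman filter; the noise variances q and r and the initialisation are assumed values, not taken from the patent:

```python
def kalman_1d(observations, dt=1.0, q=0.01, r=4.0):
    """Minimal constant-velocity Kalman filter for one coordinate.

    observations: noisy positions z_t = x_t + eta. The state is
    (position, velocity); q and r are assumed process and observation
    noise variances. Returns the filtered position estimates.
    """
    x, v = observations[0], 0.0               # state estimate
    p00, p01, p10, p11 = 1.0, 0.0, 0.0, 1.0   # estimate covariance
    filtered = []
    for z in observations:
        # predict with the state equation x_{t+1} = x_t + v_t * dt
        x = x + v * dt
        p00 = p00 + dt * (p10 + p01) + dt * dt * p11 + q
        p01 = p01 + dt * p11
        p10 = p10 + dt * p11
        p11 = p11 + q
        # update with the observation equation z_t = x_t + eta
        k0 = p00 / (p00 + r)
        k1 = p10 / (p00 + r)
        innovation = z - x
        x += k0 * innovation
        v += k1 * innovation
        n00, n01 = (1 - k0) * p00, (1 - k0) * p01
        n10, n11 = p10 - k1 * p00, p11 - k1 * p01
        p00, p01, p10, p11 = n00, n01, n10, n11
        filtered.append(x)
    return filtered

smoothed = kalman_1d([10.0] * 30)                    # stationary object
tracked = kalman_1d([float(t) for t in range(100)])  # object moving at 1/step
```

Because velocity is part of the state, the filter learns the motion and tracks a moving object with vanishing lag while still suppressing observation noise; the same recursion is run independently for the y coordinate.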
S404: emotion recognition and analysis, comprising the following steps:
extracting features from the physiological signal of the monitored object, the physiological signal being denoted S(t), wherein t represents time; combining the extracted features into a feature vector X = [x1, x2, …, xn];
setting an emotion label Y and using a training set {(Xi, Yi)}, wherein Xi is the feature vector of the i-th sample and Yi is the corresponding emotion label;
setting the emotion model as Ŷ = f_θ(X), wherein θ represents the model parameters;
defining a loss function ℓ(θ) = Σ_i L(f_θ(Xi), Yi);
optimizing the model parameters: θ* = argmin_θ ℓ(θ);
extracting features X_new from the new physiological signal;
performing emotion prediction with the trained model: Ŷ_new = f_{θ*}(X_new);
setting a judgment threshold τ for judging the emotion intensity, including the following formula: the emotion is judged intense if Ŷ_new > τ, and mild otherwise;
and visualizing the emotion change trend, performing association analysis on abnormal emotions, and outputting an emotion analysis result.
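The thresholded intensity judgment with τ can be sketched with a stand-in linear-sigmoid emotion model f_θ; the weights, bias and features below are invented for illustration and do not represent a trained model:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def emotion_score(features, weights, bias):
    """A stand-in f_theta(X): linear model with sigmoid output in (0, 1)."""
    return sigmoid(sum(w * x for w, x in zip(weights, features)) + bias)

def judge_intensity(score, tau=0.7):
    """Apply the judgment threshold tau to the predicted emotion score."""
    return "intense" if score > tau else "mild"

# Illustrative features: normalized heart-rate rise and breathing-rate rise.
calm = emotion_score([0.1, 0.0], weights=[3.0, 2.0], bias=-2.0)
agitated = emotion_score([0.9, 0.8], weights=[3.0, 2.0], bias=-2.0)
labels = (judge_intensity(calm), judge_intensity(agitated))
```

In practice θ would come from minimizing the loss ℓ(θ) over the training set; the fixed weights here simply make the τ-based decision step concrete.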
S405: combining the filtered state information of the monitored object with the emotion analysis result to output comprehensive state data.
S406: designing an anomaly detection algorithm and judging whether an anomaly exists by monitoring for anomalies in the state data; if an anomaly exists, proceeding to step S5.
Based on step S5, the method comprises the following steps:
s501: and acquiring a heart-lung heat signal of the monitored object by using a non-contact detector, and capturing a heart-lung micro-motion signal of the monitored object.
S502: performing temperature calibration and background denoising on the acquired cardiopulmonary heat signals to remove interference factors and improve signal accuracy; applying fast Fourier transform for spectrum analysis and extracting respiratory frequency data and heart rate basic component data, providing a basis for subsequent cardiopulmonary function evaluation.
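The extraction of respiratory-frequency and heart-rate components in S502 can be sketched by picking the dominant DFT bin inside physiological frequency bands; the band limits, sample rate and synthetic signal are assumptions for illustration:

```python
import cmath
import math

def dft_magnitudes(samples):
    """Magnitude spectrum of a real signal, bins 0 .. N/2 - 1."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)))
            for k in range(n // 2)]

def dominant_freq(samples, fs, band):
    """Strongest frequency (Hz) inside a band, e.g. a respiratory band of
    about 0.1-0.5 Hz or a heart-rate band of about 0.8-3 Hz."""
    mags = dft_magnitudes(samples)
    n = len(samples)
    bins = [k for k in range(1, len(mags)) if band[0] <= k * fs / n <= band[1]]
    best = max(bins, key=lambda k: mags[k])
    return best * fs / n

# Synthetic thermal signal: 0.25 Hz breathing plus a weaker 1.25 Hz
# heartbeat component, sampled at 8 Hz for 16 seconds.
fs, n = 8.0, 128
sig = [math.sin(2 * math.pi * 0.25 * t / fs)
       + 0.4 * math.sin(2 * math.pi * 1.25 * t / fs)
       for t in range(n)]
breath_hz = dominant_freq(sig, fs, band=(0.1, 0.5))
heart_hz = dominant_freq(sig, fs, band=(0.8, 3.0))
```

Restricting the peak search to physiologically plausible bands is what lets a single spectrum yield both a breathing rate (breath_hz × 60 breaths/min) and a heart rate (heart_hz × 60 beats/min).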
S503: doppler processing and time gating are carried out on the heart and lung micro-motion signals, vital sign motion information of a monitored object is extracted, wavelet transformation is carried out on the basis of the vital sign motion information, and heart beat transient characteristics and respiratory transient characteristics are captured in a time-frequency domain at the same time, so that analysis dimension of heart and lung function data is further enriched.
S504: establishing a standard data structure, mapping the respiratory rate data, the heart rate basic component data, the heart beat transient characteristic and the respiratory transient characteristic into the standard data structure, ensuring the synchronization of time stamps, scaling or converting the respiratory rate data, the heart rate basic component data, the heart beat transient characteristic and the respiratory transient characteristic of different units or magnitudes, and providing a unified standard for subsequent data analysis and processing.
S505: time synchronization is carried out on the acquired respiratory frequency data, heart rate basic component data, heart beat transient characteristics and respiratory transient characteristics, then a proper data fusion method is selected according to application scenes and signal characteristics, weights of different non-contact detectors are given, and then the selected data fusion method is used for processing, so that a comprehensive heart and lung function data set is output;
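The weighted fusion of S505 can be sketched as a normalized weighted average across detectors; the detector names, readings and weights are illustrative assumptions:

```python
def fuse(readings, weights):
    """Weighted fusion of the same quantity reported by several detectors.

    readings and weights are dicts keyed by detector name; weights are
    normalized to sum to 1 before averaging, so only their ratios matter.
    """
    total = sum(weights[name] for name in readings)
    return sum(readings[name] * weights[name] / total for name in readings)

# Three non-contact detectors report breathing rate (breaths/min); in this
# assumed scenario the thermal detector is trusted most.
readings = {"thermal": 16.0, "radar": 15.0, "camera": 18.0}
weights = {"thermal": 0.5, "radar": 0.3, "camera": 0.2}
fused = fuse(readings, weights)
```

Giving each detector a weight matched to its reliability in the current scene is the "application scenario and signal characteristics" choice the step describes; a single noisy detector then only shifts the fused value in proportion to its weight.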
S506: the deep learning model is built, normal and abnormal heart and lung function characteristic mode training is carried out, then the comprehensive heart and lung function data set is input into the deep learning model for real-time analysis, the heart and lung function state can be rapidly and accurately identified and analyzed, and analysis result data is output and stored.
S507: and matching the analysis result data with the heart and lung threshold value, judging that the analysis result data is abnormal if the heart and lung threshold value is exceeded, and judging that the analysis result data is normal if the heart and lung threshold value is not exceeded.
Example 2:
the embodiment of the invention provides a health management intelligent system, which monitors human cardiopulmonary function signals based on the human cardiopulmonary function signal monitoring method of the embodiment 1.
The above description is only of the preferred embodiments of the present invention and is not intended to limit the invention, but any modifications, equivalents, improvements, etc. within the principles of the present invention should be included in the scope of the present invention.

Claims (9)

1. A method for monitoring cardiopulmonary function signals of a human body, comprising the steps of:
S1: acquiring historical diagnosis information, setting a heart-lung threshold value, acquiring and storing body state characteristic information of a monitored object, preprocessing it and outputting preprocessed data, performing Fourier transform on the preprocessed data to acquire a frequency domain signal, performing wavelet transform, and calculating a covariance matrix, a between-class scatter matrix and a within-class scatter matrix of the preprocessed data to maximize a discrimination ratio;
S2: carrying out data feature importance assessment by using feature assessment indexes, obtaining a data feature assessment result, selecting by using a feature selection algorithm, outputting sign data, and carrying out tracking identification according to the sign data, wherein the feature assessment indexes comprise but are not limited to relevance, variance and information gain;
S3: collecting video image information of the surrounding environment of the monitored object in real time, establishing a background model, removing the monitored object by comparing the current frame of the video image information with the background model, extracting a moving change object, judging whether the moving change object is a movable person or not by combining a deep learning algorithm, and if the moving change object is the movable person, turning to a step S4, otherwise turning to a step S5;
S4: adopting a privacy processing algorithm to block the monitoring of the mobile change object, continuously tracking the position and the movement of the monitoring object based on the sign data, acquiring the state information of the monitoring object through filtering processing, identifying and analyzing the emotion, outputting the state data, and if the state data is abnormal, turning to step S5;
S5: collecting a heart and lung heat signal of the monitored object by using a non-contact detector, capturing heart and lung micro-movements of the monitored object, establishing a standard data structure, synchronizing a time stamp, outputting a comprehensive heart and lung function data set, establishing a deep learning model, performing normal and abnormal heart and lung function feature mode training, performing processing analysis based on the comprehensive heart and lung function data set, acquiring analysis result data and storing the analysis result data, and if the analysis result data is normal, switching to step S6, otherwise switching to step S7;
S6: the analysis result data are called in real time, and a trend table is output, wherein the trend table is used for emergency personnel to judge the trend of the heart and lung health condition of the monitored object, and the emergency personnel make personalized strategies and feed back the personalized strategies to the monitored object;
S7: triggering an alarm and outputting abnormality information to emergency personnel, wherein the emergency personnel acquire the geographical position information of the monitored object based on the abnormality information, formulate an emergency strategy, and automatically identify and enable the required emergency facilities; judging whether the monitored object is alone based on the abnormality information; if the monitored object is not alone, the emergency personnel give emergency guidance through the display device of the monitored object; if the monitored object is alone, contacting a community service station for rescue according to the nearest-first principle, bridging an authorized signal transmitting end, widening the signal coverage range step by step from near to far, and transmitting emergency help-seeking information.
2. The method for monitoring the cardiopulmonary function signals of a human body according to claim 1, based on step S1, comprising the steps of:
s101: setting the heart lung threshold according to the historical diagnosis information of the monitored object;
s102: collecting the physical characteristic information of the monitored object, performing preliminary processing and transcoding to generate preprocessing data;
S103: performing Fourier transform based on the preprocessed data to obtain a frequency domain signal, including the following formula:
F(k) = ∫ f(t)·e^(−2πikt) dt,
wherein F(k) represents the frequency domain signal; f(t) represents the time domain signal; k represents frequency and is a real number; t represents the time variable; |f(t)| represents the amplitude of the time domain signal; and ∠f(t) represents the phase angle of the time domain signal, with value range [0, 2π];
S104: performing wavelet transform on the time domain signal, wherein the wavelet transform comprises the following formula:
W_ψ(a, p, q) = a⁻¹ · Σ_{x,y} f(x, y) · ψ_{X,Y}((x − p)/a, (y − q)/a),
wherein W_ψ(a, p, q) represents the two-dimensional wavelet transform result; f(x, y) represents the input two-dimensional signal or image, where x and y are the position coordinates of a pixel; ψ(x, y) represents the two-dimensional wavelet basis function; a represents the scale parameter; p and q represent translation parameters; X and Y represent direction parameters; a⁻¹ is the inverse of the scale, used to maintain energy conservation of the wavelet transform; and Σ_{x,y} represents summation over all pixel positions;
S105: calculating the covariance matrix, comprising the following formula:
PCA(X) = argmin_{Z,E} ||X − Z − E||_F² + λ·||E||_F²,
wherein X represents the original data matrix; Z represents the projected data matrix; E represents the residual matrix; ||X − Z − E||_F² is the squared Frobenius norm of the difference between the original data and the sum of the projected data and the residual, used to measure the reconstruction error; λ is a regularization parameter balancing the reconstruction error against the size of the residual matrix; and ||E||_F² is the squared Frobenius norm of the residual matrix, used to measure the size of the residual;
s106: based on the preprocessed data, calculating a between-class scatter matrix and a within-class scatter matrix, and maximizing a discrimination ratio.
3. The method for monitoring the cardiopulmonary function signals of a human body according to claim 1, based on step S2, comprising the steps of:
S201: based on the wavelet transform, the covariance matrix, the between-class scatter matrix and the within-class scatter matrix, performing data feature importance evaluation using feature evaluation indexes, and outputting a data feature evaluation result;
s202: based on the data characteristic evaluation result, performing characteristic selection by using the characteristic selection algorithm, and outputting the sign data;
S203: establishing a tracking model, training the tracking model based on the sign data, and adjusting parameters of the tracking model or improving a feature extraction method according to the accuracy and robustness of the tracking model;
s204: and based on the sign data, using the trained tracking model for identifying and tracking the monitored object.
4. The method for monitoring the cardiopulmonary function signals of a human body according to claim 1, based on step S3, comprising the steps of:
s301: capturing the video image information, using the formula I (x, y, t) =f (x, y, t) to represent pixel intensity and color on three-dimensional coordinates (x, y, t), where x and y are coordinates of pixels on an image plane, t represents time, and f (x, y, t) is a capture function;
s302: establishing a background model, which comprises the following formula: B(x, y) = (1 / (N·M)) · Σ_{t=1}^{N} Σ_{(x′,y′)∈Ω(x,y)} I(x′, y′, t), wherein N and M respectively represent the number of time frames and the pixel grid size, and Ω(x, y) denotes the M-pixel neighbourhood of (x, y);
S303: comparing the current frame with a background model, comprising the following formula:
D(x,y,t)=|I(x,y,t)-B(x,y)|+α·|C(x,y,t)-Bc(x,y)|,
wherein C (x, y, t) represents the color and texture characteristics of the pixel, obtained by a color space conversion or texture extraction algorithm; b c (x, y) represents the color and texture features of the background; alpha is a weight parameter for balancing the importance of color and texture differences;
s304: generating a segmentation mask comprising the following formula: M(x, y, t) = 1 if D_Gaussian(x, y, t) > T, otherwise 0, wherein D_Gaussian represents the difference image after Gaussian blur processing and T is a segmentation threshold;
S305: the segmentation mask is further processed by using morphological operations and a region growing algorithm to obtain the position coordinates of the moving change object, comprising the following formula:
moving object position = Erode(M(x, y, t)) & Dilate(M(x, y, t)),
wherein Erode(M(x, y, t)) denotes the erosion of the segmentation mask M(x, y, t) and Dilate(M(x, y, t)) denotes its dilation;
S306: determining whether the moving variant object is an active person using the deep learning algorithm, comprising the following formula: p person (x, y, t) =cnn (P (x, y, t)), where CNN is a multi-layer neural network structure;
S307: decision making includes the following formulas:
if (P_person(x, y, t) > θ1 and P_person(x, y, t) > θ2) then PersonDetected,
wherein θ1 and θ2 are two preset thresholds, and the moving change object is considered a person when there exists at least one position (x, y, t) satisfying both threshold conditions.
5. The method for monitoring the cardiopulmonary function signal of a human body according to claim 4, based on step S305, comprising the steps of:
S3051: an erosion operation;
S3052: a dilation operation;
S3053: region growing;
s3054: edge detection;
s3055: extracting contours and boundaries;
s3056: calculating the position and size of the moving object, including the following formulas:
position(t′) = ( Σ_{x′,y′} x′·C(x′, y′, t′) / Σ_{x′,y′} C(x′, y′, t′), Σ_{x′,y′} y′·C(x′, y′, t′) / Σ_{x′,y′} C(x′, y′, t′) ),
size(t′) = Σ_{x′,y′} C(x′, y′, t′),
wherein C(x′, y′, t′) represents the segmentation mask at time t′ and 1 − C(x′, y′, t′) represents the background mask at time t′.
6. The method for monitoring the cardiopulmonary function signals of a human body according to claim 1, based on step S4, comprising the steps of:
S401: collecting monitoring information and executing privacy algorithm processing based on the monitoring information, wherein the privacy algorithm processing comprises the following steps:
Introducing random noise to confuse the monitoring information;
Processing the monitoring information by adopting a fuzzy algorithm;
Introducing a differential privacy mechanism, and preventing leakage of individual information by adding the random noise to the monitoring information;
encrypting the monitoring information by adopting an encryption algorithm;
desensitizing the sensitive part in the monitoring information;
obtaining the monitoring information subjected to privacy treatment;
S402: continuously tracking the position and movement of the monitored object;
S403: filtering and processing the state information of the monitored object;
S404: emotion recognition and analysis, comprising the following steps:
extracting features from the physiological signal of the monitored object, the physiological signal being denoted S(t), wherein t represents time;
combining the extracted features into a feature vector X = [x1, x2, …, xn];
setting an emotion label Y and using a training set {(Xi, Yi)}, wherein Xi is the feature vector of the i-th sample and Yi is the corresponding emotion label;
setting the emotion model as Ŷ = f_θ(X), wherein θ represents the model parameters;
defining a loss function ℓ(θ) = Σ_i L(f_θ(Xi), Yi);
optimizing the model parameters: θ* = argmin_θ ℓ(θ);
extracting features X_new from the new physiological signal;
performing emotion prediction with the trained model: Ŷ_new = f_{θ*}(X_new);
setting a judgment threshold τ for judging the emotion intensity, including the following formula: the emotion is judged intense if Ŷ_new > τ, and mild otherwise;
Visualizing emotion change trend, carrying out association analysis on abnormal emotion, and outputting emotion analysis results;
S405: combining the filtered state information of the monitored object with the emotion analysis result to output comprehensive state data;
S406: designing an abnormality detection algorithm, judging whether an abnormality exists by monitoring for anomalies in the comprehensive state data, and if so, proceeding to step S5.
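The abnormality check on the comprehensive state data can be sketched as a z-score outlier test; the threshold value and the sample series below are illustrative assumptions, not the claimed algorithm:

```python
import numpy as np

def detect_anomalies(series, z_thresh=3.0):
    """Flag samples whose z-score against the series mean exceeds the threshold."""
    series = np.asarray(series, dtype=float)
    mu, sigma = series.mean(), series.std()
    if sigma == 0:
        return np.zeros(len(series), dtype=bool)   # constant series: nothing anomalous
    return np.abs(series - mu) / sigma > z_thresh

data = [72, 71, 73, 72, 70, 140, 72]   # state-data series with one outlier sample
flags = detect_anomalies(data, z_thresh=2.0)
print(flags.any())                      # any anomaly triggers the step-S5 handoff
```

In the claimed flow, a True result here would be the condition that transfers control to step S5.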
7. The method for monitoring human cardiopulmonary function signals according to claim 1, wherein step S5 comprises the following steps:
S501: collecting the cardiopulmonary heat signals of the monitored object by using the non-contact detector, and capturing cardiopulmonary micro-motion signals of the monitored object;
S502: performing temperature calibration and background denoising on the acquired cardiopulmonary heat signals, performing spectrum analysis by applying fast Fourier transform, and extracting respiratory frequency data and heart rate basic component data;
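The spectrum analysis of S502 can be illustrated with NumPy's real FFT on a synthetic signal; the sampling rate, window length, component frequencies, and band limits are assumptions chosen for the sketch:

```python
import numpy as np

fs = 50.0                                    # assumed sampling rate (Hz)
t = np.arange(0, 40, 1 / fs)                 # 40 s window -> 0.025 Hz bin spacing
# Synthetic calibrated thermal signal: 0.25 Hz respiration + weaker 1.2 Hz heartbeat
signal = np.sin(2 * np.pi * 0.25 * t) + 0.3 * np.sin(2 * np.pi * 1.2 * t)

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / fs)

resp_band = (freqs >= 0.1) & (freqs <= 0.5)  # typical respiration band
hr_band = (freqs >= 0.8) & (freqs <= 3.0)    # typical heart-rate band
resp_hz = freqs[resp_band][np.argmax(spectrum[resp_band])]
hr_hz = freqs[hr_band][np.argmax(spectrum[hr_band])]
print(resp_hz * 60, hr_hz * 60)              # breaths/min, beats/min
```

Picking the spectral peak inside each physiological band is what separates the respiratory-frequency data from the heart-rate basic component in this step.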
S503: doppler processing and time gating are carried out on the heart and lung micro-motion signals, vital sign motion information of the monitored object is extracted, wavelet transformation is carried out on the basis of the vital sign motion information, and heartbeat transient characteristics and respiratory transient characteristics are captured in a time-frequency domain;
S504: establishing the standard data structure, mapping the respiratory rate data, heart rate basic component data, heart beat transient characteristics and respiratory transient characteristics into the standard data structure, ensuring the synchronization of the time stamps, and scaling or converting the respiratory rate data, heart rate basic component data, heart beat transient characteristics and respiratory transient characteristics of different units or magnitudes;
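The mapping and rescaling of S504 can be sketched as building one timestamped record with each quantity normalized against an assumed physiological range; the field names and range limits are illustrative, not part of the claim:

```python
def to_standard_record(ts, resp_rate, hr_base, hb_trans, resp_trans):
    """Map one synchronized sample into a standard data structure,
    scaling each quantity to [0, 1] against assumed physiological ranges."""
    scale = lambda x, lo, hi: (x - lo) / (hi - lo)
    return {
        "timestamp": ts,                           # shared timestamp keeps streams in sync
        "resp_rate": scale(resp_rate, 0.1, 0.5),   # Hz, assumed respiration range
        "hr_base": scale(hr_base, 0.8, 3.0),       # Hz, assumed heart-rate range
        "hb_transient": scale(hb_trans, 0.0, 1.0),
        "resp_transient": scale(resp_trans, 0.0, 1.0),
    }

rec = to_standard_record(1700000000.0, 0.25, 1.2, 0.4, 0.2)
print(rec["resp_rate"])
```

Normalizing quantities of different units onto a common scale is what lets the later fusion step weight them directly.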
S505: performing time synchronization on the acquired respiratory frequency data, heart rate basic component data, heart beat transient characteristics and respiratory transient characteristics, selecting a proper data fusion method according to application scenes and signal characteristics, giving weights to different non-contact detectors, processing by using the selected data fusion method, and outputting the comprehensive heart and lung function data set;
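In its simplest form, the detector-weighted fusion of S505 reduces to a normalized weighted average; the estimates and weights below are invented for illustration:

```python
import numpy as np

def fuse(estimates, weights):
    """Weighted-average fusion across detectors; weights encode per-sensor trust."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                 # normalize so the weights sum to 1
    return float(np.asarray(estimates, dtype=float) @ w)

# Heart-rate estimates (Hz) from, say, two radar channels and one thermal channel
hr = fuse([1.20, 1.25, 1.10], weights=[0.5, 0.3, 0.2])
print(round(hr, 3))
```

More elaborate schemes (Kalman filtering, covariance-weighted fusion) would slot in behind the same interface when the application scene demands it.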
S506: establishing the deep learning model, training it on normal and abnormal heart and lung function feature patterns, inputting the comprehensive heart and lung function data set into the deep learning model for real-time analysis, and outputting and storing the analysis result data;
S507: matching the analysis result data against the heart and lung threshold; judging the state abnormal if the threshold is exceeded, and normal if it is not.
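The threshold matching of S507 amounts to comparing each analysis value against a configured limit; the dictionary keys and limit values below are assumptions for the sketch:

```python
def check_cardiopulmonary(result, thresholds):
    """Return 'abnormal' if any analysis value exceeds its threshold, else 'normal'."""
    for key, limit in thresholds.items():
        if result.get(key, 0.0) > limit:
            return "abnormal"
    return "normal"

thresholds = {"heart_rate_bpm": 100, "resp_rate_bpm": 24}   # illustrative limits
print(check_cardiopulmonary({"heart_rate_bpm": 72, "resp_rate_bpm": 16}, thresholds))   # normal
print(check_cardiopulmonary({"heart_rate_bpm": 130, "resp_rate_bpm": 16}, thresholds))  # abnormal
```

An "abnormal" result here is what would trigger assembly of the first-aid distress information in the subsequent steps.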
8. The method for monitoring human cardiopulmonary function signals according to claim 1, wherein: the first-aid distress information includes, but is not limited to, the name and age of the monitored object, the abnormality information, and the geographic location information.
9. A health management intelligent system, characterized by: the health management intelligent system monitors the human cardiopulmonary function signal based on the human cardiopulmonary function signal monitoring method according to any one of claims 1-8.
CN202410115762.XA 2023-04-14 2024-01-29 Health management intelligent system and human cardiopulmonary function signal monitoring method Pending CN117954079A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2023103963946 2023-04-14
CN202310396394.6A CN116491913A (en) 2023-04-14 2023-04-14 Health management intelligent system and human cardiopulmonary function signal monitoring method

Publications (1)

Publication Number Publication Date
CN117954079A true CN117954079A (en) 2024-04-30

Family

ID=87323993

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202310396394.6A Pending CN116491913A (en) 2023-04-14 2023-04-14 Health management intelligent system and human cardiopulmonary function signal monitoring method
CN202410115762.XA Pending CN117954079A (en) 2023-04-14 2024-01-29 Health management intelligent system and human cardiopulmonary function signal monitoring method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202310396394.6A Pending CN116491913A (en) 2023-04-14 2023-04-14 Health management intelligent system and human cardiopulmonary function signal monitoring method

Country Status (1)

Country Link
CN (2) CN116491913A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117686890B (en) * 2024-02-01 2024-04-12 北京中成康富科技股份有限公司 Single board testing method and system for millimeter wave therapeutic apparatus

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999004685A1 (en) * 1997-07-25 1999-02-04 Vaeaenaenen Mikko A personal health status alarm method
CN106539574A (en) * 2017-01-03 2017-03-29 浙江物云科技有限公司 A kind of emergency cooperative health monitoring systems and method
CN109498059A (en) * 2018-12-18 2019-03-22 首都师范大学 A kind of contactless humanbody condition monitoring system and body state manage monitoring method
CN111192257A (en) * 2020-01-02 2020-05-22 上海电气集团股份有限公司 Method, system and equipment for determining equipment state
CN114190913A (en) * 2021-12-07 2022-03-18 东南大学 Millimeter wave radar-based driver driving state monitoring system and method
CN114246563A (en) * 2021-12-17 2022-03-29 重庆大学 Intelligent heart and lung function monitoring equipment based on millimeter wave radar
CN114732384A (en) * 2022-06-14 2022-07-12 亿慧云智能科技(深圳)股份有限公司 Heart health monitoring method and device based on microwave radar and storage medium
CN114983373A (en) * 2022-06-02 2022-09-02 谢俊 Method for detecting human heart rate
CN115422976A (en) * 2022-09-14 2022-12-02 湖南万脉医疗科技有限公司 Artificial network-based cardiopulmonary coupling relationship analysis method and monitoring system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11412937B2 (en) * 2017-03-29 2022-08-16 Texas Instruments Incorporated Multi-person vital signs monitoring using millimeter wave (mm-wave) signals
US10677905B2 (en) * 2017-09-26 2020-06-09 Infineon Technologies Ag System and method for occupancy detection using a millimeter-wave radar sensor
CN111481184B (en) * 2020-04-24 2022-07-01 华侨大学 Multi-target respiration heart rate monitoring method and system based on millimeter wave radar technology
CN111938613B (en) * 2020-08-07 2023-10-31 南京茂森电子技术有限公司 Health monitoring device and method based on millimeter wave radar


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
R. Lukocius et al.: "Physiological Parameters Monitoring System for Occupational Safety", Elektronika ir Elektrotechnika, 31 December 2014, pages 57-60 *

Also Published As

Publication number Publication date
CN116491913A (en) 2023-07-28

Similar Documents

Publication Publication Date Title
Harrou et al. An integrated vision-based approach for efficient human fall detection in a home environment
Geng et al. Enlighten wearable physiological monitoring systems: On-body rf characteristics based human motion classification using a support vector machine
Mubashir et al. A survey on fall detection: Principles and approaches
Zhou et al. Activity analysis, summarization, and visualization for indoor human activity monitoring
Zhang et al. RGB-D camera-based daily living activity recognition
Zhang et al. Driver drowsiness recognition based on computer vision technology
Johnson et al. A multi-view method for gait recognition using static body parameters
Gowsikhaa et al. Suspicious Human Activity Detection from Surveillance Videos.
Ghose et al. UbiHeld: ubiquitous healthcare monitoring system for elderly and chronic patients
Chen et al. A fall detection system based on infrared array sensors with tracking capability for the elderly at home
CN108710868A (en) A kind of human body critical point detection system and method based under complex scene
US20080123975A1 (en) Abnormal Action Detector and Abnormal Action Detecting Method
US20070291991A1 (en) Unusual action detector and abnormal action detecting method
CN117954079A (en) Health management intelligent system and human cardiopulmonary function signal monitoring method
WO2021068781A1 (en) Fatigue state identification method, apparatus and device
Zerrouki et al. Accelerometer and camera-based strategy for improved human fall detection
Sun et al. Real-time elderly monitoring for senior safety by lightweight human action recognition
Qiu et al. Skeleton-based abnormal behavior detection using secure partitioned convolutional neural network model
Thaman et al. Face mask detection using mediapipe facemesh
Albawendi et al. Video based fall detection with enhanced motion history images
Stein et al. Recognising complex activities with histograms of relative tracklets
CN109918994B (en) Commercial Wi-Fi-based violent behavior detection method
Jeoung et al. Thermal comfort prediction based on automated extraction of skin temperature of face component on thermal image
Kavya et al. Fall detection system for elderly people using vision-based analysis
JP6992900B2 (en) Information processing equipment, control methods, and programs

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination