CN113456027B - Sleep parameter assessment method based on wireless signals - Google Patents

Sleep parameter assessment method based on wireless signals

Info

Publication number
CN113456027B
CN113456027B · Application CN202110703691.1A
Authority
CN
China
Prior art keywords
time
sleep
bed
bedridden
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110703691.1A
Other languages
Chinese (zh)
Other versions
CN113456027A (en)
Inventor
方震
邹勇刚
赵荣建
何光强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Runnan Medical Electronic Research Institute Co ltd
Original Assignee
Nanjing Runnan Medical Electronic Research Institute Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Runnan Medical Electronic Research Institute Co ltd filed Critical Nanjing Runnan Medical Electronic Research Institute Co ltd
Priority to CN202110703691.1A priority Critical patent/CN113456027B/en
Publication of CN113456027A publication Critical patent/CN113456027A/en
Application granted granted Critical
Publication of CN113456027B publication Critical patent/CN113456027B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A61B 5/05 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B 5/08 Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B 5/1115 Monitoring leaving of a patient support, e.g. a bed or a wheelchair
    • A61B 5/1128 Measuring movement of the entire body or parts thereof using image analysis
    • A61B 5/4806 Sleep evaluation
    • G06F 18/2415 Classification techniques based on parametric or probabilistic models, e.g. based on likelihood ratio
    • G06T 7/0012 Biomedical image inspection
    • G06T 7/136 Segmentation; Edge detection involving thresholding
    • G06T 7/194 Segmentation; Edge detection involving foreground-background segmentation
    • G06T 2207/20021 Dividing image into blocks, subimages or windows

Abstract

The invention discloses a sleep parameter evaluation method based on wireless signals, comprising the following steps: S1, dividing a detection area and identifying the bed area; S2, identifying and recording the bedridden time of the person in the bed area; S3, classifying the bedridden time into time periods; S4, calculating sleep parameters from the classified bedridden time. Beneficial effects: the invention extracts human body position and respiration information from radio signals reflected off the body in order to calculate sleep parameters; the user neither wears a sensor nor keeps a sleep log, multiple users in different beds can be monitored simultaneously, and the detected sleep parameters are clinically meaningful. By analyzing the reflected signals, the user's respiration, heart rate and position are extracted, the position of the user's bed is detected automatically, and the times of falling asleep and getting out of bed are identified, so that the subject's sleep parameters are monitored without intruding on the user's privacy.

Description

Sleep parameter assessment method based on wireless signals
Technical Field
The invention relates to the field of medical equipment and physiological signal detection, in particular to a sleep parameter evaluation method based on wireless signals.
Background
Insomnia and insufficient sleep are common health problems, and sleep monitoring is important both for detecting and for treating insomnia. Insomnia and insufficient sleep are generally assessed with the following sleep parameters: sleep latency (SL), the time between getting into bed and falling asleep; total sleep time (TST), the total time actually spent asleep; bedridden time (TIB, time in bed), the time difference between getting into bed and getting out of bed; sleep efficiency (SE), the ratio of total sleep time to total bedridden time (TST/TIB); and wake after sleep onset (WASO), the total duration of wakefulness occurring after sleep onset.
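The arithmetic relating these parameters can be sketched as follows; the function name and the encoding of event times as minutes are illustrative, not from the patent:

```python
def sleep_parameters(t_in_bed, t_sleep_onset, t_out_of_bed, waso_min):
    """Compute the sleep parameters defined above from event times.

    All times are minutes since an arbitrary reference; `waso_min` is the
    total minutes of wakefulness after sleep onset.
    """
    tib = t_out_of_bed - t_in_bed                     # bedridden time (TIB)
    sl = t_sleep_onset - t_in_bed                     # sleep latency (SL)
    tst = (t_out_of_bed - t_sleep_onset) - waso_min   # total sleep time (TST)
    se = tst / tib                                    # sleep efficiency (SE = TST/TIB)
    return {"TIB": tib, "SL": sl, "TST": tst, "WASO": waso_min, "SE": se}

# e.g. in bed at 23:00 (minute 1380), asleep at 23:20, out of bed at 07:00,
# with 30 minutes of wakefulness during the night
params = sleep_parameters(t_in_bed=1380, t_sleep_onset=1400,
                          t_out_of_bed=1860, waso_min=30)
```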
The gold standard for sleep monitoring is overnight polysomnography (PSG), performed in a hospital or sleep laboratory, where the subject sleeps while wearing electroencephalogram, electrocardiogram, electromyogram, respiration and pulse monitors; however, the discomfort caused by the sensors can itself disturb sleep and thus the measurement. Patient diaries are more commonly used to monitor insomnia: subjects record when they go to bed each day, how long it takes them to fall asleep, how often they wake during the night, and so on. Keeping a sleep diary demands considerable effort and is difficult to sustain over a long period. The consumer health industry has developed devices that track sleep, but their accuracy is lower than that of medical-grade devices. Inferring sleep quality from smartphone microphones, accelerometers, cameras, phone-usage data and the like readily raises user-privacy concerns and is not easily set up for the elderly or for children.
For the problems in the related art, no effective solution has been proposed at present.
Disclosure of Invention
Aiming at the problems in the related art, the invention provides a sleep parameter evaluation method based on wireless signals, which aims to overcome the technical problems existing in the prior related art.
For this purpose, the invention adopts the following specific technical scheme:
a sleep parameter assessment method based on wireless signals, the method comprising the steps of:
s1, dividing a detection area and identifying a bed area;
s2, identifying and recording the bedridden time of the person in the bed area;
s3, classifying the bedridden time in time periods;
s4, calculating sleep parameters according to the classified bedridden time.
Further, in the step S1, the detecting area is divided, and the bed area is identified, which includes the following steps:
s11, dividing a detection area into 500 x 500 pixels by using a frequency modulation continuous wave radio and an antenna;
s12, dividing 500 x 500 pixels into empty, static and moving sets by adopting a filter;
s13, manufacturing a pixel plane histogram of the static set;
s14, extracting a static area according to the pixel plane histogram;
s15, identifying the bed area in the static area.
Further, the step of dividing 500×500 pixels into empty, stationary and moving sets in S12 by using a filter includes the steps of:
s121, measuring the reflection signal of each pixel every 50 milliseconds;
s122, capturing chest movement when people breathe by utilizing a band-pass filter around the breathing frequency, and judging whether people exist in the detection area;
s123, capturing human body movement by using a high-pass filter, and judging whether the person is moving or stationary;
s124, dividing 500 x 500 pixels into empty, static and moving sets according to detection results of the high-pass filter and the band-pass filter.
Further, the step of extracting the still region according to the pixel plane histogram in S14 includes the following steps:
s141, binarizing the plane histogram into a foreground and a background by adopting a gray image automatic threshold algorithm;
s142, if two connected static areas appear, normalizing the converted image according to the value of the connection part of the two static areas;
s143, separating the connected static areas by using a watershed algorithm.
Further, in S141, the formula for binarizing the plane histogram into the foreground and the background by using the gray image automatic threshold algorithm is as follows:
Var = w0 × (u0 − u)^2 + w1 × (u1 − u)^2
wherein w0 is the proportion of foreground points in the image and u0 their average gray level, w1 is the proportion of background points and u1 their average gray level, and u is the overall average gray level of the image.
Further, the formula for identifying the bed region among the stationary regions in S15 is:
ŷ_i = 1 if (1/D) Σ_{d=1}^{D} f(A_i; d) > γ, and ŷ_i = 0 otherwise
wherein ŷ_i is the final binary prediction for region A_i, f(·) is an SVM classifier, and f(A_i; d) is the binary prediction for region A_i on day d.
Further, the identifying and recording the bedridden time of the person in the bed region in S2 includes the steps of:
s21, constructing a hidden Markov model observation value according to a pixel static and moving set;
s22, learning a transition probability matrix and an observation probability matrix of the hidden Markov model by utilizing the manual marker training data;
s23, adopting dynamic programming of a Viterbi algorithm to obtain a state sequence with the maximum probability;
s24, determining the time for getting on/off the bed according to the state transition, and calculating and recording the bedridden time.
Further, in S23, the formula for finding the maximum-probability state sequence by dynamic programming with the Viterbi algorithm is:
s* = argmax over s_1, ..., s_N of P(s_1) P(o_1 | s_1) Π_{t=2}^{N} P(s_t | s_{t-1}) P(o_t | s_t)
wherein the hidden state at time t is modeled as s_t ∈ S and the observation as o_t ∈ O; the state set is S = {S0, S1}, with S0 denoting the in-bed state and S1 the out-of-bed state; the observation set is O = {(R_s, R_e)}, where the observation region is divided into the bed region (R0), a 50 cm-wide buffer zone surrounding the bed region (R1) and the outer region (R2); for each 5-second window of observed user positions, R_s is the region corresponding to the first second of the window and R_e the region corresponding to the last second; N is the total sequence length; P(s_1) is the initial-state probability.
Further, the classifying the bedridden time in the step S3 includes classifying the bedridden time into a sleep latency period, a sleep-in time period, and a wake-up time after the start of sleep.
Further, the step of classifying the bedridden time in S3 includes the following steps:
s31, measuring pixel reflection signals of an area where a person is located every 30 seconds in bedridden time;
s32, processing the pixel reflection signals into spectrograms and inputting the spectrograms into a CNN classifier;
s33, processing the spectrogram by using the constructed 18-layer CNN classifier;
s34, calculating a probability value of falling asleep time within a 30-second time period, and outputting the falling asleep time.
The beneficial effects of the invention are as follows: the invention extracts human body position and respiration information from radio signals reflected off the body in order to calculate sleep parameters; the user neither wears a sensor nor keeps a sleep log, multiple users in different beds can be monitored simultaneously, and the detected sleep parameters are clinically meaningful. By analyzing the reflected signals, the user's respiration, heart rate and position are extracted, the position of the user's bed is detected automatically, and the times of falling asleep and getting out of bed are identified, so that the subject's sleep parameters are monitored without intruding on the user's privacy.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are needed in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flow chart of a sleep parameter evaluation method based on wireless signals according to an embodiment of the invention;
fig. 2 is a schematic diagram of sleep parameter definition in a sleep parameter evaluation method based on wireless signals according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of an automatic thresholding method for gray scale images in a sleep parameter estimation method based on wireless signals according to an embodiment of the present invention;
FIG. 4 is a schematic flow chart of identifying a bed region in a sleep parameter assessment method based on wireless signals according to an embodiment of the present invention;
fig. 5 is a schematic flow chart of extracting a static region from a pixel plane histogram in a sleep parameter evaluation method based on a wireless signal according to an embodiment of the present invention;
fig. 6 is a flowchart of classifying bedridden time according to a sleep parameter evaluation method based on wireless signals according to an embodiment of the present invention.
Detailed Description
For the purpose of further illustrating the various embodiments, the present invention provides the accompanying drawings, which are a part of the disclosure of the present invention, and which are mainly used to illustrate the embodiments and, together with the description, serve to explain the principles of the embodiments, and with reference to these descriptions, one skilled in the art will recognize other possible implementations and advantages of the present invention, wherein elements are not drawn to scale, and like reference numerals are generally used to designate like elements.
According to an embodiment of the invention, a sleep parameter evaluation method based on wireless signals is provided.
The invention will now be further described with reference to the accompanying drawings and detailed description, as shown in fig. 1-6, a wireless signal-based sleep parameter evaluation method according to an embodiment of the invention, the method comprising the steps of:
s1, dividing a detection area and identifying a bed area;
s2, identifying and recording the bedridden time of the person in the bed area;
s3, classifying the bedridden time in time periods;
s4, calculating sleep parameters according to the classified bedridden time.
In one embodiment, the dividing the detection area and identifying the bed area in S1 includes the following steps:
s11, dividing a detection area into 500 x 500 pixels by using a frequency modulation continuous wave radio and an antenna;
s12, dividing 500 x 500 pixels into empty, static and moving sets by adopting a filter;
in particular, the system uses a combination of Frequency Modulated Continuous Wave (FMCW) radio and an antenna array. The detection area x-y plane is divided into M x M pixels. The reflected signal is measured every 50 milliseconds for each pixel, which is labeled with "have moving person", "have stationary person", or "have no person (null)" by two filters.
S13, manufacturing a pixel plane histogram of the static set;
s14, extracting a static area according to the pixel plane histogram;
s15, identifying the bed area in the static area.
In particular, an area where a person sits or lies down (a resting area) accumulates many "stationary person" labels over the course of a day and therefore appears as a peak in the histogram. The darker a pixel, the longer the dwell time at that location.
In one embodiment, the step of dividing 500×500 pixels into empty, stationary and moving sets in S12 using a filter includes the steps of:
s121, measuring the reflection signal of each pixel every 50 milliseconds;
s122, capturing chest movement when people breathe by utilizing a band-pass filter around the breathing frequency, and judging whether people exist in the detection area;
s123, capturing human body movement by using a high-pass filter, and judging whether the person is moving or stationary;
s124, dividing 500 x 500 pixels into empty, static and moving sets according to detection results of the high-pass filter and the band-pass filter.
In one embodiment, the extracting the still region according to the pixel plane histogram in S14 includes the steps of:
s141, binarizing the plane histogram into a foreground and a background by adopting a gray image automatic threshold algorithm;
s142, if two connected static areas appear, normalizing the converted image according to the value of the connection part of the two static areas;
s143, separating the connected static areas by using a watershed algorithm.
Specifically, to separate connected stationary regions and screen out regions that may be beds, the procedure is as follows:
1. binarize the input image into foreground and background using the automatic gray-image thresholding method; at this point two stationary regions may still be connected;
2. normalize the converted image according to the values at the junction of the two stationary regions, then apply a distance transform, computing for each foreground pixel its distance to the nearest background pixel; applying a distance threshold yields the central part of each stationary region, which serves as the marker for the watershed algorithm;
3. use the watershed algorithm to split off independent stationary regions, remove regions too small to be a bed, and keep the remaining regions as candidate bed regions.
Furthermore, the distance transform of a binary image computes, for every non-zero pixel, the distance to the nearest zero pixel, i.e. the shortest distance to the background. The most common distance-transform algorithm is implemented by repeated erosion, which stops once all foreground pixels have been eroded; the erosion pass in which each foreground pixel is removed gives its distance to the foreground's central skeleton, and different gray values are then assigned according to these distance values, completing the distance transform of the binary image. Here a distance threshold is applied to obtain the central part of each region as the marker for the watershed algorithm, and watershed segmentation is then performed using the original image and the markers.
In one embodiment, the formula for binarizing the plane histogram into the foreground and the background using the gray image automatic threshold algorithm in S141 is:
Var = w0 × (u0 − u)^2 + w1 × (u1 − u)^2
wherein w0 is the proportion of foreground points in the image and u0 their average gray level, w1 is the proportion of background points and u1 their average gray level, and u is the overall average gray level of the image.
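The automatic thresholding can be sketched as an exhaustive search for the threshold maximizing the between-class variance above (Otsu's method); the 256-level gray range is an illustrative assumption:

```python
def otsu_threshold(pixels, levels=256):
    """Return the threshold T maximizing the between-class variance
    Var = w0*(u0-u)^2 + w1*(u1-u)^2; pixels >= T are foreground."""
    n = len(pixels)
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    u = sum(g * h for g, h in enumerate(hist)) / n   # overall mean gray level
    best_t, best_var = 0, -1.0
    for t in range(1, levels):
        n0 = sum(hist[:t])            # background: gray levels below t
        n1 = n - n0                   # foreground: gray levels t and above
        if n0 == 0 or n1 == 0:
            continue
        w0, w1 = n0 / n, n1 / n
        u0 = sum(g * h for g, h in enumerate(hist[:t])) / n0
        u1 = sum(g * hist[g] for g in range(t, levels)) / n1
        var = w0 * (u0 - u) ** 2 + w1 * (u1 - u) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# a clearly bimodal histogram: threshold lands between the two modes
t = otsu_threshold([10] * 50 + [200] * 50)
```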
In one embodiment, the formula for identifying the bed region among the stationary regions in S15 is:
ŷ_i = 1 if (1/D) Σ_{d=1}^{D} f(A_i; d) > γ, and ŷ_i = 0 otherwise
wherein ŷ_i is the final binary prediction for region A_i, f(·) is an SVM classifier, and f(A_i; d) is the binary prediction for region A_i on day d.
To identify the bed among the stationary regions, using only the amount of time spent in a stationary region as a classification feature is not sufficient; the ratio of stationary time to moving time is also used: the sum of all "stationary person" labels over all pixels in the region, divided by the sum of all "moving person" labels over all pixels in the region. These two features are used to train an SVM bed classifier with a linear kernel on the distribution characteristics of the collected sample data. To decide whether a stationary region is a bed, its predicted labels over the past D days are examined: if the fraction of days on which it is labeled as a bed is greater than γ, the region is determined to be a bed, with defaults D = 7 and γ = 5/7.
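The D-day voting rule described above can be sketched directly; the defaults follow the text (D = 7, γ = 5/7), and the function name is illustrative:

```python
def is_bed(daily_predictions, gamma=5 / 7):
    """Decide whether a stationary region is a bed from its per-day SVM
    labels f(A_i; d) over the past D days (D = len(daily_predictions)).
    The region is a bed if the fraction of days labeled 1 exceeds gamma."""
    d = len(daily_predictions)
    return sum(daily_predictions) / d > gamma
```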
In one embodiment, the identifying and recording bed time of the person in the bed area in S2 comprises the steps of:
s21, constructing a hidden Markov model observation value according to a pixel static and moving set;
s22, learning a transition probability matrix and an observation probability matrix of the hidden Markov model by utilizing the manual marker training data;
s23, adopting dynamic programming of a Viterbi algorithm to obtain a state sequence with the maximum probability;
s24, determining the time for getting on/off the bed according to the state transition, and calculating and recording the bedridden time.
In particular, the error in locating a person in the environment from reflected radio signals may be as large as one meter, so when detecting the times of getting into and out of bed, the system treats the position measurements as noisy observations of the true state and infers the true state from the observations using a hidden Markov model.
In one embodiment, the formula for finding the maximum-probability state sequence in S23 by dynamic programming with the Viterbi algorithm is:
s* = argmax over s_1, ..., s_N of P(s_1) P(o_1 | s_1) Π_{t=2}^{N} P(s_t | s_{t-1}) P(o_t | s_t)
wherein the hidden state at time t is modeled as s_t ∈ S and the observation as o_t ∈ O; the state set is S = {S0, S1}, with S0 denoting the in-bed state and S1 the out-of-bed state; the observation set is O = {(R_s, R_e)}, where the observation region is divided into the bed region (R0), a 50 cm-wide buffer zone surrounding the bed region (R1) and the outer region (R2); for each 5-second window of observed user positions, R_s is the region corresponding to the first second of the window and R_e the region corresponding to the last second; N is the total sequence length; P(s_1) is the initial-state probability.
The Viterbi algorithm finds the most probable state sequence as follows:
1. Initialization: given P(s_1), the observation sequence o_1, ..., o_N, the transition probability matrix T and the observation probability matrix E, compute δ_1(s_1) = P(s_1) P(o_1 | s_1);
2. Recursion: for t = 2, 3, ..., N and for every possible state s_t ∈ {S0, S1}, compute the maximum probability over all single paths ending in state s_t at time t, δ_t(s_t) = max over s_{t-1} of δ_{t-1}(s_{t-1}) P(s_t | s_{t-1}) P(o_t | s_t), and record ψ_t(s_t), the state at time t−1 on the maximizing path;
3. Termination and backtracking: at t = N the optimal final state is s*_N = argmax over s_N of δ_N(s_N); for t = N−1, N−2, ..., 1, set s*_t = ψ_{t+1}(s*_{t+1}), yielding the optimal state sequence s*_1, ..., s*_N.
The bedridden time TIB can then be determined by using the hidden Markov model to predict when the person enters the bed (state transition from S1 to S0) and when the person leaves the bed (state transition from S0 to S1).
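The Viterbi procedure above can be sketched in pure Python. The state names, the single-region observations (the patent observes (R_s, R_e) pairs per 5-second window; a single region symbol is used here for brevity) and the example probabilities are illustrative; a real system would use the matrices T and E learned in S22:

```python
def viterbi(obs, states, p_init, trans, emit):
    """Most probable hidden-state sequence for an HMM.

    obs:    observation symbols o_1..o_N
    states: hidden states, e.g. ["in_bed", "out_of_bed"]
    p_init: {state: P(s_1)}
    trans:  {(s_prev, s): P(s | s_prev)}   (transition matrix T)
    emit:   {(s, o): P(o | s)}             (observation matrix E)
    """
    # initialization: delta_1(s) = P(s_1) * P(o_1 | s_1)
    delta = {s: p_init[s] * emit[(s, obs[0])] for s in states}
    back = []
    # recursion over t = 2..N
    for o in obs[1:]:
        new_delta, psi = {}, {}
        for s in states:
            prob, arg = max((delta[sp] * trans[(sp, s)], sp) for sp in states)
            new_delta[s] = prob * emit[(s, o)]
            psi[s] = arg                     # best predecessor state
        delta = new_delta
        back.append(psi)
    # termination and backtracking
    best = max(states, key=lambda s: delta[s])
    path = [best]
    for psi in reversed(back):
        path.append(psi[path[-1]])
    return list(reversed(path))

# sticky transitions, noisy region observations (illustrative numbers)
trans = {("in_bed", "in_bed"): 0.9, ("in_bed", "out_of_bed"): 0.1,
         ("out_of_bed", "in_bed"): 0.1, ("out_of_bed", "out_of_bed"): 0.9}
emit = {("in_bed", "R0"): 0.8, ("in_bed", "R2"): 0.2,
        ("out_of_bed", "R0"): 0.2, ("out_of_bed", "R2"): 0.8}
path = viterbi(["R2", "R2", "R0", "R0", "R0"], ["in_bed", "out_of_bed"],
               {"in_bed": 0.5, "out_of_bed": 0.5}, trans, emit)
```

The S1-to-S0 transition in `path` marks the time of getting into bed.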
In one embodiment, the classifying the bedridden time in S3 includes classifying the bedridden time into a sleep latency period, a sleep-in time, and a wake-up time after the sleep is started.
In one embodiment, the classifying the bedridden time in S3 includes the steps of:
s31, measuring pixel reflection signals of an area where a person is located every 30 seconds in bedridden time;
s32, processing the pixel reflection signals into spectrograms and inputting the spectrograms into a CNN classifier;
s33, processing the spectrogram by using the constructed 18-layer CNN classifier;
s34, calculating a probability value of falling asleep time within a 30-second time period, and outputting the falling asleep time. Specifically, in order to detect when the user falls asleep, and the sleep-awake time throughout the night, the bedridden time is divided into time periods of 30 seconds, each of which needs to be classified into two categories, sleep and awake. Specifically, for each time period, signals reflected from pixels labeled "stationary person" or "moving person" are considered, and these signals are used to estimate the user's breathing pattern and body movement. Classification is based on Convolutional Neural Networks (CNNs) on time signals. An 18-layer CNN classifier was built with the residual network model. The classifier takes the spectrogram of the signal in each time period as input to output the probability of sleeping of the person, namely a series of probabilities { P }, is obtained i } n Where n is the total number of 30 second time periods in bed time and i is the ith time period.
Sleep-onset time: when training the CNN, the model tries to maximize the overall accuracy of predicting sleep versus wake across the whole night, which does not maximize the accuracy of detecting the exact sleep-onset time. Therefore, a window is taken spanning 15 minutes before and after the first time period with P_i greater than 0.5. On top of the CNN, a gradient-boosting regressor is built; it takes the signals of all time periods in the window as input and predicts, for each, the probability of being the sleep-onset period, and the period with the highest probability is taken as the sleep-onset period k.
TST: beginning P from after time period k i >Total duration of all time periods of 0.5.
WASO: start P after time period k i <Total duration of all time periods=0.5.
SE: finally, sleep Efficiency (SE) is calculated directly as TST/TIB.
SL: the sleep latency time (SL) is the time difference between the time of getting in bed and the time of starting the sleep start period k.
In summary, by means of the above technical solution, the invention extracts human body position and respiration information from radio signals reflected off the body in order to calculate sleep parameters; the user neither wears a sensor nor keeps a sleep log, multiple users in different beds can be monitored simultaneously, and the detected sleep parameters are clinically meaningful. By analyzing the reflected signals, the user's respiration, heart rate and position are extracted, the position of the user's bed is detected automatically, and the times of falling asleep and getting out of bed are identified, so that the subject's sleep parameters are monitored without intruding on the user's privacy.
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, alternatives, and improvements that fall within the spirit and scope of the invention.

Claims (8)

1. A sleep parameter assessment method based on wireless signals, the method comprising the steps of:
s1, dividing a detection area and identifying a bed area;
s2, identifying and recording the bedridden time of the person in the bed area;
s3, classifying the bedridden time in time periods;
s4, calculating sleep parameters according to the classified bedridden time;
the step S1 of dividing the detection area and identifying the bed area comprises the following steps:
s11, dividing a detection area into 500 x 500 pixels by using a frequency-modulated continuous-wave (FMCW) radio and an antenna;
s12, dividing 500 x 500 pixels into empty, static and moving sets by adopting a filter;
s13, manufacturing a pixel plane histogram of the static set;
s14, extracting a static area according to the pixel plane histogram;
s15, identifying a bed area in the static area;
the step of extracting the still region according to the pixel plane histogram in S14 includes the following steps:
s141, binarizing the plane histogram into a foreground and a background by adopting a gray image automatic threshold algorithm;
s142, if two connected static areas appear, normalizing the converted image according to the value of the connection part of the two static areas;
s143, separating the connected static areas by using a watershed algorithm.
2. The method of claim 1, wherein the step of using a filter to divide 500 x 500 pixels into null, stationary and moving sets in S12 comprises the steps of:
s121, measuring the reflection signal of each pixel every 50 milliseconds;
s122, capturing chest movement when people breathe by utilizing a band-pass filter around the breathing frequency, and judging whether people exist in the detection area;
s123, capturing human body movement by using a high-pass filter, and judging whether the person is moving or stationary;
s124, dividing 500 x 500 pixels into empty, static and moving sets according to detection results of the high-pass filter and the band-pass filter.
3. The sleep parameter evaluation method based on wireless signals according to claim 2, wherein in S141 the automatic gray-image threshold algorithm finds the optimal threshold T that maximizes the inter-class variance between foreground and background, thereby binarizing the plane histogram into foreground and background, the inter-class variance of the foreground and background being:
Var = w0 × (u0 − u)^2 + w1 × (u1 − u)^2
wherein w0 is the proportion of foreground points in the image and u0 their average gray level, w1 is the proportion of background points and u1 their average gray level, and u is the overall average gray level of the image.
4. The sleep parameter assessment method according to claim 2, wherein the formula for identifying the bed region among the static regions in S15 is:

$\mathrm{Bed}(A_i) = \mathbb{1}\left[ \frac{1}{D} \sum_{d=1}^{D} f(A_i; d) > \frac{1}{2} \right]$

wherein $x_{A_i}$ is the feature vector for region $A_i$, $f(\cdot)$ is an SVM classifier, $f(A_i; d)$ is the binary prediction for region $A_i$ on day $d$, and $D$ is the number of observed days; a region is identified as a bed if it is predicted as such on the majority of days.
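Aggregating the per-day binary SVM outputs can be sketched as a majority vote over days; here `daily_preds[i][d]` stands in for f(A_i; d), and the vote fraction is an assumption about the aggregation rule rather than a value from the patent:

```python
import numpy as np

def identify_bed_regions(daily_preds, vote_frac=0.5):
    """daily_preds: (n_regions, n_days) array of binary SVM outputs f(A_i; d).

    A region is declared a bed if the classifier predicts 'bed' on more than
    vote_frac of the observed days (majority vote across days).
    """
    daily_preds = np.asarray(daily_preds)
    return daily_preds.mean(axis=1) > vote_frac

# Region 0 is voted a bed on 6 of 7 days, region 1 on only 1 of 7 days
preds = np.array([[1, 1, 1, 0, 1, 1, 1],
                  [0, 0, 1, 0, 0, 0, 0]])
print(identify_bed_regions(preds))  # [ True False]
```

Voting across days makes the identification robust to a single day's misclassification (e.g. a sofa occupied for one night).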
5. The sleep parameter assessment method based on wireless signals according to claim 4, characterized in that identifying and recording the bedridden time of the person in the bed region in S2 comprises the following steps:
S21, constructing the hidden Markov model observations from the pixel static and moving sets;
S22, learning the transition probability matrix and the observation probability matrix of the hidden Markov model from manually labeled training data;
S23, obtaining the state sequence with the maximum probability by dynamic programming with the Viterbi algorithm;
S24, determining the times of getting into and out of bed from the state transitions, and calculating and recording the bedridden time.
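Step S24 reduces to scanning the decoded state sequence for in-bed runs. A minimal sketch, assuming one decoded state per 5-second window (0 = in bed, 1 = out of bed):

```python
def bedridden_intervals(decoded, window_s=5):
    """Extract (on-bed time, off-bed time) pairs in seconds from a decoded
    HMM state sequence, one state per `window_s`-second window."""
    intervals, start = [], None
    for t, s in enumerate(decoded + [1]):   # sentinel closes a trailing in-bed run
        if s == 0 and start is None:
            start = t * window_s            # transition 1 -> 0: got into bed
        elif s == 1 and start is not None:
            intervals.append((start, t * window_s))  # transition 0 -> 1: got out
            start = None
    return intervals

decoded = [1, 1, 0, 0, 0, 0, 1, 1, 0, 0]
intervals = bedridden_intervals(decoded)
print(intervals)                                  # [(10, 30), (40, 50)]
print(sum(e - s for s, e in intervals))           # total bedridden time: 30 s
```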
6. The sleep parameter evaluation method based on wireless signals according to claim 5, wherein the state sequence with the largest probability is obtained in S23 by dynamic programming with the Viterbi algorithm:

$s_{1:N}^{*} = \arg\max_{s_{1:N}} \; \pi_{s_1}\, b_{s_1}(o_1) \prod_{t=2}^{N} a_{s_{t-1} s_t}\, b_{s_t}(o_t)$

wherein the hidden state at time t is modeled as $s_t$ and the observation at time t as $o_t$; the state set is $S = \{S_0, S_1\}$, with $S_0$ representing the in-bed state and $S_1$ the out-of-bed state; the observation set is $O = \{(R_s, R_e)\}$: the observation area is divided into a bed region $R_0$, a buffer region $R_1$ of 50 cm width surrounding the bed region, and an outer region $R_2$; the user position is observed in each 5-second window, with $R_s$ the region corresponding to the first second of the window and $R_e$ the region corresponding to the last second; N is the total length of the sequence; $\pi$ is the initial state probability; $a$ and $b$ are the transition and observation probabilities learned in S22.
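Steps S21–S23 can be sketched with a two-state HMM and a standard log-space Viterbi decoder. The transition and emission values below are illustrative, not the matrices learned in S22 (states: 0 = in bed, 1 = out of bed):

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most probable state sequence for observation indices `obs`.

    pi: initial state probabilities; A: transition matrix a[i, j];
    B: observation matrix b[state, obs]. Computed in log space for stability.
    """
    n_states, T = len(pi), len(obs)
    logp = np.full((T, n_states), -np.inf)   # best log-probability per (t, state)
    back = np.zeros((T, n_states), dtype=int)
    logp[0] = np.log(pi) + np.log(B[:, obs[0]])
    for t in range(1, T):
        for s in range(n_states):
            scores = logp[t - 1] + np.log(A[:, s])
            back[t, s] = np.argmax(scores)
            logp[t, s] = scores[back[t, s]] + np.log(B[s, obs[t]])
    path = [int(np.argmax(logp[-1]))]
    for t in range(T - 1, 0, -1):            # backtrack the argmax pointers
        path.append(back[t, path[-1]])
    return [int(s) for s in path[::-1]]

# Observations: 0 = seen in bed region R0, 1 = buffer R1, 2 = outside R2
pi = np.array([0.5, 0.5])
A = np.array([[0.95, 0.05],     # in-bed tends to stay in bed
              [0.05, 0.95]])
B = np.array([[0.80, 0.15, 0.05],   # in-bed state mostly emits "bed region"
              [0.05, 0.15, 0.80]])  # out-of-bed mostly emits "outside"
states = viterbi([0, 0, 0, 2, 2], pi, A, B)
print(states)  # [0, 0, 0, 1, 1] -> leaves bed at the 4th window
```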
7. The method according to claim 1, wherein the time-period classification of the bedridden time in S3 comprises classifying the bedridden time into sleep-onset latency, sleep time, and wake time after sleep onset.
8. The sleep parameter assessment method based on wireless signals according to claim 7, characterized in that the time-period classification of the bedridden time in S3 comprises the following steps:
S31, measuring the pixel reflection signals of the area where the person is located every 30 seconds during the bedridden time;
S32, processing the pixel reflection signals into spectrograms to be input to a CNN classifier;
S33, processing the spectrograms with the constructed 18-layer CNN classifier;
S34, calculating the probability of being asleep within each 30-second period, and outputting the falling-asleep time.
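For S31–S32, each 30-second reflection segment can be turned into a spectrogram (a time-frequency image) before being fed to the CNN. A minimal NumPy short-time Fourier transform sketch; the 20 Hz rate, frame length and hop are assumptions, and the 18-layer CNN of S33 is omitted:

```python
import numpy as np

FS = 20  # assumed per-pixel sampling rate (Hz)

def spectrogram(x, frame=64, hop=32):
    """Magnitude STFT of a 1-D reflection signal: rows = frames, cols = frequency bins."""
    window = np.hanning(frame)                     # taper to reduce spectral leakage
    frames = [x[i:i + frame] * window
              for i in range(0, len(x) - frame + 1, hop)]
    return np.abs(np.fft.rfft(np.array(frames), axis=1))

# 30 s of a 0.3 Hz "breathing" signal, as measured in S31
segment = np.sin(2 * np.pi * 0.3 * np.arange(30 * FS) / FS)
S = spectrogram(segment)
print(S.shape)  # (number of frames, frame // 2 + 1 frequency bins)
```

Each such 2-D array is one input sample for the classifier of S33, which outputs the per-window sleep probability of S34.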
CN202110703691.1A 2021-06-24 2021-06-24 Sleep parameter assessment method based on wireless signals Active CN113456027B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110703691.1A CN113456027B (en) 2021-06-24 2021-06-24 Sleep parameter assessment method based on wireless signals


Publications (2)

Publication Number Publication Date
CN113456027A CN113456027A (en) 2021-10-01
CN113456027B true CN113456027B (en) 2023-12-22

Family

ID=77872636



Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114543281B (en) * 2022-02-18 2023-08-18 青岛海信日立空调系统有限公司 Sleeping tool position detection method and device based on radar equipment and air conditioner indoor unit

Citations (4)

Publication number Priority date Publication date Assignee Title
JPH11341474A (en) * 1998-05-28 1999-12-10 Matsushita Electric Works Ltd Abnormality supervisory system
CN103325112A (en) * 2013-06-07 2013-09-25 中国民航大学 Quick detecting method for moving objects in dynamic scene
JP2015223215A (en) * 2014-05-26 2015-12-14 アイシン精機株式会社 Sleep evaluation device
CN106510663A (en) * 2016-11-28 2017-03-22 沃康(上海)家具有限公司 Sleep monitoring method based on internet of things

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
TWI317630B (en) * 2007-03-12 2009-12-01 Taiwan Textile Res Inst Respiration monitoring system
JP5771778B2 (en) * 2010-06-30 2015-09-02 パナソニックIpマネジメント株式会社 Monitoring device, program




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant