CN116048250A - Sleep management method and device based on wearable equipment - Google Patents

Sleep management method and device based on wearable equipment

Info

Publication number
CN116048250A
CN116048250A (application CN202211650972.6A)
Authority
CN
China
Prior art keywords
sleep
user
sleep stage
stage
acquisition module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211650972.6A
Other languages
Chinese (zh)
Inventor
孙小玄
肖晓
李玉春
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Fenda Intelligent Technology Co ltd
Original Assignee
Shenzhen Fenda Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Fenda Intelligent Technology Co ltd filed Critical Shenzhen Fenda Intelligent Technology Co ltd
Publication of CN116048250A

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/48: Other medical applications
    • A61B5/4806: Sleep evaluation
    • A61B5/4812: Detecting sleep stages or cycles
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/48: Other medical applications
    • A61B5/4806: Sleep evaluation
    • A61B5/4815: Sleep quality
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/68: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802: Sensor mounted on worn items
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235: Details of waveform analysis
    • A61B5/7264: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015: Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pathology (AREA)
  • Public Health (AREA)
  • Biophysics (AREA)
  • Artificial Intelligence (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mathematical Physics (AREA)
  • Physiology (AREA)
  • Psychiatry (AREA)
  • Signal Processing (AREA)
  • Dermatology (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • Fuzzy Systems (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention belongs to the technical field of intelligent wearing, and provides a sleep management method based on wearable equipment.

Description

Sleep management method and device based on wearable equipment
Technical Field
The invention relates to the technical field of intelligent wearing, in particular to a sleep management method and device based on wearable equipment.
Background
In modern life, sleep plays an important regulating role in people's lives. Good sleep helps improve efficiency at work and study, whereas long-term poor sleep quality can leave the body in a sub-healthy state or induce physiological illnesses, and can also lead to mental illnesses such as depression, bringing serious harm to health and even threatening life. It is therefore necessary to know the user's sleep state and historical sleep statistics in a timely and accurate manner, so that work and life can be adjusted appropriately according to the user's sleep state.
In the prior art, sleep state evaluation mainly performs stage judgment on sleep, i.e., distinguishing states such as deep sleep, light sleep and wakefulness. Among conventional methods, one judges sleep stages from the user's activity level and its duration based on parameters collected by an acceleration sensor, and computes a sleep quality score from parameters such as the duration of each sleep stage, the number of awakenings and the overall sleep period. Another method acquires pulse characteristics with a PPG sensor, derives sleep stages from heart rate variability (HRV) related features, and computes sleep quality from indexes such as the sleep stages. A third method fuses acceleration sensor data and PPG sensor data and is more reliable than the former two; however, because individual differences between people are strong, the physiological indexes of different people differ to some extent in each sleep stage. In general, although the three prior-art methods each have advantages, they suffer from problems such as over-reliance on detection, difficulty in accommodating individual differences, and low sleep detection accuracy.
In summary, the existing sleep stage management technology has technical problems such as over-reliance on detection, difficulty in accommodating individual differences, and low sleep detection accuracy.
Disclosure of Invention
In order to solve the technical problems, the invention provides the following scheme.
In one aspect, the invention provides a sleep management method based on wearable equipment, comprising the following steps:
acquiring sleep characteristics of a user collected by a sleep monitoring acquisition module, wherein the sleep monitoring acquisition module is assembled on a wearable device, and the wearable device is worn when the user sleeps;
extracting the sleep characteristics of the user, which are correspondingly acquired by the sleep monitoring acquisition module, in different time windows;
performing sleep stage calculation scoring according to the extracted sleep characteristics of the user to obtain a sleep stage report;
providing the sleep stage report to the user side for the user side to provide sleep stage calculation correction feedback after reference;
and calculating correction feedback according to the sleep stage, and correcting the calculation score of the sleep stage to obtain a corrected sleep stage.
In one aspect, the present invention provides a sleep management apparatus based on a wearable device, where the sleep management apparatus based on a wearable device performs any one of the methods described above, and the sleep management apparatus based on a wearable device includes:
the sleep characteristic acquisition module is used for acquiring the sleep characteristics of the user acquired by the sleep monitoring acquisition module, and the sleep monitoring acquisition module is assembled on wearable equipment which is worn when the user sleeps;
the sleep characteristic extraction module is used for extracting the sleep characteristics of the user, which are correspondingly acquired by the sleep monitoring acquisition module, in different time windows;
the stage calculation scoring module is used for carrying out sleep stage calculation scoring according to the extracted sleep characteristics of the user so as to obtain a sleep stage report;
the stage report providing module is used for providing the sleep stage report for the user side so as to provide sleep stage calculation correction feedback after the user side refers to the sleep stage report;
the sleep stage correction module is used for calculating correction feedback according to the sleep stage, and correcting the calculation score of the sleep stage to obtain a corrected sleep stage.
In one aspect, the present invention provides a computer device comprising: a processor and a memory storing program modules that run on the processor to implement any of the methods described above.
In one aspect, the invention provides a wearable device, and the wearable device invokes a program module to run, so as to implement any one of the methods.
Compared with the prior art, the invention has the beneficial effects that:
according to the sleep management method based on the wearable equipment, the sleep characteristics of the user acquired by the sleep monitoring acquisition module are acquired, the sleep characteristics of the user acquired by the sleep monitoring acquisition module in different time windows are extracted, sleep stage calculation scoring is carried out according to the extracted sleep characteristics of the user to obtain a sleep stage report, the sleep stage report is provided for the user side, the sleep stage calculation correction feedback is provided after the user side refers to the sleep stage report, the sleep stage calculation scoring is corrected according to the sleep stage calculation correction feedback, so that the sleep stage is corrected, feedback data of the user are obtained, the sleep stage calculated based on the sleep monitoring acquisition module is used for correcting the sleep stage acquired by the sleep characteristics of the user, the defects that the sleep stage is excessively dependent on detection, cannot be compatible with differences among individuals and the like are overcome, the accuracy of sleep stage detection is improved, the accuracy and the referenceability of the sleep stage are improved, and the user experience is improved.
Drawings
FIG. 1 is a flow diagram of a wearable device-based sleep management method;
FIG. 2 is a schematic diagram of a system architecture for a wearable device-based sleep management method operation;
FIG. 3 is a schematic diagram of one architecture of a wearable device-based sleep management apparatus;
fig. 4 is a schematic diagram of an architecture of a computer device.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims and in the above drawings, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein.
It should be understood that, in the various embodiments of the present invention, the sequence numbers of the processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and the sequence numbers should not constitute any limitation on the implementation of the embodiments of the present invention.
It should be understood that in the present invention, "comprising" and "having" and any variations thereof are intended to cover non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements that are expressly listed or inherent to such process, method, article, or apparatus.
It should be understood that in the present invention, "plurality" means two or more. "and/or" is merely an association relationship describing an association object, and means that three relationships may exist, for example, and/or B may mean: a exists alone, A and B exist together, and B exists alone. The character "/" generally indicates that the context-dependent object is an "or" relationship. "comprising A, B and C", "comprising A, B, C" means that all three of A, B, C comprise, "comprising A, B or C" means that one of the three comprises A, B, C, and "comprising A, B and/or C" means that any 1 or any 2 or 3 of the three comprises A, B, C.
It should be understood that in the present invention, "B corresponding to a", "a corresponding to B", or "B corresponding to a" means that B is associated with a, from which B can be determined. Determining B from a does not mean determining B from a alone, but may also determine B from a and/or other information. The matching of A and B is that the similarity of A and B is larger than or equal to a preset threshold value.
As used herein, "if" may be interpreted as "when", "upon", "in response to determining", or "in response to detecting", depending on the context.
The technical scheme of the invention is described in detail below by specific examples. The following embodiments may be combined with each other, and some embodiments may not be repeated for the same or similar concepts or processes.
Example 1
Referring to fig. 1 and 2, the present embodiment provides a sleep management method based on a wearable device.
It should be noted that the execution body of the method shown in fig. 1 may be a software and/or hardware device. The execution body of the present application may include, but is not limited to, at least one of: a user device, a network device, and the like. The user device may include, but is not limited to, computers, smart phones, personal digital assistants (PDA), and the electronic devices mentioned above. The network device may include, but is not limited to, a single network server, a server group consisting of multiple network servers, or a cloud consisting of a large number of computers or network servers based on cloud computing, where cloud computing is a kind of distributed computing in which a super virtual computer consists of a group of loosely coupled computers. This embodiment is not limited thereto.
Example 1
Referring to fig. 1, the present embodiment provides a sleep management method based on a wearable device, including the following steps S101, S102, S103, S104, and S105, which are specifically as follows:
s101, acquiring sleep characteristics of a user acquired by a sleep monitoring acquisition module, wherein the sleep monitoring acquisition module is assembled on a wearable device, and the wearable device is used for being worn when the user sleeps;
s102, extracting the sleep characteristics of the user, which are correspondingly acquired by the sleep monitoring acquisition module in different time windows;
s103, performing sleep stage calculation scoring according to the extracted sleep characteristics of the user to obtain a sleep stage report;
s104, providing the sleep stage report to the user side so as to provide sleep stage calculation correction feedback after the user side refers to the sleep stage report;
s105, calculating correction feedback according to the sleep stage, and correcting the calculation score of the sleep stage to obtain a corrected sleep stage.
In this embodiment, referring to fig. 2, all or part of the steps of the sleep management method based on the wearable device may be performed on the smart wearable device. In particular, all or part of the steps of the wearable device-based sleep management method may be run on the wearable device. The MCU module in the wearable device can communicate with an upper computer, and the upper computer comprises, but is not limited to, a user side, a server and a gateway. The user terminal is connected with the MCU module and the server for communication, and the gateway is connected with the server for communication. In addition, the MCU module can be connected and communicated with the photoelectric volume pulse wave signal acquisition module, and the photoelectric volume pulse wave signal acquisition module can also be called as a PPG signal acquisition module. The MCU module can be connected with and communicated with the triaxial acceleration signal acquisition module, and the triaxial acceleration signal acquisition module can also be called as an ACC signal acquisition module. The MCU module can also be connected with the wearing detection module, the signal quality detection module, the sleep stage calculation scoring module and the sleep stage correction module.
In step S101, the sleep monitoring acquisition module may include a photoplethysmography signal acquisition module and a tri-axial acceleration signal acquisition module. Acquiring the user sleep characteristics collected by the sleep monitoring acquisition module includes the following steps: acquiring the photoplethysmography user sleep characteristics collected by the photoplethysmography signal acquisition module; and acquiring the tri-axial acceleration user sleep characteristics collected by the tri-axial acceleration signal acquisition module. The PPG signal acquisition module collects an infrared light PPG sensing signal and a green light PPG sensing signal; the sampling rates of the infrared light PPG signal and the green light PPG signal include, but are not limited to, 128 Hz. The ACC signal acquisition module collects the signals of the tri-axial acceleration sensor, with a sampling rate including, but not limited to, 25 Hz. A minimal sketch of how such windows of sensor data might be represented is given below.
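As a concrete illustration of how these two sensor streams could be buffered for the windowed processing in the later steps, the following is a minimal Python sketch; the class and parameter names (SensorWindow, window_s and so on) are assumptions made for illustration and do not come from the patent.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class SensorWindow:
    """One analysis window of raw sensor data from the wearable device."""
    ir_ppg: np.ndarray     # infrared PPG samples, e.g. 128 Hz
    green_ppg: np.ndarray  # green PPG samples, e.g. 128 Hz
    acc_xyz: np.ndarray    # shape (n, 3) tri-axial acceleration, e.g. 25 Hz


def make_window(ir_ppg, green_ppg, acc_xyz,
                ppg_rate_hz: int = 128, acc_rate_hz: int = 25,
                window_s: float = 6.0) -> SensorWindow:
    """Trim the raw streams to one analysis window of window_s seconds."""
    n_ppg = int(ppg_rate_hz * window_s)
    n_acc = int(acc_rate_hz * window_s)
    return SensorWindow(
        ir_ppg=np.asarray(ir_ppg[:n_ppg], dtype=float),
        green_ppg=np.asarray(green_ppg[:n_ppg], dtype=float),
        acc_xyz=np.asarray(acc_xyz[:n_acc], dtype=float).reshape(-1, 3),
    )
```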
In step S102, after the sleep monitoring acquisition module collects the user sleep characteristics, there will be user sleep characteristics corresponding to each of the different time windows, where the user sleep characteristics include, but are not limited to, pulse wave signal characteristics, tri-axial acceleration signal characteristics, and the like. In this step, extracting the user sleep characteristics corresponding to the different time windows may include the following steps:
performing low-pass filtering on the collected infrared light PPG sensing signal to obtain a current baseline mu;
carrying out band-pass filtering on the collected green light PPG sensing signals to obtain heart rate passband signals; the passband frequency of the heart rate passband signal can be preferably 0.4-5Hz, and the time window length can be preferably 6s;
calculating the signal fluctuation characteristic σ of the heart rate passband signal within the time window:

$\sigma = \sqrt{\dfrac{1}{n}\sum_{i=1}^{n}\left(f_i - \bar{f}\right)^2}$

where i denotes the i-th sample point in the window time, n is the total number of sample points in the window time, $f_i$ is the i-th value of the heart rate passband signal of the green light PPG sensing signal, and $\bar{f}$ is the mean value of the heart rate passband signal of the green light PPG sensing signal within the time window;
extracting an effective peak value point of the heart rate passband signal, wherein the amplitude value of the effective peak value point is higher than a seventh threshold value;
calculating the time interval average value tau of adjacent peak points;
performing combined-acceleration calculation on the collected tri-axial acceleration sensor signals, and calculating the combined-acceleration intensity feature $A_{acc}$ within the window time, where the window length may be set, for example, to 1 s:

$A_{acc} = \dfrac{1}{n}\sum_{i=1}^{n}\left|\sqrt{x_i^2 + y_i^2 + z_i^2} - \delta\right|$

where i denotes the i-th sample point in the window time, n is the total number of sample points in the window time, $x_i$, $y_i$ and $z_i$ are the i-th values of the X, Y and Z axes in the window period, and δ is the offset value corresponding to 1 times the gravitational acceleration;
calculating the intensity level feature $D_{acc}$ from $A_{acc}$:

$D_{acc} = \begin{cases} 3, & A_{acc} \ge th_3 \\ 2, & th_2 \le A_{acc} < th_3 \\ 1, & th_1 \le A_{acc} < th_2 \\ 0, & A_{acc} < th_1 \end{cases}$

where $th_3$ is the third threshold of the ACC intensity, $th_2$ is the second threshold of the ACC intensity, and $th_1$ is the first threshold of the ACC intensity; the intensity level duration ε is then calculated. A sketch of this feature-extraction step is given below.
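As a concrete illustration of the window-level feature extraction above, the following Python sketch computes the baseline μ, the heart-rate passband signal, the fluctuation feature σ, the mean peak interval τ, the combined-acceleration intensity $A_{acc}$ and the intensity level $D_{acc}$. It assumes the reconstructed formulas given above (standard deviation for σ, mean absolute deviation of the acceleration magnitude from the 1 g offset for $A_{acc}$, simple threshold comparison for $D_{acc}$); the function names, filter orders and the scipy-based filtering are illustrative choices rather than anything prescribed by the patent.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

PPG_FS = 128.0  # Hz, PPG sampling rate
ACC_FS = 25.0   # Hz, accelerometer sampling rate


def ppg_baseline(ir_ppg):
    """Current baseline mu: low-pass filter the infrared PPG signal."""
    b, a = butter(2, 0.5 / (PPG_FS / 2), btype="low")  # cutoff is an assumption
    return filtfilt(b, a, ir_ppg)


def heart_rate_passband(green_ppg, low=0.4, high=5.0):
    """Heart-rate passband signal f: band-pass the green PPG signal (0.4-5 Hz)."""
    b, a = butter(2, [low / (PPG_FS / 2), high / (PPG_FS / 2)], btype="band")
    return filtfilt(b, a, green_ppg)


def fluctuation_sigma(f):
    """Signal fluctuation feature sigma: std. dev. of the passband signal in the window."""
    return float(np.sqrt(np.mean((f - np.mean(f)) ** 2)))


def peak_interval_tau(f, seventh_threshold):
    """Mean interval tau (seconds) between valid peaks above the seventh threshold."""
    peaks, _ = find_peaks(f, height=seventh_threshold)
    if len(peaks) < 2:
        return None
    return float(np.mean(np.diff(peaks)) / PPG_FS)


def acc_intensity(acc_xyz, delta):
    """Combined-acceleration intensity A_acc over the window (delta is the 1 g offset)."""
    magnitude = np.sqrt(np.sum(np.asarray(acc_xyz, dtype=float) ** 2, axis=1))
    return float(np.mean(np.abs(magnitude - delta)))


def acc_intensity_level(a_acc, th1, th2, th3):
    """Intensity level feature D_acc from the first/second/third ACC thresholds."""
    if a_acc < th1:
        return 0
    if a_acc < th2:
        return 1
    if a_acc < th3:
        return 2
    return 3
```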
In step S103, performing sleep stage calculation scoring according to the extracted sleep characteristics of the user to obtain a sleep stage report, which may include the following steps:
judging whether the wearable equipment is in a wearing state or not;
when the wearable equipment is in a wearing state, sleep stage calculation is carried out;
and prompting to wear the wearable equipment when the wearable equipment is not in a wearing state.
In some optional embodiments, determining whether the wearable device is in a wearing state may include the following steps:
judging the wearing state from the current baseline μ, the signal fluctuation characteristic σ, the mean time interval τ of adjacent peak points, the intensity feature $A_{acc}$ and the intensity level feature $D_{acc}$;
if the user is in an awake state, the device is in a wearing state, the ACC intensity level equals a first set value (for example, 0), and the intensity level duration ε is larger than an eighth threshold, judging that the user has entered a sleep state;
if the device is judged not to be worn, the user is judged to be awake. A sketch of this decision is given after these steps.
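A minimal sketch of this wearing and sleep-onset decision is shown below. The patent only names the features used for the wearing judgment (μ, σ, τ, $A_{acc}$, $D_{acc}$), so the specific plausibility checks inside is_worn, and all numeric ranges, are assumptions for illustration.

```python
def is_worn(mu, sigma, tau, a_acc, d_acc,
            mu_range=(1000.0, 200000.0), sigma_min=1.0, tau_range=(0.3, 2.0)):
    """Judge wearing state from the current baseline, fluctuation, peak interval
    and acceleration features. The plausibility ranges are illustrative only."""
    baseline_ok = mu_range[0] < mu < mu_range[1]          # IR baseline within a skin-contact range
    pulse_ok = (sigma > sigma_min and tau is not None
                and tau_range[0] < tau < tau_range[1])    # pulse-like periodicity present
    return baseline_ok and pulse_ok


def detect_sleep_onset(awake, worn, d_acc, epsilon, eighth_threshold,
                       first_set_value=0):
    """The user is judged asleep when awake and worn, the ACC intensity level
    equals the first set value (e.g. 0) and the intensity-level duration
    exceeds the eighth threshold."""
    return awake and worn and d_acc == first_set_value and epsilon > eighth_threshold
```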
in step S103, performing sleep stage calculation scoring according to the extracted sleep characteristics of the user to obtain a sleep stage report, which may include the following steps:
evaluating the signal quality of the sleep monitoring acquisition module in different time windows;
when the signal quality of the sleep monitoring acquisition module meets a preset condition, performing sleep stage calculation;
and prompting to check the sleep monitoring acquisition module when the signal quality of the sleep monitoring acquisition module does not meet the preset condition.
In this embodiment, in the sleep state, the heart rate variability features of the human body (referred to as HRV features) may be calculated from the green light PPG sensing signal. Signal quality evaluation is performed on the green light PPG sensing signal within each time window. If, within a selected time window (for example, 2 min), the number of green light PPG sensing signals whose signal quality equals a second set value (for example, 0) is larger than a ninth threshold, it is judged that the green light PPG sensing signal contains too much motion and is not suitable for HRV parameter calculation. At the same time, the number of consecutive windows for which HRV parameters cannot be calculated is counted; if this number is larger than a tenth threshold, the last sleep stage is adjusted, specifically: rapid eye movement is adjusted to awake, light sleep is adjusted to rapid eye movement, and deep sleep is adjusted to light sleep. If the number of signals whose quality equals a third set value (for example, 1) within the selected time window is larger than an eleventh threshold, it is judged that the quality of the PPG signal collected in the selected time window is poor and the reliability of the HRV parameters calculated from it is low; in this case only the HRV time-domain parameters SDNN and RMSSD are calculated:
$SDNN = \sqrt{\dfrac{1}{N}\sum_{i=1}^{N}\left(RR_i - \overline{RR}\right)^2}$

$RMSSD = \sqrt{\dfrac{1}{N-1}\sum_{i=1}^{N-1}\left(RR_{i+1} - RR_i\right)^2}$
where $RR_i$ is the i-th RR interval in the window time, N is the number of RR intervals in the window, and RR_mean ($\overline{RR}$) is the mean RR interval. If the number of signals whose quality equals a fourth set value (for example, 2) within the selected time window (for example, 2 min) is greater than a twelfth threshold, the signal quality in the selected time window is good; in that case, in addition to SDNN and RMSSD, the RR interval sequence in the selected time window is extended, after which the first-interval band-pass component (for example, [0.04, 0.15] Hz) and the second-interval band-pass component (for example, [0.15, 0.4] Hz) are extracted. The low-frequency energy LF and the high-frequency energy HF are obtained by summing the squared amplitudes of the respective passband signals; LF and HF are each divided by the RR-interval signal length to reduce the influence of different RRI signal lengths on the values, and the ratio LF/HF is calculated.
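Based on the formulas above, a minimal sketch of the window-level HRV computation might look as follows. Treating the "sum of squared amplitudes of the passband signals" as band-limited spectral energy of an evenly resampled RR series is one reasonable reading, not the only one; the resampling rate and the FFT-based implementation are assumptions.

```python
import numpy as np


def sdnn(rr):
    """SDNN: standard deviation of the RR intervals in the window (seconds)."""
    rr = np.asarray(rr, dtype=float)
    return float(np.sqrt(np.mean((rr - rr.mean()) ** 2)))


def rmssd(rr):
    """RMSSD: root mean square of successive RR-interval differences."""
    rr = np.asarray(rr, dtype=float)
    return float(np.sqrt(np.mean(np.diff(rr) ** 2)))


def lf_hf(rr, fs_interp=4.0, lf_band=(0.04, 0.15), hf_band=(0.15, 0.4)):
    """LF, HF and LF/HF from an evenly resampled RR-interval sequence.

    The RR sequence is resampled to fs_interp Hz, the spectrum is taken, the
    squared amplitudes inside each band are summed, and both energies are
    divided by the RR-signal length as described in the text."""
    rr = np.asarray(rr, dtype=float)
    t = np.cumsum(rr)                              # beat times in seconds
    t_even = np.arange(t[0], t[-1], 1.0 / fs_interp)
    rr_even = np.interp(t_even, t, rr)             # evenly sampled RR series
    spec = np.fft.rfft(rr_even - rr_even.mean())
    freqs = np.fft.rfftfreq(len(rr_even), d=1.0 / fs_interp)
    lf = float(np.sum(np.abs(spec[(freqs >= lf_band[0]) & (freqs < lf_band[1])]) ** 2))
    hf = float(np.sum(np.abs(spec[(freqs >= hf_band[0]) & (freqs < hf_band[1])]) ** 2))
    lf, hf = lf / len(rr), hf / len(rr)            # normalise by RR-signal length
    return lf, hf, (lf / hf if hf > 0 else None)
```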
In some alternative embodiments, the step of scoring the sleep stage calculation according to the extracted sleep characteristics of the user to obtain a sleep stage report comprises the following steps:
inputting the extracted sleep characteristics of the user into a random forest model to calculate sleep stages;
integrating sleep fragments calculated by sleep stage in different time windows to obtain complete monitoring sleep;
scoring the complete monitoring sleep using an xgboost model to obtain a sleep stage report.
In this embodiment, sleep staging is performed on the signals within each selected time window. If the signal state within the selected time window (for example, 2 min) equals the third set value (for example, 1), the inputs to the random forest model are: the sleep stage of the previous selected time window (for example, the previous 2 min), SDNN, RMSSD, the ACC intensity level feature $D_{acc}$, the intensity level duration ε, and the intensity feature $A_{acc}$ within the time window. If the signal state within the selected time window equals the fourth set value (for example, 2), the inputs to the random forest model are: SDNN, RMSSD, LF, HF, RR_mean, the ACC intensity level feature $D_{acc}$, the intensity level duration ε, and the intensity feature $A_{acc}$ within the time window; that is, different sleep stage models are employed for different signal states. If the user is in a sleep state and the sleep stage results of several consecutive time windows are awake, i.e. the number of consecutive awake results is larger than a thirteenth threshold, or the duration for which the device is not worn exceeds a fourteenth threshold, it is judged that the user has left the sleep state. The user may wake up or perform other activities several times during the night, so the short sleep segments need to be integrated before a complete sleep report is output. When the time out of sleep is greater than a fifteenth threshold, the user is considered to have finished sleeping; at this time, this embodiment integrates the sleep data whose sleep-onset and sleep-offset time nodes are separated by not more than the fifteenth threshold, updates the sleep-offset time node recorded during sleep to the last time node, defines the segments between each intermediate sleep-offset time node and the adjacent sleep-onset time node as awake segments, updates the whole sleep period to run from the first sleep-onset time node to the last sleep-offset time node, and updates data such as the time proportion of each sleep stage and the number of night awakenings. In addition, the user's personal information, sleep duration, sleep stage proportions, number of night awakenings, night awakening duration and deep-sleep continuity features are input into an xgboost model, and xgboost is used to produce the final sleep score and obtain the sleep stage report.
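A sketch of the two-model pipeline described above, with a random forest classifying each window and xgboost producing the final score, might look as follows. The per-quality-state feature lists follow the text; the model hyper-parameters, the stage label encoding and the night-summary field names are assumptions, and the models are presumed to have been trained elsewhere.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
import xgboost as xgb

# Assumed label encoding for the four sleep stages
STAGES = {0: "awake", 1: "rem", 2: "light", 3: "deep"}


def stage_features(window, prev_stage):
    """Per-window feature vector; the feature set depends on the signal-quality
    state (1: time-domain HRV only, 2: full HRV)."""
    if window["quality"] == 1:
        return [prev_stage, window["sdnn"], window["rmssd"],
                window["d_acc"], window["epsilon"], window["a_acc"]]
    return [window["sdnn"], window["rmssd"], window["lf"], window["hf"],
            window["rr_mean"], window["d_acc"], window["epsilon"], window["a_acc"]]


def stage_whole_night(windows, rf_q1: RandomForestClassifier,
                      rf_q2: RandomForestClassifier):
    """Classify each 2-minute window with the model matching its quality state."""
    stages, prev = [], 0
    for w in windows:
        model = rf_q1 if w["quality"] == 1 else rf_q2
        prev = int(model.predict([stage_features(w, prev)])[0])
        stages.append(prev)
    return stages


def sleep_score(night_summary, booster: xgb.XGBRegressor):
    """Final sleep score from whole-night statistics using an xgboost model."""
    x = [[night_summary["sleep_duration_h"],
          night_summary["deep_ratio"], night_summary["light_ratio"],
          night_summary["rem_ratio"], night_summary["n_wake"],
          night_summary["wake_duration_min"],
          night_summary["deep_continuity"]]]
    return float(booster.predict(np.asarray(x))[0])
```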
In step S104 and step S105, the sleep stage report includes a sleep time point, a sleep stage duration and distribution, a night wake number, a night wake duration, and a sleep score; the step of correcting the calculation score of the sleep stage to obtain the corrected sleep stage further comprises the following steps:
responding to the user side correction request and providing a questionnaire for the user side; the questionnaire includes the sleep stage calculation correction feedback, the sleep stage calculation correction feedback including: time point feedback of falling asleep, dream feedback, tired feedback, night wake feedback, difficulty in falling asleep feedback, sleep middle body feeling and sleep quality self-evaluation feedback;
and receiving the questionnaire returned by the user side, and extracting the sleep stage calculation correction feedback of the questionnaire.
In some alternative embodiments, calculating correction feedback from the sleep stage, correcting the calculated score of the sleep stage to obtain a corrected sleep stage, comprising the steps of:
synchronizing the corrected sleep stage to a user; and synchronizing the corrected sleep session to the wearable device; and synchronizing the corrected sleep stage to a server.
In this embodiment, after the user finishes sleeping that day, the user can obtain the sleep report of this sleep through the mobile phone APP. The sleep report includes information such as the time point of falling asleep, the duration and distribution of each sleep stage, the number of night awakenings, the night awakening duration, and the sleep score. If the user feels that there is a large difference between the report and the expected result, or wants a sleep algorithm tailored to himself or herself, the user may initiate a calibration request in the APP. The user then fills out questionnaire information covering the time points of falling asleep and waking up, dreaming, tiredness, night awakening, difficulty in falling asleep, body feeling during sleep, and a self-assessment of sleep quality. The falling-asleep and waking-up items are time information, while the dreaming, tiredness and difficulty-in-falling-asleep items each have 4 options: not consistent, somewhat consistent, basically consistent, and very consistent. Taking dreaming as an example, "not consistent" means the user does not remember any dream during sleep at all, "somewhat consistent" means the user can judge that he or she dreamed but remembers the dream unclearly, "basically consistent" means the user can judge that he or she dreamed and remembers the general content of the dream, and "very consistent" means the user can describe the dream fairly clearly. The night awakening count has 4 options: none, once, twice, more. The night awakening duration has 4 options: none, about ten minutes, about thirty minutes, longer. The body feeling during sleep has 3 options: normal, cold, hot. The sleep quality self-assessment has 4 options: poor, fair, good, very good. It should be noted that this embodiment includes, but is not limited to, the question and answer settings described above. A sketch of a possible questionnaire data structure is given below.
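The calibration questionnaire described above could be represented by a small data structure such as the following sketch; the field names and option encodings are illustrative assumptions, while the 60/70/80/90 score mapping follows the next paragraph.

```python
from dataclasses import dataclass

# 4-level agreement scale assumed for the dreaming / tiredness / difficulty items
AGREEMENT = ("not consistent", "somewhat consistent", "basically consistent", "very consistent")


@dataclass
class SleepQuestionnaire:
    fell_asleep_at: str              # time-of-day feedback, e.g. "23:40"
    woke_up_at: str
    dreaming: int                    # index into AGREEMENT
    tiredness: int
    difficulty_falling_asleep: int
    night_wake_count: int            # 0: none, 1: once, 2: twice, 3: more
    night_wake_duration: int         # 0: none, 1: ~10 min, 2: ~30 min, 3: longer
    body_feeling: str                # "normal", "cold" or "hot"
    quality_self_rating: int         # 0: poor .. 3: very good


def feedback_sleep_score(quality_self_rating: int) -> int:
    """Map the self-rated sleep quality to a feedback sleep score (60/70/80/90)."""
    return (60, 70, 80, 90)[quality_self_rating]
```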
The user feedback information obtained by calibration is compared with the actually collected information. If there is an obvious deviation between the feedback information and the collected information, a calibration-failure state is returned; if the deviation is smaller than a threshold, stage calculation and score calculation are entered. Taking the time point of falling asleep as an example, if the time point fed back by the user differs from the time point actually collected by the sensor by more than a fourth threshold, or the time point fed back by the user is obviously inconsistent with the current state of the sensor, calibration failure is returned. In the stage calculation and score calculation, the proportion of each sleep stage during sleep can be estimated from the questionnaire results fed back by the user. For the sleep stage parameters, using the fed-back sleep stage proportions and the PPG and ACC signal features of each time window over the whole sleep, the optimal threshold of each feature is searched for under the limitation of a fifth threshold: the PPG and ACC features are ranked by feature importance, and a greedy algorithm is adopted to adjust the thresholds in order from high to low feature importance. The feedback sleep score is obtained from the questionnaire; using the feedback sleep score and the sleep score features of this sleep (sleep stage proportions, number of night awakenings, night awakening duration, sleep duration, and the like), the optimal threshold of each feature is searched for under the limitation of a sixth threshold, in the same way as the sleep stage parameters are obtained. The feedback sleep score can be obtained as follows: poor sleep quality corresponds to a sleep score of 60, fair sleep quality to 70, good sleep quality to 80, and very good sleep quality to 90. Since individuals of different personalities have different preferences when answering questionnaires, a better way of obtaining the feedback sleep score may further take questionnaire results about the user's personality into account. A sketch of the greedy threshold adjustment follows.
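The threshold calibration in this paragraph can be sketched as a simple greedy search: features are visited in descending order of importance and each threshold is nudged, within the allowed limit, only when the change reduces the error against the user feedback. The objective function, grid size and shift limit below are assumptions; the patent states only that a greedy algorithm adjusts thresholds from high to low feature importance within the fifth- and sixth-threshold limits.

```python
import numpy as np


def greedy_threshold_calibration(features_by_importance, thresholds, stage_error,
                                 max_shift, n_grid=11):
    """Adjust one feature threshold at a time, from the most to the least
    important feature, keeping each change within +/- max_shift (the assumed
    reading of the fifth/sixth-threshold limit) and accepting it only if the
    error against the user feedback decreases.

    stage_error(thresholds) -> float is a caller-supplied objective, e.g. the
    absolute difference between computed and fed-back sleep-stage proportions."""
    best = dict(thresholds)
    best_err = stage_error(best)
    for name in features_by_importance:          # highest importance first
        base = best[name]
        for cand in np.linspace(base - max_shift, base + max_shift, n_grid):
            trial = dict(best, **{name: float(cand)})
            err = stage_error(trial)
            if err < best_err:
                best, best_err = trial, err
    return best, best_err
```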
Finally, it should be noted that the wearable device synchronizes the user's personal information and sleep parameter information in ways including, but not limited to, data exchange through APP Bluetooth, or the device may obtain them by accessing the server directly through a Wi-Fi module. The sensor signals collected by the wearable device include, but are not limited to, green light and infrared light photoplethysmography (PPG) signals and tri-axial acceleration sensor signals. Sleep stages include, but are not limited to, the 4 stages of awake, rapid eye movement, light sleep, and deep sleep. Signal sampling rates include, but are not limited to, PPG 125 Hz and ACC 25 Hz. Wear detection includes, but is not limited to, the infrared PPG and green PPG heart rate cycles. Features involved in signal quality detection include, but are not limited to, the amplitude and variance of the combined acceleration, the green PPG peak amplitude ratio, and the RRI (RR interval) ratio. Signal quality levels include, but are not limited to, the 3 levels above. The sleep report includes, but is not limited to, information such as the time point of falling asleep, the duration and distribution of each sleep stage, the number of night awakenings, the night awakening duration, and the sleep score. The calibration questionnaire includes, but is not limited to, the time point of falling asleep, dreaming, tiredness, night awakening, difficulty in falling asleep, body feeling during sleep, and self-assessment of sleep quality; the questionnaire answer settings include, but are not limited to, the forms mentioned above. Features used for sleep staging include, but are not limited to, the sleep stage of the previous 2 min, SDNN, RMSSD, LF, HF, RR_mean, the ACC intensity level feature $D_{acc}$, the intensity level duration ε, and the intensity feature $A_{acc}$ within the time window. Features used for the sleep score include, but are not limited to, the user's personal information, sleep duration, the proportion of each sleep stage, the number of awakenings, the awakening duration, and the deep-sleep continuity feature. The algorithm model used for sleep staging includes, but is not limited to, random forest. The algorithm model used for the sleep score includes, but is not limited to, xgboost.
example two
Referring to fig. 3, on the basis of the above embodiment, the present embodiment provides a sleep management apparatus based on a wearable device, where the sleep management apparatus based on a wearable device runs any one of the methods of the above embodiment, and the sleep management apparatus based on a wearable device includes:
the sleep characteristic acquisition module 101 is used for acquiring the sleep characteristics of the user acquired by the sleep monitoring acquisition module, and the sleep monitoring acquisition module is assembled on a wearable device which is worn when the user sleeps;
the sleep characteristic extraction module 102 is configured to extract the sleep characteristics of the user correspondingly acquired by the sleep monitoring acquisition module in different time windows;
a stage calculation scoring module 103, configured to perform sleep stage calculation scoring according to the extracted sleep characteristics of the user to obtain a sleep stage report;
the stage report providing module 104 is configured to provide the sleep stage report to the user side, so that the user side can provide the sleep stage calculation correction feedback after consulting;
the sleep stage correction module 105 is configured to calculate correction feedback according to the sleep stage, and correct a calculation score of the sleep stage to obtain a corrected sleep stage.
In this embodiment, the sleep management apparatus based on the wearable device acquires the user sleep characteristics collected by the sleep monitoring acquisition module, extracts the user sleep characteristics correspondingly collected by the sleep monitoring acquisition module in different time windows, performs sleep stage calculation and scoring according to the extracted user sleep characteristics to obtain a sleep stage report, provides the sleep stage report to the user side so that the user side can provide sleep stage calculation correction feedback after reviewing it, and corrects the sleep stage calculation and scoring according to the sleep stage calculation correction feedback to obtain a corrected sleep stage. Feedback data from the user are thereby obtained and used to correct the sleep stages calculated from the data of the sleep monitoring acquisition module, which avoids defects such as over-reliance on detection and inability to accommodate differences between individuals, improves the accuracy of sleep detection, improves the accuracy and reference value of the sleep stages, and improves the user experience.
Example IV
Referring to fig. 4, the present embodiment provides a computer device 40, comprising: a processor 41, a memory 42 and a computer program.
The memory 42 is used for storing a computer program, and may also be a flash memory (flash). The computer program is, for example, an application program, a functional module, or the like that implements the above-described method.
A processor 41 for executing the computer program stored in the memory to perform the steps performed by the apparatus in the above method. Reference may be made in particular to the description of the embodiments of the method described above.
Alternatively, the memory 42 may be separate or integrated with the processor 41.
When memory 42 is a device separate from processor 41, the apparatus may further include:
a bus 43 for connecting the memory 42 and the processor 41.
The present invention also provides a readable storage medium having stored therein a computer program for implementing the methods provided by the various embodiments described above when executed by a processor.
The readable storage medium may be a computer storage medium or a communication medium. Communication media includes any medium that facilitates transfer of a computer program from one place to another. Computer storage media can be any available media that can be accessed by a general purpose or special purpose computer. For example, a readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the readable storage medium. In the alternative, the readable storage medium may be integral to the processor. The processor and the readable storage medium may reside in an application specific integrated circuit (Application Specific Integrated Circuits, ASIC for short). In addition, the ASIC may reside in a user device. The processor and the readable storage medium may reside as discrete components in a communication device. The readable storage medium may be read-only memory (ROM), random-access memory (RAM), CD-ROMs, magnetic tape, floppy disk, optical data storage device, etc.
The present invention also provides a program product comprising execution instructions stored in a readable storage medium. The at least one processor of the device may read the execution instructions from the readable storage medium, the execution instructions being executed by the at least one processor to cause the device to implement the methods provided by the various embodiments described above.
In the above apparatus embodiments, it should be understood that the processor may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the present invention may be embodied as being executed directly by a hardware processor, or executed by a combination of hardware and software modules in the processor.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the invention.

Claims (10)

1. A sleep management method based on wearable equipment, which is characterized by comprising the following steps:
acquiring sleep characteristics of a user collected by a sleep monitoring acquisition module, wherein the sleep monitoring acquisition module is assembled on a wearable device, and the wearable device is worn when the user sleeps;
extracting the sleep characteristics of the user, which are correspondingly acquired by the sleep monitoring acquisition module, in different time windows;
performing sleep stage calculation scoring according to the extracted sleep characteristics of the user to obtain a sleep stage report;
providing the sleep stage report to the user side for the user side to provide sleep stage calculation correction feedback after reference;
and calculating correction feedback according to the sleep stage, and correcting the calculation score of the sleep stage to obtain a corrected sleep stage.
2. The method of claim 1, wherein the sleep monitoring acquisition module comprises: a photoelectric volume pulse wave signal acquisition module and a triaxial acceleration signal acquisition module; the sleep characteristic of the user acquired by the sleep monitoring acquisition module is acquired, and the method comprises the following steps:
acquiring sleeping characteristics of a photoplethysmography user acquired by a photoplethysmography signal acquisition module; and acquiring the sleep characteristics of the triaxial acceleration user acquired by the triaxial acceleration signal acquisition module.
3. The method of claim 1, wherein scoring the sleep stage calculation based on the extracted sleep characteristics of the user to obtain a sleep stage report, comprising the steps of:
judging whether the wearable equipment is in a wearing state or not;
when the wearable equipment is in a wearing state, sleep stage calculation is carried out;
and prompting to wear the wearable equipment when the wearable equipment is not in a wearing state.
4. The method of claim 1, wherein scoring the sleep stage calculation based on the extracted sleep characteristics of the user to obtain a sleep stage report, comprising the steps of:
evaluating the signal quality of the sleep monitoring acquisition module in different time windows;
when the signal quality of the sleep monitoring acquisition module meets a preset condition, performing sleep stage calculation;
and prompting to check the sleep monitoring acquisition module when the signal quality of the sleep monitoring acquisition module does not meet the preset condition.
5. The method of claim 1, wherein the sleep staging report includes a point in time to fall asleep, respective sleep stage durations and distributions, number of wakefulness, duration of wakefulness, and sleep score; the step of correcting the calculation score of the sleep stage to obtain the corrected sleep stage further comprises the following steps:
responding to the user side correction request and providing a questionnaire for the user side; the questionnaire includes the sleep stage calculation correction feedback, the sleep stage calculation correction feedback including: time point feedback of falling asleep, dream feedback, tired feedback, night wake feedback, difficulty in falling asleep feedback, sleep middle body feeling and sleep quality self-evaluation feedback;
and receiving the questionnaire returned by the user side, and extracting the sleep stage calculation correction feedback of the questionnaire.
6. The method of claim 1, wherein calculating correction feedback based on the sleep stage corrects the calculated score for the sleep stage to obtain a corrected sleep stage, comprising the steps of:
synchronizing the corrected sleep stage to a user; and synchronizing the corrected sleep session to the wearable device; and synchronizing the corrected sleep stage to a server.
7. The method of any of claims 1-6, wherein scoring sleep stage calculations based on the extracted sleep characteristics of the user to obtain a sleep stage report comprises the steps of:
inputting the extracted sleep characteristics of the user into a random forest model to calculate sleep stages;
integrating sleep fragments calculated by sleep stage in different time windows to obtain complete monitoring sleep;
scoring the complete monitoring sleep using an xgboost model to obtain a sleep stage report.
8. A sleep management apparatus based on a wearable device, wherein the sleep management apparatus based on a wearable device runs the method of any of claims 1-7, the sleep management apparatus based on a wearable device comprising:
the sleep characteristic acquisition module is used for acquiring the sleep characteristics of the user acquired by the sleep monitoring acquisition module, and the sleep monitoring acquisition module is assembled on wearable equipment which is worn when the user sleeps;
the sleep characteristic extraction module is used for extracting the sleep characteristics of the user, which are correspondingly acquired by the sleep monitoring acquisition module, in different time windows;
the stage calculation scoring module is used for carrying out sleep stage calculation scoring according to the extracted sleep characteristics of the user so as to obtain a sleep stage report;
the stage report providing module is used for providing the sleep stage report for the user side so as to provide sleep stage calculation correction feedback after the user side refers to the sleep stage report;
the sleep stage correction module is used for calculating correction feedback according to the sleep stage, and correcting the calculation score of the sleep stage to obtain a corrected sleep stage.
9. A computer device, comprising: a processor and a memory storing program modules that are executed on the processor to implement the method of any one of claims 1-7.
10. A wearable device, characterized in that the wearable device invokes a program module to run, implementing the method of any of claims 1-7.
CN202211650972.6A 2022-09-28 2022-12-21 Sleep management method and device based on wearable equipment Pending CN116048250A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2022111938320 2022-09-28
CN202211193832 2022-09-28

Publications (1)

Publication Number Publication Date
CN116048250A true CN116048250A (en) 2023-05-02

Family

ID=86120991

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211650972.6A Pending CN116048250A (en) 2022-09-28 2022-12-21 Sleep management method and device based on wearable equipment

Country Status (1)

Country Link
CN (1) CN116048250A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116649917A (en) * 2023-07-24 2023-08-29 北京中科心研科技有限公司 Sleep quality monitoring method and device and wearable equipment
CN116649917B (en) * 2023-07-24 2023-10-24 北京中科心研科技有限公司 Sleep quality monitoring method and device and wearable equipment

Similar Documents

Publication Publication Date Title
CN107595245B (en) Sleep management method, system and terminal equipment
CN109464130B (en) Sleep assisting method, system and readable storage medium
CN104615851B (en) A kind of Sleep-Monitoring method and terminal
CN107106085A (en) Apparatus and method for sleep monitor
US20080234785A1 (en) Sleep controlling apparatus and method, and computer program product thereof
CN206045144U (en) A kind of novel intelligent sleeping and the device for waking up naturally
CN106163391A (en) System for multiphase sleep management, method for the operation thereof, device for sleep analysis, method for classifying a current sleep phase, and use of the system and the device in multiphase sleep management
AU2018257774A1 (en) Activity recognition
CN104720746A (en) Sleeping stage determination method and system
CN105496356A (en) Sleep state determination apparatus, sleep state determination method, and sleep management system
CN111657855B (en) Sleep evaluation and sleep awakening method and device and electronic equipment
CN111631697A (en) Intelligent sleep and fatigue state information monitoring control system and method and monitor
CN107890339A (en) A kind of sleep stage detection method and wearable sleep stage detection means
WO2018077020A1 (en) Wristband-based method for determining human emotions
JP6040874B2 (en) Sleep stage estimation device
CN109199336A (en) A kind of sleep quality quantization method, device and equipment based on machine learning
CN109846291A (en) A kind of intelligent pillow awakening method and intelligent pillow
CN116048250A (en) Sleep management method and device based on wearable equipment
WO2019019489A1 (en) Exercise guidance method and apparatus, and intelligent wearable device
CN114916935A (en) Posture analysis auxiliary correction system based on correction process of correction personnel
US20220375590A1 (en) Sleep staging algorithm
CN109350826A (en) A kind of sleeping and anti-apnea device
CN114145717A (en) Sleep state analysis method based on PPG heart rate characteristic parameters and motion quantity
CN113440122B (en) Emotion fluctuation monitoring and identifying big data early warning system based on vital signs
CN110313922A (en) A kind of pressure regulating method, pressure regulating system and terminal

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication