CN115154837A - Control method and device of sleep-assisting equipment, terminal and storage medium - Google Patents

Control method and device of sleep-assisting equipment, terminal and storage medium

Info

Publication number
CN115154837A
CN115154837A (application CN202211047779.3A)
Authority
CN
China
Prior art keywords
data
sleep
determining
target
change curve
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211047779.3A
Other languages
Chinese (zh)
Other versions
CN115154837B (en)
Inventor
韩璧丞
苏度
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Mental Flow Technology Co Ltd
Original Assignee
Shenzhen Mental Flow Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Mental Flow Technology Co Ltd
Priority to CN202211047779.3A
Publication of CN115154837A
Application granted
Publication of CN115154837B
Legal status: Active
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M21/00 Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis
    • A61M21/02 Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis for inducing sleep or relaxation, e.g. by direct nerve stimulation, hypnosis, analgesia
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316 Modalities, i.e. specific diagnostic methods
    • A61B5/369 Electroencephalography [EEG]
    • A61B5/372 Analysis of electroencephalograms
    • A61B5/374 Detecting the frequency distribution of signals, e.g. detecting delta, theta, alpha, beta or gamma waves
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316 Modalities, i.e. specific diagnostic methods
    • A61B5/398 Electrooculography [EOG], e.g. detecting nystagmus; Electroretinography [ERG]
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/48 Other medical applications
    • A61B5/4806 Sleep evaluation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/48 Other medical applications
    • A61B5/4836 Diagnosis combined with treatment in closed-loop systems or methods
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 Details of waveform analysis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M2230/00 Measuring parameters of the user
    • A61M2230/08 Other bio-electrical signals
    • A61M2230/10 Electroencephalographic signals
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M2230/00 Measuring parameters of the user
    • A61M2230/08 Other bio-electrical signals
    • A61M2230/14 Electro-oculogram [EOG]
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M2230/00 Measuring parameters of the user
    • A61M2230/18 Rapid eye-movements [REM]

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Psychology (AREA)
  • Psychiatry (AREA)
  • Anesthesiology (AREA)
  • Ophthalmology & Optometry (AREA)
  • Acoustics & Sound (AREA)
  • Hematology (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physiology (AREA)
  • Signal Processing (AREA)
  • Pain & Pain Management (AREA)
  • Human Computer Interaction (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The invention discloses a control method, an apparatus, a terminal and a storage medium for a sleep-assisting device, wherein the method comprises the following steps: acquiring electroencephalogram data, electro-oculogram data and eye movement data corresponding to a target user, wherein the electroencephalogram data, the electro-oculogram data and the eye movement data all correspond to the same acquisition time period; determining a target sleep state corresponding to the target user according to the electroencephalogram data, the electro-oculogram data and the eye movement data; and determining a target sleep-assisting device and the working parameters corresponding to the target sleep-assisting device according to the target sleep state and the acquisition time period. Because the working parameters of the sleep-assisting device are adjusted dynamically according to the user's sleep state and the current time, the invention solves the problem that controlling a sleep-assisting device with fixed working parameters, as in the prior art, makes it difficult to achieve a good sleep-assisting effect.

Description

Control method and device of sleep-assisting equipment, terminal and storage medium
Technical Field
The present invention relates to the field of device control, and in particular, to a method, an apparatus, a terminal, and a storage medium for controlling a sleep-assisting device.
Background
High-quality sleep is an important basis for a healthy life, and studies have shown that sleep quality matters more than sleep duration. As a result, various sleep-assisting devices have emerged. A sleep-assisting device is an instrument that helps the human body fall asleep, including but not limited to hot-compress devices, massage devices and music devices. Such devices reduce brain activity by relaxing the user's mood or muscles, thereby helping the user fall asleep. Existing sleep-assisting devices usually operate with fixed working parameters; however, a user's sleep state and sleep time change over time, so controlling a sleep-assisting device with fixed working parameters often fails to achieve a good sleep-assisting effect.
Thus, there is still a need for improvement and development of the prior art.
Disclosure of Invention
The present invention provides a method, an apparatus, a terminal and a storage medium for controlling a sleep-assisting device, aiming to solve the problem in the prior art that controlling a sleep-assisting device with fixed working parameters makes it difficult to achieve a good sleep-assisting effect.
The technical solution adopted by the present invention to solve the above problem is as follows:
in a first aspect, an embodiment of the present invention provides a method for controlling a sleep-assisting apparatus, where the method includes:
acquiring electroencephalogram data, electro-oculogram data and eye movement data corresponding to a target user, wherein the electroencephalogram data, the electro-oculogram data and the eye movement data respectively correspond to the same acquisition time period;
determining a target sleep state corresponding to the target user according to the electroencephalogram data, the electro-oculogram data and the eye movement data;
and determining a target sleep-assisting device and working parameters corresponding to the target sleep-assisting device according to the target sleep state and the acquisition time period.
In one embodiment, the determining a target sleep state corresponding to the target user according to the electroencephalogram data, the electro-oculogram data, and the eye movement data includes:
determining a current intensity change curve according to the electroencephalogram data and the electro-oculogram data;
determining an eye movement intensity change curve according to the eye movement data;
determining the brain activity corresponding to the target user according to the current intensity variation curve and the eye movement intensity variation curve;
determining the target sleep state based on the brain activity.
In one embodiment, the determining a current intensity variation curve according to the electroencephalogram data and the electro-oculogram data includes:
determining an electroencephalogram intensity change curve according to the electroencephalogram data, and determining an electro-oculogram intensity change curve according to the electro-oculogram data;
respectively normalizing the electroencephalogram intensity change curve and the electro-oculogram intensity change curve to obtain a standard electroencephalogram intensity change curve and a standard electro-oculogram intensity change curve;
and determining the current intensity change curve according to the standard electroencephalogram intensity change curve and the standard electro-oculogram intensity change curve, wherein the intensity value corresponding to each time point in the current intensity change curve is determined based on the data values at that time point on the standard electroencephalogram intensity change curve and the standard electro-oculogram intensity change curve respectively.
In one embodiment, the determining the brain activity corresponding to the target user according to the current intensity variation curve and the eye movement intensity variation curve includes:
determining a plurality of first data points according to the current intensity change curve, wherein the current intensity corresponding to each first data point is higher than a preset first threshold value;
determining a plurality of second data points according to the eye movement intensity variation curve, wherein the eye movement intensity corresponding to each second data point is higher than a preset second threshold value;
and acquiring data distribution characteristics corresponding to the first data points and the second data points, and inputting the data distribution characteristics into a pre-trained prediction model to obtain the brain activity.
In one embodiment, the obtaining the data distribution characteristic corresponding to each of the first data points and each of the second data points includes:
normalizing the numerical values of the first data points and the second data points to obtain first standard data points corresponding to the first data points and second standard data points corresponding to the second data points;
determining a target time series according to the first standard data points and the second standard data points, wherein the target time series comprises a plurality of variables arranged in a time sequence, and each variable is determined based on the first standard data point and/or the second standard data point corresponding to the time point corresponding to the variable;
and determining the data distribution characteristics according to the target time sequence.
In one embodiment, the determining the data distribution characteristic according to the target time series includes:
acquiring time intervals between adjacent variables in the target time sequence, and determining time distribution characteristics according to the time intervals between the adjacent variables;
obtaining the numerical value of each variable in the target time sequence, and determining the numerical value distribution characteristic according to the numerical value of each variable;
and determining the data distribution characteristics according to the time distribution characteristics and the numerical value distribution characteristics.
In one embodiment, the working parameters include a working duration and a working intensity, and determining the target sleep-assisting device and the working parameters corresponding to the target sleep-assisting device according to the target sleep state and the acquisition time period includes:
determining the target sleep-assisting equipment from a plurality of preset sleep-assisting equipment according to the target sleep state;
and determining the working time length and the working intensity corresponding to the target sleep-assisting equipment according to the target sleep state and the acquisition time period.
In a second aspect, an embodiment of the present invention further provides a control apparatus for a sleep-assisting device, where the apparatus includes:
the data acquisition module is used for acquiring electroencephalogram data, electro-oculogram data and eye movement data corresponding to a target user, wherein the electroencephalogram data, the electro-oculogram data and the eye movement data respectively correspond to the same acquisition time period;
the state determining module is used for determining a target sleep state corresponding to the target user according to the electroencephalogram data, the electrooculogram data and the eye movement data;
and the device control module is used for determining the target sleep-assisting device and working parameters corresponding to the target sleep-assisting device according to the target sleep state and the acquisition time period.
In one embodiment, the state determination module comprises:
the current analysis unit is used for determining a current intensity change curve according to the electroencephalogram data and the electrooculogram data;
the eye movement analysis unit is used for determining an eye movement intensity change curve according to the eye movement data;
the brain analysis unit is used for determining the brain activity corresponding to the target user according to the current intensity variation curve and the eye movement intensity variation curve;
and the state analysis unit is used for determining the target sleep state according to the brain activity.
In one embodiment, the current analyzing unit includes:
the curve drawing unit is used for determining an electroencephalogram intensity change curve according to the electroencephalogram data and determining an electro-oculogram intensity change curve according to the electro-oculogram data;
the first normalization processing unit is used for respectively carrying out normalization processing on the electroencephalogram intensity change curve and the electro-oculogram intensity change curve to obtain a standard electroencephalogram intensity change curve and a standard electro-oculogram intensity change curve;
and the curve fusion unit is used for determining the current intensity change curve according to the standard electroencephalogram intensity change curve and the standard electro-oculogram intensity change curve, wherein the intensity value corresponding to each time point in the current intensity change curve is determined based on the data values at that time point on the standard electroencephalogram intensity change curve and the standard electro-oculogram intensity change curve respectively.
In one embodiment, the brain analysis unit includes:
the screening unit is used for determining a plurality of first data points according to the current intensity change curve, wherein the current intensity corresponding to each first data point is higher than a preset first threshold value;
determining a plurality of second data points according to the eye movement intensity variation curve, wherein the eye movement intensity corresponding to each second data point is higher than a preset second threshold value;
and the prediction unit is used for acquiring data distribution characteristics corresponding to the first data points and the second data points, and inputting the data distribution characteristics into a pre-trained prediction model to obtain the brain activity.
In one embodiment, the prediction unit comprises:
the second normalization processing unit is used for normalizing the numerical values of the first data points and the second data points to obtain first standard data points corresponding to the first data points and second standard data points corresponding to the second data points;
a sequence generating unit, configured to determine a target time sequence according to each of the first standard data points and each of the second standard data points, where the target time sequence includes a plurality of variables arranged in a time sequence, and each of the variables is determined based on the first standard data point and/or the second standard data point corresponding to a time point corresponding to the variable;
and the characteristic extraction unit is used for determining the data distribution characteristics according to the target time sequence.
In one embodiment, the feature extraction unit includes:
the time analysis unit is used for acquiring time intervals between every two adjacent variables in the target time sequence and determining time distribution characteristics according to the time intervals between every two adjacent variables;
the numerical analysis unit is used for acquiring the numerical value of each variable in the target time sequence and determining the numerical distribution characteristic according to the numerical value of each variable;
and the comprehensive analysis unit is used for determining the data distribution characteristics according to the time distribution characteristics and the numerical distribution characteristics.
In one embodiment, the operating parameters include an operating duration and an operating intensity, and the device control module includes:
the device screening unit is used for determining the target sleep-assisting device from a plurality of preset sleep-assisting devices according to the target sleep state;
and the parameter determining unit is used for determining the working time length and the working intensity corresponding to the target sleep-assisting equipment according to the target sleep state and the acquisition time period.
In a third aspect, an embodiment of the present invention further provides a terminal, where the terminal includes a memory and more than one processor; the memory stores more than one program; the program includes instructions for executing a control method of a sleep-aid apparatus as described in any of the above; the processor is configured to execute the program.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, on which a plurality of instructions are stored, where the instructions are adapted to be loaded and executed by a processor to implement the steps of the control method for a sleep-assisting apparatus described in any one of the above.
The invention has the beneficial effects that: the embodiment of the invention dynamically adjusts the working parameters of the sleep-assisting device according to the sleep state and the current time of the user, and solves the problem that the sleep-assisting device is difficult to achieve a good sleep-assisting effect by adopting fixed working parameters to control the sleep-assisting device in the prior art.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the embodiments or the description of the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments described in the present invention, and it is also possible for those skilled in the art to obtain other drawings based on the drawings without creative efforts.
Fig. 1 is a schematic flow chart of a control method of a sleep-assisting apparatus according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of internal modules of a control device of a sleep-assisting apparatus according to an embodiment of the present invention.
Fig. 3 is a schematic block diagram of a terminal according to an embodiment of the present invention.
Detailed Description
The invention discloses a control method, an apparatus, a terminal and a storage medium for a sleep-assisting device. In order to make the purpose, technical solution and effect of the invention clearer, the invention is further explained in detail below with reference to the attached drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element or intervening elements may also be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. As used herein, the term "and/or" includes all or any element and all combinations of one or more of the associated listed items.
It will be understood by those skilled in the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
High-quality sleep is an important basis for a healthy life, and studies have shown that sleep quality matters more than sleep duration. As a result, various sleep-assisting devices have emerged. A sleep-assisting device is an instrument that helps the human body fall asleep, including but not limited to hot-compress devices, massage devices and music devices. Such devices reduce brain activity by relaxing the user's mood or muscles, thereby helping the user fall asleep. Existing sleep-assisting devices usually operate with fixed working parameters; however, a user's sleep state and sleep time change over time, so controlling a sleep-assisting device with fixed working parameters often fails to achieve a good sleep-assisting effect.
In view of the above-mentioned drawbacks of the prior art, the present invention provides a method for controlling a sleep-assisting device, the method comprising: acquiring electroencephalogram data, electro-oculogram data and eye movement data corresponding to a target user, wherein the electroencephalogram data, the electro-oculogram data and the eye movement data all correspond to the same acquisition time period; determining a target sleep state corresponding to the target user according to the electroencephalogram data, the electro-oculogram data and the eye movement data; and determining a target sleep-assisting device and the working parameters corresponding to the target sleep-assisting device according to the target sleep state and the acquisition time period. Because the working parameters of the sleep-assisting device are adjusted dynamically according to the user's sleep state and the current time, the invention solves the problem that controlling a sleep-assisting device with fixed working parameters, as in the prior art, makes it difficult to achieve a good sleep-assisting effect.
Exemplary method
As shown in fig. 1, the method comprises the steps of:
the method comprises the steps of S100, obtaining electroencephalogram data, electro-oculogram data and eye movement data corresponding to a target user, wherein the electroencephalogram data, the electro-oculogram data and the eye movement data respectively correspond to the same collection time period.
Specifically, the target user in this embodiment may be any user who needs to use a sleep-assisting device to improve sleep quality. In order to dynamically adjust the working parameters of the sleep-assisting device, this embodiment simultaneously acquires the target user's current electroencephalogram data, electro-oculogram data and eye movement data. The electroencephalogram data are generated from the target user's current brain electrical activity; the electro-oculogram data are generated from the resting potential of the target user's eyeballs; and the eye movement data are generated from the rotations of the target user's eyes.
As shown in fig. 1, the method further comprises the steps of:
Step S200, determining a target sleep state corresponding to the target user according to the electroencephalogram data, the electro-oculogram data and the eye movement data.
Specifically, when the target user is in different sleep states, the target user's brain activity and eyeball rotation differ, so the data characteristics of the acquired electroencephalogram data, electro-oculogram data and eye movement data also differ. The current sleep state of the target user, i.e. the target sleep state, can therefore be determined by analyzing the target user's current electroencephalogram data, electro-oculogram data and eye movement data. Compared with using a single type of data, this embodiment uses several different types of data and can therefore determine the current sleep state of the target user more accurately.
In one implementation, the step S200 specifically includes the following steps:
step S201, determining a current intensity change curve according to the electroencephalogram data and the electrooculogram data;
step S202, determining an eye movement intensity change curve according to the eye movement data;
step S203, determining the brain activity corresponding to the target user according to the current intensity change curve and the eye movement intensity change curve;
and step S204, determining the target sleep state according to the brain activity.
Specifically, because the electroencephalogram data and the electro-oculogram data are both generated from currents produced by different parts of the target user's head, they can be analyzed jointly to generate a curve reflecting the current intensity at the target user's head, i.e. the current intensity change curve. A curve reflecting the target user's eye movement intensity is then generated from the eye movement data, i.e. the eye movement intensity change curve. Since the head current intensity and the eye movement intensity differ when the target user's brain is in different activity states, the target user's current brain activity can be determined comprehensively from the current intensity change curve and the eye movement intensity change curve. Finally, because the brain activity differs across sleep states, the current sleep state of the target user, i.e. the target sleep state, can be judged from the brain activity. Specifically, this embodiment presets the correspondence between brain activity intervals and sleep states. For example, a brain activity between 0 and 25 corresponds to a deep sleep state; between 26 and 50, a light sleep state; between 51 and 75, a falling-asleep state; and between 76 and 100, an awake state. The target sleep state corresponding to the target user is obtained by judging which interval the currently acquired brain activity falls into.
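As a concrete illustration of the interval lookup described above, the following Python sketch maps a brain activity value to a sleep state. The interval boundaries follow the example in the text; the function and table names are illustrative and not part of the patent.

```python
# Illustrative sketch of the interval lookup described above; the boundaries
# (0-25, 26-50, 51-75, 76-100) follow the example in the text, and the
# state labels and names are assumptions made for illustration.
ACTIVITY_TO_STATE = [
    (25, "deep sleep"),
    (50, "light sleep"),
    (75, "falling asleep"),
    (100, "awake"),
]

def sleep_state_from_activity(brain_activity: float) -> str:
    """Return the sleep state whose preset interval contains brain_activity."""
    for upper_bound, state in ACTIVITY_TO_STATE:
        if brain_activity <= upper_bound:
            return state
    return "awake"  # values above 100 fall back to the last interval

print(sleep_state_from_activity(18))   # -> deep sleep
print(sleep_state_from_activity(62))   # -> falling asleep
```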
In an implementation manner, the step S201 specifically includes the following steps:
step S2011, determining an electroencephalogram intensity change curve according to the electroencephalogram data, and determining an electro-oculogram intensity change curve according to the electro-oculogram data;
step S2012, respectively normalizing the electroencephalogram intensity change curve and the electro-oculogram intensity change curve to obtain a standard electroencephalogram intensity change curve and a standard electro-oculogram intensity change curve;
step S2013, determining the current intensity change curve according to the standard electroencephalogram intensity change curve and the standard electro-oculogram intensity change curve, wherein the intensity value corresponding to each time point in the current intensity change curve is determined based on the data values at that time point on the standard electroencephalogram intensity change curve and the standard electro-oculogram intensity change curve respectively.
Specifically, the electroencephalogram data and the electro-oculogram data are first converted into an electroencephalogram intensity change curve and an electro-oculogram intensity change curve respectively. Because the electroencephalogram intensity and the electro-oculogram intensity are measured in different units, the two curves must be normalized before they can be analyzed together, so that both sets of values fall into the same value range; this yields the standard electroencephalogram intensity change curve and the standard electro-oculogram intensity change curve. The two standard curves are then fused into a new curve, the current intensity change curve, which reflects the change characteristics of both the electroencephalogram intensity and the electro-oculogram intensity, so that analyzing the target user's brain activity through this curve is more accurate.
In another implementation, the change characteristics of the electroencephalogram data track the change of brain activity more closely than those of the electro-oculogram data, so different weight values can be set for the electroencephalogram data and the electro-oculogram data. For each time point on the current intensity change curve, the intensity value at that time point is then determined as the weighted average of the data values at the corresponding time point on the standard electroencephalogram intensity change curve and the standard electro-oculogram intensity change curve.
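A minimal Python sketch of this normalization and weighted fusion, assuming both curves are sampled at the same time points; min-max normalization and the weights 0.7/0.3 are illustrative choices, since the patent does not fix either.

```python
import numpy as np

def min_max_normalize(curve: np.ndarray) -> np.ndarray:
    """Scale a curve into [0, 1]; min-max scaling is one possible normalization."""
    span = curve.max() - curve.min()
    return np.zeros_like(curve) if span == 0 else (curve - curve.min()) / span

def fuse_current_intensity(eeg_curve, eog_curve, eeg_weight=0.7, eog_weight=0.3):
    """Fuse the standard EEG and EOG intensity curves into one current intensity curve.

    The weights are illustrative; the patent only states that the EEG curve may
    be weighted more heavily than the EOG curve."""
    eeg_std = min_max_normalize(np.asarray(eeg_curve, dtype=float))
    eog_std = min_max_normalize(np.asarray(eog_curve, dtype=float))
    return eeg_weight * eeg_std + eog_weight * eog_std  # weighted average per time point

# Example: two curves sampled at the same time points (values are made up).
eeg = [12.0, 15.5, 30.2, 8.4]
eog = [0.2, 0.9, 0.4, 0.1]
print(fuse_current_intensity(eeg, eog))
```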
In one implementation, step S203 specifically includes the following steps:
step S2031, determining a plurality of first data points according to the current intensity variation curve, wherein the current intensity corresponding to each first data point is higher than a preset first threshold;
step S2032, determining a plurality of second data points according to the eye movement intensity variation curve, wherein the eye movement intensity corresponding to each second data point is higher than a preset second threshold value;
step S2033, obtaining data distribution characteristics corresponding to the first data points and the second data points, and inputting the data distribution characteristics into a pre-trained prediction model to obtain the brain activity.
Specifically, a first threshold is preset for the current intensity change curve, and a plurality of first data points with relatively high current intensity are screened out using this threshold. Likewise, a second threshold is preset for the eye movement intensity change curve, and a plurality of second data points with relatively high eye movement intensity are screened out using this threshold. It can be understood that when the target user is at different brain activity levels, the data distribution characteristics of the first data points and the second data points change, so the target user's current brain activity can be analyzed through these data distribution characteristics. To this end, this embodiment trains a prediction model in advance: the model learns the complex mapping between different data distribution characteristics and brain activity from a large amount of training data, so inputting the currently acquired data distribution characteristics of the first data points and the second data points into the model yields the target user's current brain activity.
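The threshold-based screening can be sketched as follows; the threshold values and the representation of data points as (time, value) pairs are assumptions made for illustration.

```python
def screen_points(curve, times, threshold):
    """Return the (time, value) pairs whose intensity exceeds the threshold.

    `curve` and `times` are equally long sequences; the thresholds are presets."""
    return [(t, v) for t, v in zip(times, curve) if v > threshold]

# Illustrative threshold values; the patent leaves the concrete values unspecified.
FIRST_THRESHOLD = 0.6    # applied to the current intensity change curve
SECOND_THRESHOLD = 0.5   # applied to the eye movement intensity change curve

first_points = screen_points([0.2, 0.7, 0.9, 0.4], [0, 1, 2, 3], FIRST_THRESHOLD)
second_points = screen_points([0.1, 0.6, 0.3, 0.8], [0, 1, 2, 3], SECOND_THRESHOLD)
print(first_points, second_points)  # -> [(1, 0.7), (2, 0.9)] [(1, 0.6), (3, 0.8)]
```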
In one implementation, the step S2033 includes the following steps:
step S20331, performing normalization processing on the numerical values of each of the first data points and each of the second data points to obtain a first standard data point corresponding to each of the first data points and a second standard data point corresponding to each of the second data points;
step S20332, determining a target time series according to each of the first standard data points and each of the second standard data points, wherein the target time series includes a plurality of variables arranged in a time sequence, and each of the variables is determined based on the first standard data point and/or the second standard data point corresponding to the time point corresponding to the variable;
step S20333, determining the data distribution characteristics according to the target time sequence.
Specifically, since the first data points and the second data points are measured in different units, their numerical values must be normalized before they can be analyzed together: normalizing the first data points yields the first standard data points, and normalizing the second data points yields the second standard data points. It can be understood that, because the first data points and the second data points are extracted from curves describing intensity over time, they both carry a time attribute, and so do the first standard data points and second standard data points obtained after normalization. A time series, the target time series, can therefore be constructed from the first standard data points and the second standard data points. Specifically, the first standard data points and second standard data points are arranged in chronological order. For each position in the resulting sequence, if the corresponding time point carries only one first standard data point or second standard data point, the value of the variable at that position is determined from the value of that data point; if the time point carries both a first standard data point and a second standard data point, the value of the variable is determined comprehensively from both values. Because the target time series summarizes the information of all first data points and second data points, the data distribution characteristics of these data points can be obtained quickly by analyzing the target time series.
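A minimal sketch of building the target time series, assuming the normalized data points are given as (time, value) pairs; combining coinciding points by averaging is one plausible reading of "determined comprehensively", not a rule prescribed by the patent.

```python
from collections import defaultdict

def build_target_time_series(first_std_points, second_std_points):
    """Merge normalized (time, value) points from both curves into one time series.

    When a time point carries both a first and a second standard data point,
    their values are combined here by averaging; this is an assumption made
    for the sketch."""
    by_time = defaultdict(list)
    for t, v in list(first_std_points) + list(second_std_points):
        by_time[t].append(v)
    # One variable per time point, ordered chronologically.
    return [(t, sum(vals) / len(vals)) for t, vals in sorted(by_time.items())]

series = build_target_time_series([(1, 0.8), (4, 0.9)], [(1, 0.6), (7, 0.7)])
print(series)  # -> roughly [(1, 0.7), (4, 0.9), (7, 0.7)]
```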
In one implementation, the step S20333 specifically includes the following steps:
step S203331, obtaining time intervals between adjacent variables in the target time sequence, and determining time distribution characteristics according to the time intervals between the adjacent variables;
step S203332, obtaining the numerical value of each variable in the target time sequence, and determining the numerical value distribution characteristic according to the numerical value of each variable;
and S203333, determining the data distribution characteristics according to the time distribution characteristics and the numerical value distribution characteristics.
Specifically, the data distribution characteristics in this embodiment include two types: time distribution characteristics and numerical distribution characteristics. Because each variable in the target time series is determined from data points with relatively high intensity values in the current intensity change curve and/or the eye movement intensity change curve, the density of the time distribution of the variables indirectly reflects the target user's current brain activity: the denser the time distribution, the higher the brain activity, and the more dispersed the time distribution, the lower the brain activity. Similarly, the range in which the values of the variables concentrate also indirectly reflects the current brain activity: the more the values concentrate at high levels, the higher the brain activity, and the more they concentrate at low levels, the lower the brain activity. The time distribution characteristics and numerical distribution characteristics extracted from the target time series are therefore used as the final data distribution characteristics, through which the target user's current brain activity can be judged accurately.
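The extraction of time and value distribution characteristics can be sketched as follows; the specific statistics (mean and standard deviation of the intervals and of the values) are illustrative, since the patent does not name concrete features.

```python
import numpy as np

def data_distribution_features(series):
    """Compute simple time-distribution and value-distribution features.

    `series` is the target time series of (time, value) variables. The concrete
    statistics below are only one plausible choice of features."""
    times = np.array([t for t, _ in series], dtype=float)
    values = np.array([v for _, v in series], dtype=float)
    intervals = np.diff(times) if len(times) > 1 else np.array([0.0])
    time_features = [intervals.mean(), intervals.std()]   # denser intervals -> higher activity
    value_features = [values.mean(), values.std()]        # values clustered high -> higher activity
    return np.array(time_features + value_features)

features = data_distribution_features([(1, 0.7), (4, 0.9), (7, 0.7)])
print(features)
# These features would then be fed to the pre-trained prediction model,
# e.g. brain_activity = model.predict(features.reshape(1, -1))
```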
As shown in fig. 1, the method further comprises the steps of:
step S300, determining a target sleep-assisting device and working parameters corresponding to the target sleep-assisting device according to the target sleep state and the acquisition time period.
Specifically, because the target user has different sleep-assisting requirements in different sleep states and different time periods, the present embodiment may select a suitable sleep-assisting device and dynamically adjust the working parameters of the sleep-assisting device according to the current sleep state and the time period of the target user, so as to meet the sleep-assisting requirements of the target user in different states and improve the sleep quality of the target user.
In one implementation, the working parameters include a working duration and a working intensity, and the step S300 specifically includes the following steps:
step S301, determining the target sleep-assisting equipment from a plurality of preset sleep-assisting equipment according to the target sleep state;
step S302, determining the working duration and the working intensity corresponding to the target sleep-assisting device according to the target sleep state and the acquisition time period.
Specifically, different sleep-assisting devices have different sleep-assisting effects, so the most appropriate sleep-assisting device, i.e. the target sleep-assisting device, can be selected for the target user according to the target user's current sleep state. The acquisition time period of the electroencephalogram data, electro-oculogram data and eye movement data reflects the current time, and the current time together with the current sleep state can be used to analyze how difficult it is for the target user to fall asleep. For example, for the same awake state, a target user who is awake at nine p.m. differs from one who is awake at three a.m.: the latter suggests that the target user may suffer from severe insomnia and therefore has greater difficulty falling asleep. The difficulty of falling asleep is therefore determined from the target sleep state and the acquisition time period, and the working duration and working intensity of the target sleep-assisting device are adjusted dynamically accordingly: when the difficulty of falling asleep is greater than a first difficulty threshold, the working duration and working intensity are increased; when the difficulty of falling asleep is smaller than a second difficulty threshold, which is smaller than the first difficulty threshold, the working duration and working intensity are reduced.
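A minimal sketch of adjusting the working duration and working intensity from the falling-asleep difficulty; the difficulty scale, thresholds and step sizes are assumed values for illustration.

```python
def adjust_working_parameters(base_duration_min, base_intensity, difficulty,
                              first_threshold=0.8, second_threshold=0.3,
                              step_min=10, step_intensity=1):
    """Adjust the working duration and intensity from a falling-asleep difficulty score.

    The difficulty score, thresholds and step sizes are illustrative; the patent
    only states that both parameters are increased above a first difficulty
    threshold and decreased below a smaller second threshold."""
    if difficulty > first_threshold:
        return base_duration_min + step_min, base_intensity + step_intensity
    if difficulty < second_threshold:
        return max(0, base_duration_min - step_min), max(0, base_intensity - step_intensity)
    return base_duration_min, base_intensity

print(adjust_working_parameters(30, 3, difficulty=0.9))  # harder to fall asleep -> (40, 4)
print(adjust_working_parameters(30, 3, difficulty=0.1))  # easier to fall asleep -> (20, 2)
```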
In one implementation, when the working time of the sleep-assisting device exceeds a preset time threshold, or at a preset period, the electroencephalogram data, electro-oculogram data and eye movement data corresponding to the target user are collected again and the target sleep state is determined again. When the target sleep state changes to the deep sleep state and remains there for a preset duration, the target sleep-assisting device is turned off.
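The periodic re-acquisition and shutdown logic can be sketched as a simple monitoring loop; the acquisition, state-estimation and device interfaces as well as the period and hold time are placeholders, not part of the patent.

```python
import time

def monitor_and_control(acquire_data, estimate_state, device,
                        recheck_period_s=300, deep_sleep_hold_s=600):
    """Re-acquire data at a preset period and switch the device off once the
    user stays in deep sleep for a preset duration.

    `acquire_data`, `estimate_state` and `device` stand in for the acquisition,
    state-estimation and device-control interfaces; the period and hold time
    are illustrative values."""
    deep_sleep_since = None
    while device.is_on():
        time.sleep(recheck_period_s)                 # preset re-acquisition period
        state = estimate_state(*acquire_data())      # EEG, EOG and eye movement data
        if state == "deep sleep":
            deep_sleep_since = deep_sleep_since or time.monotonic()
            if time.monotonic() - deep_sleep_since >= deep_sleep_hold_s:
                device.turn_off()                    # sustained deep sleep: stop assisting
        else:
            deep_sleep_since = None                  # reset the hold timer
```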
Exemplary devices
Based on the above embodiment, the present invention further provides a control device of a sleep-assisting apparatus, as shown in fig. 2, the device includes:
the data acquisition module 01 is used for acquiring electroencephalogram data, electro-oculogram data and eye movement data corresponding to a target user, wherein the electroencephalogram data, the electro-oculogram data and the eye movement data respectively correspond to the same acquisition time period;
the state determining module 02 is configured to determine a target sleep state corresponding to the target user according to the electroencephalogram data, the electrooculogram data, and the eye movement data;
and the device control module 03 is configured to determine a target sleep-assisting device and a working parameter corresponding to the target sleep-assisting device according to the target sleep state and the acquisition time period.
In one implementation, the state determination module 02 includes:
the current analysis unit is used for determining a current intensity change curve according to the electroencephalogram data and the electrooculogram data;
the eye movement analysis unit is used for determining an eye movement intensity change curve according to the eye movement data;
the brain analysis unit is used for determining the brain activity corresponding to the target user according to the current intensity variation curve and the eye movement intensity variation curve;
and the state analysis unit is used for determining the target sleep state according to the brain activity.
In one implementation, the current analysis unit includes:
the curve drawing unit is used for determining an electroencephalogram intensity change curve according to the electroencephalogram data and determining an electro-oculogram intensity change curve according to the electro-oculogram data;
the first normalization processing unit is used for respectively carrying out normalization processing on the electroencephalogram intensity change curve and the electro-oculogram intensity change curve to obtain a standard electroencephalogram intensity change curve and a standard electro-oculogram intensity change curve;
and the curve fusion unit is used for determining the current intensity change curve according to the standard electroencephalogram intensity change curve and the standard electro-oculogram intensity change curve, wherein the intensity value corresponding to each time point in the current intensity change curve is determined based on the data values at that time point on the standard electroencephalogram intensity change curve and the standard electro-oculogram intensity change curve respectively.
In one implementation, the brain analysis unit includes:
the screening unit is used for determining a plurality of first data points according to the current intensity change curve, wherein the current intensity corresponding to each first data point is higher than a preset first threshold value;
determining a plurality of second data points according to the eye movement intensity variation curve, wherein the eye movement intensity corresponding to each second data point is higher than a preset second threshold value;
and the prediction unit is used for acquiring data distribution characteristics corresponding to the first data points and the second data points, and inputting the data distribution characteristics into a pre-trained prediction model to obtain the brain activity.
In one implementation, the prediction unit includes:
the second normalization processing unit is used for performing normalization processing on numerical values of the first data points and the second data points to obtain first standard data points corresponding to the first data points and second standard data points corresponding to the second data points;
a sequence generating unit, configured to determine a target time sequence according to each of the first standard data points and each of the second standard data points, where the target time sequence includes a plurality of variables arranged in a time sequence, and each of the variables is determined based on the first standard data point and/or the second standard data point corresponding to a time point corresponding to the variable;
and the characteristic extraction unit is used for determining the data distribution characteristics according to the target time sequence.
In one implementation, the feature extraction unit includes:
the time analysis unit is used for acquiring time intervals between every two adjacent variables in the target time sequence and determining time distribution characteristics according to the time intervals between every two adjacent variables;
the numerical analysis unit is used for acquiring numerical values of all the variables in the target time sequence and determining numerical distribution characteristics according to the numerical values of all the variables;
and the comprehensive analysis unit is used for determining the data distribution characteristics according to the time distribution characteristics and the numerical distribution characteristics.
In one implementation, the working parameters include working duration and working intensity, and the device control module 03 includes:
the device screening unit is used for determining the target sleep-assisting device from a plurality of preset sleep-assisting devices according to the target sleep state;
and the parameter determining unit is used for determining the working time length and the working intensity corresponding to the target sleep-assisting equipment according to the target sleep state and the acquisition time period.
Based on the above embodiments, the present invention further provides a terminal, and a schematic block diagram thereof may be as shown in fig. 3. The terminal comprises a processor, a memory, a network interface and a display screen which are connected through a system bus. Wherein the processor of the terminal is configured to provide computing and control capabilities. The memory of the terminal comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the terminal is used for connecting and communicating with an external terminal through a network. The computer program is executed by a processor to implement a control method of a sleep-aid apparatus. The display screen of the terminal can be a liquid crystal display screen or an electronic ink display screen.
It will be understood by those skilled in the art that the block diagram shown in fig. 3 is a block diagram of only a portion of the structure associated with the inventive arrangements and is not intended to limit the terminals to which the inventive arrangements may be applied, and that a particular terminal may include more or less components than those shown, or may have some components combined, or may have a different arrangement of components.
In one implementation, one or more programs are stored in a memory of the terminal and configured to be executed by one or more processors, including instructions for conducting a method of controlling a sleep-aid apparatus.
It will be understood by those skilled in the art that all or part of the processes of the methods in the embodiments described above can be implemented by a computer program instructing the relevant hardware. The computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, databases or other media used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM) or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM) and Rambus dynamic RAM (RDRAM).
In summary, the present invention discloses a control method, an apparatus, a terminal and a storage medium for a sleep-assisting device, wherein the method comprises: acquiring electroencephalogram data, electro-oculogram data and eye movement data corresponding to a target user, wherein the electroencephalogram data, the electro-oculogram data and the eye movement data all correspond to the same acquisition time period; determining a target sleep state corresponding to the target user according to the electroencephalogram data, the electro-oculogram data and the eye movement data; and determining a target sleep-assisting device and the working parameters corresponding to the target sleep-assisting device according to the target sleep state and the acquisition time period. Because the working parameters of the sleep-assisting device are adjusted dynamically according to the user's sleep state and the current time, the invention solves the problem that controlling a sleep-assisting device with fixed working parameters, as in the prior art, makes it difficult to achieve a good sleep-assisting effect.
It is to be understood that the invention is not limited to the examples described above, but that modifications and variations may be effected thereto by those of ordinary skill in the art in light of the foregoing description, and that all such modifications and variations are intended to be within the scope of the invention as defined by the appended claims.

Claims (10)

1. A method of controlling a sleep-aid device, the method comprising:
acquiring electroencephalogram data, electro-oculogram data and eye movement data corresponding to a target user, wherein the electroencephalogram data, the electro-oculogram data and the eye movement data respectively correspond to the same acquisition time period;
determining a target sleep state corresponding to the target user according to the electroencephalogram data, the electro-oculogram data and the eye movement data;
and determining a target sleep-assisting device and working parameters corresponding to the target sleep-assisting device according to the target sleep state and the acquisition time period.
2. The method for controlling a sleep-aid device according to claim 1, wherein the determining a target sleep state corresponding to the target user according to the electroencephalogram data, the electrooculogram data, and the eye movement data includes:
determining a current intensity change curve according to the electroencephalogram data and the electrooculogram data;
determining an eye movement intensity change curve according to the eye movement data;
determining the brain activity corresponding to the target user according to the current intensity variation curve and the eye movement intensity variation curve;
determining the target sleep state based on the brain activity.
3. The method of controlling a sleep-aid device according to claim 2, wherein said determining a current intensity change curve from said electroencephalogram data and said electro-oculogram data comprises:
determining an electroencephalogram intensity change curve according to the electroencephalogram data, and determining an electro-oculogram intensity change curve according to the electro-oculogram data;
respectively carrying out normalization processing on the electroencephalogram intensity change curve and the electro-oculogram intensity change curve to obtain a standard electroencephalogram intensity change curve and a standard electro-oculogram intensity change curve;
and determining the current intensity change curve according to the standard electroencephalogram intensity change curve and the standard electro-oculogram intensity change curve, wherein the intensity value corresponding to each time point in the current intensity change curve is determined based on the data values of the standard electroencephalogram intensity change curve and the standard electro-oculogram intensity change curve at that time point.
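By way of illustration and not limitation, the normalization and point-wise combination of claim 3 may be sketched as follows. Min-max scaling and an equal-weight average are assumptions; the claim only requires that each intensity value be determined from the two standard curves at the same time point.

```python
import numpy as np

def normalize(curve: np.ndarray) -> np.ndarray:
    """Scale a curve to the [0, 1] range (a constant curve maps to all zeros)."""
    span = curve.max() - curve.min()
    return (curve - curve.min()) / span if span > 0 else np.zeros_like(curve)

def current_intensity_change_curve(eeg_curve: np.ndarray, eog_curve: np.ndarray) -> np.ndarray:
    std_eeg = normalize(eeg_curve)      # standard electroencephalogram intensity change curve
    std_eog = normalize(eog_curve)      # standard electro-oculogram intensity change curve
    return 0.5 * (std_eeg + std_eog)    # assumed per-time-point combination rule
```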
4. The method for controlling a sleep-aid device according to claim 2, wherein the determining the brain activity corresponding to the target user according to the current intensity change curve and the eye movement intensity change curve comprises:
determining a plurality of first data points according to the current intensity change curve, wherein the current intensity corresponding to each first data point is higher than a preset first threshold value;
determining a plurality of second data points according to the eye movement intensity change curve, wherein the eye movement intensity corresponding to each second data point is higher than a preset second threshold value;
and acquiring data distribution characteristics corresponding to the first data points and the second data points, and inputting the data distribution characteristics into a pre-trained prediction model to obtain the brain activity.
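By way of illustration and not limitation, the thresholding and prediction of claim 4 may be sketched as follows. The summary statistics used here as data distribution characteristics, and the commented-out pre-trained model call, are assumptions; the patent does not specify the feature set or the model.

```python
import numpy as np

def points_above(curve: np.ndarray, threshold: float):
    """Return (time_index, value) pairs whose intensity exceeds the threshold."""
    idx = np.flatnonzero(curve > threshold)
    return idx, curve[idx]

def distribution_features(first_points, second_points) -> np.ndarray:
    """Concatenate simple count/mean/spread statistics of both point sets;
    these statistics stand in for the claimed data distribution characteristics."""
    feats = []
    for idx, vals in (first_points, second_points):
        feats.extend([float(len(idx)),
                      float(vals.mean()) if len(vals) else 0.0,
                      float(vals.std()) if len(vals) else 0.0])
    return np.asarray(feats)

# The pre-trained prediction model is not specified by the patent; any model that
# maps the feature vector to a brain-activity value could be used, for example:
# activity = pretrained_model.predict(distribution_features(p1, p2).reshape(1, -1))
```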
5. The method for controlling a sleep-aid device according to claim 4, wherein the acquiring data distribution characteristics corresponding to the first data points and the second data points includes:
normalizing the numerical values of the first data points and the second data points to obtain first standard data points corresponding to the first data points and second standard data points corresponding to the second data points;
determining a target time series according to the first standard data points and the second standard data points, wherein the target time series comprises a plurality of variables arranged in time order, and each variable is determined based on the first standard data point and/or the second standard data point at the time point corresponding to that variable;
and determining the data distribution characteristics according to the target time series.
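By way of illustration and not limitation, one possible construction of the target time series of claim 5 is sketched below. It reads the claim's "and/or" as averaging the first and second standard data points when both exist at the same time point; that reading, and the min-max scaling, are assumptions.

```python
import numpy as np

def target_time_series(first_idx, first_vals, second_idx, second_vals):
    """Merge normalized first/second data points into one time-ordered series."""
    def scale(v):
        v = np.asarray(v, dtype=float)
        span = v.max() - v.min() if v.size else 0.0
        return (v - v.min()) / span if span > 0 else np.zeros_like(v)

    merged = {}
    for t, v in zip(first_idx, scale(first_vals)):
        merged[t] = [v]
    for t, v in zip(second_idx, scale(second_vals)):
        merged.setdefault(t, []).append(v)

    times = sorted(merged)                                 # time points in order
    values = [float(np.mean(merged[t])) for t in times]    # average where both exist
    return np.asarray(times), np.asarray(values)
```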
6. The method for controlling a sleep-aid device according to claim 5, wherein the determining the data distribution characteristics according to the target time series includes:
acquiring time intervals between adjacent variables in the target time series, and determining time distribution characteristics according to the time intervals between the adjacent variables;
obtaining the numerical value of each variable in the target time series, and determining numerical distribution characteristics according to the numerical value of each variable;
and determining the data distribution characteristics according to the time distribution characteristics and the numerical distribution characteristics.
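By way of illustration and not limitation, claim 6's combination of time distribution characteristics and numerical distribution characteristics may be sketched as follows; the specific interval and value statistics used as features are assumptions.

```python
import numpy as np

def data_distribution_features(times: np.ndarray, values: np.ndarray) -> np.ndarray:
    """Concatenate assumed time-distribution and value-distribution statistics
    computed over a non-empty target time series."""
    intervals = np.diff(times) if times.size > 1 else np.zeros(1)
    time_feats = [intervals.mean(), intervals.std(), intervals.max()]       # temporal spread
    value_feats = [values.mean(), values.std(), values.max(), values.min()] # value spread
    return np.asarray(time_feats + value_feats, dtype=float)
```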
7. The method for controlling a sleep-aid device according to claim 1, wherein the working parameters include working duration and working intensity, and the determining a target sleep-assisting device and working parameters corresponding to the target sleep-assisting device according to the target sleep state and the acquisition time period includes:
determining the target sleep-assisting device from a plurality of preset sleep-assisting devices according to the target sleep state;
and determining the working duration and the working intensity corresponding to the target sleep-assisting device according to the target sleep state and the acquisition time period.
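By way of illustration and not limitation, the device and parameter selection of claim 7 may be sketched as a lookup keyed by sleep state and acquisition time period; the device names, durations and intensities below are assumptions, not values disclosed by the patent.

```python
# Illustrative lookup tables; real systems could use profiles tuned per user.
DEVICE_BY_STATE = {
    "awake": "audio_player",
    "light_sleep": "aroma_diffuser",
    "rem": "smart_eyeshade",
    "deep_sleep": "none",
}

PARAMETERS = {  # (state, period) -> (working duration in minutes, working intensity 0..1)
    ("awake", "evening"): (30, 0.7),
    ("awake", "night"): (20, 0.5),
    ("light_sleep", "night"): (15, 0.3),
}

def select_device_and_parameters(state: str, period: str):
    """Select the target sleep-assisting device, then its working duration and intensity."""
    device = DEVICE_BY_STATE.get(state, "audio_player")
    duration_min, intensity = PARAMETERS.get((state, period), (10, 0.2))
    return device, duration_min, intensity
```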
8. A control apparatus for a sleep-aid device, the apparatus comprising:
the data acquisition module is used for acquiring electroencephalogram data, electro-oculogram data and eye movement data corresponding to a target user, wherein the electroencephalogram data, the electro-oculogram data and the eye movement data respectively correspond to the same acquisition time period;
the state determination module is used for determining a target sleep state corresponding to the target user according to the electroencephalogram data, the electro-oculogram data and the eye movement data;
and the device control module is used for determining a target sleep-assisting device and working parameters corresponding to the target sleep-assisting device according to the target sleep state and the acquisition time period.
9. A terminal, characterized in that the terminal comprises a memory and one or more processors; the memory stores one or more programs; the one or more programs comprise instructions for executing the control method of a sleep-aid device according to any one of claims 1 to 7; and the one or more processors are configured to execute the one or more programs.
10. A computer readable storage medium having stored thereon a plurality of instructions adapted to be loaded and executed by a processor to implement the steps of a method of controlling a sleep-aid device as claimed in any one of claims 1 to 7.
CN202211047779.3A 2022-08-30 2022-08-30 Control method and device of sleep-assisting equipment, terminal and storage medium Active CN115154837B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211047779.3A CN115154837B (en) 2022-08-30 2022-08-30 Control method and device of sleep-assisting equipment, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211047779.3A CN115154837B (en) 2022-08-30 2022-08-30 Control method and device of sleep-assisting equipment, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN115154837A true CN115154837A (en) 2022-10-11
CN115154837B CN115154837B (en) 2022-12-09

Family

ID=83480794

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211047779.3A Active CN115154837B (en) 2022-08-30 2022-08-30 Control method and device of sleep-assisting equipment, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN115154837B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114010206A (en) * 2022-01-05 2022-02-08 深圳市心流科技有限公司 Sleep plan customizing method, system and terminal based on electroencephalogram signals
CN114053551A (en) * 2022-01-13 2022-02-18 深圳市心流科技有限公司 Electroencephalogram signal-based auxiliary sleep-in method and device, terminal and storage medium
CN114305346A (en) * 2022-03-03 2022-04-12 深圳市心流科技有限公司 Sleep monitoring method and device, intelligent eyeshade and storage medium
CN114917451A (en) * 2022-06-09 2022-08-19 北京清霆科技有限公司 Sleep aiding method and system based on real-time measurement signals

Also Published As

Publication number Publication date
CN115154837B (en) 2022-12-09

Similar Documents

Publication Publication Date Title
WO2021203719A1 (en) Acoustic-electric stimulation neuromodulation therapy and apparatus combining electroencephalogram testing, analysis and control
CN114010206B (en) Sleep plan customizing method, system and terminal based on electroencephalogram signals
US20230301588A1 (en) Mediation of traumatic brain injury
CN114247026B (en) Meditation training scoring method, device and terminal based on electroencephalogram signals
CN114511160B (en) Method, device, terminal and storage medium for predicting sleep time
CN114053551A (en) Electroencephalogram signal-based auxiliary sleep-in method and device, terminal and storage medium
CN115171850A (en) Sleep scheme generation method and device, terminal equipment and storage medium
CN114569863B (en) Sleep-assisted awakening method and system, electronic equipment and storage medium
CN114668563A (en) Multi-level regulation method for sampling frequency of electromyographic signals
CN107233653A Relaxation and decompression method based on brain wave context awareness and cloud platform storage technology
CN114451869A (en) Sleep state evaluation method and device, intelligent terminal and storage medium
CN115154837B (en) Control method and device of sleep-assisting equipment, terminal and storage medium
CN115517688B (en) Control method and device of wearable equipment, intelligent terminal and storage medium
CN114676737B (en) Dynamic regulation method for sampling frequency of electromyographic signal
CN113995939B (en) Sleep music playing method and device based on electroencephalogram signals and terminal
CN116269289A (en) Method for evaluating psychological and physiological health of athlete based on short-time heart rate variability
US20220059210A1 (en) Systems, methods, and devices for custom sleep age implementation
CN115192907A (en) Real-time biofeedback percutaneous vagus nerve electronic acupuncture device
CN113842117A (en) Sleep data acquisition and processing method and system
CN111760194A (en) Intelligent closed-loop nerve regulation and control system and method
CN113967023B (en) Closed-loop optogenetic intervention system and intervention method
CN113112017B (en) Electroencephalogram grading and prognosis FPGA decoding system based on neural manifold
KR102478102B1 (en) Method of treatment for insomnia
US11938275B2 (en) Systems, methods, and devices for custom sleep implementation
Ma Analysis of electroencephalogram (EEG) microstate for studying neural networks: A brief review

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant