CN108882083B - Signal processing method and Related product - Google Patents
- Publication number
- CN108882083B (application CN201810496982.6A / CN201810496982A)
- Authority
- CN
- China
- Prior art keywords
- oscillogram
- sleep
- user
- oscillograms
- interval
- Prior art date
- Legal status
- Expired - Fee Related
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R1/00—Details of transducers, loudspeakers or microphones
- H04R1/10—Earpieces; Attachments therefor ; Earphones; Monophonic headphones
- H04R1/1058—Manufacture or assembly
- H04R1/1075—Mountings of transducers in earphones or headphones
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4806—Sleep evaluation
- A61B5/4815—Sleep quality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R25/00—Deaf-aid sets, i.e. electro-acoustic or electro-mechanical hearing aids; Electric tinnitus maskers providing an auditory perception
- H04R25/55—Deaf-aid sets, i.e. electro-acoustic or electro-mechanical hearing aids; Electric tinnitus maskers providing an auditory perception using an external connection, either wireless or wired
Abstract
This application discloses a signal processing method and a related product. The wearable device is configured to be worn on a user's head and includes storage and processing circuitry and a sensor connected to the storage and processing circuitry. The method includes: collecting sensor data in the gravity direction; generating a first waveform diagram from the sensor data; performing an interception operation on the first waveform diagram to obtain a second waveform diagram, the second waveform diagram being a waveform diagram of the user's sleep stage; and analyzing the second waveform diagram to obtain the sleep quality of the user. With the embodiments of the present application, sensor data in the gravity direction can be collected and analyzed to obtain the sleep state of the user, which enriches the functions of the wireless earphone and improves the user experience.
Description
Technical Field
The present application relates to the field of electronic technologies, and in particular, to a signal processing method and a related product.
Background
With the maturity of wireless technology, wireless earphones are connected to electronic devices such as mobile phones through wireless technology in more and more scenarios. People can use wireless earphones to perform various functions such as listening to music and making calls. However, existing wireless earphones offer only a single function, which reduces the user experience.
Disclosure of Invention
The embodiment of the application provides a signal processing method and a related product, which can realize sleep quality analysis, enrich the functions of a wireless earphone and improve user experience.
In a first aspect, embodiments of the present application provide a wearable device for wearing on a user's head, the wearable device comprising storage and processing circuitry, and a sensor connected to the storage and processing circuitry, wherein,
the sensor is used for acquiring sensor data in the gravity direction;
the storage and processing circuitry is used for generating a first oscillogram from the sensor data, intercepting the first oscillogram to obtain a second oscillogram, where the second oscillogram is the oscillogram of the user in the sleep stage, and analyzing the second oscillogram to obtain the sleep quality of the user.
In a second aspect, an embodiment of the present application provides a signal processing method, which is applied to a wearable device, where the wearable device is worn on a head of a user, and the method includes:
collecting sensor data in the gravity direction;
generating a first oscillogram from the sensor data;
intercepting the first oscillogram to obtain a second oscillogram, wherein the second oscillogram is the oscillogram of the user in the sleep stage;
and analyzing according to the second oscillogram to obtain the sleep quality of the user.
In a third aspect, an embodiment of the present application provides a signal processing apparatus, which is applied to a wearable device, where the wearable device is worn on a head of a user, and the apparatus includes: the device comprises an acquisition unit, a generation unit, an interception unit and an analysis unit, wherein:
the acquisition unit is used for acquiring sensor data in the gravity direction;
the generating unit is used for generating a first oscillogram according to the sensor data;
the intercepting unit is used for intercepting the first oscillogram to obtain a second oscillogram, and the second oscillogram is the oscillogram of the user in the sleep stage;
and the analysis unit is used for analyzing according to the second oscillogram to obtain the sleep quality of the user.
In a fourth aspect, embodiments of the present application provide a wearable device, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the programs include instructions for performing the steps of any of the methods of the second aspect of the embodiments of the present application.
In a fifth aspect, the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program makes a computer perform part or all of the steps described in any one of the methods in the second aspect of the present application.
In a sixth aspect, the present application provides a computer program product, wherein the computer program product includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to perform some or all of the steps described in any one of the methods of the second aspect of the present application. The computer program product may be a software installation package.
It can be seen that the signal processing method and the related product described in the embodiments of the present application are applied to a wearable device worn on the head of a user. Sensor data in the gravity direction is collected by a sensor, a first oscillogram is generated according to the sensor data, and the first oscillogram is intercepted to obtain a second oscillogram, where the second oscillogram is an oscillogram of the sleep stage of the user; analysis is then performed according to the second oscillogram to obtain the sleep quality of the user.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1A is a schematic structural diagram of a wearable device disclosed in an embodiment of the present application;
fig. 1B is a schematic flow chart of a signal processing method disclosed in an embodiment of the present application;
FIG. 1C is a schematic illustration of a waveform diagram disclosed in an embodiment of the present application;
fig. 2 is a schematic flow chart of another signal processing method disclosed in the embodiments of the present application;
fig. 3 is a schematic flow chart of another signal processing method disclosed in the embodiments of the present application;
fig. 4 is a schematic structural diagram of another wearable device disclosed in the embodiments of the present application;
fig. 5 is a schematic structural diagram of a signal processing apparatus according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The following are detailed below.
The terms "first," "second," "third," and "fourth," etc. in the description and claims of this application and in the accompanying drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The wearable device may include at least one of: wireless earphones, brain wave acquisition devices, Augmented Reality (AR)/Virtual Reality (VR) devices, smart glasses, and the like, wherein the wireless earphones may implement communication by: wireless fidelity (Wi-Fi) technology, bluetooth technology, visible light communication technology, invisible light communication technology (infrared communication technology, ultraviolet communication technology), and the like. In the embodiment of the present application, a wireless headset is taken as an example, and the wireless headset includes a left earplug and a right earplug, where the left earplug can be taken as an independent component, and the right earplug can also be taken as an independent component.
Optionally, the wireless headset may be an ear-hook headset, an ear-plug headset, or a headset, which is not limited in the embodiments of the present application.
The wireless headset may be housed in a headset case, which may include: two receiving cavities (a first receiving cavity and a second receiving cavity) sized and shaped to receive a pair of wireless headsets (a left earbud and a right earbud); one or more earphone housing magnetic components disposed within the case for magnetically attracting and respectively magnetically securing a pair of wireless earphones into the two receiving cavities. The earphone box may further include an earphone cover. Wherein the first receiving cavity is sized and shaped to receive a first wireless headset and the second receiving cavity is sized and shaped to receive a second wireless headset.
The wireless headset may include a headset housing, a rechargeable battery (e.g., a lithium battery) disposed within the headset housing, a plurality of metal contacts disposed on an exterior surface of the headset housing for connecting the battery to a charging device, and a speaker assembly including a driver unit and a directional sound port, where the driver unit includes a magnet, a voice coil, and a diaphragm and is configured to emit sound through the directional sound port.
In one possible implementation, the wireless headset may further include a touch area, which may be located on an outer surface of the headset housing, and at least one touch sensor is disposed in the touch area for detecting a touch operation, and the touch sensor may include a capacitive sensor. When a user touches the touch area, the at least one capacitive sensor may detect a change in self-capacitance to recognize a touch operation.
In one possible implementation, the wireless headset may further include an acceleration sensor and a triaxial gyroscope, which may be disposed within the headset housing and are used to detect when the wireless headset is picked up or taken off.
In a possible implementation manner, the wireless headset may further include at least one air pressure sensor, and the air pressure sensor may be disposed on a surface of the headset housing and configured to detect air pressure in the ear after the wireless headset is worn. The wearing tightness of the wireless earphone can be detected through the air pressure sensor. When it is detected that the wireless earphone is worn loosely, the wireless earphone can send prompt information to an electronic device connected with the wireless earphone so as to prompt a user that the wireless earphone has a risk of falling.
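As a rough illustration of this check, the sketch below compares the measured in-ear air pressure with a looseness threshold and calls a caller-supplied notification function; the threshold, the direction of the comparison, and the notify callback are hypothetical details not specified in this application.

```python
def check_wearing_tightness(pressure_pa, loose_threshold_pa, notify):
    """Warn the user via the connected electronic device if the fit is loose.

    notify: caller-supplied callable that forwards a prompt message to the
            connected device (hypothetical transport, e.g. the existing link).
    """
    if pressure_pa < loose_threshold_pa:  # low in-ear pressure assumed to mean a loose fit
        notify("The wireless earphone is worn loosely and may fall out.")
        return False
    return True
```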
The following describes embodiments of the present application in detail.
Referring to fig. 1A, fig. 1A is a schematic structural diagram of a wearable device disclosed in an embodiment of the present application, the wearable device 100 includes a storage and processing circuit 110, and a sensor 170 connected to the storage and processing circuit 110, wherein:
the wearable device 100 may include control circuitry, which may include storage and processing circuitry 110. The storage and processing circuitry 110 may be a memory, such as a hard drive memory, a non-volatile memory (e.g., flash memory or other electronically programmable read-only memory used to form a solid state drive, etc.), a volatile memory (e.g., static or dynamic random access memory, etc.), etc., and the embodiments of the present application are not limited thereto. The processing circuitry in the storage and processing circuitry 110 may be used to control the operation of the wearable device 100. The processing circuitry may be implemented based on one or more microprocessors, microcontrollers, digital signal processors, baseband processors, power management units, audio codec chips, application specific integrated circuits, display driver integrated circuits, and the like.
The storage and processing circuitry 110 may be used to run software in the wearable device 100, such as an Internet browsing application, a Voice Over Internet Protocol (VOIP) phone call application, an email application, a media playing application, operating system functions, and so forth. Such software may be used to perform control operations such as camera-based image capture, ambient light measurement based on an ambient light sensor, proximity sensor measurement based on a proximity sensor, information display functionality based on status indicators such as status indicator lights of light emitting diodes, touch event detection based on touch sensors, functionality associated with displaying information on multiple (e.g., layered) displays, operations associated with performing wireless communication functions, operations associated with collecting and generating audio signals, control operations associated with collecting and processing button press event data, and other functions in wearable device 100, to name a few, embodiments of the present application are not limited.
The wearable device 100 may also include input-output circuitry 150. The input-output circuitry 150 may be used to enable the wearable device 100 to enable input and output of data, i.e., to allow the wearable device 100 to receive data from an external device and also to allow the wearable device 100 to output data from the wearable device 100 to an external device. The input-output circuit 150 may further include a sensor 170. The sensors 170 may include ambient light sensors, proximity sensors based on light and capacitance, touch sensors (e.g., based on optical touch sensors and/or capacitive touch sensors, where the touch sensors may be part of a touch display screen or used independently as a touch sensor structure), acceleration sensors, gravity sensors, and other sensors, among others.
Input-output circuitry 150 may also include one or more displays, such as display 130. Display 130 may include one or a combination of liquid crystal displays, organic light emitting diode displays, electronic ink displays, plasma displays, displays using other display technologies. Display 130 may include an array of touch sensors (i.e., display 130 may be a touch display screen). The touch sensor may be a capacitive touch sensor formed by a transparent touch sensor electrode (e.g., an Indium Tin Oxide (ITO) electrode) array, or may be a touch sensor formed using other touch technologies, such as acoustic wave touch, pressure sensitive touch, resistive touch, optical touch, and the like, and the embodiments of the present application are not limited thereto.
The audio component 140 may be used to provide audio input and output functionality for the wearable device 100. The audio components 140 in the wearable device 100 may include speakers, microphones, buzzers, tone generators, and other components for generating and detecting sounds.
The communication circuit 120 may be used to provide the wearable device 100 with the ability to communicate with external devices. The communication circuit 120 may include analog and digital input-output interface circuits, and wireless communication circuits based on radio frequency signals and/or optical signals. The wireless communication circuitry in communication circuitry 120 may include radio-frequency transceiver circuitry, power amplifier circuitry, low noise amplifiers, switches, filters, and antennas. For example, the wireless Communication circuitry in Communication circuitry 120 may include circuitry to support Near Field Communication (NFC) by transmitting and receiving Near Field coupled electromagnetic signals. For example, the communication circuit 120 may include a near field communication antenna and a near field communication transceiver. The communications circuitry 120 may also include a cellular telephone transceiver and antenna, a wireless local area network transceiver circuitry and antenna, and so forth.
The wearable device 100 may further include a battery, power management circuitry, and other input-output units 160. The input-output unit 160 may include buttons, joysticks, click wheels, scroll wheels, touch pads, keypads, keyboards, cameras, light emitting diodes and other status indicators, and the like.
A user may input commands through the input-output circuitry 150 to control the operation of the wearable device 100, and may receive status information and other outputs from the wearable device 100 through the input-output circuitry 150.
Based on the wearable device described in fig. 1A above, the following functions may be implemented:
the sensor 170 is configured to collect sensor data in a gravity direction;
the storage and processing circuitry 110 for generating a first waveform map from the sensor data; intercepting the first oscillogram to obtain a second oscillogram, wherein the second oscillogram is the oscillogram of the user in the sleep stage; and analyzing according to the second oscillogram to obtain the sleep quality of the user.
In one possible example, in the aspect of performing the clipping operation on the first waveform diagram to obtain the second waveform diagram, the storage and processing circuit 110 is specifically configured to:
dividing the first oscillogram into a plurality of segmented oscillograms;
determining an average energy value of each segmented oscillogram in the segmented oscillograms to obtain a plurality of first average energy values;
selecting a first average energy value smaller than a preset threshold value from the first average energy values to obtain a plurality of first target average energy values;
and connecting the segmented oscillograms corresponding to the first target average energy values according to the time sequence to obtain the second oscillogram.
In one possible example, in terms of obtaining the sleep quality of the user through the analysis according to the second waveform diagram, the storage and processing circuit 110 is specifically configured to:
sequentially connecting the acquisition points in the second oscillogram according to the time sequence to obtain a continuous oscillogram;
dividing the continuous oscillogram into a plurality of interval oscillograms;
counting the over-average value rate of each interval oscillogram in the interval oscillograms to obtain a plurality of over-average value rates;
determining a sleep state corresponding to each interval oscillogram according to the multiple over-average values to obtain multiple sleep states;
determining a sleep quality of the user in accordance with the plurality of sleep states.
In one possible example, the storage and processing circuitry 110 is further specifically configured to:
performing frequency domain conversion on each interval oscillogram in the interval oscillograms to obtain a plurality of frequency domain oscillograms, wherein the interval oscillograms correspond to the frequency domain oscillograms one by one;
respectively determining the average energy value of each frequency domain oscillogram in the plurality of frequency domain oscillograms to obtain a plurality of second average energy values;
in said determining the sleep quality of the user in dependence on the plurality of sleep states, the storage and processing circuitry 110 is specifically configured to:
verifying the plurality of sleep states according to the plurality of second average energy values to obtain a plurality of target sleep states;
determining the sleep quality of the user according to the plurality of target sleep states.
In one possible example, each of the plurality of target sleep states corresponds to a duration of time;
in said determining the sleep quality of the user in accordance with the plurality of target sleep states, the storage and processing circuitry 110 is specifically configured to:
classifying the target sleep states to obtain multiple types of sleep states;
counting the total duration of each type of sleep state in the multiple types of sleep states to obtain multiple total durations;
and determining the sleep quality of the user according to the total duration.
The wearable device described in fig. 1A can be used to perform a signal processing method, which is as follows:
the sensor 170 collects sensor data in the direction of gravity;
the storage and processing circuitry 110 generates a first waveform from the sensor data; intercepting the first oscillogram to obtain a second oscillogram, wherein the second oscillogram is the oscillogram of the user in the sleep stage; and analyzing according to the second oscillogram to obtain the sleep quality of the user.
Referring to fig. 1B, fig. 1B is a schematic flowchart of a signal processing method disclosed in an embodiment of the present application, applied to the wearable device shown in fig. 1A, where the signal processing method includes the following steps.
101. Sensor data is collected for the direction of gravity.
Among other things, the wearable device may include a gravity sensor or an acceleration sensor, which may be used to collect sensor data of the direction of gravity.
102. A first waveform map is generated from the sensor data.
The sensor data may be sensor data collected over a period of time, so that a first oscillogram can be generated from the sensor data in a coordinate system in which the horizontal axis is time and the vertical axis is the gravity acceleration value acquired by the sensor.
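As an illustration only, the first oscillogram can be thought of as a pair of arrays (time, gravity-direction acceleration); the sketch below builds it from raw samples, where the fixed sampling rate is an assumption rather than something specified in this application.

```python
import numpy as np

def build_first_waveform(samples, sample_rate_hz=50.0):
    """Build a (time, acceleration) waveform from gravity-direction samples.

    samples: 1-D sequence of gravity-direction acceleration values,
             assumed to be collected at a fixed sampling rate.
    Returns two arrays: time stamps (s) and the corresponding values.
    """
    values = np.asarray(samples, dtype=float)
    times = np.arange(len(values)) / sample_rate_hz  # horizontal axis: time
    return times, values                             # vertical axis: gravity acceleration
```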
103. And intercepting the first oscillogram to obtain a second oscillogram, wherein the second oscillogram is the oscillogram of the user in the sleep stage.
Since each segment in the first waveform diagram does not necessarily belong to the waveform diagram of the sleep stage, the first waveform diagram can be intercepted to obtain a second waveform diagram, and the second waveform diagram is the waveform diagram of the sleep stage of the user.
Optionally, in the step 103, performing a clipping operation on the first waveform diagram to obtain a second waveform diagram, which may include the following steps:
31. dividing the first oscillogram into a plurality of segmented oscillograms;
32. determining an average energy value of each segmented oscillogram in the segmented oscillograms to obtain a plurality of first average energy values;
33. selecting a first average energy value smaller than a preset threshold value from the first average energy values to obtain a plurality of first target average energy values;
34. and connecting the segmented oscillograms corresponding to the first target average energy values according to the time sequence to obtain the second oscillogram.
In a specific implementation, the preset threshold may be set by the user or may be a system default. The wearable device may divide the first oscillogram into a plurality of segmented oscillograms, where each segmented oscillogram may correspond to the same time length. The average energy value of each segmented oscillogram is then determined to obtain a plurality of first average energy values, and the first average energy values smaller than the preset threshold are selected to obtain a plurality of first target average energy values. In general, the energy value during the sleep stage is small, so the segmented oscillograms can be screened according to this principle. Finally, the segmented oscillograms corresponding to the plurality of first target average energy values can be connected in time order to obtain the second oscillogram.
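The following minimal sketch, under stated assumptions, walks through steps 31 to 34: the waveform is held as NumPy arrays, the "average energy value" is taken to be the mean squared amplitude of a segment, and the segment length and preset threshold are illustrative placeholders.

```python
import numpy as np

def intercept_sleep_waveform(times, values, segment_len=3000, energy_threshold=0.05):
    """Split the first waveform into segments, keep low-energy segments,
    and concatenate them in time order to form the second waveform."""
    second_t, second_v = [], []
    for start in range(0, len(values), segment_len):
        seg_t = times[start:start + segment_len]
        seg_v = values[start:start + segment_len]
        avg_energy = np.mean(seg_v ** 2)          # first average energy value of this segment
        if avg_energy < energy_threshold:         # sleep-stage segments have low energy
            second_t.append(seg_t)
            second_v.append(seg_v)
    if not second_t:
        return np.array([]), np.array([])
    # connect the retained segments according to the time sequence
    return np.concatenate(second_t), np.concatenate(second_v)
```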
104. And analyzing according to the second oscillogram to obtain the sleep quality of the user.
The second waveform diagram corresponds to the sleep stage, so it can be processed in segments and each segment can be analyzed to determine its sleep state, where the sleep state refers to the state of a person during sleep, as opposed to the waking state. In the embodiment of the present application, the sleep state may include, but is not limited to: a falling-asleep (initial sleep) state, a light sleep state, a deep sleep state, and the like. The falling-asleep state starts from drowsiness and gradually develops into sleep without full wakefulness being retained; breathing becomes slow, muscle tension is reduced, and the body relaxes slightly, and a sleeper in this initial state is easily awakened by external sound or touch. In the light sleep state (light to moderate sleep), the sleeper is not easily awakened, the muscles relax further, and the electroencephalogram shows spindle-shaped sleep waves. In the deep sleep state, muscle tension disappears, the muscles are fully relaxed, sensory function is further reduced, and the sleeper is not easily awakened.
In the embodiment of the present application, the sleep quality may be a specific sleep quality evaluation value or an analysis result of the second oscillogram. For the sleep quality evaluation value, expressed as a percentage for example, 100 corresponds to an ideal sleep state and 0 to complete wakefulness; 0 to 60 indicates poor sleep quality, 60 to 80 good, and 80 to 100 excellent. For the analysis result, for example, the user falls asleep from T1 to T2, is in light sleep from T2 to T3, and is in deep sleep from T3 to T4; or, for example, the falling-asleep duration is t1, the light sleep duration is t2, and the deep sleep duration is t3.
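Purely to illustrate the percentage scale above, a trivial grading helper might look as follows; whether a boundary value such as 60 counts as "good" is an assumption.

```python
def grade_sleep_quality(score):
    """Map a 0-100 sleep quality evaluation value to a textual grade."""
    if not 0 <= score <= 100:
        raise ValueError("score must be between 0 and 100")
    if score >= 80:
        return "excellent"
    if score >= 60:
        return "good"
    return "poor"
```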
Optionally, the step 104 of analyzing according to the second oscillogram to obtain the sleep quality of the user includes the following steps:
41. sequentially connecting the acquisition points in the second oscillogram according to the time sequence to obtain a continuous oscillogram;
42. dividing the continuous oscillogram into a plurality of interval oscillograms;
43. counting the over-average value rate of each interval oscillogram in the interval oscillograms to obtain a plurality of over-average value rates;
44. determining a sleep state corresponding to each interval oscillogram according to the multiple over-average values to obtain multiple sleep states;
45. determining a sleep quality of the user in accordance with the plurality of sleep states.
The second oscillogram may be a discrete oscillogram including a plurality of acquisition points. The acquisition points in the second oscillogram can be connected sequentially in time order, by straight-line or curve connection (the connection mode is not limited), to obtain a continuous oscillogram. The continuous oscillogram is divided into a plurality of interval oscillograms, and the over-average rate of each interval oscillogram is counted to obtain a plurality of over-average rates. The wearable device may pre-store a mapping relationship between ranges of the over-average rate and sleep states, so that the sleep state corresponding to each of the plurality of over-average rates can be determined according to the mapping relationship to obtain a plurality of sleep states, and the sleep quality of the user is determined according to the plurality of sleep states. Specifically, each sleep state corresponds to a duration, and the plurality of sleep states may be classified to obtain multiple types of sleep states, for example, the falling-asleep state as one type, the light sleep state as one type, and the deep sleep state as one type. Further, the total duration of each type of sleep state may be counted, that is, the durations within each type are accumulated to obtain multiple total durations, and the sleep quality of the user is determined according to the multiple total durations. For example, the proportion of the deep sleep state may be used as the sleep quality of the user, or the multiple total durations may be used as the sleep quality and displayed to the user, specifically in the form of text, a bar chart, or a pie chart; other ways are of course not limited here.
The mapping relationship between the range of the over-average value rate and the sleep state is provided as follows:
| Range of over-average rate | Sleep state |
| --- | --- |
| (a1, a2) | Deep sleep state |
| (a2, a3) | Light sleep state |
| (a3, a4) | Falling asleep |
Wherein a1 < a2 < a3 < a4. For any over-average rate, the corresponding over-average rate range can be determined, and then the corresponding sleep state can be found through the mapping relation.
In addition, the above-mentioned over-average rate may be understood as the ratio between the number of times the interval waveform graph crosses its average line and the interval length. As shown in fig. 1C, the horizontal axis is time t and the vertical axis is acceleration a; fig. 1C shows the interval waveform graph over an arbitrary interval [T1, T2], where the average line is the line parallel to the time axis at the average of the acceleration values of all points in the interval waveform graph. In fig. 1C the waveform crosses the average line 5 times, so the over-average rate is 5/(T2-T1).
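As an illustration only, the following Python sketch computes the over-average rate exactly as defined above (mean-line crossings divided by the interval length T2-T1) and looks up a sleep state through range boundaries a1 < a2 < a3 < a4; the concrete boundary values and the handling of rates outside the table are assumptions, not values disclosed in this application.

```python
import numpy as np

def over_average_rate(times, values):
    """Number of times the interval waveform crosses its mean line,
    divided by the interval length (T2 - T1)."""
    values = np.asarray(values, dtype=float)
    mean_line = np.mean(values)
    signs = np.sign(values - mean_line)
    signs = signs[signs != 0]                    # ignore points lying exactly on the mean line
    crossings = np.count_nonzero(np.diff(signs) != 0)
    interval_length = times[-1] - times[0]
    return crossings / interval_length

# placeholder boundaries a1 < a2 < a3 < a4 (assumed, not from this application)
A1, A2, A3, A4 = 0.0, 0.2, 0.5, 1.0

def sleep_state_from_rate(rate):
    """Map an over-average rate to a sleep state via the range table above."""
    if A1 < rate <= A2:
        return "deep sleep"
    if A2 < rate <= A3:
        return "light sleep"
    if A3 < rate <= A4:
        return "falling asleep"
    return "awake"                               # outside the table: treated as not asleep (assumption)
```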
Optionally, after the step 42, the following steps may be further included:
a1, performing frequency domain conversion on each interval oscillogram in the interval oscillograms to obtain a plurality of frequency domain oscillograms, wherein the interval oscillograms correspond to the frequency domain oscillograms one to one;
a2, respectively determining the average energy value of each frequency domain oscillogram in the frequency domain oscillograms to obtain a plurality of second average energy values;
then, the step 45 of determining the sleep quality of the user according to the plurality of sleep states may include the following steps:
b11, verifying the plurality of sleep states according to the plurality of second average energy values to obtain a plurality of target sleep states;
b12, determining the sleep quality of the user according to the target sleep states.
The frequency domain conversion may adopt a Fourier transform or a fast Fourier transform. After frequency domain conversion is performed on each interval oscillogram, a plurality of frequency domain oscillograms are obtained, and the interval oscillograms correspond to the frequency domain oscillograms one to one. The average energy value of each frequency domain oscillogram is determined to obtain a plurality of second average energy values. In a specific implementation, each sleep state may correspond to an energy value range, that is, a mapping relationship between sleep states and energy value ranges is stored in advance; the plurality of sleep states are then verified according to this mapping relationship and the plurality of second average energy values, where the purpose of the verification is to further check the accuracy of the identified sleep states. Specifically, the mapping relationship may be as follows:
range of energy values | Sleep state |
(E1,E2) | Deep sleep state |
(E2,E3) | Light sleep state |
(E3,E4) | Falling asleep |
Wherein E1 < E2 < E3 < E4. In the specific verification process, for example, if a certain section is identified as the light sleep state in step 44 but its corresponding energy value is greater than E3, the section is judged to be misidentified and may be removed; if a certain section is identified as the falling-asleep state in step 44 and its corresponding energy value falls within (E3, E4), the section is retained. Steps A1 and A2 further identify the sleep state in the frequency domain dimension, which improves the identification accuracy.
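The sketch below illustrates steps A1, A2 and B11 under stated assumptions: the frequency domain conversion uses NumPy's FFT, the "average energy value" is taken to be the mean squared spectral magnitude, and the energy boundaries E1 < E2 < E3 < E4 are placeholder numbers rather than values given in this application.

```python
import numpy as np

def frequency_domain_energy(values):
    """Average energy of an interval waveform in the frequency domain
    (mean squared magnitude of its FFT)."""
    spectrum = np.fft.rfft(np.asarray(values, dtype=float))
    return float(np.mean(np.abs(spectrum) ** 2))

# placeholder energy boundaries E1 < E2 < E3 < E4 (assumed, not from this application)
ENERGY_RANGES = {
    "deep sleep":     (0.0, 1.0),
    "light sleep":    (1.0, 5.0),
    "falling asleep": (5.0, 20.0),
}

def verify_states(states, energies):
    """Keep only the interval states whose frequency-domain energy falls
    inside the range stored for that state (the target sleep states)."""
    target_states = []
    for state, energy in zip(states, energies):
        low, high = ENERGY_RANGES.get(state, (None, None))
        if low is not None and low < energy <= high:
            target_states.append(state)      # retained: state and energy agree
    return target_states
```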
Optionally, each target sleep state of the plurality of target sleep states corresponds to a duration; in the step 45, determining the sleep quality of the user according to the target sleep states may include the following steps:
b21, classifying the target sleep states to obtain multiple sleep states;
b22, counting the total duration of each sleep state in the multiple sleep states to obtain multiple total durations;
and B23, determining the sleep quality of the user according to the total time length.
Each target sleep state in the plurality of target sleep states corresponds to a duration. In a specific implementation, the plurality of target sleep states may be classified to obtain multiple types of sleep states, for example, the falling-asleep state as one type, the light sleep state as one type, and the deep sleep state as one type. Further, the total duration of each type of sleep state may be counted, that is, the durations within each type are accumulated to obtain multiple total durations, and the sleep quality of the user is determined according to the multiple total durations. For example, the proportion of the deep sleep state may be used as the sleep quality of the user, or the multiple total durations may be used as the sleep quality and displayed to the user, specifically in the form of text, a bar chart, or a pie chart.
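A minimal sketch of steps B21 to B23, assuming each target sleep state is supplied as a (state, duration in seconds) pair; using the proportion of deep sleep as the sleep quality value is only one of the options mentioned above.

```python
from collections import defaultdict

def sleep_quality_from_states(target_states):
    """target_states: iterable of (state, duration_seconds) pairs.

    Returns the total duration per class of sleep state and the proportion
    of deep sleep, used here as the sleep quality value."""
    totals = defaultdict(float)
    for state, duration in target_states:
        totals[state] += duration                 # accumulate duration per class
    total_time = sum(totals.values())
    deep_ratio = totals.get("deep sleep", 0.0) / total_time if total_time else 0.0
    return dict(totals), deep_ratio

# example with hypothetical durations:
# totals, quality = sleep_quality_from_states(
#     [("falling asleep", 900), ("light sleep", 5400), ("deep sleep", 7200)])
```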
It can be seen that the signal processing method described in the embodiment of the present application is applied to a wearable device worn on the head of a user. The wearable device collects sensor data in the gravity direction, generates a first oscillogram according to the sensor data, and intercepts the first oscillogram to obtain a second oscillogram, where the second oscillogram is an oscillogram of the sleep stage of the user; analysis is then performed according to the second oscillogram to obtain the sleep quality of the user.
Referring to fig. 2, fig. 2 is a schematic flowchart of a signal processing method disclosed in an embodiment of the present application, applied to the wearable device shown in fig. 1A, and the signal processing method includes the following steps.
201. Sensor data is collected for the direction of gravity.
202. A first waveform map is generated from the sensor data.
203. The first waveform map is divided into a plurality of segmented waveform maps.
204. And determining the average energy value of each segmented oscillogram in the segmented oscillograms to obtain a plurality of first average energy values.
205. And selecting a first average energy value smaller than a preset threshold value from the plurality of first average energy values to obtain a plurality of first target average energy values.
206. And connecting the segmented oscillograms corresponding to the first target average energy values according to the time sequence to obtain a second oscillogram, wherein the second oscillogram is the oscillogram of the user in the sleep stage.
207. And sequentially connecting the acquisition points in the second oscillogram according to the time sequence to obtain a continuous oscillogram.
208. And dividing the continuous oscillogram into a plurality of interval oscillograms.
209. And counting the over-average value rate of each interval oscillogram in the interval oscillograms to obtain a plurality of over-average value rates.
210. And determining the sleep state corresponding to each interval oscillogram according to the multiple over-average values to obtain multiple sleep states.
211. Determining a sleep quality of the user in accordance with the plurality of sleep states.
It can be seen that the signal processing method described in the foregoing embodiment of the present application is applied to a wearable device worn on the head of a user. The wearable device collects sensor data in the gravity direction, generates a first waveform diagram according to the sensor data, divides the first waveform diagram into a plurality of segmented waveform diagrams, determines the average energy value of each segmented waveform diagram to obtain a plurality of first average energy values, selects the first average energy values smaller than a preset threshold to obtain a plurality of first target average energy values, and connects the segmented waveform diagrams corresponding to the plurality of first target average energy values according to the time sequence to obtain a second waveform diagram, where the second waveform diagram is a waveform diagram of the sleep stage of the user. The acquisition points in the second waveform diagram are connected according to the time sequence to obtain a continuous waveform diagram, the continuous waveform diagram is divided into a plurality of interval waveform diagrams, the over-average rate of each interval waveform diagram is counted to obtain a plurality of over-average rates, the sleep state corresponding to each interval waveform diagram is determined according to the over-average rates to obtain a plurality of sleep states, and the sleep quality of the user is determined according to the plurality of sleep states.
Referring to fig. 3, fig. 3 is a schematic flowchart of a signal processing method disclosed in an embodiment of the present application, applied to the wearable device shown in fig. 1A, where the signal processing method includes the following steps.
301. Sensor data is collected for the direction of gravity.
302. A first waveform map is generated from the sensor data.
303. The first waveform map is divided into a plurality of segmented waveform maps.
304. And determining the average energy value of each segmented oscillogram in the segmented oscillograms to obtain a plurality of first average energy values.
305. And selecting a first average energy value smaller than a preset threshold value from the plurality of first average energy values to obtain a plurality of first target average energy values.
306. And connecting the segmented oscillograms corresponding to the first target average energy values according to the time sequence to obtain a second oscillogram, wherein the second oscillogram is the oscillogram of the user in the sleep stage.
307. And sequentially connecting the acquisition points in the second oscillogram according to the time sequence to obtain a continuous oscillogram.
308. And dividing the continuous oscillogram into a plurality of interval oscillograms.
309. And counting the over-average value rate of each interval oscillogram in the interval oscillograms to obtain a plurality of over-average value rates.
310. And carrying out frequency domain conversion on each interval oscillogram in the interval oscillograms to obtain a plurality of frequency domain oscillograms, wherein the interval oscillograms correspond to the frequency domain oscillograms one by one.
311. And respectively determining the average energy value of each frequency domain oscillogram in the plurality of frequency domain oscillograms to obtain a plurality of second average energy values.
312. And determining the sleep state corresponding to each interval oscillogram according to the multiple over-average values to obtain multiple sleep states.
313. And verifying the plurality of sleep states according to the plurality of second average energy values to obtain a plurality of target sleep states.
314. Determining the sleep quality of the user according to the plurality of target sleep states.
It can be seen that the signal processing method described in the foregoing embodiment of the present application is applied to a wearable device worn on the head of a user. The wearable device collects sensor data in the gravity direction, generates a first waveform diagram according to the sensor data, divides the first waveform diagram into a plurality of segmented waveform diagrams, determines the average energy value of each segmented waveform diagram to obtain a plurality of first average energy values, selects the first average energy values smaller than a preset threshold to obtain a plurality of first target average energy values, and connects the segmented waveform diagrams corresponding to the plurality of first target average energy values according to the time sequence to obtain a second waveform diagram, where the second waveform diagram is a waveform diagram of the sleep stage of the user. The acquisition points in the second waveform diagram are connected according to the time sequence to obtain a continuous waveform diagram, and the continuous waveform diagram is divided into a plurality of interval waveform diagrams. The over-average rate of each interval waveform diagram is counted to obtain a plurality of over-average rates, frequency domain conversion is performed on each interval waveform diagram to obtain a plurality of frequency domain waveform diagrams corresponding one to one to the interval waveform diagrams, and the average energy value of each frequency domain waveform diagram is determined to obtain a plurality of second average energy values. The sleep state corresponding to each interval waveform diagram is determined according to the plurality of over-average rates to obtain a plurality of sleep states, the plurality of sleep states are verified according to the plurality of second average energy values to obtain a plurality of target sleep states, and the sleep quality of the user is determined according to the plurality of target sleep states. In this way, sensor data in the gravity direction can be collected and analyzed to obtain the sleep state of the user, which enriches the functions of the wireless earphone and improves the user experience.
Referring to fig. 4, fig. 4 is a schematic structural diagram of another wearable device disclosed in the embodiment of the present application, and as shown in fig. 4, the wearable device includes a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the program includes instructions for performing the following steps:
collecting sensor data in the gravity direction;
generating a first oscillogram from the sensor data;
intercepting the first oscillogram to obtain a second oscillogram, wherein the second oscillogram is the oscillogram of the user in the sleep stage;
and analyzing according to the second oscillogram to obtain the sleep quality of the user.
It can be seen that the wearable device described in the embodiment of the present application is worn on the head of a user, collects sensor data in the gravity direction, generates a first oscillogram according to the sensor data, intercepts the first oscillogram to obtain a second oscillogram, where the second oscillogram is an oscillogram of the sleep stage of the user, and performs analysis according to the second oscillogram to obtain the sleep quality of the user.
In one possible example, in the aspect of performing the clipping operation on the first waveform diagram to obtain the second waveform diagram, the program includes instructions for:
dividing the first oscillogram into a plurality of segmented oscillograms;
determining an average energy value of each segmented oscillogram in the segmented oscillograms to obtain a plurality of first average energy values;
selecting a first average energy value smaller than a preset threshold value from the first average energy values to obtain a plurality of first target average energy values;
and connecting the segmented oscillograms corresponding to the first target average energy values according to the time sequence to obtain the second oscillogram.
In one possible example, in said analyzing from said second waveform map to obtain the sleep quality of the user, the above program includes instructions for performing the following steps:
sequentially connecting the acquisition points in the second oscillogram according to the time sequence to obtain a continuous oscillogram;
dividing the continuous oscillogram into a plurality of interval oscillograms;
counting the over-average value rate of each interval oscillogram in the interval oscillograms to obtain a plurality of over-average value rates;
determining a sleep state corresponding to each interval oscillogram according to the multiple over-average values to obtain multiple sleep states;
determining a sleep quality of the user in accordance with the plurality of sleep states.
In one possible example, the program further includes instructions for performing the steps of:
performing frequency domain conversion on each interval oscillogram in the interval oscillograms to obtain a plurality of frequency domain oscillograms, wherein the interval oscillograms correspond to the frequency domain oscillograms one by one;
respectively determining the average energy value of each frequency domain oscillogram in the plurality of frequency domain oscillograms to obtain a plurality of second average energy values;
in said determining the quality of sleep of the user in dependence on the plurality of sleep states, the program comprises instructions for performing the steps of:
verifying the plurality of sleep states according to the plurality of second average energy values to obtain a plurality of target sleep states;
determining the sleep quality of the user according to the plurality of target sleep states.
In one possible example, each of the plurality of target sleep states corresponds to a duration of time;
in said determining the quality of sleep of the user in accordance with the plurality of target sleep states, the program comprises instructions for:
classifying the target sleep states to obtain multiple types of sleep states;
counting the total duration of each type of sleep state in the multiple types of sleep states to obtain multiple total durations;
and determining the sleep quality of the user according to the total duration.
The above description has introduced the solution of the embodiments of the present application mainly from the perspective of the method-side implementation process. It is understood that, in order to realize the above functions, the wearable device includes corresponding hardware structures and/or software modules for performing the respective functions. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments provided herein can be implemented as hardware or as a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the wearable device may be divided into the functional units according to the above method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
Referring to fig. 5, fig. 5 is a schematic structural diagram of a signal processing apparatus, which is applied to a wearable device, the wearable device being worn on a head of a user, the apparatus including: an obtaining unit 501, a generating unit 502, a clipping unit 503 and an analyzing unit 504, wherein:
the acquiring unit 501 is configured to acquire sensor data in a gravity direction;
the generating unit 502 is configured to generate a first oscillogram according to the sensor data;
the intercepting unit 503 is configured to intercept the first waveform diagram to obtain a second waveform diagram, where the second waveform diagram is a waveform diagram of a sleep stage of the user;
the analysis unit 504 is configured to perform analysis according to the second oscillogram to obtain the sleep quality of the user.
In a possible example, in terms of performing the clipping operation on the first waveform diagram to obtain a second waveform diagram, the clipping unit 503 is specifically configured to:
dividing the first oscillogram into a plurality of segmented oscillograms;
determining an average energy value of each segmented oscillogram in the segmented oscillograms to obtain a plurality of first average energy values;
selecting a first average energy value smaller than a preset threshold value from the first average energy values to obtain a plurality of first target average energy values;
and connecting the segmented oscillograms corresponding to the first target average energy values according to the time sequence to obtain the second oscillogram.
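For illustration, a minimal Python sketch of this interception step is given below. It assumes the first oscillogram is a one-dimensional array of gravity-axis samples; the segment length and the preset energy threshold are placeholder values chosen for the example, not values taken from the embodiment.

```python
import numpy as np

def intercept_sleep_stage(first_oscillogram, segment_len=3000, energy_threshold=0.05):
    """Intercept the first oscillogram to obtain the second oscillogram.

    segment_len and energy_threshold are illustrative placeholders; the
    embodiment only refers to a 'preset threshold' without a concrete value.
    """
    # Divide the first oscillogram into a plurality of segmented oscillograms.
    n_segments = len(first_oscillogram) // segment_len
    segments = [first_oscillogram[i * segment_len:(i + 1) * segment_len]
                for i in range(n_segments)]

    # Determine the average energy value of each segmented oscillogram.
    first_avg_energies = [float(np.mean(np.square(seg))) for seg in segments]

    # Keep the segments whose average energy is below the preset threshold
    # (little head movement, taken as the sleep stage) and connect them
    # according to the time sequence.
    kept = [seg for seg, energy in zip(segments, first_avg_energies)
            if energy < energy_threshold]
    return np.concatenate(kept) if kept else np.empty(0)
```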
In a possible example, in terms of obtaining the sleep quality of the user through the analysis according to the second oscillogram, the analysis unit 504 is specifically configured to:
sequentially connecting the acquisition points in the second oscillogram according to the time sequence to obtain a continuous oscillogram;
dividing the continuous oscillogram into a plurality of interval oscillograms;
counting the over-average value rate of each interval oscillogram in the interval oscillograms to obtain a plurality of over-average value rates;
determining a sleep state corresponding to each interval oscillogram according to the plurality of over-average value rates to obtain a plurality of sleep states;
determining a sleep quality of the user in accordance with the plurality of sleep states.
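A possible reading of this analysis step is sketched below in Python. It assumes the over-average value rate of an interval is the fraction of its sample points that exceed the mean of the whole continuous oscillogram, and the mapping from rate ranges to sleep states is purely illustrative, since no concrete ranges are given here.

```python
import numpy as np

def analyze_sleep_states(second_oscillogram, interval_len=1500):
    """Divide the continuous oscillogram into interval oscillograms and map
    each interval's over-average value rate to a sleep state.

    The rate-to-state ranges below are illustrative only.
    """
    overall_mean = float(np.mean(second_oscillogram))

    # Divide the continuous oscillogram into a plurality of interval oscillograms.
    n_intervals = len(second_oscillogram) // interval_len
    intervals = [second_oscillogram[i * interval_len:(i + 1) * interval_len]
                 for i in range(n_intervals)]

    sleep_states = []
    for interval in intervals:
        # Over-average value rate: fraction of points exceeding the overall mean
        # of the continuous oscillogram (assumed interpretation).
        rate = float(np.mean(interval > overall_mean))
        if rate < 0.2:
            sleep_states.append("deep sleep")
        elif rate < 0.4:
            sleep_states.append("light sleep")
        else:
            sleep_states.append("awake")
    return intervals, sleep_states
```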
In one possible example, the analyzing unit 504 is further specifically configured to:
performing frequency domain conversion on each interval oscillogram in the interval oscillograms to obtain a plurality of frequency domain oscillograms, wherein the interval oscillograms correspond to the frequency domain oscillograms one by one; respectively determining the average energy value of each frequency domain oscillogram in the plurality of frequency domain oscillograms to obtain a plurality of second average energy values;
in said determining the sleep quality of the user in dependence on the plurality of sleep states, the analyzing unit 504 is specifically configured to:
verifying the plurality of sleep states according to the plurality of second average energy values to obtain a plurality of target sleep states;
determining the sleep quality of the user according to the plurality of target sleep states.
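The verification step could, for example, be implemented as follows. The sketch converts each interval oscillogram to the frequency domain with an FFT, derives a second average energy value from the spectrum, and demotes a "deep sleep" label whose spectral energy exceeds a threshold; both the correction rule and the threshold value are assumptions made for illustration.

```python
import numpy as np

def verify_sleep_states(intervals, sleep_states, spectral_threshold=0.02):
    """Verify the time-domain sleep states with second average energy values
    computed from the frequency-domain oscillograms.

    The demotion rule and spectral_threshold are assumptions for illustration.
    """
    target_states = []
    for interval, state in zip(intervals, sleep_states):
        # Frequency-domain conversion of the interval oscillogram.
        spectrum = np.fft.rfft(interval)
        # Second average energy value of the frequency-domain oscillogram.
        second_avg_energy = float(np.mean(np.abs(spectrum) ** 2)) / len(interval)
        if state == "deep sleep" and second_avg_energy > spectral_threshold:
            # Spectral energy contradicts the time-domain label; do not confirm it.
            target_states.append("light sleep")
        else:
            target_states.append(state)
    return target_states
```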
In one possible example, each of the plurality of target sleep states corresponds to a duration of time;
in said determining the sleep quality of the user according to the plurality of target sleep states, the analyzing unit 504 is specifically configured to:
classifying the plurality of target sleep states to obtain multiple types of sleep states;
counting the total duration of each type of sleep state in the multiple types of sleep states to obtain multiple total durations;
and determining the sleep quality of the user according to the multiple total durations.
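A simple way to turn the target sleep states into a sleep-quality result is sketched below. The per-interval duration of 60 seconds and the quality thresholds on the deep-sleep ratio are assumed values used only for illustration.

```python
from collections import defaultdict

def score_sleep_quality(target_states, interval_duration_s=60.0):
    """Classify the target sleep states, total the duration of each type,
    and derive a sleep-quality label from the deep-sleep proportion.

    interval_duration_s and the quality thresholds are illustrative values.
    """
    # Total duration of each type of sleep state.
    totals = defaultdict(float)
    for state in target_states:
        totals[state] += interval_duration_s

    total_time = sum(totals.values()) or 1.0
    deep_ratio = totals.get("deep sleep", 0.0) / total_time

    # Map the deep-sleep proportion to a quality label (assumed rule).
    if deep_ratio > 0.25:
        quality = "good"
    elif deep_ratio > 0.10:
        quality = "fair"
    else:
        quality = "poor"
    return dict(totals), quality
```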
It can be seen that the signal processing apparatus described in the above embodiment of the present application is applied to a wearable device worn on the head of a user. The apparatus collects sensor data in the gravity direction, generates a first oscillogram according to the sensor data, intercepts the first oscillogram to obtain a second oscillogram (the oscillogram of the sleep stage of the user), and analyzes the second oscillogram to obtain the sleep quality of the user. In this way, sensor data can be collected by a gravity sensor and analyzed to obtain the sleep state of the user, thereby enriching the functions of wireless earphones and improving the user experience.
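Tying the sketches above together, an illustrative end-to-end run on synthetic gravity-sensor data might look as follows; the sampling rate, noise level and all thresholds are assumptions, and real sensor data from the wearable device would replace the synthetic array.

```python
import numpy as np

# Synthetic gravity-axis samples standing in for real sensor data
# (8 hours at an assumed 10 Hz sampling rate).
rng = np.random.default_rng(0)
sensor_data = rng.normal(loc=0.0, scale=0.1, size=8 * 3600 * 10)

# Pipeline built from the sketches above (all thresholds are assumptions).
second_oscillogram = intercept_sleep_stage(sensor_data)
intervals, sleep_states = analyze_sleep_states(second_oscillogram)
target_states = verify_sleep_states(intervals, sleep_states)
durations, quality = score_sleep_quality(target_states)
print(durations, quality)
```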
Embodiments of the present application also provide a computer storage medium, wherein the computer storage medium stores a computer program for electronic data exchange, the computer program enabling a computer to perform part or all of the steps of any one of the methods described in the above method embodiments, where the computer includes a wearable device.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any of the methods as described in the above method embodiments. The computer program product may be a software installation package, the computer comprising a wearable device.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or a combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may be performed in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments, and that the acts and modules referred to are not necessarily required by the present application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative: the division of the units is only a division of logical functions, and other divisions may be used in practice; for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling, direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit may be stored in a computer-readable memory if it is implemented in the form of a software functional unit and sold or used as a stand-alone product. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned memory includes: a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk or an optical disk, and other media capable of storing program code.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by related hardware instructed by a program, and the program may be stored in a computer-readable memory, which may include: a flash memory disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and the like.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and implementations of the present application; the above description of the embodiments is only provided to help understand the method and core concept of the present application. Meanwhile, for a person skilled in the art, there may be variations in the specific implementation and application scope according to the idea of the present application. In view of the above, the content of this specification should not be construed as a limitation of the present application.
Claims (17)
1. A wearable device for wearing on a user's head, the wearable device comprising a storage and processing circuit, and a sensor connected to the storage and processing circuit, wherein,
the sensor is used for acquiring sensor data in the gravity direction;
the storage and processing circuit is configured to generate a first oscillogram from the sensor data; intercept the first oscillogram to obtain a second oscillogram, wherein the second oscillogram is the oscillogram of the user in the sleep stage; and analyze according to the second oscillogram to obtain the sleep quality of the user;
wherein, in the aspect of intercepting the first oscillogram to obtain the second oscillogram, the storage and processing circuit is specifically configured to:
dividing the first oscillogram into a plurality of segmented oscillograms;
determining an average energy value of each segmented oscillogram in the segmented oscillograms to obtain a plurality of first average energy values;
selecting a first average energy value smaller than a preset threshold value from the first average energy values to obtain a plurality of first target average energy values;
and connecting the segmented oscillograms corresponding to the first target average energy values according to the time sequence to obtain the second oscillogram.
2. The wearable device of claim 1, wherein, in the analyzing according to the second oscillogram to obtain the sleep quality of the user, the storage and processing circuit is specifically configured to:
sequentially connecting the acquisition points in the second oscillogram according to the time sequence to obtain a continuous oscillogram;
dividing the continuous oscillogram into a plurality of interval oscillograms;
counting the over-average value rate of each interval oscillogram in the interval oscillograms to obtain a plurality of over-average value rates;
determining a sleep state corresponding to each interval oscillogram according to the plurality of over-average value rates to obtain a plurality of sleep states;
determining a sleep quality of the user in accordance with the plurality of sleep states.
3. The wearable device of claim 2, wherein the storage and processing circuit is further specifically configured to:
performing frequency domain conversion on each interval oscillogram in the interval oscillograms to obtain a plurality of frequency domain oscillograms, wherein the interval oscillograms correspond to the frequency domain oscillograms one by one;
respectively determining the average energy value of each frequency domain oscillogram in the plurality of frequency domain oscillograms to obtain a plurality of second average energy values;
in said determining the quality of sleep of the user in accordance with the plurality of sleep states, the storage and processing circuitry is specifically configured to:
verifying the plurality of sleep states according to the plurality of second average energy values to obtain a plurality of target sleep states;
determining the sleep quality of the user according to the plurality of target sleep states.
4. The wearable device of claim 3, wherein each of the plurality of target sleep states corresponds to a duration of time;
in the determining the quality of sleep of the user in accordance with the plurality of target sleep states, the storage and processing circuitry is specifically configured to:
classifying the plurality of target sleep states to obtain multiple types of sleep states;
counting the total duration of each type of sleep state in the multiple types of sleep states to obtain multiple total durations;
and determining the sleep quality of the user according to the multiple total durations.
5. A wearable device for wearing on a user's head, the wearable device comprising a storage and processing circuit, and a sensor connected to the storage and processing circuit, wherein,
the sensor is used for acquiring sensor data in the gravity direction;
the storage and processing circuit is configured to generate a first oscillogram from the sensor data; intercept the first oscillogram to obtain a second oscillogram, wherein the second oscillogram is the oscillogram of the user in the sleep stage; and analyze according to the second oscillogram to obtain the sleep quality of the user;
wherein,
in the aspect of obtaining the sleep quality of the user by analyzing according to the second oscillogram, the storage and processing circuit is specifically configured to:
sequentially connecting the acquisition points in the second oscillogram according to the time sequence to obtain a continuous oscillogram;
dividing the continuous oscillogram into a plurality of interval oscillograms;
counting the over-average value rate of each interval oscillogram in the interval oscillograms to obtain a plurality of over-average value rates;
determining a sleep state corresponding to each interval oscillogram according to the plurality of over-average value rates to obtain a plurality of sleep states;
determining a sleep quality of the user in accordance with the plurality of sleep states.
6. The wearable device of claim 5, wherein the storage and processing circuit is further specifically configured to:
performing frequency domain conversion on each interval oscillogram in the interval oscillograms to obtain a plurality of frequency domain oscillograms, wherein the interval oscillograms correspond to the frequency domain oscillograms one by one;
respectively determining the average energy value of each frequency domain oscillogram in the plurality of frequency domain oscillograms to obtain a plurality of second average energy values;
in said determining the quality of sleep of the user in accordance with the plurality of sleep states, the storage and processing circuitry is specifically configured to:
verifying the plurality of sleep states according to the plurality of second average energy values to obtain a plurality of target sleep states;
determining the sleep quality of the user according to the plurality of target sleep states.
7. The wearable device of claim 6, wherein each of the plurality of target sleep states corresponds to a duration of time;
in the determining the quality of sleep of the user in accordance with the plurality of target sleep states, the storage and processing circuitry is specifically configured to:
classifying the plurality of target sleep states to obtain multiple types of sleep states;
counting the total duration of each type of sleep state in the multiple types of sleep states to obtain multiple total durations;
and determining the sleep quality of the user according to the multiple total durations.
8. A signal processing method is applied to a wearable device, wherein the wearable device is worn on the head of a user, and the method comprises the following steps:
collecting sensor data in the gravity direction;
generating a first oscillogram from the sensor data;
intercepting the first oscillogram to obtain a second oscillogram, wherein the second oscillogram is the oscillogram of the user in the sleep stage;
analyzing according to the second oscillogram to obtain the sleep quality of the user;
wherein the intercepting the first oscillogram to obtain the second oscillogram comprises:
dividing the first oscillogram into a plurality of segmented oscillograms;
determining an average energy value of each segmented oscillogram in the segmented oscillograms to obtain a plurality of first average energy values;
selecting a first average energy value smaller than a preset threshold value from the first average energy values to obtain a plurality of first target average energy values;
and connecting the segmented oscillograms corresponding to the first target average energy values according to the time sequence to obtain the second oscillogram.
9. The method of claim 8, wherein the analyzing according to the second oscillogram to obtain the sleep quality of the user comprises:
sequentially connecting the acquisition points in the second oscillogram according to the time sequence to obtain a continuous oscillogram;
dividing the continuous oscillogram into a plurality of interval oscillograms;
counting the over-average value rate of each interval oscillogram in the interval oscillograms to obtain a plurality of over-average value rates;
determining a sleep state corresponding to each interval oscillogram according to the plurality of over-average value rates to obtain a plurality of sleep states;
determining a sleep quality of the user in accordance with the plurality of sleep states.
10. The method of claim 9, further comprising:
performing frequency domain conversion on each interval oscillogram in the interval oscillograms to obtain a plurality of frequency domain oscillograms, wherein the interval oscillograms correspond to the frequency domain oscillograms one by one;
respectively determining the average energy value of each frequency domain oscillogram in the plurality of frequency domain oscillograms to obtain a plurality of second average energy values;
the determining the sleep quality of the user according to the plurality of sleep states comprises:
verifying the plurality of sleep states according to the plurality of second average energy values to obtain a plurality of target sleep states;
determining the sleep quality of the user according to the plurality of target sleep states.
11. The method of claim 10, wherein each of the plurality of target sleep states corresponds to a duration of time;
the determining the sleep quality of the user according to the plurality of target sleep states comprises:
classifying the plurality of target sleep states to obtain multiple types of sleep states;
counting the total duration of each type of sleep state in the multiple types of sleep states to obtain multiple total durations;
and determining the sleep quality of the user according to the multiple total durations.
12. A signal processing method is applied to a wearable device, wherein the wearable device is worn on the head of a user, and the method comprises the following steps:
collecting sensor data in the gravity direction;
generating a first oscillogram from the sensor data;
intercepting the first oscillogram to obtain a second oscillogram, wherein the second oscillogram is the oscillogram of the user in the sleep stage;
analyzing according to the second oscillogram to obtain the sleep quality of the user;
wherein the analyzing according to the second oscillogram to obtain the sleep quality of the user comprises:
sequentially connecting the acquisition points in the second oscillogram according to the time sequence to obtain a continuous oscillogram;
dividing the continuous oscillogram into a plurality of interval oscillograms;
counting the over-average value rate of each interval oscillogram in the interval oscillograms to obtain a plurality of over-average value rates;
determining a sleep state corresponding to each interval oscillogram according to the plurality of over-average value rates to obtain a plurality of sleep states;
determining a sleep quality of the user in accordance with the plurality of sleep states.
13. The method of claim 12, further comprising:
performing frequency domain conversion on each interval oscillogram in the interval oscillograms to obtain a plurality of frequency domain oscillograms, wherein the interval oscillograms correspond to the frequency domain oscillograms one by one;
respectively determining the average energy value of each frequency domain oscillogram in the plurality of frequency domain oscillograms to obtain a plurality of second average energy values;
the determining the sleep quality of the user according to the plurality of sleep states comprises:
verifying the plurality of sleep states according to the plurality of second average energy values to obtain a plurality of target sleep states;
determining the sleep quality of the user according to the plurality of target sleep states.
14. The method of claim 13, wherein each of the plurality of target sleep states corresponds to a duration of time;
the determining the sleep quality of the user according to the plurality of target sleep states comprises:
classifying the plurality of target sleep states to obtain multiple types of sleep states;
counting the total duration of each type of sleep state in the multiple types of sleep states to obtain multiple total durations;
and determining the sleep quality of the user according to the multiple total durations.
15. A signal processing apparatus, applied to a wearable device, the wearable device being worn on a head of a user, the apparatus comprising: the device comprises an acquisition unit, a generation unit, an interception unit and an analysis unit, wherein:
the acquisition unit is used for acquiring sensor data in the gravity direction;
the generating unit is used for generating a first oscillogram according to the sensor data;
the intercepting unit is used for intercepting the first oscillogram to obtain a second oscillogram, and the second oscillogram is the oscillogram of the user in the sleep stage;
the analysis unit is used for analyzing according to the second oscillogram to obtain the sleep quality of the user;
wherein the intercepting the first oscillogram to obtain the second oscillogram comprises:
dividing the first oscillogram into a plurality of segmented oscillograms; determining an average energy value of each segmented oscillogram in the segmented oscillograms to obtain a plurality of first average energy values; selecting a first average energy value smaller than a preset threshold value from the first average energy values to obtain a plurality of first target average energy values; connecting the segmented oscillograms corresponding to the first target average energy values according to the time sequence to obtain a second oscillogram;
or
the analyzing according to the second oscillogram to obtain the sleep quality of the user comprises: sequentially connecting the acquisition points in the second oscillogram according to the time sequence to obtain a continuous oscillogram; dividing the continuous oscillogram into a plurality of interval oscillograms; counting the over-average value rate of each interval oscillogram in the interval oscillograms to obtain a plurality of over-average value rates; determining a sleep state corresponding to each interval oscillogram according to the plurality of over-average value rates to obtain a plurality of sleep states; determining a sleep quality of the user in accordance with the plurality of sleep states.
16. A wearable device comprising a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the programs comprising instructions for performing the steps in the method of any of claims 8-14.
17. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method according to any one of claims 8-14.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810496982.6A CN108882083B (en) | 2018-05-22 | 2018-05-22 | Signal processing method and Related product |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108882083A CN108882083A (en) | 2018-11-23 |
CN108882083B (en) | 2019-10-18 |
Family
ID=64333190
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810496982.6A Expired - Fee Related CN108882083B (en) | Signal processing method and Related product | 2018-05-22 | 2018-05-22 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108882083B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109876271B (en) * | 2019-03-06 | 2019-12-06 | 深圳市神舟电脑股份有限公司 | Wearable equipment remote control governing system |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106333676A (en) * | 2016-09-21 | 2017-01-18 | 广州视源电子科技股份有限公司 | Electroencephalogram data type labeling device in waking state |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9610030B2 (en) * | 2015-01-23 | 2017-04-04 | Hello Inc. | Room monitoring device and sleep analysis methods |
JP6459241B2 (en) * | 2014-06-25 | 2019-01-30 | Tdk株式会社 | Sleep state estimation device, sleep state estimation method, and program |
CN105380600A (en) * | 2015-11-04 | 2016-03-09 | 北京握奇数据系统有限公司 | Automatic sleep detection method and system based on wearable intelligent equipment |
JP6439729B2 (en) * | 2016-03-24 | 2018-12-19 | トヨタ自動車株式会社 | Sleep state estimation device |
CN107638165B (en) * | 2016-07-20 | 2021-01-26 | 平安科技(深圳)有限公司 | Sleep detection method and device |
CN106667435A (en) * | 2016-12-17 | 2017-05-17 | 复旦大学 | Intelligent sensing mattress for monitoring sleep |
CN106725338A (en) * | 2017-01-06 | 2017-05-31 | 山东诺安诺泰信息系统有限公司 | A kind of sleep quality signal detecting method |
Also Published As
Publication number | Publication date |
---|---|
CN108882083A (en) | 2018-11-23 |
Similar Documents
Publication | Title |
---|---|
CN108810693B (en) | Wearable device and device control device and method thereof |
US9949008B2 (en) | Reproduction of ambient environmental sound for acoustic transparency of ear canal device system and method |
CN108702567B (en) | Earphone, method for detecting wearing state of earphone and electronic equipment |
CN108538320B (en) | Recording control method and device, readable storage medium and terminal |
JP6365939B2 (en) | Sleep assist system |
CN108668009B (en) | Input operation control method, device, terminal, earphone and readable storage medium |
WO2018045536A1 (en) | Sound signal processing method, terminal, and headphones |
CN108763901B (en) | Ear print information acquisition method and device, terminal, earphone and readable storage medium |
CN108874130B (en) | Play control method and related product |
CN108540900B (en) | Volume adjusting method and related product |
CN110278509A (en) | A kind of wireless headset control method, device and wireless headset and storage medium |
CN109656511A (en) | A kind of audio frequency playing method, terminal and computer readable storage medium |
CN108540660B (en) | Voice signal processing method and device, readable storage medium and terminal |
CN207560279U (en) | Earphone and electronic equipment |
CN112947886B (en) | Method and device for protecting hearing of user and electronic equipment |
CN108683790B (en) | Voice processing method and related product |
US10582290B2 (en) | Earpiece with tap functionality |
CN114095847B (en) | Environmental and polymeric acoustic dosimetry |
CN107863110A (en) | Safety prompt function method, intelligent earphone and storage medium based on intelligent earphone |
CN109039355B (en) | Voice prompt method and related product |
CN107749306B (en) | Vibration optimization method and mobile terminal |
CN110058837B (en) | Audio output method and terminal |
CN108827338B (en) | Voice navigation method and related product |
CN108810787B (en) | Foreign matter detection method and device based on audio equipment and terminal |
CN107705804A (en) | A kind of audible device condition detection method and mobile terminal |
Legal Events
Code | Title | Description |
---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |
CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20191018 |