CN113951837A - Reading surface light measuring method in sleeping environment - Google Patents

Reading surface light measuring method in sleeping environment Download PDF

Info

Publication number
CN113951837A
CN113951837A CN202110899460.2A CN202110899460A CN113951837A CN 113951837 A CN113951837 A CN 113951837A CN 202110899460 A CN202110899460 A CN 202110899460A CN 113951837 A CN113951837 A CN 113951837A
Authority
CN
China
Prior art keywords
sleep
change rate
color
light
light color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202110899460.2A
Other languages
Chinese (zh)
Inventor
邹细勇
张维特
胡晓静
李晓艳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Jiliang University Shangyu Advanced Research Institute Co Ltd
Original Assignee
China Jiliang University Shangyu Advanced Research Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Jiliang University Shangyu Advanced Research Institute Co Ltd filed Critical China Jiliang University Shangyu Advanced Research Institute Co Ltd
Priority to CN202110899460.2A priority Critical patent/CN113951837A/en
Publication of CN113951837A publication Critical patent/CN113951837A/en
Withdrawn legal-status Critical Current

Links

Images

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1103Detecting eye twinkling
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/01Measuring temperature of body parts ; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/0205Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B5/02055Simultaneously evaluating both cardiovascular condition and temperature
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1116Determining posture transitions
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4806Sleep evaluation
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7264Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/50ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Physics & Mathematics (AREA)
  • Pathology (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Physiology (AREA)
  • Artificial Intelligence (AREA)
  • Cardiology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Dentistry (AREA)
  • Epidemiology (AREA)
  • Ophthalmology & Optometry (AREA)
  • Primary Health Care (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Psychiatry (AREA)
  • Signal Processing (AREA)
  • Data Mining & Analysis (AREA)
  • Pulmonology (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention discloses a reading surface light measuring method in a sleeping environment, which comprises the steps of firstly connecting a light color sensor in a light color acquisition module to a bracket near a user in a world coordinate system in a sleeping scene through a pitching plate, a rolling plate and a first connecting piece in sequence; then, in a light environment, respectively and rotatably connecting a pitching rotating shaft of the pitching plate and the rolling plate and a rolling rotating shaft of the rolling plate and the first connecting piece, recording a pitching angle alpha and a rolling angle beta corresponding to each direction, and establishing a mapping table of combining alpha and beta to each light color parameter value; in the field environment, the surface of the photochromic sensor is parallel to a reading surface in the sleep environment, and the corresponding photochromic parameter value is obtained through Euclidean distance weighted interpolation calculation in the angle combination space based on the mapping table. The invention avoids a complex light irradiation space distribution model and can flexibly obtain the light color parameters on the target reading surface.

Description

Reading surface light measuring method in sleeping environment
The application is a divisional application of patent application No. 201910335756.4, application date 2019, 24.04.9, and invention title "sleeping environment illumination condition identification method and reading surface light measurement method".
Technical Field
The invention relates to the field of intelligent illumination and sleep assistance, in particular to a reading surface light measuring method in a sleep environment.
Background
In a typical twenty-four hour biological clock cycle, the human body has different physiological characteristics in time periods, such as the maximum depth of sleep of the human body at 2 am, the cessation of melatonin secretion for 30 minutes at 7 am, the highest efficiency of cardiovascular work at 17 pm, and the onset of melatonin secretion at 21 pm.
An endocrine organ called pineal body exists in the brain of human, and one of the functions of pineal body is to secrete melatonin, which plays an extremely important role in promoting sleep. The melatonin secretion can inhibit the sympathetic nerve excitation of human body, reduce the blood pressure of human body, slow down the heartbeat, make the heart rest, and simultaneously can enhance the immunity and eliminate the fatigue. The blue light can inhibit melatonin secretion of pineal bodies, the blue light is strongest in daytime, and people are spirited and shaken; the blue light is weakest at night, melatonin is secreted by pineal bodies in the brain, and the melatonin entering blood promotes the human body to be sleepy, fall asleep and deeply sleep.
Although the influence of light on human rhythm has been studied more, there is no specific research scheme but only some general reasoning about how the stimulus response of different lights in the sleep stage of human body, especially how the human characteristics change gradually in the sleep stage. For example, in chinese patent application No. 2016107972446, a doppler device is used to detect the limb movement of a user, and a method based on group probability statistics is used to determine a possible time point when the user falls asleep.
In a dimmable environment, what transition will be shown to the user from preparing to sleep to going to sleep?
For this reason, a method for recognizing the illumination condition of the sleep environment is needed.
Disclosure of Invention
It is an object of the present invention to provide a method to detect the effect of lighting conditions on the speed of falling asleep or the efficiency of falling asleep and to provide a prediction in a field environment of what effect the lighting conditions of the field environment will have on falling asleep for the user.
For this reason, it is necessary to first perform detection and judgment of sleep onset behavior for a user, and then model a mapping relationship between different lighting conditions and factors related to sleep onset efficiency.
At night, when people are ready to rest, transitional activities such as work planning and reading before sleep are often performed, and more people can use smart phones or tablets to see relaxing contents. In the stage before falling asleep, the backlight of a lamp or equipment with low color temperature and low brightness can help the body relax until the user makes a trouble and falls asleep. But we need a model to reflect the relationship between the sleep efficiency or speed and the lighting conditions.
Since this model is a multi-input multi-output nonlinear system, it needs to rely on nonlinear system identification. In a nonlinear system, an artificial neural network is a network formed by widely interconnecting a large number of processing units, has large-scale parallel simulation processing capacity and strong self-adaption, self-organization and self-learning capacity, is generally emphasized in system modeling, identification and control, and has nonlinear transformation characteristics which provide an effective method for system identification, particularly identification of a nonlinear system. Because the falling asleep of the human body is a continuous dynamic process, there is a close correlation between human body characteristics in adjacent time periods. To this end, the present invention employs a dynamic recurrent neural network to model the system.
The method is based on a dynamic recursive Elman neural network, and models a complex nonlinear mapping relation between illumination conditions and sleep-in efficiency factors, wherein the illumination conditions comprise illumination of a reading surface, color temperature and xyz color coordinate values of colors, and the sleep-in efficiency factors are represented by 5 parameters of eye opening change rate, eye-closing duration change rate, heart rate change rate, body motion frequency change rate and body temperature change rate of a user.
The technical scheme of the invention is that the signals of several human body characteristics related to falling asleep are acquired, the trend of the signals is extracted, and accidental factors in various signals are eliminated by adopting a data fusion method, so that accurate falling asleep characteristic data is obtained. And further, repeatedly extracting sleep-falling characteristics under different illumination conditions to obtain an evaluation sample of the influence of illumination on sleep-falling. And finally, establishing a pre-judging model of the sleep characteristics of the human body in different light environments based on a nonlinear mapping theory and processing calculation.
The evaluation of the sleep onset efficiency based on the physical sign sensing data has the following problems. Firstly, in the sampled sign data, the former section may be smooth and has no significant change or the change is less than a certain range, and the latter section temporarily starts to change from a certain time point such as the feeling of human body; then, how does this point in time judge? Is the time point earlier than a valid data sample?
Second, even if the latter signs begin to change, such as the eye opening becomes smaller or the eye-closing duration increases, the amount of change or rate of change per time itself changes, such as the rate of change or first derivative of the negative exponential function gradually decreases as the independent variable increases. For this reason, it is difficult to define the sleep onset efficiency by a first derivative of the vital sign data sequence.
Based on the above two problems, the sleep onset efficiency is defined to reflect the general trend of the vital sign data sequence with uncertain turning point and inconstant change rate in the sleep onset stage, and is expressed in a quantitative form.
In order to obtain sample data required for establishing a pre-judging model of sleep characteristics of a human body in different light environments, illumination on a reading surface observed by a user in a sleep stage is detected. Therefore, the technical scheme of the invention is to provide a reading surface light measuring method in a sleep environment, which comprises the following steps:
p1, connecting a light color sensor in the light color acquisition module to a bracket near a user in a world coordinate system in a sleeping scene through a pitching plate, a rolling plate and a first connecting piece in sequence;
p2, in a light environment, respectively rotating a pitching rotating shaft connecting the pitching plate and the rolling plate and a rolling rotating shaft connecting the rolling plate and the first connecting piece to change the orientation of the surface of the photochromic sensor, sampling reflected light, then calculating photochromic parameter values such as illumination, color temperature, color xyz color coordinate values and the like of the oriented surface by a photochromic judgment module, recording a pitch angle alpha and a roll angle beta corresponding to each orientation, and establishing a mapping table of alpha and beta combination to each photochromic parameter value;
p3, in the field environment, if the combination of the pitch angle and the roll angle corresponding to the orientation of the illuminated plane is not in the mapping table, obtaining the corresponding photochromic parameter value through Euclidean distance weighted interpolation calculation in the angle combination space according to the mapping table; otherwise, if the combination is stored in the mapping table, the table is directly looked up to obtain the corresponding photochromic parameter value.
Preferably, in step P3, the following steps are performed: in a field environment, the pitching rotating shaft and the rolling rotating shaft are respectively rotated to enable the surface of the photochromic sensor to be parallel to a reading surface in a sleeping environment.
Preferably, the orientation is detected based on a depth camera on the support in a world coordinate system.
Preferably, the orientation is detected by a triaxial acceleration sensor fixed to the illuminated plane, and a signal processing module in the photochromic identification unit converts a signal obtained by the detection into a corresponding pitch angle and a corresponding roll angle.
Preferably, the distance-weighted interpolation calculation specifically includes:
for a combination P (alpha) of pitch and roll angles0,β0) First, four points around P in the angle space are found: a (alpha)1,β1),B(α2,β1),C(α1,β2) And D (alpha)2,β2) In which α is1≤α0≤α2,β1≤β0≤β2
Illumination and color temperature value (E)0,K0) The distance is used as a weighted value for interpolation,
Figure BDA0003199161560000031
Figure BDA0003199161560000032
wherein d is1Represents the shortest distance of P to four points, d2The second shortest point, and so on, dTIs the sum of all distances; e1And K1Respectively the illumination and color temperature values of the shortest distance point; and respectively adding different weights to four points closest to the P point to be searched according to different distances, wherein the four points are the shortest and the heaviest.
Preferably, the photochromic parameters include 5 photochromic parameters including illumination, color temperature and color xyz coordinate values of the reading surface light, and the method further includes the following steps:
establishing a dynamic recursive Elman neural network, taking the 5 photochromic parameters as input quantities, and taking 5 characteristic parameters of the user, namely the eye opening change rate, the eye closing duration change rate, the heart rate change rate, the body movement frequency change rate and the body temperature change rate, as output quantities;
and the neural network trained by the training sample in an off-line mode predicts the eye opening change rate, the eye closing duration change rate, the heart rate change rate, the body movement frequency change rate and the body temperature change rate of the user based on the illuminance, the color temperature and the xyz color coordinate value of the color of the current reading surface and outputs the result as the sleep efficiency factor.
Preferably, the photochromic parameters include 5 photochromic parameters including illumination, color temperature and color xyz coordinate values of the reading surface light, and the method further includes the following steps:
the change rate of physical sign parameters such as the opening value of eyes, the duration of eye closure, the heart rate, the body movement frequency, the body temperature and the like of a user is taken as a sleep efficiency factor; establishing a sleep efficiency mapping table, recording the change rate of each physical sign parameter under each light color parameter combination,
predicting sleep-in efficiency factors under specific light color parameter combinations in a field environment, searching the sleep-in efficiency mapping table according to light color parameter combination values, and when the combinations are not in the sleep-in efficiency mapping table, obtaining change rate physical sign parameter values of the corresponding sleep-in efficiency factors through distance weighted interpolation calculation based on the sleep-in efficiency mapping table, wherein the distances are Euclidean distances in a light color combination space; if the combination exists in the sleep efficiency mapping table, the table is directly looked up to obtain the corresponding change rate physical sign parameter value,
and outputting the table lookup result.
Compared with the prior art, the scheme of the invention has the following advantages: the method comprises the steps of representing an illumination condition by an xyz color coordinate value of a reading surface illumination, a color temperature and a color, representing sleep onset efficiency by using physical parameters such as user eye opening degree change rate, eye closing duration change rate, heart rate change rate, body motion frequency change rate, body temperature change rate and the like obtained through data fusion and data fitting, carrying out signal acquisition and processing on each parameter through a photochromic recognition unit and a sleep onset recognition unit, carrying out construction modeling on an influence relation between the illumination condition of the environment and user sleep onset efficiency factors in a control unit by using nonlinear mapping, and predicting the user sleep onset efficiency in different luminous environments by the trained or fitted mapping, thereby providing a basis for searching and recommending the subsequent high sleep onset efficiency luminous environments. When the invention is used for measuring the light of the reading surface in the sleeping environment, a complex light irradiation space distribution model is avoided, and the light color parameters on the target reading surface can be flexibly obtained.
Drawings
FIG. 1 is a schematic diagram of human body's biological clock rhythm;
FIG. 2 is a block diagram of a sleep environment illumination condition identification system;
FIG. 3 is a view showing a constitution of a control unit;
FIG. 4 is a structural diagram of a photochromic identification unit;
FIG. 5 is a view showing a constitution of a sleep-in recognition unit;
FIG. 6 is a structural diagram of a dimmable lamp set;
FIG. 7 is a schematic diagram of an Elman neural network structure;
FIG. 8 is a schematic view of the layout structure of the present invention;
FIG. 9 is a schematic view of the pan/tilt head rotation of the image capture module;
FIG. 10 is a rotation diagram of the light color obtaining module;
FIG. 11 is a schematic structural diagram of a light color obtaining module rotating platform;
FIG. 12 is a flowchart of the method operation of the present invention;
FIG. 13 is a graph of eye opening detection sequences;
FIG. 14 is a graph illustrating a fitting function.
Wherein: 100 a sleep environment illumination condition identification system, 110 a light color identification unit, 120 a sleep-in identification unit, 130 an identity identification unit, 140 a control unit, 150 a user interface unit, 160 a dimmable lamp set,
a 111 light color obtaining module, a 112 light color judging module, a 113 rotating platform,
121 image acquisition module, 122 wearable module, 123 sleep-falling judgment module, 1231 image processing part, 1232 heart rate calculating part, 1233 body motion frequency calculating part, 1234 body temperature calculating part, 1235 data fusion processing part,
141 an input interface module, 142 a processing module, 143Elman neural network, 144 an iterative learning module, 145 a memory, 146 a first connection array, 147 a second connection array, 148 an output module,
the number of drivers 161, LED strings 162,
101 a base, 102 a support, 103 a depth camera, 104 a pan-tilt, 105 a display bar, 106 a light color sensing block, 107 a key block, 108 a dimming panel,
1061 first connecting piece, 1062 rolling shaft, 1063 rolling plate, 1064 pitching shaft, 1065 pitching plate, 1066 light color sensor and 1067 second connecting piece.
Detailed Description
Preferred embodiments of the present invention will be described in detail below with reference to the accompanying drawings, but the present invention is not limited to only these embodiments. The invention is intended to cover alternatives, modifications, equivalents and alternatives which may be included within the spirit and scope of the invention.
In the following description of the preferred embodiments of the present invention, specific details are set forth in order to provide a thorough understanding of the present invention, and it will be apparent to those skilled in the art that the present invention may be practiced without these specific details.
The invention is described in more detail in the following paragraphs by way of example with reference to the accompanying drawings. It should be noted that the drawings are in simplified form and are not to precise scale, which is only used for convenience and clarity to assist in describing the embodiments of the present invention.
Example 1:
the human biological clock is the phenomena of physiological and biochemical processes, morphological structures, behaviors and the like which change periodically with time in a human body. The biological clocks in human body are various, and various physiological indexes of human body, such as pulse, body temperature, blood pressure, physical strength, emotion, intelligence and the like, can change periodically along with day and night changes.
As shown in fig. 1, in 2 am, the sleep of a person reaches the maximum depth, in 4 am, 30 am, the body temperature reaches the minimum, in 6 am, the blood pressure rises fastest at 45 am, in 7 am, the melatonin secretion stops at 30 am, in 8 am, 30 am, the intestinal peristalsis occurs frequently, in 9 am, the testosterone secretion reaches the maximum, in 10 am, the brain is most awake, in 14 pm, the limb movement of the person is matched to the optimum condition at 30 am, in 15 pm, the 30 am is the time when the reaction of the person is most sensitive, in 17 pm, the cardiovascular work efficiency of the human body is the maximum, in 18 pm, the blood pressure of the person reaches the peak of 1 day, in 19 pm, the body temperature reaches the peak, in 21 pm, the melatonin secretion starts at 22 pm, at 30 am, and the intestine of the person is inhibited.
According to the periodic change of physiological and biochemical activities of people, people can reasonably arrange activities in one day, so that the working efficiency and the resting efficiency are the highest, and the physical and psychological health states of people are the best. Among them, it is necessary for people to keep energy to arrange and guide sleep according to a biological clock.
When a human body falls into sleep from a waking state, the heartbeat is slowed down, the body temperature is lowered, the breathing is slowed down, muscles are relaxed, and a change process of relaxation, loss, fatigue, sleepiness and falling asleep in spirit is adapted to the muscles. Comparative studies with electroencephalography have shown that the longer the duration of eye closure, the more severe the fatigue. Therefore, the fatigue degree can be determined by measuring the opening degree of eyes and the duration of closing, thereby providing a detection means for the falling asleep process.
In the sleep stage, the human body shows a tendency change such as increased fatigue, eyelid sagging, intermittent blinking until complete eye closure, slow body movement, pulse and reduced body temperature, which can be detected by means of sensors. The detection of the eye state of the face, particularly the change of the opening degree, can be based on technologies such as machine vision, image processing and the like, the heart rate, the body movement and the body temperature can be detected by wearable modules such as a bracelet, and the detection means are applied to traffic driving or sleep monitoring.
The light has direct and important influence on the sleep of the human body, and in order to help find the illumination which is beneficial to the sleep to be more quickly, the invention detects and pre-judges the sleep efficiency characteristics of the user in different light color environments through nonlinear system modeling.
As shown in fig. 2, the system 100 for identifying the sleep environment lighting condition by using the method of the present invention includes a light color identification unit 110, a sleep-in identification unit 120, an identity identification unit 130, a control unit 140, a user interface unit 150, and a dimmable light set 160. The identification unit 130 employs a fingerprint recognizer, biometric or other feature recognizer, and the biometric features may employ iris features or facial measurement data features such as the distance between the eyes, nose and mouth of the user, etc.
As shown in fig. 2, 5 and 8, the sleep onset identifying unit 120 includes an image capturing module 121, a wearable module 122 and a sleep onset determining module 123, wherein the image capturing module 121 is supported by the pan/tilt head 104. The camera 103 of the image acquisition module, together with the cradle head 104, is fixed on a support 102 placed near the user in a sleeping scene, and the bottom of the support 102 is supported by a base 101.
The image acquisition module 121 acquires continuous images of a face and a reading object in a scene of falling asleep, the image processing unit processes the acquired images, periodically monitors the eye opening of the user, and acquires the eye opening value of the user and the change rate thereof, the eye closing duration and the change rate thereof. The image processing part also identifies the orientation of the reading object relative to the bracket in order to match the light color identification unit to identify the light color of the reading surface.
As shown in fig. 8 and 9, the image acquisition module adopts a depth camera, and images are captured by a color camera and a group of depth-of-field infrared cameras, the color camera is used for capturing images, the infrared cameras are used for generating a pixel depth matrix, and depth information of a target is generated through operation, so that human eyes at various angles are tracked and detected. In the process of tracking and detecting the human eyes, the holder supporting the camera is rotated according to the processing result of the image processing part, so that the camera is aligned to the face of the user, and imaging and processing are facilitated.
The wearable module 122 includes information acquisition modules such as a pulse sensor, an acceleration sensor, and a body temperature sensor, and signals acquired by these sensors are processed by the heart rate calculating part 1232, the body movement frequency calculating part 1233, and the body temperature calculating part 1234 in the sleep determination module 123, respectively, to obtain the heart rate, the body movement frequency, and the body temperature of the user.
Based on the image of the scene of falling asleep collected by the depth camera, the image processing part 1231 firstly performs smoothing processing and threshold segmentation, removes noise, positions the face and the eye region of the user, and extracts characteristic information such as the aspect ratio of human eyes; and secondly, performing geometric correction based on the depth information, performing three-dimensional reconstruction on the eye region, obtaining three-dimensional world coordinates of the eye region, and obtaining actual eye opening values at different angles and distances.
The eye opening value can be calculated based on the periodically acquired human eye height-width ratio, and the eye closing duration can be acquired in the process of periodically sampling images. The closed eye state is defined as the area of the eyelid covering the pupil exceeding 80%, and in the image sampling process, if the eye images acquired in two consecutive times are both in the closed state, the two acquisition time intervals are considered as the closed eye duration. An eye-open-eye-closed-eye-open sequence is continuously acquired, and the difference between the two eye-open times is the eye-closed duration.
The image-based sleep characteristic processing process comprises the following steps: after the face of the image is positioned, the left eye region and the right eye region are segmented, and the eye opening and the eye closing duration are respectively identified for two eyes.
The amplitude and frequency of the body motion are gradually reduced in the sleeping process, so that the method can be used for auxiliary detection of sleeping. The current state is characterized by the statistical period of physical activity, such as wrist activity energy and frequency, over half a minute. By adopting zero-crossing detection, if the acceleration value is compared with a reference value slightly larger than zero, the counting is carried out once every time the reference value passes. The following formula is adopted to represent the body motion frequency characteristics:
Figure BDA0003199161560000081
wherein A isiThe number of times of wrist movement in the i-th cycle, R, is obtained according to the acceleration valueiIs a time sequence coefficient, ηj(j-1, 2,3,4) is a term coefficient, QiSD is a function for solving standard deviation, and the number of the periods of measuring the cycle time and the number of the periods of 2 periods before and after the cycle time is higher than a set threshold value such as 5. In the formula, each coefficient can be a value between 0 and 1, and d can also beiAnd comparing with other physiological indexes such as myoelectricity recorded at the same time to carry out fitting calibration.
The pulse sensor is used for measuring the heart rate based on the principle that substances absorb light, and the pulse sensor irradiates blood vessels through a green light LED and is matched with a photosensitive photodiode to measure reflected light. Because the blood is red, the blood can reflect red light and absorb green light, and when the heart beats, the blood flow is increased, and the absorption amount of the green light is increased; the blood flow decreases in the beating gap of the heart, and the green light absorption decreases accordingly. Thus, heart rate can be measured from the absorbance of blood.
The pulse sensor converts the absorption of blood flow to light into a fluctuation signal, the signal is a mixture of a direct current signal and an alternating current signal, the alternating current signal reflecting the blood flow characteristics is extracted through band-pass filtering between 0.8Hz and 2.5Hz, then the maximum value point of the amplitude is extracted by adopting Fourier transform, the frequency value corresponding to the point is obtained, and the frequency value is multiplied by 60 times to obtain the actual heart rate value.
The body temperature calculating part carries out filtering processing on the signals collected by the body temperature sensor and calculates a body temperature value.
After basic data such as eye opening, eye closing duration, heart rate, body movement frequency, body temperature and the like are obtained, a data fusion processing part in the sleep judging module carries out data fusion on the physical sign parameters so as to eliminate inconsistent parts in a data set.
The data fusion adopts an evidence reasoning method and is based on the set heuristic rule. Rules include both single-factor and multi-factor categories. Take the single factor rule as an example: for the eye opening degree, if it is detected that one eye is closed while the other eye is open, the current state is determined as open. Other signs, such as occasional large amplitude increases in body temperature during the fall, occasional rather than sustained reverse increases following heart rate fall, all require evidential reasoning to exclude individual data.
In the multi-factor rule inference, the opposite change trend of the individual sign data is excluded according to the consistent change trend of most feature data. When data fitting is carried out on the eye closing duration by using a curve such as exponential distribution, for the eye closing duration, a plurality of short eye closing durations which are mixed in the data sequence are gradually increased, if other physical sign data show that sleepiness is gradually deepened, the plurality of short eye closing duration data are excluded, which may be anti-fatigue actions which are actively generated when a person consciously adjusts the state of the person during falling asleep, and are shown as blinking for a plurality of times. Similarly, if other vital sign data do not change much, i.e., they appear to be fatigued, but the length of time that the eyes are closed is much longer than normal, the data should be excluded, which may be the presence of a foreign object in the eyes. For another example, when the body tends to be calm, the acceleration sensor detects a sudden touch of the body, if the change of other sign data is not large, the touch may be caused by dozing during falling asleep, and the touch data should be deleted when the body movement frequency trend is calculated.
Based on various physical sign data sequences after data fusion processing, the sleep onset judging module expresses the data sequences by adopting a data fitting method. Fig. 13 shows a detection sequence of eye opening during reading before sleep, in which the sampling sequence of normalized eye opening de is first subjected to a pre-filtering process, and then subjected to data fusion to further remove the influence of accidental factors. In the first stage, the eye opening de does not change much, and basically changes within a range from the average value to the next value in the normal state; in the second stage, as sleepiness approaches, the eye opening gradually decreases until finally being detected as substantially closed.
As can be seen from fig. 13, the turning point of the human eye opening is hard to predict during the transition period of falling asleep, and gradually closes within a short time from the turning point; meanwhile, the gradual change duration is greatly different at different times. In order to fit the sampling sequence, the method is different from the common trend functions such as Sigmoid, tanh and the like, and the following fitting functions are designed:
y2=g2(t)=2·b/exp(4·c·(t-a))+1,
where b is a scaling factor, which can be 0.5 for normalized data, and a and c are parameters related to the sample.
Referring to fig. 14, the values of a and c for the left curve are 2 and 2, respectively, and the values of a and c for the right curve are 5 and 1, respectively, it can be seen that, by appropriately changing the values of a and c, the data sequence with various turning point positions and different changing rates and a descending trend can be fitted.
Correspondingly, the y2 function can be adopted to perform data fitting on the sign data sequence which tends to be stable after the heart rate, the body motion frequency, the body temperature and the like are reduced. For the duration of the closed eye, accordingly, another fitting function is designed:
y1=g1(t)=8·b/exp(4·c·(a-t))+1。
also, for the duration of the eye-closing, if it reaches 4 seconds, it is generally judged that the person has entered the sleep state. Thus, the duration of the eye closure is pre-processed:
y1=max(y1,4),
otherwise, the duration of the closed eye can take many values, and the sample loses the meaning of the characterization.
How to characterize the rate of change of these signs, such as eye openness, on the basis of data fitting to the sign data sequence? If only the first derivative of the fitted function is calculated at a certain point in time, the characterization significance is lost due to the different values at different points in time. Similarly, the second derivative of the fitted function may not be able to characterize the difference between the different trend curves. Therefore, the invention calculates the time difference of independent variables corresponding to the two determined dependent variables according to the fitted trend function to characterize the sign change rate. For example, for the eye opening, the rate of change k is calculatedeo
keoT2-t1, wherein t1 is g2-1(1-e-1),t2=g2-1(e-1)。
Similarly, the rate of change of other signs can be calculated. Through the data processing, various physical signs and the change rate thereof can show consistent evaluation standards; for example, a smaller defined rate of change of the physical sign indicates a shorter transition time to sleep. Meanwhile, compared with single-factor evaluation such as eye opening evaluation, the multi-factor sign evaluation can reflect the sleep-in efficiency or speed characteristics of different people, so that a foundation is provided for subsequent illumination influence modeling and illumination optimization control.
Preferably, the credibility of physical parameters such as the opening degree of eyes, the duration of eye closure, the heart rate, the body movement frequency, the body temperature and the like of the user is calculated according to a plurality of front and rear terms of the time data sequence of the user, and a Bayesian data fusion method is used for fusing a plurality of physical parameters into one output.
As shown in fig. 2, 4, 10 and 11, the light color identification unit 110 includes a light color obtaining module 111 disposed on the rotary platform 113 and a light color determining module 112 for processing and calculating the obtained light sensing signal. The rotary platform and the light color obtaining module form a light color sensing block 106, which is connected to the bracket 102.
The light color sensor 1066 in the light color obtaining module is connected to the bracket 102 sequentially through the pitching plate 1065, the rolling plate 1063 and the first connecting member 1061. The pitching plate 1065 is connected to the roll plate 1063 through a pitching rotating shaft 1064, and drives the photochromic sensor 1066 to pitch around the Y axis; the roll plate 1062 is connected to the first connecting member 1061 via a roll shaft 1062, and drives the pitching plate 1065 and the light color sensor 1066 to roll around the X axis. The roll shaft 1062 and the pitch shaft 1064 are driven to rotate by a motor, and they are powered by a first connecting piece 1061 and a second connecting piece 1067, respectively, and the control of the motor is realized by a light color judging module or a control unit located in the base. The first connecting part 1061 is a hard connection and provides an electrical connection channel in addition to supporting and fixing, and the second connecting part 1067 is a soft connection and only provides an electrical connection channel.
The photochromic sensor comprises an illuminance sensor, a color temperature sensor and a color sensor, wherein the color temperature and the color can be acquired by the same RGB or xyz color sensing module. Preferably, the color sensing module may be a TCS3430 sensor with five channels including X, Y, Z channel and two Infrared (IR) channels that can be used to infer the light source type. The TCS3430 sensor collects the light color signal of the reading surface in real time, and the xyz color coordinate value and the color temperature of the color are respectively obtained after signal processing and conversion by the processing module in the control unit.
The user may move about the table before falling asleep, such as for a work schedule or schedule for the next day or for a short reading, while the reading surface is substantially stationary and light detection may be performed in the horizontal plane. However, sometimes, the reading surface of the user is not horizontal, for example, the user leans on a couch, a sofa or a bed head to read, at this time, based on the recognition of the reading surface orientation by the image processing unit, there are two methods to detect the illumination, especially the illuminance, of the reading surface, one is to convert the illuminance detected by the photochromic sensor 1066 to the reading surface according to the spatial distribution characteristics of the light source, and the other is to convert the photochromic sensor to the orientation parallel to the reading surface by the rotating platform, so that the illuminance of the reading surface is obtained by the photochromic calculation module. The former method requires modeling of the spatial distribution of the light source, and has a small application range, and for this reason, the second method is adopted.
In a light environment, the orientation of the surface of the light color sensor is changed by respectively rotating the pitching rotating shaft and the rolling rotating shaft so that the surface is parallel to a target reading surface, light color parameter values such as illumination, color temperature, color xyz color coordinate values and the like of the orientation surface are calculated by a light color judging module after sampling incident light, a pitch angle alpha and a roll angle beta corresponding to each orientation are recorded, and a mapping table combining alpha and beta to each light color parameter value is established. The direction of the target reading surface is obtained after sampling and processing are respectively carried out through the image acquisition module and the image processing part.
In order to generalize the mapping table to any specific orientation, when the combination of the pitch angle and the roll angle of the orientation is not in the mapping table, the corresponding photochromic parameter value is obtained through distance weighted interpolation calculation in the angle combination space based on the mapping table, and the process is as follows.
For simplicity, without loss of generality, only 2 parameters of the light color parameters, i.e., the reading surface illuminance and the color temperature, are taken as examples, and more light color parameters can be processed similarly.
Mapping tables for each photochromic parameter value based on pitch angle alpha and roll angle beta combinations for a specific angle combination (alpha)0,β0) And the values of the illumination and the color temperature are obtained by interpolation in the mapping table.
First, find P (alpha) in the angle space0,β0) Four points around: a (alpha)1,β1),B(α2,β1),C(α1,β2) And D (alpha)2,β2) In which α is1≤α0≤α2,β1≤β0≤β2
Illumination and color temperature value (E)0,K0) The distance is used as a weighted value for interpolation,
Figure BDA0003199161560000121
Figure BDA0003199161560000122
wherein d is1Represents the shortest distance of P to four points, d2The point is the second shortest, and so on; e1And K1Respectively the illumination and color temperature values of the shortest distance point; and respectively adding different weights to four points closest to the P point to be searched according to different distances, wherein the four points are the shortest and the heaviest.
Preferably, in establishing the mapping of the angle combinations to the light color parameter values, the light color sensor surface is arranged as close as possible to the reading surface, so that the difference in illumination on the two planes is not small enough to affect the efficiency of falling asleep. This is easily met when the light source is at a distance from the reading surface.
As shown in fig. 2 and fig. 3, the control unit includes an input interface module 140, a processing module 142, an Elman neural network 143, an iterative learning module 144, a memory 145, a first connection array 146, a second connection array 147, and an output module 148.
The invention adopts the neural network to construct and model the mapping relation between the illumination condition of the environment and the sleep efficiency factor of the user. Specifically, an Elman neural network shown in fig. 3 is established, the network takes the illumination of the reading surface, the color temperature and the xyz color coordinate value of the color as input quantities, and takes 5 individual characteristic parameters of the opening change rate, the eye closing duration change rate, the heart rate change rate, the body movement frequency change rate and the body temperature change rate of the user eye as output quantities.
Compared with a BP (back propagation) neural network, the Elman neural network has a recursive structure and also comprises a receiving layer besides an input layer, a hidden layer and an output layer, wherein the receiving layer is used for feedback connection among layers, so that the receiving layer can express time delay and parameter time sequence characteristics between input and output, and the network has a memory function. Referring to fig. 3, the built neural network has 5 units in the input layer, 5 units in the hidden layer and the node number in the receiving layer, and 5 units in the output layer.
The neural network model is:
xck(t)=xk(t-mod(k,q)-1),
Figure BDA0003199161560000131
Figure BDA0003199161560000132
wherein mod is a remainder function, and f () is a sigmoid function; xck(t) is the carry layer output, xj(t) is the hidden layer output, ui(t-1) and yh(t) input layer input and output layer output, wj、wjkAnd wjiRespectively, the connection weight from the hidden layer to the output layer, the connection weight from the receiving layer to the hidden layer and the connection weight from the input layer to the hidden layer, thetahAnd thetajOutput layer and hidden layer thresholds, respectively; k is 1,2 … m, q is the selected regression delay scale, and is optimized according to the sampling period; j is 1,2 … m, i is 1,2 … 5, the number m of hidden layer and accepting layer nodes can be selected from 12-25; h is 1,2 … 5.
Referring to fig. 12, the method for identifying the illumination condition of the sleep environment of the present invention includes the following steps:
s1, establishing nonlinear mapping: the method comprises the steps that a dynamic recursive Elman neural network is established in a control unit by taking 5 photochromic parameters including illumination, color temperature and color xyz coordinate values of reading surface light as input quantities, and 5 characteristic parameters including a user eye opening change rate, a user eye closing duration change rate, a heart rate change rate, a body motion frequency change rate and a body temperature change rate as output quantities;
s2, obtaining a training sample set: sending a dimming signal to the dimmable lamp bank through an output module of the control unit, collecting and identifying the light color parameters such as the illuminance, the color temperature, the color and the like of the reading surface light through the light color identification unit, collecting and processing and identifying the sign parameters such as the eye opening change rate, the eye closing duration change rate, the heart rate change rate, the body movement frequency change rate, the body temperature change rate and the like of a user through the sleep-in identification unit and the control unit, recording the light color parameter values and the corresponding sign parameter values, and obtaining a training sample of the neural network,
repeatedly acquiring training samples to obtain a training sample set of the neural network;
s3, off-line training of the neural network: based on the obtained training sample set, an iterative learning module in the control unit iteratively adjusts the connection weight of the neural network by adopting a gradient descent method according to the actual value of the physical sign parameter and the network output value which are respectively input by the processing module and the neural network through the first connection array;
s4, online prediction: in the field environment, the trained neural network predicts the eye opening change rate, the eye closing duration change rate, the heart rate change rate, the body movement frequency change rate and the body temperature change rate of the user based on the illuminance, the color temperature and the xyz color coordinate value of the color of the current reading surface light acquired by the light color recognition unit and outputs the results through the output module.
In order to improve the generalization capability of the neural network, enough training samples are collected. The control unit sends out dimming signals to the lamp group through the output module or the user interface unit, obtains a training sample set of the neural network based on the light color identification unit and the sleep-in identification unit in different light environments aiming at specific users, and records output values y of all sampleshIs the actual value of (i.e. the expected value y)hd
Through the processing modules in the sleep-in recognition unit and the control unit, the 5 characteristic parameters of the neural network output quantity are obtained by processing in the following way:
based on the sleep-in recognition unit, the change process of the physical sign parameters in the sleep-in process under various illumination conditions is obtained and recorded, and for the recorded data in the physical sign parameter sequence in each sleep-in process,
the duration of the user's closed eye, y1, pre-processed,
y1=max(y1,4),
then, an off-line data fitting is performed based on the following model,
y1=g1(t)=8·b/exp(4·c·(a-t))+1,
then calculating the change rate of the duration of the eye closure,
kec=k1t2-t1, wherein t1 is g1-1(4e-1),t2=g1-1(4-4e-1);
After normalization processing is carried out on each physical sign parameter in the eye opening degree, the heart rate, the body movement frequency and the body temperature of a user, off-line data fitting is carried out on the basis of the following models respectively,
y2=g2(t)=2·b/exp(4·c·(t-a))+1,
then the respective rates of change are calculated,
kit2-t1, wherein t1 is g2-1(1-e-1),t2=g2-1(e-1),i=2,3,4,5;
Wherein y1 and y2 are values obtained after sign parameter preprocessing or normalization, t is time, a, b and c are fitting coefficients, and k isi(i-2, 3,4,5) corresponds to the eye opening change rate keoHeart rate change rate khRate of change of body motion frequency kbBody temperature change rate kp
In the process of sampling the sleep onset process, when the change rates of a plurality of physical sign parameters are detected to be smaller than a set threshold value in a plurality of continuous periods, the user is considered to be asleep, and the sleep onset sampling is stopped.
The neural network training adopts a gradient descent method, and the weight and threshold value adjusting method in the training is as follows.
Assuming a total of P training samples, let the error function be:
Figure BDA0003199161560000151
then the adjustment of the weight from the hidden layer to the output layer is shown as follows:
whj(t+1)=whj(t)+Δwhj(t+1),
wherein the content of the first and second substances,
Figure BDA0003199161560000152
δyh=-(yhd-yh)·yh·(1-yh),
the adjustment formula of the output layer threshold is as follows:
θ(t+1)=θ(t)+Δθ(t+1),
wherein the content of the first and second substances,
Figure BDA0003199161560000153
similarly, the input layer to hidden layer connection weights, hidden layer thresholds, and the accept layer to hidden layer connection weights are adjusted.
The initial value range of each weight is an interval of (-0.1, 0.1), the learning rate eta is a decimal less than 1, and the learning rate eta can be dynamically adjusted by adopting a fixed rate or according to the total error of the current network output. The training end condition may be set to a total error or a variation thereof smaller than a set value or a number of times of training up to a certain amount.
Before network training, normalization preprocessing can be performed on input quantity and output quantity:
r'=r-rmin/rmax-rmin
wherein r is an unprocessed physical quantity, r' is a normalized physical quantity, rmaxAnd rminRespectively the maximum and minimum values of the sample data set.
When calculating the predicted value, the network output quantity is converted back to the output quantity value by the following formula:
r=rmin+r'·(rmax-rmin)。
when the online prediction is applied, the first connection array is disconnected, the neural network predicts each output quantity and outputs the output quantity to the processing module through the second connection array, and the output quantity is displayed and output through the output module and is sent to the outside in a signal form after the output quantity is processed and analyzed by the processing module.
As shown in fig. 1 and fig. 6, in an environment where the method is tested or used, preferably, the dimmable light bank 160 is a dimmable LED light bank, the driver adjusts the driving current value of each LED string 162 in the light bank, the driver 161 is a driver capable of changing the output current, and the driver performs light emission adjustment by changing the PWM duty cycle of the driving current of each channel of the LED string. By changing the driving current, the dimmable light set 160 can adjust at least one of the light properties such as brightness, color temperature, color, and illumination angle.
Preferably, the LED string is a dimming lamp including RGB three-primary-color current channels, and at this time, the light color of the lamp can be changed by changing the driving current value of one of the channels. When the three channel currents are increased or decreased in synchronization from a certain state, the lamp exhibits no change in color but a brightness that gradually increases or decreases.
Preferably, the control unit changes the light emission of the LED lamp set in a stepwise manner within a known dimming range of the LED lamp set through the output module. For example, a variable mapping table is established by combining the values of the channel currents of the LED strings with the corresponding illuminance, color temperature and color collected on the reading surface, only one variable, such as the illuminance, is changed and the other variables, such as the color temperature and the color, are kept unchanged in the value interval of the illumination vector space composed of the illuminance, the color temperature and the color, the mapping table is reversely searched to find the current value of each channel of the LED strings corresponding to the current illumination vector, and the control unit sends the PWM wave duty ratio of each channel current to the driver in the form of a signal through the output module. The control unit obtains enough training samples of the neural network after multiple sleep-in detections by continuously changing working points of an illumination vector space, wherein sampling points can be sparse in end value areas of various light color variables, and the sampling points are denser in low color temperature areas such as areas near color temperature 3000k and illumination of 100 lx-300 lx. The collected sample is stored in a memory.
The parameters such as preset values required for processing by the control unit are input through keys in the user interface unit. The trained neural network can predict and judge the sleep efficiency of the user under the current illumination condition in a new light environment based on the generalization capability of the neural network, and display or output the predicted result through an output module.
Specifically, as shown in fig. 1 and 8, on the base 101, the keys of the user interface unit are disposed in the area of the key block 107, and on the other side opposite to the key block, the user interface unit may further be disposed with a dimming panel 108 for manually adjusting the light emission of the light set.
Preferably, the output module 148 includes a display bar 105 for indicating the values of the factors of the sleep efficiency of the current user in turn. Preferably, the output module further comprises a communication interface, and the detected or predicted values of the factors of the sleep onset efficiency are output to the outside through the interface module.
Since the drowsiness or fatigue level in preparation for falling asleep varies, it is preferable that a key indicating the current fatigue level is provided in the user interface unit while the neural network increases an input amount of a fatigue index, which may be an integer between 1 and 5.
When the user has difficulty falling asleep due to emotions and the like, the collected samples have larger deviation from the samples under normal conditions, and although the neural network has better fault tolerance, the accuracy of the network is affected when too many samples are available. For this purpose, a cancel sampling key is preferably provided in the user interface unit, and the control unit suspends data sampling and sample recording after detecting that this key is pressed.
To increase the applicability of the network, the control unit may preferably further include a real-time clock module, and the neural network module may further include a seasonal parameter obtained from the real-time clock module as an input.
Preferably, the neural network module may further add a time period parameter obtained from the real-time clock module as an input, the time period being noon or night respectively.
Preferably, the control unit can be additionally provided with a temperature and humidity measurement module, and the neural network module is used for adding two parameters of temperature and humidity acquired from the temperature and humidity measurement module as input.
Preferably, the control unit may further include a noise measurement module, and the neural network module adds a noise level parameter obtained from the noise measurement module as an input.
Example 2:
in this embodiment, referring to fig. 10 and fig. 11, a reading surface light measuring method in a sleep environment is provided, which includes the following steps:
p1, connecting a light color sensor in the light color acquisition module to a bracket near a user in a world coordinate system in a sleeping scene through a pitching plate, a rolling plate and a first connecting piece in sequence;
p2, in a light environment, respectively rotating a pitching rotating shaft connecting the pitching plate and the rolling plate and a rolling rotating shaft connecting the rolling plate and the first connecting piece to change the orientation of the surface of the light color sensor, so that the surface is parallel to a target reading surface, sampling reflected light, calculating light color parameter values such as illumination, color temperature, color xyz color coordinate values and the like of the oriented surface by a light color judging module, recording a pitch angle alpha and a roll angle beta corresponding to each orientation, and establishing a mapping table combining alpha and beta to each light color parameter value;
p3, in the field environment, if the combination of the pitch angle and the roll angle corresponding to the orientation of the illuminated plane is not in the mapping table, obtaining the corresponding photochromic parameter value through Euclidean distance weighted interpolation calculation in the angle combination space according to the mapping table; otherwise, if the combination is stored in the mapping table, the table is directly looked up to obtain the corresponding photochromic parameter value.
In order to identify the orientation of the illuminated plane, the orientation can be detected based on a depth camera on a bracket in a world coordinate system, or detected through a triaxial acceleration sensor fixed on the illuminated plane, and a signal obtained by detection is converted into a corresponding pitch angle and a corresponding roll angle by a signal processing module in the photochromic identification unit.
The interpolation calculation process is as follows:
for simplicity, without loss of generality, only 2 parameters of the light color parameters, i.e., the reading surface illuminance and the color temperature, are taken as examples, and more light color parameters can be processed similarly.
Based onMapping tables of combinations of pitch angle alpha and roll angle beta to parameter values of each light color for a specific angle combination (alpha)0,β0) And the values of the illumination and the color temperature are obtained by interpolation in the mapping table.
First, find P (alpha) in the angle space0,β0) Four points around: a (alpha)1,β1),B(α2,β1),C(α1,β2) And D (alpha)2,β2) In which α is1≤α0≤α2,β1≤β0≤β2
Illumination and color temperature value (E)0,K0) The distance is used as a weighted value for interpolation,
Figure BDA0003199161560000181
Figure BDA0003199161560000182
wherein d is1Represents the shortest distance of P to four points, d2The point is the second shortest, and so on; e1And K1Respectively the illumination and color temperature values of the shortest distance point; and respectively adding different weights to four points closest to the P point to be searched according to different distances, wherein the four points are the shortest and the heaviest.
Example 3:
in contrast to embodiment 1, in the present embodiment, the sleep onset duration is introduced into the input parameters of the non-linear mapping. The implementation provides a sleep environment illumination condition identification method, which comprises the following steps:
s1, establishing nonlinear mapping: the method comprises the steps that 6 parameters including illumination, color temperature, color xyz color coordinate values and sleep duration of reading surface light are used as input quantities, 5 characteristic parameters including user eye opening change rate, eye closing duration change rate, heart rate change rate, body motion frequency change rate and body temperature change rate are used as output quantities, and a dynamic recursive Elman neural network is established in a control unit;
s2, obtaining a training sample set: sending a dimming signal to the dimmable lamp bank through an output module of the control unit, collecting and identifying the light color parameters such as the illuminance, the color temperature, the color and the like of the reading surface light through the light color identification unit, collecting and processing and identifying the sign parameters such as the eye opening change rate, the eye closing duration change rate, the heart rate change rate, the body movement frequency change rate, the body temperature change rate and the like of a user through the sleep-in identification unit and the control unit, recording the light color parameter values and the corresponding sign parameter values, and obtaining a training sample of the neural network,
repeatedly acquiring training samples to obtain a training sample set of the neural network;
wherein, the parameters of each training sample are obtained according to the following processing procedures:
continuously detecting the eye opening of the user, when the eye opening value is continuously smaller than (1-delta%) times of the eye opening value in the initial stage of falling asleep within a set time length, taking the current time as the timing zero point of the falling asleep duration, and simultaneously discarding the sample record before the zero point time, wherein delta can be an integer between 5 and 10,
the user eye opening degree change rate keoRate of change k of duration of eye closureecHeart rate change rate khRate of change of body motion frequency kbBody temperature change rate kpThese 5 individual feature parameters are all calculated by a moving average filter, such as for the eye opening change rate,
keo|t=u=ave(dEOu-2,dEOu-1,dEOu,dEOu+1,dEOu+2),
where ave is the mean function, dEOuThe difference between the eye opening value at the time u and the eye opening value at the last time;
s3, off-line training of the neural network: based on the obtained training sample set, an iterative learning module in the control unit iteratively adjusts the connection weight of the neural network by adopting a gradient descent method according to the actual value of the physical sign parameter and the network output value which are respectively input by the processing module and the neural network through the first connection array;
s4, online prediction: in a field environment, the trained neural network predicts the eye opening change rate, the eye closing duration change rate, the heart rate change rate, the body movement frequency change rate and the body temperature change rate of a user based on the illuminance, the color temperature, the xyz color coordinate value of the color and the sleep duration of the current reading surface light acquired by the light color identification unit and outputs the results through the output module.
As shown in fig. 13, since the turning point of falling asleep of the user cannot be predicted, in the present embodiment, by continuously monitoring the eye opening, when it is significantly deviated from the normal range, the data sequence after sampling and recording is started.
Compared with embodiment 1, since the time length from the turning point of falling asleep is introduced into the input of the neural network, the sign parameters at a certain time point after can be predicted by the trained neural network.
Preferably, the sleep progress process can be characterized by an exponential distribution function, physical sign parameters such as the eye opening degree, the eye closing duration, the heart rate, the body movement frequency and the body temperature of the user are taken as fitting sample data in a form of normalization and instantaneous data weighted average, and all the physical sign parameters are fused into a function.
Example 4:
different from embodiment 1, in this embodiment, the control unit replaces the neural network with the sleep onset efficiency mapping table to implement mapping from the light color condition to each change rate physical sign parameter of the sleep onset efficiency.
In this embodiment, a method for identifying a sleep environment lighting condition is provided, which includes the following steps:
s1, establishing a data sample structure:
the illumination condition is represented by 2 light color parameters including the illumination of the reading surface and the color temperature, the change rate of 5 individual characteristic parameters such as the opening value of the eyes, the duration of the closed eyes, the heart rate, the body movement frequency, the body temperature and the like of a user is taken as the sleep efficiency factor,
establishing an empty sleeping efficiency mapping table, which takes the combination of the photochromic parameters as a row index and takes the 5 change rate physical sign parameters of the sleeping efficiency factors as column titles, namely fields;
s2, acquiring a fitting data sample set in the falling asleep process: sending a dimming signal to the dimmable lamp bank through an output module of the control unit, acquiring change process records of physical sign parameters in the sleeping process under various illumination conditions based on the light color identification unit and the sleeping identification unit in different light environments for a specific user in the sleeping process, and performing the following processing for the characteristic parameter change process records corresponding to each light color combination condition formed by illumination and color temperature:
the duration of the user's closed eye, y1, pre-processed,
y1=max(y1,4),
then, an off-line data fitting is performed based on the following model,
y1=g1(t)=8·b/exp(4·c·(a-t))+1,
then calculating the change rate of the duration of the eye closure,
kec=k1t2-t1, wherein t1 is g1-1(4e-1),t2=g1-1(4-4e-1),
After normalization processing is carried out on each physical sign parameter in the eye opening degree, the heart rate, the body movement frequency and the body temperature of a user, off-line data fitting is carried out on the basis of the following models respectively,
y2=g2(t)=2·b/exp(4·c·(t-a))+1,
then the respective rates of change are calculated,
kit2-t1, wherein t1 is g2-1(1-e-1),t2=g2-1(e-1),i=2,3,4,5,
Wherein y1 and y2 are values obtained after sign parameter preprocessing or normalization, t is time, a, b and c are fitting coefficients, and k isi(i-2, 3,4,5) corresponds to the eye opening change rate keoHeart rate change rate khRate of change of body motion frequency kbBody temperature change rate kp
Recording each change rate physical sign parameter under each light color combination in a sleep efficiency mapping table;
s3, repeating the step S2 to obtain a fitting data sample set;
s4, online prediction: predicting sleep-in efficiency factors under specific light color combination in a field environment, searching the sleep-in efficiency mapping table according to the light color combination value, and when the combination is not in the sleep-in efficiency mapping table, obtaining each change rate sign parameter value of the corresponding sleep-in efficiency factor through distance weighted interpolation calculation based on the sleep-in efficiency mapping table, wherein the distance is the Euclidean distance in a light color combination space; if the combination exists in the sleep efficiency mapping table, the table is directly looked up to obtain the corresponding change rate physical sign parameter value,
and outputting the table look-up result through an output module.
It can be understood that, in the solution of the present invention, the illumination in the color parameters of the reading surface is for the reading object without an active light source, and for the reading object with a backlight source, such as a mobile phone, a tablet, an electronic book, etc., in the mapping from the color condition to the sleep-enabling efficiency related factor, a backlight source brightness item is supplemented to the parameter set of the color condition.
In addition, all models related to the sleep efficiency factor are based on specific individuals, so that related data in the process of generating network training samples, mapping tables and the like are based on users with the same identity; for multiple users, one data set should be created and saved for each user independently.
The invention is applied to detect and prejudge various factors of sleep-in efficiency under different light environments, and after samples with abundant changes are collected, due to infinite combinations in light color change domains, sleep-in efficiency parameters including eye opening change rate, heart rate change rate and the like under illumination conditions in various field environments can be predicted by the invention, thereby providing a basis for searching potential high sleep-in efficiency light environments.
While the embodiments of the present invention have been described above, these embodiments are presented as examples and do not limit the scope of the invention. These embodiments may be implemented in other various ways, and various omissions, substitutions, and changes may be made without departing from the spirit of the invention. These embodiments and modifications are included in the scope and gist of the invention, and are also included in the invention described in the claims and the equivalent scope thereof.

Claims (7)

1. A reading surface light measuring method in a sleep environment comprises the following steps:
p1, connecting a light color sensor in the light color acquisition module to a bracket near a user in a world coordinate system in a sleeping scene through a pitching plate, a rolling plate and a first connecting piece in sequence;
p2, in a light environment, respectively rotating a pitching rotating shaft connecting the pitching plate and the rolling plate and a rolling rotating shaft connecting the rolling plate and the first connecting piece to change the orientation of the surface of the photochromic sensor, sampling reflected light, then calculating photochromic parameter values such as illumination, color temperature, color xyz color coordinate values and the like of the oriented surface by a photochromic judgment module, recording a pitch angle alpha and a roll angle beta corresponding to each orientation, and establishing a mapping table of alpha and beta combination to each photochromic parameter value;
p3, in the field environment, if the combination of the pitch angle and the roll angle corresponding to the orientation of the illuminated plane is not in the mapping table, obtaining the corresponding photochromic parameter value through Euclidean distance weighted interpolation calculation in the angle combination space according to the mapping table; otherwise, if the combination is stored in the mapping table, the table is directly looked up to obtain the corresponding photochromic parameter value.
2. The reading surface photometry method in a sleep environment as claimed in claim 1, wherein in the step P3, the following steps are performed: in a field environment, the pitching rotating shaft and the rolling rotating shaft are respectively rotated to enable the surface of the photochromic sensor to be parallel to a reading surface in a sleeping environment.
3. The reading surface photometry method in a sleep environment as claimed in claim 2, wherein the orientation is detected based on a depth camera on a support in a world coordinate system.
4. The method as claimed in claim 2, wherein the orientation is detected by a three-axis accelerometer fixed on the illuminated plane, and the signal obtained by the detection is converted into corresponding pitch angle and roll angle by a signal processing module in the light color identification unit.
5. The reading surface photometry method in the sleep environment as claimed in claim 1, wherein the distance weighted interpolation calculation specifically comprises:
for a combination P (alpha) of pitch and roll angles0,β0) First, four points around P in the angle space are found: a (alpha)1,β1),B(α2,β1),C(α1,β2) And D (alpha)2,β2) In which α is1≤α0≤α2,β1≤β0≤β2
Illumination and color temperature value (E)0,K0) The distance is used as a weighted value for interpolation,
Figure FDA0003199161550000011
Figure FDA0003199161550000012
wherein d is1Represents the shortest distance of P to four points, d2The second shortest point, and so on, dTIs the sum of all distances; e1And K1Respectively the illumination and color temperature values of the shortest distance point; and respectively adding different weights to four points closest to the P point to be searched according to different distances, wherein the four points are the shortest and the heaviest.
6. The method as claimed in claim 1, wherein the light color parameters include 5 light color parameters including illumination, color temperature, and xyz color coordinate values of the reading surface light, and the method further includes:
establishing a dynamic recursive Elman neural network, taking the 5 photochromic parameters as input quantities, and taking 5 characteristic parameters of the user, namely the eye opening change rate, the eye closing duration change rate, the heart rate change rate, the body movement frequency change rate and the body temperature change rate, as output quantities;
and the neural network trained by the training sample in an off-line mode predicts the eye opening change rate, the eye closing duration change rate, the heart rate change rate, the body movement frequency change rate and the body temperature change rate of the user based on the illuminance, the color temperature and the xyz color coordinate value of the color of the current reading surface and outputs the result as the sleep efficiency factor.
7. The method as claimed in claim 1, wherein the light color parameters include 5 light color parameters including illumination, color temperature, and xyz color coordinate values of the reading surface light, and the method further includes:
the change rate of physical sign parameters such as the opening value of eyes, the duration of eye closure, the heart rate, the body movement frequency, the body temperature and the like of a user is taken as a sleep efficiency factor; establishing a sleep efficiency mapping table, recording the change rate of each physical sign parameter under each light color parameter combination,
predicting sleep-in efficiency factors under specific light color parameter combinations in a field environment, searching the sleep-in efficiency mapping table according to light color parameter combination values, and when the combinations are not in the sleep-in efficiency mapping table, obtaining change rate physical sign parameter values of the corresponding sleep-in efficiency factors through distance weighted interpolation calculation based on the sleep-in efficiency mapping table, wherein the distances are Euclidean distances in a light color combination space; if the combination exists in the sleep efficiency mapping table, the table is directly looked up to obtain the corresponding change rate physical sign parameter value,
and outputting the table lookup result.
CN202110899460.2A 2019-04-24 2019-04-24 Reading surface light measuring method in sleeping environment Withdrawn CN113951837A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110899460.2A CN113951837A (en) 2019-04-24 2019-04-24 Reading surface light measuring method in sleeping environment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110899460.2A CN113951837A (en) 2019-04-24 2019-04-24 Reading surface light measuring method in sleeping environment
CN201910336196.4A CN109998497B (en) 2019-04-24 2019-04-24 Sleep-in detection and judgment system in luminous environment

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201910336196.4A Division CN109998497B (en) 2019-04-24 2019-04-24 Sleep-in detection and judgment system in luminous environment

Publications (1)

Publication Number Publication Date
CN113951837A true CN113951837A (en) 2022-01-21

Family

ID=67174074

Family Applications (5)

Application Number Title Priority Date Filing Date
CN202110829176.8A Withdrawn CN113558581A (en) 2019-04-24 2019-04-24 Illuminance detection device for illuminated plane in luminous environment and use method of illuminance detection device in sleep-in detection and judgment system
CN202110899460.2A Withdrawn CN113951837A (en) 2019-04-24 2019-04-24 Reading surface light measuring method in sleeping environment
CN202110829174.9A Withdrawn CN113545758A (en) 2019-04-24 2019-04-24 System for detecting and judging sleep in luminous environment
CN202110834336.8A Withdrawn CN113576426A (en) 2019-04-24 2019-04-24 Sleep-in detection and judgment system in luminous environment
CN201910336196.4A Active CN109998497B (en) 2019-04-24 2019-04-24 Sleep-in detection and judgment system in luminous environment

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202110829176.8A Withdrawn CN113558581A (en) 2019-04-24 2019-04-24 Illuminance detection device for illuminated plane in luminous environment and use method of illuminance detection device in sleep-in detection and judgment system

Family Applications After (3)

Application Number Title Priority Date Filing Date
CN202110829174.9A Withdrawn CN113545758A (en) 2019-04-24 2019-04-24 System for detecting and judging sleep in luminous environment
CN202110834336.8A Withdrawn CN113576426A (en) 2019-04-24 2019-04-24 Sleep-in detection and judgment system in luminous environment
CN201910336196.4A Active CN109998497B (en) 2019-04-24 2019-04-24 Sleep-in detection and judgment system in luminous environment

Country Status (1)

Country Link
CN (5) CN113558581A (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11141103B2 (en) * 2019-07-25 2021-10-12 Mediatek Inc. Vital-sign detection system and control method therefor
CN111741093B (en) * 2020-06-12 2023-06-30 喻军 Screen-based data transmitting method, receiving device, setting system and readable storage medium
CN115062764B (en) * 2022-06-17 2023-07-11 淮阴工学院 Intelligent illuminance adjustment and environmental parameter Internet of things big data system
CN116959214B (en) * 2023-07-18 2024-04-02 北京至真互联网技术有限公司 Method and system for reminding user of eye protection through intelligent glasses

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6995355B2 (en) * 2003-06-23 2006-02-07 Advanced Optical Technologies, Llc Optical integrating chamber lighting using multiple color sources
KR100646868B1 (en) * 2004-12-29 2006-11-23 삼성전자주식회사 Home control system and method using information of galvanic skin response and heart rate
US10610153B2 (en) * 2014-07-21 2020-04-07 Withings System and method to monitor and assist individual's sleep
JP6909018B2 (en) * 2017-03-01 2021-07-28 任天堂株式会社 Light emission control device and electronic equipment
CN107601083B (en) * 2017-09-19 2019-04-02 中国计量大学 Straight weight-loss type material baiting method neural network based
CN108712809B (en) * 2018-05-18 2019-12-03 浙江工业大学 A kind of luminous environment intelligent control method neural network based
CN108958047A (en) * 2018-07-09 2018-12-07 西安交通大学 A kind of intelligent sleep system and its working method
CN109106349A (en) * 2018-08-09 2019-01-01 上海常仁信息科技有限公司 A kind of domestic consumer's sleep monitor system
CN109199336A (en) * 2018-09-30 2019-01-15 深圳个人数据管理服务有限公司 A kind of sleep quality quantization method, device and equipment based on machine learning

Also Published As

Publication number Publication date
CN109998497B (en) 2021-08-06
CN109998497A (en) 2019-07-12
CN113576426A (en) 2021-11-02
CN113545758A (en) 2021-10-26
CN113558581A (en) 2021-10-29

Similar Documents

Publication Publication Date Title
CN110163371B (en) Dimming optimization method for sleep environment
CN110013231B (en) Sleep environment illumination condition identification method
CN110113843B (en) Lighting control system based on sleep efficiency factor
CN109998497B (en) Sleep-in detection and judgment system in luminous environment
EP3773155B1 (en) System and method for non-invasive determination of blood pressure dip based on trained prediction models
US20210345888A1 (en) Detecting alcohol intoxication from video images
WO2019079503A2 (en) Applied data quality metrics for physiological measurements
CA2873193A1 (en) System and method for monitoring the health of a user
KR102416878B1 (en) Healthcare apparatus for measuring heart rate
Yu et al. Video-based heart rate measurement using short-time Fourier transform
WO2020140913A1 (en) Data processing method and apparatus, electronic device and storage medium
Wang et al. Maximum weight multi-modal information fusion algorithm of electroencephalographs and face images for emotion recognition
CN109313729A (en) The control of sensor privacy settings
CN112232256A (en) Non-contact motion and body measurement data acquisition system
EP4169042A1 (en) Pulse shape analysis
KR102435808B1 (en) Healthcare apparatus for measuring stress score
CN108720825A (en) A kind of seamless detection method of the non-contact vital sign parameter based on multi-cam
Liu et al. Adaptive-weight network for imaging photoplethysmography signal extraction and heart rate estimation
US20230218240A1 (en) Healthcare apparatus for heart rate measurement
US20230210423A1 (en) Healthcare apparatus for calculating stress index
Cuppens Detection of epileptic seizures based on video and accelerometer recordings
WO2020132941A1 (en) Identification method and related device
Paracchini Remote biometric signal processing based on deep learning using SPAD cameras
Daqrouq et al. HEART RATE MEASUREMENT USING IMAGE RECOGNITION TECHNOLOGY.
Pedram Designing Resource Efficient Embedded Systems for Health Monitoring

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20220121

WW01 Invention patent application withdrawn after publication