CN113598722A - Sleep environment illumination condition identification method - Google Patents


Info

Publication number
CN113598722A
CN113598722A
Authority
CN
China
Prior art keywords
change rate
color
sleep
module
neural network
Prior art date
Legal status
Withdrawn
Application number
CN202110899472.5A
Other languages
Chinese (zh)
Inventor
邹细勇
张维特
李子印
胡晓静
李晓艳
Current Assignee
China Jiliang University Shangyu Advanced Research Institute Co Ltd
Original Assignee
China Jiliang University Shangyu Advanced Research Institute Co Ltd
Priority date
Filing date
Publication date
Application filed by China Jiliang University Shangyu Advanced Research Institute Co Ltd filed Critical China Jiliang University Shangyu Advanced Research Institute Co Ltd
Priority to CN202110899472.5A
Publication of CN113598722A

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0059 Measuring for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B 5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B 5/01 Measuring temperature of body parts; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
    • A61B 5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B 5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B 5/02055 Simultaneously evaluating both cardiovascular condition and temperature
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1103 Detecting eye twinkling
    • A61B 5/1116 Determining posture transitions
    • A61B 5/48 Other medical applications
    • A61B 5/4806 Sleep evaluation
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235 Details of waveform analysis
    • A61B 5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B 5/7267 Classification of physiological signals or data involving training the classification device
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/50 ICT specially adapted for medical diagnosis, medical simulation or medical data mining for simulation or modelling of medical disorders

Abstract

The invention discloses a method for identifying the illumination condition of a sleeping environment. First, the illuminance, color temperature and xyz color coordinates of the reading surface, together with the elapsed sleep-onset duration, are taken as input quantities, and the user's sign parameters obtained after data fusion and fitting, namely the eye-opening change rate, eye-closure duration change rate, heart rate change rate, body movement frequency change rate and body temperature change rate, are taken as output quantities, and a dynamic recurrent Elman neural network is established to represent the mapping between the ambient lighting condition and the user's sleep-onset efficiency. The currents of the lamp set are then varied, samples are collected after each change of the light color combination, and the neural network is trained. In practical application, the trained network predicts all the parameters related to the user's sleep-onset efficiency under the current lighting condition. Because the elapsed time from the sleep-onset turning point is included in the network input, the sign parameters at any time point after falling asleep begins can be predicted, providing a basis for searching for and recommending light environments with potentially high sleep-onset efficiency.

Description

Sleep environment illumination condition identification method
The application is a divisional application of patent application No. 201910335756.4, filed on April 24, 2019, entitled "Sleep environment illumination condition identification method and reading surface light measurement method".
Technical Field
The invention relates to the field of intelligent lighting and sleep assistance, in particular to a sleep environment illumination condition identification method.
Background
In a typical twenty-four-hour biological clock cycle, the human body shows different physiological characteristics in different periods: for example, sleep is deepest at 2:00 am, melatonin secretion stops at about 7:30 am, cardiovascular efficiency peaks at 5:00 pm, and melatonin secretion begins at about 9:00 pm.
The human brain contains an endocrine organ called the pineal body, one of whose functions is to secrete melatonin, which plays an extremely important role in promoting sleep. Melatonin secretion inhibits sympathetic excitation, lowers blood pressure and slows the heartbeat, letting the heart rest, while also enhancing immunity and relieving fatigue. Blue light suppresses the pineal body's melatonin secretion: in the daytime blue light is strongest and people feel alert and energetic; at night blue light is weakest, the pineal body secretes melatonin, and the melatonin entering the blood makes the body drowsy, fall asleep and sleep deeply.
Although the influence of light on human circadian rhythm has been studied extensively, there is still no concrete research scheme, only general reasoning, on how the human body responds to the stimulus of different light during the sleep-onset stage, and in particular on how human sign characteristics change gradually during that stage. For example, Chinese patent application No. 2016107972446 uses a Doppler device to detect the user's limb movement and determines a possible time point of falling asleep with a method based on group probability statistics.
In a dimmable environment, what transition does a user exhibit from preparing to sleep to actually falling asleep?
For this reason, a method for identifying the illumination condition of the sleep environment is needed.
Disclosure of Invention
It is an object of the present invention to provide a method for detecting the effect of lighting conditions on the speed or efficiency of falling asleep, and for predicting, in a field environment, what effect the on-site lighting conditions will have on the user's falling asleep.
To this end, it is necessary first to detect and judge the user's sleep-onset behavior, and then to model the mapping between different lighting conditions and the factors related to sleep-onset efficiency.
At night, when preparing to rest, people often engage in transitional activities such as planning work or reading before sleep, and more and more use a smartphone or tablet to view relaxing content. In this pre-sleep stage, a lamp, or a device backlight, of low color temperature and low brightness helps the body relax until the user becomes drowsy and falls asleep. But a model is needed to reflect the relationship between sleep-onset efficiency or speed and the lighting conditions.
Since this model is a multi-input, multi-output nonlinear system, it must rely on nonlinear system identification. Among nonlinear modeling tools, an artificial neural network is a network formed by the wide interconnection of a large number of processing units; it has large-scale parallel processing capacity and strong self-adaptive, self-organizing and self-learning abilities, is widely valued in system modeling, identification and control, and its nonlinear transformation characteristics provide an effective method for system identification, particularly of nonlinear systems. Because human sleep onset is a continuous dynamic process, the human sign characteristics of adjacent periods are closely correlated. The invention therefore adopts a dynamic recurrent neural network to model the system.
The method models, based on a dynamic recurrent Elman neural network, the complex nonlinear mapping between the lighting conditions and the sleep-onset efficiency factors, where the lighting conditions comprise the illuminance, the color temperature and the xyz color coordinates of the reading surface, and the sleep-onset efficiency is represented by 5 parameters: the user's eye-opening change rate, eye-closure duration change rate, heart rate change rate, body movement frequency change rate and body temperature change rate.
The technical scheme of the invention is to acquire the signals of several human sign characteristics related to falling asleep, extract their trends, and eliminate accidental factors in the various signals by data fusion, so as to obtain accurate sleep-onset characteristic data. The sleep-onset characteristics are then extracted repeatedly under different lighting conditions to obtain evaluation samples of the influence of lighting on falling asleep. Finally, a prediction model of the human sleep-onset characteristics under different light environments is established based on nonlinear mapping theory and computation.
Evaluating sleep-onset efficiency from sign sensing data raises the following problems. First, in the sampled sign data the earlier section may be flat, showing no significant change or changes within a certain range, while the later section begins to change from some time point, for example as drowsiness sets in; how is that time point determined, and are the data sampled before it valid?
Second, even once the signs begin to change, for example the eye opening becoming smaller or the eye-closure duration increasing, the change per unit time itself varies, just as the first derivative of a negative exponential function decreases gradually as the independent variable increases. It is therefore difficult to define sleep-onset efficiency by the first derivative of the sign data sequence.
Given these two problems, sleep-onset efficiency is defined so as to reflect, in quantitative form, the general trend of a sign data sequence whose turning point is uncertain and whose change rate is not constant during the sleep-onset stage.
Specifically, the invention provides a method for identifying the illumination condition of a sleep environment, which comprises the following steps:
S1, establishing the nonlinear mapping: taking 6 parameters, the illuminance, color temperature and xyz color coordinates of the reading surface light plus the sleep-onset duration, as input quantities, and 5 sign parameters, the user's eye-opening change rate, eye-closure duration change rate, heart rate change rate, body movement frequency change rate and body temperature change rate, as output quantities, a dynamic recurrent Elman neural network is established in the control unit;
S2, obtaining a training sample set: a dimming signal is sent to the dimmable lamp set through the output module of the control unit; the light color parameters of the reading surface light, such as illuminance, color temperature and color, are collected and identified by the light color identification unit; the user's sign parameters, such as the eye-opening change rate, eye-closure duration change rate, heart rate change rate, body movement frequency change rate and body temperature change rate, are collected, processed and identified by the sleep-in identification unit and the control unit; and the light color parameter values with the corresponding sign parameter values are recorded, giving one training sample of the neural network,
repeatedly acquiring training samples to obtain a training sample set of the neural network;
wherein, the parameters of each training sample are obtained according to the following processing procedures:
the eye opening of the user is detected continuously; when the eye-opening value stays below (1 - δ%) times its value at the initial falling-asleep stage for a set duration, the current time is taken as the timing zero of the sleep-onset duration, and the sample records before this zero time are discarded, where δ can be an integer between 5 and 10;
the user's eye-opening change rate k_eo, eye-closure duration change rate k_ec, heart rate change rate k_h, body movement frequency change rate k_b and body temperature change rate k_p, the 5 sign parameters, are all smoothed with a moving-average filter (a sketch of this filtering follows step S4 below); for example, for the eye-opening change rate,
k_eo|_(t=u) = ave(dEO_(u-2), dEO_(u-1), dEO_u, dEO_(u+1), dEO_(u+2)),
where ave is the mean function and dEO_u is the difference between the eye-opening value at time u and that at the previous sampling time;
s3, off-line training of the neural network: based on the obtained training sample set, an iterative learning module in the control unit iteratively adjusts the connection weight of the neural network by adopting a gradient descent method according to the actual value of the physical sign parameter and the network output value which are respectively input by the processing module and the neural network through the first connection array;
S4, online prediction: in the field environment, based on the illuminance, the color temperature and the xyz color coordinates of the current reading surface light acquired by the light color identification unit, together with the sleep-onset duration, the trained neural network predicts the user's eye-opening change rate, eye-closure duration change rate, heart rate change rate, body movement frequency change rate and body temperature change rate, and the results are output through the output module.
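As a minimal illustration of the moving-average filtering named in step S2 above, the change rate of a sampled sign sequence can be computed as the windowed mean of its first differences. This is a sketch, not the patent's code; the function name, the numpy implementation and the example data are assumptions:

```python
import numpy as np

def change_rate_series(eo: np.ndarray) -> np.ndarray:
    """Moving average (window 5) of the first differences of a sign sequence,
    i.e. k|_(t=u) = ave(d_(u-2), ..., d_(u+2)) as in the formula above."""
    d = np.diff(eo)                              # d_u = value(u) - value(u-1)
    kernel = np.ones(5) / 5.0
    return np.convolve(d, kernel, mode="valid")  # change rate at interior times

# Example: normalized eye-opening samples, one per sampling period.
eo = np.array([1.00, 0.98, 0.95, 0.90, 0.82, 0.70, 0.55, 0.38, 0.20, 0.08])
print(change_rate_series(eo))
```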
Preferably, the dimmable lamp set in the sleeping environment is an LED lamp set whose light properties, such as brightness, color temperature, color and illumination angle, are adjustable; the light output is adjusted by varying, through the dimming driver, the PWM duty cycle of the driving current of each LED string channel in the lamp set.
Preferably, in the acquisition of the sample set in step S2: the light output of the LED lamp set is changed stepwise within the known dimming range of the dimmable lamp set, and by continuously changing the working point in the illumination vector space, enough data samples are acquired over multiple sleep-onset detections; the sampling points can be sparse near the end values of each light color variable and denser in the low color temperature region, for example near a color temperature of 3000 K, and in the illuminance region of about 100 lx to 300 lx.
Preferably, the step S4 is preceded by the steps of:
T1, establishing, in the control unit, a mapping table from combinations of the pitch angle and roll angle corresponding to the illuminated surface orientation to the light color parameter values,
wherein the pitch angle and the roll angle are the rotation angles, in a world coordinate system, of a light color sensor held parallel to the illuminated surface; in the sleeping scene a bracket placed near the user is set up in this coordinate system, and the light color sensor is connected to the bracket sequentially through a pitching plate, a roll plate and a first connecting piece,
the step S4 further includes the following steps:
T2, in the field environment, if the combination of pitch angle and roll angle corresponding to the reading surface orientation is not in the mapping table, the corresponding light color parameter values are obtained by distance-weighted interpolation in the angle-combination space based on the mapping table; otherwise, if the combination is stored in the mapping table, the values are obtained by direct table lookup, as sketched below.
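A hypothetical sketch of the T2 dispatch just described (the names and types are illustrative, not from the patent): exact (pitch, roll) combinations are served by direct table lookup, all others by the distance-weighted interpolation detailed later in the description:

```python
from typing import Callable, Dict, Tuple

Angles = Tuple[float, float]     # (pitch, roll), e.g. in degrees
LightColor = Tuple[float, ...]   # e.g. (illuminance, color temperature, x, y, z)

def light_color_for(angles: Angles,
                    table: Dict[Angles, LightColor],
                    interpolate: Callable[[Angles, Dict[Angles, LightColor]],
                                          LightColor]) -> LightColor:
    if angles in table:                 # combination stored during T1
        return table[angles]
    return interpolate(angles, table)   # distance-weighted fallback
```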
Preferably, in step S2, the 5 characteristic parameters of the neural network output are obtained through the following processes:
periodically acquiring and recording the state change of the physical sign parameters in the sleeping process under various illumination conditions based on the sleeping identification unit, and for the recorded data in the physical sign parameter sequence in each sleeping process,
the eye-closure duration y1 of the user is first pre-processed,
y1 = min(y1, 4),
then off-line data fitting is performed based on the following model,
y1 = g1(t) = 8·b / (exp(4·c·(a - t)) + 1),
and the change rate of the eye-closure duration is then calculated,
k_ec = k_1 = t2 - t1, where t1 = g1^(-1)(4e^(-1)) and t2 = g1^(-1)(4 - 4e^(-1));
After normalization of each of the user's eye opening, heart rate, body movement frequency and body temperature, off-line data fitting is performed for each based on the following model,
y2 = g2(t) = 2·b / (exp(4·c·(t - a)) + 1),
and the respective change rates are then calculated,
k_i = t2 - t1, where t1 = g2^(-1)(1 - e^(-1)), t2 = g2^(-1)(e^(-1)), i = 2,3,4,5;
where y1 and y2 are the values obtained after sign parameter preprocessing or normalization, t is time, a, b and c are fitting coefficients, and k_i (i = 2,3,4,5) corresponds respectively to the eye-opening change rate k_eo, the heart rate change rate k_h, the body movement frequency change rate k_b and the body temperature change rate k_p.
Preferably, the neural network model is:
xc_k(t) = x_k(t - mod(k, q) - 1),
x_j(t) = f( Σ_(k=1..m) w_jk · xc_k(t) + Σ_(i=1..5) w_ji · u_i(t-1) - θ_j ),
y_h(t) = f( Σ_(j=1..m) w_hj · x_j(t) - θ_h ),
where mod is the remainder function and f() is the sigmoid function; xc_k(t) is the receiving layer output, x_j(t) the hidden layer output, u_i(t-1) the input layer input and y_h(t) the output layer output; w_hj, w_jk and w_ji are respectively the hidden-to-output, receiving-to-hidden and input-to-hidden connection weights, and θ_h and θ_j the output layer and hidden layer thresholds; k = 1,2,…,m; q is the selected regression delay scale, optimized according to the sampling period; j = 1,2,…,m; i = 1,2,…,5; the number m of hidden layer and receiving layer nodes can be chosen between 12 and 25; h = 1,2,…,5;
the training uses a gradient descent method.
Preferably, the sleep-in recognition unit comprises an image acquisition module, a wearable module and a sleep-in judgment module, wherein the image acquisition module adopts a depth camera to acquire images;
the step S2 includes the following processing procedures:
the image processing part in the sleep judging module continuously detects the eye opening of the user, the heart rate calculating part, the body movement frequency calculating part and the body temperature calculating part calculate the heart rate, the body movement frequency and the body temperature based on the human body sensing signals acquired by the wearable module,
the data fusion processing part in the sleep judging module performs data fusion on the physical sign parameters output by the image processing part, the heart rate calculating part, the body movement frequency calculating part and the body temperature calculating part to eliminate inconsistent parts in a data set,
and rotating a holder supporting the camera according to the processing result of the image processing part to align the camera with the face of the user.
Preferably, a display bar is arranged in the output module, and the display bar is used for indicating the factor values of the sleep efficiency of the current user in turn; and outputs the obtained values of the factors of the falling asleep efficiency to the outside through a communication interface block.
Preferably, the neural network further includes, as an input quantity, a fatigue index parameter, which the user enters through a key in the user interface unit according to his or her current degree of fatigue; during training sample collection, data sampling and sample recording can be suspended by pressing a sampling-cancel key.
preferably, the neural network module can also add a time period parameter acquired from the real-time clock module as an input, wherein the time period is noon or evening; the neural network module can also increase two parameters of temperature and humidity acquired from the temperature and humidity measurement module as input; the neural network module may also add as input a noise level parameter obtained from the noise measurement module.
Compared with the prior art, the scheme of the invention has the following advantages. The lighting condition is characterized by the illuminance, the color temperature and the xyz color coordinates of the reading surface, and the sleep-onset efficiency by the sign parameters, such as the user's eye-opening change rate, eye-closure duration change rate, heart rate change rate, body movement frequency change rate and body temperature change rate, obtained through data fusion and data fitting. Each parameter is acquired and processed by the light color identification unit and the sleep-in identification unit; the influence relation between the environment's lighting condition and the user's sleep-onset efficiency factors is modeled in the control unit as a nonlinear mapping; and the trained, fitted mapping predicts the user's sleep-onset efficiency under different light environments, providing a basis for the subsequent search for and recommendation of light environments with high sleep-onset efficiency.
Drawings
FIG. 1 is a schematic diagram of human body's biological clock rhythm;
FIG. 2 is a block diagram of a sleep environment illumination condition identification system;
FIG. 3 is a view showing a constitution of a control unit;
FIG. 4 is a structural diagram of a photochromic identification unit;
FIG. 5 is a view showing a constitution of a sleep-in recognition unit;
FIG. 6 is a structural diagram of a dimmable lamp set;
FIG. 7 is a schematic diagram of an Elman neural network structure;
FIG. 8 is a schematic view of the layout structure of the present invention;
FIG. 9 is a schematic view of the pan/tilt head rotation of the image capture module;
FIG. 10 is a rotation diagram of the light color obtaining module;
FIG. 11 is a schematic structural diagram of a light color obtaining module rotating platform;
FIG. 12 is a flowchart of the method operation of the present invention;
FIG. 13 is a graph of eye opening detection sequences;
FIG. 14 is a graph illustrating a fitting function.
Wherein: 100 a sleep environment illumination condition identification system, 110 a light color identification unit, 120 a sleep-in identification unit, 130 an identity identification unit, 140 a control unit, 150 a user interface unit, 160 a dimmable lamp set,
111 light color obtaining module, 112 light color judging module, 113 rotating platform,
121 image acquisition module, 122 wearable module, 123 sleep-falling judgment module, 1231 image processing part, 1232 heart rate calculating part, 1233 body motion frequency calculating part, 1234 body temperature calculating part, 1235 data fusion processing part,
141 input interface module, 142 processing module, 143 Elman neural network, 144 iterative learning module, 145 memory, 146 first connection array, 147 second connection array, 148 output module,
161 dimming driver, 162 LED strings,
101 a base, 102 a support, 103 a depth camera, 104 a pan-tilt, 105 a display bar, 106 a light color sensing block, 107 a key block, 108 a dimming panel,
1061 first connecting piece, 1062 rolling shaft, 1063 rolling plate, 1064 pitching shaft, 1065 pitching plate, 1066 light color sensor and 1067 second connecting piece.
Detailed Description
Preferred embodiments of the present invention will be described in detail below with reference to the accompanying drawings, but the present invention is not limited to only these embodiments. The invention is intended to cover alternatives, modifications, equivalents and alternatives which may be included within the spirit and scope of the invention.
In the following description of the preferred embodiments of the present invention, specific details are set forth in order to provide a thorough understanding of the present invention, and it will be apparent to those skilled in the art that the present invention may be practiced without these specific details.
The invention is described in more detail in the following paragraphs by way of example with reference to the accompanying drawings. It should be noted that the drawings are in simplified form and are not to precise scale, which is only used for convenience and clarity to assist in describing the embodiments of the present invention.
Example 1:
the human biological clock is the phenomena of physiological and biochemical processes, morphological structures, behaviors and the like which change periodically with time in a human body. The biological clocks in human body are various, and various physiological indexes of human body, such as pulse, body temperature, blood pressure, physical strength, emotion, intelligence and the like, can change periodically along with day and night changes.
As shown in fig. 1, at 2:00 am a person's sleep reaches its maximum depth; at 4:30 am the body temperature reaches its minimum; at 6:45 am the blood pressure rises fastest; at 7:30 am melatonin secretion stops; at 8:30 am intestinal peristalsis is frequent; at 9:00 am testosterone secretion reaches its maximum; at 10:00 am the brain is most alert; at 2:30 pm limb coordination is at its best; at 3:30 pm reactions are quickest; at 5:00 pm cardiovascular efficiency is highest; at 6:00 pm the blood pressure reaches its daily peak; at 7:00 pm the body temperature peaks; at 9:00 pm melatonin secretion begins; and at 10:30 pm intestinal peristalsis is suppressed.
According to the periodic change of physiological and biochemical activities of people, people can reasonably arrange activities in one day, so that the working efficiency and the resting efficiency are the highest, and the physical and psychological health states of people are the best. Among them, it is necessary for people to keep energy to arrange and guide sleep according to a biological clock.
When the human body passes from the waking state into sleep, the heartbeat slows, the body temperature drops, breathing slows and the muscles relax, accompanied mentally by a progression through relaxation, listlessness, fatigue, drowsiness and sleep onset. Comparative studies with electroencephalography have shown that the longer the eye-closure duration, the more severe the fatigue. The degree of fatigue can therefore be determined by measuring the eye opening and the closure duration, providing a detection means for the falling-asleep process.
During the sleep-onset stage, the human body shows trend changes such as increased fatigue, eyelid drooping, intermittent blinking until the eyes close completely, slowed body movement and pulse, and reduced body temperature, all of which can be detected by sensors. Detection of the eye state of the face, particularly changes in the eye opening, can be based on machine vision and image processing technologies, while heart rate, body movement and body temperature can be detected by wearable modules such as a bracelet; these detection means are already applied in driving or sleep monitoring.
The light has direct and important influence on the sleep of the human body, and in order to help find the illumination which is beneficial to the sleep to be more quickly, the invention detects and pre-judges the sleep efficiency characteristics of the user in different light color environments through nonlinear system modeling.
As shown in fig. 2, the system 100 for identifying the sleep environment lighting condition using the method of the present invention includes a light color identification unit 110, a sleep-in identification unit 120, an identity identification unit 130, a control unit 140, a user interface unit 150 and a dimmable lamp set 160. The identity identification unit 130 employs a fingerprint recognizer or another biometric recognizer; the biometric features may be iris features or facial measurement features such as the distances between the user's eyes, nose and mouth.
As shown in fig. 2, 5 and 8, the sleep onset identifying unit 120 includes an image capturing module 121, a wearable module 122 and a sleep onset determining module 123, wherein the image capturing module 121 is supported by the pan/tilt head 104. The camera 103 of the image acquisition module, together with the cradle head 104, is fixed on a support 102 placed near the user in a sleeping scene, and the bottom of the support 102 is supported by a base 101.
The image acquisition module 121 acquires continuous images of the face and of the reading object in the falling-asleep scene; the image processing part processes the acquired images, periodically monitors the user's eye opening, and obtains the user's eye-opening value and its change rate as well as the eye-closure duration and its change rate. The image processing part also identifies the orientation of the reading object relative to the bracket, so that the light color identification unit can measure the light color of the reading surface.
As shown in fig. 8 and 9, the image acquisition module adopts a depth camera, and images are captured by a color camera and a group of depth-of-field infrared cameras, the color camera is used for capturing images, the infrared cameras are used for generating a pixel depth matrix, and depth information of a target is generated through operation, so that human eyes at various angles are tracked and detected. In the process of tracking and detecting the human eyes, the holder supporting the camera is rotated according to the processing result of the image processing part, so that the camera is aligned to the face of the user, and imaging and processing are facilitated.
The wearable module 122 includes information acquisition modules such as a pulse sensor, an acceleration sensor, and a body temperature sensor, and signals acquired by these sensors are processed by the heart rate calculating part 1232, the body movement frequency calculating part 1233, and the body temperature calculating part 1234 in the sleep determination module 123, respectively, to obtain the heart rate, the body movement frequency, and the body temperature of the user.
Based on the image of the scene of falling asleep collected by the depth camera, the image processing part 1231 firstly performs smoothing processing and threshold segmentation, removes noise, positions the face and the eye region of the user, and extracts characteristic information such as the aspect ratio of human eyes; and secondly, performing geometric correction based on the depth information, performing three-dimensional reconstruction on the eye region, obtaining three-dimensional world coordinates of the eye region, and obtaining actual eye opening values at different angles and distances.
The eye-opening value can be calculated from the periodically acquired aspect ratio of the human eye, and the eye-closure duration can be obtained during periodic image sampling. The closed-eye state is defined as the eyelid covering more than 80% of the pupil area; during image sampling, if the eye images acquired at two consecutive samplings are both closed, the interval between the two acquisitions is counted as closed time. An open-closed-open sequence is acquired continuously, and the difference between the two open times is the eye-closure duration.
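The open-closed-open rule above can be sketched as follows; a simplified illustration under the stated periodic-sampling assumption (the function and its arguments are not from the patent):

```python
def closure_durations(open_flags, period_s):
    """open_flags: one truthy value per sample (True = eye open), sampled every
    period_s seconds; returns the durations of the closed runs between opens."""
    durations, run = [], 0
    for is_open in open_flags:
        if is_open:
            if run:  # an open-closed-open transition has just completed
                durations.append((run + 1) * period_s)
                run = 0
        else:
            run += 1
    return durations

print(closure_durations([1, 1, 0, 0, 0, 1, 0, 1], period_s=0.5))  # [2.0, 1.0]
```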
The image-based sleep characteristic processing process comprises the following steps: after the face of the image is positioned, the left eye region and the right eye region are segmented, and the eye opening and the eye closing duration are respectively identified for two eyes.
The amplitude and frequency of body motion decrease gradually during falling asleep, so they can be used for auxiliary detection of sleep onset. The current state is characterized by the physical activity, such as wrist activity energy and frequency, over a statistical period of half a minute. Zero-crossing detection is adopted: the acceleration value is compared with a reference value slightly greater than zero, and one count is registered each time the signal crosses the reference value. The body motion frequency feature is expressed by the following formula:
d_i = [equation reproduced only as an image in the source; it combines A_i, R_i, Q_i and a standard-deviation term with the coefficients η_1 to η_4]
where A_i is the number of wrist movements in the i-th cycle obtained from the acceleration values, R_i is a time-sequence coefficient, η_j (j = 1,2,3,4) are term coefficients, Q_i is the number of cycles, among the current cycle and the two cycles before and after it, whose count exceeds a set threshold such as 5, and SD is the standard-deviation function. Each coefficient can take a value between 0 and 1, and d_i can also be calibrated by fitting against other physiological indices recorded at the same time, such as electromyography.
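A sketch of the zero-crossing counting described above; the reference value, sampling rate and half-minute window are assumed for illustration. It yields the per-window movement counts A_i used in the body-motion feature:

```python
import numpy as np

def activity_counts(acc: np.ndarray, fs: float, ref: float = 0.05,
                    window_s: float = 30.0) -> np.ndarray:
    """Count upward crossings of `ref` per window; acc is a zero-mean
    acceleration magnitude sampled at fs Hz."""
    above = acc > ref
    rising = np.flatnonzero(~above[:-1] & above[1:]) + 1  # crossing indices
    n = int(window_s * fs)                                # samples per window
    counts = np.zeros(max(len(acc) // n, 1), dtype=int)
    for idx in rising:
        w = idx // n
        if w < len(counts):
            counts[w] += 1
    return counts  # A_i for each half-minute statistical cycle
```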
The pulse sensor is used for measuring the heart rate based on the principle that substances absorb light, and the pulse sensor irradiates blood vessels through a green light LED and is matched with a photosensitive photodiode to measure reflected light. Because the blood is red, the blood can reflect red light and absorb green light, and when the heart beats, the blood flow is increased, and the absorption amount of the green light is increased; the blood flow decreases in the beating gap of the heart, and the green light absorption decreases accordingly. Thus, heart rate can be measured from the absorbance of blood.
The pulse sensor converts the light absorption of the blood flow into a fluctuating signal, which is a mixture of a DC component and an AC component. The AC component reflecting the blood flow characteristics is extracted by band-pass filtering between 0.8 Hz and 2.5 Hz; a Fourier transform then locates the amplitude maximum, the frequency corresponding to that point is taken, and multiplying this frequency by 60 gives the actual heart rate value.
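The pipeline just described can be sketched as follows, assuming a uniformly sampled pulse signal (the Butterworth filter and its order are assumed choices; the patent only specifies the 0.8 Hz to 2.5 Hz band-pass, the spectral peak and the factor of 60):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def heart_rate_bpm(ppg: np.ndarray, fs: float) -> float:
    """Band-pass 0.8-2.5 Hz, find the spectral peak, convert Hz to beats/min."""
    b, a = butter(3, [0.8, 2.5], btype="bandpass", fs=fs)
    ac = filtfilt(b, a, ppg - np.mean(ppg))   # AC component of the pulse signal
    spectrum = np.abs(np.fft.rfft(ac))
    freqs = np.fft.rfftfreq(len(ac), d=1.0 / fs)
    return freqs[np.argmax(spectrum)] * 60.0  # dominant frequency x 60
```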
The body temperature calculating part carries out filtering processing on the signals collected by the body temperature sensor and calculates a body temperature value.
After basic data such as eye opening, eye closing duration, heart rate, body movement frequency, body temperature and the like are obtained, a data fusion processing part in the sleep judging module carries out data fusion on the physical sign parameters so as to eliminate inconsistent parts in a data set.
The data fusion adopts an evidential reasoning method based on set heuristic rules. The rules fall into single-factor and multi-factor categories. As an example of a single-factor rule: for the eye opening, if one eye is detected as closed while the other is open, the current state is judged as open. Other signs, such as an occasional large rise in body temperature while it is falling, or an occasional, rather than sustained, reverse rise following a heart rate decline, likewise require evidential reasoning to exclude the individual data points.
In multi-factor rule inference, a change trend shown by isolated sign data that runs opposite to the consistent trend of most of the feature data is excluded. When the eye-closure duration is fitted with a curve such as an exponential distribution, a few short closures mixed into a gradually lengthening sequence are excluded if the other sign data show that drowsiness is deepening; these may be deliberate anti-fatigue actions, shown as several quick blinks, made when a person consciously adjusts his or her state while falling asleep. Similarly, if the other sign data change little, that is, no fatigue is apparent, but the eye-closure duration is far longer than normal, those data should be excluded, as a foreign object may be present in the eye. As another example, when the body is settling down and the acceleration sensor detects a sudden body jerk while the other sign data change little, the jerk may be a hypnic twitch during falling asleep, and the corresponding data should be deleted when the body motion frequency trend is calculated.
Based on the various sign data sequences after data fusion, the sleep-onset judging module expresses each data sequence by data fitting. Fig. 13 shows a detection sequence of the eye opening during reading before sleep: the sampling sequence of the normalized eye opening de is first pre-filtered and then data-fused to further remove the influence of accidental factors. In the first stage, the eye opening de changes little, fluctuating within a range around its normal-state average; in the second stage, as drowsiness approaches, the eye opening decreases gradually until it is finally detected as essentially closed.
As fig. 13 shows, the turning point of the eye opening during the sleep-onset transition is hard to predict, the eyes closing gradually within a short time after it; moreover, the duration of this gradual change differs greatly between occasions. To fit such sampling sequences, and differing from common trend functions such as Sigmoid and tanh, the following fitting function is designed:
y2 = g2(t) = 2·b / (exp(4·c·(t - a)) + 1),
where b is a scaling factor, which can be 0.5 for normalized data, and a and c are parameters related to the sample.
Referring to fig. 14, the values of a and c for the left curve are 2 and 2, respectively, and the values of a and c for the right curve are 5 and 1, respectively, it can be seen that, by appropriately changing the values of a and c, the data sequence with various turning point positions and different changing rates and a descending trend can be fitted.
Correspondingly, the y2 function can be used to fit the sign data sequences that decline and then stabilize, such as heart rate, body motion frequency and body temperature. For the eye-closure duration, another fitting function is designed accordingly:
y1 = g1(t) = 8·b / (exp(4·c·(a - t)) + 1).
Also, if the eye-closure duration reaches 4 seconds, the person is generally judged to have entered the sleep state. The eye-closure duration is therefore pre-processed as
y1 = min(y1, 4),
since otherwise the eye-closure duration could take arbitrarily large values and the sample would lose its characterization meaning.
How should the change rate of these signs, such as the eye opening, be characterized on the basis of the data fitting? If only the first derivative of the fitted function at some time point were computed, the characterization would lose meaning because the value differs between time points; likewise, the second derivative cannot characterize the difference between different trend curves. The invention therefore characterizes the sign change rate by the time difference between the independent-variable values corresponding to two fixed dependent-variable values of the fitted trend function. For example, for the eye opening, the change rate k_eo is calculated as
k_eo = t2 - t1, where t1 = g2^(-1)(1 - e^(-1)) and t2 = g2^(-1)(e^(-1)).
The change rates of the other signs can be calculated similarly. Through this processing, the various signs and their change rates show a consistent evaluation standard; for example, a smaller defined sign change rate indicates a shorter transition to sleep. Meanwhile, compared with single-factor evaluation such as the eye opening alone, multi-factor sign evaluation better reflects the sleep-onset efficiency or speed characteristics of different people, providing a basis for the subsequent modeling of lighting influence and for lighting optimization control.
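The fit-then-invert computation above can be sketched with scipy (an illustration, not the patent's implementation; with b = 0.5 the inverse of g2 gives k = ln(e - 1)/(2c) exactly, so the change rate depends only on the fitted steepness c):

```python
import numpy as np
from scipy.optimize import curve_fit

def g2(t, a, c, b=0.5):
    """Falling trend function y2 = 2b / (exp(4c(t - a)) + 1)."""
    return 2.0 * b / (np.exp(4.0 * c * (t - a)) + 1.0)

def fit_change_rate(t: np.ndarray, y: np.ndarray) -> float:
    (a, c), _ = curve_fit(lambda tt, aa, cc: g2(tt, aa, cc), t, y,
                          p0=(float(t.mean()), 1.0))
    g2_inv = lambda yv: a + np.log(1.0 / yv - 1.0) / (4.0 * c)  # b = 0.5
    t1, t2 = g2_inv(1 - 1 / np.e), g2_inv(1 / np.e)
    return t2 - t1  # k = ln(e - 1) / (2c); smaller means faster sleep onset

# Synthetic normalized eye-opening data with a turning point near t = 5.
t = np.linspace(0.0, 10.0, 60)
y = g2(t, a=5.0, c=0.8) + np.random.normal(0.0, 0.01, t.size)
print(fit_change_rate(t, y))  # close to ln(e - 1) / 1.6, i.e. about 0.34
```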
Preferably, the credibility of physical parameters such as the opening degree of eyes, the duration of eye closure, the heart rate, the body movement frequency, the body temperature and the like of the user is calculated according to a plurality of front and rear terms of the time data sequence of the user, and a Bayesian data fusion method is used for fusing a plurality of physical parameters into one output.
As shown in fig. 2, 4, 10 and 11, the light color identification unit 110 includes a light color obtaining module 111 disposed on the rotary platform 113 and a light color determining module 112 for processing and calculating the obtained light sensing signal. The rotary platform and the light color obtaining module form a light color sensing block 106, which is connected to the bracket 102.
The light color sensor 1066 in the light color obtaining module is connected to the bracket 102 sequentially through the pitching plate 1065, the roll plate 1063 and the first connecting piece 1061. The pitching plate 1065 is connected to the roll plate 1063 through the pitching shaft 1064 and drives the light color sensor 1066 to pitch around the Y axis; the roll plate 1063 is connected to the first connecting piece 1061 through the roll shaft 1062 and drives the pitching plate 1065 and the light color sensor 1066 to roll around the X axis. The roll shaft 1062 and the pitching shaft 1064 are driven by motors, powered through the first connecting piece 1061 and the second connecting piece 1067 respectively, and the motors are controlled by the light color judging module or by the control unit located in the base. The first connecting piece 1061 is a rigid connection that provides an electrical channel in addition to support and fixing; the second connecting piece 1067 is a flexible connection that provides only an electrical channel.
The photochromic sensor comprises an illuminance sensor, a color temperature sensor and a color sensor, wherein the color temperature and the color can be acquired by the same RGB or xyz color sensing module. Preferably, the color sensing module may be a TCS3430 sensor with five channels including X, Y, Z channel and two Infrared (IR) channels that can be used to infer the light source type. The TCS3430 sensor collects the light color signal of the reading surface in real time, and the xyz color coordinate value and the color temperature of the color are respectively obtained after signal processing and conversion by the processing module in the control unit.
Before falling asleep the user may be active at a table, for example planning or scheduling the next day's work or reading briefly; in that case the reading surface is essentially stationary and light detection can be performed in the horizontal plane. Sometimes, however, the reading surface is not horizontal, for example when the user reads leaning on a couch, sofa or headboard. In that case, based on the image processing part's recognition of the reading surface orientation, there are two ways to detect the light, especially the illuminance, on the reading surface: one is to convert the illuminance detected by the light color sensor 1066 to the reading surface according to the spatial distribution characteristics of the light source; the other is to turn the light color sensor parallel to the reading surface with the rotating platform, so that the reading surface illuminance is obtained by the light color judging module. The former requires modeling the spatial distribution of the light source and has a narrow application range; the second method is therefore adopted.
In a light environment, the pitching and roll shafts are rotated to change the orientation of the light color sensor surface so that it is parallel to the target reading surface; after sampling the incident light, the light color judging module calculates the light color parameter values of that orientation, such as illuminance, color temperature and xyz color coordinates; the pitch angle α and roll angle β corresponding to each orientation are recorded, and a mapping table from (α, β) combinations to the light color parameter values is established. The orientation of the target reading surface is obtained after sampling and processing by the image acquisition module and the image processing part respectively.
In order to generalize the mapping table to any specific orientation, when the combination of the pitch angle and the roll angle of the orientation is not in the mapping table, the corresponding photochromic parameter value is obtained through distance weighted interpolation calculation in the angle combination space based on the mapping table, and the process is as follows.
For simplicity, without loss of generality, only 2 parameters of the light color parameters, i.e., the reading surface illuminance and the color temperature, are taken as examples, and more light color parameters can be processed similarly.
Based on the mapping table from pitch angle α and roll angle β combinations to the light color parameter values, the illuminance and color temperature values for a specific angle combination (α_0, β_0) are obtained by interpolation in the table.
First, the four points surrounding P(α_0, β_0) in the angle space are found: A(α_1, β_1), B(α_2, β_1), C(α_1, β_2) and D(α_2, β_2), where α_1 ≤ α_0 ≤ α_2 and β_1 ≤ β_0 ≤ β_2.
The illuminance and color temperature values (E_0, K_0) are then interpolated using the distances as weights,
E_0 = ( Σ_(i=1..4) E_i / d_i ) / ( Σ_(i=1..4) 1 / d_i ),
K_0 = ( Σ_(i=1..4) K_i / d_i ) / ( Σ_(i=1..4) 1 / d_i ),
where d_1 denotes the shortest of the distances from P to the four points, d_2 the second shortest, and so on, and E_1 and K_1 are the illuminance and color temperature values of the nearest point; the four points around the queried point P are thus weighted differently according to their distances, the nearest point weighing the most.
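A sketch of this interpolation under the inverse-distance reading given above (the function name, the coincidence tolerance and the example grid are illustrative):

```python
import numpy as np

def idw_light_color(p, corners, values):
    """p: (alpha0, beta0); corners: the 4 bracketing (alpha, beta) points;
    values: the 4 corresponding (E, K) pairs from the mapping table."""
    d = np.linalg.norm(np.asarray(corners, float) - np.asarray(p, float), axis=1)
    if np.any(d < 1e-9):                 # P coincides with a stored grid point
        return tuple(values[int(np.argmin(d))])
    w = 1.0 / d                          # shortest distance, largest weight
    w /= w.sum()
    return tuple(w @ np.asarray(values, float))   # (E_0, K_0)

corners = [(10, 20), (15, 20), (10, 25), (15, 25)]
values = [(300.0, 3000.0), (280.0, 3100.0), (320.0, 2900.0), (305.0, 3050.0)]
print(idw_light_color((12.0, 22.0), corners, values))
```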
Preferably, when establishing the mapping from angle combinations to light color parameter values, the light color sensor surface is arranged as close as possible to the reading surface, so that the illuminance difference between the two planes is small enough not to affect the sleep-onset evaluation. This is easily satisfied when the light source is some distance from the reading surface.
As shown in fig. 2 and fig. 3, the control unit includes an input interface module 141, a processing module 142, an Elman neural network 143, an iterative learning module 144, a memory 145, a first connection array 146, a second connection array 147 and an output module 148.
The invention uses the neural network to model the mapping between the lighting condition of the environment and the user's sleep-onset efficiency factors. Specifically, an Elman neural network as shown in fig. 7 is established; the network takes the illuminance, the color temperature and the xyz color coordinates of the reading surface as input quantities, and the 5 sign parameters, the user's eye-opening change rate, eye-closure duration change rate, heart rate change rate, body movement frequency change rate and body temperature change rate, as output quantities.
Compared with a BP (back propagation) neural network, the Elman neural network has a recursive structure: besides the input, hidden and output layers, it also contains a receiving layer used for feedback connections between layers, which lets it express the time delay and temporal ordering between input and output and gives the network a memory function. Referring to fig. 7, the established network has 5 units in the input layer, m nodes in each of the hidden layer and the receiving layer, and 5 units in the output layer.
The neural network model is:
xc_k(t) = x_k(t - mod(k, q) - 1),
x_j(t) = f( Σ_(k=1..m) w_jk · xc_k(t) + Σ_(i=1..5) w_ji · u_i(t-1) - θ_j ),
y_h(t) = f( Σ_(j=1..m) w_hj · x_j(t) - θ_h ),
where mod is the remainder function and f() is the sigmoid function; xc_k(t) is the receiving layer output, x_j(t) the hidden layer output, u_i(t-1) the input layer input and y_h(t) the output layer output; w_hj, w_jk and w_ji are respectively the hidden-to-output, receiving-to-hidden and input-to-hidden connection weights, and θ_h and θ_j the output layer and hidden layer thresholds; k = 1,2,…,m; q is the selected regression delay scale, optimized according to the sampling period; j = 1,2,…,m; i = 1,2,…,5; the number m of hidden layer and receiving layer nodes can be chosen between 12 and 25; h = 1,2,…,5.
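One forward step of this model can be sketched as follows; the array shapes and the layout of the stored hidden-state history are assumptions made for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def elman_step(u_prev, x_hist, W_ji, W_jk, W_hj, theta_j, theta_h, q=1):
    """u_prev: inputs u(t-1), shape (5,); x_hist: past hidden states with the
    newest row last, shape (>= q + 1, m); returns hidden x(t) and output y(t)."""
    m = W_jk.shape[0]
    # xc_k(t) = x_k(t - mod(k, q) - 1): each receiving unit taps a delayed copy.
    xc = np.array([x_hist[-(k % q + 1), k] for k in range(m)])
    x = sigmoid(W_jk @ xc + W_ji @ u_prev - theta_j)   # hidden layer, shape (m,)
    y = sigmoid(W_hj @ x - theta_h)                    # output layer, shape (5,)
    return x, y
```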
Referring to fig. 12, the method for identifying the illumination condition of the sleep environment of the present invention includes the following steps:
S1, establishing the nonlinear mapping: taking the 5 light color parameters, the illuminance, the color temperature and the xyz color coordinates of the reading surface light, as input quantities, and the 5 sign parameters, the user's eye-opening change rate, eye-closure duration change rate, heart rate change rate, body movement frequency change rate and body temperature change rate, as output quantities, a dynamic recurrent Elman neural network is established in the control unit;
S2, obtaining a training sample set: a dimming signal is sent to the dimmable lamp set through the output module of the control unit; the light color parameters of the reading surface light, such as illuminance, color temperature and color, are collected and identified by the light color identification unit; the user's sign parameters, such as the eye-opening change rate, eye-closure duration change rate, heart rate change rate, body movement frequency change rate and body temperature change rate, are collected, processed and identified by the sleep-in identification unit and the control unit; and the light color parameter values with the corresponding sign parameter values are recorded, giving one training sample of the neural network,
repeatedly acquiring training samples to obtain a training sample set of the neural network;
s3, off-line training of the neural network: based on the obtained training sample set, an iterative learning module in the control unit iteratively adjusts the connection weight of the neural network by adopting a gradient descent method according to the actual value of the physical sign parameter and the network output value which are respectively input by the processing module and the neural network through the first connection array;
s4, online prediction: in the field environment, the trained neural network predicts the eye opening change rate, the eye closing duration change rate, the heart rate change rate, the body movement frequency change rate and the body temperature change rate of the user based on the illuminance, the color temperature and the xyz color coordinate value of the color of the current reading surface light acquired by the light color recognition unit and outputs the results through the output module.
To improve the generalization capability of the neural network, enough training samples are collected. The control unit sends dimming signals to the lamp set through the output module or the user interface unit, obtains a training sample set of the neural network for the specific user under different light environments based on the light color identification unit and the sleep-in identification unit, and records the actual values of the outputs y_h of all samples, i.e. the expected values y_hd.
Through the sleep-in identification unit and the processing module in the control unit, the 5 physical sign parameters forming the neural network's output quantity are obtained as follows:
based on the sleep-in identification unit, the change of the physical sign parameters during the falling-asleep process is acquired and recorded under various illumination conditions; the recorded physical sign parameter sequence of each falling-asleep process is then processed as follows.
The user's eye closure duration y1 is first preprocessed by clipping it at 4,
y1 = min(y1, 4),
then an off-line data fit is performed with the model
y1 = g1(t) = 8·b / (exp(4·c·(a − t)) + 1),
and the eye closure duration change rate is calculated as
k_ec = k_1 = 1/(t2 − t1), where t1 = g1⁻¹(4e⁻¹) and t2 = g1⁻¹(4 − 4e⁻¹);
after normalization, each of the physical sign parameters among the user's eye opening degree, heart rate, body movement frequency and body temperature is fitted off-line with the model
y2 = g2(t) = 2·b / (exp(4·c·(t − a)) + 1),
and the respective change rates are calculated as
k_i = 1/(t2 − t1), where t1 = g2⁻¹(1 − e⁻¹), t2 = g2⁻¹(e⁻¹), i = 2, 3, 4, 5;
here y1 and y2 are the preprocessed or normalized physical sign parameter values, t is time, a, b and c are fitting coefficients, and k_i (i = 2, 3, 4, 5) corresponds to the eye opening change rate k_eo, heart rate change rate k_h, body movement frequency change rate k_b and body temperature change rate k_p, respectively.
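By way of illustration, the fitting and change-rate computation above could be sketched as follows, assuming the reconstructed logistic model g2 and SciPy's curve_fit; the function names, the initial guess p0 and the synthetic trace are illustrative assumptions, not part of the method.

    import numpy as np
    from scipy.optimize import curve_fit

    def g2(t, a, b, c):
        # falling logistic model for a normalized physical sign parameter
        return 2.0 * b / (np.exp(4.0 * c * (t - a)) + 1.0)

    def g2_inv(y, a, b, c):
        # inverse of g2, valid for 0 < y < 2b
        return a + np.log(2.0 * b / y - 1.0) / (4.0 * c)

    def change_rate(t_samples, y_samples):
        """Fit the logistic off-line and return k = 1/(t2 - t1), with t1, t2
        the times where the fitted curve crosses 1 - 1/e and 1/e."""
        (a, b, c), _ = curve_fit(g2, t_samples, y_samples,
                                 p0=(np.median(t_samples), 0.5, 0.1))
        t1 = g2_inv(1.0 - np.exp(-1.0), a, b, c)
        t2 = g2_inv(np.exp(-1.0), a, b, c)
        return 1.0 / (t2 - t1)

    # example: synthetic normalized heart rate trace while falling asleep
    t = np.linspace(0.0, 30.0, 120)
    y = g2(t, 15.0, 0.5, 0.08) + 0.01 * np.random.randn(t.size)
    k_h = change_rate(t, y)     # heart rate change rate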
While sampling the falling-asleep process, when the change rates of several physical sign parameters are detected to remain below a set threshold over several consecutive periods, the user is considered to have fallen asleep and the sleep-in sampling is stopped.
The neural network training adopts a gradient descent method, and the weight and threshold value adjusting method in the training is as follows.
Assuming a total of P training samples, let the error function be:
E = (1/2)·Σ_{p=1…P} Σ_{h=1…5} (y_hd − y_h)²,
then the adjustment of the weights from the hidden layer to the output layer is given by
w_hj(t+1) = w_hj(t) + Δw_hj(t+1),
where
Δw_hj(t+1) = −η·δ_h^y·x_j(t),
δ_h^y = −(y_hd − y_h)·y_h·(1 − y_h),
and the adjustment of the output layer thresholds is given by
θ_h(t+1) = θ_h(t) + Δθ_h(t+1),
where
Δθ_h(t+1) = η·δ_h^y.
The input-to-hidden connection weights, the hidden layer thresholds and the context-to-hidden connection weights are adjusted similarly.
The initial value of each weight is drawn from the interval (−0.1, 0.1); the learning rate η is a positive number smaller than 1, and can either be kept fixed or be adjusted dynamically according to the current total output error of the network. The training end condition can be set as the total error, or its change, falling below a set value, or as a maximum number of training iterations being reached.
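A minimal sketch of one such gradient-descent update for the hidden-to-output weights and output thresholds is given below; it reuses the hypothetical ElmanNet sketch from above and assumes the reconstructed update formulas, so it is illustrative rather than a definitive implementation.

    import numpy as np

    def train_step(net, u, y_target, eta=0.05):
        """One gradient-descent update of the hidden-to-output weights w_hj
        and output thresholds theta_h of the hypothetical ElmanNet above."""
        y = net.forward(u)                        # network output y_h(t)
        x = net.history[-1]                       # hidden layer output x_j(t)
        delta = -(y_target - y) * y * (1.0 - y)   # delta_h^y as defined above
        net.w_hj += -eta * np.outer(delta, x)     # Delta w_hj = -eta * delta_h^y * x_j
        net.theta_h += eta * delta                # Delta theta_h = eta * delta_h^y
        # (the input-to-hidden and context-to-hidden weights and the hidden
        # thresholds would be updated analogously)
        return 0.5 * np.sum((y_target - y) ** 2)  # this sample's error contribution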
Before network training, normalization preprocessing can be performed on input quantity and output quantity:
r' = (r − r_min) / (r_max − r_min),
where r is the unprocessed physical quantity, r' is the normalized physical quantity, and r_max and r_min are respectively the maximum and minimum values in the sample data set.
When calculating a predicted value, the network output is converted back to the output quantity value by the inverse formula:
r = r_min + r'·(r_max − r_min).
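For instance, this min-max scaling and its inverse amount to the following trivial sketch (names illustrative):

    def normalize(r, r_min, r_max):
        # map a raw physical quantity into [0, 1]
        return (r - r_min) / (r_max - r_min)

    def denormalize(r_prime, r_min, r_max):
        # map a network output back to physical units
        return r_min + r_prime * (r_max - r_min)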
when the online prediction is applied, the first connection array is disconnected; the neural network predicts each output quantity and feeds it through the second connection array to the processing module, which processes and analyzes it before it is displayed by the output module and sent outward in signal form.
As shown in fig. 1 and fig. 6, in an environment where the method is tested or used, the dimmable light set 160 is preferably a dimmable LED light set whose driver 161 can change its output current: the driver adjusts the driving current of each LED string 162 in the set by changing the PWM duty cycle of each channel's driving current. By changing the driving currents, the dimmable light set 160 can adjust at least one of the light properties of brightness, color temperature, color and illumination angle.
Preferably, the LED string is a dimming lamp with RGB three-primary-color current channels; the light color of the lamp can then be changed by varying the driving current of one of the channels, while increasing or decreasing the three channel currents synchronously from a given state leaves the color unchanged and gradually raises or lowers the brightness.
Preferably, the control unit changes the light output of the LED lamp set stepwise within its known dimming range through the output module. For example, a mapping table is established that links the channel current values of the LED strings to the corresponding illuminance, color temperature and color measured on the reading surface. Within the value range of the illumination vector space formed by illuminance, color temperature and color, only one variable, such as the illuminance, is changed while the others, such as the color temperature and the color, are held constant; the mapping table is then searched in reverse to find the channel current values corresponding to the current illumination vector, and the control unit sends the PWM duty cycle of each channel current to the driver in signal form through the output module. By continuously shifting the working point in the illumination vector space, the control unit obtains enough neural network training samples over multiple sleep-in detections; the sampling points may be sparse near the extremes of each light color variable and denser in regions of interest, such as the low color temperature region around 3000 K and illuminance between 100 lx and 300 lx. The collected samples are stored in a memory.
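A table-driven working-point change of this kind might be sketched as follows; the table layout, the scaling used in the nearest-neighbour reverse lookup, and all names are assumptions for illustration.

    import numpy as np

    # hypothetical calibration table: one row per measured working point,
    # (duty_R, duty_G, duty_B, illuminance_lx, cct_K, x, y)
    table = np.array([
        [0.20, 0.20, 0.15, 100.0, 3000.0, 0.437, 0.404],
        [0.35, 0.35, 0.25, 200.0, 3000.0, 0.437, 0.404],
        [0.50, 0.50, 0.35, 300.0, 3000.0, 0.437, 0.404],
    ])

    def duties_for(illuminance, cct, x, y):
        """Reverse lookup: return the PWM duty cycles whose measured light
        vector lies closest (roughly scaled Euclidean) to the requested one."""
        target = np.array([illuminance / 100.0, cct / 1000.0, x, y])
        measured = np.column_stack([table[:, 3] / 100.0, table[:, 4] / 1000.0,
                                    table[:, 5], table[:, 6]])
        nearest = np.argmin(np.linalg.norm(measured - target, axis=1))
        return table[nearest, 0:3]   # duty cycles for the R, G, B channels

    duty_r, duty_g, duty_b = duties_for(200.0, 3000.0, 0.437, 0.404)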
The preset values and other parameters required for processing by the control unit are input through keys in the user interface unit. Relying on its generalization capability, the trained neural network can predict and judge the user's sleep-in efficiency under the current illumination condition in a new light environment, and display or output the prediction through the output module.
Specifically, as shown in fig. 1 and 8, the keys of the user interface unit are disposed on the base 101 in the area of the key block 107; on the opposite side of the key block, the user interface unit may further be provided with a dimming panel 108 for manually adjusting the light output of the light set.
Preferably, the output module 148 includes a display bar 105 for indicating the values of the current user's sleep-in efficiency factors in turn. Preferably, the output module further comprises a communication interface, through which the detected or predicted values of the sleep-in efficiency factors are output to the outside.
Since the drowsiness or fatigue level when preparing to fall asleep varies, a key indicating the current fatigue level is preferably provided in the user interface unit, while the neural network adds a fatigue index as an additional input quantity; the fatigue index may be an integer between 1 and 5.
When the user has difficulty falling asleep due to emotions or similar causes, the collected samples deviate considerably from those under normal conditions; although the neural network has good fault tolerance, too many such samples affect the accuracy of the network. For this purpose, a cancel-sampling key is preferably provided in the user interface unit, and the control unit suspends data sampling and sample recording after detecting that this key is pressed.
To increase the applicability of the network, the control unit may preferably further include a real-time clock module, and the neural network module may further include a seasonal parameter obtained from the real-time clock module as an input.
Preferably, the neural network module may further add a time period parameter obtained from the real-time clock module as an input, the time period being, for example, noon or night.
Preferably, the control unit can additionally be provided with a temperature and humidity measurement module, and the neural network module then adds the temperature and humidity acquired from it as two further inputs.
Preferably, the control unit may further include a noise measurement module, and the neural network module adds a noise level parameter obtained from the noise measurement module as an input.
Example 2:
in this embodiment, referring to fig. 10 and fig. 11, a reading surface light measuring method in a sleep environment is provided, which includes the following steps:
p1, in the sleeping scene, connecting the light color sensor of the light color acquisition module to a bracket near the user, in a world coordinate system, through a pitch plate, a roll plate and a first connecting piece in sequence;
p2, in a light environment, rotating respectively the pitch shaft connecting the pitch plate and the roll plate, and the roll shaft connecting the roll plate and the first connecting piece, so as to change the orientation of the light color sensor's surface until it is parallel to a target reading surface; sampling the reflected light, having the light color judging module calculate the light color parameter values of the oriented surface, such as the illuminance, color temperature and x, y, z color coordinates, recording the pitch angle α and roll angle β of each orientation, and establishing a mapping table from the (α, β) combinations to the light color parameter values;
p3, in the field environment, if the combination of pitch angle and roll angle corresponding to the orientation of the illuminated plane is not in the mapping table, obtaining the corresponding light color parameter values by Euclidean-distance-weighted interpolation in the angle combination space according to the mapping table; otherwise, if the combination is stored in the mapping table, looking up the table directly to obtain the corresponding light color parameter values.
In order to identify the orientation of the illuminated plane, it can be detected either by a depth camera on a bracket in the world coordinate system or by a triaxial acceleration sensor fixed on the illuminated plane; the detected signal is converted into the corresponding pitch and roll angles by the signal processing module in the light color identification unit.
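As a hedged illustration of the accelerometer variant, pitch and roll can be estimated from the static gravity components roughly as follows; the axis conventions and names are assumptions, and a real device would additionally need filtering and calibration.

    import math

    def pitch_roll_from_accel(ax, ay, az):
        """Estimate pitch and roll (degrees) of a plane-mounted triaxial
        accelerometer from its static gravity components."""
        pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
        roll = math.degrees(math.atan2(ay, az))
        return pitch, roll

    alpha, beta = pitch_roll_from_accel(0.26, 0.10, 0.96)  # sensor tilted toward the reader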
The interpolation calculation process is as follows:
for simplicity and without loss of generality, only 2 of the light color parameters, the reading surface illuminance and the color temperature, are taken as an example; further light color parameters can be processed in the same way.
Given the mapping table from the pitch angle α and roll angle β combinations to the light color parameter values, the illuminance and color temperature values for a specific angle combination (α₀, β₀) are obtained by interpolation in the table.
First, the four points surrounding P(α₀, β₀) in the angle space are found: A(α₁, β₁), B(α₂, β₁), C(α₁, β₂) and D(α₂, β₂), where α₁ ≤ α₀ ≤ α₂ and β₁ ≤ β₀ ≤ β₂.
The illuminance and color temperature values (E₀, K₀) are then interpolated with the distances as weights:
E₀ = (d₄·E₁ + d₃·E₂ + d₂·E₃ + d₁·E₄) / (d₁ + d₂ + d₃ + d₄),
K₀ = (d₄·K₁ + d₃·K₂ + d₂·K₃ + d₁·K₄) / (d₁ + d₂ + d₃ + d₄),
where d₁ is the shortest of the distances from P to the four points, d₂ the second shortest, and so on; E₁ and K₁ are the illuminance and color temperature values of the nearest point. The four points around the query point P are thus weighted differently according to their distances, the nearest point receiving the largest weight.
Example 3:
in contrast to embodiment 1, in the present embodiment the sleep-in duration is introduced into the input parameters of the nonlinear mapping. This embodiment provides a sleep environment illumination condition identification method, comprising the following steps:
s1, establishing a nonlinear mapping: a dynamic recursive Elman neural network is established in the control unit, taking 6 parameters, namely the illuminance, color temperature and x, y, z color coordinates of the reading surface light and the sleep-in duration, as input quantities, and 5 physical sign parameters, namely the user's eye opening change rate, eye closure duration change rate, heart rate change rate, body movement frequency change rate and body temperature change rate, as output quantities;
s2, obtaining a training sample set: a dimming signal is sent to the dimmable lamp group through the output module of the control unit; the light color parameters of the reading surface light, such as illuminance, color temperature and color, are collected and identified by the light color identification unit; the user's physical sign parameters, such as the eye opening change rate, eye closure duration change rate, heart rate change rate, body movement frequency change rate and body temperature change rate, are collected, processed and identified through the sleep-in identification unit and the control unit; and the light color parameter values together with the corresponding physical sign parameter values are recorded to obtain one training sample of the neural network,
repeatedly acquiring training samples to obtain a training sample set of the neural network;
wherein, the parameters of each training sample are obtained according to the following processing procedures:
continuously detecting the user's eye opening; when, within a set time length, the eye opening value stays continuously smaller than (1 − δ%) times its value at the initial stage of falling asleep, the current time is taken as the timing zero of the sleep-in duration and the sample records before this zero time are discarded, where δ can be an integer between 5 and 10,
the user's eye opening change rate k_eo, eye closure duration change rate k_ec, heart rate change rate k_h, body movement frequency change rate k_b and body temperature change rate k_p are all calculated with a moving average filter (a short sketch follows step S4 below); for example, for the eye opening change rate,
k_eo|t=u = ave(dEO_{u−2}, dEO_{u−1}, dEO_u, dEO_{u+1}, dEO_{u+2}),
where ave is the mean function and dEO_u is the difference between the eye opening value at time u and that at the previous time;
s3, off-line training of the neural network: based on the obtained training sample set, an iterative learning module in the control unit iteratively adjusts the connection weight of the neural network by adopting a gradient descent method according to the actual value of the physical sign parameter and the network output value which are respectively input by the processing module and the neural network through the first connection array;
s4, online prediction: in the field environment, based on the illuminance, color temperature and x, y, z color coordinates of the current reading surface light acquired by the light color identification unit, together with the sleep-in duration, the trained neural network predicts the user's eye opening change rate, eye closure duration change rate, heart rate change rate, body movement frequency change rate and body temperature change rate, and outputs the results through the output module.
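The moving-average change-rate estimate described in step S2 could look like this minimal sketch (illustrative names; a centered 5-point window over the first differences, as in the formula above):

    import numpy as np

    def change_rate_series(eo):
        """Change rate k_eo at each time, as a centered 5-point moving average
        of the first differences dEO_u = eo[u] - eo[u-1]."""
        d = np.diff(eo)                               # dEO_u
        return np.convolve(d, np.ones(5) / 5.0, mode="valid")

    k_eo = change_rate_series(
        np.array([0.95, 0.94, 0.92, 0.88, 0.83, 0.77, 0.70]))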
As shown in fig. 13, since the user's falling-asleep turning point cannot be predicted in advance, in the present embodiment the eye opening is monitored continuously and, once it deviates significantly from the normal range, sampling and recording of the subsequent data sequence is started.
Compared with embodiment 1, because the time elapsed since the falling-asleep turning point is introduced as a network input, the trained neural network can predict the physical sign parameters at a given later time point.
Preferably, the progress of falling asleep can be characterized by an exponential distribution function: the physical sign parameters, such as the user's eye opening degree, eye closure duration, heart rate, body movement frequency and body temperature, are normalized, combined as a weighted average of the instantaneous data, and used as fitting sample data, so that all physical sign parameters are fused into one function.
Example 4:
different from embodiment 1, in this embodiment the control unit replaces the neural network with a sleep-in efficiency mapping table to implement the mapping from the light color condition to the change rate physical sign parameters of the sleep-in efficiency.
In this embodiment, a method for identifying a sleep environment lighting condition is provided, which includes the following steps:
s1, establishing a data sample structure:
the illumination condition is represented by 2 light color parameters, the reading surface illuminance and the color temperature, and the change rates of 5 physical sign parameters, namely the user's eye opening value, eye closure duration, heart rate, body movement frequency and body temperature, are taken as the sleep-in efficiency factors,
establishing an empty sleep-in efficiency mapping table which takes the light color parameter combinations as row indices and the 5 change rate physical sign parameters of the sleep-in efficiency factors as column titles, i.e. fields;
s2, acquiring a fitting data sample set of the falling-asleep process: a dimming signal is sent to the dimmable lamp group through the output module of the control unit; for a specific user, records of the change of the physical sign parameters during falling asleep are acquired under various illumination conditions, in different light environments, based on the light color identification unit and the sleep-in identification unit; the physical sign parameter change records corresponding to each light color combination of illuminance and color temperature are then processed as follows:
the user's eye closure duration y1 is preprocessed by clipping it at 4,
y1 = min(y1, 4),
then an off-line data fit is performed with the model
y1 = g1(t) = 8·b / (exp(4·c·(a − t)) + 1),
and the eye closure duration change rate is calculated as
k_ec = k_1 = 1/(t2 − t1), where t1 = g1⁻¹(4e⁻¹) and t2 = g1⁻¹(4 − 4e⁻¹),
after normalization, each of the physical sign parameters among the user's eye opening degree, heart rate, body movement frequency and body temperature is fitted off-line with the model
y2 = g2(t) = 2·b / (exp(4·c·(t − a)) + 1),
and the respective change rates are calculated as
k_i = 1/(t2 − t1), where t1 = g2⁻¹(1 − e⁻¹), t2 = g2⁻¹(e⁻¹), i = 2, 3, 4, 5,
where y1 and y2 are the preprocessed or normalized physical sign parameter values, t is time, a, b and c are fitting coefficients, and k_i (i = 2, 3, 4, 5) corresponds to the eye opening change rate k_eo, heart rate change rate k_h, body movement frequency change rate k_b and body temperature change rate k_p, respectively;
recording the change rate physical sign parameters obtained under each light color combination in the sleep-in efficiency mapping table;
s3, repeating the step S2 to obtain a fitting data sample set;
s4, online prediction: to predict the sleep-in efficiency factors under a specific light color combination in the field environment, the sleep-in efficiency mapping table is searched with the light color combination value; when the combination is not in the table, the change rate physical sign parameter values of the corresponding sleep-in efficiency factors are obtained by distance-weighted interpolation over the table, the distance being the Euclidean distance in the light color combination space; if the combination exists in the table, it is looked up directly to obtain the corresponding change rate physical sign parameter values,
and outputting the table look-up result through an output module.
It should be understood that, in the solution of the present invention, the illuminance among the light color parameters of the reading surface applies to reading objects without an active light source; for reading objects with a backlight, such as a mobile phone, a tablet or an e-book reader, a backlight brightness item is added to the parameter set of the light color condition in the mapping from the light color condition to the sleep-in efficiency factors.
In addition, all models related to the sleep-in efficiency factors are based on specific individuals, so the data involved in generating the network training samples, mapping tables and the like must come from one and the same user; for multiple users, a separate data set should be created and saved for each user.
The invention is applied to detect and predict the factors of sleep-in efficiency under different light environments. Once samples with sufficiently rich variation have been collected, and because the light color domain contains infinitely many combinations, the invention can predict the sleep-in efficiency parameters, including the eye opening change rate, heart rate change rate and the like, under the illumination conditions of a wide range of field environments, thereby providing a basis for finding light environments with potentially high sleep-in efficiency.
While the embodiments of the present invention have been described above, these embodiments are presented as examples and do not limit the scope of the invention. These embodiments may be implemented in other various ways, and various omissions, substitutions, and changes may be made without departing from the spirit of the invention. These embodiments and modifications are included in the scope and gist of the invention, and are also included in the invention described in the claims and the equivalent scope thereof.

Claims (10)

1. A sleep environment illumination condition identification method comprises the following steps:
s1, establishing a nonlinear mapping: a dynamic recursive Elman neural network is established in the control unit, taking 6 parameters, namely the illuminance, color temperature and x, y, z color coordinates of the reading surface light and the sleep-in duration, as input quantities, and 5 physical sign parameters, namely the user's eye opening change rate, eye closure duration change rate, heart rate change rate, body movement frequency change rate and body temperature change rate, as output quantities;
s2, obtaining a training sample set: a dimming signal is sent to the dimmable lamp group through the output module of the control unit; the light color parameters of the reading surface light, such as illuminance, color temperature and color, are collected and identified by the light color identification unit; the user's physical sign parameters, such as the eye opening change rate, eye closure duration change rate, heart rate change rate, body movement frequency change rate and body temperature change rate, are collected, processed and identified through the sleep-in identification unit and the control unit; and the light color parameter values together with the corresponding physical sign parameter values are recorded to obtain one training sample of the neural network,
repeatedly acquiring training samples to obtain a training sample set of the neural network;
wherein, the parameters of each training sample are obtained according to the following processing procedures:
continuously detecting the user's eye opening; when, within a set time length, the eye opening value stays continuously smaller than (1 − δ%) times its value at the initial stage of falling asleep, the current time is taken as the timing zero of the sleep-in duration and the sample records before this zero time are discarded, where δ can be an integer between 5 and 10,
the user's eye opening change rate k_eo, eye closure duration change rate k_ec, heart rate change rate k_h, body movement frequency change rate k_b and body temperature change rate k_p are all calculated with a moving average filter; for example, for the eye opening change rate,
k_eo|t=u = ave(dEO_{u−2}, dEO_{u−1}, dEO_u, dEO_{u+1}, dEO_{u+2}),
where ave is the mean function and dEO_u is the difference between the eye opening value at time u and that at the previous time;
s3, off-line training of the neural network: based on the obtained training sample set, an iterative learning module in the control unit iteratively adjusts the connection weight of the neural network by adopting a gradient descent method according to the actual value of the physical sign parameter and the network output value which are respectively input by the processing module and the neural network through the first connection array;
s4, online prediction: in the field environment, based on the illuminance, color temperature and x, y, z color coordinates of the current reading surface light acquired by the light color identification unit, together with the sleep-in duration, the trained neural network predicts the user's eye opening change rate, eye closure duration change rate, heart rate change rate, body movement frequency change rate and body temperature change rate, and outputs the results through the output module.
2. The method for identifying the lighting condition of the sleep environment as claimed in claim 1, wherein the dimmable light set in the sleep environment is an LED light set with at least one adjustable light property among brightness, color temperature, color and illumination angle, the light output being adjusted by the dimming driver through the PWM duty cycle of the driving current of each channel of the LED strings in the set.
3. The method for identifying the lighting condition of the sleeping environment as claimed in claim 2, wherein in step S2, in the process of obtaining the sample set: the light output of the LED lamp set is changed stepwise within the known dimming range of the dimmable lamp set, and enough data samples are acquired over multiple sleep-in detections by continuously changing the working point in the illumination vector space, wherein the sampling points may be sparse near the extremes of each light color variable and denser in regions such as the low color temperature region around 3000 K and illuminance between 100 lx and 300 lx.
4. The method for recognizing the lighting condition of the sleeping environment as claimed in claim 1, wherein the step S4 is preceded by the steps of:
t1, establishing, in the control unit, a mapping table from the combinations of the pitch angle and roll angle corresponding to the illuminated surface orientation to the light color parameter values,
the pitch angle and the roll angle being the rotation angles, in a world coordinate system, of a light color sensor kept parallel to the illuminated surface; in the sleeping scene, a bracket is arranged near the user in this coordinate system, and the light color sensor is connected to the bracket through a pitch plate, a roll plate and a first connecting piece in sequence,
the step S4 further includes the following steps:
t2, in the field environment, if the combination of the pitch angle and roll angle corresponding to the orientation of the reading surface is not in the mapping table, obtaining the corresponding light color parameter values by distance-weighted interpolation in the angle combination space according to the mapping table; otherwise, if the combination is stored in the mapping table, looking up the table directly to obtain the corresponding light color parameter values.
5. The method for recognizing sleep environment lighting condition as claimed in claim 1, wherein 5 characteristic parameters of the neural network output are obtained in step S2 through the following processes:
periodically acquiring and recording, based on the sleep-in identification unit, the change of the physical sign parameters during the falling-asleep process under various illumination conditions; for the recorded physical sign parameter sequence of each falling-asleep process,
the user's eye closure duration y1 is preprocessed by clipping it at 4,
y1 = min(y1, 4),
then an off-line data fit is performed with the model
y1 = g1(t) = 8·b / (exp(4·c·(a − t)) + 1),
and the eye closure duration change rate is calculated as
k_ec = k_1 = 1/(t2 − t1), where t1 = g1⁻¹(4e⁻¹) and t2 = g1⁻¹(4 − 4e⁻¹);
after normalization, each of the physical sign parameters among the user's eye opening degree, heart rate, body movement frequency and body temperature is fitted off-line with the model
y2 = g2(t) = 2·b / (exp(4·c·(t − a)) + 1),
and the respective change rates are calculated as
k_i = 1/(t2 − t1), where t1 = g2⁻¹(1 − e⁻¹), t2 = g2⁻¹(e⁻¹), i = 2, 3, 4, 5;
wherein y1 and y2 are the preprocessed or normalized physical sign parameter values, t is time, a, b and c are fitting coefficients, and k_i (i = 2, 3, 4, 5) corresponds to the eye opening change rate k_eo, heart rate change rate k_h, body movement frequency change rate k_b and body temperature change rate k_p, respectively.
6. The method for identifying the illumination condition of the sleep environment according to any one of claims 1 to 5, wherein the neural network model is:
x_ck(t) = x_k(t − mod(k, q) − 1),
x_j(t) = f(Σ_{k=1…m} w_jk·x_ck(t) + Σ_{i=1…5} w_ji·u_i(t−1) − θ_j),
y_h(t) = f(Σ_{j=1…m} w_hj·x_j(t) − θ_h),
where mod is the remainder function and f(·) is the sigmoid function; x_ck(t) is the context (receiving) layer output, x_j(t) is the hidden layer output, u_i(t−1) and y_h(t) are the input layer input and the output layer output, w_hj, w_jk and w_ji are respectively the hidden-to-output, context-to-hidden and input-to-hidden connection weights, and θ_h and θ_j are the output layer and hidden layer thresholds; k = 1, 2, …, m; q is the selected regression delay scale, tuned according to the sampling period; j = 1, 2, …, m; i = 1, 2, …, 5; the number m of hidden and context layer nodes can be chosen between 12 and 25; h = 1, 2, …, 5;
the training uses a gradient descent method.
7. The method for identifying the illumination condition of the sleep environment according to claim 1, wherein the sleep-in identification unit comprises an image acquisition module, a wearable module and a sleep-in judgment module, wherein the image acquisition module adopts a depth camera to acquire images;
the step S2 includes the following processing procedures:
the image processing part in the sleep-in judgment module continuously detects the user's eye opening, while the heart rate calculating part, the body movement frequency calculating part and the body temperature calculating part calculate the heart rate, body movement frequency and body temperature from the human body sensing signals acquired by the wearable module,
the data fusion processing part in the sleep-in judgment module performs data fusion on the physical sign parameters output by the image processing part, the heart rate calculating part, the body movement frequency calculating part and the body temperature calculating part, eliminating the inconsistent parts of the data set,
and rotating a holder supporting the camera according to the processing result of the image processing part to align the camera with the face of the user.
8. The method for identifying the lighting condition of the sleeping environment as claimed in claim 1, further comprising: setting a display bar in the output module and using it to indicate, in turn, the values of the current user's sleep-in efficiency factors; and outputting the obtained values of the sleep-in efficiency factors to the outside through a communication interface module.
9. The method for recognizing the illumination condition of the sleep environment as claimed in claim 1, wherein the neural network further takes a fatigue index as an additional input quantity, the user inputting the fatigue index through a key in the user interface unit according to the current degree of fatigue; during training sample collection, data sampling and sample recording can be suspended by pressing a cancel-sampling key.
10. A method for identifying a lighting condition of a sleeping environment as claimed in claim 1, wherein the neural network module further takes as input a time period parameter obtained from a real-time clock module, the time period being, for example, noon or night; the neural network module can also add the temperature and humidity acquired from a temperature and humidity measurement module as two further inputs; and the neural network module may also add a noise level parameter obtained from a noise measurement module as an input.
CN202110899472.5A 2019-04-24 2019-04-24 Sleep environment illumination condition identification method Withdrawn CN113598722A (en)

Priority Applications (1)

Application CN202110899472.5A, priority date 2019-04-24, filing date 2019-04-24: Sleep environment illumination condition identification method

Applications Claiming Priority (2)

Application CN201910335756.4A, filed 2019-04-24: Sleep environment illumination condition identification method
Application CN202110899472.5A, filed 2019-04-24: Sleep environment illumination condition identification method

Related Parent Applications (1)

Application CN201910335756.4A (Division), filed 2019-04-24: Sleep environment illumination condition identification method

Publications (1)

Publication CN113598722A, published 2021-11-05

Family ID: 67192379

Family Applications (3)

Application CN202110899472.5A (this application), status Withdrawn: Sleep environment illumination condition identification method
Application CN202110899451.3A, status Withdrawn: Sleep environment illumination condition identification method
Application CN201910335756.4A, status Active: Sleep environment illumination condition identification method

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110933804B (en) * 2019-11-29 2021-10-12 广东洲明节能科技有限公司 Lamp, and lamp angle control system and method
CN113273967A (en) * 2021-05-20 2021-08-20 贵州优品睡眠健康产业有限公司 Sleep sign monitoring system
CN114576840B (en) * 2021-11-25 2023-06-23 珠海格力电器股份有限公司 Method, electronic equipment and medium for shutdown based on WIFI channel state detection

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6995355B2 (en) * 2003-06-23 2006-02-07 Advanced Optical Technologies, Llc Optical integrating chamber lighting using multiple color sources
KR100646868B1 (en) * 2004-12-29 2006-11-23 삼성전자주식회사 Home control system and method using information of galvanic skin response and heart rate
US10610153B2 (en) * 2014-07-21 2020-04-07 Withings System and method to monitor and assist individual's sleep
JP6909018B2 (en) * 2017-03-01 2021-07-28 任天堂株式会社 Light emission control device and electronic equipment
CN107601083B (en) * 2017-09-19 2019-04-02 中国计量大学 Straight weight-loss type material baiting method neural network based
CN108712809B (en) * 2018-05-18 2019-12-03 浙江工业大学 A kind of luminous environment intelligent control method neural network based
CN108958047A (en) * 2018-07-09 2018-12-07 西安交通大学 A kind of intelligent sleep system and its working method
CN109106349A (en) * 2018-08-09 2019-01-01 上海常仁信息科技有限公司 A kind of domestic consumer's sleep monitor system
CN109199336A (en) * 2018-09-30 2019-01-15 深圳个人数据管理服务有限公司 A kind of sleep quality quantization method, device and equipment based on machine learning

Also Published As

Publication number Publication date
CN113842119A (en) 2021-12-28
CN110013231B (en) 2021-08-24
CN110013231A (en) 2019-07-16


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication
Application publication date: 20211105