US20140316192A1 - Biofeedback Virtual Reality Sleep Assistant - Google Patents
- Publication number
- US20140316192A1 (application US14/254,348)
- Authority
- US
- United States
- Prior art keywords
- physiological
- virtual environment
- immersive virtual
- immersive
- sleep
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- A61M21/02—Devices for inducing sleep or relaxation, e.g. by direct nerve stimulation, hypnosis, analgesia
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
- A61B5/02055—Simultaneously evaluating both cardiovascular condition and temperature
- A61B5/369—Electroencephalography [EEG]
- A61B5/375—Electroencephalography [EEG] using biofeedback
- A61B5/486—Bio-feedback
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- A61B5/01—Measuring temperature of body parts; diagnostic temperature sensing, e.g. for malignant or inflamed tissue
- A61B5/015—Temperature mapping of a body part
- A61B5/02438—Detecting, measuring or recording pulse rate or heart rate with portable devices, e.g. worn by the patient
- A61B5/0816—Measuring devices for examining respiratory frequency
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/389—Electromyography [EMG]
- A61B5/6803—Sensor mounted on worn items: head-worn items, e.g. helmets, masks, headphones or goggles
- A61B5/6804—Sensor mounted on worn items: garments; clothes
- A61B5/7475—User input or interface means, e.g. keyboard, pointing device, joystick
- A61M2021/0027—Inducing sleep or relaxation by the use of a particular sense or stimulus: the hearing sense
- A61M2021/005—Inducing sleep or relaxation by the use of a particular sense or stimulus: the sight sense (images, e.g. video)
- A61M2230/005—Measured parameter of the user used as control input for the apparatus
- A61M2230/06—Heartbeat rate only
- A61M2230/10—Electroencephalographic signals
- A61M2230/42—Respiratory characteristics: rate
- A61M2230/50—Temperature
- A61M2230/60—Muscle strain, i.e. measured on the user
- G06F2203/011—Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
- G06F2203/013—Force feedback applied to a game
- G06F2203/015—Force feedback applied to a joystick
Definitions
- Insomnia is the most common sleep disorder. It is considered a hyper-arousal disorder in which both the cognitive and physiological domains are over-activated. Research has shown that insomnia is associated with elevated autonomic nervous system activation, particularly at sleep onset, which can adversely impact a person's health and well-being in a number of ways. Sleep onset in insomniacs is characterized by high levels of cognitive activity, worry, rumination, and intrusive thoughts that, together with the autonomic hyperactivation, impede the onset of sleep. Predisposing factors that can increase a person's vulnerability to insomnia include age, gender, coping strategy, personality traits, and genetic factors.
- Insomnia can be triggered by acute stressful events, such as illness or trauma; it can be a chronic disorder without specific cause, or can be a symptom of other disorders.
- Perpetuating factors such as the use of caffeine or alcohol, excessive worry, and irregular wake/sleep schedules, may contribute to the development and persistence of insomnia.
- Cognitive-Behavioral Therapy (CBT) and pharmacotherapy are the two main lines of treatment that are currently available for insomnia.
- Many insomnia sufferers do not wish to use pharmacotherapy, and the availability of CBT is limited.
- FIG. 1 is a simplified depiction of a person using an embodiment of a biofeedback virtual reality sleep assistant as disclosed herein;
- FIG. 2 is a simplified block diagram of at least one embodiment of a computing environment for the sleep assistant of FIG. 1 ;
- FIG. 3 is a simplified module diagram illustrating an environment of at least one embodiment of the sleep assistant of FIG. 1 in operation;
- FIG. 4 is a simplified flow diagram of at least one embodiment of a method for promoting sleep with the sleep assistant of FIG. 1 ;
- FIG. 5 is a simplified plot illustrating diaphragmatic breathing at approximately 6 breaths per minute during use of at least one embodiment of the sleep assistant of FIG. 1 prior to the onset of sleep; breathing data recorded by the piezoelectric bands and the IMU sensor are overlapped to illustrate the reliability of the computing device (e.g., a smart phone) in detecting breathing rate under slow breathing conditions;
- FIG. 6 is a simplified plot illustrating normal breathing frequency during a period of sleep, immediately following sleep onset; breathing data recorded by the piezoelectric bands and the IMU sensor are overlapped to illustrate the reliability of the computing device (e.g., a smart phone) in detecting breathing rate under normal breathing conditions; and
- FIGS. 7-9 are simplified plots of illustrative test results obtained during the use of at least one embodiment of the sleep assistant of FIG. 1 .
- Insomniacs are characterized by elevated levels of physiological arousal (e.g. high heart rate, elevated high frequency electroencephalographic activity) together with cognitive hyperactivation (e.g. anxiety, worry, rumination, intrusive thoughts), particularly at sleep onset.
- the bed and bedroom can become associated with a disturbed sleep pattern.
- entry into the familiar bedroom environment can become a conditioned cue that perpetuates and increases the severity of insomnia.
- virtual reality scenarios can be designed to remove individuals from their undesirable sleep environment by immersing them in a new, peaceful and relaxing environment, distracting them from other factors that might contribute to insomnia, such as worry and rumination.
- biofeedback techniques can be incorporated into a virtual reality system to promote psychophysiological relaxation (by reducing physiological hyper-arousal) and thus promote their sleep.
- some of the disclosed embodiments focus the application of biofeedback and virtual reality techniques on the period of time prior to sleep onset.
- the sleep onset period generally refers to the time period beginning with “lights out,” when the person begins the process of trying to fall asleep, and continuing up to the point of loss of consciousness, e.g., when the person enters the initial sleep state, which usually occurs before the polysomnography (PSG) sleep onset.
- some of the disclosed embodiments are directed to helping people guide themselves across the sleep onset process to promote the transition from the conscious (awake) to the unconscious level (sleep).
- aspects of the disclosed embodiments apply biofeedback and virtual reality techniques to make the process of falling asleep easier.
- an embodiment of a biofeedback virtual reality system 100 includes a virtual reality device 240 and a wearable sensor device 210 .
- the device 210 may be embodied as a mobile computing device, as shown in FIG. 1 , or as a wearable smart-sensor (e.g. an IMU or inertial measurement unit) that communicates wirelessly with a mobile computing device (such as a smart phone lying on a table next to the person).
- the device 210 may include two parts: (1) a wearable sensor and (2) a mobile computing device, where the mobile computing device is separate from the sensor and may interface with the sensor wirelessly (e.g., by WIFI, BLUETOOTH, or other suitable wireless or optical communication technology); or the device 210 may include one part (e.g., a mobile computing device with an integrated sensor).
- the mobile computing device and/or the smart-sensor communicates wirelessly with the virtual reality device 240 (e.g., eye wear and headphones).
- the virtual reality device 240 immerses a person in a virtual reality environment 116 .
- the wearable smart sensor together with the mobile computing device 210 operates a sleep assistant computer application 218 that applies biofeedback technology to the presentation of the immersive virtual environment 116 , in order to target a state of hyper-arousal (or hyper-activation) being experienced by the person using the system 100 (“user”).
- the system 100 creates a positive biofeedback loop in which the virtual reality device 240 provides cognitive relaxation/distraction and the biofeedback technology embodied in the sleep assistant application 218 promotes sleep by providing positive feedback (by modulating the degree of immersion in the virtual environment 116 ) in response to physiological signals indicating the user's current level of relaxation.
- the system 100 uses the biofeedback technology to modulate the virtual environment so as not to disturb the person once he or she has fallen asleep.
- the system 100 can control various aspects of the user's surrounding physical (real-world) environment, in response to the biofeedback signals.
- the system 100 can communicate with various smart devices in the room, such as devices that provide or reduce ambient lighting, including an alarm clock, shades, and/or other devices, to reduce distractions that may be introduced by such devices.
- the system 100 can use the biofeedback signals (e.g., the user's breathing rate) to automatically change an aspect of the user's physical environment; e.g., if the user slows down his or her breathing rate, the system 100 may decrease the brightness of the room.
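For illustration only, the following minimal Python sketch shows how a breathing-rate reading might be translated into a room-brightness command for a smart lighting device. The thresholds, the `set_brightness` method, and the function names are assumptions, not part of the disclosure.

```python
# Hypothetical sketch: dim the room as the user's breathing rate falls toward
# a relaxed target. Thresholds and the smart-light API are assumed values.

def brightness_for_breathing_rate(breaths_per_min: float,
                                  baseline_bpm: float = 14.0,
                                  relaxed_bpm: float = 6.0) -> float:
    """Map a breathing rate onto a 0.0 (dark) .. 1.0 (full) brightness level."""
    level = (breaths_per_min - relaxed_bpm) / (baseline_bpm - relaxed_bpm)
    return max(0.0, min(1.0, level))


def update_room_lighting(smart_light, breaths_per_min: float) -> None:
    # `smart_light` is assumed to expose a set_brightness(level) method.
    smart_light.set_brightness(brightness_for_breathing_rate(breaths_per_min))
```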
- the immersive environment 116 is “virtual” in the sense that it includes computer-synthesized elements that are presented to the user in place of or in conjunction with the real-world environment.
- “virtual reality” or “VR” may refer to, among other things, virtual reality, augmented reality, enhanced reality, and/or other types of interactive or non-interactive computer-generated immersive user experiences.
- the immersive virtual environment 116 may include a visual display 118 that presents a series of animated two- or three-dimensional visual elements (e.g., graphical elements 122 , 124 , 126 , 128 ) against a relatively static graphical backdrop (e.g., element 120 ), and the person experiences the visual display 118 passively (e.g., by viewing only).
- the system 100 may allow the user to interact with the elements presented in the visual display 118 (e.g., via an “avatar” or by the user directly navigating in the scenario using, for instance, gaze, gestures or pointing devices).
- the system 100 may permit the user to move objects around in the virtual environment, or to interact with or insert themselves into the virtual environment as an avatar.
- a “sleep mask” configured as a virtual reality device 240 can detect the user's gaze and move the avatar in the same direction as the gaze, allowing the user to navigate in the virtual environment by moving his or her eyes.
- the system 100 may change the point of view or focus, or zoom in or zoom out in a particular direction, in response to the user's gestures or other body movements, or may rotate, pan, or otherwise adjust the virtual scene in response to a characteristic of the person's gaze (e.g., gaze direction, pupil dilation, etc.), or in response to other detected bio-signals (e.g., reduction in muscle activity).
- the user is an active participant in controlling the virtual environment and synthesized sounds.
- greater physiological relaxation leads to a more pleasant environment (e.g., increased immersion in the virtual reality) and thus promotes more cognitive relaxation/distraction.
- the sleep assistant 218 responds to increased physiological relaxation (as detected by, e.g., a reduction in breathing rate) by increasing the degree of virtual immersion (for example, by providing pleasant/relaxing sounds or adding visual elements that increase the user's “sense of presence” in the immersive environment).
- the increased degree of virtual immersion then leads to even greater cognitive relaxation/distraction (e.g., the person is now fully immersed in the virtual environment and has forgotten all worries, ruminations, etc.), which promotes sleep.
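As a rough sketch of this positive feedback loop (the parameter names and constants here are illustrative assumptions, not the patent's implementation), the degree of immersion can be treated as a value between 0 and 1 that is eased toward a target implied by the latest breathing-rate sample:

```python
# Hypothetical sketch of the biofeedback loop: slower breathing -> higher
# target immersion; the presented immersion level moves gradually toward it.

def target_immersion(breaths_per_min: float,
                     baseline_bpm: float = 14.0,
                     relaxed_bpm: float = 6.0) -> float:
    """Return a value in [0, 1]; 0 = baseline arousal, 1 = fully relaxed."""
    degree = (baseline_bpm - breaths_per_min) / (baseline_bpm - relaxed_bpm)
    return max(0.0, min(1.0, degree))


def feedback_step(current_immersion: float, breaths_per_min: float,
                  smoothing: float = 0.2) -> float:
    """One loop iteration: ease the immersion level toward the new target so
    the virtual environment changes gradually rather than abruptly."""
    target = target_immersion(breaths_per_min)
    return current_immersion + smoothing * (target - current_immersion)
```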
- the virtual environment 116 is immersive in the sense that it is designed to attract the user's attention by increasing the user's sense of presence in a virtual world and by removing distractions that may occur in the surrounding real-world scene, e.g., by occluding the background and/or restricting the user's peripheral vision.
- the system 100 may achieve the immersive nature of the virtual environment 116 by presenting the visual display 118 , playing an audio soundtrack 130 , presenting a combination of the visual display 118 and the audio soundtrack 130 , and/or providing other sensory stimuli.
- the level of brightness of the visual stimulation provided by the visual display 118 is low, in order to avoid any alterations in hormone production (e.g. to avoid changes in melatonin).
- the illustrative immersive virtual environment 116 includes a combination of visual 118 and audio 130 stimuli, but other embodiments may include other types of sensory stimuli, such as tactile, temperature, taste, smell, and others, alternatively or in addition to the visual 118 and audio 130 stimuli.
- some embodiments of the virtual environment 116 only include visual stimuli while other embodiments only include audio stimuli.
- the system 100 coordinates the presentation of the various sensory stimuli with physiological information in real time to create a state of relaxation in the person experiencing the immersive virtual environment 116 .
- the system 100 may increase or decrease any of a number of features of any of the sensory stimuli, or selectively turn different sensory stimuli on and off, over time in response to changes in the person's physiological parameters.
- “in real time” may refer to, among other things, the fact that an automated biofeedback process occurs in response to sensed physiological information about the person using the system 100 , during a period in which the person is using the system 100 .
- the illustrative system 100 changes one or more aspects of the immersive virtual environment 116 directly in response to changes in the sensed physiological information, using biofeedback technology based on user actions that is designed to promote sleep.
- the mobile/wearable computing device 210 and/or the virtual reality device 240 analyze one or more physiological parameters that are obtained or derived from sensor signals.
- physiological parameters may refer to, among other things, breathing rate (respiration rate) (e.g., breaths per minute), heart rate (e.g., beats per minute), brain activity (e.g. electroencephalographic signals), body movements, muscle activity; or any other type of measurable human physiological activity, or any combination of the foregoing.
- respiration rate e.g., breaths per minute
- heart rate e.g., beats per minute
- brain activity e.g. electroencephalographic signals
- body movements e.g. electroencephalographic signals
- body movements e.g. electroencephalographic signals
- Different physiological parameters may have different roles in modifying the various aspects of the virtual environment (e.g., breathing rate can guide the speed of the navigation in the virtual environment whereas the muscle tone may guide the density of the virtual elements presented in the immersive environment 116 ).
- for example, in response to a change in one parameter (e.g., a slowing of the breathing rate), the system 100 can reduce the speed of the fish swimming in an aquatic scene (without changing other aspects of the environment 116); and if, at the same time, the user reduces his or her muscle activity, the system 100 can increase the number of fish swimming in the visual scene.
- different physiological parameters can be linked with different aspects of the immersive scenario 116 using feedback on different bio-signals, in order to potentially increase the user's relaxation.
- the illustrative visual display 118 is embodied as a three-dimensional (3D) display of visual elements.
- the visual elements depict an aquatic scene and include a background 120 (e.g., water), a background element 128 (e.g., coral), and a number of foreground elements 122 (e.g., fish), 124 (air bubbles), 126 (e.g., rocks).
- the system 100 can adjust the presentation of any of the visual elements 120 , 122 , 124 , 126 , 128 , or add or remove visual elements, in response to changes in physiological parameters.
- each of the visual elements has a number of features, including speed (e.g., the rate at which the element moves across the display), quantity (e.g., the number of elements of a certain type presented on the display), density (e.g., the number of elements presented in a certain area of the display), frequency (e.g., the rate at which elements of a certain type are presented), color, brightness, contrast, direction of movement, depth (in 3D), focus (e.g. clarity), point of view, and/or complexity (e.g., amount of finer-grain details depicted in the element).
- the system 100 can modify any of these and/or other features of the visual elements depicted in the visual display 118 , based on the user's physiological parameters.
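A minimal sketch of how such feature modulation might look in code follows; the feature set, ranges, and scale factors below are assumptions chosen only to illustrate tying different features to different physiological parameters.

```python
# Hypothetical sketch: an element's adjustable features, with breathing rate
# driving movement speed and muscle activity driving the element count.

from dataclasses import dataclass


@dataclass
class VisualElementFeatures:
    speed: float       # rate at which the element moves across the display
    quantity: int      # number of elements of this type on screen
    brightness: float  # 0.0 (dark) .. 1.0 (full)


def modulate_fish(features: VisualElementFeatures,
                  breaths_per_min: float,
                  muscle_activity: float) -> VisualElementFeatures:
    """Slower breathing slows the fish; lower muscle activity adds more fish."""
    features.speed = 0.1 * breaths_per_min                        # assumed scaling
    features.quantity = int(20 * (1.0 - min(1.0, max(0.0, muscle_activity))))
    return features
```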
- while the illustrative visual display 118 depicts an aquatic scene, any type of visual display that is designed or selected to promote sleep may be used.
- an ocean, sky, or forest scene may be used, or the visual display 118 may be configured according to the preferences of a particular user of the system 100 .
- with respect to FIG. 1, it should be understood that in operation, the visual display 118 is actually displayed in the virtual reality viewing glasses 112, but is shown as projected in order to better illustrate the details described above.
- the illustrative audio soundtrack 130 includes a number of audio elements, which may include various types of sounds (e.g., spoken words, music, nature sounds, etc.) or a combination thereof.
- the audio elements are sounds that are coordinated with the visual display 118 (e.g., water flowing and bubbling sounds); however, the audio soundtrack can include any type of audio selected or configured to promote sleep, including selections from the user's digital music library.
- the system 100 can adjust the presentation of any of the audio elements, or add or remove audio elements, in response to changes in physiological parameters.
- Each of the audio elements has a number of features, including volume, content (e.g., words, sounds, and/or music), speed (e.g., tempo), complexity (e.g., number of different types or layers of sound), degree of “surround sound,” and/or intensity (e.g., acoustic intensity).
- the system 100 can modify any of these and/or other features of the audio elements 130 based on the user's physiological parameters.
- the illustrative wearable smart-sensor and mobile computing device 210 includes a computing device 110 (e.g., a smartphone, a tablet computer, an attachable/detachable electronic device such as a clip-on device, a smart watch, smart glasses, a smart wristband, smart jewelry, and/or smart apparel) and a positioner 132 (e.g., a strap, tether, clip, VELCRO® tab, etc.).
- any type of computing device that includes a processor and memory and can interact with the virtual reality device 240 in a relatively non-intrusive manner (e.g., without causing discomfort to the user) may be used.
- the positioner 132 is configured to secure the mobile or wearable computing device 210 in a position in which a sensor 232 ( FIG. 2 ) of the device 210 can detect the user's physiological activity and generate physiological signals representing the user's physiological activity.
- the positioner 132 may be omitted, in some embodiments.
- the sensors used to detect the user's physiological activity do not need to be attached to or worn by the person.
- the physiological sensor 232 can be incorporated in the person's bed or mattress, or into a mattress pad or bed linens (e.g., a fitted sheet). Additional details of the mobile or wearable computing device 210 are described below with reference to FIG. 2 .
- the illustrative virtual reality device 240 includes a visual display system 112 and an audio delivery system 114 .
- the illustrative visual display system 112 is embodied as commercially available virtual reality eyewear.
- Other embodiments of the visual display system 112 utilize other types of visual display systems, such as high-definition video glasses, non-rigid sleep masks adapted for virtual reality, televisions, projection systems (to project a display of visual elements onto a wall or ceiling), or holograms.
- the illustrative audio delivery system 114 is embodied as commercially available bone-conducting headphones. Other embodiments of the audio delivery system 114 use other methods of audio delivery, such as conventional audio headphones (e.g., earbuds), three-dimensional (3D) surround sound systems, remote speakers, indoor waterfall systems or fountains, and/or other electronically-controllable noise-making devices.
- the components of the system 100 are in communication with each other as needed by suitable hardware and/or software-based communication mechanisms, which may be enabled by an application programming interface, operating system components, a network communication subsystem, and/or other components. Additional details of the virtual reality device 240 are described below with reference to FIG. 2 .
- the illustrative environment 200 includes the mobile or wearable computing device 210 and the virtual reality device 240 , which are in communication with one or more smart devices 266 and/or one or more server computing devices 280 via one or more networks 264 .
- the biofeedback VR sleep assistant 218 is, illustratively, embodied as a distributed application including “front end” components that are local to each of the devices 210 , 240 , 270 and including “back end” portions that reside on the server(s) 280 (e.g., “in the cloud”).
- a library or searchable database of selectable immersive virtual environments 116 may be distributed across the network 270 .
- the immersive virtual environments 222 , 252 , 292 may include a number of different virtual environments 116 , or copies or portions of particular virtual environments 116 .
- other portions of the system 100 may be distributed on various devices 210 , 240 , 280 across the network 270 , such as mapping functions 234 .
- the sleep assistant 218 , the mapping function(s) 234 , and the immersive virtual environment(s) 222 , 252 , 292 may be stored entirely on the mobile or wearable computing device 210 or entirely on the virtual reality device 240 .
- portions of the sleep assistant 218 , the mapping function(s) 234 or the immersive virtual environment(s) 222 , 252 , 292 may be incorporated into other systems or interactive software applications.
- Such applications or systems may include, for example, operating systems, middleware or framework (e.g., application programming interface or API) software, and/or user-level applications software.
- the mobile or wearable computing device 210 may be embodied as any type of computing device that is capable of performing the functions described herein (e.g., modulating the presentation of the immersive virtual environment 116 based on physiological signals).
- the devices 210 , 240 may be integrated as a unitary device.
- Such a unitary device may also include one or more physiological sensors 232 , 262 .
- the illustrative mobile or wearable computing device 210 includes at least one processor 212 (e.g. a controller, microprocessor, microcontroller, digital signal processor, etc.), memory 214 , and an input/output (I/O) subsystem 216 .
- processor 212 may include separate baseband and applications processors.
- the baseband processor interfaces with other components of the mobile or wearable computing device 210 and/or external components to provide, among other things, wireless communication services, such as cellular, BLUETOOTH, WLAN, and/or other communication services.
- the applications processor handles processing required by software and firmware applications running on the mobile or wearable computing device 210 , as well as interfacing with various sensors and/or other system resources.
- the baseband processor may handle features typically handled by the applications processor and vice versa, in some embodiments.
- the I/O subsystem 216 typically includes, among other things, an I/O controller, a memory controller, and one or more I/O ports.
- the processor 212 and the I/O subsystem 216 are communicatively coupled to the memory 214 .
- the memory 214 may be embodied as any type of suitable computer memory device (e.g., volatile memory such as various forms of random access memory).
- the I/O subsystem 216 is communicatively coupled to a number of components, including a user interface subsystem 224 .
- the user interface subsystem 224 includes one or more user input devices (e.g., a microphone, a touchscreen, keyboard, virtual keypad, etc.) and one or more output devices (e.g., audio speakers, displays, LEDs, etc.).
- the I/O subsystem 216 is also communicatively coupled to a data storage device 220 , a communications subsystem 230 , and the physiological sensor(s) 232 , as well as the biofeedback VR sleep assistant 218 .
- the data storage device 220 may include one or more hard drives or other suitable persistent data storage devices (e.g., flash memory, memory cards, memory sticks, and/or others).
- the physiological sensing devices 232 may include motion sensors, pressure sensors, kinetic sensors, temperature sensors, biometric sensors, and/or others, and may be integrated with or in communication with the mobile or wearable computing device 210 .
- the sensing device 232 may be embodied as an inertial measurement unit (IMU) sensor of the mobile or wearable computing device 210 , and as such may include a multiple-axis gyroscope and a multiple-axis accelerometer.
- a respiratory effort sensor such as a piezo sensor band or a respiratory transducer, may be in communication with or embodied in the computing device 210 , alternatively or in addition to the IMU.
- Portions of the sleep assistant 218 , the mapping function(s) 234 , and the immersive virtual environment(s) 222 reside at least temporarily in the data storage device 220 .
- the virtual environments 222 may include a subset of the library of virtual environments 292 , where the subset 222 has been selected by the user or provided as part of a base configuration of the sleep assistant 218 or the computing device 210 .
- Portions of the sleep assistant 218 , the mapping function(s) 234 , and the immersive virtual environment(s) 222 may be copied to the memory 214 during operation of the mobile or wearable computing device 210 , for faster processing or other reasons.
- the communication subsystem 230 may communicatively couple the mobile or wearable computing device 210 to other computing devices and/or systems by, for example, a cellular network, a local area network, wide area network (e.g., Wi-Fi), personal cloud, virtual personal network (e.g., VPN), enterprise cloud, public cloud, Ethernet, and/or public network such as the Internet.
- the communication subsystem 230 may, alternatively or in addition, enable shorter-range wireless communications between the mobile or wearable computing device 210 and other computing devices (such as the virtual reality device 240 ), using, for example, BLUETOOTH and/or Near Field Communication (NFC) technology.
- the communication subsystem 230 may include one or more optical, wired and/or wireless network interface subsystems, cards, adapters, or other devices, as may be needed pursuant to the specifications and/or design of the particular mobile or wearable computing device 210 .
- the communication subsystem 230 may include a telephony subsystem, which enables the computing device to provide telecommunications services (e.g., via the baseband processor).
- the telephony subsystem generally includes a longer-range wireless transceiver, such as a radio frequency (RF) transceiver, and other associated hardware (e.g., amplifiers, etc.).
- the user interface subsystem 224 includes an audio subsystem 226 and a visual subsystem 228 .
- the audio subsystem 226 may include, for example, an audio CODEC, one or more microphones, and one or more speakers and headphone jacks.
- the visual subsystem 228 may include, for example, personal viewing glasses, projection devices, holograms, televisions, liquid crystal display (LCD) screens, light emitting diode (LED) screens, or other visual display devices.
- the one or more physiological sensor(s) 232 initially detect the user's “baseline” physiological parameters (e.g., the user's actual measured parameters at the beginning of a sleep promotion session).
- the system 100 presents an initial immersive virtual environment 116 and enters “feedback mode,” in which the sensor(s) 232 periodically detect the physiological responses of the user to the presented immersive virtual environment 116 , and provide the sleep assistant 218 with physiological signals that can be used by the sleep assistant 218 to determine the user's state of relaxation as it changes over time.
- the physiological signals output by the sensor(s) 232 may include signals that represent respiration rate, heart rate, brain activity (e.g. electroencephalogram (EEG)), body temperature, and/or other physiological parameters.
- the sensor 232 may be embodied as an IMU built into the computing device 210 or the virtual reality device 240 , which is used to measure the user's breathing rate by detecting the rise and fall of the user's chest or abdomen over time during normal respiration.
- the physiological sensor 232 can include measurement tools that are external to the computing device 210 but which are in communication with the device 210 .
- An example of an external physiological sensor 232 is “textile electrodes,” which are formed by knitting or weaving conductive fibers into apparel or garments. Textile electrodes can pick up signals from the heart and other muscles. The physiological activity sensed by the textile electrodes is transmitted through the conductive fibers that are woven into the garment to a processing unit, which then passes the received signals to the mobile or wearable computing device 210 , generally through a wireless data connection.
- the virtual reality device 240 may be embodied as any type of device that is capable of performing the functions described herein (e.g., presenting the immersive virtual environment 116 to the user).
- the illustrative virtual reality device 240 is equipped with an audio subsystem 256 and a visual subsystem 258 , which may be embodied similarly to the audio subsystem 226 and the visual subsystem 228 described above.
- the virtual reality device 240 may be embodied with components similar to those of the mobile or wearable computing device 210 .
- the virtual reality device 240 has a processor 242 , memory 244 , and an I/O subsystem 246 similar to the mobile or wearable computing device 210 .
- elements of the virtual reality device 240 having the same or similar name as elements of the mobile or wearable computing device 210 may be embodied similarly, and description of those elements is not repeated here.
- the virtual reality device 240 may include other components as needed to control or provide other various forms of sensory stimuli, such as an ambient temperature controller subsystem, an aroma subsystem, and/or an air movement subsystem.
- the virtual reality device 240 is comprised of separate devices.
- wearable personal viewing glasses and headphone ear buds may be separate components of the virtual reality device 240 , or may be integrated into a single device (e.g., GLASS by Google, Inc. or a similar device).
- the smart device(s) 266 may be embodied as any type of electronic device capable of performing the functions described herein (e.g., controlling an aspect of the user's physical environment).
- the smart device(s) 266 may include smart lighting, heating, cooling, sound, and/or entertainment systems.
- the smart device(s) 266 may include components similar to those described above, or may simply include control circuitry to process control signals received from the sleep assistant 218 and adjust a parameter of the device 266 (e.g., light intensity, room temperature, sound volume, etc.).
- Elements of the smart device(s) 266 having the same or similar name as elements of the mobile or wearable computing device 210 may be embodied in a similar manner and according to the requirements of the particular smart device 266 . As such, the description of the similar elements is not repeated here.
- the virtual sleep assistant 218 is communicatively coupled to I/O subsystem 276 , data storage 274 , user interface subsystem 276 , and communication subsystem 278 .
- Data storage 274 is used to store portions of the mapping function(s) 234 and the immersive virtual environment(s) 292 .
- the server(s) 280 may be embodied as any type of computing device capable of performing the functions described herein (e.g., storing portions of the immersive virtual environments 292 and/or executing portions of the sleep assistant 218 ).
- the server(s) 280 may include components similar to those described above. Elements of the server 280 having the same or similar name as elements of the mobile or wearable computing device 210 may be embodied in a similar manner and according to the requirements of the server 280 . As such, the description of the similar elements is not repeated here.
- the virtual sleep assistant 218 is communicatively coupled to I/O subsystem 286 , data storage 290 , user interface subsystem 294 , and communication subsystem 296 .
- Data storage 290 is used to store portions of the mapping function(s) 234 and the immersive virtual environment(s) 292 .
- the computing environment 200 may include other components, sub-components, and devices not illustrated in FIG. 2 for clarity of the description.
- the components of the environment 200 are communicatively coupled as shown in FIG. 2 by electronic signal paths, which may be embodied as any type of wired or wireless signal paths capable of facilitating communication between the respective devices and components.
- the biofeedback VR sleep assistant 218 is shown in more detail, in the context of an environment 300 that may be created during the operation of the computing system 100 (e.g., an execution or “runtime” environment).
- the sleep assistant 218 is embodied as a computer application.
- “application” or “computer application” may refer to, among other things, any type of computer program or group of computer programs, whether implemented in software, hardware, or a combination thereof, and includes operating system programs, middleware (e.g., APIs, runtime libraries, utilities, etc.), self-contained software applications, or a combination of any of the foregoing.
- the sleep assistant 218 is embodied as a number of computerized modules and data structures including a physiological signal acquisition module 312 , a physiological signal processing module 314 , a physiological parameter mapping module 316 , an immersive environment control module 318 , a data store including a number of immersive virtual environments 222 , and a learning module 338 .
- the physiological signal acquisition module 312 receives sensor signals 328 from the physiological sensor(s) 232 , 262 from time to time during operation of the computing device 210 at a specified sampling rate, which may correspond to a sampling rate performed by the computing device 210 . As described above, portions of the sensor signals 328 may reflect human body movements that are indicative of the user's breathing, heartbeat, or other physiological activity.
- the signal acquisition module 312 performs standard signal processing techniques (e.g., analog-to-digital conversion, filtering, etc.) to extract the useful information (e.g., measurements of breathing or heart beat activity, brain activity or body temperature) from the sensor signals 328 and outputs the resulting physiological signals 330 .
- the signal acquisition module 312 is a standard component that is built into the computing device 210 .
- the physiological signal acquisition module 312 can also be part of a unit that is external to the computing device 210 .
- the physiological signal acquisition module 312 can be part of the virtual reality device 240 .
- the physiological signal acquisition module 312 can be communicatively coupled to either the audio subsystem 256 or the visual subsystem 258 , in some embodiments.
- the physiological signal acquisition module 312 may be embodied as a processor in communication with a heart rate monitor that is built into audio earbuds.
- the physiological signal acquisition module 312 may be a thermal imager that is remotely placed (with respect to the computing device 210 ) to periodically measure the body temperature of the user.
- the physiological signal processing module 314 receives the physiological signals 330 from the physiological signal acquisition module 312 , maps the physiological signals to one or more physiological parameters (e.g., respiration rate, heart rate, etc.), each of which has a range of possible values, and calculates the current data value 332 for each of the physiological parameters. For example, the physiological signal processing module 314 may determine a value of a physiological parameter from one or multiple physiological signals 330 , or from one or multiple instances of the same physiological signal 330 over time. The module 314 may execute one or more algorithms to map the physiological signals 330 to physiological parameters or to determine physiological parameter values 332 . For example, a robust algorithm based on Fourier analysis may be used to compute the dominant oscillation period from the raw IMU data that is directly related to breathing rate.
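A minimal sketch of that Fourier-analysis step follows, assuming a uniformly sampled accelerometer (IMU) signal and NumPy; the band limits and the helper name are illustrative assumptions.

```python
# Hypothetical sketch: find the dominant oscillation frequency of the raw IMU
# signal and report it as a breathing rate in breaths per minute.

import numpy as np


def breathing_rate_bpm(imu_signal: np.ndarray, sample_rate_hz: float) -> float:
    """Estimate breaths per minute from a window of IMU samples."""
    x = imu_signal - np.mean(imu_signal)            # remove the DC component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / sample_rate_hz)
    # Restrict to a plausible breathing band, e.g. 0.05-0.7 Hz (3-42 breaths/min).
    band = (freqs >= 0.05) & (freqs <= 0.7)
    dominant_hz = freqs[band][np.argmax(spectrum[band])]
    return dominant_hz * 60.0
```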
- the physiological parameter mapping module 316 uses the physiological parameter values 332 to determine the immersive virtual environment 116 that is to be presented to the user.
- the physiological parameter mapping module 316 maps the physiological parameter values 332 received from the physiological signal processing module 314 to the features of the immersive virtual environment 116 .
- the physiological parameter value and its mapping determine the features of the audio and visual stimuli to be presented to the user.
- the mapping is accomplished by one or more look-up tables that indicate relationships between various physiological parameter values and features of the immersive virtual environment 116 .
- a look-up table may link a physiological parameter value or range of values to a pre-determined audio volume and number or type of visual elements to display.
- in other embodiments, the mapping is accomplished by a continuous function (e.g., a linear or Gaussian function) rather than a look-up table.
- Illustrative examples of mapping tables are shown below in TABLE 1 and TABLE 2.
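As a rough illustration of how such a look-up might be encoded in software (the breathing-rate ranges and feature values below are assumptions, not the contents of TABLE 1 or TABLE 2):

```python
# Hypothetical mapping table: breathing-rate ranges -> audio volume and the
# number of fish to display. All numbers are illustrative assumptions.

MAPPING_TABLE = [
    # (min_bpm, max_bpm, audio_volume 0..1, fish_count)
    (12.0, 99.0, 0.30, 3),   # near baseline arousal: sparse, quiet scene
    (9.0,  12.0, 0.45, 8),
    (6.0,   9.0, 0.60, 15),
    (0.0,   6.0, 0.75, 25),  # deeply relaxed: richest immersion
]


def lookup_features(breaths_per_min: float) -> dict:
    for low, high, volume, fish_count in MAPPING_TABLE:
        if low <= breaths_per_min < high:
            return {"audio_volume": volume, "fish_count": fish_count}
    return {"audio_volume": 0.30, "fish_count": 3}  # fall back to baseline row
```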
- a single physiological parameter value of a single parameter may be used to determine all of the parts of the virtual environment 116 to be presented to the user, for example, both the visual elements and the audio elements.
- the mapping may be defined differently or determined separately for different elements of the virtual environment.
- a mapping table or mapping function 234 may define relationships between respiration rates and features of the visual display 118
- another mapping table or mapping function 234 may define relationships between the respiration rates and features of the audio soundtrack 130 .
- multiple physiological parameters and their corresponding parameter values may be used.
- one physiological parameter may be used to control the visual display 118 and a different physiological parameter may be used to control the audio 130 or other aspects of the two subsystems.
- different mapping tables or functions 234 may be used to control the smart device(s) 266 .
- the mapping table or mapping function used by the parameter mapping module 316 may be customized for a particular user based on user customization data 344 .
- the user customization data 344 may include, for example, user preferences, demographic information, or clinical sleep information specific to the user.
- the system 100 may include a number of different parameter mapping tables for different populations of users, and the user customization data 344 may be used to select an appropriate mapping table (based on, e.g., age, gender, or body size).
- the mapping tables or mapping functions, or portions thereof, may be stored in data storage of any of the devices 210 , 240 , 266 , 280 , as mapping functions 234 , or in other data storage locations.
- the system 100 determines changes or adjustments to be made to the immersive virtual environment 116 in response to the current parameter value(s) 332 .
- the immersive virtual environment 116 may include a succession of stages, where each stage represents a particular combination of sensory stimuli, and the change or adjustment may include transitioning the presentation to a different stage of the virtual environment 116 .
- the specifications for these changes or adjustments are passed to the immersive environment control module 318 as environment adjustments 334 .
- the parameter values 332 , corresponding environment adjustments 334 , and subsequent parameter values 332 are passed to the learning module 338 from time to time.
- the learning module 338 applies one or more artificial intelligence techniques (such as an unsupervised machine learning algorithm) to the training data to algorithmically learn the user's typical responses to different environment adjustments 334 .
- the learning module 338 formulates recommended mapping adjustments 336 , which indicate modifications to the mapping function that are based on the user's actual behavior over time.
- the learning module 338 passes the mapping adjustments 336 to the parameter mapping module 316 , which updates its mapping table or mapping function to incorporate the mapping adjustments 336 .
- the learning module 338 monitors the physiological signals over the course of a sleep session (e.g., overnight) and outputs feedback (e.g., in the morning) about sleep quality or overall cardiac functioning of the user.
- the learning module 338 can make modifications in the selection of the immersive scenario and/or the degree of immersion in subsequent sleep sessions (e.g., for the following night), in response to its assessments of the user's previous sleep quality and/or nocturnal physiology.
- the system 100 can, in an automated fashion, learn and change the immersion scenario or settings based on data indicating sleep patterns of a general population (and/or based on a user's individual nocturnal physiology—e.g., cardiac functioning).
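One very simple way such an adaptation could be realized is sketched below, under the assumption of a single adjustable breathing-rate threshold and a fixed learning rate; the patent leaves the specific learning technique open, so this is not its method.

```python
# Hypothetical adaptation rule: nudge the "relaxed breathing" threshold used by
# the mapping toward the settling rate observed in the previous session.

def recommend_threshold_adjustment(relaxed_bpm_threshold: float,
                                   observed_settling_bpm: float,
                                   learning_rate: float = 0.05) -> float:
    """Return an updated threshold that future mapping tables or functions
    could use, so deeper immersion triggers at rates this user can reach."""
    error = observed_settling_bpm - relaxed_bpm_threshold
    return relaxed_bpm_threshold + learning_rate * error
```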
- the immersive environment control module 318 controls the modifications to the presentation of the immersive virtual environment 116 in response to the physiological signals 330 .
- the immersive environment control module 318 receives the environment adjustments 334 , accesses the requisite elements of the immersive environment(s) 222 (which, illustratively, include audio elements 340 and visual elements 342 ), and constructs a modified version of the virtual environment 116 , incorporating the environment adjustments 334 .
- the control module 318 includes a modulator 320 , 322 , 324 , 326 for each different type of stimulus.
- the audio modulator 320 controls the modification of the presentation of audio elements and their respective features (e.g., volume, content, speed, complexity, intensity, and/or other aspects of the audio soundtrack 130 ), while the visual scene modulator 322 controls the modification of the presentation of visual elements and their respective features (e.g., object movements, number and type of different objects displayed, color schemes, brightness levels, and/or other aspects of the visual display 118 ).
- the tactile modulator 324 and the temperature modulator 326 operate in a similar fashion to control tactile and temperature stimuli, and similar modulators operate similarly for other types of sensory stimuli.
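- The modulator arrangement described above might be organized as in the following illustrative sketch; the class names and the dictionary-based control signals are assumptions, not the patent's implementation.

```python
class Modulator:
    def apply(self, adjustment: dict) -> dict:
        raise NotImplementedError

class AudioModulator(Modulator):
    def apply(self, adjustment: dict) -> dict:
        # e.g., {"gain_db": -6, "tempo": 0.9}
        return {"stimulus": "audio", **adjustment}

class VisualSceneModulator(Modulator):
    def apply(self, adjustment: dict) -> dict:
        # e.g., {"fish_count": 8, "swim_speed": 0.5, "bubble_density": 0.7}
        return {"stimulus": "visual", **adjustment}

class ImmersiveEnvironmentController:
    """Routes each environment adjustment to the modulator for its stimulus type
    and collects control signals to send to the virtual reality device."""
    def __init__(self):
        self.modulators = {"audio": AudioModulator(), "visual": VisualSceneModulator()}

    def build_control_signals(self, environment_adjustments: dict) -> list:
        signals = []
        for stimulus_type, adjustment in environment_adjustments.items():
            modulator = self.modulators.get(stimulus_type)
            if modulator is not None:
                signals.append(modulator.apply(adjustment))
        return signals
```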
- the illustrative immersive environment control module 318 constructs and adjusts the virtual environment 116 “on the fly,” e.g., by performing graphics rendering in real time, as opposed to simply selecting and presenting previously created content.
- the immersive environment control module 318 transmits control signals to the virtual reality device 240 to cause the virtual reality device 240 to present the various adjustments to the virtual environment 116 to the user.
- a flow diagram provides an illustration of a method 400 by which embodiments of system 100 may be used to, for example, conduct a sleep promotion session.
- the method 400 may be embodied as computerized programs, routines, logic and/or instructions that are executed by the computing system, e.g., the computing device 210 and/or the virtual reality device 240 .
- a person attempting to fall asleep, or simply to become more relaxed, uses the system 100 to immerse himself or herself in a virtual reality environment.
- the person may be instructed or coached by the sleep assistant 218 to slow his or her breathing rate (or may do so on his or her own) in order to cause the virtual reality environment to become more immersive.
- a process of presenting an immersive virtual environment that adjusts automatically in response to sensor signals representing physiological activity of the user is performed.
- aspects of the user's physical environment (e.g., ambient lighting) may also be adjusted in response to the user's physiological activity.
- the system 100 may include one or more separate mapping functions 234 that the sleep assistant 218 may use to determine adjustments to be made to the physical environment in response to the user's physiological activity.
- the system 100 selects a virtual environment to be presented by the virtual reality device 240 .
- there are many different types of virtual environments that can be presented; for example, aquatic scenes (e.g., an aquarium or the ocean), general nature scenes, or other environments that are designed to promote sleep.
- the system 100 can select a specific virtual environment in response to user input, as a result of default settings of the virtual sleep assistant 218 , or by accessing user customization data 344 (such as a user profile or preferences).
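- For illustration, selection among stored environments could follow a simple precedence of explicit user input, then stored preferences, then a default, as in the hypothetical sketch below; the environment names and the profile field are invented for the example.

```python
from typing import Optional

STORED_ENVIRONMENTS = {"aquarium", "ocean", "forest"}   # illustrative names only

def select_environment(user_input: Optional[str],
                       customization_data: Optional[dict],
                       default: str = "aquarium") -> str:
    if user_input in STORED_ENVIRONMENTS:                # explicit user choice wins
        return user_input
    if customization_data:                               # otherwise consult the user profile
        preferred = customization_data.get("preferred_environment")
        if preferred in STORED_ENVIRONMENTS:
            return preferred
    return default                                       # fall back to the assistant's default
```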
- the system 100 receives physiological signals output by the physiological sensor(s) 232 , 262 , which represent physiological activity of a person using the system 100 .
- the system 100 processes the physiological signals received at block 412 and determines one or more physiological parameters and the current parameter values (e.g., breathing rate: 10 breaths per minute) as of the sampling instance.
- the parameter values can be calculated or estimated (e.g., based on a number of breaths detected in a given time interval).
- the parameter values can be determined by, for example, a computer-processing unit of the mobile or wearable computing device 210 , or in computer processing units located directly in the physiological sensor(s) 232 , 262 .
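- A minimal sketch of this calculation (assuming breaths have already been detected and time-stamped; the 60-second window is an assumption) is shown below.

```python
from collections import deque

class BreathingRateEstimator:
    def __init__(self, window_seconds: float = 60.0):
        self.window_seconds = window_seconds
        self.breath_times = deque()

    def add_breath(self, timestamp: float) -> None:
        self.breath_times.append(timestamp)
        # drop breaths that fall outside the sliding window
        while self.breath_times and timestamp - self.breath_times[0] > self.window_seconds:
            self.breath_times.popleft()

    def breaths_per_minute(self) -> float:
        # scale the count in the window to a per-minute rate
        return len(self.breath_times) * (60.0 / self.window_seconds)
```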
- the system 100 determines a stage of the immersive virtual environment to present, based on the current parameter values.
- the process at block 416 includes applying a mapping function, e.g., in the form of a look-up table, that maps physiological parameter values to stages of the virtual environment.
- each immersive virtual environment can be divided into a number of successive stages that can be presented to the user.
- Each stage relates to a physiological parameter value or a range of physiological parameter values. That is, where a physiological parameter has a range of possible values, each stage of the virtual environment relates to a different subset of the range of possible values.
- TABLE 1 illustrates the relationship between a few exemplary visual and audio features of an immersive virtual environment and an exemplary physiological parameter.
- a single physiological parameter is mapped to both visual and audio elements of an immersive virtual environment.
- Each value of the physiological parameter corresponds to a different stage of the immersive virtual environment, and each stage of the immersive virtual environment relates to audio and visual features that have different values.
- the illustrative audio feature is gain (e.g., volume) and the illustrative visual features are the number of primary foreground elements (e.g., fish in the example of FIG. 1 ), the speed of object movement (e.g., the speed at which the fish travel across the display), and the densities of secondary foreground elements (e.g., the density of the bubbles of FIG. 1 ).
- Heart rate variability has been found to significantly increase at a respiratory frequency of 6 breaths per minute.
- Inducing low levels of physiological activity (e.g., lowering heart rate voluntarily via paced breathing) promotes relaxation and facilitates the onset of sleep.
- 6 breaths per minute corresponds to a target breathing rate for obtaining maximum relaxation.
- the higher breathing rates correspond to earlier stages in the succession of virtual environment stages, and lower breathing rates correspond to later stages.
- the virtual environment becomes more immersive (presenting a higher number of primary foreground elements, a higher density of secondary foreground elements, and louder audio) as the respiration rate decreases.
- the speed of movement of the displayed objects becomes slower as the respiration rate decreases.
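- The values of TABLE 1 itself are not reproduced here; the following sketch uses invented numbers merely to illustrate the kind of look-up described above, in which lower breathing rates map to later, more immersive stages with slower object movement.

```python
# Illustrative stand-in for a TABLE 1-style mapping. The numeric values are
# invented for the sketch, not taken from the patent's TABLE 1.
STAGE_TABLE = [
    # (min_bpm, max_bpm, stage, gain_db, fish_count, bubble_density, swim_speed)
    (12.0, float("inf"), 1, -18,  2, 0.1, 1.0),
    (10.0, 12.0,         2, -12,  4, 0.3, 0.8),
    ( 8.0, 10.0,         3,  -6,  6, 0.5, 0.6),
    ( 6.0,  8.0,         4,  -3,  8, 0.7, 0.4),
    ( 0.0,  6.0,         5,   0, 10, 0.9, 0.2),
]

def stage_for_breathing_rate(bpm: float) -> dict:
    """Return the stage and stimulus features for the given breaths per minute."""
    for low, high, stage, gain, fish, bubbles, speed in STAGE_TABLE:
        if low <= bpm < high:
            return {"stage": stage, "gain_db": gain, "fish_count": fish,
                    "bubble_density": bubbles, "swim_speed": speed}
    raise ValueError("breathing rate out of range")
```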
- Using a mapping such as illustrated by TABLE 1 enables the system 100 to gradually present a more immersive experience if the user increases his or her relaxation and reacts favorably to the previously-presented stage of the virtual environment.
- the system 100 increases the degree of virtual immersion in response to reductions in the user's respiration rate.
- the system 100 can make adjustments to the immersive virtual environment 116 based on other criteria, such as the previously-presented stages of the immersive virtual environment 116 (e.g., adjust the quantity or speed of visual features based on the quantity or speed of the visual features presented in the previous stage).
- the system 100 can adjust the immersive virtual environment 116 (and/or an aspect of the user's physical environment) in response to the detection of the user's muscle activity.
- for example, electromyogram (EMG) sensors can be used for this purpose: two EMG sensors can be incorporated in a "sleep mask" to detect the activity of the corrugator supercilii muscle (by detecting the electrical potential generated by the muscle bundles).
- the resting EMG tone may be recorded for a short time (e.g., 1 minute) while the user is lying down in bed maintaining his or her neutral "position," to determine the baseline EMG tone (in μV).
- the individual may then be instructed or coached by the sleep assistant 218 to decrease his or her level of “muscle contraction” in his or her facial muscles, and particularly in the forehead (or, of course, the user may do so on his or her own, without coaching).
- the stages of immersion in the virtual environment 116 may increase based on the percentage decrease in muscle contraction from the baseline levels.
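- As an illustration only, the EMG-driven staging described above might be computed as follows; the 10%-per-stage step and the five-stage cap are assumed values, not figures from TABLE 2.

```python
from typing import List

def baseline_emg(samples_uv: List[float]) -> float:
    """Mean rectified EMG tone (microvolts) over the short baseline recording."""
    return sum(abs(s) for s in samples_uv) / len(samples_uv)

def stage_from_emg(current_uv: float, baseline_uv: float,
                   max_stage: int = 5, step_percent: float = 10.0) -> int:
    """Map the percentage decrease from the baseline EMG tone to a stage number."""
    decrease_percent = max(0.0, (baseline_uv - current_uv) / baseline_uv * 100.0)
    stage = 1 + int(decrease_percent // step_percent)
    return min(stage, max_stage)
```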
- the mechanics of each stage of the immersive virtual environment are not limited to the types of features and mappings shown in TABLE 1 and TABLE 2, or to the data values shown in those tables.
- Other strategies for dynamically changing the immersive virtual environment to induce sleep are within the scope of this disclosure.
- the immersive virtual environment is presented using the virtual reality device 240 .
- the system 100 constructs the appropriate stage of the immersive virtual environment and transmits the stage content and control commands to the virtual reality device 240 .
- the virtual reality device 240 executes the commands to present the virtual environment.
- portions of the stage content (e.g., the visual elements and/or audio elements) may be stored locally at the virtual reality device 240 or transmitted to it by the computing device 210.
- the system 100 processes frequent physiological feedback data from the sensors 232 , 262 .
- the system 100 may process the physiological data at a frequency that corresponds to the internal sampling rate of the computing device 210 (e.g., 100 Hz for a standard smart phone).
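- To keep stage decisions stable between closely spaced samples, a simple exponential moving average can be applied to the parameter estimate; the sketch below is illustrative only, and the smoothing constant is an assumption.

```python
class SmoothedParameter:
    def __init__(self, alpha: float = 0.02):
        self.alpha = alpha          # small alpha -> slow, stable response
        self.value = None

    def update(self, sample: float) -> float:
        """Blend each new sample into the running estimate and return it."""
        if self.value is None:
            self.value = sample
        else:
            self.value = self.alpha * sample + (1.0 - self.alpha) * self.value
        return self.value
```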
- the system 100 receives new physiological signals that are detected subsequent to the presentation of the stage of the virtual environment at block 418 .
- new physiological parameter values are calculated from the new physiological signals received at block 420 .
- the system 100 considers whether to continue the biofeedback virtual reality sleep promotion at block 424 . If it is determined that the virtual reality sleep promotion is to be discontinued, then the method 400 concludes at block 428 and the system 100 discontinues the presentation of the virtual environment.
- the virtual reality sleep promotion is discontinued by a timer set to turn the sleep assistant 218 off after sleep promotion has been running for a certain period of time. In other embodiments, the virtual reality sleep promotion may be stopped due to an input from a user.
- the system 100 determines a sleep state based on the physiological signals or using a gaze detector incorporated into the virtual sleep assistant hardware that detects the user closing his or her eyes. In some cases, the system 100 may turn off the virtual sleep assistant 218 upon detecting the closing of the person's eyes, or turn off only the visual display when the eyes of the user are closed.
- the physiological feedback data may be used to detect a state of full sleep, or a state sufficiently close to full sleep, and turn off the sleep assistant 218 after certain physiological conditions have been met.
- the system 100 can detect, based on the physiological signals, whether a person has fallen asleep or wishes to discontinue using the system 100 as follows. When the person begins using the system 100 , they begin by consciously slowing their breathing rate, and the system 100 detects a low breathing rate. However, when people fall asleep, they lose the voluntary control of their own breathing.
- the system 100 can thus turn off the sleep assistant application 218 when the system 100 detects a normal breathing rate for a certain period of time (e.g. when the person falls asleep) after having previously detected a low breathing rate for a certain period of time.
- a return to a normal breathing rate could also mean that the user has discontinued the voluntary slow breathing because he or she does not want to use the device anymore.
- the system 100 can turn off the sleep assistant application 218 in response to the return to a normal breathing rate.
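- The turn-off logic just described can be pictured as a small state machine: sustained slow breathing must be observed first, and a later sustained return to a normal rate then triggers shutdown. The following sketch is illustrative; the thresholds and durations are assumptions.

```python
class SleepOnsetDetector:
    def __init__(self, low_bpm: float = 7.0, normal_bpm: float = 11.0,
                 required_seconds: float = 120.0):
        self.low_bpm = low_bpm
        self.normal_bpm = normal_bpm
        self.required_seconds = required_seconds
        self.low_seconds = 0.0
        self.normal_seconds = 0.0
        self.slow_breathing_seen = False

    def update(self, bpm: float, dt_seconds: float) -> bool:
        """Return True when the sleep assistant should be turned off."""
        if bpm <= self.low_bpm:
            # voluntary slow breathing is in progress
            self.low_seconds += dt_seconds
            self.normal_seconds = 0.0
            if self.low_seconds >= self.required_seconds:
                self.slow_breathing_seen = True
        elif bpm >= self.normal_bpm:
            # breathing has returned to a normal rate
            self.normal_seconds += dt_seconds
            if self.slow_breathing_seen and self.normal_seconds >= self.required_seconds:
                return True
        else:
            self.normal_seconds = 0.0
        return False
```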
- the sleep assistant 218 is configured to guide individuals toward sleep, starting from a conscious level (which typically occurs at the beginning of the night, when the person is still awake), through intermediate stages in which the user engages with the VR biofeedback system, up to the point at which the user falls asleep (unconsciousness).
- the system 100 automatically adjusts the immersive virtual environment (by increasing the sense of presence or degree of immersiveness) so that the user progressively feels that the (unreal) virtual environment is actually their real (physical) environment.
- as the user's sense of presence in the virtual environment increases, the user's mind is distracted from aspects of the real environment that normally disrupt sleep (such as physical features of the room, emotional connections with the physical environment, and thoughts of worry and rumination).
- the system 100 determines whether the stage of the virtual environment (and/or an aspect of the physical environment, e.g., a setting of a smart device 266 ) is to be changed, at block 426 .
- the determination as to whether to change the virtual and/or physical environment can be made in the same manner as described in block 416 . That is, the system maps the new physiological parameter values determined at block 422 to a stage of the virtual and/or physical environment (using, e.g., one or more mapping functions 234 ).
- the new parameter values may relate to the stage(s) of the virtual and/or physical environments that are currently being presented, in which case no change is made to the virtual and/or physical environment, and the system 100 returns to block 418 and continues presenting the same stage of the virtual and/or physical environment(s) as was done previously. If the new parameter values relate to a different stage of the virtual and/or physical environment(s) than the stage that is currently being presented, the system 100 returns to block 416 and proceeds to determine the specifications for and present the new stage. In other embodiments, the decision at block 426 may be performed by comparing the old physiological parameter value determined at block 414 to the new physiological parameter value determined at block 422.
- if the old physiological parameter value and the new physiological parameter value are the same or within an acceptable range of difference, the system 100 continues presenting the current stage of the virtual and/or physical environment(s), and the process of monitoring physiological signals continues. If the old physiological parameter value and the new physiological parameter value are different or outside an acceptable range of difference, then the stage of the virtual and/or physical environment(s) is updated to correspond to the new physiological parameters, and the process of monitoring physiological signals continues.
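- A minimal sketch of this block 426 decision, assuming a mapping function such as the stage look-up sketched earlier and an invented tolerance value, might look like this.

```python
def should_change_stage(old_value: float, new_value: float,
                        stage_of, tolerance: float = 0.5) -> bool:
    """stage_of: a callable mapping a parameter value to a stage (e.g., a look-up).
    Return False to keep presenting the current stage, True to select a new one."""
    if abs(new_value - old_value) < tolerance:     # small fluctuations are ignored
        return False
    return stage_of(new_value) != stage_of(old_value)
```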
- Referring to FIGS. 5 and 6 , illustrative plots of sensor data are shown, which compare the use of an inertial measurement unit (IMU) to measure respiration rate with the results obtained using a piezo respiratory effort band (“p-band”), which is the conventional “gold standard” method used to capture respiration data during polysomnographic sleep recordings.
- the plot 500 shows low frequency breathing of a person during controlled feedback relaxation induced by the sleep assistant 218 , but prior to sleep.
- Graph line 510 shows the breathing frequency measured by the p-band over time.
- Graph lines 512 , 514 , 516 , 518 , 520 , and 522 show the six outputs of an IMU measuring the respiration rate over time.
- an IMU comprises a three-axis accelerometer and a three-axis gyroscope.
- Breathing frequency is estimated by analyzing accelerometer data and combining it with gyroscope data, if available.
- a smoothing function (such as that which may be provided by a smart phone application) may be used to delay the feedback response and thereby compensate for breathing changes that result from the user's body movements or other artifacts.
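- One hedged way to obtain such an estimate (assuming numpy is available; the respiration band limits are assumptions) is to remove the DC component from an accelerometer trace and pick the dominant spectral peak within a plausible breathing band, as sketched below.

```python
import numpy as np

def breathing_frequency_hz(accel: np.ndarray, fs_hz: float,
                           band=(0.05, 0.7)) -> float:
    """accel: 1-D array of accelerometer samples; fs_hz: sampling rate in Hz."""
    x = accel - np.mean(accel)                              # remove the DC (gravity) component
    spectrum = np.abs(np.fft.rfft(x * np.hanning(len(x))))  # windowed magnitude spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs_hz)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    if not np.any(in_band):
        raise ValueError("window too short to resolve the respiration band")
    return float(freqs[in_band][np.argmax(spectrum[in_band])])

# e.g., 0.1 Hz corresponds to the 6-breaths-per-minute relaxation target
```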
- Graph line 524 shows the breathing frequency (in Hertz) of a person as measured by the p-band, and graph line 526 shows the breathing frequency of the same person as measured using the IMU. Both the p-band and the IMU measurement techniques exhibit similar performance.
- the plot 600 of FIG. 6 is nearly identical to the plot 500 of FIG. 5 , except that the plot 600 shows the breathing frequency of a person measured during a period of sleep.
- Graph lines 610 and 624 relate to the p-band measurements, and graph lines 612 , 614 , 616 , 618 , 620 , 622 , and 626 relate to the IMU measurements. Again, the p-band and the IMU exhibit similar performance.
- the breathing rate can be affected by artifacts such as body movements, which usually occur at sleep onset (e.g., people turning over or changing position).
- a function (e.g., a smoothing function) may be applied to the sensor data to compensate for such artifacts.
- Referring to FIGS. 7-9 , exemplary plots of test results obtained during trials illustrate the effectiveness of an embodiment of the sleep assistant 218 in comparison to a baseline night in which the sleep assistant 218 was not used.
- FIG. 7 shows that a lower heart rate is established during an initial period of low breathing rate using the sleep assistant 218 and the lower heart rate is maintained after the onset of sleep.
- FIG. 8 shows that in the same trial, the lower heart rate was established with the sleep assistant 218 prior to sleep and maintained during both rapid eye movement (REM) and non-rapid eye movement (NREM) periods of sleep, across the whole night.
- FIG. 9 compares a measure of sleep quality for a baseline night in which the sleep assistant 218 was not used and a night in which the sleep assistant 218 was used, and shows that sleep quality improved with the use of the sleep assistant 218 .
- An embodiment of the technologies disclosed herein may include any one or more, and any combination of, the examples described below.
- An example 1 includes a method for promoting sleep, the method including, with a biofeedback virtual reality system: monitoring a physiological signal received from a sensor over time; presenting an immersive virtual environment with a virtual reality device, the immersive virtual environment comprising a display of visual elements designed to promote sleep; detecting a change in the physiological signal; and, in response to the detected change in the physiological signal: applying biofeedback technology to determine an adjustment to the immersive virtual environment, wherein the adjustment is to change the display of visual elements; and presenting the adjustment to the immersive virtual environment with the virtual reality device.
- the method includes the subject matter of example 1 and includes receiving the physiological signal at a mobile or wearable sensing and computing device, and determining one or more physiological parameters based on the physiological signal.
- the method includes the subject matter of example 1 or example 2, wherein presenting the immersive virtual environment is performed in response to a user actively attempting to control a physiological parameter being sensed by the sensor.
- the method includes the subject matter of any of the preceding examples and includes selecting the immersive virtual environment from a plurality of stored immersive virtual environments based on the physiological signals and/or user customization data.
- the method includes the subject matter of any of the preceding examples and includes determining user customization data and determining the adjustment to the immersive virtual environment based on the user customization data.
- the method includes the subject matter of any of the preceding examples and includes, wherein the immersive virtual environment comprises an audio soundtrack, applying biofeedback technology to determine an adjustment to the audio soundtrack and applying the adjustment to the audio soundtrack with the virtual reality device.
- the method includes the subject matter of any of the preceding examples and includes determining a mapping defining a relationship between physiological signals and elements of the immersive virtual environment, wherein the mapping is defined to promote sleep, and using the mapping to determine the adjustment to the immersive virtual environment.
- the method includes the subject matter of any of the preceding examples and includes storing data relating to adjustments made to the immersive virtual environment over time and physiological signals monitored after the adjustments have been made, applying an artificial intelligence or machine learning technique to the stored data to algorithmically learn a modification to the mapping; and updating the mapping to include the learned modification.
- the method includes the subject matter of any of the preceding examples and includes detecting a sleep state based on the monitoring of the physiological signal and turning off the display of visual elements in response to the sleep state.
- the method includes the subject matter of any of the preceding examples and includes, wherein the physiological signal represents a respiration rate or a heart rate or muscle activity, the monitoring detects a change in the respiration rate, heart rate, or muscle activity, in response to the change in the respiration rate, heart rate or muscle activity, changing a speed, quantity, density, frequency, color, brightness, contrast, direction, depth, focus, point of view, and/or complexity of one or more of the visual elements in the presentation of the immersive virtual environment.
- the method includes the subject matter of any of the preceding examples and includes, wherein the immersive virtual environment further comprises an audio soundtrack, changing the volume, content, speed, complexity, and/or intensity of the audio soundtrack in response to the change in the respiration rate or heart rate.
- the method includes the subject matter of any of the preceding examples and includes, wherein the physiological signal represents a respiration rate or a heart rate, the monitoring detects a decrease in the respiration rate or heart rate, and the method comprises, in response to the decrease in the respiration rate or heart rate, decreasing speed, and increasing quantity, density and/or frequency of one or more of the visual elements in the presentation of the immersive virtual environment.
- the method includes the subject matter of any of the preceding examples and includes, wherein the immersive virtual environment further comprises an audio soundtrack, increasing the volume or degree of surround sound at which the audio soundtrack is played in response to the decrease in the respiration rate or heart rate.
- the method includes the subject matter of any of the preceding examples and includes, wherein the physiological signal represents a respiration rate or a heart rate or a rate of muscle activity, the monitoring detects an increase in the respiration rate or heart rate or muscle activity, in response to the increase in the respiration rate or heart rate or muscle activity, increasing speed, and decreasing quantity, density, and/or frequency of one or more of the visual elements in the presentation of the immersive virtual environment.
- the method includes the subject matter of any of the preceding examples and includes decreasing the volume at which the audio soundtrack is played in response to the increase in the respiration rate or heart rate.
- the method includes the subject matter of any of the preceding examples and includes determining a value of a physiological parameter based on the physiological signal, wherein the physiological parameter has a range of possible values, the immersive virtual environment comprises a plurality of visual stages, each of the visual stages comprises a different arrangement of visual elements, each of the visual stages corresponds to a different subset of the range of possible values of the physiological parameter, determining the adjustment comprises selecting a visual stage corresponding to the determined value of the physiological parameter, and presenting the adjustment comprises presenting the selected visual stage.
- the method includes the subject matter of any of the preceding examples and includes, wherein the immersive virtual environment comprises a plurality of audio stages, each of the audio stages comprises a different arrangement of audio elements, each of the audio stages corresponds to a different subset of the range of possible values of the physiological parameter, determining the adjustment comprises selecting an audio stage corresponding to the determined value of the physiological parameter, and presenting the adjustment comprises presenting the selected audio stage.
- the method includes the subject matter of any of the preceding examples and includes determining a value of a physiological parameter from the physiological signal, wherein the physiological parameter comprises a respiration rate, a heart rate, an electroencephalography (EEG) measurement, a measure of muscle activity, and/or a human body temperature, and determining the adjustment to the immersive virtual environment based on the value of the physiological parameter.
- the method includes the subject matter of any of the preceding examples and includes, wherein the immersive virtual environment further comprises an audio soundtrack, determining a visual adjustment to adjust the display of visual elements and determining an audio adjustment to adjust the audio soundtrack.
- the method includes the subject matter of any of the preceding examples and includes determining the visual adjustment independently of the determining of the audio adjustment.
- the method includes the subject matter of any of the preceding examples and includes, wherein the immersive virtual environment comprises a plurality of different sensory stimuli, independently adjusting each of the different sensory stimuli in response to the change in the physiological signal.
- An example 22 includes a biofeedback virtual reality system for promoting sleep, the biofeedback virtual reality system including: a sensor to detect a physiological signal; a mobile or wearable computing device to: receive the physiological signal; determine a value of a physiological parameter based on the physiological signal; map the value of the physiological parameter to a stage of an immersive virtual environment of a plurality of stored immersive virtual environments, each of the stored immersive virtual environments comprising a succession of stages designed to promote sleep, each of the stages comprising a different arrangement of sensory stimuli; and a virtual reality device in communication with the mobile or wearable computing device, the virtual reality device to present the stage of the immersive virtual environment; wherein the mobile or wearable computing device is to determine a new value of the physiological parameter and map the new value of the physiological parameter to a new stage of the immersive virtual environment; and wherein the virtual reality device is to present the new stage of the immersive virtual environment in response to the new value of the physiological parameter.
- the system includes the subject matter of example 22, wherein the mobile or wearable computing device comprises a smartphone, a tablet computer, an attachable/detachable device, a smart watch, smart glasses, a smart wristband, smart jewelry, and/or smart apparel.
- the system includes the subject matter of example 22 or example 23, wherein at least two of the mobile or wearable computing device, the virtual reality device, and the sensor are embodied as a unitary device.
- the system includes the subject matter of any of examples 22-24, wherein the mobile or wearable computing device receives the physiological signal through wireless communication and/or the mobile or wearable computing device communicates with the virtual reality device through wireless communication.
- the system includes the subject matter of any of examples 22-25, wherein the sensor comprises a motion sensor, and wherein the mobile or wearable computing device determines a respiration rate from the output of the motion sensor.
- the system includes the subject matter of any of examples 22-26, wherein the mobile or wearable computing device comprises a positioner to position the mobile or wearable computing device to detect human body motion indicating breathing.
- the system includes the subject matter of any of examples 22-27, wherein the mobile or wearable computing device is to receive a plurality of different physiological signals, determine a value of each of a plurality of different physiological parameters based on the plurality of different physiological signals, and determine a stage of the immersive virtual environment based on the values of the different physiological parameters.
- the system includes the subject matter of any of examples 22-28, wherein the immersive virtual environment comprises an arrangement of visual elements including an avatar that interacts with the immersive virtual environment in response to the physiological signal.
- the system includes the subject matter of any of examples 22-29, comprising a gaze detector in communication with the mobile or wearable computing device, wherein the mobile or wearable computing device is to manipulate the immersive virtual environment in response to output of the gaze detector.
- the system includes the subject matter of any of examples 22-30, wherein the virtual reality device comprises virtual reality eyewear and headphones.
- the system includes the subject matter of any of examples 22-31, wherein the virtual reality device comprises high-definition video glasses, a non-rigid sleep mask, a television, a projector to project a display of visual elements onto a wall or ceiling, and/or one or more remote speakers.
- An example 33 includes a biofeedback virtual reality sleep assistant embodied in one or more computer accessible media, the biofeedback virtual reality sleep assistant including: a physiological signal processor to receive one or more physiological signals from one or more sensing devices; a physiological signal processing module to monitor one or more physiological parameters from the one or more physiological signals over time, each of the physiological parameters having a range of possible values, and to determine a value of each of the physiological parameters at a plurality of different instances in time; a physiological parameter mapping module to map the values of the one or more physiological parameters at an instance in time to a stage of an immersive virtual environment selected from a plurality of stored immersive virtual environments, each of the immersive virtual environments comprising at least a visual display and an audio soundtrack, each of the visual display and the audio soundtrack having a plurality of successive stages designed to promote sleep; and an immersive environment control module to present the stage of the selected immersive virtual environment by one or more virtual reality devices; wherein the physiological signal processing module is to detect changes in the values of the one or more physiological parameters over time; and wherein the physiological parameter mapping module is to map the changed values of the one or more physiological parameters to a different stage of the selected immersive virtual environment, so that the immersive environment control module presents the different stage by the one or more virtual reality devices.
- the sleep assistant includes the subject matter of example 33, wherein the physiological parameter mapping module is to map the values of the one or more physiological parameters to a stage of an immersive virtual environment by executing a continuous mapping function or by accessing a lookup table.
- the sleep assistant includes the subject matter of example 33, wherein the physiological parameter mapping module is to map the values of the one or more physiological parameters to a stage of the visual display and separately map the values of the one or more physiological parameters to a stage of the audio soundtrack.
- the sleep assistant includes the subject matter of example 33, wherein the immersive environment control module is to construct the selected immersive virtual environment in real time by adding, deleting, or changing elements of the visual display and/or the audio soundtrack based on the values of the one or more physiological parameters.
- the sleep assistant includes the subject matter of example 33, wherein the immersive environment control module is to communicate with a smart device to control an aspect of a physical environment in response to changes in the values of the one or more physiological parameters over time.
- An example 38 includes an article of manufacture including, embodied in one or more computer accessible storage media: an immersive virtual environment comprising a display of visual elements and an audio soundtrack, wherein the display and the audio soundtrack each have a plurality of stages that are coordinated with different values of at least one physiological parameter.
- references in the specification to “an embodiment,” etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is believed to be within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly indicated.
- Embodiments in accordance with the disclosure may be implemented in hardware, firmware, software, or any combination thereof. Embodiments may also be implemented as instructions stored using one or more machine-readable media, which may be read and executed by one or more processors.
- a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computing device or a “virtual machine” running on one or more computing devices).
- a machine-readable medium may include any suitable form of volatile or non-volatile memory.
- Modules, data structures, and the like defined herein are defined as such for ease of discussion, and are not intended to imply that any specific implementation details are required.
- any of the described modules and/or data structures may be combined or divided into sub-modules, sub-processes or other units of computer code or data as may be required by a particular design or implementation.
- schematic elements used to represent instruction blocks or modules may be implemented using any suitable form of machine-readable instruction, and each such instruction may be implemented using any suitable programming language, library, application-programming interface (API), and/or other software development tools or frameworks.
- schematic elements used to represent data or information may be implemented using any suitable electronic arrangement or data structure. Further, some connections, relationships or associations between elements may be simplified or not shown in the drawings so as not to obscure the disclosure.
Abstract
Biofeedback virtual reality sleep assistant technologies monitor one or more physiological parameters while presenting an immersive environment. The presentation of the immersive environment changes over time in response to changes in the values of the physiological parameters. The changes in the presentation of the immersive environment are configured using biofeedback technology and are designed to promote sleep.
Description
- This application claims priority to and the benefit of U.S. Provisional Patent Application Ser. No. 61/813,037, filed Apr. 17, 2013, which is incorporated herein by this reference.
- Insomnia is the most common sleep disorder. Insomnia is considered a hyper-arousal disorder in which both cognitive and physiological domains are over-activated. Research has shown that insomnia is associated with elevated autonomic nervous system activation, particularly at sleep onset that can adversely impact a person's health and well-being in a number of ways. Sleep onset in insomniacs is characterized by high levels of cognitive activity, worry, rumination and intrusive thoughts that, together with the autonomic hyperactivation, impede the onset of sleep. Predisposing factors that can increase a person's vulnerability to insomnia include age, gender, coping strategy, personality traits, and genetic factors. Insomnia can be triggered by acute stressful events, such as illness or trauma; it can be a chronic disorder without specific cause, or can be a symptom of other disorders. Perpetuating factors, such as the use of caffeine or alcohol, excessive worry, and irregular wake/sleep schedules, may contribute to the development and persistence of insomnia.
- Cognitive-Behavioral Therapy (CBT) and pharmacotherapy are two main lines of treatment that are currently available for insomnia. However, many insomnia sufferers do not wish to use pharmacotherapy and there is limited availability of CBT.
- This disclosure is illustrated by way of example and not by way of limitation in the accompanying figures. The figures may, alone or in combination, illustrate one or more embodiments of the disclosure. Elements illustrated in the figures are not necessarily drawn to scale. Reference labels may be repeated among the figures to indicate corresponding or analogous elements.
- FIG. 1 is a simplified depiction of a person using an embodiment of a biofeedback virtual reality sleep assistant as disclosed herein;
- FIG. 2 is a simplified block diagram of at least one embodiment of a computing environment for the sleep assistant of FIG. 1;
- FIG. 3 is a simplified module diagram illustrating an environment of at least one embodiment of the sleep assistant of FIG. 1 in operation;
- FIG. 4 is a simplified flow diagram of at least one embodiment of a method for promoting sleep with the sleep assistant of FIG. 1;
- FIG. 5 is a simplified plot illustrating diaphragmatic breathing at approximately 6 breaths per minute during use of at least one embodiment of the sleep assistant of FIG. 1 prior to the onset of sleep. In the figure, breathing data recorded by the piezoelectric bands and the IMU sensor are overlapped to illustrate the reliability of the computing device (e.g., a smart phone) in detecting breathing rate under slow breathing conditions;
- FIG. 6 is a simplified plot illustrating normal breathing frequency during a period of sleep, immediately following the sleep onset. In the figure, breathing data recorded by the piezoelectric bands and the IMU sensor are overlapped to illustrate the reliability of the computing device (e.g., a smart phone) in detecting breathing rate under normal breathing conditions; and
- FIGS. 7-9 are simplified plots of illustrative test results obtained during the use of at least one embodiment of the sleep assistant of FIG. 1.
- While the concepts of the present disclosure are susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and are described in detail below. It should be understood that there is no intent to limit the concepts of the present disclosure to the particular forms disclosed. On the contrary, the intent is to cover all modifications, equivalents, and alternatives consistent with the present disclosure and the appended claims.
- Insomniacs are characterized by elevated levels of physiological arousal (e.g. high heart rate, elevated high frequency electroencephalographic activity) together with cognitive hyperactivation (e.g. anxiety, worry, rumination, intrusive thoughts), particularly at sleep onset. Also, for many insomniacs, the bed and bedroom can become associated with a disturbed sleep pattern. As a result, entry into the familiar bedroom environment can become a conditioned cue that perpetuates and increases the severity of insomnia. As disclosed herein, virtual reality scenarios can be designed to remove individuals from their undesirable sleep environment by immersing them in a new, peaceful and relaxing environment, distracting them from other factors that might contribute to insomnia, such as worry and rumination. Additionally, as disclosed herein, biofeedback techniques can be incorporated into a virtual reality system to promote psychophysiological relaxation (by reducing physiological hyper-arousal) and thus promote their sleep. To do this, some of the disclosed embodiments focus the application of biofeedback and virtual reality techniques at the point in time that is prior to sleep onset. As used herein, “sleep onset period” generally refers to the time period beginning with “lights out,” when the person begins the process of trying to fall asleep, and continues up to the point of loss of consciousness, e.g., when the person enters the initial sleep state, which usually occurs before the polysomnography (PSG) sleep onset. After sleep onset, the techniques disclosed herein can be discontinued because the person is no longer conscious of the immersive virtual environment. In other words, some of the disclosed embodiments are directed to helping people guide themselves across the sleep onset process to promote the transition from the conscious (awake) to the unconscious level (sleep). In this way, aspects of the disclosed embodiments apply biofeedback and virtual reality techniques to make the process of falling asleep easier.
- Referring now to
FIGS. 1 and 2 , an embodiment of a biofeedbackvirtual reality system 100 includes avirtual reality device 240 and awearable sensor device 210. Thedevice 210 may be embodied as a mobile computing device, as shown inFIG. 1 , or as a wearable smart-sensor (e.g. an IMU or inertial measurement unit) that communicates wirelessly with a mobile computing device (such as a smart phone lying on a table next to the person). In other words, thedevice 210 may include two parts: [1] a wearable sensor and [2] a mobile computing device (where the mobile computing device is a separate device from the sensor and may interface with the sensor by wireless (e.g., by WIFI, BLUETOOTH, or other suitable wireless or optical communication technology), or may include one part (e.g., a mobile computing device with an integrated sensor). The mobile computing device and/or the smart-sensor communicates wirelessly with the virtual reality device 240 (e.g., eye wear and headphones). Thevirtual reality device 240 immerses a person in avirtual reality environment 116. The wearable smart sensor together with themobile computing device 210 operates a sleepassistant computer application 218 that applies biofeedback technology to the presentation of the immersivevirtual environment 116, in order to target a state of hyper-arousal (or hyper-activation) being experienced by the person using the system 100 (“user”). Thesystem 100 creates a positive biofeedback loop in which thevirtual reality device 240 provides cognitive relaxation/distraction and the biofeedback technology embodied in thesleep assistant application 218 promotes sleep by providing positive feedback (by modulating the degree of immersion in the virtual environment 116) in response to physiological signals indicating the user's current level of relaxation. Using the biofeedback technology, thesystem 100 presents increasingly immersive virtual reality environments as the user produces the desired level of physiological activation, and then modulates the virtual environment so as not to disturb the person once they have fallen asleep. Alternatively or in addition to presenting the immersive virtual reality environments, thesystem 100 can control various aspects of the user's surrounding physical (real-world) environment, in response to the biofeedback signals. For instance, thesystem 100 can communicate with various smart devices in the room, such as devices that provide or reduce ambient lighting, including an alarm clock, shades, and/or other devices, to reduce distractions that may be introduced by such devices. For example, thesystem 100 can use the biofeedback signals (e.g., the user's breathing rate) to automatically change an aspect of the user's physical environment; e.g., if the user slows down his or her breathing rate, thesystem 100 may decrease the brightness of the room. Theimmersive environment 116 is “virtual” in the sense that it includes computer-synthesized elements that are presented to the user in place of or in conjunction with the real-world environment. As used herein, “virtual reality” or “VR” may refer to, among other things, virtual reality, augmented reality, enhanced reality, and/or other types of interactive or non-interactive computer-generated immersive user experiences. For example, in some embodiments, the immersivevirtual environment 116 may include avisual display 118 that presents a series of animated two- or three-dimensional visual elements (e.g.,graphical elements visual display 118 passively (e.g., by viewing only). 
In other embodiments, thesystem 100 may allow the user to interact with the elements presented in the visual display 118 (e.g., via an “avatar” or by the user directly navigating in the scenario using, for instance, gaze, gestures or pointing devices). For instance, thesystem 100 may permit the user to move objects around in the virtual environment, or to interact with or insert themselves into the virtual environment as an avatar. As an example, a “sleep mask” configured as avirtual reality device 240 can detect the user's gaze and move the avatar in the same direction as the user's gaze allowing the user to navigate in the virtual environment moving his eyes. As another example, thesystem 100 may change the point of view or focus, or zoom in or zoom out in a particular direction, in response to the user's gestures or other body movements, or may rotate, pan, or otherwise adjust the virtual scene in response to a characteristic of the person's gaze (e.g., gaze direction, pupil dilation, etc.), or in response to other detected bio-signals (e.g., reduction in muscle activity). Using the bio-feedback mechanism, the user is an active participant in controlling the virtual environment and synthesized sounds. In general, a greater physiological relaxation (as promoted by the sleep assistant 218) leads to a more pleasant environment (e.g., increased immersion in the virtual reality) and thus promotes more cognitive relaxation/distraction. Thesleep assistant 218 responds to increased physiological relaxation (as detected by, e.g., a reduction in breathing rate) by increasing the degree of virtual immersion (by providing, for example, pleasant/relaxing sounds, more visual elements in the immersive environment that increase the user's “sense of presence” in the immersive environment), etc. The increased degree of virtual immersion then leads to even greater cognitive relaxation/distraction (e.g. the person is now fully immersed in a virtual environment and he/she forgot all worries, ruminations, etc.), which promotes sleep. - The
virtual environment 116 is immersive in the sense that it is designed to attract the user's attention by increasing the user's sense of presence in a virtual world and by removing distractions that may occur in the surrounding real-world scene, e.g., by occluding the background and/or restricting the user's peripheral vision. The system 100 may achieve the immersive nature of the virtual environment 116 by presenting the visual display 118, playing an audio soundtrack 130, presenting a combination of the visual display 118 and the audio soundtrack 130, and/or providing other sensory stimuli. In all embodiments, the level of brightness of the visual stimulation provided by the visual display 118 is low, in order to avoid any alterations in hormone production (e.g., to avoid changes in melatonin). - The illustrative immersive
virtual environment 116 includes a combination of visual 118 and audio 130 stimuli, but other embodiments may include other types of sensory stimuli, such as tactile, temperature, taste, smell, and others, alternatively or in addition to the visual 118 and audio 130 stimuli. For example, some embodiments of thevirtual environment 116 only include visual stimuli while other embodiments only include audio stimuli. Thesystem 100 coordinates the presentation of the various sensory stimuli with physiological information in real time to create a state of relaxation in the person experiencing the immersivevirtual environment 116. For example, as explained further below, thesystem 100 may increase or decrease any of a number of features of any of the sensory stimuli, or selectively turn different sensory stimuli on and off, over time in response to changes in the person's physiological parameters. As used herein, “in real time” may refer to, among other things, the fact that an automated biofeedback process occurs in response to sensed physiological information about the person using thesystem 100, during a period in which the person is using thesystem 100. In other words, theillustrative system 100 changes one or more aspects of the immersivevirtual environment 116 directly in response to changes in the sensed physiological information, using biofeedback technology based on user actions that is designed to promote sleep. To do this, the mobile/wearable computing device 210 and/or thevirtual reality device 240 analyze one or more physiological parameters that are obtained or derived from sensor signals. As used herein, “physiological parameters” may refer to, among other things, breathing rate (respiration rate) (e.g., breaths per minute), heart rate (e.g., beats per minute), brain activity (e.g. electroencephalographic signals), body movements, muscle activity; or any other type of measurable human physiological activity, or any combination of the foregoing. Using biofeedback technology, thesystem 100 modifies the immersivevirtual environment 116 in response to changes in the physiological parameters in a manner that is designed to guide the person away from the state of hyper-arousal and toward a state of sleep. - Different physiological parameters may have different roles in modifying the various aspects of the virtual environment (e.g., breathing rate can guide the speed of the navigation in the virtual environment whereas the muscle tone may guide the density of the virtual elements presented in the immersive environment 116). As an example, if the user decreases his or her breathing rate, the
system 100 can reduce the speed of the fish swimming in an aquatic scene (but not change other aspects of the environment 116); and if, at the same time, the user reduces his or her muscle activity, thesystem 100 can increase the number of fish swimming in the visual scene. Thus, different physiological parameters can be linked with different aspects of theimmersive scenario 116 using feedback on different bio-signals, in order to potentially increase the user's relaxation. - Referring now to
FIG. 1 in more detail, the illustrativevisual display 118 is embodied as a three-dimensional (3D) display of visual elements. In the illustration, the visual elements depict an aquatic scene and include a background 120 (e.g., water), a background element 128 (e.g., coral), and a number of foreground elements 122 (e.g., fish), 124 (air bubbles), 126 (e.g., rocks). Thesystem 100 can adjust the presentation of any of thevisual elements system 100 can modify any of these and/or other features of the visual elements depicted in thevisual display 118, based on the user's physiological parameters. Of course, while thevisual display 118 depicts an aquatic scene, any type of visual display that is designed or selected to promote sleep may be used. For example, an ocean, sky, or forest scene may be used, or thevisual display 118 may be configured according to the preferences of a particular user of thesystem 100. InFIG. 1 , it should be understood that in operation, thevisual display 118 is actually displayed in the virtualreality viewing glasses 112, but is shown as projected in order to better illustrate the details described above. - The
illustrative audio soundtrack 130 includes a number of audio elements, which may include various types of sounds (e.g., spoken words, music, nature sounds, etc.) or a combination thereof. In the illustration, the audio elements are sounds that are coordinated with the visual display 118 (e.g., water flowing and bubbling sounds); however, the audio soundtrack can include any type of audio selected or configured to promote sleep, including selections from the user's digital music library. Thesystem 100 can adjust the presentation of any of the audio elements, or add or remove audio elements, in response to changes in physiological parameters. Each of the audio elements has a number of features, including volume, content (e.g., words, sounds, and/or music), speed (e.g., tempo), complexity (e.g., number of different types or layers of sound), degree of “surround sound,” and/or intensity (e.g., acoustic intensity). Thesystem 100 can modify any of these and/or other features of theaudio elements 130 based on the user's physiological parameters. - The illustrative wearable smart-sensor and
mobile computing device 210 includes a computing device 110 (e.g., a smartphone, a tablet computer, an attachable/detachable electronic device such as a clip-on device, a smart watch, smart glasses, a smart wristband, smart jewelry, and/or smart apparel) and a positioner 132 (e.g., a strap, tether, clip, VELCRO® tab, etc.). However, any type of computing device that includes a processor and memory and can interact with the virtual reality device 240 in a relatively non-intrusive manner (e.g., without causing discomfort to the person using the system 100) may be used as the computing device 110. - The
positioner 132 is configured to secure the mobile or wearable computing device 210 in a position in which a sensor 232 (FIG. 2) of the device 210 can detect the user's physiological activity and generate physiological signals representing the user's physiological activity. However, the positioner 132 may be omitted in some embodiments. Additionally, the sensors used to detect the user's physiological activity do not need to be attached to or worn by the person. For example, the physiological sensor 232 can be incorporated in the person's bed or mattress, or into a mattress pad or bed linens (e.g., a fitted sheet). Additional details of the mobile or wearable computing device 210 are described below with reference to FIG. 2. - The illustrative
virtual reality device 240 includes a visual display system 112 and an audio delivery system 114. The illustrative visual display system 112 is embodied as commercially available virtual reality eyewear. Other embodiments of the visual display system 112 utilize other types of visual display systems, such as high-definition video glasses, non-rigid sleep masks adapted for virtual reality, televisions, projection systems (to project a display of visual elements onto a wall or ceiling), or holograms. - The illustrative
audio delivery system 114 is embodied as commercially available bone-conducting headphones. Other embodiments of the audio delivery system 114 use other methods of audio delivery, such as conventional audio headphones (e.g., earbuds), three-dimensional (3D) surround sound systems, remote speakers, indoor waterfall systems or fountains, and/or other electronically-controllable noise-making devices. In general, the components of the system 100 are in communication with each other as needed by suitable hardware and/or software-based communication mechanisms, which may be enabled by an application programming interface, operating system components, a network communication subsystem, and/or other components. Additional details of the virtual reality device 240 are described below with reference to FIG. 2. - Referring now to
FIG. 2 , a simplified block diagram of anexemplary computing environment 200 for thecomputing system 100 is shown. Theillustrative environment 200 includes the mobile orwearable computing device 210 and thevirtual reality device 240, which are in communication with one or moresmart devices 266 and/or one or moreserver computing devices 280 via one ormore networks 264. The biofeedbackVR sleep assistant 218 is, illustratively, embodied as a distributed application including “front end” components that are local to each of thedevices virtual environments 116 may be distributed across thenetwork 270. For example, the immersivevirtual environments virtual environments 116, or copies or portions of particularvirtual environments 116. Further, other portions of thesystem 100 may be distributed onvarious devices network 270, such as mapping functions 234. In other embodiments, however, thesleep assistant 218, the mapping function(s) 234, and the immersive virtual environment(s) 222, 252, 292 may be stored entirely on the mobile orwearable computing device 210 or entirely on thevirtual reality device 240. In some embodiments, portions of thesleep assistant 218, the mapping function(s) 234 or the immersive virtual environment(s) 222, 252, 292 may be incorporated into other systems or interactive software applications. Such applications or systems may include, for example, operating systems, middleware or framework (e.g., application programming interface or API) software, and/or user-level applications software. - The mobile or
wearable computing device 210 may be embodied as any type of computing device that is capable of performing the functions described herein (e.g., modulating the presentation of the immersive virtual environment 116 based on physiological signals). In some embodiments, the devices 210, 240 and/or the physiological sensors 232, 262 may be combined in a unitary device. - The illustrative mobile or
wearable computing device 210 includes at least one processor 212 (e.g. a controller, microprocessor, microcontroller, digital signal processor, etc.),memory 214, and an input/output (I/O)subsystem 216. Although not specifically shown, embodiments of theprocessor 212 may include separate baseband and applications processors. Features of the baseband and applications processors may be located on the same or different hardware devices (e.g., a common substrate). The baseband processor interfaces with other components of the mobile orwearable computing device 210 and/or external components to provide, among other things, wireless communication services, such as cellular, BLUETOOTH, WLAN, and/or other communication services. In general, the applications processor handles processing required by software and firmware applications running on the mobile orwearable computing device 210, as well as interfacing with various sensors and/or other system resources. However, it should be understood that features typically handled by the baseband processor may be handled by the applications processor and vice versa, in some embodiments. - Although not specifically shown, it should be understood that the I/
O subsystem 216 typically includes, among other things, an I/O controller, a memory controller, and one or more I/O ports. Theprocessor 212 and the I/O subsystem 216 are communicatively coupled to thememory 214. Thememory 214 may be embodied as any type of suitable computer memory device (e.g., volatile memory such as various forms of random access memory). - The I/
O subsystem 216 is communicatively coupled to a number of components, including auser interface subsystem 224. Theuser interface subsystem 224 includes one or more user input devices (e.g., a microphone, a touchscreen, keyboard, virtual keypad, etc.) and one or more output devices (e.g., audio speakers, displays, LEDs, etc.). The I/O subsystem 216 is also communicatively coupled to adata storage device 220, acommunications subsystem 230, and the physiological sensor(s) 232, as well as the biofeedbackVR sleep assistant 218. Thedata storage device 220 may include one or more hard drives or other suitable persistent data storage devices (e.g., flash memory, memory cards, memory sticks, and/or others). Thephysiological sensing devices 232 may include motion sensors, pressure sensors, kinetic sensors, temperature sensors, biometric sensors, and/or others, and may be integrated with or in communication with the mobile orwearable computing device 210. For example, thesensing device 232 may be embodied as an inertial measurement unit (IMU) sensor of the mobile orwearable computing device 210, and as such may include a multiple-axis gyroscope and a multiple-axis accelerometer. In some embodiments, a respiratory effort sensor, such as a piezo sensor band or a respiratory transducer, may be in communication with or embodied in thecomputing device 210, alternatively or in addition to the IMU. - Portions of the
sleep assistant 218, the mapping function(s) 234, and the immersive virtual environment(s) 222 reside at least temporarily in thedata storage device 220. For example, thevirtual environments 222 may include a subset of the library ofvirtual environments 292, where thesubset 222 has been selected by the user or provided as part of a base configuration of thesleep assistant 218 or thecomputing device 210. Portions of thesleep assistant 218, the mapping function(s) 234, and the immersive virtual environment(s) 222 may be copied to thememory 214 during operation of the mobile orwearable computing device 210, for faster processing or other reasons. - The
communication subsystem 230 may communicatively couple the mobile or wearable computing device 210 to other computing devices and/or systems by, for example, a cellular network, a local area network, a wide area network (e.g., Wi-Fi), a personal cloud, a virtual private network (VPN), an enterprise cloud, a public cloud, Ethernet, and/or a public network such as the Internet. The communication subsystem 230 may, alternatively or in addition, enable shorter-range wireless communications between the mobile or wearable computing device 210 and other computing devices (such as the virtual reality device 240), using, for example, BLUETOOTH and/or Near Field Communication (NFC) technology. Accordingly, the communication subsystem 230 may include one or more optical, wired and/or wireless network interface subsystems, cards, adapters, or other devices, as may be needed pursuant to the specifications and/or design of the particular mobile or wearable computing device 210. Additionally, the communication subsystem 230 may include a telephony subsystem, which enables the computing device to provide telecommunications services (e.g., via the baseband processor). The telephony subsystem generally includes a longer-range wireless transceiver, such as a radio frequency (RF) transceiver, and other associated hardware (e.g., amplifiers, etc.). - The
user interface subsystem 224 includes an audio subsystem 226 and avisual subsystem 228. The audio subsystem 226 may include, for example, an audio CODEC, one or more microphones, and one or more speakers and headphone jacks. Thevisual subsystem 228 may include, for example, personal viewing glasses, projection devices, holograms, televisions, liquid crystal display (LCD) screens, light emitting diode (LED) screens, or other visual display devices. The one or more physiological sensor(s) 232 initially detect the user's “baseline” physiological parameters (e.g., the user's actual measured parameters at the beginning of a sleep promotion session). Once the user's baseline condition or “physiological status” is established, thesystem 100 presents an initial immersivevirtual environment 116 and enters “feedback mode,” in which the sensor(s) 232 periodically detect the physiological responses of the user to the presented immersivevirtual environment 116, and provide thesleep assistant 218 with physiological signals that can be used by thesleep assistant 218 to determine the user's state of relaxation as it changes over time. The physiological signals output by the sensor(s) 232 may include signals that represent respiration rate, heart rate, brain activity (e.g. electroencephalogram (EEG)), body temperature, and/or other physiological parameters. For example, thesensor 232 may be embodied as an IMU built into thecomputing device 210 or thevirtual reality device 240, which is used to measure the user's breathing rate by detecting the rise and fall of the user's chest or abdomen over time during normal respiration. - In other embodiments, the
physiological sensor 232 can include measurement tools that are external to the computing device 210 but which are in communication with the device 210. An example of an external physiological sensor 232 is "textile electrodes," which are formed by knitting or weaving conductive fibers into apparel or garments. Textile electrodes can pick up signals from the heart and other muscles. The physiological activity sensed by the textile electrodes is transmitted through the conductive fibers that are woven into the garment to a processing unit, which then passes the received signals to the mobile or wearable computing device 210, generally through a wireless data connection. - Referring now to the
virtual reality device 240 ofFIG. 2 , thevirtual reality device 240 may be embodied as any type of device that is capable of performing the functions described herein (e.g., presenting the immersivevirtual environment 116 to the user). To do this, the illustrativevirtual reality device 240 is equipped with anaudio subsystem 256 and avisual subsystem 258, which may be embodied similarly to the audio subsystem 226 and thevisual subsystem 228 described above. Accordingly, thevirtual reality device 240 may be embodied with components similar to those of the mobile orwearable computing device 210. For example, in some embodiments, thevirtual reality device 240 has aprocessor 242,memory 244, and an I/O subsystem 246 similar to the mobile orwearable computing device 210. In general, elements of thevirtual reality device 240 having the same or similar name as elements of the mobile orwearable computing device 210 may be embodied similarly, and description of those elements is not repeated here. While not specifically shown inFIG. 2 , it should be understood that thevirtual reality device 240 may include other components as needed to control or provide other various forms of sensory stimuli, such as an ambient temperature controller subsystem, an aroma subsystem, and/or an air movement subsystem. In some embodiments, thevirtual reality device 240 is comprised of separate devices. For example, wearable personal viewing glasses and headphone ear buds may be separate components of thevirtual reality device 240, or may be integrated into a single device (e.g., GLASS by Google, Inc. or a similar device). - Referring now to the smart device(s) 266 of
FIG. 2 , the smart device(s) 266 may be embodied as any type of electronic device capable of performing the functions described herein (e.g., controlling an aspect of the user's physical environment). For example, the smart device(s) 266 may include smart lighting, heating, cooling, sound, and/or entertainment systems. The smart device(s) 266 may include components similar to those described above, or may simply include control circuitry to process control signals received from thesleep assistant 218 and adjust a parameter of the device 266 (e.g., light intensity, room temperature, sound volume, etc.). Elements of the smart device(s) 266 having the same or similar name as elements of the mobile orwearable computing device 210 may be embodied in a similar manner and according to the requirements of the particularsmart device 266. As such, the description of the similar elements is not repeated here. In the illustrativesmart device 266, thevirtual sleep assistant 218 is communicatively coupled to I/O subsystem 276,data storage 274,user interface subsystem 276, andcommunication subsystem 278.Data storage 274 is used to store portions of the mapping function(s) 234 and the immersive virtual environment(s) 292. - Referring now to the server(s) 280 of
FIG. 2 , the server(s) 280 may be embodied as any type of computing device capable of performing the functions described herein (e.g., storing portions of the immersivevirtual environments 292 and/or executing portions of the sleep assistant 218). For example, the server(s) 280 may include components similar to those described above. Elements of theserver 280 having the same or similar name as elements of the mobile orwearable computing device 210 may be embodied in a similar manner and according to the requirements of theserver 280. As such, the description of the similar elements is not repeated here. In theillustrative server 280, thevirtual sleep assistant 218 is communicatively coupled to I/O subsystem 286,data storage 290,user interface subsystem 294, andcommunication subsystem 296.Data storage 290 is used to store portions of the mapping function(s) 234 and the immersive virtual environment(s) 292. - The
computing environment 200 may include other components, sub-components, and devices not illustrated inFIG. 2 for clarity of the description. In general, the components of theenvironment 200 are communicatively coupled as shown inFIG. 2 by electronic signal paths, which may be embodied as any type of wired or wireless signal paths capable of facilitating communication between the respective devices and components. - Referring now to
FIG. 3 , the biofeedbackVR sleep assistant 218 is shown in more detail, in the context of anenvironment 300 that may be created during the operation of the computing system 100 (e.g., an execution or “runtime” environment). As noted above, thesleep assistant 218 is embodied as a computer application. As used herein, “application” or “computer application” may refer to, among other things, any type of computer program or group of computer programs, whether implemented in software, hardware, or a combination thereof, and includes operating system programs, middleware (e.g., APIs, runtime libraries, utilities, etc.), self-contained software applications, or a combination of any of the foregoing. Thesleep assistant 218 is embodied as a number of computerized modules and data structures including a physiological signal acquisition module 312, a physiologicalsignal processing module 314, a physiologicalparameter mapping module 316, an immersiveenvironment control module 318, a data store including a number of immersivevirtual environments 222, and alearning module 338. - The physiological signal acquisition module 312 receives sensor signals 328 from the physiological sensor(s) 232, 262 from time to time during operation of the
computing device 210 at a specified sampling rate, which may correspond to an internal sampling rate of the computing device 210. As described above, portions of the sensor signals 328 may reflect human body movements that are indicative of the user's breathing, heartbeat, or other physiological activity. The signal acquisition module 312 performs standard signal processing techniques (e.g., analog-to-digital conversion, filtering, etc.) to extract the useful information (e.g., measurements of breathing or heart beat activity, brain activity, or body temperature) from the sensor signals 328 and outputs the resulting physiological signals 330. In some embodiments, the signal acquisition module 312 is a standard component that is built into the computing device 210. However, the physiological signal acquisition module 312 can also be part of a unit that is external to the computing device 210. For instance, the physiological signal acquisition module 312 can be part of the virtual reality device 240. The physiological signal acquisition module 312 can be communicatively coupled to either the audio subsystem 256 or the visual subsystem 258, in some embodiments. For example, the physiological signal acquisition module 312 may be embodied as a processor in communication with a heart rate monitor that is built into audio earbuds. As another example, the physiological signal acquisition module 312 may be a thermal imager that is remotely placed (with respect to the computing device 210) to periodically measure the body temperature of the user.
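- By way of a non-limiting illustration, the following is a minimal sketch of the kind of filtering the signal acquisition module 312 might apply to raw motion-sensor samples before handing the physiological signals 330 to the processing module; the function name, window length, and sampling rate are assumptions made for this example only and are not specified by the disclosure.

```python
import numpy as np

def extract_breathing_waveform(raw_samples, fs_hz=100, window_s=0.5):
    """Illustrative acquisition step: remove the static (gravity) offset from
    accelerometer magnitudes and low-pass them with a moving average so that
    the slow chest-movement component associated with breathing dominates."""
    x = np.asarray(raw_samples, dtype=float)
    x = x - np.mean(x)                    # remove the constant offset
    n = max(1, int(window_s * fs_hz))     # samples per smoothing window
    kernel = np.ones(n) / n
    return np.convolve(x, kernel, mode="same")
```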
- The physiological signal processing module 314 receives the physiological signals 330 from the physiological signal acquisition module 312, maps the physiological signals to one or more physiological parameters (e.g., respiration rate, heart rate, etc.), each of which has a range of possible values, and calculates the current data value 332 for each of the physiological parameters. For example, the physiological signal processing module 314 may determine a value of a physiological parameter from one or multiple physiological signals 330, or from one or multiple instances of the same physiological signal 330 over time. The module 314 may execute one or more algorithms to map the physiological signals 330 to physiological parameters or to determine physiological parameter values 332. For example, a robust algorithm based on Fourier analysis may be used to compute, from the raw IMU data, the dominant oscillation period, which is directly related to the breathing rate.
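- As a non-limiting sketch of such a Fourier-based estimate, the following assumes the IMU data have already been reduced to a one-dimensional breathing waveform (for example, by the acquisition step sketched above); the search band of roughly 3-30 breaths per minute and the function name are assumptions for illustration.

```python
import numpy as np

def breathing_rate_bpm(waveform, fs_hz=100, band=(0.05, 0.5)):
    """Estimate respiration rate as the dominant oscillation frequency of the
    breathing waveform, restricted to a plausible respiration band (0.05-0.5 Hz,
    i.e., roughly 3-30 breaths per minute).  Assumes a minute or so of data so
    that the frequency resolution inside the band is adequate."""
    x = np.asarray(waveform, dtype=float)
    x = x - np.mean(x)
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs_hz)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    dominant_hz = freqs[in_band][np.argmax(spectrum[in_band])]
    return dominant_hz * 60.0  # convert Hz to breaths per minute
```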
- The physiological parameter mapping module 316 uses the physiological parameter values 332 to determine the immersive virtual environment 116 that is to be presented to the user. The physiological parameter mapping module 316 maps the physiological parameter values 332 received from the physiological signal processing module 314 to the features of the immersive virtual environment 116. For example, if the immersive virtual environment 116 includes audio and visual stimuli, the physiological parameter value and its mapping determine the features of the audio and visual stimuli to be presented to the user. In some embodiments, the mapping is accomplished by one or more look-up tables that indicate relationships between various physiological parameter values and features of the immersive virtual environment 116. For instance, a look-up table may link a physiological parameter value or range of values to a pre-determined audio volume and number or type of visual elements to display. In other embodiments, a continuous function (e.g., a linear or Gaussian function) may be used to define the mapping. Illustrative examples of mapping tables are shown below in TABLE 1 and TABLE 2.
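- As a non-limiting sketch of the continuous-function alternative mentioned above, the following interpolates an audio gain directly from a respiration rate; the breakpoints reuse the values that appear in TABLE 1 below, and the function and variable names are hypothetical.

```python
import numpy as np

# Breakpoints reuse the respiration rates (bpm) and audio gains of TABLE 1
# below; np.interp yields a piecewise-linear mapping between the table rows
# rather than a stepwise look-up.
RESP_BPM = [6.0, 8.0, 12.0, 16.0]
AUDIO_GAIN = [0.9, 0.2, 0.03, 0.003]

def audio_gain_for_respiration(bpm):
    """Continuous (piecewise-linear) mapping from respiration rate to the
    audio-gain feature of the immersive virtual environment."""
    return float(np.interp(bpm, RESP_BPM, AUDIO_GAIN))
```

A look-up table, by contrast, would simply return the row whose range contains the measured value, as sketched after TABLE 1 below.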
- In some cases, a single physiological parameter value of a single parameter may be used to determine all of the parts of the virtual environment 116 to be presented to the user, for example, both the visual elements and the audio elements. However, the mapping may be defined differently or determined separately for different elements of the virtual environment. For example, a mapping table or mapping function 234 may define relationships between respiration rates and features of the visual display 118, while another mapping table or mapping function 234 may define relationships between the respiration rates and features of the audio soundtrack 130. In other cases, multiple physiological parameters and their corresponding parameter values may be used. For example, one physiological parameter may be used to control the visual display 118 and a different physiological parameter may be used to control the audio 130 or other aspects of the two subsystems. Additionally, different mapping tables or functions 234 may be used to control the smart device(s) 266. - In some embodiments, the mapping table or mapping function used by the
parameter mapping module 316 may be customized for a particular user based on user customization data 344. The user customization data 344 may include, for example, user preferences, demographic information, or clinical sleep information specific to the user. As an example, thesystem 100 may include a number of different parameter mapping tables for different populations of users, and the user customization data 344 may be used to select an appropriate mapping table (based on, e.g., age, gender, or body size). The mapping tables or mapping functions, or portions thereof, may be stored in data storage of any of thedevices - With the parameter input value(s) 332 and the mapping table or function of the
parameter mapping module 316, thesystem 100 determines changes or adjustments to be made to the immersivevirtual environment 116 in response to the current parameter value(s) 332. For example, the immersivevirtual environment 116 may include a succession of stages, where each stage represents a particular combination of sensory stimuli, and the change or adjustment may include transitioning the presentation to a different stage of thevirtual environment 116. The specifications for these changes or adjustments are passed to the immersiveenvironment control module 318 as environment adjustments 334. - In some embodiments, the parameter values 332, corresponding environment adjustments 334, and subsequent parameter values 332 (e.g., a representation of the user's response to the previous environment adjustment 334) (which may be collectively referred to as “training data”) are passed to the
learning module 338 from time to time. The learning module 338 applies one or more artificial intelligence techniques (such as an unsupervised machine learning algorithm) to the training data to algorithmically learn the user's typical responses to different environment adjustments 334. Based on this learning, the learning module 338 formulates recommended mapping adjustments 336, which indicate modifications to the mapping function that are based on the user's actual behavior over time. The learning module 338 passes the mapping adjustments 336 to the parameter mapping module 316, which updates its mapping table or mapping function to incorporate the mapping adjustments 336. - In some embodiments, the
learning module 338 monitors the physiological signals over the course of a sleep session (e.g., overnight) and outputs feedback (e.g., in the morning) about sleep quality or overall cardiac functioning of the user. Alternatively or in addition, the learning module 338 can make modifications in the selection of the immersive scenario and/or the degree of immersion in subsequent sleep sessions (e.g., for the following night), in response to its assessments of the user's previous sleep quality and/or nocturnal physiology. In this way, the system 100 can, in an automated fashion, learn and change the immersion scenario or settings based on data indicating sleep patterns of a general population (and/or based on a user's individual nocturnal physiology, e.g., cardiac functioning).
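- The disclosure does not specify a particular learning algorithm; as a toy-level sketch of how observed responses could adjust a mapping, the following nudges per-stage respiration thresholds depending on whether the user's breathing actually slowed after each adjustment. The data layout, step size, and function name are assumptions and merely stand in for the learning techniques described above.

```python
def adjust_stage_thresholds(thresholds, training_data, step=0.5):
    """Toy mapping adjustment: if the respiration rate tends not to drop after
    a given stage is presented, relax (raise) that stage's bpm threshold so it
    is reached more easily; if it does drop, tighten the threshold slightly.

    thresholds:    list of bpm thresholds indexed by stage
    training_data: iterable of (stage_index, bpm_before, bpm_after) tuples
                   collected over one or more sleep sessions."""
    new_thresholds = list(thresholds)
    for stage, bpm_before, bpm_after in training_data:
        if bpm_after >= bpm_before:      # no relaxation response observed
            new_thresholds[stage] += step
        else:                            # favorable response; tighten a little
            new_thresholds[stage] = max(new_thresholds[stage] - step / 2, 0.0)
    return new_thresholds
```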
- The immersive environment control module 318 controls the modifications to the presentation of the immersive virtual environment 116 in response to the physiological signals 330. The immersive environment control module 318 receives the environment adjustments 334, accesses the requisite elements of the immersive environment(s) 222 (which, illustratively, include audio elements 340 and visual elements 342), and constructs a modified version of the virtual environment 116, incorporating the environment adjustments 334. Where the virtual environment 116 includes multiple different types of sensory stimuli, the control module 318 includes a modulator for each type of sensory stimulus. The audio modulator 320 controls the modification of the presentation of audio elements and their respective features (e.g., volume, content, speed, complexity, intensity, and/or other aspects of the audio soundtrack 130), while the visual scene modulator 322 controls the modification of the presentation of visual elements and their respective features (e.g., object movements, number and type of different objects displayed, color schemes, brightness levels, and/or other aspects of the visual display 118). The tactile modulator 324 and the temperature modulator 326 operate in a similar fashion to control tactile and temperature stimuli, and similar modulators operate similarly for other types of sensory stimuli. In this way, the illustrative immersive environment control module 318 constructs and adjusts the virtual environment 116 "on the fly," e.g., by performing graphics rendering in real time, as opposed to simply selecting and presenting previously created content. The immersive environment control module 318 transmits control signals to the virtual reality device 240 to cause the virtual reality device 240 to present the various adjustments to the virtual environment 116 to the user. - Referring now to
FIG. 4 , a flow diagram provides an illustration of amethod 400 by which embodiments ofsystem 100 may be used to, for example, conduct a sleep promotion session. Themethod 400 may be embodied as computerized programs, routines, logic and/or instructions that are executed by the computing system, e.g., thecomputing device 210 and/or thevirtual reality device 240. InFIG. 4 , a person attempting to fall asleep, or simply to become more relaxed, uses thesystem 100 to immerse themselves in a virtual reality environment. For example, the person may be instructed or coached by thesleep assistant 218 to slow his or her breathing rate (or may do so on his or her own) in order to cause the virtual reality environment to become more immersive. In themethod 400, a process of presenting an immersive virtual environment that adjusts automatically in response to sensor signals representing physiological activity of the user is performed. Of course, as discussed above, aspects of the user's physical environment (e.g., ambient lighting) can also be adjusted (e.g., by thesystem 100 interfacing with one or more “smart”devices 266 in the physical environment). Such adjustments to the physical environment may be initiated in conjunction with or separately from the adjustments to the virtual environment. For example, thesystem 100 may include one or moreseparate mapping functions 234 that thesleep assistant 218 may use to determine adjustments to be made to the physical environment in response to the user's physiological activity. - At
block 410, thesystem 100 selects a virtual environment to be presented by thevirtual reality device 240. As noted earlier, there are many different types of virtual environments that can be presented; for example, aquatic scenes (e.g., aquarium or ocean), general nature scenes, or other environments that are designed to promote sleep. Thesystem 100 can select a specific virtual environment in response to user input, as a result of default settings of thevirtual sleep assistant 218, or by accessing user customization data 344 (such as a user profile or preferences). Once the virtual environment is selected, thesystem 100 presents an initial stage of the virtual environment until a sufficient amount of biofeedback information is received to allow thesystem 100 to begin making dynamic adjustments to the virtual environment. - At
block 412, the system 100 receives physiological signals output by the physiological sensor(s) 232, 262, which represent physiological activity of a person using the system 100. At block 414, the system 100 processes the physiological signals received at block 412 and determines one or more physiological parameters and the current parameter values (e.g., breathing rate: 10 breaths per minute) as of the sampling instance. The parameter values can be calculated or estimated (e.g., based on a number of breaths detected in a given time interval). The parameter values can be determined by, for example, a computer-processing unit of the mobile or wearable computing device 210, or in computer processing units located directly in the physiological sensor(s) 232, 262. At block 416, the system 100 determines a stage of the immersive virtual environment to present, based on the current parameter values. In an illustrative embodiment, the process at block 416 includes a mapping function in the form of a look-up table that maps physiological parameter values to stages of the virtual environment. As shown in TABLE 1 below, each immersive virtual environment can be divided into a number of successive stages that can be presented to the user. Each stage relates to a physiological parameter value or a range of physiological parameter values. That is, where a physiological parameter has a range of possible values, each stage of the virtual environment relates to a different subset of the range of possible values. TABLE 1 illustrates the relationship between a few exemplary visual and audio features of an immersive virtual environment and an exemplary physiological parameter. -
TABLE 1 Physiological Parameter Mapping - Respiration Rate

| STAGE | Respiration Rate (bpm) | Audio Gains | Number of Primary Foreground Elements | Speed of Object Movement | Densities of Secondary Foreground Elements |
| --- | --- | --- | --- | --- | --- |
| 4 | 6 | 0.9 | 13 | 0.08 | 0.04/0.03/0.03 |
| 3 | 8 | 0.2 | 08 | 1.2 | 0.02/0.02/0.02 |
| 2 | 12 | 0.03 | 05 | 1.7 | 0.02/0.01/0.01 |
| 1 | 16 | 0.003 | 01 | 2.3 | 0.01/0.007/0.006 |

- In the example of TABLE 1, a single physiological parameter (respiration rate) is mapped to both visual and audio elements of an immersive virtual environment. Each value of the physiological parameter corresponds to a different stage of the immersive virtual environment, and each stage of the immersive virtual environment relates to audio and visual features that have different values.
The illustrative audio feature is gain (e.g., volume), and the illustrative visual features are the number of primary foreground elements (e.g., the fish in the example of FIG. 1), the speed of object movement (e.g., the speed at which the fish travel across the display), and the densities of secondary foreground elements (e.g., the density of the bubbles of FIG. 1). In general, a low respiration rate promotes heart rate variability and, consequently, decreases heart rate. Heart rate variability has been found to significantly increase at a respiratory frequency of 6 breaths per minute. Inducing low levels of physiological activity (e.g., lowering heart rate voluntarily via paced breathing) across the sleep onset process helps the system 100 to reduce the elevated level of psychophysiological activity (which is typical in insomnia and other conditions involving elevated stress and anxiety) at the beginning of the sleep session and helps the individual fall asleep. In sleep research, 6 breaths per minute corresponds to a target breathing rate for obtaining maximum relaxation. Thus, in TABLE 1, the higher breathing rates correspond to earlier stages in the succession of virtual environment stages, and lower breathing rates correspond to later stages. According to the example of TABLE 1, the virtual environment becomes more immersive (presenting a higher number of primary foreground elements, a higher density of secondary foreground elements, and louder audio) as the respiration rate decreases. However, the speed of movement of the displayed objects becomes slower as the respiration rate decreases. Using a mapping such as that illustrated by TABLE 1 enables the system 100 to gradually present a more immersive experience if the user increases his or her relaxation and reacts favorably to the previously-presented stage of the virtual environment. In the illustrated embodiments, the system 100 increases the degree of virtual immersion in response to reductions in the user's respiration rate. Once the user's respiration has decreased, the system 100 can make adjustments to the immersive virtual environment 116 based on other criteria, such as the previously-presented stages of the immersive virtual environment 116 (e.g., adjust the quantity or speed of visual features based on the quantity or speed of the visual features presented in the previous stage).
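- As a non-limiting illustration of a look-up-table implementation of the TABLE 1 mapping, the following returns the most immersive stage whose respiration-rate entry the measured value has reached; the data structure and function name are assumptions for this sketch.

```python
# Stage entries taken from TABLE 1: stage 4 is the most immersive (slowest
# breathing), stage 1 the least immersive.
TABLE_1 = {
    4: {"bpm": 6,  "audio_gain": 0.9,   "primary_elements": 13, "object_speed": 0.08},
    3: {"bpm": 8,  "audio_gain": 0.2,   "primary_elements": 8,  "object_speed": 1.2},
    2: {"bpm": 12, "audio_gain": 0.03,  "primary_elements": 5,  "object_speed": 1.7},
    1: {"bpm": 16, "audio_gain": 0.003, "primary_elements": 1,  "object_speed": 2.3},
}

def stage_for_respiration(bpm):
    """Return the highest (most immersive) stage whose respiration-rate entry
    the measured value has reached, per the TABLE 1 mapping."""
    for stage in (4, 3, 2, 1):
        if bpm <= TABLE_1[stage]["bpm"]:
            return stage
    return 1  # breathing faster than the stage-1 rate: least immersive stage
```

For example, a measured rate of 7 breaths per minute would fail the stage-4 test and land in stage 3, whose features (audio gain 0.2, eight primary foreground elements, and so on) would then be presented.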
- In TABLE 2 below, an illustrative example of a mapping function relating to the use of muscle activity as the primary feedback parameter is shown. As discussed above, the system 100 can adjust the immersive virtual environment 116 (and/or an aspect of the user's physical environment) in response to the detection of the user's muscle activity. For example, two electromyogram (EMG) sensors can be incorporated in a "sleep mask" to detect the muscle activity of the corrugator supercilii muscle (by detecting the electrical potential generated by muscle bundles). The resting EMG tone may be recorded for a short time (e.g., 1 min) while the user is lying down in bed maintaining a neutral "position," to determine the baseline EMG tone (μV). The individual may then be instructed or coached by the sleep assistant 218 to decrease his or her level of "muscle contraction" in his or her facial muscles, and particularly in the forehead (or, of course, the user may do so on his or her own, without coaching). The stages of immersion in the virtual environment 116 may increase based on the percentage decrease in muscle contraction from the baseline levels. -
TABLE 2 Physiological Parameter Mapping - Muscle Activity

| STAGE | Muscle Tone (% change with respect to baseline levels) | Audio Gains | Number of Primary Foreground Elements | Speed of Object Movement | Densities of Secondary Foreground Elements |
| --- | --- | --- | --- | --- | --- |
| 4 | −60 | 0.9 | 13 | 0.08 | 0.04/0.03/0.03 |
| 3 | −30 | 0.2 | 08 | 1.2 | 0.02/0.02/0.02 |
| 2 | −10 | 0.03 | 05 | 1.7 | 0.02/0.01/0.01 |
| 1 | Individuals' baseline levels (in μV) | 0.003 | 01 | 2.3 | 0.01/0.007/0.006 |

- Of course, the mechanics of each stage of the immersive virtual environment are not limited to the types of features and mappings shown in TABLE 1 and TABLE 2, or to the data values shown in TABLE 1 and TABLE 2. Other strategies for dynamically changing the immersive virtual environment to induce sleep are within the scope of this disclosure.
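- By way of a non-limiting illustration of the TABLE 2 mapping, the following computes the percentage change in EMG tone relative to the recorded baseline and selects a stage accordingly; the function name and argument units are assumptions for this sketch.

```python
def emg_stage(current_uv, baseline_uv):
    """Map the percentage change in EMG tone, relative to the resting baseline
    recorded at the start of the session, to a stage of the immersive virtual
    environment using the thresholds of TABLE 2."""
    pct_change = 100.0 * (current_uv - baseline_uv) / baseline_uv
    if pct_change <= -60:
        return 4      # largest reduction in muscle tone: most immersive stage
    if pct_change <= -30:
        return 3
    if pct_change <= -10:
        return 2
    return 1          # at or near the individual's baseline level
```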
- At
block 418, the immersive virtual environment is presented using the virtual reality device 240. To do this, the system 100 constructs the appropriate stage of the immersive virtual environment and transmits the stage content and control commands to the virtual reality device 240. Once received, the virtual reality device 240 executes the commands to present the virtual environment. In some embodiments, portions of the stage content (e.g., the visual elements and/or audio elements) may be stored in the virtual reality device 240, such that the system 100 only transmits control commands to the device 240. - To accomplish dynamic changes in the virtual environment, the
system 100 processes frequent physiological feedback data from thesensors system 100 may process the physiological data at a frequency that corresponds to the internal sampling rate of the computing device 210 (e.g., 100 Hz for a standard smart phone). Atblock 420, thesystem 100 receives new physiological signals that are detected subsequent to the presentation of the stage of the virtual environment atblock 418. Atblock 422, new physiological parameter values are calculated from the new physiological signals received atblock 420. - The
system 100 considers whether to continue the biofeedback virtual reality sleep promotion atblock 424. If it is determined that the virtual reality sleep promotion is to be discontinued, then themethod 400 concludes atblock 428 and thesystem 100 discontinues the presentation of the virtual environment. In some embodiments, the virtual reality sleep promotion is discontinued by a timer set to turn thesleep assistant 218 off after sleep promotion has been running for a certain period of time. In other embodiments, the virtual reality sleep promotion may be stopped due to an input from a user. In still other embodiments, thesystem 100 determines a sleep state based on the physiological signals or using a gaze detector incorporated into the virtual sleep assistant hardware that detects the user closing his or her eyes. In some cases, thesystem 100 may turn off thevirtual sleep assistant 218 upon detecting the closing of the person's eyes, or turn off only the visual display when the eyes of the user are closed. - In yet another embodiment, the physiological feedback data may be used to detect a state of full sleep, or a state sufficiently close to full sleep, and turn off the
sleep assistant 218 after certain physiological conditions have been met. As an example, the system 100 can detect, based on the physiological signals, whether a person has fallen asleep or wishes to discontinue using the system 100 as follows. When the person begins using the system 100, they begin by consciously slowing their breathing rate, and the system 100 detects a low breathing rate. However, when people fall asleep, they lose voluntary control of their own breathing. Therefore, once the person falls asleep, their breathing rate returns to "normal," and the system 100 detects an increase in the breathing rate relative to the previously-slowed breathing rate (e.g., the breathing rate voluntarily slowed by the user performing a relaxation technique while conscious). The system 100 can thus turn off the sleep assistant application 218 when the system 100 detects a normal breathing rate for a certain period of time (e.g., when the person falls asleep) after having previously detected a low breathing rate for a certain period of time. A return to a normal breathing rate could also mean that the user has discontinued the voluntary slow breathing because the person does not want to use the device anymore. In this case as well, the system 100 can turn off the sleep assistant application 218 in response to the return to a normal breathing rate. In this way, the sleep assistant 218 is configured to guide individuals toward sleep, starting from a conscious level (which typically occurs at the beginning of the night, when the person is still awake), through intermediate stages in which users use the VR biofeedback system, up to the point at which they fall asleep (unconsciousness). During the intermediate stages, the system 100 automatically adjusts the immersive virtual environment (by increasing the sense of presence or degree of immersiveness) so that the user progressively feels that the (unreal) virtual environment is actually their real (physical) environment. As the user's sense of presence in the virtual environment increases, the user's mind is distracted from aspects of their real environment that normally disrupt sleep (such as physical features of the room, emotional connections with the physical environment, and thoughts of worry and rumination).
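- A minimal sketch of this shut-off heuristic, assuming respiration is sampled as a series of breaths-per-minute estimates, is shown below; the thresholds and run lengths are illustrative assumptions, not values taken from the disclosure.

```python
def should_turn_off(bpm_history, slow_bpm=8.0, normal_bpm=12.0, run_len=30):
    """Return True once a sustained run of voluntarily slowed breathing is
    followed by a sustained return to a normal rate (interpreted as sleep
    onset, or as the user abandoning the relaxation exercise)."""
    slow_run = 0
    normal_run = 0
    slowed_phase_seen = False
    for bpm in bpm_history:
        if bpm <= slow_bpm:
            slow_run += 1
            normal_run = 0
            if slow_run >= run_len:
                slowed_phase_seen = True
        elif bpm >= normal_bpm and slowed_phase_seen:
            normal_run += 1
            if normal_run >= run_len:
                return True
        else:
            slow_run = 0
            normal_run = 0
    return False
```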
- If the virtual reality sleep promotion is to be continued, the system 100 determines whether the stage of the virtual environment (and/or an aspect of the physical environment, e.g., a setting of a smart device 266) is to be changed, at block 426. The determination as to whether to change the virtual and/or physical environment can be made in the same manner as described in block 416. That is, the system maps the new physiological parameter values determined at block 422 to a stage of the virtual and/or physical environment (using, e.g., one or more mapping functions 234). The new parameter values may relate to the stage(s) of the virtual and/or physical environments that are currently being presented, in which case no change is made to the virtual and/or physical environment, and the system 100 returns to block 418 and continues presenting the same stage of the virtual and/or physical environment(s) as was done previously. If the new parameter values relate to a different stage of the virtual and/or physical environment(s) than the stage that is currently being presented, the system 100 returns to block 416 and proceeds to determine the specifications for and present the new stage. In other embodiments, the decision at block 426 may be performed by comparing the old physiological parameter value determined at block 414 to the new physiological parameter value determined at block 422. If the old physiological parameter value and the new physiological parameter value are the same or within an acceptable range of difference, the system 100 continues presenting the current stage of the virtual and/or physical environment(s), and the process of monitoring physiological signals continues. If the old physiological parameter value and the new physiological parameter value are different or outside an acceptable range of difference, then the stage of the virtual and/or physical environment(s) is updated to correspond to the new physiological parameters, and the process of monitoring physiological signals continues.
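- A hedged sketch of the block 426 decision, assuming the parameter is a respiration rate and the tolerance is a free choice, follows; the "remap" argument stands in for whatever value-to-stage mapping is in use (for example, the TABLE 1 look-up sketched earlier), and all names are hypothetical.

```python
def next_stage(current_stage, old_value, new_value, remap, tolerance=1.0):
    """Keep presenting the current stage when the newly measured parameter
    value is within an acceptable range of the previous one; otherwise re-map
    the new value to a stage using the supplied mapping function."""
    if abs(new_value - old_value) <= tolerance:
        return current_stage       # within tolerance: no change to the stage
    return remap(new_value)        # outside tolerance: determine a new stage
```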
- Referring now to FIGS. 5 and 6, illustrative plots of sensor data are shown, which compare the use of an inertial measurement unit (IMU) to measure respiration rate to the results obtained using a piezo respiratory effort band ("p-band"), which is the conventional "gold standard" method used to capture respiration data during polysomnographic sleep recordings. In FIG. 5, the plot 500 shows low frequency breathing of a person during controlled feedback relaxation induced by the sleep assistant 218, but prior to sleep. Graph line 510 shows the breathing frequency measured by the p-band over time. Graph lines -
Graph line 524 shows the breathing frequency (in Hertz) of a person as measured by the p-band, andgraph line 526 shows the breathing frequency of a person measured using the IMU. Both the p-band and the IMU measurement techniques exhibit similar performance. Theplot 600, found inFIG. 6 , is nearly identical to thegraph 500, found inFIG. 5 , except that theplot 600 is a measurement of breathing frequency of a person during a period of sleep.Graph lines graph lines - It should be noted that the breathing rate can be affected by artifacts such as body movements, which usually occur at the sleep onset (e.g., people turning over or changing position, etc.) In some embodiments, in order to avoid rapid changes in the feedback output due to body movements, the
system 100 executes a function (e.g., a smoothing function) to correct the artifact before providing the feedback to the sleep assistant 218.
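- A minimal sketch of such an artifact-correction step, assuming the feedback value is a series of breathing-rate estimates and using a rolling median (one of several smoothing choices the disclosure leaves open), is shown below; the window length and function name are assumptions.

```python
import numpy as np

def smooth_breathing_rate(bpm_series, window=9):
    """Suppress brief spikes in the estimated breathing rate caused by body
    movements (e.g., turning over at sleep onset) with a rolling median
    before the value is fed back to the sleep assistant."""
    x = np.asarray(bpm_series, dtype=float)
    half = window // 2
    padded = np.pad(x, half, mode="edge")
    return np.array([np.median(padded[i:i + window]) for i in range(len(x))])
```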
- Referring now to FIGS. 7-9, exemplary plots of test results obtained during trials illustrate the effectiveness of an embodiment of the sleep assistant 218 in comparison to a baseline night in which the sleep assistant 218 was not used. FIG. 7 shows that a lower heart rate is established during an initial period of low breathing rate using the sleep assistant 218 and that the lower heart rate is maintained after the onset of sleep. FIG. 8 shows that, in the same trial, the lower heart rate was established with the sleep assistant 218 prior to sleep and maintained during both rapid eye movement (REM) and non-rapid eye movement (NREM) periods of sleep, across the whole night. FIG. 9 compares a measure of sleep quality for a baseline night in which the sleep assistant 218 was not used and a night in which the sleep assistant 218 was used, and shows that sleep quality improved with the use of the sleep assistant 218. - Illustrative examples of the technologies disclosed herein are provided below. An embodiment of the technologies may include any one or more, and any combination of, the examples described below.
- In an example 1, a method for promoting sleep includes, with a biofeedback virtual reality system: monitoring a physiological signal received from a sensor over time; presenting an immersive virtual environment with a virtual reality device, the immersive virtual environment comprising a display of visual elements designed to promote sleep; detecting a change in the physiological signal, and in response to the detected change in the physiological signal: applying biofeedback technology to determine an adjustment to the immersive virtual environment, wherein the adjustment is to change the display of visual elements; and presenting the adjustment to the immersive virtual environment with the virtual reality device.
- In an example 2, the method includes the subject matter of example 1 and includes receiving the physiological signal at a mobile or wearable sensing and computing device, and determining one or more physiological parameters based on the physiological signal. In an example 3, the method includes the subject matter of example 1 or example 2 and includes presenting the immersive virtual environment in response to a user actively attempting to control a physiological parameter being sensed by the sensor. In an example 4, the method includes the subject matter of any of the preceding examples and includes selecting the immersive virtual environment from a plurality of stored immersive virtual environments based on the physiological signals and/or user customization data. In an example 5, the method includes the subject matter of any of the preceding examples and includes determining user customization data and determining the adjustment to the immersive virtual environment based on the user customization data. In an example 6, the method includes the subject matter of any of the preceding examples and includes, wherein the immersive virtual environment comprises an audio soundtrack, applying biofeedback technology to determine an adjustment to the audio soundtrack and applying the adjustment to the audio soundtrack with the virtual reality device. In an example 7, the method includes the subject matter of any of the preceding examples and includes determining a mapping defining a relationship between physiological signals and elements of the immersive virtual environment, wherein the mapping is defined to promote sleep, and using the mapping to determine the adjustment to the immersive virtual environment. In an example 8, the method includes the subject matter of any of the preceding examples and includes storing data relating to adjustments made to the immersive virtual environment over time and physiological signals monitored after the adjustments have been made, applying an artificial intelligence or machine learning technique to the stored data to algorithmically learn a modification to the mapping, and updating the mapping to include the learned modification.
- In an example 9, the method includes the subject matter of any of the preceding examples and includes detecting a sleep state based on the monitoring of the physiological signal and turning off the display of visual elements in response to the sleep state. In an example 10, the method includes the subject matter of any of the preceding examples and includes, wherein the physiological signal represents a respiration rate or a heart rate or muscle activity, the monitoring detects a change in the respiration rate, heart rate, or muscle activity, in response to the change in the respiration rate, heart rate or muscle activity, changing a speed, quantity, density, frequency, color, brightness, contrast, direction, depth, focus, point of view, and/or complexity of one or more of the visual elements in the presentation of the immersive virtual environment. In an example 11, the method includes the subject matter of any of the preceding examples and includes, wherein the immersive virtual environment further comprises an audio soundtrack, changing the volume, content, speed, complexity, and/or intensity of the audio soundtrack in response to the change in the respiration rate or heart rate. In an example 12, the method includes the subject matter of any of the preceding examples and includes, wherein the physiological signal represents a respiration rate or a heart rate, the monitoring detects a decrease in the respiration rate or heart rate, and the method comprises, in response to the decrease in the respiration rate or heart rate, decreasing speed, and increasing quantity, density and/or frequency of one or more of the visual elements in the presentation of the immersive virtual environment. In an example 13, the method includes the subject matter of any of the preceding examples and includes, wherein the immersive virtual environment further comprises an audio soundtrack, increasing the volume or degree of surround sound at which the audio soundtrack is played in response to the decrease in the respiration rate or heart rate. In an example 14, the method includes the subject matter of any of the preceding examples and includes, wherein the physiological signal represents a respiration rate or a heart rate or a rate of muscle activity, the monitoring detects an increase in the respiration rate or heart rate or muscle activity, in response to the increase in the respiration rate or heart rate or muscle activity, increasing speed, and decreasing quantity, density, and/or frequency of one or more of the visual elements in the presentation of the immersive virtual environment.
- In an example 15, the method includes the subject matter of any of the preceding examples and includes decreasing the volume at which the audio soundtrack is played in response to the increase in the respiration rate or heart rate. In an example 16, the method includes the subject matter of any of the preceding examples and includes determining a value of a physiological parameter based on the physiological signal, wherein the physiological parameter has a range of possible values, the immersive virtual environment comprises a plurality of visual stages, each of the visual stages comprises a different arrangement of visual elements, each of the visual stages corresponds to a different subset of the range of possible values of the physiological parameter, determining the adjustment comprises selecting a visual stage corresponding to the determined value of the physiological parameter, and presenting the adjustment comprises presenting the selected visual stage. In an example 17, the method includes the subject matter of any of the preceding examples and includes, wherein the immersive virtual environment comprises a plurality of audio stages, each of the audio stages comprises a different arrangement of audio elements, each of the audio stages corresponds to a different subset of the range of possible values of the physiological parameter, determining the adjustment comprises selecting an audio stage corresponding to the determined value of the physiological parameter, and presenting the adjustment comprises presenting the selected audio stage. In an example 18, the method includes the subject matter of any of the preceding examples and includes determining, a value of a physiological parameter from the physiological signal, wherein the physiological parameter comprises a respiration rate, a heart rate, an electroencephalography (EEG) measurement, a measure of muscle activity, and/or a human body temperature, and determining the adjustment to the immersive virtual environment based on the value of the physiological parameter. In an example 19, the method includes the subject matter of any of the preceding examples and includes, wherein the immersive virtual environment further comprises an audio soundtrack, determining a visual adjustment to adjust the display of visual elements and determining an audio adjustment to adjust the audio soundtrack. In an example 20, the method includes the subject matter of any of the preceding examples and includes determining the visual adjustment independently of the determining of the audio adjustment. In an example 21, the method includes the subject matter of any of the preceding examples and includes, wherein the immersive virtual environment comprises a plurality of different sensory stimuli, independently adjusting each of the different sensory stimuli in response to the change in the physiological signal.
- An example 22 includes a biofeedback virtual reality system for promoting sleep, the biofeedback virtual reality system including: a sensor to detect a physiological signal; a mobile or wearable computing device to: receive the physiological signal; determine a value of a physiological parameter based on the physiological signal; map the value of the physiological parameter to a stage of an immersive virtual environment of a plurality of stored immersive virtual environments, each of the stored immersive virtual environments comprising a succession of stages designed to promote sleep, each of the stages comprising a different arrangement of sensory stimuli; and a virtual reality device in communication with the mobile or wearable computing device, the virtual reality device to present the stage of the immersive virtual environment; wherein the mobile or wearable computing device is to determine a new value of the physiological parameter and map the new value of the physiological parameter to a new stage of the immersive virtual environment; and wherein the virtual reality device is to present the new stage of the immersive virtual environment in response to the new value of the physiological parameter.
- In an example 23, the system includes the subject matter of example 22, wherein the mobile or wearable computing device comprises a smartphone, a tablet computer, an attachable/detachable device, a smart watch, smart glasses, a smart wristband, smart jewelry, and/or smart apparel. In an example 24, the system includes the subject matter of example 22 or example 23, wherein at least two of the mobile or wearable computing device, the virtual reality device, and the sensor are embodied as a unitary device. In an example 25, the system includes the subject matter of any of examples 22-24, wherein the mobile or wearable computing device receives the physiological signal through wireless communication and/or the mobile or wearable computing device communicates with the virtual reality device through wireless communication. In an example 26, the system includes the subject matter of any of examples 22-25, wherein the sensor comprises a motion sensor, and wherein the mobile or wearable computing device determines a respiration rate from the output of the motion sensor. In an example 27, the system includes the subject matter of any of examples 22-26, wherein the mobile or wearable computing device comprises a positioner to position the mobile or wearable computing device to detect human body motion indicating breathing. In an example 28, the system includes the subject matter of any of examples 22-27, wherein the mobile or wearable computing device is to receive a plurality of different physiological signals, determine a value of each of a plurality of different physiological parameters based on the plurality of different physiological signals, and determine a stage of the immersive virtual environment based on the values of the different physiological parameters. In an example 29, the system includes the subject matter of any of examples 22-28, wherein the immersive virtual environment comprises an arrangement of visual elements including an avatar that interacts with the immersive virtual environment in response to the physiological signal. In an example 30, the system includes the subject matter of any of examples 22-29, comprising a gaze detector in communication with the mobile or wearable computing device, wherein the mobile or wearable computing device is to manipulate the immersive virtual environment in response to output of the gaze detector. In an example 31, the system includes the subject matter of any of examples 22-30, wherein the virtual reality device comprises virtual reality eyewear and headphones. In an example 32, the system includes the subject matter of any of examples 22-31, wherein the virtual reality device comprises high-definition video glasses, a non-rigid sleep mask, a television, a projector to project a display of visual elements onto a wall or ceiling, and/or one or more remote speakers.
- An example 33 includes a biofeedback virtual reality sleep assistant embodied in one or more computer accessible media, the biofeedback virtual reality sleep assistant including: a physiological signal processor to receive one or more physiological signals from one or more sensing devices; a physiological signal processing module to monitor one or more physiological parameters from the one or more physiological signals over time, each of the physiological parameters having a range of possible values, and to determine a value of each of the physiological parameters at a plurality of different instances in time; a physiological parameter mapping module to map the values of the one or more physiological parameters at an instance in time to a stage of an immersive virtual environment selected from a plurality of stored immersive virtual environments, each of the immersive virtual environments comprising at least a visual display and an audio soundtrack, each of the visual display and the audio soundtrack having a plurality of successive stages designed to promote sleep; and an immersive environment control module to present the stage of the selected immersive virtual environment by one or more virtual reality devices; wherein the physiological signal processing module is to detect changes in the values of the one or more physiological parameters over time; and wherein the physiological parameter mapping module is to change the stage of the selected immersive virtual environment in response to the changes in the values of the one or more physiological parameters.
- In an example 34, the sleep assistant includes the subject matter of example 33, wherein the physiological parameter mapping module is to map the values of the one or more physiological parameters to a stage of an immersive virtual environment by executing a continuous mapping function or by accessing a lookup table. In an example 35, the sleep assistant includes the subject matter of example 33, wherein the physiological parameter mapping module is to map the values of the one or more physiological parameters to a stage of the visual display and separately map the values of the one or more physiological parameters to a stage of the audio soundtrack. In an example 36, the sleep assistant includes the subject matter of example 33, wherein the immersive environment control module is to construct the selected immersive virtual environment in real time by adding, deleting, or changing elements of the visual display and/or the audio soundtrack based on the values of the one or more physiological parameters. In an example 37, the sleep assistant includes the subject matter of example 33, wherein the immersive environment control module is to communicate with a smart device to control an aspect of a physical environment in response to changes in the values of the one or more physiological parameters over time.
- An example 38 includes an article of manufacture including, embodied in one or more computer accessible storage media: an immersive virtual environment comprising a display of visual elements and an audio soundtrack, wherein the display and the audio soundtrack each have a plurality of stages that are coordinated with different values of at least one physiological parameter.
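Examples 26 and 27 above describe a mobile or wearable computing device that determines a respiration rate from the output of a motion sensor positioned to pick up breathing movement. The disclosure does not prescribe a particular algorithm; the following is a minimal, non-authoritative Python sketch of one way such an estimate could be made, assuming a one-dimensional accelerometer trace and the hypothetical helper name respiration_rate_bpm.

```python
import numpy as np

def respiration_rate_bpm(accel: np.ndarray, fs: float) -> float:
    """Estimate breaths per minute from a 1-D chest/abdomen motion trace.

    accel: raw accelerometer samples (arbitrary units); fs: sample rate in Hz.
    Breathing appears as a slow oscillation (roughly 0.1-0.5 Hz), so the sketch
    removes the DC offset, smooths out fast motion, and picks the dominant
    frequency in the breathing band from an FFT.
    """
    x = accel - np.mean(accel)                       # drop gravity/DC offset
    window = max(1, int(fs))                         # ~1 second moving average
    x = np.convolve(x, np.ones(window) / window, mode="same")

    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)

    band = (freqs >= 0.1) & (freqs <= 0.5)           # 6 to 30 breaths per minute
    dominant_hz = freqs[band][np.argmax(spectrum[band])]
    return float(dominant_hz * 60.0)

# Synthetic check: 0.25 Hz breathing (15 breaths/min) buried in sensor noise.
fs = 50.0
t = np.arange(0, 60, 1.0 / fs)
trace = np.sin(2 * np.pi * 0.25 * t) + 0.2 * np.random.randn(t.size)
print(round(respiration_rate_bpm(trace, fs)))        # expected: ~15
```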
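Example 34 states that the mapping from physiological parameter values to a stage of an immersive virtual environment may be implemented either by executing a continuous mapping function or by accessing a lookup table. The sketch below illustrates both options for a single breathing-rate parameter; the stage names, thresholds, and function names are invented for illustration and do not appear in the disclosure.

```python
# Hypothetical lookup table: breaths-per-minute thresholds (descending) mapped to
# progressively "deeper" stages of a stored immersive virtual environment.
BREATH_RATE_STAGE_TABLE = [
    (18.0, "stage_1_settling"),
    (14.0, "stage_2_slowing"),
    (10.0, "stage_3_drifting"),
    (0.0, "stage_4_sleep_onset"),
]

def stage_from_lookup(breaths_per_minute: float) -> str:
    """Discrete mapping: return the first stage whose threshold the value meets."""
    for threshold, stage in BREATH_RATE_STAGE_TABLE:
        if breaths_per_minute >= threshold:
            return stage
    return BREATH_RATE_STAGE_TABLE[-1][1]

def stage_from_continuous(breaths_per_minute: float,
                          relaxed_bpm: float = 6.0,
                          alert_bpm: float = 20.0,
                          n_stages: int = 4) -> int:
    """Continuous mapping: normalize the value onto [0, 1], then scale to a stage index."""
    frac = (alert_bpm - breaths_per_minute) / (alert_bpm - relaxed_bpm)
    frac = max(0.0, min(1.0, frac))                  # clamp values outside the range
    return min(n_stages - 1, int(frac * n_stages))

print(stage_from_lookup(12.5))      # -> "stage_3_drifting"
print(stage_from_continuous(12.5))  # -> 2 (third of four stages, zero-indexed)
```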
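Example 36 describes constructing the selected environment in real time by adding, deleting, or changing elements of the visual display and/or the audio soundtrack as the parameter values change. As a rough illustration only, the following sketch slows and densifies the visual elements (and nudges the soundtrack volume) when the user's breathing slows, in the spirit of that example; the scene fields and the gain constant are assumptions, not values taken from the disclosure. In a real system, the adjusted scene description would then be handed to the virtual reality device for rendering.

```python
from dataclasses import dataclass

@dataclass
class Scene:
    """Toy stand-in for one stage's display of visual elements and soundtrack."""
    element_speed: float   # drift speed of the visual elements (arbitrary units)
    element_count: int     # quantity/density of elements on screen
    audio_volume: float    # soundtrack volume, 0.0-1.0

def adjust_scene(scene: Scene, previous_bpm: float, current_bpm: float,
                 gain: float = 0.05) -> Scene:
    """Slower breathing -> slower, denser visuals and a slightly louder soundtrack;
    faster breathing reverses the adjustment. The gain is an invented constant."""
    delta = previous_bpm - current_bpm               # positive when breathing slows
    return Scene(
        element_speed=max(0.1, scene.element_speed * (1.0 - gain * delta)),
        element_count=max(1, int(scene.element_count * (1.0 + gain * delta))),
        audio_volume=min(1.0, max(0.0, scene.audio_volume + gain * delta / 10.0)),
    )

scene = Scene(element_speed=1.0, element_count=40, audio_volume=0.5)
print(adjust_scene(scene, previous_bpm=14.0, current_bpm=12.0))
# roughly Scene(element_speed=0.9, element_count=44, audio_volume=0.51)
```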
- In the foregoing description, numerous specific details, examples, and scenarios are set forth in order to provide a more thorough understanding of the present disclosure. It will be appreciated, however, that embodiments of the disclosure may be practiced without such specific details. Further, such examples and scenarios are provided for illustration, and are not intended to limit the disclosure in any way. Those of ordinary skill in the art, with the included descriptions, should be able to implement appropriate functionality without undue experimentation.
- References in the specification to “an embodiment,” etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is believed to be within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly indicated.
- Embodiments in accordance with the disclosure may be implemented in hardware, firmware, software, or any combination thereof. Embodiments may also be implemented as instructions stored using one or more machine-readable media, which may be read and executed by one or more processors. A machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computing device or a “virtual machine” running on one or more computing devices). For example, a machine-readable medium may include any suitable form of volatile or non-volatile memory.
- Modules, data structures, and the like defined herein are defined as such for ease of discussion, and are not intended to imply that any specific implementation details are required. For example, any of the described modules and/or data structures may be combined or divided into sub-modules, sub-processes or other units of computer code or data as may be required by a particular design or implementation.
- In the drawings, specific arrangements or orderings of schematic elements may be shown for ease of description. However, the specific ordering or arrangement of such elements is not meant to imply that a particular order or sequence of processing, or separation of processes, is required in all embodiments. In general, schematic elements used to represent instruction blocks or modules may be implemented using any suitable form of machine-readable instruction, and each such instruction may be implemented using any suitable programming language, library, application-programming interface (API), and/or other software development tools or frameworks. Similarly, schematic elements used to represent data or information may be implemented using any suitable electronic arrangement or data structure. Further, some connections, relationships or associations between elements may be simplified or not shown in the drawings so as not to obscure the disclosure.
- This disclosure is to be considered as exemplary and not restrictive in character, and all changes and modifications that come within the spirit of the disclosure are desired to be protected.
Claims (38)
1. A method for promoting sleep, the method comprising, with a biofeedback virtual reality system:
monitoring a physiological signal received from a sensor over time;
presenting an immersive virtual environment with a virtual reality device, the immersive virtual environment comprising a display of visual elements designed to promote sleep;
detecting a change in the physiological signal, and in response to the detected change in the physiological signal:
applying biofeedback technology to determine an adjustment to the immersive virtual environment, wherein the adjustment is to change the display of visual elements; and
presenting the adjustment to the immersive virtual environment with the virtual reality device.
2. The method of claim 1 , comprising receiving the physiological signal at a mobile or wearable sensing and computing device, and determining one or more physiological parameters based on the physiological signal.
3. The method of claim 1 , wherein the presenting of the immersive virtual environment is in response to a user actively attempting to control a physiological parameter being sensed by the sensor.
4. The method of claim 1 , comprising selecting the immersive virtual environment from a plurality of stored immersive virtual environments based on the physiological signal and/or user customization data.
5. The method of claim 1 , comprising determining user customization data and determining the adjustment to the immersive virtual environment based on the user customization data.
6. The method of claim 1 , wherein the immersive virtual environment comprises an audio soundtrack, and the method comprises applying biofeedback technology to determine an adjustment to the audio soundtrack and applying the adjustment to the audio soundtrack with the virtual reality device.
7. The method of claim 1 , comprising determining a mapping defining a relationship between physiological signals and elements of the immersive virtual environment, wherein the mapping is defined to promote sleep, and using the mapping to determine the adjustment to the immersive virtual environment.
8. The method of claim 7 , comprising storing data relating to adjustments made to the immersive virtual environment over time and physiological signals monitored after the adjustments have been made, applying an artificial intelligence or machine learning technique to the stored data to algorithmically learn a modification to the mapping; and updating the mapping to include the learned modification.
9. The method of claim 1 , comprising detecting a sleep state based on the monitoring of the physiological signal and turning off the display of visual elements in response to the sleep state.
10. The method of claim 1 , wherein the physiological signal represents a respiration rate or a heart rate or muscle activity, the monitoring detects a change in the respiration rate, heart rate, or muscle activity, and the method comprises, in response to the change in the respiration rate, heart rate, or muscle activity, changing a speed, quantity, density, frequency, color, brightness, contrast, direction, depth, focus, point of view, and/or complexity of one or more of the visual elements in the presentation of the immersive virtual environment.
11. The method of claim 10 , wherein the immersive virtual environment further comprises an audio soundtrack, and the method comprises changing the volume, content, speed, complexity, and/or intensity of the audio soundtrack in response to the change in the respiration rate or heart rate.
12. The method of claim 1 , wherein the physiological signal represents a respiration rate or a heart rate, the monitoring detects a decrease in the respiration rate or heart rate, and the method comprises, in response to the decrease in the respiration rate or heart rate, decreasing speed, and increasing quantity, density and/or frequency of one or more of the visual elements in the presentation of the immersive virtual environment.
13. The method of claim 12 , wherein the immersive virtual environment further comprises an audio soundtrack, and the method comprises increasing the volume or degree of surround sound at which the audio soundtrack is played in response to the decrease in the respiration rate or heart rate.
14. The method of claim 13 , wherein the physiological signal represents a respiration rate or a heart rate or a rate of muscle activity, the monitoring detects a change in the respiration rate or heart rate or muscle activity, and the method comprises, in response to the change in the respiration rate or heart rate or muscle activity, changing speed, quantity, density, and/or frequency of one or more of the visual elements in the presentation of the immersive virtual environment.
15. The method of claim 14 , comprising changing the volume at which the audio soundtrack is played in response to the change in the respiration rate or heart rate.
16. The method of claim 1 , comprising determining a value of a physiological parameter based on the physiological signal, wherein the physiological parameter has a range of possible values, the immersive virtual environment comprises a plurality of visual stages, each of the visual stages comprises a different arrangement of visual elements, each of the visual stages corresponds to a different subset of the range of possible values of the physiological parameter, determining the adjustment comprises selecting a visual stage corresponding to the determined value of the physiological parameter, and presenting the adjustment comprises presenting the selected visual stage.
17. The method of claim 1 , wherein the immersive virtual environment comprises a plurality of audio stages, each of the audio stages comprises a different arrangement of audio elements, each of the audio stages corresponds to a different subset of the range of possible values of the physiological parameter, determining the adjustment comprises selecting an audio stage corresponding to the determined value of the physiological parameter, and presenting the adjustment comprises presenting the selected audio stage.
18. The method of claim 1 , comprising determining a value of a physiological parameter from the physiological signal, wherein the physiological parameter comprises a respiration rate, a heart rate, an electroencephalography (EEG) measurement, a measure of muscle activity, and/or a human body temperature, and determining the adjustment to the immersive virtual environment based on the value of the physiological parameter.
19. The method of claim 1 , wherein the immersive virtual environment further comprises an audio soundtrack, and the method comprises determining a visual adjustment to adjust the display of visual elements and determining an audio adjustment to adjust the audio soundtrack.
20. The method of claim 19 comprising determining the visual adjustment independently of the determining of the audio adjustment.
21. The method of claim 1 , wherein the immersive virtual environment comprises a plurality of different sensory stimuli, and the method comprises independently adjusting each of the different sensory stimuli in response to the change in the physiological signal.
22. A biofeedback virtual reality system for promoting sleep, the biofeedback virtual reality system comprising:
a sensor to detect a physiological signal;
a mobile or wearable computing device to:
receive the physiological signal;
determine a value of a physiological parameter based on the physiological signal;
map the value of the physiological parameter to a stage of an immersive virtual environment of a plurality of stored immersive virtual environments, each of the stored immersive virtual environments comprising a succession of stages designed to promote sleep, each of the stages comprising a different arrangement of sensory stimuli; and
a virtual reality device in communication with the mobile or wearable computing device, the virtual reality device to present the stage of the immersive virtual environment;
wherein the mobile or wearable computing device is to determine a new value of the physiological parameter and map the new value of the physiological parameter to a new stage of the immersive virtual environment; and
wherein the virtual reality device is to present the new stage of the immersive virtual environment in response to the new value of the physiological parameter.
23. The system of claim 22 , wherein the mobile or wearable computing device comprises a smartphone, a tablet computer, an attachable/detachable device, a smart watch, smart glasses, a smart wristband, smart jewelry, and/or smart apparel.
24. The system of claim 22 , wherein at least two of the mobile or wearable computing device, the virtual reality device, and the sensor are embodied as a unitary device.
25. The system of claim 22 , wherein the mobile or wearable computing device receives the physiological signal through wireless communication and/or the mobile or wearable computing device communicates with the virtual reality device through wireless communication.
26. The system of claim 22 , wherein the sensor comprises a motion sensor, and wherein the mobile or wearable computing device determines a respiration rate from the output of the motion sensor.
27. The system of claim 22 , wherein the mobile or wearable computing device comprises a positioner to position the mobile or wearable computing device to detect human body motion indicating breathing.
28. The system of claim 22 , wherein the mobile or wearable computing device is to receive a plurality of different physiological signals, determine a value of each of a plurality of different physiological parameters based on the plurality of different physiological signals, and determine a stage of the immersive virtual environment based on the values of the different physiological parameters.
29. The system of claim 22 , wherein the immersive virtual environment comprises an arrangement of visual elements including an avatar that interacts with the immersive virtual environment in response to the physiological signal.
30. The system of claim 22 , comprising a gaze detector in communication with the mobile or wearable computing device, wherein the mobile or wearable computing device is to manipulate the immersive virtual environment in response to output of the gaze detector.
31. The system of claim 22 , wherein the virtual reality device comprises virtual reality eyewear and headphones.
32. The system of claim 22 , wherein the virtual reality device comprises high-definition video glasses, a non-rigid sleep mask, a television, a projector to project a display of visual elements onto a wall or ceiling, and/or one or more remote speakers.
33. A biofeedback virtual reality sleep assistant embodied in one or more computer accessible media, the biofeedback virtual reality sleep assistant comprising:
a physiological signal processor to receive one or more physiological signals from one or more sensing devices;
a physiological signal processing module to monitor one or more physiological parameters from the one or more physiological signals over time, each of the physiological parameters having a range of possible values, and to determine a value of each of the physiological parameters at a plurality of different instances in time;
a physiological parameter mapping module to map the values of the one or more physiological parameters at an instance in time to a stage of an immersive virtual environment selected from a plurality of stored immersive virtual environments, each of the immersive virtual environments comprising at least a visual display and an audio soundtrack, each of the visual display and the audio soundtrack having a plurality of successive stages designed to promote sleep; and
an immersive environment control module to present the stage of the selected immersive virtual environment by one or more virtual reality devices;
wherein the physiological signal processing module is to detect changes in the values of the one or more physiological parameters over time; and
wherein the physiological parameter mapping module is to change the stage of the selected immersive virtual environment in response to the changes in the values of the one or more physiological parameters.
34. The sleep assistant of claim 33 , wherein the physiological parameter mapping module is to map the values of the one or more physiological parameters to a stage of an immersive virtual environment by executing a continuous mapping function or by accessing a lookup table.
35. The sleep assistant of claim 33 , wherein the physiological parameter mapping module is to map the values of the one or more physiological parameters to a stage of the visual display and separately map the values of the one or more physiological parameters to a stage of the audio soundtrack.
36. The sleep assistant of claim 33 , wherein the immersive environment control module is to construct the selected immersive virtual environment in real time by adding, deleting, or changing elements of the visual display and/or the audio soundtrack based on the values of the one or more physiological parameters.
37. The sleep assistant of claim 33 , wherein the immersive environment control module is to communicate with a smart device to control an aspect of a physical environment in response to changes in the values of the one or more physiological parameters over time.
38. An article of manufacture comprising, embodied in one or more computer accessible storage media: an immersive virtual environment comprising a display of visual elements and an audio soundtrack, wherein the display and the audio soundtrack each have a plurality of stages that are coordinated with different values of at least one physiological parameter.
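Claim 1 above recites monitoring a physiological signal, detecting a change, applying biofeedback technology to determine an adjustment, and presenting the adjusted immersive virtual environment, and claim 9 adds turning off the display of visual elements once a sleep state is detected. Purely to illustrate that control flow, and not as the claimed implementation, the following is a minimal Python sketch with invented thresholds, a toy stage mapping, and stand-in callbacks for the sensor and the virtual reality device.

```python
import time
from typing import Callable

def stage_for(bpm: float) -> str:
    """Toy mapping from breathing rate to a stage name (hypothetical thresholds)."""
    if bpm >= 14.0:
        return "settling"
    if bpm >= 10.0:
        return "drifting"
    return "sleep_onset"

def biofeedback_loop(read_bpm: Callable[[], float],
                     present_stage: Callable[[str], None],
                     turn_off_display: Callable[[], None],
                     sleep_threshold_bpm: float = 8.0,
                     poll_seconds: float = 5.0) -> None:
    """Monitor the signal and, on each detected change, present the new stage of
    the immersive virtual environment; turn the display off once sleep is inferred."""
    last_stage = None
    while True:
        bpm = read_bpm()                      # monitor the physiological signal
        if bpm <= sleep_threshold_bpm:        # crude sleep-onset heuristic (claim 9)
            turn_off_display()
            return
        stage = stage_for(bpm)                # map the value to a stage
        if stage != last_stage:               # change detected -> adjust presentation
            present_stage(stage)
            last_stage = stage
        time.sleep(poll_seconds)

# Example run with canned readings standing in for a real sensor:
readings = iter([16.0, 13.0, 11.0, 9.5, 7.5])
biofeedback_loop(lambda: next(readings), print, lambda: print("display off"),
                 poll_seconds=0.0)
```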
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/254,348 US20140316192A1 (en) | 2013-04-17 | 2014-04-16 | Biofeedback Virtual Reality Sleep Assistant |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361813037P | 2013-04-17 | 2013-04-17 | |
US14/251,024 US9872968B2 (en) | 2013-04-17 | 2014-04-11 | Biofeedback virtual reality sleep assistant |
US14/254,348 US20140316192A1 (en) | 2013-04-17 | 2014-04-16 | Biofeedback Virtual Reality Sleep Assistant |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/251,024 Continuation US9872968B2 (en) | 2013-04-17 | 2014-04-11 | Biofeedback virtual reality sleep assistant |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140316192A1 true US20140316192A1 (en) | 2014-10-23 |
Family
ID=51729506
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/251,024 Active 2036-07-15 US9872968B2 (en) | 2013-04-17 | 2014-04-11 | Biofeedback virtual reality sleep assistant |
US14/254,348 Abandoned US20140316192A1 (en) | 2013-04-17 | 2014-04-16 | Biofeedback Virtual Reality Sleep Assistant |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/251,024 Active 2036-07-15 US9872968B2 (en) | 2013-04-17 | 2014-04-11 | Biofeedback virtual reality sleep assistant |
Country Status (1)
Country | Link |
---|---|
US (2) | US9872968B2 (en) |
Families Citing this family (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015107681A1 (en) | 2014-01-17 | 2015-07-23 | 任天堂株式会社 | Information processing system, information processing server, information processing program, and information providing method |
DE102014215211A1 (en) * | 2014-08-01 | 2016-02-04 | Art + Com Ag | Automatic generation of visual stimuli |
US11974847B2 (en) | 2014-08-07 | 2024-05-07 | Nintendo Co., Ltd. | Information processing system, information processing device, storage medium storing information processing program, and information processing method |
NZ730969A (en) | 2014-10-27 | 2022-05-27 | ResMed Pty Ltd | Method and apparatus for treating hyperarousal disorders |
WO2016150924A1 (en) * | 2015-03-25 | 2016-09-29 | Koninklijke Philips N.V. | Wearable device for sleep assistance |
CN105013058A (en) * | 2015-06-01 | 2015-11-04 | 李丹 | Mental training device by utilizing music regulation |
WO2017059215A1 (en) * | 2015-10-01 | 2017-04-06 | Mc10, Inc. | Method and system for interacting with a virtual environment |
US10695574B2 (en) * | 2015-10-21 | 2020-06-30 | University Of Washington | Sensory input through non-invasive brain stimulation |
US10328236B2 (en) * | 2015-11-23 | 2019-06-25 | Sana Health, Inc. | Methods and systems for providing stimuli to the brain |
CH712799A1 (en) | 2016-08-10 | 2018-02-15 | Derungs Louis | Virtual reality method and system implementing such method. |
US10447347B2 (en) | 2016-08-12 | 2019-10-15 | Mc10, Inc. | Wireless charger and high speed data off-loader |
CN106345035A (en) * | 2016-09-08 | 2017-01-25 | 丘靖 | Sleeping system based on virtual reality |
CN106581839B (en) * | 2016-11-14 | 2020-01-31 | 广东小天才科技有限公司 | Sleep auxiliary assembly of virtual reality |
US10786649B2 (en) * | 2017-01-06 | 2020-09-29 | Sri International | Immersive system for restorative health and wellness |
CN108853679B (en) * | 2017-05-10 | 2023-06-06 | 京东方科技集团股份有限公司 | Intelligent sleep assisting equipment and method, server and system thereof |
WO2019065303A1 (en) * | 2017-09-29 | 2019-04-04 | 本田技研工業株式会社 | Service provision system, service provision method, and management device for service provision system |
US11883739B2 (en) | 2017-12-13 | 2024-01-30 | OVR Tech, LLC | Replaceable liquid scent cartridge |
AU2018383640B2 (en) | 2017-12-13 | 2023-11-02 | OVR Tech, LLC | System and method for generating olfactory stimuli |
US11351450B2 (en) | 2017-12-13 | 2022-06-07 | OVR Tech, LLC | Systems and techniques for generating scent |
WO2019165271A1 (en) | 2018-02-22 | 2019-08-29 | TRIPP, Inc. | Adapting media content to a sensed state of a user |
US11508249B1 (en) | 2018-03-05 | 2022-11-22 | Intelligent Technologies International, Inc. | Secure testing using a smartphone |
CN108720818A (en) * | 2018-05-25 | 2018-11-02 | 华北理工大学 | Automatic hypnosis system based on data analysis and its application method |
FR3082024A1 (en) * | 2018-05-29 | 2019-12-06 | Orange | ADAPTATION OF MULTIMEDIA CONTENT, GUIDANCE OF A USER TOWARDS A GIVEN PHYSIOLOGICAL STATE, ADAPTER, GUIDANCE DEVICE AND TERMINAL USING THE SAME |
EP3617847A1 (en) * | 2018-08-29 | 2020-03-04 | Koninklijke Philips N.V. | Controlling the operation of one or more devices |
US11577268B2 (en) | 2018-10-18 | 2023-02-14 | OVR Tech, LLC | Device for atomizing fluid |
CN109701137A (en) * | 2018-12-28 | 2019-05-03 | 重庆电子工程职业学院 | A kind of medical amusement helmet |
WO2020146248A2 (en) * | 2019-01-07 | 2020-07-16 | Bose Corporation | Biometric detection using multiple sensors |
CN109876271B (en) * | 2019-03-06 | 2019-12-06 | 深圳市神舟电脑股份有限公司 | Wearable equipment remote control governing system |
EP3955807B1 (en) | 2019-06-12 | 2023-07-26 | Hewlett-Packard Development Company, L.P. | Extended reality adjustments based on physiological measurements |
IT201900025792A1 (en) * | 2019-12-30 | 2021-06-30 | Comftech S R L | SYSTEM AND METHOD OF MONITORING THE RESPIRATORY ACTIVITY OF A USER |
US10987484B1 (en) | 2020-05-27 | 2021-04-27 | SYNERGEN Technology Labs, LLC | Baby monitor system sound and light delivery based on vitals |
US11938275B2 (en) * | 2020-08-21 | 2024-03-26 | Stimscience Inc. | Systems, methods, and devices for custom sleep implementation |
US20220202312A1 (en) * | 2020-12-30 | 2022-06-30 | Auralab Technologies Incorporated | Respiratory Biofeedback-Based Content Selection and Playback for Guided Sessions and Device Adjustments |
FR3120975B1 (en) | 2021-03-16 | 2023-09-29 | Healthy Mind | Therapy system by immersion in a virtual environment and method of controlling such a therapy system |
US20220361759A1 (en) | 2021-05-04 | 2022-11-17 | Koa Health B.V. | Smartphone Heart Rate And Breathing Rate Determination Using Accuracy Measurement Weighting |
WO2023056568A1 (en) * | 2021-10-08 | 2023-04-13 | Interaxon Inc. | Systems and methods to induce sleep and other changes in user states |
WO2023183340A1 (en) * | 2022-03-22 | 2023-09-28 | Apple Inc. | Devices, methods, and graphical user interfaces for three-dimensional user experience sessions in an extended reality environment |
US12036040B2 (en) | 2022-08-31 | 2024-07-16 | Neuropeak Pro LLC | Computer-implemented training programs, such as for improving user performance |
CN116110539B (en) * | 2023-02-14 | 2024-03-26 | 苏州睿酷医疗科技有限责任公司 | Pain relief virtual reality system and method |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5807114A (en) * | 1996-03-27 | 1998-09-15 | Emory University And Georgia Tech Research Corporation | System for treating patients with anxiety disorders |
US7128577B2 (en) * | 2003-02-26 | 2006-10-31 | Patrice Renaud | Method for providing data to be used by a therapist for analyzing a patient behavior in a virtual environment |
US20110213197A1 (en) * | 2010-02-26 | 2011-09-01 | Robertson Bruce D | Computer augmented therapy |
2014
- 2014-04-11 US US14/251,024 patent/US9872968B2/en active Active
- 2014-04-16 US US14/254,348 patent/US20140316192A1/en not_active Abandoned
Cited By (131)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9654723B2 (en) * | 2006-04-05 | 2017-05-16 | Sony Corporation | Recording apparatus, reproducing apparatus, recording and reproducing apparatus, recording method, reproducing method, recording and reproducing method, and record medium |
US20150104150A1 (en) * | 2006-04-05 | 2015-04-16 | Sony Corporation | Recording apparatus, reproducing apparatus, recording and reproducing apparatus, recording method, reproducing method, recording and reproducing method, and record medium |
US10220259B2 (en) | 2012-01-05 | 2019-03-05 | Icon Health & Fitness, Inc. | System and method for controlling an exercise device |
US10279212B2 (en) | 2013-03-14 | 2019-05-07 | Icon Health & Fitness, Inc. | Strength training apparatus with flywheel and related methods |
US10188890B2 (en) | 2013-12-26 | 2019-01-29 | Icon Health & Fitness, Inc. | Magnetic resistance mechanism in a cable machine |
US10433612B2 (en) | 2014-03-10 | 2019-10-08 | Icon Health & Fitness, Inc. | Pressure sensor to quantify work |
US20150304652A1 (en) * | 2014-04-17 | 2015-10-22 | Nokia Technologies Oy | Device orientation correction method for panorama images |
US10426989B2 (en) | 2014-06-09 | 2019-10-01 | Icon Health & Fitness, Inc. | Cable system incorporated into a treadmill |
US10226396B2 (en) | 2014-06-20 | 2019-03-12 | Icon Health & Fitness, Inc. | Post workout massage device |
US9314583B2 (en) * | 2014-06-24 | 2016-04-19 | Yazmonit Ltd. | System and method for inducing sleep |
US20160004313A1 (en) * | 2014-07-03 | 2016-01-07 | Gwangju Institute Of Science And Technology | Haptic system, method for controlling the same, and game system |
US9990175B2 (en) * | 2014-08-21 | 2018-06-05 | Zhejiang Shenghui Lighting Co., Ltd | Lighting device and voice broadcasting system and method thereof |
US20160224315A1 (en) * | 2014-08-21 | 2016-08-04 | Zhejiang Shenghui Lighting Co., Ltd. | Lighting device and voice broadcasting system and method thereof |
US20160089028A1 (en) * | 2014-09-25 | 2016-03-31 | Harman International Industries, Inc. | Media player automated control based on detected physiological parameters of a user |
US20160091877A1 (en) * | 2014-09-29 | 2016-03-31 | Scott Fullam | Environmental control via wearable computing system |
US10345768B2 (en) * | 2014-09-29 | 2019-07-09 | Microsoft Technology Licensing, Llc | Environmental control via wearable computing system |
US10537704B2 (en) * | 2014-12-03 | 2020-01-21 | Koninklijke Philips N.V. | System and method for increasing the restorative value of a nap |
US20180311462A1 (en) * | 2014-12-03 | 2018-11-01 | Koninklijke Philips N.V. | System and method for increasing the restorative value of a nap |
US10796491B2 (en) | 2015-01-23 | 2020-10-06 | YouMap, Inc. | Virtual work of expression within a virtual environment |
US11651575B2 (en) | 2015-01-23 | 2023-05-16 | You Map Inc. | Virtual work of expression within a virtual environment |
US20180082478A1 (en) * | 2015-01-23 | 2018-03-22 | Stephen Constantinides | Virtual Work of Expression within a Virtual Environment |
US11302084B2 (en) | 2015-01-23 | 2022-04-12 | Stephen Constantinides | Virtual work of expression within a virtual environment |
US10210665B2 (en) * | 2015-01-23 | 2019-02-19 | Stephen Constantinides | Virtual work of expression within a virtual environment |
US20160228640A1 (en) * | 2015-02-05 | 2016-08-11 | Mc10, Inc. | Method and system for interacting with an environment |
US10055887B1 (en) * | 2015-02-19 | 2018-08-21 | Google Llc | Virtual/augmented reality transition system and method |
US10986465B2 (en) | 2015-02-20 | 2021-04-20 | Medidata Solutions, Inc. | Automated detection and configuration of wearable devices based on on-body status, location, and/or orientation |
US10391361B2 (en) | 2015-02-27 | 2019-08-27 | Icon Health & Fitness, Inc. | Simulating real-world terrain on an exercise device |
US20160350609A1 (en) * | 2015-05-26 | 2016-12-01 | Nbcuniversal Media, Llc | System and method for customizing content for a user |
US10083363B2 (en) * | 2015-05-26 | 2018-09-25 | Nbcuniversal Media, Llc | System and method for customizing content for a user |
CN107106047A (en) * | 2015-05-27 | 2017-08-29 | 默林数字综合贸易有限责任公司 | Biofeedback virtual reality system and method |
WO2016189370A1 (en) * | 2015-05-27 | 2016-12-01 | Merlin Digital General Trading Llc | Biofeedback virtual reality system and method |
US10417926B2 (en) | 2015-05-27 | 2019-09-17 | Merlin Digital General Trading Llc | Biofeedback virtual reality system and method |
US11991602B2 (en) | 2015-06-22 | 2024-05-21 | You Map Inc. | System and method for location-based content delivery and visualization |
US11138217B2 (en) | 2015-06-22 | 2021-10-05 | YouMap, Inc. | System and method for aggregation and graduated visualization of user generated social post on a social mapping network |
US11589193B2 (en) | 2015-06-22 | 2023-02-21 | You Map Inc. | Creating and utilizing services associated with maps |
US11356817B2 (en) | 2015-06-22 | 2022-06-07 | YouMap, Inc. | System and method for location-based content delivery and visualization |
US11696097B2 (en) | 2015-06-22 | 2023-07-04 | You Map Inc. | System and method for location-based content delivery and visualization |
US11704329B2 (en) | 2015-06-22 | 2023-07-18 | You Map Inc. | System and method for aggregation and graduated visualization of user generated social post on a social mapping network |
US11436619B2 (en) | 2015-06-22 | 2022-09-06 | You Map Inc. | Real time geo-social visualization platform |
US11265687B2 (en) | 2015-06-22 | 2022-03-01 | YouMap, Inc. | Creating and utilizing map channels |
WO2017011830A1 (en) * | 2015-07-16 | 2017-01-19 | Zansors Llc | Cognitive behavioral therapy (cbt) method, system and application |
US10835707B2 (en) * | 2015-07-31 | 2020-11-17 | Universitat De Barcelona | Physiological response |
US20180154106A1 (en) * | 2015-07-31 | 2018-06-07 | Universitat De Barcelona | Physiological Response |
US20170055897A1 (en) * | 2015-08-26 | 2017-03-02 | Mary-Porter Scott BROCKWAY | Biofeedback chamber for facilitating artistic expression |
EP3359031A4 (en) * | 2015-10-05 | 2019-05-22 | Mc10, Inc. | Method and system for neuromodulation and stimulation |
US10532211B2 (en) | 2015-10-05 | 2020-01-14 | Mc10, Inc. | Method and system for neuromodulation and stimulation |
US20170103575A1 (en) * | 2015-10-07 | 2017-04-13 | Ricoh Company, Ltd. | Information processing device, information processing method, and computer program product |
US9905053B2 (en) * | 2015-10-07 | 2018-02-27 | Ricoh Company, Ltd. | Information processing device, information processing method, and computer program product |
US10905846B2 (en) * | 2016-02-08 | 2021-02-02 | Cornelia Weber | Phototherapy sleep mask |
US20170224951A1 (en) * | 2016-02-08 | 2017-08-10 | Cornelia Weber | Phototherapy sleep mask |
US10567152B2 (en) | 2016-02-22 | 2020-02-18 | Mc10, Inc. | System, devices, and method for on-body data and power transmission |
US10272317B2 (en) | 2016-03-18 | 2019-04-30 | Icon Health & Fitness, Inc. | Lighted pace feature in a treadmill |
US10625137B2 (en) | 2016-03-18 | 2020-04-21 | Icon Health & Fitness, Inc. | Coordinated displays in an exercise device |
US10493349B2 (en) | 2016-03-18 | 2019-12-03 | Icon Health & Fitness, Inc. | Display on exercise device |
US11992326B2 (en) | 2016-04-19 | 2024-05-28 | Medidata Solutions, Inc. | Method and system for measuring perspiration |
US20170371411A1 (en) * | 2016-06-28 | 2017-12-28 | Brillio LLC | Method and system for adapting content on hmd based on behavioral parameters of user |
US10210843B2 (en) | 2016-06-28 | 2019-02-19 | Brillio LLC | Method and system for adapting content on HMD based on behavioral parameters of user |
CN109844735A (en) * | 2016-07-21 | 2019-06-04 | 奇跃公司 | Affective state for using user controls the technology that virtual image generates system |
US10671705B2 (en) | 2016-09-28 | 2020-06-02 | Icon Health & Fitness, Inc. | Customizing recipe recommendations |
CN109644297A (en) * | 2016-09-29 | 2019-04-16 | 英特尔公司 | The method and apparatus that mark may induce the virtual reality content of morbidity |
US10955917B2 (en) | 2016-09-29 | 2021-03-23 | Intel Corporation | Methods and apparatus for identifying potentially seizure-inducing virtual reality content |
US10067565B2 (en) | 2016-09-29 | 2018-09-04 | Intel Corporation | Methods and apparatus for identifying potentially seizure-inducing virtual reality content |
WO2018063521A1 (en) * | 2016-09-29 | 2018-04-05 | Intel Corporation | Methods and apparatus for identifying potentially seizure-inducing virtual reality content |
US10843006B2 (en) | 2016-11-17 | 2020-11-24 | Cognito Therapeutics, Inc. | Methods and systems for neural stimulation via auditory stimulation |
US10702705B2 (en) * | 2016-11-17 | 2020-07-07 | Cognito Therapeutics, Inc. | Methods and systems for neural stimulation via visual, auditory and peripheral nerve stimulations |
US11141604B2 (en) | 2016-11-17 | 2021-10-12 | Cognito Therapeutics, Inc. | Methods and systems for neural stimulation via visual stimulation |
US20190314641A1 (en) * | 2016-11-17 | 2019-10-17 | Cognito Therapeutics, Inc. | Methods and systems for neural stimulation via visual, auditory and peripheral nerve stimulations |
US10806399B2 (en) * | 2016-11-22 | 2020-10-20 | General Electric Company | Method and system of measuring patient position |
US20180140229A1 (en) * | 2016-11-22 | 2018-05-24 | General Electric Company | Method and System of Measuring Patient Position |
US11086391B2 (en) * | 2016-11-30 | 2021-08-10 | At&T Intellectual Property I, L.P. | Methods, and devices for generating a user experience based on the stored user information |
US20180150130A1 (en) * | 2016-11-30 | 2018-05-31 | At&T Intellectual Property I, L.P. | Methods, and devices for generating a user experience based on the stored user information |
US11449136B2 (en) | 2016-11-30 | 2022-09-20 | At&T Intellectual Property I, L.P. | Methods, and devices for generating a user experience based on the stored user information |
US10888271B2 (en) | 2016-12-08 | 2021-01-12 | Louise M. Falevsky | Systems, apparatus and methods for using biofeedback to facilitate a discussion |
US9953650B1 (en) * | 2016-12-08 | 2018-04-24 | Louise M Falevsky | Systems, apparatus and methods for using biofeedback for altering speech |
US10171858B2 (en) * | 2017-03-02 | 2019-01-01 | Adobe Systems Incorporated | Utilizing biometric data to enhance virtual reality content and user response |
US10169973B2 (en) * | 2017-03-08 | 2019-01-01 | International Business Machines Corporation | Discontinuing display of virtual content and providing alerts based on hazardous physical obstructions |
US10928887B2 (en) | 2017-03-08 | 2021-02-23 | International Business Machines Corporation | Discontinuing display of virtual content and providing alerts based on hazardous physical obstructions |
US20200004321A1 (en) * | 2017-03-21 | 2020-01-02 | Sony Corporation | Information processing device, information processing method, and program |
US10877555B2 (en) * | 2017-03-21 | 2020-12-29 | Sony Corporation | Information processing device and information processing method for controlling user immersion degree in a virtual reality environment |
US10154360B2 (en) * | 2017-05-08 | 2018-12-11 | Microsoft Technology Licensing, Llc | Method and system of improving detection of environmental sounds in an immersive environment |
US11554245B2 (en) * | 2017-05-26 | 2023-01-17 | Brown University | Lighting system for circadian control and enhanced performance |
US10691945B2 (en) | 2017-07-14 | 2020-06-23 | International Business Machines Corporation | Altering virtual content based on the presence of hazardous physical obstructions |
US10737054B1 (en) * | 2017-08-30 | 2020-08-11 | Gail Lynn | Sound and light chamber |
US11723579B2 (en) | 2017-09-19 | 2023-08-15 | Neuroenhancement Lab, LLC | Method and apparatus for neuroenhancement |
US10616727B2 (en) | 2017-10-18 | 2020-04-07 | YouMap, Inc. | System and method for location-based content delivery and visualization |
US11455027B2 (en) * | 2017-11-13 | 2022-09-27 | Vr Coaster Gmbh & Co. Kg | Apparatus for experiencing a virtual reality simulation in an underwater world |
CN107823775A (en) * | 2017-11-28 | 2018-03-23 | 深圳和而泰智能控制股份有限公司 | A kind of sleeping method and sleeping aid pillow |
US11717686B2 (en) | 2017-12-04 | 2023-08-08 | Neuroenhancement Lab, LLC | Method and apparatus for neuroenhancement to facilitate learning and performance |
US20190180637A1 (en) * | 2017-12-08 | 2019-06-13 | The Regents Of The University Of Colorado, A Body Corporate | Virtually Resilient Simulator |
US11460696B2 (en) | 2017-12-13 | 2022-10-04 | Sony Interactive Entertainment Inc. | Head-mountable apparatus and methods |
WO2019115994A1 (en) * | 2017-12-13 | 2019-06-20 | Sony Interactive Entertainment Inc. | Head-mountable apparatus and methods |
CN108030498A (en) * | 2017-12-13 | 2018-05-15 | 上海青研科技有限公司 | A kind of Psychological Intervention System based on eye movement data |
US11123009B2 (en) * | 2017-12-21 | 2021-09-21 | Koninklijke Philips N.V. | Sleep stage prediction and intervention preparation based thereon |
US11273283B2 (en) | 2017-12-31 | 2022-03-15 | Neuroenhancement Lab, LLC | Method and apparatus for neuroenhancement to enhance emotional response |
US11318277B2 (en) | 2017-12-31 | 2022-05-03 | Neuroenhancement Lab, LLC | Method and apparatus for neuroenhancement to enhance emotional response |
US11478603B2 (en) | 2017-12-31 | 2022-10-25 | Neuroenhancement Lab, LLC | Method and apparatus for neuroenhancement to enhance emotional response |
US11141556B2 (en) | 2018-01-24 | 2021-10-12 | Nokia Technologies Oy | Apparatus and associated methods for adjusting a group of users' sleep |
CN110321772A (en) * | 2018-03-30 | 2019-10-11 | Cae有限公司 | Customized visual rendering of dynamically influencing visual elements |
US11364361B2 (en) | 2018-04-20 | 2022-06-21 | Neuroenhancement Lab, LLC | System and method for inducing sleep by transplanting mental states |
US20190385367A1 (en) * | 2018-06-14 | 2019-12-19 | Robert Labron | Virtual reality software system and method |
US10885709B2 (en) * | 2018-06-14 | 2021-01-05 | Robert Labron | Virtual reality software system and method for treating medical condition in user |
US11308922B2 (en) * | 2018-07-03 | 2022-04-19 | Telefonaktiebolaget Lm Ericsson (Publ) | Portable electronic device for mixed reality headset |
WO2020044124A1 (en) * | 2018-08-28 | 2020-03-05 | Xr Health Il Ltd | Relieving chronic symptoms through treatments in a virtual environment |
US11241556B2 (en) * | 2018-08-29 | 2022-02-08 | De'Longhi Appliances S.R.L. Con Unico Socio | Method to activate and control a conditioning apparatus |
US10327073B1 (en) * | 2018-08-31 | 2019-06-18 | Bose Corporation | Externalized audio modulated by respiration rate |
US11452839B2 (en) | 2018-09-14 | 2022-09-27 | Neuroenhancement Lab, LLC | System and method of improving sleep |
US11097078B2 (en) * | 2018-09-26 | 2021-08-24 | Cary Kochman | Method and system for facilitating the transition between a conscious and unconscious state |
DE102018130718A1 (en) * | 2018-12-03 | 2020-06-04 | Sympatient GmbH | Device for carrying out serious games for the prevention and / or treatment of mental disorders |
CN109646784A (en) * | 2018-12-21 | 2019-04-19 | 华东计算技术研究所(中国电子科技集团公司第三十二研究所) | Immersive VR-based insomnia disorder psychotherapy system and method |
US11884155B2 (en) * | 2019-04-25 | 2024-01-30 | Motional Ad Llc | Graphical user interface for display of autonomous vehicle behaviors |
US11786694B2 (en) | 2019-05-24 | 2023-10-17 | NeuroLight, Inc. | Device, method, and app for facilitating sleep |
US11191448B2 (en) * | 2019-06-07 | 2021-12-07 | Bose Corporation | Dynamic starting rate for guided breathing |
CN110269993A (en) * | 2019-07-15 | 2019-09-24 | 上海市嘉定区中心医院 | A kind of application method of sleep guidance device, system and system |
US11782508B2 (en) * | 2019-09-27 | 2023-10-10 | Apple Inc. | Creation of optimal working, learning, and resting environments on electronic devices |
US12093457B2 (en) * | 2019-09-27 | 2024-09-17 | Apple Inc. | Creation of optimal working, learning, and resting environments on electronic devices |
US20230418378A1 (en) * | 2019-09-27 | 2023-12-28 | Apple Inc. | Creation of optimal working, learning, and resting environments on electronic devices |
CN111467642A (en) * | 2020-03-06 | 2020-07-31 | 深圳市真元保玖科技有限公司 | Intelligent pillow, control method and device thereof, control equipment and storage medium |
US20230053767A1 (en) * | 2020-03-20 | 2023-02-23 | Sony Group Corporation | System, game console and method for adjusting a virtual environment |
WO2021185600A1 (en) * | 2020-03-20 | 2021-09-23 | Sony Group Corporation | System, game console and method for adjusting a virtual environment |
US11947722B2 (en) * | 2020-03-24 | 2024-04-02 | Arm Limited | Devices and headsets |
US20210303070A1 (en) * | 2020-03-24 | 2021-09-30 | Arm Limited | Devices and headsets |
US20210345905A1 (en) * | 2020-05-05 | 2021-11-11 | Active Insight Corporation | Physiological sensing system |
US11714483B2 (en) | 2020-09-15 | 2023-08-01 | Ballast Technologies, Inc. | Systems, methods, and devices for providing virtual-reality or mixed-reality experiences with special effects to a user in or under water |
CN114390263A (en) * | 2020-10-20 | 2022-04-22 | 中强光电股份有限公司 | Projection system and projection method |
US11771374B2 (en) * | 2020-11-30 | 2023-10-03 | Ceyeber Corp. | Cranial implant |
US20220167923A1 (en) * | 2020-11-30 | 2022-06-02 | Strathspey Crown, LLC | Cranial implant |
CN112657034A (en) * | 2020-12-09 | 2021-04-16 | 广州医科大学附属肿瘤医院 | Intelligent training method for repairing smell |
CN112472050A (en) * | 2020-12-10 | 2021-03-12 | 郑子龙 | Bionic method, electronic equipment and computer readable storage medium |
US12123654B2 (en) | 2022-11-28 | 2024-10-22 | Fractal Heatsink Technologies LLC | System and method for maintaining efficiency of a fractal heat sink |
CN115957419A (en) * | 2023-02-15 | 2023-04-14 | 中国人民解放军军事科学院军事医学研究院 | Information processing method, virtual reality system and device about psychological relaxation |
CN116110535A (en) * | 2023-04-13 | 2023-05-12 | 北京康爱医疗科技股份有限公司 | Breathing biofeedback method, feedback device and storage medium based on virtual reality |
Also Published As
Publication number | Publication date |
---|---|
US20140316191A1 (en) | 2014-10-23 |
US9872968B2 (en) | 2018-01-23 |
Similar Documents
Publication | Title |
---|---|
US9872968B2 (en) | Biofeedback virtual reality sleep assistant |
US11000669B2 (en) | Method of virtual reality system and implementing such method |
US11224717B2 (en) | Method and apparatus for virtual-reality-based mindfulness therapy |
US20200289321A1 (en) | Circadian Rhythm Adjustment System |
US11344249B2 (en) | Device for neurovascular stimulation |
JP3217017U (en) | Wearable physiological examination equipment |
EP3053354B1 (en) | Functional headwear |
US9779751B2 (en) | Respiratory biofeedback devices, systems, and methods |
KR20200031648A (en) | Virtual reality device |
JP2024512835A (en) | System and method for promoting sleep stages in a user |
JP5123138B2 (en) | Refresh guidance system and refresh guidance method |
US11660419B2 (en) | Systems, devices, and methods for generating and manipulating objects in a virtual reality or multi-sensory environment to maintain a positive state of a user |
US10835707B2 (en) | Physiological response |
WO2016119665A1 (en) | Wearable physiological detection device |
US11527318B2 (en) | Method for delivering a digital therapy responsive to a user's physiological state at a sensory immersion vessel |
US11315675B2 (en) | System and method for entrainment of a user based on bio-rhythm of the user |
US20200219468A1 (en) | Head mounted displaying system and image generating method thereof |
WO2018100879A1 (en) | Output control device, output control method, and program |
CN204839505U (en) | Wearing formula physiology detection device |
KR20200091261A (en) | Electronic apparatus and method of inducing sleep |
Morales et al. | An adaptive model to support biofeedback in AmI environments: a case study in breathing training for autism |
US20230296895A1 (en) | Methods, apparatus, and articles to enhance brain function via presentation of visual effects in far and/or ultra-far peripheral field |
US20230025019A1 (en) | Virtual reality and augmented reality headsets for meditation applications |
GB2567506A (en) | Training aid |
Calcerano et al. | Neurofeedback in Virtual Reality Naturalistic Scenarios for Enhancing Relaxation: Visual and Auditory Stimulation to Promote Brain Entrainment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- INCOMPLETE APPLICATION (PRE-EXAMINATION) |