GB2567506A - Training aid - Google Patents

Training aid Download PDF

Info

Publication number
GB2567506A
GB2567506A GB1805428.8A GB201805428A
Authority
GB
United Kingdom
Prior art keywords
user
output
activity level
input
stimulative
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1805428.8A
Other versions
GB201805428D0 (en)
GB2567506B (en)
Inventor
El-Imad Jamil
Hormigo Cebolla Jesus
Dawy Zaher
Abbas Nabil
Lajtai Levente
Alawieh Hussein
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Neuropro Ltd
Original Assignee
Neuropro Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Neuropro Ltd filed Critical Neuropro Ltd
Publication of GB201805428D0
Publication of GB2567506A
Application granted
Publication of GB2567506B
Legal status: Active
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/021 Measuring pressure in heart or blood vessels
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1118 Determining activity level
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165 Evaluating the state of mind, e.g. depression, anxiety
    • A61B5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316 Modalities, i.e. specific diagnostic methods
    • A61B5/369 Electroencephalography [EEG]
    • A61B5/375 Electroencephalography [EEG] using biofeedback
    • A61B5/377 Electroencephalography [EEG] using evoked responses
    • A61B5/378 Visual stimuli
    • A61B5/38 Acoustic or auditory stimuli
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/742 Details of notification to user or communication with user or patient; user input means using visual displays

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Psychiatry (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pathology (AREA)
  • Public Health (AREA)
  • Psychology (AREA)
  • Acoustics & Sound (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Social Psychology (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

A system for providing stimulative feedback to a user dependent on their state or activity level, comprising: an input configured to receive neurological, cardiographical and/or respiratory data input from a user; and a processor configured to: monitor the input data and analyse the data to characterise the state or activity level of the user; and provide a stimulative output to the user in response to the input signal, the output depending on the characterisation. The system is especially for use in improving mindfulness, relaxation and concentration, or for meditation. In embodiments, virtual reality (VR) is used as the stimulative output along with electroencephalography (EEG), electrocorticography (ECoG), electrocardiography (ECG) and/or respiratory measurements to provide guided meditation. In an embodiment the stimulative output is audio and/or visual and the clarity, quality and/or volume is adjusted based on measurements of the user's concentration or relaxation levels.

Description

Title: Training aid
Field
THIS INVENTION relates to a training aid, more specifically to a system and method for ‘brain training’, particularly to aid relaxation and/or concentration.
Background
Mental health awareness is increasing and the understanding that good mental health is more than just the absence of a mental health problem is becoming more widespread. Paying more attention to the present moment, to your thoughts, feelings and environment, can improve mental wellbeing. This 'mindfulness' helps us to better understand and engage with ourselves and the world around us.
Mindfulness enables us to notice signs of stress and anxiety more readily and deal with them more appropriately. Some people can enhance their mindfulness by practicing relaxation techniques, but there is a need for a mindfulness aid to help ‘train’ the mind to improve mindfulness.
The present invention aims to provide a method and device for improving mindfulness, particularly for enhancing relaxation and/or concentration, and which can be used for meditation. Meditation can be characterised by distinctive physiological features based on EEG and ECG patterns, low respiratory rates, stable blood pressure, etc. Monitoring these physiological features can facilitate dynamic adaptation using virtual reality (VR) based platforms and thus helps enhance the quality of meditation and optimise the user's experience.
In some embodiments, the present invention provides an EEG headset for collecting a user's brain activity. An electroencephalogram (EEG) is a recording of the electrical activity of the brain produced by the firing of neurons within the brain. Multiple electrodes are typically placed around the scalp, but electrodes can also be placed in direct contact with the brain or within the brain. The EEG signal is composed of different wave patterns spanning a spectrum from below 4 Hz to over 100 Hz.
There are standardised positions for electrodes based on the international 10-20 system, so this aspect is not discussed further here. Some further background to the invention is described in WO2009109784, GB2516275, WO2012025765, WO2013160706, WO2014006143, WO2015079264,
US9576330 and US2017182421, all of which are incorporated herein by reference and patent protection may be sought for features disclosed in these publications in combination with the present disclosure.
Brief summary of the invention
The present invention provides a system for providing stimulative feedback to a user dependent on their state or activity level, comprising: an input configured to receive neurological, cardiographical and/or respiratory data input from a user; and a processor configured to: monitor the input data and analyse the data to characterise the state or activity level of the user; and provide a stimulative output to the user in response to the input signal, the output depending on the characterisation.
The user’s state or activity level broadly includes their overall physiological state including vital signs (body temperature, pulse, respiratory rate, blood pressure), level of consciousness (which may be measured by EEG brain activity) and level of movement.
The stimulative output is an output to encourage development or increased activity/engagement with/by the user, thereby being a positive feedback mechanism encouraging or otherwise affecting relaxation and/or concentration. In other words, the system can be considered as a ‘brain training’ system.
The present invention further provides a system and method as claimed.
Brief description of the figures
In order that the present invention may be more readily understood, preferable embodiments thereof will now be described, by way of example only, with reference to the accompanying drawings, in which:
FIGURE 1 is a schematic diagram illustrating the input and output elements to a first embodiment of the system;
FIGURE 2 is a more detailed schematic diagram illustrating the input and output elements to a further embodiment of the system;
FIGURES 3 to 16 are slides/schematics explaining the functionality of one example of a system embodying the invention;
FIGURES 17 to 25 are screenshots from a video showing EEG data including relative powers of frequency bands;
FIGURES 26 to 29 are screenshots from an embodying system of the invention showing the user and a VR environment experience; and
FIGURES 30 to 38 are screenshots from another embodying system of the invention showing the user and a VR environment experience.
Figures 1 and 2 illustrate a first embodiment of the invention and present a general architecture for VR-based meditation with neurofeedback based on real-time monitoring of physiological parameters captured via wearable sensors. The basic system 100 depicted in figure 1 shows a headset 110 comprising EEG sensors 111 providing EEG signal input data to the processor 120, with two possible outputs 130 - i) a television display 131 and ii) a head-mounted display 132 (HMD).
Figure 2 illustrates an enhanced system 100 comprising multiple input sensors (EEG sensors 111, ECG sensor 112, respiratory sensor 113 and mobility sensor 114, in this example comprising accelerometers), which feed into a controller comprising the processor 120 for analysis to determine the user’s alertness, i.e. relaxation and/or concentration states. The system then outputs a rendered scene dependent on the user’s level of relaxation and/or concentration to stimulate the user and encourage further relaxation and/or concentration.
Figure 2 includes the following key components:
Current advances in bio-sensing technologies allow mobile and efficient monitoring of human physiological parameters. In the presented architecture, the following are key data sources for monitoring, tracking, and optimizing meditation (relaxation/concentration) activity over time:
EEG: electroencephalography tracks and records the electrical activity of the brain. Scalp EEG uses non-invasive sensing electrodes placed over different lobes of the brain. Specific to meditation, which involves conscious focus and thinking, the frontal lobe is of great significance as it is responsible for behaviour, learning, personality, and voluntary actions. Preferably, two EEG channels FP1 and FP2 are used with a reference electrode to measure the electrical activity of the frontal lobe. The variation of EEG patterns before and during meditation provides an indication of the change in the subject's focus/concentration level. There are other mechanisms for detecting and recording neuronal activity, such as an electrocorticogram (ECoG), where the signal is derived directly from the cerebral cortex, or functional magnetic resonance imaging (fMRI).
ECG: electrocardiography monitors the electrical activity of the heart. This indicates the heart rate and the heart rate variability (HRV) - the variation of the periods between consecutive heartbeats. Relaxation leads to a reduced heart rate (at or close to the user's resting heart rate) and increased HRV.
Breathing: respiration rate is one of the clearest indications of relaxation. Chest bands with piezo-resistive fabric sensors can be used to monitor the number of breathing cycles per minute, which can be directly correlated with the user's level of relaxation.
Activity: Accelerometers are useful in keeping track of motion and activity levels. A higher level of focus during meditation can be associated with lower activity over relatively short window intervals.
Upon acquiring the physiological signals, signal processing techniques are employed to extract features that help reduce the data into a representative set of informative markers indicating the level of alertness, which may comprise measures of concentration and/or relaxation.
Concentration level estimation: EEG signals are especially useful in estimating the level of concentration. A technique for analysing EEG signals is to decompose them into their standard frequency bands: Delta (1-3 Hz), Theta (4-7 Hz), Alpha (8-12 Hz), Beta (13-30 Hz), and Gamma (30+ Hz). Among these bands, which are not strictly defined and may vary slightly and/or overlap, e.g. Delta (1-4 Hz), Theta (4-8 Hz), Alpha (8-12 Hz), Beta (12-30 Hz) and Gamma (30+ Hz), the Beta band is most relevant to the level of concentration. The high-frequency, low-amplitude Beta waves are commonly observed while the subject is awake and involved in conscious focus and thinking, such as when meditating. On the other hand, increased Delta waves indicate an inability to think or a difficulty in maintaining conscious awareness. A useful feature for EEG signals is the ratio of the relative power of the Beta band to that of the Delta band or to the Delta + Theta bands, i.e.:
feature ratio = Pavg(Beta Band) / Pavg(Delta + Theta Bands)
Such a ratio is expected to be high during meditation, due to high Beta activity and low Delta activity, and low while not meditating.
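As an illustration only (not taken from the patent text), a minimal sketch of how this power ratio might be computed from a windowed EEG signal is shown below, using Welch's method; the sampling rate and window length are assumed values, and the band edges follow the typical 13-30 Hz and 1-8 Hz ranges mentioned above.

```python
import numpy as np
from scipy.signal import welch

def band_power(freqs, psd, low, high):
    """Integrate the power spectral density over [low, high) Hz."""
    mask = (freqs >= low) & (freqs < high)
    return np.trapz(psd[mask], freqs[mask])

def feature_ratio(eeg_window, fs=256):
    """Ratio of average Beta power to Delta+Theta power for one EEG channel."""
    freqs, psd = welch(eeg_window, fs=fs, nperseg=fs * 2)
    beta = band_power(freqs, psd, 13.0, 30.0)
    delta_theta = band_power(freqs, psd, 1.0, 8.0)
    return beta / max(delta_theta, 1e-12)  # guard against division by zero
```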
Relaxation level estimation: ECG, respiration rate, and mobility are directly linked to the level of relaxation. ECG features such as the heart rate and HRV give a clear indication of whether a subject is mentally or physically relaxed. A reduced heart rate and increased HRV are linked to a relaxed mode. Similarly, a reduced breathing rate and a lower activity level give a similar indication. In some embodiments, the system may apply a threshold to each of the ECG, respiration rate and mobility outputs (as applicable, where used) and then update the 'relaxation level' by a "value" (e.g., V), given a certain threshold T set a priori based on testing and fine tuning. As an example (a sketch follows this list):
- Feature > threshold: V++
- Feature < threshold: V--
- Take history into account (e.g., above threshold for 5 sec: V++)
- Take the current gauge level into account (e.g. for V > 0, gauge = currentGaugeValue + V*exp(-currentGaugeValue/80)) - this makes it harder to achieve higher levels when above 80
- Don't consider unreasonable feature values (above 3*threshold)
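A minimal sketch of these update rules, offered only as an illustration: the 0-100 gauge, the treatment of the 5-second persistence bonus as five consecutive above-threshold windows, and the soft cap around 80 mirror the examples above, but the exact implementation is an assumption.

```python
import math

def update_gauge(feature, threshold, gauge, history, step=1.0):
    """One illustrative update of a relaxation gauge following the listed rules.

    'history' is a list of recent above-threshold flags; constants mirror the
    examples in the description (persistence bonus, soft cap around 80).
    """
    if feature > 3 * threshold:            # discard unreasonable feature values
        return gauge, history
    v = step if feature > threshold else -step
    history.append(feature > threshold)
    if len(history) >= 5 and all(history[-5:]):   # sustained above threshold
        v += step
    if v > 0:
        # harder to climb once the gauge is already high (soft cap around 80)
        gauge = gauge + v * math.exp(-gauge / 80.0)
    else:
        gauge = gauge + v
    return max(gauge, 0.0), history
```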
Aggregating data from multiple sensors and data sources allows for higher reliability estimation and reduces inaccuracies due to various artifacts that impact the signal quality from each of the sensors.
Level mapping: The extracted features are then mapped dynamically to meters that indicate the levels of concentration and relaxation. This mapping is done in two stages:
- Calibration stage: An interval of sensory recording (e.g. 0.5-5 minutes, preferably 1.5 minutes) is used as a baseline to adapt the approach to inter-subject variations. During this interval, the algorithms learn the features of the non-meditation phase to adapt subsequent analysis accordingly.
- Thresholding stage: Based on the calibration stage, thresholds are defined for each of the features to detect significant variations that correspond to different mental states (particularly meditating vs non-meditating). These thresholds are fixed based on statistical significance (e.g. mean plus one standard deviation) for each of the features in the calibration phase. During the meditation session, feature values that exceed their corresponding threshold increment the relevant concentration and relaxation meters.
The estimated concentration and relaxation levels are fed back over a sliding window of time (e.g., every 5 seconds) into a central controller unit. In this embodiment, the controller dynamically adapts scenic visual feedback through the headset 132, which may be a VR headset, to optimize the meditation session experience. At the end of the session, the controller may provide a final score on the overall quality of experience.
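Purely as an illustration (not from the patent), the sliding-window feedback loop described above might be structured as in the following sketch; read_sensors, estimate_levels and render_scene are hypothetical placeholders for the sensor acquisition, feature analysis and rendering components, and the session length is an assumed value.

```python
def feedback_loop(read_sensors, estimate_levels, render_scene,
                  session_s=600.0, window_s=5.0):
    """Every 'window_s' seconds, estimate concentration/relaxation from the
    latest sensor window and adapt the rendered scene; all three callables
    are hypothetical placeholders."""
    history = []
    for _ in range(int(session_s / window_s)):
        samples = read_sensors(duration=window_s)          # wearable sensor data
        concentration, relaxation = estimate_levels(samples)
        history.append((concentration, relaxation))
        render_scene(concentration=concentration, relaxation=relaxation)
    # final score at the end of the session, e.g. average levels
    n = max(len(history), 1)
    return (sum(c for c, _ in history) / n,
            sum(r for _, r in history) / n)
```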
Detailed description of invention
In one basic embodiment, the invention provides a concentration training device 100 which varies the type and/or characteristics of the output 130 such as the clarity and/or quality of displayed imagery and/or the volume of audio, depending on the user’s concentration.
In one example, the system is configured for maximising the clarity (sharpness) and/or quality (e.g. resolution or definition) of displayed imagery when the concentration level of the user is high; and conversely decreasing the clarity and/or quality of displayed imagery and/or volume of audio as the user loses concentration. The imagery is preferably imagery that the user wishes to see in full clarity and quality, e.g. a sports match or movie, so that the user is rewarded for concentrating. In one variant, the system varies the resolution of the output, e.g. from non-HD (i.e. <720p, e.g. 480p) to HD (720p, full HD: 1080p) to UHD (4k or higher), depending on the user’s mental state. In some embodiments, the clarity is adjusted by affecting the sharpness of (blurring) the imagery or obscuring the imagery with filters or clouding. In other embodiments, the image can be distorted in other ways, e.g. by affecting colour (such as by colour shifting, enhancement or filtering), polarising, vignetting, manipulating exposure or by using any other image/video manipulation technique.
In further embodiments, the device 100 can vary the type of output e.g. by providing an audio output instead of a visual output, vice versa or providing additional outputs as the user’s state changes. One example progresses the output depending on concentration or relaxation level as follows:
Mono audio-only output
Stereo audio-only output
Stereo audio + greyscale video output
Stereo audio + colour video output
Surround audio + colour video output
Surround audio + 3D video output
Surround audio + HDR 3D video output
Surround audio + HDR 3D video + haptic feedback (4D)
Further progressions may involve changes in quality of the output, e.g. from below-HD to HD and Ultra-HD, as discussed above.
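As an illustrative sketch only, a 0-100 concentration or relaxation level could be mapped onto the output progression listed above; the even split of the scale into eight bands is an assumption, not a value from the patent.

```python
OUTPUT_TIERS = [
    "mono audio only",
    "stereo audio only",
    "stereo audio + greyscale video",
    "stereo audio + colour video",
    "surround audio + colour video",
    "surround audio + 3D video",
    "surround audio + HDR 3D video",
    "surround audio + HDR 3D video + haptics (4D)",
]

def select_output_tier(level, level_max=100):
    """Map a 0-100 concentration/relaxation level onto one of the tiers above.

    The even split of the scale into eight bands is an illustrative assumption.
    """
    level = min(max(level, 0), level_max)
    index = min(int(level / level_max * len(OUTPUT_TIERS)), len(OUTPUT_TIERS) - 1)
    return OUTPUT_TIERS[index]
```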
In another basic embodiment, the invention provides a relaxation aid which can operate in a similar manner, enhancing the clarity, quality and/or volume of a stimulative output e.g. in the form of a relaxation scene and/or audio as the user relaxes.
In preferred embodiments, the system receives one or more of: neurological (e.g. EEG, ECoG), cardiographical (e.g. ECG), respiratory (e.g. from resistivity sensors) and movement (e.g. from accelerometers) data from the user. The system is configured to monitor the input data and analyse the data to characterise an activity level of the user; and provide a stimulative output to the user in response to the input signal, the output depending on the characterisation. Again, the device 100 can vary the type and/or characteristics of the output 130. Preferably, the system is configured to receive the input data from the user substantially in real-time and provide the stimulative output to the user in response to the input data substantially in realtime.
In preferred embodiments, the system characterises the alertness (concentration and/or relaxation level) of the user e.g. on a scale of 1 to 100, with 1 being deeply asleep, 25 being awake but very relaxed and 100 being fully alert and/or concentrating, based on the data inputs. The data inputs may be weighted depending on the user’s body metrics such as age.
In some embodiments, the output is a simulated virtual environment and the modelling characteristics of the virtual environment (such as the physics, geometry, location etc.) depend on the input signals, e.g. a neurologically-derived (such as an EEG) signal. The simulated virtual environment may include dynamic elements that are modelled physically, and the user experience in the environment is based on their relaxation and/or concentration level. For example, the user can be modelled as a leaf blowing in the wind, and the modelled characteristics of the leaf depend on the user's state (e.g. a relaxed user is depicted as a lightweight leaf or feather blowing freely in the wind; whilst a stressed or anxious user is depicted as a heavyweight leaf less able to move freely). In another embodiment, the user is depicted as a skier skiing down a mountain, and the user experience in the environment is based on their relaxation level, e.g. by removing obstacles in the path of the skier as the user relaxes, affecting the gradient of the mountain and/or affecting other skiers that are competing in a descent to the bottom of the mountain.
In some embodiments, as outlined in figures 4-10, the activity level comprises or depends on a feature calculated from the EEG signal as the ratio of the average power of the Beta band [12, 30] Hz to the average power of the Delta + Theta bands [2, 8] Hz over predefined intervals of time. This ratio is low whilst a user is relaxed / not concentrating (e.g. asleep, not meditating) and increases during concentration (e.g. in meditation). The system is preferably configured to calibrate the EEG readings for a particular user during a calibration interval, e.g. of 90 s, to create a baseline and define a threshold for the activity level. Anomalous values of the power ratio beyond two standard deviations from the mean are typically ignored and the threshold can then be calculated as follows:
Threshold = mean(CalibrationPowers) + k*standardDev(CalibrationPowers), where k is fixed based on experimentation and CalibrationPowers is a time series of the ratio power(Beta)/power(Delta+Theta), preferably where:
Beta: [11, 39.9] Hz
Delta: [3, 3.9] Hz
Theta: [4, 7.9] Hz
These powers are preferably calculated every 1-3 s, more preferably every 0.5 s.
An initial threshold value of k was determined empirically as 1.25, but this can adaptively change as more data is collected during an active session.
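A minimal sketch (an illustration, not the patent's own code) of this calibration step, discarding outliers beyond two standard deviations and applying the mean-plus-k-standard-deviations rule with the empirically chosen starting value k = 1.25:

```python
import numpy as np

def calibration_threshold(calibration_powers, k=1.25):
    """Threshold = mean + k * std of the calibration power ratios, after
    discarding anomalous values more than two standard deviations from the
    mean; k = 1.25 is the empirically chosen starting value given above."""
    powers = np.asarray(calibration_powers, dtype=float)
    mean, std = powers.mean(), powers.std()
    kept = powers[np.abs(powers - mean) <= 2 * std]   # drop outliers
    return kept.mean() + k * kept.std()
```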
The user’s ‘meditation’ state can then generally be categorized as follows:
Meditation State             Features > Threshold (%)
Calibration                  < 15
Non-meditating               < 50
Approaching meditating       > 50 and < 90
Pre-meditating               > 95
Meditating (concentrating)   > 90
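An illustrative sketch of this categorisation from the percentage of feature ratios exceeding the threshold; the ordering used here to resolve the overlap between the 'pre-meditating' (>95) and 'meditating' (>90) bands is an assumption.

```python
def meditation_state(percent_above_threshold, calibrating=False):
    """Categorise the user's state from the % of features above threshold.

    Band edges follow the table above; the ordering of the overlapping
    'pre-meditating' and 'meditating' bands is an illustrative choice.
    """
    p = percent_above_threshold
    if calibrating and p < 15:
        return "calibration"
    if p < 50:
        return "non-meditating"
    if p <= 90:
        return "approaching meditating"
    if p > 95:
        return "pre-meditating"
    return "meditating (concentrating)"
```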
Figures 17 to 25 show a real time EEG data set for 4 minutes pre-meditation (normal state), 4 minutes during meditation, then 4 minutes post meditation (back to normal state). As explained in figures 12 to 15, each of figures 17 to 25 shows:
EEG Data Figure:
Displays the raw EEG data after filtering with a 50-Hz notch filter (a band-stop filter with a narrow stopband; high Q factor) to remove power interference.
Power Ratio Figure:
Displays the extracted power features from the EEG signals. The feature is the ratio of the signal power in the Beta Band (13-30 Hz) to the signal power in the Delta + Theta Bands (1-8 Hz). Concentration is associated with increased Beta activity and decreased Delta and Theta activity.
Smoothed Power Ratios Figure:
Displays the smoothing of the ratios in the Power Ratios Figure. Smoothing makes use of moving averages, filters, or local regression methods; an illustrative filtering and smoothing sketch follows these figure descriptions.
Spider Diagram for Relative Powers:
Displays the variation of relative powers for the five EEG frequency Bands (typical frequencies): Delta (1-4 Hz), Theta (4-8 Hz), Alpha (8-12 Hz), Beta (12-30 Hz) and Gamma (30+ Hz).
Focus Gauge:
Displays the level of focus, which is calculated based on the variation of the power ratios extracted from the EEG signal.
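Purely as an illustration, the 50 Hz notch filtering and moving-average smoothing described in the figure captions above might be implemented as follows; the sampling rate, Q factor and smoothing window are assumed values.

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

def preprocess_eeg(raw, fs=256, mains_hz=50.0, q=30.0):
    """Remove 50 Hz mains interference with a narrow (high-Q) notch filter."""
    b, a = iirnotch(w0=mains_hz, Q=q, fs=fs)
    return filtfilt(b, a, raw)

def smooth(ratios, window=5):
    """Simple moving-average smoothing of the power-ratio time series."""
    kernel = np.ones(window) / window
    return np.convolve(ratios, kernel, mode="same")
```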
Figures 26 to 29 are screenshots from an embodying system of the invention and are discussed under ‘use’ below.
Use
A preferred usage scenario of a system with a neurological feedback loop is now described. Preferably, the user is wearing a VR headset with an integrated EEG electrode or a separate EEG headset providing a neurological feedback signal directly to the training system in substantially real-time, with minimal delay. The user selects an ‘experience’ for the system, which might be a relaxing scene, a movie or sports event, and the user’s experience depends on the neurological signal received by the training aid, e.g. as depicted in figure 3. Depending on the experience selected and/or the user’s preference, the system monitors and characterises the neurological signal and provides stimulative feedback to enhance/encourage concentration or relaxation.
In a ‘concentration’ mode, the system is preferably configured to alter the type and/or characteristics of the output such as the clarity and/or quality of displayed imagery and/or the volume of audio depending on the user’s concentration: e.g. maximising the clarity and/or quality of displayed imagery when the concentration level of the user is high; and conversely decreasing the clarity and/or quality of displayed imagery and/or volume of audio as the user loses concentration. In the scenario of a sports event, the user wants to view the experience in optimal quality and thus strives to concentrate on the event and avoid distractions which detract from the experience. In a gaming environment, the user might be rewarded with additional hints or items in their inventory to play the game, depending on their feedback data signal(s).
In a ‘relaxation’ mode, the system is preferably configured to adjust the type and/or characteristics of the output such as the clarity and/or quality of a relaxation scene such as waterfall or beach paradise imagery; and/or the clarity, quality and/or volume of audio such as classical music, depending on the user’s feedback data signal(s).
An example of such an environment is illustrated in figure 26. Figure 26 is a screenshot from the simulated VR environment showing the user wearing a VR headset with a separate EEG headset, heart rate monitor and respiratory rate monitor. These metrics are measured and plotted on the GUI as a graph and relative measures of ‘activity level’. The user is able to look around in the VR environment and react to this stimulative output.
In figure 27, the user is only moderately relaxed and moderately focussed, as shown by the indicators, so the scene is not clear and is partially obscured by cloud. In figures 28 and 29, the user is more focussed and more relaxed and thus the scene is clearer.
In some embodiments, an additional output is provided to the user and this may be fed back into the system and factored into the assessment of the state of the user. For example, the system can be configured for a student wanting to watch a sports match to provide two outputs: i) a stimulative reward output of the sporting event on a display and ii) an educational output providing questions testing the student about a given topic. In this embodiment, the system is configured to assess the neurological activity level (mental state) of the user based on the neurologically derived signal and the accuracy of the responses to questions, and then adjust the stimulative sporting event output accordingly. In this scenario, preferably the concentration assessment is based on the accuracy of responses to the educational output and this affects the type and/or characteristics of the reward output.
In some embodiments, the outputs comprise a memory trigger, e.g. by displaying user data such as photographs to help users with memory loss, or to aid a PTSD sufferer e.g. by displaying real or abstract imagery or a rendered experience related to a traumatic experience, to help the user process the experience more fully and reach closure.
In some embodiments, additional sensors are used to provide additional data and further enhance the experience or accuracy of the activity level assessment. For example, a heart rate monitor may be used to assess relaxation, a perspiration sensor may be used to measure anxiety and motion sensors (e.g. accelerometers) may be used to detect movement, all of which can be fed into the system and used to modify the output(s) the user experiences. This additional sensor data may readily be available from devices already carried by the user, e.g. in fitness watches or on a mobile phone - these often include heart rate monitors and accelerometers and this data can readily be transferred to the system by wire or wirelessly.
Subliminal outputs
In some embodiments, the output comprises one or more subliminal outputs. Preferably, the subliminal output is or forms part of the stimulative output in response to the state or activity of the user. These outputs are intended to subconsciously stimulate the brain and trigger a frequency-follower response to entrain specific neural rhythms. To this end, embodiments may contain subliminal stimulating outputs embedded in elements of the displayed virtual environment, configured to drive the user towards a target state. These outputs may operate at frequencies corresponding to the different EEG signal bands to induce brain entrainment, which occurs when the measured brainwave activity resonates with the frequency of an external stimulus.
Sensory entrainment may include subliminal audio, visual, or audio-visual outputs that facilitate a behaviourally evident alignment of the neural activity to the temporally regular stimulus operating in a relevant frequency band, e.g., Delta (1-3 Hz) for unawareness and deep unconsciousness, Theta (4-7 Hz) for creativity and optimal meditation, Alpha (8-12 Hz) for meditation and relaxation, Beta (13-30 Hz) for thinking and focus, and Gamma (30+ Hz) for mental sharpness and brain organization. In a displayed scene such as a virtual environment, a subliminal output can be embedded in the scene, e.g. in waves that break on the shore, flickering light from the sun or glow flies, moving animals or objects, or environmental elements such as gusts of wind or moving clouds that appear at selected entrainment frequencies.
In one embodiment, an abstract light in the form of a glowing cylinder is embedded as an overlay in the scene, flickering at 10 Hz (alpha wave frequency) in the background of the user’s point of view. The alpha wave frequency indicates a level of wakeful relaxation and a flickering light at 10 Hz can help relaxation. In another embodiment, a butterfly flaps its wings at 10 Hz.
The system is configured to monitor the state of the user, obtain feedback to control the type and intensity of the brain entrainment outputs, and optimise for a better user experience. In some embodiments, the system may monitor the user's eyes, using sensors or input readings such as EEG signals, and adjust the subliminal elements in response. Entropy-based or spectral features of the EEG Alpha band signals can be used to detect when the user's eyes are closed, and consequently the intensity of a flickering light can be increased to subliminally penetrate the user's eyelids, reinforce visual brain entrainment, and encourage concentration and relaxation. The intensity of the light then adaptively changes, preferably reducing back to a subliminal level, when the eyes are opened. Preserving the same subliminal brain entrainment outputs while the user changes state (e.g., eyes opened or closed) can help the brain recreate other output elements presented in the scene, stimulated by the subliminal outputs. This configuration helps smooth the transition between states.
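A crude sketch of this behaviour, offered only as an illustration: eyes-closed detection here uses relative Alpha-band power (one of the spectral features mentioned above), and the 0.4 ratio threshold and the two intensity levels are assumed values, not figures from the patent.

```python
import numpy as np
from scipy.signal import welch

def eyes_closed(eeg_window, fs=256, alpha=(8.0, 12.0), ratio_threshold=0.4):
    """Crude eyes-closed detector: closed eyes typically raise relative Alpha
    power; the 0.4 relative-power threshold is an illustrative assumption."""
    freqs, psd = welch(eeg_window, fs=fs, nperseg=fs * 2)
    band = (freqs >= alpha[0]) & (freqs < alpha[1])
    alpha_power = np.trapz(psd[band], freqs[band])
    total_power = np.trapz(psd, freqs)
    return alpha_power / max(total_power, 1e-12) > ratio_threshold

def flicker_intensity(closed, subliminal=0.05, boosted=0.6):
    """Raise the flicker intensity when the eyes are closed so it still
    penetrates the eyelids; fall back to a subliminal level otherwise."""
    return boosted if closed else subliminal
```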
In some embodiments, there are multiple subliminal elements and these can be reduced or increased in number to provide a smoother transition between states. Alternatively or in addition, the frequencies used can be responsive to the user's state or activity level, to promote particular reactions and/or further smooth transitions between states. For example, a subliminally flickering light could transition from flickering at delta wave frequencies to alpha and then beta wave frequencies as a user wakes, to encourage alertness; and a reversed transition from alpha to delta frequencies can encourage relaxation and sleep.
In addition to visual stimuli of varying intensity, frequency, and colour, embodiments may contain auditory stimuli that are embedded in the setting by modulating specific audio tones into played music files. Audio stimuli of relevant frequencies can be subliminally introduced using binaural beats, which occur when tones of different frequencies are played in opposite ears. The difference between the two frequencies is perceived by the brain as an auditory beat that can be used for entrainment. The frequencies of the tones are chosen such that their difference is equal to the desired entrainment frequency. Progressively changing the frequency of the binaural beats can promote a smooth transition from the actual neural activity to the target neural rhythm. In some embodiments, maximising the ratio relationship in the progressive change of the entrainment frequencies can help mimic the progression of a wide range of natural phenomena and thus correlates more closely with the virtually displayed natural scene. In addition, the choice of the audio entrainment frequency can be optimised according to the type and level of the background noise in the user's setting.
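A minimal sketch of generating such a binaural beat as a stereo signal; the 200 Hz carrier, the 10 Hz (Alpha-band) beat frequency, the duration and the sample rate are assumed values chosen for illustration.

```python
import numpy as np

def binaural_beat(carrier_hz=200.0, beat_hz=10.0, duration_s=60.0, fs=44100):
    """Generate a stereo signal whose left/right tones differ by 'beat_hz';
    the brain perceives the difference as an auditory beat for entrainment."""
    t = np.arange(int(duration_s * fs)) / fs
    left = np.sin(2 * np.pi * carrier_hz * t)
    right = np.sin(2 * np.pi * (carrier_hz + beat_hz) * t)
    return np.stack([left, right], axis=1)  # shape (n_samples, 2)
```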
When used together, audio and visual stimuli may be jointly synchronized and optimized by monitoring the state of the user. EEG measurements from different brain lobes can give localization of undesired or irregular neural activity, which helps adjust the type and intensity of stimuli. If irregularity is in the occipital lobe, which is responsible for vision, visual stimuli would be intensified. Similarly, irregularities in the temporal lobe, which processes sound, may be resolved by intensifying auditory stimuli. Artificial intelligence learning algorithms can be used to choose the visual and audio outputs that are correlated by matching some relevant features extracted from both.
In the above, the subliminal outputs are effectively synchronised with brain wave frequencies to achieve a desired effect. In further embodiments, the subliminal output(s) or other outputs may be synchronised with the user’s state or activity level, and in particular with their heart rate or respiratory level, to promote association between the user and the experience. For example, if the output is in the form of a virtual environment, then the refresh rate of the environment as a whole, or a subset of the environment such as pixel movement of the avatar, or the flicker of a light etc. can be synchronised with the user’s heart rate - users then feel a stronger sense of association with the avatar.
Figures 30 to 38 are screenshots from another embodying system of the invention, showing the user’s experience as viewed through a VR headset, with a subliminal output.
Figure 30 shows the main menu for a user to enter basic input information and figure 31 shows environments for selection.
Figure 32 shows the ‘meditation configuration’ screen, where the user can set parameters and preferences for the session, such as the session length, the user position and the calibration time and experience level. The user may also choose how their vitals are indicated (e.g. fog, floating or glow) and the EEG measurement point (Fp1 or Fp2). In this particular example, the EEG (concentration) readings are represented by fog in the VR environment, whilst the ECG and breathing (relaxation) readings are illustrated by the user’s avatar - the ECG reading by the glow of the avatar and the breathing (relaxation) by the floating movement of the avatar, modelled by a physics engine. In other embodiments, the various readings can be modelled by objects in the environment, e.g. a relaxed user can be depicted as a lightweight leaf or feather blowing freely in the wind; whilst a stressed or anxious user can be depicted as a heavyweight leaf or feather less able to move freely, as discussed above. The floating movement can create a feeling of elevation.
Once the user enters their parameters and preferences, their selected environment is loaded, as shown in figures 33 and 34.
Figures 35-38 show additional screenshots of the user experience in this example embodiment. In figure 35, the user's view is entirely obscured by fog, because the user is not concentrating, as shown by the EEG reading (represented by fog and shown by the meter adjacent to the brain image in the top 'status bar'). In figure 36, the user's view is less obscured by fog, as the user's concentration level has increased. In figure 37, the user's avatar is shown glowing, representing their ECG reading (as shown by the plot adjacent to the heart image in the status bar), the glow preferably pulsating in synchrony with the user's heart rhythm. In some embodiments, the glow pulsates at a target heart rhythm, to subconsciously trigger a frequency-follower response - for example, if the pulsation is at 70-100%, preferably 80-90%, of the user's actual substantially real-time heart rhythm/frequency, then this can encourage a slower heart rate and enhanced relaxation, providing a subliminal output as discussed above. Figure 38 shows the same avatar absent the glow shown in figure 37. Figures 35-38 also indicate the session time in the bottom left-hand corner.
Other preferred features
In preferred embodiments, the system provides an output in the form of a performance score or metric, quantifying the level of concentration and/or relaxation achieved. This may be determined as an average (mean, median or modal) level or as a delta between the start and end of the session; or as a proportional or total amount of time above or below a threshold. This metric provides the user with a measure for comparing the effectiveness of each session and can be shared via social media etc. to encourage further use and competition.
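A minimal sketch of such a session score, assuming the levels are sampled on the 0-100 scale mentioned earlier; the choice to report the mean, the start-to-end delta and the proportion of time above a threshold together is illustrative.

```python
import numpy as np

def session_score(levels, threshold=None):
    """Summarise a session of concentration/relaxation levels.

    Returns the mean level, the start-to-end delta and, if a threshold is
    given, the proportion of time spent above it - the score variants
    mentioned above."""
    levels = np.asarray(levels, dtype=float)
    score = {
        "mean_level": float(levels.mean()),
        "delta": float(levels[-1] - levels[0]),
    }
    if threshold is not None:
        score["time_above_threshold"] = float((levels > threshold).mean())
    return score
```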
In preferred embodiments, the system has self-learning functionality and records the user's responses to the specific features in/aspects of the stimulative feedback. For example, the system can continue monitoring the data input stream and determine the effect the environment has on the user. Over a multitude of sessions, the system can thereby determine the user's response to particular aspects or characteristics of the feedback. For example, if the stimulative feedback is a VR environment simulating a beach scene, the system can record the user's reaction to particular aspects of the scene (e.g. turtles swimming in the sea or an impressive landmark or point of interest) and assign a score or weighting to the response and then adapt/modify future outputs to achieve the desired user reaction. The system thus provides adaptive AI that modifies the scene to better train the user.
When used in this specification and claims, the terms comprises and comprising and variations thereof mean that the specified features, steps or integers are included. The terms are not to be interpreted to exclude the presence of other features, steps or components.
This specification discloses various features, steps and preferred embodiments of the invention. The various features, steps and preferred embodiments of the invention disclosed in the application as filed (in the description, claims and/or drawings) are contemplated individually and in any and all combinations of selected features, steps and/or preferred embodiments and equivalents. All such combinations fall within the scope of this disclosure.

Claims (50)

Claims
1. A system for providing stimulative feedback to a user dependent on their state or activity level, comprising:
an input configured to receive neurological, cardiographical and/or respiratory data input from a user; and a processor configured to:
monitor the input data and analyse the data to characterise the state or activity level of the user; and provide a stimulative output to the user in response to the input signal, the output depending on the characterisation.
2. The system of claim 1, wherein the neurological data input comprises an EEG or ECoG signal.
3. The system of any preceding claim, wherein the cardiographical data input comprises an ECG signal.
4. The system of any preceding claim, wherein the activity level is or comprises an alertness measure.
5. The system of claim 4, wherein the alertness measure comprises a neurological, cardiographical and/or respiratory activity indicator.
6. The system of claim 4 or 5, wherein the alertness measure indicates a concentration and/or relaxation level of the user.
7. The system of any preceding claim, wherein the processor is further configured to convert the signal using an algorithm.
8. The system of any preceding claim, wherein the system is configured to receive the input data from the user substantially in real-time and provide the stimulative output to the user in response substantially in real-time.
9. The system of claim 1, wherein:
the input comprises an EEG or ECoG signal received from the user substantially in real time;
the processor is configured to characterise the neurological activity level of the user based on the EEG or ECoG signal; and the processor is configured to display the stimulative output, dependent on the neurological activity level of the user, to the user substantially in real time.
10. The system of any preceding claim, wherein the stimulative output comprises an audio and/or visual output and the type and/or characteristics of the output, preferably the clarity, quality and/or volume of the output, varies depending on the user’s state or activity level.
11. The system of claim 10, wherein the clarity, quality and/or volume of the output increases with decreasing activity level; and/or the clarity, quality and/or volume of the output decreases with decreasing activity level.
12. The system of any preceding claim, further comprising a display, preferably wherein providing the stimulative output comprises displaying a virtual environment on the display.
13. The system of claim 12, comprising a head-mounted display (HMD) system for displaying the virtual environment to the wearer, wherein the virtual environment output depends on the characterisation.
14. The system of any of claims 12 or 13, wherein the modelling characteristics, preferably the physics, of the virtual environment depend on the characterisation of the user’s state or activity level.
15. The system of any of any preceding claim, wherein the system is configured to synchronise the output:
with a predetermined brainwave frequency; and/or with the user’s state or activity level, preferably with the user’s heart rate.
16. The system of any preceding claim, further comprising a neurological, cardiographical and/or respiratory sensor configured to provide the input data from the user.
17. The system of any preceding claim, further comprising a body temperature sensor, a heart rate monitor, a respiratory rate sensor, a blood pressure monitor, a perspiration sensor and/or a mobility sensor.
18. The system of any of claims 2 to 17, wherein:
the input further comprises a respiratory rate and/or ECG data input received from the user substantially in real time;
the processor is further configured to characterise a relaxation level of the user based on the respiratory rate and/or ECG data input; and the processor is configured to display the stimulative output, dependent on the neurological activity and relaxation level of the user, to the user substantially in real time.
19. The system of any preceding claim, wherein the system is configured to display an indicator of the user’s state or activity level.
20. The system of any of claims 2 to 19, wherein the processor is configured to calculate a feature ratio of the average power of Beta waves in the EEG signal to the average power of Delta +Theta waves in the EEG signal over a predefined timeframe, i.e.:
feature ratio = Pavg(Beta Band) / Pavg(Delta + Theta Bands)
and compare this to a threshold to determine the user's state or activity level.
21. The system of any preceding claim, wherein the processor is configured to analyse the input data from a user for an initial recording timeframe and determine a threshold for the user based on the initial recording data.
22. The system of claim 21, wherein the system is configured to calculate the threshold from the EEG data input according to the below, where k is a constant:
Threshold = mean(CalibrationPower) + k × StdDev(CalibrationPower).
23. The system of any of claims 20 to 22, wherein the processor is configured to calculate a percentage of feature ratios greater than the threshold over a predetermined sampling window and sampling interval and thereby categorise the user’s neurological state or activity level according to:
0% < feature ratio greater than threshold < 40%: calibration/not concentrating
40% < feature ratio greater than threshold < 80%: partially concentrating
80% < feature ratio greater than threshold < 100%: highly concentrating
24. The system of any of claims 2 to 23, wherein the system is configured to filter the raw EEG data signal using a 50 Hz notch filter.
25. The system of any preceding claim, wherein the processor is further configured to:
provide a second output to the user, requiring input from the user in response;
receive a response from the user;
characterise the current neurological state of the user based on the neurologically derived signal and the user input; and modify the stimulative output to the user depending on the neurological signal and the user input.
26. A method of providing stimulative feedback to a user dependent on their state or activity level, comprising:
receiving neurological, cardiographical and/or respiratory data input from a user;
analysing the input data to characterise the state or activity level of the user; and providing a stimulative output to the user in response to the input signal, the output depending on the characterisation.
27. The method of claim 26, wherein the neurological data input comprises an EEG or ECoG signal.
28. The method of claim 26 or 27, wherein the cardiographical data input comprises an ECG signal.
29. The method of any of claims 26 to 28, wherein the activity level is or comprises an alertness measure.
30. The method of any of claims 26 to 29, wherein the alertness measure comprises a neurological, cardiographical and/or respiratory activity indicator.
31. The method of claim 29 or 30, wherein the alertness measure indicates a concentration and/or relaxation level of the user.
32. The method of any of claims 26 to 31, further comprising: converting the signal using an algorithm.
33. The method of any of claims 26 to 32, comprising:
receiving the input data from the user substantially in real-time; and providing the stimulative output to the user in response substantially in real-time.
34. The method of claim 26, wherein:
the input comprises an EEG or ECoG signal received from the user substantially in real time; and the method comprises:
characterising the neurological activity level of the user based on the EEG or ECoG signal; and displaying the stimulative output, dependent on the neurological activity level of the user, to the user substantially in real time.
35. The method of any of claims 26 to 34, wherein the stimulative output comprises an audio and/or visual output and the type and/or characteristics of the output, preferably the clarity, quality and/or volume of the output, varies depending on the user’s state or activity level.
36. The method of claim 35, wherein the clarity, quality and/or volume of the output increases with decreasing activity level; and/or the clarity, quality and/or volume of the output decreases with decreasing activity level.
37. The method of any of claims 26 to 36, wherein providing the stimulative output comprises displaying a virtual environment on a display.
38. The method of claim 37, further comprising displaying the virtual environment to the wearer on a head-mounted display (HMD), wherein the virtual environment output depends on the characterisation.
39. The method of claim 37 or 38, wherein the modelling characteristics, preferably the physics, of the virtual environment depend on the characterisation of the user’s state or activity level.
40. The method of any of claims 26 to 39, comprising synchronising the output:
with a predetermined brainwave frequency; and/or with the user’s state or activity level, preferably with the user’s heart rate.
41. The method of any of claims 26 to 40, comprising:
receiving sensor input data from a neurological, cardiographical and/or respiratory sensor.
42. The method of claim 41, wherein the additional sensor comprises a body temperature sensor, a heart rate monitor, a respiratory rate sensor, a blood pressure monitor, a perspiration sensor and/or a mobility sensor.
43. The method of any of claims 27 to 42, wherein:
the input further comprises a respiratory rate and/or ECG data input received from the user substantially in real time; and the method further comprises:
characterising the relaxation level of the user based on the respiratory rate and/or ECG data input; and displaying the stimulative output, dependent on the neurological activity and relaxation level of the user, to the user substantially in real time.
44. The method of any of claims 26 to 43, further comprising: displaying an indicator of the user’s state or activity level.
45. The method of any of claims 26 to 44, further comprising:
calculating a feature ratio of the average power of Beta waves in the EEG signal to the average power of Delta + Theta waves in the EEG signal over a predefined timeframe, i.e.:
feature ratio = Pavg(Beta Band) / Pavg(Delta + Theta Bands)
and comparing this to a threshold to determine the user's state or activity level.
46. The method of any of claims 26 to 45, further comprising:
analysing the input data for an initial recording timeframe from a user and determining a threshold for the user based on the initial recording data.
47. The method of claim 46, wherein the system is configured to calculate the threshold from the EEG data input according to the below, where k is a constant:
Threshold = mean(CalibrationPower) + k × StdDev(CalibrationPower).
48. The method of any of claims 44 to 46, further comprising:
calculating a percentage of feature ratios greater than the threshold over a predetermined sampling window and sampling interval and thereby categorising the user's neurological state or activity level according to:
0% < feature ratio greater than threshold < 40%: calibration/not concentrating
40% < feature ratio greater than threshold < 80%: partially concentrating
80% < feature ratio greater than threshold < 100%: highly concentrating.
49. The method of any of claims 27 to 48, further comprising: filtering the raw EEG data signal using a 50 Hz notch filter.
50. The method of any of claims 26 to 49, further comprising:
providing a second output to the user, requiring input from the user in response;
receiving a response from the user;
characterising the current neurological state of the user based on the neurologically derived signal and the user input; and modifying the stimulative output to the user depending on the neurological signal and the user input.
GB1805428.8A 2017-09-08 2018-04-03 A system and method for providing stimulative feedback Active GB2567506B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GBGB1714471.8A GB201714471D0 (en) 2017-09-08 2017-09-08 Training Aid

Publications (3)

Publication Number Publication Date
GB201805428D0 GB201805428D0 (en) 2018-05-16
GB2567506A true GB2567506A (en) 2019-04-17
GB2567506B GB2567506B (en) 2020-07-15

Family

ID=60117155

Family Applications (2)

Application Number Title Priority Date Filing Date
GBGB1714471.8A Ceased GB201714471D0 (en) 2017-09-08 2017-09-08 Training Aid
GB1805428.8A Active GB2567506B (en) 2017-09-08 2018-04-03 A system and method for providing stimulative feedback

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GBGB1714471.8A Ceased GB201714471D0 (en) 2017-09-08 2017-09-08 Training Aid

Country Status (1)

Country Link
GB (2) GB201714471D0 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IT202100010904A1 (en) * 2021-04-29 2022-10-29 Centro Rham S R L INNOVATIVE METHOD AND RELATED AUDIO-VIDEO DEVICE FOR THE ASSESSMENT AND TREATMENT OF FEARS AND FOR THE ASSESSMENT AND COGNITIVE ENHANCEMENT

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114652330B (en) * 2022-02-11 2023-03-24 北京赋思强脑科技有限公司 Method, device and equipment for evaluating meditation training based on historical electroencephalogram signals

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020128540A1 (en) * 2001-02-23 2002-09-12 Sun-Il Kim System and method of correlating virtual reality with biofeedback for enhancing attention
US8271077B1 (en) * 2008-08-27 2012-09-18 Lockheed Martin Corporation Learning optimization using biofeedback
WO2014107795A1 (en) * 2013-01-08 2014-07-17 Interaxon Inc. Adaptive brain training computer system and method
US20160005320A1 (en) * 2014-07-02 2016-01-07 Christopher deCharms Technologies for brain exercise training
US20160077547A1 (en) * 2014-09-11 2016-03-17 Interaxon Inc. System and method for enhanced training using a virtual reality environment and bio-signal data
WO2017096104A1 (en) * 2015-12-04 2017-06-08 Saudi Arabian Oil Company Systems, computer medium and methods for management training systems
US20170188976A1 (en) * 2015-09-09 2017-07-06 WellBrain, Inc. System and methods for serving a custom meditation program to a patient

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015038920A1 (en) * 2013-09-12 2015-03-19 Sunnen Gerard V Devices and method utilizing ultra-low frequency non-vibratory tactile stimuli for regulation of physiological processes
US10803145B2 (en) * 2016-02-05 2020-10-13 The Intellectual Property Network, Inc. Triggered responses based on real-time electroencephalography
US10434279B2 (en) * 2016-09-16 2019-10-08 Bose Corporation Sleep assistance device

Also Published As

Publication number Publication date
GB201805428D0 (en) 2018-05-16
GB201714471D0 (en) 2017-10-25
GB2567506B (en) 2020-07-15

Similar Documents

Publication Publication Date Title
CA2935813C (en) Adaptive brain training computer system and method
Liang et al. Development of an EOG-based automatic sleep-monitoring eye mask
KR102400268B1 (en) Mobile wearable monitoring systems
US8326408B2 (en) Method and apparatus of neurological feedback systems to control physical objects for therapeutic and other reasons
US9675291B2 (en) Modifying a psychophysiological state of a subject
US8594787B2 (en) Synchronising a heart rate parameter of multiple users
US20110183305A1 (en) Behaviour Modification
US20080214903A1 (en) Methods and Systems for Physiological and Psycho-Physiological Monitoring and Uses Thereof
Kar et al. Effect of sleep deprivation on functional connectivity of EEG channels
CN110302460B (en) Attention training method, device, equipment and system
CN104665827B (en) Wearable physiology detection apparatus
GB2567506A (en) Training aid
Morales et al. An adaptive model to support biofeedback in AmI environments: a case study in breathing training for autism
US10085690B2 (en) System and method for feedback of dynamically weighted values
CN114828970A (en) Synchronization of physiological data and game data to affect game feedback loops
Pyre Analysis of physiological responses induced by motion sickness and its detection based on ocular parameters
JP7146196B2 (en) Score calculation device and method, and score calculation device control program
Kuo et al. An EOG-based sleep monitoring system and its application on on-line sleep-stage sensitive light control
Masi et al. Electrodermal Activity in the Evaluation of Engagement for Telemedicine Applications
WO2022181168A1 (en) Stimulus presentation system, stimulus presentation method, program, and model generation system
Rushambwa et al. Impact assessment of mental subliminal activities on the human brain through neuro feedback analysis
Calcerano et al. Neurofeedback in Virtual Reality Naturalistic Scenarios for Enhancing Relaxation: Visual and Auditory Stimulation to Promote Brain Entrainment
JP2023181058A (en) System, method, and program for processing signals having waveforms that change over time
JP2024136472A (en) Detection device, program, and detection system
Polo Multimodal assessment of emotional responses by physiological monitoring: novel auditory and visual elicitation strategies in traditional and virtual reality environments

Legal Events

Date Code Title Description
COOA Change in applicant's name or ownership of the application

Owner name: NEUROPRO LIMITED

Free format text: FORMER OWNER: VIRTUALLY LIVE (SWITZERLAND) GMBH

REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 1262707

Country of ref document: HK