IL285071A - System and method for monitoring sedation level of a patient based on eye and brain activity tracking - Google Patents
- Publication number
- IL285071A
- Authority
- IL
- Israel
- Prior art keywords
- patient
- eye
- communication protocol
- eye image
- sedation
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/163—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
- A61B5/0004—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by the type of physiological signal transmitted
- A61B5/0006—ECG or EEG signals
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/25—Bioelectric electrodes therefor
- A61B5/279—Bioelectric electrodes therefor specially adapted for particular uses
- A61B5/291—Bioelectric electrodes therefor specially adapted for particular uses for electroencephalography [EEG]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/369—Electroencephalography [EEG]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/7405—Details of notification to user or communication with user or patient ; user input means using sound
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
- B60W2040/0818—Inactivity or incapacity of driver
- B60W2040/0827—Inactivity or incapacity of driver due to sleepiness
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo or light sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
Description
SYSTEM AND METHOD FOR MONITORING SEDATION LEVEL OF A PATIENT BASED ON EYE AND BRAIN ACTIVITY TRACKING

TECHNOLOGICAL FIELD

The present disclosure is in the field of automated systems for monitoring the cognitive state of a patient.
BACKGROUND ART

References considered to be relevant as background to the presently disclosed subject matter are listed below:
- WO 2021/024257;
- WO 2016/142933;
- WO 2019/111257.
Acknowledgement of the above references herein is not to be inferred as meaning that these are in any way relevant to the patentability of the presently disclosed subject matter.
GENERAL DESCRIPTION

The present disclosure provides a system for monitoring the sedation/cognitive state of a patient by continuously monitoring the patient's eye activity and generating eye image data based thereon. The system is further configured to provide a selected output to the patient, such as a questionnaire, an audible output and/or a visual output, in order to increase his/her awareness and reduce risks or a state of delirium.
Optionally, the system is configured to receive EEG data indicative of recorded EEG signals of the patient that are time-correlated with the recorded eye activity of the patient, and the sedation state of the patient is determined based on the EEG data, the eye image data, or a combination thereof. Different sedation states of the patient can be determined by applying different weight-factor profiles to the two data sets.
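By way of a non-limiting illustration, the weighted combination of the two data sets may be sketched as follows. The score scale, weight values and normalization below are illustrative assumptions, not values specified in the disclosure:

```python
def fuse_sedation_estimates(eeg_score, eye_score, w_eeg, w_eye):
    """Combine per-modality sedation estimates using a weight-factor profile.

    Illustrative sketch: both scores are assumed to lie on the same
    sedation scale (e.g. RASS-like), and the weights are assumed to
    form a normalized profile.
    """
    assert abs(w_eeg + w_eye - 1.0) < 1e-9, "weight profile must sum to 1"
    return w_eeg * eeg_score + w_eye * eye_score

# Deeply sedated patient with no eye activity: EEG dominates.
deep = fuse_sedation_estimates(eeg_score=-4.0, eye_score=0.0, w_eeg=0.9, w_eye=0.1)
# Responsive patient with trackable eye activity: eye data dominates.
light = fuse_sedation_estimates(eeg_score=-1.0, eye_score=-0.5, w_eeg=0.3, w_eye=0.7)
```

Different sedation states then correspond to different (w_eeg, w_eye) profiles.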
Upon determination of the sedation state of the patient, the processing circuitry, i.e. the processor/controller of the system, is configured to operate a communication module so as to trigger a selected output of engaging communication to the patient. The output may be interactive, namely one that requires a response from the patient, or passive, namely one that only needs to be received by one of the patient's senses without any required response. The outputted communication is intended to stimulate the cognitive activity of the patient and thereby improve his/her cognitive state.
Thus, an aspect of the present disclosure provides a system for monitoring sedation level of a patient. The system includes (1) a camera unit configured for recording images of an eye of the patient and generating eye image data based thereon; (2) a communication module operable to output a desired communication protocol; and (3) a processing circuitry. The processing circuitry comprises an input module configured to receive EEG data indicative of EEG signals of the patient and is in data communication with the camera. The processing circuitry is configured to: (i) receive and process said eye image data and EEG data; (ii) determine, based on at least one of the eye image data and EEG data, the sedation or cognitive state of the user; and (iii) trigger the communication module to output a selected communication protocol in response to the determined sedation state. The communication protocol can be a questionnaire, playing of music, outputting recorded audio of family or friends, etc.
Yet another aspect of the present disclosure provides a system for monitoring sedation level of a patient. The system comprises (1) a camera unit configured for recording images of an eye of the patient and generating eye image data based thereon; (2) a communication module operable to output a desired communication protocol; and (3) a processing circuitry. The processing circuitry is in data communication with the camera and is operable to (i) receive and process said eye image data; (ii) determine, based on the eye image data, the sedation or cognitive state of the user; and (iii) trigger the communication module to output a selected communication protocol in response to the determined sedation state. The communication protocol can be a questionnaire, playing of music, outputting audio of family or friends, etc.
The following are optional embodiments for any of the above-described aspects.
It is appreciated that, unless specifically stated otherwise, certain features of the presently disclosed subject matter, which are, for clarity, described in the context of separate embodiments, may also be provided in any combination in a single embodiment.
In some embodiments, the system further comprises an EEG unit configured for recording EEG signals of the patient and generating said EEG data based thereon.
It is to be noted that any combination of the described embodiments with respect to any aspect of this present disclosure is applicable. In other words, any aspect of the present disclosure can be defined by any combination of the described embodiments.
In some embodiments of the system, the processing circuitry is configured to calculate a sedation score of the patient, such as a Richmond Agitation-Sedation Scale (RASS) score, and to classify the score into two or more score ranges, each range triggering a different communication protocol.
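A minimal sketch of classifying a sedation score into ranges that trigger different communication protocols may look as follows; the RASS-like cut-offs and protocol names are hypothetical choices for illustration only:

```python
def select_protocol(rass_score):
    """Map a sedation (RASS-like) score to a communication-protocol family.

    The score ranges and protocol names below are illustrative assumptions.
    """
    if rass_score <= -4:
        # Deeply sedated / unarousable: passive output only.
        return "passive_audio"
    if rass_score <= -1:
        # Drowsy to lightly sedated: simple interactive prompts.
        return "interactive_questionnaire"
    # Alert (or agitated): full orientation dialogue.
    return "orientation_dialogue"
```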
In some embodiments of the system, in at least one range of scores, the determination of the sedation state and/or the communication protocol is triggered based only on the EEG data. Thus, for a score indicating that the patient is sedated and there is no eye activity for the camera unit to monitor, the sedation state of the patient is determined based only on the EEG data.
In some embodiments of the system, in at least one range of scores, the determination of the sedation state and/or the communication protocol is triggered based on a combination of the eye image data and the EEG data. Namely, for a score indicating that the patient is alert to some level and there is eye activity that can be monitored by the camera unit, the sedation state of the patient is determined based on a certain combination of the two data sets. Depending on the recorded eye and brain activity of the patient, the influence of each data set on the determination is set by the processing circuitry. Typically, when the patient is responsive to some extent and there is eye activity, the eye image data is more significant for the determination of the sedation state.
In some embodiments of the system, the processing circuitry is configured to apply a temporal analysis on the eye image data and the EEG data to determine a correlation between eye movements, brain activity and sedation level. This can be performed by applying a machine learning algorithm and training the system by inputting the sedation score for different scenarios of eye movements and brain activity.
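One hypothetical form of such temporal analysis is to extract per-window features from the two time-aligned streams before feeding them to a trained classifier; the window length and feature choice below are assumptions:

```python
def temporal_features(eye_gestures, eeg_power, window=5):
    """Extract simple per-window features from time-aligned eye and EEG streams.

    eye_gestures: per-sample gesture labels ("none" = no gesture detected).
    eeg_power:    per-sample EEG band-power values.
    Returns (gesture_rate, mean_power) per non-overlapping window; a real
    system would pass such features to a trained ML model.
    """
    feats = []
    for i in range(0, len(eye_gestures) - window + 1, window):
        gestures = eye_gestures[i:i + window]
        power = eeg_power[i:i + window]
        rate = sum(1 for g in gestures if g != "none") / window
        feats.append((rate, sum(power) / window))
    return feats
```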
In some embodiments of the system, the processing circuitry is configured to analyze selected time-windows of the eye image data and/or the EEG data following an output of communication protocol to identify patient's response to said communication protocol and to determine an updated sedation state of the patient based on said response.
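The time-window analysis of the patient's response might be sketched as follows; the window length is an assumed parameter:

```python
def responded_within_window(gesture_times, protocol_time, window_s=30.0):
    """Return True if any eye gesture was detected in the time-window
    immediately following an output of a communication protocol.

    gesture_times: timestamps (seconds) of detected eye gestures.
    protocol_time: timestamp at which the protocol was outputted.
    window_s:      assumed length of the analysis window.
    """
    return any(protocol_time < t <= protocol_time + window_s
               for t in gesture_times)
```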
In some embodiments of the system, the processing circuitry is further configured to transmit a signal carrying sedation state data indicative of the sedation state of the patient to a remote unit. This can be performed through a transmitting unit of the processing circuitry.
In some embodiments of the system, the processing circuitry is further configured to identify eye gestures in said eye image data. The eye gestures either trigger a communication protocol or affect a selected communication protocol.
In some embodiments of the system, the processing circuitry is configured to classify the eye gestures into responses of the patient to a questionnaire. For example, the questionnaire can be the Confusion Assessment Method for the Intensive Care Unit (CAM-ICU), which is outputted to the patient audibly, and the patient responds to each question in the questionnaire with a specific eye gesture that indicates a specific response to that question.
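A sketch of translating classified gestures into questionnaire answers is given below; the gesture vocabulary and its meanings are hypothetical and would be defined per deployment:

```python
# Hypothetical gesture vocabulary (assumption, not specified in the disclosure).
GESTURE_TO_ANSWER = {
    "blink_twice": "yes",
    "look_left": "no",
}

def questionnaire_answers(detected_gestures):
    """Translate a sequence of classified eye gestures into yes/no answers
    to an audibly outputted questionnaire (e.g. CAM-ICU-style questions)."""
    return [GESTURE_TO_ANSWER.get(g, "unclear") for g in detected_gestures]
```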
In some embodiments of the system, the processing circuitry is configured to classify the eye gestures as commands for playing audible output. The audible output is selected from certain music and voice recordings of relatives, such as greetings of friends and family.
In some embodiments of the system, the processing circuitry is configured to analyze said eye image and EEG data and identify signatures, namely certain temporal patterns of eye activity, brain activity or a combination thereof, which are indicative of a clinical state of the patient.
In some embodiments of the system, the processing circuitry is configured to correlate temporal profiles of said eye image data and/or EEG data with predetermined temporal profiles corresponding to a plurality of signatures, which are indicative of a plurality of clinical states, and to identify a correlation that satisfies a certain condition, such as best match or a certain threshold of matching. The predetermined temporal profiles are stored in a predetermined database and the processing circuitry is in data communication with said predetermined database. In some embodiments, the system further includes said predetermined database.
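One way to realize such a matching condition is a plain correlation against each stored signature with a best-match threshold. Pearson correlation and the threshold value are illustrative choices, not requirements of the disclosure:

```python
def best_matching_signature(profile, signatures, threshold=0.8):
    """Return the name of the stored signature most correlated with an
    observed temporal profile, or None if no correlation passes the
    (assumed) threshold.

    signatures: dict mapping signature name -> predetermined temporal profile.
    """
    def pearson(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        sa = sum((x - ma) ** 2 for x in a) ** 0.5
        sb = sum((y - mb) ** 2 for y in b) ** 0.5
        return cov / (sa * sb) if sa and sb else 0.0

    scored = {name: pearson(profile, sig) for name, sig in signatures.items()}
    name, score = max(scored.items(), key=lambda kv: kv[1])
    return name if score >= threshold else None
```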
In some embodiments of the system, the processing circuitry is configured, upon identifying a personalized signature of the patient, to update the database or to store the signature in a memory thereof. The personalized signature may be identified from a clinical indication that is inputted to the system, or from a manual indication inputted by a user that associates a certain temporal pattern with the clinical state of the patient.
In some embodiments of the system, the clinical condition is selected from at least one of: pain, thirst, hunger, delirium.
Yet another aspect of the present disclosure provides a method for monitoring sedation state of a patient. The method comprises: receiving and processing (i) eye image data indicative of recorded images of an eye of the patient and (ii) EEG data indicative of EEG signals of the patient; determining, based on at least one of the eye image data and EEG data, the sedation state of the user; and outputting a selected communication protocol in response to the determined sedation state.
In some embodiments, the method further comprises calculating a sedation score of the patient and classifying the score into two or more score ranges, each range triggering a different output of communication protocol.
In some embodiments of the method, in at least one range of scores the determination of the sedation state and/or the communication protocol is triggered only based on the EEG data.
In some embodiments of the method, in at least one range of scores the determination of the sedation state and/or the communication protocol is triggered based on a combination of the eye image data and the EEG data.
In some embodiments, the method further comprises applying temporal analysis on the eye image data and the EEG data and determining a correlation between eye movements, brain activity and sedation state.
In some embodiments, the method comprises analyzing selected time-windows of the eye image data and/or the EEG data following an output of a communication protocol to identify the patient's response to said communication protocol and to determine an updated sedation state of the patient based on said response.
In some embodiments, the method comprises transmitting a signal carrying sedation state data indicative of the sedation state of the patient to a remote unit.
In some embodiments, the method further comprises identifying eye gestures in said eye image data and either triggering a communication protocol or affecting the selected communication protocol based on said eye gestures.
In some embodiments of the method, said eye gestures are used for the patient's responses to a questionnaire.
In some embodiments of the method, said eye gestures are used for playing audible output, said audible output being selected from certain music and voice recordings of relatives.
In some embodiments, the method comprises analyzing said eye image data and EEG data and identifying signatures indicative of a clinical state of the patient.
In some embodiments, the method comprises correlating temporal profiles of said eye image data and/or EEG data with predetermined temporal profiles corresponding to a plurality of signatures that are stored in a database, and identifying a correlation that satisfies a certain condition.
In some embodiments of the method, the clinical condition is selected from at least one of: pain, thirst, hunger, delirium.
In some embodiments, the method comprises generating a personalized signature of the patient and updating said database or storing said personalized signature in a memory.
BRIEF DESCRIPTION OF THE DRAWINGS

In order to better understand the subject matter that is disclosed herein and to exemplify how it may be carried out in practice, embodiments will now be described, by way of non-limiting example only, with reference to the accompanying drawings, in which:
Figs. 1A-1D are block diagrams of different non-limiting embodiments of the system according to an aspect of the present disclosure.
DETAILED DESCRIPTION

The following figures are provided to exemplify embodiments and realizations of the invention of the present disclosure.
Reference is being made to Figs. 1A-1D, which are non-limiting examples of different embodiments of the system for monitoring the sedation level of a patient and engaging the patient with a selected engagement protocol based on the monitored sedation level. It is to be noted that the term "sedation level" is interchangeable with the terms "cognitive level", "sedation state" and "cognitive state", and they all refer to a certain sedation/cognitive scale, such as, but not limited to, the Richmond Agitation-Sedation Scale (RASS) score. The RASS score indicates the patient's condition on a scale between "unarousable", in which the patient does not respond to voice or physical stimulation, and "combative", in which the patient is fully aware and is overtly combative or violent.
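For reference, the published RASS levels span the scale mentioned above; a compact lookup (descriptions abbreviated) is:

```python
# Richmond Agitation-Sedation Scale levels (descriptions abbreviated).
RASS_LEVELS = {
    +4: "combative",
    +3: "very agitated",
    +2: "agitated",
    +1: "restless",
     0: "alert and calm",
    -1: "drowsy",
    -2: "light sedation",
    -3: "moderate sedation",
    -4: "deep sedation",
    -5: "unarousable",
}
```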
Fig. 1A exemplifies a system 100 that includes a camera unit 102 that is configured to continuously image and record the eye of the patient and generate eye image data EID indicative of the recorded eye images of the patient. By monitoring the activity of the eye of the patient, the sedation state of the patient can be deduced. This is performed by continuously classifying the recorded eye activity of the patient into defined gestures.
The temporal profile of the eye gestures defines the sedation state of the patient.
The system further includes a communication module 104 that is configured to output a selected communication protocol CP to the patient in response to the sedation state of the patient determined by the monitoring system 100. The communication module 104 comprises a plurality of predetermined communication protocols CPs, and a specific protocol is selected in response to the determined sedation state based on a best-match criterion. Namely, for each specific sedation state, there is a specific communication protocol. The communication protocol can be tailor-made to the patient, namely that personalized content is outputted in the communication protocol CP to the patient. The communication protocol can include various types of communications: some of them are interactive communications, namely communications that require the patient's response, and some of the communication protocols CPs are constituted by a mere output of the communication module that does not require the patient's response.
A processing circuitry 106 is configured to receive the eye image data EID from the camera unit 102 and process it to determine the temporal profile of the eye gestures made by the patient. Based on identification of signatures in the temporal profile of the eye gestures, the sedation level of the patient is determined. Once the sedation level is determined, the processing circuitry is configured to operate the communication module 104 to output a selected communication protocol CP to the patient, based on the determined sedation level of the patient. When the communication protocol CP is outputted, the camera unit 102 proceeds to record the activity of the eye of the patient and generate eye image data. This new eye image data EID is processed by the processing circuitry 106 to determine, by analyzing the temporal profile of the eye gestures made by the patient in a time-window following the output of the communication protocol CP, an updated sedation state of the patient and to identify whether the communication protocol CP is affecting the sedation state of the patient. By analyzing the response of the patient to communication protocols over time, the processing circuitry can learn how to better match the best communication protocol to the patient to achieve the best progress in the sedation scale of the patient.
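The "learning which protocol best matches the patient" step could, as one simple stand-in for the unspecified learning scheme, track the average sedation-scale improvement observed after each protocol output:

```python
def record_outcome(stats, protocol, improvement):
    """Accumulate the sedation-scale improvement observed in the time-window
    following an output of the given protocol.

    stats maps protocol name -> (total improvement, output count).
    """
    total, count = stats.get(protocol, (0.0, 0))
    stats[protocol] = (total + improvement, count + 1)

def best_protocol(stats):
    """Return the protocol with the highest average observed improvement."""
    return max(stats, key=lambda p: stats[p][0] / stats[p][1])
```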
Fig. 1B is another example of the system, differing from that of Fig. 1A by including an input module 108 in the processing circuitry 106. The input module 108 is configured to receive the eye image data EID that is generated by the camera unit 102 and to further receive EEG data EEGD indicative of EEG signals of the patient whose eye activity is being monitored by the system 100. In this example, the EEG data EEGD is generated by an EEG unit that is external to the system 100 and not part of it. The processing circuitry 106 is configured to determine the sedation state of the patient based on the EEG data EEGD and the eye image data EID. Namely, the processing circuitry is configured to assign different weight factors to each data set (the EEG data EEGD and the eye image data EID) based on the amount of information or the assumed sedation state of the patient.
When the patient is at a low sedation score and the patient's eyes are not responsive, the processing circuitry assigns greater weight factors to the EEG data; as the patient's sedation score rises, the weight factors of the eye image data EID rise correspondingly.
Thus, the EEG data EEGD is of great importance when the sedation state cannot be determined by analyzing the eye activity of the patient, namely below a certain sedation score. It is to be noted that the processing circuitry is configured to continuously update the weight factors for each patient so as to generate personalized weight factors. In other words, while there are default weight factors at the beginning of the monitoring for each new patient, the processing circuitry is configured to update the weight factors to be tailor-made for each patient. The processing circuitry may apply a machine learning algorithm to calculate and update the new weight factors.
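As one illustrative stand-in for the unspecified machine-learning update, the default weight profile could be moved toward a per-patient target by exponential smoothing; the learning rate alpha and the target value are assumptions:

```python
def personalize_weights(w_eeg, target_eeg, alpha=0.1):
    """Move the current EEG weight toward a per-patient target weight.

    Exponential smoothing is a simple illustrative update rule; the real
    system may use any ML-based estimator. Returns the updated, normalized
    (w_eeg, w_eye) profile.
    """
    w_eeg = (1.0 - alpha) * w_eeg + alpha * target_eeg
    return w_eeg, 1.0 - w_eeg  # keep the profile normalized
```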
Reference is now being made to Fig. 1C, which is another example of the monitoring system of the present disclosure. This example differs from that of Fig. 1B by including the EEG unit 110 that generates the EEG data EEGD indicative of EEG signals of the monitored patient. The EEG unit 110 is configured to continuously record the brain waves activity of the patient and generate the EEG data EEGD based thereon, which is transmitted to the input module 108 of the processing circuitry 106 to assist in the determination of the sedation level of the patient.
Fig. 1D is another example of the monitoring system of the present disclosure.
The system of this example differs from that of Fig. 1C by including a database 112 that stores a plurality of predetermined signatures of patterns of EEG and eye image data that are each correlated with a specific medical condition of a patient. The predetermined signatures were collected from various patients or were synthetically generated, and each of them is assigned with a medical condition such that the processing circuitry 106 can apply a correlation or matching algorithm on the collected eye image data and EEG data to find a matching signature MS in the database and based on that determine the medical condition of the patient. The processing circuitry 106 can record to the database 112 a new signature NS that is collected from the monitored patient. It is to be noted that the database 112 may be physically connected to the processing circuitry 106 or can be located on a remote cloud server and communicate with the processing circuitry 106 through standard network protocols.
Claims (25)
1. A system for monitoring sedation state of a patient, the system comprising:
a camera unit configured for recording images of an eye of the patient and generating eye image data based thereon;
a communication module operable to output a desired communication protocol;
a processing circuitry that comprises an input module configured to receive EEG data indicative of EEG signals of the patient, the processing circuitry being in data communication with the camera and being operable to
i. receive and process said eye image data and EEG data;
ii. determine, based on at least one of the eye image data and EEG data, the sedation state of the user;
iii. trigger the communication module to output a selected communication protocol in response to the determined sedation state;
iv. analyze said eye image and EEG data and identify signatures indicative of a clinical state of the patient selected from at least one of: pain, thirst, hunger, delirium;
wherein said determination of the sedation state of the user comprises (i) continuously classifying the recorded eye activity of the patient into defined gestures, the temporal profile of the eye gestures defining the sedation state of the patient, and (ii) applying different and varying weight factors to the EEG data and the eye image data based on the amount of information of each data set or the assumed sedation state of the patient.
2. The system of claim 1, comprising an EEG unit configured for recording EEG signals of the patient and generating said EEG data based thereon.
3. The system of claim 1 or 2, wherein the processing circuitry is configured to calculate a sedation score of the patient and to classify the score into two or more score ranges, each range triggers a different communication protocol.
4. The system of claim 3, wherein in at least one range of scores the determination of the sedation state and/or the communication protocol is triggered only based on the EEG data.
5. The system of claim 3 or 4, wherein in at least one range of scores the determination of the sedation state and/or the communication protocol is triggered based on a combination of the eye image data and the EEG data.
6. The system of any one of claims 1-5, wherein the processing circuitry is configured to apply a temporal analysis on the eye image data and the EEG data to determine a correlation between eye movements, brain activity and sedation state.
7. The system of any one of claims 1-6, wherein the processing circuitry is configured to analyze selected time-windows of the eye image data and/or the EEG data following an output of communication protocol to identify patient's response to said communication protocol and to determine an updated sedation state of the patient based on said response.
8. The system of any one of claims 1-7, wherein the processing circuitry is further configured to transmit a signal carrying sedation state data indicative of the sedation state of the patient to a remote unit.
9. The system of any one of claims 1-8, wherein the processing circuitry is further configured to identify eye gestures in said eye image data, said eye gestures either triggering a communication protocol or affecting a selected communication protocol.
10. The system of claim 9, wherein said eye gestures are used for patient's response to a questionnaire.
11. The system of claim 9 or 10, wherein said eye gestures are used for playing audible output, said audible output is selected from certain music and voice recording of relatives.
12. The system of claim 1, wherein the processing circuitry is configured to correlate temporal profiles of said eye image data and/or EEG data with predetermined temporal profiles corresponding to a plurality of signatures that are stored in a database and to identify a correlation that satisfies a certain condition.
13. The system of claim 12, wherein the processing circuitry is configured to generate a personalized signature of the patient upon identifying it and update the database or to store in a memory thereof said personalized signature of the patient.
14. A method for monitoring sedation state of a patient, comprising: receiving and processing (i) eye image data indicative of recorded images of an eye of the patient and (ii) EEG data indicative of EEG signal of the patient; determining based on at least one of the eye image data and EEG data the sedation state of the patient; analyzing said eye image and EEG data and identifying signatures indicative of clinical state of the patient selected from at least one of: pain, thirst, hunger, delirium; and outputting a selected communication protocol in response to the determined sedation state; wherein said determining comprises (i) continuously classifying the recorded eye activity of the patient into defined gestures, the temporal profile of the eye gestures defines the sedation state of the patient, (ii) applying different and varying weight factors to the EEG data and the eye image data based on the amount of information of each data set or the assumed sedation state of the patient.
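The weighting recited in claim 14(ii) can be sketched as a simple convex combination whose weights vary with the amount of usable information in each data set. Scores and information measures are assumed normalised to [0, 1]; the function name and normalisation are illustrative assumptions:

```python
def fuse_sedation_scores(eeg_score, eye_score, eeg_info, eye_info):
    """Combine EEG- and eye-derived sedation estimates with different and
    varying weight factors (claim 14(ii)): each modality is weighted by
    its share of the total available information."""
    total = eeg_info + eye_info
    if total == 0:
        raise ValueError("no usable information in either data set")
    w_eeg = eeg_info / total
    w_eye = eye_info / total
    return w_eeg * eeg_score + w_eye * eye_score
```

When one modality carries no information (e.g. the eye is closed and no gestures are detectable), its weight collapses to zero and the other modality alone determines the estimate.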
15. The method of claim 14, comprising calculating a sedation score of the patient and classifying the score into two or more score ranges, each range triggers a different output of communication protocol.
16. The method of claim 15, wherein in at least one range of scores the determination of the sedation state and/or the communication protocol is triggered only based on the EEG data.
17. The method of claim 16, wherein in at least one range of scores the determination of the sedation state and/or the communication protocol is triggered based on a combination of the eye image data and the EEG data.
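Claims 15-17 recite classifying a sedation score into ranges, each triggering a different communication-protocol output, with some ranges relying on EEG data only and others on a combination of EEG and eye image data. A sketch under assumed, illustrative range boundaries and protocol names:

```python
def select_protocol(score):
    """Map a sedation score to a score range and its communication-protocol
    output (claims 15-17). The 0.3/0.7 boundaries and protocol names are
    illustrative assumptions."""
    if score < 0.3:
        # Deep sedation: determination relies on EEG data only (claim 16).
        return "eeg_only_monitoring"
    elif score < 0.7:
        # Intermediate: combine eye image data and EEG data (claim 17).
        return "combined_eeg_eye_protocol"
    else:
        # Light sedation / awake: interactive eye-gesture protocol.
        return "eye_gesture_questionnaire"
```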
18. The method of any one of claims 14-17, comprising applying temporal analysis on the eye image data and the EEG data and determining a correlation between eye movements, brain activity and sedation state.
19. The method of any one of claims 14-18, comprising analyzing selected time-windows of the eye image data and/or the EEG data following an output of communication protocol to identify patient's response to said communication protocol and to determine an updated sedation state of the patient based on said response.
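The time-window analysis of claim 19 can be sketched as inspecting a bounded window of activity samples following the protocol output and testing whether any response crosses a threshold. The window length, threshold and all names are illustrative assumptions:

```python
def response_in_window(timestamps, activity, t_output, window_s=10.0, threshold=0.5):
    """Analyze the selected time-window following a communication-protocol
    output (claim 19) and decide whether the patient responded: True if
    any activity sample inside the window reaches the threshold."""
    in_window = [a for t, a in zip(timestamps, activity)
                 if t_output <= t < t_output + window_s]
    if not in_window:
        # No samples recorded in the window: treat as no response.
        return False
    return max(in_window) >= threshold
```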
20. The method of any one of claims 14-19, comprising transmitting a signal carrying sedation state data indicative of the sedation state of the patient to a remote unit.
21. The method of any one of claims 14-20, further comprising identifying eye gestures in said eye image data and either triggering a communication protocol or affecting the selected communication protocol based on said eye gestures.
22. The method of claim 21, wherein said eye gestures are used for patient's response to a questionnaire.
23. The method of claim 21 or 22, wherein said eye gestures are used for playing audible output, said audible output is selected from certain music and voice recording of relatives.
24. The method of any one of claims 14-23, comprising correlating temporal profiles of said eye image data and/or EEG data with predetermined temporal profiles corresponding to a plurality of signatures that are stored in a database and identifying a correlation that satisfies a certain condition.
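The correlation condition of claims 24-25 can be sketched as a Pearson correlation of the measured temporal profile against each stored signature, returning a match only when the best correlation satisfies a threshold. The database layout, 0.9 threshold and signature names are illustrative assumptions:

```python
import statistics

def match_signature(profile, signature_db, min_corr=0.9):
    """Correlate a temporal profile with predetermined profiles stored in a
    database (claim 24) and return the name of the best-matching signature
    if its correlation satisfies the condition, else None."""

    def pearson(x, y):
        # Pearson correlation coefficient; 0 if either series is constant.
        mx, my = statistics.fmean(x), statistics.fmean(y)
        num = sum((a - mx) * (b - my) for a, b in zip(x, y))
        den = (sum((a - mx) ** 2 for a in x) *
               sum((b - my) ** 2 for b in y)) ** 0.5
        return num / den if den else 0.0

    name, sig = max(signature_db.items(), key=lambda kv: pearson(profile, kv[1]))
    return name if pearson(profile, sig) >= min_corr else None
```

A personalized signature (claim 25) would simply be added to `signature_db` under the patient's key once generated.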
25. The method of claim 24, comprising generating a personalized signature of the patient and updating said database or storing in a memory said personalized signature.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IL285071A IL285071B2 (en) | 2021-07-22 | 2021-07-22 | System and method for monitoring sedation level of a patient based on eye and brain activity tracking |
PCT/IL2022/050789 WO2023002488A1 (en) | 2021-07-22 | 2022-07-21 | A system for communication with a patient based on the determined sedation state |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IL285071A IL285071B2 (en) | 2021-07-22 | 2021-07-22 | System and method for monitoring sedation level of a patient based on eye and brain activity tracking |
Publications (2)
Publication Number | Publication Date |
---|---|
IL285071A true IL285071A (en) | 2023-01-01 |
IL285071B2 IL285071B2 (en) | 2023-05-01 |
Family
ID=84901817
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
IL285071A IL285071B2 (en) | 2021-07-22 | 2021-07-22 | System and method for monitoring sedation level of a patient based on eye and brain activity tracking |
Country Status (1)
Country | Link |
---|---|
IL (1) | IL285071B2 (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7982618B2 (en) * | 2007-01-29 | 2011-07-19 | Denso Corporation | Wakefulness maintaining apparatus and method of maintaining wakefulness |
WO2015039689A1 (en) * | 2013-09-19 | 2015-03-26 | Umc Utrecht Holding B.V. | Method and system for determining a parameter which is indicative for whether a patient is delirious |
KR20180131045A (en) * | 2017-05-31 | 2018-12-10 | 박영준 | Drowsiness preventing apparatus integrally formed by multi-sensored detectionalarm means and central processing means while driving a car and control methed therefor |
WO2019111257A1 (en) * | 2017-12-07 | 2019-06-13 | Eyefree Assisting Communication Ltd. | Communication methods and systems |
Also Published As
Publication number | Publication date |
---|---|
IL285071B2 (en) | 2023-05-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11055575B2 (en) | Intelligent health monitoring | |
US20180356887A1 (en) | Methods and apparatus for identifying potentially seizure-inducing virtual reality content | |
US10515631B2 (en) | System and method for assessing the cognitive style of a person | |
US20170143246A1 (en) | Systems and methods for estimating and predicting emotional states and affects and providing real time feedback | |
US20180285528A1 (en) | Sensor assisted mental health therapy | |
US20190108841A1 (en) | Virtual health assistant for promotion of well-being and independent living | |
US9113837B2 (en) | Drowsiness detection method and associated device | |
US20210186370A1 (en) | Automated and objective symptom severity score | |
EP3557479A1 (en) | Adaptive artificial intelligence system for identifying behaviors associated with mental illness and modifying treatment plans based on emergent recognition of aberrant reactions | |
JP2020173787A (en) | Information processing apparatus, information processing system, information processing method, and information processing program | |
US11904179B2 (en) | Virtual reality headset and system for delivering an individualized therapy session | |
US9621847B2 (en) | Terminal, system, display method, and recording medium storing a display program | |
IL285071A (en) | System and method for monitoring sedation level of a patient based on eye and brain activity tracking | |
CN106649365A (en) | Content screening method capable of actively protecting users | |
CN113288145A (en) | Teaching device and method for training emotion control capability | |
US20230210444A1 (en) | Ear-wearable devices and methods for allergic reaction detection | |
US11547366B2 (en) | Methods and apparatus for determining biological effects of environmental sounds | |
WO2023002488A1 (en) | A system for communication with a patient based on the determined sedation state | |
CN113425293B (en) | Auditory dyscognition disorder evaluation system and method | |
CN117915823A (en) | System for communicating with a patient based on a determined sedation state | |
CN110786859A (en) | Emergency alarm method, device and system | |
US11432773B2 (en) | Monitoring of diagnostic indicators and quality of life | |
WO2020209171A1 (en) | Iinformation processing device, information processing system, information processing method, and information processing program | |
WO2021260848A1 (en) | Learning device, learning method, and learning program | |
US20240000342A1 (en) | Method for decreasing meltdown incidence and severity in neurodevelopmental disorders |