WO2022061403A1 - Devices and processing systems configured to enable assessment of a condition of a human subject based on sensorimotor gating of blinks

- Publication number: WO2022061403A1 (application PCT/AU2021/051106)
- Authority: WIPO (PCT)
Classifications
- A61B5/4076: Diagnosing or monitoring particular conditions of the nervous system
- A61B5/0022: Monitoring a patient using a global network, e.g. telephone networks, internet
- A61B5/004: Imaging apparatus adapted for image acquisition of a particular organ or body part
- A61B5/0077: Devices for viewing the surface of the body, e.g. camera, magnifying lens
- A61B5/0082: Diagnosis using light, adapted for particular medical purposes
- A61B5/1103: Detecting eye twinkling
- A61B5/1114: Tracking parts of the body
- A61B5/1128: Measuring movement of the body or parts thereof using image analysis
- A61B5/1176: Recognition of faces
- A61B5/165: Evaluating the state of mind, e.g. depression, anxiety
- A61B5/18: Devices for psychotechnics, for vehicle drivers or machine operators
- A61B5/4005: Evaluating the sensory system
- A61B5/4047: Evaluating the condition of afferent nerves (nerves that relay impulses to the central nervous system)
- A61B5/4052: Evaluating the condition of efferent nerves (nerves that relay impulses from the central nervous system)
- A61B5/4064: Evaluating the brain
- A61B5/4082: Diagnosing or monitoring movement diseases, e.g. Parkinson, Huntington or Tourette
- A61B5/4088: Diagnosing or monitoring cognitive diseases, e.g. Alzheimer, prion diseases or dementia
- A61B5/4094: Diagnosing or monitoring seizure diseases, e.g. epilepsy
- A61B5/6803: Sensors in head-worn items, e.g. helmets, masks, headphones or goggles
- A61B5/6893: Sensors mounted in cars
- A61B5/6898: Sensors in portable consumer electronic devices, e.g. music players, telephones, tablet computers
- A61B5/7235: Details of waveform analysis
- A61B5/7267: Classification of physiological signals or data, involving training the classification device
- A61B5/7275: Determining trends in physiological measurement data; predicting development of a medical condition
- A61B5/7282: Event detection, e.g. detecting unique waveforms indicative of a medical condition
- A61B5/746: Alarms related to a physiological condition, e.g. setting alarm thresholds or avoiding false alarms
- A61B2503/22: Motor vehicle operators, e.g. drivers, pilots, captains
- A61B2576/02: Medical imaging apparatus involving image processing or analysis, adapted for a particular organ or body part
- G06F18/00: Pattern recognition
- G08B21/02: Alarms for ensuring the safety of persons
- G16H10/60: ICT for patient-specific data, e.g. electronic patient records
- G16Y10/60: IoT for healthcare and welfare
Definitions
- The present invention relates, in various embodiments, to devices and processing systems configured to enable assessment of a condition of a human subject based on sensorimotor gating of blinks, without a need to apply exogenous sensory stimuli.
- Embodiments are configured to assist in a wide range of human condition monitoring applications, including clinical testing/assessment (such as in a controlled testing environment) and in-field testing/assessment (such as testing via in-vehicle driver monitoring systems). While some embodiments are described herein with particular reference to those and other applications, it will be appreciated that the invention is not limited to such fields of use and is applicable in broader contexts.
- ADHD (Attention-Deficit/Hyperactivity Disorder).
- Psychiatric conditions such as schizophrenia, depression and obsessive-compulsive disorder.
- Sensorimotor gating function analysis is also of interest for a range of further subjects and conditions, including (but not limited to): post-anaesthetic patients; effects of chemotherapy; effects of sleep deprivation; patients with sleep disorders (such as obstructive sleep apnoea and narcolepsy); morbid obesity; addictive behaviour; pathological gambling; and as an indicator of particular drugs (for example narcotics) being present in a subject's system.
- Any one of the terms "comprising", "comprised of" or "which comprises" is an open term that means including at least the elements/features that follow, but not excluding others.
- Thus, the term "comprising", when used in the claims, should not be interpreted as being limited to the means or elements or steps listed thereafter.
- For example, the scope of the expression "a device comprising A and B" should not be limited to devices consisting only of elements A and B.
- Any one of the terms "including", "which includes" or "that includes", as used herein, is also an open term that means including at least the elements/features that follow the term, but not excluding others. Thus, "including" is synonymous with, and means, "comprising".
- The term "exemplary" is used in the sense of providing examples, as opposed to indicating quality. That is, an "exemplary embodiment" is an embodiment provided as an example, as opposed to necessarily being an embodiment of exemplary quality.
- FIG. 1 illustrates a system according to one embodiment.
- FIG. 2A illustrates a blepharometric detection system according to one embodiment.
- FIG. 2B illustrates a blepharometric detection system according to one embodiment.
- FIG. 2C illustrates a blepharometric detection system according to one embodiment.
- FIG. 3 illustrates a method according to one embodiment.
- FIG. 4A illustrates muscles of a right eye eyelid.
- FIG. 4B shows two frequency histograms of blink-free intervals.
- FIG. 5A illustrates an in-vehicle blepharometric data collection/monitoring system according to one embodiment.
- FIG. 5B illustrates an in-vehicle blepharometric data collection/monitoring system according to one embodiment.
- FIG. 5C illustrates an in-vehicle blepharometric data collection/monitoring system according to one embodiment.
- FIG. 5D illustrates an in-vehicle blepharometric data collection/monitoring system according to one embodiment.
- FIG. 6 illustrates a blepharometric data monitoring framework according to one embodiment.
- Embodiments include methods of assessing a neurological condition of a human subject, which include collecting data representative of characteristics of blinks performed by a human subject, wherein the blinks occur without known application of exogenous sensory stimuli, and processing the data thereby to infer a state of Sensorimotor Gating Function (SGF). For example, this may indicate that SGF is inhibited relative to an objective baseline.
- Technology described herein allows for continuous assessment of SGF without a need for exogenous sensory stimuli. This allows for improved SGF testing in a clinical environment, and also for SGF assessment to be taken into field environments.
- Embodiments discussed below allow for SGF measurements in an in-vehicle environment (during normal vehicle operation), thereby to provide an assessment of whether a vehicle operator is impaired. This may be used in the context of identifying impairments associated with cognitive load, stress, driver distraction, reduced alertness/drowsiness, substance intoxication, epilepsy, and neurodegenerative diseases such as Alzheimer's, Parkinson's and mild cognitive impairment, among a range of other conditions.
- The term "blepharometric data" refers to data that describes movements of a human subject's eyelid (or eyelids). Eyelid movements are commonly categorised as "blinks" or "partial blinks".
- The term "blepharometric data" is used to distinguish the technology described herein from other technologies which merely detect the presence of blinks for various purposes.
- The technology herein is focussed on analysing eyelid movement as a function of time, typically measured as an amplitude. This data may be used to infer the presence of what would traditionally be termed "blinks"; however, it is attributes of "events" and other parameters identifiable in eyelid movements which are of primary interest to the technologies described herein.
- Such features are referred to as "blepharometric artefacts", with such artefacts being identifiable by application of various processing algorithms to a data set that describes eyelid position as a function of time (i.e. blepharometric data). As discussed further below, particular blepharometric artefacts are used by embodiments of the subject technology thereby to allow for assessment of SGF.
- FIG. 1 illustrates an example system for assessing a neurological condition of a human subject 101.
- The system includes an eyelid position monitoring sensor 102, for example an IR-based system (such as IR reflectance oculography spectacles) or a camera-based system (for example an in-vehicle monitoring system), which is configured to enable detection of points in time at which a blink commences and concludes. For example, this may be achieved by using eyelid position to derive a value for amplitude as a function of time, and from this determining times at which individual blinks commence and conclude.
- the blinks occur without known application of exogenous sensory stimuli. That is, the blinks are spontaneous blinks, intentional blinks, and incidental startle blinks (resulting from exogenous stimuli which are not associated with the assessment process).
- the blinks may also include self-stimulated blinks, which are a sub-category of reflex blinks which occur in response to a primary blink (for example as a result of the eyelid contacting the eye during the primary blink).
- Monitoring sensor 102 is coupled to a processing system 103, which is configured to identify characteristics of blinks, for example based on a data stream representative of amplitude as a function of time which is provided by or based on data provided by monitoring sensor 102. In preferred embodiments, this includes determining times representative of events at which individual blinks commence and conclude.
- An assessment system 103 is configured for further processing data from the processing system thereby to infer a state of Sensorimotor Gating Function (SGF). This is in some embodiments calculated based on analysis of Blink Free Intervals (BFIs), which are intervals between a defined blink completion time and a following blink commencement time (as distinct from blink rate, which is calculated from blink start to blink start, or similar).
- blink-free intervals are used to identify blinks which are predicted to be self-stimulated blinks (which are typically prevented via a sensorimotor gating mechanism), and use the presence, characteristics and/or frequency of those blinks to determine SGF.
- one approach includes:
- Another approach includes:
- the present disclosure relates to approaches whereby it is recognised that spontaneous blinks involve many of the same mechanisms that are involved in reflex blinks, apart from spontaneous blinks not involving exogenous sensory inputs (externally applied stimuli).
- Each blink has three components - a period when the eyelids are closing, another when the eyelids are stationary and ‘closed’ (whether in direct contact or not), and a third when the eyelids are reopening.
- Blinks involve coordinated eyelid closing and reopening movements, mainly (but not only) due to the actions of two muscles in each eye, Levator palpebrae superioris (LPS) and Orbicularis oculi (OO) (as shown in FIG. 4, which shows deep and superficial layers of muscles in the eyelids of the right eye).
- the ligaments which attach each end of LPS to the bony orbit are positioned such that, when the LPS and OO muscles are relaxed, the eyelids remain closed. This occurs during sleep. During wakefulness the eyelids are open most of the time, as a result of the tonic activation of LPS, and with the help of Mueller’s muscle.
- Each blink begins with the phasic inhibition of LPS. A few milliseconds later, there is phasic activation of OO to close the eyelids. This mainly involves activation of the ‘fast-twitch’ palpebral fibres of OO (rather than its orbital fibres). This activation of OO is usually initiated by the blink generator.
- the upper tarsal muscle helps LPS, in its tonic activation mode, to maintain the elevated position of the upper eyelid.
- Sensorimotor gating is a process by which a neural system in the brain screens or gates sensory inputs that would otherwise interfere with processing of, and/or responding to, the most salient incoming information.
- In the case of eye blinks, this means that a blink that would otherwise be triggered when the upper eyelid crosses the cornea and stimulates the nerve endings in it is either facilitated or inhibited, depending on when it occurs.
- the source of the sensory stimulus that triggers a reflex blink in that case would be endogenous (arising from within the system itself), rather than from exogenous sources (introduced from outside).
- Direct stimulation of the cornea, for example by a brief puff of air directed at it, or by a solid object such as a piece of dust touching it, produces a reflex blink that is part of the corneal reflex.
- the epithelium of the cornea has many sensors in it, which are mainly free nerve endings that respond to touch and pain.
- the cornea is 300-600 times more sensitive to such stimuli than the skin is.
- the latter occurs in the absence of an exogenous stimulus from which the cornea would otherwise need to be protected by eyelid closure.
- This might be called a self-stimulatory blink. It could be considered as a separate category, or as a sub-category of reflex blinks.
- the eyelid movements during these self-stimulatory blinks are subject either to augmentation or inhibition, depending on when they occur, by a process of sensorimotor gating.
- the present inventors have appreciated that the contraction of OO muscles is either augmented or inhibited by sensorimotor gating.
- the phasic contraction of OO during a reflex blink has two components, referred to as R1 and R2.
- the former (R1) has a very short latency, of the order of 8-12 milliseconds. It involves a very short neuronal circuit with only three neurons.
- R2 has a longer latency (about 20-30 milliseconds) and involves interneurons in a longer neuronal circuit.
- these two neuronal circuits act in parallel.
- the sensorimotor gating process changes from being facilitatory of the first self-stimulatory feedback and augmented contraction of OO due to the eyelid closing movement itself, to being inhibitory of OO contraction during eyelid reopening.
- the inhibition of OO contraction during eyelid reopening reaches its maximum about 60-120 milliseconds after the first episode of sensory input from the corneas and eyelids.
- the efficiency of that sensorimotor inhibition then declines progressively and ends after two or three seconds (i.e. blinks become disinhibited after a period of inhibition).
- the present inventors suggest that, by appropriate analysis of a series of blink- free intervals, it is possible to characterize this sensorimotor process without applying any exogenous stimuli, such as those applied during the assessment of pre-pulse inhibition.
- the ‘blink generator’ presumably begins its integration process again, ready to fire and initiate the next blink after an ‘appropriate’ period of inhibition during the gating process.
- the interval between the end of one eyelid reopening movement and the start of the next eyelid closing movement is referred to as the blink-free interval (BFI).
- IBIs: inter-blink intervals
- the present inventors have re-analysed the results of earlier sleep deprivation experiments, thereby to gain improved understanding of blink-free intervals in healthy adults, and the effects of sleep deprivation on those intervals.
- FIG. 5 shows two frequency histograms of blink-free intervals recorded from 18 healthy subjects when alert, after their ‘normal’ night’s sleep, and when drowsy, after missing a night’s sleep. They performed a 15-minute visual reaction-time test in those two conditions while their eyelid movements were recorded by an Optalert system of infrared reflectance blepharometry. There were 4730 blinks recorded in the alert condition and 7380 in the drowsy condition.
- the sensorimotor gating of blinks is sometimes less than complete for blink-free intervals less than 50 milliseconds. This allows what can be termed ‘rapidly recurring blinks’ or ‘blink oscillations’ to occur intermittently. These events occurred somewhat more frequently in the drowsy condition.
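The counting of ‘rapidly recurring blinks’ described above can be sketched as follows. This is a minimal illustration only: the helper name and the sample values are hypothetical, while the 50-millisecond threshold is taken from the preceding paragraph.

```python
def blink_oscillation_rate(bfis_ms):
    """Proportion of blink-free intervals shorter than 50 ms, which may
    indicate incomplete sensorimotor gating ('blink oscillations')."""
    if not bfis_ms:
        return 0.0
    short = sum(1 for b in bfis_ms if b < 50)
    return short / len(bfis_ms)

# Hypothetical BFI series in milliseconds:
rate = blink_oscillation_rate([40, 500, 2500, 30])
```

A higher proportion in a drowsy recording than in an alert recording from the same subject would be consistent with the histogram results described above.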
- the present inventors, based on their research of self-stimulatory blinks occurring during spontaneous blinks (due to self-stimulation of the corneal reflex by the eyelid moving across the cornea), have identified methods to assess SGF without the use of exogenous sensory stimuli. That is, the present inventors have devised devices and methods for the continuous assessment of sensorimotor gating of endogenous eyeblinks without using exogenous sensory stimuli that would cause reflex blinks as part of the startle response.
- One embodiment provides a method of assessing a neurological condition of a human subject, the method including:
- the blinks occur without “known” application of exogenous sensory stimuli in the sense that the testing regime does not intentionally provide exogenous sensory stimuli to trigger the blinks. It is appreciated that there may be instances where exogenous sensory stimuli occurs during an assessment period (for example a loud ambient noise, flashing light, or the like). In some embodiments technological means (for example noise and/or light sensors, and optionally other sensors such as accelerometers) are used to assist in identifying presence of unintended exogenous sensory stimuli.
- Processing the data thereby to infer a state of Sensorimotor Gating Function may include identifying presence of a reflex blink following a primary blink. For example, conditions are set to differentiate between a self-stimulated reflex blink following the primary blink, and a subsequent spontaneous (or potentially exogenously stimulated) blink following the primary blink.
- the method includes processing the data thereby to infer a state of SGF via analysis of Blink Free Intervals (BFIs).
- the analysis of blink free intervals includes:
- SGC range: 250 ms ≤ BFI ≤ 750 ms
- non-SGC range: 2 s ≤ BFI ≤ 3 s.
- Other ranges may be used, for instance selected based on an understanding that BFIs between about 2 milliseconds and 2 seconds are controlled mainly by the sensorimotor gating mechanism, whereas BFIs longer than about 2 seconds are controlled mainly by the ‘blink generator’.
- the series of blink free intervals is preferably between 50 and 2,000 blinks.
- a rolling count is used, thereby to continually adjust measurements of SGF based on latest BFI input values.
- the series is measured based on a time period, rather than a number of blinks.
- SGF is measured by way of a ratio between: (i) a count of blinks in the SGC range; and (ii) a count of blinks in the non-SGC range. This ratio is calculated over the series, which as noted above may be defined by way of a number of blinks or a period of time.
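By way of illustration only, the ratio-based measurement described above might be sketched as follows. Function and constant names are hypothetical; the numeric bounds follow the example SGC and non-SGC ranges given above.

```python
SGC_RANGE = (0.250, 0.750)   # seconds; example SGC range from above
NON_SGC_RANGE = (2.0, 3.0)   # seconds; example non-SGC range from above

def sgf_ratio(bfis_s):
    """Ratio of blinks in the SGC range to blinks in the non-SGC range,
    calculated over a series of blink-free intervals (in seconds)."""
    sgc = sum(1 for b in bfis_s if SGC_RANGE[0] <= b <= SGC_RANGE[1])
    non_sgc = sum(1 for b in bfis_s if NON_SGC_RANGE[0] <= b <= NON_SGC_RANGE[1])
    if non_sgc == 0:
        # Degenerate series: no blinks attributable to the blink generator.
        return float('inf') if sgc else 0.0
    return sgc / non_sgc
```

The series passed in would be drawn from a rolling count or a fixed time period, as described above.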
- the process of calculating an SGF value from IBI values may include:
- Maintaining a count of IBI values within a first range, for example a range of IBI representative of self-stimulated reflex blinks resulting from compromised SGF. This may be termed a Sensorimotor Gating Controlled (SGC) range.
- the SGC range may be segmented into a plurality of subranges, thereby to differentiate between IBIs in an early part of the SGC range and a later part (or parts) of the SGC range. These may be weighted for the purpose of calculating the SGF value, thereby to add additional importance to those occurring earlier in the SGC range (which are representative of a potentially higher level of sensorimotor gating function compromise).
- the SGF value is compared against a benchmark value, for example a benchmark derived for the subject as a particular individual (optionally based on past monitoring), for a hypothetical individual having corresponding attributes to the subject (for example demographic attributes), or a general baseline.
- the baseline may be a variable baseline which accounts for other factors which are known to affect SGF.
- Analysis of SGF as described herein can be compared with existing forms of analysis based on other blink characteristics (for example those relating to characteristics of blink amplitudes and/or velocities), thereby to assist in improving robustness of overall assessment programs. It should be noted that the same sensor hardware can be used for both forms of analysis.
- FIG. 3 illustrates a methodology which is relevant to a range of embodiments discussed below.
- This methodology is in some cases performed via software modules executing at a single computing device, and in other cases performed via software modules executing across a plurality of connected devices, for example including local devices (such as computing devices housed in a vehicle and/or user’s mobile devices such as smartphones) and Internet-connected server devices (also referred to as “cloud” components).
- a main outcome of the method of FIG. 3 is the generation of a continuous SGF value, which in this example is an SGF value calculated based on a set number of most recent IBI values.
- alternate processes may be performed to achieve the same or a corresponding result.
- Block 301 represents a process including commencing a monitoring operation.
- this is achieved via a camera system having an image capture component that is positioned to capture a zone in which a subject’s face is predicted to be positioned.
- this may include:
- Vehicles including passenger vehicles or operator-only vehicles, wherein the image capture component is positioned to capture a region in which an operator’s face is predicted to be contained during normal operation.
- the image capture component may include a camera mounting in or adjacent a dashboard or windscreen.
- Vehicles in the form of passenger vehicles, wherein the image component is positioned to capture a region in which a passenger’s face is predicted to be contained during normal operation.
- the image capture component may include a camera mounting in or adjacent a dashboard or windscreen, the rear of a seat (including a seat headrest), and so on.
- Mass transport vehicles including passenger trains and/or aircraft, wherein the image component is positioned to capture a region in which a passenger’s face is predicted to be contained during normal operation.
- the image capture component may be mounted in the rear of a seat (including a seat headrest), optionally in a unit that contains other electronic equipment such as a display monitor.
- Seating arrangements such as theatres, cinemas, auditoriums, lecture theatres, and the like. Again, mounting image capture components in the rear of seats is an approach adopted in some embodiments.
- other hardware is used for the purpose of monitoring, for example infrared reflectance oculography spectacles and other wearable devices capable of monitoring eyelid movements.
- the monitoring process preferably includes a process which measures eyelid position (amplitude) as a function of time (for at least one upper eyelid). This is differentiated from processes which merely look for an open or closed state. This is because, as described herein, SGF is calculated based on IBI, and measuring IBI requires accurate information regarding commencement and completion of blink movements.
- the monitoring process of block 301 provides a stream of blepharometric data, which is processed thereby to identify artefacts. This may be real time, substantially in real time, or with a delay. In some embodiments data representative of eyelid amplitude as a function of time is subjected to one or more pre-processing operations prior to the process of block 302 onwards, for example including filtering, upscaling, or the like. These may be used thereby to improve detection of eyelid closure commencement events and eyelid reopening completion events (i.e. the “start” and “end” of each blink, based on which IBI is calculated).
- Block 302 represents a process including detecting an eyelid re-opening completion event (alternately termed a “blink completion event”) in the blepharometric data stream, and a time associated with that event. This event may be determined via one or more of amplitude value, eyelid velocity, eyelid acceleration, and/or other factors, thereby to identify an objective point on an amplitude-time curve which represents an eyelid reopening completion event.
- Block 303 represents a process including detecting an eyelid closure commencement event (alternately termed a “blink commencement event”) in the blepharometric data stream, and a time associated with that event. This event is also determined via one or more of amplitude value, eyelid velocity, eyelid acceleration, and/or other factors, thereby to identify an objective point on an amplitude-time curve which represents an eyelid closure commencement event.
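A highly simplified sketch of the event detection of blocks 302 and 303 follows, using only an amplitude threshold. Real implementations, as the surrounding text notes, may combine amplitude, velocity and acceleration criteria; all names and the threshold value here are illustrative assumptions.

```python
def detect_blink_events(amplitude, fs, closed_frac=0.8):
    """Return (commencement_time, completion_time) pairs, in seconds, from
    an eyelid-amplitude trace sampled at fs Hz. A blink is taken to start
    when amplitude drops below closed_frac of the open baseline, and to
    end (reopening completion) when it recovers above that threshold."""
    baseline = max(amplitude)          # crude 'fully open' reference
    thresh = closed_frac * baseline
    events, start = [], None
    for i, a in enumerate(amplitude):
        if start is None and a < thresh:
            start = i / fs             # eyelid closure commencement event
        elif start is not None and a >= thresh:
            events.append((start, i / fs))  # eyelid reopening completion
            start = None
    return events
```

From consecutive pairs, the interval between one blink’s completion time and the next blink’s commencement time gives the IBI value determined at block 304.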
- Block 304 represents a process including determining a new IBI value as the time elapsed between the eyelid re-opening completion event at 302 and the eyelid closure commencement event at 303.
- alternate methods may be used to calculate IBIs from the eyelid re-opening completion event at 302 and the eyelid closure commencement event at 303, optionally including starting a timer when the eyelid re-opening completion event occurs and stopping the timer when the eyelid closure commencement event occurs. It will be appreciated that a range of software approaches may be configured to enable accurate and efficient determination of IBIs from the stream of amplitude data.
- an IBI buffer defines a set of IBI values which are used to calculate an SGF value, and the IBI buffer is configured to store a maximum number of IBIs. This maximum number is in some instances an absolute number of blinks (for example between 50 and 2,000, for example 100 or 1,000), and in other cases is defined by a time window over which the IBI buffer is filled (for example between 5 minutes and 20 minutes). In some embodiments multiple IBI buffers are used thereby to enable calculation of a short-term continuous SGF value and one or more longer-term continuous SGF values (for example 100 blinks and 1,000 blinks).
- the new IBI value is added to the buffer at block 306 and the process loops to block 302 to collect more IBI values (in this example an SGF value is only calculated once the buffer is full; in other embodiments the SGF value may be calculated from an unfilled buffer and/or just-filled buffer).
- the process also moves to block 308 in addition to looping.
- the process moves to block 307 at which the new IBI value is added to the buffer, and the oldest IBI value in the buffer is discarded from the buffer. In this manner, if the buffer is configured to store X IBI values, the buffer continues to store the most recent X IBI values.
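The rolling-buffer behaviour of blocks 306 and 307 can be sketched as follows. This is an illustrative sketch only; the class and parameter names are hypothetical.

```python
from collections import deque

class IBIBuffer:
    """Fixed-size rolling store of the most recent IBI values: once full,
    each newly added value displaces the oldest, so the buffer always
    holds the latest X IBIs, as described for blocks 306 and 307."""
    def __init__(self, max_ibis=100):
        self._buf = deque(maxlen=max_ibis)

    def add(self, ibi):
        self._buf.append(ibi)   # deque silently discards the oldest entry

    def full(self):
        return len(self._buf) == self._buf.maxlen

    def values(self):
        return list(self._buf)
```

Multiple such buffers of different lengths (for example 100 and 1,000 IBIs) could support the short-term and longer-term continuous SGF values mentioned above.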
- Block 308 represents calculation of an SGF value based on the current IBI buffer.
- the calculation process is performed each time a new IBI value is added to the buffer.
- calculation of a new SGF value may be performed less frequently, for example based on a defined time interval (for example once every X seconds), or each time another Y IBI values are added (for example where 1 ≤ Y ≤ 10).
- the calculation of an SGF value at block 308 may be based on a wide range of metrics.
- these metrics are representative of the extent to which SGF is compromised, as indicated by presence of blinks that are predicted to be self-stimulated reflex blinks following spontaneous blinks (or in some cases startle blinks caused by incidental conditions in a monitoring environment).
- the process of calculation of an SGF value may include:
- Maintaining a count of IBI values within a first range, for example a range of IBI representative of self-stimulated reflex blinks resulting from compromised SGF. This may be termed a Sensorimotor Gating Controlled (SGC) range.
- Maintaining a count of IBI values within a second range, for example a range of IBI representative of blinks resulting from mechanisms other than self-stimulation via a primary blink (for example blinks generated by the “blink generator” in the conventional manner). This may be termed a non-Sensorimotor Gating Controlled (non-SGC) range.
- the SGC range may be segmented into a plurality of subranges, thereby to differentiate between IBIs in an early part of the SGC range and a later part (or parts) of the SGC range. These may be weighted for the purpose of calculating the SGF value, thereby to add additional importance to those occurring earlier in the SGC range (which are representative of a potentially higher level of sensorimotor gating function compromise).
- the SGF value is compared against a benchmark value, for example a benchmark derived for the subject as a particular individual (optionally based on past monitoring), for a hypothetical individual having corresponding attributes to the subject (for example demographic attributes), or a general baseline.
- the baseline may be a variable baseline which accounts for other factors which are known to affect SGF.
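A sketch of the weighted-subrange scoring and benchmark comparison described above follows. The subrange boundaries, weights and tolerance parameter are all hypothetical examples, not values from this disclosure.

```python
# Hypothetical subranges (seconds) and weights: earlier parts of the SGC
# range are weighted more heavily, per the discussion above.
SUBRANGES = [((0.250, 0.400), 3.0),
             ((0.400, 0.550), 2.0),
             ((0.550, 0.750), 1.0)]

def weighted_sgc_score(ibis_s):
    """Weighted count of IBIs falling in each SGC subrange."""
    score = 0.0
    for (lo, hi), weight in SUBRANGES:
        score += weight * sum(1 for x in ibis_s if lo <= x < hi)
    return score

def compare_to_baseline(sgf_value, baseline, tolerance=0.2):
    """Flag SGF values deviating from an individual, demographic or general
    baseline by more than a fractional tolerance (hypothetical parameter)."""
    return abs(sgf_value - baseline) > tolerance * baseline
```

The baseline argument could equally be a variable baseline adjusted for other factors known to affect SGF, as noted above.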
- the method of FIG. 3 includes a preliminary pre-monitoring process including identifying a subject from whom the blepharometric data collected from the monitoring originates.
- This optionally includes: • Credential-based identification, for example via a login.
- This may include pairing of a personal device (such as a smartphone) to blepharometric data monitoring system (e.g. pairing a phone to an in-vehicle system), inputting login credentials via an input device, or other means.
- Biometric identification. For example, in some embodiments described herein a camera-based blepharometric data monitoring system utilises image data to additionally perform facial recognition functions, thereby to uniquely identify human subjects.
- Identification of the subject is relevant for the purposes of comparing current blepharometric data with historical blepharometric data for the same subject.
- an analysis system has access to a database of historical blepharometric data for one subject (for example where the system is installed in a vehicle and monitors only a primary vehicle owner/driver) or multiple subjects (for example a vehicle configured to monitor multiple subjects, or a cloud-hosted system which receives blepharometric data from a plurality of networked systems, as described further below). This may be used, by way of example, to derive a baseline SGF value for an individual, against which current SGF values are benchmarked.
- the method of FIG. 3 may additionally include determination of a range of other blepharometric data artefacts.
- the artefacts may include:
- Blink total duration (BPD)
- Amplitude to velocity ratios (AVRs)
- the “current period” may be either a current period defined by a current user interaction with a blepharometric data monitoring system, or a subset of that period.
- the “current period” is in one example defined as a total period of time for which a user operates the vehicle and has blepharometric data monitored, and in another embodiment is a subset of that time.
- multiple “current periods” are defined, for example using time block samples of between two and fifteen minutes (which are optionally overlapping), thereby to compare blepharometric data activity during periods of varying lengths (which may be relevant for differing neurological conditions, which, in some cases, present themselves based on changes in blepharometric data over a given period of time).
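The use of multiple, optionally overlapping, “current periods” might be sketched as follows. The helper is hypothetical; the specific window and step values in the usage example are illustrative only (window lengths of two to fifteen minutes are suggested above).

```python
def sliding_windows(duration_s, window_s, step_s):
    """Return (start, end) times, in seconds, of overlapping analysis
    windows covering a monitoring session. A step smaller than the window
    length yields the overlapping time-block samples described above."""
    windows, t = [], 0.0
    while t + window_s <= duration_s:
        windows.append((t, t + window_s))
        t += step_s
    return windows

# E.g. a 10-minute session analysed in 5-minute windows, 50% overlap:
blocks = sliding_windows(600, 300, 150)
```

Artefacts such as blink total durations or AVRs could then be aggregated per window and compared across windows of varying lengths.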
- These additional blepharometric data artefacts may be used to calculate other metrics, which are optionally compared with SGF values (for example to validate and/or provide additional context to those values). This may include identifying a known blepharometric biomarker (such as alertness/drowsiness determined via AVRs), thereby to assess whether poor SGF performance is potentially due to one or more other known conditions (or, conversely, ruling out conditions such as drowsiness). This can be of particular assistance in vehicle operation, where there may be utility in differentiating between poor SGF performance resulting from drowsiness, against SGF performance resulting from factors such as stress, cognitive load, intoxication or driver distractedness.
- FIG. 2A illustrates a first example hardware arrangement, in the form of a head wearable unit, which in the example of FIG. 2A takes the form of spectacles 200, which is for the present purposes configured to assess human condition based on SGF (for example based on measurements of IBI).
- These spectacles need not be functional as vision affecting spectacles (i.e. they do not necessarily include lenses, and may simply be a frame that provides a wearable mount, or other head-wearable device).
- Spectacles 200 include a frame 201 which is mounted to a human subject’s head, an IR transmitter/receiver assembly 202 which is positioned relative to the body thereby to, in use, transmit a predefined IR signal onto the subject’s eye, and receive a reflected IR signal resulting from reflection of the transmitted IR signal off the user’s eye or eyelid.
- a sizing adjustment mechanism 203 allows for control over positioning of a nose mount portion, thereby to allow effective locating of assembly 202 relative to the wearer’s eye.
- a processing unit 204 (which is optionally mounted to a spectacle arm) receives and processes the received IR signal. This processing may include:
- Onboard processing using a set of artefact detection algorithms stored as computer code on a memory unit and executed via a microprocessor.
- raw data from IR assembly 202 is subjected to one or more pre-processing algorithms (for example filters and the like), and an artefact detection algorithm operates to identify the presence of defined data artefacts, and provide an output signal in the case that those defined data artefacts are identified.
- raw data from IR assembly 202 is transmitted (for example via Bluetooth or another wireless communication medium) to a secondary processing device, which optionally takes the form of a smartphone.
- an onboard processor performs preliminary processing of the raw data prior to transmission, for example to reduce complexity and/or amount of data required to be transmitted.
- the secondary processing device executes a software application which includes/accesses the set of artefact detection algorithm (which are stored on a memory unit of the secondary processing device). Again, these algorithms operate to identify the presence of defined data artefacts, and provide an output signal in the case that those defined data artefacts are identified.
- The arrangement of FIG. 2A is optionally used to collect a continuous stream of amplitude data as a function of time, which is subsequently processed thereby to enable extraction of data artefacts such as IBI.
- firmware or software may be configured to perform limited analysis which simply identifies blink start and end events, and from this maintains a count of IBIs in the SGC and non-SGC ranges, thereby to enable continuous generation of an SGF metric.
- spectacles 200 use alternate sensor arrangements to record eyelid position, for example camera systems as discussed in embodiments further below.
- FIG. 2B illustrates a second example hardware arrangement, in the form of a camera-based blepharometric data monitoring system 210, which is for the present purposes configured to assess human condition based on SGF (for example based on measurements of IBI).
- This system is optionally installed in a vehicle, for example as a driver monitoring system which assesses SGF (for example as a means to assess driver performance by reference to factors such as drowsiness, stress, cognitive load, distractedness, intoxication, impairment, and/or other conditions).
- System 210 includes a camera unit 211, which is positioned to capture image data in a region including a human subject’s face, when that human subject is positioned in a defined area.
- the defined area is an operator position for a vehicle (such as a car, truck, aircraft, or other, including operator and/or passenger locations).
- the defined area is relative to a piece of furniture (for example to allow monitoring of a subject operating a computer or watching a television), or a clinical device.
- the camera unit may include a webcam provided by a computer device.
- a processing unit 212 processes image data from camera unit 211 via a vision system thereby to identify a subject’s facial region (for example using known facial detection algorithms), and from that identify the user’s eyes, and by way of image-driven tracking algorithms monitor the user’s eyes thereby to detect and measure blinks (optionally in combination with cloud-based processing 213).
- Blinks are identified and measured thereby to determine blepharometric data, which is processed using artefact detection algorithms, for example as discussed above.
- these algorithms operate to identify the presence of defined data artefacts, and provide an output signal in the case that those defined data artefacts are identified.
- the hardware arrangement of FIG. 2B is installed in a vehicle, such as an automobile, and as such configured to detect artefacts in blepharometric data which are relevant to an operator of the vehicle (for example in the context of detecting drowsiness and/or other neurological conditions).
- Output, for example in terms of alerts and the like, is delivered via an output unit such as a display device 214 (which, in a vehicle embodiment, may be an in-vehicle display) or a networked computing device (such as a smartphone 215).
- delivery of data to an output device is provided from an Internet-based processing/data management facility to the display device rather than directly from system 212 (e.g. both are connected to a common networked data processing/management system).
- the output may be delivered to the human subject being monitored and/or to a third party.
- eyelid monitoring is performed via a process including the following steps, thereby to provide a signal representative of amplitude as a function of time.
- Identify, in the eye region(s), presence and movement of an eyelid. For example, in a preferred embodiment this is achieved by way of recording an eyelid position relative to a defined “open” position against time. This allows generation of blepharometric data in the form of eyelid position (amplitude) over time. It will be appreciated that such data provides for identification of events (for example blink events) and velocity (for example as a first derivative of position against time).
- a facial recognition algorithm is used to enable identification of: (i) a central position on an upper eyelid on a detected face; and (ii) at least two fixed points on the detected face.
- the two fixed points on the detected face are used to enable scaling of measurements of movement of the central position of the upper eyelid thereby to account for changes in relative distance between the user and the camera. That is, a distance between the two fixed points is used as a means to determine position of the face relative to the camera, including position by reference to distance from the camera (as the user moves away, the distance between the fixed points decreases).
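The scaling approach described above can be sketched as follows. This is an illustrative sketch only; the function and parameter names are hypothetical.

```python
import math

def scaled_eyelid_amplitude(lid_displacement, p1, p2, ref_dist):
    """Scale a raw eyelid displacement (e.g. in pixels) by the ratio of a
    reference inter-landmark distance to the currently observed one,
    compensating for the face moving toward or away from the camera.
    p1 and p2 are the two fixed facial points; ref_dist is their
    separation at calibration time."""
    observed = math.dist(p1, p2)   # current distance between fixed points
    return lid_displacement * (ref_dist / observed)
```

As the user moves away from the camera, the observed landmark separation shrinks and the raw eyelid displacement is scaled up correspondingly, so that amplitude values remain comparable over time.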
- a trained Al image classifier is used to identify blink commencement and completion events from images, for example based on a pre-training process.
- FIG. 2C illustrates a third blepharometric monitoring system, in the form of a smartphone-integrated blepharometric monitoring system 220, which is for the present purposes configured to assess human condition based on SGF (for example based on measurements of IBI as discussed further above).
- system 220 utilises existing smartphone hardware 221.
- image data from a smartphone image capture unit (preferably a front-facing camera 222, but optionally a rear-facing camera) is leveraged by a software application 223 thereby to perform facial detection and blepharometric detection/measurement in a similar manner to the embodiment of FIG. 2B.
- the software application operates as a foreground application, which delivers graphical information via the smartphone screen 224 concurrently with blink detection (in some cases this graphical information is used to assist in standardising conditions for a blink detection period).
- the software application operates as a background application, which performs blink detection and measurement whilst other software applications are presented as foreground applications (for example blink detection whilst a user operates a messaging application).
- Processing of blink detection data is optionally performed via software application 223 using the smartphone’s internal processing capabilities, transmitted to a server device for remote processing, or a hybrid approach which includes both local processing and remote processing.
- one embodiment provides a portable electronic device including: a display screen; and a front-facing camera; wherein the portable electronic device is configured to concurrently execute: (i) a first software application that provides data via the display screen; and (ii) a second software application that receives input from the front-facing camera thereby to facilitate detection and analysis of blepharometric data.
- the first software application is in one embodiment a messaging application, and in another embodiment a social media application. This allows for collection of blepharometric data whilst a user engages in conventional mobile device activities.
- One embodiment provides computer executable code that when executed causes delivery via a computing device of a software application with which a user interacts for a purpose other than blepharometric-based data collection, wherein the computer executable code is additionally configured to collect data from a front-facing camera thereby to facilitate analysis of blepharometric data.
- the purpose may be, for example, messaging or social media.
- Embodiments such as that of FIG. 2C provide for collection of blepharometric data via a background software application executing on an electronic device with a front-facing camera. This provides opportunities to analyse a device user’s neurological condition, for example in the context of predicting seizures, advising on activities, diagnosing potential neurological illnesses, detecting drowsiness, and so on.
- blepharometric data monitoring systems are discussed below. It will be appreciated that these are configurable to perform SGF assessment as described herein as a particular form of blepharometric data analysis. Components of these systems can in further embodiments be incorporated into any of the systems above.
- FIG. 5A illustrates an example in-vehicle blepharometric data monitoring system, which is configurable for the purposes of measuring IBIs and calculating SGF for any of the assessment methodology embodiments discussed above. Whilst it is known to provide a blepharometric data monitoring system in a vehicle for the purposes of point-in-time analysis of alertness/drowsiness, based on blink rates, blink frequency, blink duration and/or AVRs, such hardware has not been proposed as a means for assessing SGF.
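By way of a non-limiting illustration, IBI measurement and a simple SGF-style index might be sketched as below. The function names are hypothetical, and the proportion-based index is an illustrative stand-in only, not the specification's actual SGF metric.

```python
def inter_blink_intervals(blink_onsets):
    """Inter-blink intervals (IBIs): the time gaps between consecutive
    blink onset times (seconds)."""
    return [b - a for a, b in zip(blink_onsets, blink_onsets[1:])]

def sgf_index(ibis, short_ibi_cutoff=3.0):
    """Illustrative stand-in index: the proportion of IBIs at or above a
    cutoff, treating longer blink-free gaps as evidence of effective
    gating (suppression) of spontaneous blinks. NOT the specification's
    actual SGF metric."""
    if not ibis:
        return 0.0
    return sum(1 for ibi in ibis if ibi >= short_ibi_cutoff) / len(ibis)
```

For blink onsets at 0 s, 2 s, 6 s and 7 s the IBIs are 2 s, 4 s and 1 s, and with a 3 s cutoff one of the three intervals counts as gated.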
- the system of FIG. 5A includes an image capture device 520.
- This may include substantially any form of appropriately sized digital camera, preferably a digital camera with a frame rate of over 60 frames per second. Higher frame rate cameras are preferred, given that with enhanced frame rate comes an ability to obtain higher resolution data for eyelid movement. In some embodiments frame rates are upscaled and/or fitted to models thereby to improve accuracy of detection of events representing commencement and completion of blinks, as used herein for IBI measurement purposes.
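A minimal sketch of the upscaling/event-detection idea follows, assuming a normalised eyelid-amplitude signal (1.0 = fully open, 0.0 = fully closed) and simple linear interpolation; the threshold, upsampling factor and function name are hypothetical choices for illustration.

```python
def blink_events(samples, fps, closed_frac=0.5, upsample=10):
    """Detect blink commencement/completion times from an eyelid-amplitude
    signal, linearly upsampling the signal to refine event timing beyond
    the raw camera frame rate."""
    # Build an upsampled signal via linear interpolation between frames.
    fine = []
    for i in range(len(samples) - 1):
        for k in range(upsample):
            t = k / upsample
            fine.append(samples[i] * (1 - t) + samples[i + 1] * t)
    fine.append(samples[-1])
    dt = 1.0 / (fps * upsample)  # effective sample period after upsampling
    events, in_blink, start = [], False, 0.0
    for i, amp in enumerate(fine):
        if not in_blink and amp < closed_frac:
            start, in_blink = i * dt, True      # blink commencement
        elif in_blink and amp >= closed_frac:
            events.append((start, i * dt))      # blink completion
            in_blink = False
    return events
```

Fitting a parametric blink model (rather than linear interpolation) would refine the timing further; the structure of the event list is the same either way.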
- Device 520 is positioned to capture a facial region of a subject.
- Device 520 is in one embodiment installed in a region of a vehicle in the form of an automobile, for example on or adjacent the dashboard, windscreen, or visor, such that it is configured to capture a facial region of a driver. In another embodiment device 520 is positioned on or adjacent the dashboard, windscreen, or visor, such that it is configured to capture a facial region of a front seat passenger. In another embodiment device 520 is positioned in a region such as the rear of a seat such that it is configured to capture a facial region of a back-seat passenger. In some embodiments a combination of these are provided, thereby to enable blepharometric data monitoring for both a driver and one or more passengers.
- Whilst FIG. 5A (and other systems) are described by reference to a vehicle in the form of an automobile, it will be appreciated that a system as described is also optionally implemented in other forms of vehicles, including mass-transport vehicles such as passenger airplanes, buses/coaches, and trains.
- An in-vehicle image processing system 510 is configured to receive image data from image capture device 520 (or multiple devices 520), and process that data thereby to generate blepharometric data.
- a control module 511 is configured to control device 520, operation of image data processing, and management of generated data. This includes controlling operation of image data processing algorithms, which are configured to:
- Optionally perform subject identification (for example via facial recognition algorithms or technologies which identify a subject via alternate means). This may include identifying a known subject based on an existing subject record defined in user identification data 551 stored in a memory system 550, or identifying an unknown subject and creating a new subject record in user identification data 551.
- In a detected human face, identify an eye region.
- the algorithms are configured to track one eye region only; in other embodiments both eye regions are tracked thereby to improve data collection.
- Identify, in the eye region(s), presence and movement of an eyelid. For example, in a preferred embodiment this is achieved by way of recording an eyelid position relative to a defined “open” position against time. This allows generation of blepharometric data in the form of eyelid position (amplitude) over time. It will be appreciated that such data provides for identification of events (for example blink events) and velocity (for example as a first derivative of position against time).
- a facial recognition algorithm is used to enable identification of: (i) a central position on an upper eyelid on a detected face; and (ii) at least two fixed points on the detected face.
- the two fixed points on the detected face are used to enable scaling of measurements of movement of the central position of the upper eyelid thereby to account for changes in relative distance between the user and the camera. That is, a distance between the two fixed points is used as a means to determine position of the face relative to the camera, including position by reference to distance from the camera (as the user moves away, the distance between the fixed points decreases).
- Algorithms 512 optionally operate to extract additional artefacts from blepharometric data, for example amplitude-velocity ratios, blink total durations, inter-event durations, and the like. It will be appreciated, however, that extraction of such artefacts may occur in downstream processing.
- a blepharometric data management module 513 is configured to coordinate storage of blepharometric data generated by algorithms 512 in user blepharometric data 552. This includes determining a user record against which blepharometric data is to be recorded (in some cases there is only a single user record, for example where blepharometric data is collected only from a primary driver of an automobile). In some embodiments the function of module 513 includes determining whether a set of generated blepharometric data meets threshold data quality requirements for storage, for example based on factors including a threshold unbroken time period for which eyelid tracking is achieved and blepharometric data is generated.
[00138] Memory system 550 includes user identification data 551 for one or more users.
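The quality gate described above (requiring a threshold unbroken period of eyelid tracking before storage) might be sketched as follows; the function name and the 30-second default are hypothetical, chosen purely for illustration.

```python
def meets_quality(tracking_ok_flags, fps, min_unbroken_seconds=30.0):
    """Accept a recording for storage only if it contains at least one
    unbroken run of successfully-tracked frames of the required duration.
    `tracking_ok_flags` is one boolean per captured frame."""
    run = best = 0
    for ok in tracking_ok_flags:
        run = run + 1 if ok else 0  # any tracking dropout resets the run
        best = max(best, run)
    return best / fps >= min_unbroken_seconds
```

Under this rule a single dropped frame splits an otherwise long recording into two shorter runs, so a recording can fail the gate even when its total tracked time is sufficient.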
- system 501 is configured to collect and analyse blepharometric data for only a single user (for instance the primary driver of a vehicle) and includes identification data to enable identification of only that user.
- system 501 includes functionality to collect and analyse blepharometric data for multiple users, and includes identification data to enable identification of any of those users (and optionally, as noted above, defining of a new record for a previously unknown user).
- the identification data may include login credentials (for example a user ID and/or password) which are inputted via an input device.
- the identification data may be biometric, for example using facial recognition as discussed above or an alternate biometric input (such as a fingerprint scanner). In some embodiments this leverages an existing biometric identification system of the vehicle.
- User blepharometric data 552 includes data associated with identified users, the data being time-coded thereby to enable identification of a date/time at which data was collected.
- the blepharometric data stored in data 552 optionally includes blepharometric data generated by algorithms 512 and further blepharometric data derived from further processing of that data, for example data representing average periodic IEDs and/or BTDs, and other relevant statistics which may be determined over time.
- data processing algorithms are updated over time, for example to allow analysis of additional biomarkers determined to be representative of neurological conditions which require extraction of particular artefacts from blepharometric data.
- Analysis modules 530 are configured to perform analysis of user blepharometric data 552. This includes executing a process including identification of relationships between current blepharometric data artefacts (e.g. data recently received from system 510) and historical blepharometric data artefacts (e.g. older data pre-existing in memory system 550). This allows for artefacts extracted in the current blepharometric data to be given context relative to baselines/trends already observed for that subject. The concept of “identification of relationships” should be afforded a broad interpretation to include at least the following:
- blepharometric data collected over a period of weeks, months or years may be processed thereby to identify any particular blepharometric data artefacts that are evolving/trending over time.
- algorithms are configured to monitor such trends, and these are defined with a set threshold for variation, which may be triggered in response to a particular set of current blepharometric data.
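The trend-monitoring idea above (a set threshold for variation, triggered by a particular set of current data) might be sketched as below. This is an illustrative fragment; the z-score formulation and default threshold are assumptions, not the specification's method.

```python
import statistics

def trend_alert(historical, current, z_threshold=2.0):
    """Flag a current blepharometric artefact value that deviates from the
    subject's historical baseline by more than a set variation threshold
    (expressed here as a z-score against historical samples)."""
    mean = statistics.mean(historical)
    sd = statistics.stdev(historical)
    z = (current - mean) / sd if sd else 0.0
    return abs(z) > z_threshold, z
```

Because the baseline is per-subject, the same absolute artefact value may trigger an alert for one user and not for another.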
- Analysis modules are optionally updated over time (for example via firmware updates or the like) thereby to allow for analysis of additional blepharometric data artefacts and hence identification of neurological conditions. For example, when a new method for processing blepharometric data thereby to predict a neurological condition based on a change trend in one or more blepharometric data artefacts is developed, an analysis algorithm for that method is preferably deployed across a plurality of systems such as system 501 via a firmware update or the like.
- System 501 additionally includes a communication system 560, which is configured to communicate information from system 501 to human users.
- This may include internal communication modules 561 which provide output data via components installed in the vehicle, for example an in-car display, warning lights, and so on.
- External communication modules 562 are also optionally present, for example to enable communication of data from system 501 to user devices (for example via Bluetooth, WiFi, or other network interfaces), optionally by email or other messaging protocols.
- communication system 560 is configured to communicate results of analysis by analysis modules 530.
- a control system 540 includes logic modules 541 which control overall operation of system 501. This includes execution of logical rules thereby to determine communications to be provided in response to outputs from analysis modules 530. For example, this may include:
- logic modules 541 are able to provide a wide range of functionalities thereby to cause system 501 to act based on determinations by analysis modules 530.
- the system illustrated in FIG. 5A provides technology whereby one or more digital cameras are able to be installed in a vehicle, such as an automobile or mass transport vehicle, thereby to: (i) collect blepharometric data for an operator and/or one or more passengers; and (ii) enable determination of relationships between blepharometric data collected in a “current” period (for example a last data set, a last day, a last week, or a last month) with historical blepharometric data that is stored for that same user.
- Prediction of neurological conditions based on sudden changes and/or long term trends in change for one or more blepharometric data artefacts that are known to be indicative of particular neurological conditions.
- Personalised prediction of future neurological conditions, for example prediction of future drowsiness based on known drowsiness development patterns extracted for the individual from historical data, and prediction of likelihood of a seizure based on individually-verified seizure prediction biomarkers identifiable in blepharometric data.
- FIG. 5B illustrates a further embodiment, which includes various common features with the embodiment illustrated in FIG. 5A.
- external communication modules 562 facilitate communication with a remote server device, which optionally performs additional blepharometric data analysis.
- external communication modules 562 enable communication between system 501 and a cloud-based blepharometric data analysis system 580. This may, for example, perform SGF assessment based on a wider set of algorithms, benchmarks and/or comparison data sets compared with what is available at a local system. This also allows for blepharometric data, in this case including SGF data, to be collected over a longer term thereby to allow for identification of longer terms trends, sudden changes compared to historical values, and so on.
- System 580 includes a control system 582 and logic modules 581 which are provided by computer executable code executing across one or more computing devices thereby to control and deliver functionalities of system 580.
- System 580 additionally includes a memory system 583, which includes user identification data 584 and user blepharometric data 585.
- the interplay between memory system 583 and memory system 550 varies between embodiments, with examples discussed below: •
- memory system 550 operates in parallel with memory system 583, such that certain records are synchronised between the systems based on a defined protocol. For example, this optionally includes a given memory system 550 maintaining user blepharometric data and user identification data for a set of subjects that have presented at that in-vehicle system, and that data is periodically synchronised with the cloud system.
- upon an unrecognised user presenting at a given in-vehicle system, the system optionally performs a cloud (or other external) query thereby to obtain identification data for that user, and then downloads from the cloud system historical user blepharometric data for that user. Locally collected blepharometric data is uploaded to the server.
- memory system 550 is used primarily for minimal storage, with system 580 providing a main store for user blepharometric data.
- system 550 includes data representative of historical blepharometric data baseline values (for instance defined as statistical ranges), whereas detailed recordings of blepharometric data is maintained in the cloud system.
- analysis modules 586 of cloud system 580 perform more complex analysis of user blepharometric data thereby to extract the historical blepharometric data baseline values, which are provided to memory system 550 where a given user is present or known, thereby to facilitate local analysis of relationships from baselines.
- local memory system 550 is omitted, with all persistent blepharometric data storage occurring in cloud memory system 583.
- System 580 additionally includes analysis modules 586, which optionally perform a similar role to modules 530 in FIG. 5A.
- local and cloud analysis modules operate in a complementary fashion, for example with analysis modules 530 performing relationship analysis relevant to point-in-time factors (for example an altered/non-standard neurological state for a user by comparison with historical baselines, which warrants immediate intervention) and analysis modules 586 performing what is often more complex analysis of trends over time (which may be representative of degenerative neurological illnesses and the like, and does not require immediate local intervention in a vehicle).
- a user may have a personal car with a system 501, and subsequently obtain a rental car whilst travelling with its own system 501; as a result of cloud system 580 the rental car system: has access to the user’s historical blepharometric data; is able to perform relationship analysis of the current data collected therein against historical data obtained from the cloud system; and feeds into the cloud system the new blepharometric data collected, to further enhance the user’s historical data store.
- FIG 5C illustrates a further variation where a user has a smartphone device 570 that executes a software application configured to communicate with a given local in-vehicle system 501 (for example via Bluetooth or USB connection) and additionally with cloud system 580 (for example via a wireless cellular network, WiFi connection, or the like).
- This provides functionality for communication between system 501 and system 580 without needing to provide Internet connectivity to a vehicle (the in-vehicle system essentially uses smartphone 570 as a network device).
- Using a smartphone device as an intermediary between system 501 and system 580 is in some embodiments implemented in a manner that provides additional technical benefits. For example:
- smartphone 570 provides to system 501 data that enables identification of a unique user, avoiding a need for facial detection and/or other means. For instance, upon coupling a smartphone to an in-car system (which may include system 501 and one or more other in-car systems, such as an entertainment system) via Bluetooth, system 501 receives user identification data from smartphone 570.
- a most-recent version of a given user’s record is stored on smartphone 570, and downloaded to system 501 upon coupling.
- one or more functionalities of analysis modules 530 are alternately performed via smartphone 570, in which case system 501 is optionally configured to in effect be a blepharometric data collection and communication system without substantive blepharometric data analysis functions (which are instead performed by smartphone 570, and optionally tailored via updating of smartphone app parameters by system 580 for personalised analysis).
- smartphone 570 is also in some cases useful in terms of allowing users to retain individual control over their blepharometric data, with blepharometric data being stored on the user’s smartphone in preference to being stored by an in-vehicle system.
- FIG. 5D illustrates a further variation in which communication between a local system 501 and cloud system 580 operates in a similar manner to FIG. 5B, but where a smartphone 570 is still present.
- the smartphone is optionally used as an output device for information derived from blepharometric data analysis, and/or as a device to confirm identity and approval for blepharometric data collection.
- a given system 501 identifies a user by way of biometric information (e.g. facial detection) using user identification data stored in system 583 of cloud system 580, and a message is sent to smartphone 570 allowing the user to confirm that they are indeed in the location of the relevant system 501, and providing an option to consent to blepharometric data monitoring.
- FIG. 6 illustrates an exemplary framework under which a cloud based blepharometric data analysis system 580 operates in conjunction with a plurality of disparate blepharometric data monitoring systems 601-606. Each of these systems is in communication with system 580, such that user data (for example user blepharometric data comprising historical data) is able to be utilised for analysis even where a user’s blepharometric data is collected from physically distinct monitoring systems. Analysis of blepharometric data (for example determination of relationships between current and historical data) may be performed at the cloud system 580, at the local systems 601-606, or combined across the cloud and local systems.
- Vehicle operator configurations 601 are in-vehicle systems, such as that of FIG. 5A-5D, in which the image capture device is positioned to capture blepharometric data for an operator of the vehicle.
- a webcam or other image capture device is used to monitor user blepharometric data, with image-based eyelid movement detection as discussed herein. This may occur subject to: (i) a foreground application (for example an application which instructs a user to perform a defined task during which blepharometric data is collected); and/or (ii) a background application which collects blepharometric data whilst a user engages in other activities on the computer (for example word processing and/or internet browsing).
- Vehicle passenger configurations 604. These are in-vehicle systems, such as that of FIG. 5A-5D, in which the image capture device is positioned to capture blepharometric data for a passenger of the vehicle.
- these are optionally configured such that an image capture device operates in conjunction with a display screen, such that blepharometric data is collected concurrently with the user observing content displayed via the display screen.
- the camera is positioned based on a presumption that a front seat passenger will for a substantial proportion of the time pay attention to the direction of vehicle travel (e.g. watch the road).
- a front facing camera is used to monitor user blepharometric data, with image-based eyelid movement detection as discussed herein. This may occur subject to: (i) a foreground application (for example an application which instructs a user to perform a defined task during which blepharometric data is collected); and/or (ii) a background application which collects blepharometric data whilst a user engages in other activities on the computer (for example messaging and/or social media application usage).
- Medical facility configurations 606 may make use of image processing-based blepharometric data monitoring, and/or other means of data collection (such as infrared reflectance oculography spectacles). These provide a highly valuable component in the overall framework: due to centralised collection of blepharometric data for a given subject from multiple locations over an extended period of time, a hospital is able to perform point-in-time blepharometric data collection and immediately reference that against historical data thereby to enable identification of irregularities in neurological conditions.
- FIG. 6 also shows how system 580 is able to interact with a plurality of user mobile devices such as device 570.
- User identification data 584 provides addressing information thereby to enable system 580 to deliver messages, alerts, and the like to correct user devices.
- a particular person displays a specific blepharometric data biomarker (for example threshold spiking in negative inter-event duration) in the lead-up to a seizure event; a process configured to monitor for that biomarker is initialised in response to identification of that person.
- an analysis module of an in-vehicle device is configured for such monitoring once the person is detected, and provides a seizure warning when the biomarker is detected.
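A per-person biomarker watch of this kind might be sketched as below; the spike threshold, the spike-count limit and the function names are all hypothetical values chosen for illustration only.

```python
def negative_ied_spikes(ieds, spike_threshold=-0.05):
    """Count 'negative inter-event duration' spikes in a window of
    blepharometric artefacts (the example biomarker from the text)."""
    return sum(1 for d in ieds if d <= spike_threshold)

def seizure_warning(ieds, spike_threshold=-0.05, max_spikes=3):
    """Per-person monitor, initialised once the subject is identified:
    warn when the spike count exceeds that subject's configured limit."""
    return negative_ied_spikes(ieds, spike_threshold) > max_spikes
```

In practice the threshold and limit would be the individually-verified values stored against that user's record, so the same analysis module behaves differently per detected person.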
- the above disclosure provides analytic methods and associated technology that enables improved analysis of human neurological conditions.
- the present technology provides improved methods for monitoring SGF, without a need to provide exogenous stimuli to elicit startle blinks, and additionally without a need to use invasive monitoring equipment.
- a device configured to measure sensorimotor gating, based on blink-free intervals, could also analyse the characteristics of blinks that are currently used to measure alertness/drowsiness (and/or other attributes) based on known technologies. These two forms of analysis could be made independently during any test period, and this would enhance interpretation of the results from each test.
- “Coupled”, when used in the claims, should not be interpreted as being limited to direct connections only.
- the terms “coupled” and “connected,” along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other.
- the scope of the expression a device A coupled to a device B should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. It means that there exists a path between an output of A and an input of B which may be a path including other devices or means.
- “Coupled” may mean that two or more elements are either in direct physical or electrical contact, or that two or more elements are not in direct contact with each other but yet still co-operate or interact with each other.
Abstract
The present invention relates, in various embodiments, to devices and processing systems configured to enable assessment of a condition of a human subject based on sensorimotor gating of blinks, without a need to apply exogenous sensory stimuli. For example, embodiments include methods of assessing a neurological condition of a human subject, which include collecting data representative of characteristics of blinks performed by a human subject, wherein the blinks occur without known application of exogenous sensory stimuli, and processing the data thereby to infer a state of Sensorimotor Gating Function (SGF). For example, this may indicate that SGF is inhibited relative to an objective baseline.
Description
DEVICES AND PROCESSING SYSTEMS CONFIGURED TO ENABLE ASSESSMENT OF A CONDITION OF A HUMAN
SUBJECT BASED ON SENSORIMOTOR GATING OF BLINKS
FIELD OF THE INVENTION
[0001] The present invention relates, in various embodiments, to devices and processing systems configured to enable assessment of a condition of a human subject based on sensorimotor gating of blinks, without a need to apply exogenous sensory stimuli. For example, embodiments are configured to assist in a wide range of human condition monitoring applications, including clinical testing/assessment (such as in a controlled testing environment) and in-field testing/assessment (such as testing via in-vehicle driver monitoring systems). While some embodiments will be described herein with particular reference to those and other applications, it will be appreciated that the invention is not limited to such a field of use, and is applicable in broader contexts.
BACKGROUND
[0002] Any discussion of the background art throughout the specification should in no way be considered as an admission that such art is widely known or forms part of common general knowledge in the field.
[0003] Assessment of the efficiency of the sensorimotor gating process is very important in clinical medicine and research, in which a ‘blink reflex recovery curve’ is often generated (Eekhof et al 1996; Schwingenschuh et al 2011). This is affected by various physiological and pathological processes. It is currently thought to reflect differences in the activity of dopaminergic neurons in parts of the brain (the striatum in the basal ganglia).
[0004] Analysis of the effects of human sensorimotor gating function has been performed across a range of fields. In particular, researchers have investigated changes in the characteristics of reflex blinks induced by the application of multiple exogenous stimuli, using a method to measure pre-pulse inhibition of startle responses, thereby to quantify the sensorimotor gating. By way of example, the following conditions have previously been studied from the point of view of sensorimotor gating, with many published articles about pre-pulse inhibition.
• Parkinson’s disease.
• Mild cognitive impairment.
• Alzheimer’s disease.
• Traumatic brain injury/concussion.
• Attention-deficit hyperactivity disorder (ADHD).
• Autism spectrum disorders.
• Psychiatric conditions such as schizophrenia, depression, obsessive-compulsive disorder.
• Non-epileptic seizures.
• Epileptic seizures.
• Irritable bowel syndrome.
• Ophthalmic conditions - blepharospasm, dry eye.
• Research into new anti-psychotic drugs by the pharmaceutical industry.
[0005] There are also many other fields in which sensorimotor gating function analysis is of interest, including (but not limited to): post-anaesthetic patients; effects of chemotherapy; effects of sleep deprivation; patients with sleep disorders (such as obstructive sleep apnoea, narcolepsy, etc.); morbid obesity; addictive behaviour; pathological gambling; and as an indicator of particular drugs being present in a subject’s system (for example narcotics).
[0006] Currently, analysis of sensorimotor gating function relies upon testing by which exogenous sensory stimuli are delivered to a subject thereby to trigger a startle response. As a result, testing requires a controlled clinical setting. Furthermore, an extended period of delivery of the exogenous sensory stimuli can be quite uncomfortable for the subject.
SUMMARY OF THE INVENTION
[0007] It is an object of the present invention to overcome or ameliorate at least one of the disadvantages of the prior art, or to provide a useful alternative.
[0008] Example embodiments are described below in the sections entitled “detailed description” and “claims”.
[0009] Reference throughout this specification to “one embodiment”, “some embodiments” or “an embodiment” means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases “in one embodiment”, “in some embodiments” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment, but may. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner, as would be apparent to one of ordinary skill in the art from this disclosure, in one or more embodiments.
[0010] As used herein, unless otherwise specified the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
[0011] In the claims below and the description herein, any one of the terms comprising, comprised of or which comprises is an open term that means including at least the elements/features that follow, but not excluding others. Thus, the term comprising, when used in the claims, should not be interpreted as being limitative to the means or elements or steps listed thereafter. For example, the scope of the expression a device comprising A and B should not be limited to devices consisting only of elements A and B. Any one of the terms including or which includes or that includes as used herein is also an open term that also means including at least the elements/features that follow the term, but not excluding others. Thus, including is synonymous with and means comprising.
[0012] As used herein, the term “exemplary” is used in the sense of providing examples, as opposed to indicating quality. That is, an “exemplary embodiment” is an embodiment provided as an example, as opposed to necessarily being an embodiment of exemplary quality.
BRIEF DESCRIPTION OF THE DRAWINGS
[0013] Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings in which:
[0014] FIG. 1 illustrates a system according to one embodiment.
[0015] FIG. 2A illustrates a blepharometric detection system according to one embodiment.
[0016] FIG. 2B illustrates a blepharometric detection system according to one embodiment.
[0017] FIG. 2C illustrates a blepharometric detection system according to one embodiment.
[0018] FIG. 3 illustrates a method according to one embodiment.
[0019] FIG. 4A illustrates muscles of a right eye eyelid.
[0020] FIG. 4B shows two frequency histograms of blink-free intervals.
[0021] FIG. 5A illustrates an in-vehicle blepharometric data collection/monitoring system according to one embodiment.
[0022] FIG. 5B illustrates an in-vehicle blepharometric data collection/monitoring system according to one embodiment.
[0023] FIG. 5C illustrates an in-vehicle blepharometric data collection/monitoring system according to one embodiment.
[0024] FIG. 5D illustrates an in-vehicle blepharometric data collection/monitoring system according to one embodiment.
[0025] FIG. 6 illustrates a blepharometric data monitoring framework according to one embodiment.
DETAILED DESCRIPTION
[0026] The present invention relates, in various embodiments, to devices and processing systems configured to enable assessment of a condition of a human subject based on sensorimotor gating of blinks, without a need to apply exogenous sensory stimuli. For example, embodiments include methods of assessing a neurological condition of a human subject, which include collecting data representative of characteristics of blinks performed by a human subject, wherein the blinks occur without known application of exogenous sensory stimuli, and processing the data thereby to infer a state of Sensorimotor Gating Function (SGF). For example, this may indicate that SGF is inhibited relative to an objective baseline.
[0027] Technology described herein allows for continuous assessment of SGF without a need for exogenous sensory stimuli. This allows for improved SGF testing in a clinical environment, and also for SGF assessment to be taken into field environments. By way of example, embodiments discussed below allow for SGF measurements in an in-vehicle environment (during normal vehicle operation), thereby to provide an assessment of whether a vehicle operator is impaired. This may be used in the context of identifying impairments such as cognitive load, stress, driver distraction, alertness/drowsiness, substance intoxication, epilepsy, neurodegenerative diseases such as Alzheimer’s, Parkinson’s and mild cognitive impairment and a range of other conditions.
[0028] The embodiments described below refer to analysis of blepharometric data. The term “blepharometric data” refers to data that describes movements of a human subject’s eyelid (or eyelids). Eyelid movements are commonly categorised as “blinks” or “partial blinks”. The term “blepharometric data” is used to distinguish technology described herein from other technologies which detect the presence of blinks for various purposes. The technology herein is focussed on analysing eyelid movement as a function of time, typically measured as an amplitude. This data
may be used to infer the presence of what would traditionally be termed “blinks”, however it is attributes of “events” and other parameters identifiable in eyelid movements which are of primary interest to technologies described herein. These are referred to as “blepharometric artefacts”, with such artefacts being identifiable by application of various processing algorithms to a data set that described eyelid position as a function of time (i.e. blepharometric data). As discussed further below, particular blepharometric artefacts are used by embodiments of the subject technology thereby to allow for assessment of SGF.
[0029] FIG. 1 illustrates an example system for assessing a neurological condition of a human subject 101.
[0030] The system includes an eyelid position monitoring sensor 102, for example an IR-based system (such as IR reflectance oculography spectacles) or a camera-based system (for example an in-vehicle monitoring system), which is configured to enable detection of points in time at which a blink commences and concludes. For example, this may be achieved by using eyelid position to derive a value for amplitude as a function of time, and from that determining times at which individual blinks commence and conclude.
[0031] The blinks occur without known application of exogenous sensory stimuli. That is, the blinks are spontaneous blinks, intentional blinks, and incidental startle blinks (resulting from exogenous stimuli which are not associated with the assessment process). The blinks may also include self-stimulated blinks, which are a sub-category of reflex blinks which occur in response to a primary blink (for example as a result of the eyelid contacting the eye during the primary blink).
[0032] Monitoring sensor 102 is coupled to a processing system 103, which is configured to identify characteristics of blinks, for example based on a data stream representative of amplitude as a function of time which is provided by or based on data provided by monitoring sensor 102. In preferred embodiments, this includes determining times representative of events at which individual blinks commence and conclude.
[0033] An assessment system 103 is configured for further processing data from the processing system thereby to infer a state of Sensorimotor Gating Function (SGF). This is in some embodiments calculated based on analysis of Blink Free Intervals (BFIs), which are intervals between a defined blink completion time and the following blink commencement time (as distinct from blink rate, which is calculated from blink start to blink start, or similar).
[0034] In various embodiments, blink-free intervals (BFIs) are used to identify blinks which are predicted to be self-stimulated blinks (which are typically prevented via a sensorimotor gating mechanism), and use the presence, characteristics and/or frequency of those blinks to determine SGF. For example, one approach includes:
(i) Identifying a series of BFIs having an interval time within a predicted Sensorimotor Gating Controlled (SGC) range over a defined analysis period;
(ii) Identifying a series of BFIs having an interval time within a predicted non-SGC range over the defined analysis period.
(iii) Inferring the state of SGF based on a comparison of the number of BFIs in the SGC range against the number of BFIs in the non SGC range.
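The three-step comparison above can be sketched as a short code example. This is an illustrative sketch only: the function name, the specific range boundaries (in milliseconds) and the return convention are assumptions chosen for demonstration, not values prescribed by this disclosure.

```python
# Illustrative sketch: range boundaries and names are assumptions,
# not values mandated by the specification.

def infer_sgf_state(bfis_ms, sgc_range=(250, 750), non_sgc_range=(2000, 3000)):
    """Compare BFI counts in the SGC range against the non-SGC range.

    A relatively high SGC-range count suggests that self-stimulated blinks
    are escaping sensorimotor gating, i.e. SGF may be inhibited.
    Returns the SGC/non-SGC count ratio, or None if there is no
    non-SGC-range data to normalise against.
    """
    sgc_count = sum(1 for b in bfis_ms if sgc_range[0] <= b < sgc_range[1])
    non_sgc_count = sum(1 for b in bfis_ms
                        if non_sgc_range[0] <= b < non_sgc_range[1])
    if non_sgc_count == 0:
        return None  # insufficient data for normalisation
    return sgc_count / non_sgc_count
```

A higher ratio relative to a baseline would, under this sketch, be read as a greater degree of gating compromise.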
[0035] This provides for a new means to assess SGF, which is advantageous over known assessment methodologies (which tend to require application of exogenous stimuli thereby to trigger startle blinks).
[0036] Another approach includes:
(i) Identifying a series of BFIs having an interval time within a predicted Sensorimotor Gating Controlled (SGC) range over a defined analysis period;
(ii) Identifying a total number of BFIs over the defined analysis period.
(iii) Inferring the state of SGF based on a comparison of the number of BFIs in the SGC range against the total number of BFIs.
[0037] This provides for a new means to assess SGF, which is advantageous over known assessment methodologies (which tend to require application of exogenous stimuli thereby to trigger startle blinks).
Context
[0038] It is generally understood that there are three different kinds of blinks - reflex, spontaneous and voluntary. The difference between these mainly refers to how they are initiated. After initiation, the different kinds are very similar, although not necessarily identical.
[0039] Up until now, blink research has largely been performed by people working in two separate silos: (i) those who study reflex blinks; and (ii) those who study spontaneous blinks. Researchers in the first silo come mainly from the disciplines of vision science, ophthalmology, neurology, experimental psychology, psychiatry and pharmacology. Researchers in the second silo come mainly from sleep research (alertness/drowsiness research), psychology, physiology, and biomedical engineering. There has not been much interaction between these silos.
[0040] In the second silo, an accepted approach is to recognize the importance of a spontaneous ‘blink generator’ (a separate neuronal system) as the trigger for most spontaneous blinks, with several other factors either facilitating or inhibiting the occurrence and characteristics of the next blink. Not much is known about this ‘blink generator’, although it appears to involve an ‘integrate and fire’ neuronal system.
[0041] The present disclosure relates to approaches whereby it is recognised that spontaneous blinks involve many of the same mechanisms that are involved in reflex blinks, apart from spontaneous blinks not involving exogenous sensory inputs (externally applied stimuli). By bringing the two existing silos of blink research together in this manner, there is potential to gain enormously from the published literature of each.
Eyelid Movements During Blinks
[0042] Each blink has three components - a period when the eyelids are closing, another when the eyelids are stationary and ‘closed’ (whether in direct contact or not), and a third when the eyelids are reopening. Researchers over the past 20 years have gained a substantial understanding of these components, for instance by the investigation of amplitude-velocity ratios, and how they change with alertness/drowsiness.
[0043] Blinks involve coordinated eyelid closing and reopening movements, mainly (but not only) due to the actions of two muscles in each eye, Levator palpebrae superioris (LPS) and Orbicularis oculi (OO) (as shown in FIG. 4A, which shows deep and superficial layers of muscles in the eyelids of the right eye). The nerve supply to OO is via a branch of the facial nerve, whereas a branch of the oculomotor nerve supplies LPS.
[0044] The ligaments which attach each end of LPS to the bony orbit are positioned such that, when the LPS and OO muscles are relaxed, the eyelids remain closed. This occurs during sleep. During wakefulness the eyelids are open most of the time, as a result of the tonic activation of LPS, and with the help of Mueller’s muscle.
[0045] Each blink begins with the phasic inhibition of LPS. A few milliseconds later, there is phasic activation of OO to close the eyelids. This mainly involves activation of the ‘fast-twitch’ palpebral fibres of OO (rather than its orbital fibres). This activation of OO is usually initiated by the blink generator.
[0046] Elastic properties of the ligamentous attachments of LPS, which are stretched when the eyelids are open, assist in closing them. Thus, there is more than one force acting to close the eyelids - the contraction of OO and the pull of elastic ligaments on LPS.
[0047] In alert wakefulness the upper and lower eyelids are in contact for only a few milliseconds before OO relaxes and LPS contracts phasically to reopen the eyelids. Sometimes the reopening movement begins before the edge of the upper eyelid has made contact with the lower lid. These partial blinks sometimes make up a considerable proportion of all blinks.
[0048] At the end of each blink the upper tarsal muscle (Mueller’s muscle) helps LPS, in its tonic activation mode, to maintain the elevated position of the upper eyelid.
[0049] Sensory feedback about the position and velocity of each upper eyelid is derived from mechanoreceptors in LPS, and especially in Mueller’s muscle which is embedded underneath LPS, and in the overlying skin. There is no such sensory feedback from OO muscles.
[0050] There are other factors at play in the closing and reopening movements of the eyelids during blinks which have received little attention. They refer to the mechanisms which either facilitate or inhibit the contraction of OO during blinks - the so-called sensorimotor gating of blinks.
[0051] Sensorimotor gating is a process by which a neural system in the brain screens or gates sensory inputs that would otherwise interfere with processing of, and/or responding to, the most salient incoming information. In the case of eye blinks, this means that a blink that would otherwise be triggered when the upper eyelid crosses the cornea and stimulates the nerve endings in it is either facilitated or inhibited, depending on when it occurs.
[0052] Without this gating process, each eyelid reopening movement across the cornea would initiate another rapidly recurring blink, and this would interfere with the continuity of vision unnecessarily.
[0053] The source of the sensory stimulus that triggers a reflex blink in that case would be endogenous (arising from within the system itself), rather than from exogenous sources (introduced from outside).
[0054] Spontaneous blinks that are initiated by the ‘blink generator’ must be distinguished from reflex blinks which can be initiated in several other ways, as follows:
• Direct stimulation of the cornea, as for example by a brief puff of air directed at it, or by a solid object such as a piece of dust touching it, produces a reflex blink that is part of the corneal reflex. The epithelium of the cornea has many sensors in it, which are mainly free nerve endings that respond to touch and pain. The cornea is 300-600 times more sensitive to such stimuli than the skin is.
• Electrical stimulation of the supraorbital nerve above the eye which causes the OO muscle to contract, its electromyographic (EMG) response being recorded by electrodes attached above and below the eye. This method is seldom used nowadays.
• Stimulation of the auditory nerve by a loud noise. This is typically greater than 80 decibels but is adjusted for each subject to be two or three times the minimum required to produce a ‘startle response’. The resulting reflex blink is part of that ‘startle response’. This is the most commonly used technique at present for studying reflex blinks and sensorimotor gating, based on the percentage of pre-pulse inhibition.
• Intense and sudden stimulation of any other sensors that can produce a ‘startle response’, for example, a bright flash of light.
• Direct stimulation of mechanoreceptors in the eyelids, especially in the upper eyelids, which respond to changes of shape each time the upper eyelid crosses the conical-shaped cornea. This means that the initial phasic contraction of OO as part of a spontaneous blink is usually enhanced by additional, reflex contraction of OO muscle fibres. It is phasic contraction of the palpebral fibres of OO muscles which usually produces the motor response of the corneal reflex, whatever its initiating trigger. The orbital fibres of OO are only involved in more forceful, and often voluntary, eyelid closures.
[0055] The latter occurs in the absence of an exogenous stimulus from which the cornea would otherwise need to be protected by eyelid closure. This might be called a self-stimulatory blink. It could be considered as a separate category, or as a sub-category of reflex blinks. Thus, there is a role, even during spontaneous blinks, for self-stimulation of the corneal reflex by the eyelid moving across the cornea, unless the sensory feedback or the blink is otherwise inhibited. The eyelid movements during these self-stimulatory blinks are subject either to augmentation or inhibition, depending on when they occur, by a process of sensorimotor gating.
Augmentation or Inhibition of Sensorimotor Gating
[0056] The present inventors have appreciated that the contraction of OO muscles is either augmented or inhibited by sensorimotor gating.
[0057] The phasic contraction of OO during a reflex blink has two components, referred to as R1 and R2. The former (R1) has a very short latency, of the order of 8-12 milliseconds. It involves a very short neuronal circuit with only three neurons. By contrast, R2 has a longer latency (about 20-30 milliseconds) and involves interneurons in a longer neuronal circuit. However, these two neuronal circuits act in parallel.
[0058] Every time the eyelids (especially the upper eyelids) move across the corneal surface during an eyelid closing movement they stimulate free nerve endings in the epithelium of the cornea as well as mechanoreceptors in the eyelid, especially in Mueller’s muscle which is embedded beneath the LPS. These sensors provide feedback to the spinal trigeminal nucleus via a branch of the trigeminal nerve. There is a direct connection between the trigeminal nucleus and the nearby facial nucleus.
[0059] The present inventors have concluded that, initially, this sensory feedback augments the phasic contraction of OO with a self-stimulatory component of the eyelid closing movement itself. This increases the velocity of the eyelid closing movement, which is usually higher than the velocity of the subsequent eyelid reopening movement.
[0060] During the eyelid reopening movement of a blink, the trigeminal sensory feedback from free nerve endings in the cornea and mechanoreceptors in the eyelids could arise again, for the second time in quick succession. However, this second episode of corneal stimulation must be prevented from initiating another contraction of OO, such as that which was initiated during the eyelid closing movement, or it would greatly impede or prevent eyelid reopening. That inhibition of a second, rapidly recurring OO contraction is achieved by the sensorimotor gating process. The present inventors have concluded that this is likely achieved by blocking trigeminal sensory feedback from the eyelid reopening movement.
[0061] Thus, the sensorimotor gating process changes from being facilitatory of the first self-stimulatory feedback and augmented contraction of OO due to the eyelid closing movement itself, to being inhibitory of OO contraction during eyelid reopening. The inhibition of OO contraction during eyelid reopening reaches its maximum about 60-120 milliseconds after the first episode of sensory input from the corneas and eyelids. The efficiency of that sensorimotor inhibition then declines progressively and ends after two or three seconds (i.e. blinks become disinhibited after a period of inhibition).
[0062] The present inventors suggest that, by appropriate analysis of a series of blink-free intervals, it is possible to characterize this sensorimotor process without applying any exogenous stimuli, such as those applied during the assessment of pre-pulse inhibition. At the end of each eyelid reopening movement the ‘blink generator’ presumably begins its integration process again, ready to fire and initiate the next blink after an ‘appropriate’ period of inhibition during the gating process.
Blink Free Intervals and Inter-blink Intervals
[0063] For the present purposes, the interval between the end of one eyelid reopening movement and the start of the next eyelid closing movement is referred to as the blink-free interval (BFI).
[0064] BFIs are specifically contrasted with inter-blink intervals (IBIs), which are a common form of blink measurement, measured from the start of one blink to the start of the next blink (or alternately the time between two of the same form of blink event in consecutive blinks). Measurements of IBI have been used for decades as a measure of the ‘instantaneous blink rate’ - the frequency with which blinks occur - and these have been used for a range of purposes.
[0065] Each IBI is, in practice, the sum of two variables:
(i) the duration of a blink (from the beginning of its eyelid closing movement to the end of its eyelid reopening movement); and
(ii) the duration of the subsequent BFI, before the start of the eyelid closing movement of the next blink.
[0066] The present inventors have found that these two variables are not significantly correlated, and in fact they vary independently.
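The decomposition of each IBI into a blink duration plus the subsequent BFI, described in paragraph [0065], can be illustrated with a minimal sketch. The function name and the millisecond timestamps are hypothetical, chosen only for demonstration.

```python
def decompose_intervals(blinks):
    """Given (start, end) times (ms) of consecutive blinks, return
    (blink_duration, bfi, ibi) for each consecutive pair, where the
    start-to-start IBI equals blink_duration + BFI."""
    out = []
    for (s1, e1), (s2, _) in zip(blinks, blinks[1:]):
        duration = e1 - s1   # eyelid closing through end of reopening
        bfi = s2 - e1        # blink-free interval before the next blink
        ibi = s2 - s1        # start-to-start inter-blink interval
        out.append((duration, bfi, ibi))
    return out

# e.g. blinks at (0, 250) and (1750, 2000):
# duration 250 ms, BFI 1500 ms, IBI 1750 ms — IBI = duration + BFI
```

The sketch makes the independence claim concrete: two blinks with the same IBI can have very different duration/BFI splits, which is why the two variables can vary independently.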
The Effects of Sleep Deprivation on Blink-free Intervals
[0067] The present inventors have re-analysed the results of earlier sleep deprivation experiments, thereby to gain improved understanding of blink-free intervals in healthy adults, and the effects of sleep deprivation on those intervals.
[0068] FIG. 4B shows two frequency histograms of blink-free intervals recorded from 18 healthy subjects when alert, after their ‘normal’ night’s sleep, and when drowsy, after missing a night’s sleep. They performed a 15-minute visual reaction-time test in those two conditions while their eyelid movements were recorded by an Optalert system of infrared reflectance blepharometry. There were 4730 blinks recorded in the alert condition and 7380 in the drowsy condition.
[0069] In the alert condition, the inhibition of self-stimulatory blinks was almost complete during blink-free intervals less than about 50 milliseconds. After that, blinks became progressively disinhibited (i.e. they occurred more frequently) and reached their peak frequency with blink-free intervals of about 1500 milliseconds. The frequency of longer blink-free intervals then decreased progressively until there were no blink-free intervals longer than about 35 seconds. This is also the longest interval for which most people can resist blinking voluntarily when asked to do so.
[0070] In the drowsy condition the inhibition of self-stimulatory blinks with blink-free intervals less than 50 milliseconds was not as complete as in the alert condition. The frequency of blinks then increased, reaching its peak for blink-free intervals of about 750 milliseconds, where there are far more blinks than in the alert condition. This explains the hitherto unexplained observation that the blink rate increases after sleep deprivation. The frequency of blink-free intervals longer than about 2 seconds was similar in both conditions.
[0071] This is evidence for the existence of sensorimotor gating during spontaneous blinks (i.e. blinks without external stimuli). The self-stimulatory contraction of OO is facilitated during eyelid closing movements, but it is inhibited during eyelid reopening movements. That inhibition is strongest when the interval between the closing and reopening movements is about 60-120 milliseconds, which is what happens typically in alert subjects.
[0072] The present inventors have concluded that sensorimotor gating of OO contraction extends after the eyelid reopening movement has ended (in between blinks). That inhibition decreases progressively over about 2 to 3 seconds, when blink-free intervals become progressively longer. The peak frequency of blink-free intervals occurs at about 1500 milliseconds in alert young adults. There is some degree of disinhibition of blinks with drowsiness after sleep deprivation.
[0073] In both alert and drowsy conditions, the sensorimotor gating of blinks is sometimes less than complete for blink-free intervals less than 50 milliseconds. This allows what can be termed ‘rapidly recurring blinks’ or ‘blink oscillations’ to occur intermittently. These events occurred somewhat more frequently in the drowsy condition.
Assessment of Sensorimotor Gating Function
[0074] The present inventors, based on their research of self-stimulatory blinks occurring during spontaneous blinks (due to self-stimulation of the corneal reflex by the eyelid moving across the cornea), have identified methods to assess SGF without the use of exogenous sensory stimuli. That is, the present inventors have developed devices and methods for the continuous assessment of sensorimotor gating of endogenous eyeblinks without using exogenous sensory stimuli that would cause reflex blinks as part of the startle response.
[0075] One embodiment provides a method of assessing a neurological condition of a human subject, the method including:
• Collecting data representative of characteristics of blinks performed by a human subject, wherein the blinks occur without known application of exogenous sensory stimuli; and
• Processing the data thereby to infer a state of Sensorimotor Gating Function (SGF).
[0076] The blinks occur without “known” application of exogenous sensory stimuli in the sense that the testing regime does not intentionally provide exogenous sensory stimuli to
trigger the blinks. It is appreciated that there may be instances where exogenous sensory stimuli occur during an assessment period (for example a loud ambient noise, a flashing light, or the like). In some embodiments technological means (for example noise and/or light sensors, and optionally other sensors such as accelerometers) are used to assist in identifying the presence of unintended exogenous sensory stimuli.
[0077] Processing the data thereby to infer a state of Sensorimotor Gating Function (SGF) may include identifying presence of a reflex blink following a primary blink. For example, conditions are set to differentiate between a self-stimulated reflex blink following the primary blink, and a subsequent spontaneous (or potentially exogenously stimulated) blink following the primary blink.
[0078] In some embodiments, the method includes processing the data thereby to infer a state of SGF via analysis of Blink Free Intervals (BFIs). This analysis of blink free intervals optionally includes:
(i) identifying a series of Blink Free Intervals (BFIs) having an interval time within a predicted Sensorimotor Gating Controlled (SGC) range;
(ii) analysing the series of BFIs having an interval time within the predicted SGC range thereby to infer the state of SGF.
[0079] In a preferred example, the analysis of blink free intervals includes:
(i) identifying a series of blink free intervals having an interval time within a predicted Sensorimotor Gating Controlled (SGC) range over a defined analysis period;
(ii) identifying a series of blink free intervals having an interval time within a predicted non-SGC range over the defined analysis period; and
(iii) inferring the state of SGF based on a comparison of the number of BFIs in the SGC range against the number of BFIs in the non SGC range (this may be used as a normalisation metric).
[0080] In alternate embodiments, rather than a comparison of the number of BFIs in the SGC range against the number of BFIs in the non-SGC range, a comparison is made between the number of BFIs in the SGC range and a total number of BFIs.
[0081] Quantification of the SGC range and the non-SGC range varies between embodiments. For example, the following range may be used: SGC range: 250ms < BFI < 750ms; non-SGC range: 2s < BFI < 3s. Other ranges may be used, for instance selected based on an understanding that BFIs between about 2 milliseconds and 2 seconds are controlled mainly by the sensorimotor gating mechanism, whereas BFIs longer than about 2 seconds are controlled mainly by the ‘blink generator’.
[0082] The series of blink free intervals is preferably between 50 and 2,000 blinks. In a preferred embodiment, a rolling count is used, thereby to continually adjust measurements of SGF based on latest BFI input values. In some embodiments the series is measured based on a time period, rather than a number of blinks.
[0083] In a preferred embodiment, SGF is measured by way of a ratio between: (i) a count of blinks in the SGC range; and (ii) a count of blinks in the non-SGC range. This ratio is calculated over the series, which as noted above may be defined by way of a number of blinks or a period of time.
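A rolling-count version of this ratio might be sketched as follows. The class name, window size and range boundaries are illustrative assumptions (the boundaries follow the example SGC and non-SGC ranges given earlier, and the window sits within the 50 to 2,000 blink span mentioned above); they are not prescribed by this disclosure.

```python
from collections import deque

class RollingSGF:
    """Maintains a rolling window of the most recent BFIs and reports the
    SGC/non-SGC count ratio after every new blink (illustrative sketch)."""

    def __init__(self, window=500, sgc=(250, 750), non_sgc=(2000, 3000)):
        self.window = deque(maxlen=window)  # oldest BFIs drop out automatically
        self.sgc, self.non_sgc = sgc, non_sgc

    def add_bfi(self, bfi_ms):
        """Add the latest BFI (ms); return the updated SGF ratio,
        or None while there is no non-SGC-range data to normalise by."""
        self.window.append(bfi_ms)
        sgc_n = sum(self.sgc[0] <= b < self.sgc[1] for b in self.window)
        non_n = sum(self.non_sgc[0] <= b < self.non_sgc[1] for b in self.window)
        return sgc_n / non_n if non_n else None
```

Because the deque discards the oldest interval once the window is full, the measurement continually adjusts to the latest BFI inputs, as the paragraph above describes.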
[0084] In some embodiments, the process of calculating an SGF value from IBI values may include:
• Maintaining a count of IBI values within a first range, for example a range of IBI representative of self-stimulated reflex blinks resulting from compromised SGF. This may be termed a Sensorimotor Gating Controlled (SGC) range.
• Maintaining a count of IBI values within a second range, for example a range of IBI representative of blinks resulting from mechanisms other than self-stimulation via a primary blink (for example blinks generated by the “blink generator” in the conventional manner). This may be termed a non-Sensorimotor Gating Controlled (non-SGC) range.
• Deriving a metric as a comparison of these counts, for example the count of IBIs in the SGC range divided by the count of IBIs in the non-SGC range.
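The three steps above can be sketched as a single function; the range bounds, function name, and the handling of a zero non-SGC count are illustrative assumptions rather than details of the claimed method:

```python
def sgf_value(ibis, sgc=(0.25, 0.75), non_sgc=(2.0, 3.0)):
    """Count IBIs (in seconds) in each range and return their ratio.

    A higher value suggests more self-stimulated reflex blinks relative
    to blinks produced by the conventional blink generator.
    """
    n_sgc = sum(1 for x in ibis if sgc[0] < x < sgc[1])
    n_non = sum(1 for x in ibis if non_sgc[0] < x < non_sgc[1])
    if n_non == 0:
        # No non-SGC blinks observed: ratio is undefined; this sketch
        # returns infinity if any SGC blinks were seen, else zero.
        return float("inf") if n_sgc else 0.0
    return n_sgc / n_non
```

For example, a series containing two SGC-range intervals and one non-SGC-range interval yields a ratio of 2.0.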
[0085] In some embodiments the SGC range may be segmented into a plurality of subranges, thereby to differentiate between IBIs in an early part of the SGC range and a later part (or parts) of the SGC range. These may be weighted for the purpose of calculating the SGF value, thereby to add additional importance to those occurring earlier in the SGC range (which are representative of a potentially higher level of sensorimotor gating function compromise).
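One possible form of this weighted-subrange variant is sketched below; the subrange bounds and weights are purely illustrative assumptions:

```python
def weighted_sgc_count(ibis, subranges=((0.25, 0.45, 2.0),
                                        (0.45, 0.75, 1.0))):
    """Sum SGC-range IBIs with per-subrange weights.

    Each subrange is (low_s, high_s, weight); the earlier subrange
    carries a higher weight, reflecting that short IBIs are stronger
    evidence of compromised sensorimotor gating function.
    """
    total = 0.0
    for lo, hi, w in subranges:
        total += w * sum(1 for x in ibis if lo < x < hi)
    return total
```

The weighted count can then replace the raw SGC count in whatever ratio or comparison metric an embodiment uses.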
[0086] In some embodiments the SGF value is compared against a benchmark value, for example a benchmark derived for the subject as a particular individual (optionally based on past monitoring), for a hypothetical individual having corresponding attributes to the subject (for example demographic attributes), or a general baseline. The baseline may be a variable baseline which accounts for other factors which are known to affect SGF.
[0087] There are multiple advantages associated with assessing SGF in this manner. For example, these include:
• Assessments are made continuously, based on a series of blink-free intervals that is updated after every blink, or made as a single measurement. This facilitates detailed studies of how sensorimotor gating varies over periods ranging from a few seconds to many hours.
• There is no need to attach EMG electrodes to the subject’s face, as is common in prior art SGF assessment procedures.
• No exogenous sensory stimuli would be used to generate reflex blinks or startle responses, including whole-body twitches in experimental animals. The stimuli that are currently used are by definition somewhat noxious, reliably producing startle responses. With methods described herein the subject would not necessarily be aware of the measurements being made, given the passive nature of monitoring (for example via a video camera).
• These methods enable sensorimotor gating to be measured under circumstances in which it has not been possible to do so previously, for instance while driving a vehicle.
• Analysis of SGF as described herein can be compared with existing forms of analysis based on other blink characteristics (for example those relating to characteristics of blink amplitudes and/or velocities), thereby to assist in improving robustness of overall assessment programs. It should be noted that the same sensor hardware can be used for both forms of analysis.
[0088] Various example embodiments are discussed below in more detail.
Example Methodology
[0089] FIG. 3 illustrates a methodology which is relevant to a range of embodiments discussed below. This methodology, depending on the specific hardware implementation used by a given embodiment, is in some cases performed via software modules executing at a single computing device, and in other cases performed via software modules executing across a plurality of connected devices, for example including local devices (such as computing devices housed in a vehicle and/or user’s mobile devices such as smartphones) and Internet-connected server devices (also referred to as “cloud” components). It should be appreciated that any computing devices and computer-executed methods configured for the purposes of enabling the overall performance of a methodology based on those described below by reference to FIG. 3 form embodiments of inventions for the purposes of this specification.
[0090] It will be appreciated that a main outcome of the method of FIG. 3 is the generation of a continuous SGF value, which in this example is an SGF value calculated based on a set number of most recent IBI values. In further embodiments, alternate processes may be performed to achieve the same or a corresponding result.
[0091] Block 301 represents a process including commencing a monitoring operation. For some embodiments described below, this is achieved via a camera system having an image capture component that is positioned to observe a capture zone in which a subject’s face is predicted to be positioned. For example, this may include:
• Vehicles, including passenger vehicles or operator-only vehicles, wherein the image capture component is positioned to capture a region in which an operator’s face is predicted to be contained during normal operation. For example, in the case of an automobile, the image capture component may include a camera mounting in or adjacent a dashboard or windscreen.
• Vehicles, in the form of passenger vehicles, wherein the image capture component is positioned to capture a region in which a passenger’s face is predicted to be contained during normal operation. For example, in the case of an automobile, the image capture component may include a camera mounting in or adjacent a dashboard or windscreen, the rear of a seat (including a seat headrest), and so on.
• Mass transport vehicles, including passenger trains and/or aircraft, wherein the image capture component is positioned to capture a region in which a passenger’s face is predicted to be contained during normal operation. For example, the image capture component may be mounted in the rear of a seat (including a seat headrest), optionally in a unit that contains other electronic equipment such as a display monitor.
• Seating arrangements, such as theatres, cinemas, auditoriums, lecture theatres, and the like. Again, mounting image capture components in the rear of seats is an approach adopted in some embodiments.
[0092] In some embodiments, other hardware is used for the purpose of monitoring, for example infrared reflectance oculography spectacles and other wearable devices capable of monitoring eyelid movements.
[0093] The monitoring process preferably includes a process which measures eyelid position (amplitude) as a function of time (for at least one upper eyelid). This is differentiated from processes which merely look for an open or closed state. This is because, as described herein, SGF is calculated based on IBI, and measuring IBI requires accurate information regarding commencement and completion of blink movements.
[0094] The monitoring process of block 301 provides a stream of blepharometric data, which is processed thereby to identify artefacts. This may be real time, substantially in real time, or with a delay. In some embodiments data representative of eyelid amplitude as a function of time is subjected to one or more pre-processing operations prior to the process of block 302 onwards, for example including filtering, upscaling, or the like. These may be used thereby to improve detection of eyelid closure commencement events and eyelid reopening completion events (i.e. the “start” and “end” of each blink, based on which IBI is calculated).
[0095] Block 302 represents a process including detecting an eyelid re-opening completion event (alternately termed a “blink completion event”) in the blepharometric data stream, and a time associated with that event. This event may be determined via one or more of amplitude value, eyelid velocity, eyelid acceleration, and/or other factors, thereby to identify an objective point on an amplitude-time curve which represents an eyelid reopening completion event.
[0096] Block 303 represents a process including detecting an eyelid closure commencement event (alternately termed a “blink commencement event”) in the blepharometric data stream, and a time associated with that event. This event is also determined via one or more of amplitude value, eyelid velocity, eyelid acceleration, and/or other factors, thereby to identify an objective point on an amplitude-time curve which represents an eyelid closure commencement event.
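The event detection of blocks 302 and 303 can be sketched as follows. This is a hedged illustration only: a real implementation would apply amplitude, velocity and/or acceleration criteria as described above, whereas here a single amplitude threshold stands in for those criteria, and all names and threshold values are assumptions:

```python
def blink_events(amplitude, fs, closed_frac=0.4):
    """Scan an eyelid amplitude trace (1.0 = fully open, 0.0 = fully
    closed), sampled at fs Hz, and return a list of
    (closure_commencement_s, reopening_completion_s) pairs."""
    closed = [a < closed_frac for a in amplitude]
    events = []
    i, n = 0, len(amplitude)
    while i < n:
        if closed[i]:
            start = i                       # closure commencement sample
            while i < n and closed[i]:
                i += 1
            # i is now the first re-opened sample (or n at trace end)
            events.append((start / fs, i / fs))
        else:
            i += 1
    return events
```

Each returned pair gives a closure commencement time and the following re-opening completion time; an IBI is then the gap between one pair's re-opening time and the next pair's commencement time, as per block 304.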
[0097] Block 304 represents a process including determining a new IBI value as the time elapsed between the eyelid re-opening completion event at 302 and the eyelid closure commencement event at 303.
[0098] In further embodiments, alternate methods may be used to calculate IBIs from the eyelid re-opening completion event at 302 and the eyelid closure commencement event at 303, optionally including starting a timer when the eyelid re-opening completion event occurs and stopping the timer when the eyelid closure commencement event occurs. It will be appreciated that a range of software approaches may be configured to enable accurate and efficient determination of IBIs from the stream of amplitude data.
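The timer-based approach of paragraph [0098] might be sketched as below; the class and method names are assumptions, and `time.monotonic()` is used only as a convenient default clock:

```python
import time

class IBITimer:
    """Start a timer on an eyelid re-opening completion event and stop
    it on the next closure commencement event, yielding one IBI."""
    def __init__(self):
        self._t0 = None

    def reopening_completed(self, t=None):
        self._t0 = time.monotonic() if t is None else t

    def closure_commenced(self, t=None):
        if self._t0 is None:
            return None                 # no preceding re-opening seen
        t1 = time.monotonic() if t is None else t
        ibi, self._t0 = t1 - self._t0, None
        return ibi
```

Timestamps from the event detector can be passed via the optional `t` argument, which avoids any dependence on wall-clock scheduling of the processing code.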
[0099] Decision 305 determines whether an IBI buffer is full. In this example, an IBI buffer defines a set of IBI values which are used to calculate an SGF value, and the IBI buffer is configured to store a maximum number of IBIs. This maximum number is in some instances an absolute number of blinks (for example between 50 and 2,000, for example 100 or 1,000), and in other cases is defined by a time window over which the IBI buffer is filled (for example between 5 minutes and 20 minutes). In some embodiments multiple IBI buffers are used thereby to enable calculation of a short-term continuous SGF value and one or more longer-term continuous SGF values (for example 100 blinks and 1000 blinks).
[00100] In the case that the IBI buffer is not full, the new IBI value is added to the buffer at block 306 and the process loops to block 302 to collect more IBI values (in this example an SGF value is only calculated once the buffer is full; in other embodiments the SGF value may be calculated from an unfilled buffer and/or just-filled buffer). In some cases, if the IBI addition at block 306 fills the IBI buffer, the process also moves to block 308 in addition to looping.
[00101] If, at decision 305, the IBI buffer is full, then the process moves to block 307 at which the new IBI value is added to the buffer, and the oldest IBI value in the buffer is discarded from the buffer. In this manner, if the buffer is configured to store X IBI values, the buffer continues to store the most recent X IBI values.
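Blocks 305 to 307 describe a fixed-capacity rolling buffer; in Python, `collections.deque` with a `maxlen` gives exactly this discard-oldest behaviour. A sketch, with illustrative ranges and buffer size (the class name and defaults are assumptions):

```python
from collections import deque

class RollingSGF:
    """Keep the most recent max_ibis intervals and recompute an SGF
    ratio each time a new IBI is added, once the buffer is full."""
    def __init__(self, max_ibis=100, sgc=(0.25, 0.75), non_sgc=(2.0, 3.0)):
        self.buf = deque(maxlen=max_ibis)   # oldest IBI auto-discarded
        self.sgc, self.non_sgc = sgc, non_sgc

    def add_ibi(self, ibi):
        self.buf.append(ibi)
        if len(self.buf) < self.buf.maxlen:
            return None                     # buffer not yet full
        n_sgc = sum(1 for x in self.buf
                    if self.sgc[0] < x < self.sgc[1])
        n_non = sum(1 for x in self.buf
                    if self.non_sgc[0] < x < self.non_sgc[1])
        return n_sgc / n_non if n_non else None
```

Running two instances with different `max_ibis` values (for example 100 and 1000) yields the short-term and longer-term continuous SGF values mentioned at paragraph [0099].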
[00102] Block 308 represents calculation of an SGF value based on the current IBI buffer. In this example, the calculation process is performed each time a new IBI value is added to the buffer. However, in further embodiments calculation of a new SGF value may be performed less frequently, for example based on a defined time interval (for example once every X seconds), or each time another Y IBI values are added (for example where 1 < Y < 10).
[00103] The calculation of an SGF value at block 308 may be based on a wide range of metrics. Preferably these metrics are representative of the extent to which SGF is compromised, as indicated by the presence of blinks that are predicted to be self-stimulated reflex blinks following spontaneous blinks (or in some cases startle blinks caused by incidental conditions in a monitoring environment).
[00104] The process of calculation of an SGF value may include:
• Maintaining a count of IBI values within a first range, for example a range of IBI representative of self-stimulated reflex blinks resulting from compromised SGF. This may be termed a Sensorimotor Gating Controlled (SGC) range.
• Maintaining a count of IBI values within a second range, for example a range of IBI representative of blinks resulting from mechanisms other than self-stimulation via a primary blink (for example blinks generated by the “blink generator” in the conventional manner). This may be termed a non-Sensorimotor Gating Controlled (non-SGC) range.
• Deriving a metric as a comparison of these counts, for example the count of IBIs in the SGC range divided by the count of IBIs in the non-SGC range.
[00105] In some embodiments the SGC range may be segmented into a plurality of subranges, thereby to differentiate between IBIs in an early part of the SGC range and a later part (or parts) of the SGC range. These may be weighted for the purpose of calculating the SGF value, thereby to add additional importance to those occurring earlier in the SGC range (which are representative of a potentially higher level of sensorimotor gating function compromise).
[00106] In some embodiments the SGF value is compared against a benchmark value, for example a benchmark derived for the subject as a particular individual (optionally based on past monitoring), for a hypothetical individual having corresponding attributes to the subject (for example demographic attributes), or a general baseline. The baseline may be a variable baseline which accounts for other factors which are known to affect SGF.
[00107] In some embodiments, the method of FIG. 3 includes a preliminary pre-monitoring process including identifying a subject from whom the blepharometric data collected from the monitoring originates. This optionally includes:
• Credential-based identification, for example via a login. This may include pairing of a personal device (such as a smartphone) to a blepharometric data monitoring system (e.g. pairing a phone to an in-vehicle system), inputting login credentials via an input device, or other means.
• Biometric identification. For example, in some embodiments described herein a camera-based blepharometric data monitoring system utilises image data to additionally perform facial recognition functions, thereby to uniquely identify human subjects.
• Other forms of identification.
[00108] Identification of the subject is relevant for the purposes of comparing current blepharometric data with historical blepharometric data for the same subject. For example, in some embodiments an analysis system has access to a database of historical blepharometric data for one subject (for example where the system is installed in a vehicle and monitors only a primary vehicle owner/driver) or multiple subjects (for example a vehicle configured to monitor multiple subjects, or a cloud-hosted system which receives blepharometric data from a plurality of networked systems, as described further below). This may be used, by way of example, to derive a baseline SGF value for an individual, against which current SGF values are benchmarked.
[00109] The method of FIG. 3 may additionally include determination of a range of other blepharometric data artefacts. For example, the artefacts may include:
• Blink total duration (BTD).
• Blink rates.
• Amplitude to velocity ratios (AVRs).
• Negative Inter-Event-Duration (IED).
• Positive IED.
• Negative AVR (i.e. during closure)
• Positive AVR (i.e. during re-opening)
• AVR Product (negative AVR * positive AVR)
• AVR ratio (negative AVR divided by positive AVR)
• BECD (blink eye closure duration).
• Negative DOQ (duration of ocular quiescence)
• Positive DOQ
• Relative Amplitude
• Relative Position
• Max Amplitude
• Max Velocity
• Neg ZCI (zero crossing index)
• Pos ZCI
• Blink start position
• Blink end position
• Blink start time
• Blink end time
• Trends and changes in any of the above artefacts over the period.
[00110] These may be calculated for a “current period”. In this sense, the “current period” may be either a current period defined by a current user interaction with a blepharometric data monitoring system, or a subset of that period. For instance, in the context of a vehicle, the “current period” is in one example defined as a total period of time for which a user operates the vehicle and has blepharometric data monitored, and in another embodiment is a subset of that time. In some embodiments multiple “current periods” are defined, for example using time block samples of between two and fifteen minutes (which are optionally overlapping), thereby to compare blepharometric data activity during periods of varying lengths (which may be relevant for differing neurological conditions, which, in some cases, present themselves based on changes in blepharometric data over a given period of time).
[00111] These additional blepharometric data artefacts may be used to calculate other metrics, which are optionally compared with SGF values (for example to validate and/or provide additional context to those values). This may include identifying a known blepharometric biomarker (such as alertness/drowsiness determined via AVRs), thereby to assess whether poor SGF performance is potentially due to one or more other known conditions (or, conversely, ruling out conditions such as drowsiness). This can be of particular assistance in vehicle operation, where there may be utility in differentiating between poor SGF performance resulting from drowsiness and poor SGF performance resulting from factors such as stress, cognitive load, intoxication or driver distractedness.
[00112] Various hardware/software embodiments configured to enable the above methodology are described below.
Example Spectacles-Based Hardware Configuration
[00113] FIG. 2A illustrates a first example hardware arrangement, in the form of a head-wearable unit, which in the example of FIG. 2A takes the form of spectacles 200, which is for the present purposes configured to assess human condition based on SGF (for example based on measurements of IBI as discussed above).
[00114] These spectacles need not be functional as vision affecting spectacles (i.e. they do not necessarily include lenses, and may simply be a frame that provides a wearable mount, or other head-wearable device). Spectacles 200 include a frame 201 which is mounted to a human subject’s head, an IR transmitter/receiver assembly 202 which is positioned relative to the body thereby to, in use, transmit a predefined IR signal onto the subject’s eye, and receive a reflected IR signal resulting from reflection of the transmitted IR signal off the user’s eye or eyelid. A sizing adjustment mechanism 203 allows for control over positioning of a nose mount portion, thereby to allow effective locating of assembly 202 relative to the wearer’s eye. A processing unit 204 (which is optionally mounted to a spectacle arm) receives and processes the received IR signal. This processing may include:
• Onboard processing, using a set of artefact detection algorithms stored as computer code on a memory unit and executed via a microprocessor. For example, raw data from IR assembly 202 is subjected to one or more pre-processing algorithms (for example filters and the like), and an artefact detection algorithm operates to identify the presence of defined data artefacts, and provide an output signal in the case that those defined data artefacts are identified.
• External processing, via a secondary processing device. In this case, raw data from IR assembly 202 is transmitted (for example via Bluetooth or another wireless communication medium) to a secondary processing device, which optionally takes the form of a smartphone. In some embodiments an onboard processor performs preliminary processing of the raw data prior to transmission, for example to reduce complexity and/or amount of data required to be transmitted. The secondary processing device executes a software application which includes/accesses the set of artefact detection algorithms (which are stored on a memory unit of the secondary processing device). Again, these algorithms operate to identify the presence of defined data artefacts, and provide an output signal in the case that those defined data artefacts are identified.
[00115] In both cases, there is an optional functionality whereby all or a subset of data is collected for transmission or transmitted in real-time to a server device for further analysis.
[00116] It will be appreciated that the configuration of FIG. 2A is optionally used to collect a continuous stream of amplitude data as a function of time, which is subsequently processed thereby to enable extraction of data artefacts such as IBI. Alternatively, firmware or software may be configured to perform limited analysis which simply identifies blink start and end events, and from this maintains a count of IBIs in the SGC and non-SGC ranges, thereby to enable continuous generation of an SGF metric.
[00117] It will be appreciated that in further embodiments spectacles 200 use alternate sensor arrangements to record eyelid position, for example camera systems as discussed in embodiments further below.
Example Camera-Based Hardware Configuration
[00118] FIG. 2B illustrates a second example hardware arrangement, in the form of a camera-based blepharometric data monitoring system 210, which is for the present purposes configured to assess human condition based on SGF (for example based on measurements of IBI as discussed above).
[00119] This form of system is optionally installed in a vehicle, for example as a driver monitoring system which assesses SGF (for example as a means to assess driver performance by reference to factors such as drowsiness, stress, cognitive load, distractedness, intoxication, impairment, and/or other conditions).
[00120] System 210 includes a camera unit 211, which is positioned to capture image data in a region including a human subject’s face, when that human subject is positioned in a defined area. For example, in some cases the defined area is an operator position for a vehicle (such as a car, truck, aircraft, or other vehicle, including operator and/or passenger locations). In other embodiments the defined area is relative to a piece of furniture (for example to allow monitoring of a subject operating a computer or watching a television), or a clinical device. The camera unit may include a webcam provided by a computer device. A processing unit 212 processes image data from camera unit 211 via a vision system thereby to identify a subject’s facial region (for example using known facial detection algorithms), and from that identify the user’s eyes, and by way of image-driven tracking algorithms monitor the user’s eyes thereby to detect and measure blinks (optionally in combination with cloud-based processing 213). Blinks are identified and measured thereby to determine blepharometric data, which is processed using artefact detection algorithms, for example as discussed above. Once again, these algorithms operate to identify the presence of defined data artefacts, and provide an output signal in the case that those defined data artefacts are identified.
[00121] By way of example, in some embodiments the hardware arrangement of FIG. 2B is installed in a vehicle, such as an automobile, and as such configured to detect artefacts in blepharometric data which are relevant to an operator of the vehicle (for example in the context of detecting drowsiness and/or other neurological conditions).
[00122] Output, for example in terms of alerts and the like, is delivered via an output unit such as a display device 214 (which, in a vehicle embodiment, may be an in-vehicle display) or a networked computing device (such as a smartphone 215). In some embodiments delivery of data to an output device is provided from an Internet-based processing/data management facility to the display device rather than directly from system 212 (e.g. both are connected to a common networked data processing/management system). The output may be delivered to the human subject being monitored and/or to a third party.
[00123] In some embodiments, eyelid monitoring is performed via a process including the following steps, thereby to provide a signal representative of amplitude as a function of time.
(i) Identify that a human face is detected.
(ii) In a detected human face, identify an eye region. In some embodiments, algorithms are configured to track one eye region only; in other embodiments both eye regions are tracked thereby to improve data collection.
(iii) Identify, in the eye region(s), presence and movement of an eyelid. For example, in a preferred embodiment this is achieved by way of recording an eyelid position relative to a defined “open” position against time. This allows generation of blepharometric data in the form of eyelid position (amplitude) over time. It will be appreciated that such data provides for identification of events (for example blink events) and velocity (for example as a first derivative of position against time). In a preferred embodiment, a facial recognition algorithm is used to enable identification of: (i) a central position on an upper eyelid on a detected face; and (ii) at least two fixed points on the detected face. The two fixed points on the detected face are used to enable scaling of measurements of movement of the central position of the upper eyelid thereby to account for changes in relative distance between the user and the camera. That is, a distance between the two fixed points is used as a means to determine position of the face relative to the camera,
including position by reference to distance from the camera (as the user moves away, the distance between the fixed points decreases).
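The two-fixed-point scaling step might look like the following sketch (all names and the baseline convention are assumptions; the key idea is that the landmark spacing shrinks as the subject moves away from the camera, so normalising by it makes the eyelid displacement distance-invariant):

```python
import math

def scaled_eyelid_amplitude(eyelid_px, ref_a, ref_b, baseline_dist_px):
    """Normalise a pixel-space eyelid displacement by the current
    distance between two fixed facial landmarks (ref_a, ref_b),
    relative to the spacing measured at a calibration baseline."""
    d = math.dist(ref_a, ref_b)         # current landmark spacing (px)
    scale = baseline_dist_px / d        # > 1 when subject moves away
    return eyelid_px * scale
```

For example, if the landmark spacing halves because the subject has moved away, a 10-pixel eyelid movement is scaled back up to its 20-pixel baseline-equivalent amplitude.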
[00124] It will be appreciated that other techniques may be used. For example, in one embodiment a trained AI image classifier is used to identify blink commencement and completion events from images, for example based on a pre-training process.
Example Smartphone-Based Hardware Configuration
[00125] FIG. 2C illustrates a third blepharometric monitoring system, in the form of a smartphone-integrated blepharometric monitoring system 220, which is for the present purposes configured to assess human condition based on SGF (for example based on measurements of IBI as discussed further above).
[00126] From a hardware perspective, system 220 utilises existing smartphone hardware 221. A smartphone image capture unit (preferably a front-facing camera 222, but optionally a rear-facing camera) is leveraged by a software application 223 thereby to perform facial detection and blepharometric detection/measurement in a similar manner to the embodiment of FIG. 2B. In some embodiments the software application operates as a foreground application, which delivers graphical information via the smartphone screen 224 concurrently with blink detection (in some cases this graphical information is used to assist in standardising conditions for a blink detection period). In other embodiments the software application operates as a background application, which performs blink detection and measurement whilst other software applications are presented as foreground applications (for example blink detection whilst a user operates a messaging application). Processing of blink detection data is optionally performed via software application 223 using the smartphone’s internal processing capabilities, transmitted to a server device for remote processing, or a hybrid approach which includes both local processing and remote processing.
[00127] Similar to the example of FIG. 2C, one embodiment provides a portable electronic device including: a display screen; and a front-facing camera; wherein the portable electronic device is configured to concurrently execute: (i) a first software application that provides data via the display screen; and (ii) a second software application that receives input from the front-facing camera thereby to facilitate detection and analysis of blepharometric data. For example, the first software application is in one embodiment a messaging application, and in another embodiment a social media application. This allows for collection of blepharometric data whilst a user engages in conventional mobile device activities.
[00128] One embodiment provides computer executable code that when executed causes delivery via a computing device of a software application with which a user interacts for a purpose other than blepharometric-based data collection, wherein the computer executable code is additionally configured to collect data from a front-facing camera thereby to facilitate analysis of blepharometric data. The purpose may be, for example, messaging or social media.
[00129] Embodiments such as that of FIG. 2C provide for collection of blepharometric data via a background software application executing on an electronic device with a front-facing camera. This provides opportunities to analyse a device user’s neurological condition, for example in the context of predicting seizures, advising on activities, diagnosing potential neurological illnesses, detecting drowsiness, and so on.
Further Example Blepharometric Data Monitoring System
[00130] Further example blepharometric data monitoring systems are discussed below. It will be appreciated that these are configurable to perform SGF assessment as described herein as a particular form of blepharometric data analysis. Components of these systems can in further embodiments be incorporated into any of the systems above.
[00131] FIG. 5A illustrates an example in-vehicle blepharometric data monitoring system, which is configurable for the purposes of measuring IBIs and calculating SGF for any of the assessment methodology embodiments discussed above. Whilst it is known to provide a blepharometric data monitoring system in a vehicle for the purposes of point-in-time analysis of alertness/drowsiness, based on blink rates, blink frequency, blink duration and/or AVRs, such hardware has not been proposed as a means for assessing SGF.
[00132] The system of FIG. 5A includes an image capture device 520. This may include substantially any form of appropriately sized digital camera, preferably a digital camera with a frame rate of over 60 frames per second. Higher frame rate cameras are preferred, given that with enhanced frame rate comes an ability to obtain higher resolution data for eyelid movement. In some embodiments frame rates are upscaled and/or fitted to models thereby to improve accuracy of detection of events representing commencement and completion of blinks, as used herein for IBI measurement purposes.
[00133] Device 520 is positioned to capture a facial region of a subject. Device 520 is in one embodiment installed in a region of a vehicle in the form of an automobile, for example on or adjacent the dashboard, windscreen, or visor, such that it is configured to capture a facial region of a driver. In another embodiment device 520 is positioned on or adjacent the dashboard, windscreen, or visor, such that it is configured to capture a facial region of a front seat passenger. In another embodiment device 520 is positioned in a region such as the rear of a seat such that it is configured to capture a facial region of a back-seat passenger. In some embodiments a combination of these are provided, thereby to enable blepharometric data monitoring for both a driver and one or more passengers.
[00134] Although the system of FIG. 5A (and other systems) are described by reference to a vehicle in the form of an automobile, it will be appreciated that a system as described is also optionally implemented in other forms of vehicles, including mass-transport vehicles such as passenger airplanes, buses/coaches, and trains. In such embodiments there are preferably one or more analysis systems each supporting a plurality of image capture devices, each positioned to capture a respective passenger.
[00135] An in-vehicle image processing system 510 is configured to receive image data from image capture device 520 (or multiple devices 520), and process that data thereby to generate blepharometric data. A control module 511 is configured to control device 520, operation of image data processing, and management of generated data. This includes controlling operation of image data processing algorithms, which are configured to:
(i) Identify that a human face is detected.
(ii) Optionally perform subject identification (for example via facial recognition algorithms or technologies which identify a subject via alternate means). This may include identifying a known subject based on an existing subject record defined in user identification data 551 stored in a memory system 550, or identifying an unknown subject and creating a new subject record in user identification data 551 stored in memory system 550.
(iii) In a detected human face, identify an eye region. In some embodiments the algorithms are configured to track one eye region only; in other embodiments both eye regions are tracked thereby to improve data collection.
(iv) Identify, in the eye region(s), presence and movement of an eyelid. For example, in a preferred embodiment this is achieved by way of recording an eyelid position relative to a defined “open” position against time. This allows generation of blepharometric data in the form of eyelid position (amplitude) over time. It will be appreciated that such data provides for identification of events (for example blink events) and velocity (for example as a first derivative of position against time). In a preferred embodiment, a facial recognition algorithm is used to enable identification of: (i) a central position on an upper eyelid on a detected face; and (ii) at least two fixed points on the detected face. The two fixed points on the detected face are used to enable scaling of measurements of movement of the central position of the upper eyelid thereby to account for changes in relative distance between the user and the camera. That is, a distance between the two fixed points is used as a means to determine position of the face relative to the camera, including position by reference to distance from the camera (as the user moves away, the distance between the fixed points decreases).
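By way of a non-limiting illustrative sketch, the scaling step described above may be implemented along the following lines (function and parameter names are hypothetical, not part of any claimed embodiment):

```python
# Sketch: scaling a raw eyelid-position measurement (in pixels) by the ratio
# of a reference inter-point distance to the currently observed distance
# between two fixed facial points, so amplitudes remain comparable as the
# subject moves toward or away from the camera.

def scaled_eyelid_position(eyelid_px, fixed_a, fixed_b, reference_distance):
    """Scale an eyelid displacement by the reference/observed distance ratio.

    fixed_a and fixed_b are (x, y) pixel coordinates of the two fixed points;
    reference_distance is the inter-point distance recorded at calibration.
    """
    observed = ((fixed_a[0] - fixed_b[0]) ** 2
                + (fixed_a[1] - fixed_b[1]) ** 2) ** 0.5
    if observed == 0:
        raise ValueError("fixed points coincide; cannot scale")
    return eyelid_px * (reference_distance / observed)
```

For instance, if the observed inter-point distance halves (the subject has moved further from the camera), a raw 10-pixel eyelid displacement scales to 20 units, recovering an amplitude comparable with the calibration distance.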
[00136] Algorithms 512 optionally operate to extract additional artefacts from blepharometric data, for example amplitude-velocity ratios, blink total durations, inter-event durations, and the like. It will be appreciated, however, that extraction of such artefacts may occur in downstream processing.
[00137] A blepharometric data management module 513 is configured to coordinate storage of blepharometric data generated by algorithms 512 in user blepharometric data 552. This includes determining a user record against which blepharometric data is to be recorded (in some cases there is only a single user record, for example where blepharometric data is collected only from a primary driver of an automobile). In some embodiments the function of module 513 includes determining whether a set of generated blepharometric data meets threshold data quality requirements for storage, for example based on factors including a threshold unbroken time period for which eyelid tracking is achieved and blepharometric data is generated.
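A minimal non-limiting sketch of such a quality gate follows (the sample representation, function name, and default threshold are illustrative assumptions, not values from the specification):

```python
# Sketch: gating storage of a recording on a minimum unbroken eyelid-tracking
# period. tracked_flags is a per-sample boolean sequence indicating whether
# eyelid tracking succeeded for that frame.

def meets_quality_threshold(tracked_flags, sample_rate_hz, min_unbroken_s=30.0):
    """Return True if at least one unbroken run of successfully tracked
    samples lasts min_unbroken_s seconds."""
    required = int(min_unbroken_s * sample_rate_hz)
    run = longest = 0
    for ok in tracked_flags:
        run = run + 1 if ok else 0  # a failed frame resets the current run
        longest = max(longest, run)
    return longest >= required
```

A recording whose longest unbroken tracked run falls short of the threshold would be discarded rather than stored against the user record.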
[00138] Memory system 550 includes user identification data 551 for one or more users. As noted, in some embodiments system 501 is configured to collect and analyse blepharometric data for only a single user (for instance the primary driver of a vehicle) and includes identification data to enable identification of only that user. In other embodiments, system 501 includes functionality to collect and analyse blepharometric data for multiple users, and includes identification data to enable identification of any of those users (and optionally, as noted above, defining of a new record for a previously unknown user). The identification data may include login credentials (for example a user ID and/or password) which are inputted via an input device. Alternately, the identification data may be biometric, for example using facial recognition as discussed above or an alternate biometric input (such as a fingerprint scanner). In some embodiments this leverages an existing biometric identification system of the vehicle.
[00139] User blepharometric data 552 includes data associated with identified users, the data being time-coded thereby to enable identification of a date/time at which data was collected. The blepharometric data stored in data 552 optionally includes blepharometric data generated by algorithms 512 and further blepharometric data derived from further processing of that data, for example data representing average periodic IEDs and/or BTDs, and other relevant statistics which may be determined over time. In some embodiments data processing algorithms are updated over time, for example to allow analysis of additional biomarkers determined to be representative of neurological conditions which require extraction of particular artefacts from blepharometric data.
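By way of non-limiting illustration, average inter-event durations (IEDs) and blink total durations (BTDs) may be derived from a chronological list of blink events as sketched below (the event representation as (start, end) second pairs is an assumption for illustration):

```python
# Sketch: deriving summary statistics from blink events. Each blink is a
# (start_s, end_s) pair in seconds; BTD is the duration of a blink, and IED
# is the gap between the end of one blink and the start of the next.

def blink_statistics(blinks):
    """Return (mean BTD, mean IED) for a chronological list of blink events."""
    btds = [end - start for start, end in blinks]
    ieds = [blinks[i + 1][0] - blinks[i][1] for i in range(len(blinks) - 1)]
    mean_btd = sum(btds) / len(btds)
    mean_ied = sum(ieds) / len(ieds) if ieds else float("nan")
    return mean_btd, mean_ied
```

Statistics of this kind, recomputed periodically, would form the time-coded derived records stored alongside the raw eyelid-position data.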
[00140] Analysis modules 530 are configured to perform analysis of user blepharometric data 552. This includes executing a process including identification of relationships between current blepharometric data artefacts (e.g. data recently received from system 510) and historical blepharometric data artefacts (e.g. older data pre-existing in memory system 550). This allows for artefacts extracted in the current blepharometric data to be given context relative to baselines/trends already observed for that subject. The concept of “identification of relationships” should be afforded a broad interpretation to include at least the following:
• Identification of long-term trends. For example, blepharometric data collected over a period of weeks, months or years may be processed thereby to identify any particular blepharometric data artefacts that are evolving/trending over time. In some embodiments, algorithms are configured to monitor such trends, and these are defined with a set threshold for variation, which may be triggered in response to a particular set of current blepharometric data.
• Identification of current point-in-time deviations from baselines derived from historical blepharometric data. For example, current data may show anomalous spiking in particular artefacts, or other differences from baselines derived from the subject’s historical blepharometric data, which may give rise for concern. By way of example, this form of analysis may be used to determine/predict the presence of: (i) onset of a neurological illness or degenerative condition; (ii) presence of a brain injury, including a traumatic brain injury; (iii) impairment by alcohol, drugs, or other physical condition; (iv) abnormal levels of drowsiness; or (v) other factors.
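The point-in-time deviation analysis above can be sketched, in a non-limiting way, as a simple z-score test against the subject's historical baseline (the function name and the three-standard-deviation threshold are illustrative assumptions):

```python
# Sketch: flagging a current blepharometric artefact value that deviates from
# a subject's historical baseline by more than z_threshold standard deviations.

from statistics import mean, stdev

def deviation_flag(historical_values, current_value, z_threshold=3.0):
    """Return True when current_value lies more than z_threshold standard
    deviations from the historical mean for this subject."""
    mu = mean(historical_values)
    sigma = stdev(historical_values)
    if sigma == 0:
        # Degenerate baseline: any departure from the constant history flags.
        return current_value != mu
    return abs(current_value - mu) / sigma > z_threshold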
[00141] Analysis modules are optionally updated over time (for example via firmware updates or the like) thereby to allow for analysis of additional blepharometric data artefacts and hence identification of neurological conditions. For example, when a new method is developed for processing blepharometric data thereby to predict a neurological condition based on a change trend in one or more blepharometric data artefacts, an analysis algorithm for that method is preferably deployed across a plurality of systems such as system 501 via a firmware update or the like.
[00142] System 501 additionally includes a communication system 560, which is configured to communicate information from system 501 to human users. This may include internal communication modules 561 which provide output data via components installed in the vehicle, for example an in-car display, warning lights, and so on. External communication modules 562 are also optionally present, for example to enable communication of data from system 501 to user devices (for example via Bluetooth, WiFi, or other network interfaces), optionally by email or other messaging protocols. In this regard, communication system 560 is configured to communicate results of analysis by analysis modules 530.
[00143] A control system 540 includes logic modules 541 which control overall operation of system 501. This includes execution of logical rules thereby to determine communications to be provided in response to outputs from analysis modules 530. For example, this may include:
• An in-vehicle notification in the event that a threshold level of drowsiness is detected.
• An in-vehicle notification of another neurological condition.
• An in-vehicle notification with an alert code that is to be inputted into an online system thereby to obtain further information regarding a detected/predicted neurological condition.
• An external communication to a device/address defined in user identification data 551.
[00144] It will be appreciated that these are examples only, and logic modules 541 are able to provide a wide range of functionalities thereby to cause system 501 to act based on determinations by analysis modules 530.
[00145] It should be appreciated that the system illustrated in FIG. 5A provides technology whereby one or more digital cameras are able to be installed in a vehicle, such as an automobile or mass transport vehicle, thereby to: (i) collect blepharometric data for an operator and/or one or more passengers; and (ii) enable determination of relationships between blepharometric data collected in a “current” period (for example a last data set, a last day, a last week, or a last month) with historical blepharometric data that is stored for that same user. This allows for functionalities including, but not limited to:
• User-personalised drowsiness detection, based on detection of drowsiness-related blepharometric data artefacts that are beyond a threshold deviation from average values for a particular user;
• Prediction of neurological conditions, based on sudden changes and/or long term trends in change for one or more blepharometric data artefacts that are known to be indicative of particular neurological conditions;
• Personalised prediction of future neurological conditions, for example prediction of future drowsiness based on known drowsiness development patterns extracted for the individual from historical data, and prediction of likelihood of a seizure based on individually-verified seizure prediction biomarkers identifiable in blepharometric data.
• Identification of point-in-time relevant neurological conditions based on sudden deviations from historical averages, which may be representative of sudden neurological changes, for example traumatic brain injuries (e.g. concussion) and/or impairment based on other factors (such as medications, drugs, alcohol, illness, and so on).
Example In-Vehicle Blepharometric data Monitoring Systems, with Cloud-based Analysis
[00146] FIG. 5B illustrates a further embodiment, which includes various common features with the embodiment illustrated in FIG. 5A. In general terms, for these embodiments, external communication modules 562 facilitate communication with a remote server device, which optionally performs additional blepharometric data analysis. In the example of FIG. 5B, external communication modules 562 enable communication between system 501 and a cloud-based blepharometric data analysis system 580. This may, for example, perform SGF assessment based on a wider set of algorithms, benchmarks and/or comparison data sets compared with what is available at a local system. This also allows for blepharometric data, in this case including SGF data, to be collected over a longer term thereby to allow for identification of longer-term trends, sudden changes compared to historical values, and so on.
[00147] System 580 includes a control system 582 and logic modules 581 which are provided by computer executable code executing across one or more computing devices thereby to control and deliver functionalities of system 580.
[00148] System 580 additionally includes a memory system 583, which includes user identification data 584 and user blepharometric data 585. The interplay between memory system 583 and memory system 550 varies between embodiments, with examples discussed below:
• In some embodiments memory system 550 operates in parallel with memory system 583, such that certain records are synchronised between the systems based on a defined protocol. For example, this optionally includes a given memory system 550 maintaining user blepharometric data and user identification data for a set of subjects that have presented at that in-vehicle system, and that data is periodically synchronised with the cloud system. For example, upon an unrecognised user presenting at a given in-vehicle system, the system optionally performs a cloud (or other external) query thereby to obtain identification data for that user, and then downloads from the cloud system historical user blepharometric data for that user. Locally collected blepharometric data is uploaded to the server. This, and other similar approaches, provide for transportability of user blepharometric data between vehicles.
• In some embodiments, memory system 550 is used primarily for minimal storage, with system 580 providing a main store for user blepharometric data. For example, in one example system 550 includes data representative of historical blepharometric data baseline values (for instance defined as statistical ranges), whereas detailed recordings of blepharometric data are maintained in the cloud system. In such embodiments, analysis modules 586 of cloud system 580 perform more complex analysis of user blepharometric data thereby to extract the historical blepharometric data baseline values, which are provided to memory system 550 where a given user is present or known thereby to facilitate local analysis of relationships from baselines.
• In some embodiments local memory system 550 is omitted, with all persistent blepharometric data storage occurring in cloud memory system 583.
[00149] System 580 additionally includes analysis modules 586, which optionally perform a similar role to modules 530 in FIG. 5A. In some embodiments local and cloud analysis modules operate in a complementary fashion, for example with analysis modules 530 performing relationship analysis relevant to point-in-time factors (for example an altered/non-standard neurological state for a user by comparison with historical baselines, which warrants immediate intervention) and analysis modules 586 performing what is often more complex analysis of trends over time (which may be representative of degenerative neurological illnesses and the like) and does not require immediate local intervention in a vehicle. It will be appreciated that there exist a range of approaches for sharing processing (and memory storage) functions between an in-vehicle system and a cloud system, and configuration of these is optionally determined based on considerations such as network speeds/bandwidth, along with local memory and storage resource availability.
[00150] There are various advantages of incorporating a cloud-based system to operate with a plurality of in-vehicle systems, in particular an ability to maintain cloud storage of user identification data and user blepharometric data for a large number of users, and hence allow that data to “follow” the users between various vehicles over time. For example, a user may have a personal car with a system 501, and subsequently obtain a rental car whilst travelling with its own system 501, and as a result of cloud system 580 the rental car system: has access to the user’s historical blepharometric data; is able to perform relationship analysis of the current data collected therein against historical data obtained from the cloud system; and feed into the cloud system the new blepharometric data collected to further enhance the user’s historical data store.
[00151] FIG. 5C illustrates a further variation where a user has a smartphone device 570 that executes a software application configured to communicate with a given local in-vehicle system 501 (for example via Bluetooth or USB connection) and additionally with cloud system 580 (for example via a wireless cellular network, WiFi connection, or the like). This provides functionality for communication between system 501 and system 580 without needing to provide Internet connectivity to a vehicle (the in-vehicle system essentially uses smartphone 570 as a network device).
[00152] Using a smartphone device as an intermediary between system 501 and system 580 is in some embodiments implemented in a manner that provides additional technical benefits. For example:
• In some embodiments smartphone 570 provides to system 501 data that enables identification of a unique user, avoiding a need for facial detection and/or other means. For instance, upon coupling a smartphone to an in-car system (which may include system 501 and one or more other in-car systems, such as an entertainment system) via Bluetooth, system 501 receives user identification data from smartphone 570.
• In some embodiments a most-recent version of a given user’s historical blepharometric data (for example defined as historical baseline values) is stored on smartphone 570, and downloaded to system 501 upon coupling.
• In some embodiments one or more functionalities of analysis modules 530 are alternately performed via smartphone 570, in which case system 501 is optionally configured to be, in effect, a blepharometric data collection and communication system without substantive blepharometric data analysis functions (which are instead performed by smartphone 570, and optionally tailored via updating of smartphone app parameters by system 580 for personalised analysis).
[00153] The use of smartphone 570 is also in some cases useful in terms of allowing users to retain individual control over their blepharometric data, with blepharometric data being stored on the user’s smartphone rather than by an in-vehicle system.
[00154] FIG. 5D illustrates a further variation in which communication between a local system 501 and cloud system 580 operates in a similar manner to FIG. 5B, but where a smartphone 570 is still present. In such arrangements, the smartphone is optionally used as an output device for information derived from blepharometric data analysis, and/or as a device to confirm identity and approval for blepharometric data collection. For example, in one embodiment a given system 501 identifies a user by way of biometric information (e.g. facial detection) using user identification data stored in system 583 of cloud system 580, and a message is sent to smartphone 570 allowing the user to confirm that they are indeed in the location of the relevant system 501, and providing an option to consent to blepharometric data monitoring.
Example Cloud-Based Extended Blepharometric data Monitoring Framework
[00155] FIG. 6 illustrates an exemplary framework under which a cloud-based blepharometric data analysis system 580 operates in conjunction with a plurality of
disparate blepharometric data monitoring systems 601-606. Each of these systems is in communication with system 580, such that user data (for example user blepharometric data comprising historical data) is able to be utilised for analysis even where a user’s blepharometric data is collected from physically distinct monitoring systems. Analysis of blepharometric data (for example determination of relationships between current and historical data) may be performed at the cloud system 580, at the local systems 601-606, or combined across the cloud and local systems.
[00156] The local systems illustrated in FIG. 6 are:
• Vehicle operator configurations 601. These are in-vehicle systems, such as that of FIG. 5A-5D, in which the image capture device is positioned to capture blepharometric data for an operator of the vehicle.
• Desktop/laptop computer configurations 602. In these configurations, a webcam or other image capture device is used to monitor user blepharometric data, with image-based eyelid movement detection as discussed herein. This may occur subject to: (i) a foreground application (for example an application which instructs a user to perform a defined task during which blepharometric data is collected); and/or (ii) a background application which collects blepharometric data whilst a user engages in other activities on the computer (for example word processing and/or internet browsing).
• Mass transport passenger configurations 603, for example airlines, buses, trains and the like. Ideally, these are configured such that an image capture device operates in conjunction with a display screen, such that blepharometric data is collected concurrently with the user observing content displayed via the display screen.
• Vehicle passenger configurations 604. These are in-vehicle systems, such as that of FIG. 5A-5D, in which the image capture device is positioned to capture blepharometric data for a passenger of the vehicle. For back-seat applications, these are optionally configured such that an image capture device operates in conjunction with a display screen, such that blepharometric data is collected
concurrently with the user observing content displayed via the display screen. For front-seat applications, the camera is positioned based on a presumption that a front seat passenger will for a substantial proportion of the time pay attention to the direction of vehicle travel (e.g. watch the road).
• Smartphone/tablet configurations 605. In these configurations, a front-facing camera is used to monitor user blepharometric data, with image-based eyelid movement detection as discussed herein. This may occur subject to: (i) a foreground application (for example an application which instructs a user to perform a defined task during which blepharometric data is collected); and/or (ii) a background application which collects blepharometric data whilst a user engages in other activities on the device (for example messaging and/or social media application usage).
• Medical facility configurations 606. These may make use of image-processing-based blepharometric data monitoring, and/or other means of data collection (such as infrared reflectance oculography spectacles). These provide a highly valuable component in the overall framework: due to centralised collection of blepharometric data over time for a given subject from multiple locations over an extended period of time, a hospital is able to perform point-in-time blepharometric data collection and immediately reference that against historical data thereby to enable identification of irregularities in neurological conditions.
[00157] FIG. 6 also shows how system 580 is able to interact with a plurality of user mobile devices such as device 570. User identification data 584 provides addressing information thereby to enable system 580 to deliver messages, alerts, and the like to correct user devices.
[00158] Beyond providing an ability to carry user blepharometric data baselines and data collection between physical collection systems, an added benefit of a system such as that of FIG. 6 is an ability to personalise condition prediction algorithms for individual users. This is achieved by: (i) identifying a personalised blepharometric data biomarker for a given user, wherein that blepharometric data artefact is representative of a particular neurological condition; and (ii) configuring the system such that whenever that particular user is identified, an analysis system executes a process configured to monitor for that biomarker (and perform a defined action in response). By way of example, it may be determined that a particular person displays a specific blepharometric data biomarker (for example threshold spiking in negative inter event duration) in the lead-up to a seizure event; a process configured to monitor for that biomarker is initialised in response to identification of that person. For example, an analysis module of an in-vehicle device is configured for such monitoring once the person is detected, and provides a seizure warning when the biomarker is detected.
Conclusions and Interpretation
[00159] It will be appreciated that the above disclosure provides analytic methods and associated technology that enables improved analysis of human neurological conditions. In particular, the present technology provides improved methods for monitoring SGF, without a need to provide exogenous stimuli to elicit startle blinks, and additionally without a need to use invasive monitoring equipment.
[00160] There are numerous advantages of assessing sensorimotor gating function as described herein via blink analysis, as opposed to prior art methods. These include, but are not limited to, the following:
• Assessments are able to be made continuously, based on a series of blink-free intervals that is updated after every blink, or made as a single measurement. This facilitates detailed studies of how sensorimotor gating varies over periods ranging from a few seconds to many hours.
• There is no need to attach EMG electrodes to the subject’s face (unlike in common prior art SGF assessment systems).
• No exogenous sensory stimuli are used to generate reflex blinks or startle responses, including whole-body twitches in experimental animals. The stimuli that are currently used in the prior art are by definition somewhat noxious, reliably producing startle responses. With the technology described herein, the subject need not be constantly aware of the measurements being made.
• The technology described herein enables sensorimotor gating to be measured under circumstances in which it has not been possible to do so previously (for example while driving a vehicle).
• A device configured to measure sensorimotor gating, based on blink-free intervals, could also analyse the characteristics of blinks that are currently used to measure alertness/drowsiness (and/or other attributes) based on known technologies. These two forms of analysis could be made independently during any test period, and this would enhance interpretation of the results from each test.
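As a non-limiting sketch of the BFI-based assessment summarised above (and in claims 4-5), an SGF index may be computed as the proportion of blink-free intervals falling within a predicted sensorimotor-gating-controlled (SGC) range. The function name and the default range bounds below are illustrative placeholders, not values taken from this disclosure:

```python
# Sketch: inferring a simple SGF index from a series of blink-free intervals
# (BFIs, in seconds), as the fraction of BFIs whose duration lies within an
# assumed SGC range. The comparison of in-range counts against total counts
# mirrors the claimed analysis structure.

def sgf_index(bfis_s, sgc_range=(0.2, 1.0)):
    """Return the fraction of BFIs within the SGC range, used as a proxy
    for the state of sensorimotor gating function."""
    if not bfis_s:
        raise ValueError("no blink-free intervals supplied")
    lo, hi = sgc_range
    in_range = sum(1 for b in bfis_s if lo <= b <= hi)
    return in_range / len(bfis_s)
```

Because the series of BFIs is updated after every blink, an index of this kind can be recomputed continuously over a sliding analysis period, supporting the second-to-hour timescales noted above.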
[00161] It should be appreciated that in the above description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, FIG., or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the Detailed Description are hereby expressly incorporated into this Detailed Description, with each claim standing on its own as a separate embodiment of this invention.
[00162] Furthermore, while some embodiments described herein include some but not other features included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention, and form different embodiments, as would be understood by those skilled in the art. For example, in the following claims, any of the claimed embodiments can be used in any combination.
[00163] Furthermore, some of the embodiments are described herein as a method or combination of elements of a method that can be implemented by a processor of a computer system or by other means of carrying out the function. Thus, a processor with the necessary instructions for carrying out such a method or element of a method forms a means for carrying out the method or element of a method. Furthermore, an element described herein
of an apparatus embodiment is an example of a means for carrying out the function performed by the element for the purpose of carrying out the invention.
[00164] In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In other instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
[00165] Similarly, it is to be noticed that the term coupled, when used in the claims, should not be interpreted as being limited to direct connections only. The terms "coupled" and "connected," along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other. Thus, the scope of the expression a device A coupled to a device B should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. It means that there exists a path between an output of A and an input of B which may be a path including other devices or means. "Coupled" may mean that two or more elements are either in direct physical or electrical contact, or that two or more elements are not in direct contact with each other but yet still co-operate or interact with each other.
[00166] Thus, while there has been described what are believed to be the preferred embodiments of the invention, those skilled in the art will recognize that other and further modifications may be made thereto without departing from the spirit of the invention, and it is intended to claim all such changes and modifications as falling within the scope of the invention. For example, any formulas given above are merely representative of procedures that may be used. Functionality may be added or deleted from the block diagrams and operations may be interchanged among functional blocks. Steps may be added or deleted to methods described within the scope of the present invention.
Claims
1. A method of assessing a neurological condition of a human subject, the method including: collecting data representative of characteristics of blinks performed by a human subject, wherein the blinks occur without known application of exogenous sensory stimuli; and processing the data thereby to infer a state of Sensorimotor Gating Function (SGF).
2. A method according to claim 1 wherein processing the data thereby to infer a state of SGF includes analysis of Blink Free Intervals (BFIs).
3. A method according to claim 2 wherein the analysis of blink free intervals includes:
(i) identifying a series of Blink Free Intervals (BFIs) having an interval time within a predicted Sensorimotor Gating Controlled (SGC) range;
(ii) analysing the series of BFIs having an interval time within the predicted SGC range thereby to infer the state of SGF.
4. A method according to claim 2 wherein the analysis of blink free intervals includes:
(i) identifying a series of blink free intervals having an interval time within a predicted Sensorimotor Gating Controlled (SGC) range over a defined analysis period;
(ii) identifying a series of blink free intervals having an interval time within a predicted non-SGC range over the defined analysis period; and
(iii) inferring the state of SGF based on a comparison of the number of BFIs in the SGC range against the number of BFIs in the non-SGC range.
5. A method according to claim 2 wherein the analysis of blink free intervals includes:
(i) identifying a series of blink free intervals having an interval time within a predicted Sensorimotor Gating Controlled (SGC) range over a defined analysis period;
(ii) identifying a total series of BFIs over the defined analysis period; and
(iii) inferring the state of SGF based on a comparison of the number of BFIs in the SGC range against the total number of BFIs over the analysis period.
6. A method according to claim 1 wherein processing the data thereby to infer a state of Sensorimotor Gating Function (SGF) includes identifying presence of a reflex blink following a primary blink.
7. A method of assessing Sensorimotor Gating Function (SGF), the method including: collecting data representative of characteristics of blinks performed by a human subject; and processing the data thereby to infer a state of SGF.
8. A method according to claim 7 wherein collecting data representative of characteristics of blinks performed by a human subject includes collecting data representative of characteristics of blinks performed by the human subject, wherein the blinks occur without known application of exogenous sensory stimuli.
9. A method according to claim 8 wherein processing the data thereby to infer a state of SGF includes: (i) identifying a series of Blink Free Intervals (BFIs) having an interval time within a predicted Sensorimotor Gating Controlled (SGC) range; (ii) analysing the series of BFIs having an interval time within the predicted SGC range thereby to infer the state of SGF.
10. A method for assessing a human subject, the method including: collecting data representative of characteristics of blinks performed by the human subject; and processing the data thereby to assess the human subject based on characteristics of Inter-Blink Intervals (IBIs) for an assessment period.
11. A method according to claim 10 wherein collecting data representative of characteristics of blinks performed by a human subject includes collecting data representative of characteristics of blinks performed by the human subject, wherein the blinks occur without known application of exogenous sensory stimuli.
48 A method according to claim 11 wherein processing the data thereby to infer a state of SGF includes: (i) identifying a series of Blink Free Intervals (BFIs) having an interval time within a predicted Sensorimotor Gating Controlled (SGC) range; (ii) analysing the series of BFIs having an interval time within the predicted SGC range thereby to infer the state of SGF. A method for assessing a vehicle operator, the method including: collecting data representative of characteristics of blinks performed by the vehicle operator; and processing the data thereby to infer a state of Sensorimotor Gating Function (SGF). A method according to claim 13 wherein collecting data representative of characteristics of blinks performed by a human subject includes collecting data representative of characteristics of blinks performed by the human subject, wherein the blinks occur without known application of exogenous sensory stimuli. A method according to claim 14 wherein processing the data thereby to infer a state of SGF includes: (i) identifying a series of Blink Free Intervals (BFIs) having an interval time within a predicted Sensorimotor Gating Controlled (SGC) range; (ii) analysing the series of BFIs having an interval time within the predicted SGC range thereby to infer the state of SGF. A method for assessing a vehicle operator, the method including: collecting data representative of characteristics of blinks performed by the vehicle operator,; and processing the data thereby to infer assess the vehicle operator based of characteristics of Inter-Blink Intervals (I Bls) for an assessment period. A method according to claim 16 wherein collecting data representative of characteristics of blinks performed by a human subject includes collecting data representative of characteristics of blinks performed by the human subject, wherein the blinks occur without known application of exogenous sensory stimuli.
A method according to claim 17 wherein processing the data thereby to infer a state of SGF includes: (i) identifying a series of Blink Free Intervals (BFIs) having an interval time within a predicted Sensorimotor Gating Controlled (SGC) range; (ii) analysing the series of BFIs having an interval time within the predicted SGC range thereby to infer the state of SGF. A system configured to perform a method according to any preceding claim.
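The counting scheme in steps (i)–(iii) above can be sketched in a few lines. This is a minimal illustration only, not the patented implementation: the function name `sgf_ratio`, the placeholder SGC range of 0.2–1.0 seconds, and the use of a simple ratio as the inferred SGF indicator are all assumptions made for the sketch.

```python
def sgf_ratio(blink_times, sgc_range=(0.2, 1.0)):
    """Infer a crude SGF indicator from blink onset timestamps.

    blink_times: sorted blink onset times (seconds) over the analysis period.
    sgc_range:   assumed (min, max) BFI duration attributed to sensorimotor
                 gating control; a real system would calibrate this range.
    Returns the fraction of Blink Free Intervals (BFIs) falling inside the
    SGC range, or None if fewer than two blinks were observed.
    """
    # (ii) the total series of BFIs: gaps between consecutive blinks
    bfis = [b - a for a, b in zip(blink_times, blink_times[1:])]
    if not bfis:
        return None
    # (i) the subset of BFIs whose duration falls in the predicted SGC range
    in_sgc = [d for d in bfis if sgc_range[0] <= d <= sgc_range[1]]
    # (iii) compare counts: ratio of SGC-range BFIs to all BFIs
    return len(in_sgc) / len(bfis)
```

For example, blinks at 0.0, 0.5, 2.0, 2.4 and 2.9 seconds give BFIs of roughly 0.5, 1.5, 0.4 and 0.5 seconds, three of which fall inside the assumed range, yielding a ratio of 0.75.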
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2020903401 | 2020-09-22 | ||
AU2020903401A AU2020903401A0 (en) | 2020-09-22 | Devices and processing systems configured to enable assessment of a condition of a human subject based on sensorimotor gating of blinks |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022061403A1 true WO2022061403A1 (en) | 2022-03-31 |
Family
ID=80844503
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/AU2021/051106 WO2022061403A1 (en) | 2020-09-22 | 2021-09-22 | Devices and processing systems configured to enable assessment of a condition of a human subject based on sensorimotor gating of blinks |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2022061403A1 (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5786765A (en) * | 1996-04-12 | 1998-07-28 | Mitsubishi Jidosha Kogyo Kabushiki Kaisha | Apparatus for estimating the drowsiness level of a vehicle driver |
US6346887B1 (en) * | 1999-09-14 | 2002-02-12 | The United States Of America As Represented By The Secretary Of The Navy | Eye activity monitor |
US20140152792A1 (en) * | 2011-05-16 | 2014-06-05 | Wesley W. O. Krueger | Physiological biosensor system and method for controlling a vehicle or powered equipment |
US20140205149A1 (en) * | 2011-09-05 | 2014-07-24 | Toyama Prefecture | Doze detection method and apparatus thereof |
US10482333B1 (en) * | 2017-01-04 | 2019-11-19 | Affectiva, Inc. | Mental state analysis using blink rate within vehicles |
2021-09-22 | WO PCT/AU2021/051106 patent/WO2022061403A1/en | active Application Filing
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Tag et al. | Continuous alertness assessments: Using EOG glasses to unobtrusively monitor fatigue levels In-The-Wild | |
Quddus et al. | Using long short term memory and convolutional neural networks for driver drowsiness detection | |
Toivanen et al. | A probabilistic real-time algorithm for detecting blinks, saccades, and fixations from EOG data | |
EP3265334A1 (en) | Device and method for predicting a vigilance level of a driver of a motor vehicle | |
US10888241B2 (en) | Device and method to determine objectively visual memory of images | |
US20220022805A1 (en) | Seizure detection via electrooculography (eog) | |
US11670423B2 (en) | Method and system for early detection of neurodegeneration using progressive tracking of eye-markers | |
US20210236023A1 (en) | TECHNOLOGY ADAPTED TO ENABLE IMPROVED COLLECTION OF INVOLUNTARY EYELlD MOVEMENT PARAMETERS, INCLUDING COLLECTION OF EYELlD MOVEMENT PARAMETERS TO SUPPORT ANALYSIS OF NEUROLOGICAL FACTORS | |
EP3760116A1 (en) | Eye blink sensor and method of examining blinking of an eye of a user | |
US20220218253A1 (en) | Impairment Detection Method and Devices | |
US20210386345A1 (en) | Devices and processing systems configured to enable extended monitoring and analysis of subject neurological factors via blepharometric data collection | |
Dari et al. | Unsupervised blink detection and driver drowsiness metrics on naturalistic driving data | |
WO2022061403A1 (en) | Devices and processing systems configured to enable assessment of a condition of a human subject based on sensorimotor gating of blinks | |
AU2020102426B4 (en) | Collection of blepharometric data via a camera system | |
US20210369161A1 (en) | System and method for detection and continuous monitoring of neurological condition of a user | |
Niwa et al. | A wearable device for traffic safety-a study on estimating drowsiness with eyewear, JINS MEME | |
AU2021100643A4 (en) | Ai-based technology configured to enable physiological event prediction based on blepharometric data | |
AU2021100635B4 (en) | Identification of risk of degeneratve neurological conditions via blepharometric data collection | |
AU2021100637B4 (en) | Blepharometric monitoring system for a vehicle which provides user-customised analysis | |
AU2021100641B4 (en) | Extended period blepharometric monitoring across multiple data collection platforms | |
WO2022256877A1 (en) | Prediction of human subject state via hybrid approach including ai classification and blepharometric analysis, including driver monitoring systems | |
Golz et al. | Detection and prediction of driver’s microsleep events | |
US20230284974A1 (en) | Systems and methods for diagnosing, assessing, and quantifying sedative effects | |
Nuhoglu et al. | Assessment of electrode-based spontaneous eye blink analysis | |
Golz et al. | Microsleep detection in electrophysiological signals |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21870588 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 20.07.2023) |
122 | Ep: pct application non-entry in european phase |
Ref document number: 21870588 Country of ref document: EP Kind code of ref document: A1 |