US20210386345A1 - Devices and processing systems configured to enable extended monitoring and analysis of subject neurological factors via blepharometric data collection - Google Patents


Info

Publication number
US20210386345A1
Authority
US
United States
Prior art keywords
data
blepharometric
blepharometric data
vehicle
representative
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/288,360
Inventor
Scott Coles
Trefor Morgan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sdip Holdings Pty Ltd
Original Assignee
Sdip Holdings Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2018904027A external-priority patent/AU2018904027A0/en
Application filed by Sdip Holdings Pty Ltd filed Critical Sdip Holdings Pty Ltd
Publication of US20210386345A1 publication Critical patent/US20210386345A1/en
Pending legal-status Critical Current


Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • A61B5/1128Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using image analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N5/00Computing arrangements using knowledge-based models
    • G06N5/04Inference or reasoning models
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/14Arrangements specially adapted for eye photography
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0033Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
    • A61B5/004Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room adapted for image acquisition of a particular organ or body part
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1103Detecting eye twinkling
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1113Local tracking of patients, e.g. in a hospital or private home
    • A61B5/1114Tracking parts of the body
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/117Identification of persons
    • A61B5/1171Identification of persons based on the shapes or appearances of their bodies or parts thereof
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/163Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/168Evaluating attention deficit, hyperactivity
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/18Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state for vehicle drivers or machine operators
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/40Detecting, measuring or recording for evaluating the nervous system
    • A61B5/4076Diagnosing or monitoring particular conditions of the nervous system
    • A61B5/4094Diagnosing or monitoring seizure diseases, e.g. epilepsy
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4842Monitoring progression or stage of a disease
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4845Toxicology, e.g. by detection of alcohol, drug or toxic products
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802Sensor mounted on worn items
    • A61B5/6803Head-worn items, e.g. helmets, masks, headphones or goggles
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6893Cars
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7203Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7264Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7271Specific aspects of physiological measurement analysis
    • A61B5/7275Determining trends in physiological measurement data; Predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7271Specific aspects of physiological measurement analysis
    • A61B5/7282Event detection, e.g. detecting unique waveforms indicative of a medical condition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient ; user input means
    • A61B5/746Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/193Preprocessing; Feature extraction
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00ICT specially adapted for the handling or processing of medical images
    • G16H30/40ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/63ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/30ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2560/00Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B2560/02Operational features
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2576/00Medical imaging apparatus involving image processing or analysis
    • A61B2576/02Medical imaging apparatus involving image processing or analysis specially adapted for a particular organ or body part
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1118Determining activity level
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316Modalities, i.e. specific diagnostic methods
    • A61B5/318Heart-related electrical modalities, e.g. electrocardiography [ECG]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316Modalities, i.e. specific diagnostic methods
    • A61B5/318Heart-related electrical modalities, e.g. electrocardiography [ECG]
    • A61B5/346Analysis of electrocardiograms
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316Modalities, i.e. specific diagnostic methods
    • A61B5/369Electroencephalography [EEG]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316Modalities, i.e. specific diagnostic methods
    • A61B5/369Electroencephalography [EEG]
    • A61B5/372Analysis of electroencephalograms
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7221Determining signal validity, reliability or quality
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61MDEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M21/00Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • B60W2040/0872Driver physiology
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to occupants
    • B60W2540/221Physiology, e.g. weight, heartbeat, health or special needs
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to occupants
    • B60W2540/223Posture, e.g. hand, foot, or seat position, turned or inclined
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N7/00Computing arrangements based on specific mathematical models
    • G06N7/01Probabilistic graphical models, e.g. probabilistic networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30041Eye; Retina; Ophthalmic
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person
    • G06T2207/30201Face
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30268Vehicle interior
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/59Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168Feature extraction; Face representation
    • G06V40/171Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships

Definitions

  • the present disclosure relates, in various embodiments, to devices and processing systems configured to enable extended monitoring and analysis of subject neurological factors via blepharometric data collection, for example, in the context of analysis of neurological conditions using blepharometric data (data that records eyelid movement parameters as a function of time).
  • some embodiments provide methods and associated technology that enable detection of changes in neurological conditions in a human subject (for example, to assist in management/identification of conditions that may be associated with seizures, degenerative diseases, and the like). While some embodiments will be described herein with particular reference to that application, it will be appreciated that the disclosure is not limited to such a field of use, and is applicable in broader contexts.
  • U.S. Pat. No. 7,791,491 teaches a method and apparatus for measuring drowsiness based on the amplitude to velocity ratio for eyelids closing and opening during blinking as well as measuring duration of opening and closing. This enables an objective measurement of drowsiness.
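The amplitude-to-velocity ratio described in the cited patent can be illustrated with a short sketch. This is not the patented implementation; the sampling scheme, the eyelid-position scale (1.0 = fully open), and the function name are illustrative assumptions.

```python
# Illustrative sketch (not the patented method): an amplitude-to-velocity
# ratio (AVR) for the closing phase of a single blink, in the spirit of
# US 7,791,491. Scales and names here are hypothetical.

def blink_avr(positions, timestamps):
    """Given eyelid positions (fraction open, 1.0 = fully open) sampled at
    the given timestamps (seconds) over one closing phase, return
    amplitude divided by peak closing velocity."""
    amplitude = max(positions) - min(positions)
    # Peak velocity: largest per-sample rate of change during the closure.
    velocities = [
        abs(positions[i + 1] - positions[i]) / (timestamps[i + 1] - timestamps[i])
        for i in range(len(positions) - 1)
    ]
    return amplitude / max(velocities)

# A slow, full-amplitude closure yields a higher AVR, which the cited
# patent associates with increased drowsiness.
avr = blink_avr([1.0, 0.8, 0.4, 0.1], [0.00, 0.05, 0.10, 0.15])
```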
  • the present inventors, through their research into relationships between eye and eyelid movement parameters and neurological conditions, have identified opportunities for probabilistic prediction and/or detection of additional neurological conditions via analysis of eyelid movement parameters.
  • One embodiment provides a system configured to facilitate collection of blepharometric data from one or more subjects on a periodic basis thereby to enable extended time period analysis of subject neurological conditions, the system including:
  • One embodiment provides a system wherein the sensor device is an image capture device.
  • One embodiment provides a system wherein the system includes an image processing system that is configured to: (i) detect presence of a human face; (ii) identify one or more eye regions in the human face; and (iii) based on identification of the one or more eye regions, generate blepharometric data representative of eyelid position against time.
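Step (iii) of the pipeline above, converting per-frame eye-region measurements into eyelid position against time, can be sketched as follows. Face and eye-region detection (steps (i) and (ii)) are stubbed out here, as they could be performed by any face-landmark library; the `Frame` structure, field names, and normalisation scheme are illustrative assumptions, not taken from the specification.

```python
# Hedged sketch of step (iii) only: turning per-frame eyelid-gap
# measurements into blepharometric data (eyelid position vs time).
from dataclasses import dataclass

@dataclass
class Frame:
    t: float            # capture time, seconds
    lid_gap_px: float   # detected upper-to-lower eyelid distance, pixels

def to_blepharometric_series(frames, open_gap_px):
    """Normalise raw eyelid-gap measurements against the subject's
    fully-open gap, giving eyelid position in [0, 1] against time."""
    return [(f.t, min(f.lid_gap_px / open_gap_px, 1.0)) for f in frames]

frames = [Frame(0.0, 12.0), Frame(0.033, 6.0), Frame(0.066, 0.5)]
series = to_blepharometric_series(frames, open_gap_px=12.0)
# series → [(0.0, 1.0), (0.033, 0.5), (0.066, ~0.042)]
```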
  • One embodiment provides a system wherein the subject identification module, which is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device, leverages a facial recognition process thereby to extract biometric facial information from one or more frames of image data collected via the image capture device.
  • One embodiment provides a system wherein the subject identification module is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device via collection of biometric data.
  • One embodiment provides a system wherein the biometric data includes facial data.
  • One embodiment provides a system wherein the subject identification module is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device via user input of identifying credentials.
  • One embodiment provides a system wherein the subject identification module is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device via communication with a user mobile device, which includes a token representative of identifying credentials.
  • One embodiment provides a system wherein defining current blepharometric data for the human subject includes processing blepharometric data for a period or sub-period of continuous blepharometric data collection via the sensor device, thereby to extract a set of blepharometric data artefacts.
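One way the artefact extraction described above might work is to segment a continuous eyelid-position series into closure episodes and report a per-episode artefact. The closure threshold of 0.5 and the choice of blink duration as the artefact are illustrative assumptions only; the specification does not fix either.

```python
# Sketch of extracting one blepharometric artefact (blink duration) from
# a period of continuous eyelid-position data: segment closures by a
# hypothetical threshold and measure how long each one lasts.

def extract_blink_durations(series, closed_below=0.5):
    """series: list of (time_s, position) with position in [0, 1].
    Returns the duration of each contiguous closure episode."""
    durations, start = [], None
    for t, pos in series:
        if pos < closed_below and start is None:
            start = t                      # closure begins
        elif pos >= closed_below and start is not None:
            durations.append(t - start)    # closure ends
            start = None
    return durations

series = [(0.0, 1.0), (0.1, 0.4), (0.2, 0.1), (0.3, 0.9), (0.4, 1.0)]
extract_blink_durations(series)  # one blink lasting 0.2 s
```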
  • One embodiment provides a system wherein the memory module that is configured to maintain a record of historical blepharometric data for the identified human subject includes statistical information derived from processing of blepharometric data collected across a plurality of previous periods.
  • One embodiment provides a system wherein the blepharometric data collected across a plurality of previous periods is collected via a plurality of physically distinct collection systems.
  • One embodiment provides a system wherein identifying a relationship between the current blepharometric data for the human subject and the record of historical blepharometric data for the identified human subject thereby to identify a long-term trend in blepharometric data includes determining whether, in response to a current set of blepharometric data, there is an identified threshold trend in one or more of the user's observed blepharometric artefacts that satisfies a predefined profile that is representative of prediction of a neurological condition.
  • One embodiment provides a system wherein identifying a relationship between the current blepharometric data for the human subject and the record of historical blepharometric data for the identified human subject thereby to identify a threshold current point-in-time deviation from historical statistical data includes determining whether the current set of blepharometric data, alone or in combination with one or more recent sets of blepharometric data, displays a threshold deviation in one or more of the user's observed blepharometric artefacts compared to historical averages, wherein that deviation is representative of prediction of a neurological condition.
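The point-in-time deviation check above can be illustrated with a simple statistical sketch: flag the current artefact value when it departs from the subject's historical mean by more than some number of standard deviations. The 3-sigma threshold and the use of blink duration as the artefact are assumptions for illustration; the specification leaves both open.

```python
# Hypothetical illustration of the threshold-deviation check against a
# record of historical statistical data for an identified subject.
from statistics import mean, stdev

def threshold_deviation(history, current, n_sigma=3.0):
    """True when `current` deviates from the historical mean by more
    than n_sigma standard deviations."""
    mu, sigma = mean(history), stdev(history)
    return abs(current - mu) > n_sigma * sigma

# Historical per-period blink-duration averages (ms) for one subject:
history = [210, 205, 215, 208, 212, 209]
threshold_deviation(history, 211)   # within normal range → False
threshold_deviation(history, 280)   # large deviation → True
```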
  • One embodiment provides a system wherein the output module is configured to cause delivery of an output signal via an in-vehicle display.
  • One embodiment provides a system wherein the output module is configured to cause delivery of an output signal via an electronic message sent over a network.
  • One embodiment provides a system wherein the vehicle is an automobile, and wherein the sensor device is mounted on or adjacent a dashboard or windscreen region.
  • One embodiment provides a system including multiple sensor devices, each mounted in the vehicle positioned to enable monitoring eyelid movement by a respective passenger or operator of the vehicle.
  • One embodiment provides a system including the blepharometric data monitoring system.
  • One embodiment provides a device configured to facilitate collection of blepharometric data from one or more subjects on a periodic basis thereby to enable extended time period analysis of subject neurological conditions, the device including:
  • One embodiment provides a device wherein the sensor device is an image capture device.
  • One embodiment provides a device wherein the device includes an image processing system that is configured to: (i) detect presence of a human face; (ii) identify one or more eye regions in the human face; and (iii) based on identification of the one or more eye regions, generate blepharometric data representative of eyelid position against time.
  • One embodiment provides a device wherein the subject identification module, which is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device, leverages a facial recognition process thereby to extract biometric facial information from one or more frames of image data collected via the image capture device.
  • One embodiment provides a device wherein the subject identification module is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device via collection of biometric data.
  • One embodiment provides a device wherein the biometric data includes facial data.
  • One embodiment provides a device wherein the subject identification module is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device via user input of identifying credentials.
  • One embodiment provides a device wherein the subject identification module is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device via communication with a user mobile device, which includes a token representative of identifying credentials.
  • One embodiment provides a device wherein defining current blepharometric data for the human subject includes processing blepharometric data for a period or sub-period of continuous blepharometric data collection via the sensor device, thereby to extract a set of blepharometric data artefacts.
  • One embodiment provides a device wherein the memory module that is configured to maintain a record of historical blepharometric data for the identified human subject includes statistical information derived from processing of blepharometric data collected across a plurality of previous periods.
  • One embodiment provides a device wherein the blepharometric data collected across a plurality of previous periods is collected via a plurality of physically distinct collection systems.
  • One embodiment provides a device wherein identifying a threshold trend includes identifying a threshold trend in one or more of the user's observed blepharometric artefacts that satisfies a predefined profile that is representative of prediction of a neurological condition.
  • One embodiment provides a device wherein identifying point-in-time statistical deviation includes determining whether the current set of blepharometric data, alone or in combination with one or more recent sets of blepharometric data, displays a threshold deviation in one or more of the user's observed blepharometric artefacts compared to historical averages, wherein that deviation is representative of prediction of a neurological condition.
  • One embodiment provides a device wherein the output module is configured to cause delivery of an output signal via an in-vehicle display.
  • One embodiment provides a device wherein the output module is configured to cause delivery of an output signal via an electronic message sent over a network.
  • One embodiment provides a device wherein the vehicle is an automobile, and wherein the sensor device is mounted on or adjacent a dashboard or windscreen region.
  • One embodiment provides a device including multiple sensor devices each mounted in the vehicle positioned to enable monitoring eyelid movement by a respective passenger or operator of the vehicle.
  • One embodiment provides a device including the blepharometric data monitoring system.
  • One embodiment provides a system configured to facilitate analysis of subject neurological conditions, the system including:
  • One embodiment provides a system wherein, for at least one of the sensor systems, the sensor system includes the sensor device including an image capture device that is configured to monitor blepharometric data.
  • One embodiment provides a system wherein the system includes an image processing system that is configured to: (i) detect presence of a human face; (ii) identify one or more eye regions in the human face; and (iii) based on identification of the one or more eye regions, generate blepharometric data representative of eyelid position against time.
  • One embodiment provides a system wherein the subject identification module, which is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device, leverages a facial recognition process thereby to extract biometric facial information from one or more frames of image data collected via the image capture device.
  • One embodiment provides a system wherein the subject identification module is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device via collection of biometric data.
  • One embodiment provides a system wherein the biometric data includes facial data.
  • One embodiment provides a system wherein the subject identification module is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device via user input of identifying credentials.
  • One embodiment provides a system wherein the subject identification module is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device via communication with a user mobile device, which includes a token representative of identifying credentials.
  • One embodiment provides a system wherein defining current blepharometric data for the human subject includes processing blepharometric data for a period or sub-period of continuous blepharometric data collection via the sensor device, thereby to extract a set of blepharometric data artefacts.
  • One embodiment provides a system wherein the memory module that is configured to maintain a record of historical blepharometric data for the identified human subject includes statistical information derived from processing of blepharometric data collected across a plurality of previous periods.
  • One embodiment provides a system wherein the blepharometric data collected across a plurality of previous periods is collected via a plurality of physically distinct collection systems.
  • One embodiment provides a system including a module configured to determine point-in-time statistical variations between the current blepharometric data for the human subject and the record of historical blepharometric data for the identified human subject.
  • One embodiment provides a system wherein identifying a threshold trend includes identifying a threshold trend in one or more of the user's observed blepharometric artefacts that satisfies a predefined profile that is representative of prediction of a neurological condition.
  • One embodiment provides a system wherein identifying point-in-time statistical deviation includes determining whether the current set of blepharometric data, alone or in combination with one or more recent sets of blepharometric data, displays a threshold deviation in one or more of the user's observed blepharometric artefacts compared to historical averages, wherein that deviation is representative of prediction of a neurological condition.
  • One embodiment provides a system wherein the plurality of sensor systems includes a plurality of in-vehicle blepharometric data monitoring systems.
  • One embodiment provides a system wherein, for at least a subset of the in-vehicle blepharometric data monitoring systems, the vehicle is an automobile, and wherein the sensor device is mounted on or adjacent a dashboard or windscreen region.
  • One embodiment provides a system including multiple sensor devices each mounted in the vehicle positioned to enable monitoring eyelid movement by a respective passenger or operator of the vehicle.
  • One embodiment provides a system wherein the system includes a cloud-based processing facility.
  • One embodiment provides a system configured to facilitate monitoring of subject neurological conditions, the system including:
  • One embodiment provides a device wherein the first software application is a messaging application.
  • One embodiment provides a device wherein the first software application is a social media application.
  • One embodiment provides computer-executable code that, when executed, causes delivery via a computing device of a messaging software application, wherein the computer-executable code is additionally configured to collect data from a front-facing camera thereby to facilitate analysis of blepharometric data.
  • One embodiment provides computer-executable code that, when executed, causes delivery via a computing device of a social media software application, wherein the computer-executable code is additionally configured to collect data from a front-facing camera thereby to facilitate analysis of blepharometric data.
  • One embodiment provides computer-executable code that when executed causes delivery via a computing device of a software application with which a user interacts for a purpose other than blepharometric data-based data collection, wherein the computer-executable code is additionally configured to collect data from a front-facing camera thereby to facilitate analysis of blepharometric data.
  • any one of the terms “comprising,” “comprised of” or “which comprises” is an open term that means including at least the elements/features that follow, but not excluding others.
  • the term “comprising,” when used in the claims, should not be interpreted as being limitative to the means or elements or steps listed thereafter.
  • the scope of the expression "a device comprising A and B" should not be limited to devices consisting only of elements A and B.
  • Any one of the terms “including” or “which includes” or “that includes” as used herein is also an open term that also means including at least the elements/features that follow the term, but not excluding others.
  • “including” is synonymous with and means “comprising.”
  • exemplary is used in the sense of providing examples, as opposed to indicating quality. That is, an “exemplary embodiment” is an embodiment provided as an example, as opposed to necessarily being an embodiment of exemplary quality.
  • FIG. 1A illustrates an in-vehicle blepharometric data collection/monitoring system according to one embodiment.
  • FIG. 1B illustrates an in-vehicle blepharometric data collection/monitoring system according to one embodiment.
  • FIG. 1C illustrates an in-vehicle blepharometric data collection/monitoring system according to one embodiment.
  • FIG. 1D illustrates an in-vehicle blepharometric data collection/monitoring system according to one embodiment.
  • FIG. 2 illustrates a blepharometric data monitoring framework according to one embodiment.
  • FIG. 3 illustrates a method according to one embodiment.
  • FIG. 4 illustrates a blepharometric data collection/monitoring system in a passenger airplane.
  • FIG. 5 illustrates an analysis system according to one embodiment.
  • FIG. 6 illustrates a method according to one embodiment.
  • FIG. 7 illustrates an analysis system according to one embodiment.
  • the present disclosure relates, in various embodiments, to extended monitoring and analysis of subject neurological factors via blepharometric data collection, for example, including devices and processing systems configured to enable such extended monitoring.
  • This may include hardware and software components deployed at subject locations (for example, in-vehicle monitoring systems, portable device monitoring systems, and so on), and cloud-based hardware and software (for example, cloud-based blepharometric data processing systems).
  • a human subject's involuntary blinks and eyelid movements are influenced by a range of factors, including the subject's behavioral state and brain function. For example, this has been used in the past for detection of drowsiness. More broadly, analysis of data derived from eye and eyelid movements can be performed thereby to identify data artefacts, patterns and the like, and these are reflective of the subject's behavioral state, brain function and the like.
  • the technology described herein is focused on collection and analysis of “blepharometric data,” with the term “blepharon” describing a human eyelid.
  • the term “blepharometric data” is used to define data that describes eyelid movement as a function of time. For example, eyelid position may be recorded as an amplitude. Eyelid movements are commonly categorized as “blinks” or “partial blinks.”
  • the term “blepharometric data” is used to distinguish technology described herein from other technologies that detect the presence of blinks for various purposes.
  • the technology herein is focused on analyzing eyelid movement as a function of time, typically measured as an amplitude.
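By way of illustration of this representation (all names and the sampling rate here are hypothetical, not drawn from the disclosure), eyelid position as a function of time can be modelled as a sampled amplitude series:

```python
# Hypothetical sketch: blepharometric data modelled as eyelid amplitude
# (0.0 = fully closed, 1.0 = fully open) sampled against time.
from dataclasses import dataclass
from typing import List

@dataclass
class BlepharometricSample:
    t: float          # seconds since start of recording
    amplitude: float  # normalized eyelid aperture

def record(amplitudes: List[float], rate_hz: float = 60.0) -> List[BlepharometricSample]:
    """Wrap raw amplitude readings (e.g. from a 60 fps camera) as timed samples."""
    return [BlepharometricSample(i / rate_hz, a) for i, a in enumerate(amplitudes)]

# a short series containing one blink-like dip
series = record([1.0, 0.9, 0.4, 0.1, 0.3, 0.8, 1.0])
```

Downstream artefact-extraction algorithms would then operate on such a series rather than on raw image frames.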
  • Events and other parameters that are identified from the processing of blepharometric data are referred to as "blepharometric artefacts," with such artefacts being identifiable by application of various processing algorithms to a data set that describes eyelid position as a function of time (i.e., blepharometric data).
  • the artefacts may include:
  • blink total duration (BTD)
  • the determination of blepharometric artefacts may include any one or more of:
  • a “blink” is in some embodiments defined as the pairing of positive and negative events that are within relative amplitude limits and relative position limits. There may be multiple events within a given blink, when an eyelid is outside of an “inter-blink” eyelid amplitude range.
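The pairing of closing and reopening events described above might be sketched as follows; the amplitude thresholds and function names are illustrative assumptions, not values specified in the disclosure:

```python
def detect_blinks(amplitudes, closed_below=0.3, open_above=0.8):
    """Pair each closing (negative) event, where amplitude falls below
    `closed_below`, with the subsequent reopening (positive) event, where
    amplitude rises back above `open_above`; each pair is one blink.
    Returns (closing_index, reopening_index) pairs."""
    blinks, closing_start = [], None
    for i, a in enumerate(amplitudes):
        if closing_start is None and a < closed_below:
            closing_start = i                   # negative (closing) event
        elif closing_start is not None and a > open_above:
            blinks.append((closing_start, i))   # positive (reopening) event
            closing_start = None
    return blinks

# one full blink: closure detected at index 2, reopening at index 5
blinks = detect_blinks([1.0, 0.9, 0.2, 0.1, 0.5, 0.9, 1.0])
```

A real implementation would additionally enforce the relative amplitude and position limits mentioned above, and handle multiple events occurring while the eyelid is outside the "inter-blink" amplitude range.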
  • blepharometric data monitoring systems focus on point-in-time subject analysis. For example, such technology is commonly used as a means for assessing subject alertness/drowsiness at a specific moment, potentially benchmarked against known data for a demographically relevant population. There is a problem in that, for many neurological conditions, point-in-time assessment is inadequate. For example, many neurological conditions are degenerative and/or progressive, and for those and others point-in-time blepharometric data without historical baselines may be of limited usefulness. Currently, however, there is no practical way to collect blepharometric data for people, outside of requiring people to subject themselves to specialist testing (which is expensive and, for the bulk of the population, likely infeasible).
  • a solution proposed herein is to deploy blepharometric data collection systems in a range of human environments, being environments in which humans are commonly positioned suitably for blepharometric data collection.
  • Examples considered herein are vehicles (for example, cars, airplanes, trains, and the like), computing devices (for example, smartphones, tablets, and PCs), and other locations.
  • This allows long-term blepharometric data collection on an individualized basis, allowing for better management of neurological health (and other factors, such as safety).
  • specific use cases might include providing warnings in advance of seizures, informing a person of a risk of a degenerative brain illness, detection of brain injuries from accidents and/or sporting activities, and personalized detection of unusual levels of drowsiness.
  • There are many factors that have an effect on involuntary eyelid movements, with examples including: a subject's state of physical activity; a subject's posture; other aspects of a subject's positional state; subject movement; subject activity; how well-slept the subject happens to be; levels of intoxication and/or impairment; and others.
  • factors that have effects on involuntary eyelid movements include degenerative brain injuries (e.g., Parkinson's disease) and traumatic brain injuries.
  • FIG. 3 illustrates a high-level methodology that is relevant to a range of embodiments discussed below.
  • This methodology is optionally performed via software modules executing across a plurality of connected devices, for example, including local devices (for example, computing devices housed in a vehicle and/or user's mobile devices such as smartphones) and Internet-connected server devices (also referred to as “cloud” components).
  • any computing devices and computer-executed methods configured for the purposes of enabling the overall performance of a methodology based on those described below by reference to FIG. 3 form embodiments of inventions for the purposes of this specification.
  • Block 301 represents a process including collecting data representative of eyelid movement (i.e., blepharometric data). For the majority of embodiments described below, this is achieved via a camera system having an image capture component that is positioned into a capture zone in which a subject's face is predicted to be positioned. For example, this may include:
  • Vehicles including passenger vehicles or operator-only vehicles, wherein the image capture component is positioned to capture a region in which an operator's face is predicted to be contained during normal operation.
  • the image capture component may include a camera mounting in or adjacent a dashboard or windscreen.
  • Vehicles in the form of passenger vehicles, wherein the image component is positioned to capture a region in which a passenger's face is predicted to be contained during normal operation.
  • the image capture component may include a camera mounting in or adjacent a dashboard or windscreen, the rear of a seat (including a seat headrest), and so on.
  • Mass transport vehicles including passenger trains and/or aircraft, wherein the image component is positioned to capture a region in which a passenger's face is predicted to be contained during normal operation.
  • the image capture component may be mounted in the rear of a seat (including a seat headrest), optionally in a unit that contains other electronic equipment such as a display monitor.
  • Seating arrangements such as theatres, cinemas, auditoriums, lecture theatres, and the like. Again, mounting image capture components in the rear of seats is an approach adopted in some embodiments.
  • the data that is captured is not limited to data captured for the purposes of extended monitoring and analysis of subject neurological factors via blepharometric data collection.
  • Extended monitoring is one such purpose; there is optionally also an alternate purpose, which may be point-in-time based.
  • point-in-time drowsiness detection is relevant in many of the above scenarios.
  • collected blepharometric data is optionally additionally collected for the purposes of group monitoring/analysis (including where blepharometric data is anonymized such that it is not attributable to a specific individual). For example, this may be used in the context of seating arrangements to assess overall student/viewer attention/drowsiness, or in the context of airplanes and other mass transport to perform analysis of passenger health factors.
  • Credential-based identification, for example, via a login.
  • This may include pairing of a personal device (such as a smartphone) to blepharometric data monitoring system (e.g., pairing a phone to an in-vehicle system), inputting login credentials via an input device, or other means.
  • Biometric identification. For example, in some embodiments described herein, a camera-based blepharometric data monitoring system utilizes image data to additionally perform facial recognition functions, thereby to uniquely identify human subjects.
  • an analysis system has access to a database of historical blepharometric data for one subject (for example, where the system is installed in a vehicle and monitors only a primary vehicle owner/driver) or multiple subjects (for example, a vehicle configured to monitor multiple subjects, or a cloud-hosted system that receives blepharometric data from a plurality of networked systems, as described further below).
  • Block 303 represents a process including determination of blepharometric artefacts for a current time period.
  • the artefacts may include:
  • the “current period” may be either a current period defined by a current user interaction with a blepharometric data monitoring system, or a subset of that period.
  • the “current period” is in one example defined as a total period of time for which a user operates the vehicle and has blepharometric data monitored, and in another embodiment is a subset of that time.
  • multiple “current periods” are defined, for example, using time block samples of between two and fifteen minutes (which are optionally overlapping), thereby to compare blepharometric data activity during periods of varying lengths (which may be relevant for differing neurological conditions, which, in some cases, present themselves based on changes in blepharometric data over a given period of time).
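A sketch of how such overlapping "current periods" of varying lengths could be generated (the block lengths and step size are illustrative assumptions within the two-to-fifteen-minute range mentioned above):

```python
def current_periods(total_seconds, block_minutes=(2, 5, 15), step_minutes=1):
    """Generate overlapping analysis windows as (start, end) pairs in
    seconds, one family of windows per configured block length, stepped
    every `step_minutes` so that adjacent windows overlap."""
    windows = []
    for block in block_minutes:
        length = block * 60
        start = 0
        while start + length <= total_seconds:
            windows.append((start, start + length))
            start += step_minutes * 60
    return windows

# windows for a ten-minute monitored drive
wins = current_periods(10 * 60)
```

Each window could then be processed independently, since different neurological conditions may present over differing timescales.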
  • the current blepharometric data may be used for point-in-time neurological conditional analysis, for example, analysis of subject alertness/drowsiness, prediction of seizures, detection of seizures, and other such forms of analysis. Specific approaches for analyzing blepharometric data thereby to detect/predict particular neurological conditions fall beyond the scope of the present disclosure.
  • Block 304 represents a process including identification of relationships between current blepharometric artefacts and historical blepharometric artefacts. This allows for artefacts extracted in the current blepharometric data to be given context relative to baselines/trends already observed for that subject.
  • the concept of “identification of relationships” should be afforded a broad interpretation to include at least the following:
  • blepharometric data collected over a period of weeks, months or years may be processed thereby to identify any particular blepharometric data artefacts that are evolving/trending over time.
  • algorithms are configured to monitor such trends, and these are defined with a set threshold for variation, which may be triggered in response to a particular set of current blepharometric data.
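One way such a trend-with-threshold check could be implemented (the least-squares slope metric and the threshold value are illustrative assumptions, not the disclosure's algorithm):

```python
def trend_slope(weekly_means):
    """Least-squares slope of an artefact's weekly mean values
    (units per week); assumes at least two periods of data."""
    n = len(weekly_means)
    mean_x, mean_y = (n - 1) / 2, sum(weekly_means) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(weekly_means))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

def threshold_trend_triggered(weekly_means, slope_threshold):
    """True when the long-term trend exceeds the configured variation threshold."""
    return abs(trend_slope(weekly_means)) > slope_threshold

# e.g. mean blink total duration drifting upward over six weekly periods
rising = threshold_trend_triggered([0.30, 0.31, 0.33, 0.36, 0.40, 0.45], 0.02)
stable = threshold_trend_triggered([0.30, 0.31, 0.30, 0.31, 0.30, 0.31], 0.02)
```

A triggered threshold would then feed into the variation-indicator identification described at block 305.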
  • Block 305 represents a process including identification of presence of one or more blepharometric variation indicators, for example, based on the identification of relationships at block 304 .
  • These indicators may be used to allow data-based determination/prediction of the presence of: (i) onset of a neurological illness or degenerative condition; (ii) presence of a brain injury, including a traumatic brain injury; (iii) impairment by alcohol, drugs, or other physical condition; (iv) abnormal levels of drowsiness; (v) neurotoxicity; or (vi) other factors.
  • rules are defined that associate a data relationship (for example, deviation from baseline values, a trend identification, or the like) with a prediction on neurological condition.
  • These may be defined, for example, using logical structures, such as:
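The disclosure's actual rule definitions are not reproduced here; purely as a hypothetical illustration, such logical structures might take a form along these lines, mapping an identified data relationship to a predicted condition and an output action:

```python
# Hypothetical rule set: artefact names, relationship labels, and actions
# are illustrative assumptions, not content from the disclosure.
RULES = [
    {"artefact": "blink_total_duration",
     "relationship": "trend_exceeds_threshold",
     "prediction": "possible degenerative condition",
     "action": "suggest consulting a medical expert"},
    {"artefact": "inter_event_duration",
     "relationship": "deviation_from_baseline",
     "prediction": "abnormal drowsiness",
     "action": "suggest avoiding driving"},
]

def evaluate(findings):
    """Match identified (artefact, relationship) pairs against the rule
    set and return the output actions to communicate to the subject."""
    return [r["action"] for r in RULES
            if (r["artefact"], r["relationship"]) in findings]

actions = evaluate({("inter_event_duration", "deviation_from_baseline")})
```

Structuring rules as data in this way would allow new condition-prediction rules to be deployed without changing the evaluation logic.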
  • Block 306 represents a process including providing output to the human subject based on identified blepharometric variation indicators. This may include an instruction/suggestion to avoid a particular activity (such as driving), an instruction/suggestion to undertake a particular activity (such as taking medication, resting, walking around, or the like), or a suggestion to consult a medical expert about a potential neurological condition.
  • The manner by which the output is delivered varies depending on both the nature of the alert/condition and the hardware environment in place. Examples range from the sending of emails or other messages to the display of information on a local device (for example, an in-vehicle display).
  • FIG. 1A illustrates an example in-vehicle blepharometric data monitoring system. While it is known to provide a blepharometric data monitoring system in a vehicle for the purposes of point-in-time analysis of alertness/drowsiness, the system of FIG. 1A provides for substantial advances in ability to perform analysis of a user's neurological condition by way of providing a memory module that stores historical blepharometric data, and enables analysis of changes in blepharometric data for the user over time.
  • the system of FIG. 1A includes an image capture device 120 .
  • This may include substantially any form of appropriately sized digital camera, preferably a digital camera with a frame rate of over 60 frames per second. Higher frame rate cameras are preferred, given that with enhanced frame rate comes an ability to obtain higher resolution data for eyelid movement.
  • Image capture device 120 is positioned to capture a facial region of a subject.
  • Image capture device 120 is in one embodiment installed in a region of a vehicle in the form of an automobile, for example, on or adjacent the dashboard, windscreen, or visor, such that it is configured to capture a facial region of a driver.
  • image capture device 120 is positioned on or adjacent the dashboard, windscreen, or visor, such that it is configured to capture a facial region of a front seat passenger.
  • image capture device 120 is positioned in a region such as the rear of a seat such that it is configured to capture a facial region of a back-seat passenger.
  • a combination of these are provided, thereby to enable blepharometric data monitoring for both a driver and one or more passengers.
  • Although FIG. 1A (and other systems) are described by reference to a vehicle in the form of an automobile, it will be appreciated that a system as described is also optionally implemented in other forms of vehicles, including mass-transport vehicles such as passenger airplanes, busses/coaches, and trains.
  • An in-vehicle image processing system 110 is configured to receive image data from image capture device 120 (or multiple image capture devices 120 ), and process that data thereby to generate blepharometric data.
  • a control module 111 is configured to control image capture device 120 , operation of image data processing, and management of generated data. This includes controlling operation of image data processing algorithms, which are configured to:
  • Algorithms 112 optionally operate to extract additional artefacts from blepharometric data, for example, amplitude-velocity ratios, blink total durations, inter-event durations, and the like. It will be appreciated, however, that extraction of such artefacts may occur in downstream processing.
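As a hedged sketch of one such artefact, an amplitude-velocity ratio for a single blink waveform might be computed as follows (the finite-difference formulation is an assumption for illustration, not the disclosure's definition):

```python
def amplitude_velocity_ratio(amplitudes, rate_hz=60.0):
    """Compute a simple amplitude-velocity ratio for one blink waveform:
    total closure amplitude divided by peak closing velocity, where
    velocity is a finite-difference estimate between successive samples."""
    velocities = [(b - a) * rate_hz for a, b in zip(amplitudes, amplitudes[1:])]
    peak_closing_velocity = max(-v for v in velocities)  # closing = falling amplitude
    closure_amplitude = max(amplitudes) - min(amplitudes)
    return closure_amplitude / peak_closing_velocity

# a symmetric blink waveform sampled at 60 Hz
avr = amplitude_velocity_ratio([1.0, 0.5, 0.1, 0.5, 1.0])
```

Whether such extraction runs in algorithms 112 or in downstream processing is, as noted above, an implementation choice.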
  • a blepharometric data management module 113 is configured to coordinate storage of blepharometric data generated by algorithms 112 in user blepharometric data 152 . This includes determining a user record against which blepharometric data is to be recorded (in some cases there is only a single user record, for example, where blepharometric data is collected only from a primary driver of an automobile). In some embodiments, the function of blepharometric data management module 113 includes determining whether a set of generated blepharometric data meets threshold data quality requirements for storage, for example, based on factors including a threshold unbroken time period for which eyelid tracking is achieved and blepharometric data is generated.
  • Memory system 150 includes user identification data 151 for one or more users.
  • system 101 is configured to collect and analyze blepharometric data for only a single user (for instance, the primary driver of a vehicle) and includes identification data to enable identification of only that user.
  • system 101 includes functionality to collect and analyze blepharometric data for multiple users, and includes identification data to enable identification of any of those users (and optionally, as noted above, defining of a new record for a previously unknown user).
  • the identification data may include login credentials (for example, a user ID and/or password) that are inputted via an input device.
  • the identification data may be biometric, for example, using facial recognition as discussed above or an alternate biometric input (such as a fingerprint scanner). In some embodiments, this leverages an existing biometric identification system of the vehicle.
  • User blepharometric data 152 includes data associated with identified users, the data being time-coded thereby to enable identification of the date/time at which data was collected.
  • the blepharometric data stored in user blepharometric data 152 optionally includes blepharometric data generated by algorithms 112 and further blepharometric data derived from further processing of that data, for example, data representing average periodic IEDs and/or BTDs, and other relevant statistics that may be determined over time.
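Statistics of this kind can be maintained incrementally as new collection periods arrive; a minimal sketch using Welford's online algorithm (the class name and its use per artefact are hypothetical):

```python
class RunningArtefactStats:
    """Incrementally maintained mean/variance for one blepharometric
    artefact (e.g. periodic IED or BTD averages), so a stored record can
    grow across many collection periods without retaining raw samples.
    Uses Welford's online algorithm."""
    def __init__(self):
        self.n, self.mean, self._m2 = 0, 0.0, 0.0

    def update(self, value):
        self.n += 1
        delta = value - self.mean
        self.mean += delta / self.n
        self._m2 += delta * (value - self.mean)

    @property
    def variance(self):
        return self._m2 / (self.n - 1) if self.n > 1 else 0.0

# e.g. accumulating mean blink total duration across three periods
stats = RunningArtefactStats()
for btd in (0.30, 0.32, 0.31):
    stats.update(btd)
```

One instance per artefact per user would suffice, keeping the memory footprint of memory system 150 small.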
  • data processing algorithms are updated over time, for example, to allow analysis of additional biomarkers determined to be representative of neurological conditions that require extraction of particular artefacts from blepharometric data.
  • Analysis modules 130 are configured to perform analysis of user blepharometric data 152 . This includes executing a process including identification of relationships between current blepharometric artefacts (e.g., data recently received from in-vehicle image processing system 110 ) and historical blepharometric artefacts (e.g., older data pre-existing in memory system 150 ). This allows for artefacts extracted in the current blepharometric data to be given context relative to baselines/trends already observed for that subject.
  • the concept of “identification of relationships” should be afforded a broad interpretation to include at least the following:
  • blepharometric data collected over a period of weeks, months or years may be processed thereby to identify any particular blepharometric data artefacts that are evolving/trending over time.
  • algorithms are configured to monitor such trends, and these are defined with a set threshold for variation, which may be triggered in response to a particular set of current blepharometric data.
  • Identification of current point-in-time deviations from baselines derived from the subject's historical blepharometric data may reveal anomalous spiking in particular artefacts, or other differences from those baselines, which may give rise to concern.
  • this form of analysis may be used to determine/predict the presence of: (i) onset of a neurological illness or degenerative condition; (ii) a brain injury, including a traumatic brain injury; (iii) impairment by alcohol, drugs, or another physical condition; (iv) abnormal levels of drowsiness; (v) neurotoxicity; or (vi) other factors.
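  • By way of illustration only (the names, data structures, and threshold below are assumptions for the sketch, not part of any described implementation), a point-in-time deviation check of this kind can be expressed as a z-score comparison of current artefact values against stored historical samples:

```python
from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class ArtefactBaseline:
    """Historical baseline for one blepharometric artefact (e.g., IED or BTD)."""
    name: str
    samples: list  # historical per-session values for this artefact

    def z_score(self, current: float) -> float:
        mu = mean(self.samples)
        sigma = stdev(self.samples)
        return 0.0 if sigma == 0 else (current - mu) / sigma

def flag_deviations(current: dict, baselines: dict, threshold: float = 3.0) -> list:
    """Return names of artefacts whose current value deviates anomalously
    from the subject's historical baseline."""
    flagged = []
    for name, value in current.items():
        baseline = baselines.get(name)
        if baseline and abs(baseline.z_score(value)) > threshold:
            flagged.append(name)
    return flagged
```

In such a sketch, an anomalous spike in one artefact (but not others) can be isolated and passed downstream for the relevant notification logic.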
  • Analysis modules are optionally updated over time (for example, via firmware updates or the like) thereby to allow for analysis of additional blepharometric artefacts and hence identification of neurological conditions. For example, when a new method for processing blepharometric data thereby to predict a neurological condition based on a change trend in one or more blepharometric artefacts is developed, an analysis algorithm for that method is preferably deployed across a plurality of systems such as system 101 via a firmware update or the like.
  • System 101 additionally includes a communication system 160 , which is configured to communicate information from system 101 to human users.
  • This may include internal communication modules 161 that provide output data via components installed in the vehicle, for example, an in-car display, warning lights, and so on.
  • External communication modules 162 are also optionally present, for example, to enable communication of data from system 101 to user devices (for example, via Bluetooth, WiFi, or other network interfaces), optionally by email or other messaging protocols.
  • communication system 160 is configured to communicate results of analysis by analysis modules 130 .
  • a control system 141 includes logic modules 140, which control overall operation of system 101. This includes execution of logical rules thereby to determine communications to be provided in response to outputs from analysis modules 130.
  • logic modules 140 are able to provide a wide range of functionalities thereby to cause system 101 to act based on determinations by analysis modules 130 .
  • FIG. 1 provides technology whereby one or more digital cameras are able to be installed in a vehicle, such as an automobile or mass transport vehicle, thereby to: (i) collect blepharometric data for an operator and/or one or more passengers; and (ii) enable determination of relationships between blepharometric data collected in a “current” period (for example, a last data set, a last day, a last week, or a last month) and historical blepharometric data that is stored for that same user.
  • Prediction of neurological conditions based on sudden changes and/or long term trends in change for one or more blepharometric artefacts that are known to be indicative of particular neurological conditions;
  • Personalized prediction of future neurological conditions, for example, prediction of future drowsiness based on known drowsiness development patterns extracted for the individual from historical data, and prediction of likelihood of a seizure based on individually-verified seizure prediction biomarkers identifiable in blepharometric data.
  • Identification of point-in-time relevant neurological conditions based on sudden deviations from historical averages which may be representative of sudden neurological changes, for example, traumatic brain injuries (e.g., concussion) and/or impairment based on other factors (such as neurotoxicity, medications, drugs, alcohol, illness, and so on).
  • FIG. 1B illustrates a further embodiment, which includes various common features with the embodiment illustrated in FIG. 1A .
  • external communication modules 162 facilitate communication with a remote server device, which optionally performs additional blepharometric data analysis.
  • external communication modules 162 enable communication between system 101 and a cloud-based blepharometric data analysis system 180 .
  • Cloud system 180 includes a control system 182 and logic modules 181 that are provided by computer-executable code executing across one or more computing devices thereby to control and deliver functionalities of cloud system 180 .
  • Cloud system 180 additionally includes a memory system 183 , which includes user identification data 184 and user blepharometric data 185 .
  • the interplay between memory system 183 and memory system 150 varies between embodiments, with examples discussed below:
  • memory system 150 operates in parallel with memory system 183 , such that certain records are synchronized between the systems based on a defined protocol.
  • this optionally includes a given memory system 150 maintaining user blepharometric data and user identification data for a set of subjects that have presented at that in-vehicle system, and that data is periodically synchronized with the cloud system.
  • where a presenting user is not known to the local system, the system optionally performs a cloud (or other external) query thereby to obtain identification data for that user, and then downloads from the cloud system historical user blepharometric data for that user.
  • Locally collected blepharometric data is uploaded to the server. This, and other similar approaches, provides for transportability of user blepharometric data between vehicles.
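  • The synchronization approach described above can be sketched as follows; this is an illustrative watermark-based protocol, and the record keying and function names are assumptions rather than a described implementation:

```python
def sync_user_records(local_store: dict, cloud_store: dict, last_sync: float) -> float:
    """Two-way synchronization sketch between an in-vehicle memory system and
    the cloud store. Records are keyed by (user_id, timestamp); any record
    newer than `last_sync` on either side is copied to the other side.
    Returns the new synchronization watermark."""
    newest = last_sync
    # push: records collected locally since the last synchronization
    for key, record in list(local_store.items()):
        if key[1] > last_sync:
            cloud_store[key] = record
            newest = max(newest, key[1])
    # pull: cloud records (e.g., collected in other vehicles) not yet held locally
    for key, record in list(cloud_store.items()):
        if key[1] > last_sync and key not in local_store:
            local_store[key] = record
            newest = max(newest, key[1])
    return newest
```

Under this kind of scheme, data collected in one vehicle becomes available to any other vehicle that synchronizes with the cloud store, which is what provides the transportability noted above.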
  • memory system 150 is used primarily for minimal storage, with cloud system 180 providing a main store for user blepharometric data.
  • memory system 150 includes data representative of historical blepharometric data baseline values (for instance, defined as statistical ranges), whereas detailed recordings of blepharometric data are maintained in the cloud system.
  • analysis modules 186 of cloud-based blepharometric data analysis system 180 perform more complex analysis of user blepharometric data thereby to extract the historical blepharometric data baseline values, which are provided to memory system 150 where a given user is present or known, thereby to facilitate local analysis of deviations from baselines.
  • local memory system 150 is omitted, with all persistent blepharometric data storage occurring in cloud memory system 183 .
  • Cloud system 180 additionally includes analysis modules 186 , which optionally perform a similar role to analysis modules 130 in FIG. 1A .
  • local and cloud analysis modules operate in a complementary manner, for example, with analysis modules 130 performing relationship analysis relevant to point-in-time factors (for example, an altered/non-standard neurological state for a user, identified by comparison with historical baselines, which warrants immediate intervention), and analysis modules 186 performing what is often more complex analysis of trends over time (which may be representative of degenerative neurological illnesses and the like, and does not require immediate local intervention in a vehicle).
  • this arrangement enables a cloud-based system to operate with a plurality of in-vehicle systems, in particular providing an ability to maintain cloud storage of user identification data and user blepharometric data for a large number of users, and hence allow that data to “follow” the users between various vehicles over time.
  • a user may have a personal car with a system 101, and subsequently obtain a rental car (for example, while travelling) with its own system 101; as a result of cloud system 180, the rental car system: has access to the user's historical blepharometric data; is able to perform relationship analysis of the current data collected therein against historical data obtained from the cloud system; and feeds the newly collected blepharometric data into the cloud system to further enhance the user's historical data store.
  • FIG. 1C illustrates a further variation where a user has a smartphone device 170 that executes a software application configured to communicate with a given local in-vehicle system 101 (for example, via Bluetooth or USB connection) and additionally with cloud system 180 (for example, via a wireless cellular network, WiFi connection, or the like).
  • This provides functionality for communication between system 101 and cloud system 180 without needing to provide Internet connectivity to a vehicle (the in-vehicle system essentially uses smartphone 170 as a network device).
  • Using a smartphone device as an intermediary between system 101 and cloud system 180 is in some embodiments implemented in a manner that provides additional technical benefits. For example:
  • smartphone 170 provides to system 101 data that enables identification of a unique user, avoiding a need for facial detection and/or other means. For instance, upon coupling a smartphone to an in-car system (which may include system 101 and one or more other in-car systems, such as an entertainment system) via Bluetooth, system 101 receives user identification data from smartphone 170.
  • a most-recent version of a given user's historical blepharometric data (for example, defined as historical baseline values) is stored on smartphone 170 , and downloaded to system 101 upon coupling.
  • one or more functionalities of analysis modules 130 are alternately performed via smartphone 170, in which case system 101 is optionally configured to, in effect, be a blepharometric data collection and communication system without substantive blepharometric data analysis functions (which are instead performed by smartphone 170, and optionally tailored via updating of smartphone app parameters by cloud system 180 for personalized analysis).
  • smartphone 170 is also in some cases useful in terms of allowing users to retain individual control over their blepharometric data, with blepharometric data being stored on the user's smartphone in preference to being stored by an in-vehicle system.
  • FIG. 1D illustrates a further variation in which communication between a local system 101 and cloud system 180 operates in a similar manner to FIG. 1B , but where a smartphone 170 is still present.
  • the smartphone is optionally used as an output device for information derived from blepharometric data analysis, and/or as a device to confirm identity and approval for blepharometric data collection.
  • a given system 101 identifies a user by way of biometric information (e.g., facial detection) using user identification data stored in memory system 183 of cloud system 180, and a message is sent to smartphone 170 allowing the user to confirm that they are indeed in the location of the relevant system 101, and providing an option to consent to blepharometric data monitoring.
  • a system such as that of FIG. 1A is also able to be integrated into other local systems thereby to provide control instructions to those systems in response to artefacts identified in blepharometric data.
  • An example is provided in FIG. 4, wherein an aircraft 400 includes an in-vehicle blepharometric data analysis system 401, which is fed data from image capture devices including devices installed in seat-backs (for example, in a common housing with a seat-back display screen).
  • System 401 is configured to feed data thereby to effect control instructions into an entertainment system 402 and a passenger health/comfort analysis system 403 .
  • Where each image capture device is provided in conjunction with a display screen that is configured to deliver audio-visual entertainment (for instance, as is common in airplanes), monitoring of subject blepharometric data may be used to provide an enhanced experience with respect to audio-visual data.
  • provision of a system that enables collection and analysis of blepharometric data from multiple passengers in a mass-transit vehicle may have additional far-reaching advantages in terms of optimizing passenger health and/or comfort during transportation.
  • In mass-transport embodiments, there is preferably a clear distinction between personalized health data, which is maintained with privacy on behalf of the user, and non-personalized statistical data, which may be shared with other systems/people. For instance, an individual's neurological conditions are not made available to airline personnel; however, data representative of drowsiness/alertness statistics in a cabin is made available to airline personnel.
  • FIG. 2 illustrates an exemplary framework under which a cloud-based blepharometric data analysis system 180 operates in conjunction with a plurality of disparate blepharometric data monitoring systems 201 - 206 .
  • Each of these systems is in communication with cloud system 180 , such that user data (for example, user blepharometric data comprising historical data) is able to be utilized for analysis even where a user's blepharometric data is collected from physically distinct monitoring systems.
  • Analysis of blepharometric data (for example, determination of relationships between current and historical data) may be performed at the cloud system 180 , at the local systems 201 - 206 , or combined across the cloud and local systems.
  • the local systems illustrated in FIG. 2 are:
  • Vehicle operator configurations 201 are in-vehicle systems, such as that of FIG. 1A-1D , in which the image capture device is positioned to capture blepharometric data for an operator of the vehicle.
  • Desktop/laptop computer configurations 202. In these configurations, a webcam or other image capture device is used to monitor user blepharometric data, with image-based eyelid movement detection as discussed herein. This may occur subject to: (i) a foreground application (for example, an application that instructs a user to perform a defined task during which blepharometric data is collected); and/or (ii) a background application that collects blepharometric data while a user engages in other activities on the computer (for example, word processing and/or internet browsing).
  • Mass-transport passenger configurations 203, for example, aircraft as illustrated in FIG. 4, buses, trains, and the like. Ideally, these are configured such that an image capture device operates in conjunction with a display screen, such that blepharometric data is collected concurrently with the user observing content displayed via the display screen.
  • Vehicle passenger configurations 204 are in-vehicle systems, such as that of FIGS. 1A-1D , in which the image capture device is positioned to capture blepharometric data for a passenger of the vehicle.
  • For back-seat applications, these are optionally configured such that an image capture device operates in conjunction with a display screen, such that blepharometric data is collected concurrently with the user observing content displayed via the display screen.
  • For front-seat applications, the camera is positioned based on a presumption that a front-seat passenger will, for a substantial proportion of the time, pay attention to the direction of vehicle travel (e.g., watch the road).
  • Smartphone/tablet configurations 205. In these configurations, a front-facing camera is used to monitor user blepharometric data, with image-based eyelid movement detection as discussed herein. This may occur subject to: (i) a foreground application (for example, an application that instructs a user to perform a defined task during which blepharometric data is collected); and/or (ii) a background application that collects blepharometric data while a user engages in other activities on the device (for example, messaging and/or social media application usage).
  • Medical facility configurations 206 may make use of image processing-based blepharometric data monitoring, and/or other means of data collection (such as infrared reflectance oculography spectacles). These provide a highly valuable component in the overall framework: due to centralized collection of blepharometric data for a given subject from multiple locations over an extended period of time, a hospital is able to perform point-in-time blepharometric data collection and immediately reference that against historical data thereby to enable identification of irregularities in neurological conditions.
  • FIG. 2 also shows how cloud system 180 is able to interact with a plurality of user mobile devices such as smartphone device 170 .
  • User identification data 184 provides addressing information thereby to enable cloud system 180 to deliver messages, alerts, and the like to correct user devices.
  • a particular person displays a specific blepharometric biomarker (for example, threshold spiking in negative inter-event duration) in the lead-up to a seizure event; a process configured to monitor for that biomarker is initialized in response to identification of that person.
  • an analysis module of an in-vehicle device is configured for such monitoring once the person is detected, and provides a seizure warning when the biomarker is detected.
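  • The per-user biomarker monitoring described above can be sketched as follows; the class, threshold semantics, and callback interface are illustrative assumptions, not a described implementation:

```python
class SeizureBiomarkerMonitor:
    """Illustrative per-user monitor: armed once the known user is identified,
    it invokes a warning callback when that user's individually-verified
    biomarker (here, a threshold spike in negative inter-event duration)
    appears in incoming blepharometric samples."""

    def __init__(self, user_id: str, ied_spike_threshold: float, warn):
        self.user_id = user_id
        self.threshold = ied_spike_threshold
        self.warn = warn          # callback, e.g., an in-vehicle alert trigger
        self.armed = False

    def on_user_identified(self, user_id: str):
        # arm monitoring only for the person this biomarker is verified for
        self.armed = (user_id == self.user_id)

    def on_blepharometric_sample(self, negative_ied: float):
        if self.armed and negative_ied >= self.threshold:
            self.warn(self.user_id)
```

A monitor of this kind is dormant until identification occurs, matching the behavior of an analysis module that is configured for biomarker-specific monitoring only once the relevant person is detected.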
  • the blepharometric data is optionally defined by a reading made by an infrared reflectance sensor, and as such is a proxy for eyelid position. That is, rather than monitoring the actual position of an eyelid, infrared reflectance oculography techniques use reflectance properties and in so doing are representative of the extent to which an eye is open (as the presence of an eyelid obstructing the eye affects reflectivity). In some embodiments, additional information beyond eyelid position may be inferred from infrared reflectance oculography, for example, whether a subject is undergoing tonic eye movement.
  • “blepharometric data” in some embodiments includes infrared reflectance oculography measurements, and hence may additionally be representative of tonic eye movement.
  • FIG. 5 illustrates an example blepharometric data relationship analysis system, which may be incorporated into embodiments described above. In some cases, components/functionalities of this system are distributed across local and cloud-based processing systems.
  • One or more new sets of blepharometric data 501 are received by a new data processing module 502 .
  • Module 502 is configured to perform data validation and/or data cleaning, thereby to ensure that the data is suitable for analysis and/or storage. For example, data displaying irregularities and/or having a sample time below a given threshold is excluded.
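  • A validation/cleaning step of this kind might be sketched as below; the sample representation, thresholds, and plausibility bounds are assumptions chosen for illustration:

```python
def validate_session(samples, min_duration_s=30.0):
    """Sketch of module 502's role: basic validation/cleaning of a new
    blepharometric data set before storage. `samples` is a time-ordered list
    of (timestamp_s, eyelid_position) pairs; returns the cleaned list, or
    None if the set should be excluded from analysis/storage."""
    if len(samples) < 2:
        return None
    duration = samples[-1][0] - samples[0][0]
    if duration < min_duration_s:
        return None  # sample time below threshold: exclude the whole set
    # drop physically implausible readings (a simple irregularity check),
    # assuming eyelid position is normalized to the range [0, 1]
    return [(t, p) for t, p in samples if 0.0 <= p <= 1.0]
```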
  • a new data storage module 503 is configured to coordinate storage of the new set or sets of data 501, following processing by module 502, into a data store 505 containing historical blepharometric data for the user.
  • a statistical value determination module 510 applies an expandable set of processing algorithms to data in store 505 thereby to extract a range of statistical values (for example, averages for blepharometric artefacts, optionally categorized based on collection conditions and other factors). These statistical values are stored in data store 505 thereby to maintain richer detail regarding baseline blepharometric data values for the user, preferably in a way that is tied to defined relationship analysis algorithms. That is, if an algorithm X to determine a condition Y relies on analysis of a blepharometric artefact Z, then statistical value determination module 510 is preferably configured to apply an algorithm configured to extract artefact Z from user blepharometric data.
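  • The "expandable set of processing algorithms" tied to defined relationship analysis algorithms can be sketched as a registry, so that adding an algorithm that relies on artefact Z also registers the extractor for Z. All names below are illustrative assumptions:

```python
# Registry sketch: each relationship-analysis algorithm registers the artefact
# extractor(s) it depends on, so a statistical value determination module
# (like module 510) knows which values to pre-compute over stored sessions.
ARTEFACT_EXTRACTORS = {}

def register_extractor(artefact_name):
    def decorator(fn):
        ARTEFACT_EXTRACTORS[artefact_name] = fn
        return fn
    return decorator

@register_extractor("mean_blink_duration")
def mean_blink_duration(blinks):
    """blinks: list of (start_time_s, end_time_s) tuples for one session."""
    durations = [end - start for start, end in blinks]
    return sum(durations) / len(durations) if durations else 0.0

def compute_statistical_values(sessions):
    """Apply every registered extractor to every stored session, yielding
    per-artefact value series suitable for baseline determination."""
    return {
        name: [extract(session) for session in sessions]
        for name, extract in ARTEFACT_EXTRACTORS.items()
    }
```

A new condition-identification algorithm deployed by update would then only need to register its required extractor for the corresponding baseline values to start accumulating.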
  • a new data relationship processing module 504 is configured to identify relationships between new data 501 and historical data 505 .
  • Data rules to facilitate the identification of particular relationships that are known to be representative (or predictively representative) of neurological conditions are defined in condition identification rules 506 .
  • Condition identification rules 506 are periodically updated based on new knowledge regarding blepharometric/neurological condition research. For example, a given rule defines a category of relationship between one or more blepharometric data artefacts in new data 501 and one or more baseline values extracted from historical data in data store 505 based on operation of statistical value determination module 510 .
  • representative data is passed to an output rules module 508 that contains logical rules defining how a user is to be notified (e.g., in-vehicle alert, message to smartphone app, or email), and in response a given output module 509 is invoked to provide the designated output.
  • a trend analysis module 507 is configured to identify trends/changes in user blepharometric data continuously, periodically, or in an event-driven manner (for example, in response to receipt of new blepharometric data).
  • data rules to facilitate the identification of particular trends that are known to be representative (or predictively representative) of neurological conditions are defined in condition identification rules 506 .
  • Condition identification rules 506 are periodically updated based on new knowledge regarding blepharometric/neurological condition research. For example, a given rule defines a threshold deviation in one or more artefacts over a threshold time as being predictively representative of a neurological condition.
  • representative data is passed to an output rules module 508 , which contains logical rules that define how a user is to be notified (e.g., in-vehicle alert, message to smartphone app, or email), and in response, a given output module 509 is invoked to provide the designated output.
  • the system of FIG. 5 is configurable to monitor for a range of neurological conditions that are identifiable in blepharometric data based on: point-in-time variations from known baselines that are generated and refined over an extended period (i.e., based on a collection of time-separated data sets); and trends in blepharometric data over time (even where differences between consecutive data sets are relatively minor).
  • this form of data collection and analysis is of significant use in the context of predicting and understanding neurological conditions, for example, in terms of: (i) identifying potential degenerative conditions and rates of onset; (ii) identifying point-in-time events that led to sudden changes in neurological conditions; (iii) monitoring long-term effects of contact sports (e.g., concussive brain injuries) for participants; and (iv) personalizing blepharometric data analysis for individual users.
  • one embodiment of system 700 provides a portable electronic device 701 including: a display screen 704 ; and a front-facing camera 702 ; wherein the portable electronic device is configured to concurrently execute, via software instructions 703 , which execute on a processor of device 701 : (i) a first software application that provides data via the display screen; and (ii) a second software application that receives input from the front facing camera thereby to facilitate detection and analysis of blepharon data.
  • the first software application is in one embodiment a messaging application, and in another embodiment a social media application.
  • One embodiment provides computer-executable code that, when executed, causes delivery via a computing device of a software application with which a user interacts for a purpose other than blepharon-based data collection, wherein the computer-executable code is additionally configured to collect data from a front-facing camera thereby to facilitate analysis of blepharon data.
  • the purpose may be, for example, messaging or social media.
  • Embodiments such as that of FIG. 7 provide for collection of blepharon data via a background software application executing on an electronic device with a front-facing camera. This provides opportunities to analyze a device user's neurological condition, for example, in the context of predicting seizures, advising on activities, diagnosing potential neurological illnesses, detecting drowsiness, and so on.
  • an element described herein of an apparatus embodiment is an example of a means for carrying out the function performed by the element for the purpose of carrying out the disclosure.
  • “Coupled,” when used in the claims, should not be interpreted as being limited to direct connections only.
  • the terms “coupled” and “connected,” along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other.
  • the scope of the expression a device A coupled to a device B should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. It means that there exists a path between an output of A and an input of B, which may be a path including other devices or means.
  • Coupled may mean that two or more elements are either in direct physical or electrical contact, or that two or more elements are not in direct contact with each other but yet still co-operate or interact with each other.

Abstract

Technology described herein relates to extended monitoring and analysis of subject neurological factors via blepharometric data collection, for example, including devices and processing systems configured to enable such extended monitoring. This may include hardware and software components deployed at subject locations (for example, in-vehicle monitoring systems, portable device monitoring systems, and so on), and cloud-based hardware and software (for example, cloud-based blepharometric data processing systems). The technology allows for user blepharometric data to be collected across a plurality of monitoring sessions, in some cases via different collection technologies, thereby to analyze changes over time. For example, this can assist in identifying risks of degenerative neurological conditions.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a national phase entry under 35 U.S.C. § 371 of International Patent Application PCT/AU2019/051157, filed Oct. 23, 2019, designating the United States of America and published as International Patent Publication WO 2020/082124 A1 on Apr. 30, 2020, which claims the benefit under Article 8 of the Patent Cooperation Treaty to Australian Patent Application Serial Nos. 2018904026, 2018904027, 2018904028, all filed Oct. 23, 2018; Australian Patent Application Serial No. 2018904076 filed Oct. 27, 2018; Australian Patent Application Serial No. 2018904312 filed Nov. 13, 2018; and Australian Patent Application Serial No. 2019900229 filed Jan. 25, 2019.
  • TECHNICAL FIELD
  • The present disclosure relates, in various embodiments, to devices and processing systems configured to enable extended monitoring and analysis of subject neurological factors via blepharometric data collection, for example, in the context of analysis of neurological conditions using blepharometric data (data that records eyelid movement parameters as a function of time). For example, some embodiments provide methods and associated technology that enable detection of changes in neurological conditions in a human subject (for example, to assist in management/identification of conditions that may be associated with seizures, degenerative diseases, and the like). While some embodiments will be described herein with particular reference to that application, it will be appreciated that the disclosure is not limited to such a field of use, and is applicable in broader contexts.
  • BACKGROUND
  • Any discussion of the background art throughout the specification should in no way be considered as an admission that such art is widely known or forms part of common general knowledge in the field.
  • It is known to analyze neurological conditions from analysis of eye and/or eyelid movements. For example, U.S. Pat. No. 7,791,491 teaches a method and apparatus for measuring drowsiness based on the amplitude to velocity ratio for eyelids closing and opening during blinking as well as measuring duration of opening and closing. This enables an objective measurement of drowsiness.
  • The present inventors, through their research into relationships between eye and eyelid movement parameters and neurological conditions, have identified opportunities for probabilistic prediction and/or detection of additional neurological conditions via analysis of eyelid movement parameters.
  • BRIEF SUMMARY
  • It is an object of the present disclosure to overcome or ameliorate at least one of the disadvantages of the prior art, or to provide a useful alternative.
  • One embodiment provides a system configured to facilitate collection of blepharometric data from one or more subjects on a periodic basis thereby to enable extended time period analysis of subject neurological conditions, the system including:
      • a sensor device configured to be mounted in a vehicle such that the sensor device is positioned to enable collection of blepharometric data from a passenger or operator of the vehicle;
      • a sensor data handling module that communicates data derived from the sensor device to a blepharometric data monitoring system that includes:
        • (i) a subject identification module that is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device;
        • (ii) a blepharometric data input processing module configured to process blepharometric data received from the sensor device from the identified human subject thereby to define current blepharometric data for the human subject;
        • (iii) a memory module that is configured to maintain a record of historical blepharometric data for the identified human subject;
        • (iv) a blepharometric data variation processing module that is configured to identify a relationship between the current blepharometric data for the human subject and the record of historical blepharometric data for the identified human subject, thereby to identify presence of one or more blepharometric data variation indicators; and
        • (v) an output module that is configured to provide a data output in response to identification of a blepharometric data variation indicator.
  • One embodiment provides a system wherein the sensor device is an image capture device.
  • One embodiment provides a system wherein the system includes an image processing system that is configured to: (i) detect presence of a human face; (ii) identify one or more eye regions in the human face; and (iii) based on identification of the one or more eye regions, generate blepharometric data representative of eyelid position against time.
  • One embodiment provides a system wherein the subject identification module, which is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device, leverages a facial recognition process thereby to extract biometric facial information from one or more frames of image data collected via the image capture device.
  • One embodiment provides a system wherein the subject identification module is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device via collection of biometric data.
  • One embodiment provides a system wherein the biometric data includes facial data.
  • One embodiment provides a system wherein the subject identification module is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device via user input of identifying credentials.
  • One embodiment provides a system wherein the subject identification module is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device via communication with a user mobile device, which includes a token representative of identifying credentials.
  • One embodiment provides a system wherein defining current blepharometric data for the human subject includes processing blepharometric data for a period or sub-period of continuous blepharometric data collection via the sensor device, thereby to extract a set of blepharometric data artefacts.
  • One embodiment provides a system wherein the blepharometric data artefacts include any one or more of the following:
      • measurements defined by, or representative of statistical attributes of, blink total duration (BTD);
      • measurements defined by, or representative of statistical attributes of, inter-event duration (IED);
      • measurements defined by, or representative of statistical attributes of, blink amplitudes;
      • measurements defined by, or representative of statistical attributes of, eyelid velocities;
      • measurements defined by, or representative of statistical attributes of, amplitude to velocity ratios for blink events;
      • measurements defined by, or representative of statistical attributes of, negative inter-event duration (negative IED);
      • blink event counts, including blink event counts for a set of blink events having defined attributes occurring within a defined time period;
      • measurements defined by, or representative of statistical attributes of, blink eye closure duration (BECD); and
      • measurements defined by, or representative of statistical attributes of, duration of ocular quiescence (DOQ).
  • One embodiment provides a system wherein the memory module that is configured to maintain a record of historical blepharometric data for the identified human subject includes statistical information derived from processing of blepharometric data collected across a plurality of previous periods.
  • One embodiment provides a system wherein the blepharometric data collected across a plurality of previous periods is collected via a plurality of physically distinct collection systems.
  • One embodiment provides a system wherein the blepharometric data variation processing module is configured to identify a relationship between the current blepharometric data for the human subject and the record of historical blepharometric data for the identified human subject by processing methods including one or more of the following:
      • identification of long-term trends in blepharometric data; and
      • identification of threshold current point-in-time deviations from historical statistical data.
  • One embodiment provides a system wherein identifying a relationship between the current blepharometric data for the human subject and the record of historical blepharometric data for the identified human subject thereby to identify a long-term trend in blepharometric data includes determining whether, in response to a current set of blepharometric data, there is an identified threshold trend in one or more of the user's observed blepharometric artefacts that satisfies a predefined profile that is representative of prediction of a neurological condition.
  • One embodiment provides a system wherein identifying a relationship between the current blepharometric data for the human subject and the record of historical blepharometric data for the identified human subject thereby to identify a threshold current point-in-time deviation from historical statistical data includes determining whether the current set of blepharometric data, alone or in combination with one or more recent sets of blepharometric data, displays a threshold deviation in one or more of the user's observed blepharometric artefacts compared to historical averages, wherein that deviation is representative of prediction of a neurological condition.
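The two variation-detection methods recited above — identification of long-term trends and identification of threshold point-in-time deviations from historical statistical data — can be illustrated with a short sketch. The function names, z-score criterion, and slope threshold below are assumptions chosen for illustration, not the claimed implementation:

```python
from statistics import mean, stdev

def point_in_time_deviation(current: float, history: list[float],
                            z_threshold: float = 2.0) -> bool:
    """Flag a variation indicator when the current artefact value deviates
    from the subject's historical average by more than z_threshold
    standard deviations (an assumed criterion for illustration)."""
    if len(history) < 2:
        return False  # not enough history to establish a baseline
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return current != mu
    return abs(current - mu) / sigma > z_threshold

def long_term_trend(history: list[float], slope_threshold: float = 0.05) -> bool:
    """Flag a variation indicator when a simple least-squares slope across
    successive collection periods exceeds an assumed threshold."""
    n = len(history)
    if n < 2:
        return False
    xs = range(n)
    x_mean, y_mean = mean(xs), mean(history)
    slope = sum((x - x_mean) * (y - y_mean)
                for x, y in zip(xs, history)) / sum((x - x_mean) ** 2 for x in xs)
    return abs(slope) > slope_threshold

# Example: a subject's mean blink total duration (seconds) per session.
btd_history = [0.31, 0.30, 0.32, 0.31, 0.33]
print(point_in_time_deviation(0.45, btd_history))  # clear outlier -> True
print(long_term_trend(btd_history))  # stable data, slope below threshold -> False
```

In a deployment matching the claims, per-session artefact statistics would stand in for the raw values here, and thresholds would be tuned against demographically relevant population data.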
  • One embodiment provides a system wherein the output module is configured to cause delivery of an output signal via an in-vehicle display.
  • One embodiment provides a system wherein the output module is configured to cause delivery of an output signal via an electronic message sent over a network.
  • One embodiment provides a system wherein the vehicle is an automobile, and wherein the sensor device is mounted on or adjacent a dashboard or windscreen region.
  • One embodiment provides a system including multiple sensor devices, each mounted in the vehicle positioned to enable monitoring eyelid movement by a respective passenger or operator of the vehicle.
  • One embodiment provides a system including the blepharometric data monitoring system.
  • One embodiment provides a device configured to facilitate collection of blepharometric data from one or more subjects on a periodic basis thereby to enable extended time period analysis of subject neurological conditions, the device including:
      • a sensor device configured to be mounted in a vehicle such that the sensor device is positioned to enable collection of blepharometric data from a passenger or operator of the vehicle;
      • a sensor data handling module that communicates data derived from the sensor device to a blepharometric data monitoring system, which includes:
        • (i) a subject identification module that is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device;
        • (ii) a blepharometric data input processing module configured to process blepharometric data received from the sensor device from the identified human subject thereby to define current blepharometric data for the human subject;
        • (iii) a memory module that is configured to maintain a record of historical blepharometric data for the identified human subject;
        • (iv) a blepharometric data variation processing module that is configured to identify:
      • point-in-time statistical variations between the current blepharometric data for the human subject and the record of historical blepharometric data for the identified human subject; and
      • threshold trends in blepharometric data over time based on repeated processing of the historical blepharometric data, as modified by newly received sets of blepharometric data including the current blepharometric data;
      • thereby to identify presence of one or more blepharometric data variation indicators; and
        • (v) an output module that is configured to provide a data output in response to identification of a blepharometric data variation indicator.
  • One embodiment provides a device wherein the sensor device is an image capture device.
  • One embodiment provides a device wherein the device includes an image processing system that is configured to: (i) detect presence of a human face; (ii) identify one or more eye regions in the human face; and (iii) based on identification of the one or more eye regions, generate blepharometric data representative of eyelid position against time.
  • One embodiment provides a device wherein the subject identification module, which is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device, leverages a facial recognition process thereby to extract biometric facial information from one or more frames of image data collected via the image capture device.
  • One embodiment provides a device wherein the subject identification module is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device via collection of biometric data.
  • One embodiment provides a device wherein the biometric data includes facial data.
  • One embodiment provides a device wherein the subject identification module is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device via user input of identifying credentials.
  • One embodiment provides a device wherein the subject identification module is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device via communication with a user mobile device, which includes a token representative of identifying credentials.
  • One embodiment provides a device wherein defining current blepharometric data for the human subject includes processing blepharometric data for a period or sub-period of continuous blepharometric data collection via the sensor device, thereby to extract a set of blepharometric data artefacts.
  • One embodiment provides a device wherein the blepharometric data artefacts include any one or more of the following:
      • measurements defined by, or representative of statistical attributes of, blink total duration (BTD);
      • measurements defined by, or representative of statistical attributes of, inter-event duration (IED);
      • measurements defined by, or representative of statistical attributes of, blink amplitudes;
      • measurements defined by, or representative of statistical attributes of, eyelid velocities;
      • measurements defined by, or representative of statistical attributes of, amplitude to velocity ratios for blink events;
      • measurements defined by, or representative of statistical attributes of, negative inter-event duration (negative IED);
      • blink event counts, including blink event counts for a set of blink events having defined attributes occurring within a defined time period;
      • measurements defined by, or representative of statistical attributes of, blink eye closure duration (BECD); and
      • measurements defined by, or representative of statistical attributes of, duration of ocular quiescence (DOQ).
  • One embodiment provides a device wherein the memory module that is configured to maintain a record of historical blepharometric data for the identified human subject includes statistical information derived from processing of blepharometric data collected across a plurality of previous periods.
  • One embodiment provides a device wherein the blepharometric data collected across a plurality of previous periods is collected via a plurality of physically distinct collection systems.
  • One embodiment provides a device wherein
      • point-in-time statistical variations between the current blepharometric data for the human subject and the record of historical blepharometric data for the identified human subject are determined via an in-vehicle processing system; and
      • threshold trends in blepharometric data over time are determined by a cloud-hosted processing system.
  • One embodiment provides a device wherein identifying a threshold trend includes identifying a threshold trend in one or more of the user's observed blepharometric artefacts that satisfies a predefined profile that is representative of prediction of a neurological condition.
  • One embodiment provides a device wherein identifying point-in-time statistical deviation includes determining whether the current set of blepharometric data, alone or in combination with one or more recent sets of blepharometric data, displays a threshold deviation in one or more of the user's observed blepharometric artefacts compared to historical averages, wherein that deviation is representative of prediction of a neurological condition.
  • One embodiment provides a device wherein the output module is configured to cause delivery of an output signal via an in-vehicle display.
  • One embodiment provides a device wherein the output module is configured to cause delivery of an output signal via an electronic message sent over a network.
  • One embodiment provides a device wherein the vehicle is an automobile, and wherein the sensor device is mounted on or adjacent a dashboard or windscreen region.
  • One embodiment provides a device including multiple sensor devices each mounted in the vehicle positioned to enable monitoring eyelid movement by a respective passenger or operator of the vehicle.
  • One embodiment provides a device including the blepharometric data monitoring system.
  • One embodiment provides a system configured to facilitate analysis of subject neurological conditions, the system including:
      • an input module configured to receive, from a plurality of sensor systems each respectively configured to enable monitoring eyelid movement by a human subject, a set of sensor data including (i) blepharometric data; and (ii) subject identification data;
      • a subject identification module that is configured to identify a unique human subject from which a set of blepharometric data is collected by a given one of the sensor devices;
      • a blepharometric data input processing module configured to process the set of blepharometric data collected by the sensor device thereby to define current blepharometric data for the identified human subject;
      • a memory module that is configured to maintain a record of historical blepharometric data for the identified human subject, and for one or more additional human subjects;
      • a blepharometric data variation processing module that is configured to identify threshold variations in blepharometric data over time based on repeated processing of the historical blepharometric data, as modified by newly received sets of blepharometric data including the current blepharometric data, thereby to identify presence of one or more blepharometric data variation indicators; and
      • an output module that is configured to provide a data output in response to identification of a blepharometric data variation indicator.
  • One embodiment provides a system wherein, for at least one of the sensor systems, the sensor system includes the sensor device including an image capture device that is configured to monitor blepharometric data.
  • One embodiment provides a system wherein the system includes an image processing system that is configured to: (i) detect presence of a human face; (ii) identify one or more eye regions in the human face; and (iii) based on identification of the one or more eye regions, generate blepharometric data representative of eyelid position against time.
  • One embodiment provides a system wherein the subject identification module, which is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device, leverages a facial recognition process thereby to extract biometric facial information from one or more frames of image data collected via the image capture device.
  • One embodiment provides a system wherein the subject identification module is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device via collection of biometric data.
  • One embodiment provides a system wherein the biometric data includes facial data.
  • One embodiment provides a system wherein the subject identification module is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device via user input of identifying credentials.
  • One embodiment provides a system wherein the subject identification module is configured to identify a unique human subject from which a set of blepharometric data is collected by the sensor device via communication with a user mobile device, which includes a token representative of identifying credentials.
  • One embodiment provides a system wherein defining current blepharometric data for the human subject includes processing blepharometric data for a period or sub-period of continuous blepharometric data collection via the sensor device, thereby to extract a set of blepharometric data artefacts.
  • One embodiment provides a system wherein the blepharometric data artefacts include any one or more of the following:
      • measurements defined by, or representative of statistical attributes of, blink total duration (BTD);
      • measurements defined by, or representative of statistical attributes of, inter-event duration (IED);
      • measurements defined by, or representative of statistical attributes of, blink amplitudes;
      • measurements defined by, or representative of statistical attributes of, eyelid velocities;
      • measurements defined by, or representative of statistical attributes of, amplitude to velocity ratios for blink events;
      • measurements defined by, or representative of statistical attributes of, negative inter-event duration (negative IED);
      • blink event counts, including blink event counts for a set of blink events having defined attributes occurring within a defined time period;
      • measurements defined by, or representative of statistical attributes of, blink eye closure duration (BECD); and
      • measurements defined by, or representative of statistical attributes of, duration of ocular quiescence (DOQ).
  • One embodiment provides a system wherein the memory module that is configured to maintain a record of historical blepharometric data for the identified human subject includes statistical information derived from processing of blepharometric data collected across a plurality of previous periods.
  • One embodiment provides a system wherein the blepharometric data collected across a plurality of previous periods is collected via a plurality of physically distinct collection systems.
  • One embodiment provides a system including a module configured to determine point-in-time statistical variations between the current blepharometric data for the human subject and the record of historical blepharometric data for the identified human subject.
  • One embodiment provides a system wherein identifying a threshold trend includes identifying a threshold trend in one or more of the user's observed blepharometric artefacts that satisfies a predefined profile that is representative of prediction of a neurological condition.
  • One embodiment provides a system wherein identifying point-in-time statistical deviation includes determining whether the current set of blepharometric data, alone or in combination with one or more recent sets of blepharometric data, displays a threshold deviation in one or more of the user's observed blepharometric artefacts compared to historical averages, wherein that deviation is representative of prediction of a neurological condition.
  • One embodiment provides a system wherein the plurality of sensor systems include a selection of the following:
      • vehicle operator configurations 201, in which an image capture device is positioned to capture blepharometric data for an operator of the vehicle;
      • desktop/laptop computer configurations in which a webcam or other image capture device is used to monitor user blepharometric data subject to: (i) a foreground software application; and/or (ii) a background software application that collects blepharometric data while a user engages in other activities on the computer;
      • mass transport passenger configurations in which an image capture device operates in conjunction with a display screen, such that blepharometric data is collected concurrently with the user observing content displayed via the display screen;
      • vehicle passenger configurations in which an image capture device is positioned to capture blepharometric data for a passenger of the vehicle;
      • smartphone/tablet configurations in which a front facing camera is used to monitor user blepharometric data, subject to: (i) a foreground application; and/or (ii) a background application that collects blepharometric data while a user engages in other activities on the smartphone/tablet; and
      • medical facility configurations.
  • One embodiment provides a system wherein the plurality of sensor systems includes a plurality of in-vehicle blepharometric data monitoring systems.
  • One embodiment provides a system wherein, for at least a subset of the in-vehicle blepharometric data monitoring systems, the vehicle is an automobile, and wherein the sensor device is mounted on or adjacent a dashboard or windscreen region.
  • One embodiment provides a system including multiple sensor devices each mounted in the vehicle positioned to enable monitoring eyelid movement by a respective passenger or operator of the vehicle.
  • One embodiment provides a system wherein the system includes a cloud-based processing facility.
  • One embodiment provides a system configured to facilitate monitoring of subject neurological conditions, the system including:
      • an input module configured to receive, from a camera system positioned to collect image data from a region that is predicted to contain a human face in a vehicle, a set of image data including a human face;
      • a blepharometric data extraction module that is configured to process the image data thereby to detect and record eyelid movement data, thereby to define a set of blepharometric data;
      • a facial data extraction module configured to extract facial biometric data thereby to enable unique identification of the human subject;
      • a blepharometric data handling module configured to cause a blepharometric data management system to associate the set of blepharometric data with the human subject.
  • One embodiment provides a portable electronic device including:
      • a display screen; and
      • a front-facing camera;
      • wherein the portable electronic device is configured to concurrently execute: (i) a first software application that provides data via the display screen; and (ii) a second software application that receives input from the front facing camera thereby to facilitate detection and analysis of blepharometric data.
  • One embodiment provides a device wherein the first software application is a messaging application.
  • One embodiment provides a device wherein the first software application is a social media application.
  • One embodiment provides computer-executable code that, when executed, causes delivery via a computing device of a messaging software application, wherein the computer-executable code is additionally configured to collect data from a front-facing camera thereby to facilitate analysis of blepharometric data.
  • One embodiment provides computer-executable code that, when executed, causes delivery via a computing device of a social media software application, wherein the computer-executable code is additionally configured to collect data from a front-facing camera thereby to facilitate analysis of blepharometric data.
  • One embodiment provides computer-executable code that when executed causes delivery via a computing device of a software application with which a user interacts for a purpose other than blepharometric data-based data collection, wherein the computer-executable code is additionally configured to collect data from a front-facing camera thereby to facilitate analysis of blepharometric data.
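The image-processing pipeline recited in several embodiments above (detect a human face, identify one or more eye regions, and generate blepharometric data representative of eyelid position against time) ends in a simple geometric reduction: per-frame eyelid landmarks become one amplitude sample per frame. A minimal sketch, assuming an upstream face/landmark detector has already produced upper-lid, lower-lid, and iris measurements (the EyeLandmarks fields are hypothetical names, not taken from the specification):

```python
from dataclasses import dataclass

@dataclass
class EyeLandmarks:
    """Per-frame eyelid landmarks (pixel coordinates). In practice these
    would come from a face/landmark detector; here they are assumed inputs."""
    upper_lid_y: float    # vertical position of the upper eyelid margin
    lower_lid_y: float    # vertical position of the lower eyelid margin
    iris_diameter: float  # used to normalise for camera distance

def eyelid_amplitude(frame: EyeLandmarks) -> float:
    """Reduce one frame's landmarks to a single eyelid-aperture sample,
    normalised by iris diameter so values are comparable across distances."""
    return (frame.lower_lid_y - frame.upper_lid_y) / frame.iris_diameter

def amplitude_series(frames: list[EyeLandmarks], fps: float) -> list[tuple[float, float]]:
    """Turn per-frame landmarks into (time_seconds, amplitude) pairs —
    i.e. blepharometric data: eyelid position as a function of time."""
    return [(i / fps, eyelid_amplitude(f)) for i, f in enumerate(frames)]

# Example: three frames at 60 fps, eye closing (aperture shrinking).
frames = [
    EyeLandmarks(100.0, 130.0, 30.0),  # open: aperture of 1.0 iris-diameters
    EyeLandmarks(110.0, 128.0, 30.0),  # mid-closure
    EyeLandmarks(126.0, 127.5, 30.0),  # nearly closed
]
series = amplitude_series(frames, fps=60.0)
print(series[0])  # (0.0, 1.0)
```

The resulting time series is the raw input from which the blepharometric artefacts listed in the embodiments (BTD, IED, amplitude-to-velocity ratios, and so on) would subsequently be extracted.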
  • Reference throughout this specification to “one embodiment,” “some embodiments” or “an embodiment” means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, appearances of the phrases “in one embodiment,” “in some embodiments” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment, but may be. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner, as would be apparent to one of ordinary skill in the art from this disclosure, in one or more embodiments.
  • As used herein, unless otherwise specified the use of the ordinal adjectives “first,” “second,” “third,” etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
  • In the claims below and the description herein, any one of the terms “comprising,” “comprised of” or “which comprises” is an open term that means including at least the elements/features that follow, but not excluding others. Thus, the term “comprising,” when used in the claims, should not be interpreted as being limitative to the means or elements or steps listed thereafter. For example, the scope of the expression a device comprising A and B should not be limited to devices consisting only of elements A and B. Any one of the terms “including” or “which includes” or “that includes” as used herein is also an open term that also means including at least the elements/features that follow the term, but not excluding others. Thus, “including” is synonymous with and means “comprising.”
  • As used herein, the term “exemplary” is used in the sense of providing examples, as opposed to indicating quality. That is, an “exemplary embodiment” is an embodiment provided as an example, as opposed to necessarily being an embodiment of exemplary quality.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Embodiments of the disclosure will now be described, by way of example only, with reference to the accompanying drawings in which:
  • FIG. 1A illustrates an in-vehicle blepharometric data collection/monitoring system according to one embodiment.
  • FIG. 1B illustrates an in-vehicle blepharometric data collection/monitoring system according to one embodiment.
  • FIG. 1C illustrates an in-vehicle blepharometric data collection/monitoring system according to one embodiment.
  • FIG. 1D illustrates an in-vehicle blepharometric data collection/monitoring system according to one embodiment.
  • FIG. 2 illustrates a blepharometric data monitoring framework according to one embodiment.
  • FIG. 3 illustrates a method according to one embodiment.
  • FIG. 4 illustrates a blepharometric data collection/monitoring system in a passenger airplane.
  • FIG. 5 illustrates an analysis system according to one embodiment.
  • FIG. 6 illustrates a method according to one embodiment.
  • FIG. 7 illustrates an analysis system according to one embodiment.
  • DETAILED DESCRIPTION
  • The present disclosure relates, in various embodiments, to extended monitoring and analysis of subject neurological factors via blepharometric data collection, for example, including devices and processing systems configured to enable such extended monitoring. This may include hardware and software components deployed at subject locations (for example, in-vehicle monitoring systems, portable device monitoring systems, and so on), and cloud-based hardware and software (for example, cloud-based blepharometric data processing systems).
  • Overview and Context
  • A human subject's involuntary blinks and eyelid movements are influenced by a range of factors, including the subject's behavioral state and brain function. For example, this relationship has been used in the past for detection of drowsiness. More broadly, analysis of data derived from eye and eyelid movements can be performed thereby to identify data artefacts, patterns and the like, and these are reflective of the subject's behavioral state, brain function and the like.
  • The technology described herein is focused on collection and analysis of “blepharometric data,” with the term “blepharon” describing a human eyelid. The term “blepharometric data” is used to define data that describes eyelid movement as a function of time. For example, eyelid position may be recorded as an amplitude. Eyelid movements are commonly categorized as “blinks” or “partial blinks.” The term “blepharometric data” is used to distinguish the technology described herein from other technologies that detect the presence of blinks for various purposes. The technology herein is focused on analyzing eyelid movement as a function of time, typically measured as an amplitude. This data may be used to infer the presence of what would traditionally be termed “blinks”; however, it is attributes of “events” and other parameters identifiable in eyelid movements that are of primary interest to the technologies described herein. Events and other parameters that are identified from the processing of blepharometric data are referred to as “blepharometric artefacts,” with such artefacts being identifiable by application of various processing algorithms to a data set that describes eyelid position as a function of time (i.e., blepharometric data). For example, the artefacts may include:
      • blink total duration (BTD), which is preferably measured as a time between commencement of closure movement that exceeds a defined threshold and completion of subsequent opening movement.
      • blink rates.
      • amplitude to velocity ratios (AVRs).
      • negative Inter-Event-Duration (IED) (discussed in detail further below).
      • positive IED.
      • negative AVR (i.e., during closure)
      • positive AVR (i.e., during re-opening)
      • AVR Product (negative AVR * positive AVR)
      • AVR ratio (negative AVR divided by positive AVR)
      • BECD (blink eye closure duration).
      • negative DOQ (duration of ocular quiescence)
      • positive DOQ
      • relative amplitude
      • relative position
      • maximum amplitude
      • maximum velocity
      • negative zero crossing index (ZCI).
      • positive ZCI
      • blink start position
      • blink end position
      • blink start time
      • blink end time
      • trends and changes in any of the above artefacts over a defined period.
  • The determination of blepharometric artefacts may include any one or more of:
      • Determination of a time period from blink initiation to blink completion (also referred to as a blink duration or blink length). Blink initiation and blink completion may be determined based on a determined “inter-blink” eyelid amplitude range, with movement outside that amplitude range being categorized as a blink.
      • Determination of a time period between blinks, optionally measured between blink initiation times for consecutive blinks.
  • Analysis of “events,” including relative timing of events, with an “event” being defined as any positive or negative deflection that is greater than a given velocity threshold for a given duration. In this regard, a “blink” is in some embodiments defined as the pairing of positive and negative events that are within relative amplitude limits and relative position limits. There may be multiple events within a given blink, when an eyelid is outside of an “inter-blink” eyelid amplitude range.
      • a time period for eye closure motion;
      • a time period during which the eye is closed;
      • a time period for eye re-opening motion; and
      • velocity measurements (including velocity estimation measurements) for eye closure motion and/or eye re-opening motion, which may be used for the purposes of determining amplitude-to-velocity ratios.
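The “event” definition above (any deflection whose velocity exceeds a threshold for at least a given duration) can be sketched as follows; this is a simplified illustration under assumed threshold values, not the implementation claimed herein:

```python
# Simplified sketch of velocity-threshold event detection: any deflection
# whose velocity (first difference of amplitude) exceeds a threshold for at
# least a minimum duration. Thresholds and the function name are assumptions.

def detect_events(amplitudes, sample_rate_hz, velocity_threshold=2.0,
                  min_duration_s=0.1):
    """Return (start_index, end_index, sign) per super-threshold run, with
    sign -1 for closure (negative deflection) and +1 for re-opening."""
    velocities = [(b - a) * sample_rate_hz
                  for a, b in zip(amplitudes, amplitudes[1:])]
    events, run_start, run_sign = [], None, 0
    for i, v in enumerate(velocities):
        sign = 1 if v > velocity_threshold else -1 if v < -velocity_threshold else 0
        if sign != run_sign:
            if run_sign and (i - run_start) / sample_rate_hz >= min_duration_s:
                events.append((run_start, i, run_sign))
            run_start, run_sign = (i, sign) if sign else (None, 0)
    if run_sign and (len(velocities) - run_start) / sample_rate_hz >= min_duration_s:
        events.append((run_start, len(velocities), run_sign))
    return events

# A closure followed by a re-opening at 10 Hz: one negative, one positive event.
events = detect_events([1.0, 0.5, 0.0, 0.0, 0.5, 1.0], sample_rate_hz=10)
# → [(0, 2, -1), (3, 5, 1)]
```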
  • Known eyelid movement monitoring systems (also referred to herein as blepharometric data monitoring systems) focus on point-in-time subject analysis. For example, commonly such technology is used as a means for assessing subject alertness/drowsiness at a specific moment, potentially benchmarked against known data for a demographically relevant population. There is a problem in that, for many neurological conditions, point-in-time assessment is inadequate. For example, many neurological conditions are degenerative and/or progressive, and for those and others point-in-time blepharometric data without historical baselines may be of limited usefulness. Currently, however, there is no practical way in which to collect blepharometric data for people, outside of requiring people to subject themselves to specialist testing (which is expensive and, for the bulk of the population, likely unfeasible).
  • A solution proposed herein is to deploy blepharometric data collection systems in a range of human environments, being environments in which humans are commonly positioned suitably for blepharometric data collection. Examples considered herein are vehicles (for example, cars, airplanes, trains, and the like), computing devices (for example, smartphones, tablets, and PCs), and other locations. This allows long term blepharometric data collection on an individualized basis, allowing for better management of neurological health (and other factors such as safety). For instance, specific use cases might include providing warnings in advance of seizures, informing a person of a risk of a degenerative brain illness, detection of brain injuries from accidents and/or sporting activities, and personalized detection of unusual levels of drowsiness.
  • In terms of behavioral state, there are many factors that have an effect on involuntary eyelid movements, with examples including: a subject's state of physical activity; a subject's posture; other aspects of a subject's positional state; subject movement; subject activity; how well slept the subject happens to be; levels of intoxication and/or impairment; and others. In terms of brain function, factors that have effects on involuntary eyelid movements include degenerative brain injuries (e.g., Parkinson's disease) and traumatic brain injuries.
  • Example Methodology
  • FIG. 3 illustrates a high-level methodology that is relevant to a range of embodiments discussed below. This methodology, depending on the specific hardware implementation used by a given embodiment, is optionally performed via software modules executing across a plurality of connected devices, for example, including local devices (for example, computing devices housed in a vehicle and/or user's mobile devices such as smartphones) and Internet-connected server devices (also referred to as “cloud” components). It should be appreciated that any computing devices and computer-executed methods configured for the purposes of enabling the overall performance of a methodology based on those described below by reference to FIG. 3 form embodiments of inventions for the purposes of this specification.
  • Block 301 represents a process including collecting data representative of eyelid movement (i.e., blepharometric data). For the majority of embodiments described below, this is achieved via a camera system having an image capture component that is positioned into a capture zone in which a subject's face is predicted to be positioned. For example, this may include:
  • Vehicles, including passenger vehicles or operator-only vehicles, wherein the image capture component is positioned to capture a region in which an operator's face is predicted to be contained during normal operation. For example, in the case of an automobile, the image capture component may include a camera mounting in or adjacent a dashboard or windscreen.
  • Vehicles, in the form of passenger vehicles, wherein the image component is positioned to capture a region in which a passenger's face is predicted to be contained during normal operation. For example, in the case of an automobile, the image capture component may include a camera mounting in or adjacent a dashboard or windscreen, the rear of a seat (including a seat headrest), and so on.
  • Mass transport vehicles, including passenger trains and/or aircraft, wherein the image component is positioned to capture a region in which a passenger's face is predicted to be contained during normal operation. For example, the image capture component may be mounted in the rear of a seat (including a seat headrest), optionally in a unit that contains other electronic equipment such as a display monitor.
  • Seating arrangements, such as theatres, cinemas, auditoriums, lecture theatres, and the like. Again, mounting image capture components in the rear of seats is an approach adopted in some embodiments.
  • The data that is captured is not limited to data captured for the purposes of extended monitoring and analysis of subject neurological factors via blepharometric data collection. For example, in some embodiments, that is one purpose, and there is an alternate purpose, which is optionally point-in-time based. For example, point-in-time drowsiness detection is relevant in many of the above scenarios. Furthermore, while embodiments below focus on individualized blepharometric data collection and/or monitoring, collected blepharometric data is optionally additionally collected for the purposes of group monitoring/analysis (including where blepharometric data is anonymized such that it is not attributable to a specific individual). For example, this may be used in the context of seating arrangements to assess overall student/viewer attention/drowsiness, or in the context of airplanes and other mass transport to perform analysis of passenger health factors.
  • Block 302 represents a process including identifying a subject from whom the blepharometric data collected at block 301 originates. This optionally includes:
  • Credential-based identification, for example, via a login. This may include pairing of a personal device (such as a smartphone) to blepharometric data monitoring system (e.g., pairing a phone to an in-vehicle system), inputting login credentials via an input device, or other means.
  • Biometric identification. For example, in some embodiments described herein, a camera-based blepharometric data monitoring system utilizes image data to additionally perform facial recognition functions, thereby to uniquely identify human subjects.
  • Other Forms of Identification.
  • Identification of the subject is relevant for the purposes of comparing current blepharometric data with historical blepharometric data for the same subject. For example, in some embodiments, an analysis system has access to a database of historical blepharometric data for one subject (for example, where the system is installed in a vehicle and monitors only a primary vehicle owner/driver) or multiple subjects (for example, a vehicle configured to monitor multiple subjects, or a cloud-hosted system that receives blepharometric data from a plurality of networked systems, as described further below).
  • Block 303 represents a process including determination of blepharometric artefacts for a current time period. For example, the artefacts may include:
      • blink total duration (BTD).
      • blink rates.
      • amplitude to velocity ratios (AVRs).
      • negative Inter-Event-Duration (IED).
      • positive IED.
      • negative AVR (i.e., during closure)
      • positive AVR (i.e., during re-opening)
      • AVR Product (negative AVR * positive AVR)
      • AVR ratio (negative AVR divided by positive AVR)
      • BECD (blink eye closure duration).
      • negative DOQ (duration of ocular quiescence)
      • positive DOQ
      • relative amplitude
      • relative position
      • maximum amplitude
      • maximum velocity
      • negative ZCI (zero crossing index)
      • positive ZCI
      • blink start position
      • blink end position
      • blink start time
      • blink end time
      • trends and changes in any of the above artefacts over the period.
  • The “current period” may be either a current period defined by a current user interaction with a blepharometric data monitoring system, or a subset of that period. For instance, in the context of a vehicle, the “current period” is in one example defined as a total period of time for which a user operates the vehicle and has blepharometric data monitored, and in another embodiment is a subset of that time. In some embodiments, multiple “current periods” are defined, for example, using time block samples of between two and fifteen minutes (which are optionally overlapping), thereby to compare blepharometric data activity during periods of varying lengths (which may be relevant for differing neurological conditions, which, in some cases, present themselves based on changes in blepharometric data over a given period of time).
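The overlapping “current period” sampling described above can be sketched as a simple sliding-window routine; the specific window and step lengths, and all names, are illustrative assumptions:

```python
# Hedged sketch of overlapping "current period" sampling: a session is carved
# into fixed-length time blocks advanced by a step smaller than the block
# length, so the same data is examined at several offsets and timescales.

def window_bounds(session_length_s, window_length_s, step_s):
    """Yield (start_s, end_s) bounds for overlapping windows over a session."""
    start = 0
    while start + window_length_s <= session_length_s:
        yield (start, start + window_length_s)
        start += step_s

# A 30-minute drive sampled with 10-minute windows advancing every 5 minutes.
windows = list(window_bounds(session_length_s=1800, window_length_s=600, step_s=300))
# → [(0, 600), (300, 900), (600, 1200), (900, 1500), (1200, 1800)]
```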
  • The current blepharometric data may be used for point-in-time neurological condition analysis, for example, analysis of subject alertness/drowsiness, prediction of seizures, detection of seizures, and other such forms of analysis. Specific approaches for analyzing blepharometric data thereby to detect/predict particular neurological conditions fall beyond the scope of the present disclosure.
  • Block 304 represents a process including identification of relationships between current blepharometric artefacts and historical blepharometric artefacts. This allows for artefacts extracted in the current blepharometric data to be given context relative to baselines/trends already observed for that subject. The concept of “identification of relationships” should be afforded a broad interpretation to include at least the following:
  • Identification of long-term trends. For example, blepharometric data collected over a period of weeks, months or years may be processed thereby to identify any particular blepharometric data artefacts that are evolving/trending over time. In some embodiments, algorithms are configured to monitor such trends, and these are defined with a set threshold for variation, which may be triggered in response to a particular set of current blepharometric data.
  • Identification of current point-in-time deviations from baselines derived from historical blepharometric data. For example, current data may show anomalous spiking in particular artefacts, or other differences from baselines derived from the subject's historical blepharometric data, which may give rise for concern. By way of example, this form of analysis may be used to determine/predict the presence of: (i) onset of a neurological illness or degenerative condition; (ii) presence of a brain injury, including a traumatic brain injury; (iii) impairment by alcohol, drugs, or other physical condition; (iv) abnormal levels of drowsiness; (v) neurotoxicity; or (vi) other factors.
  • In relation to onset of a neurological illness or degenerative condition, this may include either or both of short-term onsets (e.g., onset of neurological diseases and neurological conditions such as strokes and/or seizures) and long-term onsets (for which long-term detection rather than short-term detection is more appropriate, for example, Alzheimer's, Parkinson's, Multiple Sclerosis, and Muscular Dystrophy).
  • Block 305 represents a process including identification of presence of one or more blepharometric variation indicators, for example, based on the identification of relationships at block 304. These indicators may be used to allow data-based determination/prediction of the presence of: (i) onset of a neurological illness or degenerative condition; (ii) presence of a brain injury, including a traumatic brain injury; (iii) impairment by alcohol, drugs, or other physical condition; (iv) abnormal levels of drowsiness; (v) neurotoxicity; or (vi) other factors. By way of example, rules are defined that associate a data relationship (for example, deviation from baseline values, a trend identification, or the like) with a prediction on neurological condition. These may be defined, for example, using logical structures, such as:
      • if current 15-minute standard deviation of negative IED deviates by over 20% of the determined baseline 15-minute standard deviation of negative IED, raise alert X.
      • if current 1-minute average amplitude-to-velocity ratio greater than 40% of baseline 1-minute average amplitude-to-velocity ratio, raise alert Y.
      • if current data causes greater than 20% downward trend in artefact set A, comprising defined grouping of multiple blepharometric artefacts, then raise alert Z.
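The example rules above lend themselves to a data-driven sketch such as the following; the artefact keys, dictionary structure, and alert names are assumptions introduced for illustration, and only the first two rules are shown (the third, trend-based rule would additionally require accumulated trend state):

```python
# Sketch of the first two example rules as threshold checks against per-user
# baseline statistics. Keys, structure, and alert names are assumptions.

def evaluate_rules(current, baseline):
    """Return the alerts whose deviation thresholds are exceeded when
    current-period artefact statistics are compared against baselines."""
    alerts = []
    # Rule X: 15-minute negative-IED standard deviation deviates > 20% from baseline.
    if abs(current["neg_ied_sd_15m"] - baseline["neg_ied_sd_15m"]) \
            > 0.20 * baseline["neg_ied_sd_15m"]:
        alerts.append("X")
    # Rule Y: 1-minute average AVR exceeds baseline by more than 40%.
    if current["avr_mean_1m"] > 1.40 * baseline["avr_mean_1m"]:
        alerts.append("Y")
    return alerts

baseline = {"neg_ied_sd_15m": 0.050, "avr_mean_1m": 2.0}
current = {"neg_ied_sd_15m": 0.070, "avr_mean_1m": 2.2}
alerts = evaluate_rules(current, baseline)   # → ["X"]
```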
  • It should be appreciated that these are examples only, and that the present disclosure is directed to hardware and software that enables the implementation of such analysis/alert processes, as opposed to those processes themselves.
  • Block 306 represents a process including providing output to the human subject based on identified blepharometric variation indicators. This may include an instruction/suggestion to avoid a particular activity (such as driving), an instruction/suggestion to undertake a particular activity (such as taking medication, resting, walking around, or the like), or a suggestion to consult a medical expert about a potential neurological condition. The manner by which the output is delivered varies depending on both the nature of the alert/condition and the hardware environment in place. Examples range from the sending of emails or other messages to the display of information on a local device (for example, an in-vehicle display).
  • Various hardware/software embodiments configured to enable the above methodology are described below.
  • Example In-Vehicle Blepharometric Data Monitoring System
  • FIG. 1A illustrates an example in-vehicle blepharometric data monitoring system. While it is known to provide a blepharometric data monitoring system in a vehicle for the purposes of point-in-time analysis of alertness/drowsiness, the system of FIG. 1A provides for substantial advances in ability to perform analysis of a user's neurological condition by way of providing a memory module that stores historical blepharometric data, and enables analysis of changes in blepharometric data for the user over time.
  • The system of FIG. 1A includes an image capture device 120. This may include substantially any form of appropriately sized digital camera, preferably a digital camera with a frame rate of over 60 frames per second. Higher frame rate cameras are preferred, given that with enhanced frame rate comes an ability to obtain higher resolution data for eyelid movement.
  • Image capture device 120 is positioned to capture a facial region of a subject. Image capture device 120 is in one embodiment installed in a region of a vehicle in the form of an automobile, for example, on or adjacent the dashboard, windscreen, or visor, such that it is configured to capture a facial region of a driver. In another embodiment, image capture device 120 is positioned on or adjacent the dashboard, windscreen, or visor, such that it is configured to capture a facial region of a front seat passenger. In another embodiment, image capture device 120 is positioned in a region such as the rear of a seat such that it is configured to capture a facial region of a back-seat passenger. In some embodiments, a combination of these are provided, thereby to enable blepharometric data monitoring for both a driver and one or more passengers.
  • Although the system of FIG. 1A (and other systems) are described by reference to a vehicle in the form of an automobile, it will be appreciated that a system as described is also optionally implemented in other forms of vehicles, including mass-transport vehicles such as passenger airplanes, buses/coaches, and trains. In such embodiments, there are preferably one or more analysis systems each supporting a plurality of image capture devices, each positioned to capture a respective passenger.
  • An in-vehicle image processing system 110 is configured to receive image data from image capture device 120 (or multiple image capture devices 120), and process that data thereby to generate blepharometric data. A control module 111 is configured to control image capture device 120, operation of image data processing, and management of generated data. This includes controlling operation of image data processing algorithms, which are configured to:
      • i. Identify that a human face is detected.
      • ii. In embodiments where subject identification is achieved via facial recognition algorithms (which is not present in some embodiments, for example, embodiments that identify a subject via alternate means), perform a facial recognition process thereby to identify the subject. This may include identifying a known subject based on an existing subject record defined in user identification data 151 stored in a memory system 150, or identifying an unknown subject and creating a new subject record in user identification data 151 stored in a memory system 150.
      • iii. In a detected human face, identifying an eye region. In some embodiments, the algorithms are configured to track one eye region only; in other embodiments, both eye regions are tracked thereby to improve data collection.
      • iv. Identify, in the eye region(s), presence and movement of an eyelid. For example, in a preferred embodiment, this is achieved by way of recording an eyelid position relative to a defined “open” position against time. This allows generation of blepharometric data in the form of eyelid position (amplitude) over time. It will be appreciated that such data provides for identification of events (for example, blink events) and velocity (for example, as a first derivative of position against time). In a preferred embodiment, a facial recognition algorithm is used to enable identification of: (i) a central position on an upper eyelid on a detected face; and (ii) at least two fixed points on the detected face. The two fixed points on the detected face are used to enable scaling of measurements of movement of the central position of the upper eyelid thereby to account for changes in relative distance between the user and the camera. That is, a distance between the two fixed points is used as a means to determine position of the face relative to the camera, including position by reference to distance from the camera (as the user moves away, the distance between the fixed points decreases).
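The scaling described in item (iv) can be sketched as follows: the distance between the two fixed facial landmarks serves as a per-frame scale factor, so the same physical eyelid movement yields the same scaled amplitude regardless of distance from the camera. The landmark coordinates and helper name are illustrative assumptions:

```python
# Sketch of fixed-point scaling: the distance between two fixed facial
# landmarks shrinks as the subject moves away from the camera, so raw eyelid
# displacement (in pixels) is normalized by that distance.

import math

def scaled_eyelid_amplitude(eyelid_displacement_px, fixed_a_px, fixed_b_px):
    """Express eyelid displacement as a fraction of the inter-landmark
    distance, making the measurement invariant to camera distance."""
    landmark_distance = math.dist(fixed_a_px, fixed_b_px)
    return eyelid_displacement_px / landmark_distance

# The same physical movement seen near (landmarks 100 px apart) and far
# (landmarks 50 px apart) yields the same scaled amplitude.
near = scaled_eyelid_amplitude(20.0, (0, 0), (100, 0))   # → 0.2
far = scaled_eyelid_amplitude(10.0, (0, 0), (50, 0))     # → 0.2
```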
  • Algorithms 112 optionally operate to extract additional artefacts from blepharometric data, for example, amplitude-velocity ratios, blink total durations, inter-event durations, and the like. It will be appreciated, however, that extraction of such artefacts may occur in downstream processing.
  • A blepharometric data management module 113 is configured to coordinate storage of blepharometric data generated by algorithms 112 in user blepharometric data 152. This includes determining a user record against which blepharometric data is to be recorded (in some cases there is only a single user record, for example, where blepharometric data is collected only from a primary driver of an automobile). In some embodiments, the function of blepharometric data management module 113 includes determining whether a set of generated blepharometric data meets threshold data quality requirements for storage, for example, based on factors including a threshold unbroken time period for which eyelid tracking is achieved and blepharometric data is generated.
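A minimal sketch of such a quality gate, assuming a hypothetical 30-second unbroken-tracking threshold (the threshold, flag representation, and function name are illustrative):

```python
# Minimal sketch of a data quality gate: a recording is stored only if its
# longest unbroken eyelid-tracking run meets a minimum duration. The
# 30-second threshold is a hypothetical value.

def meets_quality_threshold(tracked_flags, sample_rate_hz, min_unbroken_s=30.0):
    """tracked_flags holds per-sample booleans, True while the eyelid was tracked."""
    longest = run = 0
    for tracked in tracked_flags:
        run = run + 1 if tracked else 0
        longest = max(longest, run)
    return longest / sample_rate_hz >= min_unbroken_s

# A 40-second unbroken run at 10 Hz passes; 10 seconds of tracking does not.
passes = meets_quality_threshold([True] * 400 + [False] + [True] * 100, 10)
fails = meets_quality_threshold([True] * 100, 10)
```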
  • Memory system 150 includes user identification data 151 for one or more users. As noted, in some embodiments, system 101 is configured to collect and analyze blepharometric data for only a single user (for instance, the primary driver of a vehicle) and includes identification data to enable identification of only that user. In other embodiments, system 101 includes functionality to collect and analyze blepharometric data for multiple users, and includes identification data to enable identification of any of those users (and optionally, as noted above, defining of a new record for a previously unknown user). The identification data may include login credentials (for example, a user ID and/or password) that are inputted via an input device. Alternately, the identification data may be biometric, for example, using facial recognition as discussed above or an alternate biometric input (such as a fingerprint scanner). In some embodiments, this leverages an existing biometric identification system of the vehicle.
  • User blepharometric data 152 includes data associated with identified users, the data being time-coded thereby to enable identification of a date/time at which data was collected. The blepharometric data stored in user blepharometric data 152 optionally includes blepharometric data generated by algorithms 112 and further blepharometric data derived from further processing of that data, for example, data representing average periodic IEDs and/or BTDs, and other relevant statistics that may be determined over time. In some embodiments, data processing algorithms are updated over time, for example, to allow analysis of additional biomarkers determined to be representative of neurological conditions that require extraction of particular artefacts from blepharometric data.
  • Analysis modules 130 are configured to perform analysis of user blepharometric data 152. This includes executing a process including identification of relationships between current blepharometric artefacts (e.g., data recently received from in-vehicle image processing system 110) and historical blepharometric artefacts (e.g., older data pre-existing in memory system 150). This allows for artefacts extracted in the current blepharometric data to be given context relative to baselines/trends already observed for that subject. The concept of “identification of relationships” should be afforded a broad interpretation to include at least the following:
  • Identification of long-term trends. For example, blepharometric data collected over a period of weeks, months or years may be processed thereby to identify any particular blepharometric data artefacts that are evolving/trending over time. In some embodiments, algorithms are configured to monitor such trends, and these are defined with a set threshold for variation, which may be triggered in response to a particular set of current blepharometric data.
  • Identification of current point-in-time deviations from baselines derived from historical blepharometric data. For example, current data may show anomalous spiking in particular artefacts, or other differences from baselines derived from the subject's historical blepharometric data, which may give rise for concern. By way of example, this form of analysis may be used to determine/predict the presence of: (i) onset of a neurological illness or degenerative condition; (ii) presence of a brain injury, including a traumatic brain injury; (iii) impairment by alcohol, drugs, or other physical condition; (iv) abnormal levels of drowsiness; (v) neurotoxicity; or (vi) other factors.
  • Analysis modules are optionally updated over time (for example, via firmware updates or the like) thereby to allow for analysis of additional blepharometric artefacts and hence identification of neurological conditions. For example, when a new method for processing blepharometric data thereby to predict a neurological condition based on a change trend in one or more blepharometric artefacts is developed, an analysis algorithm for that method is preferably deployed across a plurality of systems such as system 101 via a firmware update or the like.
  • System 101 additionally includes a communication system 160, which is configured to communicate information from system 101 to human users. This may include internal communication modules 161 that provide output data via components installed in the vehicle, for example, an in-car display, warning lights, and so on. External communication modules 162 are also optionally present, for example, to enable communication of data from system 101 to user devices (for example, via Bluetooth, WiFi, or other network interfaces), optionally by email or other messaging protocols. In this regard, communication system 160 is configured to communicate results of analysis by analysis modules 130.
  • A control system 141 includes logic modules 140, which control overall operation of system 101. This includes execution of logical rules thereby to determine communications to be provided in response to outputs from analysis modules 130. For example, this may include:
      • an in-vehicle notification in the event that a threshold level of drowsiness is detected.
      • an in-vehicle notification of another neurological condition.
      • an in-vehicle notification with an alert code that is to be inputted into an online system thereby to obtain further information regarding a detected/predicted neurological condition.
      • an external communication to a device/address defined in user identification data 151.
  • It will be appreciated that these are examples only, and logic modules 140 are able to provide a wide range of functionalities thereby to cause system 101 to act based on determinations by analysis modules 130.
  • It should be appreciated that the system illustrated in FIG. 1 provides technology whereby one or more digital cameras are able to be installed in a vehicle, such as an automobile or mass transport vehicle, thereby to: (i) collect blepharometric data for an operator and/or one or more passengers; and (ii) enable determination of relationships between blepharometric data collected in a “current” period (for example, a last data set, a last day, a last week, or a last month) with historical blepharometric data that is stored for that same user. This allows for functionalities including, but not limited to:
  • User-personalized drowsiness detection, based on detection of drowsiness-related blepharometric artefacts that are beyond a threshold deviation from average values for a particular user;
  • Prediction of neurological conditions, based on sudden changes and/or long term trends in change for one or more blepharometric artefacts that are known to be indicative of particular neurological conditions;
  • Personalized prediction of future neurological conditions, for example, prediction of future drowsiness based on known drowsiness development patterns extracted for the individual from historical data, and prediction of likelihood of a seizure based on individually-verified seizure prediction biomarkers identifiable in blepharometric data.
  • Identification of point-in-time relevant neurological conditions based on sudden deviations from historical averages, which may be representative of sudden neurological changes, for example, traumatic brain injuries (e.g., concussion) and/or impairment based on other factors (such as neurotoxicity, medications, drugs, alcohol, illness, and so on).
  • Example In-Vehicle Blepharometric Data Monitoring Systems, With Cloud-Based Analysis
  • FIG. 1B illustrates a further embodiment, which includes various common features with the embodiment illustrated in FIG. 1A. In general terms, in some embodiments, external communication modules 162 facilitate communication with a remote server device, which optionally performs additional blepharometric data analysis. In the example of FIG. 1B, external communication modules 162 enable communication between system 101 and a cloud-based blepharometric data analysis system 180.
  • Cloud system 180 includes a control system 182 and logic modules 181 that are provided by computer-executable code executing across one or more computing devices thereby to control and deliver functionalities of cloud system 180.
  • Cloud system 180 additionally includes a memory system 183, which includes user identification data 184 and user blepharometric data 185. The interplay between memory system 183 and memory system 150 varies between embodiments, with examples discussed below:
  • In some embodiments, memory system 150 operates in parallel with memory system 183, such that certain records are synchronized between the systems based on a defined protocol. For example, this optionally includes a given memory system 150 maintaining user blepharometric data and user identification data for a set of subjects that have presented at that in-vehicle system, and that data is periodically synchronized with the cloud system. For example, upon an unrecognized user presenting at a given in-vehicle system, the system optionally performs a cloud (or other external) query thereby to obtain identification data for that user, and then downloads from the cloud system historical user blepharometric data for that user. Locally collected blepharometric data is uploaded to the server. This, and other similar approaches, provides for transportability of user blepharometric data between vehicles.
  • In some embodiments, memory system 150 is used primarily for minimal storage, with cloud system 180 providing a main store for user blepharometric data. For example, in one example, memory system 150 includes data representative of historical blepharometric data baseline values (for instance, defined as statistical ranges), whereas detailed recordings of blepharometric data are maintained in the cloud system. In such embodiments, analysis modules 186 of cloud-based blepharometric data analysis system 180 perform more complex analysis of user blepharometric data thereby to extract the historical blepharometric data baseline values, which are provided to memory system 150 where a given user is present or known, thereby to facilitate local analysis of relationships from baselines.
  • In some embodiments, local memory system 150 is omitted, with all persistent blepharometric data storage occurring in cloud memory system 183.
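The synchronization flow described in the first of these examples (a cloud query for an unrecognized user, followed by upload of locally collected data) can be sketched with plain dictionary stores standing in for memory systems 150 and 183; all names and the record format are illustrative assumptions:

```python
# Hedged sketch of local/cloud synchronization: an unrecognized user triggers
# a cloud query for historical data, and locally collected data is uploaded
# back. Dictionaries stand in for memory systems 150 (vehicle) and 183 (cloud).

def present_user(user_id, local_store, cloud_store):
    """Ensure the local store holds the user's history, then return it."""
    if user_id not in local_store:
        # External (cloud) query for a user this vehicle has not seen before.
        local_store[user_id] = list(cloud_store.get(user_id, []))
    return local_store[user_id]

def upload_session(user_id, session_data, local_store, cloud_store):
    """Record locally collected data and push it to the cloud store."""
    local_store.setdefault(user_id, []).append(session_data)
    cloud_store.setdefault(user_id, []).append(session_data)

cloud = {"alice": ["session-2019-01"]}
vehicle = {}
present_user("alice", vehicle, cloud)            # pulls the user's history
upload_session("alice", "session-2019-02", vehicle, cloud)
# vehicle["alice"] and cloud["alice"] both now hold both sessions.
```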
  • Cloud system 180 additionally includes analysis modules 186, which optionally perform a similar role to analysis modules 130 in FIG. 1A. In some embodiments, local and cloud analysis modules operate in a complementary fashion, for example, with analysis modules 130 performing relationship analysis relevant to point-in-time factors (for example, an altered/non-standard neurological state for a user by comparison with historical baselines, which warrants immediate intervention) and analysis modules 186 performing what is often more complex analysis of trends over time (which may be representative of degenerative neurological illnesses and the like, and does not require immediate local intervention in a vehicle). It will be appreciated that there exist a range of approaches for sharing processing (and memory storage) functions between an in-vehicle system and a cloud system, and configuration of these is optionally determined based on considerations such as network speeds/bandwidth, along with local memory and storage resource availability.
  • There are various advantages of incorporating a cloud-based system to operate with a plurality of in-vehicle systems, in particular an ability to maintain cloud storage of user identification data and user blepharometric data for a large number of users, and hence allow that data to “follow” the users between various vehicles over time. For example, a user may have a personal car with a system 101, and subsequently obtain a rental car, with its own system 101, while travelling; as a result of cloud system 180 the rental car system: has access to the user's historical blepharometric data; is able to perform relationship analysis of the current data collected therein against historical data obtained from the cloud system; and feeds the newly collected blepharometric data into the cloud system to further enhance the user's historical data store.
  • FIG. 1C illustrates a further variation where a user has a smartphone device 170 that executes a software application configured to communicate with a given local in-vehicle system 101 (for example, via Bluetooth or USB connection) and additionally with cloud system 180 (for example, via a wireless cellular network, WiFi connection, or the like). This provides functionality for communication between system 100 and cloud system 180 without needing to provide Internet connectivity to a vehicle (the in-vehicle system essentially uses smartphone 170 as a network device).
  • Using a smartphone device as an intermediary between system 101 and cloud system 180 is in some embodiments implemented in a manner that provides additional technical benefits. For example:
  • In some embodiments, smartphone 170 provides to system 101 data that enables identification of a unique user, avoiding a need for facial detection and/or other means. For instance, upon coupling a smartphone to an in-car system (which may include system 101 and one or more other in-car systems, such as an entertainment system) via Bluetooth, system 101 receives user identification data from smartphone 170.
  • In some embodiments, a most-recent version of a given user's historical blepharometric data (for example, defined as historical baseline values) is stored on smartphone 170, and downloaded to system 101 upon coupling.
  • In some embodiments, one or more functionalities of analysis modules 130 are alternatively performed via smartphone 170, in which case system 101 is optionally configured to be, in effect, a blepharometric data collection and communication system without substantive blepharometric data analysis functions (which are instead performed by smartphone 170, and optionally tailored via updating of smartphone app parameters by cloud system 180 for personalized analysis).
  • The use of smartphone 170 is also in some cases useful in terms of allowing users to retain individual control over their blepharometric data, with blepharometric data being stored on the user's smartphone rather than by an in-vehicle system.
  • FIG. 1D illustrates a further variation in which communication between a local system 101 and cloud system 180 operates in a similar manner to FIG. 1B, but where a smartphone 170 is still present. In such arrangements, the smartphone is optionally used as an output device for information derived from blepharometric data analysis, and/or as a device to confirm identity and approval for blepharometric data collection. For example, in one embodiment, a given system 101 identifies a user by way of biometric information (e.g., facial detection) using user identification data stored in memory system 183 of cloud system 180, and a message is sent to smartphone 170 allowing the user to confirm that they are indeed at the location of the relevant system 101, and providing an option to consent to blepharometric data monitoring.
  • Additional Mass-Transit Functions
  • A system such as that of FIG. 1A is also able to be integrated into other local systems thereby to provide control instructions to those systems in response to artefacts identified in blepharometric data. An example is provided in FIG. 4, wherein an aircraft 400 includes an in-vehicle blepharometric data analysis system 401, which is fed data from image capture devices including devices installed in seat-backs (for example, in a common housing with a seat-back display screen). System 401 is configured to feed data thereby to effect control instructions into an entertainment system 402 and a passenger health/comfort analysis system 403.
  • In this example, where each image capture device is provided in conjunction with a display screen that is configured to deliver audio-visual entertainment (for instance, as is common in airplanes), monitoring of subject blepharometric data may be used to provide an enhanced experience with respect to audio-visual data. This may include, for example:
      • Decreasing screen brightness and/or volume in response to detection of drowsiness and/or commencement of sleep.
      • Deactivation of a screen in response to threshold drowsiness and/or sleep (preferably in combination with pausing of a media file for later resumption of playback).
      • Transition of an audio playback track to a sleep/relaxation-inducing track in response to threshold drowsiness and/or sleep (preferably in combination with pausing of a previously in-playback media file for later resumption of playback).
      • Delivery of feedback to a multiple passenger monitoring system thereby to facilitate delivery of passenger health and/or comfort monitoring and management based on identifications made from monitoring of blepharometric data.
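The seat-back entertainment responses listed above can be sketched as a simple controller that maps a drowsiness level (derived from blepharometric data) to screen and audio actions. This is a hypothetical illustration only: the thresholds, action names, and the assumption of a scalar drowsiness score in [0, 1] are not specified in the description above.

```python
# Illustrative thresholds; real values would be calibrated per subject.
DROWSY_THRESHOLD = 0.6
SLEEP_THRESHOLD = 0.9

def entertainment_actions(drowsiness):
    """Return control actions for a drowsiness score in [0, 1]."""
    actions = []
    if drowsiness >= SLEEP_THRESHOLD:
        # Threshold sleep: pause media for later resumption, deactivate
        # the screen, and transition audio to a relaxation track.
        actions += ["pause_media", "deactivate_screen", "play_relaxation_audio"]
    elif drowsiness >= DROWSY_THRESHOLD:
        # Drowsiness detected: soften the viewing experience.
        actions += ["decrease_brightness", "decrease_volume"]
    # Always report (non-personalized) state to the cabin monitoring system.
    actions.append("report_to_cabin_monitor")
    return actions
```

Note that the cabin-monitor report is emitted unconditionally, matching the feedback item in the list above, while the media actions are gated on thresholds.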
  • It will be appreciated that provision of a system that enables collection and analysis of blepharometric data from multiple passengers in a mass-transit vehicle may have additional far-reaching advantages in terms of optimizing passenger health and/or comfort during transportation.
  • In mass-transport embodiments, there is preferably a clear distinction between personalized health data, which is maintained privately on behalf of the user, and non-personalized statistical data, which may be shared with other systems/people. For instance, an individual's neurological conditions are not made available to airline personnel, whereas data representative of drowsiness/alertness statistics in a cabin is made available to airline personnel.
  • Example Cloud-Based Extended Blepharometric Data Monitoring Framework
  • FIG. 2 illustrates an exemplary framework under which a cloud-based blepharometric data analysis system 180 operates in conjunction with a plurality of disparate blepharometric data monitoring systems 201-206. Each of these systems is in communication with cloud system 180, such that user data (for example, user blepharometric data comprising historical data) is able to be utilized for analysis even where a user's blepharometric data is collected from physically distinct monitoring systems. Analysis of blepharometric data (for example, determination of relationships between current and historical data) may be performed at the cloud system 180, at the local systems 201-206, or combined across the cloud and local systems.
  • The local systems illustrated in FIG. 2 are:
  • Vehicle operator configurations 201. These are in-vehicle systems, such as those of FIGS. 1A-1D, in which the image capture device is positioned to capture blepharometric data for an operator of the vehicle.
  • Desktop/laptop computer configurations 202. In these configurations, a webcam or other image capture device is used to monitor user blepharometric data, with image-based eyelid movement detection as discussed herein. This may occur subject to: (i) a foreground application (for example, an application that instructs a user to perform a defined task during which blepharometric data is collected); and/or (ii) a background application that collects blepharometric data while a user engages in other activities on the computer (for example, word processing and/or Internet browsing).
  • Mass-transport passenger configurations 203, for example, aircraft as illustrated in FIG. 4, buses, trains and the like. Ideally, these are configured such that an image capture device operates in conjunction with a display screen, such that blepharometric data is collected concurrently with the user observing content displayed via the display screen.
  • Vehicle passenger configurations 204. These are in-vehicle systems, such as that of FIGS. 1A-1D, in which the image capture device is positioned to capture blepharometric data for a passenger of the vehicle. For back-seat applications, these are optionally configured such that an image capture device operates in conjunction with a display screen, such that blepharometric data is collected concurrently with the user observing content displayed via the display screen. For front-seat applications, the camera is positioned based on a presumption that a front seat passenger will for a substantial proportion of the time pay attention to the direction of vehicle travel (e.g., watch the road).
  • Smartphone/tablet configurations 205. In these configurations, a front facing camera is used to monitor user blepharometric data, with image-based eyelid movement detection as discussed herein. This may occur subject to: (i) a foreground application (for example, an application that instructs a user to perform a defined task during which blepharometric data is collected); and/or (ii) a background application that collects blepharometric data while a user engages in other activities on the computer (for example, messaging and/or social media application usage).
  • Medical facility configurations 206. These may make use of image processing-based blepharometric data monitoring, and/or other means of data collection (such as infrared reflectance oculography spectacles). These provide a highly valuable component in the overall framework: due to centralized collection of blepharometric data for a given subject from multiple locations over an extended period of time, a hospital is able to perform point-in-time blepharometric data collection and immediately reference that against historical data thereby to enable identification of irregularities in neurological conditions.
  • FIG. 2 also shows how cloud system 180 is able to interact with a plurality of user mobile devices such as smartphone device 170. User identification data 184 provides addressing information thereby to enable cloud system 180 to deliver messages, alerts, and the like to correct user devices.
  • Beyond the advantages of providing an ability to carry user blepharometric data baselines and data collection between physical collection systems, an added benefit of a system such as that of FIG. 2 is an ability to personalize condition prediction algorithms for individual users. This is achieved by: (i) identifying a personalized blepharometric biomarker for a given user, wherein that blepharometric artefact is representative of a particular neurological condition; and (ii) configuring the system such that whenever that particular user is identified, an analysis system executes a process configured to monitor for that biomarker (and perform a defined action in response). For instance, it may be determined that a particular person displays a specific blepharometric biomarker (for example, threshold spiking in negative inter-event duration) in the lead-up to a seizure event; a process configured to monitor for that biomarker is initialized in response to identification of that person. For example, an analysis module of an in-vehicle device is configured for such monitoring once the person is detected, and provides a seizure warning when the biomarker is detected.
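The two-step personalization above (per-user biomarker rule, then a monitor activated on identification) can be sketched as follows. This is an assumed structure, not from the patent: the rule table, artefact key `negative_ied`, threshold value, and user identifier are all illustrative.

```python
# Per-user biomarker rules: which artefact to watch, the threshold at
# which the biomarker is considered present, and the action to raise.
BIOMARKER_RULES = {
    "user_42": {
        "artefact": "negative_ied",   # negative inter-event duration
        "threshold": 2.5,             # illustrative spike threshold
        "warning": "seizure_warning",
    },
}

def make_monitor(user_id):
    """Return a per-sample monitor for the identified user, or None if
    no personalized biomarker is known for that user."""
    rule = BIOMARKER_RULES.get(user_id)
    if rule is None:
        return None

    def monitor(sample):
        # sample: dict of blepharometric artefact measurements.
        value = sample.get(rule["artefact"])
        if value is not None and value >= rule["threshold"]:
            return rule["warning"]  # defined action, e.g. in-vehicle alert
        return None

    return monitor
```

The analysis system would call `make_monitor` once the user is identified, then feed each incoming blepharometric sample to the returned monitor.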
  • In embodiments where infrared reflectance oculography techniques are used, the blepharometric data is optionally defined by a reading made by an infrared reflectance sensor, and as such is a proxy for eyelid position. That is, rather than monitoring the actual position of an eyelid, infrared reflectance oculography techniques use reflectance properties and in so doing are representative of the extent to which an eye is open (as the presence of an eyelid obstructing the eye affects reflectivity). In some embodiments, additional information beyond eyelid position may be inferred from infrared reflectance oculography, for example, whether a subject is undergoing tonic eye movement. For the present purposes, “blepharometric data” in some embodiments includes infrared reflectance oculography measurements, and hence may additionally be representative of tonic eye movement.
  • Example Blepharometric Data Relationship Analysis System
  • FIG. 5 illustrates an example blepharometric data relationship analysis system, which may be incorporated into embodiments described above. In some cases, components/functionalities of this system are distributed across local and cloud-based processing systems.
  • One or more new sets of blepharometric data 501, which may be defined via any collection system, for instance, as shown in FIG. 2, are received by a new data processing module 502. Module 502 is configured to perform data validation and/or data cleaning, thereby to ensure that the data is suitable for analysis and/or storage. For example, data displaying irregularities and/or having a sample time below a given threshold is excluded. A new data storage module 503 is configured to coordinate storage of the new set or sets of data 501, following processing by module 502, into a data store 505 containing historical blepharometric data for the user.
  • A statistical value determination module 510 applies an expandable set of processing algorithms to data in store 505 thereby to extract a range of statistical values (for example, averages for blepharometric artefacts, optionally categorized based on collection conditions and other factors). These statistical values are stored in data store 505 thereby to maintain richer detail regarding baseline blepharometric data values for the user, preferably in a way that is tied to defined relationship analysis algorithms. That is, if an algorithm X to determine a condition Y relies on analysis of a blepharometric artefact Z, then statistical value determination module 510 is preferably configured to apply an algorithm configured to extract artefact Z from user blepharometric data.
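A minimal sketch of the baseline extraction performed by a module such as statistical value determination module 510 is given below. The record structure, artefact names, and choice of mean plus min/max range are assumptions for illustration; the description above requires only that baselines be stored as statistical values (for example, ranges) tied to the artefacts that the relationship-analysis algorithms consume.

```python
def compute_baselines(records, artefacts):
    """Compute baseline statistics over historical blepharometric records.

    records:   list of dicts mapping artefact name -> measurement
    artefacts: artefact names required by the relationship-analysis
               algorithms (per the algorithm X / artefact Z coupling above)
    """
    baselines = {}
    for artefact in artefacts:
        values = [r[artefact] for r in records if artefact in r]
        if not values:
            continue  # no historical measurements for this artefact yet
        baselines[artefact] = {
            "mean": sum(values) / len(values),
            # Simple statistical range stored alongside the mean.
            "min": min(values),
            "max": max(values),
        }
    return baselines
```

The returned structure is what a memory system such as 150 might cache locally: compact baseline values rather than full recordings.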
  • A new data relationship processing module 504 is configured to identify relationships between new data 501 and historical data 505. Data rules to facilitate the identification of particular relationships that are known to be representative (or predictively representative) of neurological conditions are defined in condition identification rules 506. Condition identification rules 506 are periodically updated based on new knowledge regarding blepharometric/neurological condition research. For example, a given rule defines a category of relationship between one or more blepharometric data artefacts in new data 501 and one or more baseline values extracted from historical data in data store 505 based on operation of statistical value determination module 510.
  • In the case that a defined category of relationship is identified by new data relationship processing module 504, representative data is passed to an output rules module 508 that contains logical rules defining how a user is to be notified (e.g., in-vehicle alert, message to smartphone app, or email), and in response a given output module 509 is invoked to provide the designated output.
  • A trend analysis module 507 is configured to continuously, periodically or in an event-driven manner (for example, in response to receipt of new blepharometric data), identify trends/changes in user blepharometric data. Again, data rules to facilitate the identification of particular trends that are known to be representative (or predictively representative) of neurological conditions are defined in condition identification rules 506. Condition identification rules 506 are periodically updated based on new knowledge regarding blepharometric/neurological condition research. For example, a given rule defines a threshold deviation in one or more artefacts over a threshold time as being predictively representative of a neurological condition.
  • Again, in the case that a defined category of relationship is identified by trend analysis module 507, representative data is passed to an output rules module 508, which contains logical rules that define how a user is to be notified (e.g., in-vehicle alert, message to smartphone app, or email), and in response a given output module 509 is invoked to provide the designated output.
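The rule-driven pipeline described for FIG. 5 (condition identification rules 506 checked against baselines, with matches routed through output rules to a notification channel) can be sketched as below. The rule contents — the artefact, the 50% relative-deviation threshold, and the channel names — are hypothetical examples, not values from the patent.

```python
# Condition identification rules (cf. rules 506): a threshold point-in-time
# deviation from a stored baseline is flagged as a named condition.
CONDITION_RULES = [
    {"artefact": "ied", "max_relative_deviation": 0.5, "condition": "ied_anomaly"},
]

# Output rules (cf. module 508): condition -> notification channel.
OUTPUT_RULES = {
    "ied_anomaly": "in_vehicle_alert",  # could also be smartphone app or email
}

def check_conditions(current, baselines):
    """Compare current artefact values against baselines.

    Returns a list of (condition, output_channel) pairs for each
    condition identification rule that fires.
    """
    outputs = []
    for rule in CONDITION_RULES:
        art = rule["artefact"]
        if art not in current or art not in baselines:
            continue  # no data to compare for this rule
        base = baselines[art]["mean"]
        deviation = abs(current[art] - base) / base
        if deviation > rule["max_relative_deviation"]:
            outputs.append((rule["condition"], OUTPUT_RULES[rule["condition"]]))
    return outputs
```

Updating condition identification rules in light of new research then amounts to editing the rule table, without changing the checking logic — one plausible reading of why the description keeps rules 506 separate from modules 504 and 507.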
  • It will be appreciated that, in this manner, the system of FIG. 5 is configurable to monitor for a range of neurological conditions that are identifiable in blepharometric data based on point-in-time variations from known baselines that are generated and refined over an extended period (i.e., based on a collection of time-separated data sets), and trends in blepharometric data over time (even where differences between consecutive data sets are relatively minor).
  • It will be appreciated that this form of data collection and analysis is of significant use in the context of predicting and understanding neurological conditions, for example, in terms of: (i) identifying potential degenerative conditions and rates of onset; (ii) identifying point-in-time events that led to sudden changes in neurological conditions; (iii) monitoring long-term effects of contact sports (e.g., concussive brain injuries) for participants; and (iv) personalizing blepharometric data analysis for individual users.
  • Referring to FIG. 7, one embodiment of system 700 provides a portable electronic device 701 including: a display screen 704; and a front-facing camera 702; wherein the portable electronic device is configured to concurrently execute, via software instructions 703, which execute on a processor of device 701: (i) a first software application that provides data via the display screen; and (ii) a second software application that receives input from the front facing camera thereby to facilitate detection and analysis of blepharon data. For example, the first software application is in one embodiment a messaging application, and in another embodiment a social media application.
  • One embodiment provides computer-executable code that, when executed, causes delivery via a computing device of a software application with which a user interacts for a purpose other than blepharon-based data collection, wherein the computer-executable code is additionally configured to collect data from a front-facing camera thereby to facilitate analysis of blepharon data. The purpose may be, for example, messaging or social media.
  • Embodiments such as that of FIG. 7 provide for collection of blepharon data via a background software application executing on an electronic device with a front-facing camera. This provides opportunities to analyze a device user's neurological condition, for example, in the context of predicting seizures, advising on activities, diagnosing potential neurological illnesses, detecting drowsiness, and so on.
  • Conclusions and Interpretation
  • It will be appreciated that the above disclosure provides analytic methods and associated technology that enables improved analysis of human neurological conditions.
  • It should be appreciated that in the above description of exemplary embodiments of the disclosure, various features of the disclosure are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the Detailed Description are hereby expressly incorporated into this Detailed Description, with each claim standing on its own as a separate embodiment of this invention.
  • Furthermore, while some embodiments described herein include some but not other features included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention, and form different embodiments, as would be understood by those skilled in the art. For example, in the following claims, any of the claimed embodiments can be used in any combination.
  • Furthermore, some of the embodiments are described herein as a method or combination of elements of a method that can be implemented by a processor of a computer system or by other means of carrying out the function. Thus, a processor with the necessary instructions for carrying out such a method or element of a method forms a means for carrying out the method or element of a method. Furthermore, an element described herein of an apparatus embodiment is an example of a means for carrying out the function performed by the element for the purpose of carrying out the disclosure.
  • In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the disclosure may be practiced without these specific details. In other instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
  • Similarly, it is to be noticed that the term coupled, when used in the claims, should not be interpreted as being limited to direct connections only. The terms “coupled” and “connected,” along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other. Thus, the scope of the expression a device A coupled to a device B should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. It means that there exists a path between an output of A and an input of B, which may be a path including other devices or means. “Coupled” may mean that two or more elements are either in direct physical or electrical contact, or that two or more elements are not in direct contact with each other but yet still co-operate or interact with each other.
  • Thus, while there has been described what are believed to be the preferred embodiments of the disclosure, those skilled in the art will recognize that other and further modifications may be made thereto without departing from the spirit of the invention, and it is intended to claim all such changes and modifications as falling within the scope of the invention. For example, any formulas given above are merely representative of procedures that may be used. Functionality may be added or deleted from the block diagrams and operations may be interchanged among functional blocks. Steps may be added or deleted to methods described within the scope of the present invention.

Claims (22)

1-61. (canceled)
62. A system configured to facilitate analysis of subject neurological conditions, the system including:
an input module configured to receive, from a plurality of sensor systems each respectively configured to enable collection of blepharometric data from a human subject, a set of sensor data including (i) blepharometric data; and (ii) subject identification data;
a subject identification module that is configured to identify a unique human subject from which a set of blepharometric data is collected by a given one of the plurality of sensor systems;
a blepharometric data input processing module configured to process the set of blepharometric data collected by the plurality of sensor systems thereby to define current blepharometric data for the identified unique human subject;
a memory module that is configured to maintain a record of historical blepharometric data for the identified unique human subject, and for one or more additional human subjects;
a blepharometric data variation processing module that is configured to identify threshold variations in blepharometric data over time based on repeated processing of the historical blepharometric data, as modified by newly received sets of blepharometric data including the current blepharometric data, thereby to identify presence of one or more blepharometric data variation indicators; and
an output module that is configured to provide a data output in response to identification of a blepharometric data variation indicator.
63. A system according to claim 62, wherein the current blepharometric data for the identified unique human subject and the historical blepharometric data for the identified unique human subject are each defined via a common sensor system.
64. A system according to claim 63, wherein the common sensor system is a sensor system configured to monitor a driver or passenger in a vehicle.
65. A system according to claim 62, wherein the sensor system configured to monitor a driver or passenger in a vehicle is configured to identify a unique human subject via biometric identification.
66. A system according to claim 62, wherein the current blepharometric data for the identified unique human subject and the historical blepharometric data for the identified unique human subject are each defined via a plurality of different sensor systems.
67. A system according to claim 66, wherein the plurality of different sensor systems include at least one system configured to monitor a driver or passenger in a vehicle.
68. A monitoring system for a vehicle operator, the system including:
(i) an in-vehicle sensor device configured to be mounted in a vehicle such that the in-vehicle sensor device is positioned to enable collection of blepharometric data from an operator of the vehicle;
(ii) a memory module that is configured to maintain a unique record of data for at least one unique human subject;
(iii) an identification module configured to identify the vehicle operator as being the unique human subject; and
(iv) an operator assessment module configured to assess a state of the vehicle operator based on a combination of: (A) blepharometric data received from the in-vehicle sensor device; and (B) the unique record of data maintained by the memory module for the vehicle operator;
such that the operator assessment module is configured to customise assessment of the vehicle operator from the blepharometric data received from the in-vehicle sensor device based on unique information held for the vehicle operator.
69. A system according to claim 68, wherein the in-vehicle sensor device is an image capture device.
70. A system according to claim 69, wherein the system includes an image processing system that is configured to: (i) detect presence of a human face; (ii) identify one or more eye regions in the human face; and (iii) based on identification of the one or more eye regions, generate blepharometric data representative of eyelid position against time.
71. A system according to claim 69, wherein the identification module leverages a facial recognition process thereby to extract biometric facial information from one or more frames of image data collected via the image capture device.
72. A system according to claim 62, wherein the subject identification module is configured to identify a unique human subject from which a set of blepharometric data is collected by the plurality of sensor systems via collection of biometric data.
73. A system according to claim 68, wherein the blepharometric data includes any one or more of the following:
measurements defined by, or representative of statistical attributes of, blink total duration (BTD);
measurements defined by, or representative of statistical attributes of, inter-event duration (IED);
measurements defined by, or representative of statistical attributes of, blink amplitudes;
measurements defined by, or representative of statistical attributes of, eyelid velocities;
measurements defined by, or representative of statistical attributes of, amplitude to velocity ratios for blink events;
measurements defined by, or representative of statistical attributes of, negative inter-event duration (negative IED);
blink event counts, including blink event counts for a set of blink events having defined attributes occurring within a defined time period;
measurements defined by, or representative of statistical attributes of, blink eye closure duration (BECD); and
measurements defined by, or representative of statistical attributes of, duration of ocular quiescence (DOQ).
74. A system according to claim 68, wherein a blepharometric data variation processing module is configured to identify a relationship between current blepharometric data for the vehicle operator and data derived from historical blepharometric data for the vehicle operator.
75. A system according to claim 68, wherein a blepharometric data variation processing module is configured to identify a relationship between current blepharometric data for the vehicle operator and data derived from historical blepharometric data for the vehicle operator by processing methods including one or more of the following:
identification of long-term trends in blepharometric data; and
identification of threshold current point-in-time deviations from historical statistical data.
76. A method configured to facilitate analysis of subject neurological conditions, the method operable in a system in which a plurality of distinct sensor systems are each configured to collect blepharometric data from a plurality of human subjects, the method including:
receiving a current set of blepharometric data from an identified unique one of the plurality of human subjects via a first one of the plurality of distinct sensor systems; and
processing the current set of blepharometric data in combination with additional data derived from historical blepharometric data for the same unique one of the plurality of human subjects, wherein the historical blepharometric data is collected from one or more others of the plurality of distinct sensor systems;
thereby to perform an assessment of a neurological condition of the identified unique one of the plurality of human subjects based on a combination of (i) current blepharometric data collected via the first one of the plurality of distinct sensor systems; and (ii) the historical blepharometric data collected from the one or more others of the plurality of distinct sensor systems.
77. A method according to claim 76 including performing a process thereby to identify a relationship between current blepharometric data for a vehicle operator and data derived from historical blepharometric data for the vehicle operator.
78. A system according to claim 76 including performing a process thereby to identify a relationship between current blepharometric data for a vehicle operator and data derived from historical blepharometric data for the vehicle operator by processing methods including one or more of the following:
identification of long-term trends in blepharometric data; and
identification of threshold current point-in-time deviations from historical statistical data.
79. A method according to claim 76, wherein the blepharometric data includes any one or more of the following:
measurements defined by, or representative of statistical attributes of, blink total duration (BTD);
measurements defined by, or representative of statistical attributes of, inter-event duration (IED);
measurements defined by, or representative of statistical attributes of, blink amplitudes;
measurements defined by, or representative of statistical attributes of, eyelid velocities;
measurements defined by, or representative of statistical attributes of, amplitude to velocity ratios for blink events;
measurements defined by, or representative of statistical attributes of, negative inter-event duration (negative IED);
blink event counts, including blink event counts for a set of blink events having defined attributes occurring within a defined time period;
measurements defined by, or representative of statistical attributes of, blink eye closure duration (BECD); and
measurements defined by, or representative of statistical attributes of, duration of ocular quiescence (DOQ).
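Several of the measures listed in claim 79 (blink total duration, inter-event duration, amplitude to velocity ratio, blink event counts) can be summarised from per-blink event records. The sketch below is a hypothetical illustration: the event dictionary keys (`start`, `end`, `amplitude`, `max_velocity`) and the specific statistics computed are assumptions, not structures defined by the patent.

```python
from statistics import mean

def blink_statistics(blink_events):
    """Illustrative summary of a session's blink events. Each event is a
    dict with hypothetical keys 'start' and 'end' (seconds), 'amplitude'
    (normalised eyelid excursion) and 'max_velocity' (peak closure speed).
    BTD is taken as end - start; IED as the gap between consecutive
    events; AVR as amplitude divided by peak velocity."""
    durations = [e["end"] - e["start"] for e in blink_events]
    gaps = [b["start"] - a["end"]
            for a, b in zip(blink_events, blink_events[1:])]
    avrs = [e["amplitude"] / e["max_velocity"] for e in blink_events]
    return {
        "blink_count": len(blink_events),
        "mean_btd": mean(durations),
        "mean_ied": mean(gaps) if gaps else None,
        "mean_avr": mean(avrs),
    }
```

Statistics of this kind form the per-session record against which the historical comparisons of claims 76 to 78 would operate.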
80. A method for identifying risk of a degenerative neurological condition, the method including:
receiving, from one or more sensor systems each respectively configured to enable collection of blepharometric data from a human subject, a set of sensor data including (i) blepharometric data; and (ii) subject identification data, wherein each of the sensor systems is configured to collect the blepharometric data from the human subject whilst the human subject is engaged in a known task;
maintaining a record of blepharometric data collected via the one or more sensor systems having common subject identification data representing a unique individual; and
processing the record of blepharometric data having common subject identification data representing a unique individual thereby to identify a long-term trend in blepharometric data, wherein the long-term trend in blepharometric data is represented by a threshold trend in one or more blepharometric artefacts that satisfies a predefined profile that is representative of prediction of a neurological condition.
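One simple way to realise claim 80's "threshold trend in one or more blepharometric artefacts" is an ordinary least-squares slope over per-session values, compared against a threshold drift rate. This is a minimal sketch under stated assumptions: the patent does not specify least-squares fitting, and the function names and the slope-magnitude decision rule are hypothetical.

```python
def long_term_trend(sessions):
    """Illustrative sketch: estimate long-term drift of one blepharometric
    artefact across sessions. 'sessions' is a list of (timestamp_days,
    value) pairs; the ordinary least-squares slope gives drift per day."""
    n = len(sessions)
    sum_t = sum(t for t, _ in sessions)
    sum_v = sum(v for _, v in sessions)
    sum_tt = sum(t * t for t, _ in sessions)
    sum_tv = sum(t * v for t, v in sessions)
    denom = n * sum_tt - sum_t * sum_t
    if denom == 0:
        # All sessions share one timestamp; no trend can be estimated.
        return 0.0
    return (n * sum_tv - sum_t * sum_v) / denom

def exceeds_trend_threshold(sessions, slope_threshold):
    """Hypothetical decision rule: the predefined profile is satisfied
    when the absolute drift rate exceeds slope_threshold."""
    return abs(long_term_trend(sessions)) > slope_threshold
```

In practice a profile representative of a degenerative condition would likely combine trends across several artefacts and control for the known task, but the single-artefact slope shows the shape of the computation.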
81. A method according to claim 80, wherein the one or more sensor systems include: (i) an in-vehicle sensor system configured to monitor a vehicle operator in a first vehicle; and (ii) an in-vehicle system configured to monitor a vehicle operator in a second vehicle which is distinct from the first vehicle.
82. A method according to claim 80, wherein the one or more sensor systems include: (i) an in-vehicle sensor system configured to monitor a vehicle operator; and (ii) an in-vehicle system configured to monitor a vehicle passenger.
US17/288,360 2018-10-23 2019-10-23 Devices and processing systems configured to enable extended monitoring and analysis of subject neurological factors via blepharometric data collection Pending US20210386345A1 (en)

Applications Claiming Priority (13)

Application Number Priority Date Filing Date Title
AU2018904027A AU2018904027A0 (en) 2018-10-23 Analysis of neurological conditions, including detection of seizure events, based on analysis of blepharometric data
AU2018904026 2018-10-23
AU2018904028 2018-10-23
AU2018904027 2018-10-23
AU2018904028A AU2018904028A0 (en) 2018-10-23 Collection of blepharon data via background software application executing on electronic device with a front-facing camera
AU2018904026A AU2018904026A0 (en) 2018-10-23 Analysis of neurological conditions, including prediction of future seizure events, based on analysis of blepharometric data
AU2018904076 2018-10-27
AU2018904076A AU2018904076A0 (en) 2018-10-27 Methods and systems configured to enable improved analysis of involuntary eye and/or eyelid movement parameters, including analysis of brain function from involuntary eye and/or eyelid movement parameters
AU2018904312A AU2018904312A0 (en) 2018-11-13 Devices and processing systems configured to enable extended monitoring and analysis of subject neurological factors via blepharon data collection
AU2018904312 2018-11-13
AU2019900229 2019-01-25
AU2019900229A AU2019900229A0 (en) 2019-01-25 Devices and processing systems configured to enable physiological event prediction based on blepharonic analysis
PCT/AU2019/051157 WO2020082124A1 (en) 2018-10-23 2019-10-23 Devices and processing systems configured to enable extended monitoring and analysis of subject neurological factors via blepharometric data collection

Publications (1)

Publication Number Publication Date
US20210386345A1 true US20210386345A1 (en) 2021-12-16

Family

ID=70330269

Family Applications (3)

Application Number Title Priority Date Filing Date
US17/288,360 Pending US20210386345A1 (en) 2018-10-23 2019-10-23 Devices and processing systems configured to enable extended monitoring and analysis of subject neurological factors via blepharometric data collection
US17/288,336 Pending US20210378568A1 (en) 2018-10-23 2019-10-23 Devices and processing systems configured to enable physiological event prediction based on blepharometric data analysis
US17/287,882 Pending US20210345937A1 (en) 2018-10-23 2019-10-23 Analysis of neurological conditions, including prediction of future seizure events and/or detection of current seizure events, based on analysis of blepharometric data

Family Applications After (2)

Application Number Title Priority Date Filing Date
US17/288,336 Pending US20210378568A1 (en) 2018-10-23 2019-10-23 Devices and processing systems configured to enable physiological event prediction based on blepharometric data analysis
US17/287,882 Pending US20210345937A1 (en) 2018-10-23 2019-10-23 Analysis of neurological conditions, including prediction of future seizure events and/or detection of current seizure events, based on analysis of blepharometric data

Country Status (3)

Country Link
US (3) US20210386345A1 (en)
EP (3) EP3870046A4 (en)
WO (3) WO2020082123A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SE2130254A1 (en) * 2021-09-23 2023-03-24 Rths Ab A sensing arrangement for obtaining data from a body part using accurate reference values

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6346887B1 (en) 1999-09-14 2002-02-12 The United States Of America As Represented By The Secretary Of The Navy Eye activity monitor
AUPR872301A0 (en) 2001-11-08 2001-11-29 Sleep Diagnostics Pty Ltd Alertness monitor
US20110077548A1 (en) * 2004-04-01 2011-03-31 Torch William C Biosensors, communicators, and controllers monitoring eye movement and methods for using them
MX2007010513A (en) 2005-03-04 2008-01-16 Sleep Diagnostics Pty Ltd Measuring alertness.
US7815311B2 (en) 2005-08-11 2010-10-19 Sleep Diagnostics Pty., Ltd Alertness sensing spectacles
US8337404B2 (en) 2010-10-01 2012-12-25 Flint Hills Scientific, Llc Detecting, quantifying, and/or classifying seizures using multimodal data
US8345935B2 (en) * 2008-03-17 2013-01-01 International Business Machines Corporation Detecting behavioral deviations by measuring eye movements
JP5210773B2 (en) * 2008-09-16 2013-06-12 トヨタ自動車株式会社 Sleepiness determination apparatus and program
CN102316805B (en) * 2009-02-13 2014-01-29 丰田自动车株式会社 Physiological condition estimation device and vehicle control device
US9298985B2 (en) 2011-05-16 2016-03-29 Wesley W. O. Krueger Physiological biosensor system and method for controlling a vehicle or powered equipment
US10074024B2 (en) * 2010-06-07 2018-09-11 Affectiva, Inc. Mental state analysis using blink rate for vehicles
US9082011B2 (en) * 2012-03-28 2015-07-14 Texas State University—San Marcos Person identification using ocular biometrics with liveness detection
US9101312B2 (en) * 2012-04-18 2015-08-11 TBI Diagnostics LLC System for the physiological evaluation of brain function
US9207760B1 (en) * 2012-09-28 2015-12-08 Google Inc. Input detection
US8930269B2 (en) 2012-12-17 2015-01-06 State Farm Mutual Automobile Insurance Company System and method to adjust insurance rate based on real-time data about potential vehicle operator impairment
CA2904264A1 (en) * 2013-03-06 2014-09-12 Adam J. Simon Form factors for the multi-modal physiological assessment of brain health
WO2015030797A1 (en) 2013-08-30 2015-03-05 Intel Corporation Nausea and seizure detection, prediction, and mitigation for head-mounted displays
WO2015164807A1 (en) 2014-04-25 2015-10-29 Texas State University Detection of brain injury and subject state with eye movement biometrics
US20170119248A1 (en) 2014-06-20 2017-05-04 Sdip Holdings Pty Ltd Monitoring drowsiness
DE102015218306A1 (en) * 2015-09-23 2017-03-23 Robert Bosch Gmbh A method and apparatus for determining a drowsiness condition of a driver
US11972336B2 (en) * 2015-12-18 2024-04-30 Cognoa, Inc. Machine learning platform and system for data analysis
JP6514651B2 (en) * 2016-02-15 2019-05-15 ルネサスエレクトロニクス株式会社 Eye opening detection system, doze detection system, automatic shutter system, eye opening detection method and eye opening detection program

Also Published As

Publication number Publication date
EP3870046A4 (en) 2022-08-31
US20210345937A1 (en) 2021-11-11
EP3870048A1 (en) 2021-09-01
EP3870047A4 (en) 2022-08-10
WO2020082125A1 (en) 2020-04-30
US20210378568A1 (en) 2021-12-09
WO2020082123A1 (en) 2020-04-30
EP3870048A4 (en) 2022-12-07
EP3870046A1 (en) 2021-09-01
EP3870047A1 (en) 2021-09-01
WO2020082124A1 (en) 2020-04-30

Similar Documents

Publication Publication Date Title
US9908530B1 (en) Advanced vehicle operator intelligence system
Hossain et al. IOT based real-time drowsy driving detection system for the prevention of road accidents
CN107428245B (en) Device and method for predicting the level of alertness of a driver of a motor vehicle
Melnicuk et al. Towards hybrid driver state monitoring: Review, future perspectives and the role of consumer electronics
US11423335B2 (en) Data processing system with machine learning engine to provide output generating functions
US20230339475A1 (en) Facial recognition and monitoring device, system, and method
AU2020102426A4 (en) Collection of blepharometric data via a camera system
US20210386345A1 (en) Devices and processing systems configured to enable extended monitoring and analysis of subject neurological factors via blepharometric data collection
Vasudevan et al. Driver drowsiness monitoring by learning vehicle telemetry data
AU2021100635B4 (en) Identification of risk of degeneratve neurological conditions via blepharometric data collection
AU2021100641B4 (en) Extended period blepharometric monitoring across multiple data collection platforms
AU2021100637B4 (en) Blepharometric monitoring system for a vehicle which provides user-customised analysis
AU2021100643A4 (en) Ai-based technology configured to enable physiological event prediction based on blepharometric data
Li et al. Real-time driver drowsiness estimation by multi-source information fusion with Dempster–Shafer theory
CN107533811B (en) Information processing apparatus, information processing method, and program
Niwa et al. A wearable device for traffic safety-a study on estimating drowsiness with eyewear, JINS MEME
US20190149777A1 (en) System for recording a scene based on scene content
US20240153285A1 (en) Prediction of human subject state via hybrid approach including ai classification and blepharometric analysis, including driver monitoring systems
WO2022061403A1 (en) Devices and processing systems configured to enable assessment of a condition of a human subject based on sensorimotor gating of blinks
US20240112337A1 (en) Vehicular driver monitoring system with health monitoring
Yao et al. In-vehicle alertness monitoring for older adults
Agarkar et al. Driver Drowsiness Detection and Warning using Facial Features and Hand Gestures
Tejashwini et al. Drowsy Driving Detection System–IoT Perspective
Putra et al. Implementation of MTCNN Facial Feature Extraction on Sleepiness Scale Classification Using CNN
WO2023193038A1 (en) Interventional protocol in respect of human-operated machine based on processing of blepharometric data and/or other physiological parameters

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION