US20210153806A1 - Systems and methods for detecting alertness of an occupant of a vehicle - Google Patents
- Publication number
- US20210153806A1 (application US 16/696,066)
- Authority
- US
- United States
- Prior art keywords
- driver
- vehicle
- occupant
- lagophthalmos
- susceptible
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4806—Sleep evaluation
- A61B5/4809—Sleep detection, i.e. determining whether a subject is asleep or not
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/63—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1103—Detecting eye twinkling
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/18—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state for vehicle drivers or machine operators
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A61B5/681—Wristwatch-type devices
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6887—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
- A61B5/6893—Cars
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6887—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
- A61B5/6898—Portable consumer electronic devices, e.g. music players, telephones, tablet computers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7271—Specific aspects of physiological measurement analysis
- A61B5/7275—Determining trends in physiological measurement data; Predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7271—Specific aspects of physiological measurement analysis
- A61B5/7282—Event detection, e.g. detecting unique waveforms indicative of a medical condition
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
- B60W40/09—Driving style or behaviour
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/0202—Child monitoring systems using a transmitter-receiver system carried by the parent and the child
- G08B21/0205—Specific application combined with child monitoring using a transmitter-receiver system
- G08B21/0211—Combination with medical sensor, e.g. for measuring heart rate, temperature
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/06—Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H10/00—ICT specially adapted for the handling or processing of patient-related medical or healthcare data
- G16H10/60—ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/67—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2503/00—Evaluating a particular growth phase or type of persons or animals
- A61B2503/20—Workers
- A61B2503/22—Motor vehicles operators, e.g. drivers, pilots, captains
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/021—Measuring pressure in heart or blood vessels
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/024—Detecting, measuring or recording pulse rate or heart rate
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/08—Detecting, measuring or recording devices for evaluating the respiratory organs
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/08—Detecting, measuring or recording devices for evaluating the respiratory organs
- A61B5/0816—Measuring devices for examining respiratory frequency
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
- B60W2040/0818—Inactivity or incapacity of driver
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
- B60W2040/0818—Inactivity or incapacity of driver
- B60W2040/0827—Inactivity or incapacity of driver due to sleepiness
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
- B60W2040/0872—Driver physiology
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/043—Identity of occupants
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/221—Physiology, e.g. weight, heartbeat, health or special needs
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/225—Direction of gaze
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/10—Historical data
Description
- This disclosure generally relates to vehicles, and more particularly relates to systems and methods for detecting alertness of a driver (or a passenger) of a vehicle.
- A major concern in traffic safety is driver behavior, particularly distracted driving and drowsiness.
- Vehicle manufacturers have addressed such driver behaviors by offering driving monitoring systems (DSMs).
- A typical driving monitoring system monitors various physical characteristics of a driver in order to continuously assess his/her alertness.
- One among the various physical characteristics monitored in order to identify drowsiness is the condition of the driver's eyes.
- The eyelids of a sleepy driver tend to be droopy, partially closed, or fully closed.
- Upon detecting drowsiness, the driving monitoring system may provide an audible alarm and/or provide a vibration in the steering wheel of the vehicle so as to awaken the driver.
- One exception arises from a medical condition known as lagophthalmos, in which a person is unable to close the eyelids fully.
- A person suffering from lagophthalmos typically falls asleep with eyes wide open, both during the day when in a sitting position and at night when in bed (nocturnal lagophthalmos).
- Some people may be born with lagophthalmos, and others may be afflicted with it due to injury or due to factors such as Bell's palsy, Graves' disease, paralytic stroke, or a tumor.
- A conventional driving monitoring system that relies largely on monitoring a driver's eyes may mistakenly diagnose a driver suffering from lagophthalmos as being fully alert even though the driver may be asleep. It is therefore desirable to provide solutions that monitor driver alertness while taking into consideration that a driver may have lagophthalmos.
- FIG. 1 illustrates an exemplary embodiment of a driver alertness detection system in accordance with the disclosure for detecting an alertness state of a driver.
- FIG. 2 illustrates an exemplary embodiment of the driver alertness detection system in accordance with the disclosure for detecting an alertness state of a driver afflicted with lagophthalmos.
- FIGS. 3A-3B show a flowchart of a method of operation of a driver alertness detection system in accordance with the disclosure.
- FIG. 4 shows some exemplary components that may be included in a driver alertness detection system in accordance with the disclosure.
- A driver alertness detection system is used to determine whether a driver of a vehicle is susceptible to lagophthalmos.
- If the driver is susceptible, the driver alertness detection system may evaluate an alertness state of the driver by disregarding an eyelid status of the driver and instead monitoring one or more biometrics of the driver, such as, for example, a heart rate or a breathing pattern.
- Alternatively, the driver alertness detection system may evaluate the alertness state of the driver by placing a higher priority on the biometrics of the driver than on the eyelid status of the driver.
- If the driver is not susceptible to lagophthalmos, the driver alertness detection system may evaluate the alertness state of the driver by using a standard procedure that places a higher priority on the eyelid status of the driver than on the biometrics of the driver.
- The word "driver" as used herein may be equally applicable to a passenger of a vehicle or, in some cases, to a person who is requesting a ride in a ride-share vehicle. Words such as "person" or "occupant" may be used in some cases to indicate a driver, a passenger, or a potential passenger of a vehicle.
- The word "vehicle" as used in this disclosure can pertain to any of various types of vehicles, such as cars, vans, sport utility vehicles, trucks, electric vehicles, gasoline vehicles, hybrid vehicles, and autonomous vehicles.
- The word "biometrics" as used in this disclosure generally refers to various parameters of a human body that may be used to identify a physical condition, such as a sleeping condition or an alert condition.
- The word "information" as used herein can pertain to data, signals, communications (for example, messages), and other such items that can be processed by processing circuitry for carrying out various operations. Words such as "having," "suffering," "condition," and "afflicted" may be used interchangeably and generally denote that a physical condition such as lagophthalmos is associated with the individual being referred to.
- The word "standard" as used herein generally refers to an action that is known in popular practice.
- FIG. 1 illustrates an exemplary embodiment of a driver alertness detection system 100 in accordance with the disclosure for detecting a driver 110 who is asleep at the wheel of a vehicle.
- The driver alertness detection system 100 may be implemented in a variety of ways and can include various types of sensors.
- The driver alertness detection system 100 can include an imaging apparatus 105, a facial recognition system 120, and one or more biometric sensors (such as an exemplary biometric sensor 115).
- The imaging apparatus 105 can be mounted on any of various parts of the vehicle having a field of view that encompasses at least a portion of a face of the driver 110, particularly the eyes of the driver 110.
- The imaging apparatus 105 in this example illustration is mounted upon a rear-view mirror of the vehicle and is arranged to capture one or more images of the eyes of the driver 110.
- The imaging apparatus 105 can be a video camera that captures real-time video of at least the eyes of the driver 110.
- Alternatively, the imaging apparatus 105 can be a digital camera that captures digital images of at least the eyes of the driver 110.
- The digital images may be captured on a repetitive basis, an intermittent basis, and/or a random basis.
- The real-time video and/or digital images are conveyed to a processing circuit (not shown) of the driver alertness detection system 100 for processing in order to detect an eyelid status of the driver 110.
- An eyelid status such as, for example, an open condition, a drooping condition, a partially closed condition, or a fully closed condition, can provide an indication of an alertness state of the driver 110 .
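As a concrete illustration of this step, the sketch below maps a single eye-openness ratio to the eyelid statuses named above. This is a minimal sketch under assumed conventions: the ratio scale (0.0 = fully closed, 1.0 = wide open) and the threshold values are hypothetical and do not come from the disclosure.

```python
def classify_eyelid_status(openness: float) -> str:
    """Map an assumed eye-openness ratio (0.0 = fully closed,
    1.0 = wide open) to one of the eyelid statuses named in the
    disclosure; the thresholds are illustrative only."""
    if openness >= 0.8:
        return "open"
    if openness >= 0.5:
        return "drooping"
    if openness > 0.1:
        return "partially closed"
    return "fully closed"

print(classify_eyelid_status(0.6))  # prints "drooping"
```

In practice, such a ratio would be derived by image processing of the frames captured by the imaging apparatus 105, and the thresholds would be tuned per driver.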
- The facial recognition system 120 can be mounted on any of various parts of the vehicle having a field of view that encompasses a face of the driver 110.
- The facial recognition system 120 in this example illustration is mounted upon an instrument panel of the vehicle and is arranged to capture images of the face of the driver 110 (digital images and/or video). The images may be conveyed to the processing circuit of the driver alertness detection system 100 for processing in order to analyze various facial features of the driver 110.
- Facial features such as a drooping mouth, a slack jaw, and/or an angular orientation can provide an indication of an alertness state of the driver 110 .
- The biometric sensor 115 can include one or more of various types of devices that may be mounted on various parts of the vehicle and used to detect various types of physical conditions of the driver 110 for purposes of evaluating an alertness state of the driver 110.
- The biometric sensor 115 in this example illustration can be a pressure sensor that senses a placement of the hands of the driver 110 upon the steering wheel and/or an intensity of a hand grip of the driver 110 upon the steering wheel.
- Alternatively, the biometric sensor 115 can be a body sensor that is provided in a driver seat occupied by the driver 110.
- The body sensor may measure various biometric parameters of the driver 110, such as, for example, blood pressure, heart rate, brainwaves, and breathing pattern.
- The body sensor may incorporate various types of technologies, such as, for example, infra-red technology, green-light technology, radio-frequency (RF) technology, and/or pressure transducer technology.
- Infra-red technology may be used, for example, for temperature measurements and distance measurements.
- Green-light technology is used in fitness monitoring devices such as a Fitbit® activity tracker, where a pair of LEDs shines green light that measures minute changes in the color characteristics of blood flowing through a human body. The color characteristics typically vary in correspondence with the blood-pumping action of the heart and can be used to determine parameters such as pulse rate and heartbeat.
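The pulse-rate determination described above can be sketched as a simple peak-counting routine over green-light (photoplethysmography) samples. The 50 Hz sampling rate, the synthetic waveform, and the mean-crossing peak rule are assumptions for illustration; a real sensor pipeline would add filtering and calibration.

```python
import math

def pulse_rate_bpm(samples, sample_rate_hz):
    """Estimate beats per minute by counting local maxima that rise
    above the signal mean (each maximum is taken as one heartbeat)."""
    mean = sum(samples) / len(samples)
    peaks = 0
    for i in range(1, len(samples) - 1):
        if (samples[i] > mean
                and samples[i] > samples[i - 1]
                and samples[i] >= samples[i + 1]):
            peaks += 1
    duration_s = len(samples) / sample_rate_hz
    return peaks * 60.0 / duration_s

# Synthetic 10-second PPG-like signal sampled at 50 Hz with a
# 1.2 Hz beat (i.e., 72 beats per minute).
sr = 50
signal = [math.sin(2 * math.pi * 1.2 * t / sr) for t in range(10 * sr)]
print(round(pulse_rate_bpm(signal, sr)))  # prints 72
```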
- The processing circuit of the driver alertness detection system 100 may process the information provided by the imaging apparatus 105, the facial recognition system 120, and/or the biometric sensor 115 over a period of time (which can be referred to as a sampling period), and conclude that there is a high probability that the driver 110 is asleep at the wheel.
- Processing the information provided by the various sources such as the imaging apparatus 105, the facial recognition system 120, and/or the biometric sensor 115 may include applying various levels of priorities.
- The application of priorities may be carried out by applying weights to the information received from the various sources. Typically, continuous eyelid closure over an extended period of time is a good indicator that a person is asleep.
- An intermittent eyelid closure may indicate that a person is drowsy.
- Heart rate and breathing patterns can also indicate an alertness state of a person.
- However, biometric indications of these states can be prone to certain ambiguities among different individuals.
- For example, some physical parameters (such as heart rate, breathing pattern, and/or blood pressure) of an athletic individual may differ from those of a sedentary individual.
- Consequently, the processing circuit of the driver alertness detection system 100 may apply a greater weight to the eyelid status of the driver 110 than to the biometric measurements.
- For example, the driver alertness detection system 100 may apply a numerical weight of 8 (out of 10) to signals received from the imaging apparatus 105 and/or the facial recognition system 120, and use this weighting to evaluate the eyelid status of the driver 110.
- The driver alertness detection system 100 may apply a lower weight (5, for example) to the signals received from the biometric sensor 115 for evaluating other physical conditions of the driver 110 to determine the alertness state of the driver 110.
- Such a weighting scheme, where the eyelid status is used as the primary indicator of alertness, may be effective when the driver 110 does not suffer from lagophthalmos. However, this approach may be ineffective when the driver 110 has lagophthalmos, because the eyelids of the driver 110 may remain open even though the driver 110 has fallen asleep at the wheel.
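The weighting scheme above can be sketched as follows, assuming each source reports a sleepiness score between 0 (fully alert) and 1 (clearly asleep). The weights of 8 and 5 come from the example in the text; the score convention and the 0.5 decision threshold are assumptions.

```python
def weighted_sleep_score(eyelid_score, biometric_score,
                         eyelid_weight=8, biometric_weight=5):
    """Combine the eyelid-based and biometric indicators into a single
    weighted sleepiness estimate in [0, 1]."""
    total = eyelid_weight + biometric_weight
    return (eyelid_weight * eyelid_score
            + biometric_weight * biometric_score) / total

def is_asleep(eyelid_score, biometric_score):
    return weighted_sleep_score(eyelid_score, biometric_score) > 0.5

# Failure mode for lagophthalmos: the eyes stay wide open
# (eyelid_score = 0) even though biometrics strongly indicate sleep
# (biometric_score = 1), so the standard weighting misses the sleep state.
print(is_asleep(0.0, 1.0))  # prints False
```

Here the combined score is 5/13 ≈ 0.38, below the threshold, mirroring the ineffectiveness described above.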
- FIG. 2 illustrates an exemplary embodiment of the driver alertness detection system 100 configured to detect an alertness state of an occupant of a vehicle who is afflicted with lagophthalmos.
- The occupant of the vehicle can be either a driver 210 or a passenger 215 of the vehicle. Though the following description is directed at the driver 210, it must be understood that the description is equally applicable to any occupant of a vehicle.
- For example, the vehicle can be an autonomous vehicle in which all the occupants are passengers.
- The driver alertness detection system 100 may process signals received from the biometric sensor 115 and conclude that there is a high probability that the driver 210 is in a sleeping state or a drowsy state. However, due to the lagophthalmos condition of the driver 210 and the resulting wide-open eyes, the signals received from the imaging apparatus 105 and/or the facial recognition system 120 may provide an eyelid status indicating that the driver 210 is awake. Upon encountering such a conflict, the driver alertness detection system 100 can initiate a procedure to determine whether the driver 210 suffers from lagophthalmos. In one exemplary implementation, the driver alertness detection system 100 may fetch medical records of the driver 210 to do so. In another exemplary implementation, the driver alertness detection system 100 may execute a learning procedure to determine whether the driver 210 suffers from lagophthalmos.
- In one implementation, the driver alertness detection system 100 evaluates the alertness state of the driver 210 by placing a higher priority on signals received from the biometric sensor 115 than on signals received from devices, such as the imaging apparatus 105 and/or the facial recognition system 120, that are used to determine an eyelid status of the driver 210.
- For example, the driver alertness detection system 100 may apply a numerical weight of 8 (out of 10) to signals received from the biometric sensor 115 and a lower weight (5, for example) to the signals received from the imaging apparatus 105 and/or the facial recognition system 120 when evaluating the alertness state of the driver 210.
- In another implementation, the driver alertness detection system 100 evaluates the alertness state of the driver 210 by disregarding signals received from devices such as the imaging apparatus 105 and/or the facial recognition system 120, and processing only signals received from the biometric sensor 115.
- For example, the driver alertness detection system 100 may evaluate the alertness state of the driver 210 based on a heart rate or a breathing pattern of the driver 210 and disregard an eyelid status of the driver 210 in view of the driver 210 having lagophthalmos.
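These two lagophthalmos-aware implementations can be sketched together as follows. The score convention (1.0 indicating a strong sleep indication) and the specific numbers are illustrative assumptions.

```python
def sleep_score(eyelid_score, biometric_score, has_lagophthalmos):
    """Return a sleepiness estimate in [0, 1], prioritizing sources
    differently depending on the lagophthalmos determination."""
    if has_lagophthalmos:
        # One implementation: disregard the eyelid status entirely and
        # rely only on biometrics such as heart rate or breathing.
        # (The other implementation would instead swap the weights,
        # e.g. (5 * eyelid_score + 8 * biometric_score) / 13.)
        return biometric_score
    # Standard procedure: the eyelid status carries the higher weight.
    return (8 * eyelid_score + 5 * biometric_score) / 13

# A sleeping driver with lagophthalmos (eyes wide open):
print(sleep_score(0.0, 1.0, has_lagophthalmos=True))   # prints 1.0
print(round(sleep_score(0.0, 1.0, has_lagophthalmos=False), 2))  # prints 0.38
```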
- FIGS. 3A-3B show a flowchart 300 of a method of operation of a driver alertness detection system in accordance with the disclosure.
- the flowchart 300 illustrates a sequence of operations that can be implemented in hardware, software, or a combination thereof.
- The operations represent computer-executable instructions stored on one or more non-transitory computer-readable media (such as a memory 410 that is described below) that, when executed by one or more processors (such as a processor 405 that is described below), perform the recited operations.
- Generally, computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular abstract data types.
- At block 305, a determination may be made by a driver alertness detection system whether an eyelid status of a driver of a vehicle indicates a sleepy or drowsy condition.
- a processor may process images received from the imaging apparatus 105 and/or the facial recognition system 120 and conclude that the driver is asleep, based on detecting that the eyelids of the driver have remained closed for a period of time (for over a minute, for example).
- the processor may process images received from the imaging apparatus 105 and/or the facial recognition system 120 and conclude that the driver is drowsy, based on detecting that the eyelids of the driver close occasionally or flutter intermittently over a period of time (for 3 minutes, for example).
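As a rough illustration of the eyelid-status logic described above, a classifier might operate on the durations of eyelid-closure episodes detected in the camera images over a recent monitoring window. The function below is a hypothetical sketch; the thresholds (over a minute of closure for sleep, several closures or flutters within a few minutes for drowsiness) mirror the examples given, but the detection pipeline itself is not specified in the disclosure.

```python
def classify_eyelid_status(closure_episodes,
                           asleep_threshold_s: float = 60.0,
                           drowsy_min_episodes: int = 3) -> str:
    """Classify a driver from eyelid-closure episode durations (seconds)
    observed within the most recent monitoring window.

    A single closure longer than the asleep threshold is treated as
    'asleep'; several shorter closures or flutters within the window are
    treated as 'drowsy'; otherwise the driver appears 'alert'.
    """
    if any(d >= asleep_threshold_s for d in closure_episodes):
        return "asleep"
    if len(closure_episodes) >= drowsy_min_episodes:
        return "drowsy"
    return "alert"
```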
- the processor may process signals from a biometric sensor that monitors breathing activity of the driver and detect a sleeping breathing pattern.
- the processor may process signals from a biometric sensor that monitors brain activity of the driver and detect a sleeping brainwave pattern.
- the processor draws a conclusion that the driver is in a sleep state.
- remedial actions may be taken.
- the driver alertness detection system 100 may communicate with an infotainment system in the vehicle to emit an audible signal (beeps, tones, music, etc.) for waking the driver.
- when the vehicle is an autonomous vehicle, further action may be taken if the driver does not respond to the audible signal.
- the autonomous vehicle may slow down and come to a halt on the side of a road if the driver does not wake up after repeated attempts.
- the processor may conclude that the driver is either drowsy or may have an eye problem.
- the driver alertness detection system 100 may display, on a display screen of the infotainment system in the vehicle, a message that recommends that the driver take a break or drink coffee.
- the processor may advise the driver to seek medical assistance to treat an eye condition such as redness, dryness, or allergies.
- the processor may detect a sleeping breathing pattern and/or a sleeping brainwave pattern that provides an indication that the driver is either sleeping or drowsy. If, instead, the biometric sensors indicate that the driver is not in a sleeping/drowsy condition, thereby confirming the eyelid status (at block 305 ) that is indicative of the driver not being asleep or drowsy, no further action is taken by the driver alertness detection system 100 (at block 340 ).
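The cross-check between the eyelid status (block 305) and the biometric sensors can be summarized as a small decision function. This is an interpretive sketch of the flowchart's branches with assumed action strings; the actual remedial actions are those described above (audible alarms, break recommendations, and so on).

```python
def cross_check_action(eyelid_sleepy: bool, biometric_sleepy: bool) -> str:
    """Map the two monitoring channels to a remedial action.

    - Both channels indicate sleep          -> wake the driver
    - Eyelids sleepy, biometrics alert      -> suspect drowsiness or an eye problem
    - Eyelids awake, biometrics sleepy      -> possible lagophthalmos; trust biometrics
    - Neither channel indicates sleep       -> no action (block 340)
    """
    if eyelid_sleepy and biometric_sleepy:
        return "emit audible alarm"
    if eyelid_sleepy and not biometric_sleepy:
        return "recommend break or medical check"
    if biometric_sleepy:
        return "treat as asleep (possible lagophthalmos)"
    return "no action"
```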
- the operations that are carried out to make the determination may also be applicable to a passenger of the vehicle or a person requesting a ride in a ride share vehicle.
- words such as “person” or “occupant” may be used below when subject matter is applicable to the driver and the passenger.
- Some actions carried out with respect to a passenger may differ from those carried out with respect to a driver, because the passenger is not directly involved with driving operations of the vehicle. In these instances, the word “passenger” or “driver” may be used to provide clarity.
- a person may request a ride in a ride share vehicle, which can be either an autonomous vehicle or a driver-operated vehicle.
- the actions described below may be carried out prior to the person becoming an occupant of the ride share vehicle (by a ride-share service provider, for example).
- a first scenario pertains to a person having lagophthalmos and being aware that he/she has lagophthalmos.
- a second scenario pertains to a person having lagophthalmos and being unaware that he/she has lagophthalmos.
- a third scenario pertains to a person not having lagophthalmos and being aware that he/she does not have lagophthalmos.
- a fourth scenario pertains to a person not having lagophthalmos and being unaware that he/she does not have lagophthalmos.
- a re-prioritized alertness monitoring procedure may be executed.
- the re-prioritized alertness monitoring procedure refers to a modification of a standard alertness monitoring procedure.
- the standard alertness monitoring procedure, which is applicable to an occupant of the vehicle who does not have lagophthalmos, involves applying a higher priority (weighting) to the eyelid status of the person than to biometric factors such as a breathing pattern or a heart rate. Thus, the eyelid status of the occupant becomes the primary criterion for detecting alertness.
- the re-prioritized alertness monitoring procedure involves applying a higher priority (weighting) to biometric factors rather than to the eyelid status of the occupant for detecting alertness, because the eyelid status of a person suffering from lagophthalmos may be a poor indicator of alertness.
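Selecting between the standard and re-prioritized procedures might look like the following sketch, which returns an (eyelid, biometric) weight pair. The numeric weights here are assumptions for illustration; the disclosure only specifies which signal takes priority in each case.

```python
def monitoring_weights(susceptible_to_lagophthalmos: bool) -> tuple:
    """Return (eyelid_weight, biometric_weight) for alertness evaluation.

    Standard procedure: the eyelid status is the primary criterion.
    Re-prioritized procedure: biometric factors take precedence, because
    the eyelid status of a person with lagophthalmos is a poor indicator
    of alertness. Weight values are illustrative assumptions.
    """
    if susceptible_to_lagophthalmos:
        return (0.3, 0.7)  # re-prioritized: biometrics dominate
    return (0.7, 0.3)      # standard: eyelid status dominates
```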
- the processor concludes that a lagophthalmos condition of the occupant of the vehicle is unknown.
- a lagophthalmos status of the occupant of the vehicle may be evaluated.
- the evaluation may be carried out in various ways and may also be applicable in the case of a person requesting a ride in a ride share vehicle.
- a processor that can be a part of the driver alertness detection system 100 obtains a medical history of a person from one or more of various sources.
- the medical history of the person may be obtained from a server computer that is configured to wirelessly communicate with the driver alertness detection system.
- the medical history of the person may be obtained from a database that is a part of the driver alertness detection system 100 .
- the medical history of the person may be obtained from a device such as a fitness bracelet worn by the person.
- the processor can generate a medical profile of the occupant of a vehicle based on monitoring and analyzing eyelid status and biometric data of the occupant over a period of time.
- the period of time can, for example, correspond to multiple trips performed by the occupant in the vehicle.
- the medical profile of the occupant may lead to a finding that the occupant suffers from lagophthalmos.
- the finding may be based on the biometric data (such as the heart rate and the breathing pattern of the occupant) indicating a sleeping profile while the eyelid status indicates an awake profile.
- the medical profile of the occupant may lead to a finding that the occupant does not suffer from lagophthalmos when both the eyelid status and the biometric data indicate a sleeping profile.
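The profile-building logic described above (biometric data indicating a sleeping profile while the eyelid status indicates an awake profile, observed over multiple trips) can be sketched as a simple inference function. All names and the minimum-trip threshold are assumptions; the disclosure does not specify how many observations are required.

```python
def infer_lagophthalmos(observations, min_trips: int = 5) -> str:
    """Infer a lagophthalmos status from per-trip observations.

    Each observation pairs an eyelid-derived profile with a
    biometric-derived profile, each either 'sleeping' or 'awake'.
    A consistent mismatch (biometrics say sleeping while the eyelids say
    awake) suggests lagophthalmos; consistent agreement suggests no
    lagophthalmos; anything else is indeterminate.
    """
    if len(observations) < min_trips:
        return "unknown"
    mismatches = sum(1 for eyelid, bio in observations
                     if bio == "sleeping" and eyelid == "awake")
    agreements = sum(1 for eyelid, bio in observations
                     if bio == "sleeping" and eyelid == "sleeping")
    sleeping_trips = mismatches + agreements
    if sleeping_trips == 0:
        return "indeterminate"  # occupant never slept while monitored
    if mismatches == sleeping_trips:
        return "lagophthalmos"
    if agreements == sleeping_trips:
        return "no lagophthalmos"
    return "indeterminate"
```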
- the passenger may be provided an eye mask by a driver of the vehicle, or may use an eye mask that is provided in a receptacle of the vehicle.
- the eye mask may be provided in a ride share vehicle, for example, prior to a person entering the vehicle.
- Appropriate messages may be provided via the infotainment system of the vehicle to inform the person about the eye mask.
- certain actions may be carried out to assist the person suffering from lagophthalmos. These actions may include automatically increasing an amount of window tinting, or repositioning a seat to provide more comfort and/or privacy to the person.
- a person who is seeking a ride in a ride-share vehicle and is aware that he/she has lagophthalmos may request a specific type of vehicle or may request privacy with respect to his/her medical condition.
- the lagophthalmos status of the person is indeterminate. This may occur, for example, if the person has an eye defect and/or other physical characteristics that render the monitoring and analyzing of the eyelid status and the biometric data of the person inconclusive.
- the driver alertness detection system may pursue alternative strategies to resolve the indeterminate lagophthalmos status of the person. For example, the driver alertness detection system may request input from the driver via the infotainment system of the vehicle and/or urge the driver to periodically confirm his/her alertness by performing various tasks when seated in the vehicle. Some exemplary tasks can include touching certain components in the vehicle, operating certain components in the vehicle, or uttering certain words into a microphone coupled to the infotainment system.
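A hypothetical sketch of the challenge-response strategy described above: challenges are issued periodically, and the occupant is judged unresponsive if too many go unanswered. The data shapes and the `max_missed` threshold are illustrative assumptions, not values from the disclosure.

```python
def evaluate_challenge_responses(challenges, responses,
                                 max_missed: int = 1) -> str:
    """Evaluate periodic alertness challenges when the lagophthalmos
    status is indeterminate.

    `challenges` is the list of prompts issued (e.g. 'touch the screen',
    'say the word shown'); `responses` maps a challenge index to True
    when the occupant completed it in time. The occupant is considered
    alert while no more than `max_missed` challenges go unanswered.
    """
    missed = sum(1 for i, _ in enumerate(challenges)
                 if not responses.get(i, False))
    return "alert" if missed <= max_missed else "unresponsive"
```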
- FIG. 4 shows some exemplary components that may be included in the driver alertness detection system 100 in accordance with the disclosure.
- the driver alertness detection system 100 may include a processor 405 , an input/output (I/O) interface 420 , an eyelid monitoring system 425 , one or more biometric sensors such as biometric sensor (1) 430 , biometric sensor (2) 435 , and biometric sensor (n) 440 (n ≥ 1), a communications interface 445 , and a memory 410 .
- the various components of the driver alertness detection system 100 can be communicatively coupled to each other via a bus 415 .
- the bus 415 can be implemented using one or more of various wired and/or wireless technologies.
- the bus can be a vehicle bus that uses a controller area network (CAN) bus protocol, a Media Oriented Systems Transport (MOST) bus protocol, and/or a CAN flexible data (CAN-FD) bus protocol.
- Some or all portions of the bus may also be implemented using wireless technologies such as Bluetooth®, Zigbee®, or near-field-communications (NFC), cellular, Wi-Fi, Wi-Fi direct, machine-to-machine communication, and/or man-to-machine communication to accommodate communications between the driver alertness detection system 100 and devices such as, for example, an infotainment system 450 , a fitness bracelet 460 , and/or a personal communication device 465 .
- the memory 410 , which is one example of a non-transitory computer-readable medium, may be used to store an operating system (OS) 413 and one or more code modules such as a driver alertness detection system module 411 .
- the code modules can be provided in the form of computer-executable instructions that are executed by the processor 405 for performing various operations in accordance with the disclosure.
- the processor 405 can execute the driver alertness detection system module 411 to perform various actions such as the ones described above with reference to FIGS. 3A-3B .
- the memory 410 can also include a database 412 that may be used to store information such as, for example, a medical history, a medical profile, physical attributes, medical conditions, and personal preferences of a driver of a vehicle.
- the database 412 may also be used to store information, such as medical information provided by the driver via a graphical user interface (GUI) 451 of the infotainment system 450 .
- the input/output (I/O) interface 420 can include various types of wired and/or wireless circuitry to allow the driver alertness detection system 100 to communicate with the infotainment system 450 and a vehicle computer 455 .
- when the bus 415 is a vehicle bus, the input/output (I/O) interface 420 may be omitted and the infotainment system 450 and/or the vehicle computer 455 may be coupled directly to the bus 415 .
- the vehicle computer 455 may perform various functions of the vehicle such as controlling engine operations (fuel injection, speed control, emissions control, braking, etc.), managing climate controls (air conditioning, heating etc.), activating airbags, and issuing warnings (check engine light, bulb failure, low tire pressure, vehicle in blind spot, etc.).
- the eyelid monitoring system 425 can include one or more of various types of devices such as the imaging apparatus 105 and the facial recognition system 120 described above.
- the biometric sensors can include, for example, a pressure sensor and/or a body sensor.
- the pressure sensor can be used to detect placement of the hands of a driver upon a steering wheel of a vehicle and/or an intensity of a hand grip upon the steering wheel.
- the body sensor may be used to measure various biometric parameters of a driver, such as, for example, blood pressure, heart rate, brainwaves, and breathing pattern.
- the communications interface 445 provides for wireless communications between the driver alertness detection system 100 and various devices such as the fitness bracelet 460 , the personal communications device 465 , and a server computer 475 (via a network 470 ).
- the fitness bracelet 460 , which can include devices such as a Fitbit® activity tracker or an Apple® Watch, can provide information that may be used by the driver alertness detection system 100 to determine an alertness condition of a driver.
- the fitness bracelet 460 may communicate with the driver alertness detection system 100 via a wireless communication link 446 that can incorporate technologies such as Bluetooth®, Zigbee®, or near-field-communications (NFC), cellular, Wi-Fi, Wi-Fi direct, and machine-to-machine communication.
- the information provided by the fitness bracelet 460 may, for example, indicate to the driver alertness detection system 100 that the driver did not have adequate sleep the previous night, or that the driver has had excessive physical exertion over the past few hours prior to getting into the vehicle (walking up a flight of stairs, bending down to pick up multiple loads, high heart rate associated with exertion, etc.). Such actions may lead to the driver having a sleeping condition or drowsiness when at the wheel of the vehicle.
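The fitness-tracker signals described above (inadequate sleep the previous night, heavy recent exertion) could feed a coarse pre-drive fatigue estimate such as the following sketch. The thresholds and risk labels are assumptions for illustration, not values from the disclosure.

```python
def predrive_fatigue_risk(sleep_hours: float, exertion_minutes: float,
                          min_sleep_h: float = 6.0,
                          max_exertion_min: float = 90.0) -> str:
    """Rough pre-drive fatigue estimate from fitness-tracker data.

    One risk point for inadequate sleep the previous night, one for
    excessive physical exertion in the hours before the trip; the
    combined count maps to a coarse risk label.
    """
    risk = 0
    if sleep_hours < min_sleep_h:
        risk += 1
    if exertion_minutes > max_exertion_min:
        risk += 1
    return ["low", "elevated", "high"][risk]
```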
- the personal communication device 465 , which can be a smartphone, for example, can provide information that may be used by the driver alertness detection system 100 to determine an alertness condition of a driver.
- a fitness application executed in the smartphone may provide information to the driver alertness detection system 100 that the driver has undergone excessive physical exertion over the past few hours prior to getting into the vehicle.
- a global positioning system (GPS) application executed in the smartphone may provide walking information to indicate that the driver has walked over a certain distance prior to getting into the vehicle.
- the personal communication device 465 may communicate with the driver alertness detection system 100 via a wireless communication link 447 that can incorporate technologies such as Bluetooth®, Zigbee®, or near-field-communications (NFC), cellular, Wi-Fi, Wi-Fi direct, machine-to-machine communication, and/or man-to-machine communication.
- the server computer 475 may communicate with the driver alertness detection system 100 via the network 470 .
- Information provided by the server computer 475 to the driver alertness detection system 100 can include a medical history of the driver, and/or information pertaining to websites visited by the driver.
- the information pertaining to the websites may indicate that the driver has communicated with help groups, support groups, social media sites, and medical-related sites for various reasons associated with lagophthalmos.
- the network 470 may include any one, or a combination of networks, such as a local area network (LAN), a wide area network (WAN), a telephone network, a cellular network, a cable network, a wireless network, and/or private/public networks such as the Internet.
- the network 470 may support communication technologies such as TCP/IP, Bluetooth, cellular, near-field communication (NFC), Wi-Fi, Wi-Fi direct, machine-to-machine communication, and/or man-to-machine communication.
- the wireless communication link 448 that supports communications between the driver alertness detection system 100 and a communication device such as a router, for example, that may be included in the network 470 can be implemented using various types of wireless technologies such as Bluetooth®, Zigbee®, or near-field-communications (NFC), cellular, Wi-Fi, Wi-Fi direct, machine-to-machine communication, man-to-machine communication, and/or a vehicle-to-everything (V2X) communication.
- the communication link 449 that supports communications between the server computer 475 and the network 470 may incorporate one or more of various types of wired and/or wireless technologies used for communications.
- Implementations of the systems, apparatuses, devices, and methods disclosed herein may comprise or utilize one or more devices that include hardware, such as, for example, one or more processors and system memory, as discussed herein.
- An implementation of the devices, systems, and methods disclosed herein may communicate over a computer network.
- a “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or any combination of hardwired or wireless) to a computer, the computer properly views the connection as a transmission medium.
- Transmission media can include a network and/or data links, which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of non-transitory computer-readable media.
- Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause the processor to perform a certain function or group of functions.
- the computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code.
- a memory device such as the memory 410 can include any one memory element or a combination of volatile memory elements (e.g., random access memory (RAM, such as DRAM, SRAM, SDRAM, etc.)) and non-volatile memory elements (e.g., ROM, hard drive, tape, CDROM, etc.).
- the memory device may incorporate electronic, magnetic, optical, and/or other types of storage media.
- a “non-transitory computer-readable medium” can be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device.
- the computer-readable medium would include the following: a portable computer diskette (magnetic), a random-access memory (RAM) (electronic), a read-only memory (ROM) (electronic), an erasable programmable read-only memory (EPROM, EEPROM, or Flash memory) (electronic), and a portable compact disc read-only memory (CD ROM) (optical).
- the computer-readable medium could even be paper or another suitable medium upon which the program is printed, since the program can be electronically captured, for instance, via optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
- the present disclosure may be practiced in network computing environments with many types of computer system configurations, including in-dash vehicle computers, personal computers, desktop computers, laptop computers, message processors, handheld devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, tablets, pagers, routers, switches, various storage devices, and the like.
- the disclosure may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by any combination of hardwired and wireless data links) through a network, both perform tasks.
- program modules may be located in both the local and remote memory storage devices.
- At least some embodiments of the present disclosure have been directed to computer program products comprising such logic (e.g., in the form of software) stored on any computer-usable medium.
- Such software when executed in one or more data processing devices, causes a device to operate as described herein.
- any or all of the aforementioned alternate implementations may be used in any combination desired to form additional hybrid implementations of the present disclosure.
- any of the functionality described with respect to a particular device or component may be performed by another device or component.
- embodiments of the disclosure may relate to numerous other device characteristics.
- embodiments have been described in language specific to structural features and/or methodological acts, it is to be understood that the disclosure is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as illustrative forms of implementing the embodiments.
Description
- This disclosure generally relates to vehicles, and more particularly relates to systems and methods for detecting alertness of a driver (or a passenger) of a vehicle.
- A major concern in traffic safety is driver behavior, particularly distracted driving and drowsiness. Vehicle manufacturers have addressed such driver behaviors by offering driver monitoring systems (DMSs). A typical driver monitoring system monitors various physical characteristics of a driver in order to continuously assess his/her alertness. One of the monitored characteristics is the condition of the driver's eyes, which is used to identify drowsiness. Typically, the eyelids of a sleepy driver tend to be droopy, partially closed, or fully closed. When such a condition is detected, the driver monitoring system may provide an audible alarm and/or provide a vibration in the steering wheel of the vehicle so as to awaken the driver.
- While this procedure may be effective in many cases, some drivers may suffer from a medical condition known as lagophthalmos. A person suffering from lagophthalmos typically falls asleep with eyes wide open, both during the day when in a sitting position and at night when in bed (nocturnal lagophthalmos). Some people may be born with lagophthalmos, and others may be afflicted with lagophthalmos due to injury or due to factors such as Bell's palsy, Graves' disease, paralytic stroke, or a tumor.
- A conventional driver monitoring system that relies largely on monitoring a driver's eyes to detect alertness may mistakenly classify a driver suffering from lagophthalmos as fully alert even though the driver may be asleep. It is therefore desirable to provide solutions that monitor driver alertness while taking into consideration that a driver may have lagophthalmos.
- A detailed description is set forth below with reference to the accompanying drawings. The use of the same reference numerals may indicate similar or identical items. Various embodiments may utilize elements and/or components other than those illustrated in the drawings, and some elements and/or components may not be present in various embodiments. Elements and/or components in the figures are not necessarily drawn to scale. Throughout this disclosure, depending on the context, singular and plural terminology may be used interchangeably.
- FIG. 1 illustrates an exemplary embodiment of a driver alertness detection system in accordance with the disclosure for detecting an alertness state of a driver.
- FIG. 2 illustrates an exemplary embodiment of the driver alertness detection system in accordance with the disclosure for detecting an alertness state of a driver afflicted with lagophthalmos.
- FIGS. 3A-3B show a flowchart of a method of operation of a driver alertness detection system in accordance with the disclosure.
- FIG. 4 shows some exemplary components that may be included in a driver alertness detection system in accordance with the disclosure.
- In terms of a general overview, certain embodiments described in this disclosure are generally directed to systems and methods for detecting alertness of a driver of a vehicle. A driver alertness detection system is used to determine whether a driver of a vehicle is susceptible to lagophthalmos. When the driver is susceptible to lagophthalmos, the driver alertness detection system may evaluate an alertness state of the driver by disregarding an eyelid status of the driver and monitoring one or more biometrics of the driver such as, for example, a heart rate or a breathing pattern. In another exemplary method, the driver alertness detection system may evaluate an alertness state of the driver by placing a higher priority on the biometrics of the driver than on the eyelid status of the driver. However, when the driver is not susceptible to lagophthalmos, the driver alertness detection system may evaluate the alertness state of the driver by using a standard procedure that involves placing a higher priority on the eyelid status of the driver than on the biometrics of the driver.
- The disclosure will be described more fully hereinafter with reference to the accompanying drawings, in which exemplary embodiments of the disclosure are shown. This disclosure may, however, be embodied in many different forms and should not be construed as limited to the exemplary embodiments set forth herein. It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made to various embodiments without departing from the spirit and scope of the present disclosure. Thus, the breadth and scope of the present disclosure should not be limited by any of the above-described exemplary embodiments but should be defined only in accordance with the following claims and their equivalents. The description below has been presented for the purposes of illustration and is not intended to be exhaustive or to be limited to the precise form disclosed. It should be understood that alternate implementations may be used in any combination desired to form additional hybrid implementations of the present disclosure. For example, any of the functionality described with respect to a particular device or component may be performed by another device or component. Furthermore, while specific device characteristics have been described, embodiments of the disclosure may relate to numerous other device characteristics. Further, although embodiments have been described in language specific to structural features and/or methodological acts, it is to be understood that the disclosure is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as illustrative forms of implementing the embodiments.
- Certain words and phrases are used herein solely for convenience and such words and terms should be interpreted as referring to various objects and actions that are generally understood in various forms and equivalencies by persons of ordinary skill in the art. For example, the word “driver” as used herein may be equally applicable to a passenger of a vehicle or in some cases, to a person who is requesting a ride in a ride-share vehicle. Words such as “person” or “occupant” may be used in some cases to indicate a driver, a passenger, or a potential passenger of a vehicle. The word “vehicle” as used in this disclosure can pertain to any one of various types of vehicles such as cars, vans, sports utility vehicles, trucks, electric vehicles, gasoline vehicles, hybrid vehicles, and autonomous vehicles. The word “biometrics” as used in this disclosure generally refers to various parameters of a human body that may be used to identify a physical condition, such as a sleeping condition or an alert condition. The word “information” as used herein can pertain to data, signals, communications (for example, messages) and other such items that can be processed by processing circuitry for carrying out various operations. Words such as “having,” “suffering,” “condition,” and “afflicted,” may be used in an interchangeable manner and generally denote that a physical condition such as lagophthalmos is associated with an individual being referred to. The word “standard” as used herein generally refers to an action that is known in popular practice. It should also be understood that the word “example” as used herein is intended to be non-exclusionary and non-limiting in nature. More particularly, the word “exemplary” as used herein indicates one among several examples, and it should be understood that no undue emphasis or preference is being directed to the particular example being described.
-
FIG. 1 illustrates an exemplary embodiment of a driveralertness detection system 100 in accordance with the disclosure for detecting adriver 110 who is asleep at the wheel of a vehicle. The driveralertness detection system 100 may be implemented in a variety of ways and can include various type of sensors. In one exemplary implementation, the driveralertness detection system 100 can include animaging apparatus 105, afacial recognition system 120, and one or more biometric sensors (such as an exemplary biometric sensor 115). - The
imaging apparatus 105 can be mounted on any of various parts of the vehicle having a field of view that encompasses at least a portion of a face of thedriver 110, particularly the eyes of thedriver 110. Theimaging apparatus 105 in this example illustration is mounted upon a rear-view mirror of the vehicle and is arranged to capture one or more images of the eyes of thedriver 110. In some cases, theimaging apparatus 105 can be a video camera that captures real-time video of at least the eyes of thedriver 110. In some other cases, theimaging apparatus 105 can be a digital camera that captures digital images of at least the eyes of thedriver 110. The digital images may be captured on a repetitive basis, an intermittent basis, and/or a random basis. The real-time video and/or digital images are conveyed to a processing circuit (not shown) of the driveralertness detection system 100 for processing in order to detect an eyelid status of thedriver 110. An eyelid status, such as, for example, an open condition, a drooping condition, a partially closed condition, or a fully closed condition, can provide an indication of an alertness state of thedriver 110. - The
facial recognition system 120 can be mounted on any of various parts of the vehicle having a field of view that encompasses a face of the driver 110. The facial recognition system 120 in this example illustration is mounted upon an instrument panel of the vehicle and is arranged to capture images of the face of the driver 110 (digital images and/or video). The images may be conveyed to the processing circuit of the driver alertness detection system 100 for processing in order to analyze various facial features of the driver 110. Facial features such as a drooping mouth, a slack jaw, and/or an angular orientation can provide an indication of an alertness state of the driver 110. - The
biometric sensor 115 can include one or more of various types of devices that may be mounted on various parts of the vehicle and used to detect various types of physical conditions of the driver 110 for purposes of evaluating an alertness state of the driver 110. The biometric sensor 115 in this example illustration can be a pressure sensor that senses a placement of the hands of the driver 110 upon the steering wheel and/or an intensity of a hand grip of the driver 110 upon the steering wheel. In another example embodiment, the biometric sensor 115 can be a body sensor that is provided in a driver seat occupied by the driver 110. The body sensor may measure various biometric parameters of the driver 110, such as, for example, blood pressure, heart rate, brainwaves, and breathing pattern. The body sensor may incorporate various types of technologies, such as, for example, infra-red technology, green light technology, radio-frequency (RF) technology, and/or pressure transducer technology. - One example of the use of infra-red technology is for temperature measurements and distance measurements. One example of the use of green-light technology is in fitness monitoring devices such as a Fitbit® activity tracker, where a pair of LEDs shines green light that is used to measure minute changes in the color characteristics of blood flowing through a human body. The color characteristics typically vary in correspondence to a blood pumping action of the heart and can be used to determine parameters such as pulse rate and heartbeat.
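- As a rough illustration of how such green-light (photoplethysmography-style) measurements can be turned into a pulse rate, the sketch below counts rising edges in a normalized light-intensity signal. This is a hypothetical sketch; the function name, sampling rate, and threshold are illustrative assumptions, not details from the disclosure:

```python
def estimate_pulse_rate(samples, sample_rate_hz, threshold=0.5):
    """Estimate beats per minute from a normalized green-light signal.

    `samples` is a sequence of reflected-light intensity values in [0, 1];
    each rise of the signal above `threshold` is counted as one heartbeat.
    (Illustrative assumption: real devices filter and detrend the signal.)
    """
    beats = 0
    above = False
    for value in samples:
        if value > threshold and not above:
            beats += 1  # rising edge: one pulse of blood flow
        above = value > threshold
    duration_minutes = len(samples) / sample_rate_hz / 60.0
    return beats / duration_minutes

# Example: a synthetic 10-second signal with one peak per second (60 bpm).
signal = ([0.9] + [0.1] * 9) * 10
rate = estimate_pulse_rate(signal, sample_rate_hz=10)
# rate is 60.0 beats per minute for this synthetic signal
```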
- In the exemplary scenario illustrated in
FIG. 1 , the processing circuit of the driver alertness detection system 100 may process the information provided by the imaging apparatus 105, the facial recognition system 120, and/or the biometric sensor 115 over a period of time (which can be referred to as a sampling period), and conclude that there is a high probability that the driver 110 is asleep at the wheel. Processing the information provided by the various sources such as the imaging apparatus 105, the facial recognition system 120, and/or the biometric sensor 115 may include applying various levels of priorities. In one exemplary implementation, the application of priorities may be carried out by applying weights to the information received from the various sources. Typically, continuous eyelid closure over an extended period of time is a good indicator that a person is asleep. An intermittent eyelid closure may indicate that a person is drowsy. Heart rate and breathing patterns can also indicate an alertness state of a person. However, such states can be prone to certain ambiguities among different individuals. For example, some physical parameters (such as heart rate, breathing pattern, and/or blood pressure) of an athletic individual may be different than those of a sedentary individual. - Consequently, the processing circuit of the driver
alertness detection system 100 may apply a greater weight to the eyelid status of the driver 110 than to the biometric measurements. For example, the driver alertness detection system 100 may apply a numerical weight of 8 (out of 10) for signals received from the imaging apparatus 105 and/or the facial recognition system 120, and use this weighting to evaluate the eyelid status of the driver 110. The driver alertness detection system 100 may apply a lower weight (5, for example) to the signals received from the biometric sensor 115 for evaluating other physical conditions of the driver 110 to determine the alertness state of the driver 110. Such a weighting scheme, where the eyelid status is used as the primary indicator of alertness, may be effective when the driver 110 does not suffer from lagophthalmos. However, this approach may be ineffective when the driver 110 has lagophthalmos, because the eyelids of the driver 110 may remain open even though the driver 110 has fallen asleep at the wheel. -
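- The weighting scheme described above can be sketched as a weighted average of the two indications. The function below is a hypothetical illustration; the probability inputs and function name are assumptions, while the weights of 8 and 5 (out of 10) mirror the example values in this paragraph:

```python
# Hypothetical sketch of the standard eyelid-primary weighting scheme.
def alertness_score(eyelid_sleep_prob, biometric_sleep_prob,
                    eyelid_weight=8, biometric_weight=5):
    """Combine eyelid and biometric indications into one sleep probability.

    Each input is a probability in [0, 1] that the corresponding signal
    indicates sleep. The weighted average favors the eyelid status, which
    is the primary indicator for a driver without lagophthalmos.
    """
    total = eyelid_weight + biometric_weight
    return (eyelid_weight * eyelid_sleep_prob
            + biometric_weight * biometric_sleep_prob) / total

# Example: eyelids closed for over a minute, biometrics ambiguous.
score = alertness_score(eyelid_sleep_prob=0.9, biometric_sleep_prob=0.5)
# score is about 0.75; values near 1.0 indicate a high probability of sleep.
```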
FIG. 2 illustrates an exemplary embodiment of the driver alertness detection system 100 configured to detect an alertness state of an occupant of a vehicle when the occupant is afflicted with lagophthalmos. The occupant of the vehicle can either be a driver 210 or a passenger 215 of the vehicle. Though the following description is directed at the driver 210, it must be understood that the description is equally applicable to any occupant of a vehicle. For example, in some instances, the vehicle can be an autonomous vehicle and all the occupants of the autonomous vehicle are passengers. In such a situation, particularly when the autonomous vehicle is a ride-share vehicle and the passengers do not know each other, it may be disconcerting for a first passenger to see a second passenger sitting with eyes wide open and not responding in a manner that is expected of a person who is awake. The first passenger may be unaware that the second passenger is afflicted with lagophthalmos. - In an exemplary scenario that is illustrated in
FIG. 2 , the driver alertness detection system 100 may process signals received from the biometric sensor 115 and conclude that there is a high probability that the driver 210 is in a sleeping state or a drowsy state. However, due to the lagophthalmos condition of the driver 210 and the resulting wide-open eyes, the signals received from the imaging apparatus 105 and/or the facial recognition system 120 may provide an eyelid status indicative of the driver 210 being awake. Upon encountering such a conflicting situation, the driver alertness detection system 100 can initiate a procedure to determine if the driver 210 suffers from lagophthalmos. In one exemplary implementation, the driver alertness detection system 100 may fetch medical records of the driver 210 to do so. In another exemplary implementation, the driver alertness detection system 100 may execute a learning procedure to determine whether the driver 210 suffers from lagophthalmos. - If one or both of the procedures indicate that the
driver 210 suffers from lagophthalmos, the driver alertness detection system 100 evaluates the alertness state of the driver 210 by placing a higher priority on signals received from the biometric sensor 115 than on signals received from devices such as the imaging apparatus 105 and/or the facial recognition system 120 used to determine an eyelid status of the driver 210. For example, the driver alertness detection system 100 may apply a numerical weight of 8 (out of 10) for signals received from the biometric sensor 115 and a lower weight (5, for example) to the signals received from the imaging apparatus 105 and/or the facial recognition system 120 when evaluating the alertness state of the driver 210. - In accordance with a second implementation of this exemplary embodiment, the driver
alertness detection system 100 evaluates the alertness state of the driver 210 by disregarding signals received from devices such as the imaging apparatus 105 and/or the facial recognition system 120, and only processing signals received from the biometric sensor 115. For example, the driver alertness detection system 100 may evaluate the alertness state of the driver 210 based on a heart rate or a breathing pattern of the driver 210 and disregard an eyelid status of the driver 210 in view of the driver 210 having lagophthalmos. -
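- Taken together, the standard and re-prioritized procedures can be sketched as a single selection step. This is a hypothetical illustration (the function name, probability inputs, and the `disregard_eyelids` flag used to model the second implementation are assumptions); the 8/5 weights mirror the example values used in this disclosure:

```python
def evaluate_sleep_probability(eyelid_sleep_prob, biometric_sleep_prob,
                               has_lagophthalmos, disregard_eyelids=False):
    """Return a combined sleep probability in [0, 1].

    Without lagophthalmos: standard weighting, eyelid status primary
    (weight 8 out of 10) and biometrics secondary (weight 5 out of 10).
    With lagophthalmos: either swap the priorities (first implementation)
    or, when disregard_eyelids is set, use biometric signals only
    (second implementation).
    """
    if has_lagophthalmos:
        if disregard_eyelids:
            # Eyelids may stay open during sleep, so rely only on
            # biometrics such as heart rate and breathing pattern.
            return biometric_sleep_prob
        return (5 * eyelid_sleep_prob + 8 * biometric_sleep_prob) / 13
    return (8 * eyelid_sleep_prob + 5 * biometric_sleep_prob) / 13
```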
FIGS. 3A-3B show a flowchart 300 of a method of operation of a driver alertness detection system in accordance with the disclosure. The flowchart 300 illustrates a sequence of operations that can be implemented in hardware, software, or a combination thereof. In the context of software, the operations represent computer-executable instructions stored on one or more non-transitory computer-readable media (such as a memory 410 that is described below). The computer-executable instructions may be executed by one or more processors (such as a processor 405 that is described below) to perform the recited operations. Generally, computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular abstract data types. The order in which the operations are described is not intended to be construed as a limitation, and any number of the described operations may be carried out in a different order, omitted, combined in any order, and/or carried out in parallel. Some or all of the operations described in the flowchart 300 may be carried out by using an application such as a driver alertness detection system module 411 contained in a memory 410 as described below using FIG. 4 . - At
block 305, a determination may be made by a driver alertness detection system whether an eyelid status of a driver of a vehicle indicates a sleepy or drowsy condition. For example, a processor may process images received from the imaging apparatus 105 and/or the facial recognition system 120 and conclude that the driver is asleep, based on detecting that the eyelids of the driver have remained closed for a period of time (for over a minute, for example). As another example, the processor may process images received from the imaging apparatus 105 and/or the facial recognition system 120 and conclude that the driver is drowsy, based on detecting that the eyelids of the driver close occasionally or flutter intermittently over a period of time (for 3 minutes, for example). - If the eyelid status of the driver indicates a sleepy or drowsy condition, at
block 310, a determination is made whether biometric signals received from one or more biometric sensors, such as the biometric sensor 115, indicate a sleeping condition. For example, the processor may process signals from a biometric sensor that monitors breathing activity of the driver and detect a sleeping breathing pattern. As another example, the processor may process signals from a biometric sensor that monitors brain activity of the driver and detect a sleeping brainwave pattern. - If the biometrics of the driver confirm the sleepy or drowsy condition that is indicated by the eyelid status of the driver, at
block 315, the processor draws a conclusion that the driver is in a sleep state. - At
block 320, remedial actions may be taken. For example, the driver alertness detection system 100 may communicate with an infotainment system in the vehicle to emit an audible signal (beeps, tones, music, etc.) for waking the driver. When the vehicle is an autonomous vehicle, further action may be taken if the driver does not respond to the audible signal. For example, the autonomous vehicle may slow down and come to a halt on the side of a road if the driver does not wake up after repeated attempts. - If, at
block 310, the biometric signals received from the biometric sensor such as the biometric sensor 115 do not indicate a sleeping condition even though the eyelid status indicates a sleeping or drowsy condition, at block 325, the processor may conclude that the driver is either drowsy or may have an eye problem. - Consequently, at
block 330, the driver alertness detection system 100 may display, on a display screen of the infotainment system in the vehicle, a message that recommends that the driver take a break or drink coffee. In some cases, the processor may advise the driver to seek medical assistance to treat an eye condition such as redness, dryness, allergies, etc. - Turning back to block 305, if the eyelid status of the driver does not indicate a sleepy or drowsy condition, at
block 335, a determination is made whether biometric signals received from one or more biometric sensors indicate a sleeping condition. For example, the processor may detect a sleeping breathing pattern and/or a sleeping brainwave pattern that provides an indication that the driver is either sleeping or is drowsy. If the biometric sensors indicate that the driver is not in a sleeping/drowsy condition, thereby confirming the eyelid status (at block 305) that is indicative of the driver not being asleep or drowsy, at block 340, no further action is taken by the driver alertness detection system 100. However, if the indication provided by the biometric sensors indicates that the driver is sleepy or drowsy, thereby contradicting the eyelid status (at block 305) that is indicative of the driver not being asleep or drowsy, at block 345 (FIG. 3B ), a determination is made whether the driver is known to have lagophthalmos. The operations that are carried out to make the determination may also be applicable to a passenger of the vehicle or a person requesting a ride in a ride-share vehicle. As such, words such as "person" or "occupant" may be used below when the subject matter is applicable to both the driver and the passenger. Some actions carried out with respect to a passenger may be different than those carried out with respect to a driver, because the passenger is not directly involved with driving operations of the vehicle. In these instances, the word "passenger" or "driver" may be used to provide clarity. - Furthermore, in some cases, a person may request a ride in a ride-share vehicle, which can be either an autonomous vehicle or a driver-operated vehicle. In such cases, the actions described below may be carried out prior to the person becoming an occupant of the ride-share vehicle (by a ride-share service provider, for example).
- At least four different scenarios may be applicable when making this determination. A first scenario pertains to a person having lagophthalmos and being aware that he/she has lagophthalmos. A second scenario pertains to a person having lagophthalmos and being unaware that he/she has lagophthalmos. A third scenario pertains to a person not having lagophthalmos and being aware that he/she does not have lagophthalmos. A fourth scenario pertains to a person not having lagophthalmos and being unaware that he/she does not have lagophthalmos.
- If the first scenario is applicable, at
block 360, a re-prioritized alertness monitoring procedure may be executed. The re-prioritized alertness monitoring procedure refers to a modification of a standard alertness monitoring procedure. The standard alertness monitoring procedure, which is applicable to an occupant of the vehicle who does not have lagophthalmos, involves applying a higher priority (weighting) to the eyelid status of the person than to biometric factors such as a breathing pattern or a heart rate. Thus, the eyelid status of the occupant becomes the primary criterion for detecting alertness. On the other hand, the re-prioritized alertness monitoring procedure involves applying a higher priority (weighting) to biometric factors rather than to the eyelid status of the occupant for detecting alertness, because the eyelid status of a person suffering from lagophthalmos may be a poor indicator of alertness. - If the first scenario is not applicable, at
block 350, a determination is made whether the occupant of the vehicle is known to not have lagophthalmos. If the third scenario is applicable here (i.e., the occupant does not have lagophthalmos and is aware of it), at block 365, the standard alertness monitoring procedure may be executed. If, on the other hand, the second or the fourth scenario is applicable here (the occupant is unaware whether he/she has or does not have lagophthalmos), at block 355, the processor concludes that a lagophthalmos condition of the occupant of the vehicle is unknown. - At
block 370, a lagophthalmos status of the occupant of the vehicle may be evaluated. The evaluation may be carried out in various ways and may also be applicable in the case of a person requesting a ride in a ride-share vehicle. In one exemplary embodiment in accordance with the disclosure, a processor that can be a part of the driver alertness detection system 100 obtains a medical history of a person from one or more of various sources. For example, the medical history of the person may be obtained from a server computer that is configured to wirelessly communicate with the driver alertness detection system. As another example, the medical history of the person may be obtained from a database that is a part of the driver alertness detection system 100. As yet another example, the medical history of the person may be obtained from a device such as a fitness bracelet worn by the person. - In another exemplary embodiment in accordance with the disclosure, the processor can generate a medical profile of the occupant of a vehicle based on monitoring and analyzing the eyelid status and biometric data of the occupant over a period of time. The period of time can, for example, correspond to multiple trips performed by the occupant in the vehicle. The medical profile of the occupant may lead to a finding that the occupant suffers from lagophthalmos. The finding may be based on the biometric data (such as the heart rate and the breathing pattern of the occupant) indicating a sleeping profile while the eyelid status indicates an awake profile. Alternatively, the medical profile of the occupant may lead to a finding that the occupant does not suffer from lagophthalmos when both the eyelid status and the biometric data indicate a sleeping profile.
- At
block 375, a determination may be made whether the lagophthalmos status of the person has been determined. If a determination has been made, at block 380, a determination is made whether the person has lagophthalmos. If the person does not have lagophthalmos, the standard alertness monitoring procedure can be executed (block 365). If the person has lagophthalmos and is an occupant of the vehicle, the re-prioritized alertness monitoring procedure can be executed (block 360). In one exemplary case where the occupant of the vehicle is a passenger in the vehicle, the other occupants of the vehicle may be informed of the lagophthalmos condition of the passenger so as to assuage any concerns that the other passengers may have. In another exemplary case, the passenger may be provided an eye mask by a driver of the vehicle or from a receptacle of the vehicle. The eye mask may be provided in a ride-share vehicle, for example, prior to a person entering the vehicle. Appropriate messages (text and/or audio) may be provided via the infotainment system of the vehicle to inform the person about the eye mask. In yet another exemplary case, where the vehicle is an autonomous vehicle, certain actions may be carried out to assist the person suffering from lagophthalmos. These actions may include automatically increasing an amount of window tinting, or repositioning a seat to provide more comfort and/or privacy to the person. A person who is seeking a ride in a ride-share vehicle and is aware that he/she has lagophthalmos may request a specific type of vehicle or may request privacy with respect to his/her medical condition. - In some cases, at
block 375, it may be discovered that the lagophthalmos status of the person is indeterminate. This may occur, for example, if the person has an eye defect and/or other physical characteristics that render the monitoring and analyzing of the eyelid status and the biometric data of the person inconclusive. - When the lagophthalmos status of the person is indeterminate, at
block 385, the person may be informed of the inconclusive evaluation results. The driver alertness detection system may pursue alternative strategies to resolve the indeterminate lagophthalmos status of the person. For example, the driver alertness detection system may request input from the driver via the infotainment system of the vehicle and/or urge the driver to periodically confirm his/her alertness by performing various tasks when seated in the vehicle. Some exemplary tasks can include touching certain components in the vehicle, operating certain components in the vehicle, or uttering certain words into a microphone coupled to the infotainment system. -
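- The learning procedure of block 370, which builds a medical profile from eyelid and biometric observations over multiple trips and can yield the indeterminate outcome handled at blocks 375 and 385, can be sketched as follows. This is a hypothetical illustration; the observation format and the three-trip evidence threshold are assumptions, not details from the disclosure:

```python
def infer_lagophthalmos_status(trip_observations, min_trips=3):
    """Infer a lagophthalmos finding from observations over multiple trips.

    Each observation is an (eyelid_profile, biometric_profile) pair, with
    each profile being "sleep" or "awake". Repeated trips where biometrics
    indicate sleep while the eyelids indicate wakefulness suggest
    lagophthalmos; trips where both indicate sleep suggest its absence.
    Too little or conflicting evidence yields an indeterminate status.
    """
    conflicting = sum(1 for eyelid, bio in trip_observations
                      if bio == "sleep" and eyelid == "awake")
    agreeing = sum(1 for eyelid, bio in trip_observations
                   if bio == "sleep" and eyelid == "sleep")
    if conflicting >= min_trips and conflicting > agreeing:
        return "has_lagophthalmos"   # re-prioritized monitoring (block 360)
    if agreeing >= min_trips and agreeing > conflicting:
        return "no_lagophthalmos"    # standard monitoring (block 365)
    return "indeterminate"           # inform the person (block 385)
```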
FIG. 4 shows some exemplary components that may be included in the driver alertness detection system 100 in accordance with the disclosure. In this example configuration, the driver alertness detection system 100 may include a processor 405, an input/output (I/O) interface 420, an eyelid monitoring system 425, one or more biometric sensors such as biometric sensor (1) 430, biometric sensor (2) 435, and biometric sensor (n) 440 (n≥1), a communications interface 445, and a memory 410. The various components of the driver alertness detection system 100 can be communicatively coupled to each other via a bus 415. - The
bus 415 can be implemented using one or more of various wired and/or wireless technologies. For example, the bus can be a vehicle bus that uses a controller area network (CAN) bus protocol, a Media Oriented Systems Transport (MOST) bus protocol, and/or a CAN flexible data (CAN-FD) bus protocol. Some or all portions of the bus may also be implemented using wireless technologies such as Bluetooth®, Zigbee®, near-field communications (NFC), cellular, Wi-Fi, Wi-Fi direct, machine-to-machine communication, and/or man-to-machine communication to accommodate communications between the driver alertness detection system 100 and devices such as, for example, an infotainment system 450, a fitness bracelet 460, and/or a personal communication device 465. - The
memory 410, which is one example of a non-transitory computer-readable medium, may be used to store an operating system (OS) 413 and one or more code modules such as a driver alertness detection system module 411. The code modules can be provided in the form of computer-executable instructions that are executed by the processor 405 for performing various operations in accordance with the disclosure. For example, the processor 405 can execute the driver alertness detection system module 411 to perform various actions such as the ones described above with reference to FIGS. 3A-3B . - The
memory 410 can also include a database 412 that may be used to store information such as, for example, a medical history, a medical profile, physical attributes, medical conditions, and personal preferences of a driver of a vehicle. The database 412 may also be used to store information, such as medical information provided by the driver via a graphical user interface (GUI) 451 of the infotainment system 450. - The input/output (I/O)
interface 420 can include various types of wired and/or wireless circuitry to allow the driver alertness detection system 100 to communicate with the infotainment system 450 and a vehicle computer 455. In some implementations where the bus 415 is a vehicle bus, the input/output (I/O) interface 420 may be omitted and the infotainment system 450 and/or the vehicle computer 455 may be coupled directly to the bus 415. The vehicle computer 455 may perform various functions of the vehicle such as controlling engine operations (fuel injection, speed control, emissions control, braking, etc.), managing climate controls (air conditioning, heating, etc.), activating airbags, and issuing warnings (check engine light, bulb failure, low tire pressure, vehicle in blind spot, etc.). - The
eyelid monitoring system 425 can include one or more of various types of devices such as the imaging apparatus 105 and the facial recognition system 120 described above. The biometric sensors can include, for example, a pressure sensor and/or a body sensor. The pressure sensor can be used to detect placement of the hands of a driver upon a steering wheel of a vehicle and/or an intensity of a hand grip upon the steering wheel. The body sensor may be used to measure various biometric parameters of a driver, such as, for example, blood pressure, heart rate, brainwaves, and breathing pattern. - The
communications interface 445 provides for wireless communications between the driver alertness detection system 100 and various devices such as the fitness bracelet 460, the personal communications device 465, and a server computer 475 (via a network 470). The fitness bracelet 460, which can include devices such as a Fitbit® activity tracker or an Apple® Watch, can provide information that may be used by the driver alertness detection system 100 to determine an alertness condition of a driver. The fitness bracelet 460 may communicate with the driver alertness detection system 100 via a wireless communication link 446 that can incorporate technologies such as Bluetooth®, Zigbee®, near-field communications (NFC), cellular, Wi-Fi, Wi-Fi direct, and machine-to-machine communication. - The information provided by the
fitness bracelet 460 may, for example, indicate to the driver alertness detection system 100 that the driver did not have adequate sleep the previous night, or that the driver has had excessive physical exertion over the past few hours prior to getting into the vehicle (walking up a flight of stairs, bending down to pick up multiple loads, a high heart rate associated with exertion, etc.). Such actions may lead to the driver having a sleeping condition or drowsiness when at the wheel of the vehicle. - The
personal communication device 465, which can be a smartphone for example, can provide information that may be used by the driver alertness detection system 100 to determine an alertness condition of a driver. For example, a fitness application executed in the smartphone may provide information to the driver alertness detection system 100 that the driver has undergone excessive physical exertion over the past few hours prior to getting into the vehicle. A global positioning system (GPS) application executed in the smartphone may provide walking information to indicate that the driver has walked over a certain distance prior to getting into the vehicle. The personal communication device 465 may communicate with the driver alertness detection system 100 via a wireless communication link 447 that can incorporate technologies such as Bluetooth®, Zigbee®, near-field communications (NFC), cellular, Wi-Fi, Wi-Fi direct, machine-to-machine communication, and/or man-to-machine communication. - The
server computer 475 may communicate with the driver alertness detection system 100 via the network 470. Information provided by the server computer 475 to the driver alertness detection system 100 can include a medical history of the driver and/or information pertaining to websites visited by the driver. The information pertaining to the websites may indicate that the driver has communicated with help groups, support groups, social media sites, and medical-related sites for various reasons associated with lagophthalmos. - The
network 470 may include any one, or a combination, of networks, such as a local area network (LAN), a wide area network (WAN), a telephone network, a cellular network, a cable network, a wireless network, and/or private/public networks such as the Internet. For example, the network 470 may support communication technologies such as TCP/IP, Bluetooth, cellular, near-field communication (NFC), Wi-Fi, Wi-Fi direct, machine-to-machine communication, and/or man-to-machine communication. - Some or all portions of the
wireless communication link 448 that supports communications between the driver alertness detection system 100 and a communication device, such as a router, for example, that may be included in the network 470, can be implemented using various types of wireless technologies such as Bluetooth®, Zigbee®, near-field communications (NFC), cellular, Wi-Fi, Wi-Fi direct, machine-to-machine communication, man-to-machine communication, and/or vehicle-to-everything (V2X) communication. The communication link 449 that supports communications between the server computer 475 and the network 470 may incorporate one or more of various types of wired and/or wireless technologies used for communications. - In the above disclosure, reference has been made to the accompanying drawings, which form a part hereof, which illustrate specific implementations in which the present disclosure may be practiced. It is understood that other implementations may be utilized, and structural changes may be made without departing from the scope of the present disclosure. References in the specification to "one embodiment," "an embodiment," "an example embodiment," "an exemplary embodiment," "exemplary implementation," etc., indicate that the embodiment or implementation described may include a particular feature, structure, or characteristic, but every embodiment or implementation may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment or implementation. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment or implementation, one skilled in the art will recognize such a feature, structure, or characteristic in connection with other embodiments or implementations, whether or not explicitly described.
- Implementations of the systems, apparatuses, devices, and methods disclosed herein may comprise or utilize one or more devices that include hardware, such as, for example, one or more processors and system memory, as discussed herein. An implementation of the devices, systems, and methods disclosed herein may communicate over a computer network. A “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or any combination of hardwired or wireless) to a computer, the computer properly views the connection as a transmission medium. Transmission media can include a network and/or data links, which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of non-transitory computer-readable media.
- Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause the processor to perform a certain function or group of functions. The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the described features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
- A memory device such as the
memory 410 can include any one memory element or a combination of volatile memory elements (e.g., random access memory (RAM, such as DRAM, SRAM, SDRAM, etc.)) and non-volatile memory elements (e.g., ROM, hard drive, tape, CDROM, etc.). Moreover, the memory device may incorporate electronic, magnetic, optical, and/or other types of storage media. In the context of this document, a "non-transitory computer-readable medium" can be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: a portable computer diskette (magnetic), a random-access memory (RAM) (electronic), a read-only memory (ROM) (electronic), an erasable programmable read-only memory (EPROM, EEPROM, or Flash memory) (electronic), and a portable compact disc read-only memory (CD ROM) (optical). Note that the computer-readable medium could even be paper or another suitable medium upon which the program is printed, since the program can be electronically captured, for instance, via optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory. - Those skilled in the art will appreciate that the present disclosure may be practiced in network computing environments with many types of computer system configurations, including in-dash vehicle computers, personal computers, desktop computers, laptop computers, message processors, handheld devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, tablets, pagers, routers, switches, various storage devices, and the like.
The disclosure may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by any combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both the local and remote memory storage devices.
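The distributed pattern described above, in which linked local and remote systems each perform part of a task, can be sketched as follows. This is an illustration only, not the patent's implementation: Python's multiprocessing Pipe stands in for the hardwired or wireless data links, and a worker process stands in for the remote system.

```python
# Sketch of local and remote systems jointly performing a task.
from multiprocessing import Pipe, Process


def remote_task(conn):
    # The "remote" system performs its share of the work:
    # it receives the data and returns a partial result (the sum).
    values = conn.recv()
    conn.send(sum(values))
    conn.close()


def run_distributed(values):
    parent_conn, child_conn = Pipe()          # the "data link"
    worker = Process(target=remote_task, args=(child_conn,))
    worker.start()
    parent_conn.send(values)                  # hand work to the remote system
    partial = parent_conn.recv()              # collect its partial result
    worker.join()
    # The local system performs its own share: averaging the sum.
    return partial / len(values)


if __name__ == "__main__":
    print(run_distributed([1, 2, 3, 4]))      # prints 2.5
```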
- Further, where appropriate, the functions described herein can be performed in one or more of hardware, software, firmware, digital components, or analog components. For example, one or more application-specific integrated circuits (ASICs) can be programmed to carry out one or more of the systems and procedures described herein. Certain terms are used throughout the description and claims to refer to particular system components. As one skilled in the art will appreciate, components may be referred to by different names. This document does not intend to distinguish between components that differ in name, but not in function.
- At least some embodiments of the present disclosure have been directed to computer program products comprising such logic (e.g., in the form of software) stored on any computer-usable medium. Such software, when executed in one or more data processing devices, causes a device to operate as described herein.
- While various embodiments of the present disclosure have been described above, it should be understood that they have been presented by way of example only, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made therein without departing from the spirit and scope of the present disclosure. Thus, the breadth and scope of the present disclosure should not be limited by any of the above-described exemplary embodiments but should be defined only in accordance with the following claims and their equivalents. The foregoing description has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the present disclosure to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. Further, it should be noted that any or all of the aforementioned alternate implementations may be used in any combination desired to form additional hybrid implementations of the present disclosure. For example, any of the functionality described with respect to a particular device or component may be performed by another device or component. Further, while specific device characteristics have been described, embodiments of the disclosure may relate to numerous other device characteristics. Further, although embodiments have been described in language specific to structural features and/or methodological acts, it is to be understood that the disclosure is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as illustrative forms of implementing the embodiments. 
Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments could include, while other embodiments may not include, certain features, elements, and/or steps. Thus, such conditional language is not generally intended to imply that features, elements, and/or steps are in any way required for one or more embodiments.
Claims (20)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/696,066 US11510612B2 (en) | 2019-11-26 | 2019-11-26 | Systems and methods for detecting alertness of an occupant of a vehicle |
DE102020130959.7A DE102020130959A1 (en) | 2019-11-26 | 2020-11-23 | SYSTEMS AND METHODS OF DETECTING THE ATTENTION OF A VEHICLE OCCUPANT |
CN202011325131.9A CN112932426A (en) | 2019-11-26 | 2020-11-23 | System and method for detecting alertness of a vehicle occupant |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/696,066 US11510612B2 (en) | 2019-11-26 | 2019-11-26 | Systems and methods for detecting alertness of an occupant of a vehicle |
Publications (2)
Publication Number | Publication Date |
---|---|
US20210153806A1 true US20210153806A1 (en) | 2021-05-27 |
US11510612B2 US11510612B2 (en) | 2022-11-29 |
Family
ID=75784391
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/696,066 Active 2040-10-23 US11510612B2 (en) | 2019-11-26 | 2019-11-26 | Systems and methods for detecting alertness of an occupant of a vehicle |
Country Status (3)
Country | Link |
---|---|
US (1) | US11510612B2 (en) |
CN (1) | CN112932426A (en) |
DE (1) | DE102020130959A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11203293B2 (en) * | 2018-03-28 | 2021-12-21 | Ignacio Javier OCHOA NIEVA | System for detecting activities that pose a risk during driving, based on checking the position of the two hands on the steering wheel |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5689241A (en) | 1995-04-24 | 1997-11-18 | Clarke, Sr.; James Russell | Sleep detection and driver alert apparatus |
JP2016151815A (en) | 2015-02-16 | 2016-08-22 | 株式会社デンソー | Driving support device |
US11249544B2 (en) * | 2016-11-21 | 2022-02-15 | TeleLingo | Methods and systems for using artificial intelligence to evaluate, correct, and monitor user attentiveness |
US10829130B2 (en) * | 2018-10-30 | 2020-11-10 | International Business Machines Corporation | Automated driver assistance system |
- 2019-11-26: US application US16/696,066 filed; granted as US11510612B2 (status: Active)
- 2020-11-23: DE application 102020130959.7 filed; published as DE102020130959A1 (status: Pending)
- 2020-11-23: CN application 202011325131.9 filed; published as CN112932426A (status: Pending)
Also Published As
Publication number | Publication date |
---|---|
US11510612B2 (en) | 2022-11-29 |
DE102020130959A1 (en) | 2021-05-27 |
CN112932426A (en) | 2021-06-11 |
Similar Documents
Publication | Title
---|---
US10160457B1 (en) | Vehicle occupant monitoring using infrared imaging
CN112041910B (en) | Information processing apparatus, mobile device, method, and program
US10486590B2 (en) | System and method for vehicle control integrating health priority alerts of vehicle occupants
US10635101B2 (en) | Methods and systems for preventing an autonomous vehicle from transitioning from an autonomous driving mode to a manual driving mode based on a risk model
US20190217865A1 (en) | Method and system for drunk driving prevention
US11392117B2 (en) | Method and device for managing interaction between a wearable device and a vehicle
US10829130B2 (en) | Automated driver assistance system
WO2021145131A1 (en) | Information processing device, information processing system, information processing method, and information processing program
US20080297336A1 (en) | Controlling vehicular electronics devices using physiological signals
US20200247422A1 (en) | Inattentive driving suppression system
US11751784B2 (en) | Systems and methods for detecting drowsiness in a driver of a vehicle
US20160042627A1 (en) | Biometric monitoring and alerting for a vehicle
Melnicuk et al. | Towards hybrid driver state monitoring: Review, future perspectives and the role of consumer electronics
KR20180120901A (en) | Health care apparatus with a passenger physical condition measurement in a vehicle and method there of
US20180365961A1 (en) | Providing safe and fast mobility while detecting drowsiness
US11510612B2 (en) | Systems and methods for detecting alertness of an occupant of a vehicle
KR20190100129A (en) | Health care method with a passenger physical condition measurement in a vehicle
JP7156210B2 (en) | Information provision system for vehicles
Mabry et al. | Commercial motor vehicle operator fatigue detection technology catalog and review
WO2022172724A1 (en) | Information processing device, information processing method, and information processing program
JP6850764B2 (en) | Safe driving support system
US20240199031A1 (en) | Method for checking the fitness of a driver to drive a motor vehicle, and motor vehicle
Murthy et al. | An Automated System for Speed Control and Accident Prevention based on Multiple Parameters
US20240155320A1 (en) | Method of assisting a driver if a passenger encounters a health issue
CN117596293A (en) | Vehicle-based health service pushing method, device, equipment and medium
Legal Events
Code | Title | Description
---|---|---
FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
AS | Assignment | Owner name: FORD GLOBAL TECHNOLOGIES, LLC, MICHIGAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GHANNAM, MAHMOUD YOUSEF;SIDDIQUI, ADIL NIZAM;BENNIE, JUSTIN;SIGNING DATES FROM 20191104 TO 20191122;REEL/FRAME:051130/0040
STPP | Information on status: patent application and granting procedure in general | Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER
STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS
STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED
STCF | Information on status: patent grant | Free format text: PATENTED CASE