GB2578764A - Apparatus and method for controlling vehicle system operation - Google Patents

Info

Publication number
GB2578764A
Authority
GB
United Kingdom
Prior art keywords
vehicle
facial action
action units
responsiveness
dependence
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1818149.5A
Other versions
GB201818149D0 (en)
GB2578764B (en)
Inventor
Thompson Simon
Frederick Brown Edward
Lindsay Briana
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jaguar Land Rover Ltd
Priority to GB1818149.5A
Publication of GB201818149D0
Publication of GB2578764A
Application granted
Publication of GB2578764B
Legal status: Active
Anticipated expiration

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6893Cars
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/163Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/168Evaluating attention deficit, hyperactivity
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/18Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state for vehicle drivers or machine operators
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7264Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K28/00Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
    • B60K28/02Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K28/00Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
    • B60K28/02Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
    • B60K28/06Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W10/00Conjoint control of vehicle sub-units of different type or different function
    • B60W10/18Conjoint control of vehicle sub-units of different type or different function including control of braking systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W10/00Conjoint control of vehicle sub-units of different type or different function
    • B60W10/20Conjoint control of vehicle sub-units of different type or different function including control of steering systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W50/16Tactile feedback to the driver, e.g. vibration or force feedback to the driver on the steering wheel or the accelerator pedal
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • B60W60/007Emergency override
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2503/00Evaluating a particular growth phase or type of persons or animals
    • A61B2503/20Workers
    • A61B2503/22Motor vehicles operators, e.g. drivers, pilots, captains
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2576/00Medical imaging apparatus involving image processing or analysis
    • A61B2576/02Medical imaging apparatus involving image processing or analysis specially adapted for a particular organ or body part
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • B60W2040/0872Driver physiology
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/143Alarm means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146Display means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to occupants
    • B60W2540/26Incapacity

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Automation & Control Theory (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Psychiatry (AREA)
  • Human Computer Interaction (AREA)
  • Chemical & Material Sciences (AREA)
  • Developmental Disabilities (AREA)
  • Combustion & Propulsion (AREA)
  • Child & Adolescent Psychology (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • Fuzzy Systems (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physiology (AREA)
  • Signal Processing (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Traffic Control Systems (AREA)

Abstract

A control system/method for controlling operation of a vehicle system, comprising one or more controllers. The control system is configured to receive image data 302 representative of at least one image of at least a portion of a vehicle occupant's face and identify 403 from the received image data one or more Facial Action Units, e.g. a contraction or relaxation of one or more muscles of the face of the vehicle occupant. The control system determines 306 whether the one or more identified Facial Action Units correspond to Facial Action Units associated with an underlying medical condition and outputs a control signal 308 configured to control operation of a vehicle system in dependence on a determination that the one or more identified Facial Action Units correspond to Facial Action Units associated with the underlying medical condition. The control system may provide an audio/visual warning, and the vehicle system may be a steering and/or braking system. Reference is also made to computer software to perform the method.

Description

APPARATUS AND METHOD FOR CONTROLLING VEHICLE SYSTEM OPERATION
TECHNICAL FIELD
The present disclosure relates to controlling vehicle system operation and particularly, but not exclusively, to controlling operation of a vehicle system in dependence on a determined responsiveness of a vehicle occupant. Aspects of the invention relate to a control system, a system, a vehicle, a method, and computer software.
BACKGROUND
When a vehicle is being operated in an autonomous mode, it is important that the vehicle operator is capable of taking control of the vehicle in the event that the autonomous mode becomes unsuitable for use.
It is known to use sensors in a vehicle that can measure a "hands off wheel" time and "eyes open" status. However, it is still possible for a driver to exhibit a reduced responsiveness, for example due to a medical condition, and not trigger one of the previously mentioned driver checks.
It is known to use computerised analysis of a user's face to determine the person's emotional state. This may be achieved using Facial Action Coding, for example.
It is an object of embodiments of the invention to at least mitigate one or more of the problems of the prior art.
SUMMARY OF THE INVENTION
Aspects and embodiments of the invention provide a control system for a vehicle, a system, a vehicle, a method, and computer software as claimed in the appended claims.
According to an aspect of the invention, there is provided a control system for controlling operation of a vehicle system, by identifying one or more Facial Action Units of a vehicle occupant from received image data of the occupant's face, and outputting a control signal to control a vehicle system's operation in dependence on the identified one or more Facial Action Units.
According to an aspect of the invention, there is provided a control system for controlling operation of a vehicle system, the control system comprising one or more controllers configured to: receive image data representative of at least one image of at least a portion of a vehicle occupant's face; identify one or more Facial Action Units of the vehicle occupant from the received image data; determine if the identified one or more Facial Action Units correspond with Facial Action Units associated with an underlying medical condition; and output a control signal, the control signal configured to control operation of a vehicle system in dependence on a determination that the one or more Facial Action Units correspond with Facial Action Units associated with an underlying medical condition.
Advantageously, the control system of the present invention is configured to identify and determine whether one or more Facial Action Units of a vehicle occupant correspond to Facial Action Units associated with an underlying medical condition. In this way, the control system may infer a likelihood of an occupant exhibiting a reduced level of responsiveness to a driving task. The control system is configured to output a control signal for controlling one or more vehicle systems in order to mitigate any issues which may arise from an occupant having a reduced level of responsiveness.
The vehicle occupant may be the driver of the vehicle, or a non-driving occupant of the vehicle.
The one or more controllers may collectively comprise: at least one electronic processor having an electrical input for receiving the image data; and at least one memory device coupled to the at least one electronic processor and having instructions stored therein; wherein the at least one electronic processor is configured to access the at least one memory device and execute the instructions stored therein so as to identify the one or more Facial Action Units and determine if the one or more Facial Action Units correspond with Facial Action Units associated with an underlying medical condition.
The control system may be configured to identify the one or more Facial Action Units of the vehicle occupant using a Facial Action Coding System, FACS.
The control system may be configured to: associate each of the one or more identified Facial Action Units corresponding with Facial Action Units associated with an underlying medical condition with a category of responsiveness; and determine a responsiveness of the vehicle occupant in dependence on the one or more categories of responsiveness associated with each of the one or more identified Facial Action Units.
The one or more categories of responsiveness may comprise two or more levels of responsiveness. The two or more levels of responsiveness may comprise a low level of responsiveness and a high level of responsiveness, for example.
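By way of illustration only, the association and determination described above can be sketched as follows. The specific Action Unit numbers, category mapping and "worst category wins" rule are hypothetical assumptions, not taken from this disclosure.

```python
# Hypothetical mapping from identified Facial Action Units (keyed by FACS
# AU number) to a category of responsiveness. The AU numbers and the
# categories assigned to them are illustrative only.
AU_RESPONSIVENESS = {
    43: "low",   # AU43, eyes closed
    45: "high",  # AU45, blink
    26: "high",  # AU26, jaw drop
}

# Ordering of the two example levels, lowest (most concerning) first.
LEVEL_ORDER = {"low": 0, "high": 1}


def occupant_responsiveness(identified_aus):
    """Return an overall responsiveness category for a set of identified AUs.

    Each identified AU is associated with a category of responsiveness;
    the occupant's responsiveness is taken here as the lowest (most
    concerning) category present among the identified AUs.
    """
    categories = [AU_RESPONSIVENESS[au] for au in identified_aus
                  if au in AU_RESPONSIVENESS]
    if not categories:
        return "high"  # no concerning AUs identified
    return min(categories, key=lambda c: LEVEL_ORDER[c])
```

For example, identifying both a blink (AU45) and sustained eye closure (AU43) would resolve to the "low" category under this rule.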
"Responsiveness" may also be considered to be "awareness", or "attentiveness", and relates to level of how responsive, aware and/or attentive the user is expected or likely to be.
The image data may be representative of a plurality of images of the at least a portion of the vehicle occupant's face captured over a period of time. The control system may be configured to determine that the one or more Facial Action Units correspond with Facial Action Units associated with an underlying medical condition in dependence on a change in one or more of the identified Facial Action Units over the period of time.
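A minimal sketch of such a temporal check follows; the window length, threshold fraction and choice of AU43 (eye closure) are illustrative assumptions, not values from this disclosure.

```python
from collections import deque


class TemporalAUMonitor:
    """Flags when an Action Unit is sustained over a period of time.

    A single frame showing, e.g., closed eyes (AU43) is ordinary blinking;
    the same AU present across most of a sliding window of frames may
    instead correspond to an underlying medical condition. The window
    size and fraction threshold here are illustrative only.
    """

    def __init__(self, window=30, fraction=0.8):
        self.window = window
        self.fraction = fraction
        self.history = deque(maxlen=window)  # True/False per frame

    def update(self, aus_in_frame, au_of_interest=43):
        """Record one frame's identified AUs; return True once the AU of
        interest has been present in at least `fraction` of a full window."""
        self.history.append(au_of_interest in aus_in_frame)
        if len(self.history) < self.window:
            return False  # not enough frames observed yet
        return sum(self.history) / self.window >= self.fraction
```

With per-frame AU identification feeding `update`, the monitor distinguishes a transient blink from persistent eye closure over the period of time.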
The image data may be captured by one or more in-vehicle cameras configured to transmit images of the at least part of the vehicle occupant's face to the control system.
The vehicle system may be one or more of: an audio output system, and the control signal is configured to provide an audio warning to the vehicle occupant; a visual output system, and the control signal is configured to provide a visual warning to the vehicle occupant; a communication system, and the control signal is configured to establish a communication link between a third party external to the vehicle and the vehicle; a navigation system, and the control signal is configured to cause a navigation route to be determined in dependence on the determination that the one or more identified Facial Action Units correspond with Facial Action Units associated with an underlying medical condition.
The communication system may be configured to communicate with a third party by one or more of: telephone call, e-call, SMS message, MMS message, e-mail, or other alert. The third party may be, for example, a contact person (e.g. a friend, family member or colleague), a medical facility (e.g. a hospital or doctors' surgery), and/or an emergency service (e.g. ambulance service). The communication with a third party may comprise information relating to the location of the vehicle.
The vehicle system may be one or more of: a steering vehicle system, and the control signal is configured to control the vehicle steering; and a braking vehicle system, and the control signal is configured to control the vehicle braking.
The control system may be configured to control one or more particular vehicle systems selected in dependence on a categorisation of responsiveness, the category of responsiveness determined in dependence on an association of each of the one or more identified Facial Action Units with a category of responsiveness.
If a categorisation is consistent with, or indicative of, for example, a low responsiveness, the control system may be configured to call a designated contact and/or cause the vehicle to be driven in an at least partly autonomous mode of operation to a designated location, which may be a residential address or a hospital, for example. If a categorisation is consistent with, or indicative of, for example, a high but reduced responsiveness (compared to the "normal" level of responsiveness for that user), the control system may be configured to call a nominated person, send the vehicle GPS coordinates to a nominated person, provide an audible and/or visual notification to the user and/or cause the vehicle to be brought to a stop, for example.
In embodiments the control system may be configured to output a control signal to one or both of an audio output system and a visual output system for outputting a request to the vehicle occupant for a response. For example, the request for a response may be a request for a vehicle occupant to confirm they are able to regain or maintain control of the vehicle. In some embodiments the control system may be configured to reset or recalibrate in dependence on a positive response from the vehicle occupant to the request.
In some embodiments the control system may be configured to output a further control signal in dependence on a negative response or no response from the vehicle occupant to the request. For example, the control system may be configured to output a further control signal to a navigation system, steering system and/or braking system. The control system may be configured to output the further control signal for controlling operation of the vehicle in accordance with a contingency control profile. For example, the contingency control profile may comprise bringing the vehicle to a stop at a determined stopping location, or may comprise autonomously driving the vehicle along a route determined by the navigation system, e.g. to a nominated location. The control system may additionally or alternatively be configured to output a further control signal to a communication system of the vehicle for communicating with a third party in dependence on a negative response or no response to the request.
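The request-and-escalate behaviour of the preceding two paragraphs can be sketched as below. The `vehicle` object and its method names are hypothetical stand-ins for the vehicle systems described; they do not name any real API.

```python
def handle_reduced_responsiveness(request_response, vehicle):
    """Illustrative escalation logic.

    `request_response` poses an audio/visual question to the occupant and
    returns "positive", "negative" or "none". `vehicle` is a hypothetical
    facade over the communication, navigation, steering and braking
    systems referred to in the text.
    """
    answer = request_response("Are you able to take control of the vehicle?")
    if answer == "positive":
        # Positive response: reset/recalibrate the monitoring.
        vehicle.reset_monitoring()
        return "reset"
    # Negative response or no response: notify a third party and apply
    # the contingency control profile (e.g. a controlled stop).
    vehicle.notify_third_party()
    vehicle.execute_contingency_stop()
    return "contingency"
```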
The control system may comprise an input configured to receive sensor data indicative of the physiological state of the vehicle occupant; wherein the control system is configured to determine if the one or more identified Facial Action Units correspond with Facial Action Units associated with an underlying medical condition in dependence on the received image data and the received sensor data.
The sensor data may be provided by one or more of: a heart rate monitor; a respiration rate monitor; a brain activity monitor; and a pollen counter (e.g. to indicate the likelihood of a hay fever sufferer exhibiting associated symptoms, e.g. sneezing and/or unclear or blurred vision).
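As a minimal sketch of combining the image-based determination with such sensor data, the following corroboration rule could be used. The thresholds and evidence weights are illustrative assumptions, not values from this disclosure.

```python
def medical_condition_likely(au_match, heart_rate=None, respiration_rate=None):
    """Combine the AU-based determination with optional physiological data.

    `au_match` is True when the identified Facial Action Units correspond
    with Facial Action Units associated with an underlying medical
    condition. Sensor readings, when available, add corroborating
    evidence. All thresholds here are illustrative only.
    """
    evidence = 1.0 if au_match else 0.0
    if heart_rate is not None and (heart_rate < 40 or heart_rate > 140):
        evidence += 0.5  # abnormal heart rate corroborates the AU match
    if respiration_rate is not None and respiration_rate < 8:
        evidence += 0.5  # abnormally slow breathing corroborates it
    return evidence >= 1.0
```

Under this rule an AU match alone suffices, while sensor anomalies alone must co-occur to trigger the determination; stricter schemes could require both image and sensor evidence.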
According to an aspect of the invention, there is provided a system for controlling operation of a vehicle system in dependence on responsiveness of a vehicle occupant, the system comprising a control system as described herein; and an image capture means configured to provide image data to the control system.
The system may comprise a vehicle system configured to be controlled, at least in part, by the control system.
The system may comprise a sensor configured to provide sensor data representing a physiological state of the vehicle occupant to the control system.
According to an aspect of the invention, there is provided a vehicle comprising any control system as described herein, or any system as described herein.
According to an aspect of the invention, there is provided a method of controlling operation of a vehicle system in dependence on a responsiveness of a vehicle occupant, the method comprising: receiving image data representing at least one image of at least a portion of a vehicle occupant's face; identifying one or more Facial Action Units of the vehicle occupant from the received image data; determining if the identified one or more Facial Action Units correspond with Facial Action Units associated with an underlying medical condition; and outputting a control signal, the control signal configured to control operation of a vehicle system in dependence on a determination that the one or more Facial Action Units correspond with Facial Action Units associated with an underlying medical condition.
In embodiments, the method may comprise associating each of the one or more identified Facial Action Units corresponding with Facial Action Units associated with an underlying medical condition with a category of responsiveness; and determining a responsiveness of the vehicle occupant in dependence on the one or more categories of responsiveness associated with each of the one or more identified Facial Action Units.
The image data may be representative of a plurality of images of the at least a portion of the vehicle occupant's face captured over a period of time; and determining that the one or more Facial Action Units correspond with Facial Action Units associated with an underlying medical condition may be dependent on a change in one or more of the identified Facial Action Units over the period of time.
The method may comprise controlling the operation of the vehicle system by one or more of: providing an audio warning to the vehicle occupant; providing a visual warning to the vehicle occupant; establishing a communication link between a third party external to the vehicle and the vehicle; and causing a navigation route to be determined in dependence on the determination that the one or more identified Facial Action Units correspond with Facial Action Units associated with an underlying medical condition.
The method may comprise controlling the operation of the vehicle system by one or more of controlling the vehicle steering and controlling the vehicle braking.
The method may comprise controlling one or more particular vehicle systems selected in dependence on a categorisation of responsiveness, the category of responsiveness determined in dependence on an association of each of the one or more identified Facial Action Units with a category of responsiveness.
The method may comprise receiving sensor data indicative of a physiological state of the vehicle occupant; and determining that the one or more identified Facial Action Units correspond with Facial Action Units associated with an underlying medical condition in dependence on the received image data and the received sensor data.
According to an aspect of the invention, there is provided computer software which, when executed, is arranged to perform any method as described herein. The computer software may be stored on a computer-readable medium.
Any controller or controllers described herein may suitably comprise a control unit or computational device having one or more electronic processors. Thus the system may comprise a single control unit or electronic controller, or alternatively different functions of the controller may be embodied in, or hosted in, different control units or controllers. As used herein, the terms "controller" and "control unit" will be understood to include both a single control unit or controller and a plurality of control units or controllers collectively operating to provide any stated control functionality. To configure a controller, a suitable set of instructions may be provided which, when executed, cause said control unit or computational device to implement the control techniques specified herein. The set of instructions may suitably be embedded in said one or more electronic processors. Alternatively, the set of instructions may be provided as software saved on one or more memories associated with said controller to be executed on said computational device. A first controller may be implemented in software run on one or more processors. One or more other controllers may be implemented in software run on one or more processors, optionally the same one or more processors as the first controller. Other suitable arrangements may also be used.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described by way of example only, with reference to the accompanying drawings, in which:
Figure 1 shows an illustration of a control system and a system according to an embodiment of the invention;
Figure 2 shows an illustration of a controller according to an embodiment of the invention;
Figure 3 shows an example of a vehicle occupant's face captured by image capture means according to an embodiment of the invention;
Figure 4 illustrates a method according to an embodiment of the invention; and
Figure 5 illustrates a vehicle according to an embodiment of the invention.
DETAILED DESCRIPTION
Facial Action Coding Systems (FACS) can be used to detect a person's exhibited emotions by denoting muscular movements or "Action Units" (which may also be termed "Facial Action Units") that together make up an expression. FACS can be used to encode the movements of individual facial muscles and associate those movements with a user's emotion. A "Facial Action Unit" may be defined as a change, e.g. a contraction or relaxation, of one or more muscles of the face. The one or more muscles may be a predetermined facial muscle group.
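By way of a non-limiting illustration, the representation of Facial Action Units may be sketched as follows. The Action Unit numbers below are taken from the published FACS taxonomy; the grouping logic is illustrative only and forms no part of the disclosed apparatus.

```python
# Illustrative sketch: Facial Action Units (AUs) as coded movements of
# individual facial muscles, per a Facial Action Coding System.
FACIAL_ACTION_UNITS = {
    1: "Inner brow raiser",
    4: "Brow lowerer",
    6: "Cheek raiser",
    12: "Lip corner puller",
    15: "Lip corner depressor",
    43: "Eyes closed",
}

def describe_expression(active_aus):
    """Return the muscle-movement descriptions for a set of active AUs."""
    return [FACIAL_ACTION_UNITS[au] for au in sorted(active_aus)
            if au in FACIAL_ACTION_UNITS]

# AUs 6 and 12 occurring together are commonly coded as a smile
print(describe_expression({6, 12}))
```

A detected expression is thus a set of co-occurring Action Units rather than a single label, which is what allows the same coding scheme to be repurposed beyond emotion detection.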
In a vehicle environment, it would be useful to be able to use such technology in a way which would benefit the operation of the vehicle, particularly where a vehicle may be operated in an autonomous mode, where the responsiveness of a would-be driver may need to be monitored.
For the present case, the FACS approach to determining emotion from facial muscle movements may be extended to identify one or more Facial Action Units which may be associated with an underlying medical condition. That is, FACS may be adapted, or extended, so that models are created which are indicative of a user's state other than an emotional state. Determined Facial Action Units of a user which may correspond to Facial Action Units associated with an underlying medical condition may be used to categorise a user into different levels of "responsiveness". Users may exhibit different Facial Action Units for a variety of reasons. For example, given Facial Action Units may be indicative of a physiological condition, or a state of incapacitation of the user. The present invention looks to identify and/or categorise observed Action Units of a user which, for example, can thereafter be used to determine a course of action for raising the responsiveness of the driver/user and/or controlling one or more vehicle systems accordingly.
One way to categorise Facial Action Units is for video footage of users to be analysed and annotated by FACS experts to create a taxonomy of Facial Action Units of, for example, immediate facial expressions which may be associated or correspond with a level/category of responsiveness (e.g., two or more "responsiveness levels", such as low, moderate and high responsiveness), or for instance may correspond to Facial Action Units otherwise associated with an underlying medical condition. Such video data may be considered training data in the case of later using a machine learning model to analyse image data. An algorithm (for example a machine learning algorithm trained using the categorised video data) may then be used to identify Facial Action Units from image data of the user's face, and determine whether the identified Facial Action Unit(s) correspond to Facial Action Units associated with an underlying medical condition.
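The matching step described above — comparing Facial Action Units identified from image data against Action Unit sets that experts have associated with a condition — may be sketched as follows. The condition-to-AU mapping and the overlap threshold are purely hypothetical placeholders, not clinical data or values from the present disclosure.

```python
# Illustrative sketch: match identified AUs against expert-annotated AU
# profiles. Profiles and threshold are hypothetical, for illustration only.
CONDITION_AU_PROFILES = {
    "hypothetical_condition_a": {4, 7, 43},
    "hypothetical_condition_b": {15, 17, 43},
}

def matches_condition(identified_aus, threshold=0.6):
    """Return conditions whose AU profile overlaps the identified AUs by at
    least `threshold` (fraction of the profile's AUs that were observed)."""
    matches = []
    for condition, profile in CONDITION_AU_PROFILES.items():
        overlap = len(identified_aus & profile) / len(profile)
        if overlap >= threshold:
            matches.append(condition)
    return matches

print(matches_condition({4, 7, 43, 12}))
```

In practice this rule-based comparison would typically be replaced by a machine learning classifier trained on the annotated video data, but the input (identified AUs) and output (correspondence to a condition) remain as above.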
Figure 1 shows an illustration of a control system 106 and a system 100 according to an embodiment of the invention. The system 100 in Figure 1 includes image capture means 102 configured to provide image data to the control system 106. In some examples as shown in Figure 1, the system 100 comprises a sensor 104 configured to provide sensor data to the control system 106. The sensor data may represent a physiological state of the vehicle occupant to the control system 106. Also, in some examples as shown in Figure 1, the system may include a vehicle system 108 configured to be controlled, at least in part, by the control system 106. The control system 106 provides an output to the vehicle system 108.
The control system 106 may control the operation of a vehicle system 108 in dependence on a determination that one or more identified Facial Action Units correspond to Facial Action Units associated with an underlying medical condition.
The one or more controllers are configured to receive image data representative of at least one image of at least a portion of a vehicle occupant's face. The image data may be received via an input 112. The image data may be captured by image capture means 102, such as an in-vehicle camera. The image capture means 102 may be a camera (e.g. a red-green-blue, RGB, camera or an infra-red, IR, camera) or other imaging device, for example comprising a charge-coupled device, CCD, sensor or the like. The image capture means 102 may be arranged to provide the image data representative of at least one image of at least a portion of a vehicle occupant's face (for example, the image capture means may image the head and upper torso of the vehicle occupant, may image the head of the occupant, or may image a partial portion of the user's head, such as the mouth or eye region). It may be desired for the image data to relate to a face or facial area of the occupant to identify one or more Facial Action Units of the user.
The one or more controllers are also configured to identify one or more Facial Action Units of the vehicle occupant from the received image data. The control system 106 may be configured to identify the one or more Facial Action Units of the vehicle occupant using a Facial Action Coding System, FACS.
In dependence on the identified one or more Facial Action Units, the one or more controllers are configured to determine if the one or more Facial Action Units correspond to Facial Action Units associated with an underlying medical condition. In dependence thereon, the one or more controllers are configured to output a control signal configured to control operation of a vehicle system 108, for example via an output 114.
The vehicle system 108 which is controlled by the outputted control signal may be an audio output system. Such a control signal may be configured to provide an audio warning to the vehicle occupant. For example, a loud spoken audio warning may be provided within the vehicle, such as "Warning - pull over as soon as possible". In another example an alarm or siren may sound to attract the attention of the driver and/or passenger to highlight a potential problem with an occupant of the vehicle.
The vehicle system 108 which is controlled by the outputted control signal may be a visual output system. Such a control signal may be configured to provide a visual warning to the vehicle occupant. For example, a warning light may illuminate on a dashboard display, or a message may be displayed on an in-vehicle screen of a vehicle passenger (e.g. an infotainment system screen).
The vehicle system 108 which is controlled by the outputted control signal may be a communication system. Such a control signal may be configured to establish a communication link between a third party external to the vehicle and the vehicle. For example, an automated spoken message call, phone call or e-call may be made to a third party (e.g. a designated contact of the vehicle occupant) reciting that the vehicle occupant may be suffering from an underlying medical condition and may require assistance. As another example, a text or multimedia message may be transmitted to a friend or designated contact number to alert the recipient that there is a potential problem with the vehicle occupant.
The vehicle system 108 which is controlled by the outputted control signal may be a navigation system. Such a control signal may be configured to cause a navigation route to be determined in dependence on an identification of one or more Facial Action Units determined to correspond to Facial Action Units associated with an underlying medical condition. For example, a planned route may be notified to the vehicle occupants (e.g. from home to a park). The navigation system may receive an output signal configured to cause a new route to be planned by the navigation system. The navigation system may then output a new route to the vehicle occupants, for example displaying a map to a designated location. In the event of the vehicle being a vehicle with autonomous capabilities, the output signal may cause the vehicle to be controlled in an autonomous mode to follow a different route as determined by the navigation system, and travel to the designated location.
The vehicle system 108 which is controlled by the outputted control signal may be a physical control system, such as a steering system and/or a braking system which affects the positioning and/or handling of the vehicle. Such a control signal may be configured to control the vehicle steering and/or control the vehicle braking. Overall the control signal may cause the longitudinal and/or lateral movement of the vehicle to be controlled. For example, the vehicle's steering and/or braking system may be controlled so that the vehicle pulls over to the nearest lay-by or stopping place.
In some embodiments, the control system 106 may be configured to control one or more particular vehicle systems selected in dependence on a categorisation of responsiveness, the category of responsiveness determined in dependence on an association of each of the one or more identified Facial Action Units with a category of responsiveness.
That is, which of the vehicle system(s) is controlled, and/or how, may depend on the determined/estimated level of loss of responsiveness of the occupant. For example, if the identified FAU of the occupant is determined to be indicative of a low level of responsiveness of the occupant, an automated call may be made, and the vehicle may be controlled to automatically pull over at a suitable location. For example, if the FAU is determined to be indicative of a mid-range level of responsiveness, the current GPS coordinates of the vehicle may be transmitted to a third party (e.g. a nominated relative or friend of the occupant) and the vehicle may automatically be driven to the roadside. For example, if the FAU is determined to be indicative of a high, but lower than usual, level of responsiveness, then the vehicle may be automatically driven to the roadside, an audio alarm may be set to sound to attempt to attract the user's attention, and a visual warning message may be displayed to the user. As will be appreciated, there are many other examples evident to the skilled person, which may involve one or more vehicle systems being controlled in one or more ways.
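The selection of vehicle systems in dependence on a responsiveness category, as in the examples above, may be sketched as a simple dispatch table. The category names and action identifiers are illustrative assumptions, not part of the disclosure.

```python
# Illustrative sketch: map a responsiveness category to the set of
# vehicle-system actions to be taken. Names are hypothetical.
RESPONSE_PLAN = {
    "low": ["automated_call", "autonomous_pull_over"],
    "moderate": ["transmit_gps_to_contact", "autonomous_pull_over"],
    "slightly_reduced": ["autonomous_pull_over", "audio_alarm", "visual_warning"],
}

def select_vehicle_actions(responsiveness_category):
    """Return the ordered list of vehicle-system actions for a category;
    an unrecognised category yields no actions."""
    return RESPONSE_PLAN.get(responsiveness_category, [])

print(select_vehicle_actions("moderate"))
```

A table of this kind keeps the mapping between responsiveness categories and controlled vehicle systems in one place, so it can be adjusted without changing the identification or categorisation logic.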
In some embodiments, the control system 106 may comprise an input configured to receive sensor data indicative of a physiological state of the vehicle occupant. The control system 106 in such embodiments is configured to determine if the one or more identified Facial Action Units correspond to Facial Action Units associated with an underlying medical condition in dependence on the received image data and the received sensor data. The sensor data may be transmitted by a sensor 104, which may be part of the system 100 in some embodiments. Sensor data may be used as a check to increase confidence in the determination of the identified Facial Action Units. For example, if the identified FAU correspond to FAU otherwise associated with an underlying medical condition which would exhibit a reduced responsiveness, and the sensor data is consistent with this determination, an increased confidence may be assigned to the finding of "reduced responsiveness". As another example, if the identified FAU correspond to FAU otherwise associated with an underlying medical condition which would exhibit a significantly reduced responsiveness, but sensor information such as from a heart rate monitor detects no change in heartbeat speed or heart rhythm, and/or a respiration rate sensor detects no change in the vehicle occupant's breathing rate or pattern, then the sensor data does not agree with the Facial Action Unit identification/determination, and confidence in the finding of "significantly reduced responsiveness" is decreased. In such a case of disagreement, further processes may take place, such as re-analysing the captured images, analysis of subsequent captured images, analysis of subsequent heart rate measurements and/or breathing measurements, etc. The one or more sensors may be considered to sense data and perform an arbitration of the Facial Action Unit analysis performed on the image data.
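The arbitration described above may be sketched as follows. The specific confidence adjustments are assumed values for illustration; the disclosure does not prescribe particular numbers.

```python
# Illustrative sketch: sensor data corroborates or contradicts the
# image-based finding, raising or lowering confidence. The +0.2 / -0.3
# adjustments are arbitrary illustrative values.
def arbitrate(image_confidence, image_says_reduced, sensor_says_reduced):
    """Adjust confidence in a 'reduced responsiveness' finding using an
    independent physiological sensor (e.g. heart rate or respiration)."""
    if image_says_reduced == sensor_says_reduced:
        # agreement: increase confidence, capped at 1.0
        return min(1.0, image_confidence + 0.2)
    # disagreement: decrease confidence; the caller may then re-analyse
    # the captured images or take further sensor measurements
    return max(0.0, image_confidence - 0.3)

print(arbitrate(0.7, True, True))   # corroborated finding
print(arbitrate(0.7, True, False))  # contradicted finding
```

On disagreement the reduced confidence can serve as the trigger for the further processes mentioned above, such as analysing subsequent images or sensor readings before any control signal is output.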
Examples of sensors which may be used to provide sensor data to the control system 106 include heart rate monitors, respiration rate monitors, brain activity monitors, and pollen counters (a pollen counter may indicate the chance of a vehicle occupant who is a hay fever sufferer experiencing one or more symptoms associated therewith, e.g. sneezing and/or unclear/blurred vision).
Figure 2 shows an illustration of a control system 106 according to an embodiment of the invention. The control system 106 of Figure 2 comprises a controller 120, which comprises at least one electronic processor 109 having an electrical input 112 for receiving the image data, and at least one memory device 110 coupled to the at least one electronic processor 109 and having instructions stored therein. The at least one electronic processor 109 is configured to access the at least one memory device 110 and execute the instructions stored therein so as to identify one or more Facial Action Units and determine if the identified Facial Action Unit(s) correspond to Facial Action Unit(s) associated with an underlying medical condition. The controller 120 in Figure 2 further includes an electrical output 114 for outputting a control signal to one or more vehicle systems 108 as described herein.
The electrical input(s) 112 and output(s) 114 of the controller 120 may be provided to/from a communication bus or network of the vehicle, such as a CANBus or other communication network which may, for example, be implemented by an Internet Protocol (IP) based network such as Ethernet, or FlexRay.
The term "vehicle occupant" includes a person within the vehicle who may, in a manual or partly autonomous mode of operation of the vehicle, control one or more aspects of the vehicle's motion, e.g. by driving the vehicle, or by controlling steering of the vehicle but not longitudinal motion of the vehicle, e.g. when using cruise control or similar systems. In some examples the vehicle occupant may not be a driver or potential driver of the vehicle, i.e. they are a passenger. In some embodiments, the vehicle may be operable in an at least partly autonomous mode of operation wherein one or both of longitudinal and lateral control of the vehicle is automated. In such an autonomous mode, the occupant may not be required to manually control motion of the vehicle in a longitudinal or lateral direction.
Figure 3 illustrates schematically a user 202 who is a vehicle occupant. An image capture means 204 captures one or more images of the user's face 202 and provides this image data to the control system 106. That is, the image data may be captured by one or more in-vehicle cameras configured to transmit images of the at least part of the vehicle occupant's face to the controller.
The control system 106 may then analyse the image data to identify one or more Facial Action Units of the occupant and determine if the one or more identified Facial Action Units correspond to Facial Action Units associated with an underlying medical condition. For example, the control system may determine that the user has a reduced responsiveness because their Facial Action Units may correspond to Facial Action Units associated with an underlying medical condition.
In some embodiments, the control system 106 may be configured to associate each of the one or more identified Facial Action Units with a category of responsiveness, and determine a responsiveness of the occupant in dependence on the one or more categories of responsiveness.
As a result of this classification, a control signal may be output to an audio output system to suggest that the driver pulls the vehicle over at a stopping point until the occupant recovers.
As another example, a control signal may be output to the steering/braking vehicle system to pull the vehicle over immediately, and a message may be communicated to a third party, such as a designated contact of the occupant. Other example responses will be apparent to the reader and are described herein.
In some embodiments the control system 106 may determine a confidence level of the determination of whether the identified one or more Facial Action Units correspond to Facial Action Units associated with an underlying medical condition, and may provide control signals accordingly. That is, the control system 106 may output control signals to one or more active control features of the vehicle based on the level of confidence (or accuracy).
In embodiments, the control system 106 may be configured to perform further operations to better determine / more accurately identify and determine the Facial Action Units of the occupant, such as re-analysing the captured images, analysing a subsequent set of captured images, widening the pool of training data used to determine the Facial Action Units of the occupant, and/or using a second indicator, such as sensor data from a heart rate monitor, breathing monitor, or other suitable sensor.
In some embodiments, the image data is representative of a plurality of images of the at least a portion of the vehicle occupant's face captured over a period of time. For example, a camera may record a plurality of images per second of a user's face. The control system 106 may be configured to identify one or more Facial Action Units and determine if the one or more identified Facial Action Units correspond to Facial Action Units associated with an underlying medical condition in dependence on a change in one or more of the identified Facial Action Units over the period of time. For example, a user's Facial Action Units may correspond to Facial Action Units associated with a high level of responsiveness. If, for a second later time period, the control system identifies a user's Facial Action Units as corresponding with Facial Action Units associated with an underlying medical condition, this may indicate a change in condition of the occupant, and a control signal may be provided accordingly (e.g. an audio signal may recite "pull over if you feel unwell" to the driver/occupant).
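The detection of a change in identified Facial Action Units over a period of time may be sketched as follows. The AU set used for the comparison is a hypothetical placeholder, for illustration only.

```python
# Illustrative sketch: detect the frame at which the occupant's identified
# AUs first transition into an AU set associated with a condition.
MEDICAL_AUS = {4, 43}  # hypothetical AU set, not clinical data

def detect_transition(au_frames):
    """au_frames: list of AU sets, oldest first. Return the index of the
    first frame whose AUs include all of MEDICAL_AUS after a frame whose
    AUs did not, or None if no such transition occurs."""
    previously_matched = None
    for i, aus in enumerate(au_frames):
        matched = MEDICAL_AUS <= aus
        if matched and previously_matched is False:
            return i
        previously_matched = matched
    return None

# e.g. a smile in early frames, then a transition in the third frame
frames = [{6, 12}, {6, 12}, {4, 43}, {4, 43, 7}]
print(detect_transition(frames))
```

Comparing frames over time in this way distinguishes a sudden change in condition, which may warrant a control signal, from Facial Action Units the occupant has exhibited throughout the journey.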
Figure 4 illustrates a method according to an embodiment of the invention. The method is for controlling operation of a vehicle system in dependence on identified Facial Action Units of a vehicle occupant. The method comprises receiving image data representing at least one image of at least a portion of a vehicle occupant's face 302; identifying one or more Facial Action Units of the vehicle occupant from the received image data 304; determining if the identified one or more Facial Action Units correspond to Facial Action Units associated with an underlying medical condition 306; and outputting a control signal, the control signal configured to control operation of a vehicle system in dependence on a determination that the identified Facial Action Unit(s) correspond to Facial Action Units associated with an underlying medical condition 308.
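The four steps of the method of Figure 4 may be sketched as a single pipeline. The `identify_aus` and `control` callables are stand-ins, assumed for illustration, for the image-analysis and vehicle-system interfaces described elsewhere herein.

```python
# Illustrative sketch of the method of Figure 4: receive image data (302),
# identify AUs (304), test correspondence (306), output a control signal (308).
def control_vehicle_system(image_data, identify_aus, condition_aus, control):
    """Run steps 304-308 on received image data (step 302)."""
    aus = identify_aus(image_data)       # step 304: identify Facial Action Units
    corresponds = condition_aus <= aus   # step 306: simplified correspondence test
    if corresponds:
        control("warn_occupant")         # step 308: output a control signal
    return corresponds

# usage with stub interfaces
signals = []
result = control_vehicle_system(
    image_data=b"frame",
    identify_aus=lambda _: {4, 43, 12},  # stub AU identifier
    condition_aus={4, 43},               # hypothetical condition AU set
    control=signals.append,              # stub vehicle-system interface
)
print(result, signals)
```

The correspondence test at step 306 is shown as a simple subset check; as noted above, a trained model may perform this determination in practice.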
In some embodiments, the method comprises associating each of the one or more identified Facial Action Units with a category of responsiveness; and determining a responsiveness of the vehicle occupant in dependence on the one or more categories of responsiveness.
In some embodiments the image data is representative of a plurality of images of the at least a portion of the vehicle occupant's face captured over a period of time. In such embodiments, the method may comprise determining that the identified Facial Action Unit(s) correspond to Facial Action Units associated with an underlying medical condition 306 in dependence on a change in one or more of the identified Facial Action Units over the period of time.
In some embodiments controlling the operation of the vehicle system comprises one or more of: providing an audio warning to the vehicle occupant; providing a visual warning to the vehicle occupant; establishing a communication link between a third party external to the vehicle and the vehicle; and causing a navigation route to be determined in dependence on a determination that the one or more identified Facial Action Units correspond to Facial Action Units associated with an underlying medical condition. In some embodiments controlling the operation of the vehicle system comprises one or more of controlling the vehicle steering and controlling the vehicle braking.
In some embodiments the method comprises controlling one or more particular vehicle systems selected in dependence on a categorisation of responsiveness, the category of responsiveness determined in dependence on an association of each of the one or more identified Facial Action Units with a category of responsiveness.
In some embodiments the method comprises receiving sensor data indicative of a physiological state of the vehicle occupant; and determining that the one or more identified Facial Action Units correspond to Facial Action Units associated with an underlying medical condition in dependence on the received image data and the received sensor data.
Figure 5 illustrates a vehicle 500 according to an embodiment of the invention. The vehicle 500 is a wheeled vehicle. The vehicle 500 may comprise a control system 106 or a system as described above. In some embodiments the vehicle 500 may be arranged to perform a method according to an embodiment, such as that illustrated in Figure 4.
It will be appreciated that embodiments of the present invention can be realised in the form of hardware, software or a combination of hardware and software. Any such software may be stored in the form of volatile or non-volatile storage such as, for example, a storage device like a ROM, whether erasable or rewritable or not, or in the form of memory such as, for example, RAM, memory chips, device or integrated circuits or on an optically or magnetically readable medium such as, for example, a CD, DVD, magnetic disk or magnetic tape. It will be appreciated that the storage devices and storage media are embodiments of machine-readable storage that are suitable for storing a program or programs that, when executed, implement embodiments of the present invention. Accordingly, embodiments provide a computer program comprising code for implementing a system or method as claimed, and a machine-readable storage storing such a program (e.g. a non-transitory computer readable medium). Still further, embodiments of the present invention may be conveyed electronically via any medium such as a communication signal carried over a wired or wireless connection and embodiments suitably encompass the same.
All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and/or all of the steps of any method or process so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps are mutually exclusive.
Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features.
The invention is not restricted to the details of any foregoing embodiments. The invention extends to any novel one, or any novel combination, of the features disclosed in this specification (including any accompanying claims, abstract and drawings), or to any novel one, or any novel combination, of the steps of any method or process so disclosed. The claims should not be construed to cover merely the foregoing embodiments, but also any embodiments which fall within the scope of the claims.

Claims (23)

  1. CLAIMS1. A control system for controlling operation of a vehicle system, the control system comprising one or more controllers, configured to: receive image data representative of at least one image of at least a portion of a vehicle occupant's face; identify one or more Facial Action Units of the vehicle occupant from the received image data; determine if the identified one or more Facial Action Units correspond with Facial Action Units associated with an underlying medical condition; and output a control signal, the control signal configured to control operation of a vehicle system in dependence on a determination that the one or more Facial Action Units correspond with Facial Action Units associated with an underlying medical condition.
  2. 2. A control system of claim 1, wherein the one or more controllers collectively comprise: at least one electronic processor having an electrical input for receiving the image data; and at least one memory device coupled to the at least one electronic processor and having instructions stored therein; wherein the at least one electronic processor is configured to access the at least one memory device and execute the instructions stored therein so as to identify the one or more Facial Action Units and determine if the one or more Facial Action Units correspond with Facial Action Units associated with an underlying medical condition.
  3. 3. The control system of any preceding claim, configured to identify the one or more Facial Action Units of the vehicle occupant using a Facial Action Coding System, FAGS.
  4. 4. The control system of any preceding claim, configured to: associate each of the one or more identified Facial Action Units corresponding with Facial Action Units associated with an underlying medical condition with a category of responsiveness; and determine a responsiveness of a vehicle occupant in dependence on the one or more categories of responsiveness associated with each of the one or more identified Facial Action Units.
  5. 5. The control system of claim 4, wherein the one or more categories of responsiveness comprise two or more levels of responsiveness.
  6. 6. The control system of any preceding claim, wherein: the image data is representative of a plurality of images of the at least a portion of the vehicle occupant's face captured over a period of time; and the control system is configured to determine that the one or more Facial Action Units correspond with Facial Action Units associated with an underlying medical condition in dependence on a change in one or more of the identified Facial Action Units over the period of time.
  7. 7. The control system of any preceding claim, wherein the vehicle system is one or more of: an audio output system, and the control signal is configured to provide an audio warning to the vehicle occupant; a visual output system, and the control signal is configured to provide a visual warning to the vehicle occupant; a communication system, and the control signal is configured to establish a communication link between a third party external to the vehicle and the vehicle; a navigation system, and the control signal is configured to cause a navigation route to be determined in dependence on the determination that the one or more identified Facial Action Units correspond with Facial Action Units associated with an underlying medical condition.
  8. 8. The control system of any preceding claim, wherein the vehicle system is one or more of: a steering vehicle system, and the control signal is configured to control the vehicle steering; and a braking vehicle system, and the control signal is configured to control the vehicle braking.
  9. 9. The control system of any preceding claim, configured to control one or more particular vehicle systems selected in dependence on a categorisation of responsiveness, the category of responsiveness determined in dependence on an association of each of the one or more identified Facial Action Units with a category of responsiveness.
  10. 10. The control system of any preceding claim, comprising: an input configured to receive sensor data indicative of the physiological state of the vehicle occupant; wherein the control system is configured to determine if the one or more identified Facial Action Units correspond with Facial Action Units associated with an underlying medical condition in dependence on the received image data and the received sensor data.
  11. 11. A system for controlling operation of a vehicle system in dependence on responsiveness of a vehicle occupant, the system comprising: the control system of any preceding claim; and an image capture means configured to provide image data to the control system.
  12. 12. The system of claim 11, comprising a vehicle system configured to be controlled, at least in part, by the control system.
  13. 13. The system of claim 11 or claim 12, comprising a sensor configured to provide sensor data representing a physiological state of the vehicle occupant to the control system.
  14. 14. A vehicle comprising the control system of any of claims 1 to 10, or the system of any of claims 11 to 13.
  15. 15. A method of controlling operation of a vehicle system in dependence on a responsiveness of a vehicle occupant, the method comprising: receiving image data representing at least one image of at least a portion of a vehicle occupant's face; identifying one or more Facial Action Units of the vehicle occupant from the received image data; determine if the identified one or more Facial Action Units correspond with Facial Action Units associated with an underlying medical condition; and outputting a control signal, the control signal configured to control operation of a vehicle system in dependence on a determination that the one or more Facial Action Units correspond with Facial Action Units associated with an underlying medical condition.
  16. The method of claim 15, comprising: associating each of the one or more identified Facial Action Units corresponding with Facial Action Units associated with an underlying medical condition with a category of responsiveness; and determining a responsiveness of the vehicle occupant in dependence on the one or more categories of responsiveness associated with each of the one or more identified Facial Action Units.
  17. The method of claim 15 or claim 16, wherein the image data is representative of a plurality of images of the at least a portion of the vehicle occupant's face captured over a period of time; and wherein determining that the one or more Facial Action Units correspond with Facial Action Units associated with an underlying medical condition is dependent on a change in one or more of the identified Facial Action Units over the period of time.
  18. The method of any of claims 15-17, comprising controlling the operation of the vehicle system by one or more of: providing an audio warning to the vehicle occupant; providing a visual warning to the vehicle occupant; establishing a communication link between a third party external to the vehicle and the vehicle; and causing a navigation route to be determined in dependence on the determination that the one or more identified Facial Action Units correspond with Facial Action Units associated with an underlying medical condition.
  19. The method of any of claims 15-18, comprising controlling the operation of the vehicle system by one or more of controlling the vehicle steering and controlling the vehicle braking.
  20. The method of any of claims 15-19, comprising: controlling one or more particular vehicle systems selected in dependence on a categorisation of responsiveness, the category of responsiveness determined in dependence on an association of each of the one or more identified Facial Action Units with a category of responsiveness.
  21. The method of any of claims 15-20, comprising: receiving sensor data indicative of a physiological state of the vehicle occupant; and determining if the one or more identified Facial Action Units are indicative of an underlying medical condition in dependence on the received image data and the received sensor data.
  22. Computer software which, when executed, is arranged to perform a method according to any of claims 15 to 21.
  23. The computer software of claim 22 stored on a computer-readable medium.
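The per-FAU categorisation recited in claims 16, 18 and 20 — associating each identified Facial Action Unit with a category of responsiveness, determining an overall responsiveness, and selecting vehicle systems to control in dependence on that category — can be illustrated with a minimal sketch. The FAU codes, category assignments and action names below are assumptions for illustration only; they are not part of the claimed method.

```python
# Illustrative sketch of claims 16, 18 and 20: map identified Facial
# Action Units (FAUs) to responsiveness categories, derive an overall
# responsiveness, and select vehicle-system actions accordingly.

# Assumed mapping of FACS Action Unit codes to a responsiveness category
# (1 = responsive, 2 = reduced responsiveness, 3 = unresponsive).
FAU_CATEGORY = {
    "AU43": 3,   # eyes closed
    "AU45": 2,   # blink
    "AU4": 2,    # brow lowerer
    "AU12": 1,   # lip corner puller
}

# Assumed actions per category, loosely following claims 18 and 19.
CATEGORY_ACTIONS = {
    1: [],
    2: ["audio_warning", "visual_warning"],
    3: ["establish_comms_link", "control_braking", "control_steering"],
}

def determine_responsiveness(identified_faus):
    """Return the worst (highest) responsiveness category among the
    identified FAUs, defaulting to 1 (responsive) if none match."""
    categories = [FAU_CATEGORY.get(fau, 1) for fau in identified_faus]
    return max(categories, default=1)

def select_vehicle_actions(identified_faus):
    """Select vehicle systems to control in dependence on the category."""
    category = determine_responsiveness(identified_faus)
    return category, CATEGORY_ACTIONS[category]

category, actions = select_vehicle_actions(["AU43", "AU4"])
print(category, actions)
# → 3 ['establish_comms_link', 'control_braking', 'control_steering']
```

In this sketch the worst category among the identified FAUs governs the response, which matches the claims' dependence of the controlled systems on the determined categorisation; a real implementation would also weigh the temporal changes described in claim 17 and the physiological sensor data of claim 21.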
GB1818149.5A 2018-11-07 2018-11-07 Apparatus and method for controlling vehicle system operation Active GB2578764B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1818149.5A GB2578764B (en) 2018-11-07 2018-11-07 Apparatus and method for controlling vehicle system operation

Publications (3)

Publication Number Publication Date
GB201818149D0 GB201818149D0 (en) 2018-12-19
GB2578764A true GB2578764A (en) 2020-05-27
GB2578764B GB2578764B (en) 2021-10-27

Family

ID=64655436

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1818149.5A Active GB2578764B (en) 2018-11-07 2018-11-07 Apparatus and method for controlling vehicle system operation

Country Status (1)

Country Link
GB (1) GB2578764B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202243010U (en) * 2010-12-27 2012-05-30 上海博泰悦臻电子设备制造有限公司 Anti-fatigue driving device and vehicle-mounted device
CN107856536A (en) * 2017-09-09 2018-03-30 深圳市赛亿科技开发有限公司 Fatigue driving monitoring device and monitoring method
CN108154095A (en) * 2017-12-14 2018-06-12 北京汽车集团有限公司 Method, apparatus and vehicle for determining fatigue driving
CN108407813A (en) * 2018-01-25 2018-08-17 惠州市德赛西威汽车电子股份有限公司 Anti-fatigue safe vehicle driving method based on big data
CN108437999A (en) * 2018-03-20 2018-08-24 中国计量大学 Attention auxiliary system
US20180330178A1 (en) * 2017-05-09 2018-11-15 Affectiva, Inc. Cognitive state evaluation for vehicle navigation

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10869626B2 (en) * 2010-06-07 2020-12-22 Affectiva, Inc. Image analysis for emotional metric evaluation

Similar Documents

Publication Publication Date Title
JP7288911B2 (en) Information processing device, mobile device, method, and program
US20210328991A1 (en) Systems and Methods for a Secure Tipping Notification for a Motorized Mobile Chair
CN107539318B (en) Driving support device and driving support method
WO2020078462A1 (en) Passenger state analysis method and device, vehicle, electronic device, and storage medium
US20190391581A1 (en) Passenger Health Monitoring and Intervention for Autonomous Vehicles
US10872354B2 (en) System and method for personalized preference optimization
WO2019155873A1 (en) Evaluation device, action control device, evaluation method, and evaluation program
US20200247422A1 (en) Inattentive driving suppression system
US20150066284A1 (en) Autonomous vehicle control for impaired driver
KR20140080727A (en) System and method for controlling sensibility of driver
CN112424848A (en) Warning device, driving tendency analysis method, and program
US10045096B2 (en) Social media modification of behavior and mobile screening for impairment
JPWO2013008301A1 (en) Emergency vehicle evacuation device
CN112441009B (en) State estimation device, state estimation method, and storage medium
JP2016036729A (en) Biometric monitoring system and alerting system for vehicle
US11490843B2 (en) Vehicle occupant health monitor system and method
US20200104617A1 (en) System and method for remote monitoring of a human
CN109716411A (en) Method and apparatus for monitoring the activity level of a driver
TW201837901A (en) Emotion recognition device and emotion recognition program
CN109886148A (en) Driver active warning system and device based on face recognition
JP2020052482A (en) Dangerous driving prevention device
US20230294514A1 (en) System and method for monitoring a state of a driver
KR20150106986A (en) System for detecting drowsiness based on server and the method thereof
GB2578764A (en) Apparatus and method for controlling vehicle system operation
US10528830B2 (en) System and method for remote monitoring of a human